Basic export import...

I am testing export from Oracle 7i to 11g.
1) Is 'incremental' export supported only on FULL database exports, or can this option also be used on specific schemas?
2) If I want to export the full database and import it, is there an option to remap users? Say the full export contains 3 schemas for users USER1, USER2, USER3. When importing, can I remap these to NEWUSER1, NEWUSER2, NEWUSER3?
3) For exports, say I take a full DB export which includes TBLA with 1 million rows. The next day I take an incremental export, and two rows have been added/changed in TBLA. Will the incremental export include all rows in TBLA, or only the 2 rows which were added/changed?

so many questions and so few answers.
1) NO
2) NO
3) NO
bcm@bcm-laptop:~$ exp help=yes
Export: Release 11.2.0.1.0 - Production on Sun Mar 20 14:56:27 2011
Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
You can let Export prompt you for parameters by entering the EXP
command followed by your username/password:
     Example: EXP SCOTT/TIGER
Or, you can control how Export runs by entering the EXP command followed
by various arguments. To specify parameters, you use keywords:
     Format:  EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
     Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
               or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
USERID must be the first parameter on the command line.
Keyword    Description (Default)      Keyword      Description (Default)
USERID     username/password          FULL         export entire file (N)
BUFFER     size of data buffer        OWNER        list of owner usernames
FILE       output files (EXPDAT.DMP)  TABLES       list of table names
COMPRESS   import into one extent (Y) RECORDLENGTH length of IO record
GRANTS     export grants (Y)          INCTYPE      incremental export type
INDEXES    export indexes (Y)         RECORD       track incr. export (Y)
DIRECT     direct path (N)            TRIGGERS     export triggers (Y)
LOG        log file of screen output  STATISTICS   analyze objects (ESTIMATE)
ROWS       export data rows (Y)       PARFILE      parameter filename
CONSISTENT cross-table consistency(N) CONSTRAINTS  export constraints (Y)
OBJECT_CONSISTENT    transaction set to read only during object export (N)
FEEDBACK             display progress every x rows (0)
FILESIZE             maximum size of each dump file
FLASHBACK_SCN        SCN used to set session snapshot back to
FLASHBACK_TIME       time used to get the SCN closest to the specified time
QUERY                select clause used to export a subset of a table
RESUMABLE            suspend when a space related error is encountered(N)
RESUMABLE_NAME       text string used to identify resumable statement
RESUMABLE_TIMEOUT    wait time for RESUMABLE
TTS_FULL_CHECK       perform full or partial dependency check for TTS
VOLSIZE              number of bytes to write to each tape volume
TABLESPACES          list of tablespaces to export
TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
TEMPLATE             template name which invokes iAS mode export
Export terminated successfully without warnings.
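To make those one-word answers concrete, here is a minimal sketch of the classic exp/imp syntax involved (connect strings, file names, and schema names are placeholders, not taken from the thread):
     # INCTYPE applies only to full-database-mode exports, not to OWNER= (schema) exports:
     exp system/manager FULL=Y INCTYPE=COMPLETE FILE=full.dmp LOG=full.log
     exp system/manager FULL=Y INCTYPE=INCREMENTAL FILE=incr.dmp LOG=incr.log
     # A FULL=Y import has no remap option; remapping is done schema by schema
     # with FROMUSER/TOUSER instead:
     imp system/manager FROMUSER=USER1 TOUSER=NEWUSER1 FILE=full.dmp LOG=imp1.log
Note also that an incremental export works at table granularity: if even two rows of TBLA changed since the last export, the whole table is exported again, which is why the answer to 3) is NO.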

Similar Messages

  • Transfer Foreign Trade Export Import values in Purchase Order

    Hello SRM brothers and sisters,
    I have a question regarding the transfer of the Import/Export data (Foreign Trade) in the PO from SRM to ECC in the Classic Extended Scenario.
    Basically, in the SRM BBP_ECS_PO_OUT_BADI -> BBP_B46B_PO_OUTBOUND method where we map all the values, I try to fill in the CT_BAPI_POEXPIMPITEM table with the item's Import/Export values.
    The way this is done is:
    if nothing exists in this CT_BAPI_POEXPIMPITEM table for the PO item, add an empty line and assign the following fields ( PO_ITEM, EXPORT_IMPORT_PROCEDURE, COUNTRYORI and COMM_CODE).
    This is then sent to ECC via the RFC BBP_PO_INBOUND function module - and inside I know that it assigns the X-structure of Export/Import Item. It then calls the BAPI_PO_CREATE / BAPI_PO_CHANGE and in the debugger the values seem to be correct.
    This all seems very nice; however, when I take a look at the PO in ME23n, the values of the Import/Export data are not there! In table EIPO there is also nothing... As for error messages, nothing comes up either.
    So, has this happened to anyone? Am I doing something wrong? Is there an extra step I should do (i.e., fill another structure/table/field)?
    Cheers,
    Adi

    Hi Kishor,
    Thanks for your reply. I need some more input from you, if possible.
    You mentioned: "You need to enter the special UOM according to the specifications in the index of goods or foreign trade statistics." Can you give an example to clarify this, and explain how quantities are converted to this unit for the export declaration?
    You also mentioned: "For each item, you have to enter the amount of the unit of measure determined in the index of goods, e.g. number of units." Can you explain this with an example? What is the index of goods?
    Thanks in advance.
    Regards
    Amar

  • Export/import in 904 versus 10.1.2

    We have two 10g AS 904 installs (dev and prod; same O/S too). From time to time we move our production portal content to the dev server. We sometimes get corruption and have to use the Schema Validation Utility scripts to fix the import on the dev server. Not a show stopper, but this adds another manual step to the process.
    I see the 10.1.2 version has improved import/export checks. Can anyone give feedback on the improvements in the export/import process?
    Thanks
    Kevin

    Kevin,
    Careful with this approach. Going from DEV to PROD is OK, but to then go back to DEV again, I suggest you start clean on DEV, i.e., clean the DEV machine (and later do the same to PROD). Portal Export / Import works only one way, not both ways (this basically has to do with the checks we make and with possible conflicts between the objects on both sides). Also check the available documentation, which is all compiled in Metalink Note 263995.1 - Master Note for OracleAS Portal Export / Import Issues.
    As to the improvements in the process, please check the New Features papers (search for "export"; it is easier to pick out the references):
    10.1.2 -- http://www.oracle.com/technology/products/ias/pdf/1012_nf_paper.pdf
    10.1.4 -- http://www.oracle.com/technology/products/ias/portal/pdf/portal_1014_new_features.pdf
    I hope it helps...
    Cheers,
    Pedro.

  • Usage of 'export' & 'import' command in SSL

    Hi,
    I have a query regarding the 'export' and 'import' commands used when creating security certificates.
    Why do we use the word export in this command and not anywhere else:
    keytool -export -alias weblogic -file trust.pem -keystore mykeystore.jks -storepass weblogic -rfc
    Why can't we use import instead of export, and vice versa?
    What do you mean when you say import or export? What is the basic difference between the two in terms of this security command?
    Thanks,
    Sid
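    For what it's worth, a minimal sketch of the difference (alias, file, and keystore names are placeholders): -export reads a certificate out of a keystore into a file, while -import loads a certificate from a file into a keystore.
    # export: extract the certificate stored under an alias INTO a file
    keytool -export -alias weblogic -file trust.pem -keystore mykeystore.jks -storepass weblogic -rfc
    # import: add a certificate FROM a file into a (possibly different) keystore,
    # e.g. to build a trust store on another machine
    keytool -import -alias weblogic -file trust.pem -keystore mytruststore.jks -storepass weblogic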

    Hi Vankan,
    Your question is so broad that it needs to be narrowed down a bit.
    Could you please be more specific? Which platform are we talking about - DB, OS, etc.?
    Which kind of export/import do you want to carry out? E.g. client transport, client copy, homogeneous system copy, heterogeneous system copy...
    The technique you will use depends on what you want to do and on the platform you are working on.

  • System copy using R3load ( Export / Import )

    Hi,
    We are testing a system copy using R3load (Export / Import) with our production data.
    Environment: 4.6C / Oracle.
    While executing the export, it takes more than 20 hours for the data to be exported. We want to reduce the export time drastically, and we hope to achieve this by scrutinizing the input parameters.
    During export, there is a parameter for splitting the *.STR and *.EXT files for R3load.
    The default input for the above parameter is No, do not split STR and EXT files.
    My question 1: If we input Yes instead of No and split the *.STR and *.EXT files, will the export time be reduced?
    My question 2: If the answer to question 1 is yes, will the time saved be significant? What percentage of time can we expect to save, compared to an input of No?
    Best Regards
    L Raghunahth

    Hi
    The time of the export depends on the size of your database (and the size of your biggest tables) and your hardware capacity.
    My question 1: If we input Yes instead of No and split the *.STR and *.EXT files, will the export time be reduced?
    In case you have a few very large tables, and you have multiple CPUs and decent disk storage, then splitting might significantly reduce the export time.
    My question 2: If the answer to question 1 is yes, will the time saved be significant? What percentage of time can we expect to save, compared to an input of No?
    As you did not tell us your database size and hardware details, there is no way to give you anything but very basic metrics. Did you specify a parallel degree at all? Was your hardware idling for 20 hrs, or fully loaded already?
    20 hrs for a 100 GB database is very slow. It is reasonable (rather fast, in my opinion) for a 2 TB database.
    Best regards, Michael

  • Export-import if database is down

    I have a query: can we run the export and import processes while the database is down? Basically, the export process is the command 'exp'; would this command run even if the database is down? Likewise, import is run with the command 'imp'; does it require the database to be up?
    I hope my question is clear.
    Please help in resolving this doubt.
    regards

    Hi,
    Export is a logical backup of database objects. Think about it: if the DB instance is not up and running, how could the exp utility extract anything? Both exp and imp connect to the database like any other client, so the instance must be open.
    - Pavan Kumar N
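    A quick way to see this for yourself (credentials are placeholders): run exp against a stopped instance and it aborts immediately with a connection error, something like:
    exp system/manager FULL=Y FILE=full.dmp
    # EXP-00056: ORACLE error 1034 encountered
    # ORA-01034: ORACLE not available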

  • Files Copied during Client export/import??

    Hi All,
       What type of files get created when performing a client export/import?

    Hello Abdul,
    Please check the following link:
    http://help.sap.com/saphelp_nw04/helpdata/en/69/c24c824ba111d189750000e8322d00/content.htm
    It should answer all your queries.
    Basically, 3 transport requests get created, with different kinds of data in them. When released, each of them creates a data file and a cofile.
    Please award points accordingly.
    Regards.
    Ruchit.

  • Reduce export import time

    Hi All,
    I want to reduce the total time spent during export-import.
    Restrictions:
    =========
    1) Cross-platform
    2) exp and imp from a lower version to a higher version
    3) No 10g database
    Basically, I want to do exp and imp in parallel so that the total time spent on this activity is reduced. I thought of doing schema-level exp-imp in parallel, but I am afraid of the dependencies.
    Is there any other way to achieve the same? Or, if I go with the approach above, can anyone provide some valuable input on it?
    I am trying to automate this so that it becomes a one-time effort, and from then on the script should do everything on its own.
    Thanks and regards
    Neeraj

    Hi All,
    The data volume is not less than 60 GB and not more than 150 GB.
    If I use a pipe on Unix between exp and imp, what happens if my export is slower than the import at any point in time (for whatever reason)?
    Will the import wait for contents to arrive in the pipe from the export, or will it fail?
    I can consider creating the indexes using a flat file.
    Is there any way to get only the indexes in the flat file? I mean, if I use the INDEXFILE option for import, it gives me "CREATE TABLE..." statements too, which means the import utility is reading the full dump file; I want only the "CREATE INDEX..." statements in the flat file.
    What about schema-level export and import?
    Any valuable input or proper steps from anyone out there?
    Any restrictions while importing the schemas?
    Thanks and Regards
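    On the pipe question, a minimal sketch of the usual named-pipe trick (credentials, schema, and path are placeholders): reads on a FIFO block while it is empty, so a faster import simply waits for the export rather than failing.
    mkfifo /tmp/exp_pipe                                   # create the FIFO once
    exp system/manager OWNER=USER1 FILE=/tmp/exp_pipe &    # writer runs in the background
    imp system/manager FROMUSER=USER1 TOUSER=USER1 FILE=/tmp/exp_pipe   # reader blocks while the pipe is empty
    rm /tmp/exp_pipe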

  • Export/Import of Database to increase the number of datafiles

    My BW system is close to 400 GB and it only has 3 datafiles.  Since we have 8 CPU cores, I need 8 datafiles.  What is the best way to export/import the database in order to achieve 8 datafiles?

    With a BW system that size you can probably get away with it.  You most likely do at least a few full loads, so all that data will be evenly distributed when you drop and reload.  If you can clean up some of your PSAs and log tables, you can probably shrink the 3 files down a little anyway.  If you do a little maintenance like that every few weeks, after locking auto-growth, you can probably shave 2-3 GB off each file each week.  Do that for a few months and your large files will lose 20 GB while the other ones start to grow.  Rebuilding indexes also helps with that.  You will be surprised how fast they level out that way.
    With regard to performance you should be fine.  I certainly wouldn't do it at 10 am on a Tuesday, though :-).  You can probably get away with it over a weekend.  It will take basically no time at all to create them, and very little I/O.  If you choose to clean out the 3 existing files, that will take some time.  I have found it takes about 1-3 hours to shrink a 150 GB datafile down to 100 GB; that was with 8 CPUs, 30 GB of RAM, and a SAN that I don't fully understand.

  • Basic Exporting Question

    My main question is a very basic exporting question, but here is a super-condensed explanation of my big-picture goal for context:
    Large (~450 MB) AIFF on CD --> trim with QuickTime --> small (~16 MB) MP3 in iTunes
    I would like to take an audio file (AIFF) that is approximately 450 MB and export it from QuickTime, and in doing so reduce the file size and convert it to MP3. When I simply try to export it, it doesn't ask me what size I want, nor does it give the option of MP3 formatting.
    I have figured out how to reach my goal, but it's a mess. After I make my trims in QT, I have to:
    1. select "share" instead of "export".
    2. It asks me what size I want and I select small.
    3. The file is then exported as a Quicktime movie into Mail and the size is reduced from 450 MB to about 20 MB.
    4. I then have to "right-click" on the attachment in the email that is created,
    5. save the attachment,
    6. discard the email,
    7. import the file into iTunes, and
    8. create an MP3 in iTunes to finally arrive at my goal.
    This seems like a ridiculously convoluted process to change a large aiff to a small MP3 and put it in iTunes. Any suggestions?

    Thanks. I guess the basic answer to my question is that it can't be done in one or two fell swoops. The problem is that the audio starts on a burned disc, and I need to make edits to it before it ends up in iTunes (to eventually be used in iWeb). I was hoping to avoid juggling back and forth between iTunes and QT. I would either have to send it back to QT to do the edits after CD --> iTunes --> MP3, or import the large file from the CD into QT, make the edits, save the changes, import the large file into iTunes, and convert it to MP3.

  • Export/import efficiency

    Hi,
    I have a basic question here. Which is more efficient, exp/imp or expdp/impdp, and why? I would like to read about it; good documents are welcome.
    Thanks
    Kris

    Hi,
    Definitely expdp/impdp (Data Pump export/import) is much better than the original exp/imp, which was the standard on Oracle 9i databases.
    The top 10 differences between exp/imp (export/import) and expdp/impdp (Data Pump export and import) are:
    1)Data Pump Export and Import operate on a group of files called a dump file set
    rather than on a single sequential dump file.
    2)Data Pump Export and Import access files on the server rather than on the client.
    This results in improved performance. It also means that directory objects are
    required when you specify file locations.
    3)The Data Pump Export and Import modes operate symmetrically, whereas original
    export and import did not always exhibit this behavior.
    For example, suppose you perform an export with FULL=Y, followed by an import using SCHEMAS=HR. This will produce the same results as if you performed an
    export with SCHEMAS=HR, followed by an import with FULL=Y.
    4)Data Pump Export and Import use parallel execution rather than a single stream of
    execution, for improved performance. This means that the order of data within
    dump file sets and the information in the log files is more variable.
    5)Data Pump Export and Import represent metadata in the dump file set as XML
    documents rather than as DDL commands. This provides improved flexibility for
    transforming the metadata at import time.
    6)Data Pump Export and Import are self-tuning utilities. Tuning parameters that
    were used in original Export and Import, such as BUFFER and RECORDLENGTH,
    are neither required nor supported by Data Pump Export and Import.
    7)At import time there is no option to perform interim commits during the
    restoration of a partition. This was provided by the COMMIT parameter in original
    Import.
    8)There is no option to merge extents when you re-create tables. In original Import,
    this was provided by the COMPRESS parameter. Instead, extents are reallocated
    according to storage parameters for the target table.
    9)Sequential media, such as tapes and pipes, are not supported.
    10)The Data Pump method for moving data between different database versions is
    different than the method used by original Export/Import. With original Export,
    you had to run an older version of Export (exp) to produce a dump file that was
    compatible with an older database version. With Data Pump, you can use the
    current Export (expdp) version and simply use the VERSION parameter to specify the target database version
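    As points 1, 2, and 4 suggest, a typical Data Pump run needs a server-side directory object and can write a parallel dump file set. A minimal sketch (directory path, credentials, and schema names are placeholders):
    # once, as a DBA (in SQL*Plus): create a directory object and grant access
    #   CREATE DIRECTORY dp_dir AS '/u01/app/oracle/dumps';
    #   GRANT READ, WRITE ON DIRECTORY dp_dir TO hr;
    # then, from the shell: parallel schema export, import remapped to another user
    expdp hr/hr DIRECTORY=dp_dir DUMPFILE=hr_%U.dmp LOGFILE=hr_exp.log SCHEMAS=HR PARALLEL=4
    impdp hr/hr DIRECTORY=dp_dir DUMPFILE=hr_%U.dmp LOGFILE=hr_imp.log REMAP_SCHEMA=HR:HR2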
    For more details and options:
    exp help=y
    imp help=y
    expdp help=y
    impdp help=y
    Fine manuals for referring:
    http://www.oracle-base.com/articles/10g/OracleDataPump10g.php
    Hope it helps.
    Best regards,
    Rafi.
    http://rafioracledba.blogspot.com

  • Memory Limitation on EXPORT & IMPORT Internal Tables?

    Hi All,
    I have a need to export and import internal tables to memory. I do not want to export them to any database tables. Is there a limitation on the amount of memory that can be used for EXPORT & IMPORT? I will free the memory once I import the data. The maximum I expect would be 13,000,000 lines.
    Thanks,
    Alex (Arthur Samson)

    You don't have a hard limit, but try to keep your table as small as possible.
    Otherwise, if you are familiar with ABAP OO, try using Shared Objects instead of IMPORT/EXPORT:
    SAP Help on Shared Objects - http://help.sap.com/saphelp_erp2004/helpdata/en/13/dc853f11ed0617e10000000a114084/frameset.htm
    Hope this helps,
    Roby.

  • Using set/get parameters or export/import in BSP.

    Hi All,
    Is it possible to use SET/GET parameters or EXPORT/IMPORT in BSP?
    We need to set/export some variables from a BADI and get/import them in the BSP application.
    A code snippet would be of great help.
    Thanks,
    Anubhav

    Hi Anubhav,
    You can use the EXPORT / IMPORT statements for your requirement.
    From the BADI, use EXPORT to send the variable data to a unique location in the INDX database cluster, identified by an ID, e.g.:
    * data declaration required for the cluster record header
    DATA: WA_INDX TYPE INDX.
    * here CNAME is the variable you want to export
    EXPORT PNAME = CNAME TO DATABASE INDX(XY) FROM WA_INDX CLIENT
                SY-MANDT ID 'ZVAR1'.
    In the BSP application, use the IMPORT statement to fetch back the value stored under the same ID:
    IMPORT PNAME = LV_CNAME
      FROM DATABASE INDX(XY) TO WA_INDX CLIENT
      SY-MANDT ID 'ZVAR1'.
    Afterwards, delete the stored data so memory is not wasted:
    DELETE FROM DATABASE INDX(XY)
      CLIENT SY-MANDT
      ID 'ZVAR1'.
    Regards,
    Samson Rodrigues

  • Issue with Memory ID export / import

    Hi Experts,
    We are facing a strange issue with export/import from a memory ID.
    We are exporting a value to a memory ID from a module pool program and trying to import the same value in a SAP Workflow method.
    While importing the value from the memory ID, the import fails with sy-subrc = 4.
    Any idea how we can do export/import between a module pool program and a Workflow method?
    Regards,
    Sanjana

    Hi Sanjana,
    Please check this link, where you can also find some examples:
    http://wiki.scn.sap.com/wiki/display/Snippets/Import+and+Export+to+Cluster+Databases
    See also:
    http://help.sap.com/saphelp_nw04/helpdata/en/fc/eb3bf8358411d1829f0000e829fbfe/content.htm
    You just need to create a key and an ID to save data into the INDX table. Once it is saved, you can use the same key and ID to import it.
    Again: don't forget to delete the data using the same key and ID once you have imported it, else it may give an error.
    Regards,
    Sandeep Katoch

  • Best choice for exporting / importing EUL

    Hi all
    I have been tasked with migrating an EUL from an R11 to an R12 environment. The Discoverer version on both environments is 10.1.2, and the OS is Solaris on Oracle DBs.
    I am unfortunately not experienced with Discoverer, and there seems to be no one available to assist for various reasons. So I have been reading the manual and forum posts and viewing Metalink articles.
    I tried exporting the entire EUL via the wizard and then importing it into the new environment, but I was not successful: the system hung for many hours with a white screen, and the log file just ended.
    I assumed a memory problem or slow network was causing this delay. Someone suggested I export/import the EUL in pieces, and this seemed to be effective, but I got missing-item warnings when trying to open reports. This piecemeal approach also worried me regarding consistency.
    So I decided to try the full import on the server, to try to get around the first problem I experienced. Due to the client's security policies, I am not able to open the source EUL and send it to our dev. I was able to get it from their dev R11 system, but I dismissed this, as the dev reports were not working and the only reliable EUL is the Prod one. I managed to get a Prod .eex file from a client resource, but the upload to my server was extremely slow.
    I asked the DBA to assist with the third option of exporting a DB dump of EUL_US and importing this into my R12 dev environment. I managed this, but had to export the DB file using SYS, which alleviated a privilege problem when logging in. I have reports that run, and my user can see the reports, but reports that were not shared to SYSADMIN in the source environment are now prefixed with the version 11 user_id in my Desktop, and the user cannot see her reports, only the SYSADMIN ones.
    I refreshed the BAs using a shell script I made up which uses the java command with parameters.
    After some re-reading, I tried selecting all the options in the validate menu and refreshing in the Discoverer Admin tool.
    If I validate and refresh the BA using the Admin tool, I get the hanging screen and a lot of warnings that items are missing (so much for my java command refresh!), and now the report will not open and I see the "substitute missing item" dialogue boxes.
    My question to the forum is: which would be the best approach to migrate the entire EUL from an R11 instance to an R12 instance in these circumstances?
    Many thanks
    Regards
    Nick

    Hi Srini
    The os and db details are as follows:
    Source:
    eBus 11.5.2
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Target:
    ebus 12.1.2
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production DEV12
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Yes, the DBA initially did an exp for me using EUL_US as the owner, but a strange thing happened with privileges; also, some of the imported tables appeared in the target environment under the APPS schema (21 tables), even though the EUL_US export had contained 48 tables.
    I also had a problem on the DB with "eul_us has insufficient privileges on tablespace discoverer" type errors.
    I checked the EUL_US DB privileges and was unable to resolve this initial privilege error, even though the privileges had been granted to EUL_US.
    The DBA managed to exp as SYSTEM and then import it with the FULL=Y flag in the import command, which seems to bring in the privileges.
    Then I ran eul5_id.sql, made up a list of the business areas, and wrote a sh script to refresh them as follows:
    java -jar eulbuilder.jar -connect sysadmin/oracle1@dev -apps_user -apps_responsibility "System Administrator" -refresh_business_area "ABM Activities" -log refresh.log
    This runs successfully, and I can log in, select a business area, and grant access to the users. The reports return data.
    Then one of the users said she can't see all her reports. Opening Desktop, I noticed some reports sitting there prefixed with a hash and her version 11 user id.
    So back to the manuals. In the Discoverer Admin help, the instructions are to first go to View > Validate > select all options, then go to the business area and click File > Refresh. This gives me a lot of warnings about missing items. I assume this is because the item identifiers brought across in the DB dump are the version 11 ones and thus are not found in the new system.
    Any suggestions?
    Many thanks
    Nick
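    For reference, a hedged sketch of the database-dump route described above (passwords and file names are placeholders; eul5_id.sql and the java refresh command are the ones already mentioned in this thread):
    # 1) dump the EUL schema from the source database
    exp system/<password> OWNER=EUL_US FILE=eul_us.dmp LOG=eul_us_exp.log
    # 2) bring it into the target; importing with FULL=Y is what reportedly
    #    carried the privileges across in this case
    imp system/<password> FULL=Y FILE=eul_us.dmp LOG=eul_us_imp.log
    # 3) re-key the EUL for the new instance, then refresh each business area
    sqlplus eul_us/<password> @eul5_id.sql
    java -jar eulbuilder.jar -connect sysadmin/<password>@dev -apps_user -apps_responsibility "System Administrator" -refresh_business_area "ABM Activities" -log refresh.log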
