Schema Export/Import

Will schema export/import work if database is not open?

user503988 wrote:
Will schema export/import work if database is not open?
No! The database must be open; export and import connect to it like any other client session.

Similar Messages

  • Oracle 10g - SCHEMA Export Import

    Hi
    I have a query: whenever there are changes to the database schema for an application release, say new columns are added, some new sequences are created, and sundry other changes, what is the best way to take a backup of the DDL so that I can bring the database and its objects back to the same position they were in before the release?
    Regards
    Kapil

    You can take an export of the schema; it does not require any downtime.
    Alternatively, take a cold backup and restore it to another server,
    or clone the database by RMAN duplication to another server before making the changes.
    Regards
    Asif Kabir
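    For the DDL-backup case specifically, one option is a metadata-only Data Pump export, which captures the object definitions without data (a sketch; the schema name and directory object are placeholders):
    expdp system schemas=APP_SCHEMA content=metadata_only directory=DATA_PUMP_DIR dumpfile=pre_release_ddl.dmp logfile=pre_release_ddl.log
    Importing that dump file (or running it through impdp with sqlfile= to get a DDL script) brings the objects back to their pre-release definitions.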

  • Error when exporting/importing

    I am using MDM 7.1
    When I export the schema from my Dev system to import into QA I get the following Error:
    "This repository requires additional steps before transport. See the MDS log for details.
    In the log my issue is that I am trying to export an Assignment that includes an expression that uses "look-ups".
    In my Dev system I removed the expression to confirm if this is the issue, once I no longer have expressions with look-ups then it allows me to export the schema. I then tries to import it to QA (since the expressions are not changing I planned on excluding them from the import as a temporary workaround.
    however I get the same error message when trying to import. It seems that I can not export or import with a system that has an assignment with an expression that uses look-ups.
    Is there some config I am missing?

    Hi Brad,
    assignments/validations are a general problem when it comes to schema exports/imports! What you can do, in case there are not too many assignments, is to delete the assignments and recreate them manually after you have imported the schema.
    Hope this helps a little.
    Regards,
    Erdal

  • EXPORT/IMPORT SBO SCHEMA ERROR

    Hello,
    We have a client who is working with SAP Business One, version for HANA, on HANA Platform rev. 69, and I need to export their database and import it into our development server.
    To back up/restore the database I have performed the following steps:
    1 - I exported the SCHEMA with: EXPORT "SBOXXX_ES"."*" AS BINARY INTO '/tmp/SBOXXX_ES' WITH REPLACE THREADS 10;
    2 - I compressed the files in Linux using the command: tar -czvf ...
    3 - I unpacked the files on our local server.
    4 - Performing the IMPORT command gives me the following errors; I tried two ways to import:
    4.1 - IMPORT "SBOXXX_ES"."*" as binary from '/tmp/SBOXXX_ES' with replace threads 10 rename schema "SBOXXX_ES" to "SBOXXX_ES_2";
    Error message: SAP DBTech JDBC: [2]: general error: [Other schema]._SYS_SS_CE_1957622_RET not found in the import directory.
    4.2 - IMPORT "SBOXXX_ES"."*" as binary from '/tmp/SBOXXX_ES'
    with replace threads 10 NO DEPENDENCIES
    rename schema "SBOXXX_ES" to "SBOXXX_ES_2";
    Error message: SAP DBTech JDBC: [2048]: column store error: table import failed: [30111] Binary import failed (could not execute create statement); invalid column view: ODLN: line 7 col 45 (at pos 506) at ptime/query/checker/check_cube.cc:259
    I guess that when the SCHEMA was exported, the "DEPENDENCIES ON" option was used, and these dependencies are generating the first error when importing.
    If the error is caused by the DEPENDENCIES, in which files are the dependencies stored, so that I can edit them and remove the one that is giving the error (SYS_SS_CE_1957622_RET)?
    Thank you very much for your support .
    Adria.

    Hi Adria,
    I asked you to test the import on the system you exported from, to determine whether the problem occurs because you are transporting the files from system 1 to system 2. If it works at the point of origin, we should investigate why the transfer is corrupting the files. I thought it would be OK because you are renaming the schema as part of the process; I would not want to import at the point of origin without the rename.
    Whenever someone asks me how to export a schema and import it with a different name, I provide:
    export "SBODEMOGB"."*" as
    binary into '/home/SBODEMOGB' with replace threads 10;
    import "SBODEMOGB "."*" as
    binary from '/home/SBODEMOGB_901' with replace threads 10 rename schema "SBODEMOGB" to
    "SBODEMOGB_NEW";
    You are using the statement I would provide.
    I cannot find any hints for the error reported (SYS_SS_CE_1957622_RET). If you do not see anything more helpful in the trace file, I would suggest opening a customer message.
    Best regards,
    Duncan

  • Export/Import with apps user or EUL schema owner ?

    Hi,
    I am working with a migration plan to move Discoverer 9 to Discoverer 10.
    It is an apps mode EUL, where business areas and workbooks have been granted to Apps Users & Responsibilities.
    Customer has only maintained the EUL with the database user of the EUL owner.
    What is recommended for exporting/importing the EUL from 9 to 10?
    Should I connect to Administrator as the EUL owner, or as an apps user (SYSADMIN?)?
    Will all the grants to apps users/responsibilities work when importing as schema owner?
    I tried to create an entire EUL export file connected as the EUL owner, and for the workbooks the export log contained:
    1234#.Workbook name1
    2345#.Workbook name 2
    1234 is the user_id of the workbook owner (all workbooks have been created by an apps user).
    Will the import process manage to convert this user_id, and set the correct owner for the imported workbooks ?
    ( I will use the option Only take ownership if original owner not found )
    I am not sure, but I think I have seen the following syntax in other projects when exporting an apps mode EUL
    SYSADMIN.Workbook name 1
    SYSADMIN.Workbook name 2

    >
    What is recommended for exporting/importing the EUL from 9 to 10?
    Should I connect to Administrator as the EUL owner, or as an apps user (SYSADMIN?)?
    Will all the grants to apps users/responsibilities work when importing as schema owner?
    Hi,
    The best advice I know is that you should export using the DB user (the EUL owner), so that you will not have problems with workbooks or BAs that you are not granted access to (or any other grants or privileges).
    The import should be done using the APPS user (the super user you use).
    That way, if you import a workbook whose owner cannot be found, you will get the ownership and can deal with it after the migration.
    If you do that with the DB (EUL owner) user, after the migration you will not have access to those workbooks.
    Anyway, regarding the workbooks, I suggest you save them as DIS files in case the import of the workbooks or the owner association for them fails.

  • Export & import of sap sid schema for building a new system

    Hi,
    We have a quality BW system which runs on HANA SPS07 (Revision 73). The database contains many schemas in addition to sap<sid>; the rest of the schemas are used through native HANA and are very large.
    We are planning to build a new test BW system by the system copy method from the quality BW system. We don't want to export/import the entire database, as that would take a long time because of the huge native HANA schemas; also, the schemas other than sap<sid> are not required in the target. Please let me know how to copy only the sap<sid> schema to the new database.
    Thanks & Regards,
    Saravanan

    Hi Saravanan,
    Try right-clicking on the schema -> Export,
    select CSV format, and choose a directory.
    Then to import the schema, use the following query:
    import "old_schema"."*" as CSV from 'Directory' with replace threads 10 rename schema "old_schema" to "NEW_schema_name";
    Regards.
    Ferando.
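    For reference, the UI export step is equivalent to a SQL EXPORT statement, so both directions can be scripted (a sketch; the schema name and path are placeholders):
    EXPORT "SAPSID"."*" AS CSV INTO '/hana/export/SAPSID' WITH REPLACE THREADS 10;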

  • Exporting & Importing Physical & Logical schemas, data servers, agents

    Hi,
    I am using ODI 10g.
    I want to export Physical & Logical schemas, data servers, and agents from my ODI test environment and import them into my ODI production environment.
    My requirement is to do this export/import through scripts instead of doing it manually.
    Please guide for this.
    Thanks,
    Divya

    Hi Divya,
    Personally, I feel that rather than exporting individual components/objects, it is better to export the master repository as such.
    You can make use of the ODI tool OdiExportMaster (under <your package> -> Tools -> Oracle Data Integrator Objects) for exporting, and the Import Master Repository wizard (All Programs -> Oracle -> Oracle Data Integrator -> Repository Management -> Master Repository Import) for importing.
    Thanks,
    Guru.
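    As a rough sketch, the export step could be scripted with an OdiExportMaster tool call inside a package; the parameter names below are taken from the ODI Tools reference and should be verified against your 10g version, and the paths are placeholders:
    OdiExportMaster "-TODIR=/odi/exports" "-ZIPFILE_NAME=master_repo_export.zip"
    A scenario generated from such a package can then be run from the agent's startscen command-line script, which covers the "through some scripts" requirement.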

  • Full list of schemas not exported / imported when FULL=Y

    Hi Guys,
    Version: 10.2.0.5
    Can you advise on the full list of schemas not exported/imported when FULL=Y?
    thanks!

    dbaing wrote:
    Hi Guys,
    Version: 10.2.0.5
    Can you advise on the full list of schemas not exported/imported when FULL=Y?
    thanks!
    System schemas. Please see the following:
    http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_export.htm#i1006790
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_export.htm#i1006790
    You may also be interested in the following:
    -How to Perform A Full Database Export Import During Upgrade, Migrate, Copy Or Move Of A Database [ID 286775.1]
    -Oracle DataPump Quick Start [ID 413965.1]
    -How To move Or Copy A Database Using DataPump [ID 855268.1]

  • Some objects in sys schema not imported using full database import/export.

    Hello All,
    We need to migrate a database from one server to a new server using the same database version, 11g Release 2 (11.2.0.1.0). We have some production objects in the SYS schema. We took a full export using full=y and then imported it on the new server using full=y. Objects belonging to other tablespaces and users were imported successfully, but the production objects, i.e. the tables in the SYS schema, were not imported.
    We used following commands for export and import:
    # exp system file=/u01/backup/orcl_full_exp.dmp log=/u01/backup/orcl_full_exp.log full=y statistics=none
    # imp system file=/u01/backup/orcl_full_exp.dmp log=/u01/backup/orcl_full_imp.log full=y ignore=y
    Kind Regards,
    Sharjeel

    Hi,
    First of all, it is not good practice to keep user objects in the SYS schema.
    Second, you are using version 11gR2, so why not use the Data Pump export/import method?
    Third, did you get any errors during the import? What was the error?
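    As a general workaround (a sketch with hypothetical schema and table names): since exp/imp, and Data Pump as well, deliberately skip SYS-owned objects, move the production objects into a dedicated application schema on the source first, and they will then travel with a normal export:
    CREATE USER app_owner IDENTIFIED BY app_owner DEFAULT TABLESPACE users QUOTA UNLIMITED ON users;
    CREATE TABLE app_owner.prod_table AS SELECT * FROM sys.prod_table;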

  • Best choice for exporting / importing EUL

    Hi all
    I have been tasked with migrating an EUL from a R11 to a R12 environment. The Discoverer version on both environments is 10.1.2 and the OS is Solaris on oracle db's.
    I am unfortunately not experienced with Discoverer and there seems to be no one available to assist for various reasons. So I have been reading the manual and forum posts and viewing metalink articles.
    I tried exporting the entire EUL via the wizard and then importing it into the new environment, but I was not successful: the system hung for many hours with a white screen and the log file just ended.
    I assumed a memory problem or slow network issues were causing this delay. Someone suggested I export/import the EUL in pieces, and this seemed to be effective, but I got missing-item warnings when trying to open reports. This piecemeal approach also worried me regarding consistency.
    So I decided to try the full import on the server, to try to negate the first problem I experienced. Due to the client's security policies I am not able to open the source EUL and send it to our dev. I was able to get it from their dev 11 system, but I dismissed this as the dev reports were not working and the only reliable EUL is the prod one. I managed to get a prod eex file from a client resource, but the upload to my server was extremely slow.
    I asked the DBA to assist with the third option of exporting a db dump of the eul_us schema and importing this into my R12 dev environment. I managed this, but had to export the db file using sys, which alleviated a privilege problem when logging in. I have reports that run and my user can see the reports, but there are reports that were not shared to sysadmin in the source environment; they are now prefixed with the version 11 user_id in my Desktop, and the user cannot see her reports, only the sysadmin ones.
    I refreshed the BAs using a shell script I made up which uses the java cmd with parameters.
    After some re-reading I tried selecting all the options in the validate menu and refreshing in the Discoverer Admin tool.
    If I validate and refresh the BA using the Admin tool, I get the hanging screen and a lot of warnings that items are missing (so much for my java cmd refresh!), and now the report will not open and I see the substitute-missing-item dialogue boxes.
    My question to the forum is which would be the best approach to migrate the entire eul from a R11 instance to a R12 instance in these circumstances?
    Many thanks
    Regards
    Nick

    Hi Srini
    The os and db details are as follows:
    Source:
    eBus 11.5.2
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Target:
    ebus 12.1.2
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production DEV12
    SunOS 5.10 Generic_142900-11 sun4u sparc SUNW,Sun-Fire-V890
    Yes, the DBA initially did an exp for me using EUL_US as the owner, but a strange thing happened with privileges; also, some of the imported tables appeared in the target environment under the apps schema (21 tables) even though the eul_us exp had contained 48 tables.
    I also had a problem on the db with "eul_us has insufficient privileges on tablespace discoverer" type errors.
    I checked the eul_us db privileges and was unable to resolve this initial privilege error even though the privileges were granted to eul_us.
    The DBA managed to exp as system and then import it with the full=y flag in the import command, which seems to bring in the privileges.
    Then I ran eul5_id.sql, made up a list of the business areas, and made a sh script to refresh the business areas as follows:
    java -jar eulbuilder.jar -connect sysadmin/oracle1@dev -apps_user -apps_responsibility "System Administrator" -refresh_business_area "ABM Activities" -log refresh.log
    This runs successfully and I can log in, select a business area, and grant access to the users. The reports return data.
    Then one of the users said she can't see all her reports. I noticed, opening Desktop, that some were sitting there prefixed with a hash and her version 11 user_id.
    So back to the manuals, and in the Disco Admin help the instructions are to first go to View > Validate > select all options, then go to the business area and click File > Refresh. This gives me a lot of warnings about items that are missing. I assume this is because the item identifiers brought across in the db dump are the version 11 ones and thus not found in the new system.
    Any suggestions?
    Many thanks
    Nick

  • Schema Export using DBMS_DATAPUMP is extremely slow

    Hi,
    I created a procedure that duplicates a schema within a given database by first exporting the schema to a dump file using DBMS_DATAPUMP and then importing the same file (I can't use a network link because it fails most of the time).
    My problem is that a regular schema Data Pump export takes about 1.5 minutes, whereas the export using DBMS_DATAPUMP takes about 10 times longer: something in the range of 14 minutes.
    here is the code of the procedure that duplicates the schema:
    CREATE OR REPLACE PROCEDURE MOR_DBA.copy_schema3 (
                                              source_schema in varchar2,
                                              destination_schema in varchar2,
                                              include_data in number default 0,
                                              new_password in varchar2 default null,
                                              new_tablespace in varchar2 default null
                                            ) as
      h   number;
      js  varchar2(9); -- COMPLETED or STOPPED
      q   varchar2(1) := chr(39);
      v_old_tablespace varchar2(30);
      v_table_name varchar2(30);
    BEGIN
       /* open a new schema level export job */
       h := dbms_datapump.open ('EXPORT',  'SCHEMA');
       /* attach a file to the operation */
       DBMS_DATAPUMP.ADD_FILE (h, 'COPY_SCHEMA_EXP' ||copy_schema_unique_counter.NEXTVAL || '.DMP', 'LOCAL_DATAPUMP_DIR');
       /* restrict to the schema we want to copy */
       dbms_datapump.metadata_filter (h, 'SCHEMA_LIST',q||source_schema||q);
       /* apply the data filter if we don't want to copy the data */
       IF include_data = 0 THEN
          dbms_datapump.data_filter(h,'INCLUDE_ROWS',0);
       END IF;
       /* start the job */
       dbms_datapump.start_job(h);
       /* wait for the job to finish */
       dbms_datapump.wait_for_job(h, js);
       /* detach the job handle and free the resources */
       dbms_datapump.detach(h);
       /* open a new schema level import job */
       h := dbms_datapump.open ('IMPORT',  'SCHEMA');
       /* attach a file to the operation */
       DBMS_DATAPUMP.ADD_FILE (h, 'COPY_SCHEMA_EXP' ||copy_schema_unique_counter.CURRVAL || '.DMP', 'LOCAL_DATAPUMP_DIR');
       /* restrict to the schema we want to copy */
       dbms_datapump.metadata_filter (h, 'SCHEMA_LIST',q||source_schema||q);
       /* remap the importing schema name to the schema we want to create */     
       dbms_datapump.metadata_remap(h,'REMAP_SCHEMA',source_schema,destination_schema);
       /* remap the tablespace if needed */
       IF new_tablespace IS NOT NULL THEN
          select default_tablespace
          into v_old_tablespace
          from dba_users
          where username=source_schema;
          dbms_datapump.metadata_remap(h,'REMAP_TABLESPACE', v_old_tablespace, new_tablespace);
       END IF;
       /* apply the data filter if we don't want to copy the data */
       IF include_data = 0 THEN
          dbms_datapump.data_filter(h,'INCLUDE_ROWS',0);
       END IF;
       /* start the job */
       dbms_datapump.start_job(h);
       /* wait for the job to finish */
       dbms_datapump.wait_for_job(h, js);
       /* detach the job handle and free the resources */
       dbms_datapump.detach(h);
       /* change the password as the new user has the same password hash as the old user,
       which means the new user can't login! */
       execute immediate 'alter user '||destination_schema||' identified by '||NVL(new_password, destination_schema);
       /* finally, remove the dump file */
       utl_file.fremove('LOCAL_DATAPUMP_DIR','COPY_SCHEMA_EXP' ||copy_schema_unique_counter.CURRVAL|| '.DMP');
    /*EXCEPTION
       WHEN OTHERS THEN    --CLEAN UP IF SOMETHING GOES WRONG
          SELECT t.table_name
          INTO v_table_name
          FROM user_tables t, user_datapump_jobs j
          WHERE t.table_name=j.job_name
          AND j.state='NOT RUNNING';
          execute immediate 'DROP TABLE  ' || v_table_name || ' PURGE';
          RAISE;*/
    end copy_schema3;
    /
    The import part of the procedure takes about 2 minutes, which is the same time a regular Data Pump import takes on the same schema.
    If I disable the import completely, the export alone still takes about 14 minutes.
    Does anyone know why the export using DBMS_DATAPUMP takes so long?
    thanks.
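    For context, a call to the procedure above would look like this (the schema names are hypothetical):
    BEGIN
      MOR_DBA.copy_schema3(source_schema => 'SCOTT', destination_schema => 'SCOTT_COPY', include_data => 1);
    END;
    /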

    Hi,
    I did a tkprof on the DM trace file and this is what I found:
    Trace file: D:\Oracle\diag\rdbms\instanceid\instanceid\trace\instanceid_dm00_8004.trc
    Sort options: prsela  execpu  fchela 
    count    = number of times OCI procedure was executed
    cpu      = cpu time in seconds executing
    elapsed  = elapsed time in seconds executing
    disk     = number of physical reads of buffers from disk
    query    = number of buffers gotten for consistent read
    current  = number of buffers gotten in current mode (usually for update)
    rows     = number of rows processed by the fetch or execute call
    SQL ID: bjf05cwcj5s6p
    Plan Hash: 0
    BEGIN :1 := sys.kupc$que_int.receive(:2); END;
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        3      0.00       0.00          0          0          0           0
    Execute    229      1.26     939.00         10       2445          0          66
    Fetch        0      0.00       0.00          0          0          0           0
    total      232      1.26     939.00         10       2445          0          66
    Misses in library cache during parse: 0
    Optimizer mode: ALL_ROWS
    Parsing user id: SYS   (recursive depth: 2)
    Elapsed times include waiting on following events:
      Event waited on                             Times   Max. Wait  Total Waited
      ----------------------------------------   Waited  ----------  ------------
      wait for unread message on broadcast channel
                                                    949        1.01        936.39
    ********************************************************************************
    What does "wait for unread message on broadcast channel" mean, and why did it take 939 seconds (more than 15 minutes)?

  • Unit Tester: Bug in Export/Import of Suites

    I was moving my Unit Test repository from one schema to another, so I first exported all of my suites and then imported them into a new repository.
    But, when I imported them into their new repository, all of the tests forgot what specific function/procedure name within a package they were actually calling. For example, the test that used to be for "pccdb.rotat_maintenance.parseRotatKey", now says it's going to call "pccdb.rotat" (i.e. the specific function name it's supposed to be calling within that package disappeared).
    I went looking and found that in the UT_TEST table, the "object_call" column was null for all of my tests, but because all of my tests call methods within packages, I'm thinking that column shouldn't be NULL. So, I filled in the appropriate function name for one of them, and the test started working....
    So, something in the import or export of unit suites (or their tests) isn't working quite right in populating the ut_tests.object_call column, I'm guessing - please take a look at it.
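    A sketch of the manual fix the poster describes, assuming the UT_TEST table and OBJECT_CALL column are as above (the NAME filter and literal values are hypothetical):
    UPDATE ut_test SET object_call = 'parseRotatKey' WHERE name = 'test_parseRotatKey';
    COMMIT;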

    This was originally reported in the SQL Developer 2.1: Problem exporting and importing unit tests thread. It was logged as
    Bug 9236694 - 2.1: OTN: UT_TEST.OBJECT_CALL COLUMN NOT EXPORTED/IMPORTED
    It has been fixed and will be available in the next patch release.
    Brian Jeffries
    SQL Developer Team

  • Migrate Database- Export/Import

    Hi,
    I need to migrate an Oracle 9i database from Sun Solaris to Linux. The final target database version will be 11g.
    Since this is a 9i database, I see that we only have the option of export and import. We have around 15 schemas.
    I have some queries related to it.
    1. If I perform an export with full=y rows=y, does it export SYS and SYSTEM schema objects also?
    2. Can we perform the export in Oracle 9i and use Data Pump import on the target 11g?
    3. What is the best approach: a) to perform schema-by-schema exports, or b) to perform a full database export with exp / file=xxx.dmp log=xxxx.log full=y?
    Since there is a database version difference, I don't want to touch SYS and SYSTEM schema objects.
    Appreciate your thoughts.
    Regards
    Cherrish Vaidiyan

    Hi,
    Let me try to answer the questions you asked:
    1. If I perform an export with full=y rows=y, does it export SYS and SYSTEM schema objects also?
    Export won't export SYS objects. For example, there are tables in SYS, like obj$, that contain information for other metadata objects, like scott.emp, etc. This is not exported, because when scott.emp is exported, the data from obj$ is essentially exported that way. When the dumpfile is imported and scott.emp is recreated, the data in sys.obj$ will be restored through the create table statement. As far as the SYSTEM schema is concerned, some objects are exported and some are not. There are tables in SYSTEM that contain information about queues, jobs, etc. These would probably not make any sense on the target system, so those types of tables are excluded from the export job. Other objects make sense to export/import, so those are done. This is all figured out in the internals of export/import. There are other schemas that are not exported; some that I can think of are DMSYS, ORDSYS, etc. This would be for the same reason as SYS.
    2. Can we perform the export in Oracle 9i and use Data Pump import on the target 11g?
    No, the dumpfiles are formatted differently. If you use exp, then you must use imp. If you use expdp, then you must use impdp. You can do exp on 9i and imp on 11g with the dumpfile that was created on 9i.
    3. What is the best approach: a) to perform schema-by-schema exports, or b) to perform a full database export with exp / file=xxx.dmp log=xxxx.log full=y?
    This is a case-by-case decision; it depends on what you want. If you want the complete database moved, then I would personally think that full=y is what you want to do. If you just did schema exports, then you would never export the tablespaces, which would mean you would have to create the tablespaces on the target system before you ran imp. There are also objects that are not exported when a schema-level export is performed but are exported when a full export is performed. This information can be seen in the utilities guide: look at what is exported in user/schema mode vs. full/database mode.
    Since there is a database version difference, I don't want to touch SYS and SYSTEM schema objects.
    This is all done for you by the internal workings of exp/imp.
    Dean
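    For reference, the 9i-to-11g path described in answer 2 would look something like this (credentials and paths are placeholders):
    # on the 9i source host
    exp system/<password> full=y file=/u01/exp/full9i.dmp log=/u01/exp/full9i_exp.log
    # on the 11g target host
    imp system/<password> full=y ignore=y file=/u01/exp/full9i.dmp log=/u01/exp/full9i_imp.log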

  • Using export/import to migrate data from 8i to 9i

    We are trying to migrate all data from an 8i database to a 9i database. We plan to migrate the data using the export/import utility so that we can keep the current 8i database intact. The 8i and 9i databases will reside on the same machine. Our 8i database size is around 300 GB.
    We plan to follow below steps :
    Export data from 8i
    Install 9i
    Create tablespaces
    Create schema and tables
    create user (user used for exporting data)
    Import data in 9i
    Please let me know if the par file below is correct for the export:
    BUFFER=560000
    COMPRESS=y
    CONSISTENT=y
    CONSTRAINTS=y
    DIRECT=y
    FEEDBACK=1000
    FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
    FILESIZE=2048GB
    FULL=y
    GRANTS=y
    INDEXES=y
    LOG=export.log
    OBJECT_CONSISTENT=y
    PARFILE=exp.par
    ROWS=y
    STATISTICS=ESTIMATE
    TRIGGERS=y
    TTS_FULL_CHECK=TRUE
    Thanks,
    Vinod Bhansali

    I recommend you change some parameters and remove others:
    BUFFER=560000
    COMPRESS=y -- this gives a better storage structure (it is good)
    CONSISTENT=y
    CONSTRAINTS=y
    DIRECT=n -- if you set this parameter to yes you can have problems with some objects
    FEEDBACK=1000
    FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
    FILESIZE=2048GB
    FULL=y
    GRANTS=y -- this value is the default (it is not necessary)
    INDEXES=y
    LOG=export.log
    OBJECT_CONSISTENT=y -- start the database in restricted mode and do not set this param
    PARFILE=exp.par
    ROWS=y
    STATISTICS=ESTIMATE -- this value is the default (it is not necessary)
    TRIGGERS=y -- this value is the default (it is not necessary)
    TTS_FULL_CHECK=TRUE
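    Applying those recommendations, the trimmed-down parfile would look roughly like this (a sketch based on the advice above; the defaults and the PARFILE line itself are dropped):
    BUFFER=560000
    COMPRESS=y
    CONSISTENT=y
    CONSTRAINTS=y
    DIRECT=n
    FEEDBACK=1000
    FILE=dat1.dmp, dat2.dmp, dat3.dmp
    FILESIZE=2048GB
    FULL=y
    INDEXES=y
    LOG=export.log
    ROWS=y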
    You can see which parameters are not needed if you run this command:
    [oracle@ozawa oracle]$ exp help=y
    Export: Release 9.2.0.1.0 - Production on Sun Dec 28 16:37:37 2003
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    You can let Export prompt you for parameters by entering the EXP
    command followed by your username/password:
    Example: EXP SCOTT/TIGER
    Or, you can control how Export runs by entering the EXP command followed
    by various arguments. To specify parameters, you use keywords:
    Format: EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
    Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
    or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    USERID must be the first parameter on the command line.
    Keyword Description (Default) Keyword Description (Default)
    USERID username/password FULL export entire file (N)
    BUFFER size of data buffer OWNER list of owner usernames
    FILE output files (EXPDAT.DMP) TABLES list of table names
    COMPRESS import into one extent (Y) RECORDLENGTH length of IO record
    GRANTS export grants (Y) INCTYPE incremental export type
    INDEXES export indexes (Y) RECORD track incr. export (Y)
    DIRECT direct path (N) TRIGGERS export triggers (Y)
    LOG log file of screen output STATISTICS analyze objects (ESTIMATE)
    ROWS export data rows (Y) PARFILE parameter filename
    CONSISTENT cross-table consistency(N) CONSTRAINTS export constraints (Y)
    OBJECT_CONSISTENT transaction set to read only during object export (N)
    FEEDBACK display progress every x rows (0)
    FILESIZE maximum size of each dump file
    FLASHBACK_SCN SCN used to set session snapshot back to
    FLASHBACK_TIME time used to get the SCN closest to the specified time
    QUERY select clause used to export a subset of a table
    RESUMABLE suspend when a space related error is encountered(N)
    RESUMABLE_NAME text string used to identify resumable statement
    RESUMABLE_TIMEOUT wait time for RESUMABLE
    TTS_FULL_CHECK perform full or partial dependency check for TTS
    VOLSIZE number of bytes to write to each tape volume
    TABLESPACES list of tablespaces to export
    TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
    TEMPLATE template name which invokes iAS mode export
    Export terminated successfully without warnings.
    [oracle@ozawa oracle]$
    Joel Pérez

  • Export Import to Maintain Test and Production Environments

    We have developed an application using locally built Database Providers in Oracle Portal 9.0.2.6, which is installed into 2 schemas, has 5 different providers, over 100 major Portal components (Forms, Reports, and Calendars), and over 200 minor components (LOVs and Links). We have used export/import transport sets with some luck, but it is a struggle because the import procedures are not very robust. Many things (such as missing LOVs, corrupt components, preexisting versions, etc.) can cause an import to fail, and the cleanup necessary to finally achieve a successful import can be very time-consuming.
    Having a robust import mechanism is very important to our strategy for keeping installed (our own and clients') portal instances up-to-date with our latest release. Some of the enhancements that would make it much easier to develop and maintain Portal applications include:
    Within the Portal:
    1. Ability to copy an entire provider within the same portal (rather than one component at a time).
    2. Ability to change the schema to which a Provider is associated.
    3. When copying a component from one provider to another, the dependent items (i.e. LOVs and Links) should be copied to the new second provider as well (i.e. rather than rebuilding each LOV in each provider and then editing each form to point to the new LOVs).
    Transport Sets:
    4. Should allow for changing provider names and provider schema, global component name changes, and resetting unique IDs on import (to create a copy rather than overwrite).
    5. Should allow the option to ignore errors and import all components which pass pre-check (rather than failing all components if all items do not pass pre-check).
    How are other Portal Developers dealing with installing and then rolling out new sets of Locally built Database Providers from Development environments to Production? Are there any whitepapers on the best practices for replicating/installing a portal application to a new portal instance and then keeping it updated?
    Oracle, are any of my wish-list items above on the future enhancement lists? Or have others figured out workarounds?
    Thanks,
    Trenton

    There are a couple of references which can be found on Portalstudio.oracle.com that are of some use:
    1. A FAQ for Portal 9.0.2.6 Export/Import http://portalstudio.oracle.com/pls/ops/docs/FOLDER/COMMUNITY/OTN_CONTENT/MAINPAGE/DEPLOY_PERFORM/9026_EXPORT_IMPORT_FAQ_0308.HTM
    2. Migration Instructions by Larry Boussard (BRUSARDL)
    3. Migrating Oracle Portal from Dev Systems to Production Systems by Dheeraj Kataria.
    These are all useful documents for a successful first-time export-import. However, the limitations and lack of robustness I listed in my first post make the process so time-consuming and error-fraught as to not be a practical development strategy.
