Encryption in EXPDP

Dear DBAs,
Is there a way to encrypt an export dump created with expdp in Oracle 10g?
The tables themselves are not encrypted; I only want to encrypt the dump file.
Thanks in advance for your help.

Data Pump is not a backup tool. If you want a backup, and you want encryption, use the correct tool: RMAN.
But just to answer your question, a demo of export encryption can be found here:
http://www.psoug.org/reference/datapump.html
The fact that this demo exists should not be taken as an indication that encryption does not require additional licensing: it may. Check with your Oracle sales rep.
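For the RMAN route, password-based backup encryption (available from 10.2; Advanced Security licensing may apply) can be sketched roughly like this. The password is a placeholder:

```sql
RMAN> SET ENCRYPTION ON IDENTIFIED BY "MyBackupPwd" ONLY;
RMAN> BACKUP DATABASE;
-- and on the restore side:
RMAN> SET DECRYPTION IDENTIFIED BY "MyBackupPwd";
RMAN> RESTORE DATABASE;
```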

Similar Messages

  • TDE (Encryption) and expdp (Datapump)

    Hi,
    I'm testing TDE's features on an Oracle 10g R2 database. I edited sqlnet.ora, then created a wallet on database 1 (DEV1), created a table, did inserts, exported the data with expdp, and tried to re-import it into another database (DEV2) that I use with Data Guard for failover, but something went wrong. So I tried to re-create my wallet on the first database (DEV1): ALTER SYSTEM SET WALLET CLOSE, deleted the wallet from my hard drive, and dropped the table with PURGE. I went through the whole process of creating the wallet and the table again, but when I tried expdp I got ORA-28362: master key not found.
    How could I not be able to export my data? What can I do to cleanly delete the wallet and recreate it?
    Thanks for your help,
    Francis

    Thank you,
    but I already read those pages and understand them. As I said, even though I delete and recreate the wallet, I can't get Data Pump to extract data with the ENCRYPTION_PASSWORD parameter. If I don't use this parameter everything is OK, but then the data is not encrypted.
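    A clean wallet rebuild on 10gR2 can be sketched as below (illustrative only; the password is a placeholder). Be aware that deleting the wallet makes anything encrypted under the old master key unrecoverable, and ORA-28362 appears whenever an operation such as expdp with ENCRYPTION_PASSWORD needs a master key the current wallet does not hold:

    ```sql
    -- close the wallet, then delete ewallet.p12 from the directory named in sqlnet.ora
    ALTER SYSTEM SET WALLET CLOSE;
    -- with the old wallet file gone, generate a brand-new master key (creates a new wallet)
    ALTER SYSTEM SET ENCRYPTION KEY IDENTIFIED BY "NewWalletPwd";
    -- after each instance restart, reopen the wallet before exporting
    ALTER SYSTEM SET WALLET OPEN IDENTIFIED BY "NewWalletPwd";
    ```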

  • Encryption - expdp

    I am on 10.2.0.4 and want to encrypt the password used by expdp in a scheduled job. What is the best way?
    Thanks.

    E.g.: expdp userid=user/pwd@db full=y etc.
    When you schedule the above export as a job, is there a way to encrypt the password?
    Thanks!

    Use of Oracle Wallet Manager may be a solution for you:
    http://download.oracle.com/docs/cd/B19306_01/network.102/b14268/asowalet.htm
    Regards,
    S.K.
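    One way to keep the clear-text password out of the scheduled job is a secure external password store; a hedged sketch follows, where the paths, TNS alias, and credentials are all placeholders:

    ```shell
    # create a wallet and store the credential for TNS alias "db"
    mkstore -wrl /u01/app/oracle/wallet -create
    mkstore -wrl /u01/app/oracle/wallet -createCredential db scott tiger

    # the client's sqlnet.ora must reference the wallet:
    #   WALLET_LOCATION=(SOURCE=(METHOD=FILE)(METHOD_DATA=(DIRECTORY=/u01/app/oracle/wallet)))
    #   SQLNET.WALLET_OVERRIDE=TRUE

    # the scheduled job can then connect without embedding a password
    expdp /@db full=y directory=DATA_PUMP_DIR dumpfile=full.dmp logfile=full.log
    ```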

  • Expdp error - anyone have an idea on how to fix this?

    I'm trying to get an export of a database that has encrypted columns. exp really isn't a good option; I'd have to decrypt the columns first every time.
    First I created the directory D:\DPDUMP and gave Everyone, SERVICE, and SYSTEM full rights to it. Oracle runs as the Local System account.
    Created a new user, myuser, that has the DBA role with EXP_FULL_DATABASE etc.
    Made directory object.
    CREATE DIRECTORY dpump_dir_01 AS 'D:\DPDUMP';
    grant read, write on directory dpump_dir_01 to myuser;
    Ran expdp as myuser with the following PARFILE entries, just testing on the tables that are giving me problems:
    DIRECTORY=dpump_dir_01
    LOGFILE=Encryptest.log
    DUMPFILE=Encryptest.dmp
    FULL=N
    encryption_password=********
    TABLES=MYBANK.CARDAGREEMENT
    CONTENT=ALL
    and I get these errors:
    ORA-39006: internal error
    ORA-39213: Metadata processing is not available
    I've added USERID="/ as sysdba" to the PARFILE.
    I've also tried the default DATA_PUMP_DIR after granting read/write on it, with the same error: expdp TABLES=MYBANK.CARDAGREEMENT.
    No matter what I do I get this error. From what I've seen, a directory permissions issue normally causes it and granting access fixes it, but I'm having no luck at all.

    Here is the error definition; have you tried the recommended action?
    ORA-39213: Metadata processing is not available
    Cause: The Data Pump could not use the Metadata API. Typically, this is caused by the XSL stylesheets not being set up properly.
    Action: Connect AS SYSDBA and execute dbms_metadata_util.load_stylesheets to reload the stylesheets.
    Nicolas.
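    The recommended action from the error text can be sketched as:

    ```sql
    -- reload the Metadata API stylesheets, then retry the export
    CONNECT / AS SYSDBA
    EXEC dbms_metadata_util.load_stylesheets
    ```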

  • Encrypt datapump in oracle 10g

    Hi, I have an Oracle RAC SE 10gR2 (10.2.03) with 3 nodes on SLES 10 on ibm ppc.
    Every night I run some Oracle exports (expdp) of some schemas from a bash shell script, and I want to encrypt one of these dumps. I see that TDE exists for encrypting columns in a table, and that I would have to create a wallet to encrypt. However, I would prefer to keep clear text (no encryption in my tables and schemas) and only encrypt the dump files. In other words, I keep my database as it is now and just specify an option in my expdp to encrypt.
    Is it possible?
    Can I encrypt those dumps without using TDE (in Oracle 10g)?
    Thank you very much!!

    Rober wrote:
    Can I encrypt those datapumps without use TDE (in Oracle 10g)?

    TDE performs encryption at the storage level.
    Data Pump gained an ENCRYPTION clause in 11g:
    http://www.oracle-base.com/articles/11g/data-pump-enhancements-11gr1.php#encryption_parameters
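    For reference, the 11g parameters look roughly like this (schema name and password are placeholders); on 10g, where these parameters do not exist, one pragmatic alternative is to encrypt the finished dump file from the same bash script:

    ```shell
    # 11g only: password-based dump encryption, no wallet/TDE needed
    expdp system/manager schemas=myschema directory=DATA_PUMP_DIR \
      dumpfile=myschema.dmp encryption=all encryption_mode=password \
      encryption_password=MySecretPwd

    # 10g workaround: encrypt the file after the export completes
    openssl enc -aes-256-cbc -salt -in myschema.dmp -out myschema.dmp.enc \
      -pass file:/secure/keyfile
    ```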

  • How to conduct Tablespace encryption in 11gR1

    hi,
    I recently completed a 10g upgrade to 11gR1 (11.0.6). The upgrade is completed and successful.
    Now there is one read-only tablespace with HR data (SSN, credit card, birth date, etc.) which I need to encrypt.
    I know the high-level steps are:
    1. Use expdp to take a tablespace export.
    2. Drop the tablespace.
    3. Recreate a new encrypted tablespace.
    4. Import into the new encrypted tablespace.
    I have not used expdp or encryption in the past.
    What syntax do I need for expdp to perform the tablespace export?

    So do the following steps have valid syntax?
    sqlplus " /as sysdba"
    sql>drop tablespace lawtbs;
    sql>ALTER SYSTEM SET ENCRYPTION KEY IDENTIFIED BY "lawtbs011";
    sql>CREATE TABLESPACE lawtbs
    DATAFILE '/u01/app/oracle/oradata/lawtbs_data_01.dbf' SIZE 2040m,
    '/u01/app/oracle/oradata/lawtbs_data_02.dbf' SIZE 2040m
    ENCRYPTION USING 'AES256'
    DEFAULT STORAGE(ENCRYPT);
    sql>create or replace directory imp_dir as '/u01/app/import_ts';
    sql>exit;
    $impdp system/<passwd> tablespaces=lawtbs directory=imp_dir dumpfile=expdp_tbs.dmp logfile=impdp_tbs.log
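    For the export step the question asked about, a sketch mirroring the impdp command above (same directory object and dump file name as assumed there):

    ```shell
    expdp system/<passwd> tablespaces=lawtbs directory=imp_dir \
      dumpfile=expdp_tbs.dmp logfile=expdp_tbs.log
    ```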

  • How to make a dmp file for a table contains encrypted column/s?

    Hi All,
    I've created a table with encrypted columns using TDE (wallet). When I tried to make a dmp file with this command:
    exp system/password file=<path>\ex.dmp owner=ex
    the command prompt gave this error: Feature (COLUMN ENCRYPTION) of column EMP_SSN in table EX.EMPLOYEES is not supported. The table will not be exported.
    How do I solve this problem?
    I want to make a dmp file that exports the data of the encrypted columns in encrypted format, not clear format. How?
    Note:
    there is a parameter, ENCRYPTED_COLUMNS_ONLY: encrypted columns are written to the dump file set in encrypted format.

    Start by reading the manual:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_overview.htm#SUTIL100
    Or you can run in interactive mode:
    [oracle@dev-oranode-221 ~]$ expdp
    Export: Release 11.2.0.1.0 - Production on Mon Nov 23 13:08:00 2009
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Username: a
    Password:
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining, Oracle Database Vault
    and Real Application Testing options
    FLASHBACK automatically enabled to preserve database integrity.
    Starting "A"."SYS_EXPORT_SCHEMA_01":  a/********
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 192 KB
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/POST_TABLE_ACTION
    Processing object type SCHEMA_EXPORT/MATERIALIZED_VIEW
    Processing object type SCHEMA_EXPORT/TABLE/MATERIALIZED_VIEW_LOG
    Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
    . . exported "A"."EMP"                                   5.421 KB       1 rows
    . . exported "A"."EMP_MV"                                5.429 KB       1 rows
    . . exported "A"."TEST"                                  5.023 KB       1 rows
    . . exported "A"."MLOG$_EMP"                                 0 KB       0 rows
    . . exported "A"."SOURCE"                                    0 KB       0 rows
    Master table "A"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
    Dump file set for A.SYS_EXPORT_SCHEMA_01 is:
      /opt/oracle/product/admin/db1122/dpdump/expdat.dmp
    Job "A"."SYS_EXPORT_SCHEMA_01" successfully completed at 13:09:23

  • Data Pump Export with Column Encryption

    Hi All-
    I have the following two questions:
    - Can I export encrypted column data to a file, preserving the data encrypted? (ie. I don't want clear text in resulting file)
    - If yes, what would another person need to decrypt the generated file? (ie. I want to give the resulting file to a customer)
    Thanks a lot for your help!
    Juan

    Hi,
    expdp and impdp work only with Oracle databases; you need an Oracle database on both the source and the destination to use expdp/impdp. You can also use expdp with the ENCRYPTION_PASSWORD option, so the password can be passed to the client for doing the impdp at their end.
    Regards
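    A minimal sketch of that approach (schema, table, and password are placeholders; on 10gR2 the target database also needs TDE set up to hold the decrypted column data):

    ```shell
    # export, keeping encrypted columns encrypted in the dump
    expdp system/password tables=ex.employees directory=DATA_PUMP_DIR \
      dumpfile=ex.dmp encryption_password=SharedSecret

    # the customer imports with the same shared password
    impdp system/password tables=ex.employees directory=DATA_PUMP_DIR \
      dumpfile=ex.dmp encryption_password=SharedSecret
    ```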

  • How to make export with sensitive user data encrypted?

    My organization needs a backup dump of the prod database sent to a customer, but wants me to encrypt some users' passwords and some data.
    Which expdp parameter can do that?
    Or is there any other way I can do it?
    Thanks in advance.

    To show how an export and import can be encrypted, at a very basic level, here is an example of exporting the schema rmanencrypt from database orcl (source) to devorcl (target database).
    Encrypted export of schema "rmanencrypt":
    Source Database
    [oracle@localhost ~]$ expdp \'/as sysdba\' DIRECTORY=DATA_PUMP_DIR DUMPFILE=encrypted.dmp schemas=rmanencrypt LOGFILE=encrypted.log ENCRYPTION=ALL ENCRYPTION_PASSWORD=foobar
    Export: Release 11.2.0.3.0 - Production on Wed Jan 11 11:21:47 2012
    Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYS"."SYS_EXPORT_SCHEMA_01":  "/******** AS SYSDBA" DIRECTORY=DATA_PUMP_DIR DUMPFILE=encrypted.dmp schemas=rmanencrypt LOGFILE=encrypted.log ENCRYPTION=ALL ENCRYPTION_PASSWORD=********
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 128 KB
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    . . exported "RMANENCRYPT"."D"                           5.945 KB       4 rows
    . . exported "RMANENCRYPT"."E"                           8.578 KB      14 rows
    Master table "SYS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
    Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
      /home/oracle/app/oracle/admin/orcl/dpdump/encrypted.dmp
    Job "SYS"."SYS_EXPORT_SCHEMA_01" successfully completed at 11:22:36

    At the target database this schema doesn't exist:
    SQL> r
      1  select username from dba_users
      2* where username='RMANENCRYPT'
    no rows selected

    Now import the backup, first giving a wrong password:
    [oracle@localhost dpdump]$ impdp \'/as sysdba\' DIRECTORY=DATA_PUMP_DIR DUMPFILE=encrypted.dmp ENCRYPTION_PASSWORD=fooba
    Import: Release 11.2.0.3.0 - Production on Wed Jan 11 11:27:31 2012
    Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-39002: invalid operation
    ORA-39176: Encryption password is incorrect.

    Now with the correct password:
    [oracle@localhost dpdump]$ impdp \'/as sysdba\' DIRECTORY=DATA_PUMP_DIR DUMPFILE=encrypted.dmp ENCRYPTION_PASSWORD=foobar
    Import: Release 11.2.0.3.0 - Production on Wed Jan 11 11:27:47 2012
    Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "SYS"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYS"."SYS_IMPORT_FULL_01":  "/******** AS SYSDBA" DIRECTORY=DATA_PUMP_DIR DUMPFILE=encrypted.dmp ENCRYPTION_PASSWORD=********
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "RMANENCRYPT"."D"                           5.945 KB       4 rows
    . . imported "RMANENCRYPT"."E"                           8.578 KB      14 rows
    Job "SYS"."SYS_IMPORT_FULL_01" successfully completed at 11:27:58
    SQL> conn rmanencrypt/rmanencrypt
    Connected.
    SQL> select * from tab;
    TNAME                          TABTYPE  CLUSTERID
    D                              TABLE
    E                              TABLE

    HTH.
    Regards,
    NC

  • Transparent Data Encryption

    Hi all,
    Is the Transparent Data Encryption method available in Oracle 10g Release 1?
    In Release 2 I am able to do TDE with Wallet Manager, but it doesn't seem possible in Oracle 10g Release 1, and I cannot find dba_encrypted_columns in that release. Kindly guide me: is there any script or method I can use to configure it manually?

    Hi,
    The purpose of REMAP_SCHEMA is to copy (load) a source schema into a target schema.
    Suppose that you execute the following Export and Import commands to remap the hr schema into the scott schema:
    expdp SYSTEM/password SCHEMAS=hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp
    impdp SYSTEM/password DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp REMAP_SCHEMA=hr:scott
    In this example, if user scott already exists before the import, then the Import REMAP_SCHEMA command will add objects from the hr schema into the existing scott schema. You can connect to the scott schema after the import by using the existing password (without resetting it).
    If user scott does not exist before you execute the import operation, Import automatically creates it with an unusable password. This is possible because the dump file, hr.dmp, was created by SYSTEM, which has the privileges necessary to create a dump file that contains the metadata needed to create a schema. However, you cannot connect to scott on completion of the import, unless you reset the password for scott on the target database after the import completes.
    You can map different source schemas to the same target schema.
    Thanks
    Pavan Kumar N

  • Expdp from diff home

    Hi
    If I have two Oracle homes, 10g and 11g, on the same server and I want to run an expdp of the 11g database from the 10g home, is that possible?
    For exp it is possible, but I am not able to do it using expdp.
    Please suggest.

    Carlovski wrote:
    OK, yes, you could use the expdp utility to launch it from a different Oracle home, but it's just a wrapper to make the appropriate calls to DBMS_DATAPUMP.
    I'm not sure if doing so would automatically set the correct VERSION parameter for you.
    I really don't see why you would want to do such a thing though.
    Carl

    You have a good point indeed; Oracle is not setting the correct version...
    Assuming it is for further import into a 10.2 database from an 11.2 database:
    # sqlplus system/***@hdmo912
    SQL*Plus: Release 10.2.0.4.0 - Production on Tue Sep 28 15:27:09 2010
    Copyright (c) 1982, 2007, Oracle.  All Rights Reserved.
    Connected to:
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    SQL> show parameter compatible
    NAME                                 TYPE                              VALUE
    compatible                           string                            11.2.0.1.0

    From the 10.2.0.4 client on the same machine as the database (without VERSION specified):
    # expdp system/***@hdmo912 dumpfile=exp11g_10gexec.dmp tables=sysadm.psoprdefn logfile=test1.log
    Export: Release 10.2.0.4.0 - 64bit Production on Tuesday, 28 September, 2010 15:30:24
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYSTEM"."SYS_EXPORT_TABLE_01":  system/********@hdmo912 dumpfile=exp11g_10gexec.dmp tables=sysadm.psoprdefn logfile=test1.log
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 64 KB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "SYSADM"."PSOPRDEFN"                        41.82 KB     155 rows
    Master table "SYSTEM"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
    Dump file set for SYSTEM.SYS_EXPORT_TABLE_01 is:
      /appl/oracle/product/11.2.0/rdbms/log/exp11g_10gexec.dmp
    Job "SYSTEM"."SYS_EXPORT_TABLE_01" successfully completed at 15:31:17

    From the 11.2.0.1 home hosting the database:
    # export ORACLE_SID=hdmo912
    # expdp system/*** dumpfile=exp11g_11gexec.dmp tables=sysadm.psoprdefn logfile=test2.log version=10.2
    Export: Release 11.2.0.1.0 - Production on Tue Sep 28 15:34:57 2010
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYSTEM"."SYS_EXPORT_TABLE_01":  system/******** dumpfile=exp11g_11gexec.dmp tables=sysadm.psoprdefn logfile=test2.log version=10.2
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 64 KB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "SYSADM"."PSOPRDEFN"                        40.23 KB     155 rows
    Master table "SYSTEM"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
    Dump file set for SYSTEM.SYS_EXPORT_TABLE_01 is:
      /appl/oracle/product/11.2.0/rdbms/log/exp11g_11gexec.dmp
    Job "SYSTEM"."SYS_EXPORT_TABLE_01" successfully completed at 15:35:18

    From the 10.2.0.4 client on the same machine as the database (with VERSION specified):
    # expdp system/***@hdmo912 dumpfile=exp11g_10gexec_version.dmp tables=sysadm.psoprdefn logfile=test1.log version=10.2
    Export: Release 10.2.0.4.0 - 64bit Production on Tuesday, 28 September, 2010 15:59:59
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYSTEM"."SYS_EXPORT_TABLE_01":  system/********@hdmo912 dumpfile=exp11g_10gexec_version.dmp tables=sysadm.psoprdefn logfile=test1.log version=10.2
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 64 KB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "SYSADM"."PSOPRDEFN"                        40.23 KB     155 rows
    Master table "SYSTEM"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
    Dump file set for SYSTEM.SYS_EXPORT_TABLE_01 is:
      /appl/oracle/product/11.2.0/rdbms/log/exp11g_10gexec_version.dmp
    Job "SYSTEM"."SYS_EXPORT_TABLE_01" successfully completed at 16:00:20

    Checking the dump file headers (see note #462488.1 for the procedure):
    SQL> exec show_dumpfile_info(p_dir=> 'DATA_PUMP_DIR', p_file => 'exp11g_10gexec.dmp')
    Purpose..: Obtain details about export dumpfile.        Version: 19-MAR-2008
    Required.: RDBMS version: 10.2.0.1.0 or higher
    .          Export dumpfile version: 7.3.4.0.0 or higher
    .          Export Data Pump dumpfile version: 10.1.0.1.0 or higher
    Usage....: execute show_dumfile_info('DIRECTORY', 'DUMPFILE');
    Example..: exec show_dumfile_info('MY_DIR', 'expdp_s.dmp')
    Filename.: exp11g_10gexec.dmp
    Directory: DATA_PUMP_DIR
    Disk Path: /appl/oracle/product/11.2.0/rdbms/log/
    Filetype.: 1 (Export Data Pump dumpfile)
    ...File Version....: 3.1 (Oracle11g Release 2: 11.2.0.x)               <----
    ...Master Present..: 1 (Yes)
    ...GUID............: 9152D8F5A184E07CE0430A381E35E07C
    ...File Number.....: 1
    ...Characterset ID.: 871 (UTF8)
    ...Creation Date...: Tue Sep 28 15:31:17 2010
    ...Flags...........: 2
    ...Job Name........: "SYSTEM"."SYS_EXPORT_TABLE_01"
    ...Platform........: IBM AIX64/RS6000 V4 - 8.1.0
    ...Instance........: hdmo912
    ...Language........: UTF8
    ...Block size......: 4096
    ...Metadata Compres: 1 (Yes)
    ...Data Compressed.: 0 (No)
    ...Metadata Encrypt: 0 (No)
    ...Data Encrypted..: 0 (No)
    ...Column Encrypted: 0 (No)
    ...Encrypt.pwd. mod: 2 (None)
    ...Master Piece Cnt: 1
    ...Master Piece Num: 1
    ...Job Version.....: 11.02.00.01.00
    ...Max Items Code..: 22
    PL/SQL procedure successfully completed.
    SQL> exec show_dumpfile_info(p_dir=> 'DATA_PUMP_DIR', p_file => 'exp11g_11gexec.dmp')
    Purpose..: Obtain details about export dumpfile.        Version: 19-MAR-2008
    Required.: RDBMS version: 10.2.0.1.0 or higher
    .          Export dumpfile version: 7.3.4.0.0 or higher
    .          Export Data Pump dumpfile version: 10.1.0.1.0 or higher
    Usage....: execute show_dumfile_info('DIRECTORY', 'DUMPFILE');
    Example..: exec show_dumfile_info('MY_DIR', 'expdp_s.dmp')
    Filename.: exp11g_11gexec.dmp
    Directory: DATA_PUMP_DIR
    Disk Path: /appl/oracle/product/11.2.0/rdbms/log/
    Filetype.: 1 (Export Data Pump dumpfile)
    ...File Version....: 1.1 (Oracle10g Release 2: 10.2.0.x)               <----
    ...Master Present..: 1 (Yes)
    ...GUID............: 9152E932427560D0E0430A381E3560D0
    ...File Number.....: 1
    ...Characterset ID.: 871 (UTF8)
    ...Creation Date...: Tue Sep 28 15:35:18 2010
    ...Flags...........: 2
    ...Job Name........: "SYSTEM"."SYS_EXPORT_TABLE_01"
    ...Platform........: IBM AIX64/RS6000 V4 - 8.1.0
    ...Language........: UTF8
    ...Block size......: 4096
    ...Metadata Compres: 1 (Yes)
    ...Data Compressed.: 0 (No)
    ...Metadata Encrypt: 0 (No)
    ...Data Encrypted..: 0 (No)
    ...Column Encrypted: 0 (No)
    ...Encrypt.pwd. mod: 2 (None)
    ...Job Version.....: 10.02.00.00.00
    ...Max Items Code..: 22
    PL/SQL procedure successfully completed.
    SQL> exec show_dumpfile_info(p_dir=> 'DATA_PUMP_DIR', p_file => 'exp11g_10gexec_version.dmp')
    Purpose..: Obtain details about export dumpfile.        Version: 19-MAR-2008
    Required.: RDBMS version: 10.2.0.1.0 or higher
    .          Export dumpfile version: 7.3.4.0.0 or higher
    .          Export Data Pump dumpfile version: 10.1.0.1.0 or higher
    Usage....: execute show_dumfile_info('DIRECTORY', 'DUMPFILE');
    Example..: exec show_dumfile_info('MY_DIR', 'expdp_s.dmp')
    Filename.: exp11g_10gexec_version.dmp
    Directory: DATA_PUMP_DIR
    Disk Path: /appl/oracle/product/11.2.0/rdbms/log/
    Filetype.: 1 (Export Data Pump dumpfile)
    ...File Version....: 1.1 (Oracle10g Release 2: 10.2.0.x)               <----
    ...Master Present..: 1 (Yes)
    ...GUID............: 915342B989EDE1DAE0430A381E35E1DA
    ...File Number.....: 1
    ...Characterset ID.: 871 (UTF8)
    ...Creation Date...: Tue Sep 28 16:00:20 2010
    ...Flags...........: 2
    ...Job Name........: "SYSTEM"."SYS_EXPORT_TABLE_01"
    ...Platform........: IBM AIX64/RS6000 V4 - 8.1.0
    ...Language........: UTF8
    ...Block size......: 4096
    ...Metadata Compres: 1 (Yes)
    ...Data Compressed.: 0 (No)
    ...Metadata Encrypt: 0 (No)
    ...Data Encrypted..: 0 (No)
    ...Column Encrypted: 0 (No)
    ...Encrypt.pwd. mod: 2 (None)
    ...Job Version.....: 10.02.00.00.00
    ...Max Items Code..: 22
    PL/SQL procedure successfully completed.

    Then trying to import the files into a 10.2.0.4 database:
    # impdp nicolas/nicolas remap_schema=sysadm:nicolas dumpfile=exp11g_10gexec.dmp logfile=test1_imp>
    Import: Release 10.2.0.4.0 - 64bit Production on Tuesday, 28 September, 2010 15:56:45
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, Oracle Label Security, OLAP, Data Mining
    and Real Application Testing options
    ORA-39001: invalid argument value
    ORA-39000: bad dump file specification
    ORA-39142: incompatible version number 3.1 in dump file "/appl/oracle/product/11.2.0/rdbms/log/exp11g_10gexec.dmp"
    # impdp nicolas/nicolas remap_schema=sysadm:nicolas dumpfile=exp11g_11gexec.dmp logfile=test2_imp>
    Import: Release 10.2.0.4.0 - 64bit Production on Tuesday, 28 September, 2010 15:57:01
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, Oracle Label Security, OLAP, Data Mining
    and Real Application Testing options
    Master table "NICOLAS"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "NICOLAS"."SYS_IMPORT_FULL_01":  nicolas/******** remap_schema=sysadm:nicolas dumpfile=exp11g_11gexec.dmp logfile=test2_imp.log directory=nic
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    . . imported "NICOLAS"."PSOPRDEFN"                       40.23 KB     155 rows
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "NICOLAS"."SYS_IMPORT_FULL_01" successfully completed at 15:57:06
    SQL> drop table psoprdefn;
    Table dropped.
    SQL> quit
    # impdp nicolas/nicolas remap_schema=sysadm:nicolas dumpfile=exp11g_10gexec_version.dmp logfile=t>
    Import: Release 10.2.0.4.0 - 64bit Production on Tuesday, 28 September, 2010 16:05:45
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, Oracle Label Security, OLAP, Data Mining
    and Real Application Testing options
    Master table "NICOLAS"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "NICOLAS"."SYS_IMPORT_FULL_01":  nicolas/******** remap_schema=sysadm:nicolas dumpfile=exp11g_10gexec_version.dmp logfile=test3_imp.log directory=nic
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    . . imported "NICOLAS"."PSOPRDEFN"                       40.23 KB     155 rows
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "NICOLAS"."SYS_IMPORT_FULL_01" successfully completed at 16:05:48
    #

    So, you should specify VERSION to be able to import into a lower-version database, even when you run the export with the lower-version expdp utility.
    Nicolas.

  • Unknown parameter name 'ENCRYPTION'

    Hi All,
    I've made a dmp file using expdp for a table containing encrypted columns.
    But when I tried to use the ENCRYPTION parameter to enable encryption of data in the dump file set, the command prompt gives this error:
    unknown parameter name 'ENCRYPTION'
    My DB version: 10.2.1.0.1

    Dev. Musbah wrote:
    It still gives the same error: unknown parameter name 'ENCRYPTION'. Can this parameter not be used in DB 10g Release 2?

    Please see the docs or the help for the command. There is no such parameter in 10g Data Pump:
    C:\Documents and Settings\admin>expdp help=y
    Export: Release 10.2.0.1.0 - Production on Tuesday, 01 December, 2009 14:02:13
    Copyright (c) 2003, 2005, Oracle.  All rights reserved.
    The Data Pump export utility provides a mechanism for transferring data objects
    between Oracle databases. The utility is invoked with the following command:
       Example: expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
    You can control how Export runs by entering the 'expdp' command followed
    by various parameters. To specify parameters, you use keywords:
       Format:  expdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
       Example: expdp scott/tiger DUMPFILE=scott.dmp DIRECTORY=dmpdir SCHEMAS=scott
                   or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    USERID must be the first parameter on the command line.
    Keyword               Description (Default)
    ATTACH                Attach to existing job, e.g. ATTACH [=job name].
    COMPRESSION           Reduce size of dumpfile contents where valid
                          keyword values are: (METADATA_ONLY) and NONE.
    CONTENT               Specifies data to unload where the valid keywords are:
                          (ALL), DATA_ONLY, and METADATA_ONLY.
    DIRECTORY             Directory object to be used for dumpfiles and logfiles.
    DUMPFILE              List of destination dump files (expdat.dmp),
                          e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
    ENCRYPTION_PASSWORD   Password key for creating encrypted column data.
    ESTIMATE              Calculate job estimates where the valid keywords are:
                          (BLOCKS) and STATISTICS.
    ESTIMATE_ONLY         Calculate job estimates without performing the export.
    EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
    FILESIZE              Specify the size of each dumpfile in units of bytes.
    FLASHBACK_SCN         SCN used to set session snapshot back to.
    FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
    FULL                  Export entire database (N).
    HELP                  Display Help messages (N).
    INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
    JOB_NAME              Name of export job to create.
    LOGFILE               Log file name (export.log).
    NETWORK_LINK          Name of remote database link to the source system.
    NOLOGFILE             Do not write logfile (N).
    PARALLEL              Change the number of active workers for current job.
    PARFILE               Specify parameter file.
    QUERY                 Predicate clause used to export a subset of a table.
    SAMPLE                Percentage of data to be exported;
    SCHEMAS               List of schemas to export (login schema).
    STATUS                Frequency (secs) job status is to be monitored where
                          the default (0) will show new status when available.
    TABLES                Identifies a list of tables to export - one schema only.
    TABLESPACES           Identifies a list of tablespaces to export.
    TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
    TRANSPORT_TABLESPACES List of tablespaces from which metadata will be unloaded.
    VERSION               Version of objects to export where valid keywords are:
                          (COMPATIBLE), LATEST, or any valid database version.
    The following commands are valid while in interactive mode.
    Note: abbreviations are allowed
    Command               Description
    ADD_FILE              Add dumpfile to dumpfile set.
    CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
    EXIT_CLIENT           Quit client session and leave job running.
    FILESIZE              Default filesize (bytes) for subsequent ADD_FILE commands
    HELP                  Summarize interactive commands.
    KILL_JOB              Detach and delete job.
    PARALLEL              Change the number of active workers for current job.
                          PARALLEL=<number of workers>.
    START_JOB             Start/resume current job.
    STATUS                Frequency (secs) job status is to be monitored where
                          the default (0) will show new status when available.
                          STATUS[=interval]
    STOP_JOB              Orderly shutdown of job execution and exits the client.
                          STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                          Data Pump job.
    And AFAIK = As Far As I Know
    http://www.google.co.in/search?rlz=1C1GGLS_en-ININ355IN355&sourceid=chrome&ie=UTF-8&q=afaik
    HTH
    Aman....
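    For 10g Release 2, where ENCRYPTION does not exist, a minimal sketch of the one option the help output above does list, ENCRYPTION_PASSWORD (the directory object, schema, table name, and password below are all hypothetical):

    ```shell
    # 10gR2 has no ENCRYPTION parameter; ENCRYPTION_PASSWORD is the only
    # dump-file encryption knob, and it only re-encrypts data from TDE
    # encrypted columns. A parameter file keeps the password off the
    # command line and out of the OS process list.
    cat > exp_enc.par <<'EOF'
    DIRECTORY=dmpdir
    DUMPFILE=enc_cols.dmp
    TABLES=hr.cust_cards
    ENCRYPTION_PASSWORD=MySecret1
    EOF
    # Actual run (commented out; requires a live 10.2 instance and the
    # dmpdir directory object):
    # expdp system PARFILE=exp_enc.par
    cat exp_enc.par
    ```

    On import, the same ENCRYPTION_PASSWORD must be supplied or the encrypted-column data cannot be read back.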

  • Incremental EXPDP !!!

    Dear all,
    I have a production database on Windows 2003 Server 64-bit with Oracle 10.2.0.3 installed. This DB has a schema whose expdp dump file is about 33 GB; every day I take a schema-level export and import it into my test server.
    We have also configured Flashback Database.
    The import takes a lot of time every day even though there are few changes in the schema. Our intention is to refresh the test schema every day.
    Is there a way to take the incremental expdp/flashback to record only the changes and then import it into the test server.
    Thanks.

    C:\>expdp help=y
    Export: Release 11.1.0.6.0 - Production on Friday, 16 October, 2009 10:41:50
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    The Data Pump export utility provides a mechanism for transferring data objects
    between Oracle databases. The utility is invoked with the following command:
       Example: expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
    You can control how Export runs by entering the 'expdp' command followed
    by various parameters. To specify parameters, you use keywords:
       Format:  expdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
       Example: expdp scott/tiger DUMPFILE=scott.dmp DIRECTORY=dmpdir SCHEMAS=scott
                   or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    USERID must be the first parameter on the command line.
    Keyword               Description (Default)
    ATTACH                Attach to existing job, e.g. ATTACH [=job name].
    COMPRESSION           Reduce size of dumpfile contents where valid keyword.
                          values are: ALL, (METADATA_ONLY), DATA_ONLY and NONE.
    CONTENT               Specifies data to unload where the valid keyword
                          values are: (ALL), DATA_ONLY, and METADATA_ONLY.
    DATA_OPTIONS          Data layer flags where the only valid value is:
                          XML_CLOBS-write XML datatype in CLOB format
    DIRECTORY             Directory object to be used for dumpfiles and logfiles.
    DUMPFILE              List of destination dump files (expdat.dmp),
                          e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
    ENCRYPTION            Encrypt part or all of the dump file where valid keyword
                          values are: ALL, DATA_ONLY, METADATA_ONLY,
                          ENCRYPTED_COLUMNS_ONLY, or NONE.
    ENCRYPTION_ALGORITHM  Specify how encryption should be done where valid
                          keyword values are: (AES128), AES192, and AES256.
    ENCRYPTION_MODE       Method of generating encryption key where valid keyword
                          values are: DUAL, PASSWORD, and (TRANSPARENT).
    ENCRYPTION_PASSWORD   Password key for creating encrypted column data.
    ESTIMATE              Calculate job estimates where the valid keyword
                          values are: (BLOCKS) and STATISTICS.
    ESTIMATE_ONLY         Calculate job estimates without performing the export.
    EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
    FILESIZE              Specify the size of each dumpfile in units of bytes.
    FLASHBACK_SCN         SCN used to set session snapshot back to.
    FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
    FULL                  Export entire database (N).
    HELP                  Display Help messages (N).
    INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
    JOB_NAME              Name of export job to create.
    LOGFILE               Log file name (export.log).
    NETWORK_LINK          Name of remote database link to the source system.
    NOLOGFILE             Do not write logfile (N).
    PARALLEL              Change the number of active workers for current job.
    PARFILE               Specify parameter file.
    QUERY                 Predicate clause used to export a subset of a table.
    REMAP_DATA            Specify a data conversion function,
                          e.g. REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO.
    REUSE_DUMPFILES       Overwrite destination dump file if it exists (N).
    SAMPLE                Percentage of data to be exported;
    SCHEMAS               List of schemas to export (login schema).
    STATUS                Frequency (secs) job status is to be monitored where
                          the default (0) will show new status when available.
    TABLES                Identifies a list of tables to export - one schema only.
    TABLESPACES           Identifies a list of tablespaces to export.
    TRANSPORTABLE         Specify whether transportable method can be used where
                          valid keyword values are: ALWAYS, (NEVER).
    TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
    TRANSPORT_TABLESPACES List of tablespaces from which metadata will be unloaded.
    VERSION               Version of objects to export where valid keywords are:
                          (COMPATIBLE), LATEST, or any valid database version.
    The following commands are valid while in interactive mode.
    Note: abbreviations are allowed
    Command               Description
    ADD_FILE              Add dumpfile to dumpfile set.
    CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
    EXIT_CLIENT           Quit client session and leave job running.
    FILESIZE              Default filesize (bytes) for subsequent ADD_FILE commands.
    HELP                  Summarize interactive commands.
    KILL_JOB              Detach and delete job.
    PARALLEL              Change the number of active workers for current job.
                          PARALLEL=<number of workers>.
    REUSE_DUMPFILES       Overwrite destination dump file if it exists (N).
    START_JOB             Start/resume current job.
    STATUS                Frequency (secs) job status is to be monitored where
                          the default (0) will show new status when available.
                          STATUS[=interval]
    STOP_JOB              Orderly shutdown of job execution and exits the client.
                          STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                          Data Pump job.
    Nothing here allows incrementally updating the clone database.
    Once you create a database from a logical backup, it is an altogether new database, i.e. it has no link to the original, so it is hard to correlate the two.
    Regards,
    S.K.
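    Since Data Pump offers nothing incremental, a common workaround for a daily refresh is to cut out the 33 GB dump file entirely and pull the schema straight over a database link. A minimal sketch, run on the test server (the link name PRODLINK, schema APPUSER, and directory object are assumptions):

    ```shell
    # impdp with NETWORK_LINK reads the source database directly, so there
    # is no dump file to write, copy, and re-read on every refresh.
    # TABLE_EXISTS_ACTION=REPLACE drops and recreates each table.
    cat > refresh.par <<'EOF'
    NETWORK_LINK=PRODLINK
    SCHEMAS=APPUSER
    TABLE_EXISTS_ACTION=REPLACE
    DIRECTORY=dmpdir
    LOGFILE=refresh.log
    EOF
    # Actual run (commented out; needs a working database link to production):
    # impdp system PARFILE=refresh.par
    cat refresh.par
    ```

    This still moves the full schema's data each day, but it removes the dump-file write and disk-to-disk transfer from the path.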

  • Expdp /impdp  Win 2003 Oracle 11g

    Hi Guys,
    I did the export from production and I want to restore it on dev. What is the syntax for impdp?
    expdp userid=system/system dumpfile=livelink.dmp schemas=livelink logfile=livelink.log
    impdp userid=system/system dumpfile=e:/oradata/livelink.dmp ........ Is there a fromuser=livelink touser=livelink equivalent?
    Thanx.

    Handle:      NPD
    Status Level:      Newbie
    Registered:      Mar 15, 2007
    Total Posts:      359
    Total Questions:      82 (73 unresolved)
    so many questions & so few answers.
    Is there fromuser=livelink touser=livelink?
    NO
    check out SCHEMAS
    bcm@bcm-laptop:~$ impdp help=yes
    Import: Release 11.2.0.1.0 - Production on Thu Oct 7 08:44:28 2010
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    The Data Pump Import utility provides a mechanism for transferring data objects
    between Oracle databases. The utility is invoked with the following command:
         Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
    You can control how Import runs by entering the 'impdp' command followed
    by various parameters. To specify parameters, you use keywords:
         Format:  impdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
         Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
    USERID must be the first parameter on the command line.
    The available keywords and their descriptions follow. Default values are listed within square brackets.
    ATTACH
    Attach to an existing job.
    For example, ATTACH=job_name.
    CONTENT
    Specifies data to load.
    Valid keywords are: [ALL], DATA_ONLY and METADATA_ONLY.
    DATA_OPTIONS
    Data layer option flags.
    Valid keywords are: SKIP_CONSTRAINT_ERRORS.
    DIRECTORY
    Directory object to be used for dump, log and sql files.
    DUMPFILE
    List of dumpfiles to import from [expdat.dmp].
    For example, DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
    ENCRYPTION_PASSWORD
    Password key for accessing encrypted data within a dump file.
    Not valid for network import jobs.
    ESTIMATE
    Calculate job estimates.
    Valid keywords are: [BLOCKS] and STATISTICS.
    EXCLUDE
    Exclude specific object types.
    For example, EXCLUDE=SCHEMA:"='HR'".
    FLASHBACK_SCN
    SCN used to reset session snapshot.
    FLASHBACK_TIME
    Time used to find the closest corresponding SCN value.
    FULL
    Import everything from source [Y].
    HELP
    Display help messages [N].
    INCLUDE
    Include specific object types.
    For example, INCLUDE=TABLE_DATA.
    JOB_NAME
    Name of import job to create.
    LOGFILE
    Log file name [import.log].
    NETWORK_LINK
    Name of remote database link to the source system.
    NOLOGFILE
    Do not write log file [N].
    PARALLEL
    Change the number of active workers for current job.
    PARFILE
    Specify parameter file.
    PARTITION_OPTIONS
    Specify how partitions should be transformed.
    Valid keywords are: DEPARTITION, MERGE and [NONE].
    QUERY
    Predicate clause used to import a subset of a table.
    For example, QUERY=employees:"WHERE department_id > 10".
    REMAP_DATA
    Specify a data conversion function.
    For example, REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO.
    REMAP_DATAFILE
    Redefine datafile references in all DDL statements.
    REMAP_SCHEMA
    Objects from one schema are loaded into another schema.
    REMAP_TABLE
    Table names are remapped to another table.
    For example, REMAP_TABLE=EMP.EMPNO:REMAPPKG.EMPNO.
    REMAP_TABLESPACE
    Tablespace object are remapped to another tablespace.
    REUSE_DATAFILES
    Tablespace will be initialized if it already exists [N].
    SCHEMAS
    List of schemas to import.
    SKIP_UNUSABLE_INDEXES
    Skip indexes that were set to the Index Unusable state.
    SOURCE_EDITION
    Edition to be used for extracting metadata.
    SQLFILE
    Write all the SQL DDL to a specified file.
    STATUS
    Frequency (secs) job status is to be monitored where
    the default [0] will show new status when available.
    STREAMS_CONFIGURATION
    Enable the loading of Streams metadata
    TABLE_EXISTS_ACTION
    Action to take if imported object already exists.
    Valid keywords are: APPEND, REPLACE, [SKIP] and TRUNCATE.
    TABLES
    Identifies a list of tables to import.
    For example, TABLES=HR.EMPLOYEES,SH.SALES:SALES_1995.
    TABLESPACES
    Identifies a list of tablespaces to import.
    TARGET_EDITION
    Edition to be used for loading metadata.
    TRANSFORM
    Metadata transform to apply to applicable objects.
    Valid keywords are: OID, PCTSPACE, SEGMENT_ATTRIBUTES and STORAGE.
    TRANSPORTABLE
    Options for choosing transportable data movement.
    Valid keywords are: ALWAYS and [NEVER].
    Only valid in NETWORK_LINK mode import operations.
    TRANSPORT_DATAFILES
    List of datafiles to be imported by transportable mode.
    TRANSPORT_FULL_CHECK
    Verify storage segments of all tables [N].
    TRANSPORT_TABLESPACES
    List of tablespaces from which metadata will be loaded.
    Only valid in NETWORK_LINK mode import operations.
    VERSION
    Version of objects to import.
    Valid keywords are: [COMPATIBLE], LATEST or any valid database version.
    Only valid for NETWORK_LINK and SQLFILE.
    The following commands are valid while in interactive mode.
    Note: abbreviations are allowed.
    CONTINUE_CLIENT
    Return to logging mode. Job will be restarted if idle.
    EXIT_CLIENT
    Quit client session and leave job running.
    HELP
    Summarize interactive commands.
    KILL_JOB
    Detach and delete job.
    PARALLEL
    Change the number of active workers for current job.
    START_JOB
    Start or resume current job.
    Valid keywords are: SKIP_CURRENT.
    STATUS
    Frequency (secs) job status is to be monitored where
    the default [0] will show new status when available.
    STOP_JOB
    Orderly shutdown of job execution and exits the client.
    Valid keywords are: IMMEDIATE.
    bcm@bcm-laptop:~$
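    As a sketch of the SCHEMAS suggestion: the Data Pump counterpart of exp/imp's fromuser/touser is REMAP_SCHEMA (also in the help listing above), and note that DUMPFILE takes a file name resolved against a DIRECTORY object, not a full path like e:/oradata. The directory object and target schema name below are hypothetical:

    ```shell
    # Import the livelink schema; drop the REMAP_SCHEMA line if the target
    # schema name is unchanged. DUMPFILE is just a file name; the path
    # comes from the DIRECTORY object defined in the database.
    cat > imp_livelink.par <<'EOF'
    DIRECTORY=dmpdir
    DUMPFILE=livelink.dmp
    SCHEMAS=livelink
    REMAP_SCHEMA=livelink:livelink_dev
    LOGFILE=livelink_imp.log
    EOF
    # Actual run (commented out; needs the dump file in dmpdir's path):
    # impdp system PARFILE=imp_livelink.par
    cat imp_livelink.par
    ```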

  • Expdp is faster

    Oracle 10g / Linux
    Why are Oracle 10g expdp and impdp faster than exp and imp?
    How do I implement network export and network import in a 10g/OEL environment?
    I would like to take a full expdp daily.
    santhanam.

    expdp help=yes
    Export: Release 10.2.0.1.0 - Production on Thursday, 15 April, 2010 19:06:44
    Copyright (c) 2003, 2005, Oracle.  All rights reserved.
    The Data Pump export utility provides a mechanism for transferring data objects
    between Oracle databases. The utility is invoked with the following command:
       Example: expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
    You can control how Export runs by entering the 'expdp' command followed
    by various parameters. To specify parameters, you use keywords:
       Format:  expdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
       Example: expdp scott/tiger DUMPFILE=scott.dmp DIRECTORY=dmpdir SCHEMAS=scott
                   or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    USERID must be the first parameter on the command line.
    Keyword               Description (Default)
    ATTACH                Attach to existing job, e.g. ATTACH [=job name].
    COMPRESSION           Reduce size of dumpfile contents where valid
                          keyword values are: (METADATA_ONLY) and NONE.
    CONTENT               Specifies data to unload where the valid keywords are:
                          (ALL), DATA_ONLY, and METADATA_ONLY.
    DIRECTORY             Directory object to be used for dumpfiles and logfiles.
    DUMPFILE              List of destination dump files (expdat.dmp),
                          e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
    ENCRYPTION_PASSWORD   Password key for creating encrypted column data.
    ESTIMATE              Calculate job estimates where the valid keywords are:
                          (BLOCKS) and STATISTICS.
    ESTIMATE_ONLY         Calculate job estimates without performing the export.
    EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
    FILESIZE              Specify the size of each dumpfile in units of bytes.
    FLASHBACK_SCN         SCN used to set session snapshot back to.
    FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
    FULL                  Export entire database (N).
    HELP                  Display Help messages (N).
    INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
    JOB_NAME              Name of export job to create.
    LOGFILE               Log file name (export.log).
    NETWORK_LINK          Name of remote database link to the source system.
    NOLOGFILE             Do not write logfile (N).
    PARALLEL              Change the number of active workers for current job.
    PARFILE               Specify parameter file.
    QUERY                 Predicate clause used to export a subset of a table.
    SAMPLE                Percentage of data to be exported;
    SCHEMAS               List of schemas to export (login schema).
    STATUS                Frequency (secs) job status is to be monitored where
                          the default (0) will show new status when available.
    TABLES                Identifies a list of tables to export - one schema only.
    TABLESPACES           Identifies a list of tablespaces to export.
    TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
    TRANSPORT_TABLESPACES List of tablespaces from which metadata will be unloaded.
    VERSION               Version of objects to export where valid keywords are:
                          (COMPATIBLE), LATEST, or any valid database version.
    The following commands are valid while in interactive mode.
    Note: abbreviations are allowed
    Command               Description
    ADD_FILE              Add dumpfile to dumpfile set.
    CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
    EXIT_CLIENT           Quit client session and leave job running.
    FILESIZE              Default filesize (bytes) for subsequent ADD_FILE commands.
    HELP                  Summarize interactive commands.
    KILL_JOB              Detach and delete job.
    PARALLEL              Change the number of active workers for current job.
                          PARALLEL=<number of workers>.
    START_JOB             Start/resume current job.
    STATUS                Frequency (secs) job status is to be monitored where
                          the default (0) will show new status when available.
                          STATUS[=interval]
    STOP_JOB              Orderly shutdown of job execution and exits the client.
                          STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                          Data Pump job.
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_export.htm#i1007466
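    For the daily full expdp, a minimal wrapper sketch (it assumes a dmpdir directory object already exists and the exporting user has the EXP_FULL_DATABASE role); the date stamp keeps each day's dump and log distinct:

    ```shell
    # Build a date-stamped parameter file for a daily full export,
    # suitable for cron or the Windows task scheduler.
    STAMP=$(date +%Y%m%d)
    cat > full_${STAMP}.par <<EOF
    FULL=Y
    DIRECTORY=dmpdir
    DUMPFILE=full_${STAMP}.dmp
    LOGFILE=full_${STAMP}.log
    EOF
    # Actual run (commented out; requires a live database):
    # expdp system PARFILE=full_${STAMP}.par
    cat full_${STAMP}.par
    ```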
