Network export in datapump

Hi, I am not a DBA and I am trying to do a network export/import using Data Pump in SQL Developer. I am supposed to do this via the utility wizards in SQL Developer (View > DBA). I have the EXP_FULL_DATABASE and DATAPUMP_EXP_FULL_DATABASE privileges on both source and target (and the corresponding import privileges as well).
Please guide me on the process. On the command line I know we have to add NETWORK_LINK=source_database_link to a normal export, but how do we do it using the wizard?
Please suggest.
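
For reference, the command-line equivalent looks roughly like this. A minimal sketch only; the credentials, directory object, and link name are placeholders, not values from this thread:

# network export: expdp connects to the local database and pulls the data
# over the database link, writing the dump file locally
expdp system/password DIRECTORY=dp_dir DUMPFILE=net_exp.dmp LOGFILE=net_exp.log NETWORK_LINK=source_database_link FULL=Y

# network import: no dump file at all; the rows travel straight over the link
impdp system/password DIRECTORY=dp_dir LOGFILE=net_imp.log NETWORK_LINK=source_database_link SCHEMAS=app_schema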

Rich, I connected with PuTTY to try this out:
impdp schema1/psswd directory=DATA tables=xyz logfile=network_imp.log network_link=link                         
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39006: internal error
ORA-39065: unexpected master process exception in DISPATCH
ORA-00904: "PARENT_PROCESS_ORDER": invalid identifier
ORA-39097: Data Pump job encountered unexpected error -904
What is wrong? The link is a public DB link, and from schema1 I can run select * from table@link without any problem...
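
Not an answer from the thread, but one assumption worth ruling out: an ORA-00904 on an internal Data Pump column during a NETWORK_LINK job often points to the two databases being on different versions or patch levels, so the master-table definitions don't match. A quick comparison over the same link:

select banner from v$version;       -- local (target) side
select banner from v$version@link;  -- remote (source) side

If they differ, look at the impdp VERSION parameter and the patch levels with Oracle Support; that is a guess to verify, not a confirmed fix.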

Similar Messages

  • Script for export in datapump  -- help needed !!!

    hello all,
    i am using the following script as a batch file in my database for export
    script:
    =========
    exp name/password file=d:\exp\%date%.dmp full=y log=d:\exp\exp.log
    this replaces the first file (Monday's) on the following Monday.
    in the same way, i need a script for a full database export using Data Pump.
    thanks,
    gold

    Log in to the database as a DBA and create a directory object for your dump file path:
    create directory dpump_dir as 'd:\exp';
    Then use the script below for the export:
    expdp username/password full=y directory=dpump_dir dumpfile=%date%.dmp logfile=exp.log
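
    One caveat worth adding (an assumption about the environment, since the script relies on %date%): on many Windows locales %date% expands with slashes, which are invalid in file names, and unlike exp, expdp refuses to overwrite an existing dump file. A sanitized sketch:

    rem replace slashes so the expanded date is file-name safe
    set stamp=%date:/=-%
    expdp username/password full=y directory=dpump_dir dumpfile=exp_%stamp%.dmp logfile=exp_%stamp%.log

    If the file really must be replaced weekly, delete the old dump first (or use REUSE_DUMPFILES=Y on 11g and later).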

  • Error while export using DATAPUMP

    hi,
    When I try to export the database using Data Pump I get the following error.
    Details:
    DB version: 10.2.0.2
    OS: HP-UX
    error:
    dbsrv:/u02/oradata2> 6 dumpfile=uatdb_preEOD28_jul.dmp logfile=uatdb_preEOD28jul.log directory=expdp_new <
    Export: Release 10.2.0.1.0 - 64bit Production on Sunday, 27 July, 2008 10:00:38
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    Starting "OCEANFNC"."JOB6": userid=oceanfnc/********@uatdb content=ALL full=Y job_name=job6 dumpfile=uatdb_preEOD28_jul.dmp logfile=uatdb_preEOD28jul.log directory=expdp_new
    Estimate in progress using BLOCKS method...
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
    ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
    ORA-31642: the following SQL statement fails:
    BEGIN "DMSYS"."DBMS_DM_MODEL_EXP".SCHEMA_CALLOUT(:1,0,1,'10.02.00.01.00'); END;
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
    ORA-06512: at "SYS.DBMS_METADATA", line 907
    ORA-04067: not executed, package body "DMSYS.DBMS_DM_MODEL_EXP" does not exist
    ORA-06508: PL/SQL: could not find program unit being called: "DMSYS.DBMS_DM_MODEL_EXP"
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPW$WORKER", line 6235
    ----- PL/SQL Call Stack -----
    object handle       line number  object name
    c0000002ae4f0a60    14916        package body SYS.KUPW$WORKER
    c0000002ae4f0a60    6300         package body SYS.KUPW$WORKER
    c0000002ae4f0a60    9120         package body SYS.KUPW$WORKER
    c0000002ae4f0a60    1880         package body SYS.KUPW$WORKER
    c0000002ae4f0a60    6861         package body SYS.KUPW$WORKER
    c0000002ae4f0a60    1262         package body SYS.KUPW$WORKER
    c000000264e48dc0    2            anonymous block
    Job "OCEANFNC"."JOB6" stopped due to fatal error at 10:00:41
    Kindly advise me regarding this!
    Thanks

    hi,
    we found a Metalink note for this problem: Note ID 433022.1. Even after running the catalog.sql and catproc.sql scripts it mentions, our DB still has problems. The error description:
    ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.system_info_exp(0,dynconnect,'10.02.00.01.00',newblock)
    ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
    ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 5327
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/PRE_SYSTEM_ACTIONS/PROCACT_SYSTEM
    ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.system_info_exp(1,dynconnect,'10.02.00.01.00',newblock)
    [same ORA-04067 / ORA-06508 / ORA-06512 stack as above]
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/POST_SYSTEM_ACTIONS/PROCACT_SYSTEM
    ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('SYS',0,1,'10.02.00.01.00',newblock)
    ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
    ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 5412
    The identical schema_info_exp failure (with the same ORA-04067 / ORA-06508 / ORA-06512 stack) then repeats for every remaining schema in the database: SYSTEM, OUTLN, TSMSYS, ANONYMOUS, OLAPSYS, SYSMAN, MDDATA, UATFNC, MGMT_VIEW, SCOTT, BRNSMS, FLEXML, FLEXBO, BRN000, FLEXRPT, BRN800, BRN900 and OCEANFNC.
    Kindly advise me regarding this!!
    Thanks a lot for your reply!!
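
    A hedged suggestion, not from the thread itself: both failures involve component schemas (DMSYS, EXFSYS) whose package bodies are missing, so it is worth checking which database components are actually installed and valid before rerunning the catalog scripts. The registry view exists in 10g:

    select comp_id, comp_name, version, status
      from dba_registry
     order by comp_id;

    Components showing INVALID, or export callouts referencing components not listed at all, are what Note 433022.1 walks through cleaning up.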

  • Total number of records of partition tables exported using datapump

    Hi All,
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    OS: RHEL
    I exported a partitioned table using Data Pump and I would like to verify the total number of records exported across all its partitions. The export has a query on it: WHERE ITEMDATE < TO_DATE('1-JAN-2010'). I need to compare that total with the exact record count in the actual table to confirm they match.
    Below is the log of the exported table. It does not show the total number of rows exported, only the count per partition.
    Starting "SYS"."SYS_EXPORT_TABLE_05": '/******** AS SYSDBA' dumpfile=data_pump_dir:GSDBA_APPROVED_TL.dmp nologfile=y tables=GSDBA.APPROVED_TL query=GSDBA.APPROVED_TL:"
    WHERE ITEMDATE< TO_DATE(\'1-JAN-2010\'\)"
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 517.6 MB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2008_Q3" 35.02 MB 1361311 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2008_Q4" 33.23 MB 1292051 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2010_Q4" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2011_Q1" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2009_Q3" 30.53 MB 1186974 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2010_Q3" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2009_Q1" 30.44 MB 1183811 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2009_Q2" 30.29 MB 1177468 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2009_Q4" 30.09 MB 1170470 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2010_Q2" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2011_Q2" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2010_Q1" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2011_Q3" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2011_Q4" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2012_Q1" 5.875 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2006_Q3" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2006_Q4" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2007_Q1" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2007_Q2" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2007_Q3" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2007_Q4" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2008_Q1" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2008_Q2" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2012_Q2" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2012_Q3" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_2012_Q4" 0 KB 0 rows
    . . exported "GSDBA"."APPROVED_TL":"APPROVED_TL_MAXVALUE" 0 KB 0 rows
    Master table "SYS"."SYS_EXPORT_TABLE_05" successfully loaded/unloaded
    Dump file set for SYS.SYS_EXPORT_TABLE_05 is:
    /u01/export/GSDBA_APPROVED_TL.dmp
    Job "SYS"."SYS_EXPORT_TABLE_05" successfully completed at 12:00:36

    I assume you want this so you can run a script to check the count? If not, and this is being done manually, just add up the individual rows. I'm not very good at writing scripts, but I would think someone here could come up with one that sums the row counts for the table's partitions from your log file. A grand total is not something Data Pump writes to the log file.
    Dean
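
    Following up on Dean's point, here is a minimal sketch of such a script, assuming the screen output above has been saved to a file (the file name is a placeholder; the export itself ran with nologfile=y):

    # sum the per-partition row counts printed by Data Pump
    grep -F '. . exported' approved_tl_export.out | awk '{total += $(NF-1)} END {print total, "rows"}'

    To compare against the source table, run the same predicate in the database:

    select count(*) from gsdba.approved_tl where itemdate < to_date('1-JAN-2010');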

  • Schedule Daily Export Using Datapump

    Hi
    I want to schedule a daily full Data Pump export using the Oracle database features, such as Enterprise Manager.
    How can I do it?
    Please describe the complete solution; I have tried it many times, but...
    Thank You In Advance

    hi hesam
    you would be better off writing a shell script and adding it as a cron job at the required time
    cron jobs:
    http://www.quest-pipelines.com/newsletter/cron.htm
    one catch is that the exp command will not work from cron unless the Oracle environment is set up in the script;
    try the one below (adjust the first line to wherever your ksh shell is located):
    #!/bin/ksh
    # set up the Oracle environment
    export ORACLE_SID=TEST
    export ORAENV_ASK=NO
    . oraenv
    # note the braces: without them the shell would look for a variable named ORACLE_SID_full
    FILE_STUB=/u02/backup/$ORACLE_SID/exp_${ORACLE_SID}_full
    exp system/manager file=$FILE_STUB.dmp log=$FILE_STUB.log full=y direct=y
    or, alternatively, you can set up your environment in your crontab:
    10 0 * * * (/bin/ksh "export ORACLE_SID=TEST; export ORAENV_ASK=NO; . $HOME/.profile; $HOME/your_export_script_goes_here.ksh")
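
    Since the original question asked for database features such as Enterprise Manager rather than cron, DBMS_SCHEDULER is another route: it can run an external wrapper script on a calendar. A minimal sketch; the job name, schedule, and script path are my assumptions:

    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'DAILY_FULL_EXPDP',             -- hypothetical name
        job_type        => 'EXECUTABLE',
        job_action      => '/u02/backup/daily_expdp.ksh',  -- a wrapper like the ksh script above
        repeat_interval => 'FREQ=DAILY;BYHOUR=0;BYMINUTE=10',
        enabled         => TRUE,
        comments        => 'Daily full Data Pump export');
    END;
    /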

  • Table export using datapump  -  estimate the time to export

    I aim to export table ABC, which is 300GB in size, using Data Pump (Oracle 10g).
    I would like to know roughly how long the export will take (an estimated time).

    There are so many unknown variables (the number of CPUs, the speed of the disks, other load on your database) that any guess is very inaccurate.
    Based on your old version 10g, and the assumption that your server is of the same age and average power, you should be looking at 30-50GB an hour, i.e. 6 to 10 hours. I could be wrong though.
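
    One practical step (my suggestion, not from the thread): let Data Pump size the job without writing anything, then time a small representative export to get your own MB-per-hour figure. Placeholder credentials:

    expdp system/password TABLES=ABC ESTIMATE_ONLY=Y ESTIMATE=BLOCKS NOLOGFILE=Y

    Dividing the reported estimate by your measured throughput gives a far better prediction than any rule of thumb.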

  • DBMS_JOB not exported by DATAPUMP?

    Hi
    We are setting up expdp, and we cannot get DBMS_JOB jobs to be exported.
    So far we have not found any documentation on whether DBMS_JOB jobs are supposed to be exported or not.
    I know DBMS_JOB is going to be deprecated, and DBMS_SCHEDULER jobs are being exported.
    Is there any document describing how DBMS_JOB interacts with Data Pump?
    Thanks

    I did not find any document even on Metalink. You can query data pump dictionary views but they only mention "job":
    select * from database_export_objects where object_path like '%JOB%';

    OBJECT_PATH                  COMMENTS
    DATABASE_EXPORT/SCHEMA/JOB   Jobs
    JOB                          Jobs
    SCHEMA/JOB                   Jobs

    select * from schema_export_objects where object_path like '%JOB%';

    OBJECT_PATH                  COMMENTS
    JOB                          Jobs in the selected schemas
    SCHEMA_EXPORT/JOB            Jobs in the selected schemas
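
    These object paths can be fed straight back to Data Pump. A minimal sketch (my example, with placeholder names) to test whether DBMS_JOB definitions come along in a schema export:

    expdp system/password SCHEMAS=app_schema INCLUDE=JOB DIRECTORY=dp_dir DUMPFILE=jobs_test.dmp LOGFILE=jobs_test.log

    If job definitions are picked up, the log should show a "Processing object type SCHEMA_EXPORT/JOB" line.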

  • How do I perform an export with DataPump on commandline ?

    According to the page
    http://www.oracle.com/technology/obe/10gr2_db_vmware/manage/datapump/datapump.htm#t1tt
    I tried in SQL*Plus to export everything from table CTEST to a local file
    D:\mydata\tabledata.txt by entering the following command:
    expdp "D:\mydata" TABLES=CTEST DUMPFILE=tabledata.txt
    but I got this answer:
    SP2-0734: unknown command beginning "expdp D:\m..."
    Why?
    How can I dump a table into a text file otherwise?

    Perhaps your (10g) Oracle binaries location is not in your PATH?
    Perhaps you are actually executing against a 9i or earlier database release?
    Perhaps you have quotes where they do not belong (why "D:\mydata"?), and it appears as if no login credentials were given at all.
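
    To make those points concrete: expdp is an operating-system utility, not a SQL*Plus command, and it writes a binary dump file, never a text file. A minimal sketch with placeholder names (for a real text extract, SPOOL a query from SQL*Plus instead):

    -- once, in SQL*Plus as a privileged user: map a directory object to the folder
    CREATE DIRECTORY my_dir AS 'D:\mydata';
    GRANT READ, WRITE ON DIRECTORY my_dir TO scott;

    Then, from the Windows command prompt (not the SQL> prompt):

    expdp scott/tiger TABLES=CTEST DIRECTORY=my_dir DUMPFILE=ctest.dmp LOGFILE=ctest.log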

  • Estimate size of export with datapump

    Hi,
    When I run expdp with ESTIMATE_ONLY=TRUE, the estimate gives me 383M.
    When I run the actual expdp, the sum of the files generated is not 383M but 143.64M.
    Why?
    Regards

    Fahd Mirza wrote: "The estimated size of an expdp is calculated using the statistics of each table."
    That's wrong; or at least, by default the estimate works from BLOCKS, not from STATISTICS.
    Find out more in "Understanding the ESTIMATE and ESTIMATE_ONLY Parameters in Export DataPump", ID 786165.1.
    And the blocks method may give a less accurate size estimate when tables were created with large extents, or after a mass delete...
    Nicolas.
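
    To see the difference Nicolas describes, both methods can be run side by side without producing a dump file (placeholder credentials; STATISTICS is only as good as the statistics, so refresh them first):

    -- in SQL*Plus: refresh optimizer statistics for the schema
    EXEC DBMS_STATS.GATHER_SCHEMA_STATS('SCOTT');

    expdp scott/tiger SCHEMAS=scott ESTIMATE_ONLY=Y ESTIMATE=BLOCKS NOLOGFILE=Y
    expdp scott/tiger SCHEMAS=scott ESTIMATE_ONLY=Y ESTIMATE=STATISTICS NOLOGFILE=Y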

  • HOW-TO Distinguish between file exported from Datapump and normal export

    How do you distinguish between a dmp file exported by the Data Pump utility and one from the original export utility, in Oracle 10g and 11g?
    Thanks,
    SS

    As Werner suggested, it never hurts to just try:
    run imp with SHOW=Y; if it works, the file is from the original export utility, otherwise it is from Data Pump.
    Once you have determined one file, there's a good chance the rest are the same; I don't think the previous DBA went to the trouble of mixing Data Pump and traditional exp/imp.
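
    In my experience (an observation, not from this thread) the file header alone usually settles it, because classic exp dump files begin with a readable banner. A quick check on Unix:

    # classic exp files start with a readable "EXPORT:V..." banner; Data Pump files do not
    head -c 32 mystery.dmp | strings
    # typical classic output: EXPORT:V10.02.01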

  • Transaction Status during Database Export through DataPump

    Hi,
    I have a question and need clarification. We have a 120GB database (version 11gR1) with a single application schema in addition to the standard schemas (like SYSTEM, SYSAUX, EXAMPLE, etc.). The objects in the application schema account for roughly 110-115GB; the rest is the other standard schemas' objects. Application users make transactions through a front-end J2EE application that hits the application schema's objects.
    We regularly expdp that schema. Now my question: when we export the schema while users are performing transactions, what is the status of the transactions made during the export process? Will they be reflected in the dump or not? What is Oracle's actual mechanism for such transactions during a Data Pump export?
    Regards,
    Kamran

    Let's say you have 3 tables
    table a
    table b
    table c with partition 1
    table c with partition 2
    Now let's say all of these tables are being exported and while the export is happening, users are adding/dropping rows to these tables/partitions. This is what Data Pump will do if you don't specify flashback_time or flashback_scn;
    Since table a and table b are not partitioned, they will be exported with the scn that is current when the unload of that table is scheduled. So,
    table a gets scheduled to be unloaded when the scn is 12345. The data in the dumpfile for this table will be consistent with 12345. If rows were modified before 12345, the data in the dumpfile will have those rows. If rows were added after 12345, they would not be in the dumpfiles.
    Table b gets scheduled to be unloaded when the scn is 23456. The data in the dumpfile for this table will be consistent with 23456. If rows were modified before 23456, the data in the dumpfile will have those rows. If rows were added after 23456, they would not be in the dumpfiles. If you had ref constraints from table a to table b, you could hit a problem.
    Table c partition 1 gets scheduled at 34567. All of the data in the dumpfile will be consistent with 34567, just like above.
    Here is the exception to the rule!
    Table c partition 2 gets scheduled at 45678. The difference is that partition 1 of this table was scheduled at 34567, so the data in this partition will also be consistent with 34567. All partitions of a table use the same SCN as the first partition that is scheduled.
    If you want a consistent dumpfile, use either
    flashback_time='some date' (you can use flashback_time=sysdate if you want to put it in a script and not have to keep updating the date)
    or
    flashback_scn=some_scn
    Hope this explains what you wanted to know.
    Dean
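
    A minimal sketch of the consistent-export option Dean describes; the schema, directory, and file names are placeholders, and the flashback_time=sysdate trick is the one from his reply:

    # export_app.par
    schemas=APP_SCHEMA
    directory=dp_dir
    dumpfile=app_consistent.dmp
    logfile=app_consistent.log
    flashback_time=sysdate

    expdp system/password parfile=export_app.par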

  • Datapump Export stops at "Estimate in progress...."

    Hi,
    I am facing an issue while doing a schema-level Data Pump export in Oracle 10g. The export for one particular schema stops at "Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA", and moreover it spawns only one worker (DW01) irrespective of the PARALLEL parameter value. For other schemas the export works fine, and even a table-level export of the problematic schema works.
    I am clueless, because the alert log does not show anything; can anyone please advise...
    Here is what my parfile looks like:
    userid=id/password
    directory=impdir
    parallel=2
    schemas=prod11sep12
    dumpfile=expC2P_20120925_%U.dmp
    logfile=expC2P_20120925.log
    job_name=expC2P_20120925
    tail -f expC2P_20120925.log
    bash-3.00$ expdp parfile=expC2P.par ESTIMATE=STATISTICS
    Export: Release 10.2.0.4.0 - 64bit Production on Wednesday, 26 September, 2012 16:44:30
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYSTEM"."EXPC2P_20120925": parfile=expC2P.par ESTIMATE=STATISTICS
    Estimate in progress using STATISTICS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Alert log:
    kupprdp: master process DM00 started with pid=38, OS id=15156
    to execute - SYS.KUPM$MCP.MAIN('EXPC2P_20120925', 'SYSTEM', 'KUPC$C_1_20120926164430', 'KUPC$S_1_20120926164430', 0);
    kupprdp: worker process DW01 started with worker id=1, pid=46, OS id=15201
    to execute - SYS.KUPW$WORKER.MAIN('EXPC2P_20120925', 'SYSTEM');
    Thanks in Advance...

    Please enable trace as per this MOS document to see if additional debug information can be gathered:
    Export/Import DataPump Parameter TRACE - How to Diagnose Oracle Data Pump [ID 286496.1]
    HTH
    Srini
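
    While gathering that trace, it can also help to watch what the lone worker is doing. Data Pump's interactive mode lets you attach to the running job (the job name simply echoes the parfile above):

    expdp system/password ATTACH=EXPC2P_20120925
    Export> STATUS

    STATUS shows the state of each worker; STOP_JOB=IMMEDIATE halts the job, and START_JOB on a later ATTACH resumes it.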

  • Export  "sys.aud$"  table as system user using datapump

    Friends,
    I want to export the SYS user's AUD$ table (sys.aud$) as the SYSTEM user, using Data Pump (expdp). But it shows the following error:
    bash-3.00$ expdp system/sys123@onlinete directory=test_dir TABLES=sys.AUD$ DUMPFILE=sysaud.$Date.dmp logfile=audit.$date.log
    Export: Release 10.2.0.1.0 - 64bit Production on Wednesday, 14 January, 2009 13:30:56
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    Starting "SYSTEM"."SYS_EXPORT_TABLE_01": system/********@onlinete directory=test_dir TABLES=sys.AUD$ DUMPFILE=sysaud..dmp logfile=audit..log
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 0 KB
    ORA-39165: Schema SYS was not found.
    ORA-39166: Object AUD$ was not found.
    ORA-31655: no data or metadata objects selected for job
    Job "SYSTEM"."SYS_EXPORT_TABLE_01" completed with 3 error(s) at 13:31:01
    It also shows an error when I run it as the SYS user:
    bash-3.00$ expdp sys/sys123@onlinete directory=test_dir TABLES=sys.AUD$ DUMPFILE=sysaud.$Date.dmp logfile=audit.$date.log
    Export: Release 10.2.0.1.0 - 64bit Production on Wednesday, 14 January, 2009 13:35:19
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    UDE-00008: operation generated ORACLE error 28009
    ORA-28009: connection as SYS should be as SYSDBA or SYSOPER
    Username: sys/sys123 as sysdba
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    Starting "SYS"."SYS_EXPORT_TABLE_01": sys/******** AS SYSDBA directory=test_dir TABLES=sys.AUD$ DUMPFILE=sysaud..dmp logfile=audit..log
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 0 KB
    ORA-39165: Schema SYS was not found.
    ORA-39166: Object AUD$ was not found.
    ORA-31655: no data or metadata objects selected for job
    Job "SYS"."SYS_EXPORT_TABLE_01" completed with 3 error(s) at 13:35:29
    I don't understand why it is not working. Please advise...

    But that's not fair...
    Imagine the situation where I figure out that some data was edited a year ago, but I don't know by whom. Audit was enabled at the time, I was exporting the AUD$ table (using the regular exp) throughout the year, everything is good... BUT... two months ago I upgraded my DB to 11g, so I cannot use imp to restore the table and see what was going on a year ago. Does that mean I always have to keep the ability to create a 10g database just to use my AUD$ exports?
    Is there any other way of backing up this table? So far I was doing exp+truncate, but since the 11g release, where exp/imp are no longer supported, I am trying to think of another way of dealing with the audit trail...
    does anybody have ideas about it?
    thanks,
    M
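
    One common workaround (my suggestion; Data Pump skips SYS-owned objects, which is exactly why the ORA-39165/39166 errors appear above): copy the audit rows into a regular schema and export the copy with expdp. Placeholder names throughout:

    -- as a DBA: snapshot the audit trail into an exportable schema
    CREATE TABLE audit_arch.aud_copy AS SELECT * FROM sys.aud$;

    expdp system/password TABLES=audit_arch.aud_copy DIRECTORY=dp_dir DUMPFILE=aud_copy.dmp LOGFILE=aud_copy.log

    Only after the export is verified would you clear sys.aud$, as you were already doing with exp+truncate.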

  • ORA-39126 during an export of a partition via dbms_datapump

    Hi,
    I did the export using Data Pump on the command line and everything went fine, but when exporting via dbms_datapump I got this:
    ORA-39126 during an export of a partition via dbms_datapump
    ORA-00920
    'SELECT FROM DUAL WHERE :1' P20060401
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPW$WORKER", line 6228
    the procedure is:
    PROCEDURE pr_depura_bitacora
    IS
      l_job_handle NUMBER;
      l_job_state  VARCHAR2(30);
      l_partition  VARCHAR2(30);
    BEGIN
      -- Create a user-named Data Pump job to do a "table:partition-level" export
      -- Local: derive the oldest partition name, e.g. P20060401
      select 'P' || to_char((select min(STP_LOG_DATE) from SAI_AUDITBITACORA), 'YYYYMM') || '01'
        into l_partition
        from user_tab_partitions
       where table_name = 'SAI_AUDITBITACORA'
         and rownum = 1;
      l_partition := rtrim(l_partition, ' ');

      l_job_handle := DBMS_DATAPUMP.OPEN(
        operation => 'EXPORT',
        job_mode  => 'TABLE',
        job_name  => 'EXPORT_ORACLENSSA');

      -- Schema filter
      DBMS_DATAPUMP.METADATA_FILTER(
        handle => l_job_handle,
        name   => 'SCHEMA_EXPR',
        value  => 'IN (''ORACLENSSA'')');
      DBMS_OUTPUT.PUT_LINE('Added filter for schema list');

      -- Table filter
      DBMS_DATAPUMP.METADATA_FILTER(
        handle => l_job_handle,
        name   => 'NAME_EXPR',
        value  => '=''SAI_AUDITBITACORA''');
      DBMS_OUTPUT.PUT_LINE('Added filter for table expression');

      -- Partition filter
      DBMS_DATAPUMP.DATA_FILTER(
        handle     => l_job_handle,
        name       => 'PARTITION_EXPR',
        value      => l_partition,
        table_name => 'SAI_AUDITBITACORA');
      DBMS_OUTPUT.PUT_LINE('Partition filter for schema list');

      DBMS_DATAPUMP.ADD_FILE(
        handle    => l_job_handle,
        filename  => 'EXP' || l_partition || '.DMP',
        directory => 'EXP_DATA_PUMP',
        filetype  => 1);

      DBMS_DATAPUMP.ADD_FILE(
        handle    => l_job_handle,
        filename  => 'EXP' || l_partition || '.LOG',
        directory => 'EXP_DATA_PUMP',
        filetype  => 3);

      DBMS_DATAPUMP.START_JOB(
        handle       => l_job_handle,
        skip_current => 0);

      DBMS_DATAPUMP.WAIT_FOR_JOB(
        handle    => l_job_handle,
        job_state => l_job_state);

      DBMS_OUTPUT.PUT_LINE('Job completed - job state = ' || l_job_state);
      DBMS_DATAPUMP.DETACH(handle => l_job_handle);
    END;
    I have already dropped and recreated the directory; granted read and write on it to PUBLIC and to the user; granted create session, create table, create procedure and exp_full_database to the user; restarted the database and the listener with LD_LIBRARY_PATH pointing first to $ORACLE_HOME/lib; and added more space to the temporary tablespace.

    The basic problem is:
    Error: ORA-920
    Text: invalid relational operator
    Cause: A search condition was entered with an invalid or missing relational operator.
    Action: Include a valid relational operator such as =, !=, ^=, <>, >, <, >=, <=, ALL, ANY, [NOT] BETWEEN, EXISTS, [NOT] IN, IS [NOT] NULL, or [NOT] LIKE in the condition.
    Obviously this refers to the invalid statement 'SELECT FROM DUAL ...'. I also recommend you contact Oracle Support, because this happens inside an Oracle-provided package.
    Werner
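
    Building on Werner's reading of ORA-920 (my interpretation, worth verifying with Support): the worker builds a WHERE clause from the PARTITION_EXPR value, and a bare partition name like P20060401 contains no relational operator, which is exactly how 'SELECT FROM DUAL WHERE :1' comes to fail. A hedged fix is to pass a real expression (or switch to the PARTITION_LIST filter name, which takes partition names rather than an expression):

    -- Partition filter: give PARTITION_EXPR an expression, not a bare name
    DBMS_DATAPUMP.DATA_FILTER(
      handle     => l_job_handle,
      name       => 'PARTITION_EXPR',
      value      => 'IN (''' || l_partition || ''')',
      table_name => 'SAI_AUDITBITACORA');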

  • Exporting an Aperture library from one external hard drive to another

    Whilst on the road I use a PowerBook G4 and keep my Aperture library on a portable external hard drive. When I get back to base I would like to export this library from the portable drive to my main hard drive. Is this possible, and if so, can anybody please advise?

    I just got back from a three-week trip to Yellowstone and used my MacBook Pro and Aperture to process over 3000 images while on the road. Got home, got on my home network, exported the Yellowstone project from my laptop to a drive on my G5, shut down Aperture on the laptop (you can't run two copies of Aperture at the same time), fired up Aperture on the G5 and imported the processed project into my main library. Slick! Can't even come close to doing that in LR (yet). It saved me a ton of time in getting a lot of the editing done during down time on the road, and the integration into the existing library was very straightforward. Quick final edit, select some of the better images, smart web gallery and a very quick web output. One of the easiest trip outputs I have had in years.
