Fatal error in data pump import

Hi,
Please help me solve this error while importing data from a dump file:
Import: Release 10.1.0.2.0 - Production on Thursday, 15 February, 2007 3:54
Copyright (c) 2003, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "SYSTEM"."SYS_SQL_FILE_FULL_02" successfully loaded/unloaded
Starting "SYSTEM"."SYS_SQL_FILE_FULL_02": system/******** directory=data_pump dumpfile=prashant_dp.
dmp SQLFILE=prashant_imp.sql logfile=prashant_imp.log
Processing object type TABLE_EXPORT/TABLE/TABLE
ORA-39125: Worker unexpected fatal error in KUPW$WORKER.PUT_DDLS while calling DBMS_METADATA.CONVERT
ORA-06502: PL/SQL: numeric or value error
LPX-00210: expected '<' instead of 'n'
----- PL/SQL Call Stack -----
object line object
handle number name
0x540a0b7c 13460 package body SYS.KUPW$WORKER
0x540a0b7c 5810 package body SYS.KUPW$WORKER
0x540a0b7c 11153 package body SYS.KUPW$WORKER
0x540a0b7c 2891 package body SYS.KUPW$WORKER
0x540a0b7c 3530 package body SYS.KUPW$WORKER
0x540a0b7c 6395 package body SYS.KUPW$WORKER
0x540a0b7c 1208 package body SYS.KUPW$WORKER
0x534beb10 2 anonymous block
Job "SYSTEM"."SYS_SQL_FILE_FULL_02" stopped due to fatal error at 03:54


Similar Messages

  • Error during data pump import with SQL developer

    Hello,
    I am trying to transfer data from one database to another through Data Pump via SQL Developer (the amount of data is quite large), exporting several tables.
    The table export works fine, but I encounter the following error when I import the file (I tried data only and data + DDL):
    "Exception: ORA-39001: argument value invalid dbms_datapump.get_status(64...=
    ORA-39001: argument value invalid
    ORA-39000: ....
    ORA-31619: ...
    The file is in the right place, the data pump folder of the new database. The user is the same on both databases, and the database versions are the same.
    Do you have any idea what the problem is?
    Thanks

    With the query SELECT * FROM v$version;
    Environment source
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    PL/SQL Release 11.2.0.1.0 - Production
    "CORE     11.2.0.1.0     Production"
    TNS for Linux: Version 11.2.0.1.0 - Production
    NLSRTL Version 11.2.0.1.0 - Production
    Environment target
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    PL/SQL Release 11.2.0.1.0 - Production
    "CORE     11.2.0.1.0     Production"
    TNS for Linux: Version 11.2.0.1.0 - Production
    NLSRTL Version 11.2.0.1.0 - Production

  • Data Pump Import

    I am having trouble using the data pump import tool. I am working with Oracle 10.2.4.0 on Windows Server 2008. I am trying to import a lot of data (several schemas, thousands of tables, hundreds of GB of data) from the same version of Oracle on Linux. Since it's so much data and will take so long to import, I am trying to make sure I can get it to work for one table or one schema before I attempt everything. So I'm trying to import the TEST_USER.TABLE1 table using the following command:
    impdp.exe dumpfile=big_file.dmp logfile=big_file_import.log tables=test_user.table1 remap_datafile=\'/oradata1/data1.dbf\':\'D:\oracle\product\10.2.0\oradata\orcl\data1.dbf\'
    I supply sys as sysdba for the login when it asks (not sure how you get that info into the initial command). However, I am getting the following error:
    user 'TEST_USER' does not exist
    My understanding was that the data pump utility would create all the requisite schemas for me. Is that not the case? The target database is a new install so the source schemas don't exist there.
    Even if I create the test_user schema by hand, then I get an error stating that the tablespace doesn't exist:
    ORA-00959: tablespace 'TS_1' does not exist
    Then even doing that, which I don't want to have to do to begin with, doesn't work. Then it gives me something about how the user doesn't have the proper privileges on the tablespace.
    Isn't the data pump utility supposed to make this stuff automatic? Do I really have to create all the schemas and tablespaces by hand? That's going to take a long time. Am I doing something wrong here?
    Thanks,
    Dave

    user5482172 wrote:
    Hemant K Chitale wrote:
    tables=test_user.table1
    The "TABLES" mode does NOT create the database accounts.
    The FULL mode creates tablespaces and database accounts before importing data.
    The SCHEMAS mode creates database accounts before importing data-- but expects Tablespaces to pre-exist so that the tables and indexes can be created in the correct tablespaces.
    Hemant K Chitale
    http://hemantoracledba.blogspot.com
    That is consistent with what I'm seeing. So basically there is no way to import just a few tables or schemas and have it create all the necessary users and tablespaces for you. That seems a little arbitrary, but whatever. Why make things easy on your users when you can be all Oracley about it. Anyway, I'll give it a try and let you know what happens.
    Can I ask where you got this information? I'm sure you're right, but I can't find anything stating this.
    Thanks,
    Dave
    Edited by: user5482172 on Feb 18, 2010 10:51 PM
    After some testing, this appears to be correct. The system will not create tablespaces under the SCHEMAS mode and it will not create schemas under the TABLES mode; only under the FULL mode will everything be created for you. I find that unintuitive and extremely unfriendly, but that's Oracle for you. Have you ever tried doing this in SQL Server? You open the GUI, tell it where the dump file is, the name of the database, where to put the data files and you're done. Oracle has a real knack for making things way more complicated than they need to be. End Oracle rant.
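    Following Hemant's explanation of the modes, one workable sequence for a TABLES-mode import is to pre-create the tablespace and the account on the target first; a minimal sketch, assuming the TS_1/TEST_USER names from this thread and a hypothetical datafile path:
    -- run on the target as a DBA before the TABLES-mode import
    CREATE TABLESPACE ts_1
      DATAFILE 'D:\oracle\product\10.2.0\oradata\orcl\ts_1.dbf' SIZE 100M;
    CREATE USER test_user IDENTIFIED BY test_user
      DEFAULT TABLESPACE ts_1 QUOTA UNLIMITED ON ts_1;
    GRANT CREATE SESSION, CREATE TABLE TO test_user;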

  • Exclude DBMS_SCHEDULER jobs in DATA PUMP import

    Hello,
    I need to exclude all DBMS_SCHEDULER jobs during DATA PUMP import.
    I tried to do this with: EXCLUDE=JOB but this only works on DBMS_JOB.
    Is there a way to exclude all DBMS_SCHEDULER jobs during import?
    Kind Regards

    There is the PROCOBJ object type, which can be excluded (procedural objects in the selected schemas), but I'm afraid it excludes more than just DBMS_SCHEDULER jobs.
    Any ideas?
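    If excluding all procedural objects is acceptable, a minimal parameter-file sketch (hypothetical file and dump names) would be:
    # import.par -- note that EXCLUDE=PROCOBJ skips ALL procedural objects
    # (scheduler jobs, programs, schedules, chains), not only DBMS_SCHEDULER jobs
    DIRECTORY=data_pump_dir
    DUMPFILE=mydump.dmp
    EXCLUDE=PROCOBJ
    Then: impdp system/password parfile=import.par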

  • ERROR: Invalid data pump job state; one of: DEFINING or EXECUTING.

    Hi all.
    I'm using Oracle 10g (10.2.0.3). I had some scheduled jobs to export the DB with Data Pump. Some days ago they began to fail with the message:
    ERROR: Invalid data pump job state; one of: DEFINING or EXECUTING.
    Wait a few moments and check the log file (if one was used) for more details.
    EM says that I can't monitor the job because there's no master table. How can I find the log?
    I also found in the bdump that exceptions occurred:
    *** 2008-08-22 09:04:54.671
    *** ACTION NAME:(EXPORT_ITATRD_DIARIO) 2008-08-22 09:04:54.671
    *** MODULE NAME:(Data Pump Master) 2008-08-22 09:04:54.671
    *** SERVICE NAME:(SYS$USERS) 2008-08-22 09:04:54.671
    *** SESSION ID:(132.3726) 2008-08-22 09:04:54.671
    kswsdlaqsub: unexpected exception err=604, err2=24010
    kswsdlaqsub: unexpected exception err=604, err2=24010
    * kjdrpkey2hv: called with pkey 114178, options x8
    * kjdrpkey2hv: called with pkey 114176, options x8
    * kjdrpkey2hv: called with pkey 114177, options x8
    Some help would be appreciated.
    Thanks.

    Delete any uncompleted or failed jobs and try again.
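    A minimal sketch of that cleanup (assuming SYSDBA access; the owner and master-table name below are hypothetical and come from the first query):
    -- list Data Pump jobs that are stopped or otherwise not running
    SELECT owner_name, job_name, state, attached_sessions
    FROM   dba_datapump_jobs;
    -- each orphaned job keeps a master table of the same name;
    -- dropping it deletes the failed job
    DROP TABLE system.SYS_EXPORT_FULL_01 PURGE;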

  • Error data pump import: Worker unexpected fatal error in KUPW$WORKER.MAIN

    Hello.
    I am trying to import a dump:
    impdp DIRECTORY=data_pump_dir DUMPFILE=04-2013.dmp TRANSFORM=OID:n LOGFILE=04-2013.dmp.log
    (as user system; I also tried sys, and parallel=1, 2, and 8) and get the following error:
    Import: Release 11.2.0.1.0 - Production on Fri May 31 12:55:42 2013
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Release 11.2.0.1.0 - 64bit Production
    ORA-39014: One or more workers have prematurely exited.
    ORA-39029: worker 1 with process name "DW00" prematurely terminated
    ORA-31671: Worker process DW00 had an unhandled exception.
    ORA-39079: unable to enqueue message RQ
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
    ORA-06512: at "SYS.KUPC$QUE_INT", line 965
    ORA-00931: missing identifier
    ORA-06512: at "SYS.KUPW$WORKER", line 16746
    ORA-06512: at "SYS.KUPW$WORKER", line 19035
    ORA-06512: at "SYS.KUPW$WORKER", line 8191
    ORA-39126: Worker unexpected fatal error in KUPW$WORKER.MAIN []
    ORA-06512: at "SYS.KUPW$WORKER", line 1705
    ORA-44002: invalid object name
    ORA-06512: at line 2 
    How can I fix it?
    Alert_log
    Fri May 31 14:26:21 2013
    DM00 started with pid=32, OS id=36385, job SYSTEM.SYS_IMPORT_FULL_03
    Fri May 31 14:26:22 2013
    DW00 started with pid=33, OS id=36387, wid=1, job SYSTEM.SYS_IMPORT_FULL_03
    DW00 terminating with fatal err=39079, pid=33, wid=1, job SYSTEM.SYS_IMPORT_FULL_03
    Also there are some invalid objects in SYS and other dictionary schemas:
    SYS.DBMS_SQLTUNE_INTERNAL
    SYS.DBMS_SQLTUNE
    SYS.DBMS_WRR_INTERNAL
    SYS.DBMS_WORKLOAD_REPLAY
    SYS.PRVT_SQLPA
    SYS.DBMS_SMB_INTERNAL
    WMSYS.LTUTIL
    WMSYS.LTADM
    WMSYS.UD_TRIGS
    WMSYS.OWM_DDL_PKG
    WMSYS.OWM_MIG_PKG
    EXFSYS.DBMS_EXPFIL_DR
    EXFSYS.DBMS_EXPFIL
    EXFSYS.ADM_EXPFIL_SYSTRIG
    CTXSYS.DRVDDL
    XDB.DBMS_XDBZ0
    XDB.DBMS_XSLPROCESSOR
    XDB.DBMS_XDBUTIL_INT
    XDB.DBMS_CSX_ADMIN
    EXFSYS.DBMS_RLMGR_UTL
    EXFSYS.DBMS_RLMGR_IR
    EXFSYS.DBMS_RLMGR_IRPK
    EXFSYS.DBMS_RLMGR_DEPASEXP
    MDSYS.SDO_CS
    SYSMAN.EMD_MAINTENANCE
    SYSMAN.MGMT_ADMIN_DATA

    here are the trace files: http://files.mail.ru/B05EBC4402A342D0AE1E88BFB04191DB
    error here:
    KUPM:12:47:19.571: Error detected by MCP
    KUPM:12:47:19.572: ORA-39014: One or more workers have prematurely exited.
    KUPM:12:47:19.572: ORA-39029: worker 1 with process name "DW00" prematurely terminated
    KUPM:12:47:19.572: ORA-31671: Worker process DW00 had an unhandled exception.
    ORA-39079: unable to enqueue message RQ
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
    ORA-06512: at "SYS.KUPC$QUE_INT", line 965
    ORA-00931: missing identifier
    ORA-06512: at "SYS.KUPW$WORKER", line 16746
    ORA-06512: at "SYS.KUPW$WORKER", line 19035
    ORA-06512: at "SYS.KUPW$WORKER", line 8191
    ORA-39126: Worker unexpected fatal error in KUPW$WORKER.MAIN []
    ORA-06512: at "SYS.KUPW$WORKER", line 1705
    ORA-44002: invalid object name
    ORA-06512: at line 2
    KUPM:12:47:19.577: In restart_worker for worker 1...
    KUPM:12:47:19.577: Error being processed is:  -26457
    KUPM:12:47:19.577: worker id is:
    KUPM:12:47:19.577: Worker error is: 0
    KUPM:12:47:19.577: Exited main loop...
    KUPM:12:47:19.577: Returned to MAIN
    KUPV:12:47:19.577: Update request for job: SYSTEM.SYS_IMPORT_FULL_03, func: 1
    KUPM:12:47:19.578: Entered state: UNDEFINED
    KUPM:12:47:19.578: In RESPOND_TO_START
    KUPC:12:47:19.578: Before ENQ: Sending Type: 2041 ID:
    KUPC:12:47:19.578:  RP,KUPC$C_1_20130603124648,MCP,KUPC$A_1_20130603124649,10,Y
    kwqberlst !retval block
    kwqberlst rqan->lagno_kwqiia  5
    kwqberlst rqan->lascn_kwqiia > 0 block
    kwqberlst rqan->lascn_kwqiia  5
    kwqberlst ascn 2375213 lascn 22
    KUPM:12:47:19.579: In check_workers...
    KUPM:12:47:19.579: Live worker count is:  0
    KUPM:12:47:19.579: worker id is:
    KUPM:12:47:19.579: Worker error is: 0
    KUPM:12:47:19.579: Job is completing
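    Given the invalid dictionary-owned packages listed above, a first step worth trying before re-running the import is to recompile the invalid objects; a minimal sketch, assuming SYSDBA access:
    -- recompile all invalid objects in the database
    @?/rdbms/admin/utlrp.sql
    -- then check what is still invalid
    SELECT owner, object_name, object_type
    FROM   dba_objects
    WHERE  status = 'INVALID';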

  • Data pump import error with nested tables

    So the problem is somewhat long :)
    Actually there are two problems: why (and how) Oracle treats the OO concept this way, and why Data Pump doesn't work.
    So scenario for the 1st one:
    1) there is object type h1 and table of h1
    2) there is unrelated object type row_text and table of row_text
    3) there is object type h2 under h1 with attribute as table of row_text
    4) there is table tab1 with column b with data type as table of h1. Of course column b is stored as nested table.
    So what do you think: how many nested tables will Oracle create? The correct answer is 2. One explicitly defined, and one hidden with a system
    generated name for the type h2 which is under type h1. So the question is WHY? Why, if I create an instance of the supertype, does Oracle try to adapt
    it for the subtype as well? Even more: if I create another subtype h3 under h1, another hidden nested table appears.
    This was the first part.
    The second part is: if I do a schema export and try to import it into another schema, I get an error saying that Oracle failed to create the storage table for
    nested table column b. So the second question is: if Oracle has created such a mess with hidden nested tables, how do I export/import to another
    schema?
    Ok and here is test case to demonstrate problems above:
    -- creating type h1 and table of it
    SQL> create or replace type h1 as object (a number)
      2  not final;
      3  /
    Type created.
    SQL> create or replace type tbl_h1 as table of h1;
      2  /
    Type created.
    -- creating type row_text and table of it
    SQL> create or replace type row_text as object (
      2    txt varchar2(100))
      3  not final;
      4  /
    Type created.
    SQL> create or replace type tbl_row_text as table of row_text;
      2  /
    Type created.
    -- creating type h2 as subtype of h1
    SQL> create or replace type h2 under h1 (some_texts tbl_row_text);
      2  /
    Type created.
    SQL> create table tab1 (a number, b tbl_h1)
      2  nested table b
      3  store as tab1_nested;
    Table created.
    -- so we have 2 nested tables now
    SQL> select table_name, parent_table_name, parent_table_column
      2  from user_nested_tables;
    TABLE_NAME                     PARENT_TABLE_NAME
    PARENT_TABLE_COLUMN
    SYSNTfsl/+pzu3+jgQAB/AQB27g==  TAB1_NESTED
    TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"
    TAB1_NESTED                    TAB1
    B
    -- another subtype of t1
    SQL> create or replace type h3 under h1 (some_texts tbl_row_text);
      2  /
    Type created.
    -- plus another nested table
    SQL> select table_name, parent_table_name, parent_table_column
      2  from user_nested_tables;
    TABLE_NAME                     PARENT_TABLE_NAME
    PARENT_TABLE_COLUMN
    SYSNTfsl/+pzu3+jgQAB/AQB27g==  TAB1_NESTED
    TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"
    SYSNTfsl/+pz03+jgQAB/AQB27g==  TAB1_NESTED
    TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H3")."SOME_TEXTS"
    TAB1_NESTED                    TAB1
    B
    SQL> desc "SYSNTfsl/+pzu3+jgQAB/AQB27g=="
    Name                                      Null?    Type
    TXT                                                VARCHAR2(100)
    OK, let it be. And now I'm trying to export and import in another schema:
    [oracle@xxx]$ expdp gints/xxx@xxx directory=xxx dumpfile=gints.dmp logfile=gints.log
    Export: Release 11.2.0.1.0 - Production on Thu Feb 4 22:32:48 2010
    <irrelevant rows skipped>
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "GINTS"."TAB1"                                  0 KB       0 rows
    . . exported "GINTS"."SYSNTfsl/+pz03+jgQAB/AQB27g=="         0 KB       0 rows
    . . exported "GINTS"."TAB1_NESTED"                           0 KB       0 rows
    . . exported "GINTS"."SYSNTfsl/+pzu3+jgQAB/AQB27g=="         0 KB       0 rows
    Master table "GINTS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
    ******************************************************************************
    And now the import. In order to create the types, a transformation of OIDs is applied, and also remap_schema.
    Although it fails to create the table.
    [oracle@xxx]$ impdp gints1/xxx@xxx directory=xxx dumpfile=gints.dmp logfile=gints_imp.log remap_schema=gints:gints1 transform=OID:n
    Import: Release 11.2.0.1.0 - Production on Thu Feb 4 22:41:48 2010
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Release 11.2.0.1.0 - Production
    Master table "GINTS1"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "GINTS1"."SYS_IMPORT_FULL_01":  gints1/********@xxx directory=xxx dumpfile=gints.dmp logfile=gints_imp.log remap_schema=gints:gints1 transform=OID:n
    Processing object type SCHEMA_EXPORT/USER
    ORA-31684: Object type USER:"GINTS1" already exists
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/TYPE/TYPE_SPEC
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    ORA-39083: Object type TABLE:"GINTS1"."TAB1" failed to create with error:
    ORA-02320: failure in creating storage table for nested table column B
    ORA-00904: : invalid identifier
    Failing sql is:
    CREATE TABLE "GINTS1"."TAB1" ("A" NUMBER, "B" "GINTS1"."TBL_H1" ) SEGMENT CREATION IMMEDIATE PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39083: Object type INDEX_STATISTICS failed to create with error:
    ORA-01403: no data found
    ORA-01403: no data found
    Failing sql is:
    DECLARE I_N VARCHAR2(60);   I_O VARCHAR2(60);   c DBMS_METADATA.T_VAR_COLL;   df varchar2(21) := 'YYYY-MM-DD:HH24:MI:SS'; BEGIN  DELETE FROM "SYS"."IMPDP_STATS";   c(1) :=   DBMS_METADATA.GET_STAT_COLNAME('GINTS1','TAB1_NESTED',NULL,'TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"',1);  DBMS_METADATA.GET_STAT_INDNAME('GINTS1','TAB1_NESTED',c,1,i_o,i_n);   INSERT INTO "
    ORA-39083: Object type INDEX_STATISTICS failed to create with error:
    ORA-01403: no data found
    ORA-01403: no data found
    Failing sql is:
    DECLARE I_N VARCHAR2(60);   I_O VARCHAR2(60);   c DBMS_METADATA.T_VAR_COLL;   df varchar2(21) := 'YYYY-MM-DD:HH24:MI:SS'; BEGIN  DELETE FROM "SYS"."IMPDP_STATS";   c(1) :=   DBMS_METADATA.GET_STAT_COLNAME('GINTS1','TAB1_NESTED',NULL,'TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H3")."SOME_TEXTS"',1);  DBMS_METADATA.GET_STAT_INDNAME('GINTS1','TAB1_NESTED',c,1,i_o,i_n);   INSERT INTO "
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "GINTS1"."SYS_IMPORT_FULL_01" completed with 4 error(s) at 22:41:52So any idea how to make export/import of such tables?
    TIA
    Gints

    Tom Kyte has said it repeatedly ... I will repeat it here for you.
    The fact that Oracle allows you to build object tables is not an indication that you should.
    Store your data relationally and build object_views on top of them.
    http://www.morganslibrary.org/reference/object_views.html
    If you model properly, and store properly, you don't have any issues.
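    A minimal sketch of that relational-plus-object-view approach (hypothetical names, loosely modeled on the row_text type above):
    -- plain relational storage
    CREATE TABLE row_texts_rel (
      id  NUMBER PRIMARY KEY,
      txt VARCHAR2(100)
    );
    -- object type carrying an identifier attribute for the view
    CREATE OR REPLACE TYPE row_text_t AS OBJECT (
      id  NUMBER,
      txt VARCHAR2(100)
    );
    /
    -- the object view looks like an object table to clients, but it
    -- exports and imports as an ordinary view over a heap table
    CREATE OR REPLACE VIEW row_texts_ov OF row_text_t
      WITH OBJECT IDENTIFIER (id) AS
      SELECT id, txt FROM row_texts_rel;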

  • Data Pump import to a sql file error :ORA-31655 no data or metadata objects

    Hello,
    I'm using Data Pump to export/import data; one requirement is to import the data to a sql file. The OS is Windows.
    I made the following export:
    expdp system/password directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY tables=user.tablename
    and it works, I can see the file TABLESDUMP.DMP in the directory path.
    then when I tried to import it to a sql file:
    impdp system/password directory=dpump_dir dumpfile=tablesdump.dmp sqlfile=tables_export.sql
    the log show :
    ORA-31655 no data or metadata objects selected for job
    and the sql file is created empty in the directory path.
    I'm not DBA, I'm a Java developer , Can you help me?
    Thks

    Hi, I added the logfile option to the command line:
    expdp system/system directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY schemas=ko1 tables=KO1QT01 logfile=capture.log
    The output on the console screen is in Spanish; no log file was created in the directory path.
    Export: Release 10.2.0.1.0 - Production on Martes, 26 Enero, 2010 12:59:14
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Conectado a: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    UDE-00010: se han solicitado varios modos de trabajo, schema y tables.
    (English error)
    UDE-00010: multiple job modes requested,schema y tables.
    This is why I used tables=user.tablename instead; is this right?
    Thks
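    A likely explanation, based on the commands above: SQLFILE asks impdp to write out the DDL (metadata) found in the dump, but a dump taken with CONTENT=DATA_ONLY contains no metadata, hence ORA-31655 and an empty sql file. A sketch of a round trip that does produce DDL (same names as above):
    expdp system/password directory=dpump_dir dumpfile=tablesdump.dmp content=METADATA_ONLY tables=ko1.KO1QT01
    impdp system/password directory=dpump_dir dumpfile=tablesdump.dmp sqlfile=tables_export.sql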

  • Error Data Pump Import

    Hi All,
    I am having the errors below when trying to use Data Pump to import one table from a dump file (from a pre-prod db) into another database (prod db) of the same version. I used expdp to generate the dump file.
    Getting errors -
    ORA-39083
    ORA-00959
    ORA-39112
    Any suggestions and/or advice would be appreciated.
    Thank you
    Import: Release 10.2.0.1.0 - 64bit Production
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    Master table "SYSTEM"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_TABLE_01": system/******** DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    ORA-39083: Object type TABLE failed to create with error:
    ORA-00959: tablespace 'OXFORD_DATA_01' does not exist
    Failing sql is:
    CREATE TABLE "Mxxxx"."CLAIMS" ("CLAIM_NUM" VARCHAR2(25) NOT NULL ENABLE, "LINE_NUM" NUMBER(3,0) NOT NULL ENABLE, "PAID_FLAG" CHAR(1), "ADJ_FLAG" CHAR(1), "CLAIM_TYPE" VARCHAR2(20), "MEM_ID" VARCHAR2(20), "MEM_SUF" VARCHAR2(20), "MEM_BEG_DATE" DATE, "MEM_REL" CHAR(2), "MEM_NAME" VARCHAR2(40), "MEM_DOB" DATE, "MEM_SEX" CHAR(1), "REC_DATE" DATE, "PAID_DATE" DATE, "FROM_DATE" DATE, "
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    ORA-39112: Dependent object type INDEX:"Mxxxx"."CLAIMS_IDX1" skipped, base object type TABLE:"Mxxxx"."CLAIMS" creation failed
    ORA-39112: Dependent object type INDEX:"Mxxxx"."CLAIMS_PK" skipped, base object type TABLE:"Mxxxx"."CLAIMS" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_IDX1" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_PK" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"Mxxxx"."CLAIMS" creation failed
    Job "SYSTEM"."SYS_IMPORT_TABLE_01" completed with 6 error(s) at 13:49:33
    impdp username/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS

    Ok,
    Thank you for the advice.
    So I used the REMAP_TABLESPACE option and it did import and load the file, but should I be concerned about the 4 errors? Looks like I have to rebuild the indexes, which is fine.
    Master table "Mxxxx"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
    Starting "Mxxxx"."SYS_IMPORT_TABLE_01": Mxxxx/******** DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS REMAP_TABLESPACE=OXFORD_DATA_01:Mxxxx_DATA_01
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "Mxxxx"."CLAIMS" 605.3 MB 1715554 rows
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    ORA-39083: Object type INDEX failed to create with error:
    ORA-00959: tablespace 'USER_INDEX_01' does not exist
    Failing sql is:
    CREATE INDEX "Mxxxx"."CLAIMS_IDX1" ON "Mxxxx"."CLAIMS" ("MEM_ID") PCTFREE 10 INITRANS 2 MAXTRANS 255 NOLOGGING STORAGE(INITIAL 8388608 NEXT 8388608 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "USER_INDEX_01" PARALLEL 1
    ORA-39083: Object type INDEX failed to create with error:
    ORA-00959: tablespace 'USER_INDEX_01' does not exist
    Failing sql is:
    CREATE UNIQUE INDEX "Mxxxx"."CLAIMS_PK" ON "Mxxxx"."CLAIMS" ("CLAIM_NUM", "LINE_NUM") PCTFREE 10 INITRANS 2 MAXTRANS 255 NOLOGGING STORAGE(INITIAL 8388608 NEXT 8388608 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "USER_INDEX_01" PARALLEL 1
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_IDX1" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_PK" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "Mxxxx"."SYS_IMPORT_TABLE_01" completed with 4 error(s) at 16:57:12

  • Data pump import from 11g to 10g

    I have 2 databases: the first is 11.2.0.2.0 and the second is 10.2.0.1.0.
    In 10g I created a database link to the 11g database:
    CREATE DATABASE LINK "TEST.LINK"
    CONNECT TO "monservice" IDENTIFIED BY "monservice"
    USING '(DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(HOST = host)(PORT = port))) (CONNECT_DATA = (SID = sid)))';
    And executed this query to test the db link, which works fine:
    select * from v$version@test.link;
    After that I try to call the open function:
    declare
      h number;
    begin
      h := dbms_datapump.open('IMPORT', 'TABLE', 'TEST.LINK', null, '10.2');
    end;
    and get the exception: 39001. 00000 - "invalid argument value"
    If I remove 'TEST.LINK' from the arguments, it works fine.
    Edited by: 990594 on 26.02.2013 23:41

    Hi Richard Harrison,
    result for import from 11g to 10g:
    impdp user/pass@dburl schemas=SCHEMANAME network_link=TEST_LINK version=10.2
    Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    ORA-39001: invalid argument value
    ORA-39169: Local version of 10.2.0.1.0 cannot work with remote version of 11.2.0.2.0
    result for export from 11g to 10g:
    expdp user/pass@dburl schemas=SCHEMANAME network_link=TEST_LINK version=10.2
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64 bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing option
    ORA-39006: internal error
    ORA-39065: unexpected master process exception in DISPATCH
    ORA-04052: error occurred when looking up remote object SYS.KUPM$MCP@TEST_LINK
    ORA-00604: error occurred at recursive SQL level 3
    ORA-06544: PL/SQL: internal, error, arguments: [55916], [], [], [], [], [], [], []
    ORA-06553: PLS-801: internal error [55916]
    ORA-02063: preceding 2 lines from TEST_LINK
    ORA-39097: Data Pump job encountered unexpected error -4052
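    A common workaround for this version mismatch (ORA-39169) is to skip the network link entirely: run expdp on the 11g side with VERSION=10.2 so the dump file is written in a 10g-compatible format, copy the file over, and import it with the 10g impdp. A sketch with hypothetical connect strings and file names:
    expdp user/pass@source11g schemas=SCHEMANAME version=10.2 directory=DATA_PUMP_DIR dumpfile=schemaname_v102.dmp
    (copy schemaname_v102.dmp to the 10g server, then:)
    impdp user/pass@target10g schemas=SCHEMANAME directory=DATA_PUMP_DIR dumpfile=schemaname_v102.dmp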

  • Re: 10g to 11gR2 Upgrade using Data Pump Import

    Hi Srini,
    I read the below documented in export/import.
    "Grants on objects owned by the SYS schema are never exported."
    If this is the case, how do we import the grants in target database as we see many invalids due to grants.
    For example, I'm using expdp to export 11gr1 database from solaris and importing in 11gr2 database in RHEL.
    In target I see many invalids and when try to compile it manually it throws an error "insufficient privileges" which is missing grants in CTXSYS schema.
    How do we go about it.

    I branched this for you.
    The answer is not with Data Pump. Data Pump (expdp) and the old export utility (exp) do not support exporting grants on objects owned by the Oracle-maintained schemas. You would have to find these grants yourself and then reapply them on the target database.
    Probably not the answer you wanted, but with Data Pump, anything that is currently owned by the Oracle schemas is not exportable.
    Dean
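    One way to find them, sketched here against the CTXSYS example above (run as a privileged user on the source; adjust the owner filter as needed):
    SELECT 'GRANT ' || privilege || ' ON "' || owner || '"."' || table_name ||
           '" TO ' || grantee || ';' AS grant_ddl
    FROM   dba_tab_privs
    WHERE  owner = 'CTXSYS';
    Spool the output, run it on the target, then recompile the invalid objects.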

  • 10g to 11gR2 Upgrade using Data Pump Import

    Hi,
    I am intending to move a 10g database from one windows server to another. However there is also a requirement to upgrade this database to 11gR2. Therefore I was going to combine the 2 in one movement by -
    1. take a full data pump export of the source 10g database
    2. create a new empty 11g database on the target environment
    3. import the dump file into the target database
    However I have a couple of queries running over in my mind about this approach -
    Q1. What happens with the SYSTEM and SYS objects from SYSTEM and SYSAUX during the import ? Given that I will have in effect already created a new dictionary on the empty target database - will any import of SYSTEM or SYS simply produce errror messages which should be ignored ?
    Q2. should I use EXCLUDE on SYS and SYSTEM ( is this EXCLUDE better on the export or import side ) ?
    Q3. what happens if there are things such as scheduled jobs etc on the source system - since these would be stored in SYSTEM owned tables, how would I bring these across to the new target 11g database ?
    thanks,
    Jim

    Jim Thompson wrote:
    Hi,
    I am intending to move a 10g database from one windows server to another. However there is also a requirement to upgrade this database to 11gR2. Therefore I was going to combine the 2 in one movement by -
    1. take a full data pump export of the source 10g database
    2. create a new empty 11g database on the target environment
    3. import the dump file into the target database
    However I have a couple of queries running over in my mind about this approach -
    Q1. What happens with the SYSTEM and SYS objects from SYSTEM and SYSAUX during the import ? Given that I will have in effect already created a new dictionary on the empty target database - will any import of SYSTEM or SYS simply produce errror messages which should be ignored ?
    You won't get errors related to the SYSTEM and SYSAUX tablespaces because these won't be exported at all. Schemas like SYS, CTXSYS, MDSYS and ORDSYS are never exported using Data Pump. That's why Oracle recommends not creating any objects under the SYS or SYSTEM schemas.
    Q2. should I use EXCLUDE on SYS and SYSTEM ( is this EXCLUDE better on the export or import side ) ?
    Not required; as the data dictionary schemas won't be exported, specifying EXCLUDE won't do anything.
    Q3. what happens if there are things such as scheduled jobs etc on the source system - since these would be stored in SYSTEM owned tables, how would I bring these across to the new target 11g database ?
    DDL will get exported and imported into the new database. For example, if you have schema A and you had defined a job whose owner is A, then this information also gets imported. For user-defined jobs there are views like USER_SCHEDULER_JOBS, etc., so don't worry about the user jobs; these will be created.
    Also see
    MOS note - Schema's CTXSYS, MDSYS and ORDSYS are Not Exported [ID 228482.1]
    Export system or sys schema
    I also ran the following test in my db, which shows I cannot export objects under the SYS schema:
    SQL> show user
    USER is "SYS"
    SQL> create table pump (id number);
    Table created.
    SQL> insert into pump values (1);
    1 row created.
    SQL> insert into pump values (2);
    1 row created.
    SQL> exit
    Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    C:\Documents and Settings\rnagi\My Documents\Ranjit Doc\Performance\SQL>expdp tables=sys.pump logfile=test.log
    Export: Release 11.2.0.1.0 - Production on Mon Feb 27 18:11:29 2012
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Username: / as sysdba
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYS"."SYS_EXPORT_TABLE_01":  /******** AS SYSDBA tables=sys.pump logfi
    le=test.log
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 0 KB
    ORA-39166: Object SYS.PUMP was not found.
    ORA-31655: no data or metadata objects selected for job
    Job "SYS"."SYS_EXPORT_TABLE_01" completed with 2 error(s) at 18:11:57

  • Data pump import problem

    I have exported the HR schema from one machine using the following code, and it exported successfully, but when I tried to import the exported file on another machine it gave an error. One thing more: how do I use the package to import the file, since we exported using the Data Pump package? Please help.
    C:\Documents and Settings\remote>impdp hr/hr DIRECTORY=DATA_PUMP_DIR DUMPFILE=YAHOO.DMP full=y
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-31626: job does not exist
    ORA-31637: cannot create job SYS_IMPORT_FULL_01 for user HR
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 663
    ORA-39080: failed to create queues "KUPC$C_1_20090320121353" and "KUPC$S_1_20090
    320121353" for Data Pump job
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPC$QUE_INT", line 1665
    ORA-01658: unable to create INITIAL extent for segment in tablespace SYSTEM
    DECLARE
      handle NUMBER;
    BEGIN
      handle := dbms_datapump.open(
        operation => 'EXPORT', job_mode => 'SCHEMA');
      dbms_datapump.add_file(
        handle    => handle,
        filename  => 'YAHOO.dmp',
        directory => 'DATA_PUMP_DIR',
        filetype  => 1);
      dbms_datapump.metadata_filter(
        handle => handle,
        name   => 'SCHEMA_EXPR',
        value  => 'IN(''HR'')');
      dbms_datapump.start_job(handle => handle);
      dbms_datapump.detach(handle => handle);
    END;
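    A matching import through the package would look much the same; a minimal sketch, assuming YAHOO.dmp is already in the target's DATA_PUMP_DIR (note that the ORA-01658 above also has to be addressed first by adding space to the SYSTEM tablespace, since Data Pump could not even create its queues there):
    DECLARE
      handle NUMBER;
    BEGIN
      handle := dbms_datapump.open(
        operation => 'IMPORT', job_mode => 'SCHEMA');
      dbms_datapump.add_file(
        handle    => handle,
        filename  => 'YAHOO.dmp',
        directory => 'DATA_PUMP_DIR',
        filetype  => 1);
      dbms_datapump.metadata_filter(
        handle => handle,
        name   => 'SCHEMA_EXPR',
        value  => 'IN(''HR'')');
      dbms_datapump.start_job(handle => handle);
      dbms_datapump.detach(handle => handle);
    END;
    /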

    +*<Moderator edit - deleted contents of MOS Doc 752374.1 -  pl do not post such contents - it is a violation of your Support agreement - locking this thread>*+                                                                                                                                                                                                                                                                                                                               

  • Data pump import issue in Oracle 10gR2

    Hi DBAs,
    Please help me identify the Data Pump issue I am facing in one of our Oracle 10gR2 databases.
    Below is the error message from the import log file:
    Job "SYS"."SYS_IMPORT_TABLE_06" completed with 62 error(s) at 18:20:20
    ORA-39097: Data Pump job encountered unexpected error -39076
    ORA-39065: unexpected master process exception in KUPV$FT_INT.DELETE_JOB
    ORA-39076: cannot delete job SYS_IMPORT_TABLE_06 for user SYS
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 934
    ORA-31635: unable to establish job resource synchronization
    ORA-39097: Data Pump job encountered unexpected error -39076
    ORA-39065: unexpected master process exception in MAIN
    ORA-39076: cannot delete job SYS_IMPORT_TABLE_06 for user SYS
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 934
    ORA-31635: unable to establish job resource synchronization
    Please suggest a workaround and the root cause of this issue. Thanks.
    Regards,
    Arun

    user13344783 wrote:
    Hi DBAs,
    Please help me identify the Data Pump issue I am facing in one of our Oracle 10gR2 databases.
    Below is the error message from the import log file:
    Job "SYS"."SYS_IMPORT_TABLE_06" completed with 62 error(s) at 18:20:20
    ORA-39097: Data Pump job encountered unexpected error -39076
    ORA-39065: unexpected master process exception in KUPV$FT_INT.DELETE_JOB
    ORA-39076: cannot delete job SYS_IMPORT_TABLE_06 for user SYS
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 934
    ORA-31635: unable to establish job resource synchronization
    ORA-39097: Data Pump job encountered unexpected error -39076
    ORA-39065: unexpected master process exception in MAIN
    ORA-39076: cannot delete job SYS_IMPORT_TABLE_06 for user SYS
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 934
    ORA-31635: unable to establish job resource synchronization
    Please suggest a workaround and the root cause of this issue. Thanks.
    Regards,
    Arun

    Post the complete command line of the expdp that made the dump file and the impdp that threw these errors.

  • Data pump imports

    I have a need to move the data from certain tables (of schema A) to schema B. I would be moving only tables and no other objects. I am using Oracle SQL Developer and thought there are 2 options:
    1. database copy  2. datapump
    I have a few questions:
    1. Datapump import is not working if the target environment has truncated tables. It works only if the target tables are dropped... why? Is there any setting I can use so that it drops the table at runtime by itself if required?
    2. Which is the best: db copy or datapump? Also, do we have any other effective options for moving data?
    Please suggest. Thanks in advance.

    3bc2b7be-dee7-4dfd-b905-2d63f2927e86 wrote:
    I have a need to move the data from certain tables (of schema A) to schema B. I would be moving only tables and no other objects. I am using Oracle SQL Developer and thought there are 2 options:
    1. database copy  2. datapump
    I have a few questions:
    1. Datapump import is not working if the target environment has truncated tables. It works only if the target tables are dropped... why? Is there any setting I can use so that it drops the table at runtime by itself if required?
    2. Which is the best: db copy or datapump? Also, do we have any other effective options for moving data?
    Please suggest. Thanks in advance.
    Sounds like you need to take a look at the TABLE_EXISTS_ACTION parameter of impdp.
    From
    Oracle® Database Utilities
    11g Release 2 (11.2)
    E22490-05
    Contents
    TABLE_EXISTS_ACTION
    Default: SKIP (Note that if CONTENT=DATA_ONLY is specified, then the default is APPEND, not SKIP.)
    Purpose
    Tells Import what to do if the table it is trying to create already exists.
    Syntax and Description
    TABLE_EXISTS_ACTION=[SKIP | APPEND | TRUNCATE | REPLACE]
    The possible values have the following effects:
    SKIP leaves the table as is and moves on to the next object. This is not a valid option if the CONTENT parameter is set to DATA_ONLY.
    APPEND loads rows from the source and leaves existing rows unchanged.
    TRUNCATE deletes existing rows and then loads rows from the source.
    REPLACE drops the existing table and then creates and loads it from the source. This is not a valid option if the CONTENT parameter is set to DATA_ONLY.
    ====================
    Hint: whenever I have a question about how to do something with a utility, the very first thing I do is check the documentation and simply go down the list of run-time parameters to see if something looks like it might do what I want.
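    For the truncated-tables case above, a minimal sketch (hypothetical names; REPLACE drops and recreates each existing table, while TRUNCATE keeps the table and just reloads the rows):
    impdp b_user/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=tables.dmp REMAP_SCHEMA=a:b TABLE_EXISTS_ACTION=REPLACE
    impdp b_user/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=tables.dmp REMAP_SCHEMA=a:b CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=TRUNCATE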
