Problem in IMPDP

Hi Experts,
I'm trying to import the full database from the Production to the Dev environment, using the following parameter file:
directory=<DUMP_DIRECTORY_LOCATION>
dumpfile=<DUMP_FILE_NAME>
logfile=<LOG_FILE_NAME>
FULL=y
The impdp completed, but not a single row was imported.
Kindly let me know if I'm missing any parameter.

Hi,
I tried with the option TABLE_EXISTS_ACTION=REPLACE.
It resulted in the error below:
Failing sql is:
BEGIN
dbms_resource_manager.create_consumer_group('AUTO_TASK_CONSUMER_GROUP','System maintenance task consumer group','ROUND-ROBIN');COMMIT; END;
ORA-39083: Object type PROCOBJ failed to create with error:
ORA-06550: line 2, column 1:
PLS-00201: identifier 'BMS_SCHEDULER.DISABLE' must be declared
ORA-06550: line 2, column 1:
PL/SQL: Statement ignored
Any suggestion?
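For comparison, a minimal parameter file along the lines discussed above might look like the sketch below. The directory, dump file, and log file names are placeholders, and EXCLUDE=PROCOBJ is only an assumption for sidestepping procedural objects (scheduler jobs, resource-manager consumer groups) that already exist in the target and produce the ORA-39083/PLS-00201 noise; it is not a verified fix.
# import.par -- hedged sketch, not a verified fix
directory=<DUMP_DIRECTORY_LOCATION>
dumpfile=<DUMP_FILE_NAME>
logfile=<LOG_FILE_NAME>
full=y
# Existing tables are skipped by default (table_exists_action=skip),
# which is why a "completed" import can still load zero rows.
table_exists_action=replace
# Assumption: skip procedural objects that already exist in the target
exclude=PROCOBJ
Invoked, for example, as: impdp system/<password> parfile=import.par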

Similar Messages

  • Problem with impdp

    Hi, I have the following problem: I take an expdp from my production database and I need to import it into the test database. To make the export I use the command:
    /u01/app/oracle/product/10g/bin/expdp newsys/mypass@prd schemas=NEWSYS directory=DMPDIR dumpfile=prd_newsys%U.dmp filesize=2G logfile=prd_newsys.log
    This command executes successfully, but the import with the command:
    /u01/app/oracle/product/10g/bin/impdp newsys/mypass@tst schemas=NEWSYS directory=DMPDIR dumpfile=prd_newsys%U.dmp logfile=import.log
    returns the following:
    Import: Release 10.2.0.1.0 - Production on Sunday, 08 March, 2009 21:46:34
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Release 10.2.0.1.0 - Production
    Master table "NEWSYS"."SYS_IMPORT_SCHEMA_05" successfully loaded/unloaded
    Starting "NEWSYS"."SYS_IMPORT_SCHEMA_05": NEWSYS/********@tst schemas=NEWSYS directory=BACKUP_DIR dumpfile=prd_newsys%U.dmp logfile=import.log
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    ORA-39126: Worker unexpected fatal error in KUPW$WORKER.LOAD_METADATA [SELECT process_order, flags, xml_clob, NVL(dump_fileid, :1), NVL(dump_position, :2), dump_length, dump_allocation, grantor, object_row, object_schema, object_long_name, processing_status, processing_state, base_object_type, base_object_schema, base_object_name, property, size_estimate, in_progress FROM "NEWSYS"."SYS_IMPORT_SCHEMA_05" WHERE  process_order between :3 AND :4 AND processing_state <> :5 AND duplicate = 0 ORDER BY process_order]
    ORA-31640: unable to open dump file "/backup/oracle/prd_newsys01.dmp" for read
    ORA-19505: failed to identify file "/backup/oracle/prd_newsys01.dmp"
    ORA-27046: file size is not a multiple of logical block size
    Additional information: 1
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPW$WORKER", line 6273
    ----- PL/SQL Call Stack -----
    object line object
    handle number name
    0x418748f0 14916 package body SYS.KUPW$WORKER
    0x418748f0 6300 package body SYS.KUPW$WORKER
    0x418748f0 3514 package body SYS.KUPW$WORKER
    0x418748f0 6889 package body SYS.KUPW$WORKER
    0x418748f0 1262 package body SYS.KUPW$WORKER
    0x4aece4b0 2 anonymous block
    Job "NEWSYS"."SYS_IMPORT_SCHEMA_05" stopped due to fatal error at 21:48:08

    Hello,
    Are your production database and test database on the same server or on different servers?
    /u01/app/oracle/product/10g/bin/expdp newsys/mypass@prd schemas=NEWSYS directory=DMPDIR dumpfile=prd_newsys%U.dmp filesize=2G logfile=prd_newsys.log
    This command executes successfully, but the import with the command:
    /u01/app/oracle/product/10g/bin/impdp newsys/mypass@tst schemas=NEWSYS directory=DMPDIR dumpfile=prd_newsys%U.dmp logfile=import.log
    fails. It seems to me you have them on the same server; correct me if that's not right. If not, then copy your export dump explicitly onto the test database server and then run impdp.
    Waiting for your response.
    Regards
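    A quick sanity check for ORA-27046 (file size is not a multiple of the logical block size) is to compare the dump file on both hosts; sizes and checksums should match exactly, and a mismatch usually points to a truncated or ASCII-mode copy. A minimal sketch, assuming a Unix shell and the path from the error message:
    # Run on the source host and again on the test host; the output should match
    ls -l /backup/oracle/prd_newsys01.dmp
    md5sum /backup/oracle/prd_newsys01.dmp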

  • Problem with "impdp" utility

    Hello, I've installed Oracle XE for Windows on my local machine, and I'm trying to import metadata only from a remote database. The command I'm running is:
    $ impdp system/******** directory=DATA_PUMP_DIR logfile=impCMS-PRE10g.log network_link=system_pre10g schemas=CMS content=METADATA_ONLY remap_tablespace=CMS_TBL:USERS remap_tablespace=CMS_IDX:USERS exclude=GRANT,STATISTICS
    and the result obtained is the following:
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/TYPE/TYPE_SPEC
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    ORA-39125: Worker unexpected fatal error in KUPW$WORKER.UNLOAD_METADATA while calling DBMS_METADATA.FETCH_XML_CLOB [COMMENT]
    ORA-06502: PL/SQL:numeric or value error: character string buffer too small
    ORA-06512: "SYS.DBMS_SYS_ERROR", line 105
    ORA-06512: "SYS.KUPW$WORKER", line 6241
    ----- PL/SQL Call Stack -----
    object      line  object
    handle    number  name
    3005E140 14916 package body SYS.KUPW$WORKER
    3005E140 6300 package body SYS.KUPW$WORKER
    3005E140 2340 package body SYS.KUPW$WORKER
    3005E140 6861 package body SYS.KUPW$WORKER
    3005E140 1262 package body SYS.KUPW$WORKER
    2E7211A4 2 anonymous block
    In the next execution, I decided to exclude comments from the import, and the command was the following:
    $ impdp system/******** directory=DATA_PUMP_DIR logfile=impCMS-PRE10g.log network_link=system_pre10g schemas=CMS content=METADATA_ONLY remap_tablespace=CMS_TBL:USERS remap_tablespace=CMS_IDX:USERS exclude=GRANT,STATISTICS,COMMENT
    But the error remains; now the log shows the following messages:
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/TYPE/TYPE_SPEC
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    ORA-39125: Worker unexpected fatal error in KUPW$WORKER.UNLOAD_METADATA while calling DBMS_METADATA.FETCH_XML_CLOB [CONSTRAINT:"CMS"."PK_WMU_SUGERENCI_1240219071771"]
    ORA-06502: PL/SQL:numeric or value error: character string buffer too small
    ORA-06512: "SYS.DBMS_SYS_ERROR", line 105
    ORA-06512: "SYS.KUPW$WORKER", line 6241
    ----- PL/SQL Call Stack -----
    object      line  object
    handle    number  name
    3005E140 14916 package body SYS.KUPW$WORKER
    3005E140 6300 package body SYS.KUPW$WORKER
    3005E140 2340 package body SYS.KUPW$WORKER
    3005E140 6861 package body SYS.KUPW$WORKER
    3005E140 1262 package body SYS.KUPW$WORKER
    2E7211A4 2 anonymous block
    I've been reading several Metalink notes about similar problems, but I can't find the solution. The local database version (Oracle XE) is 10.2.0.1.0 and the remote database version (Oracle SE) is 10.2.0.4.0, running under Red Hat Linux.
    Any ideas to resolve this problem?
    Thanks in advance

    Which patch/version of the client is creating the expdp file, the 10.2.0.4 source?
    Export: Release 10.2.0.4.0 - 64bit
    Import: Release 10.2.0.1.0
    Might be a comment on the source table that is too long for the 10.2.0.1 system catalog (DBMS_METADATA.FETCH_XML_CLOB [COMMENT])?
    Maybe, how could I check this?
    Or maybe the source and target have incompatible or different character sets, i.e. maybe it is trying to put multi-byte UTF characters into a single- or 2-byte VARCHAR. On the source and target databases check the character set (in SQL*Plus: alter database backup controlfile to trace; then look towards the end of the trace file for CHARACTER SET ...); checking/setting the NLS_LANG and LANG environment variables may also be needed.
    CHARSET OF LOCAL DATABASE (Target database, XE):
    NLS_CHARACTERSET     AL32UTF8
    NLS_NCHAR_CHARACTERSET     AL16UTF16
    NLS_LANGUAGE     AMERICAN
    CHARSET OF REMOTE DATABASE (Source database, SE):
    NLS_CHARACTERSET     WE8ISO8859P15
    NLS_NCHAR_CHARACTERSET     AL16UTF16
    NLS_LANGUAGE     AMERICAN
    And a CMS schema, that tag rings a bell ... content manager? That might not be supported on XE
    I think the schema's name is unremarkable.
    Thanks for your help
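    A simpler way to read these settings than pulling them out of a controlfile trace is to query NLS_DATABASE_PARAMETERS on each database. A minimal sketch (run as a privileged user on both source and target):
    SELECT parameter, value
      FROM nls_database_parameters
     WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET', 'NLS_LANGUAGE');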

  • Problems with impdp

    Hi,
    I tried to update a test database. For this I used a full export of the production database and then used REMAP_SCHEMA for the import, because only one schema had to be updated. But the import performs a full import. I don't know what I did wrong; can someone help me, please?
    database: 10.2.0.3
    os: SLES10
    export: expdp system/pwd@DB dumpfile=exp_db.dmp full=y
    import: impdp system/pwd@DBTST dumpfile=exp_db.dmp remap_schema=user:user
    The tablespace-name is the same in real and test-database.
    here are the first lines of the import-log:
    Import: Release 10.2.0.3.0 - 64bit Production on Tuesday, 14 July, 2009 16:48:29
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    Master table "SYSTEM"."SYS_IMPORT_FULL_02" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_02": system/********@DBTST dumpfile=exp_db.dmp remap_schema=user:user
    Processing object type DATABASE_EXPORT/TABLESPACE
    ORA-31684: Object type TABLESPACE:"UNDOTBS1" already exists

    The error messages only say that the object already exists.
    The import works! But a full import was performed where I expected a schema import...
    Job "SYSTEM"."SYS_IMPORT_FULL_02" completed with 9718 errors at 18:33:53
    The errors are all ORA-39111 and ORA-31684.
    With the old exp/imp this approach works: full export, then import using FROMUSER/TOUSER.
    So I thought that a full export followed by an import with REMAP_SCHEMA would work the same way with Data Pump.
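    As a hedged sketch (schema names are placeholders): with Data Pump, a full dump imported without a schema filter is treated as a full import, so restricting and remapping the import to the one schema would look something like this:
    impdp system/pwd@DBTST dumpfile=exp_db.dmp schemas=<source_user> remap_schema=<source_user>:<target_user> logfile=imp_schema.log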

  • Datapump impdp error LPX-00216

    Hi Gurus, I am trying to import a 150 GB database, and it failed with the error below:
    ### Detailed Problem Statement ###
    impdp "'/ as sysdba'" directory=dpdata dumpfile=episfile1%U.dmp full=y job_name=episuatimport2
    File List :
    -rwxr-x--x 1 oracle dba 23075409920 Jan 12 17:02 episfile01.dmp
    -rwxr-x--x 1 oracle dba 27706929152 Jan 12 17:44 episfile02.dmp
    -rwxr-x--x 1 oracle dba 15957520384 Jan 12 18:09 episfile03.dmp
    -rwxr-x--x 1 oracle dba 12562350080 Jan 12 22:27 episfile04.dmp
    Import: Release 10.2.0.2.0 - 64bit Production on Tuesday, 13 January, 2009 8:19:57
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.2.0 - 64bit Production
    With the Partitioning and Data Mining options
    ORA-39002: invalid operation
    ORA-31694: master table "SYS"."EPISUATIMPORT2" failed to load/unload
    ORA-02354: error in exporting/importing data
    ORA-39774: parse of metadata stream failed with the following error:
    LPX-00216: invalid character 0 (0x0
    Please advise how to resolve this. Is it caused by dump corruption?
    thanks.

    Hi,
    I am not sure whether the error is raised due to the NLS settings. Try checking Metalink note 603415.1.
    That might help you.
    - Pavan Kumar N

  • Oracle 11g R2 impdp parallel job does not spawn

    I am trying to import through Data Pump, but parallel is not working; sequence creation is running.
    Any guess why parallel is not working?
    Oracle 11g R2, ASM + RAC, Linux.
    Thanks in advance

    Better late than never...
    We have seen a few issues with expdp/impdp in 11gr2 on linux.
    Problem Description: impdp that was triggered to run in Parallel=4 has failed to load the following table. This table was specified in QUERY option.
    Error:
    ORA-31693: Table data object "DBATEST"."TEST" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    There was a patch for this. patch 8393456
    Also we saw the export would not work correctly in parallel.
    9243068: EXPDP RETURNS ORA-00922: MISSING OR INVALID OPTION WHEN USING CONSISTENT=TRUE
    We did not see an error here, but parallel would not spawn any workers.
    Hope this helps
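    One way to confirm whether workers were actually spawned is to check the Data Pump views while the job is running. A minimal sketch (the views are standard; trim the column lists as needed):
    -- Requested parallel degree per job
    SELECT owner_name, job_name, operation, job_mode, state, degree
      FROM dba_datapump_jobs;
    -- Expect one MASTER plus one row per WORKER if parallel really kicked in
    SELECT owner_name, job_name, session_type
      FROM dba_datapump_sessions;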

  • Primary key, unique key impdp problem

    Hi, during my impdp I came across problems with constraints being violated, so I disabled all constraints in the proper order. It didn't help; there are still the same errors. I checked some of the individual constraints that were problematic, and it turned out that they were all disabled, as I expected. I don't understand why Data Pump still shows these errors even though the constraints are DISABLED.

    But if the unique index exists on the target table it does not matter if the PK is disabled or not. You would need to drop the index; however, this shows you have a data problem.
    Why do you have duplicate keys? If you import duplicates then rebuilding the unique index would result in an error.
    Do you need to truncate or drop some of the target tables before importing?
    These are points I raised in my first response and which you have not addressed in your answer.
    HTH -- Mark D Powell --
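    To see which unique index is backing each (possibly disabled) key, something like the following sketch can help; the schema name is a placeholder:
    SELECT c.constraint_name, c.status, c.index_name, i.uniqueness
      FROM dba_constraints c
      JOIN dba_indexes i
        ON i.owner = NVL(c.index_owner, c.owner)
       AND i.index_name = c.index_name
     WHERE c.owner = '<SCHEMA>'
       AND c.constraint_type IN ('P', 'U');
    A UNIQUE index stays enforced even when the constraint itself is disabled, which is why the import can still report violations.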

  • An impdp problem

    Hi,
    11gR2, on Linux x86_64.
    I did a successful export (expdp) from the source DB using "sysdba".
    When using impdp, I see this error:
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning and Automatic Storage Management options
    ORA-39002: invalid operation
    ORA-31694: master table "SYS"."IMP_L2" failed to load/unload
    ORA-31644: unable to position to block number 226430 in dump file "/tmp/dump/l2/l2_2.dmp"
    ORA-19501: read error on file "/tmp/dump/l2/l2_2.dmp", block number 1 (block size=4096)
    ORA-27070: async read/write failed
    Linux-x86_64 Error: 22: Invalid argument
    Additional information: 3
    It seems like block corruption, but I did another exp/imp and got the same error. What is the problem?
    Here is impdp's parfile:
    USERID="/ as sysdba"
    DIRECTORY=data_pump_dir1
    LOGFILE=data_pump_dir1:import_l2.log
    JOB_NAME=imp_l2
    DUMPFILE=l2.dmp
    REMAP_SCHEMA=source_schema:target_schema
    REMAP_TABLESPACE=(source_1:target_1,
    source_1:target_2)
    TABLES=(source_shcema.table1,
    source_shcema.table2)
    thank you,

    /tmp/dump/l2 >
    total 912780
    drwxrwxr-x 3 oracle dba 60 Oct 7 21:14 ../
    -rwxrwxrwx 1 oracle dba 932839424 Oct 10 15:04 l2.dmp*
    -rwxrwxr-x 1 oracle dba 4246 Oct 10 15:09 impl2.par*
    -rw-r--r-- 1 oracle dba 0 Oct 10 15:09 import_l2.log
    drwxrwxr-x 2 oracle dba 140 Oct 10 15:09 ./
    /tmp/dump >
    total 0
    drwxrwxr-x 3 oracle dba 60 Oct 7 21:14 ./
    drwxrwxrwt 6 root root 120 Oct 10 15:05 ../
    drwxrwxr-x 2 oracle dba 140 Oct 10 15:09 l2/
    tmp >
    total 8
    drwxr-xr-x 24 root root 4096 Jun 30 02:36 ../
    drwxrwxr-x 3 oracle dba 60 Oct 7 21:14 dump/
    drwxrwxrwt 6 root root 120 Oct 10 15:05 ./
    It seems like it should be OK...
    but I still get the error.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning and Automatic Storage Management options
    ORA-39002: invalid operation
    ORA-31694: master table "SYS"."IMP_L2" failed to load/unload
    ORA-31644: unable to position to block number 226430 in dump file "/tmp/dump/l2/l2.dmp"
    ORA-19501: read error on file "/tmp/dump/l2/l2.dmp", block number 1 (block size=4096)
    ORA-27070: async read/write failed
    Linux-x86_64 Error: 22: Invalid argument
    Additional information: 3
    thank you

  • IMPDP problem

    Hi, I am getting a strange problem. From a full expdp, when I try to import (impdp) the user's data (sirsi user) from the production database into the test database (test user), I find that the impdp changes the test user's password (to the same password as production).
    For example, let us assume:
    the production database password: production123
    the test database password: test123
    But when I try to import from production to the test database:
    F:\expdp>impdp directory=expdp dumpfile=21092013_JICLIBDB.DMP logfile=imp_sirsi.log remap_schema=sirsi:test statistics=none
    The impdp was successful, but no data was imported into the test user.
    I am also getting the following error:
    CREATE TABLESPACE "ACCOUNTABILITY_DAT" DATAFILE 'D:\ORACLE\APP\PRODUCT\11.2.0\JICLIBDB\ACCOUNTABILITY01.DAT' SIZE 10485760 AUTOEXTEND ON NEXT 8192 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 819
    ORA-39083: Object type TABLESPACE failed to create with error:
    ORA-01119: error in creating database file 'D:\ORACLE\APP\PRODUCT\11.2.0\JICLIBDB\ACCOUNTABILITY01.IDX'
    ORA-27040: file create error, unable to create file
    OSD-04002: unable to open file
    O/S-Error: (OS 21) The device is not ready.
    My database version: 11.2.0.1
    OS: Windows

    Hi,
    >> it is changing the default database password.
    What do you mean by default password? You are doing a schema export/import; if the schema does not exist, the import will create the schema with all the inherited properties of the source one.
    To prevent this, you can create the schema before the import; during the import it will then just skip (or report an error for) the CREATE statement only.
    HTH
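    A minimal sketch of what "create the schema before import" could look like (the password, tablespaces, and grants below are assumptions to adapt to your environment):
    -- Pre-create the target user so impdp does not recreate it
    -- with the source account's password and properties
    CREATE USER test IDENTIFIED BY <new_password>
      DEFAULT TABLESPACE users
      TEMPORARY TABLESPACE temp;
    GRANT CONNECT, RESOURCE TO test;
    With the user already in place, the import's CREATE USER statement simply fails with "already exists" and the password on the test side is left alone.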

  • Impdp - Problem with import database

    Hi Gurus,
    just a short question: what is the procedure for importing an Oracle database?
    I have exported a dump file of my Oracle instance and I would like to import it on other servers.
    I don't know whether I am importing it correctly. My procedure is:
    1) export the Oracle instance using expdp from server A
    2) on server B, create a clean Oracle instance
    3) on server B, import the file exported from server A using impdp
    Is this OK? Should I create a new Oracle instance, or will it be created automatically?
    During the import process I get a lot of errors, mostly...
    Table already exists (CTXSYS, for example)...
    Who can tell me if this is a good procedure?

    Dlugasx wrote:
    Hi Gurus,
    just a short question. What is the procedure for import oracle database ?
    I have exported dump file of my oracle instance and I would like to import it on some other servers.
    I dont know why I can import it correctly. My procedure is:
    1) export oracle instance using expdp from server A
    2) on the server B I`m creating clean oracle instance
    I hope you have created it, judging from the errors you describe below.
    3) On server B I`m importing oracle file exported from server A using impdp
    Is this ok ? Should I create new oracle instance ? Or it will be installed automaticly.
    During import proces I have a lot of errors, mostly...
    Table allready exist (CTXSYS for example)...
    Who can tell me if this is a good procedure?
    I think you are trying to do a full expdp and a full impdp. You can ignore the "already exists" errors: when you install Oracle and create a database, it will have those objects too, and when they exist the import will throw these errors. You are probably more interested in the application objects. Once the import is completed, you can query dba_objects and compare with the production data to check whether all objects have been imported or not.
    Anil Malkai
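    As a hedged sketch of the comparison suggested above (the excluded-schema list is only an assumption; extend it with whatever built-in schemas your installation has), run this on both databases and diff the output:
    SELECT owner, object_type, COUNT(*) AS object_count
      FROM dba_objects
     WHERE owner NOT IN ('SYS', 'SYSTEM', 'CTXSYS', 'XDB')
     GROUP BY owner, object_type
     ORDER BY owner, object_type;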

  • Problem in Database EXPDP and IMPDP

    Dear Friends
    I am using Oracle 11G R2 (11.2.0.1.0) on Windows 7 Ultimate.
    I have some 50 tables (along with their data) and the corresponding 40 views (based on those 50 tables) in the SYSTEM schema.
    I want to take an export dump file of these 50 tables and 40 views, create another user with a different name, and import all these 50+40 tables/views into that new user.
    Could anyone please give me a step-by-step guide for dumping the data from the SYSTEM user, with all its tables and views, and then importing the same data into another user? Is this possible?
    regards
    Hamza

    Thanks. I am now able to copy. Here is the error I get:
    SQL> $ impdp asim/asim directory=datadump1 dumpfile=asim.dmp
    Import: Release 11.2.0.1.0 - Production on Mon Jul 4 01:52:10 2011
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "ASIM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "ASIM"."SYS_IMPORT_FULL_01": asim/******** directory=datadump1 dumpfile=asim.dmp
    Processing object type TABLE_EXPORT/TABLE/TABLE
    ORA-39151: Table "SYSTEM"."PARTNERS_COMPANY" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
    Processing object type TABLE_EXPORT/TABLE/PRE_TABLE_ACTION
    ORA-39083: Object type PRE_TABLE_ACTION failed to create with error:
    ORA-06550: line 2, column 2:
    PLS-00201: identifier 'SYS.DBMS_DEFER_IMPORT_INTERNAL' must be declared
    ORA-06550: line 2, column 2:
    PL/SQL: Statement ignored
    Failing sql is:
    BEGIN
    SYS.DBMS_DEFER_IMPORT_INTERNAL.QUEUE_IMPORT_CHECK('ORCL','IBMPC/WIN_NT64-9.1.0');
    END;
    ORA-39083: Object type PRE_TABLE_ACTION failed to create with error:
    ORA-06550: line 2, column 2:
    PLS-00201: identifier 'SYS.DBMS_DEFER_IMPORT_INTERNAL' must be declared
    ORA-06550: line 2, column 2:
    PL/SQL: Statement ignored
    Failing sql is:
    BEGIN
    SYS.DBMS_DEFER_IMPORT_INTERNAL.QUEUE_IMPORT_CHECK('ORCL','IBMPC/WIN_NT64-9.1.0');
    END;
    ORA-39083: Object type PRE_TABLE_ACTION failed to create with error:
    ORA-06550: line 2, column 2:
    PLS-00201: identifier 'SYS.DBMS_DEFER_IMPORT_INTERNAL' must be declared
    ORA-06550: line 2, column 2:
    PL/SQL: Statement ignored
    Failing sql is:
    BEGIN
    SYS.DBMS_DEFER_IMPORT_INTERNAL.QUEUE_IMPORT_CHECK('ORCL','IBMPC/WIN_NT64-9.1.0');
    END;
    ORA-39083: Object type PRE_TABLE_ACTION failed to create with error:
    ORA-06550: line 2, column 2:
    PLS-00201: identifier 'SYS.DBMS_DEFER_IMPORT_INTERNAL' must be declared
    ORA-06550: line 2, column 2:
    PL/SQL: Statement ignored
    Failing sql is:
    BEGIN
    SYS.DBMS_DEFER_IMPORT_INTERNAL.QUEUE_IMPORT_CHECK('ORCL','IBMPC/WIN_NT64-9.1.0');
    END;
    ORA-39083: Object type PRE_TABLE_ACTION failed to create with error:
    ORA-06550: line 2, column 2:
    PLS-00201: identifier 'SYS.DBMS_DEFER_IMPORT_INTERNAL' must be declared
    ORA-06550: line 2, column 2:
    PL/SQL: Statement ignored
    Failing sql is:
    BEGIN
    SYS.DBMS_DEFER_IMPORT_INTERNAL.QUEUE_IMPORT_CHECK('ORCL','IBMPC/WIN_NT64-9.1.0');
    END;
    ORA-39083: Object type PRE_TABLE_ACTION failed to create with error:
    ORA-06550: line 2, column 2:
    PLS-00201: identifier 'SYS.DBMS_DEFER_IMPORT_INTERNAL' must be declared
    ORA-06550: line 2, column 2:
    PL/SQL: Statement ignored
    Failing sql is:
    BEGIN
    SYS.DBMS_DEFER_IMPORT_INTERNAL.QUEUE_IMPORT_CHECK('ORCL','IBMPC/WIN_NT64-9.1.0');
    END;
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "ASIM"."SYS_IMPORT_FULL_01" completed with 7 error(s) at 01:52:18
    SQL>
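    Since the dump was taken from the SYSTEM schema, importing it as ASIM without a remap sends the tables straight back to SYSTEM, where they already exist and are skipped. A hedged sketch of the remapped round trip (assuming the 50 tables and 40 views are the only application objects involved; a name filter could be added to INCLUDE if they are not):
    expdp system/<password> directory=datadump1 dumpfile=asim.dmp include=TABLE include=VIEW
    impdp system/<password> directory=datadump1 dumpfile=asim.dmp remap_schema=SYSTEM:ASIM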

  • Expdp and impdp Problem

    Hi Experts,
    I want to export all the tables from schema HR to a new user TEST in the same database. When I try to import, it shows that the objects already exist and ends up with an error message.
    sql>impdp directory=dumplocation dumpfile=test.dmp
    Please advise me where I have made a mistake; I'm using Oracle 10g.
    Thanks in advance
    Shaan

    Well then, it seems that an object with the same name already exists in the schema into which you are importing.
    To clarify your issue a bit more, maybe you could paste the import syntax you used and the error message you are getting.
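    For reference, a hedged sketch of the syntax being asked about: without REMAP_SCHEMA the dump is loaded straight back into HR, where the objects already exist. Assuming the dump was taken from the HR schema and a directory object named dumplocation:
    impdp system/<password> directory=dumplocation dumpfile=test.dmp remap_schema=HR:TEST logfile=imp_hr_to_test.log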

  • [DATAPUMP] impdp problem

    Hi,
    could you tell me whether any trouble can occur if I run impdp against an existing database when the objects and data included in the dump file already exist? Will existing objects and data just be ignored by impdp by default?
    Regards,
    Przemek Piechota

    Hi,
    In Data Pump, if the parameter CONTENT=DATA_ONLY is specified, the default is APPEND, and the data will be appended to the existing tables.
    TABLE_EXISTS_ACTION can have the following values:
    1) SKIP: leaves the table as is and moves on to the next object. This is not a valid option if the CONTENT parameter is set to DATA_ONLY.
    2) APPEND: loads rows from the source and leaves existing rows unchanged. This is the default option if CONTENT=DATA_ONLY is specified.
    3) TRUNCATE: deletes existing rows and then loads rows from the source.
    4) REPLACE: drops the existing table in the database and then creates and loads it from the source. This is not a valid option if the CONTENT parameter is set to DATA_ONLY.
    http://www.ora600.be/node/4101
    regards
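    For instance (a minimal sketch with placeholder names), appending only the data into tables that already exist in the target would look like:
    impdp system/<password> directory=<DIRECTORY> dumpfile=<DUMP>.dmp content=DATA_ONLY table_exists_action=APPEND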

  • Error while using REMAP_TABLE and WHERE clause  together in IMPDP

    I am trying to move some records from a very large table to another, smaller table.
    I am facing trouble while using REMAP_TABLE and the WHERE (QUERY) clause together in IMPDP.
    The problem is that the data filter is not getting applied and all records are getting imported.
    Here is how I have simulated this; please advise.
    CREATE TABLE TSHARRHB.TMP1
    (
      A  NUMBER,
      B  NUMBER
    );
    begin
    Insert into TSHARRHB.TMP1
       (A, B)
    Values
       (1, 1);
    Insert into TSHARRHB.TMP1
       (A, B)
    Values
       (2, 2);
    COMMIT;
    end;
    expdp system/password TABLES=tsharrhb.TMP1 DIRECTORY=GRDP_EXP_DIR DUMPFILE=TMP1.dmp REUSE_DUMPFILES=YES LOGFILE=EXP.log PARALLEL=8
    impdp system/password DIRECTORY=GRDP_EXP_DIR DUMPFILE=TMP1.dmp LOGFILE=imp.log PARALLEL=8 QUERY='TSHARRHB.TMP1:"WHERE TMP1.A = 2"'  REMAP_TABLE=TSHARRHB.TMP1:TMP3 CONTENT=DATA_ONLY
    Import: Release 11.2.0.1.0 - Production on Fri Dec 13 05:13:30 2013
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, Automatic Storage Management, OLAP, Data Mining
    and Real Application Testing options
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01":  system/********@GRD6.RBSG DIRECTORY=GRDP_EXP_DIR DUMPFILE=TMP1.dmp LOGFILE=SSD_93_TABLES_FULL_EXP.log PARALLEL=8 QUERY=TSHARRHB.TMP1:"WHERE TMP1.A = 2" REMAP_TABLE=TSHARRHB.TMP1:TMP3 CONTENT=DATA_ONLY
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    . . imported "TSHARRHB"."TMP3"                           5.421 KB       2 rows
    Job "SYSTEM"."SYS_IMPORT_FULL_01" successfully completed at 05:13:33
    Here I am expecting only 1 record to be imported, but both records are getting imported. Please advise.

    The strange thing compared to your output is that I get an error when I have table prefix in the query block:
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01":  system/******** DUMPFILE=TMP1.dmp LOGFILE=imp.log PARALLEL=8 QUERY=SYSADM.TMP1:"WHERE TMP1.A = 2" REMAP_TABLE=SYSADM.TMP1:TMP3 CONTENT=DATA_ONLY
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    ORA-31693: Table data object "SYSADM"."TMP3" failed to load/unload and is being skipped due to error:
    ORA-38500: Unsupported operation: Oracle XML DB not present
    Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 1 error(s) at Fri Dec 13 10:39:11 2013 elapsed 0 00:00:03
    And if I remove it, it works:
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01":  system/******** DUMPFILE=TMP1.dmp LOGFILE=imp.log PARALLEL=8 QUERY=SYSADM.TMP1:"WHERE A = 2" REMAP_TABLE=SYSADM.TMP1:TMP3 CONTENT=DATA_ONLY
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    . . imported "SYSADM"."TMP3"                             5.406 KB       1 out of 2 rows
    Job "SYSTEM"."SYS_IMPORT_FULL_01" successfully completed at Fri Dec 13 10:36:50 2013 elapsed 0 00:00:01
    Nicolas.
    PS: as you can see, I'm on 11.2.0.4, I do not have 11.2.0.1 that you seem to use.

  • Problem importing a table with blob's

    hi all, I'm facing the following situation.
    Source DB : 10.2.0.3 (client's DB)
    Destination DB (mine): 10.2.0.4
    I've a dump file (traditional) of a particular schema.
    I'm running import (imp) to import on my DB.
    It runs fine until it reaches one particular table. This table has 6 columns, 3 of them BLOBs.
    This table has 260000 rows (checked against the export log).
    When the import reaches row 152352 it stops loading data, but the import is still running.
    What can I do to get more information about this situation in order to solve the problem?
    Any suggestion will be appreciated!
    Thanks in advance.

    Please identify the source and target OS versions. Are there any useful messages in the alert.log? How long did the export take? A rule of thumb is that the import takes about twice as long. Have you tried expdp/impdp instead? Also see the following:
    How To Diagnose And Troubleshoot Import Or Datapump Import Hung Scenarios          (Doc ID 795034.1)
    How To Find The Cause of a Hanging Import Session          (Doc ID 184842.1)
    Import is Slow or Hangs          (Doc ID 1037231.6)
    Export and Import of Table with LOB Columns (like CLOB and BLOB) has Slow Performance          (Doc ID 281461.1)
    HTH
    Srini
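    To get more information while the session appears to hang, one hedged sketch is to watch the importing session from another connection using the standard dynamic performance views:
    -- Is the long-running operation still making progress?
    SELECT sid, opname, target, sofar, totalwork, time_remaining
      FROM v$session_longops
     WHERE sofar <> totalwork;
    -- What is the session currently waiting on?
    SELECT sid, event, state, seconds_in_wait
      FROM v$session_wait
     WHERE wait_class <> 'Idle';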
