[DATAPUMP] impdp problem

Hi,
could you tell me whether any trouble can occur if I run impdp against an existing database when the dump file contains objects and data? If objects and data already exist, will impdp simply ignore them by default?
Regards,
Przemek Piechota

Hi,
In Data Pump, if the parameter CONTENT=DATA_ONLY is specified, the default is APPEND, and data will be appended to the existing tables.
TABLE_EXISTS_ACTION can take the following values:
1) SKIP: leaves the table as is and moves on to the next object. This is not a valid option if the CONTENT parameter is set to DATA_ONLY.
2) APPEND: loads rows from the source and leaves existing rows unchanged. This is the default if CONTENT=DATA_ONLY is specified.
3) TRUNCATE: deletes existing rows and then loads rows from the source.
4) REPLACE: drops the existing table in the database and then creates and loads it from the source. This is not a valid option if the CONTENT parameter is set to DATA_ONLY.
http://www.ora600.be/node/4101
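As an illustration, here is a hedged sketch of setting TABLE_EXISTS_ACTION explicitly; the username, directory object, and dump file name are placeholders, not taken from the original question:

```shell
# Sketch only: user, directory object, and dump file are placeholders.
# Keep existing rows and append the rows from the dump to tables that
# already exist; tables that do not exist are created as usual.
impdp hr DIRECTORY=dpump_dir DUMPFILE=hr.dmp \
      TABLE_EXISTS_ACTION=APPEND LOGFILE=imp_hr.log
```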
regards

Similar Messages

  • Primary key, unique key impdp problem

Hi, during my impdp I ran into constraint-violation errors, so I disabled all constraints in the proper order. It didn't help; I still get the same errors. I checked some of the individual constraints that were problematic, and they were all disabled, as I expected. I don't understand why Data Pump still shows these errors even though the constraints are DISABLED.

But if a unique index exists on the target table, it does not matter whether the PK is disabled or not. You would need to drop the index; however, this shows you have a data problem.
Why do you have duplicate keys? If you import duplicates, then rebuilding the unique index will fail with an error.
Do you need to truncate or drop some of the target tables before importing?
These are points I raised in my first response and which you have not addressed in your answer.
    HTH -- Mark D Powell --
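To check whether a surviving unique index (rather than the disabled constraint) is what is enforcing uniqueness, a query along these lines can help; the table name is a placeholder:

```sql
-- Placeholder table name: replace MY_TABLE with the problem table.
-- A disabled PK/UK no longer blocks inserts, but any remaining
-- UNIQUE index on the same columns still will.
SELECT index_name, uniqueness, status
FROM   user_indexes
WHERE  table_name = 'MY_TABLE'
AND    uniqueness = 'UNIQUE';
```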

  • Datapump impdp error due to streams capture

    Hello,
Last night I was trying to set up a light backup/restore scheme using Data Pump expdp and impdp.
Env: Oracle 10g, Red Hat EL 3.
The backup (expdp) exports a schema and works smoothly.
The import, for the whole schema or for a single table, fails badly with:
    impdp system/***** directory=dpump_dir_1 dumpfile=aldrapay.dmp tables=aldrapay.transaction
    Master table "SYSTEM"."SYS_IMPORT_TABLE_09" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_TABLE_09": system/******** directory=dpump_dir_1 dumpfile=aldrapay.dmp tables=aldrapay.transaction
    ORA-39125: Worker unexpected fatal error in KUPW$WORKER.DISPATCH_WORK_ITEMS while calling SYS.DBMS_INTERNAL_SAFE_SCN.SCN.SET_SESSION_STATE [TABLE_DATA:"SYSTEM"."SYS_IMPORT_TABLE_09"]
    ORA-00439: feature not enabled: Streams Capture
    ----- PL/SQL Call Stack -----
    object line object
    handle number name
    0x57ae0e08 13460 package body SYS.KUPW$WORKER
    0x57ae0e08 5810 package body SYS.KUPW$WORKER
    0x57ae0e08 6459 package body SYS.KUPW$WORKER
    0x57ae0e08 1208 package body SYS.KUPW$WORKER
    0x587a951c 2 anonymous block
Can anybody help? I cannot seem to find even "datapump" or "data pump" on any forum here!
    Cheers,
    Radu

This entry is quite old, and still no answer...
I have the same problem: when importing using impdp I get ORA-00439. We use the Standard Edition of Oracle 10.1, and Streams is a feature of the Enterprise Edition. But I do not want to use Streams; I only want to do export/import faster than the old exp/imp.
So my question is: does Oracle Data Pump need Oracle Streams?
I also set STREAMS_CONFIGURATION=n, but this doesn't help.
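For reference, the parameter is STREAMS_CONFIGURATION and takes y/n; a sketch of the import from the earlier post with it disabled is below. Hedged note: this only skips importing Streams configuration objects from the dump; it may not avoid the internal call that raises ORA-00439 on Standard Edition, which looks like a product issue worth raising with Oracle Support.

```shell
# Sketch: skip Streams configuration objects during the import.
impdp system DIRECTORY=dpump_dir_1 DUMPFILE=aldrapay.dmp \
      TABLES=aldrapay.transaction STREAMS_CONFIGURATION=n
```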

  • DATAPUMP SIZE PROBLEM ;;; PLS HELP ME

Hi everybody,
A question for the Oracle DBA experts: every day I generate one dump file larger than 14 GB; it is a full Data Pump export. I would like to know whether compressing the DMP files could cause a problem. I read an article on the web saying that compressing a DMP file with zip utilities can corrupt it, so I could end up with a corrupted file once I decompress it. Any suggestions? Help, please?
    Thx

Thank you; please read this article, in French:
http://oracle.developpez.com/guide/sauvegarde/generalites/#L4.2.d
The author says, in red (translated): "Never compress a dump file with a system compression tool such as WinZip (Windows) or GZIP (Linux/Unix); you risk nasty surprises when decompressing it."
Is this wrong?
    Message was edited by:
    HAGGAR

  • Datapump impdp error LPX-00216

Hi gurus, I am trying to import a 150 GB database, and it failed with the error below:
    ### Detailed Problem Statement ###
    impdp "'/ as sysdba'" directory=dpdata dumpfile=episfile1%U.dmp full=y job_name=episuatimport2
    File List :
    -rwxr-x--x 1 oracle dba 23075409920 Jan 12 17:02 episfile01.dmp
    -rwxr-x--x 1 oracle dba 27706929152 Jan 12 17:44 episfile02.dmp
    -rwxr-x--x 1 oracle dba 15957520384 Jan 12 18:09 episfile03.dmp
    -rwxr-x--x 1 oracle dba 12562350080 Jan 12 22:27 episfile04.dmp
    Import: Release 10.2.0.2.0 - 64bit Production on Tuesday, 13 January, 2009 8:19:57
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.2.0 - 64bit Production
    With the Partitioning and Data Mining options
    ORA-39002: invalid operation
    ORA-31694: master table "SYS"."EPISUATIMPORT2" failed to load/unload
    ORA-02354: error in exporting/importing data
    ORA-39774: parse of metadata stream failed with the following error:
    LPX-00216: invalid character 0 (0x0
Please advise how to resolve this. Is it caused by dump file corruption?
    thanks.

    Hi,
I am not sure whether the error was raised due to NLS settings. Try checking Metalink note 603415.1.
That might help you.
    - Pavan Kumar N
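One thing worth double-checking (an observation, not a confirmed diagnosis): the files on disk are named episfile01.dmp through episfile04.dmp, but the DUMPFILE template episfile1%U.dmp expands %U to a two-digit sequence, so it would look for episfile101.dmp and so on. A template matching the actual names would be:

```shell
# %U expands to a fixed-width two-digit integer starting at 01,
# so this template matches episfile01.dmp .. episfile04.dmp.
impdp "'/ as sysdba'" DIRECTORY=dpdata DUMPFILE=episfile%U.dmp \
      FULL=y JOB_NAME=episuatimport2
```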

  • An impdp problem

    Hi,
11gR2, on Linux x86_64.
I did a successful export (expdp) from the source DB using "/ as sysdba".
When using impdp, I see this error:
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning and Automatic Storage Management options
    ORA-39002: invalid operation
    ORA-31694: master table "SYS"."IMP_L2" failed to load/unload
    ORA-31644: unable to position to block number 226430 in dump file "/tmp/dump/l2/l2_2.dmp"
    ORA-19501: read error on file "/tmp/dump/l2/l2_2.dmp", block number 1 (block size=4096)
    ORA-27070: async read/write failed
    Linux-x86_64 Error: 22: Invalid argument
    Additional information: 3
It seems like block corruption, but I did another export/import and got the same error. What is the problem?
Here is impdp's parfile:
    USERID="/ as sysdba"
    DIRECTORY=data_pump_dir1
    LOGFILE=data_pump_dir1:import_l2.log
    JOB_NAME=imp_l2
    DUMPFILE=l2.dmp
    REMAP_SCHEMA=source_schema:target_schema
    REMAP_TABLESPACE=(source_1:target_1,
    source_1:target_2)
TABLES=(source_schema.table1,
source_schema.table2)
    thank you,

    /tmp/dump/l2 >
    total 912780
    drwxrwxr-x 3 oracle dba 60 Oct 7 21:14 ../
    -rwxrwxrwx 1 oracle dba 932839424 Oct 10 15:04 l2.dmp*
    -rwxrwxr-x 1 oracle dba 4246 Oct 10 15:09 impl2.par*
    -rw-r--r-- 1 oracle dba 0 Oct 10 15:09 import_l2.log
    drwxrwxr-x 2 oracle dba 140 Oct 10 15:09 ./
    /tmp/dump >
    total 0
    drwxrwxr-x 3 oracle dba 60 Oct 7 21:14 ./
    drwxrwxrwt 6 root root 120 Oct 10 15:05 ../
    drwxrwxr-x 2 oracle dba 140 Oct 10 15:09 l2/
    tmp >
    total 8
    drwxr-xr-x 24 root root 4096 Jun 30 02:36 ../
    drwxrwxr-x 3 oracle dba 60 Oct 7 21:14 dump/
    drwxrwxrwt 6 root root 120 Oct 10 15:05 ./
The permissions seem like they should be OK...
but I still get the error.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning and Automatic Storage Management options
    ORA-39002: invalid operation
    ORA-31694: master table "SYS"."IMP_L2" failed to load/unload
    ORA-31644: unable to position to block number 226430 in dump file "/tmp/dump/l2/l2.dmp"
    ORA-19501: read error on file "/tmp/dump/l2/l2.dmp", block number 1 (block size=4096)
    ORA-27070: async read/write failed
    Linux-x86_64 Error: 22: Invalid argument
    Additional information: 3
    thank you
    Edited by: ROY123 on Oct 10, 2011 8:14 AM

  • IMPDP problem

Hi, I am getting a strange problem. From a full expdp, when I try to import ("impdp") the user's data (the sirsi user) from the production database into the test database (as the test user), I found that the import changes the test database user's password (to the same password as production).
For example, assume:
the production database password: production123
the test database password: test123
But when I try to import ("impdp") production into the test database:
F:\expdp>impdp directory=expdp dumpfile=21092013_JICLIBDB.DMP logfile=imp_sirsi.log remap_schema=sirsi:test statistics=none
impdp completes successfully, but no data is imported into the test user.
I am also getting the following error:
    CREATE TABLESPACE "ACCOUNTABILITY_DAT" DATAFILE 'D:\ORACLE\APP\PRODUCT\11.2.0\JICLIBDB\ACCOUNTABILITY01.DAT' SIZE 10485760 AUTOEXTEND ON NEXT 8192 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 819
    ORA-39083: Object type TABLESPACE failed to create with error:
    ORA-01119: error in creating database file 'D:\ORACLE\APP\PRODUCT\11.2.0\JICLIBDB\ACCOUNTABILITY01.IDX'
    ORA-27040: file create error, unable to create file
    OSD-04002: unable to open file
    O/S-Error: (OS 21) The device is not ready.
My database version: 11.2.0.1
    OS: windows

    Hi,
>> it is changing the default database password.
What do you mean by "default password"? You are doing a schema export/import; if the schema does not exist, impdp will create it with all the inherited properties of the source schema, including its password.
To prevent this, create the schema before the import; during the import, only the CREATE statement will then be skipped (or raise an error).
    HTH
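A minimal sketch of pre-creating the target schema; the tablespace names and grants are assumptions, and the password is the one from the earlier example:

```sql
-- Pre-create the target user with its own password so impdp skips the
-- CREATE USER statement instead of cloning the source user's password.
-- Tablespace names and grants are assumptions; adjust to your database.
CREATE USER test IDENTIFIED BY test123
  DEFAULT TABLESPACE users
  TEMPORARY TABLESPACE temp;
GRANT CONNECT, RESOURCE TO test;
```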

  • Impdp - Problem with import database

    Hi Gurus,
just a short question: what is the procedure for importing an Oracle database?
I have an exported dump file of my Oracle instance and I would like to import it on other servers.
I don't know why I can't import it correctly. My procedure is:
1) Export the Oracle instance using expdp from server A.
2) On server B, create a clean Oracle instance.
3) On server B, import the file exported from server A using impdp.
Is this OK? Should I create a new Oracle instance, or will it be created automatically?
During the import process I get a lot of errors, mostly...
"Table already exists" (CTXSYS, for example)...
Can anyone tell me whether this is a good procedure?

Dlugasx wrote:
Hi Gurus,
just a short question: what is the procedure for importing an Oracle database?
I have an exported dump file of my Oracle instance and I would like to import it on other servers.
I don't know why I can't import it correctly. My procedure is:
1) Export the Oracle instance using expdp from server A.
2) On server B, create a clean Oracle instance.
I hope you have created it, judging from the errors you describe below.
3) On server B, import the file exported from server A using impdp.
Is this OK? Should I create a new Oracle instance, or will it be created automatically?
During the import process I get a lot of errors, mostly...
"Table already exists" (CTXSYS, for example)...
Can anyone tell me whether this is a good procedure?
I think you are trying to do a full expdp and a full impdp. You can ignore the "already exists" errors: when you install Oracle and create a database, it will already contain those objects, and when they exist the import will throw these errors. You are probably interested in the application objects instead. Once the import is complete, you can query dba_objects and compare with the production data to check whether all objects have been imported.
    Anil Malkai
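A hedged sketch of that comparison query (the schema name is a placeholder); running it on both source and target and diffing the counts per object type is usually enough:

```sql
-- Count objects per type for one application schema; compare the
-- result between production and the freshly imported database.
SELECT owner, object_type, COUNT(*) AS object_count
FROM   dba_objects
WHERE  owner = 'APP_SCHEMA'   -- placeholder: your application schema
GROUP  BY owner, object_type
ORDER  BY object_type;
```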

  • Datapump impdp related issues, please do suggest

I am running an impdp import on Oracle 10g and got the following errors at the end of the import. Please suggest how to solve this.
    ORA-39083: Object type TABLE_STATISTICS failed to create with error:
    ORA-06502: PL/SQL: numeric or value error
    ORA-06512: at "SYS.KUPW$WORKER", line 11729
    ORA-06502: PL/SQL: numeric or value error: character string buffer too small
    Failing sql is:
    DECLARE SREC DBMS_STATS.STATREC; BEGIN SREC.MINVAL := '466171'; SREC.MAXVAL := '776C616E5370656564'; SREC.EAVS := 4; SREC.CHVALS := DBMS_STATS.CHARARRAY('Faq',
    'access',
    Job "CMSPROD"."SYS_IMPORT_SCHEMA_01" completed with 2 error(s) at 11:30

10g what? Marketing labels are not very useful; full database versions are better. There are several reported bugs that show the 'character string buffer too small' error, especially in the early 10gR1 versions. That means that, depending on your actual database version, patching could help. Otherwise, contact Oracle Support.
    Werner

  • Expdp and impdp Problem

    Hi Experts,
I want to export all the tables from the hr schema to a new user, test, in the same database. When I try to import, it reports that objects already exist and ends with an error message.
sql>impdp directory=dumplocation dumpfile=test.dmp
Please advise where I have made a mistake; I'm using Oracle 10g.
    Thanks in advance
    Shaan

Well then, it seems that an object with the same name already exists in the schema you are importing into.
To clarify your issue a bit more, maybe you could paste the impdp syntax you used and the error message you are getting. (Note that the impdp command above has no REMAP_SCHEMA, so it imports back into the original schema rather than into the test user.)

  • Datapump impdp issue.....really slow

    hi
Using Data Pump expdp, I could export a 7 GB (partitioned) table in 5 minutes,
whereas impdp has been taking hours and, of course, is still going...
Below is the statement I used
(I'm importing data only, as I'm trying to place the partitions in separate tablespaces):
impdp oldlls/xxxxx tables=oldlls.INTERPRETATIONCALLS content=DATA_ONLY DIRECTORY=data_pump DUMPFILE=int1.dmp,int2.dmp parallel=4
    and status from past 3 hrs show as
    Import: Release 10.2.0.1.0 - 64bit Production on Wednesday, 13 August, 2008 12:25:39
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    Master table "OLDLLS"."SYS_IMPORT_TABLE_02" successfully loaded/unloaded
    Starting "OLDLLS"."SYS_IMPORT_TABLE_02": oldlls/******** tables=oldlls.INTERPRETATIONCALLS content=DATA_ONLY DIRECTORY=data_pump DUMPFILE=int1.dmp,int2.dmp parallel=4
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
I really don't know what's going on off-screen...
Can someone help me with this situation?
OS: Solaris
Oracle Database 10.2.0.1

You can check the status of the import using this:
select to_char(sysdate,'YYYY-MM-DD HH24:MI:SS') "DATE", s.program, s.sid,
       s.status, s.username, d.job_name, p.spid, s.serial#, p.pid
from   v$session s, v$process p, dba_datapump_sessions d
where  p.addr = s.paddr and s.saddr = d.saddr;
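You can also attach to the running job from another session and ask for its progress interactively; a sketch, with the job name taken from the log output above:

```shell
# Attach to the running Data Pump job and query its progress.
impdp oldlls ATTACH=SYS_IMPORT_TABLE_02
# Then, at the Import> prompt:
#   STATUS           -- per-worker state and bytes processed
#   CONTINUE_CLIENT  -- return to logging mode
```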

  • Datapump impdp error

    Hi,
    I am getting this error when I am trying to import a dump (exported with expdp):
    ORA-31626: job does not exist
    ORA-31637: cannot create job SYS_IMPORT_TABLE_01 for user SYS
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT", line 1569
    ORA-39062: error creating master process DM00
    ORA-39105: Master process DM00 failed during startup. Master error:
    ORA-28031: maximum of enabled roles exceeded
I have restarted the DB; there were no tables to drop according to this select statement:
select table_name from user_tables where table_name like 'SYS_IMPORT_%';
and max_enabled_roles is 150:
    sys@ORCL> show parameter max_enable
    NAME TYPE VALUE
    max_enabled_roles integer 150
    Any idea, please?
    Thanks

I'm not sure what version you are running, but I remember a bug similar to this. When a role was created, the schema creating the role was automatically granted that role. This is not the right behavior and was causing an issue. Data Pump needed a fix so that, after creating a role, the role is revoked from the schema running the job. Check to see if there is a patch available for your version.
    Dean

  • Datapump import consuming disk space

    Hi all,
    Windows server 2003
    Oracle 10g
I'm trying to import a schema using Data Pump:
impdp username/password schemas=siebel directory=dpdir dumpfile=schema_exp.dmp status=60
It's importing, but it's consuming massive amounts of disk space, over 40 GB now. Why is it consuming so much space just to import? I'm doing a reorg, so I dropped the biggest production schema and did the import. I didn't resize the datafiles, so where is all this extra disk space going?
Please help; this is a production site and I'm running out of time (and space).

    Tablespace SEGS FRAGS SIZE FREE USED
    PERFSTAT : 0 1 1000 1000 0%
    SBL_DATA : 3448 4 30400 12052 60%
    SBL_INDEX : 7680 3 8192 4986 39%
    SYSAUX : 3137 39 2560 490 81%
    SYSTEM : 1256 1 2048 1388 32%
    TEMP : 0 0%
    TOOLS : 2 1 256 256 0%
    UNDOTBS1 : 11 13 5120 4754 7%
    USERS : 169 14 1024 424 59%
Great, thanks. The tablespaces are not the problem; it is the archive logs. About 100 MB a minute!
My issue started with the first import: not all the data was imported, meaning there were rows missing; I saw this when I did a row count. So I decided to do the import again (following the same steps as mentioned earlier). If after this import rows are still missing, I want to do a point-in-time recovery with RMAN, because then my export dump file must be corrupted. Am I correct in saying this?
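To confirm the import is what is driving the redo, a rough sketch of measuring archive log volume per hour from v$archived_log:

```sql
-- Approximate MB of archived redo generated per hour.
SELECT TRUNC(completion_time, 'HH24') AS hour,
       ROUND(SUM(blocks * block_size) / 1024 / 1024) AS mb_archived
FROM   v$archived_log
GROUP  BY TRUNC(completion_time, 'HH24')
ORDER  BY hour;
```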

  • Datapump + Oracle 10.1

    Hi all,
I have a problem with my expdp.
It launches every night at 8 pm.
Its status is "executing", but it is very slow.
It stays at this step:
Starting "SYSTEM"."SYS_EXPORT_FULL_24": USERID=system/********@ebi FULL=Y directory=EXPORT_EBIR1 dumpfile=testebi.dmp logfile=testebi.log TRACE=1FF0300
Estimate in progress using BLOCKS method...
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
I have traced this job and can see that it is running, but very, very slowly.
Could you tell me what Oracle does at this step?
Should I gather the stats first?

It's still assembling the metadata of the objects to be exported; depending on your actual database, this can take a long time. Unfortunately, there are several performance-related issues; refer to this Metalink note:
Checklist for Slow Performance of Export Data Pump (expdp) and Import DataPump (impdp)
Doc ID: NOTE:453895.1
By the way, what is the reason for the tracing? That should be done only with the assistance of Oracle Support, and it will additionally slow down performance.
    Werner

  • Impdp with exclude issues

    Hi,
I'm on Oracle 10gR2 / Red Hat 4.
I need to import a schema while excluding some big tables.
Here is the syntax that I use:
    [oracle@osiris-rac-dev-2 datapump]$ impdp system DIRECTORY=datapump DUMPFILE=OSIRIS_100702.dump LOGFILE=osiris_light.log remap_schema=OSIMAX:OSIMAX_EU EXCLUDE=TABLE:"IN ('ASP_ADVANCED_STAT','ASP_ERROR_LOG','ASP_ERROR_LOG_OLD','JN_ON_BOARD_DATA_VALUE','ASP_COUNTER_HISTORY')"
    Import: Release 10.2.0.1.0 - Production on Friday, 14 January, 2011 16:54:47
    Copyright (c) 2003, 2005, Oracle.  All rights reserved.
    Password:
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, Real Application Clusters, OLAP and Data Mining options
    ORA-39001: invalid argument value
    ORA-39071: Value for EXCLUDE is badly formed.
ORA-00936: missing expression
Someone suggested using \ before ' and ":
    {code}
    [oracle@osiris-rac-dev-2 datapump]$ impdp system DIRECTORY=datapump DUMPFILE=OSIRIS_100702.dump LOGFILE=osiris_light.log remap_schema=OSIRIS:OSIRIS_EU1 EXCLUDE=TABLE:\"IN (\'ASP_ADVANCED_STAT\',\'ASP_ERROR_LOG\',\'ASP_ERROR_LOG_OLD\',\'JN_ON_BOARD_DATA_VALUE\',\'ASP_COUNTER_HISTORY\')\"
    -bash: syntax error near unexpected token `('
    {code}
What is the right syntax?
    Regards,

If you use a PARFILE, it avoids the shell metacharacter conflicts that occur on the command line.
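For example, moving the options into a parameter file sidesteps both the shell and the Data Pump quoting problems; the file name is arbitrary, and the values are taken from the post above:

```text
# exclude.par -- no shell escaping is needed inside a parameter file
DIRECTORY=datapump
DUMPFILE=OSIRIS_100702.dump
LOGFILE=osiris_light.log
REMAP_SCHEMA=OSIMAX:OSIMAX_EU
EXCLUDE=TABLE:"IN ('ASP_ADVANCED_STAT','ASP_ERROR_LOG','ASP_ERROR_LOG_OLD','JN_ON_BOARD_DATA_VALUE','ASP_COUNTER_HISTORY')"
```

It is then invoked as impdp system PARFILE=exclude.par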
