DataPump Import

Hi,
I am facing a small problem with Data Pump import. I exported a schema using Data Pump export (expdp) and used the same dump file for import with impdp. Once the import is over, I see that all procedures, triggers, etc. are created with the object name in double quotes, such as CREATE OR REPLACE PROCEDURE "ACCOUNTTRANSFERDETAIL_INSERT". When doing a database difference, these all come up as mismatches. Is there any way to overcome this?
When I did an export and import in Oracle 9i, everything was OK.
Now our QA team is doing Oracle 10g export and import (EXP & IMP) activities, while I am using Oracle 10g Data Pump for export and import (expdp & impdp); this is where we are facing the problem. The comparison is between Oracle 10g (exp & imp) and Oracle 10g Data Pump (expdp & impdp).
For their schema there were no double quotes, but when I refreshed a schema using impdp, the objects were enclosed in double quotes.
I also checked the data dictionary, and there too I see the object names enclosed in double quotes.
If anybody has a solution, please reply.
Thanks
Regards,
Ganesh R

Somehow I don't think this is a problem with a download. You'll get a lot more eyeballs on your question if you ask in the right place.
Since Data Pump is part of the database, you might try asking your question in a Database forum (go up 2 levels and take a look at the screen).
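As a side note, the double quotes are simply how Data Pump / DBMS_METADATA writes identifiers; the objects themselves are identical. If the comparison tool can't be told to ignore them, one workaround is to strip the quotes from both DDL extracts before diffing. A minimal sketch (the file names are hypothetical, and it assumes no identifier actually depends on quoting, e.g. mixed case):

```shell
# Remove double quotes so quoted and unquoted DDL compare equal.
normalize() { tr -d '"' < "$1"; }

# Hypothetical usage against two DDL extracts:
#   diff <(normalize schema_exp_imp.sql) <(normalize schema_datapump.sql)
echo 'CREATE OR REPLACE PROCEDURE "ACCOUNTTRANSFERDETAIL_INSERT"' | tr -d '"'
```

This prints the unquoted form, CREATE OR REPLACE PROCEDURE ACCOUNTTRANSFERDETAIL_INSERT, which matches what exp/imp-created schemas show.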

Similar Messages

  • Datapump import doesn't filter out table when mentioned in 'exclude'

    The below Data Pump import fails to filter out the 'COUNTRIES' table. I thought that was what it was supposed to do? Any suggestions?
    C:\Documents and Settings\>impdp remap_schema=hr:hrtmp dumpfile=hr1.dmp parfile='C:\Documents and Settings\para.par'
    Import: Release 10.2.0.1.0 - Production on Monday, 18 January, 2010 15:00:05
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Username: / as sysdba
    Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    Master table "SYS"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYS"."SYS_IMPORT_FULL_01": /******** AS SYSDBA remap_schema=hr:hrtmp dumpfile=hr1.dmp parfile='C:\Documents a
    nd Settings\para.par'
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "HRTMP"."COUNTRIES" 6.093 KB 25 rows
    . . imported "HRTMP"."DEPARTMENTS" 6.640 KB 27 rows
    . . imported "HRTMP"."EMPLOYEES" 15.77 KB 107 rows
    . . imported "HRTMP"."JOBS" 6.609 KB 19 rows
    . . imported "HRTMP"."JOB_HISTORY" 6.585 KB 10 rows
    . . imported "HRTMP"."LOCATIONS" 7.710 KB 23 rows
    . . imported "HRTMP"."REGIONS" 5.296 KB 4 rows
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/VIEW/VIEW
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "SYS"."SYS_IMPORT_FULL_01" successfully completed at 15:00:13
    para.par
    ========
    exclude=TABLES:"like '%COUNTRIES%'"

    Hi,
    The first thing I see is that you excluded TABLES (plural). The object type is not pluralized in DataPump. The parfile should look like:
    exclude=TABLE:"like '%COUNTRIES%'"
    Dean

  • ORACLE 10g : Datapump-Import : Error: unknown parameter name 'REMAP_TABLE'

    Hi,
    I am working on Oracle 10g. When I executed the import Data Pump script on the UNIX box with the option "REMAP_TABLE", I received the error "unknown parameter name 'REMAP_TABLE'". Can you please give me a solution for it?
    Scripts :-
    impdp eimsexp/xyz TABLES=EIMDBO.DIVISION DIRECTORY=DATAMART_DATA_PUMP_DIR DUMPFILE=expdp_DATAMART_tables_%u_$BATCH_ID.dmp LOGFILE=impdp_DATAMART_tables.log REMAP_TABLE=EIMDBO.DIVISION:EIM2DBO:YR2009_DIM_DIVISION
    Note: the YR2009_DIM_DIVISION table is available in the target database. It is a non-partitioned table. EIMDBO.DIVISION is a partitioned table.
    Thanks,

    See your post here
    ORACLE 10g : Datapump-Import : Error: unknown parameter name 'REMAP_TABLE'
    Srini

  • Datapump import of tables will automatically rebuild indexes ?

    Dear all,
    does a Data Pump import of tables automatically rebuild the indexes on the associated tables?
    Urgent response please

    Yes, indexes are rebuilt.
    From dba-oracle
    Set indexes=n – Index creation can be postponed until after import completes, by specifying indexes=n. If indexes for the target table already exist at the time of execution, import performs index maintenance when data is inserted into the table. Setting indexes=n eliminates this maintenance overhead. You can also use the indexfile=filename parm to rebuild all the indexes once, after the data is loaded. When editing the indexfile, add the nologging and parallel keywords (where parallel degree = cpu_count-1).
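    Note that indexes=n and indexfile= are parameters of the original imp utility, not of impdp. A rough Data Pump equivalent (a sketch; file, directory, and connect details are hypothetical) is to exclude indexes during the data load and extract the index DDL into a script with SQLFILE:

    ```
    # In the import parfile: skip index creation during the data load
    EXCLUDE=INDEX

    # Then, in a second impdp run, write the index DDL to a script that
    # can be edited (e.g. to add NOLOGGING/PARALLEL) and run manually:
    #   impdp hr/hr dumpfile=hr.dmp include=INDEX sqlfile=create_indexes.sql
    ```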

  • Datapump import doubt

    Hi,
    Oracle Version:10.2.0.1
    Operating System:Linux
    Can we run two Data Pump import processes at the same time, on the same dump file, on the same machine, in parallel?
    Thanks & Regards,
    Poorna Prasad.

    Neerav999 wrote:
    NO you cannot run Data Pump imports in parallel: the import process starts by creating a master process, worker processes, a client process, and a shadow process, and no import can be done until these processes are free.

    I am not sure what you want to say here. Do you mean to say that we can't run two parallel import processes since the processes won't be free? Please see below,
    E:\>time
    The current time is: 18:24:17.04
    Enter the new time:
    E:\>impdp system/oracle directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr2
    Import: Release 11.1.0.6.0 - Production on Saturday, 06 February, 2010 18:24:25
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "SYSTEM"."SYS_IMPORT_FULL_02" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_02":  system/******** directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr2
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "HR2"."COUNTRIES"                           6.375 KB      25 rows
    . . imported "HR2"."DEPARTMENTS"                         7.015 KB      27 rows
    . . imported "HR2"."EMPLOYEES"                           16.80 KB     107 rows
    . . imported "HR2"."JOBS"                                6.984 KB      19 rows
    . . imported "HR2"."JOB_HISTORY"                         7.054 KB      10 rows
    . . imported "HR2"."LOCATIONS"                           8.273 KB      23 rows
    . . imported "HR2"."REGIONS"                             5.484 KB       4 rows
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/VIEW/VIEW
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
    Job "SYSTEM"."SYS_IMPORT_FULL_02" successfully completed at 18:24:44
    And the second session,
    E:\Documents and Settings\aristadba>time
    The current time is: 18:24:19.37
    Enter the new time:
    E:\Documents and Settings\aristadba>impdp system/oracle directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr
    Import: Release 11.1.0.6.0 - Production on Saturday, 06 February, 2010 18:24:22
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01":  system/******** directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:h
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "HR1"."COUNTRIES"                           6.375 KB      25 rows
    . . imported "HR1"."DEPARTMENTS"                         7.015 KB      27 rows
    . . imported "HR1"."EMPLOYEES"                           16.80 KB     107 rows
    . . imported "HR1"."JOBS"                                6.984 KB      19 rows
    . . imported "HR1"."JOB_HISTORY"                         7.054 KB      10 rows
    . . imported "HR1"."LOCATIONS"                           8.273 KB      23 rows
    . . imported "HR1"."REGIONS"                             5.484 KB       4 rows
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/VIEW/VIEW
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
    Job "SYSTEM"."SYS_IMPORT_FULL_01" successfully completed at 18:24:44
    Can you see the time difference? It's just 2 seconds between the start of the two jobs!
    Would you be kind enough to explain the statement that you gave just now, since there is a good chance I may not have understood it.
    HTH
    Aman....
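    For completeness, the two concurrent sessions above could equally be launched from a single shell (a sketch; credentials and directory are those from the log, the second remap target is hypothetical):

    ```
    impdp system/oracle directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr2 &
    impdp system/oracle directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr3 &
    wait   # both imports read the same dump file in parallel
    ```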

  • DataPump Import over Network and Parallel parameter

    I see and understand the advantages of using the PARALLEL parameter with Data Pump dump files. What I am trying to figure out is whether there is any advantage to the PARALLEL parameter when importing data over the network. I have noticed there are sweet spots for the number of workers and dump files depending on the data; sometimes ten workers and ten dump files are no faster, if not slower, than three. I have not been able to find a clear answer on this. Any advice or explanation would be appreciated.
    Thanks

    I am trying to figure out is there any advantage to the parallel parameter when importing data over the network?

    Reading data over the network is slower than the other options available to Data Pump.
    The performance of a parallel Data Pump import is governed by I/O contention. You have to determine the best PARALLEL value for your system by testing the export and import.

  • [10G] Data Pump import automatically creates the USER

    Product: ORACLE SERVER
    Date written: 2004-05-28
    [10G] Data Pump import automatically creates the USER
    ===================================
    PURPOSE
    This note introduces a feature of Data Pump.
    Explanation
    With the imp utility, the target user had to exist beforehand.
    With Data Pump, however, if the target user does not exist,
    it is created automatically.
    Example :
    =============
    There is a user called TEST in the source database.
    TEST ( password : test , role : connect , resource )
    Export the TEST schema using datapump:
    expdp system/oracle dumpfile=exp.dmp schemas=TEST
    Case I
    =======
    If the TEST user does not exist, the import creates the TEST user automatically.
    impdp system/oracle dumpfile=exp.dmp
    *************TEST does not exist*************************************
    Import: Release 10.1.0.2.0 - Production on Friday, 28 May, 2004 1:02
    Copyright (c) 2003, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Produc
    tion
    With the Data Mining option
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01": system/******** dumpfile=exp.dmp
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/SE_PRE_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/DB_LINK
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "TEST"."DEPT" 5.648 KB 4 rows
    . . imported "TEST"."SALGRADE" 5.648 KB 10 rows
    . . imported "TEST"."BONUS" 0 KB 0 rows
    . . imported "TEST"."EMP" 0 KB 0 rows
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Job "SYSTEM"."SYS_IMPORT_FULL_01" successfully completed at 01:02
    connect TEST/TEST (on target database)
    => connected
    SQL> select * from session_roles;
    ROLE
    connect
    resource
    Case II
    ========
    If the TEST user already exists in the target database, a warning message is raised
    and the import operation continues.
    impdp system/oracle dumpfile=exp.dmp
    *************user TEST already exists************************************
    Import: Release 10.1.0.2.0 - Production on Friday, 28 May, 2004 1:06
    Copyright (c) 2003, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Produc
    tion
    With the Data Mining option
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01": system/******** dumpfile=exp.dmp
    Processing object type SCHEMA_EXPORT/USER
    ORA-31684: Object type USER:"TEST" already exists
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/SE_PRE_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/DB_LINK
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "TEST"."DEPT" 5.648 KB 4 rows
    . . imported "TEST"."SALGRADE" 5.648 KB 10 rows
    . . imported "TEST"."BONUS" 0 KB 0 rows
    . . imported "TEST"."EMP" 0 KB 0 rows
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 1 error(s) at 01:06
    You will receive an ORA-31684 error but the import will continue.
    Case - III
    ===========
    If the TEST user exists in the target database, you can avoid the warning (ORA-31684)
    by using EXCLUDE=USER.
    impdp system/oracle dumpfile=exp.dmp exclude=user
    *********Disable create user statement as user TEST already exists***********
    Import: Release 10.1.0.2.0 - Production on Friday, 28 May, 2004 1:11
    Copyright (c) 2003, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Produc
    tion
    With the Data Mining option
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01": system/******** dumpfile=exp.dmp exclud
    e=user
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/SE_PRE_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/DB_LINK
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "TEST"."DEPT" 5.648 KB 4 rows
    . . imported "TEST"."SALGRADE" 5.648 KB 10 rows
    . . imported "TEST"."BONUS" 0 KB 0 rows
    . . imported "TEST"."EMP" 0 KB 0 rows
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Job "SYSTEM"."SYS_IMPORT_FULL_01" successfully completed at 01:11
    If the TEST user does not exist and you use the EXCLUDE=USER parameter,
    you will get an ORA-01917 error.
    Reference Documents :
    Note : 272132.1 DATAPUMP import automatically creates schema

  • Alter data on datapump import 10G?

    Hi,
    I see this is possible with a call to data_remap() in 11g. Is there an alternative or simple workaround for Data Pump import in 10g? I have one field that needs to be toggled occasionally when doing the import. Right now, the person who wrote the refresh job exports the active partitions from prod and imports into dev. If the data needs to go into partition2, she updates the field, exports again, and then reimports, putting it, at that time, into the correct partition. I thought I might be able to change the value on the import (or the export if possible). Does anyone have any suggestions on this?
    Thanks, Pete

    REMAP_DATA sounds like it would work if you were on 11g, but since you are on 10g, the only thing I can think of is a two-step process. First, create the table, then create a trigger on the table to do the necessary data remap. Then load the data using the CONTENT=DATA_ONLY parameter. I am not sure how this will perform, since the trigger will fire on all of the inserts, but I think it could be made to work.
    Dean
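    Dean's trigger workaround might be sketched like this (table name, column name, and values are all hypothetical):

    ```sql
    -- Hypothetical: flip the partition-key field as rows arrive
    -- from a CONTENT=DATA_ONLY import.
    CREATE OR REPLACE TRIGGER remap_part_field
    BEFORE INSERT ON refresh_table
    FOR EACH ROW
    BEGIN
      IF :NEW.part_field = 'PART1' THEN
        :NEW.part_field := 'PART2';
      END IF;
    END;
    /
    -- Load with: impdp ... content=data_only
    -- and drop the trigger once the load completes.
    ```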

  • Datapump import takes ages

    Hi,
    I'm doing an import of a schema with one table with about 1 million (spatial) records. This import takes a very long time to complete: I ran the import on Friday, and when I looked at the process on Monday I saw it had finished at least 24 hours after I started it. Here's the output:
    Z:\>impdp system/password@ora schemas=bla directory=dpump_dir1 dumpfile=schema8.dmp
    Import: Release 10.1.0.2.0 - Production on Friday, 21 January, 2005 16:32
    Copyright (c) 2003, Oracle. All rights reserved.
    Connected to: Personal Oracle Database 10g Release 10.1.0.2.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Master table "SYSTEM"."SYS_IMPORT_SCHEMA_02" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_SCHEMA_02": system/********@ora schemas=bla directory=dpump_dir1 dumpfile=schema8.dmp
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/SE_PRE_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "BLA"."NLD_NW" 266.1 MB 1217147 rows
    Job "SYSTEM"."SYS_IMPORT_SCHEMA_02" successfully completed at 15:22
    The dumpfile was created with:
    Z:\>expdp system/password@orcl schemas=bla directory=dpump_dir1 dumpfile=schema8.dmp
    The source DB is running on Linux and the target is running WinXP.
    I thought datapump was meant to be really fast, what's going wrong in my situation?

    ...I've just looked at the time needed to do the import into the same database from which I exported the dump file. That import operation takes 5 minutes.

  • Datapump import via Networklink hangs

    Hello,
    I am trying to clone my database with Data Pump over a network link. (Both Oracle servers are version 10.2.0.4 on a Linux Enterprise Server.)
    First, I create a new empty database:
    - Create and test the database link
    - On the original DB I grant system to FULL_EXP
    - On the destination DB I grant system to FULL_IMP
    My Parfile :
    userid=system/**********
    full=Y
    table_exists_action=replace
    logfile=VAMLUI10_full_impdb.log
    directory=DATA_PUMP_DIR
    parallel=4
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/tools01.dbf':'/oracle/oradata/VAMLUI10/tools01.dbf'"
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/ASPROLU_D01_01.dbf':'/oracle/oradata/VAMLUI10/ASPROLU_D01_01.dbf'"
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/ASPROLU_D01_02.dbf':'/oracle/oradata/VAMLUI10/ASPROLU_D01_02.dbf'"
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/ASPROLU_D01_03.dbf':'/oracle/oradata/VAMLUI10/ASPROLU_D01_03.dbf'"
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/ASPROLU_D01_04.dbf':'/oracle/oradata/VAMLUI10/ASPROLU_D01_04.dbf'"
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/ASPROLU_l01_02.dbf':'/oracle/oradata/VAMLUI10/ASPROLU_l01_02.dbf'"
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/ASPROLU_I01_01.dbf':'/oracle/oradata/VAMLUI10/ASPROLU_I01_01.dbf'"
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/USR_01.dbf':'/oracle/oradata/VAMLUI10/USR_01.dbf'"
    network_link=VAMLUIOL.WORLD
    FLASHBACK_TIME="TO_TIMESTAMP('29.05.2012 07:00:00', 'DD.MM.YYYY HH24:MI:SS')"
    After a few minutes the job hangs at
    . imported "TSMSYS"."SRS$" 0 rows
    Job Status = Waiting
    (Waiting? What is the job waiting on?)
    Does anyone have any idea / solution?

    Hi,
    Below is the result. The first job is still executing:
    SQL> select JOB_NAME, JOB_MODE, STATE, OPERATION from dba_datapump_jobs;
    JOB_NAME               JOB_MODE   STATE         OPERATION
    SYS_IMPORT_SCHEMA_01   SCHEMA     EXECUTING     IMPORT
    SYS_IMPORT_FULL_02     FULL       NOT RUNNING   IMPORT
    SYS_IMPORT_FULL_01     FULL       NOT RUNNING   IMPORT
    SYS_IMPORT_FULL_03     FULL       NOT RUNNING   IMPORT
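    To see what a hanging Data Pump session is actually waiting on, one could join the job's sessions to the wait interface, along these lines (a sketch; requires select privileges on the V$ views):

    ```sql
    SELECT s.sid, s.event, s.seconds_in_wait
      FROM v$session s
      JOIN dba_datapump_sessions d ON s.saddr = d.saddr;
    ```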

  • Oracle datapump import

    Oracle 10.2.0.5
    Linux 5
    I am currently performing a full database import using datapump.
    The parfile runs with no error, but it does not import all the tables or data.
    Here is a partial view of the parfile below, and the log content next:
    Parfile:
    impdp system PARFILE=/oraappl/pca/vptch/vlabscr/import_vlab.par
    Parfile content:
    DIRECTORY=VLABDIR
    FULL=YES
    DUMPFILE=fullbackup_EXP.dmp
    logfile=import.log
    REUSE_DATAFILES=YES
    SQLFILE=imp_sql.sql
    REMAP_DATAFILE="'/oraappl/pca/vdevl/vdevldata/sysaux01.dbf':'/oraappl/pca/vptch/vlabdata/vlab/sysaux01.dbf'",
    REMAP_DATAFILE="'/oraappl/pca/vdevl/vdevldata/undo01.dbf':'/oraappl/pca/vptch/vlabdata/vlab/undo01.dbf'",
    REMAP_DATAFILE="'/oraappl/pca/vdevl/vdevldata/defspace01.dbf':'/oraappl/pca/vptch/vlabdata/vlab/defspace01.dbf'",
    REMAP_DATAFILE="'/oraappl/pca/vdevl/vdevldata/volsup201.dbf':'/oraappl/pca/vptch/vlabdata/vlab/volsup201.dbf'",
    REMAP_DATAFILE="'/oraappl/pca/vdevl/vdevldata/volsup101.dbf':'/oraappl/pca/vptch/vlabdata/vlab/volsup101.dbf'",
    Logfile content
    Master table "SYSTEM"."SYS_SQL_FILE_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_SQL_FILE_FULL_01":  system/******** PARFILE=/oraappl/pca/vptch/vlabscr/import_vlab.par
    Processing object type DATABASE_EXPORT/TABLESPACE
    Processing object type DATABASE_EXPORT/PASSWORD_VERIFY_FUNCTION
    Processing object type DATABASE_EXPORT/SYS_USER/USER
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/POST_INSTANCE/PROCACT_INSTANCE
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/POST_INSTANCE/PROCDEPOBJ
    Processing object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCOBJ
    Processing object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCACT_SCHEMA
    Processing object type DATABASE_EXPORT/SCHEMA/PASSWORD_HISTORY
    Processing object type DATABASE_EXPORT/AUDIT
    Job "SYSTEM"."SYS_SQL_FILE_FULL_01" successfully completed at 14:54:58

    You have the "SQLFILE" parameter in your parfile. Check that file: all the DDL is being written to imp_sql.sql, and no data will be inserted. Remove the SQLFILE parameter if you want data to be inserted.
    SQLFILE
    Default: none
    Purpose
    Specifies a file into which all of the SQL DDL that Import would have executed, based on other parameters, is written.
    Syntax and Description
    SQLFILE=[directory_object:]file_name   
    The file_name specifies where the import job will write the DDL that would be executed during the job. The SQL is not actually executed, and the target system remains unchanged. The file is written to the directory object specified in the DIRECTORY parameter, unless another directory_object is explicitly specified here. Any existing file that has a name matching the one specified with this parameter is overwritten.

  • Datapump import error on 2 partitioned tables

    I am trying to run impdp to import two tables that are partitioned and use LOB types; for some reason it always errors out. Has anyone seen this issue in 11g?
    Here is the info:
    $ impdp parfile=elm_rt.par
    Master table "ELM"."SYS_IMPORT_TABLE_05" successfully loaded/unloaded
    Starting "ELM"."SYS_IMPORT_TABLE_05": elm/******** parfile=elm_rt.par
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/INDEX
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/AUDIT_OBJ
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/STATISTICS/TABLE_STATISTICS
    ORA-39014: One or more workers have prematurely exited.
    ORA-39029: worker 1 with process name "DW01" prematurely terminated
    ORA-31671: Worker process DW01 had an unhandled exception.
    ORA-04030: out of process memory when trying to allocate 120048 bytes (session heap,kuxLpxAlloc)
    ORA-06512: at "SYS.KUPW$WORKER", line 1602
    ORA-06512: at line 2
    ORA-39014: One or more workers have prematurely exited.
    ORA-39029: worker 2 with process name "DW01" prematurely terminated
    ORA-31671: Worker process DW01 had an unhandled exception.
    ORA-04030: out of process memory when trying to allocate 120048 bytes (session heap,kuxLpxAlloc)
    ORA-06512: at "SYS.KUPW$WORKER", line 1602
    ORA-06512: at line 2
    Job "ELM"."SYS_IMPORT_TABLE_05" stopped due to fatal error at 13:11:04
    elm_rt.par
    ==========
    DIRECTORY=DP_REGRESSION_DATA_01
    DUMPFILE=ELM_MD1.dmp,ELM_MD2.dmp,ELM_MD3.dmp,ELM_MD4.dmp
    LOGFILE=DP_REGRESSION_LOG_01:ELM_RT.log
    DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
    CONTENT=METADATA_ONLY
    TABLES=RT_AUDIT_IN_HIST,RT_AUDIT_OUT_HIST
    REMAP_TABLESPACE=RT_AUDIT_IN_HIST_DAT01:RB_AUDIT_IN_HIST_DAT01
    REMAP_TABLESPACE=RT_AUDIT_IN_HIST_IDX04:RB_AUDIT_IN_HIST_IDX01
    REMAP_TABLESPACE=RT_AUDIT_OUT_HIST_DAT01:RB_AUDIT_OUT_HIST_DAT01
    PARALLEL=4

    Read Metalink note 286496.1 (Export/Import DataPump Parameter TRACE - How to Diagnose Oracle Data Pump).
    This will help you generate a trace for the Data Pump job.
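    For reference, that note describes adding a TRACE value to the command line, along these lines (the value shown is commonly cited as the full-trace level; treat the exact value as an assumption and confirm against the note):

    ```
    impdp parfile=elm_rt.par TRACE=480300
    ```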

  • Datapump import consuming disk space

    Hi all,
    Windows server 2003
    Oracle 10g
    I'm trying to import a schema using datapump:
    impdp username/password schemas=siebel directory=dpdir dumpfile=schema_exp.dmp status=60
    It's importing, but it's consuming massive amounts of disk space, over 40 GB now. Why is it consuming so much space just to import? I'm doing a reorg, so I dropped the biggest production schema and did the import. I didn't resize the datafiles, so where is all this extra disk space going?
    Please help; this is a production site and I'm running out of time (and space).

    Tablespace    SEGS  FRAGS   SIZE   FREE  USED
    PERFSTAT   :     0      1   1000   1000    0%
    SBL_DATA   :  3448      4  30400  12052   60%
    SBL_INDEX  :  7680      3   8192   4986   39%
    SYSAUX     :  3137     39   2560    490   81%
    SYSTEM     :  1256      1   2048   1388   32%
    TEMP       :     0                          0%
    TOOLS      :     2      1    256    256    0%
    UNDOTBS1   :    11     13   5120   4754    7%
    USERS      :   169     14   1024    424   59%
    Great, thanks. The tablespaces are not the problem; it is the archive logs, about 100 MB a minute!
    My issue started with the first import: not all the data was imported, meaning there were rows missing, which I saw when I did a row count. So I decided to do the import again (following the same steps as mentioned earlier). If after this import the rows are still missing, I want to do a point-in-time recovery with RMAN, because then my export dump file must be corrupted. Am I correct in saying this?

  • Datapump Import is not working

    Database version: 10.2.0.1.0
    Platform: Windows 2003 EE SP3
    I am trying a data refresh using Data Pump. The export was successful, but the import to a different schema failed.
    C:\>expdp hr/hr schemas=hr directory=test_dir dumpfile=hr020608.dmp logfile=hr020608.log
    Export: Release 10.2.0.1.0 - Production on Monday, 02 June, 2008 11:29:54
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Starting "HR"."SYS_EXPORT_SCHEMA_01": hr/******** schemas=hr directory=test_dir dumpfile=hr020608.dmp logfile=hr020608.log
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 448 KB
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/VIEW/VIEW
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "HR"."COUNTRIES" 6.093 KB 25 rows
    . . exported "HR"."DEPARTMENTS" 6.640 KB 27 rows
    . . exported "HR"."EMPLOYEES" 15.77 KB 107 rows
    . . exported "HR"."JOBS" 6.609 KB 19 rows
    . . exported "HR"."JOB_HISTORY" 6.585 KB 10 rows
    . . exported "HR"."LOCATIONS" 7.710 KB 23 rows
    . . exported "HR"."REGIONS" 5.296 KB 4 rows
    Master table "HR"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
    Dump file set for HR.SYS_EXPORT_SCHEMA_01 is:
    C:\TEMP\HR020608.DMP
    ==========================================
    Import
    ===========================================
    C:\>impdp scott/tiger schemas=hr directory=test_dir dumpfile=hr020608.dmp logfile=hrimp020608.log
    Import: Release 10.2.0.1.0 - Production on Monday, 02 June, 2008 12:58:02
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Master table "SCOTT"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
    Starting "SCOTT"."SYS_IMPORT_SCHEMA_01": scott/******** schemas=hr directory=test_dir dumpfile=hr020608.dmp logfile=hrimp020608.log
    ORA-39126: Worker unexpected fatal error in KUPW$WORKER.LOAD_METADATA [TABLE_DATA:"SCOTT"."SYS_IMPORT_SCHEMA_01"]
    SELECT process_order, flags, xml_clob, NVL(dump_fileid, :1), NVL(dump_position, :2), dump_length, dump_allocation, grantor, object_row, object_schema, object_long_name, processing_status, processing_s
    tate, base_object_type, base_object_schema, base_object_name, property, size_estimate, in_progress FROM "SCOTT"."SYS_IMPORT_SCHEMA_01" WHERE process_order between :3 AND :4 AND processing_state <> :5
    AND duplicate = 0 ORDER BY process_order
    ORA-03212: Temporary Segment cannot be created in locally-managed tablespace
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105
    ORA-06512: at "SYS.KUPW$WORKER", line 6279
    ----- PL/SQL Call Stack -----
    object line object
    handle number name
    1DB41780 14916 package body SYS.KUPW$WORKER
    1DB41780 6300 package body SYS.KUPW$WORKER
    1DB41780 3514 package body SYS.KUPW$WORKER
    1DB41780 6889 package body SYS.KUPW$WORKER
    1DB41780 1262 package body SYS.KUPW$WORKER
    1A8E3D94 2 anonymous block
    Job "SCOTT"."SYS_IMPORT_SCHEMA_01" stopped due to fatal error at 12:58:08
    Please help me with this: is it possible to import into the same schema, or only into a different one?
    Thank you
    Johnson P Gerorge

    ORA-03212: Temporary Segment cannot be created in locally-managed tablespace
    Cause: Attempt to create a temporary segment for sort/hash/lobs in a permanent tablespace of kind locally-managed
    Action: Alter temporary tablespace of user to a temporary tablespace or a dictionary-managed permanent tablespace
    Does this database have a default temporary tablespace for all users?
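    A quick way to check, and to fix it if needed, is sketched below; TEMP is assumed to be the name of a genuine temporary tablespace in your database, so adjust it if yours differs:

    ```sql
    -- database-wide default temporary tablespace
    SELECT property_value
      FROM database_properties
     WHERE property_name = 'DEFAULT_TEMP_TABLESPACE';

    -- temporary tablespace assigned to the importing user
    SELECT temporary_tablespace FROM dba_users WHERE username = 'SCOTT';

    -- repoint the user (or the whole database) at a real temporary tablespace
    ALTER USER scott TEMPORARY TABLESPACE temp;
    ALTER DATABASE DEFAULT TEMPORARY TABLESPACE temp;
    ```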

  • Datapump Import remap_schema does not work

    Hi everybody,
    I have a problem with the import under 10.2.0.3.0 on AIX.
    I got a dump file containing a full database dump.
    I just want to import one schema. With imp I used the fromuser/touser clauses.
    Now I tried it with the following parameters:
    impdp system/manager remap_schema=user1:user1 directory=my_directory dumpfile=full_DB_Name.dmp logfile=impdpfull_Db_Name.log
    The import then starts, but it imports all objects, including those from the SYSTEM schema, creates all the users, etc.
    What am I doing wrong?
    Any help would be appreciated!
    Thanks, Julius

    Hello,
    You are importing the entire dump file. If you only want to import a single schema, you need to use the 'schemas' parameter, like this -
    impdp system/manager remap_schema=user1:user1 schemas=foo directory=my_directory dumpfile=full_DB_Name.dmp logfile=impdpfull_Db_Name.log
    where 'foo' is the schema you wish to import.
    Hope this helps,
    John.
    http://jes.blogs.shellprompt.net
