Datapump import doubt

Hi,
Oracle Version:10.2.0.1
Operating System:Linux
Can we run two Data Pump import processes at the same time, on the same dump file, on the same machine, in parallel?
Thanks & Regards,
Poorna Prasad.

Neerav999 wrote:
NO you cannot create datapump import as
Import process starts by creating master process, worker process, client process and shadow process; no import can be done until these processes are free

I am not sure what you mean here. Are you saying that we can't run two parallel import processes because those processes won't be free? Please see below,
E:\>time
The current time is: 18:24:17.04
Enter the new time:
E:\>impdp system/oracle directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr2
Import: Release 11.1.0.6.0 - Production on Saturday, 06 February, 2010 18:24:25
Copyright (c) 2003, 2007, Oracle.  All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "SYSTEM"."SYS_IMPORT_FULL_02" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_FULL_02":  system/******** directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr2
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "HR2"."COUNTRIES"                           6.375 KB      25 rows
. . imported "HR2"."DEPARTMENTS"                         7.015 KB      27 rows
. . imported "HR2"."EMPLOYEES"                           16.80 KB     107 rows
. . imported "HR2"."JOBS"                                6.984 KB      19 rows
. . imported "HR2"."JOB_HISTORY"                         7.054 KB      10 rows
. . imported "HR2"."LOCATIONS"                           8.273 KB      23 rows
. . imported "HR2"."REGIONS"                             5.484 KB       4 rows
Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
Processing object type SCHEMA_EXPORT/VIEW/VIEW
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
Job "SYSTEM"."SYS_IMPORT_FULL_02" successfully completed at 18:24:44

And the second session:
E:\Documents and Settings\aristadba>time
The current time is: 18:24:19.37
Enter the new time:
E:\Documents and Settings\aristadba>impdp system/oracle directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr
Import: Release 11.1.0.6.0 - Production on Saturday, 06 February, 2010 18:24:22
Copyright (c) 2003, 2007, Oracle.  All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_FULL_01":  system/******** directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:h
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "HR1"."COUNTRIES"                           6.375 KB      25 rows
. . imported "HR1"."DEPARTMENTS"                         7.015 KB      27 rows
. . imported "HR1"."EMPLOYEES"                           16.80 KB     107 rows
. . imported "HR1"."JOBS"                                6.984 KB      19 rows
. . imported "HR1"."JOB_HISTORY"                         7.054 KB      10 rows
. . imported "HR1"."LOCATIONS"                           8.273 KB      23 rows
. . imported "HR1"."REGIONS"                             5.484 KB       4 rows
Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
Processing object type SCHEMA_EXPORT/VIEW/VIEW
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
Job "SYSTEM"."SYS_IMPORT_FULL_01" successfully completed at 18:24:44

Can you see the time difference? The two jobs were started just 2 seconds apart!
Would you kindly explain the statement you just made? There is a very good chance that I have not understood it.
HTH
Aman....
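For the record, the test above boils down to launching two impdp clients against the same dump file from two terminals. Each session creates its own master table (SYS_IMPORT_FULL_01 and SYS_IMPORT_FULL_02 in the logs above), which is why the jobs do not block each other. A sketch, using the directory and schemas from the logs (the second command line is truncated in the log, but the rows load into HR1):

```
# terminal 1
impdp system/oracle directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr2

# terminal 2, started a couple of seconds later
impdp system/oracle directory=expdpdir dumpfile=dumpfile.dmp remap_schema=hr:hr1
```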

Similar Messages

  • DataPump Import

    Hi,
    I am facing a small problem with Data Pump import. I exported a schema using Data Pump export (expdp) and used the same dump file for import with impdp. Once the import was over, I saw that all procedures, triggers, and so on were created with double quotes, such as CREATE OR REPLACE PROCEDURE "ACCOUNTTRANSFERDETAIL_INSERT". When doing a database difference, these all come up as mismatches. Is there any way to overcome this?
    When I did an export and import in Oracle 9i, everything was OK.
    Now our QA team is doing Oracle 10g export and import with the original utilities (EXP & IMP), while I am using Oracle 10g Data Pump (expdp & impdp); this is where we are facing the problem. The comparison is between Oracle 10g (exp & imp) and Oracle 10g Data Pump (expdp & impdp).
    For their schema there were no double quotes, but when I refreshed a schema using impdp the object names were enclosed in double quotes.
    I also checked the data dictionary, and there too the object names are enclosed in double quotes.
    So if anybody has a solution, please revert.
    Thanks
    Regards,
    Ganesh R

    Somehow I don't think this is a problem with a download. You'll get a lot more eyeballs on your question if you ask in the right place.
    Since Data Pump is part of the database, you might try asking your question in a Database forum (go up 2 levels and take a look at the screen).

  • Datapump import doesn't filter out table when mentioned in 'exclude'

    The Data Pump import below fails to filter out the 'COUNTRIES' table. I thought that was what it was supposed to do? Any suggestions?
    C:\Documents and Settings\>impdp remap_schema=hr:hrtmp dumpfile=hr1.dmp parfile='C:\Documents and Settings\para.par'
    Import: Release 10.2.0.1.0 - Production on Monday, 18 January, 2010 15:00:05
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Username: / as sysdba
    Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    Master table "SYS"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYS"."SYS_IMPORT_FULL_01": /******** AS SYSDBA remap_schema=hr:hrtmp dumpfile=hr1.dmp parfile='C:\Documents and Settings\para.par'
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "HRTMP"."COUNTRIES"                          6.093 KB      25 rows
    . . imported "HRTMP"."DEPARTMENTS"                        6.640 KB      27 rows
    . . imported "HRTMP"."EMPLOYEES"                          15.77 KB     107 rows
    . . imported "HRTMP"."JOBS"                               6.609 KB      19 rows
    . . imported "HRTMP"."JOB_HISTORY"                        6.585 KB      10 rows
    . . imported "HRTMP"."LOCATIONS"                          7.710 KB      23 rows
    . . imported "HRTMP"."REGIONS"                            5.296 KB       4 rows
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/VIEW/VIEW
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "SYS"."SYS_IMPORT_FULL_01" successfully completed at 15:00:13
    para.par
    ========
    exclude=TABLES:"like '%COUNTRIES%'"

    Hi,
    The first thing I see is that you used TABLES (plural). Object types are not pluralized in Data Pump. The parfile should look like:
    exclude=TABLE:"like '%COUNTRIES%'"
    Dean
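    For reference, the corrected par file can use either a LIKE pattern or an exact name list; note that the comparison is against object names as stored in the dictionary, which are uppercase by default:

    ```
    # para.par (corrected; two equivalent forms shown, use one)
    exclude=TABLE:"LIKE '%COUNTRIES%'"
    # exclude=TABLE:"IN ('COUNTRIES')"
    ```

    Keeping the filter in a par file also avoids the shell/command-prompt quoting headaches that the special characters in EXCLUDE tend to cause.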

  • ORACLE 10g : Datapump-Import : Error: unknown parameter name 'REMAP_TABLE'

    Hi,
    I am working on Oracle 10g. When I executed the Data Pump import script on the UNIX box with the REMAP_TABLE option, I received the error "unknown parameter name 'REMAP_TABLE'". Can you please give me a solution for it?
    Scripts :-
    impdp eimsexp/xyz TABLES=EIMDBO.DIVISION DIRECTORY=DATAMART_DATA_PUMP_DIR DUMPFILE=expdp_DATAMART_tables_%u_$BATCH_ID.dmp LOGFILE=impdp_DATAMART_tables.log REMAP_TABLE=EIMDBO.DIVISION:EIM2DBO:YR2009_DIM_DIVISION
    Note :- The YR2009_DIM_DIVISION table is available in the target database. It is without partition table. The EIMDBO.DIVISION is partition table.
    Thanks,

    See your post here
    ORACLE 10g : Datapump-Import : Error: unknown parameter name 'REMAP_TABLE'
    Srini
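    For context: REMAP_TABLE was introduced with Data Pump in 11g, so a 10g impdp does not recognize it. A commonly used 10g workaround, sketched below with the names from the post (treat this as an untested outline): import the table with REMAP_SCHEMA, then rename it in the target schema.

    ```
    impdp eimsexp/xyz TABLES=EIMDBO.DIVISION DIRECTORY=DATAMART_DATA_PUMP_DIR \
      DUMPFILE=expdp_DATAMART_tables_%u_$BATCH_ID.dmp LOGFILE=impdp_DATAMART_tables.log \
      REMAP_SCHEMA=EIMDBO:EIM2DBO

    # then, in SQL*Plus, connected as EIM2DBO:
    # ALTER TABLE DIVISION RENAME TO YR2009_DIM_DIVISION;
    ```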

  • Datapump import of tables will automatically rebuild indexes ?

    dear all,
    does a Data Pump import of tables automatically rebuild the indexes on the associated tables?
    An urgent response, please.

    Yes, indexes are rebuilt.
    From dba-oracle (note: this advice describes the original imp utility):
    Set indexes=n - index creation can be postponed until after the import completes by specifying indexes=n. If indexes for the target table already exist at the time of execution, import performs index maintenance when data is inserted into the table; setting indexes=n eliminates this maintenance overhead. You can also use the indexfile=filename parameter to rebuild all the indexes once, after the data is loaded. When editing the index file, add the nologging and parallel keywords (with parallel degree = cpu_count - 1).
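    A sketch of that indexfile workflow with the original utilities (file names illustrative; when INDEXFILE is specified, imp only writes the DDL to the file and imports nothing):

    ```
    # 1. write the CREATE INDEX statements to a script (no data is loaded here)
    imp scott/tiger file=exp.dmp full=y indexfile=create_indexes.sql

    # 2. load the data without index maintenance
    imp scott/tiger file=exp.dmp full=y indexes=n

    # 3. edit create_indexes.sql to add NOLOGGING and PARALLEL, then run it in SQL*Plus
    ```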

  • DataPump Import over Network and Parallel parameter

    I see and understand the advantages of using the parallel parameter with Data Pump dump files. I am trying to figure out whether there is any advantage to the parallel parameter when importing data over the network. I have noticed there are sweet spots for the parallel setting and the number of dump files depending on the data: sometimes ten workers and ten dump files are no faster, if not slower, than three. I have not been able to find a clear answer on this. Any advice or explanation would be appreciated.
    Thanks

    I am trying to figure out is there any advantage to the parallel parameter when importing data over the network?

    Reading data over the network is slower than the other options available to Data Pump.
    The performance of a parallel Data Pump import is limited by I/O contention. You have to determine the best value for parallel Data Pump export and import by testing in your environment.

  • [10g] Data Pump import automatically creates the user

    Product: ORACLE SERVER
    Date written: 2004-05-28
    [10g] Data Pump import automatically creates the user
    ===================================
    PURPOSE
    The following introduces a feature of Data Pump.
    Explanation
    With the imp utility, the target user had to exist already.
    With Data Pump, however, if the target user does not exist,
    it is created automatically.
    Example :
    =============
    On the source database there is a user called TEST.
    TEST (password: test, roles: connect, resource)
    Export the TEST schema using datapump:
    expdp system/oracle dumpfile=exp.dmp schemas=TEST
    Case I
    =======
    If the test user does not exist, the import creates the test user automatically.
    impdp system/oracle dumpfile=exp.dmp
    *************TEST does not exist*************************************
    Import: Release 10.1.0.2.0 - Production on Friday, 28 May, 2004 1:02
    Copyright (c) 2003, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Produc
    tion
    With the Data Mining option
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01": system/******** dumpfile=exp.dmp
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/SE_PRE_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/DB_LINK
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "TEST"."DEPT" 5.648 KB 4 rows
    . . imported "TEST"."SALGRADE" 5.648 KB 10 rows
    . . imported "TEST"."BONUS" 0 KB 0 rows
    . . imported "TEST"."EMP" 0 KB 0 rows
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Job "SYSTEM"."SYS_IMPORT_FULL_01" successfully completed at 01:02
    connect TEST/TEST (on target database)
    => connected
    SQL> select * from session_roles;
    ROLE
    connect
    resource
    Case II
    ========
    If the TEST user already exists in the target database, a warning message is raised
    and the import continues.
    impdp system/oracle dumpfile=exp.dmp
    *************user TEST already exists************************************
    Import: Release 10.1.0.2.0 - Production on Friday, 28 May, 2004 1:06
    Copyright (c) 2003, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Produc
    tion
    With the Data Mining option
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01": system/******** dumpfile=exp.dmp
    Processing object type SCHEMA_EXPORT/USER
    ORA-31684: Object type USER:"TEST" already exists
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/SE_PRE_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/DB_LINK
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "TEST"."DEPT" 5.648 KB 4 rows
    . . imported "TEST"."SALGRADE" 5.648 KB 10 rows
    . . imported "TEST"."BONUS" 0 KB 0 rows
    . . imported "TEST"."EMP" 0 KB 0 rows
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 1 error(s) at 01:06
    You will receive ORA-31684 error but import will continue.
    Case - III
    ===========
    If the TEST user exists in the target database and you want to avoid the warning (ORA-31684),
    use EXCLUDE=USER.
    impdp system/oracle dumpfile=exp.dmp exclude=user
    *********Disable create user statment as user TEST already exist***********
    Import: Release 10.1.0.2.0 - Production on Friday, 28 May, 2004 1:11
    Copyright (c) 2003, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Produc
    tion
    With the Data Mining option
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01": system/******** dumpfile=exp.dmp exclud
    e=user
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/SE_PRE_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/DB_LINK
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "TEST"."DEPT" 5.648 KB 4 rows
    . . imported "TEST"."SALGRADE" 5.648 KB 10 rows
    . . imported "TEST"."BONUS" 0 KB 0 rows
    . . imported "TEST"."EMP" 0 KB 0 rows
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Job "SYSTEM"."SYS_IMPORT_FULL_01" successfully completed at 01:11
    If the TEST user does not exist and you use the EXCLUDE=USER parameter,
    you will run into an ORA-01917 error.
    Reference Documents :
    Note : 272132.1 DATAPUMP import automatically creates schema

  • Alter data on datapump import 10G?

    Hi,
    I see this is possible with a call to data_remap() in 11g. Is there an alternative or a simple workaround for Data Pump import in 10g? I have one field that needs to be toggled occasionally when doing the import. Right now, the person who wrote the refresh job exports the active partitions from prod and imports them into dev. If the data needs to go into partition2, she updates the field, exports again, and then re-imports, putting it, at that time, into the correct partition. I thought I might be able to change the value on the import (or the export, if possible). Does anyone have a suggestion on this?
    Thanks, Pete

    REMAP_DATA sounds like it would work if you were on 11g, but since you are on 10g, the only thing I can think of is a two-step process: first create the table, then create a trigger on it to do the necessary data remap, and finally load the data using the content=data_only parameter. I'm not sure how this will perform, since the trigger will fire on all of the inserts, but I think it could be made to work.
    Dean
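    A minimal sketch of that two-step idea; the table, column, and values below are hypothetical, not from the thread:

    ```
    -- step 1: pre-create the table, then add a trigger that remaps the flag
    CREATE OR REPLACE TRIGGER remap_flag_trg
      BEFORE INSERT ON scott.sales_hist   -- hypothetical partitioned table
      FOR EACH ROW
    BEGIN
      -- toggle the field so rows land in the intended partition
      IF :NEW.part_flag = 'P1' THEN
        :NEW.part_flag := 'P2';
      END IF;
    END;
    /

    -- step 2: load rows only; the trigger fires for each inserted row
    -- impdp system/oracle directory=dpump_dir dumpfile=prod.dmp \
    --       tables=scott.sales_hist content=data_only
    ```

    One caveat worth verifying: Data Pump only fires triggers when it loads through the external-table path, which it should select automatically when an enabled insert trigger exists on the table.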

  • Datapump import takes ages

    Hi,
    I'm doing an import of a schema with one table of about 1 million (spatial) records. This import takes a very long time to complete: I ran it on Friday, and when I looked at the process on Monday I saw it had finished at least 24 hours after I started it. Here's the output:
    Z:\>impdp system/password@ora schemas=bla directory=dpump_dir1 dumpfile=schema8.dmp
    Import: Release 10.1.0.2.0 - Production on Friday, 21 January, 2005 16:32
    Copyright (c) 2003, Oracle. All rights reserved.
    Connected to: Personal Oracle Database 10g Release 10.1.0.2.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Master table "SYSTEM"."SYS_IMPORT_SCHEMA_02" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_SCHEMA_02": system/********@ora schemas=bla directory=dpump_dir1 dumpfile=schema8.dmp
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/SE_PRE_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "BLA"."NLD_NW" 266.1 MB 1217147 rows
    Job "SYSTEM"."SYS_IMPORT_SCHEMA_02" successfully completed at 15:22
    The dumpfile was created with:
    Z:\>expdp system/password@orcl schemas=bla directory=dpump_dir1 dumpfile=schema8.dmp
    The source DB is running on Linux and the target is running WinXP.
    I thought datapump was meant to be really fast, what's going wrong in my situation?

    ...I've just looked at the time needed to do the same import into the database from which I exported the dump file. That import operation takes 5 minutes.

  • Datapump import via Networklink hangs

    Hello,
    I am trying to clone my database with Data Pump over a network link. (Both Oracle servers are version 10.2.0.4 on Linux Enterprise Server.)
    First, I create a new empty database.
    - Create and test the database link
    - On the original DB I grant the FULL_EXP role to system
    - On the destination DB I grant the FULL_IMP role to system
    My Parfile :
    userid=system/**********
    full=Y
    table_exists_action=replace
    logfile=VAMLUI10_full_impdb.log
    directory=DATA_PUMP_DIR
    parallel=4
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/tools01.dbf':'/oracle/oradata/VAMLUI10/tools01.dbf'"
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/ASPROLU_D01_01.dbf':'/oracle/oradata/VAMLUI10/ASPROLU_D01_01.dbf'"
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/ASPROLU_D01_02.dbf':'/oracle/oradata/VAMLUI10/ASPROLU_D01_02.dbf'"
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/ASPROLU_D01_03.dbf':'/oracle/oradata/VAMLUI10/ASPROLU_D01_03.dbf'"
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/ASPROLU_D01_04.dbf':'/oracle/oradata/VAMLUI10/ASPROLU_D01_04.dbf'"
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/ASPROLU_l01_02.dbf':'/oracle/oradata/VAMLUI10/ASPROLU_l01_02.dbf'"
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/ASPROLU_I01_01.dbf':'/oracle/oradata/VAMLUI10/ASPROLU_I01_01.dbf'"
    REMAP_DATAFILE="'/oracle/oradata/VAMLU10/USR_01.dbf':'/oracle/oradata/VAMLUI10/USR_01.dbf'"
    network_link=VAMLUIOL.WORLD
    FLASHBACK_TIME="TO_TIMESTAMP('29.05.2012 07:00:00', 'DD.MM.YYYY HH24:MI:SS')"
    After a few minutes the job hangs at:
    . imported "TSMSYS"."SRS$" 0 rows
    Job Status = Waiting
    (Waiting? What is the job waiting on?)
    Does anyone have an idea or a solution?

    Hi,
    Below is the result. Is the first job still executing?
    SQL> select JOB_NAME,JOB_MODE,STATE,OPERATION from dba_datapump_jobs;
    JOB_NAME                JOB_MODE   STATE         OPERATION
    SYS_IMPORT_SCHEMA_01    SCHEMA     EXECUTING     IMPORT
    SYS_IMPORT_FULL_02      FULL       NOT RUNNING   IMPORT
    SYS_IMPORT_FULL_01      FULL       NOT RUNNING   IMPORT
    SYS_IMPORT_FULL_03      FULL       NOT RUNNING   IMPORT
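    As an aside, NOT RUNNING rows in dba_datapump_jobs usually correspond to stopped jobs whose master tables were left behind. If those jobs are not meant to be restarted, a common cleanup (check the owner first, as a sketch) is to drop the master table, which removes the row from the view:

    ```
    SQL> SELECT owner_name, job_name, state FROM dba_datapump_jobs;
    SQL> DROP TABLE system.SYS_IMPORT_FULL_01;  -- master table has the same name as the job
    ```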

  • Oracle datapump import

    Oracle 10.2.0.5
    Linux 5
    I am currently performing a full database import using datapump.
    The par file runs with no errors, but it does not import all the tables or data.
    Here is a partial view of the par file below, and the log content next:
    Parfile:
    impdp system PARFILE=/oraappl/pca/vptch/vlabscr/import_vlab.par
    Parfile content:
    DIRECTORY=VLABDIR
    FULL=YES
    DUMPFILE=fullbackup_EXP.dmp
    logfile=import.log
    REUSE_DATAFILES=YES
    SQLFILE=imp_sql.sql
    REMAP_DATAFILE="'/oraappl/pca/vdevl/vdevldata/sysaux01.dbf':'/oraappl/pca/vptch/vlabdata/vlab/sysaux01.dbf'"
    REMAP_DATAFILE="'/oraappl/pca/vdevl/vdevldata/undo01.dbf':'/oraappl/pca/vptch/vlabdata/vlab/undo01.dbf'"
    REMAP_DATAFILE="'/oraappl/pca/vdevl/vdevldata/defspace01.dbf':'/oraappl/pca/vptch/vlabdata/vlab/defspace01.dbf'"
    REMAP_DATAFILE="'/oraappl/pca/vdevl/vdevldata/volsup201.dbf':'/oraappl/pca/vptch/vlabdata/vlab/volsup201.dbf'"
    REMAP_DATAFILE="'/oraappl/pca/vdevl/vdevldata/volsup101.dbf':'/oraappl/pca/vptch/vlabdata/vlab/volsup101.dbf'"
    Logfile content
    Master table "SYSTEM"."SYS_SQL_FILE_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_SQL_FILE_FULL_01":  system/******** PARFILE=/oraappl/pca/vptch/vlabscr/import_vlab.par
    Processing object type DATABASE_EXPORT/TABLESPACE
    Processing object type DATABASE_EXPORT/PASSWORD_VERIFY_FUNCTION
    Processing object type DATABASE_EXPORT/SYS_USER/USER
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/POST_INSTANCE/PROCACT_INSTANCE
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/POST_INSTANCE/PROCDEPOBJ
    Processing object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCOBJ
    Processing object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCACT_SCHEMA
    Processing object type DATABASE_EXPORT/SCHEMA/PASSWORD_HISTORY
    Processing object type DATABASE_EXPORT/AUDIT
    Job "SYSTEM"."SYS_SQL_FILE_FULL_01" successfully completed at 14:54:58

    You have the SQLFILE parameter in your par file. With SQLFILE, all the DDL is written to that imp_sql.sql file and no data is inserted. Remove the SQLFILE parameter if you want the data to be imported.
    SQLFILE
    Default: none
    Purpose
    Specifies a file into which all of the SQL DDL that Import would have executed, based on other parameters, is written.
    Syntax and Description
    SQLFILE=[directory_object:]file_name   
    The file_name specifies where the import job will write the DDL that would be executed during the job. The SQL is not actually executed, and the target system remains unchanged. The file is written to the directory object specified in the DIRECTORY parameter, unless another directory_object is explicitly specified here. Any existing file that has a name matching the one specified with this parameter is overwritten.
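    In other words, a SQLFILE run is a dry run; the fix is a second pass without that parameter. A sketch (par file names illustrative):

    ```
    # pass 1 (what was run): writes DDL to imp_sql.sql, inserts no data
    impdp system PARFILE=import_vlab.par          # contains SQLFILE=imp_sql.sql

    # pass 2 (the fix): the same par file with the SQLFILE= line removed
    impdp system PARFILE=import_vlab_noddl.par
    ```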

  • Datapump import error on 2 partioned tables

    I am trying to run impdp to import two tables that are partitioned and use LOB types; for some reason it always errors out. Has anyone seen this issue in 11g?
    Here is the info:
    $ impdp parfile=elm_rt.par
    Master table "ELM"."SYS_IMPORT_TABLE_05" successfully loaded/unloaded
    Starting "ELM"."SYS_IMPORT_TABLE_05": elm/******** parfile=elm_rt.par
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/INDEX
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/AUDIT_OBJ
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/STATISTICS/TABLE_STATISTICS
    ORA-39014: One or more workers have prematurely exited.
    ORA-39029: worker 1 with process name "DW01" prematurely terminated
    ORA-31671: Worker process DW01 had an unhandled exception.
    ORA-04030: out of process memory when trying to allocate 120048 bytes (session heap,kuxLpxAlloc)
    ORA-06512: at "SYS.KUPW$WORKER", line 1602
    ORA-06512: at line 2
    ORA-39014: One or more workers have prematurely exited.
    ORA-39029: worker 2 with process name "DW01" prematurely terminated
    ORA-31671: Worker process DW01 had an unhandled exception.
    ORA-04030: out of process memory when trying to allocate 120048 bytes (session heap,kuxLpxAlloc)
    ORA-06512: at "SYS.KUPW$WORKER", line 1602
    ORA-06512: at line 2
    Job "ELM"."SYS_IMPORT_TABLE_05" stopped due to fatal error at 13:11:04
    elm_rt.par:
    DIRECTORY=DP_REGRESSION_DATA_01
    DUMPFILE=ELM_MD1.dmp,ELM_MD2.dmp,ELM_MD3.dmp,ELM_MD4.dmp
    LOGFILE=DP_REGRESSION_LOG_01:ELM_RT.log
    DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
    CONTENT=METADATA_ONLY
    TABLES=RT_AUDIT_IN_HIST,RT_AUDIT_OUT_HIST
    REMAP_TABLESPACE=RT_AUDIT_IN_HIST_DAT01:RB_AUDIT_IN_HIST_DAT01
    REMAP_TABLESPACE=RT_AUDIT_IN_HIST_IDX04:RB_AUDIT_IN_HIST_IDX01
    REMAP_TABLESPACE=RT_AUDIT_OUT_HIST_DAT01:RB_AUDIT_OUT_HIST_DAT01
    PARALLEL=4

    Read MetaLink note 286496.1 (Export/Import Data Pump Parameter TRACE - How to Diagnose Oracle Data Pump).
    This will help you generate trace for the datapump job.

  • Original Import doubt on indexes

    Hi All,
    First of all, I wish you a Happy New Year to you all
    I have been learning most of the stuffs in oracle DB from here. Thanks for that to all.
    Recently I got an original export dump (exp) to import into the database, and I have imported it.
    But I have some doubts about the import.
    The export log ended as below:
    . . exporting table                      GILRFILES      17336 rows exported
    . . exporting table                           FLY7          0 rows exported
    . . exporting table                          KOCHC      19645 rows exported
    . exporting synonyms
    . exporting views
    . exporting stored procedures
    . exporting operators
    . exporting referential integrity constraints
    . exporting triggers
    . exporting indextypes
    . exporting bitmap, functional and extensible indexes
    . exporting posttables actions
    . exporting materialized views
    . exporting snapshot logs
    . exporting job queues
    . exporting refresh groups and children
    . exporting dimensions
    . exporting post-schema procedural objects and actions
    . exporting statistics
    Export terminated successfully without warnings.
    I executed the below import command:
    imp userid=JRET/JRET file=F:\oracle\need\SALLOOONI.dmp fromuser=JRET
    The import ended with the below lines:
    . . importing table                    "GILRFILES"      17336 rows imported
    . . importing table                         "FLY7"          0 rows imported
    . . importing table                        "KOCHC"      19645 rows imported
    Import terminated successfully without warnings.
    My doubt is:
    Where are the indexes, synonyms, procedures and functions that were exported into the exp dump but are not visibly displayed in the import output? When I executed a command to import the indexes, I got a warning stating "This was done by SYS, not by you" and an error stating "Already exists". When I asked one of my friends, he said the indexes are imported when the above imp command is issued.
    Is this true? Can someone shed some light on this?
    Thanks in advance
    NONUDAY
    Edited by: 897910 on Jan 1, 2012 10:44 PM

    Hi Dean,
    Dean Gagne wrote:
    Any object in the dumpfile owned by T24 would be imported. If it existed, an error would be reported.
    Dean

    Thanks, I got the answer. The exp dump had objects like indexes, views, procedures, etc.,
    and when they are imported for user JRET from the exp dump, all of those objects are imported as well.
    Thanks for your input Dean.
    NONUDAY

  • Datapump import consuming disk space

    Hi all,
    Windows server 2003
    Oracle 10g
    I'm trying to import a schema using datapump:
    impdp username/password schemas=siebel directory=dpdir dumpfile=schema_exp.dmp status=60
    It's importing, but it's consuming massive amounts of disk space: over 40 GB now. Why is it consuming so much space just to import? I'm doing a reorg, so I dropped the biggest production schema and did the import. I didn't resize the datafiles, so where is all this extra disk space going?
    Please help, this is a production site and I'm running out of time (and space)

    Tablespace   SEGS   FRAGS    SIZE    FREE   USED
    PERFSTAT        0       1    1000    1000     0%
    SBL_DATA     3448       4   30400   12052    60%
    SBL_INDEX    7680       3    8192    4986    39%
    SYSAUX       3137      39    2560     490    81%
    SYSTEM       1256       1    2048    1388    32%
    TEMP            0                             0%
    TOOLS           2       1     256     256     0%
    UNDOTBS1       11      13    5120    4754     7%
    USERS         169      14    1024     424    59%
Great, thanks. The tablespaces are not the problem; it is the archived redo logs, about 100 MB a minute!
My issue started with the first import: not all of the data was imported, i.e. rows were missing, which I saw when I did a row count. So I decided to do the import again (following the same steps as mentioned earlier). If rows are still missing after this import, I want to do a point-in-time recovery with RMAN, because then my export dump file must be corrupt. Am I correct in saying this?
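To confirm a claim like "about 100 MB a minute", the archive generation rate can be measured from the database itself rather than by watching the disk. A hedged sketch, assuming the importing user has SELECT privilege on V$ARCHIVED_LOG:

```sql
-- Archived redo generated per hour (count of logs and approximate MB):
SELECT TRUNC(completion_time, 'HH24')             AS hour,
       COUNT(*)                                   AS logs,
       ROUND(SUM(blocks * block_size)/1024/1024)  AS mb
FROM   v$archived_log
GROUP  BY TRUNC(completion_time, 'HH24')
ORDER  BY 1;
```

Running this before, during, and after the import makes it easy to attribute the space consumption to the import window specifically.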

  • HFM Application Design Basic and important doubts

Hi, can someone clear up the following doubts for me?
1. A and B are two entities, and I am going to delete entity B this year because it no longer has business operations. What should I do with entity B before deleting it? B has data for the past few years, and I want to keep that data so I can analyze its past operations in the future. How should I start thinking about this?
2. In the Value dimension, which comes first: Proportion or Elimination?
3. What are flagged accounts? How should I understand this concept?
5. In HFM we load web objects data. What are web objects? Are they specific to the web client, the desktop client, or both? Please list all the types of web objects.
6. When creating the application profile we see "Start year and Number of years". How many years does Oracle recommend? Only 20 years, or is there no such criterion?
7. When creating the application profile we see six frequencies. To my knowledge, at most four are used: 1 for Monthly, 2 for QTD, 3 for HYTD, 4 for YTD. My doubt is: what can the other two be used for? What other frequencies was the application designed with in mind?

    1) You propose deleting entity B, yet you said you require data for this entity. Do not delete the entity since once it has been deleted all data associated with this entity INCLUDING any intercompany transactions which another entity may have had with it, will be unavailable.
    2) I'm not certain but I think [Proportion] comes first when considering execution of Sub Calculate. In general do not write rules which require a sequence between these two nodes, if this is your reason for asking the question.
    3) "Flagged accounts" is not a standard expression. Possibly you are referring to the statement of "flagging an account" for a particular attribute. If so, this simply means to enable or set an account's attribute such as IsCalculated, or IsICP. This concept extends beyond the accounts dimension.
    5) Not sure what you're looking for here. Oracle provides a great deal of documentation with its products, including an API guide.
6) There is no particular criterion. The most important consideration is the selection of the start year, since this can never be changed without rebuilding the application with a new profile. It is possible to add years to an application; HFM provides a utility for this as of the 11.1.x release.
    7) The number of frequencies available is directly related to the period structure you define in the application profile. Most applications use:
    Periodic (= Monthly)
    Quarterly
    Yearly
    I occasionally see Half Year though this is uncommon. Weekly applications can have an additional frequency, where Periodic = Week to Date, and then Monthly is above Periodic. I suppose in this case you could have five frequencies:
    Periodic
    Monthly
    Quarterly
    HalfYearly
    Yearly
    Daily is not recommended for HFM.
    I recommend not creating more frequencies than you truly need. Although this doesn't seem to have an impact on application performance, it does present the users with more options to click through when choosing a point of view. Consider that every user every day will click on the point of view selector many times, so reducing the number of clicks required can have a large impact on your users.
--Chris
