IMPDP, EXPDP help
Hello all,
I am using Oracle 11g R2 and I have two questions regarding expdp and impdp.
I am trying to export part of a table and import it into another schema in the same database.
Below is the expdp command:
expdp dbauser/dbausr@db_name tables=schema_name1.table_name query='schema_name1.table_name:"where date = to_date('15092011','ddmmyyyy')"' dumpfile=table_name.dmp content=data_only logfile=table_exp.LOG
It is failing on the QUERY parameter. What format should I use to apply this condition: where date = to_date('15092011','ddmmyyyy')?
Another question, regarding impdp: the dump exported by the command above needs to be imported into the same table but in another schema, schema_name2. What should I use?
impdp dbauser/dbausr@db_name tables=schema_name1.table_name dumpfile=table_name.dmp content=data_only
How can I specify which schema to import into? Where do I specify that I need to import into schema_name2.shclog?
Edited by: NB on Sep 15, 2011 8:36 AM
DATE is a reserved word and should not be used as a column name.
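A sketch of how the two commands could look once the reserved word is avoided (assumptions: the column has been renamed to a non-reserved name such as ENTRY_DATE, and a parameter file is used to sidestep the shell quoting of QUERY; the other names are taken from the question). This needs a running Oracle instance, so it is an untested sketch:

```shell
# expdp.par (sketch; QUERY quoting is safe inside a parameter file):
#   TABLES=schema_name1.table_name
#   QUERY=schema_name1.table_name:"WHERE entry_date = TO_DATE('15092011','ddmmyyyy')"
#   DUMPFILE=table_name.dmp
#   CONTENT=data_only
#   LOGFILE=table_exp.log
expdp dbauser/dbausr@db_name parfile=expdp.par

# Second question: REMAP_SCHEMA redirects the rows into schema_name2, and
# TABLE_EXISTS_ACTION=APPEND loads into the existing target table.
impdp dbauser/dbausr@db_name dumpfile=table_name.dmp content=data_only \
  remap_schema=schema_name1:schema_name2 table_exists_action=APPEND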
Similar Messages
-
How to redirect imp/exp or impdp/expdp output
Hello,
I'm sorry if this forum is not the right place for this posting, but I hope someone else has had this problem too.
I have an application written in C# (a Windows Forms app) that calls imp/exp or impdp/expdp (depending on the user). I need the output of these tools in my app.
Thanks a lot for any help.
user9099355 wrote:
EXPDP/IMPDP:-
[oracle@adg disks]$ expdp system/manager schemas=scott directory=DATA_PUMP_DIR logfile=scot_exp.log
Export: Release 11.2.0.1.0 - Production on Tue Oct 4 14:36:54 2011
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_01": system/******** schemas=scott directory=DATA_PUMP_DIR logfile=scot_exp.log
[oracle@adg disks]$ cd /u01/app/oracle/admin/adg/dpdump/
[oracle@adg dpdump]$ ls -lt scot_exp.log
-rw-r--r-- 1 oracle dba 1965 Oct 4 14:38 scot_exp.log
[oracle@adg dpdump]$
exp/imp:-
[oracle@adg dpdump]$ exp system/manager file=exp_schema.log log=exp_schema_trad.log owner=scott
Export: Release 11.2.0.1.0 - Production on Tue Oct 4 14:38:53 2011
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
Export done in US7ASCII character set and AL16UTF16 NCHAR character set
server uses AL32UTF8 character set (possible charset conversion)
[oracle@adg dpdump]$ ls -lt exp_schema_trad.log
-rw-r--r-- 1 oracle dba 1774 Oct 4 14:39 exp_schema_trad.log
[oracle@adg dpdump]$
or
nohup exp system/manager file=exp_schema.dmp owner=scott > exp_schema_test.log
[oracle@adg dpdump]$ ls -ltr exp_schema_test.log
-rw-r--r-- 1 oracle dba 1922 Oct 4 14:40 exp_schema_test.log
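On getting the output into an application: a generic capture pattern (a sketch, not specific to Data Pump; the function name and the /tmp/tool_output.log path are illustrative) is to merge stderr into stdout and tee the stream to a file the calling program can read afterwards:

```shell
#!/bin/sh
# Run any command, merge its stderr into stdout, and keep a copy of the
# combined output in a file while still passing it through to the caller.
run_and_capture() {
  "$@" 2>&1 | tee /tmp/tool_output.log
}

# Stand-in for a real expdp/impdp invocation:
run_and_capture echo "Export: Release 11.2.0.1.0 - Production"
```

Note that the pipe hides the tool's exit status in plain sh; if the application needs it, redirect to a file first and print the file afterwards instead.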
[oracle@adg dpdump]$
-
Impdp/expdp ignoring logfile path.
Greetings all. Hopefully this is a no-brainer for one of you. Whenever I use impdp/expdp and I specify logfile=directory:filename, or directory=path with logfile=filename, they are ignored and the log file is always written to ORACLE_HOME\database.
I am on the Windows platform and I am running 11g R1 SP19
Thanks.
impdp system/sys@DP1 LOGFILE=DATADUMP_DIR:ACCOUNT_IMP.log DUMPFILE=DATADUMP_DIR:ACCOUNT.DMP job_name=ACCOUNTImport metrics=y trace=4a0300 exclude=index,ref_constraint tables=TESTUSER.ACCOUNT
-
IMPDP / EXPDP unknown command
Hi,
I haven't used Oracle databases for a long time, and I am not a DBA. Now I am using them again in the following environment: there is a server with a huge Oracle 10g EE database, and I am developing an application that uses part of it. I have Oracle 10g XE on my development machine, and I want to use expdp/impdp to copy the tables I am interested in onto my machine.
My problem is that I get an SP2-0042 error telling me that the command is unknown (on both the server and the dev machine). From what I understood, Data Pump should be available in both editions. Can anyone help me?
Thank you,
Rui
Expdp/impdp are separate operating-system utilities, not subcommands in SQL*Plus, which is why SQL*Plus reports SP2-0042. I recommend you read the documentation.
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/part_dp.htm#i436481
Werner
-
VDE-00018 error with IMPDP/EXPDP utility.
Hi All,
I just migrated my data from 8.1.6.0.0 to 11.1.0.7.0 successfully with the IMP/EXP utilities.
But when I try to use the EXPDP/IMPDP utilities, they show the above error, VDE-00018, i.e. "Data Pump client is incompatible with database version 11.1.0.7.0".
Can I use EXPDP/IMPDP in the future with some setting?
If not, should I continue with EXP/IMP in 11g?
Is EXP/IMP in 11g as capable as in 8i?
I used IMP/EXP in 8i many times, never with any problem backing up or restoring my data.
Please guide.
Regards.
I have never seen that error, but the rule for Data Pump is that you use the client that shipped with the database you are running on. That is different from exp/imp. All you should have to do is use the expdp/impdp client that shipped with your database.
Hope that helps.
Dean
-
Hello,
I did an expdp of tables a, b, c from schema one.
Now I would like to import the data from those tables into schema two, which also has tables a, b, c, but they are empty.
I am on 11g and this is on the same database; only the users are different, and I don't know how to do that. REMAP_TABLE, REMAP_SCHEMA maybe?
True. Thanks.
Only, could it be slower?
I will try with the APPEND hint and without it to see... although I am not sure I will have the time and goodwill to test impdp to compare.
But if the INSERT ... SELECT turns out to be too slow, I will have to go with impdp.
By the way, is there a command to do that with impdp? What would it be, REMAP_SCHEMA or something like that?
-
hi gurus,
I have a schema TEST that contains many tables in two tablespaces:
Tb_teste1 (200 tables)
Tb_teste2 (140 tables)
I need only the objects in tablespace Tb_teste1, so my script is as follows:
$ORACLE_HOME/bin/expdp userid=system/xxxx@xxxxx dumpfile="Expdpxxxx.dmp" directory=test EXCLUDE=STATISTICS TABLESPACES=Tb_teste1 >>${LOGDIR}/${LOGFILE} 2>&1
But I have problems with the sequences, because sequences belong to the SYSTEM tablespace.
How can I back up the objects in Tb_teste1 together with the sequences?
Regards
Paul
Use a parameter file and generate a line like
INCLUDE=TABLE:"IN ('TABLE1', 'TABLE2', ..., 'TABLE200')"
using
select segment_name from dba_segments where tablespace_name = 'Tb_teste1' and owner = 'SCHEMANAME' and segment_type = 'TABLE'
Don't forget to specify SCHEMAS and LOGFILE (instead of >>${LOGDIR}/${LOGFILE}).
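The IN-list itself can be generated mechanically from the spooled output of that query; a small POSIX-shell sketch (the table names here are placeholders):

```shell
#!/bin/sh
# Build a Data Pump INCLUDE clause from a whitespace-separated list of
# table names, e.g. spooled from the dba_segments query above.
tables="TABLE1 TABLE2 TABLE200"
list=""
for t in $tables; do
  list="${list:+$list, }'$t'"   # append, single-quoting each name
done
echo "INCLUDE=TABLE:\"IN ($list)\""
# prints: INCLUDE=TABLE:"IN ('TABLE1', 'TABLE2', 'TABLE200')"
```

The resulting line goes straight into the parameter file, which also avoids any shell escaping of the embedded quotes.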
hth
-
Expdp /impdp Win 2003 Oracle 11g
Hi Guys,
I did the export from production, and I want to restore it on dev. What is the syntax for impdp?
expdp userid=system/system dumpfile=livelink.dmp schemas=livelink logfile=livelink.log
impdp userid=system/system dumpfile=e:/oradata/livelink.dmp ........ Is there a fromuser=livelink touser=livelink?
Thanx.
Is there fromuser=livelink touser=livelink? NO.
Check out SCHEMAS:
bcm@bcm-laptop:~$ impdp help=yes
Import: Release 11.2.0.1.0 - Production on Thu Oct 7 08:44:28 2010
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
The Data Pump Import utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:
Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
You can control how Import runs by entering the 'impdp' command followed
by various parameters. To specify parameters, you use keywords:
Format: impdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
USERID must be the first parameter on the command line.
The available keywords and their descriptions follow. Default values are listed within square brackets.
ATTACH
Attach to an existing job.
For example, ATTACH=job_name.
CONTENT
Specifies data to load.
Valid keywords are: [ALL], DATA_ONLY and METADATA_ONLY.
DATA_OPTIONS
Data layer option flags.
Valid keywords are: SKIP_CONSTRAINT_ERRORS.
DIRECTORY
Directory object to be used for dump, log and sql files.
DUMPFILE
List of dumpfiles to import from [expdat.dmp].
For example, DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ENCRYPTION_PASSWORD
Password key for accessing encrypted data within a dump file.
Not valid for network import jobs.
ESTIMATE
Calculate job estimates.
Valid keywords are: [BLOCKS] and STATISTICS.
EXCLUDE
Exclude specific object types.
For example, EXCLUDE=SCHEMA:"='HR'".
FLASHBACK_SCN
SCN used to reset session snapshot.
FLASHBACK_TIME
Time used to find the closest corresponding SCN value.
FULL
Import everything from source [Y].
HELP
Display help messages [N].
INCLUDE
Include specific object types.
For example, INCLUDE=TABLE_DATA.
JOB_NAME
Name of import job to create.
LOGFILE
Log file name [import.log].
NETWORK_LINK
Name of remote database link to the source system.
NOLOGFILE
Do not write log file [N].
PARALLEL
Change the number of active workers for current job.
PARFILE
Specify parameter file.
PARTITION_OPTIONS
Specify how partitions should be transformed.
Valid keywords are: DEPARTITION, MERGE and [NONE].
QUERY
Predicate clause used to import a subset of a table.
For example, QUERY=employees:"WHERE department_id > 10".
REMAP_DATA
Specify a data conversion function.
For example, REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO.
REMAP_DATAFILE
Redefine datafile references in all DDL statements.
REMAP_SCHEMA
Objects from one schema are loaded into another schema.
REMAP_TABLE
Table names are remapped to another table.
For example, REMAP_TABLE=EMP.EMPNO:REMAPPKG.EMPNO.
REMAP_TABLESPACE
Tablespace object are remapped to another tablespace.
REUSE_DATAFILES
Tablespace will be initialized if it already exists [N].
SCHEMAS
List of schemas to import.
SKIP_UNUSABLE_INDEXES
Skip indexes that were set to the Index Unusable state.
SOURCE_EDITION
Edition to be used for extracting metadata.
SQLFILE
Write all the SQL DDL to a specified file.
STATUS
Frequency (secs) job status is to be monitored where
the default [0] will show new status when available.
STREAMS_CONFIGURATION
Enable the loading of Streams metadata
TABLE_EXISTS_ACTION
Action to take if imported object already exists.
Valid keywords are: APPEND, REPLACE, [SKIP] and TRUNCATE.
TABLES
Identifies a list of tables to import.
For example, TABLES=HR.EMPLOYEES,SH.SALES:SALES_1995.
TABLESPACES
Identifies a list of tablespaces to import.
TARGET_EDITION
Edition to be used for loading metadata.
TRANSFORM
Metadata transform to apply to applicable objects.
Valid keywords are: OID, PCTSPACE, SEGMENT_ATTRIBUTES and STORAGE.
TRANSPORTABLE
Options for choosing transportable data movement.
Valid keywords are: ALWAYS and [NEVER].
Only valid in NETWORK_LINK mode import operations.
TRANSPORT_DATAFILES
List of datafiles to be imported by transportable mode.
TRANSPORT_FULL_CHECK
Verify storage segments of all tables [N].
TRANSPORT_TABLESPACES
List of tablespaces from which metadata will be loaded.
Only valid in NETWORK_LINK mode import operations.
VERSION
Version of objects to import.
Valid keywords are: [COMPATIBLE], LATEST or any valid database version.
Only valid for NETWORK_LINK and SQLFILE.
The following commands are valid while in interactive mode.
Note: abbreviations are allowed.
CONTINUE_CLIENT
Return to logging mode. Job will be restarted if idle.
EXIT_CLIENT
Quit client session and leave job running.
HELP
Summarize interactive commands.
KILL_JOB
Detach and delete job.
PARALLEL
Change the number of active workers for current job.
START_JOB
Start or resume current job.
Valid keywords are: SKIP_CURRENT.
STATUS
Frequency (secs) job status is to be monitored where
the default [0] will show new status when available.
STOP_JOB
Orderly shutdown of job execution and exits the client.
Valid keywords are: IMMEDIATE.
bcm@bcm-laptop:~$
Edited by: sb92075 on Oct 7, 2010 8:53 AM
-
Fromuser/touser equivalent in expdp/impdp??
Hi,
I got a dump file from the following command:
expdp system/123456@SCHDB dumpfile=studentinfo.dmp logfile=studentinfo.log tables=school.studentmaster,school.studentmarks, school.studentleave directory=mydir
Now I want to import these tables into a different schema (old_school) using impdp.
What's the fromuser/touser equivalent in impdp/expdp???
REMAP_DATA
Specify a data conversion function.
For example, REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO.
REMAP_DATAFILE
Redefine data file references in all DDL statements.
REMAP_SCHEMA
Objects from one schema are loaded into another schema.
REMAP_TABLE
Table names are remapped to another table.
For example, REMAP_TABLE=HR.EMPLOYEES:EMPS.
REMAP_TABLESPACE
Tablespace objects are remapped to another tablespace.
REUSE_DATAFILES
Tablespace will be initialized if it already exists [N].
-
Hi Friends,
I need to exclude two tables in impdp. Help me with this:
impdp system directory=exportdump tables=SYNC.cin_document query=SYNC.CIN_DOCUMENT:'"where rownum<10"' dumpfile=satexpdpdump.dmp logfile=1.log CONTENT=DATA_ONLY EXCLUDE=TABLE:"SYNC.CIN_DETAIL,SYNC.CIN_DOCUMENT_GTIN"
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39071: Value for EXCLUDE is badly formed.
ORA-00920: invalid relational operator
Thanks friends, I got the solution.
Solution:
Instead of excluding, I mentioned only one table name in the tables=SYNC.cin_document parameter. I just removed the EXCLUDE option, and it is working fine.
The following command is what I used:
impdp system directory=exportdump tables=SYNC.cin_document query=SYNC.CIN_DOCUMENT:'"where rownum<2570000"' dumpfile=satexpdpdump.dmp logfile=2570000d000gih.log CONTENT=DATA_ONLY
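For reference, the original ORA-39071/ORA-00920 came from the EXCLUDE value itself: EXCLUDE=TABLE takes a name clause (an operator plus unqualified object names), not a schema-qualified list. A hedged sketch of the form Data Pump accepts, with the table names taken from the failing command:

```shell
# Parameter-file form (no shell escaping needed):
# EXCLUDE=TABLE:"IN ('CIN_DETAIL','CIN_DOCUMENT_GTIN')"
```

On a Unix shell the embedded quotes usually need escaping, which is why putting EXCLUDE in a parameter file is the common advice.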
For your reference:
The following scenario explains it: three tables are exported under SCOTT, but we need to import only two of them. The output follows.
step1:
=============
create directory data_pump_dir as '/u01/oracle/my_dump_dir';
grant read,write on directory data_pump_dir to system;
expdp system/sys DIRECTORY=data_pump_dir DUMPFILE=tables.dmp logfile=table.log TABLES=SCOTT.DEPT,SCOTT.EMP,SCOTT.BONUS
output:
[oracle@linux1 ~]$ expdp system/sys DIRECTORY=data_pump_dir DUMPFILE=tables.dmp logfile=table.log TABLES=SCOTT.DEPT,SCOTT.EMP,SCOTT.BONUS
Export: Release 10.2.0.1.0 - Production on Tuesday, 10 May, 2011 19:40:00
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "SYSTEM"."SYS_EXPORT_TABLE_01": system/******** DIRECTORY=data_pump_dir DUMPFILE=tables.dmp logfile=table.log TABLES=SCOTT.DEPT,SCOTT.EMP,SCOTT.BONUS
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 128 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
. . exported "SCOTT"."DEPT" 5.656 KB 4 rows
. . exported "SCOTT"."EMP" 7.820 KB 14 rows
. . exported "SCOTT"."BONUS" 0 KB 0 rows
Master table "SYSTEM"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
Dump file set for SYSTEM.SYS_EXPORT_TABLE_01 is:
/u01/app/oracle/product/10.2.0/db_1/rdbms/log/tables.dmp
Job "SYSTEM"."SYS_EXPORT_TABLE_01" successfully completed at 19:40:22
[oracle@linux1 ~]$
Step 2:
SQL> alter table DEPT rename to dept1;
Table altered.
SQL> alter table EMP rename to emp1;
Table altered.
SQL> select * from tab;
TNAME TABTYPE CLUSTERID
BONUS TABLE
SALGRADE TABLE
EMP1 TABLE
DEPT1 TABLE
==============================================================================
[oracle@linux1 ~]$ impdp system/sys DIRECTORY=data_pump_dir DUMPFILE=tables.dmp TABLES=SCOTT.DEPT,SCOTT.EMP
output:
Import: Release 10.2.0.1.0 - Production on Tuesday, 10 May, 2011 19:48:51
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "SYSTEM"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_TABLE_01": system/******** DIRECTORY=data_pump_dir DUMPFILE=tables.dmp TABLES=SCOTT.DEPT,SCOTT.EMP
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "SCOTT"."DEPT" 5.656 KB 4 rows
. . imported "SCOTT"."EMP" 7.820 KB 14 rows
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
ORA-31684: Object type INDEX:"SCOTT"."PK_DEPT" already exists
ORA-31684: Object type INDEX:"SCOTT"."PK_EMP" already exists
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
ORA-31684: Object type CONSTRAINT:"SCOTT"."PK_DEPT" already exists
ORA-31684: Object type CONSTRAINT:"SCOTT"."PK_EMP" already exists
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
ORA-39083: Object type REF_CONSTRAINT failed to create with error:
ORA-02270: no matching unique or primary key for this column-list
Failing sql is:
ALTER TABLE "SCOTT"."EMP" ADD CONSTRAINT "FK_DEPTNO" FOREIGN KEY ("DEPTNO") REFERENCES "SCOTT"."DEPT" ("DEPTNO") ENABLE
Job "SYSTEM"."SYS_IMPORT_TABLE_01" completed with 5 error(s) at 19:48:54
[oracle@linux1 ~]$
SQL> select * from tab;
TNAME TABTYPE CLUSTERID
BONUS TABLE
SALGRADE TABLE
EMP1 TABLE
DEPT1 TABLE
DEPT TABLE
EMP TABLE
6 rows selected.
SQL>
-
Hi,
I'm going to test using impdp through a db_link (to the source db).
1. Can I do a full import but exclude schemas as normal impdp/expdp?
2. Will TRANSFORM=SEGMENT_ATTRIBUTES:n create objects in the default schema for a user?
2b. If I'd like all Indexes in a index tablespace and other in data tablespace what would be the command?
Would this command work?
impdp system@database full=y network_link=REMOTE_DB directory=data_pump logfile=impdpfull.log exclude=schema:IN ('SYS','SYSTEM') flashback_time=\"to_timestamp\(\'$DATE\',\'yyyy-mm-dd hh24:mi:ss\'\)\" TRANSFORM=SEGMENT_ATTRIBUTES:n
Regards
933746
933746 wrote:
hitgon wrote:
impdp -help
Edited by: hitgon on Jun 26, 2012 3:19 PM
Hi, thanks for your feedback.
OK, to be more concrete: will TRANSFORM=SEGMENT_ATTRIBUTES:n just remove the tablespace clause from the CREATE statements in impdp, so that all objects are created in the defaults for their respective users?
Regards
933746
Just to add an update to this thread:
TRANSFORM=SEGMENT_ATTRIBUTES:n will remove the storage and tablespace clauses and create objects in the user's default tablespace.
-
Hi,
The application team runs expdp on a client machine and stores the dump on the CLIENT machine. They use the format below.
schema.par
DUMPFILE=schema.dmp
LOGFILE=schema.log
USERID='cdata/cdata@(DESCRIPTION=(ADDRESS=(HOST=PLM-55)(PROTOCOL=tcp)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=M10)))'
SCHEMAS=cdata
schema.sh
DIRECTORY=TEST parfile=schema.par
We got a call from the application team. Their expdp script has not been working for the last two days; they got the error below. It worked fine before. The dump file is stored on their local server (NOT the database server).
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-39087: directory name TEST is invalid
Is it possible to store the dump file on the local server using expdp?
Then we tried the EZCONNECT method, but it is not working:
expdp userid=@//plm-55:1521/M10 directory=TEST
Export: Release 11.2.0.1.0 - Production on Fri Mar 22 08:11:48 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, Automatic Storage Management, OLAP, Data Mining
and Real Application Testing options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: invalid file operation
In db server,directory is there at db level.
Help me out.
Thanks & Regards,
VN
Edited by: user3266490 on Mar 22, 2013 1:58 PM
Is it possible to store the dump file on a local server using expdp?
It is possible. I think your directory path is a network path pointed at the client machine.
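For context, a DIRECTORY object always resolves to a path on the database server, so ORA-39087 means the object named TEST does not exist (or is not granted) in the database being connected to. A sketch of creating and granting it on the server side; the filesystem path here is an assumption:

```
SQL> create or replace directory TEST as '/u01/app/oracle/dpdump';
SQL> grant read, write on directory TEST to cdata;
```

The path must exist on the database server and be writable by the oracle OS user; Data Pump cannot write directly to a client machine except via a network share the server itself can reach.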
But please refer to the following links and the comment given by Sky13:
impdp, expdp error ORA-39070 has me dead in the water
Yes, I agree... you are most likely running into a Windows security issue. When an Oracle instance is created on Windows, the services are set up to log on as the SYSTEM account. For calls coming from the database to be able to access directories, privileges must be granted to the SYSTEM account. Basically, the account that the OracleService[InstanceName] service runs under has to have access to the UNC path.
Further:
impdp, expdp error ORA-39070 has me dead in the water
Expdp server-side thing??
regards,
ragunath
-
Oracle 10g / Linux
Why are Oracle 10g expdp and impdp faster than exp and imp?
How do I implement network export and network import in a 10g/OEL environment?
I would like to take a full expdp daily.
santhanam.
expdp help=yes
Export: Release 10.2.0.1.0 - Production on Thursday, 15 April, 2010 19:06:44
Copyright (c) 2003, 2005, Oracle. All rights reserved.
The Data Pump export utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:
Example: expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
You can control how Export runs by entering the 'expdp' command followed
by various parameters. To specify parameters, you use keywords:
Format: expdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: expdp scott/tiger DUMPFILE=scott.dmp DIRECTORY=dmpdir SCHEMAS=scott
or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
USERID must be the first parameter on the command line.
Keyword Description (Default)
ATTACH Attach to existing job, e.g. ATTACH [=job name].
COMPRESSION Reduce size of dumpfile contents where valid
keyword values are: (METADATA_ONLY) and NONE.
CONTENT Specifies data to unload where the valid keywords are:
(ALL), DATA_ONLY, and METADATA_ONLY.
DIRECTORY Directory object to be used for dumpfiles and logfiles.
DUMPFILE List of destination dump files (expdat.dmp),
e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ENCRYPTION_PASSWORD Password key for creating encrypted column data.
ESTIMATE Calculate job estimates where the valid keywords are:
(BLOCKS) and STATISTICS.
ESTIMATE_ONLY Calculate job estimates without performing the export.
EXCLUDE Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
FILESIZE Specify the size of each dumpfile in units of bytes.
FLASHBACK_SCN SCN used to set session snapshot back to.
FLASHBACK_TIME Time used to get the SCN closest to the specified time.
FULL Export entire database (N).
HELP Display Help messages (N).
INCLUDE Include specific object types, e.g. INCLUDE=TABLE_DATA.
JOB_NAME Name of export job to create.
LOGFILE Log file name (export.log).
NETWORK_LINK Name of remote database link to the source system.
NOLOGFILE Do not write logfile (N).
PARALLEL Change the number of active workers for current job.
PARFILE Specify parameter file.
QUERY Predicate clause used to export a subset of a table.
SAMPLE Percentage of data to be exported;
SCHEMAS List of schemas to export (login schema).
STATUS Frequency (secs) job status is to be monitored where
the default (0) will show new status when available.
TABLES Identifies a list of tables to export - one schema only.
TABLESPACES Identifies a list of tablespaces to export.
TRANSPORT_FULL_CHECK Verify storage segments of all tables (N).
TRANSPORT_TABLESPACES List of tablespaces from which metadata will be unloaded.
VERSION Version of objects to export where valid keywords are:
(COMPATIBLE), LATEST, or any valid database version.
The following commands are valid while in interactive mode.
Note: abbreviations are allowed
Command Description
ADD_FILE Add dumpfile to dumpfile set.
CONTINUE_CLIENT Return to logging mode. Job will be re-started if idle.
EXIT_CLIENT Quit client session and leave job running.
FILESIZE Default filesize (bytes) for subsequent ADD_FILE commands.
HELP Summarize interactive commands.
KILL_JOB Detach and delete job.
PARALLEL Change the number of active workers for current job.
PARALLEL=<number of workers>.
START_JOB Start/resume current job.
STATUS Frequency (secs) job status is to be monitored where
the default (0) will show new status when available.
STATUS[=interval]
STOP_JOB Orderly shutdown of job execution and exits the client.
STOP_JOB=IMMEDIATE performs an immediate shutdown of the
Data Pump job.
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_export.htm#i1007466
-
Is it possible to export tables from diffrent schema using expdp?
Hi,
We can export tables from different schemas using exp. Ex: exp user/pass file=sample.dmp log=sample.log tables=scott.dept,system.sales ...
But
Is it possible in expdp?
Thanks in advance ..
Thanks.
Hi,
You have to use schemas=user1,user2 include=table:"in ('table1','table2')"; use a parfile.
expdp scott/tiger@db10g schemas=SCOTT include=TABLE:"IN ('EMP', 'DEPT')" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log
{quote}
I am not able to perform it using a parfile either. Using a parfile, it shows "UDE-00010: multiple job modes requested, schema and tables."
When trying the below, i get error
{code}
bash-3.00$ expdp directory=EXP_DUMP dumpfile=test.dmp logfile=test.log SCHEMAS=(\'MM\',\'MMM\') include=TABLE:\"IN\(\'EA_EET_TMP\',\'WS_DT\'\)\"
Export: Release 10.2.0.4.0 - 64bit Production on Friday, 15 October, 2010 18:34:32
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Username: / as sysdba
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYS"."SYS_EXPORT_SCHEMA_01": /******** AS SYSDBA directory=EXP_DUMP dumpfile=test.dmp logfile=test.log SCHEMAS=('MM','MMM') include=TABLE:"IN('EA_EET_TMP','WS_DT')"
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 0 KB
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "MM"."EA_EET_TMP" 0 KB 0 rows
ORA-39165: Schema MMM was not found.
Master table "SYS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
/export/home/nucleus/dump/test.dmp
Job "SYS"."SYS_EXPORT_SCHEMA_01" completed with 1 error(s) at 18:35:19
{code}
Checking expdp help=y shows:
{code}TABLES Identifies a list of tables to export - one schema only.{code}
From my testing, tables from different schemas cannot be exported using expdp in a single command.
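The schema-mode variant from the reply above, written as a parameter file to avoid both the shell escaping and the UDE-00010 mode clash (names are taken from the attempt above; every schema listed must exist, or expdp raises ORA-39165 as seen):

```
# export.par (sketch)
DIRECTORY=EXP_DUMP
DUMPFILE=test.dmp
LOGFILE=test.log
SCHEMAS=MM,MMM
INCLUDE=TABLE:"IN ('EA_EET_TMP','WS_DT')"
```

Then run: expdp parfile=export.par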
Anand
-
Re: impdp - duplicate keys found error
Today I faced the same problem.
The impdp and expdp logs show the same number of records, but the target database has more rows (duplicates) in a few tables.
The source database version is 10.2.0.4 and the target database version is 11.2.0.3.
I loaded the same dump file again with the same par file, and it loaded successfully. According to Oracle Support, something interfered between the data load and index creation while the Data Pump job was in progress.
Thanks
Edited by: 931804 on Oct 11, 2012 2:35 AM