Datapump import
Hi, I have done a successful export through Data Pump,
but when I try to import that dmp file I am getting an error:
C:\Documents and Settings>impdp scott/tiger full=y directory=datapump dumpfile=full.dmp logfile=full;
Import: Release 10.2.0.3.0 - Production on Thursday, 19 August, 2010 16:33:09
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation
Please tell me how to solve this issue.
I am trying this on another machine:
SQL> SELECT directory_name, grantee, privilege
  2  FROM dba_tab_privs t, dba_directories d
  3  WHERE t.table_name(+) = d.directory_name;
DIRECTORY_NAME   GRANTEE             PRIVILEGE
---------------- ------------------- ---------
DATAPUMP2        SCOTT               WRITE
DATAPUMP2        SCOTT               READ
DATAPUMP1        SCOTT               WRITE
DATAPUMP1        SCOTT               READ
DATAPUMP         SCOTT               WRITE
DATAPUMP         SCOTT               READ
WORK_DIR
ADMIN_DIR
DATA_PUMP_DIR    EXP_FULL_DATABASE   WRITE
DATA_PUMP_DIR    IMP_FULL_DATABASE   WRITE
DATA_PUMP_DIR    EXP_FULL_DATABASE   READ
DATA_PUMP_DIR    IMP_FULL_DATABASE   READ
As you can see, this shows the directory is there.
What should I do?
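For what it's worth, ORA-39070 together with ORA-29283 usually means the server-side Oracle process cannot open the path behind the directory object: the path must exist on the database server (not the client machine), and the oracle OS user needs write permission on it. A sketch of what to verify as a DBA; the directory name comes from the impdp command above, the path is illustrative:

```sql
-- Where does the directory object point on the *server*?
SELECT directory_name, directory_path
  FROM dba_directories
 WHERE directory_name = 'DATAPUMP';

-- If that path does not exist on this machine, repoint the object
-- (the path below is illustrative) and re-grant access:
CREATE OR REPLACE DIRECTORY datapump AS 'C:\oracle\dpump';
GRANT READ, WRITE ON DIRECTORY datapump TO scott;
```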
chetan
Similar Messages
-
Hi,
I'm importing a schema with one table of about 1 million (spatial) records. The import takes a very long time to complete: I ran it on Friday, and when I looked at the process on Monday I saw it had finished at least 24 hours after I started it. Here's the output:
Z:\>impdp system/password@ora schemas=bla directory=dpump_dir1 dumpfile=schema8.dmp
Import: Release 10.1.0.2.0 - Production on Friday, 21 January, 2005 16:32
Copyright (c) 2003, Oracle. All rights reserved.
Connected to: Personal Oracle Database 10g Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "SYSTEM"."SYS_IMPORT_SCHEMA_02" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_SCHEMA_02": system/********@ora schemas=bla directory=dpump_dir1 dumpfile=schema8.dmp
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/SE_PRE_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "BLA"."NLD_NW" 266.1 MB 1217147 rows
Job "SYSTEM"."SYS_IMPORT_SCHEMA_02" successfully completed at 15:22
The dumpfile was created with:
Z:\>expdp system/password@orcl schemas=bla directory=dpump_dir1 dumpfile=schema8.dmp
The source DB is running on Linux and the target is running WinXP.
I thought Data Pump was meant to be really fast; what's going wrong in my situation? For comparison, I've just timed the import into the same database from which I exported the dump file: that import takes 5 minutes.
-
Check the status of an expdp Data Pump job
All,
I have started expdp (Data Pump) on a 250 GB schema. I want to know when the job will be completed.
I tried to find this from the views dba_datapump_sessions, v$session_longops, and v$session, but all in vain.
Is it possible to find the completion time of an expdp job?
Your help is really appreciated.
Hi,
Have you started the job in interactive mode?
If yes, you can get the status of execution with the STATUS parameter; the default is zero. If you set it to a non-zero value, the status is displayed at that interval, in seconds.
Second, check dba_datapump_jobs.
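To make the reply concrete, here is a hedged sketch of the two dictionary queries; the LIKE pattern assumes a default-named export job:

```sql
-- Job state as seen by the server:
SELECT owner_name, job_name, state
  FROM dba_datapump_jobs;

-- Rough progress: bytes processed so far vs. estimated total.
SELECT opname, sofar, totalwork,
       ROUND(100 * sofar / totalwork, 1) AS pct_done
  FROM v$session_longops
 WHERE opname LIKE 'SYS_EXPORT%'
   AND totalwork > 0;
```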
- Pavan Kumar N -
How to load data into an oracle_datapump table
I have done the following steps:
1) I created an external table over a .csv file:
CREATE TABLE TB1 (
  COLLECTED_TIME VARCHAR2(8)
)
ORGANIZATION EXTERNAL (
  TYPE oracle_loader
  DEFAULT DIRECTORY SRC_DIR
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    BADFILE 'TB1.bad'
    LOGFILE 'TB1.log'
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '`'
    MISSING FIELD VALUES ARE NULL
    (COLLECTED_TIME)
  )
  LOCATION ('TB.csv')
);
2) I create a datapump table and .dmp file using the step-1 table (TB1) as the source:
CREATE TABLE TB2
ORGANIZATION EXTERNAL
( TYPE oracle_datapump
DEFAULT DIRECTORY SRC_DIR
LOCATION ('TB.dmp')
) AS SELECT
to_date(decode(COLLECTED_TIME,'24:00:00','00:00:00',COLLECTED_TIME),'HH24:MI:SS') COLLECTED_TIME
FROM TB1;
3) Finally, I create a datapump table which uses the TB.dmp created in step 2 as its source:
CREATE TABLE TB (
  COLLECTED_TIME DATE
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_DATAPUMP
  DEFAULT DIRECTORY SRC_DIR
  LOCATION ('TB.dmp')
);
Question: How do I create the TB.dmp file in SRC_DIR of step 2 using PL/SQL code?
How do I get the data into table TB of step 3 using PL/SQL code?
Thanks
abc
Hi,
Yes, I want to execute the SQL code of step 2 (as shown below) in a PL/SQL package:
CREATE TABLE TB2
ORGANIZATION EXTERNAL
( TYPE oracle_datapump
DEFAULT DIRECTORY SRC_DIR
LOCATION ('TB.dmp')
) AS SELECT
to_date(decode(COLLECTED_TIME,'24:00:00','00:00:00',COLLECTED_TIME),'HH24:MI:SS') COLLECTED_TIME
FROM TB1;
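Since CREATE TABLE is DDL, it cannot be written directly inside a PL/SQL package; one sketch, under the assumption that SRC_DIR and TB1 already exist, is to wrap the step-2 statement in EXECUTE IMMEDIATE:

```sql
CREATE OR REPLACE PROCEDURE create_tb2_dump AS
BEGIN
  -- Dynamic SQL, because DDL is not allowed as a static PL/SQL statement.
  EXECUTE IMMEDIATE q'[
    CREATE TABLE TB2
    ORGANIZATION EXTERNAL (
      TYPE oracle_datapump
      DEFAULT DIRECTORY SRC_DIR
      LOCATION ('TB.dmp')
    ) AS SELECT
      to_date(decode(COLLECTED_TIME, '24:00:00', '00:00:00', COLLECTED_TIME),
              'HH24:MI:SS') COLLECTED_TIME
    FROM TB1]';
END;
/
```

Note that dropping TB2 afterwards does not delete TB.dmp, so step 3 can still create TB over the same file.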
Please advise.
Thanks
abc
-
Any specific disadvantage of using Data Pump for an upgrade?
Hi,
Here's my config
SERVER A
Source O/S : Win 2003
Source DB : 10.2.0.4 ( 32 bit )
DB Size : 100 GB
SERVER B
Target O/S : Win 2008 R2 sp1
Target DB : 11.2.0.3 ( 64 bit )
I have to upgrade the 10g database on Server A by installing 11g on Server B. I have not used Database Upgrade Assistant, RMAN, or similar utilities to perform a 10g-to-11g upgrade at any time in the past.
Here is my question ...
a) I was planning to use Data Pump to perform this upgrade (downtime is not an issue). Since I know how to use Data Pump, do you see any potential problem with this approach? OR
b) Based on your experience, would you suggest that I avoid option (a) because of potential issues and use other methods that Oracle suggests, like the upgrade assistants?
I am open to both options; it's just that, since I am not an expert at this point in time, I was hesitating a bit to go with option (b).
The DB is also supposed to go from 32-bit to 64-bit. Not sure if this would be a deal breaker.
Note: the upgrade is supposed to happen on the second server. Can somebody provide high-level steps as a pointer?
- Learner
If downtime is not an issue, Data Pump is certainly an option. How big is the database? The steps are documented:
http://docs.oracle.com/cd/E11882_01/server.112/e23633/expimp.htm#i262220
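At a high level, the Data Pump route is a full export on the source and a full import on the target; a minimal sketch (the directory objects, file names, and the use of SYSTEM are illustrative):

```
-- On SERVER A (10.2.0.4 source), after creating a directory object:
expdp system/password full=y directory=dp_dir dumpfile=full10g.dmp logfile=exp10g.log

-- Copy full10g.dmp to SERVER B, then on the 11.2.0.3 target
-- (pre-create tablespaces or remap them as needed):
impdp system/password full=y directory=dp_dir dumpfile=full10g.dmp logfile=imp11g.log
```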
HTH
Srini -
Error while taking dump using datapump
I am getting the following error:
Export: Release 10.2.0.1.0 - Production on Friday, 15 September, 2006 10:31:41
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "XX"."SYS_EXPORT_SCHEMA_02": XX/********@XXX directory=dpdump dumpfile=XXX150906.dmp logfile=XXX150906.log
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
ORA-31642: the following SQL statement fails:
BEGIN "DMSYS"."DBMS_DM_MODEL_EXP".SCHEMA_CALLOUT(:1,0,0,'10.02.00.01.00'); END;
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
ORA-06512: at "SYS.DBMS_METADATA", line 907
ORA-06550: line 1, column 7:
PLS-00201: identifier 'DMSYS.DBMS_DM_MODEL_EXP' must be declared
ORA-06550: line 1, column 7:
PL/SQL: Statement ignored
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPW$WORKER", line 6235
----- PL/SQL Call Stack -----
object line object
handle number name
2A68E610 14916 package body SYS.KUPW$WORKER
2A68E610 6300 package body SYS.KUPW$WORKER
2A68E610 9120 package body SYS.KUPW$WORKER
2A68E610 1880 package body SYS.KUPW$WORKER
2A68E610 6861 package body SYS.KUPW$WORKER
2A68E610 1262 package body SYS.KUPW$WORKER
255541A8 2 anonymous block
Job "XX"."SYS_EXPORT_SCHEMA_02" stopped due to fatal error at 10:33:12
The action required is to contact customer support. And on Metalink I found a note stating it is a bug in 10g Release 1 that was supposed to be fixed in 10g Release 1 version 4.
Some of the default schemas were purposely dropped from the database. The only default schemas available now are:
DBSNMP, DIP, OUTLN, PUBLIC, SCOTT, SYS, SYSMAN, SYSTEM, TSMSYS.
DIP, OUTLN, and TSMSYS were created again.
Could this be the cause of the problem?
Thanks in advance.
Hi,
Below is the DDL taken from a different database. Will this be enough? One more thing, please: what should the password be? Should it be DMSYS, since this account will be used by the system and not by me?
CREATE USER "DMSYS" PROFILE "DEFAULT" IDENTIFIED BY "*******" PASSWORD EXPIRE DEFAULT TABLESPACE "SYSAUX" TEMPORARY TABLESPACE "TEMP" QUOTA 204800 K ON "SYSAUX" ACCOUNT LOCK
GRANT ALTER SESSION TO "DMSYS"
GRANT ALTER SYSTEM TO "DMSYS"
GRANT CREATE JOB TO "DMSYS"
GRANT CREATE LIBRARY TO "DMSYS"
GRANT CREATE PROCEDURE TO "DMSYS"
GRANT CREATE PUBLIC SYNONYM TO "DMSYS"
GRANT CREATE SEQUENCE TO "DMSYS"
GRANT CREATE SESSION TO "DMSYS"
GRANT CREATE SYNONYM TO "DMSYS"
GRANT CREATE TABLE TO "DMSYS"
GRANT CREATE TRIGGER TO "DMSYS"
GRANT CREATE TYPE TO "DMSYS"
GRANT CREATE VIEW TO "DMSYS"
GRANT DROP PUBLIC SYNONYM TO "DMSYS"
GRANT QUERY REWRITE TO "DMSYS"
GRANT SELECT ON "SYS"."DBA_JOBS_RUNNING" TO "DMSYS"
GRANT SELECT ON "SYS"."DBA_REGISTRY" TO "DMSYS"
GRANT SELECT ON "SYS"."DBA_SYS_PRIVS" TO "DMSYS"
GRANT SELECT ON "SYS"."DBA_TAB_PRIVS" TO "DMSYS"
GRANT SELECT ON "SYS"."DBA_TEMP_FILES" TO "DMSYS"
GRANT EXECUTE ON "SYS"."DBMS_LOCK" TO "DMSYS"
GRANT EXECUTE ON "SYS"."DBMS_REGISTRY" TO "DMSYS"
GRANT EXECUTE ON "SYS"."DBMS_SYSTEM" TO "DMSYS"
GRANT EXECUTE ON "SYS"."DBMS_SYS_ERROR" TO "DMSYS"
GRANT DELETE ON "SYS"."EXPDEPACT$" TO "DMSYS"
GRANT INSERT ON "SYS"."EXPDEPACT$" TO "DMSYS"
GRANT SELECT ON "SYS"."EXPDEPACT$" TO "DMSYS"
GRANT UPDATE ON "SYS"."EXPDEPACT$" TO "DMSYS"
GRANT SELECT ON "SYS"."V_$PARAMETER" TO "DMSYS"
GRANT SELECT ON "SYS"."V_$SESSION" TO "DMSYS"
The other database has DMSYS with status EXPIRED & LOCKED, but I'm still able to take the dump using Data Pump there. -
Getting an error while attaching to a Data Pump job with the ATTACH keyword.
Hi All,
I am giving job_name for export :
expdp scott/tiger dumpfile=scott.dmp logfile=scott.log job_name=ashwin directory=datapump
Export: Release 10.2.0.1.0 - Production on Wednesday, 23 February, 2011 14:41:58
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "SCOTT"."ASHWIN": scott/******** dumpfile=scott.dmp logfile=scott.log job_name=ashwin directory
=datapump
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 384 KB
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "SCOTT"."PLAN_TABLE" 17.03 KB 3 rows
. . exported "SCOTT"."DEPT" 5.656 KB 4 rows
. . exported "SCOTT"."DEPT1" 5.656 KB 4 rows
. . exported "SCOTT"."EMP" 7.820 KB 14 rows
. . exported "SCOTT"."SALGRADE" 5.585 KB 5 rows
. . exported "SCOTT"."BONUS" 0 KB 0 rows
Master table "SCOTT"."ASHWIN" successfully loaded/unloaded
Dump file set for SCOTT.ASHWIN is:
D:\ORACLE\DATAPUMP\SCOTT.DMP
Job "SCOTT"."ASHWIN" successfully completed at 14:42:10
So my job name here is ASHWIN,
but when I try to import through this job I get the following error:
impdp scott/tiger attach=ashwin
Import: Release 10.2.0.1.0 - Production on Wednesday, 23 February, 2011 14:51:16
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-31626: job does not exist
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.KUPV$FT", line 430
ORA-31638: cannot attach to job ASHWIN for user SCOTT
ORA-31632: master table "SCOTT.ASHWIN" not found, invalid, or inaccessible
ORA-00942: table or view does not exist
I am not able to find my mistake... please help me in this regard. Thanks.
When you run Data Pump, Oracle creates a master table in your session. This table exists only while the job is running; once the job is over, its contents are written into the dump file and the table is dropped. What has happened here is that your job completed, which caused the master table to be dropped. Later, when you try to attach to it, this fails because the table is no longer there. If you want to test ATTACH, create a job that takes some time to finish, then kill or suspend it, and later try to reattach. Check in your schema what the job (master) table name is, and use that.
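A sketch of that test; note that an export job is reattached with expdp, not impdp (the names below are illustrative):

```
expdp scott/tiger schemas=scott dumpfile=test.dmp directory=datapump job_name=testjob
-- press Ctrl+C while the job runs, then at the interactive prompt:
Export> stop_job=immediate

-- later, reattach to the suspended job and resume it:
expdp scott/tiger attach=testjob
Export> continue_client
```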
HTH
Aman.... -
Datapump: How to append data to an existing table
Hello Everyone,
We are new to Datapump.
We are trying to extract data from one user/schema and append it into another user/schema.
First we tried to use the parameter table_exists_action=append during the import, but we receive this error (although the rows are appended):
ORA-39152: Table "XXXXX"."YYYYY_ZZZ" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append
Which I don't expect, since the utility has indeed been told to append data.
Next we tried to use CONTENT=DATA_ONLY on export and import, but the import never ends.
How can we append data into a table in another user/schema without getting an error?
Best regards.
Carl
IGNORE=Y during a classic import does the same operation: if the table already exists, it ignores the error and proceeds with importing/appending data to the tables. In the same way, classic import has indexes=n and constraints=n options.
Both export and import have equivalent options for filtering to our requirements, and Data Pump goes one step beyond classic import in that you can filter down to the metadata-object level as well.
Encrypt datapump in oracle 10g
Hi, I have an Oracle RAC SE 10gR2 (10.2.03) with 3 nodes on SLES 10 on ibm ppc.
Every night I run some Oracle exports (expdp) of some schemas. I want to encrypt one of the various data dumps through a bash shell script. I see that TDE exists to encrypt columns in a table, and that I have to create a wallet to encrypt. However, I would prefer clear text (no encryption in my tables and schemas) and to encrypt only the Data Pump files. For example, I keep my database the same as now, and I just specify an option in my expdp to encrypt.
Is that possible?
Can I encrypt those Data Pump files without using TDE (in Oracle 10g)?
Thank you very much!
TDE performs encryption at the storage level.
Data Pump gained an ENCRYPTION clause in 11g:
http://www.oracle-base.com/articles/11g/data-pump-enhancements-11gr1.php#encryption_parameters -
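Since the Data Pump ENCRYPTION parameters only exist from 11g onwards, a common workaround on 10g is to encrypt the finished dump file at the OS level from the same bash script. A sketch using openssl; the tool choice, the hard-coded passphrase, and the paths are assumptions for illustration, not Oracle features:

```shell
# Stand-in for the dump file expdp just wrote (demo only).
DUMP=/tmp/schema.dmp
printf 'pretend dump contents' > "$DUMP"

# Encrypt, then keep only the encrypted copy. In practice, read the
# passphrase from a protected file rather than hard-coding it.
openssl enc -aes-256-cbc -salt -pbkdf2 -pass pass:MySecret \
  -in "$DUMP" -out "$DUMP.enc"
rm "$DUMP"

# Decrypt again before running impdp:
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:MySecret \
  -in "$DUMP.enc" -out "$DUMP"
cat "$DUMP"
```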
Can an RMAN backup and a Data Pump export be executed at the same time?
Hello,
I have several databases that I back up using RMAN and export with Data Pump every night, starting at 6 PM and ending at midnight. The backup maintenance window doesn't give me enough time to run each database at a different time. I am using crontab to schedule my backups. Since I have so many databases that need to be backed up between 6 PM and midnight, some of the export and RMAN backup scripts will execute almost at the same time. My question is: can my Data Pump export and RMAN backup scripts run at the same time?
Thank you in advance.
John
Needs must. If you don't run expdp in parallel then it doesn't use that much. If it were really killing the system, then look into setting up a resource plan that knocks that user down, but this is a big step.
I would also look into using RMAN system incrementals, and block change tracking, to minimize your RMAN time.
Regards
If your shop needs to do both simultaneously then go for it.
Chris.
PS : One of my shops has maybe 20-30 rmans and pumps all kicking off some simultaneous, some not, from 0000 to 0130. No complaints from users and no problems either. Go for it.
-
Missing Role Grants after datapump
Hello OTN-Community,
I have a problem with Data Pump. I am using some include filters to get the relevant data exported. One of these filters includes the ROLES of my database whose names start with a certain expression.
After the export into another database, these roles exist, but all of the role grants and the grants to other users are missing. The object grants are exported correctly.
What am I doing wrong?
The export script:
declare
/* some declare specifications are not copied */
cursor curSchema is
select
distinct
t.Mdbn_Name Name
from
ProphetMaster.Dat_MdBn t
where
Upper(t.MDBN_Name) not in ('****', '***');
begin
-- define the schema list
SchemaList := '''****'',''***''';
if ExportAllProphetUsers then
for recSchema in curSchema loop
SchemaList := SchemaList||','''||recSchema.Name||'''';
end loop;
end if;
-- file size
FileSizeStr := to_char(MaxFileSize)||'M';
-- directory
DirectoryName := 'PHT_PUMP_DIR';
execute immediate 'create or replace directory "'||DirectoryName||'" as '''|| PumpDir||'''';
-- JobName
JobName := 'PHT_EXPORT'||DateStr;
-- Filename
if not FilenameWithDateTime then
DateStr :='';
end if;
Filename := 'PHTDB'||DateStr||'_%U.DMP';
Logfilename := JobName||'.LOG';
-- define and run the job
h1 := dbms_datapump.open (operation => 'EXPORT', job_mode => 'FULL', job_name => JobName, version => 'COMPATIBLE');
dbms_datapump.set_parallel(handle => h1, degree => ParallelExecutions);
dbms_datapump.add_file(handle => h1, filename => Logfilename, directory => DirectoryName, filetype => 3);
dbms_datapump.set_parameter(handle => h1, name => 'KEEP_MASTER', value => 0);
--10g
--dbms_datapump.add_file(handle => h1, filename => Filename, directory => DirectoryName, filesize => FileSizeStr, filetype => 1);
--11g
dbms_datapump.add_file(handle => h1, filename => Filename, directory => DirectoryName, filesize => FileSizeStr, filetype => 1, reusefile =>OverwriteFiles);
dbms_datapump.set_parameter(handle => h1, name => 'INCLUDE_METADATA', value => 1);
dbms_datapump.set_parameter(handle => h1, name => 'DATA_ACCESS_METHOD', value => 'AUTOMATIC');
-- Include Schemas
--dbms_datapump.metadata_filter(handle => h1, name => 'NAME_EXPR', value => 'IN('||SchemaList||')', object_type => 'DATABASE_EXPORT/SCHEMA');
dbms_datapump.metadata_filter(handle => h1, name => 'NAME_EXPR', value => 'IN('||SchemaList||')', object_type => 'DATABASE_EXPORT/SCHEMA');
dbms_datapump.metadata_filter(handle => h1, name => 'INCLUDE_PATH_EXPR', value => 'IN(''DATABASE_EXPORT/SCHEMA'')');
--Include Profiles
dbms_datapump.metadata_filter(handle => h1, name => 'NAME_EXPR', value => 'like ''PROFILE_%''', object_type => 'PROFILE');
dbms_datapump.metadata_filter(handle => h1, name => 'INCLUDE_PATH_EXPR', value => 'IN(''PROFILE'')');
--Include Roles
dbms_datapump.metadata_filter(handle => h1, name => 'NAME_EXPR', value => 'like ''***%''', object_type => 'ROLE');
dbms_datapump.metadata_filter(handle => h1, name => 'INCLUDE_PATH_EXPR', value => 'IN(''ROLE'')');
-- size estimation
dbms_datapump.set_parameter(handle => h1, name => 'ESTIMATE', value => 'BLOCKS');
--Start Job
dbms_output.put_line('Import Job started; Logfile: '|| LogFileName);
dbms_datapump.start_job(handle => h1, skip_current => 0, abort_step => 0);
-- Wait for ending and finishing job
dbms_datapump.wait_for_job(handle=>h1,job_state =>job_state);
dbms_output.put_line('Job has completed');
dbms_output.put_line('Final job state = ' || job_state);
dbms_datapump.detach(handle => h1);
end;
The Import Script:
begin
dbms_output.Enable(buffer_size => null);
-- directory
DirectoryName := 'PHT_PUMP_DIR';
execute immediate 'create or replace directory "'||DirectoryName||'" as '''|| PumpDir||'''';
-- JobName
JobName := 'PHT_IMPORT'|| to_char(sysdate,'_yyyy-MM-DD-HH24-MI');
--FileNames
Filename := 'PHTDB'||FileNameDateStr||'_%U.DMP';
LogFilename := JobName||'.LOG';
h1 := dbms_datapump.open (operation => 'IMPORT', job_mode => 'FULL', job_name => JobName, version => 'COMPATIBLE');
-- If the Data Pump import is executed on a standard (10.2) version, this call line must be used instead:
--h1 := dbms_datapump.open (operation => 'IMPORT', job_mode => 'FULL', job_name => JobName, version => '10.2');
dbms_datapump.set_parallel(handle => h1, degree => ParallelExecutions);
dbms_datapump.add_file(handle => h1, filename => Logfilename, directory => DirectoryName, filetype => 3);
dbms_datapump.set_parameter(handle => h1, name => 'KEEP_MASTER', value => 0);
dbms_datapump.add_file(handle => h1, filename => Filename, directory => DirectoryName, filetype => 1);
dbms_datapump.set_parameter(handle => h1, name => 'INCLUDE_METADATA', value => 1);
dbms_datapump.set_parameter(handle => h1, name => 'DATA_ACCESS_METHOD', value => 'AUTOMATIC');
dbms_datapump.set_parameter(handle => h1, name => 'REUSE_DATAFILES', value => 0);
dbms_datapump.set_parameter(handle => h1, name => 'TABLE_EXISTS_ACTION', value => 'REPLACE');
dbms_datapump.set_parameter(handle => h1, name => 'SKIP_UNUSABLE_INDEXES', value => 0);
--Start Job
dbms_output.put_line('Import Job started; Logfile: '|| LogFileName);
dbms_datapump.start_job(handle => h1, skip_current => 0, abort_step => 0);
-- Wait for ending and finishing job
dbms_datapump.wait_for_job(handle=>h1,job_state =>job_state);
dbms_output.put_line('Job has completed');
dbms_output.put_line('Final job state = ' || job_state);
dbms_datapump.detach(handle => h1);
end;
Has no one any idea?
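One thing worth checking: the role grants themselves travel under separate object paths that the INCLUDE_PATH_EXPR filters above (SCHEMA, PROFILE, ROLE) may be excluding. The valid path names for a full export can be listed from the dictionary before adding a filter; this query is a sketch of how to find them:

```sql
-- List export object paths related to roles and grants,
-- to decide which ones the metadata filters must also include.
SELECT object_path, comments
  FROM database_export_objects
 WHERE object_path LIKE '%ROLE%'
    OR object_path LIKE '%GRANT%';
```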
-
Script for export in datapump -- help needed !!!
Hello all,
I am using the following script as a batch file in my database for export.
Script:
=========
exp name/password file=d:\exp\%date%.dmp full=y log=d:\exp\exp.log and this will replace the first file (Monday) on the next Monday.
In a similar way, I need a script for a full database export using Data Pump.
Thanks,
gold
Log in to the database as a DBA and create a directory for your dump file path:
create directory dpump_dir as 'd:\exp';
and then use the script below for the export:
expdp username/password full=y directory=dpump_dir dumpfile=%date%.dmp logfile=exp.log -
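Putting the reply together as a Windows batch file, in the same spirit as the original exp script. Note that the format of %date% depends on the Windows locale and can contain characters such as '/' that are invalid in file names, so the substitution below is an illustrative assumption:

```
@echo off
rem One-time setup in SQL*Plus as a DBA:
rem   create directory dpump_dir as 'd:\exp';
rem   grant read, write on directory dpump_dir to username;

rem Replace '/' in %date% so it is usable as a file name.
set DMPNAME=%date:/=-%
expdp username/password full=y directory=dpump_dir dumpfile=%DMPNAME%.dmp logfile=exp.log
```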
Error while exporting using Data Pump
hi ,
When I try to export the database using Data Pump, I get the following error.
Details :
DB version : 10.2.0.2
OS : (HP-Unix)
error :
dbsrv:/u02/oradata2> 6 dumpfile=uatdb_preEOD28_jul.dmp logfile=uatdb_preEOD28jul.log directory=expdp_new <
Export: Release 10.2.0.1.0 - 64bit Production on Sunday, 27 July, 2008 10:00:38
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
With the Partitioning, OLAP and Data Mining options
Starting "OCEANFNC"."JOB6": userid=oceanfnc/********@uatdb content=ALL full=Y job_name=job6 dumpfile=uatdb_preEOD28_jul.dmp logfile=uatdb_preEOD28jul.log directory=expdp_new
Estimate in progress using BLOCKS method...
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
ORA-31642: the following SQL statement fails:
BEGIN "DMSYS"."DBMS_DM_MODEL_EXP".SCHEMA_CALLOUT(:1,0,1,'10.02.00.01.00'); END;
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
ORA-06512: at "SYS.DBMS_METADATA", line 907
ORA-04067: not executed, package body "DMSYS.DBMS_DM_MODEL_EXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "DMSYS.DBMS_DM_MODEL_EXP"
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPW$WORKER", line 6235
----- PL/SQL Call Stack -----
object line object
handle number name
c0000002ae4f0a60 14916 package body SYS.KUPW$WORKER
c0000002ae4f0a60 6300 package body SYS.KUPW$WORKER
c0000002ae4f0a60 9120 package body SYS.KUPW$WORKER
c0000002ae4f0a60 1880 package body SYS.KUPW$WORKER
c0000002ae4f0a60 6861 package body SYS.KUPW$WORKER
c0000002ae4f0a60 1262 package body SYS.KUPW$WORKER
c000000264e48dc0 2 anonymous block
Job "OCEANFNC"."JOB6" stopped due to fatal error at 10:00:41
Kindly advise me regarding this!
Thanks
Hi,
We got a Metalink note for this problem; the Metalink ID is Note 433022.1. Even after running the mentioned catalog.sql and catproc.sql in our DB, we are facing some problems. The error description:
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.system_info_exp(0,dynconnect,10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5327
Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/PRE_SYSTEM_ACTIONS/PROCACT_SYSTEM
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.system_info_exp(1,dynconnect,10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5327
Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/POST_SYSTEM_ACTIONS/PROCACT_SYSTEM
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('SYS',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('SYSTEM',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('OUTLN',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('TSMSYS',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('ANONYMOUS',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('OLAPSYS',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('SYSMAN',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('MDDATA',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('UATFNC',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('MGMT_VIEW',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('SCOTT',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('BRNSMS',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('FLEXML',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('FLEXBO',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('BRN000',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('FLEXRPT',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('BRN800',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('BRN900',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
ORA-39127: unexpected error from call to export_string :=EXFSYS.DBMS_EXPFIL_DEPASEXP.schema_info_exp('OCEANFNC',0,1,'10.02.00.01.00',newblock)
ORA-04067: not executed, package body "EXFSYS.DBMS_EXPFIL_DEPASEXP" does not exist
ORA-06508: PL/SQL: could not find program unit being called: "EXFSYS.DBMS_EXPFIL_DEPASEXP"
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 5412
Kindly advise me regarding this.
Thanks a lot for your reply. -
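For what it's worth, ORA-39127 against EXFSYS.DBMS_EXPFIL_DEPASEXP usually means the Expression Filter (EXFSYS) package body is missing or invalid in the source database. A hedged sketch of the usual check, run as SYS (the reinstall script path is the standard 10.2 location and is an assumption - verify it against your own ORACLE_HOME):

```sql
-- Check whether the package body exists and whether it is valid
SELECT object_type, status
FROM   dba_objects
WHERE  owner = 'EXFSYS'
AND    object_name = 'DBMS_EXPFIL_DEPASEXP';

-- If the PACKAGE BODY row is missing or INVALID, one common fix is to
-- reinstall the Expression Filter feature (path assumed for 10.2):
-- SQL> @?/rdbms/admin/catexf.sql
```

If EXFSYS is not actually used, the errors are typically harmless noise during export, but reinstalling the feature (or removing EXFSYS properly) makes them go away.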
Data Pump help - can't get it to work
This is my script:
create or replace directory EXPORT as 'e:\aia\backup\dmp\';
declare
l_dp_handle NUMBER;
begin
l_dp_handle := DBMS_DATAPUMP.open(
operation => 'EXPORT',
job_mode => 'SCHEMA',
remote_link => NULL,
job_name => 'GTAIATB4_UPGRADE_EXP4',
version => 'LATEST'
);
DBMS_DATAPUMP.add_file(
handle => l_dp_handle,
filename => 'GTAIATB4_UPGRADE2.dmp',
directory => 'export',
filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE
);
DBMS_DATAPUMP.add_file(
handle => l_dp_handle,
filename => 'GTAIATB4_UPGRADE2.log',
directory => 'exportT',
filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE
);
DBMS_DATAPUMP.metadata_filter(
handle => l_dp_handle,
name => 'SCHEMA_EXPR',
value => 'IN (''GTAIATB4_UPGRADE'')');
DBMS_DATAPUMP.start_job(l_dp_handle);
DBMS_DATAPUMP.detach(l_dp_handle);
END;
/
I want to export everything that belongs to the schema called GTAIATB4_UPGRADE.
this is the result:
SQL> declare
2 l_dp_handle NUMBER;
3 begin
4 l_dp_handle := DBMS_DATAPUMP.open(
5 operation => 'EXPORT',
6 job_mode => 'SCHEMA',
7 remote_link => NULL,
8 job_name => 'GTAIATB4_UPGRADE_EXP4',
9 version => 'LATEST'
10 );
11 DBMS_DATAPUMP.add_file(
12 handle => l_dp_handle,
13 filename => 'GTAIATB4_UPGRADE2.dmp',
14 directory => 'export',
15 filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE
16 );
17 DBMS_DATAPUMP.add_file(
18 handle => l_dp_handle,
19 filename => 'GTAIATB4_UPGRADE2.log',
20 directory => 'exportT',
21 filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE
22 );
23 DBMS_DATAPUMP.metadata_filter(
24 handle => l_dp_handle,
25 name => 'SCHEMA_EXPR',
26 value => 'IN (''GTAIATB4_UPGRADE'')');
27 DBMS_DATAPUMP.start_job(l_dp_handle);
28 DBMS_DATAPUMP.detach(l_dp_handle);
29 END;
30
31
32 /
declare
ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3444
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3693
ORA-06512: at line 11
Also, can someone tell me how I can delete the jobs that were created? Right now I have to keep changing the job name, i.e. 'GTAIATB4_UPGRADE_EXP4' started as 'GTAIATB4_UPGRADE_EXP', then 'GTAIATB4_UPGRADE_EXP3', etc.

There could be lots of things wrong with this PL/SQL script. If you need to change the job name every time, then I can tell you that at least the open is happening, because it is creating the master table. After that, I have no idea what is happening. One thing I would do is add some debugging to the code so you can figure out how far it gets. I use an internal tool to help debug PL/SQL, but a simple approach is to insert into a table after every call. Something like this:
first create a table that is owned by the schema running the job.
create table debug_tab (a number);
then in the code, I would put a statement like this after every datapump call:
insert into debug_tab values (1);
commit;
then
insert into debug_tab values (2);
commit;
etc. Then, after you run your script, look at what is in debug_tab. That will tell you how far you got. One of the things I would check is that the directory exists.
Actually, I think I just saw it. For the dump file, you have the directory as 'export';
for the log file, you have the directory as 'exportT' <---- one with a lowercase t, one with an uppercase T. This is probably your problem.
As for getting rid of existing jobs,
select * from user_datapump_jobs;
then
drop table <job_name>; (the master table has the same name as the job)
This will get rid of the Data Pump job.
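A sketch of that cleanup, assuming the leftover jobs are no longer running and their master tables sit in your own schema (the job name shown is the one from the post):

```sql
-- List leftover Data Pump jobs; the master table name matches the job name
SELECT job_name, state, operation
FROM   user_datapump_jobs;

-- For a job that is NOT_RUNNING, dropping its master table removes the job:
-- DROP TABLE GTAIATB4_UPGRADE_EXP;
```

If a job is still attached or running, stop it first (for example with DBMS_DATAPUMP.stop_job) before dropping the table.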
Hope this helps.
Dean -
How to avoid time-out issues in Data Pump?
Hi All,
I am loading one of our schemas from stage to a test server using Data Pump expdp and impdp. Its size is around 332 GB.
My Oracle server instance is on the Unix server rwlq52l1, and I am connecting to Oracle from my client instance (rwxq04l1).
I am running the expdp and impdp commands from the Oracle client using the commands below:
expdp pa_venky/********@qdssih30 schemas=EVPO directory=PA_IMPORT_DUMP dumpfile=EVPO_Test.dmp CONTENT=all include=table
impdp pa_venky/********@qdsrih30 schemas=EVPO directory=PA_IMPORT_DUMP dumpfile=EVPO_Test.dmp CONTENT=all include=table table_exists_action=replace
The export completes, but the import gets stuck at the index build below. After some time I see the following time-out in the log files:
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX.
Error:-
VERSION INFORMATION:
TNS for Linux: Version 11.1.0.7.0 - Production
Unix Domain Socket IPC NT Protocol Adaptor for Linux: Version 11.1.0.7.0 - Production
Oracle Bequeath NT Protocol Adapter for Linux: Version 11.1.0.7.0 - Production
TCP/IP NT Protocol Adapter for Linux: Version 11.1.0.7.0 - Production
Time: 13-JAN-2012 12:34:31
Tracing not turned on.
Tns error struct:
ns main err code: 12535
TNS-12535: TNS:operation timed out
ns secondary err code: 12560
nt main err code: 505
TNS-00505: Operation timed out
nt secondary err code: 110
nt OS err code: 0
Client address: (ADDRESS=(PROTOCOL=tcp)(HOST=170.217.82.86)(PORT=65069))
The IP address above belongs to my Unix client system (rwxq04l1).
How can I find the Oracle client's port number?
Please suggest how to avoid these time-outs; they seem to occur between the Oracle server and the client.
Thanks,
Venkat Vadlamudi.

Don't run from the client ... run from the server; or,
if running from a client, use the built-in DBMS_DATAPUMP package's API.
http://www.morganslibrary.org/reference/pkgs/dbms_datapump.html
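Along those lines, here is a minimal server-side import sketch using the DBMS_DATAPUMP API. The job submits on the server, so a dropped client connection no longer kills it. Directory, dump file, and schema names are taken from the post and may need adjusting:

```sql
DECLARE
  h NUMBER;
BEGIN
  -- Open a schema-mode import job
  h := DBMS_DATAPUMP.open(
         operation => 'IMPORT',
         job_mode  => 'SCHEMA',
         job_name  => 'EVPO_IMP');
  -- Point the job at the dump file in the existing directory object
  DBMS_DATAPUMP.add_file(
    handle    => h,
    filename  => 'EVPO_Test.dmp',
    directory => 'PA_IMPORT_DUMP',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  -- Restrict the import to the EVPO schema
  DBMS_DATAPUMP.metadata_filter(
    handle => h,
    name   => 'SCHEMA_EXPR',
    value  => 'IN (''EVPO'')');
  -- Equivalent of table_exists_action=replace on the impdp command line
  DBMS_DATAPUMP.set_parameter(h, 'TABLE_EXISTS_ACTION', 'REPLACE');
  DBMS_DATAPUMP.start_job(h);
  -- Detach; the job keeps running on the server even if this session ends
  DBMS_DATAPUMP.detach(h);
END;
/
```

You can re-attach later from any session with DBMS_DATAPUMP.attach('EVPO_IMP', user) to monitor progress.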