Can't "stop" data pump job via dbms_datapump
DBMS_DATAPUMP.101
I have a job that I have OPENed but never started. How do I get rid of it?
SQL> declare
  2    v_handle number;
  3
  4  begin
  5
  6    v_handle := DBMS_DATAPUMP.ATTACH('RefreshFromDev','SYSTEM');
  7    DBMS_DATAPUMP.STOP_JOB(v_handle);
  8
  9  end;
 10  /
declare
ERROR at line 1:
ORA-31626: job does not exist
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 911
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3279
ORA-06512: at line 6
SQL> SELECT JOB_NAME, OWNER_NAME, STATE FROM DBA_DATAPUMP_JOBS;

JOB_NAME        OWNER_NAME   STATE
--------------- ------------ ---------
RefreshFromDev  SYSTEM       DEFINING
Obviously the job exists. I've tried issuing another OPEN with the same name, and it complains that the job already exists. As I understand it, I have to attach to the job and issue a STOP_JOB in order to dispose of it.
Thanks
Steve
OK, mystery solved, sort of. If I close my SQL*Plus session and log in again, issuing the same SQL, it works.
I suppose I should read the message "job does not exist" as something like "already attached to a job; detach or disconnect".
Can anyone shed some light on this? I guess it means you can only attach to one job per session, or that only one handle can be issued per job per session. But I'm only guessing.
Thanks,
Steve
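For what it's worth, a pattern that seems to work here (a sketch only, using the job and owner names from this thread) is to attach from a session that is not already holding a handle to the job, and ask STOP_JOB to drop the master table so the job definition disappears:

```sql
-- Sketch: run from a NEW session (only one attach handle per job per session).
DECLARE
  v_handle NUMBER;
BEGIN
  v_handle := DBMS_DATAPUMP.ATTACH('RefreshFromDev', 'SYSTEM');
  -- immediate => 1 aborts at once; keep_master => 0 drops the master table
  DBMS_DATAPUMP.STOP_JOB(v_handle, immediate => 1, keep_master => 0);
END;
/
```

This matches the behaviour described above: the first ATTACH in a fresh session succeeds, while a session that already opened or attached to the job gets ORA-31626.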
Similar Messages
-
ORA-39097: Data Pump job encountered unexpected error -12801
Hello! I am running an Oracle RAC 11.2.0.3.0 database on the IBM AIX 7.1 OS platform.
We normally run expdp Data Pump backups. We created an OS-authenticated user and have had non-DBA users run expdp as this user (instead of "/ as sysdba", which is only used by DBAs). This OS-authenticated user had been working fine until it started giving the error below:
Export: Release 11.2.0.3.0 - Production on Fri Apr 5 23:08:22 2013
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage Management, Oracle Label Security,
OLAP, Data Mining, Oracle Database Vault and Real Application Testing options
FLASHBACK automatically enabled to preserve database integrity.
Starting "OPS$COPBKPMG"."SYS_EXPORT_SCHEMA_16": /******** DIRECTORY=COPKBFUB_DIR dumpfile=COPKBFUB_Patch35_PreEOD_2013-04-05-23-08_%U.dmp logfile=COPKBFUB_Patch35_PreEOD_2013-04-05-23-08.log cluster=n parallel=4 schemas=BANKFUSION,CBS,UBINTERFACE,WASADMIN,CBSAUDIT,ACCTOPIC,BFBANKFUSION,PARTY,BFPARTY,WSREGISTRY,COPK
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 130.5 GB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/SYNONYM/SYNONYM
Processing object type SCHEMA_EXPORT/DB_LINK
Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/PACKAGE/PACKAGE_SPEC
Processing object type SCHEMA_EXPORT/PACKAGE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/FUNCTION/FUNCTION
Processing object type SCHEMA_EXPORT/FUNCTION/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
Processing object type SCHEMA_EXPORT/PROCEDURE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/PACKAGE/COMPILE_PACKAGE/PACKAGE_SPEC/ALTER_PACKAGE_SPEC
Processing object type SCHEMA_EXPORT/FUNCTION/ALTER_FUNCTION
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/INDEX/FUNCTIONAL_INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_INDEX/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/VIEW/VIEW
Processing object type SCHEMA_EXPORT/VIEW/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/VIEW/GRANT/CROSS_SCHEMA/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/VIEW/COMMENT
Processing object type SCHEMA_EXPORT/PACKAGE/PACKAGE_BODY
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/MATERIALIZED_VIEW
Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
. . exported "WASADMIN"."BATCHGATEWAYLOGDETAIL" 2.244 GB 9379850 rows
. . exported "WASADMIN"."UBTB_TRANSACTION" 13.71 GB 46299982 rows
. . exported "WASADMIN"."INTERESTHISTORY" 2.094 GB 13479801 rows
. . exported "WASADMIN"."MOVEMENTSHISTORY" 1.627 GB 13003451 rows
. . exported "WASADMIN"."ACCRUALSREPORT" 1.455 GB 18765315 rows
ORA-39097: Data Pump job encountered unexpected error -12801
ORA-39065: unexpected master process exception in MAIN
ORA-12801: error signaled in parallel query server PZ99, instance copubdb02dc:COPKBFUB2 (2)
ORA-01460: unimplemented or unreasonable conversion requested
Job "OPS$COPBKPMG"."SYS_EXPORT_SCHEMA_16" stopped due to fatal error at 23:13:37
Please assist.
Have you seen this?
*Bug 13099577 - ORA-1460 with parallel query [ID 13099577.8]* -
ERROR: Invalid data pump job state; one of: DEFINING or EXECUTING.
Hi all.
I'm using Oracle 10g (10.2.0.3). I had some scheduled jobs to export the DB with Data Pump. Some days ago they began to fail with the message:
ERROR: Invalid data pump job state; one of: DEFINING or EXECUTING.
Wait a few moments and check the log file (if one was used) for more details.
EM says that I can't monitor the job because there is no master table. How can I find the log?
I also found in the bdump directory that exceptions had occurred:
*** 2008-08-22 09:04:54.671
*** ACTION NAME:(EXPORT_ITATRD_DIARIO) 2008-08-22 09:04:54.671
*** MODULE NAME:(Data Pump Master) 2008-08-22 09:04:54.671
*** SERVICE NAME:(SYS$USERS) 2008-08-22 09:04:54.671
*** SESSION ID:(132.3726) 2008-08-22 09:04:54.671
kswsdlaqsub: unexpected exception err=604, err2=24010
kswsdlaqsub: unexpected exception err=604, err2=24010
* kjdrpkey2hv: called with pkey 114178, options x8
* kjdrpkey2hv: called with pkey 114176, options x8
* kjdrpkey2hv: called with pkey 114177, options x8
Some help would be appreciated.
Thanks.
Delete any uncompleted or failed jobs and try again.
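A sketch of that cleanup (assuming the stuck jobs are yours and show as NOT RUNNING; the table name below is an example only, as the master table is named after the job):

```sql
-- List Data Pump jobs and their states.
SELECT owner_name, job_name, state
  FROM dba_datapump_jobs;

-- For a stuck, not-running job, dropping its master table removes
-- the job definition. Hypothetical job name:
DROP TABLE system.sys_export_full_01 PURGE;
```

After the orphaned master tables are gone, the scheduled export jobs can usually be re-run cleanly.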
-
ORA-39097: Data Pump job encountered unexpected error -39076
Hi Everyone,
Today I tried to take a table-level export dump from my test database, version 10.2.0.4 on Solaris 10 (64-bit), and I got the following error message:
Job "SYSTEM"."SYS_EXPORT_TABLE_23" successfully completed at 09:51:36
ORA-39097: Data Pump job encountered unexpected error -39076
ORA-39065: unexpected master process exception in KUPV$FT_INT.DELETE_JOB
ORA-39076: cannot delete job SYS_EXPORT_TABLE_23 for user SYSTEM
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 934
ORA-31632: master table "SYSTEM.SYS_EXPORT_TABLE_23" not found, invalid, or inaccessible
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 1079
ORA-20000: Unable to send e-mail message from pl/sql because of:
ORA-29260: network error: Connect failed because target host or object does not exist
ORA-39097: Data Pump job encountered unexpected error -39076
ORA-39065: unexpected master process exception in MAIN
ORA-39076: cannot delete job SYS_EXPORT_TABLE_23 for user SYSTEM
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 934
ORA-31632: master table "SYSTEM.SYS_EXPORT_TABLE_23" not found, invalid, or inaccessible
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 1079
ORA-20000: Unable to send e-mail message from pl/sql because of:
ORA-29260: network error: Connect failed because target host or object does not exist
I hope the export dumpfile is a valid one, but I don't know why I am getting this error message. Has anyone faced this kind of problem? Please advise.
Thanks
Shan
Once you see this:
Job "SYSTEM"."SYS_EXPORT_TABLE_23" successfully completed at 09:51:36
the Data Pump job is done with the dumpfile. There is some cleanup that is needed, and it looks like something in the cleanup failed. Not sure what it was, but your dumpfile should be fine. One easy way to test it is to run impdp with SQLFILE. This will do everything import would do, but instead of creating objects, it writes the DDL to the SQL file.
impdp user/password sqlfile=my_test.sql directory=your_dir dumpfile=your_dump.dmp ...
If that works, then your dumpfile should be fine. The last thing the export does is write the Data Pump master table to the dumpfile. The first thing that import does is read that table in. So, if you can read it in (which impdp with SQLFILE does), your dump is good.
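Spelled out, that verification step might look like this (every name here is a placeholder for your own environment):

```shell
# Write the DDL that an import WOULD run into verify_ddl.sql,
# without creating any objects. Directory and file names are examples.
impdp system/password \
  directory=DATA_PUMP_DIR \
  dumpfile=your_dump.dmp \
  sqlfile=verify_ddl.sql \
  logfile=verify_ddl.log
```

If this completes and verify_ddl.sql contains the expected CREATE statements, the master table in the dump was readable.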
Dean -
ORA-39080: failed to create queues "" and "" for Data Pump job
When I am running datapump expdp I receive the following error:
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-31626: job does not exist
ORA-31637: cannot create job SYS_EXPORT_SCHEMA_01 for user CHESHIRE_POLICE_LOCAL
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 600
ORA-39080: failed to create queues "" and "" for Data Pump job
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPC$QUE_INT", line 1555
ORA-01403: no data found
SYS has the following two objects invalid at present, after running catproc.sql, utlrp.sql and manual compilation:
OBJECT_NAME OBJECT_TYPE
AQ$_KUPC$DATAPUMP_QUETAB_E QUEUE
SCHEDULER$_JOBQ QUEUE
When I run catdpb.sql, the Data Pump queue table is not created:
BEGIN
  dbms_aqadm.create_queue_table(
    queue_table        => 'SYS.KUPC$DATAPUMP_QUETAB',
    multiple_consumers => TRUE,
    queue_payload_type => 'SYS.KUPC$_MESSAGE',
    comment            => 'DataPump Queue Table',
    compatible         => '8.1.3');
EXCEPTION
  WHEN OTHERS THEN
    IF SQLCODE = -24001 THEN
      NULL;
    ELSE
      RAISE;
    END IF;
END;
ERROR at line 1:
ORA-01403: no data found
ORA-06512: at line 7
Snehashish Ghosh wrote:
When I am running datapump expdp I receive the following error:
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-31626: job does not exist
ORA-31637: cannot create job SYS_EXPORT_SCHEMA_01 for user CHESHIRE_POLICE_LOCAL
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 600
ORA-39080: failed to create queues "" and "" for Data Pump job
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPC$QUE_INT", line 1555
ORA-01403: no data found
Sys has the following two objects as invalid at present after running catproc.sql and utlrp.sql and manual compilation:
OBJECT_NAME OBJECT_TYPE
AQ$_KUPC$DATAPUMP_QUETAB_E QUEUE
SCHEDULER$_JOBQ QUEUE
When I run catdpb.sql, the Data Pump queue table is not created:
BEGIN
dbms_aqadm.create_queue_table(queue_table => 'SYS.KUPC$DATAPUMP_QUETAB', multiple_consumers => TRUE, queue_payload_type => 'SYS.KUPC$_MESSAGE', comment => 'DataPump Queue Table', compatible => '8.1.3');
Does it work better when specifying an Oracle version that is from this century, i.e. newer than 8.1? -
Can we use Data Pump to export data, using a SQL query, doing a join
Folks,
I have a quick question.
Using Oracle 10g R2 on Solaris 10.
Can Data Pump be used to export data, using a SQL query which is doing a join between 3 tables ?
Thanks,
Ashish
Hello,
No; this is from expdp help=Y:
QUERY Predicate clause used to export a subset of a table.
Regards -
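So a three-table join is not supported directly, but the QUERY parameter can restrict the rows of a single exported table, and its predicate may reference other tables through a subquery. A sketch (all object and column names here are hypothetical):

```shell
# QUERY filters rows of one exported table; the EXISTS subquery can
# look at other tables. ORDERS/CUSTOMERS and their columns are made up.
expdp scott/tiger directory=DATA_PUMP_DIR dumpfile=orders.dmp \
  tables=ORDERS \
  query='ORDERS:"WHERE EXISTS (SELECT 1 FROM CUSTOMERS C WHERE C.ID = ORDERS.CUST_ID AND C.REGION = ''WEST'')"'
```

If the result of an actual join is needed, a common workaround is to materialize it first (CREATE TABLE ... AS SELECT with the join) and export that table.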
Data pump: by using dbms_datapump
I wrote a PL/SQL procedure that compiles with no errors. But when I execute the procedure, I get the following error:
ERROR at line 1:
ORA-31623: a job is not attached to this session via the specified handle
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 911
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4710
ORA-06512: at "DO_EXPORT.RCAT_EXPDP", line 59
ORA-06512: at line 1
I went to Metalink, where I found that this error can occur if any component in DBA_REGISTRY is invalid. I found that Spatial is invalid in DBA_REGISTRY. I ran utlrp several times, but Spatial is still invalid. Does anyone know a workaround so that I can use the PL/SQL code I wrote?
Thanks.
I also got this error when I ran this job in a database where all components are valid.
Source code which I am using is this.
CREATE OR REPLACE procedure DO_EXPORT.rcat_expdp (
strJobName in varchar2 , strDumpFileName in varchar2,
strLogFileName in varchar2, strDirectory in varchar2,
strSchemaName in varchar2)
is
d1 number;
v_ind number; v_sts ku$_Status; v_le ku$_LogEntry;
begin
begin
dbms_output.put_line ('in code');
d1 := dbms_datapump.open (
operation => 'EXPORT',
job_mode => 'SCHEMA',
job_name => strJobName,
version => 'COMPATIBLE');
end;
dbms_output.put_line ('handle created');
begin
dbms_datapump.add_file(handle => d1,
filename => strDumpFileName,
directory => strDirectory,
filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
end;
dbms_output.put_line ('dmup file');
begin
dbms_datapump.add_file(handle => d1,
filename => strLogFileName,
directory => strDirectory,
filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
end;
dbms_output.put_line ('log file');
begin
dbms_datapump.metadata_filter(handle => d1,
name => 'SCHEMA_EXPR',
-- build the IN list from the parameter's value, not its name
value => 'IN (''' || strSchemaName || ''')');
end;
dbms_output.put_line ('schema added');
begin
dbms_datapump.start_job(handle => d1, skip_current => 0);
end;
dbms_output.put_line ('job started');
begin
dbms_datapump.detach(handle => d1);
end;
dbms_output.put_line('Export job submitted successfully.');
exception
when others then
v_sts:=dbms_datapump.get_status(d1,dbms_datapump.ku$_status_job_error,0);
v_le := v_sts.error;
if v_le is not null then
v_ind := v_le.FIRST;
while v_ind is not null loop
dbms_output.put_line(v_le(v_ind).LogText);
v_ind := v_le.NEXT(v_ind);
end loop;
end if;
begin dbms_datapump.stop_job(handle => d1); end;
end;
/ -
Epson can do only 1st print job via AirportExpress
I have an Epson M1400 connected to the AirportExpress. After the 1st print job no other print jobs are executed. After switching off and switching on the printer a new print job can be done.
When connecting the printer via USB to the MacBook, I can print multiple print jobs as normal without a problem.
All drivers and OSX 10.6.8 are updated.
Anyone have an idea what to do?
Hi 4epHaus,
On the iPad, if you press the home button (the circle indentation with the rounded square) two times, a list of currently running apps appear. Look for an app called Print Center. Select that app and choose the print job, then touch the cancel button.
If I have solved your issue, please feel free to provide kudos and make sure you mark this thread as solution provided!
Although I work for HP, my posts and replies are my own opinion and not those of HP. -
HT5934 If I want to stop upgrading to iOS 7 for a while, how can I stop that?
I want to know how to stop upgrading to iOS 7 for a while, so I can download it later.
A possible cause is security software (firewall, anti-virus) that blocks or restricts Firefox or the plugin-container process without informing you, possibly after detecting changes (an update) to the Firefox program.
Remove all rules for Firefox and the plugin-container from the permissions list in the firewall, and let your firewall ask again for permission to grant full, unrestricted internet access to Firefox, the plugin-container process, and the updater process.
See:
*https://support.mozilla.org/kb/Server+not+found
*https://support.mozilla.org/kb/Firewalls
See also:
*http://kb.mozillazine.org/Error_loading_websites
You can try to reset (power off/on) the router. -
How can I stop data in the report when it is falling apart?
Please help me.
Simple Example :
tables : lfa1.
select-options s_lifnr for lfa1-lifnr.
data : begin of i_lfa1 occurs 0 ,
lifnr like lfa1-lifnr,
name1 like lfa1-name1,
land1 like lfa1-land1,
end of i_lfa1.
start-of-selection.
select lifnr
name1
land1 from lfa1
into table i_lfa1
where lifnr in s_lifnr.
if sy-subrc ne 0.
* issue an error message here.
endif.
DECLARE
ind NUMBER; -- Loop index
h1 NUMBER; -- Data Pump job handle
percent_done NUMBER; -- Percentage of job complete
job_state VARCHAR2(30); -- To keep track of job state
le ku$_LogEntry; -- For WIP and error messages
js ku$_JobStatus; -- The job status from get_status
jd ku$_JobDesc; -- The job description from get_status
sts ku$_Status; -- The status object returned by get_status
BEGIN
-- Create a (user-named) Data Pump job to do a schema export.
h1 := DBMS_DATAPUMP.OPEN('EXPORT','SCHEMA',NULL,'EXAMPLE1','LATEST');
-- Specify a single dump file for the job (using the handle just returned)
-- and a directory object, which must already be defined and accessible
-- to the user running this procedure.
--BACKUP DIRECTORY NAME
DBMS_DATAPUMP.ADD_FILE(h1,'example1.dmp','BACKUP');
-- A metadata filter is used to specify the schema that will be exported.
--ORVETL USER NAME
DBMS_DATAPUMP.METADATA_FILTER(h1,'SCHEMA_EXPR','IN (''orvetl'')');
-- Start the job. An exception will be generated if something is not set up
-- properly.
DBMS_DATAPUMP.START_JOB(h1);
-- The export job should now be running. In the following loop, the job
-- is monitored until it completes. In the meantime, progress information is
-- displayed.
percent_done := 0;
job_state := 'UNDEFINED';
while (job_state != 'COMPLETED') and (job_state != 'STOPPED') loop
dbms_datapump.get_status(h1,
dbms_datapump.ku$_status_job_error +
dbms_datapump.ku$_status_job_status +
dbms_datapump.ku$_status_wip,-1,job_state,sts);
js := sts.job_status;
-- If the percentage done changed, display the new value.
if js.percent_done != percent_done
then
dbms_output.put_line('*** Job percent done = ' ||
to_char(js.percent_done));
percent_done := js.percent_done;
end if;
-- If any work-in-progress (WIP) or error messages were received for the job,
-- display them.
if (bitand(sts.mask,dbms_datapump.ku$_status_wip) != 0)
then
le := sts.wip;
else
if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
then
le := sts.error;
else
le := null;
end if;
end if;
if le is not null
then
ind := le.FIRST;
while ind is not null loop
dbms_output.put_line(le(ind).LogText);
ind := le.NEXT(ind);
end loop;
end if;
end loop;
-- Indicate that the job finished and detach from it.
dbms_output.put_line('Job has completed');
dbms_output.put_line('Final job state = ' || job_state);
dbms_datapump.detach(h1);
END;
The error:
ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 2926
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3162
ORA-06512: at line 20
Message was edited by: anutosh
I assume all the other dimensions are being specified via a load rule header (i.e. the rule is otherwise valid).
What is your data source? What does the number (data) format look like? Can you identify (and post) specific rows that are causing the error? -
DATA PUMP API returning ORA-31655
Hello Gurus,
I am using the code below to import a table (EMP) from one database to another over a network link.
set serveroutput on;
DECLARE
ind NUMBER; -- Loop index
spos NUMBER; -- String starting position
slen NUMBER; -- String length for output
h1 NUMBER; -- Data Pump job handle
percent_done NUMBER; -- Percentage of job complete
job_state VARCHAR2(30); -- To keep track of job state
le ku$_LogEntry; -- For WIP and error messages
js ku$_JobStatus; -- The job status from get_status
jd ku$_JobDesc; -- The job description from get_status
sts ku$_Status; -- The status object returned by get_status
BEGIN
h1 := DBMS_DATAPUMP.OPEN('IMPORT','TABLE','DBLINK',NULL,'LATEST');
DBMS_DATAPUMP.METADATA_FILTER(h1,'NAME_EXPR','IN (''SCOTT.EMP'')','TABLE');
DBMS_DATAPUMP.SET_PARAMETER(h1,'TABLE_EXISTS_ACTION','REPLACE');
DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_SCHEMA','SCOTT','SCOTT');
--DBMS_DATAPUMP.METADATA_FILTER(h1,'INCLUDE_PATH_LIST','like''TABLE''');
DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_TABLESPACE','USERS','USERS');
DBMS_DATAPUMP.SET_PARALLEL(h1,8);
begin
dbms_datapump.start_job(h1);
dbms_output.put_line('Data Pump job started successfully');
exception
when others then
if sqlcode = dbms_datapump.success_with_info_num
then
dbms_output.put_line('Data Pump job started with info available:');
dbms_datapump.get_status(h1,
dbms_datapump.ku$_status_job_error,0,
job_state,sts);
if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
then
le := sts.error;
if le is not null
then
ind := le.FIRST;
while ind is not null loop
dbms_output.put_line(le(ind).LogText);
ind := le.NEXT(ind);
end loop;
end if;
end if;
else
raise;
end if;
end;
-- The export job should now be running. In the following loop, we will monitor
-- the job until it completes. In the meantime, progress information is
-- displayed.
percent_done := 0;
job_state := 'UNDEFINED';
while (job_state != 'COMPLETED') and (job_state != 'STOPPED') loop
dbms_datapump.get_status(h1,
dbms_datapump.ku$_status_job_error +
dbms_datapump.ku$_status_job_status +
dbms_datapump.ku$_status_wip,-1,job_state,sts);
js := sts.job_status;
-- If the percentage done changed, display the new value.
if js.percent_done != percent_done
then
dbms_output.put_line('*** Job percent done = ' ||
to_char(js.percent_done));
percent_done := js.percent_done;
end if;
-- Display any work-in-progress (WIP) or error messages that were received for
-- the job.
if (bitand(sts.mask,dbms_datapump.ku$_status_wip) != 0)
then
le := sts.wip;
else
if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
then
le := sts.error;
else
le := null;
end if;
end if;
if le is not null
then
ind := le.FIRST;
while ind is not null loop
dbms_output.put_line(le(ind).LogText);
ind := le.NEXT(ind);
end loop;
end if;
end loop;
-- Indicate that the job finished and detach from it.
dbms_output.put_line('Job has completed');
dbms_output.put_line('Final job state = ' || job_state);
dbms_datapump.detach(h1);
-- Any exceptions that propagated to this point will be captured. The
-- details will be retrieved from get_status and displayed.
exception
when others then
dbms_output.put_line('Exception in Data Pump job');
dbms_datapump.get_status(h1,dbms_datapump.ku$_status_job_error,0,
job_state,sts);
if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
then
le := sts.error;
if le is not null
then
ind := le.FIRST;
while ind is not null loop
spos := 1;
slen := length(le(ind).LogText);
if slen > 255
then
slen := 255;
end if;
while slen > 0 loop
dbms_output.put_line(substr(le(ind).LogText,spos,slen));
spos := spos + 255;
slen := length(le(ind).LogText) + 1 - spos;
end loop;
ind := le.NEXT(ind);
end loop;
end if;
end if;
END;
When I run the same code with the mode changed to a schema-level import, it works fine.
But when I want to import a single table, it gives the error below:
Data Pump job started successfully
Starting "SCOTT"."SYS_IMPORT_TABLE_06":
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 0 KB
ORA-31655: no data or metadata objects selected for job
*** Job percent done = 100
Job "SCOTT"."SYS_IMPORT_TABLE_06" completed with 1 error(s) at 20:56:23
Job has completed
Final job state = COMPLETED
Can you give any suggestions on this error?
Thanks in advance
Did you check this link?
http://arjudba.blogspot.com/2009/01/ora-31655-no-data-or-metadata-objects.html
Does it provide any help?
Regards.
Satyaki De. -
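One thing worth checking against the code above: in TABLE mode, NAME_EXPR is matched against unqualified table names, so 'IN (''SCOTT.EMP'')' matches nothing and can produce exactly "ORA-31655: no data or metadata objects selected for job". The schema belongs in a separate SCHEMA_EXPR filter. A sketch of the two filters (based on the DBMS_DATAPUMP documentation, not tested against this exact job):

```sql
-- Filter schema and table separately; NAME_EXPR sees table names only.
DBMS_DATAPUMP.METADATA_FILTER(h1, 'SCHEMA_EXPR', 'IN (''SCOTT'')');
DBMS_DATAPUMP.METADATA_FILTER(h1, 'NAME_EXPR', 'IN (''EMP'')', 'TABLE');
```

That would also explain why the schema-level run works: SCHEMA mode does not rely on the qualified NAME_EXPR at all.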
ORA-39126 during an export of a partition via dbms_datapump
Hi ,
I did the export using Data Pump on the command line and everything went fine, but while exporting via dbms_datapump I got this:
ORA-39126 during an export of a partition via dbms_datapump
ORA-00920
'SELECT FROM DUAL WHERE :1' P20060401
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPW$WORKER", line 6228
the procedure is:
PROCEDURE pr_depura_bitacora
IS
  l_job_handle NUMBER;
  l_job_state  VARCHAR2(30);
  l_partition  VARCHAR2(30);
  v_sql        VARCHAR2(2000);
BEGIN
  -- Create a user-named Data Pump job to do a "table:partition-level" export
  -- Local
  select 'P' || to_char((select min(STP_LOG_DATE) from SAI_AUDITBITACORA),'YYYYMM') || '01'
    into l_partition
    from user_tab_partitions
   where table_name = 'SAI_AUDITBITACORA'
     and rownum = 1;
  l_partition := rtrim(l_partition, ' ');
  l_job_handle := DBMS_DATAPUMP.OPEN(
    operation => 'EXPORT',
    job_mode  => 'TABLE',
    job_name  => 'EXPORT_ORACLENSSA');
  -- Schema filter
  DBMS_DATAPUMP.METADATA_FILTER(
    handle => l_job_handle,
    name   => 'SCHEMA_EXPR',
    value  => 'IN (''ORACLENSSA'')');
  DBMS_OUTPUT.PUT_LINE('Added filter for schema list');
  -- Table filter
  DBMS_DATAPUMP.METADATA_FILTER(
    handle => l_job_handle,
    name   => 'NAME_EXPR',
    value  => '=''SAI_AUDITBITACORA''');
  DBMS_OUTPUT.PUT_LINE('Added filter for table expression');
  -- Partition filter
  DBMS_DATAPUMP.DATA_FILTER(
    handle     => l_job_handle,
    name       => 'PARTITION_EXPR',
    value      => l_partition,
    table_name => 'SAI_AUDITBITACORA');
  DBMS_OUTPUT.PUT_LINE('Partition filter for schema list');
  DBMS_DATAPUMP.ADD_FILE(
    handle    => l_job_handle,
    filename  => 'EXP' || l_partition || '.DMP',
    directory => 'EXP_DATA_PUMP',
    filetype  => 1);
  DBMS_DATAPUMP.ADD_FILE(
    handle    => l_job_handle,
    filename  => 'EXP' || l_partition || '.LOG',
    directory => 'EXP_DATA_PUMP',
    filetype  => 3);
  DBMS_DATAPUMP.START_JOB(
    handle       => l_job_handle,
    skip_current => 0);
  DBMS_DATAPUMP.WAIT_FOR_JOB(
    handle    => l_job_handle,
    job_state => l_job_state);
  DBMS_OUTPUT.PUT_LINE('Job completed - job state = ' || l_job_state);
  DBMS_DATAPUMP.DETACH(handle => l_job_handle);
END;
I've already dropped and recreated the directory; granted read and write to PUBLIC and to the user; granted create session, create table, create procedure and exp_full_database to the user; restarted the database and the listener with LD_LIBRARY_PATH pointing first to $ORACLE_HOME/lib; and added more space to the temporary tablespace.
The basic problem is:
Error: ORA-920
Text: invalid relational operator
Cause: A search condition was entered with an invalid or missing relational operator.
Action: Include a valid relational operator such as =, !=, ^=, <>, >, <, >=, <=, ALL, ANY, [NOT] BETWEEN, EXISTS, [NOT] IN, IS [NOT] NULL, or [NOT] LIKE in the condition.
Obviously this refers to the invalid statement 'SELECT FROM DUAL ...'. I also recommend that you contact Oracle Support, because it happens inside an Oracle-provided package.
Werner -
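Concretely, the reported fragment 'SELECT FROM DUAL WHERE :1' P20060401 suggests the PARTITION_EXPR data filter received a bare partition name where a relational expression was expected. If that is the cause, passing a real expression may fix it (a sketch against the procedure above, not verified on this exact version):

```sql
-- PARTITION_EXPR takes an expression such as "IN ('P20060401')" or
-- "= 'P20060401'", not just the partition name itself.
DBMS_DATAPUMP.DATA_FILTER(
  handle     => l_job_handle,
  name       => 'PARTITION_EXPR',
  value      => 'IN (''' || l_partition || ''')',
  table_name => 'SAI_AUDITBITACORA');
```

That would also explain why the command-line export worked: expdp builds the partition expression itself from TABLES=table:partition.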
Data Pump issue with Oracle 10g on Windows 2003
Hi Experts,
I am trying to run Data Pump on Oracle 10g on a Windows 2003 server.
I got an error:
D:\>cd D:\oracle\product\10.2.0\SALE\BIN
D:\oracle\product\10.2.0\SALE\BIN>expdp system/xxxxl@sale full=Y directory=dumpdir dumpfile=expdp_sale_20090302.dmp logfile=exp_sale_20090302.log
Export: Release 10.2.0.4.0 - Production on Tuesday, 03 March, 2009 8:05:50
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-31626: job does not exist
ORA-31650: timeout waiting for master process response
However, I can run the old exp utility and it works well.
What is wrong with my Data Pump?
Thanks
JIM
Hi Anand,
I did not see any error log at that time. Actually, it did not work any more. I will test it again, per your email, after the exp is done.
Based on new testing, I got the errors below:
ORA-39014: One or more workers have prematurely exited.
ORA-39029: worker 1 with process name "DW01" prematurely terminated
ORA-31671: Worker process DW01 had an unhandled exception.
ORA-04030: out of process memory when trying to allocate 4108 bytes (PLS non-lib hp,pdzgM60_Make)
ORA-06512: at "SYS.KUPC$QUEUE_INT", line 277
ORA-06512: at "SYS.KUPW$WORKER", line 1366
ORA-04030: out of process memory when trying to allocate 65036 bytes (callheap,KQL tmpbuf)
ORA-06508: PL/SQL: could not find program unit being called: "SYS.KUPC$_WORKERERROR"
ORA-06512: at "SYS.KUPW$WORKER", line 13360
ORA-06512: at "SYS.KUPW$WORKER", line 15039
ORA-06512: at "SYS.KUPW$WORKER", line 6372
ORA-39125: Worker unexpected fatal error in KUPW$WORKER.DISPATCH_WORK_ITEMS while calling DBMS_METADATA.FETCH_XML_CLOB [PROCOBJ:"SALE"."SQLSCRIPT_2478179"]
ORA-06512: at "SYS.KUPW$WORKER", line 7078
ORA-04030: out of process memory when trying to allocate 4108 bytes (PLS non-lib hp,pdzgM60_Make)
ORA-06500: PL/SQL: storage error
ORA-04030: out of process memory when trying to allocate 16396 bytes (koh-kghu sessi,pmucpcon: tds)
ORA-04030: out of process memory when trying to allocate 16396 bytes (koh-kghu sessi,pmucalm coll)
Job "SYSTEM"."SYS_EXPORT_FULL_01" stopped due to fatal error at 14:41:36
ORA-39014: One or more workers have prematurely exited.
the trace file as
*** 2009-03-03 14:20:41.500
*** ACTION NAME:() 2009-03-03 14:20:41.328
*** MODULE NAME:(oradim.exe) 2009-03-03 14:20:41.328
*** SERVICE NAME:() 2009-03-03 14:20:41.328
*** SESSION ID:(159.1) 2009-03-03 14:20:41.328
Successfully allocated 7 recovery slaves
Using 157 overflow buffers per recovery slave
Thread 1 checkpoint: logseq 12911, block 2, scn 7355467494724
cache-low rba: logseq 12911, block 251154
on-disk rba: logseq 12912, block 221351, scn 7355467496281
start recovery at logseq 12911, block 251154, scn 0
----- Redo read statistics for thread 1 -----
Read rate (ASYNC): 185319Kb in 1.73s => 104.61 Mb/sec
Total physical reads: 189333Kb
Longest record: 5Kb, moves: 0/448987 (0%)
Change moves: 1378/5737 (24%), moved: 0Mb
Longest LWN: 1032Kb, moves: 45/269 (16%), moved: 41Mb
Last redo scn: 0x06b0.9406fb58 (7355467496280)
----- Recovery Hash Table Statistics ---------
Hash table buckets = 32768
Longest hash chain = 3
Average hash chain = 35384/25746 = 1.4
Max compares per lookup = 3
Avg compares per lookup = 847056/876618 = 1.0
*** 2009-03-03 14:20:46.062
KCRA: start recovery claims for 35384 data blocks
*** 2009-03-03 14:21:02.171
KCRA: blocks processed = 35384/35384, claimed = 35384, eliminated = 0
*** 2009-03-03 14:21:02.531
Recovery of Online Redo Log: Thread 1 Group 2 Seq 12911 Reading mem 0
*** 2009-03-03 14:21:04.718
Recovery of Online Redo Log: Thread 1 Group 1 Seq 12912 Reading mem 0
*** 2009-03-03 14:21:16.296
----- Recovery Hash Table Statistics ---------
Hash table buckets = 32768
Longest hash chain = 3
Average hash chain = 35384/25746 = 1.4
Max compares per lookup = 3
Avg compares per lookup = 849220/841000 = 1.0
*** 2009-03-03 14:21:28.468
tkcrrsarc: (WARN) Failed to find ARCH for message (message:0x1)
tkcrrpa: (WARN) Failed initial attempt to send ARCH message (message:0x1)
*** 2009-03-03 14:26:25.781
kwqmnich: current time:: 14: 26: 25
kwqmnich: instance no 0 check_only flag 1
kwqmnich: initialized job cache structure
ktsmgtur(): TUR was not tuned for 360 secs
Windows Server 2003 Version V5.2 Service Pack 2
CPU : 8 - type 586, 4 Physical Cores
Process Affinity : 0x00000000
Memory (Avail/Total): Ph:7447M/8185M, Ph+PgF:6833M/9984M, VA:385M/3071M
Instance name: vmsdbsea
Redo thread mounted by this instance: 0 <none>
Oracle process number: 0
Windows thread id: 2460, image: ORACLE.EXE (SHAD)
Dynamic strand is set to TRUE
Running with 2 shared and 18 private strand(s). Zero-copy redo is FALSE
*** 2009-03-03 08:06:51.921
*** ACTION NAME:() 2009-03-03 08:06:51.905
*** MODULE NAME:(expdp.exe) 2009-03-03 08:06:51.905
*** SERVICE NAME:(xxxxxxxxxx) 2009-03-03 08:06:51.905
*** SESSION ID:(118.53238) 2009-03-03 08:06:51.905
SHDW: Failure to establish initial communication with MCP
SHDW: Deleting Data Pump job infrastructure
Is it a system memory issue for Data Pump? My exp works well.
How do I fix this issue?
JIM
Edited by: user589812 on Mar 3, 2009 5:07 PM
Edited by: user589812 on Mar 3, 2009 5:22 PM -
Data pump Operation = 'SQL_FILE'
Hello! I need to export metadata from one schema to a SQL file.
DECLARE
DPJob NUMBER;
BEGIN
DPJob := DBMS_DataPump.open (Operation => 'SQL_FILE', Job_Mode => 'SCHEMA',
Job_Name => 'DPEXPJOB4');
DBMS_DataPump.add_File (Handle => DPJob,
FileName => 'DDL.SQL', Directory => 'DATA_PUMP_DIR',
FileType => DBMS_DataPump.KU$_File_Type_SQL_File);
DBMS_DataPump.metadata_Filter (Handle => DPJob,
Name => 'SCHEMA_EXPR', Value => 'IN (''SCOTT'') ');
DBMS_DataPump.start_Job (Handle => DPJob);
END;
But I get an error when I try to execute this script:
invalid operation
Cause: The current API cannot be executed because of inconsistencies between the API and the current definition of the job. Subsequent messages supplied by DBMS_DATAPUMP.GET_STATUS will further describe the error.
Action: Modify the API call to be consistent with the current job or redefine the job in a manner that will support the specified API.
Status of the Data Pump job:
select * from dba_datapump_jobs;
6 SYS DPEXPJOB4 SQL_FILE SCHEMA DEFINING 1 1 2
How can I get a SQL file for my schema?
Edited by: alligator on 11.11.2009 9:38
I connected as sysdba and it works; very happy ;)
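For reference, a SQL_FILE operation reads an existing dump file and writes the DDL it contains; besides the right privileges, it needs that dump file registered as an input in addition to the SQL output file. A sketch (assumes a dump file named export.dmp already exists in DATA_PUMP_DIR):

```sql
DECLARE
  DPJob NUMBER;
BEGIN
  DPJob := DBMS_DataPump.open(Operation => 'SQL_FILE', Job_Mode => 'SCHEMA',
                              Job_Name  => 'DPSQLJOB1');
  -- Input: a dump file produced by a previous export (assumed to exist).
  DBMS_DataPump.add_file(Handle => DPJob, FileName => 'export.dmp',
                         Directory => 'DATA_PUMP_DIR',
                         FileType  => DBMS_DataPump.KU$_File_Type_Dump_File);
  -- Output: the generated DDL script.
  DBMS_DataPump.add_file(Handle => DPJob, FileName => 'DDL.SQL',
                         Directory => 'DATA_PUMP_DIR',
                         FileType  => DBMS_DataPump.KU$_File_Type_SQL_File);
  DBMS_DataPump.start_job(Handle => DPJob);
END;
/
```

The script in the question only added the SQL output file, which is consistent with the "invalid operation" error it raised.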