DBMS_DATAPUMP.OPEN
Hi,
I am a newbie with PL/SQL & Data Pump!
I tried to create (OPEN) an import job with:
CREATE OR REPLACE PROCEDURE XX IS
BEGIN
BEGIN
DECLARE
HANDLE1 NUMBER;
BEGIN
HANDLE1 := DBMS_DATAPUMP.OPEN(OPERATION => 'IMPORT',
JOB_MODE => 'SCHEMA',
REMOTE_LINK => 'STRM');
DBMS_OUTPUT.PUT_LINE('HANDLE1 :' || HANDLE1);
EXCEPTION
WHEN OTHERS THEN
DBMS_OUTPUT.PUT_LINE('EX_HANDLE1 :' || HANDLE1);
DBMS_OUTPUT.PUT_LINE('Import Error 1 :' || SQLERRM(SQLCODE));
END;
END;
END XX;
and I always got the following error after execution:
SQL> execute xx;
EX_HANDLE1 :
Import Error 1 :ORA-31626: Job ist nicht vorhanden (job does not exist)
PL/SQL procedure successfully completed
SQL>
And the handle variable is blank!
What did I do wrong?
Please help!
hqt200475
Edited by: hqt200475 on Mar 29, 2011 3:16 AM
Hi Saubhik,
I create the code like this once more:
CREATE OR REPLACE PROCEDURE XX IS
BEGIN
BEGIN
DECLARE
HANDLE1 NUMBER;
BEGIN
HANDLE1 := DBMS_DATAPUMP.OPEN(OPERATION => 'IMPORT',
JOB_MODE => 'SCHEMA',
REMOTE_LINK => 'STRM');
DBMS_OUTPUT.PUT_LINE('HANDLE1 :' || HANDLE1);
DBMS_OUTPUT.PUT_LINE('something stupid ');
EXCEPTION
WHEN OTHERS THEN
DBMS_OUTPUT.PUT_LINE('EX_HANDLE1 :' || HANDLE1);
DBMS_OUTPUT.PUT_LINE('Import Error 1 :' || SQLERRM(SQLCODE));
END;
END;
END XX;
It works in SQL*Plus:
SQL> execute strmadmin_tgt.xx;
HANDLE1 :7
something stupid
PL/SQL procedure successfully completed.
SQL>
but in PL/SQL Developer I get the following output:
SQL> set serveroutput on;
SQL> execute strmadmin_tgt.xx;
EX_HANDLE1 :
Import Error 1 :ORA-31626: Job ist nicht vorhanden (job does not exist)
PL/SQL procedure successfully completed
SQL>
I think that's a PL/SQL Developer problem.
Thanks and regards
hqt200475
Edited by: hqt200475 on Mar 29, 2011 4:11 AM
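For anyone hitting the same ORA-31626 from OPEN: one common cause (an assumption here, not confirmed in this thread) is a leftover master table from a previous failed job. A quick check, run as the job owner:

```sql
-- Check for orphaned Data Pump jobs (run as the job owner).
SELECT job_name, state, attached_sessions
FROM   user_datapump_jobs;

-- A job shown as NOT RUNNING with no attached sessions can usually be
-- cleaned up by dropping its master table (job name below is hypothetical):
-- DROP TABLE "SYS_IMPORT_SCHEMA_01" PURGE;
```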
Similar Messages
-
ORA-39001: invalid argument value dbms_datapump.open
Hello,
I'm trying to import objects in a tablespace from a remote database.
The link source.de is working, but the dbms_datapump.open statements fails with below errors
SQL> var h number;
SQL> exec :h := dbms_datapump.open (operation => 'IMPORT',job_mode => 'TABLESPACE', remote_link => 'source.de');
BEGIN :h := dbms_datapump.open (operation => 'IMPORT',job_mode => 'TABLESPACE', remote_link => 'source.de'); END;
ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 2953
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4603
ORA-06512: at line 1
When I describe dbms_datapump.open:
FUNCTION OPEN RETURNS NUMBER
Argument Name Type In/Out Default?
OPERATION VARCHAR2 IN
JOB_MODE VARCHAR2 IN
REMOTE_LINK VARCHAR2 IN DEFAULT
JOB_NAME VARCHAR2 IN DEFAULT
VERSION VARCHAR2 IN DEFAULT
COMPRESSION NUMBER IN DEFAULT
Anyone familier with this problem?
Regards,
Tim
ORA-39001: invalid argument value could be caused by the db link (it is not visible to the owner of the code, i.e. to the schema that has the rights to run the code).
You can take a look at:
http://www.oracle-home.ro/Oracle_Database/10g_New_Features/Schema_refresh_DP.html -
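As a sketch of the suggestion above (link, account, and TNS names are assumptions), the database link must be visible to the schema that executes the Data Pump code:

```sql
-- Create the link in the schema that runs the Data Pump code
-- (or create a PUBLIC link instead).
CREATE DATABASE LINK source.de
  CONNECT TO remote_user IDENTIFIED BY remote_pwd
  USING 'SOURCE_TNS_ALIAS';

-- Verify the link is visible to the current user:
SELECT owner, db_link FROM all_db_links WHERE db_link LIKE 'SOURCE%';
```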
Trouble with Dbms_DataPump.Open
Hello everyone, I always get an "ORA-31626: job does not exist" when I try this simple code:
DECLARE
hand NUMBER;
BEGIN
hand := Dbms_DataPump.Open('IMPORT', 'FULL',NULL,'marcotte_imp');
END;
I'm trying with a user that has all the privileges checked in the Privileges pane of the Users page.
What exactly am I missing?
Thx!
Message was edited by:
artois_v
Also found these on Metalink:
Note:262557.1 -> You attempt to attach to a job that already completed or was aborted: the master table containing the list of objects to be exported has therefore already been dropped.
Note:272492.1 -> The size of the undo tablespace is not sufficient enough to complete this task.
Note:315488.1 -> No CREATE TABLE privilege explicitly granted to exporting schema. Solutions:
1) Explicitly grant privileges, e.g.:
grant create session, create table, create procedure, exp_full_database,
imp_full_database to testuser;
grant read, write on directory my_dump_dir to <users>;
2) Make sure the GRANT is done as a direct grant and not through a ROLE. -
AutoCommit after DBMS_DATAPUMP.open
Hi, is there any way to avoid the implicit commit after DBMS_DATAPUMP.open? I want to roll back changes if the export fails, but after DBMS_DATAPUMP.open the changes are committed, even if the export fails...
plsql code:
INSERT INTO courier(courier_id, fname) VALUES (1001, 'myname');
v_handle := DBMS_DATAPUMP.open(
operation => 'EXPORT',
job_mode => 'TABLE',
remote_link => NULL,
job_name => v_jobname,
version => 'COMPATIBLE');
rollback;
This is not the code i am using, just easiest way to invoke the problem.
Oracle version: Oracle Database 10g Release 10.2.0.1.0
You need autonomous transactions, see http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14251/adfns_sqlproc.htm#sthref306
>
At times, you may want to commit or roll back some changes to a table independently of a primary transaction's final outcome. For example, in a stock purchase transaction, you may want to commit a customer's information regardless of whether the overall stock purchase actually goes through. Or, while running that same transaction, you may want to log error messages to a debug table even if the overall transaction rolls back. Autonomous transactions allow you to do such tasks.
An autonomous transaction (AT) is an independent transaction started by another transaction, the main transaction (MT). It lets you suspend the main transaction, do SQL operations, commit or roll back those operations, then resume the main transaction.
>
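Applied to the posted snippet, a minimal sketch (procedure and job names are hypothetical): wrapping the Data Pump call in an autonomous transaction keeps its implicit commit away from the caller's DML.

```sql
CREATE OR REPLACE PROCEDURE run_export(p_job_name IN VARCHAR2) IS
  PRAGMA AUTONOMOUS_TRANSACTION;  -- isolates the implicit commit
  v_handle NUMBER;
BEGIN
  v_handle := DBMS_DATAPUMP.open(
                operation => 'EXPORT',
                job_mode  => 'TABLE',
                job_name  => p_job_name);
  -- ... ADD_FILE / filters / START_JOB as usual ...
  DBMS_DATAPUMP.detach(v_handle);
  COMMIT;  -- ends only the autonomous transaction
END;
/

-- Caller: the INSERT can still be rolled back even if the export fails.
-- INSERT INTO courier(courier_id, fname) VALUES (1001, 'myname');
-- run_export('MY_EXPORT_JOB');
-- ROLLBACK;
```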
HTH
Enrique -
ORA-06512 when using DBMS_DATAPUMP.OPEN
Hi,
we're using 10.2.0.3 SE1 on 32 Bit XP Prof. with SP2 and are trying to use dbms_datapump over the network.
When we call the OPEN procedure within a Package we get a
ORA-31626: job does not exist
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.KUPV$FT_INT", line 430
ORA-31638: cannot attach to job SNAPSHOT_JOB for user PHX_SYNC_DZ
ORA-31632: master table "PHX_SYNC_DZ.SNAPSHOT_JOB" not found, invalid, or inaccessible;
ORA-00942: table or view not found
When we use an anonymous PL/SQL block, the OPEN works.
The User phx_sync_dz has been granted imp_full_database and exp_full_database.
Does anyone know a solution for that?
Dim
I found the solution. DDL rights had been granted via roles, so the user simply did not have the privileges to create the master table.
An explicit GRANT CREATE TABLE was enough, and it worked.
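That fix as a sketch (user name from the post; the tablespace quota is an assumption). The grants must be made directly, because role-granted privileges are disabled inside definer's-rights PL/SQL:

```sql
-- Direct grant, visible inside named PL/SQL units:
GRANT CREATE TABLE TO phx_sync_dz;
-- The master table also needs quota (tablespace name assumed):
ALTER USER phx_sync_dz QUOTA UNLIMITED ON users;
```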
Dim -
ORA-39126 during an export of a partition via dbms_datapump
Hi ,
I did the export using Data Pump on the command line and everything went fine, but while exporting via dbms_datapump I got this:
ORA-39126 during an export of a partition via dbms_datapump
ORA-00920
'SELECT FROM DUAL WHERE :1' P20060401
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPW$WORKER", line 6228
the procedure is:
PROCEDURE pr_depura_bitacora
IS
l_job_handle NUMBER;
l_job_state VARCHAR2(30);
l_partition VARCHAR2(30);
v_sql VARCHAR2(2000);
BEGIN
-- Create a user-named Data Pump job to do a "table:partition-level" export
-- Local
select 'P'|| to_char((select min(STP_LOG_DATE) from SAI_AUDITBITACORA),'YYYYMM')||'01'
into l_partition
from user_tab_partitions
where table_name = 'SAI_AUDITBITACORA'
and rownum = 1;
l_partition := rtrim(l_partition,' ');
l_job_handle := DBMS_DATAPUMP.OPEN(
operation => 'EXPORT',
job_mode => 'TABLE',
job_name => 'EXPORT_ORACLENSSA');
-- Schema filter
DBMS_DATAPUMP.METADATA_FILTER(
handle => l_job_handle,
name => 'SCHEMA_EXPR',
value => 'IN (''ORACLENSSA'')');
DBMS_OUTPUT.PUT_LINE('Added filter for schema list');
-- Table filter
DBMS_DATAPUMP.METADATA_FILTER(
handle => l_job_handle,
name => 'NAME_EXPR',
value => '=''SAI_AUDITBITACORA''');
DBMS_OUTPUT.PUT_LINE('Added filter for table expression');
-- Partition filter
DBMS_DATAPUMP.DATA_FILTER(
handle => l_job_handle,
name => 'PARTITION_EXPR',
value => l_partition,
table_name => 'SAI_AUDITBITACORA');
DBMS_OUTPUT.PUT_LINE('Partition filter for schema list');
DBMS_DATAPUMP.ADD_FILE(
handle => l_job_handle,
filename => 'EXP'||l_partition||'.DMP',
directory => 'EXP_DATA_PUMP',
filetype => 1);
DBMS_DATAPUMP.ADD_FILE(
handle => l_job_handle,
filename => 'EXP'||l_partition||'.LOG',
directory => 'EXP_DATA_PUMP',
filetype => 3);
DBMS_DATAPUMP.START_JOB(
handle => l_job_handle,
skip_current => 0);
DBMS_DATAPUMP.WAIT_FOR_JOB(
handle => l_job_handle,
job_state => l_job_state);
DBMS_OUTPUT.PUT_LINE('Job completed - job state = '||l_job_state);
DBMS_DATAPUMP.DETACH(handle => l_job_handle);
END;
I've already dropped and recreated the directory, granted read and write to public and to the user, granted create session, create table, create procedure and exp_full_database to the user, restarted the database and the listener with LD_LIBRARY pointing first to $ORACLE_HOME/lib, and added more space to the temporary tablespace.
The basic problem is:
Error: ORA 920
Text: invalid relational operator
Cause: A search condition was entered with an invalid or missing relational
operator.
Action: Include a valid relational operator such as =, !=, ^=, <>, >, <, >=, <=
, ALL, ANY, [NOT] BETWEEN, EXISTS, [NOT] IN, IS [NOT] NULL, or [NOT]
LIKE in the condition.
Obviously this refers to the invalid statement 'SELECT FROM DUAL ...'. I also recommend contacting Oracle Support, because the error happens inside an Oracle-provided package.
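One possible cause worth checking (an assumption, not confirmed in this thread): the PARTITION_EXPR data filter expects a predicate rather than a bare partition name, which would explain the generated 'SELECT FROM DUAL WHERE :1' fragment failing with ORA-00920 (invalid relational operator). A hedged sketch of the call:

```sql
-- Pass a predicate, not a bare name, when using PARTITION_EXPR:
DBMS_DATAPUMP.DATA_FILTER(
  handle     => l_job_handle,
  name       => 'PARTITION_EXPR',
  value      => 'IN (''' || l_partition || ''')',
  table_name => 'SAI_AUDITBITACORA');
```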
Werner -
No log on DBMS_DATAPUMP schedule on Grid Control
Hello,
I'm trying to schedule my own procedure to make a Data Pump full export of the instance with OEM Grid Control.
It works, but there is no log in the job details:
create or replace
PROCEDURE pr_expdp_full AS
-- Procedure to perform a FULL export of the instance
-- with Data Pump.
-- Declare the directory object in the instance first:
-- CREATE DIRECTORY "DATAPUMP_DIR" AS '<my network path>'
JobHandle NUMBER; -- Data Pump job handle
JobStamp VARCHAR2(13); -- Date time stamp
InstanceName varchar2(30); -- Name of instance
ind NUMBER; -- Loop index
n_Exist NUMBER; -- count Job
JobHandle_Exist NUMBER; --JobHandle exist
percent_done NUMBER; -- Percentage of job complete
job_state VARCHAR2(30); -- To keep track of job state
le ku$_LogEntry; -- For WIP and error messages
js ku$_JobStatus; -- The job status from get_status
jd ku$_JobDesc; -- The job description from get_status
sts ku$_Status; -- The status object returned by get_status
BEGIN
-- TimeStamp File with system date
select to_char(SYSDATE,'DDMMRRRR-HH24MI') into JobStamp from dual ;
-- Instance Name to export
select rtrim(global_name) into InstanceName from global_name;
--Delete Job if exist
select count(*) into n_Exist from user_datapump_jobs where job_name = 'DAILY_EXPDP_'||InstanceName;
IF n_Exist > 0
THEN
JobHandle_Exist := DBMS_DATAPUMP.ATTACH('DAILY_EXPDP_'||InstanceName,'SYSTEM');
dbms_datapump.stop_job(JobHandle_Exist);
DBMS_DATAPUMP.DETACH (JobHandle_Exist);
execute immediate('DROP TABLE DAILY_EXPDP_'||InstanceName||'');
END IF;
-- Create a (user-named) Data Pump job to do a schema export.
JobHandle :=
DBMS_DATAPUMP.OPEN(
operation => 'EXPORT'
,job_mode => 'FULL'
,job_name => 'DAILY_EXPDP_'||InstanceName
,version => 'COMPATIBLE'
);
dbms_output.put_line('after OPEN');
-- Specify a single dump file for the job (using the handle just returned)
-- and a directory object, which must already be defined and accessible
-- to the user running this procedure.
DBMS_DATAPUMP.ADD_FILE(
handle => JobHandle
,filename => 'FULL_EXPDP_'||InstanceName||'_'||JobStamp||'.dpf'
,directory => 'DATAPUMP_DIR'
,filetype => 1 );
dbms_datapump.set_parameter(handle => JobHandle, name => 'KEEP_MASTER', value => 0);
-- Specify a single log file for the job
DBMS_DATAPUMP.ADD_FILE(
handle => JobHandle
,filename => 'FULL_EXPDP_'||InstanceName||'_'||JobStamp||'.log'
,directory => 'DATAPUMP_DIR'
,filetype => 3 );
dbms_datapump.set_parameter(handle => JobHandle, name => 'INCLUDE_METADATA', value => 1);
dbms_datapump.set_parameter(handle => JobHandle, name => 'DATA_ACCESS_METHOD', value => 'AUTOMATIC');
-- Start the job. An exception will be generated if something is not set up
-- properly.
DBMS_DATAPUMP.START_JOB(JobHandle);
-- The export job should now be running. In the following loop, the job
-- is monitored until it completes. In the meantime, progress information is
-- displayed.
percent_done := 0;
job_state := 'UNDEFINED';
while (job_state != 'COMPLETED') and (job_state != 'STOPPED') loop
dbms_datapump.get_status(JobHandle, dbms_datapump.ku$_status_job_error + dbms_datapump.ku$_status_job_status + dbms_datapump.ku$_status_wip, -1, job_state, sts);
js := sts.job_status;
-- If the percentage done changed, display the new value.
if js.percent_done != percent_done
then
dbms_output.put_line('*** Job percent done = ' ||
to_char(js.percent_done));
percent_done := js.percent_done;
end if;
-- If any work-in-progress (WIP) or error messages were received for the job,
-- display them.
if (bitand(sts.mask,dbms_datapump.ku$_status_wip) != 0)
then
le := sts.wip;
else
if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
then
le := sts.error;
else
le := null;
end if;
end if;
if le is not null
then
ind := le.FIRST;
while ind is not null loop
dbms_output.put_line(le(ind).LogText);
ind := le.NEXT(ind);
end loop;
end if;
end loop;
-- Indicate that the job finished and detach from it.
dbms_output.put_line('Job has completed');
dbms_output.put_line('Final job state = ' || job_state);
dbms_datapump.detach(JobHandle);
END pr_expdp_full;
I schedule it afterwards with:
BEGIN
SYSTEM.pr_expdp_full();
END;
But when I look at the results of the execution in the job details, I can't see anything.
When I run the same job using the direct OEM link "Export Data", after submission I can see the Data Pump log in the job log details.
Can you tell me what is missing in the submission, or what I can do to correct that?
Thanks in advance
Best regards. -
Error in stored procedure while using dbms_datapump for transportable
Hi,
I'm facing the following issue:
SQL> select * from v$version;
BANNER
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
TNS for Solaris: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production
====================================================================================
I'm trying to do a transportable tablespace export through a stored procedure with the help of DBMS_DATAPUMP. Following is the code:
==================================================================================
create or replace
procedure sp_tts_export(v_tbs_name varchar2) as
idx NUMBER; -- Loop index
JobHandle NUMBER; -- Data Pump job handle
PctComplete NUMBER; -- Percentage of job complete
JobState VARCHAR2(30); -- To keep track of job state
LogEntry ku$_LogEntry; -- For WIP and error messages
JobStatus ku$_JobStatus; -- The job status from get_status
Status ku$_Status; -- The status object returned by get_status
dts varchar2(140):=to_char(sysdate,'YYYYMMDDHH24MISS');
exp_dump_file varchar2(500):=v_tbs_name||'_tts_export_'||dts||'.dmp';
exp_log_file varchar2(500):=v_tbs_name||'_tts_export_'||dts||'.log';
exp_job_name varchar2(500):=v_tbs_name||'_tts_export_'||dts;
dp_dir varchar2(500):='DATA_PUMP_DIR';
log_file UTL_FILE.FILE_TYPE;
log_filename varchar2(500):=exp_job_name||'_main'||'.log';
err_log_file UTL_FILE.FILE_TYPE;
v_db_name varchar2(1000);
v_username varchar2(30);
t_dir_name VARCHAR2(4000);
t_file_name VARCHAR2(4000);
t_sep_pos NUMBER;
t_dir varchar2(30):='temp_0123456789';
v_sqlerrm varchar2(4000);
stmt varchar2(4000);
FUNCTION get_file(filename VARCHAR2, dir VARCHAR2 := 'TEMP')
RETURN VARCHAR2 IS
contents VARCHAR2(32767);
file BFILE := BFILENAME(dir, filename);
BEGIN
DBMS_LOB.FILEOPEN(file, DBMS_LOB.FILE_READONLY);
contents := UTL_RAW.CAST_TO_VARCHAR2(
DBMS_LOB.SUBSTR(file));
DBMS_LOB.CLOSE(file);
RETURN contents;
END;
begin
--execute immediate ('drop tablespace test including contents and datafiles');
--execute immediate ('create tablespace test datafile ''/home/smishr02/test.dbf'' size 10m');
--execute immediate ('create table prestg.test_table (a number) tablespace test');
--execute immediate ('insert into prestg.test_table values (1)');
--commit;
--execute immediate ('alter tablespace test read only');
--dbms_output.put_line('11111111111111111111');
dbms_output.put_line(log_filename||'>>>>>>>>>>>>>>>>>>>>>>>>>>>'|| dp_dir);
log_file:=UTL_FILE.FOPEN (dp_dir, log_filename, 'w');
UTL_FILE.PUT_LINE(log_file,'#####################################################################');
UTL_FILE.PUT_LINE(log_file,'REPORT: GENERATED ON ' || SYSDATE);
UTL_FILE.PUT_LINE(log_file,'#####################################################################');
select global_name,user into v_db_name,v_username from global_name;
UTL_FILE.PUT_LINE(log_file,'Database:'||v_db_name);
UTL_FILE.PUT_LINE(log_file,'user running the job:'||v_username);
UTL_FILE.PUT_LINE(log_file,'for tablespace:'||v_tbs_name);
UTL_FILE.NEW_LINE (log_file);
stmt:='ALTER TABLESPACE '||v_tbs_name || ' read only';
dbms_output.put_line('11111111111111111111'||stmt);
execute immediate (stmt);
UTL_FILE.PUT_LINE(log_file,' '||v_tbs_name || ' altered to read only mode.');
UTL_FILE.NEW_LINE (log_file);
UTL_FILE.PUT_LINE(log_file,'#####################################################################');
UTL_FILE.NEW_LINE (log_file);
UTL_FILE.PUT_LINE(log_file,' Initiating the Datapump engine for TTS export..............');
UTL_FILE.NEW_LINE (log_file);
dbms_output.put_line('11111111111111111111');
JobHandle :=
DBMS_DATAPUMP.OPEN(
operation => 'EXPORT'
,job_mode => 'TRANSPORTABLE'
,remote_link => NULL
,job_name => NULL
--,job_name => exp_job_name
-- ,version => 'LATEST'
);
UTL_FILE.PUT_LINE(log_file,'Done');
UTL_FILE.NEW_LINE (log_file);
UTL_FILE.PUT_LINE(log_file,' Allocating dumpfile................');
DBMS_DATAPUMP.ADD_FILE(
handle => JobHandle
,filename => exp_dump_file
,directory => dp_dir
,filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE
-- ,filesize => '100M'
);
UTL_FILE.PUT_LINE(log_file,'Done');
UTL_FILE.NEW_LINE (log_file);
UTL_FILE.PUT_LINE(log_file,' Allocating logfile................');
DBMS_DATAPUMP.ADD_FILE(
handle => JobHandle
,filename => exp_log_file
,directory => dp_dir
,filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE
);
UTL_FILE.PUT_LINE(log_file,'Done');
UTL_FILE.NEW_LINE (log_file);
UTL_FILE.PUT_LINE(log_file,' Setting attributes................');
DBMS_DATAPUMP.set_parameter(handle => JobHandle,
name=>'TTS_FULL_CHECK',
value=>1);
DBMS_DATAPUMP.METADATA_FILTER(
handle => JobHandle
,NAME => 'TABLESPACE_EXPR'
,VALUE => 'IN ('''||v_tbs_name||''')'
-- ,object_type => 'TABLE'
);
UTL_FILE.PUT_LINE(log_file,'Done');
UTL_FILE.NEW_LINE (log_file);
UTL_FILE.PUT_LINE(log_file,' Now starting datapump job................');
DBMS_DATAPUMP.START_JOB(JobHandle);
UTL_FILE.PUT_LINE(log_file,'Done');
UTL_FILE.NEW_LINE (log_file);
UTL_FILE.PUT_LINE(log_file,' Monitoring the job................');
--------------Monitor the job
PctComplete := 0;
JobState := 'UNDEFINED';
WHILE(JobState != 'COMPLETED') and (JobState != 'STOPPED')
LOOP
DBMS_DATAPUMP.GET_STATUS(
handle => JobHandle
,mask => 15 -- DBMS_DATAPUMP.ku$_status_job_error + DBMS_DATAPUMP.ku$_status_job_status + DBMS_DATAPUMP.ku$_status_wip
,timeout => NULL
,job_state => JobState
,status => Status
);
JobStatus := Status.job_status;
-- Whenever the PctComplete value has changed, display it
IF JobStatus.percent_done != PctComplete THEN
DBMS_OUTPUT.PUT_LINE('*** Job percent done = ' || TO_CHAR(JobStatus.percent_done));
PctComplete := JobStatus.percent_done;
END IF;
-- Whenever a work-in progress message or error message arises, display it
IF (BITAND(Status.mask,DBMS_DATAPUMP.ku$_status_wip) != 0) THEN
LogEntry := Status.wip;
ELSE
IF (BITAND(Status.mask,DBMS_DATAPUMP.ku$_status_job_error) != 0) THEN
LogEntry := Status.error;
ELSE
LogEntry := NULL;
END IF;
END IF;
IF LogEntry IS NOT NULL THEN
idx := LogEntry.FIRST;
WHILE idx IS NOT NULL
LOOP
DBMS_OUTPUT.PUT_LINE(LogEntry(idx).LogText);
idx := LogEntry.NEXT(idx);
END LOOP;
END IF;
END LOOP;
--copy the datafiles to data dump dir
UTL_FILE.PUT_LINE(log_file,'Done');
UTL_FILE.NEW_LINE (log_file);
UTL_FILE.PUT_LINE(log_file,' Copying datafiles to dump directory................');
-- grant select on dba_directories to prestg;
declare
cnt number;
begin
select count(*) into cnt from dba_directories
where directory_name=upper(t_dir);
if cnt=1 then
execute immediate('DROP DIRECTORY '||t_dir);
end if;
end;
FOR rec in (select file_name from sys.dba_data_files where tablespace_name=v_tbs_name)
LOOP
t_sep_pos:=instr(rec.file_name,'/',-1);
t_dir_name:=substr(rec.file_name,1,t_sep_pos-1);
t_file_name:=substr(rec.file_name,t_sep_pos+1,length(rec.file_name));
dbms_output.put_line(t_dir_name|| ' ' || t_dir);
dbms_output.put_line(t_file_name);
execute immediate('CREATE DIRECTORY '||t_dir||' AS '''||t_dir_name||'''');
UTL_FILE.PUT_LINE(log_file,' Copying '||rec.file_name||'................');
utl_file.fcopy(t_dir, t_file_name, dp_dir, t_file_name);
UTL_FILE.PUT(log_file,'Done');
execute immediate('DROP DIRECTORY '||t_dir);
END LOOP;
UTL_FILE.NEW_LINE (log_file);
UTL_FILE.PUT_LINE(log_file,' Altering tablespace to read write................');
execute immediate ('ALTER TABLESPACE '||v_tbs_name || ' read write');
UTL_FILE.PUT(log_file,' Done');
err_log_file:=utl_file.fopen(dp_dir, exp_log_file, 'r');
UTL_FILE.NEW_LINE (log_file);
UTL_FILE.PUT_LINE(log_file,' content of export logfile................');
loop
begin
utl_file.get_line(err_log_file,v_sqlerrm);
if v_sqlerrm is null then
exit;
end if;
UTL_FILE.PUT_LINE(log_file,v_sqlerrm);
EXCEPTION
WHEN NO_DATA_FOUND THEN
EXIT;
END;
end loop;
utl_file.fclose(err_log_file);
utl_file.fclose(log_file);
END;
I'm getting following error when DBMS_DATAPUMP.OPEN is called in procedure:
SQL> exec sp_tts_export('TEST');
BEGIN sp_tts_export('TEST'); END;
ERROR at line 1:
ORA-31626: job does not exist
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 938
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4566
ORA-06512: at "PRESTG.SP_TTS_EXPORT", line 78
ORA-06512: at line 1
==============================================================================================
This procedure is part of user ABC, and I'm getting the above error when running it under the ABC schema. However, when I create the same procedure in the SYS schema, it runs fine. I am clueless on this. Please help.
Thanks
Shailesh
Edited by: shaileshM on Jul 28, 2010 11:15 AM
Privileges acquired via a ROLE do NOT apply within named PL/SQL procedures.
Explicit GRANT is required to resolve this issue. -
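That advice as a sketch (schema name taken from the post):

```sql
-- Direct grant, not via a role, so it is visible inside the
-- definer's-rights procedure:
GRANT CREATE TABLE TO prestg;

-- Alternatively, declare the procedure with invoker's rights, under
-- which roles remain enabled at runtime:
-- CREATE OR REPLACE PROCEDURE sp_tts_export(v_tbs_name VARCHAR2)
--   AUTHID CURRENT_USER AS ...
```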
Using DBMS_DATAPUMP with LONG data type
I've got a procedure below that calls the DBMS_DATAPUMP procedure using a REMOTE_LINK to move a schema from one database to another. However, a couple of the tables within that schema have columns with the LONG data type. And when I run it I get an error saying that you cannot move data with the LONG data type using a REMOTE LINK. So no data in those particular tables gets moved over.
Has anyone else had this issue? If so, do you have a workaround? I tried adding a CLOB column to my table and setting the new CLOB equal to the LONG, but I couldn't get that to work either, even when I tried using TO_LOB. If I could, then I could just drop the LONG, move the schema, then recreate the LONG column on the other side.
Here's my procedure....
DECLARE
/* EXPORT/IMPORT VARIABLES */
v_dp_job_handle NUMBER ; -- Data Pump job handle
v_count NUMBER ; -- Loop index
v_percent_done NUMBER ; -- Percentage of job complete
v_job_state VARCHAR2(30) ; -- To keep track of job state
v_message KU$_LOGENTRY ; -- For WIP and error messages
v_job_status KU$_JOBSTATUS ; -- The job status from get_status
v_status KU$_STATUS ; -- The status object returned by get_status
v_logfile NUMBER ;
v_date VARCHAR2(13) ;
v_source_server_name VARCHAR2(50) ;
v_destination_server_name VARCHAR2(50) ;
v_project VARCHAR2(30) ; -- Schema/project to move (referenced below)
BEGIN
v_project := 'TEST' ;
v_date := TO_CHAR(SYSDATE, 'MMDDYYYY_HHMI') ;
v_source_server_name := 'TEST_DB' ;
v_dp_job_handle := DBMS_DATAPUMP.OPEN(
OPERATION => 'IMPORT',
JOB_MODE => 'SCHEMA',
REMOTE_LINK => v_source_server_name,
JOB_NAME => v_project||'_EXP_'||v_date,
VERSION => 'LATEST') ;
v_logfile := DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE ;
DBMS_DATAPUMP.ADD_FILE(
HANDLE => v_dp_job_handle,
FILENAME => v_project||'_EXP_'||v_date||'.LOG',
DIRECTORY => 'DATAPUMP',
FILETYPE => v_logfile) ;
DBMS_DATAPUMP.METADATA_FILTER(
HANDLE => v_dp_job_handle,
NAME => 'SCHEMA_EXPR',
VALUE => '= '''||v_project||''' ') ;
DBMS_DATAPUMP.START_JOB(v_dp_job_handle) ;
v_percent_done := 0 ;
v_job_state := 'UNDEFINED' ;
WHILE (v_job_state != 'COMPLETED') AND (v_job_state != 'STOPPED')
LOOP
DBMS_DATAPUMP.GET_STATUS(
v_dp_job_handle,
DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR + DBMS_DATAPUMP.KU$_STATUS_JOB_STATUS + DBMS_DATAPUMP.KU$_STATUS_WIP,
-1,
v_job_state,
v_status) ;
v_job_status := v_status.JOB_STATUS ;
IF v_job_status.PERCENT_DONE != v_percent_done THEN
DBMS_OUTPUT.PUT_LINE('*** Job percent done = '||TO_CHAR(v_job_status.PERCENT_DONE)) ;
v_percent_done := v_job_status.PERCENT_DONE ;
END IF ;
IF BITAND(v_status.MASK, DBMS_DATAPUMP.KU$_STATUS_WIP) != 0 THEN
v_message := v_status.WIP ;
ELSIF BITAND(v_status.mask, DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR) != 0 THEN
v_message := v_status.ERROR ;
ELSE
v_message := NULL ;
END IF ;
IF v_message IS NOT NULL THEN
v_count := v_message.FIRST ;
WHILE v_count IS NOT NULL
LOOP
DBMS_OUTPUT.PUT_LINE(v_message(v_count).LOGTEXT) ;
v_count := v_message.NEXT(v_count) ;
END LOOP ;
END IF ;
END LOOP ;
DBMS_OUTPUT.PUT_LINE('Job has completed') ;
DBMS_OUTPUT.PUT_LINE('Final job state = '||v_job_state) ;
DBMS_DATAPUMP.DETACH(v_dp_job_handle) ;
END ;
But the application we have that uses the database cannot be changed to read from a CLOB.
Why can't you change the application?
Well, anyway, you should point out to your superiors that Oracle documented years ago that LONGs should not be used anymore...
http://download.oracle.com/docs/cd/B19306_01/server.102/b14220/datatype.htm#sthref3806
It clearly states:
LONG Datatype
Note:
Do not create tables with LONG columns. Use LOB columns (CLOB, NCLOB) instead. LONG columns are supported only for backward compatibility.
Oracle also recommends that you convert existing LONG columns to LOB columns. LOB columns are subject to far fewer restrictions than LONG columns. Further, LOB functionality is enhanced in every release, whereas LONG functionality has been static for several releases.
How do I go from CLOB to LONG?
I'm sorry, I cannot help you with that one; I don't think you can do that at all (Oracle wants us to stop using LONGs, so it's a one-way conversion...):
http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:1037232794454#15512131314505
So: no built-in; you'll need to write a program. If the CLOB is always less than 32k in size, you can use PL/SQL... but is that the case for you? Only you know that.
I believe that question is still unanswered on this forum, but you might try searching for answers here and on
the 'Database - General' forum: General Database Discussions
Perhaps you can google a quick-and-dirty workaround...
( And consider convincing your colleagues to just convert your LONGs to LOBs)
Edited by: hoek on Apr 8, 2009 5:43 PM -
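For completeness, the forward direction (LONG to CLOB) that the documentation recommends can be sketched as follows (table and column names are hypothetical); note it cannot be reversed:

```sql
-- In-place, one-way migration of a LONG column to CLOB:
ALTER TABLE my_table MODIFY (my_long_col CLOB);

-- Alternative via a copy, using TO_LOB (only valid in a subquery of
-- CREATE TABLE ... AS SELECT or INSERT ... SELECT):
-- CREATE TABLE my_table_new AS
--   SELECT id, TO_LOB(my_long_col) AS my_clob_col FROM my_table;
```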
Using dbms_datapump package to export the schema with the schema name as parameter
Hi,
I am using a PL/SQL block to export a schema with the dbms_datapump package. Now I want to pass the schema name as a parameter to the procedure and get the .dmp and .log files with the schema name included.
CREATE OR REPLACE PROCEDURE export
IS
h1 number;
begin
h1 := dbms_datapump.open (operation => 'EXPORT', job_mode => 'SCHEMA', job_name => 'export1', version => 'COMPATIBLE');
dbms_datapump.set_parallel(handle => h1, degree => 1);
dbms_datapump.add_file(handle => h1, filename => 'EXPDAT.LOG', directory => 'DATA_PUMP_DIR', filetype => 3);
dbms_datapump.set_parameter(handle => h1, name => 'KEEP_MASTER', value => 0);
dbms_datapump.metadata_filter(handle => h1, name => 'SCHEMA_EXPR', value => 'IN(''CHECKOUT'')');
dbms_datapump.set_parameter(handle => h1, name => 'ESTIMATE', value => 'BLOCKS');
dbms_datapump.add_file(handle => h1, filename => 'EXPDAT%U' || to_char(sysdate,'dd-mm-yyyy') || '.DMP', directory => 'DATA_PUMP_DIR', filetype => 1);
dbms_datapump.set_parameter(handle => h1, name => 'INCLUDE_METADATA', value => 1);
dbms_datapump.set_parameter(handle => h1, name => 'DATA_ACCESS_METHOD', value => 'AUTOMATIC');
dbms_datapump.start_job(handle => h1, skip_current => 0, abort_step => 0);
dbms_datapump.detach (handle => h1);
exception
when others then
raise_application_error(-20001,'An error was encountered - '||SQLCODE||' -ERROR- '||SQLERRM);
end;
Thank you in advance.
Sri
user12062360 wrote:
Hi,
I am exporting a schema from a PL/SQL block using the DBMS_DATAPUMP package. Now I want to pass the schema name as a parameter to the procedure and get the .dmp and .log files with the schema name included.
OK, please proceed to do so
>
CREATE OR REPLACE PROCEDURE export
IS
h1 number;
begin
h1 := dbms_datapump.open (operation => 'EXPORT', job_mode => 'SCHEMA', job_name => 'export1', version => 'COMPATIBLE');
dbms_datapump.set_parallel(handle => h1, degree => 1);
dbms_datapump.add_file(handle => h1, filename => 'EXPDAT.LOG', directory => 'DATA_PUMP_DIR', filetype => 3);
dbms_datapump.set_parameter(handle => h1, name => 'KEEP_MASTER', value => 0);
dbms_datapump.metadata_filter(handle => h1, name => 'SCHEMA_EXPR', value => 'IN(''CHECKOUT'')');
dbms_datapump.set_parameter(handle => h1, name => 'ESTIMATE', value => 'BLOCKS');
dbms_datapump.add_file(handle => h1, filename => 'EXPDAT%U' || to_char(sysdate,'dd-mm-yyyy') || '.DMP', directory => 'DATA_PUMP_DIR', filetype => 1);
dbms_datapump.set_parameter(handle => h1, name => 'INCLUDE_METADATA', value => 1);
dbms_datapump.set_parameter(handle => h1, name => 'DATA_ACCESS_METHOD', value => 'AUTOMATIC');
dbms_datapump.start_job(handle => h1, skip_current => 0, abort_step => 0);
dbms_datapump.detach (handle => h1);
exception
when others then
raise_application_error(-20001,'An error was encountered - '||SQLCODE||' -ERROR- '||SQLERRM);
end;
That EXCEPTION handler is a bug waiting to happen.
Eliminate it entirely -
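A sketch of the parameterized version the OP asked for, with the schema name embedded in the file names; the procedure name and directory are illustrative, and per the advice above there is no WHEN OTHERS handler:

```sql
CREATE OR REPLACE PROCEDURE export_schema (p_schema IN VARCHAR2)
IS
  h1 NUMBER;
BEGIN
  h1 := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                           job_mode  => 'SCHEMA',
                           job_name  => 'EXPORT_' || p_schema);
  -- Schema name goes into both file names.
  DBMS_DATAPUMP.ADD_FILE(handle    => h1,
                         filename  => p_schema || '_'
                                      || TO_CHAR(SYSDATE, 'DD-MM-YYYY') || '.DMP',
                         directory => 'DATA_PUMP_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(handle    => h1,
                         filename  => p_schema || '.LOG',
                         directory => 'DATA_PUMP_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- The schema name goes into the filter as a quoted literal.
  DBMS_DATAPUMP.METADATA_FILTER(handle => h1,
                                name   => 'SCHEMA_EXPR',
                                value  => 'IN(''' || UPPER(p_schema) || ''')');
  DBMS_DATAPUMP.START_JOB(h1);
  DBMS_DATAPUMP.DETACH(h1);
END export_schema;
/
```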
DBMS_DATAPUMP package ERROR ;;;; PLS HELP mE
Hello Everybody,
I have a problem using the DBMS_DATAPUMP package ;
I have just created one procedure that launches a job and puts the dmp file into an Oracle directory; the procedure worked fine until yesterday, and sadly, strangely, today the procedure doesn't work and I get this Oracle message:
ORA-31626: job does not exist
I don't know what to do, since I have not changed the procedure!!!
can someone help me please?
thank you very much
regards
No, I have just this error: ORA-31626: job does not exist
From the execution time to the failure, the alert.log file shows only: The value (30) of MAXTRANS parameter ignored.
for the session nothing is changed
below the code source :
CREATE OR REPLACE PROCEDURE SP_DPUMP_EXPORT (
P_DIRECTORY_NAME IN VARCHAR2
) AS
/*
|| Procedure: here
||
|| Creates a nightly DataPump Export of all User schema
||
*/
idx NUMBER; -- Loop index
JobHandle NUMBER; -- Data Pump job handle
PctComplete NUMBER; -- Percentage of job complete
JobState VARCHAR2(30); -- To keep track of job state
LogEntry ku$_LogEntry; -- For WIP and error messages
JobStatus ku$_JobStatus; -- The job status from get_status
Status ku$_Status; -- The status object returned by get_status
CURSOR c_tables IS
SELECT table_name, max_trans
FROM user_tables;
BEGIN
-- Build a handle for the export job
JobHandle :=
DBMS_DATAPUMP.OPEN(
operation => 'EXPORT'
,job_mode => 'FULL'
,remote_link => NULL
,job_name => 'GADNETOBJECTS'
,version => 'LATEST'
);
-- Using the job handle value obtained, specify multiple dump files for the job
-- and the directory to which the dump files should be written. Note that the
-- directory object must already exist and the user account running the job must
-- have WRITE access permissions to the directory
DBMS_DATAPUMP.ADD_FILE(
handle => JobHandle
,filename => 'GADDB01_'||TO_CHAR(SYSDATE-1,'YYYYMMDD')||'.dmp'
,directory => P_DIRECTORY_NAME
,filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE
/* ,filesize => '100M'*/
);
DBMS_DATAPUMP.ADD_FILE(
handle => JobHandle
,filename => 'GADDB01_'||TO_CHAR(SYSDATE-1,'YYYYMMDD')||'.log'
,directory => P_DIRECTORY_NAME
,filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE
);
-- Apply a metadata filter to restrict the DataPump Export job to only return
-- selected tables and their dependent objects from the Miva schema
/*DBMS_DATAPUMP.METADATA_FILTER(
handle => JobHandle
,NAME => 'SCHEMA_EXPR'
,VALUE => '= ''ALL'''
\* ,object_type => 'TABLE'*\
) */
-- Initiate the DataPump Export job
DBMS_DATAPUMP.START_JOB(JobHandle);
-- If no exception has been returned when the job was initiated, this loop will
-- keep track of the job and return progress information until the job is done
PctComplete := 0;
JobState := 'UNDEFINED';
WHILE(JobState != 'COMPLETED') and (JobState != 'STOPPED')
LOOP
DBMS_DATAPUMP.GET_STATUS(
handle => JobHandle
,mask => 15 -- DBMS_DATAPUMP.ku$_status_job_error + DBMS_DATAPUMP.ku$_status_job_status + DBMS_DATAPUMP.ku$_status_wip
,timeout => NULL
,job_state => JobState
,status => Status
);
JobStatus := Status.job_status;
-- Whenever the PctComplete value has changed, display it
IF JobStatus.percent_done != PctComplete THEN
DBMS_OUTPUT.PUT_LINE('*** Job percent done = ' || TO_CHAR(JobStatus.percent_done));
PctComplete := JobStatus.percent_done;
END IF;
-- Whenever a work-in progress message or error message arises, display it
IF (BITAND(Status.mask,DBMS_DATAPUMP.ku$_status_wip) != 0) THEN
LogEntry := Status.wip;
ELSE
IF (BITAND(Status.mask,DBMS_DATAPUMP.ku$_status_job_error) != 0) THEN
LogEntry := Status.error;
ELSE
LogEntry := NULL;
END IF;
END IF;
IF LogEntry IS NOT NULL THEN
idx := LogEntry.FIRST;
WHILE idx IS NOT NULL
LOOP
DBMS_OUTPUT.PUT_LINE(LogEntry(idx).LogText);
idx := LogEntry.NEXT(idx);
END LOOP;
END IF;
END LOOP;
-- Successful DataPump Export job completion, so detach from the job
DBMS_OUTPUT.PUT_LINE('Job has succesfully completed');
DBMS_OUTPUT.PUT_LINE('Final job state = ' || JobState);
DBMS_DATAPUMP.DETACH(JobHandle);
EXCEPTION
WHEN OTHERS THEN
DBMS_OUTPUT.PUT_LINE('ERREUR :'||SQLERRM);
END; -
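For what it's worth: when a previously working procedure suddenly raises ORA-31626, one common cause is a leftover master table from an aborted run blocking the fixed job name. A hedged diagnostic sketch (the owner in the DROP is illustrative; use whatever the query reports):

```sql
-- List Data Pump jobs the database still knows about; defunct jobs
-- show up as NOT RUNNING with their master table left behind.
SELECT owner_name, job_name, state
FROM   dba_datapump_jobs;

-- If a stale master table remains, dropping it frees the job name
-- (substitute the owner and job name reported above).
-- DROP TABLE some_owner.GADNETOBJECTS PURGE;
```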
Problem During Importing using DBMS_DATAPUMP
Hi,
I am facing problems while importing using DBMS_DATAPUMP:
It's not showing any error, but the tables are not getting imported.
Below is the code which I have written for the import:
CREATE OR REPLACE PROCEDURE Data_Pump_Import_Load AS
ind NUMBER; -- Loop index
h1 NUMBER; -- Data Pump job handle
percent_done NUMBER; -- Percentage of job complete
job_state VARCHAR2(30); -- To keep track of job state
le ku$_LogEntry; -- For WIP and error messages
js ku$_JobStatus; -- The job status from get_status
jd ku$_JobDesc; -- The job description from get_status
sts ku$_Status; -- The status object returned by get_status
BEGIN
-- Create a (user-named) Data Pump job to do a schema export.
DBMS_OUTPUT.PUT_LINE('COMES HERE');
h1 := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode =>
'TABLE',job_name => 'RET_IMPORT_TEST');
DBMS_OUTPUT.PUT_LINE('COMES HERE1111');
DBMS_DATAPUMP.ADD_FILE(handle => h1,filename =>
'MYEXPORTTEST.DMP', DIRECTORY => 'EXPORT_DIR',filetype=>1);
DBMS_OUTPUT.PUT_LINE('COMES HERE2222');
DBMS_DATAPUMP.ADD_FILE(handle => h1,filename =>
'MYEIMPORTTEST.LOG',DIRECTORY => 'EXPORT_DIR',filetype=>3);
DBMS_DATAPUMP.SET_PARAMETER (handle => h1,
NAME => 'INCLUDE_METADATA',
VALUE => 1);
DBMS_DATAPUMP.SET_PARAMETER(handle => h1,NAME =>
'DATA_ACCESS_METHOD', VALUE =>'AUTOMATIC');
DBMS_DATAPUMP.SET_PARAMETER(handle => h1,NAME =>
'TABLE_EXISTS_ACTION', VALUE =>'REPLACE');
DBMS_OUTPUT.PUT_LINE('COMES HERE3333');
-- Specify a single dump file for the job (using the handle just returned)
-- and a directory object, which must already be defined and accessible
-- to the user running this procedure.
-- Start the job. An exception will be generated if something is not set up
-- properly.
DBMS_DATAPUMP.START_JOB(h1);
-- The import job should now be running. In the following loop, the job
-- is monitored until it completes. In the meantime, progress information is
-- displayed.
percent_done := 0;
job_state := 'UNDEFINED';
WHILE (job_state != 'COMPLETED') AND (job_state != 'STOPPED') LOOP
dbms_datapump.get_status(h1,
dbms_datapump.ku$_status_job_error +
dbms_datapump.ku$_status_job_status +
dbms_datapump.ku$_status_wip,-1,job_state,sts);
js := sts.job_status;
-- If the percentage done changed, display the new value.
IF js.percent_done != percent_done
THEN
DBMS_OUTPUT.PUT_LINE('*** Job percent done = ' ||
TO_CHAR(js.percent_done));
percent_done := js.percent_done;
END IF;
-- If any work-in-progress (WIP) or error messages were received for the job,
-- display them.
IF (BITAND(sts.mask,dbms_datapump.ku$_status_wip) != 0)
THEN
le := sts.wip;
ELSE
IF (BITAND(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
THEN
le := sts.error;
ELSE
le := NULL;
END IF;
END IF;
IF le IS NOT NULL
THEN
ind := le.FIRST;
WHILE ind IS NOT NULL LOOP
DBMS_OUTPUT.PUT_LINE(le(ind).LogText);
ind := le.NEXT(ind);
END LOOP;
END IF;
END LOOP;
-- Indicate that the job finished and detach from it.
DBMS_OUTPUT.PUT_LINE('Job has completed');
DBMS_OUTPUT.PUT_LINE('Final job state = ' || job_state);
dbms_datapump.detach(h1);
END;
/
Use the METADATA_REMAP proc with the REMAP_SCHEMA option:
DBMS_DATAPUMP.METADATA_REMAP(id, 'REMAP_SCHEMA', 'SOURCE_SCHEMA', 'DESTINATION_SCHEMA'); -
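In context, the remap call goes on the import handle before START_JOB; a minimal hedged sketch reusing the OP's dump file and directory, with SOURCE_SCHEMA and DESTINATION_SCHEMA as placeholders:

```sql
DECLARE
  h1 NUMBER;
BEGIN
  h1 := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA');
  DBMS_DATAPUMP.ADD_FILE(handle    => h1,
                         filename  => 'MYEXPORTTEST.DMP',
                         directory => 'EXPORT_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  -- Rewrite ownership on the way in: objects exported from SOURCE_SCHEMA
  -- are created in DESTINATION_SCHEMA.
  DBMS_DATAPUMP.METADATA_REMAP(h1, 'REMAP_SCHEMA',
                               'SOURCE_SCHEMA', 'DESTINATION_SCHEMA');
  DBMS_DATAPUMP.START_JOB(h1);
  DBMS_DATAPUMP.DETACH(h1);
END;
/
```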
Ora-31655 using dbms_datapump
I am using DBMS_DATAPUMP to make a logical backup. It works fine when I use schema mode; when I try to execute using table mode, it raises error ORA-31655 "no data or metadata objects selected for job". In both cases I am connected as SYS.
It seems like a privilege error, but as I said, I am connected as SYS.
the code that works fine :
vl_jobhandle := dbms_datapump.open(operation => 'EXPORT', job_mode => 'SCHEMA' ,job_name => 'TESTE', version => 'LATEST' );
dbms_datapump.add_file for dmp and log files;
dbms_datapump.metadata_filter(vl_jobhandle, 'SCHEMA_EXPR', 'in ('''|| 'SYSMAN' ||''')');
dbms_datapump.start_job(vl_jobhandle);
-- and go on.....
the code that doesn't work fine :
vl_jobhandle := dbms_datapump.open(operation => 'EXPORT', job_mode => 'TABLE' ,job_name => 'TESTE', version => 'LATEST' );
dbms_datapump.add_file for dmp and log files;
dbms_datapump.metadata_filter(vl_jobhandle, 'NAME_EXPR', 'in ('''|| 'MGMT_VIEW_USER_CREDENTIALS' ||''')');
dbms_datapump.start_job(vl_jobhandle);
-- and go on.....
I made some changes to metadata_filter and also included data_filter, but none of them worked.
Any help?
thanks
At the dbms_datapump.metadata_filter(vl_jobhandle, 'NAME_EXPR', 'in ('''|| 'MGMT_VIEW_USER_CREDENTIALS' ||''')') statement: 'MGMT_VIEW_USER_CREDENTIALS' is a table, isn't it? If that isn't the way, could you explain to me how to specify the table?
thanks. -
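For the record, a hedged sketch of a TABLE-mode export that qualifies the owner; it assumes MGMT_VIEW_USER_CREDENTIALS really belongs to SYSMAN, as the working schema-mode example suggests:

```sql
DECLARE
  vl_jobhandle NUMBER;
BEGIN
  vl_jobhandle := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                                     job_mode  => 'TABLE',
                                     job_name  => 'TESTE');
  -- Qualify the owner: without a schema filter, TABLE mode resolves
  -- names against the connected user's schema (here SYS), finds no
  -- matching table, and raises ORA-31655.
  DBMS_DATAPUMP.METADATA_FILTER(vl_jobhandle, 'SCHEMA_EXPR',
                                'IN (''SYSMAN'')');
  DBMS_DATAPUMP.METADATA_FILTER(vl_jobhandle, 'NAME_EXPR',
                                'IN (''MGMT_VIEW_USER_CREDENTIALS'')');
  DBMS_DATAPUMP.START_JOB(vl_jobhandle);
  DBMS_DATAPUMP.DETACH(vl_jobhandle);
END;
/
```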
Importing the METADATA ONLY using DBMS_DATAPUMP
Hi DBAs,
Using DBMS_DATAPUMP, how can I import the metadata only for a particular table? Also, I don't want to import any associated INDEXES and TRIGGERS. I have the following code, but it is trying to import everything. Also, if the table exists, it is not importing the metadata but erroring out.
handle1 := DBMS_DATAPUMP.OPEN('IMPORT','SCHEMA', 'QAXDB.WORLD');
DBMS_DATAPUMP.METADATA_FILTER(handle1, 'SCHEMA_EXPR', 'IN (''HR'')');
DBMS_DATAPUMP.SET_PARAMETER(handle1, 'INCLUDE_METADATA', 1);
Thanks
-Samar-
See the below link,
http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_datpmp.htm
Hope this helps,
Regards,
http://www.oracleracexpert.com
Click here for [Cross platform Transportable tablespace using Datapump|http://www.oracleracexpert.com/2009/08/transportable-tablespace-export-import.html]
Click here to learn [Oracle data pump export/import with examples.|http://www.oracleracexpert.com/2009/08/oracle-data-pump-exportimport.html] -
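To answer the question directly, a hedged sketch building on the OP's own code: a data filter with INCLUDE_ROWS=0 makes the import metadata-only, EXCLUDE_PATH_EXPR drops indexes and triggers, and TABLE_EXISTS_ACTION=SKIP stops the error on pre-existing tables (whether SKIP is the right action is for the OP to decide):

```sql
DECLARE
  handle1 NUMBER;
BEGIN
  handle1 := DBMS_DATAPUMP.OPEN('IMPORT', 'SCHEMA', 'QAXDB.WORLD');
  DBMS_DATAPUMP.METADATA_FILTER(handle1, 'SCHEMA_EXPR', 'IN (''HR'')');
  -- Metadata only: bring over no table rows at all.
  DBMS_DATAPUMP.DATA_FILTER(handle1, 'INCLUDE_ROWS', 0);
  -- Skip dependent indexes and triggers.
  DBMS_DATAPUMP.METADATA_FILTER(handle1, 'EXCLUDE_PATH_EXPR',
                                'IN (''INDEX'',''TRIGGER'')');
  -- Don't error out when the table already exists.
  DBMS_DATAPUMP.SET_PARAMETER(handle1, 'TABLE_EXISTS_ACTION', 'SKIP');
  DBMS_DATAPUMP.START_JOB(handle1);
  DBMS_DATAPUMP.DETACH(handle1);
END;
/
```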
Reg: Export and Import using Dbms_datapump
Hi,
I would like to export a table using the DBMS_DATAPUMP package. I have a procedure to do this (in Oracle 10g R10.2.0.1.0). This procedure has parameters for schema name and table name, and that particular schema's table should be exported as a dump file.
PROCEDURE PR_EXPORT(PV_SCHEMA IN VARCHAR2,
PV_TABLE VARCHAR2,
PV_STATUS OUT VARCHAR2) AS
l_dp_handle NUMBER;
l_last_job_state VARCHAR2(30) := 'UNDEFINED';
l_job_state VARCHAR2(30) := 'UNDEFINED';
l_sts KU$_STATUS;
l_schema varchar2(256);
l_table varchar2(256);
BEGIN
l_schema := 'IN(''' || PV_SCHEMA || ''')'; --'IN(''VALIDATION'')'
l_table := 'IN(''' || pv_table || ''')'; -- 'IN(''TABLE1'')'
DBMS_OUTPUT .PUT_LINE('SCHEMA ' || L_SCHEMA);
DBMS_OUTPUT .PUT_LINE('TABLE ' || L_TABLE);
l_dp_handle := DBMS_DATAPUMP.open(operation => 'EXPORT',
job_mode => 'TABLE',
remote_link => NULL,
job_name => 'EMP_EXPORT13',
version => 'LATEST');
DBMS_DATAPUMP .add_file(handle => l_dp_handle,
filename => 'SCOTT12.dmp',
directory => 'BACKUP_DIR');
DBMS_DATAPUMP .add_file(handle => l_dp_handle,
filename => 'SCOTT12.log',
directory => 'BACKUP_DIR',
filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
DBMS_DATAPUMP .metadata_filter(handle => l_dp_handle,
name => 'SCHEMA_EXPR',
VALUE => l_schema --'IN(''VALIDATION'')'
);
DBMS_DATAPUMP .metadata_filter(handle => l_dp_handle,
name => 'NAME_EXPR',
VALUE => l_table -- 'IN(''TABLE1'')'
);
DBMS_DATAPUMP .start_job(l_dp_handle);
DBMS_DATAPUMP .detach(l_dp_handle);
END PR_EXPORT;
Sometimes the above procedure correctly creates the dump file. But sometimes it shows the below error:
The following error has occurred:
ORA-26535: %ud byte row cache insufficient for table with rowsize=%ud
Please help me on this.
Thanks and Regards,
Vijay
The only information I could find so far is this [http://ora-26535.ora-code.com/].
I could not find out how to change the buffer size - there does not seem to be an option in [DBMS_DATAPUMP|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_datpmp.htm#i1007277]. Maybe you have to search the [Advanced Replication documentation|http://download.oracle.com/docs/cd/B19306_01/server.102/b14226/toc.htm].
HTH
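One thing worth noting in PR_EXPORT: it detaches immediately after START_JOB, so any runtime failure happens in the background where the caller never sees it. A hedged alternative for the tail of the procedure (l_job_state is already declared in the OP's code; WAIT_FOR_JOB is available from 10.2 onward):

```sql
DBMS_DATAPUMP.start_job(l_dp_handle);
-- Block until the job finishes and report its final state, so that
-- intermittent errors like ORA-26535 surface in the calling session
-- instead of dying silently in the background.
DBMS_DATAPUMP.wait_for_job(handle => l_dp_handle, job_state => l_job_state);
DBMS_OUTPUT.PUT_LINE('Final job state: ' || l_job_state);
```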