Oracle Export Data Pump API - job already exists

Hi all,
I use a stored procedure with the Data Pump API for export and import, following the simple schema export example at http://download.oracle.com/docs/cd/B14117_01/server.101/b10825/dp_api.htm . It works on the first execution, but the second time
I get the error ORA-31634: job already exists.
After the first run I checked the job state in the DBA_DATAPUMP_JOBS view, and it still showed DEFINING after the process completed.
Why does this happen?
Many thanks in advance.
Regards,
Mr.K
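
For the record, ORA-31634 here usually means the master table from the first run still exists under the job name hard-coded in the doc example, and a job stuck in DEFINING keeps that name occupied. A minimal cleanup sketch, assuming the job name 'EXAMPLE1' used by the linked example and that the leftover job belongs to the current user:

DECLARE
  h NUMBER;
BEGIN
  -- attach to the leftover job and drop it, master table included
  h := DBMS_DATAPUMP.ATTACH(job_name => 'EXAMPLE1', job_owner => USER);
  DBMS_DATAPUMP.STOP_JOB(h, immediate => 1, keep_master => 0);
END;
/

Alternatively, pass NULL as job_name to DBMS_DATAPUMP.OPEN so every run gets a system-generated name (e.g. SYS_EXPORT_SCHEMA_01) and can never collide with a leftover job.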


Similar Messages

  • Help needed with Export Data Pump using API

    Hi All,
    I am trying to do a Data Pump export using the API.
    While the export as well as the import works fine from the command line, it fails with the API.
    This is the command line program:
    expdp pxperf/dba@APPN QUERY=dev_pool_data:\"WHERE TIME_NUM > 1204884480100\" DUMPFILE=EXP_DEV.dmp tables=PXPERF.dev_pool_data
    Could you tell me how to achieve the same as above with the Oracle Data Pump API?
    DECLARE
      h1 NUMBER;
    BEGIN
      h1 := dbms_datapump.open('EXPORT','TABLE',NULL,'DP_EXAMPLE10','LATEST');
      dbms_datapump.add_file(h1,'example3.dmp','DATA_PUMP_TEST',NULL,1);
      dbms_datapump.add_file(h1,'example3_dump.log','DATA_PUMP_TEST',NULL,3);
      -- NAME_LIST takes a comma-separated list of quoted names (no parentheses)
      dbms_datapump.metadata_filter(h1,'NAME_LIST','''DEV_POOL_DATA''');
      dbms_datapump.start_job(h1);
      dbms_datapump.detach(h1);
    END;
    /
    Also, I want to know how to export and import multiple (selected) tables through the API using one single criterion like "WHERE TIME_NUM > 1204884480100\"

    Yes, I have read the Oracle doc.
    I was able to proceed as below, but it gives an error.
    ============================================================
    SQL> SET SERVEROUTPUT ON SIZE 1000000
    SQL> DECLARE
    2 l_dp_handle NUMBER;
    3 l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    4 l_job_state VARCHAR2(30) := 'UNDEFINED';
    5 l_sts KU$_STATUS;
    6 BEGIN
    7 l_dp_handle := DBMS_DATAPUMP.open(
    8 operation => 'EXPORT',
    9 job_mode => 'TABLE',
    10 remote_link => NULL,
    11 job_name => '1835_XP_EXPORT',
    12 version => 'LATEST');
    13
    14 DBMS_DATAPUMP.add_file(
    15 handle => l_dp_handle,
    16 filename => 'x1835_XP_EXPORT.dmp',
    17 directory => 'DATA_PUMP_DIR');
    18
    19 DBMS_DATAPUMP.add_file(
    20 handle => l_dp_handle,
    21 filename => 'x1835_XP_EXPORT.log',
    22 directory => 'DATA_PUMP_DIR',
    23 filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    24
    25 DBMS_DATAPUMP.data_filter(
    26 handle => l_dp_handle,
    27 name => 'SUBQUERY',
    28 value => '(where "XP_TIME_NUM > 1204884480100")',
    29 table_name => 'ldev_perf_data',
    30 schema_name => 'XPSLPERF'
    31 );
    32
    33 DBMS_DATAPUMP.start_job(l_dp_handle);
    34
    35 DBMS_DATAPUMP.detach(l_dp_handle);
    36 END;
    37 /
    DECLARE
    ERROR at line 1:
    ORA-39001: invalid argument value
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3688
    ORA-06512: at line 25
    ============================================================
    I have a table called LDEV_PERF_DATA in schema XPSLPERF.
    The line value => '(where "XP_TIME_NUM > 1204884480100")' above is the condition by which I want to filter the data.
    However, the below snippet works fine.
    ============================================================
    SET SERVEROUTPUT ON SIZE 1000000
    DECLARE
      l_dp_handle      NUMBER;
      l_last_job_state VARCHAR2(30) := 'UNDEFINED';
      l_job_state      VARCHAR2(30) := 'UNDEFINED';
      l_sts            KU$_STATUS;
    BEGIN
      l_dp_handle := DBMS_DATAPUMP.open(
        operation   => 'EXPORT',
        job_mode    => 'SCHEMA',
        remote_link => NULL,
        job_name    => 'ldev_may20',
        version     => 'LATEST');

      DBMS_DATAPUMP.add_file(
        handle    => l_dp_handle,
        filename  => 'ldev_may20.dmp',
        directory => 'DATA_PUMP_DIR');

      DBMS_DATAPUMP.add_file(
        handle    => l_dp_handle,
        filename  => 'ldev_may20.log',
        directory => 'DATA_PUMP_DIR',
        filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

      DBMS_DATAPUMP.start_job(l_dp_handle);
      DBMS_DATAPUMP.detach(l_dp_handle);
    END;
    ============================================================
    I don't want to export all contents as above; I want to export data based on some conditions, and only from selected tables.
    Any help is highly appreciated.
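
    A likely cause of the ORA-39001 above, though this is an assumption rather than something confirmed in the thread: the SUBQUERY filter value must be text that can be appended directly to a SELECT against the table, so the double quotes wrapped around the whole predicate are invalid, and since unquoted identifiers are stored in uppercase, the lowercase table_name may match nothing. A sketch of the corrected call:

    DBMS_DATAPUMP.data_filter(
      handle      => l_dp_handle,
      name        => 'SUBQUERY',
      value       => 'WHERE xp_time_num > 1204884480100',
      table_name  => 'LDEV_PERF_DATA',
      schema_name => 'XPSLPERF');

    As for filtering several tables with one predicate: per the DBMS_DATAPUMP documentation, omitting table_name makes a data filter apply to every table selected for the job, so one SUBQUERY filter can cover all the tables chosen by your metadata filters.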

  • Data Pump API Question

    Hi,
    Is there a way to use the Data Pump API to export tables from multiple schemas in the same job? I can't figure out what the filters would be. It seems like I can either specify many tables from 1 schema only, or I can specify multiple schemas but not limit the tables I want to export.
    I keep running into this error: ORA-31655: no data or metadata objects selected for job
    I'd like to do something like this:
    --METADATA FILTER: SPECIFY TABLES TO EXPORT
    dbms_datapump.metadata_filter(
      handle => hdl,
      name   => 'NAME_EXPR',
      value  => 'IN(''schema1.table1'',''schema2.table2'')');
    This does not seem to be possible.
    Any help would be appreciated.
    Thanks,
    Nora

    A user that has the EXP_FULL_DATABASE role should be able to do what you want.
    Search here for that role: http://students.kiv.zcu.cz/doc/oracle/server.102/b14215/dp_export.htm#i1007837
    It seems you could do what you want by using that role in combination with the EXCLUDE and INCLUDE parameters: http://students.kiv.zcu.cz/doc/oracle/server.102/b14215/dp_export.htm#i1009903
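
    A sketch of one way to combine the two, using hypothetical schema and table names and assuming the job runs as a user with EXP_FULL_DATABASE: open a SCHEMA-mode job, choose the schemas with SCHEMA_EXPR, then narrow the TABLE object path with NAME_EXPR. Note that NAME_EXPR matches unqualified object names, so this picks up tables with those names in any of the listed schemas; a true per-schema pairing like schema1.table1-only is not expressible in a single filter.

    DECLARE
      hdl NUMBER;
    BEGIN
      hdl := dbms_datapump.open('EXPORT', 'SCHEMA');
      dbms_datapump.add_file(hdl, 'multi_schema.dmp', 'DATA_PUMP_DIR');
      -- which schemas participate in the job
      dbms_datapump.metadata_filter(hdl, 'SCHEMA_EXPR',
                                    'IN (''SCHEMA1'',''SCHEMA2'')');
      -- which tables survive; scoped to the TABLE object path
      dbms_datapump.metadata_filter(hdl, 'NAME_EXPR',
                                    'IN (''TABLE1'',''TABLE2'')', 'TABLE');
      dbms_datapump.start_job(hdl);
      dbms_datapump.detach(hdl);
    END;
    /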

  • How to exclude statistics using Data Pump API?

    How to exclude all statistics while exporting data using Oracle Data Pump API (DBMS_DATAPUMP package)?

    You would call the metadata filter API like this:
    dbms_datapump.metadata_filter(
      handle => your_handle_here,
      name   => 'EXCLUDE_PATH_LIST',
      value  => '''STATISTICS''');
    Hope this helps.
    Dean

  • Data Pump API -- how to exclude constraints?

    Hi,
    We wrote a PL/SQL procedure using the Data Pump API to perform a schema export, as in http://www.oracle-base.com/articles/10g/OracleDataPump10g.php#DataPumpAPI
    We would like our export to exclude constraints and indexes, but I can't find an example of what the syntax would be.
    Could anybody please give me an example of what this metadata_filter would look like?
    Thanks,
    Nora

    In DBMS_DATAPUMP.metadata_filter, use EXCLUDE_PATH_LIST or EXCLUDE_PATH_EXPR as the "name" parameter.
    Check the following link for details:
    http://download-west.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_datpmp.htm#sthref2182
    Virag
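
    As a concrete sketch (h1 being the handle of an already-opened export job), one call excludes both:

    -- drop constraints (including referential ones) and indexes from the job
    dbms_datapump.metadata_filter(h1, 'EXCLUDE_PATH_EXPR',
        'IN (''CONSTRAINT'', ''REF_CONSTRAINT'', ''INDEX'')');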

  • Selecting tables when importing using the Data Pump API

    Hi,
    Sorry for the trivial question. I exported data using the Data Pump API in "TABLE" mode,
    so all tables were exported into one .dmp file.
    My question is: how do I then import only a few tables using the Data Pump API? How do I get the equivalent of the TABLES parameter in the command-line interface?
    Should I use the DATA_FILTER procedures? If yes, how?
    Thanks in advance.
    Regards,
    Kahlil

    Hi,
    You should use the metadata_filter procedure for this.
    For example:
    dbms_datapump.metadata_filter(
      handle => handle1,
      name   => 'NAME_EXPR',
      value  => 'IN (''TABLE1'', ''TABLE2'')');
    Regards
    Anurag

  • Oracle 10g - Data Pump: Export / Import of Sequences ?

    Hello,
    I'm new to this forum and also to Oracle (version 10g). Since I could not find an answer to my question, I am opening this post hoping to get some help from the experienced users.
    My question concerns the Data Pump utility and what happens to sequences which were defined in the source database:
    I have exported a schema with the following command:
    "expdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp LOGFILE=logfile.log"
    This worked fine and also the import seemed to work fine with the command:
    "impdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp"
    It loaded the exported objects directly into the schema of the target database.
    BUT:
    Something has happened to my sequences. :-(
    When I want to use them, all sequences start again with the value "1". Since my tables already contain data with higher values, I get into trouble with the primary keys of these tables, because I sometimes used the sequences as primary keys.
    My questions are:
    1. Did I do something wrong with the Data Pump utility?
    2. What is the correct way to export and import sequences so that they keep their current values?
    3. If the behaviour described here is correct, how can I fix the sequences so that they continue from the last value used in the source database?
    Thanks a lot in advance for any help concerning this topic!
    Best regards
    FireFighter
    P.S.
    It might be that my English is not perfect, since it is not my native language. Sorry for that! ;-)
    But I hope someone can understand it nevertheless. ;-)

    1. Did I do something wrong with the Data Pump utility?
    I do not think so. But maybe with the existing schema. :-(
    2. What is the correct way to export and import sequences so that they keep their current values?
    If the sequences exist in the target before the import, Oracle does not drop and recreate them. So you need to ensure that the sequences do not already exist in the target, or that the existing ones are dropped before the import.
    3. If the behaviour described here is correct, how can I fix the sequences so that they continue from the last value used in the source database?
    You can either rerun the import after the above correction, or drop and manually recreate the sequences to START WITH the next value of the source sequences.
    The easier way is to generate a script from the source, if you know how to do it.
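
    A sketch of generating that script on the source database; 'MYSCHEMA' is a placeholder. DBA_SEQUENCES.LAST_NUMBER is the high-water mark written to disk, so with a cached sequence it may be ahead of the true next value; starting there is safe, at worst skipping a few numbers.

    SELECT 'CREATE SEQUENCE ' || sequence_owner || '.' || sequence_name ||
           ' START WITH '     || last_number ||
           ' INCREMENT BY '   || increment_by || ';'
    FROM   dba_sequences
    WHERE  sequence_owner = 'MYSCHEMA';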

  • DATA PUMP API returning ORA-31655

    Hello Gurus,
    I am using the code below to import a table (EMP) from one database to another over a network link.
    set serveroutput on;
    DECLARE
      ind          NUMBER;        -- loop index
      spos         NUMBER;        -- string starting position
      slen         NUMBER;        -- string length for output
      h1           NUMBER;        -- Data Pump job handle
      percent_done NUMBER;        -- percentage of job complete
      job_state    VARCHAR2(30);  -- to keep track of job state
      le           ku$_LogEntry;  -- for WIP and error messages
      js           ku$_JobStatus; -- the job status from get_status
      jd           ku$_JobDesc;   -- the job description from get_status
      sts          ku$_Status;    -- the status object returned by get_status
    BEGIN
      h1 := DBMS_DATAPUMP.OPEN('IMPORT','TABLE','DBLINK',NULL,'LATEST');
      DBMS_DATAPUMP.METADATA_FILTER(h1,'NAME_EXPR','IN (''SCOTT.EMP'')','TABLE');
      DBMS_DATAPUMP.SET_PARAMETER(h1,'TABLE_EXISTS_ACTION','REPLACE');
      DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_SCHEMA','SCOTT','SCOTT');
      --DBMS_DATAPUMP.METADATA_FILTER(h1,'INCLUDE_PATH_LIST','like''TABLE''');
      DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_TABLESPACE','USERS','USERS');
      DBMS_DATAPUMP.SET_PARALLEL(h1,8);

      BEGIN
        dbms_datapump.start_job(h1);
        dbms_output.put_line('Data Pump job started successfully');
      EXCEPTION
        WHEN OTHERS THEN
          IF sqlcode = dbms_datapump.success_with_info_num THEN
            dbms_output.put_line('Data Pump job started with info available:');
            dbms_datapump.get_status(h1,
                                     dbms_datapump.ku$_status_job_error, 0,
                                     job_state, sts);
            IF (bitand(sts.mask, dbms_datapump.ku$_status_job_error) != 0) THEN
              le := sts.error;
              IF le IS NOT NULL THEN
                ind := le.FIRST;
                WHILE ind IS NOT NULL LOOP
                  dbms_output.put_line(le(ind).LogText);
                  ind := le.NEXT(ind);
                END LOOP;
              END IF;
            END IF;
          ELSE
            RAISE;
          END IF;
      END;

      -- The import job should now be running. In the following loop, we monitor
      -- the job until it completes; in the meantime, progress information is
      -- displayed.
      percent_done := 0;
      job_state := 'UNDEFINED';
      WHILE (job_state != 'COMPLETED') AND (job_state != 'STOPPED') LOOP
        dbms_datapump.get_status(h1,
                                 dbms_datapump.ku$_status_job_error +
                                 dbms_datapump.ku$_status_job_status +
                                 dbms_datapump.ku$_status_wip, -1, job_state, sts);
        js := sts.job_status;
        -- If the percentage done changed, display the new value.
        IF js.percent_done != percent_done THEN
          dbms_output.put_line('*** Job percent done = ' ||
                               to_char(js.percent_done));
          percent_done := js.percent_done;
        END IF;
        -- Display any work-in-progress (WIP) or error messages received for the job.
        IF (bitand(sts.mask, dbms_datapump.ku$_status_wip) != 0) THEN
          le := sts.wip;
        ELSE
          IF (bitand(sts.mask, dbms_datapump.ku$_status_job_error) != 0) THEN
            le := sts.error;
          ELSE
            le := NULL;
          END IF;
        END IF;
        IF le IS NOT NULL THEN
          ind := le.FIRST;
          WHILE ind IS NOT NULL LOOP
            dbms_output.put_line(le(ind).LogText);
            ind := le.NEXT(ind);
          END LOOP;
        END IF;
      END LOOP;

      -- Indicate that the job finished and detach from it.
      dbms_output.put_line('Job has completed');
      dbms_output.put_line('Final job state = ' || job_state);
      dbms_datapump.detach(h1);

    -- Any exceptions that propagated to this point are captured here; the
    -- details are retrieved from get_status and displayed.
    EXCEPTION
      WHEN OTHERS THEN
        dbms_output.put_line('Exception in Data Pump job');
        dbms_datapump.get_status(h1, dbms_datapump.ku$_status_job_error, 0,
                                 job_state, sts);
        IF (bitand(sts.mask, dbms_datapump.ku$_status_job_error) != 0) THEN
          le := sts.error;
          IF le IS NOT NULL THEN
            ind := le.FIRST;
            WHILE ind IS NOT NULL LOOP
              spos := 1;
              slen := length(le(ind).LogText);
              IF slen > 255 THEN
                slen := 255;
              END IF;
              WHILE slen > 0 LOOP
                dbms_output.put_line(substr(le(ind).LogText, spos, slen));
                spos := spos + 255;
                slen := length(le(ind).LogText) + 1 - spos;
              END LOOP;
              ind := le.NEXT(ind);
            END LOOP;
          END IF;
        END IF;
    END;
    /
    When I run the same code with the mode changed to a SCHEMA-level import, it works fine.
    But when I try to import a single table, I get the error below:
    Data Pump job started successfully
    Starting "SCOTT"."SYS_IMPORT_TABLE_06":
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 0 KB
    ORA-31655: no data or metadata objects selected for job
    *** Job percent done = 100
    Job "SCOTT"."SYS_IMPORT_TABLE_06" completed with 1 error(s) at 20:56:23
    Job has completed
    Final job state = COMPLETED
    Can you give any suggestions on this error?
    Thanks in advance

    Did you check this link?
    http://arjudba.blogspot.com/2009/01/ora-31655-no-data-or-metadata-objects.html
    Does it provide any help?
    Regards.
    Satyaki De.
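
    One more thing worth checking, although this is inferred from the symptoms rather than stated in the thread: NAME_EXPR is compared against unqualified object names, so 'IN (''SCOTT.EMP'')' matches nothing in a TABLE-mode job, which produces exactly this ORA-31655. Splitting the schema into its own filter usually cures it:

    DBMS_DATAPUMP.METADATA_FILTER(h1, 'SCHEMA_EXPR', 'IN (''SCOTT'')');
    DBMS_DATAPUMP.METADATA_FILTER(h1, 'NAME_EXPR', 'IN (''EMP'')', 'TABLE');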

  • Export data pump

    Oracle 10g has the Data Pump export feature.
    I planned to test it on a test database rather than production,
    but I found it slow, about the same as normal export/import.
    The documentation says Data Pump is faster than normal export.
    Please share your views.

    There have been such claims. I suggest you refer to forums and/or notes on Metalink.

  • Data Pump export complete schema

    Hi,
    Can anyone please help me?
    When I try to export a database schema with the Data Pump API
    (I created the directory before executing this block),
    I get the following errors:
    ORA-31634: job already exists
    ORA-06512: at "SYS.DBMS_SYS_ERROR"
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 72
    ORA-06512: at "SYS.DBMS_DATAPUMP"
    DECLARE
      p_handle         NUMBER;
      p_last_job_state VARCHAR2(45);
      p_job_state      VARCHAR2(45);
      p_status         ku$_Status;
    BEGIN
      p_handle := DBMS_DATAPUMP.OPEN('EXPORT','SCHEMA',NULL,'EXAMPLE','LATEST');
      DBMS_DATAPUMP.ADD_FILE(p_handle,'example.dmp','DMP_DIR');
      -- note: the filter name must be SCHEMA_EXPR; 'schema_name' is not a valid filter name
      DBMS_DATAPUMP.METADATA_FILTER(p_handle,'SCHEMA_EXPR','IN (''username'')');
      DBMS_DATAPUMP.START_JOB(p_handle);
    END;
    It says the job already exists. Does that mean an export job is already running?
    Thanks in Advance
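
    Not necessarily; ORA-31634 also fires when an earlier run left its master table behind under the same job name. A cleanup sketch (the owner is whatever DBA_DATAPUMP_JOBS reports for the 'EXAMPLE' job):

    -- a stale job typically shows NOT RUNNING or DEFINING with no attached sessions
    SELECT owner_name, job_name, state, attached_sessions
    FROM   dba_datapump_jobs;
    -- the master table is named after the job; dropping it frees the name
    -- DROP TABLE <owner>.EXAMPLE PURGE;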

    Hi Srini,
    Thank you very much for your help. Unfortunately, after following the instructions in the doc, I am still getting the same errors.
    none the less thank you for your input.
    I was also wondering if someone could tell me how to move this thread to the Database Security area of the forum, as I feel I may have posted it in the wrong place; it appears to be a Database Vault issue and not an imp/exp problem.

  • What are the 'gotchas' for exporting using Data Pump (10.2.0.5) from HP-UX to Windows?

    Hello,
    I have to export a schema using Data Pump from 10.2.0.5 on HP-UX 64-bit to a Windows 64-bit database of the same 10.2.0.5 version. What gotchas can I expect? Data Pump exports are cross-platform, so this sounds straightforward, but are there issues I might face exporting with Data Pump on HP-UX and then importing the dump on Windows 2008 with the same database version? Thank you in advance.

    On the HPUX database, run this statement and look for the value of NLS_CHARACTERSET:
    SQL> select * from NLS_DATABASE_PARAMETERS;
    http://docs.oracle.com/cd/B19306_01/server.102/b14237/statviews_4218.htm#sthref2018
    When creating the database on Windows, you have two options - manually create the database or use DBCA. If you plan to create the database manually, specify the database characterset in the CREATE DATABASE statement - http://docs.oracle.com/cd/B19306_01/server.102/b14200/statements_5004.htm#SQLRF01204
    If using DBCA, see http://docs.oracle.com/cd/B19306_01/server.102/b14196/install.htm#ADMQS0021 (especially http://docs.oracle.com/cd/B19306_01/server.102/b14196/install.htm#BABJBDIF)
    HTH
    Srini

  • Pre Checks before running Data Pump Export/Import

    Hi,
    Oracle: 11.2
    OS: Windows
    Kindly share the pre-checks for Data Pump export and import that a DBA should follow.
    Thanks

    When you do a tablespace mode export, Data Pump is essentially doing a table mode export of all of the tables in the tablespaces mentioned.  So if you have this:
    tablespace A contains: table 1, table 2, index 3a (on table 3)
    tablespace B contains: index 1a (on table 1), index 2a (on table 2), table 3
    and if you expdp tablespaces=a ...
    you will get table 1, table 2, index 1a, and index 2a.
    My belief is that you will not get table 3 or index 3a.  The way I understand the code to work is that you get the tables in the tablespaces you mention and their dependent objects, but not the other way around.  You could easily verify this to make sure.
    Dean
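
    The verification Dean mentions can be as simple as listing what actually lives in each tablespace before exporting; the tablespace name below is a placeholder:

    SELECT owner, segment_name, segment_type
    FROM   dba_segments
    WHERE  tablespace_name = 'A'
    ORDER  BY segment_type, segment_name;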

  • Data Pump export of a full RAC database into a Windows single-instance DB via network_link

    Hi Experts,
    I have a Windows 32-bit 10.2 database.
    I am trying to export a full RAC database (350 GB, same version as the Windows DB) into the Windows single-instance database over a database link.
    The export syntax is:
    expdp salemanager/********@sale FULL=y DIRECTORY=dataload NETWORK_LINK=sale.net DUMPFILE=sale20100203.dmp LOGFILE=salelog20100203.log
    I created a database link fixed to instance 3. It ran for two days and then displayed:
    ORA-31693: Table data object "SALE_AUDIT"."AU_ITEM_IN" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEPOPULATE callout
    ORA-01555: snapshot too old: rollback segment number with name "" too small
    ORA-02063: preceding line from sale.net
    I stopped the export and checked the Windows target alert log.
    I saw messages such as:
    kupprdp: master process DM00 started with pid=16, OS id=4444
    to execute - SYS.KUPM$MCP.MAIN('SYS_EXPORT_FULL_02', 'SYSTEM', 'KUPC$C_1_20100202235235', 'KUPC$S_1_20100202235235', 0);
    Tue Feb 02 23:56:12 2010
    The value (30) of MAXTRANS parameter ignored.
    kupprdp: master process DM00 started with pid=17, OS id=4024
    to execute - SYS.KUPM$MCP.MAIN('SYS_EXPORT_FULL_01', 'SALE', 'KUPC$C_1_20100202235612', 'KUPC$S_1_20100202235612', 0);
    kupprdp: worker process DW01 started with worker id=1, pid=18, OS id=2188
    to execute - SYS.KUPW$WORKER.MAIN('SYS_EXPORT_FULL_01', 'SALE');
    In the RAC instance alert log I saw messages such as:
    SELECT /*+ NO_PARALLEL ("KU$") */ "ID","RAW_DATA","TRANSM_ID","RECEIVED_UTC_DATE","RECEIVED_FROM","ACTION","ORAUSER","ORADATE" FROM RELATIONAL("SALE_AUDIT"."AU_ITEM_IN") "KU$"
    How do I fix this error?
    Should I add more undo tablespace space in RAC instance 3, or in the Windows database?
    Thanks
    Jim

    I usually increase undo space. Is your undo retention set smaller than the time it takes to run the job? If so, I would think you would need to raise it. If not, then I would think it would be the space.

    You were in the process of exporting data when the job failed, which is what I would have expected. Basically, Data Pump wants to export each table consistent with itself. Let's say that one of your tables is partitioned and it has a large partition and a smaller partition. Data Pump attempts to export the larger partitions first, and it remembers the SCN for that partition. When the smaller partitions are exported, it uses that SCN to get the data from each partition as it would have looked when the first partition was exported.

    If you don't have partitioned tables, then do you know if some of the tables in the export job (I know it's full, so that includes just about all of them) are having data added to or removed from them? I can't think of anything else that would need undo while exporting data.
    Dean
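
    A sketch of the first check suggested above: compare UNDO_RETENTION (in seconds) against the expected run time of the job, and look at the size of the undo tablespace. The ALTER SYSTEM value is only an illustration.

    SELECT value FROM v$parameter WHERE name = 'undo_retention';
    SELECT tablespace_name, SUM(bytes)/1024/1024 AS mb
    FROM   dba_data_files
    WHERE  tablespace_name LIKE 'UNDO%'
    GROUP  BY tablespace_name;
    -- if retention is shorter than the job takes, raise it, e.g.:
    -- ALTER SYSTEM SET undo_retention = 172800;  -- 48 hours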

  • Can't "stop" data pump job via dbms_datapump

    DBMS_DATAPUMP.101
    I have a job I have OPENed, but never started. How do I get rid of it?
    SQL> declare
    2 v_handle number;
    3
    4 begin
    5
    6 v_handle := DBMS_DATAPUMP.ATTACH('RefreshFromDev','SYSTEM');
    7 DBMS_DATAPUMP.STOP_JOB(v_handle );
    8
    9 end;
    10 /
    declare
    ERROR at line 1:
    ORA-31626: job does not exist
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 911
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3279
    ORA-06512: at line 6
    SQL> SELECT JOB_NAME, OWNER_NAME, STATE FROM DBA_DATAPUMP_JOBS;

    JOB_NAME         OWNER_NAME   STATE
    ---------------  -----------  ---------
    RefreshFromDev   SYSTEM       DEFINING
    Obviously the job exists. I've tried issuing another OPEN with the same name, and it complains that the job already exists. As I understand it, I have to attach to the job and issue a STOP_JOB in order to dispose of it.
    Thanks
    Steve

    OK, mystery solved, sort of. If I close my SQL*Plus session, log in again, and issue the same SQL, it works.
    I suppose I should read the message "job does not exist" as something like "already attached to a job; detach or disconnect".
    Can anyone shed some light on this? I guess it means you can only attach to one job per session, or that only one handle can be issued per job per session. But I'm only guessing.
    Thanks,
    Steve
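
    If that guess is right, the practical rule is simply to do the cleanup from a session that holds no handle on the job. A sketch, run from a fresh session:

    DECLARE
      h NUMBER;
    BEGIN
      h := DBMS_DATAPUMP.ATTACH('RefreshFromDev', 'SYSTEM');
      -- immediate stop; keep_master => 0 also drops the master table
      DBMS_DATAPUMP.STOP_JOB(h, immediate => 1, keep_master => 0);
    END;
    /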

  • Data Pump Errors

    I am trying to execute the sample here:
    http://download.oracle.com/docs/cd/B12037_01/server.101/b10825/dp_api.htm#i1006925
    Example 5-1 Performing a Simple Schema Export
    So I created a package and tried to modify it to do a table export instead of a schema export; I messed it up and it didn't work. I did this twice.
    If I query USER_DATAPUMP_JOBS I have two entries ('EXAMPLE1' and 'EXAMPLE2'), so when I try to rerun the sample, I get ORA-31634: job already exists.
    Hmm, it looks like I could then attach to the job to rerun my tests or to remove it, but when I try to attach, I get ORA-31626: job does not exist.
    Am I missing something basic here? All other procedures require a handle, so it's either OPEN or ATTACH.

    I had better read the doc again. It seems strange that it creates tables in the schema with the name of the job.
    I dropped those tables in the schema, and when I then did a select * from user_datapump_jobs I could still see rows for the old jobs, now carrying the names of the recycle-bin objects. PURGE RECYCLEBIN and they were gone.
    I'm not using EXPDP; I'm trying the API-based approach.
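
    In other words, the dropped master tables were sitting in the recycle bin and still showing up through USER_DATAPUMP_JOBS. The cleanup described above, as a sketch:

    DROP TABLE example1 PURGE;   -- PURGE bypasses the recycle bin entirely
    DROP TABLE example2 PURGE;
    PURGE RECYCLEBIN;            -- clears masters that were dropped without PURGE
    SELECT job_name, state FROM user_datapump_jobs;  -- should now come back empty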
