Data Pump PL/SQL API: please help me

Hi all,
I want to use the Data Pump PL/SQL API provided by Oracle. I have created a procedure named SP_EXPORT, shown below, but I get the following error message. Any suggestion or help, please?
ORA-31626: job does not exist
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 911
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4356
ORA-06512: at line 7
CREATE OR REPLACE PROCEDURE SP_EXPORT AS
  l_dp_handle      NUMBER;
  l_last_job_state VARCHAR2(30) := 'UNDEFINED';
  l_job_state      VARCHAR2(30) := 'UNDEFINED';
  l_sts            KU$_STATUS;
BEGIN
  l_dp_handle := DBMS_DATAPUMP.open(
    operation   => 'EXPORT',
    job_mode    => 'SCHEMA',
    remote_link => NULL,
    job_name    => 'EMP_EXPORT',
    version     => 'LATEST');

  DBMS_DATAPUMP.add_file(
    handle    => l_dp_handle,
    filename  => 'MIVA_DEVXEN03.dmp',
    directory => 'DATAPUMP_DIR');

  DBMS_DATAPUMP.add_file(
    handle    => l_dp_handle,
    filename  => 'MIVA_DEVXEN03.log',
    directory => 'DATAPUMP_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  DBMS_DATAPUMP.metadata_filter(
    handle => l_dp_handle,
    name   => 'SCHEMA_EXPR',
    value  => '= ''MIVA''');

  DBMS_DATAPUMP.start_job(l_dp_handle);

  DBMS_DATAPUMP.detach(l_dp_handle);
END;
/
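
Side note: l_last_job_state, l_job_state and l_sts are declared but never used; they look like leftovers of the usual monitoring loop. If you would rather wait for the job and print its progress instead of detaching immediately, a minimal sketch of that loop (standard DBMS_DATAPUMP.get_status call and status-mask constants; it would go in place of the detach above):

-- Sketch: poll the job until it completes, then detach.
WHILE l_job_state NOT IN ('COMPLETED', 'STOPPED') LOOP
  DBMS_DATAPUMP.get_status(
    handle    => l_dp_handle,
    mask      => DBMS_DATAPUMP.ku$_status_job_error +
                 DBMS_DATAPUMP.ku$_status_job_status +
                 DBMS_DATAPUMP.ku$_status_wip,
    timeout   => -1,                 -- block until the job state changes
    job_state => l_job_state,
    status    => l_sts);
  IF l_job_state != l_last_job_state THEN
    DBMS_OUTPUT.put_line('Job state: ' || l_job_state);  -- progress line
    l_last_job_state := l_job_state;
  END IF;
END LOOP;
DBMS_DATAPUMP.detach(l_dp_handle);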

Update: I connected as SYSDBA and it works; wow, very happy ;) So the ORA-31626 was most likely a privilege problem: under my original user the job could not create its Data Pump master table.

Similar Messages

  • An Oracle data pump situation that I need help with

    Oracle 10g running on Sun Solaris:
    I have written a Unix shell script that exports data into a dump file (a Data Pump export using expdp). Similarly, I have an import script that imports the data from the dump file (using impdp). These are not schema exports. In other words, we have logically divided our schema into 4 different groups based on their functionality (group1, group2, group3 and group4). Each of these groups consists of about 30-40 tables. For expdp there are 4 parfiles: group1.par, group2.par, group3.par and group4.par. Depending on the parameter you pass while running the script, the respective par file is picked and an export is done.
    For example,
    While running,
    exp_script.ksh group1
    will pick up group1.par and will export into group1.dmp.
    Similarly,
    exp_script.ksh group3
    will pick up group3.par and will export into group3.dmp.
    My import script also needs the parameter to be passed to pick the right dmp file.
    For example,
    imp_script.ksh group3
    will pick up group3.dmp and import every table that group3.dmp has. (Here, the import process does not need par files; all it needs is the right dmp file.)
    I am now faced with a difficulty where I must use Oracle's dbms_datapump API to achieve the same as above (and not expdp and impdp). I haven't used this API before. How best can I use this API to stick with my above strategy? I eventually need group1.dmp, group2.dmp, group3.dmp and group4.dmp, and I will need to pass the table names that each group contains. How do I do this using this API? Can someone point me to an example, or perhaps suggest one?
    Thanks

    Or at least, how do you do a table export using dbms_datapump?
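
    For the record, a minimal sketch of a table-mode export with dbms_datapump; the job name, dump/log file names, the DATA_PUMP_DIR directory object and the table list are placeholders you would derive from each group's par file:

    DECLARE
      h           NUMBER;
      l_job_state VARCHAR2(30);
    BEGIN
      -- TABLE-mode export job; the job name must be unique among running jobs
      h := DBMS_DATAPUMP.open(
        operation => 'EXPORT',
        job_mode  => 'TABLE',
        job_name  => 'GROUP1_EXPORT');

      -- dump file and log file in an existing directory object
      DBMS_DATAPUMP.add_file(h, 'group1.dmp', 'DATA_PUMP_DIR');
      DBMS_DATAPUMP.add_file(h, 'group1.log', 'DATA_PUMP_DIR',
                             filetype => DBMS_DATAPUMP.ku$_file_type_log_file);

      -- restrict the job to this group's tables (names in the current schema)
      DBMS_DATAPUMP.metadata_filter(h, 'NAME_EXPR',
        'IN (''TABLE_A'', ''TABLE_B'', ''TABLE_C'')');

      DBMS_DATAPUMP.start_job(h);
      DBMS_DATAPUMP.wait_for_job(h, l_job_state);
    END;
    /

    The import side is symmetrical: open with operation => 'IMPORT', add the group's dump file, and start the job; no metadata filter is needed, since the dump file already contains only that group's tables.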

  • I have downloaded a game called Temple Run and it is stuck downloading; under the game it says 'Waiting'. I have 3 other apps like that, and my Smurfs game is stuck updating too. I have a lot of apps, so I don't want to delete data! Please help!

    I have 4 apps that I just downloaded and they are stuck at 'Waiting'. I need someone to please come up with a solution. I don't want to erase my data because I have over 100 games, paid and free. My Smurfs game is also updating, and I can't get into it now because it won't update. Please help!

    These links might be of help:
    Troubleshooting applications purchased from the App Store
    iPhone, iPad, iPod touch: Turning off and on (restarting) and resetting
    iPod touch: Hardware troubleshooting
    I would use these links in order (if the info in the first link doesn't work, move on to the second link).

  • Could not start SQL services (please help me)

    I logged in to Windows with an administrator account, but I could not start the SQL services. Here is the newest log file; please help me fix this. Thanks in advance.
    2014-10-02 16:17:49.06 Server      Microsoft SQL Server 2005 - 9.00.1399.06 (Intel X86) 
    Oct 14 2005 00:33:37 
    Copyright (c) 1988-2005 Microsoft Corporation
    Developer Edition on Windows NT 5.1 (Build 2600: Service Pack 3)
    2014-10-02 16:17:49.06 Server      (c) 2005 Microsoft Corporation.
    2014-10-02 16:17:49.06 Server      All rights reserved.
    2014-10-02 16:17:49.06 Server      Server process ID is 3888.
    2014-10-02 16:17:49.06 Server      Logging SQL Server messages in file 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\LOG\ERRORLOG'.
    2014-10-02 16:17:49.06 Server      This instance of SQL Server last reported using a process ID of 1104 at 10/2/2014 4:07:13 PM (local) 10/2/2014 9:07:13 AM (UTC). This is an informational message only; no user action is required.
    2014-10-02 16:17:49.06 Server      Registry startup parameters:
    2014-10-02 16:17:49.06 Server      
    -d C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\DATA\master.mdf
    2014-10-02 16:17:49.06 Server      
    -e C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\LOG\ERRORLOG
    2014-10-02 16:17:49.06 Server      
    -l C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\DATA\mastlog.ldf
    2014-10-02 16:17:49.07 Server      SQL Server is starting at normal priority base (=7). This is an informational message only. No user action is required.
    2014-10-02 16:17:49.07 Server      Detected 2 CPUs. This is an informational message; no user action is required.
    2014-10-02 16:17:49.71 Server      Using dynamic lock allocation.  Initial allocation of 2500 Lock blocks and 5000 Lock Owner blocks per node.  This is an informational message only.  No user action is required.
    --The Anh--

    Hello,
    Can you tell us the status of the issue? If you have found the answer, please post it here so that it can help other members. If you found the answer in this thread, please mark the relevant post as the answer and close the thread.
    Please mark this reply as the answer if it solved your issue, or vote it as helpful, so that other forum members can benefit from it.
    My Technet Articles

  • Mistakenly deleted data on ODS ... please help

    Hello friends,
    I mistakenly deleted data from an ODS, and immediately afterwards ran an Initialize Delta Process. Now I don't know what to do.
    I am new to BI. Please help.

    I guess every morning they schedule a job in LBWE to update records in the delta queue.
    In SM37:
    job name: enter *LIS*
    user: *
    and execute it.
    You can double-check that no job has been scheduled for the particular application since this morning.
    Then, in the PSA, leave the 'request newer than' date blank and refresh; make sure no PSA request has been deleted recently. Depending on data volume, in some cases PSA records are deleted frequently after being successfully loaded to the target.
    If you have the entire request, execute the DTP. It will bring all records in the PSA to the target.
    Edited by: Priya.D on Apr 6, 2010 12:28 PM

  • Getting max date and max time column, please help

    hi
    I have a table with, say, 7 columns, with separate date and time columns. I want to write a query that retrieves the row with the current (or maximum) date and the maximum time within that date.
    Columns: a, b, c, date, time

    date      time
    22-05-07  20
    23-05-07  50
    24-05-07  40
    25-05-07  30
    22-05-07  20

    Answer (suppose the current date is 25):
    a, b, c, 25-05-07, 40

    try like this..

    with rt as (
      select 1 col1, '22-05-07' dt, 20 tm from dual union all
      select 2, '23-05-07', 50 from dual union all
      select 3, '24-05-07', 40 from dual union all
      select 4, '25-05-07', 30 from dual union all
      select 5, '25-05-07', 45 from dual union all
      select 6, '22-05-07', 20 from dual
    )
    select col1, dt, tm
    from (
      select col1, dt, tm,
             -- rank the rows of the latest date by time, highest first
             row_number() over (partition by to_date(dt,'DD-MM-RR')
                                order by tm desc) rn
      from   rt
      where  to_date(dt,'DD-MM-RR') = (select max(to_date(dt,'DD-MM-RR')) from rt)
    )
    where rn = 1;

          COL1 DT               TM
             5 25-05-07         45
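
    If you only need the date and time values themselves (not the other columns), a single-pass alternative is the KEEP (DENSE_RANK LAST) aggregate; a sketch against the same rt data:

    select max(to_date(dt,'DD-MM-RR')) dt,
           -- highest tm among the rows that share the latest date
           max(tm) keep (dense_rank last order by to_date(dt,'DD-MM-RR')) tm
    from   rt;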

  • Data Block (Reference column) URGENT~~ Please help!

    Hi:
    I have 3 data blocks created. Two of them are in a master-detail relation; I haven't added any relationship to the third one yet.
    I set "number of records" to 8 in both the 2nd and 3rd data blocks.
    I need the third block to synchronize with the data in the 2nd data block at all times,
    i.e. as the data in the 2nd data block changes, the data in the 3rd data block changes as well.
    Right now the first record of the 3rd data block stays synchronized with the 2nd data block
    at all times, but not the rest of the records. Is there a way I can fill that in?
    Thank you for all your suggestions and effort!
    tony

    I think that you can do this:
    Make the first (1) block the master of the second (2) block, and the second (2) block the master of the third (3) block.

  • Related to PL/SQL coding, please help me, it's urgent

    The record set's size depends on the date range:
    1 week: approx 1000 records
    1 month: approx 5000 records
    1 year: approx 50000 records
    I need to perform calculations on each row in the record set. For each row a recursive function is called to reach the end product (i.e. for each of the 50000 rows). This function keeps inserting rows into a temporary table from the data generated by the recursive function call. Hence it takes a lot of time, and the temp table keeps growing, creating a performance issue.
    The session at the user end becomes inactive, but on the server side it remains active and the insertion of records goes on.
    The logic works fine, but only for a limited date range (a day or two). For a month's or a year's processing it fails.
    What would be an optimized solution to the above problem?

    It's a bit difficult to give advice without knowing precisely what it is you're doing. What is the point of the recursive function? Why are you inserting into a temporary table? Does the recursive function do something with the records in the temporary table? If so then presumably your problem stems from the fact that a week only has to do 1000 * 1000 actions whereas a year has to undertake 50000 * 50000 actions.
    Have you traced these sessions to see what is going on?
    Cheers, APC
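
    To follow up on the tracing suggestion: a minimal sketch using DBMS_MONITOR (available from 10g; the sid and serial# values are placeholders, look them up in v$session first):

    -- find the session to trace (run with DBA privileges)
    select sid, serial#, username, status
    from   v$session
    where  username = 'YOUR_USER';   -- placeholder schema name

    -- enable SQL trace with wait events for that session
    begin
      dbms_monitor.session_trace_enable(
        session_id => 123,     -- placeholder sid
        serial_num => 4567,    -- placeholder serial#
        waits      => true,
        binds      => false);
    end;
    /

    -- ...let the processing run for a while, then switch tracing off again
    begin
      dbms_monitor.session_trace_disable(session_id => 123, serial_num => 4567);
    end;
    /

    The resulting trace file in user_dump_dest can then be summarized with tkprof to see where the time actually goes.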

  • Error during data pump import with SQL Developer

    Hello,
    I am trying to transfer data from one database to another through Data Pump via SQL Developer (the amount of data is quite large), exporting several tables.
    The table export works fine, but I encounter the following error when I import the file (I tried data only as well as data + DDL):
    "Exception: ORA-39001: argument value invalid dbms_datapump.get_status(64...=
    ORA-39001: argument value invalid
    ORA-39000: ....
    ORA-31619: ...
    The file is in the right place, the data pump folder of the new database. The user is the same on both databases, and the database versions are identical.
    Do you have any idea what the problem is?
    Thanks

    With the query SELECT * FROM v$version:
    Environment source
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    PL/SQL Release 11.2.0.1.0 - Production
    "CORE     11.2.0.1.0     Production"
    TNS for Linux: Version 11.2.0.1.0 - Production
    NLSRTL Version 11.2.0.1.0 - Production
    Environment target
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    PL/SQL Release 11.2.0.1.0 - Production
    "CORE     11.2.0.1.0     Production"
    TNS for Linux: Version 11.2.0.1.0 - Production
    NLSRTL Version 11.2.0.1.0 - Production
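
    One way to narrow this down is to run the same import directly through the DBMS_DATAPUMP API on the target database, bypassing SQL Developer. A minimal, untested sketch (the dump file name and directory object are placeholders):

    DECLARE
      h           NUMBER;
      l_job_state VARCHAR2(30);
    BEGIN
      h := DBMS_DATAPUMP.open(operation => 'IMPORT', job_mode => 'TABLE');

      -- the dump file produced by the export (placeholder name)
      DBMS_DATAPUMP.add_file(h, 'my_export.dmp', 'DATA_PUMP_DIR');

      -- keep existing tables and append the imported rows
      DBMS_DATAPUMP.set_parameter(h, 'TABLE_EXISTS_ACTION', 'APPEND');

      DBMS_DATAPUMP.start_job(h);
      DBMS_DATAPUMP.wait_for_job(h, l_job_state);
      DBMS_OUTPUT.put_line('Job finished: ' || l_job_state);
    END;
    /

    If this also fails around ORA-31619, the dump file itself may have been damaged while being copied to the target; a common cause is a non-binary (ASCII-mode) file transfer.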

  • Please help me to install APEX on 10g

    Hi anyone,
    I am installing APEX 3.0.1 on 10g Release 1, but I am getting an error at the very end. Can anyone please tell me how to solve this?
    begin
    ERROR at line 1:
    ORA-04063: package body "FLOWS_030000.WWV_FLOW_API" has errors
    ORA-06508: PL/SQL: could not find program unit being called
    ORA-06512: at line 4
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
    With the Partitioning, OLAP and Data Mining options
    F:\apex>
    Please help me... I have tried everything.


  • Please help in interfacing my VI with NI 5640

    I am interfacing the NI 5640 PCI with the Modulation Toolkit example VI "QAM transceiver", using the instrument driver of the PCI-5640, not the LabVIEW FPGA driver.
    My final constellation has a lot of errors.
    I am having problems setting the parameters of the PCI-5640. I am using the single tone generation VI at the transmitter side and the spectrum measurement example VI at the RX side.
    The parameters I am setting are:
    At TX:
    sampling rate of the complex cluster I give to ni 5640 R write waveform: 1600 kS/s
    IQ rate given to ni 5640 R configure generation: 1600 kS/s
    symbol rate: 100 kHz
    pulse shaping: root raised cosine with 16 samples per symbol
    At RX:
    What should the span be in the spectrum measurement example, i.e. the IQ rate at ni 5640 R configure acquisition?
    How many samples should I acquire?
    If my transmitted bits are 1044 and I apply 4-QAM to them, how should I resample my received array of IQ data?
    Please help, or give me an example of a transceiver system interfaced with the NI 5640R using the instrument driver.

    Good... I am also from AU.
    So which department are you from? And your name? I am from Telecom A.
    We are also doing 16-QAM on the DSP 6713, using LabVIEW. Do meet me so that we can resolve each other's issues.
    Regards,
    Wajahat Hassan

  • Interfacing NI 5640 with my VI, please help


    duplicate post -- continue here

  • Please help in interfacing NI 5640 with my VI

    AIRIAN

    Hi Airian,
    I think your question was answered here. Please post back if you need further help. Also, please keep your posts to a single thread, as it helps the community to easily follow the resolution.
    David L.
    Systems Engineering
    National Instruments

  • N95 maps problem, please help me

    Can anybody please solve my problem?
    I have an N95. When I open the GPS application, the globe is not shown on the screen and no maps are shown, only city names.
    In the past, after closing the maps application, the screen said "saving maps and data", but now nothing is shown.
    Please help me.

    I am using the latest firmware, and the old maps application, because the new beta 2.0 does not show my accurate position in Pakistan.
    If I am in my city, the new maps application says I am 10 km from my city, but the old version gives me an accurate position...

  • Help needed with Export Data Pump using API

    Hi All,
    I am trying to do a Data Pump export using the API.
    While the export as well as the import works fine from the command line, it fails with the API.
    This is the command-line version:
    expdp pxperf/dba@APPN QUERY=dev_pool_data:\"WHERE TIME_NUM > 1204884480100\" DUMPFILE=EXP_DEV.dmp tables=PXPERF.dev_pool_data
    Could you help me achieve the same as above with the Oracle Data Pump API?
    DECLARE
    h1 NUMBER;
    BEGIN
    h1 := dbms_datapump.open('EXPORT', 'TABLE', NULL, 'DP_EXAMPLE10', 'LATEST');
    dbms_datapump.add_file(h1, 'example3.dmp', 'DATA_PUMP_TEST', NULL, 1);
    dbms_datapump.add_file(h1, 'example3_dump.log', 'DATA_PUMP_TEST', NULL, 3);
    -- NAME_LIST takes a comma-delimited list of quoted names, without parentheses
    dbms_datapump.metadata_filter(h1, 'NAME_LIST', '''DEV_POOL_DATA''');
    dbms_datapump.start_job(h1);
    dbms_datapump.detach(h1);
    END;
    /
    Also, with the API, I want to know how to export and import multiple tables (selective tables only) using one single criterion like WHERE TIME_NUM > 1204884480100.

    Yes, I have read the Oracle documentation.
    I was able to proceed as below, but it gives an error:
    ============================================================
    SQL> SET SERVEROUTPUT ON SIZE 1000000
    SQL> DECLARE
    2 l_dp_handle NUMBER;
    3 l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    4 l_job_state VARCHAR2(30) := 'UNDEFINED';
    5 l_sts KU$_STATUS;
    6 BEGIN
    7 l_dp_handle := DBMS_DATAPUMP.open(
    8 operation => 'EXPORT',
    9 job_mode => 'TABLE',
    10 remote_link => NULL,
    11 job_name => '1835_XP_EXPORT',
    12 version => 'LATEST');
    13
    14 DBMS_DATAPUMP.add_file(
    15 handle => l_dp_handle,
    16 filename => 'x1835_XP_EXPORT.dmp',
    17 directory => 'DATA_PUMP_DIR');
    18
    19 DBMS_DATAPUMP.add_file(
    20 handle => l_dp_handle,
    21 filename => 'x1835_XP_EXPORT.log',
    22 directory => 'DATA_PUMP_DIR',
    23 filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    24
    25 DBMS_DATAPUMP.data_filter(
    26 handle => l_dp_handle,
    27 name => 'SUBQUERY',
    28 value => '(where "XP_TIME_NUM > 1204884480100")',
    29 table_name => 'ldev_perf_data',
    30 schema_name => 'XPSLPERF'
    31 );
    32
    33 DBMS_DATAPUMP.start_job(l_dp_handle);
    34
    35 DBMS_DATAPUMP.detach(l_dp_handle);
    36 END;
    37 /
    DECLARE
    ERROR at line 1:
    ORA-39001: invalid argument value
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3688
    ORA-06512: at line 25
    ============================================================
    I have a table called LDEV_PERF_DATA in schema XPSLPERF.
    value => '(where "XP_TIME_NUM > 1204884480100")' above is the condition on which I want to filter the data.
    However, the snippet below works fine:
    ============================================================
    SET SERVEROUTPUT ON SIZE 1000000
    DECLARE
      l_dp_handle      NUMBER;
      l_last_job_state VARCHAR2(30) := 'UNDEFINED';
      l_job_state      VARCHAR2(30) := 'UNDEFINED';
      l_sts            KU$_STATUS;
    BEGIN
      l_dp_handle := DBMS_DATAPUMP.open(
        operation   => 'EXPORT',
        job_mode    => 'SCHEMA',
        remote_link => NULL,
        job_name    => 'ldev_may20',
        version     => 'LATEST');

      DBMS_DATAPUMP.add_file(
        handle    => l_dp_handle,
        filename  => 'ldev_may20.dmp',
        directory => 'DATA_PUMP_DIR');

      DBMS_DATAPUMP.add_file(
        handle    => l_dp_handle,
        filename  => 'ldev_may20.log',
        directory => 'DATA_PUMP_DIR',
        filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

      DBMS_DATAPUMP.start_job(l_dp_handle);
      DBMS_DATAPUMP.detach(l_dp_handle);
    END;
    /
    ============================================================
    I don't want to export all contents as above; I want to export data based on some conditions, and only for selective tables.
    Any help is highly appreciated.
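
    For what it's worth, a likely fix, as an untested sketch: in the failing block the SUBQUERY value wraps the whole predicate in double quotes (so Oracle parses it as a quoted identifier), and the table/schema names are passed in lowercase while the objects were presumably created in uppercase. Passing a plain WHERE clause and combining it with a NAME_LIST filter for the selective tables would look like this (the job name, file names and the second table name are placeholders):

    DECLARE
      l_dp_handle NUMBER;
    BEGIN
      l_dp_handle := DBMS_DATAPUMP.open(
        operation => 'EXPORT',
        job_mode  => 'TABLE',
        job_name  => 'XP_EXPORT_FILTERED');

      DBMS_DATAPUMP.add_file(l_dp_handle, 'xp_filtered.dmp', 'DATA_PUMP_DIR');
      DBMS_DATAPUMP.add_file(l_dp_handle, 'xp_filtered.log', 'DATA_PUMP_DIR',
                             filetype => DBMS_DATAPUMP.ku$_file_type_log_file);

      -- selective tables: comma-delimited list of quoted, uppercase names
      DBMS_DATAPUMP.metadata_filter(l_dp_handle, 'NAME_LIST',
                                    '''LDEV_PERF_DATA'',''OTHER_TABLE''');

      -- one WHERE clause, plain text, no double quotes around the predicate;
      -- with table_name omitted it applies to every table in the job,
      -- so the column must exist in all filtered tables
      DBMS_DATAPUMP.data_filter(
        handle => l_dp_handle,
        name   => 'SUBQUERY',
        value  => 'WHERE XP_TIME_NUM > 1204884480100');

      DBMS_DATAPUMP.start_job(l_dp_handle);
      DBMS_DATAPUMP.detach(l_dp_handle);
    END;
    /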
