Data Pump error, please help

DECLARE
  ind          NUMBER;        -- Loop index
  h1           NUMBER;        -- Data Pump job handle
  percent_done NUMBER;        -- Percentage of job complete
  job_state    VARCHAR2(30);  -- To keep track of job state
  le           ku$_LogEntry;  -- For WIP and error messages
  js           ku$_JobStatus; -- The job status from get_status
  jd           ku$_JobDesc;   -- The job description from get_status
  sts          ku$_Status;    -- The status object returned by get_status
BEGIN
  -- Create a (user-named) Data Pump job to do a schema export.
  h1 := DBMS_DATAPUMP.OPEN('EXPORT','SCHEMA',NULL,'EXAMPLE1','LATEST');

  -- Specify a single dump file for the job (using the handle just returned)
  -- and a directory object, which must already be defined and accessible
  -- to the user running this procedure.
  -- BACKUP DIRECTORY NAME
  DBMS_DATAPUMP.ADD_FILE(h1,'example1.dmp','BACKUP');

  -- A metadata filter is used to specify the schema that will be exported.
  -- ORVETL USER NAME
  DBMS_DATAPUMP.METADATA_FILTER(h1,'SCHEMA_EXPR','IN (''orvetl'')');

  -- Start the job. An exception will be generated if something is not set up
  -- properly.
  DBMS_DATAPUMP.START_JOB(h1);

  -- The export job should now be running. In the following loop, the job
  -- is monitored until it completes. In the meantime, progress information
  -- is displayed.
  percent_done := 0;
  job_state := 'UNDEFINED';
  WHILE (job_state != 'COMPLETED') AND (job_state != 'STOPPED') LOOP
    dbms_datapump.get_status(h1,
                             dbms_datapump.ku$_status_job_error +
                             dbms_datapump.ku$_status_job_status +
                             dbms_datapump.ku$_status_wip, -1, job_state, sts);
    js := sts.job_status;

    -- If the percentage done changed, display the new value.
    IF js.percent_done != percent_done THEN
      dbms_output.put_line('*** Job percent done = ' ||
                           to_char(js.percent_done));
      percent_done := js.percent_done;
    END IF;

    -- If any work-in-progress (WIP) or error messages were received for the
    -- job, display them.
    IF (bitand(sts.mask, dbms_datapump.ku$_status_wip) != 0) THEN
      le := sts.wip;
    ELSIF (bitand(sts.mask, dbms_datapump.ku$_status_job_error) != 0) THEN
      le := sts.error;
    ELSE
      le := NULL;
    END IF;
    IF le IS NOT NULL THEN
      ind := le.FIRST;
      WHILE ind IS NOT NULL LOOP
        dbms_output.put_line(le(ind).LogText);
        ind := le.NEXT(ind);
      END LOOP;
    END IF;
  END LOOP;

  -- Indicate that the job finished and detach from it.
  dbms_output.put_line('Job has completed');
  dbms_output.put_line('Final job state = ' || job_state);
  dbms_datapump.detach(h1);
END;
The error:
ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 2926
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3162
ORA-06512: at line 20
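
ORA-39001 from a block like this is most often raised by the ADD_FILE call when the directory object (here, BACKUP) does not exist or the executing user lacks READ/WRITE on it; that is an assumption, since the thread does not confirm the cause. A minimal check and fix as a DBA, with a placeholder path:

SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'BACKUP';

CREATE OR REPLACE DIRECTORY backup AS '/u01/app/oracle/backup';  -- path is a placeholder
GRANT READ, WRITE ON DIRECTORY backup TO orvetl;

Note also that the metadata filter passes the lowercase string 'orvetl'; unless the schema was created with a quoted lowercase identifier, the stored name is ORVETL, and a non-matching schema filter is another common pitfall.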

I assume all the other dimensions are being specified via a load rule header (i.e. the rule is otherwise valid).
What is your data source? What does the number (data) format look like? Can you identify (and post) specific rows that are causing the error?

Similar Messages

  • Data pump error ORA-39065, status undefined after restart

    Hi members,
    The Data Pump full import job hung, continue_client also hung, and all of a sudden the window exited.
    ;;; Import> status
    ;;; Import> help
    ;;; Import> status
    ;;; Import> continue_client
    ORA-39065: unexpected master process exception in RECEIVE
    ORA-39078: unable to dequeue message for agent MCP from queue "KUPC$C_1_20090923181336"
    Job "SYSTEM"."SYS_IMPORT_FULL_01" stopped due to fatal error at 18:48:03
    I increased the shared_pool to 100M and then restarted the job with attach=jobname. After restarting, I queried the status and found that everything is undefined. It still says undefined now, and the last log message says that the job has been reopened. That's the end of the log file, and nothing else is being recorded. I am not sure what is happening now. Any ideas will be appreciated. This is version 10.2.0.3 on Windows. Thanks ...
    Job SYS_IMPORT_FULL_01 has been reopened at Wednesday, 23 September, 2009 18:54
    Import> status
    Job: SYS_IMPORT_FULL_01
    Operation: IMPORT
    Mode: FULL
    State: IDLING
    Bytes Processed: 3,139,231,552
    Percent Done: 33
    Current Parallelism: 8
    Job Error Count: 0
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest%u.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest01.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest02.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest03.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest04.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest05.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest06.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest07.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest08.dmp
    Worker 1 Status:
    State: UNDEFINED
    Worker 2 Status:
    State: UNDEFINED
    Object Schema: trm
    Object Name: EVENT_DOCUMENT
    Object Type: DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
    Completed Objects: 1
    Completed Rows: 78,026
    Completed Bytes: 4,752,331,264
    Percent Done: 100
    Worker Parallelism: 1
    Worker 3 Status:
    State: UNDEFINED
    Worker 4 Status:
    State: UNDEFINED
    Worker 5 Status:
    State: UNDEFINED
    Worker 6 Status:
    State: UNDEFINED
    Worker 7 Status:
    State: UNDEFINED
    Worker 8 Status:
    State: UNDEFINED

    39065, 00000, "unexpected master process exception in %s"
    // *Cause:  An unhandled exception was detected internally within the master
    //          control process for the Data Pump job.  This is an internal error.
    //          Subsequent messages will detail the problems.
    // *Action: If problem persists, contact Oracle Customer Support.
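
    When a restarted job idles with UNDEFINED workers like this, one way forward (a sketch, not a guaranteed fix) is to attach interactively and either resume or abandon the job:
    impdp system/*** attach=SYS_IMPORT_FULL_01
    Import> status
    Import> continue_client
    If it will not resume, KILL_JOB abandons the job and drops its master table, so the import can be started fresh:
    Import> kill_job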

  • Data Pump Error

    Dear All,
    Using Data Pump to export data for a schema:
    #!/bin/sh
    PS1='$PWD # '
    ORACLE_BASE=/orabin/oracle
    ORACLE_HOME=/orabin/oracle/product/10.1.0
    ORACLE_SID=vimadb
    PATH=$ORACLE_HOME/bin:$PATH:.
    export PATH PS1 ORACLE_BASE ORACLE_HOME ORACLE_SID
    /orabin/oracle/product/10.1.0/bin/expdp vproddta/vproddta@vimadb schemas=vproddta EXCLUDE=STATISTICS directory=datadir1 dumpfile=datadir1:`date '+Full_expdp_vproddta_%d%m%y_%H%M'`.dmp logfile=datadir1:`date '+Full_expdp_vproddta_%d%m%y_%H%M'`.log
    The directory is already created at the OS level:
    /syslog/datapump/
    SQL> create directory datadir1 as '/syslog/datapump/';
    SQL> grant read, write on directory datadir1 to vproddta;
    Getting the error:
    Export: Release 10.1.0.4.0 - 64bit Production on Wednesday, 10 August, 2011 16:52
    Copyright (c) 2003, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Release 10.1.0.4.0 - 64bit Production
    ORA-39002: invalid operation
    ORA-39070: Unable to open the log file.
    ORA-29283: invalid file operation
    ORA-06512: at "SYS.UTL_FILE", line 475
    ORA-29283: invalid file operation

    1. Check the OS permissions on this directory.
    2. Try to create the export using a simple name, like "export.dmp"; if Data Pump succeeds, then check the generated filename.
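
    A quick way to verify the first point on the server (a sketch assuming the instance runs as the oracle OS user; the user/group names are placeholders):
    ls -ld /syslog/datapump      # the database's OS user needs write access here
    chown oracle:oinstall /syslog/datapump
    chmod 750 /syslog/datapump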

  • Data Pump Error - ORA-04063: package body "SYS.DBMS_INTERNAL_LOGSTDBY" has errors

    Hi All,
    I am getting the following errors when I try to connect with Data Pump. My DB is 10g and the OS is Linux.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
    With the Partitioning, OLAP and Data Mining options
    ORA-31626: job does not exist
    ORA-04063: package body "SYS.DBMS_INTERNAL_LOGSTDBY" has errors
    ORA-06508: PL/SQL: could not find program unit being called
    ORA-06512: at "SYS.DBMS_LOGSTDBY", line 24
    ORA-06512: at "SYS.KUPV$FT", line 676
    ORA-04063: package body "SYS.DBMS_INTERNAL_LOGSTDBY" has errors
    ORA-06508: PL/SQL: could not find program unit being called
    When I tried to compile this package, I got the following errors:
    SQL> alter package DBMS_INTERNAL_LOGSTDBY compile body;
    Warning: Package Body altered with compilation errors.
    SQL> show error
    Errors for PACKAGE BODY DBMS_INTERNAL_LOGSTDBY:
    LINE/COL ERROR
    1405/4 PL/SQL: SQL Statement ignored
    1412/38 PL/SQL: ORA-00904: "SQLTEXT": invalid identifier
    1486/4 PL/SQL: SQL Statement ignored
    1564/7 PL/SQL: ORA-00904: "DBID": invalid identifier
    1751/2 PL/SQL: SQL Statement ignored
    1870/7 PL/SQL: ORA-00904: "DBID": invalid identifier
    Can anyone suggest how to resolve this issue?
    Thanks in advance

    SQL> SELECT OBJECT_TYPE,OBJECT_NAME FROM DBA_OBJECTS
    2 WHERE OWNER='SYS' AND STATUS<>'VALID';
    OBJECT_TYPE OBJECT_NAME
    VIEW DBA_COMMON_AUDIT_TRAIL
    PACKAGE BODY DBMS_INTERNAL_LOGSTDBY
    PACKAGE BODY DBMS_REGISTRY_SYS
    Thanks
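
    The ORA-00904 errors usually mean the package body no longer matches the dictionary it compiles against (often after a partial upgrade or patch). A first step worth trying (a sketch; a genuinely mismatched dictionary may need the upgrade/patch scripts re-run) is to recompile all invalid objects as SYSDBA and re-check:
    sqlplus / as sysdba
    SQL> @?/rdbms/admin/utlrp.sql
    SQL> SELECT object_type, object_name FROM dba_objects
      2  WHERE owner = 'SYS' AND status <> 'VALID';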

  • DATA-PUMP ERROR: ORA-39070 Database on Linux, Client on Win 2008

    Hi,
    I want to make a Data Pump export from a Windows 2008 client. I defined dpdir as 'C:\DPDIR'.
    While running expdp
    expdp login\pass@ora directory=dpdir dumpfile=dump.dmp logfile=log.log full=y
    I get these errors:
    ORA-39002: invalid operation
    ORA-39070: Unable to open the log file.
    ORA-29283: invalid file operation
    ORA-06512: at "SYS.UTL_FILE", line 536
    ORA-29283: invalid file operation
    (the descriptions were in Polish; translated here)
    I found out that Data Pump is saving files to the Linux server (where the database is). When I define 'C:\DPDIR', it doesn't recognize it, because there is no such directory on Linux.
    How can I save the Data Pump export dump file on Windows?

    expdp can only create dump files on the database server itself.
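
    Two common workarounds, sketched with placeholder names and paths: export into a server-side directory object and copy the dump across afterwards, or fall back to the legacy exp client, which does write files on the client machine.
    SQL> create directory dpdir as '/u01/app/oracle/dpdump';  -- server-side path (placeholder)
    SQL> grant read, write on directory dpdir to login;
    Then, after the export completes, copy the file to Windows, e.g.:
    pscp oracle@linuxhost:/u01/app/oracle/dpdump/dump.dmp C:\DPDIR\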

  • Data Pump Errors

    trying to execute the sample here:
    http://download.oracle.com/docs/cd/B12037_01/server.101/b10825/dp_api.htm#i1006925
    Example 5-1 Performing a Simple Schema Export
    So I created a package and tried to modify it to do a table export instead of a schema export; I messed it up and it didn't work. I did this twice.
    If I query USER_DATAPUMP_JOBS I have 2 entries ('EXAMPLE1' and 'EXAMPLE2'), so when I try to rerun the sample, I get an ORA-31634 job already exists.
    Hmm, it looks like I could then attach to the job to rerun my tests or to remove it, but when I try to attach, I get ORA-31626 job does not exist.
    Am I missing something basic here? All the other procedures require a handle, so it's either OPEN or ATTACH.

    I'd better read the doc again. It seems strange to create tables in the schema with the name of the job.
    I deleted those tables in the user, and when I then did a select * from user_datapump_jobs I could still see rows for the old jobs plus the names of the recycle-bin objects. After purge recyclebin they were gone.
    I'm not using EXPDP, I'm trying the API-based approach.
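
    For background: every Data Pump job keeps a master table named after the job in the owning schema, which is why dropping those tables (and purging the recycle bin) cleared the ORA-31634. A cleanup sketch using the job names from this thread:
    SQL> SELECT job_name, state FROM user_datapump_jobs;
    SQL> DROP TABLE example1 PURGE;  -- the master table has the same name as the job
    SQL> DROP TABLE example2 PURGE;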

  • Data Pump Export error - network mounted path

    Hi,
    Please have a look at the Data Pump error I am getting while doing an export. I am running version 11g. Please help with your feedback.
    I am getting the error due to the network-mounted path for the directory (ORALOAD); it works fine with a local path. I have given full permissions on the network path, and UTL_FILE is able to create files, but Data Pump fails with the error messages below.
    Oracle 11g
    Solaris 10
    Getting the below error:
    ERROR at line 1:
    ORA-39001: invalid argument value
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3444
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3693
    ORA-06512: at line 64
    DECLARE
    p_part_name VARCHAR2(30);
    p_msg VARCHAR2(512);
    v_ret_period NUMBER;
    v_arch_location VARCHAR2(512);
    v_arch_directory VARCHAR2(20);
    v_rec_count NUMBER;
    v_partition_dumpfile VARCHAR2(35);
    v_partition_dumplog VARCHAR2(35);
    v_part_date VARCHAR2(30);
    p_partition_name VARCHAR2(30);
    v_partition_arch_location VARCHAR2(512);
    h1 NUMBER; -- Data Pump job handle
    job_state VARCHAR2(30); -- To keep track of job state
    le ku$_LogEntry; -- For WIP and error messages
    js ku$_JobStatus; -- The job status from get_status
    jd ku$_JobDesc; -- The job description from get_status
    sts ku$_Status; -- The status object returned by get_status
    ind NUMBER; -- Loop index
    percent_done NUMBER; -- Percentage of job complete
    --check whether the dump file exists in the directory
    l_file utl_file.file_type;   
    l_file_name varchar2(20);
    l_exists boolean;
    l_length number;
    l_blksize number;
    BEGIN
    p_part_name:='P2010110800';
    p_partition_name := upper(p_part_name);
    v_partition_dumpfile :=  chr(39)||p_partition_name||chr(39);
    v_partition_dumplog  :=  p_partition_name || '.LOG';
         SELECT COUNT(*) INTO v_rec_count FROM HDB.PARTITION_BACKUP_MASTER WHERE PARTITION_ARCHIVAL_STATUS='Y';
             IF v_rec_count != 0 THEN
               SELECT
               PARTITION_ARCHIVAL_PERIOD
               ,PARTITION_ARCHIVAL_LOCATION
               ,PARTITION_ARCHIVAL_DIRECTORY
               INTO v_ret_period , v_arch_location , v_arch_directory
               FROM HDB.PARTITION_BACKUP_MASTER WHERE PARTITION_ARCHIVAL_STATUS='Y';
             END IF;
         utl_file.fgetattr('ORALOAD', l_file_name, l_exists, l_length, l_blksize);      
            IF (l_exists) THEN        
             utl_file.FRENAME('ORALOAD', l_file_name, 'ORALOAD', p_partition_name ||'_'|| to_char(systimestamp,'YYYYMMDDHH24MISS') ||'.DMP', TRUE);
         END IF;
        v_part_date := replace(p_partition_name,'P');
            DBMS_OUTPUT.PUT_LINE('inside');
    h1 := dbms_datapump.open (operation => 'EXPORT',
                              job_mode  => 'TABLE');
          dbms_datapump.add_file (handle    => h1,
                                      filename  => p_partition_name ||'.DMP',
                                      directory => v_arch_directory,
                                      filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
              dbms_datapump.add_file (handle    => h1,
                                      filename  => p_partition_name||'.LOG',
                                      directory => v_arch_directory,
                                      filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
              dbms_datapump.metadata_filter (handle => h1,
                                             name   => 'SCHEMA_EXPR',
                                             value  => 'IN (''HDB'')');
              dbms_datapump.metadata_filter (handle => h1,
                                             name   => 'NAME_EXPR',
                                             value  => 'IN (''SUBSCRIBER_EVENT'')');
              dbms_datapump.data_filter (handle      => h1,
                                         name        => 'PARTITION_LIST',
                                        value       => v_partition_dumpfile,
                                        table_name  => 'SUBSCRIBER_EVENT',
                                        schema_name => 'HDB');
              dbms_datapump.set_parameter(handle => h1, name => 'COMPRESSION', value => 'ALL');
              dbms_datapump.start_job (handle => h1);
                  dbms_datapump.detach (handle => h1);              
    END;
    /

    Hi,
    I tried to generate the dump with expdp instead of the API and got more specific error logs.
    Note that the log file did get created on the same path.
    expdp hdb/hdb DUMPFILE=P2010110800.dmp DIRECTORY=ORALOAD TABLES=(SUBSCRIBER_EVENT:P2010110800) logfile=P2010110800.log
    Export: Release 11.2.0.1.0 - Production on Wed Nov 10 01:26:13 2010
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning, Automatic Storage Management, OLAP, Data Mining
    and Real Application Testing options
    ORA-39001: invalid argument value
    ORA-39000: bad dump file specification
    ORA-31641: unable to create dump file "/nfs_path/lims/backup/hdb/datapump/P2010110800.dmp"
    ORA-27054: NFS file system where the file is created or resides is not mounted with correct options
    Additional information: 3
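
    ORA-27054 points specifically at the NFS mount options. The usual remedy (a sketch for Solaris; the exact options are platform- and version-specific, so verify against Oracle's NFS mount-option guidance, e.g. MOS note 359515.1, before applying) is to remount the share with hard mounts over TCP and fixed transfer sizes, along these lines:
    mount -F nfs -o rw,bg,hard,nointr,rsize=32768,wsize=32768,proto=tcp,vers=3 \
          nfsserver:/export/backup /nfs_path/lims/backup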

  • Position Activity Sequence Resulting in data extracting error

    Hi gurus,
    Can anybody tell me what this "position activity sequence" error, which results in a data extraction error, is? Please help; this is very high priority for me.
    Regards,
    dev

    I rebuilt the report using a tabular form and populated the primary key with a PL/SQL expression, and now it appears to be working. P1_ISSUE_NO is the primary key on the issue table, and I could not figure out how to get it into the issue_no field of the tabular report when a new line was entered. I ended up putting the following code in as the default for issue_no on the report:
    Begin
      return :P1_ISSUE_NO;
    end;
    It works great.

  • Help needed with Export Data Pump using API

    Hi All,
    I am trying to do a Data Pump export using the API.
    While the export as well as the import works fine from the command line, it's failing with the API.
    This is the command line program:
    expdp pxperf/dba@APPN QUERY=dev_pool_data:\"WHERE TIME_NUM > 1204884480100\" DUMPFILE=EXP_DEV.dmp tables=PXPERF.dev_pool_data
    Could you help me with how to achieve the same as above using the Oracle Data Pump API?
    DECLARE
    h1 NUMBER;
    BEGIN
    h1 := dbms_datapump.open('EXPORT','TABLE',NULL,'DP_EXAMPLE10','LATEST');
    dbms_datapump.add_file(h1,'example3.dmp','DATA_PUMP_TEST',NULL,1);
    dbms_datapump.add_file(h1,'example3_dump.log','DATA_PUMP_TEST',NULL,3);
    dbms_datapump.metadata_filter(h1,'NAME_LIST','(''DEV_POOL_DATA'')');
    END;
    Also, for the API, I want to know how to export and import multiple tables (selected tables only) using one single criterion like "WHERE TIME_NUM > 1204884480100"

    Yes, I have read the Oracle doc.
    I was able to proceed as below, but it gives an error.
    ============================================================
    SQL> SET SERVEROUTPUT ON SIZE 1000000
    SQL> DECLARE
    2 l_dp_handle NUMBER;
    3 l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    4 l_job_state VARCHAR2(30) := 'UNDEFINED';
    5 l_sts KU$_STATUS;
    6 BEGIN
    7 l_dp_handle := DBMS_DATAPUMP.open(
    8 operation => 'EXPORT',
    9 job_mode => 'TABLE',
    10 remote_link => NULL,
    11 job_name => '1835_XP_EXPORT',
    12 version => 'LATEST');
    13
    14 DBMS_DATAPUMP.add_file(
    15 handle => l_dp_handle,
    16 filename => 'x1835_XP_EXPORT.dmp',
    17 directory => 'DATA_PUMP_DIR');
    18
    19 DBMS_DATAPUMP.add_file(
    20 handle => l_dp_handle,
    21 filename => 'x1835_XP_EXPORT.log',
    22 directory => 'DATA_PUMP_DIR',
    23 filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    24
    25 DBMS_DATAPUMP.data_filter(
    26 handle => l_dp_handle,
    27 name => 'SUBQUERY',
    28 value => '(where "XP_TIME_NUM > 1204884480100")',
    29 table_name => 'ldev_perf_data',
    30 schema_name => 'XPSLPERF'
    31 );
    32
    33 DBMS_DATAPUMP.start_job(l_dp_handle);
    34
    35 DBMS_DATAPUMP.detach(l_dp_handle);
    36 END;
    37 /
    DECLARE
    ERROR at line 1:
    ORA-39001: invalid argument value
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3688
    ORA-06512: at line 25
    ============================================================
    I have a table called LDEV_PERF_DATA, and it's in schema XPSLPERF.
    value => '(where "XP_TIME_NUM > 1204884480100")'
    The above is the condition on which I want to filter the data.
    However, the below snippet works fine.
    ============================================================
    SET SERVEROUTPUT ON SIZE 1000000
    DECLARE
    l_dp_handle NUMBER;
    l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    l_job_state VARCHAR2(30) := 'UNDEFINED';
    l_sts KU$_STATUS;
    BEGIN
    l_dp_handle := DBMS_DATAPUMP.open(
    operation => 'EXPORT',
    job_mode => 'SCHEMA',
    remote_link => NULL,
    job_name => 'ldev_may20',
    version => 'LATEST');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.dmp',
    directory => 'DATA_PUMP_DIR');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.log',
    directory => 'DATA_PUMP_DIR',
    filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    DBMS_DATAPUMP.start_job(l_dp_handle);
    DBMS_DATAPUMP.detach(l_dp_handle);
    END;
    ============================================================
    I don't want to export all contents as above; I want to export data based on some conditions, and only for selected tables.
    Any help is highly appreciated.
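
    A likely culprit (an assumption based on the documented SUBQUERY filter format, not confirmed in this thread): the filter value should be a bare predicate, and the embedded double quotes turn the whole expression into an invalid quoted identifier; object names are also stored in uppercase. A corrected sketch:
    DBMS_DATAPUMP.data_filter(
      handle      => l_dp_handle,
      name        => 'SUBQUERY',
      value       => 'WHERE xp_time_num > 1204884480100',
      table_name  => 'LDEV_PERF_DATA',
      schema_name => 'XPSLPERF');
    For one criterion across several tables, add the filter once per table; per the DBMS_DATAPUMP documentation, omitting table_name applies a data filter to all tables in the job.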

  • Error connecting as DBA using Data Pump Within Transportable Modules

    Hi all
    I am using OWB 10g R2 and trying to set up a transportable module to test the Oracle Data Pump utility. I have followed the user manual in terms of the relevant grants and permissions needed to use this functionality, have successfully connected to both my source and target databases, and have created my transportable module, which will extract six tables from a source database on 10.2.0.1.0 to my target schema in my warehouse, also on 10.2.0.1.0. When I try to deploy/execute the transportable module, it fails with the following error.
    RPE-01023: Failed to establish connection to target database as DBA
    Now we have even gone as far as granting the DBA role to the user within our target, but we still get the same error, so we assume it is something to do with the connection of the Transportable Target Module Location and that it needs to connect as DBA somehow in the connect string. Has anyone experienced this issue, and is there a way of creating the location connection that is not documented?
    There is no mention of this anywhere within the manual, and I have even followed the example from http://www.rittman.net/archives/2006_04.html; my target user has the privileges detailed in the manual, as below:
    User must not be SYS. Must have ALTER TABLESPACE privilege and IMP_FULL_DATABASE role. Must have CREATE MATERIALIZED VIEW privilege with ADMIN option. Must be a Warehouse Builder repository database user.
    Any help would be appreciated before I raise a request with Oracle.

    Did you ever find a resolution? We are experiencing the same issue...
    thanks
    OBX

  • ORA-12560 error when trying to do a data pump

    I installed Oracle 11g and was attempting to do a Data Pump import from the dev machine to my computer; however, when I run the impdp command I get an ORA-12560: TNS:protocol adapter error.
    A co-worker thought I might need to add a loopback adapter, as that had been the only difference from his machine where everything worked fine, so I added the loopback adapter, but it still didn't work. We thought maybe the adapter had to be in place first, so I uninstalled the database and installed it again, but that also didn't work; I still get the same error message.
    Any ideas on what I should try?
    Thanks for any help you can give me.

    user11340791 wrote:
    Please check if the service for the database is running if you are working on Windows...
    I am on Windows; how do I check if the service for the database is running? Sorry to need so much hand-holding, I'm pretty new at this.
    You can check it from My Computer -> Right-click -> Manage -> Services. Check there whether the service for your database is started; if not, start it.
    >
    I forgot to mention that I do have my environment variables set: ORACLE_HOME to the path the bin folder is in, and ORACLE_SID set to the database SID.
    You are on Windows, and there, most of the time, this is taken care of automatically by the OS itself.
    HTH
    Aman....
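
    For reference, a minimal check from a Windows command prompt (the service is normally named OracleService<SID>; ORCL below is a placeholder for the actual SID):
    C:\> sc query OracleServiceORCL
    C:\> net start OracleServiceORCL
    C:\> set ORACLE_SID=ORCL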

  • Data pump export with error

    Hi All,
    I am new to DBA work, and when I run expdp for the full database I get the below error:
    "ORA-39139: Data Pump does not support XMLSchema objects. TABLE_DATA:"APPS"."abc_TAB" will be skipped."
    I am facing an issue with the exp utility, so I have to use expdp.
    Kindly help me to avoid these errors.
    Thanks,

    Thanks Srini for the Doc - A good one.
    We have scheduled a daily full export backup. Now this is having problems.
    -> exp expuser/exppaswd file=/ora/exp.dmp log=/ora/exp.log direct=Y statistics=none full=Y
    error=
    Table window348_TAB will be exported in conventional path.
    . . exporting table window348_TAB 0 rows exported
    Table wtpParameters350_TAB will be exported in conventional path.
    . . exporting table wtpParameters350_TAB 0 rows exported
    . exporting synonyms
    EXP-00008: ORACLE error 1406 encountered
    ORA-01406: fetched column value was truncated
    EXP-00000: Export terminated unsuccessfully
    Oracle suggested dropping/disabling the policy, but one policy is assigned to many tables, and one table can have more than one policy.
    We haven't set any policy; all are default ones.
    1. If I disable/drop the policies, how will it affect the application?
    2. How can I disable or drop them for all the objects in the database?
    3. How can I identify the policies - whether OLS or VPD?
    Please help me to get a daily full export backup, or any alternate solution for the same.
    Thanks,
    Asim
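
    For question 3, a sketch using the standard dictionary views (DBA_SA_TABLE_POLICIES exists only when Oracle Label Security is installed):
    SQL> SELECT object_owner, object_name, policy_name FROM dba_policies;   -- VPD/RLS policies
    SQL> SELECT * FROM dba_sa_table_policies;                               -- OLS policies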

  • ERROR: Invalid data pump job state; one of: DEFINING or EXECUTING.

    Hi all.
    I'm using Oracle 10g (10.2.0.3). I had some scheduled jobs to export the DB with Data Pump. Some days ago they began to fail with the message:
    ERROR: Invalid data pump job state; one of: DEFINING or EXECUTING.
    Wait a few moments and check the log file (if one was used) for more details.
    EM says that I can't monitor the job because there are no master tables. How can I find the log?
    I also found in the bdump that exceptions had occurred:
    *** 2008-08-22 09:04:54.671
    *** ACTION NAME:(EXPORT_ITATRD_DIARIO) 2008-08-22 09:04:54.671
    *** MODULE NAME:(Data Pump Master) 2008-08-22 09:04:54.671
    *** SERVICE NAME:(SYS$USERS) 2008-08-22 09:04:54.671
    *** SESSION ID:(132.3726) 2008-08-22 09:04:54.671
    kswsdlaqsub: unexpected exception err=604, err2=24010
    kswsdlaqsub: unexpected exception err=604, err2=24010
    * kjdrpkey2hv: called with pkey 114178, options x8
    * kjdrpkey2hv: called with pkey 114176, options x8
    * kjdrpkey2hv: called with pkey 114177, options x8
    Some help would be appreciated.
    Thanks.

    Delete any uncompleted or failed jobs and try again.

  • DATA PUMP export error

    Hi all
    During a Data Pump export (big table; the dump is 289 GB, FILESIZE=10G) I got an error:
    . . exported "TEST3"."INC_T_ZMLUVY_PARTNERI" 6.015 GB 61182910 rows
    . . exported "TEST3"."PAR_T_VZTAHY" 5.798 GB 73121325 rows
    ORA-31693: Table data object "TEST3"."INC_T_POISTENIE_PARAMETRE_H" failed to load/unload and is being skipped due to error:
    ORA-31617: unable to open dump file "/u01/app/oracle/backup/STARINS/exp_polska03.dmp" for write
    ORA-19505: failed to identify file "/u01/app/oracle/backup/STARINS/exp_polska03.dmp"
    ORA-27037: unable to obtain file status
    Linux-x86_64 Error: 2: No such file or directory
    Additional information: 3
    . . exported "TEST3"."PAY_T_UCET_POLOZKA" 5.344 GB 97337823 rows
    and the export continued until:
    Job "SYSTEM"."SYS_EXPORT_SCHEMA_02" completed with 1 error
    (it took 8 hours)
    Can you help me: is the dump now OK? Will impdp continue after reading exp_polska03.dmp (the dump has 28 files, exp_polska1 - exp_polska28)? At worst I will export this one table again... is this solution OK?
    Thanks, Brano

    What expdp parameters were used?
    What's the total export dump size? (Try estimate_only=y.)
    ORA-31617: unable to open dump file "/u01/app/oracle/backup/STARINS/exp_polska03.dmp" for write
    ORA-19505: failed to identify file "/u01/app/oracle/backup/STARINS/exp_polska03.dmp"
    http://systemr.blogspot.com/2007/09/section-oracle-database-utilities-title.html
    Check the above one.
    I guess the file was removed from the location, or there is not enough permission to write.
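
    For the size question, estimate_only reports an estimate without writing a dump file; a sketch with the schema from this thread:
    expdp system/*** schemas=TEST3 estimate_only=y estimate=blocks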

  • Data Pump import to a SQL file error: ORA-31655 no data or metadata objects

    Hello,
    I'm using Data Pump to export/import data; one requirement is to import the data to a SQL file. The OS is Windows.
    I made the following export:
    expdp system/password directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY tables=user.tablename
    and it works; I can see the file TABLESDUMP.DMP in the directory path.
    Then, when I tried to import it to a SQL file:
    impdp system/password directory=dpump_dir dumpfile=tablesdump.dmp sqlfile=tables_export.sql
    the log shows:
    ORA-31655 no data or metadata objects selected for job
    and the SQL file is created empty in the directory path.
    I'm not a DBA, I'm a Java developer. Can you help me?
    Thanks

    Hi, I added the command line:
    expdp system/system directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY schemas=ko1 tables=KO1QT01 logfile=capture.log
    The log on the console screen is below (it was in Spanish; translated here); no log file was created in the directory path.
    Export: Release 10.2.0.1.0 - Production on Tuesday, 26 January, 2010 12:59:14
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    UDE-00010: multiple job modes requested, schema and tables.
    This is why I used tables=user.tablename instead; is this right?
    Thanks
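
    For what it's worth: SQLFILE extracts only the DDL (metadata) contained in a dump, and a dump taken with content=DATA_ONLY contains no metadata, which matches the ORA-31655. A sketch of a combination that should produce DDL (the default CONTENT=ALL includes metadata):
    expdp system/password directory=dpump_dir dumpfile=tablesdump.dmp tables=ko1.KO1QT01
    impdp system/password directory=dpump_dir dumpfile=tablesdump.dmp sqlfile=tables_export.sql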
