Data Pump Error

Dear All,
I am using Data Pump to export data for a schema:
#!/bin/sh
PS1='$PWD # '
ORACLE_BASE=/orabin/oracle
ORACLE_HOME=/orabin/oracle/product/10.1.0
ORACLE_SID=vimadb
PATH=$ORACLE_HOME/bin:$PATH:.
export PATH PS1 ORACLE_BASE ORACLE_HOME ORACLE_SID
/orabin/oracle/product/10.1.0/bin/expdp vproddta/vproddta@vimadb schemas=vproddta EXCLUDE=STATISTICS directory=datadir1 dumpfile=datadir1:`date '+Full_expdp_vproddta_%d%m%y_%H%M'`.dmp logfile=datadir1:`date '+Full_expdp_vproddta_%d%m%y_%H%M'`.log

The directory has already been created at the OS level:
/syslog/datapump/
SQL> create directory datadir1 as '/syslog/datapump/';
SQL> grant read, write on directory datadir1 to vproddta;

I am getting this error:
Export: Release 10.1.0.4.0 - 64bit Production on Wednesday, 10 August, 2011 16:52
Copyright (c) 2003, Oracle.  All rights reserved.
Connected to: Oracle Database 10g Release 10.1.0.4.0 - 64bit Production
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation

1. Check the OS permissions for this directory.
2. Try to create the export using a simple name, like "export.dmp"; if Data Pump succeeds, then check the generated filename.
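A quick way to test both points from the shell (a sketch; the oracle:dba ownership shown is an assumption, adjust to your installation):

# 1. The directory must be readable and writable by the Oracle software owner
ls -ld /syslog/datapump        # expect something like: drwxrwxr-x oracle dba
chown oracle:dba /syslog/datapump
chmod 770 /syslog/datapump

# 2. Retry with fixed, simple file names to rule out the backtick expansion
expdp vproddta/vproddta@vimadb schemas=vproddta directory=datadir1 \
    dumpfile=export.dmp logfile=export.log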

Similar Messages

  • Data pump error ORA-39065, status undefined after restart

    Hi members,
    The data pump full import job hung, continue_client also hung, and then all of a sudden the window exited.
    ;;; Import> status
    ;;; Import> help
    ;;; Import> status
    ;;; Import> continue_client
    ORA-39065: unexpected master process exception in RECEIVE
    ORA-39078: unable to dequeue message for agent MCP from queue "KUPC$C_1_20090923181336"
    Job "SYSTEM"."SYS_IMPORT_FULL_01" stopped due to fatal error at 18:48:03
    I increased the shared_pool to 100M and then restarted the job with attach=jobname. After restarting, I queried the status and found that everything is undefined. It still says undefined now, and the last log message says that the job has been reopened. That's the end of the log file; nothing else is being recorded. I am not sure what is happening now. Any ideas will be appreciated. This is version 10.2.0.3 on Windows. Thanks ...
    Job SYS_IMPORT_FULL_01 has been reopened at Wednesday, 23 September, 2009 18:54
    Import> status
    Job: SYS_IMPORT_FULL_01
    Operation: IMPORT
    Mode: FULL
    State: IDLING
    Bytes Processed: 3,139,231,552
    Percent Done: 33
    Current Parallelism: 8
    Job Error Count: 0
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest%u.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest01.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest02.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest03.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest04.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest05.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest06.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest07.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest08.dmp
    Worker 1 Status:
    State: UNDEFINED
    Worker 2 Status:
    State: UNDEFINED
    Object Schema: trm
    Object Name: EVENT_DOCUMENT
    Object Type: DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
    Completed Objects: 1
    Completed Rows: 78,026
    Completed Bytes: 4,752,331,264
    Percent Done: 100
    Worker Parallelism: 1
    Worker 3 Status:
    State: UNDEFINED
    Worker 4 Status:
    State: UNDEFINED
    Worker 5 Status:
    State: UNDEFINED
    Worker 6 Status:
    State: UNDEFINED
    Worker 7 Status:
    State: UNDEFINED
    Worker 8 Status:
    State: UNDEFINED

    39065, 00000, "unexpected master process exception in %s"
    // *Cause:  An unhandled exception was detected internally within the master
    //          control process for the Data Pump job.  This is an internal error.
    //          messages will detail the problems.
    // *Action: If problem persists, contact Oracle Customer Support.
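
    If the restarted job stays in IDLING with UNDEFINED workers, one hedged recovery path is to attach again and either resume or abandon it (the job name is taken from the log above):

    impdp system/<password> attach=SYS_IMPORT_FULL_01
    Import> status
    Import> start_job
    (or Import> kill_job to abandon the job and drop its master table)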

  • Data pump error please help

    DECLARE
      ind NUMBER;              -- Loop index
      h1 NUMBER;               -- Data Pump job handle
      percent_done NUMBER;     -- Percentage of job complete
      job_state VARCHAR2(30);  -- To keep track of job state
      le ku$_LogEntry;         -- For WIP and error messages
      js ku$_JobStatus;        -- The job status from get_status
      jd ku$_JobDesc;          -- The job description from get_status
      sts ku$_Status;          -- The status object returned by get_status
    BEGIN
      -- Create a (user-named) Data Pump job to do a schema export.
      h1 := DBMS_DATAPUMP.OPEN('EXPORT','SCHEMA',NULL,'EXAMPLE1','LATEST');
      -- Specify a single dump file for the job (using the handle just returned)
      -- and a directory object, which must already be defined and accessible
      -- to the user running this procedure.
      -- BACKUP DIRECTORY NAME
      DBMS_DATAPUMP.ADD_FILE(h1,'example1.dmp','BACKUP');
      -- A metadata filter is used to specify the schema that will be exported.
      -- ORVETL USER NAME
      DBMS_DATAPUMP.METADATA_FILTER(h1,'SCHEMA_EXPR','IN (''orvetl'')');
      -- Start the job. An exception will be generated if something is not set up
      -- properly.
      DBMS_DATAPUMP.START_JOB(h1);
      -- The export job should now be running. In the following loop, the job
      -- is monitored until it completes. In the meantime, progress information is
      -- displayed.
      percent_done := 0;
      job_state := 'UNDEFINED';
      while (job_state != 'COMPLETED') and (job_state != 'STOPPED') loop
        dbms_datapump.get_status(h1,
          dbms_datapump.ku$_status_job_error +
          dbms_datapump.ku$_status_job_status +
          dbms_datapump.ku$_status_wip, -1, job_state, sts);
        js := sts.job_status;
        -- If the percentage done changed, display the new value.
        if js.percent_done != percent_done
        then
          dbms_output.put_line('*** Job percent done = ' ||
                               to_char(js.percent_done));
          percent_done := js.percent_done;
        end if;
        -- If any work-in-progress (WIP) or error messages were received
        -- for the job, display them.
        if (bitand(sts.mask,dbms_datapump.ku$_status_wip) != 0)
        then
          le := sts.wip;
        else
          if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
          then
            le := sts.error;
          else
            le := null;
          end if;
        end if;
        if le is not null
        then
          ind := le.FIRST;
          while ind is not null loop
            dbms_output.put_line(le(ind).LogText);
            ind := le.NEXT(ind);
          end loop;
        end if;
      end loop;
      -- Indicate that the job finished and detach from it.
      dbms_output.put_line('Job has completed');
      dbms_output.put_line('Final job state = ' || job_state);
      dbms_datapump.detach(h1);
    END;
    The error:
    ERROR at line 1:
    ORA-39001: invalid argument value
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 2926
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3162
    ORA-06512: at line 20

    I assume all the other dimensions are being specified via a load rule header (i.e. the rule is otherwise valid).
    What is your data source? What does the number (data) format look like? Can you identify (and post) specific rows that are causing the error?
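
    (On the actual ORA-39001 above: with this script it usually means the directory object passed to ADD_FILE is missing or not granted to the caller. A hedged check, using the 'BACKUP' name from the script; the path below is a placeholder:)

    SELECT directory_name, directory_path FROM all_directories;
    -- if 'BACKUP' is not listed, create and grant it:
    CREATE DIRECTORY backup AS '/u01/app/oracle/backup';
    GRANT READ, WRITE ON DIRECTORY backup TO orvetl;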

  • Data Pump Error - ORA-04063: package body "SYS.DBMS_INTERNAL_LOGSTDBY" has errors

    Hi All,
    I am getting the following errors when I am trying to connect with Data Pump. My DB is 10g and the OS is Linux.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
    With the Partitioning, OLAP and Data Mining options
    ORA-31626: job does not exist
    ORA-04063: package body "SYS.DBMS_INTERNAL_LOGSTDBY" has errors
    ORA-06508: PL/SQL: could not find program unit being called
    ORA-06512: at "SYS.DBMS_LOGSTDBY", line 24
    ORA-06512: at "SYS.KUPV$FT", line 676
    ORA-04063: package body "SYS.DBMS_INTERNAL_LOGSTDBY" has errors
    ORA-06508: PL/SQL: could not find program unit being called
    When I tried to compile this package, I got the following error:
    SQL> alter package DBMS_INTERNAL_LOGSTDBY compile body;
    Warning: Package Body altered with compilation errors.
    SQL> show error
    Errors for PACKAGE BODY DBMS_INTERNAL_LOGSTDBY:
    LINE/COL ERROR
    1405/4 PL/SQL: SQL Statement ignored
    1412/38 PL/SQL: ORA-00904: "SQLTEXT": invalid identifier
    1486/4 PL/SQL: SQL Statement ignored
    1564/7 PL/SQL: ORA-00904: "DBID": invalid identifier
    1751/2 PL/SQL: SQL Statement ignored
    1870/7 PL/SQL: ORA-00904: "DBID": invalid identifier
    Can anyone suggest/guide me on how to resolve the issue?
    Thanks in advance

    SQL> SELECT OBJECT_TYPE,OBJECT_NAME FROM DBA_OBJECTS
    2 WHERE OWNER='SYS' AND STATUS<>'VALID';
    OBJECT_TYPE   OBJECT_NAME
    ------------  ------------------------
    VIEW          DBA_COMMON_AUDIT_TRAIL
    PACKAGE BODY  DBMS_INTERNAL_LOGSTDBY
    PACKAGE BODY  DBMS_REGISTRY_SYS
    Thanks
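
    (A hedged first step when SYS packages are invalid like this: recompile all invalid objects as SYSDBA and re-check. If the ORA-00904 errors persist, the data dictionary is likely out of step with the installed software, and the upgrade/patch scripts need review.)

    SQL> @?/rdbms/admin/utlrp.sql
    SQL> SELECT object_type, object_name FROM dba_objects
      2  WHERE owner='SYS' AND status<>'VALID';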

  • DATA-PUMP ERROR: ORA-39070 Database on Linux, Client on Win 2008

    Hi,
    I want to make a Data Pump export from a Windows 2008 client. I defined dpdir as 'C:\DPDIR'.
    While running expdp:
    expdp login\pass@ora directory=dpdir dumpfile=dump.dmp logfile=log.log full=y
    I get these errors:
    ORA-39002: invalid operation
    ORA-39070: unable to open the log file
    ORA-29283: invalid file operation
    ORA-06512: at "sys.utl_file", line 536
    ORA-29283: invalid file operation
    (descriptions translated from Polish)
    I found out that Data Pump is saving files to the Linux server (where the database is). When I define 'C:\DPDIR', it is not usable because there is no such directory on Linux.
    How can I save datapump export dumpfile on Windows?

    tstefanski wrote:
    > How can I save the datapump export dumpfile on Windows?
    expdp can only create dump files on the DB server system itself.
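
    (If the file really must land on the Windows client, one hedged approach is to run a database on the Windows host and pull the data over a database link with expdp's NETWORK_LINK parameter. lnx_db and the connect strings below are placeholders:)

    -- on the Windows-hosted database, pointing at the Linux database:
    SQL> CREATE DATABASE LINK lnx_db CONNECT TO login IDENTIFIED BY pass USING 'ora';
    -- then the dump file is written through dpdir on the Windows database:
    expdp login/pass@windb directory=dpdir dumpfile=dump.dmp logfile=log.log full=y network_link=lnx_db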

  • Data Pump Errors

    trying to execute the sample here:
    http://download.oracle.com/docs/cd/B12037_01/server.101/b10825/dp_api.htm#i1006925
    Example 5-1 Performing a Simple Schema Export
    So I created a package and tried to modify it to do a table export instead of a schema export; I messed it up and it didn't work. I did this twice.
    If I query USER_DATAPUMP_JOBS I have 2 entries ('EXAMPLE1' and 'EXAMPLE2'), so when I try to rerun the sample, I get an ORA-31634 job already exists.
    Hmm, it looks like I could then attach to the job to rerun my tests or to remove it, but when I try to attach, I get ORA-31626 job doesn't exist.
    Am I missing something basic here? All other procedures require a handle, so it's either OPEN or ATTACH.

    I better read the doc again. Seems strange to create tables in the schema with the name of the job.
    I deleted the tables in the user, and then when I did a select * from user_datapump_jobs I could see more rows with the jobs from before, plus the names of the recyclebin objects. After purge recyclebin they were gone.
    I'm not using EXPDP, I'm trying the API based approach.
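
    (For reference, a hedged cleanup sketch: the master table carries the job name, so dropping it removes the USER_DATAPUMP_JOBS entry. EXAMPLE1/EXAMPLE2 are the job names from the post:)

    SELECT job_name, state FROM user_datapump_jobs;
    DROP TABLE example1 PURGE;
    DROP TABLE example2 PURGE;
    PURGE RECYCLEBIN;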

  • Data Pump Export error - network mounted path

    Hi,
    Please have a look at the data pump error I am getting while doing an export. I am running version 11g. Please help with your feedback.
    I am getting the error due to a network-mounted path for the directory (ORALOAD); it works fine with a local path. I have given full permissions on the network path, and utl_file is able to create files there, but Data Pump fails with the error messages below.
    Oracle 11g
    Solaris 10
    Getting below error :
    ERROR at line 1:
    ORA-39001: invalid argument value
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3444
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3693
    ORA-06512: at line 64
    DECLARE
    p_part_name VARCHAR2(30);
    p_msg VARCHAR2(512);
    v_ret_period NUMBER;
    v_arch_location VARCHAR2(512);
    v_arch_directory VARCHAR2(20);
    v_rec_count NUMBER;
    v_partition_dumpfile VARCHAR2(35);
    v_partition_dumplog VARCHAR2(35);
    v_part_date VARCHAR2(30);
    p_partition_name VARCHAR2(30);
    v_partition_arch_location VARCHAR2(512);
    h1 NUMBER; -- Data Pump job handle
    job_state VARCHAR2(30); -- To keep track of job state
    le ku$_LogEntry; -- For WIP and error messages
    js ku$_JobStatus; -- The job status from get_status
    jd ku$_JobDesc; -- The job description from get_status
    sts ku$_Status; -- The status object returned by get_status
    ind NUMBER; -- Loop index
    percent_done NUMBER; -- Percentage of job complete
    --check dump file exist on directory
    l_file utl_file.file_type;   
    l_file_name varchar2(20);
    l_exists boolean;
    l_length number;
    l_blksize number;
    BEGIN
    p_part_name:='P2010110800';
    p_partition_name := upper(p_part_name);
    v_partition_dumpfile :=  chr(39)||p_partition_name||chr(39);
    v_partition_dumplog  :=  p_partition_name || '.LOG';
         SELECT COUNT(*) INTO v_rec_count FROM HDB.PARTITION_BACKUP_MASTER WHERE PARTITION_ARCHIVAL_STATUS='Y';
             IF v_rec_count != 0 THEN
               SELECT
               PARTITION_ARCHIVAL_PERIOD
               ,PARTITION_ARCHIVAL_LOCATION
               ,PARTITION_ARCHIVAL_DIRECTORY
               INTO v_ret_period , v_arch_location , v_arch_directory
               FROM HDB.PARTITION_BACKUP_MASTER WHERE PARTITION_ARCHIVAL_STATUS='Y';
             END IF;
         utl_file.fgetattr('ORALOAD', l_file_name, l_exists, l_length, l_blksize);      
            IF (l_exists) THEN        
             utl_file.FRENAME('ORALOAD', l_file_name, 'ORALOAD', p_partition_name ||'_'|| to_char(systimestamp,'YYYYMMDDHH24MISS') ||'.DMP', TRUE);
         END IF;
        v_part_date := replace(p_partition_name,'P');
            DBMS_OUTPUT.PUT_LINE('inside');
    h1 := dbms_datapump.open (operation => 'EXPORT',
                              job_mode  => 'TABLE');
          dbms_datapump.add_file (handle    => h1,
                                      filename  => p_partition_name ||'.DMP',
                                      directory => v_arch_directory,
                                      filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
              dbms_datapump.add_file (handle    => h1,
                                      filename  => p_partition_name||'.LOG',
                                      directory => v_arch_directory,
                                      filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
              dbms_datapump.metadata_filter (handle => h1,
                                             name   => 'SCHEMA_EXPR',
                                             value  => 'IN (''HDB'')');
              dbms_datapump.metadata_filter (handle => h1,
                                             name   => 'NAME_EXPR',
                                             value  => 'IN (''SUBSCRIBER_EVENT'')');
              dbms_datapump.data_filter (handle      => h1,
                                         name        => 'PARTITION_LIST',
                                        value       => v_partition_dumpfile,
                                        table_name  => 'SUBSCRIBER_EVENT',
                                        schema_name => 'HDB');
              dbms_datapump.set_parameter(handle => h1, name => 'COMPRESSION', value => 'ALL');
              dbms_datapump.start_job (handle => h1);
                  dbms_datapump.detach (handle => h1);              
    END;
    /

    Hi,
    I tried to generate the dump with expdp instead of the API and got more specific error logs.
    Note that the log file did get created on the same path.
    expdp hdb/hdb DUMPFILE=P2010110800.dmp DIRECTORY=ORALOAD TABLES=(SUBSCRIBER_EVENT:P2010110800) logfile=P2010110800.log
    Export: Release 11.2.0.1.0 - Production on Wed Nov 10 01:26:13 2010
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning, Automatic Storage Management, OLAP, Data Mining
    and Real Application Testing options
    ORA-39001: invalid argument value
    ORA-39000: bad dump file specification
    ORA-31641: unable to create dump file "/nfs_path/lims/backup/hdb/datapump/P2010110800.dmp"
    ORA-27054: NFS file system where the file is created or resides is not mounted with correct options
    Additional information: 3
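
    ORA-27054 indicates the NFS mount options do not meet Oracle's requirements. A hedged Solaris example of the kind of mount typically needed (exact options are platform-specific; MOS note 359515.1 lists them, and the server and paths below are placeholders):

    mount -F nfs -o rw,bg,hard,nointr,rsize=32768,wsize=32768,proto=tcp,vers=3,suid \
        nfs_server:/export/backup /nfs_path/lims/backup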

  • ORA-39097: Data Pump job encountered unexpected error -12801

    Hello! I am running an Oracle RAC 11.2.0.3.0 database on the IBM AIX 7.1 OS platform.
    We normally do data pump expdp backups. We created an OS-authenticated user and have had non-DBA users use it (instead of / as sysdba, which is only used by DBAs) to run expdp. This OS-authenticated user had been working fine until it started giving the error below:
    Export: Release 11.2.0.3.0 - Production on Fri Apr 5 23:08:22 2013
    Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, Real Application Clusters, Automatic Storage Management, Oracle Label Security,
    OLAP, Data Mining, Oracle Database Vault and Real Application Testing options
    FLASHBACK automatically enabled to preserve database integrity.
    Starting "OPS$COPBKPMG"."SYS_EXPORT_SCHEMA_16": /******** DIRECTORY=COPKBFUB_DIR dumpfile=COPKBFUB_Patch35_PreEOD_2013-04-05-23-08_%U.dmp logfile=COPKBFUB_Patch35_PreEOD_2013-04-05-23-08.log cluster=n parallel=4 schemas=BANKFUSION,CBS,UBINTERFACE,WASADMIN,CBSAUDIT,ACCTOPIC,BFBANKFUSION,PARTY,BFPARTY,WSREGISTRY,COPK
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 130.5 GB
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SYNONYM/SYNONYM
    Processing object type SCHEMA_EXPORT/DB_LINK
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/PACKAGE/PACKAGE_SPEC
    Processing object type SCHEMA_EXPORT/PACKAGE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/FUNCTION/FUNCTION
    Processing object type SCHEMA_EXPORT/FUNCTION/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/PACKAGE/COMPILE_PACKAGE/PACKAGE_SPEC/ALTER_PACKAGE_SPEC
    Processing object type SCHEMA_EXPORT/FUNCTION/ALTER_FUNCTION
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/FUNCTIONAL_INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_INDEX/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/VIEW/VIEW
    Processing object type SCHEMA_EXPORT/VIEW/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/VIEW/GRANT/CROSS_SCHEMA/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/VIEW/COMMENT
    Processing object type SCHEMA_EXPORT/PACKAGE/PACKAGE_BODY
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/MATERIALIZED_VIEW
    Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
    . . exported "WASADMIN"."BATCHGATEWAYLOGDETAIL" 2.244 GB 9379850 rows
    . . exported "WASADMIN"."UBTB_TRANSACTION" 13.71 GB 46299982 rows
    . . exported "WASADMIN"."INTERESTHISTORY" 2.094 GB 13479801 rows
    . . exported "WASADMIN"."MOVEMENTSHISTORY" 1.627 GB 13003451 rows
    . . exported "WASADMIN"."ACCRUALSREPORT" 1.455 GB 18765315 rows
    ORA-39097: Data Pump job encountered unexpected error -12801
    ORA-39065: unexpected master process exception in MAIN
    ORA-12801: error signaled in parallel query server PZ99, instance copubdb02dc:COPKBFUB2 (2)
    ORA-01460: unimplemented or unreasonable conversion requested
    Job "OPS$COPBKPMG"."SYS_EXPORT_SCHEMA_16" stopped due to fatal error at 23:13:37
    Please assist.

    Have you seen this?
    *Bug 13099577 - ORA-1460 with parallel query [ID 13099577.8]*
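
    (While unpatched, a hedged workaround consistent with a parallel-query bug is to rerun the export serially, e.g. with parallel=1, and verify against the note above. The file names below are placeholders and the schema list is shortened:)

    expdp system/<password> schemas=WASADMIN directory=COPKBFUB_DIR \
        dumpfile=retry_%U.dmp logfile=retry.log cluster=n parallel=1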

  • Data pump import error with nested tables

    So the problem is somewhat long :)
    Actually there are two problems: why and how Oracle treats the OO concept, and why data pump doesn't work.
    So scenario for the 1st one:
    1) there is object type h1 and table of h1
    2) there is unrelated object type row_text and table of row_text
    3) there is object type h2 under h1 with attribute as table of row_text
    4) there is table tab1 with column b with data type as table of h1. Of course column b is stored as nested table.
    So what do you think - how many nested tables will Oracle create? The correct answer is 2: one explicitly defined, and one hidden with a system
    generated name for the type h2 which is under type h1. So the question is WHY? Why, if I create an instance of the supertype, does Oracle try to adapt
    it for the subtype as well? Even more: if I create another subtype h3 under h1, another hidden nested table appears.
    This was the first part.
    The second part is: if I do a schema export and try to import it into another schema, I get an error saying that Oracle failed to create the storage table for
    nested table column b. So the second question is: if Oracle has created such a mess with hidden nested tables, how does one import/export to another
    schema?
    Ok and here is test case to demonstrate problems above:
    -- creating type h1 and table of it
    SQL> create or replace type h1 as object (a number)
      2  not final;
      3  /
    Type created.
    SQL> create or replace type tbl_h1 as table of h1;
      2  /
    Type created.
    -- creating type row_text and table of it
    SQL> create or replace type row_text as object (
      2    txt varchar2(100))
      3  not final;
      4  /
    Type created.
    SQL> create or replace type tbl_row_text as table of row_text;
      2  /
    Type created.
    -- creating type h2 as subtype of h1
    SQL> create or replace type h2 under h1 (some_texts tbl_row_text);
      2  /
    Type created.
    SQL> create table tab1 (a number, b tbl_h1)
      2  nested table b
      3  store as tab1_nested;
    Table created.
    -- so we have 2 nested tables now
    SQL> select table_name, parent_table_name, parent_table_column
      2  from user_nested_tables;
    TABLE_NAME                     PARENT_TABLE_NAME  PARENT_TABLE_COLUMN
    SYSNTfsl/+pzu3+jgQAB/AQB27g==  TAB1_NESTED        TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"
    TAB1_NESTED                    TAB1               B
    -- another subtype of t1
    SQL> create or replace type h3 under h1 (some_texts tbl_row_text);
      2  /
    Type created.
    -- plus another nested table
    SQL> select table_name, parent_table_name, parent_table_column
      2  from user_nested_tables;
    TABLE_NAME                     PARENT_TABLE_NAME  PARENT_TABLE_COLUMN
    SYSNTfsl/+pzu3+jgQAB/AQB27g==  TAB1_NESTED        TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"
    SYSNTfsl/+pz03+jgQAB/AQB27g==  TAB1_NESTED        TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H3")."SOME_TEXTS"
    TAB1_NESTED                    TAB1               B
    SQL> desc "SYSNTfsl/+pzu3+jgQAB/AQB27g=="
    Name                                      Null?    Type
    TXT                                                VARCHAR2(100)

    OK, let it be; now I'm trying to export and import into another schema:
    [oracle@xxx]$ expdp gints/xxx@xxx directory=xxx dumpfile=gints.dmp logfile=gints.log
    Export: Release 11.2.0.1.0 - Production on Thu Feb 4 22:32:48 2010
    <irrelevant rows skipped>
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "GINTS"."TAB1"                                  0 KB       0 rows
    . . exported "GINTS"."SYSNTfsl/+pz03+jgQAB/AQB27g=="         0 KB       0 rows
    . . exported "GINTS"."TAB1_NESTED"                           0 KB       0 rows
    . . exported "GINTS"."SYSNTfsl/+pzu3+jgQAB/AQB27g=="         0 KB       0 rows
    Master table "GINTS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
    ******************************************************************************

    And now the import. In order to create the types, a transformation of OIDs is applied, along with remap_schema.
    It fails to create the table, though.
    [oracle@xxx]$ impdp gints1/xxx@xxx directory=xxx dumpfile=gints.dmp logfile=gints_imp.log remap_schema=gints:gints1 transform=OID:n
    Import: Release 11.2.0.1.0 - Production on Thu Feb 4 22:41:48 2010
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Release 11.2.0.1.0 - Production
    Master table "GINTS1"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "GINTS1"."SYS_IMPORT_FULL_01":  gints1/********@xxx directory=xxx dumpfile=gints.dmp logfile=gints_imp.log remap_schema=gints:gints1 transform=OID:n
    Processing object type SCHEMA_EXPORT/USER
    ORA-31684: Object type USER:"GINTS1" already exists
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/TYPE/TYPE_SPEC
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    ORA-39083: Object type TABLE:"GINTS1"."TAB1" failed to create with error:
    ORA-02320: failure in creating storage table for nested table column B
    ORA-00904: : invalid identifier
    Failing sql is:
    CREATE TABLE "GINTS1"."TAB1" ("A" NUMBER, "B" "GINTS1"."TBL_H1" ) SEGMENT CREATION IMMEDIATE PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39083: Object type INDEX_STATISTICS failed to create with error:
    ORA-01403: no data found
    ORA-01403: no data found
    Failing sql is:
    DECLARE I_N VARCHAR2(60);   I_O VARCHAR2(60);   c DBMS_METADATA.T_VAR_COLL;   df varchar2(21) := 'YYYY-MM-DD:HH24:MI:SS'; BEGIN  DELETE FROM "SYS"."IMPDP_STATS";   c(1) :=   DBMS_METADATA.GET_STAT_COLNAME('GINTS1','TAB1_NESTED',NULL,'TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"',1);  DBMS_METADATA.GET_STAT_INDNAME('GINTS1','TAB1_NESTED',c,1,i_o,i_n);   INSERT INTO "
    ORA-39083: Object type INDEX_STATISTICS failed to create with error:
    ORA-01403: no data found
    ORA-01403: no data found
    Failing sql is:
    DECLARE I_N VARCHAR2(60);   I_O VARCHAR2(60);   c DBMS_METADATA.T_VAR_COLL;   df varchar2(21) := 'YYYY-MM-DD:HH24:MI:SS'; BEGIN  DELETE FROM "SYS"."IMPDP_STATS";   c(1) :=   DBMS_METADATA.GET_STAT_COLNAME('GINTS1','TAB1_NESTED',NULL,'TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H3")."SOME_TEXTS"',1);  DBMS_METADATA.GET_STAT_INDNAME('GINTS1','TAB1_NESTED',c,1,i_o,i_n);   INSERT INTO "
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "GINTS1"."SYS_IMPORT_FULL_01" completed with 4 error(s) at 22:41:52So any idea how to make export/import of such tables?
    TIA
    Gints

    Tom Kyte has said it repeatedly ... I will repeat it here for you.
    The fact that Oracle allows you to build object tables is not an indication that you should.
    Store your data relationally and build object_views on top of them.
    http://www.morganslibrary.org/reference/object_views.html
    If you model properly, and store properly, you don't have any issues.
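
    For illustration, a minimal sketch of that approach with the row_text type from the test case (the relational table and view names are made up):

    -- store the data relationally
    CREATE TABLE row_text_rel (
      id  NUMBER PRIMARY KEY,
      txt VARCHAR2(100)
    );
    -- expose it as objects through an object view
    CREATE OR REPLACE VIEW row_text_ov OF row_text
      WITH OBJECT IDENTIFIER (txt) AS
      SELECT r.txt FROM row_text_rel r;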

  • Error connecting as DBA using Data Pump Within Transportable Modules

    Hi all
    I am using OWB 10g R2 and trying to set up a transportable module to test the Oracle Data Pump utility. I have followed the user manual in terms of the relevant grants and permissions needed to use this functionality, and I have successfully connected to both my source and target databases and created my transportable module, which will extract six tables from a source database on 10.2.0.1.0 to my target schema in my warehouse, also on 10.2.0.1.0. When I come to deploy/execute the transportable module, it fails with the following error.
    RPE-01023: Failed to establish connection to target database as DBA
    We have even gone as far as granting the DBA role to the user within our target, but we still get the same error, so we assume it is something to do with the connection of the Transportable Target Module Location and that it needs to connect as DBA somehow in the connect string. Has anyone experienced this issue, and is there a way of creating the location connection that is not documented?
    There is no mention of this anywhere within the manual, and I have even followed the example from http://www.rittman.net/archives/2006_04.html; my target user has the privileges detailed in the manual, as listed below:
    User must not be SYS. Must have ALTER TABLESPACE privilege and the IMP_FULL_DATABASE role. Must have CREATE MATERIALIZED VIEW privilege with ADMIN
    option. Must be a Warehouse Builder repository database user.
    Any help would be appreciated before I raise a request with Oracle.

    Did you ever find a resolution? We are experiencing the same issue...
    thanks
    OBX

  • ORA-12560 error when trying to do a data pump

    I installed Oracle 11g and was attempting to do a data pump import from the dev machine to my computer; however, when I run the impdp command I get an ORA-12560: TNS:protocol adapter error.
    A co-worker thought I might need to add a loopback adapter, as that had been the only thing different from his machine where everything worked fine, so I added the loopback adapter, but it still didn't work. We thought maybe the adapter had to be in place first, so I uninstalled the database and installed it again, but that also didn't work; I still get the same error message.
    Any ideas on what I should try?
    Thanks for any help you can give me.

    user11340791 wrote:
    > Please check if the service for the database is running if you are working on Windows...
    > I am on Windows; how do I check whether the service for the database is running?? Sorry to need so much hand-holding, I'm pretty new at this.
    You can check it from My Computer -> Right Click -> Manage -> Services. Check there whether the service for your database is started; if not, start it.
    > I forgot to mention that I do have my environment variables set: ORACLE_HOME to the path the bin folder is in, and ORACLE_SID set to the database SID.
    You are on Windows, and on it, most of the time, this is taken care of automatically by the OS itself.
    HTH
    Aman....
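
    (As a hedged pointer, the service can also be checked and started from a command prompt; the name is OracleService<SID>, so for SID ORCL:)

    sc query OracleServiceORCL
    net start OracleServiceORCL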

  • Data pump export with error

    Hi All,
    I am new to the DBA role, and when I run expdp for the full database I get the error below:
    ORA-39139: Data Pump does not support XMLSchema objects. TABLE_DATA:"APPS"."abc_TAB" will be skipped.
    I am facing an issue with the exp utility, so I have to use expdp.
    Kindly help me to avoid these errors.
    Thanks,

    Thanks Srini for the Doc - A good one.
    We have scheduled a daily full export backup. Now this is having problems.
    -> exp expuser/exppaswd file=/ora/exp.dmp log=/ora/exp.log direct=Y statistics=none full=Y
    error=
    Table window348_TAB will be exported in conventional path.
    . . exporting table window348_TAB 0 rows exported
    Table wtpParameters350_TAB will be exported in conventional path.
    . . exporting table wtpParameters350_TAB 0 rows exported
    . exporting synonyms
    EXP-00008: ORACLE error 1406 encountered
    ORA-01406: fetched column value was truncated
    EXP-00000: Export terminated unsuccessfully
    Oracle suggested dropping/disabling the policy, but one policy is assigned to many tables, and one table can have more than one policy.
    We haven't set any policies; all are default ones.
    1. If I disable/drop the policies, how will it affect the application?
    2. How can I disable or drop them for all the objects in the database?
    3. How can I identify the policies - whether OLS or VPD?
    Please help me get a daily full export backup, or suggest any alternate solution for the same.
    Thanks,
    Asim
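
    A hedged starting point for question 3: DBA_POLICIES lists VPD (fine-grained access control) policies; if Label Security is installed, its policies appear in the DBA_SA_* views.

    SELECT object_owner, object_name, policy_name, function, enable
    FROM dba_policies;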

  • ERROR: Invalid data pump job state; one of: DEFINING or EXECUTING.

    Hi all.
    I'm using Oracle 10g (10.2.0.3). I had some scheduled jobs to export the DB with Data Pump. Some days ago they began to fail with the message:
    ERROR: Invalid data pump job state; one of: DEFINING or EXECUTING.
    Wait a few moments and check the log file (if one was used) for more details.
    EM says that I can't monitor the job because there are no master tables. How can I find the log?
    I also found in the bdump that exceptions had occurred:
    *** 2008-08-22 09:04:54.671
    *** ACTION NAME:(EXPORT_ITATRD_DIARIO) 2008-08-22 09:04:54.671
    *** MODULE NAME:(Data Pump Master) 2008-08-22 09:04:54.671
    *** SERVICE NAME:(SYS$USERS) 2008-08-22 09:04:54.671
    *** SESSION ID:(132.3726) 2008-08-22 09:04:54.671
    kswsdlaqsub: unexpected exception err=604, err2=24010
    kswsdlaqsub: unexpected exception err=604, err2=24010
    * kjdrpkey2hv: called with pkey 114178, options x8
    * kjdrpkey2hv: called with pkey 114176, options x8
    * kjdrpkey2hv: called with pkey 114177, options x8
    Some help would be appreciated.
    Thanks.

    Delete any uncompleted or failed jobs and try again.
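
    A hedged sketch of that cleanup (see MOS note 336014.1 for the full procedure; the master table normally matches the job name, and the owner/table name below is assumed from the trace above):

    SELECT owner_name, job_name, state, attached_sessions
      FROM dba_datapump_jobs
     WHERE state NOT IN ('EXECUTING', 'DEFINING');
    DROP TABLE system.export_itatrd_diario PURGE;  -- owner and name assumed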

  • DATA PUMP export error

    Hi all
    During a data pump export of a big table (the dump is 289 GB, FILESIZE=10G), I got this error:
    . . exported "TEST3"."INC_T_ZMLUVY_PARTNERI" 6.015 GB 61182910 rows
    . . exported "TEST3"."PAR_T_VZTAHY" 5.798 GB 73121325 rows
    ORA-31693: Table data object "TEST3"."INC_T_POISTENIE_PARAMETRE_H" failed to load/unload and is being skipped due to error:
    ORA-31617: unable to open dump file "/u01/app/oracle/backup/STARINS/exp_polska03.dmp" for write
    ORA-19505: failed to identify file "/u01/app/oracle/backup/STARINS/exp_polska03.dmp"
    ORA-27037: unable to obtain file status
    Linux-x86_64 Error: 2: No such file or directory
    Additional information: 3
    . . exported "TEST3"."PAY_T_UCET_POLOZKA" 5.344 GB 97337823 rows
    and export continued until :
    Job "SYSTEM"."SYS_EXPORT_SCHEMA_02" completed with 1 error
    (it take 8hours)
    can You help me if dmp is now ok ? If impdp will conntiue after reading exp_polska03.dmp , (dump has 28 file exp_polska1 - exp_polska28)?Maximaly I 'm exporting this one table again.. is this solution ok ?
    thak Brano

    What are the expdp parameters used?
    What's the total export dump size? (Try estimate_only=y.)
    ORA-31617: unable to open dump file "/u01/app/oracle/backup/STARINS/exp_polska03.dmp" for write
    ORA-19505: failed to identify file "/u01/app/oracle/backup/STARINS/exp_polska03.dmp"
    http://systemr.blogspot.com/2007/09/section-oracle-database-utilities-title.html
    Check the above one.
    I guess the file was removed from the location, or there is not enough permission to write.
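
    (If only that one table needs re-exporting, a hedged sketch - the table name comes from the error above; the rest are placeholders:)

    expdp system/<password> tables=TEST3.INC_T_POISTENIE_PARAMETRE_H \
        directory=<dir> dumpfile=inc_t_poistenie_%U.dmp filesize=10G \
        logfile=inc_t_poistenie.log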

  • ORA-39097: Data Pump job encountered unexpected error -39076

    Hi Everyone,
    Today I tried to take a table-level export dump from my test database, version 10.2.0.4 on Solaris 10 (64-bit), and I got the following error message:
    Job "SYSTEM"."SYS_EXPORT_TABLE_23" successfully completed at 09:51:36
    ORA-39097: Data Pump job encountered unexpected error -39076
    ORA-39065: unexpected master process exception in KUPV$FT_INT.DELETE_JOB
    ORA-39076: cannot delete job SYS_EXPORT_TABLE_23 for user SYSTEM
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 934
    ORA-31632: master table "SYSTEM.SYS_EXPORT_TABLE_23" not found, invalid, or inaccessible
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 1079
    ORA-20000: Unable to send e-mail message from pl/sql because of:
    ORA-29260: network error: Connect failed because target host or object does not exist
    ORA-39097: Data Pump job encountered unexpected error -39076
    ORA-39065: unexpected master process exception in MAIN
    ORA-39076: cannot delete job SYS_EXPORT_TABLE_23 for user SYSTEM
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 934
    ORA-31632: master table "SYSTEM.SYS_EXPORT_TABLE_23" not found, invalid, or inaccessible
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 1079
    ORA-20000: Unable to send e-mail message from pl/sql because of:
    ORA-29260: network error: Connect failed because target host or object does not exist
    I hope the export dumpfile is a valid one, but I don't know why I am getting this error message. Has anyone faced this kind of problem? Please advise me.
    Thanks
    Shan

    Once you see this:
    Job "SYSTEM"."SYS_EXPORT_TABLE_23" successfully completed at 09:51:36The Data Pump job is done with the dumpfile. There is some clean up that is needed and it looks like something in the cleanup failed. Not sure what it was, but you dumpfile should be fine. One easy way to test it is to run impdp with sqlfile. This will do everything import will do, but instead of creating objects, it writes the ddl to the sql file.
    impdp user/password sqlfile=my_test.sql directory=your_dir dupmfile=your_dump.dmp ...
    If that works, then your dumpfile should be fine. The last thing the export does is write the Data Pump master table to the dumpfile. The first thing that import does is read that table in. So, if you can read it in (which impdp sqlfile does) your dump is good.
    Dean
