Data pump problem

I am facing a problem when running expdp. The command fails with the following errors:
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - 64bit Production
With the Partitioning and Data Mining options
ORA-31626: job does not exist
ORA-31637: cannot create job SYS_EXPORT_SCHEMA_01 for user MAXIMO
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 601
ORA-39080: failed to create queues "KUPC$C_1_20060412104127" and "KUPC$S_1_20060412104127" for Data Pump job
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPC$QUE_INT", line 1550
ORA-06521: PL/SQL: Error mapping function
Please help

Don't know why you think that...
Try this Database forum:
http://forums.oracle.com/forums/category.jspa?categoryID=18
:-)

Similar Messages

  • Data pump problem on sync data from windows to linux

    Hi,
    I have a production DB on a Windows 2003 server, and I sync data from the production Windows machine to a Linux server using Data Pump exports and imports.
    I have written a procedure for the schema-level export, and a procedure for the import along these lines:
    h := dbms_datapump.open(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => 'imp');
    dbms_datapump.add_file(handle => h, filename => 'imp.dmp', directory => 'DIR_DATAPUMP', filetype => dbms_datapump.ku$_file_type_dump_file);
    dbms_datapump.metadata_filter(handle => h, name => 'SCHEMA_EXPR', value => 'IN (''HR'',''OE'')', object_type => 'TABLE');
    dbms_datapump.set_parameter(handle => h, name => 'TABLE_EXISTS_ACTION', value => 'REPLACE');
    When I run the above procedure, it raises ORA-31684 saying the table already exists,
    but from the command line
    impdp include=table table_exists_action=replace
    works fine.
    Can you suggest what I am missing?
    Regards
    Ajay
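    For reference, a minimal, self-contained sketch of a DBMS_DATAPUMP import that sets TABLE_EXISTS_ACTION; the directory object DIR_DATAPUMP, the dump file name, and the job name are assumptions carried over from the post, not verified values:
    DECLARE
      h     NUMBER;
      state VARCHAR2(30);
    BEGIN
      -- open a schema-mode import job
      h := dbms_datapump.open(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => 'IMP_JOB');
      -- point the job at the dump file in an existing directory object
      dbms_datapump.add_file(handle => h, filename => 'imp.dmp',
                             directory => 'DIR_DATAPUMP',
                             filetype  => dbms_datapump.ku$_file_type_dump_file);
      -- restrict the job to the HR and OE schemas
      dbms_datapump.metadata_filter(handle => h, name => 'SCHEMA_EXPR', value => 'IN (''HR'',''OE'')');
      -- replace pre-existing tables instead of raising "table exists" errors
      dbms_datapump.set_parameter(handle => h, name => 'TABLE_EXISTS_ACTION', value => 'REPLACE');
      dbms_datapump.start_job(handle => h);
      dbms_datapump.wait_for_job(handle => h, job_state => state);
      dbms_output.put_line('job state: ' || state);
    END;
    /
    Note that TABLE_EXISTS_ACTION only governs tables; other pre-existing object types will still log "already exists" messages.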


  • Interface Problems: DBA = Data Pump = Export Jobs (Job Name)

    Hello Folks,
    I need your help in troubleshooting an SQL Developer interface problem.
    DBA => Data Pump => Export Jobs (Job Name) => Data Pump Export => Job Scheduler (Step):
    -a- Job Name and Job Description fields are not visible. Well, the fields are there, but each of them is just 1/2 character wide. I can't see/enter anything in the fields.
    Import Wizard:
    -b- Job Name field under the first "Type" wizard's step looks exactly the same as in Export case.
    -c- Can't see any row under "Chose Input Files" section (I see just ~1 mm of the first row and everything else is hidden).
    My env:
    -- Version 3.2.20.09, Build MAIN-09.87
    -- Windows 7 (64 bit)
    It could be related to the fact that I changed fonts in the Preferences. As I don't know what the default font is, I can't change it back to the default and test (let me know what the default is and I will test it).
    PS
    -- I have tried disabling all extensions except DBA Navigator (11.2.0.09.87). It didn't help.
    -- There are no messages in the console if I run SQL Dev from cmd: sqldeveloper\bin\sqldeveloper.exe
    Any help is appreciated,
    Yury

    Hi Yury,
    a - I see those 1/2-character-wide text boxes (in my case on Frequency) when the pop-up dialog is too small - do they go away when you make it bigger?
    b - On import, the Job Name starts with IMPORT - if it is the half-character issue, have you tried making the dialog bigger?
    c - I think it is size again, but my dialog at minimum size is already big enough.
    Have you tried a smaller font, or making the dialogs bigger (resizing from the corners)?
    I have a 3.2.1 version where I have not changed the fonts; Tools -> Preferences -> Code Editor -> Fonts appears to be:
    Font Name: DialogInput
    Font size: 12
    Turloch
    -SQLDeveloper Team

  • Data pump using network_link problem

    Hi,
    I defined a database link for the sys user as:
    create database link xe_remote
    connect to hr
    identified by hr
    using 'xe_remote'
    and it works fine:
    select * from dual@xe_remote
    D
    X
    Then, tried to import with data pump the user hr from xe_remote in the local system, but it seemed that the database link wasn't valid anymore:
    impdp system/pwd schemas=hr network_link=xe_remote remap_schema=hr:hr_remote table_exists_action=replace
    ORA-39001: invalid argument value
    ORA-39200: Link name "xe_remote" is invalid
    ORA-02019: connection description for remote database not found
    Thanks,
    Gianluca

    Hello
    oerr says:
    oerr ora 39149
    39149, 00000, "cannot link privileged user to non-privileged user"
    // *Cause:  A Data Pump job initiated by a user with
    // EXPORT_FULL_DATABASE/IMPORT_FULL_DATABASE roles specified a
    // network link that did not correspond to a user with
    // equivalent roles on the remote database.
    // *Action: Specify a network link that maps users to identically privileged
    // users in the remote database.
    Solution:
    Create a db link from hr_remote to hr
    or
    Create a db link from system to system (I think, it's easier, because you have all rights...)
    Greetings
    Sven
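    Following the second suggestion, a minimal sketch of the SYSTEM-to-SYSTEM variant; the link name, password, and TNS alias are placeholders. Note also that the link in the original post was created in the SYS schema, and a private database link is visible only to its owner, which is consistent with SYSTEM hitting ORA-02019:
    -- as SYSTEM on the local database: a link that connects as SYSTEM remotely
    CREATE DATABASE LINK xe_remote
      CONNECT TO system IDENTIFIED BY remote_pwd
      USING 'xe_remote';
    -- sanity-check the link before running impdp
    SELECT * FROM dual@xe_remote;
    Then the original command should find a link that maps an equally privileged user on the remote side:
    impdp system/pwd schemas=hr network_link=xe_remote remap_schema=hr:hr_remote table_exists_action=replace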

  • Data pump import problem

    I have exported the HR schema from one machine using the following code, and it exports successfully, but when I try to import the exported file on another machine it gives an error. One thing more: how do I use the package to import the file, given that we exported with the Data Pump package? Please help.
    C:\Documents and Settings\remote>impdp hr/hr DIRECTORY=DATA_PUMP_DIR DUMPFILE=YAHOO.DMP full=y
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-31626: job does not exist
    ORA-31637: cannot create job SYS_IMPORT_FULL_01 for user HR
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 663
    ORA-39080: failed to create queues "KUPC$C_1_20090320121353" and "KUPC$S_1_20090
    320121353" for Data Pump job
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPC$QUE_INT", line 1665
    ORA-01658: unable to create INITIAL extent for segment in tablespace SYSTEM
    DECLARE
      handle NUMBER;
    BEGIN
      handle := dbms_datapump.open(
        operation => 'EXPORT', job_mode => 'SCHEMA');
      dbms_datapump.add_file(
        handle    => handle,
        filename  => 'YAHOO.dmp',
        directory => 'DATA_PUMP_DIR',
        filetype  => 1);
      dbms_datapump.metadata_filter(
        handle => handle,
        name   => 'SCHEMA_EXPR',
        value  => 'IN(''HR'')');
      dbms_datapump.start_job(handle => handle);
      dbms_datapump.detach(handle => handle);
    END;

    <Moderator edit - deleted contents of MOS Doc 752374.1 - please do not post such contents - it is a violation of your Support agreement - locking this thread>
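    Note the last error in the stack above: ORA-01658 means the Data Pump control structures could not allocate an initial extent in the SYSTEM tablespace, i.e. SYSTEM is effectively full. A sketch of the usual check and fix; the datafile path and sizes are illustrative only:
    -- how much free space is left in SYSTEM?
    SELECT tablespace_name, ROUND(SUM(bytes)/1024/1024) AS free_mb
    FROM   dba_free_space
    WHERE  tablespace_name = 'SYSTEM'
    GROUP  BY tablespace_name;
    -- let the existing datafile grow (path is a placeholder)
    ALTER DATABASE DATAFILE 'C:\ORACLE\ORADATA\ORCL\SYSTEM01.DBF'
      AUTOEXTEND ON NEXT 50M MAXSIZE 2G;
    It is also worth giving HR a default tablespace other than SYSTEM, since the job's master table is created in the default tablespace of the user running the job.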

  • Data Pump .xlsx into a SQL Server Table and the whole 32-Bit, 64-Bit discussion

    First of all...I have a headache!
    I found LOTS of Google hits when trying to pump a .xlsx file into a SQL Server table, and the whole discussion of the Microsoft ACE 64-bit driver versus the Microsoft Jet 32-bit driver.
    Specifically receiving this error...
    An OLE DB record is available.  Source: "Microsoft Office Access Database Engine"  Hresult: 0x80004005  Description: "External table is not in the expected format.".
    Error: 0xC020801C at Data Flow Task to Load Alere Coaching Enrolled, Excel Source [56]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "Excel Connection Manager"
    failed with error code 0xC0202009.
    Strangely enough, if I simply data pump ONE .xlsx File into a SQL Server Table utilizing my SSIS Package, it seems to work fine. If instead I am trying to be pro-active and allowing for multiple .xlsx Files by using a Foreach Loop Container and a variable
    @[User::FileName], it's erroring out...but not really because it is indeed storing the rows onto the SQL Server Table. I did check all my Delay
    Why does this have to be sooooooo difficult???
    Can anyone help me out here in trying to set up an SSIS Package in a rather restrictive environment to pump a .xlsx File into a SQL Server Table? What in God's name am I doing wrong? Or is all this a misnomer? But if it's working, how do I disable the error
    so that it stops erroring out?

    Hi ITBobbyP,
    According to your description, you get the error message when importing data from a .xlsx file into a SQL Server database.
    The error can be caused by the following:
    The Excel file is locked by another process. Resave the file under a different name to see if that fixes the issue.
    The ACE (Access Database Engine) is not up to date, as Vaibhav mentioned. Please download the latest ACE and install it from this link:
    https://www.microsoft.com/en-us/download/details.aspx?id=13255.
    The Office bitness and the server bitness are not the same. To solve the problem, please refer to the following document:
    http://hrvoje.piasevoli.com/2010/09/01/importing-data-from-64-bit-excel-in-ssis/
    If you have any more questions, please feel free to ask.
    Thanks,
    Wendy Fu
    TechNet Community Support

  • Using  Data Pump when database is read-only

    Hello
    I used flashback to return my database to a past point in time, then I opened the database read-only.
    Then I wanted to use Data Pump (expdp) to export a schema, but I encountered this error:
    ORA-31626: job does not exist
    ORA-31633: unable to create master table "SYS.SYS_EXPORT_SCHEMA_05"
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT", line 863
    ORA-16000: database open for read-only access
    but with the old exp utility I could export that schema.
    My question is: can't I use Data Pump while the database is read-only? Or do you know of any resolution for the issue?
    thanks

    You need to use NETWORK_LINK, so the required tables are created in a read/write database and the data is read from the read-only database over a database link:
    SYSTEM@db_rw> create database link db_r_only
      2   connect to system identified by oracle using 'db_r_only';
    $ expdp system/oracle@db_rw network_link=db_r_only directory=data_pump_dir schemas=scott dumpfile=scott.dmp
    but I tried it with 10.2.0.4 and found an error:
    Export: Release 10.2.0.4.0 - Production on Thursday, 27 November, 2008 9:26:31
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-39006: internal error
    ORA-39065: unexpected master process exception in DISPATCH
    ORA-02054: transaction 1.36.340 in-doubt
    ORA-16000: database open for read-only access
    ORA-02063: preceding line from DB_R_ONLY
    ORA-39097: Data Pump job encountered unexpected error -2054
    I found bug 7331929 in Metalink, which is solved in 11.2! I haven't tested this procedure with prior versions or with 11g, so I don't know whether this bug affects only 10.2.0.4 or all of 10g and 11.1.
    HTH
    Enrique
    PS. If your problem was solved, consider marking the question as answered.

  • While using data pump (impdp) how to rename references within objects?

    Using 10g.
    What I want to accomplish is to change schema and tablespace ownership using the Data Pump method via the command line; I have had success using the command line for expdp/impdp. The problem is that objects referencing the old schemas DO NOT get updated (e.g. a procedure may reference usr1.table1 in its PL/SQL), and this is where I have been unsuccessful. Does anyone know a way to change references from the old schema name to the new schema name inside objects (procedures, views, etc.) via the command line?
    This is what I currently use; it works to change schema and tablespace, but will not change references within my objects:
    expdp system/<pass> schemas=usr1,usr2 DIRECTORY=dp_dir DUMPFILE=dataPump_BothSchemas.dmp LOGFILE=expdpAllSchema.log parallel=2
    impdp system/<pass> DIRECTORY=dp_dir DUMPFILE=dataPump_BothSchemas.dmp LOGFILE=impbothSchToEE.log remap_schema=usr1:newUsr1,usr2:newUsr2 remap_tablespace=old_ts_tables:new_ts_tables full=y
    Thanks!
    P.S. I have accomplished this using Enterprise Manager.

    If you hard-coded such references (e.g. a procedure referencing usr1.table1 in a PL/SQL statement), you have to correct them manually. Consider using synonyms when your stored procedures reference another schema's objects.
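    A minimal sketch of the synonym approach, using the names from the thread (newUsr1 is assumed to own table1 after the remap): the procedure references an unqualified name, so only the synonym has to change when the owning schema does.
    -- in the schema that runs the code, point the bare name at the current owner
    CREATE OR REPLACE SYNONYM table1 FOR newUsr1.table1;
    -- procedures then compile against plain TABLE1, resolved via the synonym
    CREATE OR REPLACE PROCEDURE count_rows AS
      n NUMBER;
    BEGIN
      SELECT COUNT(*) INTO n FROM table1;
      DBMS_OUTPUT.PUT_LINE('rows: ' || n);
    END;
    /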

  • Best Practice for data pump or import process?

    We are trying to copy existing schema to another newly created schema. Used export data pump to successfully export schema.
    However, we encountered some errors when importing dump file to new schema. Remapped schema and tablespaces, etc.
    Most errors occur in PL/SQL... For example, we have views like the one below in the original schema:
    CREATE VIEW oldschema.myview AS
    SELECT col1, col2, col3
    FROM oldschema.mytable
    WHERE col1 = 10
    Quite a few functions, procedures, packages, and triggers contain "oldschema.mytable" in DML (insert, select, update) statements, for example.
    Getting the following errors in import log:
    ORA-39082: Object type ALTER_FUNCTION:"TEST"."MYFUNCTION" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TEST"."MYPROCEDURE" created with compilation warnings
    ORA-39082: Object type VIEW:"TEST"."MYVIEW" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"TEST"."MYPACKAGE" created with compilation warnings
    ORA-39082: Object type TRIGGER:"TEST"."MYTRIGGER" created with compilation warnings
    A lot of actual errors/invalid objects in new schema are due to:
    ORA-00942: table or view does not exist
    My question is:
    1. What can we do to fix those errors?
    2. Is there a better way to do the import with such condition?
    3. Update PL/SQL and recompile in new schema? Or update in original schema first and export?
    Your help will be greatly appreciated!
    Thank you!

    I routinely get many (MANY) errors as follows and they always compile when I recompile using utlrp.
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."RPTSF_WR_LASTOUTPUNCH" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."RPTSF_WR_REFPERIODENDFOREMP" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."RPTSF_WR_TAILOFFSECS" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."FN_GDAPREPORTGATHERER" created with compilation warnings
    Processing object type DATABASE_EXPORT/SCHEMA/PROCEDURE/ALTER_PROCEDURE
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ABSENT_EXCEPTION" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACCRUAL_BAL_PROJ" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACCRUAL_DETAILS" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACCRUAL_SUMMARY" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACTUAL_SCHEDULE" created with compilation warnings
    It works. In all my databases: peoplesoft, kronos, and others...
    I should qualify that it may still be necessary to debug specific problems, but the most common, typical problems are easily resolved by running utlrp.sql. The usual ones I run into are caused by a database link pointing to another database, such as a production database that we firewall our test and development databases from linking to (for obvious reasons).
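    For reference, the two standard ways to recompile after such an import; the schema name TEST comes from the log above, and the ? in the path is SQL*Plus shorthand for ORACLE_HOME:
    -- recompile all invalid objects database-wide (run as SYSDBA)
    @?/rdbms/admin/utlrp.sql
    -- or recompile just the imported schema
    EXEC DBMS_UTILITY.COMPILE_SCHEMA(schema => 'TEST', compile_all => FALSE);
    -- confirm nothing is left invalid
    SELECT object_name, object_type FROM dba_objects
    WHERE owner = 'TEST' AND status = 'INVALID';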

  • Data pump error ORA-39065, status undefined after restart

    Hi members,
    The Data Pump full import job hung, continue_client also hung, and all of a sudden the window exited.
    ;;; Import> status
    ;;; Import> help
    ;;; Import> status
    ;;; Import> continue_client
    ORA-39065: unexpected master process exception in RECEIVE
    ORA-39078: unable to dequeue message for agent MCP from queue "KUPC$C_1_20090923181336"
    Job "SYSTEM"."SYS_IMPORT_FULL_01" stopped due to fatal error at 18:48:03
    I increased the shared_pool to 100M and then restarted the job with attach=jobname. After restarting, I queried the status and found that everything is undefined. It still says undefined now, and the last log message says that the job has been reopened. That's the end of the log file; nothing else is being recorded. I am not sure what is happening now. Any ideas will be appreciated. This is version 10.2.0.3 on Windows. Thanks ...
    Job SYS_IMPORT_FULL_01 has been reopened at Wednesday, 23 September, 2009 18:54
    Import> status
    Job: SYS_IMPORT_FULL_01
    Operation: IMPORT
    Mode: FULL
    State: IDLING
    Bytes Processed: 3,139,231,552
    Percent Done: 33
    Current Parallelism: 8
    Job Error Count: 0
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest%u.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest01.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest02.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest03.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest04.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest05.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest06.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest07.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest08.dmp
    Worker 1 Status:
    State: UNDEFINED
    Worker 2 Status:
    State: UNDEFINED
    Object Schema: trm
    Object Name: EVENT_DOCUMENT
    Object Type: DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
    Completed Objects: 1
    Completed Rows: 78,026
    Completed Bytes: 4,752,331,264
    Percent Done: 100
    Worker Parallelism: 1
    Worker 3 Status:
    State: UNDEFINED
    Worker 4 Status:
    State: UNDEFINED
    Worker 5 Status:
    State: UNDEFINED
    Worker 6 Status:
    State: UNDEFINED
    Worker 7 Status:
    State: UNDEFINED
    Worker 8 Status:
    State: UNDEFINED

    39065, 00000, "unexpected master process exception in %s"
    // *Cause:  An unhandled exception was detected internally within the master
    //          control process for the Data Pump job.  This is an internal error.
    //          messages will detail the problems.
    // *Action: If problem persists, contact Oracle Customer Support.
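    When a restarted job sits in IDLING with undefined workers, it can usually be nudged from the interactive prompt; a minimal sketch, assuming the job name and SYSTEM account from the post:
    impdp system/pwd attach=SYS_IMPORT_FULL_01
    Import> STATUS
    Import> START_JOB
    Import> CONTINUE_CLIENT
    START_JOB resumes the idling job and CONTINUE_CLIENT switches back to logging mode; if the job never recovers, KILL_JOB discards the master table so the import can be rerun from scratch.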

  • Data pump import error with nested tables

    So the problem is somewhat long :)
    Actually there are two problems - why and how Oracle treats this OO concept, and why Data Pump doesn't work.
    So scenario for the 1st one:
    1) there is object type h1 and table of h1
    2) there is unrelated object type row_text and table of row_text
    3) there is object type h2 under h1 with attribute as table of row_text
    4) there is table tab1 with column b with data type as table of h1. Of course column b is stored as nested table.
    So what do you think - how many nested tables will Oracle create? The correct answer is 2: one explicitly defined, and one hidden with a system-generated name for the type h2, which is under type h1. So the question is WHY? Why, if I create storage for the supertype, does Oracle try to adapt it for the subtype as well? Even more: if I create another subtype h3 under h1, another hidden nested table appears.
    This was the first part.
    The second part is: if I do a schema export and try to import it into another schema, I get an error saying that Oracle failed to create the storage table for nested table column b. So the second question is - given that Oracle has created such a mess with hidden nested tables, how does one export/import to another schema?
    Ok and here is test case to demonstrate problems above:
    -- creating type h1 and table of it
    SQL> create or replace type h1 as object (a number)
      2  not final;
      3  /
    Type created.
    SQL> create or replace type tbl_h1 as table of h1;
      2  /
    Type created.
    -- creating type row_text and table of it
    SQL> create or replace type row_text as object (
      2    txt varchar2(100))
      3  not final;
      4  /
    Type created.
    SQL> create or replace type tbl_row_text as table of row_text;
      2  /
    Type created.
    -- creating type h2 as subtype of h1
    SQL> create or replace type h2 under h1 (some_texts tbl_row_text);
      2  /
    Type created.
    SQL> create table tab1 (a number, b tbl_h1)
      2  nested table b
      3  store as tab1_nested;
    Table created.
    -- so we have 2 nested tables now
    SQL> select table_name, parent_table_name, parent_table_column
      2  from user_nested_tables;
    TABLE_NAME                     PARENT_TABLE_NAME
    PARENT_TABLE_COLUMN
    SYSNTfsl/+pzu3+jgQAB/AQB27g==  TAB1_NESTED
    TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"
    TAB1_NESTED                    TAB1
    B
    -- another subtype of t1
    SQL> create or replace type h3 under h1 (some_texts tbl_row_text);
      2  /
    Type created.
    -- plus another nested table
    SQL> select table_name, parent_table_name, parent_table_column
      2  from user_nested_tables;
    TABLE_NAME                     PARENT_TABLE_NAME
    PARENT_TABLE_COLUMN
    SYSNTfsl/+pzu3+jgQAB/AQB27g==  TAB1_NESTED
    TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"
    SYSNTfsl/+pz03+jgQAB/AQB27g==  TAB1_NESTED
    TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H3")."SOME_TEXTS"
    TAB1_NESTED                    TAB1
    B
    SQL> desc "SYSNTfsl/+pzu3+jgQAB/AQB27g=="
    Name                                      Null?    Type
    TXT                                                VARCHAR2(100)
    OK, let it be, and now I'm trying to export and import into another schema:
    [oracle@xxx]$ expdp gints/xxx@xxx directory=xxx dumpfile=gints.dmp logfile=gints.log
    Export: Release 11.2.0.1.0 - Production on Thu Feb 4 22:32:48 2010
    <irrelevant rows skipped>
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "GINTS"."TAB1"                                  0 KB       0 rows
    . . exported "GINTS"."SYSNTfsl/+pz03+jgQAB/AQB27g=="         0 KB       0 rows
    . . exported "GINTS"."TAB1_NESTED"                           0 KB       0 rows
    . . exported "GINTS"."SYSNTfsl/+pzu3+jgQAB/AQB27g=="         0 KB       0 rows
    Master table "GINTS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
    ******************************************************************************
    And now the import. In order to create the types, a transform of the OIDs is applied (transform=OID:n), along with remap_schema.
    However, it fails to create the table.
    [oracle@xxx]$ impdp gints1/xxx@xxx directory=xxx dumpfile=gints.dmp logfile=gints_imp.log remap_schema=gints:gints1 transform=OID:n
    Import: Release 11.2.0.1.0 - Production on Thu Feb 4 22:41:48 2010
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Release 11.2.0.1.0 - Production
    Master table "GINTS1"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "GINTS1"."SYS_IMPORT_FULL_01":  gints1/********@xxx directory=xxx dumpfile=gints.dmp logfile=gints_imp.log remap_schema=gints:gints1 transform=OID:n
    Processing object type SCHEMA_EXPORT/USER
    ORA-31684: Object type USER:"GINTS1" already exists
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/TYPE/TYPE_SPEC
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    ORA-39083: Object type TABLE:"GINTS1"."TAB1" failed to create with error:
    ORA-02320: failure in creating storage table for nested table column B
    ORA-00904: : invalid identifier
    Failing sql is:
    CREATE TABLE "GINTS1"."TAB1" ("A" NUMBER, "B" "GINTS1"."TBL_H1" ) SEGMENT CREATION IMMEDIATE PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39083: Object type INDEX_STATISTICS failed to create with error:
    ORA-01403: no data found
    ORA-01403: no data found
    Failing sql is:
    DECLARE I_N VARCHAR2(60);   I_O VARCHAR2(60);   c DBMS_METADATA.T_VAR_COLL;   df varchar2(21) := 'YYYY-MM-DD:HH24:MI:SS'; BEGIN  DELETE FROM "SYS"."IMPDP_STATS";   c(1) :=   DBMS_METADATA.GET_STAT_COLNAME('GINTS1','TAB1_NESTED',NULL,'TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H2")."SOME_TEXTS"',1);  DBMS_METADATA.GET_STAT_INDNAME('GINTS1','TAB1_NESTED',c,1,i_o,i_n);   INSERT INTO "
    ORA-39083: Object type INDEX_STATISTICS failed to create with error:
    ORA-01403: no data found
    ORA-01403: no data found
    Failing sql is:
    DECLARE I_N VARCHAR2(60);   I_O VARCHAR2(60);   c DBMS_METADATA.T_VAR_COLL;   df varchar2(21) := 'YYYY-MM-DD:HH24:MI:SS'; BEGIN  DELETE FROM "SYS"."IMPDP_STATS";   c(1) :=   DBMS_METADATA.GET_STAT_COLNAME('GINTS1','TAB1_NESTED',NULL,'TREAT(SYS_NC_ROWINFO$ AS "GINTS"."H3")."SOME_TEXTS"',1);  DBMS_METADATA.GET_STAT_INDNAME('GINTS1','TAB1_NESTED',c,1,i_o,i_n);   INSERT INTO "
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "GINTS1"."SYS_IMPORT_FULL_01" completed with 4 error(s) at 22:41:52So any idea how to make export/import of such tables?
    TIA
    Gints

    Tom Kyte has said it repeatedly ... I will repeat it here for you.
    The fact that Oracle allows you to build object tables is not an indication that you should.
    Store your data relationally and build object_views on top of them.
    http://www.morganslibrary.org/reference/object_views.html
    If you model properly, and store properly, you don't have any issues.
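    A minimal sketch of that advice for the types in this thread - relational storage with an object view of h1 on top; the table and view names are illustrative:
    -- plain relational storage
    CREATE TABLE tab1_rel (
      a NUMBER
    );
    -- object view exposing the rows as h1 instances
    CREATE OR REPLACE VIEW tab1_ov OF h1
      WITH OBJECT IDENTIFIER (a) AS
      SELECT a FROM tab1_rel;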

  • Schema export via Oracle data pump with Database Vault enabled question

    Hi,
    I have installed and configured Database Vault on an Oracle 11g-r2-11.2.0.3 to protect a specific schema (SCHEMA_NAME) via a realm. I have followed the following doc:
    http://www.oracle.com/technetwork/database/security/twp-databasevault-dba-bestpractices-199882.pdf
    to ensure that the SYS and SYSTEM users have sufficient rights to complete a scheduled Oracle Data Pump export operation.
    I.e. I have granted the following to SYS and SYSTEM:
    execute dvsys.dbms_macadm.authorize_scheduler_user('sys','SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_scheduler_user('system','SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_datapump_user('sys','SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_datapump_user('system','SCHEMA_NAME');
    I have also created a second realm on the same schema (SCHEMA_NAME) to allow SYS and SYSTEM to maintain indexes for realm-protected tables. This separate realm was created for all of their index types - Index, Index Partition, and Indextype - and SYS and SYSTEM have been authorized as OWNER of this realm.
    However, when I try and complete an Oracle Data Pump export operation on the schema, I get two errors directly after the following line displayed in the export log:
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX:
    ORA-39127: unexpected error from call to export_string :=SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_NOTIFY_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newblock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081
    ORA-39127: unexpected error from call to export_string :=SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_LOADER_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newblock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081
    The export completes, but with these errors.
    Any help, suggestions, pointers, etc actually anything will be very welcome at this stage.
    Thank you

    Hi Srini,
    Thank you very much for your help. Unfortunately, after having followed the instructions in the doc, I am still getting the same errors.
    Nonetheless, thank you for your input.
    I was also wondering if someone could tell me how to move this thread to the Database Security area of the forum, as I feel I may have posted it in the wrong place: it appears to be a Database Vault issue and not an imp/exp problem.

  • Migration using Data Pump confusion :(

    Hi,
    I have task to migrate a database from Windows x64 to Linux x64. From 10.2.0.4 to 11.2.0.4. Please correct my scenario:
    1. Install and configure binaries on new Linux host
    2. Create an empty database on the Linux host
    3. Perform a DP export from the old Windows database.
    4. Import dump to new database on Linux.
    Are the general steps correct? My confusion is about the Data Pump utility. As far as I know, DP doesn't export the SYS/SYSTEM schemas, right? I've performed a test migration with the following parameters:
    expdp ..... full=y
    impdp ..... full=y
    and I get a lot of errors similar to "object already exists" and so on. Am I correct in using full=y in both expdp and impdp? Maybe I should export the full database and then import only specified schemas (application schemas)? What about privileges? Where are they stored in the database? If I import only application schemas, will there be a problem with privileges, grants, etc.?
    Thanks

    Yes, your steps are correct. When you create the new database, the SYS/SYSTEM objects are already populated, so you will see "object already exists" errors.
    Moving Data Using Oracle Data Pump
    HTH
    Srini
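    A sketch of the command pair being discussed; the directory object, file names, and credentials are placeholders. Full mode skips most SYS-owned dictionary content, and "already exists" errors for objects the new database pre-creates are expected noise:
    expdp system/pwd full=y directory=dp_dir dumpfile=full_%U.dmp logfile=exp_full.log parallel=4
    impdp system/pwd full=y directory=dp_dir dumpfile=full_%U.dmp logfile=imp_full.log parallel=4 table_exists_action=skip
    Object grants travel with the objects, and a full export also carries users, roles, and system grants, which is why FULL=Y is the safer choice over importing only the application schemas when privileges matter.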

  • Using Data Pump Storage Parameter option

    I am creating a database replica of our production environment - the DB names don't have to be the same.
    My plan is to use Oracle Data Pump to move the data from the source database to the target database.
    I performed the same scenario in our Windows 2003 environment with no problem.
    Doing the same for Linux, I am getting a tablespace creation error, as you can see below:
    Linux-x86_64 Error: 2: No such file or directory
    Failing sql is:
    CREATE TABLESPACE "INQUIRY" DATAFILE '/oraappl/pca/vprod/vproddata/inquiry01.dbf' SIZE 629145600 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    ORA-39083: Object type TABLESPACE failed to create with error:
    ORA-01119: error in creating database file '/oraappl/pca/vprod/vproddata/medical01.dbf'
    ORA-27040: file create error, unable to create file
    My question is: do we have to create the tablespaces ourselves, or should Data Pump use the default datafile location already being used by the new database?

    Hi Richard,
    I am now working on creating my extra database using the duplicate command, as you suggested.
    I got everything up until I got this error:
    channel ORA_AUX_DISK_1: reading from backup piece /oraappl/pca/backups/weekly/vproddata/rman/VPR ackupset/2013_08_27/o1_mf_nnndf_TAG20130827T083750_91s7dz0r_.bkp
    ORA-19870: error reading backup piece /oraappl/pca/backups/weekly/vproddata/rman/VPROD/backupset 3_08_27/o1_mf_nnndf_TAG20130827T083750_91s7dz0r_.bkp
    ORA-19505: failed to identify file "/oraappl/pca/backups/weekly/vproddata/rman/VPROD/backupset/2 08_27/o1_mf_nnndf_TAG20130827T083750_91s7dz0r_.bkp"
    ORA-27037: unable to obtain file status
    Linux-x86_64 Error: 2: No such file or directory
    It is defaulting to the recovery location of the production database instead of the auxiliary DB.
    My next option was to catalog the backup files, but even that is not working. Any suggestions?
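    Back on the original Data Pump question: the CREATE TABLESPACE DDL stored in the dump carries the source datafile paths, so on a host with a different filesystem layout you either pre-create the tablespaces yourself or rewrite the paths at import time with REMAP_DATAFILE. A sketch with illustrative target paths:
    -- option 1: pre-create the tablespace with a valid local path before importing
    CREATE TABLESPACE inquiry DATAFILE '/u01/oradata/vprod/inquiry01.dbf' SIZE 600M;
    -- option 2: let a full impdp rewrite the path while it creates the tablespace
    impdp system/pwd full=y directory=dp_dir dumpfile=vprod.dmp remap_datafile='/oraappl/pca/vprod/vproddata/inquiry01.dbf':'/u01/oradata/vprod/inquiry01.dbf'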

  • Data pump with flashback_scn or flashback_time

    Dear Gurus,
    The Oracle database version in use is 11gR2. We don't have flashback enabled for the database. However, to run a Data Pump export with consistency, can we turn on FLASHBACK_SCN or FLASHBACK_TIME?
    Best
    rac110g

    How to set the parameter
    The logical conclusion and field-tested best practice approved by Oracle Support is to set the UNDO_RETENTION parameter to at least the estimated time for the Data Pump import. Don't forget to size your UNDO tablespace accordingly, since the retention only works as long as there is enough undo space available.
    Note
    IMPDP uses flashback technology (flashback table) on the source database to achieve consistency, so the UNDO tablespace there is worth a glance as well.
    check this one link for expdp/impdp undo requirements and possible problems
    http://www.usn-it.de/index.php/2010/05/05/oracle-impdp-ora-1555-and-undo_retention/
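    To the original question: FLASHBACK_SCN and FLASHBACK_TIME rely only on undo data, not on Flashback Database being enabled, so they can be used here. A minimal sketch with placeholder credentials and directory:
    expdp system/pwd schemas=hr directory=dp_dir dumpfile=hr.dmp flashback_time=systimestamp
    or, pinning an exact SCN taken from V$DATABASE.CURRENT_SCN:
    expdp system/pwd schemas=hr directory=dp_dir dumpfile=hr.dmp flashback_scn=1234567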
