DATAPUMP PROBLEM

Hi,
I am running datapump:
C:\>impdp system/manager directory=dp_dir dumpfile=CISADM_EXPDP_042011.dmp sqlfile=a.sql
Import: Release 11.2.0.1.0 - Production on Mon May 9 10:53:45 2011
Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
UDI-00018: Data Pump client is incompatible with database version 11.01.00.07.00
How can I resolve this, please?
I do not want to install all types of databases version just to check if my dump is readable or corrupted. :(
Thanks

It's still the same error :(
C:\>impdp system/manager directory=dp_dir dumpfile=CISADM_EXPDP_042011.dmp sqlfile=a.sql version=compatible
Import: Release 11.1.0.7.0 - Production on Monday, 09 May, 2011 11:49:57
Copyright (c) 2003, 2007, Oracle.  All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-39142: incompatible version number 3.1 in dump file "d:\dump\CISADM_EXPDP_042011.dmp"
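For reference: a dump-file internal version of 3.1 indicates the file was written by a newer (11.2) Data Pump export, which an 11.1.0.7 server cannot read. The usual way out is to repeat the export on the source database with a lower VERSION setting. A hedged sketch (schema and directory names are placeholders, not from the thread):

```
REM Hedged sketch: re-export on the 11.2 source so that an 11.1.0.7
REM client/server can read the resulting file.
expdp system/manager directory=dp_dir dumpfile=CISADM_EXPDP_V111.dmp schemas=CISADM version=11.1.0.7
```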

Similar Messages

  • Import Datapump problems on Oracle 11g

    I am trying to import (using Data Pump) a DMP file from Oracle 10g (10.2.0.4).
    The 10g database server has nls_characterset of:
    SYSTEM@testers>
    PARAMETER VALUE
    NLS_CHARACTERSET WE8MSWIN1252
    1 row selected.
    The 11g server has:
    SQL> select * from v$nls_parameters
    2 where parameter = upper('nls_characterset');
    PARAMETER
    VALUE
    NLS_CHARACTERSET
    AL32UTF8
    Here are the errors:
    Import: Release 11.2.0.2.0 - Production on Tue Nov 8 16:40:11 2011
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Connected to: Oracle Database 11g Release 11.2.0.2.0 - 64bit Production
    Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_FULL_01": system/********@Testers11g DIRECTORY=impdp_dir DUMPFILE=expdpT1400.dmp REMAP_SCHEMA=t1400:t1500 LOGFILE=expdpt1500.log
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "T1500"."DPR_OPENING_BALANCE" 172.2 MB 4537640 rows
    . . imported "T1500"."IVC_OFFSITE_OPENING_BALANCE" 199.0 MB 5120520 rows
    . . imported "T1500"."IVC_OPENING_BALANCE" 187.3 MB 5171512 rows
    . . imported "T1500"."FIN_AR_STATEMENT_SAVE" 33.73 MB 93 rows
    . . imported "T1500"."AUDIT_CONTRACT" 17.74 MB 39855 rows
    . . imported "T1500"."S1_CONTRACT_FORMAT_TEMP" 6.078 KB 2 rows
    . . imported "T1500"."FO_OPENING_BALANCE" 12.42 MB 406974 rows
    . . imported "T1500"."S1_TICKET_CHARGE_SAVE" 11.62 MB 24 rows
    . . imported "T1500"."FO_TRANSACTION_SUMMARY" 11.76 MB 406974 rows
    . . imported "T1500"."FIN_GL_ACCOUNT_BALANCE" 10.89 MB 337260 rows
    . . imported "T1500"."FIN_GL_AUDIT_TRAIL_JOURNAL" 25.48 KB 30 rows
    . . imported "T1500"."FIN_GL_AUDIT_TRAIL" 1.024 MB 6648 rows
    . . imported "T1500"."DPR_TRANSACTION_DETAIL" 5.888 MB 74588 rows
    ORA-02374: conversion error loading table "T1500"."AUDIT_RELEASE"
    ORA-12899: value too large for column SHIP_TO_SHORT_NAME (actual: 11, maximum: 10)
    ORA-02372: data for row: SHIP_TO_SHORT_NAME : 0X'C8746F696C65204C656D'
    I also tried to import a DMP (from classic export) from an Oracle 10g server and got this result:
    Connected to: Oracle Database 11g Release 11.2.0.2.0 - 64bit Production
    Export file created by EXPORT:V10.02.01 via conventional path
    import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
    import server uses AL32UTF8 character set (possible charset conversion)
    export client uses US7ASCII character set (possible charset conversion)
    . importing T1400's objects into T1500
    . . importing table "A1_ASYNCH_COMMUNICATION" 0 rows imported
    . . importing table "A1_BROADCAST_LOCK" 1 rows imported
    . . importing table "A1_BROADCAST_MESSAGE" 6 rows imported
    . . importing table "A1_BROADCAST_USER" 0 rows imported
    . . importing table "A1_BUILD_SCRIPT" 437 rows imported
    . . importing table "A1_DBC_APPLIED_SCRIPT" 0 rows imported
    . . importing table "A1_ERROR_HANDLER" 5880 rows imported
    . . importing table "A1_ERROR_HANDLER_COMMON" 2956 rows imported
    . . importing table "A1_ERROR_HANDLER_TITLE"
    IMP-00019: row rejected due to ORACLE error 12899
    IMP-00003: ORACLE error 12899 encountered
    ORA-12899: value too large for column "T1500"."A1_ERROR_HANDLER_TITLE"."TITLE_DESCRIPTION" (actual: 34, maximum: 32)
    Column 1 7
    Column 2 FR
    Column 3 Relâche des Fact. Récurr des CAP
    IMP-00019: row rejected due to ORACLE error 12899
    IMP-00003: ORACLE error 12899 encountered
    The errors always occur when I try to import a column containing French accented characters.
    What am I missing???

    Multiple posts on the same topic are considered rude and bordering on spam - https://forums.oracle.com/forums/thread.jspa?threadID=2308355

  • DataPump Problem. Automaticaly change dump file name.

    Dear All,
    I have created a PL/SQL block which backs up the database using Data Pump.
    It works fine in the scheduler, but it keeps failing because it does not change the
    file name every time. For the following specification it always generates
    the same name: 'EXPORT%U.DMP' = EXPORT01.DMP.
    Please tell me how I can change the file name automatically every time.
    Thanks
    Abid

    Hi,
    I have a 10.2.0.4 DB on Windows and want a full Data Pump export every night with a unique file name, but this does not work for me. The script just does not execute. I have even tried this from the command line.
    par file is
    DUMPFILE='expdp_testdb_full.'||to_char(sysdate,'YYYYMMDDHH24MISS')||'.dmp'
    JOB_NAME=expdp_testdb_full
    LOGFILE="expdp_full_testdb.log"
    DIRECTORY=DATAPUMP_DIR
    FULL=Y
    ESTIMATE=STATISTICS
    CONTENT=ALL
    PARALLEL=20
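    One likely cause (an assumption, not confirmed in the thread): parfile values are taken literally, so a SQL expression like to_char(sysdate,...) inside DUMPFILE is never evaluated. A hedged sketch that builds the timestamp in a Windows batch wrapper instead, and passes the file names on the command line (which overrides the parfile); the %date%/%time% substring offsets are locale-dependent:

    ```
    :: Hypothetical batch wrapper: compute the timestamp before calling expdp.
    :: Adjust the substring offsets to match your locale's date format.
    set ts=%date:~-4%%date:~3,2%%date:~0,2%%time:~0,2%%time:~3,2%
    expdp system/manager parfile=expdp_testdb_full.par dumpfile=expdp_testdb_full.%ts%.dmp logfile=expdp_full_testdb.%ts%.log
    ```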

  • Expdp datapump Problem

    Greetings, Oracle community. I have a problem with a full metadata export of a database; here is the log:
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/PRE_SYSTEM_ACTIONS/PROCACT_SYSTEM
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_SCHED_JOB_EXPORT.create_exp(94477,'10.02.00.04.00',newblock)
    ORA-01882: timezone region not found
    ORA-06512: at "SYS.DBMS_SCHED_MAIN_EXPORT", line 268
    ORA-06512: at "SYS.DBMS_SCHED_JOB_EXPORT", line 14
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 5038
    [the same ORA-39127 / ORA-01882 stack repeats for jobs 94478, 94479, 94480, 104143 and 123017]
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/PROCOBJ
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/POST_SYSTEM_ACTIONS/PROCACT_SYSTEM
    Processing object type DATABASE_EXPORT/SCHEMA/PROCACT_SCHEMA
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_SCHED_JOB_EXPORT.create_exp(94477,'10.02.00.04.00',newblock)
    ORA-01882: timezone region not found
    ORA-06512: at "SYS.DBMS_SCHED_MAIN_EXPORT", line 268
    ORA-06512: at "SYS.DBMS_SCHED_JOB_EXPORT", line 14
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 5038
    [the same ORA-39127 / ORA-01882 stack repeats for jobs 94478, 94479, 94480, 104143 and 123017]
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE
    As you can see, the error sequence is:
    ORA-06512: at "SYS.DBMS_METADATA", line 5038
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_SCHED_JOB_EXPORT.create_exp(104143,'10.02.00.04.00',newblock)
    ORA-01882: timezone region not found
    ORA-06512: at "SYS.DBMS_SCHED_MAIN_EXPORT", line 268
    ORA-06512: at "SYS.DBMS_SCHED_JOB_EXPORT", line 14
    ORA-06512: at line 1
    I hope you can help me. Goodbye, friends!
    PS: The database was upgraded from 10.2.0.2 to 10.2.0.4

    Hi, do you have an account on Metalink?
    https://support.oracle.com/CSP/ui/flash.html#tab=KBHome(page=KBHome&id=()),(page=KBNavigator&id=(bmDocID=5865329&from=BOOKMARK&bmDocTitle=ORA-01882%20IS%20ENCOUNTERED%20AFTER%20APPLYING%20DST%20PATCH%204689959&viewingMode=1143&bmDocDsrc=BUG&bmDocType=BUG))
    https://support.oracle.com/CSP/ui/flash.html#tab=KBHome(page=KBHome&id=()),(page=KBNavigator&id=(bmDocID=556318.1&from=BOOKMARK&bmDocTitle=EXP-8%20ORA-1882%20When%20Running%20Full%20Export&viewingMode=1143&bmDocDsrc=KB&bmDocType=PROBLEM))

  • SAFEST COMMAND FOR IMPDP

    Hi all,
    I have a data pump export named EXPDAT.dmp from my PROD database.
    I just want to validate/check whether the file is readable or corrupted.
    I do not have any test database to import it into.
    I just want to run it against our PROD database to read it, but not to create new objects or rows.
    What is the safest command to browse the file using our PROD DB? I'm looking for the counterpart of "imp show=yes".
    I do not want to use "sqlfile=" because it takes too long to create the SQL file :(
    I just want to browse or read the contents without doing anything to the database.
    Thanks a lot
    Edited by: 843228 on May 8, 2011 8:26 PM

    Handle:      843228
    Status Level:      Newbie
    Registered:      Mar 9, 2011
    Total Posts:      242
    Total Questions:      64 (42 unresolved)
    So many questions and so few answers.
    Why spam so much on the same problem?
    DATAPUMP PROBLEM
    How to tell what type of dump?
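    On the "what type of dump" question, one quick check (a hedged sketch, not an official tool): a classic exp dump begins with a readable "EXPORT:Vxx.xx.xx" banner, while a Data Pump dump has a mostly binary header, so inspecting the first bytes of the file is often enough:

    ```
    # Hedged sketch (Unix): look at the start of the file.
    # A classic exp file shows something like "EXPORT:V10.02.01";
    # a Data Pump file does not.
    head -c 100 EXPDAT.dmp | strings
    ```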

  • Export specif tables

    I need to export specific tables from a schema in a database.
    E.g.: select table_name from dba_tables where owner='ANALYTICS' and table_name like 'W_%'
    This query returns around 1500 tables.
    How can I take an export of these 1500 tables? (Passing these 1500 table names one by one is not an easy task.)
    So I need some tool that returns all these table names comma-separated on one line.

    The INCLUDE parameter has a limit of 4000 bytes. Please see these related threads for a solution:
    exp datapump problem
    Re: expdp - tables
    HTH
    Srini
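    Since every wanted table matches the same name pattern, one hedged alternative (assuming the pattern really is W_%) is to let Data Pump filter by name instead of enumerating 1500 tables; putting the filter in a parfile also sidesteps OS quote-escaping. Directory and file names below are placeholders:

    ```
    # analytics_w.par -- hedged sketch; run with: expdp system/manager parfile=analytics_w.par
    SCHEMAS=ANALYTICS
    INCLUDE=TABLE:"LIKE 'W_%'"
    DIRECTORY=dp_dir
    DUMPFILE=analytics_w.dmp
    LOGFILE=analytics_w.log
    ```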

  • DATAPUMP SIZE PROBLEM ;;; PLS HELP ME

    Hi everybody,
    A question for the Oracle DBA experts: every day I generate one DMP file larger than 14 GB; it's a full Data Pump export file. I would like to know whether compressing the DMP files could cause problems. I read an article on the web saying that compressing a DMP file with zip utilities can corrupt it, so the file could be unusable once I decompress it. Any suggestions? Please help.
    Thx

    Thank you; please read this article in French:
    http://oracle.developpez.com/guide/sauvegarde/generalites/#L4.2.d
    The author says, in red:
    "ne compressez jamais un fichier dump avec un outil de compression système Winzip (Windows) ou GZIP (Linux - Unix), vous risqueriez d'avoir de mauvaises surprises lors de sa décompression."
    In English this means:
    "Never compress a dump file with a system compression tool such as WinZip (Windows) or gzip (Linux/Unix); you risk nasty surprises when decompressing it."
    Is it Wrong?
    Message was edited by:
    HAGGAR

  • Import problem on Vista with datapump

    Hi
    I've got a problem with an import into an Oracle XE database server on Vista. I keep getting Oracle error "ORA-39070: Unable to open the log file" when executing an impdp command, whereas exactly the same command works perfectly on Windows XP.
    Here is what I'm doing: I want to create a new DB on a target laptop that runs under Vista. I have Oracle XE installed on both my source laptop where I have the source DB and on the target laptop running Vista. I export the DB on the source laptop with expdp (no problems), then copy the export dir content over to the target laptop, then create the user and all the necessary stuff on the traget laptop XE DB server (account unlock, grant create, then create dir (where the DB dump is) and grant read write on that dir to the new user), then use an impdp command like this:
    impdp <user>/<password> directory=<newdb> dumpfile=newdb_schema.dmp logfile=newdb_imp.log
    to import the data in the DB on the target laptop.
    This fails miserably, whereas it succeeds without problems if I do exactly the same on a Windows XP desktop machine set up the same way!
    On top of that, I succeeded six months ago in doing the same thing on that Vista laptop, but now it refuses to work.
    What's going on? Are there known problems with Data Pump on Vista, or is it a permissions problem, i.e. the Data Pump process has no rights to write a log file in the directory where the DB dump is? What's the way to get it working?
    Thanks for your help, I'm really stuck with that one.
    Regards
    Balex

    Which privileges does the Vista user have?
    On Vista you must explicitly give the user ADMIN rights...
    HTH

  • Problem with Datapump in 11g

    os windows server 2008
    db 11.1.0.7.0
    I have created a user with the following rights:
    GRANT EXP_FULL_DATABASE
    GRANT CREATE JOB
    GRANT CREATE PROCEDURE
    GRANT CREATE VIEW
    GRANT CREATE TABLE
    GRANT CREATE SESSION
    GRANT READ ON DIRECTORY data_pump_dir
    GRANT WRITE ON DIRECTORY data_pump_dir
    When I run
    declare
      dp_handle1 number;
    begin
      dp_handle1 := DBMS_DATAPUMP.open(
        operation   => 'EXPORT',
        job_mode    => 'FULL',
        remote_link => NULL,
        job_name    => 'my_test_job',
        version     => 'COMPATIBLE');
    end;
    /
    I get (messages translated from German):
    ORA-31626: job does not exist
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 902
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 4758
    When I run the same snippet as SYS, everything is OK.
    Any ideas why it won't work for my user?

    From the documentation:
    If the OPEN fails, call GET_STATUS with a null handle to retrieve additional information about the failure.
    Have you tried this?
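    A hedged sketch of that GET_STATUS call (signatures and constants as per the DBMS_DATAPUMP package documentation; treat this as an illustration, not a tested script):

    ```
    set serveroutput on
    declare
      l_status    ku$_Status;
      l_job_state varchar2(30);
    begin
      -- Null handle: ask Data Pump for the error stack of the failed OPEN.
      dbms_datapump.get_status(
        handle    => null,
        mask      => dbms_datapump.ku$_status_wip + dbms_datapump.ku$_status_job_error,
        timeout   => 0,
        job_state => l_job_state,
        status    => l_status);
      if l_status.error is not null then
        for i in l_status.error.first .. l_status.error.last loop
          dbms_output.put_line(l_status.error(i).LogText);
        end loop;
      end if;
    end;
    /
    ```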

  • Datapump - export job problem

    I just started playing with this new feature of 10g. I created a new export job through Enterprise Manager Database Control. Now when I try to delete it, it gives me an error message, as follows:
    Error
    The specified job, job run or execution is still active. It must finish running, or be stopped before it can be deleted. Filter on status 'Active' to see active executions.
    I stopped this process successfully many times (I don't even remember how many) through Database Control, but when I try again to delete the run, it gives me the same error message.
    I logged on to SQL*Plus and checked that this process is still active, as it has an entry in the DBA_DATAPUMP_JOBS view. I dropped the corresponding table and the entry is gone from the view, but when I checked in Database Control, the job execution is still there with a status of "Stop Pending".
    Can somebody help me with this? I mean, how can I delete that job from Database Control? If you need any other information, I am more than willing to provide it.
    The job is owned by SYSTEM. My platform is Windows XP Professional.
    Any help is greatly appreciated, as I have been trying different things for the last 2 days with no success.
    Regards,

    Hi Bhargava,
    What do you get when you execute this block?
    set serveroutput on
    declare
      myhandle number;
    begin
      myhandle := dbms_datapump.attach('JOB_NAME','JOB_OWNER');
      dbms_output.put_line(myhandle);
      dbms_datapump.detach(myhandle);
    end;
    /
    If this block executes without error and prints out a number, then you can try to stop the job with this block:
    declare
      myhandle number;
    begin
      myhandle := dbms_datapump.attach('JOB_NAME','JOB_OWNER');
      dbms_output.put_line(myhandle);
      dbms_datapump.stop_job(myhandle, 1, 0, 0);
    end;
    /
    Here is an article with more information on the pl/sql API to dbms_datapump:
    http://www.devx.com/dbzone/Article/30355
    Here is the dbms_datapump documentation:
    http://download-east.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_datpmp.htm
    -Natalka
    http://toolkit.rdbms-insight.com

  • [DATAPUMP] impdp problem

    Hi,
    Could you tell me whether any trouble can occur if I run impdp against an existing database when the dump file contains objects and data that already exist? Would existing objects and data just be ignored by impdp by default?
    Regards,
    Przemek Piechota

    Hi,
    In Data Pump, if the parameter CONTENT=DATA_ONLY is specified, the default is APPEND, and the data will be appended to the existing table.
    TABLE_EXISTS_ACTION can have the following values:
    1) SKIP: leaves the table as is and moves on to the next object. This is not a valid option if the CONTENT parameter is set to DATA_ONLY.
    2) APPEND: loads rows from the source and leaves existing rows unchanged. This is the default option if CONTENT=DATA_ONLY is specified.
    3) TRUNCATE: deletes existing rows and then loads rows from the source.
    4) REPLACE: drops the existing table in the database and then creates and loads it from the source. This is not a valid option if the CONTENT parameter is set to DATA_ONLY.
    http://www.ora600.be/node/4101
    regards
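    Those options map onto the command line like this (a hedged illustration; credentials, directory and file names are placeholders):

    ```
    # Keep existing tables untouched and skip them entirely:
    impdp system/manager directory=dp_dir dumpfile=mydb.dmp table_exists_action=skip
    # Load only data, appending to existing tables (the DATA_ONLY default):
    impdp system/manager directory=dp_dir dumpfile=mydb.dmp content=data_only table_exists_action=append
    ```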

  • Any specific disadvantage of using DataPump for upgrade ?

    Hi,
    Here's my config
    SERVER A
    Source O/S : Win 2003
    Source DB : 10.2.0.4 ( 32 bit )
    DB Size : 100 GB
    SERVER B
    Target O/S : Win 2008 R2 sp1
    Target DB : 11.2.0.3 ( 64 bit )
    I have to upgrade 10g Database of server A , by installing 11g on Server B. I have not used database upgrade assistant or RMAN or similar utilities to perform a 10g upgrade to 11g anytime in the past.
    Here is my question ...
    a) I was planning to use Data Pump to perform this upgrade (downtime is not an issue), since I know how to use Data Pump. Do you see any potential problem with this approach? OR
    b) Based on your experience, would you suggest that I avoid option (a) because of potential issues and use other methods that Oracle suggests, like the upgrade assistants?
    I am open to both options; it's just that, since I am not an expert at this point, I was hesitating a bit to go with option (b).
    The DB is also supposed to go from 32-bit to 64-bit. Not sure if this would be a deal breaker.
    Note: the upgrade is supposed to happen on the 2nd server. Can somebody provide high-level steps as a pointer?
    -Learner

    If downtime is not an issue, Data Pump is certainly an option. How big is the database? The steps are documented:
    http://docs.oracle.com/cd/E11882_01/server.112/e23633/expimp.htm#i262220
    HTH
    Srini
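    A hedged high-level outline of the cross-server Data Pump route (all names and paths below are placeholders; check the linked documentation for specifics):

    ```
    # 1. Server B: install 11.2.0.3 and create an empty database;
    #    pre-create tablespaces and a directory object for the dump.
    # 2. Server A: take a full export.
    expdp system/password full=y directory=dp_dir dumpfile=full10g.dmp logfile=exp_full.log
    # 3. Copy full10g.dmp to the directory path on Server B.
    # 4. Server B: run a full import (the dump format is platform
    #    independent, so 32-bit -> 64-bit is not an issue here).
    impdp system/password full=y directory=dp_dir dumpfile=full10g.dmp logfile=imp_full.log
    # 5. Recompile invalid objects (utlrp.sql) and compare object counts.
    ```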

  • Error while taking dump using datapump

    I am getting the following error -
    Export: Release 10.2.0.1.0 - Production on Friday, 15 September, 2006 10:31:41
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Starting "XX"."SYS_EXPORT_SCHEMA_02": XX/********@XXX directory=dpdump dumpfile=XXX150906.dmp logfile=XXX150906.log
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
    ORA-31642: the following SQL statement fails:
    BEGIN "DMSYS"."DBMS_DM_MODEL_EXP".SCHEMA_CALLOUT(:1,0,0,'10.02.00.01.00'); END;
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
    ORA-06512: at "SYS.DBMS_METADATA", line 907
    ORA-06550: line 1, column 7:
    PLS-00201: identifier 'DMSYS.DBMS_DM_MODEL_EXP' must be declared
    ORA-06550: line 1, column 7:
    PL/SQL: Statement ignored
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPW$WORKER", line 6235
    ----- PL/SQL Call Stack -----
    object line object
    handle number name
    2A68E610 14916 package body SYS.KUPW$WORKER
    2A68E610 6300 package body SYS.KUPW$WORKER
    2A68E610 9120 package body SYS.KUPW$WORKER
    2A68E610 1880 package body SYS.KUPW$WORKER
    2A68E610 6861 package body SYS.KUPW$WORKER
    2A68E610 1262 package body SYS.KUPW$WORKER
    255541A8 2 anonymous block
    Job "XX"."SYS_EXPORT_SCHEMA_02" stopped due to fatal error at 10:33:12
    The action required is to contact customer support. On Metalink I found a note stating that it is a bug in 10g Release 1 that was supposed to be fixed in 10g Release 1 version 4.
    Some of the default schemas were purposely dropped from the database. The only default schemas available now are:
    DBSNMP, DIP, OUTLN, PUBLIC, SCOTT, SYS, SYSMAN, SYSTEM, TSMSYS.
    DIP, OUTLN and TSMSYS were created again.
    Could this be the cause of the problem?
    Thanks in advance.

    Hi,
    Below is the DDL taken from a different database. Will this be enough? One more thing, please: what should the password be? Should it be DMSYS, since this account will not be used by me but by the system?
    CREATE USER "DMSYS" PROFILE "DEFAULT" IDENTIFIED BY "*******" PASSWORD EXPIRE DEFAULT TABLESPACE "SYSAUX" TEMPORARY TABLESPACE "TEMP" QUOTA 204800 K ON "SYSAUX" ACCOUNT LOCK
    GRANT ALTER SESSION TO "DMSYS"
    GRANT ALTER SYSTEM TO "DMSYS"
    GRANT CREATE JOB TO "DMSYS"
    GRANT CREATE LIBRARY TO "DMSYS"
    GRANT CREATE PROCEDURE TO "DMSYS"
    GRANT CREATE PUBLIC SYNONYM TO "DMSYS"
    GRANT CREATE SEQUENCE TO "DMSYS"
    GRANT CREATE SESSION TO "DMSYS"
    GRANT CREATE SYNONYM TO "DMSYS"
    GRANT CREATE TABLE TO "DMSYS"
    GRANT CREATE TRIGGER TO "DMSYS"
    GRANT CREATE TYPE TO "DMSYS"
    GRANT CREATE VIEW TO "DMSYS"
    GRANT DROP PUBLIC SYNONYM TO "DMSYS"
    GRANT QUERY REWRITE TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_JOBS_RUNNING" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_REGISTRY" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_SYS_PRIVS" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_TAB_PRIVS" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_TEMP_FILES" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_LOCK" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_REGISTRY" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_SYSTEM" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_SYS_ERROR" TO "DMSYS"
    GRANT DELETE ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT INSERT ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT SELECT ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT UPDATE ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT SELECT ON "SYS"."V_$PARAMETER" TO "DMSYS"
    GRANT SELECT ON "SYS"."V_$SESSION" TO "DMSYS"
    The other database also has DMSYS, and its status is EXPIRED & LOCKED, but I'm still able to take the dump using Data Pump??

  • Problem importing a table with blob's

    Hi all, I'm facing the following situation.
    Source DB : 10.2.0.3 (client's DB)
    Destination DB (mine): 10.2.0.4
    I've a dump file (traditional) of a particular schema.
    I'm running import (imp) to import on my DB.
    It runs fine until it reaches one particular table. This table has 6 columns, 3 of them BLOBs.
    This table has 260,000 rows (checked with the export log).
    When the import reaches row 152352 it stops loading data, but the import is still running.
    What can I do to get more information about this situation in order to solve the problem?
    Any suggestion will be appreciated!
    Thanks in advance.

    Please identify the source and target OS versions. Are there any useful messages in the alert.log? How long did the export take? A rule of thumb says import will take twice as long. Have you tried expdp/impdp instead? Also see the following -
    How To Diagnose And Troubleshoot Import Or Datapump Import Hung Scenarios          (Doc ID 795034.1)
    How To Find The Cause of a Hanging Import Session          (Doc ID 184842.1)
    Import is Slow or Hangs          (Doc ID 1037231.6)
    Export and Import of Table with LOB Columns (like CLOB and BLOB) has Slow Performance          (Doc ID 281461.1)
    HTH
    Srini

  • Can RMAN backup and export datapump executed at the same time?

    Hello,
    I have several databases that I back up using RMAN and Data Pump export every night, starting at 6 PM and ending at midnight. The backup maintenance window doesn't give me enough time to run each database at a different time. I am using crontab to schedule my backups. Since I have so many databases that need to be backed up between 6 PM and midnight, some of the export and RMAN backup scripts will execute almost at the same time. My question is: can my Data Pump export and RMAN backup scripts run at the same time?
    Thank you in advance.
    John

    Needs must. If you don't run expdp in parallel, then it doesn't use that much. If it were really killing the system, you could look into setting up a resource plan that knocks that user down, but that is a big step.
    I would rather look into using RMAN incremental backups and block change tracking to minimize your RMAN time.
    Regards
    If your shop needs to do both simultaneously then go for it.
    Chris.
    PS: One of my shops has maybe 20-30 RMAN and Data Pump jobs all kicking off between 00:00 and 01:30, some simultaneously, some not. No complaints from users and no problems either. Go for it.
    Edited by: Chris Slattery on Nov 25, 2012 11:19 PM
