Taking impdp using network_link

Hi all,
I am stuck in a very difficult situation.
Can you please tell me whether it is possible to take an import from a database that is open in READ-ONLY mode?
I am going to use the impdp utility with the NETWORK_LINK parameter, connect to the read-only database, and carry out the import.
Please help, as we want to do production testing.

Zombie,
English is probably not your first language. I say this because you "export from" a database and "import into" a database; "import from a read-only database" just doesn't make sense as written.
Yes, you should be able to export from a read-only database.
No, you shouldn't be able to import into a read-only database.
The network link doesn't matter in this regard.
Can you please explain further?
Sybrand Bakker
Senior Oracle DBA
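
For reference, a network-mode import pulls the data over the database link into the read-write target, so the read-only source is only ever read from. A minimal sketch, assuming a link named ro_src and placeholder credentials, TNS alias, and schema name (run on the read-write target database):

create database link ro_src connect to <user> identified by <password> using '<tns_alias_of_readonly_db>';
impdp <user>/<password> NETWORK_LINK=ro_src SCHEMAS=<schema> DIRECTORY=DATA_PUMP_DIR LOGFILE=net_imp.log

Note that DIRECTORY here only controls where the log file goes; no dump file is written in a network-mode import.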

Similar Messages

  • Impdp using network_link

    Hi All,
    I have imported a database from a remote server using the following script:-
    impdp system/manager DIRECTORY=dir NETWORK_LINK=LINK remap_schema=test:test FLASHBACK_SCN=8254511416354 content=all TABLE_EXISTS_ACTION=REPLACE
    It was successfully imported.
    But the constraints were not imported. How can I import these constraints now without affecting the data at all in the target database?

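    One hedged way to add just the missing constraints afterwards is a metadata-only pass that writes the DDL to a SQL file for review instead of touching the data. A minimal sketch reusing the link and remap from the post (the SQLFILE name is a placeholder):

    impdp system/manager NETWORK_LINK=LINK DIRECTORY=dir remap_schema=test:test CONTENT=METADATA_ONLY INCLUDE=CONSTRAINT INCLUDE=REF_CONSTRAINT SQLFILE=constraints.sql

    With SQLFILE, nothing is imported; review constraints.sql and then run it with SQL*Plus against the target, so the existing rows are untouched.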

  • Refreshing sa user on 2 instances through impdp using network_link

    Hi all,
    I have 2 instances XLTCRM1 & XLTCRM2 .I want to refresh the sa user of XLTCRM2  with sa user of XLTCRM1.
    Can anyone provide the query for doing that through a single line impdp command and network_link or any other feature of impdp.
    Cheers,
    Kunwar

    Hi
    In XLTCRM2, create a database link:
    create database link old_sa connect to sa identified by <password> using 'XLTCRM1';
    Then use the following at XLTCRM2:
    impdp sa/<password> DIRECTORY=dmpdir NETWORK_LINK=old_sa
    You have to make sure that your tnsnames.ora has an entry for XLTCRM1 and that the directory exists with the proper rights.
    regards
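
    If the goal is a true one-line schema refresh, a slightly fuller sketch; SCHEMAS and TABLE_EXISTS_ACTION are my additions rather than something from the reply above, so adjust to taste:

    impdp sa/<password> DIRECTORY=dmpdir NETWORK_LINK=old_sa SCHEMAS=SA TABLE_EXISTS_ACTION=REPLACE LOGFILE=refresh_sa.log

    Keep in mind TABLE_EXISTS_ACTION=REPLACE drops and recreates existing tables in the target sa schema, so make sure that is what "refresh" is meant to do before running it.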

  • Impdp with network_link

    I just want to know about the performance of using network_link with impdp instead of running expdp and impdp separately. Could someone let me know which is the best method?
    The impdp will import only the application schemas, not the whole database. If I run expdp, it creates a dump file of about 45G; there is a huge amount of data in this schema.
    Source DB version: 10.2.0.3
    target DB version: 10.2.0.4

    I'm excluding the statistics since I'm doing remap_schema; there is a known bug where the import runs long with statistics when remap_schema is used. But I just want to know the performance impact on the source database when importing over network_link, as the import will run for over 4 hours with five parallel workers.
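
    For what it's worth, the shape of such a network-mode import would look roughly like this; the link, schema, and log names are placeholders, not taken from the post:

    impdp system/<password> NETWORK_LINK=src_link SCHEMAS=app_owner REMAP_SCHEMA=app_owner:app_owner_new EXCLUDE=STATISTICS PARALLEL=5 LOGFILE=net_imp.log

    On performance: in network mode the source database unloads and streams the rows for the entire import window, whereas a dump-file export reads the source once and the file can be reused; also, parallel workers in network mode generally split across tables rather than within a single table, so a schema dominated by a few very large tables may not gain much from PARALLEL=5.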

  • Why no Rows  importing into dest schema by using network_link in data pump?

    Hi,
    I am importing tables from a source schema to a destination schema using a direct network_link, without a dump file, as shown below.
    impdp pa_dis_sub/<password> NETWORK_LINK=mylink tables=pa_dis_sub.ipa_hist_bk table_exists_action=replace CONTENT=all directory=Data_Pump_dir logfile=exp_error.log
    But no data is loaded into the destination schema; it shows 0 rows imported and I am not able to trace the issue. Is anyone aware of why the data is not importing?
    Import: Release 11.1.0.7.0 - 64bit Production on Tuesday, 26 July, 2011 0:04:49
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "PA_DIS_SUB"."SYS_IMPORT_TABLE_01": pa_dis_sub/******** NETWORK_LINK=mylink tables=pa_dis_sub.ipa_hist_bk table_exists_action=replace CONTENT=all directory=Data_Pump_dir logfile=exp_error.log
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 0 KB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    . . imported "PA_DIS_SUB"."IPA_HIST_BK" 0 rows
    Job "PA_DIS_SUB"."SYS_IMPORT_TABLE_01" successfully completed at 00:05:14

    868591 wrote:
    I am not using export. I am using a direct import from the source schema to the destination schema with the network_link option.
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm#sthref319
    Edited by: sb92075 on Jul 25, 2011 9:36 PM
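
    Since the log shows a BLOCKS estimate of 0 KB, a quick sanity check worth running on the importing side is whether the source table actually has rows as seen through the link, and which link definition is being used; a minimal sketch using the names from the post:

    select count(*) from pa_dis_sub.ipa_hist_bk@mylink;
    select db_link, username, host from user_db_links where db_link like 'MYLINK%';

    If the count over the link is 0, the problem is on the source (empty table or the link resolving to a different schema/database), not in the impdp command.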

  • Script to execute impdp using SILO concept

    Hi ,
    Could anyone help me write a shell script to perform impdp using dbms_datapump with the SILO concept?
    Thanks
    JP

    Hi Harry ,
    Thanks for your reply .
    In my company, for performing schema refreshes we use expdp/impdp via dbms_datapump through a network link. Since it uses the network link, it takes 8 to 10 hours to complete the refresh.
    Now we should not use the network link, and the time should be decreased to 4 to 5 hours, so we are planning another solution: perform the export only once on the source DB, move the dump file to the respective boxes, and then perform the impdp using dbms_datapump with the SILO concept.
    I have written the script to perform the export; now I need to write the script to perform the import using the SILO concept, and the SILO value should be passed in through the shell script (a rough shell sketch follows at the end of this reply).
    Thanks
    JP
    Edited by: JP88 on Apr 4, 2013 3:39 AM
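
    Without knowing exactly how SILO is defined in this environment, here is only a rough sketch of a shell wrapper that takes the silo value as an argument and hands it to a plain impdp call (not dbms_datapump); every name in it, including the directory, dump file pattern, and the assumption that a silo maps to a schema name, is a placeholder:

    #!/bin/sh
    # usage: ./imp_silo.sh <silo_value>
    # SYSTEM_PWD is expected in the environment so the password is not hard-coded
    SILO=$1
    LOG=imp_silo_${SILO}.log

    impdp system/${SYSTEM_PWD} \
        DIRECTORY=DMP_DIR \
        DUMPFILE=refresh_%U.dmp \
        SCHEMAS=APP_SILO_${SILO} \
        TABLE_EXISTS_ACTION=REPLACE \
        LOGFILE=${LOG}

    The same value could instead be used to pick a per-silo parfile if the silo-to-schema mapping is more complicated than a naming convention.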

  • My iPhone Froze While Taking Pictures Using An App.

    I was taking pictures using a photo editing app named Luminance, and after I took a photo the screen froze.
    Neither the lock button nor the home button works. I can't turn it off either.
    What should I do?

    I've had the same thing happen to me when an app starts to lag: shutting it off leaves the image stuck on the screen and won't let you enter your passcode on the iPhone. It is frustrating, but do as @kb1951 said. It worked for me.

  • DB Upgrade+Migrate from 10.1 to 11.2 using IMPDP with network_link param

    Dear all,
    I would like to upgrade and migrate my 10.1.0.5 DB on old server to 11.2.0.2 on new server.
    Here is the background info:
    Old server:
    OS : Redhat Linux AS 2.1
    DB Version : 10.1.0.5 (32 bit)
    No RAC
    New Server:
    OS : OEL 5.5
    DB Version : 11.2.0.2
    RAC
    ASM
    What I have done so far:
    1. Create new clustered 11Gr2 DB on new server.
    2. Pre-create tablespaces on new DB.
    3. Migrate 10.1.0.5 to 11.2.0.2 using IMPDP.
    impdp system/******* DIRECTORY=dump_file_dir NETWORK_LINK=DWH_DBLINK LOGFILE=log_file_dir:DWH_import_20110621.log FULL=Y SERVICE_NAME=dwhdb.xxx.xxx TABLE_EXISTS_ACTION=replace cluster=N exclude=statistics
    4. After the IMPDP completed, invalid objects and components were found; running utlrp.sql did not help.
    SQL> select owner, count(*) from dba_objects where status = 'INVALID' group by owner;
    OWNER COUNT(*)
    WKSYS 16
    PUBLIC 12
    OLAPSYS 7
    ODM 21
    SYS 2
    WMSYS 11
    12 rows selected.
    SQL> select comp_name, status, version from dba_registry;
    COMP_NAME STATUS VERSION
    OWB VALID 11.2.0.2.0
    Oracle Application Express VALID 3.2.1.00.12
    Oracle Enterprise Manager VALID 11.2.0.2.0
    OLAP Catalog VALID 11.2.0.2.0
    Spatial VALID 11.2.0.2.0
    Oracle Multimedia VALID 11.2.0.2.0
    Oracle XML Database VALID 11.2.0.2.0
    Oracle Text VALID 11.2.0.2.0
    Oracle Expression Filter VALID 11.2.0.2.0
    Oracle Rules Manager VALID 11.2.0.2.0
    Oracle Workspace Manager INVALID 11.2.0.2.0
    Oracle Database Catalog Views VALID 11.2.0.2.0
    Oracle Database Packages and Types INVALID 11.2.0.2.0
    JServer JAVA Virtual Machine VALID 11.2.0.2.0
    Oracle XDK VALID 11.2.0.2.0
    Oracle Database Java Packages VALID 11.2.0.2.0
    OLAP Analytic Workspace VALID 11.2.0.2.0
    Oracle OLAP API VALID 11.2.0.2.0
    Oracle Real Application Clusters VALID 11.2.0.2.0
    19 rows selected.
    5. Checked SYS's invalid objects, e.g. DBA_OUTLINE_HINTS; after tracing the reason, I found that outln.ol$hints was replaced by the 10.1.0.5 version. I think it is due to IMPDP's "TABLE_EXISTS_ACTION=replace" parameter.
    Other invalid objects, like WMSYS.WM$ENV_VARS, were also replaced by the old version.
    What should I do now?
    Do I need to run an upgrade script after the DB migration using IMPDP?
    Is the migration procedure correct?
    Please advise
    Thanks in advance

    Hi,
    It looks like you've messed up the Oracle default users' data and/or metadata, i.e. sys, system, wksys.
    How many non-default (application) users do you have, and how big is the database? If you're using this method I'm assuming the data is not really big.
    I personally would not do a full export/import across different versions; it's better to export just the non-default application schemas, as otherwise you may end up messing up the SYS objects, etc.
    What you might do is:
    - drop the 11g database and start from the beginning again, but exclude the Oracle default users, e.g. sys, system, etc. (a sketch follows this reply)
    - or try running the catupgrd.sql script (this will drop and recreate the objects); this may or may not fix your invalid components
    Cheers
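
    A minimal sketch of what the "application schemas only" network import could look like; the schema names and the choice to exclude statistics are assumptions for illustration, not something taken from this thread:

    impdp system/<password> NETWORK_LINK=DWH_DBLINK SCHEMAS=app_owner1,app_owner2 EXCLUDE=STATISTICS TABLE_EXISTS_ACTION=REPLACE LOGFILE=log_file_dir:schema_import.log

    With SCHEMAS listing only the application owners, the dictionary-owned objects (SYS, WMSYS, OUTLN, ...) are never touched, which avoids the invalid-component problem described above.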

  • Unable to import database using impdp via network_link

    Dear All,
    I am trying to import a database from a remote location's dump into my local database, after creating a dblink from the local DB to the remote DB.
    At Remote Server:
    - Directory created and granted the privileges. Dir Name: PREMLIVE_EXPDP
    At Local server:
    Username: expuser with dba, imp and exp full database privileges.
    DB Link name: to_premiatest14
    I get a "directory invalid" error when I run the command below on the local server to import a schema from the remote database.
    C:\>impdp expuser/expuser network_link=to_premiatest14 directory=PREMLIVE_EXPDP remap_schema=PREMIATEST:PREMIATEST15 logfile=2013-09-02_premiatest15.log
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit
    Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-39002: invalid operation
    ORA-39070: Unable to open the log file.
    ORA-39087: directory name PREMLIVE_EXPDP is invalid
    Any help or advice would be highly appreciated.
    Regards,
    Syed

    Some more details follow.
    Server 1: Remote Server
    BANNER
    Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bi
    PL/SQL Release 10.2.0.5.0 - Production
    CORE    10.2.0.5.0      Production
    TNS for 64-bit Windows: Version 10.2.0.5.0 - Production
    NLSRTL Version 10.2.0.5.0 - Production
    TNS-Entry:
    PREMIATEST14=
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = XX.XX.XX.14)(PORT = 1521))
        (CONNECT_DATA =
          (SERVER = DEDICATED)
          (SERVICE_NAME = malath)
    DIRECTORY:
    PREMLIVE_EXPDP AS
    'C:\Dump\PREMLIVE_EXPDP'; ---->>>> This location/directory holds the dump file.
    Server 2: Local Server
    BANNER
    Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bi
    PL/SQL Release 10.2.0.5.0 - Production
    CORE    10.2.0.5.0      Production
    TNS for 64-bit Windows: Version 10.2.0.5.0 - Production
    NLSRTL Version 10.2.0.5.0 - Production
    TNS-Entry:
    PREMIATEST15=
      (DESCRIPTION=
        (ADDRESS=
          (PROTOCOL=TCP)
          (HOST=XX.XX.XX.15)
          (PORT=1521)
        (CONNECT_DATA=
          (SERVICE_NAME=PREMIA15)
    PREMIATEST14 =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = XX.XX.XX.14)(PORT = 1521))
        (CONNECT_DATA =
          (SERVER = DEDICATED)
          (SERVICE_NAME = malath)
    Network/DBLink:
    CREATE PUBLIC DATABASE LINK TO_PREMIATEST14 CONNECT TO EXPUSER IDENTIFIED BY <PWD> USING 'PREMIATEST14';
    DBLink tested :
    SQL> select count(*) from premiatest.brkdivion@to_premiatest14;
      COUNT(*)
            94
    Error on importing:
    C:\>impdp expuser/expuser network_link=to_premiatest14 directory=PREMLIVE_EXPDP
    remap_schema=PREMIATEST:PREMIATEST15 logfile=2013-09-02_premiatest15.log
    Import: Release 10.2.0.5.0 - 64bit Production on Tuesday, 03 September, 2013 1:23:10
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit
    Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-39002: invalid operation
    ORA-39070: Unable to open the log file.
    ORA-39087: directory name PREMLIVE_EXPDP is invalid
    I don't know where I am going wrong. Valuable advice/assistance will be highly appreciated.
    Thanks & Regards,
    Syed
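
    One thing that stands out: ORA-39087 is raised by the database where the Data Pump job runs, and with NETWORK_LINK the job (and its log file) lives on the local, importing database, so the directory object has to exist there, not only on the remote server where the dump sits. A hedged fix, assuming the C:\Dump path exists on the local box:

    -- run on the LOCAL database (Server 2)
    CREATE OR REPLACE DIRECTORY PREMLIVE_EXPDP AS 'C:\Dump\PREMLIVE_EXPDP';
    GRANT READ, WRITE ON DIRECTORY PREMLIVE_EXPDP TO expuser;

    Also note that a pure network-mode import reads straight from the remote database and does not use the remote dump file at all; the directory is only needed for the log file.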

  • Migrating from 10gR2 to 11gR2 using impdp with network_link fails

    Hi,
    I'm trying to migrate a database from 10gR2 (Novell SLES 10) to a new server (Novell SLES 11) with 11gR2. I created a database on the new system with all required users and tablespaces. Then I used impdp in 11g with a database link to the source database:
    impdp system/pw@db NETWORK_LINK=X_SourceServer FULL=Y
    This command fails immediately with ORA-39113: unable to determine database version. Even using the parameter VERSION=10.2.0 doesn't change the result.
    The Oracle Upgrade Guide tells me that upgrading is possible from 10.x to 11.x and recommends impdp with a network link. Is the failure perhaps a bug in the NETWORK_LINK procedure? Would it be better to run expdp on the source and impdp on the target with a dump file as an intermediate (the database is rather big, so dealing with files across the network will take a lot of storage capacity)?
    Has anyone tried this method with success?
    Thanks
    Friedhelm
    Edited by: user8964905 on 02.09.2011 02:34

    This is due to a bug. The workaround is in the note:
    IMPDP From 10g TO 11g Gives ORA-39113, ORA-4052 & ORA-6544 PL/SQL: internal error, arguments: [55916] (Doc ID 1062428.1)
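
    If the dump-file route ends up being the fallback, a minimal sketch; the directory name and parallelism are placeholders:

    On the 10gR2 source:
    expdp system/<password> FULL=Y DIRECTORY=dump_dir DUMPFILE=full_%U.dmp PARALLEL=4 LOGFILE=full_exp.log
    On the 11gR2 target, after copying the dump files over:
    impdp system/<password> FULL=Y DIRECTORY=dump_dir DUMPFILE=full_%U.dmp PARALLEL=4 LOGFILE=full_imp.log

    No VERSION parameter is needed in this direction, since a newer release can read a dump produced by an older one.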

  • Data pump using network_link problem

    Hi,
    I defined a database link for the sys user as:
    create database link xe_remote
    connect to hr
    identified by hr
    using 'xe_remote'
    and it works fine:
    select * from dual@xe_remote
    D
    X
    Then I tried to import the hr user from xe_remote into the local system with Data Pump, but it seemed that the database link wasn't valid anymore:
    impdp system/pwd schemas=hr network_link=xe_remote remap_schema=hr:hr_remote table_exists_action=replace
    ORA-39001: invalid argument value
    ORA-39200: Link name "xe_remote" is invalid
    ORA-02019: connection description for remote database not found
    Thanks,
    Gianluca

    Hello
    oerr says:
    oerr ora 39149
    39149, 00000, "cannot link privileged user to non-privileged user"
    // *Cause:  A Data Pump job initiated be a user with
    // EXPORT_FULL_DATABASE/IMPORT_FULL_DATABASE roles specified a
    // network link that did not correspond to a user with
    // equivalent roles on the remote database.
    // *Action: Specify a network link that maps users to identically privileged
    // users in the remote database.
    Solution:
    Create a db link from hr_remote to hr
    or
    Create a db link from system to system (I think that's easier, because you have all the rights...)
    Greetings
    Sven
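
    A minimal sketch of the system-to-system variant suggested above (password and TNS alias are placeholders). The ORA-02019 is likely because the original link was created as a private link of SYS, so it is simply not visible to the SYSTEM user running impdp; a privileged Data Pump job also needs the link to map to an equally privileged user on the remote side:

    -- as SYSTEM on the local database
    create database link xe_remote connect to system identified by <password> using 'xe_remote';
    Then rerun the import as before:
    impdp system/<password> schemas=hr network_link=xe_remote remap_schema=hr:hr_remote table_exists_action=replace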

  • Error while taking Backup using hpux tape drive

    I got the following error while taking a backup.
    ERROR:   System Error, /opt/ignite/bin/make_medialif failed creating boot LIF 
    ERROR:   Failed to generate LIF on tape.
    ERROR:   System Error, /opt/ignite/bin/make_medialif failed creating boot LIF
    ERROR:   Failed to generate LIF on tape.
    make_medialif: ERROR: Cannot tar and gzip script files; possibly insufficient disk space in /var/tmp/make_medialif8916.
    =======  11/10/10 12:17:49 IST  make_tape_recovery completed unsuccessfully
    ERROR:     Ignite tape writing phase had errors
               Exit status: 1
    How can this problem be resolved? Kindly drop in your comments and solutions. 

    Hello Kulvinderbuttar,
    I understand that you are not able to make the recovery media.
    Are you using recovery media creation?
    Here is a link to walk you through the steps to create the recovery media.
    The flash drive that you are using needs to be empty and may need to be reformatted before you can make the recovery media.
    Let me know how everything goes.

  • Impdp using EXCLUDE

    Hi
    I want to import a full database excluding a list of schemas. I know I can use the EXCLUDE parameter to do that, but I need to exclude several schemas and I am having problems writing the appropriate list in the EXCLUDE parameter. Can you help me?
    Thanks

    I tried it as you recommended but I got an error as well:
    impdp system/???@????? exclude=SCHEMA:"='DBSNMP,MGMT_VIEW,SYSMAN,SYS,SYSTEM,FLOWS_FILES,MDSYS,SCOTT,WMSYS,APPQOSSYS,XS$NULL,APEX_030200,OWBSYS_AUDIT,MDDATA,OWBSYS,ORACLE_OCM,ORDDATA,ANONYMOUS,OUTLN,DIP,EXFSYS,APEX_PUBLIC_USER,XDB,ORDSYS,SPATIAL_CSW_ADMIN_USR,CTXSYS,SPATIAL_WFS_ADMIN_USR,ORDPLUGINS,OLAPSYS,SI_INFORMTN_SCHEMA'" DIRECTORY=dirorigen LOGFILE=DATA_PUMP_DIR:imp.log FULL=y DUMPFILE=fullbk.dmp
    $
    $ ./importarfullexclude.sh
    LRM-00116: syntax error at 'SCHEMA:' following '='
    $
    I tried it this way and I got an error again...
    Import: Release 11.2.0.2.0 - Production on Fri Feb 18 11:35:00 2011
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-39071: Value for EXCLUDE is badly formed.
    ORA-00920: invalid relational operator
    $
    impdp system/???@????? exclude=SCHEMA:"'DBSNMP,MGMT_VIEW,SYSMAN,SYS,SYSTEM,FLOWS_FILES,MDSYS,SCOTT,WMSYS,APPQOSSYS,XS$NULL,APEX_030200,OWBSYS_AUDIT,MDDATA,OWBSYS,ORACLE_OCM,ORDDATA,ANONYMOUS,OUTLN,DIP,EXFSYS,APEX_PUBLIC_USER,XDB,ORDSYS,SPATIAL_CSW_ADMIN_USR,CTXSYS,SPATIAL_WFS_ADMIN_USR,ORDPLUGINS,OLAPSYS,SI_INFORMTN_SCHEMA'" DIRECTORY=dirorigen LOGFILE=DATA_PUMP_DIR:imp_2011.log FULL=y DUMPFILE=fullbk_2011.dmp
    $
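
    For reference, the EXCLUDE name clause is easiest to get right from a parameter file, because the shell never sees the quotes; a hedged sketch (the file name is illustrative and the schema list below is abbreviated, so keep the full list from the post):

    importarfullexclude.par:
    FULL=Y
    DUMPFILE=fullbk.dmp
    DIRECTORY=dirorigen
    LOGFILE=DATA_PUMP_DIR:imp.log
    EXCLUDE=SCHEMA:"IN ('DBSNMP','MGMT_VIEW','SYSMAN','SYS','SYSTEM','SCOTT','OUTLN','XDB')"

    Then:
    impdp system/<password>@<tns> parfile=importarfullexclude.par

    The name clause needs a valid SQL operator (IN or =), which the second attempt was missing (hence ORA-00920/ORA-39071), and each schema must be its own quoted string inside IN (...). The first attempt failed earlier (LRM-00116) because the shell mangled the quoting on the command line; a parfile avoids that entirely.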

  • Impdp using parfile...

    Hi ,
    I am doing an import of a table using a parfile with the QUERY option. My import has been running for 4 hours, whereas the export completed within half an hour???
    Kindly suggest...
    query_imp.par
    tables=ABC
    query=ABC:"WHERE TIMESTAMP BETWEEN '09/20/2013 00:00:00' AND '10/09/2013' "
    logfile=import_ABC.log
    parallel= 3
    DIRECTORY=export_ABC
    DUMPFILE=export_ABC_%U.dmp
    table_exists_action=APPEND
    CONTENT=ALL
    JOB_NAME=import_ABC
    while executing the command:
    impdp SYSTEM/*****@orcl parfile=/***/***/query_imp.par
    O/P:
    Starting "SYSTEM"."IMPORT_ABC":  SYSTEM/********@ORCL tables=ABC parfile=/***/***/query_imp.par
    Processing object type TABLE_EXPORT/TABLE/TABLE
    ORA-39152: Table "XYZ"."ABC" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Why does the status show 100% completed while the job is still running????
    and STATUS is:
    Import> STATUS
    Job: IMPORT_ABC
      Operation: IMPORT
      Mode: TABLE
      State: EXECUTING
      Bytes Processed: 0
      Current Parallelism: 5
      Job Error Count: 0
      Dump File: /**/**/export_ABC_%u.dmp
      Dump File: /**/**/export_ABC_01.dmp
      Dump File: /**/**/export_ABC_02.dmp
    Worker 1 Status:
      State: WORK WAITING
    Worker 2 Status:
      State: EXECUTING
      Object Schema:XYZ
      Object Name: ABC
      Object Type: TABLE_EXPORT/TABLE/TABLE_DATA
      Completed Objects: 1
      Completed Bytes: 55,512,836,576
      Percent Done: 100
      Worker Parallelism: 1

    Hi ,
    While checking the lock status, I am getting the following through this query:
    set linesize 150;
    set head on;
    col sid_serial form a13
    col ora_user for a15;
    col object_name for a35;
    col object_type for a10;
    col lock_mode for a15;
    col last_ddl for a8;
    col status for a10;
    break on sid_serial;
    SELECT l.session_id||','||v.serial# sid_serial,
           l.ORACLE_USERNAME ora_user,
           o.object_name,
           o.object_type,
           DECODE(l.locked_mode,
              0, 'None',
              1, 'Null',
              2, 'Row-S (SS)',
              3, 'Row-X (SX)',
              4, 'Share',
              5, 'S/Row-X (SSX)',
              6, 'Exclusive',
              TO_CHAR(l.locked_mode)
           ) lock_mode,
           o.status,
           to_char(o.last_ddl_time,'dd.mm.yy') last_ddl
    FROM dba_objects o, gv$locked_object l, v$session v
    WHERE o.object_id = l.object_id
          and l.SESSION_ID=v.sid
    order by 2,3;
    ===================
    SESSION_ID  ORACLE_USERNAME  OS_USER_NAME  OBJECT_OWNER  OBJECT_NAME  OBJECT_TYPE  LOCKED_MODE
    1028        SYSTEM           oracle        www           ABC          TABLE        6
    1015        NFINSERT         qa            www           ABC          TABLE        0
    982         NFINSERT         qa            www           ABC          TABLE        0
    482         NFINSERT         qa            www           ABC          TABLE        0
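
    The exclusive (mode 6) lock held by the SYSTEM session is most likely just the Data Pump worker loading the table for the APPEND. To see whether the job is actually moving, a couple of hedged monitoring queries against standard dictionary views (nothing specific to this job):

    select owner_name, job_name, operation, state, degree from dba_datapump_jobs;
    select sid, opname, target, sofar, totalwork, time_remaining from v$session_longops where sofar <> totalwork;

    If sofar keeps growing, the import is simply slow (the QUERY filter forces the rows to be evaluated on load), not hung.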

  • Error while taking dump using datapump

    I am getting the following error:
    Export: Release 10.2.0.1.0 - Production on Friday, 15 September, 2006 10:31:41
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Starting "XX"."SYS_EXPORT_SCHEMA_02": XX/********@XXX directory=dpdump dumpfile=XXX150906.dmp logfile=XXX150906.log
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
    ORA-31642: the following SQL statement fails:
    BEGIN "DMSYS"."DBMS_DM_MODEL_EXP".SCHEMA_CALLOUT(:1,0,0,'10.02.00.01.00'); END;
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
    ORA-06512: at "SYS.DBMS_METADATA", line 907
    ORA-06550: line 1, column 7:
    PLS-00201: identifier 'DMSYS.DBMS_DM_MODEL_EXP' must be declared
    ORA-06550: line 1, column 7:
    PL/SQL: Statement ignored
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPW$WORKER", line 6235
    ----- PL/SQL Call Stack -----
    object line object
    handle number name
    2A68E610 14916 package body SYS.KUPW$WORKER
    2A68E610 6300 package body SYS.KUPW$WORKER
    2A68E610 9120 package body SYS.KUPW$WORKER
    2A68E610 1880 package body SYS.KUPW$WORKER
    2A68E610 6861 package body SYS.KUPW$WORKER
    2A68E610 1262 package body SYS.KUPW$WORKER
    255541A8 2 anonymous block
    Job "XX"."SYS_EXPORT_SCHEMA_02" stopped due to fatal error at 10:33:12
    The required action is to contact customer support. On Metalink I found a link stating it is a bug in 10g Release 1 that was supposed to be fixed in 10g Release 1 version 4.
    Some of the default schemas were purposely dropped from the database. The only default schemas available now are:
    DBSNMP, DIP, OUTLN, PUBLIC, SCOTT, SYS, SYSMAN, SYSTEM, TSMSYS.
    DIP, OUTLN and TSMSYS were created again.
    Could this be the cause of the problem??
    Thanks in adv.

    Hi,
    Below is the DDL taken from a different database. Will this be enough? One more thing, please: what should the password be? Should it be DMSYS..... since this will not be used by me but by the system.
    CREATE USER "DMSYS" PROFILE "DEFAULT" IDENTIFIED BY "*******" PASSWORD EXPIRE DEFAULT TABLESPACE "SYSAUX" TEMPORARY TABLESPACE "TEMP" QUOTA 204800 K ON "SYSAUX" ACCOUNT LOCK
    GRANT ALTER SESSION TO "DMSYS"
    GRANT ALTER SYSTEM TO "DMSYS"
    GRANT CREATE JOB TO "DMSYS"
    GRANT CREATE LIBRARY TO "DMSYS"
    GRANT CREATE PROCEDURE TO "DMSYS"
    GRANT CREATE PUBLIC SYNONYM TO "DMSYS"
    GRANT CREATE SEQUENCE TO "DMSYS"
    GRANT CREATE SESSION TO "DMSYS"
    GRANT CREATE SYNONYM TO "DMSYS"
    GRANT CREATE TABLE TO "DMSYS"
    GRANT CREATE TRIGGER TO "DMSYS"
    GRANT CREATE TYPE TO "DMSYS"
    GRANT CREATE VIEW TO "DMSYS"
    GRANT DROP PUBLIC SYNONYM TO "DMSYS"
    GRANT QUERY REWRITE TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_JOBS_RUNNING" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_REGISTRY" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_SYS_PRIVS" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_TAB_PRIVS" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_TEMP_FILES" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_LOCK" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_REGISTRY" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_SYSTEM" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_SYS_ERROR" TO "DMSYS"
    GRANT DELETE ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT INSERT ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT SELECT ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT UPDATE ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT SELECT ON "SYS"."V_$PARAMETER" TO "DMSYS"
    GRANT SELECT ON "SYS"."V_$SESSION" TO "DMSYS"
    The other database has DMSYS with status EXPIRED & LOCKED, but I'm still able to take the dump using Data Pump??
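
    The PLS-00201 on DMSYS.DBMS_DM_MODEL_EXP fits the dropped-schema theory: Data Pump calls export "callouts" for components registered in the database, so if the Data Mining component is still registered but its DMSYS schema is gone, the callout fails. A hedged check using only standard views:

    select comp_id, comp_name, status from dba_registry;
    select username, account_status from dba_users where username = 'DMSYS';

    If the Oracle Data Mining component still shows up in dba_registry but DMSYS is missing from dba_users, the cleaner fixes are usually to reinstall the component or remove its registration by following the support note, rather than hand-editing dictionary tables.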

    Users in our 12.0.4 development system that only have the 'xMII Users' role can see and use the 'Applet Debugging' option under the System Management menu.  I'd prefer them to only have the options under the Support menu. I'm puzzled why this one opt