Some objects in the SYS schema are not imported using full database export/import.

Hello All,
We need to migrate a database from one server to a new server using the same database version, 11g Release 2 (11.2.0.1.0). We have some production objects in the SYS schema. We took a full export using full=y and then imported it on the new server using full=y. Objects belonging to the other tablespaces and users were imported successfully, but the production objects (i.e. the tables in the SYS schema) were not imported.
We used following commands for export and import:
# exp system file=/u01/backup/orcl_full_exp.dmp log=/u01/backup/orcl_full_exp.log full=y statistics=none
# imp system file=/u01/backup/orcl_full_exp.dmp log=/u01/backup/orcl_full_imp.log full=y ignore=y
Kind Regards,
Sharjeel

Hi,
First of all, it is not good practice to keep user objects in the SYS schema.
Second, since you are on 11gR2, why not use the Data Pump export/import method (expdp/impdp) instead of the original exp/imp?
Third, did you get any error during the import? If so, what is the error?
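For reference, a minimal Data Pump sketch of this migration might look like the lines below; the directory object name DPUMP_DIR and the file names are assumptions, and a directory object must already exist on both servers. Note that even a full Data Pump export skips user-created tables stored in the SYS schema, so those tables would still have to be handled separately (for example by moving them into an application schema before the export).
# on the source server (sketch; DPUMP_DIR is an assumed directory object)
expdp system full=y directory=DPUMP_DIR dumpfile=orcl_full.dmp logfile=orcl_full_exp.log
# on the new server, after copying the dump file into the directory path
impdp system full=y directory=DPUMP_DIR dumpfile=orcl_full.dmp logfile=orcl_full_imp.log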

Similar Messages

  • Is there a way to include the sys.aud$ table in a full database dp export?

    I am doing an export using the following parfile information:
    userid=/
    directory=datapump_nightly_export
    dumpfile=test_expdp.dmp
    logfile=test_expdp.log
    full=y
    content=all
    However, when I run this I do not see sys.aud$ in the log file. I know I can do a separate export specifically to get the sys.aud$ table, but is there any way to include it in my full export?
    Thanks in advance for any suggestion.

    Here's more background information... I have some audits set up on my database for one of my users. Every quarter an automated job runs that creates a usage/statistics report for this person using data in aud$. At the end of the job I export the aud$ table and truncate it. However, last quarter I found that there was a mistake in my report and my export did not run properly, so my audit data was gone. I also have full Data Pump exports that run daily, but found that aud$ was not in them. That is why I would like to include sys.aud$ in the full Data Pump exports.
    I understand why other SYS tables would be left out of a full export, but aud$ data cannot be reproduced, so to me it makes sense to include it in a full export.
    Don't worry, we run our true backups using RMAN, which is eventually how I got the aud$ data back, by creating a copy of my database up to the time of the truncate. However, this was quite time consuming.
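    Since full exports skip SYS-owned data, one conservative workaround (a sketch, not the poster's actual job; the schema and table names are assumptions) is to snapshot aud$ into an ordinary application schema right before the nightly full export runs, so the copy is picked up like any other user table:
    -- run as a privileged user before the nightly full export
    CREATE TABLE audit_arch.aud_snapshot AS SELECT * FROM sys.aud$;
    -- on later runs, refresh the existing snapshot instead:
    -- TRUNCATE TABLE audit_arch.aud_snapshot;
    -- INSERT INTO audit_arch.aud_snapshot SELECT * FROM sys.aud$;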

  • Import all users and their objects without doing full database import

    Hi Guys,
    I have a task that involves importing all existing users and their objects from production into the test database without doing a full database import. Please, how do I do this?
    I need your help urgently.

    SQL> select * from v$version;
    BANNER
    Oracle Database 10g Express Edition Release 10.2.0.1.0 - Product
    PL/SQL Release 10.2.0.1.0 - Production
    CORE 10.2.0.1.0 Production
    TNS for Linux: Version 10.2.0.1.0 - Production
    NLSRTL Version 10.2.0.1.0 - Production
    I tried to import objects and data from a user from a FULL dump file. File was created with the following command:
    server is: SQL*Plus: Release 10.2.0.1.0 - Production on Wed May 26 15:34:05 2010
    exp full=y FILE="full.dmp" log=full.log
    Now I imported:
    imp file=full.dmp log=full.log INCTYPE=SYSTEM
    imp fromuser=user1 file=full.dmp
    Results: not all the user procedures have been imported:
    On the original database:
    SQL> select count(*) from user_procedures;
    COUNT(*)
    134
    On the current (target) database:
    SQL> select count(*) from user_procedures;
    COUNT(*)
    18
    I also tried these alternatives:
    exp tablespaces="user1_data" FILE="user1.dmp" log=user1.log
    exp LOG=user1.log TABLESPACES=user1_data FILE=user1_data.dmp
    exp LOG=user1owner.log owner=user1 FILE=user1owner.dmp
    expdp DIRECTORY=dpump_dir1 dumpfile=servdata_pump version=compatible SCHEMAS=user1
    impdp directory=data_pump_dir dumpfile=servdata_pump.dmp
    This fails with:
    ORA-39213: Metadata processing is not available
    SQL> execute dbms_metadata_util.load_stylesheets
    BEGIN dbms_metadata_util.load_stylesheets; END;
    ERROR at line 1:
    ORA-31609: error loading file "kualter.xsl" from file system directory
    "/usr/lib/oracle/xe/app/oracle/product/10.2.0/server/rdbms/xml/xsl"
    ORA-06512: at "SYS.DBMS_METADATA_UTIL", line 1793
    ORA-06512: at line 1
    The file kualter.xsl does not exist in XE!
    imp owner=user1 rows=n ignore=y
    imp full=y file=user1_data.dmp
    imp full=y file=full.dmp tablespaces=user1_data,user1_index rows=n ignore=y
    So, I do not understand why user1 objects are not imported:
    see this part of the first import log:
    Export file created by EXPORT:V10.02.01 via conventional path
    import done in US7ASCII character set and AL16UTF16 NCHAR character set
    import server uses WE8ISO8859P1 character set (possible charset conversion)
    . importing SYS's objects into SYS
    . importing USER1's objects into USER1
    . . importing table .........................
    Why are only 18 procedures there?
    If you have any suggestions you are welcome, as I do not have any other ideas...
    ren
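    Since Data Pump fails here with ORA-39213 on XE, one option (a sketch, assuming the classic exp/imp utilities remain usable in this environment and that the schema list is known) is to export at the schema level and import with fromuser/touser, listing every application user:
    # export only the application schemas (user list is an assumption; add every user you need)
    exp system owner=user1,user2 file=schemas.dmp log=schemas_exp.log
    # import them into the test database
    imp system fromuser=user1,user2 touser=user1,user2 file=schemas.dmp log=schemas_imp.log ignore=y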

  • A new application built on a schema not in the portal database

    I want to create a new application using the create "new application" function of the "build" tab of Portal.
    I want this application to use a schema that is not in the Portal database. The schema the application has to use belongs to a different database residing on the same machine where the Portal database resides.
    But the Portal "Application Name and Schema" form does not allow me to choose a schema outside the Oracle Portal database.
    I run Oracle Portal Version: 3.0.7.6.2 on WNT.
    Thanks for any suggestion.

    Maurizio
    Carol's right. Create a schema in the local database, then create a db link to the remote db connecting as that schema user (or some other user of your choice). In my experience this has to be a public db link.
    I run a portal application against a database on another Sun box. There are some hassles, but it works OK mostly.
    The db link connects as the user to the remote database and I grant privileges in the remote db to that db link user. I also have public synonyms in the portal database that point to the objects in the remote db. I have public synonyms in the remote database that sort out the schema of the objects required.
    This sounds complicated and I've explained it in a rush, but if you sit down and work it through it's not so bad. You may find a better way, but after several trials I've got a setup that works.
    Also, when I create a form I always refer only to the synonym - never to a schema.object. That way you can adjust the synonym easily without rewriting your report.
    I always build reports from SQL because the SQL can be changed later, and again I refer only to my synonym, never to schema.object.
    Session info doesn't travel well to the remote instance. I've had to build a work-around to get my table triggers in the remote instance to figure out who the portal user is because wwctx_api.get_user returns public.
    I've been unable to directly call procedures in the remote database from portal components, but I've got around that by creating wrapper procedures in the local database.
    It hasn't been a trivial exercise, but it does work.
    Greg
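    A minimal sketch of the local-schema-plus-db-link setup Greg describes (every name, the TNS alias and the passwords below are placeholders, not the actual configuration):
    -- in the Portal database, as a DBA
    CREATE USER portal_app IDENTIFIED BY app_pwd;
    GRANT CREATE SESSION, CREATE SYNONYM TO portal_app;
    CREATE PUBLIC DATABASE LINK remote_db
      CONNECT TO remote_user IDENTIFIED BY remote_pwd
      USING 'REMOTE_TNS_ALIAS';
    -- point a synonym at the remote table so forms and reports never hard-code schema.object
    CREATE PUBLIC SYNONYM customers FOR customers@remote_db;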

  • [Q] ORA-30510 error while doing full database import

    We have Oracle 9.2.0.6 on Red Hat Linux AS 3.4. While doing a full database import, ORA-30510 comes out. I searched Metalink and Google, but did not find much information. The error messages on import are:
    . importing OUTLN's objects into OUTLN
    . importing SYSTEM's objects into SYSTEM
    . importing TKOWNER's objects into TKOWNER
    IMP-00017: following statement failed with ORACLE error 30510:
    "CREATE TRIGGER "TKOWNER".TRIG_DONOTANALYZE before analyze on schema decla"
    "re begin if sys.dictionary_obj_name = 'MYWTKEMPLOYEE' then raise_ap"
    "plication_error(-20101, 'Do Not ANALYZE MYWTKEMPLOYEE'); end if; end; "
    IMP-00003: ORACLE error 30510 encountered
    ORA-30510: system triggers cannot be defined on the schema of SYS user
    . importing SYS's objects into SYS
    . importing TKOWNER's objects into TKOWNER
    About to enable constraints...
    Import terminated successfully with warnings.
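    The failing statement is TKOWNER's BEFORE ANALYZE ON SCHEMA trigger; ORA-30510 suggests that, in the importing session, ON SCHEMA resolved to SYS rather than to TKOWNER. A hedged workaround is to recreate the trigger manually after the import, either connected as TKOWNER or by qualifying the schema explicitly, as in this sketch built from the statement text in the log:
    CREATE OR REPLACE TRIGGER tkowner.trig_donotanalyze
      BEFORE ANALYZE ON tkowner.SCHEMA
    DECLARE
    BEGIN
      IF sys.dictionary_obj_name = 'MYWTKEMPLOYEE' THEN
        raise_application_error(-20101, 'Do Not ANALYZE MYWTKEMPLOYEE');
      END IF;
    END;
    /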

    877410 wrote:
    Hi, I get the following errors while importing a schema from the prod to the dev database... can anyone help? Thanks!
    impdp system DIRECTORY=DATA_PUMP_DIR DUMPFILE=abcdprod.DMP LOGFILE=abcdprod.log REMAP_SCHEMA=abcdprod:abcddev
    ORA-39002: invalid operation
    ORA-31694: master table "SYSTEM"."SYS_IMPORT_FULL_01" failed to load/unload
    ORA-31644: unable to position to block number 170452 in dump file "/ots/oracle/echo/wxyz/datapump/abcdprod.DMP"
    Post the complete command line for the expdp that made the abcdprod.DMP file.

  • Full database import(impdp) error

    Hi all,
    Please help.
    My full database import is failing due to:
    ORA-39126: Worker unexpected fatal error in KUPW$WORKER.LOAD_METADATA [TABLE_DATA:"SYS"."SYS_IMPORT_FULL_04"]
    SELECT process_order, flags, xml_clob, NVL(dump_fileid, :1), NVL(dump_position, :2), dump_length, dump_allocation, grantor, object_row, object_schema, object_long_name, processing_status, processing_state, base_object_type, base_object_schema, base_object_name, property, size_estimate, in_progress FROM "SYS"."SYS_IMPORT_FULL_04" WHERE process_order between :3 AND :4 AND processing_state <> :5 AND duplicate = 0 ORDER BY process_order
    ORA-25153: Temporary Tablespace is Empty
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105
    ORA-06512: at "SYS.KUPW$WORKER", line 6272
    ----- PL/SQL Call Stack -----
    object line object
    handle number name
    c000000267348288 14916 package body SYS.KUPW$WORKER
    c000000267348288 6293 package body SYS.KUPW$WORKER
    c000000267348288 3511 package body SYS.KUPW$WORKER
    c000000267348288 6882 package body SYS.KUPW$WORKER
    c000000267348288 1259 package body SYS.KUPW$WORKER
    c00000026fb2de68 2 anonymous block
    Job "SYS"."SYS_IMPORT_FULL_04" stopped due to fatal error at 14:56:10
    regards

    Hi,
    You say the temporary tablespace is present and contains 2 GB of free space with 2 datafiles. Are you sure they are datafiles? They must be tempfiles. :)
    Secondly, have you checked the default temporary tablespace of the database? Is it the one that you have mentioned?
    SELECT PROPERTY_NAME, PROPERTY_VALUE FROM DATABASE_PROPERTIES WHERE PROPERTY_NAME='DEFAULT_TEMP_TABLESPACE';
    Anand
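    ORA-25153 usually means the temporary tablespace has no tempfile, or the wrong tablespace is set as the database default. A hedged sketch of the usual checks and fixes (the file path, size and tablespace name are assumptions):
    -- check whether TEMP actually has tempfiles
    SELECT tablespace_name, file_name, bytes FROM dba_temp_files;
    -- add a tempfile if none exists
    ALTER TABLESPACE temp ADD TEMPFILE '/u01/oradata/ORCL/temp01.dbf' SIZE 2G AUTOEXTEND ON;
    -- or point the database at the correct default temporary tablespace
    ALTER DATABASE DEFAULT TEMPORARY TABLESPACE temp;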

  • Full database import

    Hi,
    I am trying to import a full database using the impdp method.
    impdp system/oracle dumpfile=datap:fulldb.dmp full=y logfile=datap:full.log
    I am getting a fatal error message while importing.
    ===========================
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPW$WORKER", line 6273
    ----- PL/SQL Call Stack -----
    object line object
    handle number name
    5e6414d4 14916 package body SYS.KUPW$WORKER
    5e6414d4 6300 package body SYS.KUPW$WORKER
    5e6414d4 12689 package body SYS.KUPW$WORKER
    5e6414d4 11968 package body SYS.KUPW$WORKER
    5e6414d4 3279 package body SYS.KUPW$WORKER
    5e6414d4 6889 package body SYS.KUPW$WORKER
    5e6414d4 1262 package body SYS.KUPW$WORKER
    6408cfd0 2 anonymous block
    Job "SYSTEM"."SYS_IMPORT_FULL_03" stopped due to fatal error at 13:19:32:
    ==================================
    My question is: do we have to manually create, on the target machine, all the tablespaces (other than the system tablespaces) that existed in the source database before executing the import command?
    What are the prerequisites for importing a full export into a database? How do we make the target database ready to accept the import?
    My Oracle version is 10.2.0 on Solaris 10.
    Thanks in advance....

    Hi,
    My DB version is 10.2.0.0 and the OS is Solaris 10.
    I am pasting the contents of the alert log file.
    ==================================
    kupprdp: master process DM00 started with pid=19, OS id=1328
    to execute - SYS.KUPM$MCP.MAIN('SYS_IMPORT_FULL_01', 'SYSTEM', 'KUPC$C_1_20100724100422', 'KUPC$S_1_20100724100422', 0);
    kupprdp: worker process DW01 started with worker id=1, pid=20, OS id=1330
    to execute - SYS.KUPW$WORKER.MAIN('SYS_IMPORT_FULL_01', 'SYSTEM');
    Sat Jul 24 10:04:28 2010
    CREATE UNDO TABLESPACE "UNDOTBS1" DATAFILE '/oracle/product/dbn/undotbs01.dbf' SIZE 26214400 AUTOEXTEND ON NEXT 5242880 MAXSIZE 32767M BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE
    ORA-1543 signalled during: CREATE UNDO TABLESPACE "UNDOTBS1" DATAFILE '/oracle/product/dbn/undotbs01.dbf' SIZE 26214400 AUTOEXTEND ON NEXT 5242880 MAXSIZE 32767M BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE...
    Sat Jul 24 10:04:28 2010
    CREATE TABLESPACE "SYSAUX" DATAFILE '/oracle/product/dbn/sysaux01.dbf' SIZE 125829120 AUTOEXTEND ON NEXT 10485760 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    ORA-1543 signalled during: CREATE TABLESPACE "SYSAUX" DATAFILE '/oracle/product/dbn/sysaux01.dbf' SIZE 125829120 AUTOEXTEND ON NEXT 10485760 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO...
    Sat Jul 24 10:04:28 2010
    CREATE TEMPORARY TABLESPACE "TEMP" TEMPFILE '/oracle/product/dbn/temp01.dbf' SIZE 31457280 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576
    ORA-1543 signalled during: CREATE TEMPORARY TABLESPACE "TEMP" TEMPFILE '/oracle/product/dbn/temp01.dbf' SIZE 31457280 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576
    Sat Jul 24 10:04:28 2010
    CREATE TABLESPACE "USERTB" DATAFILE '/oracle/product/dbn/users01.dbf' SIZE 5242880 AUTOEXTEND ON NEXT 1310720 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    ORA-1119 signalled during: CREATE TABLESPACE "USERTB" DATAFILE '/oracle/product/dbn/users01.dbf' SIZE 5242880 AUTOEXTEND ON NEXT 1310720 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO...
    Sat Jul 24 10:04:28 2010
    CREATE TABLESPACE "EXAMPLE" DATAFILE '/oracle/product/dbn/example01.dbf' SIZE 104857600 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M NOLOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    ORA-1119 signalled during: CREATE TABLESPACE "EXAMPLE" DATAFILE '/oracle/product/dbn/example01.dbf' SIZE 104857600 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M NOLOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    Sat Jul 24 10:04:28 2010
    CREATE TABLESPACE "NEWTABLESPACE" DATAFILE '/oracle/product/newtablespace.dbf' SIZE 10485760 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576 SEGMENT SPACE MANAGEMENT AUTO
    ORA-1119 signalled during: CREATE TABLESPACE "NEWTABLESPACE" DATAFILE '/oracle/product/newtablespace.dbf' SIZE 10485760 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576 SEGMENT SPACE MANAGEMENT AUTO
    Sat Jul 24 10:04:28 2010
    CREATE TABLESPACE "T16KTAB" DATAFILE '/oracle/16k.dbf' SIZE 3145728 LOGGING ONLINE PERMANENT BLOCKSIZE 16384 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    ORA-29339 signalled during: CREATE TABLESPACE "T16KTAB" DATAFILE '/oracle/16k.dbf' SIZE 3145728 LOGGING ONLINE PERMANENT BLOCKSIZE 16384 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    Sat Jul 24 10:04:28 2010
    CREATE TABLESPACE "INDX_TS" DATAFILE '/oracle/indxts.dbf' SIZE 26214400,'/oracle/secondf.dbf' SIZE 26214400 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    ORA-1119 signalled during: CREATE TABLESPACE "INDX_TS" DATAFILE '/oracle/indxts.dbf' SIZE 26214400,'/oracle/secondf.dbf' SIZE 26214400 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    Sat Jul 24 10:04:43 2010
    Replication pre-import: trying to disable constraint REPCAT$_TEMPLATE_OBJECTS_FK1 for "SYSTEM"."REPCAT$_TEMPLATE_OBJECTS"
    Replication pre-import: constraint REPCAT$_TEMPLATE_OBJECTS_FK1 for "SYSTEM"."REPCAT$_TEMPLATE_OBJECTS" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_TEMPLATE_PARMS_FK1 for "SYSTEM"."REPCAT$_TEMPLATE_PARMS"
    Replication pre-import: constraint REPCAT$_TEMPLATE_PARMS_FK1 for "SYSTEM"."REPCAT$_TEMPLATE_PARMS" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_OBJECT_PARMS_FK1 for "SYSTEM"."REPCAT$_OBJECT_PARMS"
    Replication pre-import: constraint REPCAT$_OBJECT_PARMS_FK1 for "SYSTEM"."REPCAT$_OBJECT_PARMS" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_OBJECT_PARMS_FK2 for "SYSTEM"."REPCAT$_OBJECT_PARMS"
    Replication pre-import: constraint REPCAT$_OBJECT_PARMS_FK2 for "SYSTEM"."REPCAT$_OBJECT_PARMS" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_USER_AUTHORIZATION_FK2 for "SYSTEM"."REPCAT$_USER_AUTHORIZATIONS"
    Replication pre-import: constraint REPCAT$_USER_AUTHORIZATION_FK2 for "SYSTEM"."REPCAT$_USER_AUTHORIZATIONS" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_USER_PARM_VALUES_FK1 for "SYSTEM"."REPCAT$_USER_PARM_VALUES"
    Replication pre-import: constraint REPCAT$_USER_PARM_VALUES_FK1 for "SYSTEM"."REPCAT$_USER_PARM_VALUES" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_FLAVOR_OBJECTS_FK2 for "SYSTEM"."REPCAT$_FLAVOR_OBJECTS"
    Replication pre-import: constraint REPCAT$_FLAVOR_OBJECTS_FK2 for "SYSTEM"."REPCAT$_FLAVOR_OBJECTS" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_FLAVOR_OBJECTS_FK1 for "SYSTEM"."REPCAT$_FLAVOR_OBJECTS"
    Replication pre-import: constraint REPCAT$_FLAVOR_OBJECTS_FK1 for "SYSTEM"."REPCAT$_FLAVOR_OBJECTS" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_FLAVORS_FK1 for "SYSTEM"."REPCAT$_FLAVORS"
    Replication pre-import: constraint REPCAT$_FLAVORS_FK1 for "SYSTEM"."REPCAT$_FLAVORS" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_REPSCHEMA_DEST for "SYSTEM"."REPCAT$_REPSCHEMA"
    Replication pre-import: constraint REPCAT$_REPSCHEMA_DEST for "SYSTEM"."REPCAT$_REPSCHEMA" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_REPOBJECT_PRNT for "SYSTEM"."REPCAT$_REPOBJECT"
    Replication pre-import: constraint REPCAT$_REPOBJECT_PRNT for "SYSTEM"."REPCAT$_REPOBJECT" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_REPGEN_PRNT2 for "SYSTEM"."REPCAT$_GENERATED"
    Replication pre-import: constraint REPCAT$_REPGEN_PRNT2 for "SYSTEM"."REPCAT$_GENERATED" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_REPGEN_PRNT for "SYSTEM"."REPCAT$_GENERATED"
    Replication pre-import: constraint REPCAT$_REPGEN_PRNT for "SYSTEM"."REPCAT$_GENERATED" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_PARAMETER_COLUMN_F1 for "SYSTEM"."REPCAT$_PARAMETER_COLUMN"
    Replication pre-import: constraint REPCAT$_PARAMETER_COLUMN_F1 for "SYSTEM"."REPCAT$_PARAMETER_COLUMN" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_PRIORITY_F1 for "SYSTEM"."REPCAT$_PRIORITY"
    Replication pre-import: constraint REPCAT$_PRIORITY_F1 for "SYSTEM"."REPCAT$_PRIORITY" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_REPSCHEMA_PRNT for "SYSTEM"."REPCAT$_REPSCHEMA"
    Replication pre-import: constraint REPCAT$_REPSCHEMA_PRNT for "SYSTEM"."REPCAT$_REPSCHEMA" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_REPCOLUMN_FK for "SYSTEM"."REPCAT$_REPCOLUMN"
    Replication pre-import: constraint REPCAT$_REPCOLUMN_FK for "SYSTEM"."REPCAT$_REPCOLUMN" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_KEY_COLUMNS_PRNT for "SYSTEM"."REPCAT$_KEY_COLUMNS"
    Replication pre-import: constraint REPCAT$_KEY_COLUMNS_PRNT for "SYSTEM"."REPCAT$_KEY_COLUMNS" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_REPPROP_PRNT for "SYSTEM"."REPCAT$_REPPROP"
    Replication pre-import: constraint REPCAT$_REPPROP_PRNT for "SYSTEM"."REPCAT$_REPPROP" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_DDL_PRNT for "SYSTEM"."REPCAT$_DDL"
    Replication pre-import: constraint REPCAT$_DDL_PRNT for "SYSTEM"."REPCAT$_DDL" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_GROUPED_COLUMN_F1 for "SYSTEM"."REPCAT$_GROUPED_COLUMN"
    Replication pre-import: constraint REPCAT$_GROUPED_COLUMN_F1 for "SYSTEM"."REPCAT$_GROUPED_COLUMN" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_RESOLUTION_F1 for "SYSTEM"."REPCAT$_RESOLUTION"
    Replication pre-import: constraint REPCAT$_RESOLUTION_F1 for "SYSTEM"."REPCAT$_RESOLUTION" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_RESOLUTION_F3 for "SYSTEM"."REPCAT$_RESOLUTION"
    Replication pre-import: constraint REPCAT$_RESOLUTION_F3 for "SYSTEM"."REPCAT$_RESOLUTION" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_AUDIT_COLUMN_F2 for "SYSTEM"."REPCAT$_AUDIT_COLUMN"
    Replication pre-import: constraint REPCAT$_AUDIT_COLUMN_F2 for "SYSTEM"."REPCAT$_AUDIT_COLUMN" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_AUDIT_COLUMN_F1 for "SYSTEM"."REPCAT$_AUDIT_COLUMN"
    Replication pre-import: constraint REPCAT$_AUDIT_COLUMN_F1 for "SYSTEM"."REPCAT$_AUDIT_COLUMN" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_REPGROUP_PRIVS_FK for "SYSTEM"."REPCAT$_REPGROUP_PRIVS"
    Replication pre-import: constraint REPCAT$_REPGROUP_PRIVS_FK for "SYSTEM"."REPCAT$_REPGROUP_PRIVS" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_PARAMETER_COLUMN_PK for "SYSTEM"."REPCAT$_PARAMETER_COLUMN"
    Replication pre-import: constraint REPCAT$_PARAMETER_COLUMN_PK for "SYSTEM"."REPCAT$_PARAMETER_COLUMN" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_GROUPED_COLUMN_PK for "SYSTEM"."REPCAT$_GROUPED_COLUMN"
    Replication pre-import: constraint REPCAT$_GROUPED_COLUMN_PK for "SYSTEM"."REPCAT$_GROUPED_COLUMN" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_SITES_NEW_FK1 for "SYSTEM"."REPCAT$_SITES_NEW"
    Replication pre-import: constraint REPCAT$_SITES_NEW_FK1 for "SYSTEM"."REPCAT$_SITES_NEW" is disabled successfully
    Replication pre-import: trying to disable constraint REPCAT$_SITES_NEW_FK2 for "SYSTEM"."REPCAT$_SITES_NEW"
    Replication pre-import: constraint REPCAT$_SITES_NEW_FK2 for "SYSTEM"."REPCAT$_SITES_NEW" is disabled successfully
    Sat Jul 24 10:14:19 2010
    The value (30) of MAXTRANS parameter ignored.
    kupprdp: master process DM00 started with pid=20, OS id=1354
    to execute - SYS.KUPM$MCP.MAIN('SYS_IMPORT_TABLESPACE_01', 'SYSTEM', 'KUPC$C_1_20100724101420', 'KUPC$S_1_20100724101420', 0);
    kupprdp: worker process DW01 started with worker id=1, pid=23, OS id=1356
    to execute - SYS.KUPW$WORKER.MAIN('SYS_IMPORT_TABLESPACE_01', 'SYSTEM');
    Sat Jul 24 10:15:27 2010
    create tablespace indx_ts datafile '/oracle/ts.dbf' size 10m
    Sat Jul 24 10:15:29 2010
    Completed: create tablespace indx_ts datafile '/oracle/ts.dbf' size 10m
    The value (30) of MAXTRANS parameter ignored.
    kupprdp: master process DM00 started with pid=23, OS id=1368
    to execute - SYS.KUPM$MCP.MAIN('SYS_IMPORT_TABLESPACE_01', 'SYSTEM', 'KUPC$C_1_20100724101548', 'KUPC$S_1_20100724101548', 0);
    kupprdp: worker process DW01 started with worker id=1, pid=24, OS id=1370
    to execute - SYS.KUPW$WORKER.MAIN('SYS_IMPORT_TABLESPACE_01', 'SYSTEM');
    statement in resumable session 'SYSTEM.SYS_IMPORT_TABLESPACE_01.1' was suspended due to
    ORA-01652: unable to extend temp segment by 128 in tablespace INDX_TS
    Sat Jul 24 10:46:04 2010
    The value (30) of MAXTRANS parameter ignored.
    kupprdp: master process DM02 started with pid=17, OS id=1426
    to execute - SYS.KUPM$MCP.MAIN('SYS_EXPORT_SCHEMA_01', 'SYSTEM', 'KUPC$C_1_20100724104604', 'KUPC$S_1_20100724104604', 0);
    Sat Jul 24 10:46:46 2010
    The value (30) of MAXTRANS parameter ignored.
    kupprdp: master process DM02 started with pid=17, OS id=1432
    to execute - SYS.KUPM$MCP.MAIN('SYS_EXPORT_SCHEMA_01', 'SYSTEM', 'KUPC$C_1_20100724104647', 'KUPC$S_1_20100724104647', 0);
    Sat Jul 24 10:48:16 2010
    The value (30) of MAXTRANS parameter ignored.
    kupprdp: master process DM02 started with pid=17, OS id=1463
    to execute - SYS.KUPM$MCP.MAIN('SYS_SQL_FILE_FULL_01', 'SYSTEM', 'KUPC$C_1_20100724104816', 'KUPC$S_1_20100724104816', 0);
    kupprdp: worker process DW03 started with worker id=1, pid=18, OS id=1465
    to execute - SYS.KUPW$WORKER.MAIN('SYS_SQL_FILE_FULL_01', 'SYSTEM');
    Sat Jul 24 12:00:40 2010
    Thread 1 advanced to log sequence 2
    Current log# 2 seq# 2 mem# 0: /rmandb/rmandata/redo02.log
    Sat Jul 24 12:15:52 2010
    statement in resumable session 'SYSTEM.SYS_IMPORT_TABLESPACE_01.1' was timed out
    Sat Jul 24 13:18:59 2010
    The value (30) of MAXTRANS parameter ignored.
    kupprdp: master process DM00 started with pid=18, OS id=1841
    to execute - SYS.KUPM$MCP.MAIN('SYS_IMPORT_FULL_03', 'SYSTEM', 'KUPC$C_1_20100724131859', 'KUPC$S_1_20100724131859', 0);
    kupprdp: worker process DW01 started with worker id=1, pid=19, OS id=1843
    to execute - SYS.KUPW$WORKER.MAIN('SYS_IMPORT_FULL_03', 'SYSTEM');
    Sat Jul 24 13:19:10 2010
    CREATE UNDO TABLESPACE "UNDOTBS1" DATAFILE '/oracle/product/dbn/undotbs01.dbf' SIZE 26214400 AUTOEXTEND ON NEXT 5242880 MAXSIZE 32767M BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE
    ORA-1543 signalled during: CREATE UNDO TABLESPACE "UNDOTBS1" DATAFILE '/oracle/product/dbn/undotbs01.dbf' SIZE 26214400 AUTOEXTEND ON NEXT 5242880 MAXSIZE 32767M BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE...
    Sat Jul 24 13:19:10 2010
    CREATE TABLESPACE "SYSAUX" DATAFILE '/oracle/product/dbn/sysaux01.dbf' SIZE 125829120 AUTOEXTEND ON NEXT 10485760 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    ORA-1543 signalled during: CREATE TABLESPACE "SYSAUX" DATAFILE '/oracle/product/dbn/sysaux01.dbf' SIZE 125829120 AUTOEXTEND ON NEXT 10485760 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO...
    Sat Jul 24 13:19:10 2010
    CREATE TEMPORARY TABLESPACE "TEMP" TEMPFILE '/oracle/product/dbn/temp01.dbf' SIZE 31457280 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576
    ORA-1543 signalled during: CREATE TEMPORARY TABLESPACE "TEMP" TEMPFILE '/oracle/product/dbn/temp01.dbf' SIZE 31457280 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576
    Sat Jul 24 13:19:10 2010
    CREATE TABLESPACE "USERTB" DATAFILE '/oracle/product/dbn/users01.dbf' SIZE 5242880 AUTOEXTEND ON NEXT 1310720 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    ORA-1119 signalled during: CREATE TABLESPACE "USERTB" DATAFILE '/oracle/product/dbn/users01.dbf' SIZE 5242880 AUTOEXTEND ON NEXT 1310720 MAXSIZE 32767M LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO...
    Sat Jul 24 13:19:10 2010
    CREATE TABLESPACE "EXAMPLE" DATAFILE '/oracle/product/dbn/example01.dbf' SIZE 104857600 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M NOLOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    ORA-1119 signalled during: CREATE TABLESPACE "EXAMPLE" DATAFILE '/oracle/product/dbn/example01.dbf' SIZE 104857600 AUTOEXTEND ON NEXT 655360 MAXSIZE 32767M NOLOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    Sat Jul 24 13:19:10 2010
    CREATE TABLESPACE "NEWTABLESPACE" DATAFILE '/oracle/product/newtablespace.dbf' SIZE 10485760 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576 SEGMENT SPACE MANAGEMENT AUTO
    ORA-1119 signalled during: CREATE TABLESPACE "NEWTABLESPACE" DATAFILE '/oracle/product/newtablespace.dbf' SIZE 10485760 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1048576 SEGMENT SPACE MANAGEMENT AUTO
    Sat Jul 24 13:19:10 2010
    CREATE TABLESPACE "T16KTAB" DATAFILE '/oracle/16k.dbf' SIZE 3145728 LOGGING ONLINE PERMANENT BLOCKSIZE 16384 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    ORA-29339 signalled during: CREATE TABLESPACE "T16KTAB" DATAFILE '/oracle/16k.dbf' SIZE 3145728 LOGGING ONLINE PERMANENT BLOCKSIZE 16384 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    Sat Jul 24 13:19:10 2010
    CREATE TABLESPACE "INDX_TS" DATAFILE '/oracle/indxts.dbf' SIZE 26214400,'/oracle/secondf.dbf' SIZE 26214400 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    ORA-1543 signalled during: CREATE TABLESPACE "INDX_TS" DATAFILE '/oracle/indxts.dbf' SIZE 26214400,'/oracle/secondf.dbf' SIZE 26214400 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    ...
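    Reading this log: the ORA-1543 errors (tablespace already exists) for UNDOTBS1, SYSAUX and TEMP are expected and harmless, the ORA-1119 errors mean the source datafile paths do not exist on this server, and ORA-29339 means no buffer cache is configured for the 16K block size. A hedged sketch of preparing the target before rerunning the import; all paths and sizes below are assumptions to be adapted to the local layout:
    -- configure a cache for the 16K-block tablespace (SCOPE=BOTH assumes an spfile is in use)
    ALTER SYSTEM SET db_16k_cache_size = 32M SCOPE=BOTH;
    -- pre-create the application tablespaces with paths that exist on this server,
    -- so the import skips its own CREATE TABLESPACE statements and just loads the data
    CREATE TABLESPACE usertb        DATAFILE '/u01/oradata/targetdb/users01.dbf'         SIZE 100M AUTOEXTEND ON;
    CREATE TABLESPACE example       DATAFILE '/u01/oradata/targetdb/example01.dbf'       SIZE 100M AUTOEXTEND ON;
    CREATE TABLESPACE newtablespace DATAFILE '/u01/oradata/targetdb/newtablespace01.dbf' SIZE 100M;
    CREATE TABLESPACE t16ktab       DATAFILE '/u01/oradata/targetdb/t16ktab01.dbf'       SIZE 100M BLOCKSIZE 16K;
    CREATE TABLESPACE indx_ts       DATAFILE '/u01/oradata/targetdb/indx_ts01.dbf'       SIZE 100M AUTOEXTEND ON;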

  • What is the best way to import a full database?

    Hello,
    Can anyone tell me what is the best way to import a full database called TEST into an existing database called DEV1?
    When importing into an existing database, do you have to drop the users? Say the pinfo and tinfo schemas are there. Do I have to drop these and recreate them, or how does it work when you import a full database?
    Could you please give step-by-step instructions?
    Thanks a lot...

    Nayab,
    http://youngcow.net/doc/oracle10g/backup.102/b14191/rcmdupdb005.htm
    A suggestion: please don't use external sites that host Oracle docs, since there is no assurance that they update their content with the latest corrections. You can see the updated part number on the actual Oracle documentation site:
    http://download.oracle.com/docs/cd/B19306_01/backup.102/b14191/rcmdupdb.htm#i1009381
    Aman....
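    As a rough outline (a sketch only, assuming the dump was taken with expdp and that DATA_PUMP_DIR and the file name below stand in for the real ones), a refresh of DEV1 usually drops just the application schemas and then re-imports them, leaving Oracle-maintained schemas alone:
    -- in DEV1, drop only your application schemas (names are placeholders)
    DROP USER pinfo CASCADE;
    DROP USER tinfo CASCADE;
    -- then, from the OS shell:
    impdp system directory=DATA_PUMP_DIR dumpfile=test_full.dmp full=y logfile=test_imp.log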

  • Import mode FULL conflicts with export mode TRANSPORTABLE

    Hi,
    We are trying to import transportable tablespaces and are getting the error "import mode FULL conflicts with export mode TRANSPORTABLE". This is the second time we've seen it; the common thread seems to be that it only shows up with the remap_schema qualifier (the example below is with ASM, but we have seen it without ASM as well). Does anyone have any suggestions as to how this can be resolved?
    Thanks,
    Dan
    impdp SYSTEM/xxxxx@DEVSUP DIRECTORY=CSF_DIR DUMPFILE=csf_au_q412_b1.dmp logfile=dev_imp.log TRANSPORT_DATAFILES='+DATA/PMDEVSUP/DATAFILE/CSF_AU_Q412_B1.1130.804322725','+DATA/PMDEVSUP/DATAFILE/CSF_AU_Q412_B1.631.804322681','+DATA/PMDEVSUP/DATAFILE/CSF_AU_Q412_B1.798.804322635','+DATA/PMDEVSUP/DATAFILE/CSF_AU_Q412_B1.968.804322581','+DATA/PMDEVSUP/DATAFILE/CSF_AU_Q412_B1.1261.804322791','+DATA/PMDEVSUP/DATAFILE/CSF_AU_Q412_B1.1727.804322951','+DATA/PMDEVSUP/DATAFILE/CSF_AU_Q412_B1.309.804322865','+DATA/PMDEVSUP/DATAFILE/CSF_AU_Q412_B1.684.804323047','+DATA/PMDEVSUP/DATAFILE/CSF_AU_Q412_B1.752.804323141','+DATA/PMDEVSUP/DATAFILE/CSF_AU_Q412_B1.843.804323247' remap_schema=CSF_AU_Q412_B1:CSF
    ORA-39002: invalid operation
    ORA-39061: import mode FULL conflicts with export mode TRANSPORTABLE

    We are trying to import transportable tablespaces and are getting the error "import mode FULL conflicts with export mode TRANSPORTABLE":
    impdp SYSTEM/****@ESUST DIRECTORY=CSF_DIR DUMPFILE=csf_na_q212_b1.dmp
    TRANSPORT_DATAFILES='+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1907.805396709','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1908.805396709','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1909.805396653','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1910.805396647','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1911.805396591','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1912.805396591','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1913.805396535','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1914.805396535','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1915.805396479','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1916.805396479','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1917.805396425','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1918.805396425','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1919.805396369','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1920.805396365','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1921.805396321','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1932.805396319','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1933.805396263','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1934.805396261','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1935.805396205','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1936.805396201','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1937.805396137,+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1938.805396137,+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1939.805396081','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1954.805396079','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1958.805396013,+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1959.805396013,+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1960.805395937,+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1961.805395923','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1962.805395867','+DATA/PMESUST/DATAFILE/CSF_NA_Q212_B1.1963.805395867' remap_schema=CSF_NA_Q212_B1:CSF
    ORA-39002: invalid operation
    ORA-39061: import mode FULL conflicts with export mode TRANSPORTABLE

  • User not available means schema not existed in the Database?

    Hi Team,
    I am trying to add a primary key to an existing table. The requester mentioned the schema_name and table_name.
    I used the query below to find out whether the schema exists or not:
    select USERNAME,ACCOUNT_STATUS from dba_users where username='ABCD_USA_NUM';
    no rows selected.
    Please suggest: if the user does not exist, does that mean the schema does not exist? This is a 4-node RAC database.
    Thanks

    Regarding "if the user does not exist, does that mean the schema does not exist?": in your case, yes.
    Rgds,
    Ahmer
    N.B.: To earn a good reputation on forums, and if you want your questions answered in a timely manner, kindly mark your questions as closed as soon as you get the answer, and be courteous to the people who are trying to help.
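    In addition to dba_users, a couple of quick checks (a sketch; the owner name is taken from the post) can confirm whether anything exists under that name, for example under a slightly different spelling:
    -- does an account exist under a similar name?
    SELECT username FROM dba_users WHERE username LIKE 'ABCD%';
    -- are there any objects owned by that exact name?
    SELECT COUNT(*) FROM dba_objects WHERE owner = 'ABCD_USA_NUM';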

  • I downloaded an album, but some of the songs do not play the full song. What do I do?

    I have downloaded an album off iTunes. Some of the songs have not fully downloaded, but it says that the whole song has downloaded, yet when I play them some of them only play 20 seconds of the song. What should I do?

    How to report an issue with Your iTunes Store purchase

  • SYS, SYSTEM and SYSAUX when full database refresh.

    I took a full export from the database using the command below:
    expdp "'/ as sysdba'" full=Y directory=DPUMP_DIR dumpfile=expdp_11032011.dmp logfile=expdp_11032011.log
    Now I need to import this file into another database.
    When doing a schema refresh we usually drop all the objects in that schema and then start the refresh, but when doing a full refresh, do we need to drop all the users?
    What about SYS, SYSTEM and SYSAUX?

    user3636719 wrote:
    So, the tables in SYS and SYSTEM will remain the same when we refresh?
    Their structure will not be modified, but their contents will be modified automatically as DDL is executed during the import.
    And do we have to drop the other users before we import?
    Application schemas that you have created should be dropped. In general, don't modify any schema that is directly managed by Oracle, such as SYS or SYSTEM, or any schema used by a database option like Oracle Text, Oracle Spatial, etc.
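    A hedged sketch of that pre-import cleanup, dropping only application schemas before the full import (the schema names are placeholders; never drop Oracle-maintained schemas):
    -- drop only your own application schemas on the target
    DROP USER app_owner CASCADE;
    DROP USER reporting CASCADE;
    -- then, from the OS shell, run the full import
    impdp "'/ as sysdba'" full=Y directory=DPUMP_DIR dumpfile=expdp_11032011.dmp logfile=impdp_11032011.log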

  • Full database import hangs

    System:
    Windows 2000 SP4
    Oracle 8.1.7
    Hi,
    I am importing a full backup (export dump file). It was taken from a database having the same structure as the database into which I am importing.
    It is taking too long to import the data of one particular table and seems to hang for hours. It says 40542226 rows imported and appears to hang from there.
    I checked the v$session to identify the session serving the import and
    v$session_wait to verify the wait events for that session. But I don't see
    any wait events for this session. What I see is the session itself.
    I did the import using:
    imp username/password@sid file='D:\data\fullbkp.dmp' log='D:\log\fullbkpimp.log' full=y
    My question is: what other diagnostics should I perform to help find the cause, and possibly the solution, of this import hang?

    Hi,
    In addition, when I want to see what is going on, I usually execute the SQL below in order to monitor the import activity:
    select substr(sql_text,instr(sql_text,'INTO "'),30) table_name,
           rows_processed,
           round((sysdate-to_date(first_load_time,'yyyy-mm-dd hh24:mi:ss'))*24*60,1) minutes,
           trunc(rows_processed/((sysdate-to_date(first_load_time,'yyyy-mm-dd hh24:mi:ss'))*24*60)) rows_per_min
    from sys.v_$sqlarea
    where sql_text like 'INSERT %INTO "%'
      and command_type = 2 and open_versions > 0;
    Cheers
    Legatti
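    Another quick check (a sketch using standard dynamic performance views; &import_sid stands for the SID of the import session) is to look for long-running operations and for the statement currently executing, since a classic reason for imp to appear stuck after "n rows imported" is that it is busy building indexes on the large table:
    -- long-running operations, e.g. index builds or full scans
    select sid, opname, sofar, totalwork, time_remaining
    from v$session_longops
    where totalwork > sofar;
    -- what the import session is currently executing
    select s.sid, q.sql_text
    from v$session s, v$sqlarea q
    where q.address = s.sql_address
      and s.sid = &import_sid;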

  • Importing a full database

    Hi,
    I want to transfer a database from computer A to computer B. On computer B I created the database manually. I imported the database once with full=y and errors appeared; the users were created but the tables were left without data. If I run the imp command again with the full=y and ignore=y options, will everything be OK? Will what I import in the new import overwrite what was loaded by the old import?
    Thank you,
    Mihaela

    If the table already exists you will get an error and, with ignore=y, the import will continue. The CREATE TABLE statement will not be executed. It means that if your table structures are identical you can import your data (rows=y) without any problems. Existing rows will not be overwritten! If you have a different table structure you must first drop the table. If the table already contains rows and you want only those from your import, you must first truncate your tables.
    Bye, Aron
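    A hedged sketch of that second pass, assuming the first run created the users and empty tables but loaded no rows (the dump file name is a placeholder):
    -- optionally truncate any partially loaded tables first, per table, in SQL*Plus:
    --   TRUNCATE TABLE <owner>.<table_name>;
    -- then re-run the import; existing definitions are skipped and the rows are loaded
    imp system file=fullbkp.dmp log=fullbkp_imp2.log full=y ignore=y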

  • Full database import issue

    Hi everybody,
    Our database version is 9.2.0.6 running on Solaris, and one of our databases got fragmented. To remove the fragmentation I exported the full database and tried to import the dump file back into the same database (the one I exported from). If I do that, the records in the tables get duplicated. Can someone help me out with this issue? Thanks in advance.
    Regards
    Hamid

    Why do you import back into the same database (where you feel the database is fragmented)? Instead, prepare a new database with the same settings as the fragmented database, i.e. database name, character set, tablespaces (adjusting the datafile locations if they are different), etc. Then import your dump into the newly created database.
    By the way, is this your production database?
    Regards,
    Sabdar Syed.
