10g to 11gR2 Upgrade using Data Pump Import

Hi,
I am intending to move a 10g database from one Windows server to another. However, there is also a requirement to upgrade this database to 11gR2, so I was going to combine the two in one move by -
1. take a full data pump export of the source 10g database
2. create a new empty 11g database on the target environment
3. import the dump file into the target database
However I have a couple of queries running over in my mind about this approach -
Q1. What happens with the SYSTEM and SYS objects from SYSTEM and SYSAUX during the import ? Given that I will have in effect already created a new dictionary on the empty target database - will any import of SYSTEM or SYS simply produce error messages which should be ignored ?
Q2. should I use EXCLUDE on SYS and SYSTEM ( is this EXCLUDE better on the export or import side ) ?
Q3. what happens if there are things such as scheduled jobs etc on the source system - since these would be stored in SYSTEM owned tables, how would I bring these across to the new target 11g database ?
thanks,
Jim

Jim Thompson wrote:
Hi,
I am intending to move a 10g database from one Windows server to another. However, there is also a requirement to upgrade this database to 11gR2, so I was going to combine the two in one move by -
1. take a full data pump export of the source 10g database
2. create a new empty 11g database on the target environment
3. import the dump file into the target database
However I have a couple of queries running over in my mind about this approach -
Q1. What happens with the SYSTEM and SYS objects from SYSTEM and SYSAUX during the import ? Given that I will have in effect already created a new dictionary on the empty target database - will any import of SYSTEM or SYS simply produce error messages which should be ignored ?
You won't get errors related to the SYSTEM or SYSAUX tablespaces because these won't be exported at all. Schemas like SYS, CTXSYS, MDSYS and ORDSYS are never exported using Data Pump. That's why Oracle recommends not creating any objects under the SYS or SYSTEM schemas.
Q2. should I use EXCLUDE on SYS and SYSTEM ( is this EXCLUDE better on the export or import side ) ?
Not required; as the data dictionary schemas won't be exported anyway, specifying EXCLUDE won't do anything.
Q3. what happens if there are things such as scheduled jobs etc on the source system - since these would be stored in SYSTEM owned tables, how would I bring these across to the new target 11g database ?
The DDL will get exported and imported into the new database. For example, if you have schema A and you defined a job whose owner is A, then that job definition also gets imported. For user-defined jobs there are views like USER_SCHEDULER_JOBS etc., so don't worry about the user jobs - these will be created.
Also see
MOS note - Schemas CTXSYS, MDSYS and ORDSYS are Not Exported [ID 228482.1]
Export system or sys schema
I also ran the following test in my database, which shows I cannot export objects under the SYS schema:
SQL> show user
USER is "SYS"
SQL> create table pump (id number);
Table created.
SQL> insert into pump values (1);
1 row created.
SQL> insert into pump values (2);
1 row created.
SQL> exit
Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
C:\Documents and Settings\rnagi\My Documents\Ranjit Doc\Performance\SQL>expdp tables=sys.pump logfile=test.log
Export: Release 11.2.0.1.0 - Production on Mon Feb 27 18:11:29 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
Username: / as sysdba
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYS"."SYS_EXPORT_TABLE_01":  /******** AS SYSDBA tables=sys.pump logfi
le=test.log
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 0 KB
ORA-39166: Object SYS.PUMP was not found.
ORA-31655: no data or metadata objects selected for job
Job "SYS"."SYS_EXPORT_TABLE_01" completed with 2 error(s) at 18:11:57

Similar Messages

  • Re: 10g to 11gR2 Upgrade using Data Pump Import

    Hi Srini,
    I read the following in the export/import documentation:
    "Grants on objects owned by the SYS schema are never exported."
    If this is the case, how do we import the grants into the target database, as we see many invalid objects due to missing grants?
    For example, I'm using expdp to export 11gr1 database from solaris and importing in 11gr2 database in RHEL.
    In the target I see many invalid objects, and when I try to compile them manually they throw an "insufficient privileges" error, which comes down to missing grants from the CTXSYS schema.
    How do we go about this?

    I branched this for you.
    The answer is not with Data Pump. Neither Data Pump (expdp) nor the old export utility (exp) supports exporting grants on objects owned by the Oracle-maintained schemas. You would have to find these grants yourself and then reapply them on the target database.
    Probably not the answer you wanted, but with Data Pump, anything that is currently owned by the Oracle schemas is not exportable.
    Dean
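    If it helps, one way to rebuild them is to generate the GRANT statements from the source data dictionary. A minimal sketch, assuming the application schema is called APP_USER (a placeholder):
    select 'grant ' || privilege || ' on ' || owner || '.' || table_name || ' to ' || grantee || ';'
    from dba_tab_privs
    where owner in ('SYS', 'CTXSYS') and grantee = 'APP_USER';
    Spool the output, run it on the target as a privileged user, then recompile the invalid objects.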

  • Database Upgrade using Data Pump

    Hi,
    I am moving my database from a Windows 2003 server to a Windows 2007 server. At the same time I am upgrading this database from 10g to 11gR2(11.2.0.3).
    Therefore I am using the export/import method of upgrade (via Data Pump, not the old exp/imp).
    I have successfully exported my source database and have created the empty shell database ready to take the import. However I have a couple of queries
    Q1. regarding all the SYSTEM objects from the source database. How will they import given that the new target database already has a SYSTEM tablespace
    I am guessing I need to use the TABLE_EXISTS_ACTION option for the import. However should I set this to APPEND, SKIP, REPLACE or TRUNCATE - which is best ?
    Q2. I am planning to slightly change the directory structure on the new database server - would it therefore be better to pre-create the tablespaces or leave this to the import but use the REMAP_DATAFILE option - what is everyone's experience as to which is the better way to go ? Again, if I pre-create the tablespaces, how do I inform the import to ignore the creation of the tablespaces ?
    Q3. These 2 databases are on the same network, so in theory, instead of a manual export, copy of the dump file to the new server and then the import, I could use a Network Link for Import. I was just wondering whether there are any cons of this method over using the explicit export dump file ?
    thanks,
    Jim

    Jim,
    Q1. regarding all the SYSTEM objects from the source database. How will they import given that the new target database already has a SYSTEM tablespace
    I am guessing I need to use the TABLE_EXISTS_ACTION option for the import. However should I set this to APPEND, SKIP, REPLACE or TRUNCATE - which is best ?
    If all you have is the base database and nothing created, then you can do the full=y. In fact, this is probably what you want. The SYSTEM tablespace will be there, so when Data Pump tries to create it, it will just fail that create statement. Nothing else will fail. In most cases, your system tables will already be there, and this is OK too. If you do schema mode imports, you will miss out on some of the other stuff.
    Q2. I am planning to slightly change the directory structure on the new database server - would it therefore be better to pre-create the tablespaces or leave this to the import but use the REMAP_DATAFILE option - what is everyone's experience as to which is the better way to go ? Again, if I pre-create the tablespaces, how do I inform the import to ignore the creation of the tablespaces ?
    If the directory structure is different (which it usually is) then there is no easier way. You can run impdp with sqlfile and say - include=tablespace. This will give you all of the create tablespace commands in a text file and you can edit the text file to change whatever you want to change. You can tell Data Pump to skip the tablespace creation by using - exclude=tablespace
    Q3. These 2 databases are on the same network, so in theory, instead of a manual export, copy of the dump file to the new server and then the import, I could use a Network Link for Import. I was just wondering whether there are any cons of this method over using the explicit export dump file ?
    The only con could be if you have a slow network. This will make it slower, but if you have to copy the dumpfile over the same network, then you will still see the same basic traffic. The pros are that you don't have to have extra disk space. Here is how I look at it.
    1. you need XX GB for the source database
    2. you need YY GB for the source dumpfile
    3. you need YY GB for the target dumpfile that you copy
    4. you need XX GB for the target database.
    By doing a network import you get rid of 2*YY GB for the dumpfiles.
    Dean
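    To make Q2 concrete, a sketch of the sqlfile/exclude approach described above (directory object and file names are placeholders):
    impdp system/password directory=dp_dir dumpfile=full.dmp sqlfile=create_tbs.sql include=tablespace
    (edit create_tbs.sql to point the datafiles at the new directory structure, run it in SQL*Plus, then)
    impdp system/password directory=dp_dir dumpfile=full.dmp full=y exclude=tablespace logfile=full_imp.log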

  • How to consolidate data files using data pump when migrating 10g to 11g?

    We have one 10.2.0.4 database to be migrated to a new box running 11.2.0.1. The 10g database has too many data files scattered across too many file systems. I'd like to consolidate the data files into one or two large chunks in one file system. Both OSs are RHEL 5. How should I do that using Data Pump Export/Import? I know there is a "REMAP_DATAFILE" option that could be used, but it's only a one-to-one mapping. How can I map multiple old data files into one new data file?

    hi
    Data Pump is terribly slow; make sure you have as much memory as possible allocated to Oracle, but the bottleneck can be I/O throughput.
    Use PARALLEL option, set also these ones:
    * DISK_ASYNCH_IO=TRUE
    * DB_BLOCK_CHECKING=FALSE
    * DB_BLOCK_CHECKSUM=FALSE
    set high enough to allow for maximum parallelism:
    * PROCESSES
    * SESSIONS
    * PARALLEL_MAX_SERVERS
    more:
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/dp_perf.htm
    that's it, patience welcome ;-)
    P.S.
    For maximum throughput, do not set PARALLEL to much more than twice the number of CPUs (two workers for each CPU).
    Edited by: g777 on 2011-02-02 09:53
    P.S.2
    breaking news ;-)
    I am playing now with storage performance and I turned the disk cache option (also called write-back cache) ON (it goes at least along with RAID 0 and 5, and with it set you don't lose any data on that volume) - and it gave me a 1.5 to 2 times speed-up!
    Some say there's a risk of losing more data when an outage happens, but there's always such a risk, even if you can lose less. Anyway, if you can afford it (and with an import it's OK, as it is not production at that moment) - I recommend trying it. It takes 15 minutes, but you can gain 2.5 hours out of 10 hours of normal importing.
    Edited by: g777 on 2011-02-02 14:52
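    A sketch of the PARALLEL usage mentioned above, with a %U wildcard so each worker writes and reads its own dumpfile (names are placeholders):
    expdp system/password directory=dp_dir full=y parallel=4 dumpfile=full_%U.dmp logfile=full_exp.log
    impdp system/password directory=dp_dir full=y parallel=4 dumpfile=full_%U.dmp logfile=full_imp.log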

  • Migration from 10g to 12c using data pump

    hi there, while I've used data pump at the schema level before, I'm rather new at full database imports.
    we are attempting a full database migration from 10.2.0.4 to 12c using the full database data pump method over db link.
    The DBA has advised that we avoid moving SYSTEM and SYSAUX objects, but initially when reviewing the documentation it appeared that these objects would not be exported from the source system given TRANSPORTABLE=NEVER. Can someone confirm this? The export/import log refers to objects that I believed would not be targeted:
    23-FEB-15 19:41:11.684:
    Estimated 3718 TABLE_DATA objects in 77 seconds
    23-FEB-15 19:41:12.450: Total estimation using BLOCKS method: 52.93 GB
    23-FEB-15 19:41:14.058: Processing object type DATABASE_EXPORT/TABLESPACE
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"UNDOTBS1" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"SYSAUX" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"TEMP" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"USERS" already exists
    23-FEB-15 20:10:33.200:
    Completed 96 TABLESPACE objects in 1759 seconds
    23-FEB-15 20:10:33.208: Processing object type DATABASE_EXPORT/PROFILE
    23-FEB-15 20:10:33.445:
    Completed 7 PROFILE objects in 1 seconds
    23-FEB-15 20:10:33.453: Processing object type DATABASE_EXPORT/SYS_USER/USER
    23-FEB-15 20:10:33.842:
    Completed 1 USER objects in 0 seconds
    23-FEB-15 20:10:33.852: Processing object type DATABASE_EXPORT/SCHEMA/USER
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OUTLN" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"ANONYMOUS" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OLAPSYS" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"MDDATA" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"SCOTT" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"LLTEST" already exists
    23-FEB-15 20:10:52.372:
    Completed 1140 USER objects in 19 seconds
    23-FEB-15 20:10:52.375: Processing object type DATABASE_EXPORT/ROLE
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"SELECT_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"EXECUTE_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"DELETE_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.256: ORA-31684: Object type ROLE:"RECOVERY_CATALOG_OWNER" already exists
    any insight most appreciated.

    Schemas SYS, CTXSYS, MDSYS and ORDSYS are Not Exported using exp/expdp
    Doc ID: Note 228482.1
    I suppose the DBA has already installed the 12c software and created a database, it seems - so when you imported, you got these "already exists" messages.
    Whenever the database is created and the software installed, SYSTEM, SYS and SYSAUX will be created by default.

  • Select table when import using Data Pump API

    Hi,
    Sorry for the trivial question. I exported the data using the Data Pump API, in "TABLE" mode.
    So all tables will be exported in one .dmp file.
    My question is: how do I then import only a few tables using the Data Pump API? How do I define the "TABLES" property like in the command-line interface?
    Should I use the DATA_FILTER procedures? If yes, how do I do that?
    Really thanks in advance
    Regards,
    Kahlil

    Hi,
    You should use the METADATA_FILTER procedure for this.
    eg:
    dbms_datapump.metadata_filter
                (handle1
                 ,'NAME_EXPR'
                 ,'IN (''TABLE1'', ''TABLE2'')');
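    For completeness, a minimal sketch of a table-mode import built around that filter (the directory object, dump file and table names are placeholders):
    DECLARE
      h  NUMBER;
      st VARCHAR2(30);
    BEGIN
      -- open a TABLE-mode import job
      h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'TABLE');
      -- point the job at the dump file in the chosen directory object
      DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'mydump.dmp', directory => 'DPUMP_DIR',
                             filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      -- restrict the job to the two tables of interest
      DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'NAME_EXPR',
                                    value => 'IN (''TABLE1'', ''TABLE2'')');
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.WAIT_FOR_JOB(h, st);
    END;
    /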
    Regards
    Anurag                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                       

  • Data Pump import to a sql file error :ORA-31655 no data or metadata objects

    Hello,
    I'm using Data Pump to export/import data; one requirement is to import data to a SQL file. The OS is Windows.
    I made the follow export :
    expdp system/password directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY tables=user.tablename
    and it works, I can see the file TABLESDUMP.DMP in the directory path.
    then when I tried to import it to a sql file:
    impdp system/password directory=dpump_dir dumpfile=tablesdump.dmp sqlfile=tables_export.sql
    the log show :
    ORA-31655 no data or metadata objects selected for job
    and the sql file is created empty in the directory path.
    I'm not DBA, I'm a Java developer , Can you help me?
    Thks

    Hi, I added the command line :
    expdp system/system directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY schemas=ko1 tables=KO1QT01 logfile=capture.log
    The log on the console screen is below (it is in Spanish); no log file was created in the directory path.
    Export: Release 10.2.0.1.0 - Production on Martes, 26 Enero, 2010 12:59:14
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Conectado a: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    UDE-00010: se han solicitado varios modos de trabajo, schema y tables.
    (English error:)
    UDE-00010: multiple job modes requested, schema and tables.
    This is why I used tables=user.tablename instead, is this right ?
    Thks

  • Error Data Pump Import

    Hi All,
    I am having errors below when trying to use data pump to import one table from the dump file (from a pre-prod db) into another database (prod_db) same version. I used the expdp to generate the dump file.
    Getting errors -
    ORA-39083
    ORA-00959
    ORA-39112
    Any suggestions and/or advice would be appreciated.
    Thank you
    Import: Release 10.2.0.1.0 - 64bit Production
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    Master table "SYSTEM"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_TABLE_01": system/******** DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    ORA-39083: Object type TABLE failed to create with error:
    ORA-00959: tablespace 'OXFORD_DATA_01' does not exist
    Failing sql is:
    CREATE TABLE "Mxxxx"."CLAIMS" ("CLAIM_NUM" VARCHAR2(25) NOT NULL ENABLE, "LINE_NUM" NUMBER(3,0) NOT NULL ENABLE, "PAID_FLAG" CHAR(1), "ADJ_FLAG" CHAR(1), "CLAIM_TYPE" VARCHAR2(20), "MEM_ID" VARCHAR2(20), "MEM_SUF" VARCHAR2(20), "MEM_BEG_DATE" DATE, "MEM_REL" CHAR(2), "MEM_NAME" VARCHAR2(40), "MEM_DOB" DATE, "MEM_SEX" CHAR(1), "REC_DATE" DATE, "PAID_DATE" DATE, "FROM_DATE" DATE, "
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    ORA-39112: Dependent object type INDEX:"Mxxxx"."CLAIMS_IDX1" skipped, base object type TABLE:"Mxxxx"."CLAIMS" creation failed
    ORA-39112: Dependent object type INDEX:"Mxxxx"."CLAIMS_PK" skipped, base object type TABLE:"Mxxxx"."CLAIMS" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_IDX1" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_PK" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"Mxxxx"."CLAIMS" creation failed
    Job "SYSTEM"."SYS_IMPORT_TABLE_01" completed with 6 error(s) at 13:49:33
    impdp username/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS

    Ok,
    Thank you for the advice-
    So I did use the REMAP_TABLESPACE option and it did import and load the data, but should I be concerned about the 4 errors? Looks like I have to rebuild the indexes, which is fine.
    Master table "Mxxxx"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
    Starting "Mxxxx"."SYS_IMPORT_TABLE_01": Mxxxx/******** DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS REMAP_TABLESPACE=OXFORD_DATA_01:Mxxxx_DATA_01
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "Mxxxx"."CLAIMS" 605.3 MB 1715554 rows
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    ORA-39083: Object type INDEX failed to create with error:
    ORA-00959: tablespace 'USER_INDEX_01' does not exist
    Failing sql is:
    CREATE INDEX "Mxxxx"."CLAIMS_IDX1" ON "Mxxxx"."CLAIMS" ("MEM_ID") PCTFREE 10 INITRANS 2 MAXTRANS 255 NOLOGGING STORAGE(INITIAL 8388608 NEXT 8388608 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "USER_INDEX_01" PARALLEL 1
    ORA-39083: Object type INDEX failed to create with error:
    ORA-00959: tablespace 'USER_INDEX_01' does not exist
    Failing sql is:
    CREATE UNIQUE INDEX "Mxxxx"."CLAIMS_PK" ON "Mxxxx"."CLAIMS" ("CLAIM_NUM", "LINE_NUM") PCTFREE 10 INITRANS 2 MAXTRANS 255 NOLOGGING STORAGE(INITIAL 8388608 NEXT 8388608 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "USER_INDEX_01" PARALLEL 1
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_IDX1" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_PK" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "Mxxxx"."SYS_IMPORT_TABLE_01" completed with 4 error(s) at 16:57:12

  • Using  Data Pump when database is read-only

    Hello
    I used flashback and returned my database to a past point in time, then I opened the database read only.
    Then I wanted to use Data Pump (expdp) to export a schema, but I encountered this error:
    ORA-31626: job does not exist
    ORA-31633: unable to create master table "SYS.SYS_EXPORT_SCHEMA_05"
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT", line 863
    ORA-16000: database open for read-only access
    But I could export that schema with exp.
    My question is: can't I use Data Pump while the database is read only? Or do you know of any resolution for the issue?
    thanks

    You need to use NETWORK_LINK, so the required tables are created in a read/write database and the data is read from the read only database using a database link:
    SYSTEM@db_rw> create database link db_r_only
      2   connect to system identified by oracle using 'db_r_only';
    $ expdp system/oracle@db_rw network_link=db_r_only directory=data_pump_dir schemas=scott dumpfile=scott.dmp
    But I tried it with 10.2.0.4 and found an error:
    Export: Release 10.2.0.4.0 - Production on Thursday, 27 November, 2008 9:26:31
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-39006: internal error
    ORA-39065: unexpected master process exception in DISPATCH
    ORA-02054: transaction 1.36.340 in-doubt
    ORA-16000: database open for read-only access
    ORA-02063: preceding line from DB_R_ONLY
    ORA-39097: Data Pump job encountered unexpected error -2054
    I found bug 7331929 in Metalink, which is fixed in 11.2. I haven't tested this procedure with prior versions or with 11g, so I don't know if this bug affects only 10.2.0.4 or all 10g and 11.1 releases.
    HTH
    Enrique
    PS. If your problem was solved, consider marking the question as answered.

  • What are the 'gotcha' for exporting using Data Pump(10205) from HPUX to Win

    Hello,
    I have to export a schema using Data Pump from 10.2.0.5 on HPUX 64-bit to a Windows 64-bit database of the same 10.2.0.5 version. What 'gotchas' can I expect from doing this? I mean, Data Pump export is cross-platform, so this sounds straightforward. But are there issues I might face exporting with Data Pump on the HPUX platform and then importing the dump onto the Windows 2008 platform, same database version 10.2.0.5? Thank you in advance.

    On the HPUX database, run this statement and look for the value for NLS_CHARACTERSET
    SQL> select * from NLS_DATABASE_PARAMETERS;
    http://docs.oracle.com/cd/B19306_01/server.102/b14237/statviews_4218.htm#sthref2018
    When creating the database on Windows, you have two options - manually create the database or use DBCA. If you plan to create the database manually, specify the database characterset in the CREATE DATABASE statement - http://docs.oracle.com/cd/B19306_01/server.102/b14200/statements_5004.htm#SQLRF01204
    If using DBCA, see http://docs.oracle.com/cd/B19306_01/server.102/b14196/install.htm#ADMQS0021 (especially http://docs.oracle.com/cd/B19306_01/server.102/b14196/install.htm#BABJBDIF)
    HTH
    Srini

  • How to export resource manager consumer groups using Data Pump?

    Hi, there,
    Is there any way to export RM Consumer Groups/Mappings/Plans as part of a Data Pump export/import? I was wondering because I don't fancy doing it manually and I don't see the object in the database_export_objects view. I can create them manually, but was wondering whether there's an easier, less involved way of doing it?
    Mark

    Hi,
    I have not tested it, but I think a full database export/import (using Data Pump or traditional exp/imp) may help here (although a full export/import might not be feasible for you), because full database mode exports/imports SYS schema objects also, so there is a chance that it will also import the resource consumer groups and resource plans.
    Salman

  • Exporting whole database (10GB) using Data Pump export utility

    Hi,
    I have a requirement to export the whole database (10GB) using the Data Pump export utility, because it is not possible to send a single 10GB dump on a CD/DVD to the system vendor of our application (to analyze a few issues we have).
    Now, when I checked online, a full export is available, but I am not able to understand how it works, as we have never used the Data Pump utility; we use the normal export method. Also, will Data Pump reduce the size of the dump file so it can fit on a DVD, or can we use a parallel full database export to split the files and include them on DVDs? Is that possible?
    Please correct me if i am wrong and kindly help.
    Thanks for your help in advance.

    You need to create a directory object.
    sqlplus user/password
    create directory foo as '/path_here';
    grant all on directory foo to public;
    exit;
    then run your expdp command.
    Data Pump can compress the dumpfile if you are on 11.1 and have the appropriate options. The reason for specifying FILESIZE is to limit the size of each dumpfile. If you have 10G and are not compressing, and the total dumpfiles are 10G, then by specifying 600MB you will just have 10G/600MB = 17 dumpfiles that are 600MB each. You will have to send them 17 CDs (probably a few more if the dumpfiles don't get filled up 100% due to parallelism).
    Data Pump dumpfiles are written by the server, not the client, so the dumpfiles don't get created in the directory where the job is run.
    Dean
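    Putting that together, a sketch of a split full export using the directory created above (file names and the 600MB size are illustrative):
    expdp system/password directory=foo full=y dumpfile=full_%U.dmp filesize=600M logfile=full_exp.log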

  • Best Approach for using Data Pump

    Hi,
    I configured a new database which I set up with schemas that I imported in from another production database. Now, before this database becomes the new production database, I need to re-import the schemas so that the data is up-to-date.
    Is there a way to use Data Pump so that I don't have to drop all the schemas first? Can I just export the schemas and somehow just overwrite what's in there already?
    Thanks,
    Nora

    Hi, you can use the NETWORK_LINK parameter to import data from another remote database.
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm#i1007380
    Regards.
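    A sketch of such a refresh over the link, assuming a database link called prod_link and schemas HR and SCOTT (all placeholders); TABLE_EXISTS_ACTION=REPLACE drops and re-creates tables that already exist on the target:
    impdp system/password directory=dp_dir network_link=prod_link schemas=hr,scott table_exists_action=replace logfile=refresh_imp.log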

  • Tablespace level backup using data pump

    Hi,
    I'm using 10.2.0.4 on RHEL 4.
    I have one doubt: can we take a tablespace-level backup using Data Pump?
    But I don't want to use it for transportable tablespaces.
    thanks.

    Yes, you can, but only for the tables in that tablespace.
    Use the TABLESPACES option to export a list of tablespaces; all the tables in those tablespaces will be exported.
    You must have the EXP_FULL_DATABASE role to use tablespace mode.
    Have a look at this,
    http://stanford.edu/dept/itss/docs/oracle/10g/server.101/b10825/dp_export.htm#i1007519
    Thanks
    Edited by: Cj on Dec 12, 2010 11:48 PM
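    A minimal sketch of that tablespace-mode export (directory object, tablespace and file names are placeholders):
    expdp system/password directory=dp_dir tablespaces=USERS dumpfile=ts_users.dmp logfile=ts_users_exp.log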

  • Data Pump Import

    I am having trouble using the Data Pump import tool. I am working with Oracle 10.2.0.4 on Windows Server 2008. I am trying to import a lot of data (several schemas, thousands of tables, hundreds of GB of data) from the same version of Oracle on Linux. Since it's so much data and will take so long to import, I am trying to make sure I can get it to work for one table or one schema before I attempt everything. So I'm trying to import the TEST_USER.TABLE1 table using the following command:
    impdp.exe dumpfile=big_file.dmp logfile=big_file_import.log tables=test_user.table1 remap_datafile=\'/oradata1/data1.dbf\':\'D:\oracle\product\10.2.0\oradata\orcl\data1.dbf\'
    I supply sys as sysdba for the login when it asks (not sure how you get that info into the initial command). However, I am getting the following error:
    user 'TEST_USER' does not exist
    My understanding was that the data pump utility would create all the requisite schemas for me. Is that not the case? The target database is a new install so the source schemas don't exist there.
    Even if I create the test_user schema by hand, then I get an error stating that the tablespace doesn't exist:
    ORA-00959: tablespace 'TS_1' does not exist
    Then even doing that, which I don't want to have to do to begin with, doesn't work. Then it gives me something about how the user doesn't have the proper privileges on the tablespace.
    Isn't the data pump utility supposed to make this stuff automatic? Do I really have to create all the schemas and tablespaces by hand? That's going to take a long time. Am I doing something wrong here?
    Thanks,
    Dave

    user5482172 wrote:
    Hemant K Chitale wrote:
    tables=test_user.table1
    The "TABLES" mode does NOT create the database accounts.
    The FULL mode creates tablespaces and database accounts before importing data.
    The SCHEMAS mode creates database accounts before importing data-- but expects Tablespaces to pre-exist so that the tables and indexes can be created in the correct tablespaces.
    Hemant K Chitale
    http://hemantoracledba.blogspot.com
    That is consistent with what I'm seeing. So basically there is no way to import just a few tables or schemas and have it create all the necessary users and tablespaces for you. That seems a little arbitrary, but whatever. Why make things easy on your users when you can be all Oracley about it. Anyway, I'll give it a try and let you know what happens.
    Can I ask where you got this information? I'm sure you're right, but I can't find anything stating this.
    Thanks,
    Dave
    Edited by: user5482172 on Feb 18, 2010 10:51 PM
    After some testing, this appears to be correct. The system will not create tablespaces under the SCHEMAS mode and it will not create schemas under the TABLES mode; only under the FULL mode will everything be created for you. I find that unintuitive and extremely unfriendly, but that's Oracle for you. Have you ever tried doing this in SQL Server? You open the GUI, tell it where the dump file is, the name of the database, where to put the data files and you're done. Oracle has a real knack for making things way more complicated than they need to be. End Oracle rant.
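    For reference, a sketch of the FULL-mode import the reply describes, reusing the dump file name from the original post (the directory object is a placeholder, and REMAP_DATAFILE would still be needed if the file system layout differs):
    impdp system/password directory=data_pump_dir full=y dumpfile=big_file.dmp logfile=big_file_import.log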
