Tablespace level backup using data pump

Hi,
I'm using 10.2.0.4 on RHEL 4.
I have one doubt: can we take a tablespace-level backup using Data Pump? I don't want to use it for transportable tablespaces.
Thanks.

Yes, you can, but only for the tables in those tablespaces.
Use the TABLESPACES option to export a list of tablespaces; all the tables in those tablespaces will be exported.
You must have the EXP_FULL_DATABASE role to use tablespace mode.
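For example, something like this (a minimal sketch; the directory object DP_DIR and the tablespace name USERS_TS are placeholders, not from your system):
$ expdp system/<password> DIRECTORY=dp_dir DUMPFILE=users_ts.dmp LOGFILE=users_ts_exp.log TABLESPACES=users_ts
and to load it back in on the target (table definitions and rows only, the datafiles themselves are not touched):
$ impdp system/<password> DIRECTORY=dp_dir DUMPFILE=users_ts.dmp LOGFILE=users_ts_imp.log TABLESPACES=users_ts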
Have a look at this,
http://stanford.edu/dept/itss/docs/oracle/10g/server.101/b10825/dp_export.htm#i1007519
Thanks
Edited by: Cj on Dec 12, 2010 11:48 PM

Similar Messages

  • Migration from 10g to 12c using data pump

    Hi there, while I've used Data Pump at the schema level before, I'm rather new to full database imports.
    We are attempting a full database migration from 10.2.0.4 to 12c using the full-database Data Pump method over a database link.
    The DBA has advised that we avoid moving SYSTEM and SYSAUX objects, but when reviewing the documentation it initially appeared that these objects would not be exported given TRANSPORTABLE=NEVER. Can someone confirm this? The export/import log refers to objects that I believed would not be targeted:
    23-FEB-15 19:41:11.684:
    Estimated 3718 TABLE_DATA objects in 77 seconds
    23-FEB-15 19:41:12.450: Total estimation using BLOCKS method: 52.93 GB
    23-FEB-15 19:41:14.058: Processing object type DATABASE_EXPORT/TABLESPACE
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"UNDOTBS1" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"SYSAUX" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"TEMP" already exists
    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"USERS" already exists
    23-FEB-15 20:10:33.200:
    Completed 96 TABLESPACE objects in 1759 seconds
    23-FEB-15 20:10:33.208: Processing object type DATABASE_EXPORT/PROFILE
    23-FEB-15 20:10:33.445:
    Completed 7 PROFILE objects in 1 seconds
    23-FEB-15 20:10:33.453: Processing object type DATABASE_EXPORT/SYS_USER/USER
    23-FEB-15 20:10:33.842:
    Completed 1 USER objects in 0 seconds
    23-FEB-15 20:10:33.852: Processing object type DATABASE_EXPORT/SCHEMA/USER
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OUTLN" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"ANONYMOUS" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OLAPSYS" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"MDDATA" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"SCOTT" already exists
    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"LLTEST" already exists
    23-FEB-15 20:10:52.372:
    Completed 1140 USER objects in 19 seconds
    23-FEB-15 20:10:52.375: Processing object type DATABASE_EXPORT/ROLE
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"SELECT_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"EXECUTE_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"DELETE_CATALOG_ROLE" already exists
    23-FEB-15 20:10:55.256: ORA-31684: Object type ROLE:"RECOVERY_CATALOG_OWNER" already exists
    any insight most appreciated.

    Schemas SYS, CTXSYS, MDSYS and ORDSYS are not exported using exp/expdp.
    Doc ID: Note 228482.1
    I suppose the 12c software had already been installed and a database created, so when you imported you got these "already exists" messages.
    Whenever the software is installed and a database is created, SYSTEM, SYS and SYSAUX are created by default.
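    For reference, a full import over a database link is usually driven with something like this (a sketch only; the link name SOURCE10G and the directory object DP_DIR are placeholders, not from this thread). The ORA-31684 "already exists" messages for the stock tablespaces, users and roles are expected on a pre-created target and can normally be ignored:
    SQL> CREATE DATABASE LINK source10g CONNECT TO system IDENTIFIED BY <password> USING 'source10g_tns';
    $ impdp system/<password> NETWORK_LINK=source10g FULL=Y DIRECTORY=dp_dir LOGFILE=full_imp.log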

  • Differences between using Data Pump to back up database and using RMAN ?

    What are the differences between using Data Pump to back up a database and using RMAN? What are the pros and cons?
    Thanks

    Search for Database backup in
    http://docs.oracle.com/cd/B28359_01/server.111/b28318/backrec.htm#i1007289
    In short
    RMAN -> physical backup (copies of the physical database files).
    Datapump -> logical backup (logical data such as tables and procedures).
    Docs for RMAN--
    http://docs.oracle.com/cd/B28359_01/backup.111/b28270/rcmcncpt.htm#
    Docs for Datapump
    http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_overview.htm
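    To make the distinction concrete, a typical invocation of each looks roughly like this (sketches only; connection details and the DATA_PUMP_DIR directory are placeholders):
    RMAN> BACKUP DATABASE PLUS ARCHIVELOG;
    $ expdp system/<password> FULL=Y DIRECTORY=data_pump_dir DUMPFILE=full.dmp LOGFILE=full_exp.log
    The RMAN backup copies the physical datafiles (and archived logs) and supports point-in-time recovery; the expdp dump only extracts object definitions and rows, so it can rebuild objects but cannot roll a database forward.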
    Edited by: Sunny kichloo on Jul 5, 2012 6:55 AM

  • Can we load data in chunks using data pump ?

    We are loading data using data pump, so I want to check my understanding.
    Please correct me if my understanding is wrong:
    ODI will fetch all data from the source (whether it is INIT or CDC) in one go and unload it into the staging area.
    If that is true, will performance suffer with very large data volumes (50 million records at source), since ODI tries to load the entire data set in one go? I believe it would perform better if we loaded in chunks using data pump.
    Please confirm and correct.
    Also, I would like to know how we can configure chunked loads using data pump.
    Thanks in Advance.
    Regards,
    Dinesh.

    You may consider using LKM Oracle to Oracle (datapump)
    http://docs.oracle.com/cd/E28280_01/integrate.1111/e12644/oracle_db.htm#r15c1-t2
    In 11g, ODI reads from the source and writes to the target in parallel. This is the case where you specify the select query in the source command and the insert/update query in the target command. On the source side, ODI reads records from the source and adds them to a data queue. On the target side, a parallel thread reads data from the data queue and writes to the target. So the overall performance will be limited by the slower of the read or write process.
    Thanks,

  • Using  Data Pump when database is read-only

    Hello
    I used Flashback to return my database to a past point in time, then I opened the database read-only.
    Then I wanted to use Data Pump (expdp) to export a schema, but I encountered this error:
    ORA-31626: job does not exist
    ORA-31633: unable to create master table "SYS.SYS_EXPORT_SCHEMA_05"
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT", line 863
    ORA-16000: database open for read-only access
    but I could export that schema with exp.
    My question is: can't I use Data Pump while the database is read-only? Or do you know of any resolution for the issue?
    thanks

    You need to use NETWORK_LINK, so the required tables are created in a read/write database and the data is read from the read-only database over a database link:
    SYSTEM@db_rw> create database link db_r_only
      2   connect to system identified by oracle using 'db_r_only';
    $ expdp system/oracle@db_rw network_link=db_r_only directory=data_pump_dir schemas=scott dumpfile=scott.dmp
    But I tried it with 10.2.0.4 and found an error:
    Export: Release 10.2.0.4.0 - Production on Thursday, 27 November, 2008 9:26:31
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-39006: internal error
    ORA-39065: unexpected master process exception in DISPATCH
    ORA-02054: transaction 1.36.340 in-doubt
    ORA-16000: database open for read-only access
    ORA-02063: preceding line from DB_R_ONLY
    ORA-39097: Data Pump job encountered unexpected error -2054
    I found bug 7331929 on Metalink, which is fixed in 11.2! I haven't tested this procedure with earlier versions or with 11g, so I don't know whether this bug affects only 10.2.0.4 or 10g and 11.1 releases as well.
    HTH
    Enrique
    PS. If your problem was solved, consider marking the question as answered.

  • What are the 'gotcha' for exporting using Data Pump(10205) from HPUX to Win

    Hello,
    I have to export a schema using Data Pump from 10205 on HP-UX 64-bit to a Windows 64-bit database of the same 10205 version. What 'gotchas' can I expect from doing this? I mean, Data Pump export is cross-platform, so this sounds straightforward. But are there issues I might face exporting with Data Pump on the HP-UX platform and then importing the dump on the Windows 2008 platform, same database version 10205? Thank you in advance.

    On the HP-UX database, run this statement and look for the value of NLS_CHARACTERSET:
    SQL> select * from NLS_DATABASE_PARAMETERS;
    http://docs.oracle.com/cd/B19306_01/server.102/b14237/statviews_4218.htm#sthref2018
    When creating the database on Windows, you have two options - manually create the database or use DBCA. If you plan to create the database manually, specify the database characterset in the CREATE DATABASE statement - http://docs.oracle.com/cd/B19306_01/server.102/b14200/statements_5004.htm#SQLRF01204
    If using DBCA, see http://docs.oracle.com/cd/B19306_01/server.102/b14196/install.htm#ADMQS0021 (especially http://docs.oracle.com/cd/B19306_01/server.102/b14196/install.htm#BABJBDIF)
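    As a quick sketch of the check (the character set you get back is whatever your source actually uses; AL32UTF8 below is just an illustration):
    SQL> select value from nls_database_parameters where parameter = 'NLS_CHARACTERSET';
    VALUE
    ----------------
    AL32UTF8
    Then pick that same character set on the DBCA character-set screen, or in the CHARACTER SET clause of your manual CREATE DATABASE statement, when building the Windows database.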
    HTH
    Srini

  • How to consolidate data files using data pump when migrating 10g to 11g?

    We have one 10.2.0.4 database to be migrated to a new box running 11.2.0.1. The 10g database has too many data files scattered across too many file systems. I'd like to consolidate the data files into one or two large files in a single file system. Both OSs are RHEL 5. How should I do that using Data Pump Export/Import? I know there is a "REMAP" option that could be used, but it is only a one-to-one mapping. How can I map multiple old data files into one new data file?

    hi
    Data Pump is terribly slow; make sure you have as much memory as possible allocated to Oracle, but the bottleneck can be I/O throughput.
    Use the PARALLEL option, and also set these:
    * DISK_ASYNCH_IO=TRUE
    * DB_BLOCK_CHECKING=FALSE
    * DB_BLOCK_CHECKSUM=FALSE
    Set these high enough to allow for maximum parallelism:
    * PROCESSES
    * SESSIONS
    * PARALLEL_MAX_SERVERS
    more:
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/dp_perf.htm
    that's it, patience welcome ;-)
    P.S.
    For maximum throughput, do not set PARALLEL to much more than twice the number of CPUs (two workers for each CPU).
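    As an illustration of how these come together on the import side (a sketch; the dump file names, tablespace names and PARALLEL=8 are assumptions for an 8-CPU box, not values from this thread). REMAP_TABLESPACE points the segments at a new, consolidated tablespace; the underlying data files are whatever you defined when you pre-created that tablespace on the 11g side:
    $ impdp system/<password> FULL=Y DIRECTORY=dp_dir DUMPFILE=full_%U.dmp PARALLEL=8 REMAP_TABLESPACE=users_old1:users_new,users_old2:users_new LOGFILE=full_imp.log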
    Edited by: g777 on 2011-02-02 09:53
    P.S.2
    breaking news ;-)
    I am playing now with storage performance, and I turned the disk cache option (also called write-back cache) ON (it applies at least to RAID 0 and 5, and setting it does not by itself lose any data on that volume), and it gave me a 1.5 to 2 times speed-up!
    Some say there's a risk of losing more data when an outage happens, but there's always such a risk; you just might lose less. Anyway, if you can afford it (and for an import it's OK, as it is not production at that moment), I recommend trying it. It takes 15 minutes, but you can gain 2.5 hours out of 10 of normal importing.
    Edited by: g777 on 2011-02-02 14:52

  • How to export resource manager consumer groups using Data Pump?

    Hi, there,
    Is there any way to export RM consumer groups/mappings/plans as part of a Data Pump export/import? I ask because I don't fancy doing it manually, and I don't see the object in the DATABASE_EXPORT_OBJECTS view. I can create them manually, but I was wondering whether there's an easier, less involved way of doing it?
    Mark

    Hi,
    I have not tested it, but I think a full database export/import (using Data Pump or traditional exp/imp) may help here (although a full export/import might not be feasible for you), because full database mode also exports/imports SYS schema objects, so there is a chance that it will bring across the resource consumer groups and resource plans as well.
    Salman
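    If you do end up scripting them manually, the usual route is DBMS_RESOURCE_MANAGER inside a pending area; a minimal sketch (the group and plan names are made up):
    BEGIN
      DBMS_RESOURCE_MANAGER.CREATE_PENDING_AREA;
      DBMS_RESOURCE_MANAGER.CREATE_CONSUMER_GROUP(consumer_group => 'REPORTING_GRP',
                                                  comment        => 'reporting sessions');
      DBMS_RESOURCE_MANAGER.CREATE_PLAN(plan => 'DAYTIME_PLAN', comment => 'daytime plan');
      DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(plan             => 'DAYTIME_PLAN',
                                                  group_or_subplan => 'REPORTING_GRP',
                                                  comment          => 'cap reporting CPU',
                                                  cpu_p1           => 25);
      DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(plan             => 'DAYTIME_PLAN',
                                                  group_or_subplan => 'OTHER_GROUPS',
                                                  comment          => 'everything else',
                                                  cpu_p2           => 100);
      DBMS_RESOURCE_MANAGER.VALIDATE_PENDING_AREA;
      DBMS_RESOURCE_MANAGER.SUBMIT_PENDING_AREA;
    END;
    /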

  • Exporting whole database (10GB) using Data Pump export utility

    Hi,
    I have a requirement to export the whole database (10GB) using the Data Pump export utility, because it is not possible to send the 10GB dump on a single CD/DVD to the system vendor of our application (to analyze a few issues we have).
    Now, when I checked online, a full export is available, but I'm not able to understand how it works, as we have never used this Data Pump utility; we use the normal export method. Also, will Data Pump reduce the size of the dump file so it can fit on a DVD, or can we use the parallel full DB export to split the files and put them on DVDs? Is that possible?
    Please correct me if I am wrong and kindly help.
    Thanks for your help in advance.

    You need to create a directory object.
    sqlplus user/password
    create directory foo as '/path_here';
    grant all on directory foo to public;
    exit;
    Then run your expdp command.
    Data Pump can compress the dumpfile if you are on 11.1 and have the appropriate options. The reason for specifying FILESIZE is to limit the size of each dumpfile. If you have 10G and are not compressing, and the total dumpfiles are 10G, then by specifying 600MB you will just have 10G/600MB = 17 dumpfiles that are 600MB each. You will have to send them 17 CDs (probably a few more, if dumpfiles don't get filled up to 100% due to parallelism).
    Data Pump dumpfiles are written by the server, not the client, so the dumpfiles don't get created in the directory where the job is run.
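    Put together, the export would look something like this (a sketch; FOO is the directory created above, and you can add COMPRESSION=ALL on 11.1+ if you have the Advanced Compression option):
    $ expdp system/<password> FULL=Y DIRECTORY=foo DUMPFILE=full_%U.dmp FILESIZE=600M LOGFILE=full_exp.log
    FILESIZE caps each piece at 600MB and the %U in DUMPFILE numbers the pieces, so you end up with roughly the 17 files mentioned above.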
    Dean

  • Select table when import using Data Pump API

    Hi,
    Sorry for the trivial question. I export the data using the Data Pump API, in "TABLE" mode,
    so all tables are exported into one .dmp file.
    My question is, how do I then import only a few tables using the Data Pump API? How do I define the "TABLES" property as on the command-line interface?
    Should I use the DATA_FILTER procedures? If yes, how do I do that?
    Really thanks in advance
    Regards,
    Kahlil

    Hi,
    You should use the METADATA_FILTER procedure for this.
    e.g.:
    dbms_datapump.metadata_filter(
          handle => handle1,
          name   => 'NAME_EXPR',
          value  => 'IN (''TABLE1'', ''TABLE2'')');
    Regards
    Anurag
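    A fuller sketch of a table-mode import job through the API (untested; the dump file name, the directory object DP_DIR and the table names are placeholders):
    DECLARE
      h         NUMBER;
      job_state VARCHAR2(30);
    BEGIN
      -- table-mode import from an existing dump file
      h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'TABLE');
      DBMS_DATAPUMP.ADD_FILE(handle    => h,
                             filename  => 'tables.dmp',
                             directory => 'DP_DIR',
                             filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      -- restrict the job to just these two tables, like TABLES= on the command line
      DBMS_DATAPUMP.METADATA_FILTER(handle => h,
                                    name   => 'NAME_EXPR',
                                    value  => 'IN (''TABLE1'', ''TABLE2'')');
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
      DBMS_OUTPUT.PUT_LINE('Job finished with state: ' || job_state);
    END;
    /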

  • Best Approach for using Data Pump

    Hi,
    I configured a new database, which I set up with schemas that I imported from another production database. Now, before this database becomes the new production database, I need to re-import the schemas so that the data is up to date.
    Is there a way to use Data Pump so that I don't have to drop all the schemas first? Can I just export the schemas and somehow overwrite what's in there already?
    Thanks,
    Nora

    Hi, you can use the NETWORK_LINK parameter to import data directly from the other (remote) database.
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm#i1007380
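    For example (a sketch; the link name PROD_LINK and the schema names are placeholders; note that TABLE_EXISTS_ACTION=REPLACE drops and recreates tables that already exist on the target, so use it with care):
    $ impdp system/<password> NETWORK_LINK=prod_link SCHEMAS=app_owner,app_data TABLE_EXISTS_ACTION=REPLACE DIRECTORY=dp_dir LOGFILE=refresh_imp.log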
    Regards.

  • Using Data Pump Storage Parameter option

    I am creating a database replica of our production environment; the db name doesn't have to be the same.
    My plan is to use Oracle Data Pump to move the data from the source database to the target database.
    I performed the same scenario in our Windows 2003 environment with no problem.
    Doing the same for Linux, I am getting a tablespace creation error, as you can see below:
    Linux-x86_64 Error: 2: No such file or directory
    Failing sql is: CREATE TABLESPACE "INQUIRY" DATAFILE '/oraappl/pca/vprod/vproddata/inquiry01.dbf' SIZE 629145600 LOGGING ONLINE PERMANENT BLOCKSIZE 8192 EXTENT MANAGEMENT LOCAL AUTOALLOCATE SEGMENT SPACE MANAGEMENT AUTO
    ORA-39083: Object type TABLESPACE failed to create with error:
    ORA-01119: error in creating database file '/oraappl/pca/vprod/vproddata/medical01.dbf'
    ORA-27040: file create error, unable to create file
    My question is: do we have to create the tablespaces ourselves, or should Data Pump use the default tablespace location already being used by the new database?
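    (For what it's worth, the error happens because the source datafile path does not exist on the new box; you either pre-create the tablespaces on the target before the import, or remap the datafile paths, roughly like this sketch with made-up target paths:
    $ impdp system/<password> FULL=Y DIRECTORY=dp_dir DUMPFILE=full.dmp REMAP_DATAFILE='/oraappl/pca/vprod/vproddata/inquiry01.dbf':'/u01/oradata/replica/inquiry01.dbf'
    with one REMAP_DATAFILE pair per file that needs a new location.)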

    Hi Richard,
    I am working on creating my extra database using the DUPLICATE command, as you suggested.
    I got everything up until I got this error:
    channel ORA_AUX_DISK_1: reading from backup piece /oraappl/pca/backups/weekly/vproddata/rman/VPR ackupset/2013_08_27/o1_mf_nnndf_TAG20130827T083750_91s7dz0r_.bkp ORA-19870: error reading backup piece /oraappl/pca/backups/weekly/vproddata/rman/VPROD/backupset 3_08_27/o1_mf_nnndf_TAG20130827T083750_91s7dz0r_.bkp ORA-19505: failed to identify file "/oraappl/pca/backups/weekly/vproddata/rman/VPROD/backupset/2 08_27/o1_mf_nnndf_TAG20130827T083750_91s7dz0r_.bkp" ORA-27037: unable to obtain file status Linux-x86_64 Error: 2: No such file or directory
    It is defaulting to the recovery location of the production database, instead of the auxiliary db.
    My next option was to catalog the backup files, but even that is not working. Any suggestions?

  • While using data pump (impdp) how to rename references within objects?

    Using 10g.
    What I want to accomplish is to change schema and tablespace ownership using the Data Pump method via the command line; I have had success using the command line for expdp/impdp. The problem is that there are objects referencing the old schemas that DO NOT get updated (e.g. a procedure may reference usr1.table1 in its PL/SQL), and this is where I have been unsuccessful. Does anyone know of a way to change references from the old schema name to the new schema name within objects (procedures, views, etc.) via the command line?
    This is what I currently use; it works to change the schema and tablespace, but will not change references within my objects:
    expdp system/<pass> schemas=usr1,usr2 DIRECTORY=dp_dir DUMPFILE=dataPump_BothSchemas.dmp LOGFILE=expdpAllSchema.log parallel=2
    impdp system/<pass> DIRECTORY=dp_dir DUMPFILE=dataPump_BothSchemas.dmp LOGFILE=impbothSchToEE.log remap_schema=usr1:newUsr1,usr2:newUsr2 remap_tablespace=old_ts_tables:new_ts_tables full=y
    Thanks!
    p.s. I have accomplished this using Enterprise Manager.

    (e.g. procedure may reference usr1.table1 in the PL/SQL statement) If you hard-coded such references in stored procedures, you have to correct them manually. Consider using synonyms if your stored procedures reference other schemas' objects.
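    One way to find the offenders and work around them with synonyms (a sketch; USR1/NEWUSR1 are the placeholder old and new schema names):
    -- find stored code that still references the old schema name
    SELECT owner, name, type, line, text
      FROM dba_source
     WHERE UPPER(text) LIKE '%USR1.%'
     ORDER BY owner, name, type, line;
    -- workaround: keep a stub USR1 schema whose objects are just synonyms onto the new schema,
    -- so the hard-coded usr1.table1 references still resolve
    CREATE SYNONYM usr1.table1 FOR newusr1.table1;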

  • 10g to 11gR2 Upgrade using Data Pump Import

    Hi,
    I am intending to move a 10g database from one windows server to another. However there is also a requirement to upgrade this database to 11gR2. Therefore I was going to combine the 2 in one movement by -
    1. take a full data pump export of the source 10g database
    2. create a new empty 11g database on the target environment
    3. import the dump file into the target database
    However I have a couple of queries running over in my mind about this approach -
    Q1. What happens with the SYSTEM and SYS objects from SYSTEM and SYSAUX during the import? Given that I will in effect have already created a new dictionary on the empty target database, will any import of SYSTEM or SYS simply produce error messages which should be ignored?
    Q2. should I use EXCLUDE on SYS and SYSTEM ( is this EXCLUDE better on the export or import side ) ?
    Q3. what happens if there are things such as scheduled jobs etc on the source system - since these would be stored in SYSTEM owned tables, how would I bring these across to the new target 11g database ?
    thanks,
    Jim

    Jim Thompson wrote:
    Hi,
    I am intending to move a 10g database from one windows server to another. However there is also a requirement to upgrade this database to 11gR2. Therefore I was going to combine the 2 in one movement by -
    1. take a full data pump export of the source 10g database
    2. create a new empty 11g database on the target environment
    3. import the dump file into the target database
    However I have a couple of queries running over in my mind about this approach -
    Q1. What happens with the SYSTEM and SYS objects from SYSTEM and SYSAUX during the import? Given that I will in effect have already created a new dictionary on the empty target database, will any import of SYSTEM or SYS simply produce error messages which should be ignored?
    You won't get errors related to the SYSTEM and SYSAUX tablespaces because these won't be exported at all. Schemas like SYS, CTXSYS, MDSYS and ORDSYS are never exported using Data Pump. That's why Oracle recommends not creating any objects under the SYS or SYSTEM schemas.
    Q2. should I use EXCLUDE on SYS and SYSTEM ( is this EXCLUDE better on the export or import side ) ?
    Not required; as the data dictionary schemas won't be exported, specifying EXCLUDE won't do anything.
    Q3. what happens if there are things such as scheduled jobs etc on the source system - since these would be stored in SYSTEM owned tables, how would I bring these across to the new target 11g database ?
    The DDL will get exported and imported into the new database. For example, if you have schema A and you had defined a job whose owner is A, then that information also gets imported. For user-defined jobs there are views like USER_SCHEDULER_JOBS, so don't worry about the user jobs; they will be created.
    Also see
    MOS note - Schema's CTXSYS, MDSYS and ORDSYS are Not Exported [ID 228482.1]
    Export system or sys schema
    I also ran the following test in my db, which shows I cannot export objects under the SYS schema:
    SQL> show user
    USER is "SYS"
    SQL> create table pump (id number);
    Table created.
    SQL> insert into pump values (1);
    1 row created.
    SQL> insert into pump values (2);
    1 row created.
    SQL> exit
    Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Pr
    oduction
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    C:\Documents and Settings\rnagi\My Documents\Ranjit Doc\Performance\SQL>expdp tables=sys.pump logfile=test.log
    Export: Release 11.2.0.1.0 - Production on Mon Feb 27 18:11:29 2012
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Username: / as sysdba
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Produc
    tion
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYS"."SYS_EXPORT_TABLE_01":  /******** AS SYSDBA tables=sys.pump logfi
    le=test.log
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 0 KB
    ORA-39166: Object SYS.PUMP was not found.
    ORA-31655: no data or metadata objects selected for job
    Job "SYS"."SYS_EXPORT_TABLE_01" completed with 2 error(s) at 18:11:57

  • Error connecting as DBA using Data Pump Within Transportable Modules

    Hi all
    I am using OWB 10gR2 and trying to set up a transportable module to test the Oracle Data Pump utility. I have followed the user manual in terms of the relevant grants and permissions needed to use this functionality, and I have successfully connected to both my source and target databases and created my transportable module, which will extract six tables from a source database on 10.2.0.1.0 to my target schema in my warehouse, also on 10.2.0.1.0. When I come to deploy/execute the transportable module, it fails with the following error.
    RPE-01023: Failed to establish connection to target database as DBA
    Now we have even gone as far as granting the DBA role to the user within our target, but we still get the same error, so we assume it is something to do with the connection of the Transportable Target Module Location and that it needs to connect as DBA somehow in the connect string. Has anyone experienced this issue, and is there a way of creating the location connection that is not documented?
    There is no mention of this anywhere in the manual, and I have even followed the example from http://www.rittman.net/archives/2006_04.html. My target user has the privileges detailed in the manual, as listed below:
    The user must not be SYS. Must have the ALTER TABLESPACE privilege and the IMP_FULL_DATABASE role. Must have the CREATE MATERIALIZED VIEW privilege with ADMIN option. Must be a Warehouse Builder repository database user.
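    For reference, the grants that quote corresponds to would look roughly like this (a sketch; TM_USER is a placeholder for the target location user):
    GRANT ALTER TABLESPACE TO tm_user;
    GRANT IMP_FULL_DATABASE TO tm_user;
    GRANT CREATE MATERIALIZED VIEW TO tm_user WITH ADMIN OPTION;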
    Any help would be appreciated before I raise a request with Oracle.

    Did you ever find a resolution? We are experiencing the same issue...
    thanks
    OBX
