Split expdp Dumpfiles.

Hi All,
I want to take an export using expdp.
I have two mount points, data1 and data2. At data1 I have 70GB of space and at data2 I have 30GB.
My expdp dump file size is 81GB. I want to keep 60GB at data1 and 20GB at data2.
How can I split my dump files? This is a full database backup.
I have created both directories at db level for both mount points.
Please suggest.
Thanks
Mano.
Edited by: M.M.R on Dec 1, 2010 8:51 AM

"How can I split my dumpfiles? Full database backup."
Data Pump should not be used for database backups. You can split the dump files across different directories with Data Pump, but the exact split you want ("keep 60GB at data1 and 20GB at data2") is difficult to guarantee.
Create two DIRECTORIES and mention in your par file. Something like this:
DUMPFILE=data_pump_dir1:DP_EXP_A%u.dmp,data_pump_dir2:DP_EXP_B%u.dmp
Alternatively, split the Data Pump job at the schema or table level and adjust the space usage on your mount points accordingly.
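One way to cap how much lands on each mount point is to list explicit files per directory and limit each file with FILESIZE. A sketch (directory and file names are illustrative; note that if the export needs more space than the listed files allow, the job stops with ORA-39095, so leave some headroom):

```
# Hypothetical parfile: up to 3x20GB on data1, 1x20GB on data2
FULL=Y
DUMPFILE=data_pump_dir1:DP_EXP_01.dmp,data_pump_dir1:DP_EXP_02.dmp,data_pump_dir1:DP_EXP_03.dmp,data_pump_dir2:DP_EXP_04.dmp
FILESIZE=20G
LOGFILE=data_pump_dir1:dp_exp_full.log
```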
-Anantha

Similar Messages

  • Compatibility of expdp dumpfile with  higher impdp utility

    Hi All,
I am a bit confused about the Data Pump utility versions.
I have an expdp dump file taken with the 10.2.0 expdp utility, and I want to import it into a higher Oracle home: an 11gR2 (11.2.0) database.
Which version of the impdp utility should I use?
Source database version: 10.2.0; expdp version used: 10.2.0
Target database version: 11.2.0; impdp version: ????
    Thanks,

    Hi,
With Data Pump, you always use the utility that shipped with the database. It is not like exp/imp, where you use a different version of the utility to get the correct dump file version. In your case, use the expdp that came with the source and the impdp that came with the target.
As for network import, it is a nice feature that can save you a lot of disk space. Say your database is x GB and the dump file is 50% of that. You need x GB for the source database, 0.5x GB for the dump file, another 0.5x GB when you copy the dump file to the target, and then x GB again when you import it: 3x GB in total. With a network import you only need the source and target databases, so you are at 2x GB. (This assumes the dump file compresses to 50% of the database size.)
If you have a 10Gb network interconnect, the NETWORK_LINK option can be pretty fast. I have reports of a large HMO moving 3TB/hr using the NETWORK_LINK option of Data Pump.
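A network-mode import along those lines might look like this (a sketch; the database link, schema, and directory names are illustrative, and the DIRECTORY is only needed for the log file since no dump file is written):

```
# Hypothetical impdp parfile, run on the target:  impdp system/password parfile=net_imp.par
SCHEMAS=scott
NETWORK_LINK=source_db_link
DIRECTORY=data_pump_dir
LOGFILE=net_imp.log
```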
    Dean

  • Expdp dumpfile: add timestamp

How do I add a timestamp to the dump file name?
dumpfile=ENTDRP_NUCDBA_expdp.%date:~4,2%%date:~7,2%%date:~10,4%.dmp
(1) Currently the above produces the filename ENTDRP_NUCDBA_EXPDP.09172009.DMP (date only); any suggestions on how I can include a full timestamp?
    Thanks

    [oracle@orion2:/oradata/DMOFSCM9/dpdump]$ export mydate=`date "+%Y%m%d%H%M%S"`
    [oracle@orion2:/oradata/DMOFSCM9/dpdump]$ expdp sysadm/sysadm tables=psoprdefn dumpfile=psoprdefn_${mydate}.dmp nologfile=y
    Export: Release 11.2.0.1.0 - Production on Thu Sep 17 20:48:18 2009
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYSADM"."SYS_EXPORT_TABLE_01":  sysadm/******** tables=psoprdefn dumpfile=psoprdefn_20090917204755.dmp nologfile=y
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 128 KB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "SYSADM"."PSOPRDEFN"                        40.75 KB     170 rows
    Master table "SYSADM"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
    Dump file set for SYSADM.SYS_EXPORT_TABLE_01 is:
      /oradata/DMOFSCM9/dpdump/psoprdefn_20090917204755.dmp
    Job "SYSADM"."SYS_EXPORT_TABLE_01" successfully completed at 20:48:38
    [oracle@orion2:/oradata/DMOFSCM9/dpdump]$ ls psoprdefn_*dmp
    psoprdefn_20090917204755.dmp
[oracle@orion2:/oradata/DMOFSCM9/dpdump]$
Or without a variable:
    [oracle@orion2:/oradata/DMOFSCM9/dpdump]$ expdp sysadm/sysadm tables=psoprdefn dumpfile=psoprdefn_`date "+%Y%m%d%H%M%S"`.dmp nologfile=y
    Export: Release 11.2.0.1.0 - Production on Thu Sep 17 20:50:06 2009
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYSADM"."SYS_EXPORT_TABLE_01":  sysadm/******** tables=psoprdefn dumpfile=psoprdefn_20090917205006.dmp nologfile=y
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 128 KB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "SYSADM"."PSOPRDEFN"                        40.75 KB     170 rows
    Master table "SYSADM"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
    Dump file set for SYSADM.SYS_EXPORT_TABLE_01 is:
      /oradata/DMOFSCM9/dpdump/psoprdefn_20090917205006.dmp
    Job "SYSADM"."SYS_EXPORT_TABLE_01" successfully completed at 20:50:17
[oracle@orion2:/oradata/DMOFSCM9/dpdump]$
Nicolas.
    Edited by: N. Gasparotto on Sep 17, 2009 8:50 PM

  • ORA-39124 with expdp when attaching timestamp to dumpfile name

    Hi all,
I want to attach a timestamp to the dump files produced by expdp, so I did:
DUMPFILE=SCOTT_`date '+%d_%m_%y_%H:%M:%S'`.dmp
I tried it on the command line and in a parfile, always with the same result: "ORA-39124: dump file name "SCOTT_"'`date '+%d_%m_%y_%H%M%S'`'".dmp" contains an invalid substitution variable" (the same happens with LOGFILE).
No matter what I escaped, and whichever quotation marks I used, it always stayed the same (frustrating).
So far I have worked around it by renaming the files (e.g. to SCOTT.dmp) with OS means after exporting.
What must be done (if at all possible) to have expdp and impdp create/accept dump file names of the sort mentioned above?
    Any help appreciated.
    FJH

    Hi Srini,
you are almost right. The problem is the colon (probably because expdp uses colons in its command-line syntax, e.g. directory:filename).
If you leave out the colons it is fine.
    "expdp DUMPFILE=dumpfile_`date '+%d_%m_%y_%H_%M_%S'`.datapump" works fine.
    FJH

  • How to use EXPDP to create the dumpfile in shared network drive.

    Hi,
I have an Oracle 11g (11.2.0.3) DB server on a SUSE Linux machine.
I want to export a schema of 300 GB using the EXPDP utility. The thing is, I don't have enough space to hold the EXPDP dump file on the same server.
Hence I would like to know whether there is any way to create the dump file on some other mapped drive from the Linux server.
This is quite urgent; please, someone help me.
    Regards
    Suresh

    Hi Suresh,
If it's a Windows share you want to map from Linux, take a look at Samba:
http://www.linux-noob.com/forums/index.php?/topic/1404-how-to-mount-a-windows-share-with-smbmount/
I've not actually exported directly to a Samba mount, but I assume it will work.
The other alternative (if you also have an 11.2.0.3 DB on the Windows server) is to use this approach:
http://dbaharrison.blogspot.de/2013/05/expdp-creating-file-on-my-local-machine.html
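The Samba route, roughly (a sketch; host, share, and path names are illustrative, and the oracle OS user must be able to write to the mount):

```
# On the Linux DB server, as root: mount the Windows share (hypothetical names)
mount -t cifs //winhost/dumpshare /mnt/dumpshare -o username=winuser,uid=oracle,gid=oinstall

-- In the database: point a directory object at the mount
CREATE OR REPLACE DIRECTORY exp_share AS '/mnt/dumpshare';
GRANT READ, WRITE ON DIRECTORY exp_share TO suresh;
```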
    Cheers,
    Harry

  • How to map an expdp parallel process to its output file

How do I map an expdp parallel process to its output file while the job is running?
Say I use expdp dumpfile=test_%U.dmp parallel=5 ...
Each parallel process writes to its related output file; I want to know the mapping at run time.

I'm not sure if this information is reported in the STATUS command, but it's worth a shot. You can get to the STATUS command in two ways:
If you are running the Data Pump job from a terminal window, then while it is running, type Ctrl-C and you will get the Data Pump prompt, either IMPORT> or EXPORT>.
IMPORT> status
If you type status, it will tell you a bunch of information about the job and then about each worker process. It may have dump file information in there.
If the job is not running in your terminal, you need to attach to it. To do this, you need to know the job name. If you don't know it, you can look at sys.dba_datapump_jobs if privileged, or sys.user_datapump_jobs if not. You will see a job name and a schema name. Once you have that, you can:
expdp user/password attach=schema.job_name
This will bring you to the EXPORT>/IMPORT> prompt. Type status there.
Like I said, I'm not sure if file name information is included, but it might be. If it is not there, then I don't know of any other way to get it.
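To find the job name to attach to, a privileged user can query the dictionary (standard views; the output depends on what is currently running):

```
-- List active Data Pump jobs and their owners
SELECT owner_name, job_name, operation, state
FROM   dba_datapump_jobs;
```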
    Dean

  • How to export and Import table of another user from user system using expdp

    Hi All,
How do I export the table 'scott.emp' as the SYSTEM user?
    expdp system/password directory=expdp dumpfile=scott_emp.dmp ???? ?? tables=emp ??????????????????????????
    thank you
---------------------------------------------------------- Posting the solution here so that users who need it don't have to scroll down ------------------------------------------------------
    Finally I got it right!!!
    Task :- Export table_1 of schema_a from database db1 and Import it to schema_b of database db2 as user other than schema owner
    Solution :-
    expdp system/pwd directory=expdp tables=schema_a.table_1 dumpfile=schema_a.table_1.dmp logfile=expdp_schema_a.table_1.log
    impdp system/pwd directory=expdp tables=schema_a.table_1 dumpfile=schema_a.table_1.dmp logfile=impdp_schema_a.table_1.log remap_schema=schema_a:schema_b remap_tablespace=table_1_tablespace:schema_b_tablespace
    Thank You All
    Edited by: Ven on Mar 9, 2011 7:52 AM

Just an example:
    SQL> $expdp system/sys directory=data_pump_dir dumpfile=tests.dmp tables=scott.emp
    Export: Release 11.2.0.1.0 - Production on Tue Mar 8 10:39:57 2011
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYSTEM"."SYS_EXPORT_TABLE_01": system/******** directory=data_pump_dir dumpfile=tests.dmp tables=scott.emp
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 64 KB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "SCOTT"."EMP" 8.570 KB 14 rows
    Master table "SYSTEM"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
    Dump file set for SYSTEM.SYS_EXPORT_TABLE_01 is:
    C:\APP\MAZAR\ADMIN\ACEME\DPDUMP\TESTS.DMP
    Job "SYSTEM"."SYS_EXPORT_TABLE_01" successfully completed at 10:40:01
    SQL>

  • Expdp and impdp  doubt

Hello friends,
I have an expdp dump file of a schema from a Linux ES4 server, Oracle 10g.
Can I use that dump file and impdp it into a Solaris 10, Oracle 10g database? Is that possible?
    thanks

    A corrupted dump file would almost certainly generate a different error message. Either the import utility will blow up when it encountered an invalid header or it will abort in the middle of the import when corruption is found. The error message you're receiving does not indicate that your dump file is corrupt.
    There is no way to validate a dump file short of trying to import it.
I hope that you are doing a daily export as a supplement to a proper backup, not as a replacement for a physical backup. If you're not doing a physical backup, you really want to start; export is no replacement for taking a proper backup.
    I'm confused about where the dump file resides now. Have you placed the dump file in the DUMP_FILE_DIR on the destination server? Or are you stating that you cannot put the dump file on the destination server at all? Or something else?
    Justin

  • Export Database Link & Public Synonyms using Expdp

    Dear All,
    We are using expdp to do schema level exports:
    expdp dumpfile=<Directory>:expdp.dmp SCHEMAS=S1,S2,S3
    We also want to export the Public/Private Synonyms (created for the objects owned by above users), Public/Private Database Links as part of above command.
    How we can achieve this? We are using 11gR2.
    regards,
    Riaz

Hi,
Public synonyms cannot be exported with a schema-level export (and, in my experience, not even with a full export). You can get them from "dba_synonyms"; I suggest you create a custom script:
select 'create public synonym ' || table_name || ' for ' || table_owner || '.' || table_name || ';'
from dba_synonyms
where owner = 'PUBLIC'
and table_owner not in ('SYS', 'SYSTEM')
order by table_owner;
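To capture the generated DDL into a script you can run on the target, SQL*Plus spooling works (a sketch; the spool file name is illustrative):

```
SET PAGESIZE 0 LINESIZE 300 FEEDBACK OFF TRIMSPOOL ON
SPOOL create_public_synonyms.sql
select 'create public synonym ' || table_name || ' for ' || table_owner || '.' || table_name || ';'
from dba_synonyms
where owner = 'PUBLIC' and table_owner not in ('SYS', 'SYSTEM')
order by table_owner;
SPOOL OFF
```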
More information:
    https://forums.oracle.com/thread/855639?start=0&tstart=0
    Oracle Data Pump Schema Export and Public Synonyms
    Thank you

  • Expdp/exp

    Dear Team,
Please help. I'm running Oracle 10g.
I'm trying to export a schema to a new one, and I get data errors on one of the tables; the error is attached below:
EXP-00015: error on row 9026759 of table BULKACCOUNTACTIVITY, column PRICE_DT, datatype 12
I then tried to exclude the table using a parameter file with a SELECT statement in it, and got the errors attached below:
$ expdp dumpfile=lisp.dmp directory=dp parfile=lispar.par
LRM-00110: syntax error at 'select'
LRM-00113: error when processing file 'lispar.par'
Expdp, for some strange reason, isn't working; it gives the attached error:
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-39006: internal error
    ORA-39213: Metadata processing is not available
exp works, but refuses to work with the parfile option; it looks like it's no longer supported, as it doesn't appear when I run exp help=y.
    I'm now stuck, your assistance will be highly appreciated.
    Regards,
    Melly

Can you please post the contents of the parfile?
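For reference, a parameter file cannot contain SQL statements (hence the LRM-00110 syntax error at 'select'); excluding a table is done with the EXCLUDE parameter. A sketch (the schema name is illustrative):

```
# Hypothetical lispar.par: schema export minus the problem table
DUMPFILE=lisp.dmp
DIRECTORY=dp
SCHEMAS=melly
EXCLUDE=TABLE:"IN ('BULKACCOUNTACTIVITY')"
```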

  • Synonyms Error while EXPDP/IMPDP

Hi Friends,
I have to refresh my UAT DB from my Prod DB. While doing so, I always have to recreate the synonyms and grant the privileges again on the target DB.
I am using EXPDP to back up my DB; this being a logical backup, it includes all objects, synonyms as well, so these should be recreated in the target DB during import.
These are the commands I am using:
expdp dumpfile=file_name.dmp logfile=logfile_name.log schemas=abc_liv directory=DATA_PUMP_DIR status=100 exclude=statistics
impdp dumpfile=file_name.dmp logfile=logfile_name.log remap_schema=abc_liv:abc directory=DATA_PUMP_DIR
where:
abc_liv = prod schema
abc = uat schema
    Note:
UAT and Prod DB both have three schemas each, and there are lots of dependencies among the three schemas.
The UAT schema names are different from the Prod schema names.
    Kindly enlighten me...
    BR

    Thanks Guys,
    OS:RHEL 64 bit
    DB:11.2.0.2 64 bit SE
I drop the entire schema and then recreate it. This is the command I use:
drop user schema_name cascade;
As the cascade keyword is used, all the schema objects are dropped along with the user; I then have a fresh schema of the same name into which the Prod data is imported.
    Following are the errors that i am getting during Import:(Too many errors)
    ALTER_PACKAGE_SPEC:"schema_name"."PKG_LMS_ALO_PKG_MAIN" created with compilation warnings
    ORA-39082: Object type ALTER_PACKAGE_SPEC:"schema_name"."PKG_REROUTE_MASTERS" created with compilation warnings
    ORA-39082: Object type ALTER_PACKAGE_SPEC:"schema_name"."PKG_WF_GLOBAL_VARIABLES" created with compilation warnings
    Processing object type SCHEMA_EXPORT/FUNCTION/ALTER_FUNCTION
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."CHECKDUPLICAT" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."CHG_FUN_GET_CHARGE_DETAILS" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."CHG_FUN_GET_CHGBASIS_DESC" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."CHG_FUN_GET_PENAL_APPLIC_YN" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."CPF_FUN_EXT_AGENCY_OTHER" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."DOC_FUN_GETSMDESC" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."FUNCTION_SET_EXPECTEDAPPRDATE" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."FUNCTION_SET_EXPECTEDFIDATE" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."FUN_ACTIVITY_COMPLETED_YN" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."FUN_ACTIVITY_DONEYN" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."FUN_ACTUAL_DISB_DONE_YN" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."FUN_ADD_UPD_EXP" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."FUN_ADV_INSTALL_DISPLAY_YN" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."WFS_GET_USER_MODULE_SPECIFIED" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."WF_FUN_GET_AVL_PARAM_DETAILS" created with compilation warnings
ORA-39082: Object type ALTER_FUNCTION:"schema_name"."WF_FUN_GET_NO_OF_CASES" created with compilation warnings
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."ADD_PARAMETER" created with compilation warnings
ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."CHECK_APPL_IMPACT_PERF_REROUTE" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."CHECK_FAULT" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."FEE_WAIVER_REQ_M_YN" created with compilation warnings
ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."FIND_DEDUP_MATCH" created with compilation warnings
ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."FIND_DEDUP_MATCH_APPL" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."GENERATE_ENVELOPE" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."GENERATE_TRNAUDIT_LOG" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."GET_DOCUMENTS_FORADDITION_XML" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."GET_DOCUMENTS_FORWAIVER_XML" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."LMSRSP_CMPSUM_OUTSTPORTFOLIO" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."LMSRSP_CUST_OS_EXP_SUM" created with compilation warnings
ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."LMSRSP_FUTURE_INFLOWS" created with compilation warnings
ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."LMSRSP_INTEREST_CERTIFICATE" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."LMSRSP_NONSTARTERCASE_DET" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."LMSRSP_NONSTARTERCASE_SUM" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."LMSRSP_ODPREEMI_EMISUMMARY" created with compilation warnings
ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."LMSRSP_SHORT_PDC_REPORT" created with compilation warnings
ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."LOS_BATCH_PROCEDURE" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"schema_name"."PC_CALC_RISK_PARAMETERS" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"schema_name"."FUN_ALLOC_SPFC_USER" created with compilation warnings
Processing object type SCHEMA_EXPORT/VIEW/VIEW
Processing object type SCHEMA_EXPORT/VIEW/GRANT/OWNER_GRANT/OBJECT_GRANT
    ORA-39082: Object type VIEW:"schema_name"."AST_VEW_ASSET" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."AST_VEW_ASSETTYPE" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."AST_VEW_ASSET_TYPE" created with compilation warnings
ORA-39082: Object type VIEW:"schema_name"."AST_VEW_ASTCLASS" created with compilation warnings
ORA-39082: Object type VIEW:"schema_name"."AST_VEW_ATTR_MAPPING" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."AST_VEW_EXT_AGENCY" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."AST_VEW_VEHICLE_TYPE" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."BASE_SEC_MST_USERPROFILE" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."CHG_RPT_VEW_CHGBASISCODE" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."CHG_RPT_VEW_TAX_DETAILS" created with compilation warnings
ORA-39082: Object type VIEW:"schema_name"."CHG_VEW_ACC_BOOKS" created with compilation warnings
ORA-39082: Object type VIEW:"schema_name"."CHG_VEW_ACC_METHOD" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."CHG_VEW_AMORT_METHOD" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."CHG_VEW_CALC_BASIS" created with compilation warnings
ORA-39082: Object type VIEW:"schema_name"."CHG_VEW_CALC_FROM" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."CHG_VEW_CALC_TYPE" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."CHG_VEW_CHG_RECOMODE" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."CHG_VEW_EVENT" created with compilation warnings
ORA-39082: Object type VIEW:"schema_name"."CHG_VEW_FEES" created with compilation warnings
ORA-39082: Object type VIEW:"schema_name"."CHG_VEW_FEES_RANGE" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."CHG_VEW_FEE_ACC_DETAILS" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."CHG_VEW_FEE_DETAILS" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."CHG_VEW_FEE_MST_DETAILS" created with compilation warnings
ORA-39082: Object type VIEW:"schema_name"."V_LMS_CHEQUE_BOUNCE_LETTER" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."V_LMS_ACCRUED_INCOME_REPORT" created with compilation warnings
    ORA-39082: Object type VIEW:"schema_name"."V_LMS_REPAY_FREQ" created with compilation warnings
    Processing object type SCHEMA_EXPORT/PACKAGE/PACKAGE_BODY
    ORA-39082: Object type PACKAGE_BODY:"schema_name"."AST_PKG_ASSET" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"schema_name"."CHG_PKG_API" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"schema_name"."DOC_PKG_DOCUMENT" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"schema_name"."EXT_PKG_EXT_AGENCY" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"schema_name"."PKG_ACCOUNT" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"schema_name"."PKG_ACCOUNTING" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"schema_name"."PKG_ADV_ALLOCATION" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"schema_name"."PKG_ALLOCATION" created with compilation warnings
ORA-39082: Object type PACKAGE_BODY:"schema_name"."PKG_ASSET" created with compilation warnings
ORA-39082: Object type PACKAGE_BODY:"schema_name"."PKG_BASE_SEC_AUDIT" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"schema_name"."PKG_CHARGES" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"schema_name"."PKG_CRS_XML" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"schema_name"."PKG_DDP_MAIN" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"schema_name"."PKG_EODDETAILS" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"schema_name"."PKG_EOD_PROCESS" created with compilation warnings
ORA-39082: Object type TRIGGER:"schema_name"."TRNTRG_T_APPLICANT_BUSINESS_IN" created with compilation warnings
    ORA-39082: Object type TRIGGER:"schema_name"."TRNTRG_T_APPLICANT_BUSINESS_IN" created with compilation warnings
    ORA-39082: Object type TRIGGER:"schema_name"."TRNTRG_ADDRESS_DETAILS" created with compilation warnings
    ORA-39082: Object type TRIGGER:"schema_name"."TRNTRG_ADDRESS_DETAILS" created with compilation warnings
    ORA-39082: Object type TRIGGER:"schema_name"."TRNTRG_T_APPLICANT_EMPLOYMENT" created with compilation warnings
    ORA-39082: Object type TRIGGER:"schema_name"."TRNTRG_T_APPLICANT_EMPLOYMENT" created with compilation warnings
    ORA-39082: Object type TRIGGER:"schema_name"."TRNTRG_T_APPLICANT_LIABILITIES" created with compilation warnings
    ORA-39082: Object type TRIGGER:"schema_name"."TRNTRG_T_APPLICANT_LIABILITIES" created with compilation warnings
    ORA-39082: Object type TRIGGER:"schema_name"."TRNTRG_T_SUBPRODUCT" created with compilation warnings
    ORA-39082: Object type TRIGGER:"schema_name"."TRNTRG_T_SUBPRODUCT" created with compilation warnings
    ORA-39082: Object type TRIGGER:"schema_name"."TRNTRG_T_ORG_APPLICANT_DETAILS" created with compilation warnings
    ORA-39082: Object type TRIGGER:"schema_name"."TRNTRG_T_ORG_APPLICANT_DETAILS" created with compilation warnings
    Processing object type SCHEMA_EXPORT/JOB
Job "SYSTEM"."SYS_IMPORT_FULL_01" completed with 1687 error(s) at 17:32:05
Once the import is done, I recreate the synonyms, give the required grants, and then I get around 1750 invalid objects in the DB; when I run UTLRP to compile the objects, the invalid count is reduced to 89.
    BR
    Edited by: user12045405 on 11 Jul, 2012 5:28 AM
    Edited by: user12045405 on 11 Jul, 2012 5:29 AM

  • Full db backup using expdp

    Hi All,
My database version is 10.2.0.2 and I want to take a full database backup using expdp.
    This is the parfile I use.
    userid=system/manager
    full=y
    directory=exp_dmp
    parallel=4
The database size is 45 GB and the expdp dump file comes to around 31 GB. What can be done to compress the expdp dump file?
Or are there some other parameters I need to use?
    thanks

I agree that expdp is a logical backup.
In 10g, COMPRESSION=METADATA_ONLY is the only option that can be used, so that does not help compress the backup data.
Due to size restrictions, I want the dump file to be compressed while the backup is running; how can I achieve this?

  • Expdp   include with   dbms_datapump API

    Hi,
I am trying to translate the following command to use with the DBMS_DATAPUMP API, but I am having difficulty writing the correct syntax for the INCLUDE=SCHEMA:\"=\'myschema\'\" part.
    expdp dumpfile=mydirectoryp:mydumpfile.dp INCLUDE=SCHEMA:\"=\'myschema\'\" logfile=mydirectoryg:mylogfile.log full=y
    Could someone help me with this ?
    Thank you
    JP

    Hi,
    I'm working with JP on that...
We know that it will produce the same result, but it was proposed in an Oracle SR as a workaround to bug # 7362589 (as listed in Note 1253955.1) and it does the job.
I tried this solution, but the result is not the same: the log files do not contain the same output and the dump files do not have the same size.
Solution 1, command line:
    expdp dumpfile=dbadev_dp:exp_TSTBASEEXPORTIMPORT_1.dp INCLUDE=SCHEMA:\"=\'TSTBASEEXPORTIMPORT\'\" logfile=dbadev_log:exp_TSTBASEEXPORTIMPORT_log.log full=y
    LogFile:
    [oracle@qcdvcn1001-dev711 dbadev]$ cat exp_TSTBASEEXPORTIMPORT_log.log
    Export: Release 10.2.0.4.0 - 64bit Production on Tuesday, 19 April, 2011 9:33:08
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "DBADEV"."SYS_EXPORT_FULL_01": dbadev/******** dumpfile=dbadev_dp:exp_TSTBASEEXPORTIMPORT_1.dp INCLUDE=SCHEMA:"='TSTBASEEXPORTIMPORT'" logfile=dbadev_log:exp_TSTBASEEXPORTIMPORT_log.log full=y
    Estimate in progress using BLOCKS method...
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 256 KB
    Processing object type DATABASE_EXPORT/SCHEMA/USER
    Processing object type DATABASE_EXPORT/SCHEMA/GRANT/SYSTEM_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/ROLE_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/DEFAULT_ROLE
    Processing object type DATABASE_EXPORT/SCHEMA/PROCACT_SCHEMA
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE
    Processing object type DATABASE_EXPORT/SCHEMA/PROCEDURE/PROCEDURE
    Processing object type DATABASE_EXPORT/SCHEMA/PROCEDURE/ALTER_PROCEDURE
    . . exported "TSTBASEEXPORTIMPORT"."TESTA" 4.929 KB 1 rows
    . . exported "TSTBASEEXPORTIMPORT"."TESTB" 0 KB 0 rows
    Master table "DBADEV"."SYS_EXPORT_FULL_01" successfully loaded/unloaded
    Dump file set for DBADEV.SYS_EXPORT_FULL_01 is:
    /mnt/e3be11/oracle/dump/dbadev/exp_TSTBASEEXPORTIMPORT_1.dp
    Job "DBADEV"."SYS_EXPORT_FULL_01" successfully completed at 09:51:38
    Dump file:
    -rw-rw---- 1 oracle dba *229376* Apr 19 09:51 exp_TSTBASEEXPORTIMPORT_1.dp
Solution 2, dbms_datapump:
    lv_handle := dbms_datapump.open(operation => 'EXPORT',job_mode => 'FULL',job_name => lv_jobName,version => 'LATEST');
    dbms_datapump.metadata_filter(handle => lv_handle,name => 'NAME_EXPR', value => '='''||upper(pin_request.sourceUser)||'''',object_type => 'SCHEMA');
    LogFile:
    [oracle@qcdvcn1001-dev711 dbadev]$ cat exp_dev711_TSTBASEEXPORTIMPORT_201104190956.log
    Starting "DBADEV"."EXP_TSTBASEEXPORTIMPORT":
    Estimate in progress using BLOCKS method...
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 256 KB
    Processing object type DATABASE_EXPORT/TABLESPACE
    Processing object type DATABASE_EXPORT/SYS_USER/USER
    Processing object type DATABASE_EXPORT/SCHEMA/USER
    Processing object type DATABASE_EXPORT/ROLE
    Processing object type DATABASE_EXPORT/GRANT/SYSTEM_GRANT/PROC_SYSTEM_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/GRANT/SYSTEM_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/ROLE_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/DEFAULT_ROLE
    Processing object type DATABASE_EXPORT/RESOURCE_COST
    Processing object type DATABASE_EXPORT/TRUSTED_DB_LINK
    Processing object type DATABASE_EXPORT/DIRECTORY/DIRECTORY
    Processing object type DATABASE_EXPORT/DIRECTORY/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/DIRECTORY/GRANT/CROSS_SCHEMA/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/CONTEXT
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/PRE_SYSTEM_ACTIONS/PROCACT_SYSTEM
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/PROCOBJ
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/POST_SYSTEM_ACTIONS/PROCACT_SYSTEM
    Processing object type DATABASE_EXPORT/SCHEMA/PROCACT_SCHEMA
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE
    Processing object type DATABASE_EXPORT/SCHEMA/PROCEDURE/PROCEDURE
    Processing object type DATABASE_EXPORT/SCHEMA/PROCEDURE/ALTER_PROCEDURE
    . . exported "TSTBASEEXPORTIMPORT"."TESTA" 4.929 KB 1 rows
    . . exported "TSTBASEEXPORTIMPORT"."TESTB" 0 KB 0 rows
    Master table "DBADEV"."EXP_TSTBASEEXPORTIMPORT" successfully loaded/unloaded
    Dump file set for DBADEV.EXP_TSTBASEEXPORTIMPORT is:
    /mnt/e3be11/oracle/dump/dbadev/tstbaseexportimport_201104190956_01.dp
    Job "DBADEV"."EXP_TSTBASEEXPORTIMPORT" successfully completed at 10:16:28
    Dump file:
    -rw-rw---- 1 oracle dba *405504* Apr 19 10:16 tstbaseexportimport_201104190956_01.dp
    Do we need to exclude the difference (the bold sections of the second log file) with a filter of type EXCLUDE_PATH_EXPR?
    thanks for your help!
    jonathan
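    For reference, the dbms_datapump calls quoted above can be assembled into a complete anonymous block along the following lines. This is only a sketch and has not been run here; the DATA_PUMP_DIR directory object, the file names, and the hard-coded schema name stand in for site-specific values:

    ```sql
    DECLARE
      lv_handle NUMBER;
    BEGIN
      -- Full-mode export job, as in the snippet above
      lv_handle := dbms_datapump.open(operation => 'EXPORT',
                                      job_mode  => 'FULL',
                                      job_name  => 'EXP_TSTBASEEXPORTIMPORT',
                                      version   => 'LATEST');

      -- Dump and log files (DATA_PUMP_DIR is an assumed DIRECTORY object)
      dbms_datapump.add_file(handle    => lv_handle,
                             filename  => 'tstbaseexportimport_%U.dp',
                             directory => 'DATA_PUMP_DIR',
                             filetype  => dbms_datapump.ku$_file_type_dump_file);
      dbms_datapump.add_file(handle    => lv_handle,
                             filename  => 'exp_tstbaseexportimport.log',
                             directory => 'DATA_PUMP_DIR',
                             filetype  => dbms_datapump.ku$_file_type_log_file);

      -- Restrict the full-mode job to a single schema, as in the original filter
      dbms_datapump.metadata_filter(handle      => lv_handle,
                                    name        => 'NAME_EXPR',
                                    value       => '= ''TSTBASEEXPORTIMPORT''',
                                    object_type => 'SCHEMA');

      dbms_datapump.start_job(lv_handle);
      dbms_datapump.detach(lv_handle);
    END;
    /
    ```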

  • How to do a full export and exclude user specific tables

    Hi all,
    Is it possible to export all objects but one (or more) tables with expdp when doing a full export?
    This is what I have in mind (parfile):
    directory=expdp
    dumpfile=full.dmp
    logfile=full.elog
    full=y
    exclude=table:"IN('userA.table_A')"
    This par-file does not raise any error messages, but it doesn't exclude table_A in schema userA either.
    I guess that's because the term userA is not interpreted as an owner.
    It's not a big deal to filter out a table when doing a schema export.
    I wonder whether it is at all possible to exclude a table from a full export. If so, I'd value your suggestions.
    Regards,
    Louis

    Louis,
    The exclude parameter in DataPump takes only the object name, not the schema name. So, if you only have one table called 'TableA', then you can just use:
    exclude=table:"IN('table_A')"
    If you have more than one table called 'table_A', then the exclude will exclude all of them. There is no way to specify user_1.table_a with an exclude parameter.
    Dean
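    A minimal parfile illustrating the name-only exclude described above might look like this (the directory and file names are assumptions, reused from the original parfile). One caveat worth hedging: object names are normally stored in uppercase in the data dictionary and the name match is case-sensitive, so 'TABLE_A' is usually what matches rather than 'table_A':

    ```
    directory=expdp
    dumpfile=full.dmp
    logfile=full.elog
    full=y
    # Excludes every table named TABLE_A, in any schema
    exclude=TABLE:"IN ('TABLE_A')"
    ```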

  • Unable to export a table

    Hi,
    I am trying to export a table using Datapump export utility but the operation is failing. Firstly, this is the table I am trying to export:
    SQL> select owner, table_name from dba_tables where table_name='DEMO1';
    OWNER TABLE_NAME
    SYS DEMO1
    SQL> select * from DEMO1;
    ID
    1
    2
    3
    This is how the expdp command is run:
    $ expdp DUMPFILE=demo1.dmp TABLES=DEMO1
    Export: Release 11.1.0.6.0 - Production on Wednesday, 05 November, 2008 12:37:28
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Username: sys as sysdba
    Password:
    Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYS"."SYS_EXPORT_TABLE_01": sys/******** AS SYSDBA DUMPFILE=demo1.dmp TABLES=DEMO1
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 0 KB
    ORA-39166: Object DEMO1 was not found.
    ORA-31655: no data or metadata objects selected for job
    Job "SYS"."SYS_EXPORT_TABLE_01" completed with 2 error(s) at 12:37:37
    I have searched for the error ORA-39166 but most of the results talk about either missing table or lack of privileges. I don't think either is the cause of the problem here. Any help is appreciated.
    Thanks,
    Raghu

    sql prompt > host expdp sys/password directory=dir1 dumpfile=test1234.dmp tables=<schema_name>.tablename;
    I get the same failure as before. Please note that I passed "SYS" explicitly as the schema this time.
    SQL> host expdp DUMPFILE=demo1.dmp TABLES=SYS.DEMO1;
    Export: Release 11.1.0.6.0 - Production on Wednesday, 05 November, 2008 13:58:06
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Username: sys as sysdba
    Password: ********
    Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "SYS"."SYS_EXPORT_TABLE_01": sys/******** AS SYSDBA DUMPFILE=demo1.dmp TABLES=SYS.DEMO1
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 0 KB
    ORA-39165: Schema SYS was not found.
    ORA-39166: Object DEMO1 was not found.
    ORA-31655: no data or metadata objects selected for job
    Job "SYS"."SYS_EXPORT_TABLE_01" completed with 3 error(s) at 13:58:15
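    A plausible explanation for the ORA-39165/ORA-39166 errors above is that Data Pump deliberately skips objects owned by SYS, so a SYS-owned table is invisible to a table-mode export even though dba_tables shows it. One workaround, sketched here with an illustrative SCOTT schema (adjust the schema and file name to your environment), is to copy the table into an ordinary schema and export from there:

    ```sql
    -- Copy the table out of SYS into an ordinary schema (SCOTT is illustrative)
    CREATE TABLE scott.demo1 AS SELECT * FROM sys.demo1;
    ```

    followed by, for example, expdp scott DUMPFILE=demo1.dmp TABLES=DEMO1.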
