10g data pump full import

Hi. I have done a full export/import of a 10g (10.2.0.3) database from Solaris to Windows 2003 using Data Pump. I compared the object counts of the source and destination databases and found a huge difference in their total object counts (dba_objects). What is the reason behind this difference? The database seems to be OK, and the important application schemas all have similar row counts. It is due for testing tomorrow anyway. Another question I need to ask is about the dumpfile: what is the max size of dump file that can be opened in an editor? I can't seem to open the Data Pump dmp file. Thanks in advance for your responses.

I have made object counts of both source and destination databases and found a huge difference in their total object counts (dba_objects). What is the reason behind this difference?
Perhaps you should compare COUNT(*) from dba_objects on a per-schema basis.
What is the max size of dump file which can be opened in an editor?
You'll corrupt the file if you ever open it in an editor.
I can't seem to open the data pump dmp file.
You were saved from shooting yourself in the foot.
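A minimal sketch of that per-schema comparison (run the same query on source and target and diff the output; assumes SELECT access to DBA_OBJECTS):
SELECT owner, object_type, COUNT(*) AS cnt
FROM dba_objects
GROUP BY owner, object_type
ORDER BY owner, object_type;
Differences in SYS, SYSTEM and other dictionary schemas are expected after a full import; it is the application schemas that should match.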

Similar Messages

  • Oracle 10g - Data Pump: Export / Import of Sequences ?

    Hello,
I'm new to this forum and also to Oracle (version 10g). Since I could not find an answer to my question, I am opening this post hoping to get some help from the experienced users.
    My question concerns the Data Pump Utility and what happens to sequences which were defined in the source database:
    I have exported a schema with the following command:
    "expdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp LOGFILE=logfile.log"
    This worked fine and also the import seemed to work fine with the command:
    "impdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp"
    It loaded the exported objects directly into the schema of the target database.
    BUT:
    Something has happened to my sequences. :-(
When I want to use them, all sequences start again with value "1". Since I have already inserted data with higher values in my tables, I get into trouble with the PKs of these tables, because I sometimes used sequences as primary keys.
My questions are:
1. Did I do something wrong with the Data Pump utility?
2. What is the correct way to export and import sequences so that they keep their current values?
3. If the behaviour described here is correct, how can I reset the sequences so they continue from the last value used in the source database?
    Thanks a lot in advance for any help concerning this topic!
    Best regards
    FireFighter
    P.S.
My English may not sound perfect since it is not my native language. Sorry for that! ;-)
But I hope that someone can understand it nevertheless. ;-)

My questions are:
1. Did I do something wrong with the Data Pump utility?
I do not think so. But maybe with the existing schema :-(
2. What is the correct way to export and import sequences so that they keep their current values?
If a sequence exists in the target before the import, Oracle does not drop and recreate it. So you need to ensure that the sequences do not already exist in the target, or that the existing ones are dropped before the import.
3. If the behaviour described here is correct, how can I reset the sequences so they continue from the last value used in the source database?
You can either re-run the import after the above correction, or drop and manually recreate the sequences to START WITH the next value of the source sequences.
The easier way is to generate a script from the source, if you know how to do it.
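A sketch of generating such a script from the source (assumes SELECT on DBA_SEQUENCES; LAST_NUMBER already accounts for cached values, so starting just above it is safe):
SELECT 'CREATE SEQUENCE ' || sequence_owner || '.' || sequence_name ||
       ' START WITH ' || (last_number + increment_by) ||
       ' INCREMENT BY ' || increment_by || ';'
FROM dba_sequences
WHERE sequence_owner = 'HR';
Spool the output, drop the imported sequences in the target, and run the generated DDL there.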

  • 10G data pumps export / import

Windows XP Pro SP2.
When I run expdp from the command line I get this:
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
    With the Partitioning, OLAP and Data Mining options
    ORA-39002: invalid operation
    ORA-39070: Unable to open the log file.
    ORA-39087: directory name DATA_PUMP_DIR is invalid
What am I missing?
    Eric

Maybe you've missed some parameters? expdp always needs some info on what you want to export, e.g.:
    expdp hr/hr TABLES=employees,jobs DUMPFILE=dpump_dir1:table.dmp NOLOGFILE=y
    or
    expdp hr/hr PARFILE=exp.par
    with parfile contents being:
    DIRECTORY=dpump_dir1
    DUMPFILE=dataonly.dmp
    CONTENT=DATA_ONLY
    EXCLUDE=TABLE:"IN ('COUNTRIES', 'LOCATIONS', 'REGIONS')"
    QUERY=employees:"WHERE department_id !=50 ORDER BY employee_id"
    These examples are from the documentation on Utilities, so you can look up all the possible parameters there.
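That said, ORA-39087 usually means the directory object does not exist in the database or the connecting user has no privileges on it. A sketch of setting one up as a DBA (the OS path is a placeholder and must already exist on the server):
CREATE OR REPLACE DIRECTORY data_pump_dir AS 'C:\dpump';
GRANT READ, WRITE ON DIRECTORY data_pump_dir TO hr;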

  • Oracle data pump vs import/export utility

    Hello all,
What is the difference between Oracle Data Pump and the Import/Export utility? Which one is faster?

Handle: user9362044
Status Level: Newbie
Registered: Jan 26, 2011
Total Posts: 31
Total Questions: 11 (7 unresolved)
So many questions & so few answers.
What is the difference between Oracle Data Pump and Import/Export utility?
Unwilling or unable to Read The Fine Manual yourself?
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/toc.htm

  • Best Practice for data pump or import process?

We are trying to copy an existing schema to another, newly created schema. We used Data Pump export to successfully export the schema.
However, we encountered some errors when importing the dump file into the new schema. We remapped the schema and tablespaces, etc.
    Most errors occur in PL/SQL... For example, we have views like below in original schema:
    CREATE VIEW *oldschema.myview* AS
    SELECT col1, col2, col3
    FROM *oldschema.mytable*
WHERE col1 = 10
Quite a few functions, procedures, packages and triggers contain "*oldschema.mytable*" in DML (insert, select, update) statements, for example.
    Getting the following errors in import log:
    ORA-39082: Object type ALTER_FUNCTION:"TEST"."MYFUNCTION" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TEST"."MYPROCEDURE" created with compilation warnings
    ORA-39082: Object type VIEW:"TEST"."MYVIEW" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"TEST"."MYPACKAGE" created with compilation warnings
    ORA-39082: Object type TRIGGER:"TEST"."MYTRIGGER" created with compilation warnings
    A lot of actual errors/invalid objects in new schema are due to:
    ORA-00942: table or view does not exist
My questions are:
    1. What can we do to fix those errors?
    2. Is there a better way to do the import with such condition?
    3. Update PL/SQL and recompile in new schema? Or update in original schema first and export?
    Your help will be greatly appreciated!
    Thank you!

I routinely get many (MANY) errors like the following, and they always compile when I recompile using utlrp.
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."RPTSF_WR_LASTOUTPUNCH" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."RPTSF_WR_REFPERIODENDFOREMP" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."RPTSF_WR_TAILOFFSECS" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."FN_GDAPREPORTGATHERER" created with compilation warnings
    Processing object type DATABASE_EXPORT/SCHEMA/PROCEDURE/ALTER_PROCEDURE
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ABSENT_EXCEPTION" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACCRUAL_BAL_PROJ" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACCRUAL_DETAILS" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACCRUAL_SUMMARY" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACTUAL_SCHEDULE" created with compilation warnings
It works in all my databases: PeopleSoft, Kronos, and others...
I should qualify that it may still be necessary to debug specific problems, but the most common ones are easily resolved by running utlrp.sql. The usual problems I run into are typically caused by a database link pointing to another database, such as a production database that we firewall our test and development databases from linking to (for obvious reasons).
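For reference, a typical post-import recompile pass, run as a DBA (the @? shorthand expands to ORACLE_HOME in SQL*Plus):
SQL> @?/rdbms/admin/utlrp.sql
SQL> SELECT owner, object_type, object_name FROM dba_objects WHERE status = 'INVALID';
Anything still invalid after utlrp.sql is worth debugging individually (missing grants, broken database links, and so on).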

  • Pre Checks before running Data Pump Export/Import

    Hi,
    Oracle :-11.2
    OS:- Windows
Kindly share the pre-checks a DBA should perform before a Data Pump export and import.
    Thanks

    When you do a tablespace mode export, Data Pump is essentially doing a table mode export of all of the tables in the tablespaces mentioned.  So if you have this:
tablespace a contains: table 1, table 2, index 3a (on table 3)
tablespace b contains: index 1a (on table 1), index 2a (on table 2), table 3
    and if you expdp tablespaces=a ...
    you will get table 1, table 2, index 1a, and index 2a.
    My belief is that you will not get table 3 or index 3a.  The way I understand the code to work is that you get the tables in the tablespaces you mention and their dependent objects, but not the other way around.  You could easily verify this to make sure.
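One way to verify is to list the segments per tablespace on the source before exporting (a sketch; the tablespace name is a placeholder):
SELECT segment_name, segment_type
FROM dba_segments
WHERE tablespace_name = 'A';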
    Dean

  • 10G data pumps / sql loader

When data is moved by Data Pump / SQL*Loader,
are the changes entered into the redo log?
Thanks, Eric

Where can I find Oracle client for 10g (or earlier)?
Look at the upper left corner of this page (there is a red image "Download Oracle Software")...
Is SQL*Loader part of the Oracle client?
Yes it is.

  • 11gR2 Data Pump. Import table in one schema into a different schema

I am studying Oracle. I exported schema HR into the file hrexport.dmp. When I import tables from this file I run into trouble. I used Enterprise Manager:
1. connected as user SYSTEM as NORMAL
2. selected the file, chose import type - tables
3. data from the file was imported
4. chose which tables to import
5. in the next step I try to insert a row in the Re-Map Schemas table and edit the Destination Schema cell, but the list contains only one schema name - HR! Why?

Paul M. wrote:
Did you try using the impdp command at the OS prompt?
I don't know how to use this command. I tried:
    impdp system/oracle remap_schema=hr:inventory tables=employees, departments, locations directory=ORACLES_HOME dumpfile=hrexport.dmp logfile=hrimport.log
    But it finished with error:
    ORA-39166: Object SYSTEM.EMPLOYEES was not found.
    ORA-39166: Object SYSTEM.DEPARTMENTS was not found.
    ORA-39166: Object SYSTEM.LOCATIONS was not found.
    Job "SYSTEM"."SYS_IMPORT_TABLE_01" successfully completed at 18:43:19
    Job "SYSTEM"."SYS_IMPORT_TABLE_01" successfully completed at 18:43:19
Why SYSTEM? I want to import into INVENTORY from HR - as contained in the file hrexport.dmp.
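For what it's worth, in table-mode import an unqualified table name is resolved against the importing user's schema (SYSTEM here), which is what produces the ORA-39166 errors. A sketch with schema-qualified names (same connection and files as above):
impdp system/oracle remap_schema=hr:inventory tables=hr.employees,hr.departments,hr.locations directory=ORACLES_HOME dumpfile=hrexport.dmp logfile=hrimport.log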

  • Data pump error ORA-39065, status undefined after restart

    Hi members,
The Data Pump full import job hung, continue_client also hung, and all of a sudden the window exited.
    ;;; Import> status
    ;;; Import> help
    ;;; Import> status
    ;;; Import> continue_client
    ORA-39065: unexpected master process exception in RECEIVE
    ORA-39078: unable to dequeue message for agent MCP from queue "KUPC$C_1_20090923181336"
    Job "SYSTEM"."SYS_IMPORT_FULL_01" stopped due to fatal error at 18:48:03
I increased the shared_pool to 100M and then restarted the job with attach=jobname. After restarting, I queried the status and found that everything is undefined. It still says undefined now, and the last log message says the job has been reopened. That's the end of the log file; nothing else is being recorded. I am not sure what is happening now. Any ideas will be appreciated. This is version 10.2.0.3 on Windows. Thanks ...
    Job SYS_IMPORT_FULL_01 has been reopened at Wednesday, 23 September, 2009 18:54
    Import> status
    Job: SYS_IMPORT_FULL_01
    Operation: IMPORT
    Mode: FULL
    State: IDLING
    Bytes Processed: 3,139,231,552
    Percent Done: 33
    Current Parallelism: 8
    Job Error Count: 0
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest%u.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest01.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest02.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest03.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest04.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest05.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest06.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest07.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest08.dmp
    Worker 1 Status:
    State: UNDEFINED
    Worker 2 Status:
    State: UNDEFINED
    Object Schema: trm
    Object Name: EVENT_DOCUMENT
    Object Type: DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
    Completed Objects: 1
    Completed Rows: 78,026
    Completed Bytes: 4,752,331,264
    Percent Done: 100
    Worker Parallelism: 1
    Worker 3 Status:
    State: UNDEFINED
    Worker 4 Status:
    State: UNDEFINED
    Worker 5 Status:
    State: UNDEFINED
    Worker 6 Status:
    State: UNDEFINED
    Worker 7 Status:
    State: UNDEFINED
    Worker 8 Status:
    State: UNDEFINED

    39065, 00000, "unexpected master process exception in %s"
    // *Cause:  An unhandled exception was detected internally within the master
    //          control process for the Data Pump job.  This is an internal error.
    //          messages will detail the problems.
    // *Action: If problem persists, contact Oracle Customer Support.
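For reference, the usual way to attach to and restart a stopped job from the command line (a sketch; the password is a placeholder, the job name is the one reported in the log):
impdp system/password attach=SYS_IMPORT_FULL_01
Import> start_job
Import> status
If the restarted job stays in IDLING with UNDEFINED workers, stopping it cleanly with stop_job=immediate and attaching again is a common next step before involving Support.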

  • How to consolidate data files using data pump when migrating 10g to 11g?

We have one 10.2.0.4 database to be migrated to a new box running 11.2.0.1. The 10g database has too many data files scattered across too many file systems. I'd like to consolidate the data files into one or two large chunks in one file system. Both OSs are RHEL 5. How should I do that using Data Pump Export/Import? I know there is a "REMAP_DATAFILE" option, but it is only one-to-one mapping. How can I map multiple old data files into one new data file?
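One approach (a sketch; names, paths and sizes are placeholders): Data Pump DDL references tablespace names, not data files, so you can pre-create the tablespaces on the 11g box with whatever file layout you want and then import into them:
CREATE TABLESPACE users_data
  DATAFILE '/u01/oradata/NEWDB/users_data01.dbf' SIZE 20G AUTOEXTEND ON NEXT 1G;
impdp system/password FULL=Y DIRECTORY=dpump_dir DUMPFILE=full%U.dmp LOGFILE=full_imp.log
The import will report that the tablespaces already exist and then load the objects into them, so no one-to-many datafile remapping is needed.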

hi,
datapump is terribly slow; make sure you have as much memory as possible allocated for Oracle, but the bottleneck can be I/O throughput.
Use the PARALLEL option, and also set these:
    * DISK_ASYNCH_IO=TRUE
    * DB_BLOCK_CHECKING=FALSE
    * DB_BLOCK_CHECKSUM=FALSE
and set these high enough to allow for maximum parallelism:
    * PROCESSES
    * SESSIONS
    * PARALLEL_MAX_SERVERS
    more:
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/dp_perf.htm
    that's it, patience welcome ;-)
    P.S.
    For maximum throughput, do not set PARALLEL to much more than twice the number of CPUs (two workers for each CPU).
    P.S.2
breaking news ;-)
I am playing with storage performance now, and I turned ON the disk cache option (also called write-back cache; it goes along with at least RAID 0 and RAID 5, and enabling it does not lose any data on that volume) - and it gave me a 1.5 to 2 times speed-up!
Some say there is a risk of losing more data when an outage happens, but there is always such a risk, even if you can lose less. Anyway, if you can afford it (and with an import it's OK, as it is not production at that moment) - I recommend trying it. It takes 15 minutes, but you can gain 2.5 hours out of 10 hours of normal importing.

  • Error Data Pump Import

    Hi All,
I am getting the errors below when trying to use Data Pump to import one table from a dump file (taken from a pre-prod DB) into another database (prod DB) of the same version. I used expdp to generate the dump file.
Getting errors -
    ORA-39083
    ORA-00959
    ORA-39112
    Any suggestions and/or advice would be appreciated.
    Thank you
    Import: Release 10.2.0.1.0 - 64bit Production
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
    Master table "SYSTEM"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_TABLE_01": system/******** DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    ORA-39083: Object type TABLE failed to create with error:
    ORA-00959: tablespace 'OXFORD_DATA_01' does not exist
    Failing sql is:
    CREATE TABLE "Mxxxx"."CLAIMS" ("CLAIM_NUM" VARCHAR2(25) NOT NULL ENABLE, "LINE_NUM" NUMBER(3,0) NOT NULL ENABLE, "PAID_FLAG" CHAR(1), "ADJ_FLAG" CHAR(1), "CLAIM_TYPE" VARCHAR2(20), "MEM_ID" VARCHAR2(20), "MEM_SUF" VARCHAR2(20), "MEM_BEG_DATE" DATE, "MEM_REL" CHAR(2), "MEM_NAME" VARCHAR2(40), "MEM_DOB" DATE, "MEM_SEX" CHAR(1), "REC_DATE" DATE, "PAID_DATE" DATE, "FROM_DATE" DATE, "
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    ORA-39112: Dependent object type INDEX:"Mxxxx"."CLAIMS_IDX1" skipped, base object type TABLE:"Mxxxx"."CLAIMS" creation failed
    ORA-39112: Dependent object type INDEX:"Mxxxx"."CLAIMS_PK" skipped, base object type TABLE:"Mxxxx"."CLAIMS" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_IDX1" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_PK" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"Mxxxx"."CLAIMS" creation failed
    Job "SYSTEM"."SYS_IMPORT_TABLE_01" completed with 6 error(s) at 13:49:33
    impdp username/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS

Ok,
Thank you for the advice -
So I used the REMAP_TABLESPACE option and it did import and load the file, but should I be concerned about the 4 errors? Looks like I have to rebuild the indexes, which is fine.
    Master table "Mxxxx"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
    Starting "Mxxxx"."SYS_IMPORT_TABLE_01": Mxxxx/******** DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS REMAP_TABLESPACE=OXFORD_DATA_01:Mxxxx_DATA_01
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "Mxxxx"."CLAIMS" 605.3 MB 1715554 rows
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    ORA-39083: Object type INDEX failed to create with error:
    ORA-00959: tablespace 'USER_INDEX_01' does not exist
    Failing sql is:
    CREATE INDEX "Mxxxx"."CLAIMS_IDX1" ON "Mxxxx"."CLAIMS" ("MEM_ID") PCTFREE 10 INITRANS 2 MAXTRANS 255 NOLOGGING STORAGE(INITIAL 8388608 NEXT 8388608 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "USER_INDEX_01" PARALLEL 1
    ORA-39083: Object type INDEX failed to create with error:
    ORA-00959: tablespace 'USER_INDEX_01' does not exist
    Failing sql is:
    CREATE UNIQUE INDEX "Mxxxx"."CLAIMS_PK" ON "Mxxxx"."CLAIMS" ("CLAIM_NUM", "LINE_NUM") PCTFREE 10 INITRANS 2 MAXTRANS 255 NOLOGGING STORAGE(INITIAL 8388608 NEXT 8388608 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "USER_INDEX_01" PARALLEL 1
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_IDX1" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"Mxxxx"."CLAIMS_PK" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "Mxxxx"."SYS_IMPORT_TABLE_01" completed with 4 error(s) at 16:57:12

  • How to export resource manager consumer groups using Data Pump?

Hi there,
Is there any way to export RM consumer groups/mappings/plans as part of a Data Pump export/import? I ask because I don't fancy doing it manually, and I don't see the object type in the database_export_objects view. I can create them manually, but I was wondering whether there's an easier, less involved way of doing it?
    Mark

    Hi,
I have not tested it, but I think a full DB export/import (using Data Pump or traditional exp/imp) may do this (a full exp/imp might not be feasible for you, though), because full database mode exports/imports SYS schema objects as well, so there is a chance that it will also bring over the resource groups and resource plans.
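If you do end up scripting it manually, the DBMS_RESOURCE_MANAGER package is the usual route; a minimal sketch (group, plan and percentage values are placeholders):
BEGIN
  DBMS_RESOURCE_MANAGER.CREATE_PENDING_AREA;
  DBMS_RESOURCE_MANAGER.CREATE_CONSUMER_GROUP(
    consumer_group => 'MY_GROUP', comment => 'example group');
  DBMS_RESOURCE_MANAGER.CREATE_PLAN(plan => 'MY_PLAN', comment => 'example plan');
  DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(
    plan => 'MY_PLAN', group_or_subplan => 'MY_GROUP',
    comment => 'example directive', mgmt_p1 => 60);
  -- every plan must also cover OTHER_GROUPS, or validation fails
  DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(
    plan => 'MY_PLAN', group_or_subplan => 'OTHER_GROUPS',
    comment => 'catch-all directive', mgmt_p1 => 40);
  DBMS_RESOURCE_MANAGER.VALIDATE_PENDING_AREA;
  DBMS_RESOURCE_MANAGER.SUBMIT_PENDING_AREA;
END;
/
Mappings are set separately with DBMS_RESOURCE_MANAGER.SET_CONSUMER_GROUP_MAPPING.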
    Salman

  • Question for experts/ wrapping - data pump

    Hello All,
I am using ORACLE 10.2.0.4.0.
I have an issue while exporting / importing wrapped procedures:
I have some procedures / packages that are wrapped, and when I export their schema and then import it into the database, I get the error below:
    Compilation errors for PROCEDURE PROC_NAME
    Error: PLS-00753: malformed or corrupted wrapped unit
    Line: 1
    Text: CREATE OR REPLACE PROCEDURE "PROC_NAME" wrapped
After running many tests on this scenario I noted the following:
The error does not appear on all procedures; some are corrupted after the import and some are not.
After doing the normal (old-style) export/import, I found that the character set used by imp/exp was different from that of the database, so I changed the database character set to match the one used by the import. I redid my tests and found this solved the issue for normal imp/exp: the recompilation of procedures no longer fails after the old-style import. But at this stage the imp/exp character set has changed again.
The character set of my DB was AR8MSWIN1256 while exp/imp used AR8ISO8859P6, so I changed the database to AR8ISO8859P6. After that, recompilation worked fine for normal exp/imp, but now the character set for import/export has changed to AR8MSWIN1256!! Why?
The Data Pump export/import problem was not solved even after changing the database character set: the procedures were imported with errors, and it was only fixed by recompiling them.
My issue is that I have too many clients and I cannot always change the database character set, because I may hit the error "ORA-12712: new character set must be a superset of old character set".
Hope the above description is clear.
Any suggestions or ideas to solve the issue of wrapped procedures with expdp/impdp? Can I force a certain character set for expdp/impdp, given that with release 10.2.0.1.0 expdp/impdp does not work at all with wrapped procedures?

    Any ideas?

Data pump problem syncing data from Windows to Linux

    Hi,
I have a production DB on a Windows 2003 server, and I sync data from the production Windows machine to a Linux server using Data Pump exports and imports.
I have written a procedure for the schema-level export,
and
a procedure for the import like this:
dbms_datapump.open(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => 'IMP');
dbms_datapump.add_file(handle => h, filename => 'imp.dmp', directory => 'DIR_DATAPUMP', filetype => dbms_datapump.ku$_file_type_dump_file);
dbms_datapump.metadata_filter(handle => h, name => 'SCHEMA_EXPR', value => 'IN (''HR'',''OE'')');
dbms_datapump.set_parameter(handle => h, name => 'TABLE_EXISTS_ACTION', value => 'REPLACE');
When I run the above procedure, I get an error that the table already exists (ORA-31684),
but with
impdp include=table table_exists_action=replace
I am able to do it.
Can you suggest what I am missing?
    Regards
    Ajay
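A runnable sketch of the same import job for comparison (assumes the directory object DIR_DATAPUMP exists and the caller has the IMP_FULL_DATABASE role; names are taken from the post):
DECLARE
  h     NUMBER;
  state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => 'IMP');
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'imp.dmp',
                         directory => 'DIR_DATAPUMP',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR',
                                value => 'IN (''HR'',''OE'')');
  -- without this, the default action is SKIP and existing tables log ORA-31684
  DBMS_DATAPUMP.SET_PARAMETER(handle => h, name => 'TABLE_EXISTS_ACTION',
                              value => 'REPLACE');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(handle => h, job_state => state);
END;
/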

I will try to explain a little better using screenshots. This screenshot is from Windows:
http://img510.imageshack.us/img510/1714/windowspg.jpg
At the top it says 'date taken', and ANY Windows photo library software will sort by this date. Below are the properties of the same file after it is copied to OS X:
http://img153.imageshack.us/img153/6345/screenshot20100322at145.png
It contains lots of information about the picture, but not when it was taken. So software such as iPhoto or Picasa sorts it by its created date (which you can see at the top is Monday 12th January).
Why isn't OS X carrying over the EXIF data on when the photo was taken? I can't see any iPhoto options to sort by that date.

Data Pump and SGA / system memory on Windows 2003

    Hi Experts,
I have a question about Oracle 10g Data Pump. Based on the Oracle documentation,
all Data Pump ".dmp" and ".log" files are created on the Oracle server, not the client machine.
Does that mean Data Pump needs to use the Oracle server's SGA or other system memory? Is that true?
Or must Data Pump be supported by Oracle server memory?
We use Oracle 10gR4 on 32-bit Windows 2003 and have a memory issue, so we have to take care about this point.
At present we use exp to do the export job. Our DB size is 280 GB; the SGA is 2 GB on Windows.
For testing, I could see Data Pump messages in the alert file. I did not see an export job break the DB or the data replication job.
Does any expert have experience with this?
Thanks,
JIM

    Hi Jim,
user589812 wrote:
I have a question about Oracle 10g Data Pump. Based on the Oracle documentation, all Data Pump ".dmp" and ".log" files are created on the Oracle server, not the client machine.
Yes, they are, but you can always point the directory at a shared location on another server.
Does that mean Data Pump needs to use the Oracle server's SGA or other system memory? Or must Data Pump be supported by Oracle server memory?
Either way, the SGA is used for a conventional export. You can reduce the overhead to some extent with a direct-path export (direct=y), but the server's resources will still be in use.
We use Oracle 10gR4 on 32-bit Windows 2003 and have a memory issue.
If you have Windows Enterprise Edition, why don't you enable PAE to use more memory, provided the server has more than 3 GB?
At present we use exp to do the export job. Our DB size is 280 GB; the SGA is 2 GB.
With respect to the size of the database, your SGA is too small.
    Hope it helps.
    Regards
    Z.K.
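For reference, the direct-path export mentioned above looks like this (a sketch; credentials and file names are placeholders):
exp system/password OWNER=hr FILE=hr.dmp LOG=hr.log DIRECT=Y RECORDLENGTH=65535
DIRECT=Y bypasses the SQL evaluation layer, which reduces (but does not eliminate) the load on the server.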
