10g Data Pump export / import

Windows XP Pro SP2.
When I run expdp from the command line I get this:
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-39087: directory name DATA_PUMP_DIR is invalid
What am I missing?
Eric

Maybe you've missed some parameters? expdp always needs some sort of info on what you want to export, e.g.:
expdp hr/hr TABLES=employees,jobs DUMPFILE=dpump_dir1:table.dmp NOLOGFILE=y
or
expdp hr/hr PARFILE=exp.par
with parfile contents being:
DIRECTORY=dpump_dir1
DUMPFILE=dataonly.dmp
CONTENT=DATA_ONLY
EXCLUDE=TABLE:"IN ('COUNTRIES', 'LOCATIONS', 'REGIONS')"
QUERY=employees:"WHERE department_id !=50 ORDER BY employee_id"
These examples are from the documentation on Utilities, so you can look up all the possible parameters there.
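Also note the ORA-39087 in your output: it usually means the DATA_PUMP_DIR directory object does not exist or your user cannot write to it. A minimal sketch of creating one, assuming DBA privileges (the path and grantee here are examples, not from the original post):
SQL> CREATE OR REPLACE DIRECTORY data_pump_dir AS 'C:\oracle\dpump';
SQL> GRANT READ, WRITE ON DIRECTORY data_pump_dir TO eric;
The path must exist on the database server and be writable by the Oracle OS user.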

Similar Messages

  • Oracle 10g - Data Pump: Export / Import of Sequences ?

    Hello,
    I'm new to this forum and also to Oracle (Version 10g). Since I could not find an answer to my question, I am opening this post hoping to get some help from the experienced users.
    My question concerns the Data Pump Utility and what happens to sequences which were defined in the source database:
    I have exported a schema with the following command:
    "expdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp LOGFILE=logfile.log"
    This worked fine and also the import seemed to work fine with the command:
    "impdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp"
    It loaded the exported objects directly into the schema of the target database.
    BUT:
    Something has happened to my sequences. :-(
    When I want to use them, all sequences start again with value "1". Since I have already loaded data with higher values into my tables, I get into trouble with the PKs of these tables, because I sometimes used the sequences as primary keys.
    My questions are:
    1. Did I do something wrong with the Data Pump utility?
    2. What is the correct way to export and import sequences so that they keep their current values?
    3. If the behaviour described here is correct, how can I fix the sequences so that they continue from the last value used in the source database?
    Thanks a lot in advance for any help concerning this topic!
    Best regards
    FireFighter
    P.S.
    My English may not sound perfect since it is not my native language. Sorry for that! ;-)
    But I hope someone can understand it nevertheless. ;-)

    1. Did I do something wrong with the Data Pump utility?
    I do not think so. But maybe something is wrong with the existing schema :-(
    2. What is the correct way to export and import sequences so that they keep their current values?
    If the sequences already exist in the target before the import, Oracle does not drop and recreate them. So you need to ensure that the sequences do not already exist in the target, or that the existing ones are dropped before the import.
    3. If the behaviour described here is correct, how can I fix the sequences so that they continue from the last value used in the source database?
    You can either redo the import after the above correction, or drop and manually recreate the sequences to START WITH the next value of the source sequences.
    The easier way is to generate a script from the source, if you know how to do it - see the sketch below.
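    A hedged sketch of that script generation (assumes access to dba_sequences on the source; the schema name is an example):
    -- spool this output on the source, then run it on the target
    -- after dropping the old sequences
    SELECT 'CREATE SEQUENCE ' || sequence_owner || '.' || sequence_name ||
           ' START WITH ' || (last_number + increment_by) ||
           ' INCREMENT BY ' || increment_by || ';'
    FROM dba_sequences
    WHERE sequence_owner = 'MYSCHEMA';
    last_number reflects the cached high-water mark, so starting one increment above it is safe even with sequence caching.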

  • Pre Checks before running Data Pump Export/Import

    Hi,
    Oracle: 11.2
    OS: Windows
    Kindly share the pre-checks required for data pump export and import which should be followed by a DBA.
    Thanks

    When you do a tablespace mode export, Data Pump is essentially doing a table mode export of all of the tables in the tablespaces mentioned.  So if you have this:
    tablespace a contains: table 1, table 2, index 3a (on table 3)
    tablespace b contains: index 1a (on table 1), index 2a (on table 2), table 3
    and if you expdp tablespaces=a ...
    you will get table 1, table 2, index 1a, and index 2a.
    My belief is that you will not get table 3 or index 3a.  The way I understand the code to work is that you get the tables in the tablespaces you mention and their dependent objects, but not the other way around.  You could easily verify this to make sure.
    Dean
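    One hedged way to verify this without moving any data: do a metadata-only export of the tablespace, then write the DDL to a SQLFILE and see which objects Data Pump picked up (credentials, directory object, and file names are examples):
    expdp system/password TABLESPACES=a CONTENT=METADATA_ONLY DIRECTORY=dpump_dir DUMPFILE=tbs_a.dmp LOGFILE=tbs_a.log
    impdp system/password DIRECTORY=dpump_dir DUMPFILE=tbs_a.dmp SQLFILE=tbs_a.sql
    Then search tbs_a.sql for table 3 and index 3a.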

  • 10g data pump full import

    Hi.. I have done a full export/import of a 10g (10.2.0.3) database from Solaris to Windows 2003 using Data Pump. I compared object counts on the source and destination databases and found a huge difference in their total object counts (dba_objects). What is the reason behind this difference? The database seems to be OK and the important application schemas all have similar row counts. It is due for testing tomorrow anyway. Another question I need to ask is about the dump file: what is the max size of dump file which can be opened in an editor? I can't seem to open the Data Pump dmp file. Thanks in advance for your responses.

    I have made object counts of both source and destination databases and found a huge difference in their total object counts (dba_objects). What is the reason behind this difference?
    Perhaps you should do an object COUNT(*) on a per-schema basis.
    What is the max size of dump file which can be opened in an editor?
    You'll corrupt the file if you ever open it in an editor.
    I can't seem to open the data pump dmp file.
    You were saved from shooting yourself in the foot.
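    As a sketch of that per-schema comparison (run the same query on source and target; assumes DBA access):
    SELECT owner, object_type, COUNT(*)
    FROM dba_objects
    GROUP BY owner, object_type
    ORDER BY owner, object_type;
    Diffing the two result sets usually pinpoints where the missing objects are - often SYS/SYSTEM-owned objects, which a full import does not carry over.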

  • Oracle data pump vs import/export utility

    Hello all,
    What is the difference between Oracle Data Pump and the Import/Export utility? Which one is faster?

    What is the difference between Oracle Data Pump and Import/Export utility?
    Unwilling or incapable to Read The Fine Manual yourself?
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/toc.htm
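    For what it's worth, Data Pump (expdp/impdp, new in 10g) is a server-side facility and is generally faster than the classic client-side exp/imp, which it largely supersedes. A side-by-side sketch of the same schema export with each tool (names are examples):
    exp hr/hr OWNER=hr FILE=hr.dmp LOG=hr.log
    expdp hr/hr SCHEMAS=hr DIRECTORY=dpump_dir DUMPFILE=hr.dmp LOGFILE=hr.log
    Note that expdp writes its dump file on the server, via a directory object, while exp writes wherever the client runs.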

  • Exporting whole database (10GB) using Data Pump export utility

    Hi,
    I have a requirement to export the whole database (10GB) using the Data Pump export utility, because it is not possible to send the 10GB dump on a single CD/DVD to the system vendor of our application (to analyze a few issues we have).
    When I checked online, a full export is available, but I am not able to understand how it works, as we have never used this Data Pump utility; we normally use the regular export method. Also, will Data Pump reduce the size of the dump file so it can fit on a DVD, or can we use a parallel full DB export to split the files and spread them over DVDs? Is that possible?
    Please correct me if I am wrong, and kindly help.
    Thanks for your help in advance.

    You need to create a directory object.
    sqlplus user/password
    create directory foo as '/path_here';
    grant all on directory foo to public;
    exit;
    then run your expdp command.
    Data Pump can compress the dumpfile if you are on 11.1 and have the appropriate options. The reason for specifying FILESIZE is to limit the size of each dumpfile. If you have 10G and are not compressing, then by specifying 600MB you will get 10G/600MB = 17 dumpfiles of 600MB each, and you will have to send 17 CDs (probably a few more, since dumpfiles don't always fill up 100% when running in parallel).
    Data Pump dumpfiles are written by the server, not the client, so the dumpfiles don't get created in the directory where the job is run.
    Dean
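    Putting that together, a full export split into DVD-sized pieces might look like this (a sketch; credentials, directory object, and sizes are examples):
    expdp system/password FULL=Y DIRECTORY=foo DUMPFILE=full_%U.dmp FILESIZE=600M LOGFILE=full_exp.log
    The %U substitution numbers the dumpfiles (full_01.dmp, full_02.dmp, ...) as each one reaches FILESIZE.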

  • Interface Problems: DBA => Data Pump => Export Jobs (Job Name)

    Hello Folks,
    I need your help in troubleshooting an SQL Developer interface problem.
    DBA => Data Pump => Export Jobs (Job Name) => Data Pump Export => Job Scheduler (Step):
    -a- Job Name and Job Description fields are not visible. Well, the fields are there, but each of them is just 1/2 character wide; I can't see/enter anything in the fields.
    Import Wizard:
    -b- The Job Name field under the first "Type" wizard step looks exactly the same as in the Export case.
    -c- I can't see any rows under the "Choose Input Files" section (I see just ~1 mm of the first row and everything else is hidden).
    My env:
    -- Version 3.2.20.09, Build MAIN-09.87
    -- Windows 7 (64 bit)
    It could be related to the fact that I changed fonts in the Preferences. As I don't know what the default font is, I can't change it back to the default and test (let me know what the default is and I will test it).
    PS
    -- I have tried disabling all extensions except DBA Navigator (11.2.0.09.87). It didn't help.
    -- There are no messages in the console if I run SQL Developer from cmd via "sqldeveloper\bin\sqldeveloper.exe".
    Any help is appreciated,
    Yury

    Hi Yury,
    a- I see those 1/2-character-wide text boxes (in my case on Frequency) when the popup dialog is too small - do they go away when you make it bigger?
    b- On IMPORT the name starts with IMPORT - if it is the half-character issue, have you tried making the dialog bigger?
    c- I think it is size again, but my dialog at minimum size is already big enough.
    Have you tried a smaller font - or making the dialogs bigger (resizing from the corners)?
    On a 3.2.1 version where I have not changed the fonts, Tools->Preferences->Code Editor->Fonts appears to be:
    Font Name: DialogInput
    Font size: 12
    Turloch
    -SQLDeveloper Team

  • DATA PUMP export error

    Hi all
    During a Data Pump export of a big table (the dump set is 289GB, FILESIZE=10G) I got this error:
    . . exported "TEST3"."INC_T_ZMLUVY_PARTNERI" 6.015 GB 61182910 rows
    . . exported "TEST3"."PAR_T_VZTAHY" 5.798 GB 73121325 rows
    ORA-31693: Table data object "TEST3"."INC_T_POISTENIE_PARAMETRE_H" failed to load/unload and is being skipped due to error:
    ORA-31617: unable to open dump file "/u01/app/oracle/backup/STARINS/exp_polska03.dmp" for write
    ORA-19505: failed to identify file "/u01/app/oracle/backup/STARINS/exp_polska03.dmp"
    ORA-27037: unable to obtain file status
    Linux-x86_64 Error: 2: No such file or directory
    Additional information: 3
    . . exported "TEST3"."PAY_T_UCET_POLOZKA" 5.344 GB 97337823 rows
    and the export continued until:
    Job "SYSTEM"."SYS_EXPORT_SCHEMA_02" completed with 1 error
    (it took 8 hours)
    Can you tell me whether the dump is OK now? Will impdp continue after reading exp_polska03.dmp (the dump set has 28 files, exp_polska1 - exp_polska28)? At worst I will export this one table again - is that solution OK?
    Thanks, Brano

    What are the expdp parameters used?
    What's the total export dump size? (Try ESTIMATE_ONLY=Y.)
    ORA-31617: unable to open dump file "/u01/app/oracle/backup/STARINS/exp_polska03.dmp" for write
    ORA-19505: failed to identify file "/u01/app/oracle/backup/STARINS/exp_polska03.dmp"
    http://systemr.blogspot.com/2007/09/section-oracle-database-utilities-title.html
    Check the above one.
    I guess the file was removed from that location, or there were not enough permissions to write it.
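    To check both points yourself, you can size the job without writing any dump files, and re-export just the skipped table (credentials and directory are examples):
    expdp system/password SCHEMAS=TEST3 ESTIMATE_ONLY=Y ESTIMATE=BLOCKS
    expdp system/password TABLES=TEST3.INC_T_POISTENIE_PARAMETRE_H DIRECTORY=dpump_dir DUMPFILE=inc_t_poistenie.dmp LOGFILE=inc_t_poistenie.log
    Importing the skipped table from the second dump, and everything else from the original 28-file set, should give you a complete copy.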

  • How-to list the contents of a Data Pump Export file?

    How can I list the contents of a 10gR2 Data Pump Export file? I'm looking at the Syntax Diagram for Data Pump Import and can't see a list-only option.
    Regards,
    Al Malin

    Use the SQLFILE parameter of impdp, which writes all the SQL DDL to the specified file.
    http://download-west.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm
    SQLFILE
    Default: none
    Purpose
    Specifies a file into which all of the SQL DDL that Import would have executed, based on other parameters, is written.
    Syntax and Description
    SQLFILE=[directory_object:]file_name
    The file_name specifies where the import job will write the DDL that would be executed during the job. The SQL is not actually executed, and the target system remains unchanged. The file is written to the directory object specified in the DIRECTORY parameter, unless another directory_object is explicitly specified here. Any existing file that has a name matching the one specified with this parameter is overwritten.
    Note that passwords are not included in the SQL file. For example, if a CONNECT statement is part of the DDL that was executed, it will be replaced by a comment with only the schema name shown. In the following example, the dashes indicate that a comment follows, and the hr schema name is shown, but not the password.
    -- CONNECT hr
    Therefore, before you can execute the SQL file, you must edit it by removing the dashes indicating a comment and adding the password for the hr schema (in this case, the password is also hr), as follows:
    CONNECT hr/hr
    For Streams and other Oracle database options, anonymous PL/SQL blocks may appear within the SQLFILE output. They should not be executed directly.
    Example
    The following is an example of using the SQLFILE parameter. You can create the expfull.dmp dump file used in this example by running the example provided for the Export FULL parameter. See FULL.
    impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp SQLFILE=dpump_dir2:expfull.sql
    A SQL file named expfull.sql is written to dpump_dir2.

  • How can I use the data pump export from external client?

    I am trying to export a bunch of tables from a DB but I can't figure out how to do it.
    I don't have access to a shell terminal on the server itself; I can only log in using TOAD.
    I am trying to use TOAD's Data Pump Export utility but I keep getting this error:
    ORA-39070: Unable to open the log file.
    ORA-39087: directory name D:\TEMP\ is invalid
    I don't understand if it's because I am setting up the parameter file wrong, or if the utility is trying to find that directory on the server, whereas I am thinking it's going to dump to my local filesystem, where that directory exists.
    I'd hate to have to use SQL*Loader to create ctl files for each and every table...
    Here is my parameter file:
    DUMPFILE="db_export.dmp"
    LOGFILE="exp_db_export.log"
    DIRECTORY="D:\temp\"
    TABLES=ACCOUNT
    CONTENT=ALL
    (just trying to test it on one table so far...)
    P.S. Oracle 11g

    ORA-39070: Unable to open the log file.
    ORA-39087: directory name D:\TEMP\ is invalid
    The DIRECTORY here should not be a physical location; it is a logical directory object, and it lives on the server. You have to create a directory at the SQL level, e.g. CREATE DIRECTORY exp_dp ...
    Then use the directory created above as DIRECTORY=exp_dp.
    HTH
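    A minimal sketch of that server-side setup (run as a privileged user; the path must exist on the database server, not on your TOAD client machine - names are examples):
    SQL> CREATE DIRECTORY exp_dp AS '/u01/app/oracle/dpump';
    SQL> GRANT READ, WRITE ON DIRECTORY exp_dp TO trant;
    and then in the parameter file:
    DIRECTORY=exp_dp
    DUMPFILE=db_export.dmp
    LOGFILE=exp_db_export.log
    TABLES=ACCOUNT
    CONTENT=ALL
    The dump file will also land on the server in that directory, since Data Pump always writes on the server side.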

  • Data Pump Export Parameters

    Hi,
    Can you please explain the CONSISTENT and COMPRESS parameters in relation to Data Pump Export?
    Thanks,
    Suresh Bommalata..

    Data Pump has no CONSISTENT parameter; use FLASHBACK_SCN or FLASHBACK_TIME for that functionality instead.
    http://oracledatapump.blogspot.com/2009/04/mapping-between-parameters-of-datapump.html
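    For example, a consistent schema export as of "now" might look like this (a sketch; best placed in a parameter file to avoid shell-quoting issues):
    DIRECTORY=dpump_dir
    DUMPFILE=hr.dmp
    SCHEMAS=hr
    FLASHBACK_TIME="TO_TIMESTAMP(TO_CHAR(SYSDATE,'YYYY-MM-DD HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS')"
    As for COMPRESS: in classic exp it controlled extent consolidation in the generated DDL, not file compression; the blog link above maps it and the other classic parameters to their Data Pump counterparts.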

  • Check for directory for Data Pump Export

    Hi
    In order to perform a Data Pump export, a directory object must be created as a prerequisite.
    e.g :
    SQL> create directory expdp_dir as '/u01/backup/exports';
    I need to know if there is any way to check whether the folder has already been created.
    Please advice
    Regards,
    Dheeraj

    Dheeraj Kumar M wrote:
    I need to know if there is any way to know if the folder is already created.
    Do you mean see if the OS folder exists and is accessible to the oracle user? Try creating a file in that directory using UTL_FILE. If it succeeds, you are OK. Another option is writing a Java stored procedure to check OS folder existence and permissions.
    SY.
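    Two hedged sketches of those checks (the directory object name is an example): first list what is already registered, then test writability through UTL_FILE.
    SELECT directory_name, directory_path FROM dba_directories;
    -- write a throwaway file through the directory object
    DECLARE
      f UTL_FILE.FILE_TYPE;
    BEGIN
      f := UTL_FILE.FOPEN('EXPDP_DIR', 'dir_test.txt', 'w');
      UTL_FILE.PUT_LINE(f, 'ok');
      UTL_FILE.FCLOSE(f);
    END;
    /
    If the block raises an exception such as UTL_FILE.INVALID_PATH, the directory object, the OS folder, or its permissions are missing.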

  • Best Practice for data pump or import process?

    We are trying to copy an existing schema to another, newly created schema. We used Data Pump export to successfully export the schema.
    However, we encountered some errors when importing the dump file into the new schema (we remapped schema, tablespaces, etc.).
    Most errors occur in PL/SQL... For example, we have views like the one below in the original schema:
    CREATE VIEW oldschema.myview AS
    SELECT col1, col2, col3
    FROM oldschema.mytable
    WHERE col1 = 10
    Quite a few functions, procedures, packages, and triggers contain "oldschema.mytable" in DML (insert, select, update) statements, for example.
    Getting the following errors in import log:
    ORA-39082: Object type ALTER_FUNCTION:"TEST"."MYFUNCTION" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TEST"."MYPROCEDURE" created with compilation warnings
    ORA-39082: Object type VIEW:"TEST"."MYVIEW" created with compilation warnings
    ORA-39082: Object type PACKAGE_BODY:"TEST"."MYPACKAGE" created with compilation warnings
    ORA-39082: Object type TRIGGER:"TEST"."MYTRIGGER" created with compilation warnings
    A lot of actual errors/invalid objects in new schema are due to:
    ORA-00942: table or view does not exist
    My questions are:
    1. What can we do to fix those errors?
    2. Is there a better way to do the import under such conditions?
    3. Should we update the PL/SQL and recompile in the new schema? Or update in the original schema first and then export?
    Your help will be greatly appreciated!
    Thank you!

    I routinely get many (MANY) errors as follows, and they always compile when I recompile using utlrp.sql.
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."RPTSF_WR_LASTOUTPUNCH" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."RPTSF_WR_REFPERIODENDFOREMP" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."RPTSF_WR_TAILOFFSECS" created with compilation warnings
    ORA-39082: Object type ALTER_FUNCTION:"TKCSOWNER"."FN_GDAPREPORTGATHERER" created with compilation warnings
    Processing object type DATABASE_EXPORT/SCHEMA/PROCEDURE/ALTER_PROCEDURE
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ABSENT_EXCEPTION" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACCRUAL_BAL_PROJ" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACCRUAL_DETAILS" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACCRUAL_SUMMARY" created with compilation warnings
    ORA-39082: Object type ALTER_PROCEDURE:"TKCSOWNER"."ACTUAL_SCHEDULE" created with compilation warnings
    It works. In all my databases: peoplesoft, kronos, and others...
    I should qualify that it may still be necessary to 'debug' specific problems, but the most common, typical problems are easily resolved using utlrp.sql. The usual problems I run into are typically caused by a database link that points to another database, such as in a production environment that we firewall our test and development databases from linking to (for obvious reasons).
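    For reference, the recompile step itself (utlrp.sql ships with the database under $ORACLE_HOME/rdbms/admin and recompiles all invalid objects):
    sqlplus / as sysdba
    SQL> @?/rdbms/admin/utlrp.sql
    SQL> SELECT COUNT(*) FROM dba_objects WHERE status = 'INVALID';
    Anything still invalid afterwards needs the kind of manual debugging described above (missing grants, broken database links, and so on).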

  • Is it possible to Grant Nested Roles using Data Pump Export?

    I'm on Oracle 10.2.0.5, trying various Data Pump parameters to obtain an export containing a statement like "GRANT ParentRole TO ChildRole;".
    This is to import into 11.2.0.2 on the Windows x64 platform. I'm using the SQLFILE= parameter in an IMPDP to check the effect of various EXPDP parameters.
    I can get the "CREATE ROLE" statements with a full EXPDP using FULL=Y and INCLUDE=ROLE:"IN('ParentRole','ChildRole')".
    I can get the grants of objects to roles with a 2nd schema-mode EXPDP using SCHEMAS=('MySchema') - e.g. I get "GRANT SELECT ON MySchema.MyTable TO ParentRole;".
    But I cannot find the parameters so that a role being granted to another role is exported.
    Is this possible?

    Can you give an example of the grants - a real example - so I can try to recreate this here? I'm thinking it is a grant that you want, but I'm not sure which grant. There are a bunch of different grants.
    Dean
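    In the meantime, a hedged workaround is to script the role-to-role grants straight from the data dictionary on the source and run them on the target (the role name is an example):
    SELECT 'GRANT ' || granted_role || ' TO ' || grantee || ';'
    FROM dba_role_privs
    WHERE grantee = 'CHILDROLE';
    dba_role_privs lists roles granted to users and to other roles, so restricting grantee to your roles yields exactly the missing GRANT ParentRole TO ChildRole statements.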

  • Data Pump Export issue - no streams pool created and cannot automatically create one

    I am trying to use Data Pump on a 10.2.0.1 database that has VLM enabled, and I am getting the following error:
    Export: Release 10.2.0.1.0 - Production on Tuesday, 20 April, 2010 10:52:08
    Connected to: Oracle Database 10g Release 10.2.0.1.0 - Production
    ORA-31626: job does not exist
    ORA-31637: cannot create job SYS_EXPORT_TABLE_01 for user E_AGENT_SITE
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 600
    ORA-39080: failed to create queues "KUPC$C_1_20100420105208" and "KUPC$S_1_20100420105208" for Data Pump job
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPC$QUE_INT", line 1555
    ORA-00832: no streams pool created and cannot automatically create one
    This is my script (which I currently use successfully on other, non-VLM databases):
    expdp e_agent_site/<password>@orcl parfile=d:\DailySitePump.par
    this is my parameter file :
    DUMPFILE=site_pump%U.dmp
    PARALLEL=1
    LOGFILE=site_pump.log
    STATUS=300
    DIRECTORY=DATA_DUMP
    QUERY=wwv_document$:"where last_updated > sysdate-18"
    EXCLUDE=CONSTRAINT
    EXCLUDE=INDEX
    EXCLUDE=GRANT
    TABLES=wwv_document$
    FILESIZE=2000M
    My Oracle directory is created and the user has rights.
    Googling the issue suggests that the shared pool is too small or that streams_pool_size needs setting. shared_pool_size = 1200M, and when I query v$parameter it shows that streams_pool_size = 0.
    I've tried alter system set streams_pool_size=1M; but I just get:
    ORA-02097: parameter cannot be modified because specified value is invalid
    ORA-04033: Insufficient memory to grow pool
    The server is a windows enterprise box with 16GB ram and VLM enabled, pfile memory parameters listed below:
    # resource
    processes = 1250
    job_queue_processes = 10
    open_cursors = 1000 # no overhead if set too high
    # sga
    shared_pool_size = 1200M
    large_pool_size = 150M
    java_pool_size = 50M
    # pga
    pga_aggregate_target = 850M # custom
    # System Managed Undo and Rollback Segments
    undo_management=AUTO
    undo_tablespace=UNDOTBS1
    # vlm support
    USE_INDIRECT_DATA_BUFFERS = TRUE
    DB_BLOCK_BUFFERS = 1500000
    Any ideas why I cannot run Data Pump? I am assuming that I just need to set streams_pool_size, but I don't understand why I cannot increase its size on this DB. It is set to 0 on other databases that work fine, and I can set it there, which is why I am possibly linking the issue to VLM.
    thanks
    Robert

    SGA_MAX_SIZE? The total SGA is capped by SGA_MAX_SIZE, so another component (here db_cache_size) has to shrink before streams_pool_size can grow:
    SQL> ALTER SYSTEM SET streams_pool_size=32M SCOPE=BOTH;
    ALTER SYSTEM SET streams_pool_size=32M SCOPE=BOTH
    ERROR at line 1:
    ORA-02097: parameter cannot be modified because specified value is invalid
    ORA-04033: Insufficient memory to grow pool
    SQL> show parameter sga_max
    NAME                                 TYPE        VALUE
    sga_max_size                         big integer 480M
    SQL> show parameter cache
    NAME                                 TYPE        VALUE
    db_16k_cache_size                    big integer 0
    db_2k_cache_size                     big integer 0
    db_32k_cache_size                    big integer 0
    db_4k_cache_size                     big integer 0
    db_8k_cache_size                     big integer 0
    db_cache_advice                      string      ON
    db_cache_size                        big integer 256M
    db_keep_cache_size                   big integer 0
    db_recycle_cache_size                big integer 0
    object_cache_max_size_percent        integer     10
    object_cache_optimal_size            integer     102400
    session_cached_cursors               integer     20
    SQL> ALTER SYSTEM SET db_cache_size=224M SCOPE=both;
    System altered.
    SQL> ALTER SYSTEM SET streams_pool_size=32M SCOPE=both;
    System altered.
    Lukasz
