Datapump network_link questions

I have a few high-level Data Pump questions that I have been unable to find a straight answer for, and it's been a little while since I last did a large migration. The database will be either 10gR2 or 11gR2.
- Since you are not using any files, does the Oracle user doing either the export or the import even need a directory object created, with read/write granted on it?
- Does the user need any permission besides CREATE SESSION (like CREATE TABLE, CREATE INDEX, etc.)? I wasn't sure whether any CREATE TABLE statements are executed behind the scenes when the DDL is run.
- Just out of curiosity, is there any other way to use NETWORK_LINK without a public database link? The system guys do not like us creating these; they see them as a security issue.
- Does using NETWORK_LINK lock the tables against writes when doing a single schema-level export/import, or can it be done while the database is online without any consequences (besides some reduced performance, I'd imagine)? I thought I read that transportable tablespaces left the tables available for read access only (not that I'm trying to do that).
- We have at least 1 TB of raw data in the tablespaces to migrate. If I start the Data Pump job using NETWORK_LINK and the network connection gets dropped unexpectedly (router outage, etc.), say at 900 GB, would I have to start over, or is there any kind of savepoint-and-resume concept?
Thanks.

Hi,
"Is there any other way to use NETWORK_LINK without using a public database link?"
Please see this document:
http://www.oracle-base.com/articles/10g/oracle-data-pump-10g.php#NetworkExportsImports
"Is there any kind of savepoint and resume concept?"
Yes, see expdp help=y or impdp help=y:
START_JOB: Start/resume the current job.
Please refer to the Oracle documentation:
http://docs.oracle.com/cd/B28359_01/server.111/b28319/dp_export.htm
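For example (the job name here is hypothetical), a job that stopped mid-run can usually be resumed by re-attaching to it; the master table tracks what has already been transferred, so the job should pick up roughly where it left off (at object granularity) rather than starting from zero:

expdp system/password attach=SYS_EXPORT_SCHEMA_01
Export> START_JOB
Export> CONTINUE_CLIENT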
Best of Luck :)
Regards
Hitgon
Edited by: hitgon on Apr 26, 2012 9:34 AM

Similar Messages

  • Datapump monitoring question

    Hi,
    I've set up two stored procedures:
    1. One to do a Data Pump import over a network link using the Data Pump API.
    2. Another to read the resulting log file on the server into a CLOB field in a table.
    Individually, both of the above work fine.
    The problem is when I try to call stored procedure number 2 from number 1: since procedure 1 returns before the import completes, the log file is still empty when procedure 2 reads it into the CLOB.
    I know I can monitor user_datapump_sessions for completion; any ideas how to make that work in a stored procedure? I thought maybe chained jobs, but I'm not sure.

    I found the answer: use DBMS_DATAPUMP.WAIT_FOR_JOB rather than returning right after START_JOB.
    Thanks.
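    A minimal sketch of that pattern (the link name and surrounding setup are placeholders, not the poster's actual code):

        DECLARE
          h1        NUMBER;
          job_state VARCHAR2(30);
        BEGIN
          h1 := DBMS_DATAPUMP.OPEN('IMPORT', 'SCHEMA', 'MY_DB_LINK');
          -- ... ADD_FILE for the log file, METADATA_FILTER, etc. ...
          DBMS_DATAPUMP.START_JOB(h1);
          -- Blocks until the job finishes, so the log file is complete
          -- before the second procedure reads it into the CLOB.
          DBMS_DATAPUMP.WAIT_FOR_JOB(h1, job_state);
          DBMS_OUTPUT.PUT_LINE('Job finished with state: ' || job_state);
        END;
        /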

  • Datapump question

    Hi,
    Does Data Pump create triggers, views, or tables when you issue a full expdp?
    I've checked the documentation and can't see anything; I'm probably not searching in the right place.
    Can someone shed some light on this?

    Ricard,
    The question was whether it creates any objects in the schemas being exported, like triggers for example, to do the operation.
    So if I expdp full=y as the SYS user, and the expdp is exporting schema SCOTT, will new objects appear in the SCOTT schema, and will they remain there when expdp finishes?
    If I understood you correctly, you are asking whether Scott's objects will still be in the SCOTT schema once the export finishes, right? What makes you think they would go away, and why? Export "dumps" the information about the objects into the dump file; its purpose is not to tamper with the existing objects. This is not just Data Pump; the classic export utility works the same way.
    Reading the documentation, it seems that doesn't happen; only a master table is created, in the SYS user, and it is exported too when expdp finishes. And you also told me that expdp doesn't create any other objects to do the operation.
    My only question to you: why do you think it will drop the objects?
    Is the master table dropped when expdp finishes, by the way?
    Yes. The master table does indeed get created, is exported into the dump file, and is dropped once the job is over.
    You need to re-read the docs I guess.
    Aman....
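    To see this for yourself while an export runs, a quick check like the following (a sketch; the job-name pattern assumes a full export run by SYS) shows the job and its master table:

        SELECT owner_name, job_name, operation, state
          FROM dba_datapump_jobs;

        -- The master table carries the same name as the job, in the job owner's schema.
        SELECT owner, table_name
          FROM dba_tables
         WHERE table_name LIKE 'SYS_EXPORT_FULL%';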

  • RMAN and Datapump Question

    Hello,
    Database : 11gR2
    I was recently provided access to a new database because users are complaining of poor performance.
    While monitoring this database with a Statspack report (because we don't have the Diagnostics Pack licensed), I saw both RMAN and Data Pump processes running together during peak hours. So I asked the DBA responsible for this server why she schedules RMAN and Data Pump together. She replied that she knows nothing about these Data Pump processes and that it could be RMAN that is using them ("Module: Data Pump Worker")!
    This answer surprised me, since I have never read that in the Oracle documentation. Is it correct that the RMAN utility can use Data Pump workers?
    This is her answer: "I think these 'Module: Data Pump Worker' are run by rman during backup."
    Thank you and sorry for this stupid question!
    Edited by: 899767 on 26-ago-2011 0:47

    RMAN and Data Pump are two different utilities provided by Oracle for taking backups.
    Oracle backups are divided into two kinds:
    a) Physical backups: copies of the physical database files, taken with the RMAN utility.
    b) Logical backups: logical data (for example, tables and stored procedures) extracted with the Data Pump utility and stored in a binary file.
    Refer to this thread it will help
    https://forums.oracle.com/forums/thread.jspa?threadID=1133819
    And
    http://myoracleservice.com/?q=node/38
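    For what it's worth, an ordinary RMAN BACKUP does not spawn Data Pump workers, so it is worth checking where those sessions actually come from. A query along these lines (a sketch) maps the Data Pump sessions back to the users and jobs running them:

        SELECT s.sid, s.username, s.module, d.owner_name, d.job_name
          FROM dba_datapump_sessions d
          JOIN v$session s ON s.saddr = d.saddr;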

  • Impdp, Network_Link, and Data Append, security questions

    I am implementing a nightly table data transfer between two networked DB servers and have some questions.
    1. Can the impdp command have a NETWORK_LINK and still have TABLE_EXISTS_ACTION=Append and REMAP_TABLE=[schema.]old_tablename[.partition]:new_tablename?
    For example: impdp hr/hr NETWORK_LINK=DB_LINK TABLES=hr.employees REMAP_TABLE=hr.employees:emps TABLE_EXISTS_ACTION=Append
    2. How secure is the data move? Is the data encrypted?

    893730 wrote:
    I am implementing a nightly table data transfer between two Networked DB servers and have some questions
    1. Can impdp command have a NETWORK_LINK and still have TABLE_EXISTS_ACTION=Append, and REMAP_TABLE=[schema.]old_tablename[.partition]:new_tablename?
    I am not sure why this would fail. Did you encounter any issue while doing the export/import?
    For example: impdp hr/hr NETWORK_LINK=DB_LINK TABLES=hr.employees REMAP_TABLE=hr.employees:emps TABLE_EXISTS_ACTION=Append
    2. How secure is the data move, is the data encrypted?
    This excerpt is from the Oracle documentation:
    Caution:
    If an export operation is performed over an unencrypted network link, then all data is exported as clear text even if it is encrypted in the database.
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_export.htm#SUTIL856
    Regards,
    NC
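    If the data has to be protected in transit, one common approach (an assumption on my part; in these releases native network encryption requires the Advanced Security option) is to turn on SQL*Net encryption on both ends in sqlnet.ora, so the database link itself is encrypted:

        # server-side sqlnet.ora
        SQLNET.ENCRYPTION_SERVER = required
        SQLNET.ENCRYPTION_TYPES_SERVER = (AES256)
        # client-side sqlnet.ora
        SQLNET.ENCRYPTION_CLIENT = required
        SQLNET.ENCRYPTION_TYPES_CLIENT = (AES256)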

  • Datapump table import, simple question

    Hi,
    As a junior DBA, I am a little bit confused.
    Suppose a user wants me to import a table with Data Pump which he exported from another DB with different schemas and tablespaces (he exported with expdp using tables=XX and I don't know the details)...
    If I only know the table name, should I ask these three questions: 1) schema 2) tablespace 3) Oracle version?
    From the docs I know the remapping capabilities of impdp. But are they mandatory when importing just a table?
    Thanks in advance

    Hi,
    "Suppose that a user wants me to import a table with datapump which he exported from another db with different schemas and tablespaces (he exported with expdp using tables=XX and I don't know the details)...
    If I only know the table name, should I ask these three questions: 1) schema 2) tablespace 3) oracle version?"
    You can get this information from the dump file if you want, just to make sure you get the right information. Run your import command, but add:
    sqlfile=mysql.sql
    Then you can edit that SQL file to see what is in the dump file. It won't show data, but it will show all of the metadata (tables, tablespaces, etc.). It is a .sql file containing all of the CREATE statements that would have been executed if you had not added the sqlfile parameter.
    "From the docs I know the remapping capabilities of impdp. But are they mandatory when importing just a table?"
    You never have to remap anything, but if the dump file contains a table SCOTT.EMP, then it will import it into SCOTT.EMP. If you want it to go into BLAKE, then you need REMAP_SCHEMA. If it is going into tablespace TBS1 and you want it in TBS2, then you need REMAP_TABLESPACE.
    "Suppose that an end user wants me to export a specific table using datapump... Should I also give him the name of the tablespace where the exported table resides?"
    It would be nice, but see above: you can get the tablespace name from the sqlfile command on import.
    Hope this helps.
    Dean
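    Concretely, the suggestion looks something like this (directory and file names here are assumptions):

        impdp user/password directory=DP dumpfile=exp.dmp sqlfile=mysql.sql

    This writes the DDL the import would run into mysql.sql under the DP directory, without actually creating anything, so you can read the schema and tablespace names straight out of it.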

  • Some questions regarding DATAPUMP parameters

    Hi,
    I am trying to perform a database import using the EXCLUDE=GRANT parameter but I get the following errors
    ORA-39002: invalid operation
    ORA-39168: Object path GRANT was not found.
    What is the correct syntax for using this parameter?
    =====================================
    In the old import utility the ANALYZE parameter was available; what is the equivalent parameter (if any) in Data Pump import?
    Thanks,
    Barry

    Barrry wrote:
    Hi,
    I am trying to perform a database import using the EXCLUDE=GRANT parameter but I get the following errors
    ORA-39002: invalid operation
    ORA-39168: Object path GRANT was not found.
    What is the correct syntax of using the parameter?
    =====================================
    In the old import method the ANALYZE parameter was available, what is the equivalent parameter (if any) in DATAPUMP import?
    Thanks,
    Barry
    Syntax????
    SYNTAX?????
    Did you ever consider that syntax can be looked up in the very fine reference manual?
    =================================================
    I don't want to be flippant or rude, but if people want to be professionals in ANY field, the first knowledge they need to acquire is how to locate AND USE the fundamental reference materials for that profession. And the most important trait, the one for which they are really hired, is the ability to do independent research, and a modicum of curiosity that would drive one to do that research.
    We don't mind helping newbies, and even the most experienced person on this board will run into something they are not familiar with, or occasionally just require a second set of eyes to look at something. But a professional needs, above all, a willingness and capability to check the docs. A professional isn't necessarily someone who has all the answers at their fingertips or a full understanding of every arcane subject in their field. It certainly isn't someone who has an encyclopedia full of memorized answers but little understanding of how it all fits together. It's someone who knows where to find the answers when needed, and how to recognize them when he sees them. It's less about knowing than it is about attitude.
    Everything you asked can be answered in the Oracle documentation at tahiti.oracle.com. You should bookmark that site.
    =================================================
    Learning how to look things up in the documentation is time well spent investing in your career. To that end, you should drop everything else you are doing and do the following:
    Go to tahiti.oracle.com.
    Drill down to your product and version.
    BOOKMARK THAT LOCATION
    Spend a few minutes just getting familiar with what is available here. Take special note of the "Books" and "Search" tabs. Under the "Books" tab you will find the complete documentation library.
    Spend a few minutes getting familiar with what kind of documentation is available there by simply browsing the titles under the "Books" tab.
    Open the Reference Manual and spend a few minutes looking through the table of contents to get familiar with what kind of information is available there.
    Do the same with the SQL Reference Manual.
    Do the same with the Utilities manual.
    You don't have to read the above in depth. They are reference manuals. Just get familiar with what is there to be referenced. Ninety percent of the questions asked on this forum can be answered in less than 5 minutes by simply searching one of the above manuals.
    Then set yourself a plan to dig deeper.
    - Read a chapter a day from the Concepts Manual.
    - Take a look in your alert log. One of the first things listed at startup is the initialization parms with non-default values. Read up on each one of them (listed in your alert log) in the Reference Manual.
    - Take a look at your listener.ora, tnsnames.ora, and sqlnet.ora files. Go to the Network Administrators manual and read up on everything you see in those files.
    - When you have finished reading the Concepts Manual, do it again.
    Give a man a fish and he eats for a day. Teach a man to fish and he eats for a lifetime.
    =================================
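    Coming back to the original EXCLUDE=GRANT error (this pointer is my addition, not from the thread above): the object-path names that EXCLUDE and INCLUDE accept are listed in the DATABASE_EXPORT_OBJECTS, SCHEMA_EXPORT_OBJECTS, and TABLE_EXPORT_OBJECTS views, so a lookup like this shows which grant-related paths are valid for your mode:

        SELECT object_path, comments
          FROM database_export_objects
         WHERE object_path LIKE '%GRANT%';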

  • Datapump exp with network_link generated crashed dump file

    Hi Experts,
    I am using an Oracle 10.2.0.4 DB on 32-bit Windows. I am trying to export the full source DB into the target DB using Data Pump with NETWORK_LINK.
    But the dump file I got shows up as a "crash dump" file type in Windows Explorer.
    Syntax: expdp system/test@targetDB FULL=y DIRECTORY=dataload NETWORK_LINK=sale DUMPFILE=salefull091115.dmp LOGFILE=salefulllog091115.log
    What kind of issue would cause this?
    Thanks
    JIm
    Edited by: user589812 on Nov 15, 2009 3:48 AM

    Pl post the last 50 lines of the export log file, along with any errors recorded in the Windows system log.
    Also see your duplicate post here: data pump export network_link with dblink issue
    Srini

  • Datapump Export Wipro Question

    Hi,
    we have a schema of about 50 GB in size, but the dump file can be only 5 GB,
    so the question is,
    how can we export this 50 GB schema using a 5 GB dump file?
    Is this possible?
    Can you give me the solution, with the commands?

    Create a compressed export on the fly. Depending on the type of data, you can probably export up to 10 gigabytes into a single file. This example uses gzip; it offers the best compression I know of, but you can also substitute zip, compress, or whatever.
    # create a named pipe
    mknod exp.pipe p
    # read the pipe - output to zip file in the background
    gzip < exp.pipe > scott.exp.gz &
    # feed the pipe
    exp userid=scott/tiger file=exp.pipe ...
    refer the link:
    http://www.orafaq.com/wiki/Import_Export_FAQ#Can_one_export_to_multiple_files.3F.2F_Can_one_beat_the_Unix_2_Gig_limit.3F
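    Note that the named-pipe trick above uses the classic exp utility, which writes through the client process; as far as I know, Data Pump cannot write its dump file to a pipe, because the server writes it through a directory object. If you are on 11g, expdp has a COMPRESSION parameter instead (COMPRESSION=ALL requires the Advanced Compression option), e.g.:

        expdp scott/tiger schemas=scott directory=dump dumpfile=scott_comp.dmp logfile=scott_comp.log compression=all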

  • DataPump Question for Full Expdp

    Dear All,
    how do I export a full database using the Data Pump utility? The client needs the full 11i EBiz database for testing purposes.
    I have some basic idea, which I use for local exports:
    $ expdp scott/tiger directory=dump dumpfile=table_scott.dmp logfile=table_scott.log tables=emp
    This is for a single table.
    For the whole database, how do we export? Before the export, what privileges must the user have, and what are the prerequisites?
    $ expdp system/manager full=y directory=dump dumpfile=full.dmp logfile=full.log job_name=fulljob
    Am I correct in the above syntax for exporting the full DB?
    While I import, what prerequisites do I have to check in the new database? Kindly help me to study this subject deeply.
    Thanks and Regards
    HAMEED
    One more question:
    Before the export, do we have to take a report? If so, what kind of report, and how do we take it?
    Edited by: Hameed on May 11, 2011 5:17 AM

    "So how come is it possible to import each and every tablespace by the remap technique?"
    AFAIK you have to do that: create a parameter file and use either the REMAP_DATAFILE or REMAP_TABLESPACE parameter. Please be aware that the REMAP_DATAFILE value uses quotation marks, so specify it in a parameter file.
    Also check the MOS note below before using this parameter:
    Note:429846.1 "Slow Data Pump with REMAP_SCHEMA and REMAP_TABLESPACE parameters"
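    For example, a parameter file along these lines (all paths and names here are hypothetical):

        full=y
        directory=dump
        dumpfile=full%U.dmp
        logfile=imp_full.log
        remap_datafile="'/u01/oradata/prod/users01.dbf':'/u02/oradata/test/users01.dbf'"

    Then run: impdp system/manager parfile=imp_full.par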
    Thanks,
    JD

  • Editing a datapump .dmp file question

    I am running a 10.2.0.4 RDBMS on the HP-UX platform and I am taking a Data Pump export with content=metadata_only. I then want to import this metadata into a freshly installed 11.2.0.3 database.
    I want to be able to change the default tablespaces for where the objects live in the export file, and also change the initial extent size, so that the empty objects take up only a small amount of space in the new database and all of the objects are created in one tablespace.
    Is it possible to edit the .dmp file and make these kinds of global changes, so that all objects are created in the users tablespace and the initial extent size is set to 10 MB?
    Thanks.

    923395 wrote:
    I am running a 10.2.0.4 RDBMS on the HP-UX platform and I am taking a Data Pump export with content=metadata_only. I then want to import this metadata into a freshly installed 11.2.0.3 database.
    I want to be able to change the default tablespaces for where the objects live in the export file, and also change the initial extent size, so that the empty objects take up only a small amount of space in the new database and all of the objects are created in one tablespace.
    Is it possible to edit the .dmp file?
    Not possible.
    You could use DBMS_METADATA.GET_DDL to spool the DDL to a file, then do a global replace with any text editor.
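    A sketch of that approach (the schema name and SQL*Plus settings here are examples). Also worth knowing, although not mentioned above: impdp's TRANSFORM=SEGMENT_ATTRIBUTES:N strips the tablespace and storage clauses at import time, so objects land in the importing user's defaults without editing anything.

        SET LONG 1000000 LONGCHUNKSIZE 32767 PAGESIZE 0 LINESIZE 200
        SPOOL scott_ddl.sql
        SELECT DBMS_METADATA.GET_DDL('TABLE', table_name, 'SCOTT')
          FROM dba_tables
         WHERE owner = 'SCOTT';
        SPOOL OFF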

  • Datapump API PL/SQL question

    I had created a procedure. In the procedure, the schema name SCOTT is hardcoded:
    DBMS_DATAPUMP.METADATA_FILTER(h1,'SCHEMA_EXPR','IN (''SCOTT'')');
    If I want to pass the schema name into the procedure, say as inSchema, how do I replace 'IN (''SCOTT'')' with it?
    DBMS_DATAPUMP.METADATA_FILTER(h1,'SCHEMA_EXPR',inSchema);
    It compiles but does not run. Thanks

    Now it works, by calling: exec test_datapump('IN (''SCOTT'')');
    This code (with very little modification) is from the Oracle API documentation.
    CREATE OR REPLACE PROCEDURE do_export.test_datapump (inSchema VARCHAR2)
    IS
      ind          NUMBER;         -- loop index
      h1           NUMBER;         -- Data Pump job handle
      percent_done NUMBER;         -- percentage of job complete
      job_state    VARCHAR2(30);   -- to keep track of job state
      le           ku$_LogEntry;   -- for WIP and error messages
      js           ku$_JobStatus;  -- the job status from get_status
      jd           ku$_JobDesc;    -- the job description from get_status
      sts          ku$_Status;     -- the status object returned by get_status
    BEGIN
      -- Create a (user-named) Data Pump job to do a schema export.
      h1 := DBMS_DATAPUMP.OPEN('EXPORT', 'SCHEMA', NULL, 'EXAMPLE907', 'LATEST');
      -- Specify a single dump file for the job (using the handle just returned)
      -- and a directory object, which must already be defined and accessible
      -- to the user running this procedure.
      DBMS_DATAPUMP.ADD_FILE(h1, 'example1.dmp', 'NIGHTLY_DB_EXPORT');
      -- A metadata filter is used to specify the schema that will be exported;
      -- the caller passes the full expression, e.g. 'IN (''SCOTT'')'.
      --DBMS_DATAPUMP.METADATA_FILTER(h1, 'SCHEMA_EXPR', 'IN (''SCOTT'')');
      --DBMS_DATAPUMP.METADATA_FILTER(h1, 'SCHEMA_EXPR', 'IN (''' || inSchema || ''')');
      DBMS_DATAPUMP.METADATA_FILTER(h1, 'SCHEMA_EXPR', inSchema);
      -- Start the job. An exception is raised if something is not set up properly.
      DBMS_DATAPUMP.START_JOB(h1);
      -- The export job should now be running. In the following loop, the job
      -- is monitored until it completes; in the meantime, progress information
      -- is displayed.
      percent_done := 0;
      job_state := 'UNDEFINED';
      WHILE (job_state != 'COMPLETED') AND (job_state != 'STOPPED') LOOP
        DBMS_DATAPUMP.GET_STATUS(h1,
                                 DBMS_DATAPUMP.ku$_status_job_error +
                                 DBMS_DATAPUMP.ku$_status_job_status +
                                 DBMS_DATAPUMP.ku$_status_wip,
                                 -1, job_state, sts);
        js := sts.job_status;
        -- If the percentage done changed, display the new value.
        IF js.percent_done != percent_done THEN
          DBMS_OUTPUT.PUT_LINE('*** Job percent done = ' || TO_CHAR(js.percent_done));
          percent_done := js.percent_done;
        END IF;
        -- If any work-in-progress (WIP) or error messages were received for
        -- the job, display them.
        IF (BITAND(sts.mask, DBMS_DATAPUMP.ku$_status_wip) != 0) THEN
          le := sts.wip;
        ELSIF (BITAND(sts.mask, DBMS_DATAPUMP.ku$_status_job_error) != 0) THEN
          le := sts.error;
        ELSE
          le := NULL;
        END IF;
        IF le IS NOT NULL THEN
          ind := le.FIRST;
          WHILE ind IS NOT NULL LOOP
            DBMS_OUTPUT.PUT_LINE(le(ind).LogText);
            ind := le.NEXT(ind);
          END LOOP;
        END IF;
      END LOOP;
      -- Indicate that the job finished and detach from it.
      DBMS_OUTPUT.PUT_LINE('Job has completed');
      DBMS_OUTPUT.PUT_LINE('Final job state = ' || job_state);
      DBMS_DATAPUMP.DETACH(h1);
    END;
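    A slightly safer variant (my suggestion, not from the thread) is to build the expression inside the procedure, so callers pass only the schema name, e.g. exec test_datapump('SCOTT'):

        DBMS_DATAPUMP.METADATA_FILTER(h1, 'SCHEMA_EXPR',
                                      'IN (''' || UPPER(inSchema) || ''')');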

  • Datapump question with "query"

    Hi,
    I'm trying to use expdp to export only certain rows of a table. Am I doing something wrong? (Obviously yes ;-) ) I'm getting the following error:
    $ expdp user/password directory=DP dumpfile=dunp.dmpdp tables=user.t_calculationhistory query=user.t_calculationhistory:"WHERE DATEOFCALCULATION>TO_DATE('20081111','YYYYMMDD') AND DATEOFCALCULATION<TO_DATE('20081112','YYYYMMDD')"
    LRM-00116: syntax error at ')' following 'YYYYMMDD'
    $
    Any suggestions on the syntax?

    Hi,
    Try this
    $ expdp user/password directory=DP dumpfile=dunp.dmpdp tables=user.t_calculationhistory query=user.t_calculationhistory:' "WHERE DATEOFCALCULATION>TO_DATE('20081111','YYYYMMDD') AND DATEOFCALCULATION<TO_DATE('20081112','YYYYMMDD')" '
    Also double-check your condition: it only selects rows where DATEOFCALCULATION falls between 11-Nov-2008 and 12-Nov-2008, i.e. a single day; make sure that is the range you intended.
    regards
    Jafar
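    The underlying problem is that the shell consumes the quotes before expdp sees them. The usual workaround (file and dump names here are illustrative) is to put QUERY in a parameter file, where no shell escaping is needed:

        # query.par
        directory=DP
        dumpfile=calc_hist.dmp
        logfile=calc_hist.log
        tables=user.t_calculationhistory
        query=user.t_calculationhistory:"WHERE dateofcalculation > TO_DATE('20081111','YYYYMMDD') AND dateofcalculation < TO_DATE('20081112','YYYYMMDD')"

        $ expdp user/password parfile=query.par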

  • Expdp (Datapump) question

    Hi,
    I am using the following expdp command to create a database export for one of my schemas. The issue is that when I run it manually in a command window, it creates two files, EXPORT01.dmp and EXPORT02.dmp, which is exactly what I want. However, when I run the same command from a batch file it creates only one file, EXPORTU.dmp, and runs out of space because I have limited the file size to 3m. Why does the same command not create the two files (EXPORT01.dmp and EXPORT02.dmp) when run from the batch file?
    Here's my command:
    expdp system/password SCHEMAS=myschema LOGFILE=EXPDP_LOG_DIR:EXPORT.log DUMPFILE=EXPDP_DATA_DIR:EXPORT%U.dmp FILESIZE=3m
    Please help.
    Thank you.

    Because when you run it from a batch file, the "%" character is treated as special and gets stripped before expdp ever sees it (which is why the file comes out as EXPORTU.dmp). In a Windows batch file, double the percent sign:
    DUMPFILE=EXPDP_DATA_DIR:EXPORT%%U.dmp FILESIZE=3m
    (In some Unix shells, a backslash before the % works instead.)

  • How to find tables with columns that are not supported by Data Pump network_link

    Hi Experts,
    We are trying to import a database into a new DB using Data Pump with NETWORK_LINK.
    According to the Oracle documentation, tables with columns that are object types are not supported in a network export: an ORA-22804 error will be generated and the export will move on to the next table. To work around this restriction, you can manually create the dependent object types within the database from which the export is being run.
    My question: how do I find the tables with columns whose object types are not supported in a network export?
    We have LOB objects and the Oracle Spatial SDO_GEOMETRY object type. Our database size is about 300 GB, and normally exp takes 30 hours.
    We are trying to use Data Pump with NETWORK_LINK to speed up the export process.
    How do we fix the Oracle Spatial SDO_GEOMETRY type issue during the Data Pump run?
    Our system is 32-bit Windows 2003 with a 10gR2 database.
    Thanks
    Jim
    Edited by: user589812 on Nov 3, 2009 12:59 PM

    Hi,
    I remember there being issues with SDO_GEOMETRY and Data Pump. You may want to contact Oracle Support about this issue.
    Dean
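    In the meantime, a query along these lines (a sketch; the owner exclusion list is a judgment call, adjust it for your system) lists the tables with user-defined object-type columns, which are the candidates for ORA-22804 in a network export:

        SELECT owner, table_name, column_name, data_type_owner, data_type
          FROM dba_tab_columns
         WHERE data_type_owner IS NOT NULL
           AND owner NOT IN ('SYS', 'SYSTEM', 'MDSYS', 'ORDSYS', 'XDB')
         ORDER BY owner, table_name, column_name;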
