Data Pump API Question

Hi,
Is there a way to use the Data Pump API to export tables from multiple schemas in the same job? I can't figure out what the filters would be. It seems I can either specify many tables from one schema only, or specify multiple schemas without limiting which tables are exported.
I keep running into this error: ORA-31655: no data or metadata objects selected for job
I'd like to do something like this:
--METADATA FILTER: SPECIFY TABLES TO EXPORT
dbms_datapump.metadata_filter(
  handle => hdl,
  name   => 'NAME_EXPR',
  value  => 'IN (''schema1.table1'', ''schema2.table2'')');
This does not seem to be possible.
Any help would be appreciated.
Thanks,
Nora

A user that has the EXP_FULL_DATABASE role should be able to do what you want.
Search here for that role: http://students.kiv.zcu.cz/doc/oracle/server.102/b14215/dp_export.htm#i1007837
It seems you could do what you want by using that role in conjunction with the EXCLUDE and INCLUDE parameters: http://students.kiv.zcu.cz/doc/oracle/server.102/b14215/dp_export.htm#i1009903
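For what it's worth, here is a minimal sketch of how that could look through the API: a TABLE-mode job run by a privileged user, combining a SCHEMA_EXPR filter with a NAME_EXPR filter. The schema, table, directory, and file names are placeholders, and note that the two filters combine as a cross product (any listed table name in any listed schema is picked up):
DECLARE
  hdl NUMBER;
BEGIN
  -- TABLE-mode export; spanning schemas requires EXP_FULL_DATABASE
  hdl := dbms_datapump.open(operation => 'EXPORT', job_mode => 'TABLE');
  dbms_datapump.add_file(hdl, 'multi_schema.dmp', 'DATA_PUMP_DIR');
  -- restrict the owning schemas...
  dbms_datapump.metadata_filter(
    handle => hdl,
    name   => 'SCHEMA_EXPR',
    value  => 'IN (''SCHEMA1'', ''SCHEMA2'')');
  -- ...and, separately, the table names (NAME_EXPR matches unqualified names)
  dbms_datapump.metadata_filter(
    handle => hdl,
    name   => 'NAME_EXPR',
    value  => 'IN (''TABLE1'', ''TABLE2'')');
  dbms_datapump.start_job(hdl);
  dbms_datapump.detach(hdl);
END;
/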

Similar Messages

  • Selecting tables when importing using the Data Pump API

    Hi,
    Sorry for the trivial question. I exported the data using the Data Pump API in TABLE mode,
    so all tables were exported into one .dmp file.
    My question is: how do I then import only a few tables using the Data Pump API? How do I specify the equivalent of the command-line TABLES parameter?
    Should I use the DATA_FILTER procedures? If yes, how?
    Really thanks in advance
    Regards,
    Kahlil

    Hi,
    You should use the METADATA_FILTER procedure for this.
    eg:
    dbms_datapump.metadata_filter
                (handle1
                 ,'NAME_EXPR'
                 ,'IN (''TABLE1'', ''TABLE2'')');
    Regards
    Anurag
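    For completeness, a minimal import sketch along the same lines (directory and dump file names are hypothetical; a TABLE-mode import reading an existing dump file):
    DECLARE
      hdl NUMBER;
    BEGIN
      hdl := dbms_datapump.open(operation => 'IMPORT', job_mode => 'TABLE');
      dbms_datapump.add_file(hdl, 'full_export.dmp', 'DATA_PUMP_DIR');
      -- only these tables are pulled from the dump file
      dbms_datapump.metadata_filter(
        handle => hdl,
        name   => 'NAME_EXPR',
        value  => 'IN (''TABLE1'', ''TABLE2'')');
      dbms_datapump.start_job(hdl);
      dbms_datapump.detach(hdl);
    END;
    /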

  • Oracle Export Data pump API - job already exists

    Hi all,
    I use a stored procedure with the Data Pump API for export and import, following the simple schema export example from http://download.oracle.com/docs/cd/B14117_01/server.101/b10825/dp_api.htm . It works on the first execution, but the second time
    I get the error ORA-31634: job already exists.
    After the first run I checked the job state in the dba_datapump_jobs master table, and it still said DEFINING after the process completed.
    Why does this happen?
    Really thanks in advance.
    Regards,
    Mr.K

    My questions go in this direction:
    1. Did I do something wrong with the Data Pump utility?
    I do not think so. But maybe something is wrong with the existing schema :-(
    2. What is the correct way to export and import sequences so that they keep their actual values?
    If the sequences exist in the target before the import, Oracle does not drop and recreate them. So you need to ensure that the sequences do not already exist in the target, or that the existing ones are dropped before the import.
    3. If the behaviour described here is correct, how can I correct the values so that they continue from the last value used in the source database?
    You can either refresh with the import after the above correction, or drop and manually recreate the sequences to START WITH the next value of the source sequences.
    The easier way is to generate a script from the source if you know how to do it.
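    To illustrate that last suggestion, a sketch of generating such a script from the source data dictionary (assuming access to DBA_SEQUENCES; LAST_NUMBER already reflects the cached high-water mark, so it is a safe START WITH value; the schema name is a placeholder):
    -- Run on the source, spool the output, then run the script on the target
    SELECT 'CREATE SEQUENCE ' || sequence_owner || '.' || sequence_name ||
           ' START WITH ' || last_number ||
           ' INCREMENT BY ' || increment_by || ';' AS ddl
    FROM   dba_sequences
    WHERE  sequence_owner = 'SCHEMA1';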

  • How to exclude statistics using the Data Pump API?

    How can I exclude all statistics while exporting data with the Oracle Data Pump API (the DBMS_DATAPUMP package)?

    You would call the metadata filter API like this:
    dbms_datapump.metadata_filter(
      handle => your_handle_here,
      name   => 'EXCLUDE_PATH_LIST',
      value  => '''STATISTICS''');
    Hope this helps.
    Dean

  • Data Pump API -- how to exclude constraints?

    Hi,
    We wrote a PL/SQL procedure using the Data Pump API to perform a schema export, as in http://www.oracle-base.com/articles/10g/OracleDataPump10g.php#DataPumpAPI
    We would like our export to exclude constraints and indexes, but I can't find an example of what the syntax would be.
    Could anybody please give me an example of what this metadata_filter call would look like?
    Thanks,
    Nora

    In DBMS_DATAPUMP.metadata_filter, use the "name" parameter EXCLUDE_PATH_LIST or EXCLUDE_PATH_EXPR.
    Check the following link for details:
    http://download-west.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_datpmp.htm#sthref2182
    Virag
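    A sketch of what such a filter could look like (hdl is a placeholder handle; the path names match the expdp EXCLUDE values for constraints and indexes, and note that NOT NULL constraints are not covered by CONSTRAINT):
    -- exclude constraints (including referential ones) and indexes
    dbms_datapump.metadata_filter(
      handle => hdl,
      name   => 'EXCLUDE_PATH_EXPR',
      value  => 'IN (''CONSTRAINT'', ''REF_CONSTRAINT'', ''INDEX'')');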

  • DATA PUMP API returning ORA-31655

    Hello Gurus,
    I am using the code below to import a table (EMP) from one database to another over a network link.
    set serveroutput on;
    DECLARE
      ind          NUMBER;        -- Loop index
      spos         NUMBER;        -- String starting position
      slen         NUMBER;        -- String length for output
      h1           NUMBER;        -- Data Pump job handle
      percent_done NUMBER;        -- Percentage of job complete
      job_state    VARCHAR2(30);  -- To keep track of job state
      le           ku$_LogEntry;  -- For WIP and error messages
      js           ku$_JobStatus; -- The job status from get_status
      jd           ku$_JobDesc;   -- The job description from get_status
      sts          ku$_Status;    -- The status object returned by get_status
    BEGIN
      h1 := DBMS_DATAPUMP.OPEN('IMPORT','TABLE','DBLINK',NULL,'LATEST');
      DBMS_DATAPUMP.METADATA_FILTER(h1,'NAME_EXPR','IN (''SCOTT.EMP'')','TABLE');
      DBMS_DATAPUMP.SET_PARAMETER(h1,'TABLE_EXISTS_ACTION','REPLACE');
      DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_SCHEMA','SCOTT','SCOTT');
      --DBMS_DATAPUMP.METADATA_FILTER(h1,'INCLUDE_PATH_LIST','like''TABLE''');
      DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_TABLESPACE','USERS','USERS');
      DBMS_DATAPUMP.SET_PARALLEL(h1,8);
      begin
        dbms_datapump.start_job(h1);
        dbms_output.put_line('Data Pump job started successfully');
      exception
        when others then
          if sqlcode = dbms_datapump.success_with_info_num then
            dbms_output.put_line('Data Pump job started with info available:');
            dbms_datapump.get_status(h1,
                                     dbms_datapump.ku$_status_job_error,0,
                                     job_state,sts);
            if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0) then
              le := sts.error;
              if le is not null then
                ind := le.FIRST;
                while ind is not null loop
                  dbms_output.put_line(le(ind).LogText);
                  ind := le.NEXT(ind);
                end loop;
              end if;
            end if;
          else
            raise;
          end if;
      end;
      -- The import job should now be running. In the following loop, we monitor
      -- the job until it completes; in the meantime, progress information is
      -- displayed.
      percent_done := 0;
      job_state := 'UNDEFINED';
      while (job_state != 'COMPLETED') and (job_state != 'STOPPED') loop
        dbms_datapump.get_status(h1,
                                 dbms_datapump.ku$_status_job_error +
                                 dbms_datapump.ku$_status_job_status +
                                 dbms_datapump.ku$_status_wip,-1,job_state,sts);
        js := sts.job_status;
        -- If the percentage done changed, display the new value.
        if js.percent_done != percent_done then
          dbms_output.put_line('*** Job percent done = ' ||
                               to_char(js.percent_done));
          percent_done := js.percent_done;
        end if;
        -- Display any work-in-progress (WIP) or error messages received for the job.
        if (bitand(sts.mask,dbms_datapump.ku$_status_wip) != 0) then
          le := sts.wip;
        else
          if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0) then
            le := sts.error;
          else
            le := null;
          end if;
        end if;
        if le is not null then
          ind := le.FIRST;
          while ind is not null loop
            dbms_output.put_line(le(ind).LogText);
            ind := le.NEXT(ind);
          end loop;
        end if;
      end loop;
      -- Indicate that the job finished and detach from it.
      dbms_output.put_line('Job has completed');
      dbms_output.put_line('Final job state = ' || job_state);
      dbms_datapump.detach(h1);
    -- Any exceptions that propagated to this point will be captured; the
    -- details are retrieved from get_status and displayed.
    exception
      when others then
        dbms_output.put_line('Exception in Data Pump job');
        dbms_datapump.get_status(h1,dbms_datapump.ku$_status_job_error,0,
                                 job_state,sts);
        if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0) then
          le := sts.error;
          if le is not null then
            ind := le.FIRST;
            while ind is not null loop
              spos := 1;
              slen := length(le(ind).LogText);
              if slen > 255 then
                slen := 255;
              end if;
              while slen > 0 loop
                dbms_output.put_line(substr(le(ind).LogText,spos,slen));
                spos := spos + 255;
                slen := length(le(ind).LogText) + 1 - spos;
              end loop;
              ind := le.NEXT(ind);
            end loop;
          end if;
        end if;
    END;
    When I run the same code with the mode changed to a SCHEMA-level import, it works fine.
    But when I want to import a single table, I get the error below:
    Data Pump job started successfully
    Starting "SCOTT"."SYS_IMPORT_TABLE_06":
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 0 KB
    ORA-31655: no data or metadata objects selected for job
    *** Job percent done = 100
    Job "SCOTT"."SYS_IMPORT_TABLE_06" completed with 1 error(s) at 20:56:23
    Job has completed
    Final job state = COMPLETED
    Can you give any suggestions on this error?
    Thanks in advance

    Did you check this link?
    http://arjudba.blogspot.com/2009/01/ora-31655-no-data-or-metadata-objects.html
    Does it provide any help?
    Regards.
    Satyaki De.
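    For what it's worth, ORA-31655 in TABLE mode is often caused by the NAME_EXPR filter being compared against unqualified object names, so 'IN (''SCOTT.EMP'')' matches nothing. A sketch of splitting the schema into its own filter (untested against this exact setup):
    -- NAME_EXPR matches object names only; qualify the schema separately
    DBMS_DATAPUMP.METADATA_FILTER(h1, 'SCHEMA_EXPR', 'IN (''SCOTT'')');
    DBMS_DATAPUMP.METADATA_FILTER(h1, 'NAME_EXPR', 'IN (''EMP'')', 'TABLE');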

  • Help needed with Export Data Pump using API

    Hi All,
    I am trying to do a Data Pump export using the API.
    While the export as well as the import works fine from the command line, it fails with the API.
    This is the command line program:
    expdp pxperf/dba@APPN QUERY=dev_pool_data:\"WHERE TIME_NUM > 1204884480100\" DUMPFILE=EXP_DEV.dmp tables=PXPERF.dev_pool_data
    Could you help me with how to achieve the same as above using the Oracle Data Pump API?
    DECLARE
      h1 NUMBER;
    BEGIN
      h1 := dbms_datapump.open('EXPORT','TABLE',NULL,'DP_EXAMPLE10','LATEST');
      dbms_datapump.add_file(h1,'example3.dmp','DATA_PUMP_TEST',NULL,1);
      dbms_datapump.add_file(h1,'example3_dump.log','DATA_PUMP_TEST',NULL,3);
      dbms_datapump.metadata_filter(h1,'NAME_LIST','(''DEV_POOL_DATA'')');
      dbms_datapump.start_job(h1);  -- the job must be started explicitly
      dbms_datapump.detach(h1);
    END;
    Also, I want to know how to export and import multiple (selective) tables through the API using one single criterion like "WHERE TIME_NUM > 1204884480100"

    Yes, I have read the Oracle doc.
    I was able to proceed as below, but it gives an error.
    ============================================================
    SQL> SET SERVEROUTPUT ON SIZE 1000000
    SQL> DECLARE
    2 l_dp_handle NUMBER;
    3 l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    4 l_job_state VARCHAR2(30) := 'UNDEFINED';
    5 l_sts KU$_STATUS;
    6 BEGIN
    7 l_dp_handle := DBMS_DATAPUMP.open(
    8 operation => 'EXPORT',
    9 job_mode => 'TABLE',
    10 remote_link => NULL,
    11 job_name => '1835_XP_EXPORT',
    12 version => 'LATEST');
    13
    14 DBMS_DATAPUMP.add_file(
    15 handle => l_dp_handle,
    16 filename => 'x1835_XP_EXPORT.dmp',
    17 directory => 'DATA_PUMP_DIR');
    18
    19 DBMS_DATAPUMP.add_file(
    20 handle => l_dp_handle,
    21 filename => 'x1835_XP_EXPORT.log',
    22 directory => 'DATA_PUMP_DIR',
    23 filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    24
    25 DBMS_DATAPUMP.data_filter(
    26 handle => l_dp_handle,
    27 name => 'SUBQUERY',
    28 value => '(where "XP_TIME_NUM > 1204884480100")',
    29 table_name => 'ldev_perf_data',
    30 schema_name => 'XPSLPERF'
    31 );
    32
    33 DBMS_DATAPUMP.start_job(l_dp_handle);
    34
    35 DBMS_DATAPUMP.detach(l_dp_handle);
    36 END;
    37 /
    DECLARE
    ERROR at line 1:
    ORA-39001: invalid argument value
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3688
    ORA-06512: at line 25
    ============================================================
    I have a table called LDEV_PERF_DATA in schema XPSLPERF.
    The line value => '(where "XP_TIME_NUM > 1204884480100")' above is the condition on which I want to filter the data.
    However, the below snippet works fine.
    ============================================================
    SET SERVEROUTPUT ON SIZE 1000000
    DECLARE
    l_dp_handle NUMBER;
    l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    l_job_state VARCHAR2(30) := 'UNDEFINED';
    l_sts KU$_STATUS;
    BEGIN
    l_dp_handle := DBMS_DATAPUMP.open(
    operation => 'EXPORT',
    job_mode => 'SCHEMA',
    remote_link => NULL,
    job_name => 'ldev_may20',
    version => 'LATEST');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.dmp',
    directory => 'DATA_PUMP_DIR');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.log',
    directory => 'DATA_PUMP_DIR',
    filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    DBMS_DATAPUMP.start_job(l_dp_handle);
    DBMS_DATAPUMP.detach(l_dp_handle);
    END;
    ============================================================
    I don't want to export all contents as above; I want to export data based on some conditions, and only for selective tables.
    Any help is highly appreciated.
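    A hedged guess at the ORA-39001 above: the SUBQUERY value wraps the whole predicate in double quotes, which turns it into an invalid quoted identifier rather than a WHERE clause, and the dictionary stores the table name in uppercase. A sketch of the corrected call:
    -- sketch only: plain WHERE clause, table name in uppercase
    DBMS_DATAPUMP.data_filter(
      handle      => l_dp_handle,
      name        => 'SUBQUERY',
      value       => 'WHERE xp_time_num > 1204884480100',
      table_name  => 'LDEV_PERF_DATA',
      schema_name => 'XPSLPERF');
    If table_name and schema_name are omitted, the filter applies to every table in the job, which is one way to apply a single criterion to several tables (each table must have the referenced column).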

  • Data Pump PL/SQL API: please help

    Hi all,
    I want to use the Data Pump PL/SQL API provided by Oracle. I have created a procedure named SP_EXPORT, shown below.
    The problem is that I get this error message; any suggestions or help, please?
    ORA-31626: job does not exist
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 911
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 4356
    ORA-06512: at line 7
    CREATE OR REPLACE PROCEDURE SP_EXPORT AS
      l_dp_handle      NUMBER;
      l_last_job_state VARCHAR2(30) := 'UNDEFINED';
      l_job_state      VARCHAR2(30) := 'UNDEFINED';
      l_sts            KU$_STATUS;
    BEGIN
      l_dp_handle := DBMS_DATAPUMP.open(
        operation   => 'EXPORT',
        job_mode    => 'SCHEMA',
        remote_link => NULL,
        job_name    => 'EMP_EXPORT',
        version     => 'LATEST');
      DBMS_DATAPUMP.add_file(
        handle    => l_dp_handle,
        filename  => 'MIVA_DEVXEN03.dmp',
        directory => 'DATAPUMP_DIR');
      DBMS_DATAPUMP.add_file(
        handle    => l_dp_handle,
        filename  => 'MIVA_DEVXEN03.log',
        directory => 'DATAPUMP_DIR',
        filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
      DBMS_DATAPUMP.metadata_filter(
        handle => l_dp_handle,
        name   => 'SCHEMA_EXPR',
        value  => '= ''MIVA''');
      DBMS_DATAPUMP.start_job(l_dp_handle);
      DBMS_DATAPUMP.detach(l_dp_handle);
    END;

    I connected as SYSDBA and it works; wow, very happy ;) (Most likely the privileges DBMS_DATAPUMP needs were granted only through a role, which does not apply inside a definer's-rights procedure.)

  • Schema export via Oracle data pump with Database Vault enabled question

    Hi,
    I have installed and configured Database Vault on Oracle 11gR2 (11.2.0.3) to protect a specific schema (SCHEMA_NAME) via a realm. I have followed this doc:
    http://www.oracle.com/technetwork/database/security/twp-databasevault-dba-bestpractices-199882.pdf
    to ensure that the SYS and SYSTEM users have sufficient rights to complete a scheduled Oracle Data Pump export operation.
    I.e., I have granted the following to SYS and SYSTEM:
    execute dvsys.dbms_macadm.authorize_scheduler_user('sys','SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_scheduler_user('system','SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_datapump_user('sys','SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_datapump_user('system','SCHEMA_NAME');
    I have also created a second realm on the same schema (SCHEMA_NAME) to allow SYS and SYSTEM to maintain indexes for realm-protected tables. This separate realm was created for all their index types (Index, Index Partition, and Indextype), and SYS and SYSTEM have been authorized as OWNER of this realm.
    However, when I try to complete an Oracle Data Pump export operation on the schema, I get two errors directly after the following line displayed in the export log:
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX:
    ORA-39127: unexpected error from call to export_string :=SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_NOTIFY_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newblock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081
    ORA-39127: unexpected error from call to export_string :=SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_LOADER_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newblock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081
    The export completes, but with these errors.
    Any help, suggestions, pointers... actually anything will be very welcome at this stage.
    Thank you

    Hi Srini,
    Thank you very much for your help. Unfortunately, after following the instructions in the doc, I am still getting the same errors.
    Nonetheless, thank you for your input.
    I was also wondering if someone could tell me how to move this thread to the Database Security area of the forum; I feel I may have posted it in the wrong place, as it appears to be a Database Vault issue and not an imp/exp problem.
    Edited by: zooid on May 20, 2012 10:33 PM
    Edited by: zooid on May 20, 2012 10:36 PM

  • Data Pump using 11g2 and 11g1 question during migration

    Our DBE put our test database on 11gR2 and our production database on 11gR1. Due to an ingest failure, we wanted to move the data in test (11gR2) to production (11gR1) using Data Pump; however, I was told that you cannot go from 11gR2 to 11gR1. I was also told that because the database contained public synonyms, I would have to recreate all public synonyms; he said it had something to do with LBACSYS. Can someone clarify this for me?

    user11171364 wrote:
    ... Can I still use these parameters during the import ONLY, without having used them during the export?
    Nope, read the restriction on the import side: "This parameter is valid only when the NETWORK_LINK parameter is also specified."
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm#sthref299
    Consequently, you cannot use it with your dump file.
    Nicolas.
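    On the original 11gR2-to-11gR1 question: the usual approach (a sketch; check your exact patch levels) is to run the export with VERSION set to the target release, so that the older impdp can read the dump file. The schema, directory, and file names here are placeholders:
    expdp system/***** schemas=APP_SCHEMA directory=DATA_PUMP_DIR dumpfile=app_for_11_1.dmp version=11.1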

  • Question for experts: wrapping and Data Pump

    Hello All,
    I am using Oracle 10.2.0.4.0.
    I have an issue while exporting/importing wrapped procedures:
    I have some procedures/packages that are wrapped, and when I export their schema and then import it into the database, I get the error below:
    Compilation errors for PROCEDURE PROC_NAME
    Error: PLS-00753: malformed or corrupted wrapped unit
    Line: 1
    Text: CREATE OR REPLACE PROCEDURE "PROC_NAME" wrapped
    After many tests of this scenario I noted the following:
    This error does not appear for all procedures; some are corrupted after the import and some are not.
    After the normal export/import, I found that the character set used by imp/exp differed from that of the database, so I changed the database character set to match the one used by the import. I redid my tests and found that this solved the issue for the normal imp/exp: recompilation of procedures no longer fails after the old-style import. But at this stage the imp/exp character set has changed again.
    The character set of my DB was AR8MSWIN1256, while exp/imp used AR8ISO8859P6, so I changed my DB to AR8ISO8859P6. After changing the database character set to AR8ISO8859P6, recompilation worked fine for the normal exp/imp, but now the character set for imp/exp changed to AR8MSWIN1256!! Why?
    The Data Pump export/import problem was not solved even after changing the database character set: the procedures were imported with errors, which were fixed only after recompiling them.
    My issue is that I have too many clients and am not always able to change the database character set, because I may hit ORA-12712: new character set must be a superset of old character set.
    Hope the above description is clear.
    Any suggestions or ideas to solve the issue of wrapped procedures with expdp/impdp? Can I force a certain character set for expdp/impdp? Note that with release 10.2.0.1.0, expdp/impdp does not work at all with wrapped procedures.

    Any ideas?

  • Data Pump - expdp / impdp utility question

    Hi All,
    As a basic exercise to learn Data Pump, I am trying to export the schema SCOTT and then load the dump file into a TEST_T schema in the same database.
    --1. created dir object from sysdba
    CREATE DIRECTORY dpump_dir1 AS 'C:\output_dir';
    --2. created the dir on the OS as C:\output_dir
    --3. run expdp
    expdp system/*****@orcl schemas=scott DIRECTORY=dpump_dir1 JOB_NAME=hr DUMPFILE=scott_orcl_nov5.dmp PARALLEL=4
    --4. create test_t schema and grant dba to it.
    --5. run impdp
    impdp system/*****@orcl schemas=test_t DIRECTORY=dpump_dir1 JOB_NAME=hr DUMPFILE=scott_orcl_nov5.dmp PARALLEL=8
    It fails here with ORA-39165: schema test_t not found, although the schema test_t does exist.
    So I am not sure why it gives this error. It seems that the Scott schema in the expdp dump file cannot be loaded into any schema other than one named SCOTT. Is that right? If yes, how can I load all the objects of schema SCOTT into another schema, say TEST_T? It would be helpful if you could show the respective expdp and impdp commands.
    Thanks a lot
    KS

    The test_t schema does not exist in the export dump file; you should remap from the input scott schema to the target test_t schema.
    REMAP_SCHEMA : http://download.oracle.com/docs/cd/E11882_01/server.112/e22490/dp_import.htm#SUTIL927
    Nicolas.
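    For example, keeping the poster's file names (the expdp command stays as it was; on import, SCHEMAS should name the source schema or be omitted, not name the target):
    impdp system/*****@orcl DIRECTORY=dpump_dir1 DUMPFILE=scott_orcl_nov5.dmp REMAP_SCHEMA=scott:test_t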

  • Data Pump .xlsx into a SQL Server Table and the whole 32-Bit, 64-Bit discussion

    First of all...I have a headache!
    Found LOTS of Google hits when trying to pump a .xlsx file into a SQL Server table, and the whole discussion of the Microsoft ACE 64-bit driver versus the Microsoft Jet 32-bit driver.
    Specifically, I am receiving this error...
    An OLE DB record is available.  Source: "Microsoft Office Access Database Engine"  Hresult: 0x80004005  Description: "External table is not in the expected format.".
    Error: 0xC020801C at Data Flow Task to Load Alere Coaching Enrolled, Excel Source [56]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "Excel Connection Manager"
    failed with error code 0xC0202009.
    Strangely enough, if I simply pump ONE .xlsx file into a SQL Server table with my SSIS package, it seems to work fine. If instead I try to be proactive and allow for multiple .xlsx files by using a Foreach Loop Container and a variable
    @[User::FileName], it errors out... but not really, because it is indeed storing the rows in the SQL Server table. I did check all my Delay
    Why does this have to be sooooooo difficult???
    Can anyone help me out here in trying to set up an SSIS package, in a rather restrictive environment, to pump a .xlsx file into a SQL Server table? What in God's name am I doing wrong? Or is all this a misnomer? But if it's working, how do I disable the error
    so that it stops erroring out?

    Hi ITBobbyP,
    According to your description, you get the error message when you import data from a .xlsx file to a SQL Server database.
    The error can be caused by the following:
    The Excel file is locked by another process. Please resave the file under a different name to see if that fixes the issue.
    The ACE (Access Database Engine) is not up to date, as Vaibhav mentioned. Please download and install the latest ACE from this link:
    https://www.microsoft.com/en-us/download/details.aspx?id=13255.
    The bitness of Office and the server do not match. To solve that problem, please refer to the following document:
    http://hrvoje.piasevoli.com/2010/09/01/importing-data-from-64-bit-excel-in-ssis/
    If you have any more questions, please feel free to ask.
    Thanks,
    Wendy Fu
    TechNet Community Support

  • How to find tables with columns not supported by Data Pump NETWORK_LINK

    Hi Experts,
    We are trying to import a database into a new DB with Data Pump NETWORK_LINK.
    As the Oracle documentation states, tables with columns that are object types are not supported in a network export; an ORA-22804 error will be generated and the export will move on to the next table. To work around this restriction, you can manually create the dependent object types within the database from which the export is being run.
    My question: how do I find these tables with object-type columns that are not supported in a network export?
    We have LOBs and the Oracle Spatial SDO_GEOMETRY object type. Our database is about 300 GB, and a normal exp takes 30 hours.
    We are trying Data Pump with NETWORK_LINK to speed up the export process.
    How do we fix the Oracle Spatial SDO_GEOMETRY user-type issue during Data Pump?
    Our system is 32-bit Windows 2003 with a 10gR2 database.
    Thanks
    Jim
    Edited by: user589812 on Nov 3, 2009 12:59 PM

    Hi,
    I remember there being issues with sdo_geometry and DataPump. You may want to contact oracle support with this issue.
    Dean
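    For the "how to find them" part, a data-dictionary query along these lines should list the candidate tables (a sketch; DATA_TYPE_OWNER is populated for user-defined object types such as SDO_GEOMETRY, while built-in LOB columns leave it null):
    -- tables owning columns whose datatype is a user-defined object type
    SELECT owner, table_name, column_name, data_type_owner, data_type
    FROM   dba_tab_columns
    WHERE  data_type_owner IS NOT NULL
    ORDER  BY owner, table_name;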

  • Can we use Data Pump to export data using a SQL query that does a join?

    Folks,
    I have a quick question.
    Using Oracle 10g R2 on Solaris 10.
    Can Data Pump be used to export data using a SQL query that does a join between 3 tables?
    Thanks,
    Ashish

    Hello,
    No; this is from expdp help=y:
    QUERY    Predicate clause used to export a subset of a table.
    Regards
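    That said, the QUERY clause may contain a subquery, which can approximate a join as a filter (a sketch with hypothetical tables and columns, using the same shell escaping as the command lines above):
    expdp scott/***** tables=EMP directory=DATA_PUMP_DIR dumpfile=emp_subset.dmp query=EMP:\"WHERE deptno IN (SELECT deptno FROM dept WHERE loc = 'DALLAS')\"
    The export still writes rows from a single table; the other tables only participate in the predicate.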
