Data Pump query

Just started looking at Data Pump yesterday and it seems like a great tool.
I've had no issues getting data transferred between databases and mapped to the original tables. However, I have refactored a couple of parts of my db and hoped I could get a pointer in the right direction.
http://docs.oracle.com/cd/B12037_01/server.101/b10825/dp_import.htm
I'm hoping to import some of my data from table BAD_DESIGN in my old schema and move a section of it to table GOOD_DESIGN in my new schema - I'm also hoping this is possible on the import.
e.g.
OLD.BAD_DESIGN table
name1 VARCHAR2
number1 INTEGER
name2 VARCHAR2
number2 INTEGER
name3 VARCHAR2
number3 INTEGER
I want to take only name1 and number1 and insert them into my new table
NEW.GOOD_DESIGN table
name1 VARCHAR2
number1 INTEGER
The instruction I've pasted below is not working, which is obvious as I'm not defining the column the data should be inserted into in the new schema. I also don't think I need a function to remap the data here ...
impdp NEW/NEW_PASSWORD@NEW_DB:PORT/SERVICE dumpfile=BAD_DESIGN.dmp TABLES=NEW.GOOD_DESIGN CONTENT=DATA_ONLY remap_data=OLD.BAD_DESIGN.name1:remapConfig.remapString
Is this even possible with Data Pump? If so, a pointer to what I'm missing in the command or the documentation would be appreciated.
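For what it's worth, REMAP_DATA passes each value of a single column through a PL/SQL package function as the rows are loaded; it transforms values, but it cannot redirect data into a different table or column. A minimal sketch of the kind of function the remap_data clause above expects (the package and function names mirror the command; the body is illustrative only):

CREATE OR REPLACE PACKAGE remapConfig AS
  FUNCTION remapString (p_val VARCHAR2) RETURN VARCHAR2;
END remapConfig;
/
CREATE OR REPLACE PACKAGE BODY remapConfig AS
  FUNCTION remapString (p_val VARCHAR2) RETURN VARCHAR2 IS
  BEGIN
    -- Data Pump calls this once per row; it must accept and return the
    -- column's datatype. The transformation below is illustrative only.
    RETURN UPPER(p_val);
  END remapString;
END remapConfig;
/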

Thanks for the reply
Database versions are 10.2.0.4.0 (old) and 11.2.0.3.0 (new), both sitting on a Linux server.
Are you asking if datapump can be used to move data from a table that contains 6 columns into a table that contains 2 columns?
Kind of... not really specifically to do with the number of columns; more whether I can get a subset of data from a data dump and import it into a different table in a refactored db.
To clarify, I'm asking if the data contained in two columns of an old table can be 'extracted' from the .dmp file and inserted into a different table in a different schema (the column types remain the same).
To clarify with a better example - I have a table
OLD.FAMILY
dad_name VARCHAR2
dad_age INTEGER
mum_name VARCHAR2
mum_age INTEGER
child1_name VARCHAR2
......etc.
I export this table using datapump
Now I only want to insert the data from dad_name and dad_age into my new table
NEW.DAD
name VARCHAR2
age INTEGER
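One common way to do this, assuming the dump contains OLD.FAMILY, is a two-step import: bring the old table into the target database as a staging table, then copy just the wanted columns with plain SQL. A sketch (directory, file and staging-table names are illustrative; REMAP_TABLE needs an 11g impdp):

REM Step 1 (OS command): import OLD.FAMILY into the NEW schema as a staging table
REM   impdp NEW/password@NEW_DB directory=DATA_PUMP_DIR dumpfile=FAMILY.dmp
REM         tables=OLD.FAMILY remap_schema=OLD:NEW remap_table=FAMILY:FAMILY_STG
REM Step 2: move only the two columns, then drop the staging table
INSERT INTO new.dad (name, age)
SELECT dad_name, dad_age FROM new.family_stg;
COMMIT;
DROP TABLE new.family_stg PURGE;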

Similar Messages

  • Data pump, Query "1=2" performance?

    Hi guys
    I am trying to export a schema using data pump; however, I need no data from a few of the tables since they are irrelevant, but I'd still like to have the structure of the tables themselves along with any constraints and such.
    I thought of using the QUERY parameter with a "1=2" query, making it so that I can filter out all the data from certain tables in the export while keeping everything else.
    While this works, I wonder whether data pump/Oracle is smart enough not to run this query through the entire table. If it does perform a full table scan, can anybody recommend another way of excluding just the data of certain tables while still getting the table structure itself along with everything else related to it?
    I have been unable to find such information after searching the net for a good while.
    Regards
    Alex

    Thanks.
    Does that mean 1=2 actually scans the entire table so it should be avoided in the future?
    Regards
    Alex
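    For reference, a parfile sketch of the per-table predicate approach (schema and table names are illustrative):
    SCHEMAS=SCOTT
    DIRECTORY=DATA_PUMP_DIR
    DUMPFILE=scott_structure.dmp
    LOGFILE=scott_structure.log
    QUERY=SCOTT.BIG_AUDIT_TABLE:"WHERE 1=2"
    QUERY=SCOTT.BIG_LOG_TABLE:"WHERE 1=2"
    As to the scan question: the optimizer normally folds an always-false predicate such as 1=2 into a NULL IS NOT NULL filter and returns no rows without visiting the table blocks, which you can verify by checking the execution plan of SELECT * FROM t WHERE 1=2.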

  • Can we use Data Pump to export data, using a SQL query, doing a join

    Folks,
    I have a quick question.
    Using Oracle 10g R2 on Solaris 10.
    Can Data Pump be used to export data, using a SQL query which is doing a join between 3 tables?
    Thanks,
    Ashish

    Hello,
    No, this is from expdp help=Y:
    QUERY                 Predicate clause used to export a subset of a table.
    Regards
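    Data Pump cannot export the result set of a join directly, but there are two common workarounds: the QUERY predicate may contain a subquery that references other tables, or you can materialize the join and export that table. A sketch of the second option (table and column names are illustrative):
    CREATE TABLE export_stage AS
    SELECT e.empno, e.ename, d.dname
    FROM   emp e
    JOIN   dept d ON d.deptno = e.deptno;
    -- then: expdp scott/tiger tables=EXPORT_STAGE directory=DATA_PUMP_DIR dumpfile=join.dmp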

  • Data pump using table query

    I am trying to perform a data pump export on a table using a query within a parfile, and I am getting some odd behaviour. The database version is 10.2.0.4.3 and the OS is AIX 5.3. The query looks like this:
    QUERY="POSDECLARATIONQUEUE:where SESSIONID in (select 'B.SESSIONID' from POSACCOUNT A, POSDECLARATIONQUEUE B, POSDECLARATIONSESSION C where 'B.SESSIONID' = 'C.ID' and 'C.ACCOUNTID' = 'A.ID' and 'A.SITE' = '10252')"
    This works but gets 0 rows. If I run the query against the instance in a SQL*Plus session as below, then I get 0 rows returned.
    select * from POSDECLARATIONQUEUE where SESSIONID in (select 'B.SESSIONID' from POSACCOUNT A, POSDECLARATIONQUEUE B, POSDECLARATIONSESSION C where 'B.SESSIONID' = 'C.ID' AND 'C.ACCOUNTID' = 'A.ID' and 'A.SITE' = '10252');
    If I take out the single quotes from around the columns within the query against the instance within SQL*Plus, I get over 2000 rows returned.
    SQL> select count(*) from POSDECLARATIONQUEUE where SESSIONID in (select B.SESSIONID from POSACCOUNT A, POSDECLARATIONQUEUE B, POSDECLARATIONSESSION C where B.SESSIONID = C.ID and C.ACCOUNTID = A.ID and A.SITE = 10252);
    COUNT(*)
    2098
    If I remove the single quotes from the parfile query then I get the following error within the data pump export.
    UDE-00014: invalid value for parameter, 'schemas'.
    The SCHEMAS option is not specified within the parfile and the TABLES option only specifies the table POSDECLARATIONQUEUE.
    Can someone assist with this? I just can't seem to get the syntax right for it to work within data pump.
    Kind Regards.
    Graeme.

    It looks like your query might be a little wrong:
    This is what you have:
    QUERY="POSDECLARATIONQUEUE:where SESSIONID in (select 'B.SESSIONID' from POSACCOUNT A, POSDECLARATIONQUEUE B, POSDECLARATIONSESSION C where 'B.SESSIONID' = 'C.ID' and 'C.ACCOUNTID' = 'A.ID' and 'A.SITE' = '10252')"
    This is what I would have thought it should look like:
    QUERY=POSDECLARATIONQUEUE:"where SESSIONID in (select B.SESSIONID from POSACCOUNT A, POSDECLARATIONQUEUE B, POSDECLARATIONSESSION C where B.SESSIONID = C.ID and C.ACCOUNTID = A.ID and A.SITE = 10252)"
    You want double quotes around the complete query, and you don't need the single quotes around the column references. The single quotes treat those values as string literals, so
    'B.SESSIONID' = 'C.ID'
    asks whether the string B.SESSIONID is equal to the string C.ID.
    The query you ran in SQL*Plus used
    B.SESSIONID = C.ID
    which compares the value stored in B.SESSIONID with the value stored in C.ID,
    which is what you want.
    Dean
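    Folding Dean's correction into a complete parfile would look something like this (directory and file names are illustrative); inside a parfile the double quotes need no shell escaping:
    TABLES=POSDECLARATIONQUEUE
    DIRECTORY=DATA_PUMP_DIR
    DUMPFILE=posdecl.dmp
    LOGFILE=posdecl.log
    QUERY=POSDECLARATIONQUEUE:"where SESSIONID in (select B.SESSIONID from POSACCOUNT A, POSDECLARATIONQUEUE B, POSDECLARATIONSESSION C where B.SESSIONID = C.ID and C.ACCOUNTID = A.ID and A.SITE = 10252)"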

  • Can we load data in chunks using data pump ?

    We are loading data using data pump, so I want to check my understanding.
    Please correct me if I am wrong in my understanding -
    ODI will fetch all data from the source (whether it is INIT or CDC) in one go and unload it into the staging area.
    If that is true, will performance suffer with very large volumes (50 million records at source), since ODI tries to load the entire data set in one go? I believe it would give better performance if we loaded in chunks using data pump.
    Please confirm and correct.
    Also, I would like to know how we can configure chunked loading using data pump.
    Thanks in Advance.
    Regards,
    Dinesh.

    You may consider using LKM Oracle to Oracle (datapump):
    http://docs.oracle.com/cd/E28280_01/integrate.1111/e12644/oracle_db.htm#r15c1-t2
    In 11g, ODI reads from the source and writes to the target in parallel. This is the case where you specify a select query in the source command and an insert/update query in the target command. On the source side, ODI reads records from the source and adds them to a data queue; on the target side, a parallel thread reads data from the queue and writes to the target. So the overall performance will be bounded by the slower of the read and write processes.
    Thanks,

  • Help needed with Export Data Pump using API

    Hi All,
    I am trying to do a Data Pump export using the API.
    While the export as well as the import works fine from the command line, it's failing with the API.
    This is the command line program:
    expdp pxperf/dba@APPN QUERY=dev_pool_data:\"WHERE TIME_NUM > 1204884480100\" DUMPFILE=EXP_DEV.dmp tables=PXPERF.dev_pool_data
    Could you help me with how I should achieve the same as above using the Oracle Data Pump API?
    DECLARE
    h1 NUMBER;
    BEGIN
    h1 := dbms_datapump.open('EXPORT','TABLE',NULL,'DP_EXAMPLE10','LATEST');
    dbms_datapump.add_file(h1,'example3.dmp','DATA_PUMP_TEST',NULL,1);
    dbms_datapump.add_file(h1,'example3_dump.log','DATA_PUMP_TEST',NULL,3);
    dbms_datapump.metadata_filter(h1,'NAME_LIST','(''DEV_POOL_DATA'')');
    END;
    /
    Also, in the API I want to know how to export and import multiple tables (selective tables only) using one single criterion like "WHERE TIME_NUM > 1204884480100"

    Yes, I have read the Oracle doc.
    I was able to proceed as below, but it gives an error.
    ============================================================
    SQL> SET SERVEROUTPUT ON SIZE 1000000
    SQL> DECLARE
    2 l_dp_handle NUMBER;
    3 l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    4 l_job_state VARCHAR2(30) := 'UNDEFINED';
    5 l_sts KU$_STATUS;
    6 BEGIN
    7 l_dp_handle := DBMS_DATAPUMP.open(
    8 operation => 'EXPORT',
    9 job_mode => 'TABLE',
    10 remote_link => NULL,
    11 job_name => '1835_XP_EXPORT',
    12 version => 'LATEST');
    13
    14 DBMS_DATAPUMP.add_file(
    15 handle => l_dp_handle,
    16 filename => 'x1835_XP_EXPORT.dmp',
    17 directory => 'DATA_PUMP_DIR');
    18
    19 DBMS_DATAPUMP.add_file(
    20 handle => l_dp_handle,
    21 filename => 'x1835_XP_EXPORT.log',
    22 directory => 'DATA_PUMP_DIR',
    23 filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    24
    25 DBMS_DATAPUMP.data_filter(
    26 handle => l_dp_handle,
    27 name => 'SUBQUERY',
    28 value => '(where "XP_TIME_NUM > 1204884480100")',
    29 table_name => 'ldev_perf_data',
    30 schema_name => 'XPSLPERF'
    31 );
    32
    33 DBMS_DATAPUMP.start_job(l_dp_handle);
    34
    35 DBMS_DATAPUMP.detach(l_dp_handle);
    36 END;
    37 /
    DECLARE
    ERROR at line 1:
    ORA-39001: invalid argument value
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3688
    ORA-06512: at line 25
    ============================================================
    I have a table called LDEV_PERF_DATA in schema XPSLPERF.
    value => '(where "XP_TIME_NUM > 1204884480100")' above is the condition on which I want to filter the data.
    However, the below snippet works fine.
    ============================================================
    SET SERVEROUTPUT ON SIZE 1000000
    DECLARE
    l_dp_handle NUMBER;
    l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    l_job_state VARCHAR2(30) := 'UNDEFINED';
    l_sts KU$_STATUS;
    BEGIN
    l_dp_handle := DBMS_DATAPUMP.open(
    operation => 'EXPORT',
    job_mode => 'SCHEMA',
    remote_link => NULL,
    job_name => 'ldev_may20',
    version => 'LATEST');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.dmp',
    directory => 'DATA_PUMP_DIR');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.log',
    directory => 'DATA_PUMP_DIR',
    filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    DBMS_DATAPUMP.start_job(l_dp_handle);
    DBMS_DATAPUMP.detach(l_dp_handle);
    END;
    ============================================================
    I don't want to export all contents as above; I want to export data based on some conditions and only from selected tables.
    Any help is highly appreciated.
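    Two details stand out against the documented DATA_FILTER usage: the SUBQUERY value should be plain clause text (no wrapping parentheses or double quotes), and object names are stored in uppercase unless they were created quoted. A hedged sketch of a corrected TABLE-mode job (job and file names are illustrative); the NAME_EXPR metadata filter is also how several tables can share one data filter, answering the multi-table question:
    DECLARE
      h1 NUMBER;
    BEGIN
      h1 := DBMS_DATAPUMP.open(operation => 'EXPORT',
                               job_mode  => 'TABLE',
                               job_name  => 'XP_EXPORT_FILTERED');
      DBMS_DATAPUMP.add_file(h1, 'xp_filtered.dmp', 'DATA_PUMP_DIR');
      DBMS_DATAPUMP.add_file(h1, 'xp_filtered.log', 'DATA_PUMP_DIR',
                             filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
      -- Name the schema and the table(s); add more names to export several tables.
      DBMS_DATAPUMP.metadata_filter(h1, 'SCHEMA_EXPR', 'IN (''XPSLPERF'')');
      DBMS_DATAPUMP.metadata_filter(h1, 'NAME_EXPR', 'IN (''LDEV_PERF_DATA'')');
      -- With table_name omitted, the same WHERE clause applies to every table
      -- in the job.
      DBMS_DATAPUMP.data_filter(handle => h1,
                                name   => 'SUBQUERY',
                                value  => 'WHERE XP_TIME_NUM > 1204884480100');
      DBMS_DATAPUMP.start_job(h1);
      DBMS_DATAPUMP.detach(h1);
    END;
    /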

  • ORA-39097: Data Pump job encountered unexpected error -12801

    Hello! I am running an Oracle RAC 11.2.0.3.0 database on the IBM AIX 7.1 OS platform.
    We normally do data pump expdp backups. We created an OS-authenticated user and have had non-DBA users use this user (instead of / as sysdba, which is only used by DBAs) to run expdp. This OS-authenticated user had been working fine until it started giving the error below:
    Export: Release 11.2.0.3.0 - Production on Fri Apr 5 23:08:22 2013
    Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, Real Application Clusters, Automatic Storage Management, Oracle Label Security,
    OLAP, Data Mining, Oracle Database Vault and Real Application Testing optio
    FLASHBACK automatically enabled to preserve database integrity.
    Starting "OPS$COPBKPMG"."SYS_EXPORT_SCHEMA_16": /******** DIRECTORY=COPKBFUB_DIR dumpfile=COPKBFUB_Patch35_PreEOD_2013-04-05-23-08_%U.dmp logfile=COPKBFUB_Patch35_PreEOD_2013-04-05-23-08.log cluster=n parallel=4 schemas=BANKFUSION,CBS,UBINTERFACE,WASADMIN,CBSAUDIT,ACCTOPIC,BFBANKFUSION,PARTY,BFPARTY,WSREGISTRY,COPK
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 130.5 GB
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SYNONYM/SYNONYM
    Processing object type SCHEMA_EXPORT/DB_LINK
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/PACKAGE/PACKAGE_SPEC
    Processing object type SCHEMA_EXPORT/PACKAGE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/FUNCTION/FUNCTION
    Processing object type SCHEMA_EXPORT/FUNCTION/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PROCEDURE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/PACKAGE/COMPILE_PACKAGE/PACKAGE_SPEC/ALTER_PACKAGE_SPEC
    Processing object type SCHEMA_EXPORT/FUNCTION/ALTER_FUNCTION
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/FUNCTIONAL_INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_INDEX/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/VIEW/VIEW
    Processing object type SCHEMA_EXPORT/VIEW/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/VIEW/GRANT/CROSS_SCHEMA/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/VIEW/COMMENT
    Processing object type SCHEMA_EXPORT/PACKAGE/PACKAGE_BODY
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type SCHEMA_EXPORT/MATERIALIZED_VIEW
    Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
    . . exported "WASADMIN"."BATCHGATEWAYLOGDETAIL" 2.244 GB 9379850 rows
    . . exported "WASADMIN"."UBTB_TRANSACTION" 13.71 GB 46299982 rows
    . . exported "WASADMIN"."INTERESTHISTORY" 2.094 GB 13479801 rows
    . . exported "WASADMIN"."MOVEMENTSHISTORY" 1.627 GB 13003451 rows
    . . exported "WASADMIN"."ACCRUALSREPORT" 1.455 GB 18765315 rows
    ORA-39097: Data Pump job encountered unexpected error -12801
    ORA-39065: unexpected master process exception in MAIN
    ORA-12801: error signaled in parallel query server PZ99, instance copubdb02dc:COPKBFUB2 (2)
    ORA-01460: unimplemented or unreasonable conversion requested
    Job "OPS$COPBKPMG"."SYS_EXPORT_SCHEMA_16" stopped due to fatal error at 23:13:37
    Please assist.

    Have you seen this?
    *Bug 13099577 - ORA-1460 with parallel query [ID 13099577.8]*
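    If that bug is the match, a common interim step (an assumption here - verify it against the note) is to take parallel query out of the job until the fix is applied, e.g.:
    expdp / directory=COPKBFUB_DIR schemas=WASADMIN dumpfile=pre_eod_%U.dmp logfile=pre_eod.log cluster=n parallel=1
    (The schema list and file names above are illustrative.)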

  • Data pump error ORA-39065, status undefined after restart

    Hi members,
    The data pump full import job hung, continue_client also hung, and all of a sudden the window exited.
    ;;; Import> status
    ;;; Import> help
    ;;; Import> status
    ;;; Import> continue_client
    ORA-39065: unexpected master process exception in RECEIVE
    ORA-39078: unable to dequeue message for agent MCP from queue "KUPC$C_1_20090923181336"
    Job "SYSTEM"."SYS_IMPORT_FULL_01" stopped due to fatal error at 18:48:03
    I increased the shared pool to 100M and then restarted the job with attach=jobname. After restarting, I queried the status and found that everything is undefined. It still says undefined now, and the last log message says that the job has been reopened. That's the end of the log file and nothing else is being recorded. I am not sure what is happening now. Any ideas will be appreciated. This is version 10.2.0.3 on Windows. Thanks ...
    Job SYS_IMPORT_FULL_01 has been reopened at Wednesday, 23 September, 2009 18:54
    Import> status
    Job: SYS_IMPORT_FULL_01
    Operation: IMPORT
    Mode: FULL
    State: IDLING
    Bytes Processed: 3,139,231,552
    Percent Done: 33
    Current Parallelism: 8
    Job Error Count: 0
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest%u.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest01.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest02.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest03.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest04.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest05.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest06.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest07.dmp
    Dump File: D:\oracle\product\10.2.0\admin\devdb\dpdump\devtest08.dmp
    Worker 1 Status:
    State: UNDEFINED
    Worker 2 Status:
    State: UNDEFINED
    Object Schema: trm
    Object Name: EVENT_DOCUMENT
    Object Type: DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
    Completed Objects: 1
    Completed Rows: 78,026
    Completed Bytes: 4,752,331,264
    Percent Done: 100
    Worker Parallelism: 1
    Worker 3 Status:
    State: UNDEFINED
    Worker 4 Status:
    State: UNDEFINED
    Worker 5 Status:
    State: UNDEFINED
    Worker 6 Status:
    State: UNDEFINED
    Worker 7 Status:
    State: UNDEFINED
    Worker 8 Status:
    State: UNDEFINED

    39065, 00000, "unexpected master process exception in %s"
    // *Cause:  An unhandled exception was detected internally within the master
    //          control process for the Data Pump job.  This is an internal error.
    //          Additional messages will detail the problems.
    // *Action: If problem persists, contact Oracle Customer Support.
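    For completeness, re-attaching to the stopped job and restarting it looks like this (sketch; the job name comes from the import log):
    impdp system/password attach=SYS_IMPORT_FULL_01
    Import> start_job
    Import> status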

  • Data pump import from 11g to 10g

    I have 2 databases: the first is 11.2.0.2.0 and the second is 10.2.0.1.0.
    In 10g I created a database link to 11g:
    CREATE DATABASE LINK "TEST.LINK"
    CONNECT TO "monservice" IDENTIFIED BY "monservice"
    USING '(DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(HOST = host)(PORT = port))) (CONNECT_DATA = (SID = sid)))';
    And executed this query to test the db link, which works fine:
    select * from v$instance@test.link;
    After that I try to call the open function:
    declare
    h number;
    begin
    h := dbms_datapump.open('IMPORT', 'TABLE', 'TEST.LINK', null, '10.2');
    end;
    and get the exception: 39001. 00000 - "invalid argument value"
    If I remove 'TEST.LINK' from the arguments, it works fine.

    Hi Richard Harrison,
    result for import from 11g to 10g:
    impdp user/pass@dburl schemas=SCHEMANAME network_link=TEST_LINK version=10.2
    Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    ORA-39001: invalid argument value
    ORA-39169: Local version of 10.2.0.1.0 cannot work with remote version of 11.2.0.2.0
    result for export from 11g to 10g:
    expdp user/pass@dburl schemas=SCHEMANAME network_link=TEST_LINK version=10.2
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64 bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing option
    ORA-39006: internal error
    ORA-39065: unexpected master process exception in DISPATCH
    ORA-04052: error occurred when looking up remote object SYS.KUPM$MCP@TEST_LINK
    ORA-00604: error occurred at recursive SQL level 3
    ORA-06544: PL/SQL: internal, error, arguments: [55916], [], [], [], [], [], [], []
    ORA-06553: PLS-801: internal error [55916]
    ORA-02063: preceding 2 lines from TEST_LINK
    ORA-39097: Data Pump job encountered unexpected error -4052
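    Both results fit Data Pump's network-mode version limits, so a file-based transfer is the usual way around the link entirely (a sketch; file and connection names are illustrative):
    # On the 11.2 host: write a 10.2-compatible dump file
    expdp user/pass@db11 schemas=SCHEMANAME version=10.2 directory=DATA_PUMP_DIR dumpfile=schemaname.dmp logfile=exp_schemaname.log
    # Copy schemaname.dmp to the 10.2 host's Data Pump directory, then import locally:
    impdp user/pass@db10 schemas=SCHEMANAME directory=DATA_PUMP_DIR dumpfile=schemaname.dmp logfile=imp_schemaname.log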

  • Data Pump Export issue - no streams pool created and cannot automatically c

    I am trying to use data pump on a 10.2.0.1 database that has VLM enabled, and I am getting the following error:
    Export: Release 10.2.0.1.0 - Production on Tuesday, 20 April, 2010 10:52:08
    Connected to: Oracle Database 10g Release 10.2.0.1.0 - Production
    ORA-31626: job does not exist
    ORA-31637: cannot create job SYS_EXPORT_TABLE_01 for user E_AGENT_SITE
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 600
    ORA-39080: failed to create queues "KUPC$C_1_20100420105208" and "KUPC$S_1_20100420105208" for Data Pump job
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPC$QUE_INT", line 1555
    ORA-00832: no streams pool created and cannot automatically create one
    This is my script (which I currently use successfully on other non-VLM databases):
    expdp e_agent_site/<password>@orcl parfile=d:\DailySitePump.par
    This is my parameter file:
    DUMPFILE=site_pump%U.dmp
    PARALLEL=1
    LOGFILE=site_pump.log
    STATUS=300
    DIRECTORY=DATA_DUMP
    QUERY=wwv_document$:"where last_updated > sysdate-18"
    EXCLUDE=CONSTRAINT
    EXCLUDE=INDEX
    EXCLUDE=GRANT
    TABLES=wwv_document$
    FILESIZE=2000M
    My Oracle directory is created and the user has rights.
    Googling the issue suggests that the shared pool is too small or that streams_pool_size needs setting. shared_pool_size = 1200M, and when I query v$parameter it shows that streams_pool_size = 0.
    I've tried alter system set streams_pool_size=1M; but I just get:
    ORA-02097: parameter cannot be modified because specified value is invalid
    ORA-04033: Insufficient memory to grow pool
    The server is a Windows Enterprise box with 16GB RAM and VLM enabled; pfile memory parameters are listed below:
    # resource
    processes = 1250
    job_queue_processes = 10
    open_cursors = 1000 # no overhead if set too high
    # sga
    shared_pool_size = 1200M
    large_pool_size = 150M
    java_pool_size = 50M
    # pga
    pga_aggregate_target = 850M # custom
    # System Managed Undo and Rollback Segments
    undo_management=AUTO
    undo_tablespace=UNDOTBS1
    # vlm support
    USE_INDIRECT_DATA_BUFFERS = TRUE
    DB_BLOCK_BUFFERS = 1500000
    Any ideas why I cannot run data pump? I am assuming that I just need to set streams_pool_size, but I don't understand why I cannot increase its size on this db. It is set to 0 on other databases that work fine, and I can set it there, which is why I am tentatively linking the issue to VLM.
    thanks
    Robert

    SGA_MAX_SIZE?
    SQL> ALTER SYSTEM SET streams_pool_size=32M SCOPE=BOTH;
    ALTER SYSTEM SET streams_pool_size=32M SCOPE=BOTH
    ERROR at line 1:
    ORA-02097: parameter cannot be modified because specified value is invalid
    ORA-04033: Insufficient memory to grow pool
    SQL> show parameter sga_max
    NAME                         TYPE      VALUE
    sga_max_size                    big integer 480M
    SQL> show parameter cache
    NAME                         TYPE      VALUE
    db_16k_cache_size               big integer 0
    db_2k_cache_size               big integer 0
    db_32k_cache_size               big integer 0
    db_4k_cache_size               big integer 0
    db_8k_cache_size               big integer 0
    db_cache_advice                string      ON
    db_cache_size                    big integer 256M
    db_keep_cache_size               big integer 0
    db_recycle_cache_size               big integer 0
    object_cache_max_size_percent          integer      10
    object_cache_optimal_size          integer      102400
    session_cached_cursors               integer      20
    SQL> ALTER SYSTEM SET db_cache_size=224M SCOPE=both;
    System altered.
    SQL> ALTER SYSTEM SET streams_pool_size=32M SCOPE=both;
    System altered.
    Lukasz

  • Error during data pump import with SQL developer

    Hello,
    I am trying to transfer data from one database to another through data pump via SQL Developer (the data volume is quite large), exporting several tables.
    The table export works fine, but I encounter the following error when I import the file (I tried data only as well as data + DDL):
    "Exception: ORA-39001: argument value invalid dbms_datapump.get_status(64...=
    ORA-39001: argument value invalid
    ORA-39000: ....
    ORA-31619: ...
    The file is in the right place, the data pump folder of the new database. The user is the same on both databases, and the database versions are the same.
    Do you have any idea what the problem is?
    Thanks

    With query SELECT * FROM v$version;
    Environment source
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    PL/SQL Release 11.2.0.1.0 - Production
    "CORE     11.2.0.1.0     Production"
    TNS for Linux: Version 11.2.0.1.0 - Production
    NLSRTL Version 11.2.0.1.0 - Production
    Environment target
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    PL/SQL Release 11.2.0.1.0 - Production
    "CORE     11.2.0.1.0     Production"
    TNS for Linux: Version 11.2.0.1.0 - Production
    NLSRTL Version 11.2.0.1.0 - Production

  • Data pump + statistics + dba_tab_modifications

    Database 11.2; the people involved with replicating data are using data pump and importing the statistics (that's all I know about their process). When I look at this query:
    select /*mods.table_owner,
           mods.table_name,*/
           mods.inserts,
           mods.updates,
           mods.deletes,
           mods.timestamp,
           tabs.num_rows,
           tabs.table_lock,
           tabs.monitoring,
           tabs.sample_size,
           tabs.last_analyzed
    from   sys.dba_tab_modifications mods join dba_tables tabs
           on mods.table_owner = tabs.owner and mods.table_name = tabs.table_name
    where  mods.table_name = &tab;
    I see this:
    INSERTS    UPDATES  DELETES  TIMESTAMP         NUM_ROWS   TABLE_LOCK  MONITORING  SAMPLE_SIZE  LAST_ANALYZED
    119333320  0        0        11/22/2011 19:27  116022939  ENABLED     YES         116022939    10/24/2011 23:10
    As we can see, the source database last gathered stats on 10/24 and the data was loaded into the destination on 11/22.
    The database is giving bad execution plans as indicated in previous thread: Re: Understanding results from dbms_xplan.display_cursor
    My first inclination is to run the following, but since they imported the stats, shouldn't those already be "good"? Is it a matter of dba_tab_modifications getting out of sync? What gives?
    BEGIN
      dbms_stats.gather_schema_stats(
        ownname => 'SCHEMA_NAME',
        options => 'GATHER AUTO');
    END;
    /

    In your previous post you mentioned that the explain plan has 197 records. That is one big SQL statement, so the CBO has plenty of opportunity to mess it up.
    That said, it is a good idea to verify that your statistics are fine. One way to accomplish that is to gather statistics into a pending area and compare the pending area stats with what is currently in the dictionary using DBMS_STATS.DIFF_TABLE_STATS_IN_PENDING.
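    A sketch of that pending-area comparison (owner and table names are illustrative):
    BEGIN
      DBMS_STATS.set_table_prefs('SCHEMA_NAME', 'SOME_TABLE', 'PUBLISH', 'FALSE');
      DBMS_STATS.gather_table_stats('SCHEMA_NAME', 'SOME_TABLE');
    END;
    /
    -- Report differences between the pending stats and the current dictionary stats:
    SELECT report, maxdiffpct
    FROM   TABLE(DBMS_STATS.diff_table_stats_in_pending('SCHEMA_NAME', 'SOME_TABLE'));
    -- Then publish or discard the pending stats:
    -- EXEC DBMS_STATS.publish_pending_stats('SCHEMA_NAME', 'SOME_TABLE');
    -- EXEC DBMS_STATS.delete_pending_stats('SCHEMA_NAME', 'SOME_TABLE');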
    As mentioned by Tubby in your previous post, extended stats are a powerful and easy way to improve the quality of the CBO's plans. Note that in 11.2.0.2 you can create extended stats specifically tailored to your SQL (based on AWR or on the live system) - http://iiotzov.wordpress.com/2011/11/01/get-the-max-of-oracle-11gr2-right-from-the-start-create-relevant-extended-statistics-as-a-part-of-the-upgrade/
    Iordan Iotzov
    http://iiotzov.wordpress.com/

  • Not able to access data in Query

    Hi,
    I have loaded data using DTPs from 2 different source systems.
    One is a 3.x emulated DataSource and the other is a new 7.0 DataSource. I have loaded the data successfully into a DSO and built a query on it.
    But when I execute the query with material as the selection criterion, I cannot find the data from one source system in the query. The same data is available in the DSO. Please help in this regard.

    Hi Venkat,
    After extracting data into the DSO, check whether the request is active.
    Check the data in the DSO contents.
    Check whether there are any restrictions on the InfoProviders in the queries.
    Let us know the status clearly.
    Reg
    Pra

  • UDI-00018: Data Pump client is incompatible with database version 11.2.0.1

    Hi
    I am trying to import data into Oracle 11g Release 2 (11.2.0.1) using the impdp utility and am getting the below error:
    UDI-00018: Data Pump client is incompatible with database version 11.2.0.1.0
    The export dump was taken in a database with Oracle 11g Release 1 (11.1.0.7.0), and I am trying to import into a higher version of the database. Is there any parameter I have to set to avoid this error?
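    UDI-00018 generally means the impdp client binary belongs to a different release than the database it connects to. A quick hedged check (the path follows the environment dump below; adjust to your install):
    which impdp
    impdp help=y          # the banner shows the client release
    # make sure the target home's client is the one being run, e.g.:
    /app/oracle/11.2.0/bin/impdp user/password directory=DATA_PUMP_DIR dumpfile=export.dmp logfile=import.log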

    AUTHSTATE=compat
    A__z=! LOGNAME
    CLASSPATH=/app/oracle/11.2.0/jlib:.
    HOME=/home/oracle
    LANG=C
    LC__FASTMSG=true
    LD_LIBRARY_PATH=/app/oracle/11.2.0/lib:/app/oracle/11.2.0/network/lib:.
    LIBPATH=/app/oracle/11.2.0/JDK/JRE/BIN:/app/oracle/11.2.0/jdk/jre/bin/classic:/app/oracle/11.2.0/lib32
    LOCPATH=/usr/lib/nls/loc
    LOGIN=oracle
    LOGNAME=oracle
    MAIL=/usr/spool/mail/oracle
    MAILMSG=[YOU HAVE NEW MAIL]
    NLSPATH=/usr/lib/nls/msg/%L/%N:/usr/lib/nls/msg/%L/%N.cat
    NLS_DATE_FORMAT=DD-MON-RRRR HH24:MI:SS
    ODMDIR=/etc/objrepos
    ORACLE_BASE=/app/oracle
    ORACLE_HOME=/app/oracle/11.2.0
    ORACLE_SID=AMT6
    ORACLE_TERM=xterm
    ORA_NLS33=/app/oracle/11.2.0/nls/data
    PATH=/app/oracle/11.2.0/bin:.:/usr/bin:/etc:/usr/sbin:/usr/ucb:/home/oracle/bin:/usr/bin/X11:/sbin:.:/usr/local/bin:/usr/ccs/bin
    PS1=nbsud01[$PWD]:($ORACLE_SID)>
    PWD=/nbsiar/nbimp
    SHELL=/usr/bin/ksh
    SHLIB_PATH=/app/oracle/11.2.0/lib:/usr/lib
    TERM=xterm
    TZ=Europe/London
    USER=oracle
    _=/usr/bin/env

  • Data Pump .xlsx into a SQL Server Table and the whole 32-Bit, 64-Bit discussion

    First of all...I have a headache!
    Found LOTS of Google hits when trying to data pump a .xlsx file into a SQL Server table, and the whole discussion of the Microsoft ACE 64-bit driver versus the Microsoft Jet 32-bit driver.
    Specifically receiving this error...
    An OLE DB record is available.  Source: "Microsoft Office Access Database Engine"  Hresult: 0x80004005  Description: "External table is not in the expected format.".
    Error: 0xC020801C at Data Flow Task to Load Alere Coaching Enrolled, Excel Source [56]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "Excel Connection Manager"
    failed with error code 0xC0202009.
    Strangely enough, if I simply data pump ONE .xlsx file into a SQL Server table utilizing my SSIS package, it seems to work fine. If instead I try to be proactive and allow for multiple .xlsx files by using a Foreach Loop Container and a variable
    @[User::FileName], it errors out... but not really, because it is indeed storing the rows in the SQL Server table. I did check all my Delay
    Why does this have to be sooooooo difficult???
    Can anyone help me out here in trying to set up an SSIS package in a rather constrictive environment to pump a .xlsx file into a SQL Server table? What in God's name am I doing wrong? Or is all this a misnomer? But if it's working, how do I disable the error so that it stops erroring out?

    Hi ITBobbyP,
    According to your description, you get the error message when you import data from a .xlsx file into a SQL Server database.
    The error can be caused by the following reasons:
    The Excel file is locked by another process. Please resave the file under a different name to see if the issue is fixed.
    The ACE (Access Database Engine) is not up to date, as Vaibhav mentioned. Please download the latest ACE and install it from the link:
    https://www.microsoft.com/en-us/download/details.aspx?id=13255.
    The bitness of Office and of the server are not the same. To solve the problem, please refer to the following document:
    http://hrvoje.piasevoli.com/2010/09/01/importing-data-from-64-bit-excel-in-ssis/
    If you have any more questions, please feel free to ask.
    Thanks,
    Wendy Fu
    TechNet Community Support
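    For reference, an ACE-based Excel source in SSIS boils down to a connection string like the sketch below (path illustrative); when only the 32-bit driver is installed, setting the project's Run64BitRuntime property to False is the usual companion step:
    Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Files\Book1.xlsx;Extended Properties="Excel 12.0 XML;HDR=YES";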

    Hoping someone can help. I'm working on a system that has a form on it with a problem. Although there is a currently compiled and working version of this form available, I cannot get it to compile. The situation is this. There are 2 databases, MINX a