How to use Bulk Collect and Forall

Hi all,
We are on Oracle 10g. I have a requirement to read from table A and then, for each record in table A, find the matching rows in table B and write the identified information from table B to the target table (table C). In the past, I used two cursor FOR loops to achieve that. To make the new procedure more efficient, I would like to learn to use BULK COLLECT and FORALL.
Here is what I have so far:
DECLARE
TYPE employee_array IS TABLE OF EMPLOYEES%ROWTYPE;
employee_data  employee_array;
TYPE job_history_array IS TABLE OF JOB_HISTORY%ROWTYPE;
Job_history_data   job_history_array;
BatchSize CONSTANT POSITIVE := 5;
-- Read from File A
CURSOR c_get_employees IS
         SELECT  Employee_id,
                   first_name,
                   last_name,
                   hire_date,
                   job_id
          FROM EMPLOYEES;
-- Read from File B based on employee ID in File A
CURSOR c_get_job_history (p_employee_id number) IS
         select start_date,
                  end_date,
                  job_id,
                  department_id
         FROM JOB_HISTORY
         WHERE employee_id = p_employee_id;
BEGIN
    OPEN c_get_employees;
    LOOP
        FETCH c_get_employees BULK COLLECT INTO employee_data.employee_id.LAST,
                                                                          employee_data.first_name.LAST,
                                                                          employee_data.last_name.LAST,
                                                                          employee_data.hire_date.LAST,
                                                                          employee_data.job_id.LAST
         LIMIT BatchSize;
        FORALL i in 1.. employee_data.COUNT
                Open c_get_job_history (employee_data(i).employee_id);
                FETCH c_get_job_history BULKCOLLECT INTO job_history_array LIMIT BatchSize;
                         FORALL k in 1.. Job_history_data.COUNT LOOP
                                        -- insert into FILE C
                                          INSERT INTO MY_TEST(employee_id, first_name, last_name, hire_date, job_id)
                                                            values (job_history_array(k).employee_id, job_history_array(k).first_name,
                                                                      job_history_array(k).last_name, job_history_array(k).hire_date,
                                                                      job_history_array(k).job_id);
                                         EXIT WHEN job_ history_data.count < BatchSize                        
                         END LOOP;                          
                         CLOSE c_get_job_history;                          
                 EXIT WHEN employee_data.COUNT < BatchSize;
       END LOOP;
        COMMIT;
        CLOSE c_get_employees;
END;
                 When I run this script, I get
[Error] Execution (47: 17): ORA-06550: line 47, column 17:
PLS-00103: Encountered the symbol "OPEN" when expecting one of the following:
   . ( * @ % & - + / at mod remainder rem select update with
   <an exponent (**)> delete insert || execute multiset save
   merge
ORA-06550: line 48, column 17:
PLS-00103: Encountered the symbol "FETCH" when expecting one of the following:
   begin function package pragma procedure subtype type use
   <an identifier> <a double-quoted delimited-identifier> form
   current cursor

What is the best way to code this? Once I learn how to do this, I will apply the knowledge to the real application, in which file A would have around 200 rows and file B would have hundreds of thousands of rows.
Thank you for your guidance,
Seyed
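For reference, here is a minimal working sketch of the pattern this post is aiming for, assuming the HR-style EMPLOYEES and JOB_HISTORY tables and that MY_TEST has exactly the five columns listed above, in that order. The key changes: drive one cursor that already joins the two tables, BULK COLLECT it in batches, and let a single FORALL do the inserts (FORALL is not a loop and takes exactly one DML statement, so the nested cursor has to move into the join):
DECLARE
    BatchSize CONSTANT POSITIVE := 100;
    -- one cursor that already joins "file A" to "file B"
    CURSOR c_src IS
        SELECT e.employee_id, e.first_name, e.last_name, e.hire_date, jh.job_id
        FROM   employees   e
        JOIN   job_history jh ON jh.employee_id = e.employee_id;
    TYPE src_tab_t IS TABLE OF c_src%ROWTYPE;
    l_rows src_tab_t;
BEGIN
    OPEN c_src;
    LOOP
        FETCH c_src BULK COLLECT INTO l_rows LIMIT BatchSize;
        EXIT WHEN l_rows.COUNT = 0;
        -- bind the whole batch to this single INSERT; on 10g individual record
        -- fields cannot be referenced inside FORALL (PLS-00436), so the whole
        -- record is inserted
        FORALL i IN 1 .. l_rows.COUNT
            INSERT INTO my_test VALUES l_rows(i);
    END LOOP;
    CLOSE c_src;
    COMMIT;
END;
/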

Hello BlueShadow,
Following your advice, I modified a stored procedure that originally used two cursor FOR loops to read from tables A and B and write to table C, replacing them with something like your suggestion listed below:
INSERT INTO tableC
SELECT …
FROM tableA JOIN tableB ON (join condition);

I tried this change on a procedure writing to table C with its keys disabled. I will try it against the real table, which has a primary key and indexes, and report the results later.
Thank you very much,
Seyed

Similar Messages

  • How to use BULK COLLECT, FORALL and TREAT

    There is a need to read, match and update data from and into a custom table. The table would have about 3 million rows and holds key numbers. Based on a field value of this custom table, relevant data needs to be fetched from joins of other tables and updated in the custom table. I plan to use BULK COLLECT and FORALL.
    All the examples I have seen do an insert into a table. How do I go about reading all values of a given field, fetching the other relevant data, and then updating the custom table with the fetched data?
    Defined an object with specifics like this
    CREATE OR REPLACE TYPE imei_ot AS OBJECT (
    recid NUMBER,
    imei VARCHAR2(30),
    STORE VARCHAR2(100),
    status VARCHAR2(1),
    TIMESTAMP DATE,
    order_number VARCHAR2(30),
    order_type VARCHAR2(30),
    sku VARCHAR2(30),
    order_date DATE,
    attribute1 VARCHAR2(240),
    market VARCHAR2(240),
    processed_flag VARCHAR2(1),
    last_update_date DATE
    );
    Now within a package procedure I have defined it like this:
    type imei_ott is table of imei_ot;
    imei_ntt imei_ott;
    begin
    SELECT imei_ot (recid,
    imei,
    STORE,
    status,
    TIMESTAMP,
    order_number,
    order_type,
    sku,
    order_date,
    attribute1,
    market,
    processed_flag,
    last_update_date
    BULK COLLECT INTO imei_ntt
    FROM (SELECT stg.recid, stg.imei, cip.store_location, 'S',
    co.rtl_txn_timestamp, co.rtl_order_number, 'CUST',
    msi.segment1 || '.' || msi.segment3,
    TRUNC (co.txn_timestamp), col.part_number, 'ZZ',
    stg.processed_flag, SYSDATE
    FROM custom_orders co,
    custom_order_lines col,
    custom_stg stg,
    mtl_system_items_b msi
    WHERE co.header_id = col.header_id
    AND msi.inventory_item_id = col.inventory_item_id
    AND msi.organization_id =
    (SELECT organization_id
    FROM hr_all_organization_units_tl
    WHERE NAME = 'Item Master'
    AND source_lang = USERENV ('LANG'))
    AND stg.imei = col.serial_number
    AND stg.processed_flag = 'U');
    /* Update staging table in one go for COR order data */
    FORALL indx IN 1 .. imei_ntt.COUNT
    UPDATE custom_stg
    SET STORE = TREAT (imei_ntt (indx) AS imei_ot).STORE,
    status = TREAT (imei_ntt (indx) AS imei_ot).status,
    TIMESTAMP = TREAT (imei_ntt (indx) AS imei_ot).TIMESTAMP,
    order_number = TREAT (imei_ntt (indx) AS imei_ot).order_number,
    order_type = TREAT (imei_ntt (indx) AS imei_ot).order_type,
    sku = TREAT (imei_ntt (indx) AS imei_ot).sku,
    order_date = TREAT (imei_ntt (indx) AS imei_ot).order_date,
    attribute1 = TREAT (imei_ntt (indx) AS imei_ot).attribute1,
    market = TREAT (imei_ntt (indx) AS imei_ot).market,
    processed_flag =
    TREAT (imei_ntt (indx) AS imei_ot).processed_flag,
    last_update_date =
    TREAT (imei_ntt (indx) AS imei_ot).last_update_date
    WHERE recid = TREAT (imei_ntt (indx) AS imei_ot).recid
    AND imei = TREAT (imei_ntt (indx) AS imei_ot).imei;
    DBMS_OUTPUT.put_line (TO_CHAR (SQL%ROWCOUNT)
    || ' rows updated using Bulk Collect / For All.');
    EXCEPTION
    WHEN NO_DATA_FOUND
    THEN
    DBMS_OUTPUT.put_line ('No Data: ' || SQLERRM);
    WHEN OTHERS
    THEN
    DBMS_OUTPUT.put_line ('Other Error: ' || SQLERRM);
    END;
    Now for the unfortunate part. When I compile the package, I get this error:
    PL/SQL: ORA-00904: "LAST_UPDATE_DATE": invalid identifier
    I am not sure where I am going wrong. The object type has the last_update_date field, and the custom table has the same field.
    Could someone please shed some light and make a suggestion?
    Thanks
    uds

    I suspect your error comes from the »bulk collect into« and not from the »forall loop«.
    At first glance, you need to alias SYSDATE as last_update_date, and some of the other select items need to be aliased as well:
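    (For illustration, an aliased version of the original inline view might look like the sketch below; the column expressions, including the cip reference, are kept exactly as in the post, and only alias names are added so the outer query has named columns to work with.)
    select imei_ot (t.recid, t.imei, t.store, t.status, t.tstamp,
                    t.order_number, t.order_type, t.sku, t.order_date,
                    t.attribute1, t.market, t.processed_flag, t.last_update_date)
      bulk collect into imei_ntt
      from (select stg.recid                           as recid,
                   stg.imei                            as imei,
                   cip.store_location                  as store,
                   'S'                                 as status,
                   co.rtl_txn_timestamp                as tstamp,
                   co.rtl_order_number                 as order_number,
                   'CUST'                              as order_type,
                   msi.segment1 || '.' || msi.segment3 as sku,
                   trunc (co.txn_timestamp)            as order_date,
                   col.part_number                     as attribute1,
                   'ZZ'                                as market,
                   stg.processed_flag                  as processed_flag,
                   sysdate                             as last_update_date
              from custom_orders co,
                   custom_order_lines col,
                   custom_stg stg,
                   mtl_system_items_b msi
             where co.header_id = col.header_id
               and msi.inventory_item_id = col.inventory_item_id
               and msi.organization_id =
                      (select organization_id
                         from hr_all_organization_units_tl
                        where name = 'Item Master'
                          and source_lang = userenv ('LANG'))
               and stg.imei = col.serial_number
               and stg.processed_flag = 'U') t;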
    But a simplified version would be
    select imei_ot (stg.recid,
                    stg.imei,
                    cip.store_location,
                    'S',
                    co.rtl_txn_timestamp,
                    co.rtl_order_number,
                    'CUST',
                    msi.segment1 || '.' || msi.segment3,
                    trunc (co.txn_timestamp),
                    col.part_number,
                    'ZZ',
                    stg.processed_flag,
                    sysdate)
    bulk collect into imei_ntt
      from custom_orders co,
           custom_order_lines col,
           custom_stg stg,
           mtl_system_items_b msi
    where co.header_id = col.header_id
       and msi.inventory_item_id = col.inventory_item_id
       and msi.organization_id =
                  (select organization_id
                     from hr_all_organization_units_tl
                    where name = 'Item Master' and source_lang = userenv ('LANG'))
       and stg.imei = col.serial_number
       and stg.processed_flag = 'U';
    ...

  • BULK COLLECT and FORALL with dynamic INSERT.

    Hello,
    I want to apply the BULK COLLECT and FORALL features to an insert statement in my procedure for performance improvements, as it has to insert a huge amount of data.
    But the problem is that the insert statement is generated dynamically, and even the table name is determined at run time, so I am not able to apply these performance-tuning concepts.
    See the code below:
    PROCEDURE STP_MES_INSERT_GLOBAL_TO_MAIN
      (P_IN_SRC_TABLE_NAME                  VARCHAR2 ,
      P_IN_TRG_TABLE_NAME                  VARCHAR2 ,
      P_IN_ED_TRIG_ALARM_ID                 NUMBER ,
      P_IN_ED_CATG_ID                 NUMBER ,
      P_IN_IS_PIECEID_ALARM IN CHAR,
      P_IN_IS_LAST_RECORD IN CHAR
    IS
          V_START_DATA_ID                 NUMBER;
          V_STOP_DATA_ID                   NUMBER;
          V_FROM_DATA_ID                   NUMBER;
          V_TO_DATA_ID                       NUMBER;
          V_MAX_REC_IN_LOOP              NUMBER := 30000;
          V_QRY1         VARCHAR2(32767);
    BEGIN
                EXECUTE IMMEDIATE 'SELECT MIN(ED_DATA_ID), MAX(ED_DATA_ID) FROM '|| P_IN_SRC_TABLE_NAME INTO V_START_DATA_ID , V_STOP_DATA_ID;
                --DBMS_OUTPUT.PUT_LINE('ORIGINAL START ID := '||V_START_DATA_ID ||' ORIGINAL  STOP ID := ' || V_STOP_DATA_ID);
                V_FROM_DATA_ID := V_START_DATA_ID ;
                IF (V_STOP_DATA_ID - V_START_DATA_ID ) > V_MAX_REC_IN_LOOP THEN
                        V_TO_DATA_ID := V_START_DATA_ID + V_MAX_REC_IN_LOOP;
                ELSE
                       V_TO_DATA_ID := V_STOP_DATA_ID;
                END IF;
              LOOP
                    BEGIN
                 LOOP      
                            V_QRY1 := ' INSERT INTO '||P_IN_TRG_TABLE_NAME||
                            ' SELECT * FROM '||P_IN_SRC_TABLE_NAME ||
                            ' WHERE ED_DATA_ID BETWEEN ' || V_FROM_DATA_ID ||' AND ' || V_TO_DATA_ID;
                    EXECUTE IMMEDIATE V_QRY1;
    commit;
                                     V_FROM_DATA_ID :=  V_TO_DATA_ID + 1;
                            IF  ( V_STOP_DATA_ID - V_TO_DATA_ID > V_MAX_REC_IN_LOOP ) THEN
                                  V_TO_DATA_ID := V_TO_DATA_ID + V_MAX_REC_IN_LOOP;
                            ELSE
                                  V_TO_DATA_ID := V_TO_DATA_ID + (V_STOP_DATA_ID - V_TO_DATA_ID);
                            END IF;
                    EXCEPTION
                             WHEN OTHERS THEN.............
    ....................and so on.
    Now you can observe here that P_IN_SRC_TABLE_NAME is the source table name, which we get as a parameter at run time. I have used two tables in the insert statement: P_IN_TRG_TABLE_NAME (into which I have to insert data) and P_IN_SRC_TABLE_NAME (from which I have to insert data).
      V_QRY1 := ' INSERT INTO '||P_IN_TRG_TABLE_NAME||
                            ' SELECT * FROM '||P_IN_SRC_TABLE_NAME ||
                            ' WHERE ED_DATA_ID BETWEEN ' || V_FROM_DATA_ID ||' AND ' || V_TO_DATA_ID;
                EXECUTE IMMEDIATE V_QRY1;
    Now, when I apply the bulk collect and forall feature, I am facing an out-of-scope problem. See the code below:
    BEGIN
                EXECUTE IMMEDIATE 'SELECT MIN(ED_DATA_ID), MAX(ED_DATA_ID) FROM '|| P_IN_SRC_TABLE_NAME INTO V_START_DATA_ID , V_STOP_DATA_ID;
                --DBMS_OUTPUT.PUT_LINE('ORIGINAL START ID := '||V_START_DATA_ID ||' ORIGINAL  STOP ID := ' || V_STOP_DATA_ID);
                V_FROM_DATA_ID := V_START_DATA_ID ;
                IF (V_STOP_DATA_ID - V_START_DATA_ID ) > V_MAX_REC_IN_LOOP THEN
                        V_TO_DATA_ID := V_START_DATA_ID + V_MAX_REC_IN_LOOP;
                ELSE
                       V_TO_DATA_ID := V_STOP_DATA_ID;
                END IF;
              LOOP
                    DECLARE
                     TYPE TRG_TABLE_TYPE IS TABLE OF P_IN_SRC_TABLE_NAME%ROWTYPE;
                     V_TRG_TABLE_TYPE TRG_TABLE_TYPE;
                     CURSOR TRG_TAB_CUR IS
                     SELECT * FROM P_IN_SRC_TABLE_NAME
                     WHERE ED_DATA_ID BETWEEN V_FROM_DATA_ID AND V_TO_DATA_ID;
                     V_QRY1 varchar2(32767);
                    BEGIN
                    OPEN TRG_TAB_CUR;
                    LOOP
                    FETCH TRG_TAB_CUR BULK COLLECT INTO V_TRG_TABLE_TYPE LIMIT 30000;
                    FORALL I IN 1..V_TRG_TABLE_TYPE.COUNT
                    V_QRY1 := ' INSERT INTO '||P_IN_TRG_TABLE_NAME||' VALUES V_TRG_TABLE_TYPE(I);'
                    EXECUTE IMMEDIATE V_QRY1;
                    EXIT WHEN TRG_TAB_CUR%NOTFOUND;
                    END LOOP;
                    CLOSE TRG_TAB_CUR;
                            V_FROM_DATA_ID :=  V_TO_DATA_ID + 1;
                            IF  ( V_STOP_DATA_ID - V_TO_DATA_ID > V_MAX_REC_IN_LOOP ) THEN
                                  V_TO_DATA_ID := V_TO_DATA_ID + V_MAX_REC_IN_LOOP;
                            ELSE
                                  V_TO_DATA_ID := V_TO_DATA_ID + (V_STOP_DATA_ID - V_TO_DATA_ID);
                            END IF;
                    EXCEPTION
                             WHEN OTHERS THEN.........so on
    But the above code is not helping me. What am I doing wrong? How can I tune this dynamically generated statement to use bulk collect for better performance?
    Thanks in Advance !!!!

    Hello,
    a table name cannot be bound as a parameter in SQL; this won't compile:
    EXECUTE IMMEDIATE ' INSERT INTO :1 VALUES ......
    USING P_IN_TRG_TABLE_NAME ...
    but this should work:
    EXECUTE IMMEDIATE ' INSERT INTO ' || P_IN_TRG_TABLE_NAME || ' VALUES ......
    You cannot declare a type that is based on a table whose name is in a variable.
    PL/SQL is a strongly typed language; a type must be known at compile time, so code like this is not allowed:
    PROCEDURE xx( src_table_name varchar2 )
    DECLARE
       TYPE tab IS TABLE OF src_table_name%ROWTYPE;
    ...
    This can be done by creating one big dynamic SQL block - see the example below (tested on Oracle 10 XE; this is a slightly simplified version of your procedure):
    CREATE OR REPLACE
    PROCEDURE stp1(
      p_in_src_table_name                  VARCHAR2 ,
      p_in_trg_table_name                  VARCHAR2 ,
      v_from_data_id     NUMBER := 100,
      v_to_data_id       NUMBER := 100000
    )
    IS
    BEGIN
      EXECUTE IMMEDIATE q'{
      DECLARE
         TYPE trg_table_type IS TABLE OF }' || p_in_src_table_name || q'{%ROWTYPE;
         V_TRG_TABLE_TYPE TRG_TABLE_TYPE;
         CURSOR TRG_TAB_CUR IS
         SELECT * FROM }' || p_in_src_table_name ||
         q'{ WHERE ED_DATA_ID BETWEEN :V_FROM_DATA_ID AND :V_TO_DATA_ID;
      BEGIN
          OPEN TRG_TAB_CUR;
          LOOP
                FETCH TRG_TAB_CUR BULK COLLECT INTO V_TRG_TABLE_TYPE LIMIT 30000;
                FORALL I IN 1 .. V_TRG_TABLE_TYPE.COUNT
                INSERT INTO }' || p_in_trg_table_name || q'{ VALUES V_TRG_TABLE_TYPE( I );
                EXIT WHEN TRG_TAB_CUR%NOTFOUND;
          END LOOP;
          CLOSE TRG_TAB_CUR;
      END; }'
      USING v_from_data_id, v_to_data_id;
      COMMIT;
    END;
    But this probably won't give any performance improvement. Bulk collect and forall give performance improvements when there is a DML operation inside a loop,
    and this single DML operates on only one record or a relatively small number of records, and it is repeated many, many times in the loop.
    I guess that your code is the opposite of this - it contains an insert statement that operates on many records (one single insert ~ 30000 records),
    and you are trying to replace it with bulk collect/forall - INSERT INTO ... SELECT FROM will almost always be faster than bulk collect/forall.
    Look at simple test - below is a procedure that uses INSERT ... SELECT :
    CREATE OR REPLACE
    PROCEDURE stp(
      p_in_src_table_name                  VARCHAR2 ,
      p_in_trg_table_name                  VARCHAR2 ,
      v_from_data_id     NUMBER := 100,
      v_to_data_id       NUMBER := 100000
    )
    IS
    V_QRY1 VARCHAR2(32767);
    BEGIN
          V_QRY1 := ' INSERT INTO '||   P_IN_TRG_TABLE_NAME ||
                    ' SELECT * FROM '|| P_IN_SRC_TABLE_NAME ||
                    ' WHERE ed_data_id BETWEEN :f AND :t ';
          EXECUTE IMMEDIATE V_QRY1
          USING V_FROM_DATA_ID, V_TO_DATA_ID;
          COMMIT;
    END;
    /
    and we can compare both procedures:
    SQL> CREATE TABLE test333
      2  AS SELECT level ed_data_id ,
      3            'XXX ' || LEVEL x,
      4            'YYY ' || 2 * LEVEL y
      5  FROM dual
      6  CONNECT BY LEVEL <= 1000000;
    Table created.
    SQL> CREATE TABLE test333_dst AS
      2  SELECT * FROM test333 WHERE 1 = 0;
    Table created.
    SQL> set timing on
    SQL> ed
    Wrote file afiedt.buf
      1  BEGIN
      2     FOR i IN 1 .. 100 LOOP
      3        stp1( 'test333', 'test333_dst', 1000, 31000 );
      4     END LOOP;
      5* END;
    SQL> /
    PL/SQL procedure successfully completed.
    Elapsed: 00:00:22.12
    SQL> ed
    Wrote file afiedt.buf
      1  BEGIN
      2     FOR i IN 1 .. 100 LOOP
      3        stp( 'test333', 'test333_dst', 1000, 31000 );
      4     END LOOP;
      5* END;
    SQL> /
    PL/SQL procedure successfully completed.
    Elapsed: 00:00:14.86
    Without bulk collect: ~15 sec.
    Bulk collect version: ~22 sec. That is 7 sec longer / 15 sec = about a 45% performance decrease.

  • Using bulk collect and for all to solve a problem

    Hi All
    I have the following problem.
    Please forgive me if it's a stupid question :-) I'm learning.
    1: Data is in a staging table, xx_staging_table.
    2: There are two target tables, t1 and t2, into which some columns from xx_staging_table are inserted.
    Some of the columns from the staging table data are checked for valid entries, and then some columns from that row are loaded into the two target tables.
    The two target tables use different sets of columns from the staging table.
    When I had a thousand records there was no problem with a direct insert, but it seems we will now have half a million records.
    This has slowed down the process considerably.
    My question is:
    Can I use the bulk collect and forall functionality to get specific columns from a staging table, then validate the row using those columns,
    and then use a bulk insert to load the data into a specific table?
    So the code would be something like this. The get_staging_data cursor will have all the columns I need from the staging table:
    cursor get_staging_data
    is select * from xx_staging_table;  -- about 500,000 records
    Then use bulk collect to load about 10,000 or so records at a time into a PL/SQL table
    and do a bulk insert like this:
    CREATE TABLE t1 AS SELECT * FROM all_objects WHERE 1 = 2;
    CREATE OR REPLACE PROCEDURE test_proc (p_array_size IN PLS_INTEGER DEFAULT 100)
    IS
    TYPE ARRAY IS TABLE OF all_objects%ROWTYPE;
    l_data ARRAY;
    CURSOR c IS SELECT * FROM all_objects;
    BEGIN
    OPEN c;
    LOOP
    FETCH c BULK COLLECT INTO l_data LIMIT p_array_size;
    FORALL i IN 1..l_data.COUNT
    INSERT INTO t1 VALUES l_data(i);
    EXIT WHEN c%NOTFOUND;
    END LOOP;
    CLOSE c;
    END test_proc;
    In the above example, t1 and the cursor have the same number of columns.
    In my case the columns in the cursor are a small subset of the columns of table t1,
    so can I use a FORALL to load that subset into table t1? How does that work?
    Thanks
    J
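    One common way to handle the subset case is to bulk collect each needed column into its own collection and then name the target columns explicitly in the FORALL insert. A minimal sketch, using hypothetical staging/target column names (cust_id, cust_name, status) since the real columns are not shown:
    DECLARE
        CURSOR c_stg IS
            SELECT cust_id, cust_name          -- only the columns t1 needs
            FROM   xx_staging_table
            WHERE  status = 'VALID';           -- validation that can be done in SQL
        TYPE num_tab_t IS TABLE OF NUMBER        INDEX BY PLS_INTEGER;
        TYPE vc_tab_t  IS TABLE OF VARCHAR2(100) INDEX BY PLS_INTEGER;
        l_ids   num_tab_t;
        l_names vc_tab_t;
    BEGIN
        OPEN c_stg;
        LOOP
            FETCH c_stg BULK COLLECT INTO l_ids, l_names LIMIT 10000;
            EXIT WHEN l_ids.COUNT = 0;
            -- t1 may have many more columns; only the listed ones are populated here
            FORALL i IN 1 .. l_ids.COUNT
                INSERT INTO t1 (cust_id, cust_name)
                VALUES (l_ids(i), l_names(i));
        END LOOP;
        CLOSE c_stg;
        COMMIT;
    END;
    /
    Validation that cannot be expressed in the cursor's WHERE clause can be done in a plain loop over the collections before the FORALL.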

    user7348303 wrote:
    checking if the value is valid and there are also some conditional processing rules (such as: if the value is a certain value, no inserts are needed)
    which are a little more complex than I can put in a simple ...
    Well, if the processing is too complex (and conditional) to be done in SQL, then doing it in PL/SQL is justified... but it will be slower, as you are now introducing an additional layer. Data now needs to travel between the SQL layer and the PL/SQL layer. This is slower.
    PL/SQL is inherently serialised - and this also affects performance and scalability. PL/SQL cannot be parallelised by Oracle in an automated fashion. SQL processes can.
    To put it in simple terms: you create a PL/SQL procedure Foo that processes a SQL cursor, and you execute that proc. Oracle cannot run multiple parallel copies of Foo. It can perhaps parallelise the SQL cursor that Foo uses - but not Foo itself.
    However, if Foo is called by the SQL engine, it can run in parallel - as the SQL process calling Foo is running in parallel. So if you make Foo a pipelined table function (written in PL/SQL), and you design and code it as a thread-safe/parallel-enabled function, it can be called and executed in parallel by the SQL engine.
    So moving your PL/SQL code into a parallel enabled pipeline function written in PL/SQL, and using that function via parallel SQL, can increase performance over running that same basic PL/SQL processing as a serialised process.
    This is of course assuming that the processing that needs to be done using PL/SQL code, can be designed and coded for parallel processing in this fashion.
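    To make that concrete, a sketch of what a parallel-enabled pipelined version of such a staging load could look like; the staging/target column names (cust_id, cust_name, status) are hypothetical stand-ins, since the real xx_staging_table and target columns are not shown:
    CREATE OR REPLACE PACKAGE stg_load_api AS
        TYPE stg_rec_t IS RECORD (
            cust_id   NUMBER,
            cust_name VARCHAR2(100),
            status    VARCHAR2(10)
        );
        TYPE stg_cur_t IS REF CURSOR RETURN stg_rec_t;
        TYPE out_rec_t IS RECORD (
            cust_id   NUMBER,
            cust_name VARCHAR2(100)
        );
        TYPE out_tab_t IS TABLE OF out_rec_t;
        FUNCTION validate_rows (p_cur stg_cur_t)
            RETURN out_tab_t PIPELINED
            PARALLEL_ENABLE (PARTITION p_cur BY ANY);
    END stg_load_api;
    /
    CREATE OR REPLACE PACKAGE BODY stg_load_api AS
        FUNCTION validate_rows (p_cur stg_cur_t)
            RETURN out_tab_t PIPELINED
            PARALLEL_ENABLE (PARTITION p_cur BY ANY)
        IS
            l_in  stg_rec_t;
            l_out out_rec_t;
        BEGIN
            LOOP
                FETCH p_cur INTO l_in;
                EXIT WHEN p_cur%NOTFOUND;
                -- validation / conditional rules go here; invalid rows are simply not piped
                IF l_in.status = 'VALID' THEN
                    l_out.cust_id   := l_in.cust_id;
                    l_out.cust_name := l_in.cust_name;
                    PIPE ROW (l_out);
                END IF;
            END LOOP;
            RETURN;
        END validate_rows;
    END stg_load_api;
    /
    -- the SQL engine drives the parallelism; the PL/SQL function itself stays simple
    INSERT /*+ APPEND */ INTO t1 (cust_id, cust_name)
    SELECT *
      FROM TABLE(stg_load_api.validate_rows(
             CURSOR(SELECT /*+ PARALLEL(s, 4) */ cust_id, cust_name, status
                      FROM xx_staging_table s)));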

  • How to use BULK COLLECT in oracle forms

    hi gurus,
    I am using Oracle Forms
    Forms [32 Bit] Version 10.1.2.0.2 (Production)
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    I want to use bulk collect from a database table, let's say <employees>.
    While working at the database level with collections and records this works very well for me, but when I try to use that technique in Oracle Forms it gives me the error
    error 591: this feature is not supported in client side programming
    I know I can use cursors to loop through the records of Oracle tables,
    but I'm more comfortable using collections and arrays.
    for example
    Set Serveroutput On
    Declare
          Type Rec_T Is Record (     
           Empid Number ,
       Empname Varchar2(100)
      );
          Type V_R Is Table Of Rec_T Index By Binary_Integer;     
          V_Array V_R;
    Begin
       Select Employee_Id , First_Name
       Bulk Collect
       Into V_Array
          From Employees; 
       For Indx In V_Array.First..V_Array.Last Loop
       Dbms_Output.Put_Line('employees id '||V_Array(Indx).Empid ||'and the name is '||V_Array(Indx).Empname);
       End Loop;      
      End;
    I want to use this same approach in Oracle Forms, for certain purposes. Please guide me on how I can use it...
    thanks...

    For information, you can use and populate a collection within the Forms application without using BULK COLLECT.
    Francois
    Actually I want to work with arrays and index-by tables,
    like
             record_type (variable , variable2);
             type type_name <record_type>  index by binary_integer
            type_variable type_name;
            and in main body of program
            select something
            bulk collect into type_variable
            from any_table;
           loop
                type_variable(indx).variable , type_variable(indx).variable2;
           end loop;
           this is very useful for my logic on which I am working
              like
              type_variable(indx).variable || type_variable(indx-1);
    if it's possible with cursors, then how can I use a cursor that can fulfill this logic?
    @Francois
    if it's possible, then how can I populate the collection without using bulk collect?
    thanks
    and for the other replies: if I can use stored procedures, please give me an example.
    thanks
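    Since the follow-up asks for an example: a minimal sketch of moving the same logic into a stored procedure (server side, where BULK COLLECT is allowed) that a Forms trigger could then call. It reuses the EMPLOYEES columns from the post; in Forms you would normally return the data through OUT parameters or write it into a block instead of using DBMS_OUTPUT:
    CREATE OR REPLACE PROCEDURE show_employees IS
        TYPE rec_t IS RECORD (
            empid   employees.employee_id%TYPE,
            empname employees.first_name%TYPE
        );
        TYPE tab_t IS TABLE OF rec_t INDEX BY BINARY_INTEGER;
        v_array tab_t;
    BEGIN
        SELECT employee_id, first_name
          BULK COLLECT INTO v_array
          FROM employees;
        FOR indx IN 1 .. v_array.COUNT LOOP
            DBMS_OUTPUT.put_line('employee id ' || v_array(indx).empid ||
                                 ' and the name is ' || v_array(indx).empname);
        END LOOP;
    END show_employees;
    /
    -- from a Forms trigger the call is then simply: show_employees;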

  • PL/SQL using BULK COLLECT and MERGE

    What I am trying to do is use bulk collect to create an array of row data, then loop through the array and either insert or update a table - hence, MERGE:
    FORALL i in ID.first .. ID.last SAVE EXCEPTIONS
    MERGE INTO table1 t USING (
    select ID(i) ID, {other array fields...} from dual) s
    ON t.ID = s.ID
    WHEN MATCHED THEN UPDATE...
    WHEN NOT MATCHED THEN INSERT ...
    The problem is that Oracle always does an INSERT. Has anyone had the same problem? Any workaround?
    Thanks.

    in package header:
         TYPE ID_TYPE IS TABLE OF table1.ID%TYPE;
    in the proc, I declared
         ID ID_TYPE;
    then a bulk collect:
    select * from .. BULK COLLECT INTO ID...
    In addition, I truncated the destination table and ran the proc; all records were inserts. Then I ran the same proc again, expecting all records to be updated, but inserts occurred again, causing exceptions due to violation of unique keys.
    Message was edited by:
    zliao01
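    For comparison, a minimal self-contained sketch of the FORALL + MERGE pattern being described, using a hypothetical table1(id, val) and locally built collections; note the parentheses that MERGE requires around the ON condition:
    DECLARE
        TYPE id_tab_t  IS TABLE OF NUMBER;
        TYPE val_tab_t IS TABLE OF VARCHAR2(50);
        ids  id_tab_t  := id_tab_t(1, 2, 3);
        vals val_tab_t := val_tab_t('a', 'b', 'c');
    BEGIN
        FORALL i IN ids.FIRST .. ids.LAST SAVE EXCEPTIONS
            MERGE INTO table1 t
            USING (SELECT ids(i) AS id, vals(i) AS val FROM dual) s
            ON (t.id = s.id)
            WHEN MATCHED THEN UPDATE SET t.val = s.val
            WHEN NOT MATCHED THEN INSERT (id, val) VALUES (s.id, s.val);
        COMMIT;
    END;
    /
    If every run still ends in inserts, the ON condition is never evaluating to true for any row, so the next thing to check is whether the collected IDs actually match existing keys in the table (data type, padding, or simply non-matching values).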

  • How to use BULK COLLECT in Oracle Forms 11g

    Forms is showing the error "Feature is not supported in Client Side Program" when I am trying to implement bulk collect in Forms 11g.
    I need to load the full data from the DB into my form because using a cursor is very slow....
    Is there any method/workaround to achieve this?

    declare
    type arr is table of emp%rowtype ;
    lv_arr arr;
    begin
    select * bulk collect into lv_arr from emp;
    /*written code here to process the data and write in to file*/
    end;
    Unless you are inserting/updating the data you are holding in the array into a database table, I don't think there is much performance gain in using bulk collect in conjunction with writing a file. Bulk processing will increase performance by minimizing context switches between the SQL and PL/SQL engines, nothing more, nothing less.
    In any case, bulk processing is not available in Forms; if you really need to make use of it, you need to do it in a stored procedure.
    cheers

  • How to use BULK COLLECT in ref cursor?

    hi,
    can we use bulk collect with a ref cursor? If yes, then please give a small example.
    thanks

    Try this:
    create or replace type person_ot as object (name varchar2(10)) not final;
    create or replace type student_ot under person_ot (s_num number) not final;
    create type person_tt as table of person_ot;
    create table persons of person_ot;
    declare
    lv_person_list person_tt;
    lv_sql varchar2(1000);
    ref_cur sys_refcursor;
    begin
    lv_sql:= 'select new student_ot(''fred'', 100) from dual
    union all
    select new student_ot(''sally'', 200) from dual';
    open ref_cur for lv_sql;
    fetch ref_cur bulk collect into lv_person_list;
    close ref_cur;
    for i in lv_person_list.first..lv_person_list.last loop
    dbms_output.put_line(lv_person_list(i).name );
    end loop;
    forall i in lv_person_list.first..lv_person_list.last
    insert into persons values lv_person_list(i);
    end;
    /

  • How to use bulk collect into clause

    hi all,
    I have a requirement like this: I want to change the query by passing different table names, in Oracle Database 10gR2.
    For example, first I pass the scott.emp table name: select * from scott.emp;
    then I want to pass the scott.dept table name: select * from scott.dept;
    using the select * option in the select list.
    How can I execute this?
    Please give me a solution.
    please reply....

    Hi,
    I recently also ran into some serious trouble making dynamic SQL work.
    It was about a parallel pipelined function with a strongly typed cursor (for "partition ... by hash(...)"),
    but in addition requiring dynamic SQL for the input to this cursor.
    I couldn't make it work with execute immediate or anything else.
    So I used another way - I translated the dynamic SQL into dynamically created static SQL:
    1. create a base SQL data object type with an abstract interface for the code (e.g. some Execute() member function).
    2. dynamically create a derived SQL data object type with a concrete implementation of Execute() holding your "dynamic SQL" "in a static way".
    3. delegate to Execute().
    Let me quote my example from comp.databases.oracle.server, thread "dynamically created cursor doesn't work for parallel pipelined functions"
    - please see below (it's an old one - with likely some odd stuff inside).
    It might sound kind of strange to a DB programmer.
    (I come from C++. Though this is not an excuse :-) )
    Maybe I just missed another possible option to handle the problem.
    And it's definitely verbose.
    But I think it's at least a (last) option.
    Actually, I would be interested to hear what others think about it.
    best regards,
    Frank
    --------------- snip -------------------------
    drop table parallel_test;
    drop type MyDoit;
    drop type BaseDoit;
    drop type ton;
    create type ton as table of number;
    CREATE TABLE parallel_test (
    id NUMBER(10),
    description VARCHAR2(50)
    );
    BEGIN
    FOR i IN 1 .. 100000 LOOP
    INSERT INTO parallel_test (id, description)
    VALUES (i, 'Description or ' || i);
    END LOOP;
    COMMIT;
    END;
    create or replace type BaseDoit as object (
    id number,
    static function make(p_id in number)
    return BaseDoit,
    member procedure doit(
    p_sids in out nocopy ton,
    p_counts in out nocopy ton)
    ) not final;
    create or replace type body BaseDoit as
    static function make(p_id in number)
    return BaseDoit
    is
    begin
    return new BaseDoit(p_id);
    end;
    member procedure doit(
    p_sids in out nocopy ton,
    p_counts in out nocopy ton)
    is
    begin
    dbms_output.put_line('BaseDoit.doit(' || id || ') invoked...');
    end;
    end;
    -- Define a strongly typed REF CURSOR type.
    CREATE OR REPLACE PACKAGE parallel_ptf_api AS
    TYPE t_parallel_test_row IS RECORD (
    id1 NUMBER(10),
    desc1 VARCHAR2(50),
    id2 NUMBER(10),
    desc2 VARCHAR2(50),
    sid NUMBER
    );
    TYPE t_parallel_test_tab IS TABLE OF t_parallel_test_row;
    TYPE t_parallel_test_ref_cursor IS REF CURSOR RETURN
    t_parallel_test_row;
    FUNCTION test_ptf (p_cursor IN t_parallel_test_ref_cursor)
    RETURN t_parallel_test_tab PIPELINED
    PARALLEL_ENABLE(PARTITION p_cursor BY any);
    END parallel_ptf_api;
    SHOW ERRORS
    CREATE OR REPLACE PACKAGE BODY parallel_ptf_api AS
    FUNCTION test_ptf (p_cursor IN t_parallel_test_ref_cursor)
    RETURN t_parallel_test_tab PIPELINED
    PARALLEL_ENABLE(PARTITION p_cursor BY any)
    IS
    l_row t_parallel_test_row;
    BEGIN
    LOOP
    FETCH p_cursor
    INTO l_row;
    EXIT WHEN p_cursor%NOTFOUND;
    select userenv('SID') into l_row.sid from dual;
    PIPE ROW (l_row);
    END LOOP;
    RETURN;
    END test_ptf;
    END parallel_ptf_api;
    SHOW ERRORS
    PROMPT
    PROMPT Serial Execution
    PROMPT ================
    SELECT sid, count(*)
    FROM TABLE(parallel_ptf_api.test_ptf(CURSOR(SELECT t1.id,
    t1.description, t2.id, t2.description, null
    FROM parallel_test t1,
    parallel_test t2
    where t1.id = t2.id
    ))) t2
    GROUP BY sid;
    PROMPT
    PROMPT Parallel Execution
    PROMPT ==================
    SELECT sid, count(*)
    FROM TABLE(parallel_ptf_api.test_ptf(CURSOR(SELECT /*+
    parallel(t1,5) */ t1.id, t1.description, t2.id, t2.description, null
    FROM parallel_test t1,
    parallel_test t2
    where t1.id = t2.id
    ))) t2
    GROUP BY sid;
    PROMPT
    PROMPT Parallel Execution 2
    PROMPT ==================
    set serveroutput on;
    declare
    v_sids ton := ton();
    v_counts ton := ton();
    -- v_cur parallel_ptf_api.t_parallel_test_ref_cursor;
    v_cur sys_refcursor;
    procedure OpenCursor(p_refCursor out sys_refcursor)
    is
    begin
    open p_refCursor for 'SELECT /*+ parallel(t1,5) */ t1.id,
    t1.description, t2.id, t2.description, null
    FROM parallel_test t1,
    parallel_test t2
    where t1.id = t2.id';
    end;
    begin
    OpenCursor(v_cur);
    SELECT sid, count(*) bulk collect into v_sids, v_counts
    FROM TABLE(parallel_ptf_api.test_ptf(v_cur)) t2
    GROUP BY sid;
    for i in v_sids.FIRST.. v_sids.LAST loop
    dbms_output.put_line (v_sids(i) || ', ' || v_counts(i));
    end loop;
    end;
    PROMPT
    PROMPT Parallel Execution 3
    PROMPT ==================
    set serveroutput on;
    declare
    instance BaseDoit;
    v_sids ton := ton();
    v_counts ton := ton();
    procedure CreateMyDoit
    is
    cmd varchar2(4096 char);
    begin
    cmd := 'create or replace type MyDoit under BaseDoit ( ' ||
    ' static function make(p_id in number) ' ||
    ' return MyDoit, ' ||
    ' overriding member procedure doit( ' ||
    ' p_sids in out nocopy ton, ' ||
    ' p_counts in out nocopy ton) ' ||
    execute immediate cmd;
    cmd := 'create or replace type body MyDoit as ' ||
    ' static function make(p_id in number) ' ||
    ' return MyDoit ' ||
    ' is ' ||
    ' begin ' ||
    ' return new MyDoit(p_id); ' ||
    ' end; ' ||
    ' ' ||
    ' overriding member procedure doit( ' ||
    ' p_sids in out nocopy ton, ' ||
    ' p_counts in out nocopy ton) ' ||
    ' is ' ||
    ' begin ' ||
    ' dbms_output.put_line(''MyDoit.doit('' || id || '')
    invoked...''); ' ||
    ' SELECT sid, count(*) bulk collect into p_sids, p_counts ' ||
    ' FROM TABLE(parallel_ptf_api.test_ptf(CURSOR( ' ||
    ' SELECT /*+ parallel(t1,5) */ t1.id, t1.description,
    t2.id, t2.description, null ' ||
    ' FROM parallel_test t1, parallel_test t2 ' ||
    ' where t1.id = t2.id ' ||
    ' ))) ' ||
    ' GROUP BY sid; ' ||
    ' end; ' ||
    ' end; ';
    execute immediate cmd;
    end;
    begin
    CreateMyDoit;
    execute immediate 'select MyDoit.Make(11) from dual' into instance;
    instance.doit(v_sids, v_counts);
    if v_sids.COUNT > 0 then
    for i in v_sids.FIRST.. v_sids.LAST loop
    dbms_output.put_line (v_sids(i) || ', ' || v_counts(i));
    end loop;
    end if;
    end;
    --------------- snap -------------------------

  • ORA-06502 during a procedure which uses Bulk collect feature and nested tab

    Hello Friends,
    I have created a procedure which uses Bulk Collect and a nested table to hold the bulk data. This procedure was using one cursor and a nested table of the same type as the cursor to hold the data fetched from the cursor. The bulk collection technique was used to collect data from the cursor into the nested table. But it is giving an ORA-06502 error.
    I reduced the code of the procedure to the following to trace the error point, but the error still comes. Please help us find the cause and solve it.
    Script which is giving error:
    declare
    v_Errorflag BINARY_INTEGER;
    v_flag number := 1;
    CURSOR cur_terminal_info Is
    SELECT distinct
    'a' SettlementType
    FROM
    dual;
    TYPE typ_cur_terminal IS TABLE OF cur_terminal_info%ROWTYPE;
    Tab_Terminal_info typ_cur_Terminal;
    BEGIN
    v_Errorflag := 2;
    OPEN cur_terminal_info;
    LOOP
    v_Errorflag := 4;
    FETCH cur_terminal_info BULK COLLECT INTO tab_terminal_info LIMIT 300;
    EXIT WHEN cur_terminal_info%rowcount <= 0;
    v_Errorflag := 5;
    FOR Y IN Tab_Terminal_Info.FIRST..tab_terminal_info.LAST
    LOOP
    dbms_output.put_line(v_flag);
    v_flag := v_flag + 1;
    end loop;
    END LOOP;
    v_Errorflag := 13;
    COMMIT;
    END;
    I have updated the script as follows to change the nested table's datatype to varchar2, but the same error is still coming:
    declare
    v_Errorflag BINARY_INTEGER;
    v_flag number := 1;
    CURSOR cur_terminal_info Is
    SELECT distinct
    'a' SettlementType
    FROM
    dual;
    TYPE typ_cur_terminal IS TABLE OF varchar2(50);
    Tab_Terminal_info typ_cur_Terminal;
    BEGIN
    v_Errorflag := 2;
    OPEN cur_terminal_info;
    LOOP
    v_Errorflag := 4;
    FETCH cur_terminal_info BULK COLLECT INTO tab_terminal_info LIMIT 300;
    EXIT WHEN cur_terminal_info%rowcount <= 0;
    v_Errorflag := 5;
    FOR Y IN Tab_Terminal_Info.FIRST..tab_terminal_info.LAST
    LOOP
    dbms_output.put_line(v_flag);
    v_flag := v_flag + 1;
    end loop;
    END LOOP;
    v_Errorflag := 13;
    COMMIT;
    END;
    I could not find the exact reason for the error.
    Please help us to solve this error.
    Thanks and Regards..
    Dipali..

    Hello Friends,
    I got the solution.. :)
    I did one mistake in procedure where the loop should end.
    I used the statement: EXIT WHEN cur_terminal_info%rowcount <= 0;
    But it should be: EXIT WHEN Tab_Terminal_Info.COUNT <= 0;
    (With %ROWCOUNT the loop never exits after the final empty fetch, and the FOR loop then runs over an empty collection whose FIRST and LAST are NULL, which is what raises ORA-06502.)
    Now my script is working fine.. :)
    Thanks and Regards,
    Dipali..

  • Can I use Bulk Collect results as input parameter for another cursor

    MUSIC            ==> remote MUSIC_DB database, MUSIC table has 60 million rows
     PRICE_DATA ==> remote PRICING_DB database, PRICE_DATA table has 1 billion rows
     These two tables once existed in the same database, but the size of the database exceeded the available hardware size and hardware budget, so the PRICE_DATA table was moved to another Oracle database. I need to create a single report that combines data from both of these tables, and a distributed join with a DRIVING_SITE hint will not work because both tables are too large to push to one DRIVING_SITE location, so I wrote this PL/SQL block to process in small batches.
     QUESTION: how can I use bulk collect from one cursor and pass that bulk-collected information as input to a second cursor without specifically listing each cell of the PL/SQL bulk collection? See the sample pseudo-code below; I am trying to determine a more efficient way to code this than hard-coding 100 parameter names into the 2nd cursor.
     NOTE: below is truly pseudo-code. I had to change the names of everything to adhere to an NDA, but it works and is fast enough for my purposes; however, if I want to change from 100 input parameters to 200, I have to add more hard-coded values. There has got to be a better way.
    DECLARE
         -- define cursor that retrieves distinct SONG_IDs from MUSIC table in remote music database
         CURSOR C_CURRENT_MUSIC
         IS
        select distinct SONG_ID
        from MUSIC@MUSIC_DB
        where PRODUCTION_RELEASE=1
     /*  define a parameterized cursor that accepts 100 SONG_IDs and retrieves
          required pricing information */
         CURSOR C_get_music_price_data
                   P_SONG_ID_001 NUMBER, P_SONG_ID_002 NUMBER, P_SONG_ID_003 NUMBER, P_SONG_ID_004 NUMBER, P_SONG_ID_005 NUMBER, P_SONG_ID_006 NUMBER, P_SONG_ID_007 NUMBER, P_SONG_ID_008 NUMBER, P_SONG_ID_009 NUMBER, P_SONG_ID_010 NUMBER,
                   P_SONG_ID_011 NUMBER, P_SONG_ID_012 NUMBER, P_SONG_ID_013 NUMBER, P_SONG_ID_014 NUMBER, P_SONG_ID_015 NUMBER, P_SONG_ID_016 NUMBER, P_SONG_ID_017 NUMBER, P_SONG_ID_018 NUMBER, P_SONG_ID_019 NUMBER, P_SONG_ID_020 NUMBER,
                   P_SONG_ID_021 NUMBER, P_SONG_ID_022 NUMBER, P_SONG_ID_023 NUMBER, P_SONG_ID_024 NUMBER, P_SONG_ID_025 NUMBER, P_SONG_ID_026 NUMBER, P_SONG_ID_027 NUMBER, P_SONG_ID_028 NUMBER, P_SONG_ID_029 NUMBER, P_SONG_ID_030 NUMBER,
                   P_SONG_ID_031 NUMBER, P_SONG_ID_032 NUMBER, P_SONG_ID_033 NUMBER, P_SONG_ID_034 NUMBER, P_SONG_ID_035 NUMBER, P_SONG_ID_036 NUMBER, P_SONG_ID_037 NUMBER, P_SONG_ID_038 NUMBER, P_SONG_ID_039 NUMBER, P_SONG_ID_040 NUMBER,
                   P_SONG_ID_041 NUMBER, P_SONG_ID_042 NUMBER, P_SONG_ID_043 NUMBER, P_SONG_ID_044 NUMBER, P_SONG_ID_045 NUMBER, P_SONG_ID_046 NUMBER, P_SONG_ID_047 NUMBER, P_SONG_ID_048 NUMBER, P_SONG_ID_049 NUMBER, P_SONG_ID_050 NUMBER,
                   P_SONG_ID_051 NUMBER, P_SONG_ID_052 NUMBER, P_SONG_ID_053 NUMBER, P_SONG_ID_054 NUMBER, P_SONG_ID_055 NUMBER, P_SONG_ID_056 NUMBER, P_SONG_ID_057 NUMBER, P_SONG_ID_058 NUMBER, P_SONG_ID_059 NUMBER, P_SONG_ID_060 NUMBER,
                   P_SONG_ID_061 NUMBER, P_SONG_ID_062 NUMBER, P_SONG_ID_063 NUMBER, P_SONG_ID_064 NUMBER, P_SONG_ID_065 NUMBER, P_SONG_ID_066 NUMBER, P_SONG_ID_067 NUMBER, P_SONG_ID_068 NUMBER, P_SONG_ID_069 NUMBER, P_SONG_ID_070 NUMBER,
                   P_SONG_ID_071 NUMBER, P_SONG_ID_072 NUMBER, P_SONG_ID_073 NUMBER, P_SONG_ID_074 NUMBER, P_SONG_ID_075 NUMBER, P_SONG_ID_076 NUMBER, P_SONG_ID_077 NUMBER, P_SONG_ID_078 NUMBER, P_SONG_ID_079 NUMBER, P_SONG_ID_080 NUMBER,
                   P_SONG_ID_081 NUMBER, P_SONG_ID_082 NUMBER, P_SONG_ID_083 NUMBER, P_SONG_ID_084 NUMBER, P_SONG_ID_085 NUMBER, P_SONG_ID_086 NUMBER, P_SONG_ID_087 NUMBER, P_SONG_ID_088 NUMBER, P_SONG_ID_089 NUMBER, P_SONG_ID_090 NUMBER,
                   P_SONG_ID_091 NUMBER, P_SONG_ID_092 NUMBER, P_SONG_ID_093 NUMBER, P_SONG_ID_094 NUMBER, P_SONG_ID_095 NUMBER, P_SONG_ID_096 NUMBER, P_SONG_ID_097 NUMBER, P_SONG_ID_098 NUMBER, P_SONG_ID_099 NUMBER, P_SONG_ID_100 NUMBER
         IS
         select
         from PRICE_DATA@PRICING_DB
         where COUNTRY = 'USA'
         and START_DATE <= sysdate
         and END_DATE > sysdate
         and vpc.SONG_ID IN
                   P_SONG_ID_001 ,P_SONG_ID_002 ,P_SONG_ID_003 ,P_SONG_ID_004 ,P_SONG_ID_005 ,P_SONG_ID_006 ,P_SONG_ID_007 ,P_SONG_ID_008 ,P_SONG_ID_009 ,P_SONG_ID_010,
                   P_SONG_ID_011 ,P_SONG_ID_012 ,P_SONG_ID_013 ,P_SONG_ID_014 ,P_SONG_ID_015 ,P_SONG_ID_016 ,P_SONG_ID_017 ,P_SONG_ID_018 ,P_SONG_ID_019 ,P_SONG_ID_020,
                   P_SONG_ID_021 ,P_SONG_ID_022 ,P_SONG_ID_023 ,P_SONG_ID_024 ,P_SONG_ID_025 ,P_SONG_ID_026 ,P_SONG_ID_027 ,P_SONG_ID_028 ,P_SONG_ID_029 ,P_SONG_ID_030,
                   P_SONG_ID_031 ,P_SONG_ID_032 ,P_SONG_ID_033 ,P_SONG_ID_034 ,P_SONG_ID_035 ,P_SONG_ID_036 ,P_SONG_ID_037 ,P_SONG_ID_038 ,P_SONG_ID_039 ,P_SONG_ID_040,
                   P_SONG_ID_041 ,P_SONG_ID_042 ,P_SONG_ID_043 ,P_SONG_ID_044 ,P_SONG_ID_045 ,P_SONG_ID_046 ,P_SONG_ID_047 ,P_SONG_ID_048 ,P_SONG_ID_049 ,P_SONG_ID_050,
                   P_SONG_ID_051 ,P_SONG_ID_052 ,P_SONG_ID_053 ,P_SONG_ID_054 ,P_SONG_ID_055 ,P_SONG_ID_056 ,P_SONG_ID_057 ,P_SONG_ID_058 ,P_SONG_ID_059 ,P_SONG_ID_060,
                   P_SONG_ID_061 ,P_SONG_ID_062 ,P_SONG_ID_063 ,P_SONG_ID_064 ,P_SONG_ID_065 ,P_SONG_ID_066 ,P_SONG_ID_067 ,P_SONG_ID_068 ,P_SONG_ID_069 ,P_SONG_ID_070,
                   P_SONG_ID_071 ,P_SONG_ID_072 ,P_SONG_ID_073 ,P_SONG_ID_074 ,P_SONG_ID_075 ,P_SONG_ID_076 ,P_SONG_ID_077 ,P_SONG_ID_078 ,P_SONG_ID_079 ,P_SONG_ID_080,
                   P_SONG_ID_081 ,P_SONG_ID_082 ,P_SONG_ID_083 ,P_SONG_ID_084 ,P_SONG_ID_085 ,P_SONG_ID_086 ,P_SONG_ID_087 ,P_SONG_ID_088 ,P_SONG_ID_089 ,P_SONG_ID_090,
                   P_SONG_ID_091 ,P_SONG_ID_092 ,P_SONG_ID_093 ,P_SONG_ID_094 ,P_SONG_ID_095 ,P_SONG_ID_096 ,P_SONG_ID_097 ,P_SONG_ID_098 ,P_SONG_ID_099 ,P_SONG_ID_100
         group by
               vpc.SONG_ID
              ,vpc.STOREFRONT_ID
         TYPE SONG_ID_TYPE IS TABLE OF MUSIC@MUSIC_DB%TYPE INDEX BY BINARY_INTEGER;
         V_SONG_ID_ARRAY                         SONG_ID_TYPE                     ;
         v_commit_counter           NUMBER := 0;
    BEGIN
         /* open cursor you intent to bulk collect from */
         OPEN C_CURRENT_MUSIC;
         LOOP
              /* in batches of 100, bulk collect ADAM_ID mapped TMS_IDENTIFIER into PLSQL table or records */
              FETCH C_CURRENT_MUSIC BULK COLLECT INTO V_SONG_ID_ARRAY LIMIT 100;
                   EXIT WHEN V_SONG_ID_ARRAY.COUNT = 0;
                   /* to avoid NO DATA FOUND error when pass 100 parameters to OPEN cursor, if the arrary
                      is not fully populated to 100, pad the array with nulls to fill up to 100 cells. */
                   IF (V_SONG_ID_ARRAY.COUNT >=1 and V_SONG_ID_ARRAY.COUNT <> 100) THEN
                        FOR j IN V_SONG_ID_ARRAY.COUNT+1..100 LOOP
                             V_SONG_ID_ARRAY(j) := null;
                        END LOOP;
                   END IF;
              /* pass a batch of 100 to cursor that get price information per SONG_ID and STOREFRONT_ID */
              FOR j IN C_get_music_price_data
                        V_SONG_ID_ARRAY(1) ,V_SONG_ID_ARRAY(2) ,V_SONG_ID_ARRAY(3) ,V_SONG_ID_ARRAY(4) ,V_SONG_ID_ARRAY(5) ,V_SONG_ID_ARRAY(6) ,V_SONG_ID_ARRAY(7) ,V_SONG_ID_ARRAY(8) ,V_SONG_ID_ARRAY(9) ,V_SONG_ID_ARRAY(10) ,
                        V_SONG_ID_ARRAY(11) ,V_SONG_ID_ARRAY(12) ,V_SONG_ID_ARRAY(13) ,V_SONG_ID_ARRAY(14) ,V_SONG_ID_ARRAY(15) ,V_SONG_ID_ARRAY(16) ,V_SONG_ID_ARRAY(17) ,V_SONG_ID_ARRAY(18) ,V_SONG_ID_ARRAY(19) ,V_SONG_ID_ARRAY(20) ,
                        V_SONG_ID_ARRAY(21) ,V_SONG_ID_ARRAY(22) ,V_SONG_ID_ARRAY(23) ,V_SONG_ID_ARRAY(24) ,V_SONG_ID_ARRAY(25) ,V_SONG_ID_ARRAY(26) ,V_SONG_ID_ARRAY(27) ,V_SONG_ID_ARRAY(28) ,V_SONG_ID_ARRAY(29) ,V_SONG_ID_ARRAY(30) ,
                        V_SONG_ID_ARRAY(31) ,V_SONG_ID_ARRAY(32) ,V_SONG_ID_ARRAY(33) ,V_SONG_ID_ARRAY(34) ,V_SONG_ID_ARRAY(35) ,V_SONG_ID_ARRAY(36) ,V_SONG_ID_ARRAY(37) ,V_SONG_ID_ARRAY(38) ,V_SONG_ID_ARRAY(39) ,V_SONG_ID_ARRAY(40) ,
                        V_SONG_ID_ARRAY(41) ,V_SONG_ID_ARRAY(42) ,V_SONG_ID_ARRAY(43) ,V_SONG_ID_ARRAY(44) ,V_SONG_ID_ARRAY(45) ,V_SONG_ID_ARRAY(46) ,V_SONG_ID_ARRAY(47) ,V_SONG_ID_ARRAY(48) ,V_SONG_ID_ARRAY(49) ,V_SONG_ID_ARRAY(50) ,
                        V_SONG_ID_ARRAY(51) ,V_SONG_ID_ARRAY(52) ,V_SONG_ID_ARRAY(53) ,V_SONG_ID_ARRAY(54) ,V_SONG_ID_ARRAY(55) ,V_SONG_ID_ARRAY(56) ,V_SONG_ID_ARRAY(57) ,V_SONG_ID_ARRAY(58) ,V_SONG_ID_ARRAY(59) ,V_SONG_ID_ARRAY(60) ,
                        V_SONG_ID_ARRAY(61) ,V_SONG_ID_ARRAY(62) ,V_SONG_ID_ARRAY(63) ,V_SONG_ID_ARRAY(64) ,V_SONG_ID_ARRAY(65) ,V_SONG_ID_ARRAY(66) ,V_SONG_ID_ARRAY(67) ,V_SONG_ID_ARRAY(68) ,V_SONG_ID_ARRAY(69) ,V_SONG_ID_ARRAY(70) ,
                        V_SONG_ID_ARRAY(71) ,V_SONG_ID_ARRAY(72) ,V_SONG_ID_ARRAY(73) ,V_SONG_ID_ARRAY(74) ,V_SONG_ID_ARRAY(75) ,V_SONG_ID_ARRAY(76) ,V_SONG_ID_ARRAY(77) ,V_SONG_ID_ARRAY(78) ,V_SONG_ID_ARRAY(79) ,V_SONG_ID_ARRAY(80) ,
                        V_SONG_ID_ARRAY(81) ,V_SONG_ID_ARRAY(82) ,V_SONG_ID_ARRAY(83) ,V_SONG_ID_ARRAY(84) ,V_SONG_ID_ARRAY(85) ,V_SONG_ID_ARRAY(86) ,V_SONG_ID_ARRAY(87) ,V_SONG_ID_ARRAY(88) ,V_SONG_ID_ARRAY(89) ,V_SONG_ID_ARRAY(90) ,
                        V_SONG_ID_ARRAY(91) ,V_SONG_ID_ARRAY(92) ,V_SONG_ID_ARRAY(93) ,V_SONG_ID_ARRAY(94) ,V_SONG_ID_ARRAY(95) ,V_SONG_ID_ARRAY(96) ,V_SONG_ID_ARRAY(97) ,V_SONG_ID_ARRAY(98) ,V_SONG_ID_ARRAY(99) ,V_SONG_ID_ARRAY(100)        
              LOOP
               /* do stuff with data from Song and Pricing Database coming from the two
                    separate cursors, then continue processing more rows... */
              END LOOP;
              /* commit after each batch of 100 SONG_IDs is processed */        
              COMMIT;
              EXIT WHEN C_CURRENT_MUSIC%NOTFOUND;  -- exit when there are no more rows to fetch from cursor
         END LOOP; -- bulk fetching loop
         CLOSE C_CURRENT_MUSIC; -- close cursor that was used in bulk collection
         /* commit rows */
         COMMIT; -- commit any remaining uncommitted data.
    END;

     I've got a problem when passing a VARRAY of numbers as a parameter to the remote cursor: it takes a super long time to run, and sometimes doesn't finish even after an hour has passed.
     Continuing with my example in the original entry, I replaced the bulk collect into a PL/SQL table collection with a VARRAY, and I bulk collect into the VARRAY. This is fast, and I know it works because I can DBMS_OUTPUT.PUT_LINE cells of the VARRAY, so I know it is getting populated correctly. However, when I pass the VARRAY containing 100 cells populated with SONG_IDs as a parameter to the cursor, execution time is over an hour when I am expecting a few seconds.
     The code example below strips the problem down to its raw details. I skip the bulk collect and just manually populate a VARRAY with 100 SONG_ID values, then try to pass it as a parameter to a cursor, but the execution time of the cursor is unexpectedly long - over 30 minutes, sometimes longer - when I am expecting seconds.
     IMPORTANT: If I take the same 100 SONG_IDs and place them directly in the cursor query's WHERE ... IN clause, the SQL runs in under 5 seconds and returns the result. Also, if I pass the 100 SONG_IDs as individual cells of a PL/SQL table collection, then it also runs fast.
     I thought that since the VARRAY is used via a select subquery it is queried locally while the cursor is remote, and that I had a distribution problem on my hands, so I put in the DRIVING_SITE hint to attempt to force the query against the VARRAY to go to the remote server so the rest of the query would run there before returning the result, but that didn't work either; I still got a slow response.
     Is something wrong with my code, or am I running into an Oracle problem that may require support to resolve?
    DECLARE
     /*  define a parameterized cursor that accepts XXX number of in SONG_IDs and
      retrieves required pricing information */
     CURSOR C_get_music_price_data
       ( p_array_song_ids SYS.ODCInumberList )
     IS
         select  /*+DRIVING_SITE(pd) */
      count(distinct s.EVE_ID)
         from PRICE_DATA@PRICING_DB pd
         where pd.COUNTRY = 'USA'
         and pd.START_DATE <= sysdate
         and pd.END_DATE > sysdate
     and pd.SONG_ID IN
          ( select column_value from table(p_array_song_ids) )
         group by
               pd.SONG_ID
          ,pd.STOREFRONT_ID;
      V_ARRAY_SONG_IDS SYS.ODCInumberList := SYS.ODCInumberList();    
    BEGIN
    V_ARRAY_SONG_IDS.EXTEND(100);
    V_ARRAY_SONG_IDS(  1 ) := 31135  ;
    V_ARRAY_SONG_IDS(  2 ) := 31140   ;
    V_ARRAY_SONG_IDS(  3 ) := 31142   ;
    V_ARRAY_SONG_IDS(  4 ) := 31144   ;
    V_ARRAY_SONG_IDS(  5 ) := 31146   ;
    V_ARRAY_SONG_IDS(  6 ) := 31148   ;
    V_ARRAY_SONG_IDS(  7 ) := 31150   ;
    V_ARRAY_SONG_IDS(  8 ) := 31152   ;
    V_ARRAY_SONG_IDS(  9 ) := 31154   ;
    V_ARRAY_SONG_IDS( 10 ) := 31156   ;
    V_ARRAY_SONG_IDS( 11 ) := 31158   ;
    V_ARRAY_SONG_IDS( 12 ) := 31160   ;
    V_ARRAY_SONG_IDS( 13 ) := 33598   ;
    V_ARRAY_SONG_IDS( 14 ) := 33603   ;
    V_ARRAY_SONG_IDS( 15 ) := 33605   ;
    V_ARRAY_SONG_IDS( 16 ) := 33607   ;
    V_ARRAY_SONG_IDS( 17 ) := 33609   ;
    V_ARRAY_SONG_IDS( 18 ) := 33611   ;
    V_ARRAY_SONG_IDS( 19 ) := 33613   ;
    V_ARRAY_SONG_IDS( 20 ) := 33615   ;
    V_ARRAY_SONG_IDS( 21 ) := 33617   ;
    V_ARRAY_SONG_IDS( 22 ) := 33630   ;
    V_ARRAY_SONG_IDS( 23 ) := 33632   ;
    V_ARRAY_SONG_IDS( 24 ) := 33636   ;
    V_ARRAY_SONG_IDS( 25 ) := 33638   ;
    V_ARRAY_SONG_IDS( 26 ) := 33640   ;
    V_ARRAY_SONG_IDS( 27 ) := 33642   ;
    V_ARRAY_SONG_IDS( 28 ) := 33644   ;
    V_ARRAY_SONG_IDS( 29 ) := 33646   ;
    V_ARRAY_SONG_IDS( 30 ) := 33648   ;
    V_ARRAY_SONG_IDS( 31 ) := 33662   ;
    V_ARRAY_SONG_IDS( 32 ) := 33667   ;
    V_ARRAY_SONG_IDS( 33 ) := 33669   ;
    V_ARRAY_SONG_IDS( 34 ) := 33671   ;
    V_ARRAY_SONG_IDS( 35 ) := 33673   ;
    V_ARRAY_SONG_IDS( 36 ) := 33675   ;
    V_ARRAY_SONG_IDS( 37 ) := 33677   ;
    V_ARRAY_SONG_IDS( 38 ) := 33679   ;
    V_ARRAY_SONG_IDS( 39 ) := 33681   ;
    V_ARRAY_SONG_IDS( 40 ) := 33683   ;
    V_ARRAY_SONG_IDS( 41 ) := 33685   ;
    V_ARRAY_SONG_IDS( 42 ) := 33700   ;
    V_ARRAY_SONG_IDS( 43 ) := 33702   ;
    V_ARRAY_SONG_IDS( 44 ) := 33704   ;
    V_ARRAY_SONG_IDS( 45 ) := 33706   ;
    V_ARRAY_SONG_IDS( 46 ) := 33708   ;
    V_ARRAY_SONG_IDS( 47 ) := 33710   ;
    V_ARRAY_SONG_IDS( 48 ) := 33712   ;
    V_ARRAY_SONG_IDS( 49 ) := 33723   ;
    V_ARRAY_SONG_IDS( 50 ) := 33725   ;
    V_ARRAY_SONG_IDS( 51 ) := 33727   ;
    V_ARRAY_SONG_IDS( 52 ) := 33729   ;
    V_ARRAY_SONG_IDS( 53 ) := 33731   ;
    V_ARRAY_SONG_IDS( 54 ) := 33733   ;
    V_ARRAY_SONG_IDS( 55 ) := 33735   ;
    V_ARRAY_SONG_IDS( 56 ) := 33737   ;
    V_ARRAY_SONG_IDS( 57 ) := 33749   ;
    V_ARRAY_SONG_IDS( 58 ) := 33751   ;
    V_ARRAY_SONG_IDS( 59 ) := 33753   ;
    V_ARRAY_SONG_IDS( 60 ) := 33755   ;
    V_ARRAY_SONG_IDS( 61 ) := 33757   ;
    V_ARRAY_SONG_IDS( 62 ) := 33759   ;
    V_ARRAY_SONG_IDS( 63 ) := 33761   ;
    V_ARRAY_SONG_IDS( 64 ) := 33763   ;
    V_ARRAY_SONG_IDS( 65 ) := 33775   ;
    V_ARRAY_SONG_IDS( 66 ) := 33777   ;
    V_ARRAY_SONG_IDS( 67 ) := 33779   ;
    V_ARRAY_SONG_IDS( 68 ) := 33781   ;
    V_ARRAY_SONG_IDS( 69 ) := 33783   ;
    V_ARRAY_SONG_IDS( 70 ) := 33785   ;
    V_ARRAY_SONG_IDS( 71 ) := 33787   ;
    V_ARRAY_SONG_IDS( 72 ) := 33789   ;
    V_ARRAY_SONG_IDS( 73 ) := 33791   ;
    V_ARRAY_SONG_IDS( 74 ) := 33793   ;
    V_ARRAY_SONG_IDS( 75 ) := 33807   ;
    V_ARRAY_SONG_IDS( 76 ) := 33809   ;
    V_ARRAY_SONG_IDS( 77 ) := 33811   ;
    V_ARRAY_SONG_IDS( 78 ) := 33813   ;
    V_ARRAY_SONG_IDS( 79 ) := 33815   ;
    V_ARRAY_SONG_IDS( 80 ) := 33817   ;
    V_ARRAY_SONG_IDS( 81 ) := 33819   ;
    V_ARRAY_SONG_IDS( 82 ) := 33821   ;
    V_ARRAY_SONG_IDS( 83 ) := 33823   ;
    V_ARRAY_SONG_IDS( 84 ) := 33825   ;
    V_ARRAY_SONG_IDS( 85 ) := 33839   ;
    V_ARRAY_SONG_IDS( 86 ) := 33844   ;
    V_ARRAY_SONG_IDS( 87 ) := 33846   ;
    V_ARRAY_SONG_IDS( 88 ) := 33848   ;
    V_ARRAY_SONG_IDS( 89 ) := 33850   ;
    V_ARRAY_SONG_IDS( 90 ) := 33852   ;
    V_ARRAY_SONG_IDS( 91 ) := 33854   ;
    V_ARRAY_SONG_IDS( 92 ) := 33856   ;
    V_ARRAY_SONG_IDS( 93 ) := 33858   ;
    V_ARRAY_SONG_IDS( 94 ) := 33860   ;
    V_ARRAY_SONG_IDS( 95 ) := 33874   ;
    V_ARRAY_SONG_IDS( 96 ) := 33879   ;
    V_ARRAY_SONG_IDS( 97 ) := 33881   ;
    V_ARRAY_SONG_IDS( 98 ) := 33883   ;
    V_ARRAY_SONG_IDS( 99 ) := 33885   ;
    V_ARRAY_SONG_IDS(100 ) := 33889  ;
        /* do stuff with data from Song and Pricing Database coming from the two
           separate cursors, then continue processing more rows... */
      FOR i IN C_get_music_price_data( v_array_song_ids ) LOOP
        -- (this is the loop where I pass in v_array_song_ids
        --  populated with only 100 cells and it runs forever)
      END LOOP;
    END;

  • Calling Stored procedure which uses Bulk Collect

    Hi All, I have an Oracle stored procedure which uses Bulk Collect and returns a table type parameter as output. Can anyone please help me with how I can call this kind of stored procedure, which returns table type output, using VB and Oracle's driver? (I am successfully able to call it using the MS ODBC driver, but I want to use the OraOLEDB driver.)

    861412 wrote:
    how Can I call this kind of stored procedures which returns table type output using VB and Oracle's Driver.
    This forum deals with the server-side languages SQL and PL/SQL.
    Your question deals with the client side and Visual Basic language.

  • Approach of using Bulk Collect

    Hi Experts,
    How do I use bulk collect for an uncertain number of columns in a select statement?
    Master table structure:
    Create table tabmst
    (id number,
    cls_input varchar2(2000),
    price number);
    insert into tabmst values (1,'select product, price from product',500);
    insert into tabmst values (2,'select product, price,purchase_dt from product',100);
    insert into tabmst values (3,'select * from product',1000);
    Currently I want to store the Select statement from the cls_input column in a local variable, like
    dyn_qry := cls_input; by using a cursor.
    Now my question is how to use Bulk Collect with "Execute Immediate" into a bulk collect variable, as there is no certainty about the number of columns from the "Select Statement". Please suggest.
    Sample code:
    I created a TYPE variable, blk_var, to support the Bulk Collect.
    Declare
    dyn_qry varchar2(3000);
    cursor c1 is select * from tabmst;
    begin
    for i in c1 loop
    dyn_qry:= cls_input;
    Execute immediate dyn_qry into blk_var;
    End Loop;
    End;
    Now I want to store the values of each select statement's columns, which is executed by dynamic SQL, but it is uncertain how many columns the dynamic SQL will return.
    Please suggest an approach for this. Thanks in advance.

    >
    I don't think you can use bulk collect with EXECUTE IMMEDIATE. They do two different things. EXECUTE IMMEDIATE allows the execution of dynamic SQL. BULK COLLECT provides optimization of SELECT statements when loading the contents into collections. I am not aware of any support for BULK COLLECT with EXECUTE IMMEDIATE.
    You may be able to do this a different way. If you must use dynamic SQL (I suggest you don't unless it is absolutely necessary. Dynamic SQL is hard to write, hard to debug, hard to maintain, and hard to tune) use a reference cursor instead. You can use the BULK COLLECT with the standard fetch.
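    A minimal sketch of that ref-cursor approach: the column list still has to be fixed at compile time, so this only works for statements whose result shape is known, here assumed to be the product/price pair from the first tabmst row:
    DECLARE
        TYPE prod_rec_t IS RECORD (
            product VARCHAR2(100),
            price   NUMBER
        );
        TYPE prod_tab_t IS TABLE OF prod_rec_t;
        rc      SYS_REFCURSOR;
        l_rows  prod_tab_t;
        dyn_qry VARCHAR2(3000);
    BEGIN
        FOR r IN (SELECT cls_input FROM tabmst WHERE id = 1) LOOP
            dyn_qry := r.cls_input;             -- 'select product, price from product'
            OPEN rc FOR dyn_qry;
            FETCH rc BULK COLLECT INTO l_rows;  -- bulk collect via the standard fetch
            CLOSE rc;
            DBMS_OUTPUT.put_line(l_rows.COUNT || ' rows fetched');
        END LOOP;
    END;
    /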

  • Commit / Rollback and errors when using BULK COLLECT

    Hi
    Where should I put the COMMIT when using BULK COLLECT for inserts or updates?
    How many records should I commit at a time?
    declare
    TYPE t_cd_estrutura_comercial IS TABLE OF t_ec_pessoa_perfil_ciclo.cd_estrutura_comercial%TYPE   INDEX BY PLS_INTEGER;
    TYPE t_cd_tipo_estrutura_comercial IS TABLE OF t_ec_pessoa_perfil_ciclo.cd_tipo_estrutura_comercial%TYPE  INDEX BY PLS_INTEGER;
    TYPE t_nm_ciclo_operacional IS TABLE OF  t_ec_pessoa_perfil_ciclo.nm_ciclo_operacional%TYPE   INDEX BY PLS_INTEGER;
    TYPE t_cd_consultora IS TABLE OF t_ec_pessoa_perfil_ciclo.cd_consultora%TYPE  INDEX BY PLS_INTEGER;
    TYPE t_cd_perfil IS TABLE OF t_ec_pessoa_perfil_ciclo.cd_perfil%TYPE  INDEX BY PLS_INTEGER;
    TYPE t_cd_indicador IS TABLE OF t_ec_pessoa_perfil_ciclo.cd_indicador%TYPE  INDEX BY PLS_INTEGER;
    TYPE t_vl_indicador IS TABLE OF t_ec_pessoa_perfil_ciclo.vl_indicador%TYPE  INDEX BY PLS_INTEGER;
    TYPE t_dt_ultima_atualizacao IS TABLE OF t_ec_pessoa_perfil_ciclo.dt_ultima_atualizacao%TYPE   INDEX BY PLS_INTEGER;
    v_cd_estrutura_comercial t_cd_estrutura_comercial;
    v_cd_tipo_estrutura_comercial t_cd_tipo_estrutura_comercial;
    v_nm_ciclo_operacional t_nm_ciclo_operacional;
    v_cd_consultora t_cd_consultora;
    v_cd_perfil t_cd_perfil;
    v_cd_indicador t_cd_indicador;
    v_vl_indicador t_vl_indicador;
    v_dt_ultima_atualizacao t_dt_ultima_atualizacao;
    V_CURSOR  SYS_REFCURSOR;
    n  pls_integer :=0;
    begin
      ---open v_cursor 
      pkg_scnob_batch_indicadores.abre_cursor(275,v_cursor);
       LOOP
       FETCH V_CURSOR   BULK COLLECT INTO
         v_cd_estrutura_comercial,
         v_cd_tipo_estrutura_comercial,
         v_nm_ciclo_operacional,
         v_cd_consultora,
         v_cd_perfil,
         v_cd_indicador,
         v_vl_indicador,
         v_dt_ultima_atualizacao  LIMIT 1000;
           FORALL i IN 1 .. v_cd_estrutura_comercial.COUNT
              MERGE
                 INTO T_EC_PESSOA_PERFIL_CICLO tgt
                 USING ( SELECT v_cd_estrutura_comercial(i) cd_estrutura_comercial,                           
                          v_cd_tipo_estrutura_comercial(i) cd_tipo_estrutura_comercial,                      
                         v_nm_ciclo_operacional(i) nm_ciclo_operacional,                             
                         v_cd_consultora(i) cd_consultora,                                    
                         v_cd_perfil(i)    cd_perfil,                                       
                         v_cd_indicador(i) cd_indicador,                                     
                         v_vl_indicador(i) vl_indicador,                                     
                         v_dt_ultima_atualizacao(i) dt_ultima_atualizacao FROM  dual ) src
                 ON   (   src.CD_ESTRUTURA_COMERCIAL            = TGT.CD_ESTRUTURA_COMERCIAL        
                          AND src.CD_TIPO_ESTRUTURA_COMERCIAL   = TGT.CD_TIPO_ESTRUTURA_COMERCIAL   
                          AND src.NM_CICLO_OPERACIONAL          = TGT.NM_CICLO_OPERACIONAL          
                          AND src.CD_CONSULTORA                 = TGT.CD_CONSULTORA                 
                          AND src.CD_PERFIL                     = TGT.CD_PERFIL                     
                          AND  src.CD_INDICADOR                 = TGT.CD_INDICADOR  )
              WHEN MATCHED
              THEN
                UPDATE
                 SET TGT.VL_INDICADOR  = src.VL_INDICADOR ,                             
                     TGT.DT_ULTIMA_ATUALIZACAO  = src.DT_ULTIMA_ATUALIZACAO                     
              WHEN NOT MATCHED
              THEN
                 INSERT (tgt.CD_ESTRUTURA_COMERCIAL,                           
                         tgt.CD_TIPO_ESTRUTURA_COMERCIAL,                      
                         tgt.NM_CICLO_OPERACIONAL,                             
                         tgt.CD_CONSULTORA,                                    
                         tgt.CD_PERFIL ,                                       
                         tgt.CD_INDICADOR,                                     
                         tgt.VL_INDICADOR,                                     
                         tgt.DT_ULTIMA_ATUALIZACAO)                            
                 VALUES (src.CD_ESTRUTURA_COMERCIAL,                           
                         src.CD_TIPO_ESTRUTURA_COMERCIAL,                      
                         src.NM_CICLO_OPERACIONAL,                             
                         src.CD_CONSULTORA,                                    
                         src.CD_PERFIL ,                                       
                         src.CD_INDICADOR,                                     
                         src.VL_INDICADOR,                                     
                         src.DT_ULTIMA_ATUALIZACAO) ;
               -- Should I commit here?
               --         commit;
           n := n + SQL%ROWCOUNT;
        EXIT WHEN  V_CURSOR%NOTFOUND;
      END LOOP;
      CLOSE V_CURSOR;
    exception
       when others then
           rollback;  -- and log errors
           raise;     -- re-raise so the caller sees the failure
    end;

    There are a lot of questions like this.
    Actually, there is no precise maximum number of records to be committed.
    The COMMIT
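    For illustration only, here is a sketch of one commonly used arrangement (src_tab and tgt_tab are hypothetical stand-ins for the cursor and for T_EC_PESSOA_PERFIL_CICLO above): fetch with a LIMIT, run the FORALL, then commit once per batch, so each 1,000-row MERGE is made durable before the next fetch. Whether batch-level commits are acceptable depends on whether the job can be safely restarted partway through; otherwise, a single commit after the loop is the simpler choice.

    declare
        cursor c_src is select id, val from src_tab;
        type t_id  is table of src_tab.id%type  index by pls_integer;
        type t_val is table of src_tab.val%type index by pls_integer;
        v_ids  t_id;
        v_vals t_val;
        n pls_integer := 0;
    begin
        open c_src;
        loop
            fetch c_src bulk collect into v_ids, v_vals limit 1000;
            exit when v_ids.count = 0;          -- nothing left to process
            forall i in 1 .. v_ids.count
                merge into tgt_tab tgt
                using (select v_ids(i) id, v_vals(i) val from dual) src
                on (tgt.id = src.id)
                when matched then update set tgt.val = src.val
                when not matched then insert (id, val) values (src.id, src.val);
            n := n + sql%rowcount;              -- rows merged in this batch
            commit;                             -- one commit per 1,000-row batch
        end loop;
        close c_src;
    end;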

  • How to call BULK COLLECT using JDBC in JSP?

    Hi
    I am using BULK COLLECT instead of a CURSOR in my stored procedure.
    How do I handle this in JDBC?
    When I use a CURSOR, I call
    cs.registerOutParameter(1, oracle.jdbc.driver.OracleTypes.CURSOR);
    Which type do I have to register instead?
    I am retrieving more than one record using a FOR LOOP in the stored procedure, and before that BULK COLLECT is used in the SELECT query.
    Can you give JSP code for this?
    Thanks

    You may find related sample JDBC code on OTN: http://www.oracle.com/technology/sample_code/tech/java/sqlj_jdbc/index.html
    Best regards.
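    On the PL/SQL side, one way to keep the JDBC call simple is to do the BULK COLLECT work inside the procedure but hand the results back through a SYS_REFCURSOR OUT parameter; the client can then keep registering that parameter as OracleTypes.CURSOR, exactly as in the CURSOR-based call above. A sketch, with the procedure name, the HR-style EMPLOYEES table and the internal processing all assumed for illustration:

    create or replace procedure get_employees_rc (p_rc out sys_refcursor) as
        type t_ids is table of employees.employee_id%type index by pls_integer;
        v_ids t_ids;
    begin
        -- internal BULK COLLECT for whatever processing the procedure does today
        select employee_id bulk collect into v_ids from employees;
        -- ... process v_ids here ...
        -- then expose the final result set to the client as a ref cursor
        open p_rc for
            select employee_id, first_name, last_name, hire_date
              from employees;
    end;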
