DBMS_OUTPUT within BULK COLLECT FORALL
Hi,
I am trying to figure out how I can produce output using DBMS_OUTPUT.PUT_LINE within a BULK COLLECT / FORALL update.
Example:
FETCH ref_cursor BULK COLLECT INTO l_productid, l_qty;
FORALL indx IN l_productid.FIRST .. l_productid.LAST
  UPDATE products aa
     SET aa.lastinventorysent = l_qty(indx)
   WHERE aa.productid = l_productid(indx);
DBMS_OUTPUT.PUT_LINE('ProductID: ' || l_productid(indx) || ' QTY: ' || l_qty(indx));
Is this possible? If so, how can I accomplish this?
Thanks,
S
FETCH ref_cursor BULK COLLECT INTO l_productid, l_qty;
FORALL indx IN l_productid.FIRST .. l_productid.LAST
  UPDATE products aa
     SET aa.lastinventorysent = l_qty(indx)
   WHERE aa.productid = l_productid(indx);
FOR indx IN 1 .. l_qty.COUNT LOOP
  DBMS_OUTPUT.PUT_LINE('ProductID: ' || l_productid(indx) || ' QTY: ' || l_qty(indx));
END LOOP;
SY.
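The answer above can be fleshed out into a complete, hedged sketch (the table and column names are the poster's; the source query and the collection declarations are assumptions, since they were not shown):

```sql
DECLARE
  TYPE t_id_tab  IS TABLE OF products.productid%TYPE         INDEX BY PLS_INTEGER;
  TYPE t_qty_tab IS TABLE OF products.lastinventorysent%TYPE INDEX BY PLS_INTEGER;
  l_productid t_id_tab;
  l_qty       t_qty_tab;
  ref_cursor  SYS_REFCURSOR;
BEGIN
  OPEN ref_cursor FOR
    SELECT productid, qty FROM pending_inventory;  -- hypothetical source query

  FETCH ref_cursor BULK COLLECT INTO l_productid, l_qty;
  CLOSE ref_cursor;

  -- FORALL is a single bulk-bound DML statement, not a loop body, so
  -- procedural calls such as DBMS_OUTPUT.PUT_LINE cannot appear inside it.
  FORALL indx IN 1 .. l_productid.COUNT
    UPDATE products
       SET lastinventorysent = l_qty(indx)
     WHERE productid = l_productid(indx);

  -- Log afterwards with an ordinary FOR loop over the same collections.
  FOR indx IN 1 .. l_productid.COUNT LOOP
    DBMS_OUTPUT.PUT_LINE('ProductID: ' || l_productid(indx) ||
                         ' QTY: '      || l_qty(indx));
  END LOOP;
END;
/
```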
Similar Messages
-
Having problem using BULK COLLECT - FORALL
Hi,
I'm facing a problem while setting a table type's values before inserting into a table using FORALL.
My concern is that I'm unable to generate the values in the FOR LOOP: using dbms_output.put_line I observed that after the first 100 rows the process exits with the error
ORA-22160: element at index [1] does not exist
ORA-06512: at "XYZ", line 568
ORA-06512: at line 2
I need to use the values stored in the FOR LOOP, in the same order, for insertion into table TEMP using FORALL.
I'm guessing that I'm using the wrong technique for storing values in the FOR LOOP.
Below is my SP structure.
Any suggestion would be helpful.
Thanks!!
create or replace procedure XYZ is
  cursor cur is
    select A, B, C, D from ABCD; ---expecting 40,000 row fetch
  type t_A is table of ABCD.A%type index by pls_integer;
  type t_B is table of ABCD.B%type index by pls_integer;
  type t_C is table of ABCD.C%type index by pls_integer;
  type t_D is table of ABCD.D%type index by pls_integer;
  v_A t_A;
  v_B t_B;
  v_C t_C;
  v_D t_D;
  type t_E is table of VARCHAR2(100);
  type t_F is table of VARCHAR2(100);
  v_E t_E := t_E();
  v_F t_F := t_F();
begin
  open cur;
  loop
    fetch cur bulk collect into v_A, v_B, v_C, v_D limit 100;
    for i in 1 .. v_A.count loop
      v_E.extend(i);
      select 1111 into v_E(i) from dual;
      v_F.extend(i);
      v_F(i) := 'Hi !!';
      ----calculating v_E(i) and v_F(i) here----
    end loop;
    forall i in 1 .. v_A.count
      insert into TEMP values (v_E(i), v_F(i));
    exit when cur%NOTFOUND;
  end loop;
  close cur;
end;
BANNER
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
TNS for HPUX: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production
> The problem is that inside the IF ELSIF blocks I need to query various tables.
As I thought. But why did you concentrate on BULK COLLECT - FORALL?
> The cursor whereas does take more time to execute.
More time than what?
> We have a join of two tables with 1,800,000 (normal table) and 17,922,067 (MView) records, having an index on one of the join columns.
> After joining these two and adding the filter conditions I'm having around 40,000 rows.
You have a cursor row. And then inside the loop you have a query which returns 40,000 rows? What do you do with that data?
Is the query you show running INSIDE the loop?
I guess you are still talking about the LOOP query and you are unhappy that it is not taking an index?
1. The loop is NOT the problem. It's the "... I need to query various tables".
2. ORACLE is OK when it's NOT taking the index. That is faster!!
3. If you add code and execution plans, please add code tags. Otherwise it's unreadable.
Try to merge your LOOP query with the "various tables" and make ONE query out of 40000*various ;-) -
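For what it's worth, the ORA-22160 in the question above typically comes from calling `extend(i)` (which appends i new elements each pass) while the derived collections are never reset between LIMIT batches, so the subscripts used by the FORALL no longer line up. A hedged sketch of the usual per-batch pattern, keeping the poster's names (the value calculations are placeholders):

```sql
DECLARE
  CURSOR cur IS SELECT a, b, c, d FROM abcd;
  TYPE t_a  IS TABLE OF abcd.a%TYPE;
  TYPE t_b  IS TABLE OF abcd.b%TYPE;
  TYPE t_c  IS TABLE OF abcd.c%TYPE;
  TYPE t_d  IS TABLE OF abcd.d%TYPE;
  TYPE t_vc IS TABLE OF VARCHAR2(100);
  v_a t_a; v_b t_b; v_c t_c; v_d t_d;
  v_e t_vc := t_vc();
  v_f t_vc := t_vc();
BEGIN
  OPEN cur;
  LOOP
    FETCH cur BULK COLLECT INTO v_a, v_b, v_c, v_d LIMIT 100;
    EXIT WHEN v_a.COUNT = 0;        -- test the batch, not cur%NOTFOUND

    v_e := t_vc();                  -- reset the derived collections so
    v_f := t_vc();                  -- indexes 1..COUNT exist in every batch
    FOR i IN 1 .. v_a.COUNT LOOP
      v_e.EXTEND;                   -- EXTEND once per row, not EXTEND(i)
      v_f.EXTEND;
      v_e(i) := '1111';             -- placeholder calculations
      v_f(i) := 'Hi !!';
    END LOOP;

    FORALL i IN 1 .. v_a.COUNT
      INSERT INTO temp VALUES (v_e(i), v_f(i));
  END LOOP;
  CLOSE cur;
END;
/
```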
Need help with Bulk Collect ForAll Update
Hi - I'm trying to do a Bulk Collect/ForAll Update but am having issues.
My declarations look like this:
CURSOR cur_hhlds_for_update is
SELECT hsh.household_id, hsh.special_handling_type_id
FROM compas.household_special_handling hsh
, scr_id_lookup s
WHERE hsh.household_id = s.id
AND s.scr = v_scr
AND s.run_date = TRUNC (SYSDATE)
AND effective_date IS NULL
AND special_handling_type_id = 1
AND created_by != v_user;
TYPE rec_hhlds_for_update IS RECORD (
  household_id       HOUSEHOLD_SPECIAL_HANDLING.household_id%type,
  spec_handl_type_id HOUSEHOLD_SPECIAL_HANDLING.SPECIAL_HANDLING_TYPE_ID%type
);
TYPE spec_handling_update_array IS TABLE OF rec_hhlds_for_update;
l_spec_handling_update_array spec_handling_update_array;
And then the Bulk Collect/ForAll looks like this:
OPEN cur_hhlds_for_update;
LOOP
FETCH cur_hhlds_for_update BULK COLLECT INTO l_spec_handling_update_array LIMIT 1000;
EXIT WHEN l_spec_handling_update_array.count = 0;
FORALL i IN 1..l_spec_handling_update_array.COUNT
UPDATE compas.household_special_handling
SET effective_date = TRUNC(SYSDATE)
, last_modified_by = v_user
, last_modified_date = SYSDATE
WHERE household_id = l_spec_handling_update_array(i).household_id
AND special_handling_type_id = l_spec_handling_update_array(i).spec_handl_type_id;
l_special_handling_update_cnt := l_special_handling_update_cnt + SQL%ROWCOUNT;
END LOOP;
And this is the error I'm receiving:
ORA-06550: line 262, column 31:
PLS-00436: implementation restriction: cannot reference fields of BULK In-BIND table of records
ORA-06550: line 262, column 31:
PLS-00382: expression is of wrong type
ORA-06550: line 263, column 43:
PL/SQL: ORA-22806: not an object or REF
ORA-06550: line 258, column 9:
PL/SQL: SQL Statement ignored
My problem is that the table being updated has a composite primary key, so I have two conditions in my WHERE clause. This is the first time I'm even attempting a Bulk Collect/ForAll Update, and it seems like it would be straightforward if I were only dealing with a single-column primary key. Can anyone please advise me as to what I'm missing here, or how I can accomplish this?
Thanks!
Christine
You cannot reference a column inside a record when doing a FORALL; you need to refer to the collection as a whole. So you will need two collections.
Try like this,
DECLARE
CURSOR cur_hhlds_for_update
IS
SELECT hsh.household_id, hsh.special_handling_type_id
FROM compas.household_special_handling hsh, scr_id_lookup s
WHERE hsh.household_id = s.ID
AND s.scr = v_scr
AND s.run_date = TRUNC (SYSDATE)
AND effective_date IS NULL
AND special_handling_type_id = 1
AND created_by != v_user;
TYPE arr_household_id IS TABLE OF HOUSEHOLD_SPECIAL_HANDLING.household_id%TYPE
INDEX BY BINARY_INTEGER;
TYPE arr_spec_handl_type_id IS TABLE OF HOUSEHOLD_SPECIAL_HANDLING.SPECIAL_HANDLING_TYPE_ID%TYPE
INDEX BY BINARY_INTEGER;
l_household_id_col arr_household_id;
l_spec_handl_type_id_col arr_spec_handl_type_id;
BEGIN
OPEN cur_hhlds_for_update;
LOOP
FETCH cur_hhlds_for_update
BULK COLLECT INTO l_household_id_col, l_spec_handl_type_id_col
LIMIT 1000;
EXIT WHEN l_household_id_col.COUNT = 0; -- a %NOTFOUND test here would skip the last partial batch
FORALL i IN l_household_id_col.FIRST .. l_household_id_col.LAST
UPDATE compas.household_special_handling
SET effective_date = TRUNC (SYSDATE),
last_modified_by = v_user,
last_modified_date = SYSDATE
WHERE household_id = l_household_id_col(i)
AND special_handling_type_id = l_spec_handl_type_id_col(i);
--l_special_handling_update_cnt := l_special_handling_update_cnt + SQL%ROWCOUNT; -- Not sure what this does.
END LOOP;
END;
G. -
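As an aside worth verifying against your version's documentation: later Oracle releases (11g onward, if memory serves) lift the PLS-00436 implementation restriction, so the original single collection of records works directly, without splitting into two collections:

```sql
-- Assumes Oracle 11g or later; on 10g and earlier this still raises PLS-00436.
FORALL i IN 1 .. l_spec_handling_update_array.COUNT
  UPDATE compas.household_special_handling
     SET effective_date     = TRUNC(SYSDATE),
         last_modified_by   = v_user,
         last_modified_date = SYSDATE
   WHERE household_id             = l_spec_handling_update_array(i).household_id
     AND special_handling_type_id = l_spec_handling_update_array(i).spec_handl_type_id;
```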
Exception handling using Bulk Collect FORALL
Hi Members,
Following are by DB details
SELECT * FROM v$version;
Oracle9i Enterprise Edition Release 9.2.0.7.0 - 64bit Production
PL/SQL Release 9.2.0.7.0 - Production
CORE 9.2.0.7.0 Production
TNS for HPUX: Version 9.2.0.7.0 - Production
NLSRTL Version 9.2.0.7.0 - Production
I need to handle exceptions during a Bulk Collect FORALL operation and update the table column Status_flag to 'E' for the given item name and extension id. Below is the code snippet for the same.
declare
k NUMBER;
cursor c_list_price
IS
SELECT /*+ leading(a) */ item_name,
'PUBLISH_TO_PRICE_CATALOG' ATTRIBUTE_NAME,
MAX(DECODE (attribute_name,
'PUBLISH_TO_PRICE_CATALOG',
attribute_value))
ATTRIBUTE_VALUE,
NULL GEOGRAPHY_LEVEL,
NULL GEOGRAPHY_VALUE,
NULL INCLUDE_GEOGRAPHY,
NULL EFFECTIVE_START_DATE,
NULL EFFECTIVE_TO_DATE,
EXTENSION_ID,
NULL PRICING_UNIT,
NULL ATTRIBUTE_FROM,
NULL ATTRIBUTE_TO,
NULL FIXED_BASE_PRICE,
NULL PARENT_ATTRIBUTE,
NULL DURATION_QUANTITY,
NULL ATTRIBUTE2,
NULL ATTRIBUTE3,
NULL ATTRIBUTE4,
NULL ATTRIBUTE5,
NULL ATTRIBUTE6,
NULL ATTRIBUTE7,
NULL ATTRIBUTE8,
NULL ATTRIBUTE9,
NULL ATTRIBUTE10,
NULL ATTRIBUTE11,
NULL ATTRIBUTE12,
NULL ATTRIBUTE13,
NULL ATTRIBUTE14,
NULL ATTRIBUTE15,
--ORG_CODE,
ITEM_CATALOG_CATEGORY ITEM_CATEGORY,
a.INVENTORY_ITEM_ID INVENTORY_ITEM_ID
FROM XXCPD_ITM_SEC_UDA_DETAILS_TB a
WHERE DECODE(attribute_group_name,'LIST_PRICE',DECODE(STATUS_FLAG,'E',1,'P',2,3)
,'XXITM_FORM_PRICING_HS',DECODE(STATUS_FLAG,'E',4,'P',5,6)
,'XXITM_FORM_PRICING_SB',DECODE(STATUS_FLAG,'E',4,'P',5,6)
,'XXITM_ADV_FORM_PRICING_HS',DECODE(STATUS_FLAG,'E',7,'P',8,9)
,'XXITM_ADV_FORM_PRICING_SB',DECODE(STATUS_FLAG,'E',7,'P',8,9)
,'XXITM_ADV_FORM_PRICING_HWSW',DECODE(STATUS_FLAG,'E',10,'P',11,12)
,'XXITM_ADV_FORM_PRICING_SWSUB',DECODE(STATUS_FLAG,'E',10,'P',11,12)
,'XXITM_ROUTE_TO_MARKET',DECODE(STATUS_FLAG,'E',13,'P',14,15)) in (1,2,3)
AND exists(select /*+ no_merge */ '1' from mtl_system_items_b b where a.inventory_item_id=b.inventory_item_id and b.organization_id=1)
GROUP BY item_name,
extension_id,
ITEM_CATALOG_CATEGORY,
INVENTORY_ITEM_ID;
TYPE myarray IS TABLE OF c_list_price%ROWTYPE;
c_lp myarray;
BEGIN
OPEN c_list_price;
LOOP
FETCH c_list_price BULK COLLECT INTO c_lp
LIMIT 50000;
IF c_lp.count = 0 THEN
EXIT;
END IF;
Begin
FORALL i IN c_lp.FIRST..c_lp.LAST SAVE EXCEPTIONS
INSERT /*+append*/ INTO XXCPD_ITEM_PRICING_ATTR_BKP2(
line_id,
ITEM_NAME,
ATTRIBUTE_NAME,
ATTRIBUTE_VALUE,
GEOGRAPHY_LEVEL,
GEOGRAPHY_VALUE,
INCLUDE_GEOGRAPHY,
EFFECTIVE_START_DATE,
EFFECTIVE_TO_DATE,
EXTENSION_ID,
PRICING_UNIT,
ATTRIBUTE_FROM,
ATTRIBUTE_TO,
FIXED_BASE_PRICE,
PARENT_ATTRIBUTE,
DURATION_QUANTITY,
ATTRIBUTE2,
ATTRIBUTE3,
ATTRIBUTE4,
ATTRIBUTE5,
ATTRIBUTE6,
ATTRIBUTE7,
ATTRIBUTE8,
ATTRIBUTE9,
ATTRIBUTE10,
ATTRIBUTE11,
ATTRIBUTE12,
ATTRIBUTE13,
ATTRIBUTE14,
ATTRIBUTE15,
ITEM_CATEGORY,
INVENTORY_ITEM_ID)
VALUES (
xxcpd.xxcpd_if_prc_line_id_s.NEXTVAL
,c_lp(i).ITEM_NAME
,c_lp(i).ATTRIBUTE_NAME
,c_lp(i).ATTRIBUTE_VALUE
,c_lp(i).GEOGRAPHY_LEVEL
,c_lp(i).GEOGRAPHY_VALUE
,c_lp(i).INCLUDE_GEOGRAPHY
,c_lp(i).EFFECTIVE_START_DATE
,c_lp(i).EFFECTIVE_TO_DATE
,c_lp(i).EXTENSION_ID
,c_lp(i).PRICING_UNIT
,c_lp(i).ATTRIBUTE_FROM
,c_lp(i).ATTRIBUTE_TO
,c_lp(i).FIXED_BASE_PRICE
,c_lp(i).PARENT_ATTRIBUTE
,c_lp(i).DURATION_QUANTITY
,c_lp(i).ATTRIBUTE2
,c_lp(i).ATTRIBUTE3
,c_lp(i).ATTRIBUTE4
,c_lp(i).ATTRIBUTE5
,c_lp(i).ATTRIBUTE6
,c_lp(i).ATTRIBUTE7
,c_lp(i).ATTRIBUTE8
,c_lp(i).ATTRIBUTE9
,c_lp(i).ATTRIBUTE10
,c_lp(i).ATTRIBUTE11
,c_lp(i).ATTRIBUTE12
,c_lp(i).ATTRIBUTE13
,c_lp(i).ATTRIBUTE14
,c_lp(i).ATTRIBUTE15
,c_lp(i).ITEM_CATEGORY
,c_lp(i).INVENTORY_ITEM_ID);
EXCEPTION
WHEN OTHERS THEN
FOR j IN 1..SQL%bulk_exceptions.Count
LOOP
UPDATE XXCPD_ITM_SEC_UDA_DETAILS_TB
SET status_flag = 'E',
last_updated_by = 1,
last_update_date = SYSDATE
WHERE item_name = c_lp(SQL%BULK_EXCEPTIONS(j).ERROR_INDEX).item_name
AND extension_id=c_lp(SQL%BULK_EXCEPTIONS(j).ERROR_INDEX).extension_id;
COMMIT;
IF c_list_price%ISOPEN THEN
CLOSE c_list_price;
END IF;
FND_FILE.put_line(FND_FILE.output,'********************Exception Occured*************:-- '||SYSTIMESTAMP);
END LOOP;
END;
END LOOP;
CLOSE c_list_price;
COMMIT;
end;
and I am getting the following error:
ORA-06550: line 156, column 47:
PL/SQL: ORA-00911: invalid character
ORA-06550: line 152, column 21:
PL/SQL: SQL Statement ignored
pointing to the following lines in the exception block's update clause:
WHERE item_name = c_lp(SQL%BULK_EXCEPTIONS(j).ERROR_INDEX).item_name
AND extension_id = c_lp(SQL%BULK_EXCEPTIONS(j).ERROR_INDEX).extension_id;
Can someone please help me out with the issue?
Thanks in advance
Have re-written the code in the following manner:
declare
lv_ITEM_NAME DBMS_SQL.VARCHAR2S;
lv_ITEM_CATEGORY DBMS_SQL.VARCHAR2S;
lv_INVENTORY_ITEM_ID DBMS_SQL.NUMBER_TABLE;
lv_ATTRIBUTE_NAME DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE_VALUE DBMS_SQL.VARCHAR2S;
lv_GEOGRAPHY_LEVEL DBMS_SQL.VARCHAR2S;
lv_GEOGRAPHY_VALUE DBMS_SQL.VARCHAR2S;
lv_INCLUDE_GEOGRAPHY DBMS_SQL.VARCHAR2S;
lv_EFFECTIVE_START_DATE DBMS_SQL.date_table;
lv_EFFECTIVE_TO_DATE DBMS_SQL.date_table;
lv_EXTENSION_ID DBMS_SQL.NUMBER_TABLE;
lv_PRICING_UNIT DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE_FROM DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE_TO DBMS_SQL.VARCHAR2S;
lv_FIXED_BASE_PRICE DBMS_SQL.NUMBER_TABLE;
lv_PARENT_ATTRIBUTE DBMS_SQL.VARCHAR2S;
lv_DURATION_QUANTITY DBMS_SQL.NUMBER_TABLE;
lv_ATTRIBUTE2 DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE3 DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE4 DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE5 DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE6 DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE7 DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE8 DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE9 DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE10 DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE11 DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE12 DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE13 DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE14 DBMS_SQL.VARCHAR2S;
lv_ATTRIBUTE15 DBMS_SQL.VARCHAR2S;
l_item_name XXCPD_ITM_SEC_UDA_DETAILS_TB.item_name%TYPE;
l_extension_id XXCPD_ITM_SEC_UDA_DETAILS_TB.extension_id%TYPE;
cursor c_list_price
IS
SELECT /*+ leading(a) */ item_name,
'PUBLISH_TO_PRICE_CATALOG' ATTRIBUTE_NAME,
MAX(DECODE (attribute_name,
'PUBLISH_TO_PRICE_CATALOG',
attribute_value))
ATTRIBUTE_VALUE,
NULL GEOGRAPHY_LEVEL,
NULL GEOGRAPHY_VALUE,
NULL INCLUDE_GEOGRAPHY,
NULL EFFECTIVE_START_DATE,
NULL EFFECTIVE_TO_DATE,
EXTENSION_ID,
NULL PRICING_UNIT,
NULL ATTRIBUTE_FROM,
NULL ATTRIBUTE_TO,
NULL FIXED_BASE_PRICE,
NULL PARENT_ATTRIBUTE,
NULL DURATION_QUANTITY,
NULL ATTRIBUTE2,
NULL ATTRIBUTE3,
NULL ATTRIBUTE4,
NULL ATTRIBUTE5,
NULL ATTRIBUTE6,
NULL ATTRIBUTE7,
NULL ATTRIBUTE8,
NULL ATTRIBUTE9,
NULL ATTRIBUTE10,
NULL ATTRIBUTE11,
NULL ATTRIBUTE12,
NULL ATTRIBUTE13,
NULL ATTRIBUTE14,
NULL ATTRIBUTE15,
--ORG_CODE,
ITEM_CATALOG_CATEGORY ITEM_CATEGORY,
a.INVENTORY_ITEM_ID INVENTORY_ITEM_ID
FROM XXCPD_ITM_SEC_UDA_DETAILS_TB a
WHERE DECODE(attribute_group_name,'LIST_PRICE',DECODE(STATUS_FLAG,'E',1,'P',2,3)
,'XXITM_FORM_PRICING_HS',DECODE(STATUS_FLAG,'E',4,'P',5,6)
,'XXITM_FORM_PRICING_SB',DECODE(STATUS_FLAG,'E',4,'P',5,6)
,'XXITM_ADV_FORM_PRICING_HS',DECODE(STATUS_FLAG,'E',7,'P',8,9)
,'XXITM_ADV_FORM_PRICING_SB',DECODE(STATUS_FLAG,'E',7,'P',8,9)
,'XXITM_ADV_FORM_PRICING_HWSW',DECODE(STATUS_FLAG,'E',10,'P',11,12)
,'XXITM_ADV_FORM_PRICING_SWSUB',DECODE(STATUS_FLAG,'E',10,'P',11,12)
,'XXITM_ROUTE_TO_MARKET',DECODE(STATUS_FLAG,'E',13,'P',14,15)) in (1,2,3)
AND exists(select /*+ no_merge */ '1' from mtl_system_items_b b where a.inventory_item_id=b.inventory_item_id and b.organization_id=1)
GROUP BY item_name,
extension_id,
ITEM_CATALOG_CATEGORY,
INVENTORY_ITEM_ID;
BEGIN
OPEN c_list_price;
LOOP
lv_ITEM_NAME.delete;
lv_ITEM_CATEGORY.delete;
lv_INVENTORY_ITEM_ID.delete;
lv_ATTRIBUTE_NAME.delete;
lv_ATTRIBUTE_VALUE.delete;
lv_GEOGRAPHY_LEVEL.delete;
lv_GEOGRAPHY_VALUE.delete;
lv_INCLUDE_GEOGRAPHY.delete;
lv_EFFECTIVE_START_DATE.delete;
lv_EFFECTIVE_TO_DATE.delete;
lv_EXTENSION_ID.delete;
lv_PRICING_UNIT.delete;
lv_ATTRIBUTE_FROM.delete;
lv_ATTRIBUTE_TO.delete;
lv_FIXED_BASE_PRICE.delete;
lv_PARENT_ATTRIBUTE.delete;
lv_DURATION_QUANTITY.delete;
lv_ATTRIBUTE2.delete;
lv_ATTRIBUTE3.delete;
lv_ATTRIBUTE4.delete;
lv_ATTRIBUTE5.delete;
lv_ATTRIBUTE6.delete;
lv_ATTRIBUTE7.delete;
lv_ATTRIBUTE8.delete;
lv_ATTRIBUTE9.delete;
lv_ATTRIBUTE10.delete;
lv_ATTRIBUTE11.delete;
lv_ATTRIBUTE12.delete;
lv_ATTRIBUTE13.delete;
lv_ATTRIBUTE14.delete;
lv_ATTRIBUTE15.delete;
FETCH c_list_price BULK COLLECT INTO lv_ITEM_NAME
,lv_ATTRIBUTE_NAME
,lv_ATTRIBUTE_VALUE
,lv_GEOGRAPHY_LEVEL
,lv_GEOGRAPHY_VALUE
,lv_INCLUDE_GEOGRAPHY
,lv_EFFECTIVE_START_DATE
,lv_EFFECTIVE_TO_DATE
,lv_EXTENSION_ID
,lv_PRICING_UNIT
,lv_ATTRIBUTE_FROM
,lv_ATTRIBUTE_TO
,lv_FIXED_BASE_PRICE
,lv_PARENT_ATTRIBUTE
,lv_DURATION_QUANTITY
,lv_ATTRIBUTE2
,lv_ATTRIBUTE3
,lv_ATTRIBUTE4
,lv_ATTRIBUTE5
,lv_ATTRIBUTE6
,lv_ATTRIBUTE7
,lv_ATTRIBUTE8
,lv_ATTRIBUTE9
,lv_ATTRIBUTE10
,lv_ATTRIBUTE11
,lv_ATTRIBUTE12
,lv_ATTRIBUTE13
,lv_ATTRIBUTE14
,lv_ATTRIBUTE15
,lv_ITEM_CATEGORY
,lv_INVENTORY_ITEM_ID
LIMIT 50000;
IF lv_INVENTORY_ITEM_ID.count = 0 THEN
EXIT;
END IF;
Begin
FORALL I IN 1..lv_INVENTORY_ITEM_ID.count SAVE EXCEPTIONS
INSERT /*+append*/ INTO XXCPD_ITEM_PRICING_ATTR_BKP2(
line_id,
ITEM_NAME,
ATTRIBUTE_NAME,
ATTRIBUTE_VALUE,
GEOGRAPHY_LEVEL,
GEOGRAPHY_VALUE,
INCLUDE_GEOGRAPHY,
EFFECTIVE_START_DATE,
EFFECTIVE_TO_DATE,
EXTENSION_ID,
PRICING_UNIT,
ATTRIBUTE_FROM,
ATTRIBUTE_TO,
FIXED_BASE_PRICE,
PARENT_ATTRIBUTE,
DURATION_QUANTITY,
ATTRIBUTE2,
ATTRIBUTE3,
ATTRIBUTE4,
ATTRIBUTE5,
ATTRIBUTE6,
ATTRIBUTE7,
ATTRIBUTE8,
ATTRIBUTE9,
ATTRIBUTE10,
ATTRIBUTE11,
ATTRIBUTE12,
ATTRIBUTE13,
ATTRIBUTE14,
ATTRIBUTE15,
ITEM_CATEGORY,
INVENTORY_ITEM_ID)
VALUES (
xxcpd.xxcpd_if_prc_line_id_s.NEXTVAL
,lv_ITEM_NAME(i)
,lv_ATTRIBUTE_NAME(i)
,lv_ATTRIBUTE_VALUE(i)
,lv_GEOGRAPHY_LEVEL(i)
,lv_GEOGRAPHY_VALUE(i)
,lv_INCLUDE_GEOGRAPHY(i)
,lv_EFFECTIVE_START_DATE(i)
,lv_EFFECTIVE_TO_DATE(i)
,lv_EXTENSION_ID(i)
,lv_PRICING_UNIT(i)
,lv_ATTRIBUTE_FROM(i)
,lv_ATTRIBUTE_TO(i)
,lv_FIXED_BASE_PRICE(i)
,lv_PARENT_ATTRIBUTE(i)
,lv_DURATION_QUANTITY(i)
,lv_ATTRIBUTE2(i)
,lv_ATTRIBUTE3(i)
,lv_ATTRIBUTE4(i)
,lv_ATTRIBUTE5(i)
,lv_ATTRIBUTE6(i)
,lv_ATTRIBUTE7(i)
,lv_ATTRIBUTE8(i)
,lv_ATTRIBUTE9(i)
,lv_ATTRIBUTE10(i)
,lv_ATTRIBUTE11(i)
,lv_ATTRIBUTE12(i)
,lv_ATTRIBUTE13(i)
,lv_ATTRIBUTE14(i)
,lv_ATTRIBUTE15(i)
,lv_ITEM_CATEGORY(i)
,lv_INVENTORY_ITEM_ID(i));
EXCEPTION
WHEN OTHERS THEN
ROLLBACK;
FOR j IN 1..SQL%bulk_exceptions.Count
LOOP
l_item_name:=lv_ITEM_NAME(SQL%BULK_EXCEPTIONS(j).ERROR_INDEX);
l_extension_id:=lv_EXTENSION_ID(SQL%BULK_EXCEPTIONS(j).ERROR_INDEX);
UPDATE XXCPD_ITM_SEC_UDA_DETAILS_TB
SET status_flag = 'E',
last_updated_by = 1,
last_update_date = SYSDATE
WHERE item_name = l_item_name
AND extension_id=l_extension_id;
COMMIT;
FND_FILE.put_line(FND_FILE.output,'********************Exception Occured*************:-- '||SYSTIMESTAMP);
END LOOP;
END;
END LOOP;
CLOSE c_list_price;
COMMIT;
end; -
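A note on the exception handlers in both versions above: FORALL ... SAVE EXCEPTIONS reports its accumulated failures as a single ORA-24381, so catching that error specifically (rather than WHEN OTHERS) makes the intent explicit and avoids swallowing unrelated errors. A minimal hedged sketch of the pattern, with a hypothetical target table:

```sql
DECLARE
  bulk_errors EXCEPTION;
  PRAGMA EXCEPTION_INIT(bulk_errors, -24381);  -- ORA-24381: error(s) in array DML
  TYPE t_ids IS TABLE OF NUMBER;
  l_ids t_ids := t_ids(1, 2, 3);               -- hypothetical key values
BEGIN
  FORALL i IN 1 .. l_ids.COUNT SAVE EXCEPTIONS
    INSERT INTO target_tab (id) VALUES (l_ids(i));   -- hypothetical target
EXCEPTION
  WHEN bulk_errors THEN
    -- SQL%BULK_EXCEPTIONS holds one entry per failed row.
    FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
      DBMS_OUTPUT.PUT_LINE(
        'Row '      || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX ||
        ' failed: ' || SQLERRM(-SQL%BULK_EXCEPTIONS(j).ERROR_CODE));
    END LOOP;
END;
/
```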
Bulk collect forall vs single merge statement
I understand that a single DML statement is better than using BULK COLLECT/FORALL with intermediate commits. My only concern is that if I'm loading a large amount of data, say 100 million records into an 800-million-record table with foreign keys and indexes, and the session gets killed, the rollback might take an unacceptably long time. BULK COLLECT/FORALL with interval commits is slower than a single straight MERGE statement, but in the case of a dead session the rollback won't be as bad, and reloading the not-yet-committed data won't be as bad either. To design a recoverable data load that isn't affected as badly, is BULK COLLECT + FORALL the right approach?
1. specifics about the actual data available
2. the location/source of the data
3. whether NOLOGGING is appropriate
4. whether PARALLEL is an option
1. I need to transform the data before, so I can build the staging tables to match to be the same structure as the tables I'm loading to.
2. It's in the same database (11.2)
3. Cannot use NOLOGGING or APPEND because I need to allow DML in the target table and I can't use NOLOGGING because I cannot afford to lose the data in case of failure.
4. PARALLEL is an option. I've done some research on DBMS_PARALLEL_EXECUTE and it sounds very cool. Can this be used to load two tables? I have parent and child tables. I can chunk the data and load these two tables separately, but the one requirement is that I need to commit them together: I cannot load a chunk into the parent table and commit before I load the corresponding chunk into its child table. Can this be done using DBMS_PARALLEL_EXECUTE? If so, I think this would be the perfect solution, since it looks like exactly what I'm looking for. If this doesn't work, is BULK COLLECT + FORALL the best option I am left with?
What is the underlying technology of DBMS_PARALLEL_EXECUTE? -
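To make the DBMS_PARALLEL_EXECUTE discussion concrete, here is a hedged sketch of rowid-chunked execution (the task, source, and target names are hypothetical; under the covers the package drives the chunks through the database job scheduler):

```sql
DECLARE
  l_task VARCHAR2(30) := 'load_products_task';     -- hypothetical task name
BEGIN
  DBMS_PARALLEL_EXECUTE.create_task(l_task);

  -- Split the source table into rowid ranges of roughly 10,000 rows each.
  DBMS_PARALLEL_EXECUTE.create_chunks_by_rowid(
    task_name   => l_task,
    table_owner => USER,
    table_name  => 'SOURCE_TAB',                   -- hypothetical source
    by_row      => TRUE,
    chunk_size  => 10000);

  -- Each chunk runs and commits independently; :start_id and :end_id are
  -- bound by the framework with the chunk's rowid boundaries.
  DBMS_PARALLEL_EXECUTE.run_task(
    task_name      => l_task,
    sql_stmt       => 'INSERT INTO target_tab
                         SELECT * FROM source_tab
                          WHERE rowid BETWEEN :start_id AND :end_id',
    language_flag  => DBMS_SQL.NATIVE,
    parallel_level => 4);

  DBMS_PARALLEL_EXECUTE.drop_task(l_task);
END;
/
```

Note that each chunk commits on its own, so the "commit parent and child together" requirement would need both inserts done inside one chunk statement, for example by having the chunk SQL call a procedure that inserts into both tables before its commit.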
Bulk collect / forall which collection type?
Hi I am trying to speed up the query below using bulk collect / forall:
SELECT h.cust_order_no AS custord, l.shipment_set AS sset
FROM info.tlp_out_messaging_hdr h, info.tlp_out_messaging_lin l
WHERE h.message_id = l.message_id
AND h.contract = '12384'
AND l.shipment_set IS NOT NULL
AND h.cust_order_no IS NOT NULL
GROUP BY h.cust_order_no, l.shipment_set
I would like to extract the two fields selected above into a new table as quickly as possible, but I'm fairly new to Oracle and I'm finding it difficult to sort out the best way to do it. The query below does not work (no doubt there are numerous issues) but hopefully it is sufficiently developed to show the sort of thing I am trying to achieve:
DECLARE
TYPE xcustord IS TABLE OF info.tlp_out_messaging_hdr.cust_order_no%TYPE;
TYPE xsset IS TABLE OF info.tlp_out_messaging_lin.shipment_set%TYPE;
TYPE xarray IS TABLE OF tp_a1_tab%rowtype INDEX BY BINARY_INTEGER;
v_xarray xarray;
v_xcustord xcustord;
v_xsset xsset;
CURSOR cur IS
SELECT h.cust_order_no AS custord, l.shipment_set AS sset
FROM info.tlp_out_messaging_hdr h, info.tlp_out_messaging_lin l
WHERE h.message_id = l.message_id
AND h.contract = '1111'
AND l.shipment_set IS NOT NULL
AND h.cust_order_no IS NOT NULL;
BEGIN
OPEN cur;
LOOP
FETCH cur
BULK COLLECT INTO v_xarray LIMIT 10000;
EXIT WHEN v_xcustord.COUNT() = 0;
FORALL i IN 1 .. v_xarray.COUNT
INSERT INTO TP_A1_TAB (cust_order_no, shipment_set)
VALUES (v_xarray(i).cust_order_no,v_xarray(i).shipment_set);
commit;
END LOOP;
CLOSE cur;
END;
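For a straight copy of two columns into a new table, a single set-based statement usually outperforms any BULK COLLECT/FORALL loop. A hedged sketch built from the poster's own query (assuming TP_A1_TAB has just these two columns):

```sql
-- Single-statement alternative: no PL/SQL loop, one commit at the end.
INSERT INTO tp_a1_tab (cust_order_no, shipment_set)
SELECT h.cust_order_no, l.shipment_set
  FROM info.tlp_out_messaging_hdr h,
       info.tlp_out_messaging_lin l
 WHERE h.message_id = l.message_id
   AND h.contract = '1111'
   AND l.shipment_set IS NOT NULL
   AND h.cust_order_no IS NOT NULL
 GROUP BY h.cust_order_no, l.shipment_set;  -- GROUP BY keeps the pairs distinct

COMMIT;
```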
I am running on Oracle 9i Release 2.
Well, I suppose I can share the whole query as it stands for context and information (see below).
This is a very ugly piece of code that I am trying to improve. The advantage it currently has is that it works; the disadvantage is that it's very slow. My thoughts were that BULK COLLECT and FORALL might be useful tools here as part of the re-write, hence my original question, but perhaps not.
So, on a more general note, any advice on how best to speed up this code would be welcome:
CREATE OR REPLACE VIEW DLP_TEST AS
WITH aa AS
(SELECT h.cust_order_no AS c_ref,l.shipment_set AS shipset,
l.line_id, h.message_id, l.message_line_no,
l.vendor_part_no AS part, l.rqst_quantity AS rqst_qty, l.quantity AS qty,
l.status AS status, h.rowversion AS allocation_date
FROM info.tlp_in_messaging_hdr h
LEFT JOIN info.tlp_in_messaging_lin l
ON h.message_id = l.message_id
WHERE h.contract = '12384'
AND h.cust_order_no IS NOT NULL
UNION ALL
SELECT ho.cust_order_no AS c_ref, lo.shipment_set AS shipset,
lo.line_id, ho.message_id, lo.message_line_no,
lo.vendor_part_no AS part,lo.rqst_quantity AS rqst_qty, lo.quantity AS qty,
lo.status AS status, ho.rowversion AS allocation_date
FROM info.tlp_out_messaging_hdr ho, info.tlp_out_messaging_lin lo
WHERE ho.message_id = lo.message_id
AND ho.contract = '12384'
AND ho.cust_order_no IS NOT NULL),
a1 AS
(SELECT h.cust_order_no AS custord, l.shipment_set AS sset
FROM info.tlp_out_messaging_hdr h, info.tlp_out_messaging_lin l
WHERE h.message_id = l.message_id
AND h.contract = '12384'
AND l.shipment_set IS NOT NULL
AND h.cust_order_no IS NOT NULL
GROUP BY h.cust_order_no, l.shipment_set),
a2 AS
(SELECT ho.cust_order_no AS c_ref, lo.shipment_set AS shipset
FROM info.tlp_out_messaging_hdr ho, info.tlp_out_messaging_lin lo
WHERE ho.message_id = lo.message_id
AND ho.contract = '12384'
AND ho.message_type = '3B13'
AND lo.shipment_set IS NOT NULL
GROUP BY ho.cust_order_no, lo.shipment_set),
a3 AS
(SELECT a1.custord, a1.sset, CONCAT('SHIPSET',a1.sset) AS ssset
FROM a1
LEFT OUTER JOIN a2
ON a1.custord = a2.c_ref AND a1.sset = a2.shipset
WHERE a2.c_ref IS NULL),
bb AS
(SELECT so.contract, so.order_no, sr.service_request_no AS sr_no, sr.reference_no,
substr(sr.part_no,8) AS shipset,
substr(note_text,1,instr(note_text,'.',1,1)-1) AS Major_line,
note_text AS CISCO_line,ma.part_no,
(Select TO_DATE(TO_CHAR(TO_DATE(d.objversion,'YYYYMMDDHH24MISS'),'YYYY-MM-DD HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS')
FROM ifsapp.document_text d WHERE so.note_id = d.note_id AND so.contract = '12384') AS Print_Date,
ma.note_text
FROM (ifsapp.service_request sr
LEFT OUTER JOIN ifsapp.work_order_shop_ord ws
ON sr.service_request_no = ws.wo_no
LEFT OUTER JOIN ifsapp.shop_ord so
ON ws.order_no = so.order_no)
LEFT OUTER JOIN ifsapp.customer_order co
ON sr.reference_no = co.cust_ref
LEFT OUTER JOIN ifsapp.shop_material_alloc ma
ON so.order_no = ma.order_no
JOIN a3
ON a3.custord = sr.reference_no AND a3.sset = substr(sr.part_no,8)
WHERE sr.part_contract = '12384'
AND so.contract = '12384'
AND co.contract = '12384'
AND sr.reference_no IS NOT NULL
AND ma.part_no NOT LIKE 'SHIPSET%'),
cc AS
(SELECT
bb.reference_no,
bb.shipset,
bb.order_no,
bb.cisco_line,
aa.message_id,
aa.allocation_date,
row_number() over(PARTITION BY bb.reference_no, bb.shipset, aa.allocation_date
ORDER BY bb.reference_no, bb.shipset, aa.allocation_date, aa.message_id DESC) AS selector
FROM bb
LEFT OUTER JOIN aa
ON bb.reference_no = aa.c_ref AND bb.shipset = aa.shipset
WHERE aa.allocation_date <= bb.print_date
OR aa.allocation_date IS NULL
OR bb.print_date IS NULL),
dd AS
(SELECT
MAX(reference_no) AS reference_no,
MAX(shipset) AS shipset,
order_no,
MAX(allocation_date) AS allocation_date
FROM cc
WHERE selector = 1
GROUP BY order_no, selector),
ee AS
(SELECT
smx.order_no,
SUM(smx.qty_assigned) AS total_allocated,
SUM(smx.qty_issued) AS total_issued,
SUM(smx.qty_required) AS total_required
FROM ifsapp.shop_material_alloc smx
WHERE smx.contract = '12384'
AND smx.part_no NOT LIKE 'SHIPSET%'
GROUP BY smx.order_no),
ff AS
(SELECT
dd.reference_no,
dd.shipset,
dd.order_no,
MAX(allocation_date) AS last_allocation,
MAX(ee.total_allocated) AS total_allocated,
MAX(ee.total_issued) AS total_issued,
MAX(ee.total_required) AS total_required
FROM dd
LEFT OUTER JOIN ee
ON dd.order_no = ee.order_no
GROUP BY dd.reference_no, dd.shipset, dd.order_no),
base AS
(SELECT x.order_no, x.part_no, z.rel_no, MIN(x.dated) AS dated, MIN(y.cust_ref) AS cust_ref, MIN(z.line_no) AS line_no,
MIN(y.state) AS state, MIN(y.contract) AS contract, MIN(z.demand_order_ref1) AS demand_order_ref1
FROM ifsapp.inventory_transaction_hist x, ifsapp.customer_order y, ifsapp.customer_order_line z
WHERE x.contract = '12384'
AND x.order_no = y.order_no
AND y.order_no = z.order_no
AND x.transaction = 'REP-OESHIP'
AND x.part_no = z.part_no
AND TRUNC(x.dated) >= SYSDATE - 8
GROUP BY x.order_no, x.part_no, z.rel_no)
SELECT
DISTINCT
bb.contract,
bb.order_no,
bb.sr_no,
CAST('-' AS varchar2(40)) AS Usr,
CAST('-' AS varchar2(40)) AS Name,
CAST('01' AS number) AS Operation,
CAST('Last Reservation' AS varchar2(40)) AS Action,
CAST('-' AS varchar2(40)) AS Workcenter,
CAST('-' AS varchar2(40)) AS Next_Workcenter_no,
CAST('Print SO' AS varchar2(40)) AS Next_WC_Description,
ff.total_allocated,
ff.total_issued,
ff.total_required,
ff.shipset,
ff.last_allocation AS Action_date,
ff.reference_no
FROM ff
LEFT OUTER JOIN bb
ON bb.order_no = ff.order_no
WHERE bb.order_no IS NOT NULL
UNION ALL
SELECT
c.contract AS Site, c.order_no AS Shop_Order, b.wo_no AS SR_No,
CAST('-' AS varchar2(40)) AS UserID,
CAST('-' AS varchar2(40)) AS User_Name,
CAST('02' AS number) AS Operation,
CAST('SO Printed' AS varchar2(40)) AS Action,
CAST('SOPRINT' AS varchar2(40)) AS Workcenter,
CAST('PKRPT' AS varchar2(40)) AS Next_Workcenter_no,
CAST('Pickreport' AS varchar2(40)) AS Next_WC_Description,
CAST('0' AS number) AS Total_Allocated,
CAST('0' AS number) AS Total_Issued,
CAST('0' AS number) AS Total_Required,
e.part_no AS Ship_Set,
TO_DATE(TO_CHAR(TO_DATE(d.objversion,'YYYYMMDDHH24MISS'),'YYYY-MM-DD HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS')AS Action_Date,
f.cust_ref AS cust_ref
FROM ifsapp.shop_ord c
LEFT OUTER JOIN ifsapp.work_order_shop_ord b
ON b.order_no = c.order_no
LEFT OUTER JOIN ifsapp.document_text d
ON d.note_id = c.note_id
LEFT OUTER JOIN ifsapp.customer_order_line e
ON e.demand_order_ref1 = TRIM(to_char(b.wo_no))
LEFT OUTER JOIN ifsapp.customer_order f
ON f.order_no = e.order_no
JOIN a3
ON a3.custord = f.cust_ref AND a3.ssset = e.part_no
WHERE c.contract = '12384'
AND e.contract = '12384'
AND d.objversion IS NOT NULL
UNION ALL
SELECT
a.site AS Site, a.order_no AS Shop_Order, b.wo_no AS SR_No, a.userid AS UserID,
a.user_name AS Name, a.operation_no AS Operation, a.action AS Action,
a.workcenter_no AS Workcenter, a.next_work_center_no AS Next_Workcenter_no,
(SELECT d.description FROM ifsapp.work_center d WHERE a.next_work_center_no = d.work_center_no AND a.site = d.contract)
AS Next_WC_Description,
CAST('0' AS number) AS Total_Allocated,
CAST('0' AS number) AS Total_Issued,
CAST('0' AS number) AS Total_Required,
e.part_no AS Ship_set,
a.action_date AS Action_Date, f.cust_ref AS cust_ref
FROM ifsapp.shop_ord c
LEFT OUTER JOIN ifsapp.work_order_shop_ord b
ON b.order_no = c.order_no
LEFT OUTER JOIN ifsapp.customer_order_line e
ON e.demand_order_ref1 = to_char(b.wo_no)
LEFT OUTER JOIN ifsapp.customer_order f
ON f.order_no = e.order_no
LEFT OUTER JOIN info.tp_hvt_so_op_hist a
ON a.order_no = c.order_no
JOIN a3
ON a3.custord = f.cust_ref AND a3.ssset = e.part_no
WHERE a.site = '12384'
AND c.contract = '12384'
AND e.contract = '12384'
AND f.contract = '12384'
UNION ALL
SELECT so.contract AS Site, so.order_no AS Shop_Order_No, sr.service_request_no AS SR_No,
CAST('-' AS varchar2(40)) AS "User",
CAST('-' AS varchar2(40)) AS "Name",
CAST('999' AS number) AS "Operation",
CAST('Shipped' AS varchar2(40)) AS "Action",
CAST('SHIP' AS varchar2(40)) AS "Workcenter",
CAST('-' AS varchar2(40)) AS "Next_Workcenter_no",
CAST('-' AS varchar2(40)) AS "Next_WC_Description",
CAST('0' AS number) AS Total_Allocated,
CAST('0' AS number) AS Total_Issued,
CAST('0' AS number) AS Total_Required,
so.part_no AS ship_set, base.dated AS Action_Date,
sr.reference_no AS CUST_REF
FROM base
LEFT OUTER JOIN ifsapp.service_request sr
ON base.demand_order_ref1 = sr.service_request_no
LEFT OUTER JOIN ifsapp.work_order_shop_ord ws
ON sr.service_request_no = ws.wo_no
LEFT OUTER JOIN ifsapp.shop_ord so
ON ws.order_no = so.order_no
WHERE base.contract = '12384'; -
How to use BULK COLLECT, FORALL and TREAT
There is a need to read, match and update data from and into a custom table. The table has about 3 million rows and holds key numbers. Based on a field value of this custom table, relevant data needs to be fetched from joins of other tables and updated in the custom table. I plan to use BULK COLLECT and FORALL.
All examples I have seen do an insert into a table. How do I go about reading all values of a given field, fetching other relevant data, and then updating the custom table with the data fetched?
I defined an object type like this:
CREATE OR REPLACE TYPE imei_ot AS OBJECT (
recid NUMBER,
imei VARCHAR2(30),
STORE VARCHAR2(100),
status VARCHAR2(1),
TIMESTAMP DATE,
order_number VARCHAR2(30),
order_type VARCHAR2(30),
sku VARCHAR2(30),
order_date DATE,
attribute1 VARCHAR2(240),
market VARCHAR2(240),
processed_flag VARCHAR2(1),
last_update_date DATE
);
Now within a package procedure I have defined like this:
type imei_ott is table of imei_ot;
imei_ntt imei_ott;
begin
SELECT imei_ot (recid,
imei,
STORE,
status,
TIMESTAMP,
order_number,
order_type,
sku,
order_date,
attribute1,
market,
processed_flag,
last_update_date)
BULK COLLECT INTO imei_ntt
FROM (SELECT stg.recid, stg.imei, cip.store_location, 'S',
co.rtl_txn_timestamp, co.rtl_order_number, 'CUST',
msi.segment1 || '.' || msi.segment3,
TRUNC (co.txn_timestamp), col.part_number, 'ZZ',
stg.processed_flag, SYSDATE
FROM custom_orders co,
custom_order_lines col,
custom_stg stg,
mtl_system_items_b msi
WHERE co.header_id = col.header_id
AND msi.inventory_item_id = col.inventory_item_id
AND msi.organization_id =
(SELECT organization_id
FROM hr_all_organization_units_tl
WHERE NAME = 'Item Master'
AND source_lang = USERENV ('LANG'))
AND stg.imei = col.serial_number
AND stg.processed_flag = 'U');
/* Update staging table in one go for COR order data */
FORALL indx IN 1 .. imei_ntt.COUNT
UPDATE custom_stg
SET STORE = TREAT (imei_ntt (indx) AS imei_ot).STORE,
status = TREAT (imei_ntt (indx) AS imei_ot).status,
TIMESTAMP = TREAT (imei_ntt (indx) AS imei_ot).TIMESTAMP,
order_number = TREAT (imei_ntt (indx) AS imei_ot).order_number,
order_type = TREAT (imei_ntt (indx) AS imei_ot).order_type,
sku = TREAT (imei_ntt (indx) AS imei_ot).sku,
order_date = TREAT (imei_ntt (indx) AS imei_ot).order_date,
attribute1 = TREAT (imei_ntt (indx) AS imei_ot).attribute1,
market = TREAT (imei_ntt (indx) AS imei_ot).market,
processed_flag =
TREAT (imei_ntt (indx) AS imei_ot).processed_flag,
last_update_date =
TREAT (imei_ntt (indx) AS imei_ot).last_update_date
WHERE recid = TREAT (imei_ntt (indx) AS imei_ot).recid
AND imei = TREAT (imei_ntt (indx) AS imei_ot).imei;
DBMS_OUTPUT.put_line ( TO_CHAR (SQL%ROWCOUNT)
|| ' rows updated using Bulk Collect / For All.');
EXCEPTION
WHEN NO_DATA_FOUND
THEN
DBMS_OUTPUT.put_line ('No Data: ' || SQLERRM);
WHEN OTHERS
THEN
DBMS_OUTPUT.put_line ('Other Error: ' || SQLERRM);
END;
Now for the unfortunate part. When I compile the pkg, I face an error
PL/SQL: ORA-00904: "LAST_UPDATE_DATE": invalid identifier
I am not sure where I am wrong. Object type has the last update date field and the custom table also has the same field.
Could someone please throw some light and suggestion?
Thanks
uds
I suspect your error comes from the »bulk collect into« and not from the »forall« loop.
At first glance you need to alias SYSDATE as last_update_date, and some of the other select items need to be aliased as well.
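To see why the alias matters: the outer SELECT imei_ot(..., last_update_date) refers to column names of the inline view, so every expression must carry the name the constructor expects. A cut-down sketch (demo_ot and l_tab are illustrative names, not the poster's full imei_ot):

```sql
-- Hypothetical two-column type, just to isolate the aliasing point.
CREATE OR REPLACE TYPE demo_ot AS OBJECT (recid NUMBER, last_update_date DATE);
/
DECLARE
   TYPE demo_ott IS TABLE OF demo_ot;
   l_tab demo_ott;
BEGIN
   SELECT demo_ot (recid, last_update_date)
     BULK COLLECT INTO l_tab
     FROM (SELECT stg.recid AS recid,
                  SYSDATE   AS last_update_date   -- alias gives SYSDATE the expected name
             FROM custom_stg stg
            WHERE stg.processed_flag = 'U');
END;
/
```

Without the AS last_update_date alias, the expression has no such name and the outer reference raises ORA-00904.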
But a simplified version would be
select imei_ot (stg.recid,
stg.imei,
cip.store_location,
'S',
co.rtl_txn_timestamp,
co.rtl_order_number,
'CUST',
msi.segment1 || '.' || msi.segment3,
trunc (co.txn_timestamp),
col.part_number,
'ZZ',
stg.processed_flag,
sysdate)
bulk collect into imei_ntt
from custom_orders co,
custom_order_lines col,
custom_stg stg,
mtl_system_items_b msi
where co.header_id = col.header_id
and msi.inventory_item_id = col.inventory_item_id
and msi.organization_id =
(select organization_id
from hr_all_organization_units_tl
where name = 'Item Master' and source_lang = userenv ('LANG'))
and stg.imei = col.serial_number
and stg.processed_flag = 'U';
... -
Where might I be going wrong with BULK COLLECT / FORALL?
DECLARE
CURSOR Cur_sub_rp IS
SELECT A.SUB_ACCOUNT, B.RATE_PLAN,B.SUB_LAST_NAME,A.SUB_SSN
FROM STG_SUB_MASTER_MONTH_HISTORY A, SUB_PHONE_RATEPLAN B
WHERE A.SUB_ACCOUNT = B.SUB_ACCOUNT (+)
AND A.MONTH_ID = B.MONTH_ID ;
TYPE t_values_tab IS TABLE OF cur_sub_rp%rowtype index by binary_integer;
values_tab t_values_tab ;
BEGIN
OPEN Cur_sub_rp ;
LOOP
FETCH Cur_sub_rp BULK COLLECT INTO Values_tab
LIMIT 1000;
EXIT WHEN Cur_sub_rp%NOTFOUND ;
END LOOP ;
CLOSE Cur_sub_rp;
FORALL i IN VALUES_TAB.first..values_tab.last
INSERT INTO SUB_PHN_1 VALUES VALUES_TAB(i);
END;
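As an aside, the FORALL above sits outside the fetch loop: each batch fetched overwrites the previous one, and EXIT WHEN %NOTFOUND fires on the final (possibly partial) batch before anything is inserted. A sketch of the corrected shape, reusing the same cursor and collection (SUB_PHN_1 is assumed to match the cursor's column list):

```sql
OPEN Cur_sub_rp;
LOOP
   FETCH Cur_sub_rp BULK COLLECT INTO values_tab LIMIT 1000;

   FORALL i IN 1 .. values_tab.COUNT            -- insert each batch before testing %NOTFOUND
      INSERT INTO sub_phn_1 VALUES values_tab(i);

   EXIT WHEN Cur_sub_rp%NOTFOUND;
END LOOP;
CLOSE Cur_sub_rp;
COMMIT;
```

Testing %NOTFOUND only after the FORALL guarantees the last partial batch is not discarded.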
When I select records from sub_phn_1 it shows "no rows selected".
I have a working example which is close to your script (it was already posted before):
--Create some data
DROP TABLE emp;
DROP TABLE dept;
CREATE TABLE emp
(empno NUMBER(4) NOT NULL,
ename VARCHAR(10),
job VARCHAR(9),
mgr NUMBER(4),
hiredate DATE,
sal NUMBER(7, 2),
comm NUMBER(7, 2),
deptno NUMBER(2));
INSERT INTO emp
VALUES (7369, 'SMITH', 'CLERK', 7902, '17-DEC-1980', 800, NULL, 20);
INSERT INTO emp
VALUES (7499, 'ALLEN', 'SALESMAN', 7698, '20-FEB-1981', 1600, 300, 30);
INSERT INTO emp
VALUES (7521, 'WARD', 'SALESMAN', 7698, '22-FEB-1981', 1250, 500, 30);
INSERT INTO emp
VALUES (7566, 'JONES', 'MANAGER', 7839, '2-APR-1981', 2975, NULL, 20);
CREATE TABLE dept
(deptno NUMBER(2),
dname VARCHAR(14),
loc VARCHAR(13) );
INSERT INTO dept
VALUES (20, 'RESEARCH', 'DALLAS');
INSERT INTO dept
VALUES (30, 'SALES', 'CHICAGO');
COMMIT;
DECLARE
CURSOR c1
IS
(SELECT e.*, d.dname
FROM emp e JOIN dept d ON d.deptno = e.deptno);
TYPE c1_tbl_typ IS TABLE OF c1%ROWTYPE
INDEX BY PLS_INTEGER;
emp_tbl c1_tbl_typ;
BEGIN
OPEN c1;
FETCH c1
BULK COLLECT INTO emp_tbl;
CLOSE c1;
--Test emp_tbl
FOR i IN 1 .. emp_tbl.COUNT
LOOP
DBMS_OUTPUT.put_line (emp_tbl (i).empno || ', ' || emp_tbl (i).dname);
NULL;
END LOOP;
END;
--or :
DECLARE
CURSOR c1
IS
(SELECT *
FROM emp
WHERE deptno = 20);
TYPE c1_tbl_typ IS TABLE OF c1%ROWTYPE
INDEX BY PLS_INTEGER;
emp_tbl c1_tbl_typ;
BEGIN
OPEN c1;
LOOP
FETCH c1
BULK COLLECT INTO emp_tbl LIMIT 100;
FORALL i IN 1 .. emp_tbl.COUNT
INSERT INTO emp_1
VALUES emp_tbl (i);
EXIT WHEN c1%NOTFOUND;
END LOOP;
CLOSE c1;
END;
SELECT *
FROM emp_1;
--Additionally, What if your select does not return records? -
Hi,
I have been using BULK COLLECT with FORALL to insert into a partitioned table which has a unique constraint on two columns, but the partitioned table acquires a lock every time I run the procedure, and the lock doesn't get released even when the session is killed. The number of records I am trying to insert with duplicate values is about 2000.
Can anyone suggest what could be the problem?
Thanks in advance
Hi,
The Code is
BEGIN
strQuery := ' SELECT C1, C2, C3, C4, C5, C6, C7, C8, C9, C10 ';
strQuery := strQuery ||' FROM TEMP';
strQuery := strQuery ||' WHERE C3 = ''R'' AND C4 = ''' || Bid;
strQuery := strQuery || ''' AND TO_CHAR(C8,''DDMMYYYY'') = ''' || CallDate||'''';
EXECUTE IMMEDIATE strQuery BULK COLLECT INTO V_C1, V_C2, V_C3, V_C4, V_C5, V_C6, V_C7, V_C8, V_C9, V_10 ;
Count1 := SQL%rowcount;
strQuery := 'INSERT INTO '|| PartitionTabName ||' (T1, T2, T3, T4, T5, T6, T7, T8, T9, T10) VALUES (';
strQuery := strQuery ||':a1, :a2 , :a3 , :a4 , :a5 , :a6 , :a7 , :a8 , :a9 , :a10 )';
FORALL j IN 1.. Count1 save EXCEPTIONS
EXECUTE IMMEDIATE strQuery USING V_C1(j) , V_C2(j) , V_C3(j) , V_C4(j) , V_C5(j) , V_C6(j) , V_C7(j) , V_C8(j) , V_C9(j) , V_C10(j);
COMMIT;
EXCEPTION
WHEN OTHERS THEN
Count1 := SQL%BULK_EXCEPTIONS.COUNT;
DBMS_OUTPUT.PUT_LINE('ERROR '||Count1);
COMMIT;
END;
Temp table has composite unique key constraint on PartitionTabName table are T1, T2, T3,T4,T5. -
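When a FORALL ... SAVE EXCEPTIONS statement hits errors, it raises ORA-24381 and records per-row details in SQL%BULK_EXCEPTIONS; a WHEN OTHERS that only prints the count hides which rows violated the unique constraint. A sketch of a more informative handler, reusing the poster's variables but trimmed to two binds:

```sql
DECLARE
   bulk_errors EXCEPTION;
   PRAGMA EXCEPTION_INIT (bulk_errors, -24381);
BEGIN
   FORALL j IN 1 .. Count1 SAVE EXCEPTIONS
      EXECUTE IMMEDIATE strQuery USING V_C1(j), V_C2(j);
EXCEPTION
   WHEN bulk_errors THEN
      FOR i IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
         DBMS_OUTPUT.put_line (
              'Row '      || SQL%BULK_EXCEPTIONS(i).ERROR_INDEX
           || ' failed: ' || SQLERRM (-SQL%BULK_EXCEPTIONS(i).ERROR_CODE));
      END LOOP;
END;
```

ERROR_INDEX points back into the source collections, so the offending duplicate values can be printed or logged directly.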
Bulk collect & forall error in script
hi,
I have a scenario as below in my code, but I am getting the error "too many values for emp1". I know why I am getting the error, but I need to insert the EMP columns "EMPNO, ENAME" into table EMP1 and
the EMP columns "EMPNO, ENAME, SAL" into table EMP2.
How do I do it? Should I declare 2 different cursors, one having only "EMPNO, ENAME" and a second having "EMPNO, ENAME, SAL"?
Or is there any other, better way to do this? Please help.
DECLARE
CURSOR s_cur1
IS SELECT EMPNO, ENAME, SAL
FROM EMP E;
TYPE fetch_array1 IS TABLE OF s_cur1%ROWTYPE;
s_array1 fetch_array1;
BEGIN
OPEN s_cur1;
LOOP
FETCH s_cur1 BULK COLLECT INTO s_array1 LIMIT 1000;
FORALL i IN 1..s_array1.COUNT SAVE EXCEPTIONS
INSERT INTO EMP1 --(EMPNO, ENAME)
VALUES s_array1(i);
FORALL i IN 1..s_array1.COUNT SAVE EXCEPTIONS
INSERT INTO EMP2 --(EMPNO, ENAME, sal)
VALUES s_array1(i);
EXIT WHEN s_cur1%NOTFOUND;
END LOOP;
CLOSE s_cur1;
COMMIT;
EXCEPTION
WHEN OTHERS THEN
DBMS_OUTPUT.PUT_LINE('Unexpected Error:'||SqleRrm);
END;
table structures:-
CREATE TABLE EMP (
EMPNO NUMBER,
ENAME VARCHAR2 (100),
SAL NUMBER ) ;
CREATE TABLE EMP1 (
EMPNO NUMBER,
ENAME VARCHAR2 (100));
CREATE TABLE EMP2 (
EMPNO NUMBER,
ENAME VARCHAR2 (100),
SAL NUMBER ) ;
Do it in one simple SQL statement.
Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Production
With the Partitioning, OLAP and Data Mining options
SQL> desc emp
Name Null? Type
EMPNO NOT NULL NUMBER(4)
ENAME VARCHAR2(10)
JOB VARCHAR2(9)
MGR NUMBER(4)
HIREDATE DATE
SAL NUMBER(7,2)
COMM NUMBER(7,2)
DEPTNO NUMBER(2)
SQL> desc emp1
Name Null? Type
EMPNO NUMBER
ENAME VARCHAR2(100)
SQL> desc emp2
Name Null? Type
EMPNO NUMBER
ENAME VARCHAR2(100)
SAL NUMBER
SQL> INSERT ALL WHEN 1 = 1 THEN INTO emp1
2 (empno, ename)
3 VALUES
4 (empno, ename) WHEN 2 = 2 THEN INTO emp2
5 (empno, ename, sal)
6 VALUES
7 (empno, ename, sal)
8 SELECT empno, ename, sal FROM emp;
28 rows created.
SQL> SELECT COUNT(*) FROM emp1;
COUNT(*)
14
SQL> SELECT COUNT(*) FROM emp2;
COUNT(*)
14
SQL> SELECT COUNT(*) FROM emp;
COUNT(*)
14
SQL> -
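The WHEN 1 = 1 and WHEN 2 = 2 conditions above are always true, so the same effect can be had with the unconditional multitable insert (a sketch against the same tables):

```sql
INSERT ALL
   INTO emp1 (empno, ename)      VALUES (empno, ename)
   INTO emp2 (empno, ename, sal) VALUES (empno, ename, sal)
SELECT empno, ename, sal FROM emp;
```

Each source row is inserted into every INTO target, which is exactly what the conditional form was doing.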
Issue in using Cursor + Dynamic SQL + Bulk Collect + FORALL
Hi,
I have a dynamic query which I need to use as a cursor to fetch records that inturn need to be inserted into a staging table.
The issue I am facing is I am not sure how to declare the variable to fetch the records into. Since I am using a dynamic cursor how do I declare it?
My code looks something like this -
TYPE c_details_tbl_type IS REF CURSOR;
c_details c_details_tbl_type;
TYPE c_det_tbl_type IS TABLE OF c_details%ROWTYPE INDEX BY PLS_INTEGER;
c_det_tbl c_det_tbl_type; -- ???
BEGIN
v_string1 := 'SELECT....'
v_string2 := ' UNION ALL SELECT....'
v_string3 := 'AND ....'
v_string := v_string1||v_string2||v_string3;
OPEN c_details FOR v_string;
LOOP
FETCH c_details BULK COLLECT
INTO c_det_tbl LIMIT 1000;
IF (c_det_tbl.COUNT > 0) THEN
DELETE FROM STG;
FORALL i IN 1..c_det_tbl.COUNT
INSERT INTO STG
VALUES (c_det_tbl(i));
END IF;
EXIT WHEN c_details%NOTFOUND;
END LOOP;
CLOSE c_details;
END;
Thanks
Why the bulk collect? All that this does is slow down the read process (SELECT) and the write process (INSERT).
Data selected needs (as a collection) to be pushed into the PGA memory of the PL/SQL engine. And then that very same data needs to be pushed again by the PL/SQL engine back to the database to be inserted. Why?
It is a lot faster, needs a lot less resources, with fewer moving parts, to simply instruct the SQL engine to do both these steps using a single INSERT..SELECT statement. And this can support parallel DML too for scalability when data volumes get large.
It is also pretty easy to make a single SQL statement like this dynamic and even support bind variables.
Simplicity is the ultimate form of elegance. Pushing data needlessly around is not simple and thus not a very elegant way to address the problem. -
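The single-statement approach recommended above can stay dynamic and still use bind variables. A sketch with hypothetical column and bind names (stg, src, batch_id are placeholders):

```sql
DECLARE
   v_string   VARCHAR2(4000);
   v_batch_id CONSTANT NUMBER := 42;   -- illustrative bind value
BEGIN
   DELETE FROM stg;

   -- One set-based statement: the SQL engine reads and writes without
   -- round-tripping rows through PL/SQL collections.
   v_string := 'INSERT INTO stg (id, val)
                SELECT s.id, s.val FROM src s WHERE s.batch_id = :b1';
   EXECUTE IMMEDIATE v_string USING v_batch_id;

   DBMS_OUTPUT.put_line (SQL%ROWCOUNT || ' rows inserted.');
END;
/
```

The UNION ALL branches and extra predicates from the original can be concatenated into v_string exactly as before; only the INTO-a-collection step disappears.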
Object Type and Bulk Collect/Forall
How can I bulk collect into (and read from) a collection which is a table of an object type such as
CREATE TABLE base_table (
attr1 NUMBER,
attr2 NUMBER,
attr3 NUMBER,
attr4 NUMBER);
CREATE OR REPLACE TYPE rec_t IS OBJECT (
attr1 NUMBER,
attr2 NUMBER,
attr3 NUMBER,
attr4 NUMBER);
CREATE OR REPLACE TYPE col_t IS TABLE OF rec_t;
In my pl sql code I instantiate the collection type and want to populate it with a BULK COLLECT - statemt:
PROCEDURE test IS
v_col col_t;
BEGIN
SELECT rec_T(attr1, attr2, attr3, attr4)
BULK COLLECT INTO v_col
FROM base_table;
FORALL i IN v_col.FIRST..v_col.LAST INSERT INTO base_table VALUES (rec_t(v_col(i)));
END;
? If I do it this way I get the following exception on the FORALL insert:
PL/SQL: ORA-00947: not enough values
Edited by: user12149927 on 22.01.2010 00:48
Try like this:
CREATE TABLE BASE_TABLE
(
ATTR1 NUMBER,
ATTR2 NUMBER,
ATTR3 NUMBER,
ATTR4 NUMBER
);
CREATE OR REPLACE TYPE rec_t IS OBJECT (attr1 number, attr2 number, attr3 number, attr4 number);
CREATE OR REPLACE TYPE col_t IS TABLE OF rec_t;
CREATE OR REPLACE PROCEDURE test
IS
v_col col_t;
BEGIN
SELECT rec_t (attr1, attr2, attr3, attr4)
BULK COLLECT INTO v_col
FROM base_table;
INSERT INTO base_table
SELECT *
FROM table (CAST (v_col AS col_t));
END;
/
Regards,
Mahesh Kaila
Edited by: Mahesh Kaila on Jan 22, 2010 12:56 AM -
I have a 10.2.0.4 database that contains a PL/SQL procedure that copies data from a single remote 10.2.0.4 database table. The procedure will return anywhere from 50,000 to 500,000 rows of data. In testing I have made this a pretty speedy process using BULK COLLECT and FORALL to load them in 2000-row batches. It has now been requested that I include an additional column from my source table which happens to be a CLOB datatype. However, when I try to perform this with the extra column I get the standard "cannot select remote lob locators" error. Does anyone know of a way to perform this using BULK COLLECT? I've seen countless examples of doing it using "INSERT INTO TABLEX SELECT COL1, COL2, COL3, etc FROM TABLE Y@DBLINK". I don't want to do it this way for performance reasons. Any suggestions would be greatly appreciated.
sjm133 wrote:
I've seen countless examples of doing it using "INSERT INTO TABLEX SELECT COL1, COL2, COL3, etc FROM TABLE Y@DBLINK". I don't want to do it this way for performance reasons.
What performance reasons are those then?
Best thing to do would be to give it a go and see, and then if you find problems with it look for alternatives. Don't dismiss solutions without trying them. ;) -
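For what it's worth, the usual workaround for remote CLOBs is exactly the statement the poster wants to avoid: a remote LOB locator cannot be fetched into a local collection, but the SQL engine can move the LOB data itself in a set-based statement over the link. A sketch with placeholder names (local_copy, source_tab, remote_db):

```sql
-- The remote CLOB is materialized by the SQL engine during the insert;
-- no local LOB locator is ever needed, so the restriction does not apply.
INSERT INTO local_copy (id, col1, big_text)
SELECT id, col1, big_text
  FROM source_tab@remote_db;
COMMIT;
```

Despite the performance worry, a single INSERT..SELECT usually beats a BULK COLLECT round trip here, since the rows never pass through PL/SQL memory at all.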
Index range error bulk collect & forall ?
hi,
Can anyone tell me why I am getting the error below?
ORA-22165: given index [32768] must be in the range of [1] to [32767].
I am using BULK COLLECT and FORALL, as in the scenario below.
declare
cursor cmain is...
type <collectionType> is table of <>tablename%rowtype;
<collectionvariable> <collectionType>;
begin
declare
cursor c1 .....
type <collectionType> is table of <>tablename%rowtype;
<collectionvariable> <collectionType>;
begin
open c1;
loop
fetch c1 into <collectionvariable2> limit 50000;
forall i in 1 .. <collectionVariable1>.count
stmnt....
EXIT WHEN C1%NOTFOUND;
end loop;
close c1;
COMMIT;
end;
open cmain;
loop
fetch cmain into <collectionVariable2> limit 50000;
forall i in 1 .. <collectionVariable2>.count
stmnt....
EXIT WHEN CMAIN%NOTFOUND;
end loop;
close cmain;
COMMIT;
end ;
I do not see anything obviously wrong (except perhaps for consuming way too much PGA memory by using bulk fetches larger than about 1000 rows, with likely minute to no performance improvement).
The 50000 limit exceeds the 32767 ceiling that the exception throws. So it could be perhaps related to that - though a quick test on 10.2.0.1 showed no problems in bulk fetching 40k+ rows like this.
But I would suggest changing it irrespective - you cannot justify the memory footprint of a 50000 bulk fetch by the decrease in context switching it provides. No way. It is just a useless waste of very expensive memory.
Also note that the exception will include a PL/SQL source code line number - use that to start tracing the error.
And did I mention not to waste precious PGA memory?
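The advice above in code form (a sketch with placeholder names; the sweet spot is workload-dependent, but a LIMIT in the low hundreds to about a thousand captures nearly all of the context-switch savings):

```sql
DECLARE
   CURSOR c1 IS SELECT * FROM source_tab;       -- placeholder source
   TYPE t_rows IS TABLE OF source_tab%ROWTYPE;
   l_rows t_rows;
BEGIN
   OPEN c1;
   LOOP
      FETCH c1 BULK COLLECT INTO l_rows LIMIT 500;   -- modest batches keep PGA use bounded

      FORALL i IN 1 .. l_rows.COUNT
         INSERT INTO target_tab VALUES l_rows(i);

      EXIT WHEN c1%NOTFOUND;
   END LOOP;
   CLOSE c1;
   COMMIT;
END;
/
```

A LIMIT of 500 also stays safely under the 32767 index ceiling from the ORA-22165 error, while a 50000 batch buys almost nothing over it in reduced context switching.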
Hello Experts,
Please examine the snippet below, as it errors with PLS-00103: Encountered the symbol "END".
IF v_geospatial_coordinate_type = 'None'
THEN
OPEN p_cursor;
LOOP
FETCH p_cursor
BULK COLLECT INTO v_search_results_basic LIMIT 100;
FOR I in 1..v_search_results_basic.count
EXIT WHEN v_search_results_basic.count < 100;
END LOOP;
CLOSE p_cursor;
Thank you ahead of time for the assist, and of course your professionalism.
Did you post all your code? Here is what I noticed right off:
IF v_geospatial_coordinate_type = 'None'
THEN
OPEN p_cursor;
LOOP
FETCH p_cursor
BULK COLLECT INTO v_search_results_basic LIMIT 100;
FOR I in 1..v_search_results_basic.count
LOOP /* MISSING */
<stuff here> /* MISSING */
END LOOP; /* MISSING */
EXIT WHEN v_search_results_basic.count < 100;
END LOOP;
END IF; /* MISSING */