BULK COLLECT and FORALL with dynamic INSERT.
Hello,
I want to apply the BULK COLLECT and FORALL features to an INSERT statement in my procedure for performance improvement, as it has to insert a huge amount of data.
But the problem is that the INSERT statement is generated dynamically, and even the table name is only known at run-time, so I am not able to apply the performance tuning concepts.
See the code below:
PROCEDURE STP_MES_INSERT_GLOBAL_TO_MAIN
(P_IN_SRC_TABLE_NAME VARCHAR2 ,
P_IN_TRG_TABLE_NAME VARCHAR2 ,
P_IN_ED_TRIG_ALARM_ID NUMBER ,
P_IN_ED_CATG_ID NUMBER ,
P_IN_IS_PIECEID_ALARM IN CHAR,
P_IN_IS_LAST_RECORD IN CHAR )
IS
V_START_DATA_ID NUMBER;
V_STOP_DATA_ID NUMBER;
V_FROM_DATA_ID NUMBER;
V_TO_DATA_ID NUMBER;
V_MAX_REC_IN_LOOP NUMBER := 30000;
V_QRY1 VARCHAR2(32767);
BEGIN
EXECUTE IMMEDIATE 'SELECT MIN(ED_DATA_ID), MAX(ED_DATA_ID) FROM '|| P_IN_SRC_TABLE_NAME INTO V_START_DATA_ID , V_STOP_DATA_ID;
--DBMS_OUTPUT.PUT_LINE('ORIGINAL START ID := '||V_START_DATA_ID ||' ORIGINAL STOP ID := ' || V_STOP_DATA_ID);
V_FROM_DATA_ID := V_START_DATA_ID ;
IF (V_STOP_DATA_ID - V_START_DATA_ID ) > V_MAX_REC_IN_LOOP THEN
V_TO_DATA_ID := V_START_DATA_ID + V_MAX_REC_IN_LOOP;
ELSE
V_TO_DATA_ID := V_STOP_DATA_ID;
END IF;
LOOP
BEGIN
LOOP
V_QRY1 := ' INSERT INTO '||P_IN_TRG_TABLE_NAME||
' SELECT * FROM '||P_IN_SRC_TABLE_NAME ||
' WHERE ED_DATA_ID BETWEEN ' || V_FROM_DATA_ID ||' AND ' || V_TO_DATA_ID;
EXECUTE IMMEDIATE V_QRY1;
commit;
V_FROM_DATA_ID := V_TO_DATA_ID + 1;
IF ( V_STOP_DATA_ID - V_TO_DATA_ID > V_MAX_REC_IN_LOOP ) THEN
V_TO_DATA_ID := V_TO_DATA_ID + V_MAX_REC_IN_LOOP;
ELSE
V_TO_DATA_ID := V_TO_DATA_ID + (V_STOP_DATA_ID - V_TO_DATA_ID);
END IF;
EXCEPTION
WHEN OTHERS THEN ... -- and so on
Now you can observe here that P_IN_SRC_TABLE_NAME is the source table name, which we get as a parameter at run-time. I have used two tables in the INSERT statement: P_IN_TRG_TABLE_NAME (into which I have to insert data) and P_IN_SRC_TABLE_NAME (from which I have to insert data).
V_QRY1 := ' INSERT INTO '||P_IN_TRG_TABLE_NAME||
' SELECT * FROM '||P_IN_SRC_TABLE_NAME ||
' WHERE ED_DATA_ID BETWEEN ' || V_FROM_DATA_ID ||' AND ' || V_TO_DATA_ID;
EXECUTE IMMEDIATE V_QRY1;
Now when I apply the BULK COLLECT and FORALL feature, I am facing the out-of-scope problem. See the code below:
BEGIN
EXECUTE IMMEDIATE 'SELECT MIN(ED_DATA_ID), MAX(ED_DATA_ID) FROM '|| P_IN_SRC_TABLE_NAME INTO V_START_DATA_ID , V_STOP_DATA_ID;
--DBMS_OUTPUT.PUT_LINE('ORIGINAL START ID := '||V_START_DATA_ID ||' ORIGINAL STOP ID := ' || V_STOP_DATA_ID);
V_FROM_DATA_ID := V_START_DATA_ID ;
IF (V_STOP_DATA_ID - V_START_DATA_ID ) > V_MAX_REC_IN_LOOP THEN
V_TO_DATA_ID := V_START_DATA_ID + V_MAX_REC_IN_LOOP;
ELSE
V_TO_DATA_ID := V_STOP_DATA_ID;
END IF;
LOOP
DECLARE
TYPE TRG_TABLE_TYPE IS TABLE OF P_IN_SRC_TABLE_NAME%ROWTYPE;
V_TRG_TABLE_TYPE TRG_TABLE_TYPE;
CURSOR TRG_TAB_CUR IS
SELECT * FROM P_IN_SRC_TABLE_NAME
WHERE ED_DATA_ID BETWEEN V_FROM_DATA_ID AND V_TO_DATA_ID;
V_QRY1 varchar2(32767);
BEGIN
OPEN TRG_TAB_CUR;
LOOP
FETCH TRG_TAB_CUR BULK COLLECT INTO V_TRG_TABLE_TYPE LIMIT 30000;
FORALL I IN 1..V_TRG_TABLE_TYPE.COUNT
V_QRY1 := ' INSERT INTO '||P_IN_TRG_TABLE_NAME||' VALUES V_TRG_TABLE_TYPE(I);'
EXECUTE IMMEDIATE V_QRY1;
EXIT WHEN TRG_TAB_CUR%NOTFOUND;
END LOOP;
CLOSE TRG_TAB_CUR;
V_FROM_DATA_ID := V_TO_DATA_ID + 1;
IF ( V_STOP_DATA_ID - V_TO_DATA_ID > V_MAX_REC_IN_LOOP ) THEN
V_TO_DATA_ID := V_TO_DATA_ID + V_MAX_REC_IN_LOOP;
ELSE
V_TO_DATA_ID := V_TO_DATA_ID + (V_STOP_DATA_ID - V_TO_DATA_ID);
END IF;
EXCEPTION
WHEN OTHERS THEN ... -- and so on
But the above code is not helping me. What am I doing wrong? How can I tune this dynamically generated statement to use BULK COLLECT for better performance?
Thanks in advance!
Hello,
a table name cannot be bound as a parameter in SQL; this won't compile:
EXECUTE IMMEDIATE ' INSERT INTO :1 VALUES ......
USING P_IN_TRG_TABLE_NAME ...
but this should work:
EXECUTE IMMEDIATE ' INSERT INTO ' || P_IN_TRG_TABLE_NAME || ' VALUES ......
You cannot declare a type that is based on a table whose name is in a variable.
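As a hedged aside on the working concatenation approach: the concatenated identifiers can be validated with DBMS_ASSERT (available from Oracle 10g), while the values are still passed as proper bind variables. This is only a sketch; parameter and variable names follow the posted procedure and it assumes it runs inside it:

```sql
-- Sketch only: identifiers are concatenated (after validation), values are bound.
DECLARE
   v_sql VARCHAR2(32767);
BEGIN
   v_sql := 'INSERT INTO '
         || DBMS_ASSERT.SQL_OBJECT_NAME(p_in_trg_table_name)  -- raises ORA-44002 if not an existing object
         || ' SELECT * FROM '
         || DBMS_ASSERT.SQL_OBJECT_NAME(p_in_src_table_name)
         || ' WHERE ed_data_id BETWEEN :f AND :t';            -- plain values: binds are fine here
   EXECUTE IMMEDIATE v_sql USING v_from_data_id, v_to_data_id;
END;
```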
PL/SQL is a strongly typed language; a type must be known at compile time. Code like this is not allowed:
PROCEDURE xx( src_table_name varchar2 )
DECLARE
TYPE tab IS TABLE OF src_table_name%ROWTYPE;
...
This can be done instead by creating one big dynamic SQL block - see the example below (tested on Oracle 10g XE; this is a slightly simplified version of your procedure):
CREATE OR REPLACE
PROCEDURE stp1(
p_in_src_table_name VARCHAR2 ,
p_in_trg_table_name VARCHAR2 ,
v_from_data_id NUMBER := 100,
v_to_data_id NUMBER := 100000 )
IS
BEGIN
EXECUTE IMMEDIATE q'{
DECLARE
TYPE trg_table_type IS TABLE OF }' || p_in_src_table_name || q'{%ROWTYPE;
V_TRG_TABLE_TYPE TRG_TABLE_TYPE;
CURSOR TRG_TAB_CUR IS
SELECT * FROM }' || p_in_src_table_name ||
q'{ WHERE ED_DATA_ID BETWEEN :V_FROM_DATA_ID AND :V_TO_DATA_ID;
BEGIN
OPEN TRG_TAB_CUR;
LOOP
FETCH TRG_TAB_CUR BULK COLLECT INTO V_TRG_TABLE_TYPE LIMIT 30000;
FORALL I IN 1 .. V_TRG_TABLE_TYPE.COUNT
INSERT INTO }' || p_in_trg_table_name || q'{ VALUES V_TRG_TABLE_TYPE( I );
EXIT WHEN TRG_TAB_CUR%NOTFOUND;
END LOOP;
CLOSE TRG_TAB_CUR;
END; }'
USING v_from_data_id, v_to_data_id;
COMMIT;
END;
But this probably won't give any performance improvement. BULK COLLECT and FORALL can give performance improvements when there is a DML operation inside a loop,
and that single DML operates on only one record or a relatively small number of records, and the DML is repeated many, many times in the loop.
I guess that your code is the opposite of this - it contains an INSERT statement that operates on many records (one single insert ~ 30000 records),
and you are trying to replace it with BULK COLLECT/FORALL - INSERT INTO ... SELECT FROM will almost always be faster than BULK COLLECT/FORALL.
Look at simple test - below is a procedure that uses INSERT ... SELECT :
CREATE OR REPLACE
PROCEDURE stp(
p_in_src_table_name VARCHAR2 ,
p_in_trg_table_name VARCHAR2 ,
v_from_data_id NUMBER := 100,
v_to_data_id NUMBER := 100000 )
IS
V_QRY1 VARCHAR2(32767);
BEGIN
V_QRY1 := ' INSERT INTO '|| P_IN_TRG_TABLE_NAME ||
' SELECT * FROM '|| P_IN_SRC_TABLE_NAME ||
' WHERE ed_data_id BETWEEN :f AND :t ';
EXECUTE IMMEDIATE V_QRY1
USING V_FROM_DATA_ID, V_TO_DATA_ID;
COMMIT;
END;
/
and we can compare both procedures:
SQL> CREATE TABLE test333
2 AS SELECT level ed_data_id ,
3 'XXX ' || LEVEL x,
4 'YYY ' || 2 * LEVEL y
5 FROM dual
6 CONNECT BY LEVEL <= 1000000;
Table created.
SQL> CREATE TABLE test333_dst AS
2 SELECT * FROM test333 WHERE 1 = 0;
Table created.
SQL> set timing on
SQL> ed
Wrote file afiedt.buf
1 BEGIN
2 FOR i IN 1 .. 100 LOOP
3 stp1( 'test333', 'test333_dst', 1000, 31000 );
4 END LOOP;
5* END;
SQL> /
PL/SQL procedure successfully completed.
Elapsed: 00:00:22.12
SQL> ed
Wrote file afiedt.buf
1 BEGIN
2 FOR i IN 1 .. 100 LOOP
3 stp( 'test333', 'test333_dst', 1000, 31000 );
4 END LOOP;
5* END;
SQL> /
PL/SQL procedure successfully completed.
Elapsed: 00:00:14.86
Without bulk collect: ~ 15 sec.
Bulk collect version: ~ 22 sec, i.e. about 7 sec longer / 15 sec = roughly a 45% performance decrease.
Similar Messages
-
How to use Bulk Collect and Forall
Hi all,
We are on Oracle 10g. I have a requirement to read from table A and then for each record in table A, find matching rows in table B and then write the identified information in table B to the target table (table C). In the past, I had used two ‘cursor for loops’ to achieve that. To make the new procedure, more efficient, I would like to learn to use ‘bulk collect’ and ‘forall’.
Here is what I have so far:
DECLARE
TYPE employee_array IS TABLE OF EMPLOYEES%ROWTYPE;
employee_data employee_array;
TYPE job_history_array IS TABLE OF JOB_HISTORY%ROWTYPE;
Job_history_data job_history_array;
BatchSize CONSTANT POSITIVE := 5;
-- Read from File A
CURSOR c_get_employees IS
SELECT Employee_id,
first_name,
last_name,
hire_date,
job_id
FROM EMPLOYEES;
-- Read from File B based on employee ID in File A
CURSOR c_get_job_history (p_employee_id number) IS
select start_date,
end_date,
job_id,
department_id
FROM JOB_HISTORY
WHERE employee_id = p_employee_id;
BEGIN
OPEN c_get_employees;
LOOP
FETCH c_get_employees BULK COLLECT INTO employee_data.employee_id.LAST,
employee_data.first_name.LAST,
employee_data.last_name.LAST,
employee_data.hire_date.LAST,
employee_data.job_id.LAST
LIMIT BatchSize;
FORALL i in 1.. employee_data.COUNT
Open c_get_job_history (employee_data(i).employee_id);
FETCH c_get_job_history BULKCOLLECT INTO job_history_array LIMIT BatchSize;
FORALL k in 1.. Job_history_data.COUNT LOOP
-- insert into FILE C
INSERT INTO MY_TEST(employee_id, first_name, last_name, hire_date, job_id)
values (job_history_array(k).employee_id, job_history_array(k).first_name,
job_history_array(k).last_name, job_history_array(k).hire_date,
job_history_array(k).job_id);
EXIT WHEN job_ history_data.count < BatchSize
END LOOP;
CLOSE c_get_job_history;
EXIT WHEN employee_data.COUNT < BatchSize;
END LOOP;
COMMIT;
CLOSE c_get_employees;
END;
When I run this script, I get
[Error] Execution (47: 17): ORA-06550: line 47, column 17:
PLS-00103: Encountered the symbol "OPEN" when expecting one of the following:
. ( * @ % & - + / at mod remainder rem select update with
<an exponent (**)> delete insert || execute multiset save
merge
ORA-06550: line 48, column 17:
PLS-00103: Encountered the symbol "FETCH" when expecting one of the following:
begin function package pragma procedure subtype type use
<an identifier> <a double-quoted delimited-identifier> form
current cursor
What is the best way to code this? Once I learn how to do this, I will apply the knowledge to the real application, in which file A would have around 200 rows and file B would have hundreds of thousands of rows.
Thank you for your guidance,
Seyed
Hello BlueShadow,
Following your advice, I modified a stored procedure that initially used two cursor FOR loops to read from tables A and B and write to table C, to instead use something like your suggestion listed below:
INSERT INTO tableC
SELECT …
FROM tableA JOIN tableB ON (join condition).
I tried this change on a procedure writing to tableC with keys disabled. I will try this against the real table, which has a primary key and indexes, and report the result later.
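Applied to the HR-style tables in the original question, that suggestion might look like the following sketch (the column lists are assumptions based on the posted cursors, not confirmed schema):

```sql
-- Hypothetical sketch: one set-based statement replaces both cursor loops.
INSERT INTO my_test (employee_id, first_name, last_name, hire_date, job_id)
SELECT e.employee_id, e.first_name, e.last_name, e.hire_date, jh.job_id
  FROM employees   e
  JOIN job_history jh ON jh.employee_id = e.employee_id;
```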
Thank you very much,
Seyed -
Hi all,
I have a performance issue in the code below, where I am trying to insert the data from table_stg into the target_tab and parent_tab tables and then into child tables via a cursor with BULK COLLECT. The target_tab and parent_tab tables are huge and have a row-wise trigger enabled on them; the trigger is mandatory. The time taken for this block to execute is 5000 seconds. Now my requirement is to reduce it to 5 to 10 minutes.
Can someone please guide me here? It's a bit urgent. Awaiting your response.
declare
vmax_Value NUMBER(5);
vcnt number(10);
id_val number(20);
pc_id number(15);
vtable_nm VARCHAR2(100);
vstep_no VARCHAR2(10);
vsql_code VARCHAR2(10);
vsql_errm varchar2(200);
vtarget_starttime timestamp;
limit_in number :=10000;
idx number(10);
cursor stg_cursor is
select
DESCRIPTION,
SORT_CODE,
ACCOUNT_NUMBER,
to_number(to_char(CORRESPONDENCE_DATE,'DD')) crr_day,
to_char(CORRESPONDENCE_DATE,'MONTH') crr_month,
to_number(substr(to_char(CORRESPONDENCE_DATE,'DD-MON-YYYY'),8,4)) crr_year,
PARTY_ID,
GUID,
PAPERLESS_REF_IND,
PRODUCT_TYPE,
PRODUCT_BRAND,
PRODUCT_HELD_ID,
NOTIFICATION_PREF,
UNREAD_CORRES_PERIOD,
EMAIL_ID,
MOBILE_NUMBER,
TITLE,
SURNAME,
POSTCODE,
EVENT_TYPE,
PRIORITY_IND,
SUBJECT,
EXT_PRD_ID_TX,
EXT_PRD_HLD_ID_TX,
EXT_SYS_ID,
EXT_PTY_ID_TX,
ACCOUNT_TYPE_CD,
COM_PFR_TYP_TX,
COM_PFR_OPT_TX,
COM_PFR_RSN_CD
from table_stg;
type rec_type is table of stg_rec_type index by pls_integer;
v_rt_all_cols rec_type;
BEGIN
vstep_no := '0';
vmax_value := 0;
vtarget_starttime := systimestamp;
id_val := 0;
pc_id := 0;
success_flag := 0;
vstep_no := '1';
vtable_nm := 'before cursor';
OPEN stg_cursor;
vstep_no := '2';
vtable_nm := 'After cursor';
LOOP
vstep_no := '3';
vtable_nm := 'before fetch';
--loop
FETCH stg_cursor BULK COLLECT INTO v_rt_all_cols LIMIT limit_in;
vstep_no := '4';
vtable_nm := 'after fetch';
--EXIT WHEN v_rt_all_cols.COUNT = 0;
EXIT WHEN stg_cursor%NOTFOUND;
FOR i IN 1 .. v_rt_all_cols.COUNT
LOOP
dbms_output.put_line(upper(v_rt_all_cols(i).event_type));
if (upper(v_rt_all_cols(i).event_type) = upper('System_enforced')) then
vstep_no := '4.1';
vtable_nm := 'before seq sel';
select PC_SEQ.nextval into pc_id from dual;
vstep_no := '4.2';
vtable_nm := 'before insert corres';
INSERT INTO target1_tab
(ID,
PARTY_ID,
PRODUCT_BRAND,
SORT_CODE,
ACCOUNT_NUMBER,
EXT_PRD_ID_TX,
EXT_PRD_HLD_ID_TX,
EXT_SYS_ID,
EXT_PTY_ID_TX,
ACCOUNT_TYPE_CD,
COM_PFR_TYP_TX,
COM_PFR_OPT_TX,
COM_PFR_RSN_CD,
status)
VALUES
(pc_id,
v_rt_all_cols(i).party_id,
decode(v_rt_all_cols(i).product_brand,'LTB',2,'HLX',1,'HAL',1,'BOS',3,'VER',4,0),
v_rt_all_cols(i).sort_code,
'XXXX'||substr(trim(v_rt_all_cols(i).ACCOUNT_NUMBER),length(trim(v_rt_all_cols(i).ACCOUNT_NUMBER))-3,4),
v_rt_all_cols(i).EXT_PRD_ID_TX,
v_rt_all_cols(i).EXT_PRD_HLD_ID_TX,
v_rt_all_cols(i).EXT_SYS_ID,
v_rt_all_cols(i).EXT_PTY_ID_TX,
v_rt_all_cols(i).ACCOUNT_TYPE_CD,
v_rt_all_cols(i).COM_PFR_TYP_TX,
v_rt_all_cols(i).COM_PFR_OPT_TX,
v_rt_all_cols(i).COM_PFR_RSN_CD,
NULL);
vstep_no := '4.3';
vtable_nm := 'after insert corres';
else
select COM_SEQ.nextval into id_val from dual;
vstep_no := '6';
vtable_nm := 'before insertcomm';
if (upper(v_rt_all_cols(i).event_type) = upper('REMINDER')) then
vstep_no := '6.01';
vtable_nm := 'after if insertcomm';
insert into parent_tab
(ID ,
CTEM_CODE,
CHA_CODE,
CT_CODE,
CONTACT_POINT_ID,
SOURCE,
RECEIVED_DATE,
SEND_DATE,
RETRY_COUNT)
values
(id_val,
lower(v_rt_all_cols(i).event_type),
decode(v_rt_all_cols(i).product_brand,'LTB',2,'HLX',1,'HAL',1,'BOS',3,'VER',4,0),
'Email',
v_rt_all_cols(i).email_id,
'IADAREMINDER',
systimestamp,
systimestamp,
0);
else
vstep_no := '6.02';
vtable_nm := 'after else insertcomm';
insert into parent_tab
(ID ,
CTEM_CODE,
CHA_CODE,
CT_CODE,
CONTACT_POINT_ID,
SOURCE,
RECEIVED_DATE,
SEND_DATE,
RETRY_COUNT)
values
(id_val,
lower(v_rt_all_cols(i).event_type),
decode(v_rt_all_cols(i).product_brand,'LTB',2,'HLX',1,'HAL',1,'BOS',3,'VER',4,0),
'Email',
v_rt_all_cols(i).email_id,
'CORRESPONDENCE',
systimestamp,
systimestamp,
0);
END if;
vstep_no := '6.11';
vtable_nm := 'before chop';
if (v_rt_all_cols(i).ACCOUNT_NUMBER is not null) then
v_rt_all_cols(i).ACCOUNT_NUMBER := 'XXXX'||substr(trim(v_rt_all_cols(i).ACCOUNT_NUMBER),length(trim(v_rt_all_cols(i).ACCOUNT_NUMBER))-3,4);
insert into child_tab
(COM_ID,
KEY,
VALUE)
values
(id_val,
'IB.Correspondence.AccountNumberMasked',
v_rt_all_cols(i).ACCOUNT_NUMBER);
end if;
vstep_no := '6.1';
vtable_nm := 'before stateday';
if (v_rt_all_cols(i).crr_day is not null) then
insert into child_tab
(COM_ID,
KEY,
VALUE)
values
(id_val,
--'IB.Correspondence.Date.Day',
'IB.Crsp.Date.Day',
v_rt_all_cols(i).crr_day);
end if;
vstep_no := '6.2';
vtable_nm := 'before statemth';
if (v_rt_all_cols(i).crr_month is not null) then
insert into child_tab
(COM_ID,
KEY,
VALUE)
values
(id_val,
--'IB.Correspondence.Date.Month',
'IB.Crsp.Date.Month',
v_rt_all_cols(i).crr_month);
end if;
vstep_no := '6.3';
vtable_nm := 'before stateyear';
if (v_rt_all_cols(i).crr_year is not null) then
insert into child_tab
(COM_ID,
KEY,
VALUE)
values
(id_val,
--'IB.Correspondence.Date.Year',
'IB.Crsp.Date.Year',
v_rt_all_cols(i).crr_year);
end if;
vstep_no := '7';
vtable_nm := 'before type';
if (v_rt_all_cols(i).product_type is not null) then
insert into child_tab
(COM_ID,
KEY,
VALUE)
values
(id_val,
'IB.Product.ProductName',
v_rt_all_cols(i).product_type);
end if;
vstep_no := '9';
vtable_nm := 'before title';
if (trim(v_rt_all_cols(i).title) is not null) then
insert into child_tab
(COM_ID,
KEY,
VALUE )
values
(id_val,
'IB.Customer.Title',
trim(v_rt_all_cols(i).title));
end if;
vstep_no := '10';
vtable_nm := 'before surname';
if (v_rt_all_cols(i).surname is not null) then
insert into child_tab
(COM_ID,
KEY,
VALUE)
values
(id_val,
'IB.Customer.LastName',
v_rt_all_cols(i).surname);
end if;
vstep_no := '12';
vtable_nm := 'before postcd';
if (trim(v_rt_all_cols(i).POSTCODE) is not null) then
insert into child_tab
(COM_ID,
KEY,
VALUE)
values
(id_val,
'IB.Customer.Addr.PostCodeMasked',
substr(replace(v_rt_all_cols(i).POSTCODE,' ',''),length(replace(v_rt_all_cols(i).POSTCODE,' ',''))-2,3));
end if;
vstep_no := '13';
vtable_nm := 'before subject';
if (trim(v_rt_all_cols(i).SUBJECT) is not null) then
insert into child_tab
(COM_ID,
KEY,
VALUE)
values
(id_val,
'IB.Correspondence.Subject',
v_rt_all_cols(i).subject);
end if;
vstep_no := '14';
vtable_nm := 'before inactivity';
if (trim(v_rt_all_cols(i).UNREAD_CORRES_PERIOD) is null or
trim(v_rt_all_cols(i).UNREAD_CORRES_PERIOD) = '3' or
trim(v_rt_all_cols(i).UNREAD_CORRES_PERIOD) = '6' or
trim(v_rt_all_cols(i).UNREAD_CORRES_PERIOD) = '9') then
insert into child_tab
(COM_ID,
KEY,
VALUE)
values
(id_val,
'IB.Correspondence.Inactivity',
v_rt_all_cols(i).UNREAD_CORRES_PERIOD);
end if;
vstep_no := '14.1';
vtable_nm := 'after notfound';
end if;
vstep_no := '15';
vtable_nm := 'after notfound';
END LOOP;
end loop;
vstep_no := '16';
vtable_nm := 'before closecur';
CLOSE stg_cursor;
vstep_no := '17';
vtable_nm := 'before commit';
DELETE FROM table_stg;
COMMIT;
vstep_no := '18';
vtable_nm := 'after commit';
EXCEPTION
WHEN OTHERS THEN
ROLLBACK;
success_flag := 1;
vsql_code := SQLCODE;
vsql_errm := SUBSTR(sqlerrm,1,200);
error_logging_pkg.inserterrorlog('samp',vsql_code,vsql_errm, vtable_nm,vstep_no);
RAISE_APPLICATION_ERROR (-20011, 'samp '||vstep_no||' SQLERRM:'||SQLERRM);
end;
Thanks
It's a bit urgent
NO - it is NOT urgent. Not to us.
If you have an urgent problem you need to hire a consultant.
I have a performance issue in the code below,
Maybe you do and maybe you don't. How are we to really know? You haven't posted ANYTHING indicating that a performance issue exists. Please read the FAQ for how to post a tuning request and the info you need to provide. First and foremost you have to post SOMETHING that actually shows that a performance issue exists. Troubleshooting requires FACTS not just a subjective opinion.
where I am trying to insert the data from table_stg into the target_tab and parent_tab tables and then into child tables via a cursor with BULK COLLECT. The target_tab and parent_tab tables are huge and have a row-wise trigger enabled on them; the trigger is mandatory. The time taken for this block to execute is 5000 seconds. Now my requirement is to reduce it to 5 to 10 minutes.
Personally I think 5000 seconds (about 1 hr 20 minutes) is very fast for processing 800 trillion rows of data into parent and child tables. Why do you think that is slow?
Your code has several major flaws that need to be corrected before you can even determine what, if anything, needs to be tuned.
This code has the EXIT statement at the beginning of the loop instead of at the end
FETCH stg_cursor BULK COLLECT INTO v_rt_all_cols LIMIT limit_in;
vstep_no := '4';
vtable_nm := 'after fetch';
--EXIT WHEN v_rt_all_cols.COUNT = 0;
EXIT WHEN stg_cursor%NOTFOUND;
The correct place for the %NOTFOUND test when using BULK COLLECT is at the END of the loop; that is, the last statement in the loop.
You can use a COUNT test at the start of the loop but ironically you have commented it out and have now done it wrong. Either move the NOTFOUND test to the end of the loop or remove it and uncomment the COUNT test.
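In other words, a minimal sketch of the corrected loop shape, using the cursor and variable names from the posted code:

```sql
OPEN stg_cursor;
LOOP
   FETCH stg_cursor BULK COLLECT INTO v_rt_all_cols LIMIT limit_in;
   FOR i IN 1 .. v_rt_all_cols.COUNT LOOP
      NULL;  -- process the batch here; runs zero times on an empty final fetch
   END LOOP;
   EXIT WHEN stg_cursor%NOTFOUND;  -- the %NOTFOUND test goes at the END of the loop
END LOOP;
CLOSE stg_cursor;
```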
WHEN OTHERS THEN
ROLLBACK;
That basically says you don't even care what problem occurs or whether the problem is for a single record of your 10,000 in the collection. You pretty much just throw away any stack trace and substitute your own message.
Your code also has NO exception handling for any of the individual steps or blocks of code.
The code you posted also begs the question of why you are using NAME=VALUE pairs for child data rows? Why aren't you using a standard relational table for this data?
As others have noted you are using slow-by-slow (row by row processing). Let's assume that PL/SQL, the bulk collect and row-by-row is actually necessary.
Then you should be constructing the parent and child records into collections and then inserting them in BULK using FORALL.
1. Create a collection for the new parent rows
2. Create a collection for the new child rows
3. For each set of LIMIT source row data
a. empty the parent and child collections
b. populate those collections with new parent/child data
c. bulk insert the parent collection into the parent table
d. bulk insert the child collection into the child table
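The outline above can be sketched roughly as follows (table names are from the thread; the per-row column logic is left as comments, so this is a skeleton rather than runnable business logic):

```sql
-- Hedged sketch only: table names from the thread, column mapping omitted.
DECLARE
   CURSOR src_cur IS SELECT * FROM table_stg;
   TYPE src_tab_t    IS TABLE OF table_stg%ROWTYPE;
   TYPE parent_tab_t IS TABLE OF parent_tab%ROWTYPE;
   TYPE child_tab_t  IS TABLE OF child_tab%ROWTYPE;
   l_src    src_tab_t;
   l_parent parent_tab_t := parent_tab_t();
   l_child  child_tab_t  := child_tab_t();
BEGIN
   OPEN src_cur;
   LOOP
      FETCH src_cur BULK COLLECT INTO l_src LIMIT 10000;
      l_parent.DELETE;                         -- a. empty the collections
      l_child.DELETE;
      FOR i IN 1 .. l_src.COUNT LOOP           -- b. populate them in PL/SQL
         l_parent.EXTEND;
         -- fill l_parent(l_parent.LAST) from l_src(i) here,
         -- and EXTEND/fill l_child once per derived child row
      END LOOP;
      FORALL i IN 1 .. l_parent.COUNT          -- c. one bulk insert for parents
         INSERT INTO parent_tab VALUES l_parent(i);
      FORALL i IN 1 .. l_child.COUNT           -- d. one bulk insert for children
         INSERT INTO child_tab VALUES l_child(i);
      EXIT WHEN src_cur%NOTFOUND;
   END LOOP;
   CLOSE src_cur;
   COMMIT;
END;
```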
And unless you really want to either load EVERYTHING or abandon everything you should use bulk exception handling so that the clean data gets processed and only the dirty data gets rejected. -
Using bulk collect and for all to solve a problem
Hi All
I have the following problem.
Please forgive me if it's a stupid question :-) I'm learning.
1: Data in a staging table xx_staging_table
2: two Target table t1, t2 where some columns from xx_staging_table are inserted into
Some of the columns from the staging table data are checked for valid entries and then some columns from that row will be loaded into the two target tables.
The two target tables use different set of columns from the staging table
When I had a thousand records there was no problem with a direct insert but it seems we will now have half a million records.
This has slowed down the process considerably.
My question is
Can I use the bulk collect and for all functionality to get specific columns from a staging table, then validate the row using those columns
and then use a bulk insert to load the data into a specific table.?
So code would be like
get_staging_data cursor will have all the columns i need from the staging table
cursor get_staging_data
is select * from xx_staging_table (about 500,000 records)
Use bulk collect to load about 10000 or so records into a plsql table
and then do a bulk insert like this
CREATE TABLE t1 AS SELECT * FROM all_objects WHERE 1 = 2;
CREATE OR REPLACE PROCEDURE test_proc (p_array_size IN PLS_INTEGER DEFAULT 100)
IS
TYPE ARRAY IS TABLE OF all_objects%ROWTYPE;
l_data ARRAY;
CURSOR c IS SELECT * FROM all_objects;
BEGIN
OPEN c;
LOOP
FETCH c BULK COLLECT INTO l_data LIMIT p_array_size;
FORALL i IN 1..l_data.COUNT
INSERT INTO t1 VALUES l_data(i);
EXIT WHEN c%NOTFOUND;
END LOOP;
CLOSE c;
END test_proc;
In the above example t1 and the cursor have the same number of columns
In my case the columns in the cursor loop are a small subset of the columns of table t1
so can i use a forall to load that subset into the table t1? How does that work?
Thanks
J
user7348303 wrote:
checking if the value is valid, and there are also some conditional processing rules (such as: if the value is a certain value, no inserts are needed) which are a little more complex than I can put in a simple ...
Well, if the processing is too complex (and conditional) to be done in SQL, then doing it in PL/SQL is justified... but it will be slower, as you are now introducing an additional layer. Data now needs to travel between the SQL layer and the PL/SQL layer. This is slower.
PL/SQL is inherently serialised - and this also affects performance and scalability. PL/SQL cannot be parallelised by Oracle in an automated fashion. SQL processes can.
To put it in simple terms: you create a PL/SQL procedure Foo that processes a SQL cursor, and you execute that proc. Oracle cannot run multiple parallel copies of Foo. It can perhaps parallelise the SQL cursor that Foo uses - but not Foo itself.
However, if Foo is called by the SQL engine, it can run in parallel - as the SQL process calling Foo is running in parallel. So if you make Foo a pipelined table function (written in PL/SQL), and you design and code it as a thread-safe/parallel-enabled function, it can be called and executed in parallel by the SQL engine.
So moving your PL/SQL code into a parallel-enabled pipelined function written in PL/SQL, and using that function via parallel SQL, can increase performance over running the same basic PL/SQL processing as a serialised process.
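A skeleton of such a parallel-enabled pipelined table function might look like this (the object type, column names, and the UPPER transformation are illustrative assumptions, not the poster's schema):

```sql
CREATE OR REPLACE TYPE row_t AS OBJECT ( id NUMBER, val VARCHAR2(100) );
/
CREATE OR REPLACE TYPE row_tab_t AS TABLE OF row_t;
/
CREATE OR REPLACE FUNCTION foo( p_cur SYS_REFCURSOR )
   RETURN row_tab_t PIPELINED
   PARALLEL_ENABLE ( PARTITION p_cur BY ANY )   -- lets the SQL engine run copies in parallel
IS
   v_id  NUMBER;
   v_val VARCHAR2(100);
BEGIN
   LOOP
      FETCH p_cur INTO v_id, v_val;
      EXIT WHEN p_cur%NOTFOUND;
      -- complex per-row PL/SQL processing goes here
      PIPE ROW ( row_t( v_id, UPPER(v_val) ) );
   END LOOP;
   RETURN;
END;
/
-- used from parallel SQL, e.g.:
-- INSERT /*+ APPEND PARALLEL(t) */ INTO target_tab t
-- SELECT * FROM TABLE( foo( CURSOR( SELECT id, val FROM source_tab ) ) );
```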
This is of course assuming that the processing that needs to be done using PL/SQL code, can be designed and coded for parallel processing in this fashion. -
Binds collections and forall statement
version 9.2.0.6
I would like to make this more dynamic in that the collection cList can be used only once and be used by all bind variables. The variable stmt would be dynamically generated in that case it would insert into any number of tables.
Can this be done?
Is this feature available in a newer version of Oracle?
create table d2 nologging as
select
rownum rn
,substr(dbms_random.string('x',5),1,10) v1
,sysdate d1
,round(dbms_random.value(1,10)) n1
,substr(dbms_random.string('x',5),1,10) v2
,rpad(' ',4000,' ') as concat
from dual connect by level <= 100;
-- no rows for our test
create table d3 nologging as
select *
from d2 where 1 = 2;
-- setup for our test
update d2
set image = rpad(nvl(to_char(rn),' '),10,' ')
|| rpad(nvl(v1,' '),20,' ')
|| rpad(nvl(to_char(d1,'DD-MON-YYYY HH24:MI:SS'),' '),34,' ')
|| rpad(nvl(to_char(n1),' '),10,' ')
|| rpad(nvl(v2,' '),10,' ');
-- test got all locations right
select
to_number(rtrim(substr(image,1,10)))
,rtrim(substr(image,11,20))
,to_date(rtrim(substr(image,30,34)),'DD-MON-YYYY HH24:MI:SS')
,to_number(rtrim(substr(image,65,10))) AS n1
,rtrim(substr(image,75,10))
from d2;
-- here is where we do the work
declare
type charList is table of varchar2(4000);
cList charList;
d2l d2_list;
errors NUMBER;
dml_errors EXCEPTION;
PRAGMA exception_init(dml_errors, -24381);
sqlStmt varchar2(32000);
cursor cur is select image from d2;
bcLimit number := 23;
begin
sqlStmt := 'insert into d3 (rn,v1,d1,n1,v2)'
|| 'values (to_number(rtrim(substr(:a,1,10)))
,rtrim(substr(:a,11,20))
,to_date(rtrim(substr(:a,30,34)),''DD-MON-YYYY HH24:MI:SS'')
,to_number(rtrim(substr(:a,65,10)))
,rtrim(substr(:a,75,10)))';
open cur;
loop
fetch cur bulk collect into cList limit bcLimit;
exit when cList.count = 0;
begin
-- very very unfortunately the code is unable to have one using clause variable be applied to all bind variables
-- note the number of cList uses having the bind variables all name the same does not help
-- using only one gets an ORA-01008 error :(
FORALL i IN cList.FIRST..cList.LAST SAVE EXCEPTIONS execute immediate sqlstmt using cList(i),cList(i),cList(i),cList(i),cList(i);
-- FORALL i IN cList.FIRST..cList.LAST SAVE EXCEPTIONS execute immediate sqlstmt using cList(i); --< DOES NOT WORK :( I WISH IT WOULD!
EXCEPTION
WHEN dml_errors THEN
errors := SQL%BULK_EXCEPTIONS.COUNT;
dbms_output.put_line('number of errors is ' || errors);
FOR i IN 1..errors LOOP
dbms_output.put_line('Error ' || i || ' occurred during iteration ' || SQL%BULK_EXCEPTIONS(i).ERROR_INDEX);
dbms_output.put_line('Could not insert ' || cList(SQL%BULK_EXCEPTIONS(i).ERROR_INDEX));
dbms_output.put_line(SQLERRM(-SQL%BULK_EXCEPTIONS(i).ERROR_CODE));
END LOOP;
end;
end loop;
close cur;
dbms_output.put_line('h2');
end;
/
The CREATE TABLE you post for table D2 has no column called 'image'. It would help somewhat if you posted a working example.
Also I am not clear why the INSERT INTO D3 statement in the anonymous block needs to be dynamic. -
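One possible workaround, offered as an untested sketch against the posted D2/D3 example: reference the bind only once by pushing it through an inline view, so the USING clause needs a single element per execution:

```sql
-- Sketch: the placeholder :a appears exactly once, inside an inline view.
sqlStmt := 'insert into d3 (rn, v1, d1, n1, v2)
            select to_number(rtrim(substr(s.img, 1,10)))
                 , rtrim(substr(s.img,11,20))
                 , to_date(rtrim(substr(s.img,30,34)),''DD-MON-YYYY HH24:MI:SS'')
                 , to_number(rtrim(substr(s.img,65,10)))
                 , rtrim(substr(s.img,75,10))
              from (select :a as img from dual) s';

FORALL i IN cList.FIRST .. cList.LAST SAVE EXCEPTIONS
   EXECUTE IMMEDIATE sqlStmt USING cList(i);   -- one bind element per execution
```

In native dynamic SQL, placeholders are positional, so every occurrence needs its own USING element even when the names match; collapsing the statement to a single occurrence sidesteps that.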
PL/SQL using BULK COLLECT and MERGE
what i am trying to do is to use bulk collect to create an array of row data, then loop through the array and either insert or update a table, hence, merge:
FORALL i in ID.first .. ID.last SAVE EXCEPTIONS
MERGE INTO table1 t USING (
select ID(i) ID, {other array fields...} from dual) s
ON t.ID = s.ID
WHEN MATCHED THEN UPDATE...
WHEN NOT MATCHED THEN INSERT ...
The problem is that Oracle always does an INSERT. Has anyone had the same problem? Any workaround?
Thanks.
In the package header:
TYPE ID_TYPE IS TABLE OF table1.ID%TYPE;
in the proc, I declared
ID ID_TYPE;
then a bulk collect:
select * from .. BULK COLLECT INTO ID...
In addition, I truncated the destination table and ran the proc; all records were inserts. Then I ran the same proc again, expecting all records to be updated, but inserts occurred again, causing exceptions due to violations of unique keys.
Message was edited by:
zliao01 -
Oracle 9.2.0.6
Is there any way to bulk collect into a varray where the index is part of what is returned by the bulk collect?
declare
type tv is varray(5) of int;
tvList tv := tv(0,0,0,0,0);
begin
select l,v bulk collect into tvList
from
(select 1 as l,2 as v from dual
union
select 2,3 from dual);
end;
also tried:
declare
type tv is varray(5) of int;
tvList tv := tv(0,0,0,0,0);
begin
select l,v bulk collect into tvList(l)
from
(select 1 as l,2 as v from dual
union
select 2,3 from dual);
end;
neither worked.
Thanks
You can't do that; bulk collect only copies into consecutive entries, indexed by an integer.
If you use varrays, all the return values must fit in the varray's declared size. Elements are inserted starting at index 1, overwriting any existing elements.
See http://download-uk.oracle.com/docs/cd/B19306_01/appdev.102/b14261/tuning.htm#sthref2213
The workaround is to
- bulk collect into an associative array indexed by binary_integer/pls_integer
- after the bulk collect, copy the data into your varray
- do this in batches using BULK COLLECT ... LIMIT ...
Sounds painful, but the bulk collect can really save time on context switching between PL/SQL and SQL.
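A rough sketch of that workaround, reusing the poster's example data (the collection names are illustrative):

```sql
DECLARE
   TYPE tv      IS VARRAY(5) OF INT;
   TYPE int_tab IS TABLE OF INT INDEX BY PLS_INTEGER;  -- associative arrays
   tvList tv := tv(0,0,0,0,0);
   lList  int_tab;   -- collected target indexes
   vList  int_tab;   -- collected values
BEGIN
   SELECT l, v BULK COLLECT INTO lList, vList
     FROM ( SELECT 1 AS l, 2 AS v FROM dual
            UNION
            SELECT 2, 3 FROM dual );
   -- copy into the varray at the positions given by the query
   FOR i IN 1 .. lList.COUNT LOOP
      tvList( lList(i) ) := vList(i);
   END LOOP;
END;
```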
HTH
Regards Nigel -
Help with CISCO-887VA adsl over pots and PPPoE with dynamic IP
Hi
I've got a problem trying to connect the Cisco 887 VDSL/ADSL-over-POTS router to the internet. I only got the LAN part working.
I'm trying to set up PPPoE with a dynamic IP.
I followed Cisco's documentation, but the commands used were not recognized by the router. Any simple working config for me to follow will be enough.
I'll appreciate any help. Thanks a lot!
here's my config.
! Last configuration change at 08:31:51 UTC Sat Feb 11 2012
version 15.1
no service pad
service timestamps debug datetime msec
service timestamps log datetime msec
no service password-encryption
hostname router
boot-start-marker
boot-end-marker
no aaa new-model
memory-size iomem 10
crypto pki token default removal timeout 0
ip source-route
ip dhcp excluded-address 10.0.0.1 10.0.0.149
ip dhcp excluded-address 10.0.0.199 10.0.0.254
ip dhcp pool sdm-pool
import all
network 10.0.0.0 255.255.255.0
default-router 10.0.0.1
dns-server x.x.x.x x.x.x.x.x
lease 0 2
ip cef
no ipv6 cef
license udi pid CISCO887VA-K9 sn FGLxxxxxxx
controller VDSL 0
ip ftp username cisco
ip ftp password cisco
interface Ethernet0
pppoe enable group global
pppoe-client dial-pool-number 1
no ip address
shutdown
interface ATM0
no ip address
no atm ilmi-keepalive
pvc 0/35
pppoe-client dial-pool-number 1
interface FastEthernet0
no ip address
interface FastEthernet1
no ip address
interface FastEthernet2
no ip address
interface FastEthernet3
no ip address
interface Vlan1
ip address 10.0.0.1 255.255.255.0
ip nat inside
ip directed-broadcast
ip virtual-reassembly in
ip tcp adjust-mss 1452
interface Dialer1
mtu 1492
ip address negotiated
ip nat outside
ip virtual-reassembly in
encapsulation ppp
dialer pool 1
dialer-group 1
ppp authentication chap pap callin
ppp chap hostname xxxx
ppp chap password 0 xxxx
ppp pap sent-username xxxx password 0 xxxx
ip forward-protocol nd
no ip http server
no ip http secure-server
ip nat inside source list 1 interface Dialer1 overload
ip route 0.0.0.0 0.0.0.0 Dialer1
ip access-list standard 1
permit 10.0.0.0 0.0.0.255
no cdp run
line con 0
line aux 0
line vty 0 4
login
transport input all
end
Try to check with your ISP which modem string to use for VDSL;
some ISPs support direct DHCP on Ethernet0 without PPPoE.
An equivalent config is working for me in Switzerland with Swisscom.
N.B. "modem" under the VDSL controller is enabled using service internal!
service internal
controller VDSL 0
operating mode vdsl2
modem co5
ip source-route
ip cef
ip dhcp excluded-address 10.0.0.1 10.0.0.149
ip dhcp excluded-address 10.0.0.199 10.0.0.254
ip dhcp pool sdm-pool
import all
network 10.0.0.0 255.255.255.0
default-router 10.0.0.1
dns-server 8.8.8.8
lease 0 2
interface Ethernet0
ip address dhcp
ip nat outside
interface Vlan1
ip address 10.0.0.1 255.255.255.0
ip nat inside
ip tcp adjust-mss 1452
ip nat inside source list 23 interface Ethernet0 overload
access-list 23 permit 10.0.0.0 0.0.0.255
end -
Can't find photos in iOS8 (yes, I checked Collections and played with Settings)
Hello,
I recently inherited an iPhone 5s from someone who upgraded to a 6. I activated the 5s with my Apple ID and restored it from the latest backup of my iPhone 4 (made just moments before restoring the 5s). Like many other users, I had a hard time finding my photos. I browsed online for different solutions: I switched Photo Stream off/on in Settings, switched the iCloud Photo Library beta on/off, switched the photo summary off, and went into Collections, but I still do not see my photos.
I connected the iPhone to my Windows 7 computer and iTunes found some photos but not all of them (about 130 out of 250). All of the photos are still available in the Photo Stream on my iPhone 4.
Any other suggestions to get all of my photos into the 5s, please?
Razmee,
Thank you for the link, but I'm not sure what exactly you're pointing me to. I had already tried the options in that article, and I just turned on Photo Library Beta again and it still doesn't bring in all my photos. Are you telling me to contact Apple? -
Dear Guru
1) I have some documents about collection of 10g
and example of Forall function.
2) Question: I have procedure called Test_ps
How to see the source code of the procedure
A : User_source
But I want to see what parameters the procedure takes. Is there any option for that?
Thanks in advance.
You can use DESC <Procedure_Name> to see the list of arguments:
PRAZY@11gR1> create or replace procedure test_proc(a number,b number) is
2 begin
3 null;
4 end;
5 /
Procedure created.
Elapsed: 00:00:00.01
PRAZY@11gR1> select text from user_source where name='TEST_PROC' order by line;
TEXT
procedure test_proc(a number,b number) is
begin
null;
end;
Elapsed: 00:00:00.01
PRAZY@11gR1> desc test_proc;
PROCEDURE test_proc
Argument Name                  Type                    In/Out Default?
------------------------------ ----------------------- ------ --------
A                              NUMBER                  IN
B                              NUMBER                  IN
Regards,
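If you need the parameter list programmatically rather than from DESC output, the USER_ARGUMENTS data dictionary view exposes it. A rough sketch for the same TEST_PROC:

```sql
-- Parameter names, datatypes and modes for a given procedure,
-- queried from the data dictionary instead of using DESC.
SELECT argument_name, data_type, in_out, position
FROM   user_arguments
WHERE  object_name = 'TEST_PROC'
ORDER  BY position;
```

The same query against ALL_ARGUMENTS (with an OWNER filter) works for procedures in other schemas.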
Prazy -
DataGrid and Charts with dynamic XML
First of all, forgive me if this has already been answered.
The charting examples you see in the Flex documentation all
use the same familiar ArrayCollection, and binds it to a DataGrid,
Bar Chart, Area Chart, etc. However, all the grids and charts
already know the structure of the data because the mxml contains
field like yField="month" displayName="Month".
What I am trying to do is make everything dynamic. I am
converting an XML structure to an ArrayCollection and making it the
dataProvider, but all I end up with is the column headers. The
actual data doesn't populate the rows of the DataGrid.
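For what it's worth, when the structure is only known at run time the columns usually have to be created in code; a minimal sketch (the names `myGrid`, `myData` and `buildColumns` are assumptions, not from the original post):

```actionscript
// Build DataGrid columns from the first item of the ArrayCollection,
// so the grid adapts to whatever fields the XML happened to contain.
private function buildColumns(myData:ArrayCollection):void {
    var cols:Array = [];
    var first:Object = myData.getItemAt(0);
    for (var field:String in first) {
        var col:DataGridColumn = new DataGridColumn(field);
        col.dataField = field; // without dataField the rows stay empty
        cols.push(col);
    }
    myGrid.columns = cols;
    myGrid.dataProvider = myData;
}
```

A missing dataField is a common reason a dynamically built grid shows only column headers with no row data.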
Beyond that. I want to take that same ArrayCollection I
created from dynamic XML and use it as the dataProvider for any of
the charts included with Flex Charting. Basically, a way to
visualize the same data in any of the Flex charting components via
a drop down.
Has anybody got this to work?
Without actually seeing the file, I would say check your content-type and make sure it is 'text/xml'.
You can see examples of setting the content type here:
http://labs.adobe.com/technologies/spry/samples/utils/query2xml.html
You can sometimes tell just by viewing the dynamic XML in the
browser. If it shows as a XML tree, all is well. If it is a big
blob of text, it's probably content-type...
Let us know...
Don -
HELP Filtering and IReport with Dynamic Actions
Hi all,
Now that I've upgraded to 4.0, I'm trying to make more use of Dynamic Actions. I followed the example at: http://anthonyrayner.blogspot.com/2010/07/report-filtering-with-apex-40-dynamic.html and it works fine. The thing is that it is based on a select list which kind of takes away from the effect. What I want is to have an interactive report which is filtered by the value in a text box. I want to use a dynamic action that when a user types a search string, the IR is filtered down based on each character typed. I used the "Change" Event on the text box and the Refresh action on the report region for the IR, but it only refreshes when I "Enter" (submit). Any ideas on how I can get an AJAX-like action using dynamic actions?
Thanks!!!
Hi,
See this post
APEX 3.2 Refresh interactive report
It works also for Apex 4.
You can call javascript that set your item session state and refresh IR like in that post
Regards,
Jari -
Safari and Quictime with dynamic innerHTML
I'm working on a website that is doing some cool things with Safari and QuickTime. The site dynamically updates the media window with a new playlist using Ajax; however, redrawing the new QuickTime object in the same div causes multiple QuickTime players to overlap and play.
Safari is the only browser that has this issue; all other browsers drop the previous QuickTime object and create a new one. Even if I set the innerHTML to null, it seems that Safari wants to keep running the player.
You might be better served posting this in the developer forums.
-
Policy-based Tunnel Selection and PBTS with Dynamic Tunnel Selection option
Unfortunately I can't test this feature on the GSR 12k platform.
RP/0/0/CPU0:ios(config)#show config failed
!! SEMANTIC ERRORS: This configuration was rejected by
!! the system due to semantic errors. The individual
!! errors with each failed configuration command can be
!! found below.
interface tunnel-te1
policy-class 1
!!% The requested operation is not supported: Feature not supported on this platform
end
What is the difference between Dynamic Tunnel Selection and ordinary PBTS? It would be nice to see a real-world example.
PBTS with DTS is supported only on the Cisco XR 12000 Series Router... you should rather test it on the 12K platform.
-
New and need help - drag and drop with dynamic text
So I'm doing this project, and as an animator I'm not familiar with the whole ActionScript side of Flash.
Okay, so far I've managed to create the whole drag-and-drop feature and it works well. The thing is, I want to make it so that when you drag an object into the correct spot, new text appears, and I need about six different objects with dynamic text. But I have no idea how to integrate it into my code or where I should start!
I based my code on a tutorial, so there's some dynamic text code in there, but not exactly what I wanted.
Your help would be much appreciated!
This is my code:
var counter:Number = 0;
var startX:Number;
var startY:Number;
six_mc.addEventListener(MouseEvent.MOUSE_DOWN, pickUp);
six_mc.addEventListener(MouseEvent.MOUSE_UP, dropIt);
five_mc.addEventListener(MouseEvent.MOUSE_DOWN, pickUp);
five_mc.addEventListener(MouseEvent.MOUSE_UP, dropIt);
four_mc.addEventListener(MouseEvent.MOUSE_DOWN, pickUp);
four_mc.addEventListener(MouseEvent.MOUSE_UP, dropIt);
three_mc.addEventListener(MouseEvent.MOUSE_DOWN, pickUp);
three_mc.addEventListener(MouseEvent.MOUSE_UP, dropIt);
two_mc.addEventListener(MouseEvent.MOUSE_DOWN, pickUp);
two_mc.addEventListener(MouseEvent.MOUSE_UP, dropIt);
one_mc.addEventListener(MouseEvent.MOUSE_DOWN, pickUp);
one_mc.addEventListener(MouseEvent.MOUSE_UP, dropIt);
function pickUp(event:MouseEvent):void {
    event.target.startDrag(true);
    reply_txt.text = "";
    event.target.parent.addChild(event.target);
    startX = event.target.x;
    startY = event.target.y;
}
function dropIt(event:MouseEvent):void {
    event.target.stopDrag();
    var myTargetName:String = "target" + event.target.name;
    var myTarget:DisplayObject = getChildByName(myTargetName);
    if (event.target.dropTarget != null && event.target.dropTarget.parent == myTarget) {
        reply_txt.text = "Good Job!";
        event.target.removeEventListener(MouseEvent.MOUSE_DOWN, pickUp);
        event.target.removeEventListener(MouseEvent.MOUSE_UP, dropIt);
        event.target.buttonMode = false;
        event.target.x = myTarget.x;
        event.target.y = myTarget.y;
        counter++; // count each correctly placed object
    } else {
        reply_txt.text = "Try Again!";
        event.target.x = startX;
        event.target.y = startY;
    }
    if (counter == 6) {
        reply_txt.text = "Congrats, you're finished!";
        six_mc.buttonMode = true;
        five_mc.buttonMode = true;
        four_mc.buttonMode = true;
        three_mc.buttonMode = true;
        two_mc.buttonMode = true;
        one_mc.buttonMode = true;
    }
}
where you have
xxx.text = ....
is where you're assigning text.
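To give each of the six objects its own text on a correct drop, one approach is a lookup keyed by instance name. This is a sketch; the messages and the assignment shown for the success branch of dropIt are assumptions about your setup:

```actionscript
// Hypothetical per-object messages, keyed by the movie clip's instance name.
var messages:Object = {
    one_mc:   "Text for object one",
    two_mc:   "Text for object two",
    three_mc: "Text for object three",
    four_mc:  "Text for object four",
    five_mc:  "Text for object five",
    six_mc:   "Text for object six"
};
// In the success branch of dropIt, look the message up by name
// instead of assigning the fixed "Good Job!" string:
reply_txt.text = messages[event.target.name];
```

Because event.target.name matches the instance name, no per-object if/else chain is needed.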