Associative array Basics
I have a SELECT query like:
SELECT ename, sal FROM emp;
I want the ename column values from the output of the above query to be populated into an associative array, so I did the following:
DECLARE
TYPE ename_type IS TABLE OF VARCHAR2(10) INDEX BY PLS_INTEGER;
v_ename ename_type;
What is the next step by which I could populate the v_ename variable with the ename column values from the above query?
One slight problem.
The ename is derived from a large SELECT query joining three tables. The SELECT query looks like:
SELECT ename, max(sal) FROM ...............;
I only need the ename values from this SELECT query to be populated into the associative array (v_ename), but I cannot exclude max(sal) from the query for aggregate-function reasons (GROUP BY and so on).
So when I try to populate the associative array v_ename by issuing the following command:
SELECT ename BULK COLLECT INTO v_ename FROM (SELECT ename, max(sal) FROM ...............);
the max(sal) value in the SELECT query is going to be a problem. How do I solve this?
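One way forward, sketched below (the emp table and GROUP BY clause stand in for the real three-table join): project only ename in the outer SELECT, so max(sal) stays inside the inline view and never reaches the collection. BULK COLLECT fills an INDEX BY PLS_INTEGER array densely from index 1.

```sql
DECLARE
  TYPE ename_type IS TABLE OF VARCHAR2(10) INDEX BY PLS_INTEGER;
  v_ename ename_type;
BEGIN
  -- The outer query selects only ename; the aggregate stays in the inline view
  SELECT ename
    BULK COLLECT INTO v_ename
    FROM (SELECT ename, MAX(sal) AS max_sal
            FROM emp              -- placeholder for the real three-table join
           GROUP BY ename);
  DBMS_OUTPUT.PUT_LINE('Fetched ' || v_ename.COUNT || ' names');
END;
/
```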
Similar Messages
-
Associative Array (Object) problems
Here is the function I'm dealing with. I'm reading in a delimited string and using indexed arrays to break it up and assign the keys and values to an associative array in a loop.
I'm using variables in the loop, and the array loads as expected inside the loop, but outside the loop the only key is the variable name and the value is undefined. This is true using dot or array notation, as well as literal strings for the keys.
Any help is appreciated.
watchSuspendData = function (id, oldval, newval):String {
    // the incoming suspendData string is delimited by the semicolon
    // newval is: firstValue=Yes;captivateKey=1
    var listSuspendData:Array = newval.split(";"); // convert it to a list of key/value pairs
    if (listSuspendData.length > 0) {
        // line 123: listSuspendData.length is: 2
        for (i = 0; i < listSuspendData.length; i++) { // for each key/value pair
            var keyValArray:Array = new Array();
            var myNameValue:String = listSuspendData[i];
            // line 127: listSuspendData[i] is: firstValue=Yes
            keyValArray = myNameValue.split("="); // split 'em on the equal sign
            var myKey:String = keyValArray[0];
            var myVal:String = keyValArray[1];
            // keyValArray[0] is: firstValue
            // keyValArray[1] is: Yes
            // store the key and the value in associative array
            suspendDataArray.myKey = myVal;
            trace("line 134: suspendDataArray is: " + suspendDataArray.myKey);
            // trace is "line 134: suspendDataArray is: Yes" on the first pass and 1 on the second
        }
    }
    // the below loop always returns one array key, myKey, and the value as undefined
    for (x in suspendDataArray) {
        trace("x is: " + x); // x is: myKey
        trace("the val is: " + suspendDataArray.x); // the val is: undefined
    } // end for
    return newval;
};
On lines 12-13 I assign the key=value pair to string variables, then on lines 17-18 I assign those values to the associative array using dot notation, and the trace seems to work there. The problem is that when the procedure exits the for loop, the associative array only has one key (myKey) and no value (undefined).
All the documentation I've read shows using these types of arrays with either non-quoted property names, like:
myAssocArray.myKey = "somevalue";
or
myAssocArray[myKey] = "somevalue";
I tried assigning the key/value pairs directly from the indexed arrays, but the result was always undefined, like this:
suspendDataArray.keyValArray[0] = keyValArray[1]
or
suspendDataArray[keyValArray[0]] = keyValArray[1]
I even tried building a string in the loop and trying to assign all the pairs at once using the curly brace.
This is pretty weird behavior for ActionScript, or I'm missing something basic here.
Thanks for looking -
Without loops how can I read data from an associative array?
Hi all,
I am facing a scenario like this:
I need to read data from an associative array without using loops. Is it possible?
CREATE OR REPLACE PACKAGE BODY test_pkg IS
TYPE t1 IS TABLE OF NUMBER INDEX BY BINARY_INTEGER;
-- in array we can expect more than one row or sometimes no data also.
FUNCTION test1(vt1 T1 DEFAULT CAST(NULL AS t1)) RETURN NUMBER IS
BEGIN
-- basically in array we'll get data of column2
-- this loop should satisfies table1.colum2 = nvl(NULL, table2.colum2 )if array is null.
-- if array is not null then only compare with array values
FOR i IN (SELECT t1.colum1, t1.column2
FROM table1 t1, table1 t2
WHERE t1.colum1 = t2.column1
AND t1.colum2 = nvl(vt1, t2.colum2))
LOOP
generateTEXT(i.colum1, i.colum2);
END LOOP;
END test1;
END test_pkg;
In table1 we have data like:
colum1 column2
Jan 1
Feb 2
Mar 3
If I call select test_pkg.test1(1) from dual then the output should be Jan,
and
select test_pkg.test1(null) from dual should display all elements from table1:
Jan 1
Feb 2
Mar 3
Thanks for your quick reply.
i need to read data from associative array without using loops, is it possible?
No - you would need to create a SQL type and then use the TABLE operator to unnest the collection.
create or replace TYPE my_nums IS TABLE OF INTEGER;
DECLARE
-- TYPE my_nums IS TABLE OF PLS_INTEGER INDEX BY PLS_INTEGER;
v_nums my_nums := my_nums(1, 2, 3);
v_total number;
BEGIN
select sum(column_value) into v_total from table(v_nums);
DBMS_OUTPUT.PUT_LINE
('Sum of the numbers is ' || TO_CHAR(v_total));
END;
Sum of the numbers is 6 -
Parsing XMLType into an associative array
I'm trying to build a procedure that can take an XMLType as a parameter. What I would like to do is be able to extract both the node names and the string values from each tag of the XML and put them into basically a key --> value associative array.
Ex:
<envelope>
<node1>value1</node1>
<node2>value2</node2>
</envelope>
Would be put into an array structured like: array['node1'] = 'value1', array['node2'] = 'value2', etc.
I'm not entirely sure how I go about looping through the XMLType and extracting each node name and its value. I'm not worried about repeating tag names.. I know that will not be an issue in any of the XMLTypes that I'm using.
Would anyone be able to offer any suggestions? If not how to go about doing exactly that, then at least some insight as to how I can loop through and extract nodes and/or values without specifying the specific XPaths?
Thanks in advance!
Hi Roger,
This is a good description of your process. Just add your Oracle DB version somewhere. What you want is a method to map your XML to a relational design (a special variant of object-relational modelling).
I still see two choices:
The programmatic approach (as you described):
Have multiple procedures where you input an XML. Each procedure is reflecting a different part of the transformation process. So it depends from the Message which procedure you will call.
Example
SQL*Plus: Release 9.2.0.2.0 - Production on Thu Aug 14 17:53:25 2008
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Connected to:
Oracle9i Enterprise Edition Release 9.2.0.8.0 - 64bit Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.8.0 - Production
SQL> create table testtab (username varchar2(100));
Table created.
/* the following insert part should be in some packaged procedure; */
/* it just doesn't work on my 9i version, but should in 10g */
SQL> insert into testtab (username)
2 select substr(extractvalue(column_value,'*/text()'),1,100)
3 from table(xmlsequence(extract( xmltype('<envelope>'||
4 '<username>Roger</username>'||
5 '<username>Sven</username>'||
6 '</envelope>'),'envelope/username')));
2 rows created.
SQL> select * from testtab;
USERNAME
Roger
Sven
SQL>
The declarative approach:
Insert the XML into a SchemaBased Table. The XML Schema will provide the mapping to your relational tables automagically. Problem here is the writing, handling and evolving of the XML Schema (Sub schemas are allowed too). -
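For the original key --> value question, a hedged sketch on a later release (10g+), using XMLTABLE with name()/text() paths to key a VARCHAR2-indexed array by node name; the XML literal is just the example from the post:

```sql
DECLARE
  TYPE kv_t IS TABLE OF VARCHAR2(4000) INDEX BY VARCHAR2(100);
  v_map kv_t;
  v_key VARCHAR2(100);
BEGIN
  -- walk every child element of <envelope>
  FOR r IN (SELECT x.node_name, x.node_value
              FROM XMLTABLE('/envelope/*'
                     PASSING XMLTYPE('<envelope><node1>value1</node1>'
                                  || '<node2>value2</node2></envelope>')
                     COLUMNS node_name  VARCHAR2(100)  PATH 'name()',
                             node_value VARCHAR2(4000) PATH 'text()') x)
  LOOP
    v_map(r.node_name) := r.node_value;   -- array['node1'] = 'value1', ...
  END LOOP;
  -- iterate the result by key
  v_key := v_map.FIRST;
  WHILE v_key IS NOT NULL LOOP
    DBMS_OUTPUT.PUT_LINE(v_key || ' => ' || v_map(v_key));
    v_key := v_map.NEXT(v_key);
  END LOOP;
END;
/
```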
Associative array related problem
Hi All,
When I am trying to use an associative array in a SELECT statement I receive the following error:
ORA-06550: line 9, column 22:
PLS-00201: identifier 'COL1' must be declared
ORA-06550: line 9, column 22:
PLS-00201: identifier 'COL1' must be declared
ORA-06550: line 9, column 10:
PL/SQL: ORA-00904: : invalid identifier
ORA-06550: line 9, column 3:
PL/SQL: SQL Statement ignored
Here is the example
--create table MyTable (col1 varchar2(255), col2 varchar2(255))
declare
type m_ttMyTable
is table of MyTable%rowtype index by MyTable.col1%type;
m_tMyTable m_ttMyTable;
m_sCol2 varchar2(255);
begin
select m_tMyTable(col1).col2 /* works with a constant: select m_tMyTable('col1').col2 */
into m_sCol2
from MyTable
where rownum = 1;
end;
--drop table MyTable
Any ideas how to work around this?
Thanks
The only collection types SQL can query are ones defined in SQL using CREATE TYPE. That excludes associative arrays, as they are PL/SQL-only constructs. I'd recommend a nested table collection.
Some more suggestions:
www.williamrobertson.net/documents/collection-types.html -
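A minimal sketch of the suggested workaround (key_list_t is an illustrative name): define the collection with CREATE TYPE so the SQL engine can see it, then unnest it with the TABLE operator instead of indexing an associative array inside the query.

```sql
-- SQL-level type, visible to the SQL engine (unlike an associative array)
CREATE OR REPLACE TYPE key_list_t IS TABLE OF VARCHAR2(255);
/
DECLARE
  v_keys key_list_t := key_list_t('a', 'b');
  v_col2 MyTable.col2%TYPE;
BEGIN
  SELECT t.col2
    INTO v_col2
    FROM MyTable t
   WHERE t.col1 IN (SELECT column_value FROM TABLE(v_keys))
     AND ROWNUM = 1;
END;
/
```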
Associative Array our only option?
Hello,
I'm having a problem accepting associative arrays as the only option I have for getting data from a stored procedure. I have a good reason for not wanting to use ref cursors, as I am using the stored procedure to manipulate data which I in turn would like to pass back to VB through the stored procedure, and would rather not have to insert the data into a table just to re-select it for a ref cursor.
My main concern is that with associative arrays I am expected to define the number of return results before I even generate the data. Also, from what I can see, I am required to set the data length for each and every item in said array one at a time. All this overhead seems like more work than what I would have to do to utilize a reference cursor. Is there a right way to do this? I would really like to do this in the most straightforward way I can without the extra processing.
Hi,
Here's a blog post of mine that illustrates using pipelined functions and PL/SQL to return results:
http://oradim.blogspot.com/2007/10/odpnet-tip-using-pipelined-functions.html
Not sure if that will be helpful in your case, but perhaps it might be a place to start anyway.
- Mark -
I am using a couple of associative arrays in my code and comparing the data in one, and if it is an asterisk, I change it to use the data in the other. Here is the meat of my code. I am running into an error at the bolded line saying I have too many values, which I don't understand because the code is exactly the same as the block of code right before it, where I populate the first array. FYI, the table it is pulling from only has one row. The error is listed below the code.
Code
DECLARE
TYPE refresh_file_t IS TABLE OF test.loading_dock%ROWTYPE ;
refresh_data refresh_file_t ;
prospect_data refresh_file_t ;
TYPE CV_TYPE IS REF CURSOR ;
c_id CV_TYPE ;
v_id NUMBER(10) ;
v_phone VARCHAR2(10) ;
v_project VARCHAR2(10) ;
BEGIN
OPEN c_id FOR
'SELECT id
FROM test.loading_dock
WHERE rownum = 1' ;
LOOP
FETCH c_id INTO v_id ;
EXIT WHEN c_id%NOTFOUND ;
SELECT * BULK COLLECT
INTO refresh_data
FROM test.loading_dock
WHERE id = v_id ;
SELECT * BULK COLLECT
INTO prospect_data
FROM test.prospects
WHERE id_number = v_id ;
IF refresh_data(1).home_phone = '*' THEN
v_phone := prospect_data(1).phone ;
ELSE
v_phone := refresh_data(1).home_phone ;
END IF ;
DBMS_OUTPUT.PUT_LINE(v_phone) ;
END LOOP ;
CLOSE c_id ;
END ;
Error
ORA-06550: line 29, column 13:
PL/SQL: ORA-00913: too many values
ORA-06550: line 27, column 13:
PL/SQL: SQL Statement ignored
ORA-06550: line 34, column 46:
PLS-00302: component 'PHONE' must be declared
ORA-06550: line 34, column 13:
PL/SQL: Statement ignored
06550. 00000 - "line %s, column %s:\n%s"
Collection prospect_data is of type refresh_file_t, which is a table of records of test.loading_dock%ROWTYPE. Most likely tables test.loading_dock and test.prospects have different structures - test.prospects has fewer columns. So when you try to fetch from test.prospects into prospect_data, you get the error. Try replacing
prospect_data refresh_file_t ;
with
TYPE prospect_data_t IS TABLE OF test.prospects%ROWTYPE ;
prospect_data prospect_data_t ;
SY. -
Hi All,
I've searched through this forum trying to find information I'm needing on associative arrays with a varchar2 index without luck. What I'm looking for is a way to get the index or "key" values of the array without knowing what they are. Meaning, I wouldn't have to know the index value when designing the array but would be able to utilize them values at runtime. For those familiar with Java it would be like calling the keySet() method from a Map object.
So, if I have an array of TYPE COLUMN_ARRAY IS TABLE OF VARCHAR2(4000) INDEX BY VARCHAR2(100) is there any way to dynamically get the index values without knowing what they are?
Any help is appreciated.
Thanks
Thanks for the response.
I am aware of using FIRST and NEXT for iterating the array, but can I extract the index value of the current element into a variable when I don't know what the index value is at runtime?
Thanks -
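For reference, FIRST and NEXT themselves return the key, so the index value of the current element is available in a variable at runtime without knowing it in advance; a minimal sketch with the COLUMN_ARRAY type from the post (the sample keys/values are illustrative):

```sql
DECLARE
  TYPE COLUMN_ARRAY IS TABLE OF VARCHAR2(4000) INDEX BY VARCHAR2(100);
  v_cols COLUMN_ARRAY;
  v_key  VARCHAR2(100);         -- receives each key, like Java's keySet()
BEGIN
  v_cols('ename') := 'SMITH';
  v_cols('job')   := 'CLERK';
  v_key := v_cols.FIRST;        -- lowest key in sort order
  WHILE v_key IS NOT NULL LOOP
    DBMS_OUTPUT.PUT_LINE('key=' || v_key || ' value=' || v_cols(v_key));
    v_key := v_cols.NEXT(v_key);  -- the key after v_key
  END LOOP;
END;
/
```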
Associative Array vs Table Scan
Still new to PL/SQL, but very keen to learn. I wondered if somebody could advise me whether I should use a collection (such as an associative array) instead of repeating a table scan within a loop for the example below. I need to read from an input table of experiment data and if the EXPERIMENT_ID does not already exist in my EXPERIMENTS table, then add it. Here is the code I have so far. My instinct is that it my code is inefficient. Would it be more efficient to scan the EXPERIMENTS table only once and store the list of IDs in a collection, then scan the collection within the loop?
-- Create any new Experiment IDs if needed
open CurExperiments;
loop
-- Fetch the explicit cursor
fetch CurExperiments
into vExpId, dExpDate;
exit when CurExperiments%notfound;
-- Check to see if already exists
select count(id)
into iCheckExpExists
from experiments
where id = vExpId;
if iCheckExpExists = 0 then
-- Experiment ID is not already in table so add a row
insert into experiments
(id, experiment_date)
values(vExpId, dExpDate);
end if;
end loop;
Except that rownum is assigned after the result set is computed, so the whole table will have to be scanned.
really?
SQL> explain plan for select * from i;
Explained.
SQL> select * from table( dbms_xplan.display );
PLAN_TABLE_OUTPUT
Plan hash value: 1766854993
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 910K| 4443K| 630 (3)| 00:00:08 |
| 1 | TABLE ACCESS FULL| I | 910K| 4443K| 630 (3)| 00:00:08 |
8 rows selected.
SQL> explain plan for select * from i where rownum=1;
Explained.
SQL> select * from table( dbms_xplan.display );
PLAN_TABLE_OUTPUT
Plan hash value: 2766403234
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 1 | 5 | 2 (0)| 00:00:01 |
|* 1 | COUNT STOPKEY | | | | | |
| 2 | TABLE ACCESS FULL| I | 1 | 5 | 2 (0)| 00:00:01 |
Predicate Information (identified by operation id):
1 - filter(ROWNUM=1)
14 rows selected. -
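For the original question, both the loop and the collection can often be avoided entirely; a hedged sketch of a single set-based statement, assuming the cursor's source table and columns (experiment_input, exp_id, exp_date are illustrative names, not from the post):

```sql
-- One statement replaces the open/fetch/exists-check/insert loop
INSERT INTO experiments (id, experiment_date)
SELECT DISTINCT i.exp_id, i.exp_date
  FROM experiment_input i
 WHERE NOT EXISTS (SELECT NULL
                     FROM experiments e
                    WHERE e.id = i.exp_id);
```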
Associative array type for each blob column in the table
I am using the code at the given link:
http://www.oracle.com/technology/oramag/oracle/07-jan/o17odp.html
I changed that code like this:
CREATE TABLE JOBS
(
JOB_ID VARCHAR2(10 BYTE),
JOB_TITLE VARCHAR2(35 BYTE),
MIN_SALARY NUMBER(6),
MAX_SALARY NUMBER(6),
JOBPIC BLOB
);
CREATE OR REPLACE PACKAGE associative_array
AS
-- define an associative array type for each column in the jobs table
TYPE t_job_id IS TABLE OF jobs.job_id%TYPE
INDEX BY PLS_INTEGER;
TYPE t_job_title IS TABLE OF jobs.job_title%TYPE
INDEX BY PLS_INTEGER;
TYPE t_min_salary IS TABLE OF jobs.min_salary%TYPE
INDEX BY PLS_INTEGER;
TYPE t_max_salary IS TABLE OF jobs.max_salary%TYPE
INDEX BY PLS_INTEGER;
TYPE t_jobpic IS TABLE OF jobs.jobpic%TYPE
INDEX BY PLS_INTEGER;
-- define the procedure that will perform the array insert
PROCEDURE array_insert (
p_job_id IN t_job_id,
p_job_title IN t_job_title,
p_min_salary IN t_min_salary,
p_max_salary IN t_max_salary,
p_jobpic IN t_jobpic
);
END associative_array;
CREATE OR REPLACE package body SHC_OLD.associative_array as
-- implement the procedure that will perform the array insert
procedure array_insert (p_job_id in t_job_id,
p_job_title in t_job_title,
p_min_salary in t_min_salary,
p_max_salary in t_max_salary,
P_JOBPIC IN T_JOBPIC
) is
begin
forall i in p_job_id.first..p_job_id.last
insert into jobs (job_id,
job_title,
min_salary,
max_salary,
JOBPIC
)
values (p_job_id(i),
p_job_title(i),
p_min_salary(i),
p_max_salary(i),
P_JOBPIC(i)
);
end array_insert;
end associative_array;
This procedure is called from .NET. From .NET, is sending a blob possible or not? If yes, how?
Ok, that won't work...you need to generate an image tag and provide the contents of the blob column as the src for the image tag.
If you look at my blog entry -
http://jes.blogs.shellprompt.net/2007/05/18/apex-delivering-pages-in-3-seconds-or-less/
and download that Whitepaper that I talk about you will find an example of how to do what you want to do. Note the majority of that whitepaper is discussing other (quite advanced) topics, but there is a small part of it that shows how to display an image stored as a blob in a table. -
I’m currently working on a system that allows the users to upload an Excel spreadsheet (.xls) in the system. The upload page is a PL/SQL cartridge. Then I’ve written a Java servlet (using Oracle Clean Content) to convert the XLS into a CSV and store it back in the database. (it is stored in the “uploaded_files” table as a blob). I’m trying to create another procedure to read the contents of the blob and display a preview of the data on the screen (using an html table (will be done using cartridge)). After the preview, the user can choose to submit the data into the database into the “detail_records” table or simply discard everything.
I've been trying to use an associative array to grab all the data from the blob but I’m getting confused about how to implement it in my situation.
Can someone provide any examples of this nature?
Any help is greatly appreciated.
I decided to create a "record" type with all the columns from my Excel spreadsheet. Then I will create a table type of those records.
I am doing something like this:
declare
type s_record is record
(l_name varchar2(100),
f_code varchar2(4),
l_code varchar2(6),
d_date varchar2(5),
d_type varchar2(5),
price number,
volume number,
tax number,
amount_paid number
);
type s_data_tab is table of s_record index by binary_integer;
v_s_data s_data_tab;
v_indx binary_integer := 0;
begin
null;
end;
I am getting confused about parsing an entire row of values separated by commas into a row in the temporary table created above.
I know I need a loop, but from what I understand, the way to populate data needs to be something like this, for example:
for v_indx in 0..data_size loop
v_s_data(v_indx).l_name := 'Company A';
v_s_data(v_indx).f_code := '2700';
end loop;
But I'm not sure how this approach should be used to parse an entire row at once.
Any help appreciated. -
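One way to parse a whole comma-separated row at once is to walk the line with INSTR/SUBSTR and then assign the pieces positionally into the record fields; a hedged sketch (the sample line and field order are assumptions):

```sql
DECLARE
  v_line VARCHAR2(4000) := 'Company A,2700,ABC123,01/05,T1,10.5,100,2.1,1050';
  TYPE t_fields IS TABLE OF VARCHAR2(4000) INDEX BY BINARY_INTEGER;
  v_f    t_fields;
  v_pos  PLS_INTEGER := 0;   -- position of the previous comma
  v_next PLS_INTEGER;
  i      PLS_INTEGER := 0;
BEGIN
  -- split v_line on commas into v_f(1..n)
  LOOP
    v_next := INSTR(v_line, ',', v_pos + 1);
    i := i + 1;
    IF v_next = 0 THEN
      v_f(i) := SUBSTR(v_line, v_pos + 1);   -- last field: take the rest
      EXIT;
    END IF;
    v_f(i) := SUBSTR(v_line, v_pos + 1, v_next - v_pos - 1);
    v_pos := v_next;
  END LOOP;
  -- then assign positionally into the record, e.g.
  -- v_s_data(v_indx).l_name := v_f(1);
  -- v_s_data(v_indx).f_code := v_f(2);
  DBMS_OUTPUT.PUT_LINE('parsed ' || i || ' fields');
END;
/
```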
Associative array binding - poor performance
Dear All,
I have very poor performance when inserting a binary array using associative array binding - that is, when I insert a huge jagged binary array.
the jagged array has
BinarryArray[0][0]........BinarryArray[0][8000]
BinarryArray[1][0]........BinarryArray[1][8000]
BinarryArray[3600][0]........BinarryArray[3600][8000]
BinarryArray[0] - i have 8000 byte end so on. total is 3600 X 8000
that means 28,800KB hence to ~28MB.
The C# code is as follows:
string strInsert = "Insert Into T Values(t.SEQUENCE.currval, :paramArr)";
OracleCommand objCommand = new OracleCommand();
OracleParameter objParam = new OracleParameter("paramArr", OracleDbType.Blob, 8000, System.Data.ParameterDirection.Input, true, 0, 0, "ColumnName", System.Data.DataRowVersion.Current, BinarryArray);
objCommand.CommandText = strInsert;
objCommand.ArrayBindCount = BinarryArray.Length;
objCommand.Parameters.Clear();
objCommand.Parameters.Add(objParam);
objCommand.ExecuteNonQuery();
In general the insertion is good - for each row in the array I get a separate row in the DB - but it works so slowly.
why??????
see the code below
well??
-
Associative array EXISTS() DML
Is there a way to include the EXISTS() method of an associative array in a DML statement? For example:
This is fine...
SQL> declare
2 type distTblTyp is table of number index by varchar2(8);
3 testTbl distTblTyp; testVar varchar2(30);
4 begin
5 testTbl('a') := 1;
6 if (testTbl.exists('a')=true) then
7 null;
8 end if;
9* end;
SQL> /
PL/SQL procedure successfully completed.
But this fails...
SQL> declare
2 type distTblTyp is table of number index by varchar2(8);
3 testTbl distTblTyp; testVar varchar2(30);
4 begin
5 testTbl('a') := 1;
6 select decode(testTbl.exists('a'),true,'a','b')
7 into testVar from dual;
8* end;
SQL> /
select decode(testTbl.exists('a'),true,'a','b')
ERROR at line 6:
ORA-06550: line 6, column 24:
PL/SQL: ORA-01747: invalid user.table.column, table.column, or column
specification
ORA-06550: line 6, column 2:
PL/SQL: SQL Statement ignored
PL/SQL: SQL Statement ignored
Since I won't know which index elements exist until execution time, I want to have an insert statement (ran within a loop) with ~100 field values like:
insert into table [tablename]
values (
decode(testTbl.exists('a'), true, testTbl('a'), null),
decode(testTbl.exists('b'), true, testTbl('b'), null)
Any suggestions appreciated.
As Gerd suggested...
SQL> create or replace package my_test_pk as
2 type my_type is table of number index by varchar2(2);
3 v_my_type my_type;
4 function ifExists(v_i varchar2) return number;
5 procedure runTest;
6 end;
7 /
Package created.
SQL> create or replace package body my_test_pk as
2 function ifExists(v_i varchar2) return number is
3 v_r number;
4 begin
5 v_r := 0;
6 if (v_my_type.exists(v_i)) then v_r := 1; end if;
7 return v_r;
8 end;
9
10 procedure runTest is
11 v_check varchar2(2);
12 begin
13 v_my_type('a') := 1;
14 select decode(ifExists('a'),1,'a',null) into v_check from dual;
15 dbms_output.put_line(v_check);
16 select decode(ifExists('b'),1,'b','N') into v_check from dual;
17 dbms_output.put_line(v_check);
18 end;
19 end;
20 /
Package body created.
SQL> exec my_test_pk.runtest;
a
N
PL/SQL procedure successfully completed.
SQL> -
I have an associative array that is built on a record.
TYPE LKey IS RECORD (
lock_key locks.lock_key%TYPE,
category_id_cd locks.category_id_cd%TYPE);
TYPE LockKeys_t IS TABLE OF LKey
INDEX BY PLS_INTEGER;
v_LockKeys_arr LockKeys_t;
How do I populate the elements of this array one row at a time within my cursor FOR loop? I just want to do something like this:
SELECT v_lock_key, v_category_id_cd INTO v_LockKeys_arr FROM dual;
When I try this I get : "expression v_LockKeys_arr in the INTO list is of wrong type."
Then I will loop through the array and use each element one at a time:
FOR j IN 1..v_LockKeys_arr.COUNT
LOOP
ReleaseLocks( v_LockKeys_arr(j).lock_key||'2:',
v_Lockkeys_arr(j).category_id_cd,
i_User, v_release_set );
END LOOP;
Thank you in advance for your time.
DECLARE
--CREATE RECORD
TYPE CUSTOMER_RECORD IS RECORD (CUSTOMER_ACCT_ID NUMBER,
CUSTOMER_NAME VARCHAR2(2000));
--CREATE TABLE OF RECORD TYPE
TYPE CUSTOMER_REC IS TABLE OF
CUSTOMER_RECORD
INDEX BY BINARY_INTEGER;
--INSTANCE OF RECORD
LREC_CUSTOMER_RECORD CUSTOMER_RECORD;
--INSTANCE OF TABLE
LT_CUSTOMER_REC CUSTOMER_REC;
BEGIN
--ASSIGN VALUES TO INSTANCE OF RECORD
LREC_CUSTOMER_RECORD.CUSTOMER_ACCT_ID:=10;
LREC_CUSTOMER_RECORD.CUSTOMER_NAME:='BHAGAT';
--INSERT INTO TABLE, VALUES FROM RECORD INSTANCE
LT_CUSTOMER_REC(1):=LREC_CUSTOMER_RECORD;
--OUTPUT
DBMS_OUTPUT.PUT_LINE(LT_CUSTOMER_REC(1).CUSTOMER_ACCT_ID);
DBMS_OUTPUT.PUT_LINE(LT_CUSTOMER_REC(1).CUSTOMER_NAME);
END;
/ -
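Applied to the original question, one element can be populated per fetched row with a counter inside a cursor FOR loop - no SELECT ... INTO is needed (the locks cursor here is assumed from the post):

```sql
DECLARE
  TYPE LKey IS RECORD (
    lock_key       locks.lock_key%TYPE,
    category_id_cd locks.category_id_cd%TYPE);
  TYPE LockKeys_t IS TABLE OF LKey INDEX BY PLS_INTEGER;
  v_LockKeys_arr LockKeys_t;
  i PLS_INTEGER := 0;
BEGIN
  FOR r IN (SELECT lock_key, category_id_cd FROM locks) LOOP
    i := i + 1;
    -- referencing a new index creates the element in an associative array
    v_LockKeys_arr(i).lock_key       := r.lock_key;
    v_LockKeys_arr(i).category_id_cd := r.category_id_cd;
  END LOOP;
  DBMS_OUTPUT.PUT_LINE('loaded ' || v_LockKeys_arr.COUNT || ' rows');
END;
/
```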
Associative array comparison and INSERT upon IF condition
Hi Guys,
I have written this PL/SQL code to identify non-existing sellers and insert their sales channel information into the dimension table (dimension table update).
Somehow nothing is inserted, and this script runs for 12+ hours without any result. SQL autotrace shows no result, and the Explain Plan button in SQL Developer throws "missing keyword" upon clicking. I have no information about what is going on/wrong. Does anyone spot an error?
UNDEFINE DimSales;
UNDEFINE FactTable;
DEFINE DimSales = 'testsales';
DEFINE FactTable = 'testfact';
DECLARE
v_SellerNo VarChar(9);
v_error_code T_ERRORS.v_error_code%TYPE;
v_error_message T_ERRORS.v_error_message%TYPE;
TYPE assoc_array_str_type1 IS TABLE OF VARCHAR2(32) INDEX BY PLS_INTEGER;
v1 assoc_array_str_type1;
TYPE assoc_array_str_type2 IS TABLE OF VARCHAR2(32) INDEX BY PLS_INTEGER;
v2 assoc_array_str_type2;
BEGIN
--Collect all distinct SellerNo into associative array (hash table)
select distinct SellerNo bulk collect into v1 from &FactTable;
select distinct seller_id bulk collect into v2 from &DimSales;
v_SellerNo := v1.first;
loop
exit when v1 is null;
--1 Check if v_SellerNo already exists in DIM_Sales (if NOT/FALSE, its a new seller and we can insert all records for that seller
if (v2.exists(v_SellerNo)=false) THEN
INSERT INTO &DimSales (K_Sales,REG,BVL,DS,VS,RS,GS,VK)
(SELECT DISTINCT trim(leading '0' from RS||GS), REG, BVL, DS, VS, RS, GS, VK from &FactTable where SellerNo = v_SellerNo);
--ELSE
end if;
v_SellerNo := v1.next(v_SellerNo);
end loop;
EXCEPTION
WHEN OTHERS THEN
ROLLBACK;
--v_error_code := SQLCODE
--v_error_message := SQLERRM
--INSERT INTO t_errors VALUES ( v_error_code, v_error_message);
END;
---------------------------------------------------------------
Distinct clause requires a sort. Sorts can be very expensive.
Bulk collects that are not constrained in fetch size, can potentially fetch millions of rows - requiring that data to be wholly read into server memory. I have seen how this can degrade performance so badly that the kernel reboots the server.
Using PL/SQL loops to process and insert/update/delete data is often problematic due to its row-by-row approach - also called slow-by-slow approach. It is far more scalable letting SQL do the "loop" processing, by using joins, sub-selects and so on.
Where the conditional processing is too complex for SQL to handle, then PL/SQL is obviously an alternative to use. Ideally one should process data sets as oppose to rows in PL//SQL. Reduce context switching by using bulk fetches and bulk binds.
But PL/SQL cannot execute in parallel as the SQL it fires off can. If after all the optimisation, the PL/SQL process still needs to hit a million rows to process, it will be slow irrespective of how optimal that PL/SQL approach and design - simply because of the number of rows and the processing overheads per row.
In that case, the PL/SQL code itself need to be parallelised. There are a number of ways to approach this problem - the typical one is to create unique and distinct ranges of rows to process, spawn multiple P/SQL processes, and provide each with a unique range of rows to process. In parallel.
So you need to look close at what you are trying to achieve, what the workloads are, and how to effectively decrease the workloads and increase the processing time of a workload.
For example - finding distinct column values. You can pay for that workload when wanting that distinct list. And each time afterward repeat that workload when wanting that distinct list. Or you can pay for that workload up-front with the DML that creates/updates those values - and use (for example) a materialised view to maintain a ready to use distinct list of values.
Same workload in essence - but paying once for it and up-front as oppose to each time you execute your code that needs to dynamically build that distinct list.
Kent Crotty did tests and showed stunning performance improvements with bulk collect and forall, up to 30x faster:
Bulk processing is not a magical silver bullet. It is a tool. And when correctly used, the tool does exactly what it was designed to do.
The problem is using a hammer to drive in screws - instead of a screwdriver. There's nothing "stunning" about using a screwdriver. It is all about using the correct tool.
If the goal of the swap daemon is to free up "idle" chunks of memory, and try to use that memory for things like file cache instead, what does that have to do with bulk processing?
The swap daemon reads virtual memory pages from swap space into memory, and writes virtual pages from memory to swap space.
What does it have to do with bulk processing? A bulk fetch reads data from the SGA (buffer cache) into the PGA (private process memory space). The larger the fetch, the more memory is required. If, for example, 50% of server memory is required for a bulk collection that is 2GB in size, then that will force in-use pages from memory to swap space, only to be swapped back in as they are needed, thereby forcing other in-use pages to swap. The swap daemon will consume almost all the CPU time swapping hot pages continually in and out of memory.
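As the replies suggest, the row-by-row loop in the original post can usually collapse into one set-based statement; a hedged sketch reusing the &DimSales/&FactTable columns from the post (column names taken on faith from that code):

```sql
-- Insert sales-channel rows only for sellers not yet in the dimension
INSERT INTO &DimSales (K_Sales, REG, BVL, DS, VS, RS, GS, VK)
SELECT DISTINCT TRIM(LEADING '0' FROM f.RS || f.GS),
       f.REG, f.BVL, f.DS, f.VS, f.RS, f.GS, f.VK
  FROM &FactTable f
 WHERE NOT EXISTS (SELECT NULL
                     FROM &DimSales d
                    WHERE d.seller_id = f.SellerNo);
```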