Loading data into multiple tables - Bulk collect or regular Fetch
I have a procedure to load data from one source table into eight different destination tables. Each of the eight tables takes a subset of the source table's columns, joined by a common key.
I have run into a couple of problems and have a few questions where I would like to seek advice:
1.) The procedure took the same time for 100,000 records with and without the BULK COLLECT clause. I expected a performance improvement when using BULK COLLECT with LIMIT.
2.) Updating the load_flag in source_table happens for only a few records, not all. I had expected all records to be updated.
3.) Are there other suggestions to improve the performance? Or could you provide links to other posts or articles on the web that would help me improve the code?
Notes:
1.) The 8 destination tables have at least 2 million records each, have multiple indexes, and are accessed by the application in production.
2.) There is an initial load of 1 million rows with a subsequent daily load of 10,000 rows. The daily load will include updates for existing rows (not shown in the code structure below).
The structure of the procedure is as follows:
DECLARE
  TYPE dest_type IS TABLE OF source_table%ROWTYPE;
  dest_tab dest_type;
  iCount NUMBER := 0;
  CURSOR source_cur IS SELECT * FROM source_table FOR UPDATE OF load_flag;
BEGIN
  OPEN source_cur;
  LOOP
    FETCH source_cur BULK COLLECT INTO dest_tab LIMIT 1000;  -- drop BULK COLLECT ... LIMIT for the plain-fetch version
    EXIT WHEN dest_tab.COUNT = 0;
    FOR i IN dest_tab.FIRST .. dest_tab.LAST LOOP
      <Insert into app_tab1 values key, col12, col23, col34 ;>
      <Insert into app_tab2 values key, col15, col29, col31 ;>
      <Insert into app_tab3 values key, col52, col93, col56 ;>
      UPDATE source_table SET load_flag = 'Y' WHERE CURRENT OF source_cur;
      iCount := iCount + 1;
      IF iCount = 1000 THEN
        COMMIT;
        iCount := 0;
      END IF;
    END LOOP;
  END LOOP;
  CLOSE source_cur;
  COMMIT;
END;
Edited by: user11368240 on Jul 14, 2009 11:08 AM
Assuming you are on 10g or later, the PL/SQL compiler generates the bulk fetch for you automatically in a cursor FOR loop, so your code is equivalent to the following (untested):
DECLARE
  iCount NUMBER := 0;
  CURSOR source_cur IS SELECT * FROM source_table FOR UPDATE OF load_flag;
BEGIN
  FOR r IN source_cur LOOP  -- the FOR loop opens and closes the cursor itself
    <Insert into app_tab1 values key, col12, col23, col34 ;>
    <Insert into app_tab2 values key, col15, col29, col31 ;>
    <Insert into app_tab3 values key, col52, col93, col56 ;>
    UPDATE source_table SET load_flag = 'Y' WHERE CURRENT OF source_cur;
    iCount := iCount + 1;
    IF iCount = 1000 THEN
      COMMIT;
      iCount := 0;
    END IF;
  END LOOP;
  COMMIT;
END;
However, most of the benefit of bulk fetching would come from using the array in a FORALL statement, which the PL/SQL compiler can't automate for you.
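A minimal sketch of that FORALL approach (untested; the column names are taken from the placeholders above, and a keyed UPDATE replaces WHERE CURRENT OF, since the cursor is no longer positioned on individual rows after a bulk fetch):

```sql
DECLARE
  TYPE dest_type IS TABLE OF source_table%ROWTYPE;
  dest_tab dest_type;
  CURSOR source_cur IS
    SELECT * FROM source_table WHERE load_flag = 'N';
BEGIN
  OPEN source_cur;
  LOOP
    FETCH source_cur BULK COLLECT INTO dest_tab LIMIT 1000;
    EXIT WHEN dest_tab.COUNT = 0;

    -- One bulk statement per destination table
    FORALL i IN 1 .. dest_tab.COUNT
      INSERT INTO app_tab1 VALUES (dest_tab(i).key, dest_tab(i).col12,
                                   dest_tab(i).col23, dest_tab(i).col34);
    FORALL i IN 1 .. dest_tab.COUNT
      INSERT INTO app_tab2 VALUES (dest_tab(i).key, dest_tab(i).col15,
                                   dest_tab(i).col29, dest_tab(i).col31);

    FORALL i IN 1 .. dest_tab.COUNT
      UPDATE source_table SET load_flag = 'Y'
       WHERE key = dest_tab(i).key;

    COMMIT;  -- one commit per 1000-row batch
  END LOOP;
  CLOSE source_cur;
END;
```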
If you are fetching 1000 rows at a time, then purely from a code-simplification point of view you could lose iCount and the IF...COMMIT...END IF and just commit each time after looping through the 1000-row array.
However, I'm not sure how committing every 1000 rows helps restartability, even if your real code has a WHERE clause in the cursor so that it only selects rows with load_flag = 'N' or whatever. If you are worried that a failure will roll back all your hard work, why not just commit in your exception handler?
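For instance, the commit-on-failure could simply live in the handler (sketch):

```sql
BEGIN
  -- ... open the cursor, loop, insert into the eight tables,
  --     set load_flag = 'Y' as each row is processed ...
  COMMIT;  -- single commit on success
EXCEPTION
  WHEN OTHERS THEN
    COMMIT;  -- keep the rows already flagged, so a rerun resumes where it stopped
    RAISE;   -- re-raise so the failure is still reported
END;
```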
Similar Messages
-
Loading data into multiple tables from an excel
Can we load data into multiple tables at a time from an Excel file through Utilities? If yes, how? Please help me.
Regards,
Pallavi
I would imagine that the utilities allow you to insert data from a spreadsheet into one and only one table.
You may have to write your own custom data upload using External Tables and a PL/SQL procedure to insert data from 1 spreadsheet into more than 1 table.
If you need any guidance on doing this let me know and I will happily point you in the right direction.
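For what it's worth, a sketch of that approach (hypothetical directory, file, and column names; the spreadsheet would first be saved as CSV):

```sql
CREATE TABLE upload_ext (
  cust_name VARCHAR2(100),
  order_no  NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY upload_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('upload.csv')
);

CREATE OR REPLACE PROCEDURE load_upload IS
BEGIN
  -- The same external table feeds more than one destination table
  INSERT INTO customers (name)     SELECT cust_name FROM upload_ext;
  INSERT INTO orders    (order_no) SELECT order_no  FROM upload_ext;
  COMMIT;
END;
```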
Regards
Duncan -
Loading data into multiple tables using sqlloader
Hi,
I am using SQL*Loader to load data from a flat file into the database. My file structure is as below:
====================
101,john,mobile@@fax@@home@@office@@email,1234@@3425@@1232@@2345@@[email protected],1234.40
102,smith,mobile@@fax@@home,1234@@345@@234,123.40
103,adams,fax@@mobile@@office@@others,1234@@1233@@1234@@3456,2345.40
In the file, the columns are empno, ename, comm_mode (multiple values terminated by '@@'), comm_no_txt (multiple values terminated by '@@'), and sal.
The comm_mode and comm_no_text values need to be inserted into a separate table (emp_comm) like below:
emp
empno ename sal
101 john 1234.40
102 smith 123.40
103 adams 2345.40
emp_comm
empno comm_mode comm_no_text
101 mobile 1234
101 fax 3425
101 home 1232
101 office 2345
101 email [email protected]
102 mobile 1234
102 fax 345
102 home 234
103 fax 1234
The data needs to be inserted like this using SQL*Loader.
my table structures
===============
emp
empno number(5)
ename varchar2(15)
sal number(10,2)
emp_comm
empno number(5) reference the empno of the emp table
comm_mode varchar2(10)
Comm_no_text varchar2(35)
Now I want to insert the file data into the specified structures.
Please help me to achieve this using SQL*Loader.
(we are not using external tables for this)
Thanks & Regards.
Bala Sake
Edited by: 954925 on Aug 25, 2012 12:24 AM
Pl post OS and database details
You will need to split up the datafile in order to load into separate tables. The process is documented here:
http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_control_file.htm#autoId72
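As a sketch of what the split might look like (hypothetical file names; the @@-separated lists would first be broken out into one row per value, e.g. with awk or a pre-processing script):

```sql
-- emp.ctl, loading a pre-split emp.dat (empno,ename,sal)
LOAD DATA
INFILE 'emp.dat'
INTO TABLE emp
FIELDS TERMINATED BY ','
(empno, ename, sal)

-- emp_comm.ctl, loading a pre-split emp_comm.dat (empno,comm_mode,comm_no_text)
LOAD DATA
INFILE 'emp_comm.dat'
INTO TABLE emp_comm
FIELDS TERMINATED BY ','
(empno, comm_mode, comm_no_text)
```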
HTH
Srini -
Insert data into multiple tables
Hi all,
I've a requirement where I need to insert data into multiple tables using a PL/SQL procedure.
Procedure should have two parameters
1. Table Name (parameter1)
2. Data (parameter2)
Based on these two parameters I need to insert data into the table (parameter1) using the data (parameter2).
ex:
Procedure insert_data (p_table IN VARCHAR2
,p_data IN -- what should be the datatype?
IS
l_statement VARCHAR2(2000);
BEGIN
-- insert data into tables
INSERT INTO p_table
values (....);
END insert_data;
Thanks in advance!!
BEDE wrote:
Amen to that!
So, I believe a better approach would be the following...
Suppose you have N datafiles with the same structure, and you wish to insert into the database the data from all those files.
For that, you should have a single table, named, say incoming_file_data, which should be structured more or less like below:
create table incoming_file_data (
filename varchar2(250) not null -- name of the file inserted from
,file_time timestamp -- timestamp when the data was inserted
,... -- the columns of meaningful data contained in the lines of those files
);
And you will insert the data from all those files into this table, normally with one transaction for each file processed; otherwise, when something goes wrong, a file may get only partially inserted into the table...
Maybe one good approach would be to dynamically create an external table for the file to be loaded, and then dynamically execute an INSERT ... SELECT into the table I mentioned, so that you have a single INSERT ... SELECT per file instead of using utl_file... RTM on that.
If the file structures are the same, and it's just the filename that's changing, I would have a single external table definition, and use the ALTER TABLE ... LOCATION ... statement (through EXECUTE IMMEDIATE) to change the filename(s) as appropriate before querying the data. Of course that's not scalable if there are multiple users intending to use this, but generally when we talk about importing multiple files, it's a one-user/one-off/once-a-day type of scenario, so multi-user isn't a consideration. -
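A sketch of that single-external-table variant (hypothetical names, untested):

```sql
DECLARE
  v_file VARCHAR2(250) := 'feed_20070214.csv';  -- hypothetical file name
BEGIN
  -- Repoint the existing external table at the next file
  EXECUTE IMMEDIATE
    'ALTER TABLE incoming_ext LOCATION (''' || v_file || ''')';

  -- One INSERT ... SELECT, and so one transaction, per file
  INSERT INTO incoming_file_data (filename, file_time /*, ... */)
  SELECT v_file, SYSTIMESTAMP /*, ... */ FROM incoming_ext;
  COMMIT;
END;
```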
How to insert one table data into multiple tables by using procedure?
Below is a simple procedure. Try the following:
CREATE OR REPLACE PROCEDURE test_proc
AS
BEGIN
INSERT ALL
INTO emp_test1
INTO emp_test2
SELECT * FROM emp;
END;
If you want more examples you can refer below link
multi-table inserts in oracle 9i
Message was edited by: 000000 -
Inserting data into multiple tables in jdbc
I am doing file-to-JDBC. Now I have a requirement to insert data into multiple tables on the receiver side. How can I do this?
Hi,
If you are going to insert data into 4 tables in sequence, one after another, I see three options:
1) a stored procedure, and 2) creating a 4-statement data structure (one STATEMENT per table).
The third option is writing a SQL statement with a join across the 4 tables and using action command = SQL_DML. Example as follows:
Write the SQL code and place it in the access tag. Pass values for the columns using the key tag:
<stmt>
<Customers action="SQL_DML">
<access> UPDATE Customers SET CompanyName='$NAME$', Address='$ADDRESS$' WHERE CustomerID='$KEYFIELD$'
</access>
<key>
<NAME>name</NAME>
<ADDRESS>add </ADDRESS>
<KEYFIELD>1</KEYFIELD>
</key>
</Customers>
</stmt>
Refer this http://help.sap.com/saphelp_nwpi71/helpdata/en/44/7b7855fde93673e10000000a114a6b/content.htm
Hope this helps .... -
I am loading data into a table I created which includes a column "Description" with data type VARCHAR2(1000). When I load data that is less than 1000 characters, I receive the following error message:
Record 38: Rejected - Error on table SSW_INPUTS, column DESCRIPTION.
Field in data file exceeds maximum length
I have increased the size of the column, but that does not seem to fix the error. Does anyone know what this error means? Another thought is that I created the "Description" column too large... which can't be true, because I would have received an error when I created the table. Plus, I already loaded data into a similar table with similar data and had no problems!
Someone please help...
Thank you,
April.
Note that I'm assuming Oracle8(i) behavior. Oracle9 may treat Unicode differently.
Are you inserting Unicode data into the table? Declaring a variable as varchar2(1000) indicates that Oracle should reserve 1000 bytes for data. If you're inserting UTF-8 encoded data, each character may take up to 3 bytes to store. Thus, 334 characters of data could theoretically overflow a varchar2(1000) variable.
Note that UTF-8 is designed so that the most commonly used characters are stored in 1 byte, less commonly used characters are stored in 2 bytes, and the remainder is stored in 3 bytes. On average, this will require less space than the more familiar UCS-2 encoding which stores every character as 2 bytes of data.
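On 9i and later, one way to sidestep the byte-versus-character mismatch is to declare the column with character-length semantics, e.g. (assuming the table from the question):

```sql
ALTER TABLE ssw_inputs MODIFY (description VARCHAR2(1000 CHAR));
```

With CHAR semantics, Oracle reserves room for 1000 characters regardless of how many bytes each character needs in the database character set.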
Justin -
Error while loading data into External table from the flat files
HI ,
We have a data load in our project which feeds Oracle external tables with data from flat files (.bcp files) on Unix.
While loading the data, we are encountering the following error.
Error occured (Error Code : -29913 and Error Message : ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: un) while loading data into table_ext
Please let us know what needs to be done in this case to solve this problem.
Thanks,
Kartheek
Kartheek,
I used Google (mine still works)... please check these links:
http://oraclequirks.blogspot.com/2008/07/ora-29400-data-cartridge-error.html
http://jonathanlewis.wordpress.com/2011/02/15/ora-29913/
HTH,
Thierry -
How to load data into user tables using DIAPIs?
Hi,
I have created an user table using UserTablesMD object.
But I don't know how to load data into this user table. I guess I have to use the UserTable object for that. But I still don't know how to put data in a particular column.
Can somebody please help me with this?
I would appreciate if somebody can share their code in this regard.
Thank you,
Sudha
You can try this code:
Dim lRetCode As Long
Dim userTable As SAPbobsCOM.UserTable
userTable = pCompany.UserTables.Item("My_Table")
'First row in the @My_Table table
userTable.Code = "A1"
userTable.Name = "A.1"
userTable.UserFields.Fields.Item("U_1stF").Value = "First row value"
userTable.Add()
'Second row in the @My_Table table
userTable.Code = "A2"
userTable.Name = "A.2"
userTable.UserFields.Fields.Item("U_1stF").Value = "Second row value"
userTable.Add()
This way I have added 2 lines in my table.
Hope it helps
Trinidad. -
Loading data from multiple tables to multiple sheets of excel using SSIS
I have a requirement in which I want to load data from 13 tables into 13 respective sheets of a single Excel file using SSIS.
Can anyone suggest the SSIS logic for developing a package for this?
See a similar example here:
http://visakhm.blogspot.in/2013/09/exporting-sqlserver-data-to-multiple.html
In your case you need to use a loop to iterate through the tables:
First, get the list of tables into an object variable created in SSIS, using the INFORMATION_SCHEMA.TABLES view.
Then add a Foreach Loop based on the ADO.NET variable enumerator to iterate through the tables, and inside the loop follow the method in the above link to create each sheet first and populate it.
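The first step could be as simple as the following query feeding the SSIS object variable (hypothetical table names):

```sql
-- List the 13 source tables for the Foreach loop
SELECT TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
  AND TABLE_NAME IN ('Table1', 'Table2' /* , ... all 13 tables */);
```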
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs -
Loading data into existing table
Hi, I have tried to load data into a large table from a CSV file but am not getting any success. I have this control file:
LOAD DATA
INFILE 'Book1.xls'
BADFILE 'p_sum_bad.txt'
DISCARDFILE 'p_sum_dis.txt'
APPEND
INTO TABLE p_sum
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
SUMMARY_LEVEL ,
PERIOD_START_TIME ,
BUSY_HOUR ,
OMC ,
INT_ID ,
BTS_ID ,
BTS_INT_ID ,
CELL_GROUP ,
HO_PERIOD_DURATION ,
POWER_PERIOD_DURATION ,
MSC_I_SUCC_HO ,
MSC_I_TCH_TCH ,
MSC_I_SDCCH_TCH ,
MSC_I_SDCCH ,
MSC_I_TCH_TCH_AT ,
MSC_I_SDCCH_TCH_AT ,
MSC_I_SDCCH_AT ,
MSC_I_FAIL_LACK ,
MSC_I_FAIL_CONN ,
MSC_I_FAIL_BSS ,
MSC_I_END_OF_HO ,
MSC_O_SUCC_HO ,
The data is:
2 3-Nov-06 1000033 9 8092220 1440 1440 5411 5374 7 30 5941
2 3-Nov-06 1000033 10 1392190 1440 1440 0 0 0 0 0
2 3-Nov-06 2000413 3 2127446 1440 1440 80 80 0 0 83
2 3-Nov-06 2000413 4 2021248 1140 1440 0 0 0 0 0
2 3-Nov-06 2000413 5 2021252 1080 1440 1 1 0 0 1
2 3-Nov-06 2000413 6 2130163 1440 1440 2200 2193 2 5 2224
2 3-Nov-06 2000413 7 6205155 1020 1440 0 0 0 0 0
2 3-Nov-06 2000413 8 6200768 900 1440 30 30 0 0 31
2 3-Nov-06 2000413 10 2111877 1440 1440 0 0 0 0 0
2 3-Nov-06 1000033 18 1076419 1440 1440 75 73 0 2 79
2 3-Nov-06 1000033 19 8089060 1440 1440 0 0 0 0 0
but when I try to load the data, I get:
Column Name Position Len Term Encl Datatype
SUMMARY_LEVEL FIRST * , O(") CHARACTER
PERIOD_START_TIME NEXT * , O(") CHARACTER
Record 51: Rejected - Error on table OMC.P_SUM_BTS_HO_POWER, column SUMMARY_LEVEL.
ORA-01722: invalid number
I believe the data being loaded has to be NUMBER. Can anyone advise what I need to change to load the data? Thanks
Justin,
Tried that, no luck:
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table P_SUM, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
SUMMARY_LEVEL FIRST * WHT O(") CHARACTER
PERIOD_START_TIME NEXT * WHT O(") CHARACTER
BUSY_HOUR NEXT * WHT O(") CHARACTER
OMC NEXT * WHT O(") CHARACTER
INT_ID NEXT * WHT O(") CHARACTER
BTS_ID NEXT * WHT O(") CHARACTER
BTS_INT_ID NEXT * WHT O(") CHARACTER
CELL_GROUP NEXT * WHT O(") CHARACTER
Record 51: Rejected - Error on table OMC.P_SUM_BTS_HO_POWER, column SUMMARY_LEVEL.
ORA-01722: invalid number
Any other suggestions?
Loading Data from multiple tables into essbase using ODI
Hi,
We have a scenario where data comes from multiple tables. I would like to know how ODI will load this data for the right combination of the members.
Hi,
I take it each data table has a field which maps to the other table. You can just drag the datastores onto the source interface and create a join between the tables.
Cheers
John
http://john-goodwin.blogspot.com/ -
Inserting data into multiple tables(Oracle Version 9.2.0.6)
Hi,
we are going to receive the following XML file from one of our vendors. We need to parse the file and then save the data into multiple database tables (around 3).
<?xml version="1.0"?>
<datafeed xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="DailyFeed.xsd" deliverydate="2007-02-14T00:00:00" vendorid="4">
<items count="1">
<item feed_id="001379" mode="MERGE">
<content empbased="true">
<emp>10000000</emp>
<value>
</value>
<date>2006-01-16</date>
<unit>CHF</unit>
<links>
<link lang="EN">
<url>www.pqr.com</url>
<description>pqr website</description>
</link>
<link lang="DE">
<url>www.efg.com</url>
<description>efg website</description>
</link>
</links>
</content>
<content empbased="true">
<emp>10000001</emp>
<value>
</value>
<date>2006-01-16</date>
<unit>CHF</unit>
<links>
<link lang="EN">
<url>www.abc.com</url>
<description>abc website</description>
</link>
<link lang="DE">
<url>www.xyz.com</url>
<description>xyz website</description>
</link>
</links>
</content>
<content empbased="true">
<emp>10000002</emp>
<value>
</value>
<date>2006-01-16</date>
<unit>CHF</unit>
<links>
<link lang="IT">
<url>www.rst.com</url>
<description>rst website</description>
</link>
</links>
</content>
</item>
</items>
</datafeed>
Now the operation to be done on the table depends on the mode attribute. Further, there are some basic validations that need to be done using the count attribute. Here the item, content, and link tags are recurring elements.
The problem is I am not able to find the correct attributes like mode, feed_id, and lang through a SQL query (they are getting duplicated), though I was able to find the deliverydate and vendorid attributes as they are non-repetitive. Here are the scripts:
create table tbl_xml_rawdata (xml_content xmltype);
create directory xmldir as 'c:\xml';
--put the above xml file in this directory and name it testfile.xml
Declare
l_bfile BFILE;
l_clob CLOB;
BEGIN
l_bfile := BFileName('XMLDIR', 'testfile.xml');
dbms_lob.createtemporary(l_clob, cache=>FALSE);
dbms_lob.open(l_bfile, dbms_lob.lob_readonly);
dbms_lob.loadFromFile(dest_lob => l_clob,
src_lob => l_bfile,
amount => dbms_lob.getLength(l_bfile));
dbms_lob.close(l_bfile);
insert into tbl_xml_rawdata values(xmltype.createxml(l_clob));
commit;
end;
My query is:
select extractvalue(value(b),'/datafeed/@deliverydate') ddate, extractvalue(value(b),'/datafeed/@vendorid') vendorid,
extractvalue( value( c ), '//@feed_id') feed_id,
extractvalue( value( a ), '//@empbased') empbased,
extractvalue( value( a ), '//emp') emp,
extractvalue( value( a ), '//value') value,
extractvalue( value( a ), '//unit') unit,
extractvalue( value( a ), '//date') ddate1,
extract( value( a ), '//links/link/url') url,
extract( value( a ), '//links/link/description') description
from tbl_xml_rawdata t,
table(xmlsequence(extract(t.xml_content,'/datafeed/items/item/content'))) a,
table(xmlsequence(extract(t.xml_content,'/'))) b ,
table(xmlsequence(extract(t.xml_content,'/datafeed/items/item'))) c;
If the above query is run, feed_id gets cartesian-joined with the other data, which is wrong.
How should I go with this so that I can have 1 relational record with respect to each element & sub-elements.
Also, if this is not doable in SQL, can someone direct me to a PL/SQL example to do this? I read that dbms_xmldom and dbms_xmlparser can be used to traverse an XML doc, but I don't know how to use them.
Any help please??
I'm still getting the same error while installing Oracle patch set 9.2.0.6. I downloaded the patchset 2 weeks ago.
Pls help where download the correct version ? -
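Back to the XML question: one untested way to avoid the cartesian join is to drive each TABLE(XMLSequence(...)) from the previous one, so that content rows stay correlated with their parent item (and links with their parent content):

```sql
SELECT extractvalue(t.xml_content, '/datafeed/@deliverydate') ddate,
       extractvalue(t.xml_content, '/datafeed/@vendorid')     vendorid,
       extractvalue(value(c), '/item/@feed_id')   feed_id,
       extractvalue(value(c), '/item/@mode')      item_mode,
       extractvalue(value(a), '/content/emp')     emp,
       extractvalue(value(a), '/content/date')    ddate1,
       extractvalue(value(a), '/content/unit')    unit,
       extractvalue(value(l), '/link/@lang')      lang,
       extractvalue(value(l), '/link/url')        url
  FROM tbl_xml_rawdata t,
       TABLE(XMLSequence(extract(t.xml_content, '/datafeed/items/item'))) c,
       TABLE(XMLSequence(extract(value(c), '/item/content')))             a,
       TABLE(XMLSequence(extract(value(a), '/content/links/link')))       l;
```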
SQLLOADER PROBLEM IN LOADING DATA TO MULTIPLE TABLES
My problem is that I have to load data from a flat file, which consists of 64 columns and 11,040 records, into 5 different tables. The other thing is I have to check that only UNIQUE records go to the database, and then I have to generate a primary key for each record that reaches the database.
So I have written a BEFORE INSERT ... FOR EACH ROW trigger on all 5 tables to check the uniqueness of each arriving record.
Now my problem is that SQL*Loader is loading, for all the tables, only the minimum number of records that any one table would receive, i.e.:
TABLE       RECORDS (ORIGINALLY)
TIME        11
STORES      184
PROMOTION   20
PRODUCT     60
Now it is loading only 11 records into all the tables.
with regards
vijayankar
The easiest thing is to do data manipulation in the database; that's what SQL is good for.
So load your file into tables without any unique constraints. Then apply the unique constraints using the EXCEPTIONS INTO ... clause. This will populate your exceptions table with the rowids of all the non-unique rows. You can then decide which rows to zap.
If you don't already have an exceptions table you'll need to run utlexcpt.sql.
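A sketch of that sequence for one of the five tables (hypothetical table and key names; untested):

```sql
-- exceptions table created beforehand via $ORACLE_HOME/rdbms/admin/utlexcpt.sql
-- This fails if duplicates exist, but records the offending rowids:
ALTER TABLE stores ADD CONSTRAINT stores_uk UNIQUE (store_key)
  EXCEPTIONS INTO exceptions;

-- Inspect the duplicates, keep one survivor per key, then retry the ALTER:
DELETE FROM stores
 WHERE rowid IN (SELECT row_id FROM exceptions)
   AND rowid NOT IN (SELECT MIN(rowid) FROM stores GROUP BY store_key);
```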
HTH
P.S. This isn't the right forum to be posting SQL*Loader enquiries. -
Import Data into multiple tables
Hi folks,
I have already done some research in this forum but cannot find a solution. As far as I understand, I need to import the data from the flat file into a staging table and then distribute it to the different tables by running a SQL script/statement.
Do you have any examples of such a SQL statement/script?
Thanks in advance!
Regards,
Tino
Message was edited by:
tino.albrecht
Repeat for each table:
insert /*+ APPEND */ into <table 1>
select <source columns for table 1>
from <table where the flat file was imported>
where <conditions that identify records for table 1>;
OR, alternatively, with a single statement
insert
when <conditions that identify records for table1>
then into <table 1>(<columns of table 1>) values(<source columns for table 1>)
when <conditions that identify records for table2>
then into <table 2>(<columns of table 2>) values(<source columns for table 2>)
select * from <table where the flat file was imported>;