Import Data into multiple tables
Hi folks,
I already did some research in this forum but I cannot find a solution. As far as I understand, I need to import the data from the flat file into a staging table and then distribute it to the different tables by running a SQL script/statement.
Do you have any examples of such a SQL statement/script?
Thanks in advance!
Regards,
Tino
Repeat for each target table:

insert /*+ APPEND */ into <table 1>
select <source columns for table 1>
from <table where the flat file was imported>
where <conditions that identify records for table 1>;

Or, alternatively, with a single multi-table insert statement:

insert
  when <conditions that identify records for table 1> then
    into <table 1> (<columns of table 1>) values (<source columns for table 1>)
  when <conditions that identify records for table 2> then
    into <table 2> (<columns of table 2>) values (<source columns for table 2>)
select * from <table where the flat file was imported>;
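To make that concrete, here is a hedged sketch of both variants against a hypothetical staging table stg_flatfile (all table and column names are illustrative, not from the original post):

```sql
-- Hypothetical staging table loaded from the flat file:
--   stg_flatfile(rec_type, cust_id, cust_name, ord_id, ord_amount)

-- Variant 1: one direct-path insert per target table
insert /*+ APPEND */ into customers (cust_id, cust_name)
select cust_id, cust_name
from   stg_flatfile
where  rec_type = 'C';

commit;  -- a direct-path insert must be committed before the table is queried again

-- Variant 2: a single conditional multi-table insert
insert all
  when rec_type = 'C' then
    into customers (cust_id, cust_name) values (cust_id, cust_name)
  when rec_type = 'O' then
    into orders (ord_id, cust_id, ord_amount) values (ord_id, cust_id, ord_amount)
select rec_type, cust_id, cust_name, ord_id, ord_amount
from   stg_flatfile;
```

The multi-table form reads the staging table only once, which usually matters when the flat file is large.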
Similar Messages
-
Insert data into multiple tables
Hi all,
I have a requirement where I need to insert data into multiple tables using a PL/SQL procedure.
Procedure should have two parameters
1. Table Name (parameter1)
2. Data (parameter2)
Based on these two parameters I need to insert the data (parameter2) into the table (parameter1). For example:
Procedure insert_data (p_table IN VARCHAR2
,p_data IN -- what should be the datatype?
IS
l_statement VARCHAR2(2000);
BEGIN
-- insert data into tables
INSERT INTO p_table
values (....);
END insert_data;

Thanks in advance!!

BEDE wrote:
Amen to that!
So, I believe a better approach would be the following.

Suppose you have N data files with the same structure, and you wish to insert the data from all those files into the database. For that, you should have a single table, named, say, incoming_file_data, which should be structured more or less like below:
create table incoming_file_data (
  filename  varchar2(250) not null -- name of the file the row was inserted from
 ,file_time timestamp              -- timestamp when the data was inserted
 ,...                              -- the columns of meaningful data contained in the lines of those files
);

You will insert the data from all those files into this table, normally with one transaction per file processed; otherwise, when shit happens, some file may get only partially inserted into the table...
Maybe one good approach would be to dynamically create an external table for the file to be loaded, and then dynamically execute an INSERT ... SELECT into the table I described, so that you have a single INSERT ... SELECT per file instead of using utl_file... RTM on that.

If the file structures are the same, and it's just the filename that's changing, I would have a single external table definition, and use the ALTER TABLE ... LOCATION ... statement (through EXECUTE IMMEDIATE) to change the filename(s) as appropriate before querying the data. Of course that's not scalable if there are multiple users intending to use this, but generally when we talk about importing multiple files, it's a one-user/one-off/once-a-day type of scenario, so multi-user isn't a consideration. -
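As an untested sketch of the single-external-table idea (directory path, file names and columns are all hypothetical):

```sql
-- One-time setup: an external table over the incoming files' directory
create or replace directory incoming_dir as '/data/incoming';

create table ext_incoming (
  cust_id   number
 ,cust_name varchar2(100)
)
organization external (
  type oracle_loader
  default directory incoming_dir
  access parameters (
    records delimited by newline
    fields terminated by ','
  )
  location ('dummy.csv')
);

-- Per file: repoint the external table, then one INSERT ... SELECT,
-- keeping one transaction per file
begin
  execute immediate 'alter table ext_incoming location (''file1.csv'')';
  insert into incoming_file_data (filename, file_time, cust_id, cust_name)
  select 'file1.csv', systimestamp, cust_id, cust_name
  from   ext_incoming;
  commit;
end;
/
```

The filename column then tells you exactly which rows came from which file, which helps if a load has to be repeated.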
Best way to import data to multiple tables in oracle d.b from sql server
Hi all, I am a newbie to Oracle.

What is the best way to import data into multiple tables in an Oracle database from SQL Server?
1)linked server?
2)ssis ?
If possible, please share the query to do this task using a linked server.

Regards,
KoteRavindra.

Check:
http://www.mssqltips.com/sqlservertip/2011/export-sql-server-data-to-oracle-using-ssis/
koteravindra
Handle: koteravindra
Status Level: Newbie
Registered: Jan 9, 2013
Total Posts: 4
Total Questions: 3 (3 unresolved)
Why so many unresolved questions? Remember to close your threads by marking them as answered. -
How to insert one table data into multiple tables by using procedure?
Below is a simple procedure. Try the following:
CREATE OR REPLACE PROCEDURE test_proc
AS
BEGIN
  INSERT ALL
    INTO emp_test1
    INTO emp_test2
  SELECT * FROM emp;
END;
/
If you want more examples, you can refer to the link below:
multi-table inserts in oracle 9i
-
Inserting data into multiple tables in jdbc
I am working on a file-to-JDBC scenario. Now I have a requirement to insert data into multiple tables on the receiver side. How can I do this?
Hi,
If you are going to insert data into 4 tables in a sequence, one after another, I see three options:

1) a stored procedure, 2) creating 4 statement data structures (one for each table), or 3) writing a SQL statement with a join for the 4 tables and using action command = SQL_DML. An example of the third option follows.

Write the SQL code and place it in the access tag. Pass values for the columns using the key tag...
<stmt>
<Customers action="SQL_DML">
<access> UPDATE Customers SET CompanyName='$NAME$', Address='$ADDRESS$' WHERE CustomerID='$KEYFIELD$'
</access>
<key>
<NAME>name</NAME>
<ADDRESS>add </ADDRESS>
<KEYFIELD>1</KEYFIELD>
</key>
</Customers>
</stmt>
Refer to this: http://help.sap.com/saphelp_nwpi71/helpdata/en/44/7b7855fde93673e10000000a114a6b/content.htm
Hope this helps .... -
How to import data into taxonomy tables
Hi ALL,
Can anybody suggest how to import data into the taxonomy tables? Please send me any documentation related to this.
Regards,
Ravi.

Hi,
I hope the following URLs may help you.
Taxonomy Import:
http://help.sap.com/saphelp_mdm550/helpdata/en/09/322ad42c864bf79892af2123525b6f/content.htm
Importing Hierarchy:
http://help.sap.com/saphelp_mdm550/helpdata/en/36/c4c97f96484f438c2d5c96171cdd6f/content.htm
Importing Attributes:
http://help.sap.com/saphelp_mdm550/helpdata/en/bd/f768fd70d945ee873712304127493b/content.htm
Importing Attribute text Values:
http://help.sap.com/saphelp_mdm550/helpdata/en/52/2086075c584d96be102512036b9c0c/content.htm
Importing Attribute Links:
http://help.sap.com/saphelp_mdm550/helpdata/en/26/17f916867344afbdfcf7d629035c83/content.htm
Regards
Nisha -
How to import data into taxonomy tables and qualified tables
Hi ALL,
Can anybody suggest how to import data into taxonomy tables and qualified tables? When is value mapping necessary, and what is partitioning? What is the use of partitioning?

Hello rehman,
Even I am facing the same issue regarding import of data into the taxonomy table. Can we do it through Import Manager?

Please provide me some details; I will be very, very thankful to you.
my email - [email protected]
Thanks & regards
Himanshu -
Inserting data into multiple tables(Oracle Version 9.2.0.6)
Hi,
we are going to receive the following XML file from one of our vendors. We need to parse the file and then save the data to multiple database tables (around 3).
<?xml version="1.0"?>
<datafeed xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="DailyFeed.xsd" deliverydate="2007-02-14T00:00:00" vendorid="4">
<items count="1">
<item feed_id="001379" mode="MERGE">
<content empbased="true">
<emp>10000000</emp>
<value>
</value>
<date>2006-01-16</date>
<unit>CHF</unit>
<links>
<link lang="EN">
<url>www.pqr.com</url>
<description>pqr website</description>
</link>
<link lang="DE">
<url>www.efg.com</url>
<description>efg website</description>
</link>
</links>
</content>
<content empbased="true">
<emp>10000001</emp>
<value>
</value>
<date>2006-01-16</date>
<unit>CHF</unit>
<links>
<link lang="EN">
<url>www.abc.com</url>
<description>abc website</description>
</link>
<link lang="DE">
<url>www.xyz.com</url>
<description>xyz website</description>
</link>
</links>
</content>
<content empbased="true">
<emp>10000002</emp>
<value>
</value>
<date>2006-01-16</date>
<unit>CHF</unit>
<links>
<link lang="IT">
<url>www.rst.com</url>
<description>rst website</description>
</link>
</links>
</content>
</item>
</items>
</datafeed>
Now, the operation to be done on the tables depends on the mode attribute. Further, some basic validations need to be done using the count attribute. Here the item, content & link tags are recurring elements.

The problem is that I am not able to extract the repeating attributes like mode, feed_id and lang correctly through a SQL query (they are getting duplicated), though I was able to get the deliverydate & vendorid attributes as they are non-repetitive. Here are the scripts:
create table tbl_xml_rawdata (xml_content xmltype);
create directory xmldir as 'c:\xml';
--put the above xml file in this directory and name it testfile.xml
Declare
l_bfile BFILE;
l_clob CLOB;
BEGIN
l_bfile := BFileName('XMLDIR', 'testfile.xml');
dbms_lob.createtemporary(l_clob, cache=>FALSE);
dbms_lob.open(l_bfile, dbms_lob.lob_readonly);
dbms_lob.loadFromFile(dest_lob => l_clob,
src_lob => l_bfile,
amount => dbms_lob.getLength(l_bfile));
dbms_lob.close(l_bfile);
insert into tbl_xml_rawdata values(xmltype.createxml(l_clob));
commit;
end;
My query is:
select extractvalue(value(b),'/datafeed/@deliverydate') ddate, extractvalue(value(b),'/datafeed/@vendorid') vendorid,
extractvalue( value( c ), '//@feed_id') feed_id,
extractvalue( value( a ), '//@empbased') empbased,
extractvalue( value( a ), '//emp') emp,
extractvalue( value( a ), '//value') value,
extractvalue( value( a ), '//unit') unit,
extractvalue( value( a ), '//date') ddate1,
extract( value( a ), '//links/link/url') url,
extract( value( a ), '//links/link/description') description
from tbl_xml_rawdata t,
table(xmlsequence(extract(t.xml_content,'/datafeed/items/item/content'))) a,
table(xmlsequence(extract(t.xml_content,'/'))) b ,
table(xmlsequence(extract(t.xml_content,'/datafeed/items/item'))) c;
If the above query is run, the feed_id is Cartesian-joined with the other data, which is wrong.

How should I go about this so that I can get one relational record for each element & its sub-elements?

Also, if this is not doable in SQL, can someone direct me to a PL/SQL example? I read that dbms_xmldom & dbms_xmlparser can be used to traverse an XML document, but I don't know how to use them.
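The duplication happens because each XMLSequence table in the original query is driven off the whole document instead of its parent node. A hedged, untested rewrite that correlates each level to its parent would look like this:

```sql
select extractvalue(t.xml_content, '/datafeed/@deliverydate') ddate,
       extractvalue(t.xml_content, '/datafeed/@vendorid')     vendorid,
       extractvalue(value(c), '/item/@feed_id')               feed_id,
       extractvalue(value(a), '/content/@empbased')           empbased,
       extractvalue(value(a), '/content/emp')                 emp,
       extractvalue(value(a), '/content/unit')                unit,
       extractvalue(value(a), '/content/date')                ddate1,
       extractvalue(value(l), '/link/@lang')                  lang,
       extractvalue(value(l), '/link/url')                    url,
       extractvalue(value(l), '/link/description')            description
from   tbl_xml_rawdata t,
       table(xmlsequence(extract(t.xml_content, '/datafeed/items/item'))) c,
       table(xmlsequence(extract(value(c),      '/item/content')))        a,
       table(xmlsequence(extract(value(a),      '/content/links/link')))  l;
```

Because a and l are extracted from value(c) and value(a) respectively, each content row stays attached to its item and each link to its content, so nothing is Cartesian-joined; deliverydate and vendorid can be taken straight from t.xml_content without another XMLSequence join.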
Any help please??

I'm still getting the same error while installing Oracle patch set 9.2.0.6. I downloaded the patch set 2 weeks back.
Please help: where do I download the correct version? -
Loading data into multiple tables from an excel
Can we load data into multiple tables at a time from an Excel file through Utilities? If yes, how? Please help me.

Regards,
Pallavi

I would imagine that the utilities allow you to insert data from a spreadsheet into 1 and only 1 table.
You may have to write your own custom data upload using External Tables and a PL/SQL procedure to insert data from 1 spreadsheet into more than 1 table.
If you need any guidance on doing this let me know and I will happily point you in the right direction.
Regards
Duncan -
Loading data into multiple tables using sqlloader
Hi,
I am using SQL*Loader to load the data from a flat file into the database.
My file structure is as below:
====================
101,john,mobile@@fax@@home@@office@@email,1234@@3425@@1232@@2345@@[email protected],1234.40
102,smith,mobile@@fax@@home,1234@@345@@234,123.40
103,adams,fax@@mobile@@office@@others,1234@@1233@@1234@@3456,2345.40
In the file, the first columns are empno, ename, comm_mode (multiple values terminated by '@@'), comm_no_txt (multiple values terminated by '@@'), and sal.

The comm_mode and comm_no_text values need to be inserted into a separate table (emp_comm) like below:
emp
empno ename sal
101 john 1234.40
102 smith 123.40
103 adams 2345.40
emp_comm
empno comm_mode comm_no_text
101 mobile 1234
101 fax 3425
101 home 1232
101 office 2345
101 email [email protected]
102 mobile 1234
102 fax 345
102 home 234
103 fax 1234
The data needs to be inserted like this using SQL*Loader.
my table structures
===============
emp
empno number(5)
ename varchar2(15)
sal number(10,2)
emp_comm
empno number(5) reference the empno of the emp table
comm_mode varchar2(10)
Comm_no_text varchar2(35)
Now I want to insert the file data into the specified structures. Please help me out to achieve this using SQL*Loader (we are not using external tables for this).
Thanks & Regards.
Bala Sake
Please post OS and database details.
You will need to split up the data file in order to load into separate tables. The process is documented here:
http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_control_file.htm#autoId72
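Since SQL*Loader itself cannot split the '@@'-separated lists into child rows, one common workaround is to load each line unchanged into a staging table and then distribute it in PL/SQL. A hedged sketch (the staging table stg_emp, the 50-item cap, and the '|' substitution character are my assumptions; REGEXP_SUBSTR assumes Oracle 10g or later):

```sql
-- stage.ctl -- load the raw columns unchanged into a staging table:
-- LOAD DATA INFILE 'emp.dat'
-- INTO TABLE stg_emp
-- FIELDS TERMINATED BY ','
-- (empno, ename, comm_mode_list, comm_no_list, sal)

declare
  l_mode varchar2(50);
  l_no   varchar2(100);
begin
  insert into emp (empno, ename, sal)
  select empno, ename, sal from stg_emp;

  -- Walk the lists positionally. '@@' is first replaced by a single '|'
  -- (assumed not to occur in the data) so that values containing a
  -- single '@', such as email addresses, split correctly.
  for r in (select empno,
                   replace(comm_mode_list, '@@', '|') modes,
                   replace(comm_no_list,   '@@', '|') nos
            from stg_emp) loop
    for i in 1 .. 50 loop  -- arbitrary upper bound on list length
      l_mode := regexp_substr(r.modes, '[^|]+', 1, i);
      exit when l_mode is null;
      l_no := regexp_substr(r.nos, '[^|]+', 1, i);
      insert into emp_comm (empno, comm_mode, comm_no_text)
      values (r.empno, l_mode, l_no);
    end loop;
  end loop;
  commit;
end;
/
```

The nth mode is paired with the nth number, matching the expected emp_comm output shown above.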
HTH
Srini -
Loading data into multiple tables - Bulk collect or regular Fetch
I have a procedure to load data from one source table into eight different destination tables. The 8 tables have some of the columns of the source table with a common key.
I have run into a couple of problems and have a few questions where I would like to seek advice:
1.) Procedure with and without the BULK COLLECT clause took the same time for 100,000 records. I thought I would see improvement in performance when I include BULK COLLECT with LIMIT.
2.) Updating the Load_Flag in source_table happens only for few records and not all. I had expected all records to be updated
3.) Are there other suggestions to improve the performance? or could you provide links to other posts or articles on the web that will help me improve the code?
Notes:
1.) 8 Destination tables have at least 2 Million records each, have multiple indexes and are accessed by application in Production
2.) There is an initial load of 1 Million rows with a subsequent daily load of 10,000 rows. Daily load will have updates for existing rows (not shown in code structure below)
The structure of the procedure is as follows
Declare
TYPE dest_type IS TABLE OF source_table%ROWTYPE;
dest_tab dest_type ;
iCount NUMBER := 0;
cursor source_cur is select * from source_table FOR UPDATE OF load_flag;
BEGIN
OPEN source_cur;
LOOP
FETCH source_cur -- BULK COLLECT
INTO dest_tab;   -- LIMIT 1000
EXIT WHEN source_cur%NOTFOUND;
FOR i in dest_tab.FIRST .. dest_tab.LAST LOOP
<Insert into app_tab1 values key, col12, col23, col34 ;>
<Insert into app_tab2 values key, col15, col29, col31 ;>
<Insert into app_tab3 values key, col52, col93, col56 ;>
UPDATE source_table SET load_flag = 'Y' WHERE CURRENT OF source_cur ;
iCount := iCount + 1 ;
IF iCount = 1000 THEN
COMMIT ;
iCount := 0 ;
END IF;
END LOOP;
END LOOP ;
COMMIT ;
END ;
Assuming you are on 10g or later, the PL/SQL compiler generates the bulk fetch for you automatically, so your code is the same as (untested):
DECLARE
iCount NUMBER := 0;
CURSOR source_cur is select * from source_table FOR UPDATE OF load_flag;
BEGIN
OPEN source_cur;
FOR r IN source_cur
LOOP
<Insert into app_tab1 values key, col12, col23, col34 ;>
<Insert into app_tab2 values key, col15, col29, col31 ;>
<Insert into app_tab3 values key, col52, col93, col56 ;>
UPDATE source_table SET load_flag = 'Y' WHERE CURRENT OF source_cur ;
iCount := iCount + 1 ;
IF iCount = 1000 THEN
COMMIT ;
iCount := 0 ;
END IF;
END LOOP;
COMMIT ;
END ;However most of the benefit of bulk fetching would come from using the array with a FORALL expression, which the PL/SQL compiler can't automate for you.
If you are fetching 1000 rows at a time, purely from a code simplification point of view you could lose iCount and the IF...COMMIT...END IF and just commit each time after looping through the 1000-row array.
However I'm not sure how committing every 1000 rows helps restartability, even if your real code has a WHERE clause in the cursor so that it only selects rows with load_flag = 'N' or whatever. If you are worried that it will roll back all your hard work on failure, why not just commit in your exception handler? -
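For completeness, a hedged sketch of the explicit BULK COLLECT ... LIMIT plus FORALL pattern discussed above (column names are taken loosely from the pseudocode; referencing record fields inside FORALL requires 11g or later, and WHERE CURRENT OF is replaced by an update-by-key, since it cannot be array-bound):

```sql
declare
  type src_tab_t is table of source_table%rowtype;
  l_rows src_tab_t;
  cursor c is
    select * from source_table where load_flag = 'N';
begin
  open c;
  loop
    fetch c bulk collect into l_rows limit 1000;
    exit when l_rows.count = 0;

    -- one array-bound statement per destination table
    forall i in 1 .. l_rows.count
      insert into app_tab1 (key, col12, col23, col34)
      values (l_rows(i).key, l_rows(i).col12, l_rows(i).col23, l_rows(i).col34);

    forall i in 1 .. l_rows.count
      insert into app_tab2 (key, col15, col29, col31)
      values (l_rows(i).key, l_rows(i).col15, l_rows(i).col29, l_rows(i).col31);

    forall i in 1 .. l_rows.count
      update source_table set load_flag = 'Y'
      where  key = l_rows(i).key;

    commit;  -- one commit per 1000-row batch
  end loop;
  close c;
end;
/
```

The FORALL statements are where the real gain is: each one sends the whole 1000-row array to the SQL engine in a single round trip, instead of one context switch per row.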
Import data into XMLType tables
I have a document whose root node is MedlineCitationSet. It contains one or more MedlineCitation elements, i.e.:
<MedlineCitationSet xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://localhost/schema/medline.xsd">
<MedlineCitation Owner="NLM" Status="Completed">
<MedlineID>66039085</MedlineID>
</MedlineCitation>
<MedlineCitation Owner="NLM" Status="Completed">
<MedlineID>66039086</MedlineID>
</MedlineCitation>
</MedlineCitationSet>
I registered the xschema http://localhost/schema/medline.xsd in the XML db.
I then created two XMLType tables
CREATE TABLE MEDLINECITATIONS OF XMLTYPE XMLSCHEMA "http://localhost/schema/medline.xsd" ELEMENT "MedlineCitation";
CREATE TABLE XMLLOAD OF XMLTYPE XMLSCHEMA "http://localhost/schema/medline.xsd" ELEMENT "MedlineCitationSet";
I then import the xmldocument using the following function
INSERT INTO XMLLOAD VALUES (GETXML ('smallxmldoc.xml'));
The getxml function was defined in a previous thread (I can't seem to find it again). It basically reads the file into a CLOB and then returns an XMLType for the CLOB.

It does work and inserts a single record for the entire document.
If I run the following query,
SELECT VALUE (O) FROM XMLLOAD X, TABLE (XMLSEQUENCE (EXTRACT (VALUE(X), '/MedlineCitationSet/MedlineCitation'))) O;
I get 15 records back, one for each MedlineCitation. If I run:
INSERT INTO MEDLINECITATIONS
SELECT VALUE (O) FROM XMLLOAD X, TABLE (XMLSEQUENCE (EXTRACT (VALUE(X), '/MedlineCitationSet/MedlineCitation'))) O;
I get an ORA-19007: Schema and element do not match. I don't understand why they wouldn't match: both tables were created using the same schema. It's almost as though the XMLType returned by the query "loses" its schema, so even though they match, XML DB is kicking it out. Does anybody have any ideas? Is there a simpler way that would allow me to directly import the MedlineCitations into the MEDLINECITATIONS table?
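For what it's worth, a fragment produced by EXTRACT does lose its schema association, which matches the ORA-19007 here. One hedged, untested workaround is to re-associate the registered schema explicitly while inserting:

```sql
INSERT INTO MEDLINECITATIONS
SELECT v.doc.createSchemaBasedXML('http://localhost/schema/medline.xsd')
FROM  (SELECT VALUE(o) AS doc
       FROM   XMLLOAD x,
              TABLE(XMLSEQUENCE(EXTRACT(VALUE(x),
                    '/MedlineCitationSet/MedlineCitation'))) o) v;
```

createSchemaBasedXML is an XMLType member function; the inline view gives the extracted fragment a column alias (v.doc) so the method can be invoked on it in SQL.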
Thanks
Marc Paris -
I need to insert data into multiple tables - how?
Using ASP, VBScript & MS SQL -

My main table has a PK named projectID; this field is updated automatically in SQL. I need that ID in order to populate the rest of the tables. How do I retrieve this ID before moving on to the next page(s)?

I've seen some similar questions but none have really been answered. Hoping this time is the charm.

Thanks

Post your SQL statements and I will put this into a stored procedure for you. This will give you far more flexibility and will run more efficiently. I may not be able to do it tonight though as I am about to go out.
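As a hedged sketch of what such a stored procedure might look like (a trimmed, hypothetical version of the dbo.projects insert from this thread, returning the new projectID through an OUTPUT parameter):

```sql
-- SQL Server (T-SQL); column list shortened for illustration
CREATE PROCEDURE dbo.insert_project
  @pName        nvarchar(100),
  @pLead        nvarchar(100),
  @status       nvarchar(50),
  @newProjectID int OUTPUT
AS
BEGIN
  INSERT INTO dbo.projects (pName, pLead, status)
  VALUES (@pName, @pLead, @status);

  -- SCOPE_IDENTITY() returns the identity generated in this scope,
  -- unlike @@IDENTITY, which can pick up values generated by triggers
  SET @newProjectID = SCOPE_IDENTITY();
END
```

From ASP, declare @newProjectID as an output parameter on the ADODB.Command (direction adParamOutput = 2), read it after Execute, and pass it to the next page on the query string.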
"Mark.P." <[email protected]> wrote in message
news:[email protected]...
>I do have that option but I don't know how to write them. I don't normally do this kind of thing but due to recent org changes I'm learning on the fly. I was hoping this would be on the easy side. Guess not (at least not for me). Here's where I'm at now with it; it's returning an error at this point.
>
> <%
> If (CStr(Request("MM_insert")) = "addProject") Then
>   If (Not MM_abortEdit) Then
>
>     'intNewKeyVal = rsProjects.Execute(MM_editCmd, intRecordsAffected, adCmdText)(0)
>     ' execute the insert
>     Dim MM_editCmd
>
>     Set MM_editCmd = Server.CreateObject("ADODB.Command")
>     MM_editCmd.ActiveConnection = MM_commStr_STRING
>     MM_editCmd.CommandText = "INSERT INTO dbo.projects (pName, pLead, status, startDate, endDate, audience, ojective, deliverables, issues) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?) SELECT SCOPE_IDENTITY()"
>     MM_editCmd.Prepared = true
>     MM_editCmd.Parameters.Append MM_editCmd.CreateParameter("param1", 202, 1, 100, Request.Form("pName")) ' adVarWChar
>     MM_editCmd.Parameters.Append MM_editCmd.CreateParameter("param2", 202, 1, 100, Request.Form("pLead")) ' adVarWChar
>     MM_editCmd.Parameters.Append MM_editCmd.CreateParameter("param3", 202, 1, 50, Request.Form("status")) ' adVarWChar
>     MM_editCmd.Parameters.Append MM_editCmd.CreateParameter("param4", 135, 1, -1, MM_IIF(Request.Form("startDate"), Request.Form("startDate"), null)) ' adDBTimeStamp
>     MM_editCmd.Parameters.Append MM_editCmd.CreateParameter("param5", 135, 1, -1, MM_IIF(Request.Form("endDate"), Request.Form("endDate"), null)) ' adDBTimeStamp
>     MM_editCmd.Parameters.Append MM_editCmd.CreateParameter("param6", 202, 1, 250, Request.Form("audience")) ' adVarWChar
>     MM_editCmd.Parameters.Append MM_editCmd.CreateParameter("param7", 202, 1, 500, Request.Form("objective")) ' adVarWChar
>     MM_editCmd.Parameters.Append MM_editCmd.CreateParameter("param8", 202, 1, 500, Request.Form("deliverables")) ' adVarWChar
>     MM_editCmd.Parameters.Append MM_editCmd.CreateParameter("param9", 202, 1, 500, Request.Form("issues")) ' adVarWChar
>     MM_editCmd.Execute
>     intNewKeyVal = rsProjects.Execute(MM_editCmd, intRecordsAffected, adCmdText)(0)
>     MM_editCmd.ActiveConnection.Close
>
>     ' append the query string to the redirect URL
>     Dim MM_editRedirectUrl
>     MM_editRedirectUrl = "insertPart2.asp?fKey=intNewKeyVal"
>     If (Request.QueryString <> "") Then
>       If (InStr(1, MM_editRedirectUrl, "?", vbTextCompare) = 0) Then
>         MM_editRedirectUrl = MM_editRedirectUrl & "?" & Request.QueryString
>       Else
>         MM_editRedirectUrl = MM_editRedirectUrl & "&" & Request.QueryString
>       End If
>     End If
>     Response.Redirect(MM_editRedirectUrl)
>   End If
> End If
> %>
-
Inserting data into multiple tables(on DB).
Hi,
I have 2 tables in the DB:

Person{ID, Name, Surname}
Address{city, street, tel, zip}

Both of them are entity classes. I want to insert data from one JSP page. I tried it but I got this error:

javax.servlet.ServletException: Target Unreachable, 'address' returned null
I need some hints.
Thank you.

My DB is located on Derby. I am trying to reach it from JSP pages, like NewPerson.jsp, from which I want to create a new person and his address. It can see the person bean but cannot reach the address bean.
Thanx. -
Importing data into tables with grant access (sql developer 3.2)
Hello,
I want to import data into the table PAY_BALANCE_BATCH_LINES, which is an interface table. I'm logged in to a schema (APPS) and this table belongs to the HR schema. However, if you look at the grants, the APPS schema has full access to this particular table. In TOAD, this used to work great.
But in SQL Developer, when I filter the tables dropdown, I am not able to find this table. Since this is my primary way of uploading data, I'm not sure how else I can get access to upload data into this table. I don't know the password for the HR schema, by the way.
Is there a way out?
Many thanks

Scroll down the tree to the 'Other Users' node, expand it, and then drill down into HR > Tables. Then do your import.
For an alternative browser, right-click on your connection in the tree and open a Schema Browser.