How to batch load data to several tables
Hi,
A customer has data in the structure described below, with a large number of records (around 10 million). I think the right approach is to convert the data into a format that SQL*Loader can recognize and then load it into Oracle 8 or 9. The question is how to do the conversion.
Or would it be simpler to insert the records one by one?
1: Components of the data
The data file consists of a nameplate (header) and a number of records.
1.1 Structure of the nameplate
ID datatype length (bytes) comments
1 char 4
2 char 19
3 char 2
4 char 6 number of records in this file
5 char 8
1.2 Structure of each record
ID datatype length(byte)
1 char 21
2 char 18
3 char 30
4 char 1
5 char 8
6 char 2
7 char 6
8 char 70
9 char 30
10 char 8
11 char 8
12 char 1
13 char 1
14 char 1
15 char 30
16 char 20
17 char 6
18 char 70
19 char 5
24 bin (BLOB) 1024
25 bin (BLOB) length given by field 19
2: Mapping from the data file to tables in the database
Fields 1-13 of each record are inserted into table1,
fields 14-18 into table2, and fields 19, 24 and 25 into table3.
Is there a way to convert this into files that SQL*Loader can read, and then load everything into Oracle 8 or 9 in one pass?
Thanks & Regards.
I've checked the Oracle Utilities docs, but did not find a way to load so many data files in a single batch operation.
As I see it, there are two possible approaches:
1. Load each file individually into the different tables with a custom program. But speed may be a problem because of the constant opening and closing of database connections.
2. Convert the files into one (or three) flat files and then use SQL*Loader.
Neither is particularly easy, so I wonder if there is a better way to handle this.
Many thanks!
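For the second approach, a small script can slice each fixed-length record into the per-table field groups and emit one delimited row per target table. The sketch below is a minimal illustration only: the field lengths come from the layout above, but the delimiter is an assumption, and the BLOB fields (24 and 25) would normally be written to separate files and loaded with SQL*Loader's LOBFILE clause rather than inlined.

```python
# Sketch: split fixed-length records into per-table delimited rows.
# Field lengths (bytes) for character fields 1-19, from the layout above.
FIELD_LENS = [21, 18, 30, 1, 8, 2, 6, 70, 30, 8, 8, 1, 1, 1, 30, 20, 6, 70, 5]

def split_record(rec: bytes):
    """Slice one fixed-length record into its 19 character fields."""
    fields, pos = [], 0
    for length in FIELD_LENS:
        fields.append(rec[pos:pos + length].decode("ascii").strip())
        pos += length
    return fields

def to_rows(rec: bytes):
    """Return (table1_row, table2_row) for one record.
    Fields 1-13 go to table1, fields 14-18 to table2; field 19 and the
    BLOB fields 24/25 (table3) would be written to separate LOB files."""
    f = split_record(rec)
    return f[0:13], f[13:18]

if __name__ == "__main__":
    # Synthetic record: field i filled with the digit (i % 10).
    rec = b"".join((str((i + 1) % 10) * n).encode("ascii")
                   for i, n in enumerate(FIELD_LENS))
    t1, t2 = to_rows(rec)
    print("|".join(t1))
    print("|".join(t2))
```

Each output file can then be loaded with its own SQL*Loader control file (or all targets together in one multi-table control file), which avoids opening a database connection per record.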
Similar Messages
-
How do I load data in Excel into an Oracle table
How do I load data from an Excel file with several worksheets into an Oracle table?
using Oracle 10g
Excel
Sample data from the Excel file:
Name eric Name mary
AccountNo 123 AccountNo 321
amount1 5.0 Amount1 1.0
amount2 5.5 Amount2 2.0
amount3 6.0 Amount3 3.0
Total 16.5 Total 6.0
Name larry Name beth
AccountNo 123 AccountNo 321
amount1 5.0 Amount1 1.0
amount2 5.5 Amount2 2.0
amount3 6.0 Amount3 3.0
Total 16.5 Total 6.0
Note: Assume the data are aligned into columns as in a real Excel workbook.
Hi,
You can make one CSV file per sheet. After that you can use the SQL*Loader utility.
Since 9i, there are also external tables: http://download-west.oracle.com/docs/cd/B14117_01/server.101/b10759/statements_7002.htm#i2129649
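Since the sample sheet holds two key/value pairs per row rather than a normal grid, the rows also need flattening into one CSV line per account before loading. The sketch below is a hedged illustration of that step only: the field list comes from the sample above, and reading the worksheet itself into tuples (e.g. with a library such as openpyxl) is assumed and not shown.

```python
import csv
import io

# Flatten the side-by-side key/value layout shown above into one CSV row
# per account. Each worksheet row holds two "key value" pairs,
# e.g. ("Name", "eric", "Name", "mary").
FIELDS = ["Name", "AccountNo", "amount1", "amount2", "amount3", "Total"]

def flatten(rows):
    """Turn side-by-side key/value rows into a list of account dicts.
    Keys are lowercased to absorb the amount1/Amount1 inconsistency."""
    records = []
    for block_start in range(0, len(rows), len(FIELDS)):
        block = rows[block_start:block_start + len(FIELDS)]
        left, right = {}, {}
        for key1, val1, key2, val2 in block:
            left[key1.lower()] = val1
            right[key2.lower()] = val2
        records.extend([left, right])
    return records

def to_csv(records):
    """Render the records as CSV text, one row per account."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow([f.lower() for f in FIELDS])
    for rec in records:
        writer.writerow([rec.get(f.lower(), "") for f in FIELDS])
    return buf.getvalue()

if __name__ == "__main__":
    rows = [("Name", "eric", "Name", "mary"),
            ("AccountNo", "123", "AccountNo", "321"),
            ("amount1", "5.0", "Amount1", "1.0"),
            ("amount2", "5.5", "Amount2", "2.0"),
            ("amount3", "6.0", "Amount3", "3.0"),
            ("Total", "16.5", "Total", "6.0")]
    print(to_csv(flatten(rows)))
```

The resulting CSV files can then be loaded with SQL*Loader or queried through an external table.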
Nicolas. -
How can I load data into table with SQL*LOADER
How can I load data into a table with SQL*Loader
when the column data is longer than 255 bytes?
When a column exceeds 255 bytes, the data cannot be inserted into the table by SQL*Loader.
CREATE TABLE A (
A VARCHAR2 ( 10 ) ,
B VARCHAR2 ( 10 ) ,
C VARCHAR2 ( 10 ) ,
E VARCHAR2 ( 2000 ) );
control file:
load data
append into table A
fields terminated by X'09'
(A , B , C , E )
SQL*LOADER command:
sqlldr test/test control=A_ctl.txt data=A.xls log=b.log
datafile:
column E is more than 255 bytes
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
Check this out.
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1006961 -
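For reference, the usual cause of the failure above is that SQL*Loader's default datatype for a character field is CHAR(255); giving the long column an explicit length in the control file lifts the limit. A sketch of the corrected control file (the 2000-byte length mirrors the VARCHAR2(2000) column, and the plain-text infile name is an assumption since a binary .xls cannot be loaded directly):

```
load data
infile 'A.txt'
append into table A
fields terminated by X'09'
( A,
  B,
  C,
  E CHAR(2000)  -- explicit length; the default is CHAR(255)
)
```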
How to load data from a table in BW to an R3 table: any FM or program?
I want to load data from a table on the BW side to the R3 side. Any FM or program, if you know of one?
If it is a one-time activity: download the data into an Excel file on the BW side,
then write an ABAP program to read the data from the Excel file on the local workstation and update the table.
Loading data from a table to another
This question is continuation of my earlier thread
Re: query help
The last thread was answered by experts and was closed. But I have another problem, I'll put the whole scenario here:
I have a table with data as
id status_code emp_id
1 a 10
2 b 10
3 a 20
4 c 30
5 k 10
6 k 10
7 a 10
8 a 20
9 z 20
10 k 30
11 k 30
12 m 20
13 k 10
14 k 30
and scripts are:
CREATE TABLE TEST
(id NUMBER,
status_code VARCHAR2(5),
emp_id NUMBER
);
INSERT INTO test VALUES (1,'a',10);
INSERT INTO test VALUES (2,'b',10);
INSERT INTO test VALUES (3,'a',20);
INSERT INTO test VALUES (4,'c',30);
INSERT INTO test VALUES (5,'k',10);
INSERT INTO test VALUES (6,'k',10);
INSERT INTO test VALUES (7,'a',10);
INSERT INTO test VALUES (8,'a',20);
INSERT INTO test VALUES (9,'z',20);
INSERT INTO test VALUES (10,'k',30);
INSERT INTO test VALUES (11,'k',30);
INSERT INTO test VALUES (12,'m',20);
INSERT INTO test VALUES (13,'k',10);
INSERT INTO test VALUES (14,'k',30);
Here's the query that the experts gave me in the last thread, to fetch the desired records:
The requirement was to fetch the records of those emp_ids where a given emp_id consecutively gets the same status code.
Each emp_id gets a status code, and at times it gets the same status code as its previous one. I am interested in such cases.
For example,
emp_id=20 gets status_code=a in id=3 and its next status_code is also =a at id=8,
emp_id=10 gets status_code=k in id=5 and gets same status_code again in id=6,
emp_id=30 gets status_code=k in id=10 and gets same status code in id=11 and id=14.
And the answer was:
select *
from
(
select
id,
status_code,
emp_id,
lag(status_code) over (partition by emp_id order by id asc) as last_status,
lead(status_code) over (partition by emp_id order by id asc) as next_status
from test
)
where status_code = last_status
or status_code = next_status
/
But now the new issue is that the records of table TEST are to be loaded into another table, TEST_REVISED. For each ID of table TEST, there will be one or more records in table TEST_REVISED.
ALTER TABLE TEST ADD (CONSTRAINT id_pk PRIMARY KEY(id));
CREATE TABLE TEST_REVISED
(TR_ID NUMBER PRIMARY KEY,
ID NUMBER REFERENCES TEST(ID),
NEW_STATUS_CODE VARCHAR2(5)
);
This is what I want to see in table TEST_REVISED:
TR_ID, ID, NEW_STATUS_CODE
100, 1, 'test-code'
200, 2, 'test-code'
300, 3, 'test-code'
400, 4, 'test-code'
500, 5, 'test-code'
600, 5, 'test-code'  -- notice here: EMP_ID=10 had the same status consecutively in table TEST, so when that record is loaded into TEST_REVISED, ID=5 is used instead of ID=6
700, 7, 'test-code'
800, 3, 'test-code'  -- notice here: EMP_ID=20 had the same status consecutively in table TEST, so when that record is loaded into TEST_REVISED, ID=3 is used instead of ID=8
900, 9, 'test-code'
1000, 10, 'test-code'
1100, 10, 'test-code'  -- notice here: EMP_ID=30 had the same status consecutively in table TEST, so when that record is loaded into TEST_REVISED, ID=10 is used instead of ID=11
1200, 12, 'test-code'
1300, 13, 'test-code'
1400, 10, 'test-code'  -- notice here: EMP_ID=30 had the same status consecutively in table TEST, so when that record is loaded into TEST_REVISED, ID=10 is used instead of ID=14 or ID=11
Ignore the value of the column NEW_STATUS_CODE here.
How can I load data in table TEST_REVISED from table TEST keeping in view the requirement I mentioned? Table TEST has around 1 million records.
Once this achieved, I want to delete records from table TEST with following ids :
ID=6,8,11,14.
Thanks a lot!
Edited by: user5406804 on Dec 5, 2010 5:49 PM
Edited by: user5406804 on Dec 5, 2010 5:50 PM
OK, I still don't see how this is all supposed to fit together.
You haven't said it, but I'm assuming that when you have a series like
id emp status_code
1 30 k
2 30 k
3 10 a
4 30 k
5 30 k
You actually want the output to be
id emp status_code new_id
1 30 k 1
2 30 k 1
3 10 a 3
4 30 k 4
5 30 k 4
Which presents a bit of a problem using simple analytic functions.
So here's the modified query that picks up just the rows you want 'changed'.
TUBBY_TUBBZ?
select
id,
emp_id,
status_code,
connect_by_root(id) as new_id
from
(
select
id,
emp_id,
status_code,
lag(status_code) over (partition by emp_id, status_code order by id asc) as last_status,
row_number() over (partition by emp_id, status_code order by id asc) as rn
from
(
select
id,
emp_id,
status_code,
lead(status_code) over (partition by emp_id order by id asc) as next_code,
lag(status_code) over (partition by emp_id order by id asc) as last_code,
row_number() over (partition by emp_id order by id asc) as rn
from test
)
where status_code in (last_code, next_code)
)
start with last_status is null
connect by prior emp_id = emp_id
and prior status_code = status_code
and prior rn = rn - 1
ID EMP_ID STATU NEW_ID
5 10 k 5
6 10 k 5
3 20 a 3
8 20 a 3
10 30 k 10
11 30 k 10
14 30 k 10
7 rows selected.
TUBBY_TUBBZ?
You say you have a lot of rows in the TEST table, and I'm hoping and praying this is a one-time requirement and not an ongoing process, so your best bet (performance-wise) would probably be to use the above query to create a temporary table.
Use that temporary table to control the rest of the process (inserting into TEMP_REVISED and deleting from TEST).
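As a hedged sketch of that process (TEST_KEEP, the sequence TR_ID_SEQ, and the literal status code are all assumptions; the connect-by query is the one shown above):

```sql
-- 1. Materialize the mapping produced by the connect-by query above:
--      CREATE TABLE test_keep AS <connect-by query>;
--    giving columns (id, emp_id, status_code, new_id).

-- 2. Load TEST_REVISED, substituting the surviving id for duplicates
--    (tr_id_seq is an assumed sequence for the primary key).
INSERT INTO test_revised (tr_id, id, new_status_code)
SELECT tr_id_seq.NEXTVAL,
       COALESCE(k.new_id, t.id),   -- rows not in test_keep keep their own id
       'test-code'
FROM   test t
       LEFT JOIN test_keep k ON k.id = t.id;

-- 3. Remove the rows whose id was replaced (6, 8, 11, 14 in the example),
--    which no TEST_REVISED row references any longer.
DELETE FROM test
WHERE  id IN (SELECT id FROM test_keep WHERE id <> new_id);
```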
That's the best I can think of based on what I believe your requirement is ... and I honestly don't think it's going to perform overly well (the connect-by functionality specifically). -
Batch load images (tiff) into table with a BLOB column
Hi,
Does anyone know any third-party tool/software to bulk/batch load images into Oracle table? There is an ETL software to regularly pull the images (tiff) from a ftp server to a directory in the database server. There is no way I can hardcode the image filenames into the control file and use SQL*Loader to load the files. Is there any tool/software that can just grab whatever files in the directory and load them into Oracle. Then, I can write a program to extract the filename from the file already loaded into the table and update the filename column.
Thanks.
Andy
Sounds like simple scripting to me...
-- SQL loader example
http://www.orafaq.com/wiki/SQL%2ALoader_FAQ#How_does_one_use_SQL.2ALoader_to_load_images.2C_sound_clips_and_documents.3F
-- dynamically build control file
devuser1:/tmp>>touch image1.gif
devuser1:/tmp>>touch image2.jpg
devuser1:/tmp>>touch image3.jpg
devuser1:/tmp>>ls -l image*
-rw-rw-r-- 1 devuser1 mygrp 0 Jul 10 11:19 image1.gif
-rw-rw-r-- 1 devuser1 mygrp 0 Jul 10 11:19 image2.jpg
-rw-rw-r-- 1 devuser1 mygrp 0 Jul 10 11:19 image3.jpg
devuser1:/tmp>>ls -l image* | awk '{ print NR "," $9}'
1,image1.gif
2,image2.jpg
3,image3.jpg
devuser1:/tmp>>echo "LOAD DATA" > t.ctl
devuser1:/tmp>>echo "INFILE *" >> t.ctl
devuser1:/tmp>>echo "..." >> t.ctl
devuser1:/tmp>>echo "BEGINDATA" >> t.ctl
devuser1:/tmp>>ls -l image* | awk '{ print NR "," $9}' >> t.ctl
devuser1:/tmp>>cat t.ctl
LOAD DATA
INFILE *
BEGINDATA
1,image1.gif
2,image2.jpg
3,image3.jpg
Edited by: maceyah on Jul 10, 2009 12:42 PM
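Since the "..." line in the generated control file above elides the column list, it may help to see the general shape of a control file that loads image files into a BLOB column with the LOBFILE clause. This is a hedged sketch: the table image_tab (id NUMBER, fname VARCHAR2(100), img BLOB) and its column names are assumptions, not the poster's schema.

```
LOAD DATA
INFILE *
INTO TABLE image_tab
FIELDS TERMINATED BY ','
( id,
  fname CHAR(100),
  img   LOBFILE(fname) TERMINATED BY EOF
)
BEGINDATA
1,image1.gif
2,image2.jpg
3,image3.jpg
```

Because the second field doubles as the LOBFILE name, the filename column is populated at load time and no separate update step is needed.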
Loading data from Z table to an ODS in BW/BI
Hello Gurus,
Can some one guide me how do I load data from a Z table which exists in the same BI system into an ODS/DSO. I'm working on a 04S system.
Your help is highly appreciated.
Thanks & Regards,
Prashanth
Hi Prasanth,
You are using the generic extraction method to load data from R/3 to the BW server.
You can use transaction SBIW or RSO2 to create a generic DataSource.
Step 1 - Log on to the R/3 system.
Step 2 - Check the data in the table:
use transaction SE11, enter the DB table name ZXXXXX,
select the "Display" button, then "Contents" (Shift+Ctrl+F11) and execute.
Step 3 - Create a generic DataSource for transaction data:
- Enter transaction RSO2.
- Under transaction data, enter ZXXXXX (specify your DataSource name to create a new one).
- Select the Create icon.
- Application component: browse and choose your application component (e.g. SD).
- Choose "Extraction from DB View" and enter your Z table name as the table/view.
- Enter a short, medium and long description for your DataSource.
- Select the Generic Delta option in the toolbar.
- Give a delta-specific field (field name, e.g. PID) and select the appropriate radio button (e.g. numeric pointer).
- In the settings, select the Additive Delta radio button (for delta loads from the ODS to an InfoCube).
- Save; give a package name, save and continue.
- Customize the DataSource by selecting the selection checkboxes for the fields, and save.
- Message: the DataSource has been created.
On the SAP BW side:
Step 1:
- Enter transaction RSA13.
- Identify the R/3 source system icon and double-click on it.
- Expand BW DataSources, then the SAP application components, and select your application component (e.g. SD).
- From the context menu, choose Replicate DataSources; refresh the tree once the replication is complete.
- Find your DataSource and double-click on its icon (this implies the DataSource is not yet assigned).
- From the context menu, choose Assign InfoSource.
- In the InfoSource assignment, select the Others radio button and choose Create.
- Choose Flexible Update and enter a description, then continue.
- Create InfoObjects with reference to the R/3 source system fields and assign them to the fields respectively.
- Enter 0RECORDMODE in the communication structure and activate.
- Create the ODS object, create its structure, and activate.
- Create update rules for the ODS object with reference to the InfoSource, and activate.
- Create an InfoPackage and schedule the data load; monitor the data in the PSA and in the ODS object tables.
How do we use Data rules/error table for source validation?
How do we use Data rules/error table for source validation?
We are using OWB repository 10.2.0.3.0 and OWB client 10.2.0.3.33. The Oracle version is 10 G (10.2.0.3.0). OWB is installed on Linux.
I reviewed the posting
Re: Using Data Rules
Thanks for this forum.
I want to apply data rules to source table/view and rule violated rows should go to defined error table. Here is an example.
Table ProjectA
Pro_ID Number(10)
Project_name Varchar(50)
Pro_date Date
As per above posting, I created the table in object editor, created the data rule
NAME_NOT_NULL (ie project name not null). I specified the shadow table name as ProjectA_ERR
In the mapping editor, I have ProjectA as the source. I did not find the error table name and the defined data rules in the table properties, and the ERR group is not showing up in the source table.
How do we bring the defined data rules and error table into mapping?
Are there any additional steps/process?
Any idea ?
Thanks in advance.
RI
Hi,
Thanks for your reply/pointer. I reviewed the blog. It is interesting.
What is the version of OWB used in this blog?
After defining the data rule/shadow table, I deployed the table via the Control Center. It created an error table containing all the source columns in alphabetical order. If the primary key is the first column of my source (and does not start with 'A'), it will appear in the middle of the columns in the error table.
How do we prevent this or work around it?
If I have the source (view) in schema A, how do we create the error table in schema B for that source (view)?
Is that feasible?
I brought the error table details in mapping. Configured the data rules/error tables.
If I picked up 'MOVE TO ERROR' option, I am getting "VLD-2802 Missing delete matching criteria in table. the condition is needed because the operator contain at least one data rule with a MOVE TO ERROR action"
On condition Loading - I have 'All constraints' for matching criteria.
I changed to "no constraints' still I get the above error.
If I change to 'REPORT' option instead of 'MOVE TO ERROR' option, error goes off.
Any idea?
Thanks in advance.
RI -
How can I load data from an Oracle table to a text file or CSV file using PL/SQL procedures, where the PL/SQL code takes the table name dynamically?
soumen
Try this thread..
Is it possible to export a pl/sql region as a csv file? -
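For completeness, here is a minimal hedged sketch of such a dynamic export using DBMS_SQL and UTL_FILE. The directory object EXPORT_DIR and the procedure name are assumptions; DBMS_ASSERT guards the concatenated table name, and every column is fetched as text, so dates and numbers come out in session default format.

```sql
-- Minimal sketch, assuming a directory object EXPORT_DIR already exists
-- (CREATE DIRECTORY export_dir AS '/tmp') and the caller may read it.
CREATE OR REPLACE PROCEDURE dump_table_to_csv (p_table IN VARCHAR2) AS
  l_cur   INTEGER := DBMS_SQL.OPEN_CURSOR;
  l_cols  DBMS_SQL.DESC_TAB;
  l_ncols INTEGER;
  l_val   VARCHAR2(4000);
  l_line  VARCHAR2(32767);
  l_file  UTL_FILE.FILE_TYPE;
  l_dummy INTEGER;
BEGIN
  l_file := UTL_FILE.FOPEN('EXPORT_DIR', p_table || '.csv', 'w');
  -- Parse the dynamic query; DBMS_ASSERT rejects invalid object names.
  DBMS_SQL.PARSE(l_cur,
                 'SELECT * FROM ' || DBMS_ASSERT.SQL_OBJECT_NAME(p_table),
                 DBMS_SQL.NATIVE);
  DBMS_SQL.DESCRIBE_COLUMNS(l_cur, l_ncols, l_cols);
  FOR i IN 1 .. l_ncols LOOP
    DBMS_SQL.DEFINE_COLUMN(l_cur, i, l_val, 4000);  -- fetch all as text
  END LOOP;
  l_dummy := DBMS_SQL.EXECUTE(l_cur);
  WHILE DBMS_SQL.FETCH_ROWS(l_cur) > 0 LOOP
    l_line := NULL;
    FOR i IN 1 .. l_ncols LOOP
      DBMS_SQL.COLUMN_VALUE(l_cur, i, l_val);
      l_line := l_line || CASE WHEN i > 1 THEN ',' END || l_val;
    END LOOP;
    UTL_FILE.PUT_LINE(l_file, l_line);
  END LOOP;
  DBMS_SQL.CLOSE_CURSOR(l_cur);
  UTL_FILE.FCLOSE(l_file);
END;
```

A real version would also quote values containing commas and add an exception handler to close the cursor and file on error.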
How to update the data the transparent table CKMLMV013 ?
How to update the data the transparent table CKMLMV013 ?
can you please replay asap.
thanks,
sambaHi,
if you do a "where used" search of the table CKMLMV013 you will see that it is updated in several places:
e.g. fm CKML_ORDER_CONNECTION_UPDATE
Have a look at those places where the table is updated.
Best regards. -
Hi,
How to load data from an external table to a transaction table using SQLLDR?
You use an external table to load the data; it is described in the link to the manual I provided.
Here is an example.
Re: Using DML order to import a .xls file
You would not be using SQLLDR though as external tables replace that functionality. -
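To illustrate that suggestion, a minimal external-table sketch might look like this; the directory object DATA_DIR, the file txn.csv, and the tables txn_ext/txn_tab are all hypothetical:

```sql
-- An external table over a CSV file, then a plain INSERT ... SELECT
-- into the real (transaction) table; no SQL*Loader invocation needed.
CREATE TABLE txn_ext (
  id     NUMBER,
  amount NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('txn.csv')
);

INSERT INTO txn_tab (id, amount)
SELECT id, amount FROM txn_ext;
```

The advantage over SQLLDR is that the load is ordinary SQL, so you can filter, join, and transform in the same statement.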
How to get the data from Pooled Table T157E.
Hi Experts,
How to get the data from Pooled Table T157E.
Any help.
Thanks in Advance,
Yours, Harsha.
Create an internal table similar to T157E and pass all the data into it per SPRAS.
After that, use the internal table in your program as per the requirement.
Regds,
Anil -
In ADF, how can I insert data into multiple tables if they have a foreign key
I have started working on ADF. Can anybody tell me how, in ADF, I can insert data into multiple tables related by a foreign key, please?
Thank you very much.
Hello,
Still no luck. I am surely making silly mistakes. Anyway, here are my workings:
1> student_mst (id(pk),studentname) and student_guard_mst(id(fk),guardianname)
2> created EO from both of the tables,made id in both EO as DBSequence and an association was also generated.
3> i made that association composite by clicking the checkbox
4> i created 2 VO from 2 EO.
5> put those VO in Application Module.
6> dragged and dropped 2 VO on my jspx page and dropped them as ADF Form.
Now what to do please? -
How to validate the dates in the table control ?
How to validate the dates in the table control ?
Can I write like this ?
LOOP AT it_tab .
CHAIN.
FIELD : it_tab-strtdat,it_tab-enddat.
module date_validation.
ENDCHAIN.
ENDLOOP.
Module Date_validation.
ranges : vdat type sy-datum.
vdat-sign = 'I'.
VDAT-LOW = it_tab-STRTDAT.
VDAT-HIGH = it_tab-ENDDAT.
VDAT-OPTION = 'BT'.
APPEND VDAT.
What code do I have to write here to validate?
And if I write it like this, how can we know which is the current row being added?
Does it loop over the whole internal table?
Bye,
Muttu.
Hi,
I think there is no need to put CHAIN ... ENDCHAIN.
To do the validation, you have to write a module in PAI which performs the required checks.
Thanks
DARSHAN PATEL -
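A minimal sketch of such a PAI module, assuming the check is simply that the end date must not precede the start date (the message class and number are illustrative, not real ones): inside the table control's LOOP, the module sees the fields of the current row, so no manual row index is needed.

```abap
MODULE date_validation INPUT.
* Runs once per table-control row inside LOOP AT it_tab;
* it_tab-strtdat / it_tab-enddat hold the current row's values.
  IF it_tab-enddat < it_tab-strtdat.
*   Illustrative message; use your own message class and number.
    MESSAGE e001(zmsg) WITH 'End date is before start date'.
  ENDIF.
ENDMODULE.
```

An error message (type E) inside CHAIN/ENDCHAIN re-opens exactly the fields named in the FIELD statements for correction, which is the one reason to keep the CHAIN after all.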
How to delete the data from partition table
Hi all,
I am very new to partitioning concepts in Oracle.
My question is how to delete the data from a partitioned table.
Will the query below work?
delete from table1 partition (P_2008_1212)
We have defined a range partition.
Please help me understand how to delete the data from a partitioned table.
Thanks
Sree
874823 wrote:
delete from table1 partition (P_2008_1212)
This approach is wrong. As Andre pointed out, this is not how partitioned tables should be used.
Oracle supports different structures for data and indexes. A table can be a hash table or index organised table. It can have B+tree index. It can have bitmap indexes. It can be partitioned. Etc.
How the table implements its structure is a physical design consideration.
Application code should only deal with the logical data structure. How that data structure is physically implemented has no bearing on application. Does your application need to know what the indexes are and the names of the indexes,in order to use a table? Obviously not. So why then does your application need to know that the table is partitioned?
When your application code starts referring directly to physical partitions, it needs to know HOW the table is partitioned. It needs to know WHAT partitions to use. It needs to know the names of the partitions. Etc.
And why? All this means is increased complexity in application code as this code now needs to know and understand the physical data structure. This app code is now more complex, has more moving parts, will have more bugs, and will be more complex to maintain.
Oracle can take an app SQL and it can determine (based on the predicates of the SQL), which partitions to use and not use for executing that SQL. All done totally transparently. The app does not need to know that the table is even partitioned.
This is a crucial concept to understand and get right.
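For example, instead of naming the partition, a predicate on the partition key lets Oracle prune to the right partition transparently; the column name and date range below are hypothetical, matching what a partition named P_2008_1212 suggests:

```sql
-- Instead of: DELETE FROM table1 PARTITION (P_2008_1212)
-- delete by predicate and let Oracle prune to the right partition(s):
DELETE FROM table1
WHERE  sale_date >= DATE '2008-12-01'
AND    sale_date <  DATE '2009-01-01';
```

The statement stays correct even if the partitioning scheme is later changed, because it never mentions the physical structure.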
Maybe you are looking for
-
Co41 enhancement for add field in ALV output
I must add a custom filed to ALV output of transaction CO41 and i trying to use all the 25 enhancements provided (that i find in other post CO41- Enhancement ), but i haven't found any way to get the desired results. Can any body help me? Thanks in a
-
"itunes.exe - No disk There is no disk in the drive.
The following message shows up whenever I open itunes: "itunes.exe - No disk There is no disk in the drive. Please insert a disk into drive\Device\harddisk1\DR2" Can anyone tell me what this means, and more importantly how do I make it stop? As long
-
And when I try to re-install iTunes I get a message from iTunes: ERROR 7 (Windows error 1114). Now how do I sort this out?
-
Web Service - Invoke Web Service not trapping errors
Greetings All: I am having an issue trying to trap errors coming out of the Invoke Web Service service. I have a Fault Route set up for SYSTEM EXCEPTION, and direct it to an ExecuteScript service. If the process I am developing fails, it appears to l
-
XML as CLOB parameter to procedure
I'm trying to insert data from an XML file into tables/columns via a stored procedure. The procedure is part is as follows: create or replace procedure ParseSomeXMLTest xmlDoc in CLOB, someID out int as doc xmldom.DOMDocument; parser xmlparser.Parser