Loading millions of rows using SQL*Loader into a table with constraints
I have a table with constraints and I need to load millions of rows into it using SQL*Loader.
What is the best way to do this, i.e. which SQL*Loader options to use to get the best loading performance, and how should I deal with the constraints?
Regards
- Check whether your table has check constraints (such as NOT NULL columns). If you trust the data in the file you have to load, you can disable these constraints before the load and re-enable them afterwards.
- Check whether you can modify the table and place it in NOLOGGING mode (it generates less redo, but only under some conditions).
Hope it helps
Rui Madaleno
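For reference, the advice above might look like the following sketch; the table, constraint, and file names are placeholders, and the options assume a direct path load is acceptable:

```sql
-- Sketch only: names are placeholders. Disable the check constraints
-- you trust the data against, cut redo with NOLOGGING, then load.
ALTER TABLE big_table MODIFY CONSTRAINT big_table_ck DISABLE;
ALTER TABLE big_table NOLOGGING;

-- SQL*Loader invocation (shell); direct=true is usually the biggest
-- single win for multi-million-row loads:
--   sqlldr userid=scott/tiger control=big_table.ctl direct=true log=big.log

-- Afterwards, re-enable the constraints and restore logging:
ALTER TABLE big_table MODIFY CONSTRAINT big_table_ck ENABLE;
ALTER TABLE big_table LOGGING;
```

Note that after a NOLOGGING load the new data is not protected by the redo stream, so take a backup once the load completes.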
Similar Messages
-
Loading a flat table with duplicate rows in SQL server
Hi,
I'm trying to load a flat table with different levels that has duplicate rows. When loading it from my source SQL Server environment to the target SQL Server environment, I can only load 63 rows out of the 1225 rows. This is happening because I had to define a primary key on a couple of columns.
When I just try to load it without a primary key, I get an error that a PK needs to be defined for the load to happen.
My table structure looks as follows -
Lvl1 Lvl2 Lvl3 Lvl4 AccountID AccountDesc
How do I load all rows of data into my target table using ODI?
Please help
whirlpool wrote:
Hi,
I'm trying to load a flat table
What is a flat table? Are you talking about a FACT table?
When I'm loading it from my source SQL Server environment to the target SQL Server environment, I can only load 63 rows out of the 1225 rows. This is happening because I had to define a primary key on a couple of columns.
When I just try to load it without a primary key, I get an error that a PK needs to be defined for the load to happen.
Which IKM is in use? I cannot recall an IKM that needs a PK. The Incremental Update IKM needs an update key, which can be a PK or UK at the database level or at the ODI level.
My table structure looks as follows -
Lvl1 Lvl2 Lvl3 Lvl4 AccountID AccountDesc
How do I load all rows of data into my target table using ODI?
If you are not bothered about the PK at the target, then you can go for the SQL Control Append IKM to load your target table.
Thanks,
Sutirtha -
How can I load data into a table with SQL*Loader
How can I load data into a table with SQL*Loader
when the column data length is more than 255 bytes?
When a column exceeds 255 bytes, the data cannot be inserted into the table by SQL*Loader.
CREATE TABLE A (
A VARCHAR2 ( 10 ) ,
B VARCHAR2 ( 10 ) ,
C VARCHAR2 ( 10 ) ,
E VARCHAR2 ( 2000 ) );
control file:
load data
append into table A
fields terminated by X'09'
(A , B , C , E )
SQL*LOADER command:
sqlldr test/test control=A_ctl.txt data=A.xls log=b.log
datafile:
column E is more than 255bytes
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
Check this out:
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1006961 -
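That chapter covers SQL*Loader field datatypes; the usual explanation for this symptom (an assumption here, but consistent with the linked section) is that SQL*Loader defaults character fields to CHAR(255), so the long column needs an explicit length in the control file:

```sql
-- Control file sketch: give E an explicit length so it can exceed
-- the 255-byte default that SQL*Loader applies to character fields.
LOAD DATA
APPEND INTO TABLE A
FIELDS TERMINATED BY X'09'
( A,
  B,
  C,
  E CHAR(2000)   -- match the VARCHAR2(2000) column definition
)
```

Also note that SQL*Loader reads plain text; a file named A.xls would need to be saved as tab-delimited text first.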
Mutiple Rows from a Single Row using SQL
How can i get Multiple rows from a single row using sql ?
Example : one row contains the complete address separated by delimiter say comma (,) as address1,address2,city,state,zip,country
I want to split this row and get the output in multiple rows as address1 address2 city state zip country using sql query.
Thanks,
Hi,
The solution above assumes that the |-delimited entries always contain at least one character. If you have a string like
1 Elm Street|||Sioux City|IA||
it will think 'Sioux City' is address2.
If you have empty entries, like that, then you need something a little more complicated:
INSERT INTO table2
( address1
, address2
, address3
, city
, state
, postal
, country
)
SELECT SUBSTR (REGEXP_SUBSTR ('|' || txt, '\|[^|]*', 1, 1), 2) -- address1
, SUBSTR (REGEXP_SUBSTR ('|' || txt, '\|[^|]*', 1, 2), 2) -- address2
, SUBSTR (REGEXP_SUBSTR ('|' || txt, '\|[^|]*', 1, 3), 2) -- address3
, SUBSTR (REGEXP_SUBSTR ('|' || txt, '\|[^|]*', 1, 4), 2) -- city
, SUBSTR (REGEXP_SUBSTR ('|' || txt, '\|[^|]*', 1, 5), 2) -- state
, SUBSTR (REGEXP_SUBSTR ('|' || txt, '\|[^|]*', 1, 6), 2) -- postal
, SUBSTR (REGEXP_SUBSTR ('|' || txt, '\|[^|]*', 1, 7), 2) -- country
FROM table1
; -
How to Use SQL Query having IN Clause With DB Adapter
Hi,
I am using 11.1.1.5 want to find out how to Use SQL Query having IN Clause With DB Adapter. I want to pass the IN values dynamically. Any ideas.
Thanks
Invoke a stored procedure; it's safer than trying to put together an arbitrary SQL statement in the JCA adapter.
-
Sql*loader - load data in table with multiple condition
Hi,
I have Oracle 9i on Sun Solaris and I need to load data into one of the Oracle tables using SQL*Loader, with one column's data depending on a condition.
My table is like:
Load_table (
col1 varchar2(10),
col2 varchar2(10),
col3 varchar2(10)
)
Now I have to load the data like this:
If col2 = US1 then col3 = 'AA'
If col2 = US2 then col3 = 'BB'
If col2 = US3 then col3 = 'CC'
How can I load this data into the table using SQL*Loader?
Thanks,
Pora
Hi,
here is a half-solution.
You have to:
1. open file
2. take a line
3. split the line into values (using SUBSTR)
4. check condition (01 or 02)
5. do a proper insertion
Good Luck,
Przemek
DECLARE
v_dir VARCHAR2(50) := 'd:/tmp/'; --directory where file is placed
v_file VARCHAR2(50) := 'test.txt'; -- file name
v_fhandle UTL_FILE.FILE_TYPE; ---file handler
v_fline VARCHAR2(906); --file line
v_check VARCHAR2(50);
BEGIN
v_fhandle := UTL_FILE.FOPEN(v_dir, v_file, 'R'); --open file for read only
LOOP -- in the loop
UTL_FILE.GET_LINE( v_fhandle , v_fline); -- get line by line from file
if (substr(v_fline,17,2) = '01') then --check the value
INSERT INTO ... -- Time_in
else
INSERT INTO ... -- Time_out
end if;
END LOOP;
EXCEPTION
WHEN NO_DATA_FOUND
THEN UTL_FILE.FCLOSE( v_fhandle );
END; -
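As an alternative to reading the file in PL/SQL, the same conditional value can often be computed directly in the SQL*Loader control file; this is a sketch under the assumption that col3 is not present in the data file and is derived entirely from col2 (names taken from the original post):

```sql
-- Control file sketch: EXPRESSION tells SQL*Loader that col3 is not
-- read from the data file but computed from the other fields.
LOAD DATA
INFILE 'load_data.txt'
APPEND INTO TABLE Load_table
FIELDS TERMINATED BY ','
( col1,
  col2,
  col3 EXPRESSION "decode(:col2, 'US1', 'AA', 'US2', 'BB', 'US3', 'CC')"
)
```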
Loading a big table with SQL*LOADER
I'm loading a large quantity of data through SQL*Loader; the number of rows I need to load into the Oracle table is 339,582,194. I want to load this data more quickly.
Does anybody know how I can accelerate this process, or another way to do it more efficiently?
Thanks.
Direct path?
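The one-word reply refers to SQL*Loader's direct path mode; a hedged sketch of what that might look like (file and table names are placeholders):

```sql
-- Control file sketch (big.ctl). UNRECOVERABLE skips redo generation
-- for the direct path load, so back the table up afterwards.
UNRECOVERABLE
LOAD DATA
INFILE 'big.dat'
APPEND INTO TABLE big_table
FIELDS TERMINATED BY ','
( col1, col2, col3 )

-- Invocation (shell); direct=true enables direct path:
--   sqlldr userid=test/test control=big.ctl log=big.log direct=true
```

For hundreds of millions of rows it can also pay to split the data file and run several sqlldr sessions with parallel=true.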
-
Error displaying a jpg file loaded into a table with blob field
This may not be the correct forum for this question, but if it isn't could someone direct me to the correct one.
I have created a table with a blob field in which I have loaded a jpg image. This appeared to work correctly, but when I try to display the image in internet explorer it comes back and tells me that it doesn't recognize the file type. Enclosed is the table create, load, and display pl/sql code. Can anyone tell me what I am doing wrong. Thanks. For the mime/header I used owa_util.mime_header('images/jpg') because my image is a jpg file.
The database is 10g
-- Create table
create table PHOTOS (
IMAGEID NUMBER(10),
IMAGE BLOB,
IMAGE_NAME VARCHAR2(50)
);
load image
CREATE OR REPLACE PROCEDURE load_file ( p_id number, p_photo_name in varchar2) IS
src_file BFILE;
dst_file BLOB;
lgh_file BINARY_INTEGER;
BEGIN
src_file := bfilename('SNAPUNCH', p_photo_name);
-- insert a NULL record to lock
INSERT INTO photos (imageid, image_name, image)
VALUES (p_id , p_photo_name, EMPTY_BLOB())
RETURNING image INTO dst_file;
-- lock record
SELECT image
INTO dst_file
FROM photos
WHERE imageid = p_id AND image_name = p_photo_name
FOR UPDATE;
-- open the file
dbms_lob.fileopen(src_file, dbms_lob.file_readonly);
-- determine length
lgh_file := dbms_lob.getlength(src_file);
-- read the file
dbms_lob.loadfromfile(dst_file, src_file, lgh_file);
-- update the blob field
UPDATE photos
SET image = dst_file
WHERE imageid = p_id
AND image_name = p_photo_name;
-- close file
dbms_lob.fileclose(src_file);
END load_file;
display image
PROCEDURE display_image(p_id NUMBER) IS
Photo BLOB;
v_amt NUMBER DEFAULT 4096;
v_off NUMBER DEFAULT 1;
v_raw RAW(4096);
BEGIN
-- Get the blob image
SELECT image
INTO Photo
FROM PHOTOS
WHERE IMAGEID = p_id;
owa_util.mime_header('images/jpg');
BEGIN
LOOP
-- Read the BLOB
dbms_lob.READ(Photo, v_amt, v_off, v_raw);
-- Display image
htp.prn(utl_raw.cast_to_varchar2(v_raw));
v_off := v_off + v_amt;
v_amt := 4096;
END LOOP;
dbms_lob.CLOSE(Photo);
EXCEPTION
WHEN NO_DATA_FOUND THEN
NULL;
END;
END;
The url I enter is: http://webdev:7777/tisinfo/tis.tiss0011.Display_Image?p_id=1
Just a little more information. When I enter owa_util.mime_header('image/jpeg') I can't display the file. It just shows up with a red X for the file.
When I enter owa_util.mime_header('image/jpg') it displays the file, but in the format
¿¿¿¿JFIF¿¿-Intel(R) JPEG Library, version [2.0.16.48]¿¿C
This is the way I would expect it to look if I opened it with Notepad, or with an application that doesn't recognize jpg files. Can anyone tell me what I am doing wrong? Thanks. -
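The thread has no accepted answer, but one common cause of raw JFIF text like the above (an assumption, not confirmed in the thread) is that htp.prn with utl_raw.cast_to_varchar2 pushes the binary data through character-set conversion. With mod_plsql, wpg_docload.download_file streams a BLOB without conversion, roughly:

```sql
-- Sketch: serve the BLOB binary-safe. 'image/jpeg' is the standard
-- MIME type for JPEG images ('images/jpg' is not a valid MIME type).
CREATE OR REPLACE PROCEDURE display_image (p_id NUMBER) IS
  photo BLOB;
BEGIN
  SELECT image INTO photo FROM photos WHERE imageid = p_id;
  owa_util.mime_header('image/jpeg', FALSE);        -- keep the header open
  htp.p('Content-Length: ' || dbms_lob.getlength(photo));
  owa_util.http_header_close;
  wpg_docload.download_file(photo);                 -- no charset conversion
END display_image;
/
```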
Error message using SQL import and Export Wizard with Excel spread sheet
I am trying to import an Excel spreadsheet using the Import and Export Wizard that is provided with SQL Server 2014.
Everything seems to set up OK, but when I go to do the transfer an error message comes up saying:
External Table is not in the expected format. (Microsoft Jet Database Engine)
This is a spreadsheet and there is no database; that is why I want to import it into a database! So the reference to JET is a tad disappointing.
So why the error and how do I fix the problem?
As there are over 100 variables in the spreadsheet rows, it would be great to have this automatically create the database and populate the fields.
I am using SQL Server 2014 Express and Office Excel 2013.
Thank you in advance for taking the time to read this, and hopefully shedding some light on the issue.
Hi AWlcurrent,
When importing a .xlsx file to SQL Server using the SQL Server Import and Export Wizard, you see an "External Table is not in the expected format. (Microsoft Jet Database Engine)" error message. This error means that the Microsoft JET Database Engine is unable to handle something that is contained in the file.
So please make sure there is no unsupported content in the Excel file. Alternatively, we can use 'Microsoft Excel' as the Data Source and then select 'Microsoft Excel 2007' as the Excel version to import the file.
If there are any other questions, please feel free to ask.
Thanks,
Katherine Xiong
TechNet Community Support -
Using sql query as a table name
hello,
I have a table (say table1) which stores the names of some other tables. I want to retrieve the name of a table from table1 using a SQL query and then use the result of that query as the table name to access the data from the retrieved table. How can I do this?
ex:
select * from (select tablename from table1 where tableid='1');
I want to do something like this. How can I do this?
I want to access the name of a table from table1 using a SQL query and then use the result of this query as the table name to access the data from the retrieved table. How can I do this?
e.g. like this:
SQL> with table1 as (
select 'emp' tablename from dual
)
select extractvalue(x.column_value, 'ROW/ENAME') ename
from table1, xmltable('ROWSET/ROW' passing dbms_xmlgen.getxmltype('select * from ' || tablename)) x
ENAME
SMITH
ALLEN
WARD
JONES
MARTIN
BLAKE
CLARK
SCOTT
KING
TURNER
ADAMS
JAMES
FORD
MILLER
14 rows selected. -
Loading Data into Table with Complex Transformations
Hello Guys
I am trying to load data into one of the dimension tables; it has quite a few transformations, and I created 6 temp tables.
1. It has 7 columns and gets 935 rows based on a WHERE condition.
2. It has 10 columns and also gets 935 rows, but it has NULLs in it, i.e. column 1 has 500 values, column 2 has 300, etc.
3, 4, 5, 6 are all the same as table 2.
At the end, when I try to join all the temp tables into the target on Product_id, which is in each temp table,
I get an error saying the primary key constraint is violated, i.e. non-unique values are being inserted into the Product_id column of the target table, and the job runs for hours.
The main problem is that some of the rows have the same Product_id.
Please help me
I have been Trying for 1 week and i am in Full pressure
Thanks
Sriks
Edited by: Sriks on Oct 16, 2008 6:43 PM
Hi,
If you are creating a warehouse and product_key is your PK, then it should occur only once, so you might have to rethink the logic for getting the data. To get over the issue you can disable the constraint and load without it, but I would rather you look at the logic and make sure you have only one product_key per row in the table.
Regards
Bharath -
Can i use OEM to automaticaly fill tables with data ?
Hi. Currently I have tables that contain data about our databases, such as the characteristics of the servers where these DBs are stored; users enter these data manually via an interface. I know that OEM has this information, and I want to use it to fill the tables automatically. Is that doable?
Apparently you know nothing of OEM architecture. The obvious would be to look up documentation, sadly you belong to the majority of people here refusing to read any documentation.
OEM comes in the flavors Grid Control (monitoring multiple databases) and Database Control (monitoring one database).
OEM needs a repository. Using Grid Control this is located in a separate database, and each server has the Intelligent Agent installed to collect information from that server, provided the database is registered against grid control.
Using Database Control, the repository is in the monitored database.
Having OEM 'feed' your 'repository' would be about the poorest solution you can implement. Converting your 'repository' in views on the OEM schema would be the only viable solution, as Grid Control automatically has the most recent info.
And obviously, for those who read, the OEM info is in the SYSMAN schema.
Sybrand Bakker
Senior Oracle DBA -
How to make an update of millions of rows using only one commit statement?
Hi, I need to execute a complex update statement over a partitioned table. I take advantage of partitioning: I loop over each partition, do the update, and commit. That way I'm updating and then committing around 600,000 rows at a time.
But some of our systems run Oracle Standard Edition, where partitioning is not supported. I want to perform the same update without relying on partitioning, and that is my problem: I need to update around 15,000,000 rows, but if I commit only at the end, the update generates a lot of UNDO data and fails because there is not enough space for retention.
I would like to know your suggestions. Is there some way to avoid UNDO data generation? Is there some way to execute commits automatically?
Thanks for your support.
>
This is exactly what I was looking for. It's a shame it's only available in 11.2.
>
Then you may be interested in 'Do-It-Yourself Parallelism' for your 10g version, to accomplish the same thing very effectively.
See the 'Do-It-Yourself Parallelism' section of this article by Tom Kyte.
http://www.oracle.com/technetwork/issue-archive/2009/09-nov/o69asktom-089919.html
>
In my book Expert Oracle Database Architecture , I spent quite a few pages describing how to perform batch operations “in parallel,” using a do-it-yourself parallelism approach. The approach was to break up a table into ranges, using rowids (see www.bit.ly/jRysl for an overview) or primary key ranges (see “Splitting Up a Large Table”). Although the approach I described was rather straightforward, it was always also rather manual. You had to take my “technique” and tweak it for your specific circumstances.
>
You really should get the book since it has ALL of the details. But the above quote has a link to that "Splitting Up a Large Table" doc, and that doc shows how to do it:
http://www.oracle.com/technetwork/issue-archive/2006/06-jan/o16asktom-101983.html
>
Splitting Up a Large Table
I would like to partition a range of values into balanced sets. Initially I figured that one of the analytics functions might be useful for this and decided to look into these and learn more about them. The question I had in mind was, "For an ordered list of values, how can we 'chop' them into ranges and then list the first and last value for each range?" -
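The rowid-range idea from the article can be sketched as follows; table and column names are placeholders, and this simplified version builds the ranges with NTILE rather than from DBA_EXTENTS as the article does:

```sql
-- Sketch: split the table into 16 rowid ranges and update each range
-- in its own transaction, so UNDO usage stays bounded per chunk.
DECLARE
  CURSOR chunks IS
    SELECT MIN(rid) lo, MAX(rid) hi
    FROM (SELECT rowid AS rid,
                 NTILE(16) OVER (ORDER BY rowid) AS grp
          FROM big_table)
    GROUP BY grp;
BEGIN
  FOR c IN chunks LOOP
    UPDATE big_table
    SET    some_col = UPPER(some_col)   -- the complex update goes here
    WHERE  rowid BETWEEN c.lo AND c.hi;
    COMMIT;                             -- one commit per chunk
  END LOOP;
END;
/
```

To actually run the chunks in parallel, each range would be submitted as its own job, which is the point of the article; the sketch above only shows the chunked-commit part.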
Multiple rows to multiple columns on one row using SQL
Hi
I am attempting to select back multiple values for a specific key on one row. See the example below. I have been able to use the sys_connect_by_path to combine the fields into one field but I am unable to assign them to fields of their own. See the example below
TABLE DETAILS:
Policy id plan name
111 A Plan
111 B Plan
111 Z Plan
112 A Plan
112 Z Plan
My desired result is to be able to show the output as follows
Policy ID Plan_1 Plan_2 Plan_3
111 A Plan B Plan Z Plan
112 A Plan Z Plan
Can you help???
Thanks for all the replies. Perhaps I could give a little more detail, including a sample table and insert statements. The responses work fine, but the problem I was having was that I did not want to have to hardcode the plan_name into a DECODE statement. The list of plans is not exhaustive; there could be numerous different plans in the table. I have amended the details below slightly to try and give a little more information, as I was probably not too clear at the start. Would you know if there is a way to do this without hardcoding the plan values?
Thanks in advance!
CREATE TABLE TEST_SAMPLE (
POLICY_NUMBER VARCHAR2(10),
plan_name varchar2(20) );
INSERT INTO TEST_SAMPLE VALUES ('111', 'A Plan');
INSERT INTO TEST_SAMPLE VALUES ('111', 'B Plan');
INSERT INTO TEST_SAMPLE VALUES ('111', 'C Plan');
INSERT INTO TEST_SAMPLE VALUES ('112', 'J Plan');
INSERT INTO TEST_SAMPLE VALUES ('112', 'Z Plan');
My desired result is to be able to show the output as follows
Policy ID Plan_1 Plan_2 Plan_3
111 A Plan B Plan C Plan
112 J Plan Z Plan -
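One way to pivot without hardcoding the plan names (only the maximum number of plan columns is fixed) is to number each policy's plans and pivot on that number; a sketch against the TEST_SAMPLE table above:

```sql
-- Rank each policy's plans alphabetically, then turn ranks 1..3 into
-- columns; the plan names themselves are never hardcoded.
SELECT policy_number,
       MAX(DECODE(rn, 1, plan_name)) AS plan_1,
       MAX(DECODE(rn, 2, plan_name)) AS plan_2,
       MAX(DECODE(rn, 3, plan_name)) AS plan_3
FROM  (SELECT policy_number, plan_name,
              ROW_NUMBER() OVER (PARTITION BY policy_number
                                 ORDER BY plan_name) AS rn
       FROM test_sample)
GROUP BY policy_number
ORDER BY policy_number;
```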
Convert Column to rows using sql
My client relies on a legacy system which currently outputs data in the following format:
001
100test1
110addr1
115addr2
120city
125state
999drr
001
100test1
110addr1
115addr2
120city
125state
130XXX
135YYY
140ZZZ
999drr
001
100test1
110addr1
115addr2
120city
125state
145AAA
150BBB
155CCC
999drr
I want to transform the above data into the format, as shown below:
Name | ADDR1 | ADDR2 | CITY | STATE | ColX | ColY | ColZ | ColA | ColB | ColC | .... | LastCol
100test1 | 110addr1 | 115addr2 | 120city | 125state | null | null | null | null | null | null | ... | null
100test1 | 110addr1 | 115addr2 | 120city | 125state | 130XXX | 135YYY | 140ZZZ | null | null | null | ... | null
100test1 | 110addr1 | 115addr2 | 120city | 125state | null | null | null | 145AAA| 150BBB | 155CCC | ... |null |
When I tried it, it only works when the number of rows between '001' and '999' is fixed. But in my case the data I get is never fixed: there can be any number of rows between '001' and '999', and rows with null values do not show up at all in the flat file (as shown above in the third block).
Can anyone help with how to achieve this kind of data transformation? What sequence of steps or scripts do I have to run to get this?
Any help is appreciated.
Thanks ahead
prema
I have considered that 7 consecutive rows form one record. Change the 7 in the query according to your actual data.
SQL> with t
2 as
3 (
4 select '001' detail from dual
5 union all
6 select '100test1' from dual
7 union all
8 select '110addr1' from dual
9 union all
10 select '115addr2' from dual
11 union all
12 select '120city' from dual
13 union all
14 select '125state' from dual
15 union all
16 select '999drr' from dual
17 union all
18 select '002' from dual
19 union all
20 select '200test2' from dual
21 union all
22 select '210addr1' from dual
23 union all
24 select '215addr2' from dual
25 union all
26 select '220city' from dual
27 union all
28 select '225state' from dual
29 union all
30 select '999drr' from dual
31 union all
32 select '003' from dual
33 union all
34 select '300test1' from dual
35 union all
36 select '310addr1' from dual
37 union all
38 select '315addr2' from dual
39 union all
40 select '320city' from dual
41 union all
42 select '325state' from dual
43 union all
44 select '999drr' from dual
45 )
46 select ltrim(max(sys_connect_by_path(detail,'|')),'|')
47 from
48 (
49 select row_num,
50 trunc(abs(decode(row_num-rownum,0,1,row_num-rownum))/7)+1 row_num1,
51 detail
52 from
53 (
54 select decode(mod(rownum,7),0,7,mod(rownum,7)) row_num,detail
55 from t
56 )
57 )
58 start with row_num=1
59 connect by row_num-1=prior row_num
60 and row_num1= PRIOR row_num1
61 group by row_num1
62 /
LTRIM(MAX(SYS_CONNECT_BY_PATH(DETAIL,'|')),'|')
001|100test1|110addr1|115addr2|120city|125state|999drr
002|200test2|210addr1|215addr2|220city|225state|999drr
003|300test1|310addr1|315addr2|320city|325state|999drr
Regards,
Mohana