SQL Loader Information
Hello Dear Oracle Users,
There is a piece of information I would like to know: is there any way in SQL*Loader to know the name of the file I am loading? For example, if file X is being loaded, what I want is a field with the value X for all of its records.
Thank you for attending to my request.
Best Regards,
Amit
The SQL*Loader documentation is in the Utilities guide, along with external tables, data pump, import and export and other bits and pieces.
http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14215/part_ldr.htm#i436326
http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14215/toc.htm
There is also a PDF
http://www.oracle.com/pls/db102/to_pdf?pathname=server.102%2Fb14215.pdf&remark=portal+%28Books%29
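There is no built-in SQL*Loader variable that exposes the current input file name, but a common workaround is to have the calling script stamp the file name into a generated control file as a CONSTANT. A minimal sketch, where the file name, table, and columns are made up for illustration:

```sql
-- Hypothetical control file generated per input file by a wrapper script.
-- 'X.dat', target_tab and the column names are illustrative only.
LOAD DATA
INFILE 'X.dat'
APPEND
INTO TABLE target_tab
FIELDS TERMINATED BY ','
( col1,
  col2,
  source_file CONSTANT 'X.dat'  -- the wrapper substitutes the real file name here
)
```

A later thread in this collection reaches the same conclusion: generate a temporary .ctl per file from a script.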
Similar Messages
-
How can we tell if SQL*Loader is working on a table?
We have a process that requires comparing batches with LDAP information. Instead of using an LDAP lookup tool, we get a nightly directory file and import the two columns we want via SQL*Loader (REPLACE) into an IOT. Out of three cases, two just check the first column, and the third needs the second column as well.
We did not think of using external tables, because we cannot store files on the DB server itself.
The question is what to do while the file is being imported. The file is just under 300M, so it takes a minute or so to replace all the data. We found SQL*Loader waits until a transaction is finished before starting, but a query against the table only waits while it is actually importing the data. At the beginning of SQL*Loader's process, however, a query against the table returns no rows.
The solution we are trying right now is to have the process that starts SQL*Loader flip a flag in another table denoting that the data is unavailable. When it is done, it flips the flag back and notes the date. The process that queries the information then exits if the flag is currently 'N'.
The problem is: what if SQL*Loader starts in between the check of the flag and the query against the table? How do we guarantee that the data is not still being imported?
I can think of three solutions:
1) Lock the LDAP information table before checking the flag.
2) Lock the record that the process starting SQL*Loader flips.
3) Add a clause to the query against the table that checks that there are records in the table (AND EXISTS (SELECT * FROM ldap_information)).
The problem with 3) is that the process has already tagged the batches (via a column). It could, technically, reset them afterwards, but that seems a bit backwards.
Just out of curiosity, are you aware that Oracle supplies a DBMS_LDAP package for pulling information from LDAP sources? It would obviously be relatively easy to have a single transaction that deletes the existing data, loads the new data via DBMS_LDAP, and commits, which would get around the problem you're having with SQL*Loader truncating the table.
You could also have SQL*Loader load the data into a staging table and then have a second process either MERGE the changes from the staging table into the real table (again in a transactionally consistent manner) or just delete and insert the data.
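The staging-table route can be kept transactionally consistent in a single pass; a rough sketch, where ldap_staging and the key/value column names are assumed rather than taken from the original thread:

```sql
-- Sketch only: table and column names are assumptions for illustration.
MERGE INTO ldap_information t
USING ldap_staging s
   ON (t.ldap_key = s.ldap_key)
WHEN MATCHED THEN
  UPDATE SET t.ldap_value = s.ldap_value
WHEN NOT MATCHED THEN
  INSERT (ldap_key, ldap_value)
  VALUES (s.ldap_key, s.ldap_value);

-- Entries that disappeared from the directory still need a separate delete.
DELETE FROM ldap_information t
 WHERE NOT EXISTS (SELECT 1 FROM ldap_staging s WHERE s.ldap_key = t.ldap_key);

COMMIT;
```

Because readers see either the pre-commit or the post-commit state, the table never appears empty mid-load, which removes the need for the availability flag.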
Justin -
Decode Not working in sql loader
I had a requirement to load a flat file into a staging table using SQL*Loader. One of the columns in the flat file has the values FALSE or TRUE, and my requirement is to load 0 for FALSE and 1 for TRUE, which should be achievable with a simple DECODE function. I used DECODE and tried to load several times, but it did not work. What might be the problem?
LOAD DATA
INFILE 'sql_4ODS.txt'
BADFILE 'SQL_4ODS.badtxt'
APPEND
INTO TABLE members
FIELDS TERMINATED BY "|"
( Person_ID,
FNAME,
LNAME,
Contact,
status "decode(:status, 'TRUE', '1','FALSE','0')"
)
I did try putting a TRIM as well as a SUBSTR, but neither worked; the column just doesn't get any values in the output (just null, or free space).
Any help would be great.
Hello user8937215.
Please provide a create table statement and a sample of data file contents. I would expect DECODE or CASE to work based on the information provided.
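For what it's worth, a frequent cause of the symptom described (the column loading as null) is trailing blanks or case differences in the flat file, which make the DECODE match nothing. A hedged variant of the poster's own control file that guards against both:

```sql
-- Same table/file names as the original post; the TRIM/UPPER guard is a
-- guess at the cause, not a confirmed diagnosis.
LOAD DATA
INFILE 'sql_4ODS.txt'
BADFILE 'SQL_4ODS.badtxt'
APPEND
INTO TABLE members
FIELDS TERMINATED BY "|"
( Person_ID,
  FNAME,
  LNAME,
  Contact,
  status "decode(upper(trim(:status)), 'TRUE', '1', 'FALSE', '0')"
)
```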
Cheers,
Luke
Please mark the answer as helpful or answered if it is so. If not, provide additional details.
Always try to provide CREATE TABLE and INSERT statements to help the forum members help you better. -
How to handle Multiple date formats for the same date field in SQL*Loader
Dear All,
I got a requirement where I need to get data from a text file and insert it into an Oracle table.
I am using SQL*Loader to populate the data from the text file into my table.
The file has one field where I am expecting date data in multiple formats, like dd/mon/yyyy, yyyy/dd/mon, yyyy/mon/dd, mm/dd/yyyy, mon/dd/yyyy.
While using SQL*Loader, I can see loading is failing for records with formats like yyyy/dd/mon, yyyy/mon/dd, and mon/dd/yyyy.
Is there any way in SQL*Loader to specify all these date formats so that the date data goes smoothly into the underlying DATE column in the table?
Appreciate your response on this.
Thanks,
Madhu K.
The point being made was: are you sure that you can uniquely identify a date format from the value you receive? Are you sure that the data stored uses only a particular limited set of formats?
e.g. if you had a value of '07/08/03' how do you know what format that is?
It could be...
7th August 2003 (thus assuming it's DD/MM/RR format)
or
8th July 2003 (thus assuming it's MM/DD/RR format)
or
3rd August 2007 (thus assuming it's RR/MM/DD format)
or
8th March 2007 (thus assuming it's RR/DD/MM format)
or even more obscurely...
3rd July 2008 (MM/RR/DD)
or
7th March 2008 (DD/RR/MM)
Do you have any information to tell you what formats are valid that would allow you to be specific and know what date format is meant?
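If, and only if, each mask in the poster's list can be told apart by where the alphabetic month sits, one hedged option is a CASE expression in the control file that picks the mask per value. This is a sketch assuming exactly those five formats, with three-letter month names and four-digit years:

```sql
date_col "to_date(:date_col,
           case
             when regexp_like(:date_col, '^\d{2}/[[:alpha:]]{3}/\d{4}$') then 'DD/MON/YYYY'
             when regexp_like(:date_col, '^\d{4}/\d{2}/[[:alpha:]]{3}$') then 'YYYY/DD/MON'
             when regexp_like(:date_col, '^\d{4}/[[:alpha:]]{3}/\d{2}$') then 'YYYY/MON/DD'
             when regexp_like(:date_col, '^[[:alpha:]]{3}/\d{2}/\d{4}$') then 'MON/DD/YYYY'
             else 'MM/DD/YYYY'
           end)"
```

All-numeric values remain inherently ambiguous, which is exactly the point being made here.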
This is a classic example of why dates should be stored on the database using DATE datatype and not VARCHAR2. It can lead to corruption of data, especially if the date can be entered in any format a user wishes. -
I am using the SQL*Loader command line (sqlldr) to load 141306 records into my table. It ended with an error message --
SQL*Loader-605: Non-data dependent ORACLE error occurred -- load discontinued.
Checking for data load failures
Checking Log data for clinic_biotox table data load
90970 Rows successfully loaded.
78 Rows not loaded due to data errors.
Moreover, 90970 + 78 = 91048, which is much less than 141306.
Any ideas as to what might have happened ?
Thanks
Have you checked the sqlldr log file (the name of which you should have supplied at the command line)? It sometimes provides additional information as to what occurred.
I would also say that the reason your record counts do not tally as they should is that the error occurred before sqlldr was able to complete the load.
The last thing I can offer is for you to check what your allowable error threshold is within your log file. It could be that you exceeded the allowable number of bad records and as such caused the load to fail.
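For reference, that threshold is the ERRORS command-line parameter, which defaults to 50 rejected rows before the load is aborted. A sketch invocation, with credentials and file names assumed for illustration:

```
sqlldr userid=scott/tiger control=clinic_biotox.ctl log=clinic_biotox.log errors=1000
```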
Hope this helps... -
CSV FILES DON'T LOAD WITH THE RIGHT DATA USING SQL LOADER
Hi pals, I have the following information in a CSV file:
MEXICO,Seretide_Q110,2010_SEE_01,Sales Line,OBJECTIVE,MEXICO,Q110,11/01/2010,02/04/2010,Activo,,,MEXICO
MEXICO,Seretide_Q210,2010_SEE_02,Sales Line,OBJECTIVE,MEXICO,Q210,05/04/2010,25/06/2010,Activo,,,MEXICO
When I use SQL*Loader the data is loaded as follows:
EXICO,Seretide_Q110,2010_SEE_01,Sales Line,OBJECTIVE,MEXICO,Q110,11/01/2010,02/04/2010,Activo,,,MEXICO
The same happens with the next data in a CSV file:
MX_001,MEXICO,ASMA,20105912,Not Verified,General,,RH469364,RH469364,Change Request,,,,,,,Y,MEXICO,RH469364
MX_002,MEXICO,ASMA,30094612,Verified,General,,LCS1405,LCS1405,Change Request,,,,,,,Y,MEXICO,LCS1405
the data is loaded as follows:
X_001,MEXICO,ASMA,20105912,Not Verified,General,,RH469364,RH469364,Change Request,,,,,,,Y,MEXICO,RH469364
X_002,MEXICO,ASMA,30094612,Verified,General,,LCS1405,LCS1405,Change Request,,,,,,,Y,MEXICO,LCS1405
I mean the first character is truncated, and this happens with all my data. Any suggestions? I really hope you can help me.
Your table and view don't make sense, so I created a "dummy" table to match your .ctl file.
SQL> create table CCI_SRC_MX
2 (ORG_BU varchar2(30)
3 ,name varchar2(30)
4 ,src_num varchar2(30)
5 ,src_cd varchar2(30)
6 ,sub_type varchar2(30)
7 ,period_bu varchar2(30)
8 ,period_name varchar2(30)
9 ,prog_start_dt date
10 ,prog_end_dt date
11 ,status_cd varchar2(30)
12 ,X_ACTUALS_CALC_DATE date
13 ,X_ACTUAL_UPDATE_SRC varchar2(30)
14 ,prod_bu varchar2(30)
15 ,ROW_ID NUMBER(15,0)
16 ,IF_ROW_STAT VARCHAR2(90)
17 ,JOB_ID NUMBER(15,0)
18 );
Table created.
SQL> create sequence GSK_GENERAL_SEQ;
Sequence created.
I simplified your .ctl file and moved all the constant and sequence stuff to the end. I also changed the format masks to match the dates in your data.
LOAD DATA
INFILE 'SBSLSLT.txt'
BADFILE 'SBSLSLT.bad'
DISCARDFILE 'SBSLSLT.dis'
APPEND
INTO TABLE CCI_SRC_MX
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(ORG_BU
,NAME
,SRC_NUM
,SRC_CD
,SUB_TYPE
,PERIOD_BU
,PERIOD_NAME
,PROG_START_DT DATE 'dd/mm/yyyy'
,PROG_END_DT DATE 'dd/mm/yyyy'
,STATUS_CD
,X_ACTUALS_CALC_DATE DATE 'dd/mm/yyyy'
,X_ACTUAL_UPDATE_SRC
,PROD_BU
,row_id "GSK_GENERAL_SEQ.nextval"
,if_row_stat CONSTANT 'UPLOADED'
,job_id constant 36889106
)
When I run SQL*Loader, I get this:
SQL> select * from CCI_SRC_MX;
ORG_BU NAME SRC_NUM SRC_CD SUB_TYPE PERIOD_BU PERIOD_NAME PROG_START_DT PROG_END_DT STATUS_CD PROD_BU ROW_ID IF_ROW_STAT JOB_ID
MEXICO Seretide_Q110 2010_SEE_01 Sales Line OBJECTIVE MEXICO Q110 11-JAN-2010 00:00:00 02-APR-2010 00:00:00 Activo MEXICO 1 UPLOADED 36889106
MEXICO Seretide_Q210 2010_SEE_02 Sales Line OBJECTIVE MEXICO Q210 05-APR-2010 00:00:00 25-JUN-2010 00:00:00 Activo MEXICO 2 UPLOADED 36889106
-
SQL LOADER / INFILE filename as variable in .ctl file
I stumbled over several threads in the OTN forums regarding this problem, but none of them was finally solved, nor did I find an FAQ that answered my question. So:
We get several data files from several sources, process them via SQL*Loader, and store them in the DB.
This is done via a cron job and a Perl script, for all data files in a specific directory.
We need the information of which file on which date generated the data inside the DB as well.
So I want to store the filename || SYSDATE combination as well.
I know I could parse the .ctl file and replace a key string with the actual filename, and so have it in the input.
But this seems a bit dirty to me. Isn't there some way, i.e. a keyword or variable for the INFILE filename within SQL*Loader, that I can access in the .ctl file? Something like:
INTO TABLE processed_files
FIELDS TERMINATED BY ';'
WHEN LOWER(LTRIM(RTRIM(hdr_ftr))) = 'ftr' -- FOOTER??
(hdr_ftr VARCHAR2(100),
source INFILE||' on '||TO_CHAR(SYSDATE, 'MM/DD/YYYY'),
realm VARCHAR2(100),
version VARCHAR2(20)
I would be grateful if you'd share your wisdom with me. ;-))
Oliver
I passed this question on to 'Ask Tom' and got the advice to put the .ctl's content as a string variable into a shell script.
This shell script (which had to be written anyway to loop over the data files and subsequently call sqlldr) should then replace the INFILE parameter and the CONSTANTs for the filenames and generate a 'temporary' .ctl before calling sqlldr.
That's it; no better or safer way! -
Substitution variable in sql load rules file
Okay gurus,
I need a little guidance: I have to replace the values 201020 and 2008 with substitution variables. I have created the variables and set them up globally on the Essbase server.
201020 = FW00
2008 = FY00
WHERE ACT.FISCAL_WEEK_ID <= 201020
AND ACT.FISCAL_YEAR_ID > 2008
AND RTDIV.DIV IN (1,2,3,4,5,6,7,8,9,99) (This is the query with hard coded values of week and year)
When I try to put the substitution variables there, it throws an error. Please find below the way I was trying to do it.
WHERE ACT.FISCAL_WEEK_ID = '&FW00'
AND ACT.FISCAL_YEAR_ID = '&FY00'
But unfortunately, it throws the error: Error: 1021001 Failed to Establish Connection With SQL Database Server. See log for more information
I know that this is a generic error, because if I put the hard-coded values in the SQL load rules it works fine.
Is this the right way to put substitution variables in SQL load rules?
Please advise, and thanks in advance.
Hi Genn,
I tried to check the app log for SQL, but I'm afraid there is nothing in there; the only error message I am getting in the app log is this:
I tried to see the app log for sql but i m afraid that there is nothing in there, the only error message which I am getting in app log is this:
Failed to Establish Connection With SQL Database Server. See log for more information
It's an ASO cube. Initially I was using the variable as FY00 AS "2008"; it did not work, and then I tried without quotes in the variable, but it is still not working.
Any ideas? Thanks in advance. -
I use SQL*Loader to read a file into a table. In my tests a lot of rows are rejected and written to the bad file, but there are no error messages.
Is there a way to find out why a row was rejected?
The Bad File
The bad file contains records that were rejected, either by SQL*Loader or by the Oracle database server. Some of the possible reasons for rejection are discussed in the next sections.
SQL*Loader Rejects
Datafile records are rejected by SQL*Loader when the input format is invalid. For example, if the second enclosure delimiter is missing, or if a delimited field exceeds its maximum length, SQL*Loader rejects the record. Rejected records are placed in the bad file.
Oracle Rejects
After a datafile record is accepted for processing by SQL*Loader, it is sent to the Oracle database server for insertion into a table as a row. If the Oracle database server determines that the row is valid, then the row is inserted into the table. If the row is determined to be invalid, then the record is rejected and SQL*Loader puts it in the bad file. The row may be invalid, for example, because a key is not unique, because a required field is null, or because the field contains invalid data for the Oracle datatype.
You will get more information by running:
sqlldr help=y
Also, please post your control file syntax.
kuljeet pal singh -
SQL Loader and Insert Into Performance Difference
Hello All,
I'm in a situation where I need to measure the performance difference between SQL*Loader and INSERT INTO. Say there are 10000 records in a flat file and I want to load them into a staging table.
I know that if I use PL/SQL UTL_FILE to do this job, performance will degrade (don't ask me why I'm going for UTL_FILE instead of SQL*Loader), but I don't know by how much. Can anybody tell me the performance difference in % (like a 20% decrease) in the case of 10000 records?
Thanks,
Kannan.
Kannan B wrote:
Do not confuse the topic; as I said, I'm not going to use external tables. This post is to discuss the performance difference between SQL*Loader and a simple INSERT statement.
I don't think people are confusing the topic.
External tables are a superior means of reading a file as it doesn't require any command line calls or external control files to be set up. All that is needed is a single external table definition created in a similar way to creating any other table (just with the additional external table information obviously). It also eliminates the need to have a 'staging' table on the database to load the data into as the data can just be queried as needed directly from the file, and if the file changes, so does the data seen through the external table automatically without the need to re-run any SQL*Loader process again.
Who told you not to use External Tables? Do they know what they are talking about? Can they give a valid reason why external tables are not to be used?
IMO, if you're considering SQL*Loader, you should be considering External tables as a better alternative. -
Hi,
I want to load hex strings with SQL*Loader into BLOB/RAW fields.
The problem is that SQL functions (e.g. HEXTORAW) are not allowed with LOB columns, and for the RAW one I get:
"illegal use of TERMINATED BY for RAW"
What should I do?
If each BLOB is in a separate file, then you should be able to use LOBFILEs to load them. Each row in the data file would need to have the name of the file containing the BLOB for that row. File ulcase9.ctl in the demo directory shows an example of using LOBFILEs to load a LOB column.
If you want the BLOB in the data file itself, then the data file needs to use types that include length information, such as VARRAW, LONG VARRAW, or VARRAWC. Also, the records of the data file cannot be terminated by a character string or newline, because the BLOB data might contain that character string or newline in the middle of its data. Instead, you would need to use the VAR record type. -
hi,
I want to insert 100,000 records daily into a table for the first month, and then the next month these records are going to be replaced by new, updated records.
There might be a few additions to and deletions from the previous records as well.
Actually it is consumer data, so there might be a few consumers who have withdrawn from the utility, and there will be some more consumers added to the database.
But almost 99% of the previous month's data has to be updated/replaced with the fresh month's data.
For instance, what I have in mind is that I will use SQL*Loader to load the data for the first month, then delete the previous data using SQL*Plus and load the fresh month's data using SQL*Loader again.
1. Is this OK, or is there some better solution?
2. I have heard of external files; are they feasible in my scenario?
3. I have planned to make scripts for SQL*Plus and SQL*Loader and use them in batch files (OS: Windows 2003 Server, database: Oracle 9i). Is there some better choice to make the whole procedure automatic?
Looking forward to your suggestions,
nadeem ameer
I would suggest you use external tables, since they are more flexible than SQL*Loader and a better option.
For using external tables:
1) You will have to create a directory first.
2) Generally, creation of the directory is done by SYS; hence, after creating the directory, READ and WRITE privileges have to be granted to the user.
3) Create the external table.
4) Now use the external table like a normal table to query the data, and do your inserts, updates, and deletes on your regular table from it (with the ORACLE_LOADER driver the external table itself is read-only).
You can get more information from
http://www.oracle-base.com/articles/9i/SQLNewFeatures9i.php#ExternalTables
CREATE DIRECTORY <directory_name> AS '<directory path where the file is present>';
GRANT READ, WRITE ON DIRECTORY <directory_name> TO <username>;
CREATE TABLE <table_name>
(<column names>)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY <directory_name>
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('<filename>')
)
PARALLEL 5
REJECT LIMIT 200;
Hope this helps. -
Hi,
Does anybody know if I can generate a unique primary key using an Oracle sequence for a database table into which I am inserting records with SQL*Loader?
I checked the SQL*Loader manual, and there is no information on how to make use of an Oracle sequence in the control file.
Thanks
Surajit
Yes, you can do it. Create the sequence (suppose you call it "PK_SEQ_X") and then in your control file reference it as "PK_SEQ_X.NEXTVAL". For example, suppose you wanted to put it into a column named Y; the entry in your control file would look like 'load data insert into table Z (Y "PK_SEQ_X.NEXTVAL", ....)'.
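Spelled out as a full control file, that inline snippet would look something like this; the data file name and second column are placeholders:

```sql
-- Sketch; 'data.dat' and other_col are illustrative, table Z and the
-- sequence PK_SEQ_X come from the answer above.
LOAD DATA
INFILE 'data.dat'
INSERT INTO TABLE Z
FIELDS TERMINATED BY ','
( Y "PK_SEQ_X.NEXTVAL",
  other_col
)
```

Note that SQL*Loader also has a built-in SEQUENCE keyword, but that is a simple per-load counter, not an Oracle sequence object.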
Note that the double quotes around the sequence name are required. -
SQL*Loader: How to load multi-line report data?
Hi,
is it possible to use SQL*Loader to load data from a hierarchically structured, fixed-column ASCII file like this
001 scott
New York 01.01.2002 1234
Chicago 15.10.2001 9876
002 smith
Los Angeles 24.12.1999 5678
Washington 01.12.1999 0000
Chicago 01.01.2000 1111
into one database table:
id name city day code
001 scott New York 01.01.2002 1234
001 scott Chicago 15.10.2001 9876
002 smith Los Angeles 24.12.1999 5678
002 smith Washington 01.12.1999 0000
002 smith Chicago 01.01.2000 1111
The number of lines per name is unlimited; the next block starts after a separating ---- line.
We cannot change the format of the text file to import.
There is an example in the documentation that shows how to load a structure like the following via insert triggers:
001 scott New York 01.01.2002 1234
Chicago 15.10.2001 9876
002 smith Los Angeles 24.12.1999 5678
Washington 01.12.1999 0000
Chicago 01.01.2000 1111
But we have the name information on a separate header line, so I don't know if we can use a similar technique here.
regards
Sven
Try enclosing your strings with e.g. a double quote. To do so, the control file needs the following (example):
LOAD DATA
REPLACE
INTO TABLE test
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( id1,
id2,
id3,
id4
)
Of course, this does mean that your data must change...
L. -
Sql loader unable to read from pipe
Hi All:
I'm using a named pipe along with Oracle SQL*Loader to load some 20 million rows into the database.
The source of the pipe is a Java application which writes to it using a simple FileOutputStream.
It can be observed that SQL*Loader has to wait a lot for the Java application to produce enough data for loading.
The waiting is fine. However, SQL*Loader always exits after loading about 1 million rows, with output like:
SQL*Loader-501: Unable to read file (upipe.dat)
SQL*Loader-560: error reading file
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
And in this case, the Java application throws an IOException with the information:
Exception in thread "main" java.io.IOException: Broken pipe
at java.io.FileOutputStream.writeBytes(Native Method)
at java.io.FileOutputStream.write(FileOutputStream.java:284)
It runs in a Linux environment with an 11g database.
Any idea why this happens?
Check:
SQLLDR NOT LOADING ALL DATA IN DAT FILE : SQL*Loader-510/SQL*Loader-2026 [ID 741100.1]