SQL*Loader scheduling and file polling on Windows
Hi there,
I am looking for a tool that can poll a particular folder on a Windows box for file-upload events and then kick off the sqlldr process, passing it the file.
I don't want to write a custom utility; I am looking for something out of the box.
Let me know your thoughts.
Thanks,
Oops, I think I did not post correctly.
I have the sqlldr part all coded and working against the CSV file, and I am able to load into the tables; that part is fine.
Also, on Windows I can schedule a batch job and get this running. What I am looking for is a scheduler that also includes a file-listener utility. This will run 24x7 on the Windows box, and whenever a file is uploaded it will kick off the sqlldr job, passing the appropriate file name to the sqlldr task.
I believe this has to be at the OS level, if I am not mistaken. If there is a way to code this in Oracle, I am all for it. Let me know.
Thanks In Advance.
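If the database is 11g Release 2 or later, the Scheduler's file watcher may cover exactly this without extra OS tooling. A hedged sketch; every object name, path, and credential below is made up for illustration, so check the DBMS_SCHEDULER documentation for your release:

```sql
-- Hypothetical sketch (11.2+): a file watcher raises an event when a
-- matching file arrives; a job referencing it can then run sqlldr.
-- All names, paths, and credentials here are illustrative.
BEGIN
  DBMS_SCHEDULER.CREATE_CREDENTIAL(
    credential_name => 'watch_cred',
    username        => 'os_user',
    password        => 'os_password');

  DBMS_SCHEDULER.CREATE_FILE_WATCHER(
    file_watcher_name => 'csv_watcher',
    directory_path    => 'c:\loadfiles',
    file_name         => '*.csv',
    credential_name   => 'watch_cred',
    enabled           => TRUE);
END;
/
```

A job created with queue_spec pointing at the watcher then receives the arriving file's name through its event message.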
Edited by: ssp on Feb 10, 2011 2:26 PM
Similar Messages
-
SQL*Loader Sequential Data File Record Processing?
If I use the conventional path will SQL*Loader process a data file sequentially from top to bottom? I have a file comprised of header and detail records with no value found in the detail records that can be used to relate to the header records. The only option is to derive a header value via a sequence (nextval) and then populate the detail records with the same value pulled from the same sequence (currval). But for this to work SQL*Loader must process the file in the exact same sequence that the data has been written to the data file. I've read through the 11g Oracle® Database Utilities SQL*Loader sections looking for proof that this is what will happen but haven't found this information and I don't want to assume that SQL*Loader will always process the data file records sequentially.
Thank you
Oracle Support responded with the following statement.
"Yes, SQL*LOADER process data file from top to bottom.
This was touched in the note below:
SQL*Loader - How to Load a Single Logical Record from Physical Records which Include Linefeeds (Doc ID 160093.1)"
Jason -
Dear all,
Please provide script for sql loader schedule process
Regards
A.Gopi
Depending on your version, you can use DBMS_JOB or DBMS_SCHEDULER to set up a task that runs an external executable periodically.
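For the periodic (rather than event-driven) variant, a minimal DBMS_SCHEDULER sketch; the job name and batch-file path are illustrative, not from the post:

```sql
-- Hedged sketch: run an external executable (e.g. a batch file that
-- invokes sqlldr) every hour. Path and names are illustrative.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'RUN_SQLLDR_JOB',
    job_type        => 'EXECUTABLE',
    job_action      => 'c:\scripts\load_data.bat',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=HOURLY;INTERVAL=1',
    enabled         => TRUE);
END;
/
```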
You can find the information for your version at http://tahiti.oracle.com -
SQL Loader - CSV Data file with carriage returns and line feeds
Hi,
I have a CSV data file with occasional carriage returns and line feeds in the middle of fields, which throws my SQL*Loader script off. SQL*Loader takes the characters following the carriage return as a new record and gives me an error. Is there a way to handle carriage returns and line feeds in SQL*Loader?
Please help. Thank you for your time.
This is my SQL*Loader script.
load data
infile 'D:\Documents and Settings\user1\My Documents\infile.csv' "str '\r\n'"
append
into table MYSCHEMA.TABLE1
fields terminated by ','
OPTIONALLY ENCLOSED BY '"'
trailing nullcols
( NAME CHAR(4000),
field2 FILLER,
field3 FILLER,
TEST DEPT CHAR(4000)
)
You can "regexp_replace" the columns for special characters.
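A hedged sketch of that suggestion applied to a field list like the one posted: wrap a column in a SQL expression that replaces embedded CR/LF characters (the exact pattern and which columns need it are assumptions):

```sql
-- Hedged sketch: strip embedded CR/LF inside a field via a SQL
-- expression in the control file; pattern is an assumption.
( NAME CHAR(4000) "REGEXP_REPLACE(:NAME, '[' || CHR(13) || CHR(10) || ']', ' ')",
  field2 FILLER,
  field3 FILLER
)
```

Note this only cleans characters inside an already-delimited record; the "str '\r\n'" record terminator in the posted script remains the primary fix for splitting records correctly.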
-
SQL*Loader-704 and ORA-12154 error messages when trying to load data with SQL*Loader
I have a database with two tables that is used by Apex 4.2. One table has 800,000 records; the other has 7 million records.
The client recently upgraded from Apex 3.2 to Apex 4.2. We exported/imported the data to the new location with no problems.
The source of the data is an old mainframe system; I needed to make changes to the source data and then load the tables.
The first time I loaded the data, I did it from a command line with SQL*Loader.
Now when I try to load the data I get this message:
SQL*Loader-704: Internal error: ulconnect OCISERVERATTACH
ORA-12154: TNS:could not resolve the connect identifier specified
I've searched for postings on these error messages and they all seem to say that SQL*Loader can't find my TNSNAMES file.
I am able to connect and load data with SQL Developer, so SQL Developer is able to find the TNSNAMES file.
However, SQL Developer will not let me load a file this big.
I have also tried to load the file within Apex (SQL Workshop / Utilities) but again, the file is too big.
So it seems like SQL*Loader is the only option.
I did find one post online that said to set an environment variable with the path to the TNSNAMES file, but that didn't work.
Not sure what else to try or where to look.
thanks
Hi,
You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure is mentioned in Ed's link that you were already pointed at) is the following (I assume you are on Windows):
open a command prompt
set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
This tells Oracle to use the config files in that directory and no others.
then try sqlldr user/pass@db (in the same dos window)
see if that connects and let us know.
Cheers,
Harry
http://dbaharrison.blogspot.com -
SQL*Loader with multiple files
Gurus,
I searched the documentation and this forum and haven't found a solution to my issue yet...
I am no expert with SQL*Loader. I have used SQL*Loader to copy from one file to a table many times. But I have not copied multiple files into one table, especially with different file names.
More specifically....
I need to load data from multiple files into a table. But the file names will be different each time. A file will be created every hour. The file name will consist of the root file name appended by a time stamp. For example, a file created on 10/07/2010 at 2:15 P.M. would be filea100720101415.txt while a file created on 10/08/2010 at 8:15 A.M. would be filea100820100815.txt. All the files will be in one directory. How can I load the data from the files using SQL*Loader?
My database: Oracle 10g Release 2
Operating System: Windows 2003 Server
Please assist.
Robert
Too bad this isn't in *nix, where you get a powerful shell scripting capability.
That said, here is the core of the solution .... you will also need a way to identify files that have been processed vs. new ones. Maybe rename them, maybe move them. But with this sample you can see the basics. From there it is really an issue of DOS scripting, which would better be found by googling around a bit.
cd c:\loadfiles
FOR %%f IN (*.txt) DO SQLLDR CONTROL=sample.ctl LOG=sample.log BAD=baz.bad DATA=%%f
(Use %%f in a batch file and %f at an interactive prompt; DOS FOR variables must be a single letter.)
Try googling "dos scripting language". You'll find lots of tutorials and ideas on "advanced" (well, as advanced as DOS gets) techniques to solve your problem.
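The same loop plus the "which files are already processed" bookkeeping the reply mentions can be sketched in shell. Paths and file names below are illustrative; LOADER defaults to echo so the sketch runs without an Oracle client installed:

```shell
#!/bin/sh
# Sketch of the DOS loop's logic with archive-after-load bookkeeping.
# LOADER defaults to echo for a dry run; point it at sqlldr for real use.
LOADER="${LOADER:-echo sqlldr}"
DIR=/tmp/loadfiles
mkdir -p "$DIR/processed"
: > "$DIR/filea100720101415.txt"   # fake arrivals, just for the demo
: > "$DIR/filea100820100815.txt"
cd "$DIR" || exit 1
for f in *.txt; do
  [ -e "$f" ] || continue          # glob matched nothing
  $LOADER control=sample.ctl data="$f" log="${f%.txt}.log"
  mv "$f" processed/               # archive so a file is never loaded twice
done
```

Moving each file into processed/ after loading is one simple way to guarantee the next run only sees new arrivals.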
Edited by: EdStevens on Dec 1, 2010 5:03 PM -
Calling SQL Loader with Dynamic file names
HI all
I would like to know if I can call SQL*Loader as below:
$ sqlldr userid=uname/pwd control=new.ctl, data=$1
I have to schedule my loader 3 times a day and each time my file names are different(AAA_BBB_timestamp)
Thanks
I have found a solution myself; in case anyone is interested, it's like this:
for file in `ls -1 /opt/user/from/`
do
sqlldr userid=user/pwd@connect_string control=control_file.ctl data="/opt/user/from/$file"
done
Ramu -
SQL Loader Inserting Log File Statistics to a table
Hello.
I'm contemplating how to gather the statistics from the SQL*Loader log file and insert them into a table. I've approached this from a Korn shell script perspective previously, but now that I'm working in a Windows environment and my peers aren't keen on batch files and scripting, I thought I'd attempt to use SQL*Loader itself to read the log file and insert one or more records into a table that tracks data uploads. Has anyone created a control file that accomplishes this?
My current environment:
Windows 2003 Server
SQL*Loader: Release 10.2.0.1.0
Thanks,
Luke
Hello.
Learned a little about inserting into multiple tables with delimited records. Here is my current tested control file:
LOAD DATA
APPEND
INTO TABLE upload_log
WHEN (1:12) = 'SQL*Loader: '
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
( upload_log_id RECNUM
, filler_field_0 FILLER
, filler_field_1 FILLER
, filler_field_2 FILLER
, filler_field_3 FILLER
, filler_field_4 FILLER
, filler_field_5 FILLER
, day_of_week
, month
, day_of_month
, time_of_day
, year
, log_started_on "TO_DATE((:month ||' '|| :day_of_month ||' '|| :time_of_day ||' '|| :year), 'Mon DD HH24:MI:SS YYYY')"
)
INTO TABLE upload_log
WHEN (1:11) = 'Data File: '
FIELDS TERMINATED BY ':'
( upload_log_id RECNUM
, filler_field_0 FILLER POSITION(1)
, input_file_name "TRIM(:input_file_name)"
)
INTO TABLE upload_log
WHEN (1:6) = 'Table '
FIELDS TERMINATED BY WHITESPACE
( upload_log_id RECNUM
, filler_field_0 FILLER POSITION(1)
, table_name "RTRIM(:table_name, ',')"
)
INTO TABLE upload_rejects
WHEN (1:7) = 'Record '
FIELDS TERMINATED BY ':'
( upload_rejects_id RECNUM
, record_number POSITION(1) "TO_NUMBER(SUBSTR(:record_number,8,20))"
, reason
)
INTO TABLE upload_rejects
WHEN (1:4) = 'ORA-'
FIELDS TERMINATED BY ':'
( upload_rejects_id RECNUM
, error_code POSITION(1)
, error_desc
)
INTO TABLE upload_log
WHEN (1:22) = 'Total logical records '
FIELDS TERMINATED BY WHITESPACE
( upload_log_id RECNUM
, filler_field_0 FILLER POSITION(1)
, filler_field_1 FILLER
, filler_field_2 FILLER
, action "RTRIM(:action, ':')"
, number_of_records
)
INTO TABLE upload_log
WHEN (1:13) = 'Run began on '
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
( upload_log_id RECNUM
, filler_field_0 FILLER POSITION(1)
, filler_field_1 FILLER
, filler_field_2 FILLER
, day_of_week
, month
, day_of_month
, time_of_day
, year
, run_began_on "TO_DATE((:month ||' '|| :day_of_month ||' '|| :time_of_day ||' '|| :year), 'Mon DD HH24:MI:SS YYYY')"
)
INTO TABLE upload_log
WHEN (1:13) = 'Run ended on '
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
( upload_log_id RECNUM
, filler_field_0 FILLER POSITION(1)
, filler_field_1 FILLER
, filler_field_2 FILLER
, day_of_week
, month
, day_of_month
, time_of_day
, year
, run_ended_on "TO_DATE((:month ||' '|| :day_of_month ||' '|| :time_of_day ||' '|| :year), 'Mon DD HH24:MI:SS YYYY')"
)
INTO TABLE upload_log
WHEN (1:18) = 'Elapsed time was: '
FIELDS TERMINATED BY ':'
( upload_log_id RECNUM
, filler_field_0 FILLER POSITION(1)
, filler_field_1 FILLER
, filler_field_2 FILLER
, elapsed_time
)
INTO TABLE upload_log
WHEN (1:14) = 'CPU time was: '
FIELDS TERMINATED BY ':'
( upload_log_id RECNUM
, filler_field_0 FILLER POSITION(1)
, filler_field_1 FILLER
, filler_field_2 FILLER
, cpu_time
)
Here are the basic table create scripts:
TRUNCATE TABLE upload_log;
DROP TABLE upload_log;
CREATE TABLE upload_log
( upload_log_id INTEGER
, day_of_week VARCHAR2( 3)
, month VARCHAR2( 3)
, day_of_month INTEGER
, time_of_day VARCHAR2( 8)
, year INTEGER
, log_started_on DATE
, input_file_name VARCHAR2(255)
, table_name VARCHAR2( 30)
, action VARCHAR2( 10)
, number_of_records INTEGER
, run_began_on DATE
, run_ended_on DATE
, elapsed_time VARCHAR2( 8)
, cpu_time VARCHAR2( 8)
);
TRUNCATE TABLE upload_rejects;
DROP TABLE upload_rejects;
CREATE TABLE upload_rejects
( upload_rejects_id INTEGER
, record_number INTEGER
, reason VARCHAR2(255)
, error_code VARCHAR2( 9)
, error_desc VARCHAR2(255)
);
Now, if I could only insert a single record into the upload_log table (per table logged), adding separate columns for the skipped, read, rejected, and discarded quantities. Any advice on how to use SQL*Loader to do this (writing a procedure would be fairly simple, but I'd like to perform all of the work in one place if at all possible)?
Thanks,
Luke
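For comparison, the per-run totals can also be harvested from the log with plain text tools, outside SQL*Loader. A hedged sketch; the log excerpt below is an assumed, trimmed 10g-style sample, not output from a real run:

```shell
#!/bin/sh
# Sketch: pull the "Total logical records" counters from a SQL*Loader log.
# The heredoc fakes a log so the sketch runs standalone.
cat > /tmp/sample_sqlldr.log <<'EOF'
Table UPLOAD_TEST:
  5 Rows successfully loaded.
Total logical records skipped:          0
Total logical records read:             5
Total logical records rejected:         0
Total logical records discarded:        0
EOF
awk -F: '/^Total logical records/ { gsub(/ /, "", $2); print $1 "=" $2 }' \
  /tmp/sample_sqlldr.log
```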
Edited by: Luke Mackey on Nov 12, 2009 4:28 PM -
SQL Loader : Trim and Decode functions help please
Hi,
I have to load data from a flat file; for some columns I need to use the TRIM and DECODE functions. It is a pipe-delimited file.
I get syntax errors (one is below); the same error is listed for TRIM.
SQL*Loader-350: Syntax error at line xx.
Expecting "," or ")", found "DECODE".
===========
,FINAL_BILL_DATE CHAR(30) "TRIM(:FINAL_BILL_DATE)"
,BUSINESS_ID "DECODE(:BUSINESS_ID,'B',1,'C',2,'E',3,'G',4,'O',5,'R',6,'T',7,'U',8,'H',9,-1)"
Can anyone please help.
Thanks
Cherrish
Hello Cherrish.
The error you are receiving leads me to believe that at some point prior to the DECODE on the BUSINESS_ID line, probably on some line even before the FINAL_BILL_DATE line, there is a syntax error causing the quotes before the DECODE to actually terminate some other clause. Without all of the lines that could contribute to this, including the header details, this is the best I can advise.
Hope this helps,
Luke
Please mark the answer as helpful or answered if it is so. If not, provide additional details.
Always try to provide create table and insert table statements to help the forum members help you better. -
SQL Loader Truncate and SQL TRUNCATE difference
Could anyone let me know the difference between the TRUNCATE used in a SQL*Loader control file and the TRUNCATE command used in SQL? Is there any difference in their impact on the data files?
Thanks
Mr Jens, I think TRUNCATE in a SQL*Loader control file reuses extents, unlike the SQL TRUNCATE command. In my opinion it is best to truncate these to show the normal usage of these tables, not the elevated values.
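To illustrate the distinction being discussed (a sketch of the two SQL forms, not a statement of what SQL*Loader runs internally; the table name is illustrative):

```sql
-- Plain TRUNCATE deallocates the table's extents by default
-- (the DROP STORAGE default)...
TRUNCATE TABLE my_table;
-- ...while REUSE STORAGE keeps the allocated extents, which is the
-- behavior the reply attributes to SQL*Loader's control-file TRUNCATE.
TRUNCATE TABLE my_table REUSE STORAGE;
```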
Could you please further comment? -
Sql loader maximum data file size..?
Hi, I wrote a SQL*Loader script, run via a shell script, which imports data into a table from a CSV file. The CSV file size is around 700 MB. I am using Oracle 10g in a Sun Solaris 5 environment.
My question is: is there a maximum data file size? The following code is from my shell script.
SQLLDR=
DB_USER=
DB_PASS=
DB_SID=
controlFile=
dataFile=
logFileName=
badFile=
${SQLLDR} userid=$DB_USER"/"$DB_PASS"@"$DB_SID \
control=$controlFile \
data=$dataFile \
log=$logFileName \
bad=$badFile \
direct=true \
silent=all \
errors=5000
Here is my control file code:
LOAD DATA
APPEND
INTO TABLE KEY_HISTORY_TBL
WHEN OLD_KEY <> ''
AND NEW_KEY <> ''
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
( OLD_KEY "LTRIM(RTRIM(:OLD_KEY))",
NEW_KEY "LTRIM(RTRIM(:NEW_KEY))",
SYS_DATE "SYSTIMESTAMP",
STATUS CONSTANT 'C'
)
Thanks,
-Soma
Edited by: user4587490 on Jun 15, 2011 10:17 AM
Edited by: user4587490 on Jun 15, 2011 11:16 AM
Hello Soma.
How many records exist in your 700 MB CSV file? How many do you expect to process in 10 minutes? You may want to consider performing a set of simple unit tests with 1) 1 record, 2) 1,000 records, 3) 100 MB filesize, etc. to #1 validate that your shell script and control file syntax function as expected (including the writing of log files, etc.), and #2 gauge how long the processing will take for the full file.
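The graduated tests suggested above can be carved off the big file with standard tools. A hedged sketch; big.csv is generated here (with made-up two-column rows) so the sketch runs standalone:

```shell
#!/bin/sh
# Sketch: build 1-row and 1,000-row slices of a large CSV to smoke-test
# the control file and gauge throughput before the full 700 MB run.
seq 1 5000 | awk '{ printf "OLD%05d,NEW%05d\n", $1, $1 }' > /tmp/big.csv
head -n 1    /tmp/big.csv > /tmp/slice_1.csv     # syntax smoke test
head -n 1000 /tmp/big.csv > /tmp/slice_1000.csv  # throughput gauge
wc -l /tmp/slice_1.csv /tmp/slice_1000.csv
```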
Hope this helps,
Luke
Please mark the answer as helpful or answered if it is so. If not, provide additional details.
Always try to provide actual or sample statements and the full text of errors along with error code to help the forum members help you better. -
SQL LOADER LPAD CONTROL FILE QUESTION
Hi, in the flat file
company_cd is "1" or "01"
and center_cd is 3 digits, e.g. "493".
In the table, the USERID column should be 6 digits.
Currently I am getting this as USERID in the table:
"010493" is right, because company_cd is "01";
"10493" is not right, because company_cd is "1".
If company_cd is 2 digits (01) I get a 6-digit USERID, which is OK,
but when company_cd is a single digit (1) I get a 5-digit USERID.
I need to LPAD with 0 at the front when company_cd is "1". Any suggestions?
***********This is the code i am using currently in the CTL file for userid**********
,USERID "CONCAT(substr(trim(:company_cd),1,2),lpad(trim(:center_cd),4,0))"
.......Thank You..........
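One hedged fix for the posted expression: left-pad the company code itself to two digits before concatenating, instead of taking whatever width arrives (an untested sketch against the posted column names):

```sql
,USERID "LPAD(TRIM(:company_cd), 2, '0') || LPAD(TRIM(:center_cd), 4, '0')"
```

This yields 2 + 4 = 6 digits whether company_cd arrives as "1" or "01".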
Edited by: phani_Marella on Aug 28, 2012 11:12 AM
Now where does company 'coz' come from all of a sudden?
I'm sure you read {message:id=9360002} , hence my confusion.
Anyway, the SQL*Loader forum is @ Export/Import/SQL Loader & External Tables -
SQL Loader reads Data file Sequentially or Randomly?
Will SQL*Loader load the data read from the file sequentially or randomly?
I have the data file like the below
one
two
three
four
and my control file is
LOAD DATA
INFILE *
TRUNCATE
INTO TABLE T TRAILING NULLCOLS
( x RECNUM,
y POSITION(1:4000)
)
so my table will be populated like
X Y
1 one
2 two
3 three
4 four
Will this happen sequentially even for large data sets? Say I have from one to one million rows in my data file.
Please clarify.
Thanks,
Rajesh.
SQL*Loader may read the file sequentially, but you should not rely on the physical ordering of the rows in the table.
It looks like that's what you were hinting at. -
Hi,
I am using SQL*Loader to insert data from a flat file.
While searching through the other options in SQL*Loader, I found the ZONED datatype.
I have some negative values in the flat file, like 98765.4321-
From what I have read, declaring the field as ZONED(9,4) should store the negative sign as well; can you confirm? What I have read suggests the trailing sign is handled by the ZONED datatype but not by ZONED EXTERNAL.
So please confirm this, or send me a link on the topic.
Also, I want to write NULLIF against more than one value for a single column. I found two approaches on the internet.
1) OR conditions in NULLIF. For example:
TerminationDate POSITION(58:63) DATE(6) "YYMMDD"
NULLIF(TerminationDate = "000000" OR TerminationDate = "999999" OR
TerminationDate = "731014")
2) DECODE the value. For example:
TerminationDate POSITION(58:63) "decode (:TerminationDate,
'000000', NULL, '999999', NULL, '731014', NULL, to_date (:TerminationDate,
'YYMMDD') )"
Which one is the better approach out of these 2?
Thanks
user539644 wrote:
Which one is the better approach out of these two?
The one that works correctly with good performance and maintainability is the best one; beyond that, you decide.
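For reference, the DECODE variant can be rewritten with CASE (a sketch; the field position and sentinel values are taken from the post):

```sql
TerminationDate POSITION(58:63)
  "CASE WHEN :TerminationDate IN ('000000', '999999', '731014') THEN NULL
        ELSE TO_DATE(:TerminationDate, 'YYMMDD') END"
```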
I personally like the NULLIF answer better because I find DECODE hard to work with; if you must use DECODE and have a recent version of the database, use CASE instead.
SQL loader with *.csv file
Good morning,
I have a *.CSV file containing data like this:
A123456789,Ah Tong Station,Jalan Dungun
I would like to insert the data into a table, one row for each record:
id outlet_name addr_1
A123456789 Ah Tong Station Jalan Dungun
etc
In the stored procedure acting as my loader, I used:
Insert into XXXXX values
(SUBSTR(input_buffer, 1, 10),
SUBSTR(input_buffer, 11, 60),
SUBSTR(input_buffer, 61, 110));
but the problem is that the insertion of outlet_name and addr_1 does not follow the substring positions provided. Instead it gave output like this:
id outlet_name addr_1
A123456789 ,Ah Tong Station,Jalan Dungun NULL
Hope to get some advice asap. Thanks. :)
Pauline
Why not use SQL*Loader proper to load this into your table? It will be significantly faster than anything you could do in PL/SQL, especially if your volumes increase. And it will take care of the parsing for you, as long as you tell it the field delimiter. You've already seen that you can't rely on fixed widths when using delimited data.
Given this data...
A123456789,Ah Tong Station,Jalan Dungun
A234,Some Station,Some Place
...the following SQL*Loader control file (stored in a file called mytable.ctl) will load your table (note I've used truncate; you could also append)...
load data
infile 'mydata.csv'
truncate
into table mytable
fields terminated by ','
( id
, outlet_name
, addr_1
)
This is invoked using the following:
sqlldr userid/password@tns control=mytable.ctl
The results are:
SQL> select * from mytable;
ID OUTLET_NAME ADDR_1
A123456789 Ah Tong Station Jalan Dungun
A234 Some Station Some Place
You will find this much easier. Of course, you could always avoid the database altogether if you don't need to actually store this data. You could just parse the source .csv (if you are on UNIX this will be really easy using an awk one-liner) and write it to an output file.
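The awk route mentioned above can be sketched in one pipeline; the sample rows come from this thread, and the column widths are arbitrary:

```shell
#!/bin/sh
# Sketch: split the delimited source outside the database entirely.
printf '%s\n' \
  'A123456789,Ah Tong Station,Jalan Dungun' \
  'A234,Some Station,Some Place' |
awk -F',' '{ printf "%-12s %-18s %s\n", $1, $2, $3 }'
```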
Anyway, hope this helps.
Regards
Adrian