Check data before loading through SQL*Loader
Hi all,
I have a temp table which is loaded through SQL*Loader. This table is used by a procedure for inserting data into another table.
I frequently get error ORA-01722 during the procedure's execution.
I have decided to check for the erroneous data through the control file itself.
I have a few questions about SQL*Loader.
Will a record containing character data for a column declared as INTEGER EXTERNAL in the ctl file get discarded?
Does declaring a column as INTEGER EXTERNAL take care of NULL values?
Does the whole record get discarded if one of the column values is misplaced within the record in the input file?
The control file is of the following format:
LOAD DATA
APPEND INTO TABLE Temp
FIELDS TERMINATED BY "|" optionally enclosed by "'"
trailing nullcols
( FILEDATE DATE 'DD/MM/YYYY',
ACC_NUM INTEGER EXTERNAL,
REC_TYPE ,
LOGO , (Data:Numeric Declared:VARCHAR)
CARD_NUM INTEGER EXTERNAL,
ACTION_DATE DATE 'DD/MM/YYYY',
EFFECTIVE_DATE DATE 'DD/MM/YYYY',
ACTION_AMOUNT , (Data:Numeric Declared:NUMBER)
ACTION_STORE , (Data:Numeric Declared:VARCHAR)
ACTION_AUTH_NUM ,
ACTION_SKU_NUM ,
ACTION_CASE_NUM )
What changes do I need to make in this file regarding the above questions?
Is there any online document for this?
Here it is
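As a sketch of the usual behaviour (worth verifying against the SQL*Loader documentation): a record whose INTEGER EXTERNAL field contains non-numeric characters is rejected with ORA-01722 and written to the bad file, not the discard file; an empty field loads as NULL; and TRAILING NULLCOLS only covers fields missing at the end of the record, so a misplaced value usually shifts all later fields rather than discarding the record. Naming the bad and discard files explicitly makes rejected records easy to inspect (the file names here are assumptions):

```
LOAD DATA
INFILE 'temp.dat' BADFILE 'temp.bad' DISCARDFILE 'temp.dsc'
APPEND INTO TABLE Temp
FIELDS TERMINATED BY "|" OPTIONALLY ENCLOSED BY "'"
TRAILING NULLCOLS
( FILEDATE DATE 'DD/MM/YYYY',
  ACC_NUM INTEGER EXTERNAL NULLIF ACC_NUM=BLANKS,
  -- ... remaining columns as in the control file above ...
  ACTION_CASE_NUM )
```

After a run, the bad file holds exactly the input records that failed conversion, which is usually the quickest way to find the data that later raises ORA-01722.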
Similar Messages
-
Loading through sql* loader
Hi,
I want to load xml file data using sql*loader into Oracle.
Kindly advice what could be the pros & cons for this.
Is there any online document for this?
Thanks
Deepak
Is there any online document for this?
Here it is -
How can I load data into table with SQL*LOADER
How can I load data into a table with SQL*Loader when the column data length is more than 255 bytes?
When a column exceeds 255 bytes, the data cannot be inserted into the table by SQL*Loader.
CREATE TABLE A (
A VARCHAR2 ( 10 ) ,
B VARCHAR2 ( 10 ) ,
C VARCHAR2 ( 10 ) ,
E VARCHAR2 ( 2000 ) );
control file:
load data
append into table A
fields terminated by X'09'
(A , B , C , E )
SQL*LOADER command:
sqlldr test/test control=A_ctl.txt data=A.xls log=b.log
datafile:
column E is more than 255 bytes
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
Check this out.
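A likely cause, noted here as a sketch to verify against the documentation: SQL*Loader's default datatype for a field is CHAR with a default maximum length of 255 bytes, so a longer delimited field needs an explicit length in the control file:

```
load data
append into table A
fields terminated by X'09'
(A, B, C, E CHAR(2000))
```

With E declared as CHAR(2000) to match the table column, fields longer than 255 bytes should load without being rejected.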
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1006961 -
Comparison of Data Loading techniques - Sql Loader & External Tables
Below are two techniques for loading data from flat files into Oracle tables.
1) SQL Loader:
a. Place the flat file (.txt or .csv) in the desired location.
b. Create a control file:
Load Data
Infile "Mytextfile.txt" -- file containing the table data; specify the path correctly, it could be .csv as well
Append or Truncate -- based on requirement
into table oracle_tablename
Fields terminated by "," -- or whatever delimiter the input file uses
optionally enclosed by '"'
(Field1, Field2, Field3, ...)
c. Now run the sqlldr utility of Oracle at the OS command prompt:
sqlldr username/password control=filename.ctl
d. The data can be verified by selecting the data from the table.
Select * from oracle_table;
2) External Table:
a. Place the flat file (.txt or .csv) on the desired location.
abc.csv
1,one,first
2,two,second
3,three,third
4,four,fourth
b. Create a directory
create or replace directory ext_dir as '/home/rene/ext_dir'; -- path where the source file is kept
c. After granting appropriate permissions to the user, we can create external table like below.
create table ext_table_csv (
  i number,
  n varchar2(20),
  m varchar2(20)
)
organization external (
  type oracle_loader
  default directory ext_dir
  access parameters (
    records delimited by newline
    fields terminated by ','
    missing field values are null
  )
  location ('file.csv')
)
reject limit unlimited;
d. Verify data by selecting it from the external table now
select * from ext_table_csv;
The external tables feature is a complement to existing SQL*Loader functionality.
It allows you to:
• Access data in external sources as if it were in a table in the database.
• Merge a flat file with an existing table in one statement.
• Sort a flat file on the way into a table you want compressed nicely.
• Do a parallel direct path load, without splitting up the input file yourself and writing out multiple smaller files.
Shortcomings:
• External tables are read-only.
• No data manipulation language (DML) operations or index creation is allowed on an external table.
Using external tables you can also:
• Load the data from a stored procedure or trigger (an INSERT statement works there; sqlldr does not)
• Do multi-table inserts
• Flow the data through a pipelined plsql function for cleansing/transformation
Comparison for data loading
To make the loading operation faster, the degree of parallelism can be set to any number, e.g. 4.
So, when you created the external table, the database will divide the file to be read by four processes running in parallel. This parallelism happens automatically, with no additional effort on your part, and is really quite convenient. To parallelize this load using SQL*Loader, you would have had to manually divide your input file into multiple smaller files.
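For example (a sketch; target_table is an assumed name): the external table can be marked parallel and read by a parallel direct-path insert:

```sql
alter session enable parallel dml;
alter table ext_table_csv parallel 4;   -- four parallel readers of the flat file

insert /*+ append parallel(t, 4) */ into target_table t
select * from ext_table_csv;
```

The database then splits the scan of file.csv across the parallel slaves automatically, with no manual splitting of the input file.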
Conclusion:
SQL*Loader may be the better choice in data loading situations that require additional indexing of the staging table. However, we can always copy the data from external tables to Oracle tables using DB links.
Please let me know your views on this.
-
Need faster data loading (using sql-loader)
i am trying to load approx. 230 million records (around 60-bytes per record) from a flat file into a single table. i have tried sql-loader (conventional load) and i'm seeing performance degrade as the file is being processed. i am avoiding direct path sql-loading because i need to maintain uniqueness using my primary key index during the load. so the degradation of the load performance doesn't shock me. the source data file contains duplicate records and may contain records that are duplicates of those that are already in the table (i am appending during sql-loader).
my other option is to unload the entire table to a flat file, concatenate the new file onto it, run it through a unique sort, and then direct-path load it.
has anyone had a similar experience? any cool solutions available that are quick?
thanks,
jeff
It would be faster to load into an Oracle table, call it a temporary table, and then make a final move into the final table.
This way you could direct load into an oracle table then you could
INSERT /*+ APPEND */ INTO Final_Table
SELECT DISTINCT *
FROM Temp_Table
ORDER BY ID;
This would do a 'direct load' type move from your temp table to the final table, automatically merging the duplicate records.
So
1) Direct Load from SQL*Loader into temp table.
2) Place index (non-unique) on temp table column ID.
3) Direct load INSERT into the final table.
Step 2 may make this process faster or slower, only testing will tell.
Good Luck,
Eric Kamradt -
Importing oracle.sql.BLOB through SQL Loader
Hello,
Currently the system we have creates .sql files and executes them. This takes a long time, so we're attempting to change to using SQL Loader. The one problem I'm having that I can't seem to fix is finding out how to imitate the behavior of this SQL statement through SQL Loader:
INSERT INTO KRSB_QRTZ_BLOB_TRIGGERS (BLOB_DATA,TRIGGER_GROUP,TRIGGER_NAME)
VALUES (oracle.sql.BLOB@515263,'KCB-Delivery','PeriodicMessageProcessingTrigger')
I tried creating a lobfile and placing the text oracle.sql.BLOB@515263 within it this way
INTO TABLE KRSB_QRTZ_BLOB_TRIGGERS
WHEN tablename = 'KRSB_QRTZ_BLOB_TRIGGERS'
FIELDS TERMINATED BY ',' ENCLOSED BY '##'
TRAILING NULLCOLS
( tablename FILLER POSITION(1),
TRIGGER_NAME CHAR(80),
TRIGGER_GROUP CHAR(80),
ext_fname FILLER,
BLOB_DATA LOBFILE(ext_fname) TERMINATED BY EOF )
However, as expected, it merely loaded the text "oracle.sql.BLOB@515263" into the database. So does anyone have any ideas of how to imitate that insert statement through SQL Loader? The only information I have available is the string "oracle.sql.BLOB@515263", no other files or anything to aid with actually getting the binary data.
When the .sql file is run with that insert statement in it, a 1.2kb BLOB is inserted into the database versus a 22 byte BLOB that contains nothing useful when done through SQL Loader.
Alex
My preference is DBMS_LOB.LOADFROMFILE
http://www.morganslibrary.org/reference/dbms_lob.html
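A sketch of that approach, assuming the binary content exists as a file in a directory object (the directory and file names below are made up; the real difficulty in this thread is that only the string "oracle.sql.BLOB@515263" is available, not the underlying binary data):

```sql
declare
  l_blob  blob;
  l_bfile bfile := bfilename('EXT_DIR', 'trigger_payload.bin');  -- assumed names
begin
  -- insert the row with an empty BLOB locator, then fill it from the file
  insert into krsb_qrtz_blob_triggers (trigger_name, trigger_group, blob_data)
  values ('PeriodicMessageProcessingTrigger', 'KCB-Delivery', empty_blob())
  returning blob_data into l_blob;

  dbms_lob.open(l_bfile, dbms_lob.lob_readonly);
  dbms_lob.loadfromfile(l_blob, l_bfile, dbms_lob.getlength(l_bfile));
  dbms_lob.close(l_bfile);
  commit;
end;
/
```

This loads the full binary content (e.g. the 1.2 KB payload) rather than the 22-byte literal string that SQL*Loader was inserting.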
did you try it? -
Is this possible through sql loader
hi gurus,
I want to load data through SQL*Loader, but the problem is that the data file contains some numeric values, and after each numeric value there is a sign character,
for example
position 1 to 9 = 12345678- (in case of negative)
or
position 1 to 9 = 12345678% (% represents blank space in case of positive)
is there any way to load this numeric value into one field, where the sign follows the numeric value?
Thanks
Hi Jim/Jenis,
Just want to know why the statement below doesn't work:
when I used the "MI" format, all the records with a trailing blank were rejected by the loader; only those with a trailing "-" loaded.
CURR_MON_OPEN_BAL position(0045:0053) "to_number(:CURR_MON_OPEN_BAL, '99999999MI')"
then I used the following trick and it works fine:
CURR_MON_OPEN_BAL position(0045:0053) "to_number(rpad(:CURR_MON_OPEN_BAL,9,'+'), '99999999S')"
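The "S" behaviour can be checked directly in SQL*Plus; with a trailing S the sign character must actually be present in the string, which is presumably why the rpad with '+' is needed for the positive rows (SQL*Loader trims the trailing blank from the CHAR field before the expression runs):

```sql
select to_number('12345678-', '99999999S') from dual;  -- -12345678
select to_number('12345678+', '99999999S') from dual;  -- 12345678
```

The rpad pads the trimmed 8-character positive value back out to 9 characters with '+', so both signs satisfy the '99999999S' format model.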
Could you guys explain why the MI format doesn't work in my case? -
Defaulting org id value into a table through SQL Loader program
Hi ,
We have a requirement that we need to load some data from a flat file to a table. We are using SQL*Loader to do that. So far no problem, but now the requirement is that we also need to populate the org id from which we are running the program.
I tried fnd_profile.value('ORG_ID') and it is populating the site-level org id.
Could anyone please help me with how to default the org id or request id into a table through a SQL*Loader program?
Thanks,
Y
user12001627 wrote:
Hi Srini,
Thanks for looking into this!!
We are on EBS 11.5.10 and OS is solaris.
I tried fnd_profile,fnd_global but no luck.
Here is the control file which we are using to load data.
load data
infile *
replace into table XXXX_YYYY_STAG
trailing nullcols
(line POSITION(1:2000))
I would like to populate the org id when I load the data from the file. Unfortunately there is no identifier in the file that says which org id the data in the file is for. The only way to identify the file's org is from the file name.
Where do you want to populate the ORG_ID ? There is no column for it in your stage table above
Is there a way we can pass it through concurrent program parameters?
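sqlldr has no user-defined parameters of its own, so if the stage table had an ORG_ID column, one sketch (column name and value purely illustrative) is to generate the control file per run with the value baked in as a CONSTANT:

```
load data
infile *
replace into table XXXX_YYYY_STAG
trailing nullcols
( line POSITION(1:2000),
  org_id CONSTANT '204'   -- value substituted into the generated control file per run
)
```

The wrapper script that derives the org from the file name would write this control file before each sqlldr invocation.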
Thanks
Y
HTH
Srini -
Load two input file into One Target table through SQL LOADER
Hi Expert,
I have 2 .csv files.
The 1st file has cols account_no & first name.
The 2nd file has cols account_no & last name.
There is a table called temp that has 3 cols: accounts_number, first name & last name.
The account numbers in both files are the same.
Now I want to upload these 2 files into the table temp.
How do I do that?
Pls suggest
Thx
Umesh Goel
UG wrote:
I don't want to truncate my table because account no and first name come from the first file and last name from the second file,
so please suggest something else....
Hang on, I'll go find you some glasses. I think you're having trouble reading.
First file : Control file truncates and loads data
Second file : Control file does not truncate and loads data
2 Calls to SQL*Loader
Is that a little clearer?
Are you saying there is a relationship between the data in the two files?
How do you know which last name relates to which first name in the first file? Is there a reference key between the records on the two files?
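If account_no really is a reliable key in both files, one common pattern (sketched here with assumed staging-table names) is to load each file into its own staging table with two SQL*Loader runs, then combine them in SQL:

```sql
-- stage_first (account_no, first_name) and stage_last (account_no, last_name)
-- are assumed staging tables, each loaded by its own SQL*Loader run.
insert into temp (accounts_number, first_name)
select account_no, first_name from stage_first;

merge into temp t
using stage_last s
on (t.accounts_number = s.account_no)
when matched then
  update set t.last_name = s.last_name;
```

This avoids truncating anything: the second file only fills in the last_name column for rows already keyed by account number.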
Perhaps you'd better show us an example of the contents of the files so we can see what you are trying to do. -
SQL Loader error: SQL*Loader-926. Please help
Hi,
While loading some files to my database table, I am getting the following error. I am using 'Truncate' option while loading the file:
Error:
====
SQL*Loader-926: OCI error while executing delete/truncate (due to REPLACE/TRUNCATE keyword) for table LOS_STAGE_DS4
ORA-01426: numeric overflow
Here are the loader properties (excerpts from the load log):
================================
SQL*Loader: Release 11.1.0.6.0 - Production on Fri Nov 26 04:54:18 2010
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Control File: d:\Prod\rent_Load\Bin\rent_Load.ctl
Data File: d:\Prod\rent_Load\Data\rent.704
Bad File: d:\Prod\rent_Load\Bad\rent.704
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 1000000000
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table LS_STAGE, loaded from every logical record.
Insert option in effect for this table: TRUNCATE
Column Name Position Len Term Encl Datatype
Could someone please help and advise what is the root cause of this error?
Thanks,
The root cause is in the error ORA-01426, which you can look up in the online error documentation at http://tahiti.oracle.com . No one knows every error message by heart. This means it is expected that you look up the error prior to posting, rather than expecting a volunteer in this forum to look it up on your behalf.
Also this is a typical candidate for being a known problem, and known problems can be found on My Oracle Support.
Sybrand Bakker
Senior Oracle DBA -
Track flat files that failed loading in sql loader
Hi,
Can anyone please suggest a way to track the flat files which failed while loading in SQL*Loader, and pass the failed flat file's name to a column of a database table?
Thanks in advance.
Edited by: 806821 on Nov 2, 2010 10:22 AM
Hi Morgan, thanks for your reply.
Define failed. 1 row not loaded ... no rows not loaded ... what operating system ... what version of the Oracle database ... track in a table, send an email?
Your inquiry is long on generalities and short on specifics: Fill in all the blanks ... not just the ones I've listed above.
Even if 1 row is not loaded, it should be considered a failed load, and the name of that particular flat file should be fetched.
The operating system is Unix.
The Oracle database we are using is R12.
Track in a table: yes, and we want to send an email notification whenever a flat file fails to load.
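A minimal unix sketch for this requirement (directory, credentials, and control-file name are all assumptions): on unix, sqlldr exits 0 on success and non-zero on warning or failure, so a wrapper can record every file whose run did not exit 0:

```shell
#!/bin/sh
# Record every data file whose sqlldr run exits non-zero (hypothetical names).
DATA_DIR=./incoming
FAILED_LIST=failed_files.txt
: > "$FAILED_LIST"                              # start with an empty list
for f in "$DATA_DIR"/*.txt; do
  # sqlldr returns 0 (EX_SUCC) only when the load fully succeeded
  if ! sqlldr userid=scott/tiger control=stage.ctl data="$f" \
       log=stage.log > /dev/null 2>&1; then
    echo "$f" >> "$FAILED_LIST"                 # remember the failed file
  fi
done
# failed_files.txt can now feed an email, or an INSERT into a tracking table.
cat "$FAILED_LIST"
```

Note that a partial load (some rows rejected) gives a warning exit code, which this sketch also treats as a failure, matching the "even 1 row" requirement above.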
Thanks once again...!! -
Why no exclusive lock for conventional path loading for SQL*Loader?
why no exclusive lock for conventional path loading for SQL*Loader?
It uses insert statements, so it should use an exclusive lock, right?
thanks
OK, so only an update statement would put a lock, but not an insert statement?
because I have seen a situation where a user updating rows in a session (without commit) prevented another user from updating the rows.
thanks -
Unable to load clob data through sql loader
Hi Experts ,
My ctl file is :
LOAD DATA infile '$di_top/conversion/devtrack_notes.csv'
truncate into table xxdi_proj
fields terminated by ','
optionally enclosed by '"'
trailing nullcols (bugid, note clob)
The problem is that the note column is a CLOB, and one of the values has line breaks, like this:
Hi Sir,
Would you please inform when the reports are scheduled for automatic process?
Maria will stop his process to avoid duplication.
Please inform asap
With Regards ,
Ronaldinho
When the data gets loaded, the first column gets the sentence 'Would you please inform.....', i.e. the data of the second column gets loaded into the first column, as the line breaks are recognized as record delimiters.
How to overcome this problem?
Thanks
Pl post your exact OS and database versions, along with your complete sqlldr command, the table description and a sample of your input csv file.
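One workaround often suggested for embedded line breaks, sketched here on the assumption that the extract can be regenerated with an explicit end-of-record marker (here '~' followed by a newline): name that marker in the INFILE clause so newlines inside the note no longer terminate the record:

```
LOAD DATA
INFILE '$di_top/conversion/devtrack_notes.csv' "str X'7E0A'"  -- record ends with '~' + newline
TRUNCATE INTO TABLE xxdi_proj
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( bugid,
  note CHAR(100000)   -- buffer large enough for the CLOB text
)
```

With the record terminator defined this way, the multi-line note loads into the note column intact instead of spilling into bugid.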
HTH
Srini -
Problem in Data Migration via. SQL*Loader
Hi,
I am trying to load the data from a generated text file.
While using the sqlldr command along with the url, ".ctl", ".log", ".bad" and ".dat" parameters, all the records go to the ".bad" file. Why?
Even though it parses the control file successfully.
Anybody please help me in the matter.
I am sending the ".log" file generated during the transaction :
========================================================================
Control File: CUSTOMER.CTL
Data File: CUSTOMER.DAT
Bad File: CUSTOMER.BAD
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table CUSTOMER_BACKUP, loaded from every logical record.
Insert option in effect for this table: INSERT
Column Name Position Len Term Encl Datatype
CS_CODE FIRST * WHT O(") CHARACTER
CS_NAME NEXT * WHT O(") CHARACTER
CS_ADD1 NEXT * WHT O(") CHARACTER
CS_ADD2 NEXT * WHT O(") CHARACTER
CS_ADD3 NEXT * WHT O(") CHARACTER
CS_ADD4 NEXT * WHT O(") CHARACTER
CS_PIN NEXT * WHT O(") CHARACTER
CS_PHONE NEXT * WHT O(") CHARACTER
CS_TELEX NEXT * WHT O(") CHARACTER
CS_CR_DAYS NEXT * WHT O(") CHARACTER
CS_CR_LIM NEXT * WHT O(") CHARACTER
CS_TRD_DIS NEXT * WHT O(") CHARACTER
CS_CSH_DIS NEXT * WHT O(") CHARACTER
CS_STX_FRM NEXT * WHT O(") CHARACTER
CS_LSTX_NO NEXT * WHT O(") CHARACTER
CS_MOB_BAL NEXT * WHT O(") CHARACTER
CS_STX_PER NEXT * WHT O(") CHARACTER
CS_IND NEXT * WHT O(") CHARACTER
CS_CSTX_NO NEXT * WHT O(") CHARACTER
CS_SLMN_CD NEXT * WHT O(") CHARACTER
CS_BANK_1 NEXT * WHT O(") CHARACTER
CS_BANK_2 NEXT * WHT O(") CHARACTER
CS_BANK_3 NEXT * WHT O(") CHARACTER
CS_YOB_BAL NEXT * WHT O(") CHARACTER
CS_CURR NEXT * WHT O(") CHARACTER
CS_ZONE NEXT * WHT O(") CHARACTER
CS_CAT NEXT * WHT O(") CHARACTER
F_EDT NEXT * WHT O(") CHARACTER
F_UID NEXT * WHT O(") CHARACTER
F_ACTV NEXT * WHT O(") CHARACTER
CS_RANGE NEXT * WHT O(") CHARACTER
CS_ITNO NEXT * WHT O(") CHARACTER
CS_INT NEXT * WHT O(") CHARACTER
CS_CIRCLE NEXT * WHT O(") CHARACTER
CS_ECCCODE NEXT * WHT O(") CHARACTER
CS_MFRCODE NEXT * WHT O(") CHARACTER
CS_QTY_BUG NEXT * WHT O(") CHARACTER
CS_VAL_BUG NEXT * WHT O(") CHARACTER
Record 1: Rejected - Error on table CUSTOMER_BACKUP, column CS_YOB_BAL.
Column not found before end of logical record (use TRAILING NULLCOLS)
Record 2: Rejected - Error on table CUSTOMER_BACKUP, column CS_CURR.
Column not found before end of logical record (use TRAILING NULLCOLS)
Record 3: Rejected - Error on table CUSTOMER_BACKUP, column CS_CURR.
Column not found before end of logical record (use TRAILING NULLCOLS)
Column not found before end of logical record (use TRAILING NULLCOLS)
Record 50: Rejected - Error on table CUSTOMER_BACKUP, column CS_MOB_BAL.
Column not found before end of logical record (use TRAILING NULLCOLS)
Record 51: Rejected - Error on table CUSTOMER_BACKUP, column CS_MOB_BAL.
Column not found before end of logical record (use TRAILING NULLCOLS)
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table CUSTOMER_BACKUP:
0 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 254904 bytes(26 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 51
Total logical records rejected: 51
Total logical records discarded: 0
Run began on Wed Aug 28 13:30:29 2002
Run ended on Wed Aug 28 13:30:29 2002
Elapsed time was: 00:00:00.18
CPU time was: 00:00:00.08
=================================================================
Regards,
Bimal
Hi,
I am a bit late, but if we use TRAILING NULLCOLS in the control file your problem should be fixed.
TRAILING NULLCOLS tells SQL*Loader to treat any relatively positioned columns that are not present in the record as null columns.
For example, if the following data
10 Accounting
is read with the following control file
INTO TABLE dept
TRAILING NULLCOLS
( deptno CHAR TERMINATED BY " ",
dname CHAR TERMINATED BY WHITESPACE,
loc CHAR TERMINATED BY WHITESPACE )
and the record ends after DNAME. The remaining LOC field is set to null. Without the TRAILING NULLCOLS clause, an error would be generated due to missing data. -
Loading Data Error while using SQL Loader
Hi All,
I am facing another issue while loading the data from text file to oracle db.
I have a table in which data is loaded from multiple files. What I want is that when I load the data, the filename is loaded as well, in front of each record. The script works fine when I load data on Windows, but I am getting an error on Unix. My SQL*Loader script is as follows.
source $HOME/.bash_profile
export LOGFILE=/NSN/rawfiles/scripts/PAYMOBILE_LOAD/load.log
ORACLE_ACCESS=cdr/cdr123_awcc@$ORACLE_SID
FILES=`ls /NSN/rawfiles/bkp/PAYMOBILE/JAN2013/*.txt`;
for file in ${FILES[@]}
do
filename=`expr substr "$file" 43 50`
echo $filename
sqlldr $ORACLE_ACCESS control=/NSN/rawfiles/scripts/PAYMOBILE/CAT.ctl log=$LOGFILE data=/NSN/rawfiles/bkp/PAYMOBILE/JAN2013/$filename skip=0 direct=true
cd
mv /NSN/rawfiles/bkp/PAYMOBILE/JAN2013/$filename /NSN/backup/PAYMOBILE/JAN2013/
done
Control File for the same which i run on unix is :
Load data
INFILE "AR_101_1011_01-01-2013-01-32.txt"
Append
into table CDR.PAY_MOBILE_012013
fields terminated by "|" optionally enclosed by " "
trailing nullcols
( VERSION, TICKETTYPE, GMT_TIME Date 'YYYY-MM-DD HH24:MI:SS', LOCAL_TIME Date 'YYYY-MM-DD HH24:MI:SS', TRANSACID, EXTERNAL_TRANSACTION_ID, MEDIUM, ALIASCATEGORY, SERVICE, OPERATION, ACTORID,
ALIASNAME, SENDERACTORID, SENDERALIASNAME, RECIPIENTACTORID, RECIPIENTALIASNAME, THIRDACTORID, THIRDALIASNAME, ERRORHRN, CLASS, SNE,
MASTERSNE, VOUCHERCATEGORYID, VOUCHERSCENARIOID, EXPIRYDATE Date 'YYYY-MM-DD HH24:MI:SS', UNITVALUEID, CREDIT, VAT, IERROR, OPERSTATE, OPERERROR, CREATION_DATE Date 'YYYY-MM-DD HH24:MI:SS',
LAST_UPDATE Date 'YYYY-MM-DD HH24:MI:SS', BRANDID, CREDIT_PERIOD, CREDIT_EXPIRY_DATE Date 'YYYY-MM-DD HH24:MI:SS', SENDER_ACCOUNT, SENDER_TRANSAC, SENDER_ACCOUNT_TYPE, RECIPIENT_ACCOUNT,
RECIPIENT_TRANSAC, RECIPIENT_ACCOUNT_TYPE, RECIPIENT_MEDIUM, TIMESTAMP Date 'YYYY-MM-DD HH24:MI:SS', SENDERBILLINGTYPE, RECIPIENTBILLINGTYPE )
Control file which works fine in windows is as follows:
Load data
Append
into table CDR.PAY_MOBILE_012013
fields terminated by "|" optionally enclosed by " "
trailing nullcols
( VERSION, TICKETTYPE, GMT_TIME Date 'YYYY-MM-DD HH24:MI:SS', LOCAL_TIME Date 'YYYY-MM-DD HH24:MI:SS', TRANSACID, EXTERNAL_TRANSACTION_ID, MEDIUM, ALIASCATEGORY, SERVICE, OPERATION, ACTORID,
ALIASNAME, SENDERACTORID, SENDERALIASNAME, RECIPIENTACTORID, RECIPIENTALIASNAME, THIRDACTORID, THIRDALIASNAME, ERRORHRN, CLASS, SNE,
MASTERSNE, VOUCHERCATEGORYID, VOUCHERSCENARIOID, EXPIRYDATE Date 'YYYY-MM-DD HH24:MI:SS', UNITVALUEID, CREDIT, VAT, IERROR, OPERSTATE, OPERERROR, CREATION_DATE Date 'YYYY-MM-DD HH24:MI:SS',
LAST_UPDATE Date 'YYYY-MM-DD HH24:MI:SS', BRANDID, CREDIT_PERIOD, CREDIT_EXPIRY_DATE Date 'YYYY-MM-DD HH24:MI:SS', SENDER_ACCOUNT, SENDER_TRANSAC, SENDER_ACCOUNT_TYPE, RECIPIENT_ACCOUNT,
RECIPIENT_TRANSAC, RECIPIENT_ACCOUNT_TYPE, RECIPIENT_MEDIUM, TIMESTAMP Date 'YYYY-MM-DD HH24:MI:SS', SENDERBILLINGTYPE, RECIPIENTBILLINGTYPE, FILENAME CONSTANT "AR_101_1011_01-12-2012-23-32.txt" )
and the sql loader script which works fine in windows is as follows:
cd K:\paymobilefiles\DEC2012\1
for %%f in (*.txt) do (K:\paymobilefiles\DEC2012\1\LOAD\REPTEXT K:\paymobilefiles\DEC2012\1\LOAD\TEST.ctl "%%f">K:\paymobilefiles\DEC2012\1\LOAD\MYMOBILE.ctl
sqlldr userid=cdr/cdr123_awcc@tsiindia control=K:\paymobilefiles\DEC2012\1\LOAD\MYMOBILE.ctl rows=50000 bindsize=20000000 readsize=20000000 data=%%f log=K:\paymobilefiles\DEC2012\1\LOAD\MYMOBILE.log
move %%f K:\paymobilefiles\DEC2012\backup\1
When using the Windows-based scripts I am getting the filename in the filename column, but when running the above-mentioned Unix scripts I am not able to get the filenames in the filename column.
Can anyone help with the same, please?
HI,
There is a difference in the control files: no INFILE & FILENAME CONSTANT values in the Unix one.
Load data
Append
into table CDR.PAY_MOBILE_012013
fields terminated by "|" optionally enclosed by " "
trailing nullcols
( VERSION, TICKETTYPE, GMT_TIME Date 'YYYY-MM-DD HH24:MI:SS' , LOCAL_TIME Date 'YYYY-MM-DD HH24:MI:SS', TRANSACID, EXTERNAL_TRANSACTION_ID, MEDIUM, ALIASCATEGORY, SERVICE, OPERATION, ACTORID,
ALIASNAME, SENDERACTORID, SENDERALIASNAME, RECIPIENTACTORID, RECIPIENTALIASNAME, THIRDACTORID, THIRDALIASNAME, ERRORHRN, CLASS, SNE,
MASTERSNE, VOUCHERCATEGORYID, VOUCHERSCENARIOID, EXPIRYDATE Date 'YYYY-MM-DD HH24:MI:SS', UNITVALUEID, CREDIT, VAT, IERROR, OPERSTATE, OPERERROR, CREATION_DATE Date 'YYYY-MM-DD HH24:MI:SS',
LAST_UPDATE Date 'YYYY-MM-DD HH24:MI:SS', BRANDID, CREDIT_PERIOD, CREDIT_EXPIRY_DATE Date 'YYYY-MM-DD HH24:MI:SS', SENDER_ACCOUNT, SENDER_TRANSAC, SENDER_ACCOUNT_TYPE, RECIPIENT_ACCOUNT,
RECIPIENT_TRANSAC, RECIPIENT_ACCOUNT_TYPE, RECIPIENT_MEDIUM, TIMESTAMP Date 'YYYY-MM-DD HH24:MI:SS', SENDERBILLINGTYPE, RECIPIENTBILLINGTYPE, FILENAME CONSTANT "AR_101_1011_01-12-2012-23-32.txt"
)
Thanks,
Ajay More
http://moreajays.blogspot.com
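For completeness, a sketch of how the Unix script could inject the current file name the way the Windows REPTEXT step does: keep a template control file with a placeholder and substitute the name on each iteration (the template below is abbreviated to two columns; the table and names follow the scripts above):

```shell
#!/bin/sh
# Build a per-file control file so FILENAME CONSTANT matches the file loaded.
cat > CAT_template.ctl <<'EOF'
load data
append into table CDR.PAY_MOBILE_012013
fields terminated by "|" optionally enclosed by " "
trailing nullcols
( VERSION, TICKETTYPE,
  FILENAME CONSTANT ':FILENAME:'
)
EOF
for f in AR_101_1011_01-01-2013-01-32.txt; do   # would be the $FILES loop above
  sed "s/:FILENAME:/$f/" CAT_template.ctl > CAT_run.ctl
  # sqlldr $ORACLE_ACCESS control=CAT_run.ctl data="$f" log=$LOGFILE direct=true
done
cat CAT_run.ctl
```

Each pass through the loop regenerates CAT_run.ctl with the current file's name baked into the CONSTANT clause, so the filename column is populated correctly per file.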