SQL Loader input derived column
Hi,
I am running SQL*Loader. My source file has two columns, and I want to load them along with a LoadDate and the name of the source file. My loader.ctl is below:
load data
infile 'c:\Reports\Test20070619.txt'
APPEND
into table MikeTest2
fields terminated by "," optionally enclosed by '"'
(OrderNumber,SKU,LoadDate sysdate,SourceFile "Test20070619.txt")
How would I load the SourceFile field with the value Test20070619.txt? I can get LoadDate to load with SYSDATE. Any thoughts?
Any info would be greatly appreciated.
Thanks,
Mike
(OrderNumber,
SKU,
LoadDate sysdate,
SourceFile "Test20070619.txt")
If I understood your question correctly: create a shell variable for the filename and reference it when generating the control file, using the CONSTANT keyword so SQL*Loader treats the value as a literal rather than a field in the data file.
In a shell script:
cat <<EOF! > $ctl_file
LOAD DATA
INFILE '$in_file'
BADFILE '$HOME/logs/$bad_file'
DISCARDFILE '$HOME/logs/$disc_file'
APPEND
INTO TABLE MikeTest2
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
(OrderNumber,
SKU,
LoadDate SYSDATE,
SourceFile CONSTANT "$in_filename"
)
EOF!
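If it helps, here is a minimal, self-contained sketch of that idea. The table and column names come from the question above; the path and credentials are illustrative assumptions:

```shell
#!/bin/sh
# Generate a control file that loads the source file's name into each
# row via the CONSTANT keyword. Table/columns are from the question.
in_file="Test20070619.txt"
ctl_file="loader.ctl"
cat <<EOF > "$ctl_file"
LOAD DATA
INFILE '$in_file'
APPEND
INTO TABLE MikeTest2
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
(OrderNumber,
 SKU,
 LoadDate SYSDATE,
 SourceFile CONSTANT '$in_file')
EOF
# sqlldr userid=user/pass control="$ctl_file"   # the actual load step
```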
Similar Messages
-
SQL*Loader support for BINARY_DOUBLE/BINARY_FLOAT database columns
Does SQL*Loader support database columns of type BINARY_DOUBLE/BINARY_FLOAT when using direct path?
Why not use Google and find out for yourself?
The first hit on Google is this
Go see for yourself
Regards
FJ -
Hi All,
I have 20 text files containing a large amount of data that I need to load into an Oracle DB. Each text file has a header row with column names, in the same order. My manager asked me to load the data so that the last column holds the file name, so he can track which text file each row came from. Can anybody help me do this using SQL*Loader?
Example:
temp1.txt           temp2.txt
col1  col2          col1  col2
118   'B'           100   'J'
245   'D'           110   'K'
160   'A'           200   'N'
Now I need to load the data from these text files into a database table as follows:
Table Name: Temp
col1 col2 FileName
118 'B' temp1
245 'D' temp1
160 'A' temp1
100 'J' temp2
110 'K' temp2
200 'N' temp2
You can load from multiple input files, provided the files use the same record format, by repeating the INFILE clause. Here is an example:
load data
infile 'file1.dat'
infile 'file2.dat'
infile 'file3.dat'
append
into table temp
fields terminated by "," optionally enclosed by '"' trailing nullcols
( column1,
column2 ) -
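Note that repeating INFILE alone does not record which file a row came from, which was the original requirement. One hedged sketch (table and column names are assumed from the example above): generate one control file per input file and pass the file's base name through a CONSTANT column:

```shell
#!/bin/sh
# For each data file, emit a control file whose FileName column is a
# CONSTANT holding that file's base name; then run sqlldr on each one.
for f in temp1.txt temp2.txt; do
  base=$(basename "$f" .txt)
  cat <<EOF > "$base.ctl"
LOAD DATA
INFILE '$f'
APPEND
INTO TABLE Temp
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(col1, col2, FileName CONSTANT '$base')
EOF
  # sqlldr userid=user/pass control="$base.ctl"   # not executed here
done
```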
Convert SQL Query to Derived Column Expression
I'm in the process of reworking some SSIS packages I inherited. Many of them use Execute SQL Task components to update certain fields in the tables once they've been loaded. I'd like to replace these individual UPDATE statements with Derived Column components.
I've got 95% of them converted, except for a couple, one of which has me a bit perplexed and was hoping someone could provide some insight.
Essentially, there is a column called POST_PERIOD, which is a date in string format YYYYMM (e.g., 201503). The SQL update parses out the month piece, converts it to the month name, and populates a column called POST_PERIOD_MONTH.
Here is the SQL:
select DateName(month , DateAdd(month, CONVERT(INT, SUBSTRING(POST_PERIOD,5,2)), 0 ) - 1 )
I'd like to accomplish this using a Derived Column, but not sure how to go about doing this...
If someone could point me in the right direction, it would be greatly appreciated!
Thanks!
A. M. Robinson
Visakh:
Thank you for the reply! I've tried your solution but am getting the following error(s):
Warning 17 Validation warning. Populate STAGING_DIM_AP_DETAIL - RADIO: {ACAA99C4-272C-4641-BF44-B4D4409D1EFB}: The expression "(DT_STR,8,1252)((DT_I4)[SUBSTRING](#85,5,2) == 1 ? (DT_STR,8,1252)"January" : ((DT_I4)[SUBSTRING](#85,5,2)
== 2 ? (DT_STR,8,1252)"February" : ((DT_I4)[SUBSTRING](#85,5,2) == 3 ? (DT_STR,8,1252)"March" : ((DT_I4)[SUBSTRING](#85,5,2) == 4 ? (DT_STR,8,1252)"April" : ((DT_I4)[SUBSTRING](#85,5,2) == 5 ? (DT_STR,8,1252)"May" :
((DT_I4)[SUBSTRING](#85,5,2) == 6 ? (DT_STR,8,1252)"June" : ((DT_I4)[SUBSTRING](#85,5,2) == 7 ? (DT_STR,8,1252)"July" : ((DT_I4)[SUBSTRING](#85,5,2) == 8 ? (DT_STR,8,1252)"August" : ((DT_I4)[SUBSTRING](#85,5,2) == 9 ? (DT_STR,8,1252)"September"
: ((DT_I4)[SUBSTRING](#85,5,2) == 10 ? (DT_STR,8,1252)"October" : ((DT_I4)[SUBSTRING](#85,5,2) == 11 ? (DT_STR,8,1252)"November" : (DT_STR,8,1252)"December")))))))))))" will always result in a truncation of data. The expression
contains a static truncation (the truncation of a fixed value). Dimension Finance - Load STAGING_DIM_AP_DETAIL_DERIVED_COLUMNS.dtsx 0 0
This is the derived column expression I'm using:
(DT_STR,8,1252) ((DT_I4)SUBSTRING([POST_PERIOD],5,2) == 1? (DT_STR,8,1252) "January" : ((DT_I4)SUBSTRING([POST_PERIOD],5,2) == 2 ? (DT_STR,8,1252) "February" : ((DT_I4)SUBSTRING([POST_PERIOD],5,2) == 3 ? (DT_STR,8,1252) "March" :((DT_I4)SUBSTRING([POST_PERIOD],5,2) == 4 ? (DT_STR,8,1252) "April":((DT_I4)SUBSTRING([POST_PERIOD],5,2) == 5 ? (DT_STR,8,1252) "May" : ((DT_I4)SUBSTRING([POST_PERIOD],5,2) == 6 ? (DT_STR,8,1252) "June" : ((DT_I4)SUBSTRING([POST_PERIOD],5,2) == 7 ? (DT_STR,8,1252) "July" : ((DT_I4)SUBSTRING([POST_PERIOD],5,2) == 8 ? (DT_STR,8,1252) "August":((DT_I4)SUBSTRING([POST_PERIOD],5,2) == 9 ? (DT_STR,8,1252) "September":((DT_I4)SUBSTRING([POST_PERIOD],5,2) == 10? (DT_STR,8,1252) "October": ((DT_I4)SUBSTRING([POST_PERIOD],5,2) == 11 ? (DT_STR,8,1252) "November" : (DT_STR,8,1252)"December")))))))))))
Some info that might be of some help in the matter...the POST_PERIOD field is a VARCHAR(8), the POST_PERIOD_MONTH field is also a VARCHAR(8).
There is a computed column involved as well. Any help would be appreciated once again!
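For what it's worth, the warning reads like a pure length issue rather than a logic problem: the expression casts month names to DT_STR,8, and "September" is nine characters, so the cast is a guaranteed (static) truncation. Widening the cast and the target column to at least 9 should clear it. A quick sanity check of the YYYYMM-to-month-name mapping outside SSIS (assumes GNU date is available):

```shell
#!/bin/sh
# Check the month-name logic and the length that trips up DT_STR,8.
pp=201503                         # sample POST_PERIOD value
yyyy=${pp%??}; mm=${pp#????}      # split into 2015 and 03
month=$(LC_ALL=C date -d "$yyyy-$mm-01" +%B)
echo "$month"                     # prints "March"
printf %s "September" | wc -c     # 9 characters: too long for DT_STR,8
```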
A. M. Robinson -
HELP: SQL*LOADER AND Ref Column
Hello,
I have already posted about this; I really need help and can't get any further with it.
I have the following problem. I have two tables, which I created the following way:
CREATE TYPE gemark_schluessel_t AS OBJECT(
gemark_id NUMBER(8),
gemark_schl NUMBER(4),
gemark_name VARCHAR2(45)
);
CREATE TABLE gemark_schluessel_tab OF gemark_schluessel_t(
constraint pk_gemark PRIMARY KEY(gemark_id)
);
CREATE TYPE flurstueck_t AS OBJECT(
flst_id NUMBER(8),
flst_nr_zaehler NUMBER(4),
flst_nr_nenner NUMBER(4),
zusatz VARCHAR2(2),
flur_nr NUMBER(2),
gemark_schluessel REF gemark_schluessel_t,
flaeche SDO_GEOMETRY
);
CREATE TABLE flurstuecke_tab OF flurstueck_t(
constraint pk_flst PRIMARY KEY(flst_id),
constraint uq_flst UNIQUE(flst_nr_zaehler,flst_nr_nenner,zusatz,flur_nr),
flst_nr_zaehler NOT NULL,
flur_nr NOT NULL,
gemark_schluessel REFERENCES gemark_schluessel_tab
);
Now I have data in the gemark_schluessel_tab which looks like this (a sample):
1 101 Borna
2 102 Draisdorf
Now I want to load data into my flurstuecke_tab with SQL*Loader, and there I have problems with my REF column gemark_schluessel.
One data record looks like this in my file (it is without geometry):
1|97|7||1|1|
If I try to load this record, it does not work. The reference (the system-generated OID) should be taken from gemark_schluessel_tab.
LOAD DATA
INFILE *
TRUNCATE
CONTINUEIF NEXT(1:1) = '#'
INTO TABLE FLURSTUECKE_TAB
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS (
flst_id,
flst_nr_zaehler,
flst_nr_nenner,
zusatz,
flur_nr,
gemark_schluessel REF(CONSTANT 'GEMARK_SCHLUESSEL_TAB',GEMARK_ID),
gemark_id FILLER
)
BEGINDATA
1|97|7||1|1|
Is there an error I made?
Thanks in advance
Tig
multiple duplicate threads:
to call an oracle procedure and sql loader in an unix script
Re: Can some one help he sql loader issue. -
SQL Loader Constraints with Column Objects and Nested Tables
I am working on loading a table that (God forbid) contains columns, column objects, and nested tables (which contain several levels of column objects). My question is: does SQL*Loader have a hidden, undocumented rule about how the column objects must be grouped in reference to the nested tables within the loader file? I can load the various column objects and nested tables fine right now; however, I am loading them all in a strange order. Can anyone answer this question? Thanks.
Peter
I just noticed that my email is wrong. If you can help, please send email to [email protected]
thanks. -
SQL Loader - BLOB & LONG COLUMNS
I have a table with 500 rows
200 out of these rows need to be exported from one database to another database.
These rows have long and blob columns with values. The long columns COULD contain all sorts of special characters.
How can this be achieved with SQL Loader? Is there any other tool that can be used to get this done?
Any help will be appreciated.
Hello,
What's your Oracle version? As Satish mentioned, you can use Data Pump for load and unload. Here is an additional way of loading LOB data.
To load data up to 32K you can use VARCHAR2(32656), but it's not a good idea to load a CLOB that way, because lengths vary and you can easily hit "string literal too long". So you have two choices: use either a procedure or an anonymous block to load the CLOB data.
First method -- I loaded alert.log successfully, and you can imagine how big this file can be (5 MB, up to 4 GB, in my test case):
CREATE OR REPLACE DIRECTORY DIR AS '/myapth/logs';
DECLARE
clob_data CLOB;
clob_file BFILE;
BEGIN
INSERT INTO t1clob
VALUES (EMPTY_CLOB ())
RETURNING clob_text INTO clob_data;
clob_file := BFILENAME ('DIR', 'wwalert_dss.log');
DBMS_LOB.fileopen (clob_file);
DBMS_LOB.loadfromfile (clob_data,
clob_file,
DBMS_LOB.getlength (clob_file));
DBMS_LOB.fileclose (clob_file);
COMMIT;
END;
Second method -- use SQL*Loader:
LOAD DATA
INFILE alert.log "STR '|\n'"
REPLACE INTO TABLE t1clob
(
clob_text CHAR(30000000)
)
Hope this helps.
Regards -
SQL Loader - Concatenating 2 columns
Can you concatenate 2 columns using SQL*Loader?
For example, from a flat file I have ...col1,col2...
I need to concatenate col1+col2 to make one column.
Thanks in advance,
Jim
Hello,
Yes -- SQL*Loader can apply a SQL expression to a field, and the expression may reference other fields with a leading colon. For example, assuming a target column named col3 (the file and column names here are illustrative):
load data
infile 'data.csv'
append
into table mytable
fields terminated by "," optionally enclosed by '"'
(col1,
col2,
col3 ":col1 || :col2")
Regards -
How to skip footer record in SQL*Loader input file
I am using SQL*Loader in a batch import process.
The input files to SQL*Loader have a header record and footer record - always the 1st and last records in the file.
I need SQL*Loader to ignore these two records in every file when performing the import. I can easily ignore the header by using the SKIP function.
Does anybody know how to ignore the last record (footer) in an input file??
I do not want to physically pre-strip the footer, since the business wants all data files to have the header and footer records.
Thanks -- how do I use the WHEN clause to specify the last line of the input file?
I am presuming it requires me to have a unique identifier at a given position on that last line. If I don't have such an identifier can I still use your solution?
Cheers
Why not put an identifier at the end of your input file: echo "This_is_the_End" >> input_file ? -
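If a trailer marker isn't acceptable either, a hedged alternative is to strip the last line in the batch script before invoking sqlldr, so only the header needs SKIP=1 (file and control-file names below are illustrative):

```shell
#!/bin/sh
# Drop the footer (last line) before the load; the header is still
# handled by SKIP=1 on the sqlldr side.
printf 'HEADER\n1,a\n2,b\nFOOTER\n' > input_file   # sample file
sed '$d' input_file > input_noftr                  # remove last line
cat input_noftr
# sqlldr userid=user/pass control=my.ctl data=input_noftr skip=1
```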
SQL*Loader: a column value in the data file contains a comma (,)
Hi Friends,
I am getting an issue while loading a CSV file into the database.
A column in the data file contains a comma -- how do I load such data?
For ex, a record in data file is :
453,1,452,N,5/18/2006,1,"FOREIGN, NON US$ CORPORATE",,,310
Here "FOREIGN, NON US$ CORPORATE" is a single column value that contains a comma.
I have specified OPTIONALLY ENCLOSED BY '"' as well, but it still isn't working.
Here is my control file:
options (errors=100)
load data
infile 'TAX_LOT_DIM_1.csv'
badfile 'TAX_LOT_DIM_1.bad'
replace
into table TAX_LOT_DIM
fields terminated by ',' optionally enclosed by '"'
trailing nullcols
(
TAX_LOT_DIM_ID ,
TAX_LOT_NBR ,
TAX_LOT_ODS_ID ,
RESTRICTION_IND ,
LAST_UPDATE_DTM ,
TRAN_LOT_NBR integer,
MGR_GRP_CD optionally enclosed by '"' ,
RESTRICTION_AMT "TO_NUMBER(:RESTRICTION_AMT,'99999999999999999999.999999999999')" ,
RESTRICTION_INFO ,
SRC_MGR_GRP_CD
)
Problem is with MGR_GRP_CD column in ctrl file.
Please reply asap.
Regards,
Kishore
Thanks for the response.
Actually my control file is like this, with some conversion functions:
replace
into table TAX_LOT_DIM
fields terminated by ',' optionally enclosed by '"'
trailing nullcols
(
TAX_LOT_DIM_ID "TO_NUMBER(:TAX_LOT_DIM_ID ,'999999999999999.99999999')",
TAX_LOT_NBR ,
TAX_LOT_ODS_ID "to_number(:TAX_LOT_ODS_ID ,'999999999999999.999999')",
RESTRICTION_IND ,
LAST_UPDATE_DTM "to_date(:LAST_UPDATE_DTM ,'mm/dd/yyyy')",
TRAN_LOT_NBR integer, --"TO_NUMBER(:TRAN_LOT_NBR,'999999999999999.99999999999999999')",
MGR_GRP_CD char optionally enclosed by '"' ,
RESTRICTION_AMT "TO_NUMBER(:RESTRICTION_AMT,'99999999999999999999.999999999999')" ,
RESTRICTION_INFO ,
SRC_MGR_GRP_CD
)
For CHAR columns, even if I don't give any datatype, I think it will work.
And hopefully the problem is not with this.
Thanks,
Kishore -
How to specify Sql Loader input file with dynamic name
The input file name looks like pochange_YYYYMMDD.dat, and the file is generated by another program each day.
I want to load this file into a table.
Any help? Thanks a lot
I thought of a strategy to do this:
Conditions:
- You must have only one file in the folder with the data.
- Independently of that file's name, a process always renames it to a fixed name.
- The control file must be at another fixed path.
The script is this one:
joel_sqlloader.bat
ren *.dat always.txt
sqlldr <user>/password data=always.txt control=<another_path>/my_ctl.ctl
- In a folder you will have these files:
* joel_sqlloader.bat
* pochange_YYYYMMDD.dat
- In another folder you will have:
* my_ctl.ctl
Did you get it ?
Joel Pérez
http://otn.oracle.com/experts -
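A Unix-flavored variant of the same idea, kept as a sketch (the user, control file name, and naming pattern are assumptions): sqlldr's DATA command-line parameter can name the input file at run time, so the control file never needs the date-stamped name at all:

```shell
#!/bin/sh
# Find today's drop (whatever pochange_*.dat is present) and pass it
# to sqlldr via data=, instead of renaming it to a fixed name.
printf '1,foo\n' > pochange_20070619.dat   # sample drop for the demo
in_file=$(ls pochange_*.dat 2>/dev/null | head -n 1)
[ -n "$in_file" ] || { echo "no pochange_*.dat file found" >&2; exit 1; }
echo "loading $in_file"
# sqlldr userid=user/pass control=my_ctl.ctl data="$in_file"
```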
How to load a default value in to a column when using sql loader
I'm trying to load from a flat file using SQL*Loader.
For one column I need to load a default value.
How do I go about this?
Hi!
try this code --
LOAD DATA
INFILE 'sample.dat'
REPLACE
INTO TABLE emp
(
empno POSITION(01:04) INTEGER EXTERNAL NULLIF empno=BLANKS,
ename POSITION(06:15) CHAR,
job POSITION(17:25) CHAR,
mgr POSITION(27:30) INTEGER EXTERNAL NULLIF mgr=BLANKS,
sal POSITION(32:39) DECIMAL EXTERNAL NULLIF sal=BLANKS,
-- default comm to 100 when the field is blank
comm POSITION(41:48) "NVL(:comm,100)",
deptno POSITION(50:51) INTEGER EXTERNAL NULLIF deptno=BLANKS,
hiredate SYSDATE
)
Hope this will solve your purpose.
Regards.
Satyaki De. -
SQL*Loader and binary data
I have a C routine that builds SQL*Loader input files. The input files contain multiple records, with a couple of integer columns and a RAW(1400) field. The control file specifies a record separator of '|', which seems weird (having text in the middle of raw data) but is also somewhat cooperative (see below).
So I basically write each integer to the file, then a short (2-byte) length value for the raw field, then the raw field, then the '|' separator.
I've noticed that if the size of the raw field is 400 bytes or less, everything works fine and I get the correct number of records in the database.
Unfortunately, with a size of 401 or more, SQL*Loader parses the input into twice as many records as it should. So if I've written 3 records to my input data file, with each record's raw field at 400 bytes or less, I get 3 records loaded; but with any raw field of 401+ bytes, I get two records for each.
Any ideas why, and how to correct this? Also, any ideas on a better way to do this? All the examples of large data in the online documentation and the O'Reilly book favor large character data, which does me not much good.
tia. john -
"Derived Column" failed because truncation occurred !!!!
Hi Friends,
I have got this error and my package execution keeps failing, so please find my error below.
Error message:
Executed as user: BSSLOCAL\DB-CLUS-SQL-Server. Microsoft (R) SQL Server Execute Package Utility Version 11.0.2100.60 for 32-bit Copyright (C) Microsoft Corporation. All rights reserved. Started: 08:41:00 Error: 2014-02-03
08:41:13.92 Code: 0xC020902A Source: Upload to Sharepoint Derived Column [14] Description:The "Derived Column" failed because truncation occurred, and the truncation row disposition
on "Derived Column.Outputs[Derived Column Output].Columns[SDate]" specifies failure on truncation. A truncation error occurred on the specified object of the specified component. End Error Error: 2014-02-03 08:41:13.95 Code:
0xC0047022 Source: Upload to Sharepoint SSIS.Pipeline Description: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Derived Column" (14) failed with error code 0xC020902A while processing
input "Derived Column Input" (15). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There
may be error messages posted before this with more information about the failure. End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 08:41:00 Finished: 08:41:17 Elapsed: 17.5 seconds. The
package execution failed. The step failed.
Please tell me how I can rectify this error. Thanks in advance.
Hi Cherukuri19,
If I understand correctly, you checked the data type of the input column in the Advanced Editor for the Derived Column component, right? Based on the error message, it seems the length of the column output by the component above the Derived Column (maybe the source component) is more than 10 characters. However, if you just modify the input column's data type and length inside the Derived Column component, the data truncation error occurs. In this case, you can avoid the issue with the following steps:
Keep the original nature of the SDate input column, i.e. don't set the data type of this column to [DT_STR] or change its length.
Use the following expression instead for the Derived Column:
(DT_STR,10,1252)((DT_DBDATE)SDate)
Regards,
Mike Yin
TechNet Community Support -
Sql*loader and nested tables
I'm having trouble loading a nested table via sqlldr in Oracle 10g and was hoping someone could point me in the right direction. I keep getting the following error:
SQL*Loader-403: Referenced column not present in table mynamespace.mytable
Here's an overview of my type and table definitions:
create type mynamespace.myinfo as object
(
i_name varchar2(64),
i_desc varchar2(255)
);
create type mynamespace.myinfotbl as TABLE of mynamespace.myinfo;
create table mynamespace.mytable
(
Info mynamespace.myinfotbl,
note varchar2(255)
)
NESTED TABLE Info STORE AS mytable_nested_tab;
My control file looks like this:
load data
infile 'mydatafile.csv'
insert into table mynamespace.mytable
fields terminated by ',' optionally enclosed by '"'
TRAILING NULLCOLS
(
Info nested table count(6)
(
Info column object
(
i_name char(64),
i_desc char(255)
)
),
note
)
Example mydatafile.csv would be something like:
lvl1,des1,lvl2,des2,lvl3,des3,lvl4,des4,lvl5,des5,lvl6,des6,a test data set
I can't figure out why sqlldr keeps rejecting this control file. I'm using 'direct=false' in my .par file.
Any hints?
I just noticed that my email is wrong. If you can help, please send email to [email protected]
thanks.