Sql Loader - Decimal numbers showing in null column
Greetings,
My apologies if this is in the wrong forum section. It seemed to be the most logical.
I have added a new column to a control file used in a SQL*Loader upload and I am getting unexpected results. Long story short: I copy FoxPro tables from a network directory to my local PC. A FoxPro exe converts these tables to .dat files. SQL*Loader then uploads the .dat files to matching Oracle tables. I've run this program from my PC for years with no problems.
Problem now: We added a new column to a FoxPro table and to the matching Oracle table. This column in FoxPro is null for now - no data at all. I then added the new column to my ctl file for this table. The program runs, and SQL*Loader does its thing with no errors. However, in the new field in Oracle, I'm finding decimal numbers in many of the records, when all records should have null values in this field. I've checked all other columns in the Oracle table and the data looks accurate. I'm not sure why I'm getting these decimal values in the new column.
My log and bad files show no hints of any problems. The bad file is empty for this table.
At first I thought the positions of the new column in the FoxPro table, the .ctl file and the Oracle table were not lining up correctly, but I checked and they are.
I've double checked the FoxPro table and all records for this new column are null.
I'm not sure what to check next or what to test. I am hoping someone in this forum might lend a clue or may have seen this problem before. Below is my control file. The new column is the last one: fromweb_id. It is a number field in both FoxPro and Oracle.
Thanks for any advice.
JOBS table control file:
load data
infile 'convdata\fp_ora\JOBS.dat' "str X'08'"
into table JOBS
fields terminated by X'07'
TRAILING NULLCOLS
(SID,
CO_NAME "replace(replace(:CO_NAME,chr(11),chr(10)),chr(15),chr(13))",
JOB_TITLE "replace(replace(:JOB_TITLE,chr(11),chr(10)),chr(15),chr(13))",
CREDITS,
EARN_DATE date "mm/dd/yyyy",
COMMENTS CHAR(2000) "replace(replace(:COMMENTS,chr(11),chr(10)),chr(15),chr(13))",
DONT_SHOW,
PC_SRC "replace(replace(:PC_SRC,chr(11),chr(10)),chr(15),chr(13))",
PC_SRC_NO,
SALARY,
SALFOR,
ROOM,
BOARD,
TIPS,
UPD_DATE date "mm/dd/yyyy hh12:mi:ss am",
STUKEY,
JOBKEY,
JO_COKEY,
JO_CNKEY,
JO_ZUKEY,
EMPLID,
CN_NAME "replace(replace(:CN_NAME,chr(11),chr(10)),chr(15),chr(13))",
JOB_START date "mm/dd/yyyy",
JOB_END date "mm/dd/yyyy",
FROMWEB_ID)
I apologize for not explaining how this was resolved. SQL*Loader was working as it should.
The problem was due to new fields being added to the FoxPro table, along with the fromweb_id column, that I was not informed about. I was asked to add a column named fromweb_id to the oracle jobs table and to the sql-loader program. I was not told that there were other columns added at the same time. In the foxpro table, the fromweb_id column was the last column added.
The jobs.dat file contained data from all columns in the foxpro table, including all the new columns. I only added the "fromweb_id" to the control file, which is what I was asked to do. When it ran, it was getting values from one of the new columns and the values were being uploaded into the fromweb_id column in Oracle. It is that simple.
When I had checked the FoxPro table earlier, I did not pick up on the other new columns; I was focusing on looking for values in the fromweb_id column. When back-tracing data in the jobs.dat file, I found a value in the fromweb_id column that matched a value in a different (new) column in FoxPro. That is when I noticed the other new columns and instantly knew what the problem was.
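For anyone hitting something similar: delimited fields in the .dat file are matched to control-file columns purely by position, so any field not listed in the control file shifts every later value. A made-up sketch (NEWCOL_X stands in for the unannounced FoxPro column):

```
-- hypothetical .dat record, X'07' delimiters shown as '|':
--   101|Acme Inc|42|17
-- FoxPro column order: SID, CO_NAME, NEWCOL_X, FROMWEB_ID
load data
into table JOBS
fields terminated by X'07'
(SID,         -- receives 101
 CO_NAME,     -- receives Acme Inc
 FROMWEB_ID)  -- receives 42: the value of the unlisted NEWCOL_X
```

Declaring NEWCOL_X as a FILLER field (or listing it in its real position) restores the alignment.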
Thanks for all the feedback. I'm sorry if this was an inconvenience to anyone. I'll try to dig a little deeper next time. Lessons learned...
regards,
Similar Messages
-
SQL*Loader-418: Bad datafile datatype for column XMLDOC
Hi,
I am trying to load an XML document into an XMLType column in a table using SQL*Loader and I receive this error:
SQL*Loader-418: Bad datafile datatype for column XMLDOC
My ctl is:
LOAD DATA INFILE 'marginalpdbc_xml_20030529.xml'
APPEND INTO TABLE PRUEBA_CARGA
( XMLDOC LOBFILE (CONSTANT 'marginalpdbc_xml_20030529.xml')
TERMINATED BY EOF,
NOMBFICH CONSTANT 'marginalpdbc_xml_20030529.xml' )
And the table is:
create table prueba_carga (NOMBFICH VARCHAR2(200),xmldoc xmltype);
What is wrong with my ctl?
I am using SQL*Loader: Release 9.2.0.1.0 and Oracle9i Enterprise Edition Release 9.2.0.1.0
Thanks in advance.
Looks like there is data which takes more than 8 digits for the 'DTE_ADDED' column.
Please check that first.
Can you provide a couple of lines of data for sampling? -
Sql Loader INFILE name value in table column Value
Hi,
Here is my Sql Loader Script
LOAD DATA
infile '%1'
APPEND INTO TABLE XX_SUPPLIER_UPD
FIELDS TERMINATED BY ";" OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
ACTION Char
,ADDRESS_TYPE Char
,REGION Char "LTRIM(RTRIM(:REGION))"
,PO_BOX Char
,,WWW_ADDRESS Char
,status Char "NVL(:status,'X')"
,filename Char "replace(:infile,'\"','')"
I am getting the infile name as a parameter and I want to insert that parameter value into the column named filename. Can anyone guide me on how to do this?
Cheers!
Jayaraj.S
If you were to use external tables instead of SQL*Loader, you can dynamically change the location of the external table (i.e. the filename) using a simple ALTER TABLE statement.
External tables also mean that all the control is inside the database rather than relying on external utilities and external scripts.
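A sketch of that approach (directory, table and file names below are invented for illustration):

```
CREATE TABLE xx_supplier_ext (
  action  VARCHAR2(30),
  region  VARCHAR2(60),
  status  VARCHAR2(10)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('suppliers_01.csv')
);

-- repoint the same table at the next file; no control file changes needed
ALTER TABLE xx_supplier_ext LOCATION ('suppliers_02.csv');
```

Since the load then becomes a plain INSERT ... SELECT from the external table, the current file name can be supplied as a literal in that statement, which covers the filename column from the original question.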
;) -
Hello,
I need to import data with a field containing floating-point numbers. When I do this load on Linux it goes OK, but when I try to load the same data from my Windows machine there is a problem with the decimal point character. (My Oracle server is 9i on Linux.)
I tried to use an after-logon trigger to issue the statement ALTER SESSION SET NLS_NUMERIC_CHARACTERS (enclosed in an EXECUTE IMMEDIATE statement), but it didn't help.
Thanks for advice
sasa
OK, I have the following values in the file to import:
45.5
50.3
60.2
38.7
("." is decimal point)
If I use sqlldr on Linux it is imported correctly. If I try to import the data from a Windows machine I get the error message ORA-01722: invalid number. So somewhere on my Windows machine another decimal separator is set, and I would like to change it to ".", but I don't know where I can change it.
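One way to sidestep the client-side setting entirely (table, column and file names below are made up) is to do the conversion explicitly in the control file, so the session's NLS_NUMERIC_CHARACTERS no longer matters:

```
load data
infile 'numbers.dat'
into table my_table
(val "TO_NUMBER(:val, '99999D9', 'NLS_NUMERIC_CHARACTERS=''.,''')")
```

Alternatively, setting NLS_LANG (or NLS_NUMERIC_CHARACTERS in the Windows environment/registry) before running sqlldr should make the period the decimal separator for that client session.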
Is it more understandable? -
SQL*Loader: How to use Sequence and REF together?
Hi,
I'm getting the following error:
SQL*Loader-418: Bad datafile datatype for column AREA_ID.
I attempt to upload one datafile into 4 different tables.
When executing sqlldr scott/tiger control=sqlldr_aj_new.ctl
1) Here is my Control file (sqlldr_aj_new.ctl):
load data
infile 'TST_MAIN_NEW.csv'
-- Loads table "TST_AREAS"
into table TST_AREAS
REPLACE
FIELDS TERMINATED by ','
(AREA_ID SEQUENCE,
AREA_NAME,
PRODUCT_ID,
PRIORITY_ID,
PLAN_ID,
CREATED_BY_ID,
AREA_DESC)
-- Loads table "TST_TEMPLATE_SCENS"
into table TST_TEMPLATE_SCENS
REPLACE
FIELDS TERMINATED by ','
(TEMPLATE_SCEN_ID SEQUENCE,
SCENARIO_NAME,
MODIFIED_BY_ID,
SCEN_TYPE_ID,
OWNER_ID,
PRIORITY_ID,
SCOPE_ID,
SCENARIO_DESC,
AREA_ID REF(CONSTANT 'TST_AREAS', AREA_ID))
2) Here is my data file(TST_MAIN_NEW.csv)
B/S,105,1,plan_id_1,2222,area_desc1,CCEMC_PR_001,1111,1,2222,2,1,scenario_desc_1
B/S,105,1,plan_id_2,2222,area_desc2,CCEMC_PR_002,1111,2,2222,2,2,scenario_desc_2
3) Here are the tables:
TST_AREAS
Name Null? Type
AREA_ID NOT NULL NUMBER(35)
AREA_NAME CHAR(64)
PRODUCT_ID NUMBER(35)
PRIORITY_ID NUMBER(35)
PLAN_ID NUMBER(35)
CREATED_BY_ID NUMBER(35)
AREA_DESC CHAR(1000)
TST_TEMPLATE_SCENS
Name Null? Type
TEMPLATE_SCEN_ID NOT NULL NUMBER(35)
SCENARIO_NAME CHAR(500)
MODIFIED_BY_ID NUMBER(35)
SCEN_TYPE_ID NUMBER(35)
OWNER_ID NUMBER(35)
PRIORITY_ID NUMBER(35)
SCOPE_ID NUMBER(35)
SCENARIO_DESC CHAR(1000)
AREA_ID NOT NULL NUMBER(35)
Please advise what is the reason for an error.
Thank you very much!
Andrey
I don't think so.
SQL> create table t (
2 rowid number);
rowid number)
ERROR at line 2:
ORA-00904: : invalid identifier
Please post your version number and the DDL for the table.
Best practice in Oracle is to NEVER, EVER, store a ROWID: They can change.
After you have solved the first problem, that you've never actually built the table, the solution to the sequence use is to make the call in your DML.
INSERT INTO t (testcol) VALUES (<sequence_name>.NEXTVAL); -
Is there a way to load spaces using SQL*Loader, given that the target table has a NOT NULL column with a default value? My data file has spaces in a number field instead of zeros. Since my table column is declared NOT NULL, the load fails even though I have a default value of '0' on the column.
You can tackle this in the control file. Here is a snippet from Oracle OnLine books - hopefully this will help out! Cheers.
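For instance (table and column names here are illustrative), a field of blanks can be forced to zero in a NOT NULL numeric column with DEFAULTIF:

```
load data
infile 'amounts.dat'
into table amounts_tab
fields terminated by ','
(id,
 amt INTEGER EXTERNAL DEFAULTIF amt=BLANKS)
```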
==========================================
Setting a Column to Null or Zero
If you want all inserted values for a given column to be null, omit the column's specifications entirely. To set a column's values
conditionally to null based on a test of some condition in the logical record, use the NULLIF clause; see NULLIF Keyword. To set a
numeric column to zero instead of NULL, use the DEFAULTIF clause, described next.
DEFAULTIF Clause
Using DEFAULTIF on numeric data sets the column to zero when the specified field condition is true. Using DEFAULTIF on
character (CHAR or DATE) data sets the column to null (compare with Numeric External Datatypes). See also Specifying Field
Conditions for details on the conditional tests.
DEFAULTIF field_condition
A column may have both a NULLIF clause and a DEFAULTIF clause, although this often would be redundant.
Note: The same effects can be achieved with the SQL string and the DECODE function. See Applying SQL Operators to
Fields
NULLIF Keyword
Use the NULLIF keyword after the datatype and optional delimiter specification, followed by a condition. The condition has the same
format as that specified for a WHEN clause. The column's value is set to null if the condition is true. Otherwise, the value remains
unchanged.
NULLIF field_condition
The NULLIF clause may refer to the column that contains it, as in the following example:
COLUMN1 POSITION(11:17) CHAR NULLIF (COLUMN1 = "unknown")
This specification may be useful if you want certain data values to be replaced by nulls. The value for a column is first determined from
the datafile. It is then set to null just before the insert takes place. Case 6: Loading Using the Direct Path Load Method provides
examples of the NULLIF clause.
Note: The same effect can be achieved with the SQL string and the NVL function. See Applying SQL Operators to Fields.
Null Columns at the End of a Record
When the control file specifies more fields for a record than are present in the record, SQL*Loader must determine whether the
remaining (specified) columns should be considered null or whether an error should be generated. The TRAILING NULLCOLS
clause, described in TRAILING NULLCOLS, explains how SQL*Loader proceeds in this case.
========================================== -
Sql loader: loading external file into blob
hi,
I keep getting blocked when loading external binary files into BLOB columns with the SQL*Loader tool.
Here is the problem:
I want to load pictures into a table with lob columns. Here is my control file, data file and the error message:
control file:
LOAD DATA
APPEND
INTO TABLE T_K58_SYMBOLE
FIELDS TERMINATED BY ';' TRAILING NULLCOLS
(SYMB_ID INTEGER EXTERNAL(8),
SYMB_LMLABEL CHAR(100),
imgName FILLER,
SYMB_GIF_HI BFILE(CONSTANT ".", imgName),
thumbName FILLER,
SYMB_GIF_LOW BFILE(CONSTANT ".", thumbName))
data file:
1;0800;image1.gif;image1.gif;
error message:
SQL*Loader-418: Bad datafile datatype for column SYMB_GIF_HI
Here is the script of the creation of my table in my database:
create table T_K58_SYMBOLE (
SYMB_ID NUMBER(8) not null,
SYMB_LMLABEL VARCHAR2(100),
SYMB_GIF_HI BLOB,
SYMB_GIF_LOW BLOB,
constraint PK_T_K58_SYMBOLE primary key (SYMB_ID))
LOB (SYMB_GIF_HI, SYMB_GIF_LOW) STORE AS (tablespace TDK5813)
tablespace TDK5811
Please, I need help!
This is my control file. I'm loading images into a database table using sqlldr on Unix:
LOAD DATA
INFILE 'sampledata.dat'
INTO TABLE image_table
APPEND
FIELDS TERMINATED BY ',' optionally enclosed by '"'
(IMAGE_ID INTEGER EXTERNAL,
FILE_NAME CHAR,
IMAGE_DATA LOBFILE(FILE_NAME) TERMINATED BY EOF)
I'm facing following error:
SQL*Loader-350: Syntax error at line 9.
Expecting "," or ")", found "LOBFILE".
IMAGE_DATA LOBFILE(FILE_NAME) TERMINATED BY EOF
My data file is:
1,"IMG_3126.jpg"
2,"IMG_3127.jpg"
3,"IMG_3128.jpg"
and the images are on the same server. Please help me get out of this.
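For reference, the documented shape for pulling file contents into a BLOB looks like the sketch below (assuming the parentheses around the field list were simply lost when posting). Note the distinction: a BFILE(...) clause, as in the first control file above, loads into BFILE columns (pointers to external files), which may be why a load of BFILE fields into BLOB columns fails with SQL*Loader-418; LOBFILE reads the file contents into the BLOB itself.

```
LOAD DATA
INFILE 'sampledata.dat'
APPEND
INTO TABLE image_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(IMAGE_ID   INTEGER EXTERNAL,
 FILE_NAME  CHAR(200),
 IMAGE_DATA LOBFILE(FILE_NAME) TERMINATED BY EOF)
```

A SQL*Loader-350 syntax error at the LOBFILE keyword can also indicate a client version that predates LOBFILE support, so it is worth checking the sqlldr release as well.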
Thanks,
Surjeet Kaur -
Problem in Data Migration via. SQL*Loader
Hi,
I am trying to load the data from a text generated file.
While using the sqlldr command along with the ".ctl", ".log", ".bad" and ".dat" parameters, all the records go to the ".bad" file. Why?
Even though it parses the control file successfully.
Could anybody please help me in this matter?
I am sending the ".log" file generated during the transaction :
========================================================================
Control File: CUSTOMER.CTL
Data File: CUSTOMER.DAT
Bad File: CUSTOMER.BAD
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table CUSTOMER_BACKUP, loaded from every logical record.
Insert option in effect for this table: INSERT
Column Name Position Len Term Encl Datatype
CS_CODE FIRST * WHT O(") CHARACTER
CS_NAME NEXT * WHT O(") CHARACTER
CS_ADD1 NEXT * WHT O(") CHARACTER
CS_ADD2 NEXT * WHT O(") CHARACTER
CS_ADD3 NEXT * WHT O(") CHARACTER
CS_ADD4 NEXT * WHT O(") CHARACTER
CS_PIN NEXT * WHT O(") CHARACTER
CS_PHONE NEXT * WHT O(") CHARACTER
CS_TELEX NEXT * WHT O(") CHARACTER
CS_CR_DAYS NEXT * WHT O(") CHARACTER
CS_CR_LIM NEXT * WHT O(") CHARACTER
CS_TRD_DIS NEXT * WHT O(") CHARACTER
CS_CSH_DIS NEXT * WHT O(") CHARACTER
CS_STX_FRM NEXT * WHT O(") CHARACTER
CS_LSTX_NO NEXT * WHT O(") CHARACTER
CS_MOB_BAL NEXT * WHT O(") CHARACTER
CS_STX_PER NEXT * WHT O(") CHARACTER
CS_IND NEXT * WHT O(") CHARACTER
CS_CSTX_NO NEXT * WHT O(") CHARACTER
CS_SLMN_CD NEXT * WHT O(") CHARACTER
CS_BANK_1 NEXT * WHT O(") CHARACTER
CS_BANK_2 NEXT * WHT O(") CHARACTER
CS_BANK_3 NEXT * WHT O(") CHARACTER
CS_YOB_BAL NEXT * WHT O(") CHARACTER
CS_CURR NEXT * WHT O(") CHARACTER
CS_ZONE NEXT * WHT O(") CHARACTER
CS_CAT NEXT * WHT O(") CHARACTER
F_EDT NEXT * WHT O(") CHARACTER
F_UID NEXT * WHT O(") CHARACTER
F_ACTV NEXT * WHT O(") CHARACTER
CS_RANGE NEXT * WHT O(") CHARACTER
CS_ITNO NEXT * WHT O(") CHARACTER
CS_INT NEXT * WHT O(") CHARACTER
CS_CIRCLE NEXT * WHT O(") CHARACTER
CS_ECCCODE NEXT * WHT O(") CHARACTER
CS_MFRCODE NEXT * WHT O(") CHARACTER
CS_QTY_BUG NEXT * WHT O(") CHARACTER
CS_VAL_BUG NEXT * WHT O(") CHARACTER
Record 1: Rejected - Error on table CUSTOMER_BACKUP, column CS_YOB_BAL.
Column not found before end of logical record (use TRAILING NULLCOLS)
Record 2: Rejected - Error on table CUSTOMER_BACKUP, column CS_CURR.
Column not found before end of logical record (use TRAILING NULLCOLS)
Record 3: Rejected - Error on table CUSTOMER_BACKUP, column CS_CURR.
Column not found before end of logical record (use TRAILING NULLCOLS)
Column not found before end of logical record (use TRAILING NULLCOLS)
Record 50: Rejected - Error on table CUSTOMER_BACKUP, column CS_MOB_BAL.
Column not found before end of logical record (use TRAILING NULLCOLS)
Record 51: Rejected - Error on table CUSTOMER_BACKUP, column CS_MOB_BAL.
Column not found before end of logical record (use TRAILING NULLCOLS)
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table CUSTOMER_BACKUP:
0 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 254904 bytes(26 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 51
Total logical records rejected: 51
Total logical records discarded: 0
Run began on Wed Aug 28 13:30:29 2002
Run ended on Wed Aug 28 13:30:29 2002
Elapsed time was: 00:00:00.18
CPU time was: 00:00:00.08
=================================================================
Regards,
Bimal
Hi,
I am a bit late (more than a bit, really), but if you use TRAILING NULLCOLS in the control file your problem should be fixed.
TRAILING NULLCOLS tells SQL*Loader to treat any relatively positioned columns that are not present in the record as null columns.
For example, if the following data
10 Accounting
is read with the following control file
INTO TABLE dept
TRAILING NULLCOLS
( deptno CHAR TERMINATED BY " ",
dname CHAR TERMINATED BY WHITESPACE,
loc CHAR TERMINATED BY WHITESPACE)
and the record ends after DNAME. The remaining LOC field is set to null. Without the TRAILING NULLCOLS clause, an error would be generated due to missing data. -
How to make sqlldr handle mutliple NULL columns
I am using TRAILING NULLCOLS in the control file to tell SQL*Loader to treat any relatively positioned columns that are not present in the record as null columns.
But the problem occurs when there are multiple NULL columns one after the other. In that case SQL*Loader treats the continuous null columns as one NULL column, and then I get the following message:
ORA-12899: value too large for column
The reason is that the wrong value is being inserted into the wrong column.
How can I stop SQL*Loader from combining multiple continuous NULL columns, and make it treat each of them separately?
I have been able to take care of the issue using the NULLIF attribute. Here is how my control file looks now:
LOAD DATA
INFILE 'tab_sal_ldr.tab'
INSERT INTO TABLE SQL_TAB_LDR
REPLACE
FIELDS TERMINATED BY "\t"
TRAILING NULLCOLS
(COL1 CHAR,
COL2 INTEGER EXTERNAL NULLIF COL2=BLANKS,
COL3 FLOAT EXTERNAL NULLIF COL3=BLANKS,
COL4 DATE EXTERNAL NULLIF COL4=BLANKS)
However it is rejecting the rows having all NULLs. Is it possible to load rows having all column values NULL?
Following is the snippet from the log:
2 Rows not loaded because all fields were null.
Edited by: Parag Kalra on May 4, 2010 3:31 AM -
I was wondering if anyone knows a way to prevent ORA-00054 errors on tables when using SQL*Loader. I'm currently invoking scripts that extract data from a source system and load it into a table in my database. I invoke the scripts concurrently, because I have multiple source systems, to improve the overall time for the loads. Sometimes I get failures on the loads due to ORA-00054 errors. Can this be prevented? Is there a WAIT command/option that can be turned on for SQL*Loader?
desc toto
Name Null? Type
COL1 DATE
Controlfile :
load data
infile 'titi.dat'
truncate
into table titi
(col1 position(1:7) DATE "YYYYMMDD" "DECODE (:col1, '9999999','19990101','0000000',null,:col1 + 19000000)")
Indeed, the input date format is SYYMMDD, where S=0 if year=19xx and S=1 if year=20xx.
Thanks for your reply. -
Re: Sql*loader 11g - Error ORA-12899
My Bad file has first 2 records like this:
MEMB_NUMBER,ID_NUMBER,ASSIGNED_MEMB_NUMBER,ASSOC_AMT,ASSOC_TYPE,DATE_ADDED,DATE_MODIFIED,OPERATOR_NAME,USER_GROUP,LOCATION_ID,
0000000107,0000828633, ,1.5,J,22-FEB-02,12-JUN-02,MSUM080_MEMB_CONV,00,,
0000002301,0000800007, ,297.5,J,03-AUG-00,12-JUN-02,MSUM080_MEMB_CONV,00,,
My Log file says:
Record 1: Rejected - Error on table OWBREP.MEMB_ENTITY, column ID_NUMBER.
ORA-12899: value too large for column "OWBREP"."MEMB_ENTITY"."ID_NUMBER" (actual: 20, maximum: 10)
Record 2: Rejected - Error on table OWBREP.MEMB_ENTITY, column ASSOC_AMT.
ORA-01722: invalid number
Description of the target table:
memb_number           varchar2(10 byte)   y
id_number             varchar2(10 byte)   y
assigned_memb_number  varchar2(15 byte)   y
assoc_amt             number(14,2)        y
assoc_type            char(1 byte)        y
date_added            date                y
date_modified         date                y
operator_name         varchar2(32 byte)   y
user_group            varchar2(2 byte)    y
location_id           number              y
Can you please tell me why sqlldr is throwing this error? The data seems correct to me.
Hi,
Now again I am facing a problem.
I mean my log file throws the error
"Record 2: Rejected - Error on table OWBREP.ADDRESS, column USER_GROUP.
ORA-12899: value too large for column "OWBREP"."ADDRESS"."USER_GROUP" (actual: 27, maximum: 2)"
and my record when I see in bad file is:
"0000810722,3,00000000,00000000,H,A,Y, , , , , ,1777 Hull Road, , , ,Mason,MI,48854, , , , ,12-MAR-02,N,0,,, ,00000000,0,FAC, , ,N,N, ,1777 Hull Road,Mason, MI 48854, , , , , , , ,12-MAR-02,10-FEB-05,FIX075_ADDR_VERSION_UPGRADE,00,,"
I have checked my control file and the table structure; they are the same.
The problem is that I have an address field before this user_group column with a value like "Mason,MI,48854", so SQL*Loader is reading these as separate columns because of the commas. That is why the order is getting messed up and I am getting this error. Can you suggest what I should do to resolve this kind of error? -
hi all,
We have Oracle 10g RAC on IBM AIX.
Today I will be uploading data into a table through SQL*Loader.
I want only the data which I am inserting to go into the table at the time of insertion, as I am using a sequence on a particular column A.
The problem is: if I shut down the database then I will not be able to insert data, and if I don't shut down the db, the value for column A is generated automatically when the application works.
I recommended that we stop the application's connectivity to the database while I perform the insertion, but some people are saying no, find another method.
What steps should I take?
OK, I think my question was not clear.
I have table XYZ
A B C
1 2 3
4 4 5
if the application is connected to the database, the value of column A is incremented
A B C
1 2 3
4 4 5
5 6 7
I have to insert into table XYZ using SQL*Loader, giving SEQUENCE(4,1) on column A.
But if the application is connected, it will give me a PK error,
as the value 5 has already been inserted by the application. So how can I stop this,
so that I can take the max value of the column and start the sequence from there?
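One option (a sketch only; xyz_seq is a made-up name for whatever sequence the application uses) is to have the loader draw its PK values from that same database sequence instead of SEQUENCE(4,1), so the two sides can never collide:

```
load data
infile 'xyz.dat'
append
into table XYZ
fields terminated by ','
(A EXPRESSION "xyz_seq.nextval",  -- generated value; no field consumed from the file
 B,
 C)
```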
Hope this makes it clear this time.
Sorry for my bad English -
SQL*LOADER example not working for REF
I was working on Oracle bulk load scripts using the SQL*Loader utility.
I have encountered an error while using the REF column option to resolve parent-child referential integrity.
The example given in the URL http://www.csee.umbc.edu/help/oracle8/server.815/a67792/ch05.htm Example 5-9 Loading Primary Key REF Columns
does not work, and generates the error: 'SQL*Loader-418: Bad datafile datatype for column DEPT_MGR'
example :
Control File
LOAD DATA
INFILE `sample.dat'
INTO TABLE departments_alt
FIELDS TERMINATED BY `,' OPTIONALLY ENCLOSED BY `"'
(dept_no CHAR(5),
dept_name CHAR(30),
dept_mgr REF(CONSTANT `EMPLOYEES', emp_id),
emp_id FILLER CHAR(32))
Data file (sample.dat)
22345, QuestWorld, 007,
23423, Geography, 000,
Could you please suggest any solutions for this?
Some of the quotes are wrong.
Try the sample from the 10.2 documentation
LOAD DATA
INFILE 'sample.dat'
INTO TABLE departments_alt
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(dept_no CHAR(5),
dept_name CHAR(30),
dept_mgr REF(CONSTANT 'EMPLOYEES', emp_id),
emp_id FILLER CHAR(32)) -
Hi All ,
I have a use case in SQL*Loader where I have two columns, A and B, in a table.
My use case for the control file is to get B from the CSV (it is a mandatory column). A is optional in the CSV: if A is populated, take that value for A; if not, default it from B.
Thing i have tried is :
B ,
A EXPRESSION : "NVL( :A, :B) " ,
This is failing, saying:
Invalid bind variable A in SQL string for column A.
Any ideas? I know we could define a function, but is there a better way to do it within the control file itself?
Thanks,
Bibin
Remove the word EXPRESSION.
SCOTT@orcl12c> host type test.csv
bval1,aval1,
bval2,,
SCOTT@orcl12c> host type test.ctl
load data
infile test.csv
into table a_table
fields terminated by ','
trailing nullcols
(b,
a "nvl (:a, :b)")
SCOTT@orcl12c> create table a_table
2 (b varchar2(5),
3 a varchar2(5))
4 /
Table created.
SCOTT@orcl12c> host sqlldr scott/tiger control=test.ctl log=test.log
SQL*Loader: Release 12.1.0.1.0 - Production on Fri Nov 15 13:46:40 2013
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 2
Table A_TABLE:
2 Rows successfully loaded.
Check the log file:
test.log
for more information about the load.
SCOTT@orcl12c> select * from a_table
2 /
B A
bval1 aval1
bval2 bval2
2 rows selected. -
SQL Loader: Null value in column
I have a tab-delimited file, in UNIX.
Some of the in-between columns have null values.
The records with null column values are failing to load.
I tried:
col10 nullif col10 ="(null)"
My control file is:
load data
infile 'abc.txt'
into table XX_data
fields terminated by X'09' optionally enclosed by '"'
TRAILING NULLCOLS
( col1,
col2 nullif col2 ="(null)",
col3)
Sample data is a tab-delimited file. For some reason, it has a problem reading the second date (+time) column in the same record when there are null values in preceding columns. It says:
ORA-01841: (full) year must be between -4713 and +9999, and not be 0
Col1 Col2 Col3 Col4 Col5 Col6 Col7 Col8 Col9 Col10 Col11 Col12
2000-01-03 10:38:05.733000000 XX AA Change 0 DOG CAT R 2000-01-03 10:38:05.733000000 GIRAFFE MONKEY COW
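If it helps anyone later: the fractional seconds in Col1 and Col10 need a TIMESTAMP mask rather than SQL*Loader's default DATE handling, and adjacent tabs should still parse as empty fields as long as the delimiter is plain X'09'. A sketch under those assumptions (column names follow the sample header; both edits may be needed together):

```
load data
infile 'abc.txt'
into table XX_data
fields terminated by X'09' optionally enclosed by '"'
TRAILING NULLCOLS
(col1  TIMESTAMP "YYYY-MM-DD HH24:MI:SS.FF9",
 col2  char nullif col2=BLANKS,
 col3  char nullif col3=BLANKS,
 col4  char nullif col4=BLANKS,
 col5  char nullif col5=BLANKS,
 col6  char nullif col6=BLANKS,
 col7  char nullif col7=BLANKS,
 col8  char nullif col8=BLANKS,
 col9  char nullif col9=BLANKS,
 col10 TIMESTAMP "YYYY-MM-DD HH24:MI:SS.FF9" nullif col10=BLANKS,
 col11 char nullif col11=BLANKS,
 col12 char nullif col12=BLANKS)
```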