SQL*Loader: puzzling with ORA-03106
Hello All,
I have a process that loads 3 files into an 8.1.7.2.0 database using direct path loads. If the database has been freshly started, it loads the first 2 files and then fails on the third with:
Error during upi fetch: [100]
ORA-03106: fatal two-task communication protocol error
SQL*Loader-704: Internal error: uldisconnect [-1]
ORA-00600: internal error code, arguments: [729], [64], [space leak], [], [], [], [], []
If the database has not been freshly started, it cannot load even a single file. But if I run the load without DIRECT, it works.
In a forum I found a hint pointing to the parameter SESSION_CACHED_CURSORS. I changed it from 100 to 0 and the load worked.
But I don't see the reason for that behaviour!
Any hint is appreciated!
Imre
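For anyone applying the same workaround: SESSION_CACHED_CURSORS is set in the init.ora, and a change there takes effect at the next instance restart. This is only a workaround sketch; the root cause is presumably a bug at that 8.1.7 patch level, so checking Metalink for a fix for ORA-00600 [729] during direct path loads is worth doing as well.

```text
# init.ora -- workaround: disable server-side session cursor caching
# (takes effect at the next instance restart)
session_cached_cursors = 0
```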
03135, 00000, "connection lost contact"
// *Cause: 1) Server unexpectedly terminated or was forced to terminate.
//         2) Server timed out the connection.
// *Action: 1) Check if the server session was terminated.
//          2) Check if the timeout parameters are set properly in sqlnet.ora.
I would suspect networking issues external to sqlldr & Oracle.
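The timeout parameter mentioned in the *Action* text lives in sqlnet.ora on the server side; a fragment worth checking (the parameter name is standard Oracle Net, the value here is only illustrative):

```text
# sqlnet.ora (server side)
# Dead-connection detection: probe idle client connections every N minutes
SQLNET.EXPIRE_TIME = 10
```

If a firewall sits between the client and the server, its idle-connection timeout is another common cause of ORA-03135 on long-running loads.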
Similar Messages
-
SQL*LOADER script with ORA-03135: connection lost contact error
Hello,
I have a really strange problem with a script that transfers data from one database to another. Each day we receive the raw data in zipped txt files, then we feed it to the database with SQL*Loader. This worked fine until two weeks ago. At first I thought the error must be in the script, but it hasn't changed at all in 3 months. The weirdest thing is that sometimes it works without any problems, but now I get "ORA-03135: connection lost contact" almost every time, and it seems to happen randomly. It never fails at the same place, though the data I feed the script is always the same.
I'm new to SQL*Loader, and since the script seems to send all the data, I don't understand where that error comes from. Can I do something on my workstation, or is the problem on the database server?
I think the problem may be that the connection is timing out on the server, but we never had that problem on the production server, and it has a lot more data to transfer than my little test script.
Have you any idea what the problem could be?

03135, 00000, "connection lost contact"
// *Cause: 1) Server unexpectedly terminated or was forced to terminate.
//         2) Server timed out the connection.
// *Action: 1) Check if the server session was terminated.
//          2) Check if the timeout parameters are set properly in sqlnet.ora.
I would suspect networking issues external to sqlldr & Oracle. -
SQL*Loader completed with ORA error
Hi all,
I'm trying to load a flat file via a SQL*Loader mapping. Even though it produces rejected rows, the mapping completes successfully!? How can I configure a SQL*Loader mapping to return an error after a data error occurs?
Regards, Uwe

Uwe,
I just performed some tests, and the following is the case:
- The maximum number of errors specifies how many errors you allow. If that number is reached, then no more records will be loaded.
- The deployment manager always shows finished successfully (this is not good, so I filed bug 3569480 for that).
- The RAB shows the correct status.
Does that confirm your experiences?
Thanks,
Mark. -
SQL*Loader-704 and ORA-12154 error messages when trying to load data with SQL*Loader
I have a database with two tables that is used by Apex 4.2. One table has 800,000 records; the other has 7 million records.
The client recently upgraded from Apex 3.2 to Apex 4.2. We exported/imported the data to the new location with no problems.
The source of the data is an old mainframe system; I needed to make changes to the source data and then load the tables.
The first time I loaded the data, I did it from the command line with SQL*Loader.
Now when I try to load the data I get this message:
SQL*Loader-704: Internal error: ulconnect OCISERVERATTACH
ORA-12154: TNS:could not resolve the connect identifier specified
I've searched for postings on these error messages, and they all seem to say that SQL*Loader can't find my TNSNAMES file.
I am able to connect and load data with SQL Developer, so SQL Developer is able to find the TNSNAMES file.
However, SQL Developer will not let me load a file this big.
I have also tried to load the file within Apex (SQL Workshop / Utilities) but again, the file is too big.
So it seems like SQL*Loader is the only option.
I did find one post online that said to set an environment variable with the path to the TNSNAMES file, but that didn't work.
Not sure what else to try or where to look.
Thanks

Hi,
You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure is mentioned in the link you were already pointed at) is the following (I assume you are on Windows?):
open a command prompt
set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
This will tell Oracle to use the config files it finds there and no others.
then try sqlldr user/pass@db (in the same dos window)
see if that connects and let us know.
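Put together, the check looks something like this in one cmd session (the paths, alias, and credentials here are placeholders, not values from this thread):

```bat
REM Point this session at a single, known-good tnsnames.ora directory
set TNS_ADMIN=c:\oracle\network\admin

REM Verify the alias resolves before involving SQL*Loader
tnsping mydb

REM Then run the load from the same window
sqlldr userid=scott/tiger@mydb control=load_memb.ctl log=load_memb.log
```

If tnsping resolves but sqlldr still fails, compare which Oracle home each executable comes from; mixed installations are the usual cause of SQL*Loader-704 with ORA-12154.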
Cheers,
Harry
http://dbaharrison.blogspot.com -
SQL Loader and Error ORA-01847/ORA-01839
Hi,
While using direct path loading in SQL*Loader, when we get ORA-01847/ORA-01839, all the other records get errored out. It works fine with conventional path loading.
Should I use some parameter or anything else to make sure that the other records are not rejected when we get an ORA-01847/ORA-01839 error during a DIRECT load?
Thanks
Jibin

On the internet I found this short message:
“AL32UTF8 is a multi-byte characterset; that means some characters are stored in more than 1 byte, and that's true for these special characters.
If you have same table definitions in both databases you likely face error ORA-12899.
This metalink note discusses this problem, it's also applicable to sqlloader:
Import reports "ORA-12899: Value too large for column" when using BYTE semantic
Doc ID: Note:563893.1”
On Metalink, I can see the note is linked to an Oracle internal bug for Oracle 11g...
I'm awaiting your suggestions... thanks very much in advance.
Regards.
Giovanni -
Sql loader error with date format
Hi everyone,
I have a table with a column RECORDED_DATE containing data like '20090224', and my client is asking me to load this column's data in 'yyyymmdd' format. I'm stuck for ideas. I used to_date('20090124','yyyymmdd') in the control file, but that is not working either. Here is my control file:
LOAD DATA
INFILE 'C:\xxxx\SQLLDR\HE data\HE_data_Feb.txt'
BADFILE 'C:\xxxx\SQLLDR\HE data.bad'
DISCARDFILE 'C:\xxxx\SQLLDR\HE data.dsc'
INTO TABLE LSCCMGR.FASTPAY_HE_DATA
REPLACE
fields terminated by X'09'
TRAILING NULLCOLS
(RECORDED_DATE "TO_DATE(:RECORDED_DATE,'mm/dd/yyyy')"
AGENT_ID
MEASURE
TRANSACTIONS
FEES
)
If I execute it like this, I get an error like:
Record 1: Rejected - Error on table LSCCMGR.FASTPAY_HE_DATA, column RECORDED_DATE.
ORA-01843: not a valid month
I get this for all records. What do I need to change to load RECORDED_DATE in date format? Please, can anyone help me resolve this issue?
How can we perform this using SQL*Loader? Please let me know. Thanks in advance.
Sravan

Hi,
>>(RECORDED_DATE "TO_DATE(:RECORDED_DATE,'mm/dd/yyyy')"
Change this line to:
(RECORDED_DATE "TO_DATE(:RECORDED_DATE,'yyyymmdd')"
Regards, -
SQL Loader Problem with Date Format
Dear all,
I am dealing with a problem in loading data with SQL Loader. The problem is in the date format.
More specifically, I created the following Control File:
file.ctl
LOAD DATA
INFILE 'D:\gbal\chatium.log'
APPEND INTO TABLE CHAT_SL
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
(SL1 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
SL2 char,
SL3 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
SL4 char,
SL5 char,
SL6 char,
SL7 char,
SL8 char,
SL9 char,
SL10 char,
SL11 char,
SL12 char,
SL13 char,
SL14 char,
SL15 char)
The data we want to load are in the following file:
Apr 29, 2007 12:05:49 AM 1060615 Apr 29, 2007 12:05:35 AM 306978537730 24026384 chatium.user.userinfo WAP 0
Apr 29, 2007 12:12:51 AM 1061251 Apr 29, 2007 12:12:27 AM 306978537730 24026384 chatium.channel.list WAP 0
Apr 29, 2007 12:12:51 AM 1061264 Apr 29, 2007 12:12:32 AM 306978537730 24026384 chatium.channel.listdetail WAP 0
Apr 29, 2007 12:13:51 AM 1061321 Apr 29, 2007 12:13:31 AM 306978537730 24026384 chatium.user.search WAP 0
Apr 29, 2007 12:13:51 AM 1061330 Apr 29, 2007 12:13:37 AM 306978537730 24026384 chatium.user.userinfo WAP 0
The error log file is the following:
SQL*Loader: Release 9.2.0.1.0 - Production on Mon Apr 30 11:29:16 2007
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Control File: file.ctl
Data File: D:\gbal\chatium.log
Bad File: chatium.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table CHAT_SL, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
SL1 FIRST * WHT DATE MonDD,YYYYHH:MI:SS
SL2 NEXT * WHT CHARACTER
SL3 NEXT * WHT CHARACTER
SL4 NEXT * WHT CHARACTER
SL5 NEXT * WHT CHARACTER
SL6 NEXT * WHT CHARACTER
SL7 NEXT * WHT CHARACTER
SL8 NEXT * WHT CHARACTER
SL9 NEXT * WHT CHARACTER
SL10 NEXT * WHT CHARACTER
SL11 NEXT * WHT CHARACTER
SL12 NEXT * WHT CHARACTER
SL13 NEXT * WHT CHARACTER
SL14 NEXT * WHT CHARACTER
SL15 NEXT * WHT CHARACTER
Record 1: Rejected - Error on table CHAT_SL, column SL1.
ORA-01840: input value not long enough for date format
Record 2: Rejected - Error on table CHAT_SL, column SL1.
ORA-01840: input value not long enough for date format
Record 3: Rejected - Error on table CHAT_SL, column SL1.
ORA-01840: input value not long enough for date format
Record 4: Rejected - Error on table CHAT_SL, column SL1.
ORA-01840: input value not long enough for date format
I wonder if you could help me.
Thank you very much in advance.
Giorgos Baliotis

SQL> select to_date('Apr 29, 2007 12:05:49 AM','Mon DD, YYYY HH:MI:SS FF3AM') from dual;
select to_date('Apr 29, 2007 12:05:49 AM','Mon DD, YYYY HH:MI:SS FF3AM') from dual
ERROR at line 1:
ORA-01821: date format not recognized
SQL> ed
Wrote file afiedt.buf
1* select to_date('Apr 29, 2007 12:05:49 AM','Mon DD, YYYY HH:MI:SS AM') from dual
SQL> /
TO_DATE(
29/04/07
SQL>
Then, you defined blank space as the separator, but there are spaces inside the dates in your file. So you should add double-quotes around the date fields like below, and add OPTIONALLY ENCLOSED BY '"' to your ctl file.
"Apr 29, 2007 12:05:49 AM" 1060615 "Apr 29, 2007 12:05:35 AM" 306978537730 24026384 chatium.user.userinfo WAP 0
"Apr 29, 2007 12:12:51 AM" 1061251 "Apr 29, 2007 12:12:27 AM" 306978537730 24026384 chatium.channel.list WAP 0
"Apr 29, 2007 12:12:51 AM" 1061264 "Apr 29, 2007 12:12:32 AM" 306978537730 24026384 chatium.channel.listdetail WAP 0
"Apr 29, 2007 12:13:51 AM" 1061321 "Apr 29, 2007 12:13:31 AM" 306978537730 24026384 chatium.user.search WAP 0
"Apr 29, 2007 12:13:51 AM" 1061330 "Apr 29, 2007 12:13:37 AM" 306978537730 24026384 chatium.user.userinfo WAP 0Example :
http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_concepts.htm#sthref477
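Putting both fixes together (FF3 is a TIMESTAMP-mask element and is not valid in a DATE mask, which is what raised ORA-01821 in the SQL*Plus test; the quoting handles the embedded spaces), the control file would look something like this sketch:

```text
LOAD DATA
INFILE 'D:\gbal\chatium.log'
APPEND INTO TABLE CHAT_SL
FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(SL1 DATE "Mon DD, YYYY HH:MI:SS AM",
 SL2 char,
 SL3 DATE "Mon DD, YYYY HH:MI:SS AM",
 SL4 char,
 SL5 char,
 SL6 char,
 SL7 char)
```

Only the first 7 columns are shown here; the remaining SLn columns follow the same pattern as in the original control file.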
Nicolas. -
SQL*Loader ContinueIf with X'hex'
Hi all,
i am using SQL*Loader: Release 9.2.0.6.0 to import a text file with this CTL file :
LOAD DATA
INFILE AZUPI00F.TXT
REPLACE
CONTINUEIF THIS (1:3) != X'0D0A1A'
INTO TABLE SP_AZUPI00F
(UPIUPI POSITION (1:5) INTEGER EXTERNAL,
UPINOM POSITION (6:40) CHAR,
UPIIND POSITION (41:90) CHAR,
UPILOC POSITION (91:125) CHAR,
...but the log returns this:
Errors allowed: 10001
Bind array: 10000 rows, maximum of 256000 bytes
Continuation: 1:3 != 0X0d0a1a(character ''), in current physical record
Path used: Conventional
Silent options: FEEDBACK, ERRORS and DISCARDS
...and the import fails. What am I missing in the syntax?
I am trying to exclude from the import the last line in my text file, which contains the hexadecimal sequence.
Thanks to all for suggestions and solutions.

The following links might be helpful ->
[SQL Loader 1|http://www.stanford.edu/dept/itss/docs/oracle/10g/server.101/b10825/ldr_cases.htm#i1010200]
[SQL Loader 2|http://www.psoug.org/reference/sqlloader.html]
Regards.
Satyaki De. -
SQL*Loader issue with WHEN command
Environment: R12.1.2
We have a file coming in from a bank that needs to be loaded into a custom table using SQL*Loader.
The file has multiple record formats. Each record in the file starts with a "record type", which defines the format.
For simplicity, say there is a record type "H" with the header format, and another record type "D" with a detail record format. An "H" record may be followed by multiple "D" records until the next "H" record is encountered. Unfortunately, there is no common key (say, "Vendor Number") in both the "H" and "D" records to establish a relationship. So the plan was to use an Oracle sequence or a SQL*Loader sequence to get a sequence value loaded into the table as the file is loaded. Then, if consecutive "H" records had sequence values of 100 and 112, we would know that the "D" records for the "H" 100 record are all the records with sequence values 101 through 111.
The issue occurs as we have to use the WHEN command in the control file to direct a certain record type to specific columns of the table. Based on the populated sequence values, with the WHEN command, it seems that all the "H" records get loaded first followed by the "D" records. The sequence becomes of no use and we cannot establish a link between the "H" and "D" records. The alternative is to not use WHEN with the sequence, but load the file into generic column names which provides for less understanding in the application.
Is there a way (command feature) to ensure that SQL*Loader loads the records sequentially while using WHEN?
Thanks
Satish

I used the RECNUM parameter instead of a sequence and it worked fine.
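As a sketch of that approach (the table, file, and column names here are hypothetical, not from the thread): RECNUM records each row's physical position in the data file, so the value stays sequential even when WHEN clauses route the records through different INTO TABLE blocks.

```text
LOAD DATA
INFILE 'bank_file.dat'
APPEND
INTO TABLE xx_bank_stage
WHEN (1:1) = 'H'
( file_line_no  RECNUM,
  record_type   POSITION(1:1)  CHAR,
  header_data   POSITION(2:80) CHAR )
INTO TABLE xx_bank_stage
WHEN (1:1) = 'D'
( file_line_no  RECNUM,
  record_type   POSITION(1:1)  CHAR,
  detail_data   POSITION(2:80) CHAR )
```

A "D" row then belongs to the "H" row with the largest file_line_no smaller than its own, regardless of the order in which SQL*Loader inserts them.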
-
Re: Sql*loader 11g - Error ORA-12899
My Bad file has first 2 records like this:
MEMB_NUMBER,ID_NUMBER,ASSIGNED_MEMB_NUMBER,ASSOC_AMT,ASSOC_TYPE,DATE_ADDED,DATE_MODIFIED,OPERATOR_NAME,USER_GROUP,LOCATION_ID,
0000000107,0000828633, ,1.5,J,22-FEB-02,12-JUN-02,MSUM080_MEMB_CONV,00,,
0000002301,0000800007, ,297.5,J,03-AUG-00,12-JUN-02,MSUM080_MEMB_CONV,00,,
My Log file says:
Record 1: Rejected - Error on table OWBREP.MEMB_ENTITY, column ID_NUMBER.
ORA-12899: value too large for column "OWBREP"."MEMB_ENTITY"."ID_NUMBER" (actual: 20, maximum: 10)
Record 2: Rejected - Error on table OWBREP.MEMB_ENTITY, column ASSOC_AMT.
ORA-01722: invalid number
Decription of target table:
memb_number
varchar2(10 byte)
y
id_number
varchar2(10 byte)
y
assigned_memb_number
varchar2(15 byte)
y
assoc_amt
number(14,2)
y
assoc_type
char(1 byte)
y
date_added
date
y
date_modified
date
y
operator_name
varchar2(32 byte)
y
user_group
varchar2(2 byte)
y
location_id
number
y
Can you please tell me why sqlldr is throwing these errors? The data seems correct to me.

Hi,
Now again I am facing problem..
I mean my log file throws error
"Record 2: Rejected - Error on table OWBREP.ADDRESS, column USER_GROUP.
ORA-12899: value too large for column "OWBREP"."ADDRESS"."USER_GROUP" (actual: 27, maximum: 2)"
and my record when I see in bad file is:
"0000810722,3,00000000,00000000,H,A,Y, , , , , ,1777 Hull Road, , , ,Mason,MI,48854, , , , ,12-MAR-02,N,0,,, ,00000000,0,FAC, , ,N,N, ,1777 Hull Road,Mason, MI 48854, , , , , , , ,12-MAR-02,10-FEB-05,FIX075_ADDR_VERSION_UPGRADE,00,,"
I have checked my control file and the table structure .They are same.
The problem is that I have an address field before this user_group column which has a value like "Mason,MI,48854", so SQL*Loader is reading these as separate columns because of the commas. That is why the column order is getting shifted and I am getting this error. Can you suggest what I should do to resolve this kind of error? -
Hi,
Can we use SQL Loader with Forms and Reports 6i? Do i need to download SQL Loader separately? How do we use it?
Thanks and regards,
Nitin Koshy

Yes, you can use SQL*Loader in Forms. I have seen this being done using the Host command.
For more info about the Host command, refer to the link below:
http://www.oracle.com/webapps/online-help/forms/10g/state?navSetId=_&navId=3&vtTopicFile=f1_help/builth_m/host.html&vtTopicId= -
Sql Loader fails with newline fields
Hi, I have a file delimited by "|" which is exported from MS Access. One of the fields contains a newline (e.g. a notes column), which SQL*Loader treats as the beginning of a new record. Can anyone tell me how to force the loader to ignore this newline? Is there a way I can tell the loader to read the optional '"' and continue with the record until it finds a closing '"'? Or is there a way to export data from Access and force it to skip newlines, producing a flat file with one record per line? I hate fixed-length columns, since they add pain when writing the control files; also, the loader still fails to read a record if it encounters a newline as discussed above.
Thanks

Unfortunately there is no command that can make SQL*Loader treat two lines in a text file as one record. Someone asked Tom Kyte the same question; see what he had to say about it:
http://asktom.oracle.com/pls/ask/f?p=4950%3A8%3A%3A%3A%3A%3AF4950_P8_DISPLAYID%3A4972732303253
Why don't you create a staging table where you first load the raw data, and then populate your original table in the required form from the rows inserted into that staging table?
Anwar -
How to load XML file to table (non-XML) with SQL*Loader -- issue with nulls
I have been attempting to use SQL*Loader to load an XML file into a "regular" Oracle table. All fields work fine, unless a null is encountered. The way that nulls are represented is shown below:
<PAYLOAD>
<FIELD1>ABCDEF</FIELD1>
<FIELD2/>
<FIELD3>123456</FIELD3>
</PAYLOAD>
In the above example, FIELD2 is a null field and that is the way it is presented. I have searched everywhere and have not found how I could code for this. The issue is that if FIELD2 is present, it is coded like: <FIELD2>SOMEDATA</FIELD2>, but the null is represented as <FIELD2/>. Here is a sample of the control file I am using to attempt the load -- very simplistic, but works fine when fields are present:
load data
infile 'testdata.xml' "str '<PAYLOAD>'"
TRUNCATE
into table DATA_FROM_XML
(
FIELD1 ENCLOSED BY '<FIELD1>' AND '</FIELD1>',
FIELD2 ENCLOSED BY '<FIELD2>' AND '</FIELD2>',
FIELD3 ENCLOSED BY '<FIELD3>' AND '</FIELD3>')
What do I need to do to account for the way that nulls are presented? I have tried everything I could glean from the web and the documentation and nothing has worked. Any help would be really appreciated.

I hadn't even got that far. Can you direct me to the docs for importing data that is stored within XML when you don't need any XML functionality, and XML just happens to be the format the data is stored in? Thx
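One pragmatic workaround (my suggestion, not from the thread) is to normalize the file before loading: expand every self-closing tag like <FIELD2/> into an explicit open/close pair, so the ENCLOSED BY '<FIELD2>' AND '</FIELD2>' clause has both delimiters to match and the field loads as an empty value (NULL).

```shell
# Build a tiny sample like the one in the question
printf '<PAYLOAD>\n<FIELD1>ABCDEF</FIELD1>\n<FIELD2/>\n<FIELD3>123456</FIELD3>\n</PAYLOAD>\n' > testdata.xml

# Rewrite <TAG/> as <TAG></TAG> so SQL*Loader's ENCLOSED BY pairs can match;
# records whose fields are populated are left untouched.
sed -E 's|<([A-Za-z0-9_]+)/>|<\1></\1>|g' testdata.xml > testdata_fixed.xml

grep FIELD2 testdata_fixed.xml
```

Then point the INFILE clause at testdata_fixed.xml instead of the original file.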
-
SQL Loader : Issue with WHEN
I'm using Oracle 11g, Win XP.
I'm trying to load data with below control file:
OPTIONS (SKIP=0, DIRECT=FALSE, PARALLEL=FALSE, BINDSIZE=50000, errors=999999,ROWS=200, READSIZE=65536)
LOAD DATA
APPEND
INTO TABLE v_table
when COL_3 = 'XXXX'
fields terminated by "|" optionally enclosed by '"'
trailing nullcols
COL_1 "trim(:COL_1)",
COL_2 "trim(:COL_2)",
COL_3 "trim(:COL_3)",
COL_4 "trim(:COL_4)",
COL_5 "trim(:COL_5)",
COL_6 "trim(:COL_6)",
COL_7 "trim(:COL_7)",
INTO TABLE v_table
APPEND
when COL_3 = 'YYY'
fields terminated by "|" optionally enclosed by '"'
trailing nullcols
COL_1 "trim(:COL_1)",
COL_2 "trim(:COL_2)",
COL_3 "trim(:COL_3)",
COL_4 "trim(:COL_4)",
COL_5 "trim(:COL_5)",
COL_6 "trim(:COL_6)",
COL_7 "trim(:COL_7)",
)
Below is the sample data in the data file:
33432|"ORACLE"|"XXXX"|"555827 "|"317564"|" "|""|"ORACLE "|2011-07-20-15.37.11.879915|0001-01-01-01.01.01.000001
33433|"ORACLE"|"XXXX"|"555828 "|"317564"|" "|""|"ORACLE "|2011-07-24-15.37.11.879915|0001-01-01-01.01.01.000001
33434|"ORACLE"|"XXXX"|"555829 "|"317564"|" "|""|"ORACLE "|2011-07-10-15.37.11.879915|0001-01-01-01.01.01.000001
33435|"ORACLE"|"XXXX"|"555830 "|"317564"|" "|""|"ORACLE "|2011-07-22-15.37.11.879915|0001-01-01-01.01.01.000001
33436|"ORACLE"|"XXXX"|"555831 "|"317564"|" "|""|"ORACLE "|2011-07-20-15.37.11.879915|0001-01-01-01.01.01.000001
33437|"ORACLE"|"XXXX"|"555832 "|"317564"|" "|""|"ORACLE "|2011-07-20-15.37.11.879915|0001-01-01-01.01.01.000001
40048|"SAS"|"ZZZ "|"1017838 "|"317551"|" "|""|"COD "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
40049|"SAS"|"ZZZ "|"1017839 "|"317551"|" "|""|"COD "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
40050|"SAS"|"ZZZ "|"1017840 "|"317551"|" "|""|"COD "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
20046|"SUNUSA"|"YYY "|"1017836 "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
20047|"SUNUSA"|"YYY "|"1017837 "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
20048|"SUNUSA"|"YYY "|"1017838 "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
20049|"SUNUSA"|"YYY "|"1017839 "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
20050|"SUNUSA"|"YYY "|"1017840 "|"317551"|" "|""|"JAVA "|2011-09-08-08.44.29.684915|0001-01-01-01.01.01.000001
The issue is:
When I load data with the above control card, only the rows with COL_3 = 'XXXX' get loaded. If I comment out the block with COL_3 = 'XXXX', then the second block loads (when COL_3 = 'YYY'). But I cannot load the XXXX and YYY data in a single run. Can someone please help me with this?

Thanks Warren. Found the solution:
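The reason POSITION(1) fixes this: with multiple INTO TABLE clauses, SQL*Loader does not rescan each record from the start; every subsequent clause continues from where the previous one stopped, so the second WHEN never sees the beginning of the line. Putting POSITION(1) on the first field of each INTO TABLE clause after the first resets the scan to column 1. A minimal sketch with generic names (not the thread's actual table):

```text
LOAD DATA
APPEND
INTO TABLE t
WHEN col3 = 'XXXX'
FIELDS TERMINATED BY "|" OPTIONALLY ENCLOSED BY '"'
(col1, col2, col3)
INTO TABLE t
WHEN col3 = 'YYY'
FIELDS TERMINATED BY "|" OPTIONALLY ENCLOSED BY '"'
(col1 POSITION(1), col2, col3)
```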
when COL_3 = 'XXXX'
fields terminated by "|" optionally enclosed by '"'
trailing nullcols
COL_1 POSITION(1) "trim(:COL_1)",
COL_2 "trim(:COL_2)",
COL_3 "trim(:COL_3)",
COL_4 "trim(:COL_4)",
COL_5 "trim(:COL_5)",
COL_6 "trim(:COL_6)",
COL_7 "trim(:COL_7)",
INTO TABLE v_table
APPEND
when COL_3 = 'YYY'
fields terminated by "|" optionally enclosed by '"'
trailing nullcols
COL_1 POSITION(1) "trim(:COL_1)",
COL_2 "trim(:COL_2)", -
SQL*Loader issue with NULLIF
Hi all,
I am trying to use following control file,
LOAD DATA
INFILE *
REPLACE
INTO TABLE T1
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(
OBJECT_NAME CHAR NULLIF OBJECT_NAME = "NULL" ,
SUBOBJECT_NAME CHAR NULLIF SUBOBJECT_NAME = "NULL" ,
OBJECT_ID DECIMAL EXTERNAL NULLIF OBJECT_ID = "NULL" ,
DATA_OBJECT_ID DECIMAL EXTERNAL NULLIF DATA_OBJECT_ID = "NULL" ,
OBJECT_TYPE CHAR NULLIF OBJECT_TYPE = "NULL" ,
CREATED DATE "DD/MM/YYYY HH24:MI:SS" NULLIF CREATED = "NULL" ,
LAST_DDL_TIME DATE "DD/MM/YYYY HH24:MI:SS" NULLIF LAST_DDL_TIME = "NULL" ,
TIMESTAMP CHAR NULLIF TIMESTAMP = "NULL" ,
STATUS CHAR NULLIF STATUS = "NULL" ,
TEMPORARY CHAR NULLIF TEMPORARY = "NULL" ,
GENERATED CHAR NULLIF GENERATED = "NULL" ,
SECONDARY CHAR NULLIF SECONDARY = "NULL"
)
I am getting the error:
SQL*Loader-350: Syntax error at line 21.
Expecting positive integer or column name, found keyword timestamp.
CHAR NULLIF TIMESTAMP = "NULL" ,
STATUS

The file I am trying to load is a pipe-delimited file that has the string "NULL" for NULL values, so I have added NULLIF for all columns.
The interesting thing is that Oracle allows column names like TIMESTAMP or GENERATED, but if I use them in the NULLIF clause, it is effectively a syntax error.
The table I am using is like this (it is same as user_objects view),
SQL> desc t1
Name Null? Type
OBJECT_NAME VARCHAR2(128)
SUBOBJECT_NAME VARCHAR2(30)
OBJECT_ID NUMBER
DATA_OBJECT_ID NUMBER
OBJECT_TYPE VARCHAR2(19)
CREATED DATE
LAST_DDL_TIME DATE
TIMESTAMP VARCHAR2(19)
STATUS VARCHAR2(7)
TEMPORARY VARCHAR2(1)
GENERATED VARCHAR2(1)
SECONDARY VARCHAR2(1)

If I remove the NULLIF clause for the columns timestamp and generated, there is no problem; the control file works fine.
How can I get around this problem ?
Thanks in advance

TIMESTAMP is a keyword for the loader and confuses it.
Rename your column.