Column data value exceeds limit - rows rejected while bulk loading
Hi,
I have a table like this:
CREATE TABLE SMLC_CELLDATA (
MCC NUMBER,
MNC NUMBER,
LAC VARCHAR2(10 BYTE),
CELL_ID VARCHAR2(10 BYTE),
CELL_NAME VARCHAR2(500 BYTE),
LAT FLOAT(20),
LON FLOAT(20),
ORIENTATION NUMBER,
OPENING NUMBER,
RANGE NUMBER,
BSIC NUMBER,
ARFCN NUMBER,
HANDOVER VARCHAR2(4000 BYTE),
EIRP NUMBER,
ENVIRONMENT VARCHAR2(50 BYTE)
)
TABLESPACE SYSTEM
PCTUSED 40
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
FREELISTS 1
FREELIST GROUPS 1
BUFFER_POOL DEFAULT
)
LOGGING
NOCACHE
NOPARALLEL;

When I try to bulk load data into this table, certain rows are rejected, and the log says the HANDOVER column value exceeds the limit.
But the data is:
404-80-101-1021 404-80-101-1022 404-80-101-1023 404-80-101-1101 404-80-101-1103 404-80-101-1131 404-80-101-1132 404-80-101-1133 404-80-101-1151 404-80-101-1153 404-80-101-1161 404-80-101-1163 404-80-101-1322 404-80-101-1392 404-80-101-1393 404-80-120-18312
which is 256 bytes. How can it be rejected when the column is VARCHAR2(4000)?
Aemunathan
Edited by: Aemunathan on Sep 26, 2008 6:13 PM
The bad file contains:
404 80 101 1102 110_Hotel_Madura_H2 10.79805556 78.68110833 120 60 3000 38 77 404-80-101-1021 404-80-101-1022 404-80-101-1023 404-80-101-1101 404-80-101-1103 404-80-101-1131 404-80-101-1132 404-80-101-1133 404-80-101-1151 404-80-101-1153 404-80-101-1161 404-80-101-1163 404-80-101-1322 404-80-101-1392 404-80-101-1393 404-80-120-18312 52 u
404 80 105 9012 901_Tansi_Kumbakonam_H2 10.96108056 79.39166667 120 60 3000 34 75 404-80-105-9011 404-80-105-9013 404-80-105-9032 404-80-105-9051 404-80-105-9052 404-80-105-9091 404-80-105-9092 404-80-105-9093 404-80-105-9132 404-80-105-9133 404-80-105-9181 404-80-105-9183 404-80-105-9233 404-80-105-9282 404-80-121-17013 404-80-121-17033 52 u
404 80 107 13012 1301_Attur_H2 11.59705556 78.59446944 100 60 3000 32 77 404-80-107-13011 404-80-107-13013 404-80-107-13023 404-80-107-13041 404-80-107-13043 404-80-107-13112 404-80-107-13113 404-80-107-13121 404-80-107-13122 404-80-107-13123 404-80-107-13441 404-80-107-13443 404-80-107-13471 404-80-107-13473 404-80-232-32842 404-80-232-32843 52 u
404 80 107 13303 1330_Omallur_H2 11.73888889 78.04583056 210 60 3000 36 69 404-80-107-13101 404-80-107-13102 404-80-107-13241 404-80-107-13301 404-80-107-13302 404-80-107-13313 404-80-107-13322 404-80-107-13352 404-80-107-13423 404-80-204-4051 404-80-204-4052 404-80-204-4053 404-80-204-4072 404-80-204-4073 404-80-207-7201 404-80-207-7203 52 u
404 80 107 13423 1342_Yercaud_Support_H1 11.77083056 78.19860833 240 60 3000 32 63 404-80-107-13071 404-80-107-13072 404-80-107-13073 404-80-107-13091 404-80-107-13092 404-80-107-13093 404-80-107-13102 404-80-107-13132 404-80-107-13171 404-80-107-13173 404-80-107-13301 404-80-107-13302 404-80-107-13303 404-80-107-13311 404-80-107-13313 404-80-107-13322 404-80-107-13351 404-80-107-13352 404-80-107-13353 404-80-107-13421 404-80-107-13422 404-80-136-36102 404-80-204-4021 404-80-204-4031 404-80-204-4051 404-80-204-4073 404-80-204-4261 404-80-204-4263 52 u
404 80 109 15152 1515_Paramathy_Exg_H1U 11.15138889 78.02471944 200 60 3000 32 71 404-80-109-15141 404-80-109-15142 404-80-109-15143 404-80-109-15151 404-80-109-15153 404-80-109-15251 404-80-109-15253 404-80-109-15311 404-80-109-15312 404-80-109-15313 404-80-109-15322 404-80-109-15401 404-80-109-15403 404-80-205-5232 404-80-207-7531 404-80-207-7533 52 u
404 80 109 15172 1517_Andalurgate_H1 11.44916667 78.15916667 80 60 3000 39 68 404-80-109-15011 404-80-109-15053 404-80-109-15161 404-80-109-15163 404-80-109-15171 404-80-109-15173 404-80-109-15181 404-80-109-15182 404-80-109-15183 404-80-109-15193 404-80-109-15261 404-80-109-15333 404-80-109-15421 404-80-109-15430 404-80-109-15463 404-80-207-7101 404-80-207-7102 404-80-207-7162 52 u
404 80 112 20012 2001_Attur_TE_H2D 11.59705556 78.59446944 85 60 1500 32 77 404-80-107-13121 404-80-107-13122 404-80-107-13123 404-80-112-20011 404-80-112-20013 404-80-112-20022 404-80-112-20023 404-80-112-20141 404-80-112-20143 404-80-112-20153 404-80-112-20231 404-80-112-20233 404-80-112-20251 404-80-112-20253 404-80-232-32842 404-80-232-32843 52 u
404 80 115 12222 1222_Venkateswara_Ngr_H1 11.11722222 79.66166667 210 60 3000 35 71 404-80-115-12221 404-80-115-12223 404-80-115-12231 404-80-115-12232 404-80-115-12241 404-80-115-12242 404-80-115-12243 404-80-115-12253 404-80-115-12261 404-80-115-12262 404-80-115-12311 404-80-115-12312 404-80-115-12313 404-80-115-12321 404-80-115-12353 404-80-115-12413 404-80-115-12471 52 u
404 80 121 17011 1701_Eravancheri_H1 10.93305556 79.55416667 40 60 3000 34 82 404-80-105-9112 404-80-105-9113 404-80-105-9163 404-80-105-9742 404-80-115-12242 404-80-115-12243 404-80-115-12252 404-80-115-12403 404-80-115-12473 404-80-121-17012 404-80-121-17013 404-80-121-17041 404-80-121-17043 404-80-121-17131 404-80-121-17133 404-80-121-17182 52 u
404 80 121 17442 1744_Exg_Pattukkottai_H2 10.42666667 79.31891667 130 60 3000 32 76 404-80-121-17111 404-80-121-17113 404-80-121-17162 404-80-121-17401 404-80-121-17402 404-80-121-17403 404-80-121-17422 404-80-121-17423 404-80-121-17431 404-80-121-17433 404-80-121-17441 404-80-121-17443 404-80-121-17461 404-80-121-17472 404-80-121-17522 404-80-121-17523 52 u
404 80 130 30022 Coonoor_MW_2 11.34527778 76.81027778 270 65 3000 47 68 404-80-130-30041 404-80-130-30051 404-80-130-30151 404-80-130-30012 404-80-130-30031 404-80-130-30032 404-80-130-30033 404-80-130-30021 404-80-130-30052 404-80-130-30152 404-80-130-30153 404-80-130-30011 404-80-130-30211 404-80-130-30162 404-80-130-30163 404-80-130-30013 404-80-130-30361 404-80-130-30023 404-80-130-30363 404-80-130-30251 404-80-130-30253 52 u
404 80 130 30093 Forest_Land_Ooty_3 11.395 76.70277778 270 65 3000 44 75 404-80-130-30433 404-80-130-30182 404-80-130-30142 404-80-130-30143 404-80-130-30103 404-80-130-30091 404-80-130-30092 404-80-130-30082 404-80-130-30083 404-80-130-30053 404-80-130-30321 404-80-130-30183 404-80-130-30422 404-80-130-30283 404-80-130-30423 404-80-130-30413 404-80-130-30432 404-80-130-30431 52 u
404 80 130 30163 Vandhisolai_3 11.37222222 76.81861111 220 65 3000 47 74 404-80-130-30161 404-80-130-30162 404-80-130-30153 404-80-130-30011 404-80-130-30012 404-80-130-30151 404-80-130-30073 404-80-130-30022 404-80-130-30033 404-80-130-30051 404-80-130-30361 404-80-130-30013 404-80-130-30303 404-80-130-30383 404-80-130-30023 404-80-130-30203 404-80-130-30253 52 u
404 80 132 32731 Collectorate_IBS TNJ 10.81458 79.60622 0 65 3000 47 74 404-80-121-17071 404-80-121-17091 404-80-121-17093 u
404 80 134 34741 Parambikulam_IBS_1 10.390417 76.774889 0 65 3000 42 63 404-80-132-32443 m
404 80 134 34751 TNG_Palayam_IBS2_1 10.390417 76.774889 0 65 1000 40 63 404-80-134-34141 404-80-134-34651 404-80-134-34652
404 80 149 49011 Tirunelveli MSC Switch 8.72777 77.7044 0 360 200
404 80 149 49012 Tirunelveli MSC Switch 8.72777 77.7044 0 360 200 m
404 80 166 6611 Ranga_Pilllai BSNL Pondy Exchange building 11.56 79.5 0 360 200 404-80-160-60171 m
404 80 204 4261 426_Old_Suramangalam_H2 11.67055556 78.10999722 0 60 3000 39 70 404-80-107-13072 404-80-107-13073 404-80-107-13302 404-80-107-13423 404-80-204-4051 404-80-204-4071 404-80-204-4072 404-80-204-4073 404-80-204-4251 404-80-204-4253 404-80-204-4262 404-80-204-4263 404-80-204-4270 404-80-204-4753 404-80-204-4773 404-80-204-4910 52 u
404 80 205 5232 523_Kodumudi_H2 11.07916667 77.87999722 120 60 3000 33 69 404-80-109-15142 404-80-109-15152 404-80-109-15382 404-80-109-15412 404-80-109-15413 404-80-114-11103 404-80-114-11201 404-80-114-11203 404-80-205-5231 404-80-205-5233 404-80-205-5251 404-80-205-5252 404-80-205-5571 404-80-205-5572 404-80-206-6081 404-80-207-7552 52 u
404 80 205 5392 539_V.P.Palayam_Erode_H1 11.29610833 77.66944444 120 60 3600 33 73 404-80-205-5593 404-80-205-5591 404-80-205-5493 404-80-205-5492 404-80-205-5393 404-80-205-5391 404-80-205-5283 404-80-205-5281 404-80-205-5163 404-80-205-5161 404-80-205-5133 404-80-205-5123 404-80-110-16762 404-80-110-16651 404-80-110-16612 404-80-110-16611 52 u
404 80 232 32842 Thalaivasal_2 11.5698056 78.7496389 120 60 3000 43 80 404-80-107-13042 404-80-107-13441 404-80-107-13471 404-80-107-13473 404-80-2003-22253 404-80-232-32841 404-80-232-32843 u
Each row ends with "u".
Edited by: Aemunathan on Sep 26, 2008 9:02 PM
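The likely cause: SQL*Loader's default datatype for a field with no explicit type in the control file is CHAR(255), so a 256-byte HANDOVER value overflows the field buffer, not the VARCHAR2(4000) column. The original control file was not posted, so the following is only a sketch with the field list assumed from the table definition and a tab delimiter assumed from the data sample:

```
-- Hypothetical control file for SMLC_CELLDATA. Without an explicit
-- datatype, SQL*Loader defaults each field to CHAR(255); widening
-- the HANDOVER field buffer to match the column fixes the rejection.
LOAD DATA
INFILE 'celldata.dat'
APPEND INTO TABLE SMLC_CELLDATA
FIELDS TERMINATED BY X'9' TRAILING NULLCOLS
(
  MCC, MNC, LAC, CELL_ID, CELL_NAME,
  LAT, LON, ORIENTATION, OPENING, RANGE,
  BSIC, ARFCN,
  HANDOVER CHAR(4000),   -- raise the field buffer above the 255 default
  EIRP, ENVIRONMENT
)
```

The same default-CHAR(255) behaviour comes up again in several of the similar messages below.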
Similar Messages
-
XML data value exceeds maximum length - ORA-30951
Hello,
I am receiving ORA-30951: Element or attribute at Xpath /dataroot/Respondent[1]/Response[3]/Value exceeds maximum length error during the XML load.
I have registered the schema, and it works fine when the Value is less than 64K but fails if it's greater. I tried changing the type of Value to type="xsd:base64Binary" xdb:maintainDOM="false" xdb:SQLName="LOB_DATA" xdb:SQLType="BLOB", but then I get an ORA-00932: inconsistent datatypes error.
Can someone please let me know what I am doing wrong or is there a way I can load more than 64k length elements on 10g?
Thanks
Here is my schema.
var SCHEMAURL varchar2(256)
var XMLSCHEMA CLOB
set define off
begin
:SCHEMAURL := 'http://xmlns.example.com/Svy_Resp.xsd';
:XMLSCHEMA := '<?xml version="1.0" encoding="utf-16"?>
<xsd:schema attributeFormDefault="unqualified" elementFormDefault="qualified" version="1.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xdb="http://xmlns.oracle.com/xdb" xdb:storeVarrayAsTable="true">
<xsd:element name="dataroot" xdb:defaultTable="SVY_RESP_XML_SCHEMA" type="datarootType" />
<xsd:complexType name="datarootType" xdb:maintainDOM="false"
xdb:SQLType="Dataroot_T">
<xsd:sequence>
<xsd:element maxOccurs="unbounded" name="Respondent" type="RespondentType" />
</xsd:sequence>
<xsd:attribute name="generated" type="xsd:dateTime" />
</xsd:complexType>
<xsd:complexType name="RespondentType" xdb:maintainDOM="false" xdb:SQLType="Respondent_Type">
<xsd:sequence>
<xsd:element name="RespondentID" type="xsd:int" />
<xsd:element name="KsID" type="xsd:int" />
<xsd:element name="email" type="xsd:string" />
<xsd:element name="SyID" type="xsd:int" />
<xsd:element name="KSuID" type="xsd:int" />
<xsd:element name="Completed" type="xsd:int" />
<xsd:element name="SubmitDateTime" type="xsd:dateTime" />
<xsd:element maxOccurs="unbounded" name="Response" type="ResponseType" />
</xsd:sequence>
</xsd:complexType>
<xsd:complexType name="ResponseType" xdb:maintainDOM="false" xdb:SQLType="Response_Type">
<xsd:sequence>
<xsd:element name="ResponseID" type="xsd:int" />
<xsd:element name="RespondentID" type="xsd:int" />
<xsd:element name="CID" type="xsd:int" />
<xsd:element name="AID" type="xsd:int" />
<xsd:element name="Value" type="xsd:string"/>
<xsd:element name="QID" type="xsd:int" />
<xsd:element name="SID" type="xsd:int" />
</xsd:sequence>
</xsd:complexType>
</xsd:schema>';
end;
/
Thanks for the quick response. I am not able to modify the source file, but would it be possible to set the value to NULL if it exceeds the max length, instead of failing?
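One approach often suggested for string elements larger than 64K on 10g is to map the element to a CLOB rather than a BLOB (a BLOB expects binary data, which would explain the ORA-00932). A sketch against the schema above; the annotation values are assumptions based on the attempt described, not a tested configuration:

```xml
<!-- Sketch: keep the element as xsd:string but store it as a CLOB
     instead of a BLOB, so the character data maps consistently. -->
<xsd:element name="Value" type="xsd:string"
             xdb:SQLName="LOB_DATA" xdb:SQLType="CLOB" />
```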
Thanks -
Using Field/Column Date Value In Case Statement
I have code where the first part works (the part that evaluates NULL); however, the second part doesn't. The error I get is:
Data Value out of range
Can you use a table column value in a CASE statement? What I'm trying to do is: when the mbr06 expiration date is 12/31/9999, return NULL; when it is equal to or less than today's date, return mbr02.mbr02_cancel_proc_date.
Note, the table MBR02 does have dates that go back as far as 11/17/1858..could that be an issue?
Thanks for any assistance..
cast((case
when mbr06.mbr06_exp_date = to_date('31-dec-9999') then null
when mbr06.mbr06_exp_date <= sysdate then mbr02.mbr02_cancel_proc_date
The error is due to the CAST, not to the CASE.
Cause: Value from cast operand is larger than cast target size.
Post at least the whole CAST ...
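To illustrate the point, a sketch of what the full expression might look like; the CAST target here is an assumption, since the original post cut off before it. If mbr02_cancel_proc_date is a DATE, casting to a character type shorter than the formatted date is exactly what raises "data value out of range":

```sql
-- Sketch only: cast to a target wide enough for the result,
-- or drop the CAST entirely if both branches already yield a DATE.
cast((case
        when mbr06.mbr06_exp_date = to_date('31-dec-9999', 'dd-mon-yyyy') then null
        when mbr06.mbr06_exp_date <= sysdate then mbr02.mbr02_cancel_proc_date
      end) as date)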
Max
http://oracleitalia.wordpress.com -
Dump error while bulk load to Sybase IQ
Hi ,
I am loading data from a Sybase IQ source to a Sybase IQ target database. I can load the data in normal mode, but if I try bulk loading, it throws a dump error. Earlier I loaded many times using bulk load and didn't face any issue. We upgraded Sybase IQ 16.0 from SP3 to SP8, and now we are facing this difficulty. Please find the attached log file for the error.
Please help me how to resolve this issue.
Thanks & Regards,
Ramana.
Hi Vijay,
Follow the steps below (may help you):
Go to the Monitor screen > Status tab > using the wizard or the menu path > Environment -> Short dump -> In the warehouse
Select the Error and double click
Analyse the error from the message.
1.-->Go to Monitor
-->Transactional RFC
-->In the Warehouse
-->Execute
-->EDIT
-->Execute LUW
Refresh the Transactional RFC screen and go to the Data Target(s) which have failed and see the status of the Request-->Now it should be green.
2.In some cases the above process will not work (where the Bad requests still exists after the above procedure).
In that case we need to go to the Data Targets and need to delete the bad requests and do the update again
Regards,
BH -
Milestone Billing, Billing date value exceeds Net Value
Hi all,
I have an issue related to milestone billing. Here billing dates are entered manually; say for Rs.1000, 4 billing dates will be entered at 25% each, based on completion of a project.
Now the problem is: the net value will be 1000 (the contract amount), but the system allows entering more than Rs.1000 across the milestone billing dates, i.e. more than 100%. How do I restrict this? Do we need to use a user exit? I have tried all the standard ways.
Thanks & Regards
Praveen
Dear Praveen,
You can put in logic to compare the total value with the individual milestone amounts and trigger an error message while saving the sales order.
This seems the only way.
However, let's wait for the experts to comment on this requirement.
Thanks & Regards,
Hegal K Charles -
Hi,
I'm using Oracle Endeca 2.3.
I encountered a problem in Data Integrator: a batch of records was missing in the front end, yet when I checked the status of the graph, it showed "Graph executed successfully".
So, I've connected the bulk loader to "Universal data writer" to see the data domain status of the bulk load.
I've listed the results below; however, I'm not able to interpret the information from the status, and I've looked in the documentation but found nothing useful.
0|10000|0|In progress
0|11556|0|In progress
0|20000|0|In progress
0|30000|0|In progress
0|39891|0|In progress
0|39891|0|In progress
0|39891|0|In progress
0|39891|0|In progress
0|39891|0|In progress
0|39891|0|In progress
40009|-9|0|In progress
40009|9991|0|In progress
40009|19991|0|In progress
40009|20846|0|In progress
Could anyone enlighten me about this status?
Also, since these messages are part of the "Post load" phase, I'm wondering why it still shows "In progress".
Cheers,
Khurshid
I assume there was nothing of note in the dgraph.log?
The other option is to see what happens when you either:
A) filter your data down to the records that are missing prior to the load and see what happens
Or
B) use the regular data ingest API rather than the bulk.
Option b will definitely perform much worse on 2.3 so it may not be feasible.
The other thing to check is that your record spec is truly unique. The only time I can remember seeing an issue like this was loading a record, then loading a different record with the same spec value. The first record would get in and then be overwritten by the second record making it seem like the first record was dropped. Figured it would be worth checking.
Patrick Rafferty
Branchbird -
Bulk Loading using remote sql statement execution
Well, I have a different scenario. I want to bulk load tables the way MySQL does with the LOAD DATA LOCAL INFILE SQL command.
I have a file populated with data, what sql statement would bulk load the data into specified table using that file?
Adnan Memon
In Oracle, you would either use the SQL*Loader utility to load data from a flat file, or you would create an external table (9i and later) that reads the flat file.
A quick example of the external table approach
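The example itself didn't survive in this copy of the thread; a minimal sketch of the external table approach, with directory, file, and column names made up for illustration:

```sql
-- Hypothetical example: expose a comma-separated flat file as a table,
-- then bulk load it with a plain INSERT ... SELECT.
CREATE OR REPLACE DIRECTORY ext_data_dir AS '/data/loads';

CREATE TABLE emp_ext (
  empno NUMBER,
  ename VARCHAR2(50)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('emp.dat')
);

-- The "bulk load" is then just SQL; APPEND requests a direct-path insert.
INSERT /*+ APPEND */ INTO emp SELECT * FROM emp_ext;
```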
Justin
Distributed Database Consulting, Inc.
http://www.ddbcinc.com/askDDBC -
I've read that you should drop and recreate an index before bulk loads. What does "bulk loads" mean here?
Thanks in advance.
"Bulk loads" means you have a lot of inserts into your table(s). While doing so, if there are any indexes on the table, it can be better to drop and recreate them. But it is not highly recommended; you can instead make your indexes unusable while bulk loading.
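The "unusable index" alternative mentioned above can be sketched as follows; the index name is illustrative, and this applies to non-unique indexes (a unique index still has to be maintained or rebuilt before it can enforce uniqueness again):

```sql
-- Sketch: mark the index unusable for the duration of the load,
-- then rebuild it once, instead of maintaining it row by row.
ALTER INDEX my_table_idx UNUSABLE;
ALTER SESSION SET skip_unusable_indexes = TRUE;
-- ... perform the bulk inserts / SQL*Loader run here ...
ALTER INDEX my_table_idx REBUILD;
```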
-
Hi,
I have a file where fields are wrapped with ".
=========== file sample
"asdsa","asdsadasdas","1123"
"asdsa","asdsadasdas","1123"
"asdsa","asdsadasdas","1123"
"asdsa","asdsadasdas","1123"
==========
I have a .NET method that removes the wrap characters and writes out a file without them.
======================
asdsa,asdsadasdas,1123
asdsa,asdsadasdas,1123
asdsa,asdsadasdas,1123
asdsa,asdsadasdas,1123
======================
the .net code is here.
========================================
public static string RemoveCharacter(string sFileName, char cRemoveChar)
{
    object objLock = new object();
    FileStream objInputFile = null, objOutFile = null;
    // Build the output path once; the original code called Guid.NewGuid()
    // again on return, so it returned the name of a file it never wrote.
    string sOutFileName = sFileName.Substring(0, sFileName.LastIndexOf('\\')) + "\\" + Guid.NewGuid().ToString();
    lock (objLock)
    {
        try
        {
            objInputFile = new FileStream(sFileName, FileMode.Open);
            objOutFile = new FileStream(sOutFileName, FileMode.Create);
            int nByteRead;
            // Copy byte by byte, skipping the character to remove.
            while ((nByteRead = objInputFile.ReadByte()) != -1)
            {
                if (nByteRead != (int)cRemoveChar)
                    objOutFile.WriteByte((byte)nByteRead);
            }
        }
        finally
        {
            objInputFile.Close();
            objOutFile.Close();
        }
    }
    return sOutFileName;
}
==================================
however when I run the bulk load utility I get the error
=======================================
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 3 (NumberOfMultipleMatches).
==========================================
the bulk insert statement is as follows
=========================================
BULK INSERT Temp
FROM '<file name>'
WITH (
    FIELDTERMINATOR = ',',
    KEEPNULLS
)
==========================================
Does anybody know what is happening and what needs to be done ?
PLEASE HELP
Thanks in advance
Vikram
To load that file with BULK INSERT, use this format file:
9.0
4
1 SQLCHAR 0 0 "\"" 0 "" ""
2 SQLCHAR 0 0 "\",\"" 1 col1 Latin1_General_CI_AS
3 SQLCHAR 0 0 "\",\"" 2 col2 Latin1_General_CI_AS
4 SQLCHAR 0 0 "\"\r\n" 3 col3 Latin1_General_CI_AS
Note that the format file defines four fields while the file only seems to have three. The format file defines an empty field before the first quote.
Or, since you already have a .NET program, use a stored procedure with table-valued parameter instead. I have an example of how to do this here:
http://www.sommarskog.se/arrays-in-sql-2008.html
Erland Sommarskog, SQL Server MVP, [email protected] -
SQL loader Field in data file exceeds maximum length for CLOB column
Hi all
I'm loading data from a text file separated by TABs, and I get the error below for some lines.
This happens even though the column is of CLOB data type. Is there a limitation on the size of data that can be loaded into a CLOB?
The error is:
Record 74: Rejected - Error on table _TEMP, column DEST.
Field in data file exceeds maximum length
I'm using SQL Loader and the database is oracle 11g r2 on linux Red hat 5
Here are the line causing the error fronm my data file and my table description for test:
create table TEMP (
CODE VARCHAR2(100),
DESC VARCHAR2(500),
RATE FLOAT,
INCREASE VARCHAR2(20),
COUNTRY VARCHAR2(500),
DEST CLOB,
WEEK VARCHAR2(10),
IS_SAT VARCHAR2(50),
IS_SUN VARCHAR2(50)
);
CONTROL FILE:
LOAD DATA
INTO TABLE TEMP
APPEND
FIELDS TERMINATED BY X'9' TRAILING NULLCOLS
(
CODE,
DESC,
RATE,
INCREASE,
COUNTRY,
DEST,
WEEK,
IS_SAT,
IS_SUN
)
Data file:
BHS Mobile Bahamas - Mobile 0.1430 1 "242357, 242359, 242375, 242376, 242395, 242421, 242422, 242423, 242424, 242425, 242426, 242427, 242428, 242429, 242431, 242432, 242433, 242434, 242435, 242436, 242437, 242438, 242439, 242441, 242442, 242443, 242445, 242446, 242447, 242448, 242449, 242451, 242452, 242453, 242454, 242455, 242456, 242457, 242458, 242462, 242463, 242464, 242465, 242466, 242467, 242468, 24247, 242524, 242525, 242533, 242535, 242544, 242551, 242552, 242553, 242554, 242556, 242557, 242558, 242559, 242565, 242577, 242636, 242646, 242727"
BOL Mobile ENTEL Bolivia - Mobile Entel 0.0865 Increase 591 "67, 68, 71, 72, 73, 740, 7410, 7411, 7412, 7413, 7414, 7415, 7420, 7421, 7422, 7423, 7424, 7425, 7430, 7431, 7432, 7433, 7434, 7435, 7436, 7437, 7440, 7441, 7442, 7443, 7444, 7445, 7450, 7451, 7452, 7453, 7454, 7455, 746, 7470, 7471, 7472, 7475, 7476, 7477, 7480, 7481, 7482, 7483, 7484, 7485, 7486, 7490, 7491, 7492, 7493, 7494, 7495, 7496"
Thank you.
Hi,
Thank you for your help. I found the solution: in my control file I added
char(40000) OPTIONALLY ENCLOSED BY '"' to the DEST field.
LOAD DATA
INTO TABLE TEMP
APPEND
FIELDS TERMINATED BY X'9' TRAILING NULLCOLS
(
CODE,
DESC,
RATE,
INCREASE,
COUNTRY,
DEST char(40000) OPTIONALLY ENCLOSED BY '"',
WEEK,
IS_SAT,
IS_SUN
)
Thank you for your help. -
Error while running bulk load utility for account data with CSV file
Hi All,
I'm trying to run the bulk load utility for account data using a CSV file, but I'm getting the following error:
ERROR ==> The number of CSV files provided as input does not match with the number of account tables.
Thanks in advance.
Please check your child table:
http://docs.oracle.com/cd/E28389_01/doc.1111/e14309/bulkload.htm#CHDCGGDA
-kuldeep -
Error when Bulk load hierarchy data
Hi,
While loading P6 Reporting databases, the following error message appears at the step in charge of "Bulk load hierarchy data into ODS".
<04.29.2011 14:03:59> load [INFO] (Message) - === Bulk load hierarchy data into ODS (ETL_LOADWBSHierarchy.ldr)
<04.29.2011 14:04:26> load [INFO] (Message) - Load completed - logical record count 384102.
<04.29.2011 14:04:26> load [ERROR] (Message) - SqlLoaderSQL LOADER ACTION FAILED. [control=D:\oracle\app\product\11.1.0\db_1\p6rdb\scripts\DATA_WBSHierarchy.csv.ldr] [file=D:\oracle\app\product\11.1.0\db_1\p6rdb\temp\WBSHierarchy\DATA_WBSHierarchy.csv]
<04.29.2011 14:04:26> load [INFO] (Progress) - Step 3/9 Part 5/6 - FAILED (-1) (0 hours, 0 minutes, 28 seconds, 16 milliseconds)
Checking the corresponding log file (see below), I see that some records are indeed rejected. The question is: how can I identify the source of the problem and fix it?
SQL*Loader: Release 11.1.0.6.0 - Production on Mon May 2 09:03:22 2011
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Control File: DATA_WBSHierarchy.csv.ldr
Character Set UTF16 specified for all input.
Using character length semantics.
Byteorder little endian specified.
Data File: D:\oracle\app\product\11.1.0\db_1\p6rdb\temp\WBSHierarchy\DATA_WBSHierarchy.csv
Bad File: DATA_WBSHierarchy.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table WBSHIERARCHY, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
PARENTOBJECTID FIRST * WHT CHARACTER
PARENTPROJECTID NEXT * WHT CHARACTER
PARENTSEQUENCENUMBER NEXT * WHT CHARACTER
PARENTNAME NEXT * WHT CHARACTER
PARENTID NEXT * WHT CHARACTER
CHILDOBJECTID NEXT * WHT CHARACTER
CHILDPROJECTID NEXT * WHT CHARACTER
CHILDSEQUENCENUMBER NEXT * WHT CHARACTER
CHILDNAME NEXT * WHT CHARACTER
CHILDID NEXT * WHT CHARACTER
PARENTLEVELSBELOWROOT NEXT * WHT CHARACTER
CHILDLEVELSBELOWROOT NEXT * WHT CHARACTER
LEVELSBETWEEN NEXT * WHT CHARACTER
CHILDHASCHILDREN NEXT * WHT CHARACTER
FULLPATHNAME NEXT 8000 WHT CHARACTER
SKEY SEQUENCE (MAX, 1)
value used for ROWS parameter changed from 64 to 21
Record 14359: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 14360: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 14361: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 27457: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 27458: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 27459: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 38775: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 38776: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 38777: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 52411: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 52412: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 52413: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 114619: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 114620: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 127921: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 127922: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 164588: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 164589: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 171322: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 171323: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 186779: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 186780: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 208687: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 208688: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 221167: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 221168: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Record 246951: Rejected - Error on table WBSHIERARCHY, column PARENTLEVELSBELOWROOT.
ORA-01400: cannot insert NULL into ("ODSUSER"."WBSHIERARCHY"."PARENTLEVELSBELOWROOT")
Record 246952: Rejected - Error on table WBSHIERARCHY, column PARENTOBJECTID.
ORA-01722: invalid number
Table WBSHIERARCHY:
384074 Rows successfully loaded.
28 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 244377 bytes(21 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 384102
Total logical records rejected: 28
Total logical records discarded: 0
Run began on Mon May 02 09:03:22 2011
Run ended on Mon May 02 09:04:07 2011
Elapsed time was: 00:00:44.99
Hi Mandeep,
Thanks for the information.
But it still does not seem to work.
Actually, I have Group ID and Group Name as display fields in the hierarchy table.
Group ID I have directly mapped to Group ID.
I have created a split hierarchy of Group Name and mapped it.
I have also made all the configuration options as per your suggestions, but it still does not work.
Can you please help.
Thanks,
Priya. -
SQL Loader - Field in data file exceeds maximum length
Dear All,
I have a file which has more than 4000 characters in a field, and I wish to load the data into a table with field length = 4000, but I receive the error:
Field in data file exceeds maximum length
Below are the scripts and control file.
Table creation script:
CREATE TABLE "TEST_TAB" (
"STR" VARCHAR2(4000 BYTE),
"STR2" VARCHAR2(4000 BYTE),
"STR3" VARCHAR2(4000 BYTE)
);
Control file:
LOAD DATA
INFILE 'C:\table_export.txt'
APPEND INTO TABLE TEST_TAB
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
( STR CHAR(4000) "SUBSTR(:STR,1,4000)" ,
STR2 CHAR(4000) "SUBSTR(:STR2,1,4000)" ,
STR3 CHAR(4000) "SUBSTR(:STR3,1,4000)"
)
Log:
SQL*Loader: Release 10.2.0.1.0 - Production on Mon Jul 26 16:06:25 2010
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: C:\TEST_TAB.CTL
Data File: C:\table_export.txt
Bad File: C:\TEST_TAB.BAD
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 0
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table TEST_TAB, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
STR FIRST 4000 | CHARACTER
SQL string for column : "SUBSTR(:STR,1,4000)"
STR2 NEXT 4000 | CHARACTER
SQL string for column : "SUBSTR(:STR2,1,4000)"
STR3 NEXT 4000 | CHARACTER
SQL string for column : "SUBSTR(:STR3,1,4000)"
value used for ROWS parameter changed from 64 to 21
Record 1: Rejected - Error on table TEST_TAB, column STR.
Field in data file exceeds maximum length
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table TEST_TAB:
0 Rows successfully loaded.
1 Row not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 252126 bytes(21 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 1
Total logical records rejected: 1
Total logical records discarded: 0
Run began on Mon Jul 26 16:06:25 2010
Run ended on Mon Jul 26 16:06:25 2010
Elapsed time was: 00:00:00.22
CPU time was: 00:00:00.15
Please suggest a way to get it done.
Thanks for reading the post!
009
Hi Toni,
Thanks for the reply.
Do you mean this?
CREATE TABLE "TEST"."TEST_TAB" (
"STR" VARCHAR2(4001),
"STR2" VARCHAR2(4001),
"STR3" VARCHAR2(4001)
);
However, this does not work, as the error would be:
Error at Command Line:8 Column:20
Error report:
SQL Error: ORA-00910: specified length too long for its datatype
00910. 00000 - "specified length too long for its datatype"
*Cause: for datatypes CHAR and RAW, the length specified was > 2000;
otherwise, the length specified was > 4000.
*Action: use a shorter length or switch to a datatype permitting a
longer length such as a VARCHAR2, LONG CHAR, or LONG RAW
009
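Since VARCHAR2(4001) is rejected (the limit in this release is 4000 bytes), one hedged alternative is to switch the columns to CLOB and keep wide CHAR(n) fields in the control file, along the lines of the CLOB threads above. A sketch, reusing the table and column names from the post:

```sql
-- Sketch: a CLOB column sidesteps the VARCHAR2 4000-byte limit,
-- so SQL*Loader can load fields longer than the column could hold before.
CREATE TABLE "TEST_TAB" (
  "STR"  CLOB,
  "STR2" CLOB,
  "STR3" CLOB
);
-- In the control file, size each field buffer above the data length,
-- e.g.:  STR CHAR(10000), STR2 CHAR(10000), STR3 CHAR(10000)
```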
Edited by: 009 on Jul 28, 2010 6:15 AM -
Loader- Field in data file exceeds maximum length
Hi,
I am getting an error while loading the data, even though the data size of this column is less than 4000 and I defined the column as OBJ_ADDN_INFO CLOB.
Please help
==================
Record 1: Rejected - Error on table APPS.CG_COMPARATIVE_MATRIX_TAB, column OBJ_ADDN_INFO.
Field in data file exceeds maximum length
LOAD DATA
infile *
REPLACE
INTO TABLE APPS.CG_COMPARATIVE_MATRIX_TAB
FIELDS TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
( APPS_VERSION,
MODULE_SHORT_NAME,
CATEGORY,
MODULE,
OBJECT_NAME,
OBJECT_TYPE,
OBJECT_STATUS,
FUNCTION_NAME,
OBJ_ADDN_INFO
)
begindata
"12",DBI,Oracle Daily Business Intelligence,DBI for Depot Repair,ISC_DEPOT_RO_INIT,PROGRAM,Changed,"Initial Load - Update Depot Repair Order Base Summary","The ISC_DR_REPAIR_ORDERS_F fact has a new column FLOW_SATUS_ID. The FLOW_STATUS_ID contains a user-defined Status for a Repair Order. The STATUS Column will continue to store the Status, now called State of the Repair Order i.e. O , C , D , H . The Initial Load incorporates the additional column FLOW_STATUS_ID. The Incremental Load s merge statement is modified to collect or update the additional column FLOW_STATUS_ID also. ","DBI for Depot Repair"
"12",DBI,Oracle Daily Business Intelligence,DBI for Depot Repair,ISC_DEPOT_RO_INCR,PROGRAM,Changed,"Update Depot Repair Orders Base Summary","The ISC_DR_REPAIR_ORDERS_F fact has a new column FLOW_SATUS_ID. The FLOW_STATUS_ID contains a user-defined Status for a Repair Order. The STATUS Column will continue to store the Status, now called State of the Repair Order i.e. O , C , D , H . The Initial Load incorporates the additional column FLOW_STATUS_ID. The Incremental Load s merge statement is modified to collect or update the additional column FLOW_STATUS_ID also. ","DBI for Depot Repair"
If you don't specify a data type for a data field in the SQL*Loader control file, SQL*Loader assumes the data type is CHAR(255). If you have data that is larger than that, then you can't rely on the default. Try changing the control file to:
LOAD DATA
infile *
REPLACE
INTO TABLE APPS.CG_COMPARATIVE_MATRIX_TAB
FIELDS TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
( APPS_VERSION,
MODULE_SHORT_NAME,
CATEGORY,
MODULE,
OBJECT_NAME,
OBJECT_TYPE,
OBJECT_STATUS,
FUNCTION_NAME,
OBJ_ADDN_INFO char(4000)
)
On load, getting error: Field in data file exceeds maximum length
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
PL/SQL Release 11.2.0.3.0 - Production
CORE 11.2.0.3.0 Production
TNS for Solaris: Version 11.2.0.3.0 - Production
NLSRTL Version 11.2.0.3.0 - Production
I'm trying to load a table, small in size (110 rows, 6 columns). One of the columns, called NOTES, errors when I run the load: it says the column size exceeds the maximum limit. As you can see here, the table column is set to 4000 bytes:
CREATE TABLE NRIS.NRN_REPORT_NOTES (
NOTES_CN VARCHAR2(40 BYTE) DEFAULT sys_guid() NOT NULL,
REPORT_GROUP VARCHAR2(100 BYTE) NOT NULL,
AREACODE VARCHAR2(50 BYTE) NOT NULL,
ROUND NUMBER(3) NOT NULL,
NOTES VARCHAR2(4000 BYTE),
LAST_UPDATE TIMESTAMP(6) WITH TIME ZONE DEFAULT systimestamp NOT NULL
)
TABLESPACE USERS
RESULT_CACHE (MODE DEFAULT)
PCTUSED 0
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
INITIAL 80K
NEXT 1M
MINEXTENTS 1
MAXEXTENTS UNLIMITED
PCTINCREASE 0
BUFFER_POOL DEFAULT
FLASH_CACHE DEFAULT
CELL_FLASH_CACHE DEFAULT
)
LOGGING
NOCOMPRESS
NOCACHE
NOPARALLEL
MONITORING;
I did a little investigating, and it doesn't add up.
when i run
select max(lengthb(notes)) from NRIS.NRN_REPORT_NOTES
I get a return of
643
That tells me that the largest size instance of that column is only 643 bytes. But EVERY insert is failing.
Here is the loader file header, and first couple of inserts:
LOAD DATA
INFILE *
BADFILE './NRIS.NRN_REPORT_NOTES.BAD'
DISCARDFILE './NRIS.NRN_REPORT_NOTES.DSC'
APPEND INTO TABLE NRIS.NRN_REPORT_NOTES
Fields terminated by ";" Optionally enclosed by '|'
( NOTES_CN,
REPORT_GROUP,
AREACODE,
ROUND NULLIF (ROUND="NULL"),
NOTES,
LAST_UPDATE TIMESTAMP WITH TIME ZONE "MM/DD/YYYY HH24:MI:SS.FF9 TZR" NULLIF (LAST_UPDATE="NULL")
)
BEGINDATA
|E2ACF256F01F46A7E0440003BA0F14C2|;|DEMOGRAPHICS|;|A01003|;3;|Demographic results show that 46 percent of visits are made by females. Among racial and ethnic minorities, the most commonly encountered are Native American (4%) and Hispanic / Latino (2%). The age distribution shows that the Bitterroot has a relatively small proportion of children under age 16 (14%) in the visiting population. People over the age of 60 account for about 22% of visits. Most of the visitation is from the local area. More than 85% of visits come from people who live within 50 miles.|;07/29/2013 16:09:27.000000000 -06:00
|E2ACF256F02046A7E0440003BA0F14C2|;|VISIT DESCRIPTION|;|A01003|;3;|Most visits to the Bitterroot are fairly short. Over half of the visits last less than 3 hours. The median length of visit to overnight sites is about 43 hours, or about 2 days. The average Wilderness visit lasts only about 6 hours, although more than half of those visits are shorter than 3 hours long. Most visits come from people who are fairly frequent visitors. Over thirty percent are made by people who visit between 40 and 100 times per year. Another 8 percent of visits are from people who report visiting more than 100 times per year.|;07/29/2013 16:09:27.000000000 -06:00
|E2ACF256F02146A7E0440003BA0F14C2|;|ACTIVITIES|;|A01003|;3;|The most frequently reported primary activity is hiking/walking (42%), followed by downhill skiing (12%), and hunting (8%). Over half of the visits report participating in relaxing and viewing scenery.|;07/29/2013 16:09:27.000000000 -06:00
Here is the full beginning of the loader log, ending after the first row return. (They ALL say the same error)
SQL*Loader: Release 10.2.0.4.0 - Production on Thu Aug 22 12:09:07 2013
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Control File: NRIS.NRN_REPORT_NOTES.ctl
Data File: NRIS.NRN_REPORT_NOTES.ctl
Bad File: ./NRIS.NRN_REPORT_NOTES.BAD
Discard File: ./NRIS.NRN_REPORT_NOTES.DSC
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table NRIS.NRN_REPORT_NOTES, loaded from every logical record.
Insert option in effect for this table: APPEND
Column Name Position Len Term Encl Datatype
NOTES_CN FIRST * ; O(|) CHARACTER
REPORT_GROUP NEXT * ; O(|) CHARACTER
AREACODE NEXT * ; O(|) CHARACTER
ROUND NEXT * ; O(|) CHARACTER
NULL if ROUND = 0X4e554c4c(character 'NULL')
NOTES NEXT * ; O(|) CHARACTER
LAST_UPDATE NEXT * ; O(|) DATETIME MM/DD/YYYY HH24:MI:SS.FF9 TZR
NULL if LAST_UPDATE = 0X4e554c4c(character 'NULL')
Record 1: Rejected - Error on table NRIS.NRN_REPORT_NOTES, column NOTES.
Field in data file exceeds maximum length...
I am not seeing why this would be failing.
Hi,
the problem is that delimited data defaults to CHAR(255)..... very helpful, I know.....
What you need to do is tell sqlldr that the data is longer than this,
so change NOTES to NOTES CHAR(4000) in your control file and it should work.
cheers,
harry
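Putting that suggestion into the posted control file, only the NOTES line changes (everything else is as in the original):

```
APPEND INTO TABLE NRIS.NRN_REPORT_NOTES
Fields terminated by ";" Optionally enclosed by '|'
( NOTES_CN,
  REPORT_GROUP,
  AREACODE,
  ROUND NULLIF (ROUND="NULL"),
  NOTES CHAR(4000),
  LAST_UPDATE TIMESTAMP WITH TIME ZONE "MM/DD/YYYY HH24:MI:SS.FF9 TZR" NULLIF (LAST_UPDATE="NULL")
)
```

This tells SQL*Loader to allocate a 4000-byte buffer for the NOTES field instead of the 255-byte default for delimited fields, which is why even a 643-byte value was being rejected.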