SQL Loader to append data in the same table but using different WHEN clauses
In my data file I have a header record and a detail record, identified by RECORD_TYPE = 1 and 2 respectively.
The database table has all the columns to capture detail records, but now I also want to capture just one column of the header record in my existing table. So I have added that column (DATA_DATE) to my table, but how do I capture that value?
I'm writing my control file using two WHEN clauses, something like:
load data
into table t_bdn
append
when RECORD_TYPE = '2'
FIELDS TERMINATED BY "|" TRAILING NULLCOLS
SEQUENCE_NO
, RECORD_TYPE
, DISTRIBUTOR_CODE
, SUPPLIER_CODE
, SUPPLIER_DISTRIBUTOR_CODE
, DISTRIBUTOR_SKU
, SUPPLIER_SKU
when RECORD_TYPE = '1'
FIELDS TERMINATED BY "|" TRAILING NULLCOLS
SEQUENCE_NO FILLER
, RECORD_TYPE FILLER
, CREATE_DATE FILLER
, DATA_DATE "NVL(to_date(:DATA_DATE, 'YYYY/MM/DD'),to_date('9999/12/31', 'YYYY/MM/DD'))"
I'm getting the error "expecting INTO and found WHEN RECORD_TYPE = '1'".
If I give INTO a second time it will append a new row altogether in my table, but I want the same row to be updated with this DATA_DATE value coming from RECORD_TYPE = 1. The header record has only 4 delimited text fields, and I am interested in fetching just the 4th column.
Kindly suggest what to do?
Ravneek, I could be wrong, but sqlldr is a 'load' program; that is, it inserts data. I am unaware of any ability to update existing rows as you seem to want. What you appear to want to do is more the job of a MERGE statement.
I would look at writing a Pro* language, .NET, or Java program to perform inserts where some or all of the newly inserted rows are also to be updated.
From the manual: (Oracle® Database Utilities 10g Release 2 (10.2) Part Number B14215-01)
Updating Existing Rows
The REPLACE method is a table replacement, not a replacement of individual rows. SQL*Loader does not update existing records, even if they have null columns. To update existing rows, use the following procedure:
1. Load your data into a work table.
2. Use the SQL language UPDATE statement with correlated subqueries.
3. Drop the work table.
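A minimal sketch of that procedure applied to the original DATA_DATE question (hypothetical object names; it assumes one header record per data file):

```sql
-- 1. Load the RECORD_TYPE = 1 line into a work table with SQL*Loader,
--    e.g. a one-column table:  CREATE TABLE t_bdn_hdr_work (data_date DATE);
-- 2. Propagate the header date to the detail rows just loaded:
UPDATE t_bdn
   SET data_date = (SELECT MAX(w.data_date) FROM t_bdn_hdr_work w)
 WHERE data_date IS NULL;
-- 3. Drop (or truncate) the work table once the load is finished:
DROP TABLE t_bdn_hdr_work;
```

The MAX() is only there to guarantee a single-row subquery; with exactly one header row per file it simply returns that row's date.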
HTH -- Mark D Powell --
Similar Messages
-
Appending data from one table to another
Hello
How do I append data from one table t1 to another table t2?
t1 and t2 have the same structure.
t2 already contains data, so I don't want to delete it and recreate it as select * from t1.
Is there a means to add t1's content without altering t2's content?
Thanks in advance

insert into t2
select * from t1; -
How to skip specific field data in a datafile when loading with Oracle 8i SQL*Loader
Product: ORACLE SERVER
Date written: 2002-04-09
How to skip specific field data in a datafile when loading with Oracle 8i SQL*Loader
===========================================================================
As in the example below, when variable-length fields are separated by a
delimiter such as ',' or '|', the FILLER field designation, available from
Oracle 8i onward, can be used to mark a field so that it is skipped on insert.
<Example>
TABLE : skiptab
===========================
col1 varchar2(20)
col2 varchar2(20)
col3 varchar2(20)
CONTROL FILE: skip.ctl
load data
infile skip.dat
into table skiptab
fields terminated by ","
(col1 char,
col2 filler char,
col3 char)
DATAFILE : skip.dat
SMITH, DALLAS, RESEARCH
ALLEN, CHICAGO, SALES
WARD, CHICAGO, SALES
data loading :
$sqlldr scott/tiger control=skip.ctl
Result:
COL1 COL3
SMITH RESEARCH
ALLEN SALES
WARD SALES -
OWB11gR2 - simple and easy way to load XML formatted data into db tables?
Hi,
we're currently trying to load table data stored in XML files into our datawarehouse using OWB 11gR2.
However, we're finding this is not quite as trivial as loading flat files...
Most postings on this forum point to the blog entry titled "Leveraging XDB" found here (http://blogs.oracle.com/warehousebuilder/2007/09/leveraging_xdb.html).
This blog also references the zip file owb_xml_etl_utils.zip, which seems to have disappeared from its original location and can now be found on SourceForge.
Anyway, the solution described is for OWB 10g, and when trying to import experts from the zip file etc., we end up not being able to run the "Create ETL from XSD" expert, as the 11gR2 client is different from the 10g client and does not have the Experts menu et al.
Also, this solution was published over 3 years ago, and it seems rather strange that importing XML-formatted data should still be so cumbersome in the newer Warehouse Builder releases.
The OWB 11gR2 documentation is very sparse (or rather - quite empty) on how to load XML data, all it has is a few lines on "XML Transformations", giving no clue as to how one goes about loading data.
Is this really the state of things? Or are we missing some vital information here?
We'd have thought that with 11g-releases, loading XML-data would be rather simple, quick and painless?
Is there somewhere besides the blog mentioned above where we can find simple and to the point guidelines for OWB 11gR2 on how to load XML-formatted data into Oracle tables?
Regards,
-Haakon-

Yes, it is possible to use SQL*Loader to parse and load XML, but that is not what it was designed for and so it is not recommended. You also don't need to register a schema just to load/store/parse XML in the DB.
So where does that leave you?
Some options
{thread:id=410714} (see page 2)
{thread:id=1090681}
{thread:id=1070213}
Those talk a bit about storage options, reading XML in from disk, and parsing XML. They should also give you options to consider. Without knowing more about your requirements for the effort, it is difficult to give specific advice.
Maybe your 7-8 tables don't exist, and so using Object Relational Storage for the XML would be the best solution, as you can query/update the tables that Oracle creates based off the schema associated with the XML. Maybe an External Table definition works better for reading the XML into the system, because this process will happen just once. Maybe using WebDAV makes more sense for loading XML to be parsed (I don't have much experience with this; I just know it is possible from what I've read on the forums). Also, your version makes a difference, as you have different options available depending upon the version of Oracle.
Hope all that helps as a starter.
Edited by: A_Non on Jul 8, 2010 4:31 PM
A great example, see the answers by mdrake in {thread:id=1096784} -
How to load Matrix report data into basic table data using ODI
Hi,
How do I load matrix report data into basic table data using Oracle Data Integrator?
Requirement Description:
Following is the matrix report data:
JOB DEPT10 DEPT20
ANALYST 6000
CLERK 1300 1900

Need to convert it into the below format:
JOB Dept Salary
ANALYST DEPT10
ANALYST DEPT20 6000
CLERK DEPT10 1300
CLERK DEPT20 1900
Thanks for your help in advance. Let me know if any further explanation is required.

Your list seems to be a little restrictive; you can do a lot more with ODI procedures.
If you create a new procedure and add a step, then in the 'command on source' tab set your technology and schema as per your source database. Use the unpivot functionality as described in the link; but rather than using 'SELECT *', use the appropriate column names and alias them, for example:
SELECT job as job,
       deptsal as deptsal,
       saldesc as saledesc
FROM pivoted_data
UNPIVOT (
  deptsal                                  --<-- unpivot_clause
  FOR saldesc                              --<-- unpivot_for_clause
  IN (d10_sal, d20_sal, d30_sal, d40_sal)  --<-- unpivot_in_clause
)
Then in your 'command on target' tab set the technology and schema to your target DB, then put your INSERT statement, for example:
INSERT INTO job_sales
  (job,
   deptsal,
   saledesc)
VALUES
  (:job,
   :deptsal,
   :saledesc)
Therefore you are using bind variables from source to load data into target.
Obviously, if the source and target tables are in the same database, then you can have it all in one statement in the 'command on target', as:
INSERT INTO job_sales
  (job,
   deptsal,
   saledesc)
SELECT job as job,
       deptsal as deptsal,
       saldesc as saledesc
FROM pivoted_data
UNPIVOT (
  deptsal                                  --<-- unpivot_clause
  FOR saldesc                              --<-- unpivot_for_clause
  IN (d10_sal, d20_sal, d30_sal, d40_sal)  --<-- unpivot_in_clause
)
Also set the log counter to 'Insert' on the tab where your INSERT statement is, so you know how many rows you insert into the table.
Hope this helps.
BUT remember that this UNPIVOT feature only came out in Oracle 11g. -
How to append data runtime in Table in MIDlet
Hi Friends,
I have 2 queries:
1st: How can I append data to a table at runtime in a MIDlet (like on the web), where the data comes from a database?
2nd: The requirement is that the 1st row of the table should be for headings like StartDate, EndDate, Resources and Status.
From the 2nd row on, the columns are for runtime data. (Like any table; I hope you all got my query.)
Please send me a reply as early as possible.
Waiting for a reply
Best regards
karan

Presently you cannot use AJAX kind of stuff in J2ME.
If you want to achieve this functionality, it is better to look at articles written on writing custom items in J2ME. That will help you achieve the kind of requirement that you are expecting.
~Mohan -
How to insert one table data into multiple tables by using procedure?
How to insert one table data into multiple tables by using procedure?
Below is a simple procedure; try this:
CREATE OR REPLACE PROCEDURE test_proc
AS
BEGIN
  INSERT ALL
    INTO emp_test1
    INTO emp_test2
  SELECT * FROM emp;
END;
/
If you want more examples you can refer below link
multi-table inserts in oracle 9i
Message was edited by: 000000 -
My password works fine when I turn the computer on, but when I try to wake it from sleep mode the computer doesn't recognize my password -- the same one I used successfully when I turned the computer on.
Okay, I seem to have solved the problem I posted about earlier. I noticed that the language preference in the upper right hand corner of the screen (between the date and time and the Spotlight icon) had somehow gotten switched from American English to French. When I switched it back to American English, my password worked when waking the computer from sleep mode. I have no idea why changing the language preference would affect the way the password works but that seems to have been the case. I'll post again if I experience more problems.
-
Problem with SQL*Loader and different date formats in the same file
DB: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
System: AIX 5.3.0.0
Hello,
I'm using SQL*Loader to import semicolon-separated values into a table. The files are delivered to us by a data provider who concatenates data from different sources, and this results in different date formats within the same file. For example:
...;2010-12-31;22/11/1932;...
I load this data using the following lines in the control file:
EXECUTIONDATE1 TIMESTAMP NULLIF EXECUTIONDATE1=BLANKS "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
DELDOB TIMESTAMP NULLIF DELDOB=BLANKS "TO_DATE(:DELDOB, 'DD/MM/YYYY')",
The relevant NLS parameters:
NLS_LANGUAGE=FRENCH
NLS_DATE_FORMAT=DD/MM/RR
NLS_DATE_LANGUAGE=FRENCH
If I load this file as is, the values loaded into the table are 31 Dec 2010 and 22 Nov *2032*, even though the years are on 4 digits. If I change NLS_DATE_FORMAT to DD/MM/YYYY then the second date value will be loaded correctly, but the first value will be loaded as 31 Dec *2020*!
How can I get both date values to load correctly?
Thanks!
Sylvain

This is very strange; after running a few tests I realized that if the year is 19XX then it gets loaded as 2019, and if it is 20XX then it becomes 2020. I'm guessing it may have something to do with certain env variables that aren't set up properly, because I'm fairly sure my SQL*Loader control file is correct... I'll run more tests :-(
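One avenue worth testing (just a sketch, not verified against this exact file): declare the two fields as plain CHAR instead of TIMESTAMP, so that no NLS-driven datatype conversion happens before the SQL string runs, and let the explicit masks do all the parsing:

```
EXECUTIONDATE1 CHAR NULLIF EXECUTIONDATE1=BLANKS "TO_DATE(:EXECUTIONDATE1, 'YYYY-MM-DD')",
DELDOB CHAR NULLIF DELDOB=BLANKS "TO_DATE(:DELDOB, 'DD/MM/YYYY')",
```

With CHAR, the raw text of each field is bound into the TO_DATE expression, so NLS_DATE_FORMAT never enters the picture.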
-
SQL Loader - read 1st line to one table, rest of data to another
Hi
I looked around the FAQs and forums and find similar cases but not mine...
I am running Oracle 9i and have a text file which has the 1st line as a control header and everything beneath it as data, something like this:
14/07/2010|8
12345678|0
12345679|0
12345680|10.87
12345681|7655.8
12345682|100
12345683|0
12345684|-90.44
12345685|0
The first (header) line has a date field and a counter (the number of records expected beneath it)
The rest of the data is an account number and balance.
Since SQL Loader is invoked outside of Oracle (Unix in my case) I assume I should create two tables, such as:
Create table TIF_CURRENT_BALANCE_DTL
(
ACCOUNT_REF_NO VARCHAR2(30),
BALANCE_AMT NUMBER(12,2)
);

Create table TIF_CURRENT_BALANCE_HDR
(
HDR_DATE DATE,
HDR_COUNT NUMBER(10)
);

And use a control file which will load line 1 into TIF_CURRENT_BALANCE_HDR and the other lines (SKIP=1) into TIF_CURRENT_BALANCE_DTL.
Since the header/detail lines are not (necessarily) distinguishable, is there a way to achieve this without modifying the input file in any way?
Thanks
Martin

Thanks for your reply - the solution should not be OS dependent, as it will run on both Linux and UNIX installations.
The DB will be (for now) Oracle9i Enterprise Edition Release 9.2.0.4.0 - 64bit Production
I looked at that web page you provided and had some hope with the ROWS option, but this is the number of rows to load before each commit.
Any other solutions?
I know I could load it to a common table (text and number) and then convert the values accordingly in the PL/SQL procedure which will run afterwards, but I feel loading the data in the final format is going to be faster.
I am considering using SQL*Loader to load records (skipping row 1) into the DTL table, and within the PL/SQL loading the 1st record and performing validation. Since the file has approx 2 million rows and is processed daily, 1.99999 million records read with SQL*Loader and 1 with conventional methods will still be a vast improvement!
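For what it's worth, here is an untested control-file sketch of a single-pass alternative (field names taken from the table definitions above). It assumes the '/' at the third character of the header's date field never appears at that position in a detail line, and uses POSITION(1) to make each INTO clause re-scan the record from the start:

```
LOAD DATA
INFILE 'balances.dat'
APPEND
INTO TABLE TIF_CURRENT_BALANCE_HDR
WHEN (3:3) = '/'
FIELDS TERMINATED BY '|'
( HDR_DATE POSITION(1) DATE "DD/MM/YYYY",
  HDR_COUNT )
INTO TABLE TIF_CURRENT_BALANCE_DTL
WHEN (3:3) != '/'
FIELDS TERMINATED BY '|'
( ACCOUNT_REF_NO POSITION(1),
  BALANCE_AMT )
```

Each record is then routed to exactly one table by the WHEN clauses, with no OS-level preprocessing of the file.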
Thanks -
SQL*Loader job exits unexpectedly and causes table to be locked with NOWAIT
I have a weekly report job that I run where I have to load about 48 logs with about 750k rows of data in each log. To facilitate this, we have been using a Java job that runs SQL*Loader as an external Process (using ProcessBuilder), one after the other. Recently however, this process has been terminating abnormally during the load which is causing a lock on the table and basically causes the process to grind to a halt until we can open a ticket with the DB team to kill the session that is hung. Is there maybe a better way to handle this upload process than using SQL*Loader or is there some change I could make in either the control file or command line to stop it from dying a horrible death?
At the start of the process, I truncate the table that I'm loading to and then run this command line with the following control file:
COMMAND LINE:
C:\Oracle\ora92\BIN\SQLLDR.EXE userid=ID/PASS@DB_ID load=10000000 rows=100000 DIRECT=TRUE SKIP_INDEX_MAINTENANCE=TRUE control=ControlFile.ctl data=logfile.log
CONTROL FILE:
UNRECOVERABLE
Load DATA
INFILE *
Append
PRESERVE BLANKS
INTO TABLE MY_REPORT_TABLE
FIELDS TERMINATED BY ","
(
filler_field1 FILLER char(16),
filler_field2 FILLER char(16),
time TIMESTAMP 'MMDDYYYY-HH24MISSFF3' ENCLOSED BY '"',
partne ENCLOSED BY '"',
trans ENCLOSED BY '"',
vendor ENCLOSED BY '"' "SUBSTR(:vendor, 1, 1)",
filler_field4 FILLER ENCLOSED BY '"',
cache_hit_count,
cache_get_count,
wiz_trans_count,
wiz_req_size,
wiz_res_size,
wiz_trans_time,
dc_trans_time,
hostname ENCLOSED BY '"',
trans_list CHAR(2048) ENCLOSED BY '"' "SUBSTR(:trans_list, 1, 256)",
timeouts,
success ENCLOSED BY '"'
)
Once all of the logs have finished loading, I rebuild the indexes on the table and then start the report process. It seems like it's just dying on random logs now, re-running the process it will fail at a different point each time.
EDIT: The reasons for the UNRECOVERABLE and SKIP_INDEX_MAINTENANCE are to speed the load up. As it is, it still can take 7-12 minutes for each log to load, it's even worse without those on. Overall it's taking about 18 hours for this process to run from start to finish.
Edited by: user6676140 on Jul 7, 2011 11:37 AM

Please note that my post stated: "I have opened a ticket with Oracle support. After 6 days I have not had the help that I need."
I also agree that applying the latest PSU is a Best Practice, which Oracle defines as "a cumulative collection of high impact, low risk, and proven fixes for a specific product or component".
With that statement I feel there should not be the drastic issues that we have seen. Our policy is to always apply PSUs, no matter what the product or component, without issue.
Except for now. We did our research, and only open an Oracle ticket when we need expert help. That has not been forthcoming from them, but we are still working the ticket.
Hence, I opened this forum because many times I have found help here, where others have faced the same issue and now have an insight. When having a serious problem I like to use all of my resources, this forum being one of those.
To restate the question:
(1) 97% of our databases reside on RAC. From the Search List for Databases, we do not see the columns Sessions:CPU, Sessions: I/O, Sessions: Other, Instance CPU%, and are told this is working as designed because you must monitor the instance, not the database, with RAC.
(a) After applying PSU2 the Oracle Load Map no longer showed any databases.
All of this in (1) is making the tool less useful for monitoring at the database level, which we do most of the time.
(2) Within a few days of applying PSU2, we couldn't log into EM and got the error "Authentication failed. If problem persists, contact your system administrator."
(b) searching through the emoms.trc files we found the errors listed above posting frantically.
After rolling back PSU we are back in business.
However, there is still the need to remain current with the components of EM.
I am looking for suggestion, insights, experience. While I appreciate Akanksha answering so quickly, a recommendation to open an SR is not what I need.
Sherrie -
SQL Loader - Field in data file exceeds maximum length
Dear All,
I have a file which has more than 4000 characters in a field, and I wish to load the data into a table with field length = 4000, but I receive the error:
Field in data file exceeds maximum length
Below are the scripts and ctl file.
Table creation script:
CREATE TABLE "TEST_TAB"
(
"STR" VARCHAR2(4000 BYTE),
"STR2" VARCHAR2(4000 BYTE),
"STR3" VARCHAR2(4000 BYTE)
);

Control file:
LOAD DATA
INFILE 'C:\table_export.txt'
APPEND INTO TABLE TEST_TAB
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
( STR CHAR(4000) "SUBSTR(:STR,1,4000)" ,
STR2 CHAR(4000) "SUBSTR(:STR2,1,4000)" ,
STR3 CHAR(4000) "SUBSTR(:STR3,1,4000)"
)Log:
SQL*Loader: Release 10.2.0.1.0 - Production on Mon Jul 26 16:06:25 2010
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: C:\TEST_TAB.CTL
Data File: C:\table_export.txt
Bad File: C:\TEST_TAB.BAD
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 0
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table TEST_TAB, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
STR FIRST 4000 | CHARACTER
SQL string for column : "SUBSTR(:STR,1,4000)"
STR2 NEXT 4000 | CHARACTER
SQL string for column : "SUBSTR(:STR2,1,4000)"
STR3 NEXT 4000 | CHARACTER
SQL string for column : "SUBSTR(:STR3,1,4000)"
value used for ROWS parameter changed from 64 to 21
Record 1: Rejected - Error on table TEST_TAB, column STR.
Field in data file exceeds maximum length
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table TEST_TAB:
0 Rows successfully loaded.
1 Row not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 252126 bytes(21 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 1
Total logical records rejected: 1
Total logical records discarded: 0
Run began on Mon Jul 26 16:06:25 2010
Run ended on Mon Jul 26 16:06:25 2010
Elapsed time was: 00:00:00.22
CPU time was: 00:00:00.15

Please suggest a way to get it done.
Thanks for reading the post!
*009*

Hi Toni,
Thanks for the reply.
Do you mean this?
CREATE TABLE "TEST"."TEST_TAB"
(
"STR" VARCHAR2(4001),
"STR2" VARCHAR2(4001),
"STR3" VARCHAR2(4001)
);

However this does not work, as the error would be:
Error at Command Line:8 Column:20
Error report:
SQL Error: ORA-00910: specified length too long for its datatype
00910. 00000 - "specified length too long for its datatype"
*Cause: for datatypes CHAR and RAW, the length specified was > 2000;
otherwise, the length specified was > 4000.
*Action: use a shorter length or switch to a datatype permitting a
longer length such as a VARCHAR2, LONG CHAR, or LONG RAW

*009*
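Stepping back from the column-size idea, a commonly cited fix for this particular error (a sketch, untested here) is to enlarge the SQL*Loader field length rather than the table column: the "Field in data file exceeds maximum length" check is applied against the declared field size before the SUBSTR expression ever runs, so declaring the fields larger than any input value lets the SUBSTR then trim them to fit VARCHAR2(4000):

```
( STR CHAR(10000) "SUBSTR(:STR,1,4000)",
  STR2 CHAR(10000) "SUBSTR(:STR2,1,4000)",
  STR3 CHAR(10000) "SUBSTR(:STR3,1,4000)"
)
```

The 10000 here is an arbitrary upper bound; it only needs to exceed the longest field actually present in the data file.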
Edited by: 009 on Jul 28, 2010 6:15 AM -
SQL Loader Problem with Date Format
Dear all,
I am dealing with a problem in loading data with SQL Loader. The problem is in the date format.
More specifically, I created the following Control File:
file.ctl
LOAD DATA
INFILE 'D:\gbal\chatium.log'
APPEND INTO TABLE CHAT_SL
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
(SL1 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
SL2 char,
SL3 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
SL4 char,
SL5 char,
SL6 char,
SL7 char,
SL8 char,
SL9 char,
SL10 char,
SL11 char,
SL12 char,
SL13 char,
SL14 char,
SL15 char)
The data we want to load are in the following file:
Apr 29, 2007 12:05:49 AM 1060615 Apr 29, 2007 12:05:35 AM 306978537730 24026384 chatium.user.userinfo WAP 0
Apr 29, 2007 12:12:51 AM 1061251 Apr 29, 2007 12:12:27 AM 306978537730 24026384 chatium.channel.list WAP 0
Apr 29, 2007 12:12:51 AM 1061264 Apr 29, 2007 12:12:32 AM 306978537730 24026384 chatium.channel.listdetail WAP 0
Apr 29, 2007 12:13:51 AM 1061321 Apr 29, 2007 12:13:31 AM 306978537730 24026384 chatium.user.search WAP 0
Apr 29, 2007 12:13:51 AM 1061330 Apr 29, 2007 12:13:37 AM 306978537730 24026384 chatium.user.userinfo WAP 0
The error log file is the following:
SQL*Loader: Release 9.2.0.1.0 - Production on Mon Apr 30 11:29:16 2007
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Control File: file.ctl
Data File: D:\gbal\chatium.log
Bad File: chatium.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table CHAT_SL, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
SL1 FIRST * WHT DATE MonDD,YYYYHH:MI:SS
SL2 NEXT * WHT CHARACTER
SL3 NEXT * WHT CHARACTER
SL4 NEXT * WHT CHARACTER
SL5 NEXT * WHT CHARACTER
SL6 NEXT * WHT CHARACTER
SL7 NEXT * WHT CHARACTER
SL8 NEXT * WHT CHARACTER
SL9 NEXT * WHT CHARACTER
SL10 NEXT * WHT CHARACTER
SL11 NEXT * WHT CHARACTER
SL12 NEXT * WHT CHARACTER
SL13 NEXT * WHT CHARACTER
SL14 NEXT * WHT CHARACTER
SL15 NEXT * WHT CHARACTER
Record 1: Rejected - Error on table CHAT_SL, column SL1.
ORA-01840: input value not long enough for date format
Record 2: Rejected - Error on table CHAT_SL, column SL1.
ORA-01840: input value not long enough for date format
Record 3: Rejected - Error on table CHAT_SL, column SL1.
ORA-01840: input value not long enough for date format
Record 4: Rejected - Error on table CHAT_SL, column SL1.
ORA-01840: input value not long enough for date format
I wonder if you could help me.
Thank you very much in advance.
Giorgos Baliotis

SQL> select to_date('Apr 29, 2007 12:05:49 AM','Mon DD, YYYY HH:MI:SS FF3AM') from dual;
select to_date('Apr 29, 2007 12:05:49 AM','Mon DD, YYYY HH:MI:SS FF3AM') from dual
ERROR at line 1:
ORA-01821: date format not recognized
SQL> ed
Wrote file afiedt.buf
1* select to_date('Apr 29, 2007 12:05:49 AM','Mon DD, YYYY HH:MI:SS AM') from dual
SQL> /
TO_DATE(
29/04/07
SQL>

Then, you defined blank space as the separator, but there are spaces inside the dates in your file. So you should add double quotes around the date fields, like below, and add OPTIONALLY ENCLOSED BY '"' to your control file.
"Apr 29, 2007 12:05:49 AM" 1060615 "Apr 29, 2007 12:05:35 AM" 306978537730 24026384 chatium.user.userinfo WAP 0
"Apr 29, 2007 12:12:51 AM" 1061251 "Apr 29, 2007 12:12:27 AM" 306978537730 24026384 chatium.channel.list WAP 0
"Apr 29, 2007 12:12:51 AM" 1061264 "Apr 29, 2007 12:12:32 AM" 306978537730 24026384 chatium.channel.listdetail WAP 0
"Apr 29, 2007 12:13:51 AM" 1061321 "Apr 29, 2007 12:13:31 AM" 306978537730 24026384 chatium.user.search WAP 0
"Apr 29, 2007 12:13:51 AM" 1061330 "Apr 29, 2007 12:13:37 AM" 306978537730 24026384 chatium.user.userinfo WAP 0

Example:
http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_concepts.htm#sthref477
Nicolas. -
SQL loader Field in data file exceeds maximum length for CLOB column
Hi all
I'm loading data from a text file separated by TABs, and I got the error below for some lines.
Even though the column is the CLOB data type, is there a limitation on the size of a CLOB data type?
The error is:
Record 74: Rejected - Error on table _TEMP, column DEST.
Field in data file exceeds maximum length
I'm using SQL*Loader and the database is Oracle 11g R2 on Linux Red Hat 5.
Here are the lines causing the error from my data file, and my table description for the test:
create table TEMP
(
CODE VARCHAR2(100),
DESC VARCHAR2(500),
RATE FLOAT,
INCREASE VARCHAR2(20),
COUNTRY VARCHAR2(500),
DEST CLOB,
WEEK VARCHAR2(10),
IS_SAT VARCHAR2(50),
IS_SUN VARCHAR2(50)
)
CONTROL FILE:
LOAD DATA
INTO TABLE TEMP
APPEND
FIELDS TERMINATED BY X'9' TRAILING NULLCOLS
(
CODE,
DESC,
RATE,
INCREASE,
COUNTRY,
DEST,
WEEK,
IS_SAT,
IS_SUN
)
Data file:
BHS Mobile Bahamas - Mobile 0.1430 1 "242357, 242359, 242375, 242376, 242395, 242421, 242422, 242423, 242424, 242425, 242426, 242427, 242428, 242429, 242431, 242432, 242433, 242434, 242435, 242436, 242437, 242438, 242439, 242441, 242442, 242443, 242445, 242446, 242447, 242448, 242449, 242451, 242452, 242453, 242454, 242455, 242456, 242457, 242458, 242462, 242463, 242464, 242465, 242466, 242467, 242468, 24247, 242524, 242525, 242533, 242535, 242544, 242551, 242552, 242553, 242554, 242556, 242557, 242558, 242559, 242565, 242577, 242636, 242646, 242727"
BOL Mobile ENTEL Bolivia - Mobile Entel 0.0865 Increase 591 "67, 68, 71, 72, 73, 740, 7410, 7411, 7412, 7413, 7414, 7415, 7420, 7421, 7422, 7423, 7424, 7425, 7430, 7431, 7432, 7433, 7434, 7435, 7436, 7437, 7440, 7441, 7442, 7443, 7444, 7445, 7450, 7451, 7452, 7453, 7454, 7455, 746, 7470, 7471, 7472, 7475, 7476, 7477, 7480, 7481, 7482, 7483, 7484, 7485, 7486, 7490, 7491, 7492, 7493, 7494, 7495, 7496"

Thank you.

Hi
Thank you for your help. I found the solution; here is what I did: in my control file I added
char(40000) OPTIONALLY ENCLOSED BY '"'.
LOAD DATA
INTO TABLE TEMP
APPEND
FIELDS TERMINATED BY X'9' TRAILING NULLCOLS
(
CODE,
DESC,
RATE,
INCREASE,
COUNTRY,
DEST char(40000) OPTIONALLY ENCLOSED BY '"',
WEEK,
IS_SAT,
IS_SUN
)
Thank you for your help. -
SQL*Loader to insert data file name during load
I'd like to use a single control file to load data from different files (at different times) into the same table. I'd like this table to have a column holding the name of the file the data came from. Is there a way for SQL*Loader to do this automatically (i.e., as opposed to running an update query separately)? I can edit the control file before each load to set a CONSTANT holding the new data file name, but I'd like it to pick this up automatically.
Thanks for any help.
-- Harvey

Hello Harvey.
I've previously attempted to store a value in a global/local OS variable and use it within a SQL*Loader control file (Unix OS and Oracle versions between 7.3.4 and 10g). I was unsuccessful in every attempt and approach I could imagine. It was very easy, however, to use a sed script to make a copy of the control file, changing a string within it.
Do you really want to store a file name on each and every record? Perhaps an alternative would be to use a relational model: create a file-upload log table that stores the file name and an upload #, and have the SQL*Loader control file call a function that reads that table for the most recent upload #. You'll save some disk space too.
Hope this helps,
Luke
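To make the sed approach concrete, here is a hypothetical wrapper sketch (all file, table, and token names are made up for illustration): keep a template control file containing a placeholder, stamp the current data file name into a working copy as a CONSTANT, and then invoke sqlldr against the copy.

```shell
#!/bin/sh
# Hypothetical data file for this run
DATAFILE="upload_20100726.dat"

# Template control file with an @DATAFILE@ placeholder
cat > loader_template.ctl <<'EOF'
LOAD DATA
APPEND INTO TABLE MY_TABLE
FIELDS TERMINATED BY ','
( COL1,
  FILE_NAME CONSTANT '@DATAFILE@' )
EOF

# Stamp the real file name into a working copy of the control file
sed "s|@DATAFILE@|$DATAFILE|" loader_template.ctl > loader_run.ctl

# The working copy now carries the file name as a CONSTANT:
grep CONSTANT loader_run.ctl

# sqlldr userid=scott/tiger control=loader_run.ctl data="$DATAFILE"   # (not run here)
```

Each load then needs no manual edit: the wrapper regenerates loader_run.ctl from the template with whatever file name it is given.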