Using Sequence in SQL Loader
Hi,
Does anybody know if I can generate unique primary keys for a database table using an Oracle sequence while inserting records with SQL*Loader?
I checked the SQL*Loader manual and found no information on how to use an Oracle sequence in the control file.
Thanks
Surajit
Yes, you can. Create the sequence (suppose you call it "PK_SEQ_X") and then reference it in your control file as "PK_SEQ_X.NEXTVAL". For example, to put it into a column named 'Y', the entry in your control file would look like: load data insert into table Z (Y "PK_SEQ_X.NEXTVAL", ....)
Note that the double quotes around the sequence name are required.
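A minimal control file illustrating this (the table, column, and sequence names are the ones used above; the data file layout and a second column DESCR are assumed for the sketch):

```sql
-- sketch: Z has a PK column Y and a data column DESCR
LOAD DATA
INFILE 'z.dat'
INSERT INTO TABLE Z
FIELDS TERMINATED BY ','
(
  DESCR CHAR,
  Y EXPRESSION "PK_SEQ_X.NEXTVAL"  -- evaluated per row; takes no input field
)
```

The EXPRESSION keyword keeps Y from consuming a field of the data file; without it, SQL*Loader still maps the column to an input field by position.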
Similar Messages
-
Using oracle sequence in SQL Loader
I'm using an Oracle sequence in the control file of SQL*Loader to load data from a .csv file.
Controlfile:
LOAD DATA APPEND
INTO TABLE PHONE_LIST
FIELDS TERMINATED BY "," TRAILING NULLCOLS
(
PHONE_LIST_ID "seqId.NEXTVAL",
COUNTRY_CODE CHAR,
CITY_CODE CHAR,
BEGIN_RANGE CHAR,
END_RANGE CHAR,
BLOCKED_FREE_FLAG CHAR
)
Datafile:
1516,8,9,9,B
1517,1,1,2,B
1518,8,9,9,B
1519,8,9,9,B
1520,8,9,9,B
1521,8,9,9,B
1) Since the first column uses an Oracle sequence, we have not supplied it in the data file.
This gives me the error "Can not insert NULL value for last column".
Is it mandatory to supply the first column in the data file even though we are using a sequence?
2) Another table references the PHONE_LIST_ID column (the one for which we use the sequence) of this table as a foreign key.
So is it possible to insert this column's values into the other table simultaneously? The sequence number should be the same as in the first table.
Kindly reply on an urgent basis.
Use a BEFORE INSERT trigger containing:
select your_seq.nextval into :new.id from dual; -
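The BEFORE INSERT trigger suggested above, sketched in full (the trigger name is hypothetical; the :new column must match your table, here PHONE_LIST_ID with the sequence seqId from the control file):

```sql
-- assumes sequence seqId already exists; fires once per inserted row
CREATE OR REPLACE TRIGGER phone_list_bi
BEFORE INSERT ON phone_list
FOR EACH ROW
BEGIN
  SELECT seqId.NEXTVAL INTO :new.phone_list_id FROM dual;
END;
/
```

With the trigger in place, PHONE_LIST_ID can be dropped from the control file's field list entirely.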
Using clob in sql loader utility in oracle 9i
Hi,
I want to load data into a table with 2 CLOB columns using a SQL*Loader dat file and control file created programmatically.
The size of each CLOB in the dat file can vary, and the CLOB columns are inline in the data file.
As per the 9i documentation, the maximum size of a CLOB is 4 GB.
How can I change the control file so that it can load up to 4 GB of data into the CLOB columns?
I am getting an error when calling sqlldr with the control file below:
SQL*Loader-350: Syntax error at line 13.
Expecting non-negative integer, found "-294967296".
,"NARRATIVE" char(4000000000)
^
control file :
LOAD DATA
INFILE '' "str X'3C213E0A'"
APPEND INTO TABLE PSD_TERM
FIELDS TERMINATED BY '~^'
TRAILING NULLCOLS
"PSD_ID" CHAR(16) NULLIF ("PSD_ID"=BLANKS)
,"PSD_SERIAL_NUM" CHAR(4) NULLIF ("PSD_SERIAL_NUM"=BLANKS)
,"PSD_TERM_COD" CHAR(4) NULLIF ("PSD_TERM_COD"=BLANKS)
,"PSD_TERM_SER_NO" CHAR(4) NULLIF ("PSD_TERM_SER_NO"=BLANKS)
,"VERSION_DT" DATE "DD-MON-YYYY HH:MI:SS AM" NULLIF ("VERSION_DT"=BLANKS)
,"LATEST_VERSION" CHAR(1) NULLIF ("LATEST_VERSION"=BLANKS)
,"NARRATIVE" char(4000000000)
,"PARTITION_DT" DATE "DD-MON-YYYY HH:MI:SS AM" NULLIF ("PARTITION_DT"=BLANKS)
,"NARRATIVE_UNEXPANDED" char(4000000000)
)
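The SQL*Loader-350 error above is an integer overflow, not a syntax problem in the usual sense: the declared field length is read as a signed 32-bit integer, and 4000000000 - 2^32 = -294967296, which is exactly the value in the message. A sketch of the affected lines with lengths kept under the signed 32-bit limit (whether 9i accepts lengths this large in practice should be verified against the documentation):

```sql
-- declared field lengths must fit in a signed 32-bit integer (max 2147483647)
,"NARRATIVE" CHAR(2000000000)
,"NARRATIVE_UNEXPANDED" CHAR(2000000000)
```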
-
I am using SQL*Loader to load a table.
This is the syntax I am using:
LOAD DATA
TRUNCATE
INTO TABLE selva_tst
WHEN(01:04) = 'D328' AND (06:06)='$'
FIELDS TERMINATED BY "|"
(A_ID,
NULLIF(CO,$),
ANB,
STS_DT Date 'YYYYMMDD',
DMP_ID)
It is giving an error at the line NULLIF(CO,$),
stating that there is a syntax error in this line.
Can anyone help me in this regard?
Thanks in advance.
You may be interested in the documentation about using SQL expressions to load data:
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_field_list.htm#sthref1238
Best regards
Maxim -
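For the record, the syntax error above comes from writing NULLIF as a function: in SQL*Loader, NULLIF is a field clause that follows the field name. A sketch of the corrected field list (the other fields are as posted):

```sql
-- NULLIF follows the field name; the column is nulled when the input equals '$'
(
  A_ID,
  CO CHAR NULLIF (CO = '$'),
  ANB,
  STS_DT DATE 'YYYYMMDD',
  DMP_ID
)
```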
Is it possible to pass or set a variable in SQL LOADER? In this case I want the file name (eg $data) that is getting passed from the command line to load into my table into the extract_date field.
For example. The command line:
sqlldr user/password control=deposit.ctl data=080322.txt
Control file:
Load data
infile '$data'
Append into table deposit
, id position (1-10)
, extract_date date "YYMMDD" $data
Any thoughts?
user567866 wrote:
Is it possible to pass or set a variable in SQL*Loader? In this case I want the file name (eg $data) that is getting passed from the command line to load into my table into the extract_date field.
Just wondering: why do you need a variable if you are passing the filename on the command line? sqlldr is perfectly capable of reading the data from the file given as the data= argument. Just remove the line with INFILE from your control file and leave your command line as is.
Best regards
Maxim -
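Maxim's suggestion, sketched (the field list is abbreviated here; how the file name itself could reach the extract_date column is a separate problem the thread leaves open):

```sql
-- deposit.ctl: no INFILE line; the data file is given on the command line
LOAD DATA
APPEND INTO TABLE deposit
(
  id POSITION(1:10)
)
-- invoke with: sqlldr user/password control=deposit.ctl data=080322.txt
```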
Hi guys. I have a problem and can't solve it.
I am a bit of a rookie in Oracle.
I have to load a txt file.
I developed the following ctl:
LOAD DATA
INFILE './file.txt'
INTO table
REPLACE
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
(field_1,
field_2,
ID SEQUENCE(MAX,1)
My problem is that new records would be:
value1_field_1 value1_field_2 1
value2_field_1 value2_field_2 2
value3_field_1 value3_field_2 3
value4_field_1 value4_field_2 4
value5_field_1 value5_field_2 5
After the next load, I would have the following:
value1_field_1 value1_field_2 1
value2_field_1 value2_field_2 2
value3_field_1 value3_field_2 3
value4_field_1 value4_field_2 4
value5_field_1 value5_field_2 5
value6_field_1 value6_field_2 6
value7_field_1 value7_field_2 7
value8_field_1 value8_field_2 8
value9_field_1 value9_field_2 9
value10_field_1 value10_field_2 10
Well (I write too much, :-) ), my problem is that I want the following after the first load:
value1_field_1 value1_field_2 1
value2_field_1 value2_field_2 1
value3_field_1 value3_field_2 1
value4_field_1 value4_field_2 1
value5_field_1 value5_field_2 1
And after second load:
value1_field_1 value1_field_2 1
value2_field_1 value2_field_2 1
value3_field_1 value3_field_2 1
value4_field_1 value4_field_2 1
value5_field_1 value5_field_2 1
value6_field_1 value6_field_2 2
value7_field_1 value7_field_2 2
value8_field_1 value8_field_2 2
value9_field_1 value9_field_2 2
value10_field_1 value10_field_2 2
It seems easy, but I don't know how to do it. Can someone help me, please?
Regards, Javi.
(For now I don't have a primary key on the table, so I don't have problems with duplicate IDs)
Edited by: user12249099 on 07-dic-2009 9:38
Edited by: user12249099 on 07-dic-2009 9:39
Hi,
field_1,
field_2,
ID SEQUENCE(MAX,1)
)
In this code, don't use the sequence. Instead keep a variable and increment it after every load.
Edited by: machan on Dec 7, 2009 11:01 AM
Edited by: machan on Dec 7, 2009 1:03 PM -
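One way to implement machan's suggestion is to regenerate the control file before each run and inject the current load number with CONSTANT, so every row of that run gets the same value. A sketch (the literal '1' is the placeholder your generating script replaces):

```sql
-- generated control file: every row of this run gets the same ID
LOAD DATA
INFILE './file.txt'
INTO TABLE t
REPLACE
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
(
  field_1,
  field_2,
  ID CONSTANT '1'   -- replace with the current load number when generating the file
)
```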
Want to use sequence object of oracle when loading data in sql loader
Hi,
I want to use a sequence when loading data with SQL*Loader, but the problem is that I could not use an Oracle sequence object to load the data; I can only use SQL*Loader's own SEQUENCE.
I want to use a sequence object because later inserts will use the same object. If I use SQL*Loader's SEQUENCE, how can I keep it in step with the Oracle sequence object?
Is there any other option?
I have a similar problem; I also want to use a sequence when loading data with SQL*Loader.
My control file is:
load data
infile '0testdata.txt'
into table robertl.tbltest
fields terminated by X'09'
trailing nullcols
(redbrojunos,
broj,
dolazak,
odlazak nullif odlazak=blanks,
komentar nullif komentar=blanks)
And the datafile is:
robertl.brojilo.nextval 1368 17.06.2003 08:02:46 17.06.2003 16:17:18
robertl.brojilo.nextval 2363 17.06.2003 08:18:18 17.06.2003 16:21:52
robertl.brojilo.nextval 7821 17.06.2003 08:29:22 17.06.2003 16:21:59
robertl.brojilo.nextval 0408 17.06.2003 11:20:27 17.06.2003 18:33:00 ispit
robertl.brojilo.nextval 1111 17.06.2003 11:30:58 17.06.2003 16:09:34 Odlazak na ispit
robertl.brojilo.nextval 6129 17.06.2003 14:02:42 17.06.2003 16:23:23 seminar
But all records were rejected by the Loader; for every record I get the error:
Record 1: Rejected - Error on table ROBERTL.TBLTEST, column REDBROJUNOS.
ORA-01722: invalid number -
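The ORA-01722 above is expected: the literal text robertl.brojilo.nextval in the data file is loaded as-is and cannot be converted to a number. A sketch of the usual fix: skip that input field with FILLER and let the database evaluate the sequence through a SQL expression (the FILLER field name is hypothetical):

```sql
-- the first input field is discarded; redbrojunos is filled by the sequence
LOAD DATA
INFILE '0testdata.txt'
INTO TABLE robertl.tbltest
FIELDS TERMINATED BY X'09'
TRAILING NULLCOLS
(
  seq_text    FILLER CHAR,            -- the "...nextval" text from the file
  broj,
  dolazak,
  odlazak     NULLIF odlazak=BLANKS,
  komentar    NULLIF komentar=BLANKS,
  redbrojunos EXPRESSION "robertl.brojilo.nextval"  -- evaluated per row by the database
)
```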
How to Use Sequence created in Oracle Database in SQL Ldr Control file
Hi,
I created a sequence in the Oracle database. How do I use the sequence in a SQL*Loader control file?
Thanks in advance.
Hi,
You might get a good response to your post in the forum dedicated to data movement, including SQL*Loader. You can find it here: Export/Import/SQL Loader & External Tables
Regards, -
Hi all,
I am wondering whether we can assign the same sequence number to the whole data file.
Explanation:
I need to load a data file into a table. I know how to use a sequence in the SQL*Loader control file, but I need the same sequence number for every record inserted into the table from the data file.
Say the data file has 500 records;
I need all 500 records to have the same sequence number.
When I load another data file, I need a different sequence number for all the records inserted from it.
Thanks in advance.
Edited by: 943254 on Mar 11, 2013 11:08 AM
Then you will have to look for a solution elsewhere - either the file will need to come pre-seeded with the value, or you will need to perform post-processing on the loaded data to add the constant value, or something else.
HTH
Srini -
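One more option, not mentioned in the thread: call a PL/SQL function that fetches the sequence once and then returns the cached value. A conventional-path SQL*Loader run is a single database session, so every row of one load sees the same value, and the next run (a new session) gets a fresh one. A sketch with hypothetical names (load_tag, file_seq):

```sql
CREATE OR REPLACE PACKAGE load_tag AS
  FUNCTION id RETURN NUMBER;
END load_tag;
/
CREATE OR REPLACE PACKAGE BODY load_tag AS
  g_id NUMBER;  -- session-level cache; fresh for each sqlldr run
  FUNCTION id RETURN NUMBER IS
  BEGIN
    IF g_id IS NULL THEN
      SELECT file_seq.NEXTVAL INTO g_id FROM dual;
    END IF;
    RETURN g_id;
  END id;
END load_tag;
/
-- in the control file: LOAD_ID EXPRESSION "load_tag.id()"
```

SQL expressions are evaluated by the database, so check your version's direct-path restrictions before relying on this with DIRECT=TRUE.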
Error in loading data using SQL loader
I am getting an error like 'SQL*Loader-350: syntax error or illegal combination of non-alphanumeric characters' while loading a file using SQL*Loader on RHEL. The command used to run SQL*Loader is:
sqlldr userid=<username>/<password> control=data.ctl
The control file, data.ctl is :
LOAD data
infile '/home/oraprod/data.txt'
append into table test
empid terminated by ',',
fname terminated by ',',
lname terminated by ',',
salary terminated by whitespace
The data.txt file is:
1,Kaushal,halani,5000
2,Chetan,halani,1000
I hope, my question is clear.
Please revert with the reply to my query.
Regards
Replace "{" with "(" in your control file:
LOAD data
infile 'c:\data.txt'
append into table emp_t
(empid terminated by ',',
fname terminated by ',',
lname terminated by ',',
salary terminated by whitespace)
C:\>sqlldr user/pwd@database control=c.ctl
SQL*Loader: Release 10.2.0.3.0 - Production on Wed Nov 13 10:10:24 2013
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Commit point reached - logical record count 1
Commit point reached - logical record count 2
SQL> select * from emp_t;
EMPID FNAME LNAME SALARY
1 Kaushal halani 5000
2 Chetan halani 1000
Best regards
Mohamed Houri -
SQL*Loader issue with WHEN command
Environment: R12.1.2
We have a file coming in from a bank that needs to be loaded into a custom table using SQL*Loader.
The file has multiple record formats. Each record in the file starts with a "record type", which defines the format.
For simplicity, let me say that there is a record type of "H" with the header format, and another record type "D" has a detail record format. An "H" record may be followed by multiple "D" records until the next "H" record is encountered. Unfortunately, there is no common key, like say "Vendor Number", in both the "H" and "D" records to establish a relationship. So the plan was to use an Oracle sequence or a SQL*Loader sequence to get a sequence loaded into the table as the file is being loaded. Then if consecutive "H" records had sequence values of 100 and 112, we would know that the "D" records for the "H" 100 record are all the records with sequence values of 101 through 111.
The issue occurs as we have to use the WHEN command in the control file to direct a certain record type to specific columns of the table. Based on the populated sequence values, with the WHEN command, it seems that all the "H" records get loaded first followed by the "D" records. The sequence becomes of no use and we cannot establish a link between the "H" and "D" records. The alternative is to not use WHEN with the sequence, but load the file into generic column names which provides for less understanding in the application.
Is there a way (command feature) to ensure that SQL*Loader loads the records sequentially while using WHEN?
Thanks
Satish
I used the RECNUM parameter instead of a sequence and it worked fine.
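RECNUM numbers each logical record as it is read from the input file, so "H" and "D" rows keep their file order even when WHEN routes them to different column lists. A sketch (the table, field names, and positions are hypothetical):

```sql
-- both INTO TABLE clauses capture the same file position via RECNUM
LOAD DATA
INFILE 'bank.dat'
APPEND
INTO TABLE xx_bank_stage
WHEN (1:1) = 'H'
(
  file_seq RECNUM,               -- record number within the input file
  rec_type POSITION(1:1),
  hdr_data POSITION(2:80)
)
INTO TABLE xx_bank_stage
WHEN (1:1) = 'D'
(
  file_seq RECNUM,
  rec_type POSITION(1:1),
  dtl_data POSITION(2:80)
)
```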
-
I have used the OMWB "generate SQL Loader script" option and received the SQL*Loader error below.
The previous attempt to use OMWB online loading generated garbage data: the picture was not matching the person id.
Table in Sql Server..................
CREATE TABLE [nilesh] (
[LargeObjectID] [int] NOT NULL ,
[LargeObject] [image] NULL ,
[ContentType] [varchar] (40) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectName] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectExtension] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectDescription] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectSize] [int] NULL ,
[VersionControl] [bit] NULL ,
[WhenLargeObjectLocked] [datetime] NULL ,
[WhoLargeObjectLocked] [char] (11) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectTimeStamp] [timestamp] NOT NULL ,
[LargeObjectOID] [uniqueidentifier] NOT NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
Table in Oracle..............
CREATE TABLE LARGEOBJECT
(
LARGEOBJECTID NUMBER(10) NOT NULL,
LARGEOBJECT BLOB,
CONTENTTYPE VARCHAR2(40 BYTE),
LARGEOBJECTNAME VARCHAR2(255 BYTE),
LARGEOBJECTEXTENSION VARCHAR2(10 BYTE),
LARGEOBJECTDESCRIPTION VARCHAR2(255 BYTE),
LARGEOBJECTSIZE NUMBER(10),
VERSIONCONTROL NUMBER(1),
WHENLARGEOBJECTLOCKED DATE,
WHOLARGEOBJECTLOCKED CHAR(11 BYTE),
LARGEOBJECTTIMESTAMP NUMBER(8) NOT NULL,
LARGEOBJECTOID RAW(16) NOT NULL
)
TABLESPACE USERS
PCTUSED 0
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
LOGGING
NOCOMPRESS
LOB (LARGEOBJECT) STORE AS
( TABLESPACE USERS
ENABLE STORAGE IN ROW
CHUNK 8192
PCTVERSION 10
NOCACHE
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
)
NOCACHE
NOPARALLEL
MONITORING;
Sql Loader script....
SET NLS_DATE_FORMAT=Mon dd YYYY HH:mi:ssAM
REM SET NLS_TIMESTAMP_FORMAT=Mon dd YYYY HH:mi:ss:ffAM
REM SET NLS_LANGUAGE=AL32UTF8
sqlldr cecildata/@ceciltst control=LARGEOBJECT.ctl log=LARGEOBJECT.log
Sql loader control file......
load data
infile 'nilesh.dat' "str '<er>'"
into table LARGEOBJECT
fields terminated by '<ec>'
trailing nullcols
(LARGEOBJECTID,
LARGEOBJECT CHAR(2000000) "HEXTORAW (:LARGEOBJECT)",
CONTENTTYPE "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)",
LARGEOBJECTNAME CHAR(255) "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)",
LARGEOBJECTEXTENSION "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)",
LARGEOBJECTDESCRIPTION CHAR(255) "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)",
LARGEOBJECTSIZE,
VERSIONCONTROL,
WHENLARGEOBJECTLOCKED,
WHOLARGEOBJECTLOCKED,
LARGEOBJECTTIMESTAMP,
LARGEOBJECTOID "GUID_MOVER(:LARGEOBJECTOID)")
Error Received...
Column Name Position Len Term Encl Datatype
LARGEOBJECTID FIRST * CHARACTER
Terminator string : '<ec>'
LARGEOBJECT NEXT ***** CHARACTER
Maximum field length is 2000000
Terminator string : '<ec>'
SQL string for column : "HEXTORAW (:LARGEOBJECT)"
CONTENTTYPE NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)"
LARGEOBJECTNAME NEXT 255 CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)"
LARGEOBJECTEXTENSION NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)"
LARGEOBJECTDESCRIPTION NEXT 255 CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)"
LARGEOBJECTSIZE NEXT * CHARACTER
Terminator string : '<ec>'
VERSIONCONTROL NEXT * CHARACTER
Terminator string : '<ec>'
WHENLARGEOBJECTLOCKED NEXT * CHARACTER
Terminator string : '<ec>'
WHOLARGEOBJECTLOCKED NEXT * CHARACTER
Terminator string : '<ec>'
LARGEOBJECTTIMESTAMP NEXT * CHARACTER
Terminator string : '<ec>'
LARGEOBJECTOID NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "GUID_MOVER(:LARGEOBJECTOID)"
SQL*Loader-309: No SQL string allowed as part of LARGEOBJECT field specification
What's the cause?
The previous attempt to use OMWB online loading has generated garbage data. The picture was not matching the person id.
This is being worked on (bug 4119713). If you have a reproducible testcase please send it in (small testcases seem to work ok).
I have the following email about BLOBS I could forward to you if I have your email address:
[The forum may cut the lines in the wrong places]
Regards,
Turloch
Oracle Migration Workbench Team
Hi,
This may provide the solution. Without having the customer files here I can only guess at the problem. But this should help.
This email outlines a BLOB data move.
There are quiet a few steps to complete the task of moving a large BLOB into the Oracle database.
Normally this wouldn't be a problem, but as far as we can tell SQL Server's (and possibly Sybase) BCP does not reliably export binary data.
The only way to export binary data properly via BCP is to export it in a HEX format.
Once in a HEX format it is difficult to get it back to binary during a data load into Oracle.
We have come up with the idea of getting the HEX values into Oracle by saving them in a CLOB (holds text) column.
We then convert the HEX values to binary values and insert them into the BLOB column.
The problem here is that the HEXTORAW function in Oracle only converts a maximum of 2000 HEX pairs.
We over came this problem by writing our own procedure that will convert (bit by bit) your HEX data to binary.
NOTE: YOU MUST MODIFY THE START.SQL AND FINISH.SQL TO SUIT YOUR CUSTOMER
The task is split into 4 sub tasks
1) CREATE A TABLESPACE TO HOLD ALL THE LOB DATA
--log into your system schema and create a tablespace
--Create a new tablespace for the CLOB and BLOB column (this may take a while to create)
--You may resize this to fit your data ,
--but I believe you have in excess of 500MB of data in this table and we are going to save it twice (in a clob then a blob)
--Note: This script may take some time to execute as it has to create a tablespace of 1000Mb.
-- Change this to suit your customer.
-- You can change this if you want depending on the size of your data
-- Remember that we save the data once as CLOB and then as BLOB
create tablespace lob_tablespace datafile 'lob_tablespace' SIZE 1000M AUTOEXTEND ON NEXT 50M;
LOG INTO YOUR TABLE SCHEMA IN ORACLE
--Modify this script to fit your requirements
2) START.SQL (this script will do the following tasks)
a) Modify your current schema so that it can accept HEX data
b) Modify your current schema so that it can hold that huge amount of data.
The new tablespace is used; you may want to alter this to your requirements
c) Disable triggers, indexes & primary keys on tblfiles
3)DATA MOVE
The data move now involves moving the HEX data in the .dat files to a CLOB.
The START.SQL script adds a new column to <tablename> called <blob_column>_CLOB.
This is where the HEX values will be stored.
MODIFY YOUR CONTROL FILE TO LOOK LIKE THIS:
load data
infile '<tablename>.dat' "str '<er>'"
into table <tablename>
fields terminated by '<ec>'
trailing nullcols
<blob_column>_CLOB CHAR(200000000),
The important part being "_CLOB" appended to your BLOB column name and the datatype set to CHAR(200000000)
RUN sql_loader_script.bat
log into your schema to check if the data was loaded successfully -- now you can see that the hex values were sent to the CLOB column
SQL> select dbms_lob.getlength(<blob_column>),dbms_lob.getlength(<blob_column>_clob) from <tablename>;
LOG INTO YOUR SCHEMA
4)FINISH.SQL (this script will do the following tasks)
a) Creates the procedure needed to perform the CLOB to BLOB transformation
b) Executes the procedure (this may take some time a 500Mb has to be converted to BLOB)
c) Alters the table back to its original form (removes the <blob_column>_clob)
b) Enables the triggers, indexes and primary keys
Regards,
(NAME)
-- START.SQL
-- Modify this for your particular customer
-- This should be executed in the user schema in Oracle that contains the table.
-- DESCRIPTION:
-- ALTERS THE OFFENDING TABLE SO THAT THE DATA MOVE CAN BE EXECUTED
-- DISABLES TRIGGERS, INDEXES AND SEQUENCES ON THE OFFENDING TABLE
-- 1) Add an extra column to hold the hex string
alter table <tablename> add (FILEBINARY_CLOB CLOB);
-- 2) Allow the BLOB column to accept NULLs
alter table <tablename> MODIFY FILEBINARY NULL;
-- 3) Disable triggers and sequences on tblfiles
alter trigger <triggername> disable;
alter table tblfiles drop primary key cascade;
drop index <indexname>;
-- 4) Allow the table to use the tablespace
alter table <tablename> move lob (<blob_column>) store as (tablespace lob_tablespace);
alter table tblfiles move lob (<blob_column>_clob) store as (tablespace lob_tablespace);
COMMIT;
-- END OF FILE
-- FINISH.SQL
-- Modify this for your particular customer
-- This should be executed in the table schema in Oracle.
-- DESCRIPTION:
-- MOVES THE DATA FROM CLOB TO BLOB
-- MODIFIES THE TABLE BACK TO ITS ORIGIONAL SPEC (without a clob)
-- THEN ENABLES THE SEQUENCES, TRIGGERS AND INDEXES AGAIN
-- Currently we have the hex values saved as text in the <columnname>_CLOB column
-- And we have NULL in all rows for the <columnname> column.
-- We have to get BLOB locators for each row in the BLOB column
-- put empty blobs in the blob column
UPDATE <tablename> SET filebinary=EMPTY_BLOB();
COMMIT;
-- create the following procedure in your table schema
CREATE OR REPLACE PROCEDURE CLOBTOBLOB
AS
inputLength NUMBER; -- size of input CLOB
pieceMaxSize NUMBER := 50; -- the max size of each piece
piece VARCHAR2(50); -- these pieces will make up the entire CLOB
currentPlace NUMBER := 1; -- this is where we are up to in the CLOB
blobLoc BLOB; -- BLOB locator in the table
clobLoc CLOB; -- CLOB locator; this is the value from the dat file
-- THIS HAS TO BE CHANGED FOR SPECIFIC CUSTOMER TABLE AND COLUMN NAMES
CURSOR cur IS SELECT <blob_column>_clob clob_column, <blob_column> blob_column FROM /*table*/<tablename> FOR UPDATE;
cur_rec cur%ROWTYPE;
BEGIN
OPEN cur;
FETCH cur INTO cur_rec;
WHILE cur%FOUND
LOOP
-- retrieve the clobLoc and blobLoc
clobLoc := cur_rec.clob_column;
blobLoc := cur_rec.blob_column;
currentPlace := 1; -- reset every time
-- find the length of the CLOB
inputLength := DBMS_LOB.getLength(clobLoc);
-- loop through each piece
LOOP
-- get the next piece of the CLOB
piece := DBMS_LOB.subStr(clobLoc, pieceMaxSize, currentPlace);
-- append this piece (converted to binary) to the BLOB
DBMS_LOB.WRITEAPPEND(blobLoc, LENGTH(piece)/2, HEXTORAW(piece));
currentPlace := currentPlace + pieceMaxSize;
EXIT WHEN inputLength < currentPlace;
END LOOP;
FETCH cur INTO cur_rec;
END LOOP;
CLOSE cur;
END CLOBtoBLOB;
-- now run the procedure
-- It will update the blob column with the correct binary representation of the clob column
EXEC CLOBtoBLOB;
-- drop the extra clob column
alter table <tablename> drop column <blob_column>_clob;
-- 2) apply the constraint we removed during the data load
alter table <tablename> MODIFY FILEBINARY NOT NULL;
-- Now re enable the triggers,indexs and primary keys
alter trigger <triggername> enable;
ALTER TABLE TBLFILES ADD ( CONSTRAINT <pkname> PRIMARY KEY ( <column>) ) ;
CREATE INDEX <index_name> ON TBLFILES ( <column> );
COMMIT;
-- END OF FILE -
Sql Loader - Decimal numbers showing in null column
Greetings,
My apologies if this is in the wrong forum section. It seemed to be the most logical.
I have added new column to a control file used in a sql loader upload and I am getting unexpected results. Long story short, I copy foxpro tables from a network directory to my local pc. A foxpro exe converts these tables to .dat files. Sql loader then uploads the .dat files to matching oracle tables. I've run this program from my pc for years with no problems.
Problem now: We added a new column to a FoxPro table and to the matching Oracle table. This column in FoxPro is null for now - no data at all. I then added the new column to my ctl file for this table. The program runs, and SQL*Loader does its thing with no errors. However, in the new field in Oracle, I'm finding decimal numbers in many of the records, when all records should have null values in this field. I've checked all other columns in the Oracle table and the data looks accurate. I'm not sure why I'm getting these decimal values in the new column.
My log and bad files show no hints of any problems. The bad file is empty for this table.
At first I thought the positioning of the new column in the fox table, .ctl file and the oracle table were not lining up correctly, but I checked and they are.
I've double checked the FoxPro table and all records for this new column are null.
I'm not sure what to check for next or what to test. I am hoping someone in this forum might lend a clue or has maybe seen this problem before. Below is my control file. The new column is the last one: fromweb_id. It is a number field in both FoxPro and Oracle.
Thanks for any advice.
JOBS table control file:
load data
infile 'convdata\fp_ora\JOBS.dat' "str X'08'"
into table JOBS
fields terminated by X'07'
TRAILING NULLCOLS
(SID,
CO_NAME "replace(replace(:CO_NAME,chr(11),chr(10)),chr(15),chr(13))",
JOB_TITLE "replace(replace(:JOB_TITLE,chr(11),chr(10)),chr(15),chr(13))",
CREDITS,
EARN_DATE date "mm/dd/yyyy",
COMMENTS CHAR(2000) "replace(replace(:COMMENTS,chr(11),chr(10)),chr(15),chr(13))",
DONT_SHOW,
PC_SRC "replace(replace(:PC_SRC,chr(11),chr(10)),chr(15),chr(13))",
PC_SRC_NO,
SALARY,
SALFOR,
ROOM,
BOARD,
TIPS,
UPD_DATE date "mm/dd/yyyy hh12:mi:ss am",
STUKEY,
JOBKEY,
JO_COKEY,
JO_CNKEY,
JO_ZUKEY,
EMPLID,
CN_NAME "replace(replace(:CN_NAME,chr(11),chr(10)),chr(15),chr(13))",
JOB_START date "mm/dd/yyyy",
JOB_END date "mm/dd/yyyy",
FROMWEB_ID)
I apologize for not explaining how this was resolved. SQL*Loader was working as it should.
The problem was due to new fields being added to the FoxPro table, along with the fromweb_id column, that I was not informed about. I was asked to add a column named fromweb_id to the oracle jobs table and to the sql-loader program. I was not told that there were other columns added at the same time. In the foxpro table, the fromweb_id column was the last column added.
The jobs.dat file contained data from all columns in the foxpro table, including all the new columns. I only added the "fromweb_id" to the control file, which is what I was asked to do. When it ran, it was getting values from one of the new columns and the values were being uploaded into the fromweb_id column in Oracle. It is that simple.
When I had checked the FoxPro table earlier, I did not pick up on the other new columns; I was focusing on looking for values in the fromweb_id column. When back-tracing data in the jobs.dat file, I found a value in the fromweb_id column that matched a value in a different column (a new column) in FoxPro. That is when I realized there were other new columns, and I instantly knew what the problem was.
Thanks for all the feedback. I'm sorry if this was an inconvenience to anyone. I'll try to dig a little deeper next time. Lessons learned...
regards, -
Different log file name in the Control file of SQL Loader
Dear all,
I get every day 3 log files with ftp from a Solaris Server to a Windows 2000 Server machine. In this Windows machine, we have an Oracle Database 9.2. These log files are in the following format: in<date>.log i.e. in20070429.log.
I would like to load this log file's data to an Oracle table every day and I would like to use SQL Loader for this job.
The problem is that the log file name is different every day.
How can I give this variable log file name in the Control file, which is used for the SQL Loader?
file.ctl
LOAD DATA
INFILE 'D:\gbal\in<date>.log'
APPEND INTO TABLE CHAT_SL
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
(SL1 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
SL2 char,
SL3 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
SL4 char,
SL5 char,
SL6 char,
SL7 char,
SL8 char,
SL9 char,
SL10 char,
SL11 char,
SL12 char,
SL13 char,
SL14 char,
SL15 char)
Do you have any better idea about this issue?
I thought of renaming the log file to a fixed name, such as in.log, but how can I distinguish the desired log file from the other two?
Thank you very much in advance.
Giorgos Baliotis
I don't have a direct solution for your problem.
However, if you invoke SQL*Loader from an Oracle stored procedure, it is possible to dynamically set the control\log file.
# Grant previleges to the user to execute command prompt statements
BEGIN
dbms_java.grant_permission('bc4186ol','java.io.FilePermission','C:\windows\system32\cmd.exe','execute');
END;
* Procedure to execute Operating system commands using PL\SQL(Oracle script making use of Java packages
CREATE OR REPLACE AND COMPILE JAVA SOURCE NAMED "Host" AS
import java.io.*;
public class Host {
    public static void executeCommand(String command) {
        try {
            String[] finalCommand = new String[4];
            finalCommand[0] = "C:\\windows\\system32\\cmd.exe";
            finalCommand[1] = "/y";
            finalCommand[2] = "/c";
            finalCommand[3] = command;
            final Process pr = Runtime.getRuntime().exec(finalCommand);
            // drain standard output so the child process does not block
            new Thread(new Runnable() {
                public void run() {
                    try {
                        BufferedReader br_in = new BufferedReader(new InputStreamReader(pr.getInputStream()));
                        String buff = null;
                        while ((buff = br_in.readLine()) != null) {
                            System.out.println("Process out :" + buff);
                            try { Thread.sleep(100); } catch (Exception e) {}
                        }
                    } catch (IOException ioe) {
                        System.out.println("Exception caught printing process output.");
                        ioe.printStackTrace();
                    }
                }
            }).start();
            // drain standard error the same way
            new Thread(new Runnable() {
                public void run() {
                    try {
                        BufferedReader br_err = new BufferedReader(new InputStreamReader(pr.getErrorStream()));
                        String buff = null;
                        while ((buff = br_err.readLine()) != null) {
                            System.out.println("Process err :" + buff);
                            try { Thread.sleep(100); } catch (Exception e) {}
                        }
                    } catch (IOException ioe) {
                        System.out.println("Exception caught printing process error.");
                        ioe.printStackTrace();
                    }
                }
            }).start();
        } catch (Exception ex) {
            System.out.println(ex.getLocalizedMessage());
        }
    }
    public static boolean isWindows() {
        return System.getProperty("os.name").toLowerCase().indexOf("windows") != -1;
    }
}
* Oracle wrapper to call the above procedure
CREATE OR REPLACE PROCEDURE Host_Command (p_command IN VARCHAR2)
AS LANGUAGE JAVA
NAME 'Host.executeCommand (java.lang.String)';
* Now invoke the procedure with an operating system command (execute SQL*Loader)
* The execution of the script would ensure the Prod mapping data file is loaded to the PROD_5005_710_MAP table
* Change the control\log\discard\bad files as appropriate
BEGIN
Host_Command (p_command => 'sqlldr system/tiburon@orcl control=C:\anupama\emp_join'||1||'.ctl log=C:\anupama\ond_lists.log');
END;
Does that help you?
Regards,
Bhagat -
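A lighter-weight alternative to the Java route above (not from the thread, sketched with the names already used): drop the INFILE line from the control file and pass the day's file name on the sqlldr command line, so no dynamic control file is needed:

```sql
-- file.ctl, with the INFILE line removed; field list as posted (abbreviated here)
LOAD DATA
APPEND INTO TABLE CHAT_SL
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
(SL1 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
 SL2 char)
-- invoke daily with the current name, e.g.:
--   sqlldr user/password control=file.ctl data=D:\gbal\in20070429.log
```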
SQL*Loader fails loading XML data when an enclosing tag is not found
The problem I'm having is that my XML tree doesn't contain all possible elements. In this example the second entry doesn't contain <age>, so only the first entry will be added to the database.
Any idea of how I could solve this?
The fields are saved as varchar2
XML:
<rowset>
<row>
<name>Name</name>
<age>Age</age>
<city>City</city>
</row>
<row>
<name>Name2</name>
<city>City2</city>
</row>
</rowset>
LOAD DATA
INFILE 'data.xml' "str '</row>'"
APPEND
INTO TABLE test
TRAILING NULLCOLS
(dummy FILLER terminated BY "<row>",
name ENCLOSED BY "<name>" AND "</name>",
age ENCLOSED BY "<age>" AND "</age>",
city ENCLOSED BY "<city>" AND "</city>"
)
I noticed that the failure occurs when using the 11g version of SQL*Loader. It doesn't fail when using the 10g version.
Delimited source data comes from:
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Prod
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
TNS for Linux: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production
And will be loaded into
Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bi
PL/SQL Release 10.2.0.5.0 - Production
CORE 10.2.0.5.0 Production
TNS for Linux: Version 10.2.0.5.0 - Production
NLSRTL Version 10.2.0.5.0 - Production
My previously used SQL*Loader was from:
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
PL/SQL Release 11.2.0.1.0 - Production
CORE 11.2.0.1.0 Production
TNS for 64-bit Windows: Version 11.2.0.1.0 - Production
NLSRTL Version 11.2.0.1.0 - Production
It seems that I have found the real culprit. Should I know something more?