Using NULLIF in SQL*Loader
I am using SQL*Loader to load a table.
This is the syntax I am using:
LOAD DATA
TRUNCATE
INTO TABLE selva_tst
WHEN(01:04) = 'D328' AND (06:06)='$'
FIELDS TERMINATED BY "|"
A_ID,
NULLIF(CO,$),
ANB,
STS_DT Date 'YYYYMMDD',
DMP_ID
It is giving an error at the line NULLIF(CO,$),
stating that there is a syntax error in this line.
Can anyone help me in this regard?
Thanks in advance.
You may be interested in looking up the documentation about using SQL expressions to load data:
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_field_list.htm#sthref1238
Best regards
Maxim
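For reference, NULLIF in SQL*Loader is a field attribute that follows the column name, not a function call. A minimal corrected control file might look like this (a sketch assuming CO is a character field and the column list from the original post):

```sql
LOAD DATA
TRUNCATE
INTO TABLE selva_tst
WHEN (01:04) = 'D328' AND (06:06) = '$'
FIELDS TERMINATED BY "|"
(A_ID,
 CO CHAR NULLIF CO='$',
 ANB,
 STS_DT DATE 'YYYYMMDD',
 DMP_ID)
```

The NULLIF clause sets CO to NULL whenever the incoming field equals '$'.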
Similar Messages
-
Using CLOB in the SQL*Loader utility in Oracle 9i
Hi,
I want to load data into a table with two CLOB columns using a SQL*Loader .dat file and a control file created programmatically.
The size of each CLOB in the .dat file can vary, and the CLOB columns are inline in the data file.
As per the 9i documentation, the maximum size of a CLOB is 4 GB.
How can I change the control file so that it can load up to 4 GB of data into the CLOB columns?
I am getting the following error when calling sqlldr with the control file below:
SQL*Loader-350: Syntax error at line 13.
Expecting non-negative integer, found "-294967296".
,"NARRATIVE" char(4000000000)
^
control file :
LOAD DATA
INFILE '' "str X'3C213E0A'"
APPEND INTO TABLE PSD_TERM
FIELDS TERMINATED BY '~^'
TRAILING NULLCOLS
"PSD_ID" CHAR(16) NULLIF ("PSD_ID"=BLANKS)
,"PSD_SERIAL_NUM" CHAR(4) NULLIF ("PSD_SERIAL_NUM"=BLANKS)
,"PSD_TERM_COD" CHAR(4) NULLIF ("PSD_TERM_COD"=BLANKS)
,"PSD_TERM_SER_NO" CHAR(4) NULLIF ("PSD_TERM_SER_NO"=BLANKS)
,"VERSION_DT" DATE "DD-MON-YYYY HH:MI:SS AM" NULLIF ("VERSION_DT"=BLANKS)
,"LATEST_VERSION" CHAR(1) NULLIF ("LATEST_VERSION"=BLANKS)
,"NARRATIVE" char(4000000000)
,"PARTITION_DT" DATE "DD-MON-YYYY HH:MI:SS AM" NULLIF ("PARTITION_DT"=BLANKS)
,"NARRATIVE_UNEXPANDED" char(4000000000)
)
-
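For what it's worth, the SQL*Loader-350 error above is arithmetic: 4000000000 does not fit in a 32-bit signed integer, and 4000000000 - 2^32 = -294967296, exactly the value in the message. A hedged sketch of two possible workarounds (the NARRATIVE column follows the control file above; NARRATIVE_FNAME is a made-up field name):

```sql
-- Option 1: declare a length within the signed 32-bit range
-- (at most 2147483647), which still covers very large inline CLOBs:
,"NARRATIVE" CHAR(2000000000)

-- Option 2: keep each large value in its own file and load it with
-- LOBFILE, which avoids the inline length limit entirely
-- (assumes the .dat file carries a per-row file name instead):
,"NARRATIVE_FNAME" FILLER CHAR(256)
,"NARRATIVE" LOBFILE(NARRATIVE_FNAME) TERMINATED BY EOF
```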
Hi,
Does anybody know if I can generate the unique primary key using an Oracle Sequence for a Database table to which I am inserting records in SQL Loader?
I checked the SQL Loader manual and there is no information as to how to make use of a Oracle sequence.. in the control file?
Thanks
Surajit
Yes, you can do it. Create the sequence (suppose you call it "PK_SEQ_X") and then in your control file reference it as "PK_SEQ_X.NEXTVAL". For example, suppose you wanted to put it into a column named Y: the entry in your control file would look like 'load data insert into table Z (Y "PK_SEQ_X.NEXTVAL", ....)'.
Note that the double quotes around the sequence name are required. -
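As a sketch of the reply above (the table Z, column Y, and sequence PK_SEQ_X are the hypothetical names from the answer; OTHER_COL is invented here):

```sql
LOAD DATA
INFILE 'data.txt'
INSERT INTO TABLE Z
FIELDS TERMINATED BY ','
(Y "PK_SEQ_X.NEXTVAL",
 OTHER_COL)
```

Note that a column carrying a SQL string still corresponds to a field in the data file, so depending on the input layout a FILLER adjustment may be needed.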
Is it possible to pass or set a variable in SQL LOADER? In this case I want the file name (eg $data) that is getting passed from the command line to load into my table into the extract_date field.
For example. The command line:
sqlldr user/password control=deposit.ctl data=080322.txt
Control file:
Load data
infile '$data'
Append into table deposit
, id position (1-10)
, extract_date date "YYMMDD" $data
Any thoughts?
Just wondering: why do you need a variable if you are passing the filename on the command line? sqlldr is perfectly capable of reading the data from the file given as an argument to the data parameter. Just remove the infile line from your control file and leave your command line as is.
Best regards
Maxim -
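For the second half of the question (getting the date from the file name into the extract_date column), one common workaround, sketched here with made-up values, is to have a wrapper script write the control file for each run so the date is baked in as a literal:

```sql
-- Generated per run, e.g. for data file 080322.txt:
LOAD DATA
APPEND INTO TABLE deposit
(id POSITION(1:10),
 extract_date "TO_DATE('080322', 'YYMMDD')")
```

Here extract_date takes its value purely from the SQL expression; how it maps to (or skips) an input field would need adjusting to the real record layout.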
Want to use sequence object of oracle when loading data in sql loader
Hi,
I want to use a sequence when loading data with SQL*Loader, but the problem is that I could not use an Oracle sequence object to load the data; I can only use SQL*Loader's own sequence.
I want to use the Oracle sequence object because later entries will use the same sequence. If I use SQL*Loader's sequence, how can I use the Oracle sequence object?
Is there any other option?
I have a similar problem: I also want to use a sequence when loading data with SQL*Loader.
My control file is:
load data
infile '0testdata.txt'
into table robertl.tbltest
fields terminated by X'09'
trailing nullcols
(redbrojunos,
broj,
dolazak,
odlazak nullif odlazak=blanks,
komentar nullif komentar=blanks)
And the datafile is:
robertl.brojilo.nextval 1368 17.06.2003 08:02:46 17.06.2003 16:17:18
robertl.brojilo.nextval 2363 17.06.2003 08:18:18 17.06.2003 16:21:52
robertl.brojilo.nextval 7821 17.06.2003 08:29:22 17.06.2003 16:21:59
robertl.brojilo.nextval 0408 17.06.2003 11:20:27 17.06.2003 18:33:00 ispit
robertl.brojilo.nextval 1111 17.06.2003 11:30:58 17.06.2003 16:09:34 Odlazak na ispit
robertl.brojilo.nextval 6129 17.06.2003 14:02:42 17.06.2003 16:23:23 seminar
But all records were rejected by the Loader, for every record I get the error:
Record 1: Rejected - Error on table ROBERTL.TBLTEST, column REDBROJUNOS.
ORA-01722: invalid number -
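A possible explanation, offered as a guess: the first tab-delimited field literally contains the text robertl.brojilo.nextval, which Oracle cannot convert to a number, hence ORA-01722. One hedged sketch is to let a SQL string override that field with the real sequence call:

```sql
load data
infile '0testdata.txt'
into table robertl.tbltest
fields terminated by X'09'
trailing nullcols
(redbrojunos "robertl.brojilo.nextval",
 broj,
 dolazak,
 odlazak nullif odlazak=blanks,
 komentar nullif komentar=blanks)
```

The redbrojunos field still consumes the first delimited token, but its value is replaced by the sequence expression before insert; alternatively, the literal text could be removed from the data file and SQL*Loader's own SEQUENCE(...) used.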
Error in loading data using SQL loader
I am getting an error, 'SQL*Loader-350: syntax error: illegal combination of non-alphanumeric characters', while loading a file using SQL*Loader on RHEL. The command used to run SQL*Loader is:
sqlldr userid=<username>/<password> control=data.ctl
The control file, data.ctl is :
LOAD data
infile '/home/oraprod/data.txt'
append into table test
empid terminated by ',',
fname terminated by ',',
lname terminated by ',',
salary terminated by whitespace
The data.txt file is:
1,Kaushal,halani,5000
2,Chetan,halani,1000
I hope my question is clear.
Please reply to my query.
Regards
Replace "{" with "(" in your control file:
LOAD data
infile 'c:\data.txt'
append into table emp_t
(empid terminated by ',',
fname terminated by ',',
lname terminated by ',',
salary terminated by whitespace)
C:\>sqlldr user/pwd@database control=c.ctl
SQL*Loader: Release 10.2.0.3.0 - Production on Wed Nov 13 10:10:24 2013
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Commit point reached - logical record count 1
Commit point reached - logical record count 2
SQL> select * from emp_t;
EMPID FNAME LNAME SALARY
1 Kaushal halani 5000
2 Chetan halani 1000
Best regards
Mohamed Houri -
Sql Loader - Decimal numbers showing in null column
Greetings,
My apologies if this is in the wrong forum section. It seemed to be the most logical.
I have added new column to a control file used in a sql loader upload and I am getting unexpected results. Long story short, I copy foxpro tables from a network directory to my local pc. A foxpro exe converts these tables to .dat files. Sql loader then uploads the .dat files to matching oracle tables. I've run this program from my pc for years with no problems.
Problem now: We added a new column to a FoxPro table and to the matching Oracle table. This column in FoxPro is null for now - no data at all. I then added the new column to my ctl file for this table. The program runs, and SQL*Loader does its thing with no errors. However, in the new field in Oracle, I'm finding decimal numbers in many of the records, when all records should have null values in this field. I've checked all other columns in the Oracle table and the data looks accurate. I'm not sure why I'm getting these decimal values in the new column.
My log and bad files show no hints of any problems. The bad file is empty for this table.
At first I thought the positioning of the new column in the fox table, .ctl file and the oracle table were not lining up correctly, but I checked and they are.
I've double checked the FoxPro table and all records for this new column are null.
I'm not sure what to check for next or what to test. I am hoping someone in this forum might lend a clue or has maybe seen this problem before. Below is my control file. The new column is the last one: fromweb_id. It is a number field in both FoxPro and Oracle.
Thanks for any advice.
JOBS table control file:
load data
infile 'convdata\fp_ora\JOBS.dat' "str X'08'"
into table JOBS
fields terminated by X'07'
TRAILING NULLCOLS
(SID,
CO_NAME "replace(replace(:CO_NAME,chr(11),chr(10)),chr(15),chr(13))",
JOB_TITLE "replace(replace(:JOB_TITLE,chr(11),chr(10)),chr(15),chr(13))",
CREDITS,
EARN_DATE date "mm/dd/yyyy",
COMMENTS CHAR(2000) "replace(replace(:COMMENTS,chr(11),chr(10)),chr(15),chr(13))",
DONT_SHOW,
PC_SRC "replace(replace(:PC_SRC,chr(11),chr(10)),chr(15),chr(13))",
PC_SRC_NO,
SALARY,
SALFOR,
ROOM,
BOARD,
TIPS,
UPD_DATE date "mm/dd/yyyy hh12:mi:ss am",
STUKEY,
JOBKEY,
JO_COKEY,
JO_CNKEY,
JO_ZUKEY,
EMPLID,
CN_NAME "replace(replace(:CN_NAME,chr(11),chr(10)),chr(15),chr(13))",
JOB_START date "mm/dd/yyyy",
JOB_END date "mm/dd/yyyy",
FROMWEB_ID)
I apologize for not explaining how this was resolved. SQL*Loader was working as it should.
The problem was due to new fields being added to the FoxPro table, along with the fromweb_id column, that I was not informed about. I was asked to add a column named fromweb_id to the oracle jobs table and to the sql-loader program. I was not told that there were other columns added at the same time. In the foxpro table, the fromweb_id column was the last column added.
The jobs.dat file contained data from all columns in the foxpro table, including all the new columns. I only added the "fromweb_id" to the control file, which is what I was asked to do. When it ran, it was getting values from one of the new columns and the values were being uploaded into the fromweb_id column in Oracle. It is that simple.
When I had checked the FoxPro table earlier, I did not pick up on the other new columns. I was focusing on looking for values in the fromweb_id column. When back-tracing data in the jobs.dat file, I found a value in the fromweb_id column that matched a value in a different (new) column in FoxPro. That is when I realized there were other new columns, and I instantly knew what the problem was.
Thanks for all the feedback. I'm sorry if this was an inconvenience to anyone. I'll try to dig a little deeper next time. Lessons learned...
regards, -
Different log file name in the Control file of SQL Loader
Dear all,
I get every day 3 log files with ftp from a Solaris Server to a Windows 2000 Server machine. In this Windows machine, we have an Oracle Database 9.2. These log files are in the following format: in<date>.log i.e. in20070429.log.
I would like to load this log file's data to an Oracle table every day and I would like to use SQL Loader for this job.
The problem is that the log file name is different every day.
How can I give this variable log file name in the Control file, which is used for the SQL Loader?
file.ctl
LOAD DATA
INFILE 'D:\gbal\in<date>.log'
APPEND INTO TABLE CHAT_SL
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
(SL1 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
SL2 char,
SL3 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
SL4 char,
SL5 char,
SL6 char,
SL7 char,
SL8 char,
SL9 char,
SL10 char,
SL11 char,
SL12 char,
SL13 char,
SL14 char,
SL15 char)
Do you have any better idea about this issue?
I thought of renaming the log file to an instant name, such as in.log, but how can I distinguish the desired log file, from the other two?
Thank you very much in advance.
Giorgos Baliotis
I don't have a direct solution for your problem.
However, if you invoke SQL*Loader from an Oracle stored procedure, it is possible to set the control/log file dynamically.
# Grant privileges to the user to execute command-prompt statements
BEGIN
dbms_java.grant_permission('bc4186ol','java.io.FilePermission','C:\windows\system32\cmd.exe','execute');
END;
* Procedure to execute operating system commands using PL/SQL (an Oracle script making use of Java packages)
CREATE OR REPLACE AND COMPILE JAVA SOURCE NAMED "Host" AS
import java.io.*;
public class Host {
    public static void executeCommand(String command) {
        try {
            String[] finalCommand = new String[4];
            finalCommand[0] = "C:\\windows\\system32\\cmd.exe";
            finalCommand[1] = "/y";
            finalCommand[2] = "/c";
            finalCommand[3] = command;
            final Process pr = Runtime.getRuntime().exec(finalCommand);
            // Drain standard output in its own thread
            new Thread(new Runnable() {
                public void run() {
                    try {
                        BufferedReader brIn = new BufferedReader(new InputStreamReader(pr.getInputStream()));
                        String buff;
                        while ((buff = brIn.readLine()) != null) {
                            System.out.println("Process out :" + buff);
                            try { Thread.sleep(100); } catch (Exception e) {}
                        }
                    } catch (IOException ioe) {
                        System.out.println("Exception caught printing process output.");
                        ioe.printStackTrace();
                    }
                }
            }).start();
            // Drain standard error in its own thread
            new Thread(new Runnable() {
                public void run() {
                    try {
                        BufferedReader brErr = new BufferedReader(new InputStreamReader(pr.getErrorStream()));
                        String buff;
                        while ((buff = brErr.readLine()) != null) {
                            System.out.println("Process err :" + buff);
                            try { Thread.sleep(100); } catch (Exception e) {}
                        }
                    } catch (IOException ioe) {
                        System.out.println("Exception caught printing process error.");
                        ioe.printStackTrace();
                    }
                }
            }).start();
        } catch (Exception ex) {
            System.out.println(ex.getLocalizedMessage());
        }
    }
    public static boolean isWindows() {
        return System.getProperty("os.name").toLowerCase().indexOf("windows") != -1;
    }
}
* Oracle wrapper to call the above procedure
CREATE OR REPLACE PROCEDURE Host_Command (p_command IN VARCHAR2)
AS LANGUAGE JAVA
NAME 'Host.executeCommand (java.lang.String)';
* Now invoke the procedure with an operating system command (execute SQL*Loader)
* The execution of the script will ensure the Prod mapping data file is loaded into the PROD_5005_710_MAP table
* Change the control\log\discard\bad files as appropriate
BEGIN
Host_Command (p_command => 'sqlldr system/tiburon@orcl control=C:\anupama\emp_join'||1||'.ctl log=C:\anupama\ond_lists.log');
END;
Does that help you?
Regards,
Bhagat -
SQL Loader fails loading XML data enclosed by tag not found
The problem I'm having is my XML tree doesn't contain all possible elements. In this example the second entry doesn't contain <age> - only the first entry will be added to the database
Any idea of how I could solve this?
The fields are saved as varchar2
XML:
<rowset>
<row>
<name>Name</name>
<age>Age</age>
<city>City</city>
</row>
<row>
<name>Name2</name>
<city>City2</city>
</row>
</rowset>
LOAD DATA
INFILE 'data.xml' "str '</row>'"
APPEND
INTO TABLE test
TRAILING NULLCOLS
(dummy FILLER terminated BY "<row>",
name ENCLOSED BY "<name>" AND "</name>",
age ENCLOSED BY "<age>" AND "</age>",
city ENCLOSED BY "<city>" AND "</city>"
)
I noticed that the failure occurs when using the 11g version of SQL*Loader. It doesn't fail when using the 10g version.
Delimited source data comes from:
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Prod
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
TNS for Linux: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production
And it will be loaded into:
Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bi
PL/SQL Release 10.2.0.5.0 - Production
CORE 10.2.0.5.0 Production
TNS for Linux: Version 10.2.0.5.0 - Production
NLSRTL Version 10.2.0.5.0 - Production
My previously used SQL*Loader was from:
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
PL/SQL Release 11.2.0.1.0 - Production
CORE 11.2.0.1.0 Production
TNS for 64-bit Windows: Version 11.2.0.1.0 - Production
NLSRTL Version 11.2.0.1.0 - Production
It seems that I have found the real culprit. Should I know something more?
I have used the OMWB "generate SQL*Loader script" option and received the SQL*Loader error below.
The previous attempt to use OMWB online loading generated garbage data: the picture was not matching the person ID.
Table in Sql Server..................
CREATE TABLE [nilesh] (
[LargeObjectID] [int] NOT NULL ,
[LargeObject] [image] NULL ,
[ContentType] [varchar] (40) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectName] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectExtension] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectDescription] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectSize] [int] NULL ,
[VersionControl] [bit] NULL ,
[WhenLargeObjectLocked] [datetime] NULL ,
[WhoLargeObjectLocked] [char] (11) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectTimeStamp] [timestamp] NOT NULL ,
[LargeObjectOID] [uniqueidentifier] NOT NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
Table in Oracle..............
CREATE TABLE LARGEOBJECT
LARGEOBJECTID NUMBER(10) NOT NULL,
LARGEOBJECT BLOB,
CONTENTTYPE VARCHAR2(40 BYTE),
LARGEOBJECTNAME VARCHAR2(255 BYTE),
LARGEOBJECTEXTENSION VARCHAR2(10 BYTE),
LARGEOBJECTDESCRIPTION VARCHAR2(255 BYTE),
LARGEOBJECTSIZE NUMBER(10),
VERSIONCONTROL NUMBER(1),
WHENLARGEOBJECTLOCKED DATE,
WHOLARGEOBJECTLOCKED CHAR(11 BYTE),
LARGEOBJECTTIMESTAMP NUMBER(8) NOT NULL,
LARGEOBJECTOID RAW(16) NOT NULL
TABLESPACE USERS
PCTUSED 0
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
LOGGING
NOCOMPRESS
LOB (LARGEOBJECT) STORE AS
( TABLESPACE USERS
ENABLE STORAGE IN ROW
CHUNK 8192
PCTVERSION 10
NOCACHE
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
NOCACHE
NOPARALLEL
MONITORING;
Sql Loader script....
SET NLS_DATE_FORMAT=Mon dd YYYY HH:mi:ssAM
REM SET NLS_TIMESTAMP_FORMAT=Mon dd YYYY HH:mi:ss:ffAM
REM SET NLS_LANGUAGE=AL32UTF8
sqlldr cecildata/@ceciltst control=LARGEOBJECT.ctl log=LARGEOBJECT.log
Sql loader control file......
load data
infile 'nilesh.dat' "str '<er>'"
into table LARGEOBJECT
fields terminated by '<ec>'
trailing nullcols
(LARGEOBJECTID,
LARGEOBJECT CHAR(2000000) "HEXTORAW (:LARGEOBJECT)",
CONTENTTYPE "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)",
LARGEOBJECTNAME CHAR(255) "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)",
LARGEOBJECTEXTENSION "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)",
LARGEOBJECTDESCRIPTION CHAR(255) "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)",
LARGEOBJECTSIZE,
VERSIONCONTROL,
WHENLARGEOBJECTLOCKED,
WHOLARGEOBJECTLOCKED,
LARGEOBJECTTIMESTAMP,
LARGEOBJECTOID "GUID_MOVER(:LARGEOBJECTOID)")
Error Received...
Column Name Position Len Term Encl Datatype
LARGEOBJECTID FIRST * CHARACTER
Terminator string : '<ec>'
LARGEOBJECT NEXT ***** CHARACTER
Maximum field length is 2000000
Terminator string : '<ec>'
SQL string for column : "HEXTORAW (:LARGEOBJECT)"
CONTENTTYPE NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)"
LARGEOBJECTNAME NEXT 255 CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)"
LARGEOBJECTEXTENSION NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)"
LARGEOBJECTDESCRIPTION NEXT 255 CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)"
LARGEOBJECTSIZE NEXT * CHARACTER
Terminator string : '<ec>'
VERSIONCONTROL NEXT * CHARACTER
Terminator string : '<ec>'
WHENLARGEOBJECTLOCKED NEXT * CHARACTER
Terminator string : '<ec>'
WHOLARGEOBJECTLOCKED NEXT * CHARACTER
Terminator string : '<ec>'
LARGEOBJECTTIMESTAMP NEXT * CHARACTER
Terminator string : '<ec>'
LARGEOBJECTOID NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "GUID_MOVER(:LARGEOBJECTOID)"
SQL*Loader-309: No SQL string allowed as part of LARGEOBJECT field specification
What's the cause?
"The previous attempt to use OMWB online loading has generated garbage data. The picture was not matching with the person id."
This is being worked on (bug 4119713). If you have a reproducible test case, please send it in (small test cases seem to work OK).
I have the following email about BLOBS I could forward to you if I have your email address:
[The forum may cut the lines in the wrong places]
Regards,
Turloch
Oracle Migration Workbench Team
Hi,
This may provide the solution. Without having the customer files here I can only guess at the problem. But this should help.
This email outlines a BLOB data move.
There are quite a few steps to complete the task of moving a large BLOB into the Oracle database.
Normally this wouldn't be a problem, but as far as we can tell SQL Server's (and possibly Sybase's) BCP does not reliably export binary data.
The only way to export binary data properly via BCP is to export it in a HEX format.
Once in HEX format it is difficult to get it back to binary during a data load into Oracle.
We have come up with the idea of getting the HEX values into Oracle by saving them in a CLOB (holds text) column.
We then convert the HEX values to binary values and insert them into the BLOB column.
The problem here is that the HEXTORAW function in Oracle only converts a maximum of 2000 HEX pairs.
We over came this problem by writing our own procedure that will convert (bit by bit) your HEX data to binary.
NOTE: YOU MUST MODIFY THE START.SQL AND FINISH.SQL TO SUIT YOUR CUSTOMER
The task is split into 4 sub tasks
1) CREATE A TABLESPACE TO HOLD ALL THE LOB DATA
--log into your system schema and create a tablespace
--Create a new tablespace for the CLOB and BLOB column (this may take a while to create)
--You may resize this to fit your data ,
--but I believe you have in excess of 500MB of data in this table and we are going to save it twice (in a clob then a blob)
--Note: This script may take some time to execute as it has to create a tablespace of 1000Mb.
-- Change this to suit your customer.
-- You can change this if you want depending on the size of your data
-- Remember that we save the data once as CLOB and then as BLOB
create tablespace lob_tablespace datafile 'lob_tablespace' SIZE 1000M AUTOEXTEND ON NEXT 50M;
LOG INTO YOUR TABLE SCHEMA IN ORACLE
--Modify this script to fit your requirements
2) START.SQL (this script will do the following tasks)
a) Modify your current schema so that it can accept HEX data
b) Modify your current schema so that it can hold that huge amount of data.
The new tablespace is used; you may want to alter this to your requirements
c) Disable triggers, indexes & primary keys on tblfiles
3)DATA MOVE
The data move now involves moving the HEX data in the .dat files to a CLOB.
The START.SQL script adds a new column to <tablename> called <blob_column>_CLOB.
This is where the HEX values will be stored.
MODIFY YOUR CONTROL FILE TO LOOK LIKE THIS:
load data
infile '<tablename>.dat' "str '<er>'"
into table <tablename>
fields terminated by '<ec>'
trailing nullcols
<blob_column>_CLOB CHAR(200000000),
The important part being "_CLOB" appended to your BLOB column name and the datatype set to CHAR(200000000)
RUN sql_loader_script.bat
Log into your schema to check if the data was loaded successfully -- now you can see that the hex values were sent to the CLOB column:
SQL> select dbms_lob.getlength(<blob_column>),dbms_lob.getlength(<blob_column>_clob) from <tablename>;
LOG INTO YOUR SCHEMA
4) FINISH.SQL (this script will do the following tasks)
a) Creates the procedure needed to perform the CLOB to BLOB transformation
b) Executes the procedure (this may take some time, as 500 MB has to be converted to BLOB)
c) Alters the table back to its original form (removes the <blob_column>_clob)
d) Enables the triggers, indexes and primary keys
Regards,
(NAME)
-- START.SQL
-- Modify this for your particular customer
-- This should be executed in the user schema in Oracle that contains the table.
-- DESCRIPTION:
-- ALTERS THE OFFENDING TABLE SO THAT THE DATA MOVE CAN BE EXECUTED
-- DISABLES TRIGGERS, INDEXES AND SEQUENCES ON THE OFFENDING TABLE
-- 1) Add an extra column to hold the hex string
alter table <tablename> add (FILEBINARY_CLOB CLOB);
-- 2) Allow the BLOB column to accept NULLs
alter table <tablename> MODIFY FILEBINARY NULL;
-- 3) Disable triggers and sequences on tblfiles
alter trigger <triggername> disable;
alter table tblfiles drop primary key cascade;
drop index <indexname>;
-- 4) Allow the table to use the tablespace
alter table <tablename> move lob (<blob_column>) store as (tablespace lob_tablespace);
alter table tblfiles move lob (<blob_column>_clob) store as (tablespace lob_tablespace);
COMMIT;
-- END OF FILE
-- FINISH.SQL
-- Modify this for your particular customer
-- This should be executed in the table schema in Oracle.
-- DESCRIPTION:
-- MOVES THE DATA FROM CLOB TO BLOB
-- MODIFIES THE TABLE BACK TO ITS ORIGINAL SPEC (without a clob)
-- THEN ENABLES THE SEQUENCES, TRIGGERS AND INDEXES AGAIN
-- Currently we have the hex values saved as text in the <columnname>_CLOB column
-- And we have NULL in all rows for the <columnname> column.
-- We have to get BLOB locators for each row in the BLOB column
-- put empty blobs in the blob column
UPDATE <tablename> SET filebinary=EMPTY_BLOB();
COMMIT;
-- create the following procedure in your table schema
CREATE OR REPLACE PROCEDURE CLOBTOBLOB
AS
inputLength NUMBER; -- size of input CLOB
offSet NUMBER := 1;
pieceMaxSize NUMBER := 50; -- the max size of each piece
piece VARCHAR2(50); -- these pieces will make up the entire CLOB
currentPlace NUMBER := 1; -- this is where we're up to in the CLOB
blobLoc BLOB; -- blob locator in the table
clobLoc CLOB; -- clob locator; this is the value from the dat file
-- THIS HAS TO BE CHANGED FOR SPECIFIC CUSTOMER TABLE AND COLUMN NAMES
CURSOR cur IS SELECT <blob_column>_clob clob_column, <blob_column> blob_column FROM <tablename> FOR UPDATE;
cur_rec cur%ROWTYPE;
BEGIN
OPEN cur;
FETCH cur INTO cur_rec;
WHILE cur%FOUND
LOOP
-- RETRIEVE the clobLoc and blobLoc
clobLoc := cur_rec.clob_column;
blobLoc := cur_rec.blob_column;
currentPlace := 1; -- reset every time
-- find the length of the clob
inputLength := DBMS_LOB.getLength(clobLoc);
-- loop through each piece
LOOP
-- get the next piece of the clob
piece := DBMS_LOB.subStr(clobLoc,pieceMaxSize,currentPlace);
-- append this piece to the BLOB
DBMS_LOB.WRITEAPPEND(blobLoc, LENGTH(piece)/2, HEXTORAW(piece));
currentPlace := currentPlace + pieceMaxSize ;
EXIT WHEN inputLength < currentplace;
END LOOP;
FETCH cur INTO cur_rec;
END LOOP;
END CLOBtoBLOB;
-- now run the procedure
-- It will update the blob column with the correct binary representation of the clob column
EXEC CLOBtoBLOB;
-- drop the extra clob column
alter table <tablename> drop column <blob_column>_clob;
-- 2) apply the constraint we removed during the data load
alter table <tablename> MODIFY FILEBINARY NOT NULL;
-- Now re enable the triggers,indexs and primary keys
alter trigger <triggername> enable;
ALTER TABLE TBLFILES ADD ( CONSTRAINT <pkname> PRIMARY KEY ( <column>) ) ;
CREATE INDEX <index_name> ON TBLFILES ( <column> );
COMMIT;
-- END OF FILE -
Sql loader Wizard in Toad For Oracle
When I used Oracle XE I could load data using Toad with the SQL*Loader wizard.
But with XE I met some problems, so I set up an Oracle 10g database server, and now when I use Toad nothing happens and there is no error in the SQL*Loader wizard.
Is it about Unicode characters? When I set up Oracle I chose the WE8ISO8859P9 character set and changed the registry nls_lang value to TURKISH_TURKEY.WE8ISO8859P9. What do you think the error can be?
I solved the problem using the command window, and I can load data into the table.
Thanks all.
c:\> sqlldr.exe kdevre@orcl control=C:\Users\TT\Desktop\loaderlar\kd_abone.ctl
But I still don't understand why I didn't succeed with Toad.
Sql * loader/Utl_file
Hi all,
As i know that both the UTL_FILE package and SQL* Loader are used to Load the flat file data into Apps Table.
My question is
1) Which one is the Best Method to use.
2) What are the advantages and disadvantages of using UTL_FILE and SQL*Loader?
Thanks In Advance
Goutham Konduru
Triple posting:
Diffrence b/w UTL_FILE package & SQL Loader
UTL_FILE package and SQL* Loader -
I have a text file that has multiple records on a line. I want to bring in each as a separate record, then drop to the next line and continue with a new set of data. Each record is separated by a space; each line has a maximum of 40 records, each 54 characters long. I couldn't figure out how to use CONTINUEIF with SQL*Loader to make this work. Any suggestions?
Thank you,
Steven
Hi Steven
From what I understand, you have a line in a flat file and this line is to be separated into separate rows and inserted into the db?
If yes, then this might help you.
I suggest you create a function with an IN OUT parameter, which you feed the line and separate by a defined separator, and use each portion as a separate record/row.
ie:
FUNCTION TO_TEXTVAL
(P_TEXT_LINE IN OUT VARCHAR2)
RETURN VARCHAR2 IS
V_SEP NUMBER := 0;
V_TEXT_LINE VARCHAR2(10000);
BEGIN
V_SEP := INSTR(P_TEXT_LINE,';');
IF V_SEP = 0 THEN
V_TEXT_LINE := P_TEXT_LINE;
ELSE
V_TEXT_LINE := SUBSTR(P_TEXT_LINE, 1, V_SEP - 1);
P_TEXT_LINE := SUBSTR(P_TEXT_LINE, V_SEP + 1);
END IF;
RETURN (V_TEXT_LINE);
END TO_TEXTVAL;
Here I have used ';' as my separator; you can replace that with ' ' (a space) to do the job:
FUNCTION TO_TEXTVAL
(P_TEXT_LINE IN OUT VARCHAR2)
RETURN VARCHAR2 IS
V_SEP NUMBER := 0;
V_TEXT_LINE VARCHAR2(10000);
BEGIN
V_SEP := INSTR(P_TEXT_LINE,' ');
IF V_SEP = 0 THEN
V_TEXT_LINE := P_TEXT_LINE;
ELSE
V_TEXT_LINE := SUBSTR(P_TEXT_LINE, 1, V_SEP - 1);
P_TEXT_LINE := SUBSTR(P_TEXT_LINE, V_SEP + 1);
END IF;
RETURN (V_TEXT_LINE);
END TO_TEXTVAL;
Then you can use this function in a procedure that calls the UTL_FILE.GET_LINE built-in,
ie:
UTL_FILE.GET_LINE(v_filehandle, v_text);
v_item_code := TO_TEXTVAL(v_text);
v_item_qty := TO_TEXTVAL(v_text);
v_prod_date := TO_DATE(TO_TEXTVAL(v_text), 'MM/DD/YYYY HH:MI:SS AM');
hope this helps.
Regards
Tony G. -
Running SQL Loader through PL/SQL
Hi All,
Is there a utility package that can be used to run SQL LOADER through PL/SQL?
Regards
External tables are new in 9i.
If you need to call SQL*Loader in 8i, you'd be stuck with the Java stored procedure/ external procedure approach. Of course, this might also be an impetus to upgrade, since 8.1.7 leaves error correction support at the end of the year.
Justin
Distributed Database Consulting, Inc.
http://www.ddbcinc.com/askDDBC -
Sql loader utl_file & external table
can any one let me know the differences between.
1.sql loader
2.utl_file
3.external table
Regards.
Asif.
To expand on Aron's answer:
SQL*Loader - An operating system utility which uses control files (which you create) to load data files onto database tables.
UTL_FILE - A database package which can be used for reading and writing files in any format you care to design programmatically.
External Table - The latest option, which can be used instead of SQL*Loader. This is done from the database end, by creating a table as an external table and pointing it at the source file on the operating system. It also allows information similar to that put in SQL*Loader control files to be specified against the table. By querying the table you are in fact querying the source file. There are some limitations compared to regular database tables, such as no ability to write to the external table.
;)
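To make the last point concrete, here is a minimal sketch (the directory path, table name, and file name are invented) of an external table that replaces a simple delimited-file control file:

```sql
CREATE DIRECTORY ext_dir AS '/home/oraprod';

CREATE TABLE emp_ext (
  empid  NUMBER,
  fname  VARCHAR2(30),
  lname  VARCHAR2(30),
  salary NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('data.txt')
);

-- Querying the external table reads data.txt directly:
SELECT * FROM emp_ext;
```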