Transformation in SQL Loader

Hi all,
I am in the process of loading data using SQL*Loader, and I need to transform the data as I load it.
I have data in a flat file as below:
1     10,20,30,40     vijay
2     10,20     jagdeesh
I need to load this data as:
1     10     vijay
1     20     vijay
1     30     vijay
1     40     vijay
2     10     jagdeesh
2     20     jagdeesh
Is it possible to load the data as above?
Thanks,
Vijayaraghavan K

Oracle 10.2.0.4
create table TMP_MESS_MISSED (
extension varchar2(100),
pin varchar2(30)
);
create view TMP_MESS_MISSED_VI
as
select *
from TMP_MESS_MISSED;
create trigger TMP_MESS_MISSED_VI_IO
instead of insert on TMP_MESS_MISSED_VI
REFERENCING NEW AS NEW OLD AS OLD
FOR EACH ROW
BEGIN
insert into TMP_MESS_MISSED(extension, pin)
values('AAA'||:new.extension, :new.pin);
END;
/
CTL file:
load data
infile *
replace
into table TMP_MESS_MISSED_VI
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
(extension,
pin)
BEGINDATA
72014848332,0
74014848401,103
72014848430,0
77017067666,0
72017773973,0
72018181288,106
72018181288,8
72018841396,0
72023498477,0
72023604787,0
72023604787,2
Running sqlldr:
SQL*Loader: Release 10.2.0.4.0 - Production on Wed May 5 14:00:58 2010
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Commit point reached - logical record count 12
RESULT:
select * from TMP_MESS_MISSED
AAA72014848332      0
AAA74014848401      103
AAA72014848430      0
AAA77017067666      0
AAA72017773973      0
AAA72018181288      106
AAA72018181288      8
AAA72018841396      0
AAA72023498477      0
AAA72023604787      0
AAA72023604787      2
Regards
Andrey
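
For the original requirement of fanning one comma-separated list out into several rows, a staging approach also works on 10.2. A hedged sketch follows; stage_raw and target_tab are invented names, and the flat file is first loaded 1:1 into stage_raw with sqlldr:
create table stage_raw (
id number,
vals varchar2(200), -- the comma-separated list, e.g. '10,20,30,40'
name varchar2(30)
);
insert into target_tab (id, val, name)
select s.id,
to_number(regexp_substr(s.vals, '[^,]+', 1, n.pos)) as val,
s.name
from stage_raw s,
(select level pos from dual connect by level <= 100) n
where n.pos <= length(s.vals) - length(replace(s.vals, ',')) + 1;
The subquery just generates the numbers 1 to 100; the WHERE clause keeps one copy of each staged row per list element.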

Similar Messages

  • Comparison of Data Loading techniques - Sql Loader & External Tables

    Below are two techniques for loading data from flat files into Oracle tables.
    1)     SQL Loader:
    a.     Place the flat file( .txt or .csv) on the desired Location.
    b.     Create a control file
    Load Data
    Infile "Mytextfile.txt" -- file containing the table data; specify the path correctly, it could be a .csv as well
    Append (or Truncate, based on requirement)
    Into Table oracle_tablename
    Fields Terminated By "," (or whatever delimiter the input file uses) Optionally Enclosed By '"'
    (Field1, field2, field3, ...)
    c.     Now run Oracle's sqlldr utility from the OS command prompt (a complete control-file sketch for the abc.csv sample appears after the external-table steps below):
    sqlldr username/password control=filename.ctl
    d.     The data can be verified by selecting the data from the table.
    Select * from oracle_table;
    2)     External Table:
    a.     Place the flat file (.txt or .csv) on the desired location.
    abc.csv
    1,one,first
    2,two,second
    3,three,third
    4,four,fourth
    b.     Create a directory
    create or replace directory ext_dir as '/home/rene/ext_dir'; -- path where the source file is kept
    c.     After granting appropriate permissions to the user, we can create external table like below.
    create table ext_table_csv (
    i Number,
    n Varchar2(20),
    m Varchar2(20)
    )
    organization external (
    type oracle_loader
    default directory ext_dir
    access parameters (
    records delimited by newline
    fields terminated by ','
    missing field values are null
    )
    location ('abc.csv')
    )
    reject limit unlimited;
    d.     Verify data by selecting it from the external table now
    select * from ext_table_csv;
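    Tying technique 1 back to the same abc.csv sample, a complete control file might look like the sketch below. This is a hedged example; the target table name load_demo is invented, and the table must already exist with matching columns:
    load data
    infile 'abc.csv'
    append
    into table load_demo
    fields terminated by ',' optionally enclosed by '"'
    (i, n, m)
    It would be run as: sqlldr scott/tiger control=abc.ctl log=abc.log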
    The external tables feature is a complement to the existing SQL*Loader functionality.
    It allows you to –
    •     Access data in external sources as if it were in a table in the database.
    •     Merge a flat file with an existing table in one statement.
    •     Sort a flat file on the way into a table you want compressed nicely
    •     Do a parallel direct path load without splitting up the input file.
    Shortcomings:
    •     External tables are read-only.
    •     No data manipulation language (DML) operations or index creation is allowed on an external table.
    Using external tables (rather than SQL*Loader alone) you can –
    •     Load the data from a stored procedure or trigger (an INSERT can read an external table, but sqlldr cannot be invoked there)
    •     Do multi-table inserts
    •     Flow the data through a pipelined plsql function for cleansing/transformation
    Comparison for data loading
    To make the loading operation faster, the degree of parallelism can be set to any number, e.g. 4.
    So when you create the external table with a parallel degree of 4, the database will divide the file to be read among four processes running in parallel. This parallelism happens automatically, with no additional effort on your part, and is really quite convenient. To parallelize this load using SQL*Loader, you would have had to manually divide your input file into multiple smaller files.
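    As a hedged illustration of that point (stage_copy is an invented target table; ext_table_csv is the external table created above):
    alter table ext_table_csv parallel 4;
    alter session enable parallel dml;
    insert /*+ append parallel(t, 4) */ into stage_copy t
    select * from ext_table_csv;
    commit;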
    Conclusion:
    SQL*Loader may be the better choice in data loading situations that require additional indexing of the staging table. However, we can always copy the data from external tables to Oracle Tables using DB links.

    Please let me know your views on this.

  • SQL Loader CLOBs

    I am moving data from one Oracle 10g table to another Oracle 10g table, transforming the data in Access, filtering records, merging fields, etc., then via Access VBA creating CTL files and the necessary scripts, then loading the data via SQL*Loader. Getting the CLOBs to load is not a problem; getting them to load with their formatting intact is. The records look fine in the control file, the formatting is kept just fine in the CTL file, but when it is loaded all the carriage returns, line feeds, etc. are removed.
    Here is the header of my CTL file:
    LOAD DATA
    INFILE *
    CONTINUEIF LAST != "|"
    INTO TABLE table_name
    APPEND
    FIELDS TERMINATED BY '||' TRAILING NULLCOLS
    (field1, field2, field3....etc)
    begindata
    field1||field2||field3||fieldx|
    At first I thought taking out the TRAILING NULLCOLS would help, but I get the same result.
    The end result i am looking for is keeping the data formatted exactly as it is in the source table.
    Thanks
    Mike

    After doing some research it seems that the only way is creating a file with the clob, see http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_loading.htm#i1007627
    Control File Contents:
    LOAD DATA
    INFILE 'sample.dat'
    INTO TABLE person_table
    FIELDS TERMINATED BY ','
    (name CHAR(20),
    ext_fname FILLER CHAR(40),
    "RESUME" LOBFILE(ext_fname) TERMINATED BY EOF)
    Datafile (sample.dat):
    Johny Quest,jqresume.txt,
    Speed Racer,'/private/sracer/srresume.txt',
    Secondary Datafile (jqresume.txt):
    Johny Quest,479 Johny Quest
    500 Oracle Parkway
    [email protected]>
    HTH
    Enrique

  • SQL Loader unicode (umlaut) problem

    Hi
    I want to load some data with SQL Loader. The data contains german umlaut like ä, ö, ü.
    The loading process works, but the umlauts are transformed to something like 'ü' in the DB. How can I get them to load correctly?
    My environment:
    - DB 10g Rel.2
    - Windows XP
    - Registry key in Ora_Home: NLS_LANG=GERMAN_GERMANY.WE8MSWIN1252
    I tried setting the character set in the CTL file:
    characterset 'WE8MSWIN1252'
    That didn't help either.
    Does anyone have an idea? I searched the forum but didn't find a solution.
    Thanks for your help,
    Roger

    Maybe a codepage issue ? See this example :
    C:\tmp>type umlaut.ctl
    load data
    infile umlaut.dat
    replace
    into table umlaut_tab
    (a)
    C:\tmp>sqlldr test/test control=umlaut.ctl
    SQL*Loader: Release 10.2.0.1.0 - Production on Tue Jun 10 13:19:50 2008
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Commit point reached - logical record count 2
    Commit point reached - logical record count 3
    C:\tmp>sqlplus test/test
    SQL*Plus: Release 10.2.0.1.0 - Production on Tue Jun 10 13:19:56 2008
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    SQL> select * from umlaut_tab;
    A
    õ
    ÷
    ³
    SQL> exit
    Disconnected from Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    C:\tmp>chcp 1252
    Tabella codici attiva: 1252 (Active code page: 1252)
    C:\tmp>sqlplus test/test
    SQL*Plus: Release 10.2.0.1.0 - Production on Tue Jun 10 13:20:19 2008
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    SQL> select * from umlaut_tab;
    A
    ä
    ö
    ü
    SQL>

  • Sql loader issue char to date

    Hi
    I have a text file like this:
    logs~-~189.138.221.234~[19/Nov/2007:18:39:53 +0100]~mujer.orange.es~/mujer.woo/home/home/index.html~mujer.woo~home~home~index.html~Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; FunWebProducts; InfoPath.2)~bid=1195493283-1925481236; __em_p=1195493283-1925481236%26bid; GUID=0002D029C7BD07412A7C28F861626364~-~?ord=4939922038660
    I'm trying to load it via SQL*Loader into a table like this:
    CREATE TABLE SPY_LOGS_FLOGS (
    TYPE VARCHAR2(10 BYTE),
    PROXY VARCHAR2(30 BYTE),
    IP VARCHAR2(30 BYTE),
    DATETIME DATE,
    REFERER VARCHAR2(100 BYTE),
    SPY VARCHAR2(100 BYTE),
    SPY0 VARCHAR2(100 BYTE),
    SPY1 VARCHAR2(100 BYTE),
    SPY2 VARCHAR2(100 BYTE),
    SPY3 VARCHAR2(100 BYTE),
    BROWSER VARCHAR2(300 BYTE),
    COOKIE VARCHAR2(100 BYTE),
    UNKNOWN VARCHAR2(100 BYTE),
    QS VARCHAR2(100 BYTE)
    );
    I'm using a control file similar to this:
    LOAD DATA
    infile '../files/spy_logs_flogs'
    append
    into table spy_logs_flogs
    FIELDS TERMINATED BY '~'
    TRAILING NULLCOLS
    (
    TYPE char,
    PROXY char,
    IP char,
    DATETIME date 'to_date(:DATETIME,'"["yyyymmdd hh24:mi:ss"] +100"')' ,
    REFERER char,
    SPY char,
    SPY0 char,
    SPY1 char,
    SPY2 char,
    SPY3 char,
    BROWSER char,
    COOKIE char,
    UNKNOWN char,
    qs char
    )
    I'm trying to transform the DATETIME field from [19/Nov/2007:18:39:53 +0100] to the date 19/11/2007 18:39:53, but I have tested with the control file and can't find a way to do it.
    Any help will be appreciated.

    Play with
    select substr('[19/Nov/2007:18:39:53 +0100]',2,20) the_date
      from dual
    and if you get 19/Nov/2007:18:39:53 proceed to
    select to_date('19/Nov/2007:18:39:53','DD/MON/YYYY:HH24:MI:SS')
      from dual
    select to_date('19/Nov/2007:18:39:53','DD/Mon/YYYY:HH24:MI:SS')
      from dual
    select to_date('19/Nov/2007 18:39:53','DD/MON/YYYY HH24:MI:SS')
      from dual
    select to_date('19/Nov/2007 18:39:53','DD/Mon/YYYY HH24:MI:SS')
      from dual
    Perhaps one of those will work (if it's the one without : then you must replace it with a space using substr and || before using to_date).
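    Once one of those masks works, the same expression can go straight into the control file's field list. A minimal, untested sketch using the column name from the post above:
    DATETIME "to_date(substr(:DATETIME, 2, 20), 'DD/Mon/YYYY:HH24:MI:SS')",
    SQL*Loader applies the double-quoted SQL string to the field after reading it, so no separate staging step is needed.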
    Regards
    Etbin

  • Sql loader - BLOB

    I have used the OMWB "generate SQL Loader script" option and received the SQL*Loader error below.
    The previous attempt to use OMWB online loading generated garbage data: the picture did not match the person ID.
    Table in Sql Server..................
    CREATE TABLE [nilesh] (
         [LargeObjectID] [int] NOT NULL ,
         [LargeObject] [image] NULL ,
         [ContentType] [varchar] (40) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectName] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectExtension] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectDescription] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectSize] [int] NULL ,
         [VersionControl] [bit] NULL ,
         [WhenLargeObjectLocked] [datetime] NULL ,
         [WhoLargeObjectLocked] [char] (11) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectTimeStamp] [timestamp] NOT NULL ,
         [LargeObjectOID] [uniqueidentifier] NOT NULL
    ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
    GO
    Table in Oracle..............
    CREATE TABLE LARGEOBJECT (
    LARGEOBJECTID NUMBER(10) NOT NULL,
    LARGEOBJECT BLOB,
    CONTENTTYPE VARCHAR2(40 BYTE),
    LARGEOBJECTNAME VARCHAR2(255 BYTE),
    LARGEOBJECTEXTENSION VARCHAR2(10 BYTE),
    LARGEOBJECTDESCRIPTION VARCHAR2(255 BYTE),
    LARGEOBJECTSIZE NUMBER(10),
    VERSIONCONTROL NUMBER(1),
    WHENLARGEOBJECTLOCKED DATE,
    WHOLARGEOBJECTLOCKED CHAR(11 BYTE),
    LARGEOBJECTTIMESTAMP NUMBER(8) NOT NULL,
    LARGEOBJECTOID RAW(16) NOT NULL
    )
    TABLESPACE USERS
    PCTUSED 0
    PCTFREE 10
    INITRANS 1
    MAXTRANS 255
    STORAGE (
    INITIAL 64K
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
    )
    LOGGING
    NOCOMPRESS
    LOB (LARGEOBJECT) STORE AS (
    TABLESPACE USERS
    ENABLE STORAGE IN ROW
    CHUNK 8192
    PCTVERSION 10
    NOCACHE
    STORAGE (
    INITIAL 64K
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
    )
    )
    NOCACHE
    NOPARALLEL
    MONITORING;
    Sql Loader script....
    SET NLS_DATE_FORMAT=Mon dd YYYY HH:mi:ssAM
    REM SET NLS_TIMESTAMP_FORMAT=Mon dd YYYY HH:mi:ss:ffAM
    REM SET NLS_LANGUAGE=AL32UTF8
    sqlldr cecildata/@ceciltst control=LARGEOBJECT.ctl log=LARGEOBJECT.log
    Sql loader control file......
    load data
    infile 'nilesh.dat' "str '<er>'"
    into table LARGEOBJECT
    fields terminated by '<ec>'
    trailing nullcols
    (LARGEOBJECTID,
    LARGEOBJECT CHAR(2000000) "HEXTORAW (:LARGEOBJECT)",
    CONTENTTYPE "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)",
    LARGEOBJECTNAME CHAR(255) "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)",
    LARGEOBJECTEXTENSION "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)",
    LARGEOBJECTDESCRIPTION CHAR(255) "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)",
    LARGEOBJECTSIZE,
    VERSIONCONTROL,
    WHENLARGEOBJECTLOCKED,
    WHOLARGEOBJECTLOCKED,
    LARGEOBJECTTIMESTAMP,
    LARGEOBJECTOID "GUID_MOVER(:LARGEOBJECTOID)")
    Error Received...
    Column Name Position Len Term Encl Datatype
    LARGEOBJECTID FIRST * CHARACTER
    Terminator string : '<ec>'
    LARGEOBJECT NEXT ***** CHARACTER
    Maximum field length is 2000000
    Terminator string : '<ec>'
    SQL string for column : "HEXTORAW (:LARGEOBJECT)"
    CONTENTTYPE NEXT * CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)"
    LARGEOBJECTNAME NEXT 255 CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)"
    LARGEOBJECTEXTENSION NEXT * CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)"
    LARGEOBJECTDESCRIPTION NEXT 255 CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)"
    LARGEOBJECTSIZE NEXT * CHARACTER
    Terminator string : '<ec>'
    VERSIONCONTROL NEXT * CHARACTER
    Terminator string : '<ec>'
    WHENLARGEOBJECTLOCKED NEXT * CHARACTER
    Terminator string : '<ec>'
    WHOLARGEOBJECTLOCKED NEXT * CHARACTER
    Terminator string : '<ec>'
    LARGEOBJECTTIMESTAMP NEXT * CHARACTER
    Terminator string : '<ec>'
    LARGEOBJECTOID NEXT * CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "GUID_MOVER(:LARGEOBJECTOID)"
    SQL*Loader-309: No SQL string allowed as part of LARGEOBJECT field specification
    What's the cause?

    The previous attempt to use OMWB online loading has generated garbage data. The picture was not matching with the person id.
    This is being worked on (bug 4119713). If you have a reproducible testcase please send it in (small testcases seem to work OK).
    I have the following email about BLOBS I could forward to you if I have your email address:
    [The forum may cut the lines in the wrong places]
    Regards,
    Turloch
    Oracle Migration Workbench Team
    Hi,
    This may provide the solution. Without having the customer files here I can only guess at the problem. But this should help.
    This email outlines a BLOB data move.
    There are quite a few steps to complete the task of moving a large BLOB into the Oracle database.
    Normally this wouldn't be a problem, but as far as we can tell SQL Server's (and possibly Sybase) BCP does not reliably export binary data.
    The only way to export binary data properly via BCP is to export it in a HEX format.
    Once in a HEX format it is difficult to get it back to binary during a data load into Oracle.
    We have come up with the idea of getting the HEX values into Oracle by saving them in a CLOB (holds text) column.
    We then convert the HEX values to binary values and insert them into the BLOB column.
    The problem here is that the HEXTORAW function in Oracle only converts a maximum of 2000 HEX pairs.
    We over came this problem by writing our own procedure that will convert (bit by bit) your HEX data to binary.
    NOTE: YOU MUST MODIFY THE START.SQL AND FINISH.SQL TO SUIT YOUR CUSTOMER
    The task is split into 4 sub tasks
    1) CREATE A TABLESPACE TO HOLD ALL THE LOB DATA
    --log into your system schema and create a tablespace
    --Create a new tablespace for the CLOB and BLOB column (this may take a while to create)
    --You may resize this to fit your data ,
    --but I believe you have in excess of 500MB of data in this table and we are going to save it twice (in a clob then a blob)
    --Note: This script may take some time to execute as it has to create a tablespace of 1000Mb.
    -- Change this to suit your customer.
    -- You can change this if you want depending on the size of your data
    -- Remember that we save the data once as CLOB and then as BLOB
    create tablespace lob_tablespace datafile 'lob_tablespace' SIZE 1000M AUTOEXTEND ON NEXT 50M;
    LOG INTO YOUR TABLE SCHEMA IN ORACLE
    --Modify this script to fit your requirements
    2) START.SQL (this script will do the following tasks)
    a) Modify your current schema so that it can accept HEX data
    b) Modify your current schema so that it can hold that huge amount of data.
    The new tablespace is used; you may want to alter this to your requirements
    c) Disable triggers, indexes & primary keys on tblfiles
    3) DATA MOVE
    The data move now involves moving the HEX data in the .dat files to a CLOB.
    The START.SQL script adds a new column to <tablename> called <blob_column>_CLOB.
    This is where the HEX values will be stored.
    MODIFY YOUR CONTROL FILE TO LOOK LIKE THIS
    load data
    infile '<tablename>.dat' "str '<er>'"
    into table <tablename>
    fields terminated by '<ec>'
    trailing nullcols
    (
    <blob_column>_CLOB CHAR(200000000)
    )
    The important part being "_CLOB" appended to your BLOB column name and the datatype set to CHAR(200000000)
    RUN sql_loader_script.bat
    Log into your schema to check if the data was loaded successfully -- now you can see that the hex values were sent to the CLOB column:
    SQL> select dbms_lob.getlength(<blob_column>),dbms_lob.getlength(<blob_column>_clob) from <tablename>;
    LOG INTO YOUR SCHEMA
    4) FINISH.SQL (this script will do the following tasks)
    a) Creates the procedure needed to perform the CLOB to BLOB transformation
    b) Executes the procedure (this may take some time, as 500MB has to be converted to BLOB)
    c) Alters the table back to its original form (removes the <blob_column>_clob)
    d) Enables the triggers, indexes and primary keys
    Regards,
    (NAME)
    -- START.SQL
    -- Modify this for your particular customer
    -- This should be executed in the user schema in Oracle that contains the table.
    -- DESCRIPTION:
    -- ALTERS THE OFFENDING TABLE SO THAT THE DATA MOVE CAN BE EXECUTED
    -- DISABLES TRIGGERS, INDEXES AND SEQUENCES ON THE OFFENDING TABLE
    -- 1) Add an extra column to hold the hex string
    alter table <tablename> add (FILEBINARY_CLOB CLOB);
    -- 2) Allow the BLOB column to accept NULLs
    alter table <tablename> MODIFY FILEBINARY NULL;
    -- 3) Disable triggers and sequences on tblfiles
    alter trigger <triggername> disable;
    alter table tblfiles drop primary key cascade;
    drop index <indexname>;
    -- 4) Allow the table to use the tablespace
    alter table <tablename> move lob (<blob_column>) store as (tablespace lob_tablespace);
    alter table tblfiles move lob (<blob_column>_clob) store as (tablespace lob_tablespace);
    COMMIT;
    -- END OF FILE
    -- FINISH.SQL
    -- Modify this for your particular customer
    -- This should be executed in the table schema in Oracle.
    -- DESCRIPTION:
    -- MOVES THE DATA FROM CLOB TO BLOB
    -- MODIFIES THE TABLE BACK TO ITS ORIGIONAL SPEC (without a clob)
    -- THEN ENABLES THE SEQUENCES, TRIGGERS AND INDEXES AGAIN
    -- Currently we have the hex values saved as text in the <columnname>_CLOB column
    -- And we have NULL in all rows for the <columnname> column.
    -- We have to get BLOB locators for each row in the BLOB column
    -- put empty blobs in the blob column
    UPDATE <tablename> SET filebinary=EMPTY_BLOB();
    COMMIT;
    -- create the following procedure in your table schema
    CREATE OR REPLACE PROCEDURE CLOBTOBLOB
    AS
    inputLength NUMBER; -- size of input CLOB
    pieceMaxSize NUMBER := 50; -- the max size of each piece
    piece VARCHAR2(50); -- these pieces will make up the entire CLOB
    currentPlace NUMBER := 1; -- this is where we are up to in the CLOB
    blobLoc BLOB; -- blob locator in the table
    clobLoc CLOB; -- clob locator; this points to the value from the dat file
    -- THIS HAS TO BE CHANGED FOR SPECIFIC CUSTOMER TABLE AND COLUMN NAMES
    CURSOR cur IS SELECT <blob_column>_clob clob_column, <blob_column> blob_column FROM <tablename> FOR UPDATE;
    cur_rec cur%ROWTYPE;
    BEGIN
    OPEN cur;
    FETCH cur INTO cur_rec;
    WHILE cur%FOUND
    LOOP
    -- retrieve the clobLoc and blobLoc
    clobLoc := cur_rec.clob_column;
    blobLoc := cur_rec.blob_column;
    currentPlace := 1; -- reset every time
    -- find the length of the clob
    inputLength := DBMS_LOB.getLength(clobLoc);
    -- loop through each piece
    LOOP
    -- get the next piece of the clob
    piece := DBMS_LOB.subStr(clobLoc, pieceMaxSize, currentPlace);
    -- append this piece to the BLOB (two hex characters become one byte)
    DBMS_LOB.WRITEAPPEND(blobLoc, LENGTH(piece)/2, HEXTORAW(piece));
    currentPlace := currentPlace + pieceMaxSize;
    EXIT WHEN inputLength < currentPlace;
    END LOOP;
    FETCH cur INTO cur_rec;
    END LOOP;
    CLOSE cur;
    END CLOBtoBLOB;
    /
    -- now run the procedure
    -- It will update the blob column with the correct binary representation of the clob column
    EXEC CLOBtoBLOB;
    -- drop the extra clob column
    alter table <tablename> drop column <blob_column>_clob;
    -- re-apply the constraint we removed during the data load
    alter table <tablename> MODIFY FILEBINARY NOT NULL;
    -- Now re-enable the triggers, indexes and primary keys
    alter trigger <triggername> enable;
    ALTER TABLE TBLFILES ADD ( CONSTRAINT <pkname> PRIMARY KEY ( <column>) ) ;
    CREATE INDEX <index_name> ON TBLFILES ( <column> );
    COMMIT;
    -- END OF FILE

  • SQL*Loader sqlldr removes zeros from character field

    Hello,
    I am using SQL*Loader to load an Oracle table, and am having a problem. One of the fields is defined as VARCHAR2 and contains comments entered by a user. There may be numbers or dollar amounts included in this text. When I execute the sqlldr script below, all of the zeros in the text field disappear. There is a TRANSLATE function invoked for this field (see the RST_EXTENDED_TXT line below) in an attempt to remove embedded newlines from the text. Wherever there was a zero in the original text, it ends up being removed after I run this script. Can anyone suggest why this is occurring, and how to prevent it? Could it be related to the TRANSLATE function?
    Thanks for your help!
    OPTIONS (READSIZE=20971520, BINDSIZE=20971520, ROWS=20000)
    LOAD DATA
    INFILE 'R24.REGION.ERL.N1E104' "str X'5E5E220A'"
    BADFILE 'LOGS/N1E104_BUT_RS_ASSGN_TXT_BADDATA.TXT'
    DISCARDFILE 'LOGS/N1E104_BUT_RS_ASSGN_TXT_DISCARDDATA.TXT'
    REPLACE
    INTO TABLE TESTM8.CONV_BUT_RS_ASSGN_TXT
    FIELDS TERMINATED BY '~' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
    RST_RS_EXT_TXT_OID DECIMAL EXTERNAL,
    RST_RS_ASSGN_OID DECIMAL EXTERNAL NULLIF RST_RS_ASSGN_OID = 'NULL',
    RST_TXT_SEQ_NBR INTEGER EXTERNAL,
    RST_RS_COMM_OID DECIMAL EXTERNAL,
    RST_DIF_ASSGN_OID DECIMAL EXTERNAL NULLIF RST_DIF_ASSGN_OID = 'NULL',
    RST_EXTENDED_TXT "SUBSTR(TRANSLATE(:RST_EXTENDED_TXT, '#0x0A', '#'), 1, 248)"
    )
    --------------------------------------------

    Never mind, found my mistake. In the TRANSLATE function, I had assumed that the 0x0A would be interpreted as a single hex value. Instead, it is interpreted literally as the character '0', the character 'x', the character 'A', etc. The result is that the transformed text had no '0', 'x', or 'A' characters, which is exactly what I inadvertently told it to do. I changed it to the following, which works better ;-)
    RST_EXTENDED_TXT "SUBSTR(TRANSLATE(:RST_EXTENDED_TXT, '#'||CHR(10), '#'), 1, 250)"

  • Performance Problem..DBMS_XMLStore or SQL Loader

    Hi All,
    I have implemented the insertion of data using the following method but it's taking very long.
    Once the XML and XSL files are in the mapped 'Directory_Path'...
    BEGIN
    SELECT directory_name INTO v_directory_name
    FROM all_directories
    WHERE owner = 'SYS'
    AND directory_name = 'Directory_Path';
    END;
    xmldata1 := XMLTYPE(BFILENAME(v_directory_name, 'temp_xml.xml'), nls_charset_id('UTF8'));
    xmldata2 := XMLTYPE(BFILENAME(v_directory_name, 'temp_xml.xsl'), nls_charset_id('UTF8'));
    --Open a new context, required for these procedures
    v_context := DBMS_XMLSTORE.newContext('TEMP_XML');
    v_rows := DBMS_XMLStore.insertXML(
    v_context,
    XMLType.transform(xmldata1, xmldata2));
    -- Close the context
    DBMS_XMLStore.closeContext(v_context);
    END;
    Maybe the problem is that I am converting data from BFILE to XMLType, but XMLType.transform(xmldata1, xmldata2) accepts XMLType data only.
    Or maybe it is this approach, where I am not giving the column names explicitly.
    Earlier I was placing the transformed XML file (with ROWSET/ROW format) in the 'Directory_Path' and loading the data as:
    DECLARE
    insCtx DBMS_XMLSTORE.ctxType;
    rows NUMBER;
    v_directory_name VARCHAR2(300);
    src_file BFILE ;
    BEGIN
    BEGIN
    SELECT directory_name INTO v_directory_name
    FROM all_directories
    WHERE owner = 'SYS'
    AND directory_name = 'Directory_Path';
    END;
    src_file := BFILENAME(v_directory_name, 'XMLFILE.xml');
    insCtx := DBMS_XMLSTORE.newContext( 'TABLE_NAME'); -- Get saved context
    DBMS_XMLSTORE.clearUpdateColumnList(insCtx); -- Clear the update settings
    -- Set the columns to be updated as a list of values
    DBMS_XMLSTORE.setUpdateColumn(insCtx, 'COLUMN1');
    DBMS_XMLSTORE.setUpdateColumn(insCtx, 'COLUMN2');
    -- Insert the doc.
    rows := DBMS_XMLSTORE.insertXML(insCtx, xmltype(src_file,nls_charset_id('AL32UTF8')));
    DBMS_OUTPUT.put_line(rows || ' rows inserted.');
    -- Close the context
    DBMS_XMLSTORE.closeContext(insCtx);
    END;
    This was happening in a fraction of a second.
    But for thousands of XMLs we can't manually generate the transformed XMLs, even if there is a single XSL for all of them.
    AM I MISSING SOMETHING VERY ELEMENTARY OVER HERE???
    (I didn't mean to shout by putting it in caps, just to highlight it so that it's not missed. Apologies.)
    The new suggestion being given is to PARSE the XML FILE into a PIPE-DELIMITED FILE and LOAD that data through SQL*Loader.
    Personally I don't think that would be better when Oracle gives us an in-built feature.
    PLEASE HELP....
    How can I improve this? I really think I am taking a wrong step somewhere.
    Regards.....

    Thanks A_NON,
    But I actually did load it within 4 seconds using the approach below, without registering the schema as mentioned there.
    xmldata1 := XMLTYPE(BFILENAME(v_directory_name, p_xml_file_nme), NLS_CHARSET_ID('UTF8'));
    xmldata2 := XMLTYPE(BFILENAME(v_directory_name, p_xsl_file_nme), NLS_CHARSET_ID('UTF8'));
    SELECT XMLTRANSFORM(xmldata1, xmldata2) AS temp2
    INTO temp2
    FROM DUAL;
    And then
    insCtx := DBMS_XMLSTORE.newContext( 'TABLE_NAME'); -- Get saved context
    DBMS_XMLSTORE.clearUpdateColumnList(insCtx); -- Clear the update settings
    -- Set the columns to be updated as a list of values
    DBMS_XMLSTORE.setUpdateColumn(insCtx, 'COLUMN1');
    DBMS_XMLSTORE.setUpdateColumn(insCtx, 'COLUMN2');
    I will try to optimise it more now. But 4 seconds is beautiful you see. :-)
    Thanks for All the help A_NON and I think I have learnt some Oracle XML DB over here.
    Hope to answer some basic questions too in future.. :-)
    Regards....

  • OWB 9.0.4 :SQL*Loader: Operator POSTMAPPING does not support

    Hi,
    While trying to populate an Analytical Workspace using WB_LOAD_OLAP_CUBE, I got the following validation error:
    The analysis of the mapping is not successful under all supported languages and operating modes. Detail is as follows:
    SQL*Loader: Operator POSTMAPPING does not support SQL*Loader generation.
    ABAP: Operator AWPARAMS does not support ABAP generation.
    I don't know what that means. Your help will be appreciated. Do I need to apply some post-9.2.0.3 patch? If yes, please let me know the patch number if available.
    FYI: I am using Oracle9i with the 9.2.0.3 patch set.
    Thanks
    Panneer

    Panneer,
    Does the regular process load from a flat file into a table? That would be implemented as a SQL*Loader mapping... in which case a PL/SQL call cannot be included.
    What you could do:
    - Use an external table to read from the flat file.
    - Use the transformation in a process flow. I.e. you first execute the SQL loader mapping and then execute the transformation.
    Mark.

  • Sql loader 510

    I am loading a file with some BLOBs. Most of the data seems to have loaded ok but I am now getting this error:
    SQL*Loader-510: Physical record in data file
    (c:\Sheets_2005.dat) is longer than the maximum(20971520)
    The ctl file was auto generated by Migration Workbench...i have added the options in....
    options (BINDSIZE=20971520, READSIZE=20971520)
    load data
    infile 'c:\sheets_2005.dat' "str '<EORD>'"
    append
    into table SHEETS
    fields terminated by '<EOFD>'
    trailing nullcols
    (REFNO,
    SHEETNO,
    DETAIL CHAR(100000000),
    MESSAGE,
    SIZE_)
    Any ways around this error?
    Thanks

    Hello,
    Can you tell me which plugin you are using?
    option#1
    Cause: From the error message it appears that the datafile has a physical record that is too long.
    If that is the case, try changing the length of the column (the problem is most likely at the BLOB/CLOB column). Also try using CONCATENATE or CONTINUEIF, or break up the physical records.
    OPTION#2
    If you are using the SQL Server or Sybase plugin, this workaround may work:
    Cause: The export of binary data may be a bit too big, hence it needs to be converted to HEX format. The HEX data thus produced can be saved into a CLOB column.
    The task is split into 4 sub tasks
    1. CREATE A TABLESPACE TO HOLD ALL THE LOB DATA
         --log into your system schema and create a tablespace
         --Create a new tablespace for the CLOB and BLOB column
         --You may resize this to fit your data ,
         --Remember that we save the data once as CLOB and then as BLOB   
         --create tablespace lob_tablespace datafile 'lob_tablespace' SIZE 1000M AUTOEXTEND ON NEXT 50M;
    2. LOG INTO YOUR TABLE SCHEMA IN ORACLE
         --Modify this script to fit your requirements
         --START.SQL (this script will do the following tasks)
              ~~Modify your current schema so that it can accept HEX data
              ~~Modify your current schema so that it can hold that huge amount of data.
              ~~Modify the new tablespace to suite your requirements [can be estimated based on size of the blobs/clobs and number of rows]
              ~~Disable triggers, indexes & primary keys on tblfiles
    3. DATA MOVE: The data move now involves moving the HEX data in the .dat files to a CLOB.
         --The START.SQL script adds a new column to <tablename> called <blob_column>_CLOB.  This is where the HEX values will be stored.
         --MODIFY YOUR CONTROL FILE TO LOOK LIKE THIS
              ~~load data
              ~~infile '<tablename>.dat' "str '<er>'"
              ~~into table <tablename>
              ~~fields terminated by '<ec>'
              ~~trailing nullcols
              ~~(
              ~~ <blob_column>_CLOB CHAR(200000000),
              ~~)
    The important part being "_CLOB" appended to your BLOB column name and the datatype set to CHAR(200000000)
         --RUN sql_loader_script.bat
         --log into your schema to check if the data was loaded successfully
         --now you can see that the hex values were sent to the CLOB column
         --SQL> select dbms_lob.getlength(<blob_column>),dbms_lob.getlength(<blob_column>_clob) from <tablename>;
    4. LOG INTO YOUR SCHEMA
         --Run FINISH.SQL.  This script will do the following tasks:
              ~~Creates the procedure needed to perform the CLOB to BLOB transformation
          ~~Executes the procedure (this may take some time, as 500MB has to be converted to BLOB)
              ~~Alters the table back to its original form (removes the <blob_column>_clob)
              ~~Enables the triggers, indexes and primary keys
    Good luck
    Srinivas Nandavanam

  • OWB or SQL Loader?

    Is there some way to create a SQL loader control file using the OWB tool?

    Yes, OWB creates the entire ETL process, including SQL*Loader scripts, temporary tables, and transformations, but the full answer is pretty long. Creating SQL*Loader scripts is just one of many functions OWB performs.
    As previously stated, your thread will not get much echo here, as this is an RDBMS-specialized forum. By the way, I suggest you narrow the scope of your question, as the answer, even in the proper forum, would be too broad.

  • Code Template Design - SQL LOADER

    Hi,
    We extract data from 12 csv files which have 10+ million rows using SQL Loader into staging table. We use each of these tables in at least two different mappings to populate our Dimensions & Facts.
    Now moving to CTs:
    Using the code templates LCT_FILE_TO_ORACLE_SQLLDR and DEFAULT_ORACLE_TARGET_CT, I am able to load the data into the staging table. But this involves loading the data into a work table and then into our staging table, so we are loading our data twice.
    So I decided to write my own code template, which loads the data from the file directly to the staging table, bypassing the work table.
    I developed an Integration CT which writes data directly to my staging table.
    But the issue is, I am missing the audit: the CT does not show me any record count it processed. This may be because I don't have any work tables.
    Is there a way we can capture audit ?
    Regards,
    Samurai.

    Hi David,
    I was thinking of getting the details from the log file (using Jython) but did not know how to update the audit statistics. If you can get me that, it would be great.
    I went through all your blogs :-). I am following the baby steps as per your blog :-) and am moving towards bulk loading.
    Trying to design a CT
    a) SQL Loader
    1) Create a named pipe in unix to hold data into memory
    2) Bulk load data from MYSQL or SYBASE into a named pipe
    3) Then load the data into oracle table using SQLLoader
    4) Drop the named pipe
    OR
    b) External Table
    1) Bulk load data from MYSQL or SYBASE into a file ( zip the data)
    2) Then load the data into oracle external table with preprocessor option
    From our past experience in our current environment, querying or transforming data across an external table with 10+ million rows was slower than doing it across a regular table (the DBAs will come after me for issuing such statements), so I was trying SQL*Loader first.
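    For option (b), an external table with the 11g PREPROCESSOR clause can read a zipped extract directly. A rough, untested sketch; the directory objects (ext_dir for the data, exec_dir pointing at the directory holding zcat, e.g. /usr/bin), the columns, and the file name are all invented:
    create table stage_ext (
    id number,
    name varchar2(40)
    )
    organization external (
    type oracle_loader
    default directory ext_dir
    access parameters (
    records delimited by newline
    preprocessor exec_dir:'zcat'
    fields terminated by ','
    )
    location ('extract.csv.gz')
    )
    reject limit unlimited;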
    Regards,
    Samurai.
    BTW You know me by other name.
    Edited by: Samurai on Mar 4, 2010 3:12 PM

  • How to load a default value into a column when using sql loader

    I'm trying to load from a flat file using SQL*Loader.
    For one column I need to load a default value.
    How do I go about this?

    Hi!
    try this code --
    LOAD DATA
       INFILE 'sample.dat'
       REPLACE
       INTO TABLE emp
       (
       empno   POSITION(01:04) INTEGER EXTERNAL NULLIF empno=BLANKS,
       ename   POSITION(06:15)  CHAR,
       job         POSITION(17:25)  CHAR,
       mgr       POSITION(27:30)  INTEGER EXTERNAL NULLIF mgr=BLANKS,
       sal        POSITION(32:39)  DECIMAL EXTERNAL NULLIF sal=BLANKS,
       comm   POSITION(41:48)  DECIMAL EXTERNAL DEFAULTIF comm = 100,
       deptno  POSITION(50:51)  INTEGER EXTERNAL NULLIF deptno=BLANKS,
       hiredate POSITION(52:62) CONSTANT SYSDATE
      )
    Hope this will solve your purpose.
    Regards.
    Satyaki De.
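    If the file is delimited rather than fixed-position, the same idea applies with a CONSTANT field, which consumes no input. A small hedged sketch (file, table, and default value are invented):
    LOAD DATA
    INFILE 'sample.csv'
    APPEND
    INTO TABLE emp
    FIELDS TERMINATED BY ','
    (empno, ename, status CONSTANT 'ACTIVE')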

  • How can we tell if SQL*Loader is working on a TABLE?

    We have a process that requires comparing batches with LDAP information. Instead of using an LDAP lookup tool, we get a nightly directory file, and import the two COLUMNs we want via SQL*Loader (REPLACE) into an IOT. Out of three cases, two just check the first COLUMN, and the third needs the second COLUMN as well.
    We did not think of using External TABLEs, because we cannot store files on the DB server itself.
    The question arises, what to do while the file is being imported. The file is just under 300M, so it takes a minute or so to replace all the data. We found SQL*Loader waits until a transaction is finished before starting, but a query against the TABLE only waits while it is actually importing the data. At the beginning of SQL*Loader's process, however, a query against the TABLE returns no rows.
    The solution we are trying right now is to have the process that starts SQL*Loader flip a flag in another TABLE denoting that it is unavailable. When it is done, it flips it back and notes the date. The process that queries the information then exits if the flag is currently 'N'.
    The problem is: what if SQL*Loader starts in between the check of the flag and the query against the TABLE? How do we guarantee that it is not being imported at that moment?
    I can think of three solutions:
    1) LOCK the ldap information TABLE before checking the flag.
    2) LOCK the record that the process starting SQL*Loader flips.
    3) Add a clause to the query against the TABLE that checks that there are records in the TABLE (AND EXISTS (SELECT * FROM ldap_information)).
    The problem with 3) is that the process has already tagged the batches (via a COLUMN). It could, technically reset them afterwards, but that seems a bit backwards.

    Just out of curiosity, are you aware that Oracle supplies a DBMS_LDAP package for pulling information from LDAP sources? It would obviously be relatively easy to have a single transaction that deletes the existing data, loads the new data via DBMS_LDAP, and commits, which would get around the problem you're having with SQL*Loader truncating the table.
    You could also have SQL*Loader load the data into a staging table and then have a second process either MERGE the changes from the staging table into the real table (again in a transactionally consistent manner) or just delete and insert the data.
    Justin
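    A hedged sketch of that staging approach (table and column names invented): load into ldap_stage with SQL*Loader using TRUNCATE, then reconcile in one transaction:
    merge into ldap_information t
    using ldap_stage s
    on (t.user_id = s.user_id)
    when matched then
    update set t.attr_val = s.attr_val
    when not matched then
    insert (user_id, attr_val) values (s.user_id, s.attr_val);
    -- rows that vanished from the directory still need a delete, e.g.:
    delete from ldap_information
    where user_id not in (select user_id from ldap_stage);
    commit;
    Because readers never see the table empty mid-load, the availability flag and its race condition go away.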

  • Loading two tables at same time with SQL Loader

    I have two tables I would like to populate from a file C:\my_data_file.txt.
    Many of the columns are loaded into both tables, but there are a handful of columns I do not want. The first column I do not want for either table. My problem is how I can direct SQL*Loader to go back to the first column and skip over it. I tried using POSITION(1) and FILLER for the first column while loading the second table, but I got the following error message:
    SQL*Loader-350: Syntax error at line 65
    Expecting "," or ")" found keyword Filler
    col_a Position(1) FILLER INTEGER EXTERNAL
    My control file looks like the following:
    LOAD DATA
    INFILE 'C:\my_data_file.txt'
    BADFILE 'C:\my_data_file.txt'
    DISCARDFILE 'C:\my_data_file.txt'
    TRUNCATE INTO TABLE table_one
    WHEN (specific conditions)
    FIELDS TERMINATED BY ' '
    TRAILING NULLCOLS
    (
    col_a FILLER INTEGER EXTERNAL,
    col_b INTEGER EXTERNAL,
    col_g FILLER CHAR,
    col_h CHAR,
    col_date DATE "yyyy-mm-dd"
    )
    INTO TABLE table_two
    WHEN (specific conditions)
    FIELDS TERMINATED BY ' '
    TRAILING NULLCOLS
    (
    col_a POSITION(1) FILLER INTEGER EXTERNAL,
    col_b INTEGER EXTERNAL,
    col_g FILLER CHAR,
    col_h CHAR,
    col_date DATE "yyyy-mm-dd"
    )

    Try adapting this for your scenario.
    tables for the test
    create table test1 ( fld1 varchar2(20), fld2 integer, fld3 varchar2(20) );
    create table test2 ( fld1 varchar2(20), fld2 integer, fld3 varchar2(20) );
    control file
    LOAD DATA
    INFILE "test.txt"
    INTO TABLE user.test1 TRUNCATE
    WHEN RECID = '1'
    FIELDS TERMINATED BY ' '
    (
    recid filler integer external,
    fld1 char,
    fld2 integer external,
    fld3 char
    )
    INTO TABLE user.test2 TRUNCATE
    WHEN RECID <> '1'
    FIELDS TERMINATED BY ' '
    (
    recid filler position(1) integer external,
    fld1 char,
    fld2 integer external,
    fld3 char
    )
    data for loading [test.txt]
    1 AAAAA 11111 IIIII
    2 BBBBB 22222 JJJJJ
    1 CCCCC 33333 KKKKK
    2 DDDDD 44444 LLLLL
    1 EEEEE 55555 MMMMM
    2 FFFFF 66666 NNNNN
    1 GGGGG 77777 OOOOO
    2 HHHHH 88888 PPPPP
    HTH
    RK

    Hi Guys I'd appreciate some advice with this pretty annoying problem. This is an iPhone 4 problem not a mobile me so I'd appreciate it if the MOD does not move this. My MobileMe calendars do not appear to be syncing anymore, problem only just started