CLOB - SQL LOADER

Hi all,
I am trying to load this data into a CLOB column, and I am getting the error: records rejected.
Please help me in this regard.
Control file:
load data
infile 'c:\test.csv'
truncate
into table test_clob
fields terminated by ","
(
a ,
b
)
Table Structure:
create table test_clob(a number, b clob);
Test.csv:
1, "Marsh Supermarkets here is the exclusive supermarket partner of Project 18 a program it developed with the Peyton Manning Childrens Hospital and Ball State University to fight childhood obesity.
Project 18 Approved shelf tags identify better-for-you kid-friendly products throughout Marsh stores. Canned fruit frozen entrees vegetables granola bars and salty snacks are among the categories analyzed.
In the granola bar category for instance products can have no more than 35% of calories from total fat and have no more than 10% of calories from saturated and trans fats combined.
Along with shelf-tags at Marsh the initiative includes: Project 18 MVPs local high school students honored for serving as role models for younger children in the community; community wellness events Project 18 walks; a school curriculum; and The Project 18 Mobile Van."
Warm Regards
Swami

Hi Swami,
You need to modify your control file like this:
load data
infile 'c:\test.csv'
TRUNCATE
into table test_clob
fields terminated by ","
(
a ,
b CHAR(4000)
)
The reason for specifying CHAR(4000) is as follows (from the Oracle documentation):
To load internal LOBs (BLOBs, CLOBs, and NCLOBs) or XML columns from a primary datafile, you can use the following standard SQL*Loader formats:
* Predetermined size fields
* Delimited fields
* Length-value pair fields
For more, see: http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_loading.htm#i1008564
Thanks,
Ankur
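Note that CHAR(4000) caps the delimited field at 4000 bytes. Since the column is a CLOB, longer inputs only need a larger maximum; here is a sketch of the same control file with a higher cap (the 1000000 figure is arbitrary):
load data
infile 'c:\test.csv'
truncate
into table test_clob
fields terminated by ","
(
a ,
b CHAR(1000000) -- delimited field; the length is a maximum, not a fixed size
)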

Similar Messages

  • SQL Loader CLOBs

    I am moving data from one Oracle 10g table to another Oracle 10g table: transforming the data in Access, filtering records, merging fields, etc., then creating the CTL files and necessary scripts via Access VBA, then loading the data via SQL*Loader. Getting the CLOBs to load is not a problem; getting them to load with the formatting is. The records in the control file look fine, and the formatting is kept just fine in the CTL file, but when it is loaded, all the carriage returns, line feeds, etc. are removed.
    Here is the header of my CTL file:
    LOAD DATA
    INFILE *
    CONTINUEIF LAST != "|"
    INTO TABLE table_name
    APPEND
    FIELDS TERMINATED BY '||' TRAILING NULLCOLS
    (field1, field2, field3....etc)
    begindata
    field1||field2||field3||fieldx|
    At first I thought taking out the TRAILING NULLCOLS would help, but I get the same result.
    The end result I am looking for is keeping the data formatted exactly as it is in the source table.
    Thanks
    Mike

    After doing some research, it seems that the only way is to create a separate file with the CLOB; see http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_loading.htm#i1007627
    Control File Contents:
    LOAD DATA
    INFILE 'sample.dat'
    INTO TABLE person_table
    FIELDS TERMINATED BY ','
    (name CHAR(20),
    ext_fname FILLER CHAR(40),
    "RESUME" LOBFILE(ext_fname) TERMINATED BY EOF)
    Datafile (sample.dat):
    Johny Quest,jqresume.txt,
    Speed Racer,'/private/sracer/srresume.txt',
    Secondary Datafile (jqresume.txt):
    Johny Quest,479 Johny Quest
    500 Oracle Parkway
    [email protected]
    HTH
    Enrique
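    Worth noting for Mike's original formatting problem: CONTINUEIF builds one logical record by joining physical records, so the line terminators themselves are consumed, which is exactly the lost-newline symptom. Here is a sketch of an alternative (not verified against this data) that declares an explicit end-of-record string so embedded newlines survive:
    LOAD DATA
    INFILE * "str '|\n'" -- a record ends at '|' followed by a newline; newlines inside fields are kept
    INTO TABLE table_name
    APPEND
    FIELDS TERMINATED BY '||' TRAILING NULLCOLS
    (field1, field2, field3 CHAR(100000))
    BEGINDATA
    v1||v2||first line of field3
    second line of field3|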

  • SQL LOADER: how to load CLOB column using stored function

    Hi,
    I am a newbie with SQL*Loader. Everything seems to be fine until I hit a road block: the CLOB column type. I want to load data into the CLOB column using a stored function, because I need to do some manipulation on the data before it gets saved to that column. But I get this error when I run SQL*Loader:
    SQL*Loader-309: No SQL string allowed as part of "DATA" field specification
    DATA is my CLOB type column.
    here is the content of the control file:
    LOAD DATA
    INFILE 'test.csv'
    BADFILE 'test.bad'
    DISCARDFILE 'test.dsc'
    REPLACE
    INTO TABLE test_table
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
    codeid          BOUNDFILLER,
    reason          BOUNDFILLER,
    Checkstamp     "to_date(:CHECKSTAMP, 'mm/dd/yyyy')",
    "DATA"          "GetContent(:codeid, :reason)"
    )
    All the references suggest using a file to load data into a CLOB column, but I want to use a function that generates the content to be saved into the column.
    Any help is greatly appreciated.
    Thanks,
    Baldwin
    MISICompany

    *** Duplicate Post ... Please Ignore ***
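    For the record, SQL*Loader-309 is raised because SQL strings are not allowed on LOB field specifications. A common workaround (a sketch only; it assumes codeid and reason are, or can be made, real columns of test_table rather than BOUNDFILLERs) is to load the raw values and apply the function in a post-load UPDATE:
    LOAD DATA
    INFILE 'test.csv'
    REPLACE
    INTO TABLE test_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
    (
    codeid,
    reason,
    Checkstamp "to_date(:CHECKSTAMP, 'mm/dd/yyyy')"
    )
    -- then, in SQL*Plus:
    UPDATE test_table SET "DATA" = GetContent(codeid, reason);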

  • Using clob in sql loader utility in oracle 9i

    Hi,
    I want to load data into a table with 2 CLOB columns using the SQL*Loader utility, with the dat file and control file created programmatically.
    The size of the CLOB data in the dat file can change, and the CLOB columns are inline in the data file.
    As per the 9i documentation, the maximum size of a CLOB is 4 GB.
    How can I change the control file so that it can load up to 4 GB of data into the CLOB columns?
    I am getting an error while calling sqlldr with the control file below:
    SQL*Loader-350: Syntax error at line 13.
    Expecting non-negative integer, found "-294967296".
    ,"NARRATIVE" char(4000000000)
    ^
    control file :
    LOAD DATA
    INFILE '' "str X'3C213E0A'"
    APPEND INTO TABLE PSD_TERM
    FIELDS TERMINATED BY '~^'
    TRAILING NULLCOLS
    "PSD_ID" CHAR(16) NULLIF ("PSD_ID"=BLANKS)
    ,"PSD_SERIAL_NUM" CHAR(4) NULLIF ("PSD_SERIAL_NUM"=BLANKS)
    ,"PSD_TERM_COD" CHAR(4) NULLIF ("PSD_TERM_COD"=BLANKS)
    ,"PSD_TERM_SER_NO" CHAR(4) NULLIF ("PSD_TERM_SER_NO"=BLANKS)
    ,"VERSION_DT" DATE "DD-MON-YYYY HH:MI:SS AM" NULLIF ("VERSION_DT"=BLANKS)
    ,"LATEST_VERSION" CHAR(1) NULLIF ("LATEST_VERSION"=BLANKS)
    ,"NARRATIVE" char(4000000000)
    ,"PARTITION_DT" DATE "DD-MON-YYYY HH:MI:SS AM" NULLIF ("PARTITION_DT"=BLANKS)
    ,"NARRATIVE_UNEXPANDED" char(4000000000)
    )
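    The negative number in the error is the clue: SQL*Loader parses the field length as a signed 32-bit integer, and 4000000000 wraps around to 4000000000 - 4294967296 = -294967296. The length must therefore stay at or below 2147483647; in practice, specify the largest value actually expected, e.g. (a sketch):
    ,"NARRATIVE" CHAR(100000000) -- must fit in a signed 32-bit integer (max 2147483647)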

    Yes, you can do it. Create the sequence (suppose you call it "PK_SEQ_X") and then reference it in your control file as "PK_SEQ_X.NEXTVAL". For example, suppose you wanted to put it into a column named Y; the entry in your control file would look like 'load data insert into table Z (Y "PK_SEQ_X.NEXTVAL", ....)'
    Note that the double quotes around the sequence name are required.

  • SQL Loader, CLOB, delimited fields

    Hello.
    I have to load data from a csv file into a table using SQL*Loader, and one field of the table is of CLOB type.
    Here is how ctl file initially looked like:
    UNRECOVERABLE
    LOAD DATA
    INFILE '.\csv_files\TSH_DGRA.csv'
    BADFILE '.\bad_files\TSH_DGRA.bad'
    DISCARDFILE '.\dsc_files\TSH_DGRA.dsc'
    APPEND
    INTO TABLE TSH_DGRA
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
    ID_OBJ_TSHD,
    PR_ZOOM_TSHD,
    PR_GRID_TSHD,
    PR_ELMGR_TSHD CHAR(4000) OPTIONALLY ENCLOSED BY '<clob>' AND '</clob>',
    PR_ALRMGR_TSHD CHAR(4000) OPTIONALLY ENCLOSED BY '<clob>' AND '</clob>'
    )
    The problems are the fields PR_ELMGR_TSHD and PR_ALRMGR_TSHD (CLOBs in table TSH_DGRA). As long as the data to be loaded into the CLOB fields is under 4000 characters, it works fine, but what should I do if I want to load data longer than 4000 characters?
    I found at http://download.oracle.com/docs/cd/B14117_01/server.101/b10825/ldr_loading.htm#i1006803 a sentence that says:
    "SQL*Loader defaults to 255 bytes when moving CLOB data, but a value of up to 2 gigabytes can be specified. For a delimited field, if a length is specified, that length is used as a maximum. If no maximum is specified, it defaults to 255 bytes. For a CHAR field that is delimited and is also greater than 255 bytes, you must specify a maximum length. See CHAR for more information about the CHAR datatype."
    So, my question is: how do I specify "up to 2 GB" as the text says? I cannot use the CHAR datatype because it is limited to 4000 characters, and I have to load about 60000 characters. I also cannot use the technique where the data for every CLOB field is in a separate file.

    Just specify the maximum expected size:
    PR_ELMGR_TSHD CHAR(100000) OPTIONALLY ENCLOSED BY '<clob>' AND '</clob>',
    PR_ALRMGR_TSHD CHAR(1000000) OPTIONALLY ENCLOSED BY '<clob>' AND '</clob>'
    The CHAR(1000000) will allow SQLLDR to handle up to 1000000 bytes of input text.
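    One practical follow-up: with field maximums this large, the default read and bind buffers can become the next obstacle, so it is common to raise them on the command line as well (a sketch; the file names and 20 MB figures are illustrative):
    sqlldr userid=scott/tiger control=tsh_dgra.ctl log=tsh_dgra.log readsize=20971520 bindsize=20971520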

  • SQL loader Field in data file exceeds maximum length for CLOB column

    Hi all
    I'm loading data from a text file separated by TABs and I got the error below for some lines.
    Even though the column is of CLOB data type, is there a limitation on the size of a CLOB?
    The error is:
    Record 74: Rejected - Error on table _TEMP, column DEST.
    Field in data file exceeds maximum length
    I'm using SQL*Loader and the database is Oracle 11g R2 on Red Hat Linux 5.
    Here are the lines causing the error from my data file, and my table description for the test:
    create table TEMP
    (
    CODE VARCHAR2(100),
    DESC VARCHAR2(500),
    RATE     FLOAT,
    INCREASE VARCHAR2(20),
    COUNTRY VARCHAR2(500),
    DEST     CLOB,
    WEEK     VARCHAR2(10),
    IS_SAT VARCHAR2(50),
    IS_SUN VARCHAR2(50)
    );
    CONTROL FILE:
    LOAD DATA
    INTO TABLE TEMP
    APPEND
    FIELDS TERMINATED BY X'9' TRAILING NULLCOLS
    (
    CODE,
    DESC,
    RATE,
    INCREASE,
    COUNTRY,
    DEST,
    WEEK,
    IS_SAT,
    IS_SUN
    )
    Data file:
    BHS Mobile     Bahamas - Mobile     0.1430          1     "242357, 242359, 242375, 242376, 242395, 242421, 242422, 242423, 242424, 242425, 242426, 242427, 242428, 242429, 242431, 242432, 242433, 242434, 242435, 242436, 242437, 242438, 242439, 242441, 242442, 242443, 242445, 242446, 242447, 242448, 242449, 242451, 242452, 242453, 242454, 242455, 242456, 242457, 242458, 242462, 242463, 242464, 242465, 242466, 242467, 242468, 24247, 242524, 242525, 242533, 242535, 242544, 242551, 242552, 242553, 242554, 242556, 242557, 242558, 242559, 242565, 242577, 242636, 242646, 242727"               
    BOL Mobile ENTEL     Bolivia - Mobile Entel     0.0865     Increase     591     "67, 68, 71, 72, 73, 740, 7410, 7411, 7412, 7413, 7414, 7415, 7420, 7421, 7422, 7423, 7424, 7425, 7430, 7431, 7432, 7433, 7434, 7435, 7436, 7437, 7440, 7441, 7442, 7443, 7444, 7445, 7450, 7451, 7452, 7453, 7454, 7455, 746, 7470, 7471, 7472, 7475, 7476, 7477, 7480, 7481, 7482, 7483, 7484, 7485, 7486, 7490, 7491, 7492, 7493, 7494, 7495, 7496"
    Thank you.

    Hi
    Thank you for your help. I found the solution: in my control file I added
    char(40000) OPTIONALLY ENCLOSED BY '"' .
    LOAD DATA
    INTO TABLE TEMP
    APPEND
    FIELDS TERMINATED BY X'9' TRAILING NULLCOLS
    (
    CODE,
    DESC,
    RATE,
    INCREASE,
    COUNTRY,
    DEST char(40000) OPTIONALLY ENCLOSED BY '"',
    WEEK,
    IS_SAT,
    IS_SUN
    )
    Thank you for your help.

  • Using SQL*Loader to load a .csv file having multiple CLOBs

    Oracle 8.1.5 on Solaris 2.6
    I want to use SQL*Loader to load a .CSV file that has 4 inline CLOB columns. I shall attempt to give some information about the problem:
    1. The CLOBs are not delimited at the field level and could themselves contain commas.
    2. I cannot get the data file in any other format.
    Can anybody help me out with this? While loading LOBs as predetermined-size fields, is there a limit on the size?
    TIA.
    -Murali

    Thanks for the article link. The article states "...the loader can load only XMLType tables, not columns." Is this still the case with 10g R2? If so, what is the best way to workaround this problem? I am migrating data from a Sybase table that contains a TEXT column (among others) to an Oracle table that contains an XMLType column. How do you recommend I accomplish this task?
    - Ron

  • SQL Loader + CLOB

    Dear buddies,
    Please guide me with this.
    LOAD DATA
    INFILE 'D:\load\dat\Enquiry_reply.dat'
    BADFILE 'D:\load\bad\Enquiry_reply.bad'
    DISCARDFILE 'D:\load\dat\discard\Enquiry_reply.dsc'
    replace INTO TABLE OES_Enquiry FIELDS TERMINATED BY '['
    (Reply CLOBFILE("D:\load\dat\oes_enquiry_reply.dat") TERMINATED BY '[')
    What's wrong here?
    I am getting the error:
    SQL*Loader-350: Syntax error at line 6.
    Expecting "," or ")", found "CLOBFILE".
    (Reply CLOBFILE("D:\load\dat\enquiry_reply.dat") TERMINATED BY '['
    Please advise me on how to go about it.
    Thanks in advance.
    Nith

    This forum is meant for issues with database upgrades - please re-post in the SQL*Loader forum (Export/Import/SQL Loader & External Tables).
    HTH
    Srini
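    For anyone who lands here via search: the syntax error itself is because CLOBFILE is not a SQL*Loader keyword; the clause is LOBFILE. Here is a guess at the intended control file (paths taken from the post; CONSTANT is used because the file name does not vary per row):
    LOAD DATA
    INFILE 'D:\load\dat\Enquiry_reply.dat'
    BADFILE 'D:\load\bad\Enquiry_reply.bad'
    DISCARDFILE 'D:\load\dat\discard\Enquiry_reply.dsc'
    REPLACE INTO TABLE OES_Enquiry
    FIELDS TERMINATED BY '['
    (
    Reply LOBFILE(CONSTANT 'D:\load\dat\oes_enquiry_reply.dat') TERMINATED BY '['
    )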

  • Create sql loader data file dynamically

    Hi,
    I want a sample program/approach for creating a SQL*Loader data file.
    The program will read a table name as input and will use a SELECT statement
    with the column list derived from USER_TAB_COLUMNS in the data dictionary,
    assuming multiple CLOB columns in the column list.
    Thanks
    Manoj

    I'm writing CLOB and other columns to a SQL*Loader dat file.
    The sample code below for writing the CLOB column is giving a file write error.
    How can I write multiple CLOBs to the dat file so that the control file will handle them correctly?
    offset     NUMBER := 1;
    chunk      VARCHAR2(32000);
    chunk_size NUMBER := 32000;
    WHILE offset < dbms_lob.getlength(l_rec_type.narrative)
    LOOP
      chunk := dbms_lob.substr(l_rec_type.narrative, chunk_size, offset);
      utl_file.put(l_file_handle, chunk);
      utl_file.fflush(l_file_handle);
      offset := offset + chunk_size;
    END LOOP;
    utl_file.new_line(l_file_handle);
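    The usual cause of the write error here is UTL_FILE's line buffer: repeated PUT calls without a NEW_LINE may not exceed the file's max_linesize, which is capped at 32767 bytes. A common workaround (a sketch reusing the post's names; l_narrative stands in for l_rec_type.narrative, and DIR is an assumed directory object) is to open the file in byte mode and write RAW chunks, which is not line-limited:
    DECLARE
      l_file      UTL_FILE.FILE_TYPE;
      l_narrative CLOB;
      offset      NUMBER := 1;
      chunk_size  NUMBER := 32000;
    BEGIN
      SELECT narrative INTO l_narrative FROM psd_term WHERE ROWNUM = 1; -- hypothetical source row
      l_file := UTL_FILE.FOPEN('DIR', 'loader.dat', 'wb', 32767); -- 'wb' = byte mode, no line-length cap
      WHILE offset <= DBMS_LOB.GETLENGTH(l_narrative)
      LOOP
        UTL_FILE.PUT_RAW(l_file,
                         UTL_RAW.CAST_TO_RAW(DBMS_LOB.SUBSTR(l_narrative, chunk_size, offset)),
                         TRUE); -- TRUE = flush each chunk to disk
        offset := offset + chunk_size;
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    END;
    /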

  • How to call sql loader ctrl file with in the pl/sql procedure

    Hi Friends,
    I am doing a project related to transferring data using queues. In the queue I will get tab-delimited data in the form of a CLOB variable/message. I want to store that data in an Oracle table.
    When loading the data into the table:
    1. I don't want to write the data to a file (I want to access it directly after dequeuing from the specific queue).
    2. As the data is in tab-delimited form, I want to use the SQL*Loader concept.
    How do I call the SQL*Loader control file from within my PL/SQL procedure? When I searched, most of the forums recommended an external procedure or a Java program.
    Please guide me on this issue. My preference is PL/SQL, but I don't know about external procedures. If there is no other way, I will try Java.
    I am using Oracle 9.2.0.8.0.
    Thanks in advance,
    Vimal..

    Neither SQL*Loader nor external tables are designed to read data from a CLOB stored in the database. They both work on files stored on the file system. If you don't want the data to be written to a file, you're going to have to roll your own parsing code. This is certainly possible. But it is going to be far less efficient than either SQL*Loader or external tables. And it's likely to involve quite a bit more code.
    The simplest possible thing that could work would be to use something like Tom Kyte's string tokenization package to read a line from the CLOB, break it into the component pieces, and then store the different tokens in a meaningful collection (i.e. an object type or a record type that corresponds to the table definition). Of course, you'll need to handle things like converting strings to numbers or dates, rejecting rows, writing log files, etc.
    Justin
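    A minimal sketch of the line-splitting part Justin describes (the procedure name is hypothetical; token conversion, the final unterminated line, and error handling are left as comments):
    CREATE OR REPLACE PROCEDURE parse_clob_lines (p_msg IN CLOB)
    AS
      l_pos  PLS_INTEGER := 1;
      l_nl   PLS_INTEGER;
      l_line VARCHAR2(32767);
    BEGIN
      LOOP
        l_nl := DBMS_LOB.INSTR(p_msg, CHR(10), l_pos); -- next newline; returns 0 when none left
        EXIT WHEN l_nl = 0;
        l_line := DBMS_LOB.SUBSTR(p_msg, l_nl - l_pos, l_pos);
        -- split l_line on CHR(9) here, convert the tokens, and INSERT into the target table
        l_pos := l_nl + 1;
      END LOOP;
      -- note: a final line without a trailing newline still needs to be handled here
    END parse_clob_lines;
    /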

  • SQL*Loader or external table for load a MSG (email) file

    Hi there!
    I'm looking for a way to load an email into an Oracle DB.
    I mean, not the entire email body in one column, but "parsed" into a multi-column/table fashion.
    Is this possible with a SQL*Loader script or an external table?
    I think it is not possible, and that I must switch to XML DB.
    Any idea?
    Thanks,
    Antonio

    Hello,
    Why don't you just load the entire MSG (email) as a CLOB into one email_body column, or whatever column name you want to use?
    To load data up to 32K you can use varchar2(32656), but it's not a good idea to load a CLOB in that manner, because it's very inconsistent: the length can
    vary, resulting in 'string literal too long'. So you have two choices: use either a procedure or an anonymous block to load the CLOB data.
    First Method -- I loaded alert.log successfully, and you can imagine how big this file can be (5 MB in my test case):
    CREATE OR REPLACE DIRECTORY DIR AS '/mydirectory/logs';
    DECLARE
       clob_data   CLOB;
       clob_file   BFILE;
    BEGIN
       INSERT INTO t1clob
       VALUES (EMPTY_CLOB ())
       RETURNING clob_text INTO clob_data;
       clob_file   := BFILENAME ('DIR', 'wwalert_dss.log');
       DBMS_LOB.fileopen (clob_file);
       DBMS_LOB.loadfromfile (clob_data,
                              clob_file,
                              DBMS_LOB.getlength (clob_file));
       DBMS_LOB.fileclose (clob_file);
       COMMIT;
    END;
    Second Method: Use of Sqlldr
    Example of controlfile
    LOAD DATA
    INFILE alert.log "STR '|\n'"
    REPLACE INTO TABLE t1clob
    (
    clob_text char(30000000)
    )
    Hope this helps.

  • SQL*Loader-510: Physical record in data file (clob_table.ldr) is long

    If I generate a loader / insert script from Raptor, it does not work for CLOB columns.
    I am getting error:
    SQL*Loader-510: Physical record in data file (clob_table.ldr) is longer than the maximum (1048576)
    What's the solution?
    Regards,

    Hi,
    Has the file been somehow changed by copying it between Windows and Unix? Or a file transfer done as binary vs. ASCII? That is the most common cause of your problem: the end-of-line carriage return characters have been changed so they are no longer \r\n. Could this have happened? Can you open the file in a good editor, or do an od command in Unix, to see what is actually present?
    Regards,
    Harry
    http://dbaharrison.blogspot.co.uk/
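    If the line endings turn out to be intact and a physical record really is longer than 1 MB (an inline CLOB easily is), the read buffer can simply be enlarged on the command line (a sketch; the 20 MB value is arbitrary):
    sqlldr userid=scott/tiger control=clob_table.ctl readsize=20971520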

  • Sql loader - BLOB

    I have used the OMWB "generate SQL Loader script" option and received the SQL*Loader error below.
    The previous attempt to use OMWB online loading generated garbage data: the picture did not match the person id.
    Table in Sql Server..................
    CREATE TABLE [nilesh] (
         [LargeObjectID] [int] NOT NULL ,
         [LargeObject] [image] NULL ,
         [ContentType] [varchar] (40) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectName] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectExtension] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectDescription] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectSize] [int] NULL ,
         [VersionControl] [bit] NULL ,
         [WhenLargeObjectLocked] [datetime] NULL ,
         [WhoLargeObjectLocked] [char] (11) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectTimeStamp] [timestamp] NOT NULL ,
         [LargeObjectOID] [uniqueidentifier] NOT NULL
    ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
    GO
    Table in Oracle..............
    CREATE TABLE LARGEOBJECT
    (
    LARGEOBJECTID NUMBER(10) NOT NULL,
    LARGEOBJECT BLOB,
    CONTENTTYPE VARCHAR2(40 BYTE),
    LARGEOBJECTNAME VARCHAR2(255 BYTE),
    LARGEOBJECTEXTENSION VARCHAR2(10 BYTE),
    LARGEOBJECTDESCRIPTION VARCHAR2(255 BYTE),
    LARGEOBJECTSIZE NUMBER(10),
    VERSIONCONTROL NUMBER(1),
    WHENLARGEOBJECTLOCKED DATE,
    WHOLARGEOBJECTLOCKED CHAR(11 BYTE),
    LARGEOBJECTTIMESTAMP NUMBER(8) NOT NULL,
    LARGEOBJECTOID RAW(16) NOT NULL
    )
    TABLESPACE USERS
    PCTUSED 0
    PCTFREE 10
    INITRANS 1
    MAXTRANS 255
    STORAGE (
    INITIAL 64K
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
    )
    LOGGING
    NOCOMPRESS
    LOB (LARGEOBJECT) STORE AS
    ( TABLESPACE USERS
    ENABLE STORAGE IN ROW
    CHUNK 8192
    PCTVERSION 10
    NOCACHE
    STORAGE (
    INITIAL 64K
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
    )
    )
    NOCACHE
    NOPARALLEL
    MONITORING;
    Sql Loader script....
    SET NLS_DATE_FORMAT=Mon dd YYYY HH:mi:ssAM
    REM SET NLS_TIMESTAMP_FORMAT=Mon dd YYYY HH:mi:ss:ffAM
    REM SET NLS_LANGUAGE=AL32UTF8
    sqlldr cecildata/@ceciltst control=LARGEOBJECT.ctl log=LARGEOBJECT.log
    Sql loader control file......
    load data
    infile 'nilesh.dat' "str '<er>'"
    into table LARGEOBJECT
    fields terminated by '<ec>'
    trailing nullcols
    (LARGEOBJECTID,
    LARGEOBJECT CHAR(2000000) "HEXTORAW (:LARGEOBJECT)",
    CONTENTTYPE "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)",
    LARGEOBJECTNAME CHAR(255) "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)",
    LARGEOBJECTEXTENSION "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)",
    LARGEOBJECTDESCRIPTION CHAR(255) "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)",
    LARGEOBJECTSIZE,
    VERSIONCONTROL,
    WHENLARGEOBJECTLOCKED,
    WHOLARGEOBJECTLOCKED,
    LARGEOBJECTTIMESTAMP,
    LARGEOBJECTOID "GUID_MOVER(:LARGEOBJECTOID)")
    Error Received...
    Column Name Position Len Term Encl Datatype
    LARGEOBJECTID FIRST * CHARACTER
    Terminator string : '<ec>'
    LARGEOBJECT NEXT ***** CHARACTER
    Maximum field length is 2000000
    Terminator string : '<ec>'
    SQL string for column : "HEXTORAW (:LARGEOBJECT)"
    CONTENTTYPE NEXT * CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)"
    LARGEOBJECTNAME NEXT 255 CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)"
    LARGEOBJECTEXTENSION NEXT * CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)"
    LARGEOBJECTDESCRIPTION NEXT 255 CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)"
    LARGEOBJECTSIZE NEXT * CHARACTER
    Terminator string : '<ec>'
    VERSIONCONTROL NEXT * CHARACTER
    Terminator string : '<ec>'
    WHENLARGEOBJECTLOCKED NEXT * CHARACTER
    Terminator string : '<ec>'
    WHOLARGEOBJECTLOCKED NEXT * CHARACTER
    Terminator string : '<ec>'
    LARGEOBJECTTIMESTAMP NEXT * CHARACTER
    Terminator string : '<ec>'
    LARGEOBJECTOID NEXT * CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "GUID_MOVER(:LARGEOBJECTOID)"
    SQL*Loader-309: No SQL string allowed as part of LARGEOBJECT field specification
    What's the cause?

    The previous attempt to use OMWB online loading has generated garbage data. The picture was not matching with the person id.
    This is being worked on (bug 4119713). If you have a reproducible test case, please send it in (small test cases seem to work OK).
    I have the following email about BLOBS I could forward to you if I have your email address:
    [The forum may cut the lines in the wrong places]
    Regards,
    Turloch
    Oracle Migration Workbench Team
    Hi,
    This may provide the solution. Without having the customer files here I can only guess at the problem. But this should help.
    This email outlines a BLOB data move.
    There are quiet a few steps to complete the task of moving a large BLOB into the Oracle database.
    Normally this wouldn't be a problem, but as far as we can tell SQL Server's (and possibly Sybase) BCP does not reliably export binary data.
    The only way to export binary data properly via BCP is to export it in a HEX format.
    Once in a HEX format it is difficult to get it back to binary during a data load into Oracle.
    We have come up with the idea of getting the HEX values into Oracle by saving them in a CLOB (holds text) column.
    We then convert the HEX values to binary values and insert them into the BLOB column.
    The problem here is that the HEXTORAW function in Oracle only converts a maximum of 2000 HEX pairs.
    We overcame this problem by writing our own procedure that will convert (bit by bit) your HEX data to binary.
    NOTE: YOU MUST MODIFY THE START.SQL AND FINISH.SQL TO SUIT YOUR CUSTOMER
    The task is split into 4 sub tasks
    1) CREATE A TABLESPACE TO HOLD ALL THE LOB DATA
    --log into your system schema and create a tablespace
    --Create a new tablespace for the CLOB and BLOB column (this may take a while to create)
    --You may resize this to fit your data ,
    --but I believe you have in excess of 500MB of data in this table and we are going to save it twice (in a clob then a blob)
    --Note: This script may take some time to execute as it has to create a tablespace of 1000Mb.
    -- Change this to suit your customer.
    -- You can change this if you want depending on the size of your data
    -- Remember that we save the data once as CLOB and then as BLOB
    create tablespace lob_tablespace datafile 'lob_tablespace' SIZE 1000M AUTOEXTEND ON NEXT 50M;
    LOG INTO YOUR TABLE SCHEMA IN ORACLE
    --Modify this script to fit your requirements
    2) START.SQL (this script will do the following tasks)
    a) Modify your current schema so that it can accept HEX data
    b) Modify your current schema so that it can hold that huge amount of data.
    The new tablespace is used; you may want to alter this to your requirements
    c) Disable triggers, indexes & primary keys on tblfiles
    3)DATA MOVE
    The data move now involves moving the HEX data in the .dat files to a CLOB.
    The START.SQL script adds a new column to <tablename> called <blob_column>_CLOB.
    This is where the HEX values will be stored.
    MODIFY YOUR CONTROL FILE TO LOOK LIKE THIS:
    load data
    infile '<tablename>.dat' "str '<er>'"
    into table <tablename>
    fields terminated by '<ec>'
    trailing nullcols
    <blob_column>_CLOB CHAR(200000000),
    The important part being "_CLOB" appended to your BLOB column name and the datatype set to CHAR(200000000)
    RUN sql_loader_script.bat
    Log into your schema to check if the data was loaded successfully. Now you can see that the hex values were sent to the CLOB column:
    SQL> select dbms_lob.getlength(<blob_column>),dbms_lob.getlength(<blob_column>_clob) from <tablename>;
    LOG INTO YOUR SCHEMA
    4) FINISH.SQL (this script will do the following tasks)
    a) Creates the procedure needed to perform the CLOB to BLOB transformation
    b) Executes the procedure (this may take some time, as 500 MB has to be converted to BLOB)
    c) Alters the table back to its original form (removes the <blob_column>_clob)
    d) Enables the triggers, indexes and primary keys
    Regards,
    (NAME)
    -- START.SQL
    -- Modify this for your particular customer
    -- This should be executed in the user schema in Oracle that contains the table.
    -- DESCRIPTION:
    -- ALTERS THE OFFENDING TABLE SO THAT THE DATA MOVE CAN BE EXECUTED
    -- DISABLES TRIGGERS, INDEXES AND SEQUENCES ON THE OFFENDING TABLE
    -- 1) Add an extra column to hold the hex string
    alter table <tablename> add (FILEBINARY_CLOB CLOB);
    -- 2) Allow the BLOB column to accept NULLs
    alter table <tablename> MODIFY FILEBINARY NULL;
    -- 3) Disable triggers and sequences on tblfiles
    alter trigger <triggername> disable;
    alter table tblfiles drop primary key cascade;
    drop index <indexname>;
    -- 4) Allow the table to use the tablespace
    alter table <tablename> move lob (<blob_column>) store as (tablespace lob_tablespace);
    alter table tblfiles move lob (<blob_column>_clob) store as (tablespace lob_tablespace);
    COMMIT;
    -- END OF FILE
    -- FINISH.SQL
    -- Modify this for your particular customer
    -- This should be executed in the table schema in Oracle.
    -- DESCRIPTION:
    -- MOVES THE DATA FROM CLOB TO BLOB
    -- MODIFIES THE TABLE BACK TO ITS ORIGIONAL SPEC (without a clob)
    -- THEN ENABLES THE SEQUENCES, TRIGGERS AND INDEXES AGAIN
    -- Currently we have the hex values saved as text in the <columnname>_CLOB column
    -- And we have NULL in all rows for the <columnname> column.
    -- We have to get BLOB locators for each row in the BLOB column
    -- put empty blobs in the blob column
    UPDATE <tablename> SET filebinary=EMPTY_BLOB();
    COMMIT;
    -- create the following procedure in your table schema
    CREATE OR REPLACE PROCEDURE CLOBTOBLOB
    AS
    inputLength  NUMBER;       -- size of input CLOB
    offSet       NUMBER := 1;
    pieceMaxSize NUMBER := 50; -- the max size of each piece
    piece        VARCHAR2(50); -- these pieces will make up the entire CLOB
    currentPlace NUMBER := 1;  -- this is where we're up to in the CLOB
    blobLoc      BLOB;         -- blob locator in the table
    clobLoc      CLOB;         -- clob locator; this is the value from the dat file
    -- THIS HAS TO BE CHANGED FOR THE SPECIFIC CUSTOMER TABLE AND COLUMN NAMES
    CURSOR cur IS SELECT <blob_column>_clob clob_column, <blob_column> blob_column FROM <tablename> FOR UPDATE;
    cur_rec cur%ROWTYPE;
    BEGIN
    OPEN cur;
    FETCH cur INTO cur_rec;
    WHILE cur%FOUND
    LOOP
    -- RETRIEVE the clobLoc and blobLoc
    clobLoc := cur_rec.clob_column;
    blobLoc := cur_rec.blob_column;
    currentPlace := 1; -- reset every time
    -- find the length of the clob
    inputLength := DBMS_LOB.getLength(clobLoc);
    -- loop through each piece
    LOOP
    -- get the next piece of the CLOB
    piece := DBMS_LOB.subStr(clobLoc,pieceMaxSize,currentPlace);
    -- append this piece to the BLOB
    DBMS_LOB.WRITEAPPEND(blobLoc, LENGTH(piece)/2, HEXTORAW(piece));
    currentPlace := currentPlace + pieceMaxSize;
    EXIT WHEN inputLength < currentPlace;
    END LOOP;
    FETCH cur INTO cur_rec;
    END LOOP;
    CLOSE cur;
    END CLOBtoBLOB;
    -- now run the procedure
    -- It will update the blob column with the correct binary representation of the clob column
    EXEC CLOBtoBLOB;
    -- drop the extra clob column
    alter table <tablename> drop column <blob_column>_clob;
    -- 2) apply the constraint we removed during the data load
    alter table <tablename> MODIFY FILEBINARY NOT NULL;
    -- Now re-enable the triggers, indexes and primary keys
    alter trigger <triggername> enable;
    ALTER TABLE TBLFILES ADD ( CONSTRAINT <pkname> PRIMARY KEY ( <column>) ) ;
    CREATE INDEX <index_name> ON TBLFILES ( <column> );
    COMMIT;
    -- END OF FILE

  • SQL*Loader Error Field in data file exceeds maximum length", version 8.1.6

    Hi All,
    I am trying to load a data file into a database table using SQL*Loader. I
    received the data in a data file, but I saved it as a pipe-delimited
    file.
    When I run the SQL*Loader command, no records are loaded. Looking at the log
    file, I get the following error:
    Rejected - Error on table FARESDATA, column FANOTESIN.
    Then I tried with substr, but it doesn't seem to work for values greater than 4000 characters; it only works if the field value is below 4000 characters (say, doing a substr of the first 3000 characters).
    See the code:
    LOAD DATA
    INFILE 'p.dat'
    APPEND INTO TABLE PROSPECTUS
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (Fanotesin char(4000) "substr(:fanotesin,1,4000)")
    We get the error "ORA-01461: can bind a LONG value only for insert into a LONG column" when using substr for 4000 characters.
    Please help!
    Thanks,
    Rajesh
    null

    I believe the problem here is that the Oracle database regards anything > 4000 characters as a CLOB (or LONG). 4000 is the maximum length of a varchar2 field in the database, although of course you can declare larger values in PL/SQL. (Since the incoming data is > 4000 characters, it is regarded as a LONG, and you cannot therefore use SUBSTR on it.)
    I think that you must either truncate your data before loading, or load into a table with the offending field as a CLOB (or LONG)
    and then use PL/SQL to recreate it in a table with a varchar2(4000) which is otherwise identical. (You can select from a LONG into a varchar2(32000), for example, and then truncate before writing out to the new table.)
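    A sketch of that second suggestion (table and column names are hypothetical): load the oversized field into a CLOB column of a staging table, then copy a truncated value across, letting DBMS_LOB.SUBSTR do the truncation on the LOB side:
    -- staging table loaded by SQL*Loader with FANOTESIN as a CLOB
    INSERT INTO prospectus (fanotesin)
    SELECT DBMS_LOB.SUBSTR(fanotesin_clob, 4000, 1) -- first 4000 characters only
      FROM prospectus_stage;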

  • Sql loader performance problem with xml

    Hi,
    I have to load a 400 MB XML file into my local machine's free Oracle DB.
    I have tested a one-record XML and was able to load it successfully, but the 400 MB file has been frozen for half an hour and has not even started.
    Is that normal? Is there any chance I will be able to load it if I just wait?
    Are there any faster solutions?
    I have created the table below
    CREATE TABLE test_xml
    (
    COL_ID VARCHAR2(1000),
    IN_FILE XMLTYPE
    )
    XMLTYPE IN_FILE STORE AS CLOB;
    and the control file below:
    LOAD DATA
    CHARACTERSET UTF8
    INFILE 'test.xml'
    APPEND
    INTO TABLE product_xml
    (
    col_id filler CHAR (1000),
    in_file LOBFILE(CONSTANT "test.xml") TERMINATED BY EOF
    )
    Is there anything I am doing wrong? Thanks for any advice.

    SQL*Loader: Release 11.2.0.2.0 - Production on H. Febr. 11 18:57:09 2013
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Control File: prodxml.ctl
    Character Set UTF8 specified for all input.
    Data File: test.xml
    Bad File: test.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 5000
    Bind array: 64 rows, maximum of 256000 bytes
    Continuation: none specified
    Path used: Conventional
    Table PRODUCT_XML, loaded from every logical record.
    Insert option in effect for this table: APPEND
    Column Name Position Len Term Encl Datatype
    COL_ID FIRST 1000 CHARACTER
    (FILLER FIELD)
    IN_FILE DERIVED * EOF CHARACTER
    Static LOBFILE. Filename is bv_test.xml
    Character Set UTF8 specified for all input.
    SQL*Loader-605: Non-data dependent ORACLE error occurred -- load discontinued.
    ORA-01652: unable to extend temp segment by 128 in tablespace TEMP
    Table PRODUCT_XML:
    0 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Space allocated for bind array: 256 bytes(64 rows)
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records rejected: 0
    Total logical records discarded: 0
    Run began on H. Febr. 11 18:57:09 2013
    Run ended on H. Febr. 11 19:20:54 2013
    Elapsed time was: 00:23:45.76
    CPU time was: 00:05:05.50
    This is the log.
    I have truncated everything. I cannot understand why I am not able to load 400 MB into 4 GB.
    Windows, 32-bit, not licensed.
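    The log actually pinpoints the failure: ORA-01652 means the TEMP tablespace filled up while the XMLType was being assembled, not that the load never started. Enlarging TEMP, or letting it autoextend, is the usual first step (a sketch; run as a DBA, file name and sizes are illustrative):
    ALTER TABLESPACE temp ADD TEMPFILE 'temp02.dbf' SIZE 2G AUTOEXTEND ON NEXT 256M;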
