SQL*Loader - BLOB

I used the OMWB "Generate SQL Loader script" option and received a SQL*Loader error.
A previous attempt to use OMWB online loading generated garbage data: the picture did not match the person ID.
Table in Sql Server..................
CREATE TABLE [nilesh] (
     [LargeObjectID] [int] NOT NULL ,
     [LargeObject] [image] NULL ,
     [ContentType] [varchar] (40) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
     [LargeObjectName] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
     [LargeObjectExtension] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
     [LargeObjectDescription] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
     [LargeObjectSize] [int] NULL ,
     [VersionControl] [bit] NULL ,
     [WhenLargeObjectLocked] [datetime] NULL ,
     [WhoLargeObjectLocked] [char] (11) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
     [LargeObjectTimeStamp] [timestamp] NOT NULL ,
     [LargeObjectOID] [uniqueidentifier] NOT NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
Table in Oracle..............
CREATE TABLE LARGEOBJECT (
LARGEOBJECTID NUMBER(10) NOT NULL,
LARGEOBJECT BLOB,
CONTENTTYPE VARCHAR2(40 BYTE),
LARGEOBJECTNAME VARCHAR2(255 BYTE),
LARGEOBJECTEXTENSION VARCHAR2(10 BYTE),
LARGEOBJECTDESCRIPTION VARCHAR2(255 BYTE),
LARGEOBJECTSIZE NUMBER(10),
VERSIONCONTROL NUMBER(1),
WHENLARGEOBJECTLOCKED DATE,
WHOLARGEOBJECTLOCKED CHAR(11 BYTE),
LARGEOBJECTTIMESTAMP NUMBER(8) NOT NULL,
LARGEOBJECTOID RAW(16) NOT NULL
)
TABLESPACE USERS
PCTUSED 0
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
LOGGING
NOCOMPRESS
LOB (LARGEOBJECT) STORE AS (
TABLESPACE USERS
ENABLE STORAGE IN ROW
CHUNK 8192
PCTVERSION 10
NOCACHE
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
)
NOCACHE
NOPARALLEL
MONITORING;
Sql Loader script....
SET NLS_DATE_FORMAT=Mon dd YYYY HH:mi:ssAM
REM SET NLS_TIMESTAMP_FORMAT=Mon dd YYYY HH:mi:ss:ffAM
REM SET NLS_LANGUAGE=AL32UTF8
sqlldr cecildata/@ceciltst control=LARGEOBJECT.ctl log=LARGEOBJECT.log
Sql loader control file......
load data
infile 'nilesh.dat' "str '<er>'"
into table LARGEOBJECT
fields terminated by '<ec>'
trailing nullcols
(LARGEOBJECTID,
LARGEOBJECT CHAR(2000000) "HEXTORAW (:LARGEOBJECT)",
CONTENTTYPE "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)",
LARGEOBJECTNAME CHAR(255) "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)",
LARGEOBJECTEXTENSION "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)",
LARGEOBJECTDESCRIPTION CHAR(255) "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)",
LARGEOBJECTSIZE,
VERSIONCONTROL,
WHENLARGEOBJECTLOCKED,
WHOLARGEOBJECTLOCKED,
LARGEOBJECTTIMESTAMP,
LARGEOBJECTOID "GUID_MOVER(:LARGEOBJECTOID)")
Error Received...
Column Name Position Len Term Encl Datatype
LARGEOBJECTID FIRST * CHARACTER
Terminator string : '<ec>'
LARGEOBJECT NEXT ***** CHARACTER
Maximum field length is 2000000
Terminator string : '<ec>'
SQL string for column : "HEXTORAW (:LARGEOBJECT)"
CONTENTTYPE NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)"
LARGEOBJECTNAME NEXT 255 CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)"
LARGEOBJECTEXTENSION NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)"
LARGEOBJECTDESCRIPTION NEXT 255 CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)"
LARGEOBJECTSIZE NEXT * CHARACTER
Terminator string : '<ec>'
VERSIONCONTROL NEXT * CHARACTER
Terminator string : '<ec>'
WHENLARGEOBJECTLOCKED NEXT * CHARACTER
Terminator string : '<ec>'
WHOLARGEOBJECTLOCKED NEXT * CHARACTER
Terminator string : '<ec>'
LARGEOBJECTTIMESTAMP NEXT * CHARACTER
Terminator string : '<ec>'
LARGEOBJECTOID NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "GUID_MOVER(:LARGEOBJECTOID)"
SQL*Loader-309: No SQL string allowed as part of LARGEOBJECT field specification
What's the cause?

The previous attempt to use OMWB online loading generated garbage data (the picture not matching the person ID). This is being worked on (bug 4119713). If you have a reproducible test case please send it in (small test cases seem to work OK).
I have the following email about BLOBS I could forward to you if I have your email address:
[The forum may cut the lines in the wrong places]
Regards,
Turloch
Oracle Migration Workbench Team
Hi,
This may provide the solution. Without having the customer files here I can only guess at the problem. But this should help.
This email outlines a BLOB data move.
There are quite a few steps to complete the task of moving a large BLOB into the Oracle database.
Normally this wouldn't be a problem, but as far as we can tell, SQL Server's (and possibly Sybase's) BCP does not reliably export binary data.
The only way to export binary data properly via BCP is to export it in a HEX format.
Once in a HEX format it is difficult to get it back to binary during a data load into Oracle.
We have come up with the idea of getting the HEX values into Oracle by saving them in a CLOB (holds text) column.
We then convert the HEX values to binary values and insert them into the BLOB column.
The problem here is that the HEXTORAW function in Oracle only converts a maximum of 2000 HEX pairs.
We overcame this problem by writing our own procedure that converts your HEX data to binary piece by piece.
NOTE: YOU MUST MODIFY THE START.SQL AND FINISH.SQL TO SUIT YOUR CUSTOMER
The task is split into 4 sub tasks
1) CREATE A TABLESPACE TO HOLD ALL THE LOB DATA
--log into your system schema and create a tablespace
--Create a new tablespace for the CLOB and BLOB column (this may take a while to create)
--You may resize this to fit your data,
--but I believe you have in excess of 500MB of data in this table and we are going to save it twice (in a clob then a blob)
--Note: This script may take some time to execute as it has to create a tablespace of 1000MB.
-- Change this to suit your customer.
-- You can change this if you want depending on the size of your data
-- Remember that we save the data once as CLOB and then as BLOB
create tablespace lob_tablespace datafile 'lob_tablespace' SIZE 1000M AUTOEXTEND ON NEXT 50M;
LOG INTO YOUR TABLE SCHEMA IN ORACLE
--Modify this script to fit your requirements
2) START.SQL (this script will do the following tasks)
a) Modify your current schema so that it can accept HEX data
b) Modify your current schema so that it can hold that huge amount of data.
The new tablespace is used; you may want to alter this to your requirements
c) Disable triggers, indexes & primary keys on tblfiles
3)DATA MOVE
The data move now involves moving the HEX data in the .dat files to a CLOB.
The START.SQL script adds a new column to <tablename> called <blob_column>_CLOB.
This is where the HEX values will be stored.
MODIFY YOUR CONTROL FILE TO LOOK LIKE THIS:
load data
infile '<tablename>.dat' "str '<er>'"
into table <tablename>
fields terminated by '<ec>'
trailing nullcols
<blob_column>_CLOB CHAR(200000000),
The important part being "_CLOB" appended to your BLOB column name and the datatype set to CHAR(200000000)
RUN sql_loader_script.bat
Log into your schema to check if the data was loaded successfully -- now you can see that the hex values were sent to the CLOB column:
SQL> select dbms_lob.getlength(<blob_column>),dbms_lob.getlength(<blob_column>_clob) from <tablename>;
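Since every byte of the BLOB is encoded as two hex characters in the CLOB, a quick consistency check can be run once the conversion in step 4 has completed (a sketch using the same <tablename>/<blob_column> placeholders as above):

```sql
-- After the CLOB-to-BLOB conversion, each pair of hex characters in the
-- CLOB should have become exactly one byte in the BLOB; any row where the
-- lengths disagree was not converted correctly.
SELECT COUNT(*)
FROM <tablename>
WHERE dbms_lob.getlength(<blob_column>_clob) <> 2 * dbms_lob.getlength(<blob_column>);
```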
LOG INTO YOUR SCHEMA
4)FINISH.SQL (this script will do the following tasks)
a) Creates the procedure needed to perform the CLOB to BLOB transformation
b) Executes the procedure (this may take some time, as 500MB has to be converted to BLOB)
c) Alters the table back to its original form (removes the <blob_column>_clob)
d) Enables the triggers, indexes and primary keys
Regards,
(NAME)
-- START.SQL
-- Modify this for your particular customer
-- This should be executed in the user schema in Oracle that contains the table.
-- DESCRIPTION:
-- ALTERS THE OFFENDING TABLE SO THAT THE DATA MOVE CAN BE EXECUTED
-- DISABLES TRIGGERS, INDEXES AND SEQUENCES ON THE OFFENDING TABLE
-- 1) Add an extra column to hold the hex string
alter table <tablename> add (FILEBINARY_CLOB CLOB);
-- 2) Allow the BLOB column to accept NULLs
alter table <tablename> MODIFY FILEBINARY NULL;
-- 3) Disable triggers and sequences on tblfiles
alter trigger <triggername> disable;
alter table tblfiles drop primary key cascade;
drop index <indexname>;
-- 4) Allow the table to use the tablespace
alter table <tablename> move lob (<blob_column>) store as (tablespace lob_tablespace);
alter table tblfiles move lob (<blob_column>_clob) store as (tablespace lob_tablespace);
COMMIT;
-- END OF FILE
-- FINISH.SQL
-- Modify this for your particular customer
-- This should be executed in the table schema in Oracle.
-- DESCRIPTION:
-- MOVES THE DATA FROM CLOB TO BLOB
-- MODIFIES THE TABLE BACK TO ITS ORIGINAL SPEC (without a clob)
-- THEN ENABLES THE SEQUENCES, TRIGGERS AND INDEXES AGAIN
-- Currently we have the hex values saved as text in the <columnname>_CLOB column
-- And we have NULL in all rows for the <columnname> column.
-- We have to get BLOB locators for each row in the BLOB column
-- put empty blobs in the blob column
UPDATE <tablename> SET filebinary=EMPTY_BLOB();
COMMIT;
-- create the following procedure in your table schema
CREATE OR REPLACE PROCEDURE CLOBTOBLOB
AS
inputLength NUMBER; -- size of input CLOB
offSet NUMBER := 1;
pieceMaxSize NUMBER := 50; -- the max size of each piece
piece VARCHAR2(50); -- these pieces will make up the entire CLOB
currentPlace NUMBER := 1; -- this is where we're up to in the CLOB
blobLoc BLOB; -- BLOB locator in the table
clobLoc CLOB; -- CLOB locator pointing to the value from the .dat file
-- THIS HAS TO BE CHANGED FOR SPECIFIC CUSTOMER TABLE AND COLUMN NAMES
CURSOR cur IS SELECT <blob_column>_clob clob_column, <blob_column> blob_column FROM <tablename> FOR UPDATE;
cur_rec cur%ROWTYPE;
BEGIN
OPEN cur;
FETCH cur INTO cur_rec;
WHILE cur%FOUND
LOOP
-- RETRIEVE the clobLoc and blobLoc
clobLoc := cur_rec.clob_column;
blobLoc := cur_rec.blob_column;
currentPlace := 1; -- reset every time
-- find the length of the clob
inputLength := DBMS_LOB.getLength(clobLoc);
-- loop through each piece
LOOP
-- get the next piece of the CLOB
piece := DBMS_LOB.subStr(clobLoc, pieceMaxSize, currentPlace);
-- append this piece to the BLOB (two hex characters become one byte)
DBMS_LOB.WRITEAPPEND(blobLoc, LENGTH(piece)/2, HEXTORAW(piece));
currentPlace := currentPlace + pieceMaxSize;
EXIT WHEN inputLength < currentPlace;
END LOOP;
FETCH cur INTO cur_rec;
END LOOP;
CLOSE cur;
END CLOBtoBLOB;
-- now run the procedure
-- It will update the blob column with the correct binary representation of the clob column
EXEC CLOBtoBLOB;
-- drop the extra clob column
alter table <tablename> drop column <blob_column>_clob;
-- 2) apply the constraint we removed during the data load
alter table <tablename> MODIFY FILEBINARY NOT NULL;
-- Now re-enable the triggers, indexes and primary keys
alter trigger <triggername> enable;
ALTER TABLE TBLFILES ADD ( CONSTRAINT <pkname> PRIMARY KEY ( <column>) ) ;
CREATE INDEX <index_name> ON TBLFILES ( <column> );
COMMIT;
-- END OF FILE

Similar Messages

  • SQL Loader - BLOB & LONG COLUMNS

    I have a table with 500 rows
    200 out of these rows need to be exported from one database to another database.
    These rows have long and blob columns with values. The long columns COULD contain all sorts of special characters.
    How can this be achieved with SQL Loader? Is there any other tool that can be used to get this done?
    Any help will be appreciated.

    Hello,
    What's your Oracle version? As Satish mentioned, you can use Data Pump for load and unload. Here is an additional way of loading LOB data.
    To load data up to 32K, you can use VARCHAR2(32656), but it's not a good idea to load a CLOB in that manner because it's very inconsistent: the length can vary, resulting in "string literal too long" errors. So you have two choices: use either a procedure or an anonymous block to load the CLOB data.
    First Method -- I loaded alert.log successfully, and you can imagine how big this file can be (5MB in my test case; the approach works up to 4GB):
    CREATE OR REPLACE DIRECTORY DIR AS '/myapth/logs';
    DECLARE
       clob_data   CLOB;
       clob_file   BFILE;
    BEGIN
       INSERT INTO t1clob
       VALUES (EMPTY_CLOB ())
       RETURNING clob_text INTO clob_data;
       clob_file   := BFILENAME ('DIR', 'wwalert_dss.log');
       DBMS_LOB.fileopen (clob_file);
       DBMS_LOB.loadfromfile (clob_data,
                              clob_file,
                              DBMS_LOB.getlength (clob_file));
       DBMS_LOB.fileclose (clob_file);
       COMMIT;
    END;
    Second Method: Use SQL*Loader
    LOAD DATA
    INFILE alert.log "STR '|\n'"
    REPLACE INTO TABLE t1clob
    (
       clob_text CHAR(30000000)
    )
    Hope this helps.
    Regards

  • Sql*loader and blob

    Hi,
    I want to load hex strings with SQL*Loader into BLOB/RAW fields.
    The problem is that SQL functions (e.g. HEXTORAW) are not allowed on LOB columns, and for the RAW one I get:
    "illegal use of TERMINATED BY for RAW"
    What should i do?

    If each blob is in a separate file, then you should be able to use LOBFILEs to load them. Each row in the data file would need to have the name of the file containing the blob for that row. File ulcase9.ctl in the demo directory shows an example of using LOBFILEs to load a lob column.
    If you want the blob in the data file itself, then the data file needs to use types that include length information, such as VARRAW, LONG VARRAW or VARRAWC. Also, the records of the data file cannot be terminated by a character string or newline, because the blob data might contain that character string or newline in the middle of its data. Instead, you would need to use the VAR record type.
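    As a sketch of the LOBFILE approach described above (table, column and file names here are hypothetical, not from the original posting):

    ```sql
    -- Hypothetical target: my_table(id NUMBER, img BLOB)
    -- Each record in blobs.dat looks like: 42,/path/to/image42.jpg,
    LOAD DATA
    INFILE 'blobs.dat'
    INTO TABLE my_table
    FIELDS TERMINATED BY ','
    (id    INTEGER EXTERNAL,
     fname FILLER CHAR(255),              -- file name only; not loaded into the table
     img   LOBFILE(fname) TERMINATED BY EOF)
    ```

    SQL*Loader opens the file named in each record's fname field and loads its entire contents (up to EOF) into the img column for that row.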

  • URGENT: Problems Loading files with SQL Loader into a BLOB column

    Hi friends,
    I read a lot about how to load files into blob columns, but I found errors that I can't solve.
    I've read several notes in these forums, one of them:
    sql loader: loading external file into blob
    and tried the solutions but without good results.
    Here are some of my tests:
    With this .ctl:
    LOAD DATA
    INFILE *
    INTO TABLE mytable
    REPLACE
    FIELDS TERMINATED BY ','
    (number1 INTEGER EXTERNAL,
    cad1 CHAR(250),
    image1 LOBFILE(cad1) TERMINATED BY EOF
    )
    BEGINDATA
    1153,/opt/oracle/appl/myapp/1.0.0/img/1153.JPG,
    the error when I execute sqlldr is:
    SQL*Loader-350: Syntax error at line 9.
    Expecting "," or ")", found "LOBFILE".
    image1 LOBFILE(cad1) TERMINATED BY EOF
    ^
    What problem exists with LOBFILE ??
    (mytable of course has number1 as a NUMBER, cad1 as VARCHAR2(250) and image1 as a BLOB)
    I tried too with :
    LOAD DATA
    INFILE sample.dat
    INTO TABLE mytable
    FIELDS TERMINATED BY ','
    (cad1 CHAR(3),
    cad2 FILLER CHAR(30),
    image1 BFILE(CONSTANT "/opt/oracle/appl/myapp/1.0.0/img/", cad2))
    sample.dat is:
    1153,1153.JPEG,
    and error is:
    SQL*Loader-350: Syntax error at line 6.
    Expecting "," or ")", found "FILLER".
    cad2 FILLER CHAR(30),
    ^
    I tried too with a procedure, but without results...
    Any idea about this error messages?
    Thanks a lot.
    Jose L.

    > So you think that if one person put an "urgent" in the subject is screwing the problems of
    other people?
    Absolutely. As you are telling them "My posting is more important than yours and deserve faster attention and resolution than yours!".
    So what could a typical response be? Someone telling you that his posting is more important by using the phrase "VERY URGENT!". And the next poster may decide that, no, his problem is even more important - and use "EXTREMELY URGENT!!" as the subject. And the next one then raises the stakes by claiming his problem is "CODE RED! CRITICAL. DEFCON 4. URGENT!!!!".
    Stupid, isn't it? As stupid as your insistence that there is nothing wrong with your pitiful clamoring for attention to your problem by saying it is urgent.
    What do the RFCs say about a meaningful title/subject in a public forum? I trust that you know what an RFC is? After all, you claim to have used public forums on the Internet for some years now.
    The RFC on "public forums" is called The Usenet Article Format. This is what it has to say about the SUBJECT of a public posting:
    =
    The "Subject" line (formerly "Title") tells what the message is about. It should be suggestive enough of the contents of the message to enable a reader to make a decision whether to read the message based on the subject alone. If the message is submitted in response to another message (e.g., is a follow-up) the default subject should begin with the four characters "Re: ", and the "References" line is required. For follow-ups, the use of the "Summary" line is encouraged.
    =
    ([url http://www.cs.tut.fi/~jkorpela/rfc/1036.html]RFC 1036, the Usenet article format)
    Or how about [url http://www.cs.tut.fi/~jkorpela/usenet/dont.html]The seven don'ts of Usenet?
    Point 7 of the Don'ts:
    Don't try to catch attention by typing something foolish like "PLEASE HELP ME!!!! URGENT!!! I NEED YOUR HELP!!!" into the Subject line. Instead, type something informative (using normal mixed case!) that describes the subject matter.
    Please tell me that you are not too thick to understand the basic principles of netiquette, or to argue with the RFCs that governs the very fabric of the Internet.
    As for when I have an "urgent" problem? In my "real" work? I take it up with Oracle Support on Metalink by filing an iTAR/SR. As any non-idiot should do with a real-life Oracle crisis problem.
    I do not barge into a public forum like you do, jump up and down, and demand quick attention by claiming that my problem is more important and more urgent and more deserving of attention that other people's problem in the very same forum.

  • Bulk loading BLOBs using PL/SQL - is it possible?

    Hi -
    Does anyone have a good reference article or example of how I can bulk load BLOBs (videos, images, audio, office docs/pdf) into the database using PL/SQL?
    Every example I've ever seen in PL/SQL for loading BLOBs does a commit; after each file loaded ... which doesn't seem very scalable.
    Can we pass in an array of BLOBs from the application, into PL/SQL and loop through that array and then issue a commit after the loop terminates?
    Any advice or help is appreciated. Thanks
    LJ

    It is easy enough to modify the example to commit every N files. If you are loading large amounts of media, I think that you will find that the time to load the media is far greater than the time spent in SQL statements doing inserts or retrieves. Thus, I would not expect to see any significant benefit to changing the example to use PL/SQL collection types in order to do bulk row operations.
    If your goal is high performance bulk load of binary content then I would suggest that you look to use Sqlldr. A PL/SQL program loading from BFILEs is limited to loading files that are accessible from the database server file system. Sqlldr can do this but it can also load data from a remote client. Sqlldr has parameters to control batching of operations.
    See section 7.3 of the Oracle Multimedia DICOM Developer's Guide for the example Loading DICOM Content Using the SQL*Loader Utility. You will need to adapt this example to the other Multimedia objects (ORDImage, ORDAudio .. etc) but the basic concepts are the same.
    Once the binary content is loaded into the database, you will need a to write a program to loop over the new content and initialize the Multimedia objects (extract attributes). The example in 7.3 contains a sample program that does this for the ORDDicom object.
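    The batching parameters mentioned above can be set on the sqlldr command line or in the control file's OPTIONS clause; a sketch with illustrative values (the table and file names are made up, and the sizes are not recommendations):

    ```sql
    -- Commit every 100 rows and enlarge the bind array for wide LOB rows.
    OPTIONS (ROWS=100, BINDSIZE=20000000, READSIZE=20000000)
    LOAD DATA
    INFILE 'media.dat'
    APPEND INTO TABLE media_table
    FIELDS TERMINATED BY ','
    (id      INTEGER EXTERNAL,
     fname   FILLER CHAR(255),
     content LOBFILE(fname) TERMINATED BY EOF)
    ```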

  • Importing oracle.sql.BLOB through SQL Loader

    Hello,
    Currently the system we have creates .sql files and executes them. This takes a long time, so we're attempting to change to using SQL Loader. The one problem I'm having that I can't seem to fix is finding out how to imitate the behavior of this SQL statement through SQL Loader:
    INSERT INTO KRSB_QRTZ_BLOB_TRIGGERS (BLOB_DATA,TRIGGER_GROUP,TRIGGER_NAME)
    VALUES (oracle.sql.BLOB@515263,'KCB-Delivery','PeriodicMessageProcessingTrigger')
    I tried creating a lobfile and placing the text oracle.sql.BLOB@515263 within it this way
    INTO TABLE KRSB_QRTZ_BLOB_TRIGGERS
    WHEN tablename = 'KRSB_QRTZ_BLOB_TRIGGERS'
    FIELDS TERMINATED BY ',' ENCLOSED BY '##'
    TRAILING NULLCOLS
    (tablename FILLER POSITION(1),
    TRIGGER_NAME CHAR(80),
    TRIGGER_GROUP CHAR(80),
    ext_fname FILLER,
    BLOB_DATA LOBFILE(ext_fname) TERMINATED BY EOF)
    However, as expected, it merely loaded the text "oracle.sql.BLOB@515263" into the database. So does anyone have any ideas of how to imitate that insert statement through SQL Loader? The only information I have available is the string "oracle.sql.BLOB@515263", no other files or anything to aid with actually getting the binary data.
    When the .sql file is run with that insert statement in it, a 1.2kb BLOB is inserted into the database versus a 22 byte BLOB that contains nothing useful when done through SQL Loader.
    Alex

    My preference is DBMS_LOB.LOADFROMFILE
    http://www.morganslibrary.org/reference/dbms_lob.html
    did you try it?
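    For reference, a minimal DBMS_LOB.LOADFROMFILE sketch (the directory path, table and file names are made up for illustration):

    ```sql
    -- Hypothetical target: blob_table(id NUMBER, data BLOB)
    CREATE OR REPLACE DIRECTORY blob_dir AS '/mypath/files';

    DECLARE
       dest_lob BLOB;
       src_file BFILE := BFILENAME('BLOB_DIR', 'example.bin');
    BEGIN
       -- insert an empty LOB and get its locator back
       INSERT INTO blob_table (id, data)
       VALUES (1, EMPTY_BLOB())
       RETURNING data INTO dest_lob;
       -- copy the whole OS file into the LOB
       DBMS_LOB.fileopen(src_file, DBMS_LOB.file_readonly);
       DBMS_LOB.loadfromfile(dest_lob, src_file, DBMS_LOB.getlength(src_file));
       DBMS_LOB.fileclose(src_file);
       COMMIT;
    END;
    /
    ```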

  • Sql loader: loading external file into blob

    hi,
    I keep getting blocked when loading external binary files into BLOB columns with the SQL*Loader tool.
    Here is the problem:
    I want to load pictures into a table with lob columns. Here is my control file, data file and the error message:
    control file:
    LOAD DATA
    APPEND
    INTO TABLE T_K58_SYMBOLE
    FIELDS TERMINATED BY ';' TRAILING NULLCOLS
    (SYMB_ID INTEGER EXTERNAL(8),
    SYMB_LMLABEL CHAR(100),
    imgName FILLER,
    SYMB_GIF_HI BFILE(CONSTANT ".", imgName),
    thumbName FILLER,
    SYMB_GIF_LOW BFILE(CONSTANT ".", thumbName)
    )
    data file:
    1;0800;image1.gif;image1.gif;
    error message:
    SQL*Loader-418: Bad datafile datatype for column SYMB_GIF_HI
    Here is the script of the creation of my table in my database:
    create table T_K58_SYMBOLE (
    SYMB_ID NUMBER(8) not null,
    SYMB_LMLABEL VARCHAR2(100),
    SYMB_GIF_HI BLOB,
    SYMB_GIF_LOW BLOB,
    constraint PK_T_K58_SYMBOLE primary key (SYMB_ID)
    )
    LOB (SYMB_GIF_HI, SYMB_GIF_LOW) STORE AS (tablespace TDK5813)
    tablespace TDK5811;
    Please, I need help!

    This is my control file; I'm loading images into a database table using sqlldr on Unix:
    LOAD DATA
    INFILE 'sampledata.dat'
    INTO TABLE image_table
    APPEND
    FIELDS TERMINATED BY ',' optionally enclosed by '"'
    (IMAGE_ID INTEGER EXTERNAL,
    FILE_NAME CHAR,
    IMAGE_DATA LOBFILE(FILE_NAME) TERMINATED BY EOF)
    I'm facing the following error:
    SQL*Loader-350: Syntax error at line 9.
    Expecting "," or ")", found "LOBFILE".
    IMAGE_DATA LOBFILE(FILE_NAME) TERMINATED BY EOF
    My data file is:
    1,"IMG_3126.jpg"
    2,"IMG_3127.jpg"
    3,"IMG_3128.jpg"
    and the images are on the same server. Please help me get out of this.
    Thanks,
    Surjeet Kaur

  • SQL Loader null- BLOB

    Hello,
    I'm having a lot of problems loading a null value into a BLOB field via SQL*Loader.
    In my file there's a blank in the place corresponding to the BLOB field:
    , "ACCESS_IMAGE_DATA" CHAR NULLIF ("ACCESS_IMAGE_DATA"=BLANKS)
    then I execute SQL*Loader successfully, but in the BLOB field I get something that is not null:
    select (ACCESS_IMAGE_DATA)from TABLE
    ACCESS_IMAGE_DATA
    (BLOB)
    (BLOB)
    (BLOB)
    (BLOB)
    (BLOB)
    (BLOB)
    (BLOB)
    instead of null, null, null....
    Any idea?
    thanks in advance

    You have not indicated the front-end tool you are using, but it is not SQL*Plus, and you are not looking at what you assume you are seeing. Front-end tools do not display BLOBs.
    Run the following query:
    SELECT dbms_lob.getlength(access_image_data)
    FROM <table_name>;
    If it returns zero then you've been chasing the wind.

  • BLOB data and SQL Loader

    Has anyone used SQL*Loader for BLOB data? I would think you can only use it for text-type data.
    Thanks,
    Vic

    If each blob is in a separate file, then you should be able to use LOBFILEs to load them. Each row in the data file would need to have the name of the file containing the blob for that row. File ulcase9.ctl in the demo directory shows an example of using LOBFILEs to load a lob column.
    If you want the blob in the data file itself, then the data file needs to use types that include length information, such as VARRAW, LONG VARRAW or VARRAWC. Also, the records of the data file cannot be terminated by a character string or newline, because the blob data might contain that character string or newline in the middle of its data. Instead, you would need to use the VAR record type.

  • How to insert BLOB datatype image using PL/SQL Procedure and SQL Loader

    Hi,
    How can I insert an image into the database using a PL/SQL procedure, and how can I do the same using SQL*Loader? Please help by giving sample code and a description of the process.
    Thanks,
    Vijay V

    http://asktom.oracle.com/pls/apex/f?p=100:11:0::::P11_QUESTION_ID:232814159006

  • SQL*Loader and HTMLDB_APPLICATION_FILES

    Hello!
    Can I use SQL*Loader to load data from a file stored in HTMLDB_APPLICATION_FILES as a BLOB into tables in the database? The files are always CSV in my case.
    Best regards,
    Tom

    Hello Maxim!
    Of course the files are stored in HTMLDB_APPLICATION_FILES as BLOBs, and of course I can do 'select blob_content from htmldb.....' - BUT I have to get the text from this BLOB and copy certain words out of it, NOT the BLOB itself.
    Example: I have a file.csv. I upload it to HTMLDB_APPLICATION_FILES, so it is stored there as a BLOB. Now I want to copy the data, delimited by ';' (for example), into a table in the database.
    Any ideas?
    Best regards,
    Tom
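    One possible direction (an untested sketch; it assumes the CSV fits in 32K chunks and uses the database character set, and 'file.csv' is a hypothetical file name):

    ```sql
    -- Read the uploaded BLOB and cast a chunk of it to text; the text can
    -- then be split on ';' with INSTR/SUBSTR and inserted into the target table.
    DECLARE
       v_blob BLOB;
       v_text VARCHAR2(32767);
    BEGIN
       SELECT blob_content
         INTO v_blob
         FROM htmldb_application_files
        WHERE name = 'file.csv';
       -- DBMS_LOB.SUBSTR on a BLOB returns RAW; cast it to VARCHAR2
       v_text := UTL_RAW.CAST_TO_VARCHAR2(DBMS_LOB.SUBSTR(v_blob, 32767, 1));
       -- v_text now holds (the first 32K of) the CSV text for parsing
    END;
    /
    ```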

  • SQL*Loader and "Variable length field was truncated"

    Hi,
    I'm experiencing this problem using SQL*Loader: Release 8.1.7.0.0
    Here is my control file (it's actually split into separate control and data files, but the result is the same)
    LOAD DATA
    INFILE *
    APPEND INTO TABLE test
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    (first_id,
    second_id,
    third_id,
    language_code,
    display_text VARCHAR(2000))
    begindata
    2,1,1,"eng","Type of Investment Account"
    The TEST table is defined as:
    Name Null? Type
    FIRST_ID NOT NULL NUMBER(4)
    SECOND_ID NOT NULL NUMBER(4)
    THIRD_ID NOT NULL NUMBER(4)
    LANGUAGE_CODE NOT NULL CHAR(3)
    DISPLAY_TEXT VARCHAR2(2000)
    QUESTION_BLOB BLOB
    The log file displays:
    Record 1: Warning on table "USER"."TEST", column DISPLAY_TEXT
    Variable length field was truncated.
    And the results of the insert are:
    FIRST_ID SECOND_ID THIRD_ID LANGUAGE_CODE DISPLAY_TEXT
    2 1 1 eng ype of Investment Account"
    The language_code field is imported correctly, but display_text keeps the closing delimiter, and loses the first character of the string. In other words, it is interpreting the enclosing double quote and/or the delimiter, and truncating the first two characters.
    I've also tried the following:
    LOAD DATA
    INFILE *
    APPEND INTO TABLE test
    FIELDS TERMINATED BY '|'
    (first_id,
    second_id,
    third_id,
    language_code,
    display_text VARCHAR(2000))
    begindata
    2|1|1|eng|Type of Investment Account
    In this case, display_text is imported as:
    pe of Investment Account
    In the log file, I get this table which seems odd as well - why is the display_text column shown as having length 2002 when I explicitly set it to 2000?
    Column Name Position Len Term Encl Datatype
    FIRST_ID FIRST * | O(") CHARACTER
    SECOND_ID NEXT * | O(") CHARACTER
    THIRD_ID NEXT * | O(") CHARACTER
    LANGUAGE_CODE NEXT 3 | O(") CHARACTER
    DISPLAY_TEXT NEXT 2002 VARCHAR
    Am I missing something totally obvious in my control and data files? I've played with various combinations of delimiters (commas vs '|'), trailing nullcols, optionally enclosed, etc.
    Any help would be greatly appreciated!

    Use CHAR instead of VARCHAR:
    LOAD DATA
    INFILE *
    APPEND INTO TABLE test
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
      (first_id,
      second_id,
      third_id,
      language_code,
      display_text    CHAR(2000)
    )
    From the documentation:
    A VARCHAR field is a length-value datatype.
    It consists of a binary length subfield followed by a character string of the specified length.
    http://download-west.oracle.com/docs/cd/A87860_01/doc/server.817/a76955/ch05.htm#20324

  • ORA-22275 :invalid LOB locator specified error while loading BLOBs/CLOBS.

    Hello All,
    I am trying to load BLOB/CLOB data from a client's Oracle DB to the Oracle DB on our side. We are using ODI version 10.1.3.5.6, which reportedly has the issue of loading BLOBs/CLOBs solved. I am using
    The extraction fails in the loading stage when inserting data into the C$ table with the following error.
    "22275:99999 :java.sql.BatchUpdateException:ORA-22275:Invalid LOB locator specified".
    Kindly let me know how I can resolve this issue as the requirement to load this data is very urgent.
    Thanks,
    John

    One alternative can be done outside ODI, as ODI is still not able to resolve this issue. You can trim these fields (CLOB/BLOB) and push the data as separate fields into ODI, and at the reporting end you can concatenate them again.
    Maybe this will solve your problem... it solved mine.
    --XAT

  • Using SQL*Loader and UTL_FILE to load and unload large files(i.e PDF,DOCs)

    Problem: load PDF or similar files (stored on the operating system) into an Oracle table using SQL*Loader,
    and then unload the files back from the Oracle table to their previous format.
    I've used the SQL*Loader "sqlldr" command as:
    " sqlldr scott/[email protected] control=c:\sqlldr\control.ctl log=c:\any.txt "
    Control file is written as :
    LOAD DATA
    INFILE 'c:\sqlldr\r_sqlldr.txt'
    REPLACE
    INTO table r_sqlldr
    Fields terminated by ','
    (id sequence (max,1),
    fname char(20),
    data LOBFILE(fname) terminated by EOF )
    It loads the files (PDF, image and more...) that are mentioned in file r_sqlldr.txt into the Oracle table r_sqlldr.
    The text file (used as source) is written as:
    c:\kalam.pdf,
    c:\CTSlogo1.bmp
    c:\any1.txt
    After this load, I used UTL_FILE to unload the data, writing a procedure like this:
    CREATE OR REPLACE PROCEDURE R_UTL AS
      l_file     UTL_FILE.FILE_TYPE;
      l_buffer   RAW(32767);
      l_amount   BINARY_INTEGER;
      l_pos      INTEGER := 1;
      l_blob     BLOB;
      l_blob_len INTEGER;
    BEGIN
      SELECT data
        INTO l_blob
        FROM r_sqlldr
       WHERE id = 1;
      l_blob_len := DBMS_LOB.GETLENGTH(l_blob);
      DBMS_OUTPUT.PUT_LINE('blob length : ' || l_blob_len);
      IF (l_blob_len < 32767) THEN
        l_amount := l_blob_len;
      ELSE
        l_amount := 32767;
      END IF;
      DBMS_LOB.OPEN(l_blob, DBMS_LOB.LOB_READONLY);
      l_file := UTL_FILE.FOPEN('DBDIR1', 'Kalam_out.pdf', 'w', 32767);
      DBMS_OUTPUT.PUT_LINE('File opened');
      WHILE l_pos < l_blob_len LOOP
        DBMS_LOB.READ(l_blob, l_amount, l_pos, l_buffer);
        DBMS_OUTPUT.PUT_LINE('Blob read');
        l_pos := l_pos + l_amount;
        UTL_FILE.PUT_RAW(l_file, l_buffer, TRUE);
        DBMS_OUTPUT.PUT_LINE('writing to file');
        UTL_FILE.FFLUSH(l_file);
        UTL_FILE.NEW_LINE(l_file);
      END LOOP;
      UTL_FILE.FFLUSH(l_file);
      UTL_FILE.FCLOSE(l_file);
      DBMS_OUTPUT.PUT_LINE('File closed');
      DBMS_LOB.CLOSE(l_blob);
    EXCEPTION
      WHEN OTHERS THEN
        IF UTL_FILE.IS_OPEN(l_file) THEN
          UTL_FILE.FCLOSE(l_file);
        END IF;
        DBMS_OUTPUT.PUT_LINE('Its working at last');
    END R_UTL;
    This unloads data from the r_sqlldr table (BLOBs) to files on the operating system.
    -> The same procedure, with minor changes, is used to unload other similar files such as images and text files.
    In the above example: Loading: 3 files 1) Kalam.pdf 2) CTSlogo1.bmp 3) any1.txt are loaded into 3 rows of the Oracle table r_sqlldr respectively,
    with the file names going into the fname column and the corresponding data into the data (BLOB) column.
    Unload: These files are then written back to their previous format on the operating system using Oracle's UTL_FILE feature.
    So the PROBLEM IS: the files come back out at roughly their actual size, but with the quality degraded. The PDF file doesn't even display its data: the size is almost equal to the source file, but the content is lost when I open it.
    And for images: the images get loaded and unloaded, but with the colors changed.
    Oracle features like FFLUSH have also been used, but it never worked.
    ANY SUGGESTIONS OR ALTERNATE SOLUTIONS TO LOAD AND UNLOAD PDFs through Oracle ARE REQUESTED.
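    [Editor's note: a likely cause of the corruption described above, offered as an educated guess rather than a confirmed diagnosis, is that the file is opened in text mode ('w') and that the UTL_FILE.NEW_LINE call injects newline bytes into the middle of the binary stream; text files survive both, but PDFs and images do not. A minimal corrected sketch of the write loop, assuming the same r_sqlldr table and DBDIR1 directory object as the procedure above:]

    ```sql
    -- Sketch only: byte-mode unload of one BLOB row to an OS file.
    DECLARE
      l_file     UTL_FILE.FILE_TYPE;
      l_buffer   RAW(32767);
      l_amount   BINARY_INTEGER;
      l_pos      INTEGER := 1;
      l_blob     BLOB;
      l_blob_len INTEGER;
    BEGIN
      SELECT data INTO l_blob FROM r_sqlldr WHERE id = 1;
      l_blob_len := DBMS_LOB.GETLENGTH(l_blob);

      -- 'wb' opens the file in byte (binary) mode; 'w' is text mode
      -- and may translate bytes, corrupting PDFs and images.
      l_file := UTL_FILE.FOPEN('DBDIR1', 'Kalam_out.pdf', 'wb', 32767);

      WHILE l_pos <= l_blob_len LOOP
        l_amount := 32767;  -- reset each pass; DBMS_LOB.READ updates it
        DBMS_LOB.READ(l_blob, l_amount, l_pos, l_buffer);
        UTL_FILE.PUT_RAW(l_file, l_buffer, TRUE);
        -- No UTL_FILE.NEW_LINE here: it would insert newline bytes
        -- between chunks of the binary stream.
        l_pos := l_pos + l_amount;
      END LOOP;

      UTL_FILE.FCLOSE(l_file);
    END;
    /
    ```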
    ------------------------------------------------------------------------------------------------------------------------

    Thanks Justin for the quick response.
    Well, I am loading the data into a BLOB only, using SQL*Loader.
    I've never used dbms_lob.loadFromFile to do the loads.
    I opened a file on the network and then used dbms_lob.read and
    UTL_FILE.PUT_RAW to read and write the data into the target file.
    Actually, my process works fine with text files but not with PDFs and images.
    As for your question, "Is the data the proper length after reading it in?" -- I'm not sure what you are asking, but regarding data length I think there is no problem, except that the source PDF is 90.4 KB and the target is 90.8 KB.
    That's it.
    So I request you to add some more help, or should I provide some more details?
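    [Editor's note: for reference, dbms_lob.loadFromFile -- mentioned above as the untried alternative -- can copy a file into a BLOB without SQL*Loader and performs no character conversion. A minimal sketch, reusing the DBDIR1 directory object and r_sqlldr table from earlier in the thread (the file name and row id are illustrative assumptions):]

    ```sql
    -- Sketch: load one OS file into the BLOB column via a BFILE.
    DECLARE
      l_bfile BFILE := BFILENAME('DBDIR1', 'kalam.pdf');
      l_blob  BLOB;
    BEGIN
      -- Lock and initialize the target LOB locator.
      UPDATE r_sqlldr SET data = EMPTY_BLOB() WHERE id = 1
      RETURNING data INTO l_blob;

      DBMS_LOB.OPEN(l_bfile, DBMS_LOB.LOB_READONLY);
      -- Copies the raw bytes verbatim: no text-mode translation.
      DBMS_LOB.LOADFROMFILE(l_blob, l_bfile, DBMS_LOB.GETLENGTH(l_bfile));
      DBMS_LOB.CLOSE(l_bfile);
      COMMIT;
    END;
    /
    ```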

  • Intermedia and SQL*Loader

    I created a small test table to try to figure out the control-file syntax for SQL*Loader. I'm using what I found on technet, but I can't seem to get past:
    ** SQL*Loader-416: SDF clause for field LOCALDATA in table QUERY_INTERMEDIA references a non existent field. **
    All the files -- the control file, the .bat file, and front.tif -- exist in the same directory. I was able to get this to work when the column type was a BLOB, but I can't seem to get it right for interMedia.
    create table bsw.query_intermedia
    ( customer_id number not null,
      image_front ORDSYS.ORDImage )
    LOB (image_front.source.localData) STORE AS
    ( tablespace images_lc1 storage (pctincrease 0)
      pctversion 0 chunk 8K NOCACHE LOGGING DISABLE STORAGE IN ROW )
    pctfree 0
    Control file:
    LOAD DATA
    INFILE *
    INTO TABLE query_intermedia
    TRUNCATE
    FIELDS TERMINATED BY ','
    (customer_id integer external,
     image_front column object
       (source column object
          (localdata_fname FILLER CHAR(128),
           localdata LOBFILE(image.source.localdata_fname)),
        mimetype char(40)))
    BEGINDATA
    1,front.tif,image/tif
    The .bat file that initiates sqlldr:
    sqlldr userid=bsw/bsw control=control_intermedia.ctl log=test_intermedia.log rows=300 bindsize=20000000 readsize=20000000
    Any help or suggestions would be much appreciated. Thanks.
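    [Editor's note: SQL*Loader-416 means the SDF/LOBFILE clause names a field SQL*Loader cannot find. A plausible cause here, offered as an assumption rather than a verified fix, is that the LOBFILE reference is qualified as image.source.localdata_fname while the actual column is named image_front. A sketch of the control file with the reference fully qualified against the real column name:]

    ```sql
    LOAD DATA
    INFILE *
    INTO TABLE query_intermedia
    TRUNCATE
    FIELDS TERMINATED BY ','
    (customer_id     INTEGER EXTERNAL,
     image_front     COLUMN OBJECT
       (source       COLUMN OBJECT
          (localdata_fname FILLER CHAR(128),
           -- Qualify the filler with the real column name, image_front:
           localdata LOBFILE(image_front.source.localdata_fname)
                     TERMINATED BY EOF),
        mimetype     CHAR(40)))
    BEGINDATA
    1,front.tif,image/tif
    ```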

    User,
    There's a SQL Developer Forum (http://forums.oracle.com/forums/forum.jspa?forumID=260) that's probably better suited for SQL Developer enhancement requests.
    John
