Load BLOB
Hi
Oracle 10.2.0.1.0
I tried the following code:
CREATE OR REPLACE PROCEDURE load_lob AS
id NUMBER;
image1 BLOB;
locator BFILE;
bfile_len NUMBER;
bf_desc VARCHAR2(30);
bf_name VARCHAR2(30);
bf_dir VARCHAR2(30);
bf_typ VARCHAR2(4);
ctr integer;
CURSOR get_id IS
SELECT bfile_id,bfile_desc,bfile_type FROM graphics_table;
BEGIN
OPEN get_id;
LOOP
FETCH get_id INTO id, bf_desc, bf_typ;
EXIT WHEN get_id%notfound;
dbms_output.put_line('ID: '||to_char(id));
SELECT bfile_loc INTO locator FROM graphics_table WHERE bfile_id=id;
dbms_lob.filegetname(locator,bf_dir,bf_name);
dbms_output.put_line('Dir: '||bf_dir);
dbms_lob.fileopen(locator,dbms_lob.file_readonly);
bfile_len:=dbms_lob.getlength(locator);
dbms_output.put_line('ID: '||to_char(id)||' length: '||to_char(bfile_len));
SELECT temp_blob INTO image1 FROM temp_blob; -- error occurs here
bfile_len:=dbms_lob.getlength(locator);
dbms_lob.loadfromfile(image1,locator,bfile_len,1,1);
INSERT INTO internal_graphics VALUES (id,bf_desc,image1,bf_typ);
dbms_output.put_line(bf_desc||' Length: '||TO_CHAR(bfile_len)||
' Name: '||bf_name||' Dir: '||bf_dir||' '||bf_typ);
dbms_lob.fileclose(locator);
END LOOP;
END;
It says that the temp_blob table does not exist. Please help.
So, the question really is: does temp_blob exist? Can you query temp_blob from SQL*Plus? If yes, can you still query it from SQL*Plus after using SET ROLE NONE?
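A common alternative, if the goal was just to obtain a scratch BLOB to load into, is to skip the temp_blob table entirely and create a temporary LOB. A sketch only; the directory and file names here are hypothetical, not from the original post:

```sql
-- Hypothetical sketch: use a temporary LOB instead of selecting one
-- from a scratch table that may not exist in the current schema.
DECLARE
  image1  BLOB;
  locator BFILE := BFILENAME('IMG_DIR', 'photo1.jpg');  -- assumed names
BEGIN
  DBMS_LOB.CREATETEMPORARY(image1, TRUE, DBMS_LOB.CALL);
  DBMS_LOB.FILEOPEN(locator, DBMS_LOB.FILE_READONLY);
  DBMS_LOB.LOADFROMFILE(image1, locator, DBMS_LOB.GETLENGTH(locator));
  -- ... insert the populated BLOB into the target table here ...
  DBMS_LOB.FILECLOSE(locator);
  DBMS_LOB.FREETEMPORARY(image1);
END;
/
```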
Similar Messages
-
ORA-22275 :invalid LOB locator specified error while loading BLOBs/CLOBS.
Hello All,
I am trying to load BLOB/CLOB data from a client Oracle DB to the Oracle DB on our side. We are using ODI version 10.1.3.5.6, which reportedly has the BLOB/CLOB loading issue resolved.
The extraction fails in the loading stage when inserting data into the C$ table with the following error.
"22275:99999 :java.sql.BatchUpdateException:ORA-22275:Invalid LOB locator specified".
Kindly let me know how I can resolve this issue as the requirement to load this data is very urgent.
Thanks,
John
One alternate way can be done outside of ODI, as ODI is still not able to resolve this issue: trim these fields (CLOB/BLOB), push the data as separate fields through ODI, and at the reporting end concatenate them again.
Maybe this will solve your problem ... it solved mine.
--XAT -
Bulk loading BLOBs using PL/SQL - is it possible?
Hi -
Does anyone have a good reference article or example of how I can bulk load BLOBs (videos, images, audio, office docs/pdf) into the database using PL/SQL?
Every example I've ever seen in PL/SQL for loading BLOBs does a COMMIT after each file loaded, which doesn't seem very scalable.
Can we pass in an array of BLOBs from the application, into PL/SQL and loop through that array and then issue a commit after the loop terminates?
Any advice or help is appreciated. Thanks
LJ
It is easy enough to modify the example to commit every N files. If you are loading large amounts of media, I think you will find that the time to load the media far exceeds the time spent in SQL statements doing inserts or retrieves. Thus, I would not expect any significant benefit from changing the example to use PL/SQL collection types for bulk row operations.
If your goal is a high-performance bulk load of binary content, then I would suggest that you look at SQL*Loader (sqlldr). A PL/SQL program loading from BFILEs is limited to loading files that are accessible from the database server file system. SQL*Loader can do this, but it can also load data from a remote client, and it has parameters to control batching of operations.
See section 7.3 of the Oracle Multimedia DICOM Developer's Guide for the example Loading DICOM Content Using the SQL*Loader Utility. You will need to adapt this example to the other Multimedia objects (ORDImage, ORDAudio .. etc) but the basic concepts are the same.
Once the binary content is loaded into the database, you will need a to write a program to loop over the new content and initialize the Multimedia objects (extract attributes). The example in 7.3 contains a sample program that does this for the ORDDicom object. -
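As a rough sketch of the SQL*Loader approach (table, column, and file names below are invented for illustration, not taken from the Multimedia guide), a control file can name each source file per data record and load it into a BLOB column via LOBFILE:

```sql
-- filelist.dat carries one "id,path" pair per line, e.g.  1,/media/clip1.dat
LOAD DATA
INFILE 'filelist.dat'
INTO TABLE my_media
FIELDS TERMINATED BY ','
( id      INTEGER EXTERNAL,
  fname   FILLER CHAR(255),          -- path is consumed but not stored
  content LOBFILE(fname) TERMINATED BY EOF
)
```

The ROWS parameter on the sqlldr command line then controls how many rows go per commit, which addresses the commit-per-file concern.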
Load blob to file. error: file write error
Hi,
I used a procedure which loads a BLOB to a file. It works on Oracle 10g, but it doesn't work on Oracle 11g. Why?
create or replace PROCEDURE load_blob_to_bfile (p_file_id IN VARCHAR2, p_directory IN VARCHAR2, p_ident in varchar2 default NULL)
IS
v_blob BLOB;
v_start NUMBER := 1;
v_bytelen NUMBER := 2000;
v_len NUMBER;
v_raw RAW (2000);
v_x NUMBER;
v_output UTL_FILE.file_type;
v_file_name VARCHAR2 (200);
BEGIN
-- get length of blob
SELECT DBMS_LOB.getlength (blob_content), filename
INTO v_len, v_file_name
FROM wwv_flow_files
WHERE filename = p_file_id;
-- define output directory
v_output := UTL_FILE.fopen (p_directory, p_ident||'_'||v_file_name, 'wb', 32760);
-- save blob length
v_x := v_len;
-- select blob into variable
SELECT blob_content
INTO v_blob
FROM wwv_flow_files
WHERE filename = p_file_id;
v_start := 1;
WHILE v_start < v_len AND v_bytelen > 0
LOOP
DBMS_LOB.READ (v_blob, v_bytelen, v_start, v_raw);
UTL_FILE.put_raw (v_output, v_raw);
UTL_FILE.fflush (v_output);
/* Text only could be: UTL_RAW.cast_to_varchar2 (v_raw);*/
-- set the start position for the next cut
v_start := v_start + v_bytelen;
-- set the end position if less than 32000 bytes
v_x := v_x - v_bytelen;
IF v_x < 2000
THEN
v_bytelen := v_x;
END IF;
END LOOP;
UTL_FILE.fclose (v_output);
END;
The directory is created and granted for read and write. It looks like the file is created, but the application raises a write file error, and the files in the directory are empty.
TomasB
So, I'm copying the file into a temporary BLOB with this function:
FUNCTION get_remote_binary_data (p_conn IN OUT NOCOPY UTL_TCP.connection,
p_file IN VARCHAR2)
RETURN BLOB IS
l_conn UTL_TCP.connection;
l_amount PLS_INTEGER;
l_buffer RAW(32767);
l_data BLOB;
BEGIN
DBMS_LOB.createtemporary (lob_loc => l_data,
CACHE => TRUE,
dur => DBMS_LOB.CALL);
l_conn := get_passive(p_conn); -- get a passive connection
send_command(p_conn, 'RETR ' || p_file, TRUE); -- send RETR command to the server
BEGIN
LOOP
l_amount := UTL_TCP.read_raw (l_conn, l_buffer, 32767);
DBMS_LOB.writeappend(l_data, l_amount, l_buffer);
END LOOP;
EXCEPTION
WHEN UTL_TCP.END_OF_INPUT THEN
NULL;
WHEN OTHERS THEN
NULL;
END;
UTL_TCP.close_connection(l_conn);
get_reply(p_conn);
RETURN l_data;
END;
Then I'm writing it into a local file:
PROCEDURE put_local_binary_data (p_data IN BLOB,
p_dir IN VARCHAR2,
p_file IN VARCHAR2) IS
l_out_file UTL_FILE.FILE_TYPE;
l_buffer RAW(32767);
l_amount BINARY_INTEGER;
l_pos INTEGER := 1;
l_blob_len INTEGER;
BEGIN
l_blob_len := DBMS_LOB.getlength(p_data);
l_amount := DBMS_LOB.GETCHUNKSIZE(p_data);
IF (l_amount >= 32767) THEN
l_amount := 32767;
END IF;
l_out_file := UTL_FILE.FOPEN(p_dir, p_file, 'w', 32767);
WHILE l_pos < l_blob_len LOOP
DBMS_LOB.READ (p_data, l_amount, l_pos, l_buffer);
UTL_FILE.put_raw(l_out_file, l_buffer, FALSE);
l_pos := l_pos + l_amount;
END LOOP;
UTL_FILE.FCLOSE(l_out_file);
EXCEPTION
WHEN OTHERS THEN
IF UTL_FILE.IS_OPEN(l_out_file) THEN
UTL_FILE.FCLOSE(l_out_file);
END IF;
RAISE;
END;
I've checked the blob before I wrote it to the file (UTL_FILE.put_raw(l_out_file, l_buffer, FALSE)), and it contains no carriage return. -
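One detail worth noting in chunked copy loops like the two above: a condition of the form WHILE l_pos < l_blob_len skips the final read when the remaining data starts exactly at l_pos (a 1-byte LOB is never read at all). A defensive variant, sketched with hypothetical directory and file names:

```sql
-- Sketch: copy a BLOB to a server-side file chunk by chunk.
-- Note the <= so the last chunk is never skipped; DBMS_LOB.READ
-- shrinks l_amount on the final, partial read.
DECLARE
  l_blob   BLOB;  -- assume populated/selected elsewhere
  l_file   UTL_FILE.FILE_TYPE;
  l_buffer RAW(32767);
  l_amount BINARY_INTEGER;
  l_pos    INTEGER := 1;
  l_len    INTEGER;
BEGIN
  l_len  := DBMS_LOB.GETLENGTH(l_blob);
  l_file := UTL_FILE.FOPEN('EXPORT_DIR', 'out.bin', 'wb', 32767);  -- assumed names
  WHILE l_pos <= l_len LOOP
    l_amount := 32767;  -- reset before every read
    DBMS_LOB.READ(l_blob, l_amount, l_pos, l_buffer);
    UTL_FILE.PUT_RAW(l_file, l_buffer, TRUE);
    l_pos := l_pos + l_amount;
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/
```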
ORA-01422 when I load blob file
Hi,
I've this table TAB_BLOB;
IMMOBILE..................FILE_NAME................FILE_FISICO
001236....................my_file.xls.......................(HugeBlob)
127890....................my_file2.xls.......................(HugeBlob)
127890....................my_file3.xls.......................(HugeBlob)
127222....................my_file4.xls.......................(HugeBlob)
127222....................my_file5.xls.......................(HugeBlob)
127222....................my_file6.xls.......................(HugeBlob)
555590....................my_file7.xls.......................(HugeBlob)
555590....................my_file8.xls.......................(HugeBlob)
1777890....................my_file9.xls.......................(HugeBlob)
2222222....................my_file11.xls.......................(HugeBlob)
2222222....................my_file13.xls.......................(HugeBlob)
2222222....................my_file14.xls.......................(HugeBlob)
2222222....................my_file15.xls.......................(HugeBlob)
I created this stored procedure to load blob file:
CREATE OR REPLACE PROCEDURE LOAD_FILE_BLOB
IS
SRC_FILE BFILE;
DST_FILE BLOB;
LGH_FILE BINARY_INTEGER;
FILE_NAME VARCHAR2(30);
ERR_NUM NUMBER;
ERR_MSG VARCHAR2(300);
CURSOR MY_CUR IS
SELECT IMMOBILE, FILE_NAME
FROM TAB_BLOB;
BEGIN
FOR REC_CUR IN MY_CUR LOOP
SRC_FILE := BFILENAME('DIR_NAME', REC_CUR.FILE_NAME);
IF DBMS_LOB.FILEEXISTS(SRC_FILE) = 1 THEN
UPDATE TAB_BLOB
SET FILE_FISICO = EMPTY_BLOB()
WHERE IMMOBILE = REC_CUR.IMMOBILE;
SELECT FILE_FISICO
INTO DST_FILE
FROM TAB_BLOB
WHERE IMMOBILE = REC_CUR.IMMOBILE
FOR UPDATE;
DBMS_LOB.FILEOPEN(SRC_FILE);
LGH_FILE := DBMS_LOB.GETLENGTH(SRC_FILE);
DBMS_LOB.LOADFROMFILE(DST_FILE, SRC_FILE, LGH_FILE);
UPDATE TAB_BLOB
SET FILE_FISICO = DST_FILE
WHERE IMMOBILE = REC_CUR.IMMOBILE;
DBMS_LOB.FILECLOSE(SRC_FILE);
COMMIT;
ELSE
DBMS_OUTPUT.PUT_LINE('File not present on folder');
COMMIT;
END IF;
END LOOP;
EXCEPTION
WHEN OTHERS THEN
ERR_MSG := SUBSTR(SQLERRM, 1,300);
ERR_NUM := SQLCODE;
INSERT INTO CIET_ERRORS (PROC_NAME, ERR_CODE, ERR_MSG)
VALUES('carica_file_blob', ERR_NUM, ERR_MSG);
COMMIT;
END LOAD_FILE_BLOB;
but when I run:
execute LOAD_FILE_BLOB;
I get this error:
ORA-01422: exact fetch returns more than requested number of rows
I think that the error is on:
UPDATE TAB_BLOB
SET FILE_FISICO = DST_FILE
WHERE IMMOBILE = REC_CUR.IMMOBILE;
How can I change my stored procedure to load the BLOB files?
Thanks in advance!
I changed it to:
CREATE OR REPLACE PROCEDURE LOAD_FILE_BLOB
IS
SRC_FILE BFILE;
DST_FILE BLOB;
LGH_FILE BINARY_INTEGER;
FILE_NAME VARCHAR2(30);
ERR_NUM NUMBER;
ERR_MSG VARCHAR2(300);
CURSOR MY_CUR IS
SELECT IMMOBILE, FILE_NAME
FROM TAB_BLOB;
BEGIN
FOR REC_CUR IN MY_CUR LOOP
SRC_FILE := BFILENAME('DIR_NAME', REC_CUR.FILE_NAME);
IF DBMS_LOB.FILEEXISTS(SRC_FILE) = 1 THEN
UPDATE TAB_BLOB
SET FILE_FISICO = EMPTY_BLOB()
WHERE IMMOBILE = REC_CUR.IMMOBILE;
SELECT FILE_FISICO
INTO DST_FILE
FROM TAB_BLOB
WHERE IMMOBILE = REC_CUR.IMMOBILE
AND FILE_NAME = REC_CUR.FILE_NAME
FOR UPDATE;
DBMS_LOB.FILEOPEN(SRC_FILE);
LGH_FILE := DBMS_LOB.GETLENGTH(SRC_FILE);
DBMS_LOB.LOADFROMFILE(DST_FILE, SRC_FILE, LGH_FILE);
UPDATE TAB_BLOB
SET FILE_FISICO = DST_FILE
WHERE IMMOBILE = REC_CUR.IMMOBILE
and FILE_NAME = REC_CUR.FILE_NAME;
DBMS_LOB.FILECLOSE(SRC_FILE);
COMMIT;
ELSE
DBMS_OUTPUT.PUT_LINE('File not present on folder');
COMMIT;
END IF;
END LOOP;
EXCEPTION
WHEN OTHERS THEN
ERR_MSG := SUBSTR(SQLERRM, 1,300);
ERR_NUM := SQLCODE;
INSERT INTO CIET_ERRORS (PROC_NAME, ERR_CODE, ERR_MSG)
VALUES('carica_file_blob', ERR_NUM, ERR_MSG);
COMMIT;
END LOAD_FILE_BLOB;
but in table TAB_BLOB I have this situation:
IMMOBILE..................FILE_NAME................FILE_FISICO
001236....................my_file.xls.......................(HugeBlob) --load correctly
127890....................my_file2.xls.......................(HugeBlob)-- not loaded
127890....................my_file3.xls.......................(HugeBlob) --load correctly
127222....................my_file4.xls.......................(HugeBlob)-- not loaded
127222....................my_file5.xls.......................(HugeBlob)-- not loaded
127222....................my_file6.xls.......................(HugeBlob) --load correctly
555590....................my_file7.xls.......................(HugeBlob)-- not loaded
555590....................my_file8.xls.......................(HugeBlob) --load correctly
1777890....................my_file9.xls.......................(HugeBlob) --load correctly
2222222....................my_file11.xls.......................(HugeBlob)-- not loaded
2222222....................my_file13.xls.......................(HugeBlob)-- not loaded
2222222....................my_file14.xls.......................(HugeBlob)-- not loaded
2222222....................my_file15.xls.......................(HugeBlob) --load correctly
Why is only one file per IMMOBILE loaded? -
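A likely explanation, judging from the code above (this is a sketch of the fix, reusing the same names): the first UPDATE, the one that resets the column to EMPTY_BLOB(), is still keyed on IMMOBILE alone, so every loop iteration blanks the BLOBs already loaded for the other rows sharing that IMMOBILE, and only the last file processed per IMMOBILE survives. Keying it on the full pair avoids that:

```sql
-- Sketch: reset only the row currently being loaded,
-- not every row that shares the same IMMOBILE value.
UPDATE TAB_BLOB
   SET FILE_FISICO = EMPTY_BLOB()
 WHERE IMMOBILE  = REC_CUR.IMMOBILE
   AND FILE_NAME = REC_CUR.FILE_NAME;
```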
I am trying to load a BLOB into a column for testing Oracle Text. The following is what I did:
CREATE TABLE my_docs (
id NUMBER(10) NOT NULL,
name VARCHAR2(200) NOT NULL,
doc BLOB NOT NULL
);
ALTER TABLE my_docs ADD (
CONSTRAINT my_docs_pk PRIMARY KEY (id)
);
CREATE SEQUENCE my_docs_seq;
CREATE OR REPLACE DIRECTORY documents AS 'C:\work\';
CREATE OR REPLACE PROCEDURE load_file_to_my_docs (p_file_name IN my_docs.name%TYPE) AS
v_bfile BFILE;
v_blob BLOB;
BEGIN
INSERT INTO my_docs (id, name, doc)
VALUES (my_docs_seq.NEXTVAL, p_file_name, empty_blob())
RETURN doc INTO v_blob;
v_bfile := BFILENAME('DOCUMENTS', p_file_name);
Dbms_Lob.Fileopen(v_bfile, Dbms_Lob.File_Readonly);
Dbms_Lob.Loadfromfile(v_blob, v_bfile, Dbms_Lob.Getlength(v_bfile));
Dbms_Lob.Fileclose(v_bfile);
COMMIT;
END;
EXEC load_file_to_my_docs('doc1.doc');
I am getting the following error
ERROR at line 1:
ORA-22288: file or LOB operation FILEOPEN failed
The system cannot find the path specified.
ORA-06512: at "SYS.DBMS_LOB", line 504
ORA-06512: at "SCOTT.LOAD_FILE_TO_MY_DOCS", line 10
ORA-06512: at line 1
The directory and documents are available on the system, so what could be the reason for this error? Kindly help me; thanks in advance.
Rgds
Muneer
Sorry to bother.. I found the error. I created the directory on the client system instead of the server. That's what caused the headache.
cheers
Muneer -
How to load blob data into table
hi
i have a table with
ID NUMBER No - 1
PARENT_ID NUMBER No - -
DOCUMENT BLOB Yes - -
NAME VARCHAR2(40) Yes - -
MIMETYPE VARCHAR2(40) Yes - -
COMMENTS VARCHAR2(400) Yes - -
TIMESTAMP_CREATED TIMESTAMP(6) Yes - -
CREATED_BY VARCHAR2(40) Yes - -
TIMESTAMP_MODIFIED TIMESTAMP(6) Yes - -
MODIFIED_BY CHAR(40) Yes - -
IS_GROUP CHAR(1) No - -
FILE_NAME VARCHAR2(4000) Yes - -
as columns. I want to insert BLOB data into the empty table. I have some fields in the form through which I insert data by hard-coding in a process. When I upload a document in the file-browse field, the MIME type is not updated, though I have written code in the source value. I removed the database links of the form with the table, and that is why I am hard-coding to the table through a process. Could you suggest a query with which I can insert the BLOB data?
i use the process
begin
select max(ID) into aaa from "PSA_KNOWLEDGE_TREE";
insert into PSA_KNOWLEDGE_TREE values(aaa+1,1,null,:p126_NEW_GROUP,null,:p126_COMMENTS,:P126_TIMESTAMP_CREATED,:P126_CREATED_BY,null,null,'Y',null);
Could you please write the query according to my table and process requirements? I have tried many queries and failed to load the BLOB data; the MIME type is not being updated.
Thanks for your reply. -
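For reference, in APEX the uploaded file and its MIME type land in the upload table, so both can be copied from there instead of being left NULL. A sketch only; the file-browse item name :P126_FILE and the exact column order are assumptions, so adjust them to the real page:

```sql
-- Hypothetical sketch: pull the uploaded document and its MIME type
-- from the APEX upload table into the target table.
DECLARE
  aaa NUMBER;
BEGIN
  SELECT NVL(MAX(id), 0) INTO aaa FROM psa_knowledge_tree;
  INSERT INTO psa_knowledge_tree
    (id, parent_id, document, name, mimetype, comments,
     timestamp_created, created_by, timestamp_modified, modified_by,
     is_group, file_name)
  SELECT aaa + 1, 1, f.blob_content, :P126_NEW_GROUP, f.mime_type,
         :P126_COMMENTS, :P126_TIMESTAMP_CREATED, :P126_CREATED_BY,
         NULL, NULL, 'N', f.filename
    FROM wwv_flow_files f
   WHERE f.name = :P126_FILE;
END;
```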
I have used the OMWB "Generate SQL Loader script" option and received the SQL Loader error.
The previous attempt to use OMWB online loading generated garbage data: the picture was not matching the person id.
Table in Sql Server..................
CREATE TABLE [nilesh] (
[LargeObjectID] [int] NOT NULL ,
[LargeObject] [image] NULL ,
[ContentType] [varchar] (40) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectName] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectExtension] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectDescription] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectSize] [int] NULL ,
[VersionControl] [bit] NULL ,
[WhenLargeObjectLocked] [datetime] NULL ,
[WhoLargeObjectLocked] [char] (11) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LargeObjectTimeStamp] [timestamp] NOT NULL ,
[LargeObjectOID] [uniqueidentifier] NOT NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
Table in Oracle..............
CREATE TABLE LARGEOBJECT (
LARGEOBJECTID NUMBER(10) NOT NULL,
LARGEOBJECT BLOB,
CONTENTTYPE VARCHAR2(40 BYTE),
LARGEOBJECTNAME VARCHAR2(255 BYTE),
LARGEOBJECTEXTENSION VARCHAR2(10 BYTE),
LARGEOBJECTDESCRIPTION VARCHAR2(255 BYTE),
LARGEOBJECTSIZE NUMBER(10),
VERSIONCONTROL NUMBER(1),
WHENLARGEOBJECTLOCKED DATE,
WHOLARGEOBJECTLOCKED CHAR(11 BYTE),
LARGEOBJECTTIMESTAMP NUMBER(8) NOT NULL,
LARGEOBJECTOID RAW(16) NOT NULL
)
TABLESPACE USERS
PCTUSED 0
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
LOGGING
NOCOMPRESS
LOB (LARGEOBJECT) STORE AS
( TABLESPACE USERS
ENABLE STORAGE IN ROW
CHUNK 8192
PCTVERSION 10
NOCACHE
STORAGE (
INITIAL 64K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
)
NOCACHE
NOPARALLEL
MONITORING;
Sql Loader script....
SET NLS_DATE_FORMAT=Mon dd YYYY HH:mi:ssAM
REM SET NLS_TIMESTAMP_FORMAT=Mon dd YYYY HH:mi:ss:ffAM
REM SET NLS_LANGUAGE=AL32UTF8
sqlldr cecildata/@ceciltst control=LARGEOBJECT.ctl log=LARGEOBJECT.log
Sql loader control file......
load data
infile 'nilesh.dat' "str '<er>'"
into table LARGEOBJECT
fields terminated by '<ec>'
trailing nullcols
(LARGEOBJECTID,
LARGEOBJECT CHAR(2000000) "HEXTORAW (:LARGEOBJECT)",
CONTENTTYPE "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)",
LARGEOBJECTNAME CHAR(255) "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)",
LARGEOBJECTEXTENSION "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)",
LARGEOBJECTDESCRIPTION CHAR(255) "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)",
LARGEOBJECTSIZE,
VERSIONCONTROL,
WHENLARGEOBJECTLOCKED,
WHOLARGEOBJECTLOCKED,
LARGEOBJECTTIMESTAMP,
LARGEOBJECTOID "GUID_MOVER(:LARGEOBJECTOID)")
Error Received...
Column Name Position Len Term Encl Datatype
LARGEOBJECTID FIRST * CHARACTER
Terminator string : '<ec>'
LARGEOBJECT NEXT ***** CHARACTER
Maximum field length is 2000000
Terminator string : '<ec>'
SQL string for column : "HEXTORAW (:LARGEOBJECT)"
CONTENTTYPE NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)"
LARGEOBJECTNAME NEXT 255 CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)"
LARGEOBJECTEXTENSION NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)"
LARGEOBJECTDESCRIPTION NEXT 255 CHARACTER
Terminator string : '<ec>'
SQL string for column : "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)"
LARGEOBJECTSIZE NEXT * CHARACTER
Terminator string : '<ec>'
VERSIONCONTROL NEXT * CHARACTER
Terminator string : '<ec>'
WHENLARGEOBJECTLOCKED NEXT * CHARACTER
Terminator string : '<ec>'
WHOLARGEOBJECTLOCKED NEXT * CHARACTER
Terminator string : '<ec>'
LARGEOBJECTTIMESTAMP NEXT * CHARACTER
Terminator string : '<ec>'
LARGEOBJECTOID NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "GUID_MOVER(:LARGEOBJECTOID)"
SQL*Loader-309: No SQL string allowed as part of LARGEOBJECT field specification
what's the cause?
"The previous attempt to use OMWB online loading has generated garbage data. The picture was not matching with the person id." This is being worked on (bug 4119713). If you have a reproducible testcase, please send it in (small testcases seem to work ok).
I have the following email about BLOBS I could forward to you if I have your email address:
[The forum may cut the lines in the wrong places]
Regards,
Turloch
Oracle Migration Workbench Team
Hi,
This may provide the solution. Without having the customer files here I can only guess at the problem. But this should help.
This email outlines a BLOB data move.
There are quite a few steps to complete the task of moving a large BLOB into the Oracle database.
Normally this wouldn't be a problem, but as far as we can tell SQL Server's (and possibly Sybase) BCP does not reliably export binary data.
The only way to export binary data properly via BCP is to export it in a HEX format.
Once in a HEX format it is difficult to get it back to binary during a data load into Oracle.
We have come up with the idea of getting the HEX values into Oracle by saving them in a CLOB (holds text) column.
We then convert the HEX values to binary values and insert them into the BLOB column.
The problem here is that the HEXTORAW function in Oracle only converts a maximum of 2000 HEX pairs.
We over came this problem by writing our own procedure that will convert (bit by bit) your HEX data to binary.
NOTE: YOU MUST MODIFY THE START.SQL AND FINISH.SQL TO SUIT YOUR CUSTOMER
The task is split into 4 sub tasks
1) CREATE A TABLESPACE TO HOLD ALL THE LOB DATA
--log into your system schema and create a tablespace
--Create a new tablespace for the CLOB and BLOB column (this may take a while to create)
--You may resize this to fit your data ,
--but I believe you have in excess of 500MB of data in this table and we are going to save it twice (in a clob then a blob)
--Note: This script may take some time to execute as it has to create a tablespace of 1000Mb.
-- Change this to suit your customer.
-- You can change this if you want depending on the size of your data
-- Remember that we save the data once as CLOB and then as BLOB
create tablespace lob_tablespace datafile 'lob_tablespace' SIZE 1000M AUTOEXTEND ON NEXT 50M;
LOG INTO YOUR TABLE SCHEMA IN ORACLE
--Modify this script to fit your requirements
2) START.SQL (this script will do the following tasks)
a) Modify your current schema so that it can accept HEX data
b) Modify your current schema so that it can hold that huge amount of data.
The new tablespace is used; you may want to alter this to your requirements
c) Disable triggers, indexes & primary keys on tblfiles
3)DATA MOVE
The data move now involves moving the HEX data in the .dat files to a CLOB.
The START.SQL script adds a new column to <tablename> called <blob_column>_CLOB.
This is where the HEX values will be stored.
MODIFY YOUR CONTROL FILE TO LOOK LIKE THIS
load data
infile '<tablename>.dat' "str '<er>'"
into table <tablename>
fields terminated by '<ec>'
trailing nullcols
<blob_column>_CLOB CHAR(200000000),
The important part being "_CLOB" appended to your BLOB column name and the datatype set to CHAR(200000000)
RUN sql_loader_script.bat
Log into your schema to check if the data was loaded successfully.
-- now you can see that the hex values were sent to the CLOB column
SQL> select dbms_lob.getlength(<blob_column>),dbms_lob.getlength(<blob_column>_clob) from <tablename>;
LOG INTO YOUR SCHEMA
4)FINISH.SQL (this script will do the following tasks)
a) Creates the procedure needed to perform the CLOB to BLOB transformation
b) Executes the procedure (this may take some time, as 500MB has to be converted to BLOB)
c) Alters the table back to its original form (removes the <blob_column>_clob)
d) Enables the triggers, indexes and primary keys
Regards,
(NAME)
-- START.SQL
-- Modify this for your particular customer
-- This should be executed in the user schema in Oracle that contains the table.
-- DESCRIPTION:
-- ALTERS THE OFFENDING TABLE SO THAT THE DATA MOVE CAN BE EXECUTED
-- DISABLES TRIGGERS, INDEXES AND SEQUENCES ON THE OFFENDING TABLE
-- 1) Add an extra column to hold the hex string
alter table <tablename> add (FILEBINARY_CLOB CLOB);
-- 2) Allow the BLOB column to accept NULLs
alter table <tablename> MODIFY FILEBINARY NULL;
-- 3) Disable triggers and sequences on tblfiles
alter trigger <triggername> disable;
alter table tblfiles drop primary key cascade;
drop index <indexname>;
-- 4) Allow the table to use the tablespace
alter table <tablename> move lob (<blob_column>) store as (tablespace lob_tablespace);
alter table tblfiles move lob (<blob_column>_clob) store as (tablespace lob_tablespace);
COMMIT;
-- END OF FILE
-- FINISH.SQL
-- Modify this for your particular customer
-- This should be executed in the table schema in Oracle.
-- DESCRIPTION:
-- MOVES THE DATA FROM CLOB TO BLOB
-- MODIFIES THE TABLE BACK TO ITS ORIGINAL SPEC (without a clob)
-- THEN ENABLES THE SEQUENCES, TRIGGERS AND INDEXES AGAIN
-- Currently we have the hex values saved as text in the <columnname>_CLOB column
-- And we have NULL in all rows for the <columnname> column.
-- We have to get BLOB locators for each row in the BLOB column
-- put empty blobs in the blob column
UPDATE <tablename> SET filebinary=EMPTY_BLOB();
COMMIT;
-- create the following procedure in your table schema
CREATE OR REPLACE PROCEDURE CLOBTOBLOB
AS
inputLength NUMBER; -- size of input CLOB
offSet NUMBER := 1;
pieceMaxSize NUMBER := 50; -- the max size of each piece
piece VARCHAR2(50); -- these pieces will make up the entire CLOB
currentPlace NUMBER := 1; -- this is where we're up to in the CLOB
blobLoc BLOB; -- blob locator in the table
clobLoc CLOB; -- clob locator; this is the value from the .dat file
-- THIS HAS TO BE CHANGED FOR SPECIFIC CUSTOMER TABLE AND COLUMN NAMES
CURSOR cur IS SELECT <blob_column>_clob clob_column, <blob_column> blob_column FROM /*table*/<tablename> FOR UPDATE;
cur_rec cur%ROWTYPE;
BEGIN
OPEN cur;
FETCH cur INTO cur_rec;
WHILE cur%FOUND
LOOP
-- RETRIEVE the clobLoc and blobLoc
clobLoc := cur_rec.clob_column;
blobLoc := cur_rec.blob_column;
currentPlace := 1; -- reset every time
-- find the length of the clob
inputLength := DBMS_LOB.getLength(clobLoc);
-- loop through each piece
LOOP
-- get the next piece of the clob
piece := DBMS_LOB.subStr(clobLoc,pieceMaxSize,currentPlace);
-- append this piece to the BLOB
DBMS_LOB.WRITEAPPEND(blobLoc, LENGTH(piece)/2, HEXTORAW(piece));
currentPlace := currentPlace + pieceMaxSize ;
EXIT WHEN inputLength < currentplace;
END LOOP;
FETCH cur INTO cur_rec;
END LOOP;
CLOSE cur;
END CLOBtoBLOB;
-- now run the procedure
-- It will update the blob column with the correct binary representation of the clob column
EXEC CLOBtoBLOB;
-- drop the extra clob column
alter table <tablename> drop column <blob_column>_clob;
-- 2) apply the constraint we removed during the data load
alter table <tablename> MODIFY FILEBINARY NOT NULL;
-- Now re enable the triggers,indexs and primary keys
alter trigger <triggername> enable;
ALTER TABLE TBLFILES ADD ( CONSTRAINT <pkname> PRIMARY KEY ( <column>) ) ;
CREATE INDEX <index_name> ON TBLFILES ( <column> );
COMMIT;
-- END OF FILE -
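The 50-character chunk size in CLOBTOBLOB stays well under the HEXTORAW limit; since every 2 hex characters become 1 byte, the conversion can be sanity-checked with a one-off query:

```sql
-- 'DEADBEEF' is 4 hex pairs, so it converts to 4 raw bytes.
SELECT UTL_RAW.LENGTH(HEXTORAW('DEADBEEF')) AS byte_len FROM dual;
```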
Load BLOB column in Oracle to Image column in MS SQL Server
Hi there,
I have an Oracle table as the source and a MS SQL Server table as the target. A blob(4000) column in source is mapped with a Image(2147483647) column in target.
The execution gives me the error message "java.lang.NumberFormatException: For input string: "4294967295"" when it comes to the step "load data into staging table".
The LKM that I am using is LKM SQL to MSSQL.
I also tried LKM SQL to MSSQL (BULK), LKM SQL to SQL (Jython), and LKM SQL to SQL. None of them helped.
Could anyone tell me how to solve this 4294967295 problem? Thanks a lot.
Hi Yang,
Oracle recommends the setting of the 'get_lob_precision' flag to FALSE to avoid this error.
To do that,
1. Take a backup of your ODIPARAM.BAT file.
2. Open ODIPARAM.BAT file and add the following line,
set ODI_ADDITIONAL_JAVA_OPTIONS=%ODI_ADDITIONAL_JAVA_OPTIONS% " -Doracledatabasemetadata.get_lob_precision=false"
next to,
set ODI_ADDITIONAL_JAVA_OPTIONS="-Djava.security.policy=server.policy"
PS:If the parameter is set, the resulting "ODI_ADDITIONAL_JAVA_OPTIONS" ("SNP_ADDITIONAL_JAVA_OPTIONS" or "SNP_JAVA_OPTIONS") should be similar to the above.
Restart ODI and try.
Thanks,
G -
Problem loading Blobs to Oracle DB
Hi folks,
I am indeed stuck. I wrote a stored procedure, compiled it to byte-code, jarred it, used loadjava to load it into the Oracle 9i DB, and wrapped it in a PL/SQL wrapper. It worked, simply calling it as:
select
f_loadBlob('100')
from dual... produced a return of 1 - that's what the function returns - which is the count of inserted DB rows.
Now, I altered the java code to load several Blobs instead of just one. I did not change any other code so far as checking and re-checking over and over again brings to my attention. I went through all the same steps - altering the PL/SQL wrapper to to suit.
But I can't get rid of the following error from the call:
1 select
2 f_loadPix('100','200')
3* from dual
11:05:55 SQL>
11:05:56 SQL> /
from dual
ERROR at line 3:
ORA-29532: Java call terminated by uncaught Java exception:
java.lang.NullPointerException
... where 100 and 200 are VARCHAR2 file names in the file system where the jpeg files exist.
I am willing to post the code, but just don't want to overwhelm any who might be inclined to help me out. So if you want to see the class file code, please just ask.
Thanks in advance,
~Bill
PS: Oh BTW - I neglected to tell that the java code works just fine when run through the command line as a class file and accesses the DB through JDBC.
I also have try/catches up the wazoo.
Ok ... I noticed a few people checked this thread out, so here's the answer:
1. I must be mistaken that the single load worked, because having discussed the problem with our DBA, I am assured that no permissions were changed on the file that I am accessing ... or was attempting to access.
2. Our DBA made the correct determination that - even though the message did not indicate the file could not be accessed, this was indeed the problem. It seems that even though I placated Oracle with the proper Grant, Unix still controled file access.
3. After Oracle was granted rights to the directory containing the files, it is now working for both single and batch Blob loads.
Now I just need to write the routine to get the Blobs out of the table and onto the web-page ... always fun.
Thanks everyone who took the time to give this a look, and I hope if anyone was interested, I might have in turn helped you. Best regards,
~Bill
PS: I also learned that
select
f_loadBlob('100')
from dual
is not the correct way to run this function. Running it as:
variable x number;
call f_loadBlob('100') into :x;
... is the correct way for testing. -
SQL Loader - BLOB & LONG COLUMNS
I have a table with 500 rows
200 out of these rows need to be exported from one database to another database.
These rows have long and blob columns with values. The long columns COULD contain all sorts of special characters.
How can this be achieved with SQL Loader? Is there any other tool that can be used to get this done?
Any help will be appreciated.
Hello,
What's your Oracle version? As Satish mentioned, you can use Data Pump for load and unload. Here is an additional way of loading LOB data.
To load data up to 32K you can use VARCHAR2(32656), but it's not a good idea to load a CLOB in that manner because it is very inconsistent; the length can vary, resulting in "string literal too long". So you have two choices now: use either a procedure or an anonymous block to load the CLOB data.
First Method -- I loaded alert.log successfully, and you can imagine how big this file can be (5MB, up to 4GB, in my test case):
CREATE OR REPLACE DIRECTORY DIR AS '/myapth/logs';
DECLARE
clob_data CLOB;
clob_file BFILE;
BEGIN
INSERT INTO t1clob
VALUES (EMPTY_CLOB ())
RETURNING clob_text INTO clob_data;
clob_file := BFILENAME ('DIR', 'wwalert_dss.log');
DBMS_LOB.fileopen (clob_file);
DBMS_LOB.loadfromfile (clob_data,
clob_file,
DBMS_LOB.getlength (clob_file));
DBMS_LOB.fileclose (clob_file);
COMMIT;
END;
Second Method: use of SQL*Loader:
LOAD DATA
INFILE alert.log "STR '|\n'"
REPLACE INTO TABLE t1clob
(
clob_text char(30000000)
)
Hope this helps.
Regards -
Loading BLOB / CLOB data from oracle source to oracle target
Hello Friends,
Source and Taret DB are Oracle 11g.
I have CLOB and BLOB source columns and I am mapping them to the CLOB/BLOB columns of the target,
but the source image (BLOB) did not get loaded, and when I try to view the image it shows 'IMAGE COULD NOT BE DECODED FROM BINARY STREAM'.
For the BLOB column, this is what I am using for the target: NVL(WRK_IMAGE.IMAGE_DATA, EMPTY_BLOB())
For the CLOB column: NVL(WRK_CALLOUT_V.CALLOUT.getClobVal(), EMPTY_CLOB())
For the CLOB I don't have any issue, but for the BLOB I am not able to see the image.
Appreciate if you can let me know where the issue is .
thanks/kumar

Try with some other image. If the issue still exists, open the raw form of the image and check the header.
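Checking the raw header, as suggested, is easy to script. A minimal sketch (Python; the magic-byte table covers only the common formats) that classifies the first bytes of the stored BLOB:

```python
# Leading magic bytes for a few common image formats.
MAGIC = {
    b"\xff\xd8\xff": "jpeg",
    b"\x89PNG\r\n\x1a\n": "png",
    b"GIF8": "gif",
}

def sniff_image(data: bytes) -> str:
    """Return the image type implied by the leading magic bytes,
    or 'unknown' if the header matches none of them."""
    for magic, kind in MAGIC.items():
        if data.startswith(magic):
            return kind
    return "unknown"

# A BLOB that was mangled in transit will usually come back 'unknown':
print(sniff_image(b"\x89PNG\r\n\x1a\n" + b"\x00" * 16))  # png
print(sniff_image(b"garbage bytes"))                      # unknown
```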
-
Weird problem using load() on a BLOB
I've got a weird bug when trying to load BLOBs via PHP. Here is the code:

<?php
// Environment variables so PHP can find Oracle
putenv("TWO_TASK=gonk");            // Name of Oracle Server
putenv("ORACLE_SID=ora10");         // Name of Oracle Instance
putenv("ORACLE_HOME=/local/apps/oracle/InstantClient");
putenv("DYLD_LIBRARY_PATH=/local/apps/oracle/InstantClient");

// Oracle username, password and connection string
define("ORA_CON_UN", "test");       // Oracle Username
define("ORA_CON_PW", "blahblah");   // Oracle Password
define("ORA_CON_DB", "ora10");      // Oracle Instance

// Connect to Oracle
$conn = OCILogon(ORA_CON_UN, ORA_CON_PW, ORA_CON_DB);

// SQL query to execute
$query  = "SELECT TESTNUM, TESTCHAR, TESTBLOB ";
$query .= "FROM TESTING WHERE TESTNUM = 1";
$stmt = OCIParse($conn, $query);
OCIExecute($stmt);
OCIFetchInto($stmt, &$arr, OCI_RETURN_LOBS);

$BlobNum     = OCIResult($stmt, 1);
$BlobChar    = OCIResult($stmt, 2);
$RawBlobBlob = OCIResult($stmt, 3);
$BlobBlob    = $RawBlobBlob->load();

print("BlobNum:" . $BlobNum . "\n");
print("BlobChar:" . $BlobChar . "\n");
print("RawBlobBlob:" . $RawBlobBlob . "\n");
print("BlobBlob:" . $BlobBlob . "\n");
print("***\n");
?>
Basically, the program works fine under PHP 4.0.5 on a Tru64 box (yes, the BLOB is displayed as a mess of strange characters, but that's cool; if I include a header("Content-Type: image/jpeg"); line and output only the BLOB by itself, the jpeg image stored in the BLOB is displayed properly). However, when it comes to running exactly the same program under PHP 4.4.1 on an OS X box, the program does not display the BLOB. The program is definitely loading the BLOB, as I can get the program to display how big (in bytes) the data in the BLOB is, but the OCI function load() is returning nothing when it should be returning the data I have in the BLOB.

Anyone got any ideas on what could be causing this behaviour? It's been driving me nuts for nearly a week!

Cheers,
Rocky

Glad you fixed it.
See http://www.oracle.com/technology/tech/php/htdocs/php_troubleshooting_faq.html#envvars regarding these lines in your script:
putenv("ORACLE_HOME=/local/apps/oracle/InstantClient");
putenv("DYLD_LIBRARY_PATH=/local/apps/oracle/InstantClient");
-- cj -
How can i load file into database from client-side to server-side
I want to upload a file from the client side to the server side; I use the following code to load a BLOB into the database.
If the file is on the server side, it works, but if it is on the client side, it says the system cannot find the file. I think it only searches for the file on the server side, not on the client side.
How can I solve this without uploading the file to the server first and then loading it into the database?
try {
    ResultSet rset = null;
    PreparedStatement pstmt =
        conn.prepareStatement("insert into docs values (?, EMPTY_BLOB())");
    pstmt.setInt(1, docId);
    pstmt.execute();
    pstmt.close();

    // Open the destination blob:
    pstmt = conn.prepareStatement(
        "SELECT content FROM docs WHERE id = ? FOR UPDATE");
    pstmt.setInt(1, docId);
    rset = pstmt.executeQuery();

    BLOB dest_lob = null;
    if (rset.next()) {
        dest_lob = ((OracleResultSet) rset).getBLOB(1);

        // Declare a file handle for the input file
        File binaryFile = new File(fileName);
        FileInputStream istream = new FileInputStream(binaryFile);

        // Create an OutputStream object to write the BLOB as a stream
        OutputStream ostream = dest_lob.getBinaryOutputStream();

        // Create a temporary buffer
        byte[] buffer = new byte[1024];
        int length = 0;

        // Use the read() method to fill the byte array buffer from the
        // file, then use the write() method to push each chunk into the BLOB.
        while ((length = istream.read(buffer)) != -1)
            ostream.write(buffer, 0, length);

        // Close all streams and file handles:
        istream.close();
        ostream.flush();
        ostream.close();
        // dest_lob.close();

        // Commit the transaction:
        conn.commit();
    }
    pstmt.close();
    conn.close();
} catch (SQLException e) {
    // handle / log the database error
} catch (IOException e) {
    // handle / log the file error
}

Hi,
Without some more details of the configuration, it's difficult to know
what's happening here. For example, what do you mean by client side
and server side, and where are you running the upload Java application?
If you always run the application on the database server system, but can't
open the file on a different machine, then it sounds like a file protection
problem that isn't really connected with the database at all. That is to
say, if the new FileInputStream(binaryFile) statement fails, then it's not
really a database problem, but a file protection issue. On the other hand,
I can't explain what's happening if you run the program on the same machine
as the document file (the client machine), but you can't write the data to the
server, assuming the JDBC connection string is set correctly to connect to
the appropriate database server.
If you can provide some more information, we'll try to help.
Simon
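The heart of the Java snippet earlier in this message is a plain chunked stream copy; the same loop, sketched in Python with in-memory streams so it runs anywhere (the buffer size is illustrative):

```python
import io

def copy_stream(src, dst, chunk_size=1024):
    """Copy a readable binary stream to a writable one in fixed-size
    chunks, mirroring the Java read()/write() loop; returns bytes copied."""
    total = 0
    while True:
        buffer = src.read(chunk_size)
        if not buffer:
            break
        dst.write(buffer)
        total += len(buffer)
    return total

src = io.BytesIO(b"\x00\x01" * 1500)   # 3000 bytes of stand-in binary data
dst = io.BytesIO()
print(copy_stream(src, dst))  # 3000
```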
Display BLOB/CLOB image data in OBIEE 11g
Dear Gurus,
I want to display the employee profile picture on a dashboard; the image data is stored as a BLOB/CLOB.
How do I configure and display it in OBIEE?
Regards
JOE

Hi
Thanks for your quick response.
Actually, I'm not using EBS; I'm just using Publisher with OBIEE 11g.
To walk through, step by step, what you wrote in your first reply:
Say I have a table with one BLOB column.
1) First, load the BLOB data into my table.
2) Create a data model from Oracle Publisher.
3) Change the .xdm file with the tag line above; my .xdm file is at the path below:
C:\mw_bi_home\instances\instance1\bifoundation\OracleBIPresentationServicesComponent\coreapplication_obips1\catalog\SampleAppLite\root\shared\zia%2exdm
I don't know whether those are the right steps, because if I re-open the .xdm file after changing it with your tag line, it shows an error.
I would be grateful if you could give me a little explanation of which file I should put the following tag line in:
<fo:instream-foreign-object content-type="image/jpg"><xsl:value-of select=".//IMAGE_ELEMENT"/></fo:instream-foreign-object>
Thanks again; waiting for your feedback.
@zia
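For context on the fo:instream-foreign-object line: it renders image bytes embedded directly in the XML data, normally base64-encoded inside the element that the xsl:value-of selects. A sketch (Python; IMAGE_ELEMENT is the thread's placeholder name) of producing such an element:

```python
import base64

def image_element(data: bytes, name: str = "IMAGE_ELEMENT") -> str:
    """Wrap raw image bytes in an XML element as base64 text, the form
    an instream-foreign-object can pick up via xsl:value-of."""
    b64 = base64.b64encode(data).decode("ascii")
    return f"<{name}>{b64}</{name}>"

# The first three bytes of a JPEG encode to the familiar '/9j/' prefix:
print(image_element(b"\xff\xd8\xff"))  # <IMAGE_ELEMENT>/9j/</IMAGE_ELEMENT>
```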