View to convert table with blob to varchar
Hello,
Is it possible to create a view on a table containing a BLOB column so that the column appears in the view as VARCHAR2?
Thanks for your advice.
In general, no:
BLOBs can hold binary data; VARCHAR2 cannot.
BLOBs can be extremely large; VARCHAR2 is capped at 4000 bytes in SQL (32767 bytes with 12c extended data types).
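That said, if the BLOB is known to contain text and a truncated preview is acceptable, a view can expose its leading bytes. A sketch, with table and column names invented for illustration:

```sql
create or replace view docs_preview as
select id,
       -- take at most the first 2000 bytes (the SQL limit for RAW) and
       -- reinterpret them as text; this assumes the BLOB actually holds
       -- character data in the database character set
       utl_raw.cast_to_varchar2(dbms_lob.substr(doc_blob, 2000, 1)) as doc_text
from   docs;
```

Anything binary, or anything past the prefix, is simply not representable this way.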
John
Similar Messages
-
Error while importing a table with BLOB column
Hi,
I have a table with a BLOB column. When I export the table it exports correctly, but when I import it into a different schema with a different tablespace it throws this error:
IMP-00017: following statement failed with ORACLE error 959:
"CREATE TABLE "CMM_PARTY_DOC" ("PDOC_DOC_ID" VARCHAR2(10), "PDOC_PTY_ID" VAR"
"CHAR2(10), "PDOC_DOCDTL_ID" VARCHAR2(10), "PDOC_DOC_DESC" VARCHAR2(100), "P"
"DOC_DOC_DTL_DESC" VARCHAR2(100), "PDOC_RCVD_YN" VARCHAR2(1), "PDOC_UPLOAD_D"
"ATA" BLOB, "PDOC_UPD_USER" VARCHAR2(10), "PDOC_UPD_DATE" DATE, "PDOC_CRE_US"
"ER" VARCHAR2(10) NOT NULL ENABLE, "PDOC_CRE_DATE" DATE NOT NULL ENABLE) PC"
"TFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 STORAGE(INITIAL 65536 FREELISTS"
" 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "TS_AGIMSAPPOLOLIVE030"
"4" LOGGING NOCOMPRESS LOB ("PDOC_UPLOAD_DATA") STORE AS (TABLESPACE "TS_AG"
"IMSAPPOLOLIVE0304" ENABLE STORAGE IN ROW CHUNK 8192 PCTVERSION 10 NOCACHE L"
"OGGING STORAGE(INITIAL 65536 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEF"
"AULT))"
IMP-00003: ORACLE error 959 encountered
ORA-00959: tablespace 'TS_AGIMSAPPOLOLIVE0304' does not exist
I used the import command as follows:
imp <user/pwd@conn> file=<dmpfile.dmp> fromuser=<fromuser> touser=<touser> log=<logfile.log>
What can I do so that this table imports correctly?
Also, can you tell me whether the BLOB is stored in a different tablespace than the user's default tablespace?
Thanks in advance.
Hello,
You can either:
1) create a tablespace with the same name in the destination database, or
2) get the DDL of the table, change the tablespace name to an existing tablespace in the destination, run that DDL in the destination database, and then run your import with ignore=y, which ignores the "object already exists" errors on CREATE.
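On the side question of where the BLOB lives: the LOB segment can indeed be in a different tablespace from the table (the failing DDL above happens to name TS_AGIMSAPPOLOLIVE0304 for both). A quick check against the data dictionary:

```sql
-- where is each LOB column of the table stored?
select table_name, column_name, segment_name, tablespace_name
from   user_lobs
where  table_name = 'CMM_PARTY_DOC';
```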
Regards,
Vinay -
Problem while importing table with blob datatype
Hi, I have a 9i database on Windows XP and a dev 9i database on AIX 5.2.
When I export normal tables and import them, it succeeds, but when I import a table with a BLOB column it throws a "tablespace <tablespace_name> doesn't exist" error.
Here is what I did:
SQL*Plus: Release 9.2.0.1.0 - Production on Mon Oct 8 14:08:29 2007
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Enter user-name: test@test
Enter password: ****
Connected to:
Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.1.0 - Production
SQL> create table x(photo blob);
Table created.
exporting:
D:\>exp file=x.dmp log=x.log tables='TEST.X'
Export: Release 9.2.0.1.0 - Production on Mon Oct 8 14:09:40 2007
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Username: pavan@test
Password:
Connected to: Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.1.0 - Production
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
About to export specified tables via Conventional Path ...
Current user changed to TEST
. . exporting table X 0 rows exported
Export terminated successfully without warnings.
importing:
D:\>imp file=x.dmp log=ximp.log fromuser='TEST' touser='IBT' tables='X'
Import: Release 9.2.0.1.0 - Production on Mon Oct 8 14:10:42 2007
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Username: system@mch
Password:
Connected to: Oracle9i Enterprise Edition Release 9.2.0.6.0 - 64bit Production
With the Partitioning, Real Application Clusters, OLAP and Oracle Data Mining op
tions
JServer Release 9.2.0.6.0 - Production
Export file created by EXPORT:V09.02.00 via conventional path
Warning: the objects were exported by PAVAN, not by you
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
import server uses US7ASCII character set (possible charset conversion)
. importing TEST's objects into IBT
IMP-00017: following statement failed with ORACLE error 959:
"CREATE TABLE "X" ("PHOTO" BLOB) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS "
"255 STORAGE(INITIAL 65536 FREELISTS 1 FREELIST GROUPS 1) TABLESPACE "TESTTB"
"S" LOGGING NOCOMPRESS LOB ("PHOTO") STORE AS (TABLESPACE "TESTTBS" ENABLE "
"STORAGE IN ROW CHUNK 8192 PCTVERSION 10 NOCACHE STORAGE(INITIAL 65536 FREE"
"LISTS 1 FREELIST GROUPS 1))"
IMP-00003: ORACLE error 959 encountered
ORA-00959: tablespace 'TESTTBS' does not exist
Import terminated successfully with warnings.
Why does it happen for this table alone? Please help.
Thanks in advance.
Here is an excerpt from http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:378418239571
=============================================
Hi Tom,
I have a dump file containing blob datatypes, when i import the dump file in a schema it gives an
error stating that the tablespace for Blob datatype does not exists. My question is how do i import
the dump file in the default tablespace of the importing user.
Followup March 2, 2004 - 7am US/Eastern:
You'll have to precreate the table.
do this:
imp userid=u/p tables=that_table indexfile=that_table.sql
edit that_table.sql, fix up the tablespace references to be whatever you want to be, run that sql.
then imp with ignore=y
for any MULTI-SEGMENT object (iot's with overflows, tables with lobs, partitioned tables, for
example), you have to do this -- imp will not rewrite ALL of the tablespaces in that
multi-tablespace create -- hence you either need to have the same tablespaces in place or precreate
the object with the proper tablespaces.
Only for single tablespace segments will imp rewrite the create to go into the default if the
requested tablespace does not exist.
===================================================
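Applied to the TESTTBS case above, that workflow is roughly (connection strings as in the thread):

```shell
# 1) extract the DDL only, without importing any data
imp system@mch file=x.dmp fromuser=TEST touser=IBT indexfile=x_ddl.sql
# 2) edit x_ddl.sql: change TABLESPACE "TESTTBS" (table and LOB clauses)
#    to a tablespace that exists in the target database
# 3) run the edited DDL in the target, then import data ignoring create errors
imp system@mch file=x.dmp fromuser=TEST touser=IBT tables=X ignore=y
```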
To summarize: precreate the target table when importing multi-segment tables. -
Global Temp table with BLOB causing session crash
Hi,
i have a table as follows:
create global temporary table spg_file_import (
name varchar2 (128) constraint sfi_nam_ck not null,
mime_type varchar2 (128),
doc_size number,
dad_charset varchar2 (128),
last_updated date,
content_type varchar2 (128),
content long raw,
blob_content blob
) on commit delete rows;
This is my 9iAS 'document' table that is used to receive uploaded files, which I modified to be global temporary.
What I want to do is:
a) upload a text file (XML)
b) convert that file to a CLOB
c) store the file in a new permanent table as a CLOB
d) commit (which deletes the temp-table rows, as they are no longer needed)
To test it I have:
CREATE OR REPLACE procedure daz_html
as
begin
htp.p(' <FORM enctype="multipart/form-data" action="daz_fu" method="POST">');
htp.p(' <p>');
htp.p(' File to upload: <INPUT type="file" name="p_file_in"><br>');
htp.p(' <p><INPUT type="submit">');
htp.p(' </form>');
htp.p('</body>');
htp.p('</html>');
end;
CREATE OR REPLACE procedure daz_fu (
p_file_in varchar2
) as
-- BLOB Stream locator
v_raw blob;
v_clob clob;
v_blob_length number;
v_length number;
v_buffer varchar2(32767);
v_pos number := 1;
begin
-- Get xml document from transient 9iAs data store.
select blob_content
into v_raw
from spg_file_import
where name = p_file_in;
-- create temp LOB
dbms_lob.createtemporary(v_clob, false);
-- get BLOB length
v_blob_length := dbms_lob.getlength(v_raw);
loop
-- get length to read. this is set as a max length of 32767
v_length := least((v_blob_length - (v_pos-1)),32767);
-- assign BLOB to a varchar2 buffer in preparation to convert to CLOB
v_buffer := utl_raw.cast_to_varchar2(dbms_lob.substr(v_raw, v_length, v_pos));
-- now write out to the CLOB
dbms_lob.writeappend(v_clob, v_length, v_buffer);
-- increment our position.
v_pos := v_pos + v_length;
-- exit when we are done.
exit when v_pos >= v_blob_length;
end loop;
commit;
htp.p('commit done!');
end;
Now if I upload a small text file (about 5 KB) it works with no problem.
However, if I upload a large text file (say about 1 MB) it crashes Oracle with:
Fri, 26 Jul 2002 11:49:24 GMT
ORA-03113: end-of-file on communication channel
DAD name: spgd1
PROCEDURE : daz_fu
USER : spg
URL : http://www.bracknell.bt.co.uk/pls/spgd1/daz_fu
PARAMETERS :
============
p_file_in:
F22210/Document.txt
This produces a large .trc file; the trace indicates the crash occurred on the "commit;" line:
Current RBA:[0x4eb0.117.10]
*** 2002-07-26 12:35:11.857
ksedmp: internal or fatal error
ORA-00600: internal error code, arguments: [kcblibr_user_found], [4294967295], [2], [12583564], [65], [], [], []
Current SQL statement for this session:
declare
rc__ number;
begin
owa.init_cgi_env(:n__,:nm__,:v__);
htp.HTBUF_LEN := 255;
null;
daz_fu(p_ref_in=>:p_ref_in,p_type_in=>:p_type_in,p_file_in=>:p_file_in);
if (wpg_docload.is_file_download) then
rc__ := 1;
wpg_docload.get_download_file(:doc_info);
null;
commit;
else
rc__ := 0; null;
commit;
owa.get_page(:data__,:ndata__);
end if;
:rc__ := rc__;
end;
----- PL/SQL Call Stack -----
object line object
handle number name
812b1998 42 procedure SPG.DAZ_FU
819dff90 7 anonymous block
----- Call Stack Trace -----
If I replace the temporary table with a non-temporary table of the same structure I get no problems whatsoever. Am I doing something I shouldn't with global temporary tables?
Thanks. This is on Oracle 8.1.7.2.
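As an aside: on later Oracle releases (10g onward, so not the 8.1.7 release in this thread), DBMS_LOB.CONVERTTOCLOB replaces the manual chunk loop entirely and also handles character-set conversion. A sketch against the spg_file_import table above:

```sql
declare
  v_blob         blob;
  v_clob         clob;
  v_dest_offset  integer := 1;
  v_src_offset   integer := 1;
  v_lang_context integer := dbms_lob.default_lang_ctx;
  v_warning      integer;
begin
  select blob_content into v_blob
  from   spg_file_import
  where  name = :p_file_in;

  dbms_lob.createtemporary(v_clob, false);
  -- convert the whole BLOB to a CLOB in one call
  dbms_lob.converttoclob(v_clob, v_blob, dbms_lob.lobmaxsize,
                         v_dest_offset, v_src_offset,
                         dbms_lob.default_csid, v_lang_context, v_warning);
end;
/
```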
-
How to know exact size of table with blob column
I have a table with one BLOB column. I ran this query.
select bytes/1024/1024 from user_segments where segment_name = 'GMSSP_REQUEST_TEMP_FILES';
(user_segments is a data dictionary view)
It gave me 0.125, meaning the table segment is 0.125 MB. I have uploaded 3 files to this table, each of them 5 MB, and afterwards checked the size again, but the result was the same: 0.125.
Can anybody tell me how to find the exact amount of space consumed by the files? I am expecting the size to be about (5+5+5+0.125) MB.
Any help is appreciated.
Thanks,
Akie
http://download.oracle.com/docs/cd/B19306_01/server.102/b14237/statviews_1092.htm#i1581211
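Those links explain the behaviour: LOB data beyond roughly 4000 bytes is stored out of line in its own LOB segment, so querying user_segments for the table segment alone misses it. Summing the table, LOB, and LOB index segments gives the real footprint; a sketch using the table name above:

```sql
select sum(s.bytes)/1024/1024 as total_mb
from   user_segments s
where  s.segment_name = 'GMSSP_REQUEST_TEMP_FILES'
   or  s.segment_name in (select l.segment_name from user_lobs l
                          where  l.table_name = 'GMSSP_REQUEST_TEMP_FILES')
   or  s.segment_name in (select l.index_name from user_lobs l
                          where  l.table_name = 'GMSSP_REQUEST_TEMP_FILES');
```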
http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:266215435203#993030200346552097 -
Creating View for a table with parent child relation in table
I need help creating a view. It is on a base table which is a metadata table.It is usinf parent child relationship. There are four types of objects, Job, Workflow, Dataflow and ABAP dataflow. Job would be the root parent everytime. I have saved all the jobs
of the project in another table TABLE_JOB with column name JOB_NAME. Query should iteratively start from the job and search all the child nodes and then display all child with the job name. Attached are the images of base table data and expected view data
and also the excel sheet with data.Picture 1 is the sample data in base table. Picture 2 is data in the view.
Base Table

PARENT_OBJ  PAREBT_OBJ_TYPE  DESCEN_OBJ  DESCEN_OBJ_TYPE
JOB_A       JOB              WF_1        WORKFLOW
JOB_A       JOB              DF_1        DATAFLOW
WF_1        WORKFLOW         DF_2        DATAFLOW
DF_1        DATAFLOW         ADF_1       ADF
JOB_B       JOB              WF_2        WORKFLOW
JOB_B       JOB              WF_3        WORKFLOW
WF_2        WORKFLOW         DF_3        DATAFLOW
WF_3        WORKFLOW         DF_4        DATAFLOW
DF_4        DATAFLOW         ADF_2       ADF

View

Job_Name  Flow_Name  Flow_Type
Job_A     WF_1       WORKFLOW
Job_A     DF_1       DATAFLOW
Job_A     DF_2       DATAFLOW
Job_A     ADF_1      ADF
Job_B     WF_2       WORKFLOW
Job_B     WF_3       WORKFLOW
Job_B     DF_3       DATAFLOW
Job_B     DF_4       DATAFLOW
Job_B     ADF_2      ADF
I implemented the same in oracle using CONNECT_BY_ROOT and START WITH.
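For reference, the Oracle form mentioned here can be sketched with CONNECT_BY_ROOT (assuming TABLE_JOB(JOB_NAME) as described in the post):

```sql
select connect_by_root parent_obj as job_name,
       descen_obj      as flow_name,
       descen_obj_type as flow_type
from   basetable
start with parent_obj in (select job_name from table_job)
connect by prior descen_obj = parent_obj;
```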
Regards,
Megha
I think what you need is a recursive CTE.
Consider your table below
create table basetable
(PARENT_OBJ varchar(10),
PAREBT_OBJ_TYPE varchar(10),
DESCEN_OBJ varchar(10),DESCEN_OBJ_TYPE varchar(10))
INSERT basetable(PARENT_OBJ,PAREBT_OBJ_TYPE,DESCEN_OBJ,DESCEN_OBJ_TYPE)
VALUES('JOB_A','JOB','WF_1','WORKFLOW'),
('JOB_A','JOB','DF_1','DATAFLOW'),
('WF_1','WORKFLOW','DF_2','DATAFLOW'),
('DF_1','DATAFLOW','ADF_1','ADF'),
('JOB_B','JOB','WF_2','WORKFLOW'),
('JOB_B','JOB','WF_3','WORKFLOW'),
('WF_2','WORKFLOW','DF_3','DATAFLOW'),
('WF_3','WORKFLOW','DF_4','DATAFLOW'),
('DF_4','DATAFLOW','ADF_2','ADF')
i.e. first create a UDF like below to get the hierarchy recursively:
CREATE FUNCTION GetHierarchy
(
@Object varchar(10)
)
RETURNS @RESULTS table
(
PARENT_OBJ varchar(10),
DESCEN_OBJ varchar(10),
DESCEN_OBJ_TYPE varchar(10)
)
AS
BEGIN
;With CTE
AS
(
SELECT PARENT_OBJ,DESCEN_OBJ,DESCEN_OBJ_TYPE
FROM basetable
WHERE PARENT_OBJ = @Object
UNION ALL
SELECT b.PARENT_OBJ,b.DESCEN_OBJ,b.DESCEN_OBJ_TYPE
FROM CTE c
JOIN basetable b
ON b.PARENT_OBJ = c.DESCEN_OBJ
)
INSERT @RESULTS
SELECT @Object,DESCEN_OBJ,DESCEN_OBJ_TYPE
FROM CTE
OPTION (MAXRECURSION 0)
RETURN
END
Then you can invoke it as below
SELECT * FROM dbo.GetHierarchy('JOB_A')
Now you need to use this for every parent obj (start obj) in view
for that create view as below
CREATE VIEW vw_Table
AS
SELECT f.*
FROM (SELECT DISTINCT PARENT_OBJ FROM basetable r
WHERE NOT EXISTS (SELECT 1
FROM basetable WHERE DESCEN_OBJ = r.PARENT_OBJ)
)b
CROSS APPLY dbo.GetHierarchy(b.PARENT_OBJ) f
GO
This will make sure it will give full hieraracy for each start object
Now just call view as below and see the output
SELECT * FROM vw_table
Output
PARENT_OBJ DESCEN_OBJ DESCEN_OBJ_TYPE
JOB_A WF_1 WORKFLOW
JOB_A DF_1 DATAFLOW
JOB_A ADF_1 ADF
JOB_A DF_2 DATAFLOW
JOB_B WF_2 WORKFLOW
JOB_B WF_3 WORKFLOW
JOB_B DF_4 DATAFLOW
JOB_B ADF_2 ADF
JOB_B DF_3 DATAFLOW
Visakh -
Join two source tables and replicat into a target table with BLOB
Hi,
I am working on an integration to source transaction data from legacy application to ESB using GG.
What I need to do is join two source tables (to de-normalize the area_id) to form the transaction detail, then transform by concatenate the transaction detail fields into a value only CSV, replicate it on the target ESB IN_DATA table's BLOB content field.
Based on what I had researched, lookup by join two source tables require SQLEXEC, which doesn't support BLOB.
What alternatives are there and what GG recommend in such use case?
Any helpful advice is much appreciated.
thanks,
Xiaocun
Xiaocun,
Not sure what your data looks like, but it's possible that the comma-separated value (CSV) requirement may be solved by something like this in your MAP statement:
colmap (usedefaults,
my_blob = @STRCAT (col02, ",", col03, ",", col04));
Since this is not 1:1 you'll be using a sourcedefs file, which is nice because it will do the datatype conversion for you under the covers (also a nice trick when migrating long raws to blobs). So col02 can be varchar2, col03 a number, and col04 a clob and they'll convert in real-time.
Mapping two tables to one is simple enough with two MAP statements; the harder challenge is joining operations from separate transactions, because OGG is operation-based and doesn't work on aggregates. It's possible you could end up using a combination of built-in parameters and functions with SQLEXEC and SQL/PLSQL for more complicated scenarios, all depending on the design of the target table. But you have several scenarios to address.
For example, is the target table really a history table or are you actually going to delete from it? If just the child is deleted but you don't want to delete the whole row yet, you may want to use NOCOMPRESSDELETES & UPDATEDELETES and COLMAP a new flag column to denote it was deleted. It's likely that the insert on the child may really mean an update to the target (see UPDATEINSERTS).
If you need to update the LOB by appending or prepending new data then that's going to require some custom work, staging tables and a looping script, or a user exit.
Some parameters you may want to become familiar with if not already:
COLS | COLSEXCEPT
COLMAP
OVERRIDEDUPS
INSERTDELETES
INSERTMISSINGUPDATES
INSERTUPDATES
GETDELETES | IGNOREDELETES
GETINSERTS | IGNOREINSERTS
GETUPDATES | IGNOREUPDATES
Good luck,
-joe -
Maintenance View for custom table with foreign key relationship
Hi Folks,
I have created a custom table with foreign key relationships to other check tables. I want to create a maintenance view / table maintenance generator. What do I need to take care of for the foreign-key-related fields while creating the maintenance view / table maintenance generator?
Regards,
santosh
Hi,
You do not have to do anything explicitly for the foreign key relationships in the table maintenance generator.
Create the table maintenance generator via SE11 and it will take care of all the foreign key checks by itself.
Ankur Parab -
Table with BLOB column re-partitioning
Currently I have a table with a BLOB column Z, partitioned on column X. I want to re-partition the table on column Y. Is there a quick way of doing this?
INSERT INTO ... SELECT * has limitations on the BLOB size, I guess.
Can someone explain this?
Edited by: user10229350 on Mar 1, 2010 7:36 AM
I am using hash partitioning, and it looks like SPLIT PARTITION doesn't apply to hash partitions:
"The SPLIT PARTITION clause of the ALTER TABLE or ALTER INDEX statement is used to redistribute the contents of a partition into two new partitions. Consider doing this when a partition becomes too large and causes backup, recovery, or maintenance operations to take a long time to complete. You can also use the SPLIT PARTITION clause to redistribute the I/O load.
This clause cannot be used for hash partitions or subpartitions." -
How to read/write a binary file from/to a table with BLOB column
I have create a table with a column of data type BLOB.
I can read/write an IMAGE file from/to the column of the table using:
READ_IMAGE_FILE
WRITE_IMAGE_FILE
How can I do the same for other binary files, e.g. aaaa.zip?
There is a package procedure, dbms_lob.loadblobfromfile, to read a BLOB from a file.
http://download-east.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_lob.htm#sthref3583
To write a BLOB to file you can use a Java procedure (pre Oracle 9i R2) or utl_file.put_raw (there is no dbms_lob.writelobtofile).
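A minimal sketch of the utl_file.put_raw approach; the directory object, table, and column names are made up for illustration:

```sql
declare
  v_blob  blob;
  v_file  utl_file.file_type;
  v_pos   integer := 1;
  v_amt   binary_integer := 32767;
  v_len   integer;
  v_chunk raw(32767);
begin
  select doc into v_blob from my_docs where id = 1;
  v_len  := dbms_lob.getlength(v_blob);
  -- 'wb' opens the file in binary write mode; EXPORT_DIR is a directory object
  v_file := utl_file.fopen('EXPORT_DIR', 'aaaa.zip', 'wb', 32767);
  while v_pos <= v_len loop
    dbms_lob.read(v_blob, v_amt, v_pos, v_chunk);
    utl_file.put_raw(v_file, v_chunk, true);
    v_pos := v_pos + v_amt;
  end loop;
  utl_file.fclose(v_file);
end;
/
```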
http://asktom.oracle.com/pls/ask/f?p=4950:8:1559124855641433424::NO::F4950_P8_DISPLAYID,F4950_P8_CRITERIA:6379798216275 -
Problem importing a table with blob's
hi all, I'm facing the following situation.
Source DB : 10.2.0.3 (client's DB)
Destination DB (mine): 10.2.0.4
I've a dump file (traditional) of a particular schema.
I'm running import (imp) to import on my DB.
It runs fine until it reaches one particular table. This table has 6 columns, 3 of them BLOBs.
This table has 260000 rows (checked with export log).
When the import reaches row 152352 it stops loading data, but the import is still running.
what can I do to get more information from this situation in order to solve this problem?
Any suggestion will be appreciated!
Thanks in advance.
Please identify the source and target OS versions. Are there any useful messages in the alert.log? How long did the export take? A rule of thumb is that import takes twice as long as export. Have you tried expdp/impdp instead? Also see the following:
How To Diagnose And Troubleshoot Import Or Datapump Import Hung Scenarios (Doc ID 795034.1)
How To Find The Cause of a Hanging Import Session (Doc ID 184842.1)
Import is Slow or Hangs (Doc ID 1037231.6)
Export and Import of Table with LOB Columns (like CLOB and BLOB) has Slow Performance (Doc ID 281461.1)
HTH
Srini -
Sample insert into table with BLOB column
This is my first opportunity to work with BLOB columns. I was wondering if anyone has sample code that shows loading a BLOB and inserting it into a table with a BLOB column.
I have to produce a report (including crlf) and place it into a blob column. The user will download the report at a later time.
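One direct PL/SQL approach for this: build the report as a CLOB with CR/LF line endings and convert it into the BLOB column. The table and column names here are made up for illustration:

```sql
declare
  v_report       clob := 'REPORT LINE 1' || chr(13) || chr(10) ||
                         'REPORT LINE 2' || chr(13) || chr(10);
  v_blob         blob;
  v_dest_offset  integer := 1;
  v_src_offset   integer := 1;
  v_lang_context integer := dbms_lob.default_lang_ctx;
  v_warning      integer;
begin
  -- create the row with an empty BLOB and grab its locator
  insert into report_table (id, report) values (1, empty_blob())
  returning report into v_blob;
  -- convert the character report into the BLOB in the database charset
  dbms_lob.converttoblob(v_blob, v_report, dbms_lob.lobmaxsize,
                         v_dest_offset, v_src_offset,
                         dbms_lob.default_csid, v_lang_context, v_warning);
  commit;
end;
/
```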
Any suggestions / code samples are greatly appreciated!
You can enable string binding in TopLink:
login.useStringBinding(int size);
or you could enable binding in general,
login.bindAllParameters(); -
Hello All :
I have been trying to figure out a simple piece of code I can use in my JSP to upload a file (of any format) into an Oracle table with a BLOB column. I have gone through a lot of existing forums but could not find a simple example (one that does not use a Servlet, for example).
Thanks a lot for your help!
Hi.
First of all, to put a file into Oracle you need the file's contents as a byte[]; use, for example, a FileInputStream.
After you get the byte array, try code like this:
try {
    Connection conn = myGetDBConnection();
    PreparedStatement pstmt = conn.prepareStatement("INSERT INTO table1 (content) VALUES(?)");
    byte[] content = myGetFileAsBytes();
    if (content != null) {
        // JDBC parameter indexes are 1-based
        pstmt.setBinaryStream(1, new ByteArrayInputStream(content), content.length);
        pstmt.executeUpdate();
    }
    pstmt.close();
    conn.close();
} catch (Exception ex) {
    ex.printStackTrace();
}
Or, instead of ByteArrayInputStream, try pstmt.setBinaryStream(1, new FileInputStream(yourFile), getFileSize()). Hope this will help...
regards, Victor Letunovsky -
"Error: Document not found (WWC-46000)" with a form based on a table with blob item
Hi all,
I have to manage a table-based form. One of the fields of the table is BLOB and the corrisponding field of the form is HIDDEN.
When I push the INSERT button all works well but when I try to UPDATE I get "Error: Document not found. (WWC-46000)".
Have you some suggestions ?
Thanks,
Antonino
p.s. Oracle Portal 3.0.7.6.2 / NT
Sorry, I think I did not explain my problem well.
Imagine this simple table:
key number;
description varchar2(50);
image blob;
I need to make a form that contains the corresponding "key" field and the "description" field but not the "image" one. I don't want to allow the end user to upload images!
When I insert a row the form works well and in the "image" field of the table an empty blob or null (now I don't remember) is stored: that's ok.
Now imagine I want to change the value of the "description" field. I submit a query for the right key, type a new value for the description, and finally push the UPDATE button, but an error occurs.
I think this error is related with the Blob item of the table but I'm not sure.
Thanks again,
Antonino
Originally posted by Dmitry Nonkin:
"Antonino,
If I understood the problem correctly: the form's item type for a BLOB column used to upload content cannot be hidden; that's why we have a special item type, FileUpload (binary).
Thanks,
Dmitry"
-
Error displaying a jpg file loaded into a table with blob field
This may not be the correct forum for this question; if it isn't, could someone direct me to the correct one?
I have created a table with a BLOB column into which I have loaded a JPG image. That appeared to work correctly, but when I try to display the image in Internet Explorer it tells me it doesn't recognize the file type. Below are the table create, load, and display PL/SQL code. Can anyone tell me what I am doing wrong? For the MIME header I used owa_util.mime_header('images/jpg') because my image is a JPG file.
The database is 10g.
-- Create table
create table PHOTOS (
IMAGEID NUMBER(10),
IMAGE BLOB,
IMAGE_NAME VARCHAR2(50)
);
load image
CREATE OR REPLACE PROCEDURE load_file ( p_id number, p_photo_name in varchar2) IS
src_file BFILE;
dst_file BLOB;
lgh_file BINARY_INTEGER;
BEGIN
src_file := bfilename('SNAPUNCH', p_photo_name);
-- insert a NULL record to lock
INSERT INTO photos (imageid, image_name, image)
VALUES (p_id , p_photo_name, EMPTY_BLOB())
RETURNING image INTO dst_file;
-- lock record
SELECT image
INTO dst_file
FROM photos
WHERE imageid = p_id AND image_name = p_photo_name
FOR UPDATE;
-- open the file
dbms_lob.fileopen(src_file, dbms_lob.file_readonly);
-- determine length
lgh_file := dbms_lob.getlength(src_file);
-- read the file
dbms_lob.loadfromfile(dst_file, src_file, lgh_file);
-- update the blob field
UPDATE photos
SET image = dst_file
WHERE imageid = p_id
AND image_name = p_photo_name;
-- close file
dbms_lob.fileclose(src_file);
END load_file;
display image
PROCEDURE display_image(p_id NUMBER) IS
Photo BLOB;
v_amt NUMBER DEFAULT 4096;
v_off NUMBER DEFAULT 1;
v_raw RAW(4096);
BEGIN
-- Get the blob image
SELECT image
INTO Photo
FROM PHOTOS
WHERE IMAGEID = p_id;
owa_util.mime_header('images/jpg');
BEGIN
LOOP
-- Read the BLOB
dbms_lob.READ(Photo, v_amt, v_off, v_raw);
-- Display image
htp.prn(utl_raw.cast_to_varchar2(v_raw));
v_off := v_off + v_amt;
v_amt := 4096;
END LOOP;
dbms_lob.CLOSE(Photo);
EXCEPTION
WHEN NO_DATA_FOUND THEN
NULL;
END;
END;
The URL I enter is: http://webdev:7777/tisinfo/tis.tiss0011.Display_Image?p_id=1
Just a little more information: when I use owa_util.mime_header('image/jpeg') I can't display the file; it just shows up as a red X. When I use owa_util.mime_header('image/jpg') it displays the file, but as text:
¿¿¿¿JFIF¿¿-Intel(R) JPEG Library, version [2.0.16.48]¿¿C
This is the way I would expect it to look if I opened it with Notepad or another application that doesn't recognize JPG files. Can anyone tell me what I am doing wrong? Thanks.
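For what it's worth, the usual way to avoid hand-chunking binary output through htp.prn in mod_plsql is wpg_docload.download_file, which streams the BLOB intact. A sketch using the procedure and table names from this thread, with image/jpeg (the registered MIME type for JPEG):

```sql
create or replace procedure display_image(p_id number) is
  v_photo blob;
begin
  select image into v_photo from photos where imageid = p_id;
  -- emit the MIME header but keep the header section open
  owa_util.mime_header('image/jpeg', false);
  htp.p('Content-Length: ' || dbms_lob.getlength(v_photo));
  owa_util.http_header_close;
  -- let the gateway stream the BLOB without any RAW-to-VARCHAR2 casting
  wpg_docload.download_file(v_photo);
end;
/
```

The raw-text output above is the classic symptom of binary data passing through a VARCHAR2 cast; streaming the locator avoids it.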