Help with my PL/SQL to write a BLOB to a file

I found the code here: http://www.oracle-base.com/articles/9i/ExportBlob9i.php but I get the following errors when compiling it:
CREATE OR REPLACE
procedure sp_readBlob
declare
  l_file      UTL_FILE.FILE_TYPE;
  l_buffer    RAW(32767);
  l_amount    BINARY_INTEGER := 32767;
  l_pos       INTEGER := 1;
  l_blob      BLOB;
  l_blob_len  INTEGER;
BEGIN
  -- Get LOB locator
  SELECT fileupload
  INTO   l_blob
  FROM   rsmaster
  WHERE  rownum = 1;
  l_blob_len := DBMS_LOB.getlength(l_blob);
  -- Open the destination file.
  l_file := UTL_FILE.fopen('BLOBS', 'MyImage.gif', 'w', 32767);
  -- Read chunks of the BLOB and write them to the file
  -- until complete.
  WHILE l_pos < l_blob_len LOOP
    DBMS_LOB.read(l_blob, l_amount, l_pos, l_buffer);
    UTL_FILE.put_raw(l_file, l_buffer, TRUE);
    l_pos := l_pos + l_amount;
  END LOOP;
  -- Close the file.
  UTL_FILE.fclose(l_file);
EXCEPTION
  WHEN OTHERS THEN
    -- Close the file if something goes wrong.
    IF UTL_FILE.is_open(l_file) THEN
      UTL_FILE.fclose(l_file);
    END IF;
    RAISE;
END;
Error(2,1): PLS-00103: Encountered the symbol "DECLARE" when expecting one of the following: ( ; is with authid as cluster compress order using compiled wrapped external deterministic parallel_enable pipelined
Error(40): PLS-00103: Encountered the symbol "end-of-file" when expecting one of the following: begin case declare end exception exit for goto if loop mod null pragma raise return select update while with <an identifier> <a double-quoted delimited-identifier> <a bind variable> << close current delete fetch lock insert open rollback savepoint set sql execute commit forall merge pipe
thanks in advance!

You haven't adapted the quoted code correctly.
That example is an anonymous PL/SQL block, so it uses an explicit DECLARE to introduce the declaration section.
You are writing a named PL/SQL procedure, which does not take an explicit DECLARE: the declarations go between IS (or AS) and BEGIN.
I think you must also have misread the 'CREATE OR REPLACE DIRECTORY' line above it in the article.
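For reference, a minimal corrected version could look like the following. It keeps your table and column (rsmaster.fileupload) and assumes the BLOBS directory object from the article has already been created with CREATE OR REPLACE DIRECTORY; the structural fix is replacing DECLARE with IS so the declarations belong to the named procedure, and the file is opened in 'wb' (write byte) mode, which is safer for PUT_RAW output.

CREATE OR REPLACE PROCEDURE sp_readBlob IS
  l_file      UTL_FILE.FILE_TYPE;
  l_buffer    RAW(32767);
  l_amount    BINARY_INTEGER := 32767;
  l_pos       INTEGER := 1;
  l_blob      BLOB;
  l_blob_len  INTEGER;
BEGIN
  -- Get the LOB locator.
  SELECT fileupload
  INTO   l_blob
  FROM   rsmaster
  WHERE  rownum = 1;
  l_blob_len := DBMS_LOB.getlength(l_blob);
  -- Open the destination file in binary write mode.
  l_file := UTL_FILE.fopen('BLOBS', 'MyImage.gif', 'wb', 32767);
  -- Read the BLOB in chunks and write each chunk to the file.
  WHILE l_pos < l_blob_len LOOP
    DBMS_LOB.read(l_blob, l_amount, l_pos, l_buffer);
    UTL_FILE.put_raw(l_file, l_buffer, TRUE);
    l_pos := l_pos + l_amount;
  END LOOP;
  UTL_FILE.fclose(l_file);
EXCEPTION
  WHEN OTHERS THEN
    -- Close the file if something goes wrong, then re-raise.
    IF UTL_FILE.is_open(l_file) THEN
      UTL_FILE.fclose(l_file);
    END IF;
    RAISE;
END sp_readBlob;
/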

Similar Messages

  • Help to convert SQL from CASE to DECODE in 11g

    11gr2, Windows.
    Can somebody help me rewrite the SQL below from a CASE statement to DECODE?
    SELECT INVOICE_ID,
           SUM(TOTAL_EXCL_VAT),
           LEVEL2 AS descr,
           SUM(CASE WHEN INSTR(LEVEL2,'Internet') > 0 THEN SUM_QUANTITY ELSE 0 END) AS KB   --- >> Here
    FROM   rator_cdr.INVOICE_DETAIL_LINE
    WHERE  INVOICE_ID = '000000000000000000' AND CALLER = '0000000000' AND LEVEL1 = 'Gesprekskosten'
    GROUP BY LEVEL2, INVOICE_ID;
    Thanks

    There is no difference in performance. CASE is a standard syntax that is relatively easy for any developer to read and follow. DECODE is an Oracle-specific function that is much less flexible (you can only use equality conditions, for example) and generally results in more cryptic code.
    Justin
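    For what it's worth, if a DECODE version is still wanted, one possible rewrite of the flagged expression (an untested sketch that uses SIGN so the INSTR result collapses to a value DECODE can compare) is:
    SELECT INVOICE_ID,
           SUM(TOTAL_EXCL_VAT),
           LEVEL2 AS descr,
           SUM(DECODE(SIGN(INSTR(LEVEL2, 'Internet')), 1, SUM_QUANTITY, 0)) AS KB
    FROM   rator_cdr.INVOICE_DETAIL_LINE
    WHERE  INVOICE_ID = '000000000000000000' AND CALLER = '0000000000' AND LEVEL1 = 'Gesprekskosten'
    GROUP BY LEVEL2, INVOICE_ID;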

  • Help! Moved a project from laptop to desktop and files are missing

    hi,
    sorry I am a premiere elements 7 newbie.
    I started my project on my laptop, but unfortunately it didn't have enough space and couldn't handle the data flow. I bought a desktop and made the mistake of only copying the "Premiere Elements project" document itself.
    Then I organised the new PC and so on, opened the project, but all my files (audio, pictures and videos) were missing.
    Can someone tell me what to do? I don't want to redo it all.
    Please note that I saved the files (songs, pictures and AVI) in different places on the laptop.
    Thank you so much,
    Sabine from Germany

    Sabine,
    The Project file (.PREL) is just an XML database and contains none of the media, only links to where that media was, when it was Imported into PrE, plus instructions on what you did, as you edited in PrE.
    Now, moving either the Project or the media (your Assets) breaks those links. PrE goes looking, per the Path in the .PREL file. Unless the files are in that exact Path, it will throw up an error message, basically saying, "Help, I cannot find these files." One has several choices at that point. One is to help PrE find the files where they are now. Depending on the folder structure, if you help it find the first one that it's asking for, it will automatically link to all others in that folder that it needs. If it then asks for another file, the chances are great that that file will be in a sub-folder. After that one is linked, it will gather any that it needs in that sub-folder. This is repeated until all Assets have been found and linked. It could be that you find the first one and all get linked, or that you have to search around the sub-folders. This part just depends on the folder structure.
    Another choice is to choose Media Off-line. What this does is allow one to edit with any Assets that are found, and just show all others as Off-line. Here comes a big warning: if, after choosing Media Off-line (the actual choice for all unfound media would be "Off-line All"), you choose to Save, and your original Project file is overwritten, you will then have to manually locate all Assets to get them relinked. If I were going to do any off-line editing, I'd always do a Save_As, or a Save_As_a_Copy, and NOT a Save. As mentioned, if you do a Save, then you have to go in and manually relink all Assets, unlike the scenario in the second paragraph.
    Now, if your media is shown as off-line and you do not have an original copy of your .PREL Project file, you'll need to go to the Project Panel. Your Assets will have a different icon than before, and the media will show as off-line. Rt-click on each one and choose Relink Media, then navigate to the new location for that file and choose it to relink. I believe that this is the same in PrE, but that is the list of steps for PrPro. One must do this for each Asset that you wish to bring back online. This method takes a lot of work, especially for large Projects with lots of Assets.
    As Kodebuster says, copying your Assets' entire folder structure over to the second machine and then helping PrE find the first file (and any additional ones that are in sub-folders) is the best way to do this.
    Again, remember that the .PREL file contains no media, just links to where PrE last saw those files.
    Good luck,
    Hunt

  • Need help saving image to SQL as BLOB, then sending as email attachment

    Hi folks,
    These forums have gotten me most of the way through this particular project, but I'm finally stumped...
    First, I'll describe exactly what I'm trying to do: the user chooses an image file client side and the server-side code then converts it to binary data and saves it to SQL. Then as a separate step, an email will be sent out with that image file loaded from the DB as an attachment.
    So I've actually got most of this working. I'm using the O'Reilly MultipartParser class to pull the file input stream from the Request and insert it into SQL BLOB column ("image" datatype). I can successfully send Text attachments by simply converting the binary data back to a string and then using a subclass of DataSource (ByteArrayDataSource) to feed it to the mail object.
    After getting the text documents to work, I then moved on to trying to send images.
    I tried to send images using essentially the same technique as sending text. The image file gets saved to SQL the same way, and I then convert the binary data back to a string and attach it. The only difference is that I set the content-type to "image/gif".
    I didn't really expect this to work, however, I am able to "Preview" very simple Gifs (in XP) that I've attached. I cannot "Open" them, though, and I cannot even "Preview" more complicated Gifs.
    So, on to my real questions. I didn't really expect that you could simply output binary image data as text, slap a ".gif" on it, and expect it to work. Am I right about that? XP's "preview" option seems to read it okay, but I really need to be able to "open" these files.
    So basically I need to know what to DO with that binary data in SQL in order to successfully attach it as a true image file. Is this an encoding issue? MIME type?
    I didn't want to paste all the code but I'll paste the snippet that deals with the attachment section. If you'd like to see any other parts, please just let me know...
    MyBlobObject v_blob = new MyBlobObject(m_cp);  // the getBlobStr() method of this object simply grabs the data from SQL as a String
    DataSource source = new ByteArrayDataSource(v_blob.getBlobStr(),"application/octet-stream");
    BodyPart messageBodyPart = new MimeBodyPart();
    messageBodyPart.setDataHandler(new DataHandler(source));
    messageBodyPart.setHeader("content-type","image/gif");
    messageBodyPart.setFileName(v_fileName);
    ...Another interesting thing I'm seeing is that when I view the small Gif as text, I see this:
    GIF89a � !� , D ;
    While the file I'm getting from SQL as an attachment looks like this:
    GIF89a ? !? , D ;
    Obviously, the question marks are unknown, but I'm not quite sure why. I outputted the char values as integers and found that the first question mark is 65408 and the second one's value is 65529. Obviously something is getting changed somewhere in the process, but I'm not quite sure how and where.
    anyway, thanks in advance for any advice!

    Nevermind! Of course, I figured it all out after writing that long post.
    I had tried using a byte array instead of the String in the ByteArrayDataSource and it hadn't worked. However once I cleaned up the code a bit I got it working.
    This link has it all...
    http://www.magelang.com/faq/view.jsp?EID=498439
    thanks anyway!

  • Please help to build a single SQL statement from the given expression, with a conditional WHERE clause

    if @rollno is not null then
      Select top 1 studentid from student where rollno = :@rollno
    else
      if @regno is not null then
        Select top 1 studentid from student where regno = :@regno
    Please help me to create the above condition as a single SQL statement. I will pass the two arguments into the SQL.
    With Thanks
    Pol
    polachan

    Select top 1 studentid from student
    where (rollno =:@rollno or :@rollno is null)
    and (regno =:@regno or :@regno is null)
    The above expression will work in SQL Server.
    I'm not sure you're using SQL Server, though, as syntax like :@regno is not valid T-SQL,
    so try the above and, if it doesn't work, please ask in the relevant forums in case you're using a different RDBMS.
    Visakh
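    If the database is actually Oracle (as in most of the threads on this page), a comparable single statement might look like this untested sketch, with ROWNUM in place of TOP and ordinary bind variables:
    SELECT studentid
    FROM   student
    WHERE  (rollno = :rollno OR :rollno IS NULL)
    AND    (regno  = :regno  OR :regno  IS NULL)
    AND    ROWNUM = 1;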

  • Function conversion from BLOB to CLOB

    Please, I need help to make a function that converts from BLOB to CLOB.
    I have a function, but it does not work when the BLOB column is very large.
    =========================================================================================
    CREATE OR REPLACE PACKAGE BODY "SHB_PACKAGE"
    AS
      FUNCTION blob2clob (l_blob BLOB)
        RETURN CLOB
      IS
        l_clob         CLOB;
        l_src_offset   NUMBER;
        l_dest_offset  NUMBER;
        l_blob_csid    NUMBER := DBMS_LOB.default_csid;
        v_lang_context NUMBER := DBMS_LOB.default_lang_ctx;
        l_warning      NUMBER;
        l_amount       NUMBER := DBMS_LOB.lobmaxsize;
      BEGIN
        IF l_blob IS NULL THEN
          RETURN NULL;
        ELSE
          DBMS_LOB.createtemporary (l_clob, TRUE);
          l_src_offset  := 1;
          l_dest_offset := 1;
          l_amount      := NVL(DBMS_LOB.getlength(l_blob), 0);
          -- Parameter order: dest_lob, src_blob, amount, dest_offset,
          -- src_offset, blob_csid, lang_context, warning.
          DBMS_LOB.converttoclob (l_clob,
                                  l_blob,
                                  l_amount,
                                  l_dest_offset,
                                  l_src_offset,
                                  l_blob_csid,
                                  v_lang_context,
                                  l_warning);
          RETURN l_clob;
        END IF;
      END;
    END SHB_package;

    Thanks, the function is working fine.
    The issue was due to using it under Informatica and a bug in Informatica.
    Edited by: user8929623 on Jan 17, 2010 1:39 AM
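    For anyone reusing the function, a hypothetical call (the table and column names here are invented for illustration) is simply:
    SELECT shb_package.blob2clob(doc_blob) AS doc_text
    FROM   my_documents
    WHERE  doc_id = 1;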

  • File name after download from BLOB

    dear experts,
    I'm a newbie in programming, I just started this month. My problem is that after I download from a BLOB (Oracle), the file name is the same as the Java class that I use to download it. How can I make the file name be the real file name rather than the Java class name?
    For your information, I use the O'Reilly package to upload, and for download I use other source code I took from an internet example.

    Some of the code in the download Java class (I use a servlet):
    public void doPost(....
    ServletOutputStream out = response.getOutputStream();
         response.setContentType(dm_file1contenttype);
         out.write(getBlob(id));
         out.flush();
         out.close();
    public byte[] getBlob(String id) {
    while (rs.next()) {
              blob = rs.getBlob(1);
              bytes = blob.getBytes(1, (int) (blob.length()));
              conn.close();

  • PL SQL block - Check for empty file

    Hello,
    I need help writing PL/SQL code to check whether a file has 0 records (file size = zero KB). In a preceding PL/SQL block, I perform a SELECT and write the result to a file. If that file is empty/has no records, then I would like to stop the process from continuing on to my next PL/SQL block, which performs an update.
    Summary:
    1) Create file from SELECT
    2) Check file for zero data, if true, trigger error, if false, continue to next pl sql block.
    3) Run update on records if data exists in the file.
    Thanks in advance!

    Thanks John,
    You've been of great help so far and I really appreciate it. I'm so close to getting this; I just need help with how to raise an error when the file has no data. I will be running this as an Oracle Apps concurrent request, and I'd like the job to end in error when no data exists for the file. In Unix, I can use "EXIT 1" and it signals the concurrent manager to end the job with an error status. How can I do something similar using PL/SQL? Here is my code so far, with the missing piece. Thanks!
    DECLARE
      vFile   UTL_FILE.FILE_TYPE;
      vrecs   BOOLEAN := FALSE;
      vnodata EXCEPTION;
    BEGIN
      vFile := UTL_FILE.FOPEN('/devlop', 'norecs.txt', 'w', 32767);
      FOR x IN (SELECT node_name AS txt
                FROM   applsys.fnd_nodes
                WHERE  node_name = 'blahblahblah')
      LOOP
        vrecs := TRUE;
        UTL_FILE.PUT_LINE(vFile, x.txt);
      END LOOP;
      UTL_FILE.FCLOSE(vFile);
      IF NOT vrecs THEN
        RAISE vnodata;
      END IF;
    EXCEPTION
      WHEN vnodata THEN
        *<what can I put here to cause the pl sql block to error???>*
    END;
    /
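    One option (a sketch, not from the original thread) is to turn the user-defined exception into an Oracle error with RAISE_APPLICATION_ERROR, so the block itself fails; an unhandled failure normally makes the concurrent request end in error. If the code runs as a PL/SQL stored-procedure concurrent program with the standard errbuf/retcode parameters, setting retcode to 2 is the other common way to signal an error. A self-contained illustration of the first approach:
    DECLARE
      vnodata EXCEPTION;
      vrecs   BOOLEAN := FALSE;  -- would be set to TRUE inside the cursor loop
    BEGIN
      IF NOT vrecs THEN
        RAISE vnodata;
      END IF;
    EXCEPTION
      WHEN vnodata THEN
        -- Re-raise as a user-defined Oracle error; the request then errors out.
        RAISE_APPLICATION_ERROR(-20001, 'No records were written to the output file');
    END;
    /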

  • Reading BLOB in Native SQL from ABAP program

    Hello,
    I'm trying to read content of a BLOB field from a table with Native SQL in ABAP like this:
    DATA: l_bytes type xstring.
    EXEC SQL.
      SELECT bytes INTO :l_bytes FROM tablename
    ENDEXEC.
    But when I'm using xstring it returns only 32768 bytes. When using type x length 65000 for l_bytes it returns 65000 bytes, but x is limited to 65535 bytes only. So why does it return only 32768 bytes in direct SQL? For DB2 I found note 610342, where you need to add \lob to the statement; I tried it with Oracle but it doesn't work.
    DATA: CLOB_VAR TYPE STRING.
    DATA: BLOB_VAR TYPE XSTRING.
    EXEC SQL.     
      SELECT FCLOB, FBLOB FROM ZZTAB        INTO :CLOB_VAR\lob, :BLOB_VAR\lob
    ENDEXEC.
    Regards
    Markus

    Hi Markus,
    you have to read it in chunks (remember a BLOB can be up to 4 GB!).
    I assume you store the byte stream in the field (no BFILE pointer to a file).
    Here is some pseudocode so that you get the picture.
    I would recommend that you read the Oracle manual (which DB version?):
    9i
    http://download.oracle.com/docs/cd/B10501_01/appdev.920/a96612/d_lob2.htm#1008611
    10g
    http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_lob.htm#ARPLS600
    chunksize = 32768 .  " or use 65000: the amount of the BLOB part you read per round trip
    offset = 1 .
    * get the row of interest and retrieve the BLOB's length
    exec sql.
      select dbms_lob.getlength(your_blob) into :blob_length
      from blob_table where my_primary_key = 1
    endexec.
    * calculate how many times to read from the BLOB until the end
    * (the adjustment for an uneven fraction is left to you)
    ntimes = blob_length / chunksize .
    do ntimes times.
    * loop and get the chunks into your variable:
    * read chunksize bytes starting at offset
      exec sql.
        select dbms_lob.substr(your_blob, :chunksize, :offset) into :xblobvar
        from blob_table where my_primary_key = 1
      endexec.
    * process :xblobvar (append it to your target variable, for example)
      offset = offset + chunksize .
    enddo.
    bye
    yk

  • Sql loader - BLOB

    I have used the OMWB "generate SQL Loader script" option and received the SQL*Loader error below.
    The previous attempt to use OMWB online loading generated garbage data: the picture did not match the person ID.
    Table in Sql Server..................
    CREATE TABLE [nilesh] (
         [LargeObjectID] [int] NOT NULL ,
         [LargeObject] [image] NULL ,
         [ContentType] [varchar] (40) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectName] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectExtension] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectDescription] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectSize] [int] NULL ,
         [VersionControl] [bit] NULL ,
         [WhenLargeObjectLocked] [datetime] NULL ,
         [WhoLargeObjectLocked] [char] (11) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
         [LargeObjectTimeStamp] [timestamp] NOT NULL ,
         [LargeObjectOID] [uniqueidentifier] NOT NULL
    ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
    GO
    Table in Oracle..............
    CREATE TABLE LARGEOBJECT
    (
      LARGEOBJECTID          NUMBER(10) NOT NULL,
      LARGEOBJECT            BLOB,
      CONTENTTYPE            VARCHAR2(40 BYTE),
      LARGEOBJECTNAME        VARCHAR2(255 BYTE),
      LARGEOBJECTEXTENSION   VARCHAR2(10 BYTE),
      LARGEOBJECTDESCRIPTION VARCHAR2(255 BYTE),
      LARGEOBJECTSIZE        NUMBER(10),
      VERSIONCONTROL         NUMBER(1),
      WHENLARGEOBJECTLOCKED  DATE,
      WHOLARGEOBJECTLOCKED   CHAR(11 BYTE),
      LARGEOBJECTTIMESTAMP   NUMBER(8) NOT NULL,
      LARGEOBJECTOID         RAW(16) NOT NULL
    )
    TABLESPACE USERS
    PCTUSED 0
    PCTFREE 10
    INITRANS 1
    MAXTRANS 255
    STORAGE (
      INITIAL 64K
      MINEXTENTS 1
      MAXEXTENTS 2147483645
      PCTINCREASE 0
      BUFFER_POOL DEFAULT
    )
    LOGGING
    NOCOMPRESS
    LOB (LARGEOBJECT) STORE AS
    ( TABLESPACE USERS
      ENABLE STORAGE IN ROW
      CHUNK 8192
      PCTVERSION 10
      NOCACHE
      STORAGE (
        INITIAL 64K
        MINEXTENTS 1
        MAXEXTENTS 2147483645
        PCTINCREASE 0
        BUFFER_POOL DEFAULT
      )
    )
    NOCACHE
    NOPARALLEL
    MONITORING;
    Sql Loader script....
    SET NLS_DATE_FORMAT=Mon dd YYYY HH:mi:ssAM
    REM SET NLS_TIMESTAMP_FORMAT=Mon dd YYYY HH:mi:ss:ffAM
    REM SET NLS_LANGUAGE=AL32UTF8
    sqlldr cecildata/@ceciltst control=LARGEOBJECT.ctl log=LARGEOBJECT.log
    Sql loader control file......
    load data
    infile 'nilesh.dat' "str '<er>'"
    into table LARGEOBJECT
    fields terminated by '<ec>'
    trailing nullcols
    (LARGEOBJECTID,
    LARGEOBJECT CHAR(2000000) "HEXTORAW (:LARGEOBJECT)",
    CONTENTTYPE "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)",
    LARGEOBJECTNAME CHAR(255) "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)",
    LARGEOBJECTEXTENSION "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)",
    LARGEOBJECTDESCRIPTION CHAR(255) "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)",
    LARGEOBJECTSIZE,
    VERSIONCONTROL,
    WHENLARGEOBJECTLOCKED,
    WHOLARGEOBJECTLOCKED,
    LARGEOBJECTTIMESTAMP,
    LARGEOBJECTOID "GUID_MOVER(:LARGEOBJECTOID)")
    Error Received...
    Column Name Position Len Term Encl Datatype
    LARGEOBJECTID FIRST * CHARACTER
    Terminator string : '<ec>'
    LARGEOBJECT NEXT ***** CHARACTER
    Maximum field length is 2000000
    Terminator string : '<ec>'
    SQL string for column : "HEXTORAW (:LARGEOBJECT)"
    CONTENTTYPE NEXT * CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:CONTENTTYPE, CHR(00), ' ', :CONTENTTYPE)"
    LARGEOBJECTNAME NEXT 255 CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:LARGEOBJECTNAME, CHR(00), ' ', :LARGEOBJECTNAME)"
    LARGEOBJECTEXTENSION NEXT * CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:LARGEOBJECTEXTENSION, CHR(00), ' ', :LARGEOBJECTEXTENSION)"
    LARGEOBJECTDESCRIPTION NEXT 255 CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "DECODE(:LARGEOBJECTDESCRIPTION, CHR(00), ' ', :LARGEOBJECTDESCRIPTION)"
    LARGEOBJECTSIZE NEXT * CHARACTER
    Terminator string : '<ec>'
    VERSIONCONTROL NEXT * CHARACTER
    Terminator string : '<ec>'
    WHENLARGEOBJECTLOCKED NEXT * CHARACTER
    Terminator string : '<ec>'
    WHOLARGEOBJECTLOCKED NEXT * CHARACTER
    Terminator string : '<ec>'
    LARGEOBJECTTIMESTAMP NEXT * CHARACTER
    Terminator string : '<ec>'
    LARGEOBJECTOID NEXT * CHARACTER
    Terminator string : '<ec>'
    SQL string for column : "GUID_MOVER(:LARGEOBJECTOID)"
    SQL*Loader-309: No SQL string allowed as part of LARGEOBJECT field specification
    what's the cause ?

    "The previous attempt to use OMWB online loading has generated garbage data. The picture was not matching with the person id."
    This is being worked on (bug 4119713). If you have a reproducible testcase, please send it in (small testcases seem to work OK).
    I have the following email about BLOBS I could forward to you if I have your email address:
    [The forum may cut the lines in the wrong places]
    Regards,
    Turloch
    Oracle Migration Workbench Team
    Hi,
    This may provide the solution. Without having the customer files here I can only guess at the problem. But this should help.
    This email outlines a BLOB data move.
    There are quite a few steps to complete the task of moving a large BLOB into the Oracle database.
    Normally this wouldn't be a problem, but as far as we can tell SQL Server's (and possibly Sybase) BCP does not reliably export binary data.
    The only way to export binary data properly via BCP is to export it in a HEX format.
    Once in a HEX format it is difficult to get it back to binary during a data load into Oracle.
    We have come up with the idea of getting the HEX values into Oracle by saving them in a CLOB (holds text) column.
    We then convert the HEX values to binary values and insert them into the BLOB column.
    The problem here is that the HEXTORAW function in Oracle only converts a maximum of 2000 HEX pairs.
    We overcame this problem by writing our own procedure that will convert (bit by bit) your HEX data to binary.
    NOTE: YOU MUST MODIFY THE START.SQL AND FINISH.SQL TO SUIT YOUR CUSTOMER
    The task is split into 4 sub tasks
    1) CREATE A TABLESPACE TO HOLD ALL THE LOB DATA
    --log into your system schema and create a tablespace
    --Create a new tablespace for the CLOB and BLOB column (this may take a while to create)
    --You may resize this to fit your data ,
    --but I believe you have in excess of 500MB of data in this table and we are going to save it twice (in a clob then a blob)
    --Note: This script may take some time to execute as it has to create a tablespace of 1000Mb.
    -- Change this to suit your customer.
    -- You can change this if you want depending on the size of your data
    -- Remember that we save the data once as CLOB and then as BLOB
    create tablespace lob_tablespace datafile 'lob_tablespace' SIZE 1000M AUTOEXTEND ON NEXT 50M;
    LOG INTO YOUR TABLE SCHEMA IN ORACLE
    --Modify this script to fit your requirements
    2) START.SQL (this script will do the following tasks)
    a) Modify your current schema so that it can accept HEX data
    b) Modify your current schema so that it can hold that huge amount of data.
    The new tablespace is used; you may want to alter this to your requirements
    c) Disable triggers, indexes & primary keys on tblfiles
    3)DATA MOVE
    The data move now involves moving the HEX data in the .dat files to a CLOB.
    The START.SQL script adds a new column to <tablename> called <blob_column>_CLOB.
    This is where the HEX values will be stored.
    MODIFY YOUR CONTROL FILE TO LOOK LIKE THIS:
    load data
    infile '<tablename>.dat' "str '<er>'"
    into table <tablename>
    fields terminated by '<ec>'
    trailing nullcols
    <blob_column>_CLOB CHAR(200000000),
    The important part being "_CLOB" appended to your BLOB column name and the datatype set to CHAR(200000000)
    RUN sql_loader_script.bat
    Log into your schema to check if the data was loaded successfully -- now you can see that the hex values were sent to the CLOB column:
    SQL> select dbms_lob.getlength(<blob_column>),dbms_lob.getlength(<blob_column>_clob) from <tablename>;
    LOG INTO YOUR SCHEMA
    4) FINISH.SQL (this script will do the following tasks)
    a) Creates the procedure needed to perform the CLOB to BLOB transformation
    b) Executes the procedure (this may take some time, as 500 MB has to be converted to BLOB)
    c) Alters the table back to its original form (removes the <blob_column>_clob column)
    d) Enables the triggers, indexes and primary keys
    Regards,
    (NAME)
    -- START.SQL
    -- Modify this for your particular customer
    -- This should be executed in the user schema in Oracle that contains the table.
    -- DESCRIPTION:
    -- ALTERS THE OFFENDING TABLE SO THAT THE DATA MOVE CAN BE EXECUTED
    -- DISABLES TRIGGERS, INDEXES AND SEQUENCES ON THE OFFENDING TABLE
    -- 1) Add an extra column to hold the hex string
    alter table <tablename> add (FILEBINARY_CLOB CLOB);
    -- 2) Allow the BLOB column to accept NULLs
    alter table <tablename> MODIFY FILEBINARY NULL;
    -- 3) Disable triggers and sequences on tblfiles
    alter trigger <triggername> disable;
    alter table tblfiles drop primary key cascade;
    drop index <indexname>;
    -- 4) Allow the table to use the tablespace
    alter table <tablename> move lob (<blob_column>) store as (tablespace lob_tablespace);
    alter table tblfiles move lob (<blob_column>_clob) store as (tablespace lob_tablespace);
    COMMIT;
    -- END OF FILE
    -- FINISH.SQL
    -- Modify this for your particular customer
    -- This should be executed in the table schema in Oracle.
    -- DESCRIPTION:
    -- MOVES THE DATA FROM CLOB TO BLOB
    -- MODIFIES THE TABLE BACK TO ITS ORIGINAL SPEC (without a clob)
    -- THEN ENABLES THE SEQUENCES, TRIGGERS AND INDEXES AGAIN
    -- Currently we have the hex values saved as text in the <columnname>_CLOB column
    -- And we have NULL in all rows for the <columnname> column.
    -- We have to get BLOB locators for each row in the BLOB column
    -- put empty blobs in the blob column
    UPDATE <tablename> SET filebinary=EMPTY_BLOB();
    COMMIT;
    -- create the following procedure in your table schema
    CREATE OR REPLACE PROCEDURE CLOBTOBLOB
    AS
      inputLength  NUMBER;        -- size of the input CLOB
      offSet       NUMBER := 1;
      pieceMaxSize NUMBER := 50;  -- the max size of each piece
      piece        VARCHAR2(50);  -- these pieces will make up the entire CLOB
      currentPlace NUMBER := 1;   -- this is where we are up to in the CLOB
      blobLoc      BLOB;          -- BLOB locator in the table
      clobLoc      CLOB;          -- CLOB locator; this points to the value from the .dat file
      -- THIS HAS TO BE CHANGED FOR SPECIFIC CUSTOMER TABLE AND COLUMN NAMES
      CURSOR cur IS
        SELECT <blob_column>_clob clob_column, <blob_column> blob_column
        FROM   /*table*/ <tablename>
        FOR UPDATE;
      cur_rec cur%ROWTYPE;
    BEGIN
      OPEN cur;
      FETCH cur INTO cur_rec;
      WHILE cur%FOUND
      LOOP
        -- retrieve the clobLoc and blobLoc
        clobLoc := cur_rec.clob_column;
        blobLoc := cur_rec.blob_column;
        currentPlace := 1;  -- reset every time
        -- find the length of the CLOB
        inputLength := DBMS_LOB.getLength(clobLoc);
        -- loop through each piece
        LOOP
          -- get the next piece of hex characters from the CLOB
          piece := DBMS_LOB.subStr(clobLoc, pieceMaxSize, currentPlace);
          -- append this piece (converted to binary) to the BLOB
          DBMS_LOB.WRITEAPPEND(blobLoc, LENGTH(piece)/2, HEXTORAW(piece));
          currentPlace := currentPlace + pieceMaxSize;
          EXIT WHEN inputLength < currentPlace;
        END LOOP;
        FETCH cur INTO cur_rec;
      END LOOP;
    END CLOBtoBLOB;
    /
    -- now run the procedure
    -- It will update the blob column with the correct binary representation of the clob column
    EXEC CLOBtoBLOB;
    -- drop the extra clob column
    alter table <tablename> drop column <blob_column>_clob;
    -- 2) apply the constraint we removed during the data load
    alter table <tablename> MODIFY FILEBINARY NOT NULL;
    -- Now re-enable the triggers, indexes and primary keys
    alter trigger <triggername> enable;
    ALTER TABLE TBLFILES ADD ( CONSTRAINT <pkname> PRIMARY KEY ( <column>) ) ;
    CREATE INDEX <index_name> ON TBLFILES ( <column> );
    COMMIT;
    -- END OF FILE
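    As an optional extra (not in the original email), a quick sanity check before dropping the CLOB column could compare the lengths, since every two hex characters become one byte of the BLOB:
    -- Rows where the conversion did not produce the expected number of bytes
    SELECT COUNT(*)
    FROM   <tablename>
    WHERE  dbms_lob.getlength(<blob_column>) * 2 <> dbms_lob.getlength(<blob_column>_clob);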

  • Read Image file from BLOB.

    Hi All,
    I am trying to load BLOB data from the database and save it as an image file.
    SQL to insert data:
    INSERT INTO BLOBTest (BLOBName, BLOBData)
    SELECT 'First test file', BulkColumn FROM OPENROWSET(Bulk 'C:\Inbound\logo.jpg', SINGLE_BLOB) AS BLOB
    I am using the following 'SELECT top 1 BLOBData from BLOBTest' to get the data and setting the value to 'Image Saver' action. The image file is saved to the specified path, but the image is not opening.
    I get "Image is damaged, corrupted or is too large message. The file is around 40 KB."
    Am i missing something.
    Any help would be highly appreciated.
    MII Version: 14.0 SP 3
    Thanks,

    I think this thread might help: Can we interpret Oracle BLOB data in SAP MII and how?
    Basically MII doesn't support the BLOB/CLOB data types so in your query you will need to convert it to a base64 string that MII can consume using the image saver action.
    Regards,
    Christian

  • Cannot retrieve cell data content from BLOB object.

    I have loaded an image into GeoRaster with the following raster table:
    create table rdt_1 of mdsys.sdo_raster
    (primary key (rasterId, pyramidLevel, bandBlockNumber,
    rowBlockNumber, columnBlockNumber))
    lob(rasterblock) store as (nocache nologging);
    After I loaded the image successfully, I continued by loading all the cell data into a BLOB object:
    DECLARE
    gr sdo_georaster;
    lb blob;
    BEGIN
    SELECT georaster INTO gr FROM georaster_table WHERE georid=2;
    dbms_lob.createTemporary(lb, FALSE);
    sdo_geor.getRasterData(gr, 0, lb);
    dbms_lob.freeTemporary(lb);
    END;
    Please give me simple PL/SQL to retrieve the content from the BLOB object!
    Thank You very much!
    YuMi

    BLOB stands for Binary Large OBject. However, the acronym has a pleasing affinity with the actual nature of the thing. In a database a BLOB is an undifferentiated mass of bytes. We don't know whether it's a spreadsheet or a word document or an image. So there is no out-of-the-box API for treating a BLOB as its native file type.
    Having said that there may be something in the Spatial API that works with such things - you might be better off asking the question in Spatial. Although I suspect that venue doesn't get as much through traffic as this one.
    Cheers, APC
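    If the goal is simply to get the retrieved cell data out of the database, one option (a sketch along the lines of the UTL_FILE code at the top of this page; the BLOBS directory object and the output file name are assumptions) is to write the temporary BLOB to a file before freeing it:
    DECLARE
      gr   sdo_georaster;
      lb   BLOB;
      f    UTL_FILE.FILE_TYPE;
      buf  RAW(32767);
      amt  BINARY_INTEGER := 32767;
      pos  INTEGER := 1;
      len  INTEGER;
    BEGIN
      SELECT georaster INTO gr FROM georaster_table WHERE georid = 2;
      DBMS_LOB.createtemporary(lb, FALSE);
      sdo_geor.getRasterData(gr, 0, lb);
      -- Dump the temporary BLOB to a file in the BLOBS directory object.
      len := DBMS_LOB.getlength(lb);
      f   := UTL_FILE.fopen('BLOBS', 'rasterdata.bin', 'wb', 32767);
      WHILE pos <= len LOOP
        DBMS_LOB.read(lb, amt, pos, buf);
        UTL_FILE.put_raw(f, buf, TRUE);
        pos := pos + amt;
      END LOOP;
      UTL_FILE.fclose(f);
      DBMS_LOB.freetemporary(lb);
    END;
    /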

  • How can I open different binary files from BLOB column ?

    If we store some types of binary file (XLS, DOC, PDF, EML and so on, not only pictures) in a BLOB column, how can I show the different contents? We use Designer and Forms 9i with PL/SQL.
    How can I copy the files from the BLOB to a file in a directory, or how can I pass the BLOB's content directly to the proper application to open it?

    The mime type is just a string, as explained above (e.g. application/pdf). There are lots of samples here and on Metalink.
    E.g. add a column mime_type varchar(30) to your blob table. Create a procedure similar to the following:
    PROCEDURE getblob (p_file IN VARCHAR2)
    IS
      vblob      BLOB;
      vmime_type myblobs.mime_type%TYPE;
      length     NUMBER;
    BEGIN
      SELECT document, mime_type
      INTO   vblob, vmime_type
      FROM   myblobs
      WHERE  docname = p_file;
      length := dbms_lob.getlength(vblob);
      IF length = 0 OR vblob IS NULL THEN
        htp.p('Document not available yet.');
      ELSE
        -- Keep the header open so Content-Length can be added before closing it.
        owa_util.mime_header(vmime_type, FALSE);
        htp.p('Content-Length: ' || dbms_lob.getlength(vblob));
        owa_util.http_header_close;
        wpg_docload.download_file(vblob);
      END IF;
    EXCEPTION
      WHEN OTHERS THEN
        htp.p(SQLERRM);
    END;
    Create a DAD on your application server (refer to documentation on how to create a DAD).
    Display the blob from forms (e.g. on a when-button-pressed trigger):
    web.show_document('http://myserver:port/DAD/getblob?p_file=myfilename','_blank');
    For storing blobs in a directory on your db server take a look at the dbms_lob package.
    For storing blobs in a directory on your app server take a look at WebUtil available on OTN.
    HTH
    Gerald Krieger

  • Help needed in executing SQL query...

    Hi,
    I am very new to JDeveloper. Currently I am trying to execute a SQL query from the BPEL process; the output of the query is to be mapped to a variable field from a target XSD file. The query is fairly simple, like:
    SELECT emp
    FROM emp_table
    WHERE emp_id=123
    The target field, namely "employee", is an element from the XSD file. I tried using a Java Embedding activity to connect to the database and execute the query through a piece of Java code, but couldn't find a way to assign the output of the query to the field. However, I have lately also discovered the Database Adapter service, which helps me create a database connection, but I am still not sure how to handle and map the output of the query to the variable field.
    Can somebody please help me resolve the issue, either through the Java Embedding activity or the Database Adapter service?
    Thanks in advance
    Anjan

    Anjan,
    I suggest you try the BPEL Forum: http://forums.oracle.com/forums/forum.jspa?forumID=212
    John

  • Help needed in framing SQL query.

    Hi,
    I have a table with the following schema:
    PRODUCT_ID         INT
    DATE               DATETIME
    ITEMS_SOLD         INT
    This table contains data for a week (Sunday to Saturday). I want to write an SQL query to get (filter out) the PRODUCT_ID of products which have the same number of ITEMS_SOLD for all 7 days. Also, for a given PRODUCT_ID, I need to find the longest period of successive days in which the number of ITEMS_SOLD is the same for every day of the period. E.g. PRODUCT_ID 23 was sold along the week in the following numbers of units: 4, 6, 6, 6, 6, 7, 4, so the longest period is 4 days, from Monday to Thursday. The first condition is a special case of the second condition, where the number of days is 7.
    Any help to get the SQL query will be appreciated.
    Thanks,
    Akshay.

    PRODUCT_ID   DATE         ITEMS_SOLD
    1            10/10/2011   3
    1            11/10/2011   3
    1            12/10/2011   3
    1            13/10/2011   3
    1            16/10/2011   5
    2            10/10/2011   4
    2            11/10/2011   4
    2            12/10/2011   4
    2            13/10/2011   4
    2            14/10/2011   4
    2            15/10/2011   4
    2            16/10/2011   4
    Output:
    PRODUCT_ID   ITEMS_SOLD   NO_OF_DAYS
    1            3            4
    2            4            7
    Explanation of results:
    The table to be queried contains data for 1 week: from 10/10/2011 (Sunday) to 16/10/2011 (Saturday). Product with PRODUCT_ID '1' was sold on the dates 10, 11, 12, 13 and 16. Out of these 5 days, 3 units were sold on 4 successive days (from the 10th to the 13th). So the output should be:
    PRODUCT_ID   ITEMS_SOLD   NO_OF_DAYS
    1            3            4
    as the longest period of successive days on which the same number of units was sold is 4 days, with 3 units each day (the other period is just 1 day, on the 16th).
    For PRODUCT_ID 2 we have only one period, of 7 days, where 4 units were sold each day. So we output:
    PRODUCT_ID   ITEMS_SOLD   NO_OF_DAYS
    2            4            7
    Other cases, where the same PRODUCT_ID has a different number of units sold on each day, should be ignored.
    I hope that clarifies the problem more. Let me know in case I have missed out anything which should have been mentioned.
    -Akshay.
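    Since no query was posted in this thread, here is an untested gaps-and-islands sketch of the second requirement; the table and column names (product_sales, product_id, sale_date, items_sold) are assumptions, and the first requirement is simply the rows where NO_OF_DAYS = 7:
    SELECT product_id, items_sold, no_of_days
    FROM  (SELECT product_id, items_sold,
                  COUNT(*) AS no_of_days,
                  ROW_NUMBER() OVER (PARTITION BY product_id
                                     ORDER BY COUNT(*) DESC) AS rn
           FROM  (SELECT product_id, items_sold,
                         -- consecutive dates with the same items_sold share one group key
                         sale_date - ROW_NUMBER() OVER (PARTITION BY product_id, items_sold
                                                        ORDER BY sale_date) AS grp
                  FROM   product_sales)
           GROUP BY product_id, items_sold, grp)
    WHERE  rn = 1;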
