ODI PDF to CLOB column

Hi,
How should I read PDF content from a file share location and update a CLOB column in the target DB?
My source is a PDF file and my target is an Oracle DB.
thanks
Sanjeeva

An easy way of getting LOBs in is via an external table. The external table below reads a driver file named 'files_to_load.dat' listing the file names, and exposes each file name together with a BLOB column holding that file's content.
Directories used in the external table:
create directory dir_data as 'd:\temp\data';
create directory dir_out_log as 'd:\temp\logs';
create directory dir_out_bad as 'd:\temp\logs';
External table definition:
CREATE TABLE blob_data_et (
  file_nm   VARCHAR2(120),
  file_data BLOB
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY dir_data
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1
    LOGFILE dir_out_log:'dir_data.log'
    BADFILE dir_out_bad:'dir_data.bad'
    NODISCARDFILE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    (file_nm CHAR(120))
    COLUMN TRANSFORMS (file_data FROM LOBFILE (file_nm) FROM (dir_data))
  )
  LOCATION (dir_data:'files_to_load.dat')
)
REJECT LIMIT 0
NOPARALLEL
NOMONITORING;
Cheers
David
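Since the original question targets a CLOB column, the BLOB exposed by the external table above would still need a character-set conversion on the way into the target table. A minimal, untested sketch of that last step, assuming a hypothetical target table target_docs(file_nm varchar2(120), file_clob clob) and default character-set settings:
declare
  l_clob        clob;
  l_dest_offset integer;
  l_src_offset  integer;
  l_lang_ctx    integer;
  l_warning     integer;
begin
  for r in (select file_nm, file_data from blob_data_et) loop
    dbms_lob.createtemporary(l_clob, true);
    l_dest_offset := 1;
    l_src_offset  := 1;
    l_lang_ctx    := dbms_lob.default_lang_ctx;
    -- convert the binary file content to character data in the database character set
    dbms_lob.converttoclob(dest_lob     => l_clob,
                           src_blob     => r.file_data,
                           amount       => dbms_lob.lobmaxsize,
                           dest_offset  => l_dest_offset,
                           src_offset   => l_src_offset,
                           blob_csid    => dbms_lob.default_csid,
                           lang_context => l_lang_ctx,
                           warning      => l_warning);
    insert into target_docs (file_nm, file_clob) values (r.file_nm, l_clob);
    dbms_lob.freetemporary(l_clob);
  end loop;
  commit;
end;
/
Note that PDF is a binary format, so a conversion like this only makes sense if the target really must be a CLOB; a BLOB column is the more natural fit. In ODI the SELECT/INSERT above could equally be wrapped in a procedure or interface step.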

Similar Messages

  • Contains clause in Interactive report search of a clob column

    Hi,
    I am using APEX Version 4.2.4.00.08
    How can an interactive Report use the contains clause when searching a clob column so that it uses the CONTEXT index?
    Thanks
    Chandra.

    I wrote it into the SQL used for the IR
    I created an APEX application that stores all of our IT's HOWTO documents. (word,excel,pdf)
    ctx_doc.snippet creates HTML code.
    select D.doc_id
      ,D.doc_filename
      ,dbms_lob.getLength( D.doc_blob ) as download
      ,decode( :P12_SEARCH, null, '-- nothing --',
             ctx_doc.snippet( 'IT_DATA.DOC_CTX_IX' -- my Oracle Text index name
                                  , D.doc_id, :P12_SEARCH ) )
       as snippet
    from it_data.documents D
    where :P12_SEARCH is null
    or contains( D.doc_blob, :P12_SEARCH ) > 0
    MK
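    For completeness, a CONTEXT index like the one referenced in the query above could be created roughly as follows (the filter and sync parameters are assumptions, not from the original thread):
    create index it_data.doc_ctx_ix on it_data.documents (doc_blob)
      indextype is ctxsys.context
      parameters ('filter ctxsys.auto_filter sync (on commit)');
    With such an index in place, the CONTAINS predicate in the IR query is resolved through Oracle Text rather than a full scan.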

  • Show the rtf formatted data from CLOB Column

    Hi all,
    Is it possible to show the RTF-formatted text stored in a CLOB column on a report generated with BI Publisher?
    Step-by-step guidance would be nice.
    I have created a report with BI Publisher and set up a data source and SQL query.
    But in MS Word it is not shown as formatted; all the RTF tags are shown as well.
    Regards,

    Hi Leonid,
    A) =======
    I'd be keen if someone like Tim would pick up on this and advise the full list of fo:instream-foreign-object content-types that BI Publisher supports; it would be nice if there were application/pdf and, as you require, application/rtf or similar. That way you'd just base64 encode your file in your XML output, as per images like jpg/gif/png, and include it in your template.
    Desirable syntax would then be like:
    <fo:instream-foreign-object content-type="application/rtf" >
    <xsl:value-of select="MY_BASE64_RTF_ELEMENT"/>
    </fo:instream-foreign-object>
    However, that's wishful thinking, as I can only see documented support for images, specifically in jpg / gif / png formats.
    B) =======
    As a workaround, if you have the ability to update your RTF files (CLOBs), then you can just use the standard subtemplate functionality.
    In that case the approach would be:
    1. Update the RTFs (stored as clobs) to surround with template syntax:
    <?template:DefSubTemp?>
    ... your RTF doc here ...
    <?end template?>
    2. Take your CLOB and set it up / upload it as a template (subtemplate = yes), either manually or automated on the back end.
    3. Add the subtemplate to your main template (assuming you register under Application Object Library, code XXV8_SUBTEMP, language English and territory United States):
    <?import:xdo://FND.XXV8_SUBTEMP.EN.US?>
    <?call-template:DefSubTemp?>
    C) =======
    If you don't have the ability to update your RTF CLOBs and neither of the other solutions is suitable, then one thing I thought of is on-the-fly conversion of the RTFs to, say, PNG, and then using the image functionality to embed the image in the output. But again this may not be suitable, as it will be an image, not the "textual" output.
    D) =======
    All else failing you could look into the XML Publisher APIs to see if those could help.
    Hopefully something for you to think about here and get more info on.
    Regards,
    Gareth
    Blog: http://garethroberts.blogspot.com/

  • Doubt handling Clob columns with Java JDBC api

    Hi,
    we have a doubt handling Clob columns with Java JDBC api.
    Reading Oracle 10g official documentation (document b10979.pdf, page 236), we found this note:
    ============================================
    To write LOB data, the application must acquire a write lock on the LOB object. One way to accomplish this is through a SELECT FOR UPDATE. Also, disable auto-commit mode.
    ============================================
    We also found a java sample code about how to handle Lob objects at this URL:
    http://www.oracle.com/technology/sample_code/tech/java/sqlj_jdbc/files/advanced/LOBSample/LOBSample.java.html
    In our Java 2 application, we access CLOB objects in quite a different manner: we use the normal setString() and getString() methods, as described in the section "Shortcuts For Inserting and Retrieving CLOB Data" (document b10979.pdf, page 244).
    Using those methods, we never lock the table row with a SELECT FOR UPDATE statement (as described in the note above); we simply use SELECT, UPDATE and INSERT prepared statements.
    In this way we can insert CLOB values together with normal timestamp, number and other types in a single insert statement, and likewise for updates.
    To recap, our questions are:
    Is it mandatory to use a SELECT FOR UPDATE statement when updating CLOB data? What are the consequences if we don't? Is it also correct to insert both CLOB and non-CLOB data in a single SQL statement, using setString() for the CLOB values? And what about more than one LOB column in the same record?
    bye,
    luca acri.

    And columns of type FLOAT. These also have, for some unknown reason a metadata type of OTHER, and a type string of 'FLOAT'. Yet PreparedStatement.setNull(x, Types.OTHER) doesn't work and setNull(x, Types.DECIMAL) does.

  • Forms 6.0 how to query clob column with oracle 9.2 DB

    Hi everybody,
    I installed an Oracle 9.2 DB and everything went OK, but when I run a query in my Forms 6.0 form, which has a CLOB column, the form closes automatically without any message.
    Just so you know, when I run the same form against an Oracle 8.1.7 DB it queries normally without any problem.
    I would appreciate your help.
    Message was edited by:
    mshaqalaih

    I know there was a problem in 6i where you would get a crash if your query returned more than {Max Length} characters of the field representing the CLOB column.

  • Error reading data from CLOB column into VARCHAR2 variable

    Hi all,
    Am hitting an issue retrieving data > 8K (minus 1) stored in a CLOB column into a VARCHAR2 variable in PL/SQL...
    The "problem to be solved" here is storing DDL, in this case a "CREATE VIEW" statement, that is longer than 8K for later retrieval (and execution) using dynamic SQL. Given that the EXECUTE IMMEDIATE statement can take a VARCHAR2 variable (up to 32K(-1)), this should suffice for our needs, however, it seems that somewhere in the process of converting this VARCHAR2 text to a CLOB for storage, and then retrieving the CLOB and attempting to put it back into a VARCHAR2 variable, it is throwing a standard ORA-06502 exception ("PL/SQL: numeric or value error"). Consider the following code:
    set serveroutput on
    drop table test1;
    create table test1(col1 CLOB);
    declare
      cursor c1 is select col1 from test1;
      myvar VARCHAR2(32000);
    begin
      myvar := '';
      for i in 1..8192 loop
        myvar := myvar || 'a';
      end loop;
      INSERT INTO test1 (col1) VALUES (myvar);
      for arec in c1 loop
        begin
          myvar := arec.col1;
          dbms_output.put_line('Read data of length ' || length(myvar));
        exception
          when others then
            dbms_output.put_line('Error reading data: ' || sqlerrm);
        end;
      end loop;
    end;
    If you change the loop upper bound to 8191, all works fine. I'm guessing this might have something to do with the database character set -- we've recently converted our databases over to UTF-8 for internationalization support, and that seems to have changed underlying assumptions regarding character processing...?
    As far as the dynamic SQL issue goes, we can probably use the DBMS_SQL interface instead, with its EXECUTE procedure that takes a PL/SQL array of varchar2(32K) - the only issue there is reading the data from the CLOB column and then breaking it into an array, but that doesn't seem insurmountable. But this same basic issue (when a 9K text block, let's say, turns into a >32K block after being CLOBbered) seems to come up in other text-processing situations as well, so any ideas on how to resolve it would be much appreciated.
    Thanks for any tips/hints/ideas...
    Jim

    For those curious about this, here's the word from Oracle support (courtesy of Metalinks):
    RESEARCH
    ========
    Test the issue for different DB version and different characterset.
    --Testing the following PL/SQL blocks by using direct assignment method(myvar := arec.col1;) on
    different database version and different characterset.
    SQL>create table test1(col1 CLOB);
    --Inserting four CLOB data into test1.
    declare
      myvar VARCHAR2(32767);
    begin
      myvar := RPAD('a',4000);
      INSERT INTO test1 (col1) VALUES (myvar);
      myvar := RPAD('a',8191);
      INSERT INTO test1 (col1) VALUES (myvar);
      myvar := RPAD('b',8192);
      INSERT INTO test1 (col1) VALUES (myvar);
      myvar := RPAD('c',32767);
      INSERT INTO test1 (col1) VALUES (myvar);
      commit;
    end;
    --Testing the direct assignment method.
    declare
      cursor c1 is select col1, length(col1) len1 from test1;
      myvar VARCHAR2(32767);
    begin
      for arec in c1 loop
        myvar := arec.col1;
        --DBMS_LOB.READ(arec.col1, arec.len1, 1, myvar);
        dbms_output.put_line('Read data of length: ' || length(myvar));
      end loop;
    end;
    The following is a summary of the test results:
    ===================================
    1. If the database characterset is WE8ISO8859P1, then the above direct assignment method (myvar := arec.col1;) works for database versions 9i/10g/11g without any errors.
    2. If the database characterset is UTF8 or AL32UTF8, then the above direct assignment method (myvar := arec.col1;) will generate "ORA-06502: PL/SQL: numeric or value error" when the length of the CLOB data is greater than 8191 (= 8K - 1). The same error can be reproduced across all database versions 9i/10g/11g.
    3. Using the DBMS_LOB.READ(arec.col1, arec.len1, 1, myvar) method to read CLOB data into a VARCHAR2 variable works for both the WE8ISO8859P1 and UTF8 charactersets and for all database versions.
    So - it seems as I'd surmised, UTF8 changes the way VARCHAR2 and CLOB data is handled. Not too surprising, I suppose - may you all be lucky enough to be able to stay away from this sort of issue. But - the DBMS_LOB.READ workaround is certainly sufficient for the text processing situations we find ourselves in currently.
    Cheers,
    Jim C.
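    For reference, the DBMS_LOB.READ workaround mentioned above looks roughly like this against the test1 table from the thread (buffer size and read amount are assumptions; a sketch only):
    declare
      cursor c1 is select col1 from test1;
      myvar varchar2(32767);
      l_amt integer;
    begin
      for arec in c1 loop
        l_amt := 32767;                             -- maximum number of characters to read
        dbms_lob.read(arec.col1, l_amt, 1, myvar);  -- amount is IN OUT: returns the count actually read
        dbms_output.put_line('Read data of length: ' || l_amt);
      end loop;
    end;
    /
    With multi-byte data a smaller amount may be needed so that the characters read still fit within the byte limit of the VARCHAR2 buffer.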

  • Split CLOB column to improve performance

    Hi All,
    We have a transactional table which has 3 columns, one of which is a CLOB holding XML data. Inserts are coming in at 35K/hr to this table and the data is deleted as soon as the job is completed, so at any time the total number of records in this table will be less than 1000.
    The XML data contains binary image info, the size of each XML file ranges anywhere between 200KB and 600KB, and the elapsed time for each insert varies from 1 to 2 secs depending on the concurrency. As we need to achieve 125K/hour soon, we were planning a few modifications at the table level:
    1. Increase the CHUNK size from 8KB to 32KB.
    2. Disabling logging for table,clob and index.
    3. Disable flashback for database.
    4. Move the table to a non default blocksize of 32KB. Default is 8KB
    5. Increase the SDU value.
    6. Split the XML data and store it on multiple CLOB columns.
    We don't do any update to this table. Its only INSERT,SELECT and DELETE operations.
    The major wait events I'm seeing during the insert are:
    1. direct path read
    2. direct path write
    3. flashback logfile sync
    4. SQL*Net more data from client
    5. Buffer busy wait
    My questions here are:
    1. If I allocate 2G of memory for the non-default block size and change the CLOB to CACHE, will my other objects in the buffer cache get affected or age out faster?
    2. Will moving this table from BASICFILE to SECUREFILE help?
    3. Will splitting the XML data across different CLOB columns in the same table give a performance boost?
    Oracle EE 11.2.0.1, ASM
    Thanks,
    Arun

    Thanks to all for the replies
    @Sybrand
    Please answer first whether the column is stored in a separate lobsegment.
    No. Table, index, LOB and LOB index all use the same tablespace. I missed adding this point (moving to a separate TS) as part of the table modifications.
    @Hemant
    There's a famous paper / blog post about CLOBs and Database Flashback. If I find it, I'll post the URL.
    Is this the one you are referring to
    http://laimisnd.wordpress.com/2011/03/25/lobs-and-flashback-database-performance/
    By moving the CLOB column to a different block size, I will test the performance improvement it gives and will share the results.
    We don't need to keep any data from this table. The XML contains fingerprint details, and once the application server completes the job the XML data is deleted from this table.
    So there is no need for backup/recovery operations for this table; the client will be able to replay the transactions if any problem occurs.
    @Billy
    We are not performing XML parsing on the DB side: we get the XML data from the client -> insert into the table -> the client selects from the table -> upon successful completion of the job by the client, the XML data gets deleted.
    Regarding binding the LOB from the client side, I will check on that as well to reduce round trips.
    By changing the blocksize, I can set db_32K_cache_size=2G and keep this table in CACHE. If I directly put my table into CACHE, it will age out everything else from the buffer, which makes things worse for us.
    This insert is part of a transaction (registration of a fingerprint) and is the only statement taking time as of now compared to the other statements in the transaction.
    Thanks,
    Arun
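    For anyone following along, the storage options being discussed (SECUREFILE, a larger chunk, NOLOGGING, a dedicated 32K-blocksize tablespace) would look roughly like the DDL below. Table, column and tablespace names are made up for illustration; this is a sketch, not the poster's actual definition:
    -- assumes a tablespace ts_lob_32k created in a 32K blocksize and db_32k_cache_size set
    create table xml_staging (
      job_id   number,
      xml_data clob
    )
    lob (xml_data) store as securefile xml_data_seg (
      tablespace ts_lob_32k
      chunk 32768
      nologging        -- or CACHE, if keeping LOB blocks in the buffer cache is preferred
    );
    Whether NOLOGGING (or CACHE) is appropriate depends on the recoverability trade-off described above; since the data here is transient, that call is the poster's to make.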

  • How to read/write .CSV file into CLOB column in a table of Oracle 10g

    I have a requirement involving a table with two columns:
    create table emp_data (empid number, report clob)
    Here the REPORT column is of CLOB data type and is used to hold the data loaded from the .csv file.
    The requirements are:
    1) How to load data from a .CSV file into the CLOB column, along with empid, using the DBMS_LOB utility.
    2) How to read the report column so that it returns all the columns present in the .CSV file (dynamically, because every csv file may have a different number of columns) along with the primary key empid.
    eg: empid report_field1 report_field2
    1 x y
    Any help would be appreciated.

    If I understand you right, you want each row in your table to contain an emp_id and the complete text of a multi-record .csv file.
    It's not clear how you relate emp_id to the appropriate file to be read. Is the emp_id stored in the csv file?
    To read the file, you can use functions from [UTL_FILE|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/u_file.htm#BABGGEDF] (as long as the file is in a directory accessible to the Oracle server):
    declare
        lt_report_clob CLOB;
        l_max_line_length integer := 1024;   -- set as high as the longest line in your file
        l_infile UTL_FILE.file_type;
        l_buffer varchar2(1024);
        l_emp_id report_table.emp_id%type := 123; -- not clear where emp_id comes from
        l_filename varchar2(200) := 'my_file_name.csv';   -- get this from somewhere
    begin
       -- open the file; we assume an Oracle directory has already been created
        l_infile := utl_file.fopen('CSV_DIRECTORY', l_filename, 'r', l_max_line_length);
        -- initialise the empty clob
        dbms_lob.createtemporary(lt_report_clob, TRUE, DBMS_LOB.session);
        loop
          begin
             utl_file.get_line(l_infile, l_buffer);
             dbms_lob.append(lt_report_clob, l_buffer);
          exception
             when no_data_found then
                 exit;
          end;
        end loop;
        insert into report_table (emp_id, report)
        values (l_emp_id, lt_report_clob);
        -- free the temporary lob
        dbms_lob.freetemporary(lt_report_clob);
       -- close the file
       UTL_FILE.fclose(l_infile);
    end;
    This simple line-by-line approach is easy to understand, and gives you an opportunity (if you want) to take each line in the file and transform it (for example, you could transform it into a nested table, or into XML). However it can be rather slow if there are many records in the csv file - the lob_append operation is not particularly efficient. I was able to improve the efficiency by caching the lines in a VARCHAR2 up to a maximum cache size, and only then appending to the LOB - see [three posts on my blog|http://preferisco.blogspot.com/search/label/lob].
    There is at least one other possibility:
    - you could use [DBMS_LOB.loadclobfromfile|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_lob.htm#i998978]. I've not tried this before myself, but I think the procedure is described [here in the 9i docs|http://download.oracle.com/docs/cd/B10501_01/appdev.920/a96591/adl12bfl.htm#879711]. This is likely to be faster than UTL_FILE (because it is all happening in the underlying DBMS_LOB package, possibly in a native way).
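    A rough sketch of that DBMS_LOB.LOADCLOBFROMFILE approach, reusing the directory and file names from the UTL_FILE example above (character-set parameters left at their defaults; untested):
    declare
      l_clob        clob;
      l_bfile       bfile := bfilename('CSV_DIRECTORY', 'my_file_name.csv');
      l_dest_offset integer := 1;
      l_src_offset  integer := 1;
      l_lang_ctx    integer := dbms_lob.default_lang_ctx;
      l_warning     integer;
    begin
      dbms_lob.createtemporary(l_clob, true);
      dbms_lob.open(l_bfile, dbms_lob.lob_readonly);
      -- load the whole file into the temporary CLOB in one call
      dbms_lob.loadclobfromfile(dest_lob     => l_clob,
                                src_bfile    => l_bfile,
                                amount       => dbms_lob.lobmaxsize,
                                dest_offset  => l_dest_offset,
                                src_offset   => l_src_offset,
                                bfile_csid   => dbms_lob.default_csid,
                                lang_context => l_lang_ctx,
                                warning      => l_warning);
      dbms_lob.close(l_bfile);
      insert into report_table (emp_id, report) values (123, l_clob);  -- emp_id source still to be decided
      dbms_lob.freetemporary(l_clob);
    end;
    /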
    That's all for now. I haven't yet answered your question on how to report data back out of the CLOB. I would like to know how you associate employees with files; what happens if there is > 1 file per employee, etc.
    HTH
    Regards Nigel
    Edited by: nthomas on Mar 2, 2009 11:22 AM - don't forget to fclose the file...

  • How to retrieve soap xml data from clob column in a table

    Hi,
    I am trying to retrieve the XML tag value from clob column.
    Table name = xyz and column= abc (clob datatype)
    data stored in abc column is as below
    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:head="http://www.abc.com/gcgi/shared/system/header" xmlns:v6="http://www.abc.com/gcgi/services/v6_0_0_0" xmlns:sys="http://www.abc.com/gcgi/shared/system/systemtypes">
    <soapenv:Header xmlns:wsa="http://www.w3.org/2005/08/addressing">
    <RqHeader soapenv:mustUnderstand="0" xmlns="http://www.abc.com/gcgi/shared/system/header">
    <DateAndTimeStamp>2011-12-20T16:02:36.677+08:00</DateAndTimeStamp>
    <UUID>1000002932</UUID>
    <Version>6_0_0_0</Version>
    <ClientDetails>
    <Org>ABC</Org>
    <OrgUnit>ABC</OrgUnit>
    <ChannelID>HBK</ChannelID>
    <TerminalID>0000</TerminalID>
    <SrcCountryCode>SG</SrcCountryCode>
    <DestCountryCode>SG</DestCountryCode>
    <UserGroup>HBK</UserGroup>
    </ClientDetails>
    </RqHeader>
    <wsa:Action>/SvcImpl/bank/
    SOAPEndpoint/AlertsService.serviceagent/OpEndpointHTTP/AlertDeleteInq</wsa:Action></soapenv:Header>
    <soapenv:Body>
    <v6:AlertDeleteInqRq>
    <v6:Base>
    <v6:VID>20071209013112</v6:VID>
    <!--Optional:-->
    <v6:Ref>CTAA00000002644</v6:Ref>
    </v6:Base>
    </v6:AlertDeleteInqRq>
    </soapenv:Body>
    </soapenv:Envelope>
    And I want to retrieve the values of the tags
    <ChannelID> and <v6:VID>
    Can somebody help? I have tried with extractvalue but am not able to get the values.

    I have used the two queries below but am not able to get the expected results. Both queries return no values.
    select xmltype(MED_REQ_PAYLOAD).extract('//ClientDetails/Org','xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" || xmlns="http://www.abc.com/gcgi/shared/system/header"').getStringValue() from ESB_OUTPUT_TEMP where SOAPACTION = '/SvcImpl/bank/alerts/v6_0_0_0/SOAPEndpoint/AlertsService.serviceagent/OpEndpointHTTP/AlertDeleteInq'
    select EXTRACTVALUE(xmltype(MED_REQ_PAYLOAD),'/RqHeader/) from ESB_OUTPUT_TEMP where SOAPACTION = '/SvcImpl/bank/SOAPEndpoint/AlertsService.serviceagent/OpEndpointHTTP/AlertDeleteInq'
    Well, for starters, both queries are syntactically wrong :
    - non terminated string
    - incorrect namespace mapping declaration
    - unknown XMLType method "getStringValue()"
    Secondly, all those functions are deprecated now.
    Here's an up-to-date example using XMLTable. It will retrieve the two tag values you mentioned :
    SQL> select x.*
      2  from esb_output_temp t
      3     , xmltable(
      4         xmlnamespaces(
      5           'http://schemas.xmlsoap.org/soap/envelope/' as "soap"
      6         , 'http://www.abc.com/gcgi/shared/system/header' as "head"
      7         , 'http://www.abc.com/gcgi/services/v6_0_0_0' as "v6"
      8         )
      9       , '/soap:Envelope'
    10         passing xmlparse(document t.med_req_payload)
    11         columns ChannelID  varchar2(10) path 'soap:Header/head:RqHeader/head:ClientDetails/head:ChannelID'
    12               , VID        varchar2(30) path 'soap:Body/v6:AlertDeleteInqRq/v6:Base/v6:VID'
    13       ) x
    14  ;
    CHANNELID  VID
    HBK        20071209013112
    You may also want to store XML in XMLType columns for both performance and storage optimizations.

  • Trying to Insert an XML Element into XML data stored in CLOB column

    Hi all,
    My ORACLE DB version is:
    ('Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production');
    ('PL/SQL Release 11.2.0.2.0 - Production');
    ('CORE 11.2.0.2.0 Production');
    ('TNS for Linux: Version 11.2.0.2.0 - Production');
    ('NLSRTL Version 11.2.0.2.0 - Production');
    I have this XML data stored in a CLOB column:
    <Activity>
         <Changes>     
         </Changes>
         <Inserts>     
         </Inserts>
         <Definition>     
         </Definition>
         <Assignment TYPE="Apply">     
         </Assignment>
         <Spawned>
              <Activity>576D8CD9-57A1-8608-1563-8F6DC74BDF3C</Activity>
              <Activity>11226E79-5D24-02EB-A950-D34A9CCFB3FF</Activity>
              <Activity>DAA68DC0-CA9A-BB15-DE31-9596E19513EE</Activity>
              <Activity>93F667D6-966A-7EAD-9B70-630D9BEFDDD2</Activity>
              <Activity>FA63D9D3-86BB-3FF0-BE69-17EAA7581637</Activity>
         </Spawned>
         <SpawnedBy>AFC49BD4-5AA7-38C0-AE27-F59D16EE1B1C</SpawnedBy>
    </Activity>
    I am in need of some assistance in creating an update that will insert another <Activity>SomeGUID</Activity> into the <Spawned> parent.
    Any help is greatly appreciated.
    Thanks.
    Edited by: 943783 on Dec 14, 2012 12:58 PM

    See XML updating functions : http://docs.oracle.com/cd/E11882_01/appdev.112/e23094/xdb04cre.htm#i1032611
    For example :
    UPDATE my_table t
    SET t.my_clob =
          XMLSerialize(document
            insertChildXML(
              XMLParse(document t.my_clob)
            , '/Activity/Spawned'
            , 'Activity'
            , XMLElement("Activity", 'Some GUID')
            )
          )
    WHERE ...
    ;
    Although it works, there's overhead introduced by parsing the CLOB, then serializing it again.
    Is there any chance you can change the CLOB to SECUREFILE binary XMLType storage instead?
    You would then be able to benefit from optimized piecewise update of the XML and improved storage.
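    For reference, switching the storage to binary XML could look something like this (table and column names assumed for illustration):
    create table activity_docs (
      id       number primary key,
      activity xmltype
    )
    xmltype column activity store as securefile binary xml;
    insertChildXML could then be applied to the XMLType column directly, without the XMLParse/XMLSerialize round trip.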

  • How to insert more than 32k xml data into oracle clob column

    How can we insert more than 32K of XML data into an Oracle CLOB column?
    The XML data is coming from a Java front end.
    If we cannot use a CLOB, what other options are available?

    Are you facing any issue with my code?
    A string literal size error will come when you try to insert the full XML in string format.
    public static boolean writeCLOBData(String tableName, String id, String columnName, String strContents) throws DataAccessException {
        boolean isUpdated = false;
        Connection connection = null;
        PreparedStatement stmt = null;
        try {
            connection = ConnectionManager.getConnection();
            // connection.setAutoCommit(false);
            String sqlQuery = "UPDATE " + tableName + " SET " + columnName + " = ? WHERE ID = " + id;
            stmt = connection.prepareStatement(sqlQuery);
            // stream the string contents as a character stream so no 32k string literal is needed
            Reader reader = new StringReader(strContents);
            stmt.setClob(1, reader);
            // executeUpdate() returns the number of rows updated
            isUpdated = stmt.executeUpdate() > 0;
        } catch (SQLException e) {
            e.printStackTrace();
        } finally {
            try { if (stmt != null) stmt.close(); } catch (SQLException ignore) { }
            try { if (connection != null) connection.close(); } catch (SQLException ignore) { }
        }
        return isUpdated;
    }
    Try this Java code.

  • Limitation for CLOB columns? - ORA-01704: string literal too long

    Hello!
    I'm trying to update a CLOB column with more than 35000 characters, but I get "ORA-01704: string literal too long".
    The code:
    declare
      l_clob clob;
    begin
      update test set test = empty_clob()
      WHERE ID = 1
      returning test into l_clob;
      dbms_lob.write( l_clob, length('A...here 35000xA...A'), 1, 'A...here 35000xA...A');
    end;
    Is there any limitation for CLOB columns?
    Thanks for help.
    Daniel

    user605489 wrote:
    32768 characteres :)
    Actually it's 1 character less than 32K...
    *32767*
    SQL> declare
      2    v_vc varchar2(32768);
      3  begin
      4    null;
      5  end;
      6  /
      v_vc varchar2(32768);
    ERROR at line 2:
    ORA-06550: line 2, column 17:
    PLS-00215: String length constraints must be in range (1 .. 32767)
    SQL>
    I guess it comes from a legacy thing where signed words (2 bytes) are/were used to represent a value. As the most significant bit of the word is used to represent the sign of the number, the range goes from -32768 to 32767.
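    Coming back to the original question: the 32767 limit applies to string literals and PL/SQL VARCHAR2 variables, not to the CLOB column itself, so the usual way around ORA-01704 is to build the value in a LOB and bind it rather than inlining a 35000-character literal. A sketch against the test table from this thread (the chunking is an arbitrary choice):
    declare
      l_text  clob;
      l_chunk varchar2(4000) := rpad('A', 4000, 'A');
    begin
      dbms_lob.createtemporary(l_text, true);
      for i in 1 .. 9 loop   -- ~36000 characters, well past the 32767 literal limit
        dbms_lob.writeappend(l_text, length(l_chunk), l_chunk);
      end loop;
      update test set test = l_text where id = 1;   -- bound PL/SQL variable: no ORA-01704
      commit;
      dbms_lob.freetemporary(l_text);
    end;
    /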

  • How to Update a clob column..it gives error string literal too long

    I am trying to update a CLOB column of a table but it gives the error "string literal too long". Please tell me what the issue is.
    ORA-01704: string literal too long

    Peeyush wrote:
    I am trying to update a CLOB column of a table but it gives the error "string literal too long". Please tell me what the issue is.
    ORA-01704: string literal too long
    There's a problem with my car. It won't start. Why won't it start? Please tell me!
    Oh wait, you can't, because I haven't given you nearly enough information...
    In other words, if you would like help in trying to work out where you've gone wrong, you should provide a small enough example of your code that demonstrates the error. We might then actually stand a chance of being able to help you!

  • Getting an error while selecting from table having CLOB column.

    Hi All,
    I have the table below created in my Oracle database, version Oracle Database 11g Enterprise Edition Release 11.2.0.1.0.
    CREATE TABLE my_clob -- Dummy table created
    (DataBody CLOB);
    Current Database Character set - WE8MSWIN1250.
    On the front end of my application, I have one form through which I can save/edit data in the above table. If I create a new entry in the above table, it first checks against the existing records to avoid duplicate entries, and at this point the application generates the select statement below on the above table and returns the error "ORA-00932: inconsistent datatypes: expected - got CLOB".
    I cannot change the SQL statement.
    SELECT * FROM my_clob WHERE databody IS NULL OR databody ='';
    Even when I run the same statement on my DB server I get the same error, shown below:
    SQL> SELECT * FROM my_clob WHERE databody IS NULL OR databody ='';
    SELECT * FROM my_clob WHERE databody IS NULL OR databody =''
    ERROR at line 1:
    ORA-00932: inconsistent datatypes: expected - got CLOB
    SQL>
    Is there anything with OraOLEDB that is causing this error? Please help me get rid of it.
    Thanks,
    Santosh

    You cannot directly compare a CLOB column with a VARCHAR2. In your case you don't need such a comparison, because Oracle considers zero-length strings to be null values:
    SQL> create table my_clob(data int, databody clob);
    Table created.
    SQL> insert into my_clob values(1, null);
    1 row created.
    SQL> insert into my_clob values(2, '');
    1 row created.
    SQL> commit;
    Commit complete.
    SQL> select * from my_clob where databody is null;
          DATA
    DATABODY
             1
             2
    About null values in Oracle, please read http://docs.oracle.com/cd/E11882_01/server.112/e26088/sql_elements005.htm#SQLRF30037.
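    If the generated statement cannot be changed there is little to do on the SQL side, but for anyone who can touch the query, an equivalent predicate that avoids comparing the CLOB to a string is sketched below:
    select *
    from my_clob
    where databody is null
       or dbms_lob.getlength(databody) = 0;
    The getlength check also catches rows populated with empty_clob(), which are not null but have zero length.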

  • Creation of view with clob column in select and group by clause.

    Hi,
    We are trying to migrate a view from SQL Server 2005 to Oracle 10g. It has a CLOB column which is used in the GROUP BY clause. How can the same be achieved in Oracle 10g?
    Below is the SQL statement used to create the view, along with its datatypes.
    CREATE OR REPLACE FORCE VIEW "TEST" ("CONTENT_ID", "TITLE", "KEYWORDS", "CONTENT", "ISPOPUP", "CREATED", "SEARCHSTARTDATE", "SEARCHENDDATE", "HITS", "TYPE", "CREATEDBY", "UPDATED", "ISDISPLAYED", "UPDATEDBY", "AVERAGERATING", "VOTES") AS
      SELECT content_ec.content_id,
              content_ec.title,
              content_ec.keywords,
              content_ec.content content ,
              content_ec.ispopup,
              content_ec.created,
              content_ec.searchstartdate,
              content_ec.searchenddate,
            COUNT(contenttracker_ec.contenttracker_id) hits,
              contenttypes_ec.type,
              users_ec_1.username createdby,
              Backup_Latest.created updated,
              Backup_Latest.isdisplayed,
              users_ec_1.username updatedby,
              guideratings.averagerating,
              guideratings.votes
         FROM users_ec users_ec_1
                JOIN Backup_Latest
                 ON users_ec_1.USER_ID = Backup_Latest.USER_ID
                RIGHT JOIN content_ec
                JOIN contenttypes_ec
                 ON content_ec.contenttype_id = contenttypes_ec.contenttype_id
                 ON Backup_Latest.content_id = content_ec.content_id
                LEFT JOIN guideratings
                 ON content_ec.content_id = guideratings.content_id
                LEFT JOIN contenttracker_ec
                 ON content_ec.content_id = contenttracker_ec.content_id
                LEFT JOIN users_ec users_ec_2
                 ON content_ec.user_id = users_ec_2.USER_ID
         GROUP BY content_ec.content_id,
         content_ec.title,
         content_ec.keywords,
         to_char(content_ec.content) ,
         content_ec.ispopup,
         content_ec.created,
         content_ec.searchstartdate,
         content_ec.searchenddate,
         contenttypes_ec.type,
         users_ec_1.username,
         Backup_Latest.created,
         Backup_Latest.isdisplayed,
         users_ec_1.username,
         guideratings.averagerating,
         guideratings.votes;
    Column Name      Data TYpe
    CONTENT_ID     NUMBER(10,0)
    TITLE          VARCHAR2(50)
    KEYWORDS     VARCHAR2(100)
    CONTENT          CLOB
    ISPOPUP          NUMBER(1,0)
    CREATED          TIMESTAMP(6)
    SEARCHSTARTDATE     TIMESTAMP(6)
    SEARCHENDDATE     TIMESTAMP(6)
    HITS          NUMBER
    TYPE          VARCHAR2(50)
    CREATEDBY     VARCHAR2(20)
    UPDATED          TIMESTAMP(6)
    ISDISPLAYED     NUMBER(1,0)
    UPDATEDBY     VARCHAR2(20)
    AVERAGERATING     NUMBER
    VOTES          NUMBER
    Any help really appreciated.
    Thanks in advance
    Edited by: user512743 on Dec 10, 2008 10:46 PM
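    For what it's worth, the usual Oracle-side workaround is to keep the CLOB out of the GROUP BY entirely, for example by pre-aggregating the detail rows before joining. A simplified sketch using two of the tables above (not the full view):
    SELECT content_ec.content_id,
           content_ec.title,
           content_ec.content,          -- CLOB selected as-is, never grouped
           NVL(h.hits, 0) AS hits
      FROM content_ec
      LEFT JOIN (SELECT content_id, COUNT(*) AS hits
                   FROM contenttracker_ec
                  GROUP BY content_id) h
        ON h.content_id = content_ec.content_id;
    The same pattern extends to the other joins in the view, so the grouping only ever touches scalar columns.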

    Hello,
    Specifically, this should be asked in the
    ASP.Net MVC forum on forums.asp.net.
    Karl
