Losing NATIONAL CHARACTERS (blob->clob->table). unistr?

Hello!
I have a problem with national characters. My example is as follows:
1. A CSV file is uploaded from disk into htmldb_application_files.
2. This BLOB is then converted to a CLOB with dbms_lob.converttoclob().
3. Data from this CLOB is copied into a PL/SQL array.
4. From the PL/SQL array it is inserted into a table in the database.
The problem: either the data copied into the database table loses its national characters (strange characters are displayed instead), or, if I pass my national character set id as an argument to dbms_lob.converttoclob(), I get an error saying the file cannot be converted.
What is wrong? How can I solve my problem? Can unistr() help somewhere? Any ideas?
Tom
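
For step 2, the character set id passed to dbms_lob.converttoclob() should be the character set the CSV file was actually saved in, not the database's national character set. A minimal sketch of the conversion, assuming the file was saved as Windows-1250 (the file name 'data.csv' and the character set are assumptions, not taken from the post):

declare
  l_blob        blob;
  l_clob        clob;
  l_dest_offset integer := 1;
  l_src_offset  integer := 1;
  l_lang_ctx    integer := dbms_lob.default_lang_ctx;
  l_warning     integer;
begin
  select blob_content into l_blob
    from htmldb_application_files
   where name = 'data.csv';                          -- hypothetical file name
  dbms_lob.createtemporary(l_clob, true);
  dbms_lob.converttoclob(
    dest_lob     => l_clob,
    src_blob     => l_blob,
    amount       => dbms_lob.lobmaxsize,
    dest_offset  => l_dest_offset,
    src_offset   => l_src_offset,
    blob_csid    => nls_charset_id('EE8MSWIN1250'),  -- must match the file's real encoding
    lang_context => l_lang_ctx,
    warning      => l_warning);
  -- l_clob can now be parsed into the PL/SQL array (steps 3 and 4)
end;
/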

Duplicate posting, being addressed at:
losing NATIONAL CHARACTERS(blob->clob->table). unistr?

Similar Messages

  • Table Import Data - "Insert script" - National characters

    Hi all,
    it looks like there is a problem with support for national characters in the imported data file when the "Insert script" method is chosen.
    Table -> Import Data -> Open datafile "csv".
    In the preview window the national characters from the csv data file are displayed properly, and when I choose the "Insert" or "SQL Loader" method the data is imported into the table correctly.
    But when I use the "Insert script" method, the national characters in the generated script are turned into garbage ("bushes"):
    http://imm.io/V0J9
    SQL Developer: Version 3.2.20.09
    OS: Windows XP SP3
    Client code page: WIN-1250
    Tested databases: 10g, 11g

    This has been fixed in the latest build. The patch is now available for download: http://www.oracle.com/technology/software/products/sql/index.html
    Regards
    Sue

  • Exporting/Importing Oracle BLOB, CLOB, Nested Tables datatypes

    I am trying to export a database containing BLOB, Nested Table, etc. datatypes and want to import it again, but I am getting an error. Please help me with how to proceed. This is very urgent; please reply. Thank you in advance.

    I believe this error is only associated with the LONG data type. I have been using Data Pump to export/import tables with LOB types without any problem.
    Data Pump has had an issue with the LONG type; check this bug:
    Bug 5598333 - EXPDP/IMPDP corrupts the data for a LONG column
    Doc ID: Note:5598333.8
    It's fixed in 11.1.0.6
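
    For reference, a table-mode Data Pump export can also be driven from PL/SQL through DBMS_DATAPUMP. A minimal sketch (the directory object DP_DIR and the table LOB_TAB are hypothetical names, not taken from the posts above):

    declare
      l_handle number;
      l_state  varchar2(30);
    begin
      -- open a table-mode export job
      l_handle := dbms_datapump.open(operation => 'EXPORT', job_mode => 'TABLE');
      -- write the dump file to the hypothetical directory object DP_DIR
      dbms_datapump.add_file(handle => l_handle, filename => 'lob_tab.dmp', directory => 'DP_DIR');
      -- restrict the job to the hypothetical table LOB_TAB
      dbms_datapump.metadata_filter(handle => l_handle, name => 'NAME_EXPR', value => 'IN (''LOB_TAB'')');
      dbms_datapump.start_job(l_handle);
      dbms_datapump.wait_for_job(l_handle, l_state);
      dbms_output.put_line('Job finished with state: ' || l_state);
    end;
    /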

  • Words containing national characters (hungarian) are not indexed

    I created a table data source that contains text in a CLOB field and added it to the sample UltraSearch query application.
    I created a few rows in this table and ran the index build.
    The demo is now able to search the CLOB field.
    If the text contains at least one Hungarian national character
    (for example éíöü...)
    then the word containing it is not indexed.
    I couldn't find the word in the
    dr$wk$doc_path_idx$i
    table.
    The other words, without national characters, are there.
    example:
    Ez egy teszt szöveg
    words indexed:
    Ez
    egy
    teszt
    word not indexed:
    szöveg

    Another problem:
    I set the default language of this data source to Hungarian.
    I think this is applied to all the documents in the source.
    select count(t.lang)
    from wk$doc t
    where t.lang = 'hu'
    5
    This works fine.
    I created a new record in the table with the following text:
    Ez itt most egy teszt szöveg.
    If I try to search for the text
    most
    it is not found.
    It is an English stopword,
    see:
    select count(*) from ctx_user_index_values t where t.ixv_value = 'most' and t.ixv_class = 'STOPLIST'
    1
    This word means 'now' in Hungarian and has nothing to do with the English stopword.
    Is it a bug or am I missing something?
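
    The usual Oracle Text remedy for a clash like this is a custom stoplist that simply omits the offending word. A minimal sketch of the mechanism only (the stoplist and index names are hypothetical, and UltraSearch manages its own index, so the ctx_ddl calls below just illustrate the underlying technique):

    begin
      -- build a stoplist that deliberately does NOT contain the Hungarian word 'most'
      ctx_ddl.create_stoplist('my_stoplist', 'BASIC_STOPLIST');
      ctx_ddl.add_stopword('my_stoplist', 'the');
      ctx_ddl.add_stopword('my_stoplist', 'and');
    end;
    /

    -- attach it to an existing Oracle Text index (hypothetical index name)
    alter index my_text_idx rebuild parameters('replace stoplist my_stoplist');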

  • Problem crawling filenames with national characters

    Hi
    I have a big problem with filenames containing national (Danish) characters.
    The documents get an entry in wk$url but have error code 404 (Not found).
    I'm running Oracle RDBMS 9.2.0.1 on Red Hat Advanced Server 2.1. The
    filesystem is mounted on the Oracle server using NFS.
    I configured UltraSearch to crawl the specific directory containing
    several files, two of which contain national characters in their
    filenames. (ls -l)
    <..>
    -rw-rw-r-- 1 user group 13 Oct 4 13:36 crawlertest_linux_2_æøåÆØÅ.txt
    -rw-rw-r-- 1 user group 19968 Oct 4 13:36 crawlertest_windows_æøåÆØÅ.doc
    <..>
    (Since the preview function is not working in my Mozilla browser, I'm
    unable to tell whether or not the national characters will display
    properly in this post. But they represent the lower and upper case forms of the
    three special Danish characters.)
    In the crawler log the following entries are added:
    <..>
    file://localhost/<DIR_PATH>/crawlertest_linux_2_B|C?C%C?C?.txt
    file://localhost/<DIR_PATH>/crawlertest_linux_2_B|C?C%C?C?.txt
    Processing file://localhost/<DIR_PATH>/crawlertest_linux_2_%e6%f8%e5%c6%d8%c5.txt
    WKG-30008: file://localhost/<DIR_PATH>/crawlertest_linux_2_%e6%f8%e5%c6%d8%c5.txt: Not found
    <..>
    file://localhost/<DIR_PATH>/crawlertest_windows_B|C?C%C?C?.doc
    file://localhost/<DIR_PATH>/crawlertest_windows_B|C?C%C?C?.doc
    Processing file://localhost/<DIR_PATH>/crawlertest_windows_%e6%f8%e5%c6%d8%c5.doc
    WKG-30008:
    file://localhost/<DIR_PATH>/crawlertest_windows_%e6%f8%e5%c6%d8%c5.doc:
    Not found
    <..>
    The 'file://' entries look somewhat UTF-encoded to me (some chars are
    missing because they are not printable) and the others look URL
    encoded.
    All other files in the directory seem to process just fine.
    In the wk$url table the following entries are added:
    (select status, url from wk$url where url like '%crawlertest%';)
    404 file://localhost/<DIR_PATH>/crawlertest_linux_2_%e6%f8%e5%c6%d8%c5.txt
    404 file://localhost/<DIR_PATH>/crawlertest_windows_%e6%f8%e5%c6%d8%c5.doc
    Just for testing purposes a
    SELECT utl_url.unescape('%e6%f8%e5%c6%d8%c5') from dual;
    actually produces the expected result: æøåÆØÅ
    To me this indicates that the filesystem-scanning part of the
    crawler can see the files, but the processing part of the crawler can
    not open the files for reading and therefore fails with error 404.
    Since the crawler (to my knowledge) is written in Java, I did some
    experiments with the following Java program.
    import java.io.*;

    class filetest {
        public static void main(String args[]) throws Exception {
            try {
                // list the directory and report which files are readable
                String dirname = "<DIR_PATH>";
                File dir = new File(dirname);
                File[] fs = dir.listFiles();
                for (int idx = 0; idx < fs.length; idx++) {
                    if (fs[idx].canRead()) {
                        System.out.print("Can Read: ");
                    } else {
                        System.out.print("Can NOT Read: ");
                    }
                    System.out.println(fs[idx]);
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
    The behaviour of this program depends heavily on the language
    settings of the current shell (under Linux). If LC_ALL is set to "C"
    (which is a common default), the program can only read files whose
    filenames do NOT contain national characters (just like the UltraSearch
    crawler). If LC_ALL is set to e.g. "en_US", it is capable of
    reading all the files.
    I therefore tried to set the LC_ALL environment for the oracle user on
    my Oracle server (using locale_config and .bash_profile), but that did
    not seem to fix the problem at hand.
    So (finally) my question is: is this a bug in the UltraSearch crawler,
    or simply a misconfiguration of my execution environment? If the
    latter, how do I configure my system correctly?
    Yours sincerely
    Martin Dahl Pedersen, Visanti ( mdp at visanti dot com )

    I posted my problem as a TAR on METALINK about a week ago,
    and it turns out to be a new bug in UltraSearch.
    It is now filed under BUG:2673282
    -- mdp

  • Manual Exporting of BLOB specific table to text file

    Hi,
    Our application has 60,000 records in a table with a BLOB column.
    My requirement is to export the entire table data to text files.
    When I tried converting the BLOB to a string and writing it to a file, it took almost 3 minutes for 100 records, so exporting all of the data from the BLOB table will take a very long time.
    I am using the following logic,
    byte[] bdata = blob.getBytes(1, (int)blob.length());
    String data1 = new String(bdata);
    buffer.append(data1);
    Can anyone please tell me how I can speed up this operation?

    >
    Our application has 60,000 records in a table with a BLOB column.
    My requirement is to export the entire table data to text files.
    >
    Welcome to the forum!
    If you are looking for a pure Java solution you should post this question in the JDBC forum.
    https://forums.oracle.com/forums/category.jspa?categoryID=288
    Unless your BLOB data is stored inline, you are going to use Oracle to read 60,000 LOBs, one at a time, and then use Java to write 60,000 files, one at a time.
    That will be a very slow process.
    Also - your Java code is only going to read the LOB locator and any inline BLOB data, since you are not getting and processing the actual stream.
    See 'Reading and Writing BLOB, CLOB and NCLOB Data' in the JDBC Dev Guide for details
    http://docs.oracle.com/cd/B28359_01/java.111/b31224/oralob.htm#sthref756
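
    As an alternative to streaming through JDBC, the export can also be done server-side with UTL_FILE and DBMS_LOB. A minimal sketch, assuming a directory object EXP_DIR and a table lob_tab(id, data) (all hypothetical names):

    declare
      l_file   utl_file.file_type;
      l_buf    raw(32767);
      l_amount binary_integer;
      l_pos    integer;
    begin
      for r in (select id, data from lob_tab) loop
        -- one output file per row, written in binary chunks
        l_file := utl_file.fopen('EXP_DIR', 'row_' || r.id || '.txt', 'wb', 32767);
        l_pos := 1;
        begin
          loop
            l_amount := 32767;
            dbms_lob.read(r.data, l_amount, l_pos, l_buf);  -- raises NO_DATA_FOUND past the end
            utl_file.put_raw(l_file, l_buf, true);
            l_pos := l_pos + l_amount;
          end loop;
        exception
          when no_data_found then null;                     -- finished this LOB
        end;
        utl_file.fclose(l_file);
      end loop;
    end;
    /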

  • ORA-22275: invalid LOB locator specified error while loading BLOBs/CLOBs.

    Hello All,
    I am trying to load BLOB/CLOB data from a client Oracle DB into the Oracle DB on our side. We are using ODI version 10.1.3.5.6, which reportedly has the issue of loading BLOBs/CLOBs solved. I am using
    The extraction fails in the loading stage when inserting data into the C$ table with the following error.
    "22275:99999 :java.sql.BatchUpdateException:ORA-22275:Invalid LOB locator specified".
    Kindly let me know how I can resolve this issue, as the requirement to load this data is very urgent.
    Thanks,
    John

    One alternative can be done outside of ODI, as ODI is still not able to resolve this issue. You can split these CLOB/BLOB fields into smaller pieces, push the data into ODI as separate fields, and concatenate them again at the reporting end.
    Maybe this will solve your problem... it solved mine.
    --XAT
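
    A sketch of the splitting idea in plain SQL, assuming a source table src_table with a CLOB column descr_clob (hypothetical names): dbms_lob.substr returns plain VARCHAR2 chunks that ODI can move like ordinary columns, and the pieces can be concatenated again on the reporting side.

    select id,
           dbms_lob.substr(descr_clob, 4000, 1)    as descr_part1,
           dbms_lob.substr(descr_clob, 4000, 4001) as descr_part2
    from   src_table;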

  • Display BLOB/CLOB image data in OBIEE 11g

    Dear Gurus,
    I want to display the employee profile picture on a dashboard; the image data is saved as a BLOB/CLOB.
    How do I configure and display it in OBIEE?
    Regards
    JOE

    Hi
    Thanks for your quick response.
    actually I'm not using EBS, I'm just using BI Publisher with OBIEE 11g.
    To explain step by step what you wrote in your first reply:
    Let's say I have a table with one BLOB column.
    1) first load the blob data into my table
    2) create a data model in Oracle BI Publisher.
    3) change the xdm file with the tagline above; my xdm file is at the path below:
    C:\mw_bi_home\instances\instance1\bifoundation\OracleBIPresentationServicesComponent\coreapplication_obips1\catalog\SampleAppLite\root\shared\zia%2exdm
    I don't know if those are the right steps, because if I re-open the xdm file after changing it with your tagline it shows an error.
    I would be grateful if you could give me a little explanation of which file I should put the following tagline in:
    _<fo:instream-foreign-object content-type="image/jpg"><xsl:value-of select=".//IMAGE_ELEMENT"/></fo:instream-foreign-object>_
    Thanks again; waiting for your feedback -
    @zia

  • National characters problem

    Hi.
    I'm using AE on XE 10.2.0.1.0
    I have a problem typing national characters, e.g. in an updatable report's Column Heading (Custom) attribute. If I type the heading name "Ilość" and then push "Apply changes", the name is saved without the national characters, as "Ilosc".
    Why is this happening?
    Should I change settings in the application, or in the database?
    Should I use another browser (currently SeaMonkey)?
    I have downloaded "Oracle Database 10g Express Edition (Western European)".
    Should I download and use "Oracle Database 10g Express Edition (Universal)" instead?
    My APP globalization parameters:
    Application Primary Language      : Polish (pl)
    Application Language Derived From: Application Preference (using FSP_LANGUAGE_PREFERENCE)
    Automatic CSV Encoding: no
    My DB NLS settings :
    NLS_CALENDAR     GREGORIAN
    NLS_CHARACTERSET     WE8MSWIN1252
    NLS_COMP     BINARY
    NLS_CURRENCY     zl
    NLS_DATE_FORMAT     RR/MM/DD
    NLS_DATE_LANGUAGE     POLISH
    NLS_DUAL_CURRENCY     zl
    NLS_ISO_CURRENCY     POLAND
    NLS_LANGUAGE     POLISH
    NLS_LENGTH_SEMANTICS     BYTE
    NLS_NCHAR_CHARACTERSET     AL16UTF16
    NLS_NCHAR_CONV_EXCP     FALSE
    NLS_NUMERIC_CHARACTERS     ,
    NLS_SORT     POLISH
    NLS_TERRITORY     POLAND
    NLS_TIME_FORMAT     HH24:MI:SSXFF
    NLS_TIMESTAMP_FORMAT     RR/MM/DD HH24:MI:SSXFF
    NLS_TIMESTAMP_TZ_FORMAT     RR/MM/DD HH24:MI:SSXFF TZR
    NLS_TIME_TZ_FORMAT     HH24:MI:SSXFF TZR

    N'<national symbols>', being part of an SQL statement, will be converted to the database character set (WE8ISO8859P1) before being parsed. Only if the client and the database are both 10.2 or higher, the client can encode the literal appropriately so that it survives this conversion.
    In earlier versions, you can do the encoding yourself. Instead of the N'<national symbols>' literal use the UNISTR function: UNISTR('\xxxx\yyyy\zzzz'), where U+xxxx, U+yyyy, U+zzzz are Unicode code points of your national characters.
    -- Sergiusz
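
    To illustrate the UNISTR syntax described above: ś is U+015B and ć is U+0107, so the heading "Ilość" from the question can be written as a Unicode literal (whether the value survives storage still depends on the column and database character sets):

    SELECT UNISTR('Ilo\015B\0107') AS heading FROM dual;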

  • Problem with national characters on windows client

    Hello there,
    I'm having a problem with national characters on a Windows client.
    All national data is stored in NVARCHAR2 columns; applications (.NET) work fine,
    but in sqlplus:
    select city from test_table;
    - everything ok, sqlplus shows national characters
    select dump(N'<national symbols>') from dual
    - returns
    Typ=96 Len=12: 0,191,0,191,0,191,0,191,0,191,0,191
    select * from test_table where city = N'<national symbols> '
    - always returns nothing
    As I understand it, the problem is the conversion of the SQL query text (and national literals) to the server's "WE8ISO8859P1" encoding. Is it possible to solve the issue?
    Thanks in advance
    PS.
    The console is in the right code page (chcp 1251)
    and sqlplus shows Russian messages correctly.
    Server (oracle 9 on solaris):
    select * from nls_database_parameters
    NLS_NCHAR_CHARACTERSET AL16UTF16
    NLS_SAVED_NCHAR_CS WE8ISO8859P1
    NLS_LANGUAGE AMERICAN
    NLS_TERRITORY AMERICA
    NLS_CURRENCY $
    NLS_ISO_CURRENCY AMERICA
    NLS_NUMERIC_CHARACTERS .,
    NLS_CHARACTERSET WE8ISO8859P1
    NLS_CALENDAR GREGORIAN
    NLS_DATE_FORMAT DD-MON-RR
    NLS_DATE_LANGUAGE AMERICAN
    NLS_SORT BINARY
    NLS_TIME_FORMAT HH.MI.SSXFF AM
    NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
    NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZH:TZM
    NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZH:TZM
    NLS_DUAL_CURRENCY $
    NLS_COMP BINARY
    NLS_LENGTH_SEMANTICS BYTE
    NLS_NCHAR_CONV_EXCP FALSE
    NLS_RDBMS_VERSION 9.2.0.6.0
    Client (windows server 2003, oracle client 10):
    NLS_LANG = RUSSIAN_CIS.CL8MSWIN1251

    N'<national symbols>', being part of an SQL statement, will be converted to the database character set (WE8ISO8859P1) before being parsed. Only if the client and the database are both 10.2 or higher, the client can encode the literal appropriately so that it survives this conversion.
    In earlier versions, you can do the encoding yourself. Instead of the N'<national symbols>' literal use the UNISTR function: UNISTR('\xxxx\yyyy\zzzz'), where U+xxxx, U+yyyy, U+zzzz are Unicode code points of your national characters.
    -- Sergiusz
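
    Applying that to the query from the question: if the city being searched for were, say, 'Москва' (a hypothetical value), the N'...' literal could be replaced by UNISTR built from the Unicode code points of the Cyrillic letters:

    select * from test_table
     where city = UNISTR('\041C\043E\0441\043A\0432\0430');  -- 'Москва'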

  • Can't read some national characters.. instead $..&.. please help!

    Hello Gurus,
    How can I read national characters?
    There is a table imported from an MSSQL database, and when I run a query it doesn't display the national characters; instead it displays symbols like $, etc.
    Do I have to change the character set (which one, and how?) or the national character set, or is there another way?
    Please help me understand this, because it is confusing.
    Regards!!!

    Try setting the NLS_LANG environment variable on the client. Do the following:
    SQL> select * from nls_database_parameters
      2* where parameter in ('NLS_LANGUAGE', 'NLS_TERRITORY', 'NLS_CHARACTERSET')
    SQL> /
    PARAMETER                      VALUE
    NLS_LANGUAGE                   AMERICAN
    NLS_TERRITORY                  AMERICA
    NLS_CHARACTERSET               WE8ISO8859P15
    SQL>
    Then at the OS level, set NLS_LANG as follows (Linux syntax):
    $ export NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P15
    Of course, change the values to match yours. On Windows:
    C:\> set NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P15
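
    Before changing NLS_LANG, it is also worth checking whether the data was stored correctly in the first place or was already corrupted during the import from MSSQL. A quick diagnostic, using hypothetical table and column names:

    SELECT some_column, DUMP(some_column, 1016) FROM imported_table WHERE ROWNUM <= 5;
    -- 1016 = hexadecimal byte dump plus the character set name of the column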

  • Query to read XML from CLOB table column

    Hi
    I want a SQL query to extract the following information from a CLOB table column.
    MasterReport/sg:RptDef/sg:RptCell@RealDesc MasterReport/sg:RptDef/sg:RptCell@RealNum
    credits                              100
    debits                              100
    Sample XML data from table column is:
    <?xml version="1.0" encoding="UTF-8" ?>
    <MasterReport xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:sg="http://www.oracle.com/fsg/2002-03-20/" xsi:schemaLocation="http://www.oracle.com/2002-03-20/fsg.xsd">
    <sg:LDGName>Vision Portugal</sg:LDGName>
    <sg:SOBName>Vision Portugal</sg:SOBName>
    <sg:DataAccessSetName>Vision Portugal</sg:DataAccessSetName>
    <sg:InternalReportName>Model 30 Report</sg:InternalReportName>
    <sg:CustomParam10 />
    <sg:RowContext RowId="r100001">
    <sg:RowName />
    <sg:RowLineItem>Litigation Credits- Total amount from previous period</sg:RowLineItem>
    <sg:RowDispUnit>1</sg:RowDispUnit>
    <sg:RowDispFormat />
    <sg:RowUnitOfMeasure>EUR</sg:RowUnitOfMeasure>
    <sg:RowLedgerCurrency>ANY</sg:RowLedgerCurrency>
    <sg:RowCurrencyType>T</sg:RowCurrencyType>
    <sg:RowChangeSign>0</sg:RowChangeSign>
    <sg:RowSeq>1.0000000000000</sg:RowSeq>
    </sg:RowContext>
    <sg:RowContext RowId="r100002">
    <sg:RowName />
    <sg:RowLineItem>Litigation credits- Taxed amounts from column2 for Previous period</sg:RowLineItem>
    <sg:RowDispUnit>1</sg:RowDispUnit>
    <sg:RowDispFormat />
    <sg:RowUnitOfMeasure>EUR</sg:RowUnitOfMeasure>
    <sg:RowLedgerCurrency>ANY</sg:RowLedgerCurrency>
    <sg:RowCurrencyType>T</sg:RowCurrencyType>
    <sg:RowChangeSign>0</sg:RowChangeSign>
    <sg:RowSeq>2.0000000000000</sg:RowSeq>
    </sg:RowContext>
    <sg:ColContext ColId="c1000">
    <sg:ColAmountType />
    <sg:ColPeriod />
    <sg:ColPerOffset />
    <sg:ColChangeSign />
    <sg:ColPosition />
    <sg:ColSeq />
    <sg:ColWidth>100</sg:ColWidth>
    </sg:ColContext>
    <sg:ColContext ColId="c1001">
    <sg:ColName>Total</sg:ColName>
    <sg:ColDescr />
    <sg:ColDispUnit>1</sg:ColDispUnit>
    <sg:ColUnitOfMeasure>EUR</sg:ColUnitOfMeasure>
    <sg:ColLedgerCurrency>ANY</sg:ColLedgerCurrency>
    <sg:ColCurrencyType>T</sg:ColCurrencyType>
    <sg:ColDispFormat>999999999.99</sg:ColDispFormat>
    <sg:ColAmountType>YTD-Actual</sg:ColAmountType>
    <sg:ColPerOffset>0</sg:ColPerOffset>
    <sg:ColAmntId>14</sg:ColAmntId>
    <sg:ColParamId>-1</sg:ColParamId>
    <sg:ColType>A</sg:ColType>
    <sg:ColStyle>B</sg:ColStyle>
    <sg:ColPeriod>10-08</sg:ColPeriod>
    <sg:ColPeriodYear>2008</sg:ColPeriodYear>
    <sg:ColPeriodNum>11</sg:ColPeriodNum>
    <sg:ColPeriodStart>2008-10-01T00:00:00</sg:ColPeriodStart>
    <sg:ColPeriodEnd>2008-10-31T00:00:00</sg:ColPeriodEnd>
    <sg:ColChangeSign>0</sg:ColChangeSign>
    <sg:ColHeadLine1>Totals</sg:ColHeadLine1>
    <sg:ColHeadLine2 />
    <sg:ColHeadLine3 />
    <sg:ColHeadLine4 />
    <sg:ColHeadLine5 />
    <sg:ColHeadLine6 />
    <sg:ColHeadLine7 />
    <sg:ColHeadLine8 />
    <sg:ColHeadLine9 />
    <sg:ColPosition>99</sg:ColPosition>
    <sg:ColSeq>1.0000000000000</sg:ColSeq>
    <sg:ColWidth>14</sg:ColWidth>
    </sg:ColContext>
    <sg:RptDef RptId="p1001" RptDetName="Ledger=Vision PT (Vision Portugal)" RptPESegm="" RptPEVal="" RptTabLabel="Output 1 (Vision PT)">
    <sg:RptLine RptCnt="p1001" RowCnt="r100001" LineRowSeq="1.0000000000000" LinCnt="l100001">
    <sg:RptCell ColCnt="c1000" RealDesc="debits">debits</sg:RptCell>
    <sg:RptCell ColCnt="c1001" RealNum="100.000000">100.00</sg:RptCell>
    </sg:RptLine>
    <sg:RptLine RptCnt="p1001" RowCnt="r100002" LineRowSeq="2.0000000000000" LinCnt="l100002">
    <sg:RptCell ColCnt="c1000" RealDesc="creditsd">credits</sg:RptCell>
    <sg:RptCell ColCnt="c1001" RealNum="100.000000">100.00</sg:RptCell>
    </sg:RptLine>
    </sg:RptDef>
    <sg:TabCount>1</sg:TabCount>
    </MasterReport>
    Please help me.
    Regards
    Giri
    Edited by: user576087 on Mar 18, 2012 11:54 PM

    I'm not sure if you want the values from the attribute or the element, but this should give you a good start :
    SQL> alter session set nls_numeric_characters = ".,";
    Session altered
    SQL>
    SQL> select x.*
      2  from my_table t
      3     , xmltable(
      4         xmlnamespaces('http://www.oracle.com/fsg/2002-03-20/' as "sg")
      5       , '/MasterReport/sg:RptDef/sg:RptLine'
      6         passing xmltype(t.xmldoc)
      7         columns type    varchar2(30) path 'sg:RptCell[1]'
      8               , amount  number       path 'sg:RptCell[2]'
      9       ) x
    10  ;
    TYPE                               AMOUNT
    debits                                100
    credits                               100
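
    If it is the attribute values (RealDesc / RealNum) that are wanted rather than the element text, only the COLUMNS paths need to change; a sketch based on the same query:

    select x.*
    from   my_table t
         , xmltable(
             xmlnamespaces('http://www.oracle.com/fsg/2002-03-20/' as "sg")
           , '/MasterReport/sg:RptDef/sg:RptLine'
             passing xmltype(t.xmldoc)
             columns type    varchar2(30) path 'sg:RptCell[1]/@RealDesc'
                   , amount  number       path 'sg:RptCell[2]/@RealNum'
           ) x;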

  • More than 255 characters in a table text field

    Dear experts,
    I am facing a problem (in WD ALV too): I cannot display more than 255 characters in a single text field.
    I want to display a table containing a description field without a limitation on its length. As soon as I provide a (formatted) string longer than 255 characters, no interactive form is shown on screen.
    Debugging for a while, the following error message occurs:
    ADS: com.adobe.ProcessingException: com.adobe.ProcessingException: XMLFM Exception - PDF render operation exception, reason code: 0 : InvalidXDPException: Xml parsing error: reference to invalid character number (error code 14) ...
    Does anybody have problems similar to mine?
    Did anyone resolve the issue of showing more than 255 characters in a table in an interactive form?
    Regards,
    Florian Royer
    Edited by: Florian Royer on Feb 11, 2010 2:48 PM

    CALL METHOD lr_service_manager->retrieve
            EXPORTING
              iv_bo_name       = 'cPro_Project' "lv_bo_name "cPro_Project
    *      iv_bo_name      = cl_dpr_api_co=>sc_bo_cprojects "
              iv_bo_node_name =  'Longtext.Root' "lv_bo_node_name "Longtext.Root
              it_keys         = lt_ltext_key
              iv_edit_mode    = '0' "iv_edit_mode "0
            IMPORTING
              et_data         = lt_longtext_mast
              et_failed_keys  = lt_ltext_key_fail.
          READ TABLE lt_longtext_mast INTO ls_longtext_mast INDEX 1.
          MOVE ls_longtext_mast-longtext TO ls_action_item-zz_description.
    This is how I get the text with formatting (line feeds).
    zz_description is of type string.
    My table is on a page, wrapped in a subform, and zz_description is a text field.
    Yes, I set "allow multiple lines" and did not limit the length anywhere.
    The problem arises in the portal when pressing the preview button of a ZFORM; with a string of less than 255 characters, everything works fine.
    Edited by: Florian Royer on Feb 11, 2010 3:10 PM

  • Yes, we can store BLOB, CLOB

    Hi,
    it is possible to store BLOB, CLOB and BFILE data in Oracle8 and above.
    For this you can use the DBMS_LOB package.
    regards,
    khaleel
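
    As a minimal sketch of what is described above (the table and column names are hypothetical): the row is created with empty LOB locators and the content is then written through DBMS_LOB.

    create table docs (id number primary key, body clob, img blob);

    declare
      l_clob clob;
    begin
      insert into docs (id, body, img)
      values (1, empty_clob(), empty_blob())
      returning body into l_clob;
      -- append text to the CLOB through the DBMS_LOB package
      dbms_lob.writeappend(l_clob, length('Hello LOB world'), 'Hello LOB world');
      commit;
    end;
    /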

    How do you propose to do that? You are limited to 255 bytes per cell in Excel. If you come up with a scheme to lay down blocks of data and some way to uniquely ID them, then yes, you can; but you have to overcome that 255 byte boundary per cell before you can do much.
    A CLOB can be stored as a string, but once again... there is that max character boundary to overcome in Excel.

  • Inserting images stored as BLOB in table to Oracle Report(10G)

    We have some Oracle Reports (RDF) built in Oracle 10g. Now we are planning to update these reports to include images stored as PDFs on the file system. We have loaded these PDFs as BLOBs into tables. Is there a way to include these images in the reports?
    Thx.

    Easy answer: No, this is not possible. You can't show a PDF inside your report.
    Since you are talking about just images, why do you even store these images as PDFs? Why not as JPG or PNG (if you need the best image quality)?
    Using a normal image in a report would be no problem at all.
    Complex answer: Yes, this could be possible with an enormous effort.
    You could write yourself a Java bean which reads the BLOB, uses something like "iText" to convert the PDF to a normal image, and then displays this image.
    I recommend not using this solution, since it introduces a really big complexity into your report, where the usual approach of just storing images as images rather than PDFs would give you a better result (and much better performance).
    Regards
    Markus
