Charset or UTF-8 problems

Is there any solution for displaying Hungarian (or other) special characters in an HTML DB application (or in an HTML page generated by dynamic PL/SQL)?
Vasek

I don't think this is an HTML DB problem; if you are using the default XE database, which is created with the Western European Latin-1 character set, it will not be able to store and process accented Hungarian characters correctly. You should see the same problem in SQL*Plus too.
There will be a Universal language version of XE (created using UTF-8 as the database character set) available in the production release of Oracle Database 10g Express Edition.
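To confirm which character set a given database was created with, the standard NLS data dictionary view can be queried (a minimal sketch, assuming SELECT access to the view; works in XE as well):

```sql
SELECT parameter, value
  FROM nls_database_parameters
 WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');
```

If the first row reports a Latin-1 variant such as WE8MSWIN1252, the database cannot store Hungarian accented characters losslessly.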

Similar Messages

  • OC4J 9.0.2.0.1 UTF-8 problems

    We were using OC4J 1.0.2.2.1 with default-charset="UTF-8" in global-application.xml, and UTF-8 was working fine.
    We recently upgraded to 9.0.2.0.1, and the same setting is no longer working. We also tried setting the default-charset in application-deployments/**/web/orion-web.xml. It did not work.
    Please respond if anyone is experiencing the same problem and knows of a workaround.
    Thanks

    Srinivas,
    Are you seeing this with servlets, JSPs, or both? We recently got a similar problem reported with JSPs.
    thanks,
    -Prasad
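    For reference, the setting being discussed is the default-charset attribute on the root element of the web descriptor; a minimal sketch of the relevant fragment (the rest of the descriptor is elided):

```xml
<!-- orion-web.xml or global-web-application.xml -->
<orion-web-app default-charset="UTF-8">
    <!-- ... the rest of the web application configuration ... -->
</orion-web-app>
```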

  • Robohelp HTML, JBoss: UTF-8 Problem

    Hello
    We use Robohelp 8 to create a web help for our JBOSS based project. Unfortunaltely, all files that robohelp creates use UTF-8, but the default of our JBOSS is ISO-8859-1.
    So, if I open the help using Firefox in the web container I receive only "" and the encoding in the brower is set to ISO. IE8 can display the help in a correct way, I suppose this browser recognize the different character encoding.
    The only thing I can do to make it work in Firefox is to change the RoboHelp output HTML files and add these lines to each(!) .htm file of the help:
    <?xml version="1.0"?>
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
    To show German special characters like ä, ö, ü I also have to remove the charset in the <meta> tag of the output files...
    But every time I regenerate the output I have to redo this. Is there some way to change the character encoding in my RoboHelp source? Unfortunately it seems impossible to change these things in master files or RoboHelp pages in the UI.
    Has somebody a workaround?
    Best regards
    Stefan

    Hi, Stefan. Yes, others have seen this problem - specifically with your ISO charset.
    There isn't a fix in RoboHelp8. The solution appears to be to enable UTF-8 on your application server.
    There are several posts about the problem on this forum:
    http://forums.adobe.com/message/2095270#2095270
    http://forums.adobe.com/message/961768#961768
    These two seem most useful, but if you're really technical, you can search for ISO-8859-1 on the Adobe Forums. If you want just RoboHelp posts, enter "ISO-8859-1 +RoboHelp".
    HTH,
    Elisa
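    If the JBoss response header carries no explicit charset in its Content-Type (browsers then fall back to the page itself), declaring the encoding in the generated HTML can also help; this is the same tag other threads in this collection use:

```html
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
```

    Note this is only a partial workaround: if the server does send an explicit ISO-8859-1 charset in the HTTP header, that header takes precedence over the meta tag.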

  • More Cf + MySQL 5 + Unicode/UTF-8 Problems

    Here is the problem:
    I am using a MySQL database that stores data in Unicode/UTF-8 (the website/database are in Lao).
    Settings:
    CF 7.0.2
    MySQL 5.0.26
    MySQL defaults: latin1_swedish_ci collation, latin1 encoding
    Database defaults: utf8_general_ci collation, utf8 encoding
    These are the same on my local computer and on the host (HostNexus). The only difference is that my CF uses the mysql-connector-java-3.1.10 driver while the host uses a MySQL 3.x driver (org.gjt.mm.mysql.Driver class).
    On my local computer everything works just fine, even without any extra CF DSN settings (i.e. the connection string and/or JDBC URL do not need useUnicode=true&characterEncoding=UTF-8 added to show Lao text properly).
    On the host, even with useUnicode=true&characterEncoding=UTF-8 added (I have even tried adding &connectionCollation=utf8_unicode_ci&characterSetResults=utf8 to the line as well), I only get ??????? instead of Lao text from the db.
    The cfm pages have <cfprocessingdirective> and <cfcontent> tags set to utf-8 and also have the html <meta> set to utf-8. All static Lao text in cfm pages shows just fine.
    Is this a problem with the MySQL driver used by the host? Has anyone encountered this before? Is there some other setting I have to employ with the org.gjt.mm.mysql.Driver class on the host?
    Please help!

    Thanks for your reply/comments, Paul!
    I also think it must be the db driver used on the host... I just don't understand why the DSN connection string (useUnicode=true&characterEncoding=UTF-8 [btw, it doesn't really matter whether it's utf8 or UTF-8 - it works with both; I think the proper way is actually UTF-8, since that is the encoding's name used in Java...]) wouldn't work with it??? I have the hosting tech support totally puzzled over this.
    Don't know if you can help any more, but I have added answers to your questions in the quoted text below.
    quote:
    Sabaidee wrote:
    > Here is the problem: I am using a MySQL database that stores data in Unicode/UTF-8 (the website/database are in Lao).
    well that's certainly different.
    I mean, they are in the Lao language, not that they are hosted in Laos.
    > Database defaults: utf8_general_ci collation, utf8 encoding
    how was the data entered? how was it uploaded to the host? could the data have been corrupted loading or uploading to the host?
    The data was entered locally, then dumped into a .sql file using the utf8 charset, and then the dump was imported into the db on the host, once again with the utf8 charset. I know the data in the database is correct: when I browse the db tables with phpMyAdmin, all Lao text in the db is displayed in proper Lao...
    > The only difference is that my CF uses the mysql-connector-java-3.1.10 driver while the host uses a MySQL 3.x driver (org.gjt.mm.mysql.Driver class).
    and does that driver support mysql 5 and/or unicode?
    I am sure it does support MySQL 5, as I have other MySQL 5 databases hosted there and they work fine. I am not sure if it supports Unicode, though... I am actually more and more sure it does not... The strange thing is, I am not able to find the java class that driver is stored in, to try and test using it on my local server... I have asked the host to email me the .jar file they are using, but have not heard back from them yet...
    > On my local computer everything works just fine, even without any extra CF DSN settings (i.e. the connection string and/or JDBC URL do not need useUnicode=true&characterEncoding=UTF-8 added to show Lao text properly).
    and what happens if you do use those? what locale for the local server?
    If I use just that line, nothing changes (apart from the 2 mysql variables, which then default to utf8 instead of latin1) - everything works just fine locally.
    The only difference I have noticed between the MySQL setup on my local comp and on the host is that on my comp the character_set_results var is not set (shows [empty string]), but on the host it is set to latin1. When I set it to latin1 on my local comp using &characterSetResults=ISO8859_1 in the JDBC URL string, I get exactly the same problem as on the host: all ??????? instead of Lao text from the db. If it is not set, or set to utf8, everything works just fine.
    For some reason, we are unable to make it work on the host: whatever you add to the JDBC URL string or enter in the Connection String box in CF Admin is simply ignored...
    Do you know if this is a particular problem of the driver used on the host?
    > The cfm pages have <cfprocessingdirective> and <cfcontent> tags set to utf-8 and also have the html <meta> set to utf-8. All static Lao text in cfm pages shows just fine.
    db driver then.
    I think so too...
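    Pulling together the parameters mentioned in the thread, the JDBC URL being attempted on the host looks like this (host, port, and database name are placeholders):

```
jdbc:mysql://localhost:3306/mydb?useUnicode=true&characterEncoding=UTF-8&connectionCollation=utf8_unicode_ci&characterSetResults=utf8
```

    The experiment above with &characterSetResults=ISO8859_1 reproducing the ??????? output locally strongly suggests the host's character_set_results=latin1 setting, combined with a driver that ignores these URL parameters, is the culprit.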

  • Display UTF-8 problem

    configuration:
    1. HP-UX 11
    2. Oracle 8.1.7
    3. WebLogic 6 sp1
    Our application needs to support English, Big5 Chinese and Portuguese, and I have set NLS_CHARACTERSET=UTF8 and NLS_NCHAR_CHARACTERSET=UTF8 in Oracle.
    I have also set NLS_LANG=AMERICAN_AMERICA.UTF8 in the shell that starts up WebLogic.
    However, you can only read English and Portuguese in the jsp file; the Chinese characters cannot be displayed correctly.
    I have tried using WebLogic 5.1 sp9 to retrieve data from Oracle, and the characters displayed correctly.
    Is there any problem with my settings or my code?
    Driver myDriver = (Driver) Class.forName("weblogic.jdbc.oci.Driver").newInstance();
    Properties props = new Properties();
    props.put("weblogic.oci.min_bind_size", "660");
    props.put("weblogic.codeset", "UTF8");
    props.put("user", "oracle");
    props.put("password", "oracle");
    props.put("server", "devdb");
    try {
        Connection conn = myDriver.connect("jdbc:weblogic:oracle", props);
        Statement stmt = conn.createStatement();
        stmt.execute("select * from test");
        ResultSet rs = stmt.getResultSet();
        while (rs.next()) {
            out.println(" loginid = " + rs.getString("loginid"));
            out.println("<br>");
        }
    } catch (Exception e) {
        out.println(e);
    }

    Do you have the following page directive in the jsp files?
    <%@ page contentType="text/html; charset=utf-8" %>
    "shall" <[email protected]> wrote in message
    news:[email protected]..
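    The mismatch described above (UTF-8 data decoded with the wrong codeset somewhere between the database and the page) can be reproduced in plain Java, independent of WebLogic or Oracle; a minimal sketch on a modern JVM:

```java
import java.nio.charset.Charset;

public class CodesetMismatch {
    public static void main(String[] args) {
        String original = "\u4e2d\u6587";  // two Chinese characters
        // "UTF8" is the same codeset name the weblogic.codeset property uses;
        // it is a registered alias of UTF-8 in Java.
        byte[] utf8 = original.getBytes(Charset.forName("UTF8"));
        // Correct round trip: decode with the same charset the bytes were written in.
        String good = new String(utf8, Charset.forName("UTF8"));
        System.out.println(original.equals(good));   // true
        // Decoding the same UTF-8 bytes as Latin-1 produces mojibake, which is
        // what a wrong NLS/codeset setting does end to end.
        String bad = new String(utf8, Charset.forName("ISO-8859-1"));
        System.out.println(original.equals(bad));    // false
    }
}
```

    ASCII-only text survives either way, which is why English and Portuguese (mostly ASCII) appear fine while Big5 Chinese does not.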

  • Unable to set Default charset to UTF-8

    So I've added the following to /etc/httpd/conf/httpd.conf:
    AddDefaultCharset utf-8
    Still Firefox reports that my PHP files are using ISO-8859-1, even though the default charset and the files' encoding is utf-8.
    What am I doing wrong? I converted the file to UTF-8 with Notepad++.
    The problem is that Å Ä Ö don't work properly. They never work when echoed from mysql.

    Banton wrote:Have you set <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" /> in html header?
    Yes, but I've created a PHP file with the following code; nothing else is in the file.
    <?php echo 'Å Ä Ö'; ?>
    It does not work; it echoes out weird characters.
    karol wrote:> They never work when echoed from mysql.
    Are the files generated dynamically? If so, you should check the mysql part.
    mysql is set to utf-8.

  • Text track russian (utf-8) problems in windows

    Hi,
    I am working with QT Pro on a Windows 7 PC and need to create several subtitled versions of a movie.
    For European languages everything went fine, but I have problems with Russian, Polish and Turkish.
    When I open my utf-8 encoded *.txt file with QuickTime I only get gibberish characters.
    I spent at least 2 hours searching the internet for similar issues, but couldn't find anything.
    Can anybody help me with this, please?
    thanks a lot!

    The character set in your e-mail headers is set to iso-8859-1 (Western European).  Change that to utf-8.
    $headers = "MIME-Version: 1.0\n"
                ."From: \"".$name."\" <".$email.">\n"
                ."Content-type: text/html; charset=iso-8859-1\n";
    Nancy O.

  • Is there a way to force the charset to utf-8 with the IIS plug-in?

    We're using AJAX. The initial request for the HTML has charset=utf-8 set on the HTTP header as seen in this Live HTTP Headers capture:
    http://plper.mysite.com/mysupport/index.jsf
    GET /mysupport/index.jsf HTTP/1.1
    Host: plper.mysite.com
    User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.14) Gecko/20080404 Firefox/2.0.0.14
    Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5
    Accept-Language: en-us,en;q=0.5
    Accept-Encoding: gzip,deflate
    Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
    Keep-Alive: 300
    Connection: keep-alive
    Cookie: mysiteSiteUrl=http://www.mysite.com; Amysession=aHR0cDovL3BscGVyLmVtYy5jb206ODAv; JSESSIONID=FN3WLTNJFJCfYhHHVrwKvLHF2gGdnnTb11DrCyZqR9YbGhcG28lK!-1728721171; mysession=AAAAAgABAFBy5LRMDmjSRCN%2FByvfquVwFeKCpmES4x9lReRava35fxKfwcbJimb3YyPhEd0vBq7ZxgJVecL475TFZwQuSphLOwRWAQw2t7PEW%2BrxsfxgnQ%3D%3D
    HTTP/1.x 200 OK
    Date: Tue, 10 Jun 2008 18:53:01 GMT
    Server: Microsoft-IIS/6.0
    Cache-Control: no-store,no-cache,must-revalidate, no-cache="set-cookie"
    Pragma: No-cache
    Transfer-Encoding: chunked
    Content-Type: text/html;charset=UTF-8
    Expires: Thu, 01 Jan 1970 00:00:00 GMT
    Set-Cookie: JSESSIONID=09VTLTNWT07LlqnK22jTWwM8y5L9v1rmPf9CTW5TnGGKBvWvjJpP!-1728721171; path=/
    Content-Language: en-US
    X-Powered-By: Servlet/2.5 JSP/2.1
    Subsequent requests do not:
    http://plper.mysite.com/mysupport/index.jsf
    POST /mysupport/index.jsf HTTP/1.1
    Host: plper.mysite.com
    User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.14) Gecko/20080404 Firefox/2.0.0.14
    Accept: application/x-backbase+xml,application/xhtml+xml,application/xml,text/xml,application/x-www-form-urlencoded,*/*;q=0.5
    Accept-Language: en-us,en;q=0.5
    Accept-Encoding: gzip,deflate
    Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
    Keep-Alive: 300
    Connection: keep-alive
    Content-Type: application/x-www-form-urlencoded
    Referer: http://plper.mysite.com/mysupport/index.jsf
    Content-Length: 122
    Cookie: mysiteSiteUrl=http://www.mysite.com; Amysession=aHR0cDovL3BscGVyLmVtYy5jb206ODAv; JSESSIONID=09VTLTNWT07LlqnK22jTWwM8y5L9v1rmPf9CTW5TnGGKBvWvjJpP!-1728721171; mysession=AAAAAgABAFBRtE5lAyr85YM0aIap%2Bekf1Qu8FoA6BNh4JVl1JgvDNDQgYrQm5m9W%2FQa4HLK767CtXV5c%2FhtXchbug9%2BE1zoCmqSBqqYmqXE9VG1lXi%2F%2Brg%3D%3D
    Pragma: no-cache
    Cache-Control: no-cache
    BackbaseClientDelta=%5Bevt%3DsrQuery%3AsiteList%7Cevent%7Csubmit%5D%5Bvalue%3DsrQuery%3AsiteList%7Cmultivalue%7C3971957%5D
    HTTP/1.x 200 OK
    Date: Tue, 10 Jun 2008 18:58:17 GMT
    Server: Microsoft-IIS/6.0
    Content-Length: 1720
    Content-Type: text/xml
    X-Powered-By: Servlet/2.5 JSP/2.1
    Is there a way to force requests going through the proxy plug-in to get a charset=utf-8 set in the HTTP header for all requests?
    Thanks!

    If for some reason you have failed to maintain a backup copy of your computer (not good), then transfer purchases from your iPod.
    Without syncing, click File > Transfer Purchases.

  • UTF-8 problem in LR 3.4RC, not in 3.3

    I've just found a new issue in LR 3.4RC that was not present in 3.3:
    When exporting images to JPG where the IPTC contains words with umlauts (ä, ö, ü), at first glance the results seem to be the same as in 3.3: the IPTC fields are read the same way as they used to be. But when opening these JPGs in exiftool, you can see the difference: the recently exported JPGs are encoded UTF-8, whereas the JPGs exported from LR 3.3 do not have a UTF-8 flag.
    This issue is a severe problem when creating a web gallery, e.g. with JAlbum, because then the umlauts are not displayed correctly but appear as corrupted letters.
    First I thought this was a JAlbum issue, but after examining the matter it is clear that LR 3.4 has changed the export format from 3.3.
    Is this a bug or a feature?
    Regards
    Thokra

    Hi adda,
    are you German? I am.
    Exactly the same problem here. JAlbum can manage only one kind of character encoding. So, it would be best if we could convert all the formerly exported files to the new standard. I tried simply writing the metadata again with CTRL+S, but no success.
    Does anybody know a method to convert the files without having to retype the hundreds of names with the German characters?
    Regards
    Thokra

  • UTF-8 problem on Mac OS X

    Hello,
    We have a 1.4.1 Java app on Mac OS X.
    This app writes out a text file, encoded using UTF-8, to a Windows folder. The following code is used:
    FileOutputStream fout = new FileOutputStream(<windows path>, false);
    OutputStreamWriter osw = new OutputStreamWriter(fout, "UTF-8");
    osw.write(str, 0, str.length());
    osw.flush();
    fout.flush();
    fout.close();
    A Windows app tries to read that file as UTF-8 and has a problem doing that. When you pull it up in Notepad, it's wrong.
    When we run the same Java app on Windows and write out a UTF-8 encoded file, other Windows apps understand it.
    This only happens when the encoded characters are not in the ASCII range; therefore accented French characters, Japanese characters, etc. are all wrong.
    It seems the OS X Java implementation "decomposes" accented characters, as apparently required by the HFS file system.
    Has anyone else run into this?
    What would be a workaround - or are we doing something wrong here?
    Thanks, any help will be greatly appreciated!!!
    Dave Raskin

    The main problem here seems to be that you are writing a text file. Text files do not contain meta-data and therefore applications like notepad can not tell what encoding was used to write the file and will assume a default encoding.
    Depending on what you are trying to do you could either change the encoding you use to write the text file to match the default encoding used on your target system or use a file format that contains meta-data like rtf, html, pdf, or doc.
    Thomas
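    The "decomposed" form Dave mentions is Unicode NFD. On later Java versions (java.text.Normalizer is public API since Java 6, so not available on the 1.4.1 VM in the original post), the text can be recomposed to NFC before it is written out, so Windows-side tools see the precomposed bytes they expect. A minimal sketch:

```java
import java.text.Normalizer;

public class DecomposedAccents {
    public static void main(String[] args) {
        String precomposed = "\u00e9";   // "é" as a single code point (NFC)
        String decomposed = "e\u0301";   // "e" + combining acute accent (NFD)
        // Same glyph on screen, but different code point sequences, so
        // byte-for-byte comparisons and naive readers disagree.
        System.out.println(precomposed.equals(decomposed));  // false
        // Recomposing to NFC before writing restores the expected form.
        String recomposed = Normalizer.normalize(decomposed, Normalizer.Form.NFC);
        System.out.println(precomposed.equals(recomposed));  // true
    }
}
```

    On 1.4, the practical alternatives are the ones Thomas names: match the target system's default encoding, or use a format that records its own encoding.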

  • UTF-8 Problems in Flex Builder 3

    I created locale properties files containing characters that require UTF-8 encoding, using Flex Builder 3 for Mac OS. Some of the characters displayed incorrectly when viewed through the web browser. Opening the properties files with an editor (BBEdit) produced a warning that the UTF-8 format was corrupted, and the characters did not display correctly in BBEdit. I fixed the format and characters through BBEdit and found they worked and displayed as expected. Opening the properties files in Flex Builder 3 shows the characters correctly, but as soon as I saved a file from Flex Builder the characters stopped displaying. BBEdit again showed that the UTF-8 format was corrupted.
    Does anyone have any suggestions for fixing this problem? Any workarounds?

    FB saves files as ISO-8859-1 by default. Right-click the file, select Properties, then select "Other" in the Text File Encoding area, and then select UTF-8 there.

  • SOAP + UTF-8 problem

    Hi,
    I have this problem: I need to execute a call to a web service, passing it an XML. I write in Spanish, so in my XML there are letters like: àá ÒÓ ...
    I'm trying to encode in UTF-8, because the web service decodes from UTF-8 to ASCII, but I don't know how. I tried with:
    /* theRequest is the XML */
    var cRequest = util.streamFromString(theRequest, "utf-8");
    theRequest = util.stringFromStream(cRequest);
    But I think this is not right - does this encode the request twice?
    The SOAP message that I send to the web service starts like this:
    <?xml version="1.0"?>
    <SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><SOAP-ENV:Header>....
    Is this "<?xml version="1.0"?>" ok?? Should I see encoding="utf-8" there??
    The web service returns me this error:
    <?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><soapenv:Body><soapenv:Fault><faultcode>soapenv:Server.userException</faultcode><faultstring>java.io.UTFDataFormatException: Invalid byte 2 of 4-byte UTF-8 sequence.</faultstring><detail><ns1:hostname xmlns:ns1="http://xml.apache.org/axis/">nut</ns1:hostname></detail></soapenv:Fault></soapenv:Body></soapenv:Envelope>
    Any idea?
    Thanks,
    And sorry for my English :P

    Yes, I create the connection. This is the function:
    theURL is the URL of the service dispatcher connection
    theHeader is the SOAP header
    theRequest is the XML of the pdf
    theAction is the action of the service dispatcher I'm calling.
    function doWS(theURL, theHeader, theRequest, theAction) {
        try {
            var oResponse = "";
            SOAP.wireDump = true;
            /* var cRequest = util.streamFromString(theRequest, "utf-8");
            theRequest = util.stringFromStream(cRequest);
            var cHeader = util.streamFromString(theHeader, "utf-8");
            theHeader = util.stringFromStream(cHeader); */
            var oService = SOAP.connect(theURL);
            if (oService != "[object SOAPService]") {
                oResponse = "ERROR - No connection";
            } else {
                // Create the processRequest parameter
                var processRequest = {
                    soapType: "xsd:string", // anyType
                    soapValue: theRequest
                };
                // Create the action
                var mySOAPAction = theAction;
                var sendHeader = {
                    soapType: "xsd:string",
                    soapValue: theHeader
                };
                // Create the responseHeader parameter
                var responseHeader = {};
                responseHeader[theAction] = {};
                // Do the request
                oResponse = SOAP.request({
                    cURL: theURL,
                    oRequest: processRequest,
                    cAction: mySOAPAction,
                    bEncoded: false,
                    oReqHeader: sendHeader,
                    oResHeader: responseHeader,
                    cResponseStyle: SOAPMessageStyle.XML
                });
            }
            return oResponse;
        } catch (e) {
            console.println("ERROR - " + e);
            return ("ERROR - " + e);
        }
    }
    I'm new at this job, and they used to do it like that, so I suppose this is correct. But they have problems when they work with àáèé...
    I thought Adobe Designer works entirely in utf-8, so I don't understand how the problem could be in the utf encoding.
    Maybe the problem is on the server???
    Thank you so much!

  • SciTE 1.60 UTF-8 problem

    I can't write Russian or Hebrew when I set "File -> Encoding -> UTF-8".
    In Kate, for example, it works properly.

    @leX wrote: I can't write Russian or Hebrew when I set "File -> Encoding -> UTF-8". In Kate, for example, it works properly.
    I can write German umlauts like ü, ä, ö without problems. I also copied a text from the Russian Pravda into it and saw the text correctly.
    My crystal ball says you haven't enabled the Pango engine in SciTE. Instead you stuck with the ordinary GTK text area. To enable it, you need to specify your font settings in your /usr/share/scite/SciTEGlobal.properties (or your local copy of it in ~), putting an exclamation mark in front of the font name.
    Here is a copy of mine:
    # Give symbolic names to the set of fonts used in the standard styles.
    if PLAT_WIN
        font.base=font:Verdana,size:10
        font.small=font:Verdana,size:8
        font.comment=font:Comic Sans MS,size:9
        font.code.comment.box=$(font.comment)
        font.code.comment.line=$(font.comment)
        font.code.comment.doc=$(font.comment)
        font.text=font:Times New Roman,size:11
        font.text.comment=font:Verdana,size:9
        font.embedded.base=font:Verdana,size:9
        font.embedded.comment=font:Comic Sans MS,size:8
        font.monospace=font:Courier New,size:10
        font.vbs=font:Lucida Sans Unicode,size:10
    if PLAT_GTK
        font.base=font:!bitstream vera sans mono,size:10
        font.small=font:!bitstream vera sans mono,size:9
        font.comment=font:!bitstream vera sans mono,italics,size:10
        font.code.comment.box=$(font.comment)
        font.code.comment.line=$(font.comment)
        font.code.comment.doc=$(font.comment)
        font.text=font:!bitstream vera sans mono,size:10
        font.text.comment=font:!bitstream vera sans mono,size:10
        font.embedded.base=font:!bitstream vera sans mono,size:9
        font.embedded.comment=font:!bitstream vera sans mono,size:9
        font.monospace=font:!andale mono,size:10
        font.vbs=font:!bitstream vera sans mono,size:10
    font.js=$(font.comment)
    Try this, HTH
    bye neri

  • No more data to read from socket UTF instance problem

    I'm using oracle jdbc thin driver and SunOne Application Server 7 environment.
    I'm trying to call the stored procedure which has one IN parameter that is of type CLOB.
    My code looks like this:
    conn = DriverManager.getConnection (url, username, password);
    conn.setAutoCommit(false);
    clob = CLOB.createTemporary(conn, true, CLOB.DURATION_SESSION);
    Writer wr = clob.getCharacterOutputStream();
    wr.write(m_data);
    wr.flush();
    wr.close();
         PreparedStatement pstmt = conn.prepareCall(procedureCall);
    pstmt.setClob(1, clob);
         pstmt.execute();
    but when I run it, it throws this (at wr.write(m_data) statement):
    [29/Jan/2003:15:07:25] WARNING ( 9340): CORE3283: stderr: java.io.IOException: No more data to read from socket
    [29/Jan/2003:15:07:25] WARNING ( 9340): CORE3283: stderr: at oracle.jdbc.dbaccess.DBError.SQLToIOException(DBError.java:716)
    [29/Jan/2003:15:07:25] WARNING ( 9340): CORE3283: stderr: at oracle.jdbc.driver.OracleClobWriter.flushBuffer(OracleClobWriter.java:270)
    [29/Jan/2003:15:07:25] WARNING ( 9340): CORE3283: stderr: at oracle.jdbc.driver.OracleClobWriter.write(OracleClobWriter.java:172)
    [29/Jan/2003:15:07:25] WARNING ( 9340): CORE3283: stderr: at java.io.Writer.write(Writer.java:150)
    [29/Jan/2003:15:07:25] WARNING ( 9340): CORE3283: stderr: at java.io.Writer.write(Writer.java:126)
    I tried using this instead of Writer:
    clob.putString(1, m_data);
    but the same error occurs.
    I then tried to do both of these:
    InputStream reader = new StringBufferInputStream(m_data);
    PreparedStatement pstmt = conn.prepareCall(procedureCall);
    pstmt.setUnicodeStream(1, reader, reader.available());
    Reader reader = new StringReader(m_data);
         PreparedStatement pstmt = conn.prepareCall(procedureCall);
    pstmt.setCharacterStream(1, reader, m_data.length());
    but in both cases I got this (at pstmt.setCharacterStream() or pstmt.setUnicodeStream()):
    [29/Jan/2003:16:06:00] WARNING ( 9340): CORE3283: stderr: java.sql.SQLException: Data size bigger than max size for this type: 76716
    [29/Jan/2003:16:06:00] WARNING ( 9340): CORE3283: stderr: at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
    [29/Jan/2003:16:06:00] WARNING ( 9340): CORE3283: stderr: at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:179)
    [29/Jan/2003:16:06:00] WARNING ( 9340): CORE3283: stderr: at oracle.jdbc.ttc7.TTCItem.setArrayData(TTCItem.java:95)
    [29/Jan/2003:16:06:00] WARNING ( 9340): CORE3283: stderr: at oracle.jdbc.dbaccess.DBDataSetImpl.setBytesBindItem(DBDataSetImpl.java:2414)
    [29/Jan/2003:16:06:00] WARNING ( 9340): CORE3283: stderr: at oracle.jdbc.driver.OraclePreparedStatement.setItem(OraclePreparedStatement.java:1134)
    [29/Jan/2003:16:06:00] WARNING ( 9340): CORE3283: stderr: at oracle.jdbc.driver.OraclePreparedStatement.setUnicodeStream(OraclePreparedStatement.java:2633)
    But the greatest mystery of all is that the code with the temporary CLOB works fine when I create the instance using default settings. The problem occurs when I create the instance with a UTF encoding scheme. But we are forced to use a Unicode encoding scheme because of local special characters.
    We are using Oracle 9i on the Solaris UNIX platform and the jdbc drivers supplied with it.
    The CLOB I am trying to pass is an XML file, and it can be up to 400 KB in size.
    Please help. I'm at my wit's end!

    Hi,
    I have a similar problem. This is the code that I used. Can you please help me?
    oracle.sql.CLOB newClob = oracle.sql.CLOB.createTemporary(((org.apache.commons.dbcp.PoolableConnection) con).getDelegate() , true, oracle.sql.CLOB.DURATION_SESSION);
              newClob.open(oracle.sql.CLOB.MODE_READWRITE);
              Writer wr = newClob.getCharacterOutputStream();
              wr.write(valuesXml);
              wr.flush();
              wr.close();
              //newClob.putString(1,valuesXml);
              pst.setClob(1,newClob);
    These are the versions that I use:
    Java version 1.4.2_06
    on Linux - gij (GNU libgcj) version 3.2.3 20030502 (Red Hat Linux 3.2.3-49)
    Oracle version 9.2.0.4.0
    The exception I see is
    java.io.IOException: No more data to read from socket
         at oracle.jdbc.dbaccess.DBError.SQLToIOException(DBError.java:716)
         at oracle.jdbc.driver.OracleClobWriter.flushBuffer(OracleClobWriter.java:270)
         at oracle.jdbc.driver.OracleClobWriter.flush(OracleClobWriter.java:204)

  • Deploy portlet charset set utf-8,but show ????

    hi:
    OS: Windows 2000
    JDev 9051
    App Server 1012
    JHS 9051
    I deployed the portlet for JHeadstart OK, and modified orion-web.xml to contain <orion-web-app default-charset="utf-8" ...>
    and global-web-application.xml to contain <orion-web-app default-charset="utf-8" ...>,
    but Chinese shows up as ???. Please tell me how to set this.

    Hi,
    What is the encoding on your jsp pages? You need to set the encoding to UTF-8
    hope that helps,
    thanks,
    Harsha
