Encoding characters in ISO-8859-2

You say:
You need to set your $ORACLE_HOME environment variable to point to the database of interest.
I mean:
We work under WinNT (SP4) with MS DevStudio 6.0. What can I do about the character encoding?
I put the line <?xml version="1.0" encoding="ISO-8859-2"?> in the XML header, but the Oracle parser reports 'LPX-00201'.
Help, I need somebody's help! Thanks, Matthias

I've found a solution. Yes, the problem was that when I wrote Polish characters into the Excel file, they came out as rubbish because I hadn't set the cell encoding. After fixing the problem my code looks like:
HSSFCell cell = row.createCell((short) 0);
// And below is the line I was looking for
cell.setEncoding(HSSFCell.ENCODING_UTF_16);
cell.setCellValue("Some text with Polish characters");
Now it works great. Thanks anyway.
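For context, here is a minimal sketch of how that line fits into a complete write, assuming the older HSSF API the post is using (later POI releases removed setEncoding and treat cell strings as Unicode automatically); the file name and sample text are placeholders:
import java.io.FileOutputStream;
import org.apache.poi.hssf.usermodel.HSSFCell;
import org.apache.poi.hssf.usermodel.HSSFRow;
import org.apache.poi.hssf.usermodel.HSSFSheet;
import org.apache.poi.hssf.usermodel.HSSFWorkbook;

public class PolishCellDemo {
    public static void main(String[] args) throws Exception {
        HSSFWorkbook wb = new HSSFWorkbook();
        HSSFSheet sheet = wb.createSheet("data");
        HSSFRow row = sheet.createRow((short) 0);
        HSSFCell cell = row.createCell((short) 0);

        // Store the cell text as UTF-16 so characters outside Latin-1
        // (e.g. Polish ą, ś, ż) survive the round trip.
        cell.setEncoding(HSSFCell.ENCODING_UTF_16);
        cell.setCellValue("Zażółć gęślą jaźń"); // sample Polish text

        FileOutputStream out = new FileOutputStream("polish.xls"); // placeholder file name
        wb.write(out);
        out.close();
    }
}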

Similar Messages

  • UTF-8 encoding vs ISO 8859-1 encoding

    The iTunes tech specs call for UTF-8 encoding of the XML feed file; a friend of mine uses feed generator software through his blog that uses ISO 8859 encoding. Is there a way to convert the latter to UTF-8 so that iTunes tags can be added successfully?
    When I tried editing his XML file, I got error messages when I submitted the file to RSS feed validator sites (such as http://feedvalidator.org/). Any help or knowledge is appreciated, because I am not the least bit expert in this coding arena.

    You don't need to convert ISO 8859-1 to UTF-8 unless you have non-ASCII characters. ASCII is a common subset of both encodings, so for plain English text the file will serve you just fine. You can have iTunes tags in the XML file even if the file itself is encoded in ISO 8859-1.
    The error you see at feedvalidator.org is most likely a warning.
    Hope this helps!
    - Andy Kim
    Potion Factory
    http://www.potionfactory.com
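    If the feed really does contain non-ASCII characters, converting it is straightforward; here is a hedged Java sketch (file names are placeholders, and any other transcoding tool would do just as well):
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class FeedToUtf8 {
        public static void main(String[] args) throws Exception {
            // Read the feed as ISO-8859-1 and write it back out as UTF-8.
            byte[] raw = Files.readAllBytes(Paths.get("feed-latin1.xml"));
            String text = new String(raw, StandardCharsets.ISO_8859_1);

            // If the XML declaration names the old encoding, update it to match.
            text = text.replaceFirst("encoding=\"[Ii][Ss][Oo]-8859-1\"", "encoding=\"UTF-8\"");

            Files.write(Paths.get("feed-utf8.xml"), text.getBytes(StandardCharsets.UTF_8));
        }
    }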

  • Flat file transformation to xml with encoding in iso-8859-1

    We have a BPEL process that picks up a flat file (fixed length), transforms it to XML and emails it. The flat file is in ISO-8859-1 (judging by the hex values of special characters like é).
    The basic transformation that you get from the BPEL wizards doesn't do the trick: the special characters come out corrupted and unreadable.
    After that I changed all the encodings (in the XSDs and XSLs) from UTF-8 to ISO-8859-1, but that doesn't help either. I also added an xsl:output section to my XSL (with encoding="iso-8859-1"), but that also fails...
    Does anyone have a clue? We're using SOA Suite 10.1.3.1.
    greetings,
    Jan

    just tried that but the problem is still there
    I suspect that the output of the file adapter (reading the flat file) is already corrupted; you should be able to tell the file adapter what encoding the input file is...

  • Problem when setting encoding to ISO-8859-1

    Hi. Is there something I'm missing when it comes to setting the encoding? Here is the code in the MXML:
    <?xml version="1.0" encoding="ISO-8859-1"?>
    <mx:Application xmlns:mx="http://www.adobe.com/2006/mxml"
        layout="absolute">
    </mx:Application>
    In the HTML template I'm setting:
    <html>
    <head>
    <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1" />
    The code for sending data from the app:
    <mx:HTTPService
        id="service"
        url="encoding.jsp"
        method="post"
        showBusyCursor="true"
        result="onResult(event)"
        resultFormat="xml"
        contentType="application/xml"
        fault="onFault(event)"
        useProxy="false" />

    public function send():void {
        var xml:XML = <Data>{data.text}</Data>;
        service.send(xml);
    }
    But still the response JSP I have made says it receives the data in UTF-8 format. What am I doing wrong?
    Server code:
    doesn't work: BufferedReader reader = new BufferedReader( new InputStreamReader( is ) );
    works: BufferedReader reader = new BufferedReader( new InputStreamReader( is, "UTF-8") );

    Is there no one who has the same problem? I've tried everything imaginable, and still the server side receives the data as UTF-8.
    Is it supported at all to set the encoding of the request?
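    As the poster's own "works" line suggests, the player sends the request body as UTF-8 regardless of the page encoding, so the practical fix is to decode UTF-8 on the server rather than trying to change the request encoding. A minimal servlet sketch under that assumption (the class name and response body are made up):
    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical servlet standing in for encoding.jsp
    public class EncodingServlet extends HttpServlet {
        protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            // Decode the request body explicitly as UTF-8.
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(req.getInputStream(), "UTF-8"));
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) {
                body.append(line).append('\n');
            }

            // Reply in whatever charset the client should see.
            resp.setContentType("text/xml; charset=ISO-8859-1");
            resp.getWriter().write("<ok/>");
        }
    }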

  • Default UTF-8 encode to Iso-8859-1 How to?

    Excuse my English. I have a problem with JCreator 2 Early Access: when I change the encoding of a JSP page from UTF-8 to ISO-8859-1, JCreator restores it to UTF-8 by default. What can I do? I cannot use accented Italian characters.
    Thanks to anyone willing to help me.

    Hi,
    Please post the queries related to creator 2EA to the creator 2 EA discussion at https://feedbackprograms.sun.com/login.html
    regards

  • XML Encoding Issue - Format UTF-16 to ISO-8859-1

    Dear Groupmates,
    I have data in my internal table which I am converting to XML using a custom transformation.
    The data goes to a third party. The third-party system requires the data in ISO-8859-1 format, but SAP generates it in UTF-16. I have been able to change the file format from
    UTF-16 to ISO-8859-1, but after the conversion I am getting invalid tag information in the form of characters
    like &lt;, &gt; etc. in my file.
    Here is the code I used to set the encoding to ISO-8859-1:
    DATA: xmlout TYPE xstring.
    DATA: ixml TYPE REF TO if_ixml,
    streamfactory TYPE REF TO if_ixml_stream_factory,
    encoding TYPE REF TO if_ixml_encoding,
    ixml_ostream TYPE REF TO if_ixml_ostream.
    ixml = cl_ixml=>create( ).
    streamfactory = ixml->create_stream_factory( ).
    ixml_ostream = streamfactory->create_ostream_xstring( xmlout ).
    encoding = ixml->create_encoding(
    character_set = 'ISO-8859-1' byte_order = 0 ).
    ixml_ostream->set_encoding( encoding = encoding ).
    Sample Output :-
    <?xml version="1.0" encoding="iso-8859-1"?>
    <AMS_DOC_XML_EXPORT_FILE><AMS_DOCUMENT AUTO_DOC_NUM="FALSE" DOC_CAT="CA" DOC_CD="CA" DOC_DEPT_CD="045" DOC_ID="XR10281060830400001" DOC_IMPORT_MODE="OE" DOC_TYP="CH" DOC_UNIT_CD ="NULL" DOC_VERS_NO="01">
    <CH_DOC_HDR AMSDataObject="Y">
    <DOC_CAT Attribute="Y">&lt;![CDATA[CA]]&gt;</DOC_CAT>
    <DOC_TYP Attribute="Y">&lt;![CDATA[CH]]&gt;</DOC_TYP>
    Please let me know if anyone has an idea how I can get rid of the invalid tag information.
    Thanks !
    With Regards,
    Darshan Mulmule

    Darshan,
    Did you get an answer to this question? We have the same requirement: create an XML file in ISO-8859-1 format with Attribute set to "Y" and CDATA used for the data.
    Can you please let me know, if you still remember, how you achieved it?
    Satyen...
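    For what it's worth, the escaped &lt;![CDATA[...]]&gt; pairs in the sample output are the classic sign that the literal string "<![CDATA[...]]>" was added as ordinary text, so the serializer escaped the angle brackets; the fix is to create a real CDATA section node instead. The thread's code is ABAP, but the cause is the same in any DOM; here is a minimal Java illustration (the element name is borrowed from the sample output):
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.OutputKeys;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;

    public class CdataDemo {
        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();
            Element root = doc.createElement("DOC_CAT");
            doc.appendChild(root);

            // Wrong: the CDATA markup is plain text, so the serializer escapes it
            // and the output contains &lt;![CDATA[CA]]&gt;
            // root.appendChild(doc.createTextNode("<![CDATA[CA]]>"));

            // Right: a real CDATA section node is serialized as <![CDATA[CA]]>
            root.appendChild(doc.createCDATASection("CA"));

            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty(OutputKeys.ENCODING, "ISO-8859-1");
            t.transform(new DOMSource(doc), new StreamResult(System.out));
        }
    }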

  • How to set the Xml Encoding ISO-8859-1 to Transformer or DOMSource

    I have an XML string and it already contains an XML declaration with encoding="ISO-8859-1". (In my real product, some of the element/attribute values contain Spanish characters, so I need to use this encoding instead of UTF-8.) Also, in my program I need to add more attributes or manipulate the XML string dynamically, so I create a DOM Document object for that, and then I use a Transformer to convert this Document to a stream.
    My problem is: first, once converted through the Transformer, the XML encoding changes to the default UTF-8; second, I wanted to check whether the DOM Document created from the XML string keeps the XML encoding of ISO-8859-1, so I called Document.getXmlEncoding(), but it throws a runtime error (unknown method).
    Is there any way I can maintain the original Xml Encoding of ISO-8859-1 when I use either the DOMSource or Transformer? I am using JDK1.5.0-12.
    Following is my sample program you can use.
    I would appreciate any help, because so far, I cannot find any answer to this using the JDK documentation at all.
    Thanks,
    Jin Kim
    import java.io.*;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.Node;
    import org.w3c.dom.NodeList;
    import org.w3c.dom.Attr;
    import org.xml.sax.InputSource;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.Templates;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerException;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.TransformerConfigurationException;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.Source;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.transform.stream.StreamResult;
    public class XmlEncodingTest
    {
        StringBuffer xmlStrBuf = new StringBuffer();
        TransformerFactory tFactory = null;
        Transformer transformer = null;
        Document document = null;

        public void performTest()
        {
            xmlStrBuf.append("<?xml version=\"1.0\" encoding=\"iso-8859-1\"?>\n")
                     .append("<TESTXML>\n")
                     .append("<ELEM ATT1=\"Yes\" />\n")
                     .append("</TESTXML>\n");
            // the encoding is set to iso-8859-1 in the xml declaration.
            System.out.println("initial xml = \n" + xmlStrBuf.toString());
            try
            {
                // Test1: Use the transformer to output the xmlStrBuf.
                // This shows the xml encoding result from the transformer, which changes to UTF-8.
                tFactory = TransformerFactory.newInstance();
                transformer = tFactory.newTransformer();
                StreamSource ss = new StreamSource( new StringBufferInputStream( xmlStrBuf.toString()));
                System.out.println("Test1 result = ");
                transformer.transform( ss, new StreamResult(System.out));

                // Test2: Create a DOM document object for xmlStrBuf and manipulate it by adding an attribute ATT2="No"
                DocumentBuilderFactory dfactory = DocumentBuilderFactory.newInstance();
                DocumentBuilder builder = dfactory.newDocumentBuilder();
                document = builder.parse( new StringBufferInputStream( xmlStrBuf.toString()));
                // skip adding attribute since it does not affect the test result

                // Use a Transformer to output the DOM document. The encoding becomes UTF-8.
                DOMSource source = new DOMSource(document);
                StreamResult result = new StreamResult(System.out);
                System.out.println("\n\nTest2 result = ");
                transformer.transform(source, result);
            }
            catch (Exception e)
            {
                System.out.println("<performTest> Exception caught. " + e.toString());
            }
        }

        public static void main( String arg[])
        {
            XmlEncodingTest xmlTest = new XmlEncodingTest();
            xmlTest.performTest();
        }
    }

    Thanks DrClap for your answer. With your information, I rewrote the sample program as follows, and it now works as I intended! About UTF-8 and the Spanish characters, I think you are right. It looks like many factors can be involved here, for example the real character set used to create an XML document and the XML encoding declared in it. The special character I had trouble with was \u00F3 (ó), and somehow the SAX parser, and even the DocumentBuilder parser, does not like this character when the encoding is set to "UTF-8" in the XML document. My sample program below may not be a perfect example, but if you replace ISO-8859-1 with UTF-8 and do not set the encoding property on the transformer, you may notice that the special character in my example is broken in Test1 and Test2. In my sample, I decided to use ByteArrayInputStream instead of StringBufferInputStream because the documentation says StringBufferInputStream may have a problem converting characters into bytes.
    Thanks again for your help!
    Jin Kim
    import java.io.*;
    import java.util.*;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.Node;
    import org.w3c.dom.NodeList;
    import org.w3c.dom.Attr;
    import org.xml.sax.InputSource;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.Templates;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerException;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.TransformerConfigurationException;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.Source;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.transform.stream.StreamResult;
    /** XML encoding test for Transformer */
    public class XmlEncodingTest2
    {
        StringBuffer xmlStrBuf = new StringBuffer();
        TransformerFactory tFactory = null;
        Document document = null;

        public void performTest()
        {
            xmlStrBuf.append("<?xml version=\"1.0\" encoding=\"iso-8859-1\"?>\n")
                     .append("<TESTXML>\n")
                     .append("<ELEM ATT1=\"Resolución\">\n")
                     .append("Special charactered attribute test")
                     .append("\n</ELEM>")
                     .append("\n</TESTXML>\n");
            // the encoding is set to iso-8859-1 in the xml declaration.
            System.out.println("**** Initial xml = \n" + xmlStrBuf.toString());
            try
            {
                //TransformerFactoryImpl transformerFactory = new TransformerFactoryImpl();
                // Test1: Use the transformer to output the xmlStrBuf.
                tFactory = TransformerFactory.newInstance();
                Transformer transformer = tFactory.newTransformer();
                byte xmlbytes[] = xmlStrBuf.toString().getBytes("ISO-8859-1");
                StreamSource streamSource = new StreamSource( new ByteArrayInputStream( xmlbytes ));
                ByteArrayOutputStream xmlBaos = new ByteArrayOutputStream();
                Properties transProperties = transformer.getOutputProperties();
                transProperties.list( System.out); // prints out current transformer properties
                System.out.println("**** setting the transformer's encoding property to ISO-8859-1.");
                transformer.setOutputProperty("encoding", "ISO-8859-1");
                transformer.transform( streamSource, new StreamResult( xmlBaos));
                System.out.println("**** Test1 result = ");
                System.out.println(xmlBaos.toString("ISO-8859-1"));

                // Test2: Create a DOM document object for xmlStrBuf to add a new attribute ATT2="No"
                DocumentBuilderFactory dfactory = DocumentBuilderFactory.newInstance();
                DocumentBuilder builder = dfactory.newDocumentBuilder();
                document = builder.parse( new ByteArrayInputStream( xmlbytes));
                // skip adding attribute since it does not affect the test result

                // Use a Transformer to output the DOM document.
                DOMSource source = new DOMSource(document);
                xmlBaos.reset();
                transformer.transform( source, new StreamResult( xmlBaos));
                System.out.println("\n\n****Test2 result = ");
                System.out.println(xmlBaos.toString("ISO-8859-1"));
                //xmlBaos.flush();
                //xmlBaos.close();
            }
            catch (Exception e)
            {
                System.out.println("<performTest> Exception caught. " + e.toString());
            }
            finally
            {
            }
        }

        public static void main( String arg[])
        {
            XmlEncodingTest2 xmlTest = new XmlEncodingTest2();
            xmlTest.performTest();
        }
    }

  • Changing the xml encoding from UTF-8 to ISO-8859-1

    Hi,
    I have created an XML file in an xMII transaction that I feed into a web service as input. As of now, the data in the XML file is entirely English text (it will soon change to include European text). I set the encoding as UTF-8.
    I get an error on the web service side (not xMII code) saying it's not able to parse. The error is 'SaxParseException: Invalid 1 of 1-byte UTF-8 sequence'. I know that an easy fix would be to change the encoding to iso-8859-1.
    But the reference document does not let me put anything other than UTF-8. Even if I put <?xml version="1.0" encoding="iso-8859-1"?> as the first line, when I save it and open it back, I see <?xml version="1.0" encoding="UTF-8"?>
    Is there any way to change the encoding? Or better still, any idea where this invalid sequence is coming from?
    Thanks,
    Ravi.

    Hi Ravi,
    We have encountered scenarios where we needed to take the <?xml version="1.0" encoding="UTF-8"?> out completely.  As xMII was providing the Web Service, it needed a workaround.
    In your case, it seems that you wish to pass it from xMII to an external Web Service provider.  One option might be to pass the XML document as a string.
    Once you convert it to a string, it may escape all XML characters (i.e. '<' into '&lt;').  You could perform a string manipulation and remove the <?xml version="1.0" encoding="UTF-8"?> from the string.  You may also need to play around with the xmlDecode( string ) function in the Link Editor.
    I would suggest that before you try this option, you create a string variable with the contents, but without the <?xml version="1.0" encoding="UTF-8"?>, and try assigning it to the input.
    You may also wish to try a string variable that has <?xml version="1.0" encoding="iso-8859-1"?> as the first line.  If this works, you should be able to perform string manipulations to convert your XML document into this modified string.
    Cheers,
    Jai.
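    In xMII itself this would be done with the Link Editor's string functions, but the manipulation Jai describes is plain string handling; here is a hedged Java sketch of the idea (the class and method names are made up):
    // Hypothetical helper: drop the UTF-8 declaration, or swap in an ISO-8859-1 one.
    public class XmlProlog {
        private static final String DECL = "^\\s*<\\?xml[^?]*\\?>\\s*";

        public static String stripDeclaration(String xml) {
            return xml.replaceFirst(DECL, "");
        }

        public static String withIso88591Declaration(String xml) {
            return "<?xml version=\"1.0\" encoding=\"iso-8859-1\"?>\n" + stripDeclaration(xml);
        }

        public static void main(String[] args) {
            String doc = "<?xml version=\"1.0\" encoding=\"UTF-8\"?><Data>text</Data>";
            System.out.println(stripDeclaration(doc));
            System.out.println(withIso88591Declaration(doc));
        }
    }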

  • Java App on Linux : Unable to read iso-8859-1 encoded file correctly.

    I have a file which is encoded as ISO-8859-1 and contains characters such as ô.
    I am reading this file with Java code, something like:
    File in = new File("myfile.csv");
    InputStream fr = new FileInputStream(in);
    byte[] buffer = new byte[4096];
    while (true) {
        int byteCount = fr.read(buffer, 0, buffer.length);
        if (byteCount <= 0) {
            break;
        }
        String s = new String(buffer, 0, byteCount, "ISO-8859-1");
        System.out.println(s);
    }
    However the ô character is always garbled, usually printing as a ? .
    I am running this on a Linux machine. It works fine on my XP machine.
    I have verified that I can see the correct characters when I cat the file on the terminal.
    (Interestingly, and I think perhaps only by coincidence, it works when I run with the -Dfile.encoding=UTF16 option, but not with UTF8. This seems a hack rather than a fix, since that option was not intended for developer use by Sun, but I thought mentioning it might provide some clues as to what is going on.)

    I think your main problem is with the console. When you send text to the console, it's sent in the system default encoding. On an English-locale system that might be ASCII, ISO-8859-1, windows-1252, UTF-8, MacRoman, and probably several other possibilities. Then the console decodes the bytes using whatever encoding it feels like using; on my WinXP machine it uses cp437 by default (just for laughs, as far as I can tell). If the text happens to be pure seven-bit ASCII there's no problem, since all those encodings are identical in that range.
    But if you need to output anything other than ASCII characters, avoid the console. Send the output to a file and specify an encoding that you know will be able to handle your characters--UTF-8 can handle anything. Then open the file with an editor that can read that encoding; most of them can handle UTF-8 these days, and many will even detect it automatically. You also need to be using a font that can display your characters.
    However, you're also going about the reading part wrong. Instead of reading the text in as bytes and passing them to a String constructor, you should use an InputStreamReader and read it as text from the beginning:
    BufferedReader br = new BufferedReader(
        new InputStreamReader(
            new FileInputStream("myfile.csv"), "ISO-8859-1"));
    I am curious about your statement that "it works" when you run with the -Dfile.encoding=UTF16 option. I wouldn't be surprised to see it output the correct characters (ASCII characters, anyway), but I would expect to see the characters interspersed with blank spaces or rectangles.
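    Putting both recommendations together (decode the input through an InputStreamReader with an explicit charset, and write to a file in a known encoding instead of the console), a small sketch with placeholder file names:
    import java.io.BufferedReader;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.InputStreamReader;
    import java.io.OutputStreamWriter;
    import java.io.Writer;

    public class RecodeCsv {
        public static void main(String[] args) throws Exception {
            // Decode the input explicitly as ISO-8859-1 ...
            BufferedReader in = new BufferedReader(new InputStreamReader(
                    new FileInputStream("myfile.csv"), "ISO-8859-1"));
            // ... and write the result to a UTF-8 file rather than the console.
            Writer out = new OutputStreamWriter(
                    new FileOutputStream("myfile-utf8.csv"), "UTF-8");
            String line;
            while ((line = in.readLine()) != null) {
                out.write(line);
                out.write('\n');
            }
            out.close();
            in.close();
        }
    }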

  • Help needed to enable ISO-8859-1 encoding in Weblogic 10.3

    We have upgraded from WebLogic 10 to WebLogic 10.3. In the old environment, foreign characters (the ISO-8859-1 character set) were enabled using the Java properties -Dfile.encoding=iso-8859-1 -Dclient.encoding.override=iso-8859-1.
    We also set the page encoding in the JSPs to
    <%@ page pageEncoding="iso-8859-1" contentType="text/html; charset=iso-8859-1" %>
    This was working fine: we were able to input characters like the euro sign (€) and display them back to the user.
    After we upgraded to WebLogic 10.3, I applied the same Java properties, but all the special characters of the 8859-1 set, like Æ, à, etc., are displayed as garbled text like '��ï'. So the new server is unable to handle/encode these characters.
    Our WebLogic environment is WebLogic App Server 10.3, JRockit 1.6.0_31-R28.2.3-4.1.0, running on Linux.
    Is there any additional setting in 10.3 to encode these character sets?
    Thanks in advance.

    Hi Kalyan,
    Thanks for the info. Please help me understand just one more thing, before I ask our server team to apply the patch.
    I checked the full version of our WebLogic server in WEBLOGIC_HOME/registry.xml. It is <component name="WebLogic Server" version="10.3.6.0" >.
    The patch from MOS says it is "Release WLS 10.3.3". Does this mean that version 10.3.6 should already include this patch? Is the patch release the same as the WLS version?
    Should I still go ahead and install the patch for version 10.3.6 as well?
    Thanks again.
    - Shankar.

  • WIN-1252 characters in ISO-8859-1 textfields

    My question concerns the following situation:
    The APEX DAD is configured as WE8ISO8859P1, and the database has the same character set.
    When submitting a text field on an APEX page, a user can submit Win-1252 characters (chr 128-159), although the page encoding is ISO-8859-1.
    These characters are even displayed on the APEX page (in IE and Firefox on Windows) as the represented Win-1252 character.
    However, when these characters are inserted into the database, I end up with non-displayable characters.
    My question is: how can I prevent these characters from being entered into the database?
    I've tried these:
    * I can assume that all input arrives in Win-1252 and convert all data in the database (which seems not the optimal solution).
    * I can configure the DAD as UTF8, so there will be a conversion from UTF-8 -> ISO-8859-1. However, some characters aren't translated properly.
    Partly this is acceptable: chr 128 (euro) isn't available in ISO-8859-1, but chr 146 (right single quote) is 'translated' to ¿ (inverted question mark).
    Perhaps someone had the same experience and can give me some more info or tips.
    Thanks in advance,
    Art

    Hello Art,
    >> Do you know why this is / should be ?
    This setting became mandatory in version 2.0, when the AJAX framework was introduced. Since then, the APEX environment itself has been using this technology massively. Although the XMLHttpRequest object can support different character sets, its default setting is UTF-8. Using AL32UTF8 in the DAD simplifies the "behind the scenes" AJAX support. This configuration also simplifies import/export of APEX applications and data, minimizing some client-server character set conversions, and I probably don't know all the reasons that led to making this configuration mandatory.
    The important thing to remember is that it is a mandatory setting, and ignoring it will definitely cause you problems in functionality, both in the development environment and the run-time environment.
    Regards,
    Arie.

  • XML data encoding iso-8859-1 . Currently utf-16 is default encoding

    Hello ABAP gurus,
    I need your help.
    Scenario: We have SAP 4.7 Enterprise, which we have now converted to a Unicode system. There is a BSP application which talks to an external (non-Unicode) web application through HTTP and sends data as XML.
    Problem: The problem appears when the BSP application prepares the XML. Before the Unicode conversion the encoding was "iso-8859-1", but now, after the Unicode conversion, the encoding is "UTF-16".
    The XML data looks like
    <?xml version="1.0" encoding="utf-16"?><DATA><ACTION>CREATE_TICKET</ACTION><CRI
    I have tried replacing "utf-16" with "iso-8859-1". The interface works, but at the receiving end, i.e. the external web application, the German umlauts appear as garbage values.
    I know we need to enforce the encoding. I have tried the following code but could not succeed; the encoding still appears as "utf-16".
    Following is the section of code I have written in the BSP application.
    * Convert the data into a DOM tree
        CALL FUNCTION 'SDIXML_DATA_TO_DOM'
          EXPORTING
            name        = 'DATA'
            dataobject  = ls_cr_xml
          IMPORTING
            data_as_dom = if_dom
          CHANGING
            document    = if_document
          EXCEPTIONS
            OTHERS      = 1.
        IF sy-subrc NE 0.
          error_out text-f47 text-f48 space space.
        ENDIF.
    * Convert the DOM tree into a character stream
        if_pixml = cl_ixml=>create( ).
        IF if_pixml IS INITIAL.
          error_out text-f50 text-f53 space space.
        ENDIF.
        if_pstreamfact = if_pixml->create_stream_factory( ).
        IF if_pstreamfact IS INITIAL.
          error_out text-f51 text-f53 space space.
        ENDIF.
        if_postream = if_pstreamfact->create_ostream_cstring( string = xml_doc ).
        IF if_pstreamfact IS INITIAL.
          error_out text-f52 text-f53 space space.
        ENDIF.
    * -- Encoding --
    data: gv_str type string.
    data: gv_l_xml_encoding type ref to if_ixml_encoding.
    data gv_l_resultb type boolean.
    gv_str = 'ISO-8859-1' .
    clear gv_l_xml_encoding.
          call method if_pixml->create_encoding
            EXPORTING
              byte_order    = 0
              character_set = gv_str
            RECEIVING
              rval          = gv_l_xml_encoding.
          clear gv_l_resultb.
          call method gv_l_xml_encoding->set_character_set
            EXPORTING
              charset = gv_str
            RECEIVING
              rval    = gv_l_resultb.
            call method if_document->set_encoding
            EXPORTING
              encoding = gv_l_xml_encoding.
    * -- Append child --
    CALL METHOD if_document->append_child
          EXPORTING
            new_child = if_dom
          RECEIVING
            rval      = lv_return.
        IF lv_return NE 0.
          error_out text-f47 text-f49 space space.
        ENDIF.
    * -- Render the XML data to the output stream --
        CALL METHOD if_document->render
          EXPORTING
            ostream = if_postream.
    After this section of code is executed, the variable xml_doc is filled with XML data:
    <?xml version="1.0" encoding="utf-16"?><DATA><ACTION>CREATE_TICKET</ACTION><CRI
    Can anybody help with how to enforce the encoding?
    Regards,
    Laxman Nayak.

    Hi Aslam Riaz,
    Did you find any solution..?
    Kindly help me on this issue.
    Thanks and Regards,
    Shailaja Chityala

  • WSDL generated in ISO-8859-1 encoding, want UTF-8

    Hi XI Geeks,
    When I use the "Define Web Service" tool in the Integration Directory, the WSDL file is created with encoding ISO-8859-1, but we want UTF-8.
    Is there any setting we need to change in the WAS that will change this default?
    Any ideas are welcome and thanks in advance.
    Regards
    Sujan

    Hi Vishnu,
    We already did that for the time being, but the client wants it to be consistent.
    In another client engagement the WSDL is created in "UTF-8" format; we double-checked.
    We are running on SP12.
    If any of the SP12 folks could generate a WSDL and let us know whether it is created with "ISO-8859-1" encoding, that would help.
    Regards
    Sujan

  • XMLReader throws "Invalid UTF8 encoding." - Need parser for ISO-8859-1 chrs

    Hi,
    We are facing an issue when we try to send data encoded in the "ISO-8859-1" charset (German characters) via the EMDClient (agent), which tries to parse it using oracle.xml.parser.v2.XMLParser. The parser, while trying to read the data, is unable to determine its charset encoding, assumes the encoding is "UTF-8", and then throws the:
    "java.io.UTFDataFormatException: Invalid UTF8 encoding." exception.
    I looked at the XMLReader's code and found that it tries to read the first 4 bytes (Byte Order Mark - BOM) to determine the encoding. It is probably expecting us to send data where the first line is:
    <?xml version="1.0" encoding="iso88591" ?>
    But, the data that our application sends is typically as below:
    ========================================================
    # listener.ora Network Configuration File: /ade/vivsharm_emsa2/oracle/work/listener.ora
    # Generated by Oracle configuration tools.
    SID_LIST_LISTENER =
    (SID_LIST =
    (SID_DESC =
    (SID_NAME = semsa2)
    (ORACLE_HOME = /ade/vivsharm_emsa2/oracle)
    LISTENER =
    (DESCRIPTION_LIST =
    (DESCRIPTION =
    (ADDRESS_LIST =
    (ADDRESS = (PROTOCOL = tcp)(HOST = stadm18.us.oracle.com)(PORT = 15100))
    ========================================================
    the first 4 bytes in our case will be, int[] {35, 32, 108, 105} == chars {#, SPACE, l, i},
    which does not match any of the encodings predefined in oracle.xml.parser.v2.XMLReader.pushXMLReader() method.
    How do we ensure that the parser identifies the encoding properly and instantiates the correct parser for "ISO-8859-1"...
    Should we just add the line <?xml version="1.0" encoding="iso88591" ?> at the beginning of our data?
    We have tried constructing the inputstream (ByteArrayInputStream) by using String.getBytes("ISO-8859-1") and passing that to the parser, but that does not seem to work.
    Please suggest.
    Thanks & Regards,
    Vivek.
    PS: The exception we get is as below:
    java.io.UTFDataFormatException: Invalid UTF8 encoding.
    at oracle.xml.parser.v2.XMLUTF8Reader.checkUTF8Byte(XMLUTF8Reader.java:160)
    at oracle.xml.parser.v2.XMLUTF8Reader.readUTF8Char(XMLUTF8Reader.java:187)
    at oracle.xml.parser.v2.XMLUTF8Reader.fillBuffer(XMLUTF8Reader.java:120)
    at oracle.xml.parser.v2.XMLByteReader.saveBuffer(XMLByteReader.java:450)
    at oracle.xml.parser.v2.XMLReader.fillBuffer(XMLReader.java:2229)
    at oracle.xml.parser.v2.XMLReader.tryRead(XMLReader.java:994)
    at oracle.xml.parser.v2.XMLReader.scanXMLDecl(XMLReader.java:2788)
    at oracle.xml.parser.v2.XMLReader.pushXMLReader(XMLReader.java:502)
    at oracle.xml.parser.v2.XMLReader.pushXMLReader(XMLReader.java:205)
    at oracle.xml.parser.v2.XMLParser.parse(XMLParser.java:180)
    at org.xml.sax.helpers.ParserAdapter.parse(ParserAdapter.java:431)
    at oracle.sysman.emSDK.emd.comm.RemoteOperationInputStream.readXML(RemoteOperationInputStream.java:363)
    at oracle.sysman.emSDK.emd.comm.RemoteOperationInputStream.readHeader(RemoteOperationInputStream.java:195)
    at oracle.sysman.emSDK.emd.comm.RemoteOperationInputStream.read(RemoteOperationInputStream.java:151)
    at oracle.sysman.emSDK.emd.comm.EMDClient.remotePut(EMDClient.java:2075)
    at oracle.sysman.emo.net.util.agent.Operation.saveFile(Operation.java:758)
    at oracle.sysman.emo.net.common.WebIOHandler.saveFile(WebIOHandler.java:152)
    at oracle.sysman.emo.net.common.BaseWebConfigContext.saveConfig(BaseWebConfigContext.java:505)

    Vivek
    Your message is not XML. I believe the XMLParser is going to have problems with that as well. Perhaps you could wrap the message in an XML tag set and begin the document, as you suggested, with <?xml version="1.0" encoding="iso88591"?>.
    You are correct that the parser uses only the first 4 bytes to detect the encoding of the document. It can only determine whether the document is ASCII- or EBCDIC-based; if it is ASCII-based, it can only distinguish between UTF-8 and UTF-16. It needs the encoding attribute to recognize the ISO-8859-1 encoding.
    hope this helps
    tom
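    A hedged sketch of tom's wrapping suggestion, using the standard JAXP classes rather than the Oracle-specific ones in the stack trace (the element name and payload are placeholders; note that a CDATA wrapper only works if the payload never contains "]]>"):
    import java.io.StringReader;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.xml.sax.InputSource;

    public class WrapAndParse {
        public static void main(String[] args) throws Exception {
            // The listener.ora text is not XML, so wrap it in a root element
            // and declare the encoding up front.
            String payload = "# listener.ora Network Configuration File ...";
            String xml = "<?xml version=\"1.0\" encoding=\"ISO-8859-1\"?>\n"
                       + "<payload><![CDATA[" + payload + "]]></payload>";

            // Handing the parser a character Reader sidesteps byte-level
            // encoding autodetection entirely.
            InputSource src = new InputSource(new StringReader(xml));
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(src);
            System.out.println(doc.getDocumentElement().getNodeName());
        }
    }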

  • Encoding autodetection chooses ISO-8859-5 instead of UTF-8

    A certain site produces HTML in UTF-8 without meta tags. Firefox detects the encoding as ISO-8859-5, an absolutely dead and useless 'standard'. How do I exclude it from the autodetection and make it proper Unicode?

    It's a strange guess, if it's a guess. Are you sure the server isn't sending a header specifying that encoding? To view the Content-Type header, you can use an add-on (or, if it's not a secure page, an external proxy):
    * Live HTTP Headers extension: https://addons.mozilla.org/en-us/firefox/addon/live-http-headers/
    * Firebug extension: https://addons.mozilla.org/en-US/firefox/addon/firebug/
    * Fiddler debugging proxy: http://www.fiddler2.com/fiddler2/
