Charset issues

I have a JSP that pulls an RSS news feed from another site via JavaScript. The news is in French, and the French characters are not displaying properly in some browsers. I tried setting the charset and encoding as follows:
<%@ page pageEncoding="UTF-8" contentType="text/html; charset=UTF-8" %>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8" />
I also tried ISO-8859-1, and neither seems to do the trick.
While debugging I noticed the following: if I turn off auto-detect (automatic encoding detection) in the browser on my computer, the page works fine in either ISO or UTF. On another computer it only works in UTF, not ISO, and it doesn't work even with auto-detect on.
Anything else I can try?
thanks
Edited by: black_lotus on 2-Mar-2010 4:46 PM

Because the page encoding is set via Java. Are you implying that the HTML retrieved from the feed with JavaScript can have another encoding?
Well, if it's a completely different request/response, then yes: it will have its own character encoding associated with that exchange.
The pageEncoding specifies the character set that your JSP server (Tomcat or whatever) uses when sending the HTML to the browser. As I understand it, it doesn't affect the running of JavaScript at all.
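Since the page encoding and the feed's encoding are negotiated independently, one way out is to stop fetching the feed in the browser altogether: fetch it server-side, decode it with whatever charset the feed's server declares, and emit it as part of your UTF-8 page. Below is a minimal, hypothetical sketch (the class and method names are mine, not from the original post); it is not the only way to do this, just one that removes the browser's encoding guesswork from the equation.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;

public class FeedFetcher {
    // Fetches the feed and decodes it with the charset its server declares,
    // falling back to ISO-8859-1 (an assumption) when none is declared.
    public static String fetch(String feedUrl) throws IOException {
        URLConnection conn = new URL(feedUrl).openConnection();
        String type = conn.getContentType();            // e.g. "text/xml; charset=ISO-8859-1"
        String charset = "ISO-8859-1";
        if (type != null) {
            int i = type.toLowerCase().indexOf("charset=");
            if (i >= 0) {
                charset = type.substring(i + "charset=".length()).trim();
            }
        }
        BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), charset));
        StringBuilder sb = new StringBuilder();
        try {
            char[] buf = new char[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                sb.append(buf, 0, n);
            }
        } finally {
            in.close();
        }
        return sb.toString();   // a Java String has no charset; the JSP writes it out as UTF-8
    }
}

A JSP could then call FeedFetcher.fetch(feedUrl) (in a scriptlet or via a bean) and write the result into the page, so the browser only ever sees the page's own UTF-8.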

Similar Messages

  • Excel charset issue. Scrambled output.

    I faced this issue many times, but whenever the issue recurs, I end up with no clue of what's going wrong.
    I'm trying to dump data to an Excel file:
                        iResponse.setHeader("application/vnd.ms-excel; charset=UTF-8", "content-type");
                        iResponse.setHeader("attachment; filename = \"" + reportFileName() + "\" ", "content-disposition");
                        iResponse.setContentEncoding("UTF-8");
                        iResponse.setContent(fileData());
    The Excel file shows international text (e.g. Chinese) scrambled. I tried a big-endian charset too, but the output is still scrambled.
    I checked the response using Firebug; my output data is fine.
    Can anybody offer some pointers on this issue?
    Thanks in advance,
    Senthil

    reviving.
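    For what it's worth, a frequent cause of scrambled international text in Excel specifically is a missing byte-order mark: Excel tends to assume a legacy ANSI code page unless the UTF-8 stream starts with a BOM, even when the headers say UTF-8. Below is a minimal sketch using the standard Servlet API (the original code appears to use WebObjects' WOResponse, so the calls differ; the class and parameter names here are illustrative only):
    import java.io.OutputStream;
    import javax.servlet.http.HttpServletResponse;

    public class ExcelExportHelper {
        // Writes UTF-8 data with a leading BOM so Excel detects the encoding.
        public static void write(HttpServletResponse response, String fileName, String data)
                throws Exception {
            response.setContentType("application/vnd.ms-excel; charset=UTF-8");
            response.setHeader("Content-Disposition", "attachment; filename=\"" + fileName + "\"");
            OutputStream out = response.getOutputStream();
            out.write(new byte[] { (byte) 0xEF, (byte) 0xBB, (byte) 0xBF });  // UTF-8 byte-order mark
            out.write(data.getBytes("UTF-8"));
            out.flush();
        }
    }
    The equivalent in the original code would presumably be to prepend the same three bytes to the content handed to the response.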

  • The Content-type charset issue

    Hi
    I've been reading the thread about the origins of the #HEAD# substitution (Where does #HEAD# substitution comes from?) because I want to change the charset parameter in the content-type meta tag in my application. I'm a bit worried, since it appears that the value of this attribute is controlled by the PlsqlNLSLanguage setting in the DAD, and Jan has had various settings there. But there are references to documentation stating that from Apex 3.0 you really can't change the charset setting (http://download-uk.oracle.com/docs/cd/B32472_01/doc/relnotes.300/b32465/toc.htm#BGBCJIEA) - it must be AL32UTF8.
    Now, I'm on 2.2, but I'm definitely planning on upgrading at some point, so I guess that limitation will apply to my configuration too.
    AL32UTF8 yields a content type meta tag with charset=utf-8. If I can't change that, how do I get charset=iso-8859-1 like I need?
    Scott proposed to put in a second content-type meta tag to (maybe) override the one that Apex makes. I'm going to try that to see if it saves me. But can it be true that Apex doesn't give me any options here - other than relying on browsers probably guessing what I want when I give them HTML-hacks? In my view, that would clearly constitute a bug.
    Jakob

    Hi Joel
    Thanks for your reply. You're right. It's clearly not a bug after all. But you might want to slap Scott for suggesting what he did in Where does #HEAD# substitution comes from? :-)
    I understand that the problem lies with the XmlHttpRequest so it's not Apex' fault.
    My issue arises when people post non-ASCII (Danish) characters in forms: Æ, Ø, Å etc. The browser encodes these as UTF-8 and they become garbage to the database - or, to be more precise, they look like two or three bytes of even stranger characters when I retrieve them from the database. The same happens when I retrieve them with Apex and put them back into the very same input element they came from.
    Personally I can't think of a good reason why the conversion seems to happen OK when I get the values back into the browser (if it didn't, the funky characters would look "right" when viewed in the browser) but not when I post. I realize that I should be careful with the b-word, but the marshalling is not symmetric, so some component in here doesn't behave... reasonably. Perhaps it's the browser...
    Would you or anybody know of a workaround? Others must have had this problem. Is it possible to tweak the form or the form elements (using attributes like enctype, accept and accept-charset, with which I must admit I have no previous experience)? Or will I have to do some really nasty hacking in either JavaScript or database triggers or something?
    The Apex report on NLS in the database contains this:
    NLS_CHARACTERSET     WE8MSWIN1252
    NLS_COMP     BINARY
    NLS_LANGUAGE     DANISH
    NLS_LENGTH_SEMANTICS     BYTE
    NLS_NCHAR_CHARACTERSET     AL16UTF16
    NLS_NCHAR_CONV_EXCP     FALSE
    NLS_NUMERIC_CHARACTERS     ,.
    NLS_SORT     DANISH
    NLS_TERRITORY     DENMARK
    I don't believe that I'm free to change any of that. There are other applications running against the database.
    Jakob
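    The "two or three bytes of even stranger characters" symptom described above is classic double encoding: the browser submits UTF-8 bytes, and those bytes are then stored and read back as single-byte Windows-1252/Latin-1 characters. A tiny, self-contained Java illustration of the effect (purely illustrative; nothing Apex-specific about it):
    public class MojibakeDemo {
        public static void main(String[] args) throws Exception {
            String original = "ÆØÅ æøå";
            // UTF-8 bytes reinterpreted as windows-1252 (close to WE8MSWIN1252):
            String garbled = new String(original.getBytes("UTF-8"), "windows-1252");
            System.out.println(garbled);   // prints something like "Ã†Ã˜Ã… Ã¦Ã¸Ã¥"
        }
    }
    Wherever you see that two-characters-per-letter pattern, some layer has decoded UTF-8 input with an 8-bit character set, which is exactly where accept-charset, the gateway's PlsqlNLSLanguage and the database character set all come into play.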

  • SQL Loader UTF8 charset issue

    Hi,
    My database character set is UTF8, and the NLS_LANG parameter on the mid tier (Unix) is UTF8. I have created a spool file (Korean language) in the UTF8 character set and also verified the .dat file using a Unicode text editor. When I try to load the data back into the database using SQL*Loader, it is not loaded properly; it goes through some internal encoding/decoding on the way. I have also tried setting the CHARACTERSET parameter in the control file to UTF8.
    Could anyone please help me identify the issue and its resolution?
    Thanks in advance.
    Sanyog

    This thread (on NLS_LANG and SQL*Loader) may also be useful for you:
    sqllder especial carector issue
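    For anyone landing on this thread: the two ingredients that usually matter are a CHARACTERSET clause in the control file (which governs how SQL*Loader interprets the data file) and a consistent NLS_LANG in the client environment. A minimal, hypothetical control-file sketch - table, column and file names are placeholders, and only the CHARACTERSET line is the point here:
    -- korean_load.ctl (hypothetical)
    LOAD DATA
    CHARACTERSET UTF8
    INFILE 'korean_data.dat'
    APPEND INTO TABLE my_table
    FIELDS TERMINATED BY ','
    (id, name_kr)
    If the database character set is actually AL32UTF8 rather than UTF8, using AL32UTF8 in both the control file and NLS_LANG avoids another silent conversion for supplementary characters.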

  • HTML5 charset issue

    We have an application using HTML5. The server returns an accepted charset of UTF-8. On the page is the following meta tag: <meta charset="utf-8">.
    When entering "special" characters (é, è, à, etc.) and submitting the form, Firefox translates them into characters like éÃ. Further examination tells us that these are UTF-8 representations of the ISO-8859-1 values for the given characters. (We are using Firefox on Windows.)
    When putting accept-charset on the HTML form, the issue is solved. However, reading the HTML5 spec, it seems to me that Firefox should interpret either the server response or the meta tag and submit the content as UTF-8.
    We have the same issue in Chrome; in IE, however, it seems to work.

    Hello,
    Can you please confirm the following:
    - The file is saved with UTF-8 encoding.
    - Are the pages served by server-side scripting (PHP etc.), or is it an HTML+JS single-page application?
    - Is it possible for you to provide a sample of the <meta> tag you are using?
    Like you mentioned, if you provide the charset in the meta tag it is not required to set the charset on the form (provided the charset is the same), but the browser should be able to identify the charset properly.
    Can you please check these links to see whether the placement of the charset within the first 512/1024 bytes of the page is causing this issue:
    - Setting default encoding of a page to UTF-8: http://forums.mozillazine.org/viewtopic.php?f=9&t=1168365
    - Charset identified wrongly as Western-8859-1 for a UTF-8 page: https://bugzilla.mozilla.org/show_bug.cgi?id=812542
    - Details on MetaCharsetAttribute: http://code.google.com/p/doctype-mirror/wiki/MetaCharsetAttribute
    - Meta on MDN: https://developer.mozilla.org/en-US/docs/Web/HTML/Element/meta
    Thank you
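    For reference, the setup those links describe boils down to something like the following hypothetical page - the essential parts being the <meta charset> early in the head (within the first 1024 bytes) and, as a belt-and-braces measure, accept-charset on the form:
    <!DOCTYPE html>
    <html>
    <head>
      <meta charset="utf-8"> <!-- must appear within the first 1024 bytes of the document -->
      <title>Form test</title>
    </head>
    <body>
      <!-- accept-charset should be redundant when the page is already utf-8,
           but it makes the submission encoding explicit -->
      <form method="post" action="/submit" accept-charset="utf-8">
        <input type="text" name="comment">
        <input type="submit" value="Send">
      </form>
    </body>
    </html>
    If the server also sends Content-Type: text/html; charset=utf-8 in the response header, the header wins over the meta tag, so a mismatch there is another thing worth ruling out.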

  • Charset UTF-8 issue in JSF

    hi All,
    Could you please give me an idea of what to do about a charset issue in JSF?
    The main issue is that on the first request the data is encoded as UTF-8, but from the second request onwards it is encoded as ISO-8859-1. I want the data to always be encoded as UTF-8.
    I have a JSF file that contains an <h:inputText> field with the value
    абвгдежзийклмно
    I am passing this value to another popup window using a JavaScript function:
    mywindow = window.open('<%=request.getContextPath()%>/faces/vmp470.faces' + '?nameField=' + nameFieldValue + '&' + 'fieldType=' + fieldTypeValue, 'mywindow', 'resizable=false,scrollbars=0,z-lock=true,width=550,height=520,left=150,top=150,screenX=150,screenY=150')
    Here nameFieldValue = абвгдежзийклмно
    The first time, the value is passed correctly into the second window.
    From the second time onwards, the value changes into
    ������������������������������
    Can you please give me an idea if you have any other alternative?
    In my RequestContextFilter I also set the UTF-8 charset:
    servletRequest.setCharacterEncoding("UTF-8");
    ((HttpServletResponse)servletResponse).setContentType("text/html;charset=UTF-8");
    In the JSF file I also added <%@page language="java" pageEncoding="UTF-8" contentType="text/html; charset=UTF-8" %>
    Somewhere the character set is changing from UTF-8 to ISO-8859-1.
    Thanks & Regards,
    Bhushanam.

    I don't have anything like that.
    In system variables
    path= C:\j2sdk1.4.2\bin;D:\oracle\ora92\bin;C:\Program Files\Oracle\jre\1.3.1\bin;C:\Program Files\Oracle\jre\1.1.8\bin;C:\apache-ant-1.6.0\bin;D:\Tomcat5.0\bin;D:\jsf-1_1_01\jsf-1_1_01\lib;%M2_HOME%\bin;
    classpath= C:\j2sdk1.4.2\lib\tools.jar;D:\logos\logos\lib\oracle\classes12.jar;D:\logos\logos\lib\log4j\log4j.jar;D:\Tomcat5.0\bin;D:\jsf-1_1_01\jsf-1_1_01\lib;C:\mavenbook-1.0\genapp\test-application\;D:\Tomcat5.0\webapps\login\WEB-INF\lib\servlet-api-2.3.jar;
    If I print
    System.getProperty("java.class.path"), the path below is displayed.
    D:\Jdev9052\j2ee\home\oc4j.jar;D:\Jdev9052\jdev\lib\jdev-oc4j.jar;D:\Jdev9052\j2ee\home\lib/ejb.jar;D:\Jdev9052\j2ee\home\lib/servlet.jar;D:\Jdev9052\j2ee\home\lib/ojsp.jar;D:\Jdev9052\j2ee\home\lib/jndi.jar;D:\Jdev9052\j2ee\home\lib/jdbc.jar;D:\Jdev9052\j2ee\home\iiop.jar;D:\Jdev9052\j2ee\home\iiop_gen_bin.jar;D:\Jdev9052\j2ee\home\lib/jms.jar;D:\Jdev9052\j2ee\home\lib/jta.jar;D:\Jdev9052\j2ee\home\lib/jmxri.jar;D:\Jdev9052\j2ee\home\lib/javax77.jar;D:\Jdev9052\j2ee\home\lib/javax88.jar;D:\Jdev9052\j2ee\home\../../opmn/lib/ons.jar;D:\Jdev9052\j2ee\home\../../opmn/lib/optic.jar;D:\Jdev9052\j2ee\home\../../lib/dms.jar;D:\Jdev9052\j2ee\home\../../dms/lib/dms.jar;D:\Jdev9052\j2ee\home\lib/connector.jar;D:\Jdev9052\j2ee\home\lib/cos.jar;D:\Jdev9052\j2ee\home\lib/jsse.jar;D:\Jdev9052\j2ee\home\../../oracle/lib/jsse.jar;D:\Jdev9052\j2ee\home\lib/jnet.jar;D:\Jdev9052\j2ee\home\lib/jcert.jar;D:\Jdev9052\j2ee\home\lib/activation.jar;D:\Jdev9052\j2ee\home\lib/mail.jar;D:\Jdev9052\j2ee\home\../../javavm/lib/jasper.zip;D:\Jdev9052\j2ee\home\../../lib/xmlparserv2.jar;D:\Jdev9052\j2ee\home\../../oracle/lib/xmlparserv2.jar;D:\Jdev9052\j2ee\home\lib/jaxp.jar;D:\Jdev9052\j2ee\home\lib/jaas.jar;D:\Jdev9052\j2ee\home\jazn.jar;D:\Jdev9052\j2ee\home\../../jdbc/lib/classes12dms.jar;D:\Jdev9052\j2ee\home\../../oracle/jdbc/lib/classes12dms.jar;D:\Jdev9052\j2ee\home\../../jdbc/lib/nls_charset12.jar;D:\Jdev9052\j2ee\home\../../oracle/jdbc/lib/nls_charset12.jar;D:\Jdev9052\j2ee\home\jaxb-rt-1.0-ea.jar;D:\Jdev9052\j2ee\home\../../soap/lib/soap.jar;D:\Jdev9052\j2ee\home\../../webservices/lib/wsserver.jar;D:\Jdev9052\j2ee\home\../../webservices/lib/wsdl.jar;D:\Jdev9052\j2ee\home\../../rdbms/jlib/aqapi.jar;D:\Jdev9052\j2ee\home\lib/jem.jar;D:\Jdev9052\j2ee\home\../../javacache/lib/cache.jar;D:\Jdev9052\j2ee\home\lib/http_client.jar;D:\Jdev9052\j2ee\home\../../jlib/jssl-1_1.jar;D:\Jdev9052\j2ee\home\../../oracle/jlib/jssl-1_1.jar;D:\Jdev9052\j2ee\home\../../jlib/repository.jar;D:\Jdev9052\j2ee\home\../../jlib/ldapjclnt9.jar;D:\Jdev9052\j2ee\home\../../oracle/jlib/repository.jar;D:\Jdev9052\j2ee\home\lib/jaasmodules.jar;D:\Jdev9052\j2ee\home\../../sqlj/lib/runtime12ee.jar;D:\Jdev9052\j2ee\home\../../sqlj/lib/translator.jar;D:\Jdev9052\j2ee\home\lib/crimson.jar;;D:\Jdev9052\j2ee\home\applib;D:\Jdev9052\BC4J\lib;D:\Jdev9052\BC4J\lib\adfm.jar;D:\Jdev9052\BC4J\lib\adfmtl.jar;D:\Jdev9052\BC4J\lib\adfmweb.jar;D:\Jdev9052\BC4J\lib\bc4jct.jar;D:\Jdev9052\BC4J\lib\bc4jctejb.jar;D:\Jdev9052\BC4J\lib\bc4jdomorcl.jar;D:\Jdev9052\BC4J\lib\bc4jimdomains.jar;D:\Jdev9052\BC4J\lib\bc4jmt.jar;D:\Jdev9052\BC4J\lib\bc4jmtejb.jar;D:\Jdev9052\BC4J\lib\collections.jar;D:\Jdev9052\jlib\ojmisc.jar;D:\Jdev9052\ord\jlib\ordim.jar;D:\Jdev9052\ord\jlib\ordhttp.jar;D:\Jdev9052\jlib\share.jar;D:\Jdev9052\jlib\regexp.jar;D:\Jdev9052\jlib\jdev-cm.jar;D:\Jdev9052\lib\dsv2.jar;D:\Jdev9052\rdbms\jlib\xsu12.jar;D:\Jdev9052\j2ee\home\jsp\lib\taglib;D:\Jdev9052\j2ee\home\jsp\lib\taglib\jaxen-full.jar;D:\Jdev9052\j2ee\home\jsp\lib\taglib\ojsputil.jar;D:\Jdev9052\j2ee\home\jsp\lib\taglib\saxpath.jar;D:\Jdev9052\j2ee\home\jsp\lib\taglib\standard.jar;D:\Jdev9052\lib\oraclexsql.jar;D:\Jdev9052\lib\xsqlserializers.jar;D:\Jdev9052\jlib\bigraphbean.jar;D:\Jdev9052\jlib\bigraphbean-nls.zip;D:\Jdev9052\jlib\jewt4.jar;D:\Jdev9052\jlib\jewt4-nls.jar;D:\Jdev9052\toplink\jlib\toplink.jar;D:\Jdev9052\jdev\lib\jdev-rt.jar;D:\Jdev9052\vbroker4\lib\vbjorb.jar;D:\Jdev9052\jdev\lib\ojc.jar;;C:\vmp\development\vmp\vmp-web\src\main\webapp\WEB-INF\lib\CVS;C:\vmp\development\vmp\v
mp-web\target\classes;C:\vmp\development\vmp\vmp-app\target\classes;C:\vmp\development\vmp-ext-lib\activation-1.0.2.jar;C:\vmp\development\vmp-ext-lib\calendar-src1.jar;C:\vmp\development\vmp-ext-lib\commons-beanutils.jar;C:\vmp\development\vmp-ext-lib\commons-collections.jar;C:\vmp\development\vmp-ext-lib\commons-digester.jar;C:\vmp\development\vmp-ext-lib\commons-email-1.0.jar;C:\vmp\development\vmp-ext-lib\commons-fileupload-1.1.jar;C:\vmp\development\vmp-ext-lib\commons-io-1.1.jar;C:\vmp\development\vmp-ext-lib\commons-lang-2.0.jar;C:\vmp\development\vmp-ext-lib\commons-logging.jar;C:\vmp\development\vmp-ext-lib\geronimo-spec-ejb-2.1-rc1.jar;C:\vmp\development\vmp-ext-lib\hsqldb.jar;C:\vmp\development\vmp-ext-lib\jai_codec-1.1.2.jar;C:\vmp\development\vmp-ext-lib\jai_core-1.1.2.jar;C:\vmp\development\vmp-ext-lib\javax-ssl-1_2.jar;C:\vmp\development\vmp-ext-lib\jcommon-1.0.0.jar;C:\vmp\development\vmp-ext-lib\jfreechart-1.0.1.jar;C:\vmp\development\vmp-ext-lib\jsf-api.jar;C:\vmp\development\vmp-ext-lib\jsf-impl.jar;C:\vmp\development\vmp-ext-lib\jssl-1_2.jar;C:\vmp\development\vmp-ext-lib\junit-3.8.1.jar;C:\vmp\development\vmp-ext-lib\log4j-1.2.9.jar;C:\vmp\development\vmp-ext-lib\mail-1.3.3.jar;C:\vmp\development\vmp-ext-lib\servlet-api-2.3.jar;C:\vmp\development\vmp-ext-lib\soap.jar;C:\vmp\development\vmp-ext-lib\toplink-9.0.4.jar;C:\vmp\development\vmp-ext-lib\xmlparserv2.jar;C:\vmp\development\vmp-ext-lib\OracleADFFaces\adf-faces-api-SNAPSHOT.jar;C:\vmp\development\vmp-ext-lib\OracleADFFaces\adf-faces-impl-SNAPSHOT.jar;C:\vmp\development\vmp-ext-lib\OracleADFFaces\adfshare-3549S.jar;C:\vmp\development\vmp-ext-lib\OracleADFFaces\jstl.jar;C:\vmp\development\vmp-int-lib\vmp-app-1.0-SNAPSHOT.jar;D:\Jdev9052\jakarta-taglibs\jstl-1.0\lib\jaxen-full.jar;D:\Jdev9052\jakarta-taglibs\jstl-1.0\lib\saxpath.jar;D:\Jdev9052\jakarta-taglibs\jstl-1.0\lib\xalan.jar;D:\Jdev9052\jakarta-taglibs\jstl-1.0\lib\jstl.jar;D:\Jdev9052\jakarta-taglibs\jstl-1.0\lib\standard.jar;D:\Jdev9052\jakarta-struts\lib\struts.jar;D:\Jdev9052\jakarta-struts\lib\commons-beanutils.jar;D:\Jdev9052\jakarta-struts\lib\commons-collections.jar;D:\Jdev9052\jakarta-struts\lib\commons-fileupload.jar;D:\Jdev9052\jakarta-struts\lib\commons-digester.jar;D:\Jdev9052\jakarta-struts\lib\commons-lang.jar;D:\Jdev9052\jakarta-struts\lib\commons-logging.jar;D:\Jdev9052\jakarta-struts\lib\commons-validator.jar;D:\Jdev9052\jakarta-struts\lib\jakarta-oro.jar
    Can you please check the above paths?
    Thanks & regards
    bhushanam.
    Message was edited by:
    Java_Smart
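    Setting the classpath straight is unlikely to change this. A more likely culprit: query-string parameters built in window.open are URL-decoded by the container using its URI encoding, which on containers of that era typically defaults to ISO-8859-1, and servletRequest.setCharacterEncoding only affects the POST body. Two things usually help. First, escape the values in JavaScript so they travel as %-encoded UTF-8 - a hypothetical reworking of the original call:
    // encodeURIComponent turns the Cyrillic value into %-escaped UTF-8
    var url = '<%=request.getContextPath()%>/faces/vmp470.faces'
            + '?nameField=' + encodeURIComponent(nameFieldValue)
            + '&fieldType=' + encodeURIComponent(fieldTypeValue);
    mywindow = window.open(url, 'mywindow',
        'resizable=false,scrollbars=0,width=550,height=520,left=150,top=150');
    Second, tell the container to decode query strings as UTF-8 (for Tomcat, that is the URIEncoding="UTF-8" attribute on the <Connector> element in server.xml; other containers have their own equivalent setting).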

  • Setting the Charset for outgoing mail in Mail

    When I send mail to Windows systems, the Outlook program receives the message with wrong characters - especially, of course, the special Danish characters, but other things go wrong as well. For example, grey lines show up where I certainly did not put them!
    I have noticed, that the header of the message from Mail is like this:
    --Apple-Mail-1--672517002
    Content-Transfer-Encoding: 7bit
    Content-Type: text/plain;
    charset=US-ASCII;
    delsp=yes;
    format=flowed
    Sometimes like this:
    --Apple-Mail-9--596884052
    Content-Transfer-Encoding: quoted-printable
    Content-Type: text/plain;
    charset=UTF-8;
    delsp=yes;
    format=flowed
    When using the HTML functionality in Mail I get headers like this:
    --Apple-Mail-2--619494695
    Content-Transfer-Encoding: quoted-printable
    Content-Type: text/plain;
    charset=WINDOWS-1252;
    delsp=yes;
    format=flowed
    And the HTML mails show up correctly!
    I guess it is a charset issue, but how can I control that manually?

    Open your Mail.app
    From the top of your screen choose Window>Connection Doctor.
    Double click on each server and make sure the password field is filled out (your SMTP server is likely to be missing the password).
    Enter the password into the field you find empty, close the preferences window (saving changes if prompted), and close the Connection Doctor.
    Test to make sure you are fixed.
    If issues continue, it's going to be an issue with Keychain Access and you will need additional steps.
    EE

  • Charset problems in LC

    Hi all,
    I have the following problem: I've made a form in LC Designer that is filled with data from an XML file. When I request this form from the LiveCycle server I get the right PDF, except for the following charset issue: all '№' symbols (AKA the numero sign) in the data from the XML file that merges with the PDF get changed to '¹' (AKA superscript one). It seems that somewhere on the server I'm getting a utf-8 --> cp1251 conversion. Both the resulting PDF and the XML data file seem to be OK and in the utf-8 charset. The only place I could find "CP1251" is in the config.xml that I exported from the LC Server AdminUI page:
    <node name="Output">
    <entry key="RenderedOutputCacheEnabled" value="true"/>
    <entry key="RenderedOutputCachePolicy" value="LRU"/>
    <entry key="charset" value="CP1251"/>
    ...etc
    But when I tried to change this to "utf-8" in config.xml and import it back to the server, nothing happened, and another export gives me the same config.xml with "CP1251" in it.
    Maybe someone can help me solve this problem and get my '№' back? )
    Thank you!

    Hello trankiemhung,
    Thanks for using Apple Support Communities.
    I found the following information that will help resolve your issue:
    Using Game Center
    http://support.apple.com/kb/HT4314
    Additional Information
    If you are having difficulty logging in to Game Center or staying connected
    Verify that you are connected to the Internet.
    If you are unable to create or sign in to your Game Center account from within a game, try creating or signing in to your account using the Game Center app.
    Try signing out of your Game Center account, then sign back in. If you can't sign in to your Game Center account with an Apple ID, try resetting your password or using another email address. To manage your Apple ID account, go to My Apple ID.
    When using a Wi-Fi connection, verify that your Wi-Fi router is configured for Game Center.
    Take care,
    Alex H.

  • Select in plsql

    Hi All,
    - 11gR2 DB set as:
    NLS_CHARACTERSET US7ASCII Character set
    NLS_NCHAR_CHARACTERSET AL16UTF16 NCHAR Character set
    - I have a table column, shown below, that stores special characters like ABCµτŭXYZ:
    entity nvarchar2(2000)
    - When I view the data in Oracle SQL Developer, the special characters show up fine.
    The moment I select the entity column in PL/SQL, the special characters are lost and show up as ?.
    That is, ABC???XYZ.
    How do I fix this? By using a Java stored procedure, etc.?
    Any thoughts?
    Thanks for your help.

    user5156030 wrote:
    What I meant is
    after selecting the entity in PL/SQL, I am displaying it using htp.print (I am doing all this in a stored procedure and displaying it using the PL/SQL Web Toolkit)
    In that case the data is rendered by the browser. PL/SQL acts as the "conduit" of data between the database and the browser.
    The browser needs to support the specific charset to render. It needs to be instructed (via the HTML response created by PL/SQL) correctly. More than that I unfortunately do not know - we do not deal with anything other than standard charsets in Oracle and HTML.
    I did the following test - created this procedure and called it via an Apache mod_plsql server.
    create or replace procedure WebTest is
    begin
            htp.prn( 'Japanese kanji (漢字) character set' );
    end;
    The web browser displayed the following:
    Japanese kanji (¿¿) character set
    So it seems to be a charset issue - possibly you need to specify specifics in the MIME header of the HTML payload? But this is not a PL/SQL language or Oracle issue, and neither is the actual rendering of that string on the client. You need to make sure that your code generates the correct payload for the client to render.
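    To follow up on "specify specifics in the MIME header": with the standard OWA toolkit you can emit the Content-Type header yourself before any page content. A hedged sketch (it assumes a toolkit version whose owa_util.mime_header accepts a charset argument; the procedure is otherwise the same test as above):
    create or replace procedure WebTest is
    begin
      -- send an explicit Content-Type: text/html; charset=UTF-8 header first
      owa_util.mime_header('text/html', TRUE, 'UTF-8');
      htp.prn('Japanese kanji (漢字) character set');
    end;
    Note that with a US7ASCII database character set, the VARCHAR2 values passed through htp may already have lost the characters before the header is even considered, so this addresses the rendering side only.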

  • Cfhttp error on CF7 but not CF8 - need help

    I have been battling this for a few days now, and I am starting to go a bit insane.  So, I am hoping someone can offer some assistance.
    I am trying to make a call to the Live Contacts API.  On CF8, I can get this to work but on CF7 it throws an error:
    <cfhttp url="#theURL#" method="get" result="httpResult">
         <cfhttpparam type="header" name="Authorization" value="DelegatedToken dt=""#dt#""" />
         <cfhttpparam type="header" name="Accept-Encoding" value="*" />
    </cfhttp>
    The Error:
    Charset: [empty string]
    ErrorDetail: I/O Exception: peer not authenticated
    Filecontent: Connection Failure
    Header: [undefined struct element]
    Mimetype: Unable to determine MIME type of file.
    Responseheader: struct [empty]
    Statuscode: Connection Failure. Status code unavailable.
    Text: YES
    theURL and dt are dynamically created with the info returned from the consent token. That process works fine and they are what they should be. But there is a difference between versions that is throwing it off. My theory is that it is a charset issue. I ran into something similar with the Gmail Contacts API on CF7 vs CF8, and was able to resolve it by using charset="utf-8". However, that doesn't work for Live: <cfhttp url="#theURL#" method="get" result="httpResult" charset="utf-8">.
    I have tried about every combination I can think of but still get a connection error. On CF8 servers (tested on two different ones) it worked perfectly.
    In googling and looking at forums, I see this mentioned in reference to SSL issues. I don't think that is the case here, as one of my servers doesn't have an SSL certificate in use that would affect things. But it is in a hosting environment, so maybe.
    I think I am missing something small, or some combination of headers or something. Any help would be greatly appreciated... and maybe even rewarded with a Starbucks GC! If you have any questions or need any more info, let me know.
    Thanks.

    also sent to you on twitter:
    you have:
    <cfhttpparam type="header" name="Authorization" value="DelegatedToken dt=""#dt#""" />
    try:
    <cfhttpparam type="header" name="Authorization" value='DelegatedToken dt="#dt#"' />
    note the use of single and double quotes.
    not sure if that would cause it but worth a shot.
    Does the Live API require that the dt be surrounded by double quotes?
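    One more data point: "I/O Exception: peer not authenticated" is typically the Java SSL layer refusing the remote certificate rather than anything charset-related, and CF8 runs on a newer JVM with a newer bundled certificate store than CF7, which would explain why the same code works on CF8. If that turns out to be the cause, the usual fix is to import the API endpoint's certificate into the CF7 JVM's keystore, roughly (path and alias are illustrative; adjust for your install):
    keytool -import -trustcacerts -alias livecontacts -file live_api.cer -keystore <cf7_jre>\lib\security\cacerts -storepass changeit
    After importing, restart the ColdFusion service so the JVM rereads the keystore.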

  • Problem convertting certain extended ascii characters

    I'm having problems with the extended ASCII characters in the range 128-159. I'm working in a SQL Server environment using Java. I originally had problems with characters in the range 128-159: when I did a 'select char_col from my_table', I always got junk when I tried to retrieve the value from the ResultSet using 'String str = rs.getString(1)'. For example, char_col would hold the character 0x83 (in hex), but when I retrieved it from the database, my str equaled '0x192'. I'm aware there is a gap in the range 128-159 in the ISO-8859-1 charset. I've tracked the problem down to a charset issue converting the extended ASCII characters in ISO-8859-1 into Java's Unicode representation.
    I looked on the forum and it said to try specifying the charset when retrieving from the ResultSet, so I did 'String str = new String(rs.getBytes(1), "ISO-8859-1")' and it was able to read the characters 128-159 correctly except for five characters (129, 141, 143, 144, 157). These characters always came back as character 63, i.e. 0x3f. Does anyone know what's happening here? How come these characters didn't work? Is there a workaround for this? I need to use only Java and its default charsets, and I don't want to switch to the Windows Cp1252 charset because I'm using the Java code in a Unix environment as well.
    thanks.
    -B

    Normally your JDBC driver should understand the charset used in the database, and it should use that charset to produce a correct value for the result of getString(). However it does sometimes happen that the database is created by programs in some other language that ignore the database's charset and do their own encoding, bypassing the database's facilities. It is often difficult to deal with that problem, because the custodians of those other programs don't have a problem, everything is consistent for them, and they will not allow you to "repair" the database.
    I don't mean to say that really is your problem, it is a possibility though. You are using an SQL Server JDBC driver, aren't you? Does its connection URL allow you to specify the charset? If so, try specifying that SQL-Latin1 thing and see if it works.
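    One nuance worth adding: "windows-1252" ships as a charset with the Sun/Oracle JRE on Unix as well as on Windows, so decoding with it does not tie the code to a Windows machine - it describes the data, not the platform the JVM runs on. A small hypothetical helper along those lines (the column index is whatever you are already using):
    import java.io.UnsupportedEncodingException;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class LegacyTextReader {
        // Decode a column's raw bytes as windows-1252 instead of relying on the
        // driver's default conversion; most bytes in the 0x80-0x9F range then map
        // to the typographic characters Windows uses there instead of being lost.
        public static String read(ResultSet rs, int column)
                throws SQLException, UnsupportedEncodingException {
            byte[] raw = rs.getBytes(column);
            return raw == null ? null : new String(raw, "windows-1252");
        }
    }
    That said, the cleaner fix is the one suggested above: tell the JDBC driver the database's real character set in the connection URL so that getString() returns correct values directly.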

  • Application import inserts newlines (v1.5)?

    Hi,
    When I export my application from the development server and import it to the test server, there are newlines in my code that were not there before. It has never bothered me before - I had not even noticed - but now one of them lands inside a JavaScript string, so JavaScript throws a non-delimited-string error when it executes. I actually construct the JavaScript inside a region's display condition (not a very good place, but it makes sense for the situation), and removing the newline fixes the code. I don't see the newline in that place in the export file, though, so importing must be creating it.
    I don't want to search my whole application for every possible JavaScript error each time I update the version on my production site. I still use HTML DB version 1.5.0.00.33, so I don't know, it might be fixed in some later version - and then I would like to know about it so that I can convince my client to move to the new version. Or is it possible that there is some workaround/fix?
    Hoping for an answer,
    Kaja

    I am back to the issue.
    Now it is verified that the issue is still present in version 1.6 - I upgraded my HTML DB instance to 1.6 and the JavaScript worked. Then I exported the application from there and imported it into the same instance again - and the JavaScript had the disturbing newlines.
    When I imported the same file exported from my 1.6 instance into htmldb.oracle.com (workspace kaja_test, application 35414), I found that there are even more newlines inserted, and in other locations than before (see page 21, region "Müügiraporti alus", inside the new value of :PTHIS_TABS). Could it be a charset issue?
    The sqlplus workaround is still ok, I just thought you might want to know to be able to address it in some future release.
    Kaja

  • How to store binary data in Java

    I need to show news in my web application, so I am retrieving news from a feed server in which the field GEN_UNICODE_MESSAGE_1 provides the news data. The news consists of both English and Arabic text. How should I store the binary content in Java, and how can I display the news in a JSP?
    I tried storing the binary content in a ByteArrayInputStream and appending it using a StringBuilder, but I didn't get the desired output. The output is seen as ÇáÎáíÌíÉ ááÇÓÊËãÇÑÇáãÌãæÚÉÇáÔÑßÉÇáÇãÓÈÊãÈÑÓÈÊãÈÑÅÌãÇá. Please help me.
    Here GEN_UNICODE_MESSAGE_1 is of type BINARY.
    try {
        BufferedInputStream in = new BufferedInputStream(
            new ByteArrayInputStream((byte[]) msg.getField("GEN_UNICODE_MESSAGE_1").value.get(h)));
        // ByteArrayInputStream in = new ByteArrayInputStream((byte[]) msg.getField("GEN_UNICODE_MESSAGE_1").value.get(h));
        StringBuilder builder = new StringBuilder();
        byte[] buff = new byte[1024];
        int len;
        while ((len = in.read(buff)) != -1) {
            builder.append(new String(buff, 0, len));
        }
        String incomingmsg = builder.toString();
    } catch (IOException e) {
    }
    Edited by: Alance on May 7, 2010 10:12 PM

    BufferedInputStream bis = new BufferedInputStream(new ByteArrayInputStream((byte[]) msg.getField("GEN_UNICODE_MESSAGE_1").value().toString().getBytes()));
    BufferedReader br = new BufferedReader(new InputStreamReader(bis));
    What on earth is all this?
    1. What is the type of msg.getField("GEN_UNICODE_MESSAGE_1").value()?
    2. Whatever that type is, you are then converting it to a String.
    3. You are then converting that to bytes.
    4. You are then casting that to byte[], which it already is,
    5. You are then wrapping a ByteArrayInputStream around that.
    6. ... and an InputStreamReader round that ...
    7. ... and a BufferedReader around that ...
    8. ... and reading lines
    9. ... and appending them to a StringBuilder
    10. ... and converting that to a String.
    Which you already had at step 2.
    String incoming = msg.getField("GEN_UNICODE_MESSAGE_1").value().toString();
    Not sure whether that's correct given the charset issues, but if not, the problem starts at step 2. In any case steps 3-10 add precisely nothing.
    This is all pointless. The key question is (1). Answer that first.
    2. The type of String.getBytes()
    >
    StringBuilder sb = new StringBuilder();
    String line = null;
    while ((line = br.readLine()) != null) {
        sb.append(line + "\n");
    }
    br.close();
    String incomingmsg = sb.toString();
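    To make the point above concrete: once you have the raw bytes, the whole pipeline collapses to a single decode with an explicit charset. A hedged sketch - the class name is mine and the charset name is an assumption to replace with whatever the feed really delivers (UTF-16 is another common choice for fields named *_UNICODE_*):
    import java.io.UnsupportedEncodingException;

    public class NewsDecoder {
        // Decode the field's raw bytes in one step; no streams or readers needed.
        public static String decode(byte[] raw) throws UnsupportedEncodingException {
            return new String(raw, "windows-1256");   // assumption: a Windows Arabic code page; use "UTF-8"/"UTF-16" if the feed says so
        }
    }
    Called as decode((byte[]) msg.getField("GEN_UNICODE_MESSAGE_1").value.get(h)), mirroring the cast in the original code. The garbled text shown in the question looks consistent with Windows-1256 Arabic bytes decoded as a Latin-1-style single-byte charset, which is why windows-1256 is the first guess here.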

  • Data Corrupted when using CREATE AS SELECT * FROM over DB LINK

    Hi ,
    I wonder if anyone has suffered a similar issue to this:
    I have a DB link between a 10gR1 database (base install, not patched) on Windows 2003 and a 10gR2 database (with the latest patch set) on Sun Solaris. The 10.1 DB is the SOURCE, the 10.2 is the TARGET.
    When using a statement like the following on the TARGET:
    CREATE TABLE a AS SELECT * FROM a@SOURCE
    The statement completes; however, for some (not all) tables the data content is seriously corrupted. I noticed this when trying to apply a unique key on the newly created table on the TARGET. The data was completely messed up for around 600 out of 450000 rows, and I could not easily tell which rows were affected, as it was the columns that build the unique key that were hit :-(
    The same happens if I pre-create the table and use INSERT /*+ APPEND */ INTO ... SELECT, etc.
    I realise that using an unpatched 10.1 is not necessarily advisable; however, it is currently difficult for us to patch or upgrade the DB...
    If anyone has seen this before and/or has any ideas what might be the cause, I would appreciate any help I can get.
    Cheers
    JAmes

    The problem manifests itself in that some fields are created with incorrect values - specifically, in the easy-to-identify cases, NULL where the original values were not NULL.
    These fields do not contain special chars like ä or ß, so it does not seem to indicate a charset issue.
    JAmes

  • BLOBs stored in DB

    I have some files stored in the DB (11g) as BLOBs... mostly images, but some PDFs. The files are uploaded using an upload web page into a table like:
    CREATE TABLE wpg_document (
      NAME               VARCHAR(256)   UNIQUE NOT NULL,
      MIME_TYPE          VARCHAR(128),
      DOC_SIZE           NUMBER,
      DAD_CHARSET        VARCHAR(128),
      LAST_UPDATED       DATE,
      CONTENT_TYPE       VARCHAR(128),
      CONTENT            LONG RAW,
      BLOB_CONTENT       BLOB
    );
    I used the "Understanding mod_plsql" article as a reference to set this up.
    I use the PL/SQL Developer tool, and can "see" the BLOBs in this table (blob_content column), and they look fine.
    Now I am trying to provide the user with a page to view or download the files. I think I'm running into charset issues, but I'm not sure... all of the above get stored as "ascii" in the DAD_CHARSET column (not sure why)... our DB is UTF-8 (NLS_CHARACTERSET is AL32UTF8).
    On my "download" page, the PDFs come across as corrupt, and the images render garbled (actually, the first 1/3 (the first 4086 bytes?) of the image shows fine, then it's all garbled). Here's the code snippet I'm using to fetch the BLOBs, which is similar to a lot of what's out there on the web:
    lv_blob                      BLOB;
    v_amt NUMBER DEFAULT 4096;
    v_off NUMBER DEFAULT 1;
    v_raw RAW(4096);
    ..SELECT BLOB_CONTENT INTO lv_blob FROM wpg_document WHERE...
    BEGIN
           LOOP
             --dbms_output.put_line('Offset = '||v_off);
             -- Read the BLOB
             dbms_lob.READ(lv_blob, v_amt, v_off, v_raw);
             -- Display image
             htp.prn(utl_raw.cast_to_varchar2(v_raw));
             v_off := v_off + v_amt;
             v_amt := 4096;
           END LOOP;
           dbms_lob.CLOSE(lv_blob);
         EXCEPTION
           WHEN NO_DATA_FOUND THEN
             NULL;
         END;
    When I run this procedure straight from a PL/SQL Developer "test" window, I get an error (but no error when I browse to the download webpage)
    ORA-06502: PL/SQL: numeric or value error: character string buffer too small
    it points to this line:
    htp.prn(utl_raw.cast_to_varchar2(v_raw));
    I'm not sure what's going on here...any ideas?
    tia

    I have narrowed this down a bit. When I change the offset (v_off) to other values (e.g., 1000, 2000, 32767), more or less of the image gets garbled. So if I set it to 2000, the first 2000 bytes of the image seem OK, but everything after that is garbled (corrupt). Not sure why.
    Not sure this is a solution, nor why it works, but if I replace everything above between BEGIN and END with:
    wpg_docload.download_file(lv_blob);
    everything works fine... dunno.
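    Since wpg_docload turned out to be the answer, here is roughly what the download procedure can shrink to - a hedged sketch that assumes the wpg_document table from the post and the standard OWA/mod_plsql toolkit (procedure and parameter names are mine):
    create or replace procedure download_doc(p_name in varchar2) is
      lv_blob blob;
      lv_mime varchar2(128);
    begin
      select blob_content, mime_type
        into lv_blob, lv_mime
        from wpg_document
       where name = p_name;
      -- send the headers, then let wpg_docload stream the LOB;
      -- no RAW-to-VARCHAR2 conversion, so the bytes never pass through a character-set translation
      owa_util.mime_header(nvl(lv_mime, 'application/octet-stream'), FALSE);
      htp.p('Content-Length: ' || dbms_lob.getlength(lv_blob));
      owa_util.http_header_close;
      wpg_docload.download_file(lv_blob);
    end;
    The garbling in the original loop comes from pushing binary RAW data through htp.prn as VARCHAR2, which subjects it to the gateway's character-set conversion; wpg_docload bypasses that entirely.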
