Character integrity issue after data conversion in database/JDBC

Hi,
I am using Oracle 9i with the following NLS settings:
NLS_LANGUAGE: AMERICAN
NLS_CHARACTERSET: UTF8
NLS_NCHAR_CHARACTERSET: AL16UTF16
I am running on Linux with this environment language setting:
LANG: en_US.UTF8
I am sending Hindi characters in an XML file (UTF-8 encoding) to my Java application to be stored in the database. In my XML file, the characters are given as the following numeric character references (ignore the double quotes; they are only there so the browser does not interpret the references):
"&#x928";"&#x92E";"&#x938";"&#x94D";"&#x924";"&#x947"
But the characters appear unreadable in the database. When I use SELECT DUMP to check the character encoding, I get:
Typ=1 Len=12 CharacterSet=UTF8: 0,28,0,2e,0,38,0,4d,0,24,0,47
When I retrieve the data from the database via my application, garbled characters appear.
However, if I manually input the Hindi characters into the table column, the Hindi characters appear correctly. When I do a DUMP to check, this is what I get:
Typ=1 Len=12 CharacterSet=UTF8: 9,28,9,2e,9,38,9,4d,9,24,9,47
When I check the Unicode chart at http://www.unicode.org/charts/PDF/U0900.pdf, the second DUMP result is the correct one. When I retrieve that data from the database via my application, the correct Hindi string appears.
I understand that Java strings are encoded in UTF-16 and that Oracle JDBC converts from UTF-16 to UTF-8 before storing the data in my database, and vice versa. What puzzles me is why the correct Hindi string appears on my web interface when the same conversion is used to extract the data from the database. At first I suspected a conversion problem in JDBC, with the UTF-16 characters being truncated to UTF-8 while storing the data. But when good data is stored in the database, the extraction is correct, even though it goes through the same conversion.
I have read several threads on this forum and the Oracle Globalization Support article, but I cannot find an answer to my question.
Can anyone help? Thanks.
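A quick way to narrow down where the 0x09 high byte is being dropped (a diagnostic sketch only, not from the original thread; variable and class names are illustrative) is to print the UTF-16 code units of the parsed string immediately before the JDBC insert. If they already read 0028 002e ..., the damage happens during XML parsing or entity handling; if they read 0928 092e ..., the problem lies between JDBC and the database.

// Diagnostic sketch: print the UTF-16 code units of the parsed value before it is inserted.
public class CodePointDump {
    static void dump(String value) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < value.length(); i++) {
            sb.append(String.format("%04x ", (int) value.charAt(i)));
        }
        System.out.println(sb.toString().trim());
    }

    public static void main(String[] args) {
        // Expected output for the correct string: 0928 092e 0938 094d 0924 0947
        dump("\u0928\u092E\u0938\u094D\u0924\u0947");
    }
}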


Similar Messages

  • Issue with Date Conversion when loading XML File into Oracle 10g Database

    Hello all,
    I have the interface shown in the screenshot below. In it, amongst other actions, I'm mapping an XML file element representing a date to an Oracle table column defined as DATE. The source and target columns are highlighted in the screenshot.
    (Screenshot: http://img223.imageshack.us/img223/1565/odiscr275.jpg)
    When I execute the interface, I get the following error message:
    java.lang.IllegalArgumentException at java.sql.Date.valueOf(Date.java:103)
    I'm assuming this refers to the date conversion!
    I've already tried replacing SRC_TRADES.DEAL_DATE with TO_DATE( SRC_TRADES.DEAL_DATE, 'DD/MM/YYYY' ) in the Implementation tab. This function was not recognised when I executed the interface, so it didn't work! The date value in the XML file is in DD/MM/YYYY format.
    I'm guessing that Oracle SQL Date functions don't work in the Implementation tab. Please could somebody let me know:
    1. Which Date Conversion function I could use instead?
    2. Where I can find a reference for the methods/functions I can use in the Implementation tab (if such a reference exists)?
    Cheers.
    James

    Hi.
    Try changing the execution area to the staging area. After you change it, write just SRC_TRADES.DEAL_DATE in the mapping box. When you use TO_DATE, the source field type should be VARCHAR2, not DATE (as it is in your source datastore).
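    If the DD/MM/YYYY string ends up being converted on the Java side, note that java.sql.Date.valueOf only accepts the yyyy-mm-dd form, which is exactly the IllegalArgumentException seen above. A minimal sketch of a safe conversion (class and method names are illustrative, not part of ODI):

    import java.text.ParseException;
    import java.text.SimpleDateFormat;

    public class DealDateConverter {
        // Parses a DD/MM/YYYY string (e.g. a DEAL_DATE value) into java.sql.Date.
        public static java.sql.Date toSqlDate(String ddmmyyyy) throws ParseException {
            SimpleDateFormat in = new SimpleDateFormat("dd/MM/yyyy");
            in.setLenient(false);                      // reject values such as 31/02/2010
            java.util.Date parsed = in.parse(ddmmyyyy);
            return new java.sql.Date(parsed.getTime());
        }

        public static void main(String[] args) throws ParseException {
            // java.sql.Date.valueOf("20/10/2011") would throw IllegalArgumentException;
            // parsing with an explicit pattern avoids that.
            System.out.println(toSqlDate("20/10/2011")); // prints 2011-10-20
        }
    }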

  • Some character display incorrect after unicode conversion

    Dear Experts,
    We are currently upgrading our SAP system from 4.6B non-Unicode to ECC 6.0 Unicode.
    In the 4.6B system we had implemented the Simplified Chinese and French language packages. After the upgrade to the ECC 6.0 non-Unicode system, we performed an MDMP Unicode conversion. The system is currently running on the ECC 6.0 Unicode interface with languages 1EFD. The SUMG-related tasks have been finished, and Chinese and French check out OK in the ECC 6.0 Unicode system.
    Our issue is that, in the past, some local Spanish end users typed Spanish characters into master data fields while logged on to the 4.6B GUI with English as the logon language. These Spanish characters do not display correctly in the current Unicode environment.
    Could you please give us some suggestions on how to fix this issue?
    I have also posted this message in the NetWeaver administrator forum; see the link below.
    Spanish display incorrect after unicode conversion

    Hello,
    I doubt that the data shown by you in the link was caused by Spanish users logged on in EN.
    I think this was rather caused by utf-8 data uploaded into the Non-Unicode system.
    In this case, you need to repair the affected texts manually; I am not aware of any automatic repair.
    Best regards,
    Nils Buerckel
    SAP AG

  • Unable to Merge after Data Conversion

    I read an entry with a similar solution of adding a script component after the data conversion, but I am not finding all the properties that were referenced in that solution. Was that perhaps for a 2005 SSIS job?
    I never saw a way to set the script component as asynchronous. I was only given the choices of None or Input (Input 0) for the SynchronousInputID property of the output. The only column having a SortKeyPosition is the input column, and that property is grayed out. I did not find a way to "add an override for ProcessInputRow". It looks like there is a method Output0Buffer.AddRow that is called from somewhere in the script, but all the method stubs have been removed. So is there supposed to be a main() method or something in which to call this Output0Buffer.AddRow method?
    I don't deal with the script component normally, so I need a little detail if possible. Thanks.

    First, you set the SynchronousInputID to None to create an Asynchronous script component.
    Next, once you have set the output to be Asynchronous, you add the columns to that output.  Then there are two methods you can use to add rows to that output:
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        Output0Buffer.AddRow();
        Output0Buffer.Test = Row.Column + 1;
    }
    public override void CreateNewOutputRows()
    {
        base.CreateNewOutputRows();
        for (int i = 100; i < 110; i++)
        {
            Output0Buffer.AddRow();
            Output0Buffer.Test = i;
        }
    }
    The first, xxx_ProcessInputRow, is the more common. This is what you would use if you want to populate your output with values from the input. If you want to start from scratch, with no influence from the input, then override CreateNewOutputRows.
    I can't think of a compelling reason to use the latter.
    Russel Loski, MCT, MCSE Data Platform/Business Intelligence. Twitter: @sqlmovers; blog: www.sqlmovers.com

  • Character set issue after import?

    Hi,
    Source DB version:10.2.0.1
    OS:Red hat Linux
    Target DB version:10.2.0.1
    OS:Windows server
    source database character set:AL32UTF8
    Performed the export as below
    $export NLS_LANG=AMERICAN.AL32UTF8
    Performed the full database export; it finished successfully without any warnings.
    Export done in AL32UTF8 character set and AL16UTF16 NCHAR character set
    Now imported into the target database as below.
    target database character set:AL32UTF8
    c:\>set NLS_LANG=AMERICAN.AL32UTF8
    Then ran the import command, which completed successfully without any warnings.
    However, I'm having problems with Greek characters. Most of them are shown as ?, while some of them are converted to Latin characters.
    For example:
    This was supposed to be Αγγελική but appears as ???e????
    And this Κουκουτσάκη appears as ??????ts???
    While this one should be Δήμητρα but appears as ??µ?t?a
    From the import log file I can see 'import done in AL32UTF8 character set and AL16UTF16 NCHAR character set', which I believe is correct.
    Can anyone tell me how I can overcome this problem with the Greek characters?
    Thank you all.

    PARAMETER                      VALUE
    NLS_LANGUAGE                   AMERICAN
    NLS_TERRITORY                  AMERICA
    NLS_CURRENCY                   $
    NLS_ISO_CURRENCY               AMERICA
    NLS_NUMERIC_CHARACTERS
    NLS_CHARACTERSET               AL32UTF8
    NLS_CALENDAR                   GREGORIAN
    NLS_DATE_FORMAT                DD-MON-RR
    NLS_DATE_LANGUAGE              AMERICAN
    NLS_SORT                       BINARY
    NLS_TIME_FORMAT                HH.MI.SSXFF AM
    NLS_TIMESTAMP_FORMAT           DD-MON-RR HH.MI.SSXFF AM
    NLS_TIME_TZ_FORMAT             HH.MI.SSXFF AM TZR
    NLS_TIMESTAMP_TZ_FORMAT        DD-MON-RR HH.MI.SSXFF AM TZR
    NLS_DUAL_CURRENCY              $
    NLS_COMP                       BINARY
    NLS_LENGTH_SEMANTICS           BYTE
    NLS_NCHAR_CONV_EXCP            FALSE
    NLS_NCHAR_CHARACTERSET         AL16UTF16
    NLS_RDBMS_VERSION              10.2.0.1.0
    20 rows selected.
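    A quick way to tell whether the Greek text was already damaged in the source database before the export (a diagnostic sketch; the connection details, table and column names are hypothetical) is to compare the stored bytes on the source and target sides with DUMP, for example via JDBC:

    import java.sql.*;

    public class DumpGreekColumn {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details and table/column names, for illustration only.
            Connection c = DriverManager.getConnection(
                    "jdbc:oracle:thin:@source-host:1521:ORCL", "scott", "tiger");
            Statement s = c.createStatement();
            // DUMP(..., 1016) shows the stored bytes in hex plus the column character set,
            // so you can see whether the Greek text was damaged before the export ran.
            ResultSet rs = s.executeQuery(
                    "SELECT last_name, DUMP(last_name, 1016) FROM customers WHERE ROWNUM <= 10");
            while (rs.next()) {
                System.out.println(rs.getString(1) + " -> " + rs.getString(2));
            }
            c.close();
        }
    }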

  • Mapping issue after Content Conversion in PI 7.1

    Hi Experts
    I am working on a File (fixed format) to proxy scenario. The data is converted by the file content conversion and passed to the mapping as XML, but when the mapping runs, no values are returned on the target side.
    When I load the data (XML) file from SXMB_MONI on the sender side, all nodes show in red, although the file content conversion happened without issues.
    Please provide input on how to get the values mapped.
    Thanks
    PR

    A couple of checkpoints for you:
    1. When you load the XML from SXMB_MONI in the test tab of the message mapping and it turns red, this means the constructed XML (from the CC content conversion) doesn't match the one (XSD) defined in your ESR/IR. In this case you have to check the field values/field lengths of the file content conversion in the sender communication channel thoroughly again.
    2. Once you rectify the error above, you can test the mapping in the ESR message mapping.

  • E Rec integration issue while data transfer

    Hi Experts,
    In E-Recruiting we are facing an issue where education data is not transferred correctly from E-Rec to PA.
    The issue arises because in E-Rec we are using table T77RCF_DEGREE, while in PA we are using table T518a; the challenge is in synchronizing them.
    Please suggest whether any workaround is possible.
    Regards
    Puneet

    Hi Puneet,
    Usually the 'standard' way to carry out the synchronization is through the steps indicated in the following link:
    SAP Library - Talent Management and Talent Development
    This scenario should be recommended to your customer if they are using Talent Management in their HCM system and are on the correct version. Otherwise, if the customer still requires the sync to PA infotypes, I see the following workarounds (depending on your scenario, the technology you are using, and the moment at which you need to sync the data; if you can provide more detail on these points we can find the best solution):
    - Implement a custom RFC with all the logic to read/write/convert the information between infotypes
    - Use PI to convert/translate the information
    - Modify the logic of the synchronization reports mentioned in the link above.
    Hope this helps.
    Regards.

  • Issue with date conversion

    Hello,
    I am getting an input string as YYYYMMDD. I want it converted to a date format based on the format the user has in their profile.
    So if the user has MM/DD/YYYY or DD.MM.YYYY or any other format in their user data parameters, I want to convert it into that format.
    I tried using the FM CONVERT_DATE_TO_EXTERNAL, but it has an issue: when I test it, it does not accept data in YYYYMMDD format.
    Please help.
    Regards,
    Rajesh

    Hi Rajesh,
    1. Define a variable with type 'DATS'.
    2. Move input date string 'YYYYMMDD' to this.
    3. Write this date variable to Output string.
    DATA: v_date                 TYPE dats,
          output_date_string(10) TYPE c.
    v_date = input_date_string.
    WRITE v_date TO output_date_string.
    And output_date_string will contain the date in the user-defined date format.
    Thanks,
    Aditya. V

  • Workflow Attachment Issue after UTF8 conversion

    I have a custom workflow with an HTML attachment. It worked fine until the UTF8 conversion was done. Now the attachment is missing from the email message entirely.
    Any suggestions you may have would be greatly appreciated. Thanks in advance!

    Default HTML Attachment
    I am not sure if UTF8 conversion has anything to do with the HTML attachment missing. If you are on 11.5.9, the HTML attachment with the e-mail notification is driven by the Notification Preference that the user has. Following are the possible notification preferences that a user can select.
    MAILHTML - HTML mail with attachments
    MAILHTM2 - HTML mail
    MAILATTH - Plain text mail with HTML attachments
    MAILTEXT - Plain text mail
    SUMMARY - Plain text summary mail
    QUERY - Do not send me mail
    SUMHTML - HTML summary mail
    If the user has MAILHTM2, the default HTML attachments are not sent. You need to have MAILHTML to receive default HTML attachments.
    Custom Attachment
    If you had written your own PL/SQL Document API to generate an HTML attachment for the notification, you will have to check whether it is visible in the web-based notification accessible from Self-Service. Also check in the workflow definition that the Document Type attribute for the HTML attachment is still available in the DB and that the Attach Content check box is checked.
    Hope this helps
    Vijay

  • Home Folder Issues after data drive move.....

    Hello All,
    I have a Mac OS X Server Leopard (latest version) that we have had running without too much trouble for some time now. When we ran out of space, I added a new large hard drive to our Xserve and dittoed the user home folders to the new drive. I unshared the home folder on the old drive, and then set up an identical share on the new drive. When users log in on the Mac, the system finds the home directory on the new drive fine, no issue.
    The problem arises when a user does a Connect to Server request. Their home folder comes up in the list and they mount the drive. However, this share point seems to be the folder still on the old drive. I have now deleted the folder containing the user home folders on the old drive, but it simply gets recreated when a user does a Connect to Server request. This folder is no longer shared, and I can't understand why it still comes up as the old share point.
    It is not a huge issue, but rather annoying.
    Any ideas would be great.
    Thanks
    Michael

    Functionality for customized setups may differ slightly with a new os. Power cycling the device daily is always a good idea for maintenance.
    I don't really consider having groups of icons on my home screen a customized setup.  This is a native feature of the Android OS and something on the update caused this issue.  I've been running 4.3 on two other devices (Nexus 7 and Sprint Galaxy S4) for months and I've had zero issues.  If I didn't use folders I would need to fill up 3+ home screens just to have enough space for the apps I have on my main home screen.  I have an identical setup on the Sprint Galaxy S4 (test phone for work) that is running 4.3 and I don't have this issue.
    Second, are you officially saying that rebooting an Android device on a daily basis is a best practice?  I typically reboot my phone every couple of weeks, but daily seems a bit excessive.  Since the update I have to reboot at least once every 2-3 hours during the day to resolve the folder issue.  It's the first thing I have to do in the morning to be able to access any apps that are within a folder. 

  • Date conversion for internal table

    Hi experts,
    I have an issue with Date conversion.
    I have declared my internal table with the same type as the standard table from which I'm fetching data.
    The date comes in YYYYMMDD format and I have to change it to MM/DD/YYYY format.
    I tried using a function module and a mask as well. It works, but when I put the value back into my internal table it truncates my conversion.
    For example: 20110530, after conversion, becomes 05/30/20 when I put it back into the internal table;
    because of the length issue, the "11" disappeared.
    I tried changing the type declaration of my internal table to: date(10) TYPE c.
    But when I execute the code, at the place where I move the values from the standard table into the internal table, it gives a runtime error.
    Can anyone help me with this?

    Hi,
    Why not keep the date in MMDDYYYY format, dropping the '/' separators? Doing this would reduce the size to 8, and the separators can be added back later in a field-level routine.
    Regards
    Raj Rai

  • Java date conversion interesting

    Hi,
    I am facing an issue in date conversion. I receive a date from web services in the format below:
    2008-06-22T18:11:00.000-04:00
    When I receive the Calendar object and do:
    Date date = calenderObj.getTime();
    SimpleDateFormat dateFormat = null;
    String formatedStr = "";
    try {
        dateFormat = new SimpleDateFormat(formatToConvert);
        formatedStr = dateFormat.format(date);
    } catch (Exception e) {
        // handle/log the formatting failure
    }
    the date gets pushed forward to the next day,
    i.e. I get 2008-06-23.

    Vicky wrote:
    This is the format of the date I am receiving through web services: 2008-06-22T18:11:00.000-04:00
    As a String, java.util.Date, java.sql.Date or what? The class of the object matters.
    The Calendar object that is sent by web services is automatically converted to the date 2008-06-23
    By what?
    but if the date is sent in this pattern, 2008-06-22, the Calendar object that is created shows the proper date; any idea why this happens?
    Sorry Vicky, but you have added nothing further to your original problem. You need to show exactly what you are doing and explain exactly what the class of each object is, exactly the content of each object, and exactly what you expect and what you actually get.
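    What is probably happening, shown as a sketch (class name is illustrative; the offset is written as -0400 because SimpleDateFormat's Z pattern expects the RFC 822 form, while the ISO -04:00 form needs the X pattern available from Java 7): formatting with a time zone east of the sender's converts 2008-06-22T18:11:00-04:00 to a local wall-clock time that already falls on 2008-06-23, whereas formatting in the sender's offset keeps the original date.

    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.TimeZone;

    public class WsDateFormatDemo {
        public static void main(String[] args) throws Exception {
            // Parse the web-service timestamp (offset -04:00) into a java.util.Date, i.e. an instant.
            SimpleDateFormat in = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSZ");
            Date instant = in.parse("2008-06-22T18:11:00.000-0400");

            SimpleDateFormat out = new SimpleDateFormat("yyyy-MM-dd");

            // Formatted in a zone east of UTC (Asia/Kolkata as an example), the instant is already the 23rd.
            out.setTimeZone(TimeZone.getTimeZone("Asia/Kolkata"));
            System.out.println(out.format(instant));   // 2008-06-23

            // Formatted in the sender's offset, the original calendar date is preserved.
            out.setTimeZone(TimeZone.getTimeZone("GMT-04:00"));
            System.out.println(out.format(instant));   // 2008-06-22
        }
    }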

  • Unsupported data conversion

    I just upgraded my JDBC driver to BEA's MS SQL Server Driver (Type 4) in WLS 8.1. I get the following error when executing rs.getTimestamp(1) with a NULL value fetched. Any ideas?
    java.sql.SQLException: [BEA][SQLServer JDBC Driver]Unsupported data conversion.
    at weblogic.jdbc.base.BaseExceptions.createException(Unknown Source)
    at weblogic.jdbc.base.BaseExceptions.getException(Unknown Source)
    at weblogic.jdbc.base.BaseData.unsupportedConversion(Unknown Source)
    at weblogic.jdbc.base.BaseData.getTimestamp(Unknown Source)
    at weblogic.jdbc.base.BaseResultSet.getTimestamp(Unknown Source)
    at weblogic.jdbc.base.BaseResultSet.getTimestamp(Unknown Source)
    at weblogic.jdbc.wrapper.ResultSet_weblogic_jdbc_base_BaseResultSet.getTimestamp(Unknown Source)

    Steven Yip wrote:
    I tried to run the application under WLS 8.1 SP4, which showed that the driver version is 3.40.19 (012727.007216.008716); however, the problem persists.
    Hi Steven. I think something else must be wrong; I can't duplicate it.
    Here's a small program. Change it to use your user, password, DBMS, etc.,
    and see if you can alter it to show the problem.
    What is your table definition?
    thanks,
    Joe
    import java.io.*;
    import java.util.*;
    import java.net.*;
    import java.sql.*;
    import weblogic.common.*;
    public class ddora
    {
        public static void main(String argv[]) throws Exception
        {
            Connection c = null;
            try
            {
                java.util.Properties props = new java.util.Properties();
                props.put("user", "scott");
                props.put("password", "tiger");
                props.put("SID", "JOE");
                Driver d = (Driver) Class.forName("weblogic.jdbc.oracle.OracleDriver").newInstance();
                c = d.connect("jdbc:bea:oracle://JOE:1521", props);
                System.out.println("Driver version is " + c.getMetaData().getDriverVersion());
                Statement s = c.createStatement();
                try { s.executeUpdate("drop table joe"); } catch (Exception ignore) {}
                s.executeUpdate("create table joe ( DATEOBJ DATE NULL )");
                s.executeUpdate("insert into joe values(NULL)");
                ResultSet rs = s.executeQuery("SELECT * FROM JOE");
                rs.next();
                rs.getTimestamp(1);   // throws "Unsupported data conversion" on the reporter's system
            }
            catch (Exception e)
            { e.printStackTrace(); }
            finally
            { try { c.close(); } catch (Exception e) {} }
        }
    }
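    For reference, a JDBC-compliant driver returns null from getTimestamp for a SQL NULL rather than throwing, so application code can read the column defensively. A minimal helper sketch (not specific to the BEA driver; names are illustrative):

    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Timestamp;

    public class NullSafeRead {
        // Returns the timestamp in the given column, or null if the column is SQL NULL.
        static Timestamp readTimestamp(ResultSet rs, int column) throws SQLException {
            Timestamp ts = rs.getTimestamp(column);
            return rs.wasNull() ? null : ts;
        }
    }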

  • Oracle Database : Date conversion issue between timezones

    Hi All,
    We are trying to convert a date from the Europe/Amsterdam time zone to the Australia/Sydney time zone and extract the time from it.
    We are facing an issue with an incorrect time after the conversion.
    Please find below issue details,
    Environment
    Database: Oracle 10.2.0.4
    Machine: Linux RHEL 4
    Location: Amsterdam, Netherlands
    Issue: incorrect time after converting a date from the Europe/Amsterdam time zone to the Australia/Sydney time zone
    SELECT TO_CHAR (FROM_TZ (TO_TIMESTAMP ('201110201416', 'YYYYMMDDHH24MI'),
                             DBTIMEZONE
                            ) AT TIME ZONE 'Australia/Sydney', 'HH24:MI')
      FROM DUAL
    Output of above SQL is 22:16.
    Expected output is 23:16
    Database Timezone (DBTIMEZONE) is set to +02:00 (i.e. GMT + 2) (Europe/Amsterdam)
    If we convert date to Australia/Tasmania timezone then we get expected output i.e. 23:16.
    Question: the expected time for Australia/Sydney is 23:16, so why does the Oracle database return 22:16?
    Please note that Sydney and Tasmania are in the same time zone, yet we get different output for the two.
    The Oracle database uses GMT+10 for Australia/Sydney and GMT+11 for Australia/Tasmania.
    If we want to use Australia/Sydney, how should we get the correct Sydney time?
    Regards
    Shailendra

    I made a test and it showed that when I omit TO_CHAR, both queries give the same result, 11:15 PM. So it is TO_CHAR which is the 'culprit'. Since this is a globalization issue, you may repeat your question in this forum:
    Globalization Support
    (so that one of the moderators can move the thread to that forum).
    Werner
    By the way, my local time zone is Europe/Berlin; there should be no difference from Amsterdam.
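    As a cross-check outside the database (a sketch relying on the JDK's time zone tables rather than Oracle's), converting the same Amsterdam local time to Australia/Sydney gives 23:16, because Sydney is on daylight saving time (UTC+11) on 20 October 2011:

    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.TimeZone;

    public class SydneyTimeCheck {
        public static void main(String[] args) throws Exception {
            // Parse 2011-10-20 14:16 as Europe/Amsterdam local time (CEST, UTC+2 on that date).
            SimpleDateFormat amsterdam = new SimpleDateFormat("yyyyMMddHHmm");
            amsterdam.setTimeZone(TimeZone.getTimeZone("Europe/Amsterdam"));
            Date instant = amsterdam.parse("201110201416");

            // Format the same instant in Australia/Sydney (UTC+11, since DST started on 2 Oct 2011).
            SimpleDateFormat sydney = new SimpleDateFormat("HH:mm");
            sydney.setTimeZone(TimeZone.getTimeZone("Australia/Sydney"));
            System.out.println(sydney.format(instant));   // prints 23:16
        }
    }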

  • Crystal Data Conversion Issue (Error converting data type varchar to datetime)

    Hi,
    I can run the stored procedure without error in SQL Server using my personal credentials as well as the database credentials.
    I can also run the Crystal Report without error on my desktop, after connecting to the stored procedure, using my personal credentials as well as the database credentials.
    But when I upload the Crystal Report to BOBJDEV and run it using database credentials, the report fails with "Error in File ~tmp1d1480b8e70fd90.rpt: Unable to connect: incorrect log on parameters. Details: [Database Vendor Code: 18456 ]", yet I can run the report successfully on BOBJDEV using my personal credentials.
    I googled this issue (data conversion error message) and a lot of people suggested doing "Verify Database" in Crystal Reports. I did that, but when I do it I get an error message like this:
    Error converting data type varchar to datetime.
    Where do you think the error might be occurring? Has anyone faced this kind of issue before? If so, how can it be resolved?
    (FYI, I am using Crystal Reports 2008, and for the stored procedure I used SSMS 2012.)
    Please help me with this issue.
    Thanks & Regards.
    Naveen.

    Hello Naveen,
    Since the report works fine in the CR designer / desktop, we need to figure out where you should post this question.
    By BOBJDEV do you mean BusinessObjects Enterprise or Crystal Reports Server? If so, please post this question in the BI Platform space.
    -Jamie

Maybe you are looking for

  • How do I sync new files without having to untick previous files

    I want to add more music files to my iPod; however, I cannot find a way of unticking over 400 songs already synced without doing so individually. I obviously don't want to duplicate those I already have on it, so can anyone give me a hint on how to do so

  • Triggering events for the Lost Triggers!!! Going Nuts....

    Gurus, Oracle ver 11g. I am going nuts and crazy here.... Can you think of any possible reasons why I have lost all the objects that I created for triggers (approx. 65 objects), and how can I get back those objects? I don't have the script to re-create al

  • How can I download iOS 5 without iTunes?

    My computer can only support up through Tiger, so I had to hook up my iPhone through my brother's computer months ago. Is there any way for me to get iOS 5?

  • Upgrading from logic 8 to 9?

    I have an "academic" version of Logic 8. I don't know what the difference is; I still paid full price from the Apple Store online. Can I still just buy the upgrade, or do I have to buy the whole package....again?

  • Java3D on Windows XP

    I wrote an application using Java3D. It works fine on Windows 2000, but when I run it on Windows XP, every time I load the program the mouse cursor begins to flicker. When I move the mouse, the whole screen of the program flashes a lot. Does anyon