In ECC 6.0, why is a Chinese character only 1 byte?

Dear experts,
In our current project the SAP version is ECC 6.0. In my program I found that a Chinese character is only 1 byte, for example:
data: l_char(20) type c.
l_char = '北京'.
l_char+4(6) = '欢迎您'.
write: l_char.
The result is: 北京  欢迎您
There are two blanks after '北京'.
So it causes a problem: I can't control the display length of the string, because the number of Chinese characters in the string is variable.
Why is a Chinese character only 1 byte in our system?
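For comparison, here is a small sketch I tried (same variable name l_char; the conversion to codepage 8400, Simplified Chinese, is only an assumption used to count bytes). With character-based offsets the gap disappears, which looks as if the system counts characters rather than bytes:
data: l_char(20) type c,
      l_text     type string,
      l_x        type xstring,
      l_chars    type i,
      l_bytes    type i,
      l_conv     type ref to cl_abap_conv_out_ce.

l_char      = '北京'.
l_char+2(3) = '欢迎您'.        "offset 2 = directly after the two characters of 北京
write: / l_char.               "北京欢迎您 - no blanks in between

l_conv = cl_abap_conv_out_ce=>create( encoding = '8400' ).
l_text = l_char.               "drops the trailing blanks
l_conv->convert( exporting data = l_text importing buffer = l_x ).
l_chars = strlen( l_char ).    "5 characters
l_bytes = xstrlen( l_x ).      "10 bytes - two per Chinese character in codepage 8400
write: / 'Characters:', l_chars, 'Bytes in 8400:', l_bytes.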
Thanks in advance.
Sam

Dear Rich,
Could you please help me resolve this problem, or give me some suggestions?
Thanks a lot
Sam

Similar Messages

  • Does one Simplified Chinese character occupy 1 byte in SAPscript in ECC 6.0?

    Recently we upgraded our SAP system from R/3 4.6C to ECC 6.0, and now one Simplified Chinese character occupies only 1 byte when I print a SAPscript form.
    This means that a field whose length is 12 can output 12 Simplified Chinese characters, whereas in R/3 4.6C it could only output 6.
    I don't know why. Can anybody tell me?

    Does nobody know how to solve it?

  • GUI_DOWNLOAD gives 2 bytes for each Chinese character - I need a fixed length

    In the Unicode system each Chinese character counts as one position, so a field of length 10 can take 10 Chinese characters, whereas before only 5 characters were allowed.
    The problem appears when the data is downloaded to a txt file: records that contain only Western European characters fill 10 bytes in the output file, while records that contain only Chinese characters fill 20 bytes.
    The data is exported to a system that can upload Chinese characters but only allows a fixed record length (always 10 bytes).
    In other words: records with Western European characters should contain 10 characters = 10 bytes, but records with Chinese characters should contain only 5 characters = 10 bytes.
    Anyone who can solve this, thanks in advance.
    My code:
      CALL METHOD CL_GUI_FRONTEND_SERVICES=>GUI_DOWNLOAD
        EXPORTING
          FILENAME          = p_file
          CONFIRM_OVERWRITE = 'X'
          CODEPAGE          = '8400' "Chinese
        CHANGING
          DATA_TAB          = it_output
        EXCEPTIONS
          FILE_WRITE_ERROR  = 1
          OTHERS            = 2.

    Hi Thomas,
    Now I understand.
    Please check the class CL_ABAP_CONV_X2X_CE.
    It may help you.
    Here is its documentation.
    Class CL_ABAP_CONV_X2X_CE
    Short Text
    Code Page and Endian Conversion Between External Formats
    Functionality
    Instances of the class CL_ABAP_CONV_X2X_CE allow you to convert text data between different character sets and numeric data between different number formats (byte order).
    Using the class corresponds to transforming data from a binary input stream to a binary output stream: The data objects are read from the input buffer sequentially and written to an output buffer.
    The class methods are normally used as follows:
    CL_ABAP_CONV_X2X_CE=>CREATE
    This creates a conversion instance. Among other options, you can specify the following as parameters: the input buffer (that is, the binary string that contains the data to be read), the character format used in the input buffer, the byte order used in the input buffer, the character format to be used in the output buffer, and the byte order to be used in the output buffer.
    CONVERT_C, CONVERT_...
    This converts the data. Calling this method several times consecutively allows you to read data from the input buffer sequentially and add it to the end of the output buffer.
    GET_OUT_BUFFER
    This gets the output buffer, making it available for further processing (such as writing to a file, sending using RFC).
    RESET
    This allows you to reset individual attributes of the conversion instance. (This method is especially designed for restarting on a new input buffer and deleting the output buffer while retaining the remaining attributes.)
    Relationships
    CL_ABAP_CONV_IN_CE
    Converting Binary Data into ABAP Data Objects
    CL_ABAP_CONV_OUT_CE
    Converting ABAP Data Objects to a  Binary Format.
    CL_ABAP_CHAR_UTILITIES
    Various Attributes and Methods for Character Sets and Byte Order
    Example
    In the following example, UTF-8 texts are converted to the codepage 1100 (Latin-1) and numbers are converted from little-endian to big-endian format:
    DATA:
      in_buffer TYPE XSTRING,
      out_buffer TYPE XSTRING,
      conv TYPE REF TO cl_abap_conv_x2x_ce.
    in_buffer = '414220C3B602010000'.
    conv = cl_abap_conv_x2x_ce=>create(
             in_encoding  = 'UTF-8'
             in_endian    = 'L'
             out_encoding = '1100'
             out_endian   = 'B'
             input        = in_buffer ).
    CALL METHOD conv->convert_c( n = 5 ).
    CALL METHOD conv->convert_i( ).
    out_buffer = conv->get_out_buffer( ).
    The content of the in_buffer variable is made up of  5 bytes (hexadecimal "414220C3B6"), which represent 4 UTF-8 characters (A, B, SPACE, o-Umlaut), and 4 bytes (hexadecimal "02010000"),  which represent the value 258 in little-endian format. This data is converted and the out_buffer variable now contains the hexadecimal string "414220F600000102". It is made up of:
    4 bytes (hexadecimal "414220F6"), which represent the above  4 characters in codepage 1100 (ISO-8859-1). Note that the length specification n = 5 refers to the number of elementary characters to be converted. For multi-byte codepages like UTF-8 this means that the multi-byte character "C3B6" is counted in length 2.
    4 Bytes (hexadecimal "00000102"), which represent the value 258 in big-endian format.
    If you have the desired byte order in the old format as expected by TRANSLATE ... NUMBER FORMAT (as an N(4) value), you can proceed as follows:
    TYPE-POOLS: ABAP.
    CLASS cl_abap_char_utilities DEFINITION LOAD.
    DATA:
      in_number_format(4) TYPE N VALUE '0101',
      out_number_format(4) TYPE N VALUE '0000'.
    DATA:
      in_endian TYPE ABAP_ENDIAN,
      out_endian TYPE ABAP_ENDIAN,
      conv TYPE REF TO cl_abap_conv_x2x_ce.
    in_endian  = cl_abap_char_utilities=>number_format_to_endian(
                   in_number_format ).
    out_endian = cl_abap_char_utilities=>number_format_to_endian(
                   out_number_format ).
    conv = cl_abap_conv_x2x_ce=>create(
             in_encoding  = 'UTF-8'
             in_endian    = in_endian
             out_encoding = '1100'
             out_endian   = out_endian
             input        = in_buffer ).
    Notes
    For details refer to the documentation for the individual methods.
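    For your fixed-length problem specifically, a rough sketch (untested; it assumes the target codepage 8400 from your call, the table it_output and file name p_file from your code, and that padding with blanks is acceptable) using the related class CL_ABAP_CONV_OUT_CE could look like this:
    TYPES ty_rec10 TYPE x LENGTH 10.

    DATA: lo_conv  TYPE REF TO cl_abap_conv_out_ce,
          lv_bytes TYPE xstring,
          lv_blank TYPE xstring,
          lv_rec   TYPE ty_rec10,
          lt_bin   TYPE STANDARD TABLE OF ty_rec10,
          lv_size  TYPE i.
    FIELD-SYMBOLS <fs_line> TYPE any.

    lo_conv = cl_abap_conv_out_ce=>create( encoding = '8400' ).
    lo_conv->convert( EXPORTING data = ' ' IMPORTING buffer = lv_blank ). "one blank in the target codepage

    LOOP AT it_output ASSIGNING <fs_line>.
      lo_conv->convert( EXPORTING data = <fs_line> IMPORTING buffer = lv_bytes ).
      WHILE xstrlen( lv_bytes ) < 10.              "pad short (Western European) records
        CONCATENATE lv_bytes lv_blank INTO lv_bytes IN BYTE MODE.
      ENDWHILE.
      lv_rec = lv_bytes(10).                       "cut anything longer than 10 bytes
      APPEND lv_rec TO lt_bin.
    ENDLOOP.

    lv_size = lines( lt_bin ) * 10.

    CALL METHOD cl_gui_frontend_services=>gui_download
      EXPORTING
        filename     = p_file
        filetype     = 'BIN'
        bin_filesize = lv_size
      CHANGING
        data_tab     = lt_bin
      EXCEPTIONS
        OTHERS       = 1.
    Each record is forced to exactly 10 bytes before the download, and the file is written as binary so that GUI_DOWNLOAD does not convert or pad anything itself. If the receiving system also expects a line break after every record, convert and append it inside the loop in the same way.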
    Regards,
    Joy.

  • Why are there Chinese character fonts overwriting my fonts?

    At the bottom of my font list are a bunch of Chinese character fonts. The list seems to grow over time (it's gotten a lot larger lately).  When I select most of these fonts they work and display as standard alphabet characters but I don't use them. How do I delete these? I'm trying to clean up all my fonts and I hate these.
    Note: in the Library files (Mac OS X) these fonts do not show up as Chinese characters; they display with standard font names. So I have no way of knowing the name of the font, and therefore can't find it and delete it.
    Help?

    "I have no way of knowing the name of the font" : Try Illustrator/Preferences/Type and check Show Font Names in English.

  • How can I convert a jstring to PSTR with Chinese characters?

    Hi all,
    I'm not an expert in C++, so please help me fix the following problem.
    I'm using JNI to call the DLL. On the Java side the input parameters are jstrings; on the C++ side the input parameters of PrintDrugReceipt are PSTR.
    The following is the code of my C++:
    #include <windows.h>
    #include <stdio.h>
    #include <jni.h>
    #include "DrugReceiptWrapper.h"
    #include "DrugReceipt.h"
    const char * JNU_GetStringNativeChars(JNIEnv *env, jobject obj, jstring jstr) {
        jbyteArray bytes = 0;
        jthrowable exc;
        jclass cls;
        jmethodID getBytes;
        char *result = 0;
        if ((*env)->EnsureLocalCapacity(env, 2) < 0) {
            return 0; /* out of memory error */
        }
        cls = (*env)->GetObjectClass(env, jstr);   /* the String's class, not the calling object's */
        getBytes = (*env)->GetMethodID(env, cls, "getBytes", "()[B");
        bytes = (*env)->CallObjectMethod(env, jstr, getBytes);
        exc = (*env)->ExceptionOccurred(env);
        if (!exc) {
            jint len = (*env)->GetArrayLength(env, bytes);
            result = (char *)malloc(len + 1);
            if (result == 0) {
                /* JNU_ThrowByName(env, "java/lang/OutOfMemoryError", 0); */
                (*env)->DeleteLocalRef(env, bytes);
                return 0;
            }
            (*env)->GetByteArrayRegion(env, bytes, 0, len, (jbyte *)result);
            result[len] = 0; /* NULL-terminate */
        } else {
            (*env)->DeleteLocalRef(env, exc);
        }
        (*env)->DeleteLocalRef(env, bytes);
        return result;
    }
    JNIEXPORT jlong JNICALL
    Java_TestPrint_PrintDrugReceiptWrapper(
    JNIEnv *env, jobject obj,
    jstring jlprPrinterPort, jstring jlprSourceChiName,
    jstring jlprTargetChiName, jstring jlprPrintData1,
    jstring jlprPrintData2, jstring jlprCaseNo, jstring jlprReceiptNo){
         PSTR lprPrinterPort;
         PSTR lprSourceChiName;
         PSTR lprTargetChiName, lprPrintData1;
    PSTR lprPrintData2 , lprCaseNo, lprReceiptNo;
         printf("before %s", jlprSourceChiName);
         lprPrinterPort = (*env)->GetStringChars(env, jlprPrinterPort , 0);
         lprSourceChiName = (*env)->GetStringChars(env, jlprSourceChiName, 0);     
         lprTargetChiName = (*env)->GetStringChars(env, jlprTargetChiName, 0);
         lprPrintData1 = (*env)->GetStringChars(env, jlprPrintData1 , 0);
         lprPrintData2 = (*env)->GetStringChars(env, jlprPrintData2 , 0);
    lprCaseNo = (*env)->GetStringChars(env, jlprCaseNo , 0);
         lprReceiptNo = (*env)->GetStringChars(env, jlprReceiptNo , 0);
         PrintDrugReceipt(lprPrinterPort, lprSourceChiName, lprTargetChiName, lprPrintData1, lprPrintData2, lprCaseNo, lprReceiptNo);
         (*env)->ReleaseStringChars(env, jlprPrinterPort , lprPrinterPort);
         (*env)->ReleaseStringChars(env, jlprSourceChiName, lprSourceChiName);
         (*env)->ReleaseStringChars(env, jlprTargetChiName, lprTargetChiName);
         (*env)->ReleaseStringChars(env, jlprPrintData1 , lprPrintData1);
         (*env)->ReleaseStringChars(env, jlprPrintData2 , lprPrintData2);
         (*env)->ReleaseStringChars(env, jlprCaseNo , lprCaseNo);
         (*env)->ReleaseStringChars(env, jlprReceiptNo , lprReceiptNo);
         return 1;
    }
    the h file:
    /* DO NOT EDIT THIS FILE - it is machine generated */
    #include <jni.h>
    /* Header for class TestPrint */
    #ifndef _Included_TestPrint
    #define _Included_TestPrint
    #ifdef __cplusplus
    extern "C" {
    #endif
    /*
     * Class:     TestPrint
     * Method:    PrintDrugReceipt
     * Signature: (Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;)J
     */
    JNIEXPORT jlong JNICALL Java_TestPrint_PrintDrugReceiptWrapper
      (JNIEnv *, jobject, jstring, jstring, jstring, jstring, jstring, jstring, jstring);
    #ifdef __cplusplus
    }
    #endif
    #endif
    Note: the jstring input parameters contain Chinese characters. If I print out the jstring it displays garbage characters - why, and how can I fix it? PrintDrugReceipt is expected to accept the Chinese characters and print them to the printer.
    Thank you for your help.
    Matthew

    You can't use char - you have to use wchar. Look at the wide-character handling functions in C/C++.

  • How to put Chinese characters in a jar file with java.util.jar.Manifest?

    I want to develop a simple packaging tool which can modify some properties in the MANIFEST.MF of jar files, but the Manifest class's putValue method can only save English characters correctly. Why?
    And how can I put in Chinese characters?
    the code is:
    Attributes ab = mf.getMainAttributes();
    ab.putValue("agent-Name", agent);

    Attribute values can contain any character, and they will be UTF-8 encoded when written to the manifest, according to the Javadoc.
    What makes you think that this mechanism fails? What do you see instead of the Chinese characters? And what tool/editor/program do you use to see it? I did not try it myself, but according to the Javadoc there should be no problem.

  • Why are Chinese characters sometimes not showing up correctly in a JLabel?

    I have a large program which uses a JLabel to display some Chinese characters in HTML. It was doing fine until the program grew larger and larger; now only HTML fonts in certain sizes show up correctly, and other sizes of the same fonts show up as rectangles. It's not because the fonts are missing - it seems to be related to the complexity of the program. If I change the characters shown as rectangles to a different size (smaller or larger), they may show up correctly again in the large program.
    I have a small test program below to check whether a particular font size shows up correctly. Everything showed up fine in this test, but if I display the same HTML in my large program, some sizes of characters go wrong. I've even printed out the Unicode values and looked at them (6309 4f60 7684 9700 8981 ...); they are definitely in the Chinese character range.
    import java.awt.*;
    import javax.swing.*;
    import java.awt.event.*;
    public class Java_Test extends JPanel
    {
      static int Total=900;
      JComboBox ComboBox_Array[]=new JComboBox[Total];
      Font Times_New_Roman_15_Font=new Font("Times New Roman",0,15);
      static int Small_Chinese_Font_Size=3;
      static String Software_Info_Chinese_Text="<Html><Table Width=100% Border=0 Cellpadding=2 Cellspacing=3><Tr><Td Align=Center><Font Size=6 Color=#3737FF>[ \u4EA7\u54C1\u7BA1\u7406 ] </Font></Td></Tr>"+
                                               "<Tr><Td Align=Center><Font Size=5 Color=green>\u5E2E\u52A9\u4F60\u7BA1\u7406\u4EA7\u54C1, \u5408\u540C, \u53CA\u5176\u5B83\u4FE1\u606F :</Font></Td></Tr>"+
                                               "<Tr><Td>\u8F93\u5165<Font Size="+Small_Chinese_Font_Size+" Color=#2255BB>"+
                                                 "<li>\u8F93\u5165, \u7EF4\u62A4\u548C\u6253\u5370\u8BE6\u7EC6\u4EA7\u54C1\u53CA\u4F9B\u8D27\u5546\u4FE1\u606F<Br>"+
                                                 "<li>\u5730\u5740\u548C\u5907\u6CE8\u4FE1\u606F\u53EF\u7528\u591A\u79CD\u8BED\u8A00\u8F93\u5165<Br>"+
                                               "</Font></Td></Tr>"+
                                               "</Table></Html>";
      JPanel Main_Panel=new JPanel();
      static boolean Exit_When_Window_Closed=false;

      Java_Test(String Test_String)
      {
        for (int i=0;i<Total;i++) ComboBox_Array[i]=new JComboBox();
        JLabel A_Non_English_Label=new JLabel(Test_String);
        A_Non_English_Label.setFont(Times_New_Roman_15_Font);
        A_Non_English_Label.setForeground(Color.BLUE);
        add(A_Non_English_Label);
        setPreferredSize(new Dimension(600,300));
      }

      void Show_Up()
      {
        Main_Panel.add(this);
        Dimension Screen_Size=Toolkit.getDefaultToolkit().getScreenSize();
        final JFrame frame=new JFrame("Java Test");
        frame.add(Main_Panel);
        frame.addWindowListener(new WindowAdapter()
        {
          public void windowClosing(WindowEvent e) { if (Exit_When_Window_Closed) System.exit(0); }
          public void windowDeiconified(WindowEvent e) { Main_Panel.repaint(); }
          public void windowGainedFocus(WindowEvent e) { Main_Panel.repaint(); }
          public void windowOpening(WindowEvent e) { Main_Panel.repaint(); }
          public void windowResized(WindowEvent e) { Main_Panel.repaint(); }
          public void windowStateChanged(WindowEvent e) { Main_Panel.repaint(); }
        });
        frame.pack();
        frame.setBounds((Screen_Size.width-Main_Panel.getWidth())/2,(Screen_Size.height-Main_Panel.getHeight())/2-10,Main_Panel.getWidth(),Main_Panel.getHeight()+50);
        frame.setVisible(true);
        Main_Panel.updateUI();
      }

      static void Out(String message) { System.out.println(message); }

      public static void main(String[] args)
      {
        Exit_When_Window_Closed=true;
        new Java_Test(Software_Info_Chinese_Text).Show_Up();
        final Java_Test demo=new Java_Test(Software_Info_Chinese_Text);
        Dimension Screen_Size=Toolkit.getDefaultToolkit().getScreenSize();
        final JFrame frame=new JFrame("Java Test");
        frame.add(demo);
        frame.addWindowListener(new WindowAdapter()
        {
          public void windowClosing(WindowEvent e) { System.exit(0); }
          public void windowDeiconified(WindowEvent e) { demo.repaint(); }
          public void windowGainedFocus(WindowEvent e) { demo.repaint(); }
          public void windowOpening(WindowEvent e) { demo.repaint(); }
          public void windowResized(WindowEvent e) { demo.repaint(); }
          public void windowStateChanged(WindowEvent e) { demo.repaint(); }
        });
        frame.pack();
        frame.setBounds((Screen_Size.width-demo.getWidth())/2,(Screen_Size.height-demo.getHeight())/2-10,demo.getWidth(),demo.getHeight()+38);
        frame.setVisible(true);
      }
    }
    I called this test program from my large program and it doesn't display correctly; it only displays all sizes of characters correctly if I run the test program by itself.
    I've posted a similar question in Java/Swing but got no answer, so I'm trying it here. The above test program compiles and runs, and it can display all sizes of HTML fonts correctly, which means it's not lacking any fonts. I've even tried the following in my large program:
    dialog.validate();
    dialog.repaint();
    pane.validate();
    pane.repaint();
    pane.updateUI();
    A_Label.validate();
    A_Label.repaint();
    A_Label.updateUI();
    Still doesn't work, does anyone know why and how to fix it ?
    Frank

    I've just made a breakthrough: the large program will work correctly if I comment out: A_Non_English_Label.setFont(Times_New_Roman_15_Font);
    But then the font doesn't look the way I wanted it to be.
    So I changed it to the following :
    ================================================
    Font Courier_New_15_Font=new Font("Courier New",0,15);
    A_Non_English_Label.setFont(Courier_New_15_Font);
    ================================================
    Now it looks not the same, but similar to what I wanted. So my question becomes: why doesn't it work correctly with "Times New Roman" but does with "Courier New" in the large program, when both display correctly in my small test program? Is it because it takes too many resources to load the "Times New Roman" font?
    Frank

  • PDF report in 6i does not support Simplified Chinese characters?

    Hi :
    I generate a PDF-format report with Reports 6i (patch 5), but the Simplified Chinese characters do not display correctly in the PDF file (Acrobat Reader 5). Why?
    Can somebody help me? Thanks in advance!

    I believe this is not available in PDF until 9i - ref <Bug:1738911>
    Regards
    Grant Ronald

  • Output Chinese characters to a CSV file in UNIX

    Hi
    I encountered an ABAP dump whenever I output Chinese characters to a CSV file in UNIX in ECC 6.0. The error shows as:
    "At the conversion of a text from codepage '4102' to codepage '1100':
    - a character was found that cannot be displayed in one of the two
    codepages;
    - or it was detected that this conversion is not supported"
    The program opens the file with the statement OPEN DATASET xxxxx FOR OUTPUT IN TEXT MODE ENCODING NON-UNICODE. The reason for the non-Unicode output is that users would like to open the CSV file directly in Excel; they do not wish to import the text file into Excel. Can the experts please share with me how to overcome this problem?
    Thanks
    Kang Ring

    Maybe you could give the following statement a try and check:
    OPEN DATASET xxxxx FOR OUTPUT IN TEXT MODE ENCODING NON-UNICODE CODEPAGE '4103'.
    Vikranth
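    If the users' Excel version can handle UTF-8, another option (sketch only, untested; the table name it_csv and the file path are illustrative) is to skip the conversion to a non-Unicode codepage altogether and write the file in binary mode as UTF-8 with a byte order mark, which recent Excel versions open directly with the Chinese characters intact:
    CONSTANTS: lc_bom(3)  TYPE x VALUE 'EFBBBF',       "UTF-8 byte order mark
               lc_crlf(2) TYPE x VALUE '0D0A'.         "line break

    DATA: lo_conv  TYPE REF TO cl_abap_conv_out_ce,
          lv_line  TYPE string,
          lv_xline TYPE xstring,
          lv_file  TYPE string VALUE '/tmp/output.csv'. "illustrative path

    lo_conv = cl_abap_conv_out_ce=>create( encoding = 'UTF-8' ).

    OPEN DATASET lv_file FOR OUTPUT IN BINARY MODE.
    TRANSFER lc_bom TO lv_file.
    LOOP AT it_csv INTO lv_line.                        "it_csv: your CSV lines
      lo_conv->convert( EXPORTING data = lv_line IMPORTING buffer = lv_xline ).
      TRANSFER lv_xline TO lv_file.
      TRANSFER lc_crlf TO lv_file.
    ENDLOOP.
    CLOSE DATASET lv_file.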

  • 'FTP_R3_TO_SERVER' Chinese characters display as '#'

    I want to upload a *.txt file which was saved as Unicode.
    I use the following code to upload the file to the FTP server:
    call function 'FTP_R3_TO_SERVER'
      exporting
         handle = p_handle
         fname = p_filename
         character_mode = 'X'
      tables
         text = i_file
      exceptions
         tcpip_error = 1
         command_error = 2
         data_error = 3
        others = 4.
    But after the upload, I find that all Chinese characters display as '#'.
    By the way, I am developing on ECC 6.0.
    How can I get the Chinese characters to display correctly?

    Hi
    Try the following:
    1) Change the GUI settings. For this:
    a) Go to the SAP Logon Pad.
    b) Select your server -> Change Item.
    c) Go to Advanced.
    d) On the next screen the default selection under the section 'Encoding for up- and download' is 'Default - ANSI - for Unicode Systems'.
    e) Please change this to 'Default - UTF8 - for Unicode Systems'.
    f) Click OK and save. Then log out of the system if you were already logged in, and log in again for the settings to take effect.
    Hope it'll help you.
    Regards
    Rishika bawa
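    A further note: the GUI encoding setting above only affects up- and downloads through SAP GUI, while FTP_R3_TO_SERVER runs on the application server. If the character-mode transfer keeps corrupting the Chinese text, an alternative (sketch only, untested) is to convert the lines to UTF-8 yourself and send the file in binary mode, so that no codepage conversion happens during the transfer at all. The BLOB / BLOB_LENGTH parameters used below should be checked against the interface of FTP_R3_TO_SERVER in SE37 on your release:
    TYPES ty_raw TYPE x LENGTH 255.

    CONSTANTS lc_crlf(2) TYPE x VALUE '0D0A'.            "line break

    DATA: lo_conv   TYPE REF TO cl_abap_conv_out_ce,
          lv_all    TYPE xstring,
          lv_line_x TYPE xstring,
          lv_row    TYPE ty_raw,
          lt_blob   TYPE STANDARD TABLE OF ty_raw,
          lv_len    TYPE i,
          lv_off    TYPE i,
          lv_chunk  TYPE i.
    FIELD-SYMBOLS <fs_line> TYPE any.

    lo_conv = cl_abap_conv_out_ce=>create( encoding = 'UTF-8' ).

    LOOP AT i_file ASSIGNING <fs_line>.                  "i_file as in your call
      lo_conv->convert( EXPORTING data = <fs_line> IMPORTING buffer = lv_line_x ).
      CONCATENATE lv_all lv_line_x lc_crlf INTO lv_all IN BYTE MODE.
    ENDLOOP.

    lv_len = xstrlen( lv_all ).
    WHILE lv_off < lv_len.                               "slice into 255-byte rows
      lv_chunk = lv_len - lv_off.
      IF lv_chunk > 255. lv_chunk = 255. ENDIF.
      lv_row = lv_all+lv_off(lv_chunk).
      APPEND lv_row TO lt_blob.
      lv_off = lv_off + lv_chunk.
    ENDWHILE.

    " check BLOB / BLOB_LENGTH against the function's interface in SE37
    CALL FUNCTION 'FTP_R3_TO_SERVER'
      EXPORTING
        handle      = p_handle
        fname       = p_filename
        blob_length = lv_len
      TABLES
        blob        = lt_blob
      EXCEPTIONS
        tcpip_error   = 1
        command_error = 2
        data_error    = 3
        OTHERS        = 4.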

  • Chinese characters cannot be decoded

    hi,
    I would like to implement two JSP pages. The first JSP is just an HTML form, which is used to submit Unicode Chinese data to a target JSP file.
    The target JSP file receives the data and displays it.
    <meta http-equiv="Content-Type" content="text/html; charset=UTF-8"> is added in the first JSP file. As a result, the data will be submitted in UTF-8 format.
    In the target JSP, I used the following code to receive and decode the data:
    <%@ page contentType="text/html; charset=UTF-8" %>
    <%
    String para = request.getParameter("para"); // where para is name of received parameter
    byte[] bytes = para.getBytes();
    para = new String(bytes, "UTF-8");
    out.println("Recieved character: " + para);
    %>
    My Problem:
    After I submit Chinese characters from the first JSP file, only some of them can be displayed on the target JSP. Some of the characters are missing.
    For example, when I input "�@", the target JSP can display the character. On the other hand, when I input "�p", nothing is displayed. But I know that the variable "bytes" stores 3 bytes for each Chinese character. I would like to ask why
    para = new String(bytes, "UTF-8");
    cannot decode properly. Is anything wrong with my coding?
    Thx

    More information can be provided.
    OS: Windows 2000 server
    web server: iPlanet
    P.S. : I have set the Character set to UTF-8 in iPlanet.
    thx.

  • Chinese characters displaying as ???? in Excel while importing data from SQL Server 2000

    Hi everyone,
    I am facing a problem while importing SQL Server 2000 data into an Excel sheet. There are some Chinese characters in the database table.
    After importing the data into Excel, the Chinese characters look like ????????. In SQL Server these Chinese characters look like square boxes.
    I am using UTF-8.
    This is my last hope.
    vijay k singh

    I was having the same problem with AIRSQL; I couldn't find how to re-encode my CSV file.
    So I re-saved the CSV as XLS, chose the UTF-8 option, and then just changed the extension back to CSV.
    I was always wondering why I had an XLS file in my source code - it was just left there from the previous re-encoding, I guess.
    Hope it helps.

  • Display Chinese characters in iText

    Hi all, I need to display Chinese characters in my PDF file.
    I have iTextAsian.jar in my libraries, but the Chinese characters do not show up.
    nested1.addCell(new Phrase(chinessname));

    I downloaded freetype-2.1.10
    and ran ./configure --enable-static, but I get
    ./configure: make: not found
    How do I continue, in order to create my own font?
    By the way, I downloaded the new extrajars-2.1.zip; this new jar supports Chinese characters. But my first font is not working while the second font works fine, and I don't know why.
    // first one - not working
    BaseFont bfChinese = BaseFont.createFont("STSong-Light", "UniGB-UCS2-H", BaseFont.NOT_EMBEDDED);
    // second one - works
    BaseFont bfChinese = BaseFont.createFont("MSungStd-Light", "UniCNS-UCS2-H", BaseFont.NOT_EMBEDDED);
    Thank you!

  • How can I display Chinese characters in ASP?

    ---windows2003
    ---iis6.0
    ---oracle92010
    ---oo4o
    The Oracle character set is WE8MSWIN1252.
    SQL*Plus displays Chinese characters normally.
    When I export the Oracle table into Microsoft Excel, Excel can display the Chinese characters normally.
    But when I write code in ASP, IE cannot display the Chinese characters.
    Who can tell me why?

    Hi,
    here's a link to the SQL Developer forum: SQL Developer
    Frank

  • Send PO mail with PDF file in which Chinese characters don't display

    I send a PO mail with a PDF file in which the Chinese characters don't display.
    I am using RSTXPDFT4 on Unicode ECC 6.0.
    On some computers Adobe Reader can read the file, but on other computers it cannot - just a blank page.
    Thanks.

    Hi
    I worked for one Chinese client where we had to print Chinese & English (bilingual). You need to have a driver program which can identify both scripts. You are right (Unicode).
    Please check the driver program TWPDF (PDF converter Chinese) in the SPAD settings.
    An SAP note is available. I will check and update you.
