Character set supported by POST and java.util.zip

Hi friends,
In my application I send a file from the client using the POST method and receive it on the server with a servlet. I want the file to be compressed with java.util.zip on the client and uncompressed on the server, but I have been unsuccessful; the compressed file I post and the file I retrieve are not the same. Is the problem a difference in the character sets supported by HTTP and by zip files, or something else?
Please help me.
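A minimal sketch of the usual approach, assuming an HttpURLConnection client and a plain servlet on the server; the URL, file names and class names are placeholders, not taken from the question. The most common cause of this symptom is not a disagreement between HTTP and zip character sets but a character-set conversion applied to binary data: compressed output must be read and written through raw byte streams only, never through Readers/Writers or String conversion.

// --- CompressedUpload.java (client side: POST a GZIP-compressed file as raw bytes) ---
import java.io.*;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.zip.GZIPOutputStream;

public class CompressedUpload {
    public static void main(String[] args) throws IOException {
        URL url = new URL("http://localhost:8080/app/upload");   // placeholder URL
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/octet-stream");
        try (InputStream in = new FileInputStream("data.txt");                    // placeholder file
             OutputStream out = new GZIPOutputStream(conn.getOutputStream())) {
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);            // bytes only, no character conversion
            }
        }
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}

// --- UploadServlet.java (server side: decompress the request body and store the bytes) ---
import java.io.*;
import java.util.zip.GZIPInputStream;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class UploadServlet extends HttpServlet {
    protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        try (InputStream in = new GZIPInputStream(req.getInputStream());
             OutputStream out = new FileOutputStream("received.txt")) {           // placeholder path
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        resp.setStatus(HttpServletResponse.SC_OK);
    }
}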

Just ran into this myself yesterday.
There is a Metalink Note on it.
http://metalink.oracle.com/metalink/plsql/ml2_documents.showFrameDocument?p_database_id=NOT&p_id=190281.1
Problem Description
You are using OC4J and trying to connect to a database using JDBC OCI and
are getting:
"java.sql.SQLException: Character Set Not Supported !!: DBConversion"
Solution Description
Replace <OC4J_HOME>\jdbc\lib\classes12dms.jar with
<ORACLE_HOME>\jdbc\lib\classes12.jar and rename it to classes12dms.jar.
Explanation
It seems there is a mismatch of classes12.zip supplied with OC4J 9.0.2
and the Oracle9i client libraries ocijdbc8.dll or ocijdbc8.so.
OC4J 9.0.2 does not use jdbc\lib\classes12.jar; instead it uses
jdbc\lib\classes12dms.jar. So, in order to use the 9.0.1 client with OC4J, you
will need to make a copy of classes12.jar and rename it to classes12dms.jar.
References
[NOTE:108876.1] Creating Connection gives "No ocijdbc8 in java.library.path"
[NOTE:174808.1] JDev9i and OCI Connections
I copied and renamed the jar (classes12.jar) as they stated.
Note: it should go in the directory you set in JDev.conf; mine is
AddNativeCodePath D:\OraNT\9iDS\bin
Didn't try the other reply's suggestion of setting an environment variable.

Similar Messages

  • Character Set Support

    Hi,
    I need to open my application to international users and support multiple languages, mostly confined to Europe.
    I am using Forms 4.5 with Oracle 8.1.6 on HP Unix.
    My questions are
    Is the UTF8 character set supported by Forms 4.5? I know Forms 6i supports it.
    Has anyone tried to achieve this? Can someone share their experience here?
    Thanks in advance
    Sanjay

    Anyone...?

  • BIG5 and HKSCS Character Set Support

    Hi,
    We're experiencing problems inserting a string containing both BIG5 and HKSCS characters into a 7.3.4 Oracle DB using JDBC. The underlying character set used by the DB is ZHT16BIG5 (this cannot be changed). The characters can be inserted correctly if we use SQL*Plus/Worksheet.
    Note that BIG5 characters alone can be inserted correctly; the problem occurs when we include HKSCS characters in the statement.
    We have tried a number of approaches but failed to convert the data properly.
    We tried converting the data using ByteToCharConverter.getConverter("Big5"), but this cannot handle the HKSCS characters properly.
    We even tried using the CharacterSet.ZHT16BIG5_CHARSET provided by the NLS character set classes, but it cannot convert all HKSCS characters correctly.
    Any ideas on how to solve this problem? Or is it because the HKSCS character set is NOT supported by the JDBC driver?
    Below is a sample text containing both BIG5 and HKSCS characters:
    (sample BIG5/HKSCS text omitted; the characters were garbled in transcription)
    Any help/suggestion is most welcome.
    Thanks,
    Cis

    I've got exactly the same problem as you.
    (The Oracle version I am using is 8.1.7.)
    Can anyone help?
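    A small sketch of how to check the client-side conversion from Java, assuming a JDK whose extended charsets include "Big5-HKSCS" (availability depends on the JDK and its installed charsets); the sun.io.ByteToCharConverter API mentioned above is an internal class. Even if this round trip works, the data still has to survive the ZHT16BIG5 database character set and the JDBC driver's NLS support, so treat this only as a way to rule out the client-side conversion:

    import java.nio.charset.Charset;

    public class HkscsCheck {
        public static void main(String[] args) throws Exception {
            // Big5-HKSCS is an extended charset; it may or may not be present in a given JVM.
            if (!Charset.isSupported("Big5-HKSCS")) {
                System.out.println("Big5-HKSCS charset not available in this JVM");
                return;
            }
            String text = "...";                          // placeholder for the BIG5/HKSCS sample text
            byte[] big5hk = text.getBytes("Big5-HKSCS");  // encode on the client
            String back = new String(big5hk, "Big5-HKSCS");
            // If the round trip loses characters, the client-side conversion itself is the problem.
            System.out.println("round trip ok: " + text.equals(back));
        }
    }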

  • 9iLite and multibyte character set support

    Does 9iLite support a character set that will allow for accented characters?
    for example: i

    "NLS Character Integrity Issues for Consolidator
    When Mobile Sync synchronizes with an Oracle database which has a
    multibyte character set other than UTF8, the character integrity issue
    occurs. Mobile Sync retrieves data from the server database through Oracle
    8.1.7 OCI JDBC Driver for Oracle9iAS version 1.0.2.2, and 9i for Oracle9iAS
    version 2.0. Character sets are converted from database character sets to
    UTF8 by Oracle Servers NLS functions. In the code conversion, some
    multibyte characters are garbled because of the difference of the character
    mapping. This is not a bug of Mobile Sync.
    For more Information, see "Character Integrity Issues in NLS Environment"
    technical paper on Oracle Technology Network (technet.oracle.com)
    Java/SQLJ & JDBC section in Technologies category."
    from the Readme file with the media (read the manual I guess)

  • Conversions between character sets when using 'exp' and 'expdp' utilities

    When I export with the exp utility and the NLS_LANG environment
    variable is not set, the export is done in the US7ASCII character set;
    when the server uses some other character set, e.g. EE8ISO8859P2,
    some national characters can be lost in that conversion.
    But when I use the expdp utility without NLS_LANG set,
    the log file does not mention that the export is done in a character set
    other than the server's. Does that mean it uses the server's character set?

  • Multi character set support

    We plan to implement Interconnect to propagate new customer records created in many databases (at distributed locations) into a central system.
    All the databases are running Oracle Database 9.x, and we plan to use DB adapters to access each of them.
    Three of these databases store data in double-byte character sets (Traditional Chinese, Simplified Chinese, Japanese); the others use standard US/American English.
    My questions are:
    1) Does my central system's database have to be set to use a particular character set too?
    2) Do I have to configure Interconnect to ensure that data from the non-English systems is retrieved and populated correctly into my central database?
    Thanks for any response.

    Hi,
    We are also facing a similar problem. We are using IC version 9.0.2, with an AQ adapter at the source and a DB adapter at the destination. The character set of the DB application database is UTF-8, whereas the AQ application database is AR8MSWIN1256. We are using a request/reply scenario, but the reply message that gets enqueued by the AQ adapter contains '??????' instead of the Arabic data.
    We have specified the encoding type in both adapter.ini files, but it made no difference.
    We also followed the steps stated in Metalink bug 2375248, but nothing works. Can somebody help us?
    -Vimala
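    For what it's worth, the '??????' symptom usually means the text was pushed through a character set that cannot represent it; unmappable characters are silently replaced with '?'. A minimal Java illustration of the mechanism (the Arabic sample string and the single-byte charset are chosen only for demonstration, not taken from this setup):

    public class ReplacementDemo {
        public static void main(String[] args) throws Exception {
            String arabic = "\u0645\u0631\u062d\u0628\u0627";        // "marhaba"
            // Encoding through a charset with no Arabic letters (ISO-8859-1 here)
            // silently substitutes '?' for every unmappable character.
            byte[] bytes = arabic.getBytes("ISO-8859-1");
            System.out.println(new String(bytes, "ISO-8859-1"));      // prints ?????
        }
    }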

  • Conversions between character sets when using exp and imp utilities

    I use the EE8ISO8859P2 character set on my server.
    When exporting the database with NLS_LANG not set,
    the conversion should be done between the
    EE8ISO8859P2 and US7ASCII character sets, so
    characters not present in US7ASCII should not be
    converted successfully.
    But when I import such a dump, all characters not
    present in US7ASCII are imported into the database.
    I thought some characters should be lost when
    doing such conversions; can someone tell me why that is not so?

    Not exactly. If the import is done into a database with the same character set, then it does not matter how it was exported. Conversion (and possible corruption) may happen if the destination DB has a different character set. See this example:
    [ora102 work db102]$ echo $NLS_LANG
    AMERICAN_AMERICA.WE8ISO8859P15
    [ora102 work db102]$ sqlplus test/test
    SQL*Plus: Release 10.2.0.1.0 - Production on Tue Jul 25 14:47:01 2006
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    TEST@db102 SQL> create table test(col1 varchar2(1));
    Table created.
    TEST@db102 SQL> insert into test values(chr(166));
    1 row created.
    TEST@db102 SQL> select * from test;
    C
    ¦
    TEST@db102 SQL> exit
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    [ora102 work db102]$ export NLS_LANG=AMERICAN_AMERICA.EE8ISO8859P2
    [ora102 work db102]$ sqlplus test/test
    SQL*Plus: Release 10.2.0.1.0 - Production on Tue Jul 25 14:47:55 2006
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    TEST@db102 SQL> select col1, dump(col1) from test;
    C
    DUMP(COL1)
    ©
    Typ=1 Len=1: 166
    TEST@db102 SQL> exit
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    [ora102 work db102]$ echo $NLS_LANG
    AMERICAN_AMERICA.EE8ISO8859P2
    [ora102 work db102]$ exp test/test file=test.dmp tables=test
    Export: Release 10.2.0.1.0 - Production on Tue Jul 25 14:48:47 2006
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Export done in EE8ISO8859P2 character set and AL16UTF16 NCHAR character set
    server uses WE8ISO8859P15 character set (possible charset conversion)
    About to export specified tables via Conventional Path ...
    . . exporting table                           TEST          1 rows exported
    Export terminated successfully without warnings.
    [ora102 work db102]$ sqlplus test/test
    SQL*Plus: Release 10.2.0.1.0 - Production on Tue Jul 25 14:48:56 2006
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    TEST@db102 SQL> drop table test purge;
    Table dropped.
    TEST@db102 SQL> exit
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    [ora102 work db102]$ imp test/test file=test.dmp
    Import: Release 10.2.0.1.0 - Production on Tue Jul 25 14:49:15 2006
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Export file created by EXPORT:V10.02.01 via conventional path
    import done in EE8ISO8859P2 character set and AL16UTF16 NCHAR character set
    import server uses WE8ISO8859P15 character set (possible charset conversion)
    . importing TEST's objects into TEST
    . importing TEST's objects into TEST
    . . importing table                         "TEST"          1 rows imported
    Import terminated successfully without warnings.
    [ora102 work db102]$ export NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P15
    [ora102 work db102]$ sqlplus test/test
    SQL*Plus: Release 10.2.0.1.0 - Production on Tue Jul 25 14:49:34 2006
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    TEST@db102 SQL> select col1, dump(col1) from test;
    C
    DUMP(COL1)
    ¦
    Typ=1 Len=1: 166
    TEST@db102 SQL>

  • Character set problems with xmltype and SQLdeveloper

    Hi all,
    Using SQL Developer v3.2.20.09 on Windows XP and connecting to Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production.
    SQL Developer encoding is set to Cp1252 (my understanding is that this has nothing to do with NLS_LANG and is used for the encoding of files in SQL Developer).
    SQL Developer NLS Language params set to Canada English.
    Database NLS parameters are:
    "PARAMETER"    "VALUE"
    "NLS_LANGUAGE"    "AMERICAN"
    "NLS_TERRITORY"    "AMERICA"
    "NLS_CURRENCY"    "$"
    "NLS_ISO_CURRENCY"    "AMERICA"
    "NLS_NUMERIC_CHARACTERS"    ".,"
    "NLS_CHARACTERSET"    "AL32UTF8"
    "NLS_CALENDAR"    "GREGORIAN"
    "NLS_DATE_FORMAT"    "DD-MON-RR"
    "NLS_DATE_LANGUAGE"    "AMERICAN"
    "NLS_SORT"    "BINARY"
    "NLS_TIME_FORMAT"    "HH.MI.SSXFF AM"
    "NLS_TIMESTAMP_FORMAT"    "DD-MON-RR HH.MI.SSXFF AM"
    "NLS_TIME_TZ_FORMAT"    "HH.MI.SSXFF AM TZR"
    "NLS_TIMESTAMP_TZ_FORMAT"    "DD-MON-RR HH.MI.SSXFF AM TZR"
    "NLS_DUAL_CURRENCY"    "$"
    "NLS_COMP"    "BINARY"
    "NLS_LENGTH_SEMANTICS"    "BYTE"
    "NLS_NCHAR_CONV_EXCP"    "FALSE"
    "NLS_NCHAR_CHARACTERSET"    "AL16UTF16"
    "NLS_RDBMS_VERSION"    "11.2.0.2.0"
    I didn't have an NLS_LANG user environment variable set, so I set it to "AMERICAN_AMERICA.WE8MSWIN1252".
    The problem I'm having is editing my XMLType data... I have some French accented characters in there, but when saved through SQL Developer, they come out as:
    Étape par étape -> Étape par étape
    I do have an encoding declared on the XML document in the XMLType column, set to UTF-8... but from what I read, this has no effect when reading or writing in 11g...
    I ran a dump(col, 1016) on my table and got the following:
    Typ=58 Len=2037: 0,0,0,1,10,4,8c,a8,0,0,0,1,10,8c,ff,b8,0,0,0,1,11,2,2,f8,0,0,0,1,10,e1,a9,20,3b,9a,ca,0,0,0,f,a0,0,0,f,a0,0,1,0,4,0,0,0,1,9,84,d5,24,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,10,f,93,48,0,0,0,1,10,88,0,8,40,b3,8f,0,0,0,1,41,0,0,0,0,0,0,0,0,0,0,0,1,10,88,ab,b8,0,0,1,40,0,0,0,0,0,0,0,1,10,88,ab,d8,0,0,0,1,10,f,93,48,0,0,40,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,7,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,40,68,6b,78,73,2d,68,65,61,70,2d,63,0,0,0,0,0,0,7f,ff,7f,ff,80,0,80,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,18,0,0,0,1,10,8d,0,b8,0,0,0,1,10,8d,0,b8,0,0,0,0,0,0,0,28,0,0,0,1,10,8d,0,d0,0,0,0,1,10,8d,0,d0,0,0,0,0,0,0,0,38,0,0,0,1,10,8d,0,e8,0,0,0,1,10,8d,0,e8,0,0,0,0,0,0,0,58,0,0,0,1,10,8d,1,0,0,0,0,1,10,8d,1,0,0,0,0,0,0,0,1,18,0,0,0,1,10,8d,1,18,0,0,0,1,10,8d,1,18,0,0,0,0,0,0,4,18,0,0,0,1,10,8d,1,30,0,0,0,1,10,8d,1,30,0,0,0,0,0,0,10,18,0,0,0,1,10,8d,1,48,0,0,0,1,10,8d,1,48,d0,b3,8f,0,0,0,fe,a9,0,0,0,1,10,8d,0,18,0,0,0,1,10,f,94,e0,0,0,0,1,10,ff,aa,98,0,0,0,1,10,8d,8,98,d0,b3,8f,0,0,0,7,1,0,0,0,0,0,0,0,0,0,0,0,1,10,8d,8,b8,0,0,0,1,10,f,84,48,0,0,0,1,10,8d,8,c0,0,b3,8f,0,0,0,0,31,0,0,0,0,0,0,0,0,0,0,0,1,8,fc,ac,8,d0,b3,8f,0,0,0,fe,41,0,0,0,1,10,8d,1,58,0,0,0,1,10,f,93,48,0,0,0,1,10,f,83,a0,0,0,0,1,11,35,a9,0,d0,b3,8f,0,0,0,fe,19,0,0,0,0,0,0,0,0,0,0,0,1,10,f,84,48,0,0,0,1,10,f,84,48,50,0,4,1,0,0,0,18,0,0,0,58,0,4,1,0,0,0,30,0,0,0,60,0,4,1,0,0,0,48,0,0,0,68,0,4,1,0,0,0,60,0,0,0,70,0,4,1,0,0,0,78,0,0,0,78,0,1,1,0,0,0,90,0,0,0,90,0,2,2,0,0,0,a8,0,0,0,a8,0,17,1,1,3,69,2,0,0,0,d0,0,0,0,c0,0,11,1,1,3,69,2,0,0,0,f8,0,0,0,d8,0,20,1,1,3,69,2,0,0,1,20,0,0,1,0,0,14,1,1,3,69,2,0,0,1,48,0,0,1,18,0,40,1,1,3,69,2,0,0,1,70,0,0,1,60,0,28,1,1,3,69,2,0,0,1,98,0,0,1,90,0,3f,1,1,3,69,2,0,0,1,c0,0,0,1,d0,0,1a,1,1,3,69,2,0,0,1,e8,0,0,1,f0,0,28,1,1,3,69,2,0,0,2,10,0,0,2,20,0,2b,1,1,3,69,2,0,0,2,38,0,0,2,50,0,33,1,1,c0,b3,8f,0,0,0,10,81,0,0,0,1,10,8d,1,58,0,0,0,1,10,f,93,48,0,0,0,1,10,8d,4d,30,0,0,0,1,10,8d,13,88,d0,b3,8f,0,0,0,10,59,0,0,0,0,0,0,0,0,0,0,0,1,10,8d,13,a8,0,0,0,1,11,0,98,88,0,0,0,1,10,8d,13,c8,0,b3,8f,0,0,0,0,41,0,0,0,0,0,0,0,0,0,0,0,1,8,fc,ac,8,0,0,0,0,0,0,0,0,0,0,0,1,10,8d,3,c0,80,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,b3,8f,0,0,0,0,29,0,0,0,1,10,8d,3,40,0,0,0,1,8,fc,ac,50,0,0,0,b7,0,2,43,31,0,0,0,0,0,0,0,0,0,b3,8f,0,0,0,0,99,0,0,0,1,10,8d,3,80,0,0,0,1,9,10,e7,50,1,0,0,0,0,0,0,0,0,0,0,b7,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,7,0,0,0,7,c7,bf,b8,c0,b3,8f,0,0,0,18,79,0,0,0,1,10,8d,1,58,0,0,0,1,10,f,93,48,0,0,0,0,0,0,0,0,0,0,0,1,10,8d,1c,80,d0,b3,8f,0,0,0,18,51,0,0,0,0,0,0,0,0,0,0,0,1,10,8d,1c,a0,0,0,0,1,10,f,84,48,0,0,0,1,10,8d,1c,a8,10,b3,8f,0,0,0,18,29,0,0,0,0,0,0,0,0,0,0,0,1,8,fb,fe,44,0,0,0,0,0,0,0,0,0,0,0,1,10,8d,1c,68,0,0,0,0,0,0,0,0,0,30,0,30,0,0,0,4,0,0,0,1,10,8d,24,80,0,0,0,1,10,8d,4,88,0,0,0,1,10,8d,4,58,0,0,0,1,10,8d,4,a0,0,0,0,1,10,8d,1c,68,0,0,0,1,10,88,4,f0,0,0,0,1,10,8d,49,38,0,0,0,1,20,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,10,8d,40,88,0,0,0,1,9,10,e6,5c,0,0,0,1,10,88,6,0,0,0,0,1,10,8d,38,a0,0,0,0,2,10,8d,0,2,0,0,0,0,0,0,0,0,0,0,0,1,10,8d,40,a8,0,b3,8f,0,0,0,0,69,0,0,0,1,10,88,14,c8,0,0,0,1,10,8d,38,a0,0,0,0,2,20,0,0,2,0,0,0,0,0,0,0,0,0,0,0,1,10,8d,40,c8,0,0,0,0,0,0,0,0,0,0,0,1,10,88,14,30,0,0,0,1,10,8d,38,a0,0,0,0,2,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,1,10,8d,40,e8,0,0,0,0,0,0,0,0,0,b3,8f,0,0,0,0,59,0,0,0,1,10,8d,4,f8,0,0,0,1,8,fd,b9,e0,0,0,0,1,11,0,83,a8,0,0,0,1,11,0,83,a8,0,0,0,1,11,0,83,c0,0,0,0,1,11,0,7b,98,0,0,0,1,10,8d,26,c0,0,0,0,1,10,8d,26,c0,0,0,0,1,8,fd,b9,e0,0
,0,8,0,ac,64,61,74,0,b3,8f,0,0,0,4,29,0,0,0,1,10,8d,5,60,0,0,0,1,8,fb,fe,44,0,0,0,0,0,0,0,0,0,0,0,1,10,8d,9,e0,0,0,0,0,0,0,0,0,0,40,0,40,0,0,0,0,0,0,0,1,10,8d,9,f8,0,0,0,1,10,8d,6,0,0,0,0,1,10,8d,5,d0,0,0,0,1,10,8d,6,18,0,0,0,1,10,8d,9,e0,48,45,4d,41,5f,50,52,45,53,45,4e,54,22,23,66,65,35,64,36,61,61,30,35,64,65,62,64,64,65,61,20,23,33,0,0,0,0,0,0,0,2,0,0,4,0,0,0,3,0,0,0,10,44,2,f,a0,1,0,f,d0,41,0,30,d,0,18,d,0,10,0,0,0,0,0,0,3,0,0,0,2,3,80
    I find it odd that it doesn't tell me what character set is used. If I run a dump(col, 1016) on a VARCHAR2 column in the same table, I get:
    Typ=1 Len=14 CharacterSet=AL32UTF8: 53,74,65,c3,a9,70,68,61,6e,65,c3,a9,c3,a0
    Any ideas? This is frustrating... the XML data gets corrupted every time it's edited in SQL Developer.
    Thanks for the help

    Thanks for the reply...
    create table sample_charset(theString varchar2(2 char), theXml xmltype);
    insert into sample_charset values('éà',xmltype('<xml>éà</xml>'));
    commit;
    select * from sample_charset;
    "THESTRING"    "THEXML"
    "éà"    <xml>éà </xml>
    select dump(theString, 1016) as theStringDump, dump(theXml,1016) as theXmlDump from sample_charset;
    "THESTRINGDUMP"    "THEXMLDUMP"
    "Typ=1 Len=4 CharacterSet=AL32UTF8: c3,a9,c3,a0"   
    "Typ=58 Len=120: 0,0,0,1,10,4,8c,a8,0,0,0,1,10,e1,a8,60,0,0,0,1,10,dc,36,f8,0,0,0,1,10,95,bf,28,3b,9a,ca,0,0,0,f,a0,0,0,f,a0,0,1,0,4,0,0,0,1,9,84,d5,24,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,10,f,91,28,0,0,0,1,10,db,a8,b8"

  • SQLException: Non supported character set: oracle-character-set-178

    Hello,
    I run a j2ee application on the Sun J2EE Application server.
    I migrated to Oracle 9.2.0.3.
    I have an EJB making a connection with the OCI driver,
    and in it I manipulate XMLType:
    PreparedStatement pst = cnx.prepareStatement(sql);
    XMLType xt = XMLType.createXML(cnx, "<ENTETE></ENTETE>");
    pst.setObject(1, xt);
    I've got an exception on the setObject method :
    java.sql.SQLException: Non supported character set: oracle-character-set-178
    at oracle.gss.util.NLSError.throwSQLException(NLSError.java:46)
    at oracle.sql.CharacterSetUnknown.failCharsetUnknown(CharacterSetFactoryThin.java:171)
    at oracle.sql.CharacterSetUnknown.convert(CharacterSetFactoryThin.java:135)
    at oracle.xdb.XMLType.getBytesString(XMLType.java:1687)
    at oracle.xdb.XMLType.getBytesValue(XMLType.java:1623)
    at oracle.xdb.XMLType.toDatum(XMLType.java:431)
    at oracle.jdbc.driver.OraclePreparedStatement.setORAData(OraclePreparedStatement.java:2745)
    at oracle.jdbc.driver.OraclePreparedStatement.setObject(OraclePreparedStatement.java:3208)
    I have nls_charset12.zip, ojdbc14.jar, xdb.jar and xmlparserv2.jar in the J2EE_CLASSPATH of my server.
    Does anyone have an idea?
    Thanks , Edwige

    (*bump*)
    Almost the exact same problem here...
    "conn" is an object of type OracleConnection
    "cst" is an object of type "CallableStatement"
    "xml" is a String holding a well-formed XML document (say, "<foo><bar>value</bar></foo>")
    XMLType xt = XMLType.createXML(conn, xml);
    cst.setObject(1, xt); // BOOM! Non supported character set: oracle-character-set-178
    Other relevant details:
    Using the 9.2.0.3 JDBC thin client (ojdbc14.jar)
    Using the newest nls_charset12.zip (in the 9.2.0.3 download section under the "1.2/1.3" heading).
    JDK1.4.1_03 (04?)
    Oracle itself is 9.2.0.3
    database charset is 'WE8MSWIN1252' (yeah, I fought with the DBA and lobbied for UTF-8, but obviously he won that battle... sigh...)
    Also relevant... prior to a few days ago, it worked. Then, the DBA did an upgrade and the charset errors began. It's hard to get a straight answer from him because his English isn't very good, but my guess is that he updated it from 9.2.0.1 to 9.2.0.3.
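    A workaround that sometimes sidesteps this error is to bind the XML document as an ordinary String and let the database construct the XMLType, so the JDBC client never has to run the document through its own Oracle character-set classes. A hedged sketch, assuming a table xml_tab with an XMLType column doc (both placeholder names) and an open connection:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class XmlInsertWorkaround {
        // Build the XMLType on the server side so the JDBC client never needs
        // the Oracle character-set conversion classes for the document itself.
        static void insertXml(Connection cnx, String xml) throws SQLException {
            PreparedStatement pst = cnx.prepareStatement(
                "insert into xml_tab (doc) values (XMLType(?))");   // xml_tab/doc are placeholder names
            try {
                pst.setString(1, xml);
                pst.executeUpdate();
            } finally {
                pst.close();
            }
        }
    }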

  • Non supported character set: oracle-character-set-41

    Hi everyone,
    I have an issue while passing a VARRAY from my Java class to a stored procedure.
    TYPE ERR_DATA_ARRAY AS VARRAY(100) OF VARCHAR2(1000);
    In my code I have used Oracle's ArrayDescriptor and ARRAY classes.
    When I run my program in Eclipse I get the following exception:
    java.sql.SQLException: Non supported character set: oracle-character-set-41
    at oracle.gss.util.NLSError.throwSQLException(NLSError.java:46)
    at oracle.sql.CharacterSetUnknown.failCharsetUnknown(CharacterSetFactoryThin.java:171)
    at oracle.sql.CharacterSetUnknown.convert(CharacterSetFactoryThin.java:135)
    at oracle.sql.CHAR.<init>(CHAR.java:159)
    at oracle.sql.CHAR.<init>(CHAR.java:183)
    at oracle.jdbc.oracore.OracleTypeCHAR.toDatum(OracleTypeCHAR.java:161)
    at oracle.jdbc.oracore.OracleType.toDatumArray(OracleType.java:166)
    at oracle.jdbc.oracore.OracleTypeCHAR.toDatumArray(OracleTypeCHAR.java:208)
    at oracle.sql.ArrayDescriptor.toOracleArray(ArrayDescriptor.java:1517)
    at oracle.sql.ARRAY.<init>(ARRAY.java:117)
    at com.niit.slims.common.ejb.Array_test.test(Array_test.java:52)
    at com.niit.slims.common.ejb.Array_test.main(Array_test.java:20)
    I have added ojdbc14.jar, and my working JDK is JDK 1.4.
    Please help.
    Edited by: user10569173 on Dec 8, 2011 3:58 AM

    I am not an expert here,
    but it seems that since you are using ojdbc14.jar you may actually need the
    nls_charset12.jar
    instead of the one I suggested previously.
    Regards
    Peter
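    For reference, a sketch of the usual pattern for binding a VARRAY of VARCHAR2 with the oracle.sql classes; the connection details and procedure name are placeholders, and ERR_DATA_ARRAY is the type from the question. The "Non supported character set" error itself is normally resolved only by putting the driver's globalization classes on the classpath (nls_charset12.zip/.jar for the older drivers, orai18n.jar for the 10g and later drivers), so this sketch assumes that jar is already there:

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import oracle.sql.ARRAY;
    import oracle.sql.ArrayDescriptor;

    public class VarrayBindSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details.
            Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/orcl", "scott", "tiger");

            String[] messages = { "first error", "second error" };

            // Describe the SQL VARRAY type and wrap the Java array in an oracle.sql.ARRAY.
            ArrayDescriptor desc = ArrayDescriptor.createDescriptor("ERR_DATA_ARRAY", conn);
            ARRAY oraArray = new ARRAY(desc, conn, messages);

            // Bind it to the stored procedure parameter (procedure name is a placeholder).
            CallableStatement cs = conn.prepareCall("{ call log_errors(?) }");
            cs.setArray(1, oraArray);
            cs.execute();
            cs.close();
            conn.close();
        }
    }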

  • java.sql.SQLException: Non supported character set: oracle-character-set-178

    Hi,
    I am trying to execute an Oracle procedure from JDBC. The procedure accepts a nested table as an input parameter; the definition of the nested table is given below.
    Database: Oracle 10g
    Application server: JBoss 4.2.1
    I get the following exception:
    java.sql.SQLException: Non supported character set: oracle-character-set-178
    at oracle.gss.util.NLSError.throwSQLException(NLSError.java:46)
    I. JDBC code
    Session s = HibernateUtil.getSession();
    Transaction tr = s.beginTransaction();
    con=s.connection();
    oraConn = (OracleConnection) con.getMetaData().getConnection();
    TableObject obj=new TableObject();
    obj.setId(new Integer(123)); // tested OK, storing in DB
    obj.setDescr("test"); // this line throws the error
    obj.setCre_user(new Integer(456));
    obj.setUpd_user(new Integer(789));
    obj.setXfr_flag("Y");
    ArrayList al=new ArrayList();
    al.add(obj);
    Object[] objAray = al.toArray();
    ArrayDescriptor arrayDescriptor =ArrayDescriptor.createDescriptor("T_TEST_SYN", oraConn);
    ARRAY oracleArray = new ARRAY(arrayDescriptor, oraConn, objAray);
    cs = (OracleCallableStatement)oraConn.prepareCall("call PKG_OBJ_TEST.accept_ui_input(?) ");
    cs.setArray(1, oracleArray);
    cs.execute();
    tr.commit();
    public class TableObject implements SQLData {
        private String sql_type = "T_OBJ_TEST";
        private int id;
        private String descr;
        //private Date cre_date;
        private int cre_user;
        //private Date upd_date;
        private int upd_user;
        private String xfr_flag;

        public TableObject() {
        }

        public TableObject(int id, String descr, int cre_user, int upd_user, String xfr_flag) {
            // this.sql_type = sql_type;
            this.id = id;
            this.descr = descr;
            // this.cre_date = cre_date;
            this.cre_user = cre_user;
            // this.upd_date = upd_date;
            this.upd_user = upd_user;
            this.xfr_flag = xfr_flag;
        }

        public String getSQLTypeName() throws SQLException {
            return "T_OBJ_TEST";
        }

        public void readSQL(SQLInput stream, String typeName) throws SQLException {
            // sql_type = typeName;
            id = stream.readInt();
            descr = stream.readString();
            // cre_date = stream.readDate();
            cre_user = stream.readInt();
            // upd_date = stream.readDate();
            upd_user = stream.readInt();
            xfr_flag = stream.readString();
        }

        public void writeSQL(SQLOutput stream) throws SQLException {
            try {
                stream.writeInt(this.id);
                System.out.println("Iddddd");
                stream.writeString(this.descr);
                System.out.println("Desccccccccccccccc" + ":" + descr);
                // stream.writeDate(cre_date);
                stream.writeInt(this.cre_user);
                System.out.println("userrrrrrrrrrrr");
                // stream.writeDate(upd_date);
                stream.writeInt(this.upd_user);
                System.out.println("upd uiserrrrrrrrrrr");
                stream.writeString(this.xfr_flag);
                System.out.println("flagggggggggggggggggggg" + xfr_flag);
            } catch (SQLException se) {
                System.out.println("Table object sql exception");
                se.printStackTrace();
            } catch (Exception e) {
                System.out.println("Table object exception");
            }
        }

        /** @return the id */
        public int getId() {
            return id;
        }

        /** @param obj the id to set */
        public void setId(Object obj) {
            Integer iobj = (Integer) obj;
            this.id = iobj.intValue();
        }

        /** @return the descr */
        public String getDescr() {
            System.out.println("getDescr " + descr);
            return descr;
        }

        /** @param obj the descr to set */
        public void setDescr(Object obj) {
            System.out.println("setDescr " + obj);
            String sobj = (String) obj;
            this.descr = sobj.toString();
            System.out.println("setDescr " + obj);
        }

        /** @return the cre_user */
        public int getCre_user() {
            return cre_user;
        }

        /** @param obj the cre_user to set */
        public void setCre_user(Object obj) {
            Integer iobj = (Integer) obj;
            this.cre_user = iobj.intValue();
        }

        /** @return the upd_user */
        public int getUpd_user() {
            return upd_user;
        }

        /** @param obj the upd_user to set */
        public void setUpd_user(Object obj) {
            Integer iobj = (Integer) obj;
            this.upd_user = iobj.intValue();
        }

        /** @return the xfr_flag */
        public String getXfr_flag() {
            return xfr_flag;
        }

        /** @param obj the xfr_flag to set */
        public void setXfr_flag(Object obj) {
            this.xfr_flag = (String) obj;   // was casting the field itself, leaving xfr_flag unset
        }
    }
    II.  Oracle database object details
    Details of Object and Nested table created in the database.
    T_TEST_SYN is a public synonym created for t_tab_obj_test
    CREATE OR REPLACE TYPE t_obj_test as object (
    id number(10),
    descr varchar2(100),
    --cre_date  date,
    cre_user number(10),
    --upd_date  date,
    upd_user number(10),
    xfr_flag varchar2(1),
    CONSTRUCTOR FUNCTION t_obj_test ( id IN NUMBER DEFAULT NULL,
    descr IN varchar2 default null,
    --cre_date  in date      default null,
    cre_user in number default null,
    --upd_date  in date      default null,
    upd_user in number default null,
    xfr_flag in varchar2 default null ) RETURN SELF AS RESULT ) ;
    CREATE OR REPLACE TYPE BODY t_obj_test as
    CONSTRUCTOR FUNCTION t_obj_test ( id IN NUMBER DEFAULT NULL,
    descr IN varchar2 default null,
    --cre_date  in date      default null,
    cre_user in number default null,
    --upd_date  in date      default null,
    upd_user in number default null,
    xfr_flag in varchar2 default null ) RETURN SELF AS RESULT IS
    BEGIN
    SELF.id := id ;
    SELF.descr := descr ;
    --SELF.cre_date  := cre_date ;
    SELF.cre_user := cre_user ;
    --SELF.upd_date  := cre_date ;
    SELF.upd_user := cre_user ;
    SELF.xfr_flag := xfr_flag ;
    RETURN ;
    END ;
    END ;
    CREATE OR REPLACE TYPE t_tab_obj_test AS TABLE OF t_obj_test ;
    CREATE OR REPLACE PACKAGE BODY PKG_OBJ_TEST AS
    PROCEDURE accept_ui_input ( p_tab_obj_test in T_TAB_OBJ_TEST ) IS
    BEGIN
    FOR row IN p_tab_obj_test.First .. p_tab_obj_test.LAST
    LOOP
    INSERT INTO OBJ_TEST ( ID,
    DESCR,
    CRE_DATE,
    CRE_USER,
    UPD_DATE,
    UPD_USER,
    XFR_FLAG )
    VALUES ( p_tab_obj_test(row).ID,
    p_tab_obj_test(row).DESCR,
    NULL,
    p_tab_obj_test(row).CRE_USER,
    NULL,
    p_tab_obj_test(row).UPD_USER,
    p_tab_obj_test(row).XFR_FLAG ) ;
    END LOOP ;
    COMMIT ;
    END accept_ui_input ;
    END PKG_OBJ_TEST;
    /

    Check your CLASSPATH environment variable. Try to add something like c:\Ora10g\jlib\orai18n.jar.
    From "JDBC Developer’s Guide and Reference":
    orai18n.jar
    Contains classes for globalization and multibyte character sets support
    This solved the same error in my case.

  • Oracle.sql.ARRAY error "Non supported character set: oracle-character-set-"

    Hi Folks,
    I am getting error :
    java.sql.SQLException: Non supported character set: oracle-character-set-178
    at oracle.gss.util.NLSError.throwSQLException(NLSError.java:46)
    in the following code :
    Object[] arrayToPassToDB=new Object[7];
    ArrayDescriptor arraydesc = ArrayDescriptor.createDescriptor(arrayDesc,con);
    ARRAY array = null;
    array = new ARRAY(arraydesc, con, arrayToPassToDB);
    I am using JDK 1.6 and Oracle 10g.
    I have copied orai18n.jar and ojdbc6.jar into the WEB-INF\lib directory of my project and also into the Tomcat\lib directory.
    Can you please help me?
    Thanks in advance.

    java.sql.SQLException: Non supported character set: oracle-character-set-178

  • Use of UTF8 and AL32UTF8 for database character set

    I will be implementing Unicode on a 10g database, and am considering using AL32UTF8 as the database character set, as opposed to AL16UTF16 as the national character set, primarily to economize storage requirements for primarily English-based string data.
    Is anyone aware of any issues, or tradeoffs, for implementing AL32UTF8 as the database character set, as opposed to using the national character set for storing Unicode data? I am aware of the fact that UTF-8 may require 3 bytes where UTF-16 would only require 2, so my question is more specific to the use of the database character set vs. the national character set, as opposed to differences between the encoding itself. (I realize that I could use UTF8 as the national character set, but don't want to lose the ability to store supplementary characters, which UTF8 does not support, as this Oracle character set supports up to Unicode 3.0 only.)
    Thanks in advance for any counsel.

    I don't have a lot of experience with SQL Server, but my belief is that a fair number of tools that handle SQL Server NCHAR/ NVARCHAR2 columns do not handle Oracle NCHAR/ NVARCHAR2 columns. I'm not sure if that's because of differences in the provided drivers, because of architectural differences, or because I don't have enough data points on the SQL Server side.
    I've not run into any barriers, no. The two most common speedbumps I've seen are
    - I generally prefer in Unicode databases to set NLS_LENGTH_SEMANTICS to CHAR so that a VARCHAR2(100) holds 100 characters rather than 100 bytes (the default). You could also declare the fields as VARCHAR2(100 CHAR), but I'm generally lazy.
    - Making sure that the client NLS_LANG properly identifies the character set of the data going in to the database (and the character set of the data that the client wants to come out) so that Oracle's character set conversion libraries will work. If this is set incorrectly, all manner of grief can befall you. If your client NLS_LANG matches your database character set, for example, Oracle doesn't do a character set conversion, so if you have an application that is passing in Windows-1252 data, Oracle will store it using the same binary representation. If another application thinks that data is really UTF-8, the character set conversion will fail, causing it to display garbage, and then you get to go through the database to figure out which rows in which tables are affected and do a major cleanup. If you have multiple character sets inadvertently stored in the database (i.e. a few rows of Windows-1252, a few of Shift-JIS, and a few of UTF8), you'll have a gigantic mess to clean up. This is a concern whether you're using CHAR/ VARCHAR2 or NCHAR/ NVARCHAR2, and it's actually slightly harder with the N data types, but it's something to be very aware of.
    Justin
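    The mismatch Justin describes is easy to reproduce outside the database. A minimal Java illustration of what happens when bytes written under one character set are later interpreted as another (the sample word is arbitrary):

    import java.nio.charset.Charset;

    public class MismatchDemo {
        public static void main(String[] args) {
            String original = "café";

            // Client writes Windows-1252 bytes but a later reader believes the data is UTF-8:
            // 0xE9 is not a valid UTF-8 sequence, so it decodes to the replacement character.
            byte[] cp1252 = original.getBytes(Charset.forName("windows-1252"));
            String misreadAsUtf8 = new String(cp1252, Charset.forName("UTF-8"));
            System.out.println(misreadAsUtf8);   // caf? (replacement character)

            // The reverse mistake: UTF-8 bytes read as Windows-1252 show the classic
            // two-characters-per-accent mojibake.
            byte[] utf8 = original.getBytes(Charset.forName("UTF-8"));
            String misreadAsCp1252 = new String(utf8, Charset.forName("windows-1252"));
            System.out.println(misreadAsCp1252); // cafÃ©
        }
    }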

  • Region type URL  and character set encoding

    Hello,
    I'd like to include a static HTML page using a URL region, but some translation is done between the encoding of the input static HTML page and the output of HTML DB. Does anyone know how the file encoding is translated when the page is rendered?
    I have tried several encodings for the input file (CP1250, UTF-8, Unicode) but it did not work.

    DarkFiBrE72 wrote:
    It's AL16UTF16.
    From metalink
    Starting in Oracle 9i the National Characterset (NLS_NCHAR_CHARACTERSET) will be
    limited to UTF8 and AL16UTF16.
    For more details refer to The National Character Set in Oracle 9i and 10g
    Any other NLS_NCHAR_CHARACTERSET will no longer be supported.
    When upgrading to 10g the value of NLS_NCHAR_CHARACTERSET is based
    on value currently used in the Oracle8 version.
    If the NLS_NCHAR_CHARACTERSET is UTF8 then it will stay UTF8.
    In all other cases the NLS_NCHAR_CHARACTERSET is changed to AL16UTF16
    and -if used- N-type data (= data in columns using NCHAR, NVARCHAR2 or NCLOB)
    may need to be converted.
    Edited by: DarkFiBrE72 on Sep 24, 2008 7:12 PM
    I'm not sure whether the OP was referring to the national character set. Is this implied by the corresponding SQL Server character set mentioned?
    Otherwise I would assume we are talking about the database character set, which allows numerous different character sets (single-byte, multi-byte, Unicode etc., depending on the Oracle release).
    Regards,
    Randolf
    Oracle related stuff blog:
    http://oracle-randolf.blogspot.com/
    SQLTools++ for Oracle (Open source Oracle GUI for Windows):
    http://www.sqltools-plusplus.org:7676/
    http://sourceforge.net/projects/sqlt-pp/

  • Database Encoding and character set

    Hi,
    Is it possible to change (in some easier, straightforward way) the encoding and character set for a database (DB11 in this case)?
    I have a database which has UTF16 and MSWIN1252 as the encoding and character set; I want to change it to UTF8 for some testing and other reasons.
    Thanks,
    Anupam

    You should detail what you mean by encoding and character set.
    A database has 2 character sets:
    - the character set for CHAR, VARCHAR2 and CLOB
    - the national character set for NCHAR, NVARCHAR2 and NCLOB.
    You have 2 ways to change the database character set:
    - with ALTER DATABASE statement
    - with export/import
    See http://download-uk.oracle.com/docs/cd/B10501_01/server.920/a96529/ch10.htm#1656
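    Before and after any such change it is worth confirming what the database actually reports. A small JDBC sketch (connection details are placeholders) that reads both character sets from NLS_DATABASE_PARAMETERS:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ShowCharsets {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/orcl", "system", "secret");  // placeholder
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery(
                     "select parameter, value from nls_database_parameters " +
                     "where parameter in ('NLS_CHARACTERSET','NLS_NCHAR_CHARACTERSET')")) {
                while (rs.next()) {
                    // NLS_CHARACTERSET: CHAR/VARCHAR2/CLOB; NLS_NCHAR_CHARACTERSET: NCHAR/NVARCHAR2/NCLOB
                    System.out.println(rs.getString(1) + " = " + rs.getString(2));
                }
            }
        }
    }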
