NLS_LENGTH_SEMANTICS parameter...

Hi,
I want to insert multilingual characters into an Oracle XE 10g DB. To do that I connected as SYS and issued the command:
insert into dept(deptno , dname)
   values(90,'ΛΟΓΙΣΤΗΡΙΟ')
ORA-12899: value too large for column "SCOTT"."DEPT"."DNAME" (actual: 20, maximum: 14)
SQL> ALTER SYSTEM SET NLS_LENGTH_SEMANTICS='CHAR' SCOPE=BOTH;
System altered
SQL>
SQL> insert into dept(deptno , dname)
  2     values(90,'ΛΟΓΙΣΤΗΡΙΟ')
  3  /
insert into dept(deptno , dname)
   values(90,'ΛΟΓΙΣΤΗΡΙΟ')
ORA-12899: value too large for column "SCOTT"."DEPT"."DNAME" (actual: 20, maximum: 14)
SQL> SHOW PARAMETER NLS_LENGTH_SEMANTICS;
NAME                                 TYPE        VALUE
nls_length_semantics                 string      CHAR
Why does this problem persist?
Thanks...
Sim

Why does this problem persist?
Because your table was already created with BYTE semantics, i.e. the column length is 14 bytes, but the text (logistics? ;)) you'd like to insert is 10 characters yet 20 bytes, so it cannot fit. As the previous poster suggested, modify the column to 14 CHAR -- see the sketch below. Your ALTER SYSTEM will take care of all your future DDL, but it does not modify already existing tables (and, BTW, already compiled program units as well).
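For completeness, here is a minimal sketch of the fix described above (column name and sizes are taken from the error message; adjust to your own table):
-- Widen the existing column to character semantics:
ALTER TABLE scott.dept MODIFY (dname VARCHAR2(14 CHAR));
-- Verify which semantics the column now uses (CHAR_USED: B = byte, C = char):
SELECT column_name, data_type, char_length, char_used
  FROM all_tab_columns
 WHERE owner = 'SCOTT' AND table_name = 'DEPT';
-- Retry the insert:
INSERT INTO dept (deptno, dname) VALUES (90, 'ΛΟΓΙΣΤΗΡΙΟ');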
Gints Plivna
http://www.gplivna.eu

Similar Messages

  • Use of nls_length_semantics parameter

    What is the use of the nls_length_semantics parameter?
    And what is the difference between setting nls_length_semantics='CHAR' and 'BYTE'?
    Thanks in advance

    NLS_LENGTH_SEMANTICS enables you to create CHAR and VARCHAR2 columns using either byte or character length semantics. Existing columns are not affected.
    NCHAR, NVARCHAR2, CLOB, and NCLOB columns are always character-based. You may be required to use byte semantics in order to maintain compatibility with existing applications.
    NLS_LENGTH_SEMANTICS does not apply to tables in SYS and SYSTEM. The data dictionary always uses byte semantics.
    http://www.oratransplant.nl/2005/11/
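    A small illustrative sketch (the table name is hypothetical) of the two semantics side by side:
    -- With a multi-byte database character set these two columns behave differently:
    CREATE TABLE sem_demo (
      name_b VARCHAR2(10 BYTE),   -- at most 10 bytes, possibly fewer than 10 characters
      name_c VARCHAR2(10 CHAR)    -- always 10 characters, whatever their byte width
    );
    -- Omitting the qualifier picks up the current NLS_LENGTH_SEMANTICS setting:
    ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR;
    ALTER TABLE sem_demo ADD (name_default VARCHAR2(10));   -- created as 10 CHAR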

  • Gurus, what exactly is the purpose of this NLS_LENGTH_SEMANTICS para?

    Gurus, it would be greatly appreciated if you could explain what the exact purpose of this NLS_LENGTH_SEMANTICS parameter is and what "LENGTH SEMANTICS" means.
    I read some posts in this forum and got some idea, but I still don't get the REAL PURPOSE of this variable.
    I mean, I have never changed this parameter in my 5 years as an Oracle developer, whether creating tables or doing anything else for that matter.
    I have done all my programming in ENGLISH. I have never worked on multilingual apps.
    Our DBs are 10.2.0.4.0s.
    Our DB character set is WE8ISO8859P1.
    NLS_NCHAR_CHARACTERSET is AL16UTF16.
    NLS_LENGTH_SEMANTICS is BYTE
    (1.) Why would anybody want to change this NLS_LENGTH_SEMANTICS to CHAR??
    (2.) Is it only used when you want to store NON-ENGLISH characters like Chinese, Persian etc.?
    (3.) If so why?
    (4.) Is there a storage advantage when CHAR is used instead of BYTES? Can you put BOTH as the parameter value?
    (5.) Is there issues when searching when non-English characters are stored in the DB??
    (6.) Is AL16UTF16 a multi-byte character set?
    (7.) Is WE8ISO8859P1 multi-byte or single-byte character set?? How to find these things???
    (8.) What is the real advantage of this NLS_LENGTH_SEMANTICS column and when should we use it and not use it??
    In easy to understand language please.
    thanks in advance.
    Edited by: user12240205 on Oct 28, 2011 12:36 AM

    user12240205 wrote:
    Gurus, it would be greatly appreciated if you could explain what the exact purpose of this NLS_LENGTH_SEMANTICS parameter is and what "LENGTH SEMANTICS" means.
    This is useful mainly when you use a multi-byte character set for your database. A multi-byte character set means the database may store one character in more than one byte. A character can take up to 4 bytes, and the width can vary even within one character set: e.g. AL32UTF8 stores an English character in one byte, while a Chinese character needs three or four bytes.
    VARCHAR2(10 CHAR) means that you define the maximum length of your column in characters, no matter how many bytes are needed to store them.
    You can also check the docs. There is a really good book called [url http://download.oracle.com/docs/cd/B19306_01/server.102/b14225/ch1overview.htm#sthref51]Oracle® Database Globalization Support Guide, and [url http://download.oracle.com/docs/cd/B19306_01/server.102/b14225/ch3globenv.htm#sthref385]Length Semantics has its own page in it...
    I read some posts in this Forum and got some idea but still don't get the REAL PURPOSE of this variable.
    I mean, I have never changed these parameters in my 5 years as an Oracle developer when creating tables or doing anything else for that matter..
    I have done all my programming and everything in ENGLISH. I have never used multi-ling apps.
    Our DBs are 10.2.0.4.0s.
    Our DB character set is WE8ISO8859P1.
    NLS_NCHAR_CHARACTERSET is AL16UTF16.
    NLS_LENGTH_SEMANTICS is BYTE
    (1.) Why would anybody want to change this NLS_LENGTH_SEMANTICS to CHAR?
    I think the above explanation gives you the answer.
    (2.) Is it only used when you want to store NON-ENGLISH characters like Chinese, Persian etc.?
    Mainly yes, but it applies to any other multi-byte character set as well.
    (3.) If so why?
    See above.
    (4.) Is there a storage advantage when CHAR is used instead of BYTES? Can you put BOTH as the parameter value?
    The value is either BYTE or CHAR; you can use only one at a time, but you can change it in your sessions, and you can mix CHAR and BYTE column length definitions in one table.
    (5.) Are there issues when searching when non-English characters are stored in the DB?
    (6.) Is AL16UTF16 a multi-byte character set?
    The 16 means 16 bits, and as far as I know this character set always uses 2 bytes to store any character.
    (7.) Is WE8ISO8859P1 a multi-byte or single-byte character set? How to find these things?
    Single-byte. Look for MB (multibyte) in the comment column of [url http://download.oracle.com/docs/cd/B19306_01/server.102/b14225/applocaledata.htm#sthref1960]Recommended Database Character Sets
    (8.) What is the real advantage of this NLS_LENGTH_SEMANTICS parameter and when should we use it and not use it?
    IMHO it is good to use for a multi-language column where many languages can be stored, so you can store an equal number of Japanese, Chinese, or English characters in that column.
    In easy to understand language please.
    thanks in advance.
    Finally if you have MOS access see
    [url https://support.oracle.com/CSP/main/article?cmd=show&type=NOT&doctype=BULLETIN&id=144808.1]Examples and limits of BYTE and CHAR semantics usage (NLS_LENGTH_SEMANTICS) ID 144808.1
    Edited by: Kecskemethy on Oct 28, 2011 7:46 AM
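    For the "how to find these things" part of the question, a hedged sketch of the queries one could run:
    -- Database and national character sets, plus the default length semantics:
    SELECT parameter, value
      FROM nls_database_parameters
     WHERE parameter IN ('NLS_CHARACTERSET',
                         'NLS_NCHAR_CHARACTERSET',
                         'NLS_LENGTH_SEMANTICS');
    -- How many bytes a given string needs in the database character set:
    SELECT LENGTH('some text') AS chars, LENGTHB('some text') AS bytes FROM dual;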

  • Unable to change NLS_LENGTH_SEMANTICS

    I am currently trying to run a UTF8-enabled application on an Oracle 11.2.0.1.0 database and have been having trouble storing some characters in the DB. I have been advised by the application support that the product can definitely store them, and that I must set the NLS_LENGTH_SEMANTICS parameter to CHAR using the following statement:
    ALTER SYSTEM SET NLS_LENGTH_SEMANTICS=CHAR SCOPE=BOTH;
    However, after logging into the target database as the "system" user and running this statement, which completes successfully, querying v$nls_parameters still shows that it is set to BYTE. I have then tried creating a new database and setting this value to CHAR in the advanced settings, but still to no avail.
    After googling this and reading several articles I began to notice some other strange behaviour. The first thing I noticed was that the SPFILE{dbname}.ORA file contains the right value, i.e.:
    *.nls_length_semantics='CHAR'
    The second thing I noticed was that running the v$nls_parameters query in SQL Developer produced a different result from the one produced by SQL*Plus (see below), even though exactly the same user in the same database is being used. In SQL Developer the query "select * from v$nls_parameters where Parameter = 'NLS_LENGTH_SEMANTICS';" produces:
    PARAMETER VALUE
    NLS_LENGTH_SEMANTICS BYTE
    1 rows selected
    Whereas in SQL*Plus this query produces:
    SQL> select * from v$nls_parameters where Parameter = 'NLS_LENGTH_SEMANTICS';
    PARAMETER
    VALUE
    NLS_LENGTH_SEMANTICS
    CHAR
    Can anyone provide any insight into what could be wrong with my Oracle database setup? The application developers are adamant that UTF8 characters are supported, and the only difference they can see between my setup (which doesn't work) and theirs (which does work) is the value of this parameter. Details of my system are:
    OS: Windows 7 x64
    SQL Developer: 2.1.0.63
    DATABASE: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    PL/SQL Release 11.2.0.1.0 - Production
    CORE     11.2.0.1.0     Production
    TNS for 64-bit Windows: Version 11.2.0.1.0 - Production
    NLSRTL Version 11.2.0.1.0 - Production

    Forum for NLS / Globalization Support discussions:
    Globalization Support
    user10600690 wrote:
    I am currently trying to run a UTF8-enabled application on an Oracle 11.2.0.1.0 database,
    Is this database created with a db charset of AL32UTF8?
    and have been having trouble storing some characters in the DB.
    What kind of trouble?
    ALTER SYSTEM SET NLS_LENGTH_SEMANTICS=CHAR SCOPE=BOTH;
    However, after logging into the target database as the "system" user and running this statement, which completes successfully, querying v$nls_parameters still shows that it is set to BYTE. I have then tried creating a new database and setting this value to CHAR in the advanced settings, but still to no avail.
    You've changed one parameter and looked at another.
    There are different "levels" in what you wrote in the previous quote. Take a look at the dictionary views (http://download.oracle.com/docs/cd/E11882_01/server.112/e10729/ch3globenv.htm#i1006415)
    NLS_SESSION_PARAMETERS (compare to v$nls_parameters, on which the view is based)
    NLS_INSTANCE_PARAMETERS (alter system set nls...)
    NLS_DATABASE_PARAMETERS (creating a new database...)
    You may need to bounce the instance for the setting to have any effect on new sessions (even with scope=both). At least I think that was the case in some previous release.
    Note that I do not think you can (or, at least, should) create a database with semantics=char. The "advanced settings" probably referred to instance parameters.
    The first thing I noticed was that the SPFILE{dbname}.ORA file contains the right value, i.e.:
    *.nls_length_semantics='CHAR'
    Yes, the instance-level view should agree.
    >
    The second thing I noticed was that running the v$nls_parameters query in SQL Developer produced a different result from the one produced by SQL*Plus
    Could be different client config/environment settings affecting session parameters. Compare with the session-level view.
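    A sketch of how to compare the three levels mentioned above (v$nls_parameters is session-scoped):
    -- Session level (what your current connection uses):
    SELECT value FROM nls_session_parameters  WHERE parameter = 'NLS_LENGTH_SEMANTICS';
    -- Instance level (what ALTER SYSTEM ... SCOPE=BOTH changed):
    SELECT value FROM nls_instance_parameters WHERE parameter = 'NLS_LENGTH_SEMANTICS';
    -- Database level (set at creation time; not changed by ALTER SYSTEM):
    SELECT value FROM nls_database_parameters WHERE parameter = 'NLS_LENGTH_SEMANTICS';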

  • Change NLS_LENGTH_SEMANTICS from BYTE to CHAR on Oracle 9i2 problem!

    Hi,
    I have created a new database on Oracle 9i Release 2, but I did not find the correct pfile parameter for the NLS_LENGTH_SEMANTICS setting.
    So I created a standard UTF8 database and now I am trying to change the default NLS_LENGTH_SEMANTICS=BYTE to CHAR. I execute the following command in SQL*Plus: "ALTER SYSTEM SET NLS_LENGTH_SEMANTICS=CHAR SCOPE=BOTH"
    The system tells me that the command executed successfully.
    But when I look at the NLS_DATABASE_PARAMETERS view I do not see any change to the NLS_LENGTH_SEMANTICS parameter.
    I have also restarted the instance, but everything is still the same as it was before.
    Do you know what I am doing wrong?
    Regards
    RobH

    Hi,
    Yeah, you are right: NLS_LENGTH_SEMANTICS in nls_session_parameters for the app user is set to CHAR.
    Does this mean that NLS_DATABASE_PARAMETERS shows the values from the SYS or SYSTEM user's point of view?
    Thanks a lot
    Regards
    RobH

  • Column Length Semantics  and Table Columns - NLS_LENGTH_SEMANTICS

    Is NLS_LENGTH_SEMANTICS defined in Oracle 10g?
    If not, what is its equivalent?
    Please let me know about this.

    $ sqlplus / as sysdba
    SQL*Plus: Release 10.2.0.1.0 - Production on Mon May 22 23:30:14 2006
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    SYS@db102 SQL> select * from nls_database_parameters
      2  where parameter = 'NLS_LENGTH_SEMANTICS';
    PARAMETER                      VALUE
    NLS_LENGTH_SEMANTICS           BYTE
    SYS@db102 SQL>                                                                  

  • Unicode in Oracle 8i and Oracle XE

    I'm developing a program that reads the database objects and data from an Oracle 8.1.7 database, stores everything as XML files, then reads these XML files on a different computer, builds the same database on Oracle XE, and imports the data into it.
    Everything has worked fine so far, except that I'm facing an issue with some of the VARCHAR2 fields: I'm getting this error when importing the data (ORA-12899: value too large for column ... etc.)
    Data is stored in these fields as Unicode in both Oracle 8i and XE; I don't know why some of the data does not fit in the new XE database.
    Does anyone have a clue why the same data requires more bytes in Oracle XE than it did in Oracle 8i?
    Note: I'm not using any of Oracle tools to import or export, I'm using a program that I developed.

    Could be related to nls_length_semantics problems similar to those here: NLS_LENGTH_SEMANTICS parameter...
    It seems that Oracle 8i did not have the nls_length_semantics parameter and most probably used only byte semantics. To my mind Oracle 8i had quite limited support for Unicode, so it might be that the behaviour changed between versions.
    Gints Plivna
    http://www.gplivna.eu
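    A quick diagnostic sketch (table and column names are placeholders) to see where the extra bytes come from:
    -- Character count vs. byte count of the data in the source database:
    SELECT some_col,
           LENGTH(some_col)  AS char_len,
           LENGTHB(some_col) AS byte_len
      FROM some_table
     WHERE LENGTHB(some_col) > LENGTH(some_col);   -- rows containing multi-byte characters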

  • NLS data conversion – best practice

    Hello,
    I have several tables originate from a database with a single byte character set. I want to load the data into a database with multi-byte character set like UTF-8, and in the future, be able to use the Unicode version of Oracle XE.
    When I'm using DDL scripts to create the tables on the new database, and after that trying to load the data, I receive a lot of error messages regarding the size of the VARCHAR2 fields (which, of course, makes sense).
    As I understand it, I can solve the problem by doubling the size of the VARCHAR2 fields: VARCHAR2(20) would become VARCHAR2(40) and so on. Another option is to use the NVARCHAR2 datatype and retain the correlation with the number of characters in the field.
    I have never used NVARCHAR2 before, so I don't know if there are any side effects on the pre-built APEX processes like Automatic DML, Automatic Row Fetch and the like, or on the APEX data import mechanism.
    What will be the best practice solution for APEX?
    I'll appreciate any comments on the subjects,
    Arie.

    Hello,
    Thanks Maxim and Patrick for your replies.
    I started to answer Maxim when Patrick's post came in. It's interesting, as I tried to change this nls_length_semantics parameter once before, but without any success. I even wrote an APEX procedure to run over all my VARCHAR2 columns and change them to something like VARCHAR2(20 CHAR). However, I wasn't satisfied with this solution, partially because of what Patrick said about developers forgetting the full syntax, and partially because I read that some of the internal procedures (mainly those dealing with LOBs) do not support character mode and always work in byte mode.
    Changing the nls_length_semantics parameter seems like a very good solution, mainly because, as Patrick wrote, " The big advantage is that you don't have to change any scripts or PL/SQL code."
    I'm just curious: what technique does APEX use to run on all the various single-byte and multi-byte character sets?
    Thanks,
    Arie.
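    In case it helps others, a sketch of how such a one-off conversion script could be generated from the data dictionary (review the generated statements before running them):
    -- Generate ALTER statements for all byte-sized VARCHAR2 columns in the current schema:
    SELECT 'ALTER TABLE ' || table_name
           || ' MODIFY (' || column_name
           || ' VARCHAR2(' || char_length || ' CHAR));' AS ddl
      FROM user_tab_columns
     WHERE data_type = 'VARCHAR2'
       AND char_used = 'B';   -- B = byte semantics, C = char semantics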

  • Import taking too much time

    Hi all
    I'm quite new to database administration. My problem is that I'm trying to import a dump file, but one of the tables is taking too much time to import.
    Description:
    1. The export was taken from a source database on Oracle 8i; its character set is WE8ISO8859P1.
    2. I am importing into 10g with character set UTF8; the national character set is also the same.
    3. The dump file is about 1.5 GB.
    4. I got an error like "value too large for column", so in the target DB (which is UTF8) I converted all columns from VARCHAR2 to CHAR.
    5. During the import some tables import very fast, but one particular table gets very slow.
    Please help me, thanks in advance.

    Hello,
    4. I got an error like "value too large for column", so in the target DB (which is UTF8) I converted all columns from VARCHAR2 to CHAR.
    5. During the import some tables import very fast, but one particular table gets very slow.
    For point *4*, it's typically due to the CHARACTER SET conversion.
    You export data in WE8ISO8859P1 and import into UTF8. In WE8ISO8859P1 characters are encoded in *1 Byte*, so *1 CHAR = 1 BYTE*. In UTF8 (Unicode) characters are encoded in up to *4 Bytes*, so *1 CHAR can be > 1 BYTE*.
    For this reason you'll have to modify the length of your CHAR or VARCHAR2 columns, or add the CHAR option (by default it's BYTE) in the column datatype definitions of the tables. For instance:
    VARCHAR2(100 CHAR)
    The NLS_LENGTH_SEMANTICS parameter may also be used, but it's not very well handled by export/import.
    So, I suggest you this:
    1. Set NLS_LENGTH_SEMANTICS=CHAR on your target database and restart the database (see the sketch below).
    2. Create all your tables (empty) from a script on the target database (without the indexes and constraints).
    3. Import the data into the tables.
    4. Import the indexes and constraints.
    You'll find more information in the following MOS Note:
    Examples and limits of BYTE and CHAR semantics usage (NLS_LENGTH_SEMANTICS) [ID 144808.1]
    For point *5*, it may be due to the conversion problem you are experiencing; it may also be due to some special datatype like LONG.
    Also, I have a question: why did you choose UTF8 for your target database and not AL32UTF8?
    AL32UTF8 is recommended for Unicode use.
    Hope this helps.
    Best regards,
    Jean-Valentin
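    Step 1 above, as a minimal sketch:
    -- On the target database, make CHAR the default for future DDL, then restart:
    ALTER SYSTEM SET NLS_LENGTH_SEMANTICS = CHAR SCOPE = SPFILE;
    SHUTDOWN IMMEDIATE
    STARTUP
    -- Tables created afterwards (step 2) will get character semantics by default.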

  • Problem creating a table with a subquery and a dblink

    Hello!!
    I have a little problem. When I create a table with a subquery and a dblink, for example:
    CREATE TABLE EXAMPLE2 AS SELECT * FROM EXAMPLE1@DBLINK
    the table definition is changed. Fields of type CHAR or VARCHAR2 are created with a size three times bigger than in the original table in the remote database. Fields of type DATE and NUMBER are not changed. For example, if the original table in database 1 has a field of type CHAR(1), it is created in the local database as CHAR(3), and a VARCHAR2(5) field is created as VARCHAR2(15).
    Database 1 has a WE8DEC character set.
    Database 2 has a AL32UTF8 character set.
    Could it be related to the difference in character sets?
    What can I do to make Oracle use the same table definition when creating a table in this way?
    Thanks!!

    That is related to character sets, and probably necessary if you want all the data in the remote table to be able to fit in the new table.
    When you declare a column VARCHAR2(5), by default, you're allocating 5 bytes of storage. In a single-byte character set, which I believe WE8DEC is, that also happens to equate to 5 characters. In a multi-byte character set like AL32UTF8, though, 1 character may require up to 3 bytes of storage, so you'd need to allocate 15 bytes to store that data. That's what's going on here.
    You could still store all the data if you create the table locally by explicitly requesting 5 characters of storage by declaring the column VARCHAR2(5 CHAR). You could also set the NLS_LENGTH_SEMANTICS parameter to CHAR rather than BYTE before creating the table, but I believe that both of these will only apply when you're explicitly defining columns as part of your CREATE TABLE. I don't believe either will apply to CTAS statements.
    Justin
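    One workaround along the lines Justin describes -- a sketch with placeholder column definitions -- is to pre-create the table with explicit character semantics and then copy the rows:
    -- Pre-create the local table with CHAR semantics instead of letting CTAS size it in bytes:
    CREATE TABLE example2 (
      code CHAR(1 CHAR),
      name VARCHAR2(5 CHAR)
    );
    -- Then copy the data over the database link:
    INSERT INTO example2 SELECT * FROM example1@dblink;
    COMMIT;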

  • Loading data by sql loader in oracle 10g on linux

    I am trying to load data into Oracle 10g on Linux using SQL*Loader, but I am getting an error.
    The log shows that the length of the SURNAME field is greater than the table column size.
    Following is the error in log file of sql loader
    Record 21: Rejected - Error on table TABLE1, column
    SURNAME.
    ORA-12899: value too large for column SURNAME (actual: 65, maximum: 64)
    As is evident from the following control file, I am using TRIM to discard any spaces, so why is it giving this error?
    LOAD DATA
    TRUNCATE
    INTO TABLE TABLE1
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    ID INTEGER EXTERNAL,
    OPTION1 CHAR,
    REF1 CHAR,
    OTHER_REF CHAR,
    TITLE "TRIM(:TITLE)",
    FORENAME "TRIM(:FORENAME)",
    SURNAME "TRIM(:SURNAME)",
    JOINT_TITLE "TRIM(:JOINT_TITLE)",
    JOINT_FORENAME "TRIM(:JOINT_FORENAME)",
    JOINT_SURNAME "TRIM(:JOINT_SURNAME)",
    I checked the bad file and counted the characters; there are 64 of them.
    When I load an individual record from the bad file with SQL*Loader, it loads fine.

    Probably your database character set is multi-byte, that is, something like %UTF8 or AL16UTF16%.
    Post your NLS database parameter values:
    select * from nls_database_parameters;
    In general, VARCHAR2(65) by default means 65 BYTES unless
    you have changed your default NLS_LENGTH_SEMANTICS parameter from BYTE to CHAR.
    With best regards
    Shan
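    If the data really is multi-byte, a minimal sketch of the usual fix (column size taken from the error message):
    -- Size the column in characters so 64 characters fit even when some of them take several bytes:
    ALTER TABLE table1 MODIFY (surname VARCHAR2(64 CHAR));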

  • Dbms_crypto encrypt date number datatype

    I am using Oracle 11g. I am very new to dbms_crypto. I went through the documentation but have the following doubts:
    Is it mandatory to convert VARCHAR2(32) to RAW to use dbms_crypto.encrypt?
    If I change VARCHAR2(32) to RAW, can I make it RAW(32) or does it need to be bigger?
    Must the RAW size be a multiple of 16?
    How can I encrypt data of datatype DATE and NUMBER using dbms_crypto?
    Thanks a lot for taking the time to clarify my queries.

    spur230 wrote:
    Is it mandatory to convert VARCHAR2(32) to RAW to use dbms_crypto.encrypt?
    It's not mandatory, but it's certainly a good idea. If you store encrypted data in a VARCHAR2 column, that means that it is subject to character set conversion if it's moved from one database to another or sent from a database to a client machine. But if character set conversion happens, your encrypted data is corrupted.
    If I change VARCHAR2(32) to RAW, can I make it RAW(32) or does it need to be bigger?
    Must the RAW size be a multiple of 16?
    It would be helpful to specify exactly what algorithm and parameters you intend to use because it may vary. If, for example, we encrypt using AES-256 with Cipher Block Chaining and PKCS#5 compliant padding (which happens to be the example in the DBMS_CRYPTO manual), the output RAW will always be a multiple of 16 and as large or larger than the input RAW.
    A VARCHAR2(32) will either allocate 32 characters of storage or 32 bytes of storage depending on your NLS_LENGTH_SEMANTICS parameter. If you're using the default, it will allocate 32 bytes. But 32 bytes in the database character set may require more than 32 bytes of storage once you convert it to a UTF-8 encoded RAW (which, technically, also isn't required but is a good practice) and, thus, the encrypted string might require more than 32 bytes of storage. Your database character set and the actual data you store/ want to be able to store will influence how likely it is that you'll need a larger RAW than your VARCHAR2.
    How can I encrypt data of datatype DATE and NUMBER using dbms_crypto?
    dbms_crypto only operates on RAW data. Just like you convert strings to RAW before encrypting them, you'd need to convert your dates and numbers to RAW. For numbers, you should be able to use UTL_RAW.CAST_FROM_NUMBER. I don't know of a method of casting dates to a RAW other than converting them to a known string representation and then encrypting that (and, of course, doing the reverse when you decrypt the string and convert it back to a date using that same format).
    Justin
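    A rough sketch of the number/date conversion Justin describes (key handling is simplified to a hard-coded demo key; do not use it as-is):
    DECLARE
      l_key      RAW(32) := UTL_RAW.CAST_TO_RAW('0123456789abcdef0123456789abcdef'); -- demo key only
      l_mode     PLS_INTEGER := DBMS_CRYPTO.ENCRYPT_AES256
                              + DBMS_CRYPTO.CHAIN_CBC
                              + DBMS_CRYPTO.PAD_PKCS5;
      l_num      NUMBER := 12345.67;
      l_date     DATE   := SYSDATE;
      l_enc_num  RAW(2000);
      l_enc_date RAW(2000);
    BEGIN
      -- Numbers: cast directly to RAW before encrypting.
      l_enc_num := DBMS_CRYPTO.ENCRYPT(
                     src => UTL_RAW.CAST_FROM_NUMBER(l_num),
                     typ => l_mode,
                     key => l_key);
      -- Dates: go through a known string format, then convert the string to RAW.
      l_enc_date := DBMS_CRYPTO.ENCRYPT(
                      src => UTL_I18N.STRING_TO_RAW(TO_CHAR(l_date, 'YYYYMMDDHH24MISS'), 'AL32UTF8'),
                      typ => l_mode,
                      key => l_key);
      DBMS_OUTPUT.PUT_LINE('number -> ' || RAWTOHEX(l_enc_num));
      DBMS_OUTPUT.PUT_LINE('date   -> ' || RAWTOHEX(l_enc_date));
    END;
    /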

  • Incorrect data_length for columns with char semantics in 10g

    Hi,
    I was going through a few databases at my work place and I noticed something unusual.
    Database Server - Oracle 10g R2
    Database Client - Oracle 11g R1 (11.1.0.6.0 EE)
    Client OS - Win XP
    SQL>
    SQL> @ver
    BANNER
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    PL/SQL Release 10.2.0.4.0 - Production
    CORE    10.2.0.4.0      Production
    TNS for Linux: Version 10.2.0.4.0 - Production
    NLSRTL Version 10.2.0.4.0 - Production
    5 rows selected.
    SQL> --
    SQL> drop table t;
    Table dropped.
    SQL> create table t (
      2    a    char(3 char),
      3    b    char(3 byte),
      4    c    char(3),
      5    d    varchar2(3 char),
      6    e    varchar2(3 byte),
      7    f    varchar2(3)
      8  );
    Table created.
    SQL> --
    SQL> desc t
    Name                                      Null?    Type
    A                                                  CHAR(3 CHAR)
    B                                                  CHAR(3)
    C                                                  CHAR(3 CHAR)      <= why does it show "CHAR" ? isn't "BYTE" semantics the default i.e. CHAR(3) = CHAR(3 BYTE) ?
    D                                                  VARCHAR2(3 CHAR)
    E                                                  VARCHAR2(3)
    F                                                  VARCHAR2(3 CHAR)  <= same here; this should be VARCHAR2(3)
    SQL> --
    SQL> select table_name,
      2         column_name,
      3         data_type,
      4         data_length,
      5         data_precision,
      6         data_scale
      7    from user_tab_columns
      8   where table_name = 'T';
    TABLE_NAME   COLUMN_NAME  DATA_TYPE  DATA_LENGTH DATA_PRECISION DATA_SCALE
    T            A            CHAR                12                               <= why 12 and not 3 ? why multiply by 4 ?
    T            B            CHAR                 3
    T            C            CHAR                12                               <= same here
    T            D            VARCHAR2            12                               <= and here
    T            E            VARCHAR2             3
    T            F            VARCHAR2            12                               <= and here
    6 rows selected.
    SQL>
    SQL>
    I believe it multiplies the size by 4, because it shows 16 in user_tab_columns when the size is changed to 4.
    When I try this on 11g R1 server, it looks good -
    Database Server - Oracle 11g R1
    Database Client - Oracle 11g R1 (11.1.0.6.0 EE)
    Client OS - Win XP
    SQL>
    SQL> @ver
    BANNER
    Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
    PL/SQL Release 11.1.0.6.0 - Production
    CORE    11.1.0.6.0      Production
    TNS for 32-bit Windows: Version 11.1.0.6.0 - Production
    NLSRTL Version 11.1.0.6.0 - Production
    5 rows selected.
    SQL> --
    SQL> drop table t;
    Table dropped.
    SQL> create table t (
      2    a    char(3 char),
      3    b    char(3 byte),
      4    c    char(3),
      5    d    varchar2(3 char),
      6    e    varchar2(3 byte),
      7    f    varchar2(3)
      8  );
    Table created.
    SQL> --
    SQL> desc t
    Name                                      Null?    Type
    A                                                  CHAR(3 CHAR)
    B                                                  CHAR(3)
    C                                                  CHAR(3)
    D                                                  VARCHAR2(3 CHAR)
    E                                                  VARCHAR2(3)
    F                                                  VARCHAR2(3)
    SQL> --
    SQL> select table_name,
      2         column_name,
      3         data_type,
      4         data_length,
      5         data_precision,
      6         data_scale
      7    from user_tab_columns
      8   where table_name = 'T';
    TABLE_NAME   COLUMN_NAME  DATA_TYPE    DATA_LENGTH DATA_PRECISION DATA_SCALE
    T            A            CHAR                   3
    T            B            CHAR                   3
    T            C            CHAR                   3
    T            D            VARCHAR2               3
    T            E            VARCHAR2               3
    T            F            VARCHAR2               3
    6 rows selected.
    SQL>
    SQL>
    Is it a known bug? Unfortunately, I do not have access to Metalink.
    Thanks,
    isotope
    Edited by: isotope on Mar 3, 2010 6:46 AM

    Anurag Tibrewal wrote:
    It is just because you have different NLS_LENGTH_SEMANTICS values in v$nls_parameters for the two databases. It is BYTE in the 10g one and CHAR in the 11g one.
    I cannot query v$nls_parameters in the 10g database. I tried this testcase with ALTER SESSION and checked nls_session_parameters in both 10g and 11g. The client is 11g in each case.
    The DESCRIBE output looks OK, but user_tab_columns shows size*4.
    Testcase -
    cl scr
    select * from v$version;
    -- Try CHAR semantics
    alter session set nls_length_semantics=char;
    select * from nls_session_parameters where parameter = 'NLS_LENGTH_SEMANTICS';
    drop table t;
    create table t (
      a    char(3 char),
      b    char(3 byte),
      c    char(3),
      d    varchar2(3 char),
      e    varchar2(3 byte),
      f    varchar2(3)
    );
    desc t
    select table_name,
           column_name,
           data_type,
           data_length,
           data_precision,
           data_scale
      from user_tab_columns
    where table_name = 'T';
    -- Try BYTE semantics
    alter session set nls_length_semantics=byte;
    select * from nls_session_parameters where parameter = 'NLS_LENGTH_SEMANTICS';
    drop table t;
    create table t (
      a    char(3 char),
      b    char(3 byte),
      c    char(3),
      d    varchar2(3 char),
      e    varchar2(3 byte),
      f    varchar2(3)
    );
    desc t
    select table_name,
           column_name,
           data_type,
           data_length,
           data_precision,
           data_scale
      from user_tab_columns
    where table_name = 'T';
    In 10g R2 server -
    SQL>
    SQL> select * from v$version;
    BANNER
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
    PL/SQL Release 10.2.0.4.0 - Production
    CORE    10.2.0.4.0      Production
    TNS for Linux: Version 10.2.0.4.0 - Production
    NLSRTL Version 10.2.0.4.0 - Production
    SQL>
    SQL> -- Try CHAR semantics
    SQL> alter session set nls_length_semantics=char;
    Session altered.
    SQL> select * from nls_session_parameters where parameter = 'NLS_LENGTH_SEMANTICS';
    PARAMETER            VALUE
    NLS_LENGTH_SEMANTICS CHAR
    SQL> --
    SQL> drop table t;
    Table dropped.
    SQL> create table t (
      2    a    char(3 char),
      3    b    char(3 byte),
      4    c    char(3),
      5    d    varchar2(3 char),
      6    e    varchar2(3 byte),
      7    f    varchar2(3)
      8  );
    Table created.
    SQL> --
    SQL> desc t
    Name                                            Null?    Type
    A                                                        CHAR(3)
    B                                                        CHAR(3 BYTE)
    C                                                        CHAR(3)
    D                                                        VARCHAR2(3)
    E                                                        VARCHAR2(3 BYTE)
    F                                                        VARCHAR2(3)
    SQL> --
    SQL> select table_name,
      2         column_name,
      3         data_type,
      4         data_length,
      5         data_precision,
      6         data_scale
      7    from user_tab_columns
      8   where table_name = 'T';
    TABLE_NAME        COLUMN_NAME  DATA_TYPE    DATA_LENGTH DATA_PRECISION DATA_SCALE
    T                 A            CHAR                  12      <==
    T                 B            CHAR                   3
    T                 C            CHAR                  12      <==
    T                 D            VARCHAR2              12      <==
    T                 E            VARCHAR2               3
    T                 F            VARCHAR2              12      <==
    6 rows selected.
    SQL>
    SQL> -- Try BYTE semantics
    SQL> alter session set nls_length_semantics=byte;
    Session altered.
    SQL> select * from nls_session_parameters where parameter = 'NLS_LENGTH_SEMANTICS';
    PARAMETER            VALUE
    NLS_LENGTH_SEMANTICS BYTE
    SQL> --
    SQL> drop table t;
    Table dropped.
    SQL> create table t (
      2    a    char(3 char),
      3    b    char(3 byte),
      4    c    char(3),
      5    d    varchar2(3 char),
      6    e    varchar2(3 byte),
      7    f    varchar2(3)
      8  );
    Table created.
    SQL> --
    SQL> desc t
    Name                                            Null?    Type
    A                                                        CHAR(3 CHAR)
    B                                                        CHAR(3)
    C                                                        CHAR(3)
    D                                                        VARCHAR2(3 CHAR)
    E                                                        VARCHAR2(3)
    F                                                        VARCHAR2(3)
    SQL> --
    SQL> select table_name,
      2         column_name,
      3         data_type,
      4         data_length,
      5         data_precision,
      6         data_scale
      7    from user_tab_columns
      8   where table_name = 'T';
    TABLE_NAME        COLUMN_NAME  DATA_TYPE    DATA_LENGTH DATA_PRECISION DATA_SCALE
    T                 A            CHAR                  12    <==
    T                 B            CHAR                   3
    T                 C            CHAR                   3
    T                 D            VARCHAR2              12   <==
    T                 E            VARCHAR2               3
    T                 F            VARCHAR2               3
    6 rows selected.
    SQL>
    SQL>
    In 11g R1 server -
    SQL>
    SQL> select * from v$version;
    BANNER
    Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
    PL/SQL Release 11.1.0.6.0 - Production
    CORE    11.1.0.6.0      Production
    TNS for 32-bit Windows: Version 11.1.0.6.0 - Production
    NLSRTL Version 11.1.0.6.0 - Production
    5 rows selected.
    SQL>
    SQL> -- Try CHAR semantics
    SQL> alter session set nls_length_semantics=char;
    Session altered.
    SQL> select * from nls_session_parameters where parameter = 'NLS_LENGTH_SEMANTICS';
    PARAMETER                      VALUE
    NLS_LENGTH_SEMANTICS           CHAR
    1 row selected.
    SQL> --
    SQL> drop table t;
    Table dropped.
    SQL> create table t (
      2    a    char(3 char),
      3    b    char(3 byte),
      4    c    char(3),
      5    d    varchar2(3 char),
      6    e    varchar2(3 byte),
      7    f    varchar2(3)
      8  );
    Table created.
    SQL> --
    SQL> desc t
    Name                                      Null?    Type
    A                                                  CHAR(3)
    B                                                  CHAR(3 BYTE)
    C                                                  CHAR(3)
    D                                                  VARCHAR2(3)
    E                                                  VARCHAR2(3 BYTE)
    F                                                  VARCHAR2(3)
    SQL> --
    SQL> select table_name,
      2         column_name,
      3         data_type,
      4         data_length,
      5         data_precision,
      6         data_scale
      7    from user_tab_columns
      8   where table_name = 'T';
    TABLE_NAME   COLUMN_NAME  DATA_TYPE    DATA_LENGTH DATA_PRECISION DATA_SCALE
    T            A            CHAR                   3
    T            B            CHAR                   3
    T            C            CHAR                   3
    T            D            VARCHAR2               3
    T            E            VARCHAR2               3
    T            F            VARCHAR2               3
    6 rows selected.
    SQL>
    SQL> -- Try BYTE semantics
    SQL> alter session set nls_length_semantics=byte;
    Session altered.
    SQL> select * from nls_session_parameters where parameter = 'NLS_LENGTH_SEMANTICS';
    PARAMETER                      VALUE
    NLS_LENGTH_SEMANTICS           BYTE
    1 row selected.
    SQL> --
    SQL> drop table t;
    Table dropped.
    SQL> create table t (
      2    a    char(3 char),
      3    b    char(3 byte),
      4    c    char(3),
      5    d    varchar2(3 char),
      6    e    varchar2(3 byte),
      7    f    varchar2(3)
      8  );
    Table created.
    SQL> --
    SQL> desc t
    Name                                      Null?    Type
    A                                                  CHAR(3 CHAR)
    B                                                  CHAR(3)
    C                                                  CHAR(3)
    D                                                  VARCHAR2(3 CHAR)
    E                                                  VARCHAR2(3)
    F                                                  VARCHAR2(3)
    SQL> --
    SQL> select table_name,
      2         column_name,
      3         data_type,
      4         data_length,
      5         data_precision,
      6         data_scale
      7    from user_tab_columns
      8   where table_name = 'T';
    TABLE_NAME   COLUMN_NAME  DATA_TYPE    DATA_LENGTH DATA_PRECISION DATA_SCALE
    T            A            CHAR                   3
    T            B            CHAR                   3
    T            C            CHAR                   3
    T            D            VARCHAR2               3
    T            E            VARCHAR2               3
    T            F            VARCHAR2               3
    6 rows selected.
    SQL>
    SQL>
    isotope
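    For reference, a sketch of the dictionary columns that make the behaviour above easier to read (DATA_LENGTH is always in bytes, while CHAR_LENGTH/CHAR_USED describe the declared semantics):
    SELECT column_name,
           data_type,
           data_length,   -- bytes reserved: CHAR-sized columns are provisioned for the charset's maximum bytes per character
           char_length,   -- declared length in characters
           char_used      -- B = byte semantics, C = char semantics
      FROM user_tab_columns
     WHERE table_name = 'T'
     ORDER BY column_name;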

  • DatabaseMetaData.getColumns(...) returns an invalid size of NVARCHAR2

    Hi,
    I have the following problem.
    I'm using ojdbc14.jar version 10.2.0.2.0
    I'm trying to read table metadata from the database using DatabaseMetaData.getColumns(...). When I read the size of an NVARCHAR2 column (using either COLUMN_SIZE or CHAR_OCTET_LENGTH) it returns double the maximum length (as if it were expressed in bytes!).
    Javadocs says: COLUMN_SIZE int => column size. For char or date types this is the maximum number of characters, for numeric or decimal types this is precision.
    Does anyone have the same problem?
    Does anyone know if there is an open bug on this topic?
    I will really appreciate your help
    Thanks
    ArielUBA

    Hi Ashok,
    Thanks for your answer.
    I tried changing the NLS_LENGTH_SEMANTICS parameter, and unfortunately it was unsuccessful :-(
    I'm using the NVARCHAR2 column type, and http://download.oracle.com/docs/cd/B19306_01/server.102/b14237/initparams127.htm says the following:
    NCHAR, NVARCHAR2, CLOB, and NCLOB columns are always character-based
    I also tried (in a simple test case) the newest available driver, 11.1.0.6.0 (but for production I need ojdbc14.jar), and I got the same result.
    PreparedStatement smt = connection.prepareStatement("ALTER SESSION SET NLS_LENGTH_SEMANTICS=CHAR");
    smt.execute();
    connection.commit();
    DatabaseMetaData metaData = connection.getMetaData();
    ResultSet columns = metaData.getColumns(null, "ENG_AAM_572_STD_1E", "PPROCINSTANCE", "V_LOAN_NUMBER");
    dispVerticalResultSet(columns);
    The column length of V_LOAN_NUMBER is 31 characters and the result was:
    TABLE_CAT=null
    TABLE_SCHEM=ENG_AAM_572_STD_1E
    TABLE_NAME=PPROCINSTANCE
    COLUMN_NAME=V_LOAN_NUMBER
    DATA_TYPE=1111
    TYPE_NAME=NVARCHAR2
    COLUMN_SIZE=62
    BUFFER_LENGTH=0
    DECIMAL_DIGITS=null
    NUM_PREC_RADIX=10
    NULLABLE=1
    REMARKS=null
    COLUMN_DEF=NULL
    SQL_DATA_TYPE=0
    SQL_DATETIME_SUB=0
    CHAR_OCTET_LENGTH=62
    ORDINAL_POSITION=32
    IS_NULLABLE=YES
    Are you sure that I'm dealing with the bug 4485954?
    Do you know if there is a workaround?
    Thanks in advance for your time
    ArielUBA

  • Tom Kyte -v- Oracle Documentation - Character Semantics for AL32UTF8 ?

    On a unicode database, what is best practice for varchar columns - character semantics or byte semantics ?
    Tom Kyte's new book Expert Oracle Database Architecture says "When using a multibyte character set such as UTF8, you would be well advised to use the CHAR modifier in the VARCHAR2/CHAR definition - that is, use VARCHAR2(80 CHAR), not VARCHAR2(80), since your intention is likely to define a column that can in fact store 80 characters of data." (p499). This I would agree with.
    Yet, when I read the 10gR2 documentation "Globalization Support Guide", it says "The BYTE and CHAR qualifiers shown in the VARCHAR2 definitions should be avoided when possible because they lead to mixed-semantics databases. Instead, set NLS_LENGTH_SEMANTICS in the initialization parameter file and define column datatypes to use the default semantics based on the value of NLS_LENGTH_SEMANTICS. Byte semantics is the default for the database character set."
    I'd much rather explicitly state that I'm using character semantics than rely on some default setting. So who's right? Tom Kyte, or the Oracle documentation? Or are they saying different things?
    If I use character semantics, is this going to come back and bite me later ?
    I'm using Oracle 10gR2, and the database character set is the default of AL32UTF8, using only VARCHAR columns - not NVARCHAR.
    Thanks,
    Andy Mackie

    I would say that both are saying the same thing in different ways.
    Always specifying the CHAR modifier or setting NLS_LENGTH_SEMANTICS to CHAR amounts to the same thing ... until you decide to change NLS_LENGTH_SEMANTICS back to BYTE, but that change would only apply to new columns created after the parameter change, and I wonder why you would want to do it.
    Either way you have to choose character semantics; as long as you don't need to change it later, it should not come back to bite you.
