Oracle length semantics migration documentation
From the Oracle documentation on migrating length semantics:
To convert an existing schema and its associated data from byte semantics and a single-byte character set to character semantics and a multibyte character set, such as UTF-8, you need only follow these steps: [The following steps have been corrected since the magazine was printed.]
1. Export the schema.
2. Issue an ALTER SYSTEM SET NLS_LENGTH_SEMANTICS=CHAR SCOPE=BOTH command on the target database.
3. Stop and restart the instance so that the parameter change takes effect.
4. Drop the original schema.
5. Recreate the original schema and its tables (you can use import's show=Y option to get the CREATE TABLE statements). Columns in the recreated tables will use character semantics, because that's now the default.
6. Import the schema into the target database using the IGNORE=Y import option.
What is the meaning of the terms target and original?
Suppose there is a (source) database with byte length semantics. If a (target) database is to be created as a clone of the (source) database, except with char length semantics, does one have to migrate the (source) database first?
Or rather, why is it not possible to:
1. Export the data from the source database with byte length semantics,
2. Create a target database with char length semantics,
3. Import the data from the source database?
This documentation is, unfortunately, poorly written.
If you want to migrate data from one database to another, with both databases having different character sets, you can avoid some data expansion issues, if you migrate to character set semantics at the same time.
You cannot just export data, and import it into a character semantics database, because export/import preserves original semantics of the exported tables.
Note: there is actually no such thing as a character semantics database. Character semantics is a column/variable property. The "character semantics database" is a confusing, though commonly used, term that actually means an instance which has NLS_LENGTH_SEMANTICS set to CHAR in its initialization file. The only significant meaning of this parameter is to be the default for session-level NLS_LENGTH_SEMANTICS. It is used for sessions that do not set this parameter explicitly (through environment variable or ALTER SESSION). The session-level parameter is significant for CREATE TABLE/PROCEDURE/FUNCTION/PACKAGE [BODY] statements and specifies the default for column and variable declarations that do not specify BYTE or CHAR explicitly.
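The byte-versus-character distinction described above can be illustrated outside the database: the byte length and the character length of the same string diverge as soon as multibyte characters appear. A minimal sketch in plain Java (no Oracle involved), using UTF-8 as the multibyte encoding:

```java
import java.nio.charset.StandardCharsets;

public class LengthSemanticsDemo {
    public static void main(String[] args) {
        // 'a grave' (U+00E0) is one character but two bytes in UTF-8.
        String s = "\u00E0";
        int charLength = s.length();                                // character count
        int byteLength = s.getBytes(StandardCharsets.UTF_8).length; // UTF-8 byte count
        System.out.println(charLength + " " + byteLength);          // prints "1 2"
    }
}
```

A VARCHAR2(n BYTE) column constrains the second number, a VARCHAR2(n CHAR) column the first, which is why byte-sized columns can overflow after conversion to a multibyte character set.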
To migrate semantics of an original schema you need to create a script that will contain all CREATE statements needed to recreate this schema (at least CREATE {TYPE | TABLE | MATERIALIZED VIEW | PROCEDURE | FUNCTION | PACKAGE [BODY]}). Then, you can just add the ALTER SESSION SET NLS_LENGTH_SEMANTICS=CHAR after any CONNECT command in this script. You can then run the script in the target database. How you create the script is irrelevant. You can use any reverse-engineering tool available (e.g. the SHOW=Y option of import, the DBMS_METADATA package, etc.)
After you pre-create the schema with the new semantics, you can import the data from the original (source) database with IGNORE=Y. The original semantics saved in the export file will be ignored for pre-created objects.
Note: PL/SQL may need manual corrections to migrate to character semantics. For example, SUBSTRB used to trim values before assignment may need to be replaced with SUBSTR.
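The SUBSTRB-versus-SUBSTR difference can be sketched in plain Java. The helpers `trimToBytes` and `trimToChars` below are hypothetical, written only as analogues of trimming with SUBSTRB (a byte budget) and SUBSTR (a character budget); they are not Oracle code:

```java
import java.nio.charset.StandardCharsets;

public class TrimDemo {
    // Byte-oriented trim (analogous to SUBSTRB): keep whole characters
    // up to a UTF-8 byte budget.
    static String trimToBytes(String s, int maxBytes) {
        StringBuilder out = new StringBuilder();
        int used = 0;
        for (int i = 0; i < s.length(); ) {
            int cp = s.codePointAt(i);
            int cpBytes = new String(Character.toChars(cp))
                    .getBytes(StandardCharsets.UTF_8).length;
            if (used + cpBytes > maxBytes) break;
            out.appendCodePoint(cp);
            used += cpBytes;
            i += Character.charCount(cp);
        }
        return out.toString();
    }

    // Character-oriented trim (analogous to SUBSTR): keep at most maxChars.
    static String trimToChars(String s, int maxChars) {
        return s.length() <= maxChars ? s : s.substring(0, maxChars);
    }

    public static void main(String[] args) {
        String s = "\u00E0\u00E0\u00E0"; // three characters, six UTF-8 bytes
        System.out.println(trimToBytes(s, 5)); // byte budget of 5 keeps only 2 chars
        System.out.println(trimToChars(s, 3)); // char budget of 3 keeps all 3 chars
    }
}
```

After migrating a column to character semantics, a byte-budget trim cuts values shorter than necessary, which is why SUBSTRB calls used for trimming before assignment may need replacing with SUBSTR.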
-- Sergiusz
Similar Messages
-
Convert all VARCHAR2 data types to character length semantics
Hi,
I am wondering if there is an easy way to convert all columns in the database of data type VARCHAR2(x BYTE) to VARCHAR2(x CHAR)?
Regards
Håkan

The DMU does not allow character length semantics migration for the following types of objects:
- Columns already in character length semantics
- Data dictionary columns
- Columns under Oracle-supplied application schemas
- CHAR attribute columns of ADT
- Columns in clusters
- Columns on which partition keys are defined
Please check if the disabled nodes you observed in the wizard fall under one of these categories. -
Hi,
There are ODBC APIs which have character I/O parameters. These arguments have length parameters associated with them. These lengths can be the number of bytes or the number of wide characters/code points. I want to know about these length semantics. I could not find any relevant documentation. It would be great if somebody could point me to such a reference.
Thanks and regards,
Vivek.

Hi,
What does code point mean?
In simple terms, a code point is a distinct numeric value assigned to each character.
As for length semantics, that depends on the character encoding, such as UTF-8. In UTF-8, storing an Asian character can take several code units, e.g. three bytes for a single character.
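The distinction between code points, code units, and bytes can be made concrete in plain Java, which stores strings as UTF-16 code units. As an illustrative sketch, one emoji code point (U+1F600) occupies two UTF-16 code units and four UTF-8 bytes:

```java
import java.nio.charset.StandardCharsets;

public class CodePointDemo {
    public static void main(String[] args) {
        String s = "\uD83D\uDE00"; // U+1F600, a single code point
        System.out.println(s.length());                      // 2 UTF-16 code units
        System.out.println(s.codePointCount(0, s.length())); // 1 code point
        System.out.println(s.getBytes(StandardCharsets.UTF_8).length); // 4 UTF-8 bytes
    }
}
```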
Here is a nice link from where you can get some understanding of length semantics:
http://www.oracle.com/technology/oramag/oracle/03-nov/o63tech_glob.html
- Pavan Kumar N -
Querying CHAR columns with character length semantics unreliable
Hi again,
It appears that there is a bug in the JDBC drivers whereby it is highly unlikely that the values of CHAR columns that use character length semantics can be accurately queried using ResultSet.getString(). Instead, the drivers return the value padded with space (0x20) characters out to a number of bytes equal to the number of characters multiplied by 4. The number of bytes varies depending on the number and size of any non-ascii characters stored in the column.
For instance, if I have a CHAR(1) column, a value of 'a' will return 'a ' (4 characters/bytes are returned), a value of '\u00E0' will return '\u00E0 ' (3 characters / 4 bytes), and a value of '\uE000' will return '\uE000 ' (2 characters / 4 bytes).
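As a client-side workaround sketch (assuming the padding really is plain trailing ASCII spaces, as described above), the extra blanks can be stripped after getString(); `rtrim` here is a hypothetical helper, not a driver API:

```java
public class PadWorkaround {
    // Strip trailing space padding such as that returned for CHAR columns.
    static String rtrim(String s) {
        int end = s.length();
        while (end > 0 && s.charAt(end - 1) == ' ') {
            end--;
        }
        return s.substring(0, end);
    }

    public static void main(String[] args) {
        System.out.println("'" + rtrim("a   ") + "'"); // prints "'a'"
    }
}
```

This discards any legitimate trailing spaces in the data, which is usually acceptable for CHAR columns since they are space-padded by definition anyway.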
I'm currently using version 9.2.0.3 of the standalone drivers (ojdbc.jar) with JDK 1.4.1_04 on Redhat Linux 9, connecting to Oracle 9.2.0.2.0 running on Solaris.
The following sample code can be used to demonstrate the problem (where the DDL at the top of the file must be executed first):
import java.sql.*;
import java.util.*;

/*
 * This sample generates another bug in the Oracle JDBC drivers where it is not
 * possible to query the values of CHAR columns that use character length semantics
 * and are NOT full of non-ascii characters. The inclusion of the VARCHAR2 column
 * is just a control.
 *
 * CREATE TABLE TMP2 (
 *     TMP_ID NUMBER(10) NOT NULL PRIMARY KEY,
 *     TMP_CHAR CHAR(10 CHAR),
 *     TMP_VCHAR VARCHAR2(10 CHAR)
 * );
 */
public class ClsCharSelection
{
    private static String createString(char character, int length)
    {
        char characters[] = new char[length];
        Arrays.fill(characters, character);
        return new String(characters);
    } // private static String createString(char, int)

    private static void insertRow(PreparedStatement ps, int key, char character)
        throws SQLException
    {
        ps.setInt(1, key);
        ps.setString(2, createString(character, 10));
        ps.setString(3, createString(character, 10));
        ps.executeUpdate();
    } // private static void insertRow(PreparedStatement, int, char)

    private static void analyseResults(PreparedStatement ps, int key)
        throws SQLException
    {
        ps.setInt(1, key);
        ResultSet results = ps.executeQuery();
        results.next();
        String tmpChar = results.getString(1);
        String tmpVChar = results.getString(2);
        System.out.println(key + ", " + tmpChar.length() + ", '" + tmpChar + "'");
        System.out.println(key + ", " + tmpVChar.length() + ", '" + tmpVChar + "'");
        results.close();
    } // private static void analyseResults(PreparedStatement, int)

    public static void main(String argv[])
        throws Exception
    {
        Driver driver = (Driver) Class.forName(
            "oracle.jdbc.driver.OracleDriver").newInstance();
        DriverManager.registerDriver(driver);
        Connection connection = DriverManager.getConnection(
            argv[0], argv[1], argv[2]);
        PreparedStatement ps = null;
        try
        {
            ps = connection.prepareStatement(
                "DELETE FROM tmp2");
            ps.executeUpdate();
            ps.close();
            ps = connection.prepareStatement(
                "INSERT INTO tmp2 ( tmp_id, tmp_char, tmp_vchar " +
                ") VALUES ( ?, ?, ? )");
            insertRow(ps, 1, 'a');
            insertRow(ps, 2, '\u00E0');
            insertRow(ps, 3, '\uE000');
            ps.close();
            ps = connection.prepareStatement(
                "SELECT tmp_char, tmp_vchar FROM tmp2 WHERE tmp_id = ?");
            analyseResults(ps, 1);
            analyseResults(ps, 2);
            analyseResults(ps, 3);
            ps.close();
            connection.commit();
        }
        catch (SQLException e)
        {
            e.printStackTrace();
        }
        connection.close();
    } // public static void main(String[])
} // public class ClsCharSelection

FYI, this has been mentioned as early as November last year:
String with length 1 became 4 when nls_lang_semantics=CHAR
and was also brought up in February:
JDBC thin driver pads CHAR col to byte size when NLS_LENGTH_SEMANTICS=CHAR -
Sybase ASE to Oracle Database 11g Migration
Hi,
I am looking for documentation on migration from Sybase ASE to Oracle Database 11g Migration.
Front End application is Peoplesoft.
Please suggest documentation on the same.
I look forward to you reply.
Best regards
SonaliHave a look at the SQL Developer tool:
Oracle SQL Developer
It contains a migration utility which allows you to migrate foreign databases to Oracle:
Database Migration Technology
Documentation and videos are available from the migration web page.
- Klaus -
Hi All
I am trying to find this class so I can start the TopLink workbench - any ideas where I can get it? It's not in the toplink.jar as mentioned in the Oracle documentation.
Thanks

From the documentation, it says you have to have
<TOPLINK_HOME>/jlib/cmpmigrator.jar
on the classpath, which is where the oracle.toplink.tools.migration.TopLinkCMPMigrator class can be found.
This is from the 10.1.3 docs at:
http://download-west.oracle.com/otn_hosted_doc/toplink/1013/MAIN/_html/asinteg003.htm#BABGFHIA
Best Regards,
Chris -
Hi,
in 10g R2, how can one enable character length semantics?
Thank you.

You cannot just enable character length semantics.
The following link would be helpful.
You need to export the schema and import it after setting the parameter
NLS_LENGTH_SEMANTICS=CHAR
[Character semantics |http://www.oracle.com/technology/oramag/oracle/03-mar/o23sql.html] -
Issues caused by changing length semantics
Hi All,
Our database was formerly using BYTE semantics for table columns and stored procedures. We recently changed the length semantics to CHAR, but this caused some issues: an ORA-06502 "PL/SQL: numeric or value error: character string buffer too small" error appears when we access the database's stored procedures via Java. What could be the possible cause of this? Could you give me some paths to take in troubleshooting this issue?
Thanks to all!
Edited by: 1002671 on 25.4.2013 23:55

1002671 wrote:
Thanks for answering Sir!
Are you kidding!!! No 'Sir' please... Come on, I don't know anything yet.
Correct me if I'm wrong, but doesn't CHAR already handle multi-byte characters passed to or used in stored procedures? I'm really not that knowledgeable when it comes to the effects of changing the length semantics. We already changed the columns from VARCHAR2(BYTE) to VARCHAR2(CHAR). The problem lies within the stored procedures.

I'm not clear on your doubt, but please check this -
Link: http://docs.oracle.com/cd/E11882_01/appdev.112/e10472/datatypes.htm (Section 'Declaring Variables for Multibyte Characters')
>
When declaring a CHAR or VARCHAR2 variable, to ensure that it can always hold n characters in any multibyte character set, declare its length in characters—that is, CHAR(n CHAR) or VARCHAR2(n CHAR), where n does not exceed FLOOR(32767/4) = 8191.
>
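The 8191 limit quoted above is plain integer arithmetic: the PL/SQL VARCHAR2 maximum is 32767 bytes and the widest AL32UTF8 character is 4 bytes, so a trivial check confirms the figure:

```java
public class MaxCharCheck {
    public static void main(String[] args) {
        int maxVarchar2Bytes = 32767; // PL/SQL VARCHAR2 maximum, in bytes
        int maxBytesPerChar = 4;      // widest character in AL32UTF8
        // FLOOR(32767/4) = integer division in Java
        System.out.println(maxVarchar2Bytes / maxBytesPerChar); // prints 8191
    }
}
```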
What I feel is that you are getting confused between the SQL data type VARCHAR2 (used when specifying a column type) and the PL/SQL data type VARCHAR2 (used when declaring variables).
Then check this : difference between BYTE & CHAR
Read and thoroughly research each comment given by the Experts there. -
SQL Loader Multibyte character error, LENGTH SEMANTICS CHARACTER
Hi,
I started the thread SQL Loader Multibyte character error:
{thread:id=2340726}
some mod locked the thread, why?
the solution for others:
add LENGTH SEMANTICS CHARACTER to the controlfile
LOAD DATA characterset UTF8 LENGTH SEMANTICS CHARACTER
TRUNCATE
INTO TABLE utf8file_to_we8mswin1252
(
  ID CHAR(1)
, TEXT CHAR(40)
)

Regards
Michael

Hi Werner,
on my linux desktop:
$ file test.dat
test.dat: UTF-8 Unicode text, with very long lines
my colleague is working on a windows system.
On both systems exact the same error from SQL Loader.
Btw, I tried with different numbers of special characters (German umlauts and euro), and there is no way to load without the error when there are too many special characters, or when the data is as long as the column length and special characters are included.
Regards
Michael -
Runnig forms in oracle 10g after migrating them from oracle 6i
Guys... I developed forms in Oracle 6i, and when I presented them to my manager he said he wants them to be web based... argh, he should have told me at the beginning! Anyway, I downloaded Forms 10g from the Oracle website and migrated my forms from 6i to 10g.
1. The forms compile and I can run them on my laptop. I want to know the procedure to run them on another system... the port number is 8889. I mean running them on the web on some other laptop or system.
2. My second problem is that the records are not being saved in 10g as in 6i. I created the forms in 6i using the data block wizard and set the block up accordingly as described in this forum.

Look at the existing formsweb.cfg configuration file located in your <DEVSUITE_HOME>/forms/server folder. It allows you to create different sections to start your Forms applications.
For instance, if you create a application_1 section, give this name in your URL.
Francois -
Oracle 10g XE migrate from 32bit to 64bit becomes slower
Oracle 10g XE migrate from 32bit to 64bit becomes slower
Currently we have a database using Oracle 10g XE R2
(Oracle Database 10g Express Edition Release 10.2.0.1.0)
on a linux 32bit server.
And recently we did migrate it to another 64bit linux server.
But we found that after the migration it runs slower on the 64-bit server (~25% slower).
1. We wonder whether any tuning is required on the 64-bit server?
Besides, as far as I know, Oracle XE only uses a single CPU core for processing.
On 32bit linux the CPU is Intel(R) Xeon(R) CPU E31235 @ 3.20GHz
On 64bit linux the CPU is Intel(R) Xeon(R) CPU E5-2470 v2 @ 2.40GHz
2. Would the CPU clock speed on a single core also make the 64-bit Oracle slower?
Thanks a lot.

32 or 64 bit normally won't cause much performance difference as long as you can't use > 4GB RAM.
Please note that XE can use only 1 core and 1GB memory.
According to benchmarks, the Single Thread Rating of the two CPUs really differs by around 25%, quite consistent with your observation.
Of course, IO (i.e., disk) speed is also an important factor.
https://www.cpubenchmark.net/cpu.php?id=2003
https://www.cpubenchmark.net/cpu.php?id=1200 -
Oracle SQL Developer vs Oracle SQL Developer Migration Workbench
Gurus,
Can anybody let me know the difference between the Oracle SQL Developer and Oracle SQL Developer Migration Workbench tools?
I am in the process of migrating an MS-Access application to APEX. The example tells me to use Oracle SQL Developer Migration Workbench. Is this part of Oracle SQL Developer? If yes, I know SQL Developer is free.
If not, is Oracle SQL Developer Migration Workbench free?
Thanks and Regards

Thank Oracle ;)
(well, it's the least they can do after you paid big $$$ for the database)
Regards,
K. -
How to determine column length semantics through ANSI Dynamic SQL ?
I am looking for a way to determine the length semantics used for a column through ANSI Dynamic SQL.
I have a database with NLS_CHARACTERSET=AL32UTF8.
In this database I have the following table:
T1(C1 varchar2(10 char), C2 varchar2(40 byte))
When I describe this table in SQL*Plus, I get:
C1 VARCHAR2(10 CHAR)
C2 VARCHAR2(40)
In my Pro*C program (mode=ansi), I get the select statement on input, use PREPARE method to prepare it and then use the GET DESCRIPTOR method to obtain colum information for output:
GET DESCRIPTOR 'output_descriptor' VALUE :col_num
:name = NAME, :type = TYPE,
:length = LENGTH, :octet_length = OCTET_LENGTH
For both C1 and C2 I get the following:
:type=12
:length=40
:octet_length=40
So, even if I know that my database is AL32UTF8, there doesn't seem to be a way for me to determine whether char or byte length semantics were used in C1 and C2 column definitions.
Does anybody know how I can obtain this information through ANSI Dynamic SQL?
Note: the use of system views such as ALL_TAB_COLUMNS is not an option, since we wish to obtain this information even for columns in a complex select statements which may involve multiple tables.
Note: I believe OCI provides the information that we need through OCI_ATTR_DATA_SIZE (which is in bytes) and OCI_ATTR_CHAR_SIZE (which is in chars). However, switching to OCI is something we would like to avoid at this point.

Yes, I was wondering which forum would be the best for my question. I see similar questions in various forums: Call Interface, SQL and PL/SQL, and Database - General. Unfortunately there is no Pro*C or Dynamic SQL forum, which would be my first choice for posting this question.
Anyway I now posted the same question (same subject) in the Call Interface forum, so hopefully I'll get some answers there.
Thank you for the suggestion. -
Oracle JDeveloper Application Migration Assistant (AMA)
Oracle JDeveloper Application Migration Assistant (AMA) link from http://otn.oracle.com/tech/migration/index.html
returns Page Not Found.

This is working now.
Thanks,
OTN -
Export with data length semantics
Hello,
I've following problem.
I have a table abcd which contains 2 VARCHAR2 columns with different data length semantics (one with BYTE, one with CHAR). The charset is single-byte, let's say WE8MSWIN1252, so data length semantics should not be a problem. Should not. Details later.
So this would be:
create table abcd (a_char VARCHAR2(2 CHAR), a_byte VARCHAR2(2 BYTE));

After that I export the table via exp. I'm not setting the NLS_LENGTH_SEMANTICS environment variable, so BYTE is used.
In the dump file the data length semantics for the byte col is omitted, as I exported it with BYTE:
create table abcd (a_char VARCHAR2(2 CHAR), a_byte VARCHAR2(2));

After that, I "accidentally" import it with data length semantics set to CHAR, and the table looks like this now:
abcd
a_char VARCHAR2(2 CHAR)
a_byte VARCHAR2(2 CHAR)

The same happens vice versa when using CHAR for export and BYTE for import...
In single byte charsets this might not be so much of a problem, as one CHAR is equal to one BYTE, but...
If I compile PL/SQL against the original table and run it against the resulting table after export/import, I get an ORA-4062 and I have to recompile...
It would not be a problem if the PL/SQL I compile were on the database... The big problem is that the ORA-4062 occurs in Forms, where it's difficult for me to recompile (I would have to transfer all the sources to the customer and compile there).
Is there any possibility to export data length semantics regardless of which environment variable is set?
The database version would be 9.2.0.6, but if there exists a solution in higher versions I would also be happy to hear it...
many thanks,
regards

I can't reproduce your problem:
SQL> show parameter nls_length_semantics
NAME TYPE VALUE
nls_length_semantics string BYTE
SQL> create table scott.demo( col1 varchar2(10 byte), col2 varchar2(10 char) );
SQL> describe scott.demo
Name Null? Type
COL1 VARCHAR2(10)
COL2 VARCHAR2(10 CHAR)
$ export NLS_LENGTH_SEMANTICS=BYTE
$ exp scott/tiger file=scott.dmp tables=demo
SQL> drop table scott.demo;
$ export NLS_LENGTH_SEMANTICS=CHAR
$ imp scott/tiger file=scott.dmp
SQL> describe scott.demo
Name Null? Type
COL1 VARCHAR2(10 BYTE)
COL2 VARCHAR2(10)
SQL> alter session set nls_length_semantics=byte;
SQL> describe scott.demo
Name Null? Type
COL1 VARCHAR2(10)
COL2 VARCHAR2(10 CHAR)

Can you post a test like mine?
Enrique
PS If you have access to Metalink, read Note 144808.1, "Examples and limits of BYTE and CHAR semantics usage". From 9i and up, imp doesn't read nls_length_semantics from the environment.
Edited by: Enrique Orbegozo on Dec 16, 2008 12:50 PM
Edited by: Enrique Orbegozo on Dec 16, 2008 12:53 PM