New white paper: Character Set Migration Best Practices

This paper can be found on the Globalization Home Page at:
http://technet.oracle.com/tech/globalization/pdf/mwp.pdf
This paper outlines the best practices for database character set
migration that have been used successfully with hundreds of customers.
Following these methods will help you determine which strategies are
best suited to your environment and will help minimize risk and
downtime. The paper also highlights migration to Unicode, which many
customers today find essential to supporting their global businesses.

Sorry about that. I posted that too soon. It should become available today (Monday Aug 22nd).
Doug

Similar Messages

  • Migration Best Practice When Using an Auth Source

    Hi,
    I'm looking for some advice on migration best practices, or more specifically, on how to choose whether to import/export groups and users or to let the auth source do a sync to bring users and groups into each environment.
    One of our customers is using an LDAP auth source to synchronize users and groups. I'm trying to help them do a migration from a development environment to a test environment. I'd like to export/import security on each object as I migrate it, but does this mean I have to export/import the groups on each object's ACLs before I export/import each object? What about users? I'd like to leave users and groups out of the PTE files and just export/import the auth source and let it run in each environment. But I'm afraid the UUIDs for the newly created groups will be different and they won't match up with object ACLs any more, causing all the objects to lose their security settings.
    If anyone has done this before, any suggestions about best practices and gotchas when using the migration wizard in conjunction with an auth source would be much appreciated.
    Thanks,
    Chris Bucchere
    Bucchere Development Group
    [email protected]
    http://www.bucchere.com

    The best practice here would be to migrate only the auth source through the migration wizard, and then do an LDAP sync on the new system to pull in the users and groups. The migration wizard will then just "do the right thing" in matching up the users and groups on the ACLs of objects between the two systems.
    Users and groups are actually a special case during migration -- they are resolved first by UUID, but if that is not found, then a user with the same auth source UUID and unique auth name is also treated as a match. Since you are importing from the same LDAP auth source, the unique auth name for the user/group should be the same on both systems. The auth source's UUID will also match on the two systems, since you just migrated that over using the migration wizard.

  • Character Set Migration - Arabic & English Language Support

    Hi,
    Software Specifications:
    OS Version : Windows 2003 EE Server, SP2, 32-Bit
    DB Version : 9.2.0.1
    Application : Lotus Domino 6.5
    Existing Set Up:
    DB CHAR SET : WE8MSWIN1252
    National Character Set : AL16UTF16
    NLS_LANG : NA
    The customer has now extended their business to Egypt.
    They need the existing database to support the Arabic and English languages.
    Kindly let me know how to do this character set migration and meet the client's requirements.
    Regards
    Suresh

    Check Metalink
    Note:179133.1
    Subject:      The correct NLS_LANG in a Windows Environment
    Note:187739.1
    Subject:      NLS Setup in a Multilingual Database Environment
    Note:260023.1
    Subject:      Difference between AR8MSWIN1256 and AR8ISO8859P6 characterset
    Also, please list all the steps you have performed till now
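    As a first step, it may help to confirm which database and national character sets are currently in effect before choosing a target (the notes above discuss AR8MSWIN1256 versus Unicode). A minimal SQL*Plus sketch against the standard NLS_DATABASE_PARAMETERS view:
        SELECT parameter, value
          FROM nls_database_parameters
         WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');
    The output should match the WE8MSWIN1252 / AL16UTF16 settings listed above; the target character set (and the matching client NLS_LANG) then needs to cover both Arabic and English, as described in the referenced notes.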

  • Clarification on Character set migration from US7ASCII to UTF8

    Hi,
    I need clarification on the below.
    I need to migrate the database from US7ASCII to UTF8.
    For this I ran csscan for user "TEST" as well as against full database.
    The log below is the csscan output against the full database, but my application depends on the TEST schema only. Do I need to migrate the SYS objects' data shown below, or is that not required? If it is required, how do I migrate that data?
    Looking forward to your help.
    USER.TABLE                      Convertible  Exceptional
    SYS.METASTYLESHEET              58           0
    TEST.Table_1                    9            0
    TEST.Table_2                    11           0
    TEST.Table_3                    17           0
    TEST.Table_4                    11           0
    [Distribution of Convertible Data per Column]
    USER.TABLE|COLUMN               Convertible  Exceptional
    SYS.METASTYLESHEET|STYLESHEET   58           0
    Thanks,
    Sankar

    I think you need to migrate the data in all schemas, not only the one application schema, because
    the database character set is common to all CHAR, VARCHAR2, LONG and CLOB columns
    of any table in any schema.
    In your case (US7ASCII to UTF8), you need to use export/import because:
    Another restriction of the ALTER DATABASE CHARACTER SET statement is that it can be used only when the character set migration is between two single-byte character sets or between two multibyte character sets. If the planned character set migration is from a single-byte character set to a multibyte character set, then use the Export and Import utilities.
    (see http://download-uk.oracle.com/docs/cd/B10501_01/server.920/a96529/ch10.htm#1009904)
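    To get a rough sense of how much character data is in scope across all schemas (not just TEST), a simple sketch like the following lists the columns that the database character set governs; this is only an illustration, and the filter can be adjusted as needed:
        SELECT owner, table_name, column_name, data_type
          FROM dba_tab_columns
         WHERE data_type IN ('CHAR', 'VARCHAR2', 'LONG', 'CLOB')
         ORDER BY owner, table_name, column_name;
    Any convertible or lossy data that csscan reports in these columns, including the SYS ones, would need to be taken into account as part of the export/import migration.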

  • New White Paper: The Career Benefits Of Certification

    Is Certification Worth It?
    The answer is a resounding YES for IT professionals who are looking to boost their career. While there are no guarantees, certification has been shown to enhance various aspects of an IT professional’s career, including:
    • Employability
    • Salary
    • Job Effectiveness
    • Job Satisfaction
    Download the new white paper (free, PDF): http://blogs.oracle.com/certification/entry/0633


  • Wausau Bright White 65 lb paper - What setting for best quality on a D7560

    Have just purchased a D7560. Have thousands of sheets of Wausau Bright White Paper that I print note cards on.
    Want to know the best "paper setting" to use to get the brightest printing possible.
    While it is printing ok now, the colors seem to me to be on the dull side.
    I am letting the printer choose the print settings, and the photos come from Photoshop CS4.
    Should I also change the printing profiles to improve the print quality?
    Thanks, Bc. 


  • Data Migration Best Practice

    Is there a clear-cut best practice procedure for conducting data migration from one company to a new one?

    I don't think there is a clear cut answer for that.  Best practice is always relative; it varies dramatically depending on many factors.  There is no magic bullet here.
    One exception to the above: you should always use tab-delimited text format.  It is a DTW-friendly format.
    Thanks,
    Gordon

  • Character set migration error to UTF8 urgent

    Hi
    When we migrated from the AR8ISO8859P6 to the UTF8 character set, we ran into one error: when I try to compile one package through Forms, I get the error "program unit PU not found".
    When I run the source code of that procedure directly from the database using SQL*Plus, it runs without any problem. How can I migrate these forms from AR8ISO8859P6 to UTF8? We migrated from an Oracle 8.1.7 database with AR8ISO8859P6 to an Oracle 9.2 database with character set UTF8 (Windows 2000); export and import completed without any error.
    I am using Oracle 11i, which calls Forms 6i and Reports 6i.
    with regards
    ramya
    1) This is a server-side program. When connecting with Forms I get the error; when I run the program using direct SQL it works, but when compiling it I get this error.
    3) Yes, I am using 11i (11.5.10), which calls Forms 6i and Reports 6i. Why does this give a problem from Forms? Is there any NLS_LANG setting that needs to be changed in Forms?
    with regards

    Hi Ramya
    What I understand from your question is that you are trying to compile a procedure from a Forms interface on the client side?
    If yes, you should check the code in the form that calls the compilation package.
    Does it contain strings that might be affected by the character set change?
    Tony G.

  • Native Toplink to EclipseLink to JPA - Migration Best Practice

    I am currently looking at the future technical stack of our developments, and would appreciate any advice concerning best practice migration paths.
    Our current platform is as follows:
    Oracle 10g AS -> Toplink 10g -> Spring 2.5.x
    We have (approx.) 100 separate Toplink Mapping Workbench projects (we have one per DDD Aggregate object in effect) and therefore 100 Repositories (or DAOs), some using Toplink code (e.g. Expression Builder, Class Extractors etc) on top of the mappings to handle the Object to RDB (legacy) mismatch.
    Future platform is:
    Oracle 11g AS -> EclipseLink -> Spring 3.x
    Migration issues are as follows:
    Spring 3.x does not provide any Native Toplink ORM support
    Spring 2.5.x requires Toplink 10g to provide Native Toplink ORM support
    My current plan is as follows:
    1. Migrate Code and Mappings to use EclipseLink (as per Link:[http://wiki.eclipse.org/EclipseLink/Examples/MigratingFromOracleTopLink])
    2. Temporarily re-implement the Spring 2.5.x ->Toplink 10g support code to use EclipseLink (e.g. TopLinkDaoSupport etc) to enable testing of this step.
    3. Refactor all Repositories/DAOs and Support code to use JPA engine (i.e. Entity Manager etc.)
    4. Move to Spring 3.x
    5. Move to 11g (when available!)
    Step 2 is only required to enable testing of the mapping changes, without changing to use the JPA engine.
    Step 3 will only work if my understanding of the following statement is correct (i.e. I can use the JPA engine to run native Toplink mappings and associated code):
    Quote:"Deployment XML files from Oracle TopLink 10.1.3 and above can be read by EclipseLink."
    Specific questions are:
    Is my understanding correct regarding the above?
    Is there any other path to achieve the goal of using 11g, EclipseLink (and Spring 3.x)?
    Is this achievable without refactoring all XML mappings from Native -> JPA?
    Many thanks for any assistance.
    Marc

    It is possible to use the native/Mapping Workbench TopLink/EclipseLink deployment XML files with JPA in EclipseLink; this is correct. You just need to pass a persistence property giving your sessions.xml file location. The native API is also still supported in EclipseLink.
    James : http://www.eclipselink.org

  • Win xp to Win 7 migration -- best practice

    Hi , 
    What are the best practices that need to be followed when we migrate xp to win 7 using configuration manager?
     - like the computer name (should we rename it during migration or keep it as is?).
    - USMT: what exactly should be migrated?
    Pls. share any pointers/suggestions. Thanks in advance.
    Regards,

    First determine your needs: do you really need to capture the user data or not? Perhaps the users can back up their data themselves before the OS upgrade, so you don't need to worry about it. If you can make that kind of policy decision, then
    you don't need USMT. The same goes for the computer name: you can keep the old names if you please. It's a policy decision; you should use a unique name for each of your computers. Some prefer PC serial numbers, some prefer something else,
    it's really up to you to decide.
    Some technical pointers to consider:
    Clients should have the ConfigMgr client installed on them before the migration (so that they appear in the console and can be instructed to do things, like run a task sequence...).
    If the clients use static IP addresses, you need to configure your TS to capture those settings and use them during the upgrade process...

  • Database character set migration rules

    We have an application which migrates various entities from one database to another. These databases could have different character sets. The migration process is automated. What character set rules should we follow during migration? Can we use the character set scanner in some automated way?
    Does Oracle publish a comprehensive list of subsets and supersets? (And I am NOT talking about strict subsets and supersets.) I know that you have a white paper which has some rules, but that is not a complete set. Any help would be much appreciated. This is an urgent requirement.

    There is no simple answer as this is a complex process. You make reference to having looked at a whitepaper on this subject so I assume you know that your application will likely need to dynamically set the nls_lang setting so that proper conversion can take place on the database.
    To find what character sets are subsets or supersets of what character sets?
    A) The only listing available is in the 8.1.7 Document addendum shipped with the rdbms CD. This document can be found on the web at:
    http://technet.oracle.com/docs/products/oracle8i/doc_index.htm
    There is no elegant way to determine possible lossy data situations programmatically. Using the scanner to detect within your program whether proper conversion will take place might be possible. There is a documented v$ table that is updated when the scanner is run and that could be examined
    via SQL in your program to detect lossy data and truncation. You could also possibly use the SQL CONVERT function to detect lossy data, something like: if 'string' = convert('string', 'us7ascii', 'WE8MSWIN1252') then ....
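    As one rough sketch of that idea (the table and column names here are just placeholders), a round trip through CONVERT can flag rows whose text would not survive conversion to the smaller character set:
        -- hypothetical table/column; flags rows that are lossy when converted
        -- WE8MSWIN1252 -> US7ASCII and back
        SELECT id, text_col
          FROM my_table
         WHERE text_col <> CONVERT(CONVERT(text_col, 'US7ASCII', 'WE8MSWIN1252'),
                                   'WE8MSWIN1252', 'US7ASCII');
    Rows returned by such a query contain characters the target character set cannot represent, which is the same lossy-data condition the scanner reports.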
    Hope this helps.

  • GRC AACG/TCG and CCG control migration best practice.

    Are there any best practice documents which illustrate the step-by-step migration of AACG/TCG and CCG controls from the development instance to production? Also, how should one take a backup of the same?
    Thanks,
    Arka

    There are no automated out-of-the-box tools to migrate anything from CCG.  In AACG/TCG you can export and import Access Models (which include the Entitlements) and Global Conditions.  You will have to manually set up roles, users, path conditions, etc.
    You can't clone AACG/TCG or CCG.
    Regards,
    Roger Drolet
    OIC

  • Need very high PCIe throughput in your next design? Get help using Xilinx UltraScale, UltraScale+ devices in new White Paper

    The Xilinx UltraScale architecture has many features that make implementing high-performance PCIe designs possible. Each of the integrated PCIe blocks in Xilinx Virtex UltraScale+ and Kintex UltraScale+ devices can sustain more than 14 Gbytes/s of throughput in each direction when configured to operate as a PCIe Gen3 x16 or Gen4 x8 port using a 256-byte system Maximum Payload Size, and most UltraScale and UltraScale+ devices incorporate more than one such integrated PCIe block.
    The transceivers in Xilinx devices based on the UltraScale architecture contain features that allow for very robust operation at these high PCIe data rates. These features include:
    Transmitter emphasis/equalization
    Auto-adaptive equalization
    In addition, most PCIe applications use some type of high-speed memory for data buffering. Here again, Xilinx UltraScale and UltraScale+ devices provide for robust PCIe designs by supporting high-speed DDR4-2400 and DDR4-2666 SDRAM.
    A new Xilinx White Paper, “PCI Express for UltraScale Architecture-Based Devices” (WP464), discusses these topics in much more detail.
     


  • New White Paper: "Oracle 9i PL/SQL New Features"

    The July issue of the Pipeline Newsletter includes an 8-page feature article entitled "Oracle 9i PL/SQL New Features", written by Sandeepan Banerjee, of Oracle's PL/SQL Development Team. Click on the URL below for the newsletter:
    http://www.revealnet.com/newsletter-v2/newsletter_0701.htm
    In this article, you will read about:
    - Native Compilation of PL/SQL
    - Bulk Binds and Bulk Dynamic SQL
    - Common SQL Parser
    - Pipelined, Parallelized Table Functions
    - Transparent Performance Improvements
    - PL/SQL and XML
    - DBMS_XMLGEN
    - URI_References
    - HTTP "Cookie" Support in PL/SQL
    Visit www.revealnet.com and subscribe to the Pipeline Newsletter for monthly articles, tips, and code utilities for Oracle professionals.
    Best wishes,
    Cam White
    RevealNet

    I have some questions about the opdx.
    - What do I need in order to use the opdx? There is a link on the iFS page to a demo page with a download, but it is broken, so please list everything I need.
    - Are there any samples showing how to use the opdx, or is there a fully functional OPDX that I can play with?
    Thank you very much

  • New User Trying To Set Up Best Backing Up Scheme

    i am trying to set up the best system to upload my photos from my camera and maintain a good backup procedure, and i am a little confused about the best way to do this.
    i have a new sony dsc-tx5 that i got around the same time i bought my lightroom software, so am learning new things all around. the camera is great, but the software that came with it is not very good, (PMB).
    - i have backed up all my photos from my camera into one main directory on my PC for my original photo files - using the Sony PMB software.
    - i then backed this directory up to my external harddrive. and whenever i backup new photos to this directory, i will also back these new files to my external drive as well -  using Windows 7 explorer to do this task.
    - then i copied all the original photos to another directory on the same hard drive, but i divided all the photos in this new directory into several sub-directories to the approximate size of a dvd data disk, and then backed those up to dvd data disks; and whenever i get enough files in a new sub-directory about the size of another dvd data disk, i back them up to a dvd disk as well - again using windows explorer.
    - so far so good, but now i am unsure about how to incorporate Lightroom into this whole process. The PMB software from my sony camera downloads all new photos and puts them into my main directory to store my original files, as i mentioned in step one here. and in fact, PMB keeps an index of all files and directories, and any changes to them, etc. in my "My Pictures" directory on my PC. so i don't need lightroom to manage that part at all.
    - the original photo files should remain separate from anything I do with lightroom right now, so i have a separate directory of photos for lightroom to use and play around with, without involving the originals at all. it is a completely separate directory with copies of the original files that i only use in lightroom as i learn my way around the program.
    - right now i made a new directory of all my files and renamed it as something for just lightroom to use. but i also see that in the lightroom subdirectory in the "my pictures" directory on my PC there are several other sub-directories, named in the "Download Backups" sub-sub directory, and i can see some of my same photos in there too. so now i am lost about what is what and where, etc. this part is a bit confusing, and i am not sure how to incorporate a way and/or a place to back up the finished modified photos, etc.
    - and the issues of backing-up catalogs vs regular back-ups has me confused too.
    - am i using the best method to manage my files before and after using lightroom so far, or am i way off track here?
    any help one might offer to straighten this whole confusion up before i get heavy into the editing process would be much appreciated! where to look for instructional info to clear it all up, etc. what terms to search for on the help resources, etc.
    thank you for your support!

    Thank you for your assistance, dj_paige. It was helpful to confirm that what i am doing is on the right track. but in finally having the time to get around to trying all this out, it has of course brought up a few more questions!
    before i ask my questions though, i just wanted to more specifically reiterate what my "setting up" process has evolved to, thus far:
    -  I use the photo management software tool that came with my Sony camera, named PMB, to upload all of the latest photo files i've shot to my computer into a directory, without any sub-directories, named it something like "Sony Photos <with latest upload date added to this name>". Besides uploading the newest photo files, I do no other tasks with PMB, other than just keeping track of what is where, which it updates automatically somehow.
    - The "Sony Photos" directory is placed as a sub-directory in my "My Pictures" parent directory on my PC.
    - I have another sub-directory, named "Sony Photos Organized To BU To DVDs", also located as a sub-directory in my "My Pictures" parent directory on my PC. Within this subdirectory are several sub-sub-directories which are named according to the date ranges in which the photo files within them were shot. The files within these sub-sub-directories are copies of the same photo files in the previously mentioned "Sony Photos" directory; the only difference here being that i have divided the sub-directories in "Sony Photos Organized To BU To DVDs" to each be approximately 4 GB in size, so that I can then easily copy the subdirectories onto DVD data disks which are also approximately the same 4 GB in size.
    - I also have another sub-directory, also located as a sub-directory in my "My Pictures" parent directory on my PC, which is one big directory holding all of my photos from my previous digital camera, named "Cannon Photos". Since I no longer shoot with this camera, there are no new images to worry about here, and they have all been backed up to my external drive as well as DVD disks. So it is just an issue of keeping track of them in Lightroom; more on that later.
    - The last of the four photo-related subdirectories within my "My Pictures" parent directory is the Lightroom directory, simply named "Lightroom" of course. There is one file and one sub-directory in this Lightroom directory. The file is named "Lightroom 3 Catalog" and the sub-directory is named "Lightroom 3 Catalog Previews.Irdata".
    - When I am backing up all of these 4 photo-related sub-directories to my external hard disk drive, I am also backing up the entire Lightroom directory, which includes the "previews subdirectory" as well as the catalog file. Even though this directory is relatively small, it sure does take a long time to back up this previews directory; i guess it is because there are so many files to copy/transfer - I have a huge number of photo files.
    - * So my first question is:  Are both the aforementioned "catalog" file and the accompanying sub-directory in the Lightroom directory considered as a group that should be kept together during backups, or is just backing up the "catalog" file by itself okay? Ie.: Do I need to back up the previews subdirectory as well?
    - Next Question: Do I need to use the PMB software  in this situation at all, or should I figure out how to use just Lightroom by itself, without the help of PMB to manage this set-up situation?
    - Which leads to my last question, does this all seem like a good method for setting up my photos for uploading, backing-up, to keeping track of things, etc.?
    Thank you again for your time, dj_paige. If you, or anyone else reading this, has any other suggestions here, please let me know!
    - tee 
