Need Best Practice for Migrating from Solaris to Linux

Hi Team,
We are migrating our data center from Solaris to Linux. Our EBS 11i environment (database 10g, 10.2.0.5) is 6 TB. Please let us know the best practice for migrating our EBS 11.5.10.2 from Solaris to Linux RHEL 5.
Please let us know if you need any further details.
EBS version: 11.5.10.2
DB version: 10.2.0.5
We have checked the certifications in Oracle support.
Oracle EBS 11.5.10.2 is not certified on Linux x86-64 RHEL 5.
Oracle EBS 11.5.10.2 is certified on Linux x86 RHEL 5.
So we require Database 10g (10.2.0.5) on Linux x86-64 RHEL 5 and the EBS application tier on Linux x86 RHEL 5.
Thank You.

You can use transportable tablespaces for the database tier node.
https://blogs.oracle.com/stevenChan/entry/10gr2_xtts_ebs11i
https://blogs.oracle.com/stevenChan/entry/call_for_xtts_eap_participants
For the application tier node, please see:
https://blogs.oracle.com/stevenChan/entry/migrate_ebs_apptiers_linux
https://blogs.oracle.com/stevenChan/entry/migrating_oracle_applications_to_new_platforms
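If it helps, here is a minimal sketch of the datafile conversion step in such a cross-platform transportable tablespace (XTTS) move. The tablespace and path names are hypothetical, and the notes above also cover the required Data Pump metadata export/import:

# On the Solaris source: make the tablespace read-only, then export its metadata
echo "ALTER TABLESPACE apps_ts_tx_data READ ONLY;" | sqlplus -S "/ as sysdba"
expdp system DIRECTORY=dp_dir DUMPFILE=xtts_meta.dmp TRANSPORT_TABLESPACES=APPS_TS_TX_DATA
# Copy the datafiles to the target, then on Linux convert them to the new endian format
echo "CONVERT DATAFILE '/stage/apps_ts_tx_data01.dbf' FROM PLATFORM 'Solaris[tm] OE (64-bit)' FORMAT '/u01/oradata/PROD/apps_ts_tx_data01.dbf';" | rman target /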
Thanks,
Hussein

Similar Messages

  • Best practices for migrating from 10.4 XServe to 10.5 XServe

    Hi--
    Suggestions welcomed for the following scenario. I have an almost 3-year-old XServe G5 running OS X Server 10.4.11. I have a brand new XServe Xeon on order, which of course will come with Server 10.5.1.
    The following are the critical services running on my current XServe:
    --An Open Directory database with approximately 200 users;
    --Around 50 file shares
    --The data itself stored in the shared folders
    What is the best way to migrate my Open Directory information from old to new? I know that a 10.5 server cannot be a replica for a 10.4 OD master.
    I know that I can use the password export instructions provided on AFP548, or at least I know I could use them in 10.4. Has anyone successfully exported users/passwords from 10.4 and imported them into 10.5?
    I think I could install 10.4.11 on the new XServe, make it an OD Replica, promote it to Master, take the old Master offline, then upgrade the new XServe to 10.5.
    Alternately, I could clone my current server, then upgrade it to 10.5, make the new one a replica, then promote, etc. I will eventually want to get the old server on 10.5, as its intended role is to be a Time Machine Server.
    Suggestions welcomed. Should I just stick with 10.4 for now, rolling back the new server and waiting on 10.5 for a few months? My preference would be to get both of these servers on 10.5 during the downtime for the migration, but I'm open to new ideas.
    Will migrating the Open Directory info also migrate the shares? In other words, if I set up a folder structure on the new server identical to the old, and bring the data over, will I have to recreate my shares and their respective permissions?
    Thanks in advance.
    Eric

    I would recommend shutting off mail services, then booting your server from another drive and, using Carbon Copy Cloner, copying the disk to the new server. Disconnect the old server from the network, connect the new server, boot it up, and immediately start an upgrade on the new server to 10.5.0, then 10.5.1. If it fails, revert to the old server before starting mail services.
    I did that upgrading an old Sawtooth on 10.4.11 to a mini with 10.5.1, and it went well except for all the virtual hosts for web sites. Still have not solved that one - just did the upgrade this weekend.
    Paul

  • Best practice for migrating from C160 to C370

    Hi there
    I am upgrading from a C160 to a C370. I have lots of policies and filters created on the C160. How do I migrate all the configuration and policies to my new box?
    Hope someone can advise me.
    Thanks.
    Winston

    Hello Winston,
    you should not have any problems migrating your configuration to the new box. Basically, all you have to do is bring both appliances to the same version of AsyncOS, then save the configuration of the C160 with unmasked(!) passwords (the default is to mask them, so make sure you uncheck that option, otherwise the config will not upload), then upload the configuration to the C370, and submit and commit to activate it.
    It's as easy as that; the only requirements are that the AsyncOS version is the same on both boxes and that the config contains unmasked passwords.
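    To illustrate, a rough sketch of the CLI session (prompt wording varies by AsyncOS version, so treat this as an outline rather than an exact transcript):

    c160> saveconfig
          Do you want to mask passwords? [Y]> N     <- answer No, or the file will not load
    (retrieve the saved XML from the appliance, e.g. via FTP)
    c370> loadconfig
    (paste the XML or name the uploaded file when prompted)
    c370> commit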
    Hope that helps,
    Andreas

  • Best practice for migrating IDOCs?

    Hi,
    I need to migrate some IDocs to another system for 'historical reference'.
    However, I don't want to move them using the regular setup, as I don't want the inbound processing to be triggered.
    The data that was created in the original system by the processed IDocs will be migrated to the new system using the migration workbench. I only need to migrate the IDocs as-is due to legal requirements.
    What is the best way to do this? I can see three solutions:
    A) Download the IDoc table contents to a local file and upload them in the new system. A quick and dirty approach, but it might also be a bit risky.
    B) Use LSMW. However, I'm not sure whether this is feasible for IDocs.
    C) Use ALE and set up a custom partner profile where inbound processing only writes the IDocs to the database. Send the IDocs from legacy to the new system. Using standard functionality in this way seems to me to be the best solution, but I need to make sure that the IDocs, once migrated, will get the same status as they had in the old system.
    Any help/input will be appreciated.
    Regards
    Karl Johan
    PS. For anyone interested in the business case: Within the EU the utility market was deregulated a few years ago, so that any customer can buy electricity from any supplier. When a customer switches supplier this is handled via EDI, in SAP using ALE and IDocs. I'm working on a merger between two utility companies, and for legal reasons we need to move the IDocs. Any other data is migrated using the migration workbench for IS-U.

    Hi Daniele
    I am not entirely sure what you are asking. Could you please provide additional information?
    Are you looking for best practice recommendations for governance, for example: change transports between DEV, QA and PRD in BPC 7.0?
    What is the best method? Server Manager backup and restore, etc.?
    And
    best practice recommendations on how to upgrade to a different version of BPC, for example: upgrading from BPC 7.0 to 7.5 or 10.0?
    Kind Regards
    Daniel

  • Best practice for migration to new hardware

    Hi,
    We are commissioning new hardware for our web server. Our current web server is version 6.1 SP4, and for the new server we've decided to stay with 6.1 but install SP7.
    Is there a best practice for migrating content from one physical server to another?
    What configuration files should I watch out for?
    Hopefully the jump from SP4 to SP7 won't cause too many problems.
    Thanks,
    John

    Unfortunately, there is no quick solution for migrating from one server to another. You will need to carefully reconstruct the following (a rough sketch of the relevant files follows this list):
    - acl rules
    - server hostname configurations
    - any certificates that have been created on the old machine
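    A minimal sketch of the files typically involved, assuming a Sun Java System Web Server 6.1 layout under /opt/SUNWwbsvr with a hypothetical instance name (review each file rather than copying blindly, and note that certificates tied to the old hostname must be re-issued):

    ROOT=/opt/SUNWwbsvr                 # hypothetical install root
    INST=$ROOT/https-myhost             # hypothetical instance directory
    NEW=newhost:/opt/SUNWwbsvr
    # core instance configuration: server settings and request processing rules
    scp $INST/config/magnus.conf $INST/config/obj.conf $INST/config/server.xml $NEW/https-myhost/config/
    # ACL files live at the server root
    scp $ROOT/httpacl/*.acl $NEW/httpacl/
    # certificate/key databases
    scp $ROOT/alias/*.db $NEW/alias/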

  • Best Practices for migrating .rpd

    Hi,
    Can someone please share with us the industry best practices for migrating the .rpd file from a source (QA) to a destination (Prod) environment? Our Oracle BI server runs on a Unix box, and currently what we do to migrate is save copies of the QA and Prod .rpds on our local machine and do a merge. Once merged, we FTP the resulting .rpd to the destination (Prod) machine. This approach does not seem reliable, and we are looking for a better established approach for .rpd migration. Kindly point me to where I can find the documentation. Any help in this regard would be highly appreciated.

    I don't think there is an industry best practice as such. What applies for one customer doesn't necessarily apply at another site.
    Have a read of this from Mark Rittman, it should give you some good ideas:
    http://www.rittmanmead.com/2009/11/obiee-software-configuration-management-part-2-subsequent-deployments-from-dev-to-prod/
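    For what it's worth, a minimal sketch of scripting the promotion so it is at least repeatable (paths and host names are hypothetical; the BI Server restart command depends on your version, e.g. run-sa.sh on 10g Unix installs):

    RPD=paint.rpd                                   # hypothetical repository name
    PROD=/u01/app/obiee/server/Repository           # hypothetical prod repository dir
    STAMP=$(date +%Y%m%d%H%M)
    cp "$PROD/$RPD" "$PROD/$RPD.$STAMP.bak"         # dated backup of the current prod rpd
    scp qa-host:/stage/$RPD "$PROD/$RPD"            # push the merged rpd from QA staging
    # restart the BI Server so it loads the new repository, e.g.:
    # ./run-sa.sh stop && ./run-sa.sh start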
    Paul

  • What is the best solution for migrating from Mavericks to Yosemite?

    What is the best solution for migrating from Mavericks to Yosemite? Anyone have suggestions?

    Back up all data. Update all third-party software to the latest version and remove any you don't need. Download the Yosemite installer from the App Store. Run it.

  • Best practice for exporting from iMovie '08 to iDVD

    I am looking to find out the best practice for exporting from iMovie '08 to iDVD. I have read the other postings that give the basic how-to (export to Media Browser, then select the video in iDVD). However, my question is a little more technical. I have 1080i HD projects. I am interested in burning them to DVD in the best possible quality. What setting should I be using when I publish to Media Browser?
    I am wondering about quality loss due to more than one conversion/compression. I suspect that this is occurring when I export to the Media Browser. If I am not mistaken, iMovie is using something like H.264 for this. Then, when I run iDVD, I suspect it will do another conversion/compression, I think to get to MPEG-2. Not only could this result in a loss of quality, but it will also take extra time. I am interested to know what others think about this.
    Finally, I am looking to create DVDs for a lot of video. I am wondering if there are any USB or firewire hardware devices out there that could speed up the compression. I use the Elgato Turbo.264 when I want to encode to H.264 but I wonder if there is something similar for DVD creation.
    Thanks in advance.

    The standard for video DVD is 720x480, usually MPEG-2 encoded,
    so your HiDef project HAS to be 'downsampled' somehow.
    I would export with QuickTime/Apple Intermediate Codec => which is the 'format' your project is in already, so you avoid any useless in-between encoding.
    iDVD will 'swallow' this huge export file - don't mind: iDVD cares about length, not size.
    iDVD will then convert into DVD standards.
    You can 'raise' quality by using projects <60 min - this sets iDVD automatically to the highest technically possible bitrate.
    Hint: judge picture quality on a DVD player + TV, not on your computer (DVDs are meant for TV delivery).

  • Cisco Network 2009 - Best practices for migrating previous versions of cisco unified communications manager to cucm 7.1

    Does anybody have a copy of the above referenced presentation that you could send me?
    The presentation can be purchased at the following site:
    http://www.scribd.com/doc/33211957/BRKVVT-2011-Best-Practices-for-Migrating-Previous-Versions-of-Cisco-Unified-Communications#archive
    but I felt I'd ask one of my peeps first.
    Thanks in advance.
    Dennis
    Dennis

    Hi Dennis,
    Well..let's give this a try
    Cheers!
    Rob

  • Best Practice for Migration of BO  from one server to another

    Hi All,
    I would like to know the best practice for migration of BO from one server to another.
    I have installed BO XI R2 on my server.
    Thanks,
    Anendu Bothra
    Edited by: Anendu Bothra on Mar 5, 2009 10:24 AM

    You need to copy your input and output file stores from the old server to the new server. By default these are located in the <Business Objects install path>\FileStore directory.
    Then you need to stop the CMS, right-click the CMS, click the Configuration tab, and then click Specify.
    Choose Copy, then click OK.
    Choose the version information for the source CMS database.
    Select the database type for the source CMS database, and then specify its database information (including host name, user name, and password).
    Select the database type for the destination CMS database, and then specify its database information (including host name, user name, and password).
    When the CMS database has finished copying, click OK.
    Once this process has been completed, start the CMS and click Update Objects (located at the top of the CCM).
    I'd advise taking full backups beforehand.
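    As a minimal sketch of the file store copy, assuming a Unix deployment with a hypothetical install path (on Windows the same idea applies with robocopy); stop the File Repository Servers first so the stores are quiescent:

    SRC=/opt/bobj/FileStore             # hypothetical install path
    DST=newserver:/opt/bobj/FileStore
    # copy the Input and Output stores, preserving permissions and timestamps
    rsync -av "$SRC/Input/"  "$DST/Input/"
    rsync -av "$SRC/Output/" "$DST/Output/"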

  • Oracle Application R12.1.1 Migration from Solaris to Linux 64-bit

    Hi,
    Our existing System:
    On Solaris 9:
    Database 10.2.0.2
    EBS 11.5.10.2
    We are planning to Upgrade the system to
    Database 11.2.0.3
    EBS R12.1.3
    We also want to migrate to RHEL 5 64-bit.
    Please let me know if you have experience with respect to the following issues:
    1. Can I use *Application Tier Platform Migration with Oracle E-Business Suite Release 12 [ID 438086.1]* as reference for my scenario? The document mentions that only Windows is supported as the source platform.
    2. When they copy APPL_TOP from source to target - how the binaries from source operating system will work on target operating system? Else, are they platform independent?
    3. Will the following approach work?
    a. Migrate 10.2.0.2 database from source (Solaris) to target (RHEL 5 64-bit).
    b. Upgrade 10.2.0.2 database to 11.2.0.3 on target.
    c. Install R12.1.1 in upgrade mode on target.
    d. Retrofit customization.
    e. Upgrade R12.1.1 to R12.1.3
    Also, can you recommend a best practice for migrating the database?
    Thanks in advance.

    Note: Application Tier Platform Migration with Oracle E-Business Suite Release 12 (Doc ID 438086.1) reads: "The instructions in this document are for migration to a target UNIX/Linux platform. At present, Windows is only supported as a source platform."
    Note the word SOURCE. So the restriction is Windows cannot be used as a TARGET platform, but other platforms can.
    The content of the APPL_TOP is re-used in the target environment, except for the binaries; these are delivered by applying the "customer-specific update", as in Note: Application Tier Platform Migration with Oracle E-Business Suite Release 12 (Doc ID 438086.1).
    For information on the database migration check Note: 10g Release 2 Export/Import Process for Oracle Applications Release 11i (Doc ID 362205.1) and http://blogs.oracle.com/stevenChan/entry/migrating_ebusiness_suite_databases_between_platfo
    The scenario seems OK, but needs to be tested of course.
    For information related to upgrading the database for the EBS upgrade check Note: Database Preparation Guidelines for an E-Business Suite Release 12.1.1 Upgrade (Doc ID 761570.1)
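    As a minimal sketch of the Data Pump route for the database move (the directory object, file names and parallelism below are illustrative; the EBS-specific preparation steps in the notes above still apply):

    # On the Solaris source: full export
    expdp system FULL=Y DIRECTORY=dp_dir DUMPFILE=ebsfull_%U.dmp PARALLEL=4 LOGFILE=ebsfull_exp.log
    # Copy the dump files to the Linux target, then import
    impdp system FULL=Y DIRECTORY=dp_dir DUMPFILE=ebsfull_%U.dmp PARALLEL=4 LOGFILE=ebsfull_imp.log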
    Regards, Carlo.

  • Need to Clone A Database From Solaris to Linux Oracle 11g

    Hi Team,
    I am using Oracle Database Version 11.2.0.1.
    We have 6 production servers, all of them on Solaris boxes. We have a new requirement from the application team to use Linux boxes, and for that we need to perform a database refresh from Solaris to a Linux box.
    I can do it with the export and import technique, but I want some other way to perform this database refresh activity.
    1] Can I do it by taking an RMAN backup from the Solaris box (source) and restoring it on Linux (target)?
    2] Can I do it using the transportable tablespace method?
    I am aware of performing this using export-import, but as our database size is nearly 1 TB I can't apply this technique.
    I need your suggestions on this activity. Please give your opinion on the correct method I should opt for, and share any supporting documents where things are laid out step by step.
    Please let me know if this can be achieved by RMAN...
    Source Machine Make:-
    bash-3.00$ uname -a
    SunOS blrdlvdwhdb01 5.10 Generic_142909-17 sun4v sparc SUNW,Sun-Blade-T6320
    Target System Make:-
    Linux blrulvremoradb01 2.6.18-194.el5 #1 SMP Tue Mar 16 21:52:39 EDT 2010 x86_64 x86_64 x86_64 GNU/Linux
    Regards,
    Arijit

    Yes, you can create a backup with RMAN and restore it on another platform; check:
    How To Use RMAN CONVERT DATABASE on Source Host for Cross Platform Migration - 413586.1
    Cross-Platform Migration on Destination Host Using Rman Convert Database - Note 414878.1
    Note, though, that RMAN CONVERT DATABASE only works between platforms with the same endian format. Your source (SPARC) is big-endian and your target (x86-64) is little-endian, so for this combination you would use cross-platform transportable tablespaces instead.
    But easier and faster will be expdp/impdp.
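    A minimal sketch of the cross-endian transportable tablespace route, with a hypothetical USERS tablespace (the endian format of each platform is listed in V$TRANSPORTABLE_PLATFORM):

    # on the Solaris source, after making the tablespace read-only:
    echo "CONVERT TABLESPACE users TO PLATFORM 'Linux x86 64-bit' FORMAT '/stage/%U';" | rman target /
    # then export the tablespace metadata and copy the converted datafiles + dump file to the target
    expdp system DIRECTORY=dp_dir DUMPFILE=tts_users.dmp TRANSPORT_TABLESPACES=USERS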

  • Best practice for migrating eLearning data from Sol Mgr 3 to 4?

    Greetings,
    What is the recommended method for moving eLearning data when migrating from Solution Manager 3 to version 4?
    Thanks in advance,
         Ken Henderson

    948115 wrote:
    >
    Dear All,
    This is Priya.
    We are using ODI version 11.1.1.6.
    In my ODI project, we have separate installations for Dev, Test and Prod, i.e. the master repositories are not common between the three. Now my code is ready in Dev. The Test environment is just installed with ODI, and the Master and Work repositories are created. That's it.
    Now, I need to know and understand the simplest and best way to import the code from Dev and migrate it to the Test environment. Can someone describe it as a step-by-step procedure in 5-6 lines?
    >
    If this is the first time you are moving to QA, it is better to export/import the complete work repository. If it is not the first time, then create scenarios for the specific packages and export/import them to QA. In the case of scenarios you need not bother about models/datastores. Keep in mind that the logical schema names should be the same in QA as used in your Dev.
    >
    Some questions on the current state.
    1. Do the IDs of the master and work repositories in Dev and Test need to be the same?
    >
    They should be different.
    >
    2. I usually see in an export file a repository ID of 999 and fail to understand what it is exactly. None of my master or work repositories are named with that ID.
    >
    It is required to ensure object uniqueness across several work repositories. For more understanding you can refer to:
    http://docs.oracle.com/cd/E14571_01/integrate.1111/e12643/export_import.htm
    http://odiexperts.com/odi-internal-id/
    >
    3. Logical Architecture objects and contexts do not have an export option. What is a suitable alternative for this?
    >
    If you are exporting the topology then you will get the logical connection and context details. If you are not, then you need to manually create the context and the other physical/logical connections.
    >
    Thanks,
    Priya
    Edited by: 948115 on Jul 23, 2012 6:19 AM

  • Best practice for migrating data tables- please comment.

    I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
    Instead of the DBAs just using a tool to migrate the data, they are insistent that I save and provide scripts for every single commit, in proper order, necessary both to build the tables and to insert the data from ground zero.
    I am very unaccustomed to this kind of environment and it seems much riskier for me to try and rebuild the objects from scratch when I already have a perfect, tested, ready model.
    They also require extensive documentation where every step is recorded in a document and use that for the deployment.
    I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
    Please comment on your view of this practice. Thanks!

    >
    Please comment on your view of this practice. Thanks!
    >
    Sounds like the DBAs are using best practices to get the job done. Congratulations to them!
    >
    I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
    Instead of the DBAs just using a tool to migrate the data they are insistent that I save and provide scripts for every single commit, in proper order, necessary to both build the table and insert the data from ground zero.
    >
    The process you describe is what I would expect, and require, in any well-run environment.
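    To make the idea concrete, here is a minimal sketch of one such scripted, rerunnable deployment step (object names, connect string and the DEPLOY_PW environment variable are hypothetical):

    # build one seeded lookup table; the script aborts and rolls back on any error
    printf '%s\n' \
      "WHENEVER SQLERROR EXIT FAILURE ROLLBACK;" \
      "CREATE TABLE order_status_lkp (status_code VARCHAR2(10) PRIMARY KEY, description VARCHAR2(100) NOT NULL);" \
      "INSERT INTO order_status_lkp VALUES ('NEW', 'Order received');" \
      "INSERT INTO order_status_lkp VALUES ('SHIP', 'Order shipped');" \
      "COMMIT;" | sqlplus -S deploy_user/"$DEPLOY_PW"@PROD
    Every such step goes into the deployment document in order, together with a matching rollback script.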
    >
    I am very unaccustomed to this kind of environment and it seems much riskier for me to try and rebuild the objects from scratch when I already have a perfect, tested, ready model.
    >
    Nobody cares if it is riskier for you. The production environment is sacred. Any and all risk to it must be reduced to a minimum at all cost. In my opinion a DBA should NEVER move ANYTHING from a development environment directly to a production environment. NEVER.
    Development environments are sandboxes. They are often not backed up. You or anyone else could easily modify tables or data with no controls in place. Anything done in a DEV environment is assumed to be incomplete, insecure, disposable and unvetted.
    If you are doing development and don't have scripts to rebuild your objects from scratch then you are doing it wrong. You should ALWAYS have your own backup copies of DDL in case anything happens (and it does) to the development environment. By 'have your own' I mean there should be copies in a version control system or central repository where your teammates can get their hands on them if you are not available.
    As for data - I agree with what others have said. Further - ALL data in a dev environment is assumed to be dev data and not production data. In all environments I have worked in ALL production data must be validated and approved by the business. That means every piece of data in lookup tables, fact tables, dimension tables, etc. Only computed data, such as might be in a data warehouse system generated by an ETL process might be exempt; but the process that creates that data is not exempt - that process and ultimately the data - must be signed off on by the business.
    And the business generally has no access to, or control of, a development environment. That means using a TEST or QA environment for the business users to test and validate.
    >
    They also require extensive documentation where every step is recorded in a document and use that for the deployment.
    I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
    >
    Absolutely! That's how professional deployments are performed. Deployment documents are prepared and submitted for sign-off by each of the affected groups. Those groups can include security, DBA, business users, IT and even legal. The deployment documents always include recovery steps, so that if something goes wrong or the deployment can't proceed there is a documented procedure for restoring the system to a valid working state.
    The deployments themselves that I participate in have representatives from each of those groups in the room or on a conference call as each step of the deployment is performed. Your 5 tables may be used by stored procedures, views or other code that has to be deployed as part of the same process. Each step of the deployment has to be performed in the correct order. If something goes wrong, the responsible party is responsible for assisting in the retry or recovery of their component.
    It is absolutely vital to have a known, secure, repeatable process for deployments. There are no shortcuts. I agree that for a simple scenario of 5 new tables and a small amount of data it may seem like overkill.
    But, despite what you say, it simply cannot be that easy, for one simple reason: adding 5 tables with data to a production system has no business impact or utility at all unless there is some code, process or application somewhere that accesses those tables and data. Your post didn't mention the part about what changes are being made to actually USE what you are adding.

  • Best practice for moving from a G5 to a new Mac with SL

    I am receiving my new iMac today (27") and am very excited.
    However, I want to move over using the best practices to assure that I remain excited and not frustrated.
    My initial thoughts are to boot it up and do the initial setup - to move my iPhoto library over, and to use Migration Assistant to move the rest of my data files.
    Then to install all of the extra software that I can find the packages for from the original installation disks.
    And then finally to use Migration Assistant again to move over any software that I cannot find original disks for (I've moved from Mac to Mac to Mac over and over, and some of the software goes back to OS 9 - and won't run anymore, I guess).
    Is this a good way?
    OR
    will I mess up doing it this way?
    OR
    am I spending far too much time worrying about moving old problems over, and would be better off to just turn MA loose and let it do its thing from the beginning?
    BTW - Mail crashes a lot on my existing system - pretty much everything else seems OK - except iPhoto is slow - hoping that the new Intel dual core will help that.
    LN

    Migration Assistant is not a general file moving tool. MA will migrate your Applications and Home folders transferring only your third-party applications. MA will transfer any application support folders required by your applications, your preferences, and network setup. You do not have a choice of what will be migrated other than the above. MA cannot determine whether anything transferred is compatible with Snow Leopard. I recommend you look at the following:
    A Basic Guide for Migrating to Intel-Macs
    If you are migrating a PowerPC system (G3, G4, or G5) to an Intel-Mac be careful what you migrate. Keep in mind that some items that may get transferred will not work on Intel machines and may end up causing your computer's operating system to malfunction.
    Rosetta supports "software that runs on the PowerPC G3, G4, or G5 processor that are built for Mac OS X". This excludes the items that are not universal binaries or simply will not work in Rosetta:
    Classic Environment, and subsequently any Mac OS 9 or earlier applications
    Screensavers written for the PowerPC
    System Preference add-ons
    All Unsanity Haxies
    Browser and other plug-ins
    Contextual Menu Items
    Applications which specifically require the PowerPC G5
    Kernel extensions
    Java applications with JNI (PowerPC) libraries
    See also What Can Be Translated by Rosetta.
    In addition to the above you could also have problems with migrated cache files and/or cache files containing code that is incompatible.
    If you migrate a user folder that contains any of these items, you may find that your Intel-Mac is malfunctioning. It would be wise to take care when migrating your systems from a PowerPC platform to an Intel-Mac platform to assure that you do not migrate these incompatible items.
    If you have problems with applications not working, then completely uninstall said application and reinstall it from scratch. Take great care with Java applications and Java-based Peer-to-Peer applications. Many Java apps will not work on Intel-Macs as they are currently compiled. As of this time Limewire, Cabos, and Acquisition are available as universal binaries. Do not install browser plug-ins such as Flash or Shockwave from downloaded installers unless they are universal binaries. The version of OS X installed on your Intel-Mac comes with special compatible versions of Flash and Shockwave plug-ins for use with your browser.
    The same problem will exist for any hardware drivers such as mouse software unless the drivers have been compiled as universal binaries. For third-party mice the current choices are USB Overdrive or SteerMouse. Contact the developer or manufacturer of your third-party mouse software to find out when a universal binary version will be available.
    Also be careful with some backup utilities and third-party disk repair utilities. Disk Warrior 4.1, TechTool Pro 4.6.1, SuperDuper 2.5, and Drive Genius 2.0.2 work properly on Intel-Macs with Leopard. The same caution may apply to the many "maintenance" utilities that have not yet been converted to universal binaries. Leopard Cache Cleaner, Onyx, TinkerTool System, and Cocktail are now compatible with Leopard.
    Before migrating or installing software on your Intel-Mac check MacFixit's Rosetta Compatibility Index.
    Additional links that will be helpful to new Intel-Mac users:
    Intel In Macs
    Apple Guide to Universal Applications
    MacInTouch List of Compatible Universal Binaries
    MacInTouch List of Rosetta Compatible Applications
    MacUpdate List of Intel-Compatible Software
    Transferring data with Setup Assistant - Migration Assistant FAQ
    Because Migration Assistant isn't the ideal way to migrate from PowerPC to Intel Macs, using Target Disk Mode, copying the critical contents to CD or DVD, to an external hard drive, or over the network will work better. The initial section below discusses Target Disk Mode. It is then followed by a section which discusses networking with Macs that lack Firewire.
    If both computers support the use of Firewire then you can use the following instructions:
    1. Repair the hard drive and permissions using Disk Utility.
    2. Backup your data. This is vitally important in case you make a mistake or there's some other problem.
    3. Connect a Firewire cable between your old Mac and your new Intel Mac.
    4. Startup your old Mac in Target Disk Mode.
    5. Startup your new Mac for the first time, go through the setup and registration screens, but do NOT migrate data over. Get to your desktop on the new Mac without migrating any new data over.
    If you are not able to use a Firewire connection (for example, you have a Late 2008 MacBook that only supports USB):
    1. Set up a local home network: Creating a small Ethernet Network.
    2. If you have a MacBook Air or Late 2008 MacBook see the following:
    MacBook (13-inch, Aluminum, Late 2008) and MacBook Pro (15-inch, Late 2008)- Migration Tips and Tricks;
    MacBook (13-inch, Aluminum, Late 2008) and MacBook Pro (15-inch, Late 2008)- What to do if migration is unsuccessful;
    MacBook Air- Migration Tips and Tricks;
    MacBook Air- Remote Disc, Migration, or Remote Install Mac OS X and wireless 802.11n networks.
    Copy the following items from your old Mac to the new Mac:
    In your /Home/ folder: Documents, Movies, Music, Pictures, and Sites folders.
    In your /Home/Library/ folder:
    /Home/Library/Application Support/AddressBook (copy the whole folder)
    /Home/Library/Application Support/iCal (copy the whole folder)
    Also in /Home/Library/Application Support (copy whatever else you need including folders for any third-party applications)
    /Home/Library/Keychains (copy the whole folder)
    /Home/Library/Mail (copy the whole folder)
    /Home/Library/Preferences/ (copy the whole folder)
    /Home/Library/Calendars (copy the whole folder)
    /Home/Library/iTunes (copy the whole folder)
    /Home/Library/Safari (copy the whole folder)
    If you want cookies:
    /Home/Library/Cookies/Cookies.plist
    /Home/Library/Application Support/WebFoundation/HTTPCookies.plist
    For Entourage users:
    Entourage is in /Home/Documents/Microsoft User Data
    Also in /Home/Library/Preferences/Microsoft
    Credit goes to Macjack for this information.
    If you need to transfer data for other applications please ask the vendor or ask in the Discussions where specific applications store their data.
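    A minimal sketch of copying one of the folders listed above with the old Mac in Target Disk Mode (the volume and user names are hypothetical; Apple's bundled rsync uses -E to preserve resource forks and extended attributes):

    SRC="/Volumes/OldMac/Users/eric"    # hypothetical old-Mac volume and user folder
    # copy the Mail folder, preserving resource forks and extended attributes
    rsync -avE "$SRC/Library/Mail/" ~/Library/Mail/
    # repeat for the other folders listed above, e.g. Keychains and Preferences
    rsync -avE "$SRC/Library/Keychains/" ~/Library/Keychains/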
    Once you have transferred what you need, restart the new Mac and test to make sure the contents are there for each of the applications.
    Written by Kappy with additional contributions from a brody.
    Revised 1/6/2009
    In general you are better off reinstalling any third-party software that is PPC-only. Otherwise update your software so it's compatible with Snow Leopard.
    Do not transfer any OS 9 software because it's unsupported. You can transfer documents you want to keep.
    Buy an external hard drive to use for backup.
