Win XP to Win 7 migration -- best practices

Hi,
What are the best practices to follow when we migrate XP to Win 7 using Configuration Manager?
- Computer name: should we rename it during the migration or keep it as is?
- USMT: what should be migrated?
Please share any pointers/suggestions. Thanks in advance.
Regards,

First determine your needs... do you really need to capture the user data or not? Perhaps the users can back up their precious data themselves before the OS upgrade, so you don't need to worry about it. If you can make that kind of policy decision, then you don't need USMT. The same goes for computer names: you can keep the old ones if you please. That too is a policy decision; you should use a unique name for every computer. Some prefer the PC's serial number, some prefer something else.
It's really up to you to decide.
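If you do go with serial-number-based names, here is a purely illustrative PowerShell sketch for deriving a name on the client (the 'PC-' prefix is an arbitrary choice, not a recommendation):

    # Build a computer name from the BIOS serial number
    $serial = (Get-WmiObject -Class Win32_BIOS).SerialNumber
    $name = ('PC-' + $serial) -replace '[^A-Za-z0-9-]', ''
    # NetBIOS computer names are limited to 15 characters
    $name = $name.Substring(0, [Math]::Min(15, $name.Length))
    $name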
Some technical pointers to consider:
Clients should have the ConfigMgr client installed on them before the migration (so that they appear in the console and can be instructed to do things, like run a task sequence...)
If the clients use static IP addresses, you need to configure your TS to capture those settings and reapply them during the upgrade process; see the sketch below.
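For the USMT part, ConfigMgr's built-in Capture User State / Restore User State task sequence steps drive USMT's scanstate and loadstate under the hood, and the built-in Capture Network Settings / Apply Network Settings steps cover the static IP case. A minimal sketch of the underlying USMT 4.0 commands for an XP-to-7 refresh (store path and log names are placeholders):

    # Capture on the old OS (a hard-link store keeps the data on disk instead of copying it):
    scanstate C:\MigStore /i:migdocs.xml /i:migapp.xml /o /c /hardlink /nocompress /l:scan.log
    # Restore on the new OS:
    loadstate C:\MigStore /i:migdocs.xml /i:migapp.xml /c /hardlink /nocompress /l:load.log

As for what should be migrated: start from the migdocs.xml (user documents) and migapp.xml (application settings) rules that ship with USMT, and only add custom XML where you have a concrete requirement.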

Similar Messages

  • Migration Best Practice When Using an Auth Source

    Hi,
    I'm looking for some advice on migration best practices or more specifically, how to choose whether to import/export groups and users or to let the auth source do a sync to bring users and groups into each environment.
    One of our customers is using an LDAP auth source to synchronize users and groups. I'm trying to help them do a migration from a development environment to a test environment. I'd like to export/import security on each object as I migrate it, but does this mean I have to export/import the groups on each object's ACLs before I export/import each object? What about users? I'd like to leave users and groups out of the PTE files and just export/import the auth source and let it run in each environment. But I'm afraid the UUIDs for the newly created groups will be different and they won't match up with object ACLs any more, causing all the objects to lose their security settings.
    If anyone has done this before, any suggestions about best practices and gotchas when using the migration wizard in conjunction with an auth source would be much appreciated.
    Thanks,
    Chris Bucchere
    Bucchere Development Group
    [email protected]
    http://www.bucchere.com

    The best practice here would be to migrate only the auth source through the migration wizard, and then do an LDAP sync on the new system to pull in the users and groups. The migration wizard will then just "do the right thing" in matching up the users and groups on the ACLs of objects between the two systems.
    Users and groups are actually a special case during migration -- they are resolved first by UUID, but if that is not found, then a user with the same auth source UUID and unique auth name is also treated as a match. Since you are importing from the same LDAP auth source, the unique auth name for the user/group should be the same on both systems. The auth source's UUID will also match on the two systems, since you just migrated that over using the migration wizard.
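    Purely as an illustration (not product code), the resolution order described above amounts to something like this PowerShell sketch, with hypothetical property names:

        # Resolve a migrated user/group against the target system
        function Resolve-Principal($src, $byUuid, $byAuthKey) {
            # 1) An exact UUID match wins
            if ($byUuid.ContainsKey($src.Uuid)) { return $byUuid[$src.Uuid] }
            # 2) Fall back to auth source UUID + unique auth name
            $key = '{0}|{1}' -f $src.AuthSourceUuid, $src.UniqueAuthName
            if ($byAuthKey.ContainsKey($key)) { return $byAuthKey[$key] }
            return $null  # unresolved; the ACL entry would be dropped
        }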

  • Data Migration Best Practice

    Is there a clear-cut best-practice procedure for conducting data migration from one company to a new one?

    I don't think there is a clear cut for that. Best practice is always relative; it varies dramatically depending on many factors. There is no magic bullet here.
    One exception to the above: you should always use Tab-delimited text format. It is a DTW-friendly format.
    Thanks,
    Gordon
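    For what it's worth, a CSV export can be converted to the Tab-delimited text DTW expects with a short PowerShell snippet (file names are hypothetical, and the crude quote-stripping assumes the data contains no embedded quotes or tabs):

        # Convert a CSV export to tab-delimited text for DTW import
        (Import-Csv .\customers.csv | ConvertTo-Csv -Delimiter "`t" -NoTypeInformation) -replace '"','' | Set-Content .\customers.txt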

  • Native Toplink to EclipseLink to JPA - Migration Best Practice

    I am currently looking at the future technical stack of our developments, and would appreciate any advice concerning best-practice migration paths.
    Our current platform is as follows:
    Oracle 10g AS -> Toplink 10g -> Spring 2.5.x
    We have (approx.) 100 separate Toplink Mapping Workbench projects (we have one per DDD Aggregate object, in effect) and therefore 100 Repositories (or DAOs), some using Toplink code (e.g. Expression Builder, Class Extractors, etc.) on top of the mappings to support the Object-to-RDB (legacy) mismatch.
    Future platform is:
    Oracle 11g AS -> EclipseLink -> Spring 3.x
    Migration issues are as follows:
    Spring 3.x does not provide any Native Toplink ORM support
    Spring 2.5.x requires Toplink 10g to provide Native Toplink ORM support
    My current plan is as follows:
    1. Migrate Code and Mappings to use EclipseLink (as per Link:[http://wiki.eclipse.org/EclipseLink/Examples/MigratingFromOracleTopLink])
    2. Temporarily re-implement the Spring 2.5.x ->Toplink 10g support code to use EclipseLink (e.g. TopLinkDaoSupport etc) to enable testing of this step.
    3. Refactor all Repositories/DAOs and Support code to use JPA engine (i.e. Entity Manager etc.)
    4. Move to Spring 3.x
    5. Move to 11g (when available!)
    Step 2 is only required to enable testing of the mapping changes, without changing to use the JPA engine.
    Step 3 will only work if my understanding of the following statement is correct (i.e. I can use the JPA engine to run native Toplink mappings and associated code):
    Quote:"Deployment XML files from Oracle TopLink 10.1.3 and above can be read by EclipseLink."
    Specific questions are:
    Is my understanding correct regarding the above?
    Is there any other path to achieve the goal of using 11g, EclipseLink (and Spring 3.x)?
    Is this achievable without refactoring all XML mappings from Native -> JPA?
    Many thanks for any assistance.
    Marc

    It is possible to use the native/MW TopLink/EclipseLink deployment XML files with JPA in EclipseLink; that is correct. You just need to pass a persistence property giving your sessions.xml file location. The native API is also still supported in EclipseLink.
    James : http://www.eclipselink.org

  • New white paper: Character Set Migration Best Practices

    This paper can be found on the Globalization Home Page at:
    http://technet.oracle.com/tech/globalization/pdf/mwp.pdf
    This paper outlines the best practices for database character set
    migration that has been utilized on behalf of hundreds of customers
    successfully. Following these methods will help determine what
    strategies are best suited for your environment and will help minimize
    risk and downtime. This paper also highlights migration to Unicode.
    Many customers today are finding Unicode to be essential to supporting
    their global businesses.

    Sorry about that. I posted that too soon. It should become available today (Monday Aug 22nd).
    Doug

  • GRC AACG/TCG and CCG control migration best practice.

    Are there any best-practice documents which illustrate the step-by-step migration of AACG/TCG and CCG controls from the development instance to production? Also, how should one take a backup of the same?
    Thanks,
    Arka

    There are no automated out-of-the-box tools to migrate anything from CCG.  In AACG/TCG you can export and import Access Models (including the Entitlements) and Global Conditions.  You will have to manually set up roles, users, path conditions, etc.
    You can't clone AACG/TCG or CCG.
    Regards,
    Roger Drolet
    OIC

  • EIS Migration Best Practice

    Hello, All
    I have a couple of questions on EIS upgrade and migrations. Appreciate any useful inputs:
    1. What is the best way to automate the EIS application migration from one environment to another? I doubt whether LCM can be used. On the other hand, I think there may be an export/import option available, but is there a command-line utility to trigger this export/import so that we can put it into a batch job? If it works, is there anything else that needs to be migrated?
    2. What if it is an upgrade from 7 to the latest version of EIS? Can we simply restore the EIS catalog and let the system automatically convert/upgrade it?
    Thanks a lot!

    That is correct, there is no command line migration utility.
    Supposedly, you can install a 32 bit version of EIS on a 32 bit environment and do the XML import on that, but I haven't had any success with it.
    Yes, you should just be able to use the backup/restore of the EIS repository for EIS migration. We did a simple "lift & shift" of the repository database when we upgraded. We didn't rebuild or migrate anything.
    We could have just left the catalog in place and pointed the new EIS at it, but our SQLServer2005 group wanted to keep the naming conventions consistent across all the Oracle Hyperion databases.
    Tim Young

  • Best Practice for migration to Exadata2

    Hi Guru,
    I'm planning to migrate an Oracle RAC 11g (11.2.0.2) cluster on HP-UX Itanium machines to a new Exadata 2 system.
    Are there best practices? Where can I find documentation about the migration?
    Thanks very much
    Regards
    Gio

    There are several docs available on MOS
    HP Oracle Exadata Migration Best Practices [ID 760390.1]
    Oracle Exadata Best Practices [ID 757552.1]
    Oracle Sun Database Machine X2-2/X2-8 Migration Best Practices [ID 1312308.1]
    If you already have Exadata, I recommend opening an SR with Oracle and engaging with ACS.
    - Wilson
    www.michaelwilsondba.info

  • Best practices for installing Win 10 under Hyper-V on Server 2012R2 host

    Yeah, yeah, I know I could probably get my answers after spending 10 hours reading hundreds of isolated threads here.  I've already put in about 2 hours, and I'm exhausted.  Plus, this site does not have a very sophisticated search function.
    I want to install Win10 as a VM on my Server 2012R2 machine.  I am not currently hosting any other VMs, so my first decision was whether to try it using Hyper-V or VirtualBox.  I started with VirtualBox, but I ran into two problems: networking
    and video.  Also, VirtualBox itself seems to have some issues with failing to install the extension pack.  So now I think I'll give Hyper-V a shot.
    I found some blog posts from last year providing guidance on setting up Hyper-V for Win10, but given the rate of change of this beta OS, I expect there are many new "features" that can be mitigated by specific settings on the VM.
    Some specific questions:
    1. Generation 1 or Generation 2 in the Hyper-V setup?  The blogs I've seen say to use Gen1, but provide no justification.  Perhaps because they are using Win8 as the host?  I am using 2012R2.
    2. Does the Win10 ISO file need to be continually available to the VM, or is it only used in the initial installation?
    3. How do I get the VM to access the GPU card, which has lots of memory, over the useless onboard video chip which only has 8MB and no 3D instruction set?  This was a dealbreaker with VirtualBox.
    4. I anticipate many issues with networking, but I'll start with this: I have dual onboard NICs going into a managed switch.  Should I just give one physical NIC to the VM and let the host have the other?  I think I'm going to have some issues
    with DHCP IP address assignment, but we'll see.  Any best practices here would be helpful.
    Thanks.

    >1.
    I'd use Gen 1, i.e. a BIOS-type boot, but that's just because I've had less trouble with them than with Gen 2 VMs.
    >2.
    Only during install, refresh, reset, or sfc.
    >3.
    No virtualization solution does it easily, but there is RemoteFX if you can
    get a Windows 10 client to use it.  I've never tried.
    http://social.technet.microsoft.com/wiki/contents/articles/16652.remotefx-vgpu-setup-and-configuration-guide-for-windows-server-2012.aspx
    >4.
    That's what I would do (assign one NIC to the VM and one to the host).  If both are receiving an IP address from DHCP right now, they will continue to do so in the new setup, unless you have a managed switch configured to prevent additional IP addresses.  It's hard to tell...
    Bob Comer
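    For reference, the Gen 1 VM plus dedicated-NIC setup described above can be scripted; a hedged PowerShell sketch for the 2012 R2 host (adapter name, paths and sizes are placeholders):

        # External switch bound to the NIC reserved for VMs; the host keeps the other NIC
        New-VMSwitch -Name "VMExternal" -NetAdapterName "Ethernet 2" -AllowManagementOS $false
        # Generation 1 (BIOS-boot) VM, per the advice above
        New-VM -Name "Win10" -Generation 1 -MemoryStartupBytes 4GB -NewVHDPath "D:\VMs\Win10.vhdx" -NewVHDSizeBytes 60GB -SwitchName "VMExternal"
        # Attach the ISO for installation only...
        Set-VMDvdDrive -VMName "Win10" -Path "D:\ISO\Win10.iso"
        Start-VM -Name "Win10"
        # ...and detach it once the install completes (point 2 above)
        Set-VMDvdDrive -VMName "Win10" -Path $null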

  • What is the best practice for uninstalling only some CS4 programs (Win 7 PC)

    I recently upgraded from CS4 to CS5.5 and wanted to free up some hard drive space on my Windows 7 PC. I wanted to uninstall only a few programs though from CS4, such as Photoshop, Illustrator, Flash and Bridge. What is the best way to do this keeping in mind licensing, deactivating and properly removing components? I have the original installation disk for CS4 if needed. Thanks for any help!

    Best practice: Uninstall everything (including your CS5.5), run the Creative Suite Cleaner Tool, then reinstall the components you need from both editions. CS4 may have a repair/change configuration mode, but I'd strongly advise against using it, as it will do more damage than good, so use the long way round. It's also the only way not to bust up file associations with an uninstall of CS4...
    Mylenium

  • Best practice for data migration install v1.40 - Error 2732 Directory manag

    Hi
    I'm attempting to install SAP Best Practice for Data migration 1.40 on Win Server 2008 R2 (64 bit).
    Prerequisite error
    Installation program stops with missing file error
    The following file was not found
    ... \migration\InstallationWizard\BusinessObjects Data Services\setup.exe
    The file is necessary for successful installation. Please connect to internet or refer to Quick Guide (available on SAP note 1527151) for information regarding the above file.
    Windows installer log displays
    Error 2732 Directory Manager not initialized
    SAP note 1527151 does not exist or is internal.
    Any help on the root cause of the error would be appreciated, as the file does not exist in that folder in the installation zip file.
    The other prerequisite of .NET 3.5.1 is already met.
    The patch has been released since 20.11.2011, so I presume it is a good installation set.
    Thanks,
    Alan

    Hi Alan,
    There are details on Data Migration v1.40 installations on the SAP website and marketplace. The link below should guide you to the right place; it has a PowerPoint presentation and other useful links as well.
    http://help.sap.com/saap/sap_bp/DMS_V140/DMS_US/html/index.htm
    Arun

  • Best practice for migrating data tables- please comment.

    I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
    Instead of the DBAs just using a tool to migrate the data they are insistent that I save and provide scripts for every single commit, in proper order, necessary to both build the table and insert the data from ground zero.
    I am very unaccustomed to this kind of environment and it seems much riskier for me to try and rebuild the objects from scratch when I already have a perfect, tested, ready model.
    They also require extensive documentation where every step is recorded in a document and use that for the deployment.
    I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
    Please comment on your view of this practice. Thanks!

    >
    Please comment on your view of this practice. Thanks!
    >
    Sounds like the DBAs are using best practices to get the job done. Congratulations to them!
    >
    I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
    Instead of the DBAs just using a tool to migrate the data they are insistent that I save and provide scripts for every single commit, in proper order, necessary to both build the table and insert the data from ground zero.
    >
    The process you describe is what I would expect, and require, in any well-run environment.
    >
    I am very unaccustomed to this kind of environment and it seems much riskier for me to try and rebuild the objects from scratch when I already have a perfect, tested, ready model.
    >
    Nobody cares if it is riskier for you. The production environment is sacred. Any and all risk to it must be reduced to a minimum at all costs. In my opinion a DBA should NEVER move ANYTHING from a development environment directly to a production environment. NEVER.
    Development environments are sandboxes. They are often not backed up. You or anyone else could easily modify tables or data with no controls in place. Anything done in a DEV environment is assumed to be incomplete, unsecure, disposable and unvetted.
    If you are doing development and don't have scripts to rebuild your objects from scratch then you are doing it wrong. You should ALWAYS have your own backup copies of DDL in case anything happens (and it does) to the development environment. By 'have your own' I mean there should be copies in a version control system or central repository where your teammates can get their hands on them if you are not available.
    As for data - I agree with what others have said. Further - ALL data in a dev environment is assumed to be dev data and not production data. In all environments I have worked in, ALL production data must be validated and approved by the business. That means every piece of data in lookup tables, fact tables, dimension tables, etc. Only computed data, such as might exist in a data warehouse system generated by an ETL process, might be exempt; but the process that creates that data is not exempt - that process, and ultimately the data, must be signed off on by the business.
    And the business generally has no access to, or control of, a development environment. That means using a TEST or QA environment for the business users to test and validate.
    >
    They also require extensive documentation where every step is recorded in a document and use that for the deployment.
    I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
    >
    Absolutely! That's how professional deployments are performed. Deployment documents are prepared and submitted for sign-off by each of the affected groups. Those groups can include security, DBA, business users, IT and even legal. The deployment documents always include recovery steps so that if something goes wrong or the deployment can't proceed, there is a documented procedure for restoring the system to a valid working state.
    The deployments themselves that I participate in have representatives from each of those groups in the room or on a conference call as each step of the deployment is performed. Your 5 tables may be used by stored procedures, views or other code that has to be deployed as part of the same process. Each step of the deployment has to be performed in the correct order. If something goes wrong, the responsible party is responsible for assisting in the retry or recovery of their component.
    It is absolutely vital to have a known, secure, repeatable process for deployments. There are no shortcuts. I agree, for a simple 5 new table and small amount of data scenario it may seem like overkill.
    But, despite what you say, it simply cannot be that easy, for one simple reason: adding 5 tables with data to a production system has no business impact or utility at all unless there is some code, process or application somewhere that accesses those tables and data. Your post didn't mention the part about what changes are being made to actually USE what you are adding. A sketch of the "scripts in proper order" idea follows below.
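    To make "scripts in proper order" concrete, here is a minimal sketch of a deployment runner in PowerShell (the script folder, connect string and sqlplus client are assumptions about the environment; each script is expected to contain WHENEVER SQLERROR EXIT FAILURE so errors surface as exit codes):

        # Run numbered deployment scripts in order; stop at the first failure
        $scripts = Get-ChildItem .\deploy\*.sql | Sort-Object Name
        foreach ($s in $scripts) {
            Write-Host "Running $($s.Name)..."
            & sqlplus -S "deployer@prod" "@$($s.FullName)"
            if ($LASTEXITCODE -ne 0) { throw "Failed at $($s.Name); follow the documented recovery steps." }
        }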

  • What are the best practices to migrate VPN users for Inter forest mgration?

    What are the best practices to migrate VPN users for Inter forest mgration?

    It depends on various factors. There is no "generic" solution or best-practice recommendation. Which migration tool are you planning to use?
    Quest (QMM) has a VPN migration solution/tool.
    ADMT - you can develop your own service based solution if required. I believe it was mentioned in my blog post.
    Santhosh Sivarajan | Houston, TX | www.sivarajan.com
    This posting is provided AS IS with no warranties, and confers no rights.

  • SCCM 2012 R2 User State Migration Win XP to Win 8.1 does not migrate Domain User Files

    Hi all,
    I'm trying to migrate Win XP SP3 to Win 8.1 using SCCM 2012 R2, following the how-to from the SCCM team:
    https://blogs.technet.com/b/configmgrteam/archive/2013/09/12/how-to-migrate-user-data-from-win-xp-to-win-8-1-with-system-center-2012-r2-configmgr.aspx
    Everything worked fine, but the files of my test domain user were not restored.
    Here are some extracts from the loadstate log:
    2014-01-07 15:49:16, Info                  [0x000000] User SCCM\test.user maps to S-1-5-21-2486663232-1734351201-1738771205-1113
    2014-01-07 15:49:16, Info                  [0x000000] User TEST-COMPUTER\Administrator maps to TEST-COMPUTER\Administrator
    2014-01-07 15:49:16, Error                 [0x000000] The account TEST-COMPUTER\User is chosen for migration, but the target does not have account TEST-COMPUTER\User. See documentation
    on /lac, /lae, /ui, /ue and /uel options.
    2014-01-07 15:49:16, Info                  [0x000000] Failed.[gle=0x00000006]
    2014-01-07 15:49:16, Info                  [0x000000]   Unable to create a local account because /lac was not specified[gle=0x00000006]
    2014-01-07 15:49:16, Info                  [0x000000] Entering MigCloseCurrentStore method
    2014-01-07 15:49:16, Info                  [0x0801dc] Closing catalog file
    2014-01-07 15:49:16, Info                  [0x0801dd] Deleting catalog file at C:\Windows\Temp\tmpF6E7.tmp\Temp\tmp9F3.tmp
    2014-01-07 15:49:16, Info                  [0x000000] Leaving MigCloseCurrentStore method
    2014-01-07 15:49:16, Info                  [0x000000] USMT Completed at 2014/01/07:15:49:16.078[gle=0x00000057]
    The user SCCM\test.user is my test user, but I cannot see any error relating to that user.
    Does someone have an idea?
    Thank you
    Adrian

    Hi,
    I found a similar article for your reference.
    http://blogs.technet.com/b/sudheesn/archive/2009/12/28/in-place-upgrade-from-windows-xp-to-windows-7.aspx
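    Note that the loadstate error quoted above is fairly self-describing: a local account (TEST-COMPUTER\User) was selected for migration, but it does not exist on the target and /lac was not specified, so loadstate fails before restoring anything, including the domain user's files. Two common fixes, sketched with a placeholder store path:

        # Create (and enable) missing local accounts on the target:
        loadstate C:\MigStore /i:migdocs.xml /i:migapp.xml /lac /lae /c /l:load.log
        # ...or exclude local accounts and migrate only domain users:
        loadstate C:\MigStore /i:migdocs.xml /i:migapp.xml /ue:TEST-COMPUTER\* /c /l:load.log

    In a ConfigMgr task sequence these extra switches can typically be passed via the OSDMigrateAdditionalRestoreOptions variable rather than by editing the step itself.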

  • Best practice for migrating IDOCs?

    Hi,
    I need to migrate some IDOC's to another system for 'historical reference'.
    However, I don't want to move them using the regular setup as I don't want the inbound processing to be triggered.
    The data that was created in the original system by the processed IDOC's will be migrated to the new system using migration workbench. I only need to migrate the IDOC's as-is due to legal requirements.
    What is the best way to do this? I can see three solutions:
    A) Download IDOC table contents to a local file and upload them in the new system. Quick and dirty approach, but it might also be a bit risky.
    B) Use LSMW. However, I'm not sure whether this is feasible for IDOC's.
    C) Using ALE and setting up a custom partner profile where inbound processing only writes the IDOC's to the database. Send the IDOC's from the legacy system to the new system. Using standard functionality in this way seems to me to be the best solution, but I need to make sure that the IDOC's, once migrated, will get the same status as they had in the old system.
    Any help/input will be appreciated
    Regards
    Karl Johan
    PS. For anyone interested in the business case: Within EU the utility market was deregulated a few years ago, so that any customer can buy electricity from any supplier. When a customer switches supplier this is handled via EDI, in SAP using ALE and IDOC's. I'm working on a merger between two utility companies and for legal reasons we need to move the IDOC's. Any other data is migrated using migration workbench for IS-U.

    Hi Daniele
    I am not entirely sure what you are asking. Could you please provide additional information?
    Are you looking for best practice recommendations for Governance, for example: Change transports between DEV, QA and PRD in BPC 7.0?
    What is the best method? Server Manager backup and restore, etc  ?
    And
    Best Practice recommendations on how to upgrade to a different version of BPC, for example: Upgrading from BPC 7.0 to 7.5 or 10.0 ?
    Kind Regards
    Daniel
