Best practice migration

Hi,
I am looking for a link that explains best practice for migrating from SCCM 2007 to SCCM 2012 R2. (For example: is it necessary to configure a discovery method in SCCM 2012 before starting the migration? And after a computer is migrated, does the new client deploy automatically or not?)
Thanks

Hi,
There is a CM 2012 migration guide below that collects a lot of articles and blog posts to help you with the migration process.
http://anoopcnair.com/2012/07/06/sccm-configmgr-2007-to-2012-migration-reference-guide/
Note: Microsoft provides third-party contact information
to help you find technical support. This contact information may change without notice. Microsoft does not guarantee the accuracy of this third-party contact information.
Best Regards,
Joyce

Similar Messages

  • Best practices: migrating from Aperture 2 to Aperture 3

    What are the best practices for moving from Aperture 2 to Aperture 3? One thing I do know from reading the discussion boards is to turn off Faces recognition until everything is working. What else?

    Make sure you do a full backup and rebuild of the Aperture 2 library before you migrate (hold Option-Command when you open Aperture 2).
    Aperture 3 may slow down on you; don't be scared, it will finish. Mine took about 1 1/2 days for roughly 30,000 RAW photos / 800 GB.
    Don't reprocess masters until Aperture is done with everything it needs to do; keep an eye on the Activity window.
    After you reprocess masters it will take a long time to generate thumbnails; again, don't worry.
    Bill Debevc
    sshaphotos.com

  • Best Practice: Migrating transports to Prod (system down etc.)

    Hi all
    This is more of a process and governance question as opposed to a ChaRM question.
    We use ChaRM to migrate transports to Production systems. For example, we have a Minor BAU Release (every 2 weeks), a Minor Initiative Release (every 4 weeks) and a Major Release (every 3 months).
    We realise that some of the major releases may require SAP to be taken offline. But what is SAP Best practice for ANY release into Production? i.e. for our Minor BAU Release we never shut down any Production systems, never stop batch jobs, never lock users etc.
    What does SAP recommend when migrating transports to Prod?
    Thanks
    Shaun

    Have you checked out the "Two Value Releases Per Year" whitepaper for SAP recommendations?  Section 6 is applicable.
    Lifetime Support by SAP » Two Value Releases per Year
    The "real-world" answer is going to depend on how risk-adverse versus downtime adverse your company is.  I think most companies would choose to keep the systems running except when SAP forces an outage or there is a real risk of data corruption (some data conversions and data loads, for example).
    Specific to your minor BAU releases, it may be wise to make a process whereby anything that requires a production shutdown, stopped batch jobs, locked users, etc. needs to be in a different release type. But if you don't have the kind of control, your process will need to allow for these things to happen with those releases.
    Also, with regards to stopping batch jobs in the real world, you always need to balance the desire to take full advantage of the available systems versus the pain of managing the variations.  If your batch schedule is full, how are you going to make sure the critical jobs complete on time when you do need to take the system down?  If it isn't full, why do you need that time?  Can you make sure only non-critical batch jobs run during those times?  Do you have a good method of implementing an alternate batch schedule when need be?
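    To make that last point concrete, here is a tiny illustrative sketch (Python) that flags which scheduled jobs collide with a planned outage window and whether they are critical. The job names, times and outage window are all invented for the example.

        # Illustrative only: flag batch jobs that overlap a planned outage
        # window so you know what must be rescheduled before the shutdown.
        # Job names, times and the outage window are made up.
        from datetime import datetime, timedelta

        jobs = [
            # (name, start, duration, critical)
            ("INVOICE_RUN",  datetime(2014, 6, 7, 22, 0), timedelta(hours=3), True),
            ("LOG_CLEANUP",  datetime(2014, 6, 8, 1, 0),  timedelta(hours=1), False),
            ("PAYROLL_PREP", datetime(2014, 6, 8, 2, 0),  timedelta(hours=2), True),
        ]

        outage_start = datetime(2014, 6, 8, 0, 0)
        outage_end   = datetime(2014, 6, 8, 4, 0)

        for name, start, duration, critical in jobs:
            end = start + duration
            if start < outage_end and end > outage_start:
                tag = "CRITICAL - must reschedule" if critical else "non-critical - can skip"
                print(f"{name}: {start:%a %H:%M}-{end:%H:%M} overlaps the outage ({tag})")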

  • Best practice migrate file server

    I have 3 file servers in two different domains:
    FileS01 and FileS02 in domainA (FileS01 / FileS02: Windows 2003 R2)
    FileS03 in domainB (FileS03: Windows 2003 R2)
    I need to migrate to a new corporate server (Windows 2012) that will also be called FileS01 and will remain in domainA.
    How would you advise me to do it?
    Thanks
    Thanks

    I recommend you consult this guide:
    http://technet.microsoft.com/en-us/library/jj863566.aspx
    Yes, it is a ton of information, but it also covers all angles: what to name the servers, when to rename the servers, using DFSN to make things smoother, the migration tools, migrating local users, etc. Due to the sheer amount of information, ignore the areas that don't apply to your implementation (BranchCache, as an example). HOWEVER, I do recommend you consider their suggestion to move this into a DFS Namespace. That way the next time you need to move to a new server, the users will barely feel it.
    Best of luck. 

  • Best practice - Migrate from WLC4404 to WLC5508

    Hello everyone,
    I would like some pointers on how to migrate from a WLC4404 to a WLC5508. I want to hear your suggestions.
    I could replicate the configuration manually, but there are a lot of configuration menus. If both controllers could be online and I could migrate the APs to the new one until none are left on the old one, that would be great.
    I'm awaiting your replies.
    Best regards

    1.  Upgrade the 4400 to the "highest" firmware, 7.0.235.3.  Make sure the 5500 also has the same firmware (7.0.235.3).
    2.  Copy the config from the 4400 to your TFTP server.
    3.  Copy the config from the TFTP server to the 5500.
    4.  All done. (A scripted sketch of steps 2-3 follows below.)
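    For anyone who wants to script steps 2-3 instead of typing them, below is a hedged sketch that drives the WLC CLI over SSH with Python's paramiko library. The IP addresses, credentials and file names are placeholders, and the "transfer upload" sub-commands should be verified against your controller's CLI reference; note that some AireOS builds re-prompt for the username and password inside the SSH shell.

        # Hedged sketch: send WLC "transfer upload" commands over SSH via
        # paramiko. Hosts, credentials and file names are placeholders.
        import time
        import paramiko

        def run_wlc_commands(host, username, password, commands):
            client = paramiko.SSHClient()
            client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
            client.connect(host, username=username, password=password,
                           look_for_keys=False, allow_agent=False)
            shell = client.invoke_shell()
            for cmd in commands:
                shell.send(cmd + "\n")
                time.sleep(2)  # crude pacing; real code should wait for the prompt
            output = shell.recv(65535).decode("utf-8", errors="replace")
            client.close()
            return output

        upload_cmds = [
            "transfer upload mode tftp",
            "transfer upload datatype config",
            "transfer upload serverip 10.0.0.5",    # placeholder TFTP server
            "transfer upload filename wlc4400.cfg",
            "transfer upload start",
            "y",  # confirm
        ]
        # The download to the 5500 mirrors this with "transfer download ...".
        print(run_wlc_commands("10.0.0.10", "admin", "secret", upload_cmds))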

  • Best practice migration 10.4.11 G4 PPC to Intel server

    We are ready to upgrade our aging server to a new Intel system, and I am wondering about the easiest approach. The current server is not running the universal 10.4 build. I don't believe there is any viable option that doesn't require lots of hands-on rebuilding to move from 10.4 PPC to 10.4 universal. If that is true, I figure the best route might be to upgrade to 10.5.1. Can I take a backup drive and do this on the Intel machine? I suspect it will not go for this, and I will likely have to upgrade on a PPC system and then clone that over to the Intel machine. Will the format of the drive affect this? I am thinking it will, in which case my only solution is the topicdesk tool for migrating email, as the rest will not be a big hassle. Thanks for any thoughts!

    Paul,
    I tested this on a borrowed Intel workstation, so I was only able to run it for about a day or so. In that short time I did not encounter any problems.
    The GUID Partition table is available in Disk Utility (Applications/Utilities). Select the hard disk, then the Partition tab should appear. Select the number of partitions then click options. You will see the GUID as one of the options.
    You can use CCC to clone from a boot drive to another disk. Before I started cloning, I turned off all services and disconnected the network cable.
    It seems like Leopard loads all the files it needs to run on an Intel box.
    As long as the target disk has enough space, it does not have to match the size of the source drive. It's generally best to erase the target disk before cloning.
    One last thing: if you are trying to upgrade a production server, I would use CCC to clone the current 10.4.x setup to a disk image. That way you have something to fall back on if anything goes wrong.
    Let me know how it goes.
    Henry

  • Native Toplink to EclipseLink to JPA - Migration Best Practice

    I am currently looking at the future technical stack of our developments, and would appreciate any advice concerning best-practice migration paths.
    Our current platform is as follows:
    Oracle 10g AS -> Toplink 10g -> Spring 2.5.x
    We have (approx.) 100 separate TopLink Mapping Workbench projects (one per DDD aggregate object, in effect) and therefore 100 repositories (or DAOs), some using TopLink code (e.g. Expression Builder, Class Extractors, etc.) on top of the mappings to support the object-to-RDB (legacy) mismatch.
    Future platform is:
    Oracle 11g AS -> EclipseLink -> Spring 3.x
    Migration issues are as follows:
    Spring 3.x does not provide any Native Toplink ORM support
    Spring 2.5.x requires Toplink 10g to provide Native Toplink ORM support
    My current plan is as follows:
    1. Migrate code and mappings to use EclipseLink (as per http://wiki.eclipse.org/EclipseLink/Examples/MigratingFromOracleTopLink)
    2. Temporarily re-implement the Spring 2.5.x -> TopLink 10g support code to use EclipseLink (e.g. TopLinkDaoSupport etc.) to enable testing of this step.
    3. Refactor all Repositories/DAOs and Support code to use JPA engine (i.e. Entity Manager etc.)
    4. Move to Spring 3.x
    5. Move to 11g (when available!)
    Step 2 is only required to enable testing of the mapping changes, without changing to use the JPA engine.
    Step 3 will only work if my understanding of the following statement is correct (i.e. I can use the JPA engine to run native Toplink mappings and associated code):
    Quote:"Deployment XML files from Oracle TopLink 10.1.3 and above can be read by EclipseLink."
    Specific questions are:
    Is my understanding correct regarding the above?
    Is there any other path to achieve the goal of using 11g, EclipseLink (and Spring 3.x)?
    Is this achievable without refactoring all XML mappings from native to JPA?
    Many thanks for any assistance.
    Marc

    It is possible to use the native/MW TopLink/EclipseLink deployment XML files with JPA in EclipseLink, this is correct. You just need to pass a persistence property (eclipselink.sessions-xml) giving your sessions.xml file location. The native API is also still supported in EclipseLink.
    James : http://www.eclipselink.org

  • Best practice for migrating data tables- please comment.

    I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
    Instead of the DBAs just using a tool to migrate the data, they insist that I save and provide scripts for every single commit, in the proper order, necessary both to build the tables and to insert the data from ground zero.
    I am very unaccustomed to this kind of environment, and it seems much riskier to me to try to rebuild the objects from scratch when I already have a perfect, tested, ready model.
    They also require extensive documentation in which every step is recorded, and they use that document for the deployment.
    I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
    Please comment on your view of this practice. Thanks!

    >
    Please comment on your view of this practice. Thanks!
    >
    Sounds like the DBAs are using best practices to get the job done. Congratulations to them!
    >
    I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
    Instead of the DBAs just using a tool to migrate the data, they insist that I save and provide scripts for every single commit, in the proper order, necessary both to build the tables and to insert the data from ground zero.
    >
    The process you describe is what I would expect, and require, in any well-run environment.
    >
    I am very unaccustomed to this kind of environment, and it seems much riskier to me to try to rebuild the objects from scratch when I already have a perfect, tested, ready model.
    >
    Nobody cares if it is riskier for you. The production environment is sacred. Any and all risk to it must be reduced to a minimum at all costs. In my opinion a DBA should NEVER move ANYTHING from a development environment directly to a production environment. NEVER.
    Development environments are sandboxes. They are often not backed up. You or anyone else could easily modify tables or data with no controls in place. Anything done in a DEV environment is assumed to be incomplete, insecure, disposable and unvetted.
    If you are doing development and don't have scripts to rebuild your objects from scratch, then you are doing it wrong. You should ALWAYS have your own backup copies of the DDL in case anything happens (and it does) to the development environment. By 'have your own' I mean there should be copies in a version control system or central repository where your teammates can get their hands on them if you are not available. (A minimal sketch of such a rebuild-from-scratch runner follows below.)
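    To make the "rebuild from scratch" idea concrete, here is a minimal sketch of a runner that applies versioned SQL scripts in order and halts on the first error. Python's built-in sqlite3 is only a stand-in database, and the file names are invented; a real deployment would target the production RDBMS and follow the signed-off deployment document step by step.

        # Minimal sketch: apply versioned SQL scripts in order, stop on error.
        import sqlite3
        from pathlib import Path

        def deploy(db_path, script_dir):
            conn = sqlite3.connect(db_path)
            try:
                # Scripts named 001_create_tables.sql, 002_seed_data.sql, ...
                # so lexical order is deployment order.
                for script in sorted(Path(script_dir).glob("*.sql")):
                    print(f"applying {script.name}")
                    # executescript raises on the first failing statement,
                    # halting the deployment at that script.
                    conn.executescript(script.read_text())
                print("deployment complete")
            finally:
                conn.close()

        if __name__ == "__main__":
            deploy("prod_copy.db", "release_scripts")  # hypothetical names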
    As for data - I agree with what others have said. Further: ALL data in a dev environment is assumed to be dev data, not production data. In all environments I have worked in, ALL production data must be validated and approved by the business. That means every piece of data in lookup tables, fact tables, dimension tables, etc. Only computed data, such as data in a warehouse generated by an ETL process, might be exempt; but the process that creates that data is not exempt - that process, and ultimately the data, must be signed off on by the business.
    And the business generally has no access to, or control of, a development environment. That means using a TEST or QA environment for the business users to test and validate.
    >
    They also require extensive documentation where every step is recorded in a document and use that for the deployment.
    I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
    >
    Absolutely! That's how professional deployments are performed. Deployment documents are prepared and submitted for sign-off by each of the affected groups. Those groups can include security, DBA, business users, IT and even legal. The deployment documents always include recovery steps, so that if something goes wrong or the deployment can't proceed, there is a documented procedure for how to restore the system to a valid working state.
    The deployments themselves that I participate in have representatives from each of those groups in the room or on a conference call as each step of the deployment is performed. Your 5 tables may be used by stored procedures, views or other code that has to be deployed as part of the same process. Each step of the deployment has to be performed in the correct order. If something goes wrong, the responsible party is responsible for assisting in the retry or recovery of their component.
    It is absolutely vital to have a known, secure, repeatable process for deployments. There are no shortcuts. I agree that for a simple scenario of 5 new tables and a small amount of data it may seem like overkill.
    But despite what you say, it simply cannot be that easy, for one simple reason: adding 5 tables with data to a production system has no business impact or utility at all unless there is some code, process or application somewhere that accesses those tables and data. Your post didn't mention what changes are being made to actually USE what you are adding.

  • What are the best practices to migrate VPN users for an inter-forest migration?

    What are the best practices to migrate VPN users for an inter-forest migration?

    It depends on various factors. There is no "generic" solution or best-practice recommendation. Which migration tool are you planning to use?
    Quest (QMM) has a VPN migration solution/tool.
    ADMT - you can develop your own service-based solution if required. I believe this was mentioned in my blog post.
    Santhosh Sivarajan | Houston, TX | www.sivarajan.com
    ITIL,MCITP,MCTS,MCSE (W2K3/W2K/NT4),MCSA(W2K3/W2K/MSG),Network+,CCNA
    Windows Server 2012 Book - Migrating from 2008 to Windows Server 2012
    This posting is provided AS IS with no warranties, and confers no rights.

  • Best practice for migrating IDOCs?

    Hi,
    I need to migrate some IDocs to another system for 'historical reference'.
    However, I don't want to move them using the regular setup, as I don't want inbound processing to be triggered.
    The data that was created in the original system by the processed IDocs will be migrated to the new system using the migration workbench. I only need to migrate the IDocs as-is, due to legal requirements.
    What is the best way to do this? I can see three solutions:
    A) Download the IDoc table contents to a local file and upload them in the new system. A quick-and-dirty approach, but it might also be a bit risky. (A sketch of the download step follows after this post.)
    B) Use LSMW. However, I'm not sure whether this is feasible for IDocs.
    C) Use ALE and set up a custom partner profile where inbound processing only writes the IDocs to the database, then send the IDocs from the legacy system to the new one. Using standard functionality in this way seems to me to be the best solution, but I need to make sure that the IDocs, once migrated, get the same status they had in the old system.
    Any help/input will be appreciated
    Regards
    Karl Johan
    PS. For anyone interested in the business case: within the EU, the utility market was deregulated a few years ago, so that any customer can buy electricity from any supplier. When a customer switches supplier this is handled via EDI, in SAP using ALE and IDocs. I'm working on a merger between two utility companies, and for legal reasons we need to move the IDocs. All other data is migrated using the migration workbench for IS-U.
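    As a hedged illustration of option A, the sketch below reads IDoc control records (table EDIDC) over RFC with the pyrfc library and writes them to a local file. The connection details and the date filter are placeholders, and a complete migration would also need the data records (EDID4) and status records (EDIDS), plus a vetted upload step on the target side.

        # Hedged sketch of option A: download IDoc control records via RFC.
        # Connection parameters and the filter are placeholders.
        from pyrfc import Connection

        conn = Connection(ashost="legacy-host", sysnr="00", client="100",
                          user="RFC_USER", passwd="secret")

        result = conn.call(
            "RFC_READ_TABLE",
            QUERY_TABLE="EDIDC",                         # IDoc control records
            DELIMITER="|",
            OPTIONS=[{"TEXT": "CREDAT GE '20080101'"}],  # example filter
            ROWCOUNT=1000,
        )

        with open("edidc_export.txt", "w", encoding="utf-8") as f:
            for row in result["DATA"]:
                f.write(row["WA"] + "\n")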

    Hi Daniele
    I am not entirely sure what you are asking. Could you please provide additional information?
    Are you looking for best-practice recommendations for governance, for example change transports between DEV, QA and PRD in BPC 7.0?
    What is the best method - Server Manager backup and restore, etc.?
    And
    Best-practice recommendations on how to upgrade to a different version of BPC, for example upgrading from BPC 7.0 to 7.5 or 10.0?
    Kind Regards
    Daniel

  • Best practice for database migration in 11g

    Hello,
    Database migration is required due to an OS change. I have two database instances, say A and B, on the old server, where RDBMS_VERSION is 11.1.0.7.0. They need to be migrated to a new OS where Oracle has been installed with version 11.2.0.2.0.
    Since all data and objects need to be migrated to the new server, I want to know what the best practice is and how to do it. Thanks in advance for your guidance.
    Thanks and Regards,
    Prosenjit

    Hi Prosenjit,
    you have some options.
    1. RMAN restore: you can restore your database via RMAN to the new host and then upgrade it.
        Please follow the instructions in MOS Note: RMAN Restore of Backups as Part of a Database Upgrade (Doc ID 790559.1)
    2. Data Guard: check MOS Note: Mixed Oracle Version support with Data Guard Redo Transport Services (Doc ID 785347.1)
    3. Full Export / Import (Data Pump) - a scripted sketch follows below.
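    A hedged sketch of option 3, driving a full Data Pump export and import from Python. The connect strings, directory object and file names are placeholders; verify the parameters against the Data Pump documentation for your versions.

        # Hedged sketch: full Data Pump export on the 11.1.0.7 source, full
        # import on the 11.2.0.2 target. All names here are placeholders.
        import subprocess

        def run(cmd):
            print("running:", " ".join(cmd))
            subprocess.run(cmd, check=True)  # raise if the tool exits non-zero

        # On the source host:
        run(["expdp", "system/manager@SRCDB", "full=y",
             "directory=DATA_PUMP_DIR",          # must exist as a DB directory
             "dumpfile=full_mig.dmp", "logfile=expfull_mig.log"])

        # After copying full_mig.dmp to the target host's directory path:
        run(["impdp", "system/manager@TGTDB", "full=y",
             "directory=DATA_PUMP_DIR",
             "dumpfile=full_mig.dmp", "logfile=impfull_mig.log"])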
    Borys

  • Best practice for data migration install v1.40 - Error 2732 Directory Manager

    Hi
    I'm attempting to install SAP Best Practices for Data Migration 1.40 on Windows Server 2008 R2 (64-bit).
    Prerequisite error
    The installation program stops with a missing-file error.
    The following file was not found:
    ... \migration\InstallationWizard\BusinessObjects Data Services\setup.exe
    The file is necessary for successful installation. Please connect to the internet or refer to the Quick Guide (available in SAP Note 1527151) for information regarding the above file.
    The Windows Installer log displays
    Error 2732 Directory Manager not initialized
    SAP Note 1527151 does not exist or is internal.
    Any help appreciated on the root cause of the error, as the file does not exist in that folder in the installation zip file.
    The other prerequisite, .NET 3.5.1, is already met.
    The patch has been available since 20.11.2011, so I presume it is a good installation set.
    Thanks,
    Alan

    Hi Alan,
    There are details on data migration v1.4 installations on the SAP website and marketplace. The link below should guide you to the right place. It has a PowerPoint presentation and other useful links as well.
    http://help.sap.com/saap/sap_bp/DMS_V140/DMS_US/html/index.htm
    Arun

  • Need Best Practice for Migrating from Solaris to Linux

    Hi Team,
    We are migrating our data center from Solaris to Linux. Our EBS 11i database, 10g (10.2.0.5), is 6 TB. Please let us know the best practice for migrating our EBS 11.5.10.2 from Solaris to Linux RHEL 5.
    We require Database 10g (10.2.0.5) on Linux x86-64 RHEL 5 and the EBS application tier on Linux x86 RHEL 5. Please let us know if you need any further details.
    EBS version: 11.5.10.2
    DB version: 10.2.0.5
    We have checked the certifications in Oracle support.
    Oracle EBS 11.5.10.2 is not certified with Linux x86-64 RHEL 5. 
    Oracle EBS 11.5.10.2 is certified on Linux x86 RHEL 5.
    So we require Database 10g (10.2.0.5) on Linux x86-64 RHEL 5 and Application EBS on Linux x86 RHEL 5.
    Thank You.

    You can use transportable tablespaces for the database tier node.
    https://blogs.oracle.com/stevenChan/entry/10gr2_xtts_ebs11i
    https://blogs.oracle.com/stevenChan/entry/call_for_xtts_eap_participants
    For the application tier node, please see:
    https://blogs.oracle.com/stevenChan/entry/migrate_ebs_apptiers_linux
    https://blogs.oracle.com/stevenChan/entry/migrating_oracle_applications_to_new_platforms
    Thanks,
    Hussein

  • Migration Best Practice When Using an Auth Source

    Hi,
    I'm looking for some advice on migration best practices or more specifically, how to choose whether to import/export groups and users or to let the auth source do a sync to bring users and groups into each environment.
    One of our customers is using an LDAP auth source to synchronize users and groups. I'm trying to help them do a migration from a development environment to a test environment. I'd like to export/import security on each object as I migrate it, but does this mean I have to export/import the groups on each object's ACLs before I export/import each object? What about users? I'd like to leave users and groups out of the PTE files and just export/import the auth source and let it run in each environment. But I'm afraid the UUIDs for the newly created groups will be different and they won't match up with object ACLs any more, causing all the objects to lose their security settings.
    If anyone has done this before, any suggestions about best practices and gotchas when using the migration wizard in conjunction with an auth source would be much appreciated.
    Thanks,
    Chris Bucchere
    Bucchere Development Group
    [email protected]
    http://www.bucchere.com

    The best practice here would be to migrate only the auth source through the migration wizard, and then do an LDAP sync on the new system to pull in the users and groups. The migration wizard will then just "do the right thing" in matching up the users and groups on the ACLs of objects between the two systems.
    Users and groups are actually a special case during migration -- they are resolved first by UUID, but if that is not found, then a user with the same auth source UUID and unique auth name is also treated as a match. Since you are importing from the same LDAP auth source, the unique auth name for the user/group should be the same on both systems. The auth source's UUID will also match on the two systems, since you just migrated that over using the migration wizard.
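    A small sketch of the matching rule just described; the field names are invented for illustration and are not the product's real schema.

        # Resolve a principal by its own UUID first; failing that, treat one
        # with the same auth source UUID and unique auth name as a match.
        def find_match(imported, existing_principals):
            for p in existing_principals:
                if p["uuid"] == imported["uuid"]:
                    return p
            for p in existing_principals:
                if (p["auth_source_uuid"] == imported["auth_source_uuid"]
                        and p["unique_auth_name"] == imported["unique_auth_name"]):
                    return p
            return None  # no match: treated as a brand-new user/group

        # The same LDAP user synced on two systems gets different UUIDs, but
        # the migrated auth source UUID and the unique auth name line up.
        existing = [{"uuid": "B2", "auth_source_uuid": "A1",
                     "unique_auth_name": "cn=cbucchere"}]
        incoming = {"uuid": "X9", "auth_source_uuid": "A1",
                    "unique_auth_name": "cn=cbucchere"}
        print(find_match(incoming, existing))  # -> the existing principal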

  • Upcoming SAP Best Practices Data Migration Training - Chicago

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    SAP America, Downers Grove in Chicago, IL:
    November 3-5, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology with SAP BusinessObjects (SBOP) Data Services to load your customer's legacy data into SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1. Offering overview - introduction to the new SAP Best Practices for Data Migration package and the data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2. Data Services fundamentals - architecture, source and target metadata definition; the process of creating batch jobs, validating, tracing, debugging, and data assessment
    3. Installation and configuration of SBOP Data Services - installation and deployment of Data Services and the content from SAP Best Practices; configuration of your target SAP environment and deployment of the Migration Services application
    4. Customer master example - demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application
    5. Overview of data quality within the data migration process - a demonstration of the Data Quality functionality available to partners, using the full Data Services toolset as an extension to the Data Services license
    Logistics & How to Register
    Nov. 3-5: SAP America, Downers Grove, IL
                 Wednesday 10 AM - 5 PM
                 Thursday 9 AM - 5 PM
                 Friday 8 AM - 3 PM
                 Address:
                 SAP America - Buckingham Room
                 3010 Highland Parkway
                 Downers Grove, IL USA 60515
    Partner requirements: all participants must bring their own laptop, on which SAP BusinessObjects Data Services will be installed. Please see the attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    • Data migration consultants and IDoc experts involved in data migration and integration projects
    • Functional experts who perform mapping activities for data migration
    • ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil - SAP Business All-in-One Development
    Frank Densborn - SAP Business All-in-One Development
    To register please use the hyperlink below.
    http://service.sap.com/~sapidb/011000358700000917382010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald
