Best Practice to migrate existing Message/Central Instance to new hardware?
I've done some searching for this, but it all seems to be fairly old (I found one decent thread from 2008, but it just seemed outdated). I would like to get some updated info & suggestions on a process to follow for this task.
Here is some info about the current config:
I have an existing SAP ERP 6.0 EHP5 system. SID "XYZ"
Everything is currently running on Windows x64.
ABAP only config.
It is currently set up as a distributed system.
The DB server is SQL Server, running on its own dedicated hardware. I don't need to touch this.
Uses logon groups to distribute the load between 3 app servers.
One of the app servers is set up as the Message/CI server. This is what I am wanting to move to VM hardware.
The other two app servers are set up just as dialog instances, and are already VMs. I don't need to touch these.
What I am wanting to do:
I am wanting to migrate the existing Message/CI server to VM hardware to get the current Message/CI system off of very old hardware.
I need to keep the system with the same name & IP.
I do not need/want to mess with the DB server at all.
Need to minimize downtime as much as possible.
I do NOT want to go to an HA environment. That is a different project for the future.
Suggestions? I would like to hear what some of you have done.
Minimize downtime? I have a short maintenance window of a few hours coming up in the next few months, and would like to do this changeover during that downtime.
How do I handle setting this up on the new hardware using the same name & IP, especially with the existing system still running?
What I think I need to do: (Some of these may be out of order, which is why I'm asking for suggestions)
Set up a new Windows VM server, named differently from the current Message/CI server (done already).
Run SAPINST to perform Global Host Preparation on the new Windows VM server.
Run SAPINST to set up the ASCS instance on the new Windows VM server.
Skip the Database Instance installation, since I already have one running.
Run SAPINST to set up the Central Instance on the new Windows VM server.
Copy all of the appropriate profile parameters from the old server.
Start SAP and make sure everything is running and working correctly.
Copy all jobs, spool files, etc. from the old server.
What needs to be done in between those steps for the server rename & IP changes?
Which of these steps can I do in advance of my maintenance downtime window, without it affecting the currently running system?
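One approach often suggested for the same-name/same-IP requirement is to build the new instance against a virtual hostname under a temporary address, then swap the name resolution at cutover. Here is a minimal sketch of that idea; the hostname and IPs (sapci01, 10.0.0.15, 10.0.0.115) are hypothetical placeholders, and the repointing is simulated on a scratch file rather than your live DNS:

```shell
# Hypothetical values for illustration; the real names/IPs come from your DNS
OLD_IP=10.0.0.15      # old physical Message/CI server
NEW_IP=10.0.0.115     # new VM (temporary address during the build phase)
VHOST=sapci01         # the hostname users and profiles reference

# Build phase: install against the virtual hostname instead of the machine's
# own name, e.g.  sapinst SAPINST_USE_HOSTNAME=sapci01
# so that the instance profiles carry the name you want to keep.

# Cutover: repoint the name to the new VM. Simulated here on a scratch file
# (hosts.sim) instead of the live DNS zone / hosts entry:
printf '%s %s\n' "$OLD_IP" "$VHOST" > hosts.sim
sed "s/^$OLD_IP /$NEW_IP /" hosts.sim > hosts.sim.new && mv hosts.sim.new hosts.sim
cat hosts.sim   # the virtual hostname now resolves to the new VM
```

If this pattern fits, the installs and profile copy can be done well in advance under the temporary address; only the final repointing (and the IP change on the VM) has to happen inside the maintenance window.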
I have a test environment that I am going to test this with. I'm just trying to get a jump start on my instructions, so that I can hopefully do this quickly and easily. I'm very comfortable doing the installs. Just needing some guidance on how to handle this case.
I'm open to suggestions, and any links you can send my way that show some more recent success at doing this. Everything I keep finding talks about doing the migration for the DB. I am not migrating my DB. It is staying where it is. I'm just wanting to migrate my Message/Central Instance to a VM, without affecting the users, and hopefully them never noticing the changes.
Thanks!
If you're using VMware, there is a tool provided to virtualize existing systems. Basically it makes a copy of the old system (while it is down) and recreates it as a VM with the same name, IP, etc. I haven't double-checked whether this process is 'officially' supported by SAP, and I don't know the technical details of how it's done or how long it will take (VMware is managed by the Network Operations group in my organization, so as the SAP sysadmin I'm basically a customer of theirs). However, we have virtualized a few smaller systems that needed to get off failing hardware in a hurry and it worked very well, so I would expect it would work for an SAP dialog instance as well. As you're not copying a database, just the app server, it probably would fit within your maintenance window.
If this route works for you, then there is no need to actually install the SAP system on the VM instance... you just copy the old server as-is, then make any adjustments required due to no longer needing old drivers, etc, plus potentially having different memory parameters.
Similar Messages
-
What are the best practices to migrate VPN users for Inter forest mgration?
It depends on a various factors. There is no "generic" solution or best practice recommendation. Which migration tool are you planning to use?
Quest (QMM) has a VPN migration solution/tool.
ADMT - you can develop your own service-based solution if required. I believe it was mentioned in my blog post.
Santhosh Sivarajan | Houston, TX | www.sivarajan.com
ITIL,MCITP,MCTS,MCSE (W2K3/W2K/NT4),MCSA(W2K3/W2K/MSG),Network+,CCNA
Windows Server 2012 Book - Migrating from 2008 to Windows Server 2012
This posting is provided AS IS with no warranties, and confers no rights. -
Best practice for migrating IDOCs?
Hi,
I need to migrate some IDocs to another system for 'historical reference'.
However, I don't want to move them using the regular setup, as I don't want the inbound processing to be triggered.
The data that was created in the original system by the processed IDocs will be migrated to the new system using the migration workbench. I only need to migrate the IDocs as-is due to legal requirements.
What is the best way to do this? I can see three solutions:
A) Download the IDoc table contents to a local file and upload them in the new system. Quick and dirty approach, but it might also be a bit risky.
B) Use LSMW. However, I'm not sure whether this is feasible for IDocs.
C) Use ALE and set up a custom partner profile where inbound processing only writes the IDocs to the database. Send the IDocs from the legacy to the new system. Using standard functionality in this way seems to me to be the best solution, but I need to make sure that the IDocs, once migrated, get the same status as they had in the old system.
Any help/input will be appreciated
Regards
Karl Johan
PS. For anyone interested in the business case: within the EU the utility market was deregulated a few years ago, so that any customer can buy electricity from any supplier. When a customer switches supplier this is handled via EDI, in SAP using ALE and IDocs. I'm working on a merger between two utility companies, and for legal reasons we need to move the IDocs. Any other data is migrated using the migration workbench for IS-U. -
Need Best Practice for Migrating from Solaris to Linux
Hi Team,
We are migrating our data center from Solaris to Linux, and our EBS 11i database 10g (10.2.0.5) is 6 TB. Please let us know the best practice to migrate our EBS 11.5.10.2 from Solaris to Linux RHEL 5.
We require Database 10g (10.2.0.5) on Linux x86-64 RHEL 5 and the EBS application tier on Linux x86 RHEL 5. Please let us know the details.
EBS version: 11.5.10.2
DB version: 10.2.0.5
We have checked the certifications in Oracle support.
Oracle EBS 11.5.10.2 is not certified with Linux x86-64 RHEL 5.
Oracle EBS 11.5.10.2 is certified on Linux x86 RHEL 5.
So we require Database 10g (10.2.0.5) on Linux x86-64 RHEL 5 and Application EBS on Linux x86 RHEL 5.
Thank you.
You can use transportable tablespaces for the database tier node:
https://blogs.oracle.com/stevenChan/entry/10gr2_xtts_ebs11i
https://blogs.oracle.com/stevenChan/entry/call_for_xtts_eap_participants
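As a rough sketch of the transportable-tablespace route for the database tier, here is what a Data Pump transport export parameter file might look like; the tablespace names and directory are hypothetical placeholders, and the surrounding steps are only indicated in comments:

```shell
# Hypothetical XTTS export parameter file (object names are placeholders)
cat > xtts_export.par <<'EOF'
directory=DATA_PUMP_DIR
dumpfile=xtts_meta.dmp
logfile=xtts_meta.log
transport_tablespaces=APPS_TS_TX_DATA,APPS_TS_TX_IDX
transport_full_check=y
EOF

# The real sequence would then be roughly:
#   1. Make the tablespaces read-only on the Solaris source
#   2. expdp system parfile=xtts_export.par
#   3. Copy the datafiles to the Linux target; if the source is SPARC
#      (big-endian), convert them with RMAN CONVERT for Linux x86
#   4. impdp system ... transport_datafiles='<converted files>'
cat xtts_export.par
```

Follow the steps in the blog posts above for the EBS-specific details; this fragment only illustrates the shape of the transport export.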
For the application tier node, please see:
https://blogs.oracle.com/stevenChan/entry/migrate_ebs_apptiers_linux
https://blogs.oracle.com/stevenChan/entry/migrating_oracle_applications_to_new_platforms
Thanks,
Hussein -
Best practice for migration to new hardware
Hi,
We are commissioning new hardware for our web server. Our current web server is version 6.1 SP4, and for the new server we've decided to stay with 6.1 but install SP7.
Is there a best practice for migrating content from one physical server to another?
What configuration files should I watch out for?
Hopefully the jump from SP4 to SP7 won't cause too many problems.
Thanks,
John
Unfortunately, there is no quick solution for migrating from one server to another. You will need to carefully reconstruct:
- ACL rules
- server hostname configurations
- any certificates that have been created on the old machine -
Best Practices for migrating .rpd
Hi,
Can someone please share the industry best practices for migrating the .rpd file from a source (QA) to a destination (Prod) environment? Our Oracle BI server runs on a Unix box, and currently we migrate by saving copies of the QA and Prod .rpds on our local machine and doing a Merge. Once merged, we FTP the resulting .rpd to the destination (Prod) machine. This approach does not seem reliable, and we are looking for a better established approach for .rpd migration. Kindly point me to the documentation; any help in this regard would be highly appreciated.
I don't think there is an industry best practice as such. What applies for one customer doesn't necessarily apply at another site.
Have a read of this from Mark Rittman, it should give you some good ideas:
http://www.rittmanmead.com/2009/11/obiee-software-configuration-management-part-2-subsequent-deployments-from-dev-to-prod/
Paul -
How to check version of Best Practice Baseline in existing ECC system?
Hi Expert,
How do I check the version of the Best Practices Baseline in an existing ECC system, such as v1.603 or v1.604?
Any help will be appreciated.
Sayan
Dear,
Please go to https://websmp201.sap-ag.de/bestpractices and click on Baseline Packages; on the right-hand side you will see which SAP Best Practices Baseline package version applies to which release.
If you are on EHP4 then you can use the v1.604.
For "How to Get SAP Best Practices Data Files for Installation" (PDF, 278 KB), please refer to this link:
https://websmp201.sap-ag.de/~sapidb/011000358700000421882008E.pdf
Hope it will help you.
Regards,
R.Brahmankar -
Best Practices for Migrating Previous Versions of Cisco Unified Communications
Does anybody have a copy of the above-referenced presentation that you could send me?
Thanks in advance.
The presentation can be purchased at the following site:
http://www.scribd.com/doc/33211957/BRKVVT-2011-Best-Practices-for-Migrating-Previous-Versions-of-Cisco-Unified-Communications#archive
but I felt I'd ask one of my peeps first.
Thanks in advance.
Dennis
Hi Dennis,
Well... let's give this a try.
Cheers!
Rob -
Best practice for migrating between environments and versions?
Hi to all,
we've got a full suite of solutions custom developed in SAP BPC 7.0 SP7. We'd like to understand if:
- there is a best practice for copying these applications from one environment to another (another client)
- there is a best practice in case the client has a newer version of SAP BPC (they would install 7.5, while we're still stuck on 7.0).
Thank you very much
Daniele
Hi Daniele,
I am not entirely sure what you are asking. Could you please provide additional information?
Are you looking for best practice recommendations for Governance, for example: Change transports between DEV, QA and PRD in BPC 7.0?
What is the best method? Server Manager backup and restore, etc ?
And
Best Practice recommendations on how to upgrade to a different version of BPC, for example: Upgrading from BPC 7.0 to 7.5 or 10.0 ?
Kind Regards
Daniel -
What is the best way to migrate my MacBook Pro to my new iMac desktop? Both seem to be running OS X 10.9.5.
OS X: How to migrate data from another Mac using Mavericks
-
Best Practices to add second BW system/instance to existing one
Hi Experts,
I need to give my client different strategies, best practices, precautions, steps, and how-to documentation: basically, prepare a methodology, considering the effort required (hardware + time), to develop a second instance of a BW system.
We want to create one more BW server, and I need to create a blueprint for that, covering everything from scratch to the end.
Please help me; I am not aware of anything related to this.
Regards and thanks in advance
Gaurav
Hi Arun,
We have migrated to BI 7.0, but since SAP won't be supporting the old BW 3.0, we are thinking of creating a new box where we would rewrite things more efficiently, using transformations/end routines, clean things up, and get everything new.
We also need to think about different geographies, as we have to accommodate different hubs which would eventually go live later.
Or else we have to change the existing system only.
Also, what is the difference between adding different clients/sandbox systems versus a different application server?
I think with a different client you cannot have separate development, but with a separate sandbox you can.
What about the app server? I'm not sure about that.
Can you please explain a little more: what would be the advantages, and what effort (hardware/time) would be required?
Thanks in advance
Gaurav
Message was edited by:
Gaurav -
Best practice for migrating data tables- please comment.
I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
Instead of the DBAs just using a tool to migrate the data they are insistent that I save and provide scripts for every single commit, in proper order, necessary to both build the table and insert the data from ground zero.
I am very unaccustomed to this kind of environment and it seems much riskier for me to try and rebuild the objects from scratch when I already have a perfect, tested, ready model.
They also require extensive documentation where every step is recorded in a document and use that for the deployment.
I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
Please comment on your view of this practice. Thanks!
>
Please comment on your view of this practice. Thanks!
>
Sounds like the DBAs are using best practices to get the job done. Congratulations to them!
>
I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
Instead of the DBAs just using a tool to migrate the data they are insistent that I save and provide scripts for every single commit, in proper order, necessary to both build the table and insert the data from ground zero.
>
The process you describe is what I would expect, and require, in any well-run environment.
>
I am very unaccustomed to this kind of environment and it seems much riskier for me to try and rebuild the objects from scratch when I already have a perfect, tested, ready model.
>
Nobody cares if it is riskier for you. The production environment is sacred. Any and all risk to it must be reduced to a minimum at all costs. In my opinion a DBA should NEVER move ANYTHING from a development environment directly to a production environment. NEVER.
Development environments are sandboxes. They are often not backed up. You or anyone else could easily modify tables or data with no controls in place. Anything done in a DEV environment is assumed to be incomplete, insecure, disposable and unvetted.
If you are doing development and don't have scripts to rebuild your objects from scratch then you are doing it wrong. You should ALWAYS have your own backup copies of DDL in case anything happens (and it does) to the development environment. By 'have your own' I mean there should be copies in a version control system or central repository where your teammates can get their hands on them if you are not available.
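To make that concrete, here is a minimal sketch of the kind of rerunnable, reviewable deployment script being described; the table, columns, and seed rows below are invented purely for illustration:

```shell
# Generate a deployment script of the style the DBAs are asking for:
# explicit DDL plus explicit seed inserts, in order, with a final commit.
# (Object and data names are made up for illustration.)
cat > deploy_001_status_lookup.sql <<'EOF'
-- Step 1: create the table (fails loudly if it already exists)
CREATE TABLE status_lookup (
  status_code  VARCHAR2(10)  PRIMARY KEY,
  description  VARCHAR2(100) NOT NULL
);

-- Step 2: seed data, one explicit insert per row
INSERT INTO status_lookup (status_code, description) VALUES ('NEW', 'Newly created');
INSERT INTO status_lookup (status_code, description) VALUES ('CLOSED', 'Completed and closed');
COMMIT;
EOF
cat deploy_001_status_lookup.sql
```

A script like this lives in version control alongside a matching rollback script (e.g. one containing DROP TABLE status_lookup;), and the deployment document references both by name.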
As for data - I agree with what others have said. Further - ALL data in a dev environment is assumed to be dev data and not production data. In all environments I have worked in ALL production data must be validated and approved by the business. That means every piece of data in lookup tables, fact tables, dimension tables, etc. Only computed data, such as might be in a data warehouse system generated by an ETL process might be exempt; but the process that creates that data is not exempt - that process and ultimately the data - must be signed off on by the business.
And the business generally has no access to, or control of, a development environment. That means using a TEST or QA environment for the business users to test and validate.
>
They also require extensive documentation where every step is recorded in a document and use that for the deployment.
I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
>
Absolutely! That's how professional deployments are performed. Deployment documents are prepared and submitted for sign-off by each of the affected groups. Those groups can include security, DBA, business users, IT and even legal. The deployment documents always include recovery steps, so that if something goes wrong or the deployment can't proceed there is a documented procedure for restoring the system to a valid working state.
The deployments I participate in have representatives from each of those groups in the room or on a conference call as each step of the deployment is performed. Your 5 tables may be used by stored procedures, views or other code that has to be deployed as part of the same process. Each step of the deployment has to be performed in the correct order. If something goes wrong, the responsible party is responsible for assisting in the retry or recovery of their component.
It is absolutely vital to have a known, secure, repeatable process for deployments. There are no shortcuts. I agree, for a simple 5 new table and small amount of data scenario it may seem like overkill.
But, despite what you say it simply cannot be that easy for one simple reason. Adding 5 tables with data to a production system has no business impact or utility at all unless there is some code, process or application somewhere that accesses those tables and data. Your post didn't mention the part about what changes are being made to actually USE what you are adding. -
Upcoming SAP Best Practices Data Migration Training - Chicago
YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
SAP America, Downers Grove in Chicago, IL:
November 3 – 5, 2010
Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Services to load your customer's legacy data to SAP ERP and SAP CRM (New!).
Agenda
At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
1. Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
2. Data Services fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
3. Installation and configuration of the SBOP Data Services – Installation and deployment of the Data Services and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
4. Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
5. Overview of Data Quality within the Data Migration process – A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license.
Logistics & How to Register
Nov. 3 – 5: SAP America, Downers Grove, IL
Wednesday 10 AM – 5 PM
Thursday 9 AM – 5 PM
Friday 8 AM – 3 PM
Address:
SAP America – Buckingham Room
3010 Highland Parkway
Downers Grove, IL USA 60515
Partner Requirements: All participants must bring their own laptop to install SAP Business Objects Data Services on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
Cost: Partner registration is free of charge
Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
• Data migration consultants and IDoc experts involved in data migration and integration projects
• Functional experts who perform mapping activities for data migration
• ABAP developers who write load programs for data migration
Trainers
Oren Shatil – SAP Business All-in-One Development
Frank Densborn – SAP Business All-in-One Development
To register please use the hyperlink below.
http://service.sap.com/~sapidb/011000358700000917382010E
Hello,
The link does not work. Is this training still available?
Regards,
Romuald -
Upcoming SAP Best Practices Data Migration Training - Berlin
YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
Berlin, Germany: October 6 – 8, 2010
Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Integrator to load your customer's legacy data to SAP ERP and SAP CRM (New!).
Agenda
At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
1. Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
2. Data Integrator fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
3. Installation and configuration of the SBOP Data Integrator – Installation and deployment of the Data Integrator and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
4. Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
Logistics & How to Register
October 6 – 8: Berlin, Germany
Wednesday 10 AM – 5 PM
Thursday 9 AM – 5 PM
Friday 9 AM – 4 PM
SAP Deutschland AG & Co. KG
Rosenthaler Strasse 30
D-10178 Berlin, Germany
Training room S5 (1st floor)
Partner Requirements: All participants must bring their own laptop to install SAP Business Objects Data Integrator on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
Cost: Partner registration is free of charge
Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
• Data migration consultants and IDoc experts involved in data migration and integration projects
• Functional experts who perform mapping activities for data migration
• ABAP developers who write load programs for data migration
Trainers
Oren Shatil – SAP Business All-in-One Development
Frank Densborn – SAP Business All-in-One Development
To register please follow the hyperlink below
http://intranet.sap.com/~sapidb/011000358700000940832010E
Hello,
The link does not work. Is this training still available?
Regards,
Romuald -
Best Practice for migration to Exadata2
Hi Guru,
I'm thinking of migrating an Oracle RAC 11g (11.2.0.2) cluster on HP-UX Itanium machines to a new Exadata 2 system.
Are there best practices? Where can I find documentation about the migration?
Thanks very much
Regards
Gio
Edited by: ggiulian on 18-ago-2011 7.39
There are several docs available on MOS:
HP Oracle Exadata Migration Best Practices [ID 760390.1]
Oracle Exadata Best Practices [ID 757552.1]
Oracle Sun Database Machine X2-2/X2-8 Migration Best Practices [ID 1312308.1]
If you already have Exadata, I recommend opening an SR with Oracle and engaging with ACS.
- Wilson
www.michaelwilsondba.info