Migrate BI 7 Routines
Hi
One of my clients intends to rebuild its BW installation from scratch (i.e. to build a BI 7 solution instead of upgrading/migrating the BW 3.5x version to version 7.0).
Our primary concern is the SAP standard routines in MM.
Assume we reinstall Business Content.
Then we copy the DSOs/cubes to customer-specific objects (i.e. 0PUR_C01 to ZPUR_C01).
The SAP Business Content version will have its routines in BW 3.5x form.
Do I have to rewrite them in BI 7?
Please assist.
-Cheers-
Hi John,
I have to comment on the last reply.
Please try to migrate the routines with the standard means. In general, the success rate is pretty high. It might be that you have to make some manual adjustments afterwards.
Please check also the online documentation for assistance in this area.
Some interesting documentation around migration might be:
[DataSources and Migration|http://help.sap.com/saphelp_nw70/helpdata/EN/43/682cacb88f297ee10000000a422035/content.htm]
[Migrating Rules and Routines|http://help.sap.com/saphelp_nw70/helpdata/en/43/f00e2696d24c5fe10000000a155369/content.htm]
Cheers
SAP NetWeaver BW Organisation
Similar Messages
-
Error while Migrating the custom routines in Transformations
Dear All,
I am in the process of migrating BW 3.5 to BI 7.0. I migrated the standard cubes and DSOs from the BW 3.5 flow to the BI 7.0 flow successfully.
But while migrating the transformations that contain custom routines, I am facing the errors below.
The data object "COMM_STRUCTURE" does not have a component called "/BIC/ZGROSSPRI". But the routine contains /BIC/ZGROSSPRI.
I tried to change the BW 3.5 terminology to BI 7.0 terminology (e.g. COMM_STRUCTURE replaced by SOURCE_FIELDS), but was unable to solve it. There are nearly 20 custom routines written across all the transformations.
Can anyone who has faced the same type of problem guide me?
Thanks & Regards,
Dinakar
Hi,
We need to include the source and target fields; see the article below.
http://wiki.sdn.sap.com/wiki/display/profile/Surendra+Reddy
How to Correct Routines in Transformations
http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/g-i/how%20to%20correct%20routines%20in%20transformations.pdf
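As a hedged illustration of that correction (the type names and routine frames below are placeholders - the real signatures are generated by the system and abbreviated here): the 3.5 routine reads the communication structure, while the 7.0 field routine reads SOURCE_FIELDS, and /BIC/ZGROSSPRI must also be assigned as a source field of the rule so that the generated SOURCE_FIELDS structure actually contains it - otherwise you get exactly the error above.

```abap
* BW 3.5 update-rule routine (form-based; signature abbreviated,
* type names are placeholders):
FORM compute_key_figure
  USING    COMM_STRUCTURE TYPE tys_comm
  CHANGING RESULT         TYPE ty_amount.
  RESULT = COMM_STRUCTURE-/BIC/ZGROSSPRI.
ENDFORM.

* BI 7.0 transformation field routine (method of the generated local
* class; signature generated by the system, abbreviated here):
METHOD compute_result_field.
* SOURCE_FIELDS only contains fields assigned as source fields of
* the rule - assign /BIC/ZGROSSPRI there, or this will not activate.
  RESULT = SOURCE_FIELDS-/BIC/ZGROSSPRI.
ENDMETHOD.
```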
Thanks
Reddy -
How to migrate the catt routine?
Hi ,
One of our customers has upgraded their system to ECC 6.0 from R/3 4.7c. Prior to the upgrade, they were using two CATT routines. After the upgrade, these routines no longer work.
I need to migrate the two CATT routines in transaction SCAT from R/3 4.7c to ECC 6.0. Could you please let me know how this is possible, or is it necessary to create new routines from scratch?
Your response on this regard would be highly appreciated.
Thanks & Regards,
John
Hello John
When you call transaction SECATT, choose menu eCATT Object -> Migrate CATT.
That is where you can convert your CATT into an eCATT script, which will presumably need some refinement afterwards.
Regards
Uwe -
Migration Assistant Crashes when importing from ext HD using Time Machine
Hi there,
I'm having issues restoring a hard drive using Migration Assistant to import a Time Machine backup.
I've just reinstalled Mac OS X 10.5.8 onto a Mac Pro 2.66 GHz Quad-Core Intel Xeon with 2GB RAM.
I've done all updates, repaired permissions on the bootable drive and repaired the external disk where the Time Machine backup is stored.
The Time Machine backup is stored on an external hard drive with a USB 2.0 connection.
I can load Migration Assistant and select migrate from Time Machine, but when the import users screen appears, after 40 seconds or so Migration Assistant crashes.
Here is the console report:
Querying receipt database for system packages
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : admin05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : aeroplane05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : apple05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : bike05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : boat05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : car05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : cat05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : dragon05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : duck05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : fish05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : football05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : lily05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : mouse05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : panda05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : rhino05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : shark05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : snowflake05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : starfish05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : sunflower05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : tiger05
01/12/2010 17:07:20 Migration Assistant[123] Long name checked : zebra05
01/12/2010 17:07:20 [0x0-0x11011].com.apple.MigrateAssistant[123] mmap: Cannot allocate memory
01/12/2010 17:07:20 [0x0-0x11011].com.apple.MigrateAssistant[123] vm_allocate failed
01/12/2010 17:07:20 [0x0-0x11011].com.apple.MigrateAssistant[123] vm_allocate failed
01/12/2010 17:07:23 Migration Assistant[123] An uncaught exception was raised
01/12/2010 17:07:23 Migration Assistant[123] connection went invalid while waiting for a reply
01/12/2010 17:07:23 Migration Assistant[123] * Terminating app due to uncaught exception 'NSInvalidReceivePortException', reason: 'connection went invalid while waiting for a reply'
01/12/2010 17:07:23 Migration Assistant[123] Stack: (
2459639787,
2478489147,
2459639243,
2459639306,
2540575982,
2540572201,
2459662410,
2459662514,
240556308,
2540350973,
2540349860,
2462499157,
2462498834
01/12/2010 17:07:24 com.apple.launchd[75] ([0x0-0x11011].com.apple.MigrateAssistant[123]) Exited abnormally: Trace/BPT trap
Has anyone experienced a similar issue?
Many thanks in advance.
saleemmartin wrote:
Hi Pondini,
Thank you for your reply, your help is much appreciated.
The original problem was: on one user account the user's root directory was not writable (i.e. the user couldn't save any work to their documents), so I tried to reset the permissions in the "Get Info" menu, but after that OS X didn't start. I did successfully repair all permissions using Disk Utility (running from the external hard drive OS X install I mentioned in my last post).
Do you think Disk Utility didn't repair all the permissions? In this case, the permissions on the Time Machine backup might still be corrupted.
That doesn't affect user permissions, only permissions on files installed by OSX or Apple apps. And you can't repair permissions on your backups, since that's not a "bootable" volume.
It seems that my Migration Assistant crashes after 40 seconds (or so) no matter which options you select (if any). Even if I leave the app searching for user accounts on the Time Machine backup, it crashes.
If I'm quick I'll be able to deselect all users except admin; if I'm slow I won't get that far. No matter what, it still crashes.
Oh, it hadn't sunk in that it was crashing before you make selections. Now that I re-read your first post, I see you did say that, sorry.
The app never allows me to view installed applications.
No, it won't. You can only select or omit them.
I don't know whether you have a problem with your Mac or the backups.
You mentioned earlier that you'd done a Repair Disk on the TM drive, so if there's a problem with the backups, it's something Disk Utility can't fix.
Have you done a Verify Disk on your internal HD? If not, make sure it's ok.
If so, something may be amiss with the Migration Assistant app. I'd suggest downloading and installing the "combo" update. That's the cleverly-named combination of all the updates to Leopard since it was first released, so installing it should fix anything that's gone wrong since then, such as a problem with one of the normal "point" updates. Info and download available at: http://support.apple.com/downloads/MacOS_X_10_5_8_ComboUpdate Be sure to do a Repair Permissions via Disk Utility (in your Applications/Utilities folder) afterwards. -
Ran Migrate to Mac and can't find my files
I just got a MacBook Pro and am coming over from the PC world. I ran the Migrate to Mac app and copied about 180GB of data from my PC to the MacBook (my data directory was not in the standard Microsoft folders). iTunes found the music files and Finder shows random other files, but the bulk of the data is "missing". I know it was copied because the About This Mac storage graphic shows 52 GB in photos, 89 GB in Other and only 50 GB free.
Without a File Manager, this old PC user is lost. :-(
How do I find the files and pics?
UPDATE:
After trying the suggestions above with no luck I decided to call Apple Support. After 50 minutes on the phone with them the solution was to wipe the hard drive and start over from the factory settings, then redo the migration. Even though we could search for and find specific files, he couldn’t actually locate them on the hard drive to find the other files that were likely saved in the same place.
His explanation was that the Migration to Mac routine doesn’t always understand layered directory structures found on PCs. What the Migration Assistant is good at is taking files/docs from standard Windows folders (My Documents, Pictures, Music) and putting them in the same folders on the MacBook. If you have data elsewhere, like in a c:\data directory with a bunch of subfolders like in my case, the Migration wizard can get confused and not know where to put them.
So the advice for those that have data outside the Windows standard folders, assuming you have to start over like me, is take all the docs/pics/files you want to migrate over and put them in the Windows folders, then run the migration. -
Hi All,
I am working on BI 7 and SRM standard content. I observe that the hierarchy DataSources are in the 3.x version and the classic flow is not coming up after activation.
I have tried to migrate this DataSource to 7.x with the export option but am getting the error below.
" DataSource 0BBP_BIGURE_HIER(SRDCLNT200): Data type H not supported by object type RSDS "
If someone has had this problem, please help with the resolution.
Kind Regards
Niren
Hi,
We are currently working on BI Content 7.0. After transferring, we got the 3.5 flow for most of it. Now we are looking at a migration of the 3.x flow to the 7.x flow. It would be really helpful if any of you could answer my following queries:
1. How do we migrate queries/workbooks?
2. How much effort would it be to migrate transformations that have routines in them as part of standard content? And is it just a copy-paste of the ABAP routines, or do we need to change them to object-oriented ABAP?
3. We have around 100 InfoProviders and 300 master objects. Will it be OK if we come up with a strategy of migrating the InfoProviders alone while still keeping the 3.x flow for the master InfoObjects?
Thanks and Regards
Niren. -
Used Backup to Restart and Can't Access my files
Hello! At the suggestion of the Genius Bar, I used the Seagate to backup my computer then I did a hard restart. The issue started with my system moving really slowly and slowly locking me out of all of my files. The people at the genius bar restarted my system, backing everything up to the Seagate thing and then having me start over again. I went through the whole process of restarting the system and now I can't find any of my files. Any suggestions on how to retrieve my old stuff please?
-
BI 7.0 Mapping -- Urgent
Hi,
I am new to BI 7.0.
I migrated 3.5 to 7.0 for all DataSources.
But the mapping was not done for all fields.
So I mapped all fields manually with the help of SAP Help and the 3.5 transfer structure.
But I cannot find most of the fields for mapping, because the transfer rules are between the DS and the InfoSource (the DS fields are different and the InfoCube fields are different).
My DS: 2LIS_02_HDR and 2LIS_02_ITM
My InfoCube: 0PUR_C01.
And there are a lot of routines in the update rules in 3.5; how can I transfer these routines as well?
I have read a lot of forums about migrating.
I did the following steps:
1. In the DataSource context menu -> Migrate.
So all transfer rules were deleted automatically.
2. From the DS I created transformations to the InfoCube.
Please tell me, did I miss any steps?
Or can I recover my DS and go back to the 3.5 procedure?
Please tell me the way.
It is very, very urgent.
I will assign points.
regards
PSR
You should have migrated the transfer/update rules before migrating the DataSource; this way the routines would also have been migrated.
However, the routines will have to be adapted to OO ABAP accordingly; you have to make minor changes in the coding.
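To illustrate the kind of minor change involved (a hedged sketch, not the generated code itself: the method frame and package table are generated by the system, and the recordmode filter is just an example that assumes 0RECORDMODE is in the source structure), a 3.5 start routine over the header-line table DATA_PACKAGE becomes a method over SOURCE_PACKAGE:

```abap
* BW 3.5 start routine (form-based): header-line table DATA_PACKAGE
LOOP AT DATA_PACKAGE.
  IF DATA_PACKAGE-recordmode = 'D'.   " drop deletion records (example)
    DELETE DATA_PACKAGE.
  ENDIF.
ENDLOOP.

* BI 7.0 start routine: a method of the generated local class; the
* package table is now SOURCE_PACKAGE and has no header line.
METHOD start_routine.
  DELETE SOURCE_PACKAGE WHERE recordmode = 'D'.
ENDMETHOD.
```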
You can recover the 3.x DataSource in transaction RSDS: enter the DataSource, then choose menu DataSource -> Recover 3.x DataSource.
After recovering you may have to reinstall the UR & TR to get back to the 3.5 flow. -
hi all,
need information on data migration and possible methods of LSMW
Thanks
Swapna
Hi,
You can do a search on the topics "Data Migration" and "LSMW" in the forum for valuable information.
<b>Please do NOT post your mail ID; let's all follow some rules</b>
Data Migration Life Cycle
Overview : Data Migration Life Cycle
Data Migration
This document aims to outline the typical processes involved in a data migration.
Data migration is the moving and transforming of data from legacy to target database systems. This includes one to one and one to many mapping and movement of static and transactional data. Migration also relates to the physical extraction and transmission of data between legacy and target hardware platforms.
ISO 9001 / TickIT accredited
The fundamental aims of certification are quality achievement and improvement and the delivery of customer satisfaction.
The ISO and TickIT Standards are adhered to throughout all stages of the migration process.
Customer Requirements
Dependencies
Analysis
Iterations
Data Cleanse
Post Implementation
Proposal
Project Management
Development
Quality Assurance
Implementation
Customer Requirements
The first stage is the contact from the customer asking us to tender for a data migration project. The invitation to tender will typically include the Scope / Requirements and Business Rules:
- Legacy and Target - Databases / Hardware / Software
- Timeframes - Start and Finish
- Milestones
- Location
- Data Volumes
Dependencies
Environmental Dependencies
- Connectivity - remote or on-site
- Development and Testing Infrastructure - hardware, software, databases, applications and desktop configuration
Support Dependencies
- Training (legacy & target applications) - particularly for an in-house test team
- Business Analysts - provide expert knowledge on both legacy and target systems
- Operations - Hardware / Software / Database Analysts - facilitate system housekeeping when necessary
- Business Contacts
- User Acceptance Testers - chosen by the business
- Business Support for data cleanse
Data Dependencies
- Translation Tables - translate legacy parameters to target parameters
- Static Data / Parameters / Seed Data (target parameters)
- Business Rules - migration selection criteria (e.g. number of months of history)
- Entity Relationship Diagrams / Transfer Dataset / Schemas (legacy & target)
- Sign Off / User Acceptance criteria - within agreed tolerance limits
- Data Dictionary
Analysis
Gap Analysis
Identifying where differences in the functionality of the target and legacy systems mean that data may be left behind, or alternatively generating default data for the new system where nothing comparable exists on legacy.
Liaison with the business is vital in this phase, as mission-critical data cannot be allowed to be left behind; it is usual to consult the relevant business process leader or Subject Matter Expert (SME). Often this process ends up as a compromise between:
- Pulling the necessary data out of the legacy system to meet the new system's functionality
- Pushing certain data into the new system from legacy to enable certain ad hoc or custom in-house processes to continue
Data mapping
This is the process of mapping data from the legacy to target database schemas taking into account any reformatting needed. This would normally include the derivation of translation tables used to transform parametric data. It may be the case at this point that the seed data, or static data, for the new system needs generating and here again tight integration and consultation with the business is a must.
Translation Tables
Mapping Legacy Parameters to Target Parameters
Specifications
These designs are produced to enable the developer to create the Extract, Transform and Load (ETL) modules. The output from the gap analysis and data mapping is used to drive the design process. Any constraints imposed by platforms, operating systems, programming languages, timescales, etc. should be referenced at this stage, as should any dependencies that this module has on other such modules in the migration as a whole; failure to do this may result in the specifications being flawed.
There are generally two forms of migration specification: Functional (e.g. premise migration strategy) and Detailed Design (e.g. premise data mapping document).
Built into the migration process at the specification level are steps to reconcile the migrated data at predetermined points during the migration. These checks verify that no data has been lost or gained during each step of an iteration and enable any anomalies to be spotted early and their cause ascertained with minimal loss of time.
Usually written independently from the migration, the specifications for the reconciliation programs used to validate the end-to-end migration process are designed once the target data has been mapped and is more or less static. These routines count like-for-like entities on the legacy system and target system and ensure that the correct volumes of data from legacy have migrated successfully to the target and thus build business confidence.
Iterations
These are the execution of the migration process, which may or may not include new cuts of legacy data.
These facilitate:
- Collation of migration process timings (extraction, transmission, transformation and load)
- The refinement of the migration code, i.e. increasing data volume and decreasing exceptions through:
  - Continual identification of data cleanse issues
  - Confirmation of parameter settings and parameter translations
  - Identification of any migration merge issues
- Reconciliation
From our experience the majority of the data will conform to the migration rules and as such take a minimal effort to migrate ("80/20 rule"). The remaining data, however, is often highly complex with many anomalies and deviations and so will take up the majority of the development time.
Data Cuts
- Extracts of data taken from the legacy and target systems. This can be a complex task where the migration is from multiple legacy systems, and it is important that the data is synchronised across all systems at the time the cuts are taken (e.g. end-of-day processes complete).
- Subsets / selective cuts - depending upon business rules and migration strategy, the extracted data may need to be split before transfer.
Freeze
Prior to any iteration, Parameters, Translation Tables and Code should be frozen to provide a stable platform for the iteration.
Data Cleanse
This activity is required to ensure that legacy system data conforms to the rules of data migration. The activities include manual or automatic updates to legacy data. This is an ongoing activity, as while the legacy systems are active there is the potential to reintroduce data cleanse issues.
Identified by
Data Mapping
Eyeballing
Reconciliation
File Integrities
Common Areas
- Address Formats
- Titles (e.g. mrs, Mrs, MRS, first name)
- Invalid characters
- Duplicate Data
- Free format to parameter field
Cleansing Strategy
- Legacy - pre-migration
- During migration (not advised, as this makes reconciliation very difficult)
- Target - post-migration (either manual or via data fix)
- Ad Hoc Reporting - ongoing
Post Implementation
Support
For an agreed period after implementation certain key members of the migration team will be available to the business to support them in the first stages of using the new system. Typically this will involve analysis of any irregularities that may have arisen through dirty data or otherwise and where necessary writing data fixes for them.
Post Implementation fixes
Post Implementation Data Fixes are programs that are executed post migration to fix data that was either migrated in an 'unclean' state or migrated with known errors. These will typically take the form of SQL scripts.
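For illustration only, such a fix is often just a targeted update plus a count of affected rows. The sketch below uses an invented ABAP report, table and values, since the document names none (plain database SQL scripts are equally common):

```abap
* Hypothetical post-implementation data fix: normalise a title field
* that was migrated in mixed case. Report, table and values invented.
REPORT z_post_mig_fix_titles.

UPDATE ztarget_customer
   SET title = 'Mrs'
 WHERE title = 'mrs' OR title = 'MRS'.

WRITE: / 'Rows corrected:', sy-dbcnt.
COMMIT WORK.
```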
Proposal
This is a response to the invitation to tender, which comprises the following:
Migration Strategy
- Migration development models are based on an iterative approach.
- Multiple Legacy / Targets - any migration may transform data from one or more legacy databases to one or more targets
- Scope - Redwood definition / understanding of customer requirements, inclusions and exclusions
The data may be migrated in several ways, depending on data volumes and timescales:
- All at once (big bang)
- In logical blocks (chunking, e.g. by franchise)
- Pilot - a pre-test or trial run for the purpose of proving the migration process, live applications and business processes before implementing on a larger scale
- Catch Up - to minimise downtime, only business-critical data is migrated, leaving historical data to be migrated at a later stage
- Post Migration / Parallel Runs - both pre- and post-migration systems remain active and are compared after a period of time to ensure the new systems are working as expected
Milestones can include:
- Completion of specifications / mappings
- Successful 1st iteration
- Completion of an agreed number of iterations
- Delivery to User Acceptance Testing team
- Successful Dress Rehearsal
- Go Live
Roles and Responsibilities
Data Migration Project Manager/Team Lead is responsible for:
- Redwood Systems Limited project management
- Change Control
- Solution Design
- Quality
- Reporting
- Issues Management
Data Migration Analyst is responsible for:
- Gap Analysis
- Data Analysis & Mapping
- Data migration program specifications
- Extraction software design
- Exception reporting software design
Data Migration Developers are responsible for:
- Migration
- Integrity
- Reconciliation (note these are independently developed)
- Migration Execution and Control
Testers/Quality Assurance team is responsible for:
- Test approach
- Test scripts
- Test cases
- Integrity software design
- Reconciliation software design
Other Roles:
Operational and Database Administration support for source/target systems.
Parameter Definition and Parameter Translation team
Legacy system Business Analysts
Target system Business Analysts
Data Cleansing Team
Testing Team
Project Management
Project Plan
- Milestones and Timescales
- Resources
- Individual Roles and Responsibilities
- Contingency
Communication
It is important to have good communication channels with the project manager and business analysts. Important considerations include the need to agree the location, method and format of regular meetings/contact to discuss progress and resources and to communicate any problems or incidents which may impact the ability of others to perform their duties. These could take the form of weekly conference calls, progress reports or attendance at on-site project meetings.
Change Control
- Scope Change Requests - a stringent change control mechanism needs to be in place to handle any deviations and creeping scope from the original project requirements.
- Version Control - all documents and code shall be version controlled.
Issue Management
- Internal issue management - as a result of gap analysis, data mapping and iteration output (i.e. reconciliation and file integrity, or as a result of eyeballing)
- External issue management - load-to-target problems and the results of User Acceptance Testing
- Mechanism - examples:
  - Test Director
  - Bugzilla
  - Excel
  - Access
  - TracNotes
Development
Extracts / Loads
- Depending on the migration strategy, extract routines shall be written to derive the legacy data required
- Transfer data from Legacy and/or Target to the interim migration environment via FTP, Tape, CSV, D/B object copy, ODBC or API
- Transfer data from the interim migration environment to the target
Migration (transform)
There are a number of potential approaches to a Data Migration:
- Use a middleware tool (e.g. ETI, Powermart). This extracts data from the legacy system, manipulates it and pushes it to the target system. These "4th Generation" approaches are less flexible and often less efficient than bespoke coding, resulting in longer migrations and less control over the data migrated.
- The Data Migration processes are individually coded to be run on a source, interim or target platform. The data is extracted from the legacy platform to the interim/target platform, where the code is used to manipulate the legacy data into the target system format. The great advantage of this approach is that it can encompass any migration manipulation that may be required in the most efficient, effective way and retain the utmost control. Where critical/sensitive data is migrated, this approach is desirable.
- Use a target system 'File Load Utility', if one exists. This usually requires the use of one of the above processes to populate a pre-defined target database. A load-and-validate facility will then push valid data to the target system.
- Use an application's data conversion/upgrade facility, where available.
Reconciliation
Independent end-to-end comparisons of data content create the necessary level of business confidence.
- Bespoke code is written to extract the required total figures for each of the areas from the legacy, interim and target databases. These figures are totalled and broken down into business areas and segments of interest so that they can be compared to each other. Where differences do occur, investigation will then tell us whether to alter the migration code or whether there are reasonable mitigating factors.
- Spreadsheets are created to report figures to all levels of management to verify that the process is working and build confidence in the process.
Referential File Integrities
Depending on the constraints of the interim/target database, data may be checked to ascertain and validate its quality. There may be certain categories of dirty data that should be disallowed, e.g. duplicate data, null values, data that does not match a parameter table, or an incompatible combination of data in separate fields as prescribed by the analyst. Scripts are written that run automatically after each iteration of the migration. A report is then generated to itemise the non-compatible data.
Quality Assurance
Reconciliation
- Horizontal reconciliation (number on legacy = number on interim = number on target) and vertical reconciliation (categorisation counts (e.g. address counts by region = total addresses) and across systems).
- Figures at all stages (legacy, interim, target) to provide checkpoints.
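A minimal sketch of the horizontal check, with invented table names (the real reconciliation routines would also break the counts down by business segment for the vertical check):

```abap
* Hypothetical horizontal reconciliation: row counts must match at
* every checkpoint. All table names are invented for illustration.
REPORT z_mig_reconcile.

DATA: lv_legacy  TYPE i,
      lv_interim TYPE i,
      lv_target  TYPE i.

SELECT COUNT(*) FROM zlegacy_addr  INTO lv_legacy.
SELECT COUNT(*) FROM zinterim_addr INTO lv_interim.
SELECT COUNT(*) FROM ztarget_addr  INTO lv_target.

WRITE: / 'Legacy :', lv_legacy,
       / 'Interim:', lv_interim,
       / 'Target :', lv_target.

IF lv_legacy <> lv_interim OR lv_interim <> lv_target.
  WRITE: / 'Horizontal reconciliation FAILED - investigate.'.
ENDIF.
```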
File Integrities
Scripts that identify and report the following for each table:
- Referential Integrity - check values against target master and parameter files.
- Data Constraints
- Duplicate Data
Translation Table Validation
Run after new cut of data or new version of translation tables, two stages:
- Verifies that all legacy data is accounted for in the "From" translation
- Verifies that all "To" translations exist in the target parameter data
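The two stages could be sketched as follows (report, table and field names are all invented for illustration):

```abap
* Hypothetical translation-table validation. Invented tables:
* zlegacy_data (legacy rows), ztrans (from_val/to_val pairs),
* ztarget_param (valid target parameters).
REPORT z_trans_validate.

DATA: lt_no_from TYPE TABLE OF zlegacy_data,
      lt_no_to   TYPE TABLE OF ztrans.

* Stage 1: legacy values with no "From" translation
SELECT * FROM zlegacy_data INTO TABLE lt_no_from
  WHERE NOT EXISTS ( SELECT * FROM ztrans
                       WHERE from_val = zlegacy_data~status ).

* Stage 2: "To" translations with no matching target parameter
SELECT * FROM ztrans INTO TABLE lt_no_to
  WHERE NOT EXISTS ( SELECT * FROM ztarget_param
                       WHERE param = ztrans~to_val ).

* Any rows left in either table are exceptions to report.
WRITE: / 'Missing "From" translations:', sy-tfill.
```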
Eyeballing
Comparison of legacy and target applications
- Scenario Testing - legacy-to-target verification that data has been migrated correctly for certain customers, chosen by the business, whose circumstances fall into particular categories (e.g. inclusion and exclusion Business Rule categories, data volumes, etc.)
- Regression Testing - testing known problem areas
- Spot Testing - a random spot check on migrated data
- Independent Team - the eyeballing is generally carried out by a dedicated testing team rather than the migration team
UAT
This is the customer-based User Acceptance Test of the migrated data, which will form part of the customer sign-off.
Implementation
Freeze
A code and parameter freeze occurs in the run-up to the dress rehearsal. Any problems arising post-freeze are handled as post-freeze fixes.
Dress Rehearsal
Dress rehearsals are intended to mobilise the resources that will be required to support a cutover in the production environment. The primary aim of a dress rehearsal is to identify the risks and issues associated with the implementation plan. It executes all the steps necessary for a successful 'go live' migration.
Through the execution of a dress rehearsal, all the go-live checkpoints will be properly managed and executed and, if required, the appropriate escalation routes taken.
Go Live window (typical migration)
- Legacy system 'end of business day' closedown
- Legacy system data extractions
- Legacy system data transmissions
- Readiness checks
- Migration Execution
- Reconciliation
- Integrity checking
- Transfer load to Target
- User Acceptance testing
- Reconciliation
- Acceptance and Go Live
===================
LSMW: Refer to the links below for useful info (screenshots for the various LSMW methods)
Step-By-Step Guide for LSMW using ALE/IDOC Method (Screen Shots)
http://www.****************/Tutorials/LSMW/IDocMethod/IDocMethod1.htm
Using Bapi in LSMW (Screen Shots)
http://www.****************/Tutorials/LSMW/BAPIinLSMW/BL1.htm
Uploading Material Master data using BAPI method in LSMW (Screen Shots)
http://www.****************/Tutorials/LSMW/MMBAPI/Page1.htm
Step-by-Step Guide for using LSMW to Update Customer Master Records(Screen Shots)
http://www.****************/Tutorials/LSMW/Recording/Recording.htm
Uploading Material master data using recording method of LSMW(Screen Shots)
http://www.****************/Tutorials/LSMW/MMRecording/Page1.htm
Step-by-Step Guide for using LSMW to Update Customer Master Records(Screen Shots) Batch Input method
Uploading Material master data using Direct input method
http://www.****************/Tutorials/LSMW/MMDIM/page1.htm
Steps to copy LSMW from one client to another
http://www.****************/Tutorials/LSMW/CopyLSMW/CL.htm
Modifying BAPI to fit custom requirements in LSMW
http://www.****************/Tutorials/LSMW/BAPIModify/Main.htm
Using Routines and exception handling in LSMW
http://www.****************/Tutorials/LSMW/Routines/Page1.htm
Reward if useful.
Thanks & regards
Naren -
Error in Routine while migrating standard Transformations from 3.5 to BI7.0
Hi Experts,
We are migrating the standard transformations from the old version to the new BI 7.0 version. When trying to create the new transformation we get a routine error and are unable to activate the transformation.
Transformation Name: TRCS ZCO_OM_NAE_1 -> CUBE 0PS_C08
Routine Desc.: Conversion of Actual / Commitment / Plan to Resid.Order Plan
Source Fields: 0CURRENCY & 0FISCPER
Target Fields: 0AMOUNT & 0CURRENCY
Error Message: E: Field "COMM_STRUCTURE" is unknown. It is neither in one of the specified tables nor defined by a "DATA" statement.
Routine:
PROGRAM trans_routine.
*---------------------------------------------------------------------*
*       CLASS routine DEFINITION
*---------------------------------------------------------------------*
CLASS lcl_transform DEFINITION.
  PUBLIC SECTION.
*   Attributes
    DATA:
      p_check_master_data_exist
        TYPE RSODSOCHECKONLY READ-ONLY,
*-    Instance for getting request runtime attributes;
*     available information: refer to methods of
*     interface 'if_rsbk_request_admintab_view'
      p_r_request
        TYPE REF TO if_rsbk_request_admintab_view READ-ONLY.
  PRIVATE SECTION.
    TYPE-POOLS: rsd, rstr.
*   Rule specific types
*$*$ begin of global - insert your declaration only below this line  *-*
    ... "insert your code here
*$*$ end of global - insert your declaration only before this line   *-*
ENDCLASS.                    "routine DEFINITION
*$*$ begin of 2nd part global - insert your code only below this line  *
* end of rule type
TYPES:
  BEGIN OF tys_TG_1_full,
*   InfoObject: 0CHNGID Change Run ID.
    CHNGID       TYPE /BI0/OICHNGID,
*   InfoObject: 0RECORDTP Record type.
    RECORDTP     TYPE /BI0/OIRECORDTP,
*   InfoObject: 0REQUID Request ID.
    REQUID       TYPE /BI0/OIREQUID,
*   InfoObject: 0FISCVARNT Fiscal year variant.
    FISCVARNT    TYPE /BI0/OIFISCVARNT,
*   InfoObject: 0FISCYEAR Fiscal year.
    FISCYEAR     TYPE /BI0/OIFISCYEAR,
*   InfoObject: 0CURRENCY Currency key.
    CURRENCY     TYPE /BI0/OICURRENCY,
*   InfoObject: 0CO_AREA Controlling area.
    CO_AREA      TYPE /BI0/OICO_AREA,
*   InfoObject: 0CURTYPE Currency Type.
    CURTYPE      TYPE /BI0/OICURTYPE,
*   InfoObject: 0METYPE Key Figure Type.
    METYPE       TYPE /BI0/OIMETYPE,
*   InfoObject: 0VALUATION Valuation View.
    VALUATION    TYPE /BI0/OIVALUATION,
*   InfoObject: 0VERSION Version.
    VERSION      TYPE /BI0/OIVERSION,
*   InfoObject: 0VTYPE Value Type for Reporting.
    VTYPE        TYPE /BI0/OIVTYPE,
*   InfoObject: 0WBS_ELEMT Work Breakdown Structure Element (WBS Element).
    WBS_ELEMT    TYPE /BI0/OIWBS_ELEMT,
*   InfoObject: 0COORDER Order Number.
    COORDER      TYPE /BI0/OICOORDER,
*   InfoObject: 0PROJECT Project Definition.
    PROJECT      TYPE /BI0/OIPROJECT,
*   InfoObject: 0ACTIVITY Network Activity.
    ACTIVITY     TYPE /BI0/OIACTIVITY,
*   InfoObject: 0NETWORK Network.
    NETWORK      TYPE /BI0/OINETWORK,
*   InfoObject: 0PROFIT_CTR Profit Center.
    PROFIT_CTR   TYPE /BI0/OIPROFIT_CTR,
*   InfoObject: 0COMP_CODE Company code.
    COMP_CODE    TYPE /BI0/OICOMP_CODE,
*   InfoObject: 0BUS_AREA Business area.
    BUS_AREA     TYPE /BI0/OIBUS_AREA,
*   InfoObject: 0ACTY_ELEMT Network Activity Element.
    ACTY_ELEMT   TYPE /BI0/OIACTY_ELEMT,
*   InfoObject: 0STATUSSYS0 System Status.
    STATUSSYS0   TYPE /BI0/OISTATUSSYS0,
*   InfoObject: 0PS_OBJ PS Object Type.
    PS_OBJ       TYPE /BI0/OIPS_OBJ,
*   InfoObject: 0VTSTAT Statistics indicator for value type.
    VTSTAT       TYPE /BI0/OIVTSTAT,
*   InfoObject: 0AMOUNT Amount.
    AMOUNT       TYPE /BI0/OIAMOUNT,
*   Field: RECORD Data record number.
    RECORD       TYPE RSARECORD,
  END OF tys_TG_1_full.

* additional declaration for update rule interface
DATA:
  MONITOR       TYPE STANDARD TABLE OF rsmonitor WITH HEADER LINE,
  MONITOR_RECNO TYPE STANDARD TABLE OF rsmonitors WITH HEADER LINE,
  RECORD_NO     LIKE sy-tabix,
  RECORD_ALL    LIKE sy-tabix,
  SOURCE_SYSTEM LIKE rsupdsimulh-logsys.

* global definitions from update rules
TABLES: ...
DATA: ...
FORM routine_0001
  CHANGING
    RETURNCODE LIKE sy-subrc
    ABORT      LIKE sy-subrc
  RAISING
    cx_sy_arithmetic_error
    cx_sy_conversion_error.

* init variables
* not supported
* icube_values = g.
  CLEAR result_table. REFRESH result_table.

  TYPE-POOLS: psbw1.
  DATA: l_psbw1_type_s_int1 TYPE psbw1_type_s_int1.
  DATA: lt_spread_values TYPE PSBW1_TYPE_T_ACT_SPREAD.
  field-symbols: .

* fill return table!
  move-corresponding to RESULT_TABLE.
  check not RESULT_TABLE-amount is initial.
  append RESULT_TABLE.
  endloop.

* if the returncode is not equal zero, the result will not be updated
  RETURNCODE = 0.
* if abort is not equal zero, the update process will be canceled
  ABORT = 0.
ENDFORM.                    "routine_0001
*$*$ end of 2nd part global - insert your code only before this line   *
*---------------------------------------------------------------------*
*       CLASS routine IMPLEMENTATION
*---------------------------------------------------------------------*
CLASS lcl_transform IMPLEMENTATION.

  METHOD compute_0AMOUNT.
*$*$ begin of routine - insert your code only below this line        *-*
    DATA:
      l_subrc          TYPE sy-tabix,
      l_abort          TYPE sy-tabix,
      ls_monitor       TYPE rsmonitor,
      ls_monitor_recno TYPE rsmonitors.

    REFRESH:
      MONITOR.

*   runtime attributes
    SOURCE_SYSTEM = p_r_request->get_logsys( ).

*   migrated update rule call
    PERFORM routine_0001
      CHANGING
        l_subrc
        l_abort.

*-- convert messages in transformation format
    LOOP AT MONITOR INTO ls_monitor.
      MOVE-CORRESPONDING ls_monitor TO MONITOR_REC.
      APPEND monitor_rec TO MONITOR.
    ENDLOOP.

    IF l_subrc <> 0.
      RAISE EXCEPTION TYPE cx_rsrout_skip_val.
    ENDIF.
    IF l_abort <> 0.
      RAISE EXCEPTION TYPE CX_RSROUT_ABORT.
    ENDIF.
*$*$ end of routine - insert your code only before this line         *-*
  ENDMETHOD.                    "compute_0AMOUNT
*---------------------------------------------------------------------*
*       Method invert_0AMOUNT
*
*       This subroutine needs to be implemented only for direct access
*       (for better performance) and for the Report/Report Interface
*       (drill through).
*       The inverse routine should transform a projection and
*       a selection for the target to a projection and a selection
*       for the source, respectively.
*       If the implementation remains empty, all fields are filled and
*       all values are selected.
*---------------------------------------------------------------------*
  METHOD invert_0AMOUNT.
*$*$ begin of inverse routine - insert your code only below this line*-*
    ... "insert your code here
*$*$ end of inverse routine - insert your code only before this line *-*
  ENDMETHOD.                    "invert_0AMOUNT

ENDCLASS.                    "routine IMPLEMENTATION
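The root cause of the COMM_STRUCTURE error above is that the migrated routine still uses the BW 3.x update-rule interface names. In a BI 7.0 field routine the source structure is called SOURCE_FIELDS and the routine returns a single RESULT, so the typical manual fix is a rename, sketched below (the field /BIC/ZGROSSPRI is only an example taken from this thread):

```abap
* BW 3.x update-rule syntax - no longer compiles in a transformation:
*   RESULT = COMM_STRUCTURE-/BIC/ZGROSSPRI.

* BI 7.0 field-routine syntax (inside METHOD compute_...):
* SOURCE_FIELDS replaces COMM_STRUCTURE; RESULT stays the single
* return value of the routine.
RESULT = SOURCE_FIELDS-/bic/zgrosspri.
```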
Regards
Krishanu.
Hi,
Go through the belowl link it may help you a lot
/message/7377688#7377688 [original link is broken]
Regards,
Marasa. -
Routine's Error After Migration BW to BI
HI,
I have migrated the 2LIS_13_VDITM DataSource from BW to BI, and now the issue is that when I try to activate the transformation it shows the error
"RULE (TARGET 0SUBTOT_1S, GROUP 01 STANDARD GROUP): SYNTAX ERROR IN ROUTINE".
When I check for syntax consistency, it shows:
E: THE DATA OBJECT "AOMM_STRUCTURE" DOES NOT HAVE A COMPONENT CALLED "NETVAL_IN"
Make sure you have the latest SP for CR 2008 installed;
https://smpdl.sap-ag.de/~sapidp/012002523100009038092009E/cr2008win_sp2.exe
Ensure you are deploying SP 2 runtime;
MSM
https://smpdl.sap-ag.de/~sapidp/012002523100009159092009E/cr2008sp2_mm.zip
MSI
https://smpdl.sap-ag.de/~sapidp/012002523100009159002009E/cr2008sp2_redistinstall.zip
If that does not help;
1) Do these reports work in the designer?
2) What database are you using and what is the connection type?
3) Compare the dlls loading on the systems where this app works (environment which has both Crystal Reports 10 and Crystal Reports 12 in GAC) and a system that is giving you the error. Use the [Modules|https://smpdl.sap-ag.de/~sapidp/012002523100006252802008E/modules.zip] utility.
4) Open the reports in the designer and look at the properties in the Database menu -> Set Datasource Location. Check to see if there are any differences between the two reports.
5) If there is a subreport in the report that causes the issue, remove it and see if the report works without the subreport.
Ludek -
Error after migrating new copy control routine
Hi,
I have created a new copy control routine in the dev environment. It works fine there. When we migrated the transport to the QA environments, we get dumps saying that the form is missing. I checked the routine and form in VOFM and they do exist. Am I missing a step, or do I need to do something after migration?
-Minhaj
Go to SE38, enter program name RV80HGEN and execute it.
You need to regenerate the newly created routines.
You have to do this in each client to which you transport them.
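The step above can also be scripted: RV80HGEN regenerates the carrier includes that hold all active VOFM routines, and it must run once per client after the transport arrives. A minimal sketch, assuming you have authorization to submit the report:

```abap
* Regenerate the VOFM carrier includes in the current client so that
* newly transported copy-control routines are compiled into them.
SUBMIT rv80hgen AND RETURN.
```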
Edited by: Sampath Kumar on Nov 5, 2009 10:36 AM -
Routines Migration from 3.5 to 7.0
Is it mandatory to convert ABAP routines to ABAP objects while upgrading from 3.5 to 7.0? We have a huge number of routines written in the update and transfer rules. Please suggest.
amit
9766392606
Is it mandatory to convert ABAP routines to ABAP objects while upgrading from 3.5 to 7.0? We have a huge number of routines written in the update and transfer rules. Please suggest.
7.0 migration from 3.x:
You need not migrate your routines separately.
All your old routines will be put into a global PERFORM and will execute normally.
Exception: return tables (not supported in BI 7.0).
3.x:
If your flow is 3.x and the old flow is acceptable, there is no need to migrate at all. -
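The return-table exception mentioned above needs manual rework, because a 7.0 field routine returns exactly one value. One common workaround (a sketch only, using the field-symbol and type names that a generated end routine provides) is to move the record-multiplying logic into an end routine and append the extra records to RESULT_PACKAGE:

```abap
* 3.x routines with RESULT_TABLE could return several records per
* source record. In a 7.0 end routine the same effect is achieved by
* collecting the extra records and appending them to RESULT_PACKAGE.
DATA: lt_extra TYPE STANDARD TABLE OF _ty_s_tg_1,
      ls_extra TYPE _ty_s_tg_1.

LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
  " derive additional records exactly as the old return-table
  " routine did:
  ls_extra = <result_fields>.
  " ... adjust ls_extra for each generated record ...
  APPEND ls_extra TO lt_extra.
ENDLOOP.

* append outside the loop to avoid processing the new lines again
APPEND LINES OF lt_extra TO RESULT_PACKAGE.
```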
Migration of Datasource with routine's
Dear All,
I am in the process of migrating 3.x DataSource to 7.0 and am following the step by step process as suggested.
In doing so, transformations are created and proposals are generated with one-to-one mappings between field and InfoObject in place.
However, InfoObjects that are populated by routines are not mapped.
For example: for DataSource 2LIS_02_SCL, routines are created and mapped for 0Order_val, 0PO_QTY and 0TAR_DL_QTY, to name a few, in the update rules.
My question is how to migrate the routines along with the DataSource for all transformations requiring routines.
or
Is there any other method to migrate the routines
or
Is activating transformation an option?
All suggestions, links, articles, blogs are welcome.
Yours Truly,
K Sengupto
Hi Obaid
I migrated my DataSources from Business Content in 3.x, and I have update rules and transfer rules for all the purchasing DataSources.
I created a custom cube from 0PUR_C04, and I want to enhance the DataSource 2LIS_02_HDR.
Step-by-step procedure for creating custom cube ZPUR_C04:
1- Select the standard cube 0PUR_C04 and copy it
2- Default InfoCube with name ZPUR_C04
3- Select my new InfoCube ZPUR_C04
4- Create update rules for the DataSources 2LIS_02_SCL, 2LIS_02_HDR and 2LIS_02_ITM
With this approach, can I have the update rule routines like in the standard flow?
For example 2LIS_02_HDR :
No. of Purch. Orders <-- mm_no_of_po
Number of requests for quotations <-- mm_no_of_rfq
Number of contracts <-- mm_no_of_co
How can I do this mapping? Or how can I copy the transformation from the standard cube to my custom cube?
Thanks & Best Reagrds
ILyaebi -
Hi
I migrated the transfer rules from 3.5 to 7.0. The transformations are created by default with all the routines copied. But in the transformation, all my routines show up under the global routines.
For example: in the update rules, key figure 1 has routine "ABC", key figure 2 has routine "DEF" and key figure 3 has routine "GHI".
After migration, in the transformation, key figure 1 has routines "ABC", "DEF" and "GHI" in the global routine.
I don't understand what is happening. Can anyone help me understand what happens after migration?
Regards
Annie
Hi Annie,
Please refer to the following SAP notes:
SAP Note 912841 - Migration help
Link: https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_whm/~form/handler
SAP Note 1052648 - Migrating transfer rules and update rules for BW 7.x
Link: https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_whm/~form/handler
Hope it helps.
Thanks & Regards,
Rashmi.