Strategy details

I have 7 different finished materials that differ only in their planning strategies (MRP 3 view of transaction MM01). In MD61 I have created monthly planned independent requirements of 100 EA for each material, and each of these 7 materials has 80 EA in stock. I would like to know what quantity the MRP run will propose for each of the following strategies:
                Strategy                            MRP result
                   10
                   11
                   20
                   30
                   40
                   52
                   63
Sir, can you please fill in the MRP result column? Also, if there are any major differences between the strategies, please explain them.
Thank you.

Sir, can you also explain each strategy with an example?
Thank you.
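As a rough illustration of the main difference the strategies above hinge on (whether existing stock is netted against the planned independent requirements), here is a simplified sketch. It is not the actual SAP MRP algorithm, and the function name is made up for illustration:

```python
# Simplified illustration (NOT the real SAP MRP run): how net
# requirements planning differs from gross requirements planning
# for a single period's planned independent requirement.

def planned_order_qty(demand, stock, net=True):
    """Return the planned order quantity for one period.

    net=True  -> net requirements planning: stock is offset first.
    net=False -> gross requirements planning: stock is ignored.
    """
    if net:
        return max(0, demand - stock)
    return demand

demand, stock = 100, 80  # the figures from the question above
print(planned_order_qty(demand, stock, net=True))   # net:   20 EA
print(planned_order_qty(demand, stock, net=False))  # gross: 100 EA
```

The real MRP run also considers receipts, lot-sizing and the strategy's requirements type, so treat this only as a sketch of the net-versus-gross distinction.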

Similar Messages

  • Transport of Number range and release procedure and split valuation

    Dear Gurus,
    Can we transport number ranges from the PRD server to the Quality server?
    Also, how do we transport split valuation and the release procedure?
    Kindly suggest.

    Hi
    Number ranges have to be created directly in the target client, as they are not transportable.
    For release strategy details, all your release configuration is transported except for the following:
    1. Characteristics and the values assigned to them (master data)
    2. The class and the characteristics assigned to it (master data)
    3. The values/ranges inside the release strategies (classification details)
    These three details have to be recreated/assigned in the client you intend to work in.
    Regards
    SAM

  • Audit Logs Empty in Communication Channel

    Hi,
    I am facing an issue in channel monitoring. I can see that the messages are successful, but nothing is displayed in the audit logs. This is happening for both the receiver and the sender channel. Earlier I could see the logs, but this started happening suddenly.
    I am working on PI 7.1. Kindly provide help.

    Hi Bhargav,
    Check this note: 1436261
    Reason and Prerequisites
    Since PI 7.1, audit log handling was changed such that only audit logs that describe an error are persisted to the database. Audit logs of type info are kept only in an in-memory cache per cluster node, and they are evicted according to an LRU strategy (details are explained in note 1314974).
    There was a program error in the retrieval of audit logs from the memory caches of all cluster nodes, which made audit logs unavailable to monitoring for messages that were processed on a different cluster node than the one on which the Runtime Workbench was executed.
    Solution
    To correct this problem, apply the patch matching your support package version as per the instructions in the NetWeaver Support Package Stack Guide.
    PI 7.10
    The archives and the Support Package Stack Guide can be found on the SAP Service Marketplace:
    http://service.sap.com/netweaver --> SAP NetWeaver Process Integration 7.1 - Release-Specific Inf --> Support Package Stacks Information -->
    7.1: SAP NetWeaver Process Integration 7.1
    EhP1: SAP enhancement package 1 for SAP NetWeaver Process Integration 7.1
    Please select the requested SP Stack.
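The caching behaviour the note describes (info-level audit logs kept only in a bounded in-memory cache, with the least recently used entry evicted first) can be sketched roughly as follows. The class and its API are hypothetical illustrations, not SAP code:

```python
from collections import OrderedDict

# Hypothetical sketch of the behaviour described in note 1436261:
# info-level audit logs live only in a bounded per-node cache and
# the least recently used entry is evicted when the cache is full.

class LRUAuditCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()  # message id -> audit log text

    def put(self, msg_id, log):
        if msg_id in self.entries:
            self.entries.move_to_end(msg_id)
        self.entries[msg_id] = log
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict least recently used

    def get(self, msg_id):
        log = self.entries.get(msg_id)
        if log is not None:
            self.entries.move_to_end(msg_id)  # reading refreshes the entry
        return log

cache = LRUAuditCache(capacity=2)
cache.put("msg1", "ok")
cache.put("msg2", "ok")
cache.put("msg3", "ok")
print(cache.get("msg1"))  # None: msg1 was evicted, i.e. its log is gone
```

This is why info logs for older (or other-node) messages can simply be absent even though message processing succeeded.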
    Thanks,

  • Data Migration_LSMW

    Hi all,
    I need information on data migration and the possible methods of LSMW.
    Thanks
    Swapna

    Hi,
    You can search the forum for "Data Migration" and "LSMW" for valuable information.
    <b>Please do not post your mail ID; let's all follow the rules.</b>
    Data Migration Life Cycle
    Overview : Data Migration Life Cycle
    Data Migration
    This document aims to outline the typical processes involved in a data migration.
    Data migration is the moving and transforming of data from legacy to target database systems. This includes one to one and one to many mapping and movement of static and transactional data. Migration also relates to the physical extraction and transmission of data between legacy and target hardware platforms.
    ISO 9001 / TickIT accredited
    The fundamental aims of certification are quality achievement and improvement and the delivery of customer satisfaction.
    The ISO and TickIT Standards are adhered to throughout all stages of the migration process.
    •     Customer Requirements
    •     Dependencies
    •     Analysis
    •     Iterations
    •     Data Cleanse
    •     Post Implementation
    •     Proposal
    •     Project Management
    •     Development
    •     Quality Assurance
    •     Implementation
    Customer Requirements
    The first stage is the contact from the customer asking us to tender for a data migration project. The invitation to tender will typically include the Scope /
    Requirements and Business Rules:
    •     Legacy and Target - databases / hardware / software
    •     Timeframes - start and finish
    •     Milestones
    •     Location
    •     Data Volumes
    Dependencies
    Environmental Dependencies
    •     Connectivity - remote or on-site
    •     Development and Testing Infrastructure - hardware, software, databases, applications and desktop configuration
    Support Dependencies
    •     Training (legacy & target applications) - particularly for an in-house test team
    •     Business Analysts - provide expert knowledge on both legacy and target systems
    •     Operations - Hardware / Software / Database Analysts - facilitate system housekeeping when necessary
    •     Business Contacts
    •     User Acceptance Testers - chosen by the business
    •     Business Support for data cleanse
    Data Dependencies
    •     Translation Tables - translate legacy parameters to target parameters
    •     Static Data / Parameters / Seed Data (target parameters)
    •     Business Rules - migration selection criteria (e.g. number of months of history)
    •     Entity Relationship Diagrams / Transfer Dataset / Schemas (legacy & target)
    •     Sign Off / User Acceptance criteria - within agreed tolerance limits
    •     Data Dictionary
    Analysis
    Gap Analysis
    Identifying where differences in the functionalities of the target system and legacy system mean that data may be left behind or alternatively generating default data for the new system where nothing comparable exists on legacy.
    Liaison with the business is vital in this phase, as mission-critical data cannot be allowed to be left behind; it is usual to consult with the relevant business process leader or Subject Matter Expert (SME). Often this process ends up as a compromise between:
    •     Pulling the necessary data out of the legacy system to meet the new system's functionality
    •     Pushing certain data into the new system from legacy to enable certain ad hoc or custom in-house processes to continue.
    Data mapping
    This is the process of mapping data from the legacy to target database schemas taking into account any reformatting needed. This would normally include the derivation of translation tables used to transform parametric data. It may be the case at this point that the seed data, or static data, for the new system needs generating and here again tight integration and consultation with the business is a must.
    Translation Tables
    Mapping Legacy Parameters to Target Parameters
    Specifications
    These designs are produced to enable the developer to create the Extract, Transform and Load (ETL) modules. The output from the gap analysis and data mapping are used to drive the design process. Any constraints imposed by platforms, operating systems, programming languages, timescales etc should be referenced at this stage, as should any dependencies that this module will have on other such modules in the migration as a whole; failure to do this may result in the specifications being flawed.
    There are generally two forms of migration specification: Functional (e.g. premise migration strategy) and Detailed Design (e.g. premise data mapping document).
    Built into the migration process at the specification level are steps to reconcile the migrated data at predetermined points during the migration. These checks verify that no data has been lost or gained during each step of an iteration and enable any anomalies to be spotted early and their cause ascertained with minimal loss of time.
    Usually written independently from the migration, the specifications for the reconciliation programs used to validate the end-to-end migration process are designed once the target data has been mapped and is more or less static. These routines count like-for-like entities on the legacy system and target system and ensure that the correct volumes of data from legacy have migrated successfully to the target and thus build business confidence.
    Iterations
    These are the execution of the migration process, which may or may not include new cuts of legacy data.
    These facilitate:
    •     Collation of migration process timings (extraction, transmission, transformation and load)
    •     The refinement of the migration code, i.e. increasing data volume and decreasing exceptions, through:
    •     Continual identification of data cleanse issues
    •     Confirmation of parameter settings and parameter translations
    •     Identification of any migration merge issues
    •     Reconciliation
    From our experience the majority of the data will conform to the migration rules and as such take a minimal effort to migrate ("80/20 rule"). The remaining data, however, is often highly complex with many anomalies and deviations and so will take up the majority of the development time.
    Data Cuts
    •     Extracts of data taken from the legacy and target systems. This can be a complex task where the migration is from multiple legacy systems, and it is important that the data is synchronised across all systems at the time the cuts are taken (e.g. end-of-day processes complete).
    •     Subsets / selective cuts - depending upon business rules and migration strategy, the extracted data may need to be split before transfer.
    Freeze
    Prior to any iteration, Parameters, Translation Tables and Code should be frozen to provide a stable platform for the iteration.
    Data Cleanse
    This activity is required to ensure that legacy system data conforms to the rules of data migration. The activities include manual or automatic updates to legacy data. This is an ongoing activity, as while the legacy systems are active there is the potential to reintroduce data cleanse issues.
    Identified by
    •     Data Mapping
    •     Eyeballing
    •     Reconciliation
    •     File Integrities
    Common Areas
    •     Address formats
    •     Titles (e.g. mrs, Mrs, MRS, first name)
    •     Invalid characters
    •     Duplicate data
    •     Free-format to parameter field
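As a small illustration of the "Titles" case above, a cleanse routine might map free-format titles onto a fixed parameter set; the names and values here are made up:

```python
# Illustrative cleanse of the "Titles" case (mrs / Mrs / MRS / a first
# name typed into the title field). The valid set is an assumption.

VALID_TITLES = {"MR": "Mr", "MRS": "Mrs", "MS": "Ms", "DR": "Dr"}

def cleanse_title(raw):
    """Map a free-format title to a parameter value, or None for review."""
    key = raw.strip().upper().rstrip(".")
    return VALID_TITLES.get(key)  # None means manual cleanse is needed

print(cleanse_title("mrs"))     # Mrs
print(cleanse_title("MRS."))    # Mrs
print(cleanse_title("Sheila"))  # None -> route to the data cleanse team
```

Automatic rules like this handle the bulk of the cleanse; anything returning None goes back to the business for manual correction, matching the ongoing-cleanse point above.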
    Cleansing Strategy
    •     Legacy - pre-migration
    •     During migration (not advised, as this makes reconciliation very difficult)
    •     Target - post-migration (either manual or via data fix)
    •     Ad hoc reporting - ongoing
    Post Implementation
    Support
    For an agreed period after implementation certain key members of the migration team will be available to the business to support them in the first stages of using the new system. Typically this will involve analysis of any irregularities that may have arisen through dirty data or otherwise and where necessary writing data fixes for them.
    Post Implementation fixes
    Post Implementation Data Fixes are programs that are executed post migration to fix data that was either migrated in an 'unclean' state or migrated with known errors. These will typically take the form of SQL scripts.
    Proposal
    This is a response to the invitation to tender, which comprises the following:
    Migration Strategy
    •     Migration development models are based on an iterative approach.
    •     Multiple Legacy / Targets - any migration may transform data from one or more legacy databases to one or more targets
    •     Scope - Redwood definition / understanding of customer requirements, inclusions and exclusions
    The data may be migrated in several ways, depending on data volumes and timescales:
    •     All at once (big bang)
    •     In logical blocks (chunking, e.g. by franchise)
    •     Pilot - a pre-test or trial run for the purpose of proving the migration process, live applications and business processes before implementing on a larger scale.
    •     Catch Up - to minimise downtime, only business-critical data is migrated, leaving historical data to be migrated at a later stage.
    •     Post Migration / Parallel Runs - both pre- and post-migration systems remain active and are compared after a period of time to ensure the new systems are working as expected.
    Milestones can include:
    •     Completion of specifications / mappings
    •     Successful 1st iteration
    •     Completion of an agreed number of iterations
    •     Delivery to the User Acceptance Testing team
    •     Successful dress rehearsal
    •     Go Live
    Roles and Responsibilities
    Data Migration Project Manager/Team Lead is responsible for:
    •     Redwood Systems Limited project management
    •     Change Control
    •     Solution Design
    •     Quality
    •     Reporting
    •     Issues Management
    Data Migration Analyst is responsible for:
    •     Gap Analysis
    •     Data Analysis & Mapping
    •     Data migration program specifications
    •     Extraction software design
    •     Exception reporting software design
    Data Migration Developers are responsible for:
    •     Migration
    •     Integrity
    •     Reconciliation (note these are independently developed)
    •     Migration Execution and Control
    Testers/Quality Assurance team is responsible for:
    •     Test approach
    •     Test scripts
    •     Test cases
    •     Integrity software design
    •     Reconciliation software design
    Other Roles:
    •     Operational and Database Administration support for source/target systems.
    •     Parameter Definition and Parameter Translation team
    •     Legacy system Business Analysts
    •     Target system Business Analysts
    •     Data Cleansing Team
    •     Testing Team
    Project Management
    Project Plan
    •     Milestones and Timescales
    •     Resources
    •     Individual Roles and Responsibilities
    •     Contingency
    Communication
    It is important to have good communication channels with the project manager and business analysts. Important considerations include the need to agree the location, method and format of regular meetings/contact to discuss progress and resources, and to communicate any problems or incidents which may impact the ability of others to perform their duties. These could take the form of weekly conference calls, progress reports or attendance at on-site project meetings.
    Change Control
    •     Scope Change Requests - a stringent change control mechanism needs to be in place to handle any deviations and creeping scope from the original project requirements.
    •     Version Control - all documents and code shall be version controlled.
    Issue Management
    •     Internal issue management - as a result of gap analysis, data mapping, and iteration output (i.e. reconciliation, file integrity or eyeballing)
    •     External issue management - load-to-target problems and results of User Acceptance Testing
    •     Mechanism - examples:
    •     Test Director
    •     Bugzilla
    •     Excel
    •     Access
    •     TracNotes
    Development
    Extracts / Loads
    •     Depending on the migration strategy, extract routines shall be written to derive the legacy data required
    •     Transfer data from legacy and/or target to the interim migration environment via FTP, tape, CSV, D/B object copy, ODBC or API
    •     Transfer data from the interim migration environment to the target
    Migration (transform)
    There are a number of potential approaches to a Data Migration:
    •     Use a middleware tool (e.g. ETI, Powermart). This extracts data from the legacy system, manipulates it and pushes it to the target system. These "4th generation" approaches are less flexible and often less efficient than bespoke coding, resulting in longer migrations and less control over the data migrated.
    •     The data migration processes are individually coded to be run on a source, interim or target platform. The data is extracted from the legacy platform to the interim/target platform, where the code is used to manipulate the legacy data into the target system format. The great advantage of this approach is that it can encompass any migration manipulation that may be required in the most efficient, effective way and retain the utmost control. Where critical/sensitive data is migrated, this approach is desirable.
    •     Use a target system 'File Load Utility', if one exists. This usually requires the use of one of the above processes to populate a pre-defined target database. A load-and-validate facility will then push valid data to the target system.
    •     Use an application's data conversion/upgrade facility, where available.
    Reconciliation
    Independent end-to-end comparisons of data content to create the necessary level of business confidence
    •     Bespoke code is written to extract the required total figures for each of the areas from the legacy, interim and target databases. These figures are totalled and broken down into business areas and segments of relevant interest, so that they can be compared to each other. Where differences do occur, investigation will determine whether the migration code must be altered or whether there are reasonable mitigating factors.
    •     Spreadsheets are created to report figures to all levels of management to verify that the process is working and build confidence in the process.
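A minimal sketch of such a reconciliation count comparison (the function and data are illustrative, not from any real migration):

```python
# Illustrative reconciliation: like-for-like entity counts on legacy,
# interim and target, broken down by a business segment, with any
# differences flagged for investigation.

def reconcile(legacy, interim, target):
    """Each argument: dict of segment -> record count. Returns mismatches."""
    mismatches = {}
    for segment in set(legacy) | set(interim) | set(target):
        counts = (legacy.get(segment, 0),
                  interim.get(segment, 0),
                  target.get(segment, 0))
        if len(set(counts)) != 1:  # legacy == interim == target?
            mismatches[segment] = counts
    return mismatches

legacy  = {"north": 1200, "south": 800}
interim = {"north": 1200, "south": 800}
target  = {"north": 1200, "south": 799}  # one record lost on load
print(reconcile(legacy, interim, target))  # {'south': (800, 800, 799)}
```

Running such checks after every step of an iteration is what lets anomalies be spotted early, as described above.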
    Referential File Integrities
    Depending on the constraints of the interim/target database, data may be checked to ascertain and validate its quality. Certain categories of dirty data may be disallowed, e.g. duplicate data, null values, data that does not match a parameter table, or an incompatible combination of data in separate fields, as specified by the analyst. Scripts are written that run automatically after each iteration of the migration, and a report is then generated itemising the non-compatible data.
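A toy version of such an integrity script might look like this; the field names and report categories are invented for illustration:

```python
# Illustrative file-integrity check: flag null values, duplicates, and
# values that do not match an allowed parameter set, per row index.

def integrity_report(rows, key, param_values):
    """rows: list of dicts; key: field to check; param_values: allowed set."""
    seen, report = set(), {"null": [], "duplicate": [], "unmatched": []}
    for i, row in enumerate(rows):
        value = row.get(key)
        if value is None:
            report["null"].append(i)
        elif value in seen:
            report["duplicate"].append(i)
        elif value not in param_values:
            report["unmatched"].append(i)
        if value is not None:
            seen.add(value)
    return report

rows = [{"region": "N"}, {"region": "N"}, {"region": None}, {"region": "X"}]
print(integrity_report(rows, "region", {"N", "S"}))
# {'null': [2], 'duplicate': [1], 'unmatched': [3]}
```

In practice such checks would be SQL scripts against the interim database, but the categories reported are the same.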
    Quality Assurance
    Reconciliation
    •     Horizontal reconciliation (number on legacy = number on interim = number on target) and vertical reconciliation (categorisation counts, e.g. address counts by region = total addresses, and across systems).
    •     Figures at all stages (legacy, interim, target) to provide checkpoints.
    File Integrities
    Scripts that identify and report the following for each table:
    •     Referential Integrity - check values against target master and parameter files.
    •     Data Constraints
    •     Duplicate Data
    Translation Table Validation
    Run after new cut of data or new version of translation tables, two stages:
    •     Verifies that all legacy data is accounted for in the "From" translation
    •     Verifies that all "To" translations exist in the target parameter data
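These two stages can be sketched as simple set checks (the currency-code values here are invented for illustration):

```python
# Illustrative two-stage translation table validation:
#   stage 1: every legacy value has a "From" translation
#   stage 2: every "To" value exists in the target parameter data

def validate_translation(legacy_values, translation, target_params):
    missing_from = legacy_values - set(translation)         # stage 1
    missing_to = set(translation.values()) - target_params  # stage 2
    return missing_from, missing_to

translation = {"GBP": "GB1", "USD": "US1"}
legacy_values = {"GBP", "USD", "FRF"}  # FRF has no translation
target_params = {"GB1"}                # US1 missing on target
print(validate_translation(legacy_values, translation, target_params))
# ({'FRF'}, {'US1'})
```

Both result sets should be empty before an iteration is run; anything reported feeds back into the data cleanse or parameter setup.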
    Eyeballing
    Comparison of legacy and target applications
    •     Scenario Testing - legacy-to-target verification that data has been migrated correctly for certain customers, chosen by the business, whose circumstances fall into particular categories (e.g. inclusion and exclusion business rule categories, data volumes etc.)
    •     Regression Testing - testing known problem areas
    •     Spot Testing - a random spot check on migrated data
    •     Independent Team - the eyeballing is generally carried out by a dedicated testing team rather than the migration team
    UAT
    This is the customer based User Acceptance Test of the migrated data which will form part of the Customer Signoff
    Implementation
    Freeze
    A code and parameter freeze occurs in the run up to the dress rehearsal. Any problems post freeze are run as post freeze fixes.
    Dress Rehearsal
    Dress rehearsals are intended to mobilise the resources that will be required to support a cutover in the production environment. The primary aim of a dress rehearsal is to identify the risks and issues associated with the implementation plan. It will execute all the steps necessary to execute a successful 'go live' migration.
    Through the execution of a dress rehearsal all the go live checkpoints will be properly managed and executed and if required, the appropriate escalation routes taken.
    Go Live window (typical migration)
    •     Legacy system 'end of business day' closedown
    •     Legacy system data extractions
    •     Legacy system data transmissions
    •     Readiness checks
    •     Migration execution
    •     Reconciliation
    •     Integrity checking
    •     Transfer load to target
    •     User Acceptance Testing
    •     Reconciliation
    •     Acceptance and Go Live
    ===================
    LSMW: Refer to the links below for useful info (screen shots for the various LSMW methods):
    Step-By-Step Guide for LSMW using ALE/IDOC Method (Screen Shots)
    http://www.****************/Tutorials/LSMW/IDocMethod/IDocMethod1.htm
    Using Bapi in LSMW (Screen Shots)
    http://www.****************/Tutorials/LSMW/BAPIinLSMW/BL1.htm
    Uploading Material Master data using BAPI method in LSMW (Screen Shots)
    http://www.****************/Tutorials/LSMW/MMBAPI/Page1.htm
    Step-by-Step Guide for using LSMW to Update Customer Master Records(Screen Shots)
    http://www.****************/Tutorials/LSMW/Recording/Recording.htm
    Uploading Material master data using recording method of LSMW(Screen Shots)
    http://www.****************/Tutorials/LSMW/MMRecording/Page1.htm
    Step-by-Step Guide for using LSMW to Update Customer Master Records(Screen Shots) Batch Input method
    Uploading Material master data using Direct input method
    http://www.****************/Tutorials/LSMW/MMDIM/page1.htm
    Steps to copy LSMW from one client to another
    http://www.****************/Tutorials/LSMW/CopyLSMW/CL.htm
    Modifying BAPI to fit custom requirements in LSMW
    http://www.****************/Tutorials/LSMW/BAPIModify/Main.htm
    Using Routines and exception handling in LSMW
    http://www.****************/Tutorials/LSMW/Routines/Page1.htm
    Reward if useful.
    Thanks & regards,
    Naren

  • Release code in user role

    Hi,
    In which table can we find the release codes assigned to a user? We know how to check in SU01 under user roles. This is for a new report.

    Hi
    Table T16FS contains release codes and release strategy details.
    To check the users for a particular role:
    Go to SUIM
    User --> Users by Complex Selection Criteria
    Enter the authorization object M_EINK_FRG
    and click on entry values.
    Enter the release code and release group and execute the report.

  • Transport of Release Prerequisites and Release Indicators

    How do the release prerequisites and release indicators in the release strategy get transported? When we try to transport, only the strategies get transported.

    Hi
    Number ranges have to be created directly in the target client, as they are not transportable.
    For release strategy details, all your release configuration is transported except for the following:
    1. Characteristics and the values assigned to them (master data)
    2. The class and the characteristics assigned to it (master data)
    3. The values/ranges inside the release strategies (classification details)
    These three details have to be recreated/assigned in the client you intend to work in.
    Regards
    SAM

  • Strategy for implementing drpdownlistbykey in header / detail relationship

    Quite often I have header-details relationships that I want to display. The volumes of data are typically small, so rather than make multiple trips to the database I choose to load all the data into the context during the init routine. Implementing an efficient way to display this data and keep the header-details relationship synchronized on the screen is where I am looking for suggestions.
    A good example would be selecting a user from a list and displaying details of that user in a series of text views.
    Currently, to implement this, I load all the user detail into separate elements within a node. Simultaneously I load a value set with key information, which is then used by a drop-down list. A series of text views are bound to attributes in the context outside the detail node; these attributes contain the currently selected user's detail information. When a user is selected from the drop-down list, the select event fires, and a search is performed of all the detail node elements for that key. Once found, the user detail information is copied from the appropriate element to the corresponding 'current value' attributes.
    This process seems a bit cumbersome to me.  I am wondering if there are more efficient ways to manage this routine scenario.  Possibly there is a way of binding the text views directly to the attributes in the detail node and switching the element that is currently displayed?  I am just not aware of any way to do that.
    Any suggestions would be greatly appreciated
    -Sheldon

    Say you have a context node "Users", cardinality 0:N, with attributes like "Name" etc.
    Use a DropDown<b>ByIndex</b> element, bind property "texts" to attribute "Name". Assign an empty action to the "onSelect" event.
    Create a form for the details, bind the contained InputField, TextView, whatever to the attributes inside node "Users".
    Then you can select a user from the dropdown list (which displays the user names) and the detail form automatically displays the data of the (lead-)selected context element.
    Armin

  • Custom data tab in ME52n, copy release strategy tab details in custom tab

    I have created a custom data tab in ME52n using exit MEREQ001, and I need to get the purchase requisition number, release group, release code and release strategy values on the custom data tab screen in order to fulfil my requirement.
    I am using the screen exit and trying to code in the include (PBO), but in the include I am not able to get the required data.
    Can you please tell me how to get the global structure that I can use in the include to read these values?
    Please give me the solution ASAP.

    Hi,
    Use the function exit EXIT_SAPLMEREQ_001 and get the data:
    DATA wa TYPE mereq_item.
    CALL METHOD im_req_item->get_data
      RECEIVING
        re_data = wa.
    The structure wa will then contain the data.
    Thanks,
    Shailaja Ainala.

  • Batch search strategy based on goods receipt date

    Hello,
    Can someone help me with setting up a batch search strategy based on goods receipt date? Currently it is based on batch number.
    Thanks a lot!
    Best Regards
    Srw

    Hi
    If you want to set up a batch search strategy based on goods receipt date, you have to create GR Date as a characteristic in transaction CT04 and assign this characteristic to a class of class type 023.
    Then assign this class in the Classification view of the material master.
    Make sure that in the Purchasing view of the material master you have activated the Batch Management checkbox.
    Then, whenever you do a GR in MIGO, maintain the GR date in the Classification field on the Batch tab.
    Then create a sort sequence in transaction CU70, where you can maintain on what basis (i.e. ascending or descending GR date) the batch should be searched.
    Then create the batch search strategy in MBC1 and assign the sort sequence in the details.
    Now do your regular processing; the material batch will be searched based on GR date.
    Reward if useful
    Regards
    S.Baskaran

  • Advice on a migration - or networking - strategy ?

    Hello. I am trying to map out a logical strategy for how to do the following.
    I have a recording studio with an iMac 24 running 10.6. It has all of my must-have apps and internet
    connectivity.
    I want to have identical capabilities on a separate iMac located in another room.
    I bought a used iMac which came loaded with 10.6 and a base set of apps/features.
    It seems I have a couple of options... Some form of networking where the added computer somehow
    would access the 'main' computer - or - running Migration Assistant and transferring the identity and
    contents of the first computer over to the newly bought computer.  I'm looking for the simplest and
    easiest solution.
    All computers have either ethernet and/or wireless access to my Airport Extreme/router connected to a Cable Modem. 
    I have never really messed with networking per se.  This second machine would easily be in range for wi-fi web connection.
    It's been quite a while since I've needed to deal with Migration Assistant and I don't want to wind up doing something
    irreversible. If I were to connect the newly purchased iMac to the existing computer, do I have the option to selectively
    choose what gets carried over and what doesn't? If I don't want to clutter it with a lot of stuff that doesn't need to
    be on the target machine - just the must-have stuff, like my recording apps, web access settings, and Thunderbird email
    app and settings - and I want to leave out excess things it really wouldn't require, would Migration Assistant allow me
    to itemize things that way, i.e. customize the transfer?
    Is there any reason i would have to "wipe" the OS on the target drive to do this
    or can i get away with just leaving the OS that's already installed on it, as-is, before i would run Migration Assistant?
    I'd hate to get into some weird Permissions issues that might be introduced...i never know quite how to resolve those
    situations as it's happened in the past.
    What I simply want to come away with is another Mac with pretty much all the same apps and internet/email access as
    my other machine... IF it were simple enough to use networking and make this 2nd machine capable of accessing/using
    all the main iMac's features and internet/email connection, I would consider that route, but it seems more complicated...
    Sorry but i'm only an IT guy when I am forced to make changes and i just want to do it right 'the first time' so i appreciate
    any step-by-step on how to get this done in a sensible and straightforward way....and thanks very much!
    Mike

    Hi Mike,
    Best/easiest way is with the old one in Target mode...
    http://support.apple.com/kb/HT1661
    Use MA on the new one to Migrate everything, no detailed choices so it's best to get it all & remove what you don't need later.
    Other way would be the new one in Target mode...
    http://support.apple.com/kb/HT1661
    Then clone the old one to the new one for identical everything.
    Get carbon copy cloner to make an exact copy of your old HD to the New one...
    http://www.bombich.com/software/ccc.html
    Or SuperDuper...
    http://www.shirt-pocket.com/SuperDuper/

  • Release strategy for quantity contract

    Hi Gurus,
    In a quantity contract, the target value is not filled in the header details, since it is a quantity contract.
    We may have different line items with different costs.
    My doubt is: how will the release strategy be triggered for the quantity contract, and how will the system consider the whole value of the contract?
    Please explain/clarify.
    regards
    subbu

    Hi,
    The release strategy for the contract works by the same mechanism as for the PO, i.e. it is determined based upon the total net value of the entire purchasing document.
    Cheers,
    HT
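The logic HT describes - picking the release strategy from the total net value across all line items rather than any single item - can be sketched outside SAP. This is only an illustration of the mechanism, not SAP's actual implementation; the field names, strategy codes, and value thresholds below are all made up for the example.

```python
def total_net_value(line_items):
    """Sum the net value of every line item in the purchasing document."""
    return sum(item["quantity"] * item["net_price"] for item in line_items)

def determine_release_strategy(line_items, strategies):
    """Pick the strategy whose value interval contains the document total.

    `strategies` is a list of (low, high, strategy_code) tuples, standing in
    for the value ranges maintained in the strategy's classification
    (e.g. a 'total net order value' characteristic).
    """
    total = total_net_value(line_items)
    for low, high, code in strategies:
        if low <= total <= high:
            return code, total
    return None, total

# Illustrative thresholds in document currency (not real customizing):
strategies = [
    (0, 10_000, "S1"),           # e.g. one approver
    (10_000.01, 100_000, "S2"),  # e.g. two approvers
]

# Two contract items with different costs, as in subbu's question:
items = [
    {"quantity": 50, "net_price": 120.0},  # 6,000
    {"quantity": 30, "net_price": 250.0},  # 7,500
]
code, total = determine_release_strategy(items, strategies)
print(code, total)  # S2 13500.0
```

The point of the sketch: neither item alone would cross the 10,000 threshold, but the strategy is chosen from the document total (13,500), which is exactly why a multi-item quantity contract can trigger a higher release level than any single line suggests.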

  • Report for release strategy

    Hi,
    Is there a report for PR/PO release strategy?
    Thanks in advance!

    Hey
    In that case you must query the release strategy tables. I suggest:
    T16FS - to get the release group, release strategy, and release codes.
    AUSP (KLART = 032) - to get cost center or other characteristic values.
    For more detailed information, see tables T16F* (C/D/E/G/H/K/L/M/T/V) and KSSK.
    Cya
    Jony
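Once the table contents Jony lists are extracted, the report itself is just a join of strategy rows to their classification values. A minimal sketch, assuming simplified stand-in columns (the real T16FS/AUSP keys and field names differ, and the object key construction here is invented for illustration):

```python
# Stand-in extract of T16FS: release group, strategy, and release codes.
t16fs = [
    {"group": "01", "strategy": "S1", "codes": ["R1"]},
    {"group": "01", "strategy": "S2", "codes": ["R1", "R2"]},
]

# Stand-in extract of AUSP (class type 032): characteristic values keyed
# by the strategy's object key; here we pretend the key is group+strategy.
ausp = [
    {"object": "01S1", "characteristic": "NET_VALUE", "value": "<= 10000"},
    {"object": "01S2", "characteristic": "NET_VALUE", "value": "> 10000"},
]

def release_report(strategies, classifications):
    """Left-join each strategy to its classification values."""
    by_object = {}
    for row in classifications:
        by_object.setdefault(row["object"], []).append(
            (row["characteristic"], row["value"])
        )
    report = []
    for s in strategies:
        key = s["group"] + s["strategy"]
        report.append({
            "group": s["group"],
            "strategy": s["strategy"],
            "codes": "/".join(s["codes"]),
            "values": by_object.get(key, []),
        })
    return report

for line in release_report(t16fs, ausp):
    print(line)
```

In practice you would pull the real table contents with SE16/SE16N or a small ABAP query and feed the extracts into a join like this; the structure of the report (strategy, codes, characteristic ranges) is the same either way.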

  • Alternative Performance Visualizations in SAP Strategy Management

    A question frequently posed to the SAP Strategy Management solution management team is "...beyond scorecards and strategy maps, are there other ways within SSM that I can visualize my organization's performance?" The answer is yes.
    The SSM Strategy page allows you to create and display up to three goal diagrams. Typically SSM users will just create a strategy diagram (the most common type of goal diagram) and let the other two goal diagram place holders go unused. However, with a little ingenuity the other two goal diagrams could be put to very good use.
    Using the goal diagram manager on the SSM Administration console you can import custom images (e.g. Adobe Illustrator or MS Visio images) into the SSM Strategy page. Once a custom image has been imported, hotspots can be mapped on top of the image that allow you to view the status and details of scorecard objects in relation to the image. Hotspots can be created for Contexts, Perspectives, Objectives, and KPIs (KPI hotspots are new in SSM 7.0). For each hotspot location a status indicator is displayed reflecting the Red/Yellow/Green status of the scorecard object. When the hotspot is clicked, details for the scorecard object are displayed in a dropdown box.
    So if, for example, you have created contexts for different geographic regions within the United States, you could import a map of the United States and then map context hotspots that allow you to easily view your organization's performance across geographic regions. So, on your US Map goal diagram you might see a green status icon over California, red over Boston, yellow over Florida, etc., indicating your organization's performance in those regions.
    Other custom goal diagram examples include:
    - Organization charts that show the performance of different divisions / groups within your organization
    - Process flow diagrams that show performance of different elements of a business or production process
    - Initiative / project summaries that show the performance of multiple initiatives
    For more information on how to manage custom diagrams in SSM, view the Help topic "Goal Diagrams" within the SSM Administration console.
    Best Regards,
    Chris Clay
    Enterprise Performance Management
    Solution Management

    Hi Guna,
    Please see this document for recommended roles for strategy management - http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/b0ac7368-092f-2a10-d68f-8ba5c3f75f10?quicklink=index&overridelayout=true
    Regards,
    Sen

  • Business Objects vs. SAP Strategy Management

    Hi all,
    I am new to this forum and would like to post a message regarding the recent news on SAP and Business Objects.
    Business Objects has more than 6,000 CPM customers, and in the meantime we find many statements like http://www.gartner.com/DisplayDocument?id=530626 on the web.
    Since the announcement a couple of weeks have passed. My question is, if you have detailed analysis on how the Balanced Scorecard solution of both companies will differ respectively overlap. Are there any documents on that point available? And are there any consequences to be expected e.g. with respect to the price model?
    Thanks for your help,
    Alessandro

    Hi Alessandro,
    I fully agree with Prakash's reply - at the moment Nenshad's comments represent the 'only version of the truth' regarding the BOBJ acquisition.
    However, there is still a possibility to add some useful information on this, as you are referring to a 'Balanced Scorecard' solution in general. As there are currently two sides to this equation, SAP and BOBJ, for the SAP part we can be more specific:
    SAP Strategy Management (SSM) based on the technology acquired with Pilot Systems Co. in February 2007 is the tool of choice for scorecarding in the <u>current</u> CPM portfolio of SAP (SAP's current CPM portfolio is based on the acquisitions of Outlooksoft, Pilot and the reseller agreement with Acorn). This reduces the complexity a little bit as you no longer have to contemplate which of SAP's own scorecarding solutions is the leading one <u>at the moment</u>. SEM-CPM Balanced Scorecard is retired - being put in maintenance mode - and all customers encouraged to move to Strategy Management, facilitated by a planned migration utility.
    Should you want to learn more about SAP Strategy Management, feel free to refer to my blog series (e.g. /people/karol.bliznak/blog/2007/09/12/one-blog-is-not-enough--how-to-manage-performance-management).
    Thanks / Regards,
    Karol

  • SAP Strategy Management implementation :BOOK

    Dear Gurus,
    I have searched SAP Press and Amazon but didn't find a suitable book on SAP Strategy Management implementation.
    Could you please mention a cookbook or an ISBN number? Any help is appreciated.
    I am looking for business as well as technical detail.
    Thanks & Regards
    Arif.

    Qureshi,
    As previously mentioned, there aren't any books about Strategy Management implementation. There are three areas that seem to get the most attention, but as SSM is changing in features, functionality and connectivity, it's moving faster than anything that could be published. The best documentation on implementation of SSM is the guides available on the Service Marketplace. These are regularly updated, so you always have access to the latest information.
    1) PAS - You build your multi-dimensional models in PAS. PAS has some amazing flexibility, especially how it handles time. Understanding how to use PAS seems to be a big challenge - as you can tell by the many questions on the Forum regarding models. There is no published book on PAS that I know of and even the PAS training in the SAP Education class is aimed as an overview. My suggestion here is at the very least try to take a SAP Education class on SSM that includes PAS training.
    2) Connectivity with SAP BW - Because of some fundamental differences between how PAS and BW deal with data, there is a tool - called the BICA - that SSM uses to harmonize BW and SSM data. There is a paper available on the EPM BPX on best practices with BICA.
    3) Wrangling KPIs - Strategy Management is about focusing on the essential areas of a business. Since this is YOUR business, your strategic objectives will be necessarily different - how else can you differentiate your products or services if you are doing everything the same as your competitors? This is not a matter of seeing how many more measures you can include; it's about focusing on doing those things that really are going to be game-changers for your organization.
    This business side aspect of Strategy Management is actually the area that will show the most success - and ROI - in the organization, although the typical focus is on the technical aspects of the application. Strategy Management continues to change the application to help address the different ways that people construct their organizational strategy and how they communicate it as well as monitor it. Do not underestimate the importance of the business side of this application.
    All three of these areas really strongly recommend Consulting Services. There are both SAP and 3rd Party consultants available for these areas. Why should you spend the money on a consultant?
    A) You will get implemented and running more quickly - less delays, less problems, more success. I have witnessed several projects where the organization wanted to DIY and faltered because there was no expertise on the team. Projects these days HAVE to be successful -right out of the gate - so get experienced help.
    B) Knowledge Transfer - There aren't any books and your boss isn't going to pay for a 5 day offsite training course. Bring in a consultant and work with them to learn what they do, how they do it, so you can become self-sufficient. It's much easier to build a consultant into a project budget than training. Although training is essential, I just never understand why executives neglect this important area.
    C) Personalized Help - The needs you have for your implementation and successful operation are going to be unique because it's going to revolve around your current technical architectural, your company's practices, as well as the structure of your organization. Having a consultant allows you to better use your time and the project's time by focusing on the needed areas and getting through the typical bumps that always accompany a new program's rollout.
    I am NOT a consultant, so I have no ulterior motives. I DO want organizations to be successful with SSM implementations, and from what I have witnessed, the difference between success and merely muddling through is how willing the organization is to bring in either a SAP consultant or work with a knowledgeable 3rd party consultant.
