MDG - Data Replication to Legacy and ECC

Hello All,
We are implementing MDG-M on a separate hub. The data model is extended to support both the ECC and the legacy system.
Material master replication has two scenarios based on two different CR types. The requester decides whether or not a material is required for the legacy system.
CR #1 Materials will only be replicated to ECC: It will be peer-to-peer.
CR #2 Materials will be replicated to ECC and Legacy:
ECC replication will be peer-to-peer. Only SAP relevant content will be replicated.
However, legacy replication has to go via SAP PI. Some SAP fields, such as Material Number and Lab Office, will be mapped to legacy fields. There are also some custom fields that will be replicated only to the legacy system.
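As a rough illustration of the two scenarios just described, the routing could be sketched as follows. All field names, system names, and the CR-type encoding here are hypothetical placeholders, not actual MDG/DRF configuration:

```python
# Hypothetical sketch of the routing logic described above: CR type 1
# replicates to ECC only; CR type 2 replicates to ECC (SAP-relevant
# fields only) and, via PI, to the legacy system (mapped SAP fields
# plus legacy-only custom fields). All names are illustrative.

SAP_FIELDS = {"MATNR", "LABOR", "MTART"}          # SAP-relevant fields
LEGACY_FIELD_MAP = {"MATNR": "LEG_MAT_ID",        # Material Number
                    "LABOR": "LEG_LAB_OFFICE"}    # Lab Office
CUSTOM_LEGACY_FIELDS = {"ZZ_LEGACY_CODE"}         # replicated to legacy only

def route_material(material: dict, cr_type: int) -> dict:
    """Return one payload per target system for the given CR type."""
    targets = {"ECC": {f: v for f, v in material.items() if f in SAP_FIELDS}}
    if cr_type == 2:
        legacy = {LEGACY_FIELD_MAP[f]: v for f, v in material.items()
                  if f in LEGACY_FIELD_MAP}
        legacy.update({f: v for f, v in material.items()
                       if f in CUSTOM_LEGACY_FIELDS})
        targets["LEGACY_via_PI"] = legacy
    return targets
```

For CR type 1 only the ECC payload is produced; for CR type 2 the legacy payload additionally carries the mapped fields and the legacy-only custom fields.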
I would appreciate it if you could help me figure out how to set up the replication model.
Thanks,
Hakan

Hi Hakan
Defining filters: Use transaction DRFF
Apart from that, I have these links:
MDG EhP6 Learning Map : http://service.sap.com/~form/sapnet?_SHORTKEY=01100035870000743543&
=> Master Data Governance
=> Overview: Master Data Replication in MDG for EhP6
=> Set Up, Implementation & Extensibility: Setting-up and Operating the Replication of Master Data
SDN - MDG Extensibility Center: http://scn.sap.com/docs/DOC-7858?rid=/webcontent/uuid/f0801486-9562-2e10-54b1-9c123497f1a7
Regards, Ingo

Similar Messages

  • Data Replication Between Sqlserver and Oracle11g using materialized view.

I have SQL Server 2005 as my source and Oracle 11g as my target. I need to populate the target daily with change data from the source.
For that, we have created a DB link between SQL Server and Oracle and replicated that table as a materialized view in Oracle.
The problem we are getting is that the fast refresh option is not available; each day it picks up the full data from the source.
Is there any way to use fast refresh in this scenario?
    Thanks in advance.
    Regards,
    Balaram.

Please do not post duplicates - Data Replication Between Sqlserver and Oracle11g using materialized view.

  • SNP planned order availability date difference between APO and ECC

    Hi,
I have observed that the SNP planned order availability date does not match between APO and ECC. Details are as follows.
I ran the SNP Optimizer with a bucket offset of 0.5. After publishing the optimizer-created planned orders to ECC, only the start date matches.
    Example:
    I am using PDS as a source of supply.
    Fixed production activity in SNP PDS is 1 day.
GR processing time: 3 days
    After running optimizer planned order is created with dates explained below.
    Start date/time: 09.05.2011 00:00:00
    End date/time: 12.05.2011 23:59:59
    Availability date: 16.05.2011 00:00:00
Because the bucket offset is defined as 0.5, the optimizer planned order availability date is the start of the next Monday.
After publishing this planned order to ECC, the dates on the planned order are as follows.
    Start date: 09.05.2011
    End date: 09.05.2011
    Availability date: 12.05.2011
I have not maintained any scheduling margin key in ECC. Also, if I don't define the GR processing time, the planned dates between APO and ECC always match. Can anyone explain the impact of the GR processing time on the availability date?
    Regards,
    Venkat

    Hi Venkadesh,
What's "state stamp"? Do you mean a different time zone?
Note 645597, mentioned by Nandha, is very helpful:
In the standard, CCR will use the due date - "the available date of the output product".
    Nandha's words "In SAP APO, if the receipt date of the primary product deviates from the
    end date of the last activity of the order, the receipt date
    always identifies this as inconsistent. You cannot rectify
    inconsistencies of this type by using CCR."
I guess in your PDS or PPM the output product is not assigned to the end of the last activity. Did someone change it?
Please CIF the PDS or PPM again.
If you really want to apply a note, please use Note 815509, since you are using planned orders;
the system will then use the order end date in CCR instead.
GR time is always considered. BR/Tiemin

  • Options for loading data from legacy mainframe to SAP CRM and ECC

    Hello All,
    I am new to SAP.
As part of data conversion planning from a legacy mainframe to SAP CRM and ECC systems, I need to know the different tools available for loading the data, and a comparative analysis of the tools showing their features. Please also describe the process to evaluate the tools (prototyping, technical discussions, etc.) and make recommendations.
I would also appreciate any information on testing tools (like eCATT) and any third-party tools (like Informatica).
I know I am asking for a lot of things here, but I would really appreciate any information regarding this.
    Thanks,
    Balu

    Hi
Data migration (conversions) mainly involves the following major steps:
Discovery->Extract->Cleanse->Transform->Load
Discovery-> Identify the various source systems where the actual data resides.
Extract-> Extract the data from the source (legacy) systems.
Cleanse-> Cleanse the data by enriching it, de-duplication, etc.
Transform-> Transform the data into an SAP-loadable format.
Load-> Upload the data into SAP using the load programs (e.g., BDC, LSMW, Data Transfer Workbench, BAPIs).
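The Extract -> Cleanse -> Transform -> Load chain above can be sketched as a minimal pipeline. The record layout and the cleansing/mapping rules here are hypothetical; a real conversion would use tools such as LSMW, BDC, or BAPIs for the load step:

```python
# Minimal sketch of the Extract -> Cleanse -> Transform -> Load chain
# described above. Field names and rules are illustrative only; the
# load() function stands in for a real BDC/LSMW/BAPI call.

def extract(legacy_rows):
    """Pull raw rows from the legacy source (here: already in memory)."""
    return list(legacy_rows)

def cleanse(rows):
    """De-duplicate on the legacy key and trim stray whitespace."""
    seen, clean = set(), []
    for row in rows:
        key = row["LEGACY_ID"].strip()
        if key and key not in seen:
            seen.add(key)
            clean.append({k: v.strip() for k, v in row.items()})
    return clean

def transform(rows):
    """Map legacy fields to an SAP-loadable structure."""
    return [{"MATNR": r["LEGACY_ID"], "MAKTX": r["DESC"].upper()}
            for r in rows]

def load(rows, target):
    """Stand-in for the real load step; returns the number loaded."""
    target.extend(rows)
    return len(rows)

def run_pipeline(legacy_rows, target):
    return load(transform(cleanse(extract(legacy_rows))), target)
```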
Also, I would request you to visit http://www.businessobjects.com/solutions/im/data_migration.asp
    Cheers,
    Hakim

  • MDG-F 7.0 Data Replication using SOA

    Hello, 
There has been an issue since MDG-F 7.0 went live, related to data replication results. Once master record data is successfully replicated to SAP ECC 6.0, ECC does not send anything back to MDG-F for a status update. We intend to do some custom development to fill the gap using email notifications, but we don't have a clear picture of what needs to be done.
The requirement is that once master data is successfully or unsuccessfully updated in ECC, we want MDG-F to generate an email notification telling the requester what happened, and we want the status to be updated per master record (1, 2, 3...), per change request (1, 2, 3...), and per Edition.
    Any thought?
    Thanks,
    LUO

    Hello Luo
You can configure a custom email notification after a successful update of master data in ECC.
MDG-M: How To Send an Email Notification to the Requestor at Time of Material Activation
Another option, since you are using SOA, is to send the message back to the receiving system. Check with your PI consultant. You can configure the messages after a successful data update.
    Kiran
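As a rough illustration of the custom notification described above, the message-building part could look like this. The status values, field names, and record/CR identifiers are hypothetical, and the actual sending (e.g., via SMTP or SAPconnect) is deliberately left out:

```python
# Hypothetical sketch of building a replication-status notification per
# change request, as discussed above. Status codes ('OK'/'FAILED') and
# the message layout are illustrative; sending is out of scope here.

def build_notification(requester: str, change_request: str,
                       records: list) -> str:
    """records: list of (master_record_id, status) pairs."""
    failed = [r for r, s in records if s == "FAILED"]
    lines = [f"To: {requester}",
             f"Subject: Replication result for CR {change_request}",
             ""]
    for rec_id, status in records:
        lines.append(f"  record {rec_id}: {status}")
    lines.append("All records replicated successfully." if not failed
                 else f"{len(failed)} record(s) failed: {', '.join(failed)}")
    return "\n".join(lines)
```

A background job could call this per change request after checking the replication status, then hand the text to the mail layer.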

  • MDG-F 7.0 future planned data replication

    Hello,
From what I have learned, the valid-from date in an Edition can be used to control the timing of data replication. In my prototype in MDG-F 7.0, it is clear that GL change requests behave that way. In other words, if the valid-from date is August 1, 2014 (and today is 07/15/2014), the GL change request won't reach ECC until August 1, 2014 arrives. It behaves this way regardless of the replication timing I have chosen.
    Will the same behavior occur to cost center change request and profit center change request?
    Thanks,
    Luo

    Hello Luo
Yes, you can do that, but the real problem I faced is that in the edition screen you get the list of all the editions, and users get confused by this.
The best way is to connect your DEV/Quality systems with the respective MDG Dev and Quality systems. Use automatic replication. Perform the testing in each system. Once you get approval, you can create the master data in production a day before go-live. This has worked well in my case so far.
    Kiran

  • Replication from CRM to ECC and vice versa

    Hi,
I am a newbie and would really appreciate your advice and help. I was given the task of replicating business partners from CRM to ECC. I was advised to go through a building block in SAP Best Practices for CRM called C03: CRM Master and Data Replication. I've read through the materials and found that there are some prerequisites. Do I need to install all of the prerequisites before I can proceed with the steps given in the building block? I hope someone can reply to me as soon as possible.
    Regards,
    Julianna

    Hi,
That's right! The C03 best practices would certainly be helpful to you.
A few important steps for BP replication from CRM to ECC:
In CRM, first define your logical destination in BD54.
1) Transaction SU01 to create users: create the users in CRM and R/3.
2) Transaction SM59: create the RFC destination; here you use the users created for reaching R/3 from CRM.
3) Transaction SMOEAC (Administration Console):
here define the publication, subscription, and site (OLTP), and define the replication objects (like BUPA_MAIN, BUPA_REL).
4) Define the queues: SMQR and SMQS - inbound and outbound queue registration.
5) Check the entries on the R/3 side: make the proper entries in the R/3-side tables in SM30:
a) CRMRFCPAR
b) CRMPAROLTP
c) CRMSUBTAB
d) CRMCONSUM
These are the important steps for the middleware settings between CRM and R/3.
In addition to this, you need to correctly define:
1) in CRM, the BP groupings with an internal number range;
2) in R/3, the account group with an external number range for the business partner;
3) in R/3, the mapping of the CRM classification to the R/3 account group, e.g., classification B for customers with the corresponding account group in R/3.
You also need to correctly map the CRM partner functions to the R/3 partner functions.
    regards,
    PD
    Reward points if it helps...

  • Data replication monitoring on MDG flex mode objects with Information Steward

    Hi,
I am exploring the different options available to set up data replication monitors on SAP MDG custom objects (flex mode). Basically, my requirement is to set up rules in SAP Information Steward which identify whether there are any missing records or mismatches between the data in the MDG system and the replicated systems.
The challenge I am facing is that the tables generated by the MDG framework for flex mode are not stable: they keep changing if we change some key information in the MDG data model. Also, the generated tables are not the same in each environment (development, quality, production). I would appreciate it if anybody could provide suggestions on handling the MDG-generated tables in SAP Information Steward.
    Thanks in Advance.
    Goutham

    Hi Abdullah,
Thanks for your reply. I cannot write any ABAP code in Information Steward. As you mentioned, I can always use the MDG API to pull the active data from MDG, but here I can only give table names.
The other option you mentioned, using DRF to replicate the data to Information Steward as flat files, does not make sense to me, as I would then be validating against replicated data and not the actual data available in MDG. Another point is that I don't think it is a good idea to replicate all the records periodically.
    Thanks and Regards
    Goutham

  • SAP Data Archiving in R/3 4.6C and ECC 6.0.

    Hi Guys,
        Need some suggestions,
We are currently working on SAP R/3 4.6C and have plans to upgrade it to ECC 6.0.
In the meantime, there is a requirement for SAP data archiving to reduce the database size and increase system performance. So I wanted to know whether it is better to do data archiving before the upgrade or after, and which is technically more comfortable. I also wanted to know whether any advanced methods are available in ECC 6.0 compared to SAP R/3 4.6.
    Please provide your valuable suggestion.
    Thanks and Regards
    Deepu

    Hi Deepu,
With respect to the archiving process, there is no major difference between 4.6 and ECC 6.0. However, you may get more advantages in ECC 6.0 because of archive routing and the upgraded write and delete programs (the upgraded program will depend on your current program in the 4.6 system). For example, in a 4.6 system the MM_EKKO write program RM06EW30 archives purchasing documents based on company code in the selection criteria, and there is no preprocessing functionality. In ECC 6.0 you can archive by purchasing organization in the selection criteria, and the preprocessing functionality will additionally help in archiving POs.
If you archive documents in 4.6 and later upgrade to ECC 6.0, SAP assures that you can still retrieve the archived data.
With this, I can say that archiving after the upgrade to ECC 6.0 is better with respect to the archiving process.
    -Thanks,
    Ajay

  • Data replication and synchronization in Oracle 10g XE.

We are trying to do data replication and synchronization for all our servers. We are using Oracle 10g XE. I guess there are already some features in Oracle for replication, but I am not very sure about them.
To explain more clearly: we will have individual database servers in our sub-divisions, then divisions and centers, and then a main server. We need to synchronize at various levels. If anybody is aware of any techniques, please let me know.

    Hi,
Could you tell me exactly what kind of synchronization you are talking about?
Regarding "we will have individual database servers in our sub-divisions and then divisions and centers and then main server": if you have multiple DB servers, you can connect them with DB links. Also, if you mean DB synchronization, you can use triggers and materialized views.
We also have two independent servers which are synchronized (at least at the schema level).
    Regards!

  • Data Replication between Oracle 9i and 10g

    Hello,
I have a question regarding possible replication models between Oracle 9i and 10g. Does anybody know a way to synchronize the schema data between a 9i and a 10g database in real time?
    If yes can you please post perhaps a link with a kind of how to?
    Many thanks to all,
    Bob...

You can read Metalink note 370850.1 - a few more ways of replication are discussed there.
Yes, the platforms can be different, but the recommendation is to use the same platform for both databases.

  • Data Transfer Erec and ECC HR

    Hi all,
We have two backend systems: one for HR (separate from E-Rec) and one for E-Rec.
We have set up the ALE data transfer using message type HRMD_ABA.
In the E-Rec backend we have only the ERECRUIT component deployed; no EA-HR or SAP HR component is deployed,
meaning there are no PA* tables.
Now when we transfer a personnel number (HRP1001, relationship A008 with a position) to the E-Rec box, it says no pernr exists.
However, I cannot transfer the pernr from ECC to E-Rec, as we don't have the PA* tables in E-Rec.
Can you share the object types that are transferred from ECC to E-Rec and from E-Rec to ECC?
What components need to be deployed in the backend E-Rec box?
In the Portal, which backend (system object) should be used for the requisitions application, and which for the recruiter and recruitment admin applications?
    Thanks,
    Nachy

    Hi,
    Check the link :
    http://help.sap.com/saphelp_erp2005/helpdata/en/45/8150635e9c40c1e10000000a1553f7/frameset.htm
The supported infotypes for transferring data from E-Recruiting to the HR system are the PA infotypes 0, 1, 2, and 6. There are 15 fields sent to PA48 if they were filled in the E-Recruiting system, which are:
- forename, initials, surname
- gender
- birth date
- correspondence language
- address (street, city, street code, region, country)
- hiring date
- organizational unit
- position
- personnel number (for internal candidates)
So these are the infotypes which can be transferred automatically and cause no problems. If you want the title to be transferred additionally, you have to adjust the system behaviour accordingly.
To set up the data transfer:
In the E-Rec system, use transaction BD64 with partner type LS to generate the partner profile for the ECC system.
Use transaction WE20 to check the partner profiles.
For the ECC partner profile, enter outbound parameters with message types HRMD_ABA and SYNCH, and inbound parameters with message type HRMD_ABA.
In the ECC system, use BD64 with partner type LS to generate the partner profile for the E-Rec system.
Use transaction WE20 to check the partner profiles.
For the E-Rec partner profile, enter outbound parameters with message types HRMD_ABA and SYNCH, and inbound parameters with message type HRMD_ABA.
After transferring the data with RHALEINI, use transaction SM58 to check whether the data was transferred.
    Hope it helps!
    Arpita

  • Data Inconsistency for 0EMPLOYEE between ECC and BI

    Hi,
We do a full load to 0EMPLOYEE using 0EMPLOYEE_ATTR from ECC. Records were deleted for a lot of employees (some action types) in ECC. This has caused a data inconsistency for the 0EMPLOYEE master data (time-dependent) between ECC and BI. In BI we have more records for these employees (the additional records have time-dependent ranges that were deleted from ECC but still exist in BI). These employee records are already being used in a lot of InfoProviders. Is there an efficient way to fix this issue? One solution is to delete the data from all InfoProviders and then delete it from the 0EMPLOYEE master data, but the deletion of employee records can happen quite often, so we don't want to take this route. Also, I tried to reorganize the master data attributes for 0EMPLOYEE through a process chain, but that didn't work either.
    Message was edited by:
            Ripel Desai

    Hi Ripel,
I share your pain. This is one of the real pains of time-dependent master data in BW. I have been in your exact position, and the only way around the issue for me was to clear out all the cubes that used 0EMPLOYEE and then delete and reload the 0EMPLOYEE data.
I know this response doesn't help you much, but at least you are not alone.
    Regards,
    Pete
    http://www.teklink.co.uk

  • Synching the data between MDM and ECC

    Hi All,
We have been on the SAP MDM central master data management scenario for more than a year now. In the past we have seen many issues, like syndication server failures and IDoc failures, due to which the data is not 100% in sync between MDM and ECC.
Though we are working on the IDoc failure tracking process, my question is this:
I need to create an automated program which runs on a weekly basis, finds the data which is out of sync between MDM and ECC, and reports it. So the logic should be something like this: pick up the records updated during the last week in MDM, look for those records in ECC, compare field by field, and give me the delta.
Since it is a central master data management scenario, there are cases in which records are created in MDM but not present in ECC, maybe because of syndication failures or IDoc failures.
Has anyone worked on a similar sort of program/logic? Kindly share your experiences as well on keeping the data in sync.
    Regards,
    Indraveer
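The comparison logic described in the question above could be sketched like this. The snapshots are plain dictionaries standing in for the real data access (MDM API reads on one side, ECC table reads on the other); record IDs and field names are hypothetical:

```python
# Hypothetical sketch of the weekly MDM/ECC delta check described above:
# take the records changed in MDM during the last week, look them up in
# ECC, and report missing records and field-level mismatches. Real data
# access (MDM API / RFC reads) is stubbed out as in-memory dictionaries.

def weekly_delta(mdm_changed: dict, ecc: dict) -> dict:
    """mdm_changed/ecc: {record_id: {field: value}} snapshots."""
    missing_in_ecc, mismatches = [], {}
    for rec_id, mdm_fields in mdm_changed.items():
        ecc_fields = ecc.get(rec_id)
        if ecc_fields is None:
            missing_in_ecc.append(rec_id)      # e.g. syndication/IDoc failure
            continue
        diff = {f: (v, ecc_fields.get(f))
                for f, v in mdm_fields.items() if ecc_fields.get(f) != v}
        if diff:
            mismatches[rec_id] = diff          # field: (mdm_value, ecc_value)
    return {"missing_in_ecc": missing_in_ecc, "mismatches": mismatches}
```

Records created in MDM but never reaching ECC land in `missing_in_ecc`, which matches the syndication/IDoc failure case mentioned in the question.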

    Hello Indraveer
Yes, you can do it.
You can configure the exchange between ECC and MDM on a schedule, either directly or through PI,
or use the MDM API: SAP MDM 7.1 has its own API for ABAP, Java, and .NET (5.5 for ABAP, Java, and COM).
That may be an ABAP transaction or a Java/.NET application running on a schedule.
You can read more about the MDM API here:
      ABAP
    http://help.sap.com/saphelp_nwmdm71/helpdata/en/48/edff445128209ce10000000a42189d/frameset.htm
    JAVA and .NET
    http://help.sap.com/saphelp_nwmdm71/helpdata/en/13/041975d8ce4d4287d5205816ea955a/frameset.htm
    JavaDocs
    http://help.sap.com/javadocs/MDM71/index.html
    Regards
    Kanstantsin Chernichenka

  • SCCM 2012 Site communication and data replication

    Hi,
Are there any videos or diagrams which clearly explain the site communication, data replication, and file-based replication?
    Regards,
    MTM

    Hi,
The blogs below might be helpful; you could have a look.
    Data Replication in System Center 2012 Configuration Manager
    SCCM
    ConfigMgr 2012 Site to Site replication and SQL Replication
    Best Regards,
    Joyce Li
