Central MDM Performance

MDM: 7.1 SP6
CE 7.2
Fields in main table (customer): 75
Total records: approx 500,000
MDM JAVA API and Web Services Configurator used
Hi,
Has anyone who is using MDM for central master data maintenance carried out performance tests on their systems? Do you get good performance (let's say 3-5 seconds response time)? Does your system scale when multiple users send concurrent requests to the MDM server to update records? Do you experience a bottleneck due to the MDM product design?
Any learnings will be much appreciated...
Thanks and regards,
Shehryar
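For reference, a load test of the kind described here can be scripted with plain JDK concurrency utilities. The sketch below is only illustrative: updateCustomer() is a placeholder for whatever call your application actually makes (an MDM Java API command, a generated web service, etc.), and the user/request counts are arbitrary.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicLong;

// Minimal concurrency/response-time probe for central maintenance scenarios.
public class MdmLoadProbe {

    // Placeholder: simulate one update round trip. Replace the sleep with the
    // real MDM Java API command or web service call used by your application.
    static void updateCustomer(int recordNo) throws Exception {
        Thread.sleep(50);
    }

    public static void main(String[] args) throws Exception {
        int users = 20;               // simulated concurrent users (arbitrary)
        int requestsPerUser = 10;     // requests per user (arbitrary)
        int total = users * requestsPerUser;

        ExecutorService pool = Executors.newFixedThreadPool(users);
        AtomicLong totalMillis = new AtomicLong();
        CountDownLatch done = new CountDownLatch(total);

        for (int i = 0; i < total; i++) {
            final int recordNo = i;
            pool.submit(() -> {
                long start = System.nanoTime();
                try {
                    updateCustomer(recordNo);
                } catch (Exception e) {
                    e.printStackTrace();
                } finally {
                    totalMillis.addAndGet((System.nanoTime() - start) / 1_000_000);
                    done.countDown();
                }
            });
        }
        done.await();
        pool.shutdown();
        System.out.printf("average response: %d ms over %d requests%n",
                totalMillis.get() / total, total);
    }
}
```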

Hi Shehryar,
What operating system do you use? I have experienced that, especially on AIX/Unix systems, performance issues occur because of incorrect memory-handling parameter settings (e.g. "ulimit"). As an example: although one of my customers had 60 GB RAM, we got memory and therefore performance issues because ulimit was set to 32 MB. In general, it is really important to have all these parameters set correctly!
Please also check the following SAP Notes:
1282668 - Performance settings for AIX
973227 - AIX Virtual Memory Management- Tuning Recommendations,
1012745 - Tips for improving MDM performance,
1240587 - MDM 5.5 and 7.1 Low performance when MDM workflow is used,
Moreover, please keep in mind that MDM Data Manager is actually an administrative tool, i.e. not really suitable for mass user access! Please therefore also check Note 1512323 - Number of Data Manager instances affects performance. For that sort of access you should instead use e.g. Portal or CE (--> BPM) etc.
If you don't use Portal or CE, please set up your environment in such a way that the import server is used instead of manual modifications!
Example: instead of having 10 users logged on via a rich client doing mass changes, you might want to prepare an upload template file (XML or TXT format) that you use to feed the import server. The Import Server does the job right away; all you need to do is check whether the import went through correctly.
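As an illustration of that file-based approach, here is a minimal sketch that writes a tab-delimited upload file and moves it into a port's Ready folder for the Import Server to pick up. The path, file layout and port name are assumptions; check your actual distribution-root structure and port configuration.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.util.List;

// Illustrative only: drop a tab-delimited upload file into an inbound port folder
// so the MDM Import Server can process it. The path below is a placeholder.
public class PortFileDrop {
    public static void main(String[] args) throws IOException {
        Path ready = Paths.get("/mdm/distributions/MyRepository/ERP_DEV/Inbound/CUSTOMER_PORT/Ready");

        List<String> lines = List.of(
                "CUSTOMER_ID\tNAME\tCITY",
                "100001\tACME Ltd\tLondon",
                "100002\tGlobex GmbH\tBerlin");

        // Write next to the Ready folder first, then move atomically so the
        // import server never picks up a half-written file.
        Path tmp = Files.createTempFile(ready.getParent(), "customer_upload_", ".txt");
        Files.write(tmp, lines, StandardCharsets.UTF_8);
        Files.move(tmp, ready.resolve(tmp.getFileName()), StandardCopyOption.ATOMIC_MOVE);
        System.out.println("Dropped " + tmp.getFileName() + " for the import server.");
    }
}
```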
Please also do not forget to delete all completed workflows, as they have a big impact on MDS performance! As of SP06 it is possible to archive them, which should be done periodically (e.g. via CLIX). If you trigger many workflows, this really helps to improve performance.
Another important point is the inbound/outbound directories. Here, you might want to set up periodic jobs that empty your Archive folders (Repository --> Remote System --> Port --> Archive).
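A housekeeping job for those Archive folders can be as small as the sketch below (the path and the 30-day retention are assumptions; in practice you would schedule it via cron or the OS scheduler):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.attribute.FileTime;
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.stream.Stream;

// Housekeeping sketch: delete files older than 30 days from a port Archive folder.
public class ArchiveCleanup {
    public static void main(String[] args) throws IOException {
        Path archive = Paths.get("/mdm/distributions/MyRepository/ERP_DEV/Inbound/CUSTOMER_PORT/Archive");
        FileTime cutoff = FileTime.from(Instant.now().minus(30, ChronoUnit.DAYS));

        try (Stream<Path> files = Files.list(archive)) {
            files.filter(Files::isRegularFile)
                 .filter(p -> {
                     try { return Files.getLastModifiedTime(p).compareTo(cutoff) < 0; }
                     catch (IOException e) { return false; }
                 })
                 .forEach(p -> {
                     try { Files.delete(p); System.out.println("deleted " + p); }
                     catch (IOException e) { System.err.println("could not delete " + p); }
                 });
        }
    }
}
```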
Moreover, delete all unreferenced tables and fields in your data model. You might also want to deactivate change tracking on all fields (in case it is activated). Please also check the Sort Index parameters, whether you need stemmers, and so on.
It looks simple to set up a properly working MDM system; however, it requires a lot of experience and technical expertise!
Hope this helps!
Best Regards,
Erdal

Similar Messages

  • Centralized MDM

    Just a question on a type of business scenario: has anyone out there successfully implemented a completely centralized MDM implementation? As an example, modeling a Customer repository (in MDM) that totally and accurately replicates the Customer structure (and all related structures) in SAP. Have you set up an MDM environment where business users maintain every element of a Customer master record (as an example) in MDM (via Data Manager), with automatic/continuous syndication outbound to SAP (no inbound activity)? As part of your centralized MDM/SAP solution, have you totally disabled all SAP transactions that would allow users to add or modify any element of a master record in SAP (i.e. XD01, XD02, ...)? Although MDM is supplied with base Customer/Material/Vendor, etc. repositories and base syndication maps, if anyone has embarked on a full-fledged, centralized implementation of master data, these supplied components require a great deal of expansion and rework to make this type of implementation possible. Just wondering how many out there are using (or have attempted to use) MDM in this way, and, if so, what type of obstacles and/or limitations have you encountered? MMF

    Hi MMF,
    In this kind of scenario, you have to change the fields and maps according to your requirements. Create the required ports for your R/3 systems and syndicate data to these ports. You should not give users access to create or change customers in R/3; only display access should be given.
    Hope this is helpful to you.
    Regards,
    Dheeraj.

  • Best approach for syndication in Central MDM

    MDM 7.1
    CE 7.2
    ERP 6 EHP4
    PI 7.1 EHP1
    We are currently developing a custom application using CE/BPM workflow for central maintenance of customer master data. One of the topics under discussion is the right approach for syndication once a record is complete.
    This SAP document on collaborative material master data creation (http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/60a3118e-3c3e-2d10-d899-ddd0b963beba?quicklink=downloads&overridelayout=true) provides one way to achieve this syndication, by first calling a web service from BPM to create the record in ERP before checking it in in MDM. While I am personally fine with the approach, some of my colleagues aren't too keen on issuing synchronous calls from BPM. Rather, they would like to use the syndication engine of MDM to transmit data to downstream systems (currently only SAP ERP) using IDocs. But there is a caveat here: to use syndication, the record has to be checked in.
    The problem is that once the record is checked in in MDM, it is open for modification. However, the asynchronous call to ERP using IDocs for creation of the customer master might fail for any number of reasons. In that case, the MDM record might need a modification before being resubmitted to ERP. In the meantime, since the record was checked in before syndication, someone else might have checked it out, potentially resulting in data quality issues. To avoid this situation, the developer has decided to take the approach: check in -> syndicate -> check out -> wait for confirmation IDoc -> check in if successful. This isn't a clean approach to syndication, but it might address the record locking issue.
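    For illustration, the sequence described above could be structured roughly as follows. Every helper method here is a placeholder, not an actual MDM Java API or BPM call:

    ```java
    // Pseudostructure of the sequence described above. None of these helper
    // methods are real MDM Java API calls; they stand in for whatever commands
    // or services the application actually uses.
    public class SyndicationSequence {

        public void distribute(String recordId) throws InterruptedException {
            checkIn(recordId);            // record must be checked in before syndication
            triggerSyndication(recordId); // syndication server sends the IDoc to ERP
            checkOut(recordId);           // re-checkout so nobody modifies it meanwhile

            boolean created = waitForErpConfirmation(recordId, 10 /* minutes */);
            if (created) {
                checkIn(recordId);        // success: release the record again
            } else {
                flagForRework(recordId);  // failure: keep it checked out for correction
            }
        }

        // --- placeholders ---
        void checkIn(String id) {}
        void checkOut(String id) {}
        void triggerSyndication(String id) {}
        boolean waitForErpConfirmation(String id, int timeoutMinutes) throws InterruptedException {
            Thread.sleep(1000); // in reality: poll a status field or key mapping
            return true;
        }
        void flagForRework(String id) {}
    }
    ```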
    Another consideration is to design the application with the view that, sometime in the future, this master data might have to be syndicated to other SAP and non-SAP systems as well. Ensuring that syndication to all downstream systems is complete before checking in in MDM can be a tricky requirement and might need some complex ccBPM development, or evaluating something similar to a two-phase commit (which might be overkill). In any case, a best-practice approach for keeping downstream systems in sync with MDM in a central MDM scenario has to be shared by SAP. So it would be good to have comments from the people who developed the reference application for collaborative material master data creation.
    If there are any customers who have come up with a custom solution which works, please do share the experience.
    Thanks and regards,
    Shehryar

    Thanks Ravi. While there is more than one possible solution to the immediate problem, I am actually looking for a design pattern which SAP recommends, or which a customer has developed, to address the issues related to synchronization of master data in a central MDM environment.
    The idea behind a central master data management function, as you know, is that all participating business systems use the same basic master data being authored in MDM. This data has to be synchronized with all participating systems, rather than just one system. To me, a ccBPM workflow or a two-phase-commit design pattern seems to be the solution. But it would be good to know how other customers are addressing the issue of master data synchronization with multiple systems, or SAP's recommendations for this issue.
    Regards,
    Shehryar

  • Central MDM

    Hi Experts,
    Can anyone tell me about central master data management?
    Is it centrally carrying out processes such as consolidation and harmonization?
    I am also referring to the help.sap.com links. Please help me understand by sharing your experience if anybody has worked on such a scenario.
    Thanks in advance.

    Hi,
    Before sharing some experiences with centrally managed master data, I'd like to share definitions of the basic MDM scenarios.
    Master Data Consolidation
    Importing data from different systems using MDM Import Manager and consolidating it, i.e. identifying duplicates, etc.
    Involves MDM Import Manager and MDM Data Manager.
    Master Data Harmonization
    Harmonization involves importing data from different systems, consolidating it and then distributing the consolidated master data to different systems.
    Uses MDM Import Manager, MDM Data Manager and MDM Syndicator.
    Central Master Data Management
    This is the creation of master data in MDM itself, i.e. using Data Manager, and then syndicating the master records to the desired systems.
    To be precise, CMDM means that all master data is maintained in the central MDM system. No maintenance of global master data fields occurs in the connected client systems, and no transactional data is kept either.
    In particular, sub-tables containing customizing data (like company codes, etc.) will always need to be consolidated before the central creation scenario becomes operational.
    Central master data management with MDM is not an isolated scenario. Usually, depending on the business case and on the implementation phase, the other scenarios will also be necessary.
    In practice, more scenarios can be built from a combination of the three main scenarios (MDC, MDH and CMDM). For example, some customers implement a mixture of MDC and CMDM. In that case, although data is created centrally in MDM, local master data may still be created in the client systems and then consolidated in MDM.
    From one of the sources I read: consolidation and harmonization procedures are very different from CMDM (central master data management); however, CMDM is often not very useful without considering the other two as well.
    Consolidation essentially means creating identity around your master data across your landscape. More specifically, you want to identify duplicate records and merge them into one record. Harmonization is the process of pushing the new, cleansed data back out to your partner systems.
    Central master data management is the process of creating and managing enterprise-level attributes in one place. One common mistake people make is to think that central master data management means having one place to create your records, containing ALL data. This is not correct; MDM focuses on enterprise-level attributes. In other words, which attributes are KPIs, shared across multiple processes in scope, shared across multiple systems, important for reporting, in need of the highest quality, etc. Once you have determined your data model, you begin developing the workflow (if needed) around the creation of master data. This way you can easily keep your data clean and free of duplicates going forward, following a consolidation and harmonization process.
    I am sure this will be helpful to you and all SDNers.
    Regards,
    Krutarth

  • MDM performance test

    Hi all,
    Has anyone done a performance test on MDM? Any test script would be really helpful!
    Thanks in advance.

    Hi,
    MDM performance needs to be monitored in basically two main MDM activities when it comes to working with MDM in real time:
    1) MDM Importing
    2) MDM Syndication
    MDM Importing:
    - Whenever you use MDM in an IT landscape, there will be many source systems from which data is fed into MDM.
    - This data may either be imported manually or, as in most cases, automatically using the MDM Import Server.
    - In either case you will have to deal with thousands of records.
    - Importing such a large set of records affects performance at every stage of the import (from the record matching steps through to the final import).
    - Performance degradation slows down the import process, right from opening a saved map or even doing field or value mapping.
    - Above all, if a single record in an import chunk fails due to an exception, the entire set fails.
    - So care must be taken to improve the performance of importing records.
    - This is done by taking the following into consideration:
    Areas to focus on for improving performance during importing:
    1) The chunk size, which defines the number of records to be imported at a time
    2) The number of records processed in parallel (MDIS settings)
    3) The number of fields mapped
    4) The number of matching fields used
    5) The number of validations and assignments running on the records, etc.
    MDM Syndication:
    - Along similar lines, MDM performance should also be monitored when harmonizing data to the target systems.
    - Selecting the records to be syndicated using search criteria, suppressing unchanged records, key mapping, etc. can all affect MDM performance adversely.
    You can in general create test scripts that monitor MDM system performance in these two prime areas as well as in general MDM activities, using different operating systems and different sizing, as MDM performance will differ according to the hardware and software used in each case, including disk space, cache and RAM.
    You can create test scripts for the following activities using different configurations, and thus compare and test the performance of the MDM system under different conditions, with the expected output set against the actual output (a simple timing harness is sketched after the list):
    - Testing mounting/unmounting a repository
    - Loading/unloading a repository
    - Exporting/importing the schema
    - Archiving/unarchiving a repository
    - Creating master/slave repositories
    - Importing different sets of records using manual and automatic methods
    - Syndicating different sets of records using manual and automatic methods
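    A simple timing harness for such test scripts might look like the sketch below; the operation bodies are placeholders to be filled with the actual API calls or CLIX invocations:

    ```java
    import java.util.LinkedHashMap;
    import java.util.Map;

    // Skeleton of a timing harness for the activities listed above. Each entry
    // wraps one operation behind a Runnable so elapsed time is recorded uniformly;
    // the bodies are placeholders for the real API or CLIX calls.
    public class MdmTimingHarness {

        static long time(Runnable op) {
            long start = System.nanoTime();
            op.run();
            return (System.nanoTime() - start) / 1_000_000; // milliseconds
        }

        public static void main(String[] args) {
            Map<String, Runnable> tests = new LinkedHashMap<>();
            tests.put("mount repository",      () -> { /* mount call here */ });
            tests.put("load repository",       () -> { /* load call here */ });
            tests.put("import 10k records",    () -> { /* import call here */ });
            tests.put("syndicate 10k records", () -> { /* syndication call here */ });

            for (Map.Entry<String, Runnable> t : tests.entrySet()) {
                System.out.printf("%-25s %6d ms%n", t.getKey(), time(t.getValue()));
            }
        }
    }
    ```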
    Hope it helped
    Thanks & Regards
    Simona Pinto

  • MDM performance

    Hello
    We are in the blueprint phase of an MDM project. We want to use MDM for products and price conditions. We have about 100' products and about 4 million price conditions. We plan to create a repository where the main table holds the price conditions (4 million records), using a lookup table for the materials (about 100' records). We receive updates on both materials and price conditions daily.
    I have two questions here:
    Does anyone have any experience with such large amounts of data (performance and so on)?
    And a specific one: the price conditions file that we receive contains several materials that we don't have. Is it possible NOT to import these records (records that don't automap to the materials lookup table)?
    Regards,
    Anders

    Hi Anders,
    We have loaded around 1.3 million main table records. My experience loading such a big quantity was not that impressive when we used Import Manager. Here are the steps we followed:
    1] It was not possible to load all records at the same time.
    2] We broke the file into, say, 30,000-50,000 records per file (see the splitter sketch below).
    3] It took a lot of time, first at the "Record Matching" step and then to actually import the records.
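    A splitter for step 2] can be sketched as follows; file names and chunk size are placeholders, and the header row is repeated in each chunk so every file can be imported with the same map:

    ```java
    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    // Split a large delimited source file into chunks of a fixed number of records,
    // repeating the header line in every chunk file. File names are placeholders.
    public class FileSplitter {
        public static void main(String[] args) throws IOException {
            int chunkSize = 50_000;
            try (BufferedReader in = Files.newBufferedReader(Paths.get("customers_full.txt"), StandardCharsets.UTF_8)) {
                String header = in.readLine();
                String line;
                int count = 0, part = 0;
                BufferedWriter out = null;
                while ((line = in.readLine()) != null) {
                    if (count % chunkSize == 0) {               // start a new chunk file
                        if (out != null) out.close();
                        out = Files.newBufferedWriter(
                                Paths.get(String.format("customers_part_%03d.txt", ++part)),
                                StandardCharsets.UTF_8);
                        out.write(header);
                        out.newLine();
                    }
                    out.write(line);
                    out.newLine();
                    count++;
                }
                if (out != null) out.close();
            }
        }
    }
    ```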
    Fortunately, we were on SP04 and had batch clients at that time, which helped me a lot. But again, we didn't see any performance improvement in actually importing the records; we could just save time in: 1. opening Import Manager, 2. loading my saved map.
    We have not tried loading such amounts in the latest versions. Please share your experience once done.
    Regarding the second question, that's not possible; you will have to change your source file for that, or import the records as is, mapping unmapped "Material" values to "NULL" in the destination.
    Regards,
    Dheeraj.

  • MDM Performance Metrics

    We have an interface coming up where we have to handle 10 million DB records from a source file, process them and load them into the MDM database. Does anybody have any performance figures on MDM and PI? I would greatly appreciate any help; I scoured the net as well as SAP services to no avail.

    Hi,
    there should be no difference in performance because both products share the very same kernel.
    Of course, EE has features like partitioning which make a huge difference when they are used.
    So the answer clearly depends...

  • How to lock R/3 master data transactions when using the Central MDM scenario?

    Hello Colleagues.
    We are implementing a central master data management scenario: all master data will be created/changed within MDM, so we need to block create/change in R/3 for the fields that are maintained in MDM for the master records.
    In a previous project we used authorization profiles to achieve this. I suppose another option is to use field status configuration.
    However, I was wondering if SAP offers something standard to achieve this blocking within R/3?
    Thank you for your answers.
    Regards,
    Jorge.

    Hi Jorge,
    SAP MDM has already released central master data management as a standard scenario. Also, SAP MDM SP04 has more features which allow easy integration with the standard R/3 repositories for CRM, SRM, etc. MDM repositories also have a facility for creating customized roles, which will help you block certain access for users.
    Hope this answers your requirement.
    Regards
    Veera

  • Central Repository Performance

    We have recently upgraded to Data Services 4.2 SP1 Patch 4. We have been seeing very slow performance when checking things in and out of our secured central development repository. The performance seems worse than it was in our 4.0 environment.
    I know that we have a large volume of objects in our central repository (84 projects, 419 jobs, etc.) and we have about 70 local development repositories hooked to it. I am sure these may all be contributing factors to performance, but are there other things I can be on the lookout for, or settings that could help improve performance?

    As a follow-up, here is some of the performance degradation we are seeing. This includes logging in to our 4.2 repositories as well as doing a "get latest" on projects of various sizes. The 4.0 test was done with a Designer on a virtual server, the first 4.2 test was also a Designer on a virtual machine, and the last was with Designer on the Data Services job server.

  • MDM Performance problems

    Hi All:
    We are suffering from terrible connection problems when we try to connect to the MDM server using MDM Data Manager. To give an example, sometimes we wait for more than 10 minutes for the connection and the repository load to complete. This would obviously make sense if we had all the records loaded and an enormous repository to load; however, this is not the case.
    Have you ever experienced repository loading problems or any other sort of performance problem? How are you handling it? What sort of best practices have you implemented?
    Kind regards
    Gonzalo Perez-Prim

    Have you sized your MDM server properly?
    The MDM sizing guide is available in the Service Marketplace.
    For running client applications, I would recommend at least 2 GB of RAM.
    CPU consumption is high when you are executing matching strategies or running Import Manager.
    If the repository has a lot of binary data (images), then archiving takes a lot of time. We are seeing an hour to archive a repository and 20 to 25 minutes to load the repository.
    This is despite the fact that sizing was done properly.

  • SAP MDM and Zycus or Zoomix or Kalido or Trillium

    Hi All,
    Please, could you help me to better understand which is the best enterprise-wide master data management software solution for harmonizing, storing and managing master data over time?
    Even if I use SAP MDM, I still need to cleanse and harmonize master data.
    What about Zynapse? Kalido? Zoomix? Trillium?
    Thank you in advance
    Regards,
    da

    Hi,
    Kalido MDM:
    Master (reference) data management is thought of as a special form of data warehouse federation (a ‘federated data warehouse’ consists of a set of data warehouse instances that operate semi-autonomously, are generally geographically or organizationally disparate, but which can be thought of as one large data warehouse).
    However, experience shows that such systems are complex to manage and that, over time, the absence of a common warehouse (a single common source of master data) leads to inconsistency and frequent errors in transactional data.
    Furthermore, such systems generally do not lend themselves to being rapidly adapted to new business requirements, since they are often dependent on writing low-level code to effect the revisions.
    The high-level architecture is a federated set of data warehouses with integrated data, based on maintenance of master data in a Master Data Warehouse, which supplies master data to both the data warehouse federation and the underlying source systems.
    It is important to note here that the Master Data Warehouse is in effect an operational system, since it manages all master data flowing through the organization.
    This demands effective processes and procedures for the management and ownership of master data.
    SAP MDM
    Companies striving to accelerate business growth and improve business performance need to innovate and remain flexible to dynamic market forces. NetWeaver provides the ideal platform for enabling such innovation and flexibility. Satyam’s NetWeaver Practice provides a host of solutions based on the primary user interface (EP) framework, the primary integration engine (XI), the primary data management tool (MDM), and the business process orchestration tool that enable customers to:
         Design, build, and implement new business processes
         Knit together disparate processes and systems
         Develop a unified view of information from every facet of the organization, and deliver it to stakeholders when and how they need it through a dedicated NetWeaver Center of Excellence
    SAP Master Data Management (SAP MDM) enables master data on customers, partners and products to be consolidated and harmonized across the enterprise, making it available to all staff and business partners. A key component of SAP NetWeaver, SAP MDM ensures data integrity across all IT systems.
    The SAP NetWeaver Master Data Management (SAP NetWeaver MDM) component of SAP NetWeaver creates the preconditions for enterprise services and business process management. The functionality represents customers, products, employees, vendors, and user-defined data objects in unified form. With SAP NetWeaver MDM, customers can manage master data and supplemental content, such as texts, PDF documents, high-resolution images, or diagrams in a central business information warehouse.
    SAP Master Data Management (SAP MDM) is a component of SAP's NetWeaver product group and is used as a platform to consolidate, cleanse and synchronise a single version of the truth for master data within a heterogeneous application landscape. It has the ability to distribute internally and externally to SAP and non-SAP applications. SAP MDM is a key enabler of SAP Enterprise Service-Oriented Architecture. Standard system architecture would consist of a single central MDM server connected to client systems through SAP Exchange Infrastructure using XML documents, although connectivity without SAP XI can also be achieved. There are five standard implementation scenarios:
    Content Consolidation - centralised cleansing, de-duplication and consolidation, enabling key mapping and consolidated group reporting in SAP BI. No re-distribution of cleansed data.
    Master Data Harmonisation - as for Content Consolidation, plus re-distribution of cleansed, consolidated master data.
    Central Master Data Management - as for Master Data Harmonisation, but all master data is maintained in the central MDM system. No maintenance of master data occurs in the connected client systems.
    Rich Product Content Management - Catalogue management and publishing. Uses elements of Content Consolidation to centrally store rich content (images, PDF files, video, sound etc.) together with standard content in order to produce product catalogues (web or print). Has standard adapters to export content to DTP packages.
    Global Data Synchronization - provides consistent trade item information exchange with retailers through data hubs (e.g. 1SYNC). Some features (for example, workflow) require custom development, as screens for end users are not provided out of the box.
    Trillium :
    It is basically used for data profiling. It can be used for data enrichment with certain address masters and other reference masters available out of the box. A lot of data quality analysis can be done in Trillium.
    Advantages of Zoomix :
    1.Fast, accurate, automated, self-learning system :
    2.No scripting or rule development required – Zoomix Accelerator never requires the development of scripts or the manual definition or maintenance of data processing rules. The system automatically interprets data to learn the optimal matching, normalization and classification required for any data domain.
    3.Processes all types of complex corporate data – Zoomix Accelerator works equally well with product data, customer data, supplier data, financial data, etc. – in any language or combination of languages.
    4.Rapid integration into existing applications and business workflows – The system's integration with daily user workflow provides objective, on-the-fly data correction. Erroneous, duplicated and incorrectly categorized data are repaired immediately, without disrupting routine business functions.
    Regards
    nisha

  • What are the advantages of MDM

    What is the exact benefit for a company if it implements MDM?

    Hi Ganishetti,
    Master data is data about your customers, products, suppliers, etc. The quality of master data has an impact on transactional and analytical data.
    What MDM Does:
    Aggregate master data across SAP and non-SAP systems into a centralized master data repository. Once data is consolidated, you can search for data across linked systems, identify identical or similar objects across systems, and provide key mapping for reliable company-wide analytics and reporting.
    Consolidate and harmonize master data from heterogeneous systems. Ensure high-quality master data by distributing harmonized, globally relevant data using distribution mechanisms. Allow subscribing applications to enrich master data with locally relevant information.
    Support company quality standards by ensuring central control of master data, including maintenance and storage.
    Distribute centrally created master data to client systems as required using distribution mechanisms.
    System consolidation from R/3, ERP and other sources.
    Direct ODBC system access, flat file extracts, 3rd-party application data, XML sources and many more…
    Single-pass data transformation, auto-mapping, validation rules, and exception handling.
    Business users can define matching rules and complex matching strategies, and conduct data profiling.
    Data Enrichment Controller to use 3rd-party sources and other partners for address completion, company validation and data enrichment.
    Search and compare records, and identify sub-attributes for consolidation, in sub-second response times.
    Merge records seamlessly, tracking source systems with built-in key mappings.
    Leverage out-of-the-box data models for consolidating data.
    Consolidation has never been easier:
    Extract, cleanse and consolidate master data.
    HARMONIZATION
    Cleanse and distribute across the entire landscape.
    CENTRAL MDM
    Create consistent master data centrally from the start.
    Advantages of MDM
    • End-to-end solution. The MDM system provides an end-to-end solution that automates the entire process of managing master data from start to finish, including bulk data import, centralized master data management, and published output to a variety of media.
    • Database-driven system. MDM layers a thick shell of functionality on top of a powerful SQL-based DBMS, so that the MDM system is fully scalable and the master data is fully accessible to other SQL-based applications and tools.
    • Large capacity. The MDM system efficiently manages master data repositories containing up to millions of records.
    • Superior performance. MDM breaks through SQL performance bottlenecks to deliver blazingly fast performance that is measured in milliseconds rather than seconds and is literally 100–1000 times that of a SQL DBMS alone. No other system on the market today delivers comparable performance.
    • Powerful search and retrieval. All of the MDM modules include powerful search and retrieval capabilities, so that an entire repository of thousands or millions of items can be easily searched and any item or group of items located in a matter of seconds.
    You can follow this blog:
    /people/karen.comer/blog/2006/12/19/understanding-sap-netweaver-master-data-management-from-an-sap-netweaver-business-intelligence-perspective
    Regards,
    Vinay Yadav

  • SAP MDM, ORACLE MDM, Microsoft MDM and IBM MDM

    Could someone discuss the differences and similarities between these MDM offerings:
    SAP MDM, Oracle MDM, Microsoft MDM and IBM MDM? Who dominates, or will dominate, the market?
    Thank you.
    Priya.

    Hi Priya,
    Here are the details regarding SAP MDM, Oracle MDM, Microsoft MDM and IBM MDM.
    SAP Master Data Management
    SAP Master Data Management (SAP MDM) enables master data on customers, partners and products to be consolidated and harmonized across the enterprise, making it available to all staff and business partners. A key component of SAP NetWeaver, SAP MDM ensures data integrity across all IT systems.
    The SAP NetWeaver Master Data Management (SAP NetWeaver MDM) component of SAP NetWeaver creates the preconditions for enterprise services and business process management. The functionality represents customers, products, employees, vendors, and user-defined data objects in unified form. With SAP NetWeaver MDM, customers can manage master data and supplemental content, such as texts, PDF documents, high-resolution images, or diagrams in a central business information warehouse.
    SAP Master Data Management (SAP MDM) is a component of SAP's NetWeaver product group and is used as a platform to consolidate, cleanse and synchronise a single version of the truth for master data within a heterogeneous application landscape. It has the ability to distribute internally and externally to SAP and non-SAP applications. SAP MDM is a key enabler of SAP Enterprise Service-Oriented Architecture. Standard system architecture would consist of a single central MDM server connected to client systems through SAP Exchange Infrastructure using XML documents, although connectivity without SAP XI can also be achieved. There are five standard implementation scenarios:
    Content Consolidation - centralised cleansing, de-duplication and consolidation, enabling key mapping and consolidated group reporting in SAP BI. No re-distribution of cleansed data.
    Master Data Harmonisation - as for Content Consolidation, plus re-distribution of cleansed, consolidated master data.
    Central Master Data Management - as for Master Data Harmonisation, but all master data is maintained in the central MDM system. No maintenance of master data occurs in the connected client systems.
    Rich Product Content Management - Catalogue management and publishing. Uses elements of Content Consolidation to centrally store rich content (images, PDF files, video, sound etc.) together with standard content in order to produce product catalogues (web or print). Has standard adapters to export content to DTP packages.
    Global Data Synchronization - provides consistent trade item information exchange with retailers through data hubs (e.g. 1SYNC). Some features (for example, workflow) require custom development, as screens for end users are not provided out of the box.
    History
    SAP is currently on its second iteration of MDM software. Facing limited adoption of its initial release, SAP changed direction and in 2004 purchased a small vendor in the PIM space called A2i. This code has become the basis for the currently shipping SAP MDM 5.5, and as such, most analysts consider SAP MDM to be more of a PIM than a general MDM product at this time.
    Getting Started with Master Data Management
    /docs/DOC-8746#section2 [original link is broken]
    Master Data Management (SAP MDM)
    http://help.sap.com/saphelp_mdm300/helpdata/EN/2d/ca9b835855804d9446044fd06f4484/frameset.htm
    http://www.sap.com/netherlands/platform/netweaver/components/mdm/index.epx
    http://www11.sap.com/platform/netweaver/components/mdm/index.epx
    Master Data Integration
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/webcontent/uuid/f062dd92-302d-2a10-fe81-ca1be331303c [original link is broken]
    Master Data Operations
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/webcontent/uuid/a0057714-302e-2a10-02ad-b56c95ec9376 [original link is broken]
    Master Data Quality
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/webcontent/uuid/c0a78be0-142e-2a10-f6b9-f67069b835da [original link is broken]
    Oracle MDM
    Oracle MDM is a core “Fusion” technology that will enable Oracle to integrate its diverse portfolio of acquired solutions (PeopleSoft, JD Edwards, Siebel, etc.) and allow them to work together much more effectively. MDM is also a key component of Oracle’s middleware strategy to aggressively target heterogeneous environments – solutions running on different databases and operating systems, and using different data warehouse or analytical solutions. Oracle is a leading provider of ERP and CRM applications, but more broadly it is a provider of core platform solutions that help pull together and harmonise all of a company’s IT assets.
    Master Data Management solutions provide the ability to consolidate and federate master information from disparate systems and lines of business into one central data repository or “hub.” As part of its Master Data Management strategy, Oracle offers solutions that enable customers to synchronise critical information in a single, central location to achieve an accurate, 360-degree view of data, whether from packaged, legacy or custom applications. Oracle offers the following solutions:
    Oracle Product Information Management Data Hub – Enables companies to centralise, manage and synchronise all product information with heterogeneous systems and trading partners.
    Oracle Customer Hub – Centralises, de-duplicates and enriches customer data, continuously synchronising with other data sources across the enterprise, to provide a single, accurate, authoritative source of customer information throughout the enterprise.
    Oracle Financial Consolidation Hub – Allows companies to control the financial consolidation process by integrating and automating data synchronisation, currency translation, inter-company eliminations, acquisitions and disposals.
    Clean, consolidated and accurate master data distributed throughout the enterprise can save companies millions of dollars annually. With Oracle, organisations are equipped to leverage the best data management practices available today to help dramatically increase operating efficiencies, improve customer loyalty and ensure corporate governance.
    MDM solutions provide the ability to consolidate and federate disparate systems and lines of business into one central data repository. Acxiom is the world's largest processor of consumer data and collects and manages more than a billion records a day for its customers, which include nine of the country's top ten credit-card issuers and most major retail banks, insurers and automakers. By integrating Acxiom's consumer data with Oracle's comprehensive MDM solutions, Oracle plans to provide customers with a pre-packaged, content-enriched customer information repository with unprecedented levels of data quality. This hybrid approach to customer data management will help clients conduct better, faster and easier MDM projects and perform more targeted marketing campaigns to cross-sell and up-sell into existing customer accounts.
    Oracle's MDM solutions serve as a customer data hub to unify customer data across multiple business units and functionally disparate systems. Oracle's comprehensive functionality manages customer data over the entire lifecycle: from capturing customer data, to cleansing addresses and spelling, identifying potential duplicates, consolidating duplicates, enhancing customer profiles with external data and distributing the authoritative customer profile to the operational systems.
    Microsoft MDM
    http://www.microsoft.com/sharepoint/mdm/default.mspx
    The Microsoft MDM Roadmap
    http://www.stratature.com/portals/0/MSMDMRoadmap.pdf
    http://msdn2.microsoft.com/en-us/library/bb410798.aspx#mdmhubarch_topic1
    The IBM View of MDM
    http://www.db2mag.com/story/showArticle.jhtml?articleID=167100925
    cheers!
    gyanaraj

  • BPM/MDM Process - Confirmations Back to BPM from ECC

    We are implementing a BPM / central MDM solution. After the new master data is sent from BPM to MDM and syndicated through PI to ECC, the requirement is to send a confirmation back from ECC that the new master data (e.g. a vendor) was created. I don't see such a confirmation in any of the related blogs on the BPM/MDM subject.
    (1)  Is there a way to send a confirmation back to the BPM (file that BPM picks up?, BPM listener-type service?)?
    (2)  Do you recommend using BAM or ALEAUD for the confirmation out of SAP?  Both would work.  With ALEAUD, I could catch errors in PI and handle them.  With BAM on a Vendor.Created event, I would have no extra IDoc statuses to filter away.
    Thanks,
    Keith

    Hello,
    There are two options.
    You can use an outbound IDoc from ECC to PI after the material is successfully created, and PI can update MDM with the status and the ID created.
    1. You can write a service that checks MDM periodically to see whether the status and ID came back from ECC/PI successfully (a polling sketch follows below).
    2. The other way is to make a synchronous web service call, where BPM waits for a PI service to provide the update directly to BPM. You need to model the synchronous call in BPM.
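    Option 1 could be sketched roughly like this; fetchErpStatus() and notifyBpm() are placeholders, not real MDM or BPM APIs:

    ```java
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Sketch of option 1: periodically check MDM for the ERP status/ID written back
    // via PI. fetchErpStatus() and notifyBpm() are placeholders only.
    public class StatusPoller {

        static String fetchErpStatus(String mdmRecordId) {
            return null; // placeholder: read the ERP number/status field from MDM
        }

        static void notifyBpm(String mdmRecordId, String erpNumber) {
            System.out.println(mdmRecordId + " created in ERP as " + erpNumber);
        }

        public static void main(String[] args) {
            String recordId = "12345";
            ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
            scheduler.scheduleAtFixedRate(() -> {
                String erpNumber = fetchErpStatus(recordId);
                if (erpNumber != null) {
                    notifyBpm(recordId, erpNumber);
                    scheduler.shutdown(); // stop polling once the confirmation arrived
                }
            }, 0, 30, TimeUnit.SECONDS);
        }
    }
    ```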
    Regards, Anil

  • SAP MDM Vs IBM MDM

    Hello Experts,
    I am new to MDM and want to know what the difference is between SAP MDM and IBM MDM.
    Also, please provide some good documentation on SAP MDM:
    what it is used for, what the advantages of using it are, and so on.
    Thanks,
    Suma

    Hi,
    SAP MDM: SAP MDM enables master data on customers, partners and products to be consolidated and harmonized across the enterprise, making it available to all staff and business partners. A key component of SAP NetWeaver, SAP MDM ensures data integrity across all IT systems. With SAP NetWeaver MDM, customers can manage master data and supplemental content, such as texts, PDF documents, high-resolution images, or diagrams in a central business information warehouse.
    Advantages of SAP MDM
    SAP Master Data Management (SAP MDM) is a component of SAP's NetWeaver product group and is used as a platform to consolidate, cleanse and synchronise a single version of the truth for master data within a heterogeneous application landscape. It has the ability to distribute internally and externally to SAP and non-SAP applications. SAP MDM is a key enabler of SAP Enterprise Service-Oriented Architecture. Standard system architecture would consist of a single central MDM server connected to client systems through SAP Exchange Infrastructure using XML documents, although connectivity without SAP XI can also be achieved
    There are five standard implementation scenarios:
    Content Consolidation
    Master Data Harmonisation
    Central Master Data Management
    Rich Product Content Management
    Global Data Synchronization
    The IBM View
    In IBM's view, MDM is a set of disciplines, technologies, and solutions used to create and maintain consistent, complete, contextual, and accurate business data for all stakeholders (users, applications, data warehouses, processes, enterprises, trading partners, and so on). It's a holistic framework for managing structured and unstructured data that's aligned with business processes and managed throughout the information life cycle
    To a large extent, SAP’s NetWeaver initiatives are concessions to the reality of heterogeneity. No customer is going to have a completely SAP- or Oracle-based environment. With NetWeaver and Fusion, SAP and Oracle propose to accommodate the heterogeneous data-access requirements of their customers while keeping their bread-and-butter application software at the heart of things.
    IBM’s middleware vision displaces the primacy of the ERP vendors. It’s probably too much to say that customers who embrace Big Blue’s WebSphere data integration middleware can pick and choose from between and among best-of-breed financial, SCM, and MRP (among others) products for their enterprise application needs. If nothing else, however, IBM’s data integration and MDM middleware stack does give customers an alternative to SAP .
    However one of the problems with IBM’s approach to MDM is that, taken in isolation, it would mean that application vendors would need to completely rewrite their applications to take advantage of the MDM layer. This isn’t going to happen, not least because vendors like SAP have their own MDM capabilities.
    Moreover, users who do not want to convert to IBM's new multiform MDM offering will have to negotiate separate maintenance agreements to protect their long-term investments in WPC (WebSphere Product Center, a product information management solution for building a consistent central repository).
    Lastly, users should work with IBM to ensure that the standard IBM three-year maintenance commitment will continue with the current WPC product, independent of any move to a single MDM platform.
    Regards
    Nisha
