Master data inconsistencies

Hello,
How can I check and repair master data inconsistencies? I know that such a program exists; can you tell me what it is?
Regards,
Jorge Diogo

Hi Jorge,
Here it is:
RSDMD_CHECKPRG_ALL
Regards,
Diogo.
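
If the check needs to run regularly, the report can also be scheduled in the background. A minimal sketch, assuming a saved selection variant (the wrapper report name and the variant ZMD_CHECK are placeholders; the selection screen of RSDMD_CHECKPRG_ALL differs between BW releases):

*--- Hypothetical wrapper report for scheduling the standard check ---*
REPORT zrun_md_check.

* The variant ZMD_CHECK (placeholder) must be created beforehand in SE38.
SUBMIT rsdmd_checkprg_all
  USING SELECTION-SET 'ZMD_CHECK'
  AND RETURN.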

Similar Messages

  • Flexible update master data inconsistencies

    Dear experts,
    I have the following issue with flexible update of master data for 0EMPLOYEE.
    0EMPLOYEE is loaded from three different sources. Here is an example:
    Employee   Address   Phone   CompCode
    1          Germany   123
    1          Denmark           001
    1          UK
    Employee is the key of 0EMPLOYEE. What happens to the above data after the attribute change run?
    Can anyone let me know what the final result will be? My requirement is that I must keep the company code under all circumstances. Is there a proper load sequence to follow in this case? Your input is highly appreciated with points.
    Cheers,
    VSN

    If 0EMPLOYEE gets data from three different sources, the existing record is overwritten each time the same employee '1' arrives from Germany, Denmark or the UK; only the most recent record remains in the master data tables.
    You need to add the source system ID (0SOURSYSTEM) as a compounding characteristic to 0EMPLOYEE and then set it as a constant in each transformation (a field-routine sketch follows below):
    constant GE for 0SOURSYSTEM in the transformation from Germany,
    constant DEN for 0SOURSYSTEM in the transformation from Denmark,
    constant UK for 0SOURSYSTEM in the transformation from the UK.
    When you then load the master data you get:
    Source system   Employee   Address   Phone   CompCode
    GE              1          Germany   123
    DEN             1          Denmark           001
    UK              1          UK
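    The constant is normally set with the rule type "Constant" directly in the transformation; if you prefer a routine (for example because more logic will be added later), a minimal field-routine sketch could look as follows. This assumes a BW 7.x transformation; the method frame is generated by the system, and the value 'GE' is just the placeholder used above for the Germany source:

      METHOD compute_0soursystem.
*       Generated field routine for the compounded 0SOURSYSTEM attribute.
*       Assign the constant source system ID for this transformation
*       ('GE' here; use 'DEN' / 'UK' in the other two transformations).
        RESULT = 'GE'.
      ENDMETHOD.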

  • Inconsistencies in Master data

    Hi All.
    I would like to know whether there is a transaction to check for inconsistencies in master data, just like the reconciliation/comparison of transaction data (/SAPAPO/CCR).
    Regards
    Raja kiran

    Raja,
    SAP does not provide a report that will detect all inconsistencies between Product master in APO and the Material Master in ECC.
    Most fields that exist in the ECC material master do not even exist in APO, and many fields are not intended to be kept in sync with APO.
    It is generally not necessary to report the APO/ECC product master differences for 'synchronized' fields on a regular basis. In the rare case that they fall out of sync, it is usually sufficient to run the RIMODINI report in ECC against the material master integration model. Some companies even add RIMODINI steps to their daily IM jobs in ECC (a scheduling sketch follows below).
    In my experience, the major problem with ECC<>APO master data inconsistencies is poorly written enhancements.
    Best Regards,
    DB49
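    If such a step is added to a daily job, it can be scheduled with a small wrapper report; a minimal sketch, assuming a selection variant for the material master integration model has been saved beforehand (the report name ZRUN_RIMODINI and the variant ZMM_IM are placeholders):

      REPORT zrun_rimodini.

*     Re-run the initial transfer for the material master integration
*     model; the variant ZMM_IM (placeholder) holds the model selection.
      SUBMIT rimodini
        USING SELECTION-SET 'ZMM_IM'
        AND RETURN.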

  • Master data load failure. RSRV check resulted in inconsistencies.

    Hi...
    In our production system, master data load to 0EMPLOYEE is failing every alternate day. When I check the same in RSRV, following checks are red:
    1. Time intervals in Q table for a characteristic with time-dep. master data
    2. Compare sizes of P or Q and X or Y tables for characteristic 0EMPLOYEE
    3. SID values in X and Y table: Characteristic '0EMPLOYEE'
    If I repair them, the checks turn green and the load is fine. The next day the load fails again, and RSRV shows the same three errors, so I have to repair them again. Please let me know a permanent solution for this.
    I ran the programs RSDG_IOBJ_REORG and RSDMD_CHECKPRG_ALL, but these fixes are also only temporary. Moving a new version of the object from development to QA and then to production is not preferable right now, as this would involve a high amount of visibility.
    From the logs I can see that the SID tables are corrupted. Is there any permanent solution for this without transports?
    Thanks,
    Srinivas

    Hi,
    Check this link, it will help you: Master data deletion
    Regards
    Ashwin.
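    The second RSRV check (comparing the sizes of the P/Q and X/Y tables) can also be watched with a quick count on the generated master data tables of 0EMPLOYEE. A minimal sketch, assuming the standard table names /BI0/PEMPLOYEE and /BI0/XEMPLOYEE; it is only a rough probe for monitoring between loads, not a replacement for the RSRV check or repair:

      REPORT zcheck_0employee_md.

      DATA: lv_p TYPE i,
            lv_x TYPE i.

*     Count active entries in the attribute table (P) and in the
*     SID attribute table (X) of 0EMPLOYEE.
      SELECT COUNT( * ) FROM /bi0/pemployee INTO lv_p WHERE objvers = 'A'.
      SELECT COUNT( * ) FROM /bi0/xemployee INTO lv_x WHERE objvers = 'A'.

      IF lv_p <> lv_x.
        WRITE: / 'P/X row counts differ:', lv_p, lv_x,
               / 'Check 0EMPLOYEE in RSRV or run RSDMD_CHECKPRG_ALL.'.
      ELSE.
        WRITE: / 'P and X tables have the same number of active rows:', lv_p.
      ENDIF.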

  • Deletion of master data for 0CS_ORDER

    Hello Friends,
    I have a requirement to enhance the InfoObject 0CS_ORDER with order status values (two new statuses were added). I have done the enhancement of the extractor on the source system side. Now, after replication in BW, I need to delete the master data in order to add the two new InfoObjects to 0CS_ORDER.
    When I try to do this, I get the message that some master data could not be deleted.
    This is expected, as we have transaction data loaded into a large number of objects that use 0CS_ORDER.
    In development, I deleted the master data using SE14 for the P, S, X and T tables.
    My concern is that in production I have a huge volume of transaction data. To transport the changes I need to delete the master data, but that would require deleting the transaction data from all InfoProviders where 0CS_ORDER is used.
    If I do this with SE14, transport the changes and later reload the master data for 0CS_ORDER, will this cause inconsistencies?
    What other options are there?
    Please share your opinions.
    Many Thanks,
    VA

    Definitely yes.
    It will break all the SID links that this master data previously had with other objects, such as transactional targets (ODS objects, cubes, etc.). You will not even be able to read data from a cube or ODS where navigational attributes of this characteristic are switched on for reporting.
    So do not use SE14 to delete master data; it is meant for very specific cases only.
    You can delete master data as long as it is not used in other targets, so that you can refill it easily and there will not be any inconsistencies.
    I hope this is clear.

  • Difference between master data and transaction data

    Hi all,
    What is the main difference between master data and transaction data? Please give me some examples.
    Thanks in Advance
    Krish...

    Hi Krish,
    Master data is data that exists in the organization, such as employee details, the material master, the customer master, the vendor master, etc. These records are generally created once.
    Master data are distributed throughout the company; they are often not standardised and often redundant. As a result, it is very costly to offer efficient customer service, keep track of supply chains and make strategic decisions. With SAP Master Data Management (SAP MDM) this important business data from across the company can be brought together, harmonised and made accessible to all staff and business partners. As a key component of SAP NetWeaver, SAP MDM ensures data integrity across all IT systems.
    Regardless of the industry, companies often work with different ERP and legacy systems. The result: the business processes are based on information about customers, partners and products that is represented in different ways in these systems. If the data are recorded manually, there are even more inconsistencies: some data sets are entered several times, others cannot be retrieved by all divisions of the company.
    As corporate applications become increasingly complex and produce ever greater amounts of data, the problem intensifies further. Nevertheless, your employees must work with the inconsistent data and make decisions on that basis. The lack of standardised master data easily leads to wrong decisions, which restricts efficiency and threatens customer satisfaction and profitability.
    In short: in order to save costs and ensure your company's success, it is necessary to consolidate master data about customers, partners and products, make it available to all employees beyond system boundaries, and describe it with attributes that are valid company-wide.
    Transaction data are the business documents that you create using the master data: purchase orders, sales orders, etc.
    http://help.sap.com/saphelp_nw2004s/helpdata/en/9d/193e4045796913e10000000a1550b0/content.htm
    Regards,
    GNK.

  • Impacts in COPA of changing material and customer master data

    Dear experts,
    In my company we are considering following scenario:
    Currently mySAP ERP 6.0 is implemented for all modules for the mother company.
    We have developed a new global template with significant changes versus the existing system, especially in the SD processes. The material and customer masters also change significantly in terms of the content of tables/fields and/or the values in those fields.
    The idea was to build the template from scratch on a new machine and roll it out to all group affiliates, but now we are considering the possibility of evolving the current system and trying to stretch it to the processes defined in the global template.
    The scenario we want to analyze is: keeping the same organizational structure in terms of company code, CO area and operating concern in the existing SAP client, and evolving the existing settings towards the global template processes.
    The doubts we are having are the following:
    Changing material & customer master data: Impact in COPA
    Option 1: Material master data and customer master data codes are maintained but content in the tables/fields is changed substantially, both in terms of logical content of specific fields and/or the values in the specific fields. We have following examples of changes.
    Case 1: source field in material master changes logical content. E.g. Material master field MVGR1 is currently used for product series (design line) and the content changes to be the Market Segment. The product series will be moved to a classification field. At least 5 other fields are affected by this. How can data in terms of COPA line items be converted so that they are aligned at time of reporting?
    Case 2: the source field is not changed so that the logical content of the field remains but the values change, i.e. for the same concept there will be different codifications. How can data in terms of COPA line items be converted so that they are aligned at time of reporting?
    Case 3: Characteristics where currently the source material master field is a Z field and the derivation is via table look up and where the Z field changes to a classification field. How can you convert the existing COPA line items to ensure that attributes are aligned? Should new characteristics be created or just change the derivation logic of the characteristic?
    Option 2: Material master data and customer data codes are re-created (codification of records is changed), meaning that new material and customer codes will exist and content in tables/fields is changed (as in option 1)
    Case: material and customer codes are changed. How can data in terms of COPA line items be converted so that they are aligned at time of reporting?
    I've never faced a similar scenario before, and I fear that keeping the operating concern while changing the source master data and also the SD flows (we have new billing types, item categories, sales document types and order reasons) may lead to inconsistencies and problems in COPA.
    I would like to ask you experts whether you have come across a similar scenario and whether, from your experience, this is feasible or involves too many risks. What can be the impact of this scenario on the existing operating concern for both option 1 and option 2, and what would be the key activities needed to adapt the existing operating concern? What will be the impact of the needed conversions on P&L reporting?
    Sorry for the long story. I hope you can help me out.
    Thanks and Regards,
    Eric

    Hi,
    First, I think you will need to test whether it works for new COPA documents created via billing.
    If it works fine, the remaining question is whether you want to apply the changes to historical data that is already posted.
    Normally there are transactions like KE4S with which you can repost billing documents to COPA; however, this may not be viable for bulk postings.
    You can also perform a realignment (KEND), but this only works at the PA segment level (table CE4XXXX).
    Regards
    Waman

  • What if I load transaction data without loading master data?

    Hello experts,
    What are the consequences if I load transaction data without loading master data? Are there any other factors besides load performance (because of SID generation, etc.) and inconsistencies?
    What kind of potential inconsistencies will occur?
    The problem here is:
    When the transaction load starts, new master data such as employee (x) may already have been created in R/3 that does not yet exist in BW, and hence the transaction load fails.
    Thanks and Regards
    Uma Srinivasa rao

    Hi Rao,
    If you load the master data after loading the transaction data, and there is any lookup of the master data in the update rules, then you can delete and reconstruct the requests in the ODS/cube so that the latest master data is pulled into the data target.
    Make sure you apply the hierarchy/attribute change run before doing the delete and reconstruct.
    Bye
    Dinesh

  • Missing "mapping entry" issue when transferring customer/vendor master data

    Hi All,
    We are doing a version upgrade at one of our clients for SCM APO (SCM 7.0 SPS 8). While transferring customer/vendor master data (either in the foreground or in the background) we get the error "No location XXXXXXXX exists for mapping entry XXXXXXX of category 1011" (1010 in the case of customers).
    Please provide your valuable help to resolve this issue.
    Thanks.

    Dear Ravi,
    Please use the report ZMATMAP_REORG provided in note 859650 to remove the inconsistencies between /SAPAPO/LOCMAP and /SAPAPO/LOC.
    The number of entries in both tables should be the same.
    I am not aware of your SCM release, but the report is also relevant for release 4.1, despite the information in the note. Please copy the report into your system and execute it in test mode first.
    I hope this helps you!
    Let me know the outcome of the issue.
    Thank you!
    Will
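    The statement above, that both tables should contain the same number of entries, can be verified quickly before running the reorganisation report. A minimal sketch (the report name is a placeholder; this is only a rough count, not a substitute for the report from note 859650):

      REPORT zcheck_locmap.

      DATA: lv_loc TYPE i,
            lv_map TYPE i.

*     Compare the number of entries in the APO location table and in
*     the location mapping table; they should be identical.
      SELECT COUNT( * ) FROM /sapapo/loc    INTO lv_loc.
      SELECT COUNT( * ) FROM /sapapo/locmap INTO lv_map.

      WRITE: / '/SAPAPO/LOC:   ', lv_loc,
             / '/SAPAPO/LOCMAP:', lv_map.
      IF lv_loc <> lv_map.
        WRITE: / 'Counts differ - run the reorg report from note 859650 (test mode first).'.
      ENDIF.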

  • Transport PC and CC master data

    Hi,
    Can we transport profit center and cost center master data? I didn't see that option in the menu when I created one.
    Also, I tried transporting the standard hierarchy, which is allowed, and there I checked the option to transport the profit center master data along with it. My questions are:
    1. Is the complete list transported, or only the profit centers under the highlighted node?
    2. Are they copied into the other clients in the active or the inactive state?
    3. Does the transport carry only the changes or the complete hierarchy?
    Please advise!
    Thanks!
    N.

    1. If the hierarchy was already created and transported and I now make some more changes to it, will the transport carry the whole hierarchy or only the changes?
    The entire hierarchy.
    2. Although there is a checkbox only for the standard hierarchy, does that mean both the standard and the alternative hierarchies will be transported?
    I think just the standard hierarchy.
    3. There seems to be no way to select specific cost/profit centers in these transactions. Does that mean all the profit and cost centers in the client will be transported?
    Yes.
    4. I am assuming they will be transported in the active state if they are activated in the originating client. Is that correct?
    Yes.
    Using the above transaction, it is always suggested to transport the hierarchy, the profit centers and the groups together, as the groups are just sets and, if not transported, could cause inconsistencies in the target system.
    Here is SAP's recommendation on transporting profit center master data.
    Recommendation
    1. Always transport the following objects together to avoid inconsistencies in the target system: the standard hierarchy and the profit center groups represent logical sets of profit centers, and transporting these sets separately could lead to inconsistencies if the corresponding profit centers (master data) do not exist or are incorrect in the target system.
    2. If you have transported profit centers, rebuild the matchcode data in the target system so that the transported master data can be displayed using the "Possible entries" function.

  • Is it possible to change a master data key ?

    Hi Experts,
    I'd like to know whether it is possible in any way to change/update the key of a master data table.
    Reason: all of our projects follow a certain nomenclature. I have now found one project that does not follow that nomenclature, and I would like to change it accordingly (0PROJECT -> /BI0/PPROJECT).
    The connection to the cubes is through SIDs, so I cannot simply remove the current record in the master data table and create a new (correct) one.
    I also cannot reload the cubes after I have changed the master data table.
    For me, the simplest option would be to just update the key portion in /BI0/PPROJECT, and I think modern database systems would allow this (we are running BI on top of DB2), but maybe I'm wrong.
    I tried the update in the maintenance dialog in RSA1. Oddly enough, it lets me change the key and even the subsequent save seems to work. However, when I open the table again afterwards, the old (wrong) values are displayed again.
    Does someone know how this could work?

    Hi,
    Are you trying to change the value of a master data key for a particular record in the P table?
    If that is your requirement, then the answer is no; changing it manually does not help you.
    I suggest that you load the same record from your source system with the new master data key value and all other attributes filled with the same data as the old record.
    You will then have two records in your P table (one with the old value and one with the new one).
    There is no harm in keeping the old record in your P table; deleting master data is risky and not advisable (although possible).
    You then need to use the new master data key value in your transaction loads, which will in turn connect to the master data via the SIDs.
    By using the new value in your transaction loads, the old one will no longer be visible in any of your reports; it will just sit unused in the P table, which is still fine for you.
    However, to load this new value (in both master and transaction data) you need to get in touch with the functional team.
    Modifying data manually in BI/BW (unless there is invalid data between the source and the PSA) will lead to inconsistencies and reconciliation issues, and is not advisable.
    Regards,
    Sudheer
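    To see why a direct key change in the P table does not work, the SID table can be inspected: transaction data stores only the SID, and that SID remains mapped to the old key. A small sketch, assuming the standard generated objects of 0PROJECT (table /BI0/SPROJECT, data element /BI0/OIPROJECT) and a purely hypothetical project number:

      REPORT zshow_project_sid.

      DATA: lv_project TYPE /bi0/oiproject,
            ls_sid     TYPE /bi0/sproject.

*     Hypothetical old project key - replace with a real value.
      lv_project = 'OLD_PROJECT_KEY'.

*     The S table links the characteristic key to the SID that the cubes
*     reference; changing only /BI0/PPROJECT leaves this link (and hence
*     all transaction data) still pointing to the old key.
      SELECT SINGLE * FROM /bi0/sproject
        INTO ls_sid
        WHERE project = lv_project.

      IF sy-subrc = 0.
        WRITE: / 'Key:', ls_sid-project, 'SID:', ls_sid-sid.
      ENDIF.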

  • Integration connections to bring master data into APO and BW system.

    Hi All,
    Can somebody give me the step-by-step activities for creating the integration connections for the following requirement?
    1) To bring master data from the R/3 system into the APO system through BI cubes, using a BI-type connection.
    2) An APO/R3-type connection from the R/3 system to the APO system.
    Regards

    Nikhil,
    Request you to let me know how a Basis person can help with this.
      If this is an existing system, the applications team (APO and BW) will not require much assistance from you.  You can mostly sit back and watch them do all the work.
    I suggest that you read, in its entirety, the help link provided by expert Datta; much of the information there is helpful for both solutions.
    For the APO interface (question 2 in your original post) there is a complete step-by-step guide provided by SAP.  Check these two docs out.
    http://help.sap.com/bp_scmv250/BBLibrary/Documentation/B02_BB_ConfigGuide_EN_DE.doc
    http://help.sap.com/bp_scmv250/BBLibrary/Documentation/B05_BB_ConfigGuide_EN_DE.doc
    If you are being asked to select from one of the two scenarios you outlined, the standard Core interface is by far the simpler to implement.  Plus, it has the advantage that SAP will support you when something goes wrong.
    The BW solution is different, in that the developer (you?) MUST have detailed specifications before he can proceed.  Which master data?  At what organizational levels?  How often refreshed?  What will be the ultimate use of the data, and how must it be stored?  Where will it be stored? How to deal with inconsistencies?  Etc etc. SAP support will only solve technical problems in your solution, they won't solve errors of design or logic, unless you happen to get a very accommodating support person when you raise your message.
    But in case a Basis person needs to create the RFC connections, how can it be done for the above scenario?
    The instructions for creating all the RFC objects are contained in the connectivity documents cited above.
    Best Regards,
    DB49

  • Profit Center Master Data - Segment needs to be changed

    Hello Gurus,
    We have somehow set the wrong segment in the profit center master data and need to change it. Transactions have already been posted to the profit center; what is the easiest way to change the segment in the profit center master data?
    Any help will be appreciated. I have gone through the SAP Note
    940721 - KE52: Segment in prof center master record cannot be changed
    Or is there a workaround solution?
    Thanks
    Prasad

    Hi Prasad,
    As you have already found out, once there is transaction data on a profit center, the segment field is no longer changeable, in order to prevent inconsistencies and incorrect results from the segment-reporting point of view. This is standard system behaviour.
    You can try the following notes; they provide further corrections relevant to your issue.
    Notes: 1373616 - Segment in profit center changed despite transaction data
           1037986 - KE52: Changing the segment in the profit center master data
    Regards
    Eugene

  • Data clean for master data

    We have a lot of obsolete material masters; some are used in various kinds of orders and some were never used (they were just created by mistake).
    The functional consultant has set the deletion flag for those master records to prevent them from being used again.
    The problem is that when end users create orders, they can still select those master records, and only when they process the order are they told that the master data is not usable. Our end users ask us to make those master records disappear from the selection.
    Master data archiving depends heavily on application data archiving, and we do not want to archive more than ten objects just to delete one kind of master data. Is there a simple way to make the master data disappear when end users search for it?

    No, there is no simple way.
    If they were never used, then you can archive them right away.
    If they are used in documents, then that data has to be archived first; that is just the way it is.
    You cannot expect to get the goods from eBay before you have won the bid. Everything has its logical order; simply deleting data leaves inconsistencies in your system, and I guess that is the bigger issue.
    I would actually ask why your users pick materials that are marked for deletion. Did you create duplicates, so that they do not know which one to use? Did you flag the wrong materials, materials that are still in stock and still have demand from the market?
    The match codes can be amended (a new match code, or a user exit in the search helps) to hide materials that are marked for deletion (a small listing sketch follows below).
    Set the material and sales statuses to restrict the usage; see my blog: Material master deletion flag and blocking activities
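    Before amending the match codes, it can help to list which materials actually carry the client-level deletion flag; a minimal sketch reading the standard field MARA-LVORM (the report name is a placeholder):

      REPORT zlist_flagged_materials.

      DATA: lt_matnr TYPE STANDARD TABLE OF matnr,
            lv_matnr TYPE matnr.

*     Materials flagged for deletion at client level (MARA-LVORM = 'X').
      SELECT matnr FROM mara
        INTO TABLE lt_matnr
        WHERE lvorm = 'X'.

      LOOP AT lt_matnr INTO lv_matnr.
        WRITE: / lv_matnr.
      ENDLOOP.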

  • Delta for generic master data extractor

    Hello,
    Is it not possible to load deltas for a generic master data extractor?
    I have created a generic extractor on R/3 table MEAN (EAN number assigned to material). I have also performed the init run which worked fine, but no deltas are being loaded.
    Why is this?
    Best regards,
    Fredrik

    Hi,
    It is possible. How to set up a generic delta:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33
    On the safety delta, check these links:
    OSS note 368739
    Symptom
    For the setup of summarization levels and summarization data in Profitability Analysis, a 'safety delta' of half an hour is used. This means the system only includes records that have already existed for half an hour. You want to reduce this delta in order to get more current data in reporting.
    Additional key words
    Summarization levels, summarization data, safety delta, time stamp
    Cause and prerequisites
    The reason for the length of these safety deltas is possible differences between the clocks on different application servers. If the delta is chosen too short, records may not be taken into account when updating the summarization levels/data. In particular, account-based Profitability Analysis depends on a sufficient length of the delta, so that update processes that take longer do not cause inconsistencies. For costing-based Profitability Analysis a lock is set; if active update processes still exist, it fails and the update terminates.
    Solution
    The attached source code correction is not part of the standard system. With this change the safety delta is reduced from 30 to 5 minutes. You should only implement this if exclusively costing-based Profitability Analysis is active in your system. If you also use account-based Profitability Analysis, we do not recommend the change for the above-mentioned reasons.
    Generic delta safety intervals:
    OSS note 392876 - safety interval
    0FI_GL_4 safety delta
    Generic delta for a table
    Also check this thread, which already discussed this topic: Generic Extractor - Delta
    Shreya
