Huge number of Master data errors

Hi All,
I have a situation. In our project, master data records (0CUSTOMER here) containing special characters are edited and loaded manually every day (note: no provision for RSKC). A day back, someone unknowingly triggered a full load, so erroneous records from Day 1 are now sitting in the PSA along with the latest data, and until the data is corrected and reloaded, the new data won't get updated. Can anybody please suggest the quickest way out of this mess, i.e. how to remove the old data that is already there and take only the latest data from the PSA? Otherwise I need to manually correct around 7,000 records!
Please help. Do tell me if you need more clarifications.
Thanks in advance!

Hi
If possible, speak to the business people and ask whether they want those invalid characters. If they say yes, then allow them into BW via RSKC.
Otherwise, just delete the request, go to the PSA, and update the data target immediately.
Regards,
Chama.

Similar Messages

  • Missing authorization check to display master data error..

    Hi,
    We are in SRM7.0.
    After adding content in my shopping cart, I am trying to change the Account Assignment in the Item Details from Network to Cost Center or GL Account.
    Under Item Overview, after changing the Account Assignment to Cost Center, I go to "Assign Number" to add a cost center. On clicking the matchcode search box, I get selection entries like Cost Center, Controlling Area, etc. If I click on the search box without keying anything in, I get a "Missing authorization check to display master data" error.
    I get the same error when I try to select Assets under the Account Assignment area and use the search under Assign Number. I don't get any errors when searching networks.
    Are we missing an authorization check in the backend?
    Regards,
    Rajan.K

    Hello Rajan,
    You could use transaction ST01 in SRM to perform an authorization trace.
    It might give you some pointers to the issue.
    Regards,
    Franz

  • Need to find out the number of master data records transferred to BW

    Hi,
    We need to find out the number of master data records (for example, 0MAT_PLANT_ATTR) to be transferred to the BW side. This is a delta extract. We are preparing test scripts to check the master data extract (full & delta) from ECC6 to BI 7.0.
    Thanks in advance.

    Hi,
    Go to RSA3 and run that master data extractor in D mode if you want to know the number of records in the delta, and in F mode if you want to know the full volume. But make sure that you set the data records/calls and display extr. calls numbers high enough that you get the total number of records.
    The other option is to go to the master data source table and look at the number of records in that table. That will give you an idea of the total number of records in the table.
    One other option is to go to RSA7, select the delta datasource, and hit the display button. You'll get to know the number of entries in the delta queue that way as well.
    Cheers,
    Kedar

  • Vendor number entered in customer master data-error

    When I enter the vendor number in the customer master data, in the Control Data tab under General Data, and save, the system throws the error "enter address number".
    Double click on the error it gives below message :
    Please enter an address number
    Message no. AM016
    Please advise.

    Even if the address for both is the same, it still gives the error, and if the address is different the error is also thrown; both ways it throws the error. I get this error only when I enter a vendor number in the customer master (or a customer number in the vendor master); otherwise it saves without any error.
    Edited by: mysap query on Dec 22, 2008 12:02 PM

  • Performance: reading huge amount of master data in end routine

    In our 7.0 system, a full load runs each day from DSO X to DSO Y, in which master data from six characteristics of DSO X is read into about 15 fields of DSO Y. DSO Y contains about 2 million records, all of which are transferred each day. The master data tables each contain between 2 and 4 million records. Before this load starts, DSO Y is emptied. DSO Y is write-optimized.
    At first, we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups. We redesigned it and now fill all master data attributes in the end routine, after filling internal tables with the master data values corresponding to the data package:
    *   Read 0UCPREMISE into temp table
        SELECT ucpremise ucpremisty ucdele_ind
          FROM /BI0/PUCPREMISE
          INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise
          FOR ALL ENTRIES IN RESULT_PACKAGE
          WHERE ucpremise EQ RESULT_PACKAGE-ucpremise.
    And when we loop over the data package, we write something like:
        LOOP AT RESULT_PACKAGE ASSIGNING <fs_rp>.
          READ TABLE lt_0ucpremise INTO ls_0ucpremise
            WITH KEY ucpremise = <fs_rp>-ucpremise
            BINARY SEARCH.
          IF sy-subrc EQ 0.
            <fs_rp>-ucpremisty = ls_0ucpremise-ucpremisty.
            <fs_rp>-ucdele_ind = ls_0ucpremise-ucdele_ind.
          ENDIF.
    *all other MD reads
    ENDLOOP.
    So the above statement is repeated for all the master data we need to read from. This method is quite a bit faster (1.5 hrs), but we want to make it faster still. We noticed that reading the master data into the internal tables still takes a long time, and this has to be repeated for each data package. We want to change that. We have now tried a similar method, but load all master data into internal tables without filtering on the data package, and we do this only once.
    *   Read 0UCPREMISE into temp table
        SELECT ucpremise ucpremisty ucdele_ind
          FROM /BI0/PUCPREMISE
          INTO CORRESPONDING FIELDS OF TABLE lt_0ucpremise.
    So when the first data package starts, it fills all master data values, about 95% of which we would need anyway. So that the following data packages can use the same tables and don't need to fill them again, we placed the definition of the internal tables in the global part of the end routine. In the global part we also write:
    DATA: lv_data_loaded TYPE C LENGTH 1.
    And in the method we write:
    IF lv_data_loaded IS INITIAL.
      lv_data_loaded = 'X'.
    * load all internal tables
      lv_data_loaded = 'Y'.
    ENDIF.
    WHILE lv_data_loaded NE 'Y'.
      CALL FUNCTION 'ENQUEUE_SLEEP'
        EXPORTING
          seconds = 1.
    ENDWHILE.
    LOOP AT RESULT_PACKAGE
    * assign all data
    ENDLOOP.
    This makes sure that another data package that already started "sleeps" until the first data package is done with filling the internal tables.
    Well, this all seems to work: it now takes 10 minutes to load everything into DSO Y. But I'm wondering if I'm missing anything. The system seems to work fine loading all these records into internal tables, but any improvements or critical remarks are very welcome.

    This is a great question, and you've clearly done a good job of investigating this, but there are some additional things you should look at and perhaps a few things you have missed.
    Zephania Wilder wrote:
    At first, we designed this with the standard "master data reads", but this resulted in load times of 4 hours, because all master data is read with single lookups.
    This is not accurate. After SP14, BW does a prefetch and buffers the master data values used in the lookup. Note [1092539|https://service.sap.com/sap/support/notes/1092539] discusses this in detail. The important thing, and most likely the reason you are seeing individual master data lookups on the DB, is that you must manually maintain the MD_LOOKUP_MAX_BUFFER_SIZE parameter to be larger than the number of lines of master data (from all characteristics used in lookups) that will be read. If you are seeing one select statement per line, then something is going wrong.
    You might want to go back and test with master data lookups using this setting and see how fast it goes. If memory serves, the BW master data lookup uses an approach very similar to your second example (the 1.5 hr one), though I think it first loops through the source package and extracts the list of required master data keys, which is probably faster than your "FOR ALL ENTRIES IN RESULT_PACKAGE" statement if RESULT_PACKAGE contains very many duplicate keys.
    I'm guessing you'll get down to at least the 1.5 hrs that you saw in your second example, but it is possible that it will get down quite a bit further.
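    The collect-keys-then-batch-read idea mentioned here can be sketched outside ABAP. Below is a small, purely illustrative Python sketch (the field names `ucpremise`/`ucpremisty`/`ucdele_ind` are borrowed from the thread; the `fetch_premise_attrs` callable and the toy table are invented): collect the distinct keys of the package first, read their attributes in one batch, then join each row via a hash lookup, where the dict plays the role of the sorted internal table read with BINARY SEARCH.

```python
# Illustrative sketch only: collect distinct lookup keys per data package,
# do one batched master data read, then a hash join per row.
# Field names are borrowed from the thread; the fetch callable is hypothetical.

def enrich_package(result_package, fetch_premise_attrs):
    """result_package: list of dicts, each with key 'ucpremise'.
    fetch_premise_attrs: takes a set of keys and returns
    {key: (ucpremisty, ucdele_ind)} in ONE batch read."""
    # 1. Deduplicate keys first, so a package full of repeated keys
    #    sends far fewer candidate values to the lookup.
    keys = {row["ucpremise"] for row in result_package}
    # 2. One batched read instead of one lookup per row.
    attrs = fetch_premise_attrs(keys)
    # 3. Constant-time hash lookup per row.
    for row in result_package:
        hit = attrs.get(row["ucpremise"])
        if hit is not None:
            row["ucpremisty"], row["ucdele_ind"] = hit

# Toy stand-in for the /BI0/PUCPREMISE master data table.
MD = {"P1": ("TY1", ""), "P2": ("TY2", "X")}
pkg = [{"ucpremise": "P1"}, {"ucpremise": "P1"}, {"ucpremise": "P3"}]
enrich_package(pkg, lambda ks: {k: MD[k] for k in ks if k in MD})
# P1 rows are enriched; P3 has no master record and is left untouched.
```

    The shape is language-independent: deduplicate, batch-read, hash-join.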
    Zephania Wilder wrote:
    This makes sure that another data package that already started, "sleeps" until the first data package is done with filling the internal tables.
    This sleeping approach is not necessary, as only one data package will be running at a time in any given process. I believe that the "global" internal table is not shared between parallel processes, so if your DTP is running with three parallel processes, then this table will just get filled three times. Within a process, all data packages are processed serially, so all you need to do is check whether or not it has already been filled. Or are you doing something additional to export the filled lookup table into a shared memory location?
    Actually, you have your global data defined with the statement "DATA: lv_data_loaded TYPE C LENGTH 1.". I'm not completely sure, but I don't think that this data will persist from one data package to the next. Data defined in the global section using "DATA" is global to the package start, end, and field routines, but I believe it is discarded between packages. I think you need to use "CLASS-DATA: lv_data_loaded TYPE C LENGTH 1." to get the variables to persist between packages. Have you checked in the debugger that you are really only filling the table once per request and not once per package in your current setup? << This is incorrect - see next posting for correction.
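    Since packages within one process run serially, the fill-once idea reduces to a flag-guarded lazy cache. A minimal Python sketch (the class name and loader are invented for illustration; this mirrors the CLASS-DATA idea, not actual BW internals):

```python
# Illustrative sketch: a per-process, fill-once master data cache.
# The class attribute persists across "data packages" the way CLASS-DATA
# persists across package calls; no sleep/wait loop is needed when
# packages in one process are handled serially.

class MasterDataCache:
    _table = None      # class-level: shared by all packages in this process
    fill_count = 0     # counts how many times the expensive load really ran

    @classmethod
    def get(cls, load_all):
        if cls._table is None:          # a simple flag check is sufficient
            cls._table = load_all()     # one full read, reused afterwards
            cls.fill_count += 1
        return cls._table

def load_all_premises():
    # Stand-in for an unfiltered "SELECT ... FROM /BI0/PUCPREMISE".
    return {"P1": ("TY1", ""), "P2": ("TY2", "X")}

# Three "data packages" processed serially in the same process:
for _ in range(3):
    md = MasterDataCache.get(load_all_premises)
# The expensive load ran exactly once.
```

    The trade-off is memory: such a cache grows with the master data tables, so its size has to be watched as they grow.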
    Otherwise the third approach is fine as long as you are comfortable managing your process memory allocations and you know the maximum size that your master data tables can have. On the other hand, if your master data tables grow regularly, then you are eventually going to run out of memory and start seeing dumps.
    Hopefully that helps out a little bit. This was a great question. If I'm off-base with my assumptions above and you can provide more information, I would be really interested in looking at it further.
    Edited by: Ethan Jewett on Feb 13, 2011 1:47 PM

  • Payroll period does not correspond to master data error in INLK run & soln

    Dear All,
    Just wanted to share this issue and the solution, which I found after debugging.
    I am not sure it is 100% correct.
    I am working on Indian payroll and was doing a legacy data transfer with the INLK schema.
    I kept getting the error '2 payroll period does not correspond to master data',
    and this was only for new joiners in that period. What I found out after debugging is,
    for e.g., an employee is hired on 03.06.2011, and the record in T558B is
    90300035 1  2011    1 2011 3 01.06.2011 31.06.2011
    where 01.06.2011 is the for-period begin date.
    Now, in the payroll program this is checked against the joining date in the subroutine check_aper_versus_t558b,
    so it ends up with the above error. If I change 01.06.2011 to 03.06.2011 in T558B, then payroll is successful.
    I don't know whether this is the right approach, but it removed that error. Just wanted to share, so
    It might help the members.

    The difference is the date in field FPBEG of table T558B, which must be set to the correct date for employees who are not in for the full year or period.

  • Materials master data error

    Hello Sir,
    I am getting an error while creating material master data in MM01.
    In Basic Data 1, the Base Unit of Measure field is suppressed, and I am not able to enter it,
    so I cannot move any further.
    Please kindly give me a solution...

    Hi,
    Please check whether any documents or transactions have been created for the material. If any transaction has been done on the material, the base unit of measure cannot be changed; reverse all the transactions posted against the material and try the change again. Or try to provide an alternative unit of measure and its conversion in the UoM tab of the material master.
    Regards,

  • Vendor master data error - manual check issue

    Hi all,
    I got the following message when I create vendor master data, and the same message as an error when I try to post a manual check:
    " Changes for vendor 500011 not yet confirmed
    Diagnosis
    The change to sensitive fields of vendor 500011 have not been confirmed yet.
    Procedure
    You can confirm the change with transaction FK08."
    Could anyone give me the solution, please?
    thanking you

    Dear,
    If the "sensitive field for dual control (vendor)" is defined in the IMG under "Preparations for Creating Vendor Master Data", then confirmation is necessary when you create the vendor master data. The responsible person can confirm the change via FK08.
    With Best Regards,
    Gladys Xing

  • Delta loading master data errors in source system Message RSM340 quality

    Hi,
    I'm trying to load master data with delta for standard data sources.
    In dev, everything seems fine.
    I've transported the same elements to QA. I've done the init, and it is OK. When I do the delta, I get red status: Errors in source system, message no. RSM340. I've tried to set my delta to green, redone my init, and reloaded with delta, and I get the same problem.
    I have this problem for all master data loads.
    Should I install some InfoObject from BI Content?
    Thanks a lot !

    Hi,
    Check following threads
    [3003 + R3299 + RSM340] error messages loading an ODS
    Extraction failed with error Message no. RSM340
    BW Data Loading - Delta Processes
    Thanks and regards
    Kiran

  • Union Number in Master data

    Hi all,
    Can anybody tell me how I can find out where the Union Number is maintained in the EE master data? We are planning to use this field in the Benefits feature (BSTAT) for a custom need. I checked infotype 0057, but I am not sure which field exactly represents the Union Number. Reasons:
    1. No field says 'Union Number'.
    2. The technical info in the feature structure says it is a 4-character field, but all the number fields in IT 0057 allow entries of more than 4 characters.
    Any guidance on finding the exact location of the Union Number in the master data would be a great help, because it would solve our custom need so easily. Thanks in advance.

    Hi,
    Try the contract types in IT 0001. Else you can use Personnel IDs, i.e. infotype 0185.
    manu

  • Master data error - when switching an attribute to navigable

    Hi,
    I have a serious problem with a master data object. Until yesterday this master data did not have a problem.
    Now, when I add an attribute as not navigable (DIS), there is no problem. When I switch it to navigational and activate, the following error occurs:
    Error in activating the InfoObjects
    Characteristic 0FS_CTR_NO: The attribute SID table(s) could not be filled
    Error in BW:
    Error when activating InfoObject 0FS_CTR_NO
    Characteristic 0FS_CTR_NO: Error when generating master data routines
    Error when resetting InfoObject 0FS_CTR_NO to the active version
    I tried to make this modification directly in Quality, without transport, and the error is the same.
    The master data is empty, so I cannot understand why there is a problem with SIDs.
    Any suggestions?
    Thanks,
    Gianluca

    Hi,
    As Raj said, when an InfoObject attribute is changed from a display attribute to a navigational attribute, there will be an impact.
    check this thread.
    Link: [Error when activating 0MATERIAL after adding Navigation attribute]
    Regards
    Daya Sagar

  • "Error reading credit master data" error in Contract Management (CRM 2007)

    Hi All,
    In CRM 2007, whenever we navigate to the contract management screen, the error message "Error reading credit master data" comes up.
    Any suggestion on how to solve this issue.
    Regards,
    Nikhil

    Hi All,
    This is resolved.
    Just set the entry in the table CRMV_CREDIT_GEN to 'Do not display credit master data'.
    Regards
    Ankit

  • Master data error

    Hi,
    What is the solution when master data is sent to a cube
    without checking 'Ignore duplicate data records' at the InfoPackage level?
    Thanks in advance...
    Points will be assigned.

    There is no error while loading the cube.
    But when you are loading the master data object, you will get errors if there are duplicate entries. In that case, check the box.

  • Previous Account Number customer master data

    FI Gurus,
    I want to be able to use field KNB1-ALTKN under company code data in the customer master data. I want to use this field to store the old legacy customer number, but then I want to be able to search on it when doing a customer search in FBL5N. Is there a way to do that?
    Thank You for your help.

    Hi Frank,
    Try this way.
    Go to SE36 -> DDF -> Extras -> Selection views.
    Origin of view -> SAP -> Change.
    You can see Tables/Nodes on the right-hand side -> click page down.
    Place the cursor on KNB1 and click Choose.
    In Table fields/Node fields,
    click page down until you see the ALTKN field.
    Assign functional group 04 to the field and save.
    Note: under Origin of view -> SAP, this view is provided by SAP. Make sure before you change it.
    Best Regards,
    Mohan.

  • Master data error while loading data

    Dear Masters,
    I'm getting the below error while loading a delta update for master data (attributes) through an InfoPackage to the PSA only. I'm on BI 7.0.
    When I checked the error message, I got 2 errors:
    1) Update mode R is not supported by the extraction API
    2) Error in source system
    I checked that the source system is fine; I'm getting data in RSA3 as well.
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Service API .
    Refer to the error message.
    Procedure
    How you remove the error depends on the error message.
    Note
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    Could you please help me to resolve this issue?

    Process chains might fail if the data source is not in sync between the source and the BW system. In such a case, go to the 'Source Systems' in transaction RSA1, find the datasource, right click and select 'Replicate Datasources'. Once this is done, the transfer structure needs to be activated as well. To do this in production, run the report RS_TRANSTRU_ACTIVATE_ALL with the source system name and the infosource name. This will activate the transfer structure.
    NOTE: For master data using direct upload the infosource name is the name of the info object itself.
