Duplicate records In Master Data

Hi,
I don't understand why we get duplicate records in master data even though it has overwrite functionality.
   Any ideas will be appreciated.

Hi,
Solution: if the load to master data fails due to duplicate records:
Go to the Monitor screen --> Details tab --> under Processing, find the duplicate record --> in the context menu of the error record, select 'Manual Update'.
After the above step is done, trigger the attribute change run for that InfoObject.
This should solve your problem.
If there is still a problem in reporting, select the data using the filter option on the master data.
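A quick check you can run first: overwrite only applies across requests, so this failure usually means the same key arrives more than once within a single request. The sketch below lists such keys in the source; the table ZSRC_CUST and the field CUSTOMER are assumptions, not real objects.

    " List source keys that occur more than once, i.e. the records the
    " master data load reports as duplicates (all names are placeholders).
    DATA: BEGIN OF ls_dup,
            customer TYPE c LENGTH 10,   "assumed key field
            cnt      TYPE i,
          END OF ls_dup,
          lt_dup LIKE TABLE OF ls_dup.

    SELECT customer COUNT( * )
      FROM zsrc_cust                     "assumed source table
      INTO TABLE lt_dup
      GROUP BY customer
      HAVING COUNT( * ) > 1.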
Regards,
Vijay.

Similar Messages

  • Duplicate records in master data InfoObject - how to delete them? Please help

    hi all,
    How do I delete duplicate records in a master data InfoObject that has no requests, because it is a direct update?

    Hi,
    Right-click on the InfoObject and select Maintain.
    This opens the master data table; select the record there and delete it.
    Hope this solves your query.
    reward points if useful
    regards,
    ANJI
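    If the records have to be removed programmatically rather than one by one in the maintenance screen, a standard deletion function module is often suggested for this. The call below is only a hedged sketch: the parameter shown is an assumption, so check the signature of RSDMD_DEL_MASTER_DATA in SE37 first, and note that values still referenced by transaction data or SIDs will not be deleted.

        " Sketch only: delete unused master data of characteristic ZCUSTOMER.
        " Verify the function module and its parameters in SE37 before using it.
        CALL FUNCTION 'RSDMD_DEL_MASTER_DATA'
          EXPORTING
            i_chabasnm = 'ZCUSTOMER'     "assumed characteristic name
          EXCEPTIONS
            OTHERS     = 1.
        IF sy-subrc <> 0.
          WRITE: / 'Master data could not be deleted (values may still be in use).'.
        ENDIF.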

  • Duplicate records during master data loading.

    hello guys,
    I am reading a blog where the blogger wrote about 'Various issues in a BW Production project'. I came across one issue which I could not understand:
    Data loading failed due to duplicate records during master data loading.
    Why does this error occur? How can we rectify it in a production environment?
    Thanks and Regards,
    S

    Hi SChandx200,
          May I ask where you found "Various issues in a BW Production project"?
    Many Thanks,

  • Ignore duplicate records for master data attributes

    Dear experts,
    How and where can I enable "ignore duplicate records" when I am running my DTP to load data to master data attributes?

    Hi Raj
    Suppose you are loading master data to an InfoObject and in the PSA you have more than one record per key.
    Let's assume you are loading some attributes of Document Number (0DOC_NUMBER) and in the PSA you have multiple records for the same document number. In the PSA this is not a problem, as the key of the PSA table is a technical key which is different for every record. But it is a problem for the InfoObject attribute table, as more than one record for the primary key (here 0DOC_NUMBER) would violate the primary key constraint in the database.
    This issue can easily be avoided by selecting "Handle Duplicate Records" in the DTP. You will find this option on the Update tab of the DTP.
    Regards
    Anindya
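    If you prefer to drop the duplicates yourself, instead of or in addition to the DTP flag, you can also do it in the transformation's end routine. A minimal sketch, assuming 0DOC_NUMBER is the only key and that the corresponding fields in RESULT_PACKAGE are called DOC_NUMBER and RECORD (both assumptions):

        " Keep exactly one record per document number in this package; the sort
        " order decides which duplicate survives (here: the last by record number).
        SORT RESULT_PACKAGE BY doc_number ASCENDING record DESCENDING.
        DELETE ADJACENT DUPLICATES FROM RESULT_PACKAGE COMPARING doc_number.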

  • Duplicate records in master data info object

    Dear friends,
    I have a standard InfoObject 'A' with attributes B, C, D, E. A standard datasource with the fields A, B, C, D, E exists, and I loaded data from it. The ABAPers created a Z table with the fields P, Q, R, S, X, Y, Z, where 'P' holds the same values as InfoObject 'A'. My requirement is to create a report on the following fields:
    P, Q, R, S, B, C, D, E
    What I did: I created a generic datasource for the fields P, Q, R, S and added the fields Q, R, S as attributes to the standard InfoObject 'A'.
    I created and scheduled an InfoPackage from the standard datasource (A, B, C, D, E) to the standard InfoObject (A, B, C, D, E, Q, R, S). Next I created and scheduled another InfoPackage from the generic datasource (P, Q, R, S) to the standard InfoObject (A, B, C, D, E, P, Q, R, S) with transfer rules P->A, Q->Q, R->R, S->S. After loading the data I am getting duplicate records. This is how my master data looks:
    A B C D E P Q R S
    1 2 3 4 5 - - - -
    2 3 4 5 6 - - - -
    3 4 5 6 7 - - - -
    1 - - - - 6 7 8 9
    2 - - - - 7 8 9 3
    3 - - - - 4 6 2 1
    But I need it in the following format:
    A B C D E P Q R S
    1 2 3 4 5 6 7 8 9
    2 3 4 5 6 7 8 9 3
    3 4 5 6 7 4 6 2 1
    Please let me know.
    Thanks & regards,
    Hari

    Hari,
       Why don't you enhance the master data InfoObject? You should see the records overwritten, since InfoObject A is the primary key of the table.
    Try enhancing the master data datasource, or create a generic master data datasource, so that one load delivers all the attributes; then you will get the required output.
    All the best.
    Any questions, let us know.
    Nagesh.
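    Another option, if enhancing the datasource is not possible, is to enrich the master data load of 'A' with a lookup on the Z table in the transformation end routine, so that one request delivers all attributes per key and the rows are overwritten instead of split. Everything below (the table name ZPQRS, the /BIC/ field names, the field lengths) is an assumption, a sketch only:

        " Read the custom table once per package and fill Q, R, S from it,
        " keyed on the value of A.
        TYPES: BEGIN OF ty_z,
                 p TYPE c LENGTH 10,
                 q TYPE c LENGTH 10,
                 r TYPE c LENGTH 10,
                 s TYPE c LENGTH 10,
               END OF ty_z.
        DATA: lt_z TYPE STANDARD TABLE OF ty_z,
              ls_z TYPE ty_z.
        FIELD-SYMBOLS <result> LIKE LINE OF RESULT_PACKAGE.

        SELECT p q r s FROM zpqrs INTO TABLE lt_z.        "assumed Z table
        SORT lt_z BY p.

        LOOP AT RESULT_PACKAGE ASSIGNING <result>.
          READ TABLE lt_z INTO ls_z
               WITH KEY p = <result>-/bic/za BINARY SEARCH.
          IF sy-subrc = 0.
            <result>-/bic/zq = ls_z-q.
            <result>-/bic/zr = ls_z-r.
            <result>-/bic/zs = ls_z-s.
          ENDIF.
        ENDLOOP.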

  • Duplicate records for master data

    Hi Friends,
    I have a request which contains duplicate records for master data. When I try to load the data, after it reaches the PSA, BW shows an error saying that there are duplicate records.
    My requirement is that I still need to load the data, overwriting the previous duplicate record.
    How to do this?
    Thanks,
    Raja

    Hi,
    I subsequently need to load the data to the target as well. If I select the options "Only PSA" and "Ignore double data records", the option "Update subsequently in data targets" is disabled.
    Thanks,
    Raja

  • How to check the records in Master Data Table?

    Hi,
    I am trying to load the master data table from a flat file. Now, how do I check the records in the master data table?
    I tried the following:
    InfoProvider -> InfoObject -> right-click -> Display Data or Maintain Master Data
    But it does not show the records. Instead it shows a selection screen with ranges such as CID from ... to ... and CID (SID) from ... to ... (here CID means customer id, the characteristic), plus some settings.
    Please guide me.
    Thanks & Regards

    Hi Sri,
    Go to transaction RSD1, enter your InfoObject name, open the P table of the InfoObject, and choose Execute to see the data that was loaded into the master data InfoObject.
    regards
    sap
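    You can also look at the attribute (P) table directly in SE16, or with a small report. The table name below is an assumption: for a custom characteristic ZCID it would be /BIC/PZCID; the real name is shown in RSD1.

        " Display the first rows of the attribute table of characteristic ZCID.
        " /BIC/PZCID is an assumed table name - check RSD1 for the real one.
        DATA: lt_p TYPE STANDARD TABLE OF /bic/pzcid,
              ls_p LIKE LINE OF lt_p.

        SELECT * FROM /bic/pzcid INTO TABLE lt_p UP TO 100 ROWS.
        LOOP AT lt_p INTO ls_p.
          WRITE: / ls_p-/bic/zcid, ls_p-objvers.   "key value and object version
        ENDLOOP.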

  • FM to create, modify, and delimit records in master data for Organizational Management

    hi,
    There is a function module, HR_INFOTYPE_OPERATION, to create, modify, and delimit infotype records for personnel numbers.
    In the same way, I am looking for a function module with which I can create and modify records in the master data of Organizational Management (OM): I pass the infotype and the data to it, and it creates a new record in the OM master data.
    If anybody knows this, please tell me.
    thanks & regards,
        Sekhar.

    Hi,
    Try the following function modules:
    RH_INSERT_INFTY
    RH_UPDATE_INFTY
    RH_DELETE_INFTY
    Regards,
    Dilek

  • Deleting duplicate records from different data packets in BI data source.

    Hi,
    I am getting the same (duplicate) records from different data packets of a BI datasource after the extraction completes.
    I tried to store the key fields of the first data packet in an internal table, but this internal table no longer holds the previous data when the second data packet is extracted.
    Is there any other way to remove duplicate records after the extraction completes?
    Thanks in advance.

    I have not worked extensively with BI routines, but I reckon there is a routine that gets executed before the data mapping part: a start routine, in which you can check for already existing data before it is passed from the datasource to the cube.
    Hope this helps,
    Regards,
    Murthy.
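    If the de-duplication has to happen in the extractor itself, declare the internal table in the global data of the extractor's function group (TOP include) rather than inside the function module; global data survives the repeated calls, one per data package, within the same extraction run. A rough sketch with assumed names (gt_sent_keys, the output table e_t_data, and the key field matnr):

        " In the function group's TOP include (global data):
        DATA gt_sent_keys TYPE SORTED TABLE OF matnr WITH UNIQUE KEY table_line.

        " Inside the extractor function module, for every data package:
        FIELD-SYMBOLS <ls_data> LIKE LINE OF e_t_data.
        LOOP AT e_t_data ASSIGNING <ls_data>.
          READ TABLE gt_sent_keys
               WITH TABLE KEY table_line = <ls_data>-matnr
               TRANSPORTING NO FIELDS.
          IF sy-subrc = 0.
            DELETE e_t_data.          "key already delivered in an earlier package
          ELSE.
            INSERT <ls_data>-matnr INTO TABLE gt_sent_keys.
          ENDIF.
        ENDLOOP.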

  • Getting duplicate data records for master data

    Hi All,
    When the process chain for the master data runs, I am getting duplicate data records. To handle this, I selected the options at InfoPackage level under Processing: update to PSA and subsequently into data targets, and alternatively the option 'Ignore double data records'. The load still failed with the error message "Duplicate Data Records"; but after I rescheduled the InfoPackage, I did not get the error message the next time.
    Can anyone help to resolve this issue?
    Regards
    KK

    Yes, for the first option you can write a routine. What is your data target? If it is a cube, there is a chance of duplicate records because of its additive nature; if it is an ODS you can avoid this, because only the delta is going to be updated.
    Regarding time-dependent attributes, they are based on the date fields; we have four types of slowly changing dimensions.
    check the following link
    http://help.sap.com/bp_biv135/documentation/Multi-dimensional_modeling_EN.doc
    http://www.intelligententerprise.com/info_centers/data_warehousing/showArticle.jhtml?articleID=59301280&pgno=1
    http://help.sap.com/saphelp_nw04/helpdata/en/dd/f470375fbf307ee10000009b38f8cf/frameset.htm

  • Deleting a record from Master data

    Hi all,
    I need some help with deleting a record from master data. I went to the master data maintenance screen, selected the record to be deleted, and saved. I received the message "Master data record cannot be deleted".
    I then went into transaction SLG1 to check the details of the record and found a message stating "Master data record XXX is being used in the cube /BIC/Dzzzyyy312".
    This record is no longer needed by the end user and was requested to be deleted. Could someone tell me whether there is a way to delete this unused master data record? Your suggestions are appreciated.
    Regards!

    Hi Sumana,
       As long as the value's SID is referenced from a cube dimension, the master data record cannot be deleted; you would first have to delete or archive the cube data and then delete the master data.
    Also check this similar post:
    Master data deletion throws a dump
    Hope it helps
    Srini

  • Insert New Record in Master Data by Code

    Hi guys,
    I need to insert a new value into an InfoObject by code, creating:
    1 new record in table P (time-independent data)
    1 new record in table S (SID table)
    This code could be executed by many tasks in parallel, so it could create concurrency problems when writing and when determining the value of the new SID.
    The first question: is there standard code that inserts a new record and also creates the SID, handling concurrent reading and writing?
    The second (if there is no answer to the first): this is part of my code (draft), any suggestions?
    * insert the new record into the P table
    INSERT INTO /bic/pzck9idfl VALUES st_p_zck9idfl.
    IF sy-subrc = 0.
      flag = 0.
      WHILE flag = 0.
        " determine a new SID (draft approach: highest existing SID + 1)
        SELECT MAX( sid )
          INTO v_sid
          FROM /bic/szck9idfl.
        ADD 1 TO v_sid.
        " record for the SID table
        st_zck9idfl-sid = v_sid.
        st_zck9idfl-/bic/zck9idfl = v_idfl.
        st_zck9idfl-chckfl = 'X'.
        st_zck9idfl-datafl = 'X'.
        st_zck9idfl-incfl  = 'X'.
        " insert the record into the SID table
        INSERT INTO /bic/szck9idfl VALUES st_zck9idfl.
        IF sy-subrc = 0.
          COMMIT WORK AND WAIT.
          " check whether the same SID was taken in parallel by a different value
          SELECT SINGLE sid
            INTO v_sid_check
            FROM /bic/szck9idfl
            WHERE sid = v_sid
              AND /bic/zck9idfl NE v_idfl.
          IF sy-subrc = 0.
            flag = 0.   "conflict found, try again with a new SID
          ELSE.
            flag = 1.   "no conflict, done
          ENDIF.
        ELSE.
          flag = 1.     "insert failed (SID already taken), give up
        ENDIF.
      ENDWHILE.
    ENDIF.
    Thanks and points to helpful answer!
    ciao
    C@f

    Hi Claudio,
    I would not recommend doing this. Please look for a standard function module that does the job, or have a look into the class library for suitable methods. At first glance at your code, here are my comments:
    SELECT MAX( sid )
      INTO v_sid
      FROM /bic/szck9idfl.
    ADD 1 TO v_sid.
    Not a good idea: there is a number range object for getting a SID for each InfoObject. If you assign your SIDs like this, all later standard postings will fail with 'duplicate records'.
    *record for SID table
    st_zck9idfl-sid = v_sid.
    st_zck9idfl-/bic/zck9idfl = v_idfl.
    st_zck9idfl-chckfl = 'X'.
    st_zck9idfl-datafl = 'X'.
    st_zck9idfl-incfl = 'X'.
    If you set all these flags to 'X', you tell the system that this value is used somewhere in master data or in a data target, and you will not be able to delete it with standard methods.
    regards
    Siggi
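    To illustrate Siggi's point about the number range: new numbers should be drawn from a number range object, which serializes concurrent access, rather than computed with MAX( sid ) + 1. The sketch below uses the generic function module NUMBER_GET_NEXT; the object name 'BIM0000123' is only a placeholder (each InfoObject has its own number range object), and writing SIDs directly remains not recommended.

        " Draw the next free number from a number range (placeholder object name).
        DATA lv_number(10) TYPE n.

        CALL FUNCTION 'NUMBER_GET_NEXT'
          EXPORTING
            nr_range_nr             = '01'
            object                  = 'BIM0000123'   "placeholder, not a real object
          IMPORTING
            number                  = lv_number
          EXCEPTIONS
            interval_not_found      = 1
            number_range_not_intern = 2
            object_not_found        = 3
            OTHERS                  = 4.
        IF sy-subrc <> 0.
          " handle the error instead of falling back to MAX( sid ) + 1
        ENDIF.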

  • Duplicate records in exported data

    I'm trying to export the inventory data with a wildcard (%) filter on the
    Workstation Name.
    If I run the same filter in a query from ConsoleOne, I don't see any
    duplicate records.
    If I run the data export, the exported data for some workstations will have
    a duplicate DN. Just the DN is duplicated, all the other fields are either
    empty or have some default value.
    I have also run the manual duplicate removal process and have tried
    deleting the records altogether using the InventoryRemoval service.
    Any other ideas?

    Dlee,
    It appears that in the past few days you have not received a response to your
    posting. That concerns us, and has triggered this automated reply.
    Has your problem been resolved? If not, you might try one of the following options:
    - Do a search of our knowledgebase at http://support.novell.com/search/kb_index.jsp
    - Check all of the other support tools and options available at
    http://support.novell.com.
    - You could also try posting your message again. Make sure it is posted in the
    correct newsgroup. (http://support.novell.com/forums)
    Be sure to read the forum FAQ about what to expect in the way of responses:
    http://support.novell.com/forums/faq_general.html
    If this is a reply to a duplicate posting, please ignore and accept our apologies
    and rest assured we will issue a stern reprimand to our posting bot.
    Good luck!
    Your Novell Product Support Forums Team
    http://support.novell.com/forums/

  • Duplicate Records, Import transaction Data

    Hi Everybody
    I'm using BPC 7.5 NW and I get a warning saying that there are duplicate records when I run the package "Load Transaction Data". The .txt file that I'm using does not have duplicate records. I have the following data in my flat file:
    ACCOUNT           INTCO               AMOUNT
    61012                   I_65                       10
    61012                   I_66                       12
    61012                   I_67                       13
    I'm using a conversion file for INTCO as:
    EXTERNAL               INTERNAL
    I_65                              I_99
    I_66                              I_99
    I_67                              I_99
    When I ran the package, it said that there are duplicate records; the records are:
    ACCOUNT           INTCO               AMOUNT
    61012                   I_99                       10
    61012                   I_99                       12
    My question is: is it not possible to use this package when I use conversion files? If I use the APPEND package it works fine, but why doesn't it work with Import Transaction Data?
    As far as I remember, in the MS version it is possible to do that.
    Thanks in advance.
    Regards

    Hi,
    Originally, you had the following records:
    ACCOUNT INTCO AMOUNT
    61012 I_65 10
    61012 I_66 12
    61012 I_67 13
    However, after applying the conversion file, the records become:
    ACCOUNT INTCO AMOUNT
    61012 I_99 10
    61012 I_99 12
    61012 I_99 13
    So there are 3 records which are duplicates of each other.
    The Import package will not accept the 2nd and 3rd records, because they are duplicates of the 1st record. The Append package, however, will add the 2nd and 3rd records to the 1st one.
    Hope you get the idea.

  • Duplicate records in generic data source

    Hello,
    We have created a generic datasource using a database view joining the two tables MARA and MBEW.
    When we run the view on our DEV server, we get perfectly fine data. But when we run the same view in QA, we get duplicate records.
    Does it have anything to do with the client? In QA we have 2 clients with the same data.
    MARA-MANDT = MBEW-MANDT
    MARA-MATNR = MBEW-MATNR
    These are the join conditions I specified.
    Hope I could explain my issue properly. Please HELP !
    Abhishek

    Please check for multiple records per material in MBEW, as the same material can exist in multiple valuation areas (and valuation types).
    Moreover, the extraction runs in one client only, so it is very unlikely that you are seeing data from the other client.
    In DEV we normally do not have good data to test with, so the design only appears to be correct there.
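    The key of MBEW is MATNR + BWKEY + BWTAR, so a view joined only on MANDT and MATNR returns one row per valuation area/type for the same material. A quick check for a single material (the material number below is just an example value):

        " Show the valuation segments that exist for one material; more than one
        " row here means the MARA/MBEW view also returns more than one row.
        DATA: BEGIN OF ls_mbew,
                matnr TYPE mbew-matnr,
                bwkey TYPE mbew-bwkey,
                bwtar TYPE mbew-bwtar,
              END OF ls_mbew,
              lt_mbew LIKE TABLE OF ls_mbew.

        SELECT matnr bwkey bwtar
          FROM mbew
          INTO TABLE lt_mbew
          WHERE matnr = '000000000000012345'.   "example material number
        " Add BWKEY (and BWTAR) to the view fields or restrict them in the
        " selection to get exactly one record per material.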
