Help! Duplicate records found when loading customer attributes

Hi all,
We have a full-update master data (attributes) job that failed with the error messages below (note: the processing option is PSA and then data targets):
1 duplicate record found. 66 recordings used in table /BIC/PZMKE_CUST RSDMD 199     
1 duplicate record found. 0 recordings used in table /BIC/PZTSA_CUST RSDMD 199     
1 duplicate record found. 0 recordings used in table /BIC/PZTSA_CUST RSDMD 199     
1 duplicate record found. 0 recordings used in table /BIC/XZTSA_CUST RSDMD 199     
1 duplicate record found. 66 recordings used in table /BIC/XZMKE_CUST RSDMD 199     
Our DataSource is 0CUSTOMER_ATTR. I tried transaction RSO2 to get more information about this DataSource so I could find where the original data lives in R/3, but when I executed it I got this message:
DataSource 0CUSTOMER_ATTR  is extracted using functional module MDEX_CUSTOMER_MD
Can you please help? What should I do to correct and reload, or to find the duplicate data so I can ask the person on the R/3 side to delete the duplicates?
Thanks
Bilal

Hi Bilal,
Could you try the following, please:
In the Processing tab, select Only PSA and Update Subsequently into Data Targets.
In the Update tab, use Error Handling and mark "Valid Records Update, Reporting Possible (Request Green)".
If my assumptions are right, this should collect the duplicate records in a separate request, from which you can get the data that needs to be corrected.
To resolve the load itself, just load using only PSA and Update Subsequently into Data Targets with Ignore Duplicate Data Records checked.
Cheers,
Praveen.
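What Praveen describes — valid records go through, duplicates get collected where you can inspect them — can be sketched in plain Python. This is an illustrative sketch only (the record layout and field names are invented), not how BW implements error handling internally:

```python
# Split a request into clean records and duplicates so the duplicate
# keys can be inspected and reported back to the source system.
# Illustrative only: the record layout and field names are invented.

def split_duplicates(records, key_fields):
    """First record per key is kept; repeats go to the error list."""
    seen = set()
    clean, duplicates = [], []
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        if key in seen:
            duplicates.append(rec)  # would land in the "error request"
        else:
            seen.add(key)
            clean.append(rec)
    return clean, duplicates

request = [
    {"customer": "C100", "group": "01"},
    {"customer": "C200", "group": "02"},
    {"customer": "C100", "group": "03"},  # duplicate key
]

clean, dupes = split_duplicates(request, key_fields=["customer"])
duplicate_keys = sorted({r["customer"] for r in dupes})
# duplicate_keys == ["C100"]: the values to report to the R/3 team.
```

Once the offending key values are known, the R/3 side can check why the extractor delivers the same key more than once.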

Similar Messages

  • Duplicate records found while loading master data(very urgent)

    Hi all,
    One InfoPackage in the process chain failed while loading master data (full update). It shows the following error: duplicate record found, 1 record used in /BI0/PTCTQUERY, and the same record occurred in the /BI0/PTCTQUERY table.
    Can anyone give me the solution? It's very urgent.
    Thanks & Regards,
    Manjula

    Hi
    You can see the check box on the Processing tab page. Tick the Ignore Duplicate Data Records indicator. When multiple data records with the same key are transferred, the last data record in the request is updated to BI; any other data records in the request with the same key are ignored.
    The help says:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update means the data is first updated into the PSA and then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records, or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope this clears your doubt; otherwise let me know.
    Regards
    Kiran
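The behaviour Kiran quotes — the last record in the request wins, and earlier records with the same key are ignored — can be sketched as follows. Purely illustrative Python; the field names are made up and this is not BW's actual code:

```python
# Sketch of the "Ignore Duplicate Data Records" semantics: when
# several records in a request share the same key, the last one in
# the request is kept and the rest are ignored. Illustrative only.

def dedupe_last_wins(records, key_fields):
    """Keep one record per key: the last one seen in the request."""
    by_key = {}
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        by_key[key] = rec  # a later record with the same key overwrites
    return list(by_key.values())

request = [
    {"customer": "C100", "country": "DE"},
    {"customer": "C200", "country": "FR"},
    {"customer": "C100", "country": "AT"},  # same key as the first record
]

deduped = dedupe_last_wins(request, key_fields=["customer"])
# Only two records survive; for C100 the later one (country AT) is kept.
```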

  • Master Delta Load - Error "38 Duplicate records found"

    Hi Guys,
    In one of our delta uploads for master data we are getting the error "38 Duplicate records found". I checked the PSA data, but there were no duplicate records there.
    Once the data upload fails, I manually update the failed packet and it goes through fine.
    Does anyone have a solution for this?

    Hi
    You can see the check box on the Processing tab page. Tick the Ignore Duplicate Data Records indicator. When multiple data records with the same key are transferred, the last data record in the request is updated to BI; any other data records in the request with the same key are ignored.
    The help says:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update means the data is first updated into the PSA and then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records, or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope this clears your doubt; otherwise let me know.
    You can see the same in my previous thread:
    Re: duplicate records found while loading master data(very urgent)
    Regards
    Kiran

  • Duplicate Record Found

    Hi Friends,
    We are getting the error "Duplicate Record Found" while loading master data.
    When we check the PSA, there are no duplicates there.
    When we deleted the request and reloaded from the PSA, it completed successfully.
    We face this error daily, and after deleting and reloading from the PSA it finishes successfully.
    What could be the reason? Is there any solution for this?
    Regards
    SSS

    Hi,
    In the InfoPackage maintenance screen, under the update options, select the check box Ignore Double Data Records. This may solve your problem.
    Hope this helps.
    Assigning points is the way of saying Thanks in SDN
    Regards
    Ramakrishna Kamurthy

  • 36 duplicate record  found. -- error while loading master data

    Hello BW Experts,
    Error while loading master data
    (green light) Update PSA (50,000 records posted): no errors
    (green light) Transfer rules (50,000 → 50,000 records): no errors
    (green light) Update rules (50,000 → 50,000 records): no errors
    (green light) Update (0 new / 50,000 changed): no errors
    (red light) Processing end: errors occurred
         Processing 2 finished
         36 duplicate record found. 15718 recordings used in table /BIC/PZMATNUM
         36 duplicate record found. 15718 recordings used in table /BIC/XZMATNUM
    This error repeats with all the data packets.
    What could be the reason for the error, and how can we correct it?
    Any suggestions appreciated.
    Thanks,
    BWer

    BWer,
    We have exactly the same issue when loading the infoobject 0PM_ORDER, the datasource is 0PM_ORDER_ATTR.
    The workaround we have been using is a "manual push" from the Details tab of the monitor. Oddly, we don't have this issue in our test and dev systems, and even in production it doesn't occur every day. We have all the necessary settings enabled in the InfoPackage.
    Did you figure out a solution for your issue? If so, please let me know, and I will do likewise.
    -Venkat

  • Job Fail:9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL

    Dear All
    A job loads from a BW ODS to a BW master data InfoObject and ODS. It always fails with the same message: 9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL (Z_DEAL is the InfoObject).
    When I rerun it, the job finishes with no errors.
    Please help me solve this problem.
    thanks
    Phet

    Hi,
    What is the InfoObject name?
    Regards,
    Goutam

  • BI 7.0 - Duplicate Record Error while loading master data

    I am working on BI 7.0 and trying to load master data to an InfoObject.
    I created an InfoPackage and loaded into the PSA.
    I created a transformation and a DTP, and after I execute the DTP I get an error about duplicate records.
    I have read all the previous threads about duplicate record errors while loading master data, and most of them suggest checking the "Ignore duplicate records" option in the InfoPackage. But in 7.0 I can only load to the PSA with an InfoPackage, and there is no option there to ignore duplicate records.
    My data loads into the PSA fine; I get this error while loading to the InfoObject using the DTP.
    I would appreciate your help in resolving this issue.
    Regards,
    Ram.

    Hi,
    Refer:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/45/2a8430131e03c3e10000000a1553f6/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    With rgds,
    Anil Kumar Sharma .P

  • Intellisync error "Duplicate records found"

    I am trying to sync my BB 8300 with Outlook 2007.  I am using Blackberry Desk Manager 4.2.2.14.
    I was able to sync before, but now I get the error "Duplicate records found in the device" and it won't sync.
    Any ideas to fix this problem?
    I tried uninstalling and re-installing DM.
    I also tried deleting the Intellisync folder.
    Thanks! 

    Hi JJKD;
    What is your email application?
    Backup your device.
    Review the BB KB Here...
    There are a couple of approaches to this. If you don't need a two-way sync, you can clear the calendar on the BB handheld by going to BACKUP / ADVANCED and clearing the calendar on the right side of the Advanced GUI. Then just try a regular sync.
    If you do need a two-way sync, back up the device and clear the calendar, then restore just the calendar in BACKUP / ADVANCED by opening the menu on the backup you created and restoring only the calendar. Backing up, clearing, and then restoring clears any issues with the calendar DB and re-indexes the data. Then try a regular sync and see if the symptom is resolved.
    Good luck, and let us know how you're doing.
    Best Regards
    hmeister

  • Calendar and Adressbook error: Duplicate records found for GUID

    Hi all,
    I have a Mountain Lion Server running on a Mac mini and everything was working well.
    This morning one of my users was unable to connect to his calendar and address book. I found this error in the log files:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate records found for GUID ****USER_GUID****:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate: ***USER_Shortname***
    Apparently there is a duplicate match in the database. How can I fix this issue?
    In the Server app this user is listed only once.
    Mail and other services for this user are working correctly.
    Thanks for any advice!

    Hi Samuel,
    You may try:
    select code, count(code)
    from [dbo].[@XTSD_XA_CMD]
    group by code
    having count(code) > 1
    What is the result?
    Thanks,
    Gordon

  • Duplicate record found short dump, if routed through PSA

    Hi Experts,
    I am getting these errors extracting the master data for 0ACTIVITY
    1 duplicate record found. 1991 recordings used in table /BI0/XACTIVITY
    1 duplicate record found. 1991 recordings used in table /BI0/PACTIVITY
    2 duplicate record found. 2100 recordings used in table /BI0/XACTIVITY
    2 duplicate record found. 2100 recordings used in table /BI0/PACTIVITY.
    If I extract via the PSA with the option to ignore duplicate records, I get a short dump:
    ShrtText: An SQL error occurred when accessing a table.
    How to correct the error
    Database error text: "ORA-14400: inserted partition key does not map to any partition"
    What is causing the errors in my extraction?
    thanks
    D Bret

    Go to RSRV, then All Elementary Tests --> PSA Tables --> Consistency Between PSA Partitions and SAP Administration Information, and correct the error.

  • Duplicate Records Found?

    Hi Experts,
    Is there a permanent solution for "Duplicate Records Found"?
    InfoPackage > Only PSA, Update Subsequently into Data Targets, Ignore Duplicate Records.
    Can you explain clearly how to close this issue permanently?
    Note : Points will be assinged.
    With Regards,
    Kiran


  • Duplicate records in delta load? Please help, will assign points

    Hi all,
    I am extracting payroll data with DataSource 0HR_PY_1 for the 0PY_C02 cube.
    I ran a full load with selection criteria 01.2007 to 02.2007 in the InfoPackage and extracted 20,000 records.
    Then I ran an init of delta without data transfer, which extracted 0 records, as expected.
    Then I ran a delta with selection criteria 02.2007 to 01.2010 in the InfoPackage and extracted 4,500 records, in which the February records were extracted again.
    What could be the reason for duplicate records in the delta load?
    I have seen the same records in the full load with selection criteria 01.2007 to 02.2007 as well as in selection criteria 02.2007 to 01.2010. How is that possible?
    Actually, the DataSource 0HR_PY_1 does not support delta. Apart from this, what other reasons are there for duplicate records? Please help!
    Will assign points.

    Your selection criteria,
    01.2007 to 02.2007 and 02.2007 to 01.2010,
    both include the month 02.2007.
    All the duplicate records probably fall under 02.2007.
    Have you checked that?
    Regards,
    Naveen Natarajan
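Naveen's point can be made concrete with a small sketch: two inclusive month ranges that share the boundary month 02.2007 both select that month, so the records loaded by the full load are extracted again by the second load. Illustrative Python only; the actual extractor works differently:

```python
# Two inclusive month ranges that share the boundary month 02.2007
# both select that month, so its records are extracted twice.
# Hypothetical ranges matching the selections in the post.

def months_in_range(start, end):
    """All (year, month) pairs from start to end, inclusive."""
    y, m = start
    out = []
    while (y, m) <= end:
        out.append((y, m))
        m += 1
        if m > 12:
            y, m = y + 1, 1
    return out

full_load = set(months_in_range((2007, 1), (2007, 2)))    # 01.2007-02.2007
delta_load = set(months_in_range((2007, 2), (2010, 1)))   # 02.2007-01.2010

overlap = full_load & delta_load
# overlap == {(2007, 2)}: February 2007 falls in both selections.
```

Making the second selection start at 03.2007 would avoid re-extracting the shared month.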

  • Help needed in Inbox search for Custom attribute

    Hi,
    We have a requirement where we have a custom attribute on the service request to store the ECC order number.
    We have enhanced the inbox search to retrieve all service requests having the ECC order number.
    Here we are encountering a problem: I just created a new CRM service request and entered order number 1234, and now when I search for it in the inbox search with order number 1234 as the criterion, I get "no results found". But when I extend the max list to 2000, I see the service request in the result list. I am not sure about the algorithm designed for the inbox search.
    Any pointers on how to resolve this issue would be of great help.
    Thanks,
    Udaya

    Hi,
    I do not have time to research this completely, but I had a short look into the class you posted.
    In GET_DYNAMIC_QUERY_RESULT there is a call to CL_CRM_QCOD_HELPER->PREPROCESS( ).
    A little lower there are blocks marked by comments for the individual searches handled by this class. I had a look into the campaign_search() method: if you scroll down a bit (around line 123), it sets all search parameters to SIGN = 'I', OPTION = 'EQ'. This is done several times below as well.
    Set a breakpoint in the preprocess() method and check which of the blocks is called and how it handles your search criteria.
    Hope it helps.
    cheers Carsten
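For readers unfamiliar with select-options: SIGN = 'I' with OPTION = 'EQ' means an inclusive, exact-equality filter, so the stored value must match the search input exactly. A rough Python sketch of that matching logic (illustrative only, not the actual CRM code; only EQ and CP are sketched):

```python
# Rough sketch of ABAP select-option matching for two options:
# 'EQ' (exact equality) and 'CP' (pattern match, where ABAP's
# * and + correspond to fnmatch's * and ?). SIGN = 'I' includes
# matches, 'E' excludes them. Illustrative only.

import fnmatch

def matches_selopt(value, selopts):
    """True if value passes a list of (sign, option, low) entries."""
    # With no 'I' entries everything is included by default.
    included = not any(sign == "I" for sign, _, _ in selopts)
    for sign, option, low in selopts:
        if option == "EQ":
            hit = value == low
        elif option == "CP":
            hit = fnmatch.fnmatch(value, low.replace("+", "?"))
        else:
            raise ValueError("option not sketched: " + option)
        if hit:
            if sign == "E":
                return False  # an exclusion match always rejects
            included = True
    return included
```

With EQ, "1234" matches only "1234" exactly; a value stored as "01234" or with trailing spaces would not be found, which is one thing worth checking in the debugger.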

  • Duplicate Records in Transactional Load

    Dear All,
    I have an issue where data is loaded from one write-optimized DSO to another write-optimized DSO, and the DTP fails because of duplicate records. It is a transactional load.
    I would be grateful if you could help me understand how I can handle this situation and reload the DTP.
    I have searched the forum, but I can only find threads about master data loading, where one can select Handling Duplicate Records.
    Thanks in Advance...
    Regards,
    Syed

    Hi Ravi,
    Thanks for your reply.
    If we uncheck the option, it would take the duplicate records, right?
    In my scenario, data comes from one write-optimized DSO to another. In the first DSO data uniqueness is not checked, while in the second it is, so I am getting the duplicate error message.
    I see around 28 records in the error stack. Please let me know how I can process these error (duplicate) records as well.
    Many Thanks...
    Regards,
    Syed

  • Error:1 duplicate record found. 0 recordings used in table /BI0/XBPARTNER

    Hi everyone,
    I get a duplicate data records error when trying to load master data to an InfoObject.
    I've tried loading to the PSA and then to the InfoObject. The PSA load doesn't show any errors, but the load to the InfoObject fails again.
    I've also tried loading to the PSA with the options "Ignore double data records" and "Update subsequently in data targets". I still get an error.
    Any suggestions?
    Thanks in advance,
    Maarja

    Take a look at the links below:
    http://help.sap.com/saphelp_nw70/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    http://help.sap.com/saphelp_nw70/helpdata/en/05/bb223905b61b0ae10000000a11402f/frameset.htm
    (Use option 1 C under Activities)
