Duplicate Records Found?

Hi Experts,
   Is there a permanent solution for the "Duplicate Records Found" error?
    InfoPackage > Only PSA, Update Subsequently in Data Targets, with Ignore Duplicate Records.
Can you explain clearly how to close this issue permanently?
Note: Points will be assigned.
With Regards,
Kiran


Similar Messages

  • 36 duplicate record  found. -- error while loading master data

    Hello BW Experts,
    Error while loading master data
    ( green light )Update PSA ( 50000 Records posted ) No errors
    ( green light ) Transfer rules ( 50000 → 50000 records ): No errors
    ( green light ) Update rules ( 50000 → 50000 records ): No errors
    ( green light ) Update ( 0 new / 50000 changed ): No errors
    ( red light )Processing end: errors occurred
         Processing 2 finished
    • 36 duplicate record found. 15718 recordings used in table /BIC/PZMATNUM
    • 36 duplicate record found. 15718 recordings used in table /BIC/XZMATNUM
    This error repeats with all the data-packets.
    What could be the reason for the error, and how can it be corrected?
    Any suggestions appreciated.
    Thanks,
    BWer

    BWer,
    We have exactly the same issue when loading the InfoObject 0PM_ORDER; the DataSource is 0PM_ORDER_ATTR.
    The workaround we have been using is a 'Manual push' from the Details tab of the monitor. Oddly, we don't have this issue in our test and dev systems, and even in production it doesn't occur every day. We have all the necessary settings enabled in the InfoPackage.
    Did you figure out a solution for your issue? If so, please let me know, and likewise I will.
    -Venkat

  • Duplicate records found while loading master data(very urgent)

    Hi all,
    One InfoPackage in the process chain failed while loading the master data (full update). It shows the following error: duplicate record found; 1 record used in /BI0/PTCTQUERY, and the same record occurred in the /BI0/PTCTQUERY table.
    Can anyone give me the solution? It's very urgent.
    Thanks & Regards,
    Manjula

    Hi
    You can see the check box on the Processing tab page. Tick the Ignore Duplicate Data Records indicator. When multiple data records that have the same key are transferred, the last data record in the request is updated to BI; any other data records in the request with the same key are ignored.
    Help says that:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
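    If you also want to see which keys are duplicated before reloading, a quick check on the database side can help. Below is a minimal SQL sketch: the table name /BI0/PTCTQUERY comes from the error message above, but the key column name TCTQUERY is an assumption based on the usual P-table naming convention, so verify it in SE11 first.
    -- List keys occurring more than once per object version in the attribute table
    -- (/BI0/PTCTQUERY is from the error message; column name TCTQUERY is assumed)
    select tctquery, objvers, count(*)
    from "/BI0/PTCTQUERY"
    group by tctquery, objvers
    having count(*) > 1;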
    I hope this clears your doubt; otherwise, let me know.
    Regards
    Kiran

  • Calendar and Adressbook error: Duplicate records found for GUID

    Hi all,
    I have a Mountain Lion Server running on a Mac mini and everything was working well.
    This morning one of my users is unable to connect to his calendar and address book. I found this error in the log files:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate records found for GUID ****USER_GUID****:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate: ***USER_Shortname***
    Apparently there is a duplicate match in the database. How can I fix this issue?
    In Server App this user is only listed once.
    Mail and other services for this user are working correctly.
    Thanks for any advice!

    Hi Samuel,
    You may try:
    select code, count(code)
    from [dbo].[@XTSD_XA_CMD]
    group by code
    having count(code) > 1
    What is the result?
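    If the count query returns rows, a follow-up like the sketch below lists the offending rows in full, so you can decide which duplicates to clean up. It runs against the same table name taken from the query above; adjust it if your table differs.
    -- Show every row whose code appears more than once
    select t.*
    from [dbo].[@XTSD_XA_CMD] t
    where t.code in (
        select code
        from [dbo].[@XTSD_XA_CMD]
        group by code
        having count(code) > 1
    )
    order by t.code;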
    Thanks,
    Gordon

  • Duplicate record found short dump, if routed through PSA

    Hi Experts,
    I am getting these errors extracting the master data for 0ACTIVITY
    1 duplicate record found. 1991 recordings used in table /BI0/XACTIVITY
    1 duplicate record found. 1991 recordings used in table /BI0/PACTIVITY
    2 duplicate record found. 2100 recordings used in table /BI0/XACTIVITY
    2 duplicate record found. 2100 recordings used in table /BI0/PACTIVITY.
    If I extract via PSA with the option to ignore duplicate records, I get a short dump:
    ShrtText                                            
        An SQL error occurred when accessing a table.   
    How to correct the error                                                             
         Database error text........: "ORA-14400: inserted partition key does not map to  
          any partition"                                                                  
    What is causing the errors in my extraction?
    thanks
    D Bret

    Go to RSRV. Go to All Elementary Tests --> PSA Tables --> Consistency Between PSA Partitions and SAP Administration Information, and correct the error.
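    If you want to see what the PSA table's partitions actually look like before running the repair, a quick look at the Oracle data dictionary can confirm the ORA-14400 situation. A minimal sketch, assuming an Oracle database; the PSA table name below is a hypothetical placeholder, so look up the real generated name for your DataSource first.
    -- Inspect the partitions of the PSA table (table name is hypothetical)
    select table_name, partition_name, high_value
    from user_tab_partitions
    where table_name = '/BIC/B0001234000'
    order by partition_position;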

  • Help....Duplicate records found in loading customer attributes!!!!

    Hi ALL,
    We have a full-update master data with attributes job that failed with this error message (note: the processing is PSA and then data targets):
    1 duplicate record found. 66 recordings used in table /BIC/PZMKE_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/PZTSA_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/PZTSA_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/XZTSA_CUST RSDMD 199     
    1 duplicate record found. 66 recordings used in table /BIC/XZMKE_CUST RSDMD 199     
    Our DataSource is 0CUSTOMER_ATTR. I tried to use transaction RSO2 to get more information about this DataSource, to find out where the original data lives in R/3, but when I executed it I got this message:
    DataSource 0CUSTOMER_ATTR is extracted using function module MDEX_CUSTOMER_MD
    Can you help me, please? What should I do to correct and reload, or to find the duplicate data so I can tell the person on the R/3 system to delete the duplicates?
    Thanks
    Bilal

    Hi Bilal,
    Could you try the following, please?
    Select Only PSA and Update Subsequently into Data Targets in the Processing tab.
    In the Update tab use Error Handling and mark "Valid Records Update, Reporting Possible (Request Green)".
    If my assumptions are right, this should collect the duplicate records in another request, from which you could get the data that needs to be corrected.
    To resolve this issue, just load using Only PSA and Update Subsequently into Data Targets with Ignore Duplicate Records checked.
    Cheers,
    Praveen.

  • Job Fail:9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL

    Dear All
      A job loads from a BW ODS to a BW master data InfoObject and an ODS. This job always fails with the same message: <b>9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL</b> (Z_DEAL is the InfoObject).
    When I rerun it, the job finishes without error.
    Please help me solve this problem.
    thanks
    Phet

    Hi,
    What is the InfoObject name?
    Regards,
    Goutam

  • Intellisync error "Duplicate records found"

    I am trying to sync my BB 8300 with Outlook 2007.  I am using Blackberry Desk Manager 4.2.2.14.
    I was able to sync before, but now I get this error: "Duplicate records found in the device", and it won't sync.
    Any ideas to fix this problem?
    I tried to uninstall and reinstall DM.
    I also tried to delete the Intellisync folder.
    Thanks! 

    Hi JJKD;
    What is your email application?
    Backup your device.
    Review the BB KB Here...
    There are a couple of approaches to this. If you do not need to complete a two-way sync, you can clear the calendar on the BB handheld by going to BACKUP / ADVANCED and clearing the calendar on the right side of the Advanced GUI. Then just try a regular sync.
    This can also work by clearing
    If you do need to complete a two-way sync, you can BACKUP the device and clear the calendar (edit), then RESTORE just the calendar in BACKUP / ADVANCED by menu'ing the backup you created and completing a restore of the calendar. Backing up, clearing, and then restoring will clear any issues with the calendar DB and reindex the data. Then try a regular sync and see if the symptom is taken care of.
    Good luck, and let us know how you're doing...
    Message Edited by hmeister on 10-10-2008 09:48 PM
    +++++++++++++++++++++++++++++++++++++++++++++++++
    If successful in helping you, please give me kudos in my post.
    Please mark the post that solved it for you!
    Thank you!
    Best Regards
    hmeister

  • Master Delta Load - Error "38 Duplicate records found"

    Hi Guys,
    In one of our delta uploads for master data we are getting the error "38 Duplicate records found". I checked the PSA data, but there were no duplicate records in it.
    Once the data upload fails, I manually update the failed packet and it goes fine.
    Does anyone have a solution for this?

    Hi
    You can see the check box on the Processing tab page. Tick the Ignore Duplicate Data Records indicator. When multiple data records that have the same key are transferred, the last data record in the request is updated to BI; any other data records in the request with the same key are ignored.
    Help says that:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    I hope this clears your doubt; otherwise, let me know.
    You can see the same in my previous thread:
    Re: duplicate records found while loading master data(very urgent)
    Regards
    Kiran

  • Duplicate Record Found

    Hi Friends,
    We are getting the error "Duplicate Record Found" while loading master data.
    When we checked in the PSA, there are no duplicates in the PSA.
    When we deleted the request and reloaded from the PSA, it completed successfully.
    We face this error daily, and after deletion and reload from the PSA it finishes successfully.
    What could be the reason? Is there any solution for this?
    Regards
    SSS

    Hi,
    In the InfoPackage maintenance screen, under the update option, select the check box Ignore Double Data Records. This may solve your problem.
    Hope this helps you a lot.
    Assigning points is the way of saying Thanks in SDN
    Regards
    Ramakrishna Kamurthy

  • Error:1 duplicate record found. 0 recordings used in table /BI0/XBPARTNER

    Hi everyone,
    I get an error about duplicate data records when trying to load master data to an InfoObject.
    I've tried loading to the PSA and then to the InfoObject. The PSA doesn't show any errors, but loading to the InfoObject again gives an error.
    I've also tried loading to the PSA with the options 'Ignore double data records' and 'Update subsequently in data targets'. I still get an error.
    Any suggestions.
    Thanks in advance,
    Maarja

    Take a look at these links below....
    http://help.sap.com/saphelp_nw70/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    http://help.sap.com/saphelp_nw70/helpdata/en/05/bb223905b61b0ae10000000a11402f/frameset.htm
    (Use option 1 C under Activities)

  • Duplicate records in material master

    Hi All
    I am trying to init the material master and I am getting this error message:
    "281 duplicate record found. 0 recordings used in table /BI0/XMATERIAL"
    This is not the first time I am initializing. Deltas were running for a month, then I had to run a repair request to capture some changes, and when I ran the delta from then onwards I got this message.
    I started by deleting all the records and running this init packet.
    I have tried error handling, and I also do not see any option for "ignore duplicate records" in the packet.
    I cannot see any error (red) records in the PSA either, even though the message says there are errors.
    Please advice
    Thanks

    Hi,
    The duplicate record check is in the extraction program. I would suggest you do not deactivate it or comment it out.
    What you should do is go back to your material master records in the source system and sort out the materials that do not have a unique identifier. Once this is sorted out, you can re-run your delta; you shouldn't have the problem again. I once had the same problem with an HR extraction, and I had to go back to the source data and ask the business to correct the duplication. A record for an employee was changed and there was an overlap of dates in the employee's records; the BW extraction program saw this as a duplicate record. A query for finding such overlaps is sketched below.
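    To locate that kind of date overlap, a self-join on the validity intervals works. A minimal sketch, assuming a hypothetical source table EMPLOYEE_PERIODS with columns EMPLOYEE_ID, DATEFROM, and DATETO; substitute your real master data table and key columns.
    -- Find pairs of validity intervals for the same key that overlap
    select a.employee_id,
           a.datefrom as a_from, a.dateto as a_to,
           b.datefrom as b_from, b.dateto as b_to
    from employee_periods a
    join employee_periods b
      on  a.employee_id = b.employee_id
      and a.datefrom < b.datefrom          -- report each pair only once
    where b.datefrom <= a.dateto;          -- b starts before a ends: overlap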
    I hope this helps.
    Please do not forget to award the points.
    Regards,
    Jacob

  • Impdp - duplicate keys found error

    I'm facing a strange issue. I'm running expdp/impdp using Data Pump. The export of the schema is successful, but the import of the schema raises the errors below for a few PKs. I have verified on the source database: there are no duplicate records, and the PK is in the enabled state. What is causing this issue? I had a similar issue earlier; I ran a point-in-time export using the FLASHBACK_SCN parameter, and the import ran smoothly. This time, however, the applications are shut down, there are no connections from the application side, and no DML transactions are running on the schema, so I ran the export without the FLASHBACK_SCN parameter, as we are moving this production database to another server. But the import is raising the errors below. Could someone let me know why this is happening?
    ORA-39083: Object type INDEX failed to create with error:
    ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found
    ORA-39083: Object type CONSTRAINT failed to create with error:
    ORA-02437: cannot validate (schema.tab_PK) - primary key violated

    Hi,
    Can you run the impdp command with this:
    sqlfile=index.sql include=INDEX
    then look in index.sql to find the index that is failing and try creating it manually. The other thing to do is query the target table to see if there are
    duplicate keys on the target. I can't imagine why the create index would fail if there were not.
    If you have a network link between the 2 databases maybe you can do a
    select * from target minus select * from source@network_link
    to see if there is anything different.
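    Putting the two suggestions together, a sketch might look like the following. The connect string, directory object, dump file name, and the table/column names are all hypothetical placeholders; only the sqlfile/include parameters come from the suggestion above.
    -- Step 1 (OS shell): extract the index DDL without importing anything
    --   impdp user/pwd@db directory=DPUMP_DIR dumpfile=schema.dmp sqlfile=index.sql include=INDEX
    -- Step 2 (SQL, on the target): check for duplicate keys on the affected table
    select pk_col1, pk_col2, count(*)
    from target_table
    group by pk_col1, pk_col2
    having count(*) > 1;
    -- Step 3 (SQL, if a database link to the source exists): diff the data
    select * from target_table
    minus
    select * from target_table@source_link;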
    Dean
    Edited by: Dean Gagne on Feb 17, 2010 7:37 AM

  • While loading master data to infoobject Load failed due to Duplicate record

    Hi Experts,
    While loading master data to the InfoObject, the load failed.
    The error it shows is: 24 duplicate record found. 23 recordings used in table.
    Please help me solve this issue.
    Thanks in Advance.
    Regards,
    Gopal.

    In the InfoPackage settings you will find a checkbox for 'delete duplicate records'.
    I think it appears beside the radio button for 'To PSA'; also tick the checkbox for 'subsequent update to data targets'.
    This will remove the duplicate records (if any) from the PSA before they are processed further by the transfer and update rules.
    Use this and reload the master data.
    cheers,
    Vishvesh

  • Duplicate record error

    Hi,
    I am using an ODS as the source to update a master data InfoObject with flexible update. The issue is that in spite of using the option Only PSA (Update Subsequently in Data Target) in the InfoPackage with error handling enabled, I am getting a duplicate record error. This happens only when I am updating through a process chain; if I run it manually, the error doesn't occur. Please let me know the reason.
    Thanks

    Hi Maneesh,
    As we are loading from an ODS to an InfoObject, we don't get the option "don't update duplicate records if they exist".
    First, did you check whether any duplicate records were found in the PSA? If so, delete them from the PSA.
    Another option is to enable error handling in the InfoPackage.
    Or check for inconsistencies for the InfoObject in RSRV and, if found, repair them and load again. Check the inconsistencies for the P, X, and Y tables and for the complete object as well; a rough database-side probe is sketched below.
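    As a very rough stand-in for part of what RSRV checks, the sketch below probes whether the attribute (X) table references SIDs that are missing from the SID (S) table. The table names /BIC/XZOBJ and /BIC/SZOBJ are hypothetical placeholders for your InfoObject's generated tables, and this is only an approximation of the RSRV tests, not a replacement for them.
    -- SIDs present in the X (attribute SID) table but missing from the S (SID) table
    select x.sid
    from "/BIC/XZOBJ" x
    where not exists (
        select 1 from "/BIC/SZOBJ" s
        where s.sid = x.sid
    );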
    Assign points if helpful.
    KS
