Master Delta Load - Error "38 Duplicate records found"

Hi Guys,
In one of our delta uploads for master data we are getting the error "38 Duplicate records found". I checked the PSA data, but there were no duplicate records there.
When the upload fails, I manually update the failed data packet and it then goes through fine.
Does anyone have a solution for this?

Hi
In the InfoPackage, on the Processing tab page, select the Ignore Duplicate Data Records check box. When multiple data records with the same key are transferred, the last data record in the request is updated to BI; any other data records in the request with the same key are ignored.
Help says that:
To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
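To illustrate the "last record per key wins" behaviour described above, here is a minimal SQL sketch; the PSA table name, request ID and key field are hypothetical placeholders (real PSA tables are generated with names like /BIC/B0001234000):
-- Keep only the last record per key within the failed request,
-- mirroring what Ignore Duplicate Data Records does during the update.
SELECT *
FROM ( SELECT p.*,
              ROW_NUMBER() OVER (PARTITION BY material
                                 ORDER BY record DESC) AS rn
       FROM "/BIC/B0001234000" p        -- generated PSA table (placeholder)
       WHERE request = 'REQU_XXXXXXXX'  -- the failed request (placeholder)
     ) t
WHERE rn = 1;   -- rows with rn > 1 are the duplicates that get ignored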
Hope that clears your doubt; otherwise let me know.
You can see the same in my previous thread:
Re: duplicate records found while loading master data(very urgent)
Regards
Kiran

Similar Messages

  • Error:1 duplicate record found. 0 recordings used in table /BI0/XBPARTNER

    Hi everyone,
    I get a duplicate data records error when trying to load master data to an InfoObject.
    I've tried loading to the PSA and then to the InfoObject. The PSA doesn't show any errors, but the load to the InfoObject fails again.
    I've also tried loading to the PSA with the options 'Ignore double data records' and 'Update subsequently in data targets', but I still get the error.
    Any suggestions?
    Thanks in advance,
    Maarja

    Take a look at the links below:
    http://help.sap.com/saphelp_nw70/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    http://help.sap.com/saphelp_nw70/helpdata/en/05/bb223905b61b0ae10000000a11402f/frameset.htm
    (Use option 1 C under Activities.)

  • Duplicate record found short dump, if routed through PSA

    Hi Experts,
    I am getting these errors extracting the master data for 0ACTIVITY
    1 duplicate record found. 1991 recordings used in table /BI0/XACTIVITY
    1 duplicate record found. 1991 recordings used in table /BI0/PACTIVITY
    2 duplicate record found. 2100 recordings used in table /BI0/XACTIVITY
    2 duplicate record found. 2100 recordings used in table /BI0/PACTIVITY.
    If I extract via the PSA with the option to ignore duplicate records, I get a short dump:
    ShrtText: An SQL error occurred when accessing a table.
    Database error text: "ORA-14400: inserted partition key does not map to any partition"
    What is causing the errors in my extraction?
    thanks
    D Bret

    Go to transaction RSRV, then All Elementary Tests --> PSA Tables --> Consistency Between PSA Partitions and SAP Administration Information, run the check, and correct the error.
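    A hedged, database-side way to see the same inconsistency (the PSA table name is a hypothetical placeholder; this assumes the PSA table is range-partitioned by PARTNO, which is what the RSRV check verifies against the administration tables):
    -- Partitions Oracle actually has for the PSA table (placeholder name)
    SELECT partition_name, high_value
    FROM user_tab_partitions
    WHERE table_name = '/BIC/B0001234000';
    -- Highest partition number the load is trying to write to
    SELECT MAX(partno) FROM "/BIC/B0001234000";
    If MAX(partno) exceeds the highest defined partition, the insert fails with ORA-14400, which is the situation the RSRV repair corrects.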

  • 36 duplicate record  found. -- error while loading master data

    Hello BW Experts,
    Error while loading master data
    ( green light ) Update PSA ( 50000 records posted ): no errors
    ( green light ) Transfer rules ( 50000 -> 50000 records ): no errors
    ( green light ) Update rules ( 50000 -> 50000 records ): no errors
    ( green light ) Update ( 0 new / 50000 changed ): no errors
    ( red light ) Processing end: errors occurred
         Processing 2 finished
         36 duplicate record found. 15718 recordings used in table /BIC/PZMATNUM
         36 duplicate record found. 15718 recordings used in table /BIC/XZMATNUM
    This error repeats for all the data packets.
    What could be the reason for the error, and how can it be corrected?
    Any suggestions appreciated.
    Thanks,
    BWer

    BWer,
    We have exactly the same issue when loading the InfoObject 0PM_ORDER; the DataSource is 0PM_ORDER_ATTR.
    The workaround we have been using is a 'manual push' from the Details tab of the monitor. Oddly, we don't have this issue in our test and dev systems, and even in production it doesn't occur every day. We have all the necessary settings enabled in the InfoPackage.
    Did you figure out a solution for your issue? If so, please let me know, and I will do likewise.
    -Venkat

  • Duplicate records found while loading master data(very urgent)

    Hi all,
    One InfoPackage in the process chain failed while loading the master data (full update). It shows the following error: duplicate record found, 1 record used in /BI0/PTCTQUERY, and the same record occurs in the /BI0/PTCTQUERY tables.
    Can anyone give me the solution? It's very urgent.
    Thanks & Regards,
    Manjula

    Hi
    In the InfoPackage, on the Processing tab page, select the Ignore Duplicate Data Records check box. When multiple data records with the same key are transferred, the last data record in the request is updated to BI; any other data records in the request with the same key are ignored.
    Help says that:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope that clears your doubt; otherwise let me know.
    Regards
    Kiran

  • Help....Duplicate records found in loading customer attributes!!!!

    Hi ALL,
    We have a full-update master data with attributes job that failed with the error messages below (note: the processing option is PSA and then data targets):
    1 duplicate record found. 66 recordings used in table /BIC/PZMKE_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/PZTSA_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/PZTSA_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/XZTSA_CUST RSDMD 199     
    1 duplicate record found. 66 recordings used in table /BIC/XZMKE_CUST RSDMD 199     
    Our DataSource is 0CUSTOMER_ATTR. I tried transaction RSO2 to get more information about this DataSource and to find where the original data is in R/3, but when I execute it I get this message:
    DataSource 0CUSTOMER_ATTR is extracted using function module MDEX_CUSTOMER_MD
    Can you please help? What should I do to correct and reload, or to find the duplicate data so I can tell the person on the R/3 side to delete the duplicates?
    Thanks
    Bilal

    Hi Bilal,
    Could you please try the following:
    In the Processing tab, choose 'Only PSA' with 'Update Subsequently in Data Targets'.
    In the Update tab, under Error Handling, mark 'Valid Records Update, Reporting Possible (Request Green)'.
    If my assumptions are right, this should collect the duplicate records in a separate request, from which you can get the data that needs to be corrected.
    To resolve the issue itself, load using 'Only PSA' and 'Update Subsequently in Data Targets' with 'Ignore Duplicate Data Records' checked.
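    As a hedged illustration of how to pull the duplicated customer keys (with their differing attribute values) straight from the PSA so they can be reported back to R/3, a query along these lines can be used; the PSA table name, request ID and field names are hypothetical placeholders:
    -- Return the complete rows for every customer key that occurs more than
    -- once in the failed request, so the attribute values can be compared
    SELECT *
    FROM "/BIC/B0009876000" p              -- generated PSA table (placeholder)
    WHERE request = 'REQU_XXXXXXXX'        -- the failed request (placeholder)
      AND customer IN ( SELECT customer
                        FROM "/BIC/B0009876000"
                        WHERE request = 'REQU_XXXXXXXX'
                        GROUP BY customer
                        HAVING COUNT(*) > 1 )
    ORDER BY customer, record;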
    Cheers,
    Praveen.

  • Calendar and Address Book error: Duplicate records found for GUID

    Hi all,
    I have a Mountain Lion Server running on a Mac mini and everything was working well.
    This morning one of my users was unable to connect to his calendar and address book. I found this error in the log files:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate records found for GUID ****USER_GUID****:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate: ***USER_Shortname***
    Apparently there is a duplicate match in the database. How can I fix this issue?
    In Server App this user is only listed once.
    Mail and other services for this user are working correctly.
    Thanks for any advice!

    Hi Samuel,
    You may try:
    -- Count how many times each Code value occurs in the user-defined table
    -- and return only the values that appear more than once
    SELECT Code, COUNT(Code)
    FROM [dbo].[@XTSD_XA_CMD]
    GROUP BY Code
    HAVING COUNT(Code) > 1
    What is the result?
    Thanks,
    Gordon

  • Intellisync error "Duplicate records found"

    I am trying to sync my BB 8300 with Outlook 2007.  I am using Blackberry Desk Manager 4.2.2.14.
    I was able to sync before, but now I get the error "Duplicate records found in the device" and it won't sync.
    Any ideas to fix this problem?
    I tried uninstalling and reinstalling Desktop Manager.
    I also tried deleting the Intellisync folder.
    Thanks! 

    Hi JJKD;
    What is your email application?
    Backup your device.
    Review the BB KB Here...
    There are a couple of approaches to this. If you don't need a two-way sync, you can clear the calendar on the BB handheld by going to BACKUP / ADVANCED and clearing the calendar on the right side of the Advanced GUI. Then just try a regular sync.
    This can also work by clearing
    If you do need a two-way sync, you can BACKUP the device and clear the calendar, then RESTORE just the calendar in BACKUP / ADVANCED by opening the menu on the backup you created and restoring only the calendar. Backing up, clearing, and then restoring clears any issues with the calendar database and reindexes the data. Then try a regular sync and see if the symptom is taken care of.
    Good luck, and let us know how you're doing...
    Best Regards
    hmeister

  • 0COORDER - Delta Load Error

    Hi,
    We have 0COORDER as a regular delta load. Today we are facing the error message "68 duplicate records found in the master data". I am updating the data serially from the PSA to the data target.
    Can you please advise how I can correct this master data load error?
    Best Regards,
    Venkat.

    You can choose the Ignore Duplicate Records option to let this master data load complete successfully, but I wouldn't recommend it until you find the root cause of the duplicate records. If you ignore the duplicates, only one of the records with the same key combination is loaded and the rest are discarded. What if the discarded record was the one with the correct values? This may lead to inconsistency. So please check with the functional folks and try to analyze the root cause.

  • Job Fail:9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL

    Dear All
    A job loads from a BW ODS to a BW master data InfoObject and another ODS. This job always fails with the same message: "9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL" (Z_DEAL is the InfoObject).
    When I rerun it, the job finishes without error.
    Please help me solve this problem.
    thanks
    Phet

    Hi,
    What is the InfoObject name?
    Regards,
    Goutam

  • Duplicate Record Found

    Hi Friends,
    We are getting the error "Duplicate Record Found" while loading master data.
    When we check the PSA, there are no duplicates in it.
    When we delete the request and reload from the PSA, it completes successfully.
    We face this error daily, and after deleting and reloading from the PSA it finishes successfully.
    What could be the reason? Is there any solution for this?
    Regards
    SSS

    Hi,
    In the InfoPackage maintenance screen, under the update options, select the Ignore Double Data Records check box. This may solve your problem.
    Hope this helps.
    Regards
    Ramakrishna Kamurthy

  • Sap bw 7.3 master data load error

    SAP BW 7.3 master data load error.
    Error: Exception in Substep Rules

    Hi Jayram,
    I am assuming that you are getting this error when you are loading data from PSA to IO. If so, the error might be because of
    1. Duplicate records
    The PSA might have duplicate records, but only one record per key can be loaded into a master data InfoObject. In the DTP Update tab, there is an option "Handle Duplicate Record Keys". Enable it and try the load again.
    2. Erroneous records.
    Some values are allowed in the PSA that might not be allowed in master data/data targets, such as lowercase letters or certain special characters. If the error is because of this, you may need to correct the data at PSA level and reload it to the master data, get it corrected in the source system itself and fetch the data to BW again, or write some code to take care of these special characters (see the sketch after this list).
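    As a hedged sketch of the kind of check meant in point 2 (the PSA table, key field and request ID are hypothetical placeholders, and it assumes a case-sensitive database comparison, as Oracle uses by default), a query like this can surface key values with lowercase letters or the '#' character, which is not in BW's default permitted character set (transaction RSKC):
    -- Flag key values that the master data update would reject
    SELECT record, material
    FROM "/BIC/B0001234000"                -- generated PSA table (placeholder)
    WHERE request = 'REQU_XXXXXXXX'        -- the failed request (placeholder)
      AND ( material <> UPPER(material)    -- contains lowercase letters
            OR material LIKE '%#%' );      -- contains '#'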
    Hope it helps!
    Regards,
    Pavan

  • Delta Load - BI Statistics 0 records

    Hi All,
    I have loaded the init successfully for all the InfoProviders related to BI statistics, but the delta loads are bringing 0 records. I then checked RSA7 and found no records there.
    Please help me how to resolve this issue.
    Regards,

    Hi Vikas,
    I will tell you how I did the settings for the statistics:
    1. Enabled all the InfoProviders for statistics collection (RSA1 -> Tools -> Settings for BI Statistics -> selected the InfoProviders and changed the default setting to 'X').
    2. Ran the init load for all statistics cubes.
    While running the delta, no records are updated.

  • Payment Wizard error: 'No matching records found  'Bank Codes' (ODSC) (ODBC -2028)'

    Hi Experts,
    I am running the Payment Wizard for a bank transfer for a certain vendor; however, I get the error "No matching records found 'Bank Codes' (ODSC) (ODBC -2028)".
    I have already checked my bank settings and payment method settings, but I still get the error.
    I checked SAP Note 1980507, but that scenario is for the case where no bank details are maintained on the BP Payment Terms tab. In my case they are defined, yet I get the same error as reported.
    Any help would be appreciated.
    Thanks,
    Don

    Hi,
    Please check whether the setup described in SAP Note 725786 - "Definitions necessary for the payment wizard" has been done.
    Thanks & Regards,
    Nagarajan

  • Error: No matching records found 'G/L Accounts' (OACT) ( ODBC-2028)'

    Hi all
    While adding an outgoing excise invoice from a delivery, the system gives the following error:
    No matching records found 'G/L Accounts' (OACT) ( ODBC-2028)'
    Please note that:
    1. I have already mapped the CENVAT accounts for outgoing/incoming in the General tab of G/L Account Determination.
    2. I have two fiscal years, 09-10 and 10-11. Both are unlocked. I'm working in fiscal year 09-10, and CENVAT accounts are mapped for both fiscal years.
    3. For rounding, I have also mapped a rounding account.
    4. I am able to create incoming excise invoices without problems.
    5. I am able to add some outgoing excise invoices, but in the Accounting tab there is no transaction.
    6. My excise tax codes are BED+VAT, in which BED has a 0 rate and VAT has a 12.5% or 4% rate. The reason for taking BED as 0 is that the company is a trading company and we have to pass the excise on to the customer by handling the item as a batch.
    This is the whole scenario.
    Please help solve my problem considering all these points.
    Thanks

    I manage the items by groups, and the warehouse is excisable, as I have done a GRPO in that warehouse.
    I have already assigned CENVAT accounts in the warehouse too.
