Duplicate records in DTP

Guys,
My DTP failed 4 days back, but the full load from the source system to the PSA is executing successfully.
To correct the DTP load, I deleted the last DTP request and all the PSA requests except the last one, but when I re-executed the DTP I got duplicate records. Also, in the failed DTP header I am still seeing all the deleted PSA requests.
How do I resolve this issue? I cannot check 'Handle duplicate records' since it is time-dependent data.
Thanks,
Kumar

Hi Kumar,
1. Deleting from the PSA and updating is not a permanent solution.
First check which records are creating the duplicates in the PSA.
The duplicates may not come from the PSA at all; there may already be records in the target object itself,
so deleting from the PSA will not solve the issue.
Check the data in the source tables to see why the wrong data is coming into the PSA like that.
Then you can edit the records in the PSA with the correct values and update the data into the target using the DTP.
You can create an error DTP for that, so it will be easy to trace the duplicates.
2. You have the option 'Handle duplicate record keys' in the DTP.
Check the box and try to load the data again.
If this is time-dependent master data, then also include 'valid to' as a key along with the other objects in the semantic group option of the DTP.
Check this and then try to load the data.
http://help.sap.com/saphelp_nw70/helpdata/EN/42/fbd598481e1a61e10000000a422035/content.htm
Regards
Sudheer

Similar Messages

  • Duplicate Records in DTP's Temporary Storage Area

    I am getting duplicate records (36 records) in the DTP's temporary storage while loading data to 0MAT_SALES (texts). In the error DTP I am also not getting the error stack. What could be the reason?
    Ram Mohan

    As I mentioned in the previous thread, I have duplicate records.
    There are no duplicates in the PSA; when I execute the DTP I get duplicate records in the DTP's temporary storage area...
    Ram

  • Duplicate Records in DTP, but not in PSA

    Hi,
    I'm facing strange behavior of the DTP while trying to load master data: it detects duplicate records where there are none.
    For example:
    ID 'cours000000000001000'
    In the source system: 1 record
    In the PSA: 1 record
    In the DTP Temporary Storage, 2 identical lines are identified.
    In fact, in this Temporary Storage, all the PSA records are duplicated... but only 101 are displayed as erroneous in the DTP...
    Here is my question: How to get rid of this duplication in the temporary storage?
    Thanks for your help
    Sylvain

    Semantic key selection could cause the duplicate issue in master data: if similar values are found in the key fields, the records are treated as duplicates.
    On the second tab of the DTP you have the 'Handle duplicate records' option; choose that and load.
    Ramesh

  • Ignore duplicate records for master data attributes

    Dear experts,
    How and where can I enable 'ignore duplicate records' when running my DTP to load data to master data attributes?

    Hi Raj
    Suppose you are loading master data to an InfoObject, and in the PSA you have more than one record for a key.
    Let's assume you are loading some attributes of Document Number (0DOC_NUMBER) and in the PSA you have multiple records for the same document number. In the PSA this is not a problem, as the key of the PSA table is a technical key, which is different for every record. But it is a problem for the InfoObject attribute table, as more than one record for the primary key (here 0DOC_NUMBER) would violate the primary key constraint in the database.
    This issue can easily be avoided by selecting 'Handle Duplicate Record Keys' in the DTP. You will find this option under the 'Update' tab of the DTP.
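    If the checkbox is not an option in your scenario, a rough alternative is to drop the duplicates yourself in the transformation's start routine. This is only a sketch, assuming a source field DOC_NUMBER and that the last record in the package should win; RECORD is the package-internal record number field of SOURCE_PACKAGE in a BW 7.x start routine:

        " Sort so that the latest record per document number comes first,
        " then keep only that record (DELETE ADJACENT DUPLICATES keeps
        " the first row of each group).
        SORT SOURCE_PACKAGE BY doc_number ASCENDING record DESCENDING.
        DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE COMPARING doc_number.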
    Regards
    Anindya

  • Duplicate records in PSA and Error in Delta DTP

    Hi Experts,
    I have enabled delta on the CALDAY field for a generic DataSource in the source system.
    1) Initially I created one record in the source system and ran the delta InfoPackage, so the PSA contains one record.
    2) I ran the delta DTP; now that one record has moved to the DSO.
    3) Again I created one more record in the source system and ran the delta InfoPackage. Now the PSA contains 2 records: the one created in step 3 and the one from step 1.
    4) If I try to run the delta DTP, I get an error saying that there are duplicate records.
    So would you please tell me how to rectify this issue?
    My requirement is that I need to run the load multiple times a day. (Note: the DataSource is delta enabled.)
    Thanks and Regards,
    K.Krishna Chaitanya.

    You can use this feature only when loading master data, as it will ignore lines. If you had, for example, a changed record with a before and an after image, one of the two lines would be ignored.
    You can keep searching and asking, but if you want to load a delta more than once a day, you can't use a calendar day as the generic delta field; this will not work.
    The best thing is to have a timestamp.
    If the DataSource has a date and a time, you can create a timestamp yourself.
    In this case you need to use a function module.
    Search the forum for 'generic delta function module' and you will surely find code that you can reuse.
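    As a rough illustration (untested; the work area LS_DATA and the fields ERDAT, ERZET and ZTIMESTAMP are assumptions, not a real extractor): inside the generic extractor function module you could fill the delta-relevant timestamp field like this:

        DATA lv_ts TYPE timestamp.

        " Combine the record's date and time into a UTC timestamp
        " that can then serve as the generic delta field.
        CONVERT DATE ls_data-erdat TIME ls_data-erzet
                INTO TIME STAMP lv_ts TIME ZONE sy-zonlo.
        ls_data-ztimestamp = lv_ts.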
    M.

  • BI 7.0 - Duplicate Record Error while loading master data

    I am working on BI 7.0 and I am trying to load master data to an InfoObject.
    I created an InfoPackage and loaded into the PSA.
    I created a transformation and a DTP, and I get an error about duplicate records after I execute the DTP.
    I have read all the previous threads about duplicate record errors while loading master data, and most of them suggested checking the 'Ignore duplicate records' option in the InfoPackage. But in 7.0 I can only load to the PSA with an InfoPackage, and it doesn't have any option to ignore duplicate records.
    My data is loaded to the PSA fine, and I get this error while loading to the InfoObject using the DTP.
    I would appreciate your help to resolve this issue.
    Regards,
    Ram.

    Hi,
    Refer:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/45/2a8430131e03c3e10000000a1553f6/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    With rgds,
    Anil Kumar Sharma .P

  • Duplicate records-problems

    Hi gurus
    We created a text DataSource in R/3 and replicated it into BW 7.0.
    An InfoPackage (loading to PSA) and a data transfer process were created and included in a process chain.
    The job failed because of duplicate records.
    We then discovered that the 'Delivery of Duplicate Records' setting for this DataSource in BW is 'Undefined'.
    When creating the DataSource in R/3, there were no settings for the delivery of duplicate records.
    In BW, I've tried to change the 'Delivery of Duplicate Data Records' setting to NONE, but when I go into change mode, the 'Delivery of duplicates' field is not changeable.
    Does anyone have any suggestion on how to solve this problem?
    Thanks,
    @nne Therese

    Hi Muraly,
    I have the same issue. I am loading texts from R/3 to the PSA using an InfoPackage with full update. From the PSA I am using a DTP in delta mode with the option 'Update valid records, no reporting (request red)'.
    It was running fine for the last few weeks: the transferred and added records matched the PSA request every day.
    Suddenly the load to the InfoObject failed. I deleted the request from the InfoObject and reloaded using the DTP; it failed again. I tried loading with full update, since it is texts, and it failed again. When I analyzed the error it said duplicate records. So I changed the DTP by checking the 'Handle duplicate records' option and loaded with full update. It worked fine: it transferred more than 50,000 records, and the added records exactly matched the PSA request.
    I reset the DTP back to delta and loaded today; the transferred records are 14,000 and the added records (3,000) match the PSA request. Historically, the numbers of transferred and added records in the InfoObject and the number of records in the PSA request have been the same every day.
    Why is there a difference now? In production I have no issues. Since I changed the DTP, will transporting it to production make any difference? This is my first time on BI 7.0.
    Please advise, and correct me if I am wrong.
    Thanks,
    Sudha..

  • Duplicate Records generating for infocube.

    hi all,
    When I load data from the DataSource to the InfoCube I get duplicate records. The data is loaded into the DataSource from a flat file. The first time I executed it, it worked, but after I changed the flat file structure I am not getting the modified content in the InfoCube; instead it shows duplicates. The DataSource 'preview data' option shows the required data (i.e. the modified flat file), but the InfoCube does not, even though I made all the necessary changes in the DataSource, InfoCube, InfoPackage and DTP. I even deleted the data in the InfoCube and still get duplicates. What is the ideal solution for this problem? One way is to create a new DataSource with the modified flat file, but I think that is not ideal. What is a possible solution without creating the DataSource again?

    Finally I got it. I deleted the request IDs in the InfoPackage (right-click the InfoPackage and go to Manage), then executed the transformation and the DTP, and finally got the required output without duplicates.

  • Duplicate Records in Transactional Load

    Dear All,
    I have an issue where data is loaded from one write-optimized DSO to another write-optimized DSO, and the DTP fails because of duplicate records. It is a transactional load.
    I would be grateful if you could help me understand how to handle this situation and reload the DTP.
    I have searched the forum, but most threads are about master data loading, where 'Handle duplicate records' can be selected.
    Thanks in Advance...
    Regards,
    Syed

    Hi Ravi,
    Thanks for your reply.
    If we uncheck the option, it would accept the duplicate records, right?
    In my scenario, data comes from one write-optimized DSO to another. In the first DSO data uniqueness is not checked, and in the second DSO uniqueness is checked, so it gives me the duplicate error message.
    I see around 28 records in the error stack. So please let me know how I can process these erroneous (duplicate) records as well.
    Many Thanks...
    Regards,
    Syed

  • Duplicate Records in InfoProvider

    Hi,
    I am loading transaction data from flat files into the DataSources.
    Initially I have one request (data from one flat file) loaded into the PSA and the InfoCube, with, say, 100 records.
    Later, I loaded another flat file into the PSA with 50 records (without deleting the initial request). Now the PSA has 150 records.
    But I would like to load only the 50 new records into the InfoCube. When I execute the DTP, it loads all 150 records, i.e. a total of 100 (initial records) + 150 = 250 records.
    Is there an option by which I can avoid loading the duplicate records into my InfoCube?
    I found an option in the DTP that says 'Get Data by Request'. I tried checking it, but no luck.
    How can I solve this issue, and what exactly does the 'Get Data by Request' check do?
    Thanks,

    Hi Sesh,
    There is an option in the DTP to load only the new records. I think you have to select the 'Do not allow duplicate records' radio button (I guess)... then try to load the data. I am not sure, but you can look for that option in the DTP...
    Regards,
    Kishore

  • Duplicate records in flat file extracted using openhub

    Hi folks
    I am extracting data from the cube through an open hub into a flat file, and I see duplicate records in the file.
    I am doing a full load to a flat file.
    I cannot have a technical key because I am using a flat file.
    Poonam

    I am using aggregates (in the DTP there is an option to use aggregates) and the aggregates are compressed, and I am still facing this issue.
    Poonam

  • Duplicate records in PSA

    Hi all,
    How do I identify and eliminate the duplicate records in the PSA?

    Hi,
    Here is the F1 help for the 'Handle Duplicate Record Keys' option on the Update tab:
    "Indicator: Handling Duplicate Data Records
    If this indicator is set, duplicate data records are handled during an update in the order in which they occur in a data package.
    For time-independent attributes of a characteristic, the last data record with the corresponding key within a data package defines the valid attribute value for the update for a given data record key.
    For time-dependent attributes, the validity ranges of the data record values are calculated according to their order (see example).
    If during your data quality measures you want to make sure that the data packages delivered by the DTP are not modified by the master data update, you must not set this indicator!
    Use:
    Note that for time-dependent master data, the semantic key of the DTP may not contain the field of the data source containing the DATETO information. When you set this indicator, error handling must be activated for the DTP because correcting duplicate data records is an error correction. The error correction must be "Update valid records, no reporting" or "Update valid records, reporting possible".
    Example:
    Handling of time-dependent data records
    - Data record 1 is valid from 01.01.2006 to 31.12.2006
    - Data record 2 has the same key but is valid from 01.07.2006 to 31.12.2007
    - The system corrects the time interval for data record 1 to 01.01.2006 to 30.06.2006. As of 01.07.2006, the next data record in the data package (data record 2) is valid."
    By flagging this option in the DTP, you are allowing it to take the latest value.
    There is further information at this SAP Help Portal link:
    http://help.sap.com/saphelp_nw04s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/content.htm
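    If the flag cannot be used, the same "later record wins" interval correction can be approximated in a start routine. The following is only a hedged sketch with placeholder field names (OBJKEY, DATEFROM, DATETO), not the actual SAP implementation:

        FIELD-SYMBOLS: <ls_rec>  LIKE LINE OF SOURCE_PACKAGE,
                       <ls_next> LIKE LINE OF SOURCE_PACKAGE.
        DATA lv_next TYPE sy-tabix.

        " Order the package by key and validity start date.
        SORT SOURCE_PACKAGE BY objkey datefrom.
        LOOP AT SOURCE_PACKAGE ASSIGNING <ls_rec>.
          lv_next = sy-tabix + 1.
          READ TABLE SOURCE_PACKAGE ASSIGNING <ls_next> INDEX lv_next.
          IF sy-subrc = 0 AND <ls_next>-objkey = <ls_rec>-objkey
             AND <ls_next>-datefrom <= <ls_rec>-dateto.
            " Truncate the earlier interval so the later record
            " takes over from its own start date.
            <ls_rec>-dateto = <ls_next>-datefrom - 1.
          ENDIF.
        ENDLOOP.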
    Rgds,
    Colum

  • Technical Content Object - Duplicate Records

    Hi,
    My PSA is getting duplicate records for a technical content object, and because of that the load fails.
    Can I remove the duplicate records from the technical content tables?
    Thanks,
    Abhi.

    It's a 3.x DataSource, so it doesn't have a DTP. I found SAP Note 1229385 (BIAC: Duplicate entries for 0TCTPRCVAR_TEXT in PSA), but it works for SP19 and we are on SP18.
    Kindly let me know an SAP Note for SP18 to avoid duplicate entries for 0TCTPRCVAR_TEXT in the PSA.
    Thanks,
    Abhi.

  • Data Load Fails due to duplicate records from the PSA

    Hi,
    I have loaded the master data twice into the PSA. Then I created a DTP to load the data from the PSA to the InfoProvider. The data load fails with the error "duplicate key/records found".
    Is there any setting I can configure so that, even though I have duplicate records in the PSA, I can successfully load only one set of data (without duplicates) into the InfoProvider?
    How can I set up the process chains to do so?
    Your answers to the above two questions are appreciated.
    Thanks,

    Hi Sesh,
    There are two places where the DTP checks for duplicates.
    First, it checks previous error stacks. If the records you are loading are still contained in the error stack of a previous DTP run, it throws the error at this stage. In that case you first have to clean up the previous error stack.
    The second stage cleans up duplicates across data packages, provided the option is set in your DataSource. But note that this will not solve the problem if you have duplicates within the same data package; in that case you can do the filtering yourself in the start routine of your transformation.
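    For example, a minimal in-package filter in the start routine might look like this (a sketch only; KEYFIELD is a placeholder for your actual semantic key field or fields):

        " Keep the first record per semantic key within this package;
        " records with a repeated key are dropped before the update.
        SORT SOURCE_PACKAGE BY keyfield.
        DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE COMPARING keyfield.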
    Hope this helps,
    Pieter

  • Master data infoobject can't handle duplicate records after SP10

    Hi
    I am trying to load master data which happens to contain duplicate records from the source system. In the DTP of the master data InfoObject, I ticked the 'Handle Duplicate Record Keys' checkbox. After executing this DTP, the duplicate master data records were trapped in the error stack; I was expecting the duplicate master data to be overwritten instead. I understand that this error was fixed by Note 954661 (Updating master data texts when error in data package), which is from SP9. After applying Support Package 10, the master data InfoObject just can't handle records with duplicate keys.
    Please let me know if you manage to fix this problem.
    Many thanks,
    Anthony

    Found a fix for this problem. Just applied OSS Note 986196 (Error during duplicate record handling of master data texts).
