Duplicate Records in DTP's Temporary Storage Area

I am getting duplicate records (36 records) in the DTP's temporary storage while loading data to 0MAT_SALES (Text). In the error DTP, too, I am not getting the error stack. What could be the reason?
Ram Mohan

As I mentioned in the previous thread, I have duplicate records.
There are no duplicates in the PSA; it is only while executing the DTP that I get duplicate records in the DTP's temporary storage area.
Ram

Similar Messages

  • How to view temporary storage areas while loading from a DTP?

    I am loading to an InfoCube using a DTP. The following error occurs:
    "Error while updating to target C_0APO_C (type INFOCUBE)".
    I want to view the temporary storage area to see the data after transformation. How do I do that, and why is this error occurring?

    Hi,
    Option 1:
    To view the temporary storage after the data load, you have to enable temporary storage in the DTP definition for the individual steps that you want to analyze. This is available in the settings of the DTP (see the menu options).
    Option 2:
    Whenever your data load generates errors, you can open the error stack from the DTP monitor to analyze them.
    Cheers.

  • Duplicate Records in DTP, but not in PSA

    Hi,
    I'm facing a strange behavior of the DTP while trying to load master data: it detects duplicates where there are none.
    For example:
    ID 'cours000000000001000'
    In the source system: 1 record
    In the PSA: 1 record
    In the DTP Temporary Storage, 2 identical lines are identified.
    In fact, in this Temporary Storage, all the PSA records are duplicated... but only 101 are displayed as erroneous in the DTP...
    Here is my question: How to get rid of this duplication in the temporary storage?
    Thanks for your help
    Sylvain

    Semantic key selection can cause the duplicate issue for master data: if records share the same values in the semantic key fields, they are treated as duplicates. On the Update tab (the second tab) of the DTP you will find the 'Handle Duplicate Record Keys' option; select it and run the load again.
    Ramesh
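
    To make the mechanism concrete, here is a minimal, hypothetical ABAP sketch of how duplicates within one data package can be detected by comparing semantic key fields. The type and field names (ty_package, matnr) are assumptions for illustration, not the actual DTP runtime objects.

    * Hypothetical sketch: collect records that share the same semantic
    * key (here just MATNR) within one data package. All names are
    * illustrative assumptions, not the real DTP implementation.
    TYPES: BEGIN OF ty_package,
             matnr TYPE c LENGTH 18,   " semantic key field
             txtmd TYPE c LENGTH 40,   " some payload field
           END OF ty_package.

    DATA: lt_package   TYPE STANDARD TABLE OF ty_package,
          lt_duplicate TYPE STANDARD TABLE OF ty_package,
          lv_prev_key  TYPE c LENGTH 18,
          lv_first     TYPE abap_bool VALUE abap_true.

    FIELD-SYMBOLS <ls_rec> TYPE ty_package.

    * A stable sort keeps the original package order within each key.
    SORT lt_package BY matnr STABLE.

    LOOP AT lt_package ASSIGNING <ls_rec>.
      IF lv_first = abap_false AND <ls_rec>-matnr = lv_prev_key.
        " Same semantic key as the previous record: flag as duplicate
        APPEND <ls_rec> TO lt_duplicate.
      ENDIF.
      lv_prev_key = <ls_rec>-matnr.
      lv_first    = abap_false.
    ENDLOOP.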

  • Duplicate records in DTP

    Guys,
    My DTP failed 4 days back, but the full load from the source system to the PSA executes successfully.
    To correct the DTP load, I deleted the last DTP request and all the PSA requests except the last one, and when I re-executed the DTP I got duplicate records. Also, in the failed DTP header I can still see all the deleted PSA requests.
    How do I resolve this issue? I cannot check 'Handle Duplicate Record Keys' since it is time-dependent data.
    Thanks,
    Kumar

    Hi Kumar,
    1. Deleting from the PSA and reloading is not a permanent solution. First check which records are creating the duplicates in the PSA. The duplicates usually do not come from the PSA; there may already be records in the target object itself, so deleting from the PSA will not solve the issue. Check the data in the source tables to find out why the wrong data is coming into the PSA, then correct the records in the PSA and update the data into the target using the DTP. You can also create an error DTP so that it is easier to trace the duplicates.
    2. You have the option 'Handle Duplicate Record Keys' in the DTP. Check the box and try to load the data again. If this is time-dependent master data, then also include 'valid to' as a key along with the other objects in the semantic group option of the DTP. Check this and then try to load the data.
    http://help.sap.com/saphelp_nw70/helpdata/EN/42/fbd598481e1a61e10000000a422035/content.htm
    Regards
    Sudheer

  • Ignore duplicate records for master data attributes

    Dear experts,
    How and where can I enable 'ignore duplicate records' when I am running my DTP to load data to master data attributes?

    Hi Raj,
    Suppose you are loading master data to an InfoObject and in the PSA you have more than one record for a key.
    Let's assume you are loading some attributes of Document Number (0DOC_NUMBER) and in the PSA you have multiple records for the same document number. In the PSA this is not a problem, as the key of the PSA table is a technical key that is different for every record. But it is a problem for the InfoObject attribute table, as more than one record for the same primary key (here 0DOC_NUMBER) would violate the primary key constraint in the database.
    This issue can easily be avoided by selecting 'Handle Duplicate Record Keys' in the DTP. You will find this option under the Update tab of the DTP.
    Regards
    Anindya
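
    For time-independent attributes, the effect of this flag is that the last record per key within a data package wins (see the F1 help quoted in a later thread below). As a rough, hypothetical ABAP sketch of that keep-the-last-record rule, with all type and field names assumed for illustration:

    * Sketch: keep only the last record per key within a package,
    * which is what 'Handle Duplicate Record Keys' effectively does
    * for time-independent attributes. Names are illustrative.
    TYPES: BEGIN OF ty_attr,
             doc_number TYPE c LENGTH 10,  " semantic key
             recno      TYPE i,            " position in the package
             attr_value TYPE c LENGTH 20,
           END OF ty_attr.

    DATA lt_attr TYPE STANDARD TABLE OF ty_attr.

    * Sort so that, per key, the record that came last in the package
    * appears first; DELETE ADJACENT DUPLICATES then keeps that one.
    SORT lt_attr BY doc_number ASCENDING recno DESCENDING.
    DELETE ADJACENT DUPLICATES FROM lt_attr COMPARING doc_number.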

  • Identifying duplicate records in a table

    I am trying to identify duplicate records in a table. Well, they are broadly duplicated, but some of the fields are changed on each insert whilst others are always the same.
    I can't work out the logic and it is driving me #$%$#^@ crazy!

    Here are a couple of other examples:
    Method 1: uses the uniqueness of Oracle ROWIDs to identify duplicates.
    =========
    To check for single-column duplicates:
    select rowid, deptno
    from dept outer
    where outer.rowid >
          (select min(rowid) from dept inner
           where inner.deptno = outer.deptno)
    order by deptno;
    To check for multi-column (key) duplicates (note that the concatenation operator is ||, and the subquery must compare the same concatenated columns on both sides):
    select rowid, deptno, dname
    from dept outer
    where outer.rowid >
          (select min(rowid) from dept inner
           where inner.deptno || inner.dname = outer.deptno || outer.dname)
    order by deptno;
    Method 2: uses result-set groups to identify uniqueness.
    =========
    To check for single-column duplicates:
    select rowid, deptno
    from dept
    where deptno in
          (select deptno from dept group by deptno having count(*) > 1)
    order by deptno;
    To check for multi-column (key) duplicates:
    select rowid, deptno, dname
    from dept
    where deptno || dname in
          (select deptno || dname from dept group by deptno || dname having count(*) > 1)
    order by deptno;

  • Duplicate records in PSA and Error in Delta DTP

    Hi Experts,
    I have enabled a delta on the CALDAY field for a generic DataSource in the source system.
    1) Initially I created one record in the source system and ran the delta InfoPackage, so the PSA contains one record.
    2) I ran the delta DTP, and that one record moved to the DSO.
    3) I then created one more record in the source system and ran the delta InfoPackage again. Now the PSA contains 2 records: the newly created record and the record from step 1.
    4) When I try to run the delta DTP, I get an error saying that there are duplicate records.
    Could you please tell me how to rectify this issue?
    My requirement is that I need to run the load multiple times a day. (Note: the DataSource is delta-enabled.)
    Thanks and Regards,
    K.Krishna Chaitanya.

    You can use this feature only when loading master data, as it ignores whole lines: if you had, for example, a changed record with a before and an after image, one of the two lines would be ignored.
    You can keep searching and asking, but if you want to load a delta more than once a day, you can't use a calendar day as the generic delta field; that will not work. The best thing is to have a timestamp.
    If the DataSource has a date and a time, you can create the timestamp yourself; in that case you need to use a function module.
    Search the forum for 'generic delta function module' and you will certainly find code that you can reuse.
    M.
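
    As a minimal sketch of the 'create a timestamp yourself' idea: assuming the extractor sees a separate change date and change time, ABAP can combine them into a UTC timestamp that the generic delta can then be based on. The variable names below are illustrative assumptions, not part of any delivered function module.

    * Minimal sketch: build a timestamp from separate date and time
    * fields so it can serve as the generic delta field.
    DATA: lv_date TYPE d,          " e.g. change date from the source
          lv_time TYPE t,          " e.g. change time from the source
          lv_ts   TYPE timestamp.  " YYYYMMDDhhmmss

    lv_date = sy-datum.
    lv_time = sy-uzeit.

    * Combine date and time into a UTC timestamp.
    CONVERT DATE lv_date TIME lv_time
            INTO TIME STAMP lv_ts TIME ZONE 'UTC'.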

  • SSIS - "Violation of PRIMARY KEY constraint. Cannot insert duplicate key in object '<tablename>'. The duplicate key value is <1234>", though there are no duplicate records

    Hi,
    I am providing support for one of our clients, where we have jobs scheduled to load data from tables in the source database to the destination database via SSIS packages. The first load is a full load, in which we truncate all the tables in the destination and load them from the source tables. From the next day onwards we perform an incremental load from source to destination, i.e., only modified records, fetched using change tracking, are loaded to the destination. After the full load, when we run the incremental load, the job fails on one of the packages with the error "Violation of PRIMARY KEY constraint. Cannot insert duplicate key in object '<tablename>'. The duplicate key value is <1234>", even though there are no duplicate records. When we try debugging and running the failing package, it runs successfully. We cannot figure out why the package fails and then runs successfully the next day. Request you to help me in this regard.
    Thank you,
    Bala Murali Krishna Medipally.

    I suspect you are trying to insert the modified records instead of updating them.

  • Error when setting the Temporary Storage settings of DTP

    Hi all,
    I am trying to set the temporary storage settings for a DTP, but the moment I click on the temporary storage settings, it gives an ABAP short dump with the following message:
    Runtime Errors         OBJECTS_OBJREF_NOT_ASSIGNED
    Except.                CX_SY_REF_IS_INITIAL
    Date and Time          29.09.2008 23:39:09
    Does anyone know the resolution for this? Is there an SAP Note that can be applied to solve this problem?
    regards,
    Ram

    Hi,
    Check SAP Note 1081884.

  • How to create duplicate records in end routines

    Hi,
    The key fields in the DSO are:
    Plant
    Storage Location
    MRP Area
    Material
    Changed Date
    The data fields are:
    Safety Stock
    Service Level
    MRP Type
    Counter_1 (inflow key figure)
    Counter_2 (outflow key figure)
    n_ctr (non-cumulative key figure)
    For every record that comes in, we need to create a duplicate record. For the original record we need to set Counter_1 to 1 and Counter_2 to 0. For the duplicate record we need to update Changed_Date to today's date, set Counter_1 to 0 and Counter_2 to -1, and leave the rest of the values as they are. Where is the best place to write this code for the DSO? Is it the end routine?
    Please give me a basic idea of the code.

    Hi Uday,
    I have the same situation as Suneel and have written your logic in the DSO end routine as follows:
    * Table and work area for the duplicated records; TYS_TG_1 is the
    * generated line type of the transformation's result package.
    DATA: l_t_duplicate_records TYPE TABLE OF TYS_TG_1,
          l_w_duplicate_record  TYPE TYS_TG_1.
    LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
        " Copy the original record before modifying it
        MOVE-CORRESPONDING <result_fields> TO l_w_duplicate_record.
        " Original record: inflow counter 1, outflow counter 0
        <result_fields>-/BIC/ZPP_ICNT = 1.
        <result_fields>-/BIC/ZPP_OCNT = 0.
        " Duplicate record: today's date, inflow 0, outflow -1
        l_w_duplicate_record-CH_ON = sy-datum.
        l_w_duplicate_record-/BIC/ZPP_ICNT = 0.
        l_w_duplicate_record-/BIC/ZPP_OCNT = -1.
        APPEND l_w_duplicate_record TO l_t_duplicate_records.
    ENDLOOP.
    " Add all duplicate records to the package in one go
    APPEND LINES OF l_t_duplicate_records TO RESULT_PACKAGE.
    I am getting the error below:
    Duplicate data record detected (DS ZPP_O01, data package: 000001, data record: 4) RSODSO_UPDATE 19
    I also have a different requirement for the date. My requirement is to populate the CH_ON date as follows: sort the records by key, take the latest CH_ON value per unique plant / storage location / material combination, and populate that CH_ON value in the duplicate record.
    Please help me to resolve this issue.
    Thanks,
    Ganga
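
    For the CH_ON requirement described above, one possible shape of the logic is sketched below as a replacement for the loop in the end routine: first derive the latest CH_ON per plant / storage location / material from the package, then read that value when building each duplicate record. This is only an illustrative fragment; the field names (PLANT, STOR_LOC, MATERIAL, CH_ON) are assumptions about the target structure, and it does not by itself resolve the RSODSO_UPDATE duplicate-key error.

    * Sketch: latest CH_ON per plant/storage location/material,
    * used to fill the duplicate records. Field names are assumed.
    DATA: l_t_latest TYPE TABLE OF TYS_TG_1,
          l_w_latest TYPE TYS_TG_1.

    l_t_latest[] = RESULT_PACKAGE[].
    * Latest date first per key; keep only that first row per key.
    SORT l_t_latest BY plant stor_loc material ch_on DESCENDING.
    DELETE ADJACENT DUPLICATES FROM l_t_latest
           COMPARING plant stor_loc material.

    LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
      MOVE-CORRESPONDING <result_fields> TO l_w_duplicate_record.
      READ TABLE l_t_latest INTO l_w_latest
           WITH KEY plant    = <result_fields>-plant
                    stor_loc = <result_fields>-stor_loc
                    material = <result_fields>-material.
      IF sy-subrc = 0.
        " Latest CH_ON of this key goes into the duplicate record
        l_w_duplicate_record-ch_on = l_w_latest-ch_on.
      ENDIF.
      l_w_duplicate_record-/BIC/ZPP_ICNT = 0.
      l_w_duplicate_record-/BIC/ZPP_OCNT = -1.
      APPEND l_w_duplicate_record TO l_t_duplicate_records.
    ENDLOOP.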

  • Duplicate records-problems

    Hi gurus,
    We created a text DataSource in R/3 and replicated it into BW 7.0.
    An InfoPackage (loading to PSA) and a data transfer process were created and included in a process chain.
    The job failed because of duplicate records.
    We have now discovered that the "Delivery of Duplicate Records" setting for this DataSource in BW is set to "Undefined".
    When creating the DataSource in R/3, there were no settings for the "Delivery of duplicate records".
    In BW, I've tried to change the "Delivery of Duplicate Data Records" setting to NONE, but when I go into change mode, the "Delivery of duplicate" field is not changeable.
    Does anyone have a suggestion on how to solve this problem?
    Thanks,
    @nne Therese

    Hi Muraly,
    I have the same issue. I am loading texts from R/3 to the PSA using an InfoPackage with full update. From the PSA I am using a DTP with delta and the error-handling option "Valid Records Update, No Reporting (Request Red)".
    It was running fine for the last few weeks: the transferred and added records matched the PSA request every day.
    Suddenly the load to the InfoObject failed. I deleted the request from the InfoObject and reloaded using the DTP; it failed again. I tried a full update load, since it is texts, and it failed again. When I analyzed the error it said duplicate records. So I changed the DTP by checking the 'Handle Duplicate Record Keys' option and loaded with full update. It worked fine: it transferred more than 50,000 records and the added records exactly matched the PSA request.
    I reset the DTP back to delta and loaded today, but the transferred records are 14,000 and the added records (3,000) are the same as the PSA request. Looking at the load history, the numbers of transferred and added records in the InfoObject and the number of records in the PSA request used to match every day.
    Why this difference now? In production I have no issues. Since I changed the DTP, will it make any difference when I transport it to production? This is my first time working with BI 7.0.
    Please advise, and correct me if I am wrong.
    Thanks,
    Sudha..

  • Duplicate records in Infoobject

    Hi gurus,
    While loading into an InfoObject, it throws the error "150 duplicate records found" in the Details tab. My doubt is about the temporary correction: in the InfoPackage we can either increase the error-handling number and load, or choose the third option (valid records update) and update to the InfoObject.
    My question is: when the error occurred, I could see that some records were shown under 'Added'. After the error occurred I forced the request to red (delta), deleted it, chose the third option (valid records update), and updated to the InfoObject. The load was successful and all the records were transferred, but the number of records under 'Added' is 0. I also read in an article that the error records are stored as a separate request in the PSA, but I cannot find it. I want to know when and how the new records will get added.

    Hi Jitu,
    If it is only an InfoObject, then under the Processing tab select only PSA and check the first option. After this, rerun the InfoPackage. Go to the monitor and see whether you got the same number of records, including duplicates. Then follow the steps below:
    1. Go to transaction RSRV, expand Master Data and double-click on the second option.
    2. Click on the option on the right side of the panel. A pop-up window appears; enter the name of the InfoObject that failed.
    3. Select that InfoObject and click on the Delete button; if necessary, save and execute, but do this only if necessary. Usually the problem is resolved by deleting. After clicking Delete, go back to the monitor screen and process the data packet manually by clicking on the wheel button.
    4. Select the request and click the 'read manually' button (the wheel button).
    Don't forget to save the InfoPackage back with only the InfoObject selected.
    Hope this works. Let me know if you have doubts.
    Assign points if you are able to find the solution.
    Regards,
    Manjula

  • Duplicate records in flat file extracted using openhub

    Hi folks
    I am extracting data from the cube to opnhub into a flat file, I see duplicate records in the file.
    I am doing a full load to a flat file
    I cannot have technical key because I am using a flat file.
    Poonam

    I am using aggregates (in the DTP there is an option to use aggregates), the aggregates are compressed, and I am still facing this issue.
    Poonam

  • Duplicate records in PSA

    Hi all,
    How do I identify and eliminate duplicate records in the PSA?

    Hi,
    Here is the F1 help for the 'Handle Duplicate Record Keys' option on the Update tab:
    "Indicator: Handling Duplicate Data Records
    If this indicator is set, duplicate data records are handled during an update in the order in which they occur in a data package.
    For time-independent attributes of a characteristic, the last data record with the corresponding key within a data package defines the valid attribute value for the update for a given data record key.
    For time-dependent attributes, the validity ranges of the data record values are calculated according to their order (see example).
    If during your data quality measures you want to make sure that the data packages delivered by the DTP are not modified by the master data update, you must not set this indicator!
    Use:
    Note that for time-dependent master data, the semantic key of the DTP may not contain the field of the data source containing the DATETO information. When you set this indicator, error handling must be activated for the DTP because correcting duplicate data records is an error correction. The error correction must be "Update valid records, no reporting" or "Update valid records, reporting possible".
    Example:
    Handling of time-dependent data records
    - Data record 1 is valid from 01.01.2006 to 31.12.2006
    - Data record 2 has the same key but is valid from 01.07.2006 to 31.12.2007
    - The system corrects the time interval for data record 1 to 01.01.2006 to 30.06.2006. As of 01.07.2006, the next data record in the data package (data record 2) is valid."
    By flagging this option in the DTP, you are allowing it to take the latest value.
    There is further information at this SAP Help Portal link:
    http://help.sap.com/saphelp_nw04s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/content.htm
    Rgds,
    Colum
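
    The time-dependent handling quoted above is essentially an interval correction pass over records sharing a key. Purely as a hypothetical illustration of that rule (the type and field names are assumed; this is not SAP's actual implementation):

    * Sketch of the interval correction described in the help text:
    * for records with the same key, sorted by DATEFROM, truncate each
    * record's DATETO to the day before the next record's DATEFROM.
    TYPES: BEGIN OF ty_md,
             matnr    TYPE c LENGTH 18,  " characteristic key
             datefrom TYPE d,
             dateto   TYPE d,
           END OF ty_md.

    DATA: lt_md  TYPE STANDARD TABLE OF ty_md,
          lv_idx TYPE i.

    FIELD-SYMBOLS: <ls_cur>  TYPE ty_md,
                   <ls_next> TYPE ty_md.

    SORT lt_md BY matnr datefrom.

    LOOP AT lt_md ASSIGNING <ls_cur>.
      lv_idx = sy-tabix + 1.
      READ TABLE lt_md ASSIGNING <ls_next> INDEX lv_idx.
      IF sy-subrc = 0 AND <ls_next>-matnr = <ls_cur>-matnr
         AND <ls_next>-datefrom <= <ls_cur>-dateto.
        " Overlap with the next record of the same key: cut the
        " current validity to the day before the next DATEFROM
        " (e.g. 31.12.2006 becomes 30.06.2006 in the example above).
        <ls_cur>-dateto = <ls_next>-datefrom - 1.
      ENDIF.
    ENDLOOP.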

  • Technical Content Object - Duplicate Records

    Hi,
    My PSA is getting duplicate records for a technical content object, and because of that the load fails.
    Can I remove the duplicate records from the technical content tables?
    Thanks,
    Abhi.

    It is a 3.x DataSource, so it doesn't have a DTP. I found SAP Note 1229385 (BIAC: Duplicate entries for 0TCTPRCVAR_TEXT in PSA), but it only works for SP19 and we are on SP18.
    Could you please let me know the SAP Note for SP18 to avoid duplicate entries for 0TCTPRCVAR_TEXT in the PSA?
    Thanks,
    Abhi.
