DTP Error: Duplicate data record detected

Hi experts,
I have a problem loading data from a DataSource into a standard DSO.
The DataSource contains master data attributes with a key that includes id_field.
In the end routine I perform some operations that multiply the lines in the result package and fill a new date field, which is defined in the DSO (and also in the result_package definition).
For example, result_package before the end routine:

    Id_field | attr_a | attr_b | ... | attr_x | date_field
    1        | a1     | b1     |     | x1     |
    2        | a2     | b2     |     | x2     |

result_package after the end routine:

    Id_field | attr_a | attr_b | ... | attr_x | date_field
    1        | a1     | b1     |     | x1     | d1
    2        | a1     | b1     |     | x1     | d2
    3        | a2     | b2     |     | x2     | d1
    4        | a2     | b2     |     | x2     | d2
The date_field (date type) is one of the key fields of the DSO.
When I execute the DTP, I get an error in the "Update to DataStore Object" step: "Duplicate data record detected"
"During loading, there was a key violation. You tried to save more than one data record with the same semantic key."
As far as I know, the result_package key contains all fields except those of types i, p and f.
In simulation mode (debugging) everything is correct and the status is green.
In the DSO, the "Unique Data Records" checkbox is unchecked.
Any ideas?
Thanks in advance.
MG

Hi,
In the end routine, try:
    DELETE ADJACENT DUPLICATES FROM RESULT_PACKAGE COMPARING xxx yyy.
Here xxx and yyy are the key fields, so the extra duplicate records are eliminated.
Or you can try:
    SORT itab_xxx BY field1 field2 field3 ASCENDING.
    DELETE ADJACENT DUPLICATES FROM itab_xxx COMPARING field1 field2 field3.
This can be placed before you loop over your internal table (in case you are using an internal table and loops); itab_xxx is the internal table.
field1, field2 and field3 may vary depending on your requirement.
Using the above lines, you can get rid of the duplicates produced by the end routine.
Regards
Sunil
Edited by: Sunny84 on Aug 7, 2009 1:13 PM
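
For reference, a minimal end routine sketch of the first suggestion above, using the field names from the original post (ID_FIELD and DATE_FIELD are assumed to form the DSO's semantic key; adapt them to your own structure):

    " Sort by the semantic key first: DELETE ADJACENT DUPLICATES only
    " removes duplicates that sit next to each other in the table.
    SORT RESULT_PACKAGE BY id_field date_field.
    " Keep one record per (id_field, date_field) combination.
    DELETE ADJACENT DUPLICATES FROM RESULT_PACKAGE
      COMPARING id_field date_field.

Note that the COMPARING fields should match the SORT fields; otherwise non-adjacent duplicates survive and the DTP fails again.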

Similar Messages

  • Duplicate data records error

    Hello
    An InfoObject is updated from an ODS. The system generates an error: duplicate data records.
    This is because the keys of the ODS don't match the keys of the InfoObject.
    I could build another ODS to aggregate the data before loading it into the InfoObject.
    Is there any way to do this in a start/end routine?
    Thanks

    There is a 'Handle Duplicate Records' option in the DTP:
    Indicator: Handling Duplicate Data Records
    Use
    If this indicator is set, duplicate data records (that is, records with the same key) are handled during an update in the order in which they occur in a data package.
    For time-independent attributes of a characteristic, the last data record with the corresponding key within a data package defines the valid attribute value for the update for a given data record key.
    Issue solved
    Thanks a lot !
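
    For the start/end routine question above, one possible end routine sketch that mimics the indicator's "last record per key wins" behaviour (ID_FIELD is a placeholder for the InfoObject key; it assumes the generated target structure contains the technical RECORD field, i.e. the record number within the package, as is usual in 7.x end routines):

        " Sort so that, per key value, the record that arrived last in the
        " package comes first.
        SORT RESULT_PACKAGE BY id_field ASCENDING record DESCENDING.
        " Keep only that last-arrived record for each key value.
        DELETE ADJACENT DUPLICATES FROM RESULT_PACKAGE COMPARING id_field.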

  • How to rectify the error message "duplicate data records found"

    Hi,
    How do I rectify the error "duplicate data records found" when there is no PSA?
    And please give me a brief description of RSRV.
    Thanks in advance,
    Ravi Alakunlta

    Hi Ravi,
    On the InfoPackage screen, Processing tab, check the option 'Do not allow Duplicate records'.
    RSRV is used for repair and analysis purposes.
    If you find duplicate records in the F fact table, compress it; the duplicate records will then be summarized in the cube.
    Hope this helps.

  • Getting Duplicate data Records error while loading the Master data.

    Hi All,
    We are getting a 'Duplicate data records' error while loading the profit centre master data. The master data contains time-dependent attributes.
    The load is a direct update, so I set the request to red and tried to reload from PSA, but it throws the same error.
    I checked in PSA: the records with the same profit centre are shown in red.
    Could anyone give us suggestions to resolve this issue, please?
    Thanks & Regards,
    Raju

    Hi Raju,
            I assume there are no routines written in the update rules and that you are loading the data directly from R/3 (not from an ODS). If that is the case, the data maintained in R/3 may have overlapping time intervals (since time-dependent attributes are involved). Check your PSA to see whether the same profit centre has time intervals that overlap. If so, you need to get this fixed in R/3. If there are no overlapping time intervals, you can simply increase the error tolerance limit in your InfoPackage and repeat the load.
    Hope this helps you.
    Thanks & Regards,
    Nithin Reddy.

  • How to avoid 'duplicate data record' error message when loading master data

    Dear Experts
    We have a custom extractor on table CSKS called ZCOSTCENTER_ATTR. The settings of this DataSource are the same as those of 0COSTCENTER_ATTR. The problem is that when loading to BW, the validity interval (DATEFROM and DATETO) does not seem to be taken into account. If there is a cost center with several entries having different validity periods, I get this duplicate data record error. There is no error when loading 0COSTCENTER_ATTR.
    Enhancing 0COSTCENTER_ATTR to have one DataSource instead of two is not an option.
    I know that you can set 'ignore duplicates' in the InfoPackage, but that is not a nice solution; 0COSTCENTER_ATTR can run without it!
    Is there a trick to tell the system that the date fields are also part of the key?
    Thank you for your help
    Peter

    Alessandro - ZCOSTCENTER_ATTR loads 0COSTCENTER, just like 0COSTCENTER_ATTR does.
    Siggi - I don't have the error message described in the note. In my case the message is:
    "There are duplicates of the data record 2 & with the key 'NO010000122077 &' for characteristic 0COSTCENTER &."
    In the PSA the records are marked red with the same message (message no. 191).
    As you can see, the key does not contain the date from which the record is valid. How do I add it? How does it work for 0COSTCENTER_ATTR with the same records? Is this done on the R/3 side or the BW side?
    Thanks
    Peter

  • Duplicate data records through DTP

    Hi Guys,
    I am loading customer master data that contains duplicate records.
    The data up to PSA level is correct.
    Now, when I load it from the PSA to the customer master through a DTP and select the checkbox for duplicate data records on the Update tab, a message appears at the bottom: 'ENTER VALID VALUE'.
    After this message I am unable to click any function; the same message repeats again and again.
    Please give me a solution so that this message no longer appears and I can execute the DTP.
    Thanks .
      Saurabh jain.

    Hi,
    If you get duplicate data for your customer, there might be something wrong with your DataSource or the data in the PSA. But anyway, leave the DTP by restarting RSA1. Edit or create the DTP again and press Save immediately after entering edit mode. Leave the DTP again and start editing it. That should do the trick.
    regards
    Siggi

  • Duplicate data records through DTP for attribute

    Hi Guys,
    I am loading customer master data, but it contains a large volume of duplicate data.
    I have to load both attribute and text data.
    The data up to PSA level is correct, and the text data loads successfully.
    When I load the attribute data to the customer master, it fails due to duplicate data records.
    So in the DTP, on the Update tab, I select the checkbox for duplicate data records.
    As soon as I select this checkbox, a message appears at the bottom: 'ENTER VALID VALUE'.
    After this message I am unable to click any function; the same message repeats again and again.
    So I am unable to execute the DTP.
    A helpful answer will get full points.
    Please give me a solution so that this message no longer appears and I can execute the DTP.
    Thanks .
    Saurabh jain.

    Hi,
    If you get duplicate data for your customer, there might be something wrong with your DataSource or the data in the PSA. But anyway, leave the DTP by restarting RSA1. Edit or create the DTP again and press Save immediately after entering edit mode. Leave the DTP again and start editing it. That should do the trick.
    regards
    Siggi

  • Duplicate Data Records occurred while loading

    Hi Experts,
    While loading, duplicate data records occurred. Earlier there were fewer records, and I used to delete the duplicates in the PSA in the background. Now there are more records and I am not sure which to delete and which to keep, since it is a flat-file load with delta update mode. In the InfoPackage, 'update subsequently in data target' is shown in a hidden position, so I went through the process chain and the InfoPackage display variant, set the status to red, and selected 'update subsequently in data target' and 'ignore duplicate data records'. Now I want to trigger the subsequent process. Through the process monitor I can use RSPC_PROCESS_FINISH; if I go via the display variant, what is the process? Is there any function module?

    Select the 'Handle Duplicate Record Keys' checkbox on the Update tab of the DTP.
    Thanks....
    Shambhu

  • Duplicate Data Records indicator / the Handle Duplicate Records option

    Hi All,
    I am getting double data across two requests. How can I delete the extra data using the 'Handle Duplicate Data Records' indicator?
    I am not able to see this option in the PSA or in the DTP.
    Can you help me find the option in the PSA/DTP?
    Regards
    Amit Srivastava

    What Arvind said is correct.
    But you could try this in an end routine; it may work, though I'm not sure, because there you deal with the entire result_package.
    Also, if the target you are talking about is a DSO, then you can use DELETE ADJACENT DUPLICATES in the start routine while updating it into your next target, which can be a cube, for example, as sketched below.
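
    A minimal start routine sketch of that last suggestion (DOC_NUMBER is a hypothetical key field; replace it with the semantic key of your DSO):

        " Drop duplicates from the source package before the transformation
        " runs, so that only one record per key reaches the cube.
        SORT SOURCE_PACKAGE BY doc_number.
        DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE COMPARING doc_number.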

  • Getting duplicate data records for master data

    Hi All,
    When the process chain runs for the master data, I get duplicate data records. I therefore selected the options at InfoPackage level under Processing: 1) update PSA and subsequently data targets, and alternatively the option 'Ignore double data records'. The load still failed with the error message "Duplicate Data Records"; after I rescheduled the InfoPackage, I did not get the error message the next time.
    Can anyone help resolve this issue?
    Regards
    KK

    Yes, for the first option you can write a routine. What is your data target? If it is a cube, there is a chance of duplicate records because of its additive nature; if it is an ODS, you can avoid this, because only the delta is updated.
    Regarding the time-dependent attributes: it is based on the date field, and there are four types of slowly changing dimensions.
    Check the following links:
    http://help.sap.com/bp_biv135/documentation/Multi-dimensional_modeling_EN.doc
    http://www.intelligententerprise.com/info_centers/data_warehousing/showArticle.jhtml?articleID=59301280&pgno=1
    http://help.sap.com/saphelp_nw04/helpdata/en/dd/f470375fbf307ee10000009b38f8cf/frameset.htm

  • Duplicate data record

    Hi, can anybody help me out? Can you tell me in what scenario we get the duplicate data record issue, and what the solution is?
    I recently joined the company and I am new to support work.
    Please provide step-by-step guidance; any help is appreciated.

    Hi sk ss,
    It depends; try searching the forum, you will find any number of postings on this particular issue.
    Anyway, in general it comes up in master data loads. To avoid it, we flag an option available at InfoPackage level, such as 'Ignore double data records'.
    Sometimes it occurs at the data level as well; in that case we just rerun the attribute change run and then restart the InfoPackage.
    My suggestion: please search the portal to get a clear picture of this issue; there are a huge number of postings on it.
    Hope that makes it at least a little clearer!
    Thanks & Regards
    R M K

  • DTP Error for Data Loads

    Hello,
    I'm using BI 7.x and we have a custom DataSource and a custom transformation/DTP process. My transformation brings the data into the PSA, and then the DTP should load the PSA data into the InfoCube... instead, my DTP is going against the DataSource itself, thereby bringing in double values.
    Please let me know if I'm doing anything wrong here.
    Thanks
    Sameer

    Sameer,
    I'm a bit confused by your explanation. A transformation doesn't bring data into the PSA (DataSource); an InfoPackage does that. You then have a transformation between the DataSource (PSA) and an InfoProvider, and you use a DTP to move the data. Assuming all of this is true in your example and you don't have two identical data packages in the PSA, the issue may be that your DataSource is 3.x rather than 7.0. If so, your InfoPackage can load the data to the target and to the PSA, just like in BW 3.x. In that case, when you then run the DTP, you will double your data.
    Regards,
    Dae

  • Error: IDoc XML data record: In segment, attribute occurred instead of SEGMENT

    Hello friends,
    My scenario is Oracle -> XI -> IDoc.
    I am getting the error stated in the subject, with error category IDOC_ADAPTER and error ID ATTRIBUTE_IDOC_RUNTIME.
    I am using an IDoc of message type PORDCR1.PORDCR101.
    I have tried every change, but I still get the same error.
    I am unable to locate the source of the problem; I went through a similar blog posted on SDN, but no luck.
    What can I do to solve my problem? I am waiting for my SDN friends' suggestions... Amit Ranjan

    Hi,
    Maybe you can reload the structure of your IDoc in the Integration Repository and delete the IDoc metadata in IDX2. Please check whether that helps. Are you mapping to this SEGMENT? If not, you can add a counter there for each SEGMENT.
    regards,
    wojtek

  • Duplicate data record error while reloading a DTP

    I have created a DTP to load a text DataSource and put it in the process chain. I have set delta mode so that it picks up additional records when they are available.
    But I notice that when there are no additional records (e.g. the earlier load had 269 records and the current load has the same 269 records), I get the error "Duplicate data record" and the status is set to red. I do not want to see an error in this case, since no new records were found; I want the status to be green when there are no new records.
    Could you please suggest what settings will do that?
    Regards
    Raj

    A delta DTP fetches only unloaded requests (requests that do not yet exist in the target), not additional records.
    Is the text DataSource delta-enabled?
    Do you have an InfoPackage with update type Delta set up?
    Did you run a full DTP before the delta DTP?
    Assuming a full InfoPackage loaded 269 records to the PSA, the same records will be loaded again (if there are no additional records in the source system):
    Req 1 - 269 - yesterday
    Req 2 - 269 - today
    The full DTP run yesterday loaded 269 records to the target.
    The delta DTP run today will load Req 1 and Req 2 to the target (master data will be overwritten), because the delta DTP acts like an init with data transfer.
    You can start off with a delta DTP instead of a full one; if a full DTP is run before a delta DTP, make sure you delete the requests loaded by the full DTP.
    This can be ignored here, as it is master data that will be overwritten.
    To get rid of the error, just check "Handle Duplicate Record Keys" on the Update tab of the DTP.

  • Loading ODS - Data record exists in duplicate within loaded data

    BI Experts,
    I am attempting to load an ODS with the 'Unique Data Records' flag turned ON. The flat file I am loading is a crosswalk with four fields; the first three fields are used as key fields in order to make the records unique. I have had this issue before, but gave up in frustration and added an ascending count field to simply create a unique key. This time I would like to solve the problem if possible.
    The errors come back referring to two data rows that are duplicates:
    Data record 1 - Request / Data package / Data record: REQU_4CNUD93Q3RCC80XFBG2CZXJ0T/000003/ 339
    Data record 2 - Request / data package / data record: REQU_4CNUD93Q3RCC80XFBG2CZXJ0T/000003/ 338
    And below here are the two records that the error message refers to:
    3     338     3902301480     19C*     *     J1JD     
    3     339     3902301510     19C*     *     J1Q5     
    As you can see, the combination of my three key fields should not be creating a duplicate: (3902301480, 19C*, *) and (3902301510, 19C*, *).
    Is there something off with the numbering of the data records? Am I looking in the wrong place? I have examined my flat file and cannot find duplicates, and the records that BW says are duplicates are not. I am really having a hard time with this; any and all help greatly appreciated!

    Thank you for the response, Sabuj...
    I was about to answer your questions, but I wanted to try one more thing, and it actually worked. I simply moved the MOST unique key field to the TOP of my key field list; it was at the bottom before.
    FYI for other people with this issue:
    apparently the ORDER of your key fields is important when trying to avoid creating duplicate records.
    I am using four data fields, and was using three of them as key fields. No combination of all three would have produced a duplicate; however, when BW finds that the first two key fields match, it sometimes apparently doesn't consider the third one, which would make the row unique. By simply changing the order of my key fields I was able to stop getting the duplicate row errors.
    Lesson: if you KNOW that your records are unique and you are STILL getting errors for duplicates, try changing the ORDER of your key fields.
