Duplicate data records through DTP for attribute

Hi Guys,
I am loading data into customer master data, but it contains a large volume of duplicate records.
I have to load both attribute and text data.
The data up to PSA level is correct, and the text data is also loaded successfully.
But when I load the attribute data into the customer master, the load fails due to duplicate data records.
So in the DTP, on the Update tab, I selected the checkbox for handling duplicate data records.
As soon as I select this checkbox, a message appears at the bottom: *ENTER VALID VALUE*.
After this message I am unable to click any function, and the message repeats again and again,
so I am unable to execute the DTP.
A helpful answer will get full points.
Please give me a solution so that the above message no longer appears and I can execute the DTP.
Thanks .
Saurabh jain.

Hi,
If you get duplicate data for your customer, there might be something wrong with your DataSource or with the data in the PSA. But anyway, leave the DTP by restarting RSA1. Edit (or create) the DTP again and press Save immediately after entering edit mode. Leave the DTP again and start editing it. That should do the trick.
regards
Siggi

Similar Messages

  • DTP Error: Duplicate data record detected

    Hi experts,
    I have a problem loading data from a DataSource into a standard DSO.
    The DataSource contains master data attributes whose key includes id_field.
    In the end routine I perform some operations that multiply the lines in the result package and fill a new date field, which is defined in the DSO (and also in the result_package definition).
    For example:
    Result_package before the end routine:
       Id_field | attr_a | attr_b | ... | attr_x | date_field
       1        | a1     | b1     | ... | x1     |
       2        | a2     | b2     | ... | x2     |
    Result_package after the end routine:
       Id_field | attr_a | attr_b | ... | attr_x | date_field
       1        | a1     | b1     | ... | x1     | d1
       2        | a1     | b1     | ... | x1     | d2
       3        | a2     | b2     | ... | x2     | d1
       4        | a2     | b2     | ... | x2     | d2
    The date_field (date type) is one of the key fields of the DSO.
    When I execute the DTP, I get an error in the section "Update to DataStore Object": "Duplicate data record detected. During loading, there was a key violation. You tried to save more than one data record with the same semantic key."
    As far as I know, the result_package key contains all fields except those of type i, p, and f.
    In simulation mode (debugging) everything is correct and the status is green.
    In the DSO, the "Unique Data Records" checkbox is unchecked.
    Any ideas?
    Thanks in advance.
    MG

    Hi,
    In the end routine, try:
        SORT result_package BY xxx yyy.
        DELETE ADJACENT DUPLICATES FROM result_package COMPARING xxx yyy.
    Here xxx and yyy are the key fields, so the extra duplicate records are eliminated. Note that DELETE ADJACENT DUPLICATES only removes neighbouring rows, so the table must be sorted by the comparison fields first.
    Or you can try:
        SORT itab_xxx BY field1 field2 field3 ASCENDING.
        DELETE ADJACENT DUPLICATES FROM itab_xxx COMPARING field1 field2 field3.
    These lines can go before you loop over your internal table (in case you are using an internal table and loops); itab_xxx is the internal table, and field1, field2, and field3 depend on your requirement.
    With the lines above you can get rid of the duplicates produced by the end routine.
    Regards
    Sunil
    Edited by: Sunny84 on Aug 7, 2009 1:13 PM
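    Putting the answer together, a minimal end-routine sketch might look as follows. This is only an illustration: the field names id_field and date_field are taken from the example in the question, and you would replace them with your actual semantic key fields.

    ```abap
    *   End-routine sketch: drop duplicates on the semantic key before
    *   the update to the DSO. The SORT is required because DELETE
    *   ADJACENT DUPLICATES only removes rows that are next to each other.
        SORT result_package BY id_field date_field.
        DELETE ADJACENT DUPLICATES FROM result_package
               COMPARING id_field date_field.
    ```

    Note that rows differing only in date_field survive this comparison; only records that duplicate the full key are dropped.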

  • Getting duplicate data records for master data

    Hi All,
    When the process chain for the master data runs, I get duplicate data records. In the InfoPackage, under Processing, I selected the option "Update PSA and subsequently into data targets" and also the option "Ignore double data records". The load still failed with the error message "Duplicate data records". After I rescheduled the InfoPackage, the error message did not appear the next time.
    Can anyone help me resolve this issue?
    Regards
    KK

    Yes, for the first option you can write a routine. What is your data target? If it is a cube, there is a chance of duplicate records because of its additive nature; if it is an ODS, you can avoid this, because only the delta is updated.
    Regarding time-dependent attributes: they are based on the date field, and there are four types of slowly changing dimensions.
    Check the following links:
    http://help.sap.com/bp_biv135/documentation/Multi-dimensional_modeling_EN.doc
    http://www.intelligententerprise.com/info_centers/data_warehousing/showArticle.jhtml?articleID=59301280&pgno=1
    http://help.sap.com/saphelp_nw04/helpdata/en/dd/f470375fbf307ee10000009b38f8cf/frameset.htm

  • Duplicate data records error

    Hello
    An InfoObject is updated from an ODS. The system generates an error: duplicate data records.
    This is because the keys of the ODS don't match the keys of the InfoObject.
    I could build another ODS to aggregate the data before loading it into the InfoObject.
    Is there any way to do this in a start/end routine?
    Thanks

    There is a "Handle Duplicate Record Keys" option in the DTP.
    Indicator: Handling Duplicate Data Records
    Use
    If this indicator is set, duplicate data records (that is, records with the same key) are handled during an update in the order in which they occur in a data package.
    For time-independent attributes of a characteristic, the last data record with the corresponding key within a data package defines the valid attribute value for the update.
    Issue solved
    Thanks a lot !
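    For reference, the "last record in the data package wins" behaviour described above can also be reproduced manually in an end routine. This is only a sketch: the characteristic field /bic/zcustomer is an assumed example name, while record is the standard record-number field of the result package.

    ```abap
    *   DELETE ADJACENT DUPLICATES keeps the FIRST row per key, so sort
    *   by key ascending and by arrival order (record number) descending
    *   to make the last-arriving record the one that survives.
        SORT result_package BY /bic/zcustomer ASCENDING
                               record         DESCENDING.
        DELETE ADJACENT DUPLICATES FROM result_package
               COMPARING /bic/zcustomer.
    ```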

  • Duplicate Data Records occurred while loading

    Hi Experts,
    While loading, duplicate data records occurred. Earlier, when there were fewer records, I used to delete the duplicates in the PSA in the background. Now there are many more records, and I am not sure which to delete and which to keep. It is a flat-file load with delta update mode. In the InfoPackage, the option "Update subsequently in data target" is shown in a hidden position. Via the process chain, I displayed the variant, set the status to red, and selected "Update subsequently in data target" and "Ignore duplicate data records".
    Now I want to trigger the subsequent process. Via the process monitor I can use RSPC_PROCESS_FINISH. If I go via the display variant, what is the procedure? Is there a function module for this?

    Select the checkbox "Handle Duplicate Record Keys" in the DTP, on the Update tab.
    Thanks....
    Shambhu

  • Duplicate Data Records indicator / the handle duplicate records

    Hi All,
    I am getting doubled data in two requests. How can I delete the extra data using the "Duplicate Data Records" indicator?
    I am not able to see the "Handle duplicate record keys" option either in the PSA or in the DTP.
    Can you help me find this option in the PSA/DTP?
    Regards
    Amit Srivastava

    What Arvind said is correct.
    But you can try this in an end routine; it may work (not sure, though), because there you deal with the entire result_package.
    Also, if the target you are talking about is a DSO, then you can use DELETE ADJACENT DUPLICATES in the start routine while updating it into your next target. That could be a cube, for example.

  • Getting Duplicate data Records error while loading the Master data.

    Hi All,
    We are getting a "Duplicate data records" error while loading the profit centre master data. The master data contains time-dependent attributes.
    The load is a direct update, so I set it to red and tried to reload from the PSA, but it throws the same error.
    I checked the PSA: the records shown in red have the same profit centre.
    Could anyone give us suggestions to resolve this issue, please?
    Thanks & Regards,
    Raju

    Hi Raju,
    I assume there are no routines written in the update rules and that you are loading the data directly from R/3 (not from any ODS). If that is the case, it could be that the data maintained in R/3 has overlapping time intervals (since time-dependent attributes are involved). Check your PSA to see whether the same profit centre has time intervals that overlap. In that case, you need to get this fixed in R/3. If there are no overlapping time intervals, you can simply increase the error tolerance limit in your InfoPackage and repeat the load.
    Hope this helps you.
    Thanks & Regards,
    Nithin Reddy.

  • How to avoid 'duplicate data record' error message when loading master data

    Dear Experts
    We have a custom extractor called ZCOSTCENTER_ATTR on table CSKS. The settings of this DataSource are the same as those of 0COSTCENTER_ATTR. The problem is that when loading into BW, the validity (DATEFROM and DATETO) does not seem to be taken into account. If a cost center has several entries with different validity, I get this duplicate data record error. There is no error when loading 0COSTCENTER_ATTR.
    Enhancing 0COSTCENTER_ATTR so as to have one DataSource instead of two is not an option.
    I know that you can set "ignore duplicates" in the InfoPackage, but that is not a nice solution; 0COSTCENTER_ATTR runs without it!
    Is there a trick to tell the system that the date fields are also part of the key?
    Thank you for your help
    Peter

    Alessandro - ZCOSTCENTER_ATTR loads 0COSTCENTER, just like 0COSTCENTER_ATTR.
    Siggi - I don't get the error message described in the note.
    "There are duplicates of the data record 2 & with the key 'NO010000122077 &' for characteristic 0COSTCENTER &."
    In the PSA the records are marked red with the same message (message no. 191).
    As you can see, the key does not contain the date from which the record is valid. How do I add it? How does it work for 0COSTCENTER_ATTR with the same records? Is it done on the R/3 side or on the BW side?
    Thanks
    Peter

  • How to rectify the error message " duplicate data records found"

    Hi,
    How do I rectify the error "duplicate data records found" when there is no PSA?
    Also, please give me a brief description of RSRV.
    Thanks in advance,
    Ravi Alakunlta

    Hi Ravi,
    In the InfoPackage screen, on the Processing tab, check the option "Do not allow duplicate records".
    RSRV is used for repair and analysis purposes.
    If you find duplicate records in the F fact table, compress the cube; the duplicate records will then be summarized.
    Hope this helps.

  • Duplicate data record

    Hi, can anybody help me out? Can you tell me in what scenario we get the duplicate data record issue, and what the solution is?
    I recently joined the company and am new to support work.
    Please provide step-by-step guidance; any help is appreciated.

    Hi sk ss,
    It depends; try searching our forum, and you will find any number of postings on this particular issue.
    Anyway, in general it occurs in master data loads. To avoid it we set a flag that is available at InfoPackage level, namely "Ignore double data records".
    Sometimes it occurs at the data level as well; in that case we just rerun the attribute change run and then restart the InfoPackage.
    <u><b>My suggestion: please search our portal to get a clear picture of this issue; there are a huge number of postings on it.</b></u>
    Hope it's at least a little clearer...!
    Thanks & Regards
    R M K
    ***Assigning pointz is the only way of saying thanx in SDN ***
    **Winners dont do different things,they do things differently**

  • Not fetching the records through DTP

    Hi Gurus,
    I am facing a problem while loading data into an InfoCube through a DTP.
    I have successfully loaded the data into the PSA but am not able to load the records into the InfoCube.
    The request completed with status green, but 0 records were loaded.
    Later, a colleague executed the same DTP successfully, with all records loaded.
    Can you please tell me why it does not work with my user ID?
    I have found the following difference in the monitor:
    I cannot see any selections for my request, but I can see REQUID = 871063 in the selections of the request started by my colleague.
    Can anyone tell me why REQUID = 871063 is not filled automatically when I start the schedule?

    Hi,
    I guess the DTP update mode is delta. You and your colleague executed the same DTP object with only a small time gap, and during that period no new transactions were posted in the source.
    Try executing it again after a couple of hours.
    Regards

  • Master Data load via DTP (Updating attribute section taking long time)

    Hi all,
    I am loading a Z InfoObject; it is a master data load for attributes. Surprisingly, the PSA pulls the records very quickly (2 minutes), but the DTP that updates the InfoObject takes a long time. It runs into hours.
    The DTP execution monitor, which shows the breakdown of time between extraction, filter, transformation, and update of attributes,
    shows that the last step, "update of attributes for InfoObject", takes most of the time.
    The master data InfoObject also has two compounded InfoObjects,
    and they are mapped in the transformation.
    The number of parallel processes for the DTP is set to 3 in our system,
    with job class "C".
    Can anyone think of what the reason could be?

    Hi,
    Check transaction ST22 for any short dump while loading this master data; a short dump must have occurred.
    There is also a chance that you are trying to load invalid data (such as a "!" character as the first character of a field) into the master data.
    Regards,
    Yogesh.

  • Data Package 1 ( 0 Data Records ) at DTP with Status RED

    Hi All,
    For 0 records at DTP level, the overall status and technical status are shown as RED, with yellow beside "Data Package 1 (0 Data Records)". There is no short dump and no error message. At PSA level, the Status tab shows status 8, which says there is no data on the R/3 side for this particular load. Please help me out.
    Regards,
    Krishna.

    Hi,
    If the traffic light is not highlighted, you are probably running a delta.
    You will have to set the traffic light in the corresponding init
    (and run the init again);
    the setting in the delta will then be the same.
    Udo

  • Master Data: Transformation and DTP for compounded characteristic

    Good day
    Please assist; I am not sure what the correct way is.
    I extract master data from a database via DB Connect.
    There are three fields in the database view for extraction: (1) Code, (2) Kind, and (3) Code Text.
    What I have done is the following: I created a DataSource with a transformation and DTP for master data texts for (1) and (3), and then a DataSource with a transformation and DTP for master data attributes for (1) and (3).
    Is this the correct way to handle extracts of compounded characteristics?
    Your assistance will be appreciated.
    Thanks
    Cj

    Hello,
    If the characteristic "Code" is compounded with "Kind",
    then the text DataSource should have fields (1), (2), and (3), and the table for this DataSource should have "Code" and "Kind" as keys.
    The attribute DataSource should have (1) and (2), followed by the required attributes of "Code".
    Reagrds,
    Dhanya
