No 'Handle Duplicate records' in update tab

Hi,
I have a DTP from ODS/DSO to ODS/DSO and I get a duplicate record error, which I find rather strange for a standard ODS/DSO. I read in the help and in these forums that it can be fixed with the 'Handle Duplicate Records' checkbox on the Update tab. The trouble is that there is no such checkbox in our BI7 SP15 installation.
Any suggestions on why the checkbox is missing, how to fix that, and, more importantly, how to get rid of the duplicate record error (and why it is occurring)?
Many thanks in advance
Eddy

Hi Eddy,
I am confused -:)
Have you tried it with the checkbox checked and with it unchecked?
My suggestion is to try selecting the 'Unique Data Records' setting...
Cheers
Siva

Similar Messages

  • Master data infoobject can't handle duplicate records after SP10

    Hi
I am trying to load master data that happens to contain duplicate records from the source system.  In the DTP for the master data InfoObject, I have ticked the 'Handle Duplicate Record Keys' checkbox.  After executing this DTP, the duplicate master data records were trapped in the error stack; I expected the duplicate master data to be overwritten instead.  I understand this error was fixed by Note 954661 - Updating master data texts when error in data package, which dates from SP9.  After applying Support Package 10, the master data InfoObject just can't handle records with duplicate keys.
    Please let me know if you manage to fix this problem.
    Many thanks,
    Anthony

Found a fix for this problem: applying OSS Note 986196 - Error during duplicate record handling of master data texts.

  • Duplicate Data Records indicator / the handle duplicate records

    Hi All,
I am getting duplicate data across two requests. How can I delete the extra data using the 'Duplicate Data Records' indicator?
I am not able to see this option for handling duplicate records in either the PSA or the DTP.
Can you help me find the option in the PSA/DTP?
    Regards
    Amit Srivastava

What Arvind said is correct.
But you could also try this in an end routine; it may work (not certain, though), because there you are dealing with the entire RESULT_PACKAGE.
Also, if the target you are talking about is a DSO, you can delete adjacent duplicates in a start routine while updating it into your next target (a cube, for example); see the sketch below.
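A minimal ABAP sketch of that end-routine idea (a sketch only, assuming a BI 7.x transformation; doc_number is a placeholder for your actual semantic key field):

* End routine sketch: keep one record per key in RESULT_PACKAGE.
METHOD end_routine.
  " Sort so duplicates are adjacent. DELETE ADJACENT DUPLICATES
  " keeps the FIRST row of each group; if the last record per key
  " should win (as the DTP option behaves), sort descending on a
  " sequence field such as the record number, if your package
  " structure carries one.
  SORT RESULT_PACKAGE STABLE BY doc_number.
  DELETE ADJACENT DUPLICATES FROM RESULT_PACKAGE
         COMPARING doc_number.
ENDMETHOD.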

  • Ignore duplicate records for master data attributes

Dear experts,
How and where can I enable 'ignore duplicate records' when running my DTP to load data to master data attributes?

    Hi Raj
Suppose you are loading master data to an InfoObject and in the PSA you have more than one record per key.
Let's assume you are loading some attributes of Document Number (0DOC_NUMBER) and in the PSA you have multiple records for the same document number. In the PSA this is not a problem, because the key of the PSA table is a technical key that is different for every record. But it is a problem for the InfoObject attribute table, because more than one record for the same primary key (here 0DOC_NUMBER) would violate the primary key constraint in the database.
This issue can easily be avoided by selecting 'Handle Duplicate Record Keys' in the DTP. You will find this option on the 'Update' tab of the DTP.
    Regards
    Anindya
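To verify the situation Anindya describes before flagging the DTP, you can count duplicate keys in the PSA table directly. This is only a sketch; the PSA table name /BIC/B0001234000 is an assumption, so look up the real one for your DataSource (for example via table RSTSODS):

* Count keys that occur more than once in an (assumed) PSA table.
DATA: BEGIN OF ls_dup,
        doc_number TYPE /bi0/oidoc_number,
        cnt        TYPE i,
      END OF ls_dup,
      lt_dup LIKE STANDARD TABLE OF ls_dup.

SELECT doc_number COUNT( * )
  INTO TABLE lt_dup
  FROM /bic/b0001234000        " assumed PSA table name
  GROUP BY doc_number
  HAVING COUNT( * ) > 1.
" Every key in lt_dup would violate the attribute table's primary
" key unless 'Handle Duplicate Record Keys' (or a routine) dedups it.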

  • Duplicate records in PSA

    Hi all,
How can I identify and eliminate duplicate records in the PSA?

    Hi,
Here is the F1 help for the 'Handle Duplicate Record Keys' option on the Update tab:
    "Indicator: Handling Duplicate Data Records
    If this indicator is set, duplicate data records are handled during an update in the order in which they occur in a data package.
For time-independent attributes of a characteristic, the last data record with the corresponding key within a data package defines the valid attribute value for the update for a given data record key.
    For time-dependent attributes, the validity ranges of the data record values are calculated according to their order (see example).
    If during your data quality measures you want to make sure that the data packages delivered by the DTP are not modified by the master data update, you must not set this indicator!
    Use:
Note that for time-dependent master data, the semantic key of the DTP must not contain the DataSource field that carries the DATETO information. When you set this indicator, error handling must be activated for the DTP, because correcting duplicate data records is an error correction. The error correction must be "Update valid records, no reporting" or "Update valid records, reporting possible".
    Example:
    Handling of time-dependent data records
    - Data record 1 is valid from 01.01.2006 to 31.12.2006
    - Data record 2 has the same key but is valid from 01.07.2006 to 31.12.2007
    - The system corrects the time interval for data record 1 to 01.01.2006 to 30.06.2006. As of 01.07.2006, the next data record in the data package (data record 2) is valid."
    By flagging this option in the DTP, you are allowing it to take the latest value.
    There is further information at this SAP Help Portal link:
    http://help.sap.com/saphelp_nw04s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/content.htm
    Rgds,
    Colum
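The correction rule quoted in the F1 help above can be illustrated in a few lines of ABAP. This is only a sketch of the described behavior, not SAP's actual implementation; the type and field names are assumed:

* Within one package, a successor with the same key truncates the
* predecessor's validity interval (01.01.2006-31.12.2006 becomes
* 01.01.2006-30.06.2006 when the next record starts 01.07.2006).
TYPES: BEGIN OF ty_rec,
         doc_number TYPE /bi0/oidoc_number,
         datefrom   TYPE d,
         dateto     TYPE d,
       END OF ty_rec.
DATA lt_pack TYPE STANDARD TABLE OF ty_rec.
FIELD-SYMBOLS: <ls_cur> TYPE ty_rec,
               <ls_pre> TYPE ty_rec.
DATA lv_prev TYPE sy-tabix.

LOOP AT lt_pack ASSIGNING <ls_cur>.
  lv_prev = sy-tabix - 1.
  CHECK lv_prev > 0.
  READ TABLE lt_pack INDEX lv_prev ASSIGNING <ls_pre>.
  IF <ls_pre>-doc_number = <ls_cur>-doc_number
     AND <ls_pre>-dateto >= <ls_cur>-datefrom.
    " Predecessor now ends the day before the successor starts.
    <ls_pre>-dateto = <ls_cur>-datefrom - 1.
  ENDIF.
ENDLOOP.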

  • Handling Duplicated Records in DTP

    Dear Experts,
    I am trying to load to the master data of 0PLANT using datasource 0BBP_PLANT_LOCMAP_ATTR (Location/Plant Mapping) using DTP.
    This standard datasource is neither language- nor time-dependent. Also, in the source system, it is not marked to handle duplicate records.
I have also referred to OSS Note 1147563 - Consulting note: Indicator "Duplicate records". One key highlight is: "If you use a DTP to transfer master data or texts, you can set the indicator 'Duplicate records' in the DTP tab page 'Data transfer'." I take this to mean the "Handle Duplicate Record Keys" option under the "Update" tab of the respective DTP.
    In this OSS Note, it was also mentioned that
    "You must not set the indicator if the following prerequisites apply:
    The indicator 'DataSource delivers duplicate records' is not set in the DataSource."
    >> which is currently the case of datasource 0BBP_PLANT_LOCMAP_ATTR
Checked in SAP Help (http://help.sap.com/saphelp_nw04s/helpdata/en/42/fbd598481e1a61e10000000a422035/frameset.htm):
    You can specify how duplicate data records within a request are handled, independently of whether the setting that allows DataSources to deliver potentially duplicate data records has been made. This is useful if the setting was not made to the DataSource, but the system knows from other sources that duplicate data records are transferred (for example, when flat files are loaded).
My question is: I can't load the master data, mainly because of these duplicate record key errors, and when I checked the indicator to handle duplicate record keys, I got the error message "Enter a valid value". Thereafter, I couldn't do anything at all - activate the DTP, click on other tabs - it just got stuck at that point, and I could only exit the transaction.
    Can anyone advise if I have basically missed anything?
    Thank you in advance.
    Regards,
    Adelynn

    Hi,
    Handling Duplicate Data Records 
    Use
    DataSources for texts or attributes can transfer data records with the same key into BI in one request. Whether the DataSource transfers multiple data records with the same key in one request is a property of the DataSource. There may be cases in which you want to transfer multiple data records with the same key (referred to as duplicate data records below) to BI more than once within a request; this is not always an error. BI provides functions to handle duplicate data records so that you can accommodate this.
    Features
    In a dataflow that is modeled using a transformation, you can work with duplicate data records for time-dependent and time-independent attributes and texts.
If you are updating attributes or texts from a DataSource to an InfoObject using a data transfer process (DTP), you can specify how the system processes data records that share the same record key within a request. In DTP maintenance, on the Update tab page, you set the Handle Duplicate Record Keys indicator for this.
    This indicator is not set by default.
    If you set the indicator, duplicate data records (multiple records with identical key values) are handled as follows:
●      Time-independent data:
If data records have the same key, the last data record in the data package is interpreted as being valid and is updated to the target.
●      Time-dependent data:
If data records have the same key, the system calculates new time intervals for the data record values, based on the intersecting time intervals and the sequence of the data records.
    Data record 1 is valid from 01.01.2006 to 31.12.2006
    Data record 2 has the same key but is valid from 01.07.2006 to 31.12.2007
    The system corrects the time interval for data record 1 to 01.01.2006 to 30.06.2006. As of 01.07.2006, the next data record in the data package (data record 2) is valid.
    If you set the indicator for time-dependent data, note the following:
You should not include the DataSource field that contains the DATETO information in the semantic key of the DTP; doing so may cause duplicate data records to be sorted incorrectly and time intervals to be calculated incorrectly.
    The semantic key specifies the structure of the data packages that are read from the source.
Example
You have two data records with the same key within one data package.
Case 1: DATETO is not an element of the key. In the data package, the data records arrive in the sequence DS2, DS1, so the time interval for data record 1 is corrected:
- Data record 1 is valid from 1.1.2002 to 31.12.2006.
- Data record 2 is valid from 1.1.2000 to 31.12.2001.
Case 2: DATETO is an element of the key. The records are then sorted by DATETO, so the data record with the earliest date is placed before the data record with the most recent date. In this case, the time interval for data record 2 is corrected:
- Data record 2 is valid from 1.1.2000 to 31.12.2000.
- Data record 1 is valid from 1.1.2001 to 31.12.2006.
    If you do not set this indicator, data records that have the same key are written to the error stack of the DTP.
    Note
    You can specify how duplicate data records within a request are handled, independently of whether the setting that allows DataSources to deliver potentially duplicate data records has been made. This is useful if the setting was not made to the DataSource, but the system knows from other sources that duplicate data records are transferred (for example, when flat files are loaded).
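The difference between the two cases above comes down to the sort order applied to the package before the interval correction. A sketch only; lt_pack and its field names are assumptions, not SAP's internal code:

TYPES: BEGIN OF ty_rec,
         doc_number TYPE /bi0/oidoc_number,
         datefrom   TYPE d,
         dateto     TYPE d,
       END OF ty_rec.
DATA lt_pack TYPE STANDARD TABLE OF ty_rec.

" Case 1 - DATETO not in the semantic key: the arrival order
" (DS2, DS1) is preserved, so the later row in the package wins.
SORT lt_pack STABLE BY doc_number.
" Case 2 - DATETO in the semantic key: rows are ordered by DATETO,
" so the earliest-dated record (DS2) is processed first.
SORT lt_pack STABLE BY doc_number dateto.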

  • Duplicate records-problems

    Hi gurus
    We created a text datasource in R/3 and replicated it into BW 7.0
    An infopackage (loading to PSA) and DataTransferProcess was created and included in a process chain.
    The job failed because of duplicate records.
We now discovered that the setting 'Delivery of Duplicate Records' for this DataSource in BW is set to 'Undefined'.
When creating the DataSource in R/3, there were no settings for the 'Delivery of duplicate records'.
In BW, I've tried to change the setting 'Delivery of Duplicate Data Records' to NONE, but when I go into change mode, the 'Delivery of duplicate' field is not changeable.
    Does anyone have any suggestion on how to solve this problem?
    Thanks,
    @nne Therese

    Hi Muraly,
I have the same issue. I am loading texts from R/3 to the PSA using an InfoPackage with full update. From the PSA I am using a DTP with delta and the option 'Valid records update, no reporting (request red)'.
It was running fine for the last few weeks: the transferred and added records matched the PSA request every day.
Suddenly the load to the InfoObject failed. I deleted the request from the InfoObject and reloaded using the DTP; it failed again. I tried a full update, since these are texts, and it failed again. When I analysed the error, it reported duplicate records. So I changed the DTP, checking the 'Handle Duplicate Records' option, and loaded with a full update. It worked fine: more than 50,000 records were transferred, and the number of added records exactly matched the PSA request.
I reset the DTP back to delta and loaded today, but only 14,000 records were transferred and 3,000 added, the same as the PSA request. Looking at the load history, the numbers of transferred and added records in the InfoObject used to match the PSA request every day.
Why is there a difference now? In production I have no issues. Since I changed the DTP, will transporting it to production make any difference? This is my first time working with BI 7.0.
Please advise, and correct me if I am wrong.
    Thanks,
    Sudha..

  • Duplicate records error?

    hello all
While extracting master data I am getting a duplicate records error. How do I rectify this?
On the InfoPackage screen, on the Processing tab, will I get the option 'Ignore double data records'? When is this option enabled?
    regards

    Hello
This option is available only for master data, not for transactional data. You can control duplicate records for transactional data in the ODS; there is an option in the ODS settings.
F1 help:
    Flag: Handling of duplicate data records
    From BW 3.0 you can determine for DataSources for master data attributes and texts whether the extractor transfers more than one data record in a request for a value belonging to time-independent master data.
    Independently of the extractor settings (the extractor potentially delivers duplicate data records) you can use this indicator to tell the BW whether or not you want it to handle any duplicate records.
    This is useful if the setting telling the extractor how to handle duplicate records is not active, but the system is told from another party that duplicate records are being transferred (for example, when data is loaded from flat files).
    Sankar

  • Duplicate records in DTP

    Guys,
My DTP failed 4 days back, but the full load from the source system to the PSA executes successfully.
To correct the DTP load, I deleted the last DTP request and also all the PSA requests except the last one, yet when I re-executed the DTP I got duplicate records. Also, in the failed DTP header, I can still see all the deleted PSA requests.
How do I resolve this issue? I cannot check 'Handle duplicate records', since it is time-dependent data.
    Thanks,
    Kumar

    Hi Kumar,
1. Deleting from the PSA and updating is not a permanent solution.
First check which records are creating the duplicates in the PSA.
The duplicates may not come from the PSA at all; there may already be records in the object itself, so deleting from the PSA will not solve the issue.
Check the source tables themselves to see why the wrong data is coming into the PSA like that.
Then you can correct the records in the PSA and update the data into the target using the DTP.
You can create an error DTP for it, so that it is easy to trace the duplicates.
2. You have the 'Handle duplicate records' option in the DTP.
Check the box and try to load the data again.
If this is time-dependent master data, then also include 'valid to' as a key along with the other objects in the semantic group option of the DTP.
Check this and then try to load the data.
    http://help.sap.com/saphelp_nw70/helpdata/EN/42/fbd598481e1a61e10000000a422035/content.htm
    Regards
    Sudheer

  • Duplicate Records in DTP, but not in PSA

    Hi,
I'm facing strange behavior in a DTP while trying to load master data: it detects duplicate records where there are none.
    For example:
    ID 'cours000000000001000'
    In the source system: 1 record
    In the PSA: 1 record
    In the DTP Temporary Storage, 2 identical lines are identified.
    In fact, in this Temporary Storage, all the PSA records are duplicated... but only 101 are displayed as erroneous in the DTP...
    Here is my question: How to get rid of this duplication in the temporary storage?
    Thanks for your help
    Sylvain

The semantic key selection could cause the duplicate issue in master data: if matching values are found in the key fields, the records are treated as duplicates.
On the second tab of the DTP there is the 'Handle duplicate records' option; choose that and load.
    Ramesh

  • Duplicate Records in Transactional Load

    Dear All,
I have an issue where data is loaded from a write-optimized DSO to another write-optimized DSO, and the DTP fails because of duplicate records. It is a transactional load.
I would be grateful if you could help me understand how to handle this situation and reload the DTP.
I have tried searching the forum, but I mostly find threads about master data loading, where 'Handle Duplicate Records' can be selected.
    Thanks in Advance...
    Regards,
    Syed

    Hi Ravi,
    Thanks for your reply.
If we uncheck the option, it would accept the duplicate records, right?
In my scenario, data comes from one write-optimized DSO into another. In the first DSO 'Data Uniqueness' is not checked, while in the second DSO uniqueness is checked, so it gives me the duplicate error message.
I see around 28 records in the error stack. Please let me know how I can process these error (duplicate) records as well.
    Many Thanks...
    Regards,
    Syed

  • POSDM Duplicate records

    Hi Experts,
                        Is there a way to handle duplicate records in SAP POSDM?
    Thanks in advance,
    Khan

    Hi Abhijit,
Thanks for the reply. Is it mandatory to maintain a rule code in order to maintain a precondition filter?
And what does rule code 0001 (balanced transaction) imply?
As of now I have maintained only the standard BAdI implementation for the precondition filter, to check for duplicate records for all my ISR and BI tasks.
    Regards,
    Khan

  • Error due to duplicate records

    Hello friends,
I have done a full upload to a particular characteristics InfoObject using direct update, with the 'PSA and subsequently into data target' option.
When I load the data into the object through a process chain, I get an error that duplicate records exist, and the request turns red in the PSA.
But no duplicate records exist in the data package, and when we try to manually load the records from the PSA to the data target, it works fine.
    Can any one try to throw some lights on this error?
    Regards
    Sre....

    Hello Roberto and Paolo
There was an OSS note saying that we should not use that option; we should only load to the PSA with 'delete duplicate records' and then update into the data target.
I don't know the exact reason.
Can you throw some light on why it is like that?
    Thanks for the reply paolo and roberto
    Regards
    Sri

  • Use of "Error Handling" Tab in InfoPackage Update Tab?

    Hello All
I want to know the use of the 'Error Handling' settings on the InfoPackage Update tab.
The options on this tab are:
1. No update, no reporting
2. Valid records update, no reporting (request red)
3. Valid records update, reporting possible (request green)
Terminate after no. of errors _________
4. No aggregation allowed
Can anyone explain these in detail?
    Many Thanks
    balaji

    Hi,
    No update, no reporting (default):
    If errors do occur, the update of the entire data packet is canceled. The request is not released for reporting. However, the records check continues.
    Updating valid records, no reporting (request red) :
This option lets you update the valid data, but it is released for reporting only after the administrator has checked the incorrect, non-updated records and has manually released the request (via a QM action, i.e. setting the overall status on the Status tab in the monitor).
Updating valid records, reporting possible:
The valid records can be reported on immediately. Automatic follow-up actions are also carried out, such as adjusting the aggregates.
With the first option, the incorrect records are marked in red in the PSA maintenance. You can display the relevant error messages in a separate dialog, edit the records, and update the request manually. If it was not possible to write to the PSA (hierarchies, or the IDoc transfer method), an application log is written instead.
With the second and third options, a new request containing only the incorrect records is formed and read into the PSA. There you can edit the records of the new request and start the update manually.
    For more information please refer the URL below.
    http://help.sap.com/saphelp_bw30b/helpdata/en/05/bb223905b61b0ae10000000a11402f/frameset.htm
    Regards,
    K.Manikandan.

  • Importing and Updating Non-Duplicate Records from 2 Tables

I need some help with code that imports data from one table into another if it is not a duplicate, or updates a record that has changed.
I have 2 tables, Members and NetNews. I want to check NetNews, import non-duplicate records from Members into NetNews, and update an email address in NetNews if it has changed in Members. I figured it could be as simple as checking Members.MemberNumber and Members.Email against the existence of NetNews.MemberNumber and NetNews.Email: if a record does not exist in NetNews, create it, and if the email address in Members.Email has changed, update it in NetNews.Email.
Here is what I have from all of the suggestions received in another category last year. It is not complete, but I am stuck on the solution. Can someone please help me get this code working?
Thanks!
<cfquery datasource="#application.dsrepl#" name="qryMember">
    SELECT DISTINCT Email, FirstName, LastName, MemberNumber
    FROM Members
    WHERE MemberStanding <= 2 AND Email IS NOT NULL AND Email <> ' '
</cfquery>
<cfquery datasource="#application.ds#" name="newsMember">
    SELECT DISTINCT MemberNumber
    FROM NetNews
</cfquery>
<!--- Loop over Members and insert any member not already in NetNews --->
<cfloop query="qryMember">
    <cfif NOT listFindNoCase(valueList(newsMember.MemberNumber), qryMember.MemberNumber)
            AND isNumeric(qryMember.MemberNumber)>
        <cfquery datasource="#application.ds#">
            INSERT INTO NetNews (Email_address, First_Name, Last_Name, MemberNumber)
            VALUES ('#trim(qryMember.Email)#', '#trim(qryMember.FirstName)#',
                    '#trim(qryMember.LastName)#', '#trim(qryMember.MemberNumber)#')
        </cfquery>
    </cfif>
</cfloop>
    ------------------

Dan,
My DBA doesn't have the experience to help with a VIEW. Did I mention that these are 2 separate databases on different servers? This project is over a year old now and it really needs to get finished, so I thought the import would be the easiest way to go. Thanks to your help, it is almost working.
I added some additional code to check for a changed email address and update the NetNews database. It runs without error, but I don't have a way to test it right now. Can you please look at the code and see if it looks OK?
I am also still getting an error on line 10 after the routine runs, on the line with this code: and membernumber not in (<cfqueryparam list="yes" value="#valuelist(newsmember.membernumber)#" cfsqltype="cf_sql_integer">), even with the cfif that Phil suggested.
<cfquery datasource="#application.ds#" name="newsMember">
    SELECT DISTINCT MemberNumber, Email_Address
    FROM NetNewsTest
</cfquery>
<cfquery datasource="#application.dsrepl#" name="qryMember">
    SELECT DISTINCT Email, FirstName, LastName, MemberNumber
    FROM Members
    WHERE MemberStanding <= 2 AND Email IS NOT NULL AND Email <> ' '
    <!--- Guard the NOT IN clause when NetNewsTest is empty (the cfif Phil suggested) --->
    <cfif newsMember.recordCount GT 0>
    AND MemberNumber NOT IN (<cfqueryparam list="yes"
        value="#valueList(newsMember.MemberNumber)#"
        cfsqltype="cf_sql_integer">)
    </cfif>
</cfquery>
<cfif qryMember.recordCount NEQ 0>
    <cfloop query="qryMember">
        <cfquery datasource="#application.ds#">
            INSERT INTO NetNewsTest (Email_address, First_Name, Last_Name, MemberNumber)
            VALUES ('#trim(qryMember.Email)#', '#trim(qryMember.FirstName)#',
                    '#trim(qryMember.LastName)#', '#trim(qryMember.MemberNumber)#')
        </cfquery>
    </cfloop>
</cfif>
<!--- Update emails that changed in Members. The original UPDATE used
      INSERT-style syntax and compared query objects inside the SQL;
      corrected here to key on MemberNumber. --->
<cfquery datasource="#application.dsrepl#" name="qryEmail">
    SELECT DISTINCT Email, MemberNumber
    FROM Members
    WHERE MemberStanding <= 2 AND Email IS NOT NULL AND Email <> ' '
</cfquery>
<cfif qryEmail.recordCount NEQ 0>
    <cfloop query="qryEmail">
        <cfquery datasource="#application.ds#">
            UPDATE NetNewsTest
            SET Email_address = '#trim(qryEmail.Email)#'
            WHERE MemberNumber = <cfqueryparam value="#qryEmail.MemberNumber#"
                                               cfsqltype="cf_sql_integer">
              AND Email_address <> '#trim(qryEmail.Email)#'
        </cfquery>
    </cfloop>
</cfif>
    Thank you again for the help.
