Duplicated records for DTP

Hello Gurus,
         I have a failed process chain. The data flow is from an SAP DataSource to a write-optimized DSO. The DTP failed due to duplicated records in the DataSource. How can I resolve this issue, in detailed steps?
Many thanks

Hi,
There is a setting in the DTP that handles duplicate keys. Just check that and try the load again.
It will resolve the issue.
Cheers,
Reena

Similar Messages

  • Handling Duplicated Records in DTP

    Dear Experts,
    I am trying to load to the master data of 0PLANT using datasource 0BBP_PLANT_LOCMAP_ATTR (Location/Plant Mapping) using DTP.
    This standard datasource is neither language- nor time-dependent. Also, in the source system, it is not marked to handle duplicate records.
    I have also referred to OSS Note 1147563 - Consulting note: Indicator "Duplicate records". One of the key highlights is: "If you use a DTP to transfer master data or texts, you can set the indicator 'Duplicate records' in the DTP tab page 'Data transfer'." I would suppose that this means the "Handle Duplicate Record Keys" option under the "Update" tab of the respective DTP.
    In this OSS Note, it was also mentioned that
    "You must not set the indicator if the following prerequisites apply:
    The indicator 'DataSource delivers duplicate records' is not set in the DataSource."
    >> which is currently the case of datasource 0BBP_PLANT_LOCMAP_ATTR
    Checked in SAP Help [link|http://help.sap.com/saphelp_nw04s/helpdata/en/42/fbd598481e1a61e10000000a422035/frameset.htm]:
    You can specify how duplicate data records within a request are handled, independently of whether the setting that allows DataSources to deliver potentially duplicate data records has been made. This is useful if the setting was not made to the DataSource, but the system knows from other sources that duplicate data records are transferred (for example, when flat files are loaded).
    My question is, I can't load the master data, mainly because of these duplicated record key errors, and when I checked the indicator to handle duplicate record keys, I was given the error message "Enter a valid value". Thereafter, I couldn't do anything at all - activate the DTP, click on other tabs - it just got stuck at this point, and I could only exit the transaction.
    Can anyone advise if I have basically missed anything?
    Thank you in advance.
    Regards,
    Adelynn

    Hi,
    Handling Duplicate Data Records 
    Use
    DataSources for texts or attributes can transfer data records with the same key into BI in one request. Whether the DataSource transfers multiple data records with the same key in one request is a property of the DataSource. There may be cases in which you want to transfer multiple data records with the same key (referred to as duplicate data records below) to BI more than once within a request; this is not always an error. BI provides functions to handle duplicate data records so that you can accommodate this.
    Features
    In a dataflow that is modeled using a transformation, you can work with duplicate data records for time-dependent and time-independent attributes and texts.
    If you are updating attributes or texts from a DataSource to an InfoObject using a data transfer process (DTP), you can specify the number of data records with the same record key within a request that the system can process. In DTP maintenance on the Update tab page, you set the Handle Duplicate Record Keys indicator to specify the number of data records.
    This indicator is not set by default.
    If you set the indicator, duplicate data records (multiple records with identical key values) are handled as follows:
    ●      Time-independent data:
    If data records have the same key, the last data record in the data package is interpreted as being valid and is updated to the target.
    ●      Time-dependent data:
    If data records have the same key, the system calculates new time intervals for the data record values. The system calculates new time intervals on the basis of the intersecting time intervals and the sequence of the data records.
    Data record 1 is valid from 01.01.2006 to 31.12.2006
    Data record 2 has the same key but is valid from 01.07.2006 to 31.12.2007
    The system corrects the time interval for data record 1 to 01.01.2006 to 30.06.2006. As of 01.07.2006, the next data record in the data package (data record 2) is valid.
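    The interval correction in this example can be sketched in Python (a simplified model of the documented behavior, not the actual BI implementation):

```python
from datetime import date, timedelta

# Two records with the same key; rec2 arrives later in the data package.
rec1 = {"datefrom": date(2006, 1, 1), "dateto": date(2006, 12, 31)}
rec2 = {"datefrom": date(2006, 7, 1), "dateto": date(2007, 12, 31)}

# If the later record overlaps the earlier one, the earlier record's
# DATETO is cut back to the day before the later record's DATEFROM.
if rec2["datefrom"] <= rec1["dateto"]:
    rec1["dateto"] = rec2["datefrom"] - timedelta(days=1)

print(rec1["dateto"])  # 2006-06-30
```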
    If you set the indicator for time-dependent data, note the following:
    You cannot include the data source field that contains the DATETO information in the semantic key of the DTP. This may cause duplicate data records to be sorted incorrectly and time intervals to be incorrectly calculated.
    The semantic key specifies the structure of the data packages that are read from the source.
    Example
    You have two data records with the same key within one data package.
    In the first case, DATETO is not an element of the key:
    In the data package, the data records are in sequence DS2, DS1. In this case, the time interval for data record 1 is corrected:
    Data record 1 is valid from 1.1.2002 to 31.12.2006.
    Data record 2 is valid from 1.1.2000 to 31.12.2001.
    In the second case, DATETO is an element of the key:
    If DATETO is an element of the key, the records are sorted by DATETO. In this case, the data record with the earliest date is put before the data record with the most recent date. In the data package, the data records are in sequence DS2, DS1. In this case, the time interval for data record 2 is corrected:
    Data record 2 is valid from 1.1.2000 to 31.12.2000.
    Data record 1 is valid from 1.1.2001 to 31.12.2006.
    If you do not set this indicator, data records that have the same key are written to the error stack of the DTP.
    Note
    You can specify how duplicate data records within a request are handled, independently of whether the setting that allows DataSources to deliver potentially duplicate data records has been made. This is useful if the setting was not made to the DataSource, but the system knows from other sources that duplicate data records are transferred (for example, when flat files are loaded).
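    The time-independent behavior described above ("the last data record in the data package is interpreted as being valid") can be sketched in Python; the key and attribute values here are illustrative, not the actual DTP implementation:

```python
# Records with the same key within one data package, in arrival order.
package = [
    ("1000", "Hamburg"),
    ("2000", "Munich"),
    ("1000", "Berlin"),  # duplicate key, later in the package
]
valid = {}
for key, value in package:  # later records overwrite earlier ones
    valid[key] = value
print(valid["1000"])  # Berlin
```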

  • LSO: Duplicated Records Created For Each Person During Booking For Course

    Dear All,
    Since 2 months ago, I've been hitting the following issue.
    When I book an employee to a course through ESS, duplicated course participation records are created in the system.
    The duplicated records are having different booking priorities.
    I am aware that this happens for courses that have a workflow attached to them.
    Sometimes, instead of having duplicated records created, it hit dump with the following message:
    "The ABAP/4 Open SQL array insert results in duplicate database records."
    Does anyone know what problem is this?
    Thanks.

    Did you solve this problem, Im facing the same problem.
    My HRP1001 table is increasing.

  • How to delete the duplicated records, not just suppress the records?

    I am new to CR. Right now I am doing a project which needs CR get query from Oracle. I got the query from Oracle. There are records with duplicated fields. For example (the following only show part of fields):
    ID    body_code
    1     10
    2     10
    3     15
    4     15
    5     15
    6     16
    I need to select only records (not suppress them, because I will do some calculation later) like the following:
    ID    body_code
    1     10
    2     15
    3     16
    I tried to create a selection formula in the Formula Workshop, as follows:
    onlastrecord;
    <>next
    but CR said "next" cannot be evaluated there. I think it must have something to do with print time. So what should I do to delete the duplicated records? Thank you very much.

    Ting,
    Try this:
    Insert a group on body_code.  Then create a running total called Distinct Count.  Field to summarize -> ID, Evaluate on change of group Group # 1 Body Code, and Never Reset.
    Then insert a chart in the report header:
    In the advanced layout, select body_code on change of and select Distinct Count running total in the Show values.
    I hope I understood what you're looking to accomplish.
    Z
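    For reference, Ting's stated requirement (keep the first ID per distinct body_code and renumber) can be sketched outside Crystal Reports, for example in Python:

```python
# Sample rows as (ID, body_code), matching the question's table.
rows = [(1, 10), (2, 10), (3, 15), (4, 15), (5, 15), (6, 16)]

seen = set()
codes = []
for _id, code in rows:       # keep the first occurrence of each body_code
    if code not in seen:
        seen.add(code)
        codes.append(code)

# Renumber the surviving rows 1..n, as in the desired output.
deduped = [(i + 1, code) for i, code in enumerate(codes)]
print(deduped)  # [(1, 10), (2, 15), (3, 16)]
```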

  • Option for error handling for DTP, "no update, no reporting" and "deactivated"

    Hello Gurus,
         Regarding the options for error handling for DTP, "no update, no reporting" and "deactivated": please give some explanation and an example for each.
    Many Thanks,

    On the Update tab page, specify how you want the system to respond to data records with errors:
    a. No update, no reporting (default)
    If errors occur, the system terminates the update of the entire data package. The request is not released for reporting. However, the system continues to check the records.
    b. Update valid records, no reporting (request red)
    This option allows you to update valid data. This data is only released for reporting after the administrator checks the incorrect records that have not been updated and manually releases the request by setting the overall status on the Status tab page in the monitor (QM action).
    c. Update valid records, reporting possible
    Valid records can be reported immediately. Automatic follow-up actions, such as adjusting the aggregates, are also carried out.
    http://help.sap.com/saphelp_smehp1/helpdata/en/42/fbd598481e1a61e10000000a422035/content.htm
    Hope it helps.
    rgds, Ghuru

  • Ignore duplicate records for master data attributes

    dear  experts ,
                   How and where can I enable "ignore duplicate records" when I am running my DTP to load data to master data attributes?

    Hi Raj
    Suppose you are loading master data to an InfoObject and in the PSA you have more than one record for a key.
    Let's assume you are loading some attributes of Document Number (0DOC_NUMBER) and in the PSA you have multiple records for the same document number. In the PSA this is not a problem, as the key of the PSA table is a technical key, which is different for every record. But it is a problem for the InfoObject attribute table, as more than one record for the primary key (here 0DOC_NUMBER) would violate the primary key constraint in the database.
    This issue can easily be avoided by selecting "Handle Duplicate Record Keys" in the DTP. You will find this option under the "Update" tab of the DTP.
    Regards
    Anindya
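    The primary key constraint Anindya describes can be illustrated with a small SQLite sketch; the table and column names here are simplified stand-ins, not the real attribute table:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# The attribute table keys on the characteristic value, like /BI0/P... tables.
con.execute("CREATE TABLE attr (doc_number TEXT PRIMARY KEY, status TEXT)")
con.execute("INSERT INTO attr VALUES ('0000000001', 'open')")

err = None
try:
    # Second record with the same key: the database rejects it.
    con.execute("INSERT INTO attr VALUES ('0000000001', 'closed')")
except sqlite3.IntegrityError as exc:
    err = exc
print(err)  # UNIQUE constraint failed: attr.doc_number
```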

  • Data Package 1 ( 0 Data Records ) at DTP with Status RED

    Hi All,
       For 0 records at DTP level, it is showing Overall status & Technical status as RED, with yellow beside Data Package 1 (0 Data Records). There is no short dump and no error message. At PSA level, in the status tab, the message displayed is Status 8, which says there is no data on the R3 side for this particular load. Help me out.
    Regards,
    Krishna.

    Hi,
    if the traffic light is not highlighted, you are probably running a delta.
    You will have to set the traffic light in the corresponding init
    (and run the init again).
    The setting in the delta will be the same.
    Udo

  • Duplicated records

    Dear friends,
    One question for you from a PL/SQL beginner:
    I have to identify duplicated records (and mark them as erroneous).
    With the code below I retrieve only one record per duplicated group, but I need ALL the duplicated records (for example, if I have 2 equal records, I need both of them, not only one):
    SELECT
    ID_AMBS
    , ASL
    , CD_PRESIDIO
    , GGMM_CONTATTO
    , NR_RICETTA
    , CD_CONT_PRESCR
    , NR_PROG_INT
    FROM NOC_AMBS_WORK
    WHERE ID_AMBS IN (
        SELECT
        min(ID_AMBS)
        FROM
        NOC_AMBS_WORK
        GROUP BY
        ASL, CD_PRESIDIO, GGMM_CONTATTO, NR_RICETTA, CD_CONT_PRESCR, NR_PROG_INT
        HAVING
        COUNT (*) > 1
    )
    With the code below I retrieve all the records of the table (not only the duplicated ones):
           SELECT *
            FROM noc_ambs_work a,
                 noc_ambs_work b
           WHERE NVL(a.asl,'x') = NVL(b.asl,'x')
             AND NVL(a.cd_presidio,'x') = NVL(b.cd_presidio,'x')
             AND NVL(a.ggmm_contatto,'x') = NVL(b.ggmm_contatto,'x')
             AND NVL(a.nr_ricetta,'x') = NVL(b.nr_ricetta,'x')
             AND NVL(a.cd_cont_prescr,'x') = NVL(b.cd_cont_prescr,'x')
             AND NVL(a.nr_prog_int,'x') = NVL(b.nr_prog_int,'x')
    Can you help me?
    Tks
    leo

    Since you have not mentioned which columns are getting duplicated, what the table structure is, or which column is the PK, I think you can take help from the sample below:
    with c as
    (
    select 1 id, 'A' n from dual union all
    select 1, 'A' from dual union all
    select 1, 'A' from dual union all
    select 2, 'e' from dual union all
    select 3, 'r' from dual
    )
    select *
    from c
    where id in (select t1.id
                 from c t1
                 group by id
                 having count(*) > 1);
            ID N
             1 A
             1 A
             1 A
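    leo's original requirement (return every row of every duplicated key group, not just one representative) can also be solved by grouping on the key and joining back. A sketch in SQLite via Python, with simplified table and column names standing in for NOC_AMBS_WORK:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE work (id INTEGER, asl TEXT, nr_ricetta TEXT)")
con.executemany("INSERT INTO work VALUES (?,?,?)", [
    (1, "A", "R1"), (2, "A", "R1"),  # duplicated pair -> both rows wanted
    (3, "B", "R2"),                  # unique -> excluded
])

# Find key combinations occurring more than once, then join back to
# retrieve ALL rows belonging to those combinations.
rows = con.execute("""
    SELECT w.id, w.asl, w.nr_ricetta
      FROM work w
      JOIN (SELECT asl, nr_ricetta FROM work
             GROUP BY asl, nr_ricetta HAVING COUNT(*) > 1) d
        ON w.asl = d.asl AND w.nr_ricetta = d.nr_ricetta
     ORDER BY w.id
""").fetchall()
print(rows)  # [(1, 'A', 'R1'), (2, 'A', 'R1')]
```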

  • Duplicated records on infoobect data load

    Hi,
    I have a problem when loading data to the 0UCINSTALLA InfoObject.
    It goes to a red flag, and it reports duplicated records in the /BI0/QUCINSTALLA and /BI0/YUCINSTALLA tables.
    I checked the InfoPackage: the "PSA only" checkbox is selected, and the "Continuing..." and "Ignore dup records" checkboxes are selected too.
    If "Ignore duplicated records" is selected, why is it reporting the error?
    I don't know what to do with this problem.
    any ideas?
    thanks for the help.
    Mauricio.

    In the transfer structure, write a start routine that deletes duplicate records, like this:
    sort DATAPAK by /BIC/field1 descending /BIC/field2
    /BIC/field3.
    delete adjacent duplicates from DATAPAK comparing /BIC/field1 /BIC/field2.
    Hope it helps.
    Regards
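    The sort-then-delete-adjacent-duplicates idea has a direct Python analog (field names are placeholders, and this sketch sorts both key fields descending rather than mirroring the ABAP sort exactly):

```python
from itertools import groupby

# Rows as (field1, field2, value); field1 is the duplicate key.
datapak = [("P1", "B", 2), ("P1", "A", 1), ("P2", "C", 3)]

# Sort so that, within each key, the preferred row comes first...
datapak.sort(key=lambda r: (r[0], r[1]), reverse=True)
# ...then keep only the first row of each adjacent key group.
deduped = [next(g) for _, g in groupby(datapak, key=lambda r: r[0])]
print(deduped)  # [('P2', 'C', 3), ('P1', 'B', 2)]
```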

  • How to connect COEP-ESLH tables - duplicated records

    Dear All,
    I'd like to ask you to please give a hand in this issue:
    I connected the COEP and ESLH tables in SQVI. The tables are connected through fields EBELN & EBELP. The problem is that I get duplicated records, and so far I haven't been able to figure out what the reason might be...
    How can I solve this? Please...
    Thanks,
    Csaba

    Hi Christine,
    Thanks for your help.
    To tell the truth, I've got a task to check why someone else's query doesn't work... after a while I found out that the problem is this table connection...
    I'm not (fully) aware of the business purpose of this query; I'm working in the logistics field...
    Example of duplicated records:
    ESLH
    packno_ebeln_ebelp
    000001_0011_0010
    000002_0011_0010
    COEP
    belnr_buzei_ebeln_ebelp
    000A_0001_0011_0010
    000A_0002_0011_0010
    As a result of the COEP-ESLH query I get four records instead of 2 (I guess the first record in COEP belongs to the first record in ESLH, etc., but of course that is my assumption and SAP cannot know it...)
    BR
    Csaba
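    Csaba's symptom can be reproduced in miniature: joining only on EBELN/EBELP, when ESLH has two package numbers for the same item and COEP has two line items for it, yields a cross product (2 × 2 = 4 rows). A SQLite sketch with simplified columns:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE eslh (packno TEXT, ebeln TEXT, ebelp TEXT)")
con.execute("CREATE TABLE coep (belnr TEXT, buzei TEXT, ebeln TEXT, ebelp TEXT)")
con.executemany("INSERT INTO eslh VALUES (?,?,?)",
                [("000001", "0011", "0010"), ("000002", "0011", "0010")])
con.executemany("INSERT INTO coep VALUES (?,?,?,?)",
                [("000A", "0001", "0011", "0010"), ("000A", "0002", "0011", "0010")])

# Join only on EBELN/EBELP: every COEP row matches every ESLH row.
n = con.execute("""SELECT COUNT(*) FROM coep c
                     JOIN eslh e ON c.ebeln = e.ebeln
                                AND c.ebelp = e.ebelp""").fetchone()[0]
print(n)  # 4 rows instead of the expected 2
```

    Avoiding the duplication needs an additional join field that pairs the rows uniquely, which EBELN/EBELP alone cannot provide here.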

  • Outbound: Duplicated records, Scripting tool, Campaign name

    Hi All;
    Is there an option to remove duplicated records in the outbound, which can happen when importing two files that share some numbers, so the new record is used while the old one is removed (this option is called "remove the duplication" or something like that)?
    Also, I am looking for a scripting tool, so that when the agent is talking with the customer, there is a script that helps the agent know how to deal with the customer; this script needs to be created per campaign. Is there something like this?
    Another thing: when the call reaches the agent, is it possible for the name of the campaign to be shown at the CTI client?
    Regards
    Bilal

    Dear Pallavi,
    Very useful post!
    I am looking for similar accelerators for
    Software Inventory Accelerator
    Hardware Inventory Accelerator
    Interfaces Inventory
    Customization Assessment Accelerator
    Sizing Tool
    These help us to come up with the relevant Bill of Materials for every area mentioned above, and the ones which I don't know...
    Request help on such accelerators... Any clues?
    Any reply, help is highly appreciated.
    Regards
    Manish Madhav

  • Failure for DTP

    Hello Gurus,
        I have a failure for a DTP; the detailed message in the monitor is as follows. Please show me how I can deal with it.
           Data Package 14: Errors During Processing
                          Update to DataStore Object ZCANDTRA : 50001 -> 50001 Data Records
                          Data package 14 / 04/29/2010 18:44:24 / Status 'Processed with Errors'     
    Many thanks

    Hi,
    It looks like the record has got some errors. Check what the description tells you, and also check if it has any special characters. If it is special characters, then include those and reload. It should work fine.
    Harish
    Edited by: Harish3152 on Apr 30, 2010 10:59 AM

  • The load from ODS - 0FIAP_O03 to cube 0FIAP_C03  is duplicating records

    Hello Everyone
    I need some help/advice for the following issue
    SAP BW version 3.5
    The Delta load for ODS - 0FIAP_O03 works correctly
    The load from ODS - 0FIAP_O03 to cube 0FIAP_C03  is duplicating records
    NB: I noticed one other forum user who has raised the same issue, but the question was not answered.
    My questions are:
    1. Is this a known problem?
    2. Is there a fix available from SAP?
    3. If anyone has had this issue and fixed it, could you please share how you achieved this
    I have possible solutions, but need to know if there is a standard solution.
    Thankyou
    Pushpa

    Hello Pushpa,
    I assume that you are using the delta load to the initial ODS and then to the cube as well.
    If the delta is in place for both loads, then there should not be any issue while sending the data to the cube.
    If you are using the FULL load, then normally the data gets aggregated in the cube, as the key figures use addition mode in the cube.
    Can you post the exact error that you are facing here as this also can be of the design issue.
    Murali
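    Murali's point about addition mode can be illustrated: if the same record is delivered twice by full loads, an additive key figure doubles instead of being overwritten. A minimal sketch (key and amount are illustrative):

```python
from collections import defaultdict

# Key figures in a cube are additive: loading the same record twice
# adds the amounts instead of replacing them.
cube = defaultdict(float)
requests = [("0011", 100.0), ("0011", 100.0)]  # same record loaded twice
for key, amount in requests:
    cube[key] += amount
print(cube["0011"])  # 200.0 instead of 100.0
```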

  • Since you don't allow emails any more…

    Since you don’t allow emails any more – I suspect because of the numerous complaints with your service and the way you treat people that you don’t want documented, I am calling and I want this call recorded for future reference.
    I have been a long time faithful customer of vzw and although the past year I have been late on payments many times and really couldn’t afford your exorbitant prices for services lots of other companies offer sometimes three times cheaper than what you charge, I have hung in there trying my best to meet my obligations.
    This month has been no exception. You don’t know the background; the whole story of people’s lives. I know you could care less because all you care about is the profit-the money that comes in.
    I was told when I agreed to pay my bill on the third, per the recorded message, that I had 14 days to pay…you cut me off anyway. The phones are not the issue; your suspending my service means I cannot work. I may lose my job…how do you justify that? In any case? The least you could do would be to keep 4986 on and cut the phones off. But no. You refuse to compromise and meet the basic needs of your customer. What does that say about your company? I tried to call back on three separate occasions to tell you I couldn't pay because of unexpected expenses but couldn't get out of the automated system…sadly couldn't get to a real person, which also speaks volumes to me.
    All this tells me this is a company I don’t wish to be affiliated with any more. As soon as I can, I will discontinue service with you…I know you could care less. I will honor the remaining portion of the contract but that’s it. You don’t deserve my business. I am a good, hardworking person who, at the sacrifice of myself and my needs, always pays her bills…albeit late at times. I realize others tell you stories and lies to justify themselves. That’s not me. If you knew what I had been through the last 7 yrs you would marvel that I am  still on my feet…don’t judge too quickly. You could be wrong…and in my eyes you are by doing this to me.
    God will see us through this extremely scary time of that I have no doubt. No thanks to your company and lack of understanding and mercy. I am doing the best I can. Sadly you are not.
    See I have choices. MANY choices of providers for services you offer. I don’t have to be treated like this. I don’t have to succumb to your coldness and callousness. I intend to choose better (and cheaper). If your company doesn’t get the “people factor” back you will be sorry.

    The problem here is you admit you cannot afford the service.
    And you want to blame Verizon for losing a job because you have no cell phone.
    If your job depends on that phone, I would pay it on time every time, since you need the job to pay your bill.
    No other service is going to treat you any differently. And if you cannot afford Verizon's monthly invoice, how are you going to afford new devices, activation fees, and possible security deposits at any other cellular carrier? You can't.
    Also, if you made an arrangement to pay and then decided you could not do so, why should Verizon extend you service or credit? Why should you get to use the service and data and not pay for it as agreed?
    Get a prepay phone. It's evident the cost is too high for you to afford on post pay.
    Good Luck

  • SQL help: return number of records for each day of last month.

    Hi: I have records in the database with a field in the table which contains the Unix epoch time for each record. Let's say the table name is ED and the field utime contains the Unix epoch time.
    Is there a way to get a count of number of records for each day of the last one month? Essentially I want a query which returns a list of count (number of records for each day) with the utime field containing the Unix epoch time. If a particular day does not have any records I want the query to return 0 for that day. I have no clue where to start. Would I need another table which has the list of days?
    Thanks
    Ray

    Peter: thanks. That helps but not completely.
    When I run the query to include only records for July using a statement such as following
    ============
    SELECT /*+ FIRST_ROWS */ COUNT(ED.UTIMESTAMP), TO_CHAR((TO_DATE('01/01/1970','MM/DD/YYYY') + (ED.UTIMESTAMP/86400)), 'MM/DD') AS DATA
    FROM EVENT_DATA ED
    WHERE AGENT_ID = 160
    AND (TO_CHAR((TO_DATE('01/01/1970','MM/DD/YYYY')+(ED.UTIMESTAMP/86400)), 'MM/YYYY') = TO_CHAR(SYSDATE-15, 'MM/YYYY'))
    GROUP BY TO_CHAR((TO_DATE('01/01/1970','MM/DD/YYYY') + (ED.UTIMESTAMP/86400)), 'MM/DD')
    ORDER BY TO_CHAR((TO_DATE('01/01/1970','MM/DD/YYYY') + (ED.UTIMESTAMP/86400)), 'MM/DD');
    =============
    I get the following
    COUNT(ED.UTIMESTAMP) DATA
    1 07/20
    1 07/21
    1 07/24
    2 07/25
    2 07/27
    2 07/28
    2 07/29
    1 07/30
    2 07/31
    Some dates do not have any records and so produce no output. Is there a way to show the missing dates with a COUNT value of 0?
    Thanks
    Ray
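    The missing-dates part can be handled either with a calendar table on the SQL side or by post-processing the query result. A Python sketch of the post-processing approach (the year 2006 and the sample counts are assumed purely for illustration):

```python
from datetime import date, timedelta

# Per-day counts as returned by the query (days with no records are absent).
counts = {"07/20": 1, "07/21": 1, "07/24": 1, "07/25": 2}

# Walk every day of the month and fill missing days with 0.
start, end = date(2006, 7, 1), date(2006, 7, 31)
filled = {}
d = start
while d <= end:
    key = d.strftime("%m/%d")
    filled[key] = counts.get(key, 0)
    d += timedelta(days=1)

print(filled["07/22"], filled["07/20"])  # 0 1
```

    The SQL-side alternative is to outer-join the aggregated counts against a generated list of the month's dates, so the database itself returns a zero row for each empty day.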
