Avoiding duplicate records in report

Hi All,
             I have a scenario where a delivery document gets created in R/3, say on 7/1, with Act GI date "#" and all key figures "0". This gets loaded into BI.
On 7/5 the delivery is PGI'd and the status in R/3 changes to Act GI date "7/5" with a quantity of "100". When this is loaded into BI, it gets published as duplicate records, i.e.:
Del doc | Created date | Act GI | Del. Ind | Qty
12345   | 1-Jul        | #      | #        | 0
12345   | 1-Jul        | 5-Jul  | #        | 100
Please note that the data is loaded from a DSO into an InfoCube, and the DSO is in overwrite mode.
Any suggestions to overcome this problem?

Is Act GI date a key field in the DSO?
If yes, the data will not be overwritten and two records will be loaded into the cube.
If you make Act GI date a data field instead, the key-field values of both loads are the same, so only one record results: 12345 | 1-Jul | 5-Jul | # | 100.
First, make sure this is right for all business scenarios.
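The key-field question above can be sketched in plain Python, with a dict standing in for the DSO's active table (field names follow the post; this is an illustration of overwrite semantics, not the actual BI load logic):

```python
# A minimal sketch of why the choice of key fields decides overwrite behaviour.
def load(dso, record, key_fields):
    key = tuple(record[f] for f in key_fields)
    dso[key] = record  # overwrite mode: a record with the same key replaces the old one

rec_created = {"del_doc": "12345", "act_gi": "#", "qty": 0}
rec_pgi     = {"del_doc": "12345", "act_gi": "5-Jul", "qty": 100}

dso_key_with_gi = {}   # Act GI date as a key field: both states survive
for r in (rec_created, rec_pgi):
    load(dso_key_with_gi, r, ["del_doc", "act_gi"])

dso_key_doc_only = {}  # Act GI date as a data field: the second load overwrites the first
for r in (rec_created, rec_pgi):
    load(dso_key_doc_only, r, ["del_doc"])

print(len(dso_key_with_gi), len(dso_key_doc_only))  # 2 1
```

With the document number alone as the key, only the final state (Act GI 5-Jul, qty 100) survives to be loaded into the cube.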

Similar Messages

  • Avoid duplicate records

hi guys,
could you please let me know where the option for avoiding duplicate records is?
1. in case of an InfoPackage?
2. in case of a DTP?

Hi,
In case of an InfoPackage in 3.5: Processing tab -> select "Only PSA", "Update Subsequently in Data Targets", "Ignore Double Data Records".
In 7.0, on the Processing tab, the selection is "Only PSA" by default.
In case of a DTP: Update tab -> select "Handle Duplicate Data Records".

  • How to avoid Duplicate Records  while joining two tables

    Hi,
Hi,
I am trying to join three tables; two of the tables are basically the same, one being a history table. So I wrote a query like:
select
   e.id,
   e.seqno,
   e.name,
   d.resdate,
   d.details
from employees e
join ((select * from dept) union (select * from dept_hist)) d
  on d.id = e.id and e.seqno = d.seqno
but this is returning duplicate records.
Could anyone please tell me how to avoid duplicate records in this query?

Actually, once a record is processed it is moved to the history table, so both tables will not have the same records. I need the records from both tables, so I did a UNION of the two, and d has the union of both.
But I am still getting duplicate records, even when I use DISTINCT.
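One likely cause can be reproduced in a few lines of SQLite (columns simplified; the "status" column distinguishing live and history rows is a hypothetical addition): UNION removes duplicates over *all* columns of the inner SELECT *, but the outer query projects fewer columns, so inner rows that differ only in an unprojected column still come out twice. DISTINCT on the outer projection is one way out:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE employees (id INT, seqno INT, name TEXT);
CREATE TABLE dept      (id INT, seqno INT, resdate TEXT, details TEXT, status TEXT);
CREATE TABLE dept_hist (id INT, seqno INT, resdate TEXT, details TEXT, status TEXT);
INSERT INTO employees VALUES (1, 10, 'Alice');
-- same logical record in both tables, differing only in the status column,
-- so the UNION keeps both rows
INSERT INTO dept      VALUES (1, 10, '2009-01-01', 'open', 'LIVE');
INSERT INTO dept_hist VALUES (1, 10, '2009-01-01', 'open', 'HIST');
""")

base = """
SELECT {distinct} e.id, e.seqno, e.name, d.resdate, d.details
FROM employees e
JOIN (SELECT * FROM dept UNION SELECT * FROM dept_hist) d
  ON d.id = e.id AND d.seqno = e.seqno
"""
rows_plain    = cur.execute(base.format(distinct="")).fetchall()
rows_distinct = cur.execute(base.format(distinct="DISTINCT")).fetchall()
print(len(rows_plain), len(rows_distinct))  # 2 1
```

If DISTINCT on the projected columns still shows duplicates in the real query, the join keys (id, seqno) are probably not unique in the unioned set and the projected columns genuinely differ between the matched rows.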

  • How to avoid duplicate measures in reports due to case functions?

    Hi,
Hi,
If I create a report using a dimension called insert_source_type, where the next level in the hierarchy is insert_source, and I do not apply any formula, I get a report where I can drill down on insert_source_type and see the insert_source values.
If I use a function like (CASE "Ins Source"."Ins Source Type" WHEN 'OWS' THEN 'WEB' ELSE "Ins Source"."Ins Source Type" END) and change the label of insert_source_type to Channel Group instead, then when I drill down on Channel Group it goes to insert_source_type, and from there I can drill down to insert_source.
There is one insert_source_type level too many!
How can this be avoided?
    Thanks and Regards
    Giuliano


  • Avoiding duplicate records while inserting into the table

Hi,
I tried the following INSERT statement, where I want to avoid duplicate records during the insert itself, but it gives me an error like "invalid identifier", although the column exists in the table.
Please let me know where I'm making the mistake.
INSERT INTO t_map tm (sn_id, o_id, txt, typ, sn_time)
   SELECT 100,
          sk.obj_id,
          sk.key_txt,
          sk.obj_typ,
          sysdate
     FROM s_key sk
    WHERE sk.obj_typ = 'AY'
      AND SYSDATE BETWEEN sk.start_date AND sk.end_date
      AND sk.obj_id IN (100170, 1001054)
      AND NOT EXISTS (SELECT 1
                        FROM t_map tm1
                       WHERE tm1.o_id = tm.o_id
                         AND tm1.sn_id = tm.sn_id
                         AND tm1.txt = tm.txt
                         AND tm1.typ = tm.typ
                         AND tm1.sn_time = tm.sn_time)

Then you have to join a table with the alias tm1, and where is that? The subquery cannot correlate against the INSERT target's alias. Do you want it like this?
INSERT INTO t_map (sn_id, o_id, txt, typ, sn_time)
   SELECT 100,
          sk.obj_id,
          sk.key_txt,
          sk.obj_typ,
          sysdate
     FROM s_key sk
    WHERE sk.obj_typ = 'AY'
      AND SYSDATE BETWEEN sk.start_date AND sk.end_date
      AND sk.obj_id IN (100170, 1001054)
      AND NOT EXISTS (SELECT 1
                        FROM t_map tm
                       WHERE sk.obj_id = tm.o_id
                         AND 100 = tm.sn_id
                         AND sk.key_txt = tm.txt
                         AND sk.obj_typ = tm.typ
                         AND sysdate = tm.sn_time)

  • Sqlloader controlfileparam's to avoid duplicate records loading

Hi All,
I am looking for an option in the control file that restricts SQL*Loader from loading duplicate records. I know that if we apply a constraint on the table itself, it won't allow duplicate data to load and will reject it, but I have to do it in the control file so that SQL*Loader avoids loading duplicates. Can you please suggest which option in the control file enables this?
Can you please also tell me the exact difference between the bad file and the reject file? In what scenarios is a record written to the reject file versus the bad file?
    Regards

Hey,
I don't think there is any option to avoid loading duplicate records by using a parameter in the control file.
On the difference between the bad and reject files, try this link:
    http://www.exforsys.com/content/view/1587/240/
    Regards,
    Sushant

  • How to avoid duplicate record in a file to file

Hi Guys,
Could you please provide a solution to avoid duplicate entries in a flat file, based on a key field? I am asking in terms of standard functions, either at message-mapping level or by configuring the file adapter.
    warm regards
    mahesh.

hi mahesh,
Write a module processor to check for duplicate records in the file adapter, or eliminate the duplicate records with a Java/ABAP mapping.
Also check these links:
    Re: How to Handle this "Duplicate Records"
    Duplicate records
    Ignoring Duplicate Records--urgent
    Re: Duplicate records frequently occurred
    Re: Reg ODS JUNK DATA
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    regards
    srinivas

  • Duplicate records in report

hello guys,
He was asking: I have duplicate records in the report; how do we rectify them?
Why and how do duplicate records appear in reporting? How is that possible?
Please explain how this can happen.
    thanks & regards

Hi,
It may be that your data target is reading data from a DSO, for example.
If this DSO has account as a key field but not center, then accounts with different centers but the same amount can accumulate into what looks like duplicate data.
This can also occur with a flat-file load, and in that case the records need to be corrected. The flat file can work directly when both account and center are key fields of that particular DSO.
This is one scenario that can happen, besides the above.
    Best Regards,
    Arpit 

  • Oracle 10 -   Avoiding Duplicate Records During Import Process

I have two databases on different servers (DB1 and DB2) and a dblink connecting the two. In DB2, I have 100 million records in table DB2Target.
I tried to load 100 million more records from table DB1Source into DB2Target on top of the existing 400 million records, but after an hour I got a network error from DB2.
The load failed after inserting 70% of the records. Now I have three tasks: first, find the duplicate records between DB1 and DB2; second, find the remaining 30% of records missing from DB2Target; third, re-load those remaining 30%. What is the best solution?
Finding duplicates:
SELECT COUNT(*), A, B FROM DB2TARGET
GROUP BY A, B
HAVING COUNT(*) > 1
Re-loading:
MERGE INTO DB2TARGET tgt
USING DB1SOURCE src
ON (tgt.A = src.A)
WHEN NOT MATCHED THEN
INSERT (tgt.A, tgt.B)
VALUES (src.A, src.B)
Thanks for any guidance.

When I execute this I get the following error message:
    SQL Error: ORA-02064: distributed operation not supported
    02064. 00000 - "distributed operation not supported"
    *Cause:    One of the following unsupported operations was attempted
    1. array execute of a remote update with a subquery that references
    a dblink, or
    2. an update of a long column with bind variable and an update of
    a second column with a subquery that both references a dblink
    and a bind variable, or
    3. a commit is issued in a coordinated session from an RPC procedure
    call with OUT parameters or function call.
*Action:   simplify remote update statement

  • Duplicate records in report but not in Database

I am new to Crystal Reports and took an online training course for it this week. I am pulling the data from an Access database and there is no duplication of records in the tables.
I can create and run the same report from a different workstation and it works fine, but when I create it on my laptop it creates duplicate entries.
I am pulling information from 3 separate tables within the same database. It will pull the correct "order id" from an order table, but it won't pull the correct "unit price" or "quantity" from the order details table.
The instructor was able to look at the report and he even tried to recreate it, with the same results. He stated that there was something wrong with Crystal Reports and then something about the SQL, but he talked quickly and then moved on, and I wasn't able to ask questions.
I'm not sure what kind of information you need from me. Any assistance you can provide would be appreciated. I tried to search the threads but couldn't find any similar problems.
    Thanks!

Hi Angela,
Please check the following:
1. The database used while creating the report on the workstation and on the laptop is the same.
2. The version of Crystal Reports on both machines is the same (Help >> About Crystal Reports).
3. Could you compare this situation with some other database?
Please keep the thread updated so that we can discuss further.
    Thanks

  • Scenario - Webservice - XI - BW. How to Avoid duplicate records?

    Hi all,
    Webservice --> XI -->BW .
A BPM has been used to send the response back.
    BPM :
    start ->Receive(Request)> Transformation(Responsemap)>Send(SendtoBW)->Send(Send Response) ---> stop.
We are making use of the MSGID to maintain the uniqueness of each message coming from the webservice. Uniqueness is maintained for the combination sale-num:trans-num:sap-saletype:sale-type, like below. One MSGID is registered in XI for each unique message.
    ex:   sale-num:trans-num:sap-saletype:sale-type
           1983:5837:E:NEW
If XI receives any duplicate message, it sends the response back to the webservice as "DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a".
It is working correctly. The only problem is when XI is down, or when a communication failure happens in the middle of processing, as in the example below.
A sample call that failed recently: a webservice call failed three times, for the following reasons.
    First time :
    It got the error as ""FAILED TO INVOKE WEB SERVICE OPERATION OS_CWUSales
    Error receiving Web Service Response: Fatal Error: csnet read operation failed (No such file or directory) (11773)" .
    Second time:
    MessageExpiredException: Message c9237200-0c69-2a80-dd11-79d5b47b213a(OUTBOUND) expired.
    Third Time :
    "DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a" ""
If you observe, when the call was made the second time the MSGID was registered, but due to the server being down (or some other reason) processing could not continue. So the MSGID got registered, but the message was never processed. When they retried a third time, sending the same call again, they got the "DUPLICATE GUID" error.
DUPLICATE GUID implies that the message has been processed and the records have been updated in the backend system, which has not actually happened here.
The final result: the status in the webservice shows "it has been updated in the receiving system", since the response indicates a duplicate GUID.
- But it has not been updated in the backend system, which is the problem.
Firstly, are there any suggestions on how to solve this problem?
Is there a better way to handle this duplicate check instead of the MSGID?
Please help me solve this.
    Thanks & Regards
    Deepthi.
    Edited by: deepthi reddy on Jan 7, 2009 2:07 AM

>> My suggestion: you can have a Webservice - BW synch-synch scenario without BPM. A sender SOAP adapter sends a synchronous request message, it gets mapped to the BW request, the response comes back from BW, and it is then mapped to the webservice response and sent to the webservice without any receiver SOAP adapter.
    Thanks for the suggestion . Looks like a good Idea.
    >> Regarding the problem of duplicate check: see when your BW system gets req msg, it processes it and then sends a response msg.........in this response message have a STATUS field with values S for success and E for error - send this response to webservice and store it in database for that req MSGID ........now in web service application, check the response code for a MSGID and if it is E for error, then only resend the msg to BW.
Initially they planned it the same way, but sometimes the response comes back from BW very late. So now, once the request reaches XI, they immediately send the response back from XI itself with a hardcoded "OK" status. They do not wait for the BAPI response from BW.
So if the message is successful, the status goes back as "OK".
If the message is not successful, the status goes back blank, which indicates there is some problem on the other side. Then they check the SOAP fault message, which will have errors like:
"DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a".
"FAILED TO INVOKE WEB SERVICE OPERATION "
Right now the only issues are the duplicate data and the response time going back. By making use of the MSGID they solved the issue, but because of this we see many messages failing daily with this DUPLICATE error in production, which hampers performance.
So we are thinking of getting rid of this MSGID method and the BPM. From your first answer I think we can achieve it without BPM, and after that I need to check how fast the response goes back to the webservice.
    -Deepthi.
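The root cause above is that the MSGID is registered before processing succeeds. One common fix, sketched here in plain Python (in-memory store and hypothetical names; the real system would persist this in a database), is to record a *status* per message ID and only mark it DONE after the backend call succeeds, so a failure in between leaves the message retryable:

```python
# Duplicate check that marks a message ID as processed only AFTER the backend
# call succeeds, so a crash between "ID seen" and "message processed" does not
# produce a false DUPLICATE on retry.
registry = {}  # msg_id -> "DONE" or "FAILED"

def handle(msg_id, process):
    if registry.get(msg_id) == "DONE":
        # genuinely processed before: the duplicate response is now truthful
        return "DUPLICATE_GUID:" + msg_id
    try:
        process()                    # forward the request to the backend (BW)
    except Exception:
        registry[msg_id] = "FAILED"  # seen, but a retry is still allowed
        raise
    registry[msg_id] = "DONE"
    return "OK"
```

With this scheme, the second attempt in the failed example would have been retried normally instead of being rejected as a duplicate.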

  • Prevent Duplicate records in Report

    I have the following 2 records...
    PART_EIPN -------|NOMEN--|FMC | PDT | PDT_DESC---------------|LOG_NO
    70107-28400-043 | BIFILAR | 16 ++| A23 | BIFILAR ASSEMBLY |+ 18
    70107-28400-043 | BIFILAR | 16 ++| A23 | BIFILAR ASSEMBLY |+ 23
How can I have the report print only one record, and under LOG_NO print 18,23?
Is there any way I can prevent the 2nd record from printing but still display the 2 log_no values (18,23)? The (+) are not part of the record; they are just for formatting on this forum.

    Sorry I missed something.
    Please confirm that this data:
    PART_EIPN -------|NOMEN--|FMC | PDT | PDT_DESC---------------|LOG_NO
    70107-28400-043 | BIFILAR | 16 ++| A23 | BIFILAR ASSEMBLY |+ 18
    70107-28400-043 | BIFILAR | 16 ++| A23 | BIFILAR ASSEMBLY |+ 23
    is produced by this query?
    Select f.model,f.part_eipn,f.nomen,e.FMC,s.PDT,p.PDT_DESC,e.log_no
    from GENFRAC_r_part_EIpn f, event_log e,event_status s,r_pdt p
    where s.pdt = :pdt
    and s.pdt(+) = p.PDT_CODE
    and p.model = 'PBLH60'
    and f.PART_EIPN = e.EIPN
    and e.log_no = s.log_no
    and f.model = 'PBLH60'
    and e.model = 'PBLH60'
    and s.model = 'PBLH60'
    order by fmc
(e.log_no is the last column.)
If this is not true, then what I need in order to help you is one query (maybe you could create a view) that produces the data currently in your report; getting the log_no values onto one line is then not a problem.
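Once a single query produces the report data, collapsing the two rows is a GROUP BY plus string aggregation over log_no. A small SQLite sketch (hypothetical one-table stand-in named `parts`; in Oracle the equivalent aggregate would be LISTAGG):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE parts (part_eipn TEXT, nomen TEXT, fmc INT, pdt TEXT,
                    pdt_desc TEXT, log_no INT);
INSERT INTO parts VALUES
  ('70107-28400-043', 'BIFILAR', 16, 'A23', 'BIFILAR ASSEMBLY', 18),
  ('70107-28400-043', 'BIFILAR', 16, 'A23', 'BIFILAR ASSEMBLY', 23);
""")
rows = cur.execute("""
SELECT part_eipn, nomen, fmc, pdt, pdt_desc,
       group_concat(log_no, ',') AS log_nos
FROM parts
GROUP BY part_eipn, nomen, fmc, pdt, pdt_desc
""").fetchall()
print(rows)  # one row; log_nos holds both log numbers
```

Grouping on every displayed column except log_no is what guarantees exactly one printed record per part.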

  • BDC program - avoid duplicate records

    Hello Experts,
I am doing a BDC. My query is not working.
My requirement is: if company code, purchasing org., account group and NAME1 are the same as in the database, i.e. if a vendor in the Excel file has the same values in these 4 fields as in the database, that particular record should not get uploaded.
it_success is the internal table with the records to be uploaded after some validations.
Please check where I am wrong.
LOOP AT i_lfa1.
  LOOP AT i_lfb1 WHERE lifnr = i_lfa1-lifnr.
    LOOP AT i_lfm1 WHERE lifnr = i_lfb1-lifnr.
      main_table-lifnr = i_lfa1-lifnr.
      main_table-bukrs = i_lfb1-bukrs.
      main_table-ekorg = i_lfm1-ekorg.
      main_table-ktokk = i_lfa1-ktokk.
      main_table-name1 = i_lfa1-name1.
      APPEND main_table.
    ENDLOOP.
  ENDLOOP.
ENDLOOP.

LOOP AT it_success.
  LOOP AT main_table WHERE bukrs = it_success-bukrs_001
                       AND ekorg = it_success-ekorg_002
                       AND ktokk = it_success-ktokk_003
                       AND name1 = it_success-name1_006.
    MOVE-CORRESPONDING it_success TO it_error.
    APPEND it_error.
    DELETE main_table.
    IF sy-subrc EQ 0.
      DELETE it_success.
      e_fret-type = 'E'.
      e_fret-name = it_error-name1_006.
      e_fret-message = 'Vendor name already exists'.
      APPEND e_fret.
    ENDIF.
  ENDLOOP.
ENDLOOP.
    Ravi
    Edited by: Julius Bussche on Oct 1, 2008 9:30 AM
    Please use meaningfull subject titles.

    Hi,
LOOP AT i_lfa1.
  LOOP AT i_lfb1 WHERE lifnr = i_lfa1-lifnr.
    READ TABLE i_lfm1 WITH KEY lifnr = i_lfb1-lifnr.
    IF sy-subrc = 0.
      main_table-ekorg = i_lfm1-ekorg.
    ENDIF.
    main_table-lifnr = i_lfa1-lifnr.
    main_table-bukrs = i_lfb1-bukrs.
    main_table-ktokk = i_lfa1-ktokk.
    main_table-name1 = i_lfa1-name1.
    APPEND main_table.
  ENDLOOP.
ENDLOOP.
Try the above code.
    Thanks,
    Durai.V

  • BI statistics, Duplicate records in ST03N

    Hi Friends,
We have applied BI statistics, and now we are checking query performance in ST03N.
In the Reporting Analysis view I can monitor query accesses. The problem is that this view shows the statistical data saved in both 0TCT_C02 (InfoCube) and 0TCT_VC02 (VirtualCube), so the entries the view displays are duplicated.
How can I solve this?
    Thanks in advance!

    Hi,
    Please implement the OSS Note:
    1401235: Avoid Duplicate records and handling of Virtual Cube call
    -Vikram

  • Avoid duplicate standard receipe qty

    Dear All,
           I have found one query when i am making one report. In C203 t.code we can see product receipe. Generally receipe group is only one for one product but in some products i have found two receipe group like 5....100 & 5...200 and it is ok and it happens.
    Now i need to fetch standard qty for input materials vs process order qty for input materials. so currently i can fetch two receipe group like 0001...820 for one receipe group and 0001...820 for second receipe group but i need only one receipe group qty. currently it seems double standard qty against process order qty because BOM no(STLNR) is same for both receipe group.
    I can also see in COR3 t.code in master data tab, there is defined particular receipe group like 5...100. and this effect we see in AFKO table. But mainly i need std.qty of receipe so i have found STAS,STKO and STPO table.In STPO table i can see std.qty of input materials and in STKO we can see Product no and its batch size.  STLAL field in STAS table and also in STKO but noy in STPO for linking purpose. Now in STPO i can see like,
    STLNR        IDNRK           Qty 
    00000639   0001...820    50
    00000639   0001...820    50
    In my report std.qty comes 100 but i want 50 qty because i have not ound any link to filter one BOM no.(STLNR).
    Is there any other tables that i can search or what to do.
    Regards,
    Shivam.

Hi! shivam pastagia
You can use the DELETE ADJACENT DUPLICATES syntax to avoid duplicate records in an internal table:
STLNR IDNRK Qty
00000639 0001...820 50
00000639 0001...820 50
SORT itab BY stlnr idnrk.
DELETE ADJACENT DUPLICATES FROM itab COMPARING stlnr idnrk.
    Regards,
    Mohammed Rasul.S
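The SORT plus DELETE ADJACENT DUPLICATES pattern above can be mirrored in Python (dicts standing in for the internal table rows; data as in the post) to show why the sort must come first: only *adjacent* rows with equal comparison fields are removed.

```python
from itertools import groupby
from operator import itemgetter

# rows as in the post: duplicate (STLNR, IDNRK) pairs with the same quantity
itab = [
    {"stlnr": "00000639", "idnrk": "0001...820", "qty": 50},
    {"stlnr": "00000639", "idnrk": "0001...820", "qty": 50},
]

# SORT itab BY stlnr idnrk, then
# DELETE ADJACENT DUPLICATES FROM itab COMPARING stlnr idnrk:
# keep only the first row of each run of equal keys
key = itemgetter("stlnr", "idnrk")
itab.sort(key=key)
deduped = [next(grp) for _, grp in groupby(itab, key=key)]
print(deduped)  # one row, qty 50
```

After deduplication the report sums 50 instead of 100, which is the result the question was after.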
