Avoid duplicate records

Hi guys,
Could you please let me know where the option for avoiding duplicate records is?
1. In the case of an InfoPackage?
2. In the case of a DTP?

Hi,
In the case of an InfoPackage in 3.5: Processing tab -> select Only PSA, Update Subsequently in Data Targets, Ignore Double Data Records.
In 7.0, the Processing tab selection is Only PSA by default.
In the case of a DTP: Update tab -> select Handle Duplicate Data Records.

Similar Messages

  • How to avoid Duplicate Records while joining two tables

    Hi,
    I am trying to join three tables; two of the tables are basically the same, one being a history table. So I wrote a query like

    select e.id,
           e.seqno,
           e.name,
           d.resdate,
           d.details
      from employees e
      join (select * from dept
            union
            select * from dept_hist) d
        on d.id = e.id
       and d.seqno = e.seqno

    but this returns duplicate records.
    Could anyone please tell me how to avoid duplicate records in this query?

    Actually, once a record is processed it is moved to the history table, so both tables will not have the same records; since I need the records from both tables, I did a union of the two, so d has the union of both. But I am still getting duplicate records even when I use DISTINCT.
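
    One way to fix this (a minimal sketch reusing the column names from the post; the ORDER BY used to pick the surviving row is an assumption) is to keep exactly one row per (id, seqno) from the combined tables before joining, so join multiplicities cannot produce duplicates:

    -- number the combined rows per key and keep only the first
    select e.id,
           e.seqno,
           e.name,
           d.resdate,
           d.details
      from employees e
      join (select d0.*,
                   row_number() over (partition by id, seqno
                                      order by resdate desc) as rn
              from (select * from dept
                    union all
                    select * from dept_hist) d0) d
        on d.id = e.id
       and d.seqno = e.seqno
     where d.rn = 1;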

  • Avoiding duplicate records while inserting into the table

    Hi
    I tried the following insert statement, where I want to avoid duplicate records during the insert itself, but it gives me an error like "invalid identifier", although the column exists in the table.
    Please let me know where I am making a mistake.
    INSERT INTO t_map tm (sn_id, o_id, txt, typ, sn_time)
       SELECT 100,
              sk.obj_id,
              sk.key_txt,
              sk.obj_typ,
              sysdate,
         FROM s_key sk
        WHERE sk.obj_typ = 'AY'
          AND SYSDATE BETWEEN sk.start_date AND sk.end_date
          AND sk.obj_id IN (100170, 1001054)
          AND NOT EXISTS (SELECT 1
                            FROM t_map tm1
                           WHERE tm1.o_id    = tm.o_id
                             AND tm1.sn_id   = tm.sn_id
                             AND tm1.txt     = tm.txt
                             AND tm1.typ     = tm.typ
                             AND tm1.sn_time = tm.sn_time)

    You correlate the subquery to the alias tm of the insert target, but that alias is not visible inside the subquery, so where would tm come from? Do you want it like this?
    INSERT INTO t_map (sn_id, o_id, txt, typ, sn_time)
       SELECT 100,
              sk.obj_id,
              sk.key_txt,
              sk.obj_typ,
              sysdate
         FROM s_key sk
        WHERE sk.obj_typ = 'AY'
          AND SYSDATE BETWEEN sk.start_date AND sk.end_date
          AND sk.obj_id IN (100170, 1001054)
          AND NOT EXISTS (SELECT 1
                            FROM t_map tm
                           WHERE tm.o_id    = sk.obj_id
                             AND tm.sn_id   = 100
                             AND tm.txt     = sk.key_txt
                             AND tm.typ     = sk.obj_typ
                             AND tm.sn_time = sysdate)

    (Note: the trailing comma after sysdate in your version is also a syntax error and has to go.)
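
    As an alternative (a sketch, not from the original answer, reusing the thread's table and column names), the same insert-if-not-exists can be written as a MERGE, which makes the duplicate check explicit in the ON clause:

    MERGE INTO t_map tm
    USING (SELECT 100        AS sn_id,
                  sk.obj_id  AS o_id,
                  sk.key_txt AS txt,
                  sk.obj_typ AS typ,
                  sysdate    AS sn_time
             FROM s_key sk
            WHERE sk.obj_typ = 'AY'
              AND SYSDATE BETWEEN sk.start_date AND sk.end_date
              AND sk.obj_id IN (100170, 1001054)) src
       ON (tm.o_id = src.o_id AND tm.sn_id = src.sn_id
           AND tm.txt = src.txt AND tm.typ = src.typ
           AND tm.sn_time = src.sn_time)
     WHEN NOT MATCHED THEN
       INSERT (sn_id, o_id, txt, typ, sn_time)
       VALUES (src.sn_id, src.o_id, src.txt, src.typ, src.sn_time);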

  • Sqlloader controlfileparam's to avoid duplicate records loading

    Hi All,
    I am looking for an option in the control file that restricts SQL*Loader from loading duplicate records. I know that if we put a constraint on the table itself, it won't allow duplicate data to be loaded and will reject it into a file, but I have to do it in the control file so that SQL*Loader itself avoids loading duplicates. Can you please suggest which control file option enables this?
    Can you please also tell me the exact difference between the bad file and the reject file? In what scenarios does it write to the reject file versus the bad file?
    Regards

    Hey,
    I don't think there is any option to avoid loading duplicate records via a parameter in the control file.
    On the difference between the bad and reject files, try this link:
    http://www.exforsys.com/content/view/1587/240/
    Regards,
    Sushant
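
    Since the control file cannot deduplicate on its own, the usual workaround (a sketch; the staging table stg_t and the target table/columns are assumed names) is to load into a staging table with SQL*Loader and then copy only new, distinct rows into the real target:

    -- run after SQL*Loader has finished loading into stg_t
    INSERT INTO target_t (a, b, c)
    SELECT DISTINCT a, b, c
      FROM stg_t s
     WHERE NOT EXISTS (SELECT 1
                         FROM target_t t
                        WHERE t.a = s.a
                          AND t.b = s.b
                          AND t.c = s.c);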

  • How to avoid duplicate records in a file-to-file scenario

    Hi Guys,
    Could you please provide a solution for avoiding duplicate entries in a flat file, based on a key field? I am asking in terms of standard functions, either at message mapping level or by configuring the file adapter.
    Warm regards,
    Mahesh.

    Hi Mahesh,
    Write a module processor to check for duplicate records in the file adapter, or eliminate the duplicate records with a Java/ABAP mapping.
    Also check these links:
    Re: How to Handle this "Duplicate Records"
    Duplicate records
    Ignoring Duplicate Records--urgent
    Re: Duplicate records frequently occurred
    Re: Reg ODS JUNK DATA
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    Regards,
    Srinivas

  • Oracle 10 - Avoiding Duplicate Records During Import Process

    I have two databases on different servers (DB1 and DB2) and a dblink connecting the two. In DB2, I have 100 million records in the table DB2Target.
    I tried to load 100 million more records from table DB1Source into DB2Target on top of the existing 400 million records, but after an hour I got a network error from DB2.
    The load failed after inserting 70% of the records. Now I have three tasks: first, find the duplicate records between DB1 and DB2; second, find the remaining 30% of records missing from DB2Target; third, re-load those remaining 30%. What is the best solution?
    Finding the duplicates:

    SELECT COUNT(*), a, b
      FROM db2target
     GROUP BY a, b
    HAVING COUNT(*) > 1

    Re-loading:

    MERGE INTO db2target tgt
    USING db1source src
       ON (tgt.a = src.a)
     WHEN NOT MATCHED THEN
       INSERT (tgt.a, tgt.b)
       VALUES (src.a, src.b)

    Thanks for any guidance.

    When I execute this I get the following error message:

    SQL Error: ORA-02064: distributed operation not supported
    02064. 00000 - "distributed operation not supported"
    *Cause:    One of the following unsupported operations was attempted:
               1. array execute of a remote update with a subquery that references
                  a dblink, or
               2. an update of a long column with bind variable and an update of
                  a second column with a subquery that both references a dblink
                  and a bind variable, or
               3. a commit is issued in a coordinated session from an RPC procedure
                  call with OUT parameters or function call.
    *Action:   simplify remote update statement
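
    No resolution is posted in the thread. One common workaround (a sketch; the dblink name @db1_link and the staging table name are assumptions) is to materialize the remote rows locally first, so the MERGE itself is no longer a distributed operation:

    -- copy the remote source rows to a local staging table over the dblink
    CREATE TABLE stage_src AS
      SELECT a, b FROM db1source@db1_link;

    -- both sides of the MERGE are now local to DB2
    MERGE INTO db2target tgt
    USING stage_src src
       ON (tgt.a = src.a)
     WHEN NOT MATCHED THEN
       INSERT (tgt.a, tgt.b)
       VALUES (src.a, src.b);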

  • Scenario - Webservice - XI - BW. How to Avoid duplicate records?

    Hi all,
    Webservice --> XI --> BW.
    BPM has been used to send the response back.
    BPM:
    start -> Receive(Request) -> Transformation(Responsemap) -> Send(SendtoBW) -> Send(Send Response) -> stop.
    We are making use of the MSGID to maintain the uniqueness of each message coming from the webservice. Uniqueness is maintained for the combination sale-num:trans-num:sap-saletype:sale-type, as below. One MsgID is registered in XI for each unique message.
    ex:   sale-num:trans-num:sap-saletype:sale-type
           1983:5837:E:NEW
    If we receive any duplicate message again, XI sends the response back to the webservice as "DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a".
    It is working correctly. The only problem is when XI is down, or when a communication failure happens in the middle of processing, as in the example below.
    A recent sample failure: a webservice call failed three times, for the following reasons.
    First time:
    "FAILED TO INVOKE WEB SERVICE OPERATION OS_CWUSales. Error receiving Web Service Response: Fatal Error: csnet read operation failed (No such file or directory) (11773)"
    Second time:
    MessageExpiredException: Message c9237200-0c69-2a80-dd11-79d5b47b213a(OUTBOUND) expired.
    Third time:
    "DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a"
    If you observe, on the second call the MsgID got registered, but because the server was down (or for some other reason) the message could not be processed any further. So the MsgID was registered without the message ever being processed. When they retried the same call a third time, they got the "DUPLICATE GUID" error.
    DUPLICATE GUID implies that the message has been processed and the records have been updated in the backend system, which has not actually happened here.
    The final result:
    The status in the webservice shows as "it has been updated in the receiving system", since the response indicates a duplicate GUID.
    - But it has not been updated in the backend system, which is the problem.
    Firstly, are there any suggestions on how to solve this problem?
    Is there a better way to handle this duplicate check instead of the MsgID?
    Please help me in solving this.
    Thanks & Regards,
    Deepthi.
    Edited by: deepthi reddy on Jan 7, 2009 2:07 AM

    >> My suggestion: You can have a webservice - BW synch-synch scenario without BPM. The sender SOAP adapter sends a synchronous request message, it gets mapped to the BW request, the response comes back from BW, and then you map it to the webservice response and send it to the webservice, without any receiver SOAP adapter.
    Thanks for the suggestion. Looks like a good idea.
    >> Regarding the problem of the duplicate check: when your BW system gets the request message, it processes it and then sends a response message. In this response message, have a STATUS field with values S for success and E for error. Send this response to the webservice and store it in the database for that request MsgID. Then, in the webservice application, check the response code for a MsgID, and only if it is E (error) resend the message to BW.
    Initially they planned it that same way. But sometimes the response comes back from BW very late, so now, once the request reaches XI, they immediately send the response back from XI itself with a hardcoded "OK" status; they do not wait for the BAPI response from BW.
    So if the message is successful, the status goes back as "OK". If the message is not successful, the status goes back blank, indicating a problem on the other side. Then they check the SOAP fault message, which will have errors like
    "DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a"
    "FAILED TO INVOKE WEB SERVICE OPERATION"
    Right now the only issues are the duplicate data and the response time. By making use of the MsgID they solved the duplicate issue, but because of it we see many messages failing daily with this DUPLICATE error in production, which is hampering performance.
    So we are thinking of getting rid of this MsgID method and the BPM. From your first answer I think we can achieve that without BPM; after that I need to check how fast the response goes back to the webservice.
    -Deepthi.

  • Avoiding duplicate records in report

    Hi All,
    I have a scenario where a delivery document gets created in R/3, say on 7/1, with Act GI date "#" and all key figures "0". This gets loaded into BI.
    On 7/5 the delivery is goods-issued (PGI), and the status in R/3 changes to Act GI date 7/5 with a quantity of 100. When this is loaded into BI, it gets published as duplicate records, i.e.:

    Del doc   Created date   Act GI   Del. Ind   Qty
    12345     1-Jul          #        #          0
    12345     1-Jul          5-Jul    #          100

    Please note that the data is loaded from a DSO into an InfoCube, and the DSO is in overwrite mode.
    Any suggestions to overcome this problem?

    Is Act GI date a key field in the DSO?
    If yes, the data will not be overwritten, and two records will be loaded into the cube.
    Make Act GI date a data field instead; this will result in only one record (12345  1-Jul  5-Jul  #  100), since the key field values are then the same.
    First, make sure this is right for all business scenarios.

  • BDC program - avoid duplicate records

    Hello Experts,
    I am doing a BDC, and my logic is not working.
    My requirement: if company code, purchasing org., account group and NAME1 are the same as in the database, i.e. if a vendor with these same 4 fields already exists in the database as in the Excel file, that particular record should not get uploaded.
    it_success is the internal table with the records to be uploaded after some validations.
    Please check where I am going wrong:
    loop at i_lfa1.
      loop at i_lfb1 where lifnr = i_lfa1-lifnr.
        loop at i_lfm1 where lifnr = i_lfb1-lifnr.
          main_table-lifnr = i_lfa1-lifnr.
          main_table-bukrs = i_lfb1-bukrs.
          main_table-ekorg = i_lfm1-ekorg.
          main_table-ktokk = i_lfa1-ktokk.
          main_table-name1 = i_lfa1-name1.
          append main_table.
        endloop.
      endloop.
    endloop.

    loop at it_success.
      loop at main_table where ( bukrs = it_success-bukrs_001
                             and ekorg = it_success-ekorg_002
                             and ktokk = it_success-ktokk_003
                             and name1 = it_success-name1_006 ).
        move-corresponding it_success to it_error.
        append it_error.
        delete main_table.
        if sy-subrc eq 0.
          delete it_success.
          e_fret-type = 'E'.
          e_fret-name = it_error-name1_006.
          e_fret-message = 'Vendor Name already exists'.
          append e_fret.
        endif.
      endloop.
    endloop.
    Ravi
    Edited by: Julius Bussche on Oct 1, 2008 9:30 AM
    Please use meaningful subject titles.

    Hi,

    loop at i_lfa1.
      loop at i_lfb1 where lifnr = i_lfa1-lifnr.
        read table i_lfm1 with key lifnr = i_lfb1-lifnr.
        if sy-subrc = 0.
          main_table-ekorg = i_lfm1-ekorg.
        endif.
        main_table-lifnr = i_lfa1-lifnr.
        main_table-bukrs = i_lfb1-bukrs.
        main_table-ktokk = i_lfa1-ktokk.
        main_table-name1 = i_lfa1-name1.
        append main_table.
      endloop.
    endloop.

    Try the above code.
    Thanks,
    Durai.V

  • BI statistics, Duplicate records in ST03N

    Hi Friends,
    We have activated BI statistics, and now we are checking query performance in ST03N.
    In the Reporting Analysis view I can monitor the query accesses. The problem is that this view shows the statistical data saved in both 0TCT_C02 (the InfoCube) and 0TCT_VC02 (the virtual cube), so the entries that the view displays are duplicated.
    How can I solve this?
    Thanks in advance!

    Hi,
    Please implement the OSS Note:
    1401235: Avoid Duplicate records and handling of Virtual Cube call
    -Vikram

  • Avoid duplicate standard recipe qty

    Dear All,
           I have found one query when i am making one report. In C203 t.code we can see product receipe. Generally receipe group is only one for one product but in some products i have found two receipe group like 5....100 & 5...200 and it is ok and it happens.
    Now i need to fetch standard qty for input materials vs process order qty for input materials. so currently i can fetch two receipe group like 0001...820 for one receipe group and 0001...820 for second receipe group but i need only one receipe group qty. currently it seems double standard qty against process order qty because BOM no(STLNR) is same for both receipe group.
    I can also see in COR3 t.code in master data tab, there is defined particular receipe group like 5...100. and this effect we see in AFKO table. But mainly i need std.qty of receipe so i have found STAS,STKO and STPO table.In STPO table i can see std.qty of input materials and in STKO we can see Product no and its batch size.  STLAL field in STAS table and also in STKO but noy in STPO for linking purpose. Now in STPO i can see like,
    STLNR        IDNRK           Qty 
    00000639   0001...820    50
    00000639   0001...820    50
    In my report std.qty comes 100 but i want 50 qty because i have not ound any link to filter one BOM no.(STLNR).
    Is there any other tables that i can search or what to do.
    Regards,
    Shivam.

    Hi Shivam,
    You can use the DELETE ADJACENT DUPLICATES syntax to remove duplicate records from an internal table:

    STLNR      IDNRK          Qty
    00000639   0001...820     50
    00000639   0001...820     50

    sort itab by stlnr idnrk.
    delete adjacent duplicates from itab comparing stlnr idnrk.

    Regards,
    Mohammed Rasul.S

  • Avoid displaying duplicate records?

    I have the following query to display records from two tables:

    select *
      from collateral cl, collateral_ref cf
     where cf.account_number like '1000000001%'
       and cf.collateral_id = cl.collateral_id

    How do I change this query to avoid duplicate collateral_ids?
    collateral_id is the primary key on the COLLATERAL table and a foreign key on the COLLATERAL_REF table.

    Thanks isotope for your reply.
    I just ran the query and I get ORA-00904: "cl"."collateral_id": invalid identifier.
    Here is my table structure:

    Collateral (collateral_id is the primary key):
    Collateral_id   Collateral_type   Description    Serial_Number ..
    1               Title             97 Accord      111111
    32              Mortgage          122 Grand Av
    26              Title             2008 BMW       222222

    Collateral_Ref (collateral_id is a foreign key):
    Collateral_id   Account_Number            Description ..
    1               100000001 111111111       97 Accord
    1               100000001 222222222       97 Accord
    32              100000001 444444444       122 Grand Av
    32              100000001 333333333       122 Grand Av
    32              100000001 999999999       122 Grand Av
    26              200000001 222222222       2008 BMW

    So my query currently returns all the duplicates (1 twice and 32 three times).
    The expected result is:

    Collateral_id   Collateral_type   Description ..
    1               Title             97 Accord
    32              Mortgage          122 Grand Av
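
    No final answer appears in the thread. One way to get a single row per collateral (a sketch reusing the posted table and column names) is to test for a matching reference row with EXISTS instead of joining, so the COLLATERAL row can never be multiplied:

    -- one row per collateral that has at least one matching reference
    select cl.*
      from collateral cl
     where exists (select 1
                     from collateral_ref cf
                    where cf.collateral_id = cl.collateral_id
                      and cf.account_number like '1000000001%');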

  • How to avoid retrieving duplicate records from SalesLogix

    I wanted to know if you could assist me. I am now responsible for reporting our inside sales activities, which includes (each month) outbound calls made, opportunities created, opportunities won $, etc. We use SalesLogix as our tool. I have been working with Business Objects, exporting this information from SalesLogix, and have pretty much created the report I need. The only problem I have is that it pulls in duplicate records with the same opportunity ID number, because my query is based on "campaign codes" attached to SLX opportunities. When an opportunity is created in SLX, it automatically assigns an opportunity ID (ex: OQF8AA008YQB), which is distinct. However, when we attach more than one "campaign code" to the opportunity, it pulls in that opportunity ID that many more times.
    Is there a way to filter, or to retrieve only one record per ID number regardless of how many campaign codes are attached? All the information attached to the opportunity is the same, except that the "campaign code" differs, which makes it two records since I pull by "campaign code".
    My greatest appreciation!

    Hi,
    If you have CAMPAIGN CODE in your query and display it in your report, then it will definitely show multiple rows per OPPORTUNITY ID, one for each CAMPAIGN CODE attached.
    If you would like just one row per OPPORTUNITY ID, then you will need to remove CAMPAIGN CODE from your report.
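
    If you can query the data source directly instead, the same effect can be had in SQL (a sketch; the table and column names here are assumptions, not SalesLogix's actual schema) by aggregating the campaign codes so each opportunity collapses to one row:

    -- one row per opportunity; keep an arbitrary campaign code
    SELECT opportunity_id,
           MIN(campaign_code) AS campaign_code
      FROM opportunity_campaigns
     GROUP BY opportunity_id;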

  • Duplicate records problem

    Hi everyone,
    I'm having a a little difficulty resolving a problem with a repeating field causing duplication of data in a report I'm working on, and was hoping someone on here can suggest something to help!
    My report is designed to detail library issues during a particular period, categorised by the language of the item issued. My problem is that on the sql database that out library management system uses, it is possible for an item to have more than one language listed against it (some books will be in more than one language). When I list the loan records excluding the language data field, I get a list of distinct loan records. Bringing the language data into the report causes the loan record to repeat for each language associated with it, so if a book is both in English and French, it will cause the loan record to appear like this:
    LOAN RECORD NO.     LANGUAGE CODE
      123456                             ENG
      123456                             FRE
    So, although the loan only occurred once I have two instances of it in my report.
    I am only interested in the language that appears first and I can exclude duplicated records from the report page. I can also count only the distinct records to get an accurate overall total. My problem is that when I group the loan records by language code (I really need to do this as there are millions of loan records held in the database) the distinct count stops being a solution, as when placed at this group level it only excludes duplicates in the respective group level it's placed in. So my report would display something like this:
    ENG     1
    FRE      1
    A distinct count of the whole report would give the correct total of 1, but a cumulative total of the figures calculated at the language code group level would total 2, and be incorrect. I've encountered similar results when using Running Totals evaluating on a formula that excludes repeated loan record no.s from the count, but again when I group on the language code this goes out of the window.
    I need to find a way of grouping the loan records by language with a total count of loan records alongside each grouping that accurately reflects how many loans of that language took place.
    Is this possible using a calculation formula when there are repeating fields, or do I need to find a way of merging the repeating language fields into one field so that the report would appear like:
    LOAN RECORD     LANGUAGE CODE
      123456                      ENG, FRE
    Any suggestions would be greatly appreciated, as aside from this repeating language data there are quite a few other repeating database fields on the system that it would be nice to report on!
    Thanks!

    If you create a group by loan, then a group by language, and place the values in the group (loan id in the loan group header), you should only see the loan id once. Place the language in the language group and you should also see it only once; a group header returns the first value of a unique id.
    Then, to calculate while avoiding the duplicates, use manual running totals. Create a set for each summary you want, and make sure each set has a different variable name.

    MANUAL RUNNING TOTALS

    RESET
    The reset formula is placed in a group header (or report header) to reset the summary to zero for each unique record it groups by:

    whileprintingrecords;
    numbervar x := 0;

    CALCULATION
    The calculation is placed adjacent to the field or formula being calculated. (If there are duplicate values, create a group on the field being calculated on; if there are no duplicate records, the detail section is used.)

    whileprintingrecords;
    numbervar x := x + 1; // or x + {your field or formula}

    DISPLAY
    The display is the sum of what is being calculated. It is placed in a group, page, or report footer (generally the footer of the group whose header holds the reset):

    whileprintingrecords;
    numbervar x;
    x

  • How to delete duplicate records in a query report

    Hi Experts,
    I have created an InfoSet and a query in my SAP system, but I want to delete some duplicate records before the list output. Can we add some code in the Extras coding section to delete the duplicates, and how would I do it? Could you please give me a simple brief?
    Joe

    Hi,
    You can try restricting the characteristic to the values that give the correct result in the filter area of the Query Designer.
    But I would still suggest that you not keep the duplicate records in the cube, as they are not part of your requirement and are giving you wrong results. Reload the correct records into the cube in order to avoid such problems in the future.
    Regards,
    Amit
