Duplicate Record Search

I need help with the SQL Statement below.
SELECT DISTINCT RISK_CODE, LOB, COMPANY, STATE, RISK_CODE_DESCR, RISK_CODE_QUESTION_TEXT, EXPECTED_RESULT
FROM RISK_CODE
WHERE RISK_CODE_QUESTION_TEXT IS NOT NULL
AND ASK_INDICATOR = 'Y'
AND LOB = 'P1'
AND ((COMPANY = '' OR COMPANY IS NULL)
OR (STATE = '' OR STATE IS NULL))
I want to return the values (risk code questions) both when there is a company number and when there isn't, and the same for state. The problem I run into is that when I do have a company number (e.g. 077) and a state (e.g. 04), I simply get duplicate values back, and I wanted to know if there was a way to avoid that.
Not only do I need the values from a specific record (with Company and State), but I also need the records that are not company and state specific. A sample of the data is below.
RISK_CODE - LOB - COMPANY - STATE - RISK_CODE_DESCR
74 - P1 - 077 - 04 - DESCRIPTION1
74 - P1 - - - DESCRIPTION1
01 - P1 - 077 - 04 - DESCRIPTION2
01 - P1 - - - DESCRIPTION2
02 - P1 - - - DESCRIPTION3
TY - P1 - - - DESCRIPTION4
U7 - P1 - 077 - 04 - DESCRIPTION5
I don't know if this helps or not, but if I don't pass in the state and company I would need risk codes 74, 01, 02, and TY. But if I do pass the state and company I would need 74, 01, 02, TY, and U7. I don't want duplicate records back, and I don't want to have to create a huge stored procedure for this.

What duplicate values are you getting? From looking at your data I don't think you're getting duplicate values. What you are getting is the values for RISK_CODE when there is a state populated and when there isn't. This is of course exactly what you've asked for.
Presumably the RISK_CODE_DESCR are different and that's why your DISTINCT isn't filtering them.
What you need to do is something like this:
SELECT RISK_CODE, LOB, COMPANY, STATE, RISK_CODE_DESCR, RISK_CODE_QUESTION_TEXT, EXPECTED_RESULT
FROM RISK_CODE
WHERE RISK_CODE_QUESTION_TEXT IS NOT NULL
AND ASK_INDICATOR = 'Y'
AND LOB = 'P1'
AND ( COMPANY = '&&IN_CO' OR
      STATE = '&&IN_STATE')
UNION
SELECT RISK_CODE, LOB, COMPANY, STATE, RISK_CODE_DESCR, RISK_CODE_QUESTION_TEXT, EXPECTED_RESULT
FROM RISK_CODE RC1
WHERE RISK_CODE_QUESTION_TEXT IS NOT NULL
AND ASK_INDICATOR = 'Y'
AND LOB = 'P1'
AND COMPANY IS NULL
AND STATE IS NULL
AND NOT EXISTS (SELECT 1
                FROM RISK_CODE RC2
                WHERE RC2.RISK_CODE = RC1.RISK_CODE
                AND RC2.RISK_CODE_QUESTION_TEXT IS NOT NULL
                AND RC2.ASK_INDICATOR = 'Y'
                AND RC2.LOB = 'P1'
                AND ( RC2.COMPANY = '&&IN_CO' OR
                      RC2.STATE = '&&IN_STATE'))
Mind you, I'm not saying it's pretty or fast.
Cheers, APC
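For illustration, here is a small, self-contained sketch of that specific-overrides-generic pattern, using SQLite through Python (the table layout and sample rows follow the thread; the :co and :state bind parameters stand in for the && substitution variables, and the subquery is correlated on RISK_CODE so generic rows survive whenever no company/state-specific row exists for the same code):

```python
import sqlite3

# In-memory table seeded with the sample data from the question.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE RISK_CODE (
    RISK_CODE TEXT, LOB TEXT, COMPANY TEXT, STATE TEXT,
    RISK_CODE_DESCR TEXT, RISK_CODE_QUESTION_TEXT TEXT,
    ASK_INDICATOR TEXT, EXPECTED_RESULT TEXT);
INSERT INTO RISK_CODE VALUES
 ('74','P1','077','04','DESCRIPTION1','Q1','Y',NULL),
 ('74','P1',NULL ,NULL,'DESCRIPTION1','Q1','Y',NULL),
 ('01','P1','077','04','DESCRIPTION2','Q2','Y',NULL),
 ('01','P1',NULL ,NULL,'DESCRIPTION2','Q2','Y',NULL),
 ('02','P1',NULL ,NULL,'DESCRIPTION3','Q3','Y',NULL),
 ('TY','P1',NULL ,NULL,'DESCRIPTION4','Q4','Y',NULL),
 ('U7','P1','077','04','DESCRIPTION5','Q5','Y',NULL);
""")

SQL = """
SELECT RISK_CODE FROM RISK_CODE RC1
WHERE RC1.RISK_CODE_QUESTION_TEXT IS NOT NULL
  AND RC1.ASK_INDICATOR = 'Y' AND RC1.LOB = 'P1'
  AND (RC1.COMPANY = :co OR RC1.STATE = :state)
UNION
SELECT RISK_CODE FROM RISK_CODE RC1
WHERE RC1.RISK_CODE_QUESTION_TEXT IS NOT NULL
  AND RC1.ASK_INDICATOR = 'Y' AND RC1.LOB = 'P1'
  AND RC1.COMPANY IS NULL AND RC1.STATE IS NULL
  -- keep a generic row only if no specific row overrides its risk code
  AND NOT EXISTS (SELECT 1 FROM RISK_CODE RC2
                  WHERE RC2.RISK_CODE = RC1.RISK_CODE
                    AND RC2.LOB = RC1.LOB
                    AND (RC2.COMPANY = :co OR RC2.STATE = :state))
ORDER BY RISK_CODE
"""
rows = [r[0] for r in conn.execute(SQL, {"co": "077", "state": "04"})]
print(rows)  # ['01', '02', '74', 'TY', 'U7']
```

With company 077 and state 04 this yields exactly one row per risk code: the specific rows for 74, 01, U7 plus the generic-only rows 02 and TY.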

Similar Messages

  • Searching for Duplicates records in Qualified Lookup tables

    Hi SDNers,
    I would like to know,
    How can I find out how many duplicate records there are in Qualified Lookup Tables?
    In Free-form search or Drill-down search, I select the particular Qualified Lookup table. After that, how can I find the duplicated records?
    Any Solution?
    Thanks
    Ravi

    Hi,
    If you want to find the duplicates present in a qualified table, you can go to the qualified table in Data Manager and inspect the duplicates there. Else, if you want to find the duplicate links to a record in the main table, you can write an expression in free-form search to get the count of duplicate links, but that would be specific to one link value only. E.g. if Country is your non-qualifier, then in free-form search you can write an expression to get how many links there are for USA.
    Hope this clarifies your doubt...if not, please elaborate on the requirement.
    Regards,
    Arafat.

  • Data loader : Import -- creating duplicate records ?

    Hi all,
    has anyone else encountered the behaviour with Oracle Data Loader that duplicate records are created (even if I set the option duplicatecheckoption=externalid)? When I check the "import request queue - view", the request parameters of the job look fine:
    Duplicate Checking Method == External Unique ID
    Action Taken if Duplicate Found == Overwrite Existing Records
    but Data Loader has created new records where the "External Unique ID" already exists.
    Very strange: when I create the import manually (using the Import Wizard), exactly the same import works correctly! There the duplicate checking method works and the record is updated.
    I know Data Loader has 2 methods, one for update and the other for import; however, I did not expect the import to create duplicates if the record already exists, rather than doing nothing!
    Is anyone else experiencing the same? I hope this is not expected behaviour! By the way, the "Update" method works fine.
    thanks in advance, Juergen
    Edited by: 791265 on 27.08.2010 07:25
    Edited by: 791265 on 27.08.2010 07:26

    Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load first, before a full load, which is a best practice for data import that we recommend in our documentation and courses.
    Sorry also to inform you that this is expected behavior --- Data Loader does not check for duplicates when inserting (aka importing). It only checks for duplicates when updating (aka overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and in the Data Import Options Overview document.
    You should review all documentation on Oracle Data Loader On Demand before using it.
    These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
    Pete

  • Duplicate records in exported data

    I'm trying to export the inventory data with a wildcard (%) filter on the
    Workstation Name.
    If I run the same filter in a query from ConsoleOne, I don't see any
    duplicate records.
    If I run the data export, the exported data for some workstations will have
    a duplicate DN. Just the DN is duplicated, all the other fields are either
    empty or have some default value.
    I have also run the manual duplicate removal process and have tried
    deleting the records altogether using the InventoryRemoval service.
    Any other ideas?

    Dlee,
    It appears that in the past few days you have not received a response to your
    posting. That concerns us, and has triggered this automated reply.
    Has your problem been resolved? If not, you might try one of the following options:
    - Do a search of our knowledgebase at http://support.novell.com/search/kb_index.jsp
    - Check all of the other support tools and options available at
    http://support.novell.com.
    - You could also try posting your message again. Make sure it is posted in the
    correct newsgroup. (http://support.novell.com/forums)
    Be sure to read the forum FAQ about what to expect in the way of responses:
    http://support.novell.com/forums/faq_general.html
    If this is a reply to a duplicate posting, please ignore and accept our apologies
    and rest assured we will issue a stern reprimand to our posting bot.
    Good luck!
    Your Novell Product Support Forums Team
    http://support.novell.com/forums/

  • Duplicate Records in Details for ECC data source. Help.

    Hello. First post on SDN. I have been searching prior posts, but have come up empty. I am in the middle of creating a report linking directly into 4 tables in ECC 6.0. I am having trouble either getting the table links set up correctly, or filtering out duplicate record sets that are being reported in the details section of my report. It appears that I have 119 records being displayed, when the parameter values should only yield 7. The details section is repeating the 7 records 17 times (there are 17 matching records for the parameter choices in one of the other tables, which I think is the cause).
    I think this is due to the other table links for my parameter values. But, I need to keep the links the way they are for other aspects of the report (header information). The tables in question are using an Inner Join, Enforced Both, =. I tried the other link options, with no luck.
    I am unable to use the "Select Distinct Records" option in the Database menu since this is not supported when connecting to ECC.
    Any ideas would be greatly appreciated.
    Thanks,
    Barret
    PS. I come from more of a Functional background, so development is sort of new to me. Take it easy on the newbie.

    If you can't establish links to bring back unique data then use a group to display the data.
    Group the report by a field which is the lowest common denominator.
    Move all fields into the group footer and suppress the group header and details.
    You will not be able to use normal summaries as they will count/sum all the duplicated data; use Running Totals instead and select evaluate on change of the introduced group.
    Ian

  • Duplicate records in TABLE CONTROL

    Hi folks,
    I am doing a module pool where my internal table (itab) data comes into a table control (ctrl). I select one record in the table control and then press a REFRESH push button.
    After pressing the refresh button, some new records come into that same internal table. I then need to display the modified internal table (with the new records added) in the table control.
    The modified internal table data reaches the table control, but at the end of the table control some records are repeated.
    Before it reaches the table control, I checked the modified itab; it contains the correct data, i.e. 15 records (previously I had 5 records; after the REFRESH button 10 more records were added). But when this table comes into the table control, it contains some 100 records. I should get only 15 records.
    Why are these records repeating? How can I delete the duplicate records from the table control?
    Please suggest where I am making a mistake.
    A correct answer will be rewarded.
    Thanks & Regards

    Hi ,
    Thanks for your help, but I should not refresh the internal table, as some records are already present. After pressing the REFRESH button, some new records are appended to this existing table. Then I display both the previous records and the new records.
    I checked the internal table after modification; it contains the actual number of records. But after reaching the table control, more records appear.
    Is this a problem with scrolling, or what?
    Please suggest where I am making a mistake. My code is below.
    PROCESS BEFORE OUTPUT.
    MODULE STATUS_0200.
    module tc_shelf_change_tc_attr.
    loop at object_tab1
           with control tablctrl
           cursor tablctrl-current_line.
        module tc_shelf_get_lines.
      endloop.
    PROCESS AFTER INPUT.
    module set_exit AT EXIT-COMMAND.
       loop at object_tab1.
             chain.
              field: object_tab1-prueflos,
                     object_tab1-matnr.
               module shelf_modify on chain-request.
             endchain.
            field object_tab1-idx
             module shelf_mark on request.
                   endloop.
    module shelf_user_command.
    module user_command_0200.
    ***INCLUDE Y_RQEEAL10_STATUS_0200O01 .
    *&      Module  STATUS_0200  OUTPUT
    *       text
    MODULE STATUS_0200 OUTPUT.
      SET PF-STATUS 'MAIN'.
    SET TITLEBAR 'xxx'.
    ENDMODULE.                 " STATUS_0200  OUTPUT
    *&      Module  tc_shelf_change_tc_attr  OUTPUT
    *       text
    MODULE tc_shelf_change_tc_attr OUTPUT.
    delete adjacent duplicates from object_tab1 comparing prueflos matnr.
    describe table object_tab1 lines tablctrl-lines.
    ENDMODULE.                 " tc_shelf_change_tc_attr  OUTPUT
    *&      Module  tc_shelf_get_lines  OUTPUT
    *       text
    MODULE tc_shelf_get_lines OUTPUT.
    data:  g_tc_shelf_lines  like sy-loopc.
    if tablctrl-current_line > tablctrl-lines.
    stop.
    endif.
    g_tc_tablctrl_lines = sy-loopc.
    *refresh control tablctrl from screen 0200.
    ENDMODULE.                 " tc_shelf_get_lines  OUTPUT
    ***INCLUDE Y_RQEEAL10_SHELF_MODIFYI01 .
    *&      Module  shelf_modify  INPUT
    *       text
    MODULE shelf_modify INPUT.
    modify object_tab1
        index tablctrl-current_line.
    ENDMODULE.                 " shelf_modify  INPUT
    *&      Module  set_exit  INPUT
    *       text
    module set_exit INPUT.
    leave program.
    endmodule.                 " set_exit  INPUT
    *&      Module  shelf_mark  INPUT
    *       text
    MODULE shelf_mark INPUT.
    data: g_shelf_wa2 like line of object_tab1.
      if tablctrl-line_sel_mode = 1
      and object_tab1-idx = 'X'.
        loop at object_tab1 into g_shelf_wa2
          where idx = 'X'.
          g_shelf_wa2-idx = ''.
          modify object_tab1
            from g_shelf_wa2
            transporting idx.
        endloop.
      endif.
      modify object_tab1
        index tablctrl-current_line
        transporting idx plnty plnnr plnal.
    ENDMODULE.                 " shelf_mark  INPUT
    *&      Module  shelf_user_command  INPUT
    *       text
    MODULE shelf_user_command INPUT.
    ok_code = sy-ucomm.
      perform user_ok_tc using    'TABLCTRL'
                                  'OBJECT_TAB1'
                         changing ok_code.
      sy-ucomm = ok_code.
    ENDMODULE.                 " shelf_user_command  INPUT
    *&      Module  user_command_0100  INPUT
    *       text
    MODULE user_command_0200 INPUT.
    data:v_line(3).
    case OK_CODE.
    when 'LAST'.
    read table object_tab1 with key idx = 'X'.
    if sy-subrc = 0.
    select * from qals
                          where enstehdat <= object_tab1-enstehdat
                          and   plnty ne space
                          and   plnnr ne space
                          and   plnal ne space.
    if sy-dbcnt > 0.
    if qals-enstehdat = object_tab1-enstehdat.
       check qals-entstezeit < object_tab1-entstezeit.
       move-corresponding qals to object_tab2.
       append object_tab2.
       else.
       move-corresponding qals to object_tab2.
       append object_tab2.
       endif.
         endif.
            endselect.
       sort object_tab2 by enstehdat entstezeit descending.
    loop at object_tab2 to 25.
      if not object_tab2-prueflos is initial.
    append object_tab2 to object_tab1.
      endif.
      clear object_tab2.
    endloop.
      endif.
    when 'SAVE'.
    loop at object_tab1 where idx = 'X'.
      if ( not object_tab1-plnty is initial and
                    not object_tab1-plnnr is initial and
                               not object_tab1-plnal is initial ).
       select single * from qals into corresponding fields of wa_qals
       where prueflos = object_tab1-prueflos.
          if sy-subrc = 0.
           wa_qals-plnty = object_tab1-plnty.
           wa_qals-plnnr = object_tab1-plnnr.
           wa_qals-plnal = object_tab1-plnal.
    update qals from wa_qals.
      if sy-subrc <> 0.
    Message E001 with 'plan is not assigned to lot in sap(updation)'.
    else.
    v_line = tablctrl-current_line - ( tablctrl-current_line - 1 ).
    delete object_tab1.
    endif.
       endif.
          endif.
               endloop.
    when 'BACK'.
    leave program.
    when 'NEXT'.
    call screen 300.
    ENDCASE.
    ***INCLUDE Y_RQEEAL10_USER_OK_TCF01 .
    *&      Form  user_ok_tc
    *       text
    *      -->P_0078   text
    *      -->P_0079   text
    *      <--P_OK_CODE  text
    form user_ok_tc  using    p_tc_name type dynfnam
                              p_table_name
                     changing p_ok_code like sy-ucomm.
       data: l_ok              type sy-ucomm,
             l_offset          type i.
       search p_ok_code for p_tc_name.
       if sy-subrc <> 0.
         exit.
       endif.
       l_offset = strlen( p_tc_name ) + 1.
       l_ok = p_ok_code+l_offset.
       case l_ok.
         when 'P--' or                     "top of list
              'P-'  or                     "previous page
              'P+'  or                     "next page
              'P++'.                       "bottom of list
           perform compute_scrolling_in_tc using p_tc_name
                                                 l_ok.
           clear p_ok_code.
       endcase.
    endform.                    " user_ok_tc
    *&      Form  compute_scrolling_in_tc
    *       text
    *      -->P_P_TC_NAME  text
    *      -->P_L_OK  text
    form compute_scrolling_in_tc using    p_tc_name
                                           p_ok_code.
       data l_tc_new_top_line     type i.
       data l_tc_name             like feld-name.
       data l_tc_lines_name       like feld-name.
       data l_tc_field_name       like feld-name.
       field-symbols <tc>         type cxtab_control.
       field-symbols <lines>      type i.
       assign (p_tc_name) to <tc>.
       concatenate 'G_' p_tc_name '_LINES' into l_tc_lines_name.
       assign (l_tc_lines_name) to <lines>.
       if <tc>-lines = 0.
         l_tc_new_top_line = 1.
       else.
         call function 'SCROLLING_IN_TABLE'
           exporting
             entry_act      = <tc>-top_line
             entry_from     = 1
             entry_to       = <tc>-lines
             last_page_full = 'X'
             loops          = <lines>
             ok_code        = p_ok_code
             overlapping    = 'X'
           importing
             entry_new      = l_tc_new_top_line
           exceptions
             others         = 0.
       endif.
       get cursor field l_tc_field_name
                  area  l_tc_name.
       if syst-subrc = 0.
         if l_tc_name = p_tc_name.
           set cursor field l_tc_field_name line 1.
         endif.
       endif.
       <tc>-top_line = l_tc_new_top_line.
    endform.                              " COMPUTE_SCROLLING_IN_TC
    Thanks

  • Loacate and remove duplicate records in infocube.

    Hi!!
    we have found that the infocube 0PUR_C01 contains duplicate records for the month of April 2008; approx. 1.5 lakh records were extracted to this infocube, and a similar situation may occur in subsequent months.
    How do I locate these records and remove them from the infocube?
    How do I ensure that duplicate records are not extracted into the infocube?
    All answers/ links are welcome!!
    Yours Truly
    K Sengupto

    First :
    1. How do I locate duplicate records in an InfoCube, other than downloading all the records to an Excel file and using Excel functionality to locate them?
    This is not possible, since a duplicate record as such would not exist: the records are sent to a cube with + and - signs to summarize the data accordingly.
    Your search for duplicate data would therefore become that much more troublesome.
    If you have a DSO to load it from, delete the data for that month and reload if possible; this would be quicker and cleaner as opposed to removing duplicate records.
    If you had
    ABC|100 in your DSO and it got doubled
    it would be
    ABC|+100
    ABC|+100
    against different requests in the cube - and added to this will be your correct deltas as well.

  • Duplicate Records in Transactional Load

    Dear All,
    I have an issue where data is being loaded from one Write-Optimized DSO to another Write-Optimized DSO, and the DTP is failing because of duplicate records. It is a transactional load.
    I would be grateful if you could help me understand how to handle this situation and reload the DTP.
    I have searched the forum, but I can only find threads about master data loading, where one can select Handling Duplicate Records.
    Thanks in Advance...
    Regards,
    Syed

    Hi Ravi,
    Thanks for your reply.
    If we uncheck the option, it would accept the duplicate records, right?
    In my scenario, data is coming from one Write-Optimized DSO to another. In the first DSO data uniqueness is not checked, and in the second DSO uniqueness is checked, so it is giving me the duplicate error message.
    I saw around 28 records in the Error Stack. So please let me know how I can process these error (duplicate) records as well.
    Many Thanks...
    Regards,
    Syed

  • Getting duplicate records in cube from each data packet.

    Hi Guys,
    I am using BI version 3.x. I am getting duplicate records in the cube. For deleting these duplicate records I have written code (a start routine), but it still gives the same result.
    The duplication occurs depending on the number of packets.
    E.g. if the number of packets is 2, I get 2 duplicate records.
    If the number of packets is 7, I get 7 duplicate records.
    How can I modify my code so that it fetches only one record, eliminating the duplicates? Any other solution is also welcome.
    Thanks in advance.

    Hi  Andreas, Mayank.
      Thanks for your reply.
    I created my own DSO, but it gives an error. I tried with the standard DSO too, and it still gives the same error: could not activate.
    The error gives the name of function module RSB1_OLTPSOURCE_GENERATE.
    I searched in R/3 but could not find it.
    Even when I tried creating a DSO on a trial basis, it gave the same problem.
    I think the problem is on the BASIS side.
    Please help if you have any idea.
    Thanks.

  • Start routine to filter the duplicate records

    Dear Experts
    I have two questions regarding the start routine.
    1) I have a characteristic InfoObject with transactional InfoSource. Often the 'duplicate records' error happens during the data loading. I'm trying to put a start routine in the update rule to filter out the duplicate records. 
    After searching the SDN forum and SAPHelp, I use the code as:
    DELETE ADJACENT DUPLICATES FROM DATA_PACKAGE COMPARING KEY1 KEY2 KEY3.
    In my case, the InfoObject has 3 keys: SOURSYSTEM, /BIC/InfoObjectname, OBJVERS. My code is:
    DELETE ADJACENT DUPLICATES FROM DATA_PACKAGE COMPARING SOURSYSTEM /BIC/InfoObjectname OBJVERS.
    When checking the code I got the message: 'E: No component exists with the name "OBJVERS".' So I only included the first 2 keys, but the routine does not work; the duplicate error still happens. What is missing in this start routine?
    2) Generally, for a start routine, do I really need to include the data declaration, ITAB or WA, SELECT statement etc.?
    Do I have to use the statement below or just simply one line?
    LOOP AT DATA_PACKAGE.
    IF DATA_PACKAGE.....
    ENDIF.
    ENDLOOP.
    Thanks for your help in advance, Jessica
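    For reference, ABAP's DELETE ADJACENT DUPLICATES only collapses rows that sit next to each other, which is why a SORT by the same comparison keys has to come first. A rough Python sketch of the semantics (the field names here are illustrative, not the actual InfoObject keys):

```python
def delete_adjacent_duplicates(rows, keys):
    """Mimic ABAP's DELETE ADJACENT DUPLICATES ... COMPARING keys:
    only consecutive rows with equal key values are collapsed."""
    out = []
    for row in rows:
        k = tuple(row[f] for f in keys)
        if not out or tuple(out[-1][f] for f in keys) != k:
            out.append(row)
    return out

pkg = [{"SOURSYSTEM": "R3", "OBJ": "A"},
       {"SOURSYSTEM": "R3", "OBJ": "B"},
       {"SOURSYSTEM": "R3", "OBJ": "A"}]

# Unsorted: the two 'A' rows are not adjacent, so nothing is removed.
print(len(delete_adjacent_duplicates(pkg, ("SOURSYSTEM", "OBJ"))))  # 3

# Sorted first (as in ABAP: SORT itab BY keys), the duplicate collapses.
pkg.sort(key=lambda r: (r["SOURSYSTEM"], r["OBJ"]))
print(len(delete_adjacent_duplicates(pkg, ("SOURSYSTEM", "OBJ"))))  # 2
```

    Note also that this only deduplicates within one DATA_PACKAGE; duplicates spread across packages need extra handling.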

    Hello Jessica,
    if it won't be possible for you to get unique data from the very beginning, there is still another way to manage this problem in a start routine.
    The SORT ... and DELETE ADJACENT ... must remain. Further, build up an internal table of the type of DATA_PACKAGE, but defined with STATICS instead of DATA. This i-tab stays alive across all data packages of one load. Fill it with the data of the transferred data packages, and delete from every new data package all records which are already in the STATICS i-tab. Alternatively you could do the same with a Z- (or Y-) database table instead of the STATICS i-tab.
    It will probably cost some performance, but better slow than wrong data.
    Regards,
    Ernst
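    Ernst's STATICS-based approach can be sketched outside ABAP as well. In this illustrative Python sketch the module-level set plays the role of the STATICS internal table that survives across data packages (the key fields are assumptions for the example):

```python
# Module-level state stands in for the ABAP STATICS internal table:
# it survives across calls, one call per data package.
seen = set()

def filter_package(data_package, keys=("SOURSYSTEM", "INFOOBJECT")):
    """Drop rows whose key was already seen in this or any earlier package."""
    out = []
    for row in data_package:
        k = tuple(row[f] for f in keys)
        if k not in seen:
            seen.add(k)
            out.append(row)
    return out

pkg1 = [{"SOURSYSTEM": "R3", "INFOOBJECT": "A"},
        {"SOURSYSTEM": "R3", "INFOOBJECT": "A"},
        {"SOURSYSTEM": "R3", "INFOOBJECT": "B"}]
pkg2 = [{"SOURSYSTEM": "R3", "INFOOBJECT": "B"},
        {"SOURSYSTEM": "R3", "INFOOBJECT": "C"}]

print([r["INFOOBJECT"] for r in filter_package(pkg1)])  # ['A', 'B']
print([r["INFOOBJECT"] for r in filter_package(pkg2)])  # ['C']
```

    The second package's duplicate of B is dropped even though B arrived in an earlier package, which is exactly what per-package deduplication cannot do.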

  • Remove all duplicate records and load into temp table

    Hi
    I have a table contains data like this.
    Emp No Designation location
    1111 SE CA
    1111 DE CT
    3456 WE NJ
    4523 TY GH
    We found that there are two duplicate records for emp no 1111. I want to delete all duplicate records (in this case the two records for emp no 1111) and load the rest into the temp table.
    Please advice me how to do it.

    Oh look, you can search the forums...
    http://forums.oracle.com/forums/search.jspa?threadID=&q=delete+duplicates&objID=f75&dateRange=all&userID=&numResults=30
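    One reading of the requirement, keep only employee numbers that occur exactly once and copy those rows to the temp table, can be sketched with SQLite through Python (table and column names are taken from the post; this is an assumption about the intent, not a tested Oracle recipe):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE emp (emp_no TEXT, designation TEXT, location TEXT);
INSERT INTO emp VALUES ('1111','SE','CA'), ('1111','DE','CT'),
                       ('3456','WE','NJ'), ('4523','TY','GH');
-- Keep only emp_no values that appear exactly once.
CREATE TABLE emp_temp AS
SELECT * FROM emp
WHERE emp_no IN (SELECT emp_no FROM emp
                 GROUP BY emp_no HAVING COUNT(*) = 1);
""")
rows = list(conn.execute("SELECT emp_no FROM emp_temp ORDER BY emp_no"))
print(rows)  # [('3456',), ('4523',)]
```

    Both rows for 1111 are excluded, matching the "delete all duplicate records" wording rather than keeping one copy.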

  • Remove duplicate record with a concurrent program

    Hi
    I have two nearly similar records. I want to remove one of them. There should be a program to do this.
    Any suggestion.
    Thanks

    Hi,
    It should be a concurrent program for deleting (removing) duplicate records; currently I am searching to find this program.
    What is the point of removing this duplicate record using a concurrent program (not manually or using an API)? I believe there is no standard concurrent program to achieve this.
    Thanks,
    Hussein

  • Implementing Duplicate record functionality

    Hi,
    In my application, there is one master table (T1) and 11 model specific details tables(MT1-MT11). They are related by a PK/FK relationship on config_id in both the tables. Depending on the value of a particular field in the details table, one of the 11 table is populated with some data.
    Now, I have a search page which is based on the master table (T1). I need to provide a 'duplicate record' functionality on the search page itself, such that when the user selects a particular record and presses the 'Duplicate' button, the record in T1 and its corresponding row in the details table (say MT1) are duplicated. He should also have the facility to change some data in the T1 record before committing the changes.
    The following is my flow of thoughts:
    * From the search page, find out the record that has been selected (config_id is PK).
    * redirect to another page (based on a new VO) where the current record is again queried and some fields made updateable (others remain the same). A new config_id is generated from a sequence.
    * A new record for the details table (MT1) should be created having the new config_id, but all other data remain the same as before.
    Is this the correct approach? How should i go about doing this??
    Thanks
    Ashish

    Hi,
    This is what i have done till now.
    My search is based on VO1. I select one of the records (using a singleSelect radio button) and press the 'Copy' button. In the controller of this search page, I check for the button press and then invoke a method in the AM where I get the config_id (PK) of the record that has been selected.
    Then I query another VO (VO2) for this config_id. I get the current row, create another row from VO2, and assign each attribute from the old row to the new row. I then alter some values in the new row (such as a new ConfigId) and nullify the WHO columns.
    The code i have written in the AM is:
        SketchDetailsCopyVOImpl vo = (SketchDetailsCopyVOImpl)getSketchDetailsCopyVO1();
        vo.initQuery(ConfigId);
        OARow oldRow = (OARow)vo.getRowAtRangeIndex(0);
        vo.insertRow(vo.createRow());
        OARow newRow = (OARow)vo.first();
        for (int i = 1; i < oldRow.getAttributeCount(); i++)
        {
          System.out.println(i + " -- " + oldRow.getAttribute(i));
          newRow.setAttribute(i, oldRow.getAttribute(i));
        }
        OADBTransaction txn = getOADBTransaction();
        Number newConfigId = txn.getSequenceValue("NRCONFIG_DETAILS_S");
        newRow.setAttribute("ConfigId", newConfigId);
        newRow.setAttribute("SiteName", null);
        newRow.setAttribute("PointNumber", null);
        newRow.setAttribute("ConfigStatusFlag", "D");
        newRow.setAttribute("CreationDate", null);
        newRow.setAttribute("CreatedBy", null);
        newRow.setAttribute("LastUpdateDate", null);
        newRow.setAttribute("LastUpdatedBy", null);
        newRow.setAttribute("LastUpdateLogin", null);
        vo.setCurrentRow(newRow);
    I then redirect to another page where the user can fill in the other required fields (such as SiteName and PointNumber). On the 'Apply' button, I have simply called transaction.commit();
    Although this works, the WHO columns are not automatically populated.
    Is this the correct way of doing it?
    > Sounds fine. When you said new VO, I am assuming that represents MT1 and you are just querying up the current record from T1 and pushing the values to the new record of MT1.
    Table MT1 contains some additional data that the user doesn't see in the search page, nor is his intervention required while querying. So, once the record in T1 has been duplicated, I will simply duplicate the record in MT1, replacing the config_id with the new config_id.
    Thanks
    Ashish

  • Duplicate records in PSA and Error in Delta DTP

    Hi Experts,
    I have enabled delta on the CALDAY field for a generic DataSource in the source system.
    1) Initially I created one record in the source system and ran the Delta InfoPackage, so the PSA contains one record.
    2) I ran the Delta DTP; that one record has moved to the DSO.
    3) Then I created one more record in the source system and ran the Delta InfoPackage again. Now the PSA contains 2 records: the record created in step 3 and the record from step 1.
    4) If I try to run the Delta DTP, I get an error saying that there are duplicate records.
    So would you please tell me how to rectify this issue?
    My requirement is that I need to run the load multiple times a day. (Note: the DataSource is delta enabled.)
    Thanks and Regards,
    K.Krishna Chaitanya.

    you can use this feature only when loading master data, as it will ignore lines. If you had, for example, a changed record with a before and after image, one of the two lines would be ignored.
    you can keep searching and asking, but if you want to load a delta more than once a day, you can't use a calendar day as the generic delta field. This will not work.
    The best thing is to have a timestamp.
    If in the DataSource you have a date and a time, you can create a timestamp yourself.
    You need in this case to use a function module.
    Search the forum with the terms 'generic delta function module' and you'll surely find code that you can reuse.
    M.

  • Preventing duplicate records

    Is there any way to prevent creating duplicate records instead of searching if it already exists?

    There are certain fields for each object which can form its uniqueness. Account Name and Location together form the uniqueness for the Account object.
    But this is difficult in the case of other objects such as Contact and SR.
