How to avoid duplicates in LOV

Hi, I'm using the search query component (11g) and I'm adding an LOV for the search fields. In the UI I can see the LOV with values from the table, but I need to avoid the duplicates in it. I looked at the docs and the demo and couldn't figure out anything. Could someone point me to a resource? Thanks.

How do you create the LOV then?
To make a view object with an LOV you need two view objects - let us say viewObject and viewObjectLOV.
viewObject is updatable, viewObjectLOV is not.
Your problem is that viewObjectLOV returns duplicate values.
1. Find viewObjectLOV; you can reach it via viewObject's list-of-values accessor.
2. Make sure it has a primary key.
3. Modify viewObjectLOV's query so it does not return duplicate values, for example by using the DISTINCT keyword (see the sketch below).
0. If you did not understand what I tried to explain, maybe you should read some more documentation first :)
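For example, a minimal sketch of what the LOV view object's query could look like (the table and column names here are invented for illustration):

  -- DISTINCT collapses repeated values, so each department name
  -- shows up only once in the list of values
  SELECT DISTINCT dept_name
  FROM   departments
  ORDER BY dept_name;

If the LOV view object is entity-based, you may need to switch its query to expert mode (a custom SQL statement) so you can add DISTINCT by hand.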

Similar Messages

  • How to avoid duplicate posting of noted items for advance payment requests?

How to avoid duplicate posting of noted items for advance payment requests?

    Puttasiddappa,
    In the PS module, we allow the deletion of a component purchase requisition although a purchase order exists. The system sends message CN707 "A purchase order already exists for purchase requisition &" as an information message by design, to allow flexible project management.
    If, however, you want message CN707 to be of type E, you have to modify the standard coding. Using SE91 you can invoke the where-used list of message 707 in message class CN and change
      i707(cn)
    to
      e707(cn)
    where desired.
    Also, user exit CNEX0039 provides the possibility to reject the deletion of a component according to the customer's needs, e.g. you may check there whether a purchase order exists and reject the deletion.
    Hope this helps!
    Best regards
    Martina Modolell

  • How to avoid duplicate values from ALV grid (see code below)

    How to avoid duplicate values from the ALV grid, see the code below.
    In the query below the docno is repeated again and again.
    How can I avoid the duplication in this query?
    select * into corresponding fields of table itab
      from J_1IEXCHDR
      inner join J_1IEXCDTL
        on J_1IEXCDTL~lifnr = J_1IEXCHDR~lifnr
      where J_1IEXCHDR~status = 'P'.

    Hi Laxman,
    After that select statement
    select * into corresponding fields of table itab
      from J_1IEXCHDR
      inner join J_1IEXCDTL
        on J_1IEXCDTL~lifnr = J_1IEXCHDR~lifnr
      where J_1IEXCHDR~status = 'P'.
    add the following (sort first so that the duplicates are adjacent):
    if sy-subrc = 0.
      sort itab by <field name of itab internal table>.
      delete adjacent duplicates from itab comparing <field name of itab internal table>.
    endif.
    This will delete your duplicate entries. Once you are done with this, call the ALV FM.
    call function 'REUSE_ALV_GRID_DISPLAY'
      exporting
    *   I_INTERFACE_CHECK                 = ' '
    *   I_BYPASSING_BUFFER                = ' '
    *   I_BUFFER_ACTIVE                   = ' '
        i_callback_program                = v_repid
    *   I_CALLBACK_PF_STATUS_SET          = ' '
        i_callback_user_command           = 'IT_USER_COMMAND'
    *   I_CALLBACK_TOP_OF_PAGE            = ' '
    *   I_CALLBACK_HTML_TOP_OF_PAGE       = ' '
    *   I_CALLBACK_HTML_END_OF_LIST       = ' '
    *   I_STRUCTURE_NAME                  =
    *   I_BACKGROUND_ID                   = ' '
        i_grid_title                      = 'Purchase Order Details'
    *   I_GRID_SETTINGS                   = I_GRID_SETTINGS
        is_layout                         = wa_layout
        it_fieldcat                       = it_fieldcat
    *   IT_EXCLUDING                      = IT_EXCLUDING
    *   IT_SPECIAL_GROUPS                 = IT_SPECIAL_GROUPS
        it_sort                           = it_sort
    *   IT_FILTER                         = IT_FILTER
    *   IS_SEL_HIDE                       = IS_SEL_HIDE
    *   I_DEFAULT                         = 'X'
    *   I_SAVE                            = ' '
    *   IS_VARIANT                        = IS_VARIANT
        it_events                         = it_event
    *   IT_EVENT_EXIT                     = IT_EVENT_EXIT
    *   IS_PRINT                          = IS_PRINT
    *   IS_REPREP_ID                      = IS_REPREP_ID
    *   I_SCREEN_START_COLUMN             = 0
    *   I_SCREEN_START_LINE               = 0
    *   I_SCREEN_END_COLUMN               = 0
    *   I_SCREEN_END_LINE                 = 0
    *   I_HTML_HEIGHT_TOP                 = 0
    *   I_HTML_HEIGHT_END                 = 0
    *   IT_ALV_GRAPHICS                   = IT_ALV_GRAPHICS
    *   IT_HYPERLINK                      = IT_HYPERLINK
    *   IT_ADD_FIELDCAT                   = IT_ADD_FIELDCAT
    *   IT_EXCEPT_QINFO                   = IT_EXCEPT_QINFO
    *   IR_SALV_FULLSCREEN_ADAPTER        = IR_SALV_FULLSCREEN_ADAPTER
    * importing
    *   E_EXIT_CAUSED_BY_CALLER           = E_EXIT_CAUSED_BY_CALLER
    *   ES_EXIT_CAUSED_BY_USER            = ES_EXIT_CAUSED_BY_USER
      tables
        t_outtab                          = itab
      exceptions
        program_error                     = 1
        others                            = 2.
    if sy-subrc <> 0.
      message id sy-msgid type sy-msgty number sy-msgno
              with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    endif.
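    As a side note, the duplicates could also be avoided already in the database query by selecting DISTINCT over only the fields that are really needed. In plain SQL terms it would look roughly like this (the column list below is just an example; the join and the status condition are taken from the query above):
      -- DISTINCT only helps once the SELECT list is restricted to the
      -- fields you actually need (docno/lifnr here are examples)
      SELECT DISTINCT h.docno, h.lifnr
      FROM   j_1iexchdr h
             INNER JOIN j_1iexcdtl d ON d.lifnr = h.lifnr
      WHERE  h.status = 'P';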
    Thanks
    Vikranth Khimavath

  • How to avoid Duplicate Records while joining two tables

    Hi,
    I am trying to join three tables; basically two of the tables are the same, one is a history table. So I wrote a query like
    select e.id,
           e.seqNo,
           e.name,
           d.resDate,
           d.details
    from   employees e
           join ((select * from dept) union (select * from dept_hist)) d
                on d.id = e.id and e.seqno = d.seqno
    but this is returning duplicate records.
    Could anyone please tell me how to avoid duplicate records in this query?

    Actually, once a record is processed it is moved to the hist table, so the two tables will not have the same records, and I need the records from both tables; that is why I have done the union of the two tables, so d holds the union of both.
    But I am getting duplicate records even if I use distinct.
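    One approach that is sometimes suggested for this situation (a sketch only, using the table and column names from the post above; whether it applies depends on why the duplicates appear) is to pick exactly one row per (id, seqno) with ROW_NUMBER instead of relying on DISTINCT:
      -- keeps one row per (id, seqno); here the most recent resDate wins
      SELECT id, seqno, name, resdate, details
      FROM  (SELECT e.id, e.seqno, e.name, d.resdate, d.details,
                    ROW_NUMBER() OVER (PARTITION BY e.id, e.seqno
                                       ORDER BY d.resdate DESC) rn
             FROM   employees e
                    JOIN (SELECT * FROM dept
                          UNION
                          SELECT * FROM dept_hist) d
                         ON d.id = e.id AND e.seqno = d.seqno)
      WHERE  rn = 1;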

  • How to avoid duplicate data while inserting from sample.dat file to table

    Hi Guys,
    We have an issue with duplicate data in a flat file while loading data from the sample.dat file into a table. How can we avoid loading the duplicate data via the control file?
    Can anyone help me with this?
    Thanks in advance!
    Regards,
    LKR

    No, a control file will not remove duplicate data.
    You would be better off using an external table and then removing the duplicate data with SQL as you query the data to insert it into your destination table.
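    A rough sketch of that approach (the directory, the file layout and the column names are invented for illustration; the real structure of sample.dat will differ):
      -- external table over the flat file (columns are assumed)
      CREATE TABLE sample_ext (
        id   NUMBER,
        name VARCHAR2(100)
      )
      ORGANIZATION EXTERNAL (
        TYPE ORACLE_LOADER
        DEFAULT DIRECTORY data_dir
        ACCESS PARAMETERS (
          RECORDS DELIMITED BY NEWLINE
          FIELDS TERMINATED BY ','
        )
        LOCATION ('sample.dat')
      );

      -- DISTINCT drops the duplicate rows before they reach the target table
      INSERT INTO dest_table (id, name)
      SELECT DISTINCT id, name
      FROM   sample_ext;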

  • How to avoid duplicate record in a file to file

    Hi Guys,
    Could you please provide a solution to avoid duplicate entries in a flat file based on a key field?
    I would prefer standard functions, either at message mapping level or by configuring the file adapter.
    Warm regards
    Mahesh

    Hi Mahesh,
    Write a module processor that checks for duplicate records in the file adapter,
    or
    with a JAVA/ABAP mapping you can eliminate the duplicate records.
    Also check these links:
    Re: How to Handle this "Duplicate Records"
    Duplicate records
    Ignoring Duplicate Records--urgent
    Re: Duplicate records frequently occurred
    Re: Reg ODS JUNK DATA
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    regards
    srinivas

  • How to avoid duplicates for a result set

    How can I avoid the duplicate rows for the query below?
    SELECT to_char(grecode(titleid)) gre_code,
           to_char(toeflcode(titleid)) toefl_code,
           titleid
      FROM (SELECT DISTINCT TO_CHAR(UPPER(TRIM(get_clob_value(table_name, KEY)))) RESULT,
                   titleid
              FROM mcp_specifications a
                   JOIN mcp_title_specifications b
                        ON a.specificationid = b.specificationid
                   JOIN mcp_titles c
                        ON b.titleid = c.titleid
             WHERE b.is_parent = 'F'
               AND UPPER(TRIM(c.university_state)) = UPPER(TRIM('USA'))
               AND TO_CHAR(get_clob_value(table_name, KEY)) IS NOT NULL
               AND UPPER(TRIM(SPECIFICATION)) IN (UPPER(TRIM('program'))))
     WHERE UPPER(TRIM(RESULT)) = UPPER(TRIM('COMPUTER SCIENCE'))
     ORDER BY RESULT ASC;

    The output of the query is:

    gre_code    toefl_code   titleid
    402         78           5518
    402         78           5519
    402         78           5520
    402         78           5521

    The output should be:

    402         78           any titleid

    Some simplified code:
    SELECT grecode(titleid) gre_code,
           toeflcode(titleid) toefl_code,
           min(titleid) titleid
    FROM   (SELECT DISTINCT TO_CHAR(UPPER(TRIM(get_clob_value(table_name,KEY)))) RESULT,
                   titleid
            FROM   mcp_specifications a
                   JOIN mcp_title_specifications b
                        ON a.specificationid = b.specificationid
                   JOIN mcp_titles c
                        ON b.titleid = c.titleid
            WHERE  b.is_parent = 'F'
            AND    UPPER(TRIM(c.university_state)) = 'USA'
            AND    TO_CHAR (get_clob_value (table_name, KEY)) IS NOT NULL
            AND    UPPER(TRIM(SPECIFICATION)) = 'PROGRAM')
    WHERE  UPPER(TRIM(RESULT)) = 'COMPUTER SCIENCE'
    GROUP BY grecode(titleid),
             toeflcode(titleid)

    Please note that applying functions like UPPER and TRIM to a string literal can and should be avoided. For example:

    UPPER(TRIM('USA')) = 'USA'

    Why force the database to do both an UPPER and a TRIM on something that can just be written in uppercase with no surrounding spaces? It's a waste of time.

  • How to avoid duplicates in export?

    Hi
    Is there any way to avoid duplicates when exporting from LR4, i.e. to avoid exporting the same picture (to the same folder) again?
    Kindly
    Jan

    Rob Cole wrote:
    I stand corrected - thanks Jim .
    I think I had forgotten this because of how I use publish services / collections.
    All of my publish services have exactly one (smart) collection, which defines the photos to be published.
    I don't create a multitude of publish collections in order to define an associated publish tree - the tree is defined by the source folders. I started this convention before Lr was a glimmer in Adobe's eyes (even before digital photography was invented). If I were inventing it now from scratch I might do it differently, but this is one reason I get aggravated when some of the experts in this forum continually "forget" that the need to maintain a prescribed convention is sometimes an absolute requirement (or at least *highly* desirable), and Lr should be able to adapt to the convention - and not the other way around.
    If I quit Lr today, I could still maintain published trees without Lr's publishing collections. If your scheme depends on publishing collections which have no visibility outside Lightroom (e.g. a multitude of hard drive publishing collections), then you'd be screwed (so to speak) if you wanted to migrate to other software for maintenance. Not only that, but if you rebuild your catalog, all such collections are lost (unless you know how to use plugins to preserve them). Even if your scheme depends on regular (non-publishing, whether smart or not) collections, and you use jf's Collection Publisher to publish in a matching hierarchy, you'd still be screwed when migrating, since those collections do not exist outside Lightroom.
    Impact may vary of course, but I like to minimize dependence on a specific piece of software if possible.
    Folders exist regardless of which software you use to edit your photos. Put another way: collection hierarchies are proprietary, folder structure isn't.
    So, although lots of people prefer jf's Collection Publisher (understandably), it's worth considering jf's Folder Publisher too, or my very own TreeSync Publisher.
    Cheers,
    Rob

  • How to avoid duplicate BOM Item Numbers?

    Hello,
    Is there a way to avoid duplicate BOM Item Numbers (STPO-POSNR) within one BOM?
    For routings I could avoid duplicate Operation/Activity Numbers with transaction OP46 by setting T412-FLG_CHK = 'X' for the Task List Check. Is there an equivalent for BOMs?
    Regards,
    Helmut Gante

  • How to avoid duplicates in CROSS JOIN Query

    Hi,
    I am using a CROSS JOIN to get all pairs of values from a table column, as shown below:
    PRODUCT (Col Header)
    Bag
    Plate
    Biscuit
    When doing the cross join we get
    Bag Bag
    Bag Plate
    Bag Biscuit
    Plate Bag
    Plate Plate
    Plate Biscuit ..... like this
    I added the where condition prod1 <> prod2 to avoid the "Bag Bag" and "Plate Plate" rows, so the output is like below:
    Bag Plate
    Bag Biscuit
    Plate Bag
    Plate Biscuit
    Now "Bag Plate" and "Plage Bag" are same combination how to avoid these records. My expected result is
    Bag Biscuit
    Plate Biscuit
    How can I derive this?
    Sridhar

    Hi,
    This is the solution that I found fitting for the OP's question, but
    Visakh16 already posted the same idea (assuming the names are unique) from the start and I don't think anyone noticed it!
    Sridhar.DPM, did you check Visakh16's response (the second response received)?
    I will mark his response as an answer. If this is not what you need, please clarify and you can unmark it :-)
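    For reference, when the names are unique the usual trick (presumably what that response showed; a sketch only, assuming PRODUCT is a table with a single column prod) is to compare with "<" instead of "<>", so that only one row per unordered pair survives:
      -- "<" keeps each combination exactly once, so "Bag Plate" and
      -- "Plate Bag" cannot both appear
      SELECT p1.prod AS prod1,
             p2.prod AS prod2
      FROM   product p1
             CROSS JOIN product p2
      WHERE  p1.prod < p2.prod;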

  • How to avoid duplicates in iPhoto 5?

    I have iPhoto 5.0.4 with Leopard on a G4. When I attach my digital camera, all the photos from my camera upload. I keep some family pics on my camera all the time, so the same ones duplicate themselves every time I upload new photos. How can this be avoided? Thanks.

    There is a way to do this.
    If your camera mounts on the Desktop as an external device, then use iPhoto's File > Import to Library command. Navigate to the camera, preview the photos, select just the ones you want, and import directly into iPhoto.
    If your camera does not mount on the Desktop, like most Canon models, you can place the camera card into a USB card reader. That will mount on the Desktop, and you can control your imports as described above. It also gives you the advantage of not having to worry about your camera's battery power during the import.
    Regards.

  • How to avoid duplicate measures in reports due to case functions?

    Hi,
    If I create a report using a dimension called insert_source_type, where the next level in the dimension hierarchy is insert_source, and I do not add any formula, I get a report where I can drill down on insert_source_type and see the insert_source values.
    If I use a function like (CASE "Ins Source"."Ins Source Type" WHEN 'OWS' THEN 'WEB' ELSE "Ins Source"."Ins Source Type" END) and change the label of insert_source_type to Channel Group instead, then when I drill down on Channel Group it goes to insert_source_type and only from there can I drill down to insert_source.
    There is one insert_source_type level too many!
    How can this be avoided?
    Thanks and Regards
    Giuliano

  • How to avoid 'duplicate data record' error message when loading master data

    Dear Experts
    We have a custom extractor on table CSKS called ZCOSTCENTER_ATTR. The settings of this datasource are the same as the settings of 0COSTCENTER_ATTR. The problem is that when loading to BW it seems that validity (DATEFROM and DATETO) is not taken into account. If there is a cost center with several entries having different validity, I get this duplicate data record error. There is no error when loading 0COSTCENTER_ATTR.
    Enhancing 0COSTCENTER_ATTR to have one datasource instead of two is not an option.
    I know that you can set ignore duplicates in the infopackage, but that is not a nice solution. 0COSTCENTER_ATTR can run without this!
    Is there a trick you know to tell the system that the date fields are also part of the key??
    Thank you for your help
    Peter

    Alessandro - ZCOSTCENTER_ATTR is loading 0COSTCENTER, just like 0COSTCENTER_ATTR.
    Siggi - I don't have the error message described in the note.
    "There are duplicates of the data record 2 & with the key 'NO010000122077 &' for characteristic 0COSTCENTER &."
    In PSA the records are marked red with the same message (MSG no 191).
    As you see the key does not contain the date when the record is valid. How do I add it? How is it working for 0COSTCENTER_ATTR with the same records? Is it done on the R/3 or on the BW side?
    Thanks
    Peter

  • How to avoid duplicate data loading from SAP-r/3 to BI

    Hi !
    I have created a process chain that loads data into some ODS objects from R/3, where (in R/3) the datasources/tables are updated daily.
    I want to schedule the loads such that, if on any day the source data has not been updated (the tables are unchanged), that data should not be loaded into the ODS again.
    Can anyone suggest such a mechanism, so that I always have unique data in my data targets?
    Please reply soon.
    Thank you!
    Pankaj K.

    Hello Pankaj,
    By setting the unique records option, you are essentially telling the system not to check the uniqueness of the records against the change log and the ODS active table.
    Also, in order to avoid the problem of two requests being activated at the same time, please make sure you select the options "Set Quality Status to 'OK' Automatically" and "Activate Data Automatically"; that way you have the option to delete a single request as required without having to delete all of the data.
    This is all to avoid the situation where even the new request has to be deleted in order to get rid of the duplicate data.
    Unless a timestamp field is available in the table on top of which you have created the datasource, it will be difficult to check the delta load.
    Check the table to see whether there is a timestamp field or any other numeric counter field which can be used for creating a delta queue for the datasource you are dealing with.
    Let me know if this information is helpful or if you need additional information.
    Thanks
    Dharma.

  • How to avoid Duplicate values in Oracle forms

    Hello Everyone,
    I am new to Oracle forms, working on Oracle Applications : 12.1.2,
    Here I have a Receiving Transactions form, in which I have Lot_Number and Quantity columns.
    The purchased items need to be stored against a lot number.
    If 100 quantities are purchased, then all 100 quantities can be stored in a single lot number, i.e.
    Lot_Number         Qty
      001                100
    or the 100 quantities can be stored in different lot numbers, i.e.
    Lot_Number         Qty
      001                50
      002                50
    (the quantities may differ), but not like the following:
    Lot_Number         Qty
      001                50
      001                50
    For the second line, if they selected the same lot number as the first lot number, the error message has to be shown as "Lot number duplicated. Select a different lot number or update the full qty in the original line".
    Can anyone help me to solve this.
    Thank you.
    Regards,
    Gurujothi

    Hi François Degrelle,
    I added the following PL/SQL in the When-Validate-Item trigger:
    declare
      l_current_number varchar2(80);
    begin
      l_current_number := :lot_entry.lot_number;
      first_record;
      loop
        if l_current_number = :lot_entry.lot_number then
          message ('Duplicate');
        end if;
        next_record;
      end loop;
    end;
    It starts checking from the first record itself and gives the message 'Duplicate'.
    How do I make it check from the second record onwards?
    When the first record is entered it should not check; when the second record is entered it should compare with the first record's value, and if it is the same it should give the message 'Duplicate'.
    Thank you.
