Validate for duplicate data

How can I ensure that a form entry does not duplicate data already present in the corresponding database field the data is to be entered into? I am working with ASP.

You need to test the database first, before handling the insert. Dreamweaver (DW) has a Server Behavior called "CheckUsername" which can be used to handle this if you are only checking one field.
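The underlying check can also be written directly in SQL. A minimal sketch, assuming a hypothetical users table with a username column holding the submitted value; look the value up first and only run the INSERT when nothing comes back:

    SELECT COUNT(*) AS cnt
      FROM users
     WHERE username = ?;  -- bind the submitted form value here

    -- proceed with the INSERT only when cnt = 0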
Paul Whitham
Certified Dreamweaver MX2004 Professional
Adobe Community Expert - Dreamweaver
Valleybiz Internet Design
www.valleybiz.net
"aonefun" <[email protected]> wrote in
message
news:emu5gn$k5j$[email protected]..
> How can I ensure that a form entry does not contain
duplicate data to the
> corresponding database field the data is to be entered
into? I am working
> with ASP.

Similar Messages

  • Get only the first row for duplicate data

    Hello,
    I have the following situation...
    I have a table called employees with columns (re, name, function) where "re" is the primary key.
    I have to import data into this table from an XML file. The problem is that some employees can be repeated inside this XML, which means I'll have repeated "re" values, and this column is the primary key of my table.
    As a workaround, I've created a table called employees_tmp that has the same structure as employees but without the primary key constraint; this way I can import the XML data into employees_tmp.
    What I need now is to copy the data from employees_tmp to employees, but for the cases where "re" is repeated, I just want to copy the first row found.
    Just an example:
    EMPLOYEES_TMP
    RE NAME FUNCTION
    0987 GABRIEL ANALYST
    0987 GABRIEL MANAGER
    0978 RANIERI ANALYST
    0875 RICHARD VICE-PRESIDENT
    I want to copy the data so that employees looks like:
    EMPLOYEES
    RE NAME FUNCTION
    0987 GABRIEL ANALYST
    0978 RANIERI ANALYST
    0875 RICHARD VICE-PRESIDENT
    How could I do this?
    I really appreciate any help.
    Thanks

    Try,
    SELECT re, NAME, FUNCTION
      FROM (SELECT re,
                   NAME,
                   FUNCTION,
                   ROW_NUMBER () OVER (PARTITION BY re ORDER BY NAME) rn
              FROM employees_tmp)
     WHERE rn = 1;
    G.
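    To actually move the de-duplicated rows across, the same query can feed an INSERT (a sketch, assuming employees does not yet contain any of these "re" values):
    INSERT INTO employees (re, name, function)
    SELECT re, name, function
      FROM (SELECT re,
                   name,
                   function,
                   ROW_NUMBER () OVER (PARTITION BY re ORDER BY name) rn
              FROM employees_tmp)
     WHERE rn = 1;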

  • How to validate for proper date format

    Hi Experts,
    I am using the following FM to calculate a date difference:
    call function 'FIMA_DAYS_AND_MONTHS_AND_YEARS'
          exporting
            i_date_from = wa_draw-zdm_vld_to_dat
            i_date_to   = sy-datum
          importing
            e_days      = v_days.
        if sy-subrc = 0 .
          if v_days > 0.
    But the value of the field zdm_vld_to_dat (data type DATS, YYYYMMDD) in the database table itself is sometimes wrong, e.g. ct/-0/29-o.
    I want to avoid passing these wrong entries to the FM, and I also write an info message for them to the spool.
    So can anybody tell me how I can detect that the field zdm_vld_to_dat contains a wrong date, i.e. anything other than YYYYMMDD, e.g.
    ct/-0/29-o (meaning any character or hyphen)?
    Thank you.  

    Hi Nilesh Hiwale,
    1. First of all, there is no need for a function module to get the difference between two dates. See this:
    DATA:
      days  TYPE i,
      date1 LIKE wa_draw-zdm_vld_to_dat,
      date2 TYPE sy-datum.
    days = date2 - date1.
    WRITE: / days.
    (OR)
    2. Write this check in the PAI of your screen (for a selection screen, in AT SELECTION-SCREEN):
    AT SELECTION-SCREEN.
      IF date1 CN '1234567890' OR date2 CN '1234567890'.
        MESSAGE e000(sabapdocu) WITH 'Enter the date in YYYYMMDD format'.
      ENDIF.
    Reward if it is useful,
    Mahi.

  • Query for duplicate data

    Hi guys,
    Please help me out with this query.
    I have a table tb_adhoc_hrrs with the columns hirerid, hiere_name, email and pagerno. I
    want to find all the duplicate values of pagerno.
    For example:
    select pagerno, count(hirerid) from tb_adhoc_hrrs
    group by pagerno
    having count(*) > 1
    will display all duplicate pager numbers. But when a pager number has duplicates, I need
    all the information related to that particular pager number, in this format:
    hirerid hirername pagerno count
    1 x 12 2
    4 r 12 2
    Thanks,
    kalyan

    Hi Kalyan,
    I think you need something like this?
    select *
    from tb_adhoc_hrrs, (select pagerno,count(hirerid)
                           from tb_adhoc_hrrs
                       group by pagerno having count(*)>1) grp
    where tb_adhoc_hrrs.pagerno = grp.pagerno;
    Regards
    Peter
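    An alternative sketch using an analytic count, which also returns the count column in the requested output (same assumed table and column names; adjust them to the real ones):
    SELECT hirerid, hiere_name, pagerno, cnt
      FROM (SELECT hirerid,
                   hiere_name,
                   pagerno,
                   COUNT(*) OVER (PARTITION BY pagerno) cnt
              FROM tb_adhoc_hrrs)
     WHERE cnt > 1;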

  • Checking for duplicate data

    Hi guys,
    I currently have a column in one of my database tables which is an ID. The ID starts with ab, ac, or ad, followed by a few digits (the number of digits varies from 6 to 8). What I want to be able to do is check whether there are multiple records with the same ID, disregarding the first 2 letters, and list the duplicates, i.e. I want to be able to run a select statement which will report the following IDs as being duplicates:
    ab12345
    ac12345
    ad12345
    The above 3 ids would flag up as duplicates. Would this be possible?
    Any help on this matter would be greatly appreciated.
    Thank you.

    Maybe this (it seems you want the rows flagged):
    with
    data(some_id) as
    (select 'ab12345' from dual union all
    select 'ac12345' from dual union all
    select 'ad12345' from dual union all
    select 'ad123456' from dual union all
    select 'ad654321' from dual union all
    select 'ab123456' from dual union all
    select 'ac1234567' from dual union all
    select 'ad1234567' from dual union all
    select 'ab12345678' from dual
    )
    select some_id,
           case when count(*) over (partition by substr(some_id,3)
                                        order by null rows between unbounded preceding
                                                               and unbounded following
                                   ) > 1
                then 'DUP'
           end duplicate_flag
      from data
    Regards
    Etbin
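    Against the real table, the same flagging can be written directly (a sketch with hypothetical names my_table and id_col, since the thread does not name them):
    SELECT id_col,
           CASE WHEN COUNT(*) OVER (PARTITION BY SUBSTR(id_col, 3)) > 1
                THEN 'DUP'
           END duplicate_flag
      FROM my_table;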

  • Duplicate data records through DTP for attribute

    Hi Guys,
    I am loading data to the customer master data, but it contains a large volume of duplicate data.
    I have to load both attribute and text data.
    Data up to the PSA level is correct, and the text data is loaded successfully.
    When I load the attribute data to the customer master, it fails due to duplicate data records.
    Then in the DTP, on the Update tab, I select the check box for handling duplicate data records.
    As soon as I select this check box, it shows the
    message *ENTER VALID VALUE* at the bottom.
    After this message I am unable to click any function, and the same message repeats again and again.
    So I am unable to execute the DTP.
    A helpful answer will get full points.
    So please give me a solution so that the above message no longer appears and
    I am able to execute the DTP.
    Thanks .
    Saurabh jain.

    Hi,
    If you get duplicate data for your customers, there might be something wrong with your DataSource or with the data in the PSA. But anyway, leave the DTP by restarting RSA1. Edit or create the DTP again and press Save immediately after entering edit mode. Leave the DTP again and start editing it. That should do the trick.
    regards
    Siggi

  • App to erase duplicate data on iPhone GS?

    Hi All,
    While using my beloved old Palm Tungsten C PDA, I used to have an app that searched for duplicates on my handheld in the address book, calendar, notes, etc.
    Does anybody know if there is an app that will search for duplicate data and then give you the option to get rid of the duplicates? Especially in my address book, where due to using so many different PDAs, OSes, sync apps and other stuff, I have as many as 10 duplicates of each contact.

    Don't know about an app, but I have opted to use a Google account as my master contact organizer and set it up to sync with the iPhone. I love it as it syncs wirelessly (thus no need to sync with the computer), and Google does offer an option in its address book to help with duplicates.

  • Duplicate data records through DTP

    Hi Guys,
    I am loading duplicate data records to the customer master data.
    Data up to the PSA level is correct.
    Now, when I load it from the PSA to the customer master through a DTP
    and, on the Update tab of the DTP, I select the check box for handling duplicate data records, it shows the
    message *ENTER VALID VALUE* at the bottom.
    After this message I am unable to click any function, and the same message repeats again and again.
    So please give me a solution so that the above message no longer appears and
    I am able to execute the DTP.
    Thanks .
      Saurabh jain.

    Hi,
    If you get duplicate data for your customers, there might be something wrong with your DataSource or with the data in the PSA. But anyway, leave the DTP by restarting RSA1. Edit or create the DTP again and press Save immediately after entering edit mode. Leave the DTP again and start editing it. That should do the trick.
    regards
    Siggi

  • Duplicate report title for every date where records are found in date range

    Hi,
    I have developed a report that lists multiple entries by date range, with a page break separating each date. What I would like to include now is a report title which appears only once for each date, separated by a page break.
    Example:
    (Business Unit)
    (Address)
    (Report Title)
    (Date)
    Entry 1
    Entry 2
    Entry 3
    Entry 4
    Entry 5
    Entry 6
    Entry 7
    (Business Unit)
    (Address)
    (Report Title)
    (Date)
    Entry 1

    Hi Camelbak2113,
    According to your description, it seems that you want to eliminate the duplicate report title for every date. If so, I suggest adding a group grouped on the date range, then adding a child group grouped on the report title, and then adding page breaks between each instance of the date range group.
    If I have misunderstood something, please provide more information about the report, such as some screenshots of the report with sample data, so that we can analyse it further and help you out.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • How to validate ME21N for PO Date < sy-datum using BADI

    Hello Experts,
    I have a requirement in which I have to validate ME21N for PO Date < sy-datum. Which method of which BAdI do I have to implement to get this done?
    Thanks in advance..
    Best Regards,
    Hardik B

    Hi Hardik,
         Please try writing your code in method POST of BAdI ME_PROCESS_PO_CUST.
    If you want to search for a BAdI for a particular t-code, go to t-code SE24, enter the class name CL_EXITHANDLER, put a breakpoint on method GET_INSTANCE of this class and come out of it. Then enter the t-code for which you want to find a BAdI. Once the program stops in the debugger, you can find the BAdIs and user exits in the EXIT_NAME parameter of method GET_INSTANCE.
    Regards,
    Dnyanesh.

  • Validate the posting date for profit centre documents actual/plan

    Hi,
    I have a requirement where we want the system to validate the posting date for profit centre documents (actual/plan). The functionality should be the same as the system check of the posting period for FI documents maintained in OB52, and of the posting period for controlling cost centre documents maintained in OKP1.
    For creating profit centre documents, the activity types are available in t-code 0KEO.
    I have come across the exits EXIT_SAPLPC08_001 and EXIT_SAPLPC08_002, but I am not sure whether we can use and implement these.
    Please suggest what can be done to achieve this.
    Regards,

    Hi,
    You can use a substitution and specify that the profit centre could be your head office or the store, as per your requirement.
    I think that's the only option we have.
    Cheers
    Raghu

  • Duplicate records for Master data

    Hi Friends,
    I have a request which contains duplicate records for master data. When I try to load the data, after loading to the PSA, BW shows an error saying that there are duplicate records.
    My requirement is that I still need to load the data, overwriting the previous duplicate record.
    How to do this?
    Thanks,
    Raja

    Hi,
    Subsequently I need to load the data to the target as well. If I select the options "Only PSA" and "Ignore double data records", the option "Update subsequently in data targets" is disabled.
    Thanks,
    Raja

  • Getting duplicate data records for master data

    Hi All,
    When the process chain for the master data runs, I am getting duplicate data records. For that I selected, at the InfoPackage level under Processing, the option "Update PSA and subsequently data targets", and alternatively the option "Ignore double data records". But the load was still failing with the error message "Duplicate Data Records". After I rescheduled the InfoPackage, I did not get the error message the next time.
    Can anyone help to resolve this issue?
    Regards
    KK

    Yes, for the first option you can write a routine. What is your data target? If it is a cube, there may be a chance of duplicate records because of its additive nature; if it is an ODS, then you can avoid this, because only the delta is going to be updated.
    Regarding the time-dependent attributes, it is based on the date field. We have 4 types of slowly changing dimensions.
    Check the following links:
    http://help.sap.com/bp_biv135/documentation/Multi-dimensional_modeling_EN.doc
    http://www.intelligententerprise.com/info_centers/data_warehousing/showArticle.jhtml?articleID=59301280&pgno=1
    http://help.sap.com/saphelp_nw04/helpdata/en/dd/f470375fbf307ee10000009b38f8cf/frameset.htm

  • Which CKM is used for moving data from Oracle to delimited file ?

    Hi All
    Please let me know which CKM is used for moving data from Oracle to a delimited file.
    Also, is there a need to define each column beforehand in the target datastore? Can't ODI take it from the Oracle table itself?

    Addy,
    A CKM is a Check KM, which is used to validate data and log errors. It is not going to assist you in data movement. You will need an LKM SQL to File Append, as answered in another thread.
    Assuming that you have a one-to-one mapping, to make things simpler you can duplicate the Oracle-based model and create a file-based model. This will take all the column definitions from the Oracle-based model.
    Alternatively, you can also use the ODI tool odiSQLUnload to dump the data to a file.
    HTH

  • Check duplicate data entry on a mandatory field in a multi-record block

    Dear all,
    I have a situation where I have to check for duplicate data entry (on a particular field, which is a mandatory field, i.e. it cannot be skipped by the user without entering a value) while data is keyed in to a multi-record block.
    For reference, I have used the following logic:
    1> In a WHEN-VALIDATE-RECORD trigger of that block I assign the value of the current item to a table-type variable (a collection),
    as this trigger fires every time I leave the record, so it keeps storing the current value. And this process continues.
    Then
    2> A WHEN-VALIDATE-ITEM trigger has been written on the corresponding item (i.e. the trigger is at item level), which compares the value of the current item with the values stored in the table-type variable (collection) from the WHEN-VALIDATE-RECORD trigger. If the current item value matches any value stored in the table-type variable, I show a 'Duplicate Record' message followed by RAISE FORM_TRIGGER_FAILURE.
    This code works fine for checking duplicate values of that multi-record field.
    The problem is that if the user enters a value in that field, then goes to the next field, enters a value there and then presses the 'Enter Query' icon, both validate triggers fire. As a result, first WHEN-VALIDATE-RECORD fires, which stores the value, and then WHEN-VALIDATE-ITEM fires, so it shows the duplicate record message.
    Please give me meaningful logic or code for solving this problem.
    Any other logic to solve this problem is also welcome.

    @Ammad Ahmed
    First of all, thanks. Your logic worked, but I still have a little bit of a problem.
    Now the requirement is a master-detail form where both master and detail are multi-record blocks, and the detail cannot have duplicate records,
    such as:
    MASTER:
    A code
    A1
    A2
    DETAIL:
    D code
    d1
    d2 <- valid, as for master A1 the detail values d1, d2 are not duplicates
    d2 <- invalid, as for master A1 the detail values d2, d2 are duplicates
    Validation rule: the A Code - D Code combination is unique. The system will stop users from entering a duplicate D Code for an A Code. An appropriate error message will be displayed.
    Actually I am facing a typical problem: the same logic has been applied in the detail section, and it works fine when I am inserting new records. The problem starts when I query: after the query, say 2 previously saved records are populated in the block. Now if I insert a new record with exactly the same value as one already present on the screen (i.e. a value populated by the query), it does not show the duplicate. Could you tell me the reason and help me out? It's urgent.
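    As a database-level backstop for the stated validation rule, a unique constraint on the combination makes the server reject duplicates even if the form logic misses one (a sketch with hypothetical names detail_table, a_code and d_code, since the thread does not give the real ones):
    ALTER TABLE detail_table
      ADD CONSTRAINT uk_acode_dcode UNIQUE (a_code, d_code);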
