How to check for duplicate records

We are loading a master data hierarchy (alternate). We have already loaded the parent hierarchy; now we are loading the subunit (child/alternate) hierarchy, which is only a small change from the parent one.
Since the parent hierarchy is already loaded, will loading the alternate hierarchy create any duplicates?
Thanks in advance
Taj

I am loading two different hierarchies.
The 1st is the parent hierarchy, which has everything. The 2nd hierarchy has only specific brands, since the user wants to pull only those brands in the report variable
(so it is a subset of the 1st one).
If we load both hierarchies, will the second overwrite the first or create any duplicates?
Thanks in advance
Taj

Similar Messages

  • How can I check for duplicate NPD project names?

    Dear all,
    I would like to know how I can check for duplicate NPD project names. I found that some NPD projects are initiated several times with the same or a similar name. Suppose the project name is "Smart Pilot"; I often find the same project typed differently, such as "Smart PILOT", "smart pilot" or even "Smart Pilots". I would like to validate all of these names before saving, using the validation framework or a custom validation. Is that possible?
    If it is possible, can you please guide me on how?
    Thank you so much in advance for all of the answer.
    Phaithoon W.

    So here is some example code for a quick validator. Note that this uses a hard coded English error message, rather than a translation. If you want to use translations instead, look at some of the examples in the ReferenceImplementations\Validation\SourceCode\ReferenceValidation folder in the Extensibility Pack, and review the Validation Training as well (in the ReferenceImplementations\Validation\Documentation folder).
    Note that the provided code is for demonstration purposes only and is not supported by Oracle.
    The classes:
    using System;
    using System.Data;
    using System.Xml;
    using Xeno.Data.NPD;
    using Xeno.Prodika.Application;
    using Xeno.Prodika.Services;
    using Xeno.Prodika.Validation;
    using Xeno.Prodika.Validation.Validators;
    namespace Oracle.Agile.PlmProcess.Validation
    {
        public class NPDUniqueProjectNameValidatorFactory : XmlConfigValidatorFactoryBase
        {
            protected override IValidator Create_Internal(XmlNode configNode)
            {
                return new NPDUniqueProjectNameValidator();
            }
        }

        public class NPDUniqueProjectNameValidator : BaseValidator
        {
            private const string sql = @"select 1 as dup from NPDPROJECTML where UPPER(title) = UPPER('{0}') and FKPROJECT <> '{1}' and LANGID = {2}";

            public override bool Validate(IValidationContext ctx)
            {
                var project = ctx.ValidationTarget as INPDProjectDO;
                bool hasDuplicate = false;
                string sqlToExecute = String.Format(sql, project.ProjectML.Title, project.PKID, UserService.UserContext.User.PreferredLanguage);
                using (IDataReader reader = AppPlatformHelper.DataManager.newQuery().execute(sqlToExecute))
                {
                    if (reader.Read())
                        hasDuplicate = true;
                }
                if (hasDuplicate)
                    ctx.AddError(String.Format("A project already exists with the name '{0}'.", project.ProjectML.Title));
                return hasDuplicate;
            }

            private IUserService UserService
            {
                get { return AppPlatformHelper.ServiceManager.GetServiceByType<IUserService>(); }
            }
        }
    }
    Next, add the following to the ValidationSettings.xml file in config\Extensions:
    1. add the validator factory, with a reference name in the config:
    <ValidatorFactories>
            <!-- Custom  Validator Factory declaration goes here -->       
            <add key="NPD_CheckProjectNameForDuplicate" value="Class:Oracle.Agile.PlmProcess.Validation.NPDUniqueProjectNameValidatorFactory,ReferenceValidation" />
    </ValidatorFactories>
    2. Add a rule for NPD project Save event:
    <rule type="3202">
                <condition event="save">
                    <if type="NPD_CheckProjectNameForDuplicate" require="true" report="true" />
                </condition>
    </rule>
    Compile the reference example into a dll, and add that dll into web\npd\bin
    When you try to save a project with a duplicate name, it should now give you the error message.
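    A note on what the UPPER() comparison in the validator actually catches: it handles case variants such as "Smart PILOT" and "smart pilot", but not near-misses like "Smart Pilots". A rough, hypothetical sketch of a normalization step that would widen the match (the trailing-"S" rule below is purely illustrative, not NPD behavior):

```python
def normalize(name: str) -> str:
    # Upper-case, trim and collapse whitespace, and drop a trailing "S"
    # (the trailing-"S" rule is an illustrative assumption, not NPD behavior).
    collapsed = " ".join(name.upper().split())
    return collapsed[:-1] if collapsed.endswith("S") else collapsed

existing = {normalize("Smart Pilot")}
for candidate in ["Smart PILOT", "smart pilot", "Smart Pilots"]:
    assert normalize(candidate) in existing  # all three collide after normalizing
```

    In the validator above, such a normalized value would have to be compared in code rather than with a plain UPPER() in the SQL.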

  • Delete the duplicate and keep the max records.....

    I would like to remove duplicate records based on columns ID and VAL, but keep the record with the max SAL. ID + VAL is the key in the table; delete the duplicate records, keeping the one with the max SAL.
    Note: even though there may be two records with the same max SAL, keep just one.
    eg
    SQL>  select * from temp_fa;
            ID        VAL        SAL
             1        100         10
             1        100         20
             1        100         20
             2        200         10
             3        300         10
             3        300         30
             4        400         10
             4        400         10
             5        500         10
             5        500         20
             5        500         20
    After deleting, the table should look like
    SQL>  select * from temp_fa;
            ID        VAL        SAL
             1        100         20
             2        200         10
             3        300         30
             4        400         10
             5        500         20

    Hi,
    In this script I included SAL in the key because it is safer for you.
    --1. Preserve one row from each set of duplicates
    --   (a '|' delimiter is added to the concatenated key so that values such as
    --   1||100 and 11||00 cannot collide)
    create table first_duplicate as
    select distinct id, val, sal
    from temp_fa
    where to_char(id)||'|'||to_char(val)||'|'||to_char(sal) in
         (select to_char(id)||'|'||to_char(val)||'|'||to_char(sal)
          from temp_fa
          group by to_char(id)||'|'||to_char(val)||'|'||to_char(sal)
          having count(*) > 1);
    --2. Delete all duplicated rows
    delete from temp_fa
    where to_char(id)||'|'||to_char(val)||'|'||to_char(sal) in
         (select to_char(id)||'|'||to_char(val)||'|'||to_char(sal)
          from temp_fa
          group by to_char(id)||'|'||to_char(val)||'|'||to_char(sal)
          having count(*) > 1);
    --3. Re-insert the single preserved row from each set of duplicates
    insert into temp_fa (id, val, sal)
    select id, val, sal from first_duplicate;
    --4. Delete the rows that do not have the max SAL for their (id, val) key
    delete from temp_fa
    where to_char(id)||'|'||to_char(val)||'|'||to_char(sal) in
         (select to_char(id)||'|'||to_char(val)||'|'||to_char(sal)
          from temp_fa
          minus
          select to_char(x.id)||'|'||to_char(x.val)||'|'||to_char(x.sal)
          from temp_fa x,
               (select id, val, max(sal) max_sal
                from temp_fa
                group by id, val) y
          where x.id = y.id
            and x.val = y.val
            and x.sal = y.max_sal);
    HR: XE > select * from temp_fa order by id;
            ID        VAL        SAL
             1        100         20
             2        200         10
             3        300         30
             4        400         10
             5        500         20
    HR: XE >
    Regards,
    Ion
    Edited by: user111444777 on Sep 25, 2009 10:42 PM
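    The rule the script above implements (keep a single row with the maximum SAL per (ID, VAL) key) can be cross-checked with a short Python sketch over the sample data; this is an illustration of the expected result, not a replacement for the SQL:

```python
# Sample rows (id, val, sal) from the question.
rows = [(1, 100, 10), (1, 100, 20), (1, 100, 20), (2, 200, 10), (3, 300, 10),
        (3, 300, 30), (4, 400, 10), (4, 400, 10), (5, 500, 10), (5, 500, 20),
        (5, 500, 20)]

# For each (id, val) key, keep only the maximum sal (one row even when the
# max appears twice).
best = {}
for id_, val, sal in rows:
    key = (id_, val)
    if key not in best or sal > best[key]:
        best[key] = sal

result = sorted((id_, val, sal) for (id_, val), sal in best.items())
# result == [(1, 100, 20), (2, 200, 10), (3, 300, 30), (4, 400, 10), (5, 500, 20)]
```

    This matches the "after deleting" table shown in the question.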

  • Duplicates AFTER deleting the DUPLICATES~!~~~~!!!!!  SERIOUSLY?!?!?!?

    I've spent days, actually a couple of weeks, deleting over 50 GB of duplicate content. Then I transferred everything over to an external HD so that I would have ONE large HD that held everything. And whaddaya know: all the songs whose extra copies I'd initially deleted now have upwards of 3 extra copies on the hard drive. AND when I reinstalled iTunes and attempted to re-import the songs, iTunes showed up to 8 extra copies.
    WHAT HAPPENED?!
    thanks for the help in advance.

    Do all of the duplicate songs play, or does it not allow you to play them? I would look in the folder where your computer has stored these songs and see if there are duplicates in there. If so, delete them so your library can't find them either; then go back into the library and play the duplicates. Whichever ones won't play are the ones you deleted from your computer, so delete those from your library too.
    Otherwise, you could purchase a program called transipod, which allows you to take songs from your iPod and put them on your computer. Create a new library, then just add the songs from your iPod. The only problem is that it doesn't transfer songs that share the same name; it only picks one of them for some reason. But I hope this helped; if not, I wish you the best of luck!

  • Please help me check whether the header record exists or not

    Hi,
    How do we check whether the header record exists or not? If it does not exist, I have to raise an exception. I want to do this using a UDF.
    Please help me, it's urgent.
    venkat

    Maybe you should read all the responses in your earlier threads, and read the rules of engagement as well:
    /thread/117188 [original link is broken]
    Regards
    Bhavesh

  • Check the duplicate functionality of CRM 7.0

    Dear Sirs,
    could someone give me some information about checking and eliminating duplicate business partner data records? This functionality should already be integrated in SAP CRM; how can I customize it?
    There is also a Data Quality Management framework for SAP CRM. Does someone know whether that framework is already integrated in SAP CRM, or does it have to be bought separately?
    thanks a lot.
    nice regards,
    Consi

    Hi,
    if you refer to CND* tables, those are the tables involved in the condition master data exchange between ERP and CRM (on the CRM side). The actual condition tables for pricing-related condition records start with CNC*. In the help there is documentation of the condition master data tables for pricing: http://help.sap.com/saphelp_crm70/helpdata/EN/0e/91f9392486ce1ae10000000a114084/content.htm
    Hope this helps.
    Best Regards,
    Michael

  • Setfilter to display all the duplicate rows pb7

    Friends,
    my version is powerbuilder 7.
    I am checking for duplicate rows in the DataWindow before inserting.
    In a button click event I have the code below:
    dw_master.SetSort ("rollno A")
    dw_master.Sort()
    dw_master.SetFilter("rollno = rollno[-1]")
    dw_master.Filter()
    if dw_master.RowCount() > 0 then
        MessageBox('Duplicate Rows','Check the Duplicate Roll numbers',StopSign!,OK!)
        Return 1
    end if
    //no dups, so clear the filter and do the update...
    dw_master.SetFilter ("")
    dw_master.Filter()
    I have 2 duplicate rows, but this SetFilter displays only one row.
    If it displayed all the duplicate rows it would be very useful for me.
    How can I modify this code to get all the duplicate rows in the DataWindow before inserting?
    thanks

    Instead of a filter you could do this...
    boolean lb_dupe_found=false
    long ll_row, ll_rowcount
    string ls_val1, ls_val2
    ll_rowcount = dw_1.rowcount()
    for ll_row = 2 to ll_rowcount
        ls_val1 = dw_1.getitemstring(ll_row, 'col1')
        ls_val2 = dw_1.getitemstring(ll_row - 1, 'col1')
        if ls_val1 = ls_val2 then
            // duplicate
            lb_dupe_found = true
            exit
        end if
    next
    if lb_dupe_found then
        messagebox('DUPE', 'Houston we have a problem!')
    else
        messagebox('SAVE', 'Data valid for saving')
    end if
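    A side note on why the original filter shows only one row: `rollno = rollno[-1]` compares each row to its predecessor, so it marks only the second (and later) copies of each group, never the first. To display every row of each duplicate group you have to count occurrences first. A rough Python stand-in for that logic (the roll numbers are invented sample data):

```python
from collections import Counter

# Sorted roll numbers, as after dw_master.Sort() (invented sample values).
rollnos = [101, 102, 102, 103, 104, 104, 104]

# Count each value once up front...
counts = Counter(rollnos)

# ...then keep EVERY row whose value occurs more than once,
# not just the second copy of each pair.
dupes = [r for r in rollnos if counts[r] > 1]
# dupes == [102, 102, 104, 104, 104]
```

    In DataWindow terms this corresponds to a two-pass scan (or a computed count column) rather than the one-row-lookback filter expression.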

  • Check for duplicate record in SQL database before doing INSERT

    Hey guys,
           This is part of a PowerShell app doing a SQL insert, but my question really relates to the SQL insert. I need to check the database PRIOR to doing the insert, to see whether a duplicate record exists; if it does, that record needs to be overwritten. I'm not sure how to accomplish this. My back end is SQL Server 2000. I'm piping the data into my insert statement from a PowerShell FileSystemWatcher app. In my scenario, if a file dumped into a directory starts with "I" it gets written to a SQL database; otherwise it gets written to an Access table. I know, silly, but that's the environment I'm in, haha.
    Any help is appreciated.
    Thanks in Advance
    Rich T.
    #### DEFINE WATCH FOLDERS AND DEFAULT FILE EXTENSION TO WATCH FOR ####
                $cofa_folder = '\\cpsfs001\Data_pvs\TestCofA'
                $bulk_folder = '\\cpsfs001\PVS\Subsidiary\Nolwood\McWood\POD'
                $filter = '*.tif'
                $cofa = New-Object IO.FileSystemWatcher $cofa_folder, $filter -Property @{ IncludeSubdirectories = $false; EnableRaisingEvents= $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite' }
                $bulk = New-Object IO.FileSystemWatcher $bulk_folder, $filter -Property @{ IncludeSubdirectories = $false; EnableRaisingEvents= $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite' }
    #### CERTIFICATE OF ANALYSIS AND PACKAGE SHIPPER PROCESSING ####
                Register-ObjectEvent $cofa Created -SourceIdentifier COFA/PACKAGE -Action {
           $name = $Event.SourceEventArgs.Name
           $changeType = $Event.SourceEventArgs.ChangeType
           $timeStamp = $Event.TimeGenerated
    #### CERTIFICATE OF ANALYSIS PROCESS BEGINS ####
                $test=$name.StartsWith("I")
         if ($test -eq $true) {
                $pos = $name.IndexOf(".")
           $left=$name.substring(0,$pos)
           $pos = $left.IndexOf("L")
           $tempItem=$left.substring(0,$pos)
           $lot = $left.Substring($pos + 1)
           $item=$tempItem.Substring(1)
                Write-Host "in_item_key $item in_lot_key $lot imgfilename $name in_cofa_crtdt $timestamp"  -fore green
                Out-File -FilePath c:\OutputLogs\CofA.csv -Append -InputObject "in_item_key $item in_lot_key $lot imgfilename $name in_cofa_crtdt $timestamp"
                start-sleep -s 5
                $conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=PVSNTDB33; Initial Catalog=adagecopy_daily; Integrated Security=TRUE")
                $conn.Open()
                $insert_stmt = "INSERT INTO in_cofa_pvs (in_item_key, in_lot_key, imgfileName, in_cofa_crtdt) VALUES ('$item','$lot','$name','$timestamp')"
                $cmd = $conn.CreateCommand()
                $cmd.CommandText = $insert_stmt
                $cmd.ExecuteNonQuery()
            $conn.Close()
    #### PACKAGE SHIPPER PROCESS BEGINS ####
            } elseif ($test -eq $false) {
            $pos = $name.IndexOf(".")
            $left = $name.Substring(0, $pos)
            $pos = $left.IndexOf("O")
            $tempItem = $left.Substring(0, $pos)
            $order = $left.Substring($pos + 1)
            $shipid = $tempItem.Substring(1)
            Write-Host "so_hdr_key $order so_ship_key $shipid imgfilename $name in_cofa_crtdt $timestamp" -fore green
            Out-File -FilePath c:\OutputLogs\PackageShipper.csv -Append -InputObject "so_hdr_key $order so_ship_key $shipid imgfilename $name in_cofa_crtdt $timestamp"
            }
            }
    Rich Thompson

    Hi
    Since SQL Server 2000 is out of support, I recommend upgrading to a higher version, such as SQL Server 2005 or SQL Server 2008.
    According to your description, you can try the following methods to check for duplicate records in SQL Server.
    1. You can use RAISERROR to check for the duplicate record: if it exists, RAISERROR; otherwise do the insert. A code block is given below:
    IF EXISTS (SELECT 1 FROM TableName AS t
               WHERE t.Column1 = @Column1
                 AND t.Column2 = @Column2)
    BEGIN
        RAISERROR('Duplicate records', 18, 1)
    END
    ELSE
    BEGIN
        INSERT INTO TableName (Column1, Column2, Column3)
        SELECT @Column1, @Column2, @Column3
    END
    2. You can also create a UNIQUE INDEX or UNIQUE CONSTRAINT on the column(s) of the table; when you try to INSERT a value that conflicts with the index/constraint, an exception will be thrown.
    Add the unique index:
    CREATE UNIQUE INDEX Unique_Index_name ON TableName(ColumnName)
    Add the unique constraint:
    ALTER TABLE TableName
    ADD CONSTRAINT Unique_Constraint_Name
    UNIQUE (ColumnName)
    Thanks
    Lydia Zhang
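    For the "overwrite if the record already exists" part of the original question, here is a small sketch using Python's built-in sqlite3 as a stand-in for SQL Server 2000 (the table and column names are borrowed from the watcher script above; SQLite's INSERT OR REPLACE plays the role that the IF EXISTS ... ELSE INSERT pattern plays on SQL Server):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# The UNIQUE constraint defines what counts as a "duplicate record".
conn.execute("""CREATE TABLE in_cofa_pvs (
                    in_item_key TEXT, in_lot_key TEXT, imgfilename TEXT,
                    UNIQUE (in_item_key, in_lot_key))""")
conn.execute("INSERT INTO in_cofa_pvs VALUES ('100', 'L1', 'a.tif')")

# A second insert with the same key overwrites the row instead of duplicating it.
conn.execute("INSERT OR REPLACE INTO in_cofa_pvs VALUES ('100', 'L1', 'b.tif')")

rows = conn.execute("SELECT imgfilename FROM in_cofa_pvs").fetchall()
# rows == [('b.tif',)]
```

    On SQL Server 2000 itself there is no INSERT OR REPLACE; you would wrap the IF EXISTS check and an UPDATE/INSERT in one statement or transaction, as in the T-SQL block above.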

  • To find the duplicate record in internal table

    Hi,
    I have a requirement to find duplicate records using 3 fields.
    I am getting a flat file with 15 fields.
    I need to check for duplicate records on 3 of the fields; if I get a 2nd record with the same 3 field values, that record should go to another internal table.
    For example:
    1. aaa  bbb ccc ddd  eee  fff  ggg   hhh
    2. aaa  bbb ccf  dde edd  ffg ggh   hhj
    3. aaa  bbb cce ddd  ees ffh  ggu  hhk
    Here the 1st and 3rd records are the same on the key fields (aaa, bbb, ddd),
    so I need to find the 3rd record.
    please help me
    regrards
    srinivasu

    hi,
    itab2[] = itab1[].
    sort itab1 by f1 f2 f3.
    sort itab2 by f1 f2 f3.
    delete itab2 index 1.   "to delete the first record in itab2.
    loop at itab1 into ws_itab1.
      loop at itab2 into ws_itab2.
       if ws_itab1-f1 = ws_itab2-f1 and
         ws_itab1-f2 = ws_itab2-f2 and
        ws_itab1-f3 = ws_itab2-f3.
         ws_itab3 = ws_itab2.
     append ws_itab3 to itab3.   "Third internal table.
       endif.
    endloop.
    delete itab2 index 1.   
    endloop.
    ITAB3 will have all the duplicate records.
    Regards,
    Subramanian
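    The same routing (first occurrence of a 3-field key stays, later occurrences go to another table) can be done in a single pass with a seen-set instead of the nested loops above. A Python sketch, with the key fields assumed from the example to be fields 1, 2 and 4 (aaa, bbb, ddd):

```python
records = [
    ("aaa", "bbb", "ccc", "ddd", "eee", "fff", "ggg", "hhh"),
    ("aaa", "bbb", "ccf", "dde", "edd", "ffg", "ggh", "hhj"),
    ("aaa", "bbb", "cce", "ddd", "ees", "ffh", "ggu", "hhk"),
]
seen = set()
itab_unique, itab_dup = [], []
for rec in records:
    key = (rec[0], rec[1], rec[3])    # the three key fields
    if key in seen:
        itab_dup.append(rec)          # 2nd+ occurrence goes to the other table
    else:
        seen.add(key)
        itab_unique.append(rec)
# itab_dup holds only the 3rd record
```

    This is O(n) instead of the O(n²) double loop; in ABAP the analogous trick is sorting by the key fields and comparing each row with the previous one.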

  • Duplicate invoice check in LIV

    Hi,
    How does the system perform the duplicate invoice check in LIV?
    We have activated the duplicate invoice check in the vendor master, and activated it in IMG via "SPRO-LIV-Incoming Invoice-Set Check for Duplicate Invoices" on the combination of company code, reference number and invoice date.
    Message M8 462 is set as an error message.
    In our case, we used the same reference number, invoice date, amount, currency, vendor code and company code at header level in MIR7, but we posted to a different G/L account, with no reference to a PO/SES number.
    Could anyone help with this scenario, and say whether the duplicate invoice check is also available on the G/L account tab?
    Thanks & Regards
    R>Saravanan
    Edited by: saravanan_rsa on Nov 30, 2009 2:35 PM

    Hi,
    Refer below link
    [http://www.erptips.com/Snippet1/rbjyatmlgc.pdf]
    [http://help.sap.com/erp2005_ehp_04/helpdata/EN/ce/4f3e39ea3aee02e10000000a114084/frameset.htm]
    Regards,
    Vikas

  • Duplicate Customers Check.

    We have activated the system check for duplicate customers. We found that it only compares the Name and the City of the customer in the general address data.
    We would also like to include one more field, "Street 5", in the check, so that it also compares the data in this field and warns the user that he is creating a customer with the same details, like CR no. or telephone no.
    Please advice.
    Thanks
    Mansur.

    Dear Pratik,
    This is a standard procedure. Follow the process below to activate duplicate checks.
    Procedure: Run SM30, key in table V_T100C, select Display or Maintain, set application area F2. Here you can change the current settings.
    Check transaction OBMSG for application area F2, where these messages are maintained; at this place you can convert the message from a warning to an error.
    The user needs one more field to be added to this check.
    Can you guide us on how to add a new field to this check for duplicate entries?
    regards,

  • Regarding find the duplicates using match transformation

    Hi ,
    I want to find duplicates on multiple fields. How can I pass the input to the Match transform? I gave the input by merging the input fields and passing the concatenation as the input. Is there any alternative way to feed the Match transform and find the duplicates without concatenating?
    Thanks & Regards,
    Ramana.

    Hi Sarthak,
    Thanks for your response. I am not looking for cross-field duplicates; I want to find duplicates on multiple fields, not across fields.
    E.g., take customer data:
    CUSTOMER NO   CUSTNAME        STREET           CITY     COUNTRY   POBOX   ZIPCODE   TELEPHONE
    1000          C.E.B. BERLIN   Kolping Str. 15  Berlin   DE                          06894/55501-0
    1001          C.E.B. BERLI    Kolping Str. 15  Berlin   DE                          06894/55501-0
    1002          C.E.B. BERLIN   Kolping Str.     Berlin   DE                          06894/55501-0
    1003          C.E.B. BERLIN   Kolping Str. 15  Berli    DE                          06894/55501-0
    Here all 4 records are potential duplicates: in the second record the last character N of the customer name is missing, in the 3rd record the street number is missing, and in the 4th record the last character n of the city is missing. If we concatenate the fields and compare, can we still catch these as duplicates? Without concatenating, is there any way to find duplicates in this type of data?
    Thanks in Advance.
    Regards,
    Ramana.
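    One way to think about "per-field, not concatenated" matching, as a hypothetical sketch rather than the actual Data Services Match transform API: score each field separately with a string-similarity measure and require every field to clear a threshold. The 0.85 threshold here is an illustrative assumption.

```python
from difflib import SequenceMatcher

def field_sim(a: str, b: str) -> float:
    # Case-insensitive similarity for a single field, in [0.0, 1.0].
    return SequenceMatcher(None, a.upper(), b.upper()).ratio()

def is_potential_dup(r1, r2, threshold=0.85):
    # Every field must be close for the pair to count as a potential duplicate.
    return min(field_sim(a, b) for a, b in zip(r1, r2)) >= threshold

rec_1000 = ("C.E.B. BERLIN", "Kolping Str. 15", "Berlin")
rec_1001 = ("C.E.B. BERLI",  "Kolping Str. 15", "Berlin")   # last char missing
rec_other = ("ACME GMBH",    "Hauptstr. 1",     "Munich")   # genuinely different

print(is_potential_dup(rec_1000, rec_1001))  # True
print(is_potential_dup(rec_1000, rec_other))  # False
```

    Per-field scoring catches records 1001-1003 above (each differs in only one field by one character), which a concatenated exact comparison would miss.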

  • How do i get rid of all the duplicates in my itunes

    I have uninstalled and reinstalled iTunes, and now I have two of every song.

    Uninstalling and reinstalling iTunes does not create duplicates.
    Only the user importing content multiple times creates duplicates.
    Delete the duplicates and stop importing the same content multiple times.

  • How did I get all the duplicates in my iTunes and how do i get rid of them!!

    1000 duplicates in itunes.  How do I fix that?
    ...and how did it happen.?

    Uninstalling and reinstalling iTunes does not create duplicates.
    Only the user importing content multiple times creates duplicates.
    Delete the duplicates and stop importing the same content multiple times.

  • How to get the original record Number in Multi-Record Block

    Hello Everyone,
    I know how to find a duplicate item in a multi-record block.
    For example:
    Line_Num    Item_Name    Quantity
    1           AA           10
    2           BB           20
    3           AA
    Here the 3rd record's Item_Name is duplicated. I am able to check it and display the message 'Item is duplicated'; I found this at [sheikyerbouti.developpez.com/duplicates/duplicates.htm].
    But I want to show the original line number, i.e. 1, as soon as the item_name is entered.
    I want to find the original Line_Num and display the message
    'Item is duplicated, update quantity in Original Line 1'.
    Can anyone help me to get this?
    Thank You.
    Regards,
    Guru.

    Hi Francois,
    Actually I want to check and show the message when the item_name is entered i.e WHEN-VALIDATE-ITEM TRIGGER.
    I put the following code in WHEN-VALIDATE-ITEM TRIGGER
    Declare
         curnum number;
         dupnum number;
         cur_item varchar2(100);
         v_alert_no number;
         p_linerec varchar2(100);
    Begin
    curnum := TO_NUMBER(:System.Trigger_Record);
    cur_item := :Lines.Item_number;
    First_Record;
    p_linerec := :Lines.Item_number;
    LOOP
    If p_linerec = cur_item then
         dupnum := :Lines.Line_num;
         set_alert_property('ALERT_STOP',ALERT_MESSAGE_TEXT,
    'Duplicate Item Found,Update QTY in Original line number '||dupnum);
       V_ALERT_NO := show_alert('ALERT_STOP');
       :LINES.ITEM_NUMBER := NULL;
    :LINES.ITEM_DESCRIPTION:= NULL;
    :LINES.ITEM_REVISION:= NULL;
    :LINES.ITEM_CATEGORY:= NULL;
    elsIF (:System.Last_Record = 'TRUE') THEN
         Go_Record(curnum);
         EXIT;
      ELSE
         Next_Record;
      END IF;
    END LOOP;
    End;
    But I am getting the following error:
    FRM-40737: Illegal Restricted Procedure FIRST_RECORD in WHEN-VALIDATE-ITEM trigger
    and then, for the first line itself, it shows:
    'Duplicate Item found. Update QTY in Original line number 1'
    So I added the condition
    If :Lines.Line_num > 1 then  -- only check when the block has more than one record
    but now it starts checking from the second record and displays
    'Duplicate Item found. Update QTY in Original line number 2' (instead of 'Update QTY in Original line number 1').
    Can you tell me how I can change the above code for my requirement?
    Thank you.
    Edited by: Gurujothi on 27 Mar, 2013 5:20 PM
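    The lookup the poster wants can be separated from record navigation entirely: given the row being validated, find the 1-based line number of the first earlier row with the same item. A Python sketch of that logic (a plain list stands in for the Forms block, which sidesteps the restricted FIRST_RECORD/NEXT_RECORD navigation inside WHEN-VALIDATE-ITEM):

```python
def original_line(items, current_index):
    # Scan only the rows BEFORE the row being validated.
    for i in range(current_index):
        if items[i] == items[current_index]:
            return i + 1          # 1-based Line_Num of the original row
    return None                   # no duplicate before this row

items = ["AA", "BB", "AA"]        # Item_Name column from the example
msg = None
dup_line = original_line(items, 2)  # validating the 3rd row (index 2)
if dup_line is not None:
    msg = f"Item is duplicated, update quantity in Original Line {dup_line}"
# msg == "Item is duplicated, update quantity in Original Line 1"
```

    In Forms terms the equivalent idea is to resolve the original line number without navigating records in the validation trigger, e.g. via a query against the base table or a record-group scan.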
