Urgent: Validation For Duplicate Record

Hi,
I have a simple tabular form for EMP (Number of Records Displayed = 10).
I'm inserting rows like:
Empno Ename Sal Deptno
1 ABC 10000 10
2 XYZ 20000 10
3 XXX 25000 10
4 YYY 30000 10
1 --- Err
If I try to insert 1 again, the system should give the message "1 already exists"
before I save the transaction.
Note: this validation should happen when the item is validated (i.e., at item validation time).
Thanks In Advance
Ahmed.

Hi,
This topic has been discussed and code is available elsewhere on this forum, but if you can't spend the time, this procedure should be handy. You have to call this procedure and make a slight change where it will give an error at compile time (the line marked in the comment). The OUT parameter called dup returns a value telling you whether the item is a duplicate. If you have any more doubts, reply to [email protected]
PROCEDURE check_dupitem (item VARCHAR2, prod VARCHAR2, dup OUT VARCHAR2)
IS
   drno NUMBER;
BEGIN
   :GLOBAL.itr := 'Y';
   IF :SYSTEM.cursor_record = '1' AND :SYSTEM.last_record = 'TRUE'
   THEN
      :GLOBAL.itr := 'N';
   ELSE
      drno := :SYSTEM.cursor_record;
      dup := 'N';
      first_record;
      LOOP
         IF TO_CHAR (drno) != :SYSTEM.cursor_record
         THEN
            /* change the next line to reference your own items */
            IF item = :tixd_itemcode AND prod = NVL (:tixd_prodcode, '0')
            THEN
               message ('Duplicate Item! Enter a Valid Item!');
               dup := 'Y';
               EXIT;
            END IF;
            IF :SYSTEM.last_record = 'TRUE'
            THEN
               EXIT;
            END IF;
         END IF;
         down;
      END LOOP;
      go_record (TO_CHAR (drno));
      :GLOBAL.itr := 'N';
   END IF;
END;
regards
Rajesh
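The procedure above is Oracle Forms-specific, but the core idea (walk the rows already entered in the block and compare keys before accepting a new value) can be sketched in plain Python. This is a minimal sketch only; the record layout is illustrative, not taken from the original form:

```python
def check_dup(entered_rows, new_empno):
    """Return 'Y' if new_empno is already present among the entered rows, else 'N'."""
    for row in entered_rows:
        if row["empno"] == new_empno:
            return "Y"  # duplicate: the form would show a message and reject the value
    return "N"

# Rows already typed into the block, before the transaction is saved
rows = [
    {"empno": 1, "ename": "ABC"},
    {"empno": 2, "ename": "XYZ"},
    {"empno": 3, "ename": "XXX"},
]

print(check_dup(rows, 1))  # Y  (1 already exists)
print(check_dup(rows, 4))  # N  (new value, accept it)
```

In Forms the same scan has to juggle :SYSTEM.cursor_record and navigation built-ins (first_record, down, go_record), because the block itself is the only place the unsaved rows live; that is what makes the PL/SQL version above so much longer.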

Similar Messages

  • Check for duplicate record in SQL database before doing INSERT

    Hey guys,
    This is part of a PowerShell app doing a SQL insert, but my question really relates to the SQL side. I need to check the database PRIOR to doing the insert for duplicate records, and if one exists then that record needs
    to be overwritten. I'm not sure how to accomplish this task. My back end is SQL Server 2000. I'm piping the data into my insert statement from a PowerShell FileSystemWatcher app. In my scenario, if a file dropped into a directory starts with "I" it gets
    written to a SQL database; otherwise it gets written to an Access table. I know, silly, but that's the environment I'm in. Haha.
    Any help is appreciated.
    Thanks in advance,
    Rich T.
    #### DEFINE WATCH FOLDERS AND DEFAULT FILE EXTENSION TO WATCH FOR ####
                $cofa_folder = '\\cpsfs001\Data_pvs\TestCofA'
                $bulk_folder = '\\cpsfs001\PVS\Subsidiary\Nolwood\McWood\POD'
                $filter = '*.tif'
                $cofa = New-Object IO.FileSystemWatcher $cofa_folder, $filter -Property @{ IncludeSubdirectories = $false; EnableRaisingEvents= $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite' }
                $bulk = New-Object IO.FileSystemWatcher $bulk_folder, $filter -Property @{ IncludeSubdirectories = $false; EnableRaisingEvents= $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite' }
    #### CERTIFICATE OF ANALYSIS AND PACKAGE SHIPPER PROCESSING ####
                Register-ObjectEvent $cofa Created -SourceIdentifier COFA/PACKAGE -Action {
           $name = $Event.SourceEventArgs.Name
           $changeType = $Event.SourceEventArgs.ChangeType
           $timeStamp = $Event.TimeGenerated
    #### CERTIFICATE OF ANALYSIS PROCESS BEGINS ####
                $test=$name.StartsWith("I")
         if ($test -eq $true) {
                $pos = $name.IndexOf(".")
           $left=$name.substring(0,$pos)
           $pos = $left.IndexOf("L")
           $tempItem=$left.substring(0,$pos)
           $lot = $left.Substring($pos + 1)
           $item=$tempItem.Substring(1)
                Write-Host "in_item_key $item in_lot_key $lot imgfilename $name in_cofa_crtdt $timestamp"  -fore green
                Out-File -FilePath c:\OutputLogs\CofA.csv -Append -InputObject "in_item_key $item in_lot_key $lot imgfilename $name in_cofa_crtdt $timestamp"
                start-sleep -s 5
                $conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=PVSNTDB33; Initial Catalog=adagecopy_daily; Integrated Security=TRUE")
                $conn.Open()
                $insert_stmt = "INSERT INTO in_cofa_pvs (in_item_key, in_lot_key, imgfileName, in_cofa_crtdt) VALUES ('$item','$lot','$name','$timestamp')"
                $cmd = $conn.CreateCommand()
                $cmd.CommandText = $insert_stmt
                $cmd.ExecuteNonQuery()
                $conn.Close()
    #### PACKAGE SHIPPER PROCESS BEGINS ####
              }
              elseif ($test -eq $false) {
                $pos = $name.IndexOf(".")
           $left=$name.substring(0,$pos)
           $pos = $left.IndexOf("O")
           $tempItem=$left.substring(0,$pos)
           $order = $left.Substring($pos + 1)
           $shipid=$tempItem.Substring(1)
                Write-Host "so_hdr_key $order so_ship_key $shipid imgfilename $name in_cofa_crtdt $timestamp"  -fore green
                Out-File -FilePath c:\OutputLogs\PackageShipper.csv -Append -InputObject "so_hdr_key $order so_ship_key $shipid imgfilename $name in_cofa_crtdt $timestamp"
    Rich Thompson

    Hi
    Since SQL Server 2000 is out of support, I recommend you upgrade to a higher version, such as SQL Server 2005 or SQL Server 2008.
    According to your description, you can try the following methods to check for a duplicate record in SQL Server.
    1. You can use RAISERROR to report the duplicate: if the record exists, raise the error; otherwise insert. A code block is given below:
    IF EXISTS (SELECT 1 FROM TableName AS t
               WHERE t.Column1 = @Column1
               AND t.Column2 = @Column2)
    BEGIN
        RAISERROR('Duplicate records', 18, 1)
    END
    ELSE
    BEGIN
        INSERT INTO TableName (Column1, Column2, Column3)
        SELECT @Column1, @Column2, @Column3
    END
    2. Alternatively, you can create a UNIQUE INDEX or UNIQUE CONSTRAINT on the relevant column(s) of the table; when you try to INSERT a value that conflicts with the index/constraint, an exception will be thrown.
    Add the unique index:
    CREATE UNIQUE INDEX Unique_Index_Name ON TableName(ColumnName)
    Add the unique constraint:
    ALTER TABLE TableName
    ADD CONSTRAINT Unique_Constraint_Name
    UNIQUE (ColumnName)
    Thanks,
    Lydia Zhang
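    Lydia's first option (check, then RAISERROR or insert) can be sketched outside SQL Server as well. The example below uses Python with sqlite3 purely so it is self-contained; the table and column names are made up, and a ValueError stands in for RAISERROR:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (vendor TEXT, ref TEXT)")

def insert_if_new(vendor, ref):
    # Check for an existing row first (the IF EXISTS branch)...
    cur = conn.execute(
        "SELECT 1 FROM invoices WHERE vendor = ? AND ref = ?", (vendor, ref)
    )
    if cur.fetchone():
        raise ValueError("Duplicate records")  # stands in for RAISERROR
    # ...otherwise insert (the ELSE branch)
    conn.execute("INSERT INTO invoices VALUES (?, ?)", (vendor, ref))

insert_if_new("ACME", "333")
try:
    insert_if_new("ACME", "333")
except ValueError as e:
    print(e)  # Duplicate records
```

    Note that check-then-insert is racy under concurrent writers (two sessions can both pass the check before either inserts), which is why the unique index/constraint in option 2 is the more robust choice.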

  • Validation for duplicate AP invoice

    We would like to prevent duplicate AP invoices from being posted through FB60. We created an FI validation in OB28 in which the prerequisite is Tcode = FB60, and the check is a user exit that compares the values in the vendor / company code / reference fields; if those 3 fields have the same values as an existing document, it generates an error message preventing the duplicate document from being posted.
    The above works well for creating new invoices. However, when we have to make changes to an existing invoice (created through FB60), the system also runs that validation and generates the error message, even though the only fields we can change on the invoice are payment terms, payment method, payment method supplement, and text.
    Other than modifying the user exit, does anybody know of a better way to do this validation or to prevent duplicate manual invoices from being created?
    thank you.

    This has already been answered before, so here is an extract from the reply.
    There are two duplicate checks in SAP:
    1. The one for FI documents. This is relevant for all F... postings (but not for MIRO!).
    It can be turned on or off (in the vendor master record) and 6 fields are checked:
    Check Flag for Double Invoices or Credit Memos
    Indicator which means that incoming invoices and credit memos are checked for double entries at the time of entry.
    Use
    Checking Logistics documents
    Firstly, the system checks whether the invoice documents have already been entered in the Logistics invoice verification; the system checks invoices that are incorrect, or invoices that were entered for invoice verification in the background.
    Checking FI documents
    The system then checks whether there are FI or Accounting documents that were created with the original invoice verification or the Logistics verification, and where the relevant criteria are the same.
    Checking Logistics documents
    In checking for duplicate invoices, the system compares the following characteristics by default:
    Vendor
    Currency
    Company code
    Gross amount of the invoice
    Reference document number
    Invoice document date
    If all of these characteristics are the same, the system issues a message that you can customize.
    When you enter credit memos or subsequent adjustments, the system does not check for duplicate invoices.
    Exception: Country-specific solution for Argentina, where invoices and credit memos are checked for duplicate documents.
    No message is issued if you enter a document that has previously been reversed.
    Dependencies
    The system only checks for duplicate invoices in Materials Management if you enter the reference document number upon entering the invoice.
    In Customizing for the Logistics invoice verification, you can specify that the following characteristics should not be checked:
    Reference document number
    Invoice document date
    Company code
    This means that you can increase the likelihood that the system will find a duplicate invoice, because you can reduce the number of characteristics checked.
    Example
    The following document has already been entered and posted:
    Reference document number: 333
    Invoice date: 04/28/00
    Gross invoice amount: 100.00
    Currency: EUR
    Vendor: Spencer
    Company code: Munich
    You have made the following settings in Customizing:
    The field "Reference document number" and "Company code" are deselected, which means that these characteristics will not be checked.
    Now you enter the following document:
    Reference document number: 334
    Invoice date: 04/28/00
    Gross invoice amount: 100.00
    Currency: EUR
    Vendor: Spencer
    Company code: Berlin
    Result
    Because you entered a reference document when you entered the invoice, the system checks for duplicate invoices.
    The reference document number and the company code are different from the invoice entered earlier, but these characteristics are not checked due to the settings you have made in Customizing.
    All other characteristics are the same. As a result, the system issues a message that a duplicate entry has been made.
    If the "Reference document number" had been selected in Customizing, the system would have checked the document and discovered that it was different from the invoice entered earlier, so it would not have issued a message.
    Checking FI documents
    Depending on the entry in the field "Reference", one of the following checks is carried out:
    1. If a reference number was specified in the sequential invoice/credit memo, the system checks whether an invoice/credit memo has been posted where all the following attributes agree:
    Company code
    Vendor
    Currency
    Document date
    Reference number
    2. If no reference number was specified in the sequential invoice/credit memo, the system checks whether an invoice/credit memo has been posted where all the following attributes agree:
    Company code
    Vendor
    Currency
    Document date
    Amount in document currency
    2. The one for LIV.
    This can be customized (as stated above) and is relevant for MIRO (and not for F... postings!). The criteria are company code, reference, and invoice date.
    Set Check for Duplicate Invoices
    In this step, you can configure for each company code if the system is to check for duplicate invoices when you enter invoices.
    This check should prevent incoming invoices being accidentally entered and paid more than once.
    You can choose whether to activate or deactivate the check criteria of company code, reference document number and invoice date for each company code. The more criteria that you activate, the lower the probability of the system finding a duplicate invoice. The Accounting documents are checked first, followed by documents from Logistics Invoice Verification (only incorrect invoices or those entered for verification in the background).
    When checking for duplicate invoices, the system compares the following attributes in the standard system:
    Vendor
    Currency
    Company code
    Gross invoice amount
    Reference document number
    Invoice date
    If the system finds an invoice that matches all attributes, it displays a customizable message.
    If you are entering credit memos, subsequent debits, or subsequent credits, the system does not check for duplicate invoices.
    The exception is the Argentina country version, where the system checks for duplicate invoices and credit memos.
    If a previously processed document is later cancelled and then entered again, no message is displayed.
    Requirements
    The system only checks for duplicate invoices in Materials Management
    if you specify a reference document number when entering the invoice.
    In Customizing (IMG) for Invoice Verification, you can specify that the
    system check the following attributes

  • Need a query for duplicate records deletion

    Here is one scenario...
    23130 ----> 'A'
    23130 ----> 'X'
    23130 ----> 'C'
    These are duplicate records. When we remove duplicates, the record must keep 'C' if it contains A, C, and X. If it contains only A and X, then the record must keep 'X'. That is, the priority goes C --> X --> A. I need a query for this scenario. It would be great if you could reply ASAP.

    Hello
    It's great that you gave examples of your data, but it is quite helpful to supply create table and insert statements too along with a clear example of expected results. Anyway, I think this does what you are looking for.
    CREATE TABLE dt_dup (ID NUMBER, flag VARCHAR2(1));
    INSERT INTO dt_dup VALUES(23130, 'A');
    insert into dt_dup values(23130, 'X');
    insert into dt_dup values(23130, 'C');
    INSERT INTO dt_dup VALUES(23131, 'A');
    INSERT INTO dt_dup VALUES(23131, 'X');
    DELETE
    FROM
      dt_dup
    WHERE
      ROWID IN (  SELECT
                    rid
                  FROM
                    (   SELECT
                          rowid rid,
                          ROW_NUMBER() OVER (PARTITION BY ID ORDER BY CASE
                                                                        WHEN flag = 'A' THEN
                                                                          3
                                                                        WHEN flag = 'X' THEN
                                                                          2
                                                                        WHEN flag = 'C' THEN
                                                                          1
                                                                      END
                                            ) rn
                        FROM
                          dt_dup
                    )
                  WHERE
                    rn > 1
               );
    select * from dt_dup;
    HTH
    David
    Edited by: Bravid on Jun 30, 2011 8:12 AM
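    The priority rule that the ROW_NUMBER ordering encodes (keep 'C' if present, else 'X', else 'A') can also be sketched procedurally, e.g. in Python, which may make the intent of the CASE expression easier to see. Sample data as in David's insert statements:

```python
PRIORITY = {"C": 1, "X": 2, "A": 3}  # lower number = higher priority, mirrors the CASE expression

def dedup(rows):
    """Keep one (id, flag) pair per id, choosing the highest-priority flag."""
    best = {}
    for rec_id, flag in rows:
        if rec_id not in best or PRIORITY[flag] < PRIORITY[best[rec_id]]:
            best[rec_id] = flag
    return sorted(best.items())

rows = [(23130, "A"), (23130, "X"), (23130, "C"), (23131, "A"), (23131, "X")]
print(dedup(rows))  # [(23130, 'C'), (23131, 'X')]
```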

  • Searching for duplicate records in qualified lookup tables

    Hi SDNers,
    I would like to know how to find out how many duplicate records there are in qualified lookup tables.
    In free-form search or drill-down search, I select the particular qualified lookup table. After that, how can I find the duplicated records?
    Any solution?
    Thanks
    Ravi

    Hi,
    If you want to find the duplicates present in a qualified table, you can go to the qualified table in Data Manager and see the duplicates there. If instead you want to find the duplicate links to a record in the main table, you can write an expression in free-form search to get the count of duplicate links, but that would be specific to a single link value. E.g. if Country is your non-qualifier, then in free-form search you can write an expression to get how many links there are for USA.
    Hope this clarifies your doubt. If not, please elaborate on the requirement.
    Regards,
    Arafat.

  • How to tune the query for duplicate records while joining the two tables

    hi, I am executing a query which retrieves from multiple tables, one of which has duplicate records. How do I get a single record?

    Not enough info... the subject says "tune" the query, the message says "write" the query... and where is the actual query that you tried?

  • Validation for Duplicate Invoice Entry

    Hi Everyone,
    Currently, it seems that SAP reviews an incoming invoice for the reference field (invoice number), invoice date, vendor, and company code. If the combination of criteria matches, a warning message appears asking the user to review the entry due to a possible duplication. I've since changed the warning message to an error so that users cannot post the duplicate invoice.
    Now, what is the process for stopping an invoice from being posted that matches on the following criteria: (1) vendor, (2) invoice number, (3) amount? Would this be a validation formula, some sort of user exit, or is there standard SAP IMG configuration that can do it?
    Thanks for the assistance,
    Pete

    hi Peter,
    This should be a substitution exit; it should happen in FI, at line-item level (check transaction GGB1), for all vendor credit postings (posting key between 31 and 39). The substitution has to be done on the amount field (WRBTR), because this is the first point where you have all the information you need (I guess the vendor invoice number is saved in the header). You have to write a small piece of code in the exit to select possible vendor invoices from BSIK and BSAK. If any are found, you should issue an error message and block the process.
    hope this helps
    ec

  • Problem in putting validation for detail records

    Hi,
    I have a customer entity/ view object, and also an address entity/ view object. They are linked by an association/ view link. I need to put a chk that a customer should have at least one address. For this i created a validateCustomer method validator. In this i used the following code -
    RowIterator addresses = getAddresssDtl();
    if(addresses.getRowCount() <= 0)
    throw new JboException(String.valueOf(addresses.getRowCount()));
    else
    return true;
    When i test the application module (the address via customer view), i add a customer, do not commit, add an address, then place my cursor in one of the customer fields, press the commit button. The exeption is raised even though i have an address.
    Please suggest a solution. Why does getAddressDtl() not take into consideration the records in the current rowset of the child?
    Thanks for any help.
    Aparna.
    null

    If you are using JDev 3.0, then you need to set the command line flag:
    -Djbo.assoc.consistent=true
    In JDev 3.1, this flag is set by default, so you don't have to set it.
    I believe this is documented in the Release Notes.

  • Duplicate record issue in IP file

    Hello All,
    I have a requirement to handle duplicate records in an IP flat file.
    My interface is very simple: just load the data using an IP interface. There is no IP input query, just a flat-file load.
    My requirement is to apply a validation check: if the file has two similar records, the file should be rejected.
    Example:
    Field1   Field2   Amount
    XXX       ABC    100
    XXX       ABC    100
    The file should be rejected. With the standard functionality, the data is summed up in the cube.
    Is there any way to handle that?
    Thanks
    Samit

    I don't think you can do it; this is standard. Maybe you can write your own class to check for duplicate records, use that class in a custom planning function, and throw an error message.
    Best is to make sure the end users take responsibility for the data.
    Arun
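    Arun's idea of a custom duplicate check on the file before loading could look roughly like this. This is a hedged sketch only: it assumes the flat file is comma-separated, which the thread does not state:

```python
import csv
import io

def file_has_duplicates(text):
    """Return True if any two data rows in the (assumed CSV) file are identical."""
    rows = list(csv.reader(io.StringIO(text)))
    return len(rows) != len(set(map(tuple, rows)))

good = "XXX,ABC,100\nXXX,DEF,100\n"
bad = "XXX,ABC,100\nXXX,ABC,100\n"
print(file_has_duplicates(good))  # False -> load the file
print(file_has_duplicates(bad))   # True  -> reject the whole file
```

    In a custom planning function, the same check would run over the parsed records and raise an error message instead of returning a flag.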

  • How to create duplicate records in end routines

    Hi
    Key fields in DSO are:
    Plant
    Storage Location
    MRP Area
    Material
    Changed Date
    Data Fields:
    Safety Stock
    Service Level
    MRP Type
    Counter_1 (In flow Key figure)
    Counter_2 (Out flow Key Figure)
    n_ctr  (Non Cumulative Key Figure)
    For every record that comes in, we need to create a duplicate record. For the original record, we need to set Counter_1 to 1 and Counter_2 to 0. For the duplicate record, we need to update Changed Date to today's date, keep the rest of the values as-is, and set Counter_1 to 0 and Counter_2 to -1. Where is the best place to write this code in the DSO? Is it the end
    routine?
    Please let me know a basic idea of the code.

    Hi Uday,
    I have same situation like Suneel and have written your logic in End routine DSO as follows:
    DATA: l_t_duplicate_records TYPE TABLE OF TYS_TG_1,
          l_w_duplicate_record TYPE TYS_TG_1.
    LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
        MOVE-CORRESPONDING <result_fields> TO l_w_duplicate_record.
        <result_fields>-/BIC/ZPP_ICNT = 1.
        <result_fields>-/BIC/ZPP_OCNT = 0.
        l_w_duplicate_record-CH_ON = sy-datum.
        l_w_duplicate_record-/BIC/ZPP_ICNT = 0.
        l_w_duplicate_record-/BIC/ZPP_OCNT = -1.
        APPEND l_w_duplicate_record TO  l_t_duplicate_records.
    ENDLOOP.
    APPEND LINES OF l_t_duplicate_records TO RESULT_PACKAGE.
    I am getting the error below:
    Duplicate data record detected (DS ZPP_O01, data package: 000001, data record: 4) RSODSO_UPDATE 19
    I have a different requirement for the date. My requirement is to populate the CH_ON date as follows:
    sort the records based on the key, get the latest CH_ON value for each unique plant, storage location, and material combination, and populate
    that CH_ON value for the duplicate record.
    Please help me resolve this issue.
    Thanks,
    Ganga
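    The duplicate-and-counter logic itself is easy to state outside ABAP. A hedged Python sketch (field names are illustrative, not the actual InfoObject names):

```python
def expand(records, today):
    """For each incoming record emit the original (counter_1=1, counter_2=0) plus a
    copy with today's date and counter_1=0, counter_2=-1."""
    out = []
    for rec in records:
        out.append(dict(rec, counter_1=1, counter_2=0))
        out.append(dict(rec, changed_date=today, counter_1=0, counter_2=-1))
    return out

recs = [{"material": "M1", "changed_date": "2011-01-01"}]
for r in expand(recs, "2011-06-30"):
    print(r)
```

    Because Changed Date is part of the DSO key, the copy differs from the original only in that field; two source records that already share the same key and date will still collide, which is consistent with the "Duplicate data record detected" error Ganga reports.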

  • Ajax checking for duplicate records

    i am trying to have a form in Ajax that will also insert a new
    contact... I have it working with the insert part, but I want to have
    it check for duplicate records before inserting the contact... here
    is the form... and I also have a CFC for the functions
    <cfcomponent output="false">
    <cfset this.dsn="myserver">
    <!--- Populates the grower list Select --->
    <cffunction name="getCompany" access="remote"
    returntype="array">
    <cfset var rsData="">
    <cfset var myReturn=ArrayNew(2)>
    <cfset var i=0>
    <cfquery name="rsData" datasource="myserver">
    SELECT cid ,Company
    FROM Contacts
    order by Company asc
    </cfquery>
    <cfloop query="rsData">
    <cfset myReturn[rsData.currentrow] [1]=rsdata.cid>
    <cfset myReturn[rsData.currentrow] [2]=rsdata.Company>
    </cfloop>
    <cfreturn myReturn>
    </cffunction>
    <!--- Populates list related to grower --->
    <cffunction name="getcontacts" access="remote"
    returntype="array">
    <cfargument name="cid" type="string" required="no">
    <cfset var rsData="">
    <cfset var myReturn=Arraynew(2)>
    <cfset var i=0>
    <cftry>
    <cfquery name="rsdata" datasource="myserver">
    SELECT cid, Company, FullName, Lname, Fname, Address1,
    Address2, City, State, zip, country, Phone, ext, cell, Fax,
    tollfree, Web, Email, Title, ExecutiveTitle
    FROM Contacts
    WHERE Contacts.Company = '#arguments.cid#'
    </cfquery>
    <cfcatch type="any">
    <cfset returnStruct.success = false />
    <cfset returnStruct.message = cfcatch.message />
    </cfcatch>
    </cftry>
    <cfloop query="rsdata">
    <cfif rsdata.recordcount gt 0>
    <cfset myReturn[rsData.currentrow] [1]=rsdata.cid>
    <cfset myReturn[rsData.currentrow] [2]=rsdata.FullName>
    <!--- <cfelse>
    <cfset rsdata.cid = 999999>
    <cfset rsdata.FullName = 'none'>
    <cfset myReturn[rsData.currentrow] [1]=rsdata.cid>
    <cfset myReturn[rsData.currentrow] [2]=rsdata.FullName>
    --->
    </cfif>
    </cfloop>
    <cfreturn myReturn>
    </cffunction>
    <!--- Gets contact info for 2page
    contacts!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    --->
    <cffunction name="getcontactsinfo2" access="remote"
    returnType="struct">
    <cfargument name="growerName" type="string"
    required="no">
    <cfset var data="">
    <cfset var c = "">
    <cfset var s = structNew()>
    <cftry>
    <cfquery name="data" datasource="myserver">
    SELECT cid, Company, FullName, Lname, Fname, Address1,
    Address2, City, State, zip, country, Phone, ext, cell, Fax,
    tollfree, Web, Email, Title, ExecutiveTitle
    FROM Contacts
    WHERE Company= <cfqueryparam cfsqltype="cf_sql_varchar"
    value="#arguments.growerName#">
    </cfquery>
    <cfloop list="#data.columnlist#" index="c">
    <cfset s[c] = data[c][1]>
    </cfloop>
    <cfcatch type="any">
    <cfset returnStruct.success = false />
    <cfset returnStruct.message = cfcatch.message />
    </cfcatch>
    </cftry>
    <cfreturn s>
    </cffunction>
    <cffunction name="getcontactsinfo" access="remote"
    returnType="struct">
    <cfargument name="cid" type="numeric" required="true">
    <cfset var data="">
    <cfset var c = "">
    <cfset var s = structNew()>
    <cftry>
    <cfquery name="data" datasource="myserver">
    SELECT cid, Company, FullName, Lname, Fname, Address1,
    Address2, City, State, zip, country, Phone, ext, cell, Fax,
    tollfree, Web, Email, Title, ExecutiveTitle
    FROM Contacts
    WHERE contacts.cid= <cfqueryparam
    cfsqltype="cf_sql_integer" value="#arguments.cid#">
    </cfquery>
    <!--- --->
    <cfloop list="#data.columnlist#" index="c">
    <cfset s[c] = data[c][1]>
    </cfloop>
    <cfcatch type="any">
    <cfset returnStruct.success = false />
    <cfset returnStruct.message = cfcatch.message />
    </cfcatch>
    </cftry>
    <cfreturn s>
    </cffunction>
    <!--- Updates the database of grower contacts --->
    <cffunction name="markTaskComplete" output="false"
    returntype="struct" access="remote" hint="i mark a task
    complete">
    <cfargument name="cid2" type="numeric" required="true"
    />
    <cfargument name="company2" type="string" required="true"
    />
    <cfargument name="address2" type="string" required="true"
    />
    <cfargument name="city2" type="string" required="true"
    />
    <cfargument name="state2" type="string" required="true"
    />
    <cfargument name="zip2" type="string" required="true"
    />
    <cfset var qMarkTaskComplete = "" />
    <cfset var returnStruct = structNew() />
    <cfset returnStruct.success = true />
    <cfset returnStruct.taskID = arguments.cid2 />
    <cftry>
    <cfquery name="qMarkTaskComplete"
    datasource="myserver">
    UPDATE
    Contacts
    SET
    Company = <cfqueryparam value="#arguments.company2#"
    cfsqltype="cf_sql_varchar" /> ,
    Address1 = <cfqueryparam value="#arguments.address2#"
    cfsqltype="cf_sql_varchar" /> ,
    City = <cfqueryparam value="#arguments.city2#"
    cfsqltype="cf_sql_varchar" />,
    State = <cfqueryparam value="#arguments.state2#"
    cfsqltype="cf_sql_varchar" />,
    zip = <cfqueryparam value="#arguments.zip2#"
    cfsqltype="cf_sql_varchar" />
    WHERE
    cid = <cfqueryparam value="#arguments.cid2#"
    cfsqltype="cf_sql_integer" />
    </cfquery>
    <cfcatch type="Database">
    <cfset returnStruct.success = false />
    <cfset returnStruct.message = cfcatch.message />
    </cfcatch>
    </cftry>
    <cfdump var="#returnStruct#"/>
    <cfreturn returnStruct />
    </cffunction>
    <cffunction name="lookupGrower" access="remote"
    returntype="array">
    <cfargument name="search" type="any" required="false"
    default="">
    <!--- Define variables --->
    <cfset var data="">
    <cfset var result=ArrayNew(1)>
    <!--- Do search --->
    <cfquery name="data" datasource="myserver">
    SELECT cid, Company, FullName, Lname, Fname, Address1,
    Address2, City, State, zip, country, Phone, ext, cell, Fax,
    tollfree, Web, Email, Title, ExecutiveTitle
    FROM Contacts
    WHERE (Company LIKE '#ARGUMENTS.search#%')
    </cfquery>
    <!--- Build result array --->
    <cfloop query="data">
    <cfset ArrayAppend(result, Company)>
    </cfloop>
    <!--- And return it --->
    <cfreturn result>
    </cffunction>
    <cffunction name="markTaskComplete2" output="false"
    returntype="struct" access="remote" hint="i mark a task
    complete">
    <cfargument name="company3" type="string" required="true"
    />
    <cfargument name="address3" type="string" required="true"
    />
    <cfargument name="city3" type="string" required="true"
    />
    <cfargument name="state3" type="string" required="true"
    />
    <cfargument name="zip3" type="string" required="true"
    />
    <cfset var qMarkTaskComplete = "" />
    <cfset var returnStruct = structNew() />
    <cfset returnStruct.success = true />
    <cfset returnStruct.taskID = arguments.company3 />
    <cftry>
    <cfquery name="qMarkTaskComplete"
    datasource="myserver">
    INSERT INTO KYIntranet.dbo.Contacts
    (Company
    ,Address1
    ,City
    ,State
    ,zip)
    VALUES
    (<cfqueryparam value="#arguments.company3#"
    cfsqltype="cf_sql_varchar" /> , <cfqueryparam
    value="#arguments.address3#" cfsqltype="cf_sql_varchar" />
    ,<cfqueryparam value="#arguments.city3#"
    cfsqltype="cf_sql_varchar" />,<cfqueryparam
    value="#arguments.state3#" cfsqltype="cf_sql_varchar"
    />,<cfqueryparam value="#arguments.zip3#"
    cfsqltype="cf_sql_varchar" />)
    </cfquery>
    <cfcatch type="any">
    <cfset returnStruct.success = false />
    <cfset returnStruct.message = cfcatch.message />
    </cfcatch>
    </cftry>
    <cfreturn returnStruct />
    </cffunction>
    </cfcomponent>

    From my understanding, the table that gets inserted into is
    KYIntranet.dbo.Contacts, and I guess the Company field in that table
    needs to be checked for duplicates. So you can make the Company
    field the primary key for that table. Inserting the same company
    name then results in a primary-key exception, meaning a duplicate
    has been entered. This way you can check for duplicates.
    <cftry>
    <cfquery>Insert SQL code</cfquery>
    <cfcatch>
    <!--- Code to handle the primary key exception --->
    </cfcatch>
    </cftry>
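    The let-the-key-reject-it approach above works with any database driver that raises an exception on a constraint violation. A Python/sqlite3 sketch, where the table and column names are stand-ins for KYIntranet.dbo.Contacts:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (company TEXT PRIMARY KEY, city TEXT)")

def add_contact(company, city):
    """Insert a contact; return False instead of inserting if the company already exists."""
    try:
        conn.execute("INSERT INTO contacts VALUES (?, ?)", (company, city))
        return True
    except sqlite3.IntegrityError:
        return False  # duplicate company: the primary key rejected the insert

print(add_contact("ACME", "Louisville"))  # True
print(add_contact("ACME", "Lexington"))   # False
```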

  • Duplicate Records

    I'm still new to SBO. What is a good way to check data for duplicate records? I want any duplicate records to go into another table and only the first to pass.
    Thanks for your help

    Depends on how you want to find duplicates.
    You can find more detail here:
    http://wiki.sdn.sap.com/wiki/display/BOBJ/DeDuplicatesourcedata

  • Duplicate records in delta load? Please help! Will assign points

    Hi all,
    I am extracting payroll data with datasource 0HR_PY_1 for the 0PY_C02 cube.
    I ran a full load with selection criteria 01.2007 to 02.2007 in the InfoPackage and extracted 20,000 records. Then
    I ran an init of delta without data transfer, which extracted 0 records as expected.
    Then I ran a delta with selection criteria 02.2007 to 01.2010 in the InfoPackage and extracted 4,500 records, in which the February records were extracted again.
    What could be the reason for duplicate records occurring in the delta load?
    I have seen the same records in the full load with selection criteria 01.2007 to 02.2007 as well as with selection criteria 02.2007 to 01.2010. How is that possible?
    Actually, the datasource 0HR_PY_1 does not support delta. Apart from this, what other reasons are there for duplicate records to occur? Please help!
    Will assign points.

    Your selection criteria,
    01.2007 to 02.2007 and 02.2007 to 01.2010,
    both include the month 02.2007,
    so all the records for 02.2007 fall under both selections.
    Have you checked that?
    Regards,
    Naveen Natarajan

  • Duplicate records in a collection

    Hi Experts,
    Just now I've seen a thread related to finding duplicate records in a collection. I understand that it is not advisable to sort/filter data in a collection.
    (https://forums.oracle.com/thread/2584168)
    Just out of curiosity I tried to display the duplicate records in a collection. Please note this is just for practice purposes. Below is the rough code I wrote.
    I'm aware of one way: this can be handled effectively by passing the data into a global temporary table and displaying the duplicate/unique records.
    Can you please let me know if there is any other efficient way to do this?
    declare
      type emp_rec is record ( ename varchar2(40), empno number);
      l_emp_rec emp_rec; 
      type emp_tab is table of l_emp_rec%type index by binary_integer;
      l_emp_tab emp_tab;
      l_dup_tab emp_tab;
      l_cnt number;
      n number :=1;
    begin
    -- Assigning values to Associative array
      l_emp_tab(1).ename := 'suri';
      l_emp_tab(1).empno := 1;
      l_emp_tab(2).ename := 'surya';
      l_emp_tab(2).empno := 2;
      l_emp_tab(3).ename := 'suri';
      l_emp_tab(3).empno := 1;
    -- Comparing collection for duplicate records
    for i in l_emp_tab.first..l_emp_tab.last
    loop
        l_cnt :=0;  
    for j in l_emp_tab.first..l_emp_tab.last 
        loop      
           if l_emp_tab(i).empno  =  l_emp_tab(j).empno and l_emp_tab(i).ename  =  l_emp_tab(j).ename then
               l_cnt := l_cnt+1;          
                   if l_cnt >= 2 then
                      l_dup_tab(n) := l_emp_tab(i);
                      n := n + 1;
                   end if;
           end if;                   
        end loop;  
    end loop;
    -- Displaying duplicate records
    for i in l_dup_tab.first..l_dup_tab.last
    loop
       dbms_output.put_line(l_dup_tab(i).ename||'  '||l_dup_tab(i).empno);
    end loop;
    end;
    Cheers,
    Suri

    Dunno if this is either easier or more efficient but it is different.  The biggest disadvantage to this technique is that you have extraneous database objects (a table) to keep track of.  The advantage is that you can use SQL to perform the difference checks easily.
    Create 2 global temporary tables with the structure you need, load them, and use set operators (UNION [ALL], INTERSECT, MINUS) to find the differences.  Or, create 1 GTT with an extra column identifying the set and use the extra column to identify the set records you need.
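    For comparison with the nested loops in Suri's PL/SQL above, counting each record once finds the duplicates in O(n) rather than O(n²). A Python sketch using the same sample data:

```python
from collections import Counter

def find_duplicates(rows):
    """Return the (ename, empno) pairs that occur more than once."""
    counts = Counter(rows)
    return [rec for rec, n in counts.items() if n > 1]

emps = [("suri", 1), ("surya", 2), ("suri", 1)]
print(find_duplicates(emps))  # [('suri', 1)]
```

    The same idea works in PL/SQL with an associative array indexed by a concatenated key, incrementing a count per key in a single pass.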

  • Duplicate Records Found?

    Hi Experts,
    Is there any permanent solution for "Duplicate Records Found"?
    InfoPackage --> Only PSA, update subsequently in data targets, with "Ignore Duplicate Records" checked.
    Can you explain clearly how to resolve this issue permanently?
    Note: points will be assigned.
    With Regards,
    Kiran

    k
