Duplicate record identifier and update

My records look like 
Name  City  DuplicateIndicator
SAM   NYC   0
SAM   NYC1  0
SAM   ORD   0
TAM   NYC   0
TAM   NYC1  0
DAM   NYC   0
For some reason numeric characters were inserted into the city values, which duplicated my records.
I need to:
Check for duplicate records by name (if a name repeats), then check the city; cities that differ only by the appended digits (NYC and NYC1) are considered the same city here. I am OK with doing this for one city at a time.
SAM has a duplicate record across NYC and NYC1; the row SAM NYC1 0 must be updated to SAM NYC1 1.

Good day tatva
Since the city names are not exactly the same, you will need to parse the text to clean the digits out of the name. This is best done with SQLCLR using a regular expression (if this fits your need, I can post the CLR code for you).
In this case you use a simple regular-expression replace function.
On the result of the function you use a simple query with ROW_NUMBER() OVER (PARTITION BY RegularExpressionReplace(ColumnName, '[0-9]') ORDER BY ColumnName).
In that result, every row with ROW_NUMBER greater than 1 is a duplicate.
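For example, on SQL Server 2017 or later the digits can be stripped without CLR by using TRANSLATE instead of a regular expression. A minimal sketch (assuming a table named dbo.Records with the columns shown in the question; note that stripping the spaces would also affect multi-word city names):

;WITH cleaned AS (
    SELECT Name, City, DuplicateIndicator,
           ROW_NUMBER() OVER (
               PARTITION BY Name,
                            -- map each digit to a space (10 spaces), then strip the spaces
                            REPLACE(TRANSLATE(City, '0123456789', '          '), ' ', '')
               ORDER BY City) AS rn
    FROM dbo.Records   -- assumed table name
)
UPDATE cleaned
SET DuplicateIndicator = 1
WHERE rn > 1;   -- every row after the first per (Name, cleaned city) is a duplicate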
I hope this is useful :-)
  Ronen Ariely
 [Personal Site]    [Blog]    [Facebook]

Similar Messages

  • JDBC Sender - Different number of records selected and updated.

    Hi people,
    We have a JDBC -> ABAP proxy scenario. The JDBC sender is polling an Oracle database to retrieve data from a table X every 30 minutes. The select and update statements in the JDBC sender are below:
    SELECT FIELD1, FIELD2, FIELD3 FROM MY_TABLE WHERE STATUS = 1
    UPDATE MY_TABLE SET STATUS = 2 WHERE STATUS = 1
    Sometimes the message sent to the ABAP proxy has, for example, 400 records. Yet in the Runtime Workbench message monitoring, the log for the same message reads:
    Channel SENDER_JDBC_CHANNEL: Query executed successfully. Start update
    Channel SENDER_JDBC_CHANNEL: 510 row(s) updated successfully
    Has anyone experienced something like this? How can I handle this to guarantee that only the records that were read are updated?
    regards.
    roberti

    Hi All,
    We are facing the same problem as well.
    In our scenario, receiver is SAPR3. (IDOC)
    Will this parameter serialization work in our case?
    1. SELECT XBLNR, WERKS, MATNR, MDV01, BACKFLQUANT, STATUS, SAPTIMESTAMP, PITSTIMESTAMP, PMTIMESTAMP, BATCH FROM PMBPITS.PITS_UNITY WHERE STATUS = '01' and rownum < 200
    2. UPDATE PMBPITS.PITS_UNITY SET STATUS = '02' , SAPTIMESTAMP = sysdate WHERE STATUS = '01' and rownum<200  ( currently the value is rownum < 5 )
    Thanks!!
    Regards
    Gouri
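    One common way to guarantee that the SELECT and the UPDATE hit exactly the same rows is to stamp the rows first and then read them back by the stamp. A hedged sketch of the idea (the BATCH_ID column and the :run_id bind are assumptions, not part of the original channel configuration, and it presumes the two statements can be reordered or wrapped in a procedure):

    UPDATE my_table
    SET    status = 2, batch_id = :run_id   -- assumed extra column
    WHERE  status = 1;

    SELECT field1, field2, field3
    FROM   my_table
    WHERE  status = 2
    AND    batch_id = :run_id;

    This closes the window in which new STATUS = 1 rows can arrive between the read and the update.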

  • How to identify and update the DB parameters (no SYS access)

    Hi experts,
    I have to check some parameter values for an OID tuning
    I do not have sys access, I have only schema user access
    How do I see the values of the SGA_TARGET, db_cache_size, etc. parameters?
    I managed to see these variables in the init.ora file under the /dbs folder, but there are also two or three spfiles containing these attributes, and I am not sure which reflects the actual values.
    Also, can I alter these values as the schema user, or do I need SYS access?
    I would appreciate any help with this.
    Thank you
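    If the schema user happens to have SELECT access to the dynamic performance views, the current values can also be read directly, without touching the parameter files at all. A small sketch (whether V$PARAMETER is visible depends entirely on the grants in place):

    SELECT name, value, isdefault
    FROM   v$parameter
    WHERE  name IN ('sga_target', 'db_cache_size');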

    DK2010 wrote:
    Hi,
    Welcome to the forum,
    You need DBA access to see the value and sysdba access to alter the value.
    If you are able to check the files under /dbs, you can identify the spfile by your database name;
    generally it looks like
    spfile<sid>.ora or init<sid>.ora
    You can use the cat command to see the values, like
    cat init<sid>.ora | grep -i sga
    HTH

    'cat' will work, but since the spfile is a binary file, the 'strings' command would be the better choice:
    oracle:hr91dvvb$ ls -l spfile*
    -rw-r----- 1 oracle dba 2560 Apr 22 15:53 spfilehr91dvvb.ora
    oracle:hr91dvvb$ cat spfilehr91dvvb.ora
    C"ݾoè{0CC"!r91dvvb.__db_cache_size=339738624
    hr91dvvb.__java_pool_size=4194304
    hr91dvvb.__large_pool_size=4194304
    hr91dvvb.__oracle_base='/u01/app/oracle'#ORACLE_BASE set from environment
    hr91dvvb.__pga_aggregate_target=339738624
    hr91dvvb.__sga_target=503316480
    hr91dvvb.__shared_io_pool_size=0
    hr91dvvb.__shared_pool_size=146800640
    hr91dvvb.__streams_pool_size=0
    *.audit_file_dest='/u01/app/oracle/admin/hr91dvvb/adump'
    *.audit_trail='db'
    *.compatible='11.2.0.0.0'
    *.control_files='/oradata/hr91dvCC",2vb/control01.ctl','/oradata/hr91dvvb/control02.ctl','/oradata/hr91dvvb/control03.ctl'
    *.db_block_size=8192
    *.db_domain=''
    *.db_name='hr91dvvb'
    *.db_unique_name='hr91dvvb'
    *.diagnostic_dest='/u01/app/oracle'
    *.dispatchers='(PROTOCOL=TCP) (SERVICE=hr91dvvbXDB)'
    *.memory_target=842006528
    *.open_cursors=300
    *.processes=150
    *.remote_login_passwordfile='EXCLUSIVE'
    *.undo_tablespace='UNDOTBS1'
    CC"GeCC"FeCPuTTYPuTTY
    oracle:hr91dvvb$ PuTTYPuTTY
    oracle:hr91dvvb$ strings spfilehr91dvvb.ora
    hr91dvvb.__db_cache_size=339738624
    hr91dvvb.__java_pool_size=4194304
    hr91dvvb.__large_pool_size=4194304
    hr91dvvb.__oracle_base='/u01/app/oracle'#ORACLE_BASE set from environment
    hr91dvvb.__pga_aggregate_target=339738624
    hr91dvvb.__sga_target=503316480
    hr91dvvb.__shared_io_pool_size=0
    hr91dvvb.__shared_pool_size=146800640
    hr91dvvb.__streams_pool_size=0
    *.audit_file_dest='/u01/app/oracle/admin/hr91dvvb/adump'
    *.audit_trail='db'
    *.compatible='11.2.0.0.0'
    *.control_files='/oradata/hr91dv
    vb/control01.ctl','/oradata/hr91dvvb/control02.ctl','/oradata/hr91dvvb/control03.ctl'
    *.db_block_size=8192
    *.db_domain=''
    *.db_name='hr91dvvb'
    *.db_unique_name='hr91dvvb'
    *.diagnostic_dest='/u01/app/oracle'
    *.dispatchers='(PROTOCOL=TCP) (SERVICE=hr91dvvbXDB)'
    *.memory_target=842006528
    *.open_cursors=300
    *.processes=150
    *.remote_login_passwordfile='EXCLUSIVE'
    *.undo_tablespace='UNDOTBS1'
    oracle:hr91dvvb$

  • No 'Handle Duplicate records' in update tab

    Hi,
    I've a DTP from ODS/DSO to ODS/DSO and I got a Duplicate record error, which I find rather strange for standard ODS/DSO. I read in the help and within these forums that it can be fixed with the 'Handle Duplicate records' checkbox in the update tab. Trouble is that there isn't such a checkbox in our BI7 SP15 installation.
    Any suggestion on the reason for that and how to fix this and more important on how to get rid of the duplicate record error (and the reason why it's occurring)?
    Many thanks in advance
    Eddy

    Hi Eddy,
    I am confused :-)
    Have you tried it with the checkbox checked and with it unchecked?
    My suggestion is to try selecting the 'unique data records' setting...
    Cheers
    Siva

  • Error due to duplicate records

    Hello friends,
    I have done a full upload to a particular characteristics InfoObject using direct update (PSA and directly to data target). I used the 'PSA and subsequently into data target' option.
    When I load the data into the object through a process chain, I get an error that duplicate records exist and the request becomes red in the PSA.
    But no duplicate records exist in the data package, and when we try to manually load the records from the PSA to the data target it works fine.
    Can anyone throw some light on this error?
    Regards
    Sre....

    Hello Roberto and Paolo
    There was an OSS note saying we should not use that option, only 'PSA with delete duplicate records and update into data target'.
    I don't know the exact reason.
    Can you throw some light on why that is?
    Thanks for the reply, Paolo and Roberto.
    Regards
    Sri

  • Duplicate Record Search

    I need help with the SQL Statement below.
    SELECT DISTINCT RISK_CODE, LOB, COMPANY, STATE, RISK_CODE_DESCR, RISK_CODE_QUESTION_TEXT, EXPECTED_RESULT
    FROM RISK_CODE
    WHERE RISK_CODE_QUESTION_TEXT IS NOT NULL
    AND ASK_INDICATOR = 'Y'
    AND LOB = 'P1'
    AND ((COMPANY = '' OR COMPANY IS NULL)
    OR (STATE = '' OR STATE IS NULL))
    I want to return the values (risk code questions) when there is a company number and where there isn't, and the same for state. The problem that I run into, is when I do have a company number (ex: 077) and a state (ex. 04), I simply return duplicate values back, and I wanted to know if there was a way to avoid that.
    Not only do I need the values from a specific record (with Company and State), but I also need the records that are not company and state specific. A sample of the data is below.
    RISK_CODE  LOB  COMPANY  STATE  RISK_CODE_DESCR
    74         P1   077      04     DESCRIPTION1
    74         P1                   DESCRIPTION1
    01         P1   077      04     DESCRIPTION2
    01         P1                   DESCRIPTION2
    02         P1                   DESCRIPTION3
    TY         P1                   DESCRIPTION4
    U7         P1   077      04     DESCRIPTION5
    I don't know if this helps or not, but if I don't pass in the state and company I would need risk codes 74, 01, 02, and TY; if I do pass the state and company I would need 74, 01, 02, TY, and U7. I don't want duplicate records back, and don't want to have to create a huge stored procedure for this.

    What duplicate values are you getting? From looking at your data I don't think you're getting duplicate values. What you are getting is the values for risk_code when there is a state populated and when there isn't. This is of course exactly what you've asked for.
    Presumably the RISK_CODE_DESCR values are different, and that's why your DISTINCT isn't filtering them.
    What you need to do is something like this:
    SELECT RISK_CODE, LOB, COMPANY, STATE, RISK_CODE_DESCR, RISK_CODE_QUESTION_TEXT, EXPECTED_RESULT
    FROM RISK_CODE
    WHERE RISK_CODE_QUESTION_TEXT IS NOT NULL
    AND ASK_INDICATOR = 'Y'
    AND LOB = 'P1'
    AND ( COMPANY = '&&IN_CO' OR
          STATE = '&&IN_STATE')
    UNION
    SELECT RISK_CODE, LOB, COMPANY, STATE, RISK_CODE_DESCR, RISK_CODE_QUESTION_TEXT, EXPECTED_RESULT
    FROM RISK_CODE
    WHERE RISK_CODE_QUESTION_TEXT IS NOT NULL
    AND ASK_INDICATOR = 'Y'
    AND LOB = 'P1'
    AND COMPANY IS NULL
    AND STATE IS NULL
    AND NOT EXISTS (SELECT  RISK_CODE
                    FROM RISK_CODE
                    WHERE RISK_CODE_QUESTION_TEXT IS NOT NULL
                    AND ASK_INDICATOR = 'Y'
                    AND LOB = 'P1'
                    AND ( COMPANY = '&&IN_CO' OR
                          STATE = '&&IN_STATE'));
    Mind you, I'm not saying it's pretty or fast.
    Cheers, APC

  • Count of inserted and updated rowcount in @@rowcount

    I have created an sp which inserts and updates records.
    Now I want to track the counts of newly inserted and updated records, the way @@ROWCOUNT reports a count.
    Please suggest the code.
    below is my sp
    alter Procedure SP_Archive_using_merge
    AS
    --exec SP_Archive
    BEGIN
    SET NOCOUNT ON
    Declare @Source_RowCount int
    Declare @New_RowCount int
    DECLARE @TimeIn SMALLDATETIME
    DECLARE @LatestVersion INT
    SET NOCOUNT ON
    ---BBxKey and Hash value of all the source columns are derived in source query itself--
    select @TimeIn=getdate(),@LatestVersion=1
    MERGE Archive.dbo.ArchiveBBxCemxr AS stm
    USING (
    SELECT a.*,cast(SUBSTRING(a.Col001,1,10) as varchar(100)) BBxKey,
    HashBytes('MD5', CAST(CHECKSUM(a.Col001,a.Col002,a.Col003,a.Col004,a.Col005,a.Col006,a.Col007) AS varbinary(max))) RowChecksum,
    b.BBxKey as Archive_BBxKey, b.RowChecksum as Archive_RowChecksum
    FROM dbo.ImportBBxCemxr a LEFT OUTER JOIN Archive.dbo.ArchiveBBxCemxr b
    ON a.Col001 = b.BBxKey
    Where (b.LatestVersion = 1 OR b.LatestVersion IS NULL) AND a.Col001 IS NOT NULL
    ) AS sd 
    ON sd.Archive_BBxKey = stm.BBxKey and sd.RowChecksum = stm.RowChecksum
    WHEN MATCHED AND (stm.BBxKey = sd.Archive_BBxKey and stm.RowChecksum != sd.Archive_RowChecksum) THEN
    UPDATE SET 
    stm.TimeIn = @TimeIn,
    BBXKey=sd.BBXKey,
    RowChecksum=sd.RowChecksum,
    stm.Col001=sd.Col001,
    stm.Col002=sd.Col002,
    stm.Col003=sd.Col003,
    stm.Col004=sd.Col004,
    stm.Col005=sd.Col005,
    stm.Col006=sd.Col006,
    stm.Col007=sd.Col007,
    stm.LatestVersion=@LatestVersion
    WHEN NOT MATCHED and (sd.Archive_BBxKey is null) THEN
    Insert (TimeIn,BBXKey,RowChecksum,Col001,Col002,Col003,Col004,Col005,Col006,Col007,LatestVersion)
    values(getdate(),sd.BBXKey,sd.RowChecksum,sd.Col001,sd.Col002,sd.Col003,sd.Col004,sd.Col005,sd.Col006,sd.Col007,@LatestVersion);
    end 
    Thankx &amp; regards, Vipin jha MCP

    You need to use the OUTPUT clause with the $action column to get the info into a table variable, and then count from the table variable.
    Try the below (not tested):
    alter Procedure SP_Archive_using_merge
    AS
    --exec SP_Archive
    BEGIN
    SET NOCOUNT ON
    Declare @Source_RowCount int
    Declare @New_RowCount int
    DECLARE @TimeIn SMALLDATETIME
    DECLARE @LatestVersion INT
    SET NOCOUNT ON
    ---BBxKey and Hash value of all the source columns are derived in source query itself--
    select @TimeIn=getdate(),@LatestVersion=1
    DECLARE @tableVariable TABLE (sAction VARCHAR(20), InsertedID INT, DeletedID INT)
    MERGE Archive.dbo.ArchiveBBxCemxr AS stm
    USING (
    SELECT a.*,cast(SUBSTRING(a.Col001,1,10) as varchar(100)) BBxKey,
    HashBytes('MD5', CAST(CHECKSUM(a.Col001,a.Col002,a.Col003,a.Col004,a.Col005,a.Col006,a.Col007) AS varbinary(max))) RowChecksum,
    b.BBxKey as Archive_BBxKey, b.RowChecksum as Archive_RowChecksum
    FROM dbo.ImportBBxCemxr a LEFT OUTER JOIN Archive.dbo.ArchiveBBxCemxr b
    ON a.Col001 = b.BBxKey
    Where (b.LatestVersion = 1 OR b.LatestVersion IS NULL) AND a.Col001 IS NOT NULL
    ) AS sd
    ON sd.Archive_BBxKey = stm.BBxKey and sd.RowChecksum = stm.RowChecksum
    WHEN MATCHED AND (stm.BBxKey = sd.Archive_BBxKey and stm.RowChecksum != sd.Archive_RowChecksum) THEN
    UPDATE SET
    stm.TimeIn = @TimeIn,
    BBXKey=sd.BBXKey,
    RowChecksum=sd.RowChecksum,
    stm.Col001=sd.Col001,
    stm.Col002=sd.Col002,
    stm.Col003=sd.Col003,
    stm.Col004=sd.Col004,
    stm.Col005=sd.Col005,
    stm.Col006=sd.Col006,
    stm.Col007=sd.Col007,
    stm.LatestVersion=@LatestVersion
    WHEN NOT MATCHED and (sd.Archive_BBxKey is null) THEN
    Insert (TimeIn,BBXKey,RowChecksum,Col001,Col002,Col003,Col004,Col005,Col006,Col007,LatestVersion)
    values(getdate(),sd.BBXKey,sd.RowChecksum,sd.Col001,sd.Col002,sd.Col003,sd.Col004,sd.Col005,sd.Col006,sd.Col007,@LatestVersion)
    OUTPUT $action as action, inserted.BBXKey as ins, deleted.BBXKey as del into @tableVariable;
    --To get the action count info
    SELECT sAction, COUNT(*) FROM @tableVariable GROUP BY sAction
    end
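    To land those counts in the variables the procedure already declares, a short follow-up could go just before the final end (a sketch, not part of the original reply; @Source_RowCount is reused here for the update count):

    SELECT @New_RowCount    = COUNT(*) FROM @tableVariable WHERE sAction = 'INSERT';
    SELECT @Source_RowCount = COUNT(*) FROM @tableVariable WHERE sAction = 'UPDATE';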

  • How to select and duplicate the records and update some column values using cursor

    I have a table with 920 records. We need to update the end date to 6/30/2014 for the 920 existing records, and I need to create 920 new records with a start date of 7/1/2014 and the external value updated to CCC.
    Note: the table's primary key is not auto-increment, but I have an sp to get the latest key.
    Existing table:
    ID   Source Name   Internal value   External value   Start date   End date
    1    XXX           AAA              BBB              1/1/2013     6/30/2015
    Create new records:
    ID   Source Name   Internal value   External value   Start date   End date
    921  XXX           AAA              CCC              7/1/2013     12/30/2015

    Hi ManuGT
    If I understand what you need, then you are asking to:
    1. update all current rows (920 rows in the table now)
    2. insert new rows which are duplicates of the previous rows, but with the value 'CCC' instead of 'BBB'
    If so, there is no reason to use a cursor, and it is highly NOT RECOMMENDED to use any type of loop.
    You should work with sets and do it all in 2 simple queries:
    -- First we duplicate the existing rows,
    -- but we use the value 'CCC' and the new start date '20140701' for the new rows
    INSERT test (SourceName, InternalValue, ExternalValue, StartDate, EndDate)
    select SourceName, InternalValue, 'CCC', '20140701', EndDate
    from test
    where
    -- You can use any filter that you need if you don't want to touch all rows
    InternalValue = 'AAA' and ExternalValue = 'BBB' and StartDate = '20140101' and EndDate = '20140630'
    -- Now we update the old rows (check the filter! It catches only the old rows, since the new rows have StartDate '20140701')
    UPDATE test
    SET EndDate = '20140630' -- I use dates in yyyymmdd format; you can use other formats as well
    where
    -- You can use any filter that you need if you don't want to update all rows
    InternalValue = 'AAA' and ExternalValue = 'BBB' and StartDate = '20140101'
    Unfortunately you did not post DDL+DML, so we can't see your table structure or sample data and can only guess. I used Saeid's post as the basis for the DDL+DML.
    Please post DDL+DML next time.
    Here is the full code with the DDL+DML that I used:
    -- This is our DDL - A create table query:
    create table test
    ( id int identity(1,1) primary key,
    SourceName nvarchar(3),
    InternalValue nvarchar(3),
    ExternalValue nvarchar(3),
    StartDate date,
    EndDate date
    )
    go
    -- This is our DML - A query that insert some sample data
    declare @i int = 1 ;
    while @i < 921
    begin
    insert test (SourceName, InternalValue, ExternalValue, StartDate, EndDate)
    values ('XXX', 'AAA', 'BBB', '1/1/2014', '6/30/2014' ) ;
    set @i += 1 ;
    end ;
    GO
    -- Here is the solution for the problem as I understood your needs:
    -- First we duplicate the existing rows,
    -- but we use the value 'CCC' and the new start date '20140701' for the new rows
    INSERT test (SourceName, InternalValue, ExternalValue, StartDate, EndDate)
    select SourceName, InternalValue, 'CCC', '20140701', EndDate
    from test
    where
    -- You can use any filter that you need if you don't want to touch all rows
    InternalValue = 'AAA' and ExternalValue = 'BBB' and StartDate = '20140101' and EndDate = '20140630'
    -- Now we update the old rows (check the filter! It catches only the old rows, since the new rows have StartDate '20140701')
    UPDATE test
    SET EndDate = '20140630' -- I use dates in yyyymmdd format; you can use other formats as well
    where
    -- You can use any filter that you need if you don't want to update all rows
    InternalValue = 'AAA' and ExternalValue = 'BBB' and StartDate = '20140101'
    -- Here we just check what the result looks like :-)
    select *
    from test ;
    -- And since we do not really need this table in our server... here we clean up the DDL (you probably DO NOT WANT TO EXECUTE THIS!)
    DROP table test
    GO
    I hope this was useful :-)
    [Personal Site] [Blog] [Facebook]

  • Importing and Updating Non-Duplicate Records from 2 Tables

    I need some help with the code to import data from one table
    into another if it is not a duplicate or if a record has changed.
    I have 2 tables, Members and NetNews. I want to check NetNews
    and import non-duplicate records from Members into NetNews and
    update an email address in NetNews if it has changed in Members. I
    figured it could be as simple as checking Members.MembersNumber and
    Members.Email against the existence of NetNews.Email and
    Members.MemberNumber and if a record in NetNews does not exist,
    create it and if the email address in Members.email has changed,
    update it in NetNews.Email.
    Here is what I have from all of the suggestions received from
    another category last year. It is not complete, but I am stuck on
    the solution. Can someone please help me get this code working?
    Thanks!
    <cfquery datasource="#application.dsrepl#"
    name="qryMember">
    SELECT distinct Email,FirstName,LastName,MemberNumber
    FROM members
    WHERE memberstanding <= 2 AND email IS NOT NULL AND email
    <> ' '
    </cfquery>
    <cfquery datasource="#application.ds#"
    name="newsMember">
    SELECT distinct MemberNumber
    FROM NetNews
    </cfquery>
    <cfif
    not(listfindnocase(valuelist(newsMember.MemberNumber),qryMember.MemberNumber)
    AND isnumeric(qryMember.MemberNumber))>
    insert into NetNews (Email_address, First_Name, Last_Name,
    MemberNumber)
    values ('#trim(qryMember.Email)#',
    '#trim(qryMember.FirstName)#', '#trim(qryMember.LastName)#', '#
    trim(qryMember.MemberNumber)#')-
    </cfif>
    </cfloop>
    </cfquery>
    ------------------

    Dan,
    My DBA doesn't have the experience to help with a VIEW. Did I
    mention that these are 2 separate databases on different servers?
    This project is over a year old now and it really needs to get
    finished so I thought the import would be the easiest way to go.
    Thanks to your help, it is almost working.
    I added some additional code to check for a changed email
    address and update the NetNews database. It runs without error, but
    I don't have a way to test it right now. Can you please look at the
    code and see if it looks OK?
    I am also still getting an error on line 10 after the routine
    runs. The line that has this code: "and membernumber not in
    (<cfqueryparam list="yes"
    value="#valuelist(newsmember.membernumber)#
    cfsqltype="cf_sql_integer">)" even with the cfif that Phil
    suggested.
    <cfquery datasource="#application.ds#"
    name="newsMember">
    SELECT distinct MemberNumber, Email_Address
    FROM NetNewsTest
    </cfquery>
    <cfquery datasource="#application.dsrepl#"
    name="qryMember">
    SELECT distinct Email,FirstName,LastName,MemberNumber
    FROM members
    WHERE memberstanding <= 2 AND email IS NOT NULL AND email
    <> ' '
    and membernumber not in (<cfqueryparam list="yes"
    value="#valuelist(newsmember.membernumber)#"
    cfsqltype="cf_sql_integer">)
    </cfquery>
    <CFIF qryMember.recordcount NEQ 0>
    <cfloop query ="qryMember">
    <cfquery datasource="#application.ds#"
    name="newsMember">
    insert into NetNewsTest (Email_address, First_Name,
    Last_Name, MemberNumber)
    values ('#trim(qryMember.Email)#',
    '#trim(qryMember.FirstName)#', '#trim(qryMember.LastName)#', '#
    trim(qryMember.MemberNumber)#')
    </cfquery>
    </cfloop>
    </cfif>
    <cfquery datasource="#application.dsrepl#"
    name="qryEmail">
    SELECT distinct Email
    FROM members
    WHERE memberstanding <= 2 AND email IS NOT NULL AND email
    <> ' '
    and qryMember.email NEQ newsMember.email
    </cfquery>
    <CFIF qryEmail.recordcount NEQ 0>
    <cfloop query ="qryEmail">
    <cfquery datasource="#application.ds#"
    name="newsMember">
    update NetNewsTest (Email_address)
    values ('#trim(qryMember.Email)#')
    where email_address = #qryEmail.email#
    </cfquery>
    </cfloop>
    </cfif>
    Thank you again for the help.
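    For reference, the last query above mixes INSERT-style syntax into an UPDATE. A standard SQL UPDATE would look like the following sketch (it assumes MemberNumber is also selected in qryEmail, so the row can be matched on a key instead of on the old address):

    update NetNewsTest
    set Email_address = '#trim(qryEmail.Email)#'
    where MemberNumber = #qryEmail.MemberNumber#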

  • Identifying duplicate records in a table

    I am trying to identify duplicate records in a table - well they are broadly duplicated but some of the fields are changed on each insert whilst others are always the same.
    I can't work out the logic and it is driving me #$%$#^@ crazy !

    Here are a couple of other examples:
    Method 1: -- Makes use of the uniqueness of Oracle ROWIDs to identify duplicates.
    =========
    To check for single column duplicates:
    select rowid, deptno
    from dept outer
    where
    outer.rowid >
    (select min(rowid) from dept inner
    where inner.deptno=outer.deptno)
    order by deptno;
    To check for multi-column (key) duplicates:
    select rowid, deptno, dname
    from dept outer
    where
    outer.rowid >
    (select min(rowid) from dept inner
    where inner.deptno||inner.dname = outer.deptno||outer.dname)
    order by deptno;
    Method 2: -- Makes use of resultset groups to identify uniqueness
    =========
    To check for single column duplicates:
    select rowid, deptno
    from dept
    where
    deptno in
    (select deptno from dept group by deptno having count(*) > 1)
    order by deptno;
    To check for multi-column (key) duplicates:
    select rowid, deptno, dname
    from dept
    where
    deptno||dname in
    (select deptno||dname from dept group by deptno||dname having count(*) > 1)
    order by deptno;
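    Method 3: -- A sketch using an analytic function (assumes an Oracle version with analytic-function support); the same ROW_NUMBER idea as the first answer on this page.
    =========
    select rid, deptno, dname
    from (select rowid rid, deptno, dname,
                 row_number() over (partition by deptno, dname order by rowid) rn
          from dept)
    where rn > 1
    order by deptno;
    Every row with rn > 1 is a duplicate of the first row for its (deptno, dname) key.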

  • Identify Duplicate Records

    Post Author: chrise
    CA Forum: Crystal Reports
    I know that Crystal has the ability to identify all distinct records, thereby getting rid of all duplicate records. However, I need to do the exact opposite, and so far I have been unsuccessful. Can anyone help me create a report that identifies only the duplicate records in Crystal?
    Thanks.

    Post Author: SKodidine
    CA Forum: Crystal Reports
    Check out this KBase article.
    Retrieving duplicate records

  • Duplicate records update

    Hi
    We have a primary key in the source table, and key columns without a constraint in the target table. We do a transformation converting the schema name on the source side while capturing.
    We set allow_duplicate_rows=Y on the apply side. Whenever a duplicate-row insert happens, it works fine; when it comes to an update, it does not work and throws an error:
    ORA-01422: exact fetch returns more than requested number of rows
    ORA-01403: no data found
    Could you please shed some light on how to fix this issue?
    As an alternate solution, removing the duplicate and executing the Streams error transaction works fine, but then the allow_duplicate_rows functionality is not fulfilled.
    Thanks
    Bala

    Source 10.2.0.4.3
    Target 10.2.0.3
    We are creating the reporting instance, where duplicates are allowed. Inserts of duplicate records succeed; when it comes to an update, it fails with the following error.
    ----Error in Message: 1
    ----Error Number: 1422
    ----Message Text: ORA-01422: exact fetch returns more than requested number of rows
    ORA-01403: no data found
    --message: 1
    type name: SYS.LCR$_ROW_RECORD
    source database: PROD2
    owner: REPORT_TWO
    object: DEVICE
    is tag null: Y
    command_type: UPDATE
    Supplemental logging is enabled for both source and target database.
    Thanks
    Bala

  • Need help to insert and update records in MDM

    Hi ,
    I am trying to develop a Web Dynpro application which can create and update records in the tables of an MDM repository. For example, I want to insert values and later update values in the Vendor table.
    I am new to Web Dynpro and MDM. If anyone can help step by step, or can send sample code that I can use, that would be a great help.
    If anyone has sample code, kindly mail it to "[email protected]"
    It is urgent. Please help.
    Regards,
    Niraj
    Edited by: Niraj Kumar on May 23, 2008 6:50 AM

    Hi Niraj,
    Are you going to work with Web Dynpro Java or ABAP?
    I have sent some materials which I found useful.
    Cheers,
    Mary

  • How to call function behind the button and update only specific record

    Greetings,
    1 - I want to ask a few things, as I am new to APEX (I am using APEX 4.1). I created three select lists and a button for selecting parameters:
    select list 1: select area
    select list 2: select product
    select list 3: size of the product
    I want to generate IDs for these. For that I created a query for INSERTING RECORDS FROM ONE TABLE TO ANOTHER, generating the IDs when the "Generate" button is pressed after selecting the parameters. Where do I call that query on the button? When I create a button it only gives me the options submit, defined dynamic action, etc. Please guide me on where to call the function id_generation when the button is pressed.
    2 - Second, I created a tabular form: "select user_id, product_name, product_type from product".
    By default a checkbox column and the delete/submit buttons are created. Inserting records saves fine; e.g. I entered 50 records. Afterwards I want to update only one record: there is a record with product name = box, and if I change it to "box small" and click submit, it saves the whole page, meaning all 50 records.
    I want to submit only the record that I changed, using the logic that only the records checked by the user should be updated. How do I do this, and where do I put the process? Please guide me.
    Edited by: Omzz on Oct 2, 2012 11:28 PM

    If I understand correctly what you are trying to do, you could possibly do it by creating an AFTER INSERT trigger on the table the tabular form is based on, which copies each record into a separate table after it is inserted, something like:
    CREATE OR REPLACE TRIGGER copy_records
    AFTER INSERT ON table_a
    REFERENCING NEW AS NEW OLD AS OLD
    FOR EACH ROW
    BEGIN
    INSERT INTO table_b (col1, col2)  -- list further columns as needed
    VALUES (:NEW.col1, :NEW.col2);
    END;
    There is also a way that you could do it within the form, by looping over the tabular form arrays in APEX_APPLICATION.G_Fxx, as sketched below.
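    For illustration, a hedged sketch of that looping approach: it assumes the checkbox column is the first tabular-form array (G_F01) and holds the row index, and that the key and the edited column land in G_F02 and G_F03; the actual array indexes depend on the form definition.
    BEGIN
      -- Only checked rows appear in G_F01, so only they are updated.
      FOR i IN 1 .. APEX_APPLICATION.G_F01.COUNT LOOP
        UPDATE product
        SET    product_name = APEX_APPLICATION.G_F03(APEX_APPLICATION.G_F01(i))
        WHERE  user_id      = APEX_APPLICATION.G_F02(APEX_APPLICATION.G_F01(i));
      END LOOP;
    END;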
    Chris

  • How can I identify and delete duplicate photos

    Is there a way to identify and delete duplicate photos in iPhoto?

    No one is suggesting that you download third party "crap". What was suggested is that you download a well established and supported 3rd party application. As to why there isn't a "'delete your stupid duplicate photo' option", you'll need to ask Apple that. Apple aren't here. This forum is for Users to help other Users. As to why you import stupid photos, you'll need to ask yourself.
    Regards
    TD
