Update duplicate records

Hi All,
Here is the table data:
Col1 Col2 R_Status
100 A
100 A
100 A
100 A
101 A
102 A
103 B
103 B
103 B
I need the output to look like this:
Col1 Col2 R_Status
100 A error
100 A error
100 A error
100 A error
101 A error
102 A error
103 B load
103 B load
103 B load
R_Status should be updated based on col1 and col2. In the example above, col2 value A is paired with three different col1 values while B is paired with only one, so the status should be 'error' for all the records with value A and 'load' for the records with value B.
Please provide the SQL query for this.
Appreciate your help.

ME_XE?with data as
(
   select 100 as col1, 'A' as col2 from dual union all
   select 100 as col1, 'A' as col2 from dual union all
   select 100 as col1, 'A' as col2 from dual union all
   select 100 as col1, 'A' as col2 from dual union all
   select 101 as col1, 'A' as col2 from dual union all
   select 102 as col1, 'A' as col2 from dual union all
   select 103 as col1, 'B' as col2 from dual union all
   select 103 as col1, 'B' as col2 from dual union all
   select 103 as col1, 'B' as col2 from dual
)
select
   col1,
   col2,
   case when count(distinct col1) over (partition by col2) > 1
   then
      'error'
   else
      'load'
   end as col3
from data
order by col1, col2;
              COL1 COL COL3
               100 A   error
               100 A   error
               100 A   error
               100 A   error
               101 A   error
               102 A   error
               103 B   load
               103 B   load
               103 B   load
9 rows selected.
Elapsed: 00:00:01.03
ME_XE?select * from v$version;
BANNER
Oracle Database 10g Express Edition Release 10.2.0.1.0 - Product
PL/SQL Release 10.2.0.1.0 - Production
CORE    10.2.0.1.0      Production
TNS for Linux: Version 10.2.0.1.0 - Production
NLSRTL Version 10.2.0.1.0 - Production
5 rows selected.
Elapsed: 00:00:01.09
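
The query above only selects the computed status; since the original request was to update R_Status in place, here is a minimal MERGE sketch of the same rule. It assumes the real table is named T with columns col1, col2 and r_status, which the post never confirms:

merge into t tgt
using
(
   -- compute the status once per row; rowid identifies each row for the join back
   select rowid as rid,
          case when count(distinct col1) over (partition by col2) > 1
               then 'error'
               else 'load'
          end as new_status
   from t
) src
on (tgt.rowid = src.rid)
when matched then update set tgt.r_status = src.new_status;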

Similar Messages

  • Update duplicate records in pl/sql with each different name

This is my table newemp:
      EMPNO ENAME      JOB
       7000 ELLISON    OWNER
       7369 SMITH      CLERK
       7499 ALLEN      SALESMAN
       7521 WARD       SALESMAN
       7566 JONES      MANAGER
       7654 MARTIN     SALESMAN
       7698 BLAKE      MANAGER
       7782 CLARK      MANAGER
       7788 SCOTT      ANALYST
       7839 KING       PRESIDENT
       7844 TURNER     SALESMAN
       7876 ADAMS      CLERK
       7900 JAMES      CLERK
       7902 FORD       ANALYST
       7934 MILLER     CLERK
    I want to update all those repeating jobs, e.g. if there are four SALESMAN records they should become SALE4, SALE5, SALE6, SALE7, and two ANALYST records become ANA2, ANA3; if a job occurs only once, it stays unchanged.
    I did this using a separate stored procedure for each distinct job, but I want to do it in a single procedure.
    The stored procedure that I have developed is like this:
    create or replace procedure sp_duptest
    is
      x number(20);
      y number(3);
      cursor z is select empno from newemp where job = 'CLERK';
    begin
      -- probe select: raises TOO_MANY_ROWS when the job repeats
      select empno
        into x
        from newemp
       where job = 'CLERK';
    exception
      when no_data_found then
        null;
      when too_many_rows then
        begin
          select count(job) into y from newemp where job = 'CLERK';
          y := y - 1;
          for i in z
          loop
            y := y + 1;
            if y > 1 then
              update newemp
                 set job = 'CLE' || y
               where empno = i.empno;
            else
              update newemp
                 set job = 'clerk'
               where empno = i.empno; -- restricted to the current row; the original updated the whole table
            end if; -- this END IF was missing, so the original did not compile
          end loop;
          commit;
        end;
    end;

    Something like this perhaps?
    SQL> create table emp2 as select * from emp;
    Table created.
    SQL> ed
    Wrote file afiedt.buf
    update emp2 set job = (select job
                           from (select empno
                                       ,substr(job,1,5)||to_char(row_number() over (partition by job order by empno),'fm09') as job
                                 from emp2) t
                           where t.empno = emp2.empno)
    SQL> /
    14 rows updated.
    SQL> select * from emp2;
         EMPNO ENAME      JOB              MGR HIREDATE                    SAL       COMM     DEPTNO
          7369 SMITH      CLERK01         7902 17-DEC-1980 00:00:00        800                    20
          7499 ALLEN      SALES01         7698 20-FEB-1981 00:00:00       1600        300         30
          7521 WARD       SALES02         7698 22-FEB-1981 00:00:00       1250        500         30
          7566 JONES      MANAG01         7839 02-APR-1981 00:00:00       2975                    20
          7654 MARTIN     SALES03         7698 28-SEP-1981 00:00:00       1250       1400         30
          7698 BLAKE      MANAG02         7839 01-MAY-1981 00:00:00       2850                    30
          7782 CLARK      MANAG03         7839 09-JUN-1981 00:00:00       2450                    10
          7788 SCOTT      ANALY01         7566 19-APR-1987 00:00:00       3000                    20
          7839 KING       PRESI01              17-NOV-1981 00:00:00       5000                    10
          7844 TURNER     SALES04         7698 08-SEP-1981 00:00:00       1500          0         30
          7876 ADAMS      CLERK02         7788 23-MAY-1987 00:00:00       1100                    20
          7900 JAMES      CLERK03         7698 03-DEC-1981 00:00:00        950                    30
          7902 FORD       ANALY02         7566 03-DEC-1981 00:00:00       3000                    20
          7934 MILLER     CLERK04         7782 23-JAN-1982 00:00:00       1300                    10
    14 rows selected.
    SQL>
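
    One caveat with the statement above: it renumbers every job, including those that occur only once (KING's PRESIDENT becomes PRESI01), whereas the original post wanted single-occurrence jobs left unchanged. A minimal variant sketch (same tables, not run here) that uses COUNT(*) OVER to skip singletons:

    update emp2 set job = (select new_job
                           from (select empno
                                       ,case when count(*) over (partition by job) > 1
                                             then substr(job,1,5)||to_char(row_number() over (partition by job order by empno),'fm09')
                                             else job
                                        end as new_job
                                 from emp2) t
                           where t.empno = emp2.empno);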

  • Duplicate record error

    Hi,
    I am using an ODS as the source to update a master data InfoObject with flexible update. The issue is that, in spite of using the option Only PSA (Update Subsequently in Data Targets) in the InfoPackage with error handling enabled, I am getting a duplicate record error. This happens only when I am updating through a process chain; if I run it manually, the error does not come. Please let me know the reason.
    Thanks

    Hi Maneesh,
    As we are loading from an ODS to an InfoObject, we do not get the option "do not update duplicate records if they exist".
    First, check whether any duplicate records exist in the PSA; if so, delete them from the PSA.
    Another option is to enable error handling in the InfoPackage.
    Alternatively, check for inconsistencies on the InfoObject in RSRV and, if any are found, repair them and load again. Check the inconsistencies for the P, X and Y tables and for the complete object as well.
    KS

  • Importing and Updating Non-Duplicate Records from 2 Tables

    I need some help with the code to import data from one table into another if it is not a duplicate, or to update it if a record has changed. I have two tables, Members and NetNews. I want to check NetNews, import non-duplicate records from Members into NetNews, and update the email address in NetNews if it has changed in Members. I figured it could be as simple as checking Members.MemberNumber and Members.Email against the existence of NetNews.MemberNumber and NetNews.Email: if a record does not exist in NetNews, create it, and if the email address in Members.Email has changed, update it in NetNews.Email.
    Here is what I have from all of the suggestions received in another category last year. It is not complete, but I am stuck on the solution. Can someone please help me get this code working?
    Thanks!
    <cfquery datasource="#application.dsrepl#" name="qryMember">
    SELECT DISTINCT Email, FirstName, LastName, MemberNumber
    FROM members
    WHERE memberstanding <= 2 AND email IS NOT NULL AND email <> ' '
    </cfquery>
    <cfquery datasource="#application.ds#" name="newsMember">
    SELECT DISTINCT MemberNumber
    FROM NetNews
    </cfquery>
    <cfif not(listfindnocase(valuelist(newsMember.MemberNumber), qryMember.MemberNumber)
              AND isnumeric(qryMember.MemberNumber))>
    insert into NetNews (Email_address, First_Name, Last_Name, MemberNumber)
    values ('#trim(qryMember.Email)#', '#trim(qryMember.FirstName)#',
            '#trim(qryMember.LastName)#', '#trim(qryMember.MemberNumber)#')
    </cfif>
    </cfloop>
    </cfquery>
    ------------------

    Dan,
    My DBA doesn't have the experience to help with a VIEW. Did I mention that these are two separate databases on different servers? This project is over a year old now and it really needs to get finished, so I thought the import would be the easiest way to go. Thanks to your help, it is almost working.
    I added some additional code to check for a changed email address and update the NetNews database. It runs without error, but I don't have a way to test it right now. Can you please look at the code and see if it looks OK?
    I am also still getting an error on line 10 after the routine runs, on the line that has this code: "and membernumber not in (<cfqueryparam list="yes" value="#valuelist(newsmember.membernumber)#" cfsqltype="cf_sql_integer">)", even with the cfif that Phil suggested.
    <cfquery datasource="#application.ds#" name="newsMember">
    SELECT DISTINCT MemberNumber, Email_Address
    FROM NetNewsTest
    </cfquery>
    <cfquery datasource="#application.dsrepl#" name="qryMember">
    SELECT DISTINCT Email, FirstName, LastName, MemberNumber
    FROM members
    WHERE memberstanding <= 2 AND email IS NOT NULL AND email <> ' '
    AND membernumber NOT IN (<cfqueryparam list="yes"
        value="#valuelist(newsmember.membernumber)#"
        cfsqltype="cf_sql_integer">)
    </cfquery>
    <cfif qryMember.recordcount NEQ 0>
    <cfloop query="qryMember">
    <cfquery datasource="#application.ds#" name="insMember">
    INSERT INTO NetNewsTest (Email_address, First_Name, Last_Name, MemberNumber)
    VALUES ('#trim(qryMember.Email)#', '#trim(qryMember.FirstName)#',
            '#trim(qryMember.LastName)#', '#trim(qryMember.MemberNumber)#')
    </cfquery>
    </cfloop>
    </cfif>
    <cfquery datasource="#application.dsrepl#" name="qryEmail">
    SELECT DISTINCT Email, MemberNumber
    FROM members
    WHERE memberstanding <= 2 AND email IS NOT NULL AND email <> ' '
    </cfquery>
    <cfif qryEmail.recordcount NEQ 0>
    <cfloop query="qryEmail">
    <cfquery datasource="#application.ds#" name="updEmail">
    UPDATE NetNewsTest
    SET Email_address = '#trim(qryEmail.Email)#'
    WHERE MemberNumber = '#trim(qryEmail.MemberNumber)#'
    AND Email_address <> '#trim(qryEmail.Email)#'
    </cfquery>
    </cfloop>
    </cfif>
    Thank you again for the help.
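
    For reference, if both tables were reachable from a single datasource (they are not here, since the databases live on different servers), the whole job collapses to two set-based statements. This is only a sketch using the table and column names mentioned in the thread, not tested against the real schema:

    INSERT INTO NetNews (Email_address, First_Name, Last_Name, MemberNumber)
    SELECT m.Email, m.FirstName, m.LastName, m.MemberNumber
    FROM Members m
    WHERE m.memberstanding <= 2
      AND m.Email IS NOT NULL AND m.Email <> ' '
      AND NOT EXISTS (SELECT 1 FROM NetNews n
                      WHERE n.MemberNumber = m.MemberNumber);

    UPDATE NetNews
    SET Email_address = (SELECT m.Email FROM Members m
                         WHERE m.MemberNumber = NetNews.MemberNumber)
    WHERE EXISTS (SELECT 1 FROM Members m
                  WHERE m.MemberNumber = NetNews.MemberNumber
                    AND m.Email <> NetNews.Email_address);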

  • Duplicate record identifier and update

    My records look like 
    Name City Duplicateindicator 
    SAM   NYC   0
    SAM   NYC1 0
    SAM    ORD  0
    TAM   NYC  0
    TAM   NYC1  0 
    DAM   NYC  0  
    For some reason numeric characters were inserted into City, which duplicated my records.
    I need to check for duplicate records by Name (if a name repeats) and then compare City values; cities differing only by a numeric suffix (NYC and NYC1) are considered the same city here. I am OK with doing this one city at a time.
    SAM has a duplicate record across NYC and NYC1, so the record SAM NYC1 0 must be updated to SAM NYC1 1.

    Good day tatva
    Since the city names are not exactly the same, you will need to parse the text somehow in order to clean the numbers out of the name; this is best done with SQLCLR using a regular expression (if this fits your need, I can post the CLR code for you).
    In this case you use a simple regular-expression replace function, and on its result you run a simple query with ROW_NUMBER() OVER (PARTITION BY RegularExpressionReplace(ColumnName, '[0-9]') ORDER BY ColumnName). Every row whose ROW_NUMBER is greater than 1 is a duplicate; see the sketch below.
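    A minimal T-SQL sketch of that idea which avoids SQLCLR entirely, assuming SQL Server 2017+ for TRANSLATE; the table name MyRecords is invented for illustration:

    WITH cleaned AS (
        SELECT Name, City, Duplicateindicator,
               ROW_NUMBER() OVER (
                   PARTITION BY Name,
                                -- strip digits so NYC and NYC1 compare equal
                                REPLACE(TRANSLATE(City, '0123456789', '##########'), '#', '')
                   ORDER BY City
               ) AS rn
        FROM MyRecords
    )
    UPDATE cleaned
    SET Duplicateindicator = 1
    WHERE rn > 1;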
    I hope this is useful :-)
      Ronen Ariely

  • Duplicate records update

    Hi
    We have a primary key on the source table and key columns without a constraint on the target table. We do a transformation converting the schema name from the source while capturing.
    We set allow_duplicate_rows=Y on the apply side. Whenever duplicate row inserts happen, it works fine; when it comes to an update, it does not work and throws an error:
    ORA-01422: exact fetch returns more than requested number of rows
    ORA-01403: no data found
    Could you please shed some light on how to fix this issue?
    An alternate solution, removing the duplicate and executing the Streams error transaction, works fine, but then the allow_duplicate_rows functionality is not fulfilled.
    Thanks
    Bala

    Source 10.2.0.4.3
    Target 10.2.0.3
    We are creating a reporting instance where duplicates are allowed. Duplicate records are successfully inserted; when it comes to an update, it fails with the following error:
    ----Error in Message: 1
    ----Error Number: 1422
    ----Message Text: ORA-01422: exact fetch returns more than requested number of rows
    ORA-01403: no data found
    --message: 1
    type name: SYS.LCR$_ROW_RECORD
    source database: PROD2
    owner: REPORT_TWO
    object: DEVICE
    is tag null: Y
    command_type: UPDATE
    Supplemental logging is enabled for both source and target database.
    Thanks
    Bala

  • No 'Handle Duplicate records' in update tab

    Hi,
    I have a DTP from ODS/DSO to ODS/DSO and I got a duplicate record error, which I find rather strange for a standard ODS/DSO. I read in the help and within these forums that it can be fixed with the 'Handle Duplicate Records' checkbox on the update tab. The trouble is that there is no such checkbox in our BI7 SP15 installation.
    Any suggestions on the reason for that and how to fix it, and more importantly on how to get rid of the duplicate record error (and why it is occurring)?
    Many thanks in advance
    Eddy

    Hi Eddy,
    I am confused :-)
    Have you tried with the checkbox checked and with it unchecked?
    My suggestion is to try selecting the setting "unique data records".
    Cheers
    Siva

  • How to create duplicate records in end routines

    Hi
    Key fields in DSO are:
    Plant
    Storage Location
    MRP Area
    Material
    Changed Date
    Data Fields:
    Safety Stock
    Service Level
    MRP Type
    Counter_1 (In flow Key figure)
    Counter_2 (Out flow Key Figure)
    n_ctr  (Non Cumulative Key Figure)
    For every record that comes in, we need to create a duplicate record. For the original record, we need to set Counter_1 to 1 and Counter_2 to 0. For the duplicate record, we need to update Changed Date to today's date, keep the rest of the values as they are, and set Counter_1 to 0 and Counter_2 to -1. Where is the best place to write this code in the DSO? Is it the end routine?
    Please let me know a basic idea of the code.

    Hi Uday,
    I have the same situation as Suneel and have written this logic in the DSO end routine as follows:
    DATA: l_t_duplicate_records TYPE TABLE OF TYS_TG_1,
          l_w_duplicate_record TYPE TYS_TG_1.
    LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
        MOVE-CORRESPONDING <result_fields> TO l_w_duplicate_record.
        <result_fields>-/BIC/ZPP_ICNT = 1.
        <result_fields>-/BIC/ZPP_OCNT = 0.
        l_w_duplicate_record-CH_ON = sy-datum.
        l_w_duplicate_record-/BIC/ZPP_ICNT = 0.
        l_w_duplicate_record-/BIC/ZPP_OCNT = -1.
        APPEND l_w_duplicate_record TO  l_t_duplicate_records.
    ENDLOOP.
    APPEND LINES OF l_t_duplicate_records TO RESULT_PACKAGE.
    I am getting the below error:
    Duplicate data record detected (DS ZPP_O01, data package: 000001, data record: 4) RSODSO_UPDATE 19
    I also have a different requirement for the date: sort the records based on the key, get the latest CH_ON value for each unique Plant, Storage Location and Material combination, and populate that CH_ON value in the duplicate record (see the sketch below).
    Please help me to resolve this issue.
    Thanks,
    Ganga
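
    The date requirement above (take the latest CH_ON per Plant, Storage Location and Material and stamp it on the generated duplicate) is, in set terms, a windowed MAX. Purely as an illustration of that logic in SQL, with invented table and column names:

    SELECT plant, sloc, material, ch_on,
           MAX(ch_on) OVER (PARTITION BY plant, sloc, material) AS latest_ch_on
    FROM dso_rows;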

  • How to find out duplicate record contained in a flat file

    Hi Experts,
    For my project I have written a program for flat file upload.
    Requirement 1
    In the flat file there may be some duplicate record like:
    Field1   Field2
    11        test1
    11        test2
    12        test3
    13        test4
    Field1 is primary key.
    Can you please let me know how I can find out the duplicate records?
    Requirement 2
    The flat file contains the header row as shown above:
    Field1   Field2
    How can our program skip this header record and start reading/inserting records from row 2, i.e. from
    11        test1
    onwards?
    Thanks
    S
    FORM upload1.
    DATA : wf_title TYPE string,
    lt_filetab TYPE filetable,
    l_separator TYPE char01,
    l_action TYPE i,
    l_count TYPE i,
    ls_filetab TYPE file_table,
    wf_delemt TYPE rollname,
    wa_fieldcat TYPE lvc_s_fcat,
    tb_fieldcat TYPE lvc_t_fcat,
    rows_read TYPE i,
    p_error TYPE char01,
    l_file TYPE string.
    DATA: wf_object(30) TYPE c,
    wf_tablnm TYPE rsdchkview.
    wf_object = 'myprogram'.
    DATA i TYPE i.
    DATA:
    lr_mdmt TYPE REF TO cl_rsdmd_mdmt,
    lr_mdmtr TYPE REF TO cl_rsdmd_mdmtr,
    lt_idocstate TYPE rsarr_t_idocstate,
    lv_subrc TYPE sysubrc.
    TYPES : BEGIN OF test_struc,
    /bic/myprogram TYPE /bic/oimyprogram,
    txtmd TYPE rstxtmd,
    END OF test_struc.
    DATA : tb_assum TYPE TABLE OF /bic/pmyprogram.
    DATA: wa_ztext TYPE /bic/tmyprogram,
    myprogram_temp TYPE ziott_assum,
    wa_myprogram TYPE /bic/pmyprogram.
    DATA : test_upload TYPE STANDARD TABLE OF test_struc,
    wa2 TYPE test_struc.
    DATA : wa_test_upload TYPE test_struc,
    ztable_data TYPE TABLE OF /bic/pmyprogram,
    ztable_text TYPE TABLE OF /bic/tmyprogram,
    wa_upld_text TYPE /bic/tmyprogram,
    wa_upld_data TYPE /bic/pmyprogram,
    t_assum TYPE ziott_assum.
    DATA : wa1 LIKE test_upload.
    wf_title = text-026.
    CALL METHOD cl_gui_frontend_services=>file_open_dialog
    EXPORTING
    window_title = wf_title
    default_extension = 'txt'
    file_filter = 'Tab delimited Text Files (*.txt)'
    CHANGING
    file_table = lt_filetab
    rc = l_count
    user_action = l_action
    EXCEPTIONS
    file_open_dialog_failed = 1
    cntl_error = 2
    OTHERS = 3. "#EC NOTEXT
    IF sy-subrc <> 0.
    EXIT.
    ENDIF.
    LOOP AT lt_filetab INTO ls_filetab.
    l_file = ls_filetab.
    ENDLOOP.
    CHECK l_action = 0.
    IF l_file IS INITIAL.
    EXIT.
    ENDIF.
    l_separator = 'X'.
    wa_fieldcat-fieldname = 'test'.
    wa_fieldcat-dd_roll = wf_delemt.
    APPEND wa_fieldcat TO tb_fieldcat.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    CLEAR wa_test_upload.
    * Upload file from front-end (PC)
    * File format is tab-delimited ASCII
    CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
    filename = l_file
    has_field_separator = l_separator
    TABLES
    data_tab = test_upload " the original passed data_tab twice (i_mara and test_upload); only one target table is allowed
    EXCEPTIONS
    file_open_error = 1
    file_read_error = 2
    no_batch = 3
    gui_refuse_filetransfer = 4
    invalid_type = 5
    no_authority = 6
    unknown_error = 7
    bad_data_format = 8
    header_not_allowed = 9
    separator_not_allowed = 10
    header_too_long = 11
    unknown_dp_error = 12
    access_denied = 13
    dp_out_of_memory = 14
    disk_full = 15
    dp_timeout = 16
    OTHERS = 17.
    IF sy-subrc <> 0.
    EXIT.
    ELSE.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    IF test_upload IS NOT INITIAL.
    DESCRIBE TABLE test_upload LINES rows_read.
    CLEAR : wa_test_upload,wa_upld_data.
    LOOP AT test_upload INTO wa_test_upload.
    CLEAR : p_error.
    rows_read = sy-tabix.
    IF wa_test_upload-/bic/myprogram IS INITIAL.
    p_error = 'X'.
    MESSAGE s153 WITH wa_test_upload-/bic/myprogram sy-tabix.
    CONTINUE.
    ELSE.
    TRANSLATE wa_test_upload-/bic/myprogram TO UPPER CASE.
    wa_upld_text-txtmd = wa_test_upload-txtmd.
    wa_upld_text-txtsh = wa_test_upload-txtmd.
    wa_upld_text-langu = sy-langu.
    wa_upld_data-chrt_accts = 'xyz1'.
    wa_upld_data-co_area = '12'.
    wa_upld_data-/bic/zxyzbcsg = 'Iy'.
    wa_upld_data-objvers = 'A'.
    wa_upld_data-changed = 'I'.
    wa_upld_data-/bic/zass_mdl = 'rrr'.
    wa_upld_data-/bic/zass_typ = 'I'.
    wa_upld_data-/bic/zdriver = 'yyy'.
    wa_upld_text-langu = sy-langu.
    MOVE-CORRESPONDING wa_test_upload TO wa_upld_data.
    MOVE-CORRESPONDING wa_test_upload TO wa_upld_text.
    APPEND wa_upld_data TO ztable_data.
    APPEND wa_upld_text TO ztable_text.
    ENDIF.
    ENDLOOP.
    DELETE ADJACENT DUPLICATES FROM ztable_data.
    DELETE ADJACENT DUPLICATES FROM ztable_text.
    IF ztable_data IS NOT INITIAL.
    CALL METHOD cl_rsdmd_mdmt=>factory
    EXPORTING
    i_chabasnm = 'myprogram'
    IMPORTING
    e_r_mdmt = lr_mdmt
    EXCEPTIONS
    invalid_iobjnm = 1
    OTHERS = 2.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    **Lock the Infoobject to update
    CALL FUNCTION 'RSDG_IOBJ_ENQUEUE'
    EXPORTING
    i_objnm = wf_object
    i_scope = '1'
    i_msgty = rs_c_error
    EXCEPTIONS
    foreign_lock = 1
    sys_failure = 2.
    IF sy-subrc = 1.
    MESSAGE i107(zddd_rr) WITH wf_object sy-msgv2.
    EXIT.
    ELSEIF sy-subrc = 2.
    MESSAGE i108(zddd_rr) WITH wf_object.
    EXIT.
    ENDIF.
    *****Update Master Table
    IF ztable_data IS NOT INITIAL.
    CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
    EXPORTING
    i_iobjnm = 'myprogram'
    i_tabclass = 'M'
    I_T_ATTR = lt_attr
    TABLES
    i_t_table = ztable_data
    EXCEPTIONS
    attribute_name_error = 1
    iobj_not_found = 2
    generate_program_error = 3
    OTHERS = 4.
    IF sy-subrc <> 0.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'E'
    txtnr = '054'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    MESSAGE e054(zddd_rr) WITH 'myprogram'.
    ELSE.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'S'
    txtnr = '053'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    ENDIF.
    *endif.
    *****update Text Table
    IF ztable_text IS NOT INITIAL.
    CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
    EXPORTING
    i_iobjnm = 'myprogram'
    i_tabclass = 'T'
    TABLES
    i_t_table = ztable_text
    EXCEPTIONS
    attribute_name_error = 1
    iobj_not_found = 2
    generate_program_error = 3
    OTHERS = 4.
    IF sy-subrc <> 0.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'E'
    txtnr = '055'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    ENDIF.
    ENDIF.
    ELSE.
    MESSAGE s178(zddd_rr).
    ENDIF.
    ENDIF.
    COMMIT WORK.
    CALL FUNCTION 'RSD_CHKTAB_GET_FOR_CHA_BAS'
    EXPORTING
    i_chabasnm = 'myprogram'
    IMPORTING
    e_chktab = wf_tablnm
    EXCEPTIONS
    name_error = 1.
    IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    ****Release locks on Infoobject
    CALL FUNCTION 'RSDG_IOBJ_DEQUEUE'
    EXPORTING
    i_objnm = 'myprogram'
    i_scope = '1'.
    ENDIF.
    ENDIF.
    PERFORM data_selection .
    PERFORM update_alv_grid_display.
    CALL FUNCTION 'MESSAGES_SHOW'.
    ENDFORM.

    Can you please let me know how I can find out the duplicate records?
    You need to split each record from the flat file structure into your internal table and then use DELETE ADJACENT DUPLICATES COMPARING the key fields (sort the table by Field1 first, since only adjacent rows are removed):
    split flat_str into wa_f1 wa_f2 at tab_space.

  • 36 duplicate record  found. -- error while loading master data

    Hello BW Experts,
    Error while loading master data
    (green light) Update PSA (50000 records posted): no errors
    (green light) Transfer rules (50000 → 50000 records): no errors
    (green light) Update rules (50000 → 50000 records): no errors
    (green light) Update (0 new / 50000 changed): no errors
    (red light) Processing end: errors occurred
         Processing 2 finished
    - 36 duplicate records found. 15718 recordings used in table /BIC/PZMATNUM
    - 36 duplicate records found. 15718 recordings used in table /BIC/XZMATNUM
    This error repeats with all the data packets.
    What could be the reason for the error, and how can it be corrected?
    Any suggestions appreciated.
    Thanks,
    BWer

    BWer,
    We have exactly the same issue when loading the infoobject 0PM_ORDER, the datasource is 0PM_ORDER_ATTR.
    The workaround we have been using is a 'manual push' from the details tab of the monitor. Weirdly, we don't have this issue in our test and dev systems, and even in production it doesn't occur some days. We have all the necessary settings enabled in the InfoPackage.
    Did you figure out a solution for your issue? If so, please let me know, and likewise I will.
    -Venkat

  • Duplicate records found while loading master data(very urgent)

    Hi all,
    One InfoPackage in the process chain failed while loading the master data (full update). It shows the following error: duplicate record found, 1 record used in /BI0/PTCTQUERY, and the same record occurred in the /BI0/PTCTQUERY tables.
    Can anyone give me the solution? It's very urgent.
    Thanks & Regards,
    Manjula

    Hi
    You can see the checkbox on the Processing tab page; tick the Ignore Duplicate Data Records indicator. When multiple data records that have the same key are transferred, the last data record in the request is updated to BI; any other data records in the request with the same key are ignored.
    Help says that:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope it clears your doubt; otherwise let me know.
    Regards
    Kiran
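
    The "last data record in the request wins" behaviour described above is ordinary keep-the-last-row-per-key deduplication. Purely as an illustration of that rule in SQL (the staging table and its seq_no ordering column are invented for the example):

    DELETE FROM staging s
    WHERE EXISTS (SELECT 1
                  FROM staging s2
                  WHERE s2.key_col = s.key_col
                    AND s2.seq_no > s.seq_no); -- a later row with the same key supersedes this one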

  • Data loader : Import -- creating duplicate records ?

    Hi all,
    has anyone else encountered the behaviour with Oracle Data Loader where duplicate records are created (even when I set the option duplicatecheckoption=externalid)? When I check the "import request queue" view, the request parameters of the job look fine:
    Duplicate Checking Method == External Unique ID
    Action Taken if Duplicate Found == Overwrite Existing Records
    but Data Loader created new records where the "External Unique ID" already exists.
    Very strangely, when I create exactly the same import manually (using the Import Wizard), it works correctly: the duplicate checking method is applied and the record is updated.
    I know Data Loader has two methods, one for update and one for import; however, I did not expect the import to create duplicates when the record already exists, rather than doing nothing!
    Is anyone else experiencing the same? I hope this is not expected behaviour! By the way, the "Update" method works fine.
    thanks in advance, Juergen

    Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load first, before a full load, which is a best practice for data import that we recommend in our documentation and courses.
    Sorry also to inform you that this is expected behavior: Data Loader does not check for duplicates when inserting (aka importing); it only checks for duplicates when updating (aka overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and in the Data Import Options Overview document.
    You should review all documentation on Oracle Data Loader On Demand before using it.
    These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
    Pete

  • Duplicate records in Fact Tables

    Hi,
    We are using BPC 7.0 MS SP7. BPC created duplicate records in the WB and Fac2 tables. We faced a similar issue before, and the solution was to reboot the server and clean up the additional data created. I do not think it is an issue with the script logic files we have; we had the issue across all applications. The data is fine now after the server reboot and running the same logic files. I want to know if anyone has faced this issue and whether there is any solution other than a reboot. I appreciate your help.
    Thanks
    Raj

    Hi Sorin,
    I know this thread is rather old, but I have a problem which is closely related to it and would appreciate your assistance. I have a client running on 7.0 MS who has been using it for the past 3 years.
    It is a heavily customized system with many batch files running daily to update dimensions, copy data and sort. And yes, we do use custom packages that incorporate stored procedures.
    Recently, with no change in the environment, our FactWB table ballooned out of nowhere. The fact table only contains less than 1 GB of data, but FactWB has 200 GB and practically paralyzed the system. There is also an equivalent 300 GB increase in the log files.
    We are not able to find out what caused this, or whether the 200 GB of records in WB are even valid records that were duplicated. Is there a way to troubleshoot this?

  • Duplicate records in Cube Level ( BI 7.0 )

    Dear All
    I am working on BI 7.0 and I have an issue: I am loading data from a flat file to an ODS and from the ODS to a cube. In the ODS we selected the Overwrite option; at the cube level we have the Summation option. While loading the data from the flat file to the ODS the records are fine, and the load from the ODS to the cube also runs fine, but there I am getting duplicate records.
    What are the best options to go ahead in such a situation?
    Regards
    KK

    I am sharing a case that occurred for me; please see if it is applicable to you.
    Sometimes, when any type of problem occurs in the step loading to the cube, we restart the load. If the cube load prompts 'the last load was unsuccessful... reload?', this problem may occur: it will load the records from the previous load as well.
    Verify what has been duplicated from the ODS change log table and the cube load record count. If you see that the number of records updated is the total of the records in the different ODSR requests (in the change log table), delete the previous load in the cube (provided no other side effect is produced, e.g. from a start routine).
    Cheers.

  • Data Loader inserting duplicate records

    Hi,
    There is an import that we need to run every day in order to load data from another system into CRM On Demand. I have set up a Data Loader script which is scheduled to run every morning; the script should perform an insert operation.
    Every morning a file with new insert data is available in the same location (generated by someone else) and with the same name. The Data Loader script must insert all the records in it.
    One morning there was a problem in the other job and a new file was not produced. When the Data Loader script ran, it found the old file and re-inserted the records (there were 3 in the file). I had specified the -duplicatecheckoption parameter as the external ID, since the records come from another system, but I came to learn that the option works for update operations only.
    How can a situation like this be handled in the future? The external ID should be checked for duplicates before the insert operation is performed. If we can't check on the Data Loader side, is it possible to somehow mark the field as unique in the UI so that there is an error if a duplicate record is inserted? Please suggest.
    Regards,

    Hi
    You can use something like this:
    cursor crs is select distinct deptno, dname, loc from dept;
    Now you can insert all the records present in this cursor; see the sketch below.
    Assumption: you do not have duplicate entries in the dept table initially.
    Cheers
    Sudhir
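
    A minimal PL/SQL sketch expanding on that suggestion; dept_stage is an invented name for the incoming data, and the NOT EXISTS guard keeps a rerun of the same file from inserting the rows twice:

    begin
      for r in (select distinct deptno, dname, loc from dept_stage) loop
        insert into dept (deptno, dname, loc)
        select r.deptno, r.dname, r.loc
          from dual
         where not exists (select 1 from dept d where d.deptno = r.deptno);
      end loop;
      commit;
    end;
    /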
