EDI debatching: ST segments containing duplicate records

Hi,
Inbound messages to Oracle B2B are automatically debatched based on the ST segments within the ISA envelope. How should we handle the case where multiple ST segments contain the same record?
The data in those segments is identical and should either be processed only once, or the file should be rejected. Does Oracle B2B have a feature for checking duplicates based on payload contents, or is there another way to do this?
Thanks for your help.
Regards
sv
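
One possible workaround, sketched below, is to do the duplicate check outside B2B, for example in the SOA/middleware layer that receives the debatched transaction sets: store a hash of each ST payload in a control table with a unique constraint and treat a constraint violation as "already processed". This is not a documented Oracle B2B feature, and the table and column names are only illustrative.

-- control table for payload-level duplicate detection (illustrative names)
CREATE TABLE processed_st (
  payload_hash  VARCHAR2(64) NOT NULL,   -- e.g. a SHA-256 hash of the ST..SE content
  processed_on  DATE DEFAULT SYSDATE,
  CONSTRAINT uq_processed_st UNIQUE (payload_hash)
);

-- the integration layer inserts the hash before processing a transaction set;
-- a unique-constraint violation means the same payload was already handled
INSERT INTO processed_st (payload_hash) VALUES (:st_payload_hash);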

Hi All,
I have a similar scenario in an 856 outbound process where I have multiple transaction sets, i.e. multiple ST/SE pairs.
How can this be achieved?
Please reply with your suggestions.
Regards,
Divya

Similar Messages

  • How to delete Duplicate records from the informatica target using mapping?

    Hi, I have a scenario like the one below: in my mapping I have a source which may contain unique or duplicate records. Source and target are different tables. The target in my mapping contains duplicate records. Using the mapping, I have to delete the duplicate records in the target table. The target table does not contain any surrogate key. We can use the target table as a lookup table, but it cannot be used as a source in the mapping, and we cannot use post-SQL.

    Hi All, I have multiple flat files which I need to load into a single table. I did that using the indirect option at session level, but I need to work out how to populate a substring of the header into the Name column of the target table. I have two columns, Id and Name. Each input file has only one column, 'id', with a header like H|ABCD|Date. I need to populate the target as in the example below.
    File 1: header H|ABCD|Date, followed by ids 1, 2, 3
    File 2: header H|EFGH|Date, followed by ids 4, 5, 6
    Target table:
    Id    Name
    1     ABCD
    2     ABCD
    3     ABCD
    4     EFGH
    5     EFGH
    6     EFGH
    Can anyone help with the logic to get this data into a table in Informatica?

  • Master data infoobject can't handle duplicate records after SP10

    Hi
    I am trying to load master data which happens to contain duplicate records from the source system. In the DTP of the master data InfoObject, I have ticked the 'Handle Duplicate Record Keys' checkbox. After executing this DTP, the duplicate master data records were trapped in the error stack; I was expecting the duplicate master data to be overwritten instead. I understand that this error was fixed in Note 954661 (Updating master data texts when error in data package), which is from SP9. After applying Support Package 10, the master data InfoObject still can't handle records with duplicate keys.
    Please let me know if you manage to fix this problem.
    Many thanks,
    Anthony

    Found a fix for this problem. I just applied OSS Note 986196 - Error during duplicate record handling of master data texts.

  • SQL Query to retrieve one line from duplicate records

    Hi
    I have one table which contains duplicate records across multiple columns; the only difference is in one column, which contains either 0 or a positive value. I want a query that retrieves only the line with the positive value for the duplicated records.
    Below is some sample data for your reference:
    CREATE TABLE TRANS (
      CALLTRANSTYPE     NVARCHAR2(6),
      ORIGANI           NVARCHAR2(40),
      TERMANI           NVARCHAR2(40),
      STARTTIME         DATE,
      STOPTIME          DATE,
      CELLID            NVARCHAR2(10),
      CONNECTSECONDS    NUMBER,
      SWITCHCALLCHARGE  NUMBER
    );
    INSERT INTO TRANS VALUES ('REC','555988801','222242850',to_date('05/15/2012 09:15:00','mm/dd/yyyy hh24:mi:ss'),to_date('05/15/2012 09:15:25','mm/dd/yyyy hh24:mi:ss'),null,25,0);
    INSERT INTO TRANS VALUES ('REC','555988801','222242850',to_date('05/15/2012 09:15:00','mm/dd/yyyy hh24:mi:ss'),to_date('05/15/2012 09:15:25','mm/dd/yyyy hh24:mi:ss'),null,25,18000);
    INSERT INTO TRANS VALUES ('REC','555988801','222242850',to_date('05/15/2012 09:18:03','mm/dd/yyyy hh24:mi:ss'),to_date('05/15/2012 09:18:20','mm/dd/yyyy hh24:mi:ss'),null,17,0);
    The output I want is:
    CALLTRANSTYPE  ORIGANI     TERMANI     STARTTIME            STOPTIME             CELLID  CONNECTSECONDS  SWITCHCALLCHARGE
    REC            555988801   222242850   05/15/2012 09:15:00  05/15/2012 09:15:25          25              18000
    REC            555988801   222242850   05/15/2012 09:18:03  05/15/2012 09:18:20          17              0
    Thank you.

    Hi ekh,
    this is the query I wanted, thank you for the help:
    SQL> SELECT *
           FROM (SELECT calltranstype, origani, termani, starttime, stoptime, cellid, connectseconds, switchcallcharge,
                        ROW_NUMBER() OVER (PARTITION BY starttime, stoptime ORDER BY switchcallcharge DESC) rn
                   FROM trans)
          WHERE rn = 1;
    CALLTR ORIGANI      TERMANI      STARTTIME STOPTIME  CELLID  CONNECTSECONDS  SWITCHCALLCHARGE  RN
    REC    555988801    222242850    15-MAY-12 15-MAY-12                     25             18000   1
    REC    555988801    222242850    15-MAY-12 15-MAY-12                     17                 0   1
    Regards,
    Lucienot.

  • How to find duplicate records contained in a flat file

    Hi Experts,
    For my project I have written a program for flat file upload.
    Requirement 1
    In the flat file there may be some duplicate records, for example:
    Field1   Field2
    11        test1
    11        test2
    12        test3
    13        test4
    Field1 is primary key.
    Can you please let me know how I can find the duplicate records?
    Requirement 2
    The flat file contains a header row as shown above:
    Field1   Field2
    How can our program skip this record and start reading/inserting records from row 2, i.e.
    11        test1
    onwards?
    Thanks
    S
    FORM upload1.
    DATA : wf_title TYPE string,
    lt_filetab TYPE filetable,
    l_separator TYPE char01,
    l_action TYPE i,
    l_count TYPE i,
    ls_filetab TYPE file_table,
    wf_delemt TYPE rollname,
    wa_fieldcat TYPE lvc_s_fcat,
    tb_fieldcat TYPE lvc_t_fcat,
    rows_read TYPE i,
    p_error TYPE char01,
    l_file TYPE string.
    DATA: wf_object(30) TYPE c,
    wf_tablnm TYPE rsdchkview.
    wf_object = 'myprogram'.
    DATA i TYPE i.
    DATA:
    lr_mdmt TYPE REF TO cl_rsdmd_mdmt,
    lr_mdmtr TYPE REF TO cl_rsdmd_mdmtr,
    lt_idocstate TYPE rsarr_t_idocstate,
    lv_subrc TYPE sysubrc.
    TYPES : BEGIN OF test_struc,
    /bic/myprogram TYPE /bic/oimyprogram,
    txtmd TYPE rstxtmd,
    END OF test_struc.
    DATA : tb_assum TYPE TABLE OF /bic/pmyprogram.
    DATA: wa_ztext TYPE /bic/tmyprogram,
    myprogram_temp TYPE ziott_assum,
    wa_myprogram TYPE /bic/pmyprogram.
    DATA : test_upload TYPE STANDARD TABLE OF test_struc,
    wa2 TYPE test_struc.
    DATA : wa_test_upload TYPE test_struc,
    ztable_data TYPE TABLE OF /bic/pmyprogram,
    ztable_text TYPE TABLE OF /bic/tmyprogram,
    wa_upld_text TYPE /bic/tmyprogram,
    wa_upld_data TYPE /bic/pmyprogram,
    t_assum TYPE ziott_assum.
    DATA : wa1 LIKE test_upload.
    wf_title = text-026.
    CALL METHOD cl_gui_frontend_services=>file_open_dialog
    EXPORTING
    window_title = wf_title
    default_extension = 'txt'
    file_filter = 'Tab delimited Text Files (*.txt)'
    CHANGING
    file_table = lt_filetab
    rc = l_count
    user_action = l_action
    EXCEPTIONS
    file_open_dialog_failed = 1
    cntl_error = 2
    OTHERS = 3. "#EC NOTEXT
    IF sy-subrc <> 0.
    EXIT.
    ENDIF.
    LOOP AT lt_filetab INTO ls_filetab.
    l_file = ls_filetab.
    ENDLOOP.
    CHECK l_action = 0.
    IF l_file IS INITIAL.
    EXIT.
    ENDIF.
    l_separator = 'X'.
    wa_fieldcat-fieldname = 'test'.
    wa_fieldcat-dd_roll = wf_delemt.
    APPEND wa_fieldcat TO tb_fieldcat.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    CLEAR wa_test_upload.
    * Upload file from front-end (PC)
    * File format is tab-delimited ASCII
    CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
    filename = l_file
    has_field_separator = l_separator
    TABLES
    * data_tab = i_mara
    data_tab = test_upload
    EXCEPTIONS
    file_open_error = 1
    file_read_error = 2
    no_batch = 3
    gui_refuse_filetransfer = 4
    invalid_type = 5
    no_authority = 6
    unknown_error = 7
    bad_data_format = 8
    header_not_allowed = 9
    separator_not_allowed = 10
    header_too_long = 11
    unknown_dp_error = 12
    access_denied = 13
    dp_out_of_memory = 14
    disk_full = 15
    dp_timeout = 16
    OTHERS = 17.
    IF sy-subrc <> 0.
    EXIT.
    ELSE.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    IF test_upload IS NOT INITIAL.
    DESCRIBE TABLE test_upload LINES rows_read.
    CLEAR : wa_test_upload,wa_upld_data.
    LOOP AT test_upload INTO wa_test_upload.
    CLEAR : p_error.
    rows_read = sy-tabix.
    IF wa_test_upload-/bic/myprogram IS INITIAL.
    p_error = 'X'.
    MESSAGE s153 WITH wa_test_upload-/bic/myprogram sy-tabix.
    CONTINUE.
    ELSE.
    TRANSLATE wa_test_upload-/bic/myprogram TO UPPER CASE.
    wa_upld_text-txtmd = wa_test_upload-txtmd.
    wa_upld_text-txtsh = wa_test_upload-txtmd.
    wa_upld_text-langu = sy-langu.
    wa_upld_data-chrt_accts = 'xyz1'.
    wa_upld_data-co_area = '12'.
    wa_upld_data-/bic/zxyzbcsg = 'Iy'.
    wa_upld_data-objvers = 'A'.
    wa_upld_data-changed = 'I'.
    wa_upld_data-/bic/zass_mdl = 'rrr'.
    wa_upld_data-/bic/zass_typ = 'I'.
    wa_upld_data-/bic/zdriver = 'yyy'.
    wa_upld_text-langu = sy-langu.
    MOVE-CORRESPONDING wa_test_upload TO wa_upld_data.
    MOVE-CORRESPONDING wa_test_upload TO wa_upld_text.
    APPEND wa_upld_data TO ztable_data.
    APPEND wa_upld_text TO ztable_text.
    ENDIF.
    ENDLOOP.
    DELETE ADJACENT DUPLICATES FROM ztable_data.
    DELETE ADJACENT DUPLICATES FROM ztable_text.
    IF ztable_data IS NOT INITIAL.
    CALL METHOD cl_rsdmd_mdmt=>factory
    EXPORTING
    i_chabasnm = 'myprogram'
    IMPORTING
    e_r_mdmt = lr_mdmt
    EXCEPTIONS
    invalid_iobjnm = 1
    OTHERS = 2.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    **Lock the Infoobject to update
    CALL FUNCTION 'RSDG_IOBJ_ENQUEUE'
    EXPORTING
    i_objnm = wf_object
    i_scope = '1'
    i_msgty = rs_c_error
    EXCEPTIONS
    foreign_lock = 1
    sys_failure = 2.
    IF sy-subrc = 1.
    MESSAGE i107(zddd_rr) WITH wf_object sy-msgv2.
    EXIT.
    ELSEIF sy-subrc = 2.
    MESSAGE i108(zddd_rr) WITH wf_object.
    EXIT.
    ENDIF.
    *****Update Master Table
    IF ztable_data IS NOT INITIAL.
    CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
    EXPORTING
    i_iobjnm = 'myprogram'
    i_tabclass = 'M'
    I_T_ATTR = lt_attr
    TABLES
    i_t_table = ztable_data
    EXCEPTIONS
    attribute_name_error = 1
    iobj_not_found = 2
    generate_program_error = 3
    OTHERS = 4.
    IF sy-subrc <> 0.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'E'
    txtnr = '054'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    MESSAGE e054(zddd_rr) WITH 'myprogram'.
    ELSE.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'S'
    txtnr = '053'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    ENDIF.
    *endif.
    *****update Text Table
    IF ztable_text IS NOT INITIAL.
    CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
    EXPORTING
    i_iobjnm = 'myprogram'
    i_tabclass = 'T'
    TABLES
    i_t_table = ztable_text
    EXCEPTIONS
    attribute_name_error = 1
    iobj_not_found = 2
    generate_program_error = 3
    OTHERS = 4.
    IF sy-subrc <> 0.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'E'
    txtnr = '055'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    ENDIF.
    ENDIF.
    ELSE.
    MESSAGE s178(zddd_rr).
    ENDIF.
    ENDIF.
    COMMIT WORK.
    CALL FUNCTION 'RSD_CHKTAB_GET_FOR_CHA_BAS'
    EXPORTING
    i_chabasnm = 'myprogram'
    IMPORTING
    e_chktab = wf_tablnm
    EXCEPTIONS
    name_error = 1.
    IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    ****Release locks on Infoobject
    CALL FUNCTION 'RSDG_IOBJ_DEQUEUE'
    EXPORTING
    i_objnm = 'myprogram'
    i_scope = '1'.
    ENDIF.
    ENDIF.
    PERFORM data_selection .
    PERFORM update_alv_grid_display.
    CALL FUNCTION 'MESSAGES_SHOW'.
    ENDFORM.

    Can you please let me know how I can find the duplicate records.
    You need to split the records from the flat file structure into your internal table and use DELETE ADJACENT DUPLICATES COMPARING the relevant fields, e.g.:
    SPLIT flat_str AT tab_space INTO wa_f1 wa_f2 wa_f3.

  • How to delete duplicate records in a table without a primary key

    I have a table that contains around 1 million records, and there is no primary key or auto-number column. I need to delete the duplicate records from this table. What is a simple, effective way to do this?

    Please see this link:
    Remove duplicate records ...
    sqldevelop.wordpress.com
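    For reference, a minimal sketch of the usual ROWID-based approach (assuming an Oracle database; big_table and the GROUP BY columns are placeholder names, and rows count as duplicates when all the listed columns match):
    -- keep the row with the lowest ROWID in each group of identical rows
    DELETE FROM big_table
     WHERE ROWID NOT IN (SELECT MIN(ROWID)
                           FROM big_table
                          GROUP BY col1, col2, col3);  -- list every column that defines a duplicate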

  • Duplicate records in Fact Tables

    Hi,
    We are using BPC 7.0 MS SP7. BPC created duplicate records in the WB and Fac2 tables. We faced a similar issue before, and the solution was to reboot the server and clean up the additional data created. I don't think it is an issue with the script logic files we have; we had the issue across all applications. Data is fine now after the server reboot and running the same logic files. I want to know if anyone has faced this issue and if there is any solution other than a reboot. I appreciate your help.
    Thanks
    Raj

    Hi Sorin,
    I know this thread is rather old, but I have a problem which is closely related to it and would appreciate your assistance. I have a client running on 7.0 MS who has been using it for the past 3 years.
    It is a heavily customized system with many batch files running daily to update dimensions, copy data and sort. And yes, we do use custom packages that incorporate stored procedures.
    Recently, with no change in the environment, our FactWB table ballooned up out of nowhere. The fact table only contains less than 1 GB of data, but FactWB has 200 GB and has practically paralyzed the system. There is also an equivalent 300 GB increase in the log files.
    We are not able to find out what caused this, or whether the 200 GB of records in WB are even valid records that are duplicated. Is there a way to troubleshoot this?

  • Script to merge multiple CSV files together with no duplicate records.

    I would like a script to merge multiple CSV files together with no duplicate records.
    None of the files have any headers, and column A contains a unique ID. What would be the best way to accomplish that?

    OK, here is my answer:
    2 files in a directory with no headers.
    The first column is the unique ID; in the second column you can put whatever you want.
    The headers are added when using the Import-Csv cmdlet.
    The first file contains:
    1;a
    2;SAMEID-FIRSTFILE
    3;c
    4;d
    5;e
    The second file contains:
    6;a
    2;SAMEID-SECONDFILE
    7;c
    8;d
    9;e
    The second file contains a line with ID 2, which already exists in the first file.
    The code:
    $i = 0
    Foreach($file in (Get-ChildItem d:\yourpath)){
        if($i -eq 0){
            # the first file becomes the reference set
            $ref = Import-Csv $file.FullName -Header id,value -Delimiter ";"
        }else{
            $temp = Import-Csv $file.FullName -Header id,value -Delimiter ";"
            foreach($line in $temp){
                # only add lines whose ID is not already in the reference set
                if(!($ref.id.Contains($line.id))){
                    $objet = New-Object PSObject
                    Add-Member -InputObject $objet -MemberType NoteProperty -Name id -Value $line.id
                    Add-Member -InputObject $objet -MemberType NoteProperty -Name value -Value $line.value
                    $ref += $objet
                }
            }
        }
        $i++
    }
    $ref
    $ref should return:
    id   value
    1    a
    2    SAMEID-FIRSTFILE
    3    c
    4    d
    5    e
    6    a
    7    c
    8    d
    9    e
    (Get-ChildItem d:\yourpath) -> yourpath is the directory containing the 2 CSV files.

  • The ABAP/4 Open SQL array insert results in duplicate Record in database

    Hi All,
    I am trying to transfer 4 plants from R/3 to APO. The IM (integration model) contains only these 4 plants. However, a queue gets generated in APO saying 'The ABAP/4 Open SQL array insert results in duplicate record in database'. I checked tables /SAPAPO/LOC, /SAPAPO/LOCMAP and /SAPAPO/LOCT for duplicate entries, but none were found.
    Can anybody guide me how to resolve this issue?
    Thanks in advance
    Sandeep Patil

    Hi Sandeep,
    Now try to delete your locations before activating the IM again.
    Use the program /SAPAPO/DELETE_LOCATIONS to delete locations.
    Note :
    1. Set the deletion flag (in /SAPAPO/LOC : Location -> Deletion Flag)
    2. Remove all the dependencies (like transportation lane, Model ........ )
    Check now and let me know.
    Regards,
    Siva.

  • Problem with duplicate records..

    Hi, I am trying to insert records into the database based on a button click from a Swing frame. I have succeeded in that, but when I try to insert a duplicate record it throws com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry '1' for key 1.
    My code is:
    import java.awt.event.*;
    import java.sql.*;
    public class BarcodeReader extends javax.swing.JFrame implements ActionListener {
        static Connection con;
        static String url = "jdbc:mysql://localhost:3306/";
        static String db = "mynewdatabase";
        static String driver = "com.mysql.jdbc.Driver";
        static String user = "uname";
        static String pass = "pwd";
        static Statement stmt;

        /** Creates new form BarcodeReader */
        public BarcodeReader() {
            initComponents();
            nb.addActionListener(this);
        }

        public static Connection getConnection() {
            try {
                Class.forName(driver);
                con = DriverManager.getConnection(url + db, user, pass);
                stmt = con.createStatement();
                System.out.println("jdbc driver for mysql : " + driver);
                System.out.println("Connection url : " + url + db);
            } catch (Exception ex) {
                ex.printStackTrace();
            }
            return con;
        }

        private void ActionPerformed(java.awt.event.ActionEvent evt) {
            Connection con = getConnection();
            String t = newtxt.getText();
            int t1 = Integer.parseInt(t);
            try {
                PreparedStatement p = con.prepareStatement("INSERT INTO machine(barcode) values(?)");
                ResultSet rs = stmt.executeQuery("SELECT * FROM MACHINE");
                while (rs.next()) {
                    String s = rs.getString("barcode");
                    if (s.equals(t)) {
                        System.out.println("this id exists....");
                        rs.last();
                    }
                    // note: the insert below runs on every pass of the loop, even when the id already exists
                    p.setInt(1, t1);
                    p.executeUpdate();
                    new NewJFrame().setVisible(true);
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        public static void main(String args[]) {
            java.awt.EventQueue.invokeLater(new Runnable() {
                public void run() {
                    new BarcodeReader().setVisible(true);
                }
            });
        }
    }
    When I insert a new record it works, but when I insert the same record again it shows the following exception:
    jdbc driver for mysql : com.mysql.jdbc.Driver
    Connection url : jdbc:mysql://localhost:3306/mynewdatabase
    this id exists....
    com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry '1' for key 1
            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
            at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
            at com.mysql.jdbc.Util.handleNewInstance(Util.java:406)
            at com.mysql.jdbc.Util.getInstance(Util.java:381)
            at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1015)
            at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
            at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3536)
            at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3468)
            at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1957)
            at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2107)
            at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2648)
            at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2086)
            at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2371)
            at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2289)
            at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2274)
            at project.BarcodeReader.ActionPerformed(BarcodeReader.java:276)
            at project.BarcodeReader.access$000(BarcodeReader.java:17)
            at project.BarcodeReader$1.actionPerformed(BarcodeReader.java:138)
            at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1995)
            at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2318)
            at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:387)
            at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:242)
            at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:236)
            at java.awt.Component.processMouseEvent(Component.java:6263)
            at javax.swing.JComponent.processMouseEvent(JComponent.java:3267)
            at java.awt.Component.processEvent(Component.java:6028)
            at java.awt.Container.processEvent(Container.java:2041)
            at java.awt.Component.dispatchEventImpl(Component.java:4630)
            at java.awt.Container.dispatchEventImpl(Container.java:2099)
            at java.awt.Component.dispatchEvent(Component.java:4460)
            at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4574)
            at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4238)
            at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4168)
            at java.awt.Container.dispatchEventImpl(Container.java:2085)
            at java.awt.Window.dispatchEventImpl(Window.java:2475)
            at java.awt.Component.dispatchEvent(Component.java:4460)
            at java.awt.EventQueue.dispatchEvent(EventQueue.java:599)
            at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:269)
            at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:184)
            at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:174)
            at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:169)
            at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:161)
            at java.awt.EventDispatchThread.run(EventDispatchThread.java:122)
    Thanks.

    shelton141 wrote:
    Hi, I am trying to insert records into the database based on a button click from a Swing frame. I have succeeded in that, but when I try to insert a duplicate record it throws com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry '1' for key 1.
    Of course it does. That is what it is supposed to do.
    Try inserting the same record twice, manually, and see what happens.
    There is, however, if I remember right, an "IGNORE" clause/option that you can use on INSERT statements in MySQL. Check the manual, if that is what you really want.
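    For reference, a minimal sketch of that MySQL option; it assumes the machine.barcode column already carries the UNIQUE or PRIMARY KEY constraint that raises the duplicate-entry error above:
    -- the unique key on barcode is what triggers the duplicate-entry error
    INSERT IGNORE INTO machine (barcode) VALUES (1);
    -- running the same statement again is silently skipped instead of throwing an exception
    INSERT IGNORE INTO machine (barcode) VALUES (1);
    In the Java code above this would only require changing the SQL text passed to prepareStatement; alternatively, the duplicate check could stay in Java by skipping executeUpdate() when the barcode was found in the result set.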

  • Import Transaction Data - Duplicate records

    Hi,
    I need to upload a file of transaction data into BPC using a Data Manager package. I've created the transformation and conversion files, which validate successfully on a small data set. When I try to upload the real-life data file, it fails due to duplicate records. This happens because multiple external IDs map to one internal ID. Therefore, whilst there are no duplicates in the actual file produced by the client, the data produced after conversion does contain duplicates and will therefore not upload.
    Apart from asking the client to perform the aggregation before sending me the file, is there any way to get BPC to allow the duplicates and simply sum them up?
    Regards
    Sue

    Hi,
    Try adding the delivered package /CPMP/APPEND and running it. This should solve your problem.
    Thanks,
    Sreeni

  • Duplicate records in TABLE CONTROL

    Hi folks,
    I am doing a module pool where my internal table (itab) data is displayed in a table control (ctrl). I then need to select one record in the table control and press the REFRESH push button.
    After pressing the refresh button, some new records are added to that same internal table, and I then need to display the modified internal table (with the new records added) in the table control.
    The modified internal table data does come into the table control, but at the end of the table control some records are repeated.
    Before it reaches the table control, I checked the modified itab and it contains the correct data, i.e. 15 records (previously I had 5 records; after the REFRESH button, 10 more records are added). But when this table comes into the table control, it contains some 100 records; I should get only 15.
    Why are these records repeating, and how can I delete the duplicate records from the table control?
    Please suggest where I am making a mistake.
    A correct answer will be rewarded.
    Thanks & Regards

    Hi,
    Thanks for your help, but I should not refresh the internal table, as some records are already present. After pressing the REFRESH button, some new records are appended to this existing table, and I then display the previous records as well as the new ones.
    I checked the internal table after the modification and it contains the correct number of records, but after it comes into the table control, more records appear.
    Is this a problem with scrolling, or what?
    Please suggest where I am making a mistake. My code is below.
    PROCESS BEFORE OUTPUT.
    MODULE STATUS_0200.
    module tc_shelf_change_tc_attr.
    loop at object_tab1
           with control tablctrl
           cursor tablctrl-current_line.
        module tc_shelf_get_lines.
      endloop.
    PROCESS AFTER INPUT.
    module set_exit AT EXIT-COMMAND.
       loop at object_tab1.
             chain.
              field: object_tab1-prueflos,
                     object_tab1-matnr.
               module shelf_modify on chain-request.
             endchain.
            field object_tab1-idx
             module shelf_mark on request.
                   endloop.
    module shelf_user_command.
    module user_command_0200.
    ***INCLUDE Y_RQEEAL10_STATUS_0200O01 .
    *&      Module  STATUS_0200  OUTPUT
    *      text
    MODULE STATUS_0200 OUTPUT.
      SET PF-STATUS 'MAIN'.
    SET TITLEBAR 'xxx'.
    ENDMODULE.                 " STATUS_0200  OUTPUT
    *&      Module  tc_shelf_change_tc_attr  OUTPUT
    *      text
    MODULE tc_shelf_change_tc_attr OUTPUT.
    delete adjacent duplicates from object_tab1 comparing prueflos matnr.
    describe table object_tab1 lines tablctrl-lines.
    ENDMODULE.                 " tc_shelf_change_tc_attr  OUTPUT
    *&      Module  tc_shelf_get_lines  OUTPUT
    *      text
    MODULE tc_shelf_get_lines OUTPUT.
    data:  g_tc_shelf_lines  like sy-loopc.
    if tablctrl-current_line > tablctrl-lines.
    stop.
    endif.
    g_tc_tablctrl_lines = sy-loopc.
    *refresh control tablctrl from screen 0200.
    ENDMODULE.                 " tc_shelf_get_lines  OUTPUT
    ***INCLUDE Y_RQEEAL10_SHELF_MODIFYI01 .
    *&      Module  shelf_modify  INPUT
    *      text
    MODULE shelf_modify INPUT.
    modify object_tab1
        index tablctrl-current_line.
    ENDMODULE.                 " shelf_modify  INPUT
    *&      Module  set_exit  INPUT
    *      text
    module set_exit INPUT.
    leave program.
    endmodule.                 " set_exit  INPUT
    *&      Module  shelf_mark  INPUT
    *      text
    MODULE shelf_mark INPUT.
    data: g_shelf_wa2 like line of object_tab1.
      if tablctrl-line_sel_mode = 1
      and object_tab1-idx = 'X'.
        loop at object_tab1 into g_shelf_wa2
          where idx = 'X'.
          g_shelf_wa2-idx = ''.
          modify object_tab1
            from g_shelf_wa2
            transporting idx.
        endloop.
      endif.
      modify object_tab1
        index tablctrl-current_line
        transporting idx plnty plnnr plnal.
    ENDMODULE.                 " shelf_mark  INPUT
    *&      Module  shelf_user_command  INPUT
    *      text
    MODULE shelf_user_command INPUT.
    ok_code = sy-ucomm.
      perform user_ok_tc using    'TABLCTRL'
                                  'OBJECT_TAB1'
                         changing ok_code.
      sy-ucomm = ok_code.
    ENDMODULE.                 " shelf_user_command  INPUT
    *&      Module  user_command_0100  INPUT
    *      text
    MODULE user_command_0200 INPUT.
    data:v_line(3).
    case OK_CODE.
    when 'LAST'.
    read table object_tab1 with key idx = 'X'.
    if sy-subrc = 0.
    select * from qals
                          where enstehdat <= object_tab1-enstehdat
                          and   plnty ne space
                          and   plnnr ne space
                          and   plnal ne space.
    if sy-dbcnt > 0.
    if qals-enstehdat = object_tab1-enstehdat.
       check qals-entstezeit < object_tab1-entstezeit.
       move-corresponding qals to object_tab2.
       append object_tab2.
       else.
       move-corresponding qals to object_tab2.
       append object_tab2.
       endif.
         endif.
            endselect.
       sort object_tab2 by enstehdat entstezeit descending.
    loop at object_tab2 to 25.
      if not object_tab2-prueflos is initial.
    append object_tab2 to object_tab1.
      endif.
      clear object_tab2.
    endloop.
      endif.
    when 'SAVE'.
    loop at object_tab1 where idx = 'X'.
      if ( not object_tab1-plnty is initial and
                    not object_tab1-plnnr is initial and
                               not object_tab1-plnal is initial ).
       select single * from qals into corresponding fields of wa_qals
       where prueflos = object_tab1-prueflos.
          if sy-subrc = 0.
           wa_qals-plnty = object_tab1-plnty.
           wa_qals-plnnr = object_tab1-plnnr.
           wa_qals-plnal = object_tab1-plnal.
    update qals from wa_qals.
      if sy-subrc <> 0.
    Message E001 with 'plan is not assigned to lot in sap(updation)'.
    else.
    v_line = tablctrl-current_line - ( tablctrl-current_line - 1 ).
    delete object_tab1.
    endif.
       endif.
          endif.
               endloop.
    when 'BACK'.
    leave program.
    when 'NEXT'.
    call screen 300.
    ENDCASE.
    ***INCLUDE Y_RQEEAL10_USER_OK_TCF01 .
    *&      Form  user_ok_tc
    *      text
    *      -->P_0078   text
    *      -->P_0079   text
    *      <--P_OK_CODE  text
    form user_ok_tc  using    p_tc_name type dynfnam
                              p_table_name
                     changing p_ok_code like sy-ucomm.
       data: l_ok              type sy-ucomm,
             l_offset          type i.
       search p_ok_code for p_tc_name.
       if sy-subrc <> 0.
         exit.
       endif.
       l_offset = strlen( p_tc_name ) + 1.
       l_ok = p_ok_code+l_offset.
       case l_ok.
         when 'P--' or                     "top of list
              'P-'  or                     "previous page
              'P+'  or                     "next page
              'P++'.                       "bottom of list
           perform compute_scrolling_in_tc using p_tc_name
                                                 l_ok.
           clear p_ok_code.
       endcase.
    endform.                    " user_ok_tc
    *&      Form  compute_scrolling_in_tc
    *      text
    *      -->P_P_TC_NAME  text
    *      -->P_L_OK  text
    form compute_scrolling_in_tc using    p_tc_name
                                           p_ok_code.
       data l_tc_new_top_line     type i.
       data l_tc_name             like feld-name.
       data l_tc_lines_name       like feld-name.
       data l_tc_field_name       like feld-name.
       field-symbols <tc>         type cxtab_control.
       field-symbols <lines>      type i.
       assign (p_tc_name) to <tc>.
       concatenate 'G_' p_tc_name '_LINES' into l_tc_lines_name.
       assign (l_tc_lines_name) to <lines>.
       if <tc>-lines = 0.
         l_tc_new_top_line = 1.
       else.
         call function 'SCROLLING_IN_TABLE'
           exporting
             entry_act      = <tc>-top_line
             entry_from     = 1
             entry_to       = <tc>-lines
             last_page_full = 'X'
             loops          = <lines>
             ok_code        = p_ok_code
             overlapping    = 'X'
           importing
             entry_new      = l_tc_new_top_line
           exceptions
             others         = 0.
       endif.
       get cursor field l_tc_field_name
                  area  l_tc_name.
       if syst-subrc = 0.
         if l_tc_name = p_tc_name.
           set cursor field l_tc_field_name line 1.
         endif.
       endif.
       <tc>-top_line = l_tc_new_top_line.
    endform.                              " COMPUTE_SCROLLING_IN_TC
    Thanks

  • Duplicate records in PSA

    Hi all,
    How can I identify and eliminate duplicate records in the PSA?

    Hi,
    Here is the F1 help for the 'Handle Duplicate Record Keys' option on the Update tab:
    "Indicator: Handling Duplicate Data Records
    If this indicator is set, duplicate data records are handled during an update in the order in which they occur in a data package.
    For time-independent attributes of a characteristic, the last data record with the corresponding key within a data package defines the valid attribute value for the update for a given data record key.
    For time-dependent attributes, the validity ranges of the data record values are calculated according to their order (see example).
    If during your data quality measures you want to make sure that the data packages delivered by the DTP are not modified by the master data update, you must not set this indicator!
    Use:
    Note that for time-dependent master data, the semantic key of the DTP may not contain the field of the data source containing the DATETO information. When you set this indicator, error handling must be activated for the DTP because correcting duplicate data records is an error correction. The error correction must be "Update valid records, no reporting" or "Update valid records, reporting possible".
    Example:
    Handling of time-dependent data records
    - Data record 1 is valid from 01.01.2006 to 31.12.2006
    - Data record 2 has the same key but is valid from 01.07.2006 to 31.12.2007
    - The system corrects the time interval for data record 1 to 01.01.2006 to 30.06.2006. As of 01.07.2006, the next data record in the data package (data record 2) is valid."
    By flagging this option in the DTP, you are allowing it to take the latest value.
    There is further information at this SAP Help Portal link:
    http://help.sap.com/saphelp_nw04s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/content.htm
    Rgds,
    Colum

  • Duplicate Records in WebI

    Hi All,
    I am getting duplicate records at the report level. I unchecked the 'Remove Duplicate Rows' option but it is not working; when I use 'Avoid Duplicate Row Aggregation' it removes the duplicates, but I want to sum up the rows as in the following format.
    Number    Name           Amt
    1234         abc             1000
    1234         abc                -10
    3456         xyz             2000
    3456         xyz             1000
    I want to sum up only when the number contains a discount amount, i.e. for the first two items. If the product doesn't contain any discounts, it should not be summed up.
    If I use 'Avoid Duplicate Row Aggregation', it displays the result in the following format:
    Number    Name           Amt
    1234         abc             990
    3456         xyz             3000
    But I want to display the report in the following format:
    Number    Name           Amt
    1234         abc             990
    3456         xyz             2000
    3456         xyz             1000
    Regards,
    Shiva Kumar G.C
    Edited by: Shivu  Kumar on Dec 31, 2010 3:36 PM

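    The requested result can be described in database terms as: sum the rows only for numbers that have a discount (negative amount) line, and leave the remaining numbers unaggregated. A minimal sketch, assuming a hypothetical table report_lines(num, name, amt):
    -- numbers that contain at least one discount (negative) line are summed up
    SELECT num, name, SUM(amt) AS amt
      FROM report_lines
     WHERE num IN (SELECT num FROM report_lines WHERE amt < 0)
     GROUP BY num, name
    UNION ALL
    -- numbers without any discount line are kept as individual rows
    SELECT num, name, amt
      FROM report_lines
     WHERE num NOT IN (SELECT num FROM report_lines WHERE amt < 0);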

  • Need a query for duplicate records deletion

    Here is one scenario:
    23130 ----> 'A'
    23130 ----> 'X'
    23130 ----> 'c'
    These are duplicate records. When we remove duplicates, the record must keep 'C' if it contains A, C and X; if it contains only A and X, then the record must keep 'X'. That means the priority goes C --> X --> A. I need a query for this one scenario. It would be great if you could reply ASAP.

    Hello
    It's great that you gave examples of your data, but it is also helpful to supply CREATE TABLE and INSERT statements along with a clear example of the expected results. Anyway, I think this does what you are looking for.
    CREATE TABLE dt_dup (id NUMBER, flag VARCHAR2(1));
    INSERT INTO dt_dup VALUES (23130, 'A');
    INSERT INTO dt_dup VALUES (23130, 'X');
    INSERT INTO dt_dup VALUES (23130, 'C');
    INSERT INTO dt_dup VALUES (23131, 'A');
    INSERT INTO dt_dup VALUES (23131, 'X');
    DELETE FROM dt_dup
     WHERE ROWID IN (SELECT rid
                       FROM (SELECT ROWID rid,
                                    ROW_NUMBER() OVER (PARTITION BY id
                                                       ORDER BY CASE
                                                                  WHEN flag = 'A' THEN 3
                                                                  WHEN flag = 'X' THEN 2
                                                                  WHEN flag = 'C' THEN 1
                                                                END) rn
                               FROM dt_dup)
                      WHERE rn > 1);
    SELECT * FROM dt_dup;
    HTH
    David
    Edited by: Bravid on Jun 30, 2011 8:12 AM
