Duplicate Record - Perform Manual Recovery

Hi,
I am facing an issue with duplicate records; let me explain the scenario.
I have 10 XML files, each containing around 5,000 records.
I am writing the data from all the XML files into a database table as below.
                     Write operation
XML Files      ------------------> Database Tables
[ File Adapter ]                   [ Database Adapter ]
Suppose BPEL processes around 500 records of the first file and the
rest go into manual recovery.
The problem is that when all the files are processed again, BPEL
processes the first 500 records of the first file again. As a result,
500 duplicate entries are created in the database table.
Is there any way to make BPEL start the second run from record 501
of the first file?
Regards,
Subhra

Hi Subhra,
you could configure FileAdapter debatching to raise, say, only 100 records at a time as part of one XML payload/BPEL instance. That way, fewer records need to be retried on failure, and performance should be a little more stable as a result too.
As for the unique-constraint exceptions on retry, I would try using 'merge' instead of 'insert' in the Database Adapter. Merge does an existence check first, and if your payload sizes are 100+ the performance hit is minimal, since it issues a single existence query for all rows at once.
Thanks
Steve
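
For illustration only, here is a minimal sketch of the idempotent behaviour that 'merge' gives you at the SQL level. The table XML_RECORDS and its columns RECORD_ID and PAYLOAD are made-up names for this sketch, not the SQL the Database Adapter actually generates:

MERGE INTO xml_records tgt
USING (SELECT :record_id AS record_id, :payload AS payload FROM dual) src
ON (tgt.record_id = src.record_id)
WHEN MATCHED THEN
  UPDATE SET tgt.payload = src.payload
WHEN NOT MATCHED THEN
  INSERT (record_id, payload)
  VALUES (src.record_id, src.payload);

Because the existence check and the insert/update happen in one statement, replaying the first 500 records after a manual recovery overwrites the rows that already exist instead of raising unique-constraint errors.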

Similar Messages

  • Perform Manual Recovery  and Refresh Alarm Table in BPEL Console

    Hi,
    does anybody know what these options in the BPEL Console (BPEL Processes tab) are for? Thanks.
    - Refresh Alarm Table: what is this for, and why would you need it?
    - Perform Manual Recovery: can you give me an example of when I would use this option?


  • Perform Manual Recovery

    Can anyone give me some idea of when to use 'Perform Manual Recovery'?
    Thanks

    Maybe at one moment you run the BPEL process while the server on which the web services are deployed is down; after a few hours you bring the server back up, or it simply becomes reachable again, and then you can manually retry the invoke. (A plain automatic retry is useless here when, for example, a server is down.)
    Maybe the payload of a process is corrupt or wrong and the process gets faulted. You can open the instance and, from within the console, get and set the payload of the different steps of the process, so you can update it and resubmit the process.
    And I assume others will have other situations in which they use it.

  • Callback Manual Recovery record created and never cleared

    Hi Guy's,
    I just noticed (meaning it has been happening all along) that in the Manual Recovery function of the BPEL Console, under the 'callback' option, there seems to be an entry for most if not all BPEL processes that have ever been invoked. They never seem to clear, and if I perform a manual recovery, this is the output traced to domain.log:
    <2007-09-14 11:43:52,510> <DEBUG> <gmc.collaxa.cube.engine.delivery> <SOAPProtocolHandler::resolveSubscriber> trying to find callbacks of convId=MD5{6e3d428f177762ae547ed02a43caca5a}
    <2007-09-14 11:43:52,513> <DEBUG> <gmc.collaxa.cube.engine.delivery> <SOAPProtocolHandler::resolveSubscriber> No callbacks are found of convId=MD5{6e3d428f177762ae547ed02a43caca5a}
    <2007-09-14 11:43:52,513> <DEBUG> <gmc.collaxa.cube.engine.delivery> <DeliveryService::resolveSubscriber> No messages found for subscriber bpel://localhost/gmc/SFI_CreateExceptionTask~1.0/2500004-BpRcv1-BpSeq2.9-4 for conversation MD5{6e3d428f177762ae547ed02a43caca5a}
    <2007-09-14 11:55:44,760> <DEBUG> <gmc.collaxa.cube.engine.delivery> <DeliveryService::resolveCallback> Resolving subscribers for convId bpel://localhost/gmc/SFI_CreateChartOfAccount~1.0/2500016-BpInv2-BpSeq5.15-1, msgGuid=51b8f0982742e789:-74af1a88:1150396d9bc:-7a53
    <2007-09-14 11:55:44,775> <DEBUG> <gmc.collaxa.cube.engine.delivery> <SOAPProtocolHandler::resolveCallback> trying to find subscribers for convId=bpel://localhost/gmc/SFI_CreateChartOfAccount~1.0/2500016-BpInv2-BpSeq5.15-1, operation onResult, process bpel://localhost/gmc/SFI_CreateChartOfAccount~1.0/
    <2007-09-14 11:55:44,777> <DEBUG> <gmc.collaxa.cube.engine.delivery> <SOAPProtocolHandler::resolveCallback> No subscribers found for convId: bpel://localhost/gmc/SFI_CreateChartOfAccount~1.0/2500016-BpInv2-BpSeq5.15-1
    <2007-09-14 11:55:44,777> <DEBUG> <gmc.collaxa.cube.engine.delivery> <DeliveryService::resolveCallback> No subscribers found for message 51b8f0982742e789:-74af1a88:1150396d9bc:-7a53 for conversation bpel://localhost/gmc/SFI_CreateChartOfAccount~1.0/2500016-BpInv2-BpSeq5.15-1
    <2007-09-14 11:56:11,540> <DEBUG> <gmc.collaxa.cube.engine.delivery> <DeliveryService::resolveCallback> Resolving subscribers for convId bpel://localhost/gmc/SFI_CreateChartOfAccount~1.0/2500016-BpInv2-BpSeq5.15-1, msgGuid=51b8f0982742e789:-74af1a88:1150396d9bc:-7a53
    <2007-09-14 11:56:11,547> <DEBUG> <gmc.collaxa.cube.engine.delivery> <SOAPProtocolHandler::resolveCallback> trying to find subscribers for convId=bpel://localhost/gmc/SFI_CreateChartOfAccount~1.0/2500016-BpInv2-BpSeq5.15-1, operation onResult, process bpel://localhost/gmc/SFI_CreateChartOfAccount~1.0/
    <2007-09-14 11:56:11,548> <DEBUG> <gmc.collaxa.cube.engine.delivery> <SOAPProtocolHandler::resolveCallback> No subscribers found for convId: bpel://localhost/gmc/SFI_CreateChartOfAccount~1.0/2500016-BpInv2-BpSeq5.15-1
    <2007-09-14 11:56:11,548> <DEBUG> <gmc.collaxa.cube.engine.delivery> <DeliveryService::resolveCallback> No subscribers found for message 51b8f0982742e789:-74af1a88:1150396d9bc:-7a53 for conversation bpel://localhost/gmc/SFI_CreateChartOfAccount~1.0/2500016-BpInv2-BpSeq5.15-1
    Any ideas what I might have configured incorrectly?
    Thanks.


  • Duplicate record in Masterdata: Failure in Chain, Successful when manually

    Hi all,
    A daily load updates 0BPARTNER_ATTR from CRM.
    Recently this load started to fail, with the following error message:
    xx duplicate records found. xxxx recordings used in table /BI0/XBPARTNER
    The data loads through the PSA, but there are no red records in the PSA.
    After copying the PSA data to Excel for further analysis, I concluded there are actually no duplicates in the PSA.
    Also, the option 'Ignore double data records' in the InfoPackage is checked.
    When I manually update from PSA the load is successful.
    This started to happen about two weeks ago, and I didn't find an event which could have caused it.
    Now it happens every two or three days, and each time the manual update is successful.
    Any suggestions, anyone?
    Thanks, Jan.

    Hi Jan,
    Possibly you have two requests in the PSA while trying to update the data target.
    Delete all the requests in the PSA, schedule the load once more to bring the data into the PSA, and then update it into the data target.
    Thank you,
    Arvind

  • Data loader : Import -- creating duplicate records ?

    Hi all,
    has anyone else encountered the behaviour with Oracle Data Loader where duplicate records are created (even when I set the option duplicatecheckoption=externalid)? When I check the "import request queue" view, the request parameters of the job look fine:
    Duplicate Checking Method == External Unique ID
    Action Taken if Duplicate Found == Overwrite Existing Records
    but Data Loader has created new records whose "External Unique ID" already exists.
    Very strangely, when I create the same import manually (using the Import Wizard), it works correctly: the duplicate-checking method is applied and the existing record is updated.
    I know Data Loader has two methods, one for update and one for import; however, I would not expect the import to create duplicates if the record already exists, rather than doing nothing.
    Is anyone else experiencing the same? I hope this is not expected behaviour! By the way, the "Update" method works fine.
    thanks in advance, Juergen

    Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load first, before a full load, which is a best practice for data import that we recommend in our documentation and courses.
    Sorry also to inform you that this is expected behavior: Data Loader does not check for duplicates when inserting (importing); it only checks for duplicates when updating (overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and the Data Import Options Overview document.
    You should review all documentation on Oracle Data Loader On Demand before using it.
    These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
    Pete

  • Setting change to correct duplicate records ..

    Hi Experts,
    > I tried loading a flat file to an ODS in the quality system and got some duplicate rows, which I had to correct manually in the PSA.
    > When I tried loading the same flat file to the same ODS in production, I did not get the 'duplicate records' error message.
    Can you please tell me what setting has to be changed in quality to allow duplicate records, or to correct them?
    Help will definitely be rewarded with points.
    Thanks ,
    Santosh ....

    Hi Santhosh,
    Remove check for unique data records.
    Unique Data Records
    With the Unique Data Records indicator, you determine whether only unique data records are to be updated to the ODS object. This means that you cannot load a data record into the ODS object the key combination for which already exists in the system – otherwise a termination occurs. Only use this setting when you are sure that only unique data records are to be loaded into the ODS object (for example, single documents). A typical application of this is in the loading of mass data. It improves the load performance.
    You can also deselect this indicator again (even if data has already been loaded into the ODS object). This can be necessary if you want to re-post deleted data records using a repair request (see: Tab Page: Updating). In this case, you need to deselect the Unique Data Records indicator before posting the repair request, following which you can then reset the Unique Data Records indicator once more. The regeneration of metadata of the Export DataSource, which takes place when the ODS object is reactivated, has no effect on the existing data mart delta method.
    More info: ODS Object Settings, http://help.sap.com/saphelp_nw04/helpdata/en/a6/1205406640c442e10000000a1550b0/content.htm
    Hope it Helps
    Srini

  • Duplicate records problem

    Hi everyone,
    I'm having a little difficulty resolving a problem with a repeating field causing duplication of data in a report I'm working on, and was hoping someone on here could suggest something to help!
    My report is designed to detail library issues during a particular period, categorised by the language of the item issued. My problem is that on the SQL database that our library management system uses, it is possible for an item to have more than one language listed against it (some books will be in more than one language). When I list the loan records excluding the language data field, I get a list of distinct loan records. Bringing the language data into the report causes the loan record to repeat for each language associated with it, so if a book is in both English and French, the loan record will appear like this:
    LOAN RECORD NO.     LANGUAGE CODE
      123456                             ENG
      123456                             FRE
    So, although the loan only occurred once, I have two instances of it in my report.
    I am only interested in the language that appears first and I can exclude duplicated records from the report page. I can also count only the distinct records to get an accurate overall total. My problem is that when I group the loan records by language code (I really need to do this as there are millions of loan records held in the database) the distinct count stops being a solution, as when placed at this group level it only excludes duplicates in the respective group level it's placed in. So my report would display something like this:
    ENG     1
    FRE      1
    A distinct count of the whole report would give the correct total of 1, but a cumulative total of the figures calculated at the language code group level would total 2, and be incorrect. I've encountered similar results when using Running Totals evaluating on a formula that excludes repeated loan record numbers from the count, but again when I group on the language code this goes out of the window.
    I need to find a way of grouping the loan records by language with a total count of loan records alongside each grouping that accurately reflects how many loans of that language took place.
    Is this possible using a calculation formula when there are repeating fields, or do I need to find a way of merging the repeating language fields into one field so that the report would appear like:
    LOAN RECORD     LANGUAGE CODE
      123456                      ENG, FRE
    Any suggestions would be greatly appreciated, as aside from this repeating language data there are quite a few other repeating database fields on the system that it would be nice to report on!
    Thanks!

    If you create a group on loan and then a group on language, and place the values in the group (loan ID in the loan group header), you should only see each loan ID once. Place the language in the language group and you should only see that once as well; a group header returns the first value of a unique ID.
    Then, to calculate while avoiding the duplicates, use manual running totals.
    Create a set of formulas for each summary you want, and make sure each set uses a different variable name.
    MANUAL RUNNING TOTALS
    RESET
    The reset formula is placed in a group header or report header to reset the summary to zero for each unique record it groups by:
    whileprintingrecords;
    numbervar X := 0;
    CALCULATION
    The calculation formula is placed adjacent to the field or formula being calculated. (If there are duplicate values, create a group on the field being calculated on; if there are no duplicate records, the detail section is used.)
    whileprintingrecords;
    numbervar X := X + 1; // or add your field/formula instead of 1
    DISPLAY
    The display formula shows the sum of what is being calculated. It is placed in a group, page, or report footer (generally the group footer of the group whose header holds the reset):
    whileprintingrecords;
    numbervar X;
    X

  • How to find out duplicate record contained in a flat file

    Hi Experts,
    For my project I have written a program for flat file upload.
    Requirement 1
    In the flat file there may be some duplicate records, like:
    Field1   Field2
    11        test1
    11        test2
    12        test3
    13        test4
    Field1 is primary key.
    Can you please let me know how I can find out the duplicate records?
    Requirement 2
    The flat file contains the header row as shown above
    Field1   Field2
    How can our program skip this record and start reading/inserting records from row 2, i.e.
    11        test1
    onwards.
    Thanks
    S
    FORM upload1.
    DATA : wf_title TYPE string,
    lt_filetab TYPE filetable,
    l_separator TYPE char01,
    l_action TYPE i,
    l_count TYPE i,
    ls_filetab TYPE file_table,
    wf_delemt TYPE rollname,
    wa_fieldcat TYPE lvc_s_fcat,
    tb_fieldcat TYPE lvc_t_fcat,
    rows_read TYPE i,
    p_error TYPE char01,
    l_file TYPE string.
    DATA: wf_object(30) TYPE c,
    wf_tablnm TYPE rsdchkview.
    wf_object = 'myprogram'.
    DATA i TYPE i.
    DATA:
    lr_mdmt TYPE REF TO cl_rsdmd_mdmt,
    lr_mdmtr TYPE REF TO cl_rsdmd_mdmtr,
    lt_idocstate TYPE rsarr_t_idocstate,
    lv_subrc TYPE sysubrc.
    TYPES : BEGIN OF test_struc,
    /bic/myprogram TYPE /bic/oimyprogram,
    txtmd TYPE rstxtmd,
    END OF test_struc.
    DATA : tb_assum TYPE TABLE OF /bic/pmyprogram.
    DATA: wa_ztext TYPE /bic/tmyprogram,
    myprogram_temp TYPE ziott_assum,
    wa_myprogram TYPE /bic/pmyprogram.
    DATA : test_upload TYPE STANDARD TABLE OF test_struc,
    wa2 TYPE test_struc.
    DATA : wa_test_upload TYPE test_struc,
    ztable_data TYPE TABLE OF /bic/pmyprogram,
    ztable_text TYPE TABLE OF /bic/tmyprogram,
    wa_upld_text TYPE /bic/tmyprogram,
    wa_upld_data TYPE /bic/pmyprogram,
    t_assum TYPE ziott_assum.
    DATA : wa1 LIKE test_upload.
    wf_title = text-026.
    CALL METHOD cl_gui_frontend_services=>file_open_dialog
    EXPORTING
    window_title = wf_title
    default_extension = 'txt'
    file_filter = 'Tab delimited Text Files (*.txt)'
    CHANGING
    file_table = lt_filetab
    rc = l_count
    user_action = l_action
    EXCEPTIONS
    file_open_dialog_failed = 1
    cntl_error = 2
    OTHERS = 3. "#EC NOTEXT
    IF sy-subrc <> 0.
    EXIT.
    ENDIF.
    LOOP AT lt_filetab INTO ls_filetab.
    l_file = ls_filetab.
    ENDLOOP.
    CHECK l_action = 0.
    IF l_file IS INITIAL.
    EXIT.
    ENDIF.
    l_separator = 'X'.
    wa_fieldcat-fieldname = 'test'.
    wa_fieldcat-dd_roll = wf_delemt.
    APPEND wa_fieldcat TO tb_fieldcat.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    CLEAR wa_test_upload.
    * Upload file from front-end (PC)
    * File format is tab-delimited ASCII
    CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
    filename = l_file
    has_field_separator = l_separator
    TABLES
    *   data_tab = i_mara
    data_tab = test_upload
    EXCEPTIONS
    file_open_error = 1
    file_read_error = 2
    no_batch = 3
    gui_refuse_filetransfer = 4
    invalid_type = 5
    no_authority = 6
    unknown_error = 7
    bad_data_format = 8
    header_not_allowed = 9
    separator_not_allowed = 10
    header_too_long = 11
    unknown_dp_error = 12
    access_denied = 13
    dp_out_of_memory = 14
    disk_full = 15
    dp_timeout = 16
    OTHERS = 17.
    IF sy-subrc <> 0.
    EXIT.
    ELSE.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    IF test_upload IS NOT INITIAL.
    DESCRIBE TABLE test_upload LINES rows_read.
    CLEAR : wa_test_upload,wa_upld_data.
    LOOP AT test_upload INTO wa_test_upload.
    CLEAR : p_error.
    rows_read = sy-tabix.
    IF wa_test_upload-/bic/myprogram IS INITIAL.
    p_error = 'X'.
    MESSAGE s153 WITH wa_test_upload-/bic/myprogram sy-tabix.
    CONTINUE.
    ELSE.
    TRANSLATE wa_test_upload-/bic/myprogram TO UPPER CASE.
    wa_upld_text-txtmd = wa_test_upload-txtmd.
    wa_upld_text-txtsh = wa_test_upload-txtmd.
    wa_upld_text-langu = sy-langu.
    wa_upld_data-chrt_accts = 'xyz1'.
    wa_upld_data-co_area = '12'.
    wa_upld_data-/bic/zxyzbcsg = 'Iy'.
    wa_upld_data-objvers = 'A'.
    wa_upld_data-changed = 'I'.
    wa_upld_data-/bic/zass_mdl = 'rrr'.
    wa_upld_data-/bic/zass_typ = 'I'.
    wa_upld_data-/bic/zdriver = 'yyy'.
    wa_upld_text-langu = sy-langu.
    MOVE-CORRESPONDING wa_test_upload TO wa_upld_data.
    MOVE-CORRESPONDING wa_test_upload TO wa_upld_text.
    APPEND wa_upld_data TO ztable_data.
    APPEND wa_upld_text TO ztable_text.
    ENDIF.
    ENDLOOP.
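    * Note: DELETE ADJACENT DUPLICATES only removes neighbouring duplicates; sort ztable_data and ztable_text by their key fields first if duplicates may not be adjacent.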
    DELETE ADJACENT DUPLICATES FROM ztable_data.
    DELETE ADJACENT DUPLICATES FROM ztable_text.
    IF ztable_data IS NOT INITIAL.
    CALL METHOD cl_rsdmd_mdmt=>factory
    EXPORTING
    i_chabasnm = 'myprogram'
    IMPORTING
    e_r_mdmt = lr_mdmt
    EXCEPTIONS
    invalid_iobjnm = 1
    OTHERS = 2.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    **Lock the Infoobject to update
    CALL FUNCTION 'RSDG_IOBJ_ENQUEUE'
    EXPORTING
    i_objnm = wf_object
    i_scope = '1'
    i_msgty = rs_c_error
    EXCEPTIONS
    foreign_lock = 1
    sys_failure = 2.
    IF sy-subrc = 1.
    MESSAGE i107(zddd_rr) WITH wf_object sy-msgv2.
    EXIT.
    ELSEIF sy-subrc = 2.
    MESSAGE i108(zddd_rr) WITH wf_object.
    EXIT.
    ENDIF.
    *****Update Master Table
    IF ztable_data IS NOT INITIAL.
    CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
    EXPORTING
    i_iobjnm = 'myprogram'
    i_tabclass = 'M'
    I_T_ATTR = lt_attr
    TABLES
    i_t_table = ztable_data
    EXCEPTIONS
    attribute_name_error = 1
    iobj_not_found = 2
    generate_program_error = 3
    OTHERS = 4.
    IF sy-subrc <> 0.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'E'
    txtnr = '054'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    MESSAGE e054(zddd_rr) WITH 'myprogram'.
    ELSE.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'S'
    txtnr = '053'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    ENDIF.
    *endif.
    *****update Text Table
    IF ztable_text IS NOT INITIAL.
    CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
    EXPORTING
    i_iobjnm = 'myprogram'
    i_tabclass = 'T'
    TABLES
    i_t_table = ztable_text
    EXCEPTIONS
    attribute_name_error = 1
    iobj_not_found = 2
    generate_program_error = 3
    OTHERS = 4.
    IF sy-subrc <> 0.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'E'
    txtnr = '055'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    ENDIF.
    ENDIF.
    ELSE.
    MESSAGE s178(zddd_rr).
    ENDIF.
    ENDIF.
    COMMIT WORK.
    CALL FUNCTION 'RSD_CHKTAB_GET_FOR_CHA_BAS'
    EXPORTING
    i_chabasnm = 'myprogram'
    IMPORTING
    e_chktab = wf_tablnm
    EXCEPTIONS
    name_error = 1.
    IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    ****Release locks on Infoobject
    CALL FUNCTION 'RSDG_IOBJ_DEQUEUE'
    EXPORTING
    i_objnm = 'myprogram'
    i_scope = '1'.
    ENDIF.
    ENDIF.
    PERFORM data_selection .
    PERFORM update_alv_grid_display.
    CALL FUNCTION 'MESSAGES_SHOW'.
    ENDFORM.

    Can you please let me know how I can find out the duplicate records?
    You need to split each record from the flat-file structure into the fields of your internal table and then use DELETE ADJACENT DUPLICATES ... COMPARING the key field (sort the table first), e.g.:
    SPLIT flat_str AT tab_space INTO wa_f1 wa_f2.

  • 36 duplicate record  found. -- error while loading master data

    Hello BW Experts,
    Error while loading master data
    ( green light ) Update PSA ( 50000 records posted ): No errors
    ( green light ) Transfer rules ( 50000 -> 50000 records ): No errors
    ( green light ) Update rules ( 50000 -> 50000 records ): No errors
    ( green light ) Update ( 0 new / 50000 changed ): No errors
    ( red light ) Processing end: errors occurred
         Processing 2 finished
         36 duplicate record found. 15718 recordings used in table /BIC/PZMATNUM
         36 duplicate record found. 15718 recordings used in table /BIC/XZMATNUM
    This error repeats with all the data packets.
    What could be the reason for the error, and how can it be corrected?
    Any suggestions appreciated.
    Thanks,
    BWer

    BWer,
    We have exactly the same issue when loading the infoobject 0PM_ORDER, the datasource is 0PM_ORDER_ATTR.
    The workaround we have been using is 'Manual push' from the Details tab of the monitor. Weirdly, we don't have this issue in our test and dev systems, and even in production it doesn't occur some days. We have all the necessary settings enabled in the InfoPackage.
    Did you figure out a solution for your issue? If so, please let me know, and I will do likewise.
    -Venkat

  • Duplicate records found while loading master data(very urgent)

    Hi all,
    One InfoPackage in the process chain failed while loading the master data (full update). It shows the following error: duplicate record found; 1 record used in /BI0/PTCTQUERY, and the same record occurred in the /BI0/PTCTQUERY tables.
    Can anyone give me the solution? It's very urgent.
    Thanks & Regards,
    Manjula

    Hi
    You can see the check box on the Processing tab page: tick the Ignore Duplicate Data Records indicator. When multiple data records with the same key are transferred, the last data record in the request is updated to BI; any other data records in the request with the same key are ignored.
    Help says that:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope that clears your doubt; otherwise let me know.
    Regards
    Kiran

  • Data Loader inserting duplicate records

    Hi,
    There is an import that we need to run every day in order to load data from another system into CRM On Demand. I have set up a Data Loader script which is scheduled to run every morning and performs an insert operation.
    Every morning a file with new insert data is made available in the same location (generated by someone else) under the same name, and the Data Loader script must insert all the records in it.
    One morning there was a problem in the other job and a new file was not produced. When the Data Loader script ran, it found the old file and re-inserted the records (there were 3 in the file). I had specified the -duplicatecheckoption parameter as the external ID, since the records come from another system, but I came to know that the option works for update operations only.
    How can a situation like this be handled in future? The external ID should be checked for duplicates before the insert operation is performed. If we can't check on the Data Loader side, is it possible to somehow mark the field as 'unique' in the UI so that there is an error if a duplicate record is inserted? Please suggest.
    Regards,

    Hi
    You can use something like this:
    CURSOR crs IS SELECT DISTINCT deptno, dname, loc FROM dept;
    Now you can insert all the records present in this cursor.
    Assumption: you do not have duplicate entries in the dept table initially.
    Cheers
    Sudhir
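
    To make the idea concrete, here is a hedged SQL sketch of the same approach done in a single statement; dept is the classic demo table and dept_target is a made-up name for the table being loaded. Only rows whose key is not yet present are inserted, so re-running the load cannot create duplicates:

    INSERT INTO dept_target (deptno, dname, loc)
    SELECT DISTINCT d.deptno, d.dname, d.loc
    FROM dept d
    WHERE NOT EXISTS (SELECT 1 FROM dept_target t WHERE t.deptno = d.deptno);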

  • Duplicate records in exported data

    I'm trying to export the inventory data with a wildcard (%) filter on the
    Workstation Name.
    If I run the same filter in a query from ConsoleOne, I don't see any
    duplicate records.
    If I run the data export, the exported data for some workstations will have
    a duplicate DN. Just the DN is duplicated; all the other fields are either
    empty or have some default value.
    I have also run the manual duplicate removal process and have tried
    deleting the records altogether using the InventoryRemoval service.
    Any other ideas?

    Dlee,
    It appears that in the past few days you have not received a response to your
    posting. That concerns us, and has triggered this automated reply.
    Has your problem been resolved? If not, you might try one of the following options:
    - Do a search of our knowledgebase at http://support.novell.com/search/kb_index.jsp
    - Check all of the other support tools and options available at
    http://support.novell.com.
    - You could also try posting your message again. Make sure it is posted in the
    correct newsgroup. (http://support.novell.com/forums)
    Be sure to read the forum FAQ about what to expect in the way of responses:
    http://support.novell.com/forums/faq_general.html
    If this is a reply to a duplicate posting, please ignore and accept our apologies
    and rest assured we will issue a stern reprimand to our posting bot.
    Good luck!
    Your Novell Product Support Forums Team
    http://support.novell.com/forums/

  • Problem with duplicate records..

    Hi, I am trying to insert records into the database on a button click from a Swing frame, and I have succeeded in that. But when I try to insert a duplicate record, it throws com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry '1' for key 1
    My code is
    import java.awt.event.*;
    import java.sql.*;
    public class BarcodeReader extends javax.swing.JFrame implements ActionListener {
        static Connection con;
        static String url = "jdbc:mysql://localhost:3306/";
        static String db = "mynewdatabase";
        static String driver = "com.mysql.jdbc.Driver";
        static String user = "uname";
        static String pass = "pwd";
        static Statement stmt;
        /** Creates new form BarcodeReader */
        public BarcodeReader() {
            initComponents();
            nb.addActionListener(this);
        }
        public static Connection getConnection() {
            try {
                Class.forName(driver);
                con = DriverManager.getConnection(url + db, user, pass);
                stmt = con.createStatement();
                System.out.println("jdbc driver for mysql : " + driver);
                System.out.println("Connection url : " + url + db);
            } catch (Exception ex) {
                ex.printStackTrace();
            }
            return con;
        }
        private void ActionPerformed(java.awt.event.ActionEvent evt) {
            Connection con = getConnection();
            String t = newtxt.getText();
            int t1 = Integer.parseInt(t);
            try {
                PreparedStatement p = con.prepareStatement("INSERT INTO machine(barcode) values(?)");
                ResultSet rs = stmt.executeQuery("SELECT * FROM MACHINE");
                while (rs.next()) {
                    String s = rs.getString("barcode");
                    if (s.equals(t))
                        System.out.println("this id exists....");
                    rs.last();
                    p.setInt(1, t1);
                    p.executeUpdate();
                    new NewJFrame().setVisible(true);
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
        public static void main(String args[]) {
            java.awt.EventQueue.invokeLater(new Runnable() {
                public void run() {
                    new BarcodeReader().setVisible(true);
                }
            });
        }
    }
    When I insert a new record it is OK, but when I insert the same record again it shows the following exception:
    jdbc driver for mysql : com.mysql.jdbc.Driver
    Connection url : jdbc:mysql://localhost:3306/mynewdatabase
    this id exists....
    com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry '1' for key 1
            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
            at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
            at com.mysql.jdbc.Util.handleNewInstance(Util.java:406)
            at com.mysql.jdbc.Util.getInstance(Util.java:381)
            at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1015)
            at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
            at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3536)
            at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3468)
            at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1957)
            at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2107)
            at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2648)
            at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2086)
            at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2371)
            at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2289)
            at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2274)
            at project.BarcodeReader.ActionPerformed(BarcodeReader.java:276)
            at project.BarcodeReader.access$000(BarcodeReader.java:17)
            at project.BarcodeReader$1.actionPerformed(BarcodeReader.java:138)
            at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1995)
            at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2318)
            at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:387)
            at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:242)
            at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:236)
            at java.awt.Component.processMouseEvent(Component.java:6263)
            at javax.swing.JComponent.processMouseEvent(JComponent.java:3267)
            at java.awt.Component.processEvent(Component.java:6028)
            at java.awt.Container.processEvent(Container.java:2041)
            at java.awt.Component.dispatchEventImpl(Component.java:4630)
            at java.awt.Container.dispatchEventImpl(Container.java:2099)
            at java.awt.Component.dispatchEvent(Component.java:4460)
            at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4574)
            at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4238)
            at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4168)
            at java.awt.Container.dispatchEventImpl(Container.java:2085)
            at java.awt.Window.dispatchEventImpl(Window.java:2475)
            at java.awt.Component.dispatchEvent(Component.java:4460)
            at java.awt.EventQueue.dispatchEvent(EventQueue.java:599)
            at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:269)
            at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:184)
            at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:174)
            at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:169)
            at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:161)
            at java.awt.EventDispatchThread.run(EventDispatchThread.java:122)
    Thanks.

    shelton141 wrote:
    Hi, I am trying to insert records into the database on a button click from a Swing frame. I have succeeded in that, but when I try to insert a duplicate record it throws com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry '1' for key 1
    Of course it does. That is what it is supposed to do.
    Try inserting the same record twice, manually, and see what happens.
    There is, however, if I remember right, an "IGNORE" clause/option that you can use on INSERT statements in MySQL. Check the manual, if that is what you really want.
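
    For what it's worth, a minimal sketch of that MySQL option, assuming the machine table from the code above has a primary or unique key on barcode. INSERT IGNORE silently skips a row that would violate the key, while INSERT ... ON DUPLICATE KEY UPDATE overwrites the existing row instead of failing:

    INSERT IGNORE INTO machine (barcode) VALUES (1);

    INSERT INTO machine (barcode) VALUES (1)
    ON DUPLICATE KEY UPDATE barcode = VALUES(barcode);

    Either variant removes the need for the SELECT-then-INSERT check in the posted code, and neither raises the integrity-constraint exception on a duplicate.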

  • Import Transaction Data - Duplicate records

    Hi,
    I need to upload a file of transaction data into BPC using a Data Manager package. I've created the transformation and conversion files, which validate successfully on a small data set. When I try to upload the real-life data file, it fails due to duplicate records. This happens because multiple external IDs map to one internal ID; therefore, whilst there are no duplicates in the actual file produced by the client, the data produced after conversion does contain duplicates and will not upload.
    Apart from asking the client to perform the aggregation before sending me the file, is there any way to get BPC to accept the duplicates and simply sum them up?
    Regards
    Sue

    Hi,
    Try adding the delivered package /CPMP/APPEND and running it. This should solve your problem.
    Thanks,
    Sreeni
