Checking for duplicate records

I am wondering how to write code that checks whether a new item already exists in the table before it is inserted... I am just new to Forms. Thanks.

According to online help, CHECK_RECORD_UNIQUENESS is only valid from the On-Check-Unique trigger and only performs what Forms does by default -- checks if the record has already been inserted into the table. Just set the Enforce Primary Key property to Yes for the block, and this happens automatically.
If you want to check right after the user enters the primary key for a new record, you can do it in the When-Validate-Item trigger:
declare
   record_count number := 0;
begin
   -- Look for an existing row with the key the user just entered.
   select count(*) into record_count from <tablename>
   where <primary_key_column> = :<block.primary_key>;
   if record_count > 0 then
      -- Duplicate found: tell the user and fail validation.
      message('This record is already in the database.');
      raise form_trigger_failure;
   end if;
end;
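If the key column is not yet backed by a unique index, counting every matching row does more work than needed. An untested variant of the same trigger code (same placeholder names as above) that stops at the first match:

declare
   dummy varchar2(1);
begin
   -- Fetch at most one row; NO_DATA_FOUND means the key is still free.
   select 'x' into dummy from <tablename>
   where <primary_key_column> = :<block.primary_key>
   and rownum = 1;
   message('This record is already in the database.');
   raise form_trigger_failure;
exception
   when no_data_found then
      null; -- no duplicate, validation passes
end;

Either way, the check leaves a small window for a concurrent insert, so keep the Enforce Primary Key property (or a unique constraint on the table) in place regardless.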

Similar Messages

  • Check for duplicate record in SQL database before doing INSERT

    Hey guys,
           This is part of a PowerShell app doing a SQL insert, but my question really relates to the SQL insert itself. I need to check the database PRIOR to doing the insert for duplicate records; if a duplicate exists, that record needs
    to be overwritten. I'm not sure how to accomplish this task. My back end is a SQL 2000 Server. I'm piping the data into my insert statement from a PowerShell FileSystemWatcher app. In my scenario, if a file dropped into the directory starts with I it gets
    written to a SQL database; otherwise it gets written to an Access table. I know, silly, but that's the environment I'm in. haha.
    Any help is appreciated.
    Thanks in Advance
    Rich T.
    #### DEFINE WATCH FOLDERS AND DEFAULT FILE EXTENSION TO WATCH FOR ####
    $cofa_folder = '\\cpsfs001\Data_pvs\TestCofA'
    $bulk_folder = '\\cpsfs001\PVS\Subsidiary\Nolwood\McWood\POD'
    $filter = '*.tif'
    $cofa = New-Object IO.FileSystemWatcher $cofa_folder, $filter -Property @{ IncludeSubdirectories = $false; EnableRaisingEvents = $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite' }
    $bulk = New-Object IO.FileSystemWatcher $bulk_folder, $filter -Property @{ IncludeSubdirectories = $false; EnableRaisingEvents = $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite' }
    #### CERTIFICATE OF ANALYSIS AND PACKAGE SHIPPER PROCESSING ####
    Register-ObjectEvent $cofa Created -SourceIdentifier COFA/PACKAGE -Action {
        $name = $Event.SourceEventArgs.Name
        $changeType = $Event.SourceEventArgs.ChangeType
        $timeStamp = $Event.TimeGenerated
        #### CERTIFICATE OF ANALYSIS PROCESS BEGINS ####
        $test = $name.StartsWith("I")
        if ($test -eq $true) {
            # "I" files: strip the leading character, then split at "L" into item and lot keys.
            $pos = $name.IndexOf(".")
            $left = $name.Substring(0, $pos)
            $pos = $left.IndexOf("L")
            $tempItem = $left.Substring(0, $pos)
            $lot = $left.Substring($pos + 1)
            $item = $tempItem.Substring(1)
            Write-Host "in_item_key $item in_lot_key $lot imgfilename $name in_cofa_crtdt $timeStamp" -fore green
            Out-File -FilePath c:\OutputLogs\CofA.csv -Append -InputObject "in_item_key $item in_lot_key $lot imgfilename $name in_cofa_crtdt $timeStamp"
            Start-Sleep -s 5
            $conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=PVSNTDB33; Initial Catalog=adagecopy_daily; Integrated Security=TRUE")
            $conn.Open()
            $insert_stmt = "INSERT INTO in_cofa_pvs (in_item_key, in_lot_key, imgfileName, in_cofa_crtdt) VALUES ('$item','$lot','$name','$timeStamp')"
            $cmd = $conn.CreateCommand()
            $cmd.CommandText = $insert_stmt
            $cmd.ExecuteNonQuery()
            $conn.Close()
        } elseif ($test -eq $false) {
            #### PACKAGE SHIPPER PROCESS BEGINS ####
            # Other files: strip the leading character, then split at "O" into shipper and order keys.
            $pos = $name.IndexOf(".")
            $left = $name.Substring(0, $pos)
            $pos = $left.IndexOf("O")
            $tempItem = $left.Substring(0, $pos)
            $order = $left.Substring($pos + 1)
            $shipid = $tempItem.Substring(1)
            Write-Host "so_hdr_key $order so_ship_key $shipid imgfilename $name in_cofa_crtdt $timeStamp" -fore green
            Out-File -FilePath c:\OutputLogs\PackageShipper.csv -Append -InputObject "so_hdr_key $order so_ship_key $shipid imgfilename $name in_cofa_crtdt $timeStamp"
            # (the Access-table write for this branch was not included in the original post)
        }
    }
    Rich Thompson

    Hi
    Since SQL Server 2000 has been out of support, I recommend you upgrade SQL Server 2000 to a higher version, such as SQL Server 2005 or SQL Server 2008.
    According to your description, you can try the following methods to check for duplicate records in SQL Server.
    1. You can use RAISERROR to report the duplicate record: if the record exists, raise the error; otherwise insert it. A code block is given below:
    IF EXISTS (SELECT 1 FROM TableName AS t
    WHERE t.Column1 = @Column1
    AND t.Column2 = @Column2)
    BEGIN
    RAISERROR('Duplicate records', 18, 1)
    END
    ELSE
    BEGIN
    INSERT INTO TableName (Column1, Column2, Column3)
    SELECT @Column1, @Column2, @Column3
    END
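    Note that the original question asks for the duplicate record to be overwritten rather than rejected. A sketch of that variant with the same placeholder names (assuming Column1 identifies the record; adjust to your key columns):

    IF EXISTS (SELECT 1 FROM TableName AS t
    WHERE t.Column1 = @Column1)
    BEGIN
    -- Overwrite the existing row instead of raising an error
    UPDATE TableName
    SET Column2 = @Column2, Column3 = @Column3
    WHERE Column1 = @Column1
    END
    ELSE
    BEGIN
    INSERT INTO TableName (Column1, Column2, Column3)
    VALUES (@Column1, @Column2, @Column3)
    END

    On SQL Server 2008 and later, the same UPDATE-else-INSERT pair can be collapsed into a single MERGE statement.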
    2. Also, you can create a UNIQUE INDEX or UNIQUE CONSTRAINT on the column of a table; when you try to INSERT a value that conflicts with the index/constraint, an exception will be thrown.
    Add the unique index:
    CREATE UNIQUE INDEX Unique_Index_name ON TableName(ColumnName)
    Add the unique constraint:
    ALTER TABLE TableName
    ADD CONSTRAINT Unique_Constraint_Name
    UNIQUE (ColumnName)
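    With the index or constraint in place, the INSERT itself becomes the duplicate check. On SQL Server 2005 or later (one more reason to upgrade), the violation can be trapped with TRY...CATCH; a minimal sketch, where @ColumnValue is a hypothetical parameter:

    BEGIN TRY
    INSERT INTO TableName (ColumnName) VALUES (@ColumnValue)
    END TRY
    BEGIN CATCH
    -- 2627 = unique constraint violation, 2601 = unique index violation
    IF ERROR_NUMBER() IN (2627, 2601)
    RAISERROR('Duplicate records', 18, 1)
    END CATCH

    On SQL Server 2000 itself there is no TRY...CATCH; you would check @@ERROR immediately after the INSERT instead.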
    Thanks
    Lydia Zhang

  • Duplicate to check the duplicate record

    We are loading a master data hierarchy (alternate). We have already loaded the parent hierarchy, and now we are loading the subunit (child or alternate) hierarchy, which is just a small change from the parent one.
    Since we already loaded the parent hierarchy and now plan to load the alternate, will it create any duplicates?
    Thanks in advance
    Taj

    I am loading different hierarchies.
    The 1st is the parent hierarchy, which has everything. The 2nd hierarchy has only specific brands, since the user wants to pull only those specific brands into the report variable
    (so this is a subset of the 1st one).
    If we load both hierarchies, will it overwrite anything or create any duplicates?
    Thanks in advance
    Taj

  • BI 7.0 - Duplicate Record Error while loading master data

    I am working on BI 7.0 and I am trying to load master data into an infoobject.
    I created an infopackage and loaded the data into PSA.
    I created a transformation and a DTP, and after I execute the DTP I get an error about duplicate records.
    I have read all the previous threads about the duplicate record error while loading master data, and most of them suggested checking the 'Ignore duplicate records' option in the infopackage. But in 7.0 I can only load to PSA with an infopackage, and it doesn't have any option for me to ignore duplicate records.
    My data gets loaded to PSA fine; I get this error while loading to the infoobject using the DTP.
    I would appreciate your help to resolve this issue.
    Regards,
    Ram.

    Hi,
    Refer:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/45/2a8430131e03c3e10000000a1553f6/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    With rgds,
    Anil Kumar Sharma .P

  • How to avoid duplicate record in a file to file

    Hi Guys,
              Could you please provide a solution
              to avoid duplicate entries in a flat file based on a key field.
              I am asking in terms of standard functions,
             either at the message mapping level or by configuring the file adapter.
    warm regards
    mahesh.

    hi mahesh,
    Write a module processor to check for duplicate records in the file adapter,
    or
    with a JAVA/ABAP mapping you can eliminate the duplicate records.
    Also check these links:
    Re: How to Handle this "Duplicate Records"
    Duplicate records
    Ignoring Duplicate Records--urgent
    Re: Duplicate records frequently occurred
    Re: Reg ODS JUNK DATA
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    regards
    srinivas

  • Duplicate records in DTP

    Guys,
    My DTP failed 4 days back, but the full load from the source system to PSA is executing successfully.
    To correct the DTP load, I deleted the last DTP request and also all the PSA requests except the last one, and when I re-executed the DTP I got duplicate records. Also, in the failed DTP header, I am still seeing all the deleted PSA requests.
    How do I resolve this issue? I cannot check 'Handle duplicate records' since it is time-dependent data.
    Thanks,
    Kumar

    Hi Kumar,
    1. Deleting from PSA and updating is not a permanent solution.
    Check which records are creating the duplicates in PSA first.
    The issue with the duplicates is not from PSA; previously there were already some records in the object itself,
    so deleting from PSA will not solve the issue.
    Check the data in the source tables themselves to see why the wrong data is coming into PSA like that.
    Then you can correct the records in PSA and update the data into the target using the DTP.
    You can create an error DTP for that, so that it will be easy to trace the duplicates.
    2. You have the option "Handle duplicate records" in the DTP.
    Check the box and try to load the data again.
    If this is time-dependent master data, then also check "valid to" as a key along with the other objects in the semantic group option of the DTP.
    Check this and then try to load the data.
    http://help.sap.com/saphelp_nw70/helpdata/EN/42/fbd598481e1a61e10000000a422035/content.htm
    Regards
    Sudheer

  • What is the use of ignore duplicate records?

    Hi gurus,
    What is the use of 'ignore duplicate records'? Will it disallow duplicate records when you are loading master data?
    Actually master data should not have duplicate records, so why do we use this option? Without selecting the checkbox, what happens to duplicate records?
    Does it support only flat files or also R/3 systems? If it supports flat files, please tell me the procedure for both.
    Thanks
    Reddy

    Hi,
    If you check 'Ignore Duplicate records', the system will allow duplicate records.
    Actually, master data will not normally have duplicate records; this option is only useful in certain rare scenarios.
    regards
    SR

  • POSDM Duplicate records

    Hi Experts,
                        Is there a way to handle duplicate records in SAP POSDM?
    Thanks in advance,
    Khan

    Hi Abhijit,
                   Thanks for the reply. Is it mandatory that we maintain a rule code in order to maintain a precondition filter?
    And what does rule code 0001 (balanced transaction) imply?
    As of now I have maintained only the standard BADI implementation for the precondition filter to check for duplicate records for all my ISR & BI tasks.
    Regards,
    Khan

  • Duplicate record error

    Hi,
    I am using an ODS as the source to update a master data infoobject with flexible update. The issue is that in spite of using the option ONLY PSA (update subsequently in data targets) in the infopackage with error handling enabled, I am getting a duplicate record error. This happens only when I am updating through a process chain; if I run it manually, the error doesn't come. Please let me know the reason.
    Thanks

    Hi Maneesh,
    As we are loading from an ODS to an infoobject, we don't get the option "do not update duplicate records if they exist".
    First, did you check whether any duplicate records exist in PSA? If so, delete them from PSA.
    Or, one option is to enable error handling in the infopackage.
    Or
    check for inconsistencies for the infoobject in RSRV and, if any are found, repair them and load again. Check the inconsistencies for the P, X, and Y tables and for the complete object as well.
    *assign points if helpful*
    KS

  • Duplicate record issue in IP file

    Hello All,
    I have a requirement to handle duplicate records in an IP flat file.
    My interface is very simple: just loading the data through an IP interface. There is no IP input query, just a flat file load.
    My requirement is to apply a validation check so that if the file has two similar records, the file is rejected, for example:
    Field 1   Field2   Amount
    XXX       ABC    100
    XXX       ABC    100
    The file should be rejected. As per the standard functionality, the data is summed up in the cube.
    Is there any way to handle that?
    Thanks
    Samit

    I don't think you can do it; this is standard behavior. Maybe you can write your own class to check for duplicate records, use that class in a custom planning function, and throw an error message.
    Best is to make sure the end users take responsibility for the data.
    Arun

  • Check duplicate records

    Right now I'm doing a migration project. I need to import data from Excel and text files into an Oracle database. My question is how to do the duplicate data checking and the validation of identical attributes and data types, to make sure I import the data correctly and accurately. Does anyone have any suggestions? Your ideas and comments are highly appreciated. Thanks in advance.

    Hi,
    I'm new to this forum, so my answer is a little bit late ...
    Export the data from all documents in an identical format, for example |name|address|and|so|on|, and merge all files into one big one. Then eliminate duplicate rows with cat file | sort -u and you'll have unique rows. Next, you have to check whether different keys mean different attribute values: truncate the key values from the file and do a unique sort again, then diff -c both files and you get a list of the duplicate keys. The last step is to decide which ones are correct - I'm afraid this will be the hardest work.
    After that you can import all data without any constraint violation.
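    If you stage the merged file in an Oracle table before the final import, the same duplicate check can be written in SQL. A sketch with hypothetical names stage_table and key_col:

    -- List every key that occurs more than once in the staged data
    SELECT key_col, COUNT(*) AS copies
    FROM stage_table
    GROUP BY key_col
    HAVING COUNT(*) > 1;

    -- Pull the full rows for those keys to decide which copy is correct
    SELECT *
    FROM stage_table
    WHERE key_col IN (SELECT key_col
                      FROM stage_table
                      GROUP BY key_col
                      HAVING COUNT(*) > 1)
    ORDER BY key_col;

    Both queries only report the duplicates; deciding which copy is the right one remains the manual step described above.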
    Best regards
    Andreas

  • Check duplicate record in array

    suppose I have an array
    int a[]={1,2,3,4,4,5,6,7,7};
    how should I check for duplicate records and remove them?
    thanks...

    Write yourself some pseudo-code:
    // Loop though all the elements in the array from 0 to length-2.
    // For each element i, loop through following elements from i+1 to length-1
    // if the following element is equal to the current one, do something with it.
    "do something" can mean either one of two things:
    (1) Remove it (VERY messy) or
    (2) Create a new array and only add in the non-duplicate elements.
    The easiest thing of all would be to create a List and then create a Set from that. Why code all this logic when somebody has already done it for you? Look at the java.util.Arrays and java.util.Collections classes. Use what's been given to you.

  • How to suppress duplicate records in rtf templates

    Hi All,
    I am facing an issue with payment reason comments in a check template.
    We are displaying payment reason comments. The issue is that while making a batch payment we get multiple payment reason comments from multiple invoices with the same name, and it doesn't look good. You can see the payment reason comments under the tail number text field in the template.
    Please provide any XML syntax to suppress duplicate records so that only distinct payment reason comments are shown.
    Attached are a screenshot, the template, and the XML file for your reference.
    Thanks,
    Sagar.

    I have CR XI, so the instructions are for this release.
    You can create a formula; I called it Cust_Matches:
    if {table.CustID} = previous({table.CustID}) then 'true' else 'false'
    In your GH2 section, right-click the field, select Format Field, and select the Common tab (far left at the top).
    Select the x/2 to the right of Suppress, and in the formula field type in
    {@Cust_Matches} = 'true'
    Now every time {@Cust_Matches} is 'true', the CustID will be suppressed;
    do the same with the other fields you wish to hide, i.e. Address, City, etc.

  • How to find out duplicate record contained in a flat file

    Hi Experts,
    For my project I have written a program for flat file upload.
    Requirement 1
    In the flat file there may be some duplicate records, like:
    Field1   Field2
    11        test1
    11        test2
    12        test3
    13        test4
    Field1 is primary key.
    Can you please let me know how I can find the duplicate records.
    Requirement 2
    The flat file contains the header row as shown above
    Field1   Field2
    How can our program skip this header row and start reading / inserting records from row no. 2, i.e.
    11        test1
    onwards.
    Thanks
    S
    FORM upload1.
    DATA : wf_title TYPE string,
    lt_filetab TYPE filetable,
    l_separator TYPE char01,
    l_action TYPE i,
    l_count TYPE i,
    ls_filetab TYPE file_table,
    wf_delemt TYPE rollname,
    wa_fieldcat TYPE lvc_s_fcat,
    tb_fieldcat TYPE lvc_t_fcat,
    rows_read TYPE i,
    p_error TYPE char01,
    l_file TYPE string.
    DATA: wf_object(30) TYPE c,
    wf_tablnm TYPE rsdchkview.
    wf_object = 'myprogram'.
    DATA i TYPE i.
    DATA:
    lr_mdmt TYPE REF TO cl_rsdmd_mdmt,
    lr_mdmtr TYPE REF TO cl_rsdmd_mdmtr,
    lt_idocstate TYPE rsarr_t_idocstate,
    lv_subrc TYPE sysubrc.
    TYPES : BEGIN OF test_struc,
    /bic/myprogram TYPE /bic/oimyprogram,
    txtmd TYPE rstxtmd,
    END OF test_struc.
    DATA : tb_assum TYPE TABLE OF /bic/pmyprogram.
    DATA: wa_ztext TYPE /bic/tmyprogram,
    myprogram_temp TYPE ziott_assum,
    wa_myprogram TYPE /bic/pmyprogram.
    DATA : test_upload TYPE STANDARD TABLE OF test_struc,
    wa2 TYPE test_struc.
    DATA : wa_test_upload TYPE test_struc,
    ztable_data TYPE TABLE OF /bic/pmyprogram,
    ztable_text TYPE TABLE OF /bic/tmyprogram,
    wa_upld_text TYPE /bic/tmyprogram,
    wa_upld_data TYPE /bic/pmyprogram,
    t_assum TYPE ziott_assum.
    DATA : wa1 LIKE test_upload.
    wf_title = text-026.
    CALL METHOD cl_gui_frontend_services=>file_open_dialog
    EXPORTING
    window_title = wf_title
    default_extension = 'txt'
    file_filter = 'Tab delimited Text Files (*.txt)'
    CHANGING
    file_table = lt_filetab
    rc = l_count
    user_action = l_action
    EXCEPTIONS
    file_open_dialog_failed = 1
    cntl_error = 2
    OTHERS = 3. "#EC NOTEXT
    IF sy-subrc <> 0.
    EXIT.
    ENDIF.
    LOOP AT lt_filetab INTO ls_filetab.
    l_file = ls_filetab.
    ENDLOOP.
    CHECK l_action = 0.
    IF l_file IS INITIAL.
    EXIT.
    ENDIF.
    l_separator = 'X'.
    wa_fieldcat-fieldname = 'test'.
    wa_fieldcat-dd_roll = wf_delemt.
    APPEND wa_fieldcat TO tb_fieldcat.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    CLEAR wa_test_upload.
    * Upload file from front-end (PC)
    * File format is tab-delimited ASCII
    CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
    filename = l_file
    has_field_separator = l_separator
    TABLES
    * data_tab = i_mara
    data_tab = test_upload
    EXCEPTIONS
    file_open_error = 1
    file_read_error = 2
    no_batch = 3
    gui_refuse_filetransfer = 4
    invalid_type = 5
    no_authority = 6
    unknown_error = 7
    bad_data_format = 8
    header_not_allowed = 9
    separator_not_allowed = 10
    header_too_long = 11
    unknown_dp_error = 12
    access_denied = 13
    dp_out_of_memory = 14
    disk_full = 15
    dp_timeout = 16
    OTHERS = 17.
    IF sy-subrc <> 0.
    EXIT.
    ELSE.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    IF test_upload IS NOT INITIAL.
    DESCRIBE TABLE test_upload LINES rows_read.
    CLEAR : wa_test_upload,wa_upld_data.
    LOOP AT test_upload INTO wa_test_upload.
    CLEAR : p_error.
    rows_read = sy-tabix.
    IF wa_test_upload-/bic/myprogram IS INITIAL.
    p_error = 'X'.
    MESSAGE s153 WITH wa_test_upload-/bic/myprogram sy-tabix.
    CONTINUE.
    ELSE.
    TRANSLATE wa_test_upload-/bic/myprogram TO UPPER CASE.
    wa_upld_text-txtmd = wa_test_upload-txtmd.
    wa_upld_text-txtsh = wa_test_upload-txtmd.
    wa_upld_text-langu = sy-langu.
    wa_upld_data-chrt_accts = 'xyz1'.
    wa_upld_data-co_area = '12'.
    wa_upld_data-/bic/zxyzbcsg = 'Iy'.
    wa_upld_data-objvers = 'A'.
    wa_upld_data-changed = 'I'.
    wa_upld_data-/bic/zass_mdl = 'rrr'.
    wa_upld_data-/bic/zass_typ = 'I'.
    wa_upld_data-/bic/zdriver = 'yyy'.
    wa_upld_text-langu = sy-langu.
    MOVE-CORRESPONDING wa_test_upload TO wa_upld_data.
    MOVE-CORRESPONDING wa_test_upload TO wa_upld_text.
    APPEND wa_upld_data TO ztable_data.
    APPEND wa_upld_text TO ztable_text.
    ENDIF.
    ENDLOOP.
    DELETE ADJACENT DUPLICATES FROM ztable_data.
    DELETE ADJACENT DUPLICATES FROM ztable_text.
    IF ztable_data IS NOT INITIAL.
    CALL METHOD cl_rsdmd_mdmt=>factory
    EXPORTING
    i_chabasnm = 'myprogram'
    IMPORTING
    e_r_mdmt = lr_mdmt
    EXCEPTIONS
    invalid_iobjnm = 1
    OTHERS = 2.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    **Lock the Infoobject to update
    CALL FUNCTION 'RSDG_IOBJ_ENQUEUE'
    EXPORTING
    i_objnm = wf_object
    i_scope = '1'
    i_msgty = rs_c_error
    EXCEPTIONS
    foreign_lock = 1
    sys_failure = 2.
    IF sy-subrc = 1.
    MESSAGE i107(zddd_rr) WITH wf_object sy-msgv2.
    EXIT.
    ELSEIF sy-subrc = 2.
    MESSAGE i108(zddd_rr) WITH wf_object.
    EXIT.
    ENDIF.
    *****Update Master Table
    IF ztable_data IS NOT INITIAL.
    CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
    EXPORTING
    i_iobjnm = 'myprogram'
    i_tabclass = 'M'
    * I_T_ATTR = lt_attr
    TABLES
    i_t_table = ztable_data
    EXCEPTIONS
    attribute_name_error = 1
    iobj_not_found = 2
    generate_program_error = 3
    OTHERS = 4.
    IF sy-subrc <> 0.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'E'
    txtnr = '054'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    MESSAGE e054(zddd_rr) WITH 'myprogram'.
    ELSE.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'S'
    txtnr = '053'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    ENDIF.
    *endif.
    *****update Text Table
    IF ztable_text IS NOT INITIAL.
    CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
    EXPORTING
    i_iobjnm = 'myprogram'
    i_tabclass = 'T'
    TABLES
    i_t_table = ztable_text
    EXCEPTIONS
    attribute_name_error = 1
    iobj_not_found = 2
    generate_program_error = 3
    OTHERS = 4.
    IF sy-subrc <> 0.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'E'
    txtnr = '055'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    ENDIF.
    ENDIF.
    ELSE.
    MESSAGE s178(zddd_rr).
    ENDIF.
    ENDIF.
    COMMIT WORK.
    CALL FUNCTION 'RSD_CHKTAB_GET_FOR_CHA_BAS'
    EXPORTING
    i_chabasnm = 'myprogram'
    IMPORTING
    e_chktab = wf_tablnm
    EXCEPTIONS
    name_error = 1.
    IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    ****Release locks on Infoobject
    CALL FUNCTION 'RSDG_IOBJ_DEQUEUE'
    EXPORTING
    i_objnm = 'myprogram'
    i_scope = '1'.
    ENDIF.
    ENDIF.
    PERFORM data_selection .
    PERFORM update_alv_grid_display.
    CALL FUNCTION 'MESSAGES_SHOW'.
    ENDFORM.

    Can you please let me know how I can find the duplicate records.
    You need to split the records from the flat file structure into your internal table, sort it, and then use DELETE ADJACENT DUPLICATES COMPARING the key fields:
    SPLIT flat_str AT tab_space INTO wa_f1 wa_f2.

  • Write Optimized DSO Duplicate records

    Hi,
    We are facing a problem while doing a delta load to a write-optimized DataStore object.
    It gives the error "Duplicate data record detected (DS <ODS name>, data package: 000001, data record: 294)".
    But it cannot have a duplicate record, since the data comes from a DSO and
    we have also checked the particular record in PSA, where we couldn't find any duplicate.
    There is no very complex routine either.
    Has anyone ever faced this issue and found the solution? Please let me know if yes.
    Thanks
    VJ

    Ravi,
    We have checked that there are no duplicate records in PSA.
    Also, the source ODS has two keys while the target ODS has three keys.
    Also, the records it mentioned have record mode "N" (New).
    It seems to be an issue with the write-optimized DSO.
    Regards
    VJ
