Check for duplicate records in an array

Suppose I have an array:
int a[] = {1,2,3,4,4,5,6,7,7};
How should I check for duplicate records and remove them?
Thanks...

Write yourself some pseudo-code:
// Loop through all the elements in the array from 0 to length-2.
// For each element i, loop through the following elements from i+1 to length-1.
// If a following element is equal to the current one, do something with it.
"Do something" can mean one of two things:
(1) Remove it in place (VERY messy), or
(2) Create a new array and add in only the non-duplicate elements.
The easiest thing of all would be to create a List and then create a Set from that. Why code all this logic when somebody has already done it for you? Look at the java.util.Arrays and java.util.Collections classes. Use what's been given to you. A sketch of that approach follows below.
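A minimal sketch of the Set-based approach, using a LinkedHashSet so the original order survives (the array values are the ones from the question):

import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.Set;

public class Dedupe {
    public static void main(String[] args) {
        int[] a = {1, 2, 3, 4, 4, 5, 6, 7, 7};

        // A LinkedHashSet silently drops duplicates while keeping insertion order.
        Set<Integer> unique = new LinkedHashSet<>();
        for (int value : a) {
            unique.add(value);
        }

        // Copy the surviving elements back into a plain int array.
        int[] result = new int[unique.size()];
        int i = 0;
        for (int value : unique) {
            result[i++] = value;
        }
        System.out.println(Arrays.toString(result)); // [1, 2, 3, 4, 5, 6, 7]
    }
}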

Similar Messages

  • Check duplicate records

    Right now I'm doing a migration project. I need to import data from Excel and text files into an Oracle database. Now my question is how to do the duplicate-data checking and the validation on identical attributes and data types, to make sure I import the data correctly and accurately. Does anyone have any suggestions? Your ideas and comments are highly appreciated. Thanks in advance.

    Hi,
    I'm new to this forum, so my answer is a little bit late ...
    Export the data from all documents in an identical format, for example |name|address|and|so|on|, and merge all the files into one big one. Then eliminate duplicate records with cat file | sort -u and you'll have unique rows. The next step is to check whether different keys mean different attribute values: truncate the key values from the file and do a unique sort again, then diff -c both files and you get a list of the duplicate keys. The last step is to decide which ones are the correct ones - I'm afraid this will be the hardest work.
    After that you can import all data without any constraint violations.
    Best regards
    Andreas
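    The same key check sketched in Java, assuming the merged rows have the form key|field1|field2 (the file name merged.txt and that layout are illustrative, not from the original post):

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Map;
    import java.util.Set;

    public class KeyCheck {
        public static void main(String[] args) throws Exception {
            List<String> lines = Files.readAllLines(Paths.get("merged.txt"));

            // Equivalent of sort -u: keep each full row only once.
            Set<String> uniqueRows = new HashSet<>(lines);

            // Group the distinct rows by key (the first pipe-delimited field).
            Map<String, Set<String>> rowsByKey = new HashMap<>();
            for (String row : uniqueRows) {
                String key = row.split("\\|")[0];
                rowsByKey.computeIfAbsent(key, k -> new HashSet<>()).add(row);
            }

            // A key with more than one distinct row is what the diff -c step would reveal.
            for (Map.Entry<String, Set<String>> e : rowsByKey.entrySet()) {
                if (e.getValue().size() > 1) {
                    System.out.println("Conflicting key: " + e.getKey() + " -> " + e.getValue());
                }
            }
        }
    }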

  • Is there any functionality to check duplicate records of vendor master

    Dear all,
    In the creation of vendor master data, there is a probability of the same vendor being created twice by a user through ignorance.
    Do we have any check in SAP so that we cannot create a vendor if that vendor already exists?
    For instance, based on the vendor name, can we put a check in place so that the system gives an error message if that name already exists?
    Kindly revert back with your suggestions.
    Thanks,
    Kumar

    Hi,
    First of all you need to know which fields should be checked to identify the master data as duplicate, i.e. name / address / postal code / bank details.
    If you use only the name for checking master data duplication, there might be cases where two different vendors have the same name. So I would suggest taking more than one field for identifying the vendors.
    Best Regards
    Suresh Addagiri

  • Check for duplicate record in SQL database before doing INSERT

    Hey guys,
           This is part of a PowerShell app doing a SQL insert, but my question really relates to the SQL insert itself. I need to check the database PRIOR to doing the insert to look for duplicate records, and if a record exists then it needs
    to be overwritten. I'm not sure how to accomplish this task. My back end is a SQL Server 2000 instance. I'm piping the data into my insert statement from a PowerShell FileSystemWatcher app. In my scenario, if a file dumped into the directory starts with "I" it gets
    written to the SQL database, otherwise it gets written to an Access table. I know, silly, but that's the environment I'm in. Haha.
    Any help is appreciated.
    Thanks in Advance
    Rich T.
    #### DEFINE WATCH FOLDERS AND DEFAULT FILE EXTENSION TO WATCH FOR ####
                $cofa_folder = '\\cpsfs001\Data_pvs\TestCofA'
                $bulk_folder = '\\cpsfs001\PVS\Subsidiary\Nolwood\McWood\POD'
                $filter = '*.tif'
                $cofa = New-Object IO.FileSystemWatcher $cofa_folder, $filter -Property @{ IncludeSubdirectories = $false; EnableRaisingEvents= $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite' }
                $bulk = New-Object IO.FileSystemWatcher $bulk_folder, $filter -Property @{ IncludeSubdirectories = $false; EnableRaisingEvents= $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite' }
    #### CERTIFICATE OF ANALYSIS AND PACKAGE SHIPPER PROCESSING ####
                Register-ObjectEvent $cofa Created -SourceIdentifier COFA/PACKAGE -Action {
           $name = $Event.SourceEventArgs.Name
           $changeType = $Event.SourceEventArgs.ChangeType
           $timeStamp = $Event.TimeGenerated
    #### CERTIFICATE OF ANALYSIS PROCESS BEGINS ####
                $test = $name.StartsWith("I")
                if ($test -eq $true) {
                    $pos = $name.IndexOf(".")
                    $left = $name.Substring(0, $pos)
                    $pos = $left.IndexOf("L")
                    $tempItem = $left.Substring(0, $pos)
                    $lot = $left.Substring($pos + 1)
                    $item = $tempItem.Substring(1)
                Write-Host "in_item_key $item in_lot_key $lot imgfilename $name in_cofa_crtdt $timestamp"  -fore green
                Out-File -FilePath c:\OutputLogs\CofA.csv -Append -InputObject "in_item_key $item in_lot_key $lot imgfilename $name in_cofa_crtdt $timestamp"
                start-sleep -s 5
                $conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=PVSNTDB33; Initial Catalog=adagecopy_daily; Integrated Security=TRUE")
                $conn.Open()
                $insert_stmt = "INSERT INTO in_cofa_pvs (in_item_key, in_lot_key, imgfileName, in_cofa_crtdt) VALUES ('$item','$lot','$name','$timestamp')"
                $cmd = $conn.CreateCommand()
                $cmd.CommandText = $insert_stmt
                $cmd.ExecuteNonQuery()
                $conn.Close()
    #### PACKAGE SHIPPER PROCESS BEGINS ####
                } elseif ($test -eq $false) {
                    $pos = $name.IndexOf(".")
                    $left = $name.Substring(0, $pos)
                    $pos = $left.IndexOf("O")
                    $tempItem = $left.Substring(0, $pos)
                    $order = $left.Substring($pos + 1)
                    $shipid = $tempItem.Substring(1)
                Write-Host "so_hdr_key $order so_ship_key $shipid imgfilename $name in_cofa_crtdt $timestamp"  -fore green
                Out-File -FilePath c:\OutputLogs\PackageShipper.csv -Append -InputObject "so_hdr_key $order so_ship_key $shipid imgfilename $name in_cofa_crtdt $timestamp"
                # the Access-table insert for this branch is omitted here
                }
            }
    Rich Thompson

    Hi
    Since SQL Server 2000 has been out of support, I recommend you upgrade SQL Server 2000 to a higher version, such as SQL Server 2005 or SQL Server 2008.
    According to your description, you can try the following methods to check for duplicate records in SQL Server.
    1. You can use RAISERROR to check for the duplicate record: if it exists then RAISERROR, otherwise insert accordingly. A code block is given below:
    IF EXISTS (SELECT 1 FROM TableName AS t
               WHERE t.Column1 = @Column1
                 AND t.Column2 = @Column2)
    BEGIN
        RAISERROR('Duplicate records', 18, 1)
    END
    ELSE
    BEGIN
        INSERT INTO TableName (Column1, Column2, Column3)
        SELECT @Column1, @Column2, @Column3
    END
    2. Also, you can create a UNIQUE INDEX or UNIQUE CONSTRAINT on a column of the table; when you try to INSERT a value that conflicts with the index/constraint, an exception will be thrown.
    Add the unique index:
    CREATE UNIQUE INDEX Unique_Index_name ON TableName(ColumnName)
    Add the unique constraint:
    ALTER TABLE TableName
    ADD CONSTRAINT Unique_Constraint_Name
    UNIQUE (ColumnName)
    Thanks
    Lydia Zhang
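    Since the original requirement was to overwrite the existing record rather than reject it, here is a rough check-then-update-else-insert sketch in Java/JDBC. The table and column names come from the poster's insert statement; the assumption that imgfileName identifies a duplicate is mine, so adjust the WHERE clause to the real key:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class CofaUpsert {
        public static void upsert(Connection conn, String item, String lot,
                                  String name, String timestamp) throws Exception {
            // Check for an existing row first (assumed key: imgfileName).
            try (PreparedStatement check = conn.prepareStatement(
                    "SELECT COUNT(*) FROM in_cofa_pvs WHERE imgfileName = ?")) {
                check.setString(1, name);
                ResultSet rs = check.executeQuery();
                rs.next();
                boolean exists = rs.getInt(1) > 0;

                // Overwrite on duplicate, insert otherwise; both take the same parameters.
                String sql = exists
                    ? "UPDATE in_cofa_pvs SET in_item_key = ?, in_lot_key = ?, in_cofa_crtdt = ? WHERE imgfileName = ?"
                    : "INSERT INTO in_cofa_pvs (in_item_key, in_lot_key, in_cofa_crtdt, imgfileName) VALUES (?, ?, ?, ?)";
                try (PreparedStatement stmt = conn.prepareStatement(sql)) {
                    stmt.setString(1, item);
                    stmt.setString(2, lot);
                    stmt.setString(3, timestamp);
                    stmt.setString(4, name);
                    stmt.executeUpdate();
                }
            }
        }
    }

    Note that check-then-write is not atomic under concurrent writers; the unique index/constraint from option 2 remains the safer guard.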

  • Check duplicates

    Hi,
    How would I check duplicate records in an internal table?
    Thanks
    Will

    Hi, this is code which can give you the list of key-field values for the records that occur more than once. The prerequisite is that you should have at least a primary key in your internal table. Please change the code as per your requirement; it will work, and it is efficient too even if the number of records in the table is large -
    Data : begin of itab occurs 0,
           pernr like pernr-pernr,
           endda like p0001-endda,
           end of itab.
    Data : begin of pernr_list occurs 0,
           pernr like pernr-pernr,
           end of pernr_list.
    data : itab1 type range of pernr-pernr WITH HEADER LINE,
           itab2 like standard table of itab with header line.
    DATA: DUPCOUNT TYPE I,
          ORICOUNT TYPE I,
          NEWCOUNT TYPE I,
          TOTALHIT TYPE I VALUE 0.
    Start-of-selection.
    select pernr endda into corresponding fields of table itab
    from pa0001
    where endda = '99991231' and persg = p_eg and
    pernr in ( select pernr from pa0000 where STAT2 = p_es
               and endda = '99991231' ).
    if sy-subrc = 0.
    sort itab by pernr.
    itab2[] = itab[].
    DESCRIBE TABLE ITAB LINES ORICOUNT.
    DELETE ADJACENT DUPLICATES FROM ITAB2.
    DESCRIBE TABLE ITAB2 LINES NEWCOUNT.
    DUPCOUNT = ORICOUNT - NEWCOUNT.
      ITAB1-SIGN   =  'I'.
      ITAB1-OPTION   =  'EQ'.
      ITAB1-LOW  = '00000000'.
      APPEND ITAB1.
    loop at itab.
    IF NOT ITAB-PERNR IN ITAB1.
      ITAB1-SIGN   =  'I'.
      ITAB1-OPTION   =  'EQ'.
      ITAB1-LOW  = ITAB-PERNR.
      APPEND ITAB1.
    ELSE.
      pernr_list-pernr = itab-pernr.    "Contain key field Pernr of duplicate records
      append pernr_list.
      TOTALHIT = TOTALHIT + 1.
      IF DUPCOUNT EQ TOTALHIT.
       EXIT.
      ENDIF.
    ENDIF.
    endloop.
    Reward points if helpful.

  • Error RSMPTEXTS~: Duplicate record during EHP5 in phase SHADOW_IMPORT_INC

    Hi experts,
    I found this error during an EHP5 upgrade, in phase SHADOW_IMPORT_INC:
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    SHADOW IMPORT ERRORS and RETURN CODE in SAPK-701DOINSAPBASIS.ERD
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    2EETW000 Table RSMPTEXTS~: Duplicate record during array insert occured.
    2EETW000 Table RSMPTEXTS~: Duplicate record during array insert occured.
    1 ETP111 exit code           : "8"
    here also the last part of log SAPK-701DOINSAPBASIS.ERD
    4 ETW000 Totally 4 tabentries imported.
    4 ETW000 953984 bytes modified in database.
    4 ETW000  [     dev trc,00000]  Thu Aug 11 16:58:45 2011                                             7954092  8.712985
    4 ETW000  [     dev trc,00000]  Disconnecting from ALL connections:                                       28  8.713013
    4 ETW000  [     dev trc,00000]  Disconnecting from connection 0 ...                                       38  8.713051
    4 ETW000  [     dev trc,00000]  Closing user session (con=0, svc=0000000005C317C8, usr=0000000005C409A0)
    4 ETW000                                                                                8382  8.721433
    4 ETW000  [     dev trc,00000]  Detaching from DB Server (con=0,svchp=0000000005C317C8,srvhp=0000000005C32048)
    4 ETW000                                                                                7275  8.728708
    4 ETW000  [     dev trc,00000]  Now I'm disconnected from ORACLE                                        8648  8.737356
    4 ETW000  [     dev trc,00000]  Disconnected from connection 0                                            84  8.737440
    4 ETW000  [     dev trc,00000]  statistics db_con_commit (com_total=13, com_tx=13)                        18  8.737458
    4 ETW000  [     dev trc,00000]  statistics db_con_rollback (roll_total=0, roll_tx=0)                      14  8.737472
    4 ETW000 Disconnected from database.
    4 ETW000 End of Transport (0008).
    4 ETW000 date&time: 11.08.2011 - 16:58:45
    4 ETW000 1 warning occured.
    4 ETW000 1 error occured.
    1 ETP187 R3TRANS SHADOW IMPORT
    1 ETP110 end date and time   : "20110811165845"
    1 ETP111 exit code           : "8"
    1 ETP199 ######################################
    4 EPU202XEND OF SECTION BEING ANALYZED IN PHASE SHADOW_IMPORT_INC
    I have already tried using the latest version of R3trans.
    Can you help me?
    thanks a lot
    Franci

    Hello Fransesca,
    I am also facing the same error while performing an EHP5 upgrade. If you know the steps to solve it, please tell me.
    Thanks,
    Venkat

  • The ABAP/4 Open SQL array insert results in duplicate Record in database

    Hi All,
    I am trying to transfer 4 plants from R/3 to APO. The integration model (IM) contains only these 4 plants. However, a queue gets generated in APO saying 'The ABAP/4 Open SQL array insert results in duplicate record in database'. I checked the tables /SAPAPO/LOC, /SAPAPO/LOCMAP and /SAPAPO/LOCT for a duplicate entry, but no such entry was found.
    Can anybody guide me how to resolve this issue?
    Thanks in advance
    Sandeep Patil

    Hi Sandeep,
              Now try to delete your location before activating the IM again.
    Use the program /SAPAPO/DELETE_LOCATIONS to delete locations.
    Note :
    1. Set the deletion flag (in /SAPAPO/LOC : Location -> Deletion Flag)
    2. Remove all the dependencies (like transportation lane, Model ........ )
    Check now and let me know.
    Regards,
    Siva.

  • Duplicate record check before inserting records

    Hi All
    I want to show a user-friendly message instead of (oracle.jbo.TooManyObjectsException: JBO-25013: Too many objects match the primary key oracle.jbo.Key). So in my EO I have written the following code:
    OADBTransaction transaction = getOADBTransaction();
    Object[] empNumberKey = {value};
    EntityDefImpl empDefinition = XXXXempEOImpl.getDefinitionObject();
    XXXXempEOImpl empNo = (XXXXempEOImpl) empDefinition.findByPrimaryKey(transaction, new Key(empNumberKey));
    if (empNo != null) {
        throw new OAAttrValException(OAException.TYP_ENTITY_OBJECT,
                                     getEntityDef().getFullName(),
                                     getPrimaryKey(), "CompanyNumber",
                                     value, "AK",
                                     "FWK_TBX_T_EMP_ID_UNIQUE");
    }
    setAttributeInternal(COMPANYNUMBER, value);
    My observation is that when the duplicate empNumber is passed as '0011', the error message is not thrown, but if I pass a duplicate empNumber like '5411', the error is thrown. So does it mean new Key(empNumberKey) chops off leading 0's? Please note that in the database the values are stored as '0011'. Please advise. The validation fails only when the value has leading 0's.

    You need to run a SELECT command before the INSERT and check the result returned by ExecuteScalar; this gives you the record count, which lets you decide whether to insert or not.
    Check the below example:
    http://stackoverflow.com/questions/15320544/how-to-check-if-record-exists-if-not-insert-using-vb-net
    Fouad Roumieh

  • Check duplicate data entry in multi record block,which is a mandatory field

    Dear all,
    I have a situation where I have to check for duplicate data entry (on a particular field which is mandatory, i.e. it cannot be skipped by the user without entering a value) during data key-in in a multi-record block.
    As a reference I have used the following logic:
    1> In a When-Validate-Record trigger of that block, I assign the value of the current item to a table-type (collection) variable.
    This trigger fires every time I leave a record, so it keeps collecting the current values, and this process continues.
    Then
    2> A When-Validate-Item trigger on the corresponding item (i.e. at item level) compares the value of the current item with the values stored in the table-type variable from the When-Validate-Record trigger. If the current item value matches any stored value, I show a 'Duplicate Record' message followed by RAISE FORM_TRIGGER_FAILURE.
    This code works fine for checking duplicate values of that multi-record field.
    The problem is that if the user enters a value in that field, goes to the next field, enters a value there and then presses the 'Enter Query' icon, both validate triggers fire. As a result, When-Validate-Record fires first and stores the value, and then When-Validate-Item fires, so it shows the duplicate record message.
    Please give me a meaningful logic or code for solving this problem.
    Any other logic to solve this problem is also welcome.

    @Ammad Ahmed
    First of all, thanks - your logic worked, but I still have a little bit of a problem.
    Now the requirement is a master-detail form where both master and detail are multi-record, and the detail cannot have duplicate records,
    such as:
    MASTER:
    A code
    A1
    A2
    DETAIL:
    D code
    d1
    d2 <- valid, as for master A1 the details d1, d2 are not duplicates
    d2 <- invalid, as for master A1 the details d2, d2 are duplicates
    Validation rule: the A Code - D Code combination is unique. The system will stop users from entering a duplicate D Code for an A Code. An appropriate error message will be displayed.
    Actually I am facing a typical problem: the same logic has been applied in the detail section, and it works fine when I am inserting new records. The problem starts when I query. After the query, say 2 records (i.e. saved earlier) are populated in the field. Now if I insert a new record with a value exactly the same as a value already present on the screen (i.e. a value populated by the query), it does not show a duplicate. Could you tell me the reason and help me out? It's urgent, please.
    Edited by: sushovan on Nov 22, 2010 4:34 AM
    Edited by: sushovan on Nov 22, 2010 4:36 AM
    Edited by: sushovan on Nov 22, 2010 8:58 AM

  • Check Duplicate data during data key-in Multi Record Block

    Dear all,
    I have a situation where I have to check for duplicate data entry (on a particular field which is mandatory, i.e. it cannot be skipped by the user without entering a value) during data key-in in a multi-record block.
    As a reference I have used the following logic:
    1> In a When-Validate-Record trigger of that block, I assign the value of the current item to a table-type (collection) variable.
    This trigger fires every time I leave a record, so it keeps collecting the current values, and this process continues.
    Then
    2> A When-Validate-Item trigger on the corresponding item (i.e. at item level) compares the value of the current item with the values stored in the table-type variable from the When-Validate-Record trigger. If the current item value matches any stored value, I show a 'Duplicate Record' message followed by RAISE FORM_TRIGGER_FAILURE.
    This code works fine for checking duplicate values of that multi-record field.
    The problem here is that if the user gets the 'Duplicate Record' message and then, without saving the values, tries to query on that block, the When-Validate-Item trigger fires again, whereas I am expecting the Oracle Forms default alert ('Do you want to save?'). I want to stop this When-Validate-Item firing at query time, i.e. while the user tries to query.
    Please give me a meaningful logic or code for solving this problem.
    Any other logic to solve this problem is also welcome.

    When-Validate-Record trigger
    When-Validate-Item trigger
    That smells like Oracle Forms...
    And the Oracle Forms forum is over here: Forms

  • Array permitting duplicate record

    Hi,
    I have string array like below:
    String strArr[] = {"style1","style2","style2","style4"};
    In this string array, index 1 and index 2 have the same value, which is style2.
    My target is to generate key(system time) for each value as below:
    "key1" : "style1"
    "key2" : "style2"
    "key2" : "style2" //Here I want to reuse already generated key2 in place of creating new key like key3
    "key4" : "style4 "
    At last generated array should be:
    ["key1" : "style1",
    "key2" : "style2",
    "key2" : "style2" , //Here I want to retain this duplicate record
    "key4" : "style4 "]
    The position of each value is also important; an array like the one below is not acceptable:
    ["key2" : "style2",
    "key1" : "style1",
    "key2" : "style2" ,
    "key4" : "style4 "]
    I am struggling to achieve this.
    Please guide me to achieve this!
    Regards
    My Efforts:
         static Map<String, String> m = new HashMap<String, String>();
         static List<Integer> list = new ArrayList<Integer>();
         public static void main(String[] args) throws InterruptedException {
              String strArr[] = {"style1","style2","style2","style4"};
              /** Find the indexes holding a duplicate value */
              for (int i = 0; i < strArr.length; i++) {
                   for (int j = 0; j < strArr.length; j++) {
                        // compare contents with equals(), not ==
                        if (strArr[i].equals(strArr[j]) && (i != j)) {
                             list.add(i);
                             String newclass = "class" + new Date().getTime();
                             //Thread.sleep(1000);
                             m.put(newclass, strArr[i]);
                             break;
                        }
                   }
              }
              for (int i : list) {
                   //System.out.println("Duplicate value found at Index:: " + i);
              }
              /* Abandoned attempt:
              String newduplicateclass = "class" + new Date().getTime();
              Thread.sleep(1000);
              for (int i = 0; i < strArr.length; i++) {
                   if (list.contains(i)) {
                        m.put(newduplicateclass, strArr[i]);
                   } else {
                        String newclass = "class" + new Date().getTime();
                        m.put(newclass, strArr[i]);
                        Thread.sleep(1000);
                   }
              }
              */
              System.out.println(m);
         }
    But I want to do this the smartest way, using only a few lines of code.
    Edited by: 898087 on Mar 17, 2012 3:16 AM

    @TPD Thanks for your help! It gave me a speed up.
    At last I am done with what I wanted and below is that code:
    public static void main(String[] args) {
              String strArr[] = {"style7","style1","style5","style7","style2","style1","style2","style7","style2","style5","style7","style7"};
              Set<String> set = new HashSet<String>();
              for (int i = 0; i < strArr.length; i++) {
                   set.add(strArr[i]);
              }
              Object objArr[] = set.toArray();
              int counter = 0;
              Map<String, Integer> map = new HashMap<String, Integer>();
              for (int j = 0; j < objArr.length; j++) {
                   map.put(objArr[j].toString(), counter);
                   counter++;
              }
              JSONArray jsonArr = new JSONArray();
              JSONObject jsonObj;
              for (int i = 0; i < strArr.length; i++) {
                   jsonObj = new JSONObject();
                   jsonObj.put(strArr[i], map.get(strArr[i]));
                   jsonArr.add(jsonObj);
              }
              System.out.println(jsonArr);
    }
    //OUTPUT: [{"style7":0},{"style1":2},{"style5":1},{"style7":0},{"style2":3},{"style1":2},{"style2":3},{"style7":0},{"style2":3},{"style5":1},{"style7":0},{"style7":0}]
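    A shorter variant of the same idea: one pass with putIfAbsent gives each distinct style a stable id the first time it appears, reuses it for every duplicate, and makes the numbering independent of HashSet iteration order (plain string output here instead of a JSON library):

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class StableKeys {
        public static void main(String[] args) {
            String[] strArr = {"style7","style1","style5","style7","style2","style1",
                               "style2","style7","style2","style5","style7","style7"};

            // The first occurrence of a value claims the next id; later duplicates reuse it.
            Map<String, Integer> ids = new LinkedHashMap<>();
            StringBuilder out = new StringBuilder("[");
            for (String s : strArr) {
                ids.putIfAbsent(s, ids.size());
                if (out.length() > 1) out.append(",");
                out.append("{\"").append(s).append("\":").append(ids.get(s)).append("}");
            }
            System.out.println(out.append("]"));
            // [{"style7":0},{"style1":1},{"style5":2},{"style7":0},{"style2":3},...]
        }
    }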

  • Duplicate records in a collection

    Hi Experts,
    Just now I've seen a thread related to finding duplicate records in a collection. I understand that it is not advisable to sort/filter data in a collection.
    (https://forums.oracle.com/thread/2584168)
    Just out of curiosity I tried to display the duplicate records in a collection. Please note this is just for practice purposes only. Below is the rough code which I wrote.
    I'm aware of one way - it can be handled effectively by passing the data into a global temporary table and displaying the duplicate/unique records.
    Can you please let me know if there is any other efficient way to do this.
    declare
      type emp_rec is record ( ename varchar2(40), empno number);
      l_emp_rec emp_rec; 
      type emp_tab is table of l_emp_rec%type index by binary_integer;
      l_emp_tab emp_tab;
      l_dup_tab emp_tab;
      l_cnt number;
      n number :=1;
    begin
    -- Assigning values to Associative array
      l_emp_tab(1).ename := 'suri';
      l_emp_tab(1).empno := 1;
      l_emp_tab(2).ename := 'surya';
      l_emp_tab(2).empno := 2;
      l_emp_tab(3).ename := 'suri';
      l_emp_tab(3).empno := 1;
    -- Comparing collection for duplicate records
    for i in l_emp_tab.first..l_emp_tab.last
    loop
        l_cnt :=0;  
    for j in l_emp_tab.first..l_emp_tab.last 
        loop      
           if l_emp_tab(i).empno = l_emp_tab(j).empno and l_emp_tab(i).ename = l_emp_tab(j).ename then
               l_cnt := l_cnt + 1;
                   if l_cnt = 2 then          -- record each duplicate entry only once
                      l_dup_tab(n) := l_emp_tab(i);
                      n := n + 1;             -- without this, every duplicate overwrites slot 1
                   end if;
           end if;
        end loop;
    end loop;
    -- Displaying duplicate records
    if l_dup_tab.count > 0 then
      for i in l_dup_tab.first..l_dup_tab.last
      loop
         dbms_output.put_line(l_dup_tab(i).ename||'  '||l_dup_tab(i).empno);
      end loop;
    end if;
    end;
    Cheers,
    Suri

    Dunno if this is either easier or more efficient but it is different.  The biggest disadvantage to this technique is that you have extraneous database objects (a table) to keep track of.  The advantage is that you can use SQL to perform the difference checks easily.
    Create 2 global temporary tables with the structure you need, load them, and use set operators (UNION [ALL], INTERSECT, MINUS) to find the differences.  Or, create 1 GTT with an extra column identifying the set and use the extra column to identify the set records you need.

  • How to suppress duplicate records in rtf templates

    Hi All,
    I am facing an issue with payment reason comments in a check template.
    We are displaying payment reason comments. Now the issue is that while making a batch payment we get multiple payment reason comments from multiple invoices with the same name, and it doesn't look good. You can see the payment reason comments under the tail number text field in the template.
    Please provide any XML syntax to suppress duplicate records, so that distinct payment reason comments are shown.
    Attached screen shot, template and xml file for your reference.
    Thanks,
    Sagar.

    I have CR XI, so the instructions are for this release.
    You can create a formula; I called it Cust_Matches:
    if {YourTable.CustID} = previous({YourTable.CustID}) then 'true' else 'false'
    In your GH2 section, right-click the field, select Format Field, and select the Common tab (far left at the top).
    Select the x/2 button to the right of Suppress, and in the formula field type in:
    {@Cust_Matches} = 'true'
    Now every time {@Cust_Matches} is 'true', the CustID will be suppressed.
    Do the same with the other fields you wish to hide, i.e. Address, City, etc.

  • How to find out duplicate record contained in a flat file

    Hi Experts,
    For my project I have written a program for flat file upload.
    Requirement 1
    In the flat file there may be some duplicate record like:
    Field1   Field2
    11        test1
    11        test2
    12        test3
    13        test4
    Field1 is primary key.
    Can you please let me know how I can find the duplicate records?
    Requirement 2
    The flat file contains the header row as shown above
    Field1   Field2
    How can our program skip this header record and start reading / inserting records from row no. 2, i.e.
    11        test1
    onwards.
    Thanks
    S
    FORM upload1.
    DATA : wf_title TYPE string,
    lt_filetab TYPE filetable,
    l_separator TYPE char01,
    l_action TYPE i,
    l_count TYPE i,
    ls_filetab TYPE file_table,
    wf_delemt TYPE rollname,
    wa_fieldcat TYPE lvc_s_fcat,
    tb_fieldcat TYPE lvc_t_fcat,
    rows_read TYPE i,
    p_error TYPE char01,
    l_file TYPE string.
    DATA: wf_object(30) TYPE c,
    wf_tablnm TYPE rsdchkview.
    wf_object = 'myprogram'.
    DATA i TYPE i.
    DATA:
    lr_mdmt TYPE REF TO cl_rsdmd_mdmt,
    lr_mdmtr TYPE REF TO cl_rsdmd_mdmtr,
    lt_idocstate TYPE rsarr_t_idocstate,
    lv_subrc TYPE sysubrc.
    TYPES : BEGIN OF test_struc,
    /bic/myprogram TYPE /bic/oimyprogram,
    txtmd TYPE rstxtmd,
    END OF test_struc.
    DATA : tb_assum TYPE TABLE OF /bic/pmyprogram.
    DATA: wa_ztext TYPE /bic/tmyprogram,
    myprogram_temp TYPE ziott_assum,
    wa_myprogram TYPE /bic/pmyprogram.
    DATA : test_upload TYPE STANDARD TABLE OF test_struc,
    wa2 TYPE test_struc.
    DATA : wa_test_upload TYPE test_struc,
    ztable_data TYPE TABLE OF /bic/pmyprogram,
    ztable_text TYPE TABLE OF /bic/tmyprogram,
    wa_upld_text TYPE /bic/tmyprogram,
    wa_upld_data TYPE /bic/pmyprogram,
    t_assum TYPE ziott_assum.
    DATA : wa1 LIKE test_upload.
    wf_title = text-026.
    CALL METHOD cl_gui_frontend_services=>file_open_dialog
    EXPORTING
    window_title = wf_title
    default_extension = 'txt'
    file_filter = 'Tab delimited Text Files (*.txt)'
    CHANGING
    file_table = lt_filetab
    rc = l_count
    user_action = l_action
    EXCEPTIONS
    file_open_dialog_failed = 1
    cntl_error = 2
    OTHERS = 3. "#EC NOTEXT
    IF sy-subrc <> 0.
    EXIT.
    ENDIF.
    LOOP AT lt_filetab INTO ls_filetab.
    l_file = ls_filetab.
    ENDLOOP.
    CHECK l_action = 0.
    IF l_file IS INITIAL.
    EXIT.
    ENDIF.
    l_separator = 'X'.
    wa_fieldcat-fieldname = 'test'.
    wa_fieldcat-dd_roll = wf_delemt.
    APPEND wa_fieldcat TO tb_fieldcat.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    CLEAR wa_test_upload.
    * Upload file from front-end (PC)
    * File format is tab-delimited ASCII
    CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
    filename = l_file
    has_field_separator = l_separator
    TABLES
    * data_tab = i_mara
    data_tab = test_upload
    EXCEPTIONS
    file_open_error = 1
    file_read_error = 2
    no_batch = 3
    gui_refuse_filetransfer = 4
    invalid_type = 5
    no_authority = 6
    unknown_error = 7
    bad_data_format = 8
    header_not_allowed = 9
    separator_not_allowed = 10
    header_too_long = 11
    unknown_dp_error = 12
    access_denied = 13
    dp_out_of_memory = 14
    disk_full = 15
    dp_timeout = 16
    OTHERS = 17.
    IF sy-subrc <> 0.
    EXIT.
    ELSE.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    IF test_upload IS NOT INITIAL.
    DESCRIBE TABLE test_upload LINES rows_read.
    CLEAR : wa_test_upload,wa_upld_data.
    LOOP AT test_upload INTO wa_test_upload.
    CLEAR : p_error.
    rows_read = sy-tabix.
    IF wa_test_upload-/bic/myprogram IS INITIAL.
    p_error = 'X'.
    MESSAGE s153 WITH wa_test_upload-/bic/myprogram sy-tabix.
    CONTINUE.
    ELSE.
    TRANSLATE wa_test_upload-/bic/myprogram TO UPPER CASE.
    wa_upld_text-txtmd = wa_test_upload-txtmd.
    wa_upld_text-txtsh = wa_test_upload-txtmd.
    wa_upld_text-langu = sy-langu.
    wa_upld_data-chrt_accts = 'xyz1'.
    wa_upld_data-co_area = '12'.
    wa_upld_data-/bic/zxyzbcsg = 'Iy'.
    wa_upld_data-objvers = 'A'.
    wa_upld_data-changed = 'I'.
    wa_upld_data-/bic/zass_mdl = 'rrr'.
    wa_upld_data-/bic/zass_typ = 'I'.
    wa_upld_data-/bic/zdriver = 'yyy'.
    wa_upld_text-langu = sy-langu.
    MOVE-CORRESPONDING wa_test_upload TO wa_upld_data.
    MOVE-CORRESPONDING wa_test_upload TO wa_upld_text.
    APPEND wa_upld_data TO ztable_data.
    APPEND wa_upld_text TO ztable_text.
    ENDIF.
    ENDLOOP.
    DELETE ADJACENT DUPLICATES FROM ztable_data.
    DELETE ADJACENT DUPLICATES FROM ztable_text.
    IF ztable_data IS NOT INITIAL.
    CALL METHOD cl_rsdmd_mdmt=>factory
    EXPORTING
    i_chabasnm = 'myprogram'
    IMPORTING
    e_r_mdmt = lr_mdmt
    EXCEPTIONS
    invalid_iobjnm = 1
    OTHERS = 2.
    CALL FUNCTION 'MESSAGES_INITIALIZE'.
    **Lock the Infoobject to update
    CALL FUNCTION 'RSDG_IOBJ_ENQUEUE'
    EXPORTING
    i_objnm = wf_object
    i_scope = '1'
    i_msgty = rs_c_error
    EXCEPTIONS
    foreign_lock = 1
    sys_failure = 2.
    IF sy-subrc = 1.
    MESSAGE i107(zddd_rr) WITH wf_object sy-msgv2.
    EXIT.
    ELSEIF sy-subrc = 2.
    MESSAGE i108(zddd_rr) WITH wf_object.
    EXIT.
    ENDIF.
    *****Update Master Table
    IF ztable_data IS NOT INITIAL.
    CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
    EXPORTING
    i_iobjnm = 'myprogram'
    i_tabclass = 'M'
    * i_t_attr = lt_attr
    TABLES
    i_t_table = ztable_data
    EXCEPTIONS
    attribute_name_error = 1
    iobj_not_found = 2
    generate_program_error = 3
    OTHERS = 4.
    IF sy-subrc <> 0.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'E'
    txtnr = '054'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    MESSAGE e054(zddd_rr) WITH 'myprogram'.
    ELSE.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'S'
    txtnr = '053'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    ENDIF.
    *endif.
    *****update Text Table
    IF ztable_text IS NOT INITIAL.
    CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
    EXPORTING
    i_iobjnm = 'myprogram'
    i_tabclass = 'T'
    TABLES
    i_t_table = ztable_text
    EXCEPTIONS
    attribute_name_error = 1
    iobj_not_found = 2
    generate_program_error = 3
    OTHERS = 4.
    IF sy-subrc <> 0.
    CALL FUNCTION 'MESSAGE_STORE'
    EXPORTING
    arbgb = 'zddd_rr'
    msgty = 'E'
    txtnr = '055'
    msgv1 = text-033
    EXCEPTIONS
    OTHERS = 3.
    ENDIF.
    ENDIF.
    ELSE.
    MESSAGE s178(zddd_rr).
    ENDIF.
    ENDIF.
    COMMIT WORK.
    CALL FUNCTION 'RSD_CHKTAB_GET_FOR_CHA_BAS'
    EXPORTING
    i_chabasnm = 'myprogram'
    IMPORTING
    e_chktab = wf_tablnm
    EXCEPTIONS
    name_error = 1.
    IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    ****Release locks on Infoobject
    CALL FUNCTION 'RSDG_IOBJ_DEQUEUE'
    EXPORTING
    i_objnm = 'myprogram'
    i_scope = '1'.
    ENDIF.
    ENDIF.
    PERFORM data_selection .
    PERFORM update_alv_grid_display.
    CALL FUNCTION 'MESSAGES_SHOW'.
    ENDFORM.

    Can you please let me know how I can find out the duplicate record.
    You need to split the records from the flat file structure into your internal table and use DELETE ADJACENT DUPLICATES COMPARING the key fields:
    split flat_str at tab_space into wa_f1 wa_f2.
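    For the general idea outside ABAP, here is a small Java sketch of both requirements, assuming a tab-delimited file whose first row is the header and whose first column is the primary key (the file name upload.txt is made up):

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Set;

    public class FlatFileCheck {
        public static void main(String[] args) throws Exception {
            List<String> lines = Files.readAllLines(Paths.get("upload.txt"));

            Set<String> seenKeys = new HashSet<>();
            // Requirement 2: start at index 1 to skip the header row.
            for (int i = 1; i < lines.size(); i++) {
                String[] fields = lines.get(i).split("\t");
                String key = fields[0];
                // Requirement 1: add() returns false if the key was already seen.
                if (!seenKeys.add(key)) {
                    System.out.println("Duplicate key at row " + (i + 1) + ": " + key);
                }
            }
        }
    }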

  • Write Optimized DSO Duplicate records

    Hi,
    We are facing a problem while doing a delta load to a write-optimized DataStore object.
    It gives the error "Duplicate data record detected (DS <ODS name> , data package: 000001 , data record: 294 )".
    But it cannot have a duplicate record, since the data comes from a DSO, and
    we have also checked the particular record in the PSA and there we couldn't find any duplicate.
    There is no very complex routine either.
    Has anyone ever faced this issue and found the solution? Please let me know if yes.
    Thanks
    VJ

    Ravi,
    We have checked that there are no duplicate records in the PSA.
    Also, the source ODS has two keys and the target ODS has three keys.
    Also, the records it mentioned have record mode "N" (new).
    It seems to be an issue with the write-optimized DSO.
    Regards
    VJ
