Remove duplicates from a List&lt;object&gt;

Hi,
I have retrieved all the XML node values into a List<object>. Now I want to remove all the duplicate objects from that list.
Example:
If the object has n properties, I should remove only the duplicate objects that have the same values in all n properties, not just in one property such as ID or Name. Please let me know how to achieve this, preferably using LINQ.
Thank you.
Regards,
Kiran

You can use a GroupBy. I took the code from the last posting:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Xml;
using System.Xml.Linq;
using System.IO;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            string file1 = "<?xml version=\"1.0\" encoding=\"utf-8\" ?>" +
                "<Methods>" +
                "<Method>" +
                "<ID>1234</ID>" +
                "<Name>manager</Name>" +
                "<Path>path1</Path>" +
                "</Method>" +
                "<Method>" +
                "<ID>5678</ID>" +
                "<Name>manager</Name>" +
                "<Path>path2</Path>" +
                "</Method>" +
                "<Method>" +
                "<ID>5678</ID>" +
                "<Name>manager</Name>" +
                "<Path>path2</Path>" +
                "</Method>" +
                "</Methods>";
            string file2 = "<?xml version=\"1.0\" encoding=\"utf-8\" ?>" +
                "<Methods>" +
                "<Method>" +
                "<Path>path1</Path>" +
                "<Description>text</Description>" +
                "</Method>" +
                "<Method>" +
                "<Path>path2</Path>" +
                "<Description>text</Description>" +
                "</Method>" +
                "</Methods>";
            string file3 = "<?xml version=\"1.0\" encoding=\"utf-8\" ?>" +
                "<Methods>" +
                "</Methods>";

            XDocument doc1 = XDocument.Load(new StringReader(file1));
            XDocument doc2 = XDocument.Load(new StringReader(file2));
            XDocument newXDoc = XDocument.Load(new StringReader(file3));

            // Join the two documents on Path and build merged Method elements.
            var results = from e1 in doc1.Descendants("Method")
                          join e2 in doc2.Descendants("Method")
                              on e1.Element("Path").Value equals e2.Element("Path").Value
                          select new XElement("Method",
                              e1.Element("ID"), e1.Element("Name"), e2.Element("Description"));

            // Group by ID and keep only the first element of each group,
            // which drops the duplicate Method entries.
            var results2 = results.GroupBy(x => x.Element("ID").Value);
            var results3 = results2.Select(g => g.First());

            newXDoc.Descendants("Methods").Last().Add(results3);
        }
    }
}
jdweng
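If the duplicate check has to cover every property rather than just the ID, one option is to group on a composite key: an anonymous type compares all of its members by value, so two items land in the same group only when every listed property matches. Alternatively, Distinct() can be given a custom IEqualityComparer<T>. A minimal sketch, assuming a hypothetical Method class with ID, Name and Path properties (your actual type and property names will differ):

using System;
using System.Collections.Generic;
using System.Linq;

class Method
{
    public string ID { get; set; }
    public string Name { get; set; }
    public string Path { get; set; }
}

class Program
{
    static void Main()
    {
        var list = new List<Method>
        {
            new Method { ID = "5678", Name = "manager", Path = "path2" },
            new Method { ID = "5678", Name = "manager", Path = "path2" },
            new Method { ID = "1234", Name = "manager", Path = "path1" }
        };

        // Group on an anonymous type containing every property that matters;
        // only items identical in ID, Name and Path share a group, and we keep
        // the first item of each group.
        var distinct = list
            .GroupBy(m => new { m.ID, m.Name, m.Path })
            .Select(g => g.First())
            .ToList();

        foreach (var m in distinct)
            Console.WriteLine("{0} {1} {2}", m.ID, m.Name, m.Path); // two items remain
    }
}

If the items really are held as a List<object> with mixed types, project or filter them to a concrete type first (for example with OfType<Method>()) so that the properties are available to the key selector.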

Similar Messages

  • Name for list object:  ALVXXL01

    Dear Gurus,
    I have a query regarding exporting the data to a spreadsheet. When I run the job the system asks for the details below; once I have provided them, the file does not download. Please help me.
    First I get the message below:
    Filter criteria, sorting, totals and ... not taken into account
    Later the system asks for the details below:
    Name for list object:  ALVXXL01
    Title for SAPoffice:
    Regards
    Srinivas

    Check: SAP Excel is not opening / end users are not able to export a report to a spreadsheet.
    See SAP note 1080608.
    Have a discussion with your BASIS team.
    Regards
    Indranil

  • OT - removing duplicates in lists

    hi
    sorry for the ot question.
    i created a very long list of keywords (for seo).
    anyone knows of a tool to remove duplicates? it's really a
    very long list
    thanks
    lenny

    hi
    thank you all
    lenny
    "Walt F. Schaefer" <[email protected]> wrote in
    message
    news:g80fba$9ng$[email protected]..
    > If the list is that long you have at least 10X too many
    words for a single
    > site. You cannot successfully optimize for a gazillion
    keywords. It's all
    > about keyword density.
    >
    > --
    >
    > Walt
    >
    >
    > "lenny" <[email protected]> wrote in message
    > news:g7vot2$h1u$[email protected]..
    >> hi murray
    >>
    >> i need the list for adwords, for creating content
    etc.
    >>
    >> thanks
    >>
    >> lenny
    >>
    http://www.big-t-shirts.com
    >>
    >>
    >>
    >> "Murray *ACE*"
    <[email protected]> wrote in message
    >> news:g7vn9p$fd3$[email protected]..
    >>>> i created a very long list of keywords (for
    seo).
    >>>
    >>> This is not a worthwhile thing to do. The
    keywords meta is ignored by
    >>> the major search engines.
    >>>
    >>> --
    >>> Murray --- ICQ 71997575
    >>> Adobe Community Expert
    >>> (If you *MUST* email me, don't LAUGH when you do
    so!)
    >>> ==================
    >>>
    http://www.projectseven.com/go
    - DW FAQs, Tutorials & Resources
    >>>
    http://www.dwfaq.com - DW FAQs,
    Tutorials & Resources
    >>> ==================
    >>>
    >>>
    >>> "lenny" <[email protected]> wrote in
    message
    >>> news:g7vmt6$f1t$[email protected]..
    >>>> hi
    >>>>
    >>>> sorry for the ot question.
    >>>>
    >>>> i created a very long list of keywords (for
    seo).
    >>>>
    >>>> anyone knows of a tool to remove duplicates?
    it's really a very long
    >>>> list
    >>>>
    >>>> thanks
    >>>>
    >>>> lenny
    >>>>
    >>>
    >>
    >>
    >
    >

  • New Profiles for list Object Overview of COOIS

    Dear guru,
    I run COOIS for list PPIOA000 (Object Overview) with profile 000000000001 (Standard Profile) for production orders.
    In the object selection view the system proposes, for every object (operations, components, ...), a profile 000001.
    I have tried in customizing but I haven't found where I can add a new profile for this list.
    Thanks in advance.

    Dear Cristiano,
    Go to transaction COOIS and double-click the overall profile 000000000001 (Standard Profile).
    The "Order Information System: Overall Profile" detail screen appears.
    Double-click the profile 000001 against order headers.
    The "Order Information System: Object Profiles Overview" screen appears.
    Add new entries so they appear in the object selection screen.
    Note: if you drill down further you can also change the fields to display.
    Regards
    Soundararajan M

  • Open table/index node yields duplicates for partitioned objects

    The generated query for this operation (extracted from the log) is:
    select distinct t.table_name, t.num_rows, ao.status,
      (select count(*) from sys.Dba_tab_columns c where t.table_name = c.table_name and t.owner = c.owner) columns,
      (select comments from sys.Dba_tab_comments c where t.table_name = c.table_name and t.owner = c.owner) comments,
      (select count(*) from sys.Dba_indexes i where t.table_name = i.table_name and t.owner = i.table_owner) indexed_columns,
      t.avg_row_len,
      t.tablespace_name,
      t.avg_row_len * t.num_rows estimated_size,
      ao.last_ddl_time,
      ao.created,
      (select count(*) from sys.Dba_dependencies where referenced_owner = t.owner and referenced_name = t.table_name) referenced_objects,
      (select count(*) from sys.Dba_triggers r where r.table_owner = t.owner and r.table_name = t.table_name) trigs
    from sys.Dba_tables t, sys.Dba_objects ao
    where ao.owner = :OBJECT_OWNER
      and ao.owner = t.owner and ao.object_name = t.table_name
      and ao.generated = 'N'
      and ao.object_name not in (select object_name from recyclebin)
      and ao.object_name not in (select mview_name from sys.Dba_mviews where :OBJECT_OWNER = owner)
      and ao.object_name not in (select queue_table from Dba_queue_tables where :OBJECT_OWNER = owner)
      and not (   ao.object_name like 'AQ$_%_G'
               or ao.object_name like 'AQ$_%_H'
               or ao.object_name like 'AQ$_%_I'
               or ao.object_name like 'AQ$_%_S'
               or ao.object_name like 'AQ$_%_T' )
      and ( user = :OBJECT_OWNER or not ao.object_name like 'BIN$%' ) -- user != :SCHEMA --> object_name not like 'BIN$%'
      -- RECYCLEBIN is USER_RECYCLEBIN!
    order by 1
    Trouble is that joining dba_tables to dba_objects produces duplicates for partitioned tables.
    The same is true for the index node.
    Should I file a bug, or is it already picked up ?
    Cheers,
    Olivier.

    No answer after 8 days -> Reactivation.
    Tx.
    Olivier.

  • Having trouble removing duplicates from list

    Hello,
    I am struggling with this.
    I have an ArrayList, and as I am adding stuff to it I want to make sure I am adding only the unique stuff. I am adding objects of class paper to the list. The paper class has an attribute called color, so I have overridden the equals method of the paper class like this:
    public boolean equals(Object other) {
        if (!(other instanceof paper))
            return false;
        paper detTo = (paper) other;
        return detTo.getColor().equalsIgnoreCase(this.getColor());
    }
    While I am adding stuff to the list I am doing this:
    List onlyColor = new ArrayList();
    while (li.hasNext()) {
        paper p1 = (paper) li.next();
        if (!onlyColor.equals(p1)) { // only add when equals returns false?
            onlyColor.add(p1);
            System.out.println("Adding: " + p1.getColor());
        }
    }
    But it is still adding duplicate stuff to the list (onlyColor). What am I doing wrong? I heard that I should override hashCode as well, but how can I do it? Can anyone help? Also, does it matter that class paper extends another class called utilities, and that getColor() is actually in the utilities class?
    Thanks! Please help me solve this :(
    Message was edited by:
    Omnipresent
    Also, if I change the equals method to always return true by default:
    public boolean equals(Object other) {
        if (!(other instanceof paper))
            return false;
        paper detTo = (paper) other;
        return true;
    }
    and then change if (!onlyColor.equals(p1)) to if (onlyColor.equals(p1)), nothing is added to the list! I am suspicious whether my overridden equals method is even called!?

    > I am planning to get this book:
    > http://www.amazon.com/Certified-Programmer-310-055-Certification-Guides/dp/0072253606
    > Have you come across this one before? If I go through it, do you think I'll be up to date with Java 5?
    That's the goal of that book: getting you ready for a Java 5 certification. Personally I'd start with a decent Java book like this one: http://www.sun.com/books/catalog/gosling_JPL4.xml and perhaps a data structures and algorithms book.
    After that, buy yourself a certification book to see what's important to know when taking an exam.
    A certification book will not make you a good programmer. Don't get me wrong: a decent Java book won't make you a good programmer overnight either, but it's (IMO) a better basis for learning the language than a certification book.

  • How to detect duplicates for Custom Object 1

    Hi expert,
    Are there any fields in a custom object that can be used to detect duplicates if those fields match?
    We thought it should be "Name", but it's not.
    Thanks, Sab.

    Sab, field validation is not going to check for uniqueness of the custom object name. However, you could use field validation to add a value to the custom object name which could make it unique (such as name + rowID or name + created timestamp).

  • Search for records in the event viewer after the last run (not the entire event log), remove duplicates - output the logon type for users in a specific OU

    Hi,
    The following code works perfectly for me and gives me a list of users for a specific OU and their respective logon types:
    $logFile = 'c:\test\test.txt'
    $_myOU = "OU=ABC,dc=contosso,DC=com"
    # LogonType as per TechNet
    $_logontype = @{
        2  = "Interactive"
        3  = "Network"
        4  = "Batch"
        5  = "Service"
        7  = "Unlock"
        8  = "NetworkCleartext"
        9  = "NewCredentials"
        10 = "RemoteInteractive"
        11 = "CachedInteractive"
    }
    Get-WinEvent -FilterXml "<QueryList><Query Id=""0"" Path=""Security""><Select Path=""Security"">*[System[(EventID=4624)]]</Select><Suppress Path=""Security"">*[EventData[Data[@Name=""SubjectLogonId""]=""0x0"" or Data[@Name=""TargetDomainName""]=""NT AUTHORITY"" or Data[@Name=""TargetDomainName""]=""Window Manager""]]</Suppress></Query></QueryList>" -ComputerName "XYZ" | ForEach-Object {
        # TargetUserSid -> distinguished name of the account
        $_cur_OU = ([ADSI]"LDAP://<SID=$(($_.Properties[4]).Value.Value)>").distinguishedName
        If ( $_cur_OU -like "*$_myOU" ) {
            $_cur_OU
            # LogonType
            $_logontype[ [int] $_.Properties[8].Value ]
            # Time created
            $_.TimeCreated
            # Additional event data field
            $_.Properties[18].Value
        }
    } >> $logFile
    I am able to pipe the results to a file; however, I would like to convert it to CSV/HTML. When I try the ConvertTo-Html function it converts certain values. Also:
    a) I would like to remove duplicate entries when the script runs, for that execution only.
    b) When the script is run, we should be able to search only for records after the last run, and not search the same records that we have looked into before.
    PLEASE HELP!

    If you just want to look for the new events since the last run, I suggest recording the EventRecordID of the last event you parsed and using it as a reference in your filter. For example:
    <QueryList>
      <Query Id="0" Path="Security">
        <Select Path="Security">*[System[(EventID=4624 and
    EventRecordID>46452302)]]</Select>
        <Suppress Path="Security">*[EventData[Data[@Name="SubjectLogonId"]="0x0" or Data[@Name="TargetDomainName"]="NT AUTHORITY" or Data[@Name="TargetDomainName"]="Window Manager"]]</Suppress>
      </Query>
    </QueryList>
    That's the logic that the Server Manager of Windows Server 2012 uses to save time, CPU and bandwidth. The problem is how to get that number and provide it to your next run. You can store it in a file and read it at the beginning; if it is not found, you can go through the whole event list.
    Let's say you store it in a simple text file, ref.txt
    1234
    At the beginning just read it.
    Try {
        $_intMyRef = [int] (Get-Content .\ref.txt)
    }
    Catch {
        Write-Host "The reference EventRecordID cannot be found." -ForegroundColor Red
        $_intMyRef = 0
    }
    This is a very lazy check; you could do proper parsing, etc. It's a quick and dirty way: if I can read the value and parse it as an integer, I use it; otherwise I set it to 0, meaning I'll collect all the events.
    Then include it in your filter. Your Get-WinEvent becomes:
    Get-WinEvent -FilterXml "<QueryList><Query Id=""0"" Path=""Security""><Select Path=""Security"">*[System[(EventID=4624 and EventRecordID&gt;$_intMyRef)]]</Select><Suppress Path=""Security"">*[EventData[Data[@Name=""SubjectLogonId""]=""0x0"" or Data[@Name=""TargetDomainName""]=""NT AUTHORITY"" or Data[@Name=""TargetDomainName""]=""Window Manager""]]</Suppress></Query></QueryList>"
    At the end of your script, store the last value you got into your ref.txt file. You can, for example, capture that value in the loop, like:
    $Result += $LogonRecord
    $_intLastId = $Event.RecordId
    And at the end:
    Write-Output $_intLastId | Out-File .\ref.txt
    Then the next time you run it, it only scans the delta. Note that I prefer this over the date filter, in case the machine wasn't active for long or in case of a time sync issue, which can sometimes mess up date-based filters.
    If you want to go for date filtering, do it at the Get-WinEvent level, not in the Where-Object. If the query is local it doesn't change much, but on a remote system the filter is applied on the remote side, so you're saving time and resources on your side. So, for example, for the last 30 days, if you want to use the -FilterXml parameter, you can use:
    <QueryList>
    <Query Id="0" Path="Security">
    <Select Path="Security">*[System[TimeCreated[timediff(@SystemTime) &lt;= 2592000000]]]</Select>
    </Query>
    </QueryList>
    Then you can combine it, etc...
    PS: I used the confusing underscores because I like them ;)
    Note: Posts are provided “AS IS” without warranty of any kind, either expressed or implied, including but not limited to the implied warranties of merchantability and/or fitness for a particular purpose.

  • Removing duplicate values from selectOneChoice bound to List Iterator

    I'm trying to remove duplicate values from a selectOneChoice that i have. The component binds back to a List Iterator on the pageDefinition.
    I have a table on a JSF page with 5 columns; the table is bound to a method iterator on the pageDef. Then above the table there are 5 separate selectOneChoice components, each of which is bound to the result set of the table's iterator. This means that each selectOneChoice only contains values corresponding to the column of the table which it represents.
    The selectOneChoice components are part of a search facility and allow the user to select values from them and restrict the results that are returned. The concept is fine and it works. However, if I have repeating values in a selectOneChoice (which is inevitable given it's bound to the table column result set), then I need to remove them. I can remove null values or empty strings using expression language in the rendered attribute, as shown:
    <af:forEach var="item"
    items="#{bindings.XXXX.items}">
    <af:selectItem label="#{item.label}" value="#{item.label}"
    rendered="#{item.label != ''}"/>
    </af:forEach>
    But I don't know how I can remove duplicate values easily. I know I can do it programmatically in a backing bean etc., but I want to know if there is perhaps some EL that might do it, or another setting that ADF provides which can overcome this.
    Any help would be appreciated.
    Kind Regards

    Hi,
    It'll be a little difficult to remove duplicates while keeping the context as it is with the existing standard functions. Removing duplicates irrespective of context changes can be done with the available functions. Please try this UDF code, which may help you:
    source > sort > UDF --> target
    The execution type of the UDF is "All values of a context".
    public void UDF(String[] var1, ResultList result, Container container) throws StreamTransformationException {
        ArrayList aList = new ArrayList();
        // Always keep and emit the first value.
        aList.add(var1[0]);
        result.addValue(var1[0]);
        for (int i = 1; i < var1.length; i++) {
            // Skip values that were already emitted; otherwise remember and emit them.
            if (aList.contains(var1[i]))
                continue;
            else {
                aList.add(var1[i]);
                result.addValue(var1[i]);
            }
        }
    }
    Regards,
    Priyanka

  • Help needed in removing duplicate items of a list box in Java

    How do I remove duplicate items from a list box while dynamically inserting them (on-click event)?
    It is not identifying duplicate data.
    The variable name is HP_G1.
    HP_dmg1 = (DefaultListModel) HP_G1.getModel();
    int a = HP_G1.getModel().getSize();
    System.out.println("HP list no--------> " + a);
    if (a != 0) {
        for (int j = 0; j < a; j++) {
            String item1 = String.valueOf(HP_List.getModel().getElementAt(j));
            System.out.println("HP list added--------> " + item1);
            if (HP_dmg1.equals(item1)) {
                HP_dmg1.remove(j);
            } else {
                HP_dmg1.addElement(GPL);
            }
        }
    }

    Your code is unreadable, so I'll ignore it. In the future please press the message editor's CODE button to format code.
    As to your problem: to the point, you normally use a Set instead of a List when you don't want duplicates in a collection.
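    The same principle applies to the C# question at the top of this page: a set-like collection with the right equality check rejects duplicates as they are added, rather than cleaning them up afterwards. A minimal sketch in C# (the Paper class, its Color property and the comparer are assumptions made up for illustration):
    using System;
    using System.Collections.Generic;
    class Paper
    {
        public string Color { get; set; }
    }
    // Case-insensitive equality on Color, mirroring an equalsIgnoreCase-style check.
    class PaperColorComparer : IEqualityComparer<Paper>
    {
        public bool Equals(Paper x, Paper y) =>
            string.Equals(x?.Color, y?.Color, StringComparison.OrdinalIgnoreCase);
        public int GetHashCode(Paper obj) =>
            obj?.Color == null ? 0 : obj.Color.ToUpperInvariant().GetHashCode();
    }
    class Program
    {
        static void Main()
        {
            var onlyColor = new HashSet<Paper>(new PaperColorComparer());
            foreach (var p in new[] { new Paper { Color = "Red" },
                                      new Paper { Color = "red" },
                                      new Paper { Color = "Blue" } })
            {
                // Add returns false when an equal item is already in the set.
                if (onlyColor.Add(p))
                    Console.WriteLine("Adding: " + p.Color);
            }
            // Output: Adding: Red, then Adding: Blue
        }
    }
    In Java the equivalent is a HashSet whose element type overrides equals and hashCode consistently.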

  • How to remove duplicate elements in the List?

    It seems to be very basic, but it's not working, even though I have tried it different ways.
    This is my list: [10, 10, 11, 11, 12, 12, 13, 13, 14, 14];
    now I want to remove the duplicate elements so that at the end I have [10, 11, 12, 13, 14].
    My code seems to be perfect, but...
    for (int i = 0; i < listA.size(); i++) {
        if (i % 2 == 0) {
            System.out.println("ce i est un nombre pair " + i); // "i is an even number"
            listA.remove(i);
        }
    }
    System.out.println(listA);

    senore100 wrote:
    > The problem is that every single time an element is removed, the whole ArrayList is re-shuffled, with all the elements to the right moved one spot to the left. That's why.
    Yes, that's right. However, if you had used an Iterator over the list, you could easily have removed every other element. It's only when you use an array index that you run into this (very common) problem.

  • Duplicate check for Connection Objects in CRM

    Hi,
    I want to implement a duplicate check for connection objects in CRM. The duplicate check shall use the address of the connection object and perhaps additional attributes. I have found a duplicate check for business partners using the basis address service and TREX as index pool. However, I couldn't find similar functionality for connection objects. Does somebody know:
    1) How to implement a duplicate check for connection objects which is based on the address of the connection object?
    2) Is there a way to use the TREX-based duplicate check which is integrated into the basis address service for connection objects?
    Thanks in advance!
    Best regards,
    Frank

    I also have a requirement to check for duplicate addresses at the connection object level, and we are using SAP Data Services to validate the address. Can anyone share their experience?

  • HT2905 I have just followed all the instructions in support to remove duplicates from my library but now most of my music is gone except for my recent purchases. How come and how do I fix it?

    I have just followed all the instructions in support to remove duplicates from my library, but now most of my music is gone except for my recent purchases. How come, and how do I fix it?

    Final Cut is a separate, higher end video editor.  The pro version of iMovie.
    Give iPhoto a look for creating the slideshow. It's easy to assemble the photos in an album in iPhoto, put them in the order you want and then make a slideshow of them. You can select from various themes and transitions between slides and add music from your iTunes library.
    When you have the slideshow as you want, use the Export button at the bottom of the iPhoto window and export with Size = Medium or Large.
    Save the resulting Quicktime movie file in your Movies folder.
    Next, open iDVD, choose your theme and drag the QT movie file into the menu window being careful to avoid any drop zones.
    Then follow this workflow to help assure the best quality video DVD:
    Once you have the project as you want it save it as a disk image via the File ➙ Save as Disk Image  menu option. This will separate the encoding process from the burn process. 
    To check the encoding mount the disk image, launch DVD Player and play it.  If it plays OK with DVD Player the encoding is good.
    Then burn to disk with Disk Utility or Toast at the slowest speed available (2x-4x) to assure the best burn quality.  Always use top quality media:  Verbatim, Maxell or Taiyo Yuden DVD-R are the most recommended in these forums.
    The reason I suggest iPhoto is that I find it much easier to use than iMovie (except for the older iMovie 6 HD version).  Personal preferences showing here.

  • Strange behaviour of Removal of Alpha for Info object in Quality system

    Strange behaviour of removal of Alpha conversion for an info object in the quality system as compared to the development system.
    Hi,
    The data for an info object key in the DSO was 00000000000000000000000000123. I removed the Alpha conversion for the info object and the data was corrected to 123 in the DSO in the development system.
    Now, when I transported the info object without Alpha to quality and loaded data into the DSO, the data is still the same, with leading zeros.
    I don't want to write a routine to remove leading zeros, as I also have values of 0; if I write a routine, all the zeros would be removed and those values would be blank.
    Both development and quality have the same patches applied and are on the same level.
    Why this strange behaviour in the quality system?
    Any inputs? Please suggest.
    Thanks.
    Lavanya

    Hi,
      Did you drop and reload the data after changing the conversion?
    Regards,
    Raghavendra.

  • HT2905 How to remove duplicate songs from the iPhone when the duplicates are not listed in the iTunes library on the computer.

    How can I remove duplicate songs from the iPhone4. The duplicates are not listed in the computer iTunes library.

    The simplest thing may be to back up and then restore the iPhone. If you don't fancy that, then remove all media that is currently synced to it, then double-check and manually remove any media that remains on the device, before reselecting your sync options.
    If you've been manually managing the device and your library doesn't contain copies of all your media then see Recover your iTunes library from your iPod or iOS device.
    tt2

Maybe you are looking for

  • BC Service Interface deployment issue (bc4j.xcfg is not found in the class)

    When I invoke my WS method I receive this formatted output: <env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/"> <env:Header/> <env:Body> <env:Fault> <faultcode>env:Server</faultcode> <faultstring>JBO-29000: Unexpected exception caugh

  • 2.2 Update Issue: Browser hangs

    My previous problem was the browser frequently crashed on even simple websites. Well, true to Apple's promise, they fixed that problem by replacing it with another - the browser just hangs in either Wifi mode or 3G/EDGE mode. I've "rebooted" many tim

  • Table API Extension for Oracle SQL Developer

    I just created small project [Table API Generator for Oracle|http://code.google.com/p/tapig/]. Idea is to only maintain tables and generate table API (TAPI) packages for data manipulation. Generated: - insert, update, delete, querying procedures - do

  • VCast Media Manager - Upload Button

    So I ordered VCast Media Manager so I don't need to bring a cable with me everywhere and so I can store files remotely and access them with my phone automatically when I need them...great and fun. I ordered the server, configured the service, and end

  • Please help- PSE12 keeps crashing

    When I go to use the text tool it crashes - I have an iMac.  Everything is up to date.  Please help me figure out the issue?!?!