Find duplicate values in a Vector?

Hi,
Can somebody help me, please?
I have a small problem that I can't solve :-(
I have a vector with a lot of values.
The question is: how can I find the duplicated values?
Vector VNumber = new Vector();
VNumber.addElement(Bknm);   /* Bknm -> values read out of a file */
for (int j = 0; j < VNumber.size(); j++) {
    System.out.println("Bknm = " + VNumber.elementAt(j));
    /* code to find duplicate values goes here? */
}

It's much easier to check items as they go in, rather than parsing the Vector and finding dupes afterwards. (If the list gets long, a java.util.HashSet would do the contains check in constant time, but the idea is the same.)
Vector VNumber = new Vector();
while (true) {
    // Read the next Bknm object here
    if (!VNumber.contains(Bknm)) {   // contains() does a linear scan
        VNumber.add(Bknm);
    }
}

Similar Messages

Finding duplicate values in a column with different values in a different column

    I'm finding duplicate values in a column of my table:
    select ba.CLAIM_NUMBER from bsi_auto_vw ba
    group by ba.CLAIM_NUMBER
    having count(*) > 1;
How can I modify this query to find duplicate values in that column only where I DON'T have duplicate values in another specific column of that table (the column CLMT_NO in my case can't have duplicates)?

Well, you can use analytics, assuming you don't mind full-scanning the table:
select
   count(owner) over (partition by object_type),
   count(object_type) over (partition by owner)
from all_objects;
You just need to find your "window" (the partition clause) to sort the data over, then make use of it (the analytics would sit in a nested SQL statement which you'd then evaluate on the outside); I didn't choose the window intelligently here.
Should be applicable if I understand what you're after.
If you post some sample data we can mock up a SQL statement for you (in case what I tried to convey wasn't understandable).
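If analytics turn out to be overkill, a plain aggregate may already express this; a minimal sketch against the poster's table, assuming "no duplicates in CLMT_NO" means the CLMT_NO values within each repeated claim number must all differ:
select ba.CLAIM_NUMBER
from bsi_auto_vw ba
group by ba.CLAIM_NUMBER
having count(*) > 1
   and count(distinct ba.CLMT_NO) = count(*);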

  • Find duplicate values in a table based on combination of two columns.

This is in a table which has millions of records.
I have a Table B under Schema A, and I have to find the duplicate records in it. My Table B uses a combination of columns to identify unique values.
The scenario is something like this:
One SV_Id can have multiple SV_Details_Id values, and I have to find out if there are any duplicates in the table. (I have to use the combination of columns to identify unique values; there is no primary key, foreign key or unique key on the table.)
I then wrote the following query:
select SV_Id, SV_Details_Id, count(*) from SchemaA.TableB group by SV_Id, SV_Details_Id having count(*) > 1;
Is it correct? After firing the above query, it returns rows which don't match the duplicates.
Kindly help me.
Looking forward to some guidance.
Thanks in advance.

What is the desired output?
Do you want to see only the unique records?
Or do you want to see the duplicate records?
Giving an example will make it easier to provide the required query.
BTW, here is a simple query which is used to delete the duplicate records and keep the unique records in the table:
    DELETE
      FROM table_name     a
    WHERE EXISTS (SELECT 1
                     FROM table_name      b
                    WHERE a.sv_id         = b.sv_id
                      AND a.sv_detail_id  = b.sv_detail_id
                      AND a.ROWID         > b.ROWID
               );
Regards
    Arun
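If the goal is to see the full duplicate rows rather than just the key combinations, an analytic count does it in one scan; a sketch using the same assumed table and columns as above:
select *
from (select b.*,
             count(*) over (partition by sv_id, sv_details_id) as dup_count
      from SchemaA.TableB b)
where dup_count > 1;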

How can I find duplicate rows in a field of a table

Hi all,
How can I find duplicate values in a field of a table?
Regards

    test@ORA92>
    test@ORA92> with x as (
      2    select 1 as p, 'a' as q from dual union all
      3    select 1 as p, 'b' as q from dual union all
      4    select 2 as p, 'c' as q from dual union all
      5    select 3 as p, 'd' as q from dual union all
      6    select 3 as p, 'e' as q from dual union all
      7    select 3 as p, 'd' as q from dual
      8  )
      9  select p, q,
    10  case row_number() over (partition by p order by q)
    11    when 1 then null
    12    else 'Duplicate value: '||p||' of column p'
    13  end as msg
    14  from x;
             P Q MSG
             1 a
             1 b Duplicate value: 1 of column p
             2 c
             3 d
             3 d Duplicate value: 3 of column p
             3 e Duplicate value: 3 of column p
    6 rows selected.
    test@ORA92>
    test@ORA92>
test@ORA92>
cheers,
    pratz
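For the common single-column case, a plain aggregate is usually all that is needed; a minimal sketch in which your_table and your_column are placeholders:
select your_column, count(*) as occurrences
from your_table
group by your_column
having count(*) > 1;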

How to find out the duplicate values from each column

I have the four columns below.
How can I find the duplicate values across the columns?
with All_files as (
  select '1000' as INVOICE, '2000' AS DELIVERYNOTE, '3000' CANDELINVOICE, '4000' CANDELIVERYNOTE from dual union all
  select '5000','6000','7000','8000' from dual union all
  select '9000','1000','1100','1200' from dual union all
  select '1200','3400','6700','8790' from dual union all
  select '1000','2000','3000','9000' from dual union all
  select '1230','2340','3450','4560' from dual
)
SELECT * FROM All_files
The output should be as below:
    1000 2000 3000 4000
    9000 1000 1100 1200
    1200 3400 6700 8790
    1000 2000 3000 9000
I need to check uniqueness across the columns.
Thanks.

    Try this (sorry about the formatting)...
    WITH all_files AS (SELECT   '1000' AS INVOICE,
                                '2000' AS DELIVERYNOTE,
                                '3000' CANDELINVOICE,
                                '4000' CANDELIVERYNOTE
                         FROM   DUAL
                       UNION ALL
                       SELECT   '5000',
                                '6000',
                                '7000',
                                '8000'
                         FROM   DUAL
                       UNION ALL
                       SELECT   '9000',
                                '1000',
                                '1100',
                                '1200'
                         FROM   DUAL
                       UNION ALL
                       SELECT   '1200',
                                '3400',
                                '6700',
                                '8790'
                         FROM   DUAL
                       UNION ALL
                       SELECT   '1000',
                                '2000',
                                '3000',
                                '9000'
                         FROM   DUAL
                       UNION ALL
                       SELECT   '1230',
                                '2340',
                                '3450',
                                '4560'
                         FROM   DUAL),
        t_base
           AS (SELECT      invoice
                        || ','
                        || deliverynote
                        || ','
                        || candelinvoice
                        || ','
                        || candeliverynote
                           str
                 FROM   all_files),
        t_str
           AS (SELECT   str || ',' AS str,
                        (LENGTH (str) - LENGTH (REPLACE (str, ','))) + 1
                           AS no_of_elements
                 FROM   t_base),
        t_n_rows
           AS (    SELECT   LEVEL AS i
                     FROM   DUAL
               CONNECT BY   LEVEL <=
                               (    SELECT   SUM (no_of_elements) FROM t_str)),
        t_build AS (SELECT   t_str.str,
                             nt.i AS element_no,
                         INSTR (t_str.str,
                                ',',
                                DECODE (nt.i, 1, 0, 1),
                                DECODE (nt.i, 1, 1, nt.i - 1))
                         + 1
                            AS start_pos,
                         INSTR (t_str.str,
                                ',',
                                1,
                                DECODE (nt.i, 1, 1, nt.i))
                            AS next_pos
                      FROM      t_str
                             JOIN
                                t_n_rows nt
                             ON nt.i <= t_str.no_of_elements),
        t_build2
           AS (SELECT   RTRIM (str, ',') AS original_string,
                        SUBSTR (str, start_pos, (next_pos - start_pos))
                           AS single_element,
                        element_no
                 FROM   t_build),
        t_build3
           AS (SELECT   single_element,
                        COUNT( * )
                           OVER (PARTITION BY single_element
                                 ORDER BY single_element)
                           ele_count
                 FROM   t_build2)
    SELECT   DISTINCT INVOICE,
                      DELIVERYNOTE,
                      CANDELINVOICE,
                      CANDELIVERYNOTE
      FROM   all_files, t_build3
    WHERE   ele_count > 1
             AND (   INVOICE = single_element
                  OR DELIVERYNOTE = single_element
                  OR CANDELINVOICE = single_element
                  OR CANDELIVERYNOTE = single_element)
I think this will be faster than the previous solution?
    Cheers
    Ben
    Edited by: Munky on Feb 17, 2011 2:11 PM - "I think this will be faster than the previous solution?", nope - it's not :(
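For what it's worth, on 11g and later an UNPIVOT can replace the whole tokenise-and-count machinery; an untested sketch against the same all_files data, with the column list assumed from the question:
select distinct invoice, deliverynote, candelinvoice, candeliverynote
from all_files a
where exists (select 1
              from all_files
                   unpivot (val for col in (invoice, deliverynote,
                                            candelinvoice, candeliverynote)) u
              where u.val in (a.invoice, a.deliverynote,
                              a.candelinvoice, a.candeliverynote)
              group by u.val
              having count(*) > 1);
The inner query counts how often each value occurs anywhere in the grid, and the outer query keeps the rows that contain at least one value seen more than once.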

  • Avoiding null and duplicate values using model clause

    Hi,
I am trying to use the model clause to get a comma-separated list of data; the following is the scenario:
    testuser>select * from test1;
    ID VALUE
    1 Value1
    2 Value2
    3 Value3
    4 Value4
    5 Value4
    6
    7 value5
    8
    8 rows selected.
    the query I have is:
    testuser>with src as (
    2 select distinct id,value
    3 from test1
    4 ),
    5 t as (
    6 select distinct substr(value,2) value
    7 from src
    8 model
    9 ignore nav
    10 dimension by (id)
    11 measures (cast(value as varchar2(100)) value)
    12 rules
    13 (
    14 value[any] order by id =
    15 value[cv()-1] || ',' || value[cv()]
    16 )
    17 )
    18 select max(value) oneline
    19 from t;
    ONELINE
    Value1,Value2,Value3,Value4,Value4,,value5,
What I find is that the output has a duplicate value and a null (the ',,') because the data contains nulls and a duplicate value. Is there a way I can avoid the nulls and the duplicate values in the query output?
    thanks,
    Edited by: orausern on Feb 19, 2010 5:05 AM

    Hi,
    Try this code.
with
t as ( select substr(value,2) value, ind
        from test1
        model
        ignore nav
        dimension by (id)
        measures (cast(value as varchar2(100)) value, 0 ind)
        rules
        ( ind[any] = instr(value[cv()-1], value[cv()]),
          value[any] order by id = value[cv()-1] || CASE WHEN value[cv()] IS NOT NULL
                                                     AND ind[cv()] = 0 THEN ',' || value[cv()] END
        )
      )
select max(value) oneline
from t;
    SQL> select * from test1;
            ID VALUE
             1 Value1
             2 Value2
             3 Value3
             4 Value4
             5 Value4
             6
             7 value5
             8
8 rows selected.
    SQL> with
      2   t as ( select substr(value,2)value,ind
      3          from test1
      4          model
      5          ignore nav
      6          dimension by (id)
      7          measures (cast(value as varchar2(100)) value, 0 ind)
      8          rules
      9          ( ind[any]=  instr(value[cv()-1],value[cv()]),
    10          value[any] order by id = value[cv()-1] || CASE WHEN value[cv()] IS NOT NULL
    11                                             and ind[cv()]=0     THEN ',' || value[cv()] END 
    12          )
    13        )
    14   select max(value) oneline
    15   from t;
    ONELINE
    Value1,Value2,Value3,Value4,value5
    SQL>
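On 11gR2 and later, LISTAGG over a DISTINCT inline view gets the same result without the model clause; a minimal sketch against the same test1 table:
select listagg(value, ',') within group (order by value) as oneline
from (select distinct value
      from test1
      where value is not null);
The DISTINCT removes the repeated Value4 and the WHERE clause drops the nulls that produced the ',,'.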

  • Unable to Enforce Unique Values, Duplicate Values Exist

I have a list in SP 2010; it contains roughly 1000 items. I would like to enforce unique values on the title field. I started by cleaning up the list, ensuring that all items already had a unique value. To help with this, I used the Export to Excel action, then highlighted duplicates within Excel. So as far as I can tell, there are no duplicates within that list column.
However, when I try to enable the option to Enforce Unique Values, I receive the error that duplicate values exist within the field and must be removed.
Steps I've taken so far to identify / resolve duplicate values:
- Multiple exports to Excel from an unfiltered list view, then using the highlight-duplicates feature > no duplicates found
- Deleted ALL versions of every item from the list (except current), and ensured they were completely removed by deleting them from both the site and site collection recycle bins
- Using the SP PowerShell console, grabbed all list items and exported all of the "Title" type fields (item object Title, LinkTitle, LinkTitleNoMenu, etc.) to a CSV and ran that through Excel duplicate checking as well
Unless there's some ridiculous hidden field value that MS expects anyone attempting to enforce unique values on a list to know about, I've exhausted everything I can think of that might cause the list to report duplicate values for that field.
While I wait to see if someone else has an idea, I'm also going to see what happens if I wipe the Crawl Index and start it from scratch.
- Jon

First, I created an index for the column in list settings; that works fine whether or not duplicate values exist.
Then I set Enforce Unique Values on the field; after clicking OK, I get the duplicate-values error message.
With SQL Server Profiler, I found the call to proc_CheckIfExistingFieldHasDuplicateValues and its parameters. After reviewing this stored procedure in the content database,
I created the following script in SQL Server Management Studio:
declare @siteid uniqueidentifier
declare @webid uniqueidentifier
declare @listid uniqueidentifier
declare @fieldid uniqueidentifier
set @siteid  = 'F7C40DC9-E5D3-42D7-BE60-09B94FD67BEF'
set @webid   = '17F02240-CE04-4487-B961-0482B30DDA84'
set @listid  = 'B349AF8D-7238-419D-B6C4-D88194A57EA7'
set @fieldid = '195A78AC-FC52-4212-A72B-D03144DC1E24'
SELECT *
FROM TVF_UserData_List(@ListId) AS U1
INNER MERGE JOIN NameValuePair_Latin1_General_CI_AS AS NVP1
        WITH (INDEX = NameValuePair_Latin1_General_CI_AS_MatchUserData)
    ON  NVP1.ListId = @ListId
    AND NVP1.ItemId = U1.tp_Id
    AND ((NVP1.Level = 1 AND U1.tp_DraftOwnerId IS NULL) OR NVP1.Level = 2)
    AND NOT (DATALENGTH(ISNULL(NVP1.Value, N'')) = 0)
    AND U1.tp_Level = NVP1.Level
    AND U1.tp_IsCurrentVersion = CONVERT(bit, 1)
    AND U1.tp_CalculatedVersion = 0
    AND U1.tp_RowOrdinal = 0
INNER MERGE JOIN NameValuePair_Latin1_General_CI_AS AS NVP2
        WITH (INDEX = NameValuePair_Latin1_General_CI_AS_CI)
    ON  NVP2.SiteId = @SiteId
    AND NVP2.ListId = @ListId
    AND NVP2.FieldId = @FieldId
    AND NVP2.Value = NVP1.Value
    AND NVP2.ItemId <> NVP1.ItemId
CROSS APPLY TVF_UserData_ListItemLevelRow(NVP2.ListId, NVP2.ItemId, NVP2.Level, 0) AS U2
WHERE ((NVP2.Level = 1 AND U2.tp_DraftOwnerId IS NULL) OR NVP2.Level = 2)
  AND NOT (DATALENGTH(ISNULL(NVP2.Value, N'')) = 0)
I can find the duplicate list items based on the result returned by the query above.
Note that you need to change the parameter values accordingly, and change the name of the NameValuePair_Latin1_General_CI_AS table based on the last parameter of the proc_CheckIfExistingFieldHasDuplicateValues stored procedure. You can review the code of this stored procedure yourself.
Note that direct operation on the content database in a production environment is not supported, so please do all of this in a test environment.
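Stripped of the SharePoint-specific version filters, the duplicate check that stored procedure performs boils down to a grouped count over the same NameValuePair table; a rough sketch (same parameters as above, and no substitute for the full query):
SELECT NVP.Value, COUNT(*) AS occurrences
FROM NameValuePair_Latin1_General_CI_AS AS NVP
WHERE NVP.ListId = @ListId
  AND NVP.FieldId = @FieldId
GROUP BY NVP.Value
HAVING COUNT(*) > 1;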

  • Duplicate values

Hi,
Please see the table structure below.
-- Create table
create table RN_RPT_FIG
(
  DT         DATE not null,
  RPT_ID     NUMBER(8) not null,
  RPT_ROW_ID NUMBER(4) not null,
  RPT_COL_ID NUMBER(4) not null,
  FIG        NUMBER(20,2)
)
tablespace TS_IRS
  pctfree 10
  initrans 1
  maxtrans 255
  storage
  (
    initial 64K
    minextents 1
    maxextents unlimited
  );
-- Create/Recreate primary, unique and foreign key constraints
alter table RN_RPT_FIG
  add constraint PK_RN_RPT_FIG primary key (DT, RPT_ID, RPT_ROW_ID, RPT_COL_ID)
  using index
  tablespace TS_IRS
  pctfree 10
  initrans 2
  maxtrans 255
  storage
  (
    initial 128K
    minextents 1
    maxextents unlimited
  );
I would like to find any duplicate values that have been entered in this table. How can I write this query, keeping in mind the constraint PK_RN_RPT_FIG?
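Since PK_RN_RPT_FIG already guarantees that no two rows share the same (DT, RPT_ID, RPT_ROW_ID, RPT_COL_ID), the only thing that can repeat is FIG. A sketch that lists FIG values occurring more than once within the same DT and RPT_ID (adjust the GROUP BY to whatever "duplicate" should mean here):
select dt, rpt_id, fig, count(*) as occurrences
from rn_rpt_fig
group by dt, rpt_id, fig
having count(*) > 1;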

I am actually trying to find out which rows those are.
Below is the data in that table:
         DT     RPT_ID     RPT_ROW_ID     RPT_COL_ID     FIG
    1     31-Mar-11     2000101     1     2     8157500.00
    2     31-Mar-11     2000101     1     1     17.00
    3     31-Mar-11     2000101     1     4     530000.00
    4     31-Mar-11     2000101     1     3     5.00
    5     31-Mar-11     2000101     2     2     96500.00
    6     31-Mar-11     2000101     2     1     6.00
    7     31-Mar-11     2000101     3     2     8301000.00
    8     31-Mar-11     2000101     3     1     7.00
    9     31-Mar-11     2000101     3     4     669000.00
    10     31-Mar-11     2000101     3     3     6.00
    11     31-Mar-11     2000101     5     2     25184500.00
    12     31-Mar-11     2000101     5     1     61.00
    13     31-Mar-11     2000101     5     4     22609000.00
    14     31-Mar-11     2000101     5     3     19.00
    15     31-Mar-11     2000101     6     2     14103500.00
    16     31-Mar-11     2000101     6     1     24.00
    17     31-Mar-11     2000101     7     5     9153500.00
    18     31-Mar-11     2000102     1     1     6.00
    19     31-Mar-11     2000102     1     2     664000.00
    20     31-Mar-11     2000102     1     3     1.00
    21     31-Mar-11     2000102     1     4     18500.00
    22     31-Mar-11     2000102     2     1     4.00
    23     31-Mar-11     2000102     2     2     4800000.00
    24     31-Mar-11     2000102     2     3     1.00
    25     31-Mar-11     2000102     2     4     98000.00
    26     31-Mar-11     2000102     3     3     1.00
    27     31-Mar-11     2000102     3     4     98000.00
    28     31-Mar-11     2000102     4     1     18.00
    29     31-Mar-11     2000102     4     2     16476000.00
    30     31-Mar-11     2000102     4     3     10.00
    31     31-Mar-11     2000102     4     4     21257000.00
    32     31-Mar-11     2000102     5     1     9.00
    33     31-Mar-11     2000102     5     2     6069000.00
    34     31-Mar-11     2000102     5     3     6.00
    35     31-Mar-11     2000102     5     4     79000.00
    36     31-Mar-11     2000102     6     1     26.00
    37     31-Mar-11     2000102     6     2     2938000.00
    38     31-Mar-11     2000102     6     3     2.00
    39     31-Mar-11     2000102     6     4     1167500.00
    40     31-Mar-11     2000102     7     1     2.00
    41     31-Mar-11     2000102     7     2     257000.00
    42     31-Mar-11     2000103     1     1     3.00
    43     31-Mar-11     2000103     1     2     169500.00
    44     31-Mar-11     2000103     2     3     3.00
    45     31-Mar-11     2000103     2     4     2266500.00
    46     31-Mar-11     2000103     3     1     4.00
    47     31-Mar-11     2000103     3     2     6400000.00
    48     31-Mar-11     2000103     4     1     17.00
    49     31-Mar-11     2000103     4     2     9369500.00
    50     31-Mar-11     2000103     4     3     8.00
    51     31-Mar-11     2000103     4     4     20149000.00
    52     31-Mar-11     2000103     5     1     25.00
    53     31-Mar-11     2000103     5     2     7307000.00
    54     31-Mar-11     2000103     5     3     3.00
    55     31-Mar-11     2000103     5     4     125500.00
    56     31-Mar-11     2000103     6     1     7.00
    57     31-Mar-11     2000103     6     2     194500.00
    58     31-Mar-11     2000103     6     3     5.00
    59     31-Mar-11     2000103     6     4     68000.00
    60     31-Mar-11     2000103     7     1     2.00
    61     31-Mar-11     2000103     7     2     247000.00
    62     31-Mar-11     2000103     8     1     2.00
    63     31-Mar-11     2000103     8     2     1375000.00
    64     31-Mar-11     2000103     11     1     1.00
    65     31-Mar-11     2000103     11     2     122000.00
    66     31-Mar-11     2000104     3     4     432500.00
    67     31-Mar-11     2000104     3     3     27.00
    68     31-Mar-11     2000104     3     6     115000.00
    69     31-Mar-11     2000104     3     5     8.00
    70     31-Mar-11     2000104     4     4     172000.00
    71     31-Mar-11     2000104     4     3     4.00
    72     31-Mar-11     2000104     4     6     294000.00
    73     31-Mar-11     2000104     4     5     3.00
    74     31-Mar-11     2000104     5     4     7030000.00
    75     31-Mar-11     2000104     5     3     23.00
    76     31-Mar-11     2000104     5     6     1150000.00
    77     31-Mar-11     2000104     5     5     1.00
    78     31-Mar-11     2000104     6     4     17550000.00
    79     31-Mar-11     2000104     6     3     7.00
    80     31-Mar-11     2000104     6     6     21050000.00
    81     31-Mar-11     2000104     6     5     7.00
    82     31-Mar-11     2000105     5     4     664000.00
    83     31-Mar-11     2000105     5     3     6.00
    84     31-Mar-11     2000105     5     6     18500.00
    85     31-Mar-11     2000105     5     5     1.00
    86     31-Mar-11     2000105     6     4     4800000.00
    87     31-Mar-11     2000105     6     3     4.00
    88     31-Mar-11     2000105     7     6     98000.00
    89     31-Mar-11     2000105     7     5     1.00
    90     31-Mar-11     2000105     8     4     16525500.00
    91     31-Mar-11     2000105     8     3     23.00
    92     31-Mar-11     2000105     8     6     21325000.00
    93     31-Mar-11     2000105     8     5     15.00
    94     31-Mar-11     2000105     9     4     2938000.00
    95     31-Mar-11     2000105     9     3     26.00
    96     31-Mar-11     2000105     9     6     1167500.00
    97     31-Mar-11     2000105     9     5     2.00
    98     31-Mar-11     2000105     10     4     257000.00
    99     31-Mar-11     2000105     10     3     2.00
    100     31-Mar-11     4000100     3     3     0.00
    101     31-Mar-11     4000100     3     4     18.00
    102     31-Mar-11     4000100     3     5     18.00
    103     31-Mar-11     4000100     4     3     0.00
    104     31-Mar-11     4000100     4     4     7.00
    105     31-Mar-11     4000100     4     5     7.00
    106     31-Mar-11     4000100     5     3     0.00
    107     31-Mar-11     4000100     5     4     10.00
    108     31-Mar-11     4000100     5     5     10.00
    109     31-Mar-11     4000100     6     4     8.00
    110     31-Mar-11     4000100     6     5     8.00
    111     31-Mar-11     4000100     7     4     27.00
    112     31-Mar-11     4000100     7     5     27.00
    113     31-Mar-11     4000100     8     4     41.00
    114     31-Mar-11     4000100     8     5     41.00
    115     31-Mar-11     4000100     9     3     0.00
    116     31-Mar-11     4000100     9     4     4.00
    117     31-Mar-11     4000100     9     5     4.00
    118     31-Mar-11     4000100     10     3     0.00
    119     31-Mar-11     4000100     10     4     2.00
    120     31-Mar-11     4000100     10     5     2.00
    121     31-Mar-11     4000100     11     3     0.00
    122     31-Mar-11     4000100     11     4     0.00
    123     31-Mar-11     4000100     11     5     0.00
    124     31-Mar-11     4000100     12     3     0.00
    125     31-Mar-11     4000100     12     4     4.00
    126     31-Mar-11     4000100     12     5     4.00
    127     31-Mar-11     4000100     13     3     0.00
    128     31-Mar-11     4000100     13     4     2.00
    129     31-Mar-11     4000100     13     5     2.00
    130     31-Mar-11     4000100     14     3     0.00
    131     31-Mar-11     4000100     14     4     0.00
    132     31-Mar-11     4000100     14     5     0.00
    133     31-Mar-11     4000100     15     3     0.00
    134     31-Mar-11     4000100     15     4     6.00
    135     31-Mar-11     4000100     15     5     6.00
    136     31-Mar-11     4000100     16     5     0.00
    137     31-Mar-11     4000100     17     5     0.00
    138     31-Mar-11     4000100     18     5     0.00
    139     31-Mar-11     4000100     19     5     212.81
    140     31-Mar-11     4000100     20     5     0.00
    141     31-Mar-11     4000100     21     5     39.20
    142     31-Mar-11     4000100     22     5     0.00
    143     31-Mar-11     4000100     23     5     0.00
    144     31-Mar-11     4000100     26     5     0.00
    145     31-Mar-11     4000100     27     5     92.96
    146     31-Mar-11     4000100     28     5     21.07
    147     31-Mar-11     4000100     29     5     0.00
    148     31-Mar-11     4000100     30     5     44.24
    149     31-Mar-11     4000100     31     5     10.09
    150     31-Mar-11     5000100     3     3     57700.00
    151     31-Mar-11     5000100     4     3     137900.00
    152     31-Mar-11     5000100     5     3     41700.00
    153     31-Mar-11     5000100     6     3     20900.00
    154     31-Mar-11     5000100     7     3     32800.00
    155     31-Mar-11     5000100     8     3     188100.00
    156     31-Mar-11     5000100     9     3     372730.00
    157     31-Mar-11     5000100     10     3     63100.00
    158     31-Mar-11     5000100     11     3     126200.00
    159     31-Mar-11     5000100     12     3     65100.00
    160     31-Mar-11     5000100     13     3     157200.00
    161     31-Mar-11     5000100     15     3     61100.00
    162     31-Mar-11     5000100     16     3     38900.00
    163     31-Mar-11     5000100     17     3     208500.00
    164     31-Mar-11     5000100     18     3     1167700.00
    165     31-Mar-11     5000100     19     3     67100.00
    166     31-Mar-11     5000100     20     3     68100.00
    167     31-Mar-11     5000100     21     3     81100.00
    168     31-Mar-11     5000100     22     3     82100.00
    169     31-Mar-11     5000100     24     3     90500.00
    170     31-Mar-11     5000100     25     3     20800.00
    171     31-Aug-11     3000107     3     4     1000.00
    172     31-Aug-11     3000107     3     3     1.00
    173     31-Aug-11     3000107     3     6     1000.00
    174     31-Aug-11     3000107     3     5     1.00
    175     31-Aug-11     3000108     8     3     4.00
    176     31-Aug-11     3000108     8     4     45100.00
    177     31-Aug-11     3000108     15     3     4.00
    178     31-Aug-11     3000108     15     4     60000.00
    179     31-Aug-11     3000108     16     3     3.00
    180     31-Aug-11     3000108     16     4     71020.56
    181     31-Aug-11     3000108     17     3     4.00
    182     31-Aug-11     3000108     17     4     97038.72
    183     31-Aug-11     3000108     18     3     4.00
    184     31-Aug-11     3000108     18     4     17096.91
    185     31-Aug-11     3000108     19     3     8.00
    186     31-Aug-11     3000108     19     4     106569.85
    187     31-Aug-11     3000108     20     3     3.00
    188     31-Aug-11     3000108     20     4     30456.12
    189     31-Aug-11     3000108     21     3     4.00
    190     31-Aug-11     3000108     21     4     63805.83
    191     31-Aug-11     3000108     22     3     4.00
    192     31-Aug-11     3000108     22     4     60000.00
    193     31-Aug-11     3000108     23     3     4.00
    194     31-Aug-11     3000108     23     4     16208.52
    195     31-Aug-11     3000108     24     3     4.00
    196     31-Aug-11     3000108     24     4     57784.26
    197     31-Aug-11     3000108     25     3     3.00
    198     31-Aug-11     3000108     25     4     45000.00
    199     31-Aug-11     3000108     26     3     4.00
    200     31-Aug-11     3000108     26     4     17212.24
    201     31-Aug-11     3000108     28     3     4.00
    202     31-Aug-11     3000108     28     4     30567.03
    203     31-Aug-11     3000108     33     3     3.00
    204     31-Aug-11     3000108     33     4     30130.34
    205     31-Aug-11     4000100     3     3     0.00
    206     31-Aug-11     4000100     3     4     18.00
    207     31-Aug-11     4000100     3     5     18.00
    208     31-Aug-11     4000100     4     3     0.00
    209     31-Aug-11     4000100     4     4     7.00
    210     31-Aug-11     4000100     4     5     7.00
    211     31-Aug-11     4000100     5     3     0.00
    212     31-Aug-11     4000100     5     4     10.00
    213     31-Aug-11     4000100     5     5     10.00
    214     31-Aug-11     4000100     6     5     8.00
    215     31-Aug-11     4000100     7     5     27.00
    216     31-Aug-11     4000100     8     5     41.00
    217     31-Aug-11     4000100     9     3     0.00
    218     31-Aug-11     4000100     9     4     4.00
    219     31-Aug-11     4000100     9     5     4.00
    220     31-Aug-11     4000100     10     3     0.00
    221     31-Aug-11     4000100     10     4     2.00
    222     31-Aug-11     4000100     10     5     2.00
    223     31-Aug-11     4000100     11     3     0.00
    224     31-Aug-11     4000100     11     4     0.00
    225     31-Aug-11     4000100     11     5     0.00
    226     31-Aug-11     4000100     12     3     0.00
    227     31-Aug-11     4000100     12     4     4.00
    228     31-Aug-11     4000100     12     5     4.00
    229     31-Aug-11     4000100     13     3     0.00
    230     31-Aug-11     4000100     13     4     2.00
    231     31-Aug-11     4000100     13     5     2.00
    232     31-Aug-11     4000100     14     3     0.00
    233     31-Aug-11     4000100     14     4     0.00
    234     31-Aug-11     4000100     14     5     0.00
    235     31-Aug-11     4000100     15     3     0.00
    236     31-Aug-11     4000100     15     4     6.00
    237     31-Aug-11     4000100     15     5     6.00
    238     31-Aug-11     4000100     16     5     0.00
    239     31-Aug-11     4000100     17     5     0.00
    240     31-Aug-11     4000100     18     5     0.00
    241     31-Aug-11     4000100     19     5     212.81
    242     31-Aug-11     4000100     20     5     0.00
    243     31-Aug-11     4000100     21     5     39.20
    244     31-Aug-11     4000100     22     5     0.00
    245     31-Aug-11     4000100     23     5     0.00
    246     31-Aug-11     4000100     26     5     0.00
    247     31-Aug-11     4000100     27     5     92.96
    248     31-Aug-11     4000100     28     5     21.07
    249     31-Aug-11     4000100     29     5     0.00
    250     31-Aug-11     4000100     30     5     44.24
    251     31-Aug-11     4000100     31     5     10.09
    252     31-Aug-11     5000100     3     3     97700.00
    253     31-Aug-11     5000100     4     3     100600.00
    254     31-Aug-11     5000100     5     3     82700.00
    255     31-Aug-11     5000100     6     3     33900.00
    256     31-Aug-11     5000100     7     3     59800.00
    257     31-Aug-11     5000100     8     3     196400.00
    258     31-Aug-11     5000100     9     3     287600.00
    259     31-Aug-11     5000100     10     3     77100.00
    260     31-Aug-11     5000100     11     3     154200.00
    261     31-Aug-11     5000100     12     3     79100.00
    262     30-Nov-07     4000100     6     4     8.00
    263     30-Nov-07     4000100     7     4     27.00
    264     30-Nov-07     4000100     8     4     41.00
    265     30-Nov-07     4000100     16     4     0.00
    266     30-Nov-07     4000100     18     4     0.00
    267     30-Nov-07     4000100     19     4     0.03
    268     30-Nov-07     4000100     20     4     0.00
    269     30-Nov-07     4000100     21     4     0.01
    270     30-Nov-07     4000100     26     4     0.00
    271     30-Nov-07     4000100     27     4     0.00
    272     30-Nov-07     4000100     28     4     0.00
    273     30-Nov-07     4000100     29     4     0.00
    274     30-Nov-07     4000100     30     4     0.00
    275     30-Nov-07     4000100     31     4     0.00
    276     31-Aug-11     1000101     6     2     1000.00
    277     31-Aug-11     1000101     6     1     5.00
    278     31-Aug-11     1000101     6     3     50800.00
    279     31-Aug-11     1000101     7     2     4000.00
    280     31-Aug-11     1000101     7     1     11.00
    281     31-Aug-11     1000101     7     3     129828.26
    282     31-Aug-11     1000101     12     2     14000.00
    283     31-Aug-11     1000101     12     1     27.00
    284     31-Aug-11     1000101     12     3     244433.51
    285     31-Aug-11     1000101     30     2     0.00
    286     31-Aug-11     1000101     30     1     6.00
    287     31-Aug-11     1000101     30     3     1415254.00
    288     31-Aug-11     1000101     39     2     0.00
    289     31-Aug-11     1000101     39     1     8.00
    290     31-Aug-11     1000101     39     3     5300.00
    291     31-Aug-11     3000103     6     6     46.00
    292     31-Aug-11     3000103     6     7     585509.54
    293     31-Aug-11     3000104     3     13     7.00
    294     31-Aug-11     3000104     3     14     47721.81
    295     31-Aug-11     3000104     4     13     6.00
    296     31-Aug-11     3000104     4     14     80805.83
    297     31-Aug-11     3000104     5     13     6.00
    298     31-Aug-11     3000104     5     14     46569.72
    299     31-Aug-11     3000104     6     13     7.00
    300     31-Aug-11     3000104     6     14     112907.50
    301     31-Aug-11     3000104     7     13     6.00
    302     31-Aug-11     3000104     7     14     75500.00
    303     31-Aug-11     3000104     8     13     13.00
    304     31-Aug-11     3000104     8     14     123471.04
    305     31-Aug-11     3000104     9     13     6.00
    306     31-Aug-11     3000104     9     14     101401.56
    307     31-Aug-11     3000104     20     13     6.00
    308     31-Aug-11     3000104     20     14     60489.10
    309     31-Aug-11     3000104     22     13     6.00
    310     31-Aug-11     3000104     22     14     75567.03
    311     31-Aug-11     3000104     23     13     6.00
    312     31-Aug-11     3000104     23     14     61339.10
    313     31-Aug-11     3000104     24     13     20.00
    314     31-Aug-11     3000104     24     14     94688.52
    315     31-Aug-11     3000104     25     13     45.00
    316     31-Aug-11     3000104     25     14     5772805.47
    317     31-Aug-11     3000104     26     13     6.00
    318     31-Aug-11     3000104     26     14     76462.24
    319     31-Aug-11     3000104     27     13     16.00
    320     31-Aug-11     3000104     27     14     63742.32
    321     31-Aug-11     3000104     31     13     6.00
    322     31-Aug-11     3000104     31     14     60141.20
    323     31-Aug-11     3000105     7     3     14.00
    324     31-Aug-11     3000105     7     4     162632.63
    325     31-Aug-11     3000105     7     5     11.00
    326     31-Aug-11     3000105     7     6     92950.93
    327     31-Aug-11     3000105     8     3     15.00
    328     31-Aug-11     3000105     8     4     128938.59
    329     31-Aug-11     3000105     8     5     14.00
    330     31-Aug-11     3000105     8     6     1491571.98
    331     31-Aug-11     3000105     9     3     14.00
    332     31-Aug-11     3000105     9     4     1666338.36
    333     31-Aug-11     3000105     9     5     13.00
    334     31-Aug-11     3000105     9     6     1377521.70
    335     31-Aug-11     5000100     13     3     91100.00
    336     31-Aug-11     5000100     15     3     75100.00
    337     31-Aug-11     5000100     16     3     60100.00
    338     31-Aug-11     5000100     17     3     448700.00
    339     31-Aug-11     5000100     18     3     1117400.00
    340     31-Aug-11     5000100     19     3     81100.00
    341     31-Aug-11     5000100     20     3     82100.00
    342     31-Aug-11     5000100     24     3     157500.00
    343     31-Aug-11     5000100     25     3     48800.00
    344     31-Aug-11     7000001     19     1     
    345     31-Aug-11     7000001     32     1     
    346     31-Aug-11     7000001     40     1     
    347     31-Aug-11     7000001     42     1     
    348     31-Aug-11     7000001     43     1     
    349     31-Aug-11     7000001     45     1     
    350     31-Aug-11     7000001     48     1     0.15
    351     31-Aug-11     7000001     50     1     0.15
    352     31-Aug-11     7000001     51     1     
    353     31-Aug-11     7000001     54     1     
    354     31-Aug-11     7000001     55     1     
    355     31-Aug-11     7000001     66     1     4417.45
    356     31-Aug-11     7000001     66     2     
    357     31-Aug-11     7000002     12     2     
    358     31-Aug-11     7000002     12     3     
    359     31-Aug-11     7000002     13     2     
    360     31-Aug-11     7000002     13     3     
    361     31-Aug-11     7000002     16     2     
    362     31-Aug-11     7000002     16     3     
    363     31-Aug-11     7000002     17     2     
    364     31-Aug-11     7000002     17     3     
    365     31-Aug-11     7000002     18     2     
    366     31-Aug-11     7000002     18     3     
    367     31-Aug-11     7000002     19     2     
    368     31-Aug-11     7000002     19     3     
    369     31-Aug-11     7000002     20     2     
    370     31-Aug-11     7000002     20     3     
    371     31-Aug-11     7000002     22     2     
    372     31-Aug-11     7000002     23     2     
    373     31-Aug-11     7000002     23     3     14900.00
    374     31-Aug-11     7000002     27     2     
    375     31-Aug-11     7000002     27     3     
    376     31-Aug-11     7000002     28     2     
    377     31-Aug-11     7000002     29     2     
    378     31-Aug-11     7000002     30     2     
    379     31-Aug-11     7000002     30     3     59100.00
    380     31-Aug-11     7000002     32     2     
    381     31-Aug-11     7000002     32     3     56100.00
    382     31-Aug-11     7000002     34     2     
    383     31-Aug-11     7000002     34     3     373880.00
    384     31-Aug-11     7000002     35     2     
    385     31-Aug-11     7000002     35     3     119363875.00
    386     31-Aug-11     7000002     36     2     
    387     31-Aug-11     7000002     36     3     9629230.00
    388     31-Aug-11     7000002     38     2     
    389     31-Aug-11     7000002     38     3     2287214.00
    390     31-Aug-11     7000002     40     2     
    391     31-Aug-11     7000002     40     3     1935671.00
    392     31-Aug-11     7000002     41     2     
    393     31-Aug-11     7000002     41     3     2515010.00
    394     31-Aug-11     7000002     42     2     
    395     31-Aug-11     7000002     42     3     36348309.00
    396     31-Aug-11     7000002     43     2     
    397     31-Aug-11     7000002     43     3     3212922.00

  • Duplicate values error

Dear All,
I have run into a common error when creating a relationship:
The relationship cannot be created because each column contains duplicate values. Select at least one column that contains only unique values.
The issue is strange because I am sure that there are no duplicate values in my table! I checked it.
What I did was create a single "Append" column by appending a few columns with articles from different databases using Power Query. After that I removed duplicates.
The final column with unique values was added to the PowerPivot model as a new table. There are no duplicates and no spaces.
Then I tried to create the relationship between the column with the appended values and the other databases which I had used to create this column.
    Details of error:
    Error Message:
    ============================
    The relationship cannot be created because each column contains duplicate values. Select at least one column that contains only unique values.
    ============================
    Call Stack:
    ============================
       at Microsoft.AnalysisServices.Common.RelationshipController.CreateRelationship(DataModelingColumn sourceColumn, DataModelingColumn relatedColumn)
       at Microsoft.AnalysisServices.Common.RelationshipController.CreateRelationship(String fkTableName, String fkColumnName, String pkTableName, String pkColumnName)
       at Microsoft.AnalysisServices.Common.SandboxEditor.erDiagram_CreateRelationshipCallBack(Object sender, ERDiagramCreateRelationshipEventArgs e)
       at Microsoft.AnalysisServices.Common.ERDiagram.ERDiagramActionCreateRelationship.Do(IDiagramActionInstance actionInstance)
    ============================
What could be the reason for this issue?

Thanks recio, but your solution does not work.
I have found the issue, but not a solution.
The problem is that there is a space at the end of some article names.
In my Database 1 I have an "a" article, but
in my Database 2 I have "a_",
where "_" is an invisible space character.
So Power Query and Excel's "Remove Duplicates" function did not find the duplicates, but PowerPivot sees them.
My data model looks like the example below.
So now my question is: how do I automatically get rid of the space character to be able to create the relationship?
As far as I can see, Power Query does not have such an option (though its Text.Trim / Transform > Format > Trim may be worth a look). Maybe a lookup formula in PowerPivot? But what should it look like?
This is how my "Append" column should look:

  • Referencing multiple cells and removing duplicate values.

Hello.
I have a very complicated question; to be honest, I am not totally sure if Numbers is capable of handling all of this, but it's worth a shot.
I am working on a spreadsheet for organising a film. I've had the templates for years, but I'm now using Numbers to automate them as much as possible. Everything was going well until I hit the schedule/call sheet.
On other sheets I can tell it to "look up scene two" and it will then look up the correct row and find everything I need. On the call sheet, however, I might say "we're filming scenes two, five and nine" and Numbers gets confused by the multiple values. Is there any way around this?
Also, if there is, I have a more complex question to ask. Is it possible for Numbers to find and remove duplicate data? For example, let's say scenes two and five require Alice, but scene nine requires Bob. If Numbers just adds that info together it will display the result "Alice Alice Bob"; is there a way to get it to parse the text, recognise the duplicate value and remove the unnecessary Alice?
I realise that Numbers has limitations, so it may not be able to do everything I want; however, every bit I can automate saves me hours, so even if I can only get half way there, it's totally worth it.
Thanks in advance for any help you can offer. All the best.

Ah, excellent, thank you.
I've modified it so there are now multiple (well, only four for now, until I get this in better shape) indexes for finding a scene, and assigned each block to a new row.
I only have one slight reservation about this. If I create 10 rows, it totally works; most of the time we'll only shoot three scenes a day, so it's just blank space... However, Murphy's law will inevitably raise its ugly head and put me in a situation where we are shooting 11 scenes in a day.
For COUNTIF, I think I get what you mean... kinda. Basically, I would create a cell which combines the character strings from each scene into one long string. Then I would have 100 extra cells (let's say 100 so I'll never run out), each linked to the cast list, using the character name as a variable. These cells will each parse the string to find their character name. If it appears, then a true value comes up. This removes duplicates, as each cell will only respond once. So if Alice appears in every scene shooting that day, the cell will still only light up once. Is that it?
One other question. Whenever I combine filled cells with blank cells, I usually get the data from the filled cells, with a 0 for each blank cell. Usually this isn't a problem, but if I want to keep this flexible then I'll have quite a few blanks. The actor example above could result in 98 zeroes. Is there any way to have blanks just not show up at all?
I'll email the spreadsheet in a moment; a lot of it is still rough and under construction, but you should be able to see where it's all going, hopefully.
Thanks again, you have been extraordinarily helpful.

  • Comparing values in a vector!!

I solved my previous problem, but now I've come across another; it seems my unfamiliarity with vectors isn't helping. Oh well, practice makes perfect. Anyway, I've written a method which will iterate through a list of numbers using a vector and delete any duplicate items from the vector. Here's the code:
public int deleteMultiple() {
     int i = 0;
     while (i < current_size) {
          int j;
          int k = 0;
          int l = k + 1;
          m = 0;
          for (j = 0; j < current_size; j++) {
               if (data.elementAt(k) == data.elementAt(l)) {
                    data.removeElementAt(l);
                    m++;
               }
               else {
                    l++;
               }
          }
          k++;
          i++;
     }
     return m;
}
Sorry about the alignment; it's hard to get right in this thing. I think the problem lies in the if statement: I don't think it ever gets into that particular block. Say I had a list comprising the following numbers:
1 2 3 1
Then what the list should look like after deletion is:
1 2 3
Any input would be greatly appreciated!

OK, I'm going to post all of the code, because I know I'm really close to solving this problem now.
This code is ListDemo2.java; it is basically the main program that sets up all the applet stuff and displays the lists. What it can do is: insert an item onto the end of the list, make an ordered list from zero up to an integer input in the text field, make a random list, clear the list, delete duplicates (this is what I'm having trouble with), and exit the applet.
import java.awt.*;
import java.awt.event.*;   /* new in 1.1 for EventListener */
import java.applet.Applet;
import java.util.*;

public class ListDemo2 extends Applet implements ActionListener,
                                                 AdjustmentListener {
    private ListByVector list = new ListByVector(0);
    // setup a limit for displaying the list
    final int numRow = 5;
    final int numCol = 10;
    final int maxDisplay = numRow * numCol;
    private int scrollbarPos = 0;
    private TextField userInput;
    private Label messageLabel;
    private Label scrollbarLabel;
    private Label[] itemLabels;
    private Scrollbar listScroll;
    private Button insertButton;
    private Button mk_olistButton;
    private Button mk_rlistButton;
    private Button clearButton;
    private Button deleteButton;
    private Button exitButton;
    // ... adding more buttons here

    public void init() {
        // 1. define stuff on the first panel - control panel
        Panel controlPanel1 = new Panel();
        userInput = new TextField("", 5);
        insertButton = new Button("insert");
        mk_olistButton = new Button("mk_olist");
        mk_rlistButton = new Button("mk_rlist");
        clearButton = new Button("clear");
        deleteButton = new Button("delete");
        exitButton = new Button("quit");
        controlPanel1.setLayout(new FlowLayout(FlowLayout.LEFT));
        controlPanel1.add(userInput);
        controlPanel1.add(insertButton);
        controlPanel1.add(mk_olistButton);
        controlPanel1.add(mk_rlistButton);
        controlPanel1.add(clearButton);
        controlPanel1.add(deleteButton);
        controlPanel1.add(exitButton);
        // ... adding new buttons
        // for event handling we need these lines
        insertButton.addActionListener(this);
        mk_olistButton.addActionListener(this);
        mk_rlistButton.addActionListener(this);
        clearButton.addActionListener(this);
        deleteButton.addActionListener(this);
        exitButton.addActionListener(this);
        // ... any newly added buttons need the above line
        // 2. define stuff on the second panel - message panel
        Panel controlPanel2 = new Panel();
        messageLabel = new Label("Initially, the list is empty");
        scrollbarLabel = new Label("                          ");
        controlPanel2.setLayout(new GridLayout(2,1));
        controlPanel2.add(messageLabel);
        controlPanel2.add(scrollbarLabel);
        // 3. define stuff on the third panel - list display panel
        Panel listPanel = new Panel();
        Panel listDisplay = new Panel();
        itemLabels = new Label[maxDisplay];
        listDisplay.setLayout(new GridLayout(numRow,numCol));
        int tmp = 0;
        for (tmp = 0; tmp < maxDisplay; tmp++) {
            itemLabels[tmp] = new Label("          ");
            listDisplay.add(itemLabels[tmp]);
        }
        listScroll = new Scrollbar(Scrollbar.VERTICAL, 0, 0, 0,
                                   list.sizeLimit()/numCol + 1);
        listScroll.addAdjustmentListener(this); /* new in 1.1 */
        listPanel.setLayout(new BorderLayout());
        listPanel.add("West", listDisplay);
        listPanel.add("East", listScroll);
        // finally, add the above 3 to the applet
        setLayout(new BorderLayout());
        add("North",  controlPanel1);
        add("Center", controlPanel2);
        add("South",  listPanel);
    }   // end of ListDemo's constructor

    public void actionPerformed(ActionEvent event) {
        String input_buffer;
        int k;
        // read a string from the text field
        input_buffer = new String(userInput.getText());
        if (event.getSource() == clearButton) {
            list.clearUp();
            showList(list);
            showMessage("list cleared");
        }
        else if (event.getSource() == insertButton) {
            k = list.insertAtEnd(input_buffer);
            if (k >= 0) { // inserted OK
                showList(list);
                showMessage("inserted  " + input_buffer + "  at  " + k);
            }
            else { // reached the limit, nothing inserted
                showMessage("sorry, you cannot insert any more");
            }
        }
        else if (event.getSource() == mk_olistButton) {
            k = list.createOrderedList(input_buffer);
            if (k > 0) {
                showList(list);
                showMessage("an ordered list is created, the list size is " + k);
            }
            else showMessage("please input the size of your list");
        }
        else if (event.getSource() == mk_rlistButton) {
            k = list.createRandomList(input_buffer);
            if (k > 0) {
                showList(list);
                showMessage("a random list is created, the list size is " + k);
            }
            else showMessage("please input the size of your list");
        }
        /* Quick note about the code below: if I keep it as is,
           with the showList(list) commented out, it will not visibly
           delete the duplicates, but the message that comes up
           saying how many duplicates were deleted gives the
           correct answer the first time only; if I were to create
           another random list with a different set of duplicates, then
           the message will be incorrect. Also, if the showList(list)
           were to be uncommented, then the program suffers from
           overflow problems?????? */
        else if (event.getSource() == deleteButton) {
            k = list.deleteMultiple();
        //  showList(list);
            showMessage("deleted " + k + " duplicate items");
        }
        else if (event.getSource() == exitButton) {
            System.exit(0);
        }
    }

    public void adjustmentValueChanged(AdjustmentEvent event) {
        if (event.getSource() == listScroll) {
            scrollbarPos = listScroll.getValue();
            showScrollbarMessage(
                "the scrollbar  position  is  " + scrollbarPos);
            showList(list);
        }
    }

    void showList(ListByVector list) {
        int i = scrollbarPos;
        int n = list.size();
        int c = 0;
        i = i * numCol;
        while ((i < n) && (c < maxDisplay)) {
            itemLabels[c].setVisible(true);
            itemLabels[c].setText(list.item(i));
            i++; c++;
        }
        while (c < maxDisplay) {
            itemLabels[c].setVisible(true);
            itemLabels[c].setText("");
            c++;
        }
        // clear input text field
        userInput.setText("");
    }

    void showMessage(String Message) {
        messageLabel.setVisible(true); /* this line was not needed in java1.0 */
        messageLabel.setText(Message);
    }

    void showScrollbarMessage(String Message) {
        scrollbarLabel.setVisible(true); /* this line was not needed in java1.0 */
        scrollbarLabel.setText(Message);
    }

    public static void main(String args[]) {
        Frame f = new Frame("List Demo");
        ListDemo2 appletListDemo = new ListDemo2();
        f.setFont(new Font("Helvetica", Font.BOLD, 12));
        f.setSize(700, 300);
        f.setBackground(Color.green);
        f.add(appletListDemo);
        appletListDemo.init();
        f.setVisible(true);
    }
}
This code is called ListByVector.java and is used to generate the lists and then return the info back to ListDemo2:
    import java.util.*;
    public class ListByVector {
         private final int default_max_size = 4096;
         private int current_size;
         private int m;
         private Vector data;
         ListByVector(int initial_size) {
              data = new Vector();
              current_size = initial_size;
         public int sizeLimit(){ return default_max_size; }
         public int size(){ return current_size; }
         public String item(int which){ return (String)data.elementAt(which); }
         public void clearUp() {
              current_size = 0;
         public int insertAtEnd(String input_item) {     
              if(current_size < default_max_size) {
                   data.insertElementAt(input_item, current_size);
                   current_size++;
                   return(current_size - 1);
              else {
                   return -1;
         public int createRandomList(String string_size) {     
              int i;
              int j;
              if(!isIntString(string_size)) {
                   return 0;
              else {
                   Integer size = Integer.valueOf(string_size);
                   current_size = size.intValue();
              Random R = new Random();
              for(i = 0; i<current_size; i++) {
                   j = (int)(R.nextFloat()*current_size);
                   data.insertElementAt(Integer.toString(j), i);
              return current_size;
         public int createOrderedList(String size_string) {
              int i;
              if(!isIntString(size_string)) {
                   return 0;
              else {
                   Integer size = Integer.valueOf(size_string);
                   current_size = size.intValue();
              for(i = 0; i<current_size; i++) {
                   data.insertElementAt(Integer.toString(i), i);
              return current_size;
    /* As you can see for the delete method I have tried
    several variations all giving the same problem, the one
    which isn't commented out is the one which kind of
    works i.e. when the delete button is pressed the correct
    value appears for the number of duplicates that should
    have been deleted but nothing visibly happens*/
    public int deleteMultiple() {
    /*  First attempt:
        int i = 0;
        while (i < data.size()) {
            int j;
            int k = 0;
            int l = k + 1;
            m = 0;
            for (j = 0; j < data.size() - 1; j++) {
                if (data.elementAt(k) == data.elementAt(l)) {   // == compares references, not values
                    data.removeElementAt(l);
                }
                else {
                    m++;
                    l++;
                }
            }
            k++;
            i++;
        }
    */
        // Active variant: for each element, scan backwards and remove any later equal element.
        m = 0;
        for (int i = 0; i < data.size(); i++) {
            for (int j = data.size() - 1; j > i; j--) {
                if (data.elementAt(i).equals(data.elementAt(j))) {
                    data.removeElementAt(j);
                    m++;
                }
            }
        }
    /*  Third attempt:
        int currIndex = 0;
        while (currIndex < data.size()) {
            int currIndex2;
            for (currIndex2 = currIndex + 1; currIndex2 < data.size(); currIndex2++) {
                if (data.elementAt(currIndex2).equals(data.elementAt(currIndex))) {
                    data.removeElementAt(currIndex2);
                    currIndex2++;   // skips the element that slid into this slot
                }
            }
            currIndex++;
        }
    */
        // Likely cause of "nothing visibly happens": the display code reads
        // current_size / size(), which was never updated after the removals.
        current_size = data.size();
        return m;
    }
    private boolean lessThan(String s1, String s2) {
        if (isIntString(s1) && isIntString(s2))
            return (Integer.parseInt(s1) < Integer.parseInt(s2));
        else
            return (s1.compareTo(s2) < 0);
    }

    private boolean isIntString(String s) {
        int n = s.length();
        int i;
        if (n == 0) return false;
        if (s.charAt(0) == '-') i = 1;
        else i = 0;
        if (i == n) return false;
        for (; i < n; i++) {
            if (s.charAt(i) < '0') return false;
            if (s.charAt(i) > '9') return false;
        }
        return true;
    }
}

  • Error 5398 Duplicate value addition in attribute ...

I'm seeing the following error messages in my error log and am not sure what to do about them, since the reference guide does not list them. Solaris 8, DS 5.2.
    ERROR<5398> - Entry - conn=-1 op=-1 msgId=-1 - Duplicate value addition in attribute "objectClass" of entry "ou=Configs, o=Contivity, o=vpn"
    ERROR<5398> - Entry - conn=-1 op=-1 msgId=-1 - Duplicate value addition in attribute "objectClass" of entry "cn=14649, ou=Configs, o=Contivity, o=vpn"
The errors are occurring on Searay. I have another LDAP server called Mantaray. Here is some historical data that may help shed light on the matter:
I wanted to change the DNS domain that LDAP was using on Searay, so I configured searay:o=vpn for master replication, created o=vpn on Mantaray, and configured it as a consumer.
After the suffix was replicated I broke the replication, unconfigured Searay, and then reconfigured it. I then did the reverse and made Mantaray:o=vpn the master and Searay:o=vpn the consumer. I then broke the replication again and tried to get multi-master replication to work between the two servers. It took a few tries before things seemed to start working right.

This thread (http://swforum.sun.com/jive/thread.jspa?forumID=13&threadID=21473) seems similar, but I cannot find where the nsslapd-referral attribute that kunal mehta mentions is located.
I did look at the dse.ldif for each server and both looked okay.

  • Find duplicate with sum=0

I need to find duplicates (ID), but the sum of the VALUE column has to be zero. Can anyone help me please?
I have this table:
ID  VALUE
100  -100
100   100
100   100
200    50
The desired result is:
ID  VALUE
100  -100
100   100

    Sorry,
I only read your leading question:
"I need to find duplicates (ID) but the sum of column VALUES has to be zero" (the obvious answer would be: there isn't any).
    Only now do I see your expected output.
You need to explain which row to eliminate, and why, in order to go from
ID VALUE
100  -100
100   100
100   100
to just
ID VALUE
100  -100
100   100
Also, what should happen if you have, say,
ID VALUE
200   -50
200   -50
200   100
    200   100Regards
    Peter
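Taking the leading question literally (IDs that appear more than once and whose values sum to zero), here is a minimal sketch, assuming the data sits in a table T(ID, VALUE) like the sample above; the analytic functions compute the per-ID sum and row count without collapsing the rows:
-- assumes a table t(id, value) matching the sample data
select id, value
  from (
        select t.*,
               sum(value) over (partition by id) sum_value,
               count(*)   over (partition by id) cnt
          from t
       )
 where sum_value = 0
   and cnt > 1;
Against the sample data this returns no rows (ID 100 sums to 100, not 0), which is exactly why the expected output needs the clarification asked for here.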

  • Help with Finding Duplicate records Query

Hi,
    I am trying to write a query that will find duplicate records/cases.
    This query will be used in a report.
    So, here are the requirements:
I need to find duplicate cases/records based on the following fields: DOB, DOCKET, SENT_DATE.
I was able to do that with the following query, which gives me all duplicate records based on the criteria above:
    SELECT      DEF.BIRTH_DATE DOB,
               S.DOCKET DOCKET,
               S.SENT_VIO_DATE SENT_DATE, COUNT(*)
    FROM SENTENCES S,
                DEFENDANTS DEF
    WHERE      S.DEF_ID = DEF.DEF_ID
    AND       S.CASE_TYPE_CODE = 10
    GROUP BY  DEF.BIRTH_DATE, S.DOCKET, S.SENT_VIO_DATE
    HAVING COUNT(*) > 1;
-- I am going to call this query 'X'
Now, the information to be displayed on the report: defendant's Name, DOB, District, Docket, Def Num, Sent Date, and PACTS Num if possible.
The problem I need help with is how to combine those queries (what I mean is a subquery). The 'X' query returns multiple values. Please have a look at the comments in the query below to see what I'm trying to achieve.
Here is the main query:
    SELECT      INITCAP(DEF.LAST_NAME) || ' ' || INITCAP(DEF.FIRST_NAME) || ' ' || INITCAP(DEF.MIDDLE_NAME) DEFENDANT_NAME,
            DEF.BIRTH_DATE DOB,
            TRIM(DIST.DISTRICT_NAME) DISTRICT_NAME,
            S.DOCKET DOCKET,
            S.DEF_NUM DEF_NUM,
            S.SENT_VIO_DATE SENT_DATE,
            DEF.PACTS_ID PACTS_NUM
    FROM      USSC_CASES.DEFENDANTS DEF,
            USSC_CASES.SENTENCES S,
            LOOKUP.DISTRICTS DIST
    WHERE      DEF.DEF_ID = S.DEF_ID
    AND      S.DIST_ID = DIST.USSC_DISTRICT_ID
    AND     S.CASE_TYPE_CODE = 10
    AND     S.USSC_ID IS NOT NULL
AND      -- what I'm trying to do is: DOB, DOCKET, SENT_DATE IN ('X' QUERY); is this possible?? (see the sketch after this post)
ORDER BY DEFENDANT_NAME;
Thanks in advance.
I am using Oracle 11g and SQL Developer.
If my approach doesn't work, is there a better approach?
    Edited by: Rooney on Jul 11, 2012 3:50 PM
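Yes, that is possible: Oracle supports multi-column IN. A minimal sketch of the missing predicate, reusing the 'X' query as a subquery (the S2/DEF2 aliases are renamings introduced here so they don't clash with the outer S/DEF, and COUNT(*) is dropped from the select list so the column counts match):
AND (DEF.BIRTH_DATE, S.DOCKET, S.SENT_VIO_DATE) IN
    (SELECT DEF2.BIRTH_DATE, S2.DOCKET, S2.SENT_VIO_DATE   -- S2/DEF2: renamed aliases
       FROM USSC_CASES.SENTENCES  S2,
            USSC_CASES.DEFENDANTS DEF2
      WHERE S2.DEF_ID = DEF2.DEF_ID
        AND S2.CASE_TYPE_CODE = 10
      GROUP BY DEF2.BIRTH_DATE, S2.DOCKET, S2.SENT_VIO_DATE
     HAVING COUNT(*) > 1)
The reply below avoids the second scan of the tables altogether by using analytic functions instead.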

If I got it right, you want to join table USSC_CASES.DEFENDANTS to duplicate rows in USSC_CASES.SENTENCES. If so (note that BIRTH_DATE lives on DEFENDANTS, so it has to be joined into the inline view rather than referenced from the outer query):
    SELECT  INITCAP(DEF.LAST_NAME) || ' ' || INITCAP(DEF.FIRST_NAME) || ' ' || INITCAP(DEF.MIDDLE_NAME) DEFENDANT_NAME,
            DEF.BIRTH_DATE DOB,
            TRIM(DIST.DISTRICT_NAME) DISTRICT_NAME,
            S.DOCKET DOCKET,
            S.DEF_NUM DEF_NUM,
            S.SENT_VIO_DATE SENT_DATE,
            DEF.PACTS_ID PACTS_NUM
  FROM  USSC_CASES.DEFENDANTS DEF,
        (
         SELECT  *
           FROM  (
                  SELECT  S.*,
                          COUNT(*) OVER(PARTITION BY D.BIRTH_DATE, S.DOCKET, S.SENT_VIO_DATE) CNT
                    FROM  USSC_CASES.SENTENCES  S,
                          USSC_CASES.DEFENDANTS D
                   WHERE  S.DEF_ID = D.DEF_ID
                 )
          WHERE  CNT > 1
        ) S,
            LOOKUP.DISTRICTS DIST
      WHERE DEF.DEF_ID = S.DEF_ID
       AND  S.DIST_ID = DIST.USSC_DISTRICT_ID
       AND  S.CASE_TYPE_CODE = 10
       AND  S.USSC_ID IS NOT NULL
  ORDER BY DEFENDANT_NAME;
If you want to exclude duplicates from the query and do not care which of the duplicate rows to keep:
    SELECT  INITCAP(DEF.LAST_NAME) || ' ' || INITCAP(DEF.FIRST_NAME) || ' ' || INITCAP(DEF.MIDDLE_NAME) DEFENDANT_NAME,
            DEF.BIRTH_DATE DOB,
            TRIM(DIST.DISTRICT_NAME) DISTRICT_NAME,
            S.DOCKET DOCKET,
            S.DEF_NUM DEF_NUM,
            S.SENT_VIO_DATE SENT_DATE,
            DEF.PACTS_ID PACTS_NUM
  FROM  USSC_CASES.DEFENDANTS DEF,
        (
         SELECT  *
           FROM  (
                  SELECT  S.*,
                          ROW_NUMBER() OVER(PARTITION BY D.BIRTH_DATE, S.DOCKET, S.SENT_VIO_DATE ORDER BY 1) RN
                    FROM  USSC_CASES.SENTENCES  S,
                          USSC_CASES.DEFENDANTS D
                   WHERE  S.DEF_ID = D.DEF_ID
                 )
          WHERE  RN = 1
        ) S,
            LOOKUP.DISTRICTS DIST
      WHERE DEF.DEF_ID = S.DEF_ID
       AND  S.DIST_ID = DIST.USSC_DISTRICT_ID
       AND  S.CASE_TYPE_CODE = 10
       AND  S.USSC_ID IS NOT NULL
  ORDER BY DEFENDANT_NAME;
SY.

  • Find Duplicate Rows

    Hi,
I have to find duplicate rows in a view where two columns of the view are duplicated but the third column's value is not the same.
Please help, it's urgent.
    Thanks
    Regards
    Prashant

    Try this
with t as (
  select 1 rno, 'AAA' c1, 1 c2, 'NAME1' c3 from dual union all
  select 2, 'AAA', 1, 'NAME1' from dual union all
  select 3, 'AAA', 1, 'NAME3' from dual union all
  select 4, 'BBB', 2, 'NAME4' from dual union all
  select 5, 'BBB', 2, 'NAME4' from dual union all
  select 6, 'BBB', 2, 'NAME5' from dual
)
select rno, c1, c2, c3
  from (
        select t.*, row_number() over(partition by c1, c2, c3 order by rno) rno1
          from t
       )
 where rno1 = 1;
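Note that the query above keeps one row per distinct (c1, c2, c3) combination, which is slightly different from what was asked. If the requirement is rows where c1 and c2 repeat while c3 differs, here is a sketch against the same sample data (one reading of the requirement, using an analytic distinct count):
with t as (
  -- same sample data as above
  select 1 rno, 'AAA' c1, 1 c2, 'NAME1' c3 from dual union all
  select 2, 'AAA', 1, 'NAME1' from dual union all
  select 3, 'AAA', 1, 'NAME3' from dual union all
  select 4, 'BBB', 2, 'NAME4' from dual union all
  select 5, 'BBB', 2, 'NAME4' from dual union all
  select 6, 'BBB', 2, 'NAME5' from dual
)
select rno, c1, c2, c3
  from (
        select t.*,
               count(distinct c3) over (partition by c1, c2) cnt_c3
          from t
       )
 where cnt_c3 > 1
 order by rno;
This flags every row of a (c1, c2) group that carries more than one distinct c3 value.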
