Unable to Enforce Unique Values, Duplicate Values Exist

I have a list in SP 2010 that contains roughly 1,000 items.  I would like to enforce unique values on the Title field.  I started by cleaning up the list, ensuring that all items already had a unique value.  To help with this, I used the Export
to Excel action, then highlighted duplicates within Excel.  So as far as I can tell, there are no duplicates within that list column.
However, when I try to enable the option to Enforce Unique Values, I receive the error that duplicate values exist within the field and must be removed.
Steps I've taken so far to identify / resolve duplicate values:
- Multiple exports to Excel from an unfiltered list view, then using the highlight-duplicates feature; no duplicates found
- Deleted ALL versions of every item from the list (except current), and ensured they were completely removed by deleting them from both the site and site collection recycle bins
- Using the SP PowerShell console, grabbed all list items, exported all of the Title-type fields (the item object's Title, LinkTitle, LinkTitleNoMenu, etc.) to a CSV, and ran that through Excel's duplicate checking as well
Unless there's some ridiculous hidden field value that MS expects anyone attempting to enforce unique values on a list to know about (a feature simple enough for anyone to find, if it didn't throw an error), I've exhausted everything I can think
of that might cause the list to report duplicate values for that field.
While I wait to see if someone else has an idea, I'm also going to see what happens if I wipe the Crawl Index and start it from scratch.
- Jon

First, I created an index for the column in list settings; that works fine whether or not duplicate values exist.
Then I set Enforce Unique Values on the field; after clicking OK, I got the duplicate-values error message.
With SQL Server Profiler, I found the call to proc_CheckIfExistingFieldHasDuplicateValues and its parameters. After reviewing this stored procedure in the content database,
I created the following script in SQL Server Management Studio:
declare @SiteId uniqueidentifier
declare @WebId uniqueidentifier
declare @ListId uniqueidentifier
declare @FieldId uniqueidentifier

set @SiteId  = 'F7C40DC9-E5D3-42D7-BE60-09B94FD67BEF'
set @WebId   = '17F02240-CE04-4487-B961-0482B30DDA84'
set @ListId  = 'B349AF8D-7238-419D-B6C4-D88194A57EA7'
set @FieldId = '195A78AC-FC52-4212-A72B-D03144DC1E24'

SELECT *
FROM TVF_UserData_List(@ListId) AS U1
INNER MERGE JOIN NameValuePair_Latin1_General_CI_AS AS NVP1
    WITH (INDEX = NameValuePair_Latin1_General_CI_AS_MatchUserData)
    ON NVP1.ListId = @ListId
    AND NVP1.ItemId = U1.tp_Id
    AND ((NVP1.Level = 1 AND U1.tp_DraftOwnerId IS NULL) OR NVP1.Level = 2)
    AND NOT (DATALENGTH(ISNULL(NVP1.Value, N'')) = 0)
    AND U1.tp_Level = NVP1.Level
    AND U1.tp_IsCurrentVersion = CONVERT(bit, 1)
    AND U1.tp_CalculatedVersion = 0
    AND U1.tp_RowOrdinal = 0
INNER MERGE JOIN NameValuePair_Latin1_General_CI_AS AS NVP2
    WITH (INDEX = NameValuePair_Latin1_General_CI_AS_CI)
    ON NVP2.SiteId = @SiteId
    AND NVP2.ListId = @ListId
    AND NVP2.FieldId = @FieldId
    AND NVP2.Value = NVP1.Value
    AND NVP2.ItemId <> NVP1.ItemId
CROSS APPLY TVF_UserData_ListItemLevelRow(NVP2.ListId, NVP2.ItemId, NVP2.Level, 0) AS U2
WHERE ((NVP2.Level = 1 AND U2.tp_DraftOwnerId IS NULL) OR NVP2.Level = 2)
  AND NOT (DATALENGTH(ISNULL(NVP2.Value, N'')) = 0)
The result returned by the query above identifies the duplicate list items.
Note that you need to change the parameter values accordingly, and change the name of the NameValuePair_Latin1_General_CI_AS table based on the last parameter passed to the
proc_CheckIfExistingFieldHasDuplicateValues stored procedure. You can review the code of this stored procedure yourself.
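If you only need to see which values collide (rather than the full duplicate rows), a simpler aggregate over the same NameValuePair table may be enough. This is a sketch reusing the parameters declared above, not the stored procedure's exact logic:
-- Values of the field that appear on more than one item (empty values ignored).
SELECT NVP.Value, COUNT(DISTINCT NVP.ItemId) AS ItemCount
FROM NameValuePair_Latin1_General_CI_AS AS NVP
WHERE NVP.SiteId = @SiteId
  AND NVP.ListId = @ListId
  AND NVP.FieldId = @FieldId
  AND DATALENGTH(ISNULL(NVP.Value, N'')) > 0
GROUP BY NVP.Value
HAVING COUNT(DISTINCT NVP.ItemId) > 1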
Note that operating directly on the content database is not supported in a production environment; please do all of this in a test environment.

Similar Messages

  • This column cannot enforce unique values because this list or document library may contain items that are not viewable by all users.

    The above error occurs when I place a unique constraint on a column in a custom list. This error occurs only because the custom list has been configured to allow creators to view and edit only their own list items.
    I just can't see why that should prevent SP from placing a unique value constraint, though.
    One easy workaround would be to place the unique constraint on the SQL Server table that represents this list. However, in a hosted environment, it is not always possible to gain access to the backend server. What is more, I am uncomfortable with this
    approach as it may break other parts of SP 2010.
    Is there any other approach?

    Hi,
    If SharePoint were to enforce uniqueness on a column in a list where users are only allowed to see and edit list items that they have created, then you could have a scenario where -
    a. user1 tries to add a new item to the list which violates the uniqueness constraint
    b. SharePoint reports an error
    c. user1 gets to know that there is another item containing the same value in this list
    d. this violates the security settings that you configured (each user should be allowed to see only the items that they have created). Now, if you change the permission settings to allow users to SEE all items, but only
    edit items that they have created, this error disappears (obviously).
    Now getting to the workaround you mentioned (creating a unique constraint on the SQL Server table). This is not going to be possible for numerous reasons -
    a. Modifying (or, for that matter, even performing SELECT operations on) the SQL Server tables is an unsupported operation and it will definitely break other things in SharePoint, not
    to mention that you will lose all support options from Microsoft. Even if you don't care about this, it will still not be technically possible because...
    1. SharePoint does not create a new SQL Server table for every list/library. 
    2. If you did get around to enforcing a unique constraint in the one single table that SharePoint uses for all lists, then you would be enforcing this constraint on each and every list in all the sites and all the site
    collections that are assigned to that content database.
    Please "Mark as Answer" if a post has answered your question or "Vote as Helpful" if it was helpful in some way. Here's
    why

  • Alert On UDF's when Duplicate value exists

    Hi Experts,
    I want an alert message on a UDF, Gate Entry Pass (G.E. No), whenever a duplicate value is entered in this UDF.
    E.g.: a value of 30 is entered in this UDF. When the same value is entered again, it should give an alert that the value already exists.
    Thanks & Regards,
    Pankaj Sharma.

    Hi Gordon,
    I am using this UDF on the GRPO form and the UDF's name is U_GE_NO (Gate Entry No). My client uses this field when material enters the company premises and enters a unique value, so to remove duplicates in the data they want the alert.
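    A common way to implement such an alert/block is the SBO_SP_TransactionNotification stored procedure in the company database. The following is an untested sketch that assumes the UDF sits on the GRPO header (table OPDN, object type '20'); adjust the table and object type to wherever your UDF is actually defined.
    -- Inside SBO_SP_TransactionNotification, after the standard declarations:
    IF @object_type = '20' AND @transaction_type IN ('A', 'U')
    BEGIN
        -- Block the document if another GRPO already carries the same Gate Entry No.
        IF EXISTS (SELECT 1
                   FROM OPDN T0
                   INNER JOIN OPDN T1
                       ON T1.DocEntry = CAST(@list_of_cols_val_tab_del AS int)
                   WHERE T0.U_GE_NO = T1.U_GE_NO
                     AND T0.DocEntry <> T1.DocEntry)
        BEGIN
            SET @error = 1
            SET @error_message = N'Gate Entry No already exists'
        END
    END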

  • System error occurred (RFC call), key value exists in duplicate

    Hi BW experts,
    I got the following error report:
    "system error occurred (RFC call).
    key value exists in duplicate (Not allowed by the ODS object type).
    Activation of data records from ODS object Z08TRFKP terminated.
    No confirmation for request odsr_80p13w4lqhnib3g9tiflvth56 when activating ODS object Z08TRFKP.
    Request REQU_EVZB7CK82H7YMS13X3G42GKQY , data package 000001 contains errors with status 5 .
    Request REQU_EVZB7CK82H7YMS13X3G42GKQY , data package 000001 not correct.
    Inserted records 1- ; Changed records 1- ; Deleted records 1-
    Please help me.
    Thanks.

    Hi experts,
    First, I ran an initial update of the data from R/3 to the BW ODS; it was successful. Then the system automatically triggered a delta update from R/3 to the BW ODS, and it failed. I have now deleted the failed delta update and repeated the extraction, but it errors out again.
    Please help me again.
    Thanks.

  • Unable to get the values from PAYLOAD

    Hi,
    I'm unable to get the values of the payload. When I use
    currentTask = client.getTaskQueryService().getTaskDetailsById(workflowContext, taskID);
    Element payload = (Element)currentTask.getPayloadAsElement();
    node = (Element)payload.getFirstChild();
    System.out.println(node.getElementsByTagName("documentName").item(0).getNodeValue());
    I get null, but I instantiated the BPEL process with a value and I have
    - <payload>
    - <ns1:DocumentReviewProcessRequest xmlns:ns1="http://xmlns.oracle.com/bpel/Review">
    <ns1:documentTitle>vbvnmbvn</ns1:documentTitle>
    <ns1:documentName>bvnbvnvn</ns1:documentName>
    <ns1:URI>http://www.first.pt</ns1:URI>
    <ns1:assignees>gestdoc</ns1:assignees>
    <ns1:groups />
    </ns1:DocumentReviewProcessRequest>
    </payload>
    If I use System.out.println(node.getElementsByTagName("HELLO").item(0).getNodeValue());
    I get a normal error because this variable doesn't exist...
    Can someone please help me?
    Thanks!!

    I think the problem is UME related. I had the same problem once. Try using a system user for searching, instead of the logged on user.
    Like this:
    com.sapportals.portal.security.usermanagement.IUser user = WPUMFactory.getUserFactory().getUser("cmadmin_service");

  • Getting unique values from internal table

    Hi Gurus,
    From time to time I hit this problem and so far I haven't found any nice solution. I have an internal table with several fields. I would like to get all unique values for one (or several) of these fields. However, let's say that this table has a lot of entries, so making a copy is not an option. Also, changing this table in any way is forbidden.
    For example for table below I would like to get all unique values for field Number. In this case it would be 1,2,3,4.
    Name  | Number |
    name1 | 1|
    name2 | 2|
    name3 | 2|
    name4 | 3|
    name5 | 4|
    name5 | 3|
    Can anyone propose me better solution than going in the loop through all entries in table? Maybe there is some ABAP functionality that I don't know about?
    BR
    Marcin Cholewczuk

    > Let's say that if I sort this table I won't be able to restore it to its previous order, which is important for me.
    True... If you sort the table you won't be able to restore it, so the only option is to copy/move all the records into another table.
    If you need to retrieve unique values, I don't think you can proceed with proper programming without sorting the table.
    Regarding the logic, as replied earlier: either go with DELETE ADJACENT DUPLICATES or proceed as described in my earlier post. There may be any number of algorithms to solve this, but we cannot get ahead without sorting or looping.

  • Getting unique values from an arraylist

    Hi
    I have an ArrayList that contains StringBuffer objects. The ArrayList has duplicate values of the objects. I have to create another ArrayList with only the unique values from the first ArrayList. I have tried many ways and many functions, but no luck. Can someone provide me the code for this, as I need to do this urgently.
    Thank U
    Mansoor,

    > Sanity check: when are StringBuffers equal?
    Same character sequence.
    [EDIT] Or so I thought. Just tested, and found I was wrong. A quick glance at the API confirmed my error; StringBuffer doesn't override Object.equals().
    ~

  • Extracting unique values from (non-category) columns for chart

    Hello:
    I've created a worksheet to keep inventory of my Intellivision games.  It has the following columns:
    Publisher
    Class
    Network
    Title
    Quantity
    (misc...)
    The "Class" represents whether the game is "complete in box" or "loose cartridge."  The "Network" represents the general genre or game cateogory.  The quantity is how much I have of each.
    I have set the first three columns as categories:  Publisher, Class, and Network.  I also created a bar chart based on the Title and Quantity columns, to show how many I have of each.
    The problem I have is that, although it looks real cool and helps me keep the games organized, since a title can appear in more than one "Class" (e.g., I can have one in box, and two loose), the chart includes duplicates, and they are not grouped together.
    Is there a way that I can create a graph (or a secondary table) that extracts only the UNIQUE values from a column that may contain duplicates?
    Note that I don't want to put the "Title" column in a category.  I want to group by the three major categories and list the games on each.
         -dZ.

    From the Numbers Help Menu, download the Numbers User Guide. Read the first three or four chapters to get a feel for the app. It's well written and won't take long to read that much.
    Then use the Table of Contents and the Search tool to get additional specific directions.
    First, delete all unneeded Rows and Columns from your data table. If you have patches of data in a larger table, Cut the patches and Paste them to a blank Sheet area to create separate dedicated tables for your various needs. These small special purpose tables are like Named Ranges in Excel. Name them in the Sheets Pane.
    This is how Numbers was intended to be used. The User Guide will describe how to reference cells in one table from expressions in another table. If you use the point and click method of creating references from within the equation editor it won't matter a bit that the tables are separate.
    Come back here for specific help on anything you are having trouble with.
    Jerry

  • How to update unique values

    I want to update rows when the insert fails because the values are not unique... Please help in this regard.
    CREATE OR REPLACE PROCEDURE PR IS
    lv_operator VARCHAR2(1) := 'N';
    TYPE subset_rt IS RECORD (
    ltt_whse_loc oe_sales_ord_commit.whse_loc%TYPE,
    ltt_shape oe_sales_ord_commit.shape%TYPE,
    ltt_isize oe_sales_ord_commit.isize%TYPE,
    ltt_grade oe_sales_ord_commit.grade%TYPE,
    ltt_length oe_sales_ord_commit.length%TYPE,
    ltt_pieces NUMBER,
    ltt_inv_lbs_commit oe_sales_ord_commit.inv_lbs_commit%TYPE,
    ltt_roll_lbs_commit oe_sales_ord_commit.roll_lbs_commit%TYPE,
    ltt_ishcom_lbs oe_sales_ord_commit.ishcom_lbs%TYPE);
    TYPE subset_aat IS TABLE OF subset_rt
    INDEX BY PLS_INTEGER;
    aa_subset subset_aat;
    errors NUMBER;
    dml_errors EXCEPTION;
    PRAGMA EXCEPTION_INIT(dml_errors, -24381);
    BEGIN
    lv_operator := sy_checkpriv(USER,
    'SY_OPERATIONS');
    SELECT whse_loc,
    shape,
    isize,
    grade,
    length,
    0,
    inv_lbs_commit,
    roll_lbs_commit,
    ishcom_lbs
    BULK COLLECT INTO aa_subset
    FROM oe_sales_ord_commit_temp
    WHERE processed_datetime IS NULL
    AND username = decode(lv_operator,
    'Y',
    username,
    USER);
    BEGIN
    FORALL indx IN 1 .. aa_subset.COUNT SAVE EXCEPTIONS
    INSERT INTO
    (SELECT loc,
    shape,
    isize,
    grade,
    length,
    pieces,
    inv_lbs_commit,
    roll_lbs_commit,
    ishcom_lbs FROM oe_inventory_temp)
    VALUES
    aa_subset(indx);
    EXCEPTION
    WHEN dml_errors THEN
    -- <<Trap the rows that have failed due to the dup_val_on_index exception and update the inv_lbs_commit,
    -- roll_lbs_commit, ishcom_lbs columns in the oe_inventory_temp table>>
    -- loc, shape, isize, grade, length, pieces are the composite primary key.
    errors := SQL%BULK_EXCEPTIONS.COUNT;
    DBMS_OUTPUT.PUT_LINE('Number of statements that failed: ' || errors);
    FOR i IN 1..errors LOOP
    DBMS_OUTPUT.PUT_LINE('Error #' || i || ' occurred during '||
    'iteration #' || SQL%BULK_EXCEPTIONS(i).ERROR_INDEX);
    DBMS_OUTPUT.PUT_LINE('Error message is ' ||
    SQLERRM(-SQL%BULK_EXCEPTIONS(i).ERROR_CODE));
    DBMS_OUTPUT.PUT_LINE(aa_subset(SQL%BULK_EXCEPTIONS(i).ERROR_INDEX).ltt_whse_loc);
    END LOOP;
    END;
    END PR;
    /

    Due to the following restriction, a number of simple collections is used instead of a collection of records.
    FORALL Statement (http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14261/forall_statement.htm#sthref2742): You cannot refer to individual record fields within DML statements called by a FORALL statement. Instead, you can specify the entire record with the SET ROW clause in an UPDATE statement, or the VALUES clause in an INSERT statement.
    The 10g syntax of the FORALL statement is used:
    SQL> select * from a;
           INT DT         STR
             1 10.02.2009 123
             2 11.02.2009 567
    SQL> desc a
    Name                                      Null?    Type
    INT                                       NOT NULL NUMBER(38)
    DT                                                 DATE
    STR                                                VARCHAR2(20)
    SQL> select * from a;
           INT DT         STR
             1 10.02.2009 123
             2 11.02.2009 567
    SQL> DECLARE
      2    TYPE T_Int_List IS TABLE OF A.Int%TYPE INDEX BY PLS_INTEGER;
      3    TYPE T_Dt_List IS TABLE OF A.Dt%TYPE INDEX BY PLS_INTEGER;
      4    TYPE T_Str_List IS TABLE OF A.Str%TYPE INDEX BY PLS_INTEGER;
      5  
      6    vInts T_Int_List;
      7    vDts T_Dt_List;
      8    vStrs T_Str_List;
      9 
    10    eBulk_Error EXCEPTION;
    11    PRAGMA EXCEPTION_INIT(eBulk_Error, -24381);
    12  BEGIN
    13    vInts(1) := 2;
    14    vDts(1) := sysdate - 1;
    15    vStrs(1) := '888';
    16 
    17    vInts(2) := 3;
    18    vDts(2) := sysdate - 22;
    19    vStrs(2) := '0000';
    20 
    21    BEGIN
    22      FORALL indx IN INDICES OF vInts SAVE EXCEPTIONS
    23        INSERT INTO A(Int, Dt, Str)
    24          VALUES(vInts(indx), vDts(indx), vStrs(indx));
    25    EXCEPTION
    26      WHEN eBulk_Error THEN
    27        DECLARE
    28          vErr_Count INTEGER;
    29          vErr_Message VARCHAR2(2000);
    30          vErr_Indices T_Int_List;
    31        BEGIN
    32          vErr_Count := SQL%BULK_EXCEPTIONS.COUNT;
    33          -- Getting the indices of rows with duplicate key
    34          -- FOR vI IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP - Doesn't work for some reason on 10.2.0.1.0
    35          FOR vI IN 1 .. vErr_Count LOOP
    36            IF SQL%BULK_EXCEPTIONS(vI).Error_Code <> 1 THEN
    37              -- Error other that duplicate key is reraised
    38              vErr_Message := SQLERRM(-SQL%BULK_EXCEPTIONS(vI).Error_Code);
    39              Raise_Application_Error(-20001, vErr_Message);
    40            END IF;
    41            vErr_Indices(SQL%BULK_EXCEPTIONS(vI).ERROR_INDEX) := SQL%BULK_EXCEPTIONS(vI).ERROR_INDEX;
    42          END LOOP;
    43       
    44          -- Bulk update
    45          FORALL vI IN INDICES OF vErr_Indices
    46            UPDATE A SET
    47                Dt = vDts(vI),
    48                Str = vStrs(vI)
    49              WHERE Int = vInts(vI);
    50        END;
    51    END;
    52  END;
    53  /
    PL/SQL procedure successfully completed.
    SQL> select * from a;
           INT DT         STR
             1 10.02.2009 123
             2 09.02.2009 888
             3 19.01.2009 0000
    Regards,
    Dima
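    As an aside, when the target table has a composite primary key like this, a single MERGE can often replace the whole insert-then-update-on-ORA-00001 pattern. Below is an untested sketch using the table and column names from the question, omitting the username filter for brevity:
    MERGE INTO oe_inventory_temp t
    USING (SELECT whse_loc, shape, isize, grade, length,
                  0 AS pieces, inv_lbs_commit, roll_lbs_commit, ishcom_lbs
             FROM oe_sales_ord_commit_temp
            WHERE processed_datetime IS NULL) s
    ON (t.loc = s.whse_loc AND t.shape = s.shape AND t.isize = s.isize
        AND t.grade = s.grade AND t.length = s.length AND t.pieces = s.pieces)
    WHEN MATCHED THEN UPDATE SET
         t.inv_lbs_commit  = s.inv_lbs_commit,
         t.roll_lbs_commit = s.roll_lbs_commit,
         t.ishcom_lbs      = s.ishcom_lbs
    WHEN NOT MATCHED THEN INSERT
         (loc, shape, isize, grade, length, pieces,
          inv_lbs_commit, roll_lbs_commit, ishcom_lbs)
         VALUES (s.whse_loc, s.shape, s.isize, s.grade, s.length, s.pieces,
                 s.inv_lbs_commit, s.roll_lbs_commit, s.ishcom_lbs);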

  • Creating a list of unique values within a range.

    I have a huge data set I am working with and need to do two things.
    First I need to get rid of any repeat entries, of which there are many.
    Second, and more importantly, I need to generate a list of unique values within a column for which I have no idea what the number or sort of values might be. For instance, say the column contained flavors of ice cream, I am looking for a formula that can return an array of the flavors listed.
    Can either of these two things be done in numbers?

    CJ Eder wrote:
    I have a huge data set I am working with and need to do two things.
    First I need to get rid of any repeat entries, of which there are many.
    If entries are in column B starting from B2
    In C2 enter the formula:
    =IF(COUNTIF($B$1:$B1,B)=0,B,"")
    and apply fill down.
    You will get a single copy of existing entries.
    Select the column C
    Copy to Clipboard
    Paste in column C
    Sort on column C
    delete the rows whose cell of column C is blank.
    If I understand well, the same protocol applies to your second request.
    Yvan KOENIG (from FRANCE, Wednesday, July 15, 2009 21:32:30)

  • LOV's auto refresh and having unique values.

    Hi, I have LOVs connected to a few attributes of a table, and those fields are used in the view criteria of the table.
    This view criteria is used in the af:query component as search fields.
    To avoid duplicates in the LOV, I created a separate view for each of the attributes, using the DISTINCT keyword, and connected them with the attributes via view accessors, so that the af:query component's search-field drop-downs show unique values of the column.
    There is another requirement: to refresh the LOVs of the search fields when there is a change in the database. To use the auto-refresh property of the view's query for this, I added the PK of the table to each view. By doing this I can see the latest values of the database in the UI, i.e., the LOVs of af:query pick up the changes in the database.
    But this causes duplicate values to appear in the LOV, and the DISTINCT keyword in the view's query will not work because the PK of the table has been added to the query.
    I tried different ways to query, like GROUP BY, but no success.
    I need both auto refresh and unique values in the LOVs of the af:query. Could someone point me to a reference to solve this? Thanks.

    user642477 wrote:
    > To avoid the duplicates in the LOV, i created a separate view for each of the attribute, using the distinct key...
    Why have you done that? What is your reason? It seems so weird to me. You could simply add DISTINCT to the query of the view object!
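    For reference, the suggestion is just to put DISTINCT in the view object's own query, e.g. with a hypothetical table and column:
    SELECT DISTINCT dept.department_name
    FROM departments dept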

  • Need to test if a column has unique values or not

    Hi all,
    in an ETL process I need to check whether some columns have unique values or not, using SQL or PL/SQL.
    Suppose we need to load a big data file with an external table initially, and then test whether the values
    of one or more columns are unique, in order to proceed with the ETL process.
    What is the fastest test I can execute to verify whether a column has unique values or not?
    Which is better for ETL performance:
    a. constraint techniques like those described on the Ask Tom forum under
    "ENABLE NOVALIDATE validating existing data"?
    (Ask Tom, "ENABLE NOVALIDATE validating existing da...")
    b. "simply" query on the data?
    like this:
    select count(count(*)) distinct_count,
             sum(count(*)) total_count,
             sum(case when count(*) = 1 then 1 else null end) unique_groups,
             sum(case when count(*) > 1 then 1 else null end) duplicated_groups
    from hr.employees a
    group by a.job_id
    c. use analytics function?
    d. use some feature directly on external table?
    Bye in advance

    Here is an example of handling the errors using the LOG ERRORS INTO concept. Check it and let me know if you have any doubts.
    DATAFILE:-
    1000,ANN,ZZ105
    1001,KARTHI,ZZ106
    1002,PRAVEEN,ZZ109
    1002,PARTHA,ZZ107
    1003,SATHYA,ZZ108
    1000,ANN,ZZ105
    ----- Original Table With unique constraints
    SQL> CREATE TABLE tab_uniqtest(student_id     NUMBER(10) UNIQUE,
      2                            student_name   VARCHAR2(15),
      3                            course_name    VARCHAR2(15)
      4                            );
    Table created.
    ----- External table
    SQL> CREATE TABLE tab_extuniqtest(student_id     NUMBER(10),
      2                               student_name   VARCHAR2(15),
      3                                                  course_name    VARCHAR2(15)
      4                              )
      5  ORGANIZATION EXTERNAL
      6  (
      7  DEFAULT DIRECTORY ann_dir
      8  ACCESS PARAMETERS
      9  (
    10    RECORDS DELIMITED BY NEWLINE
    11    BADFILE 'tabextuniqtest_badfile.txt'
    12    LOGFILE 'tabextuniqtest_logfile.txt'
    13    FIELDS TERMINATED BY ','
    14    MISSING FIELD VALUES ARE NULL
    15    REJECT ROWS WITH ALL NULL FIELDS
    16    (student_id,student_name,course_name)
    17  )
    18  LOCATION ('unique_check.csv')
    19  )
    20  REJECT LIMIT UNLIMITED;
    Table created.
    ---- Error logging table to log the errors
    SQL> CREATE TABLE dmlerrlog_uniqtest(ORA_ERR_NUMBER$     NUMBER ,
      2                                 ORA_ERR_MESG$       VARCHAR2(2000),
      3                                 ORA_ERR_ROWID$      ROWID,
      4                                 ORA_ERR_OPTYP$      VARCHAR2(2),
      5                                 ORA_ERR_TAG$        VARCHAR2(4000),
      6                                 inserted_dt         VARCHAR2(50) DEFAULT TO_CHAR(SYSDATE,'YYYY-MM-DD'),
      7                                 student_id              VARCHAR2(10)
      8                                  );
    Table created.
    ---- Procedure to insert from external table
    SQL> CREATE OR REPLACE PROCEDURE proc_uniqtest
      2  AS
      3  v_errcnt NUMBER;
      4  BEGIN
      5      INSERT INTO tab_uniqtest
      6      SELECT * FROM tab_extuniqtest
      7      LOG ERRORS INTO dmlerrlog_uniqtest('PROC_UNIQTEST@TAB_UNIQTEST') REJECT LIMIT UNLIMITED;
      8      SELECT COUNT(1) into v_errcnt
      9      FROM dmlerrlog_uniqtest
    10      WHERE ORA_ERR_TAG$ = 'PROC_UNIQTEST@TAB_UNIQTEST';
    11       IF(v_errcnt > 0) THEN
    12       ROLLBACK;
    13      ELSE
    14        COMMIT;
    15       END IF;
    16      DBMS_OUTPUT.PUT_LINE ( 'Procedure PROC_UNIQTEST is completed with ' || v_errcnt || ' errors') ;
    17  EXCEPTION
    18   WHEN OTHERS THEN
    19    RAISE;
    20  END proc_uniqtest;
    21  /
    Procedure created.
    SQL> SET SERVEROUTPUT ON
    SQL> EXEC proc_uniqtest;
    Procedure PROC_UNIQTEST is completed with 2 errors
    PL/SQL procedure successfully completed.
    SQL> SELECT STUDENT_ID,ORA_ERR_MESG$ FROM dmlerrlog_uniqtest;
    STUDENT_ID                     ORA_ERR_MESG$
    1002                           ORA-00001: unique constraint (
                                   SCOTT.SYS_C0037530) violated
    1000                           ORA-00001: unique constraint (
                                   SCOTT.SYS_C0037530) violated
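    For option (c) in the original question, an analytic COUNT flags the offending rows in a single pass, without any constraint. This is a sketch against the same HR table used above:
    -- Rows whose JOB_ID occurs more than once; an empty result means the column is unique.
    SELECT *
    FROM (SELECT a.*,
                 COUNT(*) OVER (PARTITION BY a.job_id) AS key_count
            FROM hr.employees a)
    WHERE key_count > 1;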

  • Filtering unique values in dataset

    Hi, I have a nice little dataset here, populated from XML. It contains articles, categories.....
    What I need is to initialize my custom menu class with unique values from the categories column. So how do I query my dataset to return only the unique categories?
    I could do it in my XML parser (extract uniques into an array), but that would be poor design.
    So please help! Need those uniques out of the dataset fast!

    > Could you please tell me how I will filter duplicate values inside the object
    I'm sorry, but I can't read the future.

  • How to create surrogate key in dimension without unique value

    Hi, I have a dimension where no column has unique values. I want to add a surrogate key to replace the existing primary key, which is derived from concatenating 3 columns (e.g. 'A'||'B'||'C'). I'm thinking of using a sequence, but this won't let me link the dimension to the fact table. How do I come up with a surrogate key in this situation? Thanks. ~Tracy

    I'm actually trying to accomplish something similar myself.
    In my sources I've got two sorts of customers: ones that are directly reported, and ones whose information is provided with sales records (the latter stored in module ODS).
    Of course identification is different, but in the datamart (module DWH) I'm sort of forced to use an equivalent way of loading (due to the way it first used to work). To accelerate lookups on dimensions, I copy the ODS surrogate key to the DWH dimensions, but this does not work for the 'inbuilt' customers because they do not have a surrogate key in the ODS.
    They DO have a means of unique identification, and at first I thought I could concatenate these (also 3) columns to use as an identification code. Unfortunately this is VARCHAR2, where the surrogate key is (naturally) NUMBER.
    So now it looks like I'm forced to first build a table in the ODS especially for these 'inbuilt' customers and assign a surrogate key (by sequence) to it; this way it conforms to how 'normal' customers are loaded into the DWH.
    I guess you'll have to pull off the same trick, i.e. create a table with either only the 'translation' of the code to a surrogate key, or all information that is fed into the dimension, which can then be used as a lookup or as a complete source when loading data into your datamart.
    Good luck, Patrick
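    A minimal sketch of that 'translation table' idea in Oracle SQL, with hypothetical names (the source table and the three natural-key columns are placeholders):
    CREATE SEQUENCE cust_sk_seq;
    CREATE TABLE cust_keymap (
        surrogate_key NUMBER PRIMARY KEY,
        nat_a         VARCHAR2(30),
        nat_b         VARCHAR2(30),
        nat_c         VARCHAR2(30),
        CONSTRAINT cust_keymap_uk UNIQUE (nat_a, nat_b, nat_c)
    );
    -- Assign a surrogate key to every natural key not mapped yet.
    INSERT INTO cust_keymap (surrogate_key, nat_a, nat_b, nat_c)
    SELECT cust_sk_seq.NEXTVAL, s.nat_a, s.nat_b, s.nat_c
    FROM (SELECT DISTINCT nat_a, nat_b, nat_c FROM ods_customer_src) s
    WHERE NOT EXISTS (SELECT 1 FROM cust_keymap m
                      WHERE m.nat_a = s.nat_a
                        AND m.nat_b = s.nat_b
                        AND m.nat_c = s.nat_c);
    The dimension and fact loads can then join through cust_keymap to pick up the NUMBER key.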

  • Use only unique values as combo box data provider.

    I have an array collection of objects, each of which has a property I will simply call 'name'. Let us say that this array collection contains a list of objects with the following name values:
    ['George','Fred','George','Ron','Bill','Charlie','Bill','Bill','Percy'].
    I would like to use these names as the data provider of a combo box, but without the duplicates. What would be the easiest way to distill this down to an array of unique values to use as this combo box's data provider in Flex?
    The original array collection was the result of a remote object call to a cfc that returned a record set from a data store. Is it a better option to create a new function in this cfc that returns the unique list I desire, or can I do this easily inside the Flex code since I already have the complete list?
    TIA
    Ian Skinner

    Untested:
    var aData:Array = ['George','Fred','George','Ron','Bill','Charlie','Bill','Bill','Percy'];
    var oCheck:Object = new Object();
    var aUnique:Array = new Array();
    var sValue:String;
    for (var i:int = 0; i < aData.length; i++) {
        sValue = aData[i];
        if (!oCheck[sValue]) { // is this a unique value?
            oCheck[sValue] = true; // put this value in the check object
            aUnique.push(sValue);
        }
    }
    Note: if the values contain illegal property-name values, you would need to modify this.
    Tracy
