Unique constraint control in a data table

I developed a form where the user can enter or modify a list of profiles.
Each profile has a code (entered by the user).
I need to check that the profile code is unique.
What are the recommendations?
Should I perform this check in the validation method, in the action method, or should I leave it to the database and handle the resulting exception (but how...)?
Do you have an example of such a check?

Thanks for your answer.
Actually, this is what I'm trying to do, but it's not so obvious:
1 - I need to check the uniqueness of the code within the list, so I need all the new values before validating. The "Validator" attribute doesn't apply to the datatable.
2 - I also want to check the codes against other rows in the DB (which can be hidden on this screen). So, should I access the DB from the managed bean just for this check?
I don't know if it's possible, but what I'd like is to attempt the insert and let the DB raise an exception; the problem is that I don't know how to handle that exception, because by the time the insert runs I'm at the end of the lifecycle and no longer in the validation phase.
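For what it's worth, if the check is delegated to the database, the rule lives on the table itself. A minimal sketch, assuming a hypothetical PROFILE table with a CODE column (names not taken from the original post):
-- Hypothetical names; once in place, any duplicate code is rejected with
-- ORA-00001 regardless of which screen or application performs the insert.
ALTER TABLE profile ADD CONSTRAINT profile_code_uk UNIQUE (code);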

Similar Messages

  • Adding a UNIQUE Constraint to an existing table

    Hi,
    I ran into this issue because of existing data. In my table I want to add a unique constraint to two columns, but I cannot, because of existing repeated data. And I cannot run a data repair to fix the repeated data, since the customer is relying on it, so we cannot get the decision to repair the data.
    As a workaround I tried adding a function that checks for repeated data before inserting into the table. It works fine for a single user, but with multiple users it does not, because users can log in to the database and run their transactions simultaneously.
    My question is: is there a way in Oracle to add a constraint that applies only to data added in the future, not to the old existing data?
    Thanks,
    Darex.

    user9359353 wrote:
    Hi,
    As a workaround I tried adding a function that checks for repeated data before inserting into the table. It works fine for a single user, but with multiple users it does not, because users can log in to the database and run their transactions simultaneously.
    Show us what is "not working". If you are calling this function from a trigger the correct way, the first person to commit will have their data inserted and the next person will not be able to insert.
    Edit: you may want to have a read through this thread, where I was encountering a problem with multi-row validation using triggers:
    Row level validation dependant on other rows?
    note this post:
    Rob van Wijk wrote:
    Hi WhiteHat,
    Here are two blogposts of mine about this subject that you might find useful.
    One with some guidelines about implementing entity rules: http://rwijk.blogspot.com/2008/08/implementing-entity-rules.html
    And one with an example of how to implement (among others) an overlap check in the context of another option for your question, the product RuleGen: http://rwijk.blogspot.com/2008/05/rulegen-test.html
    Hope this helps.
    Regards,
    Rob.
    Edited by: WhiteHat on Jun 8, 2011 3:51 PM

  • Performance Impact of Unique Constraint on a Date Column

    In a table I have a compound unique constraint which extends over 3 columns. As a part of functionality I need to add another DATE column to this unique constraint.
    I would like to know the performance implications of adding a DATE column to the unique constraint. Would the DATE column behave like another VARCHAR2 or NUMBER column, or would it degrade the performance significantly?
    Thanks
    Message was edited by:
    user627808

    What performance are you concerned about degrading? Inserts? Or queries? If you're talking about queries, what sort of access path are you concerned about?
    Are you concerned that merely changing the definition of the unique constraint would impact performance? Or are you worried that whatever functional change you are making would impact performance (i.e. if you are now retaining historical data in the table rather than just updating it)?
    Regardless of the performance impact, unique indexes (and unique constraints) need to be correct. If you need to allow duplicates on the 3 current columns with different dates, then you would need to change the unique constraint definition regardless of the performance impact. Fast and wrong generally isn't going to be preferable to slow and right.
    Generally, though, there probably is no reason to be terribly concerned about performance here. Indexing a date is no different than indexing any other primitive data type.
    Justin
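    To make the "change the unique constraint definition" step concrete, here is a minimal sketch with hypothetical table and column names (not the poster's actual objects):
    -- Drop the existing 3-column unique constraint and re-create it with the
    -- DATE column included; the index behind it is rebuilt accordingly.
    ALTER TABLE bookings DROP CONSTRAINT bookings_uk;
    ALTER TABLE bookings ADD CONSTRAINT bookings_uk
      UNIQUE (col_a, col_b, col_c, booking_date);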

  • Add a unique constraint on binary XML table

    How do I add a unique constraint on the "brevet" field?
    The following INSERT fails with:
    SQL Error: ORA-19025: EXTRACTVALUE renvoie la valeur d'un seul noeud
    19025. 00000 - "EXTRACTVALUE returns value of only one node"
    If the ALTER is run after the INSERT, the INSERT succeeds but the ALTER fails with the same error message!
    /* copy the file compavions.xml
    <?xml version="1.0" encoding="ISO-8859-1"?>
    <compagnie>
         <comp>AB</comp>
         <flotte>
              <avion immat="F-WTSS" capacite="90">
                   <typeAv>Concorde</typeAv>
              </avion>
              <avion immat="F-GFDR" capacite="145">
                   <typeAv>A320</typeAv>
              </avion>
              <avion immat="F-GTYA" capacite="150">
                   <typeAv>A320</typeAv>
              </avion>
         </flotte>
         <nomComp>Air Blagnac</nomComp>
         <pilotes>
              <pilote>
              <brevet>PL-1</brevet>
              <nom>C. Sigaudes</nom>
              </pilote>
              <pilote>
              <brevet>PL-2</brevet>
              <nom>P. Filloux</nom>
              </pilote>
         </pilotes>
    </compagnie>
    in C:\... */
    --DROP DIRECTORY repxml;
    --CREATE DIRECTORY repxml AS 'C:\...';
    DROP TABLE pilote_binary_xml5;
    CREATE TABLE pilote_binary_xml5 OF XMLType
    XMLTYPE STORE AS BINARY XML
    VIRTUAL COLUMNS
    (col AS (EXTRACTVALUE(OBJECT_VALUE, '/compagnie/pilotes/pilote/brevet')));
    ALTER TABLE pilote_binary_xml5 ADD CONSTRAINT brevet_unique UNIQUE (col);
    INSERT INTO pilote_binary_xml5 VALUES (XMLType(BFILENAME ('REPXML','compavions.xml'), NLS_CHARSET_ID ('AL32UTF8')));
    --ALTER TABLE pilote_binary_xml5 ADD CONSTRAINT brevet_unique UNIQUE (col);

    You could try something like
    (extract(OBJECT_VALUE, '/compagnie/pilotes/pilote/brevet').getStringVal());
    but this is probably inadvisable.
    I suggest, as an alternative, that you look at XMLINDEX so that you can bring the 'proper' XML functions like XMLTABLE into play.
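    As a small, hedged sketch of the XMLTABLE route suggested above (same table as in the question; query side only, not a replacement for the constraint):
    -- Shred each <pilote> element into its own relational row, so the
    -- multi-node situation behind ORA-19025 no longer applies when querying.
    SELECT x.brevet
    FROM   pilote_binary_xml5 p,
           XMLTABLE('/compagnie/pilotes/pilote'
                    PASSING p.OBJECT_VALUE
                    COLUMNS brevet VARCHAR2(20) PATH 'brevet') x;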

  • Tip: Unique Constraint on Part of Table

    Here's a cute little trick I picked up from where it was hiding deep within one of the manuals:
    Consider a table where one column needs to be unique, but only part of the time, based on some condition within the row. For example, col1 must be unique when its value is 1000 or above, but any number of rows can have the same value for col1 when it is 999 or less. Another example would be a flag column that determines whether col1 must be unique. Don't ask me WHY you would do either of these things.
    Here's how to accomplish the first example (btw both examples require a PK on the table):
    CREATE TABLE junk(
    junk_key INTEGER NOT NULL,
    col1 INTEGER NOT NULL,
    CONSTRAINT pk_junk PRIMARY KEY(junk_key)
    );
    --The way we accomplish this is with a function-based index.
    CREATE UNIQUE INDEX idx_semiuniq ON junk(
    CASE WHEN col1>=1000 THEN TO_CHAR(col1) ELSE '~' || TO_CHAR(junk_key) END);
    --The following statements should succeed:
    insert into junk values(1,1);
    insert into junk values(2,1);
    insert into junk values(3,1000);
    insert into junk values(4,1001);
    update junk set col1=1002 where junk_key=4;
    --The following statements should fail:
    insert into junk values(5,1000);
    update junk set col1 = 1001 where junk_key=1;
    The same strategy applies to the second example, except that in this case we'll need an additional column to indicate that col1 needs to be unique:
    CREATE TABLE junk( junk_key INTEGER NOT NULL,
    col1 INTEGER NOT NULL,
    uniq_flag CHAR(1) NOT NULL,
    CONSTRAINT pk_junk PRIMARY KEY(junk_key),
    CONSTRAINT ck_junk CHECK (uniq_flag IN ('Y','N'))
    );
    CREATE UNIQUE INDEX idx_semiuniq ON junk(CASE WHEN uniq_flag = 'Y' THEN TO_CHAR(col1) ELSE '~' || TO_CHAR(junk_key) END);
    --these should succeed:
    insert into junk values(1,1,'N');
    insert into junk values(2,1,'N');
    insert into junk values(3,1,'Y');
    insert into junk values(4,2,'Y');
    --these should fail:
    insert into junk values(5,2,'Y');
    update junk set uniq_flag='Y' where junk_key=2;
    Another way to accomplish this is with a "not exists" in a check constraint, but I think the function-based index will perform better especially when inserting or updating large numbers of rows.

  • Insert called before delete in a collection with unique constraint

    Hi all,
    I have a simple @OneToMany private mapping:
    private Collection<Item> items;
    @OneToMany(mappedBy = "parent", cascade = CascadeType.ALL)
    public Collection<Item> getItems() {
        return items;
    }
    public void setItems(Collection<Item> items) {
        this.items = items;
    }
    public void customize(ClassDescriptor classDescriptor) throws Exception {
        OneToManyMapping mapping = (OneToManyMapping)
            classDescriptor.getMappingForAttributeName("items");
        mapping.privateOwnedRelationship();
    }
    I have a unique constraint on my Items table that a certain value cannot be duplicated.
    My problem appears when I remove a previously saved item from the collection and, at the same time, add a new item containing the same data.
    After I save the parent and do a flush, I receive SQLIntegrityConstraintViolationException because TopLink performs the insert first instead of deleting the existing item first.
    I tested the application and everything went fine with: remove item / save parent / insert item / save parent
    I checked on the Internet and the documentation but didn't find anything similar to my problem. I tried debugging TopLink's internal calls but I'm missing some general ideas about all the inner workings and don't know what to look for. I use TopLink version: Oracle TopLink Essentials - 2.1 (Build b60e-fcs (12/23/2008))
    Does anyone have a hint of what to look for?
    Edited by: wise_guybg on Sep 25, 2009 4:01 PM
    Edited by: wise_guybg on Oct 5, 2009 11:22 AM

    Thank you for the suggestions James
    As I mentioned briefly, I have done some debugging but couldn't understand how collections are updated. What I did find out is that setShouldPerformDeletesFirst() doesn't come into play in this case, because this is not a consecutive change on entities.
    What I have in my case is a collection inside an entity that the user has tampered with, and now TopLink has to do a merge. I cannot call flush() in the middle, since the user has not approved that the changes made to the entity should be saved.
    I see that it's not easy for TopLink to figure out the order in which changes were made to a collection. Here is pseudo-code for the case where the constraint is violated:
    entity.items.remove(a)
    entity.items.add(b)
    merge(entity)
    And here is code that executes without a problem:
    entity.items.remove(a)
    merge(entity)
    entity.items.add(b)
    merge(entity)
    So once again, I think that collection changes are managed differently, but I can't find a way to tell TopLink how to handle them. Any ideas?

  • Unique Constraint

    Hello,
    OWM_VERSION: 11.1.0.7.0
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
    I have a situation where, somewhere along the line, the primary keys for a few records were thrown off. This causes Workspace Manager to try to insert the T-520575 workspace records into LIVE when they are merged back together, which raises a unique constraint violation. Here is what the data looks like:
    PK       UNQ1    UNQ2    WORKSPACE
    2095989  495685  152568  T-520575
    491685   495685  152568  LIVE
    2095990  495687  152569  T-520575
    491687   495687  152569  LIVE
    2096052  495689  152570  T-520575
    491689   495689  152570  LIVE
    I tried deleting the records out of the T-520575 workspace and refreshing the table using the following, but it still raises the unique constraint violation:
       delete from table
        where PK in( 2095989, 2095990, 2096052);
       dbms_wm.refreshtable(
          workspace    => 'T-520575',
          table_id     => 'TABLE',
          where_clause => 'PK in ( 491685, 491687, 491689)',
          auto_commit  => FALSE
       );
    I can see that the records I deleted have a WM_OPTYPE of 'D' but do not have a retiretime. How do I retire these records so that I can refresh them from LIVE? Or is there a better solution to correct this issue?
    Edit
    I should also add that LIVE is the parent of this workspace T-520575
    Thanks,
    Tyson
    Edited by: Tyson Jouglet on Apr 27, 2012 8:34 AM

    Hi Tyson,
    Are you sure there are no other rows that have a conflicting value for the unique constraint? What you have described should work. Since the rows have been deleted, they are no longer considered when evaluating the constraints when executing RefreshTable. You should be able to both merge and refresh the workspace after deleting the rows. Also, the row(s) with the 'D' wm_optype would not have a retiretime value, unless the row was reinserted. Only the 'I' or 'U' row(s) would have a retiretime value in this case.
    If there are not any other rows, then I would need a more complete description. Are there any continuously refreshed workspaces that have T-520575 as a parent workspace? Those workspaces would also have to be checked.
    Regards,
    Ben

  • Foreign key also refer to unique constraint??

    A foreign key can also refer to a unique constraint.
    (GREAT...)
    1. Does the table containing the unique constraint then act as the master table?
    2. Can a unique constraint be used in place of a primary key?
    3. Does a unique constraint plus NOT NULL give all the functionality of a primary key constraint?
    4. If primary key = unique + not null, then what is the use of a primary key?
    thanks
    kuljeet pal singh

    When you are establishing a foreign key relationship between two tables, a child record must point to a unique record in the parent table. Typically, the child record points to the primary key of the parent, although any unique field or fields in the parent will do.
    So, a table with a unique constraint can act as a parent table in a foreign key relationship.
    A unique constraint may be replaced with a primary key, but not necessarily.
    A unique constraint plus a not null constraint is functionally identical to a primary key.
    The principal benefit of a primary key compared to unique plus not null is that it provides additional information to someone looking at the database. The primary key is the unchanging identifier for a particular record. A unique constraint plus a not null constraint only implies uniqueness. It is somewhat common for unique values to change over time, as long as they remain unique, but a primary key should never change.
    TTFN
    John
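    A minimal sketch of the point above, with hypothetical tables: a foreign key may reference a UNIQUE constraint in the parent just as it may reference the primary key.
    CREATE TABLE parent_t (
      parent_id NUMBER PRIMARY KEY,
      alt_code  VARCHAR2(10) NOT NULL CONSTRAINT parent_alt_code_uk UNIQUE
    );
    CREATE TABLE child_t (
      child_id  NUMBER PRIMARY KEY,
      -- The FK targets the parent's unique column rather than its primary key.
      alt_code  VARCHAR2(10) CONSTRAINT child_parent_fk REFERENCES parent_t (alt_code)
    );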

  • Table with unique Constraint

    Dear All
    I have an ADF-BC / JSF project (JDeveloper 11.1.2.3.0, the latest), and I have an EO with a PK constraint in the DB on 2 fields (userid & Roleid). I implemented a bundle to handle the error message with a JBO error code, and it works fine in the AM tester.
    I also have a VO with an LOV on one of this unique constraint's columns (Roleid). I dropped the VO onto the JSF page as an af:table, as below, with an input text with list of values for Roleid and autoSubmit = true, and I face unexpected behaviour from the LOV attribute when a repeated value is entered:
    When I enter a repeated value, it gives me the error message I created in the bundle, and everything is OK until now.
    But when I tab out of the input text with list of values, it goes back to the old value, as if validation fired in the background; this is not a problem yet either.
    When I try to do anything else, it still gives me the error message about the duplicated key.
    I then change the value again to a correct one to avoid the duplication error message and, to my surprise, I still get the error message and it shows me the repeated value again!
    Simply put, it still saves the old repeated value even though I corrected it. Please, can anyone help me understand what is happening and how to solve it?
    Attribute in EO :
    <Attribute
    Name="RoleId"
    Precision="10"
    ColumnName="ROLE_ID"
    SQLType="VARCHAR"
    Type="java.lang.String"
    ColumnType="VARCHAR2"
    TableName="USER_ROLES"
    PrimaryKey="true">
    <DesignTime>
    <Attr Name="_DisplaySize" Value="10"/>
    </DesignTime>
    <validation:ExistsValidationBean
    Name="RoleId_Rule_1"
    ResId="CS.model.BC.EO.UserRolesEO.RoleId_Rule_1"
    OperandType="EO"
    AssocName="CS.model.BC.ASS.UsersRolesFk2ASS"/>
    </Attribute>
    Interface af : table
    <af:table value="#{bindings.UserRoles2.collectionModel}" var="row"
    rows="#{bindings.UserRoles2.rangeSize}"
    emptyText="#{bindings.UserRoles2.viewable ? 'No data to display.' : 'Access Denied.'}"
    fetchSize="#{bindings.UserRoles2.rangeSize}"
    rowBandingInterval="0"
    filterModel="#{bindings.UserRoles2Query.queryDescriptor}"
    queryListener="#{bindings.UserRoles2Query.processQuery}"
    filterVisible="true" varStatus="vs"
    selectedRowKeys="#{bindings.UserRoles2.collectionModel.selectedRow}"
    selectionListener="#{bindings.UserRoles2.collectionModel.makeCurrent}"
    rowSelection="single" id="t1" columnSelection="none"
    columnStretching="column:c3">
    <af:column sortProperty="#{bindings.UserRoles2.hints.RoleId.name}"
    filterable="true" sortable="true"
    headerText="#{bindings.UserRoles2.hints.RoleId.label}"
    id="c2">
    <af:inputListOfValues id="roleIdId"
    popupTitle="Search and Select: #{bindings.UserRoles2.hints.RoleId.label}"
    value="#{row.bindings.RoleId.inputValue}"
    model="#{row.bindings.RoleId.listOfValuesModel}"
    required="#{bindings.UserRoles2.hints.RoleId.mandatory}"
    columns="#{bindings.UserRoles2.hints.RoleId.displayWidth}"
    shortDesc="#{bindings.UserRoles2.hints.RoleId.tooltip}"
    autoSubmit="true" editMode="select">
    <f:validator binding="#{row.bindings.RoleId.validator}"/>
    </af:inputListOfValues>
    </af:column>
    <af:column sortProperty="#{bindings.UserRoles2.hints.RoleName.name}"
    sortable="true"
    headerText="#{bindings.UserRoles2.hints.RoleName.label}"
    id="c3">
    <af:outputFormatted value="#{row.bindings.RoleName.inputValue}"
    id="of7" partialTriggers="roleIdId"/>
    </af:column>
    <af:column sortProperty="#{bindings.UserRoles2.hints.Active.name}"
    filterable="true" sortable="true"
    headerText="#{bindings.UserRoles2.hints.Active.label}"
    id="c4">
    <af:outputFormatted value="#{row.bindings.Active.inputValue}"
    id="of8" partialTriggers="roleIdId"/>
    </af:column>
    </af:table>
    Edited by: user8854969 on Oct 7, 2012 1:34 PM
    Edited by: user8854969 on Oct 7, 2012 2:16 PM

    I believe there is a little confusion here. The error I am encountering has to do with a unique constraint violation and not a foreign key constraint. If I have the data:
    PK FK sequence
    1 5 1
    2 5 2
    3 5 3
    with a unique constraint on (FK, sequence) and want to change it to:
    PK FK sequence
    1 5 1
    4 5 2 --insert
    2 5 3 --update on sequence
    3 5 4 --update on sequence
    I am currently getting a unique constraint violation because the insert is issued before the updates, and the updates alone cause problems because they are issued out of order (i.e. if I do the shifting operation without the insertion of a new record).
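    One database-side workaround, not mentioned in the thread, is to declare the unique constraint as deferrable so it is only checked at commit; the temporary duplicate created by the statement ordering then no longer matters. A sketch with hypothetical names:
    -- Deferrable constraints must be declared as such, so drop and re-create.
    -- With INITIALLY DEFERRED the check happens at COMMIT, not per statement.
    ALTER TABLE child_rows DROP CONSTRAINT child_rows_uk;
    ALTER TABLE child_rows ADD CONSTRAINT child_rows_uk
      UNIQUE (fk_id, sequence_no) DEFERRABLE INITIALLY DEFERRED;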

  • Unique constraint violation on version enabled table

    hi!
    we're facing a strange problem with a version enabled table that has a unique constraint on one column. If we rename an object stored in the table (the name attribute of the object is the one with the unique constraint on the respective column) and then rename it back to its old name, we get an ORA-00001 unique constraint violation on the execution of an update trigger.
    If the constraint were simply applied to the now version-enabled table exactly as before, I would understand why this happens, but shouldn't Workspace Manager take care of something like that when a table with unique constraints is version enabled? (The documentation also says it does.) Taking versioning into account, it's not that we are trying to insert another object with the same name; it's the same object, at another point in time, getting its old name back.
    we somewhat assume that to be a pretty standard scenario when using versioned data.
    is this some kind of bug or do we just miss something important here?
    more information:
    - versioning is enabled on all tables with VIEW_WO_OVERWRITE and no valid time support
    - database version is 10.2.0.1.0
    - wm installation output:
    ALLOW_CAPTURE_EVENTS OFF
    ALLOW_MULTI_PARENT_WORKSPACES OFF
    ALLOW_NESTED_TABLE_COLUMNS OFF
    CR_WORKSPACE_MODE OPTIMISTIC_LOCKING
    FIRE_TRIGGERS_FOR_NONDML_EVENTS ON
    NONCR_WORKSPACE_MODE OPTIMISTIC_LOCKING
    NUMBER_OF_COMPRESS_BATCHES 50
    OWM_VERSION 10.2.0.1.0
    UNDO_SPACE UNLIMITED
    USE_TIMESTAMP_TYPE_FOR_HISTORY ON
    - all operations are done on LIVE workspace
    any help is appreciated.
    EDIT: we found out the following: the table we are talking about is the only table where the unique constraint was left in place, so there must have been a problem during version enabling. On another Oracle installation we did everything the same way and the unique constraint wasn't left there, so everything works fine.
    regards,
    Andreas Schilling
    Message was edited by:
    aschilling

  • Unique constraint on valid_to and valid_from dates

    Suppose I have a table:
    create table test (name varchar2(40), valid_from date, valid_to date);
    insert into test values ('fred', to_date('01/04/2008', 'dd/mm/yyyy'), to_date('01/04/2008', 'dd/mm/yyyy'));
    insert into test values ('fred', to_date('04/04/2008', 'dd/mm/yyyy'), to_date('06/04/2008', 'dd/mm/yyyy'));
    insert into test values ('fred', to_date('08/04/2008', 'dd/mm/yyyy'), to_date('09/04/2008', 'dd/mm/yyyy'));
    How can I enforce uniqueness such that at any one point in time, only one row exists in the table for each name?
    eg. insert into test values ('fred', to_date('02/04/2008', 'dd/mm/yyyy'), to_date('03/04/2008', 'dd/mm/yyyy')); -- success!
    insert into test values ('fred', to_date('02/04/2008', 'dd/mm/yyyy'), to_date('05/04/2008', 'dd/mm/yyyy')); -- fail!
    insert into test values ('fred', to_date('01/04/2008', 'dd/mm/yyyy'), to_date('03/04/2008', 'dd/mm/yyyy')); -- fail!
    insert into test values ('fred', to_date('05/04/2008', 'dd/mm/yyyy'), to_date('05/04/2008', 'dd/mm/yyyy')); -- fail!
    insert into test values ('fred', to_date('07/04/2008', 'dd/mm/yyyy'), to_date('11/04/2008', 'dd/mm/yyyy')); -- fail!
    Is there a method using FBIs or unique constraints? I'd really rather avoid using triggers and PL/SQL if I can, but I can't think of a way...
    Message was edited by:
    Boneist
    Added some extra test conditions

    How about this pair of indexes:
    CREATE UNIQUE INDEX test_fromdate_idx ON test(name,valid_from);
    CREATE UNIQUE INDEX test_todate_idx ON test(name,valid_to);
    Here is the test:
    SQL> create table test (name varchar2(40), valid_from date, valid_to date);
    Table created.
    SQL> insert into test values ('fred', to_date('01/04/2008', 'dd/mm/yyyy'), to_date('01/04/2008', 'dd/mm/yyyy'));
    1 row created.
    SQL> insert into test values ('fred', to_date('04/04/2008', 'dd/mm/yyyy'), to_date('06/04/2008', 'dd/mm/yyyy'));
    1 row created.
    SQL> insert into test values ('fred', to_date('08/04/2008', 'dd/mm/yyyy'), to_date('09/04/2008', 'dd/mm/yyyy'));
    1 row created.
    SQL> CREATE UNIQUE INDEX test_fromdate_idx ON test(name,valid_from);
    Index created.
    SQL> CREATE UNIQUE INDEX test_todate_idx ON test(name,valid_to);
    Index created.
    SQL> insert into test values ('fred', to_date('02/04/2008', 'dd/mm/yyyy'), to_date('03/04/2008', 'dd/mm/yyyy'));
    1 row created.
    SQL> insert into test values ('fred', to_date('02/04/2008', 'dd/mm/yyyy'), to_date('05/04/2008', 'dd/mm/yyyy'));
    insert into test values ('fred', to_date('02/04/2008', 'dd/mm/yyyy'), to_date('05/04/2008', 'dd/mm/yyyy'))
    ERROR at line 1:
    ORA-00001: unique constraint (TEST_USER.TEST_FROMDATE_IDX) violated
    SQL> insert into test values ('fred', to_date('01/04/2008', 'dd/mm/yyyy'), to_date('03/04/2008', 'dd/mm/yyyy'));
    insert into test values ('fred', to_date('01/04/2008', 'dd/mm/yyyy'), to_date('03/04/2008', 'dd/mm/yyyy'))
    ERROR at line 1:
    ORA-00001: unique constraint (TEST_USER.TEST_FROMDATE_IDX) violated
    SQL> spool off;
    Hope this helps!

  • NOT NULL Unique Constraint in Data Modeler

    I've created unique constraints in the Relational Model and I'm trying to figure out how to make one of them a NOT NULL constraint.
    Let's say the table name is category, with columns cat_id, cat_name, sort.
    In SQL I would write "ALTER TABLE category MODIFY (cat_name CONSTRAINT xxx_cat_name_nn NOT NULL);", but inside the modeler there are no data entry points in the [Unique Key Properties - xxx_cat_name_nn] dialog box, that I can find, that let me tell it that it is a NOT NULL constraint. I'm sure there is a way, but I'm just falling over my own feet trying to find it.
    Any help would be greatly appreciated.
    Edited by: 991065 on Feb 28, 2013 1:40 PM

    Hi,
    You can make the column NOT NULL by unsetting the "Allow Nulls" property for the Column.
    If you want a named NOT NULL Constraint, you should also set the "Not Null Constraint Name" property (on the Default and Constraint tab of the Column Properties dialog).
    David

  • Put a unique constraint on column if duplicate data is already present

    How do I put a unique constraint on a column if duplicate data is already present in that column?

    Hello,
    I have Oracle 10g, and in this version the documentation (SQL Reference) just says:
    ENABLE NOVALIDATE ensures that all new DML operations on the constrained data comply with the constraint. This clause does not ensure that existing data in the table complies with the constraint and therefore does not require a table lock.
    So, as far as I understand, it does not guarantee that the constraint can really be added without validating the existing data:
    Connected to Oracle Database 10g Enterprise Edition Release 10.2.0.1.0
    Connected as xxxx
    SQL>
    SQL> create table drop_me as
      2  select 1 as id from dual
      3  union all
      4  select 1 as id from dual;
    Table created
    SQL> alter table drop_me
      2  add constraint unq_id unique (id) enable novalidate;
    alter table drop_me
    add constraint unq_id unique (id) enable novalidate
    ORA-02299: cannot validate (XXXX.UNQ_ID) - duplicate keys found
    SQL>
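    For completeness, a hedged sketch (not shown in the thread) of the usual workaround: back the NOVALIDATE constraint with an explicitly non-unique index, so the existing duplicates are tolerated while new DML is still checked:
    -- A non-unique index lets the constraint be created over the duplicate
    -- rows; any further duplicate insert then fails with ORA-00001.
    ALTER TABLE drop_me
      ADD CONSTRAINT unq_id UNIQUE (id)
      USING INDEX (CREATE INDEX unq_id_idx ON drop_me (id))
      ENABLE NOVALIDATE;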

  • Bulk table update returning ORA-00001: unique constraint

    I'm trying to update every record in a PROPERTY table that has a CLASS of either 1 or 9 and a STATUS of 'LHLD'. The CLASS and STATUS descriptor records are in different tables from the PROPERTY table but reference the PROPERTY records by the PROPERTY table's unid.
    I have written the following update command:
    UPDATE RNT_PROPERTY_DESCRIPTOR SET DESCRIPTOR = 'PROP', DESCRIPTOR_VALUE = '1', EFFECT_DATE = '01-APR-04', USER_ID = 'USER'
    WHERE RNT_PROPERTY_DESCRIPTOR.UNID IN (SELECT PROPERTY.UNID FROM PROPERTY, PROPERTY_CLASS_STATUS
    WHERE PROPERTY_CLASS_STATUS.PROP_CLASS = '1'
    OR PROPERTY_CLASS_STATUS .PROP_CLASS = '9'
    AND PROPERTY.UNID IN (SELECT PROPERTY.UNID FROM PROPERTY, PROP_STATUS_HIST
    WHERE PROP_STATUS_HIST.code = 'LHLD'));
    However, after executing for around 10 minutes the update fails and the following error is returned:
    ORA-00001: unique constraint (RNT_PROPERTY_DESCRIPTOR_IDX) violated
    I know that the IDX suffix refers to a table INDEX, but I'm not sure why I'm getting a key constraint violation; none of the columns that I'm trying to update should need to be unique.
    For info the PROPERTY table has around 250,000 rows.
    Any ideas? Is there an error in my update statement?
    Thanks in advance.

    Gintsp,
    can you explain a little more? I'm not sure what you are suggesting that I try.
    Here is the output of what I have tried
    SQL> UPDATE RNT_PROPERTY_DESCRIPTOR SET DESCRIPTOR = 'PROP', DESCRIPTOR_VALUE = '1', EFFECT_DATE = '01-APR-04', USER_ID = 'USER'
    2 WHERE RNT_PROPERTY_DESCRIPTOR.UNID IN (SELECT PROPERTY.UNID FROM PROPERTY, PROPERTY_CLASS_STATUS
    3 WHERE PROPERTY_CLASS_STATUS.PROP_CLASS = '1'
    4 OR PROPERTY_CLASS_STATUS.PROP_CLASS = '9'
    5 AND PROPERTY.UNID IN (SELECT PROPERTY.UNID FROM PROPERTY, PROP_STATUS_HIST
    6 WHERE PROP_STATUS_HIST.CODE = 'LHLD'));
    UPDATE RNT_PROPERTY_DESCRIPTOR SET DESCRIPTOR = 'PROP', DESCRIPTOR_VALUE = '1', EFFECT_DATE = '
    ERROR at line 1:
    ORA-00001: unique constraint (RNT_PROPERTY_DESCRIPTOR_IDX) violated
    SQL> select owner, constraint_type, table_name, search_condition from user_constraints where constraint_name = 'RNT_PROPERTY_DESCRIPTOR_IDX';
    no rows selected
    The RNT_PROPERTY_DESCRIPTOR table structure is as follows:
    Name Null? Type
    UPRN NOT NULL NUMBER(7)
    DESCRIPTOR NOT NULL VARCHAR2(4)
    DESCRIPTOR_VALUE VARCHAR2(11)
    EFFECT_DATE NOT NULL DATE
    VALUE_DESCRIPTION VARCHAR2(35)
    POINTS NUMBER(2)
    POUNDS NUMBER(5,2)
    SUPERSEDED VARCHAR2(1)
    CURRENT_FLAG VARCHAR2(1)
    FUTURE VARCHAR2(1)
    END_EFFECT_DATE DATE
    USER_ID NOT NULL VARCHAR2(10)
    CREATE_DATE DATE
    -------------------------------------------------------------
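    Since the name reported by ORA-00001 is evidently an index rather than a constraint, a sketch of where to look in the dictionary instead of USER_CONSTRAINTS:
    -- Check whether RNT_PROPERTY_DESCRIPTOR_IDX is a unique index and which
    -- columns it covers; those are the columns the UPDATE is duplicating.
    SELECT index_name, uniqueness, table_name
    FROM   all_indexes
    WHERE  index_name = 'RNT_PROPERTY_DESCRIPTOR_IDX';
    SELECT index_name, column_name, column_position
    FROM   all_ind_columns
    WHERE  index_name = 'RNT_PROPERTY_DESCRIPTOR_IDX'
    ORDER  BY column_position;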

  • How add primary key constraint to already existing table with data

    I want to apply a primary key constraint to an already existing table with data.
    Is there any command or way to do this?

    Alternatively, assuming you want to ensure uniqueness in your primary key column you can do this:
    alter table <table name> add constraint <cons name> primary key (col1,col2) exceptions into <exception_table>
    If the alter table statement fails, this will populate the EXCEPTIONS table with the rows that contain duplicate values for (col1,col2).
    You will need to run (or get a DBA to run) a script called UTLEXCPT.SQL (which will be in the $ORACLE_HOME/rdbms/admin directory) if you don't already have an EXCEPTIONS table.
    Cheers, APC
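    A minimal sketch of the EXCEPTIONS INTO flow, with hypothetical table and constraint names, assuming the EXCEPTIONS table has already been created via utlexcpt.sql:
    -- The failed ALTER leaves one row in EXCEPTIONS per offending row,
    -- identified by ROWID, so the duplicates can be listed and corrected.
    ALTER TABLE emp_hist
      ADD CONSTRAINT emp_hist_pk PRIMARY KEY (emp_id, hist_date)
      EXCEPTIONS INTO exceptions;
    SELECT e.*
    FROM   emp_hist e
    WHERE  e.rowid IN (SELECT row_id FROM exceptions);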
