Mapping to a Lookup Table

OK, I have a basic Java question about how to do the following using
Java/JDO:
I have a database table for a lookup class. It will have the following
structure:
Create Table ProductStatus (
     Id bigint not null,
     name char(20)
);
In C++ I would have an Enumeration Type which constrains the value of the
variable to the set of values in the database.
In Java I was thinking that I would have a ProductStatus class and then in
the Product class I would have an instance as follows:
public class Product {
     ProductStatus status;
}
But I now believe that this is wrong...
From some offline help I have been told that the correct JDO metadata for
what I want to do is as follows:
<class name="Product">
     <extension vendor-name="kodo" key="table" value="JDO_PRODUCT"/>
     <extension vendor-name="kodo" key="pk-column" value="ID"/>
     <extension vendor-name="kodo" key="class-column" value="JDOCLASS"/>
     <extension vendor-name="kodo" key="lock-column" value="none"/>
     <field name="currency">
          <extension vendor-name="kodo" key="data-column" value="CURRENCY"/>
     </field>
     <field name="productStatus">
          <extension vendor-name="kodo" key="data-column" value="STATUS_ID"/>
     </field>
</class>
<class name="ProductStatus">
     <extension vendor-name="kodo" key="table" value="JDO_STATUS"/>
     <extension vendor-name="kodo" key="pk-column" value="ID"/>
     <field name="name">
          <extension vendor-name="kodo" key="data-column" value="NAME"/>
     </field>
</class>
My two questions are:
1) What do I put for the Java code, both in the Product class and for
ProductStatus? This is more a Java question than a JDO one. I have been
told I am thinking in too much of a C++ way: I keep wanting to have
ProductStatus as a type class and then have an instance variable in
the Product class, such as status, that is of type ProductStatus.
But I have been told this is not the correct Java way to do things.
How should this be done?
2) I would have expected that the JDO for the productStatus field in the
Product class would have been an FK extension, but I had Abe White help me
with this and he generated the JDO file I have included. So how is
Product.productStatus linked to the ProductStatus class? With the current
JDO I don't see any link, either by name or by a shared ID value.
Obviously I am a total newbie to this and these are basic questions.
Thanks to anyone who has the time to reply.
Brian Smith

Hi Brian --
Java code:
public class Product {
     private ProductStatus status;
}
public class ProductStatus {
     private String name;
}
Metadata:
Exactly like you posted it.
I'm not sure where you're getting confused on how the relation works. It's
straightforward. The STATUS_ID column in the JDO_PRODUCT table holds the
primary key value of the related ProductStatus instance. That's all there is
to it. So, for example, the following Java code:
ProductStatus stat = new ProductStatus ();
stat.setName ("foo");
Product prod = new Product ();
prod.setProductStatus (stat);
// pm is a javax.jdo.PersistenceManager
pm.currentTransaction ().begin ();
pm.makePersistent (prod);
pm.currentTransaction ().commit ();
Would produce SQL like:
INSERT INTO JDO_PRODUCT (ID, STATUS_ID) VALUES (100, 101);
INSERT INTO JDO_STATUS (ID, NAME) VALUES (101, 'foo');
Note that the '101' value for STATUS_ID in the inserted JDO_PRODUCT row
matches the '101' value for the ID (primary key) in the inserted JDO_STATUS
row.
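Reading the product back resolves the relation the other way. Here is a minimal sketch, reusing pm and prod from the snippet above and assuming a getProductStatus () accessor to match the setter (the accessor itself is an assumption, not from the post):
Object prodId = pm.getObjectId (prod);
// ... later, in a fresh transaction ...
pm.currentTransaction ().begin ();
Product loaded = (Product) pm.getObjectById (prodId, true);
String statusName = loaded.getProductStatus ().getName ();   // "foo"
pm.currentTransaction ().commit ();
// Kodo resolves the STATUS_ID value (101) back to the ProductStatus row whose
// primary key matches it, so no explicit FK extension is needed in the metadata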
If you wanted to simulate a C++ enumeration and at the same time do away
with the second JDO_STATUS table, which seems a little redundant (why have
a table to hold a single piece of data?), you could do it in two ways.
The simple way would be to just use static constants in the Product class for
the different possible statuses:
public class Product {
     public static final String STATUS_XXX = "xxx";
     public static final String STATUS_YYY = "yyy";
     private String status;
}
And then just have the JDO_PRODUCT table have a VARCHAR column to hold the
status string rather than having a STATUS_ID column that forms a relation to
a separate table.
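As a minimal usage sketch of this simpler approach (assuming a setStatus (String) setter on Product, which isn't shown above):
Product prod = new Product ();
prod.setStatus (Product.STATUS_XXX);
pm.currentTransaction ().begin ();
pm.makePersistent (prod);
pm.currentTransaction ().commit ();
// the literal string "xxx" is stored directly in the VARCHAR status column
The metadata for the status field would then just name that column in a data-column extension, the same way the currency field is mapped above; no second class or table is involved.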
The second way, which more closely mirrors C++ enumerations, is a little more
complex. You keep the ProductStatus class, but make it non-persistent (note:
the static-instance, private-constructor pattern below is the standard Java
way of doing enums):
public class ProductStatus {
     // enumerate all possible product status values
     public static final ProductStatus XXX = new ProductStatus ("xxx");
     public static final ProductStatus YYY = new ProductStatus ("yyy");

     private String name;

     /**
      * Convenience method to return a status for a given name.
      */
     public static ProductStatus forName (String name) {
          if (XXX.getName ().equals (name))
               return XXX;
          if (YYY.getName ().equals (name))
               return YYY;
          throw new IllegalArgumentException (name);
     }

     /**
      * Make the constructor private to prevent creation of any status
      * instances with illegal values.
      */
     private ProductStatus (String name) {
          this.name = name;
     }

     public String getName () {
          return name;
     }
}
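Since forName always hands back the shared static instances, status values can be compared with ==, much like a C++ enum. A quick sketch using only the class above:
ProductStatus s = ProductStatus.forName ("xxx");
System.out.println (s == ProductStatus.XXX);   // prints "true"
System.out.println (s.getName ());             // prints "xxx"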
Then in your Product class, keep a non-persistent ProductStatus instance and
a persistent field with the status name. Use the javax.jdo.InstanceCallbacks
interface to transfer the stored status name to/from the ProductStatus:
public class Product
     implements InstanceCallbacks {

     // used internally only; map this to the column in the JDO_PRODUCT table
     // holding the status name for the product; at runtime we'll use JDO
     // lifecycle callbacks to transfer the value to/from a ProductStatus
     // instance
     private String statusName;

     // mark this as non-persistent in the metadata
     private ProductStatus status;

     public void jdoPostLoad () {
          status = ProductStatus.forName (statusName);
     }

     public void jdoPreStore () {
          statusName = (status == null) ? null : status.getName ();
     }

     // the rest of the InstanceCallbacks methods (jdoPreClear, jdoPreDelete)
     // can be implemented as no-ops...
}
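As a rough usage sketch, assuming get/set methods for the non-persistent status field (they aren't shown above):
Product prod = new Product ();
prod.setStatus (ProductStatus.XXX);
pm.currentTransaction ().begin ();
pm.makePersistent (prod);          // jdoPreStore copies "xxx" into statusName
pm.currentTransaction ().commit ();
// when the Product is later loaded in another transaction, jdoPostLoad turns
// the stored "xxx" string back into the shared ProductStatus.XXX instance,
// so prod.getStatus () == ProductStatus.XXX holds again after the load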
With Kodo, you could even implement your own FieldMapping to map status strings
to ProductStatus instances, but that's getting really complicated for very
little real advantage.
Hope that helps.

Similar Messages

  • Syndicating Key mapping value from lookup table

    Hi Experts,
I want to syndicate the remote key value from a lookup table as per the remote system.
In the Syndicator, if I map the destination field to the remote key of the lookup table, I get a blank value.

Hi Mrinmoy,
Kindly check in the Data Manager whether you have maintained remote keys for the lookup table. If yes, then choose the specified remote system from the Remote key override fields under Map properties in the Syndicator.
In case you can't find the remote system in the "Remote key override" field for which a remote key is assigned in Data Manager, you need to check the type (outbound) of the remote system in the Console admin node, because only remote systems whose type is set to Outbound can be found in Remote key override in the Syndicator.
After choosing the remote key you need to map the destination field to the Remote key value.
    Regards
    Rahul

  • Key Mapping for Flat lookup tables

    Hi,
How do we decide whether we need to set Key Mapping to "Yes" for flat lookup tables?
Can anyone please explain, with an example, when to set key mapping to yes or no for flat tables?
    Thanks,
    Ketan

    Hi,
Can anyone please explain, with an example, when to set key mapping to yes or no for flat tables?
1. A remote system's objects are mapped to master data objects within MDM using key mapping. A key mapping maintains the relationship between the remote system's identifier (or key) for an object and the corresponding master data object in MDM.
2. If, in the Data Manager, you find based on some strategy that 4 records are duplicates and merge them into a single record, the merged record contains those 4 records with their respective remote keys. If you want to edit those records, key mapping should be enabled for that particular table.
3. While harmonizing the records back to the respective client systems, you can use the edit key mapping functionality for merged records. Only if you have enabled the key mapping functionality in the Console for the particular table can you access the EDIT KEY MAPPINGS functionality in the Data Manager and Syndicator.
Hope this helps,
    Regards,
    Srinivas

  • Multiple columns (named the same originally) and mapped to the same lookup table are causing a Cube Build issue

    Hey folks, looking for some insight here.
    I've an implementation that contains some custom Enterprise columns mapped to lookup tables.  In the instance I'm working with now, it looks like there was/is an issue with one of those columns.  In this scenario, I have a column named
    ProjectType, created initially with that name, mapped to a lookup table.  This field's name was then changed to
    Project Type.  After that, it looks like another column was created, also called
    ProjectType.  So now, we have what I would have originally thought was two distinct columns, even though the names used are the same.
    Below is the error we're currently getting during the Cube Build Process...
    PWA:http://ps2010/PWA, ServiceApp:Project Web App, User:DOMAIN\user, PSI: SqlException occurred in DAL:  <Error><Class>1</Class><LineNumber>1</LineNumber><Number>4506</Number><Procedure>MSP_EpmProject_OlapView_B8546719-4D4C-473A-84B1-89DEDA2307E0</Procedure> 
    <Message>  System.Data.SqlClient.SqlError: Column names in each view or function must be unique. Column name 'ProjectType' in view or function 'MSP_EpmProject_OlapView_B8546719-4D4C-473A-84B1-89DEDA2307E0' is specified more than once.  </Message> 
    <CallStack>   
     at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)   
     at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)   
     at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)   
     at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)   
     at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async)   
     at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, DbAsyncResult result)   
     at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(DbAsyncResult result, String methodName, Boolean sendToPipe)   
     at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()   
     at Microsoft.Office.Project.Server.DataAccessLayer.DAL.SubDal.ExecuteStoredProcedureNoResult(String storedProcedureName, SqlParameter[] parameters)  </CallStack>  </Error>
    I've tried deleting the one column, but the build still gives the above error.
    Any thoughts as to how the above could be resolved?
    Thanks! - M
    Michael Mukalian | Jan 2010 - Dec 2010 MVP SharePoint Services | MCTS: MOSS 2007 Configuration | http://www.mukalian.com/blog

    We tried taking it out of the cubes, and it builds fine.  The challenge we're having is in building the cubes with that custom field "ProjectType".  It's as if the cubes still hold some reference to it even when it's deleted.
    Since the OLAP View ('MSP_EpmProject_OlapView_{guid}') is recreated, would it be as simple as deleting that View, and trying to recreate?
    Thanks - M
    Michael Mukalian | Jan 2010 - Dec 2010 MVP SharePoint Services | MCTS: MOSS 2007 Configuration | http://www.mukalian.com/blog

  • Create key mapping using import manager for lookup table FROM EXCEL file

Hello,
I would like to create key mapping while importing the values via an Excel file.
The source file contains the key, but how do I map it to the lookup table?
The properties of the table have key mapping creation enabled, but during the mapping in Import Manager, I can't find any way to map the key mapping.
E.g.
the lookup table contains:
Material Group
Code
the Excel file contains:
MatGroup1  Code   System
    Thanks!
    Shanti

    Hi Shanti,
Assuming you have already taken care of the points listed below:
1) Key Mapping set to "Yes" for your lookup table in MDM Console
2) A new remote system created in MDM Console
3) Proper rights for your account to update the remote key values in Data Manager through Import Manager.
Your sample file can have Material Group and Code alone, which can be exported from Data Manager via File -> Export To -> Excel if you already have data in Data Manager.
Open your sample file in Import Manager, selecting the remote system for which you want to import the key mapping (do not select MDM as the remote system, which does not allow you to maintain key mapping values) and Excel as the file type.
Now select your source and destination tables; under the destination fields you will see a new field called [Remote Key].
Map your source and destination fields correspondingly, and clone your source field Code by right-clicking on Code in the source hierarchy; map the clone to Remote Key if you want the code to be in the remote key values.
In the matching criteria, select the destination field Code as a matching field and change the default import action to Update NULL fields or Update mapped fields as required.
After a successful import you can check the remote key values in Data Manager.
    Hope this helps
    Thanks
    Sowseel

  • How to import a lookup table

    Hi all,
    I am trying to import a simple XML file using import manager with the following information:
    - NAME: STRING
    - LAST NAME: STRING
    - CITY: STRING
I have the following information in my repository:
- NAME: TEXT
- LAST NAME: TEXT
- CITY: LOOKUP TABLE (CITY TABLE)
CITY TABLE is a lookup table with just one field called 'NAME: TEXT'.
With Import Manager I select the XML field CITY:STRING and I clone it. After that, I select the lookup table called TABLE CITY and I make the following map:
- Remote key (repository) maps to City clone (XML file)
- name (repository) maps to city (XML file).
    Finally, I select the Products table from the repository and I make a new map between CITY:LOOKUP TABLE (from the repository) and city (from the XML File).
    The import status fails. What is wrong?
    Thanks in advance,
    Marta

    Hi Marta,
    you have to import the lookup table before importing the main table data using an extra import map.
    But if I understand your description correctly, you only have one field in your city repository, which is the display field. This way, you can also load into the city lookup table using only the main table import.
    When you select 'city' in the field mapping (destination table is the main table), you will see the distinct values in the value mapping area below.
    In the destination pane of the value mapping, there shouldn't be any values for the initial import. So you have to select all your source values and 'ADD' them to your repository.
If this still doesn't work, try the two-map approach and import the cities prior to the main table records; then you can automap them later with the main table import.
    Please tell me if this still doesn't work.
    Regards,
    Christiane

  • How to map lookup table

    Hi friends,
I built this simple report in OBIEE 10g:
"NATIONALITY COUNT, DEPARTMENT-WISE"
For that I used the following tables:
per_all_assignments_f -----> fact table
hr_all_organization_units -----> dim table (containing departments)
per_all_people_f -----> dim table (containing nationality)
I made all the mappings in the physical diagram and viewed my report in BI Answers.
It shows results like the following:
NATIONALITY     COUNT(NATIONALITY)
AUS             24
AFR             25
PHQ_VB          40
SH_VT           4
The problem is that the nationality column contains various country codes. Since I don't want the nationality code to be displayed in the results, I need the meaning of each nationality, like:
AUS ------ Australian
AFR ------ African
PHQ_VB ------ Germanian (assigned)
I know that the meaning for each nationality is available in "FND_LOOKUP_VALUES".
I can import the "FND_LOOKUP_VALUES" table into the physical layer, but how can I map it to the fact table in my physical diagram?
In my report the fact table is "per_all_assignments_f".
My fact table doesn't contain any column matching the dimension table "FND_LOOKUP_VALUES",
so how can I map to the fact column in order to view the full meaning of the nationality in my report?
    Help me friends...
    All izz Well
    GTA...

    Hi Kranthi,
    Thanks for your reply....
To get the meaning to appear for each nationality, I imported the HR_LOOKUPS table and joined it to per_all_people_f, which is a dimension table.
This is the query that I executed in TOAD to get the meaning for each nationality:
select distinct h15.meaning, h15.lookup_code
from hr_lookups h15, per_all_people_f papf, per_all_assignments_f paaf
where h15.lookup_type(+) = 'NATIONALITY'
and h15.lookup_code(+) = papf.nationality
and h15.meaning is not null
and papf.person_id = paaf.person_id
I implemented the same thing in OBIEE: in the physical diagram I created the join between hr_lookups and per_all_people_f,
that is, between the lookup_code column in the hr_lookups table and the nationality column in the per_all_people_f table.
I obtained results in BI Answers, but they were not accurate.
To get an accurate result I need to add one more join between the hr_lookups table and the per_all_people_f table,
the join already mentioned in the query above:
> h15.lookup_type(+) = 'NATIONALITY'
Since OBIEE does not allow me to add a second join from the lookup table to the people table, how can I obtain an accurate result in BI Answers?
Is there any other way to create a second join between the same two tables, in my case between hr_lookups and per_all_people_f?
Please help me with this.
    Thanks for your support.....
    All izz Well
    GTA...

  • Import of Main and Lookup table in a single Map

    Hey Guys,
I am developing a proof of concept to import a main table and a flat lookup table in a single import map (using a single Excel file).
    Below is my Table structure:
    Main Table: Customer
    --->Customer_Number (Text,Unique Field, Display Field)
    --->Sales_Area (Lookup Flat)
    Lookup Table: Sales_Area
    Sales_Area_ID (Text,Unique Field,Display Field)
    Sales_Area_Desc (Text,Display Field).
The import file (Excel) has the attributes below:
Customer_Number, Sales_Area_ID, Sales_Area_Desc.
When I start, both the main table and the lookup table are empty (there is no data in Data Manager for either of them).
Now in the import map, I selected the Excel file as the source and the main table as the target.
I did the mapping of Customer_Number as usual. After that I created a compound field for Sales_Area_ID + Sales_Area_Desc and did the mapping of this compound field, then did the mapping for Sales_Area_ID and Sales_Area_Desc.
Now, since there is no data in the lookup table, I select the "Add" button in the "Value Mapping" section. When I execute this map, it works perfectly and data is loaded into both the main table and the lookup table. But if a new value comes in the Excel file (a value which does not yet exist in the lookup table), the map fails; when I open it, it says that I need to redo the value mapping. Again I click on the "Add" button and it starts working. So basically the import map fails whenever I get a value in the Excel file which does not yet exist in the lookup table.
Now my question is: is there a way to automate my import map? I thought clicking on the "Add" button would take care of all the lookup values which are not already present.
    Can anyone please help me in this regard.
    Thanks
    Saif

    Hi Saif,
    You can try the following option.
Right-click on the lookup field/compound field in the destination fields, and set the option 'SET MDIS Unmapped Value Handling' to 'ADD'.
    Cheers,
    Cherry.

  • Is Mapping Lookup table possible with IDOC to File scenario

    Hi all,
I need a suggestion; I am using SP16.
My scenario is IDOC to file, and I have to use mapping lookup tables for some of the fields within the mapping:
'Crossref:  PlantLoc_to_WhseComDiv.  Value mapping lookup to take two fields from SAP and convert to WMS 3-digit value'
How do I go about this, since I have checked in the SAP library that it is only for the RFC, JDBC, and SOAP adapters?
Need your valuable inputs,
    Regards,
    sridhar

You can use an RFC, SOAP, or JDBC lookup in your mapping. Why not? It does not mean that lookups are used only in RFC scenarios; you can use them in any scenario.

  • Remote key for lookup tables

    Hi,
I need some advice on remote keys for lookup tables.
We have loaded lookup data from several client systems into the MDM repository. Each of the client systems can have differences in the lookup values. What we need to do is enable the key mappings so that the Syndicator knows which value belongs to which system.
The tricky part is that we haven't managed to send out the values based on the remote keys. We do not want to send the lookup tables themselves but the actual main table records. All lookup data should be checked at the point of syndication, and only the used lookup values that originally came from one system should be sent to that particular system. Otherwise the tag should be blank.
Is this the right approach to handle this requirement, or is there a different way to take care of it? What would be the right settings in the Syndicator?
    Help will be rewarded.
    Thank you very much
    best regards
    Nicolas

    Hi Andreas,
    that is correct. Let's take two examples:
    1) regions
    2) Sales Area data (qualified lookup data)
Both tables are filled and loaded directly from the R/3 systems, so you would already know which value belongs to which system.
The problem I have is that we will not map the remote key from the main table, because it will be blank for newly created master data (centralization scenario). Therefore we cannot map the remote key from the attached lookup tables, can we?
The remote key will only work for lookup tables if the remote key of the actual master data is mapped. Since we don't have the remote key (the local customer ID from R/3) in MDM, and since we do not create it at the point of syndication, what would the SAP standard scenario look like for that?
This is nothing extraordinary, it's just a standard centralization scenario.
Please advise.
Thanks a lot
    best regards
    Nicolas

  • Importing Key Combinations of a Lookup table.

Hi, I have created a lookup table which is equivalent to a check table in R/3. For example, a table named SPT (Special Procurement Type), which has a key combination (composite keys) in R/3 itself; the keys are Plant + Procurement Type, so in the MDM lookup table both keys are defined as display fields. The problem comes in automating the import. In the Import Manager source, both key fields are separate fields. I tried combining the fields using Partition, but Automap is not working, and so neither is the import automation process. Any thoughts/advice on tackling this issue are appreciated.
    Thanks
    Job.

    Hi Job,
If you combine the fields in a partition and then try to do the value mapping automatically, it is not possible; there you have to map manually. Automap in value mapping can only be done when the values are single values, not combinations.
For example, automap works for single values like these:
Source          Destination
abc             <null>
def             abc
xyz             def
                xyz
But if it is like this:
Source          Destination
abc,12          <null>
def,23          abc,12
                def,23
in the above case the values are combinations, so it is not possible.
I hope I was able to make it clear.
If this has helped you then do reward points.
    Regards,
    Neethu Joy

  • Proper use of a Lookup table and adaptations for .NET

    Hello,
    I need to create a few lookup tables and I often see the following:
create table Languages
(
     Id int identity not null primary key,
     Code nvarchar (4) not null,
     Description nvarchar (120) not null
);
create table Posts
(
     Id int identity not null primary key,
     LanguageId int not null,
     Title nvarchar (400) not null
);
insert into Languages (Id, Code, Description)
values (1, 'en', 'English');
This way I am localizing Posts with the language id ...
    IMHO, this is not the best scheme for Languages table because in a Lookup table the PK should be meaningful, right?
    So instead I would use the following:
create table Languages
(
     Code nvarchar (4) not null primary key,
     Description nvarchar (120) not null
);
create table Posts
(
     Id int identity not null primary key,
     LanguageCode nvarchar (4) not null,
     Title nvarchar (400) not null
);
insert into Languages (Code, Description)
values ('en', 'English');
.NET applications usually use the language code, so this way I can get a Post in English without using a join.
And with this approach I am also maintaining database data integrity ...
This could be applied to a Genders table with codes "M" and "F", a Countries table, a transaction types table (should I?), ...
However, I think it is common to use an int as the PK in lookup tables because it is easier to map to enums.
And now it is even possible to map to flags enums and so have a many-to-many relationship in an enum.
That helps in .NET code but in fact has limitations: a Languages table could never be mapped to a flags enum ...
... A flags enum can't have more than 64 items (Int64) because the values must be powers of two.
    A SOLUTION
I decided to find an approach that enforces database data integrity and still makes it possible to use enums, so I tried:
create table Languages
(
     Code nvarchar (4) not null primary key,
     [Key] int not null,
     Description nvarchar (120) not null
);
create table Posts
(
     Id int identity not null primary key,
     LanguageCode nvarchar (4) not null,
     Title nvarchar (400) not null
);
insert into Languages (Code, [Key], Description)
values ('en', 1, 'English');
With this approach I have a meaningful language code, I avoid joins, and I can create an enum by parsing the Key:
public enum LanguageEnum {
     [Code("en")]   // custom attribute preserving the language code
     English = 1
}
I can even preserve the code in an attribute, or I can switch the code and the description ...
What about flags enums? Well, I will not have flags enums, but I can have List<LanguageEnum> ...
And when using List<LanguageEnum> I do not have the limitation of 64 items ...
To me all this makes sense, but would I apply it to a Roles table or a ProductsCategory table?
In my opinion I would apply it only to tables that will rarely change over time. So:
    Languages, Countries, Genders, ... Any other example?
About the following I am not sure (they are intrinsic to the application):
   PaymentsTypes, UserRoles
And to these I wouldn't apply it (they can be managed by a CMS):
   ProductsCategories, ProductsColors
    What do you think about my approach for Lookup tables?
    Thank You,
    Miguel

    >>IMHO, this is not the best scheme for Languages table because in a Lookup table the PK should be meaningful, right?<<
    Not necessarily. The choice to use, or not to use, a surrogate key in a table is a preference, not a rule. There are pros and cons to either method, but I tend to agree with you. When the values are set as programming terms, I usually use a textual value
    for the key. But this is nothing to get hung up over.
Bear in mind however, that this:
     create table Languages
     (
          Id int identity not null primary key,
          Code nvarchar (4) not null,
          Description nvarchar (120) not null
     );
is not equivalent to
     create table Languages
     (
          Code nvarchar (4) not null primary key,
          Description nvarchar (120) not null
     );
The first table needs a UNIQUE constraint on Code to make these solutions semantically the same. The first table could have the value 'Klingon' in it 20 times while the second only once.
    >>However I think it is common to use int as PK in lookup tables because it is easier to map to ENUMS.<<
    This was going to be my next point. For that case, I would only change the first table to not have an identity assigned key value, as it would be easier to manage at the same time and manner as the enum.
    >>. A Languages table could never be mapped to a FLags Enum ...<<
    You could, but I would highly suggest to avoid any values encoded in a bitwise pattern in SQL as much as possible. Rule #1 (First Normal Form) is partially to have 1 value per column. It is how the optimizer thinks, and how it works best.
    My rule of thumb for lookup (or I prefer the term  "domain" tables, as really all tables are there to look up values :)), is all data should be self explanatory in the database, through data if at all possible. So if you have a color column,
    and it contains the color "Vermillion", and all you will ever need is the name, and you feel like it is good enough to manage in the UI, then great. But bear in mind, the beauty of a table that is there for domain purposes, is that you can then store
    the R, G, and B attributes of the vermillion color (254, 73, 2 respectively, based on
http://www.colorcombos.com/colors/FE4902) and you can then use that in coding. Alternate names for the color could be introduced, etc. And if UserRoles are 1, 2, 3, and 42 (I have seen worse), then
    definitely add columns. I think you are basically on the right track.
    Louis
    Without good requirements, my advice is only guesses. Please don't hold it against me if my answer answers my interpretation of your questions.

  • Lookup Table and Target Table are the same

    Hi All,
I have a requirement in which I have to look up the target table and, based on the records in it, load a new record into the target table.
To be very specific:
Suppose I have a key column; when it changes, I want to generate a new id and then insert this new value.
    The target table record structure looks like this
    list_id list_key list_name
    1 'A' 'NAME1'
    1 'A' 'NAME2'
    1 'A' 'NAME3'
    2 'B' 'NAME4'
    2 'B' 'NAME5'
    As shown the target table list_id changes only when the list key changes. I need to generate the list_id value from within OWB mapping.
    Can anyone throw some light as to how this can be done in OWB???
    regards
    -AP

    Hello, AP
You underestimate the power of a single mapping :) If you could tolerate using an additional stage table (which is definitely recommended in case the table from your example will hold a lot of rows), you could accomplish all you need within one mapping and without using a PL/SQL function. This is true because you can have several targets within one mapping.
    Source ----------------------------------------------------- >| Join2 | ---- > Target 2
    |------------------------ >|Join 1| --> Lookup table -->|
    Target Dedup >|
Here "Target" is your target table. The "Join 1" operator covers the operations needed to get the existing key mappings (from dedup) and find new mappings. Results are stored in the Lookup Table target (operation type TRUNCATE/INSERT).
"Join 2" is used to perform the final lookup and load it into "Target 2", which is the same table as "Target".
The approach with a lookup table is fast and reliable and can run in set-based mode. You can also revisit the lookup table to find which key mappings were loaded during the last load operation.
    Serhit

  • Not able to populate a lookup table in Import Mgr

    Hi,
Any tips on why a lookup table, after mapping, cannot complete the 'match record' process?
I am not able to add the fields for the match operation even though they are enabled/highlighted.
While mapping, all the source data got successfully mapped with green dots.
    There is an Auto_id field in that table.
    Thanks in advance.
    -reo

There was an auto-id field whose parameter was changed to 'no display' in order to fix the issue.

  • Error in Import Manager if Constraints are assigned to lookup table

    Hi Folks,
I am using the SRM-MDM Catalog repository, and in the Console, for a role, if I assign a constraint for the lookup table Product Groups, the system throws an error message in Import Manager when I try to map the file.
Scenario: Say I have Product Groups LOC1-LOC5, and for a Z-Catalog Manager role (Manager1) I assign the constraint LOC1. If I then import a file in the Import Manager, the system throws the error message "Unable to retrieve Unique values for Product Groups".

    Hi Mohan,
Are you updating any qualified table? Please explain which table, and give some more details.
    Thanks
    Ganesh Kotti
