How to "think" use of multiple logical table sources

Hi everybody,
I'm new to OBIEE and its paradigm. I'm trying to build a prototype with multiple physical sources: an ERD (relational model) and a data mart. I need OBIEE to go to the data mart when aggregated information is needed and to the relational model when transactional information is needed. I want to do that by working with dimensions, assigning a different source to each logical level of the dimension.
My problem begins when I think about how to actually do that. Here is my question:
- Should I add the relational source to the data mart logical table source folders, mapping the same information with joins? If so, how do dimensions (drill down) work from that point?
thanks in advance!

I'm a little confused.
Is the problem:
1 - You want to access a table in an existing 9i database from a different 10g database, or
2 - You have a new 10g database as a copy of a 9i one, and a table called system has gone missing?

Similar Messages

  • Single Logical Table Source VS Multiple Logical Table Source

    When is it appropriate to use a single logical table source vs. multiple logical table sources?

    Hi,
    Single logical table source: a logical fact/dimension table with a single physical source/table.
    Multiple logical table source: a logical fact/dimension table with multiple physical sources/tables.
    Mark as helpful/correct if this answers it...
    thanks,
    prassu

  • Problem: 1 physical table -- multiple logical table sources

    Hi,
    I'm quite new to BIEE and setting up my repository.
    So I have a question, if the following scenario is possible:
    Physical Layer: TABLE_A: COL_A, COL_B, COL_C
    TABLE_B: COL_D, COL_E, COL_F
    Join TABLE_A.COL_A = TABLE_B.COL_D
    In the Business Model I have a dimension table with TABLE_A as data source, exposing field DIM1 (COL_B).
    The fact table (MEASURE) would have TABLE_B twice as data source, with different WHERE clauses on COL_F and logical table columns (ATT1 and ATT2) both mapped to COL_E.
    So far I have created everything and the consistency check shows no errors or warnings, but when I create a report showing DIM1, ATT1, ATT2 I get an error in Answers: Incorrectly defined logical table source (for fact table MEASURE) does not contain mapping for [MEASURE.ATT1, MEASURE.ATT2].
    Isn't it possible to have one physical column used in multiple data sources?
    I know it works when I create the physical table twice ... but maybe there's a solution within the business model.
    Thanks
    chrissy

    Hi mengesh,
    that's what I also tried, but it always returns the same error.
    I know it would work if I imported the physical table twice or more, but that's not what I want to do, because in the end I have 10 or more fields based on this one physical table. There's one field indicating what value is contained in the record, i.e.:
    COL_F | COL_E
    1 | customer name
    2 | customer number
    3 | customer branche
    4 | salesman
    5 | date
    6 | report number
    etc.
    I don't think it's useful to import the physical table as many times as I need this field, so I want to split it up in the business model.
    thanks
    chrissy
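    For reference, here is a rough SQL sketch (reusing the hypothetical names from this thread) of the physical query that has to be produced when ATT1 and ATT2 both come from TABLE_B and are told apart only by COL_F. Each attribute is effectively the same table under a different filter, so the query needs one copy of TABLE_B per attribute, which is essentially what separate physical aliases give you:

      -- Sketch only; TABLE_A/TABLE_B/COL_* are the names used in this thread,
      -- and the COL_F values follow the list above.
      SELECT a.COL_B   AS DIM1,
             b1.COL_E  AS ATT1,   -- rows where COL_F = 1 (customer name)
             b2.COL_E  AS ATT2    -- rows where COL_F = 2 (customer number)
      FROM TABLE_A a
      LEFT JOIN TABLE_B b1 ON b1.COL_D = a.COL_A AND b1.COL_F = 1
      LEFT JOIN TABLE_B b2 ON b2.COL_D = a.COL_A AND b2.COL_F = 2;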

  • Financial Analytics -- Multiple logical table sources

    We have two logical table sources for the fact table. The two sources are aggregates: one is at Invoice level and the other is at Supplier level. Each column in the fact is a derived column and is mapped to both tables. My problem is that when I use supplier name, invoice amount, and other fact columns, one of the fact columns comes back as zero, even though the back-end query the BI Server fires returns the expected result. Somehow, when it reaches the report, the data becomes zero. This does not happen when we specifically add a filter on the table at the lower level of aggregation (Invoice), which tells the BI Server to fetch the data from the lower-level table.
    The RPD that we are using is out of the box (OOTB).


  • Multiple Logical Table Sources In BMM Under 1 Logical Table In OBIEE 11g

    I have a question, and even after a lot of searching on Google I could not find any article, so I thought I'd ask here.
    I want to create a logical dimension table.
    This logical dimension has columns coming from 2 different physical tables, TableA and TableB. The relationship between TableA and TableB is 1 to zero.
    There are 2 ways to create the logical dimension:
    1) Go to the logical table source properties of TableA, click the + sign there, and add TableB with a right outer join to TableA. In this case the logical table still shows only one logical table source, TableA.
    2) Drag the TableB columns onto the logical table so that we have 2 logical table sources.
    Please try to explain this without using the concept of fragmentation and without considering the fact tables; I have this question for simple logical dimension tables only.
    I want to know which is the right way, and what requirement/factor decides which one to use?

    Check this post:
    Business Model - Logical Table Source
    Let me know if you have questions.
    If it helps, mark as correct :)

  • Multiple Logical Table Sources vs Single Logical Table Source

    OBIEE 11g. I am totally confused. Can someone help me with the following?
    I have seen logical table sources being used in 2 ways. I have 2 source tables: CUSTOMER & ADDRESS. In the physical layer these 2 tables are linked by CUSTOMER_ID in a 1:M relationship, i.e. a customer can have many addresses.
    Scenario # 1: In the BMM the two tables form a logical table called CUSTOMER with 2 different logical table sources, which are CUSTOMER & ADDRESS.
    Scenario # 2: In the BMM the two tables form a logical table called CUSTOMER with 1 logical table source, called CUSTOMER only.
    What is the difference between the above 2 scenarios, and which one is better to use when creating the logical table source?
    Regards.

    Scenario # 1: In the BMM the two tables form a logical table called CUSTOMER with 2 different logical table sources, which are CUSTOMER & ADDRESS.
    --> In this case the source used in the physical query is chosen based on the columns selected; the possible cases are CUSTOMER, ADDRESS, or CUSTOMER & ADDRESS.
    --> Used in denormalized scenarios.
    --> The BI Server applies its own intelligence, based on the Content tab settings.
    Scenario # 2: In the BMM the two tables form a logical table called CUSTOMER with 1 logical table source, called CUSTOMER only.
    --> In this case the CUSTOMER LTS's General tab lists both CUSTOMER & ADDRESS, so both tables are present in your physical query irrespective of the column selection.
    --> Used in normalized scenarios.
    --> Forces the BI Server to follow our way, since we won't set Content tab settings.
    Hope this helps
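    A rough sketch of the physical SQL the BI Server might generate in each scenario makes the difference concrete (CUSTOMER_NAME and the CUSTOMER_ID join column are assumed names here):

      -- Scenario 1 (two LTSs): a request for customer columns only can be
      -- satisfied from the CUSTOMER table alone.
      SELECT c.CUSTOMER_NAME
      FROM CUSTOMER c;

      -- Scenario 2 (one LTS containing both tables): the same request still
      -- carries the join, because both tables belong to the single source.
      SELECT c.CUSTOMER_NAME
      FROM CUSTOMER c
      JOIN ADDRESS a ON a.CUSTOMER_ID = c.CUSTOMER_ID;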

  • Aggregation on multiple Logical table source

    I have mapped two logical table sources (LTS) to a measure.
    SUM needs to be used for the Essbase LTS and COUNT DISTINCT needs to be used for the Oracle DB LTS.
    Is there any option to set the aggregation at the LTS level?
    Thanks,

    Interesting!!
    We cannot set the Aggregation tab per LTS, but you may try this:
    Do not use the Aggregation tab; instead, when creating the metric, map it to the physical sources directly and use COUNT DISTINCT for the Oracle DB source and SUM for the other.
    You should then see 2 different expressions in the 'Data Type' tab under 'Mapped as'.
    Note: I haven't tested it; let me know how it goes.
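    As a sketch of what the reply above suggests (the column names are made up, and real mappings use the fully qualified physical columns picked in the expression builder), the expression entered in each logical table source's column mapping carries the aggregation instead of the Aggregation tab:

      -- Column mapping in the Oracle DB logical table source (assumed names):
      COUNT(DISTINCT FACT_SALES.CUSTOMER_ID)

      -- Column mapping in the Essbase logical table source (assumed names):
      SUM(SALES_CUBE.SALES_AMOUNT)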

  • Question related to Logical table source

    Hi All,
    I have some very basic questions:
    (1) When do we create multiple logical table sources within a dimension?
    (2) If a dimension has more than one logical table source, when do we need to map the underlying table of one logical table source to the underlying table of another?
    For example, consider these tables: XLE_Entity_Profile, XLE_REgistration, AP_Invoice_ALL.
    AP_Invoice_ALL is a fact table. The relationship between XLE_Entity_Profile & XLE_REgistration is 1:M.
    The join info is as below:
    (a) XLE_Entity_Profile.Legal_entity_id = AP_Invoice_ALL.Legal_entity_id
    To get the registrations of an LE, the where clause is as below:
    (b) XLE_Entity_Profile.Legal_entity_id = XLE_REgistration.Source_id and XLE_REgistration.Source_table = 'XLE_Entity_Profile'
    I have created an alias of XLE_REgistration called XLE_REgistration_LE.
    Within the dimension, I have 2 logical table sources: XLE_Entity_Profile & XLE_REgistration_LE.
    The logical table source XLE_REgistration_LE has the where clause XLE_REgistration.Source_table = 'XLE_Entity_Profile'.
    When I query LE Name and LE Registration Name, I get an error like:
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 14070] Cannot find logical table source coverage for logical columns: [LE_NAME]. Please check more detailed level keys are mapped correctly. (HY000)
    LE Name is from XLE_Entity_Profile and LE Registration Name is from XLE_REgistration_LE.
    But when, in the properties of the logical table source XLE_REgistration_LE, I also map it to XLE_Entity_Profile, I get the correct result.
    I am not able to understand why I get the error with the first way of modeling.
    Thanks , Ashish

    Hi Ashish,
    first, about logical table sources (LTS): you can create different LTSs for aggregation and/or fragmentation. Aggregation means that the data is stored at a different aggregation level in each physical table. Fragmentation means that the content differs across tables (different rows).
    In your case, I think the problem is that your dimension is not denormalized, which results in a snowflake.
    What I understand is that you have the following (physical diagram):
    Invoice (fact table) >----- Entity (Dim) >----- Registration (Dim)
    You have the following joins:
    invoice.entity_id = entity.entity_id
    and entity.entity_id = registration.source_id and registration.source = 'something'
    First idea:
    I would create the following join (in physical diagram):
    Invoice (fact table) >---- Registration (Dim)
    Where:
    invoice.entity_id = registration.source_id and registration.source = 'something'.
    Then in your Entity dimension you should create a hierarchy:
    Grand Total Level
    Entity
    Registration.
    In your dimension you should create the first lts: Entity
    Set the aggregation content of this lts to Entity.
    This lts contains only one physical table.
    Map only the entity columns to the Entity physical table.
    Then create the second lts:
    Entity and Registration.
    Set the aggregation content of this lts to Registration.
    This lts must contain two physical tables, Entity and Registration.
    Map the entity columns to the Entity physical table and the registration columns to the registration physical table.
    Let me know if it works or not.
    Regards,
    Stijn
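    To make Stijn's suggestion concrete, here is a rough sketch of the physical queries the two logical table sources should allow once their aggregation content is set (LE_NAME and REGISTRATION_NAME are assumed column names):

      -- Entity-level request: served by the first LTS (XLE_Entity_Profile only).
      SELECT e.LE_NAME
      FROM XLE_Entity_Profile e;

      -- Registration-level request: served by the second LTS, which joins both
      -- tables with the filter from the thread built into the join.
      SELECT e.LE_NAME,
             r.REGISTRATION_NAME
      FROM XLE_Entity_Profile e
      JOIN XLE_REgistration r
        ON r.Source_id = e.Legal_entity_id
       AND r.Source_table = 'XLE_Entity_Profile';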

  • Logical Table source source query

    In OBIEE 10g we can have multiple logical table sources, and we can also add multiple tables into a single logical table source. I wanted to know the difference between doing so and having a separate logical table source for each physical source.
    Hope I made myself clear.
    Cheers
    Rem

    Hi Rem,
    When data is duplicated across different physical tables, add them as separate LTSs, with the column mappings pointing to the most economical sources. Specifying the most economical source is about the idea that a single column exists in more than one table; based on the column mappings, the BI Server picks the LTSs that can satisfy the request with the fewest joins.
    When the data is not duplicated, add the tables to a single LTS. When the physical sources are added to a single LTS, you have the flexibility of using outer joins. But specifying a join as an outer join makes the BI Server include that source even when it is not required, whereas with an inner join the source is left out if it isn't needed to satisfy the query.
    Hope this helps.
    Thanks!
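    A small sketch of the point about outer joins, with made-up tables ORDERS and ORDER_NOTES combined in a single LTS and a request that only needs ORDERS columns:

      -- Inner join defined in the LTS: the BI Server can prune the unneeded
      -- table and generate something like:
      SELECT o.ORDER_ID, o.ORDER_AMOUNT
      FROM ORDERS o;

      -- Outer join defined in the LTS: the table stays in the query even
      -- though none of its columns were requested:
      SELECT o.ORDER_ID, o.ORDER_AMOUNT
      FROM ORDERS o
      LEFT OUTER JOIN ORDER_NOTES n ON n.ORDER_ID = o.ORDER_ID;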

  • BIEE Answers: Problem with logical table source

    Hi,
    I am creating a detail report in Answers. This report does not use any column from the fact table in the logical model, but it does use other dimension tables.
    In the logical model's fact table, I use the same physical table 3-4 times as logical table sources, in order to filter results differently (or not at all).
    The problem is that my report filters results using one of those logical table sources. Is there a way to specify, in Answers or in the Administration Tool, which should be the "default" logical table source for the specific physical table joins?
    Another workaround is to add, as a report filter, one of the fact table columns that does use the desired logical table source, but this is not always possible.
    Can you please advise?
    Thanks in advance,
    Nadia

    Remember to set the "# elements" at each level of your hierarchy to the appropriate ratio between levels; the actual numbers are not so important, but the ratio needs to be correct for the BI Server to make the best guess when using more than one LTS.

  • How would multi Logical Table Source work?

    Hi All,
    I am facing a problem with multiple logical table sources. Can I build a Business Model with a logical fact table that has two sources? Take the description below as an example:
    Table Name   Columns                                                       Description
    Fact         TIMECD (e.g. 2013Q101, joins to Time.TIMECD), DATA
    Fact_Aggr    TIMECD (e.g. 2013Q1, joins to Time.PERQUARTER), DATA          Aggregate table for Fact, summing monthly DATA to quarterly
    Time         TIMECD, YEAR, PERYEAR, QUARTER, PERQUARTER, MONTH, PERMONTH
    I tried to create a BMM that maps both fact tables into the same logical fact table, but BIEE shows the DATA from Fact even when I drag PERQUARTER into the report.
    The following is my question:
    1. Can I have a Logical Fact Table that contains Fact and Fact_Aggr?
    2. How will BIEE determine which DATA should be used?
    Best Regards,
    Martin

    The basic rule for one LTS vs. two is: if you use table A AND table B together, then one LTS; if you use table A OR table B, then 2 LTSs.
    So when you have an aggregate fact table, you use either the aggregate or the base fact, meaning you should have 2 LTSs. OBI understands how to use one or the other through a variety of rules, but the main piece of configuration you need is proper levels defined on each and every LTS's Content tab. In your example, you will need to make sure the grains of your aggregate and base fact tables match the time dimension properly; normally you have a day table and a month table in the time dimension, each joining to the appropriate fact LTS. If you join a month-level aggregate to a day-level time table, your results will be incorrect. OBI knows that month is higher in the hierarchy than day, so it will prefer to use the set of month LTSs when it can.
    Jeff M.
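    Applied to the tables in this thread, a rough sketch of the physical SQL that properly set Content tab levels should lead to (exact queries will differ):

      -- Month-level request: the base Fact LTS is used.
      SELECT t.PERMONTH,
             SUM(f.DATA) AS DATA_TOTAL
      FROM Fact f
      JOIN Time t ON t.TIMECD = f.TIMECD
      GROUP BY t.PERMONTH;

      -- Quarter-level request: the Fact_Aggr LTS is used instead. Its TIMECD
      -- already holds the quarter value; joining it back to the month-grain
      -- Time table on PERQUARTER would repeat each quarter once per month and
      -- inflate the totals, which is the point above about matching grains.
      SELECT fa.TIMECD AS PERQUARTER,
             SUM(fa.DATA) AS DATA_TOTAL
      FROM Fact_Aggr fa
      GROUP BY fa.TIMECD;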

  • How do you use the Multiple Item Information dialog box ???

    How do you use the Multiple Item Information dialog box ???
    Where are the instructions on how the information in the Multiple Item Information dialog box equates to ...
    1. The way iTunes sorts tracks and albums
    2. The reason to select a leading check box
    3. Why there are Option selections (Yes /No) and leading check boxes.
    4. Why some changes remain in the track info, but do not "take effect" in iTunes (Part of a compilation is an example)
    Looked in Help and Support, went to the local Genius Bar for an hour, even arranged a call from Apple support ...
    Thanks

    As Christopher says, it's a compilation. Different tracks are by different artists.
    Setting the *Album Artist* field to *Various Artists* and setting *Part of a compilation* to Yes should be all that is required. Depending on your *Group compilations when browsing* setting ( I recommend On ) either should suffice but I suggest doing both.
    Based on your commentary, I selected all the "O Brother" tracks and checked the boxes for every line that was blank in the Info and the Sort panes. The only exceptions were the album name, the disc number 1 of 1, and the artwork. I blanked and checked everything else.
    That's not what I meant. When you select multiple tracks, only those values which +are already common+ to all tracks are displayed. Typically these will include Artist, though not with compilation albums, Album Artist, Album, No. of Tracks, Genre plus various sort fields. A blank value may indicate that different tracks have different values or it may be that the value is blank for all tracks. For the drop down values on the Options tab the value shown may not reflect the information in every tag. If values you expect to be common, such as Album Artist or the Album title are not displayed you can simply type these in and click OK. This will often be enough to group the album.
    If you place a checkmark against the blank boxes and apply changes then you will clear those fields, so you should only do this if that is the effect you want. Putting a checkmark next to an empty (representing different values) *Track No.* box, for example, will just clear all the track numbers, which is very rarely useful.
    Adding then removing extra text is for a specific problem where despite all common values being identical across the tracks of the album iTunes seems to "remember" that it should see two albums. A typical example would be when an album originally listed as *Album CD1* & *Album CD2* is given disc numbers X of Y and then has the Album name changed to Album. I've seen iTunes merge all but one track into the new album, but insist on listing one remaining track separately, despite both albums having the same title. In this case I've found overtyping the album title again has no effect whereas changing it to AlbumX and then back to Album does what I was trying to achieve in the first place.
    Don't forget that even properly organised albums may still break up if you don't choose an album-friendly view. Sorting on the track name or track number columns can be useful in some circumstances, but in general I revert to Album by Artist when browsing through my library.
    tt2

  • To test how can we use the opt  'logical file name' to name the file based

    Hi Sir/Madam,
    To test: how can we use the option 'logical file name' to name the file based on the selection made in the DTP run when extracting data as a flat file?

    Hi Vishali,
    In the DTP, select the file location as application server and give the logical file path. The actual file and logical path can be created using the transactions FILE and AL11.
    The rest of the process is the same as extraction from a local file.
    Regards,
    Durgesh.

  • How to fetch data from multiple fact tables from a normalized schema?

    Hello everybody,
    I am working on a normalized schema to build my repository. I have categorized the tables into dimensions and facts. I have 3 fact tables in my schema and they have a 1:m:1 relationship, i.e. if I have tables A, B, and C, then A has a 1:m relationship with B and C also has a 1:m relationship with B. How can I use measures from these tables to create a star schema? Say A = Sales, B = Transaction, C = Payment_Amount:
    Sales(Sales ID, Amount, Tax pct, ...)
    Transaction(Transaction ID, Sales ID, Payment ID,Transaction Amt, ...)
    Payment_Amount(Payment ID, Check ID, Payment Made, ...)
    Please give me some direction to pursue.
    Thank you!
    D

    Hi dcole,
    Go through this link for snowflake schema http://gerardnico.com/wiki/datamodeling/snowflake_
    http://www.rittmanmead.com/2007/06/19/obiee-data-modeling-tips-1-integrating-1-1-and-1-many-source-tables/
    I suppose it should work with a snowflake schema... I'm interested in this topic, so please let me know what your approach is as you go along.
    By,
    KK
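    One possible way to handle the 1:m:1 chain described above is to treat Transaction as the lowest-grain fact and join outwards to Sales and Payment_Amount. A rough sketch (column names adapted from the post, join keys assumed):

      SELECT s.Sales_ID,
             p.Payment_ID,
             t.Transaction_Amt,
             s.Amount       AS Sales_Amount,   -- repeats once per transaction,
                                               -- so don't simply SUM it at this grain
             p.Payment_Made
      FROM Transaction t
      JOIN Sales s          ON s.Sales_ID   = t.Sales_ID
      JOIN Payment_Amount p ON p.Payment_ID = t.Payment_ID;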

  • How do we use Data rules/error table for source validation?

    How do we use Data rules/error table for source validation?
    We are using OWB repository 10.2.0.3.0 and OWB client 10.2.0.3.33. The Oracle version is 10 G (10.2.0.3.0). OWB is installed on Linux.
    I reviewed the posting
    Re: Using Data Rules
    Thanks for this forum.
    I want to apply data rules to a source table/view, and rows that violate a rule should go to a defined error table. Here is an example.
    Table ProjectA
    Pro_ID Number(10)
    Project_name Varchar(50)
    Pro_date Date
    As per the above posting, I created the table in the object editor and created the data rule
    NAME_NOT_NULL (i.e. project name not null). I specified the shadow table name as ProjectA_ERR.
    In the mapping editor I have ProjectA as the source. I did not find the error table name or the defined data rules in the table properties; the ERR group is not showing up on the source table.
    How do we bring the defined data rules and error table into the mapping?
    Are there any additional steps/processes?
    Any idea ?
    Thanks in advance.
    RI

    Hi,
    Thanks for your reply/pointer. I reviewed the blog. It is interesting.
    What is the version of OWB used in this blog?
    After defining the data rule/shadow table, I deployed the table via the Control Center (CC). It created an error table with all the source columns in alphabetical order. If the primary key is the 1st column in my source (and does not start with 'A'), it will appear in the middle of the columns in the error table.
    How do we prevent/work around this?
    If I have the source (view) in schema A, how do we create the error table in schema B for that source (view)?
    Is it feasible?
    I brought the error table details into the mapping and configured the data rules/error tables.
    If I pick the 'MOVE TO ERROR' option, I get "VLD-2802 Missing delete matching criteria in table. The condition is needed because the operator contains at least one data rule with a MOVE TO ERROR action".
    Under Conditional Loading I have 'All constraints' as the matching criteria.
    I changed it to 'No constraints' and I still get the above error.
    If I change to the 'REPORT' option instead of 'MOVE TO ERROR', the error goes away.
    Any idea?
    Thanks in advance.
    RI
