Aggregate Navigation

Experts,
I have 2 aggregated fact tables in my BMM layer. AggFact1 has client at the base level and product at the segment level. AggFact2 has client at the base level (the same level as AggFact1) and product at the total product level. Both agg fact tables have the other dimensions at the same levels. I have set the content levels correctly for the LTSs.
My question is: if I need to create a report that has only Time period and Client, what do I need to configure in OBIEE so that it hits AggFact2, since AggFact2 has the necessary client information and fewer records than AggFact1?
Regards,
Sarvan

You can set the Implicit Fact Column from AggFact2.

Similar Messages

  • Setting Up Aggregate Navigation in OBIEE

    Hello Gurus,
    Could somebody please tell me what aggregate navigation is and how to configure it in OBIEE?
    Thanks

    Hi
    Aggregate navigation is an excellent technique for getting aggregate values in reports.
    You can check these links, which completely explain creating the aggregate table, setting up the aggregation, and troubleshooting:
    http://www.rittmanmead.com/2006/11/aggregate-navigation-using-oracle-bi-server/
    http://hiteshbiblog.blogspot.com/2010/04/obiee-aggregate-navigation-not-hitting.html
    You can also get this from the documentation on the Oracle site.
    Thanks
    K.Babu

  • Distinct count makes the BI Server choose the wrong aggregate table

    Hi experts,
    I have 4 dimension tables (time, store, product and client) and one fact table (sales).
    The logical table sales has sources from 3 aggregate tables:
    agg_sales_1: aggregated sales for one product of one client in one store per day
    agg_sales_2: aggregated sales for one product in one store per day (all clients)
    agg_sales_3: aggregated sales for one store per day (all products, all clients)
    You can see that agg_sales_1 has a lot of rows, agg_sales_2 has fewer rows and agg_sales_3 has very few rows (one row per store and day)...
    What I need is: for all stores, to see the average sales per month (I don't care about products or clients - all of them).
    So I create:
    one logical fact column which is sum(sales), set at the time level to month: total_sales_per_month
    one logical fact column which is count(distinct(date)), set at the time level to month, which gives me how many days with sales I have in one month: '#_of_days_in_with_sales_in_month'
    and I want to have average_sales_per_month = total_sales_per_month / '#_of_days_in_with_sales_in_month'.
    So far so good:
    if in Presentation I put day and total_sales_per_month in my report, then the server chooses agg_sales_3 (which is the best solution)
    if in Presentation I put day and total_sales_per_month and '#_of_days_in_with_sales_in_month', or just average_sales_per_month, then the server chooses agg_sales_1 (which is the worst solution).
    The question is why?
    Another clue:
    if I change the aggregate function from count(distinct()) to count() (which is no good for me), then the server again chooses the good table, agg_sales_3.
    So I'm thinking that the count(distinct()) function is causing this bad behavior...
    Any suggestions please...
    And Happy Holidays
    Thanks
    Nicolae Ancuta

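    The intended calculation can be sanity-checked outside OBIEE. A minimal Python sketch (the dates and amounts are illustrative, not from the poster's schema) that derives total_sales_per_month, the distinct sale-day count, and their ratio from daily rows:

```python
from collections import defaultdict

# Illustrative daily sales rows: (date, amount).
DAILY_SALES = [
    ("2011-12-01", 100.0),
    ("2011-12-01", 50.0),   # a second sale on the same day
    ("2011-12-02", 200.0),
    ("2011-12-05", 150.0),
]

totals = defaultdict(float)   # month -> sum(sales)
sale_days = defaultdict(set)  # month -> distinct dates that had sales

for day, amount in DAILY_SALES:
    month = day[:7]           # "YYYY-MM"
    totals[month] += amount
    sale_days[month].add(day)

for month in sorted(totals):
    day_count = len(sale_days[month])    # count(distinct date) per month
    average = totals[month] / day_count  # average_sales_per_month
    print(month, totals[month], day_count, round(average, 2))
```

    The distinct-day count is the part that is not additive across pre-aggregated rows, which is why count(distinct()) pushes the server toward the finest-grained source.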

  • Aggregate navigation or fragmentation in OBIEE

    What's the better approach in the OBIEE BMM layer, aggregate navigation or fragmentation, and how? What are the pros and cons of each?

    I still cannot understand the original question: "aggregate navigation or fragmentation..."
    Searching the documentation I can find:
    Setting Up Fragmentation Content for Aggregate Navigation
    When a logical table source does not contain the entire set of data at a given level, you need to specify the portion, or fragment, of the set that it does contain. You describe the content in terms of logical columns in the Fragmentation content box in the Content tab of the Logical Table Source dialog.
    and
    Setting Up Aggregate Navigation by Creating Sources for Aggregated Fact Data
    Aggregate tables store precomputed results from measures that have been aggregated over a set of dimensional attributes. Each aggregate.....
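    To make the contrast concrete: fragmentation routes a query by the values a source contains, while aggregate navigation routes by grain. A small Python sketch of the fragmentation side (the table names and the year-based split are hypothetical):

```python
# Hypothetical fragments of one logical fact table, split by year range;
# each source declares (as fragmentation content) which slice it holds.
FRAGMENTS = [
    {"table": "SALES_HIST", "holds": lambda year: year < 2010},
    {"table": "SALES_CURR", "holds": lambda year: year >= 2010},
]

def sources_for(years):
    """Physical tables needed to cover the requested years; when more
    than one fragment matches, the server combines them (UNION ALL)."""
    return [f["table"] for f in FRAGMENTS
            if any(f["holds"](y) for y in years)]

print(sources_for([2012]))        # one fragment suffices
print(sources_for([2008, 2012]))  # both fragments are combined
```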

  • Please, can anyone send me a link for aggregate navigation

    Hi
    Can anyone send me a user guide link for aggregate navigation?
    Thanks
    Harish

    Hi Harish,
    Follow these links; they give you all the points and steps:
    http://gerardnico.com/wiki/dat/obiee/fragmentationlevel_based_
    http://www.rittmanmead.com/2006/11/01/aggregate-navigation-using-oracle-bi-server/
    http://www.rittmanmead.com/2007/10/26/using-the-obiee-aggregate-persistence-wizard/
    Hope it helps you. Don't forget to award points.
    By,
    KK

  • Content Level

    I have a fact table with 11 million rows that joins with 8 dimension tables to form a perfect star schema.
    The client said he can reduce the row count to 6 million if we remove 3 dimensions from the joins, so he created a materialized view with 6 million rows joining the 5 dimensions he is going to use the most in reports.
    The client requires the previous model as well, because 2 of the 7 reports will be based on the 11-million-row fact and the 8 dimension tables.
    Converting this scenario to aggregate and detail-level facts, I will have 2 models (let's call the 5-dimension model the aggregate and the 8-dimension model the detail). I will create a BMM subject area and promote the detail model to the BMM layer. For the aggregate, I will map each of the objects in the BMM layer, except for the 3 dimension tables not present in the agg model, to the agg model (5 dimensions) in the physical layer.
    For aggregate navigation, when creating dimension hierarchies, we set the content level for each logical table source to the level of its lowest granularity. In my scenario, however, the granularity of both models is the same. So how will an OBIEE query recognize which model to use when a report is built, given both models are set at the same content level apart from the 3 missing dimension tables?
    Please help, it is urgent. How should I implement this scenario?

    For your "fact" table in the BMM, you should have two LTSs - one for the detail-level data, one for the aggregate-level data.
    For each of your dimension tables, you should set up a hierarchy (sorry, using standard accepted terms for the objects here... not Oracle's non-standard ones...). Each dimension should have at least a "total" and a "detail" level, more if there is a proper hierarchy.
    Then, when you set up your LTSs: for the detail one, each of your eight dimensions will have its content level set to the detail level. For the aggregate LTS, on the three dimensions that aren't present, set the content level to the "total" level. I'd also set the PRIORITY GROUP so that the aggregate gets chosen ahead of the detail.
    Hope this helps!
    Scott
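    The content-level rule Scott describes can be sketched in Python. The hierarchy, source names and levels below are all illustrative; the rule is that an LTS can answer a query only when its stored grain is at least as detailed as every level the query asks for:

```python
# Levels ordered from most detailed (index 0) to grand total.
HIERARCHIES = {
    "time":    ["day", "month", "year", "total"],
    "product": ["item", "segment", "total"],
    "client":  ["client", "total"],
}

def depth(dim, level):
    """Position of a level in its hierarchy; 0 is the most detailed."""
    return HIERARCHIES[dim].index(level)

# Content levels declared on each logical table source (illustrative).
LTS_CONTENT = {
    "detail_fact": {"time": "day", "product": "item",  "client": "client"},
    "agg_fact":    {"time": "day", "product": "total", "client": "client"},
}

def can_answer(lts_name, query_levels):
    """An LTS qualifies when its stored grain is at least as detailed
    as every level the query asks for."""
    content = LTS_CONTENT[lts_name]
    return all(depth(dim, content[dim]) <= depth(dim, lvl)
               for dim, lvl in query_levels.items())

# A Time + Client report: product is rolled up to its total level.
query = {"time": "day", "client": "client", "product": "total"}
eligible = [name for name in LTS_CONTENT if can_answer(name, query)]
print(eligible)  # both qualify; the server then estimates which is smaller
```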

  • When to go for 2nd LTS

    Hi All, When should we go for the 2nd LTS in a logical Dimension table?
    One situation is when we have multiple columns coming from 2 different physical tables, and these columns exist in another single physical table; then we add a 2nd LTS. Apart from this, what could be the other scenarios?

    You have 2 logical table sources for a fact table for these reasons:
    * fact table vertical partitioning
    For example, check this URL:
    http://gerardnico.com/wiki/dat/obiee/bi_server/design/fact_table/obiee_fact-based_partitioning_outer_joins
    * federated query
    You have the same logical model in two different data sources and can join the two star queries. It's a capability of fact table vertical partitioning.
    * fact table horizontal partitioning
    When the source table depends on the content (facts older than one year are in another table).
    * aggregate navigation
    Because of the query rewrite mechanism of OBIEE. Example: if you use only the year to calculate the total, you can redirect the query to a table with only these two columns.
    The table is then smaller and the data is retrieved quickly.
    http://gerardnico.com/wiki/dat/obiee/bi_server/design/fact_table/obiee_query_rewrite
    That's all I know.
    Hope this helps.
    Cheers
    Nico
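    Nico's year example can be sketched as follows; the table names and row counts are made up, and checking requested columns against a source is a simplification of the content-level checks the BI Server really performs:

```python
# Candidate physical sources: the columns each carries and a row estimate.
SOURCES = [
    {"name": "SALES_DETAIL",
     "columns": {"day", "year", "product", "client", "sales"},
     "rows": 10_000_000},
    {"name": "SALES_BY_YEAR",
     "columns": {"year", "sales"},
     "rows": 10},
]

def rewrite(requested_columns):
    """Pick the smallest source carrying every requested column - the
    query-rewrite side of aggregate navigation, in miniature."""
    eligible = [s for s in SOURCES if requested_columns <= s["columns"]]
    return min(eligible, key=lambda s: s["rows"])["name"] if eligible else None

print(rewrite({"year", "sales"}))  # the tiny two-column aggregate wins
print(rewrite({"day", "sales"}))   # day grain forces the detail table
```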

  • Performance improvement in OBIEE 11.1.1.5

    Hi all,
    In OBIEE 11.1.1.5, reports take a long time to load. Kindly provide me some performance improvement guides.
    Thanks,
    Haree.

    Hi Haree,
    Steps to improve the performance.
    1. implement caching mechanism
    2. use aggregates
    3. use aggregate navigation
    4. limit the number of initialisation blocks
    5. turn off logging
    6. carry out calculations in database
    7. use materialized views if possible
    8. use database hints
    9. alter the NQSConfig.INI parameters
    Note: calculate all the aggregates in the repository itself and create a fast refresh for the MVs (materialized views).
    You can also schedule an iBot to run the report every hour or so, so that the report data is cached and, when a user runs the report, the BI Server serves the data from the cache.
    This is the latest version for OBIEE11g.
    http://blogs.oracle.com/pa/resource/Oracle_OBIEE_Tuning_Guide.pdf
    Report level:
    1. Enable the cache: in NQSConfig.INI, change the cache setting from NO to YES.
    2. Go to the Physical layer --> right-click the table --> Properties --> check Cacheable.
    3. Try to implement an aggregate mechanism.
    4. Create indexes/partitions at the database level.
    There are multiple other ways to fine-tune reports from the OBIEE side itself:
    1) You can check the granularity of your measures in reports and have level-based measures created in the RPD using the OBIEE utility.
    http://www.rittmanmead.com/2007/10/using-the-obiee-aggregate-persistence-wizard/
    This will pick your aggregate tables rather than the detailed tables.
    2) You can use cache seeding options, using an iBot or the NQCMD command-line utility:
    http://www.artofbi.com/index.php/2010/03/obiee-ibots-obi-caching-strategy-with-seeding-cache/
    http://satyaobieesolutions.blogspot.in/2012/07/different-to-manage-cache-in-obiee-one.html
    OR
    http://hiteshbiblog.blogspot.com/2010/08/obiee-schedule-purge-and-re-build-of.html
    Using one of the above 2 methods, you can fine-tune your reports and reduce the query time.
    Also, on the safer side, take the physical SQL from the log and run it directly on the DB to see the time taken, and check the explain plan with the help of a DBA.
    Hope this helps
    Thanks,
    Satya
    Edited by: Satya Ranki Reddy on Aug 12, 2012 7:39 PM
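    As a sketch of the NQCMD-based seeding mentioned above: the snippet only builds the command line (the DSN, credentials and script name are placeholders; -d/-u/-p/-s are the commonly documented nqcmd options for data source, user, password and SQL script):

```python
def build_nqcmd_seed_command(dsn, user, password, sql_script):
    """Assemble an nqcmd call that replays saved logical SQL against the
    BI Server, so the results land in the query cache (cache seeding)."""
    return ["nqcmd", "-d", dsn, "-u", user, "-p", password, "-s", sql_script]

# Placeholder DSN, credentials and script name:
cmd = build_nqcmd_seed_command("AnalyticsWeb", "weblogic", "secret",
                               "seed_queries.sql")
print(" ".join(cmd))
# To actually execute (needs an OBIEE client install on the PATH):
#   import subprocess
#   subprocess.run(cmd, check=True)
```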

  • Link Join Concept In Physical and Business Model Layer

    Hi,
    As we know, we can create join relationships in both the physical layer and the business model layer in OBIEE 10g. This is what confuses me.
    First of all, I tried to follow the tutorial in the Oracle Learning Library, which uses the SH schema as an example.
    In the beginning, it states that we must first create join relationships in the physical layer for all imported tables, which consist of dimension tables and fact tables.
    Then, in the business model layer, the tutorial says we must also create join relationships between the logical dimension tables and the logical fact table.
    So what, actually, is the purpose of creating join relationships in the business model layer?
    All objects in the business model layer are mapped from the physical layer, so the relationships in the business model layer should automatically be available, since they are mapped from the physical layer.
    Maybe those of you who know the concept well could explain this difference in relationships, so I can get the idea of what it is actually about.
    Thanks

    The physical layer represents the data-model joins as-is in almost all cases.
    Federated queries are the best example of why joins are implemented in both places (physical & BMM layer).
    The business model and mapping layer is modelled according to your business requirements.
    This is where your model MUST be a simple star schema; it is also where you model your hierarchies based on logical tables in the BMM layer and do the appropriate aggregate navigation.
    Hope the viewpoints presented put you in the right direction.
    Mark answers promptly.
    -bifacts
    http://www.obinotes.com
    Edited by: bifacts on Dec 16, 2010 9:19 PM

  • BI server generating wrong query

    Hi,
    I have an issue with the query generated for a report. We have one fact table and 3 dimension tables used by the query. If I use aggregate functions like max, avg and stddev in a column formula, the BI Server generates the wrong query, but a normal report without any aggregate functions works fine (the BI Server generates the correct query). In the fact table source's Content tab, the level setting for all three dimension tables is the detail level. I'm expecting the BI Server to roll up the detailed data to calculate min, max and stddev, but it's not doing that; it is trying to join many other unwanted tables in the query and fetching no results.
    How do I fix the above problem? Thanks for your help.
    Thanks
    Jay.

    One of the dimension tables has joins to other fact tables, and the query is routed through unwanted dimension and fact tables. This is happening because of aggregate navigation on the fact sources, with the Content tab set to the detail level while aggregate functions are used.

  • Count Distinct With CASE Statement - Does not follow aggregation path

    All,
    I have a fact table, a day aggregate and a month aggregate. I have a time hierarchy and the month aggregate is set to the month level, the day aggregate is set to the day level within the time hierarchy.
    When using any measure and a field from my time dimension, the appropriate aggregate is chosen: month & activity count uses the month aggregate; day & activity count uses the day aggregate.
    However, when I use the count distinct aggregation rule, the request always falls back to the lowest common denominator. The way I found to get this to work is to use a logical table source override on the Aggregation tab. Once I do this, it does use the aggregates correctly.
    A few questions
    1. Is this the correct way to use aggregate navigation for the count distinct aggregation rule (using the source override option)? If yes, why is this necessary for count distinct .. what is special about it?
    2. The main problem I have now is that I need to create a simple count measure that has a CASE statement in it. The only way I see to do this is to select the Based on Dimensions checkbox which then allows me to add a CASE statement into my count distinct clause. But now the aggregation issue comes back into play and I can't do the logical table source override when the based on dimensions checkbox is checked .. so I am now stuck .. any help is appreciated.
    K

    OK - I found a workaround (and maybe the preferred solution for my particular issue), which is: using a CASE statement with a COUNT DISTINCT aggregation and still having AGGREGATE AWARENESS.
    To get all three of the requirements above to work, I had to do the following:
    - Create the COUNT DISTINCT as normal (counting on a USERID physically mapped column, in my case).
    - Now I need to map my fact and aggregates to this column. This is where I got the CASE statement to work. Instead of trying to put the CASE statement inside the aggregate definition by using the 'Based on Dimensions' checkbox (which didn't allow for aggregate awareness for some reason), I instead specified the CASE statement in the Column Mapping section of the fact and aggregate tables.
    - Once all the LTSs (facts and aggregates) are mapped, you still have to define the logical table source overrides on the Aggregation tab of the count distinct definition. Add in all the facts and aggregates.
    Now the measure uses my month aggregate when I specify month, the day aggregate when I specify day, etc.
    If you are just trying to use a COUNT DISTINCT (no CASE statement needed) with aggregate awareness, you only need the logical table source override on the Aggregation tab.
    There is still a funky issue when using the COUNT aggregation type. As long as you don't map multiple logical table sources to the COUNT column, it works fine and as expected. But if you try to add multiple sources and aggregate awareness, it randomly starts SUMMING everything... very weird. The blog in this thread says to check the 'Based on Dimensions' checkbox to fix the problem, but that did not work for me. Still not sure what to do on this one, but it's not currently causing me a problem, so I will ignore it for now ;)
    Thanks for all the help
    K
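    The reason count distinct is special: distinct counts are not additive, so the server cannot roll a day-level distinct count up to a month; it must either hit the finest grain or be told (via the source override) that a month-level distinct is precomputed. A tiny Python illustration with made-up user IDs:

```python
# Daily activity rows: (day, user_id); the same user shows up on many days.
ROWS = [(1, "u1"), (1, "u2"), (2, "u1"), (2, "u3"), (3, "u1")]

# What a day-level aggregate can store: distinct users per day.
per_day = {}
for day, user in ROWS:
    per_day.setdefault(day, set()).add(user)
daily_counts = {day: len(users) for day, users in per_day.items()}

summed = sum(daily_counts.values())           # rolling up the counts: 5
true_month = len({user for _, user in ROWS})  # true distinct users: 3
print(summed, true_month)                     # 5 != 3
```

    Summing the stored daily counts over-counts repeat users, so the month-level distinct cannot be derived from the day aggregate; a SUM measure has no such problem.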

  • Oracle BI Joins

    Hi all,
    I have two tables with no direct join between them in the physical or logical layer, but there are a few indirect joins through other tables.
    When I put these tables in one report in Presentation Services, it doesn't give an error. From the query log I can see that it joined these tables using one of the tables in between. What I want to understand is how it decides the join. If there is more than one path between the tables, how does it decide which one to take?

    Hi seth2,
    I think if you have more than one way to go in your model, then OBIEE will produce an error.
    It is recommended that you use table aliases frequently to eliminate extraneous joins, including the following:
    * Eliminate all physical joins that cross dimensions (inter-dimensional circular joins) by using aliases.
    * Eliminate all circular joins (intra-dimensional circular joins) in a logical table source in the Physical Model by creating physical table aliases.
    Come from here : http://download.oracle.com/docs/cd/E12103_01/books/admintool/admintool_RepositorySetup7.html
    If you have more than one logical table source in the business model layer, then OBIEE will choose the source which has the lowest grain defined. OBIEE calls this aggregate navigation:
    http://gerardnico.com/wiki/dat/obiee/bi_server/design/fact_table/obiee_query_rewrite
    Regards
    Nico

  • Incorrect Logical Table Source getting picked

    Dear All,
    Can you please help me with my query.
    I have 2 logical table sources for my fact table
    LTS1 -- L1, with some number of level mappings
    LTS2 -- L2, with the same number of level mappings as L1, plus one extra level mapped
    Now, when I query the columns with respect to LTS1, although LTS1 is higher in the order of LTSs for the fact table, the OBIEE query hits LTS2 and hence we get wrong results.
    Can you please advise on my query? Appreciate your help.
    Best Regards,
    Achala

    I have found the solution. The dimension hierarchy associated with the dimension of LTS2 of the fact had the wrong count set for "Number of elements at this level".
    The information below, from http://gerardnico.com/wiki/dat/obiee/level_number_element,
    helped me figure this out.
    "Fact sources will be selected on a combination of:
    the fields selected as well as
    the levels in the dimensions to which they link.
    For example, when aggregate navigation is used, multiple fact sources exist at different grains. The Oracle BI Server multiplies the number of elements at each level for each qualified source as a way to estimate the total number of rows for that source.
    Then, the Oracle BI Server compares the result for each source and selects the source with the lowest number of total elements to answer the query. The source with the lowest number of total elements is assumed to be the fastest.
    By adjusting these values, you can alter the fact source selected by Oracle Business Intelligence in some cases."
    Thank you all,
    Achala
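    The estimation described in that quote is easy to mimic. A Python sketch with illustrative element counts: multiply the declared "number of elements" across the levels each qualified source maps to, then pick the source with the smallest product:

```python
# "Number of elements at this level" per dimension for each qualified
# source (values illustrative). The server multiplies them per source
# and prefers the source with the smallest product.
SOURCES = {
    "day_aggregate":   {"time": 365, "store": 40},
    "month_aggregate": {"time": 12,  "store": 40},
}

def estimated_elements(level_counts):
    """Product of the element counts: a rough row estimate for the source."""
    total = 1
    for count in level_counts.values():
        total *= count
    return total

estimates = {name: estimated_elements(c) for name, c in SOURCES.items()}
best = min(estimates, key=estimates.get)
print(estimates, "->", best)
```

    This is also why a wrong "Number of elements" value (as in Achala's case) silently steers the server to the wrong LTS: the estimate, not the actual table size, drives the choice.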

  • Job Openings: DataStage, BI, OBIEE, Siebel Analytics

    We have SEVERAL positions available in the DataStage, BI, OBIEE and Siebel Analytics area. These are urgent requirements and interviews are going on, so resumes without a candidate Email/Phone# will not be considered.
    Also, these positions require on-site, face-to-face screening.
    Please send us your resume/Visa Status/Hourly Bill Rate to [email protected] Please mention the position you are interested in the email subject.
    1. ETL Architect with a strong Datastage experience of 7-10 years. The candidate should have experience in a lead architect role in major data warehouse projects for fortune 500 companies. Information Server 8.0 and high tech experience are preferred.
    2. Data Analyst with DataStage experience. Knowledge of Siebel CRM, order life cycle, lead, opportunities, etc. is preferred. High tech experience preferred.
    3. Senior OBIEE Developers with minimum of 4 years of experience in OBIEE development. V.10 experience is required. RPD development expertise (with time-series, aggregate navigation) and solid dashboard development experience also required. Enterprise or broad cross-functional domain experience a plus. Should have 4+ years experience on OBIEE projects at US based companies.
    4. BI Project Manager with experience managing BI projects specifically. Can be BO, Cognos, Siebel Analytics, or another leading BI app. Ideally, the candidate would have some experience working with service/call center and the SAP Service Module. Must have very good communication skills, written and verbal. Should have 4 years minimum experience on BI projects at US-based companies.
    5. BI Analysts with OBIEE experience and experience working with service/call center data. Must have very good communication skills, written and verbal. Should have 3 years minimum OBIEE experience at US-based companies.


  • Using different levels of the same dimension

    If I have 2 fact tables and a conforming time dimension, can I make a join so that one table ignores different years completely? (One table has only 1 year of data, so it should display the same values across different years as the other table; yet I still want to be able to drill down from year to detail.) I have been somewhat successful at the annual level (when setting the desired metric to the Grand Total level), but aggregate navigation doesn't work correctly; I'm locked at the annual level. Thanks

    Hi mma (and anyone else who was following),
    Here's an update:
    a) The AGO function doesn't support anything non-integer; it's official, and a product enhancement request has been filed. Go figure: they cancel the time-series wizard, yet at the same time AGO isn't fully scalable. Since we're developing a brand-new RPD and weren't using any YAGO, MAGO, etc. tables, this bit us later rather than sooner.
    b) Using a CASE statement is no panacea (it has to change for each year).
    c) Your method (time.key = time.key (of 2005) + MOD(time.key, 365)) would probably work in my situation (I tried it, and it wasn't that difficult to implement), except that I get the following RPD consistency error: "Error: Using a complex join in table that sources time dimension" (I tried aliases as well).
    d) Right now I'm just doing that metric at the annual level: creating a view in Oracle that contains only the needed data (for 2006), using it as a physical table in the physical layer, and connecting it with all foreign keys BUT time.
    I still think there must be a better way, without complex ETL and without creating an additional column. LOJ, ROJ and FOJ didn't work.
    Thanks for looking at this again.
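    For what it's worth, the MOD idea in (c) amounts to folding every year's day keys back onto the single stored year. A toy Python version (sequential integer day keys are assumed, and leap years are ignored, exactly as MOD 365 does):

```python
DAYS_PER_YEAR = 365  # the thread's MOD 365, so leap years are ignored

def fold_key(day_key):
    """Fold any year's sequential day key back onto the single stored
    year (keys 0..364), so one year of fact rows joins every year."""
    return day_key % DAYS_PER_YEAR

# Day 10 of the stored year and day 10 of the following year
# (key 365 + 10) land on the same fact row:
print(fold_key(10), fold_key(375))
```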
