Star Schema and EUL

1. Is it a prerequisite to use a star schema to build an EUL, or can/must I use a relational schema?
2. Does the OLAP Option of Discoverer Plus work with an EUL or with a star schema?
3. Which components does the OLAP Option require that the Relational Option does not (e.g. an AW)?
4. I can generate a star schema as: a) a simple relational model where the foreign key is a common domain for the fact and dimension tables (e.g. department number), with a table for each dimension (I am not speaking about time dimensions, but fields like branch, department, etc.); or b) a star schema where the dimensions are grouped into tables according to their significance (e.g. Product, Channels, Time, etc.), in which case I would use an auto-generated sequential number as the key of each dimension record, referenced from the fact table. The question is: which is, in general, the better strategy, option a or option b (see the sketch after this list)? Does it depend on the size of the database?
5. There are two business areas which need the same information, but one of them will always use a summarized version with 60,000 records (the other one will process more than 1,000,000 records each time). Is there any doubt that I should use two distinct sets of tables to generate two distinct EULs or star schemas, in order to gain performance?
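A minimal sketch of the two options in question 4, using hypothetical department tables (none of these names come from the original post); option (a) keys the dimension on the business value shared with the fact table, option (b) keys it on an auto-generated surrogate:

    -- Option (a): the dimension is keyed on its natural/business value,
    -- which the fact table shares as a common domain (e.g. a dept. number).
    CREATE TABLE dim_dept_a (
        dept_no   NUMBER        PRIMARY KEY,
        dept_name VARCHAR2(100)
    );
    CREATE TABLE fact_a (
        dept_no NUMBER NOT NULL REFERENCES dim_dept_a (dept_no),
        amount  NUMBER(12,2)
    );

    -- Option (b): the dimension is keyed on an auto-generated surrogate
    -- (e.g. from a sequence); the fact table references the surrogate instead.
    CREATE TABLE dim_dept_b (
        dept_key  NUMBER        PRIMARY KEY,
        dept_no   NUMBER,                     -- natural key kept as an attribute
        dept_name VARCHAR2(100)
    );
    CREATE TABLE fact_b (
        dept_key NUMBER NOT NULL REFERENCES dim_dept_b (dept_key),
        amount   NUMBER(12,2)
    );

Surrogate keys (option b) are the more common warehouse practice because they insulate the fact table from changes in the source business keys, but that is a design judgement rather than a hard rule.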

This is a more suitable question for the Business Intelligence (EBS) forum.
In the meantime, you may want to check the BI OBE: http://www.oracle.com/technology/obe/obe_bi/bi.html , as well as http://www.oracle.com/technology/products/bi/index.html and http://www.oracle.com/technology/documentation/bi_doc.html
~ Madrid.

Similar Messages

  • Star Schema and Cubes

    Hi,
    I'm learning the basic concepts of data warehousing and I have some questions in this regard.
    1) I want to know: do we have to first create a star schema, create tables in the DBMS according to that star schema, and then build the cubes from that star schema, OR do we build the cubes directly from the data extracted from the OLTP systems?
    2) Secondly, when we create cubes, in what format are they stored in the DBMS, and are they loaded directly into memory at run time?
    Please clarify these concepts for me; as I am new to data warehousing I have some conceptual misunderstandings in this regard. Answers will be highly appreciated.
    Regards,
    D.Abbasi

    1) I want to know: do we have to first create a star schema, create tables in the DBMS according to that star schema, and then build the cubes from that star schema, OR do we build the cubes directly from the data extracted from the OLTP systems?
    The star schema is a dimensional modeling technique, and it needs to be considered while creating the dimensions, not the cubes. So you first need to design your database as either a star schema or a snowflake, then create the dimension tables, and afterwards the cubes.
    2) Secondly, when we create cubes, in what format are they stored in the DBMS, and are they loaded directly into memory at run time?
    Cubes and dimensions are stored in the database in their own format. This is a nice article on how the data is stored in the DB:
    http://www.dba-oracle.com/t_olap_dimensions_cubes.htm
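    As a rough sketch of the order described above (all names here are illustrative, not from the thread): dimension tables first, then the fact table referencing them, and only then the cubes on top of these relational structures.

        -- Step 1: dimension tables.
        CREATE TABLE dim_product (
            product_key  NUMBER PRIMARY KEY,   -- surrogate key
            product_name VARCHAR2(100)
        );
        CREATE TABLE dim_time (
            time_key     NUMBER PRIMARY KEY,
            calendar_day DATE
        );

        -- Step 2: fact table referencing the dimensions.
        CREATE TABLE sales_fact (
            product_key NUMBER NOT NULL REFERENCES dim_product (product_key),
            time_key    NUMBER NOT NULL REFERENCES dim_time (time_key),
            sale_amount NUMBER(12,2)
        );

        -- Step 3: OLAP dimensions and cubes are then defined on top of these
        -- tables (in Oracle, for example, through AWM or the DBMS_CUBE package).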
    Cheers
    Nawneet

  • How to document your star-schemas and dashboards?

    Hello practitioners,
    We are new to OBIEE and started some weeks ago. We are a two-man team: one guy doing all the data design, and the other doing everything around Answers.
    So it happens regularly that the other guy has to ask the one guy about the exact definition and meaning of the data fields he wants to use in Answers.
    Being a real programmer, the one guy of course only needs a proper, self-explanatory field name as documentation (or so he thought).
    Question:
    What tools and methods do you use to document your star schemas and dashboards?
    Being the other guy, it is important that I pick the correct fields, while on the other hand I do not want to dissect a dashboard I built a while back every time somebody wants something similar, just to figure out what I actually did back then...
    Thank you for your help
    Turalf

    Hi Turalf,
    Have you looked at the OBIEE Metadata Dictionary? It is a way to let Answers users gain access to column definitions, and you can view the mapping from the Presentation layer through the BMM to the Physical layer.
    I'm pretty sure you can also expose comments from the RPD into it.
    http://obiee101.blogspot.com/2008/12/obiee-metadata-dictionary.html
    Maybe you could look at the ODI - OBIEE data lineage also, it provides "report to source" mapping info :
    http://www.oracle.com/technology/obe/fusion_middleware/ODI/OBI-ODI_Lineage/OBI-ODI_Lineage.htm
    Hope this helps.

  • Star Schema and Oracle 11gR2 ?

    Star Schema and Oracle 11gR2 ?
    I know the star schema (ROLAP) and have implemented a couple of them. Apart from the general design principles of dimensions, facts, surrogate keys, etc., what specific items are needed in Oracle 11gR2?
    Someone talked about over 10 conditions/prerequisites for star schema (ROLAP) implementations in Oracle 11gR2. I did some searching, but I did not get any hits.
    Do we design a star schema (ROLAP) differently in Oracle 11gR2?
    Any pointers welcome.
    Thanks for helping.

    Hi,
    From my experience there are no specific requirements for star schema design when using OWB 11.2.
    When using the OWB ETL Option (extra license required), one may use the OWB dimensions and cubes.
    These make mapping development easier, since support for SCD2 is built into the dimension operators. Loading the cube is simplified because the lookup of the surrogate key from the dimension is built into the cube operator.
    These OWB objects will deploy specific dimension and fact tables. If you already have existing ones, you must modify them manually.
    I implemented several projects without these advanced features. Basically I did the same in OWB that I would have done using hand-coded SQL and PL/SQL, and it worked just fine.
    If you find those 10 conditions, please post them here. I'm curious to learn about them!
    Regards,
    Carsten.

  • Using two facts of two different star schemas and conformed dimensions

    Hi,
    I've been working as a developer and database designer for years and I'm new to Business Objects. Some people say you cannot use two facts from two different star schemas in the same query because of conformed dimensions and loop problems in BO.
    For example, I have a CUSTOMER_SALE_FACT table containing customer_id and date_id as FKs, plus some other business metrics about sales. And there is another fact table, CUSTOMER_CAMPAIGN_FACT, which also contains customer_id and date_id as FKs, plus some other business metrics about customer campaigns. So I have two stars like below:
    DIM_TIME -- SALE_FACT -- DIM_CUSTOMER
    DIM_TIME -- CAMPAIGN_FACT -- DIM_CUSTOMER
    Business metrics are loaded into the fact tables, and facts can be used together along conformed dimensions. This is one of the fundamentals of dimensional modeling. Is it really impossible to use SALE_FACT and CAMPAIGN_FACT together? If the answer is no, what is the solution?
    Saying "you cannot do that because of loops" is very interesting.
    Thank you..

    When you join two facts together with a common dimension you have created what is called a "chasm trap" which leads to invalid results because of the way SQL is processed. The query rows are first retrieved and then aggregated. Since sales fact and campaign fact have no direct relationship, the rows coming from either side can end up as a product join.
    Suppose a customer has 3 sales fact rows and 2 campaign fact rows. The result set will have six rows before any aggregation is performed. That would mean that sales measures are doubled and campaign measures are tripled.
    You can report on them together, using multiple SQL passes, but you can't query them together. Does that distinction make sense?
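    To make the chasm trap concrete, here is a hedged SQL sketch using the poster's fact tables (the measure column names are assumed): the first query fans out to 3 x 2 = 6 rows per customer and inflates both measures, while the second aggregates each fact in its own pass before combining the results.

        -- Chasm trap: joining both facts through the shared customer dimension
        -- produces a partial product join, so SUM(sale_amount) is doubled and
        -- SUM(campaign_cost) is tripled for the example customer.
        SELECT c.customer_id,
               SUM(s.sale_amount)    AS sales,
               SUM(cp.campaign_cost) AS campaign_costs
        FROM   dim_customer  c
        JOIN   sale_fact     s  ON s.customer_id  = c.customer_id
        JOIN   campaign_fact cp ON cp.customer_id = c.customer_id
        GROUP  BY c.customer_id;

        -- Multi-pass alternative: aggregate each fact separately, then combine.
        SELECT c.customer_id, s.sales, cp.campaign_costs
        FROM   dim_customer c
        LEFT JOIN (SELECT customer_id, SUM(sale_amount) AS sales
                   FROM   sale_fact GROUP BY customer_id) s
               ON s.customer_id = c.customer_id
        LEFT JOIN (SELECT customer_id, SUM(campaign_cost) AS campaign_costs
                   FROM   campaign_fact GROUP BY customer_id) cp
               ON cp.customer_id = c.customer_id;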

  • Star Schema and MV's

    Hi Guys,
    I have designed a star schema for one of my data marts, and my client is pressing me, suggesting that on top of it I should create an MV to provide a consolidated view. I am trying to convince my client not to do so with the points below:
    1. As we have created a star schema in the database, we should take advantage of it and avoid creating another reporting layer which, in future, will increase the complexity of the queries as the functionality of the mart expands.
    2. We would have to create a complete-refresh MV; during the refresh the data will not be available for reporting to users, and the refresh duration will increase over time as the data grows.
    3. As an MV is a table on disk, using an MV in this case will consume tablespace, which will grow over time.
    Please can you experts suggest any more points or additions? We are using SAP BO as the reporting tool in our organization, where a Universe can be created easily for reporting.
    Cheers,
    Shaz

    > I have designed a star schema for one of my data marts, and my client is pressing me, suggesting that on top of it I should create an MV to provide a consolidated view. I am trying to convince my client not to do so.
    You are convincing them NOT to do one of the things materialized views were originally introduced to provide?
    I'm purposely going all the way back to the 8i documentation here to emphasize the point.
    http://docs.oracle.com/cd/A87860_01/doc/server.817/a76994/qr.htm#35520
    "Overview of Query Rewrite: One of the major benefits of creating and maintaining materialized views is the ability to take advantage of query rewrite, which transforms a SQL statement expressed in terms of tables or views into a statement accessing one or more materialized views that are defined on the detail tables. The transformation is transparent to the end user or application, requiring no intervention and no reference to the materialized view in the SQL statement. Because query rewrite is transparent, materialized views can be added or dropped just like indexes without invalidating the SQL in the application code."
    The theory behind query rewrite is this: have them build their queries against your star schema (or you build a traditional view that does that), then build a materialized view that mirrors the query/view. If the materialized view is refreshing or not up to date, their queries will run (more slowly) against the star schema. If it is up to date, it will be used instead, providing faster results.
    But before you go to that trouble: they are asking for a consolidated view (presumably something easier to query - common in data warehousing). You can create a view to provide this. If that view is not fast enough for their performance requirements, materialize it. Yes, the materialized view uses space, but that space is the price you pay for meeting the performance requirement.
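    A minimal sketch of that approach (table and column names are assumed, not taken from the thread): a summary materialized view with query rewrite enabled, which the optimizer can transparently substitute for queries written against the star schema when the MV is fresh.

        -- Consolidated summary over the star schema, kept eligible for rewrite.
        CREATE MATERIALIZED VIEW sales_by_region_mv
          BUILD IMMEDIATE
          REFRESH COMPLETE ON DEMAND
          ENABLE QUERY REWRITE
        AS
        SELECT d.region, t.calendar_month, SUM(f.sale_amount) AS total_sales
        FROM   sale_fact    f
        JOIN   dim_customer d ON d.customer_key = f.customer_key
        JOIN   dim_time     t ON t.time_key     = f.time_key
        GROUP  BY d.region, t.calendar_month;

        -- Users keep writing queries against the base star schema; with
        -- QUERY_REWRITE_ENABLED = TRUE the optimizer answers this from the MV
        -- when it is fresh, and from the detail tables otherwise.
        SELECT d.region, t.calendar_month, SUM(f.sale_amount) AS total_sales
        FROM   sale_fact    f
        JOIN   dim_customer d ON d.customer_key = f.customer_key
        JOIN   dim_time     t ON t.time_key     = f.time_key
        WHERE  d.region = 'WEST'
        GROUP  BY d.region, t.calendar_month;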

  • Star schema and Infoprovider

    Hello,
    maybe this is a strange question, but is it necessary to create one star schema for each InfoProvider / InfoCube?
    Thanks in advance

    Hi
    It is imperative that you refer to some documentation related to BI modeling. There are a couple of good docs available which explain in detail how to build an InfoCube based on a star schema to meet a given reporting requirement.
    From your statement, it is clear that you still need to get a better understanding of the modeling. Refer to the link below; it's a fantastic document.
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/6ce7b0a4-0b01-0010-52ac-a6e813c35a84
    Cheers
    Umesh

  • Star schema and snow flake schema

    Can anyone tell me whether a star schema or a snowflake schema is better, and why?
    Thanks in advance.

    Hi,
    Difference: http://www.diffen.com/difference/Snowflake_Schema_vs_Star_Schema
    When it comes to OBIEE, a star schema is easier to configure because it doesn't involve as many tables, whereas with a snowflake schema you need to denormalize the tables in the BMM layer to get the desired model; but again, it all depends on how your system was designed.
    The HR schema is more like a snowflake schema structure.
    Refer to http://www.varanasisaichand.com/2012/05/denormalizing-physical-tables-in-bmm.html
    Thanks,
    Saichand

  • Newbie question : why is star schema fast and efficient?

    Hi all,
    Just a stupid question, but I haven't been able to find a proper answer so far...
    Why is a star schema a good design for data marts and DWHs? What is the underlying reason that makes it attractive performance-wise? Why wouldn't just one big table with all the data in it, with the proper indexes, be enough?
    Thanks all!!
    Regards,
    Vincent

    There are several reasons to use star schemas, particularly in Oracle.
    A flat table like you asked about looks attractive but has several flaws, i.e. massive data redundancy, no logical groupings, no aggregation (or additional redundant aggregated data), etc.
    A star schema is semi-denormalized to allow easy reporting. A truly normalized system is difficult to report against because you may have to join many tables to return just 2 pieces of related data. A star schema enables you to join only a single dimension table to the fact table to return the same 2 pieces of data. If you're returning many pieces of data, a star schema keeps access very simple. Most third-party reporting tools recognize star schemas and will build your where clauses behind the scenes, making them a lot more useful to end users.
    Oracle is adding optimizations to the CBO for star schemas. Using dimensions, materialized views, partitions, IOTs, etc. greatly enhances performance for queries against massive amounts of data. It does make loading the data more difficult, but the trade-off at query time is worth it.
    A flat table structure, besides having a lot of redundant data, is hard to optimize. When you have terabytes of data, a flat table structure gets scary even with indexes.
    This is just my opinion, hope that helps.
    Lewis
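    As an illustration of the single-dimension-join point above, a typical star query looks something like this (the fact and dimension names are made up for the example):

        -- One join per dimension, filters on dimension attributes,
        -- aggregation of the fact measures.
        SELECT c.customer_region,
               t.calendar_year,
               SUM(f.sale_amount) AS total_sales
        FROM   sales_fact   f
        JOIN   dim_customer c ON c.customer_key = f.customer_key
        JOIN   dim_time     t ON t.time_key     = f.time_key
        WHERE  t.calendar_year = 2002
        GROUP  BY c.customer_region, t.calendar_year;

    In a fully normalized model the same two pieces of data could require chaining through several intermediate tables, which is exactly the join overhead the star layout avoids.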

  • BMM layer creation (Star Schema Physical Layer) - What to add/not add?

    Hi All,
    I am just looking for any general feedback on the thought/question below.
    I am setting out on creating my first BMM layer and trying to determine what I need to do in this layer that will be different and add value beyond what I already did in the physical layer. My data model is already defined as a star schema within my data mart source. So in the physical layer I have my facts imported along with the dimensions, and I have joined them together as needed. Here is what I think I will set up as I move into the BMM layer:
    1. I will add hierarchies as needed to enable drill-down within my reports.
    2. I will need to add my calculations/measures to allow any type of metric to be returned through a request in Answers.
    3. I do not see a large need (at least not yet) to create logical tables based on multiple physical source tables, as my source is already a star schema and dimensionally modeled. For users who also source a star schema at the physical layer: do you find that you do a lot of logical table creation/mapping to add functionality, or does your BMM look a lot like your Physical layer?
    Other than steps 1 and 2, I am not really sure how much additional manipulation I would do from the Physical to the BMM layer, since my Physical layer is already a star schema. Am I missing anything? Obviously everyone's data model and circumstances are different, but I wasn't sure if there were some good thoughts on what I might be missing (if anything).
    One last question: I am not currently planning to use any aliases at the Physical layer, but I do plan to rename the tables at the Presentation layer to use more business-friendly wording. Why are others using aliases?
    Thanks in advance for the help.
    K

    Alastair, thanks for the advice. I'll definitely keep that in mind as I start to build out the BMM.
    One question/issue I just ran into as I was wrapping up my Physical Layer mapping. When I check for global consistency, I am getting an error that is complaining that I have multiple joins defined between the same two tables (which I do). This is because I have the following setup:
    TBL_A_FACT
    F_ID_HIT
    F_HIT_DESC
    F_ID_MISSED
    F_MISSED_DESC
    TBL_B_DIM
    F_ID
    F_DESC
    Table A joins to Table B in two ways:
    TBL_A_FACT.F_ID_HIT = TBL_B_DIM.F_ID
    TBL_A_FACT.F_ID_MISSED = TBL_B_DIM.F_ID
    The F_IDs can be either hit or missed on any given fact record, and the total distinct set exists in the dimension.
    When I define two foreign key joins in the physical layer based on the relationship above and check Global Consistency, I get an error saying "TBL_A and TBL_B have multiple joins defined. Delete duplicate foreign keys if they exist", and it is listed as an error. I guess this makes sense, because when the two tables are used in a request OBIEE would need to know how to join them (using the hit or the missed field). What is the best approach for handling this?
    - Should I define TBL_A twice in the physical layer as:
    TBL_A (Alias TBL_A_HIT)
    F_ID_HIT
    F_ID_HIT_DESC
    TBL_A (Alias TBL_A_MISSED)
    F_ID_MISSED
    F_ID_MISSED_DESC
    Or do something like the above in the BMM layer? And then establish the relationships using these separate tables?
    Thanks for the help!
    K
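    For what it's worth, whichever layer you model it in, the SQL that eventually has to be generated treats TBL_B_DIM as a role-playing dimension, i.e. the same physical table joined twice under different aliases; a hedged sketch:

        -- TBL_B_DIM referenced twice, once per role (hit / missed).
        SELECT f.f_id_hit,
               hit.f_desc    AS hit_desc,
               f.f_id_missed,
               missed.f_desc AS missed_desc
        FROM   tbl_a_fact f
        JOIN   tbl_b_dim  hit    ON hit.f_id    = f.f_id_hit
        JOIN   tbl_b_dim  missed ON missed.f_id = f.f_id_missed;

    In OBIEE this is typically achieved by creating one physical alias per role (most often aliases of the dimension table) and defining each join against its own alias, which removes the duplicate-join consistency error.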

  • Star schema or Snowflake schema

    Hi Gurus,
    I have the following dimension and fact tables. Please let me know whether I should go with a star schema or a snowflake schema while building the cube.
    1. Country table
    2. Work group table --> each country has N work groups
    3. User table --> each work group has N users
    4. Time table
    5. Fact table

    This is a similar thread that discusses the design approach of star schema vs. normalized tables:
    https://social.technet.microsoft.com/Forums/sqlserver/en-US/7bf4ca30-a1bc-415d-97e6-ce0ac3137b53/normalized-3nf-vs-denormalizedstar-schema-data-warehouse-?forum=sqldatawarehousing
    In my experience, the majority of cases I've come across also use a star schema for the data marts, where tables are denormalized rather than following the principles of normalization. And I believe that, as long as it is through SSAS cubes that you expose the OLAP model, it is much easier to implement the relationships using a denormalized approach.
    What you may do is have a normalized data warehouse if you want, and then build the data marts over it using denormalized tables (a star schema) for the cube.
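    As a rough sketch of the two options for the Country --> Work group --> User hierarchy in the question (all table and column names are invented): a snowflake keeps the levels in separate related tables, while a star flattens them into a single user dimension that the fact table references.

        -- Snowflake: normalized hierarchy; the fact table joins only dim_user.
        CREATE TABLE dim_country (
            country_key  INT PRIMARY KEY,
            country_name VARCHAR(100)
        );
        CREATE TABLE dim_workgroup (
            workgroup_key  INT PRIMARY KEY,
            country_key    INT REFERENCES dim_country (country_key),
            workgroup_name VARCHAR(100)
        );
        CREATE TABLE dim_user (
            user_key      INT PRIMARY KEY,
            workgroup_key INT REFERENCES dim_workgroup (workgroup_key),
            user_name     VARCHAR(100)
        );

        -- Star: one denormalized dimension carrying all three levels.
        CREATE TABLE dim_user_star (
            user_key       INT PRIMARY KEY,    -- surrogate key
            user_name      VARCHAR(100),
            workgroup_name VARCHAR(100),       -- flattened from work group
            country_name   VARCHAR(100)        -- flattened from country
        );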
    Visakh

  • HR Analytics Star Schema Definitions

    All--
    We currently report on PeopleSoft data by extracting it into a data mart (the data store's configuration is not in a star schema format). We use it for day-to-day operational reporting needs. We are researching HR Analytics and want to know more details about the data elements covered in the schemas. Is there any document that explains the details of the various star schemas and the mappings to the original PeopleSoft database elements?
    We need this to perform a gap analysis. Any help would be much appreciated.
    Thanks,
    Ajit

    Apparently on eDelivery they have finally made available the Source to Staging Mappings in spreadsheet form.
    Check that out.
    Bob Murching posted about it a few cycles ago on this forum as well.
    To clarify, that document does not exactly detail the stars. You need to download the Data Model Reference Guide from Metalink3 to see those details.

  • Do we use direct star schema concept anywhere in sap bw

    I know about the extended star schema and where SAP uses this concept.
    My question is: do we use the normal star schema concept anywhere in SAP BW, apart from the extended star schema concept?
    If yes, please explain the answer briefly.
    Thanks in advance.
    With regards,
    yash.b

    Hi,
    If I'm not mistaken, an analytic view in HANA is more like the normal star schema; it is definitely not extended, and it can be consumed by BW for OLAP processing.
    Regards,
    Michael Devine

  • Star schema of Datawarehouse

    Hi,
    I'm working with BI Apps (OBIEE) on top of a PeopleSoft HCM 9.1 source system. I'm using the pre-built ETL tasks to load the pre-designed data warehouse. I want to know where I can find the data warehouse star schema description.

    What you are looking for is the DMR (Data Warehouse Model Reference) doc. I believe they have this on Metalink for the BI Apps releases. It provides the physical star schemas and details on the DW tables. Here is the Metalink note:
    Oracle Business Analytics Warehouse Data Model Reference Version 7.9.6, 7.9.6.1 and 7.9.6.2 [ID 819373.1]
    If this was helpful, please mark the response as correct.

  • BW Star Schema & Multidimensional Data Modelling

    Hi BW Experts,
    Can anyone please let me know where I have to look on help.sap or services.sap for detailed info on the BW star schema and multidimensional data modelling, and how it is used in BW?
    Please let me know where I should check for this info.
    Thanks

    Hi,
    Star schema: please check the threads below.
    Differences between Star Schema and extended Star Schema
    What is the difference between Fact tables F & E?
    Invalid characters errors
    Multidimensional modelling:
    http://help.sap.com/bp_biv133/documentation/Multi-dimensional_modeling_EN.doc
    Hope this helps.
