URGENT: MODELING A SNOWFLAKE SCHEMA

Hi, I have two fact tables (Fact1 and Fact2) and four dimension tables (D1, D2, D3, D4), plus two tables D1.1 and D1.2 that are derived from D1. The relationships in the data model are as follows:
NOTE: D1.1 and D1.2 are derived from D1
D1 1:M Fact1, D1 1:M Fact2; D2 1:M Fact1, D2 1:M Fact2; D3 1:M Fact1, D3 1:M Fact2; D4 1:M Fact1, D4 1:M Fact2
From D1 there is a child chain like this: D1 1:M D1.1, then D1.1 1:M D1.2, then D1.2 1:M D4.
Please help me with how to design this in the physical layer and the BMM layer. I will be very glad for any help; this is a very urgent requirement.
PS: Deepak Gupta, you were very helpful earlier; please help me again.
Thanks,
ch

Those child tables are snowflaked too, but we usually speak of a snowflake with respect to the dimensions attached to the FACT tables.
Here is an alternate solution, but of the same type.
First of all, create aliases for all the tables in the physical layer, but create two aliases for D1, say D1.A1 and D1.A2.
Note: you need to create joins between the aliases only; there is no need to create joins between the original tables.
In the physical layer, create all the joins as physical foreign key joins, just the way you have described, making sure that D1.A1 is connected only to the two fact aliases, and D1.A2 is connected only to the D1.1 alias.
Drag all the aliases into the BMM and work with the aliases only in the BMM layer.
Here D4 is the snowflaked dimension. So in the BMM, just bring in the D4 alias from the physical layer and drop the D1.1 alias, the D1.2 alias, and the D1.A2 alias onto it, so that you now have four logical table sources under the logical table D4.
That is it. The only extra thing is that you'll have to create two aliases for D1.
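For intuition, the merged logical table D4 then behaves like a denormalized join along the snowflake chain. A rough SQL sketch, with key and column names invented for illustration since the thread doesn't give any:

  SELECT d4.d4_key,
         d4.d4_name,
         d12.d12_name,   -- from the D1.2 source
         d11.d11_name,   -- from the D1.1 source
         d1.d1_name      -- from the D1.A2 source
  FROM   d4
  JOIN   d1_2 d12 ON d12.d12_key = d4.d12_key   -- D1.2 1:M D4
  JOIN   d1_1 d11 ON d11.d11_key = d12.d11_key  -- D1.1 1:M D1.2
  JOIN   d1       ON d1.d1_key   = d11.d1_key   -- D1   1:M D1.1

The BMM hides this join behind the single logical table D4.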
Thanks
Ashish Gupta

Similar Messages

  • Reg Snow Flake schema design

    Hi Experts,
    Can anyone post a few scenarios involving a snowflake schema that cannot be converted to a star schema in the BMM layer, for a better understanding of schemas? I searched many blogs but did not find an exact answer.
    Regards,
    Rafi

    Rafi,
    Check these links
    http://iniu.net/content/obiee-define-star-schema-over-snowflake-data-model-data-source
    http://www.varanasisaichand.com/2012/05/denormalizing-physical-tables-in-bmm.html
    http://majendi.blogspot.com/2009/01/obiee-snowflakes-and-stars-part-3.html
    http://blogs.datadirect.com/2012/02/obiee-oracle-business-intelligence-integration-with-salesforce-com-crm-and-database-com-via-odbc.html
    In general, if you follow the OBI best practices, you never end up with a snowflake schema in the BMM layer.
    If this helps, please mark it.

  • Star schema and snow flake schema

    Can anyone tell me which is better, a star schema or a snowflake schema, and why?
    Thanks in advance.

    Hi,
    Difference : http://www.diffen.com/difference/Snowflake_Schema_vs_Star_Schema
    When it comes to OBIEE, a star schema is easier to configure because it doesn't involve as many tables, whereas a snowflake schema requires denormalizing the tables in the BMM layer to get the desired model. But again, it all depends on how your system was designed; the HR sample schema, for instance, has more of a snowflake structure.
    Refer http://www.varanasisaichand.com/2012/05/denormalizing-physical-tables-in-bmm.html
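    As a quick illustration of the difference (hypothetical product/category tables, not from the thread), here is the same dimension in both shapes:

      -- Snowflake: the dimension is normalized into a chain of tables.
      CREATE TABLE category (
        category_id   INT PRIMARY KEY,
        category_name VARCHAR(50)
      );
      CREATE TABLE product (
        product_id    INT PRIMARY KEY,
        product_name  VARCHAR(50),
        category_id   INT REFERENCES category (category_id)
      );

      -- Star: the same attributes denormalized into one dimension table.
      CREATE TABLE product_dim (
        product_id    INT PRIMARY KEY,
        product_name  VARCHAR(50),
        category_name VARCHAR(50)
      );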
    Thanks,
    Saichand

  • Snow Flake Schema

    Can anybody give me some background on the snowflake schema?
    In which situations is a snowflake schema usually created in a data model?
    Can you please provide an example?
    Thanks in advance.

    In the case of a large dimension, we split it into smaller dimensions to make the tables more manageable.
    For example, if you have a dimension table that stores shop details, you may keep a separate dimension table holding the demographic information of the sales area; this provides a logical separation of the data.
    You have to decide, based on your data and requirements, whether to go for a snowflake or a star.
    If you have a snowflake defined, it will be flattened in the logical layer of OBIEE, as sketched below.
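    A sketch of that flattening, assuming hypothetical shop_dim and sales_area_demographics tables for the shop example above (column names invented):

      -- One logical shop dimension spanning both physical tables,
      -- which is what the flattening amounts to.
      CREATE VIEW shop_dim_flat AS
      SELECT s.shop_id,
             s.shop_name,
             d.population,     -- demographic attributes pulled up
             d.median_income   -- from the sales-area table
      FROM   shop_dim s
      JOIN   sales_area_demographics d ON d.area_id = s.area_id;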
    Thanks,
    Vino

  • Question for integration star and snow flake schema in data warehouse

    Dear Reader,
    I am facing a problem like this:
    I have two data warehouses; one uses a star schema, the other a snowflake schema. I would like to integrate both of them into one data warehouse. What strategy should these two data warehouses adopt in order to be integrated into one?
    Should I scrap both data warehouses and build a new one instead, or scrap one of them and use the other?
    What factors should be considered so that I can more easily resolve the differences between the two data warehouses?
    Please advise. Thank you very much.

    Hi Mallis,
    This is a very broad question and the answer depends on many factors. Please go through the following articles to get an understanding of what the differences are and when to use which.
    When do you use a star schema and when to use a snowflake schema -
    http://www.information-management.com/news/10000243-1.html
    Star vs Snowflake Schemas – what’s your belief? –
    http://www.networkworld.com/community/blog/star-vs-snowflake-schemas-%E2%80%93-what%E2%80%99s-your-belie
    Hope this helps!

  • Multiple star or snow flake schema for universe

    Hi,
    I would like to know the following:
    1. Can we use more than one star or snowflake schema for a universe? If so, how?
    2. Is using multiple schemas for one universe good practice or not?
    Regards,
    Manjunath

    Manjunath,
    This is exactly where BusinessObjects excels.
    When dealing with multiple fact tables, you use contexts.
    Contexts are very simple to understand and you must take your time to do so if you are going to successfully develop universes based on more than one fact table. No matter what universe you build, the rule for contexts is always the same; there are no different circumstances based on, say, industry.
    Each context starts with a base table, typically a fact table, where all of its joins are at the many end of the relationship. The joins are then followed outward through the joined tables, continuing up any further joins where the joined table is in turn at the many end, as in a snowflake schema.
    For example, consider the very basic schema below:
    D1 -< D2 -< F1 >- D3 -< F2 >- D4
    There are two tables that only have many joins attached to them - F1 and F2.
    Starting with F1, I can move to D2. I can also move from there to D1. In the other direction, I can move from F1 to D3. However, I cannot move from D3 to F2 because the join cardinalities are in the wrong direction. So, I've found all the joins that belong in the first context. They are D1-D2, D2-F1 and F1-D3. By the same process, I will get to the second context containing joins D3-F2 and F2-D4.
    It doesn't work well with many-to-many joins, but you really shouldn't be facing these in a well-designed multi-star.
    As for one-to-one joins, set the cardinality as one-to-many in the direction you know the relationship should take for the context ownership to work out correctly.
    The process that I've described above is essentially how the Detect Contexts algorithm works.
    The only remaining thing for you to read up on is the SQL parameters, but essentially you need to be able to select multiple contexts and generate a separate SQL statement for each context; otherwise, what's the point in defining them?
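    To make that concrete, here is a hedged sketch (invented measure and key columns) of the two statements a query spanning both contexts would split into for the schema above; the results are then merged on the shared D3 values:

      -- Context 1: joins D1-D2, D2-F1, F1-D3.
      SELECT d3.d3_name, SUM(f1.amount)
      FROM   f1
      JOIN   d3 ON d3.d3_key = f1.d3_key
      GROUP  BY d3.d3_name;

      -- Context 2: joins D3-F2, F2-D4. A separate statement, so the
      -- two fact tables are never joined to each other directly.
      SELECT d3.d3_name, SUM(f2.quantity)
      FROM   f2
      JOIN   d3 ON d3.d3_key = f2.d3_key
      GROUP  BY d3.d3_name;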
    Hope that clears it up for you.
    Regards,
    Mark

  • Snow flake schema to star schema

    Hi Gurus,
    I have a snowflake schema; how can I turn it into a star schema?
    thanks,
    kumar

    In OBIEE you can use multiple logical table sources and join two tables to make them act as a single table.
    Example: A -> B -> C.
    We need to merge B and C and call the result D; then A -> D is a star schema.
    The steps are: join B and C in the physical layer, and join B to A. Drag B into the logical layer, then drop C on top of B.
    Table B will then get all the columns from C. Done this way, it won't create two sources; it will look like a single source only.
    In the column mappings you can map each column to the desired physical table.
    Later, as usual, you can join B and A in the logical layer.
    Bring A and B into the presentation layer.
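    A rough sketch, with invented keys, of what the merged logical table looks like behind the scenes after dropping C onto B:

      -- Hypothetical: B now exposes C's columns through one source.
      SELECT b.b_key,
             b.b_name,
             c.c_name    -- mapped back to physical table C
      FROM   b
      JOIN   c ON c.c_key = b.c_key;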
    Mark if correct/helpful.
    fiaz

  • How to make snow flake schema to star in Business layer

    I have a snowflake data model.
    Can anybody tell me how to convert that snowflake to a star by adding multiple sources to a dimension?
    How do I do the joins with two dimension tables in the business layer?

    Hi,
    We can do that. For example, say we have a dimension table that is connected to another dimension table in the physical layer.
    Solution: create one new dimension table in the BMM layer, drag in the columns from the two dimension tables in the physical layer, and give the join condition; it will work.
    Regards
    prabhu haridass

  • Snow flake problem

    Hello
    What is the procedure to resolve the problem if we are only able to build a snowflake schema?
    My tables are like this:
    fact table: sales
    dimension tables: customers, products, orders
    Other tables are connected to the dimension tables by primary keys: "countries" is connected to "customers", and "orders_inventory" is connected to "orders".
    So it forms a snowflake in the physical layer; how do we model it in the BMM to form a star schema?
    If we drag columns from countries into customers in the BMM, will that work as efficiently as expected? As a user I am not able to see the data from both tables. I am working on the SH schema from the Oracle 10g database.
    thanks
    rake

    rake,
    I suggest you do some reading on dimensional modeling, and use aliases and multi-source LTSs; see the sketch below.
    You'd need to model in such a way as to avoid circular joins.
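    For the SH case specifically, the flattened customer dimension would behave like the join below (real SH tables and columns, shown only as an illustration):

      -- SH sample schema: countries folded into the customer dimension.
      SELECT c.cust_id,
             c.cust_first_name,
             c.cust_last_name,
             co.country_name   -- pulled up from SH.COUNTRIES
      FROM   sh.customers c
      JOIN   sh.countries co ON co.country_id = c.country_id;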

  • Physical model attribute "Partition Scheme" not saved. (SS2000)

    Hello,
    I believe this is a bug report.
    Scenario:
    I have a physical model of a SQL Server database. The model contains partitioned tables, one partitioning function, and one partitioning scheme. If I open the physical model properties of an index or a table, set the "Partition Scheme" attribute, save, close, and re-open the model, the "Partition Scheme" attribute is lost. If I don't close out and just re-open the properties page, the attribute is retained. Somehow, the attribute is lost in the persistence layer.
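    For readers unfamiliar with these SQL Server objects, here is a minimal T-SQL sketch (names invented, not Kurt's actual model) of a partition function, a scheme over it, and a table placed on that scheme:

      CREATE PARTITION FUNCTION pf_by_year (int)
      AS RANGE RIGHT FOR VALUES (2010, 2011, 2012);

      CREATE PARTITION SCHEME ps_by_year
      AS PARTITION pf_by_year ALL TO ([PRIMARY]);

      CREATE TABLE sales_history (
        sale_year int NOT NULL,
        amount    money
      ) ON ps_by_year (sale_year);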
    If there is a reason the attribute is being reset, please let me know.
    Thanks for your help, and for all the great products you guys make,
    Kurt
    DataModeler v 3.1.1

    Hi Kurt,
    If there is a reason the attribute is being reset
    Unfortunately there is a bug causing the problem. A fix will be available in the next patch release.
    Philip

  • Can't create connection to export model to reporting schema

    Hi,
    I am using version 3.1.1703. I am trying to export the table and column comments from Report Manager. I try to export my model to the reporting schema, but I can't create a connection: when I press the plus button to create a connection, nothing happens.
    Please help.
    Dean

    Hi,
    Just to note that the log file is normally the file datamodeler.log in the folder datamodeler\datamodeler\log (unless its location has been changed in the Preferences).
    David

  • Advice/Recommendation please? How to best handle big models with various schemas?

    Hi
    Using DM 3.3.0.747.
    We have several bespoke applications, each using its own data model in a separate schema. They do share various entities/tables, which are made available in a shared schema, and there are references to these shared tables (e.g. foreign keys).
    We want to embark on a process to reverse engineer and then maintain these applications' schemas in Data Modeler.
    I would really like your views on how best to approach this. The main concern is that the bigger the design (or the more models/sub-views), the slower actions like saving and loading become. Once it is version controlled, more time is added for saving and synchronising with the Subversion repository and between team members.
    The thinking is to primarily have each application (and hence its own schema) in a separate design (project), but include tables from the shared/core schema as needed. Would this be a good/recommended approach?
    How does one address potential changes (differences introduced) that may conflict between these separate designs?
    For example, in the physical model the schema becomes a user that has permissions, etc., and that same user may be in multiple designs. Each design is version controlled, but in the end the user only exists once in the database. If each design is synchronised with the database and its DDL generated and executed, one design may add permissions, etc. that the DDL from the second design then removes or replaces with its own.
    I hope my question/scenario is clear enough.  If not, please revert to me for more clarification.
    Thank you & Regards

    Hi Hans,
    I think it's better to use DM 4.0 and have everything in one design.
    1) You can import all the schemas you need into one design (one relational model). Save that design - it is important to do this before using the "remote objects" approach.
    2) Use "Create new models based on schema names" wizard available in context menu for relational model in browser. Wizard will create new relational model for each schema in originating model and create there tables and views that
    belong to that schema. All objects in originating model become read-only "remote objects".
    If table (in new model) has a foreign key that refer table in another schema (now that table is moved to another new model) then "remote object" presentation is created for referred table.
    At the end you'll have one large model with read-only remote objects and set of relational models that keep the track of foreign keys between tables in different schemas in database.
    You can keep the originating model (the large one) in order to have global picture or delete it.
    Properties from physical model are also transferred to new models in DM 4.0.
    3) Check "Show 'Select relational models' dialog" in "Preferences>Data Modeler" - upon "open design"  that will bring up dialog allowing to select which model   to be loaded. The choice is persisted and
    applied next time you open design (no need to see that dialog again if no change in what to be loaded).
    4) For each model you can create one or more "DDL generation" configurations and exclude remote objects from DDL generation. Permissions should be maintained in the model where the table is editable and where the DDL for that table will be generated.
    The thinking is to primarily have each application (and hence its own schema) in separate designs (projects), but include tables from the shared/core schema as needed.
    I wouldn't recommend it. I know it's possible to have remote objects in another design, but you'll face the following problems:
    1) You cannot use design-level domains and definitions in the data types model
    2) Remote objects from another design are sensitive to the location of that remote design. You need to maintain the same directory structure everywhere the related designs are checked out, and they all need to be checked out before the dependent design can be opened
    3) You need to add remote objects manually, while the wizard described above will do it for you within the scope of one design
    The main concern is that the bigger the design (or more models/sub-views) the slower actions like saving and loading it becomes.
    The number of models shouldn't be a problem with the support of the "Select relational models" dialog. Subviews will impact loading only if they are large and are visible when "save" is used.
    The total number of objects in a model doesn't impact the "save" operation - only modified objects are saved.
    Yes, there is overhead for versioned designs - one more reason to move to DM 4.0 - SVN client 1.7 is used there.
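    As background for the "remote objects" point in step 2, here is a hedged sketch (invented names) of the kind of cross-schema foreign key involved:

      -- A shared table referenced from an application schema; this FK
      -- is what makes the referred table show up as a remote object.
      CREATE TABLE shared.customers (
        customer_id NUMBER PRIMARY KEY,
        name        VARCHAR2(100)
      );

      CREATE TABLE app1.orders (
        order_id    NUMBER PRIMARY KEY,
        customer_id NUMBER REFERENCES shared.customers (customer_id)
      );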
    Philip

  • Mapping UML Model to Database Schema

    Hi All,
    Is it possible to map a UML business object model onto a database schema using the Mapping Workbench? I've done the mapping with imported Java classes, but I am wondering if the same could be done with UML models. I couldn't find any documentation about it either. This feature could be very useful in cases where you want to incorporate TopLink in the design stages of a project.
    Any help will be highly appreciated. Thanks.

    Hi Sharad,
    I agree that incorporating design-time practices into this tool might be useful, but it is not supported as you described.
    From the perspective of application development, I find it useful to first develop the object model in an IDE such as JDeveloper, which supports UML (and will automatically let you deploy the .class files of the project). I point my MW project at the output location of my .class files, which seems to work great, as I can make changes to the object model in the IDE and easily refresh the contents in the MW.
    I hope this helps.
    Darren

  • Splitting huge Schema/E-R Model into Modular Schema/Multiple E-R Models

    Hi,
    Currently we have a huge DB schema (1,500+ tables) under one single OWNER ID.
    Based on domain partitioning, we intend to create 6 new OWNER IDs (one per module/domain) and reassign the appropriate tables to the respective domains.
    Basically, we expect the 1,500 tables under one OWNER ID to be split into about 250 tables per OWNER ID, with a total of 6 new OWNER IDs being created.
    We will also need to create 6 new E-R models, one per OWNER ID.
    We are trying to find the best possible way to do this "splitting" of one huge, linear data model into 6 modular E-R models. Going forward, we would like to maintain individual models on a per-domain basis rather than the existing monolithic model.
    Any suggestions or tips on achieving this using SQL Developer Data Modeler would be greatly appreciated. What is the best and cleanest way of achieving this?
    Thanks
    Auro

    Hi Auro,
    First, you should be aware that foreign keys can only be created between tables in one model.
    If this is not a restriction for you, then you can proceed:
    1) create one subview per future model and put into each subview the tables that will constitute that model
    2) put each model in a separate design
    - use "Export > To Data Modeler Design"; select the subview that you want to export, and only the objects belonging to that subview will be exported
    Philip

  • Compare model with different schemas to database with different schemas

    Hi All,
    I have some tables in different schemas that are equal (e.g. a country table that I need in all the schemas): the name and structure are the same, but there are differences in some properties, such as mandatory columns.
    The problem arises when I try to compare with the database.
    Using "Import from Data Dictionary" with "Swap Target Model", I select the model and select the different schemas in the database.
    In the result, the table with the same name is compared, for all the schemas in my model, with the table in only one schema of the database, and at the end a table to be dropped appears for the tables in the other schemas of my model.
    The table comparison is giving me differences in the schema, but it is not comparing against the correct schema's table.
    I'm not sure if I'm doing something incorrectly or if it is just a limitation of the tool.
    Thanks in advance

    Hi Philip,
    You are right: if I select this option in the compare-tab options, I can filter which elements of the model will be compared with the data dictionary.
    But it is not very user-friendly to change this property every time I want to compare the model.
    I mean, I have generated a file for some objects with the DDL editor, but the 'generated in DDL' flag is not changing, so I need to set it manually.
    I don't know if I'm doing the generation incorrectly or if I have some problem with the configuration.
    Could you confirm that if I execute the DDL generation with only one object, that object will be the only one with the flag checked?
    In my opinion, if it were possible to include a step in the compare workflow to select the objects, it would be friendlier for the user, but this is still a solution to my problem. Thanks a lot
