"Reverse Star schema" to star schema

Hey,
This is my first post and I've been working with OWB for about 4 months now.
My problem:
I am working with a cube, but instead of a real star schema I always get a model like this:
a fact in the fact table of my cube matches multiple records in the dimensions (1 to n).
The cube itself is OK, and even the results, but on the database side I want to implement a real star schema.
Is there any way to remodel the dimensions, the cube or the tables so that I get a star schema?
I read about bridge tables to solve this problem, but I'm not sure they can help me.
Anyone have an idea?
Thanks, Mario

See if this helps,
http://docs.oracle.com/cd/E14571_01/bi.1111/e10540/busmodlayer.htm#BGBJGJIB
http://kimballgroup.forumotion.net/t1356-weighting-factor-in-bridge-table

Similar Messages

  • Creation of star schema from snowflake schem in BMM layer

    hi,
    This is my situation: I have a fact table ("Fact") which joins to Dim 1, and Dim 1 in turn is joined to Dim 2 and Dim 3:
    Fact
      |
    Dim 1
      |       |
    Dim 2   Dim 3
    Now, in the BMM layer, how can I turn this snowflake schema into a star schema? I heard about making changes in the Logical Table Source. And what will the presentation layer look like?
    Any help is appreciated, guys.

    In the physical layer you have a join between Dim 1 and Dim 2, Dim 1 and Dim 3, and Fact and Dim 1. In the BMM, for Dim 1, add Dim 2 and Dim 3 in the sources. You may add both of these dimensions in one single LTS if the data is not duplicated in the tables. If the data is duplicated, add them as separate LTSs in the sources for Dim 1. Refer to this post for reference: Logical Table source source query
    In the BMM you need a join between Dim 1 and Fact. Basically your Dim 1 is sourced from three different tables, which are your dimensions. This transforms your snowflake into a star (a rough SQL sketch follows at the end of this reply). In your presentation layer you will have all the columns from your dimensions and facts, except for the duplicates: say you have column A in both Dim 1 and Dim 2, you should map this column in the column mapping tab so that the BI Server can pick the most economical source.
    Hope this clears your question.
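    As a rough illustration (all physical table and column names below are invented, not from this thread), this is the kind of single star query the BI Server can generate once Dim 2 and Dim 3 sit inside Dim 1's logical table source:
      SELECT d1.dim1_key,
             d2.attr_a,
             d3.attr_b,
             SUM(f.amount)
      FROM   fact_table f
      JOIN   dim1 d1 ON f.dim1_key  = d1.dim1_key
      JOIN   dim2 d2 ON d1.dim2_key = d2.dim2_key
      JOIN   dim3 d3 ON d1.dim3_key = d3.dim3_key
      GROUP BY d1.dim1_key, d2.attr_a, d3.attr_b;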

  • Star schema or Snowflake schema

    Hi Gurus,
    I have the following dimensions and fact table. Let me know whether I should go with a star schema or a snowflake schema while building the cube:
    1. Country table
    2. Workgroup table --> each country has N workgroups
    3. User table --> each workgroup has N users
    4. Time table
    5. Fact table

    This is a similar thread that discusses the design approach of star schema vs. normalized tables:
    https://social.technet.microsoft.com/Forums/sqlserver/en-US/7bf4ca30-a1bc-415d-97e6-ce0ac3137b53/normalized-3nf-vs-denormalizedstar-schema-data-warehouse-?forum=sqldatawarehousing
    In my experience, the majority of cases I've come across also use a star schema for the data marts, with the tables denormalized rather than following the principles of normalization. And since it is through SSAS cubes that you expose the OLAP model, it is much easier to implement the relationships using a denormalized approach.
    What you may do is keep a normalized data warehouse if you want, and then build the data marts over it using denormalized tables (star schema) for the cube.
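    To make that concrete, here is a hedged sketch of a denormalised (star) user dimension that flattens country -> workgroup -> user into one table; the table and column names are made up:
      CREATE TABLE dim_user AS
      SELECT u.user_id,
             u.user_name,
             w.workgroup_id,
             w.workgroup_name,
             c.country_id,
             c.country_name
      FROM   users u
      JOIN   workgroups w ON u.workgroup_id = w.workgroup_id
      JOIN   countries  c ON w.country_id   = c.country_id;
    The fact table then carries just the user key (plus the time key), and the cube sees a clean star.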
    Visakh

  • Multiple star or snowflake schemas for a universe

    Hi,
    I would like to know the following:
    1. Can we use more than one star or snowflake schema for a universe? And how do we do this?
    2. Is it good practice to use multiple schemas for one universe?
    Regards,
    Manjunath

    Manjunath,
    This is exactly where BusinessObjects excels.
    When dealing with multiple fact tables, you use contexts.
    Contexts are very simple to understand and you must take your time to do so if you are going to successfully develop universes based on more than one fact table. No matter what universe you build, the rule for contexts is always the same; there are no different circumstances based on, say, industry.
    Each context starts with a base table, typically a fact table, where all of its joins are at the many end of the relationship. The joins are then followed outwards through the joined tables, and up any further joins where those tables are in turn at the many end, as in a snowflake schema.
    For example, consider the very basic schema below:
    D1 -< D2 -< F1 >- D3 -< F2 >- D4
    There are two tables that only have many joins attached to them - F1 and F2.
    Starting with F1, I can move to D2. I can also move from there to D1. In the other direction, I can move from F1 to D3. However, I cannot move from D3 to F2 because the join cardinalities are in the wrong direction. So, I've found all the joins that belong in the first context. They are D1-D2, D2-F1 and F1-D3. By the same process, I will get to the second context containing joins D3-F2 and F2-D4.
    It doesn't work well with many-to-many joins, but you really shouldn't be facing these in a well-designed multi-star.
    As for one-to-one joins, set the cardinality as one-to-many in the direction that you know the relationship should be in for the ownership to work out correctly.
    The process that I've described above is essentially how the Detect Contexts algorithm works.
    The only remaining thing for you to read up on is the SQL parameters, but essentially you need to be able to select multiple contexts and generate a separate SQL statement for each context. Otherwise, what's the point in defining them?!
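    As a rough illustration (the measure and column names are invented), the two contexts above lead to two separate statements, something like:
      -- Context 1 (F1): joins D1-D2, D2-F1, F1-D3
      SELECT d3.d3_name, SUM(f1.measure1)
      FROM   f1
      JOIN   d2 ON f1.d2_id = d2.d2_id
      JOIN   d1 ON d2.d1_id = d1.d1_id
      JOIN   d3 ON f1.d3_id = d3.d3_id
      GROUP BY d3.d3_name;
      -- Context 2 (F2): joins D3-F2, F2-D4
      SELECT d3.d3_name, SUM(f2.measure2)
      FROM   f2
      JOIN   d3 ON f2.d3_id = d3.d3_id
      JOIN   d4 ON f2.d4_id = d4.d4_id
      GROUP BY d3.d3_name;
    The results are then stitched together on the common dimension (D3 here) rather than joining the two fact tables directly.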
    Hope that clears it up for you.
    Regards,
    Mark

  • Enquiry:  Any tool to reverse engineer an existing db schema in Oracle 8i

    Dear All,
    Do you know whether Oracle SQL Developer Data Modeler (3.1.1.703) supports Oracle 8i? If not, is there any tool that can be used to reverse engineer an existing database schema in Oracle 8i?
    Your early reply is appreciated.
    Thanks a lot.
    LAWRENCE CHOW

    That's a pity. The initial version of Data Modeler used to support import from Oracle 8i.
    You could try setting up an ODBC connection to your database and using the JDBC tab (rather than the Oracle tab) when entering the Connection details into Data Modeler.
    Alternatively, if you can get hold of an earlier version of Data Modeler (e.g. version 3.0 or earlier), you could try this (and if it works you could then upgrade your model to the current version).
    David

  • Context, Physical schema and Logical schema

    Hi,
    How are the context, physical schema, logical schema and agent interrelated?
    Please explain.
    Thanks
    Jack

    Hi Jack,
    Context:
    A context is a set of resources allowing the operation or simulation of one or more data processing applications. Contexts allow the same jobs (Reverse, Data Quality Control, Package, etc) to be executed on different databases and/or schemas.
    It is used to run the same object (process) against different databases.
    Physical Schema:
    The physical schema is a decomposition of the data server, allowing the Datastores (tables, files, etc) to be classified. Objects stored in data servers with this mode of classification can be accessed by specifying the name of the schema attached to the object name.
    For example:
    Oracle classifies its tables by "schema" (or User). Each table is linked to a schema, thus SCOTT.EMP represents the table EMP in the schema SCOTT.
    Logical schema:
    A logical schema is an alias that allows a unique name to be given to all the physical schemas containing the same datastore structures.
    ->The aim of the logical schema is to ensure the portability of the procedures and models on the different physical schemas. In this way, all developments in ODI Designer are carried out exclusively on logical schemas.
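    As a rough illustration of the idea (the logical schema, context and physical schema names below are made up): the same code written against one logical schema resolves to a different physical schema per context.
      -- Logical schema ORA_SALES in context DEVELOPMENT -> physical schema SALES_DEV
      SELECT * FROM SALES_DEV.CUSTOMERS;
      -- Logical schema ORA_SALES in context PRODUCTION -> physical schema SALES_PROD
      SELECT * FROM SALES_PROD.CUSTOMERS;
    The agent then simply executes whatever the chosen context resolves the code to.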
    Thanks
    Madha

  • STAR TREK moving stars effect?  Plug-in or tip how to do it

    I need to create the classic STAR TREK moving-stars effect as seen in the original series and/or STAR TREK II: THE WRATH OF KHAN. Is there a ready-made plug-in for this, or is there a secret?
    Much appreciated.

    Aye Captain,
    There is a particle Emitter in Motion called “Star Field” and another “Star Travel” that should do the trick…
    G4 Dual 1 GHz/1.5 ram/Lacie(FW400)externals FCP 5.1.2/ Shake Mac OS X (10.4.8)

  • How do I move a table from one schema to another schema on Oracle XE?

    How do I move a table from one schema to another schema on Oracle XE?

    Hi,
    I tried to use the insert/select statement that you had given; it did not work.
    The error is ORA-00913: too many values.
    But finally what I did was: I went into the system schema where the table was, generated the DDL through the utilities, and afterwards imported it into the schema that I am currently working on. It solved the problem!
    However, I am still curious to know why the insert/select statement did not work. Do you know any site/tutorial that gives a real-world example?
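    (For what it's worth, ORA-00913 usually means the SELECT feeding the INSERT returns more columns than the target table expects, e.g. when the two tables don't have identical column lists. A hedged sketch with invented names, listing the columns explicitly on both sides:)
      INSERT INTO new_schema.emp (empno, ename, sal)
      SELECT empno, ename, sal
      FROM   old_schema.emp;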
    Thank you
    Skye

  • How do I move a table from one schema to another schema?

    How do I move a table from one schema to another schema?

    Grant access to the table from the source schema to the destination schema:
      GRANT SELECT ON <TABLE_NAME> TO <DESTINATION_SCHEMA>;
    A simple way would be to use CREATE TABLE ... AS SELECT (run in the destination schema):
      CREATE TABLE <TABLE_NAME> AS SELECT * FROM <SOURCE_SCHEMA>.<TABLE_NAME>;
    However, you would be in trouble when the table has indexes, constraints and triggers.
    So you are better off grabbing the DDL statement of the table (and of any additional components) and then creating the table in the destination schema. You can use SQL Developer, Toad or APEX's Object Browser for this.
    After the table is created, insert the records using SELECT:
      INSERT INTO <TABLE_NAME> SELECT * FROM <SOURCE_SCHEMA>.<TABLE_NAME>;
    This question is discussed in great detail in this AskTom thread.
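    As a minimal sketch of the grab-the-DDL route (the SCOTT.EMP names are just placeholders), DBMS_METADATA can generate the CREATE TABLE for you from SQL*Plus:
      SET LONG 100000
      SELECT DBMS_METADATA.GET_DDL('TABLE', 'EMP', 'SCOTT') FROM dual;
      -- Run the returned DDL while connected to the destination schema, then copy the rows:
      INSERT INTO emp SELECT * FROM scott.emp;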

  • How can I access all the objects of one schema from another schema

    Dear All,
    How can I access all the objects (tables, views, triggers, procedures, functions, packages, etc.) of one schema, and make modifications to them, from another schema (without using synonyms)?
    Thanks in advance,
    Mahi

    First of all, synonyms only give you an easy way to reference an object; they have no bearing on object privileges.
    As long as you have the proper privileges on the target object, you can access it with or without synonyms.
    Assuming you have the proper privileges on the objects, you can use the following command to have unqualified names resolve against the schema owner:
    ALTER SESSION SET CURRENT_SCHEMA = schema_owner;
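    For example, a minimal sketch (the HR/MAHI schema and table names are made up):
      -- As the object owner (or a DBA), grant whatever privileges are needed:
      GRANT SELECT, INSERT, UPDATE ON hr.employees TO mahi;
      -- In MAHI's session, make unqualified names resolve to HR:
      ALTER SESSION SET CURRENT_SCHEMA = HR;
      SELECT * FROM employees;  -- now resolves to HR.EMPLOYEES
    Note that CURRENT_SCHEMA only changes name resolution; it grants nothing by itself.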

  • Grant access to all the views created in user schema to another schema

    How do I grant access to all the views created in my own HAGGIS schema to the COMQDHB schema on the HAGGIS database?
    Oracle Grant Privileges
    ===============
    Object privileges assign the right to perform a particular operation on a specific object.
    I read that we can use something like: select 'grant select on '||view_name||' to HAGGIS' from user_views where owner = 'COMQDHB'
    Is this right?
    Oracle System Privileges
    ===============
    System privileges should be used only in cases where security isn't important, because a single grant statement could remove all security from the table.
    Role based security
    ============
    Role security allows you to gather related grants into a collection; since a role is a predefined collection of privileges grouped together, privileges are easier to assign to users.
    [http://www.dba-oracle.com/art_builder_grant_sec.htm]
    Can we grant select and update on all the views at once to the other schema?
    Are there any other ways to secure the data, other than creating users and assigning roles?
    Thank you
    Edited by: Trooper on Dec 23, 2008 9:24 AM

    I think what was suggested was that you use SQL to generate the grants on each and every view; that is, you use SQL to generate SQL, where the SQL being generated is "grant select on view_name to role".
    If your users connect to Oracle directly, you have to create usernames for them, though if the users only connect via an application, the application might run as just one user and access to the application is controlled via application security. The control on the application can be via directory services such as OID or MS Active Directory. User access to Oracle can also be controlled via OID.
    To connect to Oracle you can use OS authentication (not recommended), usernames with passwords, or the Advanced Security Option, which supports single sign-on products like Kerberos or Oracle Internet Directory, etc.
    Example using SQL to generate SQL
    How do I find out which users have the rights, or privileges, to access a given object ?
    http://www.jlcomp.demon.co.uk/faq/privileges.html
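    To make the generate-the-grants idea concrete, a minimal sketch (run it as HAGGIS, then spool the output and execute it; the COMQDHB name is taken from the question):
      SELECT 'GRANT SELECT ON ' || view_name || ' TO comqdhb;' AS grant_stmt
      FROM   user_views;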
    HTH -- Mark D Powell --

  • From schema 1 to schema 2 migration delegated admin problem

    I want to migrate Messaging Server 6.2 (JES 2005Q1) from schema 1 to schema 2.
    I have installed Access Manager and Delegated Admin.
    With commdirmig I migrated the domain and schema, and messaging works correctly.
    I have a problem with the Delegated Admin web interface.
    Delegated Admin does not show my domain. If I add the sundelegatedorganization objectclass I can see my domain in Delegated Admin, but I cannot see users and groups.
    Any Idea?
    TIA
    Bye Giovanni

    There are two very different products called "Delegated Admin". The old iPlanet Delegated Admin (iDA) only works with Schema 1. The current Delegated Admin, which comes with JES3, only works with Schema 2.
    If you're using the old iDA that worked with schema 1, it won't work with schema 2. You have to install the new DA for that.
    It doesn't work with groups/lists, only with users and domains.

  • Can't execute stored proc of one schema in another schema from Java app

    I am posting my problem in this forum as I thought it could be server-independent.
    I am working on Apache Tomcat and the Spring Framework with an Oracle DB (schema/user A).
    We access the Oracle DB from our Java application through a JNDI data source and it works fine. We have SQL statements, stored procs and functions, and all run fine.
    Now we created a role (DBROLE) with all permissions on that original DB schema/user (A). We created another, empty schema B and assigned that role (DBROLE) to user B.
    (We granted all kinds of permissions on the tables/packages of schema A to the role DBROLE and also created synonyms.)
    The intention is to access schema A through schema B from the application and avoid direct access.
    In our Spring application, we replaced the database settings with schema B.
    Things work fine when a plain SQL statement is run from the Java code, but stored procs won't run and we get a
    'wrong number of arguments/data types' error.
    All stored procs are in packages. To execute a stored proc from Java code, we use SimpleJdbcCall.
    I also checked running the stored proc from schema B directly and it works. Only from the web app does it not work.
    Please suggest what should be done to make this work, or whether there is another alternative.
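    For reference, a minimal sketch of the grant-and-synonym setup described above (the A/B/DBROLE names follow the question; the package name is a placeholder):
      -- In schema A (the owner of the packages):
      GRANT EXECUTE ON a.my_pkg TO dbrole;
      -- As a DBA, attach the role to the proxy user:
      GRANT dbrole TO b;
      -- In schema B, let the package be referenced without the owner prefix:
      CREATE SYNONYM my_pkg FOR a.my_pkg;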
    Thanks

    Instead of importing one schema into another schema, specify the schemas in the external-schemaLocation property:
    SAXParser saxParser = new SAXParser();
    saxParser.setProperty("http://apache.org/xml/properties/schema/external-schemaLocation", "xmlschema1.xsd, xmlschema2.xsd");

  • Best LKM to move data within Oracle from one schema to another schema

    Hi Gurus,
    What is the best KM to move data from one schema to another schema within the same Oracle database?
    Thanks in advance

    Dear,
    If your source and target are on the same database server then you don't need an LKM.
    You have to: 1. Create one data server for the database server.
    2. Create one physical schema for your source and another physical schema for your target under the data server created above.
    3. Then create models for each physical schema created above.
    In this case you just need an IKM knowledge module.
    Please refer to http://oditrainings.blogspot.in/2012/08/odi-interface-source-target-on-same.html
    If your source and target are on different servers then you must create two different data servers in the topology, and you have to use an LKM.
    The best LKM to use is LKM Oracle to Oracle (dblink), but you need the proper grants to use it.
    If your source has very few records you can go with LKM SQL to Oracle; otherwise use LKM Oracle to Oracle (dblink).
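    As a rough sketch of what the dblink-based load boils down to (the link, credential and table names below are invented):
      -- On the target data server, point a database link at the source:
      CREATE DATABASE LINK src_link
        CONNECT TO src_user IDENTIFIED BY src_password
        USING 'SRC_TNS_ALIAS';
      -- Then load straight across the link:
      INSERT INTO target_table
      SELECT * FROM source_table@src_link;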

  • To kill session in one schema from another schema

    Hi Team,
    I have a problem: a table in one of my schemas has been locked. I am getting 'ORA-00054: resource busy and acquire with NOWAIT specified' when trying to delete rows from that table, or even when trying to truncate it.
    Let the table be 'T1', present in schema 'VIEW'.
    I tried to kill the session that is active for that schema with the statements below:
    select sid, serial#, status from v$session where username = 'VIEW' and status = 'ACTIVE';
    alter system kill session '681,2586';
    But I couldn't do the above, as I don't have the DBA privilege for that. I do have the DBA privilege for another schema, let it be 'ADMIN'.
    Now, how can I kill the session in schema 'VIEW' from schema 'ADMIN'?
    Can anyone give me a solution?
    Thanks in advance
    11081985

    I have a problem: a table in one of my schemas has been locked. I am getting 'ORA-00054: resource busy and acquire with NOWAIT specified' when trying to delete rows from that table, or even when trying to truncate it.
    Before you do anything, why don't you actually find out WHY that table has been locked?
    You generally should NOT be killing sessions without knowing what is causing the problem to begin with.
    Then you also need to determine whether you should use KILL SESSION or instead use DISCONNECT SESSION, as well as whether the use of IMMEDIATE is appropriate.
    Each of those choices acts differently. Many people use KILL when they should really use DISCONNECT.
    See the DISCONNECT SESSION Clause and KILL SESSION Clause in the ALTER SYSTEM chapter of the SQL Language Reference:
    http://docs.oracle.com/cd/E11882_01/server.112/e17118/statements_2014.htm#i2282145
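    For reference, a hedged sketch of the two variants (the sid and serial# come from v$session; the values below are just the ones quoted in the question):
      -- First see who actually holds the lock:
      SELECT s.sid, s.serial#, s.username, s.status
      FROM   v$session s
      JOIN   v$locked_object lo ON lo.session_id = s.sid;
      -- Let the current transaction finish, then drop the connection:
      ALTER SYSTEM DISCONNECT SESSION '681,2586' POST_TRANSACTION;
      -- Or terminate it immediately:
      ALTER SYSTEM KILL SESSION '681,2586' IMMEDIATE;
    Either way you need the ALTER SYSTEM privilege, so this has to be run from the DBA-privileged account, not from the VIEW schema itself.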

  • HT5024 It states that it takes 24 hours for changes to come into effect. However, what does it mean that it may retain my review in the system indefinitely? So if I rate it 1 star by accident and change it to 5 stars, is the 1-star rating not removed?

    It states that it takes 24 hours for changes to come into effect. However, what does it mean that it may retain my review in the system indefinitely? So if I rate it 1 star by accident and change it to 5 stars, is the 1-star rating not removed?

     Hi,
    One of my ex-colleagues installed NI-DAQ 6.5 on our system. (And I do not see any other National Instruments card in the computer; maybe he removed it.) I deleted his account and all his files on the system. When I try to install version 8.0, it does not get installed and gives me a message that I should uninstall the previous version by going to Add/Remove Programs in the Control Panel.
    I tried doing that, but the "Change/Remove" button does not seem to work... (There is no response, and so I am unable to install the new version.)
    Any idea how this problem can be solved?
    It is a Windows XP operating system with SP2, installed on a machine with a P4 processor.
    Thanks

Maybe you are looking for

  • Simple transformation with reference to ddic structures

    Hi experts, we decided to use XML as the format when exchanging massive data with other applications, and we want to use Simple Transformation because, according to the documentation, it's faster. Actually our file structure is determined by certain ddic

  • Please help: upgrade issue moving data from char type structure to non char typ

    Hi Experts, please help, it's very urgent. DATA: workout(5000). FIELD-SYMBOLS: <FS_WORKOUT> TYPE ANY. workout = '         u'. ASSIGN workout TO <FS_WORKOUT> CASTING TYPE C. BAPISDITM = <FS_WORKOUT>. I am getting a dump after BAPISDITM

  • Time Capsule and Verizon MIFI

    I have a Verizon MiFi in our home for the internet connection. How can I use this with the Apple Time Capsule? Any input would be greatly appreciated.

  • IDOC filtering based on field

    Hi all, how can I filter an IDoc on the basis of a field (e.g. if Bank ID = SCB) and then send it to a file (save it to a local file)?

  • CUVCM 5.5 Gatekeeper error 10035 and video P2P disconnection

    Under the Cisco Gatekeeper Log on VideoCM 5.5 there are some errors : 31428:15:14:59 <GK-MC> <- Notification: H245 Call end (origin = MC) 31429:15:14:59 [APP] (LI) liTcpRecv: sock 20464 len 4 bytes -1 error 10035 Then the video P2P disconnects. Does