Combining relational facts with dimensions from an Essbase cube

Hi!
I am having trouble combining relational measures (from EBS) with dimensions from an Essbase cube. The dimensions that we want to use for reporting (drilling, etc.) are in an Essbase cube and the facts are in EBS.
I have managed to import both the EBS tables and the cube into OBIEE (11.1.1.5) and I have created a business model on the cube. For the cube I converted the accounts dimension to a value-based dimension; other than that it was basically just drag and drop.
In this business model I created a new logical table with an LTS consisting of three tables from the relational database.
The relational data has an account key that conforms to the member key of the accounts dimension in the Essbase cube. So in the accounts dimension (in the BMM layer) I mapped the relational column to the correct column (the one already mapped to the cube) - this column now has two sources: the relational table and the cube. This account key is also available in the LTS of my fact table.
The content levels for the LTS in the fact table have all been set to detail level for the accounts dimension.
So far I am able to report on the data from the fact table (relational data only) and I can combine this report with the account key from the accounts dimension (because this column is mapped to the relational source as well as the cube). But if I expand the report with a column (from the accounts dimension) that is mapped only to the cube (the alias column that contains the description of the account key), I get an error (nQSError 14025 - see below).
Since I have modeled the facts as connected to the dimension through the common account key, I cannot understand why OBIEE doesn't know which other columns - from the same dimension - to fetch.
If this had been a purely relational model I could have done this very easily with SQL; something along the lines of: select * from relational_fact, dim_accounts where relational_fact.account_key = dim_accounts.account_key.
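Spelled out a little more (the alias and measure columns are placeholders for whatever is actually mapped to the cube), that query amounts to:

select f.account_key, d.account_alias, f.measure
from relational_fact f
join dim_accounts d on d.account_key = f.account_key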
Error message:
[nQSError: 14025] No fact table exists at the requested level of detail
Regards
Mogens
Edited by: user13050224 on Jun 19, 2012 6:40 AM

Avneet gave you the beginnings of one way, but left out a couple of things. First, you would want to export level zero only. Second, the export needs to be in column format. Third, you need to make sure the load rule you use is set to be additive, otherwise the last row will overwrite the previous values (see the SQL analogy after the list below).
A couple of other ways I can think of to do this:
Create a replicated partition that maps the three unused dimensions to null (pick the member at the top of the dimension in your mapping area).
Create a report script to extract the data, putting the three dimensions in the page so they don't show up.
Use the custom-defined function JExport in a calc script to get what you want.
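On the additive point: a level-zero column export can contain the same member combination more than once, and an additive load rule sums those rows instead of keeping only the last one. In relational terms (staging table and column names purely illustrative), additive loading behaves like:

select account, period, entity, sum(data_value) as data_value
from stg_lev0_export
group by account, period, entity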

Similar Messages

  • Drill through with OLAP Universe on Essbase cube

    Hi all
    How can I achieve drill through from an Essbase cube to an Oracle database?
    Is there an option in OLAP universes to achieve this?
    If not do I have alternative options?
    Thank you for your help and kind regards,
    Dean

    Hi,
    There is no way to activate drill-through from Essbase to any other database within OLAP universes.
    If you are using Web Intelligence you have 2 alternatives:
    1. The first one is to create 2 data providers (1 on Essbase and 1 on Oracle) and synchronize them.
    This is not really a drill-through operation, but you have both data sources in sync in the same document.
    2. The second option is to create 2 Web Intelligence documents (1 on Essbase and 1 on Oracle) and then parameterize the Essbase document to use OpenDoc capabilities (a hyperlink) to drill through to the Oracle document.
    Didier

  • Implementing Logical FACT & Logical dimension from 2 different data sources

    Hi Gurus,
    Here is my situation. We have two different data sources: one is SRMW and the other is a different source. What we are trying to do is create logical dimensions and logical facts in the BMM layer. For example, w_day_d from SRMW and another time dimension from the other source make up one logical table. Similarly, a fact from SRMW and another similar fact (with the same data types) make up a logical fact in the BMM.
    I have done a POC of it, but the only problem is that I was able to fetch data from only one data source, not the other.
    Any suggestions?
    Thanks in Advance.

    What I already mentioned is that you have to create multiple logical table sources and set the fragmentation content on each logical table source.
    When you have two physical tables for the product dimension, for example DIM_PRODUCT_A and DIM_PRODUCT_B, you must add them to your logical table Products as two separate logical table sources and map all columns to the corresponding logical table columns on the Column Mapping tab.
    After that you should go to the Content tab of each logical table source and describe what content is in that source.
    For example, for the logical table source of DIM_PRODUCT_A:
    "BM"."Product"."Product Name" <= 'Product 2'
    and for the logical table source of DIM_PRODUCT_B:
    "BM"."Product"."Product Name" >= 'Product 3'
    Then you must also check "This source should be combined with other sources at this level"
    When you run a query in Answers only on the product table, two queries will be generated to get values from both tables.
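    Roughly speaking (physical names are illustrative), the two generated queries would look like this, and the BI Server combines the results:
    select PRODUCT_NAME from DIM_PRODUCT_A where PRODUCT_NAME <= 'Product 2'
    select PRODUCT_NAME from DIM_PRODUCT_B where PRODUCT_NAME >= 'Product 3'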
    Regards,
    Stijn

  • Relating facts with different grains

    I'm having an issue designing a cube/star schema where I need to relate two fact tables that are at different grains.
    I currently have a star with one fact table (Insurance Premium) and three dimensions (Date, Account, Policy). The fact table has one row for each policy within each account for every quarter (a quarterly snapshot). The measure is Premium.
    We need to enhance this star with a new fact table that stores lost accounts for each quarter. The difference with this data is that we only get it at the account level, so the fact table will have one row for each lost account per quarter. The measure is Lost Premium.
    I created the Lost Business fact table, which relates to the same Date and Account dimensions as the Insurance Premium fact table. In the cube, I created a new measure group for the Lost Premium fact table and related it to the same Date and Account dimensions.
    The two issues that I am having are:
    1. In the Policy dimension table, there is a field called AccountBroker, which stores the broker for the account. So if the account has 5 policies, this table will have 5 rows and the AccountBroker field will be the same for each row. The end users want to be able to use this field (AccountBroker) with the Lost Premium fact so that they can slice and dice the Lost Premium by account broker. The issue is that I'm not sure how to relate the Lost Premium fact to the Policy dimension, since the Lost Premium fact is at account level and the Policy dimension is at policy level.
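    To picture issue 1 in SQL (names hypothetical): because AccountBroker repeats identically on every policy row of an account, an account-grain slice of the Policy dimension is simply
    select distinct account_key, account_broker
    from policy_dim
    which is the grain the account-level Lost Premium fact would need to join to.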
    2. The other issue is that when we receive the Lost Business file for each quarter, the account may not be present in the Premium fact because it was lost. For example, for Q1 2014, account 1234 was lost: we receive a record in the Lost Business file marking account 1234 as lost and it gets inserted into the Lost Premium fact. However, when we get the file that loads the Insurance Premium fact for Q1 2014, this account is not present because it was lost in that quarter, so there is no way to link the records. My first thought was to take the records from the previous quarter (Q4 2013) and insert them for Q1 2014 into the Insurance Premium fact with 0 premium so that they exist and there will be a match, but I am not sure if there is a better design.
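    In SQL, the workaround described would look something like this (table, column, and variable names all hypothetical):
    insert into insurance_premium_fact (account_key, policy_key, quarter_key, premium)
    select account_key, policy_key, @current_quarter, 0
    from insurance_premium_fact
    where quarter_key = @prior_quarter
      and account_key in (select account_key from lost_premium_fact
                          where quarter_key = @current_quarter)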
    Any ideas would be appreciated.
    Thanks,
    Scott

    For me,
    DEPT_FACT is not a fact. It's a dimension table because you have a one-to-many relationship and you have a measure in the dimension table (it's an aggregated measure).
    And EMP_FACT is also not a fact because you don't have any measure on it.
    But if we say that EMP_FACT is a fact, then DEPT_FACT is an aggregate table of EMP_FACT.
    I would:
    * create a logical dimension for the employee with three levels (all, department, and detail)
    * create a logical fact table with:
    - one logical column for the revenue at the department level
    - one logical column for the employee
    and two physical sources:
    * DEPT_FACT at the department level
    * EMP_FACT at the detail level
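    In other words, assuming DEPT_FACT really is just EMP_FACT rolled up (column names illustrative), the relationship between the two physical sources is:
    select dept_id, sum(revenue) as revenue
    from EMP_FACT
    group by dept_id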
    Success
    Nico

  • Error: Data load from relational source (SQL Server 2005) to Essbase cube

    Hi All,
    I am looking for help from you. I am trying to load data from a SQL Server 2005 table to an Essbase cube using the IKM SQL to Hyperion Essbase (Metadata) module.
    I am getting the error below. Let me know if I am missing something.
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 61, in ?
    com.hyperion.odi.essbase.ODIEssbaseException: Invalid value specified [RULES_FILE] for Load option [null]
         at com.hyperion.odi.essbase.ODIEssbaseMetaWriter.validateLoadOptions(Unknown Source)
         at com.hyperion.odi.essbase.AbstractEssbaseWriter.beginLoad(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
         at java.lang.reflect.Method.invoke(Unknown Source)
         at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
         at org.python.core.PyMethod.__call__(PyMethod.java)
         at org.python.core.PyObject.__call__(PyObject.java)
         at org.python.core.PyInstance.invoke(PyInstance.java)
         at org.python.pycode._pyx3.f$0(<string>:61)
         at org.python.pycode._pyx3.call_function(<string>)
         at org.python.core.PyTableCode.call(PyTableCode.java)
         at org.python.core.PyCode.call(PyCode.java)
         at org.python.core.Py.runCode(Py.java)
         at org.python.core.Py.exec(Py.java)
         at org.python.util.PythonInterpreter.exec(PythonInterpreter.java)
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Invalid value specified [RULES_FILE] for Load option [null]
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)

    ODI Step: Prepare for Loading:
    from java.util import HashMap
    from java.lang import Boolean
    from java.lang import Integer
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    # Target planning connection properties
    serverName = "HCDCD-HYPDB01"
    userName = "admin"
    password = "<@=snpRef.getInfo("DEST_PASS") @>"
    application = "BUDGET01"
    database = "PLAN1"
    portStr = "1423"
    srvportParts = serverName.split(':',2)
    srvStr = srvportParts[0]
    if(len(srvportParts) > 1):
        portStr = srvportParts[1]
    # Put the connection properties and initialize the essbase loader
    targetProps = HashMap()
    targetProps.put(ODIConstants.SERVER,srvStr)
    targetProps.put(ODIConstants.PORT,portStr)
    targetProps.put(ODIConstants.USER,userName)
    targetProps.put(ODIConstants.PASSWORD,password)
    targetProps.put(ODIConstants.APPLICATION_NAME,application)
    targetProps.put(ODIConstants.DATABASE_NAME,database)
    targetProps.put(ODIConstants.WRITER_TYPE,ODIConstants.DATA_WRITER)
    print "Initializing the essbase wrapper and connecting"
    pWriter = HypAppConnectionFactory.getAppWriter(HypAppConnectionFactory.APP_ESSBASE, targetProps);
    tableName = "BUDGET01_PLAN1"
    rulesFile = r"ActLd"
    ruleSeparator = "Tab"
    clearDatabase = "None"
    calcScript = r""
    maxErrors = 1
    logErrors = 1
    errFileName = r"E:\OraHome_ODI\Error\Budget01Plan1.err"
    logEnabled = 1
    logFileName = r"E:\OraHome_ODI\Error\Budget01Plan1.log"
    errColDelimiter = r","
    errRowDelimiter = r"\r\n"
    errTextDelimiter = r"'"
    logHeader = 1
    commitInterval = 1000
    calcOnly = 0
    preMaxlScript = r""
    postMaxlScript = r""
    abortOnPreMaxlError = 1
    # set the load options
    loadOptions = HashMap()
    loadOptions.put(ODIConstants.CLEAR_DATABASE, clearDatabase)
    loadOptions.put(ODIConstants.CALCULATION_SCRIPT, calcScript)
    loadOptions.put(ODIConstants.RULES_FILE, rulesFile)
    loadOptions.put(ODIConstants.LOG_ENABLED, Boolean(logEnabled))
    loadOptions.put(ODIConstants.LOG_FILE_NAME, logFileName)
    loadOptions.put(ODIConstants.MAXIMUM_ERRORS_ALLOWED, Integer(maxErrors))
    loadOptions.put(ODIConstants.LOG_ERRORS, Boolean(logErrors))
    loadOptions.put(ODIConstants.ERROR_LOG_FILENAME, errFileName)
    loadOptions.put(ODIConstants.RULE_SEPARATOR, ruleSeparator)
    loadOptions.put(ODIConstants.ERR_COL_DELIMITER, errColDelimiter)
    loadOptions.put(ODIConstants.ERR_ROW_DELIMITER, errRowDelimiter)
    loadOptions.put(ODIConstants.ERR_TEXT_DELIMITER, errTextDelimiter)
    loadOptions.put(ODIConstants.ERR_LOG_HEADER_ROW, Boolean(logHeader))
    loadOptions.put(ODIConstants.COMMIT_INTERVAL, Integer(commitInterval))
    loadOptions.put(ODIConstants.RUN_CALC_SCRIPT_ONLY,Boolean(calcOnly))
    loadOptions.put(ODIConstants.PRE_LOAD_MAXL_SCRIPT,preMaxlScript)
    loadOptions.put(ODIConstants.POST_LOAD_MAXL_SCRIPT,postMaxlScript)
    loadOptions.put(ODIConstants.ABORT_ON_PRE_MAXL_ERROR,Boolean(abortOnPreMaxlError))
    #call begin load
    pWriter.beginLoad(loadOptions)
    Execution step from Operator:
    Read rows: 0
    Insert/Delete/Update rows: 0
    ODI Step: Load Data Into Essbase
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select C1_ACCOUNT "Account",C3_TIMEPERIOD "TimePeriod",C4_LOBS "LOBs",C5_TREATY "Treaty",C6_SCENARIO "Scenario",C7_VERSION "Version",C8_CURRENCY "Currency",C9_YEAR "Year",C10_DEPARTMENT "Department",C11_ENTITY "Entity",C2_DIVLOC "DivLoc",C12_DATA "Data" from OdiMapping.dbo.C$_0BUDGET01_PLAN1Data where      (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    stmt.setFetchSize(srcFetchSize)
    rs = stmt.executeQuery(sql)
    #load the data
    stats = pWriter.loadData(rs)
    #close the database result set, connection
    rs.close()
    stmt.close()
    ODI Step: Report Statistics
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 2, in ?
    Essbase Writer Load Summary:
         Number of rows successfully processed: 1
         Number of rows rejected: 0
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.h.y(h.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    Where am I going wrong with the data load into Essbase? How can I find the correct way to load data into Essbase?

  • Problem with update from ODS to Cube

    Hi All,
    I had an issue: when I was loading from ODS to Cube, everything was fine except one data package, which failed because of a locked object (user: myself, though I wasn't running anything else at the time). But I checked SM12 and no locks existed for that particular time.
    Apparently the load failed because of this single package. It's a load from ODS to Cube, so there is no chance of a manual update from the PSA.
    Any suggestions? Do I have to repeat that load all over again? (It is for 40 million.)
    Regards,
    Robyn.

    Hi Hoggard,
    Check the job log for your job in SM37 and check whether anything failed for that data package, like "ARFCSTATE-SYSFAIL". If that is there, it means a deadlock; please check the ST22 dump.
    You can use the PSA:
    1) Go to the 8ODS InfoSource and copy the delta InfoPackage.
    2) In processing, check the option "Update PSA and subsequently into the data targets".
    3) Schedule the InfoPackage manually.
    Hope it helps.
    Regards,
    AK

  • SSRS Expression Current month in a Fiscal calendar dimension from a SSAS cube

    How do I create an expression for a default SSRS parameter against an SSAS cube when using a fiscal year calendar, where I want to show the current calendar month and year (for example, February 2014)?
    This is an example of my Time dimension for February 2014:
    [Time].[FiscalYMD].[Fiscal_Year].&[2013].&[10]
    The fiscal calendar runs from May to May, which means that April 2014 would look like this:
    [Time].[FiscalYMD].[Fiscal_Year].&[2013].&[12]
    and May 2014 would look like this:
    [Time].[FiscalYMD].[Fiscal_Year].&[2014].&[01]
    Please advise

    Hi HCMJ,
    To represent the fiscal calendar, you shift the current date back by 4 months. Then you can do:
    ="[Time].[FiscalYMD].[Fiscal_Year].&[" & Year(DateAdd("m", -4, Today())) & "].&[" & Format(DateAdd("m", -4, Today()), "MM") & "]"
    It just gets the year and two-digit month of '4 months ago' (Format(..., "MM") pads the month so it matches member keys like &[01]). I have not tested it, but I guess it should work.
    Regards
    Andrew Borg Cardona

  • Parallel Export from multiple Essbase cubes

    Hi,
    Is it possible to run parallel exports from Essbase where the source data resides in multiple cubes? I am looking at taking level-0 source data from multiple cubes and loading it into simultaneous text files.
    Please provide links to any documents that may help.
    Regards,
    Lijoy

    If I read your question correctly, no. You can't merge multiple cube sources into one set of export files. The files are open for output and can't be shared; if you tried to run one export after another into the same file, it would overwrite.
    You could get creative and, after the exports are done, concatenate the files together to get a single file. I'm curious - why do you need all the data in a single file? Even if you are loading into other systems, you should be able to load multiple files. If not, then concatenation through a Windows batch script or Unix shell script would give you what you want.

  • How to remove the default members of a dimension from the control table?

    Hi all,
    I am new to Hyperion FDM 11.1.1.2.0.0.
    When I navigate to Metadata > Control Tables and try to delete the default members for a dimension like Category or Period, I get the following error:
    Error: Period( 8/31/2009 ) is an active Global or Local Period and cannot be deleted.
    I then unchecked the Active checkbox for that dimension in the Hyperion Essbase Integration Setup of the Workbench Client.
    After doing that I got the following error:
    Error: Unable to retrieve target System Data.
    Can anyone tell me how to remove the default members of a dimension from the control table?
    Thanks

    Create a new record for the table.
    Go change the POV to the new record.
    Return to the control table and delete the default record.

  • Imported essbase cube into Oracle BI

    Hi all,
    Environment : Oracle BI EE and Oracle Hyperion Essbase 11.1.3
    I have created Oracle BIEE dashboards using an Essbase cube as the source.
    I am following the steps below:
    Step 1: From the Administration Tool I imported the Essbase cube into the physical layer.
    Step 2: I dragged the cube into the Business Model layer and the Presentation layer.
    Step 3: While creating Oracle BI dashboards in Answers (Presentation Services), the metadata comes through perfectly, but the measure values are always empty.
    I followed these links for creating and configuring Oracle BIEE dashboards with Oracle Hyperion Essbase 11.1.3:
    1. http://www.oracle.com/technology/obe/obe_bi/bi_ee_1013/essbase/biee_essbase.htm
    2. http://www.oracle.com/technology/obe/obe_bi/bi_ee_1013/fed_data/fed_data.html
    I am facing a problem getting the amount values, so I need help resolving this issue.
    Thanks and regards,
    Mano

    Have you set the dimension in your Essbase cube that stores the data facts (Amount Values) as an "Accounts" dimension? If not, do that. You'll need to re-import the whole thing in the BI Administration tool again and then drag it across to the BMM and Presentation layers.

  • Two facts with conformed dimension

    Starting new thread per Alistair's request :)
    I have two facts
    Class Enrollment (who is enrolled in which classes)
    External Test Scores (who got what score on what exam)
    The only dimension they have in common is "Person".
    Is it possible to build a presentation layer that includes both facts, the Person dimension, and dimensions that relate to one fact but not the other (e.g. dimension "External Test Type" or dimension "Course")?
    How would I set up the BI metadata to answer the question:
    What was the average grade for Course ABC for students who scored above a 50 on test XYZ?
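    In plain SQL terms (all table and column names purely illustrative), that question would read:
    select avg(ce.grade)
    from class_enrollment_fact ce
    join course_dim c on c.course_key = ce.course_key
    where c.course_name = 'ABC'
      and ce.person_key in (select ts.person_key
                            from test_score_fact ts
                            join test_type_dim tt on tt.test_type_key = ts.test_type_key
                            where tt.test_name = 'XYZ' and ts.score > 50)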
    So far, for External Test Scores fact, I have created hierarchies(/dimensions) for any related dimension tables that didn't already have hierarchies defined. I then went to the Content tab for External Test Scores and set aggregation levels (lowest level for everything related to the fact, "total" level for everything NOT related to the fact).
    I put both facts and the "Person" dimension into a presentation folder. I can query test scores ok, but when I add in a simple "Row Count" column from class enrollment, I get no data back.
    Do I now go back and do the same Content tab settings for the Class Enrollment fact? My initial attempts at this yielded consistency check warnings of "Logical table source does not join to any fact source".
    Huge thanks for suggestions.
    -John

    Hi John,
    I tend to harp on a lot about the need for a hierarchy on every logical dimension, even just total -> detail. It allows this (non-conformed) approach for starters, and also allows another developer to quickly ascertain the relationship between objects.
    About the complex joins, I meant do as you will have done for a normal star with just the conformed dims: foreign keys where applicable in the Physical layer, and complex joins where applicable in the BMM layer. I just wanted to make the point that you should do this as per a normal star schema with fully conformed dimensions, and merely that the non-conformed dimension will only be connected to its relevant fact table - I probably didn't need to say it, as it may be more confusing!
    You're correct about aggregation rules: ideally all objects in a logical fact should have an aggregation rule - otherwise they should be in a logical dimension.
    As to the approach, this is the way I've always handled non-conformed dimensions when the requirement is to see both facts on the same report. Our Peak Indicators OBIEE training material on this subject is as follows:
    "Order" fact table, dimensioned by Date, Customer, Product.
    "Inventory" fact table, dimensioned by Product only.
    Report requirement: show me all orders where the order_qty (order fact) is greater than the total qty in stock (inventory fact).
    So like your example, we sum up both quantities and use a WHERE clause to filter only rows where one fact measure is greater than the other.
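    As a plain-SQL sketch of that requirement (all names illustrative):
    select o.product_id, o.order_qty, i.stock_qty
    from (select product_id, sum(qty) as order_qty
          from order_fact group by product_id) o
    join (select product_id, sum(qty) as stock_qty
          from inventory_fact group by product_id) i
      on i.product_id = o.product_id
    where o.order_qty > i.stock_qty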
    What's the ideal approach?
    If you had control of the ETL, perhaps you could seed an 'unknown' record in the non-conforming dimensions and then reference it on the other fact table. But I'd say that the majority of the time, the customer who wants everything-by-everything in their subject area needs a little education; it's really not good for end users to have tens of folders with tens of columns. We should be creating analysis subject areas to enable the answering of business questions. There's no reason why we can't combine reports from different subject areas on one dashboard page, all using a common set of prompts, but there really isn't a need to give an end user the whole BMM / DW etc. in one go... unless they are paying, of course, and after you've tried to educate them they still want it their way :-)

  • Build an Essbase cube from two facts

    Hi,
    I want to know if it is possible to build an Essbase cube from two facts. My source is a DWH that has two facts:
    - Fact A with dimensions D1, D2 and D3
    - Fact B with dimensions D1, D2 and D4
    Both facts have a 1-to-n relationship. How can I model that?
    Thank you

    any suggestion?

  • Joining Two facts with one dimension

    Hi Experts,
    I am reporting out of two fact groups:
    Mailing History and Booking.
    The only common dimension between the two is the Person dimension. I have created the hierarchies and defined the aggregation content in the LTS. This lets me report using any column from the facts together with any one dimension from either fact group. But if I select two or more dimensions, one or more from each fact group, then the report gives an error: A general error has occurred. [nQSError: 14026] Unable to navigate requested expression: MailingHistory Campaign.Campaign Name. Please fix the metadata consistency warnings. (HY000)
    For example: BookingDim--BookingFact--MailingHistoryFact (OK); MailingSourceDim--BookingFact--MailingHistoryFact (OK); but BookingDim--BookingFact--MailingHistoryFact--MailingSourceDim (fails).
    I also get this error when I apply a filter from fact 1 on a column in the second fact table. I checked all the joins in the BMM layer.
    Please help me!
    Thanks,
    Surya

    Surya, did you set the levels properly for the two fact tables and also for the metrics?
    Whenever you are using two fact groups with non-conformed dimensions, you have to make sure the content levels are set for the fact tables as well as for the metrics.
    In your fact tables/metrics, set the level to Detail for related dimensions and to Total for non-related dimensions. This way the OBIEE server will understand which dimensions are related and which are not.
    Hope this works for you.

  • Single Fact with Multiple Conformed Dimensions

    Hi,
    I have a fairly simple report:
    Dim A and Dim B are joined with Fact 1
    Now, I want all rows from Dim A and Dim B along with Fact 1 values. If there are no matching values of Dim A and Dim B in Fact 1, I should get NULL or some message "No Data Available".
    If I make a left outer join from Dim A to Fact 1 and from Dim B to Fact 1 (to keep all rows from the dimension tables), I don't get the matching records, because Fact 1 may not have records for every row of Dim A or Dim B.
    I don't want to make full outer joins between Dim A and Fact 1 and between Dim B and Fact 1. Do I have any other solution?
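    To make the problem concrete, the failing attempt described above has roughly this shape (names hypothetical):
    select a.key_a, b.key_b, f.measure
    from dim_a a
    left outer join fact_1 f on f.key_a = a.key_a
    left outer join dim_b b on b.key_b = f.key_b
    -- dim_b rows with no matching fact row can never appear here,
    -- so this cannot return all rows from both dimensions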

    Hi,
    OK, let's assume I have something like a 'voucher used' flag in one fact table; this is not a measure, it's a degenerate dimension entry.
    What to do with this one? My model is clear at the moment; we have managed to have 15 facts and 10 conformed dimensions without issues.
    Now they want to analyze this 'voucher used' flag, in the best case over all fact and dimension combinations.
    This is the main requirement.
    regards,

  • 2 Facts with common dimensions

    Hi All,
    I am trying to develop an Answer.
    I have two Fact tables :- Budgets and Actuals
    The two Fact Tables have common dimensions (Period, Cost Centre and Expense)
    When I create an answer which combines these common dimensions and the two fact tables, I can build a report that compares Actuals vs. Budgets, and that works fine.
    However, the Budget fact has an additional dimension, Budget Name (which can have values like 'Budget', 'Forecast', 'Allocation'), that does not apply to Actuals.
    When I try to filter on or include the Budget Name, I lose all my Actuals.
    (Incidentally some actuals do not have budgets, some budgets do not have actuals and I want to see all rows)
    Any ideas?
    Cheers
    Pete

    Hi,
    I tried this:
    I have in BMM one subject area with two fact and common dimension
    SALES -> quantity_sold measure
    and
    SALES_TIME_ID -> select (amount_sold-100) as amount_sold, time_id from sales
    and dimensions
    TIMES
    PRODUCTS
    Both are joined to SALES, and just TIMES is joined to SALES_TIME_ID in the physical layer; in the BMM, complex joins are propagated from the facts to the dimensions.
    Test in Answers - both queries are generated:
    TIMES.CALENDAR_YEAR
    SALES.QUANTITY_SOLD
    SALES_TIME_ID.AMOUNT_SOLD
    select T20553.CALENDAR_YEAR as c1,
    sum(T20550.QUANTITY_SOLD) as c2
    from
    TIMES T20553,
    SALES T20550
    where ( T20550.TIME_ID = T20553.TIME_ID )
    group by T20553.CALENDAR_YEAR
    order by c1
    ------------------- Sending query to database named orcl (id: <<13806>>):
    select T20553.CALENDAR_YEAR as c1,
    sum(T35447.amount_sold) as c2
    from
    TIMES T20553,
    (select (amount_sold-100) as amount_sold, time_id from sales) T35447
    where ( T20553.TIME_ID = T35447.time_id )
    group by T20553.CALENDAR_YEAR
    order by c1
    PRODUCTS.PROD_CATEGORY
    TIMES.CALENDAR_YEAR
    SALES.QUANTITY_SOLD
    SALES_TIME_ID.AMOUNT_SOLD
    select distinct D1.c2 as c1,
    D1.c3 as c2,
    D1.c1 as c3,
    cast(NULL as  DOUBLE PRECISION  ) as c4
    from
    (select sum(T20550.QUANTITY_SOLD) as c1,
    T20553.CALENDAR_YEAR as c2,
    T21473.PROD_CATEGORY as c3
    from
    TIMES T20553,
    PRODUCTS T21473,
    SALES T20550
    where ( T20550.PROD_ID = T21473.PROD_ID and T20550.TIME_ID = T20553.TIME_ID )
    group by T20553.CALENDAR_YEAR, T21473.PROD_CATEGORY
    ) D1
    order by c1, c2, c4
    Amount sold is null because SALES_TIME_ID does not contain the PROD_CATEGORY attribute; it's not a common dimension.
