How to implement fact tables with the finest level of detail (fine-grained)?

Hi,
Maybe what I'm asking here is basic knowledge... I don't know; well, here it goes:
I need to know how to carry my transactional data to a fact table while keeping the finest level of detail possible (namely, the transactions). I implemented my cubes with the MOLAP storage option (those were the specs I had to follow), so I can't add a unique constraint to those structures.
I only seem to be able to load aggregated, precomputed data. If I wanted to load the transactions (after the data has been transformed and cleansed), where should I do it?
I tried to implement a version of the fact tables as ROLAP but got nowhere (I couldn't add a unique constraint or index on that column either).
I would really, really appreciate your help.
Best Regards,
osvaldo.
[osantos]
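
For reference, a minimal sketch of what a transaction-grain fact table and its load can look like, assuming hypothetical staging and dimension names (stg_sales_clean, date_dim, product_dim); the cube would then aggregate from this table rather than from pre-aggregated data:

-- One fact row per source transaction; the transaction id is kept as a
-- degenerate dimension, so no aggregation happens during the load.
CREATE TABLE sales_fact (
  transaction_id  NUMBER        NOT NULL,   -- finest grain
  date_key        NUMBER        NOT NULL,
  product_key     NUMBER        NOT NULL,
  quantity        NUMBER,
  amount          NUMBER(12,2)
);

INSERT INTO sales_fact (transaction_id, date_key, product_key, quantity, amount)
SELECT s.transaction_id,
       d.date_key,
       p.product_key,
       s.quantity,
       s.amount
FROM   stg_sales_clean s                    -- transformed and cleansed data
JOIN   date_dim    d ON d.date_value = s.sale_date
JOIN   product_dim p ON p.product_bk = s.product_code;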

Similar Messages

  • How to handle Fact tables with different granularity in OBIEE 11g RPD

    Hello Everyone,
    I have got stuck here and need your help.
    I have two fact tables (say F1 and F2; F1 contains data at month level and F2 contains data at day level) and one Date dimension table. Date_Code is the PK of the Date dimension table. I also need to use time-series functions.
    Can anyone tell me how to model this requirement in the RPD?
    Can we use a single dimension table (here the Date dimension table) with two fact tables of different granularity? What would be the best way to implement this requirement?
    Thanks in advance :)

    Hi Veeravalli,
    Thanks for your reply :)
    Let me explain the problem in more detail. I have one Date dimension (Date_Code, Month_Code, Quarter_Code, Half_Year_Code, Year_Code). Here Date_Code is the PK.
    F1 ----> Date (using the Month_Code key)
    F2 ----> Date (using the Date_Code key)
    A level-based hierarchy is there, starting from Year down to Date. Each level has a PK defined and a chronological key selected.
    F1 has its level set to Month and F2 has its level set to Day.
    Now if I use the ago() function on a measure of F2 (which has day-level data) it works fine, but if I use the ago() function on a measure of F1, I get an error at the Presentation service: Date_code must be projected for time-series functions.
    So the whole issue is with time-series functions. As per my research, I think that for time series the tables in the physical model containing the time dimension cannot join to other data sources except at the most detailed level, but here I am joining with F1 using Month_Code, which is not the most detailed level.
    So kindly let me know how to achieve this in the RPD?
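
    One workaround consistent with that constraint is to give F1 a time-dimension source whose most detailed level is Month. A rough sketch, assuming the physical date table is called DATE_DIM (the column names come from the post above):

    -- Month-grain copy of the date dimension: one row per Month_Code.
    CREATE OR REPLACE VIEW DATE_MONTH_V AS
    SELECT DISTINCT Month_Code, Quarter_Code, Half_Year_Code, Year_Code
    FROM   DATE_DIM;

    In the RPD this view would be imported, mapped as an additional logical table source of the Date dimension at the Month level (with Month_Code as the chronological key for that level), and joined to F1 on Month_Code, so that F1 no longer joins the time dimension below its own grain.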

  • Fact table with different granularity

    I have a fact table whose measures are at different granularity. For example, there is a fact table named Project, and in it there are measure columns like project value and opportunity value (one opportunity has many projects). I have to sum the project value and the opportunity value
    by month. How do I sum the opportunity value? (I want to sum the value per distinct opportunity.) Does anyone have a suggestion? Thanks in advance.

    The purist answer is that you should NEVER have a fact table with multiple levels of granularity - it's asking for trouble. Better solution is to split off a new fact table at the proper level of granularity. Having said that, I've worked with BI Apps before so I understand the pain you're going through. In the long run though we found it better to add new fact tables and track the data at the proper levels. Less confusing (long run) than trying to kludge round data to fit in a square fact table.
    Thx,
    Scott
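
    Along those lines, one way to avoid double counting while the mixed-grain table still exists is to collapse the opportunity value to one row per opportunity before summing. A rough sketch with hypothetical names (project_fact, project_id, opportunity_id, month_id, project_value, opportunity_value):

    SELECT month_id,
           SUM(project_value)          AS total_project_value,
           SUM(opportunity_value_once) AS total_opportunity_value
    FROM  (SELECT month_id,
                  project_value,
                  -- keep the opportunity value on only one project row per opportunity
                  CASE WHEN ROW_NUMBER() OVER (PARTITION BY opportunity_id, month_id
                                               ORDER BY project_id) = 1
                       THEN opportunity_value ELSE 0 END AS opportunity_value_once
           FROM   project_fact)
    GROUP BY month_id;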

  • Join fact table with higher dimension level

    How do I join fact tables with higher dimension levels in Discoverer?
    fact with detail at level C
    measure X
    dimension with
    D->C->B->A
    E->C
    level
    A B C
    1------1------1
    2------2------1
    3------2------1
    When I join the fact (measure X) to dimension level C,
    X comes back as 3*X, because Discoverer does sum(X) and the dimension has 3 rows for that C value.
    is there a way to get correct values for X without creating a dimension like
    D->C
    E->

    another way of asking this is whether you can create a summary table in Discoverer at a higher level than a dimension's fundamental grain. In other words - the summary examples in the documentation all describe leaving out one or more of your dimensions... they are either left in or completely taken out. But, some of the most effective summarization occurs when you summarize daily data to a monthly level. Assuming that I have a sales table (at a daily level, and a key value sales_date), and a table date_dim (primary key sales_date), I would like to create a summary sales_month_summary where the sales are grouped on month_year (which is a field in the sales_date table).
    How is this done? I suspect that we can't use the date_dim table with the summary (due to the problems noted by the poster above). Do we have to create another table "month_dim"? Do we have to fold all of the desired date attributes (month, quarter, year) into the summary? Obviously we'd like to re-use all of the pertinent already existing date items (quarter, month, year, etc.), not recreate them over again, which would result in essentially two sets of items in the EUL. [One used for this month summary, and another used for the detail.]
    I searched the forum - someone asked this same question back in 2000 - there was no answer provided.
    The only other thought I have is to "snowflake" the date_dim into two tables and two folders, one at a date level, another at the month level. Then the detail tables can connect to date_dim (which is linked to month_dim), while the summary data can connect directly to month_dim.
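
    A rough sketch of that snowflake idea, with hypothetical names (a sales table with a sales_amount column; month_year, quarter and year assumed to be columns of date_dim):

    -- Month-level folder, derived from the existing date dimension.
    CREATE TABLE month_dim AS
    SELECT DISTINCT month_year, quarter, year
    FROM   date_dim;

    -- Summary fact at the monthly grain.
    CREATE TABLE sales_month_summary AS
    SELECT d.month_year,
           SUM(s.sales_amount) AS sales_amount
    FROM   sales s
    JOIN   date_dim d ON d.sales_date = s.sales_date
    GROUP  BY d.month_year;

    Detail queries keep joining sales to date_dim (which rolls up to month_dim), while the summary folder joins sales_month_summary directly to month_dim, so the existing month/quarter/year items can be reused rather than recreated.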

  • Joining two fact tables with different dimensions into single logical table

    Hi Gurus,
    Here is what I am trying to accomplish in Oracle Business Intelligence 11.1.1.3.0:
    F1 (D1, D2 and D3)
    F2 (D1 and D2 and D4)
    And we want to build a report F1 F2 D1 D2 D3 D4 to have data for:
    F1 that match only for D1-D2-D3
    and data for
    F2 that match only D1-D2-D4
    all that in one row, so D3 and D4 are not common dimensions.
    I can only do:
    F3 (D1, D2)
    F4 (D1, D2 and D4)
    And report
    F3 F4 D1,D2,D4 (all that in one row, and only D4 is not a common dimension)
    Here is a very good example of how to accomplish scenario 1:
    http://108obiee.blogspot.com/2009/08/joining-two-fact-tables-with-different.html
    But looks like it does not work in 11.1.1.3.0
    I get
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 14025] No fact table exists at the requested level of detail: [,,Clients,,Day,ROI,,,,EW_Names,,,,,,,,,,,,,,,,,]. (HY000)
    I am sure I set up everything correctly as advised in the blog, but it only works with one non-common dimension.
    Is it a bug in 11.1.1.3.0 or something?
    Thanks,
    Kate

    Thanks for all your replies.
    Actually, I've tried the solutions you guys mentioned. Generally speaking, the result should be displayed. However, my scenario is a little bit tricky.
    Table Y's figures are not the aggregation of table X over the D dimension. Instead, table Y's figures include not only the D dimension total but also others (by "others" I do not mean the A, B, or C dimensions). For example, table Y stores all foods' figures, while table X stores only drinks' figures. The D dimension only covers drinks' detail. In my scenario, the other foods' figures are not provided.
    So even if I set the D dimension to all/total for table X, table X's result is still not the same as table Y's.
    Indeed, table Y does not have a key column to join to the D dimension's key. So if I select the D dimension and table Y's measures at the same time in BI Answers, the result returns no data. Hence, I can't compare table X's and table Y's results when the D dimension is selected.
    Is there any solution to solve this problem?
    Edited by: TomChan on Jun 3, 2009 9:36 AM

  • Fact tables with common and uncommon dimensions

    Hi -
    I have 2 facts:
    F1 with Dimensions D1, D2, D3, D4
    F2 with dimensions D1, D2, D5, D6
    If I create a report with F1, F2, D1, D2 then 2 queries are created and the report is correct. Why am I not able to create a report with F1, F2, D1, D2, D3?
    Thanks !

    Hi Veeravalli -
    I found this blog helpful, as it details how to establish the relationship. http://108obiee.blogspot.ca/2009/08/joining-two-fact-tables-with-different.html
    I had a little difficulty understanding what you wanted to say or implement. Now, after reading it 3-4 times, I figured out that you mean (1) what the blog says and (2) the possibility of combining the facts as different sources.
    As a newbie it was difficult to understand it clearly.
    Thanks !

  • Using 2 fact tables with different granularity against calendar dimension

    Hello gurus,
    I have a requirement to provide a report to show the consumption of available capacity per month and also YTD.
    I have two fact tables:
    Fact table ‘Capacity’ with columns:
    - Site_id
    - Month_id
    - Capacity
    Ie.
    001, 2010M01, 50
    001, 2010M02, 50
    001, 2010M12, 75
    002, 2010M01, 60
    002, 2010M02, 65
    002, 2010M12, 80
    Etc
    Fact table ‘Consumption’ with columns
    - Site_id
    - Day_id
    - Consumption
    Ie
    001, 20100101, 2
    001, 20100102, 3
    001, 20100131, 1
    001, 20100201, 5
    001, 20100212, 6
    001, 20100228, 4
    Etc
    As can be seen above, my ‘Capacity’ table contains monthly volumes, and the ‘Consumption’ table contains daily volumes.
    My Calendar dimension is straightforward:
    Year
    Quarter_id
    Month_id
    Day_id
    Ie
    2010, 2010Q1, 2010M01, 20100101
    2010, 2010Q1, 2010M01, 20100102
    2010, 2010Q1, 2010M01, 20100103
    2010, 2010Q1, 2010M01, 20100104
    Etc
    The MfgSite dimension is also simple:
    Site_id
    Site_name
    Group
    These are the steps I have taken so far:
    - Imported the four tables
    - Created following joins:
         MfgSite.Site_id = Capacity.Site_id
         MfgSite.Site_id = Consumption.Site_id
         Calendar.Month_id = Capacity.Month_id
         Calendar.Day_id = Consumption.Day_id
    - Created Business Model Diagram in BMM
    - Created Calendar hierarchy:
         Year, Quarter, Month, Day
    - Created MfgSite hierarchy:
         Group, SiteName
    - Setup Logical Table Source / Content settings as follows:
         Fact table Capacity:
              Dimension MfgSite: Logical Level = Site
              Dimension Calendar: Logical Level = Month
         Fact table Consumption:
              Dimension MfgSite: Logical Level = Site
              Dimension Calendar: Logical Level = Day
    - Set Default Aggregation Rule to Sum on Logical Columns:
    Capacity.Capacity
    Consumption.Consumption
    - Created following YTD Logical Columns:
         YTDCapacity = TODATE(Capacity.Capacity, Calendar.Year)
         YTDConsumption = TODATE(Consumption.Consumption, Calendar.Year)
    - Created Presentation layer
    I then built a few reports to test it out and found that I have an issue with the Capacity object: When I build a simple report to show capacity per month:
    SiteName, Month, Capacity
    the capacity for each month is multiplied by the number of calendar days in that month, so I get
    Site      Month      Capacity
    001      2010M01      1550 (= 31 x 50)
    001      2010M02     1400 (= 28 x 50)
    Etc
    In addition, when I add YTDCapacity to my report, the report fails with the following message:
    Unable to navigate requested expression: ToDate(Capacity:[DAggr(Capacity.Capacity by [ Calendar.Year, Calendar.Month_id, MfgSite.Site_id, MfgSite.SiteName] )], [Level Year]). Please fix the metadata consistency warnings. (HY000)
    Did I miss any steps? Any help is greatly appreciated!
    Thanks!
    Randall

    Hi, in the Capacity fact table remove the level set for the Calendar dimension and see.
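
    The multiplication can be reproduced in plain SQL, which may help when verifying the fix; the table and column names are taken from the post above:

    -- Joining the month-grain fact to the day-grain calendar on Month_id
    -- returns one row per day, so SUM() repeats each monthly value ~30 times.
    SELECT cap.Site_id, c.Month_id, SUM(cap.Capacity)
    FROM   Calendar c
    JOIN   Capacity cap ON cap.Month_id = c.Month_id
    GROUP  BY cap.Site_id, c.Month_id;     -- 31 x 50 = 1550 for site 001, 2010M01

    -- Collapsing the calendar to month grain before the join removes the fan-out.
    SELECT cap.Site_id, m.Month_id, SUM(cap.Capacity)
    FROM  (SELECT DISTINCT Month_id FROM Calendar) m
    JOIN   Capacity cap ON cap.Month_id = m.Month_id
    GROUP  BY cap.Site_id, m.Month_id;     -- 50 for site 001, 2010M01

    In the RPD the equivalent is to make sure the Capacity fact only ever sees the calendar at month grain, for example through a month-level alias or aggregate source, instead of the day-level table.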

  • How to use error table in mapping level?

    Hi all
    Can anyone please tell me how to use an error table at the mapping level, or how to handle errors in a mapping?
    I am creating an error table in Oracle but I don't know how to use it in a mapping.
    Thanks in advance.
    Kumar

    Hi Kumar,
    You need to use the error table along with a data rule.
    Error tables are used with logical error handlers (data rules). A detailed description of using data rules is given in the following document:
    http://www.oracle.com/technology/products/warehouse/pdf/OWB10gR2_ETLandBusinessRules.pdf
    In a nutshell, usage of error tables is as follows.
    1. Create a data rule and associate it with a table (the target table, or the table into which you load data).
    2. Use the table in a mapping as target table.
    3. Deploy the table and mapping.
    Now in the target schema one more table will be created with the name <target table name>_ERR. When a mapping is executed the data which violates the data rule gets into this error table.
    The following has to be done to give the error table a name of the customer's choice.
    1. Right-click on the table and click Configure.
    2. Give the table name in the "shadow table name" section.
    3. Deploy the table.
    4. Use this table in a mapping. Now in the mapping, in this table's properties, the "Error Table Name" section will be populated with the name specified in "shadow table name".
    Thanks,
    Sutirtha

  • Load fact table with null dimension keys

    Dear All,
    We have OWB 10g R2 and a ROLAP star schema. In our source system some rows don't have all attributes populated (null values), and these empty attributes are dimension (business) keys in the star schema. Is it possible to load the fact table with such rows (some dimension keys being null) in the OWB mappings? We use the cube operator in our mappings.
    Thanks And Regards
    Miran

    The dimension should have a row indicating UNKNOWN; this will have a business key outside of the normal range, e.g. -999999.
    In the mapping the missing business keys can then be NVL'd to -999999.
    Cheers
    Si
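
    A rough sketch of that pattern, with hypothetical table and column names (product_dim, stg_sales, sales_fact):

    -- Seed the UNKNOWN member once.
    INSERT INTO product_dim (product_key, product_bk, product_name)
    VALUES (-999999, -999999, 'UNKNOWN');

    -- In the load, default missing business keys to the UNKNOWN member.
    INSERT INTO sales_fact (product_key, sale_date, amount)
    SELECT d.product_key, s.sale_date, s.amount
    FROM   stg_sales s
    JOIN   product_dim d
           ON d.product_bk = NVL(s.product_bk, -999999);

    In the OWB mapping the same NVL (for example in an Expression operator) sits on the business-key attribute before the cube operator's dimension lookup.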

  • How to create a table with events in smartforms?

    How do I create a table with an events view in Smart Forms?
    It is not like a general table with a header, main area and footer.
    For example:
    in Smart Forms: LE_SHP_DELNOTE
    the table name is TABLEITEM (Delivery items table)

    Vel wrote:
    I am creating an XML file using the DBMS_XMLGEN package. This XML file will contain data from two different database tables, so I am creating a temporary table in the PL/SQL procedure to hold the data from these different tables in one place.
    Please find below the dynamic SQL statements that I'm using to create the temp table and insert the data into it.
    Before inserting the V_NAME field, I will be appending a VARCHAR field to the original data.
    EXECUTE IMMEDIATE 'CREATE TABLE TEMP_TABLE (UNIQUE_KEY NUMBER, FILE_NAME VARCHAR2(1000), LAST_DATE DATE)';
    EXECUTE IMMEDIATE 'INSERT INTO TEMP_TABLE VALUES (SEQUENCE.nextval, :1, :2)' USING V_NAME, vLastDate;
    What I actually need is to eliminate the INSERT portion of it, since I have to insert more than 90,000 rows. Is there a way to have the temp table created with the data in it, along with the sequence value as well?
    I'm using Oracle 10.2.0.4.
    Edited by: 903948 on Dec 22, 2011 10:58 PM
    What you need to do to eliminate the INSERT statement is to -- as already suggested by others -- eliminate the temporary table. You don't need it; it is just unnecessary overhead. Please explain why you (apparently) believe that the suggestion of a view will not meet your requirements.
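
    A rough sketch of the view approach, with hypothetical table and column names (table_one, table_two, file_id); ROW_NUMBER() stands in for the sequence here, since a sequence cannot be referenced inside a view definition:

    CREATE OR REPLACE VIEW file_export_v AS
    SELECT ROW_NUMBER() OVER (ORDER BY t1.file_name) AS unique_key,
           t1.file_name,
           t2.last_date
    FROM   table_one t1
    JOIN   table_two t2 ON t2.file_id = t1.file_id;

    -- Generate the XML straight from the view, no temp table needed.
    DECLARE
      l_xml CLOB;
    BEGIN
      l_xml := DBMS_XMLGEN.getxml('SELECT * FROM file_export_v');
    END;
    /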

  • How to create a table with varied number of columns?

    I am trying to create a balance table. The columns should include the years between the start year and end year the user will input at run time. The rows will be the customers with an outstanding balance in those years.
    If the user inputs the years 2000 and 2002, the table should have columns 2000, 2001 and 2002. But if the user inputs 2000 and 2001, the table will only have columns 2000 and 2001.
    Can I do it? How? Thanks a lot.

    Why did you create a new thread for this?
    How to create a table with varied number of columns?

  • How to create a table with datatype blob and insert a pdf file (ravi)

    How to create a table with datatype BLOB and insert a PDF file?
    Please give me the explanation ASAP:
    1. How to create the table?
    2. How to insert the PDF files into the table?
    3. How to view the files?
    Thanks & Regards
    ravikumar.k
    Edited by: 895044 on Dec 5, 2011 2:55 AM

    895044 wrote:
    how to create a table with datatype blob and insert a pdf file,
    give me the explain asap
    Perhaps you should read...
    {message:id=9360002}
    especially point 2.
    We're not just sitting here waiting to answer your question as quickly as possible for you.

  • Select count from large fact tables with bitmap indexes on them

    Hi..
    I have several large fact tables with bitmap indexes on them. When I do a select count(*) from these tables, I get a different result than when I do a select count(*), column_one from the table group by column_one. I don't have any null values in these columns. Is there a patch or a one-off that can rectify this?
    Thx

    You may have corruption in the index if the queries ...
    Select /*+ full(t) */ count(*) from my_table t
    ... and ...
    Select /*+ index_combine(t my_index) */ count(*) from my_table t;
    ... give different results.
    Look at metalink for patches, and in the meantime drop-and-recreate the indexes or make them unusable then rebuild them.
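
    A rough sketch of the rebuild route mentioned above (the index, table and column names are placeholders):

    -- Either mark the index unusable and rebuild it ...
    ALTER INDEX my_index UNUSABLE;
    ALTER INDEX my_index REBUILD;

    -- ... or drop and recreate it.
    DROP INDEX my_index;
    CREATE BITMAP INDEX my_index ON my_table (column_one);

    Until the index is repaired, the full-scan count is the one to trust, since it does not read the suspect index.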

  • Logical fact table with fragmented data sources with different dimensions

    Hello.
    I have a logical fact table with four logical table sources. Three of the LTSs share the same dimensions, but the fourth LTS has one dimension fewer (called Dim_A). In the physical layer the dimension Dim_A is joined to the first three physical fact tables, but not to the fourth fact table (since it doesn't have that dimensionality). In the BMM layer the logical fact table is joined to the logical dimension Dim_A.
    When I run an analysis on this RPD the measures from the logical fact are aggregated correctly (a union of all four table sources) as long as I don't include Dim_A, but as soon as I include dimension Dim_A I get the error message:
    +State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 43119] Query Failed: [nQSError: 14052] Internal Error: Logical column Dim_A.Column_X has no physical sources that can be joined to the physical fact table source [Logical table sources (Priority=0): Fact_B.Fact_Y]. (HY000)+
    I would like a solution where the analysis returns correctly aggregated measures also for the LTS with the "missing" dimension, but with a dimension value NULL for this LTS. Or something like this.
    Is there a way to set this up in the RPD?
    Thanks,
    Henning Eriksen

    The SQL could look something like this.
    SELECT dim_a.col_1, SUM (fact_a.measure_1)
    FROM db.dim_a
    JOIN db.fact_a
    ON fact_a.col_2 = dim_a.col_2
    WHERE fact_a.date = '28-nov-2012'
    GROUP BY dim_a.col_1
    UNION ALL
    SELECT dim_a.col_1, SUM (fact_b.measure_1)
    FROM db.dim_a
    JOIN db.fact_b
    ON fact_b.col_2 = dim_a.col_2
    WHERE fact_b.date = '28-nov-2012'
    GROUP BY dim_a.col_1
    UNION ALL
    SELECT dim_a.col_1, SUM (fact_c.measure_1)
    FROM db.dim_a
    JOIN db.fact_c
    ON fact_c.col_2 = dim_a.col_2
    WHERE fact_c.date = '28-nov-2012'
    GROUP BY dim_a.col_1
    UNION ALL
    SELECT NULL, SUM (fact_d.measure_1)
    FROM db.fact_d
    WHERE fact_d.date = '28-nov-2012'
    I would appreciate if you could give me some hints for the RPD.
    Thanks,
    Henning

  • Webinar: How to implement secure scenarios with SAP NW PI 7.1

    SAP Intelligence Platform & NetWeaver RIG APJ Expert Call
    Dear valued SAP Experts,
    Next SAP Intelligence Platform & NetWeaver RIG Expert Call Session will take place on Tuesday, August 18.
    The SAP Intelligence Platform & NetWeaver RIG Expert Call Sessions are designed to support consultants, partners and customers during their implementation projects. The sessions cover all the different aspects of SAP NetWeaver and thus provide knowledge which is not available via standard training courses. The session duration is typically 60 minutes and includes questions and answers.
    Tuesday, August 18, 2009:
    How to implement secure scenarios with SAP NetWeaver Process Integration 7.1
    Time: 2.00 - 3.00 p.m. Singapore Time (UTC +8)
    This event will feature Makoto Sugishita with the SAP Intelligence Platform & NetWeaver Regional Implementation Group.
    Makoto provides the following abstract:
    In this session you will learn more about the core security concepts that are provided with the service-oriented architecture (SOA)
    management capabilities in SAP NetWeaver Process Integration (SAP NetWeaver PI). This session will cover main use cases and
    supported scenarios of secure SAP NetWeaver PI deployments. 
    SAP Connect Link: https://sap.emea.pgiconnect.com/I016095
    (no passcode needed)
    Dial in:
    For dial in details please register here http://www.surveymonkey.com/s.aspx?sm=EFeuZl9PxrwKOW5i5W556g_3d_3d
    Kind regards,
    Sarma Sishta
    SAP Intelligence Platform & NetWeaver RIG APJ

    hi,
    I'm making this a sticky thread till August 18 so it will have better visibility
    Regards,
    Michal Krawczyk
