Fact table with billions of records.

Hi. Is it advisable to build hierarchies around a fact table with 3 billion records (and still growing) in OBIEE? I know Oracle OLAP (the tool and the database) plus additional infrastructure would handle it, but I am trying to find a better way with the existing tools and hardware. Please let me know the possibilities using OBIEE. Yes, I am aware of aggregate tables, materialized views, etc.; they take time to implement, and I want to know which is most optimal.

"Is it advisable to build hierarchies around a 3 billion record fact table": is it a warehouse table, or an OLTP table being treated as a fact? If it is a warehouse table, then in BI we create hierarchies on dimension tables, not on the fact! Or, if you are using one table as both fact and dimension, that can work, but you have to map it as a dimension instead of a fact.
I don't think there is a limitation; it's all in how you model the data. I don't think you would create a report to show 1 or 2 billion records at once.
If you are still not convinced, check with Oracle by raising an SR.
If this helps, mark it.
Thanks
BTW: You haven't mentioned the time period covered by those 3 billion records! If the data accumulates over a period of time, BI can take care of it at the source using fragmentation.
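Since the original post already mentions aggregate tables and materialized views, here is a minimal sketch of the MV route; every table and column name below is hypothetical, and REFRESH FAST additionally assumes materialized view logs exist on the base tables:

    -- Hypothetical month-level aggregate over the detail fact, so that
    -- month-grain queries never scan the multi-billion-row table.
    CREATE MATERIALIZED VIEW sales_month_mv
      BUILD IMMEDIATE
      REFRESH FAST ON DEMAND   -- requires MV logs on sales_fact and date_dim
      ENABLE QUERY REWRITE
    AS
    SELECT d.month_key,
           f.product_key,
           SUM(f.sales_amt) AS sales_amt,
           COUNT(*)         AS row_cnt
    FROM   sales_fact f
    JOIN   date_dim   d ON d.date_key = f.date_key
    GROUP  BY d.month_key, f.product_key;

With ENABLE QUERY REWRITE, month-level requests against the detail fact can be redirected to the aggregate by the optimizer; in OBIEE the same table can instead be mapped as an additional logical table source.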

Similar Messages

  • Load fact table with null dimension keys

    Dear All,
    We have OWB 10g R2 and a ROLAP star schema. In our source system some rows don't have all attributes populated (null values), and these empty attributes are dimension (business) keys in the star schema. Is it possible to load the fact table with such rows (some dimension keys null) in the OWB mappings? We use the cube operator in mappings.
    Thanks And Regards
    Miran

    The dimension should have a row indicating UNKNOWN; give it a business key outside the normal range, e.g. -999999.
    In the mapping, the missing business keys can then be NVL'd to -999999.
    Cheers
    Si
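    A minimal sketch of that pattern, with made-up table and column names:

        -- Seed an UNKNOWN member in the dimension, outside the normal key range.
        INSERT INTO customer_dim (dimension_key, business_key, description)
        VALUES (-999999, -999999, 'UNKNOWN');

        -- In the fact load, default missing business keys before the key lookup,
        -- so every fact row resolves to a dimension member.
        SELECT NVL(s.customer_bk, -999999) AS customer_bk,
               s.sales_amt
        FROM   staging_sales s;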

  • Select count from large fact tables with bitmap indexes on them

    Hi..
    I have several large fact tables with bitmap indexes on them. When I do a select count(*) on one of these tables, I get a different result than when I do a select column_one, count(*) grouped by column_one and add up the groups. I don't have any null values in these columns. Is there a patch or a one-off that can rectify this?
    Thx

    You may have corruption in the index if the queries ...
    Select /*+ full(t) */ count(*) from my_table t
    ... and ...
    Select /*+ index_combine(t my_index) */ count(*) from my_table t;
    ... give different results.
    Look on My Oracle Support (MetaLink) for patches; in the meantime, drop and recreate the indexes, or mark them unusable and then rebuild them.
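    For the mark-unusable-and-rebuild route, a hedged sketch (the index name is assumed):

        -- Take the suspect bitmap index offline, then rebuild it from the table.
        ALTER INDEX my_index UNUSABLE;
        ALTER INDEX my_index REBUILD;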

  • Join fact table with higher dimension level

    How do I join fact tables with higher dimension levels in Discoverer?
    Fact with detail at level C, measure X.
    Dimension hierarchies:
    D -> C -> B -> A
    E -> C
    Level data:
    A B C
    1------1------1
    2------2------1
    3------2------1
    If I join fact X to the dimension at level C, I get X = 3*C, because Discoverer does sum(X) and three dimension rows map to the same C.
    Is there a way to get correct values for X without creating a dimension like
    D -> C
    E ->

    Another way of asking this is whether you can create a summary table in Discoverer at a higher level than a dimension's fundamental grain. In other words, the summary examples in the documentation all describe leaving out one or more of your dimensions: they are either left in or completely taken out. But some of the most effective summarization occurs when you summarize daily data to a monthly level. Assuming that I have a sales table (at a daily level, with a key value sales_date) and a table date_dim (primary key sales_date), I would like to create a summary sales_month_summary where the sales are grouped on month_year (which is a field in the date_dim table).
    How is this done? I suspect that we can't use the date_dim table with the summary (due to the problems noted by the poster above). Do we have to create another table, month_dim? Do we have to fold all of the desired date attributes (month, quarter, year) into the summary? Obviously we'd like to reuse all of the pertinent existing date items (quarter, month, year, etc.), not recreate them, which would result in essentially two sets of items in the EUL (one used for this month summary, and another used for the detail).
    I searched the forum - someone asked this same question back in 2000 - there was no answer provided.
    The only other thought I have is to "snowflake" the date_dim into two tables and two folders, one at a date level, another at the month level. Then the detail tables can connect to date_dim (which is linked to month_dim), while the summary data can connect directly to month_dim.
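    A minimal sketch of that snowflake idea, reusing the names assumed above:

        -- Month-level dimension split out of the date dimension.
        CREATE TABLE month_dim AS
        SELECT DISTINCT month_year, quarter, year
        FROM   date_dim;

        -- Month-level summary that joins to month_dim instead of date_dim,
        -- so its grain matches the dimension it connects to.
        CREATE TABLE sales_month_summary AS
        SELECT d.month_year,
               SUM(s.sales_amt) AS sales_amt
        FROM   sales s
        JOIN   date_dim d ON d.sales_date = s.sales_date
        GROUP  BY d.month_year;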

  • Joining two fact tables with different dimensions into single logical table

    Hi Gurus,
    I am trying to accomplish the following in Oracle Business Intelligence 11.1.1.3.0:
    F1 (D1, D2 and D3)
    F2 (D1 and D2 and D4)
    We want to build a report F1 F2 D1 D2 D3 D4 that has data for:
    F1 matching only D1-D2-D3
    and data for
    F2 matching only D1-D2-D4
    all in one row, so D3 and D4 are not common dimensions.
    I can only do:
    F3 (D1, D2)
    F4 (D1, D2 and D4)
    And report
    F3 F4 D1,D2,D4 (all that in one row, and only D4 is not a common dimension)
    Here is the very good example how to accomplish the scenario 1
    http://108obiee.blogspot.com/2009/08/joining-two-fact-tables-with-different.html
    But it looks like this does not work in 11.1.1.3.0.
    I get
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 14025] No fact table exists at the requested level of detail: [,,Clients,,Day,ROI,,,,EW_Names,,,,,,,,,,,,,,,,,]. (HY000)
    I am sure I set everything up correctly as advised in the blog, but it only works with one non-common dimension.
    Is it a bug in 11.1.1.3.0, or something else?
    Thanks,
    Kate

    Thanks for all your replies.
    Actually, I've tried the solutions you mentioned. Generally speaking, the result should be displayed. However, my scenario is a little bit tricky.
    Table Y's figures are not an aggregation of table X over the D dimension. Instead, table Y's figures include not only the D dimension total but also others (and by "others" I do not mean the A, B, C dimensions). For example, table Y stores figures for all food, while table X stores only figures for drink, and the D dimension covers only drink detail. In my scenario, the other foods' figures are not provided.
    So even if I set the D dimension to all/total for table X, table X's result is still not the same as table Y's.
    Indeed, table Y does not have a key column to join to the D dimension's key, so if I select the D dimension and table Y's measures at the same time in BI Answers, the result returns no data. Hence I can't compare table X's and table Y's results when the D dimension is selected.
    Is there any solution to this problem?

  • Logical fact table with fragmented data sources with different dimensions

    Hello.
    I have a logical fact table with four logical table sources. Three of the LTSs share the same dimensions, but the fourth LTS has one dimension less (called Dim_A). In the physical layer the dimension Dim_A is joined to the first three physical fact tables, but not to the fourth fact table (since it doesn't have that dimensionality). In the BMM layer the logical fact table is joined to the logical dimension Dim_A.
    When I run an analysis on this RPD, the measures from the logical fact are aggregated correctly (a union of all four table sources) as long as I don't include Dim_A, but as soon as I include dimension Dim_A I get the error message:
    +State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 43119] Query Failed: [nQSError: 14052] Internal Error: Logical column Dim_A.Column_X has no physical sources that can be joined to the physical fact table source [Logical table sources (Priority=0): Fact_B.Fact_Y]. (HY000)+
    I would like a solution where the analysis returns correctly aggregated measures for the LTS with the "missing" dimension as well, but with a NULL dimension value for that LTS, or something along those lines.
    Is there a way to set this up in the RPD?
    Thanks,
    Henning Eriksen

    The SQL could look something like this (note that each aggregated branch needs a GROUP BY):
    SELECT dim_a.col_1, SUM (fact_a.measure_1)
    FROM db.dim_a
    JOIN
    db.fact_a
    ON fact_a.col_2 = dim_a.col_2
    WHERE fact_a.date = '28-nov-2012'
    GROUP BY dim_a.col_1
    UNION ALL
    SELECT dim_a.col_1, SUM (fact_b.measure_1)
    FROM db.dim_a
    JOIN
    db.fact_b
    ON fact_b.col_2 = dim_a.col_2
    WHERE fact_b.date = '28-nov-2012'
    GROUP BY dim_a.col_1
    UNION ALL
    SELECT dim_a.col_1, SUM (fact_c.measure_1)
    FROM db.dim_a
    JOIN
    db.fact_c
    ON fact_c.col_2 = dim_a.col_2
    WHERE fact_c.date = '28-nov-2012'
    GROUP BY dim_a.col_1
    UNION ALL
    SELECT NULL, SUM (fact_d.measure_1)
    FROM    db.fact_d
    WHERE fact_d.date = '28-nov-2012'
    I would appreciate if you could give me some hints for the RPD.
    Thanks,
    Henning

  • Fact tables with common and uncommon dimensions

    Hi -
    I have 2 facts:
    F1 with Dimensions D1, D2, D3, D4
    F2 with dimensions D1, D2, D5, D6
    If I create a report with F1, F2, D1, D2 then 2 queries are created and the report is correct. Why am I not able to create a report with F1, F2, D1, D2, D3?
    Thanks !

    Hi Veeravalli -
    I found this blog helpful, as it details how to establish the relationship: http://108obiee.blogspot.ca/2009/08/joining-two-fact-tables-with-different.html
    I had a little difficulty understanding what you wanted to say or implement. After reading it 3-4 times, I figured out that you meant (1) what the blog says and (2) the possibility of combining the facts as different sources.
    As a newbie it was difficult to understand clearly.
    Thanks !

  • Fact table with different granularity

    I have a fact table whose measures have different granularity. For example, in a fact table named Project there are measure columns like project value and opportunity value (one opportunity has many projects). I have to sum project value and opportunity value by month; how do I sum the opportunity value so that each opportunity is counted only once (a distinct sum)? Any suggestions are welcome; thanks in advance.

    The purist answer is that you should NEVER have a fact table with multiple levels of granularity - it's asking for trouble. Better solution is to split off a new fact table at the proper level of granularity. Having said that, I've worked with BI Apps before so I understand the pain you're going through. In the long run though we found it better to add new fact tables and track the data at the proper levels. Less confusing (long run) than trying to kludge round data to fit in a square fact table.
    Thx,
    Scott
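    Until the model is split, one hedged SQL workaround for the distinct-opportunity sum (all names here are hypothetical) is to count each opportunity's value only once per group:

        -- Count opportunity_value once per (month, opportunity), then aggregate.
        SELECT month_key,
               SUM(project_value)     AS project_value,
               SUM(opportunity_value) AS opportunity_value
        FROM  (SELECT month_key,
                      project_value,
                      CASE WHEN ROW_NUMBER() OVER (PARTITION BY month_key, opportunity_id
                                                   ORDER BY project_id) = 1
                           THEN opportunity_value
                           ELSE 0
                      END AS opportunity_value
               FROM   project_fact)
        GROUP  BY month_key;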

  • How to get the table with no. of records after filter in webdynpro

    Dear Gurus,
    How to get the table with no. of records after filter in webdynpro?
    Thanks in advance.
    Sankar

    Hello Sankar,
    Please explain your requirement clearly so that we can help you easily.
    To get the table records from your context node, use method get_static_attributes_table():
    DATA lo_nd_mynode        TYPE REF TO if_wd_context_node.
    DATA lt_attributes_table TYPE wd_this->elements_mynode.
    DATA lv_count            TYPE i.
    lo_nd_mynode = wd_context->get_child_node( name = wd_this->wdctx_mynode ).
    lo_nd_mynode->get_static_attributes_table( IMPORTING table = lt_attributes_table ).
    " Added for the record count question: this reflects the filter only
    " if the filtering updates the context node.
    DESCRIBE TABLE lt_attributes_table LINES lv_count.
    Note: You should have already defined your context node with a Dictionary Structure.
    BR,
    RAM

  • BW 3.5 fact table with data but no corresponding records in the P table

    Hi Gurus,
    I have a weird situation: there is a BW 3.5 SP 16 (I know it's old...) InfoCube that receives 400,000 records every single day.
    It was never compressed (at least there is no data in the E table); instead, a program calling FM 'RSDRD_SEL_DELETION' ran every day, taking about 5 hours (when it ran without errors).
    So I checked the fact table and found 337,564,988 records, while InfoCube Admin shows 14,018,370 records. I found that the "missing" records are not in the P table (there is data in the fact table pointing to missing packages).
    I think the selective deletion messed up the InfoCube. The records that are missing from the P table have to be deleted. I ran an RSRV check on this InfoCube in the background and it was cancelled...
    So I want to delete this data from the fact table. My question is: how can I do that with minimum effort, without invalidating the information?
    Of course I will back up the data to a temp cube...
    Well, any idea is welcome.
    Regards,
    Alex

    Hi Navesh,
    First of all, thanks for your quick answer.
    Well, the FM arguments are set to delete any 0SEM_CRDATE older than 30 days. This characteristic receives SY-DATUM in the update rule (transformation...), so every package older than 30 days should be deleted. The weird thing is that it is deleting the P table records but keeping the data in the F table.
    So you think it would be worth trying a selective deletion with the same criteria manually?
    This is the code, can you (or someone) take a look?
    START-OF-SELECTION.
    * Build the range for the period to be deleted
      PERFORM f_range.
      PERFORM f_elima_cubo.
    END-OF-SELECTION.
    *&      Form  F_RANGE
    FORM f_range.
    * Select everything older than 31 days
      v_data_inic = sy-datum - 31.
      v_range-sign   = 'I'.
      v_range-option = 'LT'.
      v_range-low    = v_data_inic.
      v_range-keyfl  = 'X'.
      APPEND v_range  TO  t_tab_main-t_range.
      t_tab_main-iobjnm        = '0SEM_CRDATE'.
      INSERT t_tab_main INTO TABLE t_thx_sel.
    ENDFORM.                    " F_RANGE
    *&      Form  F_ELIMA_CUBO
    FORM f_elima_cubo.
      v_parallel               = '01'.
    * Selective deletion on the InfoCube for the selection built above
      CALL FUNCTION 'RSDRD_SEL_DELETION'
        EXPORTING
          i_datatarget      = 'IC_LP_B01'
          i_thx_sel         = t_thx_sel
          i_authority_check = c_flag
          i_no_logging      = v_nl
          i_parallel_degree = v_parallel
          i_show_report     = v_sr
        CHANGING
          c_t_msg           = t_msg.
    ENDFORM.
    Regards,
    Alex
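    If it helps to quantify the orphans first, here is a hedged check against the generated BW tables; it assumes the usual /BIC/F<cube> fact table and /BIC/D<cube>P package-dimension naming for cube IC_LP_B01, which should be verified in SE11 before running:

        -- Fact rows whose package dimension entry no longer exists.
        SELECT COUNT(*)
        FROM   "/BIC/FIC_LP_B01" f
        WHERE  NOT EXISTS (SELECT 1
                           FROM   "/BIC/DIC_LP_B01P" p
                           WHERE  p.dimid = f.key_ic_lp_b01p);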

  • 2 fact tables with the same content level

    Hi,
    I have 2 fact tables, F1 and F2, joined to the same 4 dimensions D1, D2, D3, D4. The number of elements at the detail level, multiplied across dimensions D1-D4, is the same for both facts. I have also set the content level for the 2 fact tables, and the content filter is equivalent in both: F1.Column1 = 0 in fact table F1 and F2.Column1 = 0 in fact table F2. So how does OBIEE decide which fact table to select in this scenario?
    I went through the below blog but could not get much information
    http://obiee-blog.info/administration-tool/what-rule-is-followed-when-several-fact-are-at-the-same-content-level/
    I appreciate if anyone can guide me with the right answers.

    How many records are retrieved when F1 and F2 are joined to the dimension? If F1 joined to D1 returns 200 records and F2 joined to D1 returns 100 records, OBIEE chooses the shortest path (the smaller source).
    thanks,
    Saichand

  • Fact tables with different granularity

    We currently have 3 dimensions (Site, Well, Date) and 2 fact tables (GasEngine, GasField), both having granularity of a day.
    GasEngine is linked to Site and Date
    GasField is linked to Site, Well and Date
    We now have a requirement to make the GasEngine fact table have granularity of an hour but keep
    GasField at a day.
    We therefore must include a new Time dimension, which would only be linked to GasEngine.
    Is it ok to have a DW with these two fact tables having different granularity? 
    And would we therefore require two separate cubes for querying this data?

    Hi Rajendra and Visakh16,
    Based on your input to this thread, I would like to ask a question just to fine-tune my knowledge of data modelling. In Darren's case, I guess his date dimension only stores records down to day-level granularity, and the requirement is now to make the "GasEngine" fact table hold data at an hour granularity.
    Rajendra's input was:
    "Yes, you can have. but why you need new time dimension, I recommend, make GasEngine fact to hour granularity."
    How could Darren display data for each hour without having a time dimension attached to the GasEngine fact table? With the existing date dimension he can ONLY display aggregated data with a minimum granularity of day level.
    One could modify the date dimension to hold time records, but that would complicate the date dimension considerably. Instead, why can't Darren have a separate time dimension that holds ONLY time-related data, add a time key to the GasEngine fact table, and relate the tables using that key? Doesn't Darren's data model become more readable and simplified this way? Since a time dimension only provides another way of slicing and dicing the data, I do not think it turns Darren's cube into a complex star schema.
    I could be totally wrong, so for the sake of knowledge, for Darren and me, I am asking the question of both of you.
    Best regards,
    Chandima Lakmal Fonseka
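    A minimal sketch of that separate time dimension, with hypothetical names:

        -- Hour-of-day dimension; GasEngine then carries date_key + time_key,
        -- while GasField stays at day grain with date_key only.
        CREATE TABLE time_dim (
          time_key   INT PRIMARY KEY,  -- 0 to 23
          hour_label VARCHAR(8)        -- e.g. '09:00'
        );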

  • Fact table with datetime measure showing #value error while browsing the cube

    Hi All,
    I have a cube with a fact table that has a datetime measure.
    When I browse the cube, I can see data for all measures except the one whose datatype is datetime.
    Thanks in advance.

    Hi jarugulalaks,
    Actually this forum is to discuss:
    the Visual Studio WPF/SL Designer, Visual Studio Guidance Automation Toolkit, Developer Documentation and Help System, and Visual Studio Editor.
    To understand this issue clearly, would you mind letting us know more about it? Is it a VS IDE issue? Which language are you using? What kind of app are you developing? Perhaps you could share a screenshot.
    But as with this case you posted here:
    https://social.msdn.microsoft.com/Forums/vstudio/en-US/bc2d30b8-a60d-4f0f-a273-b7cf0f5aaed5/value-error-for-datetime-measure-in-ssas?forum=visualstudiogeneral#bc2d30b8-a60d-4f0f-a273-b7cf0f5aaed5
    if it is an SSAS issue, please post it to the SSAS forum for dedicated support.
    Best Regards,
    Jack

  • Creating an Excel file from a table with 40,000 records

    I need help creating an Excel file for a client that has a table of 40,000 records. I have code that generates the Excel file and has worked well in the past, but with this much data it is timing out.
    I've already informed the client that Excel has a limit of about 66,000 rows, so it might be better to export to CSV, as the data in this table is going to keep growing.
    If you have time to work with me on this, contact me at
    alexagui [at] gmail [dot] com
    and I can send you more details so you can put together an accurate quote.
    Thanks,
    Alex

    asdren_one wrote:
    > I need help creating an Excel file for a client that has a table of 40,000 records. [...]
    With so many records, my guess is that string concatenation takes most of the time, so you'll need to use a StringBuffer to build the string. Google for coldfusion and stringBuffer:
    http://www.stillnetstudios.com/2007/03/07/java-strings-in-coldfusion/
    Mack

  • Issue using 2 fact tables with one dimension table

    Hi,
    I have 1 dimension table, X, and 2 fact tables, A and B.
    X is joined to both A and B: to A for the loan amount and to B for the collateral amount. When I select X.Product_Name, A.Loan_Amt and B.Collateral_Amount, I get this error message:
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 15018] Incorrectly defined logical table source (for fact table EIP Collateral FACT) does not contain mapping for [EIP Reporting FACT.PD ID]. (HY000)
    Any clues???
    Is there an inner or outer join which needs to be created or set in the RPD to get the desired results?

    Ok..
    I have one table, Portfolio Details, which has Portfolio Name, Product Category, Product Name, Product ID and Product Source Code. This is my dimension table.
    I have another 2 fact tables: EIP Reporting FACT and EIP Collateral FACT.
    These two tables are joined to the Portfolio Details table.
    EIP Reporting FACT gives the portfolio-wise loan amount,
    and EIP Collateral FACT gives the portfolio-wise collateral amount for the same set of customers.
    Now, I am selecting Portfolio Name, Product Category, Product Name, SUM(EIP Reporting FACT.LOAN_AMOUNT), SUM(EIP Collateral FACT.Collateral_Amt) in a report.
    On selecting these columns I get that mapping-related error message.
    If I take any column from the Portfolio Details table and any column from EIP Reporting FACT, it works.
    If I take any column from the Portfolio Details table and any column from EIP Collateral FACT, it works.
    But if I take any column from the portfolio table and columns from both FACT tables, it gives the mapping error...
    Hope I am able to explain the issue in a better way now.
