Is a table necessary for a dimension?

Hi everyone,
Q1: I am a student and I am having a problem with dimension creation. My question: is it necessary to create a table before creating a dimension?
Q2: If I do not create dimensions and I want to build a cube from these tables (office, accounts, employee, pay, etc.), I have to reference the foreign keys of these tables in the fact table. When I go to the OWB console and create a cube, it does not show my created tables; it only asks for dimensions, but I do not have any dimensions, so what should I do?
Q3: What is dimensional modeling?
Please guide me; I will be thankful for your help.

Q3: Please refer to:
www.rkimball.com (good one)
http://www.amazon.com/gp/product/0471200247/sr=8-1/qid=1152725383/ref=pd_bbs_1/102-7756454-8024905?ie=UTF8
http://www.amazon.com/gp/product/0471255475/sr=8-4/qid=1152725383/ref=pd_bbs_4/102-7756454-8024905?ie=UTF8
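For Q1 and Q2: in OWB a dimension is bound to an implementation table (either one you create yourself or one OWB generates when the dimension is deployed), and the cube wizard then expects dimensions rather than bare tables, which is why the tables alone do not show up. As a rough illustration of the underlying star-schema idea, here is a minimal sketch in plain SQL; the table and column names are hypothetical and this is not OWB-generated DDL:

    -- Hypothetical dimension tables with surrogate keys.
    CREATE TABLE employee_dim (
      employee_key   NUMBER PRIMARY KEY,
      employee_name  VARCHAR2(100),
      department     VARCHAR2(50)
    );

    CREATE TABLE office_dim (
      office_key     NUMBER PRIMARY KEY,
      office_name    VARCHAR2(100),
      city           VARCHAR2(50)
    );

    -- The fact table references each dimension through a foreign key.
    CREATE TABLE pay_fact (
      employee_key   NUMBER REFERENCES employee_dim (employee_key),
      office_key     NUMBER REFERENCES office_dim (office_key),
      pay_date       DATE,
      pay_amount     NUMBER(12,2)
    );

In OWB you would then define dimensions over employee_dim and office_dim (or let OWB create and bind those tables for you) and define the cube over pay_fact referencing those dimensions.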

Similar Messages

  • DML Error Logging for underlying tables autocreated for dimensions

    Hi,
    I have problems with logging errors during mapping for underlying tables that are automatically generated for creating dimensions. I know that DML Error Logging is supposed to work only on tables, views and materialized views, and I have tried it out using a pure table in a mapping without any reference to a dimension and specifying the error table name for it. This works perfectly. The error rows were captured correctly in the error table and the correct rows were also loaded successfully to the target table.
    However, when it comes to mapping to a dimension, I have some issues. I specified the shadow table name for the underlying table (right-clicking the table in Design Center and then choosing Configure), and after deploying I checked that the error/shadow table and the target table were indeed created in the database.
    The problem now lies in the mapping, because I am actually mapping to a dimension and not to the underlying table that the dimension references. It seems that the mapping did not pick up the error table details for the underlying table referenced by the dimension, and thus the errors were not captured. Only the correct rows get loaded, and there were no error messages during the load to suggest that the incorrect rows were detected.
    Would appreciate some assistance here.
    Thanks!
    WY

    Hi
    The DML error logging feature in 10gR2 and 11gR1 was restricted purely to tables, so the dimension operator did not support it. This is now supported in OWB 11gR2 (plus there is the orphan management functionality).
    Cheers
    David
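    For reference, the table-level DML error logging pattern that WY describes as working (outside the dimension operator) is roughly the standard Oracle idiom below; the table names here are hypothetical:

    -- Create the error (shadow) table; by default it is named ERR$_<table>.
    BEGIN
      DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'CUSTOMER_TGT');
    END;
    /
    -- Load the target, diverting bad rows to the error table instead of failing.
    INSERT INTO customer_tgt (customer_id, customer_name)
    SELECT customer_id, customer_name
    FROM   customer_src
    LOG ERRORS INTO err$_customer_tgt ('nightly_load')
    REJECT LIMIT UNLIMITED;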

  • Missing most detailed table for dimension tables

    Hi,
    I am getting the following error:
    Business Model Core:
    [nQSError: 15003] Missing most detailed table for dimension tables: [Dim - Customer,Dim - Account Hierarchy,Dim - Account Region Hierarchy,Fact - Fins - Period Days Count].
    [nQSError: 15001] Could not load navigation space for subject area Core.
    I got this error when I tried to configure # of Elapsed Days and # of Cumulative Elapsed Days in the following way:
    1. Using the Administration Tool, open OracleBIAnalyticsApps.rpd.
    The OracleBIAnalyticsApps.rpd file is located at: ORACLE_INSTANCE\bifoundation\OracleBIServerComponent\coreapplication_obisn\repository
    2. In the Business Model and Mapping layer, go to the logical table Fact - Fins - Period Days Count.
    3. Under Sources, select the Fact_W_DAY_D_PSFT logical table source.
    4. Clear the Disabled option in the General tab and click OK.
    5. Open the other two logical table sources, Fact_W_DAY_D_ORA and Fact_W_DAY_D_PSFT, and select the Disabled option.
    6. Add the "Fact - Fins - Period Days Count" and "Dim - Company" logical tables to the Business Model Diagram. To do so, right-click the objects and select Business Model Diagram, Selected Tables Only.
    7. In the Business Model Diagram, create a new logical join from "Dim - Company" to "Fact - Fins - Period Days Count." The direction of the foreign key should be from the "Dim - Company" logical table to the "Fact - Fins - Period Days Count" table. For example, on a (0,1):N cardinality join, "Dim - Company" will be on the (0/1) side and "Fact - Fins - Period Days Count" will be on the N side.
    8. Under the Fact - Fins - Period Days Count logical table, open the "# of Elapsed Days" and "# of Cumulative Elapsed Days" metrics, one at a time.
    9. Go to the Levels tab. For the Company dimension, the Logical Level is set to All. Click the X button to remove it. Repeat until the Company dimension does not have a Logical Level setting.
    10. Make sure to check Global Consistency to ensure there are no errors, and then save the RPD file.
    Please help me to resolve.
    Thanks,
    Soumitro

    Could you let me know how you resolved this? I am facing the same issue.

  • "Missing most detailed table for dimension tables" eror when I run the Global Consistency check

    ERRORS:
    Business Model DAC Measures:
    [nQSError: 15003] Missing most detailed table for dimension tables: [D_DETAILS,D_EXECUTION_PLAN,D_TASK].
    [nQSError: 15001] Could not load navigation space for subject area DAC Measures.
    I am also attaching my Business Model layer for easy understanding. I have a fact table and several dimension tables. I got this error only after creating the following hierarchies:
    Execution Plan -> Tasks -> Details
    Start Date Time Hierarchy
    End Date Time Hierarchy
    Is there a solution for this problem? Thanks in advance!

    Yes! My Task hierarchy has 3 dimension tables that form a hierarchy: Execution Plan -> Tasks -> Detail.
    All 3 levels in the hierarchy are 3 different dimension tables.

  • [nQSError: 15003] Missing most detailed table for dimension tables:

    Hi,
    I am new to BI. I created a fresh repository from scratch.
    When I run the consistency check I get the following messages:
    [nQSError: 15001] Could not load navigation space for subject area SALES.
    [nQSError: 15003] Missing most detailed table for dimension tables: [Products,Customers].
    Can anyone help me on this.
    Thank you
    subra.

    Got rid of the problem by deleting and recreating the dimension tables.
    Subra.

  • Number range buffering for dimension tables (DIM IDs) and InfoObjects (SIDs)

    Hi BW Experts,
    How can I check whether number range buffering for dimension tables (DIM IDs) and InfoObjects (SIDs) is active or not?
    Can you please tell me where to go in the system to check this?
    Thanks in Advance.

    The EarlyWatch report lists these because of the large number of rows in the MD/dimension tables. Keep in mind, though, that number range buffering will really only help if you continue to generate a large volume of new SIDs or DIM IDs with ongoing loads.
    If the transactions being loaded to a cube result in several thousand new rows being added to a dimension table, then it makes sense to turn on number range buffering for that DIM ID. But if the transaction volume only causes a few hundred DIM IDs to be added, buffering will not really get you anything.

  • Help on setting logical levels in fact tables and on dimension tables

    Hi all
    Can anybody provide any blogs or any kind of material on what exactly levelling is?
    After creating the dimensional hierarchies we need to set the logical levels for the LTS of the fact tables, right? So what is the difference between setting logical levels on fact tables and setting levelling on dimension tables?
    Any kind of help is appreciated
    Thanks
    Xavier.

    I have read these blogs, but my question is this:
    I understood setting the logical levels in the LTS of fact tables.
    But we can also set logical levels for dimensions, right? I didn't understand why we set logical levels for dimensions. Is there any reason why we go with levelling at the dimensions?
    Thanks
    Xavier

  • Follow-up on "Fact Table vs. Dimension relative size"

    You don't have to read the background info below; it is only for reference. I have already gone through several links, so your take in your own words will be much appreciated.
    Can you help me understand these points, which came from the many links the experts referred me to?
    1. For better performance: "Use line item dimension where applicable, but use the 'high cardinality' option with extreme care." Any clarification on "high cardinality"? And why with "extreme care"?
    2. For better performance: "Make the Index perfect and create secondary index based on requirement." I don't seem to follow the index issue being raised here.
    Thanks.
    --Background: References from these past postings ---
    There is much to be said about performance in general, but I will try to answer your specific question regarding fact and dimension tables.
    First of all, try to compress as many requests as possible in the fact table, and do that regularly.
    Partition your compressed fact table physically based on, for example, 0CALMONTH. In the InfoCube maintenance, in the Extras menu, choose Partitioning.
    Partition your cube logically into several smaller cubes based on for example 0CALYEAR. Combine the cubes with a multiprovider.
    Use constants on infocube level (Extras->Structure Specific Infoobject properties) and/or restrictions on specific cubes in your multiprovider queries if needed.
    Create aggregates of subsets of your characteristics based on your query design. Use the debug option in RSRT to investigate which objects you need to include.
    To investigate the size of the dimension tables, first use the test in transaction RSRV (Database Information about InfoProvider Tables). It will tell you the relative sizes of your dimensions in comparison to your fact table. Then go to transaction DB02 and conduct a detailed analysis on the large dimension tables. You can choose "table columns" in the detailed analysis screen to see the number of distinct values in each column (characteristic). You also need to understand the "business logic" behind these objects. The ones that have low cardinality, that is, those that relate to each other, should be located together. With this information at hand you can understand which objects contribute the most to the size of the dimension and separate the dimension.
    Use line item dimension where applicable, but use the "high cardinality" option with extreme care.
    Generate database statistics regularly using process chains or (if you use Oracle) schedule BRCONNECT runs using transaction DB13.
    Good luck!
    Kind Regards
    Andreas
    There are some simple things by which we can maintain performance:
    1> Make the indexes perfect and create secondary indexes based on requirements.
    2> Make the statistics perfect.
    3> Based on the data size in the dimension table, use a line item dimension.
    Hope this will help you.
    Suneel

    Hi Caud..
    1] High Cardinality..
    High Cardinality means that this dimension contains a large number of characteristic values. This information is used in accordance with the individual database platform in order to optimize performance. For example, an index type other than the standard may be used. Generally a dimension is perceived to have high cardinality when the dimension is at least 20% the size of the fact tables in terms of the number of entries each contain. Avoid marking a dimension as having high cardinality if you are in any doubt.
    Cardinality creates indexes on the dimension table entries, and thereby you would see an improvement in performance.
    With an Oracle DB, setting the cardinality option causes a B-tree index to be built instead of a bitmap index even if it is not a line item dimension.
    Setting it as a line item dimension also causes a B-tree index to be built instead of a bitmap index, but it also embeds the SID directly in the fact table, eliminating the dimension table.
    As it changes from bitmap to B-tree, you have to be careful, as SAP recommends the bitmap.
    2] As indexes increase performance tremendously, you have to consider them whenever performance is the main factor. It is very easy for a query to access the data in the data target if it is already indexed.
    Refer this link..
    http://help.sap.com/saphelp_nw04/helpdata/en/a7/d50f395fc8cb7fe10000000a11402f/frameset.htm
    Hope it helps-
    Regards-
    MM
    Assign points if it is useful, it is right way to say thanks.
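    To make the bitmap vs. B-tree point above concrete, this is what the two index types look like in plain Oracle SQL (table and column names are hypothetical; in BW these indexes are generated by the system rather than created by hand):

    -- Default on a fact table's dimension key column: a bitmap index (suits low cardinality).
    CREATE BITMAP INDEX sales_fact_cust_bx ON sales_fact (customer_dim_id);
    -- With the high-cardinality / line item setting, a B-tree index is built instead.
    CREATE INDEX sales_fact_cust_ix ON sales_fact (customer_dim_id);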

  • No UDA Found for Dimension XYZ

    We are loading metadata into Planning using the Outline Load utility. The command we are using is as follows:
    C:\Hyperion\products\Planning\bin\OutlineLoad -f:%scripthome%\EncPassword.txt /A:NFR /U:%Essid% /M /C /-F /I:C:\Data\NFR_Customer_Dim.csv /D:Entity /L:%scripthome%\Logs\Cust_Dim_OTL_Load.Log /X:c:/outlineLoad.exc
    The outline load log throws a "No UDA defined for dimension xyz" error, but it seems to load the records fine, without rejecting any. Any idea why this is?
    The first part of the log is:
    Successfully logged into "NFR" application, Release 11.113, Adapter Interface Version 5, Workforce supported and not enabled, CapEx not supported and not enabled, CSS Version 3
    +"Account" dimension properties and information:+
    Account, Parent, Alias: Default, Valid For Consolidations, Data Storage, Two Pass Calculation, Description, Formula, UDA, Smart List, Data Type, Operation, Account Type, Time Balance, Skip Value, Exchange Rate Type, Variance Reporting, Source Plan Type, Plan Type (Sales), Aggregation (Sales), Plan Type (Plan2), Aggregation (Plan2), Plan Type (Plan3), Aggregation (Plan3)
    No UDA's defined on "Account"+
    +"Periods" dimension properties and information:+
    Periods, Parent, Alias: Default, Data Storage, Two Pass Calculation, Description, Formula, UDA, Smart List, Data Type, Operation, Type, Start Period, End Period, Aggregation (Sales), Aggregation (Plan2), Aggregation (Plan3)
    No UDA's defined on "Periods"+
    ......... So on and so forth for EACH dimension and then ....
    +"Product" dimension properties and information:+
    Product, Parent, Alias: Default, Valid For Consolidations, Data Storage, Two Pass Calculation, Description, Formula, UDA, Smart List, Data Type, Operation, Aggregation (Sales), Aggregation (Plan2), Aggregation (Plan3), InputType
    UDA's bound to "Product" dimension: Product_Type
    +"InputType" attribute dimension (on base dimension "Product"). Attributes defined on the "InputType" dimension: ProductLine; ProductNumber;+
    InputType, Parent, Alias: Default, Operation
    Exchange Rates properties:
    Table, Description, To Currency, From Currency, Operation, Method, Historical, Beg Balance, Year, Period, Average, Ending
    UDA properties:
    Dimension, UDA, Operation
    Translation input file fields:
    Value, Driver Member, Point-of-View, Data Load Cube Name
    [Tue Feb 01 01:30:12 EST 2011]Successfully located and opened input file "C:\Data\NFR_Customer_Dim.csv".
    [Tue Feb 01 01:30:12 EST 2011]Header record fields: Entity, Parent, Alias: Default, Data Storage
    [Tue Feb 01 01:30:12 EST 2011]Located and using "Entity" dimension for loading data in "NFR" application.
    [Tue Feb 01 01:30:14 EST 2011]Load dimension "Entity" has been unlocked successfully.
    [Tue Feb 01 01:30:14 EST 2011]Performing cube refresh[Tue Feb 01 01:30:36 EST 2011]Cube refresh operation has completed. Please check the Essbase log for status.
    [Tue Feb 01 01:30:36 EST 2011]Create security filters operation will not be performed.
    [Tue Feb 01 01:30:36 EST 2011]Examine the Essbase log files for status if Essbase data was loaded.
    [Tue Feb 01 01:30:36 EST 2011]Planning Outline data store load process finished. 1599 data records were read, 1599 data records were processed, 1599 were successfully loaded, 0 were rejected.

    Do you get the same error if you load just the first record from the file? If you don't, keep increasing the number of records until you find the record that is causing the problem.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Issue using 2 fact tables with one dimension table.

    Hi,
    I have 1 dimension table, X, and 2 fact tables, A and B.
    X is joined to both A and B: for Loan Amount (with A) and for Collateral Amount (with B). When I select X.Product_Name, A.Loan_Amt, and B.Collateral_Amount, it gives an error message:
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 15018] Incorrectly defined logical table source (for fact table EIP Collateral FACT) does not contain mapping for [EIP Reporting FACT.PD ID]. (HY000)
    Any clues?
    Is there an inner or outer join which needs to be created or set in the RPD to get the desired results?

    Ok..
    I have one table, Portfolio Details, which has Portfolio Name, Product Category, Product Name, Product ID, and Product Source Code. This is my dimension table.
    I have another 2 fact tables: EIP Reporting FACT and EIP Collateral FACT.
    These two tables are joined to the Portfolio Details table.
    EIP Reporting FACT gives portfolio-wise Loan Amount,
    and EIP Collateral FACT gives portfolio-wise Collateral Amount details for the same set of customers.
    Now, I am selecting Portfolio Name, Product Category, Product Name, SUM(EIP Reporting FACT.LOAN_AMOUNT), SUM(EIP Collateral FACT.Collateral_Amt) in a report.
    On selecting these columns I get that error message, which is related to mapping.
    If I take any column from the Portfolio Details table and any column from EIP Reporting FACT, it works.
    If I take any column from the Portfolio Details table and any column from EIP Collateral FACT, it works.
    But if I take any column from the Portfolio Details table and columns from both FACT tables, it gives the mapping error.
    Hope I am able to explain the issue in a better way now.
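    For what it is worth, when each fact's logical table source has its content (logical) levels set so it can be queried independently, the BI Server conceptually runs one aggregate query per fact and stitches the results on the shared dimension, roughly like the simplified sketch below (the join columns are assumptions and the SQL OBIEE actually generates will differ):

    SELECT COALESCE(r.product_name, c.product_name) AS product_name,
           r.loan_amount,
           c.collateral_amount
    FROM  (SELECT d.product_name, SUM(f.loan_amount) AS loan_amount
           FROM   portfolio_details d
           JOIN   eip_reporting_fact f ON f.product_id = d.product_id
           GROUP BY d.product_name) r
    FULL OUTER JOIN
          (SELECT d.product_name, SUM(f.collateral_amt) AS collateral_amount
           FROM   portfolio_details d
           JOIN   eip_collateral_fact f ON f.product_id = d.product_id
           GROUP BY d.product_name) c
    ON     r.product_name = c.product_name;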

  • Resolving loops in a star schema with 5 fact tables and 6 dimension tables

    Hello,
    I have a star schema, i.e. 5 fact tables and 7 dimension tables. All fact tables share the same dimension tables; some fact tables share 3 dimensions, while others share 5 dimensions.
    I did adopt the best practices and, as recommended in the book, I tried to resolve the loops using contexts, as that is the recommended option over aliases in a star schema setting. The contexts are resolved, but I still have loops. I also cleared the Multiple SQL Statement for each context option, but no luck. I need to get this resolved ASAP.

    Hi Patil,
    It is not clear what exactly the problem is. As a starting point you could set each context up so that it only covers the joins from fact to dimension:
    Fact A, joins Dim 1, Dim 2, Dim 3, and Dim 4
    Fact B, joins Dim 1, Dim 2, Dim 3, Dim 4 and Dim 5
    Fact C, joins Dim 1, Dim 2, Dim 3, Dim 4 and Dim 6
    Fact D, joins Dim 1, Dim 2, Dim 3, Dim 4 and Dim 7
    Fact E, joins Dim 1, Dim 2, Dim 4 and Dim 6
    If each of these contexts is set up to cover just the joins from fact to dim, then you should not get loops.
    If you could lay out your joins like above then it may be possible to specify the contexts/aliases that should work.
    Regards
    Alan

  • Join fact table with higher dimension level

    How do I join fact tables with higher dimension levels in Discoverer?
    Fact with detail at level C, measure X.
    Dimension hierarchies:
    D -> C -> B -> A
    E -> C
    Dimension rows at the levels:
    A  B  C
    1  1  1
    2  2  1
    3  2  1
    Join between fact X and the dimension at level C:
    X comes out three times too large, because of sum(X) in Discoverer and 3 rows per C value in the dimension.
    Is there a way to get correct values for X without creating a dimension like
    D -> C
    E ->

    another way of asking this is whether you can create a summary table in Discoverer at a higher level than a dimension's fundamental grain. In other words - the summary examples in the documentation all describe leaving out one or more of your dimensions... they are either left in or completely taken out. But, some of the most effective summarization occurs when you summarize daily data to a monthly level. Assuming that I have a sales table (at a daily level, and a key value sales_date), and a table date_dim (primary key sales_date), I would like to create a summary sales_month_summary where the sales are grouped on month_year (which is a field in the sales_date table).
    How is this done? I suspect that we can't use the date_dim table with the summary (due to the problems noted by the poster above). Do we have to create another table "month_dim"? Do we have to fold all of the desired date attributes (month, quarter, year) into the summary? Obviously we'd like to re-use all of the pertinent already existing date items (quarter, month, year, etc.), not recreate them over again, which would result in essentially two sets of items in the EUL. [One used for this month summary, and another used for the detail.]
    I searched the forum - someone asked this same question back in 2000 - there was no answer provided.
    The only other thought I have is to "snowflake" the date_dim into two tables and two folders, one at a date level, another at the month level. Then the detail tables can connect to date_dim (which is linked to month_dim), while the summary data can connect directly to month_dim.
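    One common way to get the month-level summary described above is a materialized view (or plain summary table) grouped on the month attribute, which can then be registered in Discoverer as a summary folder; a minimal sketch, assuming a sales_amount measure in the sales table and a month_year column in date_dim:

    CREATE MATERIALIZED VIEW sales_month_summary
    ENABLE QUERY REWRITE
    AS
    SELECT d.month_year,
           SUM(s.sales_amount) AS sales_amount   -- sales_amount is an assumed measure column
    FROM   sales s
    JOIN   date_dim d ON d.sales_date = s.sales_date
    GROUP BY d.month_year;

    A month-level folder (the "month_dim" idea mentioned above) would then join to this summary at its own grain, keeping the daily detail folders untouched.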

  • Joining two fact tables with different dimensions into single logical table

    Hi Gurus,
    I am trying to accomplish the following in Oracle Business Intelligence 11.1.1.3.0:
    F1 (D1, D2 and D3)
    F2 (D1 and D2 and D4)
    And we want to build a report F1 F2 D1 D2 D3 D4 to have data for:
    F1 that match only for D1-D2-D3
    and data for
    F2 that match only D1-D2-D4
    all that in one row, so D3 and D4 are not common dimensions.
    I can only do:
    F3 (D1, D2)
    F4 (D1, D2 and D4)
    And report
    F3 F4 D1,D2,D4 (all that in one row, and only D4 is not a common dimension)
    Here is a very good example of how to accomplish scenario 1:
    http://108obiee.blogspot.com/2009/08/joining-two-fact-tables-with-different.html
    But it looks like it does not work in 11.1.1.3.0.
    I get:
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 14025] No fact table exists at the requested level of detail: [,,Clients,,Day,ROI,,,,EW_Names,,,,,,,,,,,,,,,,,]. (HY000)
    I am sure I set everything up correctly as advised in the blog, but it only works with one non-common dimension.
    Is it a bug in 11.1.1.3.0 or something?
    Thanks,
    Kate

    Thanks for all your replies.
    Actually, I've tried the solutions you guys mentioned. Generally speaking, the result should be displayed. However, my scenario is a little bit tricky.
    Table Y's figures are not the aggregation of table X over the D dimension. Instead, table Y's figures include not only the D dimension total but also others (others here do not mean the A, B, C dimensions). For example, table Y stores all foods' figures, while table X stores only drinks' figures. The D dimension is only about drink details. In my scenario, the other foods' figures are not provided.
    So, even if I set the D dimension to all/total for table X, table X's result is still not the same as table Y's.
    Indeed, table Y does not have a key column to join to the D dimension's key. So, if I select the D dimension and table Y's measures at the same time in BI Answers, the result returns no data. Hence, I can't compare table X's and table Y's results with the D dimension selected.
    Is there any solution to solve this problem?

  • Multiple 'logical joins' between a fact table and one dimension table

    It appears that one cannot create multiple "logical joins" between a fact table and one dimension table in OBIEE using the Oracle BI Administration Tool. For example, consider a Business Model with a dimension table TIMES and a fact table FACT containing START_TIME and END_TIME; we would like to create separate logical joins from FACT to TIMES for the START_TIMEs and the END_TIMEs. Obviously, the underlying foreign keys can be created, but as far as I can tell the Oracle BI Administration Tool does not support this. The workaround would be to replicate the TIMES table, but that is ugly.
    I seek an alternative approach.

    Try this: create two aliases for the TIMES dimension (Start and End) in the Physical Layer and then remove the foreign key to the "parent" TIMES dimension. Create the foreign keys in the Physical Layer to the new aliases, and then create the complex joins in the BMM Layer to the new aliases as well. This will allow you to present both dates within the same table in the Presentation Layer. Not the most elegant solution, but it works.
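    At the database level, the two aliases simply let the same TIMES table be joined twice under different roles; the query they enable is roughly the following (column names are assumptions):

    SELECT f.fact_id,
           t_start.calendar_date AS start_date,
           t_end.calendar_date   AS end_date
    FROM   fact f
    JOIN   times t_start ON t_start.time_key = f.start_time
    JOIN   times t_end   ON t_end.time_key   = f.end_time;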

  • How to update a fact table when a dimension table is reloaded

    We have implemented BI Apps 796. Insertion into the W_EMPLOYEE_D table, which stores all the employee information, had stopped one year back because a company security policy prevented the Informatica workflows from picking up the data (PER_ALL_PEOPLE_F is an HRMS table containing sensitive information like SSN and salary; it was inaccessible to the user that Informatica uses, and the SDE mapping used to return 0 rows).
    Now we have approval to see those rows, and the dimension table is loaded with some 100 new employees who joined in the last year.
    The ROW_WID of W_EMPLOYEE_D is referenced in a lot of fact tables, and for all those missing employees the WID in the fact table is 0.
    Now that we have all employees, how do we make the fact tables point to the correct WID and not store 0? Has anyone faced this problem before? Writing an update statement will be a tedious task, as there are so many fact tables that join to W_EMPLOYEE_D. Also, our company uses the Sales, Procurement, and Finance modules of OB Apps (which constitutes at least 20 fact tables).
    Any guidance is appreciated. Thanks in advance
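    If it does come down to patching the facts with SQL, each statement is essentially a correlated lookup from the fact back to W_EMPLOYEE_D on a shared source key, repeated per fact table. This is a hedged sketch only: the fact table name and the key columns below are assumptions, and the usually supported route is to re-run the relevant fact loads so the WIDs are resolved by the ETL.

    UPDATE w_example_f f   -- hypothetical fact table name
    SET    f.employee_wid = (SELECT d.row_wid
                             FROM   w_employee_d d
                             WHERE  d.integration_id = f.employee_id)   -- assumed shared key columns
    WHERE  f.employee_wid = 0
    AND    EXISTS (SELECT 1 FROM w_employee_d d
                   WHERE  d.integration_id = f.employee_id);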

    Hello Kostis,
    Thank you for your answer. I don't fully understand you. Can you show me a short example, please? I create alias tables for the time dimension in the Physical Layer: the original table is TimeDayDim and I create aliases TimeDayDim1, TimeDayDim2, TimeDayDim3, TimeDayDim4. Then I create foreign keys Fact.Time1 -> TimeDayDim1, Fact.Time2 -> TimeDayDim2, Fact.Time3 -> TimeDayDim3, Fact.Time4 -> TimeDayDim4. And what now? Must I add these tables to the Business Model and create new time dimensions in the Business Model?
    I need ONE time dimension in Answers. I think I must split my fact table into four tables ... (time1, place1 ...) (time2, place2 ...) (time3, place3 ...) (time4, place4 ...) and then link those tables to the time dimension (but I don't know where I should split those tables: in the Physical Layer or in the Business Layer).
    I suppose that I will then have one time dimension and four fact tables in Answers and I will be able to query them (for example: Time.Days, Fact1.Place1, Fact3.Speed, Fact4.Count; criteria: Time.Year = 2008).
    Best Regards, Vlada
