Missing most detailed table for dimension tables

Hi,
I am getting the following error:
Business Model Core:
[nQSError: 15003] Missing most detailed table for dimension tables: [Dim - Customer,Dim - Account Hierarchy,Dim - Account Region Hierarchy,Fact - Fins - Period Days Count].
[nQSError: 15001] Could not load navigation space for subject area Core.
I got this error when I tried to configure # of Elapsed Days and # of Cumulative Elapsed Days in the following way:
1. Using the Administration Tool, open OracleBIAnalyticsApps.rpd.
The OracleBIAnalyticsApps.rpd file is located at ORACLE_INSTANCE\bifoundation\OracleBIServerComponent\coreapplication_obisn\repository.
2. In the Business Model and Mapping layer, go to the logical table Fact - Fins - Period Days Count.
3. Under Sources, select the Fact_W_DAY_D_PSFT logical table source.
4. Clear the Disabled option in the General tab and click OK.
5. Open the other two logical table sources, Fact_W_DAY_D_ORA and Fact_W_DAY_D_PSFT, and select the Disabled option.
6. Add the "Fact - Fins - Period Days Count" and "Dim - Company" logical tables to
the Business Model Diagram. To do so, right-click the objects and select Business
Model Diagram, Selected Tables Only.
7. In the Business Model Diagram, create a new logical join from "Dim - Company" to "Fact - Fins - Period Days Count." The direction of the foreign key should be from the "Dim - Company" logical table to the "Fact - Fins - Period Days Count" table. For example, on a (0,1):N cardinality join, "Dim - Company" will be on the (0/1) side and "Fact - Fins - Period Days Count" will be on the N side.
8. Under the Fact - Fins - Period Days Count logical table, open the "# of Elapsed Days" and "# of Cumulative Elapsed Days" metrics, one at a time.
9. Go to the Levels tab. For the Company dimension, the Logical Level is set to All. Click the X button to remove it. Repeat until the Company dimension does not have a Logical Level setting.
10. Make sure to check Global Consistency to ensure there are no errors, and then save the RPD file.
Please help me resolve this.
Thanks,
Soumitro

Could you let me know how you resolved this? I am facing the same issue.

Similar Messages

  • "Missing most detailed table for dimension tables" eror when I run the Global Consistency check

    ERRORS:
    Business Model DAC Measures:
    [nQSError: 15003] Missing most detailed table for dimension tables: [D_DETAILS,D_EXECUTION_PLAN,D_TASK].
    [nQSError: 15001] Could not load navigation space for subject area DAC Measures.
    I am also attaching my Business Model layer for easier understanding. I have a fact table and several dimension tables. I got this error only after creating the following hierarchies:
    Execution Plan -> Tasks -> Details
    Start Date Time Hierarchy
    End Date Time Hierarchy
    Is there a solution for this problem? Thanks in advance!

    Yes! My Task hierarchy has 3 dimension tables that form a hierarchy: Execution Plan -> Tasks -> Details.
    All 3 levels in the hierarchy are 3 different dimension tables.

  • [nQSError: 15003] Missing most detailed table for dimension tables:

    Hi,
    I am new to BI. I created a fresh repository from scratch.
    When I run the consistency check I get the following messages:
    [nQSError: 15001] Could not load navigation space for subject area SALES.
    [nQSError: 15003] Missing most detailed table for dimension tables: [Products,Customers].
    Can anyone help me with this?
    Thank you
    subra.

    Got rid of the problem by deleting and recreating the dimension tables.
    Subra.

  • Number range buffering for dimension tables (DIM IDs) and InfoObjects (SIDs)

    Hi BW Experts,
    How can I check whether number range buffering for dimension tables (DIM IDs) and InfoObjects (SIDs) is active or not?
    Can you please tell me where to go and check this in the system?
    Thanks in Advance.

    The EarlyWatch report lists these because of the large number of rows in the MD/dimension tables. Keep in mind, though, that number range buffering will really only help if you continue to generate a large volume of new SIDs or DIM IDs with ongoing loads.
    For example, if the transactions being loaded to a cube result in several thousand new rows being added to a dimension table, then it makes sense to turn on number range buffering for that DIM ID. But if the transaction volume only causes a few hundred DIM IDs to be added, buffering will not really gain you anything.

  • Fact Table and Dimension Tables

    Hi Experts, I'm creating custom InfoCubes for data coming from non-SAP source systems. I have two InfoCubes. The data comes from about 10 tables. I have created 10 DataSources for this, and the data will be consolidated in a standard DSO before it flows into the 2 InfoCubes.
    Now the client wants to know in advance how much data there will be in the fact tables and dimension tables of both InfoCubes. I have the total size of all 10 source tables, given to me by the DBA, but I am not sure how to translate that into fact table and dimension table sizes, as I have not yet created these InfoCubes.
    Please help me with how I should address this.

    hi,
    The exact figure will be hard to give, but you can arrive at a rough estimate in your case.
    You are consolidating the data from the tables, which means there is a relation between them. Arrive at a rough figure based on that relation and on the activity you perform while consolidating the data.
    For example, let us say we want to combine data for sales orders and deliveries in a DSO.
    Say the sales order table has 1,000 records and the delivery table has 2,000 records, and both tables share a common link (the sales order). In the DSO you are combining the data, which means the data will be at the most granular level, i.e. the delivery level, so the maximum number of records the consolidated DSO can have is 2,000.
    regards,
    Arvind.
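    A minimal SQL sketch of that upper-bound estimate (the table and column names SALES_ORDERS, DELIVERIES and SALES_ORDER_NO are hypothetical stand-ins for your staging tables, not objects from the system above):
        -- Count the delivery rows that join to a sales order; with a 1:N
        -- sales-order-to-delivery relation this is the maximum number of
        -- records the consolidated DSO can end up holding.
        SELECT COUNT(*) AS max_dso_rows
        FROM   deliveries d
               JOIN sales_orders s
                 ON s.sales_order_no = d.sales_order_no;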

  • To find the size of the fact table and dimension table

    Hi experts,
    Can anyone please tell me: if I want to find the size of the fact table and the size of the dimension tables (to check cardinality and decide on line-item dimensions), do we first have to build statistics and then find the size via transaction DB02, or is there another method?
    Thanks in advance

    Hi,
    Please go to t-code DB02 > Space > Tables and Indexes. Give your table name or a pattern (like /BIC/F* for getting all the fact tables). This will give you the sizes of all the tables.
    Also, if you want a list like the top 30 fact tables and dimension tables, please use t-code ST14; it will give the desired output with all the required details.
    -Vikram
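    If the BW system runs on an Oracle database, a rough size list can also be pulled straight from the data dictionary. This is only a sketch: it assumes Oracle (DBA_SEGMENTS is an Oracle view), the schema owner SAPSR3 is an assumption that varies by installation, and /BIC/F* is the fact-table pattern mentioned above.
        -- Size in MB of all F fact tables, largest first
        SELECT segment_name,
               ROUND(SUM(bytes) / 1024 / 1024) AS size_mb
        FROM   dba_segments
        WHERE  owner = 'SAPSR3'                 -- assumption: adjust to your SAP schema
               AND segment_name LIKE '/BIC/F%'  -- F fact tables
        GROUP  BY segment_name
        ORDER  BY size_mb DESC;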

  • Oracle Data Compression on SID tables and Dimension Tables

    Hello Community,
    We have had great success with Oracle compression on ODS tables that are no longer loaded.
    We'd now like to move on to other types of BW tables that are very large.
    OSS Note 701235 provides answers to questions concerning the possible use of Oracle compression together with SAP BW.
    But the Note does not give suggestions for (or against) Oracle compression on SID tables or Dimension tables.
    I believe both table types would exhibit the same behaviour: mostly inserts of new SID IDs and new DIM IDs, but few updates to existing SID or dimension records. If this is true, then both are good candidates for Oracle compression.
    Do you also agree that this is the typical behaviour for SID tables and dimension tables? And that these types of tables are good candidates for Oracle compression in a large BW system?
    Thanks kindly!
    Keith Helfrich

    Hi all,
    Although this is an old thread that I stumbled upon during my own investigations, I can provide some answers to your questions.
    Table candidates for compression are found by these criteria:
    - Is the table big enough?
    - Is a long lifetime planned for the object?
    - Are there no, or only rare, structural changes to the table?
    - Is the update rate low, i.e. is your data mostly "read only"? (For the widely used rolling-window partitioning of tables in BW this is not a problem: mostly INSERTs into the current partition, not affecting other partitions.)
    BW tables that can benefit from compression (see SAP Notes 105047 and 701235):
    - PSA tables with data that must be kept for a longer time
    - ODS change logs (no updates of old data, only inserts of new data)
    - "historical" cubes which get no changes in table structure anymore
    Limitations:
    - Normal INSERT or UPDATE statements are always stored in uncompressed format and must be compressed separately (<= Oracle 10g).
    - There is a slight CPU overhead for compression, but CPU consumption is more than compensated by doing less I/O, as with bulk loads or parallel processing; SAP BW transformations take a significant amount of CPU in the overall load time into cubes, and that is caused by the application server, not the database.
    - The table must not have more than 255 fields.
    - Adding columns with an initial value or dropping columns requires uncompressing the complete table (the strongest limitation).
    Considering all of the above, you can conclude that tables that go through UPDATEs are not good candidates for compression, nor are tables whose structure can change (like fact or dimension tables).
    Now, my questions to you:
    Which Oracle version do you use?
    Which tool do you use for Oracle compression: BRSPACE (can you give an example?) or ALTER ... MOVE COMPRESS?
    bye
    yk
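    For reference, a minimal sketch of the ALTER ... MOVE COMPRESS approach mentioned above (the table and index names are placeholders, not real objects; on Oracle 10g and earlier this applies basic compression, so rows written by ordinary single-row inserts stay uncompressed until the next move):
        -- Compress an existing table by rebuilding it in compressed form
        ALTER TABLE "/BIC/B0000123000" MOVE COMPRESS;
        -- A MOVE leaves the table's indexes UNUSABLE, so rebuild them afterwards
        ALTER INDEX "/BIC/B0000123000~0" REBUILD;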

  • Reg: Fact table and Dimension table in Data Warehousing -

    Hi Experts,
    I don't quite understand the criteria that decide what goes into a fact table versus a dimension table.
    This link http://stackoverflow.com/questions/9362854/database-fact-table-and-dimension-table states:
    A fact table contains data that can be aggregated.
    Measures are aggregated data expressions (e.g. sum of costs, count of calls, ...).
    A dimension contains data that is used to generate groups and filters.
    That's fine, but how does one decide which columns to consider for the fact table and which columns for the dimension tables?
    Any help is much appreciated.
    Pardon me if this isn't the correct place for this question. It's my first question in the new forum.
    Thanks and Regards,
    Ranit Biswas

    ranitB wrote:
    But my main doubt was - what is the criteria to differentiate between columns for Fact tables and Dimension tables? How can one decide upon the design?
    Columns of a fact table will often be 'scalar' attributes of the 'fact' data item, while a dimension table will often hold 'compound' attributes of a 'fact'.
    Consider employee information. The EMPLOYEE table can be a fact table. It might have scalar attribute columns such as: DATE_HIRED, STATUS, EMPLOYEE_ID, and so on.
    Other related information that can't be specified as a single attribute value would often be stored in a 'dimension' table: ADDRESS, PHONE_NUMBER.
    Each address requires several columns to define it: ADDRESS1, ADDRESS2, CITY, STATE, ZIP, COUNTRY. And an employee might have several addresses: WORK_ADDRESS, HOME_ADDRESS. That address info would be stored in a 'dimension' table and only the primary key value of the address record would be stored in the EMPLOYEE 'fact' table.
    Same with PHONE_NUMBER. Several columns are required to define a phone number and each employee might have several of them. The dimension tables are used to help 'normalize' the data in the employee 'fact' table.
    And that EMPLOYEE table might also be a DIMENSION table for other FACT tables. A DEVELOPER table might have an EMPLOYEE_ID column with a value that points to a 'dimension' row in the EMPLOYEE dimension table.
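    A small DDL sketch of the employee example above (illustrative names only, not a recommended model):
        -- Address details live in a dimension table ...
        CREATE TABLE address_dim (
            address_id NUMBER        PRIMARY KEY,
            address1   VARCHAR2(100),
            address2   VARCHAR2(100),
            city       VARCHAR2(60),
            state      VARCHAR2(30),
            zip        VARCHAR2(20),
            country    VARCHAR2(60)
        );
        -- ... and the EMPLOYEE 'fact' row stores only the key values
        CREATE TABLE employee (
            employee_id     NUMBER PRIMARY KEY,
            date_hired      DATE,
            status          VARCHAR2(20),
            work_address_id NUMBER REFERENCES address_dim (address_id),
            home_address_id NUMBER REFERENCES address_dim (address_id)
        );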

  • What are the Target Base Tables for FA_TAX_INTERFACE table?

    Hi,
    We are using "upload tax book interface" program to update the Tax Book informaiton.
    For this we wanted to know all the Target Base Tables for FA_TAX_INTERFACE table (Specially for the ATTRIBUTE1-ATTRIBUTE15 columns of the interface table).
    Kindly request you to help on this.
    Thanks & Regards,
    Satvik Patel

    Well, what are you reconciling to? You are looking at something that seems to be different from your data extract, so what is that thing? My troubleshooting process is to take an asset that seems to be wrong and reconcile from the report/screen data to the extract data. You have to figure out what Oracle program is involved, what kind of SQL that program is running, and what tables it is looking at.
    This could be an Oracle program error. I just ran into a problem on the Payables Posted Invoice Register report, where the total did not match my extract total. It turned out the report was wrong: if I had table records with a date/time stamp value (instead of just a date stamp value) in the accounting date field for the TO ACCOUNTING DATE value in my report request selection, those records were being dropped from the report. I reported this to Oracle Support and found out that there is a patch to fix the problem (it happens when cancelling an invoice) and a data script to fix the historical data. So you have to keep an open mind. Yes, trying to find a reconciling issue can be very painful.
    My reaction is that you are looking at the right tables. There is also a FA_DISTRIBUTION_HISTORY table that you may want to look into, that links back to the FA_DEPRN_DETAIL table. Maybe your issue lies there. At least something to look at.
    John Dickey

  • Fact table and dimension table

    What is the difference between a fact table and a dimension table?

    A fact table contains numeric values and also contains a composite key (i.e. a collection of foreign keys), e.g. sales and profit. It typically has two types of columns: those that contain facts and those that are foreign keys to dimension tables.
    Dimension tables, also known as lookup or reference tables, contain the relatively static data in the warehouse. They contain character values, e.g. Customer_name, Customer_city.
    Dimension tables store the information you normally use to constrain queries. Dimension tables are usually textual and descriptive, and you can use them as the row headers of the result set.
    Rachna
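    A short query sketch of that idea, using hypothetical SALES_FACT and CUSTOMER_DIM tables: the dimension attribute is the row header, and the fact columns are aggregated.
        SELECT c.customer_city,              -- dimension attribute: groups the rows
               SUM(f.sales)  AS total_sales, -- fact measures: aggregated
               SUM(f.profit) AS total_profit
        FROM   sales_fact f
               JOIN customer_dim c
                 ON c.customer_key = f.customer_key
        GROUP  BY c.customer_city;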

  • Size of fact table and dimension table

    Hi Experts ,
    Can you tell me how to measure the size of the dimension tables and the fact table, and how to find the ratio of the size of the dimension tables to the size of the fact table?
    Thanks in advance,
    Thanks and Regards
    Ram Kommineni

    Hi,
    The ratio of the size of the fact table to the dimension tables should ideally be 10:1; that means the size of the dimension tables should be about 10% of the size of the fact table. This is better in terms of performance.
    Whenever you load a cube, you can easily find out the number of records in the fact table (E and F tables) and in the dimension tables. Based on those statistics you can determine the ratio of fact table size to dimension table size.
    Hope this info helps you.
    Regards,
    Yogesh.
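    If the database is Oracle and table statistics are current, the row counts behind that ratio can also be read from the data dictionary. This is a sketch only; the cube name ZSALES is a placeholder and the /BIC/ naming assumes a standard BW installation.
        SELECT table_name, num_rows
        FROM   dba_tables
        WHERE  table_name IN ('/BIC/FZSALES', '/BIC/EZSALES')  -- F and E fact tables
               OR table_name LIKE '/BIC/DZSALES%'              -- dimension tables
        ORDER  BY num_rows DESC;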

  • Where can we see Fact Table And Dimension Table in DataWarehouse Workbench?

    Hi Experts,
    Where can we see the fact table and dimension tables in the Data Warehouse Workbench?
    Best Regards
    nvnkmr12

    Hi
    Refer to the link below and you will get a comprehensive explanation of how data is stored and modeled in BI. It explains the fact table, dimension tables, SID tables, P and Q tables, and X and Y tables.
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/6ce7b0a4-0b01-0010-52ac-a6e813c35a84
    Cheers
    Umesh

  • ABAP table for "Condition table"

    Dears,
    Could you please tell me the ABAP table for condition tables.
    My requirement is: for a field, e.g. KUNNR, I need to check in which condition tables it is used.
    Thanks
    pinky

    Hi,
    Use t-code SE16N: provide the sales order number in VBAK or VBAP and take the value of field KNUMV (the internal number under which the system saves conditions that apply to a sales order header) or KNUMH (the internal number under which the system saves conditions that apply to a sales order item). Then go to KONV (for VBAK) or KONP (for VBAP), enter this KNUMV or KNUMH number and execute; you will get all item pricing conditions for that sales order.
    For pricing records:
    table KONH - Condition (Header)
    table KONP - Condition (Item)
    with field KNUMH - Condition record number.
    hope this will help.
    regards
    Vivek.
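    A SQL sketch of the lookup Vivek describes, reading the document conditions for one order (the order number is a placeholder; this assumes KONV is directly readable as a transparent table in your release, whereas in S/4HANA the same data lives in PRCD_ELEMENTS):
        -- Condition type, rate and currency for all pricing conditions of one order
        SELECT k.kschl, k.kbetr, k.waers
        FROM   vbak v
               JOIN konv k
                 ON k.knumv = v.knumv
        WHERE  v.vbeln = '0000012345';  -- placeholder sales order number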

  • Underlying database tables for Setup Tables

    Experts,
    Can anyone tell me what the underlying database tables for the Purchasing setup tables are, which I can check via SE11 in R/3?
    Thanks,
    Jain

    Hi Rajesh,
    You can check the records in NPRT transaction.
    Hope this helps you.
    Regards,
    Saravanan.

  • How to Maintain Surrogate Key Mapping (cross-reference) for Dimension Tables

    Hi,
    What would be the best approach on ODI to implement the Surrogate Key Mapping Table on the STG layer according to Kimball's technique:
    "Surrogate key mapping tables are designed to map natural keys from the disparate source systems to their master data warehouse surrogate key. Mapping tables are an efficient way to maintain surrogate keys in your data warehouse. These compact tables are designed for high-speed processing. Mapping tables contain only the most current value of a surrogate key— used to populate a dimension—and the natural key from the source system. Since the same dimension can have many sources, a mapping table contains a natural key column for each of its sources.
    Mapping tables can be equally effective if they are stored in a database or on the file system. The advantage of using a database for mapping tables is that you can utilize the database sequence generator to create new surrogate keys. And also, when indexed properly, mapping tables in a database are very efficient during key value lookups."
    We have a requirement to implement cross-reference mapping tables with natural and surrogate keys for each dimension table. These mapping tables will be populated automatically (inserts only) during the E-LT execution, right after inserting into the dimension table.
    Does anyone have an idea how to implement this in ODI?
    Thanks,
    Danilo

    Hi,
    First of all, please avoid bolding everything. That said, according to Kimball (if I remember correctly) this is a 1:1 mapping, so no surrogate key.
    Personally, you could use a lookup table:
    http://www.odigurus.com/2012/02/lookup-transformation-using-odi.html
    or make a simple outer join filtering by your "Active_Flag" column (remember that this filter needs to be inside your outer join).
    Let us know
    Francesco
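    For what it's worth, a minimal sketch of the kind of mapping table the Kimball quote describes, on Oracle. The names (CUSTOMER_SK_SEQ, STG_CUSTOMER_KEY_MAP, the two source-system columns and the bind variables) are illustrative assumptions, not ODI-generated objects.
        -- One surrogate key per row, plus one natural-key column per source system
        CREATE SEQUENCE customer_sk_seq START WITH 1 INCREMENT BY 1;
        CREATE TABLE stg_customer_key_map (
            customer_sk NUMBER       NOT NULL,   -- surrogate key used in the dimension
            src_erp_id  VARCHAR2(30),            -- natural key from source system 1
            src_crm_id  VARCHAR2(30),            -- natural key from source system 2
            load_date   DATE DEFAULT SYSDATE,
            CONSTRAINT pk_stg_customer_key_map PRIMARY KEY (customer_sk)
        );
        -- Populated insert-only during the E-LT run, right after the dimension insert
        INSERT INTO stg_customer_key_map (customer_sk, src_erp_id, src_crm_id)
        VALUES (customer_sk_seq.NEXTVAL, :erp_customer_no, :crm_customer_no);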
