Record count is higher in the fact tables than in the cube data!

When I delete a request from my cube, it does not get deleted from the fact table. The cube shows around 3.5 million records, but when I checked the fact table it was showing more than 11 million records.

Hi Kingsley,
   You may try this approach:
   1. Use transaction LISTSCHEMA.
   2. Select the type of cube and enter the cube name.
   3. This will display the tables involved in the cube (star schema).
   4. Select the F fact table listed there and check the number of rows in it (a quick SQL sketch follows below).
   In order to delete the entire data from the cube, you may need to delete all the load requests that have happened before as well.
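   As a rough cross-check you can also count the rows of the F and E fact tables directly in the database. This is only a sketch; /BIC/FYOURCUBE and /BIC/EYOURCUBE follow the usual naming convention for a custom cube, so substitute your actual cube name:
   SELECT COUNT(*) FROM "/BIC/FYOURCUBE";  -- uncompressed requests
   SELECT COUNT(*) FROM "/BIC/EYOURCUBE";  -- compressed requests
   The sum of the two is the number to compare with the record counts shown in the cube's manage screen.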
   Hope this helps...
Thanks,
Raj

Similar Messages

  • Record count is double in the Fact tables when compared to Cube data!

    Hello BW Gurus,
    I have 2 questions to be answered!
    1. I have a cube which consists of 3 years of data. Due to some bad data in it, I dropped the cube data completely from the fact and dimension tables and did an init. load from the ODS. The load failed due to "No SID found for value '2000000000000000000000010' of characteristic 0MAT_PLANT". I ran an RSRV test to find out whether there is any inconsistency in the SIDs for 0MAT_PLANT, but everything looks fine. With the time needed to finish the load in mind, I removed that record from the PSA and re-pushed the data from the PSA, which was successful; however, I wanted to know if anyone has come across this kind of error.
    2. The load finished successfully in the cube, but the performance of the cube was very bad, because my cube request shows only 3.5 million records while the fact table was showing double that, i.e. 7 million records. This is because I deleted the failed init. request and re-pushed the data from the PSA. Can anyone suggest how to overcome this, i.e. how to bring my fact table back to 3.5 million records?
    Please advise me, and thanks so much!
    Swathi.

    Hi Swathi,
    1. For the missing SID on 0MAT_PLANT, besides RSRV you can try RSD1 (InfoObject maintenance); there is a menu entry (about the 3rd from the left) 'Fill SIDs...'.
    For the InfoCube update, please make sure you 'delete data' with
    'fact and dimension tables' selected. Check again that the fact table counts 0 records; then you can update from the ODS with 'initialize'. Or, if you go via the PSA, delete the data from both the InfoCube and the ODS (from ODS to cube should be sufficient).
    2. For performance, check InfoCube -> Manage -> Performance and make sure the index and statistics are green. You can create the indexes there and also refresh the statistics.
    Hope this helps.
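    To see which requests are actually sitting in the fact table (and spot the doubled data), you can group the F fact table by the request SID through the package dimension. This is only a sketch assuming the standard naming for a custom cube called YOURCUBE; substitute your real table and column names:
    SELECT d.SID_0REQUID, COUNT(*) AS rec_count   -- records per load request
    FROM   "/BIC/FYOURCUBE" f
    JOIN   "/BIC/DYOURCUBEP" d ON f.KEY_YOURCUBEP = d.DIMID
    GROUP BY d.SID_0REQUID;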

  • How we will know that dimension size is more than the fact table size?

    How will we know whether the dimension size is more than the fact table size?

    Hi,
    Let us assume that we are going to put Division and Distribution Channel in one dimension, and assume we have 20 distinct values for Division in R/3 and 30 distinct values for Distribution Channel. At most, we can then get 20 * 30 = 600 records in the dimension table, and we can make a rough estimate of the number of records in the cube by looking at the raw data in the source system.
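    Once the cube is loaded you can also compare the actual counts directly. This is only a sketch using the usual /BIC/D<cube><n> and /BIC/F<cube> naming for a custom cube; substitute your real table names:
    SELECT COUNT(*) FROM "/BIC/DYOURCUBE1";  -- dimension table 1
    SELECT COUNT(*) FROM "/BIC/FYOURCUBE";   -- fact table
    If the first count approaches or exceeds the second, the dimension is too large.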
    With rgds,
    Anil Kumar Sharma .P

  • Can date be included in the fact table as a measure?

    Dear All,
    I have to migrate a database from a relational model to a dimensional model. It is a kind of human resources database. I don't know what MEASURES I should keep in the fact table. There are only dates, like the date the employee joined the institution and the date he will leave. Most of the other fields are non-numeric. Well, dates are also non-numeric, but we can calculate the duration the employee worked from these dates.
    What do you suggest?

    I'd be careful about adding a "measure" of duration worked (be it days, months, years - doesn't matter). It causes lots of churn. For example, if you choose a measure of "duration_worked_in_days", every single row in the fact table would be obsoleted every single day....
    What types of questions do you expect the fact table to answer?
    I'm working on an HR mart right now, and my fact data is around pay rates (not actual pay), i.e. annual salary, hourly salary, etc. My records also have two "date" dims - effective start date and effective end date. Meaning if my annual salary is $50 a year between 1/1/2008 and 12/31/2008, that's what the row shows. When (or if) I get a pay raise (/cut), the "current" record gets end-dated, and a new record inserted.
    When you say that a fact table "must" contain measure columns, I assume you're using the actual OWB fact / dimension objects, vs. just tables? It is very common in an HR data warehouse to have a "factless" fact table.
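    For example, if you keep only the effective dates on the fact row, a duration can always be derived at query time instead of being stored and constantly re-updated. This is just a sketch with made-up table and column names:
    SELECT employee_key,
           effective_start_date,
           NVL(effective_end_date, SYSDATE) - effective_start_date AS days_in_effect  -- derived at query time, never stored
    FROM   pay_rate_fact;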
    Hope this helps,
    Scott

  • Define granularity of the fact table

    Hi BW experts,
    Can you explain how to define the granularity of the fact table when doing data warehousing?
    Thanks,
    Bill

    Data modeling issue: for an example of defining the granularity of the fact table, see:
    http://help.sap.com/bp_biv335/BI_EN/documentation/Multi-dimensional_modeling_EN.doc
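    As a quick illustration of what grain means (a sketch with made-up names, not taken from the linked document): the same sales data can be stored one row per order line or one row per material per day, and that choice is the granularity of the fact table.
    -- grain: one row per order line (finest detail)
    CREATE TABLE sales_fact_line (
      order_id     NUMBER,
      line_no      NUMBER,
      material_key NUMBER,
      calday       DATE,
      quantity     NUMBER,
      revenue      NUMBER
    );
    -- grain: one row per material per day (coarser, smaller, less detail)
    CREATE TABLE sales_fact_daily (
      material_key NUMBER,
      calday       DATE,
      quantity     NUMBER,
      revenue      NUMBER
    );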

  • Display master data without data in the fact table

    Characteristic 0PROJECT
    Attribute Price
    I want to show in the query all the prices, including the projects that don't have records in the fact table.
    How do I do this?
    Tnks.

    I believe you are describing what SAP refers to as the Slow Moving Item scenario. Search SDN using that phrase and you'll get hits on documents and Notes that talk more about this. Here's something from an old How To:
    Slow Moving Item Scenario
    You want to define a query that displays all products that have been purchased only
    infrequently or not at all. In other words, the query should also display characteristic values for
    which no transaction data or only low values exist for the selected period.
    Procedure
    In the Administrator Workbench:
    1. Create a MultiProvider consisting of a revenue InfoCube, containing the InfoObject
    Material (0MATERIAL), and the InfoObject 0MATERIAL. The InfoObject must be set as
    an InfoProvider in InfoObject maintenance. In other words, you need to have assigned
    the InfoObject to an InfoArea. (also refer to Tab Page: Master Data/texts [Ext.]).
    In the BEx Analyzer:
    2. Select your MultiProvider in the Query Designer.
    3. Define a query that contains the InfoObject 1ROWCOUNT in the columns.
    The InfoObject 1ROWCOUNT is contained in all “flat” InfoProviders, that is, in all
    InfoObjects and ODS objects. It counts the number of records in the InfoProvider.
    In this scenario, you can see from the row number display whether or not values
    from the InfoProvider InfoObject are really displayed.
    4. Save the query and execute it. All values are now displayed, including those for materials
    that were not purchased.
    If you filter by time (0CALYEAR, for example), values from the InfoProvider
    InfoObjects are not displayed since 0CALYEAR is not an attribute of
    0MATERIAL. You can see this from the absence of values in the 1ROWCOUNT
    column in the query. If you want to restrict by time, you need to proceed as
    follows:
    Constant Selection for the InfoObject
    You need to set the constant selection for the 1ROWCOUNT key figure in order to be able to
    set a filter by time in this query.
    1. In the Query Designer, via the context menu for 1ROWCOUNT, choose Edit.
    2. On the left hand half of the screen, under the data package dimension, select the
    characteristic InfoProvider (0INFOPROV) and drag it into the right-hand screen area.
    3. From the context menu for the InfoProvider, choose Restrict, and restrict across the
    InfoProvider InfoObject.
    4. Also from the context menu for the InfoProvider, choose the function Constant Selection.
    5. Save the query and execute it. You can now also set a filter for a time characteristic; the
    materials display remains as it was.
    Displaying Slow Moving Items
    If you want to display a list of slow moving items, excluding products that are selling well, you
    need to proceed as follows:
    1. In the Query Designer, via the context menu for 1ROWCOUNT, choose Edit.
    2. Via the context menu for InfoProvider, choose the function Display Empty Values. Also
    select Constant Selection.
    3. Save the query and execute it. The result is that the system displays the materials for
    which there was no revenue.
    Displaying Products with Small Revenues
    If you want to display a list of products that have not been sold or have only been selling
    badly, you need to proceed as follows:
    1. Set constant selection as described above, but do not select the display empty values
    function.
    2. In the Query Designer, define a condition for the 0MATERIAL InfoObject. Specify a value
    that is to be the upper limit for a bad sale.
    3. Save the query and execute it. The result is that the system displays all materials that
    have not been sold or have been selling badly.
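    Conceptually, the 1ROWCOUNT/constant selection trick gives the same result as an outer join from the master data to the transaction data. The following is only an illustrative sketch in plain SQL with made-up table names, not the SQL that BW actually generates:
    SELECT m.material,
           SUM(s.revenue) AS revenue        -- NULL for materials with no sales
    FROM   material_master m
    LEFT OUTER JOIN sales_fact s ON s.material = m.material
    GROUP BY m.material;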

  • Dimension table is larger than the fact table

    Hi Community,
    How can we explain the phenomenon when a dimension table has MORE records in it than the fact table? What are the conditions that would cause this to occur?
    Thank you!
    Keith

    Thanks, Bhanu,
    I am wondering specifically how to explain the output from program SAP_INFOCUBE_DESIGNS when the dimension table is shown to have a fact table ratio that is greater than 100%.
    I believe that SAP_INFOCUBE_DESIGNS already takes into consideration both the E and also the F fact table when calculating the ratio. So in this case, we could not explain it by your first suggestion (after compression - but looking at only the F table).
    In the case where selective deletions have been performed, how can we correct the situation? For example, how could we clean out the records in the dimension tables which no longer have any facts in the fact table? (I think the BW system should do this automatically as part of the selective deletion, don't you agree?)
    Also, is there any other explanation for how the dimension table could arrive at greater than 100% of the size of the fact table(s)?
    For example, let's say that (theoretically) we placed many very dynamic characteristics together in the same dimension, which we know you should not do. Would it be possible for the combination of these very many dynamic characteristics to cause so many DIM IDs that the dimension table overtakes the record count of the fact table? Is this situation then made worse by compression, if the number of fact table records is reduced thanks to removal of the request ID?
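    One way to see how many orphaned entries are left behind after selective deletion is to look for DIMIDs that no longer occur in either fact table. This is only a sketch using the usual /BIC/ naming for a custom cube and its first dimension; adjust the names to your cube:
    SELECT COUNT(*)                           -- dimension entries with no facts
    FROM   "/BIC/DYOURCUBE1" d
    WHERE  NOT EXISTS (SELECT 1 FROM "/BIC/FYOURCUBE" f WHERE f.KEY_YOURCUBE1 = d.DIMID)
    AND    NOT EXISTS (SELECT 1 FROM "/BIC/EYOURCUBE" e WHERE e.KEY_YOURCUBE1 = d.DIMID);
    As far as I know, RSRV also has a check for unused entries in the dimensions of an InfoCube that can delete them, which would address the clean-up question.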

  • How map to my particular table to the fact table in obiee 11g...

    Hi friends,
    I built this simple report in OBIEE 11g, i.e.
    "NATIONALITY COUNT IN DEPARTMENT WISE"
    For that i used the following tables:
    per_all_assignments_f----->fact table
    hr_all_organization_units----->dim table(containing departments)
    per_all_people_f---------------->dim table(containing nationality)
    I made all the mappings in the physical diagram and also viewed my report in BI Answers.
    It shows the following results:
    NATIONALITY---------------------------------------------------------------------COUNT(NATIONALITY)
    AUS------------------------------------------------------------------------------------------------24
    AFR------------------------------------------------------------------------------------------------25
    PHQ_VB-------------------------------------------------------------------------------------------40
    SH_VT----------------------------------------------------------------------------------------------4
    The problem is that it is showing the above results, but the nationality column contains the various country codes.
    Since I don't want the nationality code to be displayed in the results, I need the meaning of each and every nationality,
    like,
    AUS------------------------Australian
    AFR-------------------------African
    PHQ_VB----------------------Germanian(assigned)
    I know that the meaning of the nationality is available in "FND_LOOKUP_VALUES".
    I can import the "FND_LOOKUP_VALUES" table into the physical layer, but how can I map it to the fact table in my physical diagram?
    In my report the fact table is "per_all_assignments_f".
    As my fact table doesn't contain any matching column corresponding to the dimension table "FND_LOOKUP_VALUES",
    how can I give mappings to the fact column to view the full meaning of the nationality in my report? Help me, friends...
    Regards,
    Harry...

    Hi bifact,
    I followed the steps that you asked me to follow, but I'm stuck further on.
    *) I saved the query that I executed in Toad, with the columns showing country codes and country meanings, to an Excel sheet.
    This is the query that I executed in Toad and whose data I copied into the Excel sheet:
    select z.lookup_code, z.meaning
    from   per_all_people_f e, per_all_assignments_f f, hr_all_organization_units h, fnd_lookup_values z
    where  e.person_id = f.person_id and f.organization_id = h.organization_id
    and    e.business_group_id = f.business_group_id and f.business_group_id = h.business_group_id
    and    f.location_id = h.location_id
    and    z.lookup_type(+) = 'NATIONALITY' and z.lookup_code(+) = e.nationality
    and    sysdate between e.effective_start_date and e.effective_end_date
    and    e.nationality is not null
    *) After that I created a system DSN for the Excel driver.
    *) After that, when I tried to import the metadata of the saved Excel data, it showed me 'connection failed'.
    For importing this Excel data, what connection details do I need to give, as well as what user name and password?
    Soon after importing this Excel, you said to give a key connection only to fnd_lookup_tables (dim) and that Excel data.
    If so, again I am not giving a key connection to my fact table.
    I think the same error will occur again, that no logical mapping is made to the fact, so the repository will again be inconsistent.
    Hi bifact, sorry to ask you, but can you check whether the steps I followed are correct and what further steps I need to take? Can you tell me briefly? Thanks for your help.
    Regards,
    Harry...

  • Content Tab: None of the fact tables are compatible with the query request

    Hi All,
    One thing I am not clear on, even after all my years with OBIEE, is working with the Content tab in the BMM.
    I have made a rpd the joins in physical layer as shown below:
    https://picasaweb.google.com/114804305606242416264/OBIEEError#5663056545119428530
    And the BMM layer as:
    https://picasaweb.google.com/114804305606242416264/OBIEEError#5663056519553812930
    The error I am getting when I run a request with 3 columns from the 3 selected tables is:
    Dim - Comment Code Details
    Fact - Complaint
    Dim - Service Details
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 14020] None of the fact tables are compatible with the query request Sr Num:[DAggr(Fact - Complaint.Sr Num by [ Dim - Service Details.Sr Cat Type Cd, Dim - Comment Code Details.Cmtcode name] )]. (HY000).
    I get no errors from the consistency check. I have read everywhere that I need to set the appropriate aggregation levels in the LTS properties of the various dims and facts to help OBIEE understand our model, but how do I do that? How do I decide? How should I approach it, and what should the aggregation level and details be?
    When I click the More button I see different options: Copy, Copy From, Get Levels, Check Level. What do these mean?
    For 'Aggregation content, group by' - Logical Level or Column - which one should I choose and how should I decide?
    Can anyone explain the Content tab in detail and from scratch, with some example, and why we get these errors? I know many people who are well versed in many other RPD topics, but not this one. A little effort at explaining from you guys will really be appreciated.
    Thanks in advance,
    Dev

    Hi Deepak,
    Option 1:
    My tables in physical layer are joined as below:
    D1--> F1 <--D2--> F2 <--D3
    Same way i model it in BMM
    D1--> F1 <-- D2--> F2 <--D3
    Here D1 is a non-conformed dimension for F2 and D3 is a non-conformed dim for F1. Later, after creating the dimensional hierarchies, I tried setting up the content levels.
    I go to Sources > Content tab of fact F1 and set:
    Dimensions----------- Logical level
    D1---------------------- D1 Detail
    D2---------------------- D2 Detail
    D3---------------------- D3 Total
    Then I go to Sources > Content tab of fact F2 and set:
    Dimensions----------- Logical level
    D1---------------------- D1 Total
    D2---------------------- D2 Detail
    D3---------------------- D3 Detail
    Then I also go into all the dimensions and set their content levels to Detail, but it still gives me errors; I am not sure where I am going wrong in setting the content levels.
    I need to know whether the way I have modeled it in the BMM is right.
    Option 2:
    I can combine the two facts in a single logical fact, or the above design should also work:
    (F1 & F2) <-- D1, D2, D3 joined separately using complex logical joins.
    What would the Content tab details be then?
    Thanks,
    Dev

  • Best approach to delete records that are not in the source table anymore.

    I have a situation where I need to remove records from dimensions that are not in the source data anymore. Right now we are not maintaining history, i.e. not using SCDs, but we are planning to for the next release. If we did that, it would be easy to figure out the latest records. The load is nightly, and records are updated and new ones added.
    The approach that I am considering is to join the dimension tables to the sources on keys and delete what doesn't join. However, is there perhaps some function in OWB that would allow doing this automatically on import, so it can also be in place for the future?
    Thanks!

    Bear in mind that deleting dimension records becomes problematic if you have facts attached to them. Just because a record is no longer in the active set doesn't mean that it wasn't used historically, and so it may have foreign key constraints on it in your database. If this is the case, a short-term solution would be to add an expiry_date field to the dimension and update the load to set this value when the record disappears, rather than deleting it.
    And to do that, use the target dimension as a source table, outer join it to the actual source table on the natural key, and have your update set expiry_date = nvl(expiry_date, sysdate) (i.e. set it to sysdate only if the record has not already been expired) on all records where the outer join fails.
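    In plain SQL (outside OWB) the same idea looks roughly like the statement below; customer_dim, source_customer and natural_key are made-up names for illustration only:
    UPDATE customer_dim d
    SET    d.expiry_date = NVL(d.expiry_date, SYSDATE)   -- only stamps rows not already expired
    WHERE  NOT EXISTS (SELECT 1
                       FROM   source_customer s
                       WHERE  s.natural_key = d.natural_key);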
    Further consideration: what do you do if the record is re-inserted into the source table? create a new dimension key? Or remove the expiry date?
    But I will say that I am not a fan of deleting records in most circumstances. What do you do if you discover a calculation error and need to fix that and republish historical cubes? Without the historical data, you lose the ability to do things like that.

  • Error for the fact table while processing the cube - attribute key cannot be found when processing

    Please help, as I am new to SSAS and this is an urgent requirement. This is a MOLAP cube, and below is the error that I am receiving when processing the cube. The cube is set to Process Full. Several similar errors pop up for various dimensions.
    "Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'Fact_Table', Column: 'ID', Value: '1'. The attribute is 'Id'. Errors in the OLAP storage engine: The attribute key was converted to an unknown member because
    the attribute key was not found. Attribute Id of Dimension: 17 - Ves - PoC Cont from Database: DB, Cube: IPNCube, Measure Group: iSrvy, Partition: Partition1, Record: 1."
    Thanks in advance.

    Thanks for the recommendations David.
    It will be really great if you can clear some of my doubts:
    As far as I know, all the dimensions need to be processed first and then the fact table is processed.
    So if the IDs are not present in the dimension tables, they should not be present in the fact table either.
    Here we found null values in the dimension table while the IDs were present in the fact table. What might be the reasons causing such a situation?
    Also, how frequently does the cube need to be processed? Currently the ETL which processes the cube is scheduled in a SQL Server Agent job on an hourly basis every day.
    Is there any possibility that the cube might still be in a processing state while the SQL job for the next run gets executed and tries to access and process the cube?
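    One way to spot such orphan keys before processing is a simple left join from the fact to the dimension. This is only a rough sketch; Fact_Table, Dim_Table and ID are placeholder names taken from the error message, so substitute your real tables and key columns:
    SELECT DISTINCT f.ID                    -- fact keys with no matching dimension member
    FROM   Fact_Table f
    LEFT JOIN Dim_Table d ON d.ID = f.ID
    WHERE  d.ID IS NULL;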

  • Duplicate record with same primary key in Fact table

    Hi all,
    Can the fact table have duplicate records with the same primary key? When I checked a cube I could see records with the same primary key combination, but the key figure values are different. My cube has 6 dimensions (including Time, Unit and DP) and 2 key figures, so 6 fields combine to form the composite primary key of the fact table. When I checked the records in SE16 I could see duplicate records with the same primary key. There is no parallel loading happening for the cube.
    BW system version is 3.1
    Data base is : Oracle 10.2
    I am not sure how this is possible.
    Regards,
    PM

    Hi Krish,
    I checked the data packet dimension also. Both records have the same dimension ID (141). Except for the key figure values there is no other difference in the fact table records. I know this is against the basic DBMS primary key rule, but I have records like this in the cube.
    Can this situation arise when the same record is in different data packets of the same request?
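    As far as I know the F fact table has no unique database index on the combination of dimension keys, so such rows can physically exist, and compression would aggregate them into one row. You can list the duplicates with something like the sketch below, which assumes a custom cube whose dimension key columns are named as shown; adjust it to your cube's actual KEY_* columns:
    SELECT KEY_YOURCUBEP, KEY_YOURCUBET, KEY_YOURCUBEU,
           KEY_YOURCUBE1, KEY_YOURCUBE2, KEY_YOURCUBE3, COUNT(*) AS dup_count
    FROM   "/BIC/FYOURCUBE"
    GROUP BY KEY_YOURCUBEP, KEY_YOURCUBET, KEY_YOURCUBEU,
             KEY_YOURCUBE1, KEY_YOURCUBE2, KEY_YOURCUBE3
    HAVING COUNT(*) > 1;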
    Thx,
    PM

  • More than one fact tables...

    Hi.
    I have tried OLAP until now with only one fact table.
    But now I have more than one. To start, I added one more.
    I am always using SOLVED LEVEL...LOWEST LEVEL.
    I am always receiving the following error when creating the cube with these measures:
    "exact fetch returns more than requested number of rows"
    What shall I look for when dealing with more than one fact table?
    Thanks.
    ODDS
    :: ... and still have a very poor performance ...

    1.
    Well ... I saw the global star schema and we have two fact tables there!!!
    Do I have to build different cubes for each fact table always?
    2.
    I have built cubes, created a java client and a jsp client.
    Performance is much better in JSP using the AppServer (sure!).
    The power of the JSP client is more limited, I presume.
    I wonder if I can do things such setCellEditing for a crosstab in both.
    3.
    Some aggregation questions:
    Every time I create a cube using CWM2, and also an AW using the AWM wizards with that cube, I get one aggregation plan by default that processes everything online.
    After that I create and deploy my own aggregation plan.
    My question is: what if I don't want to aggregate anything? I want to see, for instance, in BI Beans only the lowest level values, and everything at the top levels empty.
    I am missing something, because I still have everything aggregated!
    Thanks.
    ODDS

  • How to change the fact table in backend query

    Hi
    I have a criteria where F1 is the fact table coming in the backend query. How can I change/modify it so that, if I select the same criteria, a different fact table F2 is used instead?
    Please suggest.

    Hi Hussain,
    I have a measure 'po amount' which is coming from two fact tables, cost_f and line_f, in the physical layer. I have an implicit fact column in the presentation layer on a column (internal - row count) from the cost_f table.
    Now, when I take only 'po amount' in the criteria, the backend query should use the cost_f table, but I am seeing the line_f table.
    The reason I am checking in this direction is this:
    I have a criteria with 4 columns and the measure column 'po amount'; when I run it, the result is fetched from the line_f table.
    With the same 4 columns and the measure column 'po amount', when one new column 'cost center' is added to the criteria, the fact table changes to the cost_f table.
    In both cases the result should come from the cost_f table; I am not sure why line_f is coming in the backend query.
    Please suggest.

  • Measures in the fact table

    Hello,
    Can I have measures in the fact table in the business layer?
    I am using obiee 11g.
    For example, I have a fact table with a user column, to which I applied an aggregation rule of count distinct.
    Now I would like to add a lower function to it, like count(distinct(lower(username))).
    But then it becomes a measure with fx next to the name and not the usual yellow colour.
    I wanted to know if it's good practice?
    Thanks

    Hi,
    You have two measure columns:
    one with the lower function,
    the other without the lower function.
    For your information: DISTINCT is case sensitive; it will treat 'A' and 'a' as different values.
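    A quick way to see the effect in plain SQL (a sketch with a made-up table name, not the SQL that OBIEE generates):
    SELECT COUNT(DISTINCT username)        AS raw_count,         -- 'A' and 'a' counted separately
           COUNT(DISTINCT LOWER(username)) AS normalized_count   -- case differences collapsed
    FROM   user_fact;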
    Regards,
    Lakshmipathi.
    Edited by: Lakshmipathi on Jul 7, 2011 2:41 PM
