Updating fact table

Hi,
I'm just starting with data warehousing. I need to design a warehouse that will record sales on a daily basis. This seems to be a pretty standard task; however, in my case an order may change many times before it is fulfilled. I'm planning to handle this similarly to Type 2 slowly changing dimensions, by adding effective/expiration dates (references to the date dimension) to the fact table. So when an order is updated I would mark the current record as expired and add a new one. I cannot just replace or remove the previous record, since that would make historical information incorrect. I also don't really like the idea of storing, say, the order history and periodic snapshots separately - it seems overly complicated.
I was thinking about partitioning the fact table so that only records in the most current partition would be updated. In addition, I would create a view for users filtered on "Expiration Date" = 'N/A' so they can work with the current information.
I'm sure this is a common situation in data warehousing, yet I could not find any useful information on dealing with it. Am I on the right track? Does OWB support such functionality (find existing facts by some criteria -> update them -> load new records) well?
Thanks.
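
For illustration, here is a minimal sketch of the expire-and-insert pattern described above; the table, the columns, and the 99991231 surrogate standing in for the 'N/A' date are all assumptions, not anything OWB generates:

    -- Expire the current version of the order's fact row
    UPDATE sales_fact
       SET expiration_date_key = 20090911      -- today's key in the date dimension
     WHERE order_id = 12345
       AND expiration_date_key = 99991231;     -- the open-ended / 'N/A' key

    -- Insert the new version carrying the revised measures
    INSERT INTO sales_fact
        (order_id, effective_date_key, expiration_date_key, sales_amount)
    VALUES
        (12345, 20090911, 99991231, 150.00);

    -- Users query only current rows through a view
    CREATE OR REPLACE VIEW sales_fact_current AS
    SELECT *
      FROM sales_fact
     WHERE expiration_date_key = 99991231;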

Seems like an odd situation. I mean, even if an order is revised several times, does it get included in a daily sales total before it is complete? I think not.
You seem to want to be able to report on a daily order-book total, which is completely different from sales. Sales are generally defined as atomic, completed transactions. So I would tend to do this via a SALES_ORDER type-2 dimension which includes order status and order total. If you need line-item totals then you might want to look at a many-to-many relationship to hold those details.
The fact table would then just hold the associated details of completed sales (do you need to report sales by payment type/ delivery method / sales rep / cashier / location / etc? If so those are other FKs to dimensions).
I suggest this route as the current value of the order book is not summable - it is a point in time look at orders on hand. Sales, on the other hand, are summable so can be aggregated / averaged / whatever over a given time period. You just can't do that with the current value of the order book which suggests that it is NOT best served by being stored in a fact.
I think that this has to be the route to go if you want to be able to sum totals over time. You may have sales. You may need an associated "returns" fact. But you have to pick a point in time where a sale is an atomic entity and then treat any new data as a new fact.
just my two cents worth.....
Mike
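
As a rough illustration of the type-2 SALES_ORDER dimension Mike describes (all names and types below are hypothetical, sketched in Oracle DDL):

    CREATE TABLE sales_order_dim (
        order_wid        NUMBER        PRIMARY KEY,  -- surrogate key
        order_id         NUMBER        NOT NULL,     -- natural / business key
        order_status     VARCHAR2(20),               -- e.g. OPEN, REVISED, FULFILLED
        order_total      NUMBER(12,2),
        effective_date   DATE          NOT NULL,
        expiration_date  DATE          NOT NULL,     -- e.g. 9999-12-31 for the current row
        current_flag     CHAR(1)       DEFAULT 'Y'   -- convenience flag for the current version
    );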

Similar Messages

  • Update data automatically in fact table in Data Warehouse

    Hi,
    I'm working on the creation of a data warehouse that includes different data sources like SQL Server performance (more than one), Active Directory users, server performance (more than one), and Exchange server mailboxes. The problem is that performance data changes
    frequently (like CPU and memory), so my question is how to update data in the fact table every 5 seconds automatically with SSIS.
    Thank you for any advice  

    I'm assuming you have already figured out how to capture the data (e.g. PowerShell, extended events, MDW, etc.) and just need to know what dimensions or fact tables you need.
    You need to decide how often you are going to capture this data, and based on that you will have dimensions with the appropriate grain. Don't try to cram everything into the same fact table if it is not of the same granularity. Also, separate processes usually have separate fact tables.
    In addition to the Date dimension, you will need a Time dimension with a grain of 1 second (or maybe 5 seconds if that is when you get your data); then run the SSIS package every 5 seconds to capture and append that data to the fact table.
    - Aalamjeet Rangi
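
    A sketch of generating such a seconds-grain time dimension in T-SQL (the table and column names are illustrative):

        -- One row per second of the day (86,400 rows); TimeKey = seconds since midnight
        CREATE TABLE dbo.DimTime (
            TimeKey  INT PRIMARY KEY,
            [Hour]   TINYINT NOT NULL,
            [Minute] TINYINT NOT NULL,
            [Second] TINYINT NOT NULL
        );

        WITH Seconds AS (
            SELECT TOP (86400)
                   ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS s
            FROM sys.all_objects a CROSS JOIN sys.all_objects b
        )
        INSERT INTO dbo.DimTime (TimeKey, [Hour], [Minute], [Second])
        SELECT s, s / 3600, (s % 3600) / 60, s % 60
        FROM Seconds;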

  • How to update a fact table when a dimension table is reloaded

    We have implemented BI Apps 7.9.6. Insertion into the W_EMPLOYEE_D table, which stores all the employee information, had stopped a year back because a company security policy restricted the Informatica workflows from picking up the data (PER_ALL_PEOPLE_F is an HRMS table containing sensitive information like SSN and salary; it was inaccessible to the user which Informatica uses, and the SDE mapping used to return 0 rows).
    Now we have the approval to see those rows, and the dimension table is loaded with some 100 new employees who joined in the last year.
    The ROW_WID of W_EMPLOYEE_D is referenced in a lot of fact tables, and for all those missing employees the WID in the fact table is 0.
    Now that we have all employees, how do we make the fact tables point to the correct WID and not store 0? Has anyone faced this problem before? Writing an update statement will be a tedious task as there are so many fact tables that join to W_EMPLOYEE_D. Also, our company uses the Sales, Procurement, and Finance modules of BI Apps (which constitutes at least 20 fact tables).
    Any guidance is appreciated. Thanks in advance
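
    One hedged sketch of repointing the orphaned keys, assuming the fact rows still carry a resolvable employee identifier (the EMPLOYEE_INTEGRATION_ID column and the fact table name below are hypothetical; verify against your schema, repeat per fact table, and test on a copy first):

        UPDATE w_sales_invoice_line_f f          -- placeholder fact table name
           SET f.employee_wid =
               (SELECT d.row_wid
                  FROM w_employee_d d
                 WHERE d.integration_id = f.employee_integration_id
                   AND d.current_flg = 'Y')
         WHERE f.employee_wid = 0
           AND EXISTS (SELECT 1
                         FROM w_employee_d d
                        WHERE d.integration_id = f.employee_integration_id
                          AND d.current_flg = 'Y');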

    Hello Kostis,
    thank you for your answer. I don't fully understand you. Can you show me a short example, please? I create alias tables for the time dimension on the Physical Layer - the original table is TimeDayDim and I create aliases TimeDayDim1, TimeDayDim2, TimeDayDim3, TimeDayDim4. Then I create foreign keys Fact.Time1 -> TimeDayDim1, Fact.Time2 -> TimeDayDim2, Fact.Time3 -> TimeDayDim3, Fact.Time4 -> TimeDayDim4. And what now? Do I have to bring these alias tables into the Business Model and create new time dimensions there?
    I need ONE Time dimension in Answers. I think I must split my fact table into four tables (time1, place1 ...), (time2, place2 ...), (time3, place3 ...), (time4, place4 ...) and then link those tables to the Time dimension (but I don't know where I should split those tables - on the Physical Layer or in the Business Layer).
    I suppose that I will then have one time dimension and four fact tables in Answers and will be able to query them (for example: Time.Days, Fact1.Place1, Fact3.Speed, Fact4.Count; criteria: Time.Year = 2008).
    Best Regards Vlada

  • Design patterns for updating a fact table

    I have a fact table and about 10 dimensions.
    One of these dimensions can have missing values, so the fact table row will link to an UNKNOWN value.
    When the correct value is finally entered in the dimension table I want to update any associated fact rows.
    What's the most efficient way of doing this?

    I know I have to use a lookup transformation ;-)
    I wouldn't be at the stage of even having a fact table if I didn't know that! I was looking for a design pattern, not the name of a shape!
    The solution I went with was to take a hard line on any rows with unknown values. If, when importing the data, there are unknown values for two of the most important dimensions, those rows are not inserted into the fact table but instead pushed to an ErrorLog table.
    Users run a report that shows what this table contains and if they really want those rows, they insert the correct dimension values and rerun the import, which will only import any rows not already in the fact table.
    This way:
    1. All sorting and filtering issues are resolved, as there will be no unknown rows for the most important dimensions used in sorting and filtering.
    2. We can quickly see any rows with unknown values and figure out what's wrong. It's always missing reference data that the client didn't think to give us. A quick insert of the dimension data and an import, and the rows get imported.
    thanks for the replies.
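
    A simplified sketch of that reject-and-retry flow in T-SQL (all table and column names below are hypothetical):

        -- Rows whose lookups fail for the critical dimensions are diverted
        -- to the error log instead of the fact table.
        INSERT INTO dbo.ErrorLog (SourceTxnID, CustomerCode, ProductCode, Amount)
        SELECT s.SourceTxnID, s.CustomerCode, s.ProductCode, s.Amount
        FROM dbo.Staging s
        LEFT JOIN dbo.DimCustomer c ON c.CustomerCode = s.CustomerCode
        LEFT JOIN dbo.DimProduct  p ON p.ProductCode  = s.ProductCode
        WHERE c.CustomerKey IS NULL OR p.ProductKey IS NULL;

        -- Fully resolved rows load into the fact; the NOT EXISTS guard makes
        -- the re-run after the dimensions are fixed idempotent.
        INSERT INTO dbo.FactSales (SourceTxnID, CustomerKey, ProductKey, Amount)
        SELECT s.SourceTxnID, c.CustomerKey, p.ProductKey, s.Amount
        FROM dbo.Staging s
        JOIN dbo.DimCustomer c ON c.CustomerCode = s.CustomerCode
        JOIN dbo.DimProduct  p ON p.ProductCode  = s.ProductCode
        WHERE NOT EXISTS (SELECT 1 FROM dbo.FactSales f
                          WHERE f.SourceTxnID = s.SourceTxnID);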

  • How do you handle update and delete rules for fact tables?

    I have a fact table with a composite key of 5 columns. Two of the columns are FKs to the date dimension. I was setting the delete/update rules for the FK relationships in SSMS, and it had a problem with me creating a cascade action on the FKs that connect to the date dimension.
    What is the proper way to set up FK relationships in fact tables with SSMS when you have composite keys, as most fact tables do?

    Yeah, I understand all that. What I'm trying to do is protect my database from RI violations that occur when production support people blow away stuff in a dimension table but forget to blow away the related records in the fact table. I want those fact records deleted automatically so we don't have orphan records, which was a real issue at a previous engagement. Production support is usually just people who know SQL and some relational modeling. It's not too likely they will understand the details of dimensional modeling well enough to know that they have to blow away the fact records first.
    My problem is that I have FKs to a role-playing dimension (the date dimension in this case). So basically I have two columns in the fact table that have a FK relationship to the PK of the date dimension. When I create both relationships in SSMS and try to have both of them cascade delete, SSMS has an issue with it.
    The error I get is:
    Unable to create relationship '[relationship name]'
    Introducing Foreign Key constraint '[constraint name]' on table '[table name]' may cause cycles or multiple cascade paths. Specify ON DELETE NO ACTION or ON UPDATE NO ACTION, or modify other foreign key constraints.
    I can go ahead and put NO ACTION and the table will save fine. The question then becomes how the cascade delete actually works. Can I just set one part of the key to cascade delete?
    Actually, I just realized that this is an even bigger design issue. What DOES happen to a fact record when one of its dimensions gets deleted and I've got full RI set up on the table?
    Or am I thinking about this totally wrong? Do you set up cascade deletes in a dimensional model at all? Is there a way to prevent deletes from the dimension table if there are related fact records?
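
    One common workaround, sketched here in T-SQL with hypothetical names: leave both FKs as NO ACTION (so ad-hoc dimension deletes are blocked by the constraint) and give production support a procedure that removes the fact rows first:

        CREATE PROCEDURE dbo.DeleteDateMember @DateKey INT
        AS
        BEGIN
            SET NOCOUNT ON;
            BEGIN TRANSACTION;

            -- Remove fact rows that reference the date through either role
            DELETE FROM dbo.FactSales
            WHERE OrderDateKey = @DateKey OR ShipDateKey = @DateKey;

            -- The dimension row can now go without violating either FK
            DELETE FROM dbo.DimDate WHERE DateKey = @DateKey;

            COMMIT TRANSACTION;
        END;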

  • Update DIM data in Fact Table

    All -
    I have TSR dimension data (1 of 12 dimensional attributes) in our forecast fact table which gets updated monthly in our source system, and we have to reflect this in our BPC system. What is the best method of updating this data in BPC?
    1. Can I update the TSR directly in the forecast fact table and reprocess the application to reflect this in the cube?
    2. Do I have to negate that entire fact table row with the old TSR and insert a new row with the new TSR and process the cube?
    3. Any other method?
    Thanks for your help.

    You need to explain your requirement a bit more. Why do you want to add a type or rejection reason to the fact table?
    In general you should avoid adding textual data to a fact.
    Fact tables are only supposed to store keys and measures. A fact can hold textual data in the case of a degenerate dimension (i.e. when the dimension would have the same number of records as the fact).
    Type or rejection reason are going to have far fewer records. You can create a new dimension for them, holding probably 20 or 30 records.
    The disadvantage of adding textual data to a fact is that once the fact has millions of records, fact table updates will take a huge amount of time and your reports will also be slow.
    If you have a rejection reason dimension with some 20 records, you can use the rejection reason in prompts.
    But if the same rejection reason comes from the fact table, your prompts' performance is going to be very slow, since they have to fetch 20 distinct values from millions of records.
    Consider these points before adding textual data to facts.
    Thanks
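
    A sketch of the small dimension suggested above, instead of storing text in the fact (all names are hypothetical):

        CREATE TABLE DimRejectionReason (
            RejectionReasonKey INT PRIMARY KEY,
            RejectionReason    VARCHAR(100) NOT NULL
        );

        -- The fact stores only the key; prompts and filters then hit the
        -- 20-30 row dimension instead of scanning millions of fact rows.
        ALTER TABLE FactForecast
            ADD RejectionReasonKey INT
                REFERENCES DimRejectionReason (RejectionReasonKey);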

  • How to refer to fact table to update the same fact table

    Here is my scenario: the fact table has measures basic_sal, tax, and net_sal, with a dimension key to the employee dimension.
    I need to update net_sal = basic_sal - tax in the fact table.
    I tried to create a map with the fact table as both source and target, but after deploying the map it does not update net_sal on the existing rows; it inserts new rows with only the dimension key and net_sal columns, so the fact table now has double the existing rows.
    Thanks,
    Srini.

    Hello Srini,
    In our experience, updating a table while using that same table as the source often takes a lot of time, especially when the table contains a lot of data (which is usually the case with fact tables).
    You can use two solutions. Expand the mapping you use to fill the fact table - the calculation does not look that complex to me, but I could be wrong.
    The other solution is to create a temp table where you store all the salary types and load the fact table from it. The disadvantage of this solution is the maintenance of the temp table.
    Regards,
    Moscowic
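
    For a derived measure like this, a plain set-based update of the fact (rather than re-running the mapping) may be all that is needed; a sketch using the column names from the post, with the fact table name a placeholder:

        -- One-off correction: recompute net_sal in place for every existing row
        UPDATE fact_table
           SET net_sal = basic_sal - tax
         WHERE net_sal IS NULL
            OR net_sal <> basic_sal - tax;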

  • Logical level for logical fact table sources

    It is clear that for fact aggregates we should use the Content tab of the Logical Table Source dialog to assign the correct logical level to each dimension.
    The question is: is it mandatory to assign the logical level for each dimension even for non-aggregate fact tables (where it should normally be set to the most detailed level of each dimension)? Are there any known issues if the logical levels in the Content tab are not set?
    The reason I'm asking is a strange bug I have (I'm not going to discuss it here), where the only workaround seems to be NOT setting the logical levels (on the Content tab) for the logical fact table sources.
    thank you !

    If levels are not set, they are by default considered to be at the lowest level, so it should not matter whether you set them or not.
    Generally we set them explicitly for facts when we are using aggregate tables.
    Your current issue might be a case-by-case thing; I would suggest checking for an implicit fact, or any table mapped into the source to force a join, etc.
    Mark if this helps, and let me know how it goes.
    Any updates on this?

  • No Message: Write to Fact table.

    Hi ALL,
    Source: ECC 6
    Target: BI 7.3
    We are transferring the 2LIS_13_VDITM DataSource to the 0SD_C03 InfoCube.
    After data replication:
    1. Data is transferred to the PSA.
    2. During transformation creation, manual mapping is performed and activated.
    3. During DTP creation only the following warning messages occur, and the status does not turn green.
    Data is not arriving in the cube, and there are no error messages (in total 29,000 records have to be transferred to the BI cube).
    The warning messages are:
    1. No Message: Write to Fact table.
    2. No Message: InfoCube Update Completed.
    What is the problem?

    Hi,
    Have you set the industry sector before uploading the setup tables?
    For more information refer to Note 353042:
    Summary
    Symptom
    Fields BWGEO, BWGEOO, BWGVP, BWGVO, BWNETWR, BWMNG, etc. of DataSources 2LIS_02_SCL, 2LIS_02_ITM, 2LIS_03_BF, 2LIS_03_UM, 2LIS_40_REVAL are not filled.
    This may lead to the following:
    The system does not perform any update into an InfoCube (for example: 0RT_C*, 0PUR_C01, 0CP_PURC1 and so on), even though data arrives in BW.
    This occurs with the following InfoSources:
    2LIS_02_SCL, 2LIS_02_ITM
    2LIS_03_BF, 2LIS_03_UM
    2LIS_40_REVAL
    With some restriction, this symptom also occurs with the following InfoSources if they are used in connection with retail or consumer products. (InfoCube: 0RT_* or 0CP_* ).
    2LIS_11_VAITM, 2LIS_12_VCITM, 2LIS_13_VDITM
    Other terms
    0PROCESSKEY, PROCESSKEY, 0RT_C01, 0RT_C02, 0RT_C03, 0RT_C04, BWBRTWR, BWGEO, BWGEOO, BWGVP, BWGVO, BWNETWR, BWMNG
    Reason and Prerequisites
    The process key (0PROCESSKEY and 0BWAPPLNM) of the InfoSources has not been filled. As a result, no key figures are updated because of the update routine of the participating InfoCube and along with it no records are inserted into the InfoCube. In each update routine, the system checks the content of the PROCESSKEY. If this field has no contents, then no data is written into the InfoCube because of the IF condition in the update rules.
    Solution
    So that you can work in the above mentioned InfoSources, you MUST activate the determination of the process key. This is done with the help of Transaction MCB_ which you can find in the OLTP IMG for BW (Transaction SBIW) in your attached R/3 source system.
    Here you can choose your industry sector. 'Standard' and 'Consumer products' are for R/3 standard customers, whereas 'Retail' is intended for customers with R/3 Retail only.
    You can display the characteristics of the process key (R/3 field BWVORG, BW field 0PROCESSKEY) by using Transaction MCB0.
    If you have already set up historical data (for example, for testing purposes) using the setup transactions (statistical setup programs) into the provided setup tables (for example: MC02M_0SCLSETUP, MC03BF0SETUP), you unfortunately have to delete this data (Tx LBWG). After you have chosen the industry sector using MCB_, perform the setup again so that the system fills a valid transaction key for each data record generated. Then load this data into your connected BW using 'Full update' or 'Initialization of the delta process'. Check whether the system now updates data into the involved InfoCubes.
    If all this is not successful, please see Note 315880, and set the application indicator 'BW' to active using Transaction 'BF11'.
    Related notes:
    157317 -> You MUST make sure that this note is relevant for you.
    352344 -> Process key + reversals in Inventory Management (consulting note).
    Regards,
    Anil Kumar Sharma .P

  • Key Figure units in Fact Table - Error

    All -
    When I run a report off of a cube, some rows display 0 when there are corresponding values in my cube. The report doesn't agree with LISTCUBE. I have run transaction RSRV on my cube and tested "Key figure units in fact tables of InfoCube", and I get an error saying that 1380 units are missing from the fact table.
    Diagnosis
    In the fact table /BIC/FEU_FRCTS, records have been found that contain values other than zero for key figures that have units, but that have no value for the unit of the key figure. Since the value of the unit has to correspond to the value of the key figure, this indicates an error when the data was loaded. The values of the units have not been loaded into BW correctly. Choose Details to display the incorrect records.
    Does anyone know what this error means? How do I solve this problem?
    Thanks,
    Nyrvole

    hi Nyrvole,
    As the message says, you have key figures with units but the unit value is not filled. Click 'Details' as suggested to check which key figure(s) are involved, then go to RSD2, enter that key figure and see which unit InfoObject is used; then check in the transfer/update rules how this unit InfoObject is mapped, correct the values and upload again.
    There is a 'repair' option in RSRV, but I think in this case it can't fix the error; still, give it a try.
    hope this helps.

  • Can't load data to cubes - same error for all DTPs  "exception in substep write to fact table"

    Hi Experts,
    This initial DTP error is:
    Data package processing terminated Message no. RSBK229
    Error while updating to target <cubename> (type INFOCUBE) Message no. RSBK241
    If I drill down to the specific package in the process monitor the error is:  "exception in substep write to fact table" .
    No more details for why the error is being generated.
    SAP BW 307 service pack 8.
    I have run the debugger without success; an ABAPer also ran debug mode to get more details on the error, without success.
    We checked the termination point and the ABAP stack.
    No error stack is available
    No short dumps
    I have reactivated the DTP
    The data looks fine
    The server crashed the day before the DTPs started to fail
    One cube is standard content and the other is a Z version of the same cube.
    Please let me know if anyone has resolved this issue before and what steps were necessary.

    Hello Russell,
    We ran into a similar situation and later found that it was a data issue only.
    We tried to enable the error stack, but as the data issue was due to a lookup bringing in invalid data, it was not captured in the error stack and instead threw a short dump with the message you mentioned.
    So my suggestion: put a breakpoint at any lookups and check the data in the internal tables to see if everything is as expected.
    Hope this helps.
    Thanks,
    Venkata Naresh

  • Can't update master table when creating a materialized view log.

    Hi all,
    I am facing a very strange issue when trying to update a table on which I have created a materialized view log (to enable downstream fast refresh of MV's). The database I am working on is 10.2.0.4. Here is my issue:
    1. I can successfully update (via MERGE) a dimension table, call it TABLEA, with 100k updates. However, when I create a materialized view log on TABLEA, the merge statement hangs (I killed the query after leaving it to run for 8 hrs!). TABLEA has 11m records and has a number of indexes (bitmap and b-tree) and constraints on it.
    2. I then created a copy of TABLEA, call it TABLEB, and re-created all the indexes and constraints that exist on TABLEA. I created a materialized view log on TABLEB and ran the same update... the merge completed in under 5 min!
    The only difference between TABLEA and TABLEB is that the dimension TABLEA is referenced by a number of fact tables (by FKs on the facts); however, this surely should not cause a problem. I don't understand why the merge on TABLEA is not completing, even though it works fine on its copy TABLEB. I have tried rebuilding the indexes on TABLEA but this did not work.
    Any help or ideas on this would be most appreciated.
    Kind Regards
    Mitesh
    email: [email protected]

    That's what I thought: the MVL will only record data that has changed since it was created, and won't have the option to load in all the data as though it existed before the table was created.
    From what I have read, the MVL is quicker than a trigger, and I have some free code that proved to work from a MVL, using it as a reference to know which records to update. There is not that much to a MVL: a record ID and the type of update - new, update or delete.
    I think what I will have to do is work on the same principle as the MVL but use a trigger, as this way we can do a full reload if required at any point.
    Many thanks for your help.

  • Fact Tables

    Hi!
    I needed help understanding how data is stored in the FACT tables.
    I am currently facing an issue after running SPRUNCONVERSION. The stored procedure generates 4 records in the fact tables and thereby eliminates any converted values. For example, if a liability account 1111001 is loaded with a value of -5000 in LC (local currency), running SPRUNCONVERSION with a translation rate of, say, 2 generates 4 records in the FACTWB table:
    -10000 in USD, 
    10000 in USD,
    -10000 in USD and another +10000 in USD at the same account and at the same crossings.
    This in effect shows no converted data in USD and shows a 0 value.
    Any guidance to understand how these records are generated and stored will be of great help.
    Regards,
    Pankti Shroff

    Hi
    When you save an application, OutlookSoft Admin rebuilds the cube (application) in Analysis Services and recreates the following SQL items on the application's WB, FAC2, and FACT tables:
    1. Dependencies
    2. Indexes
    3. Constraints
    4. Stored procedures
    The SOURCE field of the application's WB, FAC2, and FACT tables is used for optimization purposes only. The 'updating source field' step can vary in time depending on the record count and the default value of the SOURCE field. This process takes more time when performing a full optimization because it queries the WB, FAC2, and FACT tables.
    Hope this helps...
    Regards
    SN...

  • Querying against two fact tables with non conformed dimension

    I have two fact tables and I have this set up in the RPD:
    Fact1 joined to DimA and DimB
    Fact2 joined to DimA
    On the front end I build two analyses:
    Analysis 1:
    DimA.A, Fact1.1, Fact2.2
    Analysis 2 :
    DimA.A, DimB.B, Fact1.1, Fact2.2
    In the results of Analysis 1, I am seeing correct values for Fact2.2.
    In the results of Analysis 2, I am seeing Fact2.2 as an empty column. I think the reason is that Fact2 is not joined to DimB.
    Should I be able to report against both dimension tables' (DimA and DimB) columns for Fact1 and Fact2 measures, even though I don't have a join between DimB and Fact2?
    Any response would be helpful!!!
    Regards,
    Annu

    Hi,
    Go to the LTS Content level of the fact that does not have a join with the dimension and set the Total level for that dimension, and also the Total level on the measure column (double-click -> Levels). This assumes the dimension hierarchy is already set up.
    Pull everything (D1, D2, F1, F2) and you will see results.
    Update Me
    Thanks
    NK

  • Duplicate records in Fact Tables

    Hi,
    We are using BPC 7.0 MS SP7. BPC created duplicate records in the WB and FAC2 tables. We faced a similar issue before, and the solution was to reboot the server and clean up the additional data created. I don't think it is an issue with the script logic files we have. We had the issue across all applications. Data is fine now after the server reboot and running the same logic files. I want to know if anyone has faced this issue and if there is any solution other than a reboot. I appreciate your help.
    Thanks
    Raj

    Hi Sorin,
    I know this thread is rather old, but I have a problem which is pretty much related to it and would appreciate it if you could assist me. I have a client running on 7.0 MS who has been using it for the past 3 years.
    It is a heavily customized system with many batch files running daily to update dimensions, copy data and sort. And yes, we do use custom packages that incorporate stored procedures.
    Recently, with no change in the environment, our FACTWB started ballooning up out of nowhere. The fact table only contains less than 1 GB of data, but FACTWB has 200 GB, and it has practically paralyzed the system. There is also an equivalent 300 GB increase in the log files.
    We are not able to find out what caused this, or whether the 200 GB of records in WB are even valid records that are duplicated. Is there a way to troubleshoot this?
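
    As a starting point for troubleshooting, a hedged T-SQL sketch that counts exact duplicates in a BPC write-back table (the table and column names below are placeholders; BPC MS fact tables are application-specific, named tblFactWB<Application>):

        -- Groups of rows sharing every dimension member and the signed value,
        -- i.e. exact duplicates that should not normally exist in FACTWB
        SELECT [Account], [Entity], [Time], [Category], SIGNEDDATA,
               COUNT(*) AS DuplicateCount
        FROM dbo.tblFactWBFinance          -- placeholder WB table name
        GROUP BY [Account], [Entity], [Time], [Category], SIGNEDDATA
        HAVING COUNT(*) > 1
        ORDER BY DuplicateCount DESC;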
