Snapshot of Fact Table

I'm working on a snapshot of a fact table. The snapshot's purpose is to record changes to columns x, y, and z. If any of x, y, or z changes in the fact table, a new row should be added to the snapshot, and flag columns for the three columns should read Y or N (depending on which of the three columns changed).
My approach was to simply compare columns x, y, and z between the extract date and the date from the day before (it could be minutes or seconds before; I just want a variance in time).
My thought was to use a CASE statement in the source query:
SELECT CASE WHEN [x] <> (SELECT [x] FROM dw.dbo.FCT_TABLE WHERE LAST_MOD_DATE = DATEADD(DAY,-1,[LAST_MOD_DATE])) then 'Y' else 'N' end as FLAG_X 
But I don't seem to be getting the correct results. Any suggestions? Ultimately, I will apply this logic in SSIS.

This WHERE clause
   WHERE LAST_MOD_DATE = DATEADD(DAY,-1,[LAST_MOD_DATE])
will of course never match any rows.
Presumably you mean LAST_MOD_DATE from two different rows of the table, but you need to tell the computer that. Computers are not very smart.
Also, what data type is LAST_MOD_DATE? If it's datetime or datetime2 and includes a time portion, your chances of a hit are slim.
If x can be NULL, the logic needs to be more sophisticated.
You could of course also consider Change Data Capture (if you are on Enterprise Edition) or Change Tracking (any edition), but that may be too heavy artillery if this is your only snapshot.
Erland Sommarskog, SQL Server MVP, [email protected]
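For illustration, a minimal sketch of the previous-row comparison discussed above, using a window function rather than a correlated subquery. FCT_KEY is a hypothetical key column standing in for the real grain, LAG requires SQL Server 2012 or later, and NULLs in x, y, or z would still need the extra handling mentioned above (a comparison against NULL falls through to 'N'):

SELECT FCT_KEY,
       LAST_MOD_DATE,
       -- 'Y' when the value differs from the previous row for the same key
       CASE WHEN [x] <> LAG([x]) OVER (PARTITION BY FCT_KEY ORDER BY LAST_MOD_DATE)
            THEN 'Y' ELSE 'N' END AS FLAG_X,
       CASE WHEN [y] <> LAG([y]) OVER (PARTITION BY FCT_KEY ORDER BY LAST_MOD_DATE)
            THEN 'Y' ELSE 'N' END AS FLAG_Y,
       CASE WHEN [z] <> LAG([z]) OVER (PARTITION BY FCT_KEY ORDER BY LAST_MOD_DATE)
            THEN 'Y' ELSE 'N' END AS FLAG_Z
FROM   dw.dbo.FCT_TABLE;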

Similar Messages

  • Snapshot Dimension & Fact tables

    We are currently designing a logical multidimensional model from OLTP tables. Dimension tables have monthly snapshots because some or all of the attributes might change on a monthly basis. Likewise, the fact table has a monthly snapshot.
    I know that according to Kimball's modeling, the attributes for dimensions should be implemented using mini-dimensions with the combination key as a foreign key in the fact table, but this step needs an ETL job to handle the mini-dimension and another fact table. However, in our situation, and because of scope limitations, there is no time to design a separate ETL process to handle a mini-dimension.
    An example for our records:
    Customer: Cust_ID, Month_ID, Attr1, Attr2, Attr3, Attr20
    Installments: Installment_ID, Cust_ID, Month_ID, Attr1, Attr2, Attr3, Attr5, Measure1, Measure2, Measure3, Measure4
    So, the Installments table contains attributes as well as measures. What is the suitable approach and the suitable OBIEE logical BM design: should we consider the Installments table as a fact, or should we divide it into dimension and fact tables?

    Xerox wrote:
    So, the Installments table contains attributes as well as measures. What is the suitable approach and the suitable OBIEE logical BM design: should we consider the Installments table as a fact, or should we divide it into dimension and fact tables?
    You already gave the answer yourself: you should create an Installment dimension and an Installment fact table, using the same physical table as the logical table source. The logical dimension should only contain the attributes, and the logical fact should only contain the measures.

  • Snapshot / Trend Fact Design

    I have a fact table that holds the file sizes of all the files in the organisation, and it has 10 billion rows. Now we have a requirement to see the trend of these files' growth over a period of time.
    If I use a snapshot date and use this to link a dimension with snapshots (monthly or yearly), that will solve my problem of seeing the trend, but it means the number of rows in the fact will be multiplied by the number of snapshots, which will lead to performance issues and is not feasible.
    Is there any alternative solution to this in terms of design and reporting? Please help.
    File Fact : 10 Billion rows
        ID  |  FileName  |  Size(GB)  |  Snapshot Date

    Hi, if the data loaded in the data mart only has percentages and not their constituents, then they cannot be aggregated to any levels other than the ones at which they are loaded. The only way is to ensure that the data loaded consists of the constituents, as shown in the example.
    Suppose a user needs to see Gross Profitability, which is (Revenue - Cost) / Revenue.
    I should not create a measure called Gross Profitability if I require reports of gross profitability in different contexts. Rather, I should create Revenue and Cost as my measures so that I can calculate Gross Profitability at any level required.
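    For illustration, a minimal sketch of that idea with hypothetical table and column names: keep Revenue and Cost as additive measures in the fact and derive the ratio at whatever level the report needs, instead of storing a pre-computed percentage.
    -- Gross profitability derived at month level from additive constituents
    SELECT d.DATE_YEAR,
           d.DATE_MONTH,
           (SUM(f.REVENUE) - SUM(f.COST)) / NULLIF(SUM(f.REVENUE), 0) AS GROSS_PROFITABILITY
    FROM   SALES_FACT f
    JOIN   DIM_DATE d ON d.DATE_KEY = f.DATE_KEY
    GROUP BY d.DATE_YEAR, d.DATE_MONTH;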

  • Updating fact table

    Hi,
    I'm just starting with data warehousing. I need to design a warehouse that will record sales on a daily basis. This seems to be a pretty standard task. However, in my case an order may change many times before it is fulfilled. I'm planning to handle it similarly to slowly changing dimensions type 2, by adding effective date/expiration date (ref. to date dimension) to the fact table. So when an order is updated I would mark the current record as expired and add a new one. I cannot just replace or remove the previous record, since that would make historical information incorrect. I also don't really like the idea of storing, let's say, order history and periodic snapshots separately - it seems overly complicated.
    I was thinking about partitioning the fact table so that only records in the most current partition would be updated. In addition I would create a view for users (where "Expiration Date" = 'N/A') to work with the current information.
    I'm sure this is a common situation in data warehousing, but I could not find any useful information on dealing with it. Am I on the right track? Does OWB support such functionality (find existing facts by criteria -> update them -> load new records) well?
    Thanks.
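    For illustration, a minimal sketch of the expire-and-insert pattern described above, with hypothetical table and column names and a far-future sentinel date standing in for the 'N/A' member of the date dimension:
    -- Expire the current version of the order
    UPDATE ORDER_FACT
    SET    EXPIRATION_DATE = CURRENT_DATE
    WHERE  ORDER_ID = :order_id
    AND    EXPIRATION_DATE = DATE '9999-12-31';
    -- Insert the new version as the current record
    INSERT INTO ORDER_FACT (ORDER_ID, ORDER_TOTAL, EFFECTIVE_DATE, EXPIRATION_DATE)
    VALUES (:order_id, :new_total, CURRENT_DATE, DATE '9999-12-31');
    -- View exposing only the current records to users
    CREATE OR REPLACE VIEW ORDER_FACT_CURRENT AS
    SELECT * FROM ORDER_FACT WHERE EXPIRATION_DATE = DATE '9999-12-31';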

    Seems like an odd situation. I mean, even if an order is revised several times, does it get included in a daily sales total until it is complete? I think not.
    You seem to want to be able to report on a daily order-book total, which is completely different from sales. Sales are generally defined as an atomic, completed transaction. So I would tend to do this via a SALES_ORDER type-2 dimension which includes order status and order_total. If you need line-item totals then you might want to look at a many-to-many relationship to hold those details.
    The fact table would then just hold the associated details of completed sales (do you need to report sales by payment type/ delivery method / sales rep / cashier / location / etc? If so those are other FKs to dimensions).
    I suggest this route as the current value of the order book is not summable - it is a point in time look at orders on hand. Sales, on the other hand, are summable so can be aggregated / averaged / whatever over a given time period. You just can't do that with the current value of the order book which suggests that it is NOT best served by being stored in a fact.
    I think that this has to be the route to go if you want to be able to sum totals over time. You may have sales. You may need an associated
    "returns" fact. But you have to pick a point in time where a sale is an atomic entity and then treat any new data as a new fact.
    just my two cents worth.....
    Mike
    Edited by: zeppo on Sep 11, 2009 12:04 PM

  • Logical level in Fact tables - best practice

    Hi all,
    I am currently working on a complex OBIEE project/solution where I am going straight to the production tables, so the fact (and dimension) tables are pretty complex, since I am using more sources in the logical tables to increase performance. Anyway, what I often struggle with is the Logical Levels (in the Content tab), where the level of each dimension is to be set. In a star schema (one-to-many) this is pretty straightforward and easy to set up, but when the Business Model (and physical model) gets more complex I sometimes struggle with the aggregates - getting them to work/appear with different dimensions. (Using the menu "More" - "Get Levels" does not always give the best solution... far from it.) I have some combinations of left and right outer joins as well, making it even more complicated for the BI server.
    For instance - I have about 10-12 different dimensions - should all of them always be connected to each fact table? Either on Detail or Total level. I can see the use of the logical levels when using aggregate fact tables (on quarter, month etc.), but is it better just to skip the logical level setup when no aggregate tables are used? Sometimes it seems like that is the easiest approach...
    Does anyone have a best practice concerning this issue? I have googled for this but I haven't found anything good yet. Any ideas/articles are highly appreciated.

    Hi User,
    For instance - I have about 10-12 different dimensions - should all of them always be connected to each fact table? Either on Detail or Total level.
    It is not necessary to connect to all dimensions; it depends on the report that you are creating. But as a best practice we should maintain all of them at the Detail level when you mention any join conditions in the physical layer.
    For example, for the sales table, if you want to report at the ProductDimension.Productname level then you should use the Detail level, else the Total level (at Product, Employee level).
    Get Levels. (Available only for fact tables) Changes aggregation content. If joins do not exist between fact table sources and dimension table sources (for example, if the same physical table is in both sources), the aggregation content determined by the administration tool will not include the aggregation content of this dimension.
    Source admin guide(get level definition)
    thanks,
    Saichand.v

  • Parent member values in Fact tables

    Hello,
    I want to understand something: as far as I know, we can only send data to base-level members, right?
    Then how come we find rows of data that have parent member values in the fact tables (assuming we do not play manually with the database, of course)? I thought that this could be due to an import with the Data Manager; can this be right?

    nilanjan chatterjee wrote:
    Hi,
    >
    > The data for the parent members should be available in the SQL tables.
    > For example, 2011.TOTAL is a parent member. You should not have any data for this member in your database. If it is there, it might have come in somehow (maybe via an import). But this is not right. You might want to remove these records. But be sure that you don't delete the records for the base-level members.
    >
    > Hope this helps.
    I guess you meant should not, right ?
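    For illustration, a minimal sketch of that cleanup with hypothetical table and column names (check how parent members are flagged in your own dimension tables before deleting anything):
    -- Remove fact rows posted against parent (calculated) members,
    -- while keeping the rows for base-level members
    DELETE FROM FACT_FINANCE
    WHERE  TIME_MEMBER IN (SELECT MEMBER_ID
                           FROM   DIM_TIME
                           WHERE  CALC_FLAG = 'Y');  -- 'Y' marks parent members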

  • Null key in fact tables

    Hi all,
    I have one role-playing dimension with some NULL keys in the fact table. I'd like to know if that is good practice?
    thanks

    It depends on what you actually mean.
    When a dimension key column contains NULL in your fact table, that is normally bad practice. Create an entry in your dimension table to represent this state.
    The problem is that NULL is a state which denotes missing or inapplicable information. This means that the row in the fact table is semantically meaningless, because your fact table is no longer additive over this dimension.
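    For illustration, a minimal sketch of that practice with hypothetical table and column names: add an explicit "Unknown" member to the dimension and map NULL keys to it during the fact load, so the fact table stays additive over that dimension.
    -- Dedicated member representing missing/inapplicable values
    INSERT INTO DIM_CUSTOMER (CUSTOMER_KEY, CUSTOMER_NAME)
    VALUES (-1, 'Unknown');
    -- During the fact load, replace NULL keys with the Unknown member
    INSERT INTO SALES_FACT (CUSTOMER_KEY, SALES_AMOUNT)
    SELECT COALESCE(s.CUSTOMER_KEY, -1), s.SALES_AMOUNT
    FROM   STG_SALES s;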

  • Help on  Setting logical Levels  in Fact tables and on Dimension tables

    Hi all
    Can anybody provide any blogs or any kind of material on what exactly levelling is?
    Like, after creating the dimensional hierarchies we need to set the logical levels for the LTS of fact tables, right? So what is the difference between setting logical levels on fact tables and setting levelling on dimension tables?
    Any kind of help is appreciated
    Thanks
    Xavier.
    Edited by: Xavier on Aug 4, 2011 10:50 AM

    I have read these blogs, but my question is:
    Setting the logical levels in the LTS of fact tables I understood.
    But we can also set the logical levels for dimensions, right? I didn't understand why we set the logical levels for dimensions. Is there any reason why we go with levelling at the dimensions?
    Thanks
    Xavier
    Edited by: Xavier on Aug 4, 2011 2:03 PM
    Edited by: Xavier on Aug 4, 2011 2:32 PM

  • Logical level for logical fact table sources

    It is clear that for fact aggregates, we should use the Content tab of the Logical Table Source dialog to assign the correct logical level to each dimension.
    The question is: is it mandatory to assign the logical level for each dimension even for non-aggregate fact tables (which normally should be set to the most detailed level of each dimension)? Is there any known issue if the "logical levels" in the Content tab are not set?
    The reason I'm asking is a strange bug I have (I'm not going to discuss it here), and the only workaround seems to be NOT setting the logical levels (on the Content tab) for logical fact table sources.
    thank you !

    If levels are not set, by default the source is considered to be at the lowest (most detailed) level.
    It should not matter whether you set them or not.
    Generally we set them for facts explicitly when we are using aggregate tables.
    Your current issue might be case by case; I would suggest checking the implicit fact column, any table mapped into the source to force a join, etc.
    Mark if this helps.
    Let me know if it helps.
    Edited by: Srini VEERAVALLI on Feb 5, 2013 8:33 AM
    Any updates on this?
    Edited by: Srini VEERAVALLI on Feb 14, 2013 9:09 AM

  • Best way to combine multiple fact tables in single mart

    Hi, quick question that I think I know the answer to, just wanted to bounce it off everyone here to make sure I'm on the right track.
    I have a HR datamart that contains several different fact tables. Some of the facts are additive across time (i.e. compensation - people get paid on different days, when I look at a month I want to see the total of all pay dates within that month). The other type of fact is more "status over a set of time" - i.e. a record saying that I'm employed in job X with a salary of Y from a given start date to a given end date.
    For the "status over time" type facts, if I choose January 2009 (month level) in the time dimension, what I'd really like to see is the fact records that were in place "as of" the last day of the month - i.e. all records where the start date is on or before 1/1/2009, and whose end date is on or after 1/1/2009. Note that my time dimension does go down to the day level (so you could look at a person "as of" the middle of the month, etc. if you're browsing on a day-by-day basis)
    I've set up the join between the time dimension and the fact table as a complex join in the physical layer, with a clause like "DIM_DATE.DATE >= FACT.START_DATE AND DIM_DATE.DATE <= FACT.END_DATE". This seems to work perfectly at the day level - I have no problems at all finding the proper records for a person as of any given day.
    However, I'm not quite sure how to proceed at the month level. My initial thought is:
    a) create a new LTS for the fact table at the month level
    b) in the new LTS, add the join to the time dimension
    c) in the new LTS, add a where clause similar to LAST_DAY_IND = 'Y' (true for the last day of each month).
    Is this the proper way to do this?
    Thanks in advance!
    Scott
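    For illustration, a minimal sketch of the SQL that the month-level idea in a) - c) above amounts to (table and column names are hypothetical; in practice the BI server generates the query from the repository metadata):
    -- Status facts in effect "as of" the last day of each month
    SELECT d.DATE_MONTH,
           f.EMPLOYEE_KEY,
           f.SALARY
    FROM   STATUS_FACT f
    JOIN   DIM_DATE d
      ON   d.DATE_VALUE >= f.START_DATE
     AND   d.DATE_VALUE <= f.END_DATE
    WHERE  d.LAST_DAY_IND = 'Y';   -- keep only month-end snapshot dates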

    Hi Scott,
    I think you're on the right track but I don't think you need the last part. Let me generalize the situation to the following tables
    CREATE TABLE DAILY_FACT (
    DAILY_FACT_KEY NUMBER,    -- PRIMARY KEY
    START_DATE_KEY NUMBER,    -- FOREIGN KEY TO DATE DIMENSION FOR START DATE
    END_DATE_KEY NUMBER,      -- FOREIGN KEY TO DATE DIMENSION FOR END DATE
    DAILY_VALUE NUMBER);      -- FACT MEASURE
    CREATE TABLE MONTHLY_FACT (
    MONTHLY_FACT_KEY NUMBER,  -- PRIMARY KEY
    MONTH_DATE_KEY NUMBER,    -- FOREIGN KEY TO DATE DIMENSION, POPULATED WITH THE KEY OF THE LAST DAY OF THE MONTH
    MONTHLY_VALUE NUMBER);    -- FACT MEASURE AT MONTH LEVEL; DATE KEY IS AT END OF MONTH
    CREATE TABLE DIM_DATE (
    DATE_KEY NUMBER,
    DATE_VALUE DATE,
    DATE_MONTH VARCHAR2(20),
    DATE_YEAR NUMBER(4));
    -- DIM_DATE_END is an alias of DIM_DATE, used for the END_DATE_KEY join
    Step 1)
    Make the following three joins in the physical layer:
    a. DAILY_FACT.START_DATE_KEY = DIM_DATE.DATE_KEY
    b. DAILY_FACT.END_DATE_KEY = DIM_DATE_END.DATE_KEY
    c. MONTHLY_FACT.MONTH_DATE_KEY = DIM_DATE.DATE_KEY
    Note: The MONTHLY_FACT MONTH_DATE_KEY is joined to the same instance of the date dimension as the START_DATE_KEY of the DAILY_FACT table. This is because these are the dates you want to make sure are in the same month.
    Step 2)
    Create a business model and drag DIM_DATE, DAILY_FACT and DIM_DATE_END into it.
    Step 3)
    Drag the physical table MONTHLY_FACT into the logical table source of the logical table DAILY_FACT.
    Step 4)
    Set DAILY_VALUE and MONTHLY_VALUE to be aggregates with a "SUM" aggregation function
    Step 5)
    Drag all required reporting columns to the Presentation layer.
    Step 6)
    Create your report using the two different measures from the different fact tables.
    Step 7)
    Filter the report by the Month that joined to the Start Date/Monthly Date (not the one that joined to the end date).
    Step 8)
    You're done.
    The act of combining the two facts into one logical table allows you to report on them at the same time. The strategy of joining the START_DATE_KEY and the MONTH_DATE_KEY allows you to make sure that the daily measure start date will be in the same month as the monthly fact table.
    Hope that helps!
    -Joe
    Edited by: Joe Bertram on Jan 5, 2010 6:29 PM

  • Content tab for a fact table

    Hi
    Please help me understand the use of the Content tab for a fact table in the OBIEE repository.
    Thanks.

    If you have multiple LTSs then you should set the content level appropriately; otherwise you can get errors during consistency checks. I am not able to find any link which talks only about the content level. See these links and let us know if you have any doubts:
    http://kr.forums.oracle.com/forums/thread.jspa?threadID=604637
    Content tab is also handy when you are using aggregate tables.
    Regards,
    Sandeep

  • Content Tab: None of the fact tables are compatible with the query request

    Hi All,
    One thing I am still not clear on, after all my years with OBIEE, is working with the Content tab in the BMM.
    I have made a rpd the joins in physical layer as shown below:
    https://picasaweb.google.com/114804305606242416264/OBIEEError#5663056545119428530
    And the BMM layer as:
    https://picasaweb.google.com/114804305606242416264/OBIEEError#5663056519553812930
    The error I am getting when I run a request with the 3 columns from the selected 3 tables is:
    Dim - Comment Code Details
    Fact - Complaint
    Dim - Service Details
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 14020] None of the fact tables are compatible with the query request Sr Num:[DAggr(Fact - Complaint.Sr Num by [ Dim - Service Details.Sr Cat Type Cd, Dim - Comment Code Details.Cmtcode name] )]. (HY000).
    I get no error for consistency. I have read everywhere and I know I need to set the appropriate aggregation levels in the various dims' and facts' LTS properties to help OBIEE understand our model, but how do I do that? How do I decide? How should I approach it - what should the aggregation level be, and at what detail?
    When I click the More button I see different options: Copy, Copy From, Get Levels, Check Level - what do these mean?
    Aggregation Content, group by - Logical Level or Column: which one should I choose and how should I decide?
    Can anyone explain the Content tab in detail and from scratch, with some example, and why we get these errors? I know many people who are well versed in many other things related to the RPD, but not this. A little effort in explaining from you guys will really be appreciated.
    Thanks in advance,
    Dev

    Hi Deepak,
    Option 1:
    My tables in physical layer are joined as below:
    D1--> F1 <--D2--> F2 <--D3
    Same way i model it in BMM
    D1--> F1 <-- D2--> F2 <--D3
    Here D1 is a non-conformed dimension for F2 and D3 is a non-conformed dimension for F1. After creating the dimensional hierarchies, I tried setting up the content levels.
    I go to Sources > Content tab of Fact F1 and set:
    Dimensions----------- Logical level
    D1---------------------- D1 Detail
    D2---------------------- D2 Detail
    D3---------------------- D3 Total
    Then, I go to Sources > Content tab of Fact F2 and set:
    Dimensions----------- Logical level
    D1---------------------- D1 Total
    D2---------------------- D2 Detail
    D3---------------------- D3 Detail
    Then, I also go into all the dimensions and set their content levels to Detail, but it still gives me errors; I am not sure where I am going wrong in setting the content levels.
    I need to know whether the way I have modeled it in the BMM is right.
    Option 2:
    I can combine the two facts in a single logical fact, or the above design should also work:
    (F1&F2)<--D1, D2, D3 joined separately using complex logical joins.
    What will the content tab details be?
    Thanks,
    Dev

  • OBIEE 11g - No fact table exists at the requested level of detail

    My dimension tables are snowflaked.
    Table1 has Key, ProductName, ProductSize, Table2Key
    Table2 has Key, ProductDepartment, Table3Key
    Table3 has Key, ProductDivision
    I have created 2 hierarchies (in same dimension Product). Note: ProductSize is in Table1.
    ProductDivision > ProductDepartment > ProductName (shared level)
    ProductSize > ProductName (shared level)
    There are 2 fact tables
    Fact1 is at ProductName level
    Fact2 is at ProductDepartment level
    When I create a request with ProductSize and some measure as columns, and filter it on ProductDepartment, the request fails with the error "No fact table exists at the requested level of detail", but the request could ideally be answered using the fact at the ProductName level.
    I have properly defined logical level keys in the hierarchies and logical level in the LTS (content tab)
    Can anyone point me what I am doing wrong here?

    Since both fact tables are at the same granular level, I would suggest mapping them to each other (Signon_A mapping Signon_B) in the BMM layer logical fact source,
    considering them as a fact with a fact extension.
    BTW: Did you try setting the implicit fact table in the subject area properties?
    Edited by: Srini VEERAVALLI on Feb 1, 2013 9:04 AM

  • Join two fact tables

    Hi,
    I have two fact tables, F1 and F2, joined with a few dimensions D1.....D9. D1, D2 and D3 are conformed dimensions.
    In Physical:
    F1>--D1,D2,D3,D4,D5,D6
    F2>--D1,D2,D3,D7,D8,D9
    In Logical:
    F1>--D1,D2,D3,D4,D5,D6
    F2>--D1,D2,D3,D7,D8,D9
    There are some reports already using the above star schemas. Now I have new requirements and I need to use measures from both fact tables along with conformed and unconformed dimensions.
    I did the following in the logical layer:
    In the F1 LTS General tab I added F2, D1, D2, D3, D7, D8, D9, and in the Content tab set the logical level for unconformed dimensions to the Total level and for conformed dimensions to the Detail level.
    In the F2 LTS General tab I added F1, D1, D2, D3, D7, D8, D9, and in the Content tab set the logical level for unconformed dimensions to the Total level and for conformed dimensions to the Detail level.
    but I'm still getting
    [nQSError: 14026] Unable to navigate requested expression:
    Please fix the metadata consistency warnings. (HY000)
    I checked metadata global consistency and no errors were found.
    Appreciate your help
    Thanks
    Jay.
    Edited by: Jay on Sep 27, 2011 10:14 AM
    Edited by: Jay on Sep 27, 2011 10:15 AM

    Let me explain my issue again
    In Physical:
    F1>--D1,D2,D3,D4,D5,D6
    F2>--D1,D7
    In Logical: Single logical fact Fact_FY_Ratio has two logical sources
    LTS1: F1>--D1,D2,D3,D4,D5,D6
    LTS2: F2>--D1,D7
    Set the content level for each LTS in Fact_FY_Ratio:
    F1 to Detail for D1,D2,D3,D4,D5,D6 and to Total for D7
    F2 to Detail for D1,D7 and to Total for D2,D3,D4,D5,D6
    In the LTS 1 General tab I added inner joins with all dimensions F1, D1, D2, D3, D4, D5, D6, D7 and F2.
    In the LTS 2 General tab I added inner joins with all dimensions F2, D1, D7, D2, D3, D4, D5, D6 and F1.
    I also created logical complex joins between Fact_FY_Ratio and all the logical dimensions D1, D2, D3, D4, D5, D6, D7.
    I'm able to run a query using D1, D2, D3, D4, D5, D7, F1, F2, but when I include D6 (time dimension) it causes a problem with F2 (with F1 it is working fine).
    query success with:
    F1,D1,D2,D3,D4,D5,D6
    F2,D1,D7
    D1,D2,D3,D4,D5,D7,F1,F2
    Query failed with:
    F2>--D1,D7 and D6
    Please let me know if anything is wrong with the above configuration.
    Thanks
    Jay...
    Edited by: Jay on Oct 3, 2011 7:11 AM

  • No Message: Write to Fact table.

    Hi ALL,
    Source: ECC 6
    Target: BI 7.3
    We are transferring the 2LIS_13_VDITM DataSource ---->> 0SD_CO3 InfoCube.
    After data replication:
    1. Data is transferred to the PSA.
    2. During transformation creation, manual mapping is performed and activated.
    3. During DTP creation, only the following warning messages occur; the status is not turned green.
    Data is not reaching the cube, and there are no error messages. (In total, 29000 records should transfer to the BI cube.)
    The warning messages are:
    1. No Message: Write to Fact table.
    2. No Message: InfoCube Update Completed.
    What is the Problem?

    Hi,
    Have you set the Industry sector before filling the setup tables?
    For more information, refer to Note 353042.
    Summary
    Symptom
    Fields BWGEO, BWGEOO, BWGVP, BWGVO, BWNETWR, BWMNG, etc. of DataSources 2LIS_02_SCL, 2LIS_02_ITM, 2LIS_03_BF, 2LIS_03_UM, 2LIS_40_REVAL are not filled.
    This may lead to the following:
    The system does not perform any update into an InfoCube (for example: 0RT_C*, 0PUR_C01, 0CP_PURC1 and so on), even though data arrives in BW.
    This occurs with the following InfoSources:
    2LIS_02_SCL, 2LIS_02_ITM
    2LIS_03_BF, 2LIS_03_UM
    2LIS_40_REVAL
    With some restriction, this symptom also occurs with the following InfoSources if they are used in connection with retail or consumer products. (InfoCube: 0RT_* or 0CP_* ).
    2LIS_11_VAITM, 2LIS_12_VCITM, 2LIS_13_VDITM
    Other terms
    0PROCESSKEY, PROCESSKEY, 0RT_C01, 0RT_C02, 0RT_C03, 0RT_C04, BWBRTWR, BWGEO, BWGEOO, BWGVP, BWGVO, BWNETWR, BWMNG
    Reason and Prerequisites
    The process key (0PROCESSKEY and 0BWAPPLNM) of the InfoSources has not been filled. As a result, no key figures are updated because of the update routine of the participating InfoCube and along with it no records are inserted into the InfoCube. In each update routine, the system checks the content of the PROCESSKEY. If this field has no contents, then no data is written into the InfoCube because of the IF condition in the update rules.
    Solution
    So that you can work in the above mentioned InfoSources, you MUST activate the determination of the process key. This is done with the help of Transaction MCB_ which you can find in the OLTP IMG for BW (Transaction SBIW) in your attached R/3 source system.
    Here you can choose your industry sector. 'Standard' and 'Consumer products' are for R/3 standard customers, whereas 'Retail' is intended for customers with R/3 Retail only.
    You can display the characteristics of the process key (R/3 field BWVORG, BW field 0PROCESSKEY) by using Transaction MCB0.
    If you have already set up historical data (for example for testing purposes) by using the setup transactions (Statistical Setup Programs) (for example: Purchasing: Tx OLI3BW, material movements: OLI1BW) into the provided setup tables (for example: MC02M_0SCLSETUP, MC03BF0SETUP), you unfortunately have to delete this data (Tx LBWG). After you have chosen the industry sector by using  MCB_, perform the setup again, so that the system fills a valid transaction key for each data record generated. Then load this data into your connected BW by using 'Full update' or 'Initialization of the delta process'. Check, whether the system updates data into the involved InfoCubes now.
    If all this is not successful, please see Note 315880, and set the application indicator 'BW' to active using Transaction 'BF11'.
    Related notes:
    157317 --> You MUST make sure that this note is relevant for you.
    352344 -> Process key + reversals in Inventory Management
              (Consulting note).
    Regards,
    Anil Kumar Sharma .P
