Update data automatically in fact table in Data Warehouse

Hi,
I'm working on the creation of a data warehouse that includes different data sources such as SQL Server performance (more than one instance), Active Directory users, server performance (more than one server), and Exchange Server mailboxes. The problem is that performance data (like CPU and memory) changes
frequently, so my question is how to update the data in the fact table every 5 seconds automatically with SSIS.
Thank you for any advice  

I'm assuming you have already figured out how to capture the data (e.g. PowerShell, extended events, MDW, etc.) and just need to know what dimensions or fact tables you need.
You need to decide how often you are going to capture this data, and based on that you will have dimensions with the appropriate grain. Don't try to cram everything into the same fact table if it is not of the same granularity. Also, separate processes usually
have separate fact tables.
In addition to the Date dimension, you will need a Time dimension with a grain of 1 second (or maybe 5 seconds if that is how often you get your data), then run the SSIS package every 5 seconds to capture and append that data to the fact table.
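As a rough illustration of that design, here is a minimal T-SQL sketch; the table and column names (DimTime, FactServerPerformance, staging.ServerPerformanceSnapshot) are hypothetical placeholders, and the second statement is roughly what the SSIS data flow would execute on its 5-second schedule:

-- One-time: populate a Time dimension at 5-second grain (17,280 rows per day)
;WITH secs AS (
    SELECT 0 AS sec
    UNION ALL
    SELECT sec + 5 FROM secs WHERE sec + 5 < 86400
)
INSERT INTO dbo.DimTime (TimeKey, TimeOfDay)
SELECT sec, DATEADD(SECOND, sec, CAST('00:00:00' AS time))
FROM secs
OPTION (MAXRECURSION 0);

-- Every 5 seconds: append the latest capture to the fact table
INSERT INTO dbo.FactServerPerformance (DateKey, TimeKey, ServerKey, CpuPct, MemoryMB)
SELECT CONVERT(int, CONVERT(char(8), SYSDATETIME(), 112)),                                          -- yyyymmdd date key
       (DATEDIFF(SECOND, CAST(CAST(SYSDATETIME() AS date) AS datetime2), SYSDATETIME()) / 5) * 5,  -- 5-second time key
       s.ServerKey, s.CpuPct, s.MemoryMB
FROM staging.ServerPerformanceSnapshot AS s;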
- Aalamjeet Rangi | (Blog)

Similar Messages

  • Load data in a fact table

    Hello,
    I have implemented an SCD2 dimension and the mapping executes fine.
    Now I have question about loading data in a fact table.
    How do I need to use OWB (maybe JOINER operator - Join condition - between dimensions and source table) in case of:
    - update on source table
    - delete on source table
    I think the simplest case is an insert on the source table. It is probably to_char(source_transaction_date,'dd.mm.yyyy') = to_char(sysdate,'dd.mm.yyyy') if I load once a day.
    What is the procedure for fact table mapping to handle updates and deletes on source table?
    Regards

    Some discussions in previous forums should help you
    http://forums.sdn.sap.com/thread.jspa?threadID=2019448
    http://forums.sdn.sap.com/thread.jspa?threadID=1908902
    In the SAP tutorial, you can see a sample example of making fact tables.
    http://help.sap.com/businessobject/product_guides/boexir32SP1/en/xi321_ds_tutorial_en.pdf
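    Regardless of the tool, the usual pattern for propagating source updates and deletes into a fact table can be sketched in plain SQL. Below is a hedged Oracle sketch with made-up names (src_sales, f_sales, d_customer), assuming an SCD2 lookup on the dimension by natural key and validity dates:

    -- Upsert source rows changed since the last load
    MERGE INTO f_sales f
    USING (SELECT s.sales_id, d.dim_key, s.amount
             FROM src_sales  s
             JOIN d_customer d
               ON d.customer_nk = s.customer_id
              AND s.sales_date BETWEEN d.start_date AND d.end_date   -- SCD2 version valid at transaction date
            WHERE s.change_date >= TRUNC(SYSDATE) - 1) src
       ON (f.sales_id = src.sales_id)
     WHEN MATCHED THEN UPDATE SET f.dim_key = src.dim_key, f.amount = src.amount
     WHEN NOT MATCHED THEN INSERT (sales_id, dim_key, amount)
                           VALUES (src.sales_id, src.dim_key, src.amount);

    -- Rows physically deleted in the source have to be detected separately
    DELETE FROM f_sales f
     WHERE NOT EXISTS (SELECT 1 FROM src_sales s WHERE s.sales_id = f.sales_id);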
    Arun

  • How can we delete the data in e-fact table.

    how can we delete the data in e-fact table.

    hii,
    You cannot delete the request individually, but you can do one of the following:
    1. Do a selective deletion from the cube. RSA1 -> Cube -> Contents -> selective deletion.
    2. Delete all the data in the cube and then reconstruct only the required request ids. This would work only if you have the PSA available for all the requests.
    3. Reverse posting is another possibility.
    hope it helps,
    partha

  • Dimension table and fact table exists data physically

    Hi experts,
    can anyone please tell me whether dimension tables and fact tables physically hold data or not?

    Hi..Sudheer
    SAP's BW is based on an "Enhanced Star Schema" or "InfoCube" database design. This design has a central database table, known as the 'fact table', which is surrounded by associated dimension tables.
    The fact table is usually very large, meaning it contains millions to billions of records.
    The dimension tables don't contain the data themselves; they contain references to pointer tables that point to the master data tables, which in turn contain master data objects such as customer, material and destination country, stored in BW as InfoObjects. An InfoObject can contain single field definitions such as transaction data, or complex customer master data that holds attributes, hierarchies and customer texts stored in their own tables.
    The SID is a surrogate ID generated by the system. The SID tables are created when we create a master data InfoObject. In the SAP BW star schema, a distinction is made between two self-contained areas: the InfoCube and the master data/SID tables.
    The master data doesn't reside in the star schema but in separate tables which are shared across all the star schemas in SAP BW. A numeric ID is generated which connects the dimension tables of the InfoCube to the master data tables.
    The dimension tables contain the DIM ID and the SID of a particular InfoObject. Using this SID, the attributes and texts of a master data InfoObject are accessed.
    The SID table is connected to the associated master data tables via the characteristic key.
    Fact table (transaction data, DIM ID) <-> dimension table (SID and DIM ID) <-> master data table (SID, InfoObject)
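    As a hedged illustration of that chain (simplified placeholder names, not the real BW table names), a query conceptually navigates it like this:

    SELECT f.revenue, m.customer_name
      FROM fact_table      f
      JOIN dimension_table d ON d.dim_id       = f.dim_id        -- the fact table stores only DIM IDs
      JOIN sid_table       s ON s.sid          = d.customer_sid  -- the dimension table stores SIDs
      JOIN master_data     m ON m.customer_key = s.customer_key; -- the SID table maps the SID to the characteristic key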
    Thanks,
    Abha

  • Confirmed Dimensions. OBIEE Not able to pull data from two fact tables.

    Hi Experts,
    I have a very simple star schema setup with two fact tables and 1 dimension. Both fact tables are joined to the dimension at the same level.
    When I pull a column from both fact tables and the dimension table in OBIEE, it should create simple SQL like below:
    select FACT1.column1,
    Fact2.Column1,
    Dim.Column1
    from FACT1, FACT2, DIM
    where FACT1.ID = DIM.ID and FACT2.ID = DIM.ID
    but instead it creates the query in a very complex way:
    select case  when D1.c2 is not null then D1.c2 when D2.c2 is not null then D2.c2 end  as c2,
         D1.c1 as c3,
         D2.c1 as c4
    from
         (select FACT1.Column1 as c1,
                   DIM.Column1 as c2
              from
                   DIM T1287863,              
                   FACT1 T1287945              
       where  (DIM.ID = FACT1.ID)
           ) D1 full outer join (
            select FACT2.Column1 as c1,
                   DIM.Column1 as c2
              from
                   DIM,              
                   FACT2
              where  ( DIM.ID = FACT2.ID)
         ) D2 On isnull(D1.c2 , '1') = isnull(D2.c2 , '1') and isnull(D1.c2 , '2') = isnull(D2.c2 , '2')
    I even tried setting the levels for both fact tables and it still creates the query in the above way. Any thoughts on this will be very helpful.

    Subramanian,
    see below the code we're using for the RFM.
    On the ct_containers table I'm passing a line, and it's getting updated after the call.
    On the ct_errors table I just want to receive the errors, but I only receive the line we add manually there ('Serious error with validation code').
    kr, achim
    FUNCTION zbapi_ra_validations .
    *"*"Local Interface:
    *"  IMPORTING
    *"     VALUE(IS_RA_SCREEN) TYPE  ZBAPI_S_RA_SCREEN
    *"  CHANGING
    *"     VALUE(CT_ERRORS) TYPE  ZRA_T_ERRORS
    *"     VALUE(CT_CONTAINERS) TYPE  ZRA_T_CONT_IP
      DATA:
        lo_badi_handle TYPE REF TO zra_validation_rule,
        ls_error       TYPE zra_s_error.
      GET BADI lo_badi_handle.
      TRY.
          CALL BADI lo_badi_handle->validate_rules
            EXPORTING
              is_screen_flds = is_ra_screen
            CHANGING
              ct_containers  = ct_containers
              ct_errors      = ct_errors.
        CATCH zcx_ra.
          ls_error-message = 'Serious error with validation code'.
          APPEND ls_error TO ct_errors.
      ENDTRY.
    ENDFUNCTION.
    If I call this RFM in SE37, the ct_errors table is populated with all errors plus the manually created line.
    Message was edited by: Achim Hauck

  • Best ways to create rpd or reports if we have data in more fact tables

    I have fact and dimensional data in one or more different tables, and each logical table source represents one data segment. Please suggest some methods or ways, such as fragmentation, through which I can use them in creating the RPD and reports. The main problem is that the fact table is very large, containing 25 million records. Adding all the tables in the BMM layer is affecting performance, so can anyone suggest other ways of doing this on the database side?
    Thanks in advance
    Edited by: user2989722 on Dec 3, 2009 3:09 PM

    hi,
    For fragmentation, you can create it on a dimension. The procedure is clearly explained in these blogs:
    http://108obiee.blogspot.com/2009/01/fragmentation-in-obiee.html
    http://www.rittmanmead.com/2007/06/19/obiee-data-modeling-tips-2-fragmentation/
    From the performance point of view, you can create a materialized view (based on the columns that your report is using) so that queries hit that particular view instead of the whole 25-million-record table. Please look into this post:
    Re: Materialized views in OBIEE
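    A hedged Oracle sketch of such a materialized view (table and column names are made up), pre-aggregating only the columns the report uses so queries can rewrite against it instead of scanning the 25-million-row fact table:

    CREATE MATERIALIZED VIEW mv_sales_by_month
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
    ENABLE QUERY REWRITE AS
    SELECT d.month_key, p.product_id, SUM(f.sales_amount) AS sales_amount
      FROM fact_sales  f
      JOIN dim_date    d ON d.date_key    = f.date_key
      JOIN dim_product p ON p.product_key = f.product_key
     GROUP BY d.month_key, p.product_id;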
    thanks,
    saichand.v

  • Creating Time Dimension from date columns in fact tables.

    I remember watching a demo of a BI tool a couple of years ago, which I swear was OBIEE, and the presenter stated it was possible to create a Time dimension in the Admin Tool based on a date column in another table.
    Can you guys tell me if there's such functionality in OBIEE?
    If so, how could I achieve that?!
    Thanks in advance!
    Marcos

    hi,
    Are you trying to make the fact table act as a dimension table?
    Does the fact table have some dimension columns?
    We can do this by treating the fact table as a dimension table and creating a dimension hierarchy on that table:
    Year level: Extract(year from fact_date_column)
    Month and year level: CAST(Extract(month from fact_date_column) AS CHAR(5)) || CAST(Extract(year from fact_date_column) AS CHAR(5))
    Like this.
    But be careful while doing this; make sure that all joins and content levels are good.
    As far as I know this is not a good way; experts can add some words, let's see! :-)
    thanks,
    saichand.v

  • Unexpected results getting data from two fact tables through conformed dim

    Hi all,
    We are getting unexpected behaviour in our OBIEE 10.1.3.3.3. We have this scenario:
    We have 2 fact tables, called F1 and F2. F1 has one measure, f1m1, and F2 has another one, f2m1.
    We have 4 conformed dimensions, called D1, D2, D3 and Date.
    When we query each fact table individually, we get:
    date d1 d2 d3 f1m1
    dt1 - x - y - z - m1
    dt1 - x - y - z' - m2
    date d1 d2 d3 f2m1
    dt1 - x - y - z - m3
    dt1 - x - y - z'' - m4
    But when trying to obtain a comparison scenario, we get
    date d1 d2 d3 f1m1 f2m1
    dt1 x y z m1 m4
    Instead of
    date d1 d2 d3 f1m1 f2m1
    dt1 x y z m1 m3
    Looking at the query log, we have found the reason: the BI Server solves this request by using ROW_NUMBER() to join SAWITH0 and SAWITH1 into the SAWITH2 result set, so the order may not be the same in the result sets for every fact table. More or less, the generated query is like:
    WITH
    SAWITH0 AS
    (select ....
    from F1),
    SAWITH1 AS
    (select ...
    from F2),
    SAWITH2 AS
    (select ... from (select ...,
    ROW_NUMBER() OVER (PARTITION BY ....) c10
    from SAWITH0 d1 full outer join SAWITH1 d1 ....) D1
    where (D1.c10 = 1))
    select SAWITH2. ....
    from SAWITH2
    order by c1..c10
    The problem seems to be that the BI Server is ordering the result sets SAWITH0 and SAWITH1 and using the row number to join these result sets, but this does not produce the correct result.
    Any ideas?
    TIA
    Javier
    Edited by: jirazazábal on Mar 13, 2009 2:46 PM

    I have built a logical fact table with two fact table sources on it.
    The Sql performed against the database was this one.
    -------------------- Sending query to database named PRODS_AIX (id: <<153418>>):
    WITH
    SAWITH0 AS (select sum(T21296.CONSUMERS_SALES_EURO) as c1,
         T21309.DIVISION_CODE as c2
    from
         DIVISION T21309,
         C_CONSUMERS_SALES T21296
    where  ( T21296.DIVISION = T21309.DIMENSION_KEY )
    group by T21309.DIVISION_CODE),
    SAWITH1 AS (select sum(T21356.ORDER_VALUE) as c1,
         T21309.DIVISION_CODE as c2
    from
         DIVISION T21309,
         DWH_SALES_ORDER_OVERVIEW T21356
    where  ( T21309.DIMENSION_KEY = T21356.DIVISION_KEY )
    group by T21309.DIVISION_CODE)
    select distinct case  when SAWITH0.c2 is not null then SAWITH0.c2 when SAWITH1.c2 is not null then SAWITH1.c2 end  as c1,
         SAWITH0.c1 as c2,
         SAWITH1.c1 as c3
    from
         SAWITH0 full outer join SAWITH1 On nvl(SAWITH0.c2 , 'q') = nvl(SAWITH1.c2 , 'q') and nvl(SAWITH0.c2 , 'z') = nvl(SAWITH1.c2 , 'z')
    order by c1
    As you can see, there is one select (SAWITH0) for the first fact table C_CONSUMERS_SALES and one select for the second fact table DWH_SALES_ORDER_OVERVIEW (SAWITH1), and the two statements are joined with a full outer join.
    I ask myself why you have the three selects (SAWITH0, SAWITH1 and SAWITH2). Can you please paste the complete SQL that is performed?
    Can you also tell us which SQL is performed if you select only the columns from one fact table and not from the other?
    Regards
    Nico
    http://gerardnico.com

  • Dimension values without data in a fact table

    I have an ODS system and a Data warehouse system
    I have a Sales fact table in the ODS system and I have these fields:
    SALES
    ID_CUSTOMER (PK),
    ID_MODEL (PK),
    ID_TIME (PK),
    SALES,
    QUANT_ART,
    COST
    Then in some records the fields ID_TIME, ID_MODEL or ID_CUSTOMER don't have values (NULL), because in the transactional systems these records don't have values (NULL).
    The users want to generate aggregate reports with the Sales table...
    The question is:
    Should I put a "dummy" value in the Customer, Model and Time dimensions (for example "0") and use this value in the fact table when the dimension fields have NULL values?
    Or should I leave the NULL values?
    What is the best choice? Why?

    There's often some specific reason why these values don't exist, such as the record being a manual adjustment to sales (for example a journal voucher). In these cases it can be helpful to have a flag column to indicate this, so that when a user comes across a bunch of sales with a Store Name of "Unknown" or "Not Applicable" they can also look at the reason for this unusual entry.
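    A common Kimball-style pattern is to seed each dimension with an explicit "unknown" member and map NULL source keys to it during the fact load. A hedged sketch using the column names from the post (stg_sales and dim_customer are placeholders):

    -- One-time: add an "unknown" member to each dimension (key 0 is a convention, not a requirement)
    INSERT INTO dim_customer (id_customer, customer_name) VALUES (0, 'Unknown');

    -- Fact load: replace NULL foreign keys with the unknown member instead of loading NULLs
    INSERT INTO sales (id_customer, id_model, id_time, sales, quant_art, cost)
    SELECT COALESCE(s.id_customer, 0),
           COALESCE(s.id_model, 0),
           COALESCE(s.id_time, 0),
           s.sales, s.quant_art, s.cost
      FROM stg_sales s;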

  • What is correct way to join a start/end date driven dimension to a fact table in data foundation?

    I have a bad universe or a data design issue. 
    Several versions of hierarchies reporting store entities in reporting Fact measures.
    Example of a date-driven Hierarchy Dimension: ORG_KEY, START_DATE, END_DATE, STORE_NUMBER, ORG_HIERARCHY, CURR_FLG (Y/N)
    Fact table: ORG_KEY, CALENDAR_KEY, TRANS_DATE, $amount
    Calendar Dimension: CALENDAR_KEY, DAY_DATE, FY_WEEK (201452), FY_PERIOD (201412), FY_QUARTER (201401), FISCAL_YEAR (2014)
    Users' wish:
    They wish for the store number and org hierarchy to be pulled as of the last day of each pull, without a prompt. The store (ORG_KEY) as in the fact table, but the ORG_HIERARCHY and other attributes as of the last day they pull.
    Daily (Would be Calendar.Day_Date in Filter) ,
    Week to date (would be Max Calendar.Day_Date for (201452) FY_WEEK  as entered,
    Month to date,
    Year to date, 
    AdHoc queries.  
    My problem is that I can see how they could manually pull this in Webi, but I have tried everything I know to make the join in the data foundation, to no avail. I have not gotten @Prompts to work in joins, derived tables, etc. I also wonder what the difference is between a parameter in the data foundation vs. a filter in the business layer. None of them worked.
    Please help! Any ideas would be appreciated.

    {Note that abbreviations in brackets are just short forms used further down the answer}
    Join Store Dim (SD) to Transaction Fact (TF) on ORG_KEY=ORG_KEY with 1 to Many cardinality
    Create an alias of Calendar Dim and call it Transaction Date (TD)
    Join TD to TF on TD.CALENDAR_KEY=TF.CALENDAR_KEY with 1 to Many cardinality
    Create a predefined condition in your universe called "Return Yesterday's Transactions" as:
    TD.DAY_DATE = trunc(sysdate-1) <-- That assumes Oracle database; use whatever is correct for yesterday for your RDBMS
    The above predefined condition when added to your data query will return only yesterday's transactions.
    However, if you want to return all the different types of sales, you would need an object for each one. To do that, you'd use a case statement. Again, using Oracle syntax, an example of MTD would be:
    SUM(CASE WHEN trunc(TD.DAY_DATE,'yyyymm') = trunc(sysdate,'yyyymm') THEN TF.amount END)
    If you have any more questions about how this approach would work please shout.
    As an alternative, you could create a time hierarchy and add scope of analysis to your report for it and enable drilling at report level.
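    Putting those pieces together, the SQL the universe would generate for a "yesterday" report looks roughly like this (Oracle syntax; the physical table and column names are assumptions based on the post):

    SELECT sd.store_number,
           sd.org_hierarchy,
           SUM(tf.amount) AS amount
      FROM store_dim    sd
      JOIN trans_fact   tf ON tf.org_key      = sd.org_key
      JOIN calendar_dim td ON td.calendar_key = tf.calendar_key
     WHERE td.day_date = TRUNC(SYSDATE - 1)   -- the "Return Yesterday's Transactions" condition
     GROUP BY sd.store_number, sd.org_hierarchy;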

  • How to fetch data from multiple fact tables from a normalized schema?

    Hello everybody,
    I am working on a normalized schema to build my repository. I have categorized the tables into dimensions and facts. I have 3 fact tables in my schema and they have a 1:m:1 relationship, i.e. if I have tables A, B and C, then A has a 1:m relationship with B and C also has a 1:m relationship with B. How can I use measures from these tables to create a star schema, if A = Sales, B = Transaction, C = Payment_Amount?
    Sales(Sales ID, Amount, Tax pct, ...)
    Transaction(Transaction ID, Sales ID, Payment ID,Transaction Amt, ...)
    Payment_Amount(Payment ID, Check ID, Payment Made, ...)
    Please give me some direction to pursue.
    Thank you!
    D

    Hi dcole,
    Go through this link for snowflake schema http://gerardnico.com/wiki/datamodeling/snowflake_
    http://www.rittmanmead.com/2007/06/19/obiee-data-modeling-tips-1-integrating-1-1-and-1-many-source-tables/
    I suppose it should work with a snowflake schema... I'm interested in this topic, so please let me know your approach every now and then.
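    Another option, if you want to keep a pure star in the presentation, is to flatten the three tables into one fact at Transaction grain (each Transaction row has exactly one Sales and one Payment parent). A hedged sketch using the columns from your post:

    CREATE VIEW fact_transaction AS
    SELECT t.transaction_id,
           t.sales_id,
           t.payment_id,
           t.transaction_amt,
           s.amount       AS sales_amount,
           s.tax_pct,
           p.payment_made
      FROM transaction    t
      JOIN sales          s ON s.sales_id   = t.sales_id
      JOIN payment_amount p ON p.payment_id = t.payment_id;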
    By,
    KK

  • Foreign keys in SCD2 dimensions and fact tables in data warehouse

    Hello.
    I have a data warehouse in a snowflake schema. All dimensions are SCD2; the columns are like this:
    ID (PK) SID NAME ... START_DATE END_DATE IS_ACTUAL
    1 1 XXX 01.01.2000 01.01.2002 0
    2 1 YYX 02.01.2002 01.01.2004 1
    3 2 SYX 02.01.2002 1
    4 3 AYX 02.01.2002 01.01.2004 0
    5 3 YYZ 02.01.2004 1
    Other dimension and fact tables have relations to this table.
    Do I need to create foreign keys for these relations?
    And if I do, on which columns? SID (serial ID) is not unique. If I create them on ID, I have to get the SID and the actual row in every query.

    >
    I have datawarehouse in snowflake schema. All dimensions are SCD2, the columns are like that:
    ID (PK) SID NAME ... START_DATE END_DATE IS_ACTUAL
    1 1 XXX 01.01.2000 01.01.2002 0
    2 1 YYX 02.01.2002 01.01.2004 1
    3 2 SYX 02.01.2002 1
    4 3 AYX 02.01.2002 01.01.2004 0
    5 3 YYZ 02.01.2004 1
    On this table there are relations from other dimension and fact table.
    Need I create foreign keys for relation?
    >
    Are you still designing your system? Why did you choose NOT to use a star schema? Star schemas are simpler and have some performance benefits over snowflakes. Although there may be some data redundancy, that is usually not an issue for data warehouse systems, since any DML is usually well-managed and normalization is often sacrificed for better performance.
    Only YOU can determine what foreign keys you need. Generally you will create foreign keys between any child table and its parent table, and those need to be created on a primary key or unique key value.
    >
    And if I do, on what columns? SID (serial ID) is not unique. If I create on ID, I have to get SID and actual row in any query.
    >
    I have no idea what that means. There isn't any way to tell from just the DDL for one dimension table that you provided.
    It is not clear if you are saying that your fact table will have a direct relationship to the star-flake dimension tables or only link to them through the top-level dimensions.
    Some types of snowflakes do nothing more than normalize a dimension table to eliminate redundancy. For those types the dimension table is, in a sense, a 'mini' fact table and the other normalized tables become its children. The fact table only has a relation to the main dimension table; any data needed from the dimensions 'child' tables is obtained by joining them to their 'parent'.
    Other snowflake types have the main fact table having relations to one or more of the dimensions 'child' tables. That complicates the maintenance of the fact table since any change to the dimension 'child' table impacts the fact table also. It is not recommended to use that type of snowflake.
    See the 'Snowflake Schemas' section of the Data Warehousing Guide
    http://docs.oracle.com/cd/B28359_01/server.111/b28313/schemas.htm
    >
    Snowflake Schemas
    The snowflake schema is a more complex data warehouse model than a star schema, and is a type of star schema. It is called a snowflake schema because the diagram of the schema resembles a snowflake.
    Snowflake schemas normalize dimensions to eliminate redundancy. That is, the dimension data has been grouped into multiple tables instead of one large table. For example, a product dimension table in a star schema might be normalized into a products table, a product_category table, and a product_manufacturer table in a snowflake schema. While this saves space, it increases the number of dimension tables and requires more foreign key joins. The result is more complex queries and reduced query performance. Figure 19-3 presents a graphical representation of a snowflake schema.
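    Coming back to the foreign key question: for the common case where the fact references the SCD2 dimension's surrogate key, a hedged DDL sketch (dim_customer and fact_sales are placeholders):

    -- The surrogate key ID is the primary key; SID repeats across SCD2 versions, so it cannot be referenced
    ALTER TABLE dim_customer ADD CONSTRAINT pk_dim_customer PRIMARY KEY (id);

    -- Each fact row points at the dimension version that was current when the fact occurred
    ALTER TABLE fact_sales ADD CONSTRAINT fk_fact_sales_customer
      FOREIGN KEY (customer_id) REFERENCES dim_customer (id);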

  • Display master data without data in the fact table

    Characteristic 0PROJECT
    Attribute Price
    I want to show in the query all the prices including the projects that don't have registers in the fact table.
    How do I do this?
    Tnks.

    I believe you are describing what SAP refers to as the Slow Moving Item scenario. Search SDN using that phrase and you'll get hits on documents and Notes that talk more about this. Here's something from an old How To:
    Slow Moving Item Scenario
    You want to define a query that displays all products that have been purchased only infrequently or not at all. In other words, the query should also display characteristic values for which no transaction data, or only low values, exist for the selected period.
    Procedure
    In the Administrator Workbench;
    1. Create a MultiProvider consisting of a revenue InfoCube, containing the InfoObject
    Material (0MATERIAL), and the InfoObject 0MATERIAL. The InfoObject must be set as
    an InfoProvider in InfoObject maintenance. In other words, you need to have assigned
    the InfoObject to an InfoArea. (also refer to Tab Page: Master Data/texts [Ext.]).
    In the BEx Analyzer:
    2. Select your MultiProvider in the Query Designer.
    3. Define a query that contains the InfoObject 1ROWCOUNT in the columns.
    The InfoObject 1ROWCOUNT is contained in all “flat” InfoProviders, that is, in all
    InfoObjects and ODS objects. It counts the number of records in the InfoProvider.
    In this scenario, you can see from the row number display whether or not values from the InfoProvider InfoObject are really displayed.
    4. Save the query and execute it. All values are now displayed, including those for materials
    that were not purchased.
    If you filter by time (0CALYEAR, for example), values from the InfoProvider
    InfoObjects are not displayed since 0CALYEAR is not an attribute of
    0MATERIAL. You can see this from the absence of values in the 1ROWCOUNT
    column in the query. If you want to restrict by time, you need to proceed as
    follows:
    Constant Selection for the InfoObject
    You need to set the constant selection for the 1ROWCOUNT key figure in order to be able to
    set a filter by time in this query.
    1. In the Query Designer, via the context menu for 1ROWCOUNT, choose Edit.
    2. On the left hand half of the screen, under the data package dimension, select the
    characteristic InfoProvider (0INFOPROV) and drag it into the right-hand screen area.
    3. From the context menu for the InfoProvider, choose Restrict, and restrict across the
    InfoProvider InfoObject.
    4. Also from the context menu for the InfoProvider, choose the function Constant Selection.
    5. Save the query and execute it. You can now also set a filter for a time characteristic, the
    materials display remains as it was.
    Displaying Slow Moving Items
    If you want to display a list of slow moving items, excluding products that are selling well, you
    need to proceed as follows:
    1. In the Query Designer, via the context menu for 1ROWCOUNT, choose Edit.
    2. Via the context menu for InfoProvider, choose the function Display Empty Values. Also
    select Constant Selection.
    3. Save the query and execute it. The result is that the system displays the materials for
    which there was no revenue.
    Displaying Products with Small Revenues
    If you want to display a list of products that have not been sold or have only been selling
    badly, you need to proceed as follows:
    1. Set constant selection as described above, but do not select the display empty values
    function.
    2. In the Query Designer, define a condition for the 0MATERIAL InfoObject. Specify a value
    that is to be the upper limit for a bad sale.
    3. Save the query and execute it. The result is that the system displays all materials that
    have not been sold or have been selling badly.

  • Fact table lookups data

    Hi,
    Please help me out with loading the fact tables.
    I used a lookup on the DIM table to get my SUK, and when I use a union transformation to combine the output from each lookup and then load the data with some condition, the data in my fact table is not loaded in the proper format.
    The union transformation is splitting the output into different records.
    Please let me know which transformation should be used to get the data from the lookup tables,
    or tell me the approach to load the fact table in SSIS.
    I'm basically an INFORMATICA resource and I'm thinking of this in INFORMATICA terms.

    Hi, glad to see your reply. I am encountering almost the same problem loading a fact table.
    My data flow is like:
    Source Component --> LKP1 (get Dim1 SUK) --> LKP2 (get Dim2 SUK) --> ... --> LKPn (get Dimn SUK) --> Destination Component (load to the fact table), but I am having tough difficulties:
    [FactoryLookup [352]] Error: Row yielded no match during lookup. 
    [FactoryLookup [352]] Error: The "component "FactoryLookup" (352)" failed because error code 0xC020901E occurred, and the error row disposition on "output "Lookup Output" (354)" specifies failure on error. An error occurred on the specified object of the specified
    component. 
    [DTS.Pipeline] Error: The ProcessInput method on component "FactoryLookup" (352) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will
    cause the Data Flow task to stop running. 
    [DTS.Pipeline] Error: Thread "WorkThread0" has exited with error code 0xC0209029. 
    Could you please help me? How to fix them?
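    The "Row yielded no match during lookup" error means the Lookup is configured to fail on unmatched rows; in the Lookup editor you can change that behaviour to ignore failures or redirect the unmatched rows instead. An alternative is to do the surrogate-key lookup in SQL with outer joins and an "unknown" fallback; a hedged sketch where all names are placeholders:

    SELECT COALESCE(d1.dim1_suk, 0) AS dim1_suk,   -- unmatched natural keys fall back to the unknown member
           COALESCE(d2.dim2_suk, 0) AS dim2_suk,
           s.measure1, s.measure2
      FROM stg_fact s
      LEFT JOIN dim1 d1 ON d1.dim1_nk = s.dim1_nk
      LEFT JOIN dim2 d2 ON d2.dim2_nk = s.dim2_nk;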

  • BW 3.5 Fact table with data no corresponding records on P table

    Hi Gurus,
    I have a weird situation: there is a BW 3.5 SP 16 (I know it's old…) InfoCube that receives 400,000 records every single day.
    They are never compressed (at least there is no data in the E table); instead, a program calling FM 'RSDRD_SEL_DELETION' runs every day and takes about 5 hours (when it runs without errors).
    So I checked the fact table and found 337,564,988 records. The InfoCube admin shows 14,018,370 records. I found that the "missing" records are not in the P table (there is data in the fact table pointing to missing packages).
    I think the selective deletion messed up the InfoCube. These records that are missing from the P table have to be deleted. I ran an RSRV check on this InfoCube in the background and it was cancelled…
    So, I want to delete this data from the fact table. My question is: how to do that with minimum effort without invalidating the information?
    Of course I will back up the data to a temp cube…
    Well any idea is welcome
    Regards,
    Alex

    Hi Navesh,
    First of all, thanks for your quick answer.
    Well, the arguments of the FM are set to delete anything with 0SEM_CRDATE older than 30 days. This characteristic receives SY-DATUM in the update rule (transformation…), so every package older than 30 days should be deleted. The weird thing is that it is deleting the P table records but keeping the data in the F table.
    So you think it would be worth trying a selective deletion with the same criteria manually?
    This is the code, can you (or someone) take a look?
    START-OF-SELECTION.
    * Create the range for the period to be deleted
      PERFORM f_range.
      PERFORM f_elima_cubo.
    END-OF-SELECTION.
    *&      Form  F_RANGE
    FORM f_range.
      v_data_inic = sy-datum - 31.
      v_range-sign   = 'I'.
      v_range-option = 'LT'.
      v_range-low    = v_data_inic.
      v_range-keyfl  = 'X'.
      APPEND v_range  TO  t_tab_main-t_range.
      t_tab_main-iobjnm        = '0SEM_CRDATE'.
      INSERT t_tab_main INTO TABLE t_thx_sel.
    ENDFORM.                    " F_RANGE
    *&      Form  F_ELIMA_CUBO
    FORM f_elima_cubo.
      v_parallel               = '01'.
      CALL FUNCTION 'RSDRD_SEL_DELETION'
        EXPORTING
          i_datatarget      = 'IC_LP_B01'
          i_thx_sel         = t_thx_sel
          i_authority_check = c_flag
          i_no_logging      = v_nl
          i_parallel_degree = v_parallel
          i_show_report     = v_sr
        CHANGING
          c_t_msg           = t_msg.
    ENDFORM.                   
    Regards,
    Alex
