Loading measures to a cube

Hi,
How do I load measures into a cube?
I created a cube, and it validated and deployed successfully.
I am loading only a single measure from a table, but nothing loads; even the keys of the dimensions are missing from the database table.
Do I need to map only the measure to the proper row in the cube, or do the keys need to be mapped as well?
Regards,
Arif
P.S. I am unable to include a screen dump in the forum post. How do I do that?

Yes, you need to build each row for insertion into the fact table by joining your source tables.
At the moment you are building a data mart / star schema (fact and dimension tables, i.e. a cube). This star schema will be used for reporting, so there is no need to go back to the source. Ultimately, this could grow into a data warehouse with dimensions and facts populated from many sources, allowing reporting across your whole organisation rather than a single application. The dimensions are likely to contain far more descriptive (i.e. text) attributes than the source systems and will be used to constrain your reporting queries.
I suggest reading some articles/books by Ralph Kimball; they should give you a good overview of data warehousing and dimensional modelling.
http://www.kimballgroup.com/html/articles.html
Si
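As an illustrative sketch of "building a row for insertion to the fact by joining your source tables" (hypothetical table and column names; SQLite used here for portability, not the actual target database), the join resolves each dimension's surrogate key so the fact table stores keys rather than descriptions:

```python
import sqlite3

# Minimal sketch with hypothetical names: resolve a dimension's surrogate key
# while loading fact rows, so the fact table holds keys, not descriptions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_code TEXT, product_name TEXT);
CREATE TABLE src_sales   (product_code TEXT, amount REAL);
CREATE TABLE fact_sales  (product_key INTEGER, sales REAL);
INSERT INTO dim_product VALUES (1, 'P100', 'Widget'), (2, 'P200', 'Gadget');
INSERT INTO src_sales   VALUES ('P100', 50.0), ('P200', 75.0), ('P100', 25.0);
""")

# Join the source to the dimension to pick up the surrogate key for each row.
cur.execute("""
INSERT INTO fact_sales (product_key, sales)
SELECT d.product_key, s.amount
FROM src_sales s
JOIN dim_product d ON d.product_code = s.product_code
""")
conn.commit()

print(cur.execute(
    "SELECT product_key, SUM(sales) FROM fact_sales "
    "GROUP BY product_key ORDER BY product_key").fetchall())
# → [(1, 75.0), (2, 75.0)]
```

Rows whose source codes have no matching dimension entry would be silently dropped by the inner join, which is one reason loads that "load nothing" are often key-mapping problems.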

Similar Messages

  • Mapping measures in a cube vs Calculated Measure

    I have a fact table:
    Fact_Table: Sales, Product_ID, Special_Product_Flag
    (Special_Product_Flag is a Y/N flag)
    and a dimension table:
    Dimension_Table: Product_ID, Product_Name
    I want to have two measures in the cube:
    1. Product_Sales
    In the cube I specify the mapping for this measure as Fact_Table.Sales.
    2. Special_Product_Sales
    In the cube I specify an OLAP expression in the mapping as
         case
         when Fact_Table.Special_Product_Flag = 'Y' then Fact_Table.Sales else 0 end
    Now, is the measure Special_Product_Sales treated as a calculated measure?

    Here are definitions of base versus calculated measures.
    - A base measure is a measure that has associated physical storage in the AW. This physical storage is populated using some kind of load step, which may use either SQL or OLAP DML. The measure may then be aggregated according to the aggregation rules of the cube. The value of a base measure for any given cell in the cube may be stored or may be calculated dynamically, depending on the "precomputation" rules of the cube.
    - A calculated measure is one that has no physical storage in the AW. The values of a calculated measure are always calculated dynamically, regardless of the precomputation rules of the cube.
    Your case is clearly a base measure because you are physically loading data into the AW. The fact that you use a CASE expression in the mapping does not change the fact that it takes up physical storage, nor the fact that it can be aggregated. It is the ability to aggregate the data that makes this a useful feature. In your case you will be able to aggregate Special_Product_Sales up the product hierarchy, so that its value for 'ALL_PRODUCTS' will be
        select sum(sales)
        from fact_table
        where Special_Product_Flag = 'Y'
    The value of Sales for 'ALL_PRODUCTS', by contrast, would be
        select sum(sales)
        from fact_table
    As a side note, it may be better to define your mapping using NULL instead of 0:
        case
        when Fact_Table.Special_Product_Flag = 'Y'
        then Fact_Table.Sales
        else null
        end
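    The two 'ALL_PRODUCTS' aggregations above can be checked with a small sketch (hypothetical sample data; SQLite stands in for the fact table, and SUM ignores NULLs just as the NULL-instead-of-0 mapping intends):

```python
import sqlite3

# Sketch of the two 'ALL_PRODUCTS' totals, using hypothetical sample data.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE fact_table (sales REAL, product_id INTEGER, special_product_flag TEXT);
INSERT INTO fact_table VALUES (100.0, 1, 'Y'), (200.0, 2, 'N'), (50.0, 3, 'Y');
""")

# Measure mapped through a CASE expression: non-special rows contribute NULL,
# which SUM skips, so the total covers only special-product sales.
special_total = cur.execute("""
SELECT SUM(CASE WHEN special_product_flag = 'Y' THEN sales ELSE NULL END)
FROM fact_table
""").fetchone()[0]

# Plain Sales measure: every row contributes.
total = cur.execute("SELECT SUM(sales) FROM fact_table").fetchone()[0]

print(special_total, total)  # → 150.0 350.0
```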

  • Input ready query is not showing loaded data in the cube

    Dear Experts,
    In the input-ready query we have the problem that it does not show values that were not entered through that query. Is there any setting in the input-ready query we can use so that it shows the data loaded into the cube as well as the data entered through the input-ready query itself?
    Thanks,
    Gopi R

    Hi,
    Input-ready queries should always display the most recent data (i.e. all green requests and the yellow request). So check the status of the requests in the real-time InfoCube: there should be only green requests and at most one yellow request.
    In addition, you can try to delete the OLAP cache for the plan buffer query; use transaction RSRCACHE to do this. The technical name of the plan buffer query can be found as follows:
    1. InfoCube/!!1InfoCube, e.g. ZTSC0T003/!!1ZTSC0T003 if ZTSC0T003 is the technical name of the InfoCube
    2. MultiProvider/!!1MultiProvider, e.g. ZTSC0M002/!!1ZTSC0M002 if ZTSC0M002 is the technical name of the MultiProvider
    If the input-ready query is defined on an aggregation level using a real-time InfoCube, the first case is relevant; if the aggregation level is defined on a MultiProvider, the second case is relevant. If the input-ready query is defined on a MultiProvider containing aggregation levels, again the first case is relevant (find the real-time InfoCubes used in the aggregation level).
    Regards,
    Gregor

  • To load data from a cube in SCM(APO) system to a cube in BI system.

    Experts,
    Please let me know whether it is possible to load data from a cube in an SCM (APO) system to a cube in a BI system. If so, please explain the steps to perform.
    Thanks,
    Meera

    Hi,
    Think of it this way: to load data from any source we need a DataSource. You can generate an export DataSource for the cube in APO, then use that DataSource for BW extraction. I think it will work; in my case I'm directly loading data from APO to BW using the DataSources that are generated on the Planning Area.
    Why do you need to take the data from the APO cube? Is there any condition applied there? If it is not mandatory, you can use the same DataSource and load the data to BW directly. If conditions are applied while loading the data from APO into the APO cube, check whether the same is possible in BW; if so, use the DataSource and do the same calculation in BW directly.
    Thanks
    Reddy

  • How to load data to a cube from multiple infosources ?

    Hi friends,
    How do I load data to a cube from multiple InfoSources? Could you please answer this question?
    Thanks in advance.

    Hi,
    Say, for example, you need to load data into one cube from three InfoSources:
    1) Create three sets of update rules for the cube.
    2) Each time you create update rules, specify the name of the corresponding InfoSource and build the update rules accordingly.
    Regards
    satish

  • Data load from DSO to cube

    Hi gurus
    We have a typical problem: we have to combine two records into one when they reach the DSO.
    One source is a flat file and the other is R/3, so I get a few fields from the flat file and a few from R/3. They form one record when loaded into the DSO, and I see one record in the active data table. But when I load the data from that DSO to the cube (the data goes from the change log to the cube), I get two separate records in the cube (one loaded from the flat file and one loaded from R/3), which I don't want. I want only one record in the cube, just like in the active data table of the DSO.
    I can't get the data from the active data table because I need a delta load.
    Would you please advise what I can do to get that one record in the cube?
    Please help.

    Ravi,
    I am sending the data through a DTP only, but is there any solution to get one record? In another scenario I am getting data from two different ERP sources and getting one record in the DSO and the cube as well.
    But that is not happening in this second scenario, where I am getting data from a flat file and ERP and trying to create one record.
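    The merge-versus-append behaviour being discussed can be sketched as follows (hypothetical field names, plain Python rather than BW): a DSO overwrites and fills fields of records sharing the same key, while a cube simply adds each incoming delta as its own row, which is why the two change-log entries arrive as two cube records unless they collapse onto one key in the DSO first.

```python
# Sketch (hypothetical field names): a DSO merges partial records that share
# the same key; a cube would just append every incoming record.
dso = {}  # stands in for the active data table, keyed by the DSO key

def load_to_dso(record, key_fields=("order_no",)):
    key = tuple(record[f] for f in key_fields)
    merged = dso.get(key, {}).copy()
    # Overwrite/fill fields from the incoming record (DSO overwrite behaviour);
    # empty (None) fields do not clobber values already present.
    merged.update({k: v for k, v in record.items() if v is not None})
    dso[key] = merged

load_to_dso({"order_no": 42, "amount": 100.0, "region": None})   # from R/3
load_to_dso({"order_no": 42, "amount": None, "region": "EMEA"})  # from flat file
print(dso[(42,)])  # → {'order_no': 42, 'amount': 100.0, 'region': 'EMEA'}
```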

  • Data load from DSO to Cube in BI7?

    Hi All,
    We just migrated a dataflow from 3.5 to 7 in development and moved to production. So till now in production, the dataloads happend using the infopackages.
    a. Infopackage1 from datasource to ODS and
    b. Infopackage2 from ODS to the CUBE.
    Now after we transported the migrated dataflow to production, to load the same infoproviders I use
    1. Infopackage to load PSA.
    2. DTP1 to load from PSA to DSO.
    3. DTP2 to load from DSO to CUBE.
    Steps 1 and 2 work fine, but when I run DTP2 it gets terminated. However, when I then tried step b (above), it loaded the cube fine using the InfoPackage. So I am unable to understand why the DTP fails while the InfoPackage load is successful. In order to use the DTP, do we need to do any cleanup when using it for the first time? Please let me know if you have any suggestions.
    Please note that the DSO already has data loaded using infopackage.  (Is this causing the problem?)
    Thanks,
    Sirish.

    Hi Naveen,
    Thanks for the reply. The creation of a DTP is not possible without a transformation.
    The transformation has been moved to production successfully.

  • Partial data loading from dso to cube

    Dear All,
    I am loading data from a DSO to the cube through a DTP. The problem is that some records are getting added to the cube through data packet 1 (around 50 crore records), while through data packet 2 records are not getting added.
    It is a full load from DSO to cube.
    I have tried deleting the request and executing the DTP again, but the same number of records gets added to the cube through data packet 1, and after that no further records are added; the request remains in yellow state only.
    Please suggest.

    Nidhuk,
    A data load transfers package by package, and it sounds like yours got stuck in the second package. I suggest you check the package size and try increasing it to see if anything changes. 50 records per package is quite low; your load should not be spread out over too many packages.
    Regards, Jen

  • Automatically trigger the event to load data from Planning cube to Standard Cube

    Hello,
    We have a below set up in our system..
    1. A Planning BEx query using which user makes certain entries and writes back data to the planning cube.
    2. An actual reporting cube which gets data from the planning cube above.
    Now, what we want to do is to automate the data load from Planning cube to Reporting cube.
    This involves 2 things..
    1. Change the setting " Change real time load behaviour " of the planing cube to Planning.
    2. Trigger the DTP which loads data from Planning cube to reporting cube.
    We want to automate the above two steps...
    I have tried a few things to achieve this:
    1. Created an event in SM64.
    2. In the planning cube's "Manage" screen, clicked on "Subsequent Processing" and provided the event details (not sure if this is the correct place to provide the event details).
    3. Wrote an ABAP program which changes the setting of the planning cube ("Change real time load behaviour" to Loading).
    4. Created a process chain, where we use the event as the start variant, the ABAP program as the next step, and the DTP run as the last step.
    This, I hoped, would trigger the event as soon as a new request arrives in the planning cube, which in turn would trigger the process chain that loads the data from the planning cube to the reporting cube.
    This is not working. I don't think the event is being triggered, and even if it is, I am not sure whether it would start the process chain automatically. Any ideas, please?

    Hi,
    Try doing the transformation directly in the input cube by using a characteristic relationship (CR) of type exit. More details:
    http://help.sap.com/saphelp_nw70ehp2/helpdata/en/43/1c3d0f31b70701e10000000a422035/content.htm
    Hope it helps.

  • Error in delta loading from a planning cube

    hi all,
    I am using a delta load from one planning cube to another. When the changed records are of a magnitude of 100, the load is successful, but when they are of a magnitude of 100,000, it stays yellow with no data records sent and eventually fails. The packet size is 20000.
    any help is appreciated.

    Hello Ajay,
    Have you checked that your request is closed in the planning cube? FM RSAPO_CLOSE_TRANS_REQUEST will do it.
    It is a customizing setting to tell the InfoPackage to end "green" when it has no data:
    Transaction SPRO ==> SAP NetWeaver ==> SAP Business Information Warehouse ==> Automated Processes ==> Monitor Settings ==> Set Traffic Light Color. There you have to set the traffic light to green if no data is available in the system.
    Hope that helps
    best regards
    Yannick

  • Error while data loading in real time cube

    HI experts,
    I have a problem. I am loading data from a flat file. The data loads correctly up to the DSO, but when I try to load it into the cube, I get an error.
    The cube is a real-time cube for planning. I have changed its status to allow data loading, but the DTP still gives an error.
    It shows the error "error while extracting from DataStore", along with an RSBK 224 error and an RSAR 051 error.

    What was the resolution to this issue? We are having the same issue, only with an external system (not a flat file). We get the RSAR 051 error with a return code of 238, as if it is not even getting to the RFC connection (DI_SOURCE). We have been facing this issue for a while and have even opened a message with SAP.

  • While loading transaction data into a cube, which tables are generated?

    Hi,
    While loading transaction data into a cube, which tables are normally generated?

    Hi,
    Normally the data is loaded into the 'F' fact table (/BIC/F****, where **** is the cube name).
    When you compress the request, the data is moved into the 'E' table.
    Regards,
    Siva.

  • Unable to load master data to cube

    Hi,
    I have one master data 0PART which is referenced by 0Activity Type master data.
    I have added 0PART chara into one infocube. and wrote abap code in update rule as below:
    data:    begin of i_acttype occurs 0,
                    0PART like /BI0/PACTTYPE-ACTTYPE,
              end of i_acttype.
    select ACTTYPE into table i_acttype
            from /bi0/pacttype
        for all entries in DATA_PACKAGE
           where objvers   = 'A' and
                 acttype = DATA_PACKAGE-PART.
    The reason behind adding this code is:
    the cube is populated from the ODS, and in the ODS the data comes from R/3, but the standard extractor doesn't populate this field, so I am trying to load it from master data.
    When I try to load data into the InfoCube, it gets stuck in the update rule. It reads the data but does not transfer it to the cube, so I believe there is some problem in my code. In the monitor I can see that it starts processing the update for the cube and ends immediately without any error message; processing ends, but the request still shows as running.
    Can someone correct it for me?
    thanks,
    KS

    Hi
    You are just populating the internal table with the records from the SELECT, but there is no code that writes the looked-up value back to the PART field of the data package. I'm not very good in ABAP, but you can try something along the lines below and see if it works. (Note that an ABAP field name cannot begin with a digit, so the work-area fields are renamed; the lookup also reads the PART attribute keyed by the activity type, which appears to be the intent.)
    data: begin of i_acttype occurs 0,
            acttype like /bi0/pacttype-acttype,
            part    like /bi0/pacttype-part,
          end of i_acttype.
    select acttype part into table i_acttype
      from /bi0/pacttype
      for all entries in data_package
      where objvers = 'A'
        and acttype = data_package-acttype.
    sort i_acttype by acttype.
    loop at data_package.
      " Look up the master-data record for this row and copy the attribute.
      read table i_acttype with key acttype = data_package-acttype
           binary search.
      if sy-subrc eq 0.
        data_package-part = i_acttype-part.
        modify data_package.
      endif.
    endloop.
    Hope it helps.
    Regards
    Sadeesh

  • Error when loading from ODS to Cube

    Hello Friends,
    I am having trouble loading data from an ODS to an InfoCube in my 2004s system. When loading data I get these messages:
    07/03/2007 13:10:25 Data target 'ODSXYZ' removed from list of loadable targets; not loadable.
    07/03/2007 13:28:42 Data target 'ODSXYZ' is not active or is incorrect; no loading allowed.
    I checked for ODSXYZ in my data targets, but there is nothing by that name; even the InfoPackage doesn't have it. What needs to be done? Please help.
    Thanks.

    It's expected behavior. When you migrate your DataSource, the InfoPackages associated with it grey out all the data targets they were feeding before; that applies to any InfoPackage you create even after the migration. You won't be able to delete it.
    Having said this, it shouldn't impact your loads from ODS to cube, as these should be handled by your DTPs rather than your InfoPackages.
    A few questions:
    How are you loading your cube?
    Did the data get through fine to the PSA with the InfoPackage in question?
    How did you load your DSO (assuming the load was successful)?

  • Using ODI to load 2 similar Essbase cubes

    Hello all,
    I am trying to create an ODI routine that incorporates the following details:
    -Loads specific Essbase member properties from a single flat file to 2 similar Essbase cubes
    -The Essbase cubes are virtually identical, except that one is ASO and one is BSO, therefore the properties may differ for specific members (i.e. DataStorage, Consolidation, Formula, etc.). There are also other differences in dimensionality, but those details are inconsequential to this routine...
    -For each type of member property, the flat file has 2 sets of columns - one for ASO and one for BSO. Therefore, one column might be called "ASODataStorage" while another one is called "BSODataStorage". They have different values in them respective to the Essbase architecture.
    -There is also a set of columns called "ASOFlag" and "BSOFlag" that flag which members should even be loaded to that respective cube. So, one sub hierarchy may be needed for the ASO cube, but not the BSO cube.
    Earlier, I created 2 contexts for these cubes so I could use the same ODI routine and source metadata to update both cubes with a simple change of context. This part is working correctly. The challenge I'm facing now, however, is how to design a routine that will select the appropriate columns of data based on my context. Therefore, if one context is called "ASO", it should select the "ASO" columns and apply only those properties/members. And a similar situation should happen when the "BSO" context is created.
    I have tinkered with the idea of using a variable that stores the current context information and is evaluated at runtime, changing the source columns based on the variable's value. However, I'm having trouble getting this to work correctly (I am a newbie to variables). I am also having trouble figuring out how to map all of the source columns to the target columns, as I am going from duplicated property columns in the source to a single set of property columns in the Essbase metadata target.
    Is there a better way to approach this in ODI? Please note that I am aware that EIS can handle this as well...but I am looking for an ODI solution.
    Thanks in advance!
    -OS

    Hi,
    In simple terms, you should be able to create a package, drag the variable onto the package, and set its value to the ASO flag.
    Then drag your interface onto the diagram to run the ASO load (the interface should have a filter that uses the variable to select the required records).
    Then drag the variable onto the diagram again and set its value to the BSO flag.
    Then drag your interface onto the diagram to run the BSO load (again, the interface's filter uses the variable to select the required records).
    Though, since you say you are changing contexts, you might have to go down the route of generating a scenario from a package for the ASO load, then a scenario for the BSO load, and then creating another package that uses the OdiStartScen tool to execute those scenarios in the specified contexts.
    Cheers
    John
    http://john-goodwin.blogspot.com/
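    The column-selection idea itself is simple to sketch outside ODI (hypothetical column names, plain Python): a single mode variable picks which flag and property columns to read, so the same routine serves both cubes.

```python
# Sketch (hypothetical column names): pick the ASO or BSO property columns
# and filter rows by the matching flag, driven by a single mode variable.
rows = [
    {"Member": "Sales",  "ASODataStorage": "StoreData",  "BSODataStorage": "StoreData",
     "ASOFlag": "Y", "BSOFlag": "Y"},
    {"Member": "Detail", "ASODataStorage": "NeverShare", "BSODataStorage": "DynamicCalc",
     "ASOFlag": "N", "BSOFlag": "Y"},
]

def select_for_cube(rows, mode):
    """mode is 'ASO' or 'BSO'; returns (member, data_storage) pairs to load."""
    flag_col, storage_col = f"{mode}Flag", f"{mode}DataStorage"
    return [(r["Member"], r[storage_col]) for r in rows if r[flag_col] == "Y"]

print(select_for_cube(rows, "ASO"))  # → [('Sales', 'StoreData')]
print(select_for_cube(rows, "BSO"))  # → [('Sales', 'StoreData'), ('Detail', 'DynamicCalc')]
```

    In ODI terms, the `mode` value would come from the variable evaluated at runtime, and the two `(flag, storage)` column pairs correspond to the duplicated source columns described above.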
