Monthly Snapshot

Hi experts,
I'm working on the monthly snapshot scenario, following the instructions given in the related How-To from SAP, but I have one problem:
When I load data from goods receipts or issues, everything is OK, but when I load stock type changes, 0BASE_UOM and 0LOC_CURRCY end up empty.
For example, I have already loaded one record with the following values:
0BASE_UOM | 0LOC_CURRCY | ZTOTALST | 0STOC_TYPE
EA        | MXN         |       30 | A
If I then load a 2LIS_03_BF delta with one stock type change of 666 units from type A to type D (posted in R/3 with transaction MB1B, movement type 344, unrestricted to blocked stock), the following records appear:
0BASE_UOM | 0LOC_CURRCY | ZTOTALST | 0STOC_TYPE
          |             |     -666 | A
          |             |      666 | D
Neither the base unit nor the currency is filled.
Does anybody have any idea?
Best regards
Edited by: Angel Moro on Jun 26, 2008 4:35 PM

Hi,
I think 0RECORDMODE is necessary in the 2LIS_03_BF InfoSource for other purposes, so I can't just leave it unmapped.
My question was whether it is used for anything apart from calculating the sign of the key figures.
If that is its only purpose, then I could delete that flag in order to update 0BASE_UOM and 0LOC_CURRCY.
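To illustrate what I mean, here is a minimal, hypothetical sketch (plain Python with invented structures, not the actual BW update logic): my understanding is that the record mode only drives the sign/cancellation of the key figures, while characteristics such as 0BASE_UOM and 0LOC_CURRCY are taken from the record itself, so if the delta delivers them empty they stay empty unless a rule fills them (e.g. via a master data lookup).

def aggregate(target, record):
    # Aggregate one delta record into a per-stock-type total (illustration only).
    key = record["0STOC_TYPE"]
    row = target.setdefault(key, {"ZTOTALST": 0, "0BASE_UOM": "", "0LOC_CURRCY": ""})
    # Record-mode handling (simplified): it only influences how the key figure
    # is aggregated, e.g. a reverse image cancels the previously loaded value.
    sign = -1 if record.get("0RECORDMODE") == "R" else 1
    row["ZTOTALST"] += sign * record["ZTOTALST"]
    # Characteristic handling: copied from the record, never derived from the
    # record mode; a lookup would have to go here if the delta record arrives
    # without unit or currency.
    for char in ("0BASE_UOM", "0LOC_CURRCY"):
        if record.get(char):
            row[char] = record[char]

stock = {"A": {"ZTOTALST": 30, "0BASE_UOM": "EA", "0LOC_CURRCY": "MXN"}}
for rec in ({"0STOC_TYPE": "A", "ZTOTALST": -666, "0RECORDMODE": ""},
            {"0STOC_TYPE": "D", "ZTOTALST": 666, "0RECORDMODE": ""}):
    aggregate(stock, rec)
print(stock)  # quantities move from A to D; unit/currency only if delivered or looked up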
Regards
Edited by: Angel Moro on Jun 30, 2008 3:46 PM

Similar Messages

  • Monthly snapshot for Open / Cleared Cube

    Can the standard Delta extractor 0FI_AR_4 be leveraged to get a monthly snapshot of Open and Closed items?
    Open - All and Closed only for the past year.
    Currently I am thinking of feeding a delta up to a standard DSO at the account number / line item level.
    The next level will be a Monthly DSO at the Business area level, which summarizes the open balance and weighted days to pay for cleared. This feeds to a Monthly snapshot Cube.
    Now, between the line item DSO and the monthly DSO, I am thinking of full loads that bring all the open items and only the past year of cleared records. The way I build the snapshot is to compare the posting date of each document to the current fiscal period: if the posting date is earlier than the current fiscal period, I duplicate that record and put it in both the previous month and the current month. This way we build two months of history at a time, and at the end of the current month the previous month's snapshot is frozen (a sketch of this restating logic is included at the end of this thread).
    Is there a way with deltas to accomplish a 12-month snapshot (and not just two months with full loads)?
    All thoughts appreciated

    The requirement is to show Outstanding Balance not just as of today but as it was each month.
    say item A was open in Jan through today
           item B was open Jan through March
           item C was open Jan and Feb
    The report should show, item A in Months JAN, FEB, MARCH, APRIL
                                                    B in Months JAN , FEB, MARCH
                                                    C in Months JAN and FEB
    With conventional reporting, there will be one entry for A in Jan as open, and when it closes in May, there will be a reverse entry for Jan and an after image for May.
    What I need is to show A as Open in all Months through April, B Open in all months through March, etc.
    Hence the name snapshot.
    Edited by: CC on Apr 20, 2010 9:17 PM
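    A hedged sketch of the restating logic described above (plain Python, illustrative field names, not an actual BW transformation): every open item is written to the current month's snapshot, and items posted before the current period are also written to the previous month, so two months of history are built per load and the older month is frozen at month end.

    def build_snapshot(open_items, current_period):
        # current_period and posting_period are (year, month) tuples.
        if current_period[1] > 1:
            prev_period = (current_period[0], current_period[1] - 1)
        else:
            prev_period = (current_period[0] - 1, 12)
        snapshot = []
        for item in open_items:
            snapshot.append({**item, "snapshot_period": current_period})
            # Items carried over from earlier periods also land in the previous month,
            # so that month's picture is complete when it is frozen.
            if item["posting_period"] < current_period:
                snapshot.append({**item, "snapshot_period": prev_period})
        return snapshot

    items = [{"doc": "A", "posting_period": (2010, 1)},   # open since January
             {"doc": "B", "posting_period": (2010, 4)}]   # posted in the current month
    print(build_snapshot(items, (2010, 4)))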

  • Weekly/Monthly Snapshot Report

    Hi,
    I now have three cubes: daily, weekly and monthly.
    I made copies of the daily cube to create the weekly and monthly cubes, so structure-wise there is no difference.
    But what extra do I need to do to turn these into a weekly snapshot report and a monthly snapshot report?
    All three cubes are added to a MultiProvider, and the MultiProvider is used in the definition of the query.
    For the daily report I simply restricted the InfoProvider to the daily cube and designed the query.
    But what about the weekly and monthly snapshot reports?
    As snapshot reports are basically meant for comparing time periods, how can I achieve that in the reporting?
    What I am contemplating is as follows:
    1) Since these compare different times (for a weekly snapshot you must have at least two different weeks of data to compare), I guess we need to include a time characteristic: 0CALWEEK for the weekly snapshot, 0CALMONTH for the monthly snapshot.
    But even after including this time characteristic, how would I do the comparison?
    I am not clear on how to create a snapshot report.
    What does one have to do in the weekly and monthly cubes to be able to compare the data on a weekly/monthly basis?

    Simran,
    Weekly snapshot means the entire data set has to be restated against each week.
    For example, to check open order status week by week, all open orders (up to the end of the current week) have to be restated against the current week; at the end of the week, the remaining open orders are passed on as the next week's snapshot.
    In this way we get a week-by-week snapshot of open orders, and we can compare those weeks.
    For the weekly snapshot, add the week to the cube; at the end of each week, the current week's closing balances carried forward as the next week's opening balances, plus the transactions in that week, give the snapshot of that week (see the sketch at the end of this reply).
    For example Check: [HOW TO HANDLE INVENTORY MANAGEMENT SCENARIOS IN BW (NW2004)|http://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328]
    Snapshot scenario.
    Hope it Helps
    Srini
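    A small sketch of the restating logic above (assumed structures in plain Python, not the actual update rules): the closing balance of one week is carried forward as the opening balance of the next, and that week's movements are added on top, giving one snapshot record per week that can then be compared.

    def weekly_snapshots(opening_balance, movements_by_week):
        # movements_by_week: list of (week, net movement) in chronological order.
        snapshots = []
        balance = opening_balance
        for week, net_movement in movements_by_week:
            balance += net_movement  # opening of this week + transactions in the week
            snapshots.append({"week": week, "closing_balance": balance})
        return snapshots             # closing of week N = opening of week N+1

    print(weekly_snapshots(100, [("2008.45", 20), ("2008.46", -5), ("2008.47", 0)]))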

  • Inventory Book Monthly snapshot

    Hey Team,
    I bring in MARD data on 9/30/2008, which gives me an inventory snapshot for the end of the month.
    How do I get these snapshots every month?
    Meaning: is there a delta method I can use, or a function module I can write for this,
    or do I do a full load at the end of each month?
    thnx
    tb

    Hi
    You have to implement 2LIS_03_BX and 2LIS_03_BF. There is a detailed How-To document available to guide you on using the snapshot vs. cumulative methods. Search SDN for the mentioned DataSources to get a link to this document.
    Regards

  • How to load stocks for the months prior to initialization?

    Hello,
    We load a stock cube from 3 ODS, one for each of the BF, BX and UM extractors.
    We are in a monthly snapshot scenario (we have read the specific How-To), but we must use 3 ODS, one per DataSource, because of data integrity (a strong customer architecture constraint).
    For the opening stock and the future months the solution is OK.
    How do we load the data for the 12 months prior to the loading of the opening stock?
    Specific DataSources, full loads?
    Thanks for sharing your experiences.
    Best regards
    Christophe

    Your previous month's closing stock becomes the opening stock for the current month. Try the info structure S039, which holds the closing stock of each month. You can populate this info structure with program RMBS039, create a generic DataSource on it, and populate the opening and closing stock for each month.
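    A minimal sketch of this idea, assuming the closing stock per month is available (for example from S039 via a generic DataSource); the names and numbers are purely illustrative. Each prior month becomes one snapshot record whose opening stock is simply the previous month's closing stock.

    monthly_closing = {"2008.01": 410, "2008.02": 435, "2008.03": 460}  # e.g. read from S039

    def snapshot_records(closing_by_month):
        months = sorted(closing_by_month)
        records = []
        for prev, curr in zip(months, months[1:]):
            records.append({
                "calmonth": curr,
                "opening_stock": closing_by_month[prev],   # previous month's closing stock
                "closing_stock": closing_by_month[curr],
            })
        return records

    print(snapshot_records(monthly_closing))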

  • Inventory Management Weekly snapshot 0IC_C03

    Hi,
    We are trying to install the stock flow with the 0IC_C03 cube. Our Business Content (Release 703, Level 8) has the new InfoSources (2LIS_03_BF_TR, 2LIS_03_BX_TR and 2LIS_03_UM_TR) and the related transformations from the InfoSources to the cube. But we've realized that the transformation from the 2LIS_03_BF DataSource to the 2LIS_03_BF_TR InfoSource is missing (the other 2 transformations do appear in our Business Content).
    We are thinking of doing the following: keep the transformations for the 2LIS_03_BX and 2LIS_03_UM side, and copy the 2LIS_03_BF side from another Business Content installation (manually, of course). Does anybody know if this is OK?
    On the other hand, we need to create a weekly snapshot. Reviewing the document "How to... Handle Inventory Management Scenarios in BW", the process described there builds a monthly snapshot on the old flow of the 3 DataSources (2LIS_03_BF, 2LIS_03_BX and 2LIS_03_UM), i.e. the flow with transfer and update rules.
    So, is it recommended to implement the old flow for the 3 DataSources (the one with transfer and update rules)? And does anybody know how we can modify the routines to make it weekly rather than monthly?
    Thanks in advance for your help.
    Best regards

    Hello Angel Moro,
    Were you able to implement the weekly snapshot cube? If yes, can you please email me the step-by-step details of how you did it at [email protected]?

  • Stock Snapshot using APD

    Hello all,
    Overview
    I have a requirement to get a monthly snapshot of stock for all materials, plants and other characteristic combinations. The "How to... Inventory" guide mentions using a snapshot ODS and then a snapshot cube.
    Analysis
    I would like to avoid the "How to" approach and replace it with an APD.
    I would create a query on the 0IC_C03 InfoCube and then, using the APD monthly, create the snapshot record set for all materials, plants and other characteristic combinations.
    I have already looked at various OSS notes, including 751577 - APD-FAQ: Data source query.
    Questions
    1) Will a query as the source for an APD work with large volumes of data?
    2) If yes, is there a limitation on the volume of records that can be handled with the query-based APD feeding a transactional ODS? Our client will have one of the highest record volumes.
    At the moment we are on BW 3.5.
    Please let me know if you have any experience with or thoughts about this.
    Thanks
    TK

    Hi Mansi,
    Thanks for the reply.
    Scenario: I have data in a standard DSO, and I want to send that data from BI to the CRM system.
    1) Yes, I have seen the CRM icon in the targets, but I am not able to connect the BI and CRM systems. Please let me know which transformation option I should use, because I don't want to do any calculations here; I just want to transfer the data from BI to CRM.
    2) I have clicked the DSO option in transaction RSANWB, selected the standard DSO (drag and drop) and selected the CRM icon, and I have connected both with the black arrow and the Transfer Rule icon in the middle.
    3) After dragging and dropping the DSO, I right-clicked the DSO and selected Properties; there I can see the InfoObjects (under Selected Characteristics). If I right-click the CRM icon (I can see the description name for CRM) and select Properties, it asks for a login ID and password for the CRM server; after that I can see "Description which I have given", "Logical System" and "Data Target".
    I don't know what to enter in these options.
    Thanks,
    Nithi.
    Please guide me if I have missed any steps.

  • Snapshot Cube

    Hi Guys,
    I was wondering if you could give me a hand with a snapshot cube that one of my clients wants me to create for one of their APO cubes.
    Basically, they want to be able to take a monthly snapshot (on whatever day of the month they choose) of their APO cube, but it should only retain the last three snapshots (the last three months).
    i.e. if there are already three months in the snapshot cube (January, February and March), then when April comes and they take their monthly snapshot, it should knock off January and add April, ending up with February, March and April.
    What is the best way to achieve this?
    Thanks in advance.
    Nick
    Edited by: Nick_Raus on Jan 27, 2011 1:36 AM

    Hi,
    In order to copy the latest month (April in your example), you can use an ABAP routine filter in the corresponding DTP, which calculates the period range to be copied based on SY-DATUM.
    On the other hand, when it comes to deleting the oldest month (January in your example), you can use the DELETE_FACTS transaction to create a selection program that you can include in your process chain as an ABAP process type. A very helpful example can be found at: http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/603a9558-0af1-2b10-86a3-c685c60071bc
    When selecting the method of calculation for the date, you can use:
    "Beginning of month - xx months, end of month + yy months" (for example, if you are in April and want to delete January, use xx = 3, yy = -3); see the sketch below.
    I hope this helps you.
    Regards,
    Maximiliano
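    A hedged sketch of the date arithmetic described above (pure illustration in Python; in BW this would live in the DTP filter routine and the DELETE_FACTS selection): from today's date, derive the period to be loaded (the current month) and the period to be dropped (beginning of month - 3 / end of month - 3 in the example).

    from datetime import date

    def shift_month(d, months):
        # Return the first day of the month `months` months away from d.
        m = d.month - 1 + months
        return date(d.year + m // 12, m % 12 + 1, 1)

    def snapshot_periods(today):
        load_period = today.strftime("%Y%m")                   # month copied by the DTP
        drop_period = shift_month(today, -3).strftime("%Y%m")  # oldest month to delete
        return load_period, drop_period

    print(snapshot_periods(date(2011, 4, 15)))  # ('201104', '201101'): April in, January out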

  • Snapshot, wrong information in 0CALMONTH

    Hi Gurus,
    We have created a monthly snapshot following the document "How to Handle Inventory Management Scenarios in BW".
    If, one month, the process that uploads the movements from the DSO to the InfoCube is not executed (or it is executed and fails), then when we run the process again the next month, the old data is updated in the InfoCube with wrong information in 0CALMONTH (because 0CALMONTH is filled with the system date at the moment of the load).
    As a result, the old and the new data end up added together in the same month.
    Do you know how to resolve this problem, or how to avoid it?
    Any feedback will be really appreciated.
    Thanks

    Hi David Trapote,
    We also faced the same problem when deriving the week or month from the system date.
    To avoid this problem, we maintained the date in a table; the week and month are derived from this table instead of from the system date.
    Create a simple program to update the table before the load starts and include this program in the process chain after the start process (before the data loading). Then you can avoid these problems permanently (see the sketch after this reply).
    Don't forget to create the program with a user selection for the input, so you can update the date accordingly.
    Hope it Helps
    Srini
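    A sketch of the control-table idea above (the table and field names are invented for illustration): the load period comes from an entry that a small program maintains at the start of the process chain, not from the system date, so a delayed or repeated load still posts into the intended month.

    control_table = {"SNAPSHOT_LOAD": {"calmonth": "200805"}}  # maintained before the load starts

    def derive_calmonth(load_id, system_date_month):
        entry = control_table.get(load_id)
        if entry:                    # normal case: use the period planned for this load
            return entry["calmonth"]
        return system_date_month     # fall back to the system date only if nothing is set

    print(derive_calmonth("SNAPSHOT_LOAD", "200806"))  # -> '200805', even if the load runs a month late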

  • Manage btrfs snapshots

    One of the features that I like most about btrfs is snapshots.
    What I want to do is create snapshots of /home every 5/15 minutes (I still have to test the impact on performance) and retain:
    02 - Yearly snapshots
    12 - Monthly snapshots
    16 - Weekly snapshots
    28 - Daily snapshots
    48 - Hourly snapshots
    60 - Minutely snapshots
    The above scheme is indicative, and I'm looking for suggestions to improve it.
    After a lot of Google searching I finally found a very simple but powerful bash script to manage btrfs snapshots: https://github.com/mmehnert/btrfs-snapshot-rotation
    I've changed the script a bit to use my naming scheme:
    #!/bin/bash
    # Take and rotate snapshots on a btrfs file system.
    # Parse arguments:
    SOURCE=$1
    TARGET=$2
    SNAP=$3
    COUNT=$4
    QUIET=$5
    # Function to display usage:
    usage() {
        scriptname=$(/usr/bin/basename "$0")
        cat <<EOF
    $scriptname: Take and rotate snapshots on a btrfs file system
    Usage:
    $scriptname source target snap_name count [-q]
    source:    path to make snapshot of
    target:    snapshot directory
    snap_name: base name for snapshots, appended to date "+%F_%H-%M"
    count:     number of snapshots in the snap_name-timestamp format to
               keep at one time for a given snap_name
    [-q]:      be quiet
    Example for crontab:
    15,30,45 * * * * root /usr/local/bin/btrfs-snapshot /home /home/__snapshots quarterly 4 -q
    0 * * * * root /usr/local/bin/btrfs-snapshot /home /home/__snapshots hourly 8 -q
    Example for anacrontab:
    1 10 daily_snap /usr/local/bin/btrfs-snapshot /home /home/__snapshots daily 8
    7 30 weekly_snap /usr/local/bin/btrfs-snapshot /home /home/__snapshots weekly 5
    @monthly 90 monthly_snap /usr/local/bin/btrfs-snapshot /home /home/__snapshots monthly 3
    EOF
        exit
    }
    # Basic argument checks:
    if [ -z "$COUNT" ] ; then
        echo "COUNT is not provided."
        usage
    fi
    if [ ! -z "$6" ] ; then
        echo "Too many options."
        usage
    fi
    if [ -n "$QUIET" ] && [ "x$QUIET" != "x-q" ] ; then
        echo "Option 5 is either -q or empty. Given: \"$QUIET\""
        usage
    fi
    # $max_snap is the highest number of snapshots that will be kept for $SNAP.
    max_snap=$(($COUNT - 1))
    # $time_stamp is the timestamp appended to the snapshot name.
    time_stamp=$(date "+%F_%H-%M")
    # Clean up older snapshots:
    for i in $(ls "$TARGET" | sort | grep "${SNAP}" | head -n -${max_snap}); do
        cmd="btrfs subvolume delete $TARGET/$i"
        if [ -z "$QUIET" ]; then
            echo "$cmd"
        fi
        $cmd >/dev/null
    done
    # Create the new snapshot:
    cmd="btrfs subvolume snapshot $SOURCE $TARGET/${SNAP}-${time_stamp}"
    if [ -z "$QUIET" ]; then
        echo "$cmd"
    fi
    $cmd >/dev/null
    I use fcrontab:
    [root@kabuky ~]# fcrontab -l
    17:32:58 listing root's fcrontab
    @ 5 /usr/local/bin/btrfs-snapshot /home /home/__snapshots minutely 60 -q
    @ 1h /usr/local/bin/btrfs-snapshot /home /home/__snapshots hourly 48 -q
    @ 1d /usr/local/bin/btrfs-snapshot /home /home/__snapshots daily 28 -q
    @ 1w /usr/local/bin/btrfs-snapshot /home /home/__snapshots weekly 16 -q
    @ 1m /usr/local/bin/btrfs-snapshot /home /home/__snapshots monthly 12 -q
    And this is what I get:
    [root@kabuky ~]# ls /home/__snapshots
    [root@kabuky ~]# ls -l /home/__snapshots
    total 0
    drwxr-xr-x 1 root root 30 Jul 15 19:00 hourly-2011-10-31_17-00
    drwxr-xr-x 1 root root 30 Jul 15 19:00 hourly-2011-10-31_17-44
    drwxr-xr-x 1 root root 30 Jul 15 19:00 minutely-2011-10-31_17-18
    drwxr-xr-x 1 root root 30 Jul 15 19:00 minutely-2011-10-31_17-20
    drwxr-xr-x 1 root root 30 Jul 15 19:00 minutely-2011-10-31_17-22
    drwxr-xr-x 1 root root 30 Jul 15 19:00 minutely-2011-10-31_17-24
    drwxr-xr-x 1 root root 30 Jul 15 19:00 minutely-2011-10-31_17-26
    drwxr-xr-x 1 root root 30 Jul 15 19:00 minutely-2011-10-31_17-32
    drwxr-xr-x 1 root root 30 Jul 15 19:00 minutely-2011-10-31_17-37
    drwxr-xr-x 1 root root 30 Jul 15 19:00 minutely-2011-10-31_17-42

    tavianator wrote:
    Cool, I just wasn't sure if the "And this is what i get" was what you were expecting.
    Looks good, I may take a look since I'm using btrfs. Don't confuse this for a backup though.
    Sorry for the confusion, my English is not very good.
    Yes, this is not a backup.
    It's a sort of parachute: if you do something stupid with your files, you can come back at any time.

  • Snapshot Dimension & Fact tables

    We are currently designing a logical multidimensional model from OLTP tables. The dimension tables have monthly snapshots because some or all of the attributes might change on a monthly basis; the fact table likewise has a monthly snapshot.
    I know that, according to Kimball's modeling, such attributes should be implemented using a mini-dimension, with the combination key put as a foreign key in the fact table, but this needs an ETL job to handle the mini-dimension and the other fact table. However, in our situation, and given the scope limitation, there is no time to design a separate ETL process to handle the mini-dimension (a small sketch of the mini-dimension idea is included at the end of this thread).
    An example for our records:
    Customer:
    Cust_ID
    Month_ID
    Attr1
    Attr2
    Attr3
    Attr20
    Installments:
    Installment_ID
    Cust_ID
    Month_ID
    Attr1
    Attr2
    Attr3
    Attr5
    Measure1
    Measure2
    Measure3
    Measure4
    So, the Installments table contains attributes as well as measures. What is the suitable consideration and the suitable OBIEE logical BM design: should we consider the Installments table as a fact, or should we divide it into dimension and fact tables?

    Xerox wrote:
    So, the Installments table contains attributes as well as measures. What is the suitable consideration and the suitable OBIEE logical BM design: should we consider the Installments table as a fact, or should we divide it into dimension and fact tables?
    You already gave the answer yourself: you should create an Installment dimension and an Installment fact table, using the same physical table as the logical table source. The logical dimension should only contain the attributes; the logical fact should only contain the measures.
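    For reference, a tiny sketch of the Kimball mini-dimension idea mentioned in the question (purely illustrative Python structures with hypothetical attribute values): the volatile attributes move into a mini-dimension keyed by their value combination, and the fact row carries that combination key instead of repeating the attributes every month.

    mini_dim = {}  # (attr1, attr2, attr3) -> surrogate key

    def mini_dim_key(attrs):
        # Reuse the surrogate key if this attribute combination was seen before.
        if attrs not in mini_dim:
            mini_dim[attrs] = len(mini_dim) + 1
        return mini_dim[attrs]

    fact_rows = []

    def load_fact(cust_id, month_id, attrs, measure1):
        fact_rows.append({
            "cust_id": cust_id,
            "month_id": month_id,
            "mini_dim_key": mini_dim_key(attrs),  # combination key as foreign key
            "measure1": measure1,
        })

    load_fact(1, "2013-01", ("GOLD", "NORTH", "ACTIVE"), 10)
    load_fact(1, "2013-02", ("GOLD", "NORTH", "ACTIVE"), 12)  # same attributes, same key
    print(mini_dim, fact_rows)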

  • Regular snapshots of project information for trend analysis

    Hi folks,
    I am working with a client who wants to take regular (monthly) snapshots of Project Server data, specifically project status, estimates, actuals, resource allocations etc. I note that 2007 and 2010 don't offer anything OOTB besides the baseline (BL) functionality; does Project Server 2013 offer anything new in this area?
    Example: we want to match SAP actuals with the Project Server current values/estimates and record monthly snapshots of this information so we can do trend analysis. Is it possible in Project Server 2013 OOTB to take monthly snapshots beyond BL 10+1?
    Thanks,
    Gavin.

    Hi Gavin,
    As Guillaume and Sachin have correctly stated, there is no monthly snapshotting functionality OOTB. The easiest solution is to do this via SQL. I blogged about a simple example a few years back, see below:
    http://pwmather.wordpress.com/2012/04/27/projectserver-data-snapshot-for-reporting-ps2010-msproject-sp2010-excel-ssrs-sql/
    That should give you an idea; a generic sketch of the same pattern also follows after this reply.
    Paul
    Paul Mather | Twitter |
    http://pwmather.wordpress.com | CPS
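    For illustration, a generic sketch of that snapshot pattern (hypothetical table and column names, shown with Python and SQLite rather than the Project Server reporting database): on a monthly schedule, copy the current project rows into a snapshot table stamped with the snapshot date, so trends can later be compared across snapshots.

    import sqlite3
    from datetime import date

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE project_current (proj TEXT, actual_cost REAL)")
    con.execute("CREATE TABLE project_snapshot (snapshot_date TEXT, proj TEXT, actual_cost REAL)")
    con.execute("INSERT INTO project_current VALUES ('P1', 1200.0), ('P2', 300.0)")

    def take_snapshot(con, snapshot_date):
        # Append a dated copy of the current rows; older snapshots stay untouched.
        con.execute(
            "INSERT INTO project_snapshot "
            "SELECT ?, proj, actual_cost FROM project_current",
            (snapshot_date.isoformat(),),
        )

    take_snapshot(con, date(2013, 1, 31))
    print(con.execute("SELECT * FROM project_snapshot").fetchall())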

  • 0FI_AR_4 Modeling question

    Hi ,
    How does one model a solution in BW to get a correct historical update of open and closed items from 0FI_AR_4?
    Imagine that the data is staged upward from a write-optimized DSO to a standard DSO and then up into the cube.
    Now, at the data mart level, how do we ensure a consistent, updated picture of open/closed items as they change from one status to another?
    Thanks

    Thanks Simon,
    I like the idea of two separate cubes for Open and Closed.
    Our existing design is as follows:
    The five write-optimized DSOs (WDSOs), one for each source system, feed data to one line-item standard DSO (SDSO). The key of this SDSO includes the Open/Closed indicator.
    The SDSO feeds (via one DTP) a cube for AR line items (CubeA, which doesn't have the source system field) and also feeds a transformation DSO (TDSO), which aggregates into total AR, monthly debits, and current and past due amounts on line items, plus a couple of lookups.
    This TDSO feeds up into a cube for AR monthly figures (CubeM).
    The design team that developed this model is no longer reachable, and I am working towards understanding how it works and proposing some performance fixes, which could also entail a complete redesign.
    They have 5 process chains, one for each source system, each triggered by loads into the WDSOs.
    Each PC has two parts:
    1. Open item selective deletion
    2. Closed item selective deletion
    The explanation for the numerous selective deletes every day is:
    Every day there is a full load of open items and of only the current year's closed items. So it is necessary to delete and reload the open items and the current-year closed items in the WDSOs, the SDSO and CubeA.
    For the monthly snapshot to be accurate, it is necessary to delete and reload the current month's data in the TDSO and CubeM.
    Now, in the absence of any real knowledge transfer, and in order to make a decent evaluation/recommendation, I need to understand why they have the current model and the multiple selective deletes.
    My observations:
    - The Open/Closed indicator cannot be part of the key of the SDSO, as it would result in doubling up of the key figures.
    - There need not be selective deletion for the open items, provided the source system field is included in CubeA and source-system-specific DTPs are employed.
    - To eliminate the selective deletes for the closed items, the 'Delete Overlapping Requests' (DOR) process type can be used in the process chains. To account for building history one month at a time (rolling 12 months), the DOR selection could be coded in the routine.
    Please let me know, based on the above description, whether my observations are correct, and what it would cost to make the suggested fixes.
    Also, there should be a better way of building this model. Please let me know your thoughts. I appreciate your insightful replies!

  • Using @Prompt in the SELECT clause (?)

    Post Author: faltinowski
    CA Forum: Semantic Layer and Data Connectivity
    Product:  Business Objects
    Version:  6.5 SP3 
    Patches Applied:  MHF11 and CHF48
    Operating System(s):  Windows
    Database(s):  Oracle
    Error Messages:  "Parse failed: Exception: DBD, ORA-00903 invalid table name  State N/A"
    Hi!  I'm bewildered -- we have an object that parses but when I try to reproduce this object, it does not.
    We have a universe that's been in production for several years, using an object developed by another designer who's no longer with the company. This object is a dimension, its datatype is character, and there's no LOV associated. The SELECT statement in this object is:
    decode(@Prompt('Select Snapshot Month','A','Object Definitions\CY Month Snapshot',MONO,CONSTRAINED),'00-Previous Month',to_number(to_char(add_months(sysdate,-1),'MM')),'01-Current Month',to_number(to_char(add_months(sysdate,0),'MM')),'01-January','1','02-February','2','03-March','3','04-April','4','05-May','5','06-June','6','07-July','7','08-August','8','09-September','9','10-October','10','11-November','11','12-December','12')
    This object parses. The client uses the object in the select clause to capture the "month of interest" for the report.  So the report may be for the entire year's data which is graphed to show trends, but there's a table below the graph which is filtered to show just the month of interest.  Typically they use the value "00-Previous Month" so they can schedule the report and it will always show the last month's data.
    Problem
    The original object parses.
    If I copy the object within the same universe, it parses.
    If I copy the code into a new object in the same universe, it doesn't parse
    If I copy the code into a new object in a different universe, it doesn't parse
    If I copy the object to a different universe, then edit the LOV reference, it doesn't parse
    If I create any new object having @Prompt in the SELECT statement, it doesn't parse.
    If another designer tries, they get the same thing.
    What am I missing?  Obviously someone was able to create this successfully.
    On the brighter side
    The object I created in a new universe (which doesn't parse in the universe) seems to work fine in the report.

    It seems that the prompt syntax is correct, but the condition is not.
    You are taking the prompt value and not doing anything with it; that could be one cause of this error.
    I believe that you just want to capture the prompt value and use it at report level, and do not want to apply it as a filter.
    So, use a condition like the following:
    @Prompt('Select Grouping','A',{'A','B','C'},mono,constrained) = @Prompt('Select Grouping','A',{'A','B','C'},mono,constrained)
    Hope this helps!

  • Non-Cumulative vs. Cumulative KeyFigures for Inventory Cube Implementation?

    A non-cumulative is a non-aggregating key figure at the level of one or more objects, which is always displayed in relation to time. Generally speaking, in SAP BI data modeling there are two options for non-cumulative management. The first option is non-cumulative management with non-cumulative key figures. The second option is non-cumulative management with normal (cumulative) key figures. For SAP Inventory Management, 0IC_C03 is a standard Business Content cube based on the option of non-cumulative management with non-cumulative key figures. Due to specific business requirements (this cube is designed primarily for detailed inventory balance reconciliation), we have to enhance 0IC_C03 to add additional characteristics such as document number, movement type and so on. The original estimated size of the cube is about 100 million records, since we are extracting all history records from ECC (inception to date). We spent a lot of time debating whether we should use non-cumulative key figures based on the standard Business Content of the 0IC_C03 cube. We understand that, by using non-cumulative key figures, the fact table will (potentially) be smaller. But there are some disadvantages, such as the following:
    (1) We cannot use the InfoCube together with another InfoCube with non-cumulative key figures in a MultiProvider.
    (2) The query runtime can be affected by the calculation of the non-cumulative.
    (3) The InfoCube cannot be logically partitioned by time characteristics (e.g. fiscal year), which makes future archiving difficult.
    (4) It is more difficult to maintain non-cumulative InfoCube since we have added more granularity (more characteristics) into the cube.
    Thus, we have decided not to use the non-cumulative key figures. Instead, we are using cumulative key figures such as Receipt Stock Quantity (0RECTOTSTCK), Issue Stock Quantity (0ISSTOTSTCK),
    Receipt Valuated Stock Value (0RECVS_VAL) and Issue Valuated Stock Value (0ISSVS_VAL). All four key figures are available in the InfoCube and are calculated during the update process. Based on a study of the reporting requirements, these four key figures seem to be sufficient to meet all reporting requirements.
    In addition, since we have decided not to use non-cumulative key figures, we have removed the non-cumulative key figures from the 0IC_C03 InfoCube and logically partitioned the cube by fiscal year. Furthermore, these InfoCubes are physically partitioned by fiscal year/period as well.
    To a large extent we are moving away from the standard Business Content cube and end up with a pretty customized cube here. We'd like to use this opportunity to seek some guidance from SAP BI experts. Specifically, we want to understand what we are losing by not using the non-cumulative key figures provided by the original 0IC_C03 Business Content cube (a small illustrative sketch of the cumulative approach is included at the end of this thread). Your honest suggestions and comments are greatly appreciated!

    Hello Marc,
    Thanks for the reply.
    I work for Dongxin and would like to add a couple of points to the original question...
    Based on the requirements, we decided to add document number and movement type, along with a few other characteristics, to the InfoCube (a custom InfoCube for article movements), because once we added these characteristics, the non-cumulative key figures, even with the marker properly set, were not handling the stock values (balances) and the movements correctly, causing data inconsistency issues.
    So we are just using the cumulative key figures and have decided to do the logical partitioning on fiscal year (as posting period is used to derive the time characteristics, and compared with MC.1 this makes more sense for a comparison between ECC and BI).
    Also, I have gone through the How-To manual for inventory, and in either case the reporting requirement is inception to date (not just a weekly or monthly snapshot).
    We would like to confirm whether there would be any long-term issues in doing so.
    To optimize performance we are planning to create aggregates at plant level.
    A couple of other points we took into consideration for using cumulative key figures:
    1. Parallel processes are possible if non-cumulative key figures are not used.
    2. Aggregates on a fixed plant are possible if non-cumulative key figures are not used (this matters because not all plants are active and some of them are not reported on).
    So, since we are not using the (non-cumulative) stock key figures, is it OK not to use 2LIS_03_BX, as this is only needed to bring in the stock opening balance?
    We would like to know if there would be any issue in using only BF and UM, with the InfoCube capturing article movements together with the cumulative key figures.
    Once again, thanks for your input on this issue.
    Thanks
    Dharma.
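    As a small illustration of the cumulative approach discussed in this thread (the key figure names follow the post; the arithmetic is a simplified assumption that ignores units, currencies and granularity): the stock balance as of a period is the running total of receipts minus issues up to that period, computed at reporting time rather than by the non-cumulative logic of 0IC_C03.

    def stock_as_of(period, movements):
        # movements: list of dicts with calmonth, 0RECTOTSTCK (receipts) and 0ISSTOTSTCK (issues).
        return sum(m["0RECTOTSTCK"] - m["0ISSTOTSTCK"]
                   for m in movements
                   if m["calmonth"] <= period)

    moves = [{"calmonth": "200801", "0RECTOTSTCK": 100, "0ISSTOTSTCK": 30},
             {"calmonth": "200802", "0RECTOTSTCK": 50,  "0ISSTOTSTCK": 60}]
    print(stock_as_of("200802", moves))  # 60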
