Question on "How to Handle Inv management Scenarios in BW" docuemnt.

Hi all,
I read the document "How to Handle Inventory Management Scenarios in BW" and I did not understand what a snapshot scenario is. Can anyone tell me the difference between the snapshot and non-cumulative scenarios?
thanks,
Sabrina.

In a non-cumulative scenario, the current stock of any day (or month) is not stored physically in the cube. Instead, the stock value over time is calculated dynamically at query runtime by the BW OLAP engine, which derives the stock by summing up the periodic value changes (the cumulative inflow and outflow of the non-cumulative key figure) that are stored in the cube, together with a so-called stock marker used as the basis of the calculation.
In the snapshot scenario, the current stock of any month is calculated in a snapshot ODS and then loaded into a cube. This means that the stock value is physically stored in the cube in an ordinary cumulative key figure.
Since a non-cumulative cube stores value changes rather than the actual stock, performance might be bad if there are many value changes for each characteristic combination in a month, since the stock is calculated at runtime and many records must be processed to derive it. In this case the snapshot scenario is better, since no runtime calculation of the stock needs to occur, and only one record, containing the actual stock value, is stored in each month for each characteristic combination having a stock value.
I think you would be better off with an example (see the sketch below); with this explanation in mind, looking at the scenarios in the How-to paper again might clarify things...
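A minimal Python sketch of the difference (all numbers and names are invented; BW's real marker and compression logic is more involved):

    # Movements per month for one material (positive = receipt, negative = issue).
    movements = {"2004-01": [+100, -30], "2004-02": [-20, +50], "2004-03": [-10]}

    # --- Non-cumulative scenario ---
    # The cube stores only the movements plus a stock marker (here: the stock
    # as of the most recent month). The OLAP engine derives the stock of an
    # earlier month at query runtime by rolling the marker back.
    marker_stock = 90  # stock as of 2004-03: 100 - 30 - 20 + 50 - 10

    def stock_noncumulative(month: str) -> int:
        stock = marker_stock
        for m, deltas in movements.items():
            if m > month:               # undo all movements after the requested month
                stock -= sum(deltas)
        return stock

    # --- Snapshot scenario ---
    # A snapshot ODS stores the end-of-month stock as an ordinary cumulative
    # key figure; the query just reads one record, no runtime derivation.
    snapshots = {"2004-01": 70, "2004-02": 100, "2004-03": 90}

    def stock_snapshot(month: str) -> int:
        return snapshots[month]

    for m in movements:
        assert stock_noncumulative(m) == stock_snapshot(m)  # same answer, different cost

Note how the non-cumulative variant has to touch every movement record after the requested month, which is exactly why many movements per characteristic combination hurt query performance.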
Regards,
Christian

Similar Messages

  • Posting block Question in How to handle inventory management scenarios

    Hi all - I have a question in document
    <b>"How to...Handle Inventory Management Scenarios in BW"</b>
    can anyone please tell me what a posting block is in this document...
    also I did not understood what a Validity table is?
    thanks,
    Sabrina.

    Hi Sabrina!
    A 'posting block' period is something you are now facing in inventory management, but you have to consider it in ALL logistic cockpit flows (the ones you can see in LBWE).
    This requirement can be easily explained in this way: as you know, to fill a setup table you have to activate the extract structure of the datasource for which you want to run the setup job (otherwise the system says that no active extract structure exists, do you remember?). But by doing so with the posting operations open (in other words, while users can create new documents or change existing ones), you run the risk that these records are automatically included in the extraction queue (or in SM13 if you are using the unserialized method) AND also collected in the setup table! You will then have the same records from the initial load (from the setup table) and again from the delta load (from the queue): a nice data duplication.
    To avoid this situation, a "posting block" (or a "free-posting period" or a "downtime") is required: no one can post new documents during your setup job. A toy illustration follows below.
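    A toy Python illustration of the duplication (document numbers are invented; this is not how the LO cockpit is implemented internally):

        existing_docs = ["DOC1", "DOC2"]        # posted before the setup run
        posted_during_setup = ["DOC3"]          # users keep posting: no block!

        setup_table = existing_docs + posted_during_setup  # setup job also catches DOC3
        delta_queue = list(posted_during_setup)            # the extractor queues DOC3 too

        loaded = setup_table + delta_queue      # init load + first delta load
        print([d for d in loaded if loaded.count(d) > 1])  # ['DOC3', 'DOC3']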
    Hope it is clearer now...
    Bye,
    Roberto

  • How to handle inventory management scenario

    Hi All,
    In the 'How to Handle Inventory Management Scenarios in BW' whitepaper, I fail to understand the basic difference between 'Inventory Management with Non-Cumulative Key Figures' and 'Inventory Management with Snapshots'. Can anyone please explain the basic difference?
    Thanks
    PB

    Hi Loic
    Can you please see if all the steps listed below are OK?
    While setting up the IM scenario, I jotted down the steps with valuable inputs from you and from the paper 'How to Handle IM Scenarios'.
    Please find the steps below. Kindly review them and let me know in case I have got anything wrong or missed out on anything.
    1. Transfer Business Content datasources 2LIS_03_BX, 2LIS_03_BF, and 2LIS_03_UM in RSA5.
    2. Go to transaction LBWE and activate datasources 2LIS_03_BF and 2LIS_03_UM.
    3. Go to transaction MCB_ and select the SAP BW usage 'Standard' radio button.
    4. Save it.
    5. Go to transaction MCNB, enter the name of the run ('Stock_init'), a termination date in the future, and 2LIS_03_BX as the datasource.
    6. Execute the initialisation.
    7. Entries can now be found in the setup tables (e.g. MC03BFSETUP).
    8. Run the setup for 2LIS_03_BF and 2LIS_03_UM using transactions OLI1BW and OLIZBW respectively.
    9. Run the extraction for 2LIS_03_BX in BW.
    10. Compress the request with ‘No Marker Update’ NOT SET i.e. unticked or unchecked.
    11. Run the extraction for 2LIS_03_BF in BW.
    12. Compress the request with ‘No Marker Update’ SET i.e. ticked or checked.
    13. Run the extraction for 2LIS_03_UM in BW.
    14. Compress the request with ‘No Marker Update’ SET i.e. ticked or checked.
    Steps 1 and 2 should be executed in Development and transported to Production; step 3 onward should be carried out in Production itself, with a posting block in place.
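    The marker arithmetic behind steps 10-14, as a toy Python sketch (numbers invented; in BW the marker is maintained by the cube at compression time):

        bx_stock = 100                     # 2LIS_03_BX: stock snapshot at initialisation
        historical_moves = [+60, -40, +80] # 2LIS_03_BF history, already contained in bx_stock
        future_deltas = [+10, -5]          # movements posted after the initialisation

        marker = 0
        marker += bx_stock            # step 10: 'No Marker Update' UNCHECKED -> marker updated
        # marker += sum(historical_moves)  # steps 12/14: flag CHECKED, marker untouched;
        #                                  # otherwise the history would be counted twice
        marker += sum(future_deltas)  # later delta loads: flag UNCHECKED again
        print(marker)                 # 105 = 100 + 10 - 5, the true current stock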
    Thanks in advance.
    Regards
    PB

  • How to Handle Inventory Management Scenarios in BW (NW2004) BW7

    Hello,
    does anybody know if there is a more recent version of the document available?
    Thx,
    Christian

    Hi,
    Before posting, please search on Google first.
    Please find the link:
    SAP BW - Inventory data loading step by step(0IC_C03)
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/906b837a-0d54-2c10-08b8-bde70337547e?overridelayout=t…
    Thanks,
    Phani.

  • How to handle power managment features of Windows Vista

    Hi guys
    I searched a little bit in the Toshiba Technical Knowledge Base and found a very interesting article about the power management features of Windows Vista.
    *[How to handle power management features of Windows Vista|http://aps2.toshiba-tro.de/kb0/HTD84026Q0000R01.htm]*
    The point is that you will not find the same Toshiba Power Saver tool on Vista as on the Windows XP notebooks, and the power saver settings are a little bit hidden in the Vista power options.
    But you can still control certain options like Cooling Method or Power Saving Mode.
    I think this document could be interesting for some here in the forum!
    Greetings to all!

    Hi Jayjay
    Thanks for posting this here in the forum.
    Several days ago I searched for exactly such additional power options in Vista.
    This document was really, really helpful for me.
    Thanks again

  • How to Handle Batch Managed materials in AII

    Hi,
    We are trying out various scenarios with AII, one of them being how AII handles batch-managed materials. While creating a delivery we provide batch information, but this does not go to AII, as AII is only interested in EPC codes, quantity and UOM.
    We are able to successfully get tags into AII, but the delivery IDoc fails to post in the ERP system because it is expecting batch information.
    Would appreciate any ideas on the same. We have tried posting the BATCH_ID in the PML document header as well as at the tag level, but have not been successful so far. Are we missing some steps?
    - Aniruddha

    Hi Aniruddha,
    AII does not support extension fields (like BATCH_ID) directly from the UI, i.e. mobile or the Auto-ID Cockpit.
    In case you are testing through the Auto-ID test tool, you can assign these fields under the tag data list on the Tags tab.
    The way to pass these fields from the Auto-ID test tool is to select the EPC being passed and add the field BATCH_ID under the tag data list. This will add the extension field to the XML being sent to XI.
    As far as sending it through the UI is concerned, a development option has to be explored. For the mobile UI, you can think of creating a user data profile and adding such fields to the rule being processed. You can refer to the way SAP has defined the Z_CON_FIELDS element set in the configuration guides.
    I hope this should help you in finding a solution to your problem.
    Regards,
    Ashish
    PS: Reward points for useful answer

  • Question about how ExifMeta handles multivalued tags

    Hi,
    I have tested ExifMeta (great work, just what I needed!) to read face recognition data written by Picasa to the RegionName, RegionType, etc. fields in JPG metadata. Everything seems to work fine if there is only one name in the field, but Picasa writes several values separated with commas, like this:
    RegionName     John Doe, Jill Doll
    RegionType     Face, Face
    etc.
    In that case, the corresponding fields remain completely empty. So is this convention of writing several values separated by commas a "standard" way of doing this? Is there any way to fix it?
    Additionally, three Region category tags refuse to load, giving the following error:
    "Not updating due to error getting property, id: XMPmwgrs_RegionAppliedToDimensionsUnit, from: nil, to: pixel, err-msg: Attempt to access property "XMPmwgrs_RegionAppliedToDimensionsUnit" not declared in com.robcole.lightroom.ExifMeta's Info.lua"
    This is a minor problem, but it would be nice to know how to handle this kind of error in the future. Do I need to declare these in the Lua code if I need them?
    Looking forward to replies,
    Mikko

    Mikko,
    In future, please direct exif-meta specific questions to the exif-meta forum, or me personally.
    But a quick answer: exif-meta doesn't (by default) parse ("interpret") text values, so commas shouldn't be handled differently than any other text.
    The best way to find out what's going on is to invoke exiftool from the command line and inspect the output.
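    For instance, a quick check from Python (hypothetical file name; requires exiftool on the PATH):

        import subprocess

        out = subprocess.run(
            ["exiftool", "-RegionName", "-RegionType", "photo.jpg"],
            capture_output=True, text=True, check=True,
        ).stdout
        print(out)                  # e.g. "Region Name : John Doe, Jill Doll"
        for line in out.splitlines():
            tag, _, value = line.partition(":")
            print(tag.strip(), [v.strip() for v in value.split(",")])

    If the names come back intact here but arrive empty in the plugin, the problem is on the plugin side rather than in the file.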
    My hunch is there is a zero character in there representing end-of-string, which can cause problems for lua code (e.g. Lightroom).
    If you send me one of your files which includes the face data, I'll have a look.
    PS - No clue about the 3 region tags (might be worth trying it with the latest version of exiftool) - please send me a file for inspection - thanks.
    Rob

  • How to handle and manage a multi Database access in runtime with LCDS?

    Hello there
    I have several customers working with the same application, and I wonder how, with LCDS, to manage multi-database access at runtime without creating a configuration XML file in
    the catalina folder for each database.
    Indeed, each customer has their own database, and so far I have not found out how to avoid creating a config XML file in catalina for every single database; this also forces me to create an application folder for each customer, since the config file in catalina requires an application folder to be run under Tomcat...
    Thus, my question is:
    Is there any way to create only one configuration XML file in catalina (on the server side) and then, from the client side (the application), let the user select its environment (meaning its database) to run the application? This technique could also be used for a multi-database environment such as Dev / Test / Prod environments (or databases) that the same application can access.
    If anyone has an idea or has already dealt with this, please let me know, because I am running into a bottleneck and the situation is getting seriously critical...
    Regards
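    A generic sketch of the idea the poster is after (nothing LCDS-specific; sqlite3 stands in for the real data source, and all names are invented): one server-side mapping from an environment key to a database, with the client sending the key at runtime.

        import sqlite3

        DATABASES = {                      # the single server-side configuration
            "dev":  "dev.db",
            "test": "test.db",
            "prod": "prod.db",
        }

        def get_connection(env_key: str):
            """Resolve the client's chosen environment to a connection."""
            try:
                return sqlite3.connect(DATABASES[env_key])
            except KeyError:
                raise ValueError(f"unknown environment: {env_key!r}")

        conn = get_connection("test")      # the client passed 'test' with its request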

    Hello Ulrich,
    with compact and repair I mean the MSAccess function "Compact and Repair".
    Please follow the link below for more details:
    http://office.microsoft.com/en-us/access-help/compact-and-repair-an-access-file-HP005187449.aspx
    Normally you can execute this function directly in Access or with the Windows ODBC Data Source Administrator ("Control Panel" => "Administrative Tools" => "Data Sources (ODBC)")...
    I want to execute this function via CVI code and not by hand ;-).
    Thank you for your support.
    Frank

  • How to handle this Rework Scenario.

    Hello friends,
    Following is the rework scenario that is in Pharma.
    My client produces a finished-product quantity greater than required. Suppose I have an order for 100,000 tablets. A production order is created and production is completed. Suppose for this order 1,000 tablets are produced over and above the order quantity. This extra production is kept in production only. In this way, with each order, the extra quantity of finished goods in production keeps increasing.
    Suppose now I have an order for 10,000 tablets and in production I have 5,000 tablets that are already produced. Now I use these 5,000 tablets as rework material. I want that when I run MRP, the system should generate an order for only 5,000. I then produce the quantity of 10,000 using the 5,000 of rework material and the 5,000 order quantity.
    How can i map this scenario in SAP?
    Thanks

    Hi,
    Please find here a step-by-step procedure for a rework order.
    REWORK HANDLING PROCEDURE
    There are two ways: one is manual and the other uses a trigger point (reference operation set).
    1. Create a rework order without material at CO07.
    2. Assign the original production order as the settlement receiver in the settlement rule.
    3. Assign the rework material and components, if needed, to the rework order, and assign the issue storage location (the unrestricted-use storage location above).
    4. Do the confirmation for the rework order; the rework material will be issued from stock to the order.
    5. GR will not be done for the rework order. To get stock for the reworked material, do the GR for the original order and settle both orders.
    Another method:
    1. Create a reference operation set in transaction CA11. Enter the plant, description, usage and status for the reference operation set.
    2. Create a standard trigger point in CO31 (e.g. 'Production'). Enter the trigger point usage/group as FERT.
    3. Tick the trigger point functions.
    4. Enable the indicator 'Create order with reference'.
    5. Enter the system status as PCNF.
    6. The system will say that standard trigger points exist, from which you can select the created trigger point (e.g. 'Production') and save.
    7. Run MD02/MD04.
    8. Convert the planned order into a production order and release the order.
    9. Now confirm the operations.
    10. For the last operation, enter the yield and rework quantity and set the status to PCNF.
    11. If the status of the last operation is PCNF, a pop-up will appear in the system ('Activated by trigger point / Create order with reference', with the order number, sequence and operation number at which the trigger point fires); press Enter.
    12. Now a rework order will be created.
    13. In CO03, enter the number next to the actual production order number; the rework order will be seen.
    Hope this helps.

  • How To Handle Stale Stats Scenario.

    hi ,
    I am using Release 10.2.0.1.0 of Oracle. I have a scenario in which I am getting poor execution plans due to stale stats; how should I tackle it? Below is the part of my main query that deviates from the expected execution path due to wrong cardinality estimation.
          My column C1 of table TAB1 holds Java timestamp values, i.e. it is a NUMBER datatype that represents a date and time component only. We gather stats on this table TAB1 every weekend.
          below is my query:
          select /*+ gather_plan_statistics */ *
          from tab1
          where c1 BETWEEN 1346300090668 AND 1346325539486;
    Plan hash value: 3167980259
    | Id  | Operation                   | Name                    | Starts | E-Rows | A-Rows |   A-Time   | Buffers | Reads  |
    |   1 |  TABLE ACCESS BY INDEX ROWID| tab1                    |      1 |   1    |    167K|00:01:13.72 |     158K|  12390 |
    |*  2 |   INDEX RANGE SCAN          | IDX_N1                  |      1 |   1    |    167K|00:00:13.27 |   13880 |   1736 |
         The above shows a big gap between the actual and estimated cardinality, due to the fact that the HIGH_VALUE in DBA_TAB_COLUMNS for column C1 (1346203206173, which points to 8/29/2012 1:20:06 AM) is well below both the start (1346300090668, pointing to 8/30/2012 4:14:51 AM) and the end (1346325539486, pointing to 8/30/2012 11:18:59 AM) of the BETWEEN range.
         So even gathering stats daily on the table won't help me, because in the morning it will again require an updated max value for column C1 to estimate properly. How do I handle this situation? I don't want to go with a hint; I want to make the stats proper so that the optimizer automatically picks the right path.

    Um, refresh the stats on a regular basis?
    Oracle 10 and later has a default job to do that. Runs at 2200 daily.
    If you need an 'on demand' refresh, that's easy enough to set up.
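    One way to script that on-demand refresh from Python (hypothetical connection details; assumes the cx_Oracle driver and sufficient privileges):

        import cx_Oracle

        conn = cx_Oracle.connect("scott", "tiger", "dbhost/orcl")
        cur = conn.cursor()
        # Re-gather stats for the table whose HIGH_VALUE keeps going stale, so the
        # optimizer's notion of the column maximum tracks the growing timestamp column.
        cur.execute("""
            BEGIN
                DBMS_STATS.GATHER_TABLE_STATS(
                    ownname => :owner,
                    tabname => :table_name,
                    cascade => TRUE);  -- refresh index stats (e.g. IDX_N1) as well
            END;""", owner="SCOTT", table_name="TAB1")
        conn.close()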

  • How to handle an asynchronous scenario?

    Hi, there:
    I have an asynchronous scenario in my web application. After the client side collects information and clicks "submit", my servlet on the server side is supposed to wrap that information in a transaction and send it to a 3rd-party engine; however, the response to that transaction from the 3rd-party engine is asynchronous (it may take several minutes).
    Question:
    At client side, after user click "submit", what should I do?
    a. tell the user the transaction was submitted successfully, and ask him to check the response later;
    b. display some information like "please wait, your transaction is processing...", while my servlet keeps polling for the response and sends the HTML response back after getting it from the 3rd party?
    c. or have the servlet send a response back immediately, and somehow get the client side to keep sending requests to the servlet to check the status of the transaction?
    Any suggestions or comments are highly appreciated.
    Thanks a lot in advance
    David

    "You are waiting for card authorization back. How do they handle those scenarios? Like, while waiting for the response, the client navigates to other links, or even shuts down the browser."
    You can't do anything about clients who don't wait for a response. You can't even find out that they didn't wait. If your response time is going to be fairly reasonable -- a few seconds, maybe up to 30 seconds -- then displaying scary warnings may persuade the client to wait. If your response time is going to be a minute or more, then you are just going to annoy people. One option in that case is to get the client's e-mail address. Then your servlet would just send a trivial response ("Thank you for your order, you will be notified when it is processed") and put the job in a queue to be handled off-line. Handling of the job would involve sending a confirmation e-mail.
    Or you could provide another page where the user can keep track of the progress of his request, which your off-line system is updating whenever it does something with it; see the sketch below.
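    A minimal sketch of that status-page idea (framework and names are illustrative; the original poster used servlets, but the shape is the same):

        import threading, time, uuid
        from flask import Flask, jsonify

        app = Flask(__name__)
        jobs = {}  # job_id -> status; a real app would use storage that survives restarts

        def call_third_party(job_id):
            time.sleep(120)              # stand-in for the slow 3rd-party engine
            jobs[job_id] = "done"

        @app.route("/submit", methods=["POST"])
        def submit():
            job_id = str(uuid.uuid4())
            jobs[job_id] = "processing"
            threading.Thread(target=call_third_party, args=(job_id,), daemon=True).start()
            return jsonify(job_id=job_id), 202   # respond immediately with a handle

        @app.route("/status/<job_id>")
        def status(job_id):
            return jsonify(status=jobs.get(job_id, "unknown"))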
    "Also, for those time-consuming tasks on the server side, if hundreds of requests come at almost the same time, will it cause some problems on the server side? Like performance or resource issues?"
    Sure it could. There are people who do nothing but tune Web servers for a living.

  • How to handle this payment scenario in SAP

    Hello Gurus,
    Can anyone please help me with how to make this payment in SAP? Customer A is the alternate payer for customer B. We gave customer B a credit memo. When customer A is making payments, customer A wants to use the credit memo assigned to customer B as part of the total payment. How can this scenario be handled in SAP?
    Thanks in advance

    Hi,
    You can use the head office and branch relationship; then this should work. All the invoices/credit memos get updated at the branch office and head office for reporting, and the receipt can be made at the head office level.
    K.R

  • Question on how to handle United States 401-K catch up deduction

    Just have a question here to see how other Oracle customers in the United States might be handling the 401-K catch-up deduction. For 2009, an employee can deduct up to $16,500 in pay as 401-K deductions. That is the limit for this year as set by US law and IRS regulations. However, employees who are age 50 and over can deduct an additional $5,500 for 401-K in 2009, making their total limit for the year $22,000. These limits can change from year to year.
    Since not all employees are eligible for this catch-up amount, you cannot just set one limit amount for 401-K. The way we have attacked it is that we created two elements, one for 401-K and one for 401-K catch-up. That mostly works pretty well. The issue we run into is that when an employee hits the limit on the main 401-K deduction, payroll does not automatically roll over and start the catch-up deduction in the same payroll run. In the next payroll run, the catch-up deduction will kick in and the employee's paycheck is back to normal. But it is confusing for the employees who hit this situation (a small number) because their paycheck ends up a higher amount in the week that they hit the limit on the main 401-K deduction.
    How have other customers handled setting up the 401-K catch-up deduction? Have you been able to get payroll to continue the normal weekly deduction for 401-K even when the employee hits the yearly limit for the regular 401-K deduction? If so, I would be curious to hear how you have configured your system to do this. Thanks for your time and assistance.
    John Dickey

    Okay, after doing a lot of research and studying, I am beginning to see a way to make this work. We will have to find a way to test this out in a development environment. It looks like if I set the catch-up element entry percentage amount to NULL (blank), and then set the Take Overlimit Catchup value to ALWAYS, it may work the way I want it to.
    If I am reading this right, at the start of the year only the base element percentage will have a calculation done and a deduction taken (let me emphasize that I am talking about a SEQUENTIAL value for the catch-up processing field). In the period that the employee hits the base limit, the base element percentage will be fully calculated, with the amount in excess assigned to the catch-up element. The trick here, which I was failing to realize at first, is that the take-overlimit amount always goes to the catch-up element. The next week, all of the base percentage amount is overflow and goes to the catch-up element. You keep adding to the catch-up element amount until you hit the catch-up limit. There is NEVER a calculation of a catch-up element amount, since the percentage is NULL in that element entry. The only element that ever gets an amount calculated for it is the base element (see the toy model below).
    There is a document, document id 733997.1, that gives an explanation and example of how this Take Overlimit Catchup function works. We will see what happens whenever we can get around to testing this out.
    John Dickey
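    A toy model of the overflow behaviour John describes (2009 limits; the pay and percentage figures are invented):

        BASE_LIMIT, CATCHUP_LIMIT = 16_500.00, 5_500.00

        def deduct(gross, pct, base_ytd, catchup_ytd):
            """One paycheck: only the base element is ever calculated; the excess
            overflows into the catch-up element until its own limit is reached."""
            calc = gross * pct
            base = min(calc, BASE_LIMIT - base_ytd)
            overflow = min(calc - base, CATCHUP_LIMIT - catchup_ytd)
            return base, overflow

        # Just below the base limit: the paycheck deduction stays at 500,
        # split 300 to the base element and 200 to the catch-up element.
        print(deduct(5_000.00, 0.10, base_ytd=16_200.00, catchup_ytd=0.00))  # (300.0, 200.0)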

  • How to handle Investement Management Account (IMA)?

    Hi,
    Our client engages in an Investment Management Account arrangement with a bank, wherein:
    Under an IMA arrangement, the Principal (our client) transfers and delivers to the Investment Manager (the bank) a sum of money, which the latter shall manage, invest and reinvest, including all increments and additions thereto that may constitute income thereof, for the benefit of the Principal. This is like a trust service. How do I handle this in SAP Treasury?
    Regards,
    Chaikaru

    Hi,
    I don't think there is a standard product type available for this. You can handle it in Money Market as a cash flow transaction and define your own flows for it.
    Regards,
    Ravi
