Posting block Question in How to handle inventory management scenarios

Hi all - I have a question about the document
"How to... Handle Inventory Management Scenarios in BW".
Can anyone please tell me what a posting block is in this document.
Also, I did not understand what a validity table is.
thanks,
Sabrina.

Hi Sabrina!
A 'posting block' period is something you are facing now in inventory management, but you have to consider it in ALL Logistics Cockpit flows (everything you can see in LBWE).
The requirement can be explained this way: as you know, to fill a setup table you have to activate the extract structure of the DataSource for which you want to run the setup job (otherwise the system says that no active extract structure exists, do you remember?). But if you do so while posting operations are still open (in other words, users can create new documents or change existing ones), you run the risk that these records are automatically included in the extraction queue (or in SM13 if you are using the unserialized method) AND also collected in the setup table! You will then get the same records from the initial load (from the setup table) and again from the delta load (from the queue): a nice data duplication.
To avoid this situation, a "posting block" (or a "free-posting period" or a "downtime") is required: no one can post new documents during your setup job...
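Just to make the duplication risk concrete, here is a tiny sketch (plain Python, not SAP code; the document numbers are invented) of what happens when users keep posting during the setup run:

# Toy illustration: documents posted while the setup job runs end up in BOTH
# the setup table (picked up by the init load) and the delta queue
# (picked up by the delta load), so the target receives them twice.

existing_docs = ["4900000001", "4900000002"]        # posted before the setup run
posted_during_setup = ["4900000003"]                # users keep posting -> no posting block

setup_table = existing_docs + posted_during_setup   # the setup job collects everything it finds
delta_queue = list(posted_during_setup)             # new postings also land in the queue

init_load = setup_table                             # initial load (from the setup table)
delta_load = delta_queue                            # delta load (from the queue)

duplicates = set(init_load) & set(delta_load)
print(duplicates)                                   # {'4900000003'} -> the "nice data duplication"

# With a posting block in place, posted_during_setup stays empty and the overlap disappears.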
Hope it is clearer now...
Bye,
Roberto

Similar Messages

  • How to handle inventory management scenario

    Hi All,
    In the 'How to Handle Inventory Management Scenarios' whitepaper, I fail to understand the basic difference between 'Inventory Management with Non-Cumulative Key Figures' and 'Inventory Management with Snapshots'. Can anyone please explain the basic difference to me?
    Thanks
    PB

    Hi Loic
    Can you please check whether all the steps listed below are OK?
    While setting up the IM scenario, I have jotted down the steps with valuable inputs from you and the paper 'How to Handle IM Scenarios'.
    Please find the steps below. Kindly review them and let me know in case I have got anything wrong or missed anything.
    1. Transfer the Business Content DataSources 2LIS_03_BX, 2LIS_03_BF, and 2LIS_03_UM in RSA5.
    2. Go to transaction LBWE and activate DataSources 2LIS_03_BF and 2LIS_03_UM.
    3. Go to transaction MCB_ and select the SAP BW usage 'Standard' radio button.
    4. Save it.
    5. Go to transaction MCNB, enter the name of the run 'Stock_init'. Enter a termination date in the future. Enter the DataSource 2LIS_03_BX.
    6. Execute the initialisation.
    7. Entries can be found in setup table MC03BFSETUP.
    8. Run the setup for 2LIS_03_BF and 2LIS_03_UM using transactions OLI1BW and OLIZBW respectively.
    9. Run the extraction for 2LIS_03_BX in BW.
    10. Compress the request with 'No Marker Update' NOT SET, i.e. unticked/unchecked.
    11. Run the extraction for 2LIS_03_BF in BW.
    12. Compress the request with 'No Marker Update' SET, i.e. ticked/checked.
    13. Run the extraction for 2LIS_03_UM in BW.
    14. Compress the request with 'No Marker Update' SET, i.e. ticked/checked.
    Steps 1 and 2 should be executed in Development and transported to Production; step 3 onward should be carried out in Production itself, with a posting block in place (the marker-update logic behind steps 10-14 is also sketched below).
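    As a quick summary of the compression logic in steps 10-14, here is an illustrative helper (plain Python, not SAP code; the load labels are just made-up strings):

    # Which loads get compressed with 'No Marker Update' ticked, per the steps above.
    def no_marker_update(load):
        """Return True if the request should be compressed with 'No Marker Update' set."""
        if load == "2LIS_03_BX init":
            return False                     # step 10: opening stock must update the marker
        if load in ("2LIS_03_BF historical", "2LIS_03_UM historical"):
            return True                      # steps 12/14: history is already reflected in the BX stock
        return False                         # later delta loads: compress with marker update again

    for load in ["2LIS_03_BX init", "2LIS_03_BF historical",
                 "2LIS_03_UM historical", "2LIS_03_BF delta"]:
        print(load, "-> 'No Marker Update':", no_marker_update(load))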
    Thanks in advance.
    Regards
    PB

  • Question on "How to Handle Inv Management Scenarios in BW" document

    Hi all,
    I read the document "How to Handle Inventory Management Scenarios in BW" and I did not understand what a snapshot scenario is. Can anyone tell me the difference between the snapshot and non-cumulative scenarios?
    thanks,
    Sabrina.

    In a non-cumulative scenario the current stock of any day (or month) is not stored physically in the cube. Instead, the stock value over time is calculated dynamically at query runtime by the BW OLAP engine, which basically derives the stock by summing up the periodic value changes (cumulative inflow and outflow of the non-cumulative key figure) that are stored in the cube, together with a so-called stock marker used as the basis of the calculation.
    In the snapshot scenario the current stock of any month is calculated in a snapshot ODS and then loaded to a cube. This means that the stock value is physically stored in the cube in an ordinary cumulative key figure.
    Since a non-cumulative cube stores value changes and not the actual stock, performance might be bad if there are many value changes for each characteristic combination in a month (the stock is calculated at runtime and many records must be processed to derive it). In that case the snapshot scenario is better, since no runtime calculation of the stock needs to occur and only one record, containing the actual stock value, is stored per month for each characteristic combination that has a stock value.
    I think you would be better off with an example, but with this explanation in mind, looking at the scenarios in the How-to again might clarify things...
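    Since an example often helps, here is a toy sketch (plain Python, invented numbers, simplified compared to the real OLAP calculation) of the two approaches:

    # Non-cumulative scenario: the cube stores only the value changes plus a marker
    # (reference point); the stock of a month is derived at query runtime.
    marker_stock = 80                    # reference point = stock after the last movement
    movements = {"2024-01": +20,         # monthly net inflow/outflow (from BF/UM)
                 "2024-02": -50,
                 "2024-03": +10}
    months = sorted(movements)

    def stock_end_of(month):
        """Derive end-of-month stock by rolling the movements back from the marker."""
        later = [m for m in months if m > month]
        return marker_stock - sum(movements[m] for m in later)

    # Snapshot scenario: the same figures are computed once (e.g. in the snapshot ODS)
    # and stored physically as an ordinary cumulative key figure - no runtime derivation.
    snapshot = {m: stock_end_of(m) for m in months}

    for m in months:
        print(m, "non-cumulative (derived):", stock_end_of(m),
              "| snapshot (stored):", snapshot[m])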
    Regards,
    Christian

  • How to Handle Inventory Management Scenarios in BW (NW2004) BW7

    Hello,
    does anybody know if there is a more recent version of the document available?
    Thx,
    Christian

    Hi,
    Before posting, please search on Google first.
    Please find the link below:
    SAP BW - Inventory data loading step by step (0IC_C03)
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/906b837a-0d54-2c10-08b8-bde70337547e?overridelayout=t…
    Thanks,
    Phani.

  • Inventory Management Scenario

    Hi BW Experts,
    I was working on the Inventory Management scenario as described in 'How to Handle Inventory Management Scenarios in BW' with cumulative KFs. I read many of the forum postings but I am still a little bit confused about the sequence of the loading. As per my understanding, BX goes to the 0IC_C03 business content cube and the snapshot ODS. The BX load in 0IC_C03 is compressed. Then BF and UM go to the snapshot ODS, the snapshot InfoCube and 0IC_C03 (initialized for delta). Then the generic extractor from the snapshot ODS goes to the snapshot InfoCube as a monthly load and also to the ODS itself. Now the questions are:
    1) Is the sequence above correct? If not, could you please explain? What are the reasons behind it?
    2) What exactly are the functions of these objects, in detail? How do you validate the data (by comparing 0IC_C03 and the snapshot cube)?
    3) Why does the generic extractor go to the InfoCube and the ODS together; what is the reason?
    4) Is the snapshot InfoCube loaded with BF and UM as a daily delta? After initialization, am I going to load the snapshot InfoCube every night, or just 0IC_C03?
    5) Do I have to put 0CALDAY into the Agg. Ref. char. in the maintenance screen of the custom KFs defined for this scenario (the Z* key figures), or is it enough just in the report?
    6) Are the routines working correctly, apart from the change
      diff_month(2) type i -> where "i" is to be changed into "n"?
    After I loaded in the above sequence, when BX is loaded into the 0IC_C03 cube the number of transferred data records is less than the number added. What is the reason?
    Contributions are highly appreciated. Thanks.
    Best Regards,
    John Seker


  • Default check on the 'Posting block' check box in the Physical Inventory

    I am looking for a way to have the 'Posting block' checkbox checked by default in the Physical Inventory document for transaction MI01 or MI31. Can anyone help with this?

    Thanks for your reply. Can you send me the detailed process for doing it through a Transaction Variant?
    Regards,
    Tt

  • How to handle power management features of Windows Vista

    Hi guys
    I searched a little bit in the Toshiba Technical Knowledge Base and found a very interesting article about the power management features of Windows Vista.
    *[How to handle power management features of Windows Vista|http://aps2.toshiba-tro.de/kb0/HTD84026Q0000R01.htm]*
    The point is that you will not find the same Toshiba Power Saver tool on Vista as on the Windows XP notebooks, and the power saver settings are a little bit hidden in the Vista power options.
    But you can still control certain options like Cooling Method or Power Saving Mode.
    I think this document could be interesting for some here in the forum!
    Greetings to all!

    Hi Jayjay
    Thanks for posting this here in the forum.
    Several days ago I searched for such further power options in Vista.
    This document was really, really helpful for me.
    Thanks again

  • Question about how ExifMeta handles multivalued tags

    Hi,
    I have tested ExifMeta (great work, just what I needed!) to read the face recognition data written by Picasa to the RegionName, RegionType, etc. fields in JPG metadata. Everything seems to work fine if there is only one name in the field, but Picasa writes several values separated by commas, like this:
    RegionName     John Doe, Jill Doll
    RegionType     Face, Face
    etc.
    In that case, the corresponding fields remain completely empty. So is this convention of writing several values separated by commas a "standard" way of doing this? Is there any way to fix it?
    Additionally, three Region category tags refuse to load giving the following error:
    "Not updating due to error getting property, id: XMPmwgrs_RegionAppliedToDimensionsUnit, from: nil, to: pixel, err-msg: Attempt to access property "XMPmwgrs_RegionAppliedToDimensionsUnit" not declared in com.robcole.lightroom.ExifMeta's Info.lua"
    This is a minor problem, but it would be nice to know how to handle this kind of error in the future. Do I need to declare these in the Lua code if I need them?
    Looking forward to replies,
    Mikko

    Mikko,
    In future, please direct exif-meta specific questions to the exif-meta forum, or me personally.
    But a quick answer: exif-meta doesn't (by default) parse ("interpret") text values, so commas shouldn't be handled differently than any other text.
    The best way to find out what's going on is to invoke exiftool from the command line and inspect the output.
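    For example, something along these lines (a rough sketch only; it assumes exiftool is installed and on your PATH, and the file name is hypothetical):

    # Dump the region tags with exiftool and split multi-valued, comma-separated
    # entries yourself (exif-meta passes the raw text through unparsed).
    import json
    import subprocess

    def region_names(path):
        # -j = JSON output; RegionName/RegionType are the MWG region tags Picasa writes
        out = subprocess.run(
            ["exiftool", "-j", "-RegionName", "-RegionType", path],
            capture_output=True, text=True, check=True,
        ).stdout
        tags = json.loads(out)[0]
        raw = tags.get("RegionName", "")
        if isinstance(raw, list):            # exiftool may already return a list here
            return [str(n).strip() for n in raw]
        # otherwise split the comma-separated string: "John Doe, Jill Doll" -> [...]
        return [name.strip() for name in str(raw).split(",") if name.strip()]

    print(region_names("IMG_0001.jpg"))      # hypothetical file name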
    My hunch is there is a zero character in there representing end-of-string, which can cause problems for lua code (e.g. Lightroom).
    If you send me one of your files which includes the face data, I'll have a look.
    PS - No clue about the 3 region tags (might be worth trying it with the latest version of exiftool) - please send me a file for inspection - thanks.
    Rob

  • Handling Inventory Management

    Hi Friends,
    Maybe this topic has been dealt with many times in this forum.
    I am new to handling the 0IC_C03 cube, and I went through the document 'How to Handle Inventory Management' (snapshot scenario).
    I went through some OSS notes regarding the marker settings and compression.
    I have some basic questions and I am very keen to know more about the cube.
    I do know that we have to set the marker when we load DataSource 2LIS_03_BX, and that we do not set the marker for the loads compressed with 'no marker update'. My questions are:
    1) When the opening balance is taken we set the marker and compress; do we set the marker only the first time we load the BX DataSource?
    2) How frequently should we compress the requests?
    3) Is the BX DataSource loaded only one time?
    4) How will the fact table look when we set the marker?
    Thanks & Regards
    Ram

    Hi,
    1) When the opening balance is taken we set the marker and compress; do we set the marker only the first time we load the BX DataSource?
    Ans - Yes, this compression updates the reference point (marker) for the non-cumulative key figures used in the cube (the BX load is done only once).
    2) How frequently should we compress the requests?
    Ans - Once the init is compressed, it is not mandatory to compress the delta loads, but if you do compress them, report performance will be better (the non-cumulative KFs get their values only at report runtime, with the help of the marker set for them).
    3) Is the BX DataSource loaded only one time?
    Yes.
    4) How will the fact table look when we set the marker?
    Ans - All the data from the F table is moved to the E table, as in an ordinary compression, and the reference point gets updated.
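    To visualise the answer to question 4, here is a toy model (plain Python; invented table contents, not the real BW tables) of compression with marker update:

    # Requests move from the F table to the E table; the reference point (marker)
    # absorbs the compressed value changes unless 'No Marker Update' is set.
    f_table = [
        {"request": "REQ1", "material": "MAT1", "qty_change": +20},
        {"request": "REQ2", "material": "MAT1", "qty_change": -5},
    ]
    e_table = []                   # compressed data (request ID collapsed to 0)
    marker = {"MAT1": 100}         # reference point set by the BX load

    def compress(no_marker_update=False):
        """Move all F-table rows to the E table; update the marker unless the flag is set."""
        global f_table
        for row in f_table:
            e_table.append({**row, "request": 0})             # request dimension is cleared
            if not no_marker_update:
                marker[row["material"]] += row["qty_change"]  # reference point gets updated
        f_table = []

    compress()                     # ordinary compression (with marker update)
    print(e_table)                 # both rows are now in the E table, request = 0
    print(marker)                  # {'MAT1': 115}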
    Ramesh

  • How to implement Inventory Management in BI using BX/BF/UM

    Experts!
    I have to implement the Inventory Management data flow. I was reading the SAP documentation on it, but I found it a bit confusing.
    Could you please help me understand how these 3 DataSources work, and why we need 3 different DataSources?
    I know I am asking a big question, and it would be a bit tough to explain, but any kind of documentation will help.
    I did some research, but could not find the right way of doing it.
    Please give me some thoughts!
    Appreciate it.

    Hi,
    There are 3 DataSources for IM:
    2LIS_03_BX: extracts stock data from MM Inventory Management for initialization to a BW system.
    2LIS_03_BF: extracts material movement data from MM Inventory Management (MM-IM) consistently to a BW system.
    2LIS_03_UM: extracts revaluation data from MM Inventory Management (MM-IM) consistently to a BW system.
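    Very roughly, and only as an illustration (plain Python, invented numbers), the three DataSources complement each other like this:

    # BX = stock level at initialization (loaded once), BF = goods movements,
    # UM = revaluations (value-only changes); together they yield the current stock.
    opening_stock_bx = {"MAT1": {"qty": 100, "value": 1000.0}}           # 2LIS_03_BX
    movements_bf = [{"material": "MAT1", "qty": -10, "value": -100.0}]   # 2LIS_03_BF
    revaluations_um = [{"material": "MAT1", "qty": 0, "value": +50.0}]   # 2LIS_03_UM

    current = {mat: dict(vals) for mat, vals in opening_stock_bx.items()}
    for rec in movements_bf + revaluations_um:
        current[rec["material"]]["qty"] += rec["qty"]
        current[rec["material"]]["value"] += rec["value"]

    print(current)   # {'MAT1': {'qty': 90, 'value': 950.0}}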
    Maintain the following entries in tables of the ECC system:
    Table TBE11:
    Maintain the entry 'NDI' (text: New Dimension Integration) and 'BW' with the Active flag.
    Also refer to OSS notes 315880 & 571669.
    Step 1: Go to transaction RSA5 and activate the 3 DataSources.
    Step 2: Delete the historical data.
    Step 3: Delete the data from the setup tables.
    Step 4: Take the required ECC downtime.
    Step 5: Stock initialization: use transaction MCNB for the 2LIS_03_BX DataSource.
    Step 6: Fill the setup table for 2LIS_03_BF. (You can access this from SBIW.)
    Step 7: Fill the setup table for 2LIS_03_UM.
    Step 8: Pull the data for 2LIS_03_BX (the "No Marker Update" flag shouldn't be ticked when compressing).
    Step 9: Pull the data for 2LIS_03_BF (the "No Marker Update" flag should be ticked).
    Successive delta loads should be compressed with marker updates (no tick).
    Step 10: Pull the data for 2LIS_03_UM (handled like 2LIS_03_BF).
    Hope this helps.
    Regards,
    Rahul

  • How to Handle Batch Managed materials in AII

    Hi,
    We are trying out various scenarios with AII, one of them being how AII handles batch-managed materials. While creating a delivery we provide the batch information, but this does not go to AII, as AII is only interested in EPC codes, quantity and UOM.
    We are able to successfully get tags into AII, but the delivery IDoc fails to post in the ERP system because it is expecting the BATCH information.
    Would appreciate any ideas on this. We have tried posting the BATCH_ID in the PML document header as well as at the TAG level, but have not been successful so far. Are we missing some steps?
    - Aniruddha

    Hi Aniruddha,
    AII does not support the extension fields (like BATCH_ID) directly from the UI, i.e. mobile or the Auto-ID Cockpit.
    In case you are testing through the Auto-ID test tool, you can assign these fields under the tag data list in the Tags tab.
    The way to pass these fields from the Auto-ID test tool is to select the EPC being passed and add the field BATCH_ID under the tag data list. This will add the extension field to the XML being sent to XI.
    As far as sending it through the UI is concerned, a development option has to be explored. For the mobile UI, you can think of creating a user data profile and adding such fields to the rule being processed. You can refer to the way SAP has defined the Z_CON_FIELDS element set in the configuration guides.
    I hope this helps you find a solution to your problem.
    Regards,
    Ashish
    PS: Reward points for useful answer

  • How to handle and manage multi-database access at runtime with LCDS?

    Hello there
    I have several customers working with the same application, and I wonder how, with LCDS, to manage multi-database access at runtime without creating a configuration XML file in the catalina folder for each database.
    Indeed, each customer has their own database, and so far I have not found a way to avoid creating a config XML file in catalina for every single database; this also forces me to create an application folder for each customer, since the name of the config file in catalina requires an application folder to be run under Tomcat...
    Thus, my question is:
    Is there any way to create only one configuration XML file in catalina (on the server side) and then, from the client side (application), let the user select their environment (meaning their database) to run the application? This technique could also be used for a multi-database setup such as Dev / Test / Prod environments (or databases) that the same application can access.
    Please, if anyone has an idea or has already dealt with this, let me know, because I am running into a bottleneck and the situation is getting seriously critical...
    Regards

    Hello Ulrich,
    with compact and repair I mean the MSAccess function "Compact and Repair".
    Please follow the link below for more details:
    http://office.microsoft.com/en-us/access-help/compact-and-repair-an-access-file-HP005187449.aspx
    Normally you can execute this function directly in Access or with the Windows ODBC Data Source Administrator ("Control Panel" => "Administrative Tools" => "Data Sources (ODBC)")...
    I want to execute this function via CVI code and not by hand ;-).
    Thank you for your support.
    Frank


  • How to handle this Rework Scenario.

    Hello friends,
    Following is a rework scenario in pharma.
    My client produces more finished product quantity than required. Suppose I have an order for 100,000 tablets. A production order is created and production is completed. Suppose for this order 1,000 tablets are produced over and above the order quantity. This extra production is kept in production only. In this way, with each order, the extra quantity of finished goods in production keeps increasing.
    Suppose now I have an order for 10,000 tablets and in production I have 5,000 tablets that are already produced. Now I want to use these 5,000 tablets as rework material. I want the system, when I run MRP, to generate an order for only 5,000; I then produce the 10,000 quantity by using the 5,000 of rework material and the 5,000 of order quantity.
    How can I map this scenario in SAP?
    Thanks

    Hi,
    Please find below a step-by-step procedure for a rework order.
    REWORK HANDLING PROCEDURE
    There are two ways: one is manual and the other uses a trigger point (reference operation set).
    1. Create a rework order without material in CO07.
    2. Assign the settlement receiver as the original production order in the settlement rule.
    3. Assign the rework material & components if needed to the rework order, and assign the issue storage location (the unrestricted-use storage location above).
    4. Do the confirmation for the rework order; the rework material will be issued from stock to the order.
    5. GR will not be done for the rework order. To get stock for the reworked material, do the GR for the original order & settle both orders.
    Another method:
    1. Create a reference operation set in transaction CA11. Enter the plant, description, usage and status for the reference operation set.
    2. Create a standard trigger point in CO31 (e.g. 'Production'). Enter the trigger point usage/group as FERT.
    3. Tick the trigger point functions.
    4. Enable the indicator 'Create order with reference'.
    5. Enter the system status as PCNF.
    6. The system will say that a standard trigger point exists, from which you can select the created trigger point (e.g. 'Production') and save.
    7. Run MD02/MD04.
    8. Convert the planned order into a production order and release the order.
    9. Now confirm the operations.
    10. For the last operation, enter the yield and rework quantity and set the status to PCNF.
    11. If the status of the last operation is PCNF, a pop-up will appear in the system: 'Activated by trigger point / Create order with reference / order number / sequence and operation number' at which the trigger point works; press Enter.
    12. Now a rework order will be created.
    13. In CO03, enter the number next to the actual production order number; the rework order will be seen.
    Hope this helps.

  • Post Installation Steps for ERP add-on Inventory Manager 4.0

    Hi,
    I need to confirm what post-installation activities we must perform for the ERP add-on (Inventory Manager 4.0), besides the BC activation. I did not find any information regarding the post-installation steps for the ERP add-on of Inventory Manager 4.0 (the installation guide for IM 4.0 on the Service Marketplace does not cover the steps, it only explains the theoretical part of the installation)... And we need to confirm whether the steps are the same as for the Inventory Manager 3.2 add-on, or whether we must do an additional step.
    Best Regards,
    MC
    Stephen Streeter Manju Venkatesha

    Yeah... of course...
    I forgot to specify it.
    I installed the first ERP 2005 SR2 on Oracle 10.2.0.2, and the others have to be installed on the same DB version.
    None of the installations are MCOD.
    This means every installation has its own DB, with a different SID for each installation.
    And of course I have at least doubled the space required by the installer. According to the SAP notes, the required space for ERP 2005 SR2 is about 90 GB.
    The total space of the mount point is about 500 GB...
    Thanks
    Dan
