Detailed approach when the used Live cache data area crosses the threshold

Hi,
Could any of you please let me know the detailed approach to follow when the used liveCache data area crosses the threshold in an APO system?
The approach I have as of now is:
1) When data cache usage is identified as nearly 100%, check the hit rate for OMS data in the data cache in transaction LC10. Generally, the hit rate for OMS data in the data cache should be at least 99.8%, and data cache usage should stay well below 100%.
2) To monitor unsuccessful accesses to the data cache, choose Refresh and compare the values before and after. Unsuccessful accesses result in physical disk I/O and should generally be avoided.
3) The number of OMS data pages (OMS Data) should be much higher than the number of OMS history pages (History/Undo); a ratio of 4:1 is desirable. If OMS history is nearly the same size as OMS data, use Problem Analysis -> Performance -> OMS Versions to find out whether named consistent views (versions) have been open for a long time. The maximum age should be 8 hours.
4) If consumption of OMS heap and data cache is large, one reason may be a long-running transaction simulation that accumulates heap memory and prevents the garbage collector from releasing old object images.
5) To display existing transactional simulations in LC10, use Problem Analysis -> Performance -> OMS Versions, and use SM04 to find the user of the corresponding transaction. It may be necessary to cancel the session, after contacting the user, if a version has been open for a long time.
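The rule-of-thumb thresholds in steps 1) and 3) can be sketched as a simple check. This is illustrative Python only; the field names and the 95% usage warning level are assumptions for demonstration, not actual LC10 monitor fields:

```python
# Illustrative sketch of the LC10 data-cache health checks described above.
# The dictionary keys and the 95% warning level are assumptions, not the
# actual LC10/liveCache monitor output.

def check_data_cache(stats):
    """Return a list of warnings based on the rule-of-thumb thresholds."""
    warnings = []

    # 1) OMS data hit rate should be at least 99.8 %
    hit_rate = stats["oms_hits"] / (stats["oms_hits"] + stats["oms_misses"]) * 100
    if hit_rate < 99.8:
        warnings.append(f"OMS data hit rate {hit_rate:.2f}% is below 99.8%")

    # Data cache usage should stay well below 100 %
    usage = stats["used_pages"] / stats["total_pages"] * 100
    if usage >= 95.0:
        warnings.append(f"data cache usage {usage:.1f}% is close to 100%")

    # 3) OMS data pages vs. history/undo pages: roughly 4:1 is desirable
    if stats["history_pages"] > 0 and \
            stats["oms_data_pages"] / stats["history_pages"] < 4:
        warnings.append("OMS data to history ratio is below 4:1; check for "
                        "long-open consistent views (OMS versions)")

    return warnings
```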
Please help me by providing additional information on the issue.
Thanks,
Varada Reddy.

Hi Mayank, sorry, one basic question - are you using some selection criteria during extraction? If yes, then try extraction without the selection criteria.
If you maintain selection based on, let's say, material, you need to use the right number of zeros as prefix (based on how you have defined the characteristic for material) otherwise no records would be selected.
Is this relevant in your case?
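The leading-zero point can be illustrated with a small sketch. This is hedged Python, not SAP code; the length of 18 (the classic SAP material number length) is an assumption here:

```python
# Sketch of the leading-zero issue described above: selection values must
# match the internal (zero-padded) format of the characteristic. The
# length 18 (classic SAP material number length) is an assumption.

def to_internal(material, length=18):
    """Pad a numeric material number with leading zeros (ALPHA-style)."""
    return material.zfill(length) if material.isdigit() else material

stored = to_internal("4711")            # padded to 18 characters
assert to_internal("4711") == stored    # a padded selection value matches
assert "4711" != stored                 # the unpadded value selects nothing
```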
One more option is to try to repair the datasource. In the planning area, go to extraction tools, select the datasource, and then choose the option to repair the datasource.
If you need more info, please let me know.
- Pawan

Similar Messages


  • In which table is the Live cache data stored?

    Hi experts,
   I am new to APO. Can anyone let me know in which database table the liveCache data is stored?
    Thanks in advance
    regards
    Ashwin.

    Hello Ashwin,
    the idea of the liveCache is to have data permanently and quickly available without having to read/write from/to the database. Therefore, the liveCache is <b>NOT</b> a physical database; it is a program in C++ that <i>simulates</i> a database and holds the data in memory.
    However, it does need so-called liveCache anchors, which are saved on the database and are similar to pointers.
    You can extract data from the liveCache using BAdIs or by creating a datasource on a planning area (for DP and SNP); manipulation can also be done only via BAdIs or sophisticated programming (which basically uses RFCs).
    I hope this answers your question.
    Regards,
    Klaus

  • Problems in Master-Detail forms when the detail contains LOVs

    I am having problems with master-detail forms when the detail contains an LOV.
    In my detail form I have a messageLovInput field that returns two values (code and description). When the LOV window returns both values, the messageLovInput disappears from the form.
    This problem happens only when I iterate in the master view and then try to add a record in the detail form.
    If I add a new record without navigating the master view, there is no problem.

    Jode,
    which technology are you using, ADF JClient? If yes, please provide a step-by-step description of how to set up a testcase to reproduce the problem.
    Frank

  • Details of When the print of the shop paper is carried out for notification

    Hi Experts,
    We have a requirement where we need to get details such as the time and date and the shop paper used for printing, when a line item of the notification is selected and printed.
    We have created our own Smart Form and attached it to the custom shop paper. We need to capture the details specified above when the print option is selected. We have also checked the entries in PMPL, but they are not of much help, as we need the details each time the user prints the line items.
    I tried to search for some BAdIs but in vain. Unable to figure out how to move further; would be grateful for any reply!
    Thanks
    Vidya

    Hi Pete & Sunil, thanks a lot for the inputs, but the details are not getting updated in the PMPL table.
    The PMPL table gets updated only if I download the form.
    The steps we followed are:
    1. IW53/IW52 -> opened the notification
    2. Go to menu option: Service notification -> Print -> Item selection
    3. I selected the items, then the shop papers, and then Print.
    4. Checked the PMPL table, but no entries were maintained.
    I tried to get the details in the driver program of the Smart Form itself from the standard include "RIPRIF01". It worked! But I am unable to get the items which were selected for printing. Please help.
    Thanks
    Vidya

  • Live cache data deletion

    Hi Friends,
    We are facing one strange problem with our liveCache.
    We have DP data in the liveCache. Due to some reasons, the planning results are corrupted.
    So we deleted the time series for the DP planning area and recreated the time series.
    What we found is that even after deletion and recreation of the time series for the planning area,
    the key figure data still exists in the liveCache and can be seen in the interactive planning book.
    Can someone please give some hints on what went wrong?
    Are there any other liveCache-related programs or jobs that need to be run to get rid of the problem?
    Regards
    Krish

    Hi Thomas,
    You are right. After running /SAPAPO/TS_PSTRU_CONTENT_DEL, all CVCs for the POS are deleted
    and there are 0 CVCs left.
    But then I generated CVCs from the InfoCube again (possibly the same old combinations are generated, since the InfoCube is the same).
    Now I have initialized the planning area and then accessed the selection ID in interactive planning.
    Surprisingly, I can still see key figure values in the planning book.
    Here are the steps I followed:
    1. Deleted time series for the PA (de-initialization)
    2. Deleted CVCs from the POS
    3. Deactivated the POS and reactivated it
    4. Generated CVCs again from the InfoCube
    5. Created time series for the PA (re-initialization)
    6. Loaded the selection ID in the interactive planning book
    7. Still I can see key figure data for the loaded CVCs.
    Somehow the data is not getting cleaned in the liveCache despite the above steps;
    the data might be stored superfluously in the liveCache.
    I want to know whether any additional reports are available to get rid of these kinds of issues.
    Regards,

  • Upload PRODUCTION live cache data to QUALITY live cache

    Hi experts,
    We have received a request for an estimate of how long it will take to move data from SCP to SCQ.
    It is data from the liveCache in production that is needed in the quality liveCache as well.
    We have a backup cube (ZFCST_BC) from which we can take out the data via an OpenHub.
    Then we would probably need to use this file as a datasource and load it into a cube (maybe ZFCST_BC).
    Is there an easier way to do this, and how should we approach the solution above?
    Regards
    Venuscm

    Venuscm,
    Are you talking about a subset of LC data, or all LC data? If it is a subset, then exactly which data will be subject to this process?
    Much LC data must be synchronized with table data. Attempting to load the wrong data in the wrong way will only result in a quality system that is unusable.
    The most reliable way to load production LC data into quality is to follow SAP's recommendations, using 'System Copy'.
    https://service.sap.com/sap/support/notes/886103
    Best Regards,
    DB49

  • When to use Locale specific data in HWC ?

    Working with an HWC application, I noticed two other options (as in the screenshot) on the MBO screen, i.e. Wrap data in PRE CODE and Locale-specific display. What is the appropriate scenario to use these options?

    I never tried this, but you may look here:
    Wrap header in PRE : SyBooks Online
    Locale Specific : SyBooks Online

  • When to use ADF Caching and/or Coherence?

    Looking for a high level answer to this question... basically what is best practice?
    Given that I've found virtually nothing that mentions these two items in the same breath, I'm wondering if the question is even applicable!
    If there is overlap, at what point do you "move" to Coherence?
    Presumably the two are not mutually exclusive either?
    Any experience of this? Problems or wins perhaps?
    Thx
    .Stuart

    Hi Henry,
    Whether you want the file to be on the presentation server or on the application server is a choice that you have to make.
    If your SAP system is linked to some other system and that system sends data to the SAP system in a file format, then it would be FTPed to the application Server of your SAP system.
    This is one common scenario.
    If you put the file on the presentation server, then the program that accesses the file cannot run without a user sitting in front of the screen, logged on to the SAP system. This means that a file on the presentation server cannot be processed in the background.
    The application server is also preferred when the data in the file is huge. If such a file is loaded from the presentation server, it may take a long time to upload, and then it has to be processed. This may lead to time-outs in many cases. If the file is on the application server, then since you can process it in the background, you will never have the problem of the program timing out.
    Also, having a file on the presentation server leads to the assumption (and not a totally untrue one at that) that the data in the file is actually user-specific. At least it is (certainly) determined by the user.
    Hope the point is clear. If not, get back.
    Regards,
    Anand Mandalika.

  • How can I download iCloud when W7 uses Live Mail

    Can anyone please explain how I can overcome the fact that iCloud requires Outlook, while my Windows 7 has replaced this with Live Mail?
    thanks

    abefromnanaimo wrote:
      At this point I believe iCloud only works with MS Outlook.
    Not only at "this point"; it will be forever so.
    Live Mail/Vista Mail do not support the CalDAV/CardDAV protocols. Only Outlook can do this.

  • Accessing the planning cache data (In IP)

    Hello...
    I need help in retrieving the liveCache data.
    While we are using the transactional cube, every time we need to switch into planning mode, and while loading we need to switch to load mode.
    Is there any other way to access the IP cache data?
    Appreciate any suggestions.
    Regards,
    Pari.

    Hi Parimala,
    Custom planning functions can be created using transaction RSPLF1. Give a technical name and click the Create button. It will ask for a class name to be attached to the function type. The class should be created in transaction SE24. The implementation of the class will be in object-oriented ABAP, where you will write the implementation logic in class methods.
    The custom planning function so created is available for use when you create a planning function attached to an aggregation level. You can see it in the drop-down list of all the function types.
    You can refer to the standard function type for delete (0RSPL_DELETE), which will give you an idea of how the class is implemented.
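    The pattern described above (a function type implemented as a class, registered under a technical name, and run on aggregation-level data) can be sketched conceptually. This is illustrative Python, not ABAP, and all names are made up for the example:

    ```python
    # Conceptual sketch (Python, not ABAP) of the planning-function pattern
    # described above: a "function type" is a class with an execute method,
    # registered under a technical name and invoked on the records of an
    # aggregation level. All names here are illustrative.

    class DeleteFunctionType:
        """Rough analogue of a delete function type such as 0RSPL_DELETE."""
        def execute(self, records, key_figure):
            # Set the chosen key figure to zero in every record.
            for rec in records:
                rec[key_figure] = 0.0
            return records

    # Registry of function types by technical name (stand-in for RSPLF1).
    FUNCTION_TYPES = {"ZDELETE": DeleteFunctionType}

    def run_planning_function(name, records, key_figure):
        """Look up a function type and run it on the aggregation-level data."""
        return FUNCTION_TYPES[name]().execute(records, key_figure)
    ```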
    The link provided by Marc above is helpful :
    http://help.sap.com/saphelp_nw70/helpdata/en/43/332530c1b64866e10000000a1553f6/frameset.htm
    Also, go through this how to guide:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/c0ac03a4-e46e-2910-f69d-ec5fbb050cbf
    Hope this helps you.
    Regards,
    Srinivas Kamireddy.

  • Some concern about Live cache Homogeneous System copy

    Hi All,
    I need to do a homogeneous system copy of SCM 5.0 / liveCache 7.6.00 on AIX. Following are the source and target system SIDs for the liveCache. Moreover, I have checked the respective SAP Notes 457425 and 877203 but still have some concerns.
    Source System
    Live cache SID           =      SCD
    Live cache user ID      =      SAPSCD
    Live Database instance software owner = scdadm
    Live cache Data Size      =  8 GB ( 4 Data volume / 2 GB each / auto extend off)
    Target System
    Live cache SID           =      SCT
    Live cache user ID      =      SAPSCT
    Live Database instance software owner = sctadm
    Live cache Data Size      =  4 GB ( 2 Data volume / 2 GB each / auto extend off)
    For the same, I have the following concerns.
    1.     Since there is a difference between the source and target system size / number of data volumes, what action do I need to take on the target system? Do I need to add two more data volumes to the target system?
    2.     After completion of the database restore on the target system, do I need to change the liveCache user ID (i.e. SAPSCD to SAPSCT)?
    or
    Is it fine to just change the logon data on the "Integration" tab in LC10?
    Please let me know some more information on the same.
    Thanks,
    Harshal

    Hello Harshal,
    "1. Since there is a difference between the source and target system size / number of data volumes, what action do I need to take on the target system? Do I need to add two more data volumes to the target system?"
    Please pay attention to SAP Note 457425 (Homogeneous liveCache copy using backup/restore).
    Please update the thread with additional information:
      What is the version of the source liveCache? What is the version of the target liveCache?
      How much data do you have in the source liveCache?
    You need the data area of the target liveCache to be large enough to restore the data backup created in the source liveCache.
    The size of the data volumes in the source liveCache alone does not make clear how much data you actually have.
    "2. After completion of the database restore on the target system, do I need to change the liveCache user ID (i.e. SAPSCD to SAPSCT)?"
    Please review SAP Note 877203.
    You could rename the user SAPSCD to SAPSCT using the steps in SAP Note 877203,
    OR
    you could keep the user SAPSCD in the target liveCache as the standard liveCache user.
    In that case, after you run the homogeneous liveCache copy using the backup/restore procedure, you need
    to change the LCA/LDA/LEA connections to use the user SAPSCD on the target system and
    set the user SAPSCD for the user containers (please see SAP Note 616555 for more details)
    before restarting the liveCache in LC10.
    PS: Please pay attention to the value of the liveCache parameter _UNICODE.
          What are the values of this parameter in the source and target liveCache?
    Thank you and best regards, Natalia Khlopina

  • What is Live Cache Anchor

    Hi ,
    Sometimes we see errors like "No liveCache anchor found". Can somebody tell me in detail what a liveCache anchor is and why this inconsistency occurs? Is there a link containing detailed documentation?
    Best Regards,
    Chandan Dubey

    Dear Chandan,
    This error message, "No liveCache anchor found", states that for one or even several characteristics combinations (see transaction /SAPAPO/MC62) a so-called liveCache anchor does not exist. In this case, a liveCache anchor is a pointer to one or several time series in the liveCache.
    There is one liveCache anchor per planning area, characteristics combination (planning object), and model (the model belonging to the planning version; see transaction /SAPAPO/MVM). If, for one planning area, time series objects were created for several versions with different models, then several liveCache anchors exist for the same planning area and the same characteristics combination. If you created time series objects for several versions of the same model, then one liveCache anchor points to several time series in the liveCache.
    If there is no liveCache anchor for a planning area, a model, and a characteristics combination, this also means that no corresponding time series exists in the liveCache, and thus this characteristics combination cannot be used for planning with this planning area for any version of the model. If this state occurs for a certain characteristics combination, the above-mentioned error message appears when either exactly this characteristics combination is selected or a selection contains it.
    Possible causes and solutions:
    - Time series objects have not yet been created for the selected version.
      Solution: Create time series objects (see documentation).
    - You are using a planning session with version assignment. For the version that was actually selected, the time series objects were created; however, this is not the case for the assigned version.
      Solution: Create time series objects for the assigned version.
    - New characteristics combinations were created without the 'Create time series objects' option (see transaction /SAPAPO/MC62).
      Solution: Execute the report /SAPAPO/TS_LCM_PLOB_DELTA_SYNC for the basis planning object structure of the planning area. This will create the corresponding liveCache anchors and liveCache time series for all planning areas that use this basis planning object structure, and for all versions of these planning areas for which time series objects already exist.
    If none of these possible solutions is successful, you can use report /SAPAPO/TS_LCM_CONS_CHECK to determine and correct the inconsistencies for a planning area ('Repair' option).
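    The anchor relationship described above can be modeled as a small lookup structure. This is a toy sketch in Python; the names and keys are illustrative, not actual liveCache structures:

    ```python
    # Toy model of the liveCache-anchor relationship described above: one
    # anchor per (planning area, characteristics combination, model), each
    # pointing to one time series per version of that model. All names are
    # illustrative, not actual liveCache structures.

    anchors = {
        # (planning_area, char_combination, model) -> {version: time_series_id}
        ("DP_AREA", "PROD_A/LOC_1", "000"): {"000": "ts_1001", "001": "ts_1002"},
    }

    def find_time_series(planning_area, char_combination, model, version):
        """A missing anchor corresponds to the 'No liveCache anchor found' error."""
        anchor = anchors.get((planning_area, char_combination, model))
        if anchor is None:
            raise LookupError("No liveCache anchor found")
        return anchor.get(version)
    ```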
    I hope this helps.
    Regards,
    Tibor

  • Production orders not updated in Live Cache

    Hi All,
    We are currently facing a strange issue. Planned orders created in APO are converted into production orders in R/3. When they are CIFfed to APO, those production orders are captured in the product view but are not updated in either /SAPAPO/OM16 or the production list (/SAPAPO/PPL1). In these two transactions the production orders are not visible, but their respective planned orders are showing up.
    When the delta report is executed for production orders with "Consider Requirements/Receipts" as the sub-object, these orders are captured with error code 501 (Requirement/Receipt for the order exists only in R/3). When pushed to APO, these orders are updated in the liveCache.
    Can anyone share your thoughts on what could be the root cause for this issue.
    Appreciate your thoughts
    Thanks,
    Sai

    Thanks Vikas and Senthil,
    It's not an issue with all the production orders. The issue is with only a few orders, and we are in the process of tracing out the pattern.
    Our primary concern is that the orders captured in the delta report are visible in the product view even without taking any action in CCR, but are missing in OM16. It's only after taking the action that those orders are visible in OM16. I believe the data in the product view is read from the liveCache, and the liveCache contents are displayed in OM16. So, whatever orders are visible in the product view should be displayed in OM16, which is not happening.
    Please let me know if I am missing anything.
    Thanks,
    Sai

  • When to use AMF and when to use RTMP??

    Hello can someone explain the difference??
    I am using all the defaults for the configuration files (remoting and data management), and my app works perfectly in development mode where everything is on localhost. But when we try it in pre-production, it keeps throwing an error saying that it couldn't connect to the RTMP channel. The only thing that changes between these two modes is that in development I access everything on localhost, and in pre-production we point to another server.
    I am totally clueless why this is happening. I am a bit confused why data management uses RTMP and why remoting uses AMF, so if anyone could explain it to me, it would be great.
    Thanks

    There's no special reason why one uses RTMP vs. AMF. The Data Management Service can also use AMF, and Remoting can also use RTMP. The only real requirement is that the Data Management Service needs to be able to receive updates from the FDS/LCDS server. This can be achieved by using a channel that is capable of receiving pushed updates from the server (e.g. RTMP) or a channel that is capable of polling for updates on the server (i.e. AMF polling). Therefore, when you use AMF with the Data Management Service, you need to have polling enabled with a polling interval.
    As to why AMF works but RTMP doesn't: the AMF channel uses the AMF format over the HTTP protocol, whereas the RTMP channel uses a plain TCP socket (not HTTP). Therefore, in some environments RTMP will be blocked by firewalls but AMF won't, since it looks like regular HTTP traffic. In this case, you can use AMF polling instead, or use RTMPT, which is new in LCDS 2.5.
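    The AMF-polling setup mentioned above would look roughly like this in services-config.xml. This is a hedged sketch: the channel id and endpoint URI are placeholders, and the property names follow the standard LCDS channel configuration for AMFChannel:

    ```xml
    <!-- Sketch of an AMF polling channel as an RTMP fallback; the id and
         endpoint URI are placeholders, not from the original post. -->
    <channel-definition id="my-amf-polling"
                        class="mx.messaging.channels.AMFChannel">
        <endpoint uri="http://{server.name}:{server.port}/{context.root}/messagebroker/amfpolling"
                  class="flex.messaging.endpoints.AMFEndpoint"/>
        <properties>
            <polling-enabled>true</polling-enabled>
            <polling-interval-seconds>4</polling-interval-seconds>
        </properties>
    </channel-definition>
    ```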
