Proactive Caching - Monitoring processing

I'd like to hear from anyone who is using proactive caching and how they monitor cube loads. 
I have created an Aggregation=Max measure in each measure group that loads as getdate(); this lets me see the load date by partition.  My date dimension has a partition_cd, which denotes which dates a partition covers.  The partition date
scheme is the same for all measure groups.  This handles things from the user's perspective: they know how recent their data is.
What it doesn't do is let me see average load times, number of loads per day, etc., which are the things I need from a support perspective.
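For reference, Ken's timestamp measure can be sketched as a named calculation in the data source view (table and column names here are hypothetical); bound to a measure with AggregateFunction = Max, it surfaces the most recent processing time per partition:

```sql
-- Hypothetical named calculation on the fact table in the DSV.
-- GETDATE() is evaluated when the partition is processed, so a measure bound
-- to load_dt with AggregateFunction = Max shows that partition's load time.
SELECT f.*,
       GETDATE() AS load_dt
FROM dbo.FactSales AS f;   -- FactSales is a placeholder fact table
```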
The only solution I have seen for this is the ASTrace.exe application. That would mean installing something custom on the server, which I'd like to avoid if I can. Any other options out there?
Any other feedback on this area in general?
As always you guys are great, thanks for all the help!
-Ken

Hi Ken,
Thank you for your question. 
I am trying to involve someone more familiar with this topic to take a further look at this issue. Some delay might be expected while the question is transferred. Your patience is greatly appreciated. 
Thank you for your understanding and support.
Regards,
Charlie Liao
TechNet Community Support

Similar Messages

  • Proactive Caching for Cube process.

    Hi,
We need to implement proactive caching on one of our cubes in SQL Server 2012. We are able to do it at the partition level (measures) when data changes in the tables. I am looking for an option to implement proactive caching at the cube level every
night at a particular time (12:00 A.M.), irrespective of data changes in the tables. We don't want to use SSIS packages.
    Thank You.
    Praveen

    Hi Praveen,
    Proactive Caching is a feature in SSAS that allows you to specify when to process a measure group partition or dimension as the data in the relational data source changes.
Generally, to process a whole cube on a fixed schedule, we develop an SSIS package that processes the dimensions and measure group partitions, and then execute the package periodically. As
Kieran said, why don't you want to use SSIS packages in your scenario?
    Here are some useful links for your reference.
    http://vmdp.blogspot.com/2011/07/pro-active-caching-in-ssas.html
    http://www.mssqltips.com/sqlservertip/1563/how-to-implement-proactive-caching-in-sql-server-analysis-services-ssas/
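Another SSIS-free option (a sketch, not tested; all object and server names below are placeholders) is a SQL Agent job step that uses the Analysis Services command subsystem to send an XMLA Process command, with the job scheduled for 12:00 A.M.:

```sql
-- Hypothetical Agent job step: the ANALYSISCOMMAND subsystem sends the XMLA
-- in @command to the SSAS instance named in @server. Attach a nightly
-- schedule to the job (e.g. via sp_add_jobschedule) for the midnight run.
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Nightly cube process',
    @step_name = N'ProcessFull MyCube',
    @subsystem = N'ANALYSISCOMMAND',
    @server    = N'MySSASServer',
    @command   = N'<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Object>
        <DatabaseID>MyOlapDatabase</DatabaseID>
        <CubeID>MyCube</CubeID>
      </Object>
      <Type>ProcessFull</Type>
    </Process>';
```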
    Regards,
    Charlie Liao
    TechNet Community Support

  • Automatic MOLAP cube : Proactive caching was cancelled because newer data became available

When I process the cube manually after processing the dimension, it works fine. But when I append data to a database column, proactive caching kicks in, and at that point it fails.
Sometimes it does not get the key attribute because the measure group gets processed before the dimension,
and sometimes it gives this error:  
    Proactive caching was cancelled because newer data became available  Internal error: The operation terminated unsuccessfully. OLE DB error:
    OLE DB or ODBC error: Operation canceled; HY008. Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'call dim Monthly 201401 2', Name of 'call dim Monthly
    201401 2' was being processed. Errors in the OLAP storage engine: An error occurred while the 'MSW' attribute of the 'call dim Monthly 201401 2' dimension from the 'callAnalysisProject' database was being processed.  etc....

I have also seen this error occur in other scenarios.
In the first, if you have set proactive caching to refresh every 1 minute and your query takes 2 minutes to refresh, the error above can be displayed.  Solution: increase your refresh time or tune your proactive caching query.
Related to the above, if your server is limited on available resources, this can also cause slower query response times during refresh and the message above.

  • When does proactive caching make sense?

    Hi all!
A standard pattern for multi-dimensional cubes is to have
one cube doing the heavy, time-consuming processing and then synchronize it to query cubes.
In this setup, does proactive caching make sense?
    Best regards
    Bjørn
    B. D. Jensen

    Hello Jensen,
Proactive caching is useful for low-volume cubes whose data updates frequently, such as inventory or forecasting. But I will tell you from my own experience: proactive caching in SSAS is not worth it. It sometimes behaves unexpectedly; when data is updated, inserted, or deleted
in the source table, the cube doesn't start processing. You are better off creating a SQL job to process the cube after a specified time.
If you want to process the cube at a specified interval, then I would suggest you go with a SQL Agent job.
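As a sketch of that scheduled alternative (job and schedule names are hypothetical, and the job itself must already exist), a recurring Agent schedule might look like:

```sql
-- Hypothetical: attach a recurring schedule to an existing Agent job so the
-- cube is processed at a fixed interval instead of via proactive caching.
EXEC msdb.dbo.sp_add_jobschedule
    @job_name            = N'Process MyCube',
    @name                = N'Every 30 minutes',
    @freq_type           = 4,    -- daily
    @freq_interval       = 1,    -- every day
    @freq_subday_type    = 4,    -- unit: minutes
    @freq_subday_interval = 30;  -- every 30 minutes
```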
    Hope this will help you !!!
    Sanjeewan

  • Value Mapping Replication-Values not reflected in Cache Monitoring

    Hello
I implemented a Java proxy for populating values in the Java cache. The call was successful, and I could see successful messages in SXMB_MONI; message monitoring also says "JPR successfully processed the message".
But when I check cache monitoring, I cannot see the context that I populated through the proxy.
Do I need to refresh one of the caches or restart the server?
    Thanks in advance.
    Regards
    Rajeev

    Hi Rajeev,
Close the Integration Builder and RWB pages and open them again with a browser cache refresh. It works for me.
    Regards,
    Ricardo.

  • Error in MOLAP Proactive Caching

    Hello,
We have enabled proactive caching for MOLAP and are using the polling mechanism.  The polling query queries a view for a date column; a SQL view with joins on different tables is the data source for the cube.
    When an insert happens on the underlying table, the polling query works fine and starts to process the cube.
    During the cube process, the following error is logged in the SQL Server Profiler
    Internal error: The operation terminated unsuccessfully. Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table:vw_realdata, Column: 'Product', Value: 'Product1'. The attibute is 'Product'....
    We have enabled the "Apply settings to dimensions" checkbox for the Measure group
    When the complete database is processed, this error does not occur.
Please let me know how to prevent this error when using proactive caching.
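A polling query of the kind described above is typically a single-value query that SSAS re-runs on the polling interval, triggering processing when the returned value changes. A sketch against the view named in the error (the date column name is a guess):

```sql
-- SSAS re-executes this on the polling interval; when the MAX() value changes,
-- proactive caching starts processing the affected objects.
SELECT MAX(last_update_date)   -- hypothetical date column in the view
FROM dbo.vw_realdata;
```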

    Eileen,
    "The issue is during the cube process which is run by SSAS once it detects changes by Poll query"
    Say I have a dimension Product, with key as Product_Key and an attribute BRAND. with values {1,BRAND-A}.
    Up to now everything works fine.
Dimension data in the database got updated: BRAND-A was updated to BRAND-B.
During this time (before the poll query detects the change, or after it is detected and while the cube is being processed by SSAS), any MDX query fired with the BRAND attribute will look for BRAND-B in the MOLAP dimension and, if it is not found, will throw an error. Why BRAND-B? Because the database is already updated.
    SELECT non empty [PRODUCT].[BRAND].MEMBERS on rows, [Sales] on columns FROM MYCUBE
will translate into a SQL query like the one below:
SELECT BRAND, SUM(Sales) Sales FROM <MYFACT> fact, PRODUCT prod WHERE fact.PRODUCT_KEY = prod.PRODUCT_KEY GROUP BY prod.BRAND
The SQL returns BRAND-B|9999.89; the attribute values are checked against the MOLAP dimension, and it fails with the error message Anandh got.
After the cube process completes via the proactive mechanism, the error goes away.
    Thanks
    Shom

  • Any problems having Admin Optimization and Proactive caching run concurrently

    Hi,
We've recently enabled proactive caching, refreshing every 10 minutes, and have seen data in locked versions changing after a Full Admin Optimization runs. Given that the data reverts to a previously submitted number, I suspect having proactive caching occur while the Full Admin Optimization runs may be the culprit.
Here's an example to depict what is happening:
    original revenue is $10M.
    user submits new revenue amount of $11M.
    version is locked.
    data in locked version is copied into a new open version.
full optimization runs at night and takes 60 minutes. All the while, proactive caching runs every 10 minutes.
    user reports the revenue in the previously locked version is $10M and the new version shows $11M.
    We've never experienced this prior to enabling proactive caching which leads me to believe the 2 processes running concurrently may be the source of the problem.
    Is proactive caching supposed to be disabled while Full Admin Optimization process is running?
    Thanks,
    Fo

    Hi Fo
    When a full optimization is run, the following operations take place:
    - data is moved from wb and fac2 tables to the fact table
    - the cube is processed
    If the users are loading data while full optimization occurs then it is expected that a certain discrepancy will be observed. One needs to know that even with proactive caching enabled, the OLAP cube will not be 100% accurate 100% of the time.
    Please have a look at this post which explains the details of proactive caching:
    http://www.bidn.com/blogs/MMilligan/bidn-blog/2468/near-real-time-olap-using-ssas-proactive-caching
    Also - depending on how they are built, the BPC reports may generate a combination of MDX and SQL queries which will retrieve data from the cube and data from the backend tables.
I would suggest preventing users from loading data and running reports while the optimization takes place.
    Stefan

  • Cache Monitoring issue

    Hello all.
I have the following error when I check Cache Monitoring in the RWB
for "Integration Server (ABAP Cache)":
Communication error
This is the case for all objects, for example Integration Processes.
What is the origin of this issue?
If I check the "Integration Server (Java Cache)" objects, everything is OK!
Thanks in advance

    hi,
    you always need to start with this doc
    if you have cache related issues:
    https://websmp109.sap-ag.de/~sapdownload/011000358700003163902004E/HowTo_handle_XI_30_Caches.pdf
    Regards,
    michal
    <a href="/people/michal.krawczyk2/blog/2005/06/28/xipi-faq-frequently-asked-questions"><b>XI / PI FAQ - Frequently Asked Questions</b></a>

  • Cache Monitoring in Runtime Workbench not showing status/throwing error

    Hello Friends,
    The Cache Monitoring in Runtime Workbench is not showing the status at all. Below is the error message that is displayed.
    Connection to system RUNTIME using application RUNTIME lost. Detailed information: Error accessing "https://us-medpiqas.ww005.siemens.net:50001/run/value_mapping_cache/int?method=InvalidateCache" with user "null". Status of response is HTTP/1.1 401 Unauthorized - Unauthorized
It was working fine a few days ago. Now messages are not getting processed, as they are not able to access the message mapping at runtime. I am surprised as to how this happened. Can anyone tell me what I can do to proceed with this problem?
    FYI: SXI_CACHE in ABAP Stack shows updated status and is green.
    Thanks & Best Regards,
    Anand Patil
    Edited by: Anand Patil on Dec 22, 2010 4:06 PM

    Hi,
Restarting the XI CPA Cache service in https://<host>:<port>/nwa --> Systems --> Start & Stop fixed it. After restarting the CPA Cache service, the error is no longer shown.
    This works!
    Thanks & Regards,
    Anand Patil

  • How to read Group ID from Value Mapping Context in Cache Monitoring ?

    Hi friends,
In RWB --> Cache Monitoring --> Integration Server (Java) --> (search for Value Mapping Groups), each item is identified by a Value Mapping Group (GroupID, Context, Identifier/Agency, Identifier/Scheme). Whether we create the Value Mapping Table in ID or replicate value mapping data directly from a text file, SAP table, etc., into the runtime cache, the data is identified in this manner.
    Now, our requirement is to delete a record in the cache for a particular context. XI provides two operations: 'Delete' and 'DeleteGroup'. To use either, we need to know the GroupID. Suppose I replicated a large amount of data from a text file into the runtime cache, and the Value Mapping Table looks like IN --> India, US --> USA, AU --> Australia, EG --> Egypt. Now I need to write a program that takes a country code from the user (IN/AU/...) and deletes the matching record from the value mapping table. The logic would be: first scan the value mapping table and find the record (country code) that matches the input; then find the GUID value for that record; then call the DeleteGroup operation, pass the GUID, and delete the record.
    So, in essence: how do we read the GUID from the value mapping context?
        Friends, Kindly help me to do this.
    Thanks in advance.
    Jegatheeswaran P.

Did you find a way to read the group ID?

What is cache monitoring? And what is it used for?

What is cache monitoring, and what is it used for? How safe is it to execute transaction RSRCACHE in development?
    Thank you,
    York

    Hi Les,
    Cache is a temporary storage for recently accessed data.
    Used to enhance query performance.
    Use t-code RSRT to view more on cache.
    Please see this link:
    http://help.sap.com/saphelp_nw04/helpdata/en/41/b987eb1443534ba78a793f4beed9d5/content.htm
Cache helps to improve query performance, as it can save data in memory, a flat file, a cluster table, or a BLOB.
You can remove the cache per query, or deactivate it for a particular InfoProvider, or deactivate it globally,
but that is not recommended. If you know that certain queries are not used often and do not access a large number of records, you can deactivate the cache for those queries. You can manage the cache via three transactions: RSRT, RSRCACHE, or SPRO > SAP Reference IMG > SAP Business Warehouse > Reporting-relevant settings > General Reporting Settings in BEx > Global Cache Settings.
Please follow the link below, which has a few good documents on cache that will clarify the concept completely.
https://service.sap.com/bi --> Product information previous releases --> BI InfoIndex --> OLAP --> you will find a bunch of documents there.
    Look at the following threads :
    OLAP Cache
    what is cache?
also check the RSRT and OLAP cache docs
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/afc59790-0201-0010-9f99-940f7c3934fc
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/9f4a452b-0301-0010-8ca6-ef25a095834a
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/7c361b95-0501-0010-c0ab-897c5aa14504
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/31b6b490-0201-0010-e4b6-a1523327025e
    Hope this helps.
    Regards,
    Ravikanth.

  • Difference between Integration Process and Monitoring Process

    Hi Experts,
    What is the difference between Integration Process and Monitoring Process available in PI7.1?
SAP says that a monitoring process is a special kind of integration process that receives event messages.
My doubt: even an integration process can receive event messages.
Why are two different types of entities created for the same purpose?
And what is the technical difference between the two from a PI perspective?
    Regards,
    Sami.

    My question is now answered.
    [https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/70a25d3a-e4fa-2a10-43b5-b4169ba3eb17]
On page 17 of this PDF, the following sentence is mentioned:
"From a technical perspective, there is no difference between a monitoring process and an integration process."
Though logically they are two different things. Monitoring processes are used to receive only event messages, which comprise event data only.
For example, a purchase order creation is an event, and its event message contains just the event data (Order ID, Created On, Created By, Quantity, etc.) instead of the whole purchase order.
An integration process, on the other hand, is a way to provide a solution in specific circumstances, such as when we have to automate a process or need something in between during the course of communication.
    Guys thanks for your precious time.
    Regards,
    Sami.

How do I get the error detail within the monitoring process?

    Dear All,
Hi all, this is regarding monitoring.
If we monitor an extraction request and drill into the detail, we see three tabs: Header, Status, and Details, showing the status of the data extraction request.
Now, if I want to get the data shown in the Details or Status tab, does anyone know which table keeps that information?
Or is there a function to retrieve that information?
Could you kindly share it with me, please?
    Thanks in advance.
    Best regards,
    Niel.

    Dear Niel,
    Tcode: RSPC
To create a process chain, go into RSPC. There we have 4 views:
1. Planning view (to create the process chain)
2. Checking view (to check the process chain)
3. Log view (to monitor the process chain)
4. Detail view (to see which process type has which variants)
You can't get the data in the Monitor, as it is used only for monitoring purposes, but you can get the information through mail. For that, try the following:
    You can send messages to an application process of the chain, depending on the success or failure of the process.
    1. To do this, using the context menu for a process, create another process variant of type Send Message.
    2. If you maintain a message, first specify whether you want the message to be sent when the process has been completed successfully or unsuccessfully. Then choose Next.
    3. A dialog box appears, in which you can select an existing process variant or create a new one.
    4. If you create a new process variant, edit the document that is going to be sent and maintain a list of recipients.
    5. Save your process variant and go back.
    The message process variant is now assigned to your application process. When the message is sent, the status information and the process log are also sent.
    Note, you must configure SAPconnect in order to ensure that your system can send email - use transaction SCOT if this has not been configured.
It's set as a property on each individual process in the chain; you would have to set it on each process where you want notification.
One technique you might consider: you can create a "meta-chain" made up of local chains and set a message on each of the "local chain" processes in the meta-chain.
    Also go through these links
    How to Trigger an Alert from a Process Chain (NW7.0)
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/c0f4e952-e46e-2910-1f9e-cae187cd16d1
    SAP Network Blog: Information Broadcaster Triggered by a Process Chain
    /people/anil.bavaraju/blog/2008/02/07/information-broadcaster-triggered-by-a-process-chain
Hope it helps you. Revert back to me if you have any queries.
    Regards
    Bala

  • KM Cache in Cache Monitor is not cleared using KM API

    Hi All,
I am trying to clear a KM cache that is pre-configured in the Cache Monitor, using the KM API. Below is the code I am using to clear the cache. It does not clear the cache; instead it creates a new cache named "KM_Cache1 (1)" with default properties (peak load set to 100%). My requirement is to access the pre-configured cache, clear it, and then refresh it.
    import com.sapportals.wcm.WcmException;
    import com.sapportals.wcm.service.cache.CacheServiceFactory;
    import com.sapportals.wcm.util.cache.CacheException;
    import com.sapportals.wcm.util.cache.ICache;
    import com.sapportals.wcm.util.cache.CacheFactory;
try {
          // use ICache (per the imports above) rather than an undeclared Cache type
          ICache cache1 = CacheServiceFactory.getInstance().getCache("KM_Cache1");
          cache1.clearCache();
          cache1.refresh();
} catch (CacheException e) {
          // handle failure to access or clear the pre-configured cache
}
Thanks in advance for any help on how to clear the cache in the Cache Monitor.

    Hi,
    The code is as follows:
IIndexService indexService = null;
try {
   indexService = (IIndexService) ResourceFactory.getInstance().getServiceFactory().getService(IServiceTypesConst.INDEX_SERVICE);
} catch (ResourceException e) {
   if (indexService == null) {
     log.errorT("Error instantiating the index service");
     return this.renderMessage(this.getBundleString(RES_NO_INDEX_SERVICE), StatusType.ERROR);
   }
}
// get the index
IIndex index = null;
try {
   index = indexService.getIndex("YourIndexID");
} catch (WcmException e1) {
   log.errorT("Error when trying to get the index");
   return this.renderMessage(this.getBundleString(RES_NO_INDEX), StatusType.ERROR);
}
// check whether the index is an instance of AbstractClassificationIndex
AbstractClassificationIndex classiIndex = null;
if (index instanceof AbstractClassificationIndex) {
   classiIndex = (AbstractClassificationIndex) index;
} else {
   log.errorT("The index " + index.getIndexName() + " is no classification index");
   return this.renderMessage(this.getBundleString(RES_NO_CLASSIFICATION_INDEX), StatusType.WARNING);
}
// give your KM resource here to check whether it is classified
boolean classified = classiIndex.isDocClassifiedInAnyTax(resource);
    Regards,
    Praveen Gudapati

  • SXI_CACHE vs. RWB Cache Monitoring

    I'm familiar with the functionalities of transaction SXI_CACHE, as I've used it on a previous project. However, I have not used the Cache Monitoring function in the RWB, and was curious what the differences were. When would you prefer to use one over the other?
    I've reviewed the How To Handle Cache in XI30, but the only thing that was specified about RWB Cache Monitoring was that you would choose between Software Components and Mapping Programs. I am still unclear as to what else is available.
    Thanks in advance!

    Hi Daniel,
    See these..
    http://help.sap.com/saphelp_nw2004s/helpdata/en/0d/28e1c20a9d374cbb71875c5f89093b/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/92/2fd93f130f9115e10000000a155106/frameset.htm
    /people/sravya.talanki2/blog/2005/12/02/sxicache--ripped-off
    /people/sravya.talanki2/blog/2005/11/03/cache-refresh-errors--new-phenomena
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/003de490-0201-0010-68a4-bf7a1e1fd3a5 -- Monitoring in XI 3.0 ( see cache monitoring)
    cheers,
    Prashanth
