Enable OLAP proactive caching
Hi All
Just a quick question: I am unable to find any SAP Notes regarding OLAP proactive caching, and was wondering whether enabling it would improve our cube performance when querying.
The environment is SQL 2005 / AS 2005 SP3 / BPC 7.0 SP7 Patch 2
Thanks in advance
Daniel
Hi Daniel
Actually, the BPC WB partition already uses proactive caching in ROLAP mode.
Even if you enable it for the other two partitions (FACT, FAC2), it will not improve your query performance.
What I suggest instead is reducing the number of rows and columns in the expansion.
If you have multiple dimensions in the row and column expansions, the query generates a cross join, which makes performance worse.
If you must have multiple row and column dimensions, then try using multiple EVDREs that share the same column/row members.
That removes the cross join and gives better query performance.
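To see why the cross join hurts, here is a rough back-of-the-envelope sketch (the member counts are made up): a single EVDRE with two row dimensions requests every combination of their members, while two EVDREs that share the same column members request far fewer cells.

```python
# Illustrative only: why one expansion over several dimensions is expensive.
# An EVDRE with multiple dimensions in rows asks the server for the full
# cross join of the member sets, even though most intersections are empty.
entities = 50
accounts = 200
time_periods = 12

# One EVDRE, ENTITY and ACCOUNT both in rows, TIME in columns:
single_report_cells = (entities * accounts) * time_periods

# Two EVDREs sharing the same TIME columns, one row dimension each:
split_report_cells = (entities * time_periods) + (accounts * time_periods)

print(single_report_cells)  # 120000
print(split_report_cells)   # 3000
```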
Thank you.
James Lim
Similar Messages
-
Error in MOLAP Proactive Caching
Hello,
We have enabled proactive caching for MOLAP and we are using the polling mechanism. The polling query queries the view for a date column. A SQL view, with joins across different tables, is the data source for the cube.
When an insert happens on the underlying table, the polling query works fine and starts to process the cube.
During the cube process, the following error is logged in the SQL Server Profiler
Internal error: The operation terminated unsuccessfully. Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: vw_realdata, Column: 'Product', Value: 'Product1'. The attribute is 'Product'....
We have enabled the "Apply settings to dimensions" checkbox for the Measure group
When the complete database is processed, this error does not occur.
Please let me know how to prevent this error when using Proactive Caching?
Eileen,
"The issue is during the cube process which is run by SSAS once it detects changes by Poll query"
Say I have a dimension Product with key Product_Key and an attribute BRAND, with values {1, BRAND-A}.
Up to now everything works fine.
The dimension data in the database then gets updated: BRAND-A becomes BRAND-B.
During this window,
- before the poll query detects the change, or
- after the poll query has detected it, while SSAS is still processing the cube,
any MDX query that references the BRAND attribute will look for BRAND-B in the MOLAP dimension and, not finding it, will throw an error. Why BRAND-B? Because the database is already updated.
SELECT non empty [PRODUCT].[BRAND].MEMBERS on rows, [Sales] on columns FROM MYCUBE
will translate into a SQL query like the one below:
SELECT prod.BRAND, SUM(fact.Sales) AS Sales FROM <MYFACT> fact, PRODUCT prod WHERE fact.PRODUCT_KEY = prod.PRODUCT_KEY GROUP BY prod.BRAND
The SQL returns
BRAND-B|9999.89; when the attribute values are checked against the MOLAP dimension, the query fails with the same error message Anandh got.
Once the proactive caching mechanism finishes processing the cube, the error goes away.
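The timing window above can be sketched as follows (names and data are hypothetical; the real validation happens inside the SSAS storage engine, not in Python):

```python
# Sketch of the window Shom describes. The MOLAP dimension store was built
# before the relational update, so it still holds the old attribute value,
# while ROLAP-style SQL against the source already returns the new one.
molap_dimension = {1: "BRAND-A"}       # processed earlier, now stale
relational_dimension = {1: "BRAND-B"}  # just updated in the database

def run_query(product_key):
    # SQL against the source returns the *new* value...
    brand = relational_dimension[product_key]
    # ...which is then validated against the stale MOLAP store.
    if brand not in molap_dimension.values():
        raise KeyError(f"The attribute key cannot be found: {brand}")
    return brand

try:
    run_query(1)
except KeyError as e:
    print(e)  # fails until proactive caching finishes reprocessing

# Once the cache rebuild completes, the two stores agree again:
molap_dimension[1] = "BRAND-B"
print(run_query(1))  # BRAND-B
```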
Thanks
Shom
Shom -
Any problems having Admin Optimization and Proactive caching run concurrently
Hi,
We've recently enabled proactive caching refreshing every 10 minutes and have seen data in locked versions changing after Full Admin Optimization runs. Given how the data reverts back to a prior submitted number, I suspect having proactive caching occur while the Full Admin Optimization runs may be the culprit.
here's an example to depict what is happening.
original revenue is $10M.
user submits new revenue amount of $11M.
version is locked.
data in locked version is copied into a new open version.
full optimization runs at night and takes 60 minutes. All the while, proactive caching runs every 10 minutes.
user reports the revenue in the previously locked version is $10M and the new version shows $11M.
We've never experienced this prior to enabling proactive caching which leads me to believe the 2 processes running concurrently may be the source of the problem.
Is proactive caching supposed to be disabled while Full Admin Optimization process is running?
Thanks,
Fo
Hi Fo
When a full optimization is run, the following operations take place:
- data is moved from wb and fac2 tables to the fact table
- the cube is processed
If users are loading data while the full optimization runs, then some discrepancy is to be expected. One needs to know that even with proactive caching enabled, the OLAP cube will not be 100% accurate 100% of the time.
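As a rough illustration of those two operations (the table contents below are invented; the real movement is done by BPC stored procedures and SSAS processing):

```python
# Simplified sketch of what a full optimization does with the three
# BPC fact partitions. Rows are (time, account, value) tuples.
wb = [("2010.JAN", "REVENUE", 10)]   # real-time writeback rows
fac2 = [("2010.FEB", "REVENUE", 5)]  # short-term incremental rows
fact = [("2009.DEC", "REVENUE", 7)]  # long-term storage

# Step 1: move rows from WB and FAC2 into FACT.
fact.extend(wb)
fact.extend(fac2)
wb.clear()
fac2.clear()

# Step 2: reprocess the cube over the consolidated FACT table.
# A report that reads WB via ROLAP during step 1 can miss rows that are
# "in flight" between tables, which is one source of the discrepancy.
print(len(fact), len(wb), len(fac2))  # 3 0 0
```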
Please have a look at this post which explains the details of proactive caching:
http://www.bidn.com/blogs/MMilligan/bidn-blog/2468/near-real-time-olap-using-ssas-proactive-caching
Also - depending on how they are built, the BPC reports may generate a combination of MDX and SQL queries which will retrieve data from the cube and data from the backend tables.
I would suggest preventing users from loading data and running reports while the optimization takes place.
Stefan -
Automatic MOLAP cube : Proactive caching was cancelled because newer data became available
When I process the cube manually after processing the dimensions, it works fine. But when I append data to the database, proactive caching kicks in, and at that point it fails.
Sometimes it cannot find the key attribute, because the measure group gets processed before the dimension,
and sometimes it gives error:
Proactive caching was cancelled because newer data became available Internal error: The operation terminated unsuccessfully. OLE DB error:
OLE DB or ODBC error: Operation canceled; HY008. Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'call dim Monthly 201401 2', Name of 'call dim Monthly
201401 2' was being processed. Errors in the OLAP storage engine: An error occurred while the 'MSW' attribute of the 'call dim Monthly 201401 2' dimension from the 'callAnalysisProject' database was being processed. etc.
I have also seen this error occur in other scenarios.
In the first, if you have set proactive caching to refresh every minute but the refresh query takes two minutes, the error above can be displayed. The solution is to increase your refresh interval or tune your proactive caching query.
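That rule of thumb can be sketched as a simple check (the numbers below are hypothetical):

```python
# If notifications arrive faster than a cache rebuild can finish, proactive
# caching keeps cancelling and restarting the rebuild, which surfaces the
# "newer data became available" message.
refresh_interval_s = 60   # how often changes are detected
rebuild_duration_s = 120  # how long a cube rebuild takes

def rebuild_completes(interval_s, duration_s):
    # The rebuild only finishes if it is faster than the refresh cadence.
    return duration_s < interval_s

print(rebuild_completes(refresh_interval_s, rebuild_duration_s))  # False
# Fix: raise the interval, or tune the rebuild so it runs faster.
print(rebuild_completes(180, rebuild_duration_s))                 # True
```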
In connection with the above: if your server is limited on available resources, that can also cause slower query response times during the refresh and the same message. -
Error while Enabling Active Data Cache Clustering Services: error 1753
I have installed Oracle BAM with Enterprise Link. The installation was all successful, but while enabling Active Data Cache Clustering Services using the following command:
cluster.exe restype "Oracle Business Activity Monitoring Active Data Cache" /create /dll:"C:\OracleBAM\ADCClusterResourceType.dll"
I am getting the following error:
System error 1753 has occurred.
There are no more endpoints available from the endpoint mapper.
This happens even though I stopped the Active Data Cache Service.
I am unable to resolve it.
Kindly help me out.
Ramesh Nambala
Hi.
I encountered the same problem. In my case, the reason was that I created a new user for BAM (to be provided when installing BAM) but ran the installation under another user account. It seems like these users must be the same to make it work.
Greetings,
cor -
How to enable olap option in oracle enterprise edition 10g?
What settings/steps should one take to enable the OLAP option in Oracle Enterprise Edition 10g?
If we have an installation of Oracle EE 10g, how can we tell whether OLAP is enabled or not?
Can anyone shed light on the OLAP_TABLE function?
In Analytic Workspace Manager 10g, I have created a workspace containing some dimensions and cubes as well.
After mapping to the source table and maintaining it, when I right-click on measures to view data, I get the error ORA-06553: PLS-222: no function with the name 'OLAP_TABLE' exists in this scope
The Export function is disabled when the responsibility is opened initially.
When we enter a specific form, it is automatically enabled.
Log in to a user who has the Application Developer responsibility, go to Application > Register, and press F11 and Ctrl-F11. -
Feature not enabled: OLAP Window Functions
I was trying to use the RANK function in oracle 8i when I got this message
ORA-00439: feature not enabled: OLAP Window Functions
Where do I enable this feature? There isn't enough detail about it in the documentation.
Any help will be appreciated.
That's correct.
The differences between Standard and Enterprise Edition are listed here:
http://www.oracle.com/technology/products/oracle8i/pdf/8i_fam.pdf -
When does proactive caching make sense?
Hi all!
A standard pattern for multi-dimensional cubes is to have
one cube do the heavy, time-consuming processing and then synchronize it to query cubes.
In this setup, does proactive caching make sense?
Best regards
Bjørn
B. D. Jensen
Hello Jensen,
Proactive caching is useful for low-volume cubes where the data updates frequently, such as inventory or forecasting. But I will tell you from my own experience that proactive caching in SSAS is not worth it. It sometimes behaves unexpectedly: when data is updated, inserted, or deleted
in the source table, the cube doesn't start processing. You are better off creating a SQL job that processes the cube after a specified time.
If you want to process the cube at a specified interval, then I would suggest you go with a SQL job.
Hope this will help you !!!
Sanjeewan -
Proactive caching Error while installing BPC 7.0 SP6
All,
We are installing BPC 7.0 sp 6 on Microsoft SQL Server 2008 (RTM) - 10.0.1600.22 (Intel X86) Jul 9 2008 14:43:34 Copyright (c) 1988-2008 Microsoft Corporation Standard Edition on Windows NT 5.2 <X86> (Build 3790: Service Pack 2)
In short SQL 2008 Standard Edition. Will BPC 7.0 work with it?
We tried to install BPC and received the "Proactive caching" errors. I know we will need this feature for ROLAP.
Can we keep Standard Edition and make it work? Or we will need EE of SQL for BPC to work?
Thanks for your help.
Sam
Edited by: Sam Patel on Mar 24, 2010 10:05 AM
SQL Server EE is required for all SAP BPC Microsoft versions.
BPC uses SSAS functionality that is delivered only with EE.
Proactive caching is one of the features available only in EE, not in SE. But there are also other SSAS features used by BPC that can be found only in EE.
It is true that, strictly speaking, only SSAS requires EE; for all the other components we don't really need EE.
So the installation will work, but of course you will have some limitations coming from Microsoft.
For example, SQL Server SE can handle a maximum of 4 processors.
RS cannot be installed into a Web farm if you are using SE.
And many others.
I hope this helps you get a better view of what is required and why.
Regards
Sorin Radulescu
Edited by: Sorin Radulescu on Mar 25, 2010 2:39 PM -
Proactive Caching - Monitoring processing
I'd like to hear from anyone that is using proactive caching and how they monitor the loads of the cube.
I have created an Aggregation=Max measure in each measure group that loads as getdate(); this allows me to see the load date by partition. My date dimension has a partition_cd, which denotes which dates a partition covers. The partition date
scheme is the same for all measure groups. This handles things from the user perspective: they know how recent their data is.
What it doesn't do is let me see average load times, number of loads per day, etc.: the things I need from a support perspective.
The only solution I have seen for this is the ASTrace.exe application. That would mean installing something custom on the server, which I'd like to avoid if I can. Any other options out there?
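For what it's worth, if the processing events can be captured somewhere queryable (e.g. ASTrace writing trace rows to a table), the support-side stats reduce to a simple aggregation; a minimal sketch with made-up timestamps:

```python
# Hypothetical sketch: average load time and loads per day from captured
# processing runs. The (partition, start, end) rows below are invented.
from collections import defaultdict
from datetime import datetime

runs = [
    ("FACT_2014", datetime(2014, 5, 1, 1, 0), datetime(2014, 5, 1, 1, 4)),
    ("FACT_2014", datetime(2014, 5, 1, 13, 0), datetime(2014, 5, 1, 13, 6)),
    ("FACT_2013", datetime(2014, 5, 2, 1, 0), datetime(2014, 5, 2, 1, 2)),
]

loads_per_day = defaultdict(int)
total_secs = 0.0
for _, start, end in runs:
    loads_per_day[start.date()] += 1
    total_secs += (end - start).total_seconds()

avg_load_secs = total_secs / len(runs)
print(avg_load_secs)  # 240.0
print(dict(loads_per_day))
```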
Any other feedback on this area in general?
As always you guys are great, thanks for all the help!
-Ken
Hi Ken,
Thank you for your question.
I am trying to involve someone more familiar with this topic for a further look at this issue. Some delay might be expected from the job transfer. Your patience is greatly appreciated.
Thank you for your understanding and support.
Regards,
Charlie Liao
TechNet Community Support -
Proactive Caching for Cube process.
Hi,
We need to implement proactive caching on one of our cubes in SQL Server 2012. We are able to do it at the partition (measures) level when data changes in the tables; I am looking for an option to implement proactive caching at the cube level every
night at a particular time (12:00 A.M.) irrespective of data changes in the tables. We don't want to use SSIS packages.
Thank You.
Praveen
Hi Praveen,
Proactive Caching is a feature in SSAS that allows you to specify when to process a measure group partition or dimension as the data in the relational data source changes.
Generally, to implement proactive caching for a cube, we develop an SSIS package that processes the dimensions and measure group partitions, and then execute the SSIS package periodically. As Kieran said, why don't you want to use SSIS packages in your scenario?
Here are some useful links for your reference.
http://vmdp.blogspot.com/2011/07/pro-active-caching-in-ssas.html
http://www.mssqltips.com/sqlservertip/1563/how-to-implement-proactive-caching-in-sql-server-analysis-services-ssas/
Regards,
Charlie Liao
If you have any feedback on our support, please click here.
TechNet Community Support -
How to enable OHS compression/cache for OBIEE
Hi, I have installed OHS (11.1.1.7) and WebCache on OBIEE 11.1.1.7. Does anyone know how to enable/configure compression/caching for OBIEE analytics? Also, how can I validate that compression is working?
Hi Anke,
all tables that were created in V9.7 with the attribute COMPRESS YES will be compressed statically.
db2 " select count(*) , rowcompmode from syscat.tables group by rowcompmode "
After the upgrade to 10.5, all tables created with the attribute COMPRESS YES will get rowcompmode='A', but old tables created with V9.7 will stay at rowcompmode='S'.
You can change tables from rowcompmode='S' to rowcompmode='A' via ALTER TABLE. After this, all new pages, and old pages that are touched, will be adaptively compressed. But old pages that are not touched will remain only statically compressed. To get all pages of an existing table adaptively compressed, you need to move the data, for example with DB6CONV.
Regards
Frank -
SSAS Proactive Cache from View
Hi,
I have recently reconfigured some of my cube partitions to the HOLAP storage mode with proactive caching turned on. At the moment I have the "Notifications" set to "SQL Server" and tracking from a table and everything is working
as expected.
However, ideally I would like to track from an existing view (which would mean I do not have to create new tables for tracking purposes only). I initially tried this, and after some searching have not found a solution, only that most people solve
the issue by tracking from tables. Surely there is a way to achieve tracking from views? If someone could point me in the right direction for some reading, it would be much appreciated.
Thank you for taking the time to read this.
SQL Server Version:
Microsoft SQL Server 2008 R2 (SP1) - 10.50.2550.0 (X64) Jun 11 2012 16:41:53 Copyright (c) Microsoft Corporation Developer Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1)
Hi Nesthead,
According to your description, you created a project based on SQL Server views, now you need to set the change tracking on the views so that you can set cube partitions to the HOLAP storage mode with proactive caching turned on, right?
As you know, we can set the Change Tracking setting at the database level and the table level; however, there is no such option at the view level. Generally, we create a SQL Server view to combine columns from different tables, so the view is based on multiple
tables. If we need to set cube partitions to the HOLAP storage mode with proactive caching turned on, we just need to turn on Change Tracking on the tables that are used to create the view. Here is a similar issue, please see:
http://stackoverflow.com/questions/19978072/ssas-2008-proactive-caching-not-working-on-views
Regards,
Charlie Liao
If you have any feedback on our support, please click here.
TechNet Community Support -
Convergence - problem enabling browser-side caching
I am following the ["Enhancing Browser-Side Caching of Static Files" section of the Convergence Performance Tuning|http://wikis.sun.com/display/CommSuite/Convergence+Performance+Tuning#ConvergencePerformanceTuning-EnhancingBrowserSideCachingofStaticFiles] wiki document.
I installed the class file
-rw-r--r-- 1 root root 2724 Oct 2 08:50 /opt/SUNWappserver/domains/domain1/config/lib/classes/ExpiresFilter.class
And I added the following to default-web.xml (just below the last </servlet-mapping>):
<filter>
<filter-name>ExpiresFilter</filter-name>
<filter-class>iwc.ExpiresFilter</filter-class>
</filter>
<filter-mapping>
<filter-name>ExpiresFilter</filter-name>
<url-pattern>/iwc_static/js/*</url-pattern>
<url-pattern>/iwc_static/layout/*</url-pattern>
<dispatcher>REQUEST</dispatcher>
<dispatcher>FORWARD</dispatcher>
</filter-mapping>
When I restart glassfish, I get this error in iwc_admin.log
ADMIN: ERROR from com.sun.comms.client.admin.web.util.ConfigHelper Thread pool-1-thread-4 ipaddress= sessionid= at 09:21:35,355- Error occured while loading the configuration from the file Parsing Error
Line: 209
URI: file:/var/opt/wisc/wtcnv4/iwc/config/configuration.xml
Message: cvc-complex-type.2.4.b: The content of element 'BackendServiceDetails' is not complete. One of '{Enable, EnableSSL, ServiceURI, Timeout}' is expected.
ADMIN: ERROR from com.sun.comms.client.admin.web.util.ConfigHelper Thread pool-1-thread-4 ipaddress= sessionid= at 09:21:35,355- Parsing error while loading configuration: Error occured while loading the configuration from the file /var/opt/wisc/wtcnv4/iwc/config/configuration.xml due to:
Everything works fine if I remove the <filter> stanza from the config.
I'm guessing that it isn't loading the class correctly. Any ideas?
jessethompson wrote:
I am following the ["Enhancing Browser-Side Caching of Static Files" section of the Convergence Performance Tuning|http://wikis.sun.com/display/CommSuite/Convergence+Performance+Tuning#ConvergencePerformanceTuning-EnhancingBrowserSideCachingofStaticFiles] wiki document.
What version of Convergence are you using (./iwcadmin -V)?
I tried the steps with Convergence patch 10 (update 3) and Glassfish 2.1 (on Redhat Linux) and it worked fine -- I see "Expires" and "Cache-Control" headers for Convergence file requests.
I installed the class file
-rw-r--r-- 1 root root 2724 Oct 2 08:50 /opt/SUNWappserver/domains/domain1/config/lib/classes/ExpiresFilter.class
The class file should be in:
/opt/SUNWappserver/domains/domain1/lib/classes/iwc/
Regards,
Shane. -
Can't enable HDD write caching
I recently upgraded my PC. I am using Vista x64 on an MSI P35 mobo with a Seagate SATA II 250GB drive and a 36GB Raptor (SATA I). It has the newest 1.8 BIOS and the newest Intel AHCI drivers.
The problem is this: I can't enable write caching on either hard drive in Device Manager. The checkbox is there, but below it, it says something like "This device does not support write caching."
Now I am not dumb, and the Raptor used to have write caching enabled on my old nForce4 mobo in Vista. I have AHCI enabled in the mobo BIOS too. Does anybody know what is causing this?
Also, does anybody know of any good utilities that will tell me the features that are enabled (NCQ, SATA 3.0, etc.) without requiring an install? I just like to execute utilities and get results without installing stuff (like CPU-Z and GPU-Z).
Hi,
By default, write caching is disabled on a disk that contains the Active Directory database (Ntds.dit). Also, write caching is disabled on a disk that contains the Active Directory log files. By doing this, you enhance the reliability of the Active
Directory files.
Thus, if it is a DC, try to move the Active Directory database and the Active Directory log files off the disk on which you need to enable write caching.
If you have any feedback on our support, please send to [email protected]