Increase CalcLockBlock setting
I am receiving the following error message when drilling down on an ad hoc report, or generating various reports.
Dynamic Calc processor cannot lock more than [50] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry (a small data cache setting could also cause this problem, please check the data cache size setting).
My essbase.cfg settings are as follows.
Direct I/O is turned off. Buffered I/O is turned on.
__SM__BUFFERED_IO TRUE
; No wait I/O is turned off. Waited I/O is turned on.
__SM__WAITED_IO TRUE
; Turn off Intelligent Calculation.
UPDATECALC FALSE
; Bottom-up calculations are performed on all formulas.
CalcOptFrmlBottomUp TRUE
CALCCACHEHIGH 50000000
CALCCACHEDEFAULT 200000
CALCCACHELOW 1024
CALCNOTICEHIGH 20
CALCNOTICEDEFAULT 10
CALCNOTICELOW 5
CALCLOCKBLOCKHIGH 10000
CALCLOCKBLOCKDEFAULT 4000
CALCLOCKBLOCKLOW 1000
DYNCALCCACHEMAXSIZE 81250000000
DATAERRORLIMIT 1000000000
LOCKTIMEOUT 120
TIMINGMESSAGES FALSE
EXCEPTIONLOGOVERWRITE FALSE
SSPROCROWLIMIT 500000
SSLOGUNKNOWN FALSE
How do I correct my errors?
Thank you in advance.
I see the following in my cube's Database Properties under the Caches tab.
Cache memory locking - Not checked
Cache sizes
Index cache setting (KB) 1024
Index cache current value (KB) 1024
Data file cache setting (KB) 32768
Data file cache current value (KB) 0
Data cache setting (KB) 11457
Data cache current value (KB) 11407
Index page setting (KB) 8
Index page current value (KB) 8
Are you suggesting that I change the data cache setting?
If so, to what amount?
Thank you,
Dan
Similar Messages
-
Error while executing business rule "Please increase CalcLockBlock setting"
Hello,
An error is occurring when I execute a particular business rule. The following message appears when I view the error from the Job Console:
"Dynamic calc processor cannot allocate more than [100] blocks from the heap. Please increase CalcLockBlock setting and then retry"
No such error occurs when I execute any other business rule.
Can anyone please tell me how to resolve it?
Regards,
Sa'ad
Hi Sa'ad,
The business rule you're getting the error from contains a calculation whose result needs access to more than 100 blocks.
The error comes from three possible causes:
1. Lockblock setting is low (essbase.cfg)
2. Your data cache setting is too small
3. Block size is too large
I think you have to increase the data cache setting (by default 8192 KB) too.
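As a sketch, the data cache can also be raised per database with a MaxL statement (the application/database names and the 100 MB value here are illustrative assumptions, not recommendations):

```
/* hypothetical example: raise the data cache for one BSO database */
alter database Sample.Basic set data_cache_size 100m;
```

The new size takes effect when the database is restarted.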
Cheers
Uli
Edited by: Uli Drexelius on 13.07.2011 14:23 -
Hi,
Our environment is Essbase 11.1.2.2, working with the Essbase EAS and Shared Services components. One of our users tried to run the calc script of one application and faced this error.
Dynamic Calc processor cannot lock more than [100] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry (a small data cache setting could also cause this problem, please check the data cache size setting).
I have done some Googling and found that we need to add something to the essbase.cfg file, as described below.
1012704 Dynamic Calc processor cannot lock more than number ESM blocks during the calculation, please increase CalcLockBlock setting and then retry (a small data cache setting could also cause this problem, please check the data cache size setting).
Possible Problems
Analytic Services could not lock enough blocks to perform the calculation.
Possible Solutions
Increase the number of blocks that Analytic Services can allocate for a calculation:
Set the maximum number of blocks that Analytic Services can allocate to at least 500.
If you do not have an $ARBORPATH/bin/essbase.cfg file on the server computer, create one using a text editor.
In the essbase.cfg file on the server computer, set CALCLOCKBLOCKHIGH to 500.
Stop and restart Analytic Server.
Add the SET LOCKBLOCK HIGH command to the beginning of the calculation script.
Set the data cache large enough to hold all the blocks specified in the CALCLOCKBLOCKHIGH setting.
Determine the block size.
Set the data cache size.
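A quick way to apply the sizing rule in the last two steps is: data cache at least CALCLOCKBLOCKHIGH times the block size. A rough sketch (the 64 KB block size and the 500 setting are illustrative, not recommendations):

```python
# Rough sizing check: the data cache should be able to hold at least
# as many blocks as CALCLOCKBLOCKHIGH allows Essbase to fix at once.

def min_data_cache_kb(block_size_bytes: int, calc_lock_block_high: int) -> float:
    """Minimum data cache (KB) needed to hold calc_lock_block_high blocks."""
    block_size_kb = block_size_bytes / 1024
    return block_size_kb * calc_lock_block_high

# Illustrative values: a 64 KB block and CALCLOCKBLOCKHIGH 500
needed_kb = min_data_cache_kb(65536, 500)
print(f"data cache should be at least {needed_kb:.0f} KB")  # 32000 KB
```

The block size comes from the database statistics in EAS (Statistics tab).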
Actually, our server config file (essbase.cfg) does not have the settings below:
CalcLockBlockHigh 2000
CalcLockBlockDefault 200
CalcLockBlocklow 50
So my doubt is: if we edit the essbase.cfg file, add the above settings, and restart the services, will it work? And if so, why should we change the server config file if the problem is with one application's calc script? Please guide me on how to proceed.
Regards,
Naveen
Your calculation needs to hold more blocks in memory than your current setup allows.
From the docs (quoting so I don't have to write it, not to be a smarta***):
CALCLOCKBLOCK specifies the number of blocks that can be fixed at each level of the SET LOCKBLOCK HIGH | DEFAULT | LOW calculation script command.
When a block is calculated, Essbase fixes (gets addressability to) the block along with the blocks containing its children. Essbase calculates the block and then releases it along with the blocks containing its children. By default, Essbase allows up to 100 blocks to be fixed concurrently when calculating a block. This is sufficient for most database calculations. However, you may want to set a number higher than 100 if you are consolidating very large numbers of children in a formula calculation. This ensures that Essbase can fix all the required blocks when calculating a data block and that performance will not be impaired.
Example
If the essbase.cfg file contains the following settings:
CALCLOCKBLOCKHIGH 500 CALCLOCKBLOCKDEFAULT 200 CALCLOCKBLOCKLOW 50
then you can use the following SET LOCKBLOCK setting commands in a calculation script:
SET LOCKBLOCK HIGH;
means that Essbase can fix up to 500 data blocks when calculating one block.
Support doc is saying to change your config file so those settings can be made available for any calc script to use.
On a side note, if this was working previously and now isn't then it is worth investigating if this is simply due to standard growth or a recent change that has made an unexpected significant impact. -
CalcLockBlock setting.....
Hi All,
I have written a business rule in the EAS console. The rule was validated successfully, but when I tried to launch it, it shows the error message below:
Detail: Cannot calculate. Analytic Server Error(1012704): Dynamic Calc processor cannot lock more than [4] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry (a small data cache setting could also cause this problem, please check the data cache size setting).
I read through the Essbase Technical Reference guide on increasing the CalcLockBlock setting. It is written that
If the essbase.cfg file contains the following settings:
CALCLOCKBLOCKHIGH 500
CALCLOCKBLOCKDEFAULT 200
CALCLOCKBLOCKLOW 50
then you can use the following SET LOCKBLOCK setting commands in a calculation script:
SET LOCKBLOCK HIGH;
Then I added the above 3 settings to my essbase.cfg file and set LOCKBLOCK HIGH. I am still unable to launch the business rule.
Can someone please suggest how to set the CalcLockBlock size?
Regards,
Hi,
The exact value depends entirely on the type of calculation you are trying to run. However, I suggest you keep increasing these gradually (by 100 each time) until you can make it work. In most cases, though, 500 should work fine. Have you recycled the Essbase service? essbase.cfg settings take effect only after a restart.
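For reference, SET LOCKBLOCK goes at the top of the calculation script, before the calculation logic. A minimal sketch (the FIX members are placeholders):

```
SET LOCKBLOCK HIGH; /* uses the CALCLOCKBLOCKHIGH value from essbase.cfg */
FIX ("Actual", "FY12")
    CALC DIM ("Accounts");
ENDFIX
```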
Cheers,
Alp -
Best practice to have cache and calclockblock setting?
Hi,
I want to implement hyperion planning.
What should be the best practice for setting Essbase settings for optimal performance?
Personally, I would work out the application design before you consider performance settings. There are so many variables involved that trying to do it upfront is going to be difficult.
That being said each developer has their own preferred approach and some will automatically add certain expressions into the Essbase.cfg file, set certain application level settings via EAS (Index Cache, Data Cache, Data File Cache).
There are many posts discussing these topics in this forum so suggest you do a search and gather some opinions.
Regards
Stuart -
Tuning average of a data set calculation in an increasing Data Set
I was asked a question by one of my colleagues in another team:
In a table with increasing number of rows and that too in an increasing order of Id Keys- 1,2, ....3000... 12000 .... 50000..............
We have a column age.
And we have a query:
select avg(age) from <tablename> ;
This query is increasingly taking a longer time
How to optimize it.
I guessed that we might need some sort of partitioning or additional grouping but I am not fully sure
as to that would be the best solution.
What do you say?
I say it sounds like an interview question to me!
A question like that is often asked to see how (or if) a person thinks before answering.
Will they just blindly try to provide an answer?
Are they willing to ask questions to clarify the question being asked?
Will they ask questions to try to understand the actual requirements?
1. No one stores AGE in a database. Someone's age changes every day, so 'age' is a totally meaningless piece of data. A data element like 'age at death' is meaningful and does NOT change. It can be meaningful because it protects potentially sensitive information like birth date and death date.
2. Neither the questioner nor you have presented ANY INFO indicating that any query needs to be optimized. Of course a query on an increasing amount of data will take an 'increasingly' longer amount of time. That should be obvious. If someone asks you to count to 100 won't that take MORE TIME than counting to 95?
3. A simple query like 'SELECT AVG(age)' does NOT need to be optimized - certainly not based on any info you provided. That AVG function may be part of a larger, more complex, query, and maybe that 'other' query needs to be tuned.
Your 'colleague' is either 'testing' you or 'playing' with you.
If it ain't broke, don't fix it.
The first step is to VERIFY that a problem actually exists. That means test and get PROOF that the query is taking longer than before and measure HOW MUCH longer it is taking. As noted above, if there is more data then a query MUST TAKE LONGER - that is a simple fact. That doesn't mean there is any tuning to be done.
Only after you verify that a problem actually exists do you start looking for solutions. -
Way to increase DPI setting in aperture like iPhoto?
Hi
There was a way in iPhoto to change the plist file to have it generate a higher-res book prior to uploading to Apple - depending on whom you spoke with, this made a minor or major difference. Is anyone aware of something like this for Aperture?
It was only iPhoto 5 that had the plist hack; iPhoto 6 produces 300dpi books as standard, as does Aperture - there is no need to increase the dpi as you can't increase the printing size through the print services anyway.
-
??
Hi, I have been having the same problem for about 2weeks, keep deleting the app and reinstalling but this does not help. I even deleted it from Facebook as that's where u play thru, but that doesn't help either!!!!!! Its driving me mad. Any luck and I'll let u know
-
Increasing the LockBlock setting
Hi,
I'm trying to create an automated process where I can define the exact dimension members that will be exported to an SQL table.
I'm using the DataExport command to do it, and substitution variables that will fill the FIX statements of the dataexport query.
Since users can select a great number of blocks depending on the selection made, I'm getting the Dynamic Calc Processor error:
Dynamic calc processor cannot allocate more than [1200] blocks from the heap. Please increase CalcLockBlock setting and then retry
1200 blocks is the default number I have in essbase.cfg, while CALCLOCKBLOCKHIGH has a higher number. However, using SET LOCKBLOCK HIGH; is not helping me get around the problem and I'm not sure why.
Any ideas?
Thank you
Edited by: Icebergue on Nov 27, 2012 7:42 AM
No, I'm not using parallel calc and yes, I'm exporting dynamic calcs :) Exporting stored members is not a problem because if I switch DataExportDynamicCalc to OFF it works well.
Block Size: ~ 60 KB
Data Cache: 2000000 KB
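A quick sanity check on those numbers: with a ~60 KB block and a 2,000,000 KB data cache, far more than 1200 blocks fit in the cache, which suggests the cache itself is not the limiter here (a rough sketch using the figures above):

```python
# Rough check: how many ~60 KB blocks fit in a 2,000,000 KB data cache?
block_size_kb = 60          # approximate block size from the post
data_cache_kb = 2_000_000   # data cache setting from the post

blocks_in_cache = data_cache_kb // block_size_kb
print(blocks_in_cache, "blocks fit in the data cache")  # 33333
```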
SET LOCKBLOCK HIGH;
SET DATAEXPORTOPTIONS
{
DataExportLevel "ALL";
DataExportDynamicCalc ON;
DataExportRelationalFile ON;
DataExportOverwriteFile ON;
DataExportDimHeader OFF;
};
FIX(&YearOrc, "ORC_N","Base","FRAC")
FIX("OCSPROJREC")
FIX(@CHILDREN(RNVIDA))
FIX(TOT_CHANNEL)
FIX(&ExportAccount)
FIX("Jan":"Dec")
DATAEXPORT "DSN" "dsn_name" "table_name" "username" "password";
ENDFIX
ENDFIX
ENDFIX
ENDFIX
ENDFIX
ENDFIX
-
CalcLockBlock error at execution.
Hi,
We have a calculation script which is throwing an execution Error: 1012704 Dynamic Calc processor cannot lock more than [200] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry (a small data cache setting could also cause this problem, please check the data cache size setting).
Accordingly we have set the below setting in the essbase.cfg file
CALCLOCKBLOCKHIGH 1500
CALCLOCKBLOCKDEFAULT 200
CALCLOCKBLOCKLOW 100
DATACACHESIZE 90M
Now it's throwing an execution Error: 1012704 Dynamic Calc processor cannot lock more than [495] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry (a small data cache setting could also cause this problem, please check the data cache size setting).
Please help me provide some solution...
Thanks in advance
Hi John,
Thanks for your response.
Actually, we have tried different data cache settings but still without success.
We are using one of the script written below:
I have written this piece of code but it's not showing the desired result. I think the problem is in the AVGRANGE.
Please look into this and let me know if I am doing anything wrong.
I need to accomplish this task: employees E1 and E2 are in Entity1 in Grade S in forecast 1, and in forecast 2 a new employee, E3, has come in. Whether he belongs to the same entity can be identified by a new account, say "F": if "F" is present for that employee in that particular entity, he belongs to that entity. Then I need to calculate:
"P" value for E3 for a month=Avg of "P" value for E1 & E2 in Entity1 in Grade S for that month.
I think this code is calculating for invalid combination also.
FIX (&CurrFctScenario, &CurrFctVersion, &CurrYear)
FIX (&SeedCurrency)
FIX (@DESCENDANTS("Entity"), @DESCENDANTS("Grade"), @DESCENDANTS("Employee"))
FIX (&CurrMonth:"MAY", &SeedHSP)
"P"(
IF ("F" != #Missing AND "P" == #Missing)
@AVGRANGE(SKIPNONE, "P", @CHILDREN("Employee")->@CURRMBR("Grade")->@CURRMBR("Entity"));
ENDIF;
)
ENDFIX
ENDFIX
ENDFIX
ENDFIX
One more thing: when I test this code for, say, two or three employees it works fine, but when I use @children(Employee) I get the error message
Error: 1012704 Dynamic Calc processor cannot lock more than [200] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry (a small data cache setting could also cause this problem, please check the data cache size setting).
Is there any other way of doing this calculation? Or will only changing the data cache setting help us?
Error: 1012704 Dynamic Calc processor cannot lock more than [25] ESM blocks
Dear All,
I get the Following Error in the Essbase console when I try to Execute any CalcScript.
Error: 1012704 Dynamic Calc processor cannot lock more than [25] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry (a small data cache setting could also cause this problem, please check the data cache size setting).
Please find below the detailed statistics of my Planning application's database and outline.
please help guys........
GetDbStats:
-------Statistics of AWRGPLAN:Plan1 -------
Dimension Name Type Declared Size Actual Size
===================================================================
HSP_Rates SPARSE 11 11
Account DENSE 602 420
Period DENSE 19 19
Year SPARSE 31 31
Scenario SPARSE 6 6
Version SPARSE 4 4
Currency SPARSE 10 10
Entity SPARSE 28 18
Departments SPARSE 165 119
ICP SPARSE 80 74
LoB SPARSE 396 344
Locations SPARSE 57 35
View SPARSE 5 5
Number of dimensions : 13
Declared Block Size : 11438
Actual Block Size : 7980
Declared Maximum Blocks : 3.41379650304E+015
Actual Maximum Blocks : 1.87262635317E+015
Number of Non Missing Leaf Blocks : 10664
Number of Non Missing Non Leaf Blocks : 2326
Number of Total Blocks : 12990
Index Type : B+ TREE
Average Block Density : 0.01503759
Average Sparse Density : 6.936782E-010
Block Compression Ratio : 0.001449493
Average Clustering Ratio : 0.3333527
Average Fragmentation Quotient : 19.3336
Free Space Recovery is Needed : No
Estimated Bytes of Recoverable Free Space : 0
GetDbInfo:
----- Database Information -----
Name : Plan1
Application Name : AWRGPLAN
Database Type : NORMAL
Status : Loaded
Elapsed Db Time : 00:00:05:00
Users Connected : 2
Blocks Locked : 0
Dimensions : 13
Data Status : Data has been modified
since last calculation.
Data File Cache Size Setting : 0
Current Data File Cache Size : 0
Data Cache Size Setting : 3128160
Current Data Cache Size : 3128160
Index Cache Size Setting : 1048576
Current Index Cache Size : 1048576
Index Page Size Setting : 8192
Current Index Page Size : 8192
Cache Memory Locking : Disabled
Database State : Read-write
Data Compression on Disk : Yes
Data Compression Type : BitMap Compression
Retrieval Buffer Size (in K) : 10
Retrieval Sort Buffer Size (in K) : 10
Isolation Level : Uncommitted Access
Pre Image Access : No
Time Out : Never
Number of blocks modified before internal commit : 3000
Number of rows to data load before internal commit : 0
Number of disk volume definitions : 0
Currency Info
Currency Country Dimension Member : Entity
Currency Time Dimension Member : Period
Currency Category Dimension Member : Account
Currency Type Dimension Member :
Currency Partition Member :
Request Info
Request Type : Data Load
User Name : admin@Native Directory
Start Time : Mon Aug 15 18:35:51 2011
End Time : Mon Aug 15 18:35:51 2011
Request Type : Customized Calculation
User Name : 6236@Native Directory
Start Time : Tue Aug 16 09:44:10 2011
End Time : Tue Aug 16 09:44:12 2011
Request Type : Outline Update
User Name : admin@Native Directory
Start Time : Tue Aug 16 10:50:02 2011
End Time : Tue Aug 16 10:50:02 2011
ListFiles:
File Type
Valid Choices: 1) Index 2) Data 3) Index|Data
>>Currently>> 3) Index|Data
Application Name: AWRGPLAN
Database Name: Plan1
----- Index File Information -----
Index File Count: 1
File 1:
File Name: C:\Oracle\Middleware\user_projects\epmsystem1\EssbaseServer\essbaseserver1\APP\AWRGPLAN\Plan1\ess00001.ind
File Type: INDEX
File Number: 1 of 1
File Size: 8,024 KB (8,216,576 bytes)
File Opened: Y
Index File Size Total: 8,024 KB (8,216,576 bytes)
----- Data File Information -----
Data File Count: 1
File 1:
File Name: C:\Oracle\Middleware\user_projects\epmsystem1\EssbaseServer\essbaseserver1\APP\AWRGPLAN\Plan1\ess00001.pag
File Type: DATA
File Number: 1 of 1
File Size: 1,397 KB (1,430,086 bytes)
File Opened: Y
Data File Size Total: 1,397 KB (1,430,086 bytes)
File Size Grand Total: 9,421 KB (9,646,662 bytes)
GetAppInfo:
-------Application Info-------
Name : AWRGPLAN
Server Name : GITSHYPT01:1423
App type : Non-unicode mode
Application Locale : English_UnitedStates.Latin1@Binary
Status : Loaded
Elapsed App Time : 00:00:05:24
Users Connected : 2
Data Storage Type : Multidimensional Data Storage
Number of DBs : 3
List of Databases
Database (0) : Plan1
Database (1) : Plan2
Database (2) : Plan3
ESM Block Issue
Cheers..!! -
Trying to understand blocksize and the correct sizing requirements..
I'm getting the following error after migrating a cube from a 9.3 environment to an 11 environment while running a calc script - which by the way works fine in 9.3 with identical cache settings.
Error: 1012704 Dynamic Calc processor cannot lock more than [2] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry (a small data cache setting could also cause this problem, please check the data cache size setting).
Applying the following formula to our environment, I get 122.84 allocatable blocks, if my math is correct. The error states "[2] ESM" blocks, so I must be doing something wrong?
Data Cache in K
______________ = Number of blocks that can be allocated
Block Size in K
block size 1651392/1024=1612.6875
Cache 198107
# of allocatable blocks 122.8427702
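The arithmetic above can be reproduced directly; since it checks out, the [2] ESM figure points at the settings not actually being in effect rather than at the math (a quick sketch using the numbers from the post):

```python
# Reproduce the allocatable-blocks estimate from the post:
# data cache (KB) / block size (KB) = blocks that fit in the cache.

block_size_kb = 1651392 / 1024      # block size in bytes -> KB (~1612.69 KB)
data_cache_kb = 198107              # data cache setting in KB

allocatable_blocks = data_cache_kb / block_size_kb
print(f"{allocatable_blocks:.2f} blocks fit in the data cache")  # ~122.84
```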
The settings from our config file are as follows:
CALCLOCKBLOCKHIGH 5500
CALCLOCKBLOCKDEFAULT 3500
CALCLOCKBLOCKLOW 100
CalcCacheHigh 199229440
CalcCacheDefault 104857600
CalcCacheLow 52428800
So what am I missing here, why the [2] ESM error? Is that a bogus error for a problem with the calc script I'm trying to run?
Thanks for your help.
Adam,
OK, so even though I "thought" the setting survived a server restart, I went ahead and stopped and started the database, and the settings applied and all seems to be good.
Thanks for the feedback. -
Essbase Error involving member formula
We are having an essbase issue when accessing a data intersection that has three members that have member formulas. In FRS, we get the following error message:
Error executing query: Error: Internal Essbase JAPI error: [Cannot perform cube view operation. Essbase Error(1012700): The dynamic calc processor cannot allocate more than [9] blocks from the heap. Either the CalcLockBlock setting is too low or the data cache size setting is too low. Increase CalcLockBlock setting or the data cache size setting and then retry]
I have tried numerous combinations of fixes for CalcLockBlock and data cache size and can't seem to get around this error.
If we look at an intersection with just two of the member formulas, it works. If we make the member formulas shorter, then it will load, but when we have the full member formulas there, it errors out.
Is this something that does indeed need to be fixed by CalcLockBlock settings? Any recommended settings? I have seen this issue on and off for years now and can't really nail down a good solution for it other than changing the member formulas.
I just want to make sure that when you increase the data cache size you are restarting the application you are changing the settings on.
Also, you should see the error changing
for example
The dynamic calc processor cannot allocate more than [9] blocks from the heap
change to 20000kb
The dynamic calc processor cannot allocate more than [16] blocks from the heap
change to 40000kb
The dynamic calc processor cannot allocate more than [24] blocks from the heap
i.e. the number of blocks able to be locked is increasing - do you see this, or does it not change? -
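The behavior described above, the allocatable block count growing as the data cache grows, can be sketched roughly as cache size divided by block size (the 8 KB block size here is a made-up illustration; the real heap accounting inside Essbase is more involved):

```python
# Sketch: a larger data cache lets the dynamic calc processor allocate
# more blocks from the heap. The 8 KB block size is a made-up example.

BLOCK_SIZE_KB = 8

def heap_blocks(data_cache_kb: int) -> int:
    """Rough upper bound on blocks that fit in the data cache."""
    return data_cache_kb // BLOCK_SIZE_KB

for cache_kb in (10000, 20000, 40000):
    print(cache_kb, "KB ->", heap_blocks(cache_kb), "blocks")
```

If the reported block count does not rise as you grow the cache, the new setting is likely not being picked up (e.g. the application was not restarted).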
Dynamic/Stored calculations in BSO
Hello Guys,
(Rephrasing the question)
I've below queries in respect to Storage for a BSO member calculations,
first consider these three dimensions:
Measures
-Source
---A(~)
---B(~)
-Cals
---D(~)
---E(~)
Product
--P1(+)
----P11(+)
----P12(+)
----P13(+)
Customer
--C1
---C11(+)
---C12(+)
Here
1) For Measure 'D', I want data for Measure A to be aggregated only for P11 and P12 products,
Calculation based on Single Dimension members.
2) For Measure 'E', I want data to refer only to the sum of particular combinations, like E = (B -> P13 -> C12) + (B -> P11 -> C11)
Calculation based on combinations of Dimensions
I have several KPIs and different rules needs to be implement across dimensions, so using Fix is not so easy.
For now I'm using member formulas with dynamic calcs.
As data increases, I'm facing the following challenge:
dynamic calc processor cannot allocate more than [100] blocks from the heap. Please increase CalcLockBlock setting and then retry
I've increased CalcLockBlock to 500, restarted the Essbase service and tried, but am still facing the same problem. Hence I made the calc members stored.
What I want to know is
1) Any suggestion for calcs, as the levels and dimensions are different for several measures
2) What is the difference for Calculation process and result for Dynamic Calc over Stored across dimensions.
Thank You
Edited by: NareshV on Mar 28, 2011 8:19 PM
Thank you for your responses.
Srinivas Bobbala wrote:
D = (B -> P13 -> C12) + (B -> P11 -> C11) - This requires calculating only two blocks. I think calculations are more complex than this with IF conditions.
I was just giving an example, to increase the complexity here, I've some 5 other dimensions as well and hence you can understand it is not just 2 blocks.
I've few Dynamic Cal members with outline formulae which refer to different block combinations.
When I pull data from Add-in, I get Essbase Error
Dynamic calc processor cannot allocate more than [100] blocks from the heap. Please increase CalcLockBlock setting and then retry
From the Application log, I get to see the following error
[Wed Mar 30 01:43:44 2011]Local/APP/DB1/admin/Error(1020004)
An error [1012700] occurred in Spreadsheet Extractor.
[Wed Mar 30 01:43:44 2011]Local/APP/DB1/admin/Info(1020055)
Spreadsheet Extractor Elapsed Time : [0] seconds
[Wed Mar 30 01:43:44 2011]Local/APP/DB1/admin/Info(1020082)
Spreadsheet Extractor Big Block Allocs -- Dyn.Calc.Cache : [100] non-Dyn.Calc.Cache : [0]
[Wed Mar 30 01:43:44 2011]Local/APP/DB1/admin/Warning(1080014)
Transaction [ 0x400c3( 0x4d928ad0.0x58610 ) ] aborted due to status [1012700].
I've tried to resolve this by
1) Adding 'CALCLOCKBLOCK HIGH 500' in essbase.cfg (as I did not see CALCLOCKBLOCK in the config file)
2) Increased my Datafile cache to 505416KB as per existing Blocks & Block size
Block Size is 1792 B, Existing Blocks 577619
3) Restarted Essbase service and application
Guys, Am I following the correct steps ? How can I make this to work. -
Hi all,
I got the following error...
Error: 1012704 Dynamic Calc processor cannot lock more than [10] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry (a small data cache setting could also cause this problem, please check the data cache size setting).
Then put the following entries in essbase.cfg present in C:\Hyperion\AnalyticServices\bin:
CALCLOCKBLOCKHIGH 500
CALCLOCKBLOCKDEFAULT 200
CALCLOCKBLOCKLOW 50
Then recycle the services in the order mentioned in the STOP and START scripts. Then run the calc scripts again.
Where exactly do I run the above scripts? Please help me.
cheers
prash
If you have the following settings in essbase.cfg:
CALCLOCKBLOCKHIGH 500
CALCLOCKBLOCKDEFAULT 200
CALCLOCKBLOCKLOW 50
then you can use the following SET LOCKBLOCK setting commands in a calculation script:
SET LOCKBLOCK HIGH;
means that Essbase can fix up to 500 data blocks when calculating one block.
SET LOCKBLOCK DEFAULT;
means that Essbase can fix up to 200 data blocks when calculating one block.
SET LOCKBLOCK LOW;
means that Essbase can fix up to 50 data blocks when calculating one block.
Hope it helps you.
Atul K,