Dynamic Calc result different on each retrieval

We're currently testing version 6.5.6 on an NT server. We have two formulas set to Dynamic Calc. On the first retrieval, the formulas seem to retrieve fine. After several retrievals (maybe 20) they stop working. After this, each retrieval gives a different result. It seems to cycle through a few different incorrect answers before it starts over. If I modify the outline, restart services, or reboot the box, the formulas work again for a while. However, after several retrievals, it reverts to its bad behavior. These formulas work consistently in 6.1p3a. We have already tried adding CALCFGDEPOPT FALSE to the Essbase config file to turn off event 28. It did not solve the problem.

Check your application log to see if the dynamic calc is dependent upon another dynamic calc member. If so, you may need to tag the members that calculate incorrectly as Two-Pass, so that the first calculation is completed before the two-pass calc is performed. You might also check the app log to make sure the dynamic calc members are not trying to calculate off of members that have not been calculated yet (i.e., they are below the dynamic calc members in the outline).
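For illustration, a minimal sketch of this kind of dependency, using hypothetical member names that are not from the original post; "Margin %" reads "Margin", which is itself Dynamic Calc, so "Margin %" would be the candidate for the Two-Pass tag:

/* hypothetical member formulas; storage settings shown as comments */
"Margin" = "Sales" - "COGS";               /* Dynamic Calc */
"Margin %" = "Margin" / "Sales" * 100;     /* Dynamic Calc, Two-Pass */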

Similar Messages

  • Retrieval performance becomes poor with dynamic calc members with formulas

    We are facing a retrieval performance issue on our partition cube.
    It was fine before we applied member formulas to 4 of the measures and made them dynamic calc.
    The retrieval time has increased from 1 sec to 5 sec.
    Here is the main formula on one member; all these members are dynamic calc (having member formulas):
    IF (@ISCHILD ("YTD"))
    IF (@ISMBR("JAN_YTD") AND @ISMBR ("Normalised"))
    "Run Rate" =
    (@AVG(SKIPNONE, @LIST (@CURRMBR ("Year")->"JAN_MTD",
    @RANGE (@SHIFT(@CURRMBR ("Year"),-1, @LEVMBRS ("Year", 0)), @LIST("NOV_MTD","DEC_MTD")))) *
    @COUNT(SKIPNONE,@RSIBLINGS(@CURRMBR ("Period")))) + "04";
    ELSE
    IF (@ISMBR("FEB_YTD") AND @ISMBR ("Normalised"))
    "Run Rate" =
    (@AVG (SKIPNONE, @RANGE (@SHIFT(@CURRMBR ("Year"),-1, @LEVMBRS ("Year", 0)),"DEC_MTD"),
    @RANGE (@CURRMBR ("Year"), @LIST ("JAN_MTD", "FEB_MTD"))) *
    @COUNT(SKIPNONE,@RSIBLINGS(@CURRMBR ("Period")))) + "04";
    ELSE
    "Run Rate"
    =(@AVGRANGE(SKIPNONE,"Normalised Amount",@CURRMBRRANGE("Period",LEV,0,-14,-12))*
    @COUNT(SKIPNONE,@RSIBLINGS(@CURRMBR ("Period"))))
    + "Normalised"->"04";
    ENDIF;
    ENDIF;
    ELSE 0;
    ENDIF
    Period is dense
    Year is dense
    Measures (normalised) is dense
    remaining dimensions are all sparse
    block size 112 KB
    index cache set to 10 MB
    retrieval buffer 70 KB
    dynamic calc cache max set to 200 MB
    Please note that this is a partition cube, retrieving data from 2 ASO and 1 BSO underlying cubes.

    I received the following from Hyperion. I had the customer add the following line to their essbase.cfg file and it increased their Analyzer retrieval performance from 30 seconds to 0.4 seconds:
    CalcReuseDynCalcBlocks FALSE
    This is an undocumented setting (it will be documented in Essbase v6.2.3). Here is a brief explanation of this setting from development: This setting is used to turn off a method of reusing dynamically calculated values during retrievals. The method is turned on by default and can speed up retrievals that involve a large number of dynamically calculated blocks that are each required to compute several other blocks. This may happen when there is a big hierarchy of sparse dynamic calc members. However, a large dynamic calculator cache size or a large value of CALCLOCKBLOCK may adversely affect retrieval performance when this method is used. In such cases, the method should be turned off by setting CalcReuseDynCalcBlocks to FALSE in the essbase.cfg file. Only retrievals are affected by this setting.
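    If it helps, this is how that change looks as an essbase.cfg entry (a sketch; essbase.cfg sits in the server's bin directory and the server typically needs a restart before configuration changes take effect):

    CalcReuseDynCalcBlocks FALSE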

  • Retrieval impact of complex dynamic calc formulas

    Hi,
    How do you debug a (complex) dynamic calc member formula to tune it for high performance?
    Thanks!

    If you are using this member in any of the standard reports, run the report both including and excluding this member.
    Then you can come to a conclusion on whether the performance is acceptable or not.
    If you feel it is not acceptable:
    Check the formula for any IF conditions or references that point to a group of sparse members; evaluating those conditions requires loading multiple blocks.
    Recheck and try to minimize them. If the problem still persists, make the member stored and calculate it in a batch calc.
    Read the link below.
    http://download.oracle.com/docs/cd/E10530_01/doc/epm.931/html_esb_dbag/frameset.htm?dcadynca.htm#dcadynca28382

  • Dynamic Calc with if statements

    Hello
    I have a dynamically calculated member whose calculation depends on which member of another dimension the user is looking at, so in the calc we have several IF statements using the @ISMBR function. Our calc looks something like this:
    If (@ismbr("Dim Member1") or @ismbr("Dim Member2"))
         Big Calc 1;
    elseif (@ismbr("Dim Member3") or @ismbr("Dim Member4"))
         Big Calc 2;
    elseif(@ismbr("Dim Member5"))
         Big Calc 3;
    else
         Last Big Calc;
    endif;
    What we are seeing is that when we run a query that should fall through to the final else (i.e., the Last Big Calc), every calculation under every IF is being executed. To come to this conclusion we did the following: each of the calculations is pretty complex and can take a bit of time to run, so we replaced Big Calcs 1-3 with very simple calcs that are very fast and ran the same query again; while the result returned was the same (i.e., the result of Last Big Calc), it returned 10 times faster.
    I’m pretty naive about most things Essbase but I’m stuck with trying to solve this problem so I was hoping that someone might have run into something like this before or knows how I can solve it. Any insight will be very appreciated.
    I’m using Essbase 9.3.1 and my cube is a BSO cube.
    Thanks
    David

    Is it possible that the intermediate results can be put into "memo" members as a stored value, being "on call" so they don't get re-calculated as often?
    This wouldn't eliminate the source of the issue, but it would return the results much faster during retrieves. The downside of course is that these values would extend the calc time, so it may not be the right answer for you.
    Alternative #2: Make the final value stored, using a calc script to iterate the formulas only for the appropriate member sets, eliminating the duplicate effort but moving the calculations away from the dynamic calc realm (a rough sketch follows below).
    -Doug.
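    A rough sketch of Alternative #2, assuming the dynamic member is retagged as a stored member (called "Final Value" here purely as a hypothetical name) and using the placeholder member names from David's post; the Big Calc formulas themselves are not shown:

    SET UPDATECALC OFF;
    FIX ("Dim Member1", "Dim Member2")
        "Final Value" = 0;   /* placeholder - replace with Big Calc 1 */
    ENDFIX
    FIX ("Dim Member3", "Dim Member4")
        "Final Value" = 0;   /* placeholder - replace with Big Calc 2 */
    ENDFIX
    FIX ("Dim Member5")
        "Final Value" = 0;   /* placeholder - replace with Big Calc 3 */
    ENDFIX

    The "everything else" branch would need one more FIX over the remaining members of that dimension (for example, built with @REMOVE) so that Last Big Calc still gets applied.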

  • Dynamic Calc Member with Formula to Calculate a Rolling 13 Year

    I have to create a rollup in my time dimension that will have 13 members representing each month of a 13-month rolling forecast. I am having trouble setting these up as dynamic calc members with formulas assigned to them. For example, one of the members has the following formula.
    As you can see, I am referencing several substitution variables since I want to make this as dynamic as possible without having to change the member formula itself, only the substitution variables. Any help would be much appreciated.
    IF((@ISMBR(&Current_Year)) AND (&Act_RF_Last_Month == "Jan"));
    "RTM 6" = @MDSHIFT(&Act_RF_Last_Month, -1, "Years", , 5, "Period",);
    ELSE #Missing;
    ENDIF
    IF((@ISMBR(&Current_Year)) AND (&Act_RF_Last_Month == "Feb"));
    "RTM 6" = @MDSHIFT(&Act_RF_Last_Month, -1, "Years", , 5, "Period",);
    ELSE #Missing;
    ENDIF
    IF((@ISMBR(&Current_Year)) AND (&Act_RF_Last_Month == "Mar"));
    "RTM 6" = @MDSHIFT(&Act_RF_Last_Month, -1, "Years", , 5, "Period",);
    ELSE #Missing;
    ENDIF
    IF((@ISMBR(&Current_Year)) AND (&Act_RF_Last_Month == "Apr"));
    "RTM 6" = @MDSHIFT(&Act_RF_Last_Month, -1, "Years", , 5, "Period",);
    ELSE #Missing;
    ENDIF
    IF((@ISMBR(&Current_Year)) AND (&Act_RF_Last_Month == "May"));
    "RTM 6" = @MDSHIFT(&Act_RF_Last_Month, -1, "Years", , 5, "Period",);
    ELSE #Missing;
    ENDIF
    IF((@ISMBR(&Current_Year)) AND (&Act_RF_Last_Month == "Jun"));
    "RTM 6" = @MDSHIFT(&Act_RF_Last_Month, -1, "Years", , 5, "Period",);
    ELSE #Missing;
    ENDIF
    IF((@ISMBR(&Current_Year)) AND (&Act_RF_Last_Month == "Jul"));
    "RTM 6" = @MDSHIFT(&Act_RF_Last_Month, -1, "Years", , 5, "Period",);
    ELSE #Missing;
    ENDIF
    IF((@ISMBR(&Current_Year)) AND (&Act_RF_Last_Month == "Aug"));
    "RTM 6" = @SHIFT(&Act_RF_Last_Month,-7);
    ELSE #Missing;
    ENDIF
    IF((@ISMBR(&Current_Year)) AND (&Act_RF_Last_Month == "Sep"));
    "RTM 6" = @SHIFT(&Act_RF_Last_Month,-7);
    ELSE #Missing;
    ENDIF
    IF((@ISMBR(&Current_Year)) AND (&Act_RF_Last_Month == "Oct"));
    "RTM 6" = @SHIFT(&Act_RF_Last_Month,-7);
    ELSE #Missing;
    ENDIF
    IF((@ISMBR(&Current_Year)) AND (&Act_RF_Last_Month == "Nov"));
    "RTM 6" = @SHIFT(&Act_RF_Last_Month,-7);
    ELSE #Missing;
    ENDIF
    IF((@ISMBR(&Current_Year)) AND (&Act_RF_Last_Month == "Dec"));
    "RTM 6" = @SHIFT(&Act_RF_Last_Month,-7);
    ELSE #Missing;
    ENDIF

    We use similar logic but its in a calc script, not a dynamic formula.
    Here's an example of what we do:
    FIX (&Year, &Scenario)
    IF (@ISMBR("JAN"))
    "Average Usage%" = @AVG(SKIPBOTH, @MDSHIFT("Usage %"->"DEC"->&actYear, -1, "Year",), "Util %"->"JAN"->&actYear);
    ELSEIF
    ENDFIX
    Another example in a different database:
    IF (@ISMBR("JAN")) ((@MDSHIFT("Fees"->APR,-1,YEARS,)+@MDSHIFT("Fees"->MAY,-1,YEARS,)+@MDSHIFT("Fees"->JUN,-1,YEARS,)+@MDSHIFT("Fees"->JUL,-1,YEARS,)+@MDSHIFT("Fees"->AUG,-1,YEARS,)+@MDSHIFT("Fees"->SEP,-1,YEARS,)+@MDSHIFT("Fees"->OCT,-1,YEARS,)+@MDSHIFT("Fees"->NOV,-1,YEARS,)+@MDSHIFT("Fees"->DEC,-1,YEARS,)+"Fees"->JAN)/10)*12;

  • Substitution variable in Dynamic Calc

    Hi,
    We are using Essbase 9.3.0 on Windows and are seeing this behavior in our BSO cubes.
    When we use a substitution variable in a Scenario member with Dynamic Calc (not store) setting, after the first retrieve, if we change the value of the substitution variable, the subsequent retrieves do not generate updated results.
    I suspect that the value is cached in the Dynamic Calculator Cache, and for some reason does not track changes in Substitution Variables to know that the value must be re-calculated. Here is what I see in the Application log -
    [Mon Aug 09 10:31:51 2010]Local/App1/db1/user1/Info(1020055)
    Spreadsheet Extractor Elapsed Time : [0.032] seconds
    [Mon Aug 09 10:31:51 2010]Local/App1/db1/user1/Info(1020082)
    Spreadsheet Extractor Big Block Allocs -- Dyn.Calc.Cache : [4] non-Dyn.Calc.Cache : [0]
    This says that 4 blocks were used from the Dynamic Calc Cache, and none from outside it. Does this mean that existing blocks were read and not re-populated?
    If I make a change to the formula, wherein I hard-code the value of the sub var, and perform the retrieve, then the value is updated. Subsequent retrieves, after restoring the formula, still return the updated results.
    My question is, is this expected behavior? Or am I doing something /reading something wrong?
    Thanks,
    Andy

    When a substitution variable value is changed, the application concerned has to be restarted before member formulas or calc scripts pick up the new value (a MaxL sketch for this follows below).
    - Krish
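    For what it's worth, one way to script the restart Krish mentions is via MaxL (a sketch; App1 is the application name taken from the log excerpt above, adjust to your own environment):

    alter system unload application App1;
    alter system load application App1;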

  • Retrieval time with Dynamic Calc

    Hi Experts,
    My users run a large Smart View report which works well.
    However, when they change one of the members on the query to a dynamic calc member (dense),
    the retrieve takes forever and they get a network error from Smart View.
    I am trying to figure out how I can improve the retrieval time on my BSO app, focusing on dynamic calc issues.
    I have tried increasing the data cache, index cache, retrieval buffer and retrieval sort buffer.
    I added the DYNCALCCACHEMAXSIZE setting, and the retrieve is not getting better.
    In addition, can you please help me understand the app log file?
    Here's what happened after I ran the report; maybe it will lead
    to the problem:
    [Mon Aug 06 02:41:03 2012]Local/Orac///1768/Info(1200481)
    Formula for member [YTD USD Display] will be executed in [TOPDOWN and CELL] mode
    [Mon Aug 06 02:41:03 2012]Local/Orac///1768/Info(1012710)
    Essbase needs to retrieve [1] Essbase Kernel blocks in order to calculate the top dynamically-calculated block.
    [Mon Aug 06 02:41:03 2012]Local/Orac///1768/Info(1012736)
    The Dyn.Calc.Cache for database [Main] can hold a maximum of [242] blocks.
    [Mon Aug 06 02:41:03 2012]Local/Orac///1768/Info(1012737)
    The Dyn.Calc.Cache for database [Main], when full, will result in [allocation from non-Dyn.Calc.Cache memory].
    [Mon Aug 06 02:41:03 2012]Local/Orac///1768/Info(1019018)
    Writing Parameters For Database [Main]
    [Mon Aug 06 02:41:03 2012]Local/Orac///1768/Info(1019017)
    Reading Parameters For Database [Main]
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1070013)
    Index cache size ==> [307200000] bytes, [37500] index pages.
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1070014)
    Index page size ==> [8192] bytes.
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1070081)
    Using buffered I/O for the index and data files.
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1070083)
    Using waited I/O for the index and data files.
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1019019)
    Reading Data File Free Space Information For Database [Main]...
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1006025)
    Data cache size ==> [512000000] bytes, [9359] data pages
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1006026)
    Data file cache size ==> [0] bytes, [0] data file pages
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1080053)
    Free space recovery skipped. Estimated free space recoverable by RecoverDbFreeSpace: [11068706640] bytes
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1024033)
    Missing Database Config File [E:\Hyperion\AnalyticServices\APP\Orac\Main\Main.cfg], Query logging disabled
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1203135)
    Starting the Data Mining Framework
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1203136)
    Data Mining Framework successfully initialized.
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1200551)
    Allocated TRIGMAXMEMSIZE: [4096] Bytes.
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1013205)
    Received Command [Get Database Volumes]
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1013205)
    Received Command [Set Database State]
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1019018)
    Writing Parameters For Database [Main]
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1019018)
    Writing Parameters For Database [Main]
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1013205)
    Received Command [Get Database State]
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1013205)
    Received Command [Get Database Info]
    [Mon Aug 06 02:41:05 2012]Local/Orac///1768/Info(1013205)
    Received Command [SetApplicationState]
    [Mon Aug 06 02:41:06 2012]Local/Orac///1768/Info(1019010)
    Writing Application Definition For [Orac]
    [Mon Aug 06 02:41:06 2012]Local/Orac///1768/Info(1019011)
    Writing Database Definition For [Main]
    [Mon Aug 06 02:41:06 2012]Local/Orac///1768/Info(1019022)
    Writing Database Mapping For [Orac]
    [Mon Aug 06 02:41:06 2012]Local/Orac///5132/Info(1013210)
    User [essadmin] set active on database [Main]
    Please help me find out how I can overcome this.
    Many thanks,

    I'll add on to Dan's comment that you very much need to test the dynamic calc (if indeed that is the problem) in the proper context.
    Amusing but painful story:
    I was once tasked with reducing calc time on a BSO database. I saw long ugly stored member calcs. Hah! I cleverly converted them to dynamic and calc times dropped. No one told me (and I didn't ask) how they were to be reported. There were thousands of instances of this member on the Excel sheet. What had been a nice fast retrieve now took five minutes. Whoops.
    Take your pain wherever you like -- if you are reading multiple blocks to do the calc (my guess) nothing on earth is going to make it fast. Why make it dynamic in the first place? How many times is the member retrieved? Is it all in the block or, as I guessed, across multiple blocks?
    Last comment -- again (this is getting boring) I agree with Dan -- caches are not, usually, the source of any performance joy and it always comes down to design.
    Regards,
    Cameron Lackpour
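    For reference, the DYNCALCCACHEMAXSIZE command the original poster mentions is an essbase.cfg setting; the log above shows the dynamic calculator cache for [Main] holding a maximum of [242] blocks. A sketch of the syntax (the 500M value is purely illustrative, and, echoing the comments above, cache tuning rarely fixes a design problem):

    DYNCALCCACHEMAXSIZE 500M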

  • Problem with a calc involving @MDSHIFT and Dynamic Calc

    Hi all,
    I have a problem with the calculation of a member in a calc script. The formula of this member is:
    "R70100"
    IF(@ISMBR("M01"))
    ("T_008"->"Cumul"->"HT"+"T_003"->"Cumul"->"HT")*"AVCT_PR" - @MDSHIFT("R70100" -> "Cumul" -> "HT" -> "M12", -1, "Year", );
    ELSE
    ("T_008"->"Cumul"->"HT"+"T_003"->"Cumul"->"HT")*"AVCT_PR" - @MDSHIFT("R70100" -> "Cumul" -> "HT", -1, "Period", );
    ENDIF
    - R70100 is a member of a dense dimension.
    - T_008, T_003 and AVCT_PR are sparse and stored.
    - "Cumul" is a dynamic member which allows us to calculate cumulative values from monthly values.
    - Time is split into two dimensions, Period and Year.
    The problem is that the @MDSHIFT doesn't seem to work with the dynamic calc: the monthly result comes back as the cumulative value. I don't know what is wrong, BUT each time I launch the script, the values are good for one more month.
    After one run, for example, I obtained this result:
    !http://zenon.apartia.fr/stuff/200910070001.GIF!
    Two runs later, here are the new values:
    !http://zenon.apartia.fr/stuff/200910070002.GIF!
    Any idea what's going on and how to correct it? Thanks!
    Frédéric

    A couple of questions for you:
    1) Do you need the overhead of @MDSHIFT when you are only moving in one dimension? Wouldn't @PRIOR have been a lot easier? At least this is true for the ELSE condition (a rough sketch follows below this reply).
    2) Could you post just the results of the @MDSHIFT calculation so we (okay, this may be just for me, it's still breakfast time here and not enough coffee has been ingested yet to read a spreadsheet without recourse to the formulas) can better see the impact?
    3) Have you tried using your code against a non-dynamic member? Does it make a difference?
    Regards,
    Cameron Lackpour
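    A rough sketch of suggestion 1), with @PRIOR swapped into the ELSE branch only. This assumes Period is the dimension tagged Time, so @PRIOR's default range (level 0 of Period) applies; the M01 branch keeps @MDSHIFT because it crosses into the Year dimension:

    "R70100"
    IF(@ISMBR("M01"))
    ("T_008"->"Cumul"->"HT"+"T_003"->"Cumul"->"HT")*"AVCT_PR" - @MDSHIFT("R70100" -> "Cumul" -> "HT" -> "M12", -1, "Year", );
    ELSE
    ("T_008"->"Cumul"->"HT"+"T_003"->"Cumul"->"HT")*"AVCT_PR" - @PRIOR("R70100" -> "Cumul" -> "HT");
    ENDIF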

  • Sparse Dynamic Calc Member

    All,
    Is it better to have a sparse stored member calculated in a calc script/rule, or a sparse dynamic calc member with a formula?
    Thank You

    Here's a little background on why sparse dynamic calcs can sometimes (but not always) be bad for retrieval performance:
    - Essbase stores data in one of two types of files - the Index file and the Page file. (there can be many of each type of file, depending on how much data is stored in your cube)
    - The index file stores all of the combinations of sparse members (where there is data). Think of each sparse dimension as a column in the index file.
    - The page file contains "blocks" that are made up of your dense dimension members. Each "cell" in a block is 8 bytes. (a cell being a single combination of members from each dense dimension)
    - Each row in your index file points to one and only one block. (this is why you need one member from every dimension to get to a data value)
    So let's say we have three members in one of our sparse dimensions. For the sake of simplicity, let's say this is the only sparse dimension in the cube.
    - Member "A" is stored.
    - Member "B" is stored.
    - Member "C" is a dynamic calc equal to "A" - "B".
    Question - if you retrieve on member "C", how many blocks does Essbase have to pull into memory? 2 (the blocks for A and B), plus it has to create a block in memory for member "C". (at least I think it creates a block in memory . . . . ) Either way, that's a lot of I/O, relatively speaking.
    If "C" were calculated in batch and stored, the same retrievel would only have to pull a single block into memory - the stored block for C. This is less I/O.
    So why are some dynamic sparse calcs bad and others aren't? It all comes down to how many blocks the retrieval forces Essbase to pull into memory. (FYI this "memory" is really your database caches)
    This is why sparse dynamic calcs like variance scenarios are common. A sparse dynamic calc that compares Actual to Budget doesn't pull a lot of blocks into memory. However, if you set a sparse parent to be dynamic, and it had 100 children, this would be pretty bad for performance.
    If you can visualize this concept, you can performance tune a BSO cube. It's all about I/O.
    Hope this helps,
    - Jake
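    To put rough numbers on Jake's 8-bytes-per-cell point, here is a purely hypothetical example (not from the post): with two dense dimensions of 12 periods and 500 accounts, each block holds 12 * 500 = 6,000 cells, or 6,000 * 8 bytes = 48,000 bytes (about 47 KB uncompressed). Retrieving the dynamic member "C" then means reading the "A" and "B" blocks, roughly 94 KB of block I/O, versus roughly 47 KB if "C" were stored in its own block - the difference Jake describes.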

  • Viewing the results of a "Data Retrieval Rule"

    Viewing the results of a "Data Retrieval Rule"
    RC2008.3 SANDBOX
    ORACLE 10g
    Websphere 6.1
    IE6 & 7
    Hello:
    Does anyone know if there is a log available that will show the SQL that is sent to the server during a DataRetrievalRule?
    I had my rule working at one time, and now my values are returning blank.
    So I am hoping that there is a log somewhere so that I can see what is going on.
    thank you
    Daniel
    Safeway Inc.

    The idea here is to not drag and drop a datastore as a target, but to create this datastore directly in the interface. You can then use this interface as source of another interface.
    here is the doc... Let me know if you need more help.
    Target Datastore
    An interface can have only one target datastore. There are two possibilities for this datastore:
    A permanent datastore, corresponding to a datastore that already exists in a model.
    A temporary datastore, that does not yet exist in a model, and which will thus be created dynamically by the interface, in either the work or data schema of the logical schema of the Staging Area specified on the Definition tab of the interface.
    The target datastore, with the mapping for each column, is displayed on the right of the Diagram tab.
    To edit the properties of the target datastore, click on the datastore title. The properties panel will appear at the bottom of the window.

  • Loading accounts as Dynamic Calc

    Hi,
    Has anybody tried to load different data storage values for accounts through ODI? We are having an issue with this in that we are specifying an X in the Data Storage column (for Dynamic Calc) but the values do not show up as Dynamic Calc in the outline. Has anybody had any similar issues and managed to solve them?
    Thanks in advance for any help.

    Hi,
    There's nothing to stop you tagging every root member of a dimension as dynamic calc. You just need to consider the usual balance between database size and retrieval times. If you're thinking of tagging all of the descendants as dynamic too then you could run into memory/lockblock issues depending on how big the db is.
    I'm not sure whether there's a performance overhead on a db with duplicates enabled. I don't think there is.
    Gee

  • Error: 1012704 Dynamic Calc processor cannot lock more than [25] ESM blocks

    Dear All,
    I get the following error in the Essbase console when I try to execute any calc script.
    Error: 1012704 Dynamic Calc processor cannot lock more than [25] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry(a small data cache setting could also cause this problem, please check the data cache size setting)
    Please find below the detailed statistics of my Planning application's database and outline.
    please help guys........
    GetDbStats:
    -------Statistics of AWRGPLAN:Plan1 -------
    Dimension Name Type Declared Size Actual Size
    ===================================================================
    HSP_Rates SPARSE 11 11
    Account DENSE 602 420
    Period DENSE 19 19
    Year SPARSE 31 31
    Scenario SPARSE 6 6
    Version SPARSE 4 4
    Currency SPARSE 10 10
    Entity SPARSE 28 18
    Departments SPARSE 165 119
    ICP SPARSE 80 74
    LoB SPARSE 396 344
    Locations SPARSE 57 35
    View SPARSE 5 5
    Number of dimensions : 13
    Declared Block Size : 11438
    Actual Block Size : 7980
    Declared Maximum Blocks : 3.41379650304E+015
    Actual Maximum Blocks : 1.87262635317E+015
    Number of Non Missing Leaf Blocks : 10664
    Number of Non Missing Non Leaf Blocks : 2326
    Number of Total Blocks : 12990
    Index Type : B+ TREE
    Average Block Density : 0.01503759
    Average Sparse Density : 6.936782E-010
    Block Compression Ratio : 0.001449493
    Average Clustering Ratio : 0.3333527
    Average Fragmentation Quotient : 19.3336
    Free Space Recovery is Needed : No
    Estimated Bytes of Recoverable Free Space : 0
    GetDbInfo:
    ----- Database Information -----
    Name : Plan1
    Application Name : AWRGPLAN
    Database Type : NORMAL
    Status : Loaded
    Elapsed Db Time : 00:00:05:00
    Users Connected : 2
    Blocks Locked : 0
    Dimensions : 13
    Data Status : Data has been modified
    since last calculation.
    Data File Cache Size Setting : 0
    Current Data File Cache Size : 0
    Data Cache Size Setting : 3128160
    Current Data Cache Size : 3128160
    Index Cache Size Setting : 1048576
    Current Index Cache Size : 1048576
    Index Page Size Setting : 8192
    Current Index Page Size : 8192
    Cache Memory Locking : Disabled
    Database State : Read-write
    Data Compression on Disk : Yes
    Data Compression Type : BitMap Compression
    Retrieval Buffer Size (in K) : 10
    Retrieval Sort Buffer Size (in K) : 10
    Isolation Level : Uncommitted Access
    Pre Image Access : No
    Time Out : Never
    Number of blocks modified before internal commit : 3000
    Number of rows to data load before internal commit : 0
    Number of disk volume definitions : 0
    Currency Info
    Currency Country Dimension Member : Entity
    Currency Time Dimension Member : Period
    Currency Category Dimension Member : Account
    Currency Type Dimension Member :
    Currency Partition Member :
    Request Info
    Request Type : Data Load
    User Name : admin@Native Directory
    Start Time : Mon Aug 15 18:35:51 2011
    End Time : Mon Aug 15 18:35:51 2011
    Request Type : Customized Calculation
    User Name : 6236@Native Directory
    Start Time : Tue Aug 16 09:44:10 2011
    End Time : Tue Aug 16 09:44:12 2011
    Request Type : Outline Update
    User Name : admin@Native Directory
    Start Time : Tue Aug 16 10:50:02 2011
    End Time : Tue Aug 16 10:50:02 2011
    ListFiles:
    File Type
    Valid Choices: 1) Index 2) Data 3) Index|Data
    >>Currently>> 3) Index|Data
    Application Name: AWRGPLAN
    Database Name: Plan1
    ----- Index File Information -----
    Index File Count: 1
    File 1:
    File Name: C:\Oracle\Middleware\user_projects\epmsystem1\EssbaseServer\essbaseserver1\APP\AWRGPLAN\Plan1\ess00001.ind
    File Type: INDEX
    File Number: 1 of 1
    File Size: 8,024 KB (8,216,576 bytes)
    File Opened: Y
    Index File Size Total: 8,024 KB (8,216,576 bytes)
    ----- Data File Information -----
    Data File Count: 1
    File 1:
    File Name: C:\Oracle\Middleware\user_projects\epmsystem1\EssbaseServer\essbaseserver1\APP\AWRGPLAN\Plan1\ess00001.pag
    File Type: DATA
    File Number: 1 of 1
    File Size: 1,397 KB (1,430,086 bytes)
    File Opened: Y
    Data File Size Total: 1,397 KB (1,430,086 bytes)
    File Size Grand Total: 9,421 KB (9,646,662 bytes)
    GetAppInfo:
    -------Application Info-------
    Name : AWRGPLAN
    Server Name : GITSHYPT01:1423
    App type : Non-unicode mode
    Application Locale : English_UnitedStates.Latin1@Binary
    Status : Loaded
    Elapsed App Time : 00:00:05:24
    Users Connected : 2
    Data Storage Type : Multidimensional Data Storage
    Number of DBs : 3
    List of Databases
    Database (0) : Plan1
    Database (1) : Plan2
    Database (2) : Plan3

    ESM Block Issue
    Cheers..!!

  • Dynamic Calc processor cannot lock more than [100] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry(a small data cache setting could also cause this problem, please check the data cache size setting).

    Hi,
    Our environment is Essbase 11.1.2.2, and we are working with the Essbase, EAS and Shared Services components. One of our users tried to run the calc script of one application and faced this error.
    Dynamic Calc processor cannot lock more than [100] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry(a small data cache setting could also cause this problem, please check the data cache size setting).
    I did some googling and found that we need to add something to the essbase.cfg file, like below.
    1012704 Dynamic Calc processor cannot lock more than number ESM blocks during the calculation, please increase CalcLockBlock setting and then retry (a small data cache setting could also cause this problem, please check the data cache size setting).
    Possible Problems
    Analytic Services could not lock enough blocks to perform the calculation.
    Possible Solutions
    Increase the number of blocks that Analytic Services can allocate for a calculation:
    1. Set the maximum number of blocks that Analytic Services can allocate to at least 500.
       - If you do not have an $ARBORPATH/bin/essbase.cfg file on the server computer, create one using a text editor.
       - In the essbase.cfg file on the server computer, set CALCLOCKBLOCKHIGH to 500.
       - Stop and restart Analytic Server.
       - Add the SET LOCKBLOCK HIGH command to the beginning of the calculation script.
    2. Set the data cache large enough to hold all the blocks specified in the CALCLOCKBLOCKHIGH setting.
       - Determine the block size.
       - Set the data cache size.
    Actually, in our server config file (essbase.cfg) we don't have the data below added:
    CalcLockBlockHigh 2000
    CalcLockBlockDefault 200
    CalcLockBlocklow 50
    So my doubt is: if we edit the essbase.cfg file, add the above settings, and restart the services, will it work? And if so, why should we change the server config file if the problem is with one application's calc script? Please guide me on how to proceed.
    Regards,
    Naveen

    Your calculation needs to hold more blocks in memory than your current set up allows.
    From the docs (quoting so I don't have to write it, not to be a smarta***):
    CALCLOCKBLOCK specifies the number of blocks that can be fixed at each level of the SET LOCKBLOCK HIGH | DEFAULT | LOW calculation script command.
    When a block is calculated, Essbase fixes (gets addressability to) the block along with the blocks containing its children. Essbase calculates the block and then releases it along with the blocks containing its children. By default, Essbase allows up to 100 blocks to be fixed concurrently when calculating a block. This is sufficient for most database calculations. However, you may want to set a number higher than 100 if you are consolidating very large numbers of children in a formula calculation. This ensures that Essbase can fix all the required blocks when calculating a data block and that performance will not be impaired.
    Example
    If the essbase.cfg file contains the following settings:
    CALCLOCKBLOCKHIGH 500
    CALCLOCKBLOCKDEFAULT 200
    CALCLOCKBLOCKLOW 50
    then you can use the following SET LOCKBLOCK setting commands in a calculation script:
    SET LOCKBLOCK HIGH; 
    means that Essbase can fix up to 500 data blocks when calculating one block.
    Support doc is saying to change your config file so those settings can be made available for any calc script to use.
    On a side note, if this was working previously and now isn't then it is worth investigating if this is simply due to standard growth or a recent change that has made an unexpected significant impact.
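    Putting the doc excerpt together, a minimal sketch of both pieces (the values are the ones from the quoted example, not a recommendation for this particular database):

    essbase.cfg (server-wide; Essbase must be restarted after editing):
    CALCLOCKBLOCKHIGH 500
    CALCLOCKBLOCKDEFAULT 200
    CALCLOCKBLOCKLOW 50

    At the top of the calc script that hits the error:
    SET LOCKBLOCK HIGH;
    /* ...the rest of the existing calculation... */

    As the error text says, the data cache must also be able to hold that many blocks; with a hypothetical 64 KB uncompressed block, 500 blocks would need roughly 32 MB of data cache.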

  • Dynamic Calc, 2 Pass with Attributes!

    Hi there, we have an Occupancy KPI within our Measures dimension that works off the following member calc:
    (("Agency"/"Occupancy")/"Actual"->"DayInMonth"->"DummyHome")*7;
    This is set with <Dynamic Calc> <Two-Pass>. This works great across our cost centres until we start to use attributes within the Add-In. Then the figures are shown as #Missing - nuts! Any ideas would be welcome. Many thanks in advance, Mark

    Are you also trying to pull dynamic time series or any other dynamically calculated member? If every other intersection is stored, I would expect this to work. However, if you attempt to pull, say Q-T-D(&CurrWk), then a #Mi result is quite possible, and unavoidable (depending on your outline).

  • Problem with Dynamic Calc members in calc script

    Hi
    I wrote a calc script which calculates the value of acct2, which is dependent on the value of acct1, which is in turn calculated from acct0, as below:
    FIX(M1)
    acct1(
    acct1=acct0->M2;
    acct2=acct1->M2;
    )
    ENDFIX
    The value of the intersection M1->acct0 is 100 as input. What I wanted from this calc script is for acct2->M1 to be 100 after executing the script once,
    but in fact I have to run it twice to get the expected result.
    the outline is like this:
    M:
    M2 Dynamic calc M2=M0+M1
    M0 Stored
    m1 stored
    Account:
    acct0 stored
    acct1 stored
    acct2 stored
    Can anyone tell me the reason and how to solve it?
    Edited by: user10450070 on 2011-1-25, 1:38 AM

    Is acct1 correct after one run of the calculation, but acct2 incorrect? If so, it's because Essbase is calculating them both at the same time (parallel), whereas it has to be forced to calculate acct1 first and acct2 second. You can do this with separate fixes as below:
    FIX(M1)
    acct1=acct0->M2;
    ENDFIX
    FIX(M1)
    acct2=acct1->M2;
    ENDFIX
    Or, you can force it to calculate in serial mode:
    SET CALCPARALLEL 0;
    FIX(M1)
    acct1=acct0->M2;
    acct2=acct1->M2;
    ENDFIX
    Sabrina
    Edited by: SabrinaD on Jan 26, 2011 7:56 AM
