Sparse Member rollup

Hello,
I have 9 dimensions, of which 3 are dense and 6 are sparse. Date is a sparse dimension and Measure is dense.
My daily data is rolled up to the weekly level in an Essbase calculation. For a few members, the weekly rollup should be the rollup of one member divided by the rollup of another. For example, take 3 measures X, Y and Z, where Z = X/Y; the weekly rollup of Z should be the weekly rollup of X divided by the weekly rollup of Y.
But since my Date dimension is sparse and the formula is on a dense member, I am not able to achieve this. The weekly members are simply aggregated as-is, not as per the formula.
Can anyone suggest how to achieve this?
Regards
R.Prasanna

If your problem is already solved, ignore this. If not, here is my reply as I understand the situation (it was working correctly before, when the Time dimension was dense).
Are you calculating a cross-dimensional formula on top of dense and sparse members that results in a dense member? If yes, your problem is here. When a formula on a cross-dimensional combination of sparse and dense members stores its result in a dense member, Essbase will not create a data block for that member combination. Store the result in a sparse member instead. This is a frequent error when working with the Time dimension.
It was fine before because everything was dense and the blocks were already created. Now the block is not there, and it needs to be created, but it is not created because you are saving the result in a dense Measures member.
It is always best to use FIX...ENDFIX, both for performance optimization and to avoid or reduce this kind of error.
Let me know if this helps.
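A minimal calc script sketch of one way to do this, assuming X, Y and Z are dense Measures members (Z = X/Y) and the weekly members are the level-1 parents in the sparse Date dimension; the dimension name "Date" and the level-based selection are illustrative assumptions, not taken from the post:

/* roll the daily data up through the sparse Date dimension as usual */
CALC DIM ("Date");

/* re-derive the ratio at the weekly (level-1) Date members, overwriting the aggregated Z */
FIX (@LEVMBRS("Date", 1))
    "Z" (
        Z = X / Y;
    )
ENDFIX

Because the weekly blocks already exist after the CALC DIM, the dense assignment has blocks to write into; if they did not exist, the reply's point about storing the result via a sparse member (or creating the blocks first with DATACOPY) would apply.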

Similar Messages

  • Deleting a Sparse member would clear respective Blocks

    Hi Guru,
    I have to delete historical data for the past 4 years. To delete the data and also clear the blocks associated with the member to be deleted, one option is to use CLEARBLOCK in a calc script. However, we are thinking of an easier method. Can we simply delete the Year member from the database? Since Year is a sparse dimension, would deleting a Year member (e.g. FY04) also clear all data, including the blocks created with reference to FY04? Is our perception correct?
    Thanks in advance for your suggestions..
    Regards,

    Pragati Khare wrote: (question quoted above)
    If you delete the member in the outline, the data for all combinations of that member will be cleared.
    Regards,
    Prabhas
    Edited by: P on Apr 1, 2011 9:11 AM
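    For reference, a minimal calc script sketch of the CLEARBLOCK option mentioned in the question, using FY04 (the example year member from the post) and assuming Year is sparse as stated:

    FIX ("FY04")
        CLEARBLOCK ALL;
    ENDFIX

    As the reply notes, simply deleting FY04 from the outline also removes the data for that combination when the database restructures.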

  • Addition of Sparse Stored member??

    Hello,
    Recently we added a sparse stored member to our outline, and immediately all our calc scripts started taking double the time to execute. Earlier our sparse dimension sizes were (6x9x93x290x1433x8x43x4x3); now, with one member added to the first dimension, they are (7x9x93x290x1433x8x43x4x3). The dense dimensions remain the same (863x16). I noticed a difference in the number of blocks before and after adding the sparse member.
    I assumed that adding a sparse stored member increases the number of blocks, so it takes more time to search for the new blocks and a little more time to execute the calc scripts. But I'm surprised that adding only 1 member doubles the time for all calc scripts.
    Can you please let me know what might have gone wrong to cause the increase in calc time? Are there any other parameters I should check to explain the high calc time after adding a sparse member?
    I'm on EAS 11.1.1.3.
    Appreciate your help.
    thanks,

    If you've doubled the number of blocks, especially in a dimension that you cannot FIX around, e.g., FIX on Budget only instead of all level zero Scenarios, I don't think there is a magic bullet that will make your performance go back to where it was -- you have twice the data.
    I still am having a hard time believing one or two or three (which is it, btw?) new members in a sparse dimension like Product really would have that much of an effect. If it were a new Scenario, and you copied Actual into it and multiplied it by 1.05, then yes, I can easily see a doubling of blocks because the data in that sparse dimension (Scenario) is really very dense. But for Product, which is inherently (okay, usually inherently) sparse in its data distribution? That doesn't really make sense.
    There are a couple of tests you can make:
    -- For giggles, make a copy of the database and remove all of the other members in the Product dimension, then save the outline. Then force a restructure (which I would suggest on a daily basis for almost every BSO db in the world). How big are the .PAG files? How many blocks?
    -- You should also be able to look at whatever input you have for these new members and (hoping that you have it in SQL) do a select on the sparse members that define the blocks -- that should also give you an accurate count of the new sparse blocks, at least at level zero.
    Regards,
    Cameron Lackpour

  • Why open the Block only with a Dense dimension Member?

    Hi All,
    I want to understand the logic behind opening the block with a dense member. Generally we open the block with a member of a dense dimension and put the sparse members in the FIX.
    E.g., in the example below, C1, P1, Working and Actual are members of the sparse dimensions Customer, Product, Version and Scenario.
    FIX(C1,P1,Working,Actual)
    "Jan" (
    IF (<some condition>)
    <calculation on Account member>
    ENDIF
    )
    ENDFIX
    Now suppose I change the code as below, putting Jan in the FIX and opening the block with Actual. How will it affect performance? The number of iterations over the total number of data cells will be the same in both cases, so why give priority to a dense dimension member for opening the block?
    FIX(C1,P1,Working,Jan)
    "Actual" (
    IF (<some condition>)
    <calculation on Account member>
    ENDIF
    )
    ENDFIX
    I know the logic behind keeping sparse dimension members in the FIX, but I think using the 'Actual' member to open the block is also a type of fix which tells the code to do the calculation only for blocks of 'Actual'.
    In the 1st case I have fixed the blocks containing members C1, P1, Working and Actual, and using Jan to open the block means Jan is also fixed, so the code will not run for Feb, Mar, etc.
    That means the code runs for C1->P1->Working->Actual->Jan.
    In the 2nd case I have fixed C1, P1, Working and Jan, and using Actual to open the block means Actual is also fixed, so the code will not run for other members of the Scenario dimension.
    That means the code runs for C1->P1->Working->Jan->Actual.
    So what is the difference? Why does it affect performance?

    when using an IF statement in a calc script you have to specify a member to associate it with for the calc member block. This is like temporarily assigning it to that member as a formula in the outline. In most cases calculations on dense dimensions can work faster as the block is in memory. Remember that a block statement can have multiple calculation statements in it. When this occurs, having the calculations on a dense member can speed it up as it may not have to swap blocks to do the calculation.
    Also consider that if a sparse combination of members does not exist, the block will not exist and there will be no attempt to do the calculations. If a sparse member is on the calc member block, it will cycle through all of the sparse members looking to do the calculation.
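    A short sketch of the form described above - sparse members in the FIX, a dense member anchoring the calc member block so that several statements run against the block while it is in memory. The member names reuse the question's example; Sales, COGS, Margin and "Margin %" are illustrative dense Accounts members, not taken from the thread:

    FIX (C1, P1, Working, Actual)
        "Jan" (
            IF (Sales > 0)
                /* multiple statements execute against the same block */
                Margin = Sales - COGS;
                "Margin %" = Margin / Sales * 100;
            ENDIF
        )
    ENDFIX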

  • FIX STATEMENT AND LARGE SPARSE DIMENSIONS

    Hello all,
    We have the following Essbase BSO db:
    Account (Dense) (285 Members) (Aggregating Dimension)
    Period (Dense) (65 Members) (Aggregating Dimension)
    D1 (Sparse) (3700 Members) (Aggregating Dimension)
    D2 (Sparse) (8900 Members) (Aggregating Dimension)
    D3 (Sparse) (15000 Members) (Aggregating Dimension)
    Version (Sparse) (3 Members) (NON-Aggregating Dimension)
    Scenario (Sparse) (5 Members) (NON-Aggregating Dimension)
    Year (Sparse) (3 Members) (NON-Aggregating Dimension)
    Currency (Sparse) (11 Members) (NON-Aggregating Dimension)
    D4 (Sparse) (20 Members) (NON-Aggregating Dimension)
    Block Size = ~150KB
    Index Cache = 4GB
    Data Cache = 8GB
    CPUs = 8
    Memory Free = 26GB
    NOTE: We are executing the calc with data loaded for JUST 10 blocks, so a very, very small volume of data!
    We have come across a rather irritating and strange issue while executing the following calc:
    CASE 1: With level-0 of relevant sparse dimensions in the FIX statement
    //ESS_LOCALE English_UnitedStates.Latin1@Binary
    SET MSG ERROR;
    SET CACHE HIGH;
    SET UPDATECALC OFF;
    SET LOCKBLOCK HIGH;
    SET AGGMISSG OFF;
    SET CALCPARALLEL 4;
    SET CREATENONMISSINGBLK OFF;
    FIX(ACTUAL,"2013",@LEVMBRS(D1,0),@LEVMBRS(D2,0),@LEVMBRS(D3,0),@LEVMBRS(D4,0),
    @RELATIVE("EBITDA",0))
    DATACOPY ARS->CURRENCY_VERSION TO ARS->WORKING;
    DATACOPY CAD->CURRENCY_VERSION TO CAD->WORKING;
    DATACOPY CHF->CURRENCY_VERSION TO CHF->WORKING;
    DATACOPY COP->CURRENCY_VERSION TO COP->WORKING;
    DATACOPY EUR->CURRENCY_VERSION TO EUR->WORKING;
    DATACOPY GBP->CURRENCY_VERSION TO GBP->WORKING;
    DATACOPY MXN->CURRENCY_VERSION TO MXN->WORKING;
    DATACOPY CNY->CURRENCY_VERSION TO CNY->WORKING;
    ENDFIX
    NOTE1: The above FIX statement works just fine and executes in 1 sec as it's just a DATACOPY. Essentially I have created the blocks on which I want to perform the sparse calculation below.
    FIX(ACTUAL,"2013",@LEVMBRS(D1,0),@LEVMBRS(D2,0),@LEVMBRS(D3,0),@LEVMBRS(D4,0),@RELATIVE("EBITDA",0))
    ARS = ARS->CURRENCY_VERSION * "ARS_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    CAD = ARS->CURRENCY_VERSION * "CAD_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    CHF = ARS->CURRENCY_VERSION * "CHF_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    COP = ARS->CURRENCY_VERSION * "COP_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    EUR = ARS->CURRENCY_VERSION * "EUR_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    GBP = ARS->CURRENCY_VERSION * "COP_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    MXN = ARS->CURRENCY_VERSION * "MXN_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    CNY = ARS->CURRENCY_VERSION * "CNY_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    ENDFIX
    NOTE2: The above FIX statement is where we have a problem - THE CALC JUST HANGS AND DOES NOT CALCULATE.
    CASE 2: With "specific" members of relevant sparse dimensions in the FIX statement
    //ESS_LOCALE English_UnitedStates.Latin1@Binary
    SET MSG ERROR;
    SET CACHE HIGH;
    SET UPDATECALC OFF;
    SET LOCKBLOCK HIGH;
    SET AGGMISSG OFF;
    SET CALCPARALLEL 4;
    SET CREATENONMISSINGBLK OFF;
    FIX(ACTUAL,"2013",W1,"2251026",MORSCREJWHITE,MORSCREJWHITE_S,KG,TRADESALES)
    DATACOPY ARS->CURRENCY_VERSION TO ARS->WORKING;
    DATACOPY CAD->CURRENCY_VERSION TO CAD->WORKING;
    DATACOPY CHF->CURRENCY_VERSION TO CHF->WORKING;
    DATACOPY COP->CURRENCY_VERSION TO COP->WORKING;
    DATACOPY EUR->CURRENCY_VERSION TO EUR->WORKING;
    DATACOPY GBP->CURRENCY_VERSION TO GBP->WORKING;
    DATACOPY MXN->CURRENCY_VERSION TO MXN->WORKING;
    DATACOPY CNY->CURRENCY_VERSION TO CNY->WORKING;
    ENDFIX
    NOTE3: The above FIX statement works just fine and executes in 1 sec as it's just a DATACOPY. Essentially I have created the blocks on which I want to perform the sparse calculation below.
    FIX(ACTUAL,"2013","2251026",MORSCREJWHITE,MORSCREJWHITE_S,KG,WORKING)
    ARS = ARS->CURRENCY_VERSION * "ARS_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    CAD = CAD->CURRENCY_VERSION * "CAD_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    CHF = CHF->CURRENCY_VERSION * "CHF_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    COP = COP->CURRENCY_VERSION * "COP_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    EUR = EUR->CURRENCY_VERSION * "EUR_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    GBP = GBP->CURRENCY_VERSION * "GBP_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    MXN = MXN->CURRENCY_VERSION * "MXN_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    CNY = CNY->CURRENCY_VERSION * "CNY_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
    ENDFIX
    NOTE4: The above FIX statement works just fine, as I am fixing on just ONE member from each relevant dimension in the FIX statement.
    Please note the only difference between NOTE 2 and NOTE 4 is the FIX statements, i.e. fixing on all required level-0 members from the required dimensions as opposed to fixing on just single level-0 members from those dimensions.
    Also please note that the NOTE 1 FIX statement works just fine! So there is no problem with the FIX statement itself, but rather with the combination of sparse calculations and fixing on all level-0 members from the required dimensions in the FIX. Probably because the sparse dimensions are VERY LARGE.
    Could anyone shed some light on what might be wrong here?
    We are on Essbase standalone V11.1.2.
    Your inputs are very much appreciated!
    Thanks

    Hi,
    One minor thing I notice - your NOTE 2 problem script does not fix on your Working version, whereas the NOTE 4 script does.
    With your small number of blocks, this should not be the problem, but it's probably worth quickly testing and eliminating before delving deeper.
    Your script could be invoking member formulae from the Account dimension - does your outline validate ok..?
    I cannot tell from your NOTE 4 script which member relates to the Account dimension - is it 2251026?
    Whichever it is, it is worth expanding NOTE 4 script up to @RELATIVE("EBITDA",0)....
    - if it runs, you know the problem is with your sparse member selections (ie number of blocks being calculated).
    - If it doesn't run, then you know there is a problem in calculating one or more of the Accounts under EBITDA. In which case, gradually narrow the range of Accounts to locate the Account or Accounts that trigger the issue. Chances are there's a problem with the member formula.
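    A sketch of the suggested test: take the NOTE 4 FIX and widen only the Account selection to @RELATIVE("EBITDA",0). This assumes "2251026" is the Account member (which the reply itself flags as an open question); all other member names come from the original script:

    FIX (ACTUAL,"2013",MORSCREJWHITE,MORSCREJWHITE_S,KG,WORKING,@RELATIVE("EBITDA",0))
        ARS = ARS->CURRENCY_VERSION * "ARS_RATE"->"NO_CURRENCY"->"C_NONE"->"L_NONE"->"S_NONE"->"U_NONE"->WORKING;
        /* ... remaining currency assignments exactly as in NOTE 4 ... */
    ENDFIX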

  • Populated dense dimension member failed?

    Hi,
    I have a Scenario dimension which is dense, with the members Actual and Budget, and I would like to copy Actual to Budget.
    Actual is already populated for months M1, M2, M3:
    ACTUAL BUDGET
    M1 100
    M2 200
    M3 300
    I wrote a HBR like this :
    Fix(M1:M3,other sparse dimension member)
    Budget = Actual
    ENDFIX
    but only M3 has been populated
    ACTUAL BUDGET
    M1 100
    M2 200
    M3 300 300
    I don't understand why, because the data blocks already exist for M1, M2, M3.
    I tried SET CREATENONMISSINGBLK ON but it didn't change anything.
    So I changed my script with a datacopy like this :
    Fix(M1:M3,other sparse dimension member)
    Datacopy Actual to Budget;
    ENDFIX
    And now it works:
    ACTUAL BUDGET
    M1 100 100
    M2 200 200
    M3 300 300
    But I don't understand why my first script failed?
    Regards
    Benjamin

    when using an IF statement in a calc script you have to specify a member to associate it with for the calc member block. This is like temporarily assigning it to that member as a formula in the outline. In most cases calculations on dense dimensions can work faster as the block is in memory. Remember that a block statement can have multiple calculation statements in it. When this occurs, having the calculations on a dense member can speed it up as it may not have to swap blocks to do the calculation.
    Also consider that if a sparse combination of members does not exist, the block will not exist and there will be no attempt to do the calculations. If a sparse member is on the calc member block, it will cycle through all of the sparse members looking to do the calculation.
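    Purely as a sketch of the member-block idea from the reply applied to the original failing script (member names from the question; whether this changes the M1/M2 behaviour would need testing, since the DATACOPY workaround is already known to work):

    FIX (M1:M3, other sparse dimension members)
        "Budget" (
            /* anchor the assignment to the Budget member block */
            Budget = Actual;
        )
    ENDFIX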

  • Member block component only select dense dimension members

    I used Calculation Manager to create Business Rule for Planning app. The Member Block component only show dense dimensions. Why?

    It's trying to force you to write the calculation optimally. Calculations with sparse member blocks are much slower than with dense member blocks. Even if you cannot select it, you can always type in the sparse dimension member name; you don't need to use the member selector.

  • Variance Member Calculation

    I want to write BPC reports that sort on variance (Actual to Budget).  Perhaps an analyst would also filter on variances to make it easier to find problem areas (i.e. variances > $10,000).
    I don't think I can accomplish this with evDre() without creating a new CATEGORY/VARIANCE member.
    Anyone have an example of a logic file/member calculation that replicates evBet() functionality and populates a VARIANCE member?
    Is this already part of ApShell?  Isn't this requested all the time? Possibly I am thinking about this incorrectly...
    Thanks for any insight...Marv

    I can't make the member stored. Also, changing the member formula requires a lot of changes, as we have many versions and scenarios to handle. I found the following solution to halt the two-pass calculation for the sparse member, and it's working fine.
    Ref:http://www.network54.com/Forum/58296/thread/1078946717/%25+variance+via+@VARPER+for+%25....ughh!+Can%27t+even+explain+it!
    Apply a UDA to sparse dimension members with two-pass formulas
    wrap two-pass formulas in dense dimensions with IF statements checking for NOT that UDA
    Steps required:
    Create a UDA in the scenario dimension called "Variance", and apply it to the Var% members.
    The var% member is a sparse dynamic calc two pass member with the following formula: @VARPER(Actual, Budget);
    Then, modify your two pass calc ratios in the accounts dimension formulas to look like this:
    IF(NOT(@ISUDA("Scenario", "Variance")))
    "GP %" = "Sales" % "Gross Profit";
    ENDIF;
    regards,
    aamir

  • Not creating the sparse data block

    I have a database that loads to a small range of members in a sparse dimension (Products in this case is the sparse dimension, and I'm loading Revenue in the dense Accounts dimension). In my calc script, I'm allocating the revenue to another set of products. The problem is that unless I load zeroes into those data blocks prior to calcing, the data blocks don't get created, so no revenue is allocated. How can I fix this? I should also mention that I'm creating members of another sparse dimension (Sales Order # in this case) with each data load, so I can't load zeroes in advance.
    Thanks!
    Clark

    In order to calculate a block which does not exist, there are 3 ways:
    - Create block on equation (SET CREATEBLOCKONEQ ON): but this decreases calculation performance and will create too many blocks. In the past, this did not work well.
    - Instead of using a dense member on the left side of the '=' sign, use a sparse member. E.g., if Actual is a sparse member, you write: ACTUAL (IF (@ISMBR(memberdensetobecalculated)) ACTUAL = X * X/Y; ENDIF). A sparse member formula always creates the block if needed. Unfortunately, this kind of formula may be time consuming, depending on the size of the FIX.
    - Create the block before your calculation using DATACOPY. I prefer this solution, because it is the most powerful.
    Enjoy your script.
    Franck
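    A minimal sketch of the DATACOPY approach Franck prefers, applied to the allocation question; the product member names ("P_Source", "P_Target"), the Revenue account and the allocation factor are illustrative assumptions, not taken from the post:

    /* create the target product's blocks by copying from a product that already holds data */
    DATACOPY "P_Source" TO "P_Target";

    /* the allocation now has existing blocks to write into */
    FIX ("P_Target")
        "Revenue" = "Revenue"->"P_Source" * 0.25;
    ENDFIX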

  • Sparse dimension problem (2)

    This message is for Jade Cole and Farid Rashid: I'm working on Hyperion Essbase 5.0.2 patch 11, and when you assign a constant to a member in a sparse dimension, Essbase DOESN'T create data blocks for the sparse dimension member combinations!!! It must be a bug!!! I have replicated the situation on Hyperion Essbase 6.0 and it works fine there.
    Thank you,
    Lucas

    You need to restrict the CLEARDATA further, as it seems to be deleting the entire block; hence the Account = 1 assignment is failing. There are a few ways to remedy the problem. One way would be to use DATACOPY to create the Enterprise1 and Enterprise2 blocks, but you would need a source member that has numbers. The other thing you could do is re-arrange the formula so that a sparse member appears on the left-hand side of the equation rather than a dense member. This will create the blocks and populate the values, but you should be careful, as it may create more blocks than you want.
    Hope that helps.
    Regards,
    Jade Cole
    Senior Business Intelligence Consultant
    Clarity

  • Question/Comment about Outline

    Did you ever notice that members that are dynamic calc in a sparse dimension are counted as stored members when you look at database statistics? I wonder why. Why would dynamic calc members in a sparse dimension be counted as stored members when the same members in a dense dimension are not?
    Example: Dimension X is dense. It has 10 members, 5 of which are dynamic calc. In database statistics, you will see the following:
    Dimension Type : Dense
    Members in dim : 10
    Members stored : 5
    For argument's sake, use the same dimension but set it to sparse. In database statistics, you will see the following:
    Dimension Type : Sparse
    Members in dim : 10
    Members stored : 10
    Has anyone else seen this, or is this a possible bug, or our server? We are using Essbase 6.1 patch 2 on Sun Solaris 5.8. We also have Essbase 6.1 patch 3A on a development server, also running on Sun Solaris 5.8. Both servers show the same statistics.
    Thomas

    It could be a bug - or just the way the value is calculated. For example, if you tag members of sparse dimensions as dynamic calc, the POTENTIAL number of blocks stays the same. I would argue that it should decrease, since you will never have blocks stored for members that are dynamic. However, if you have data in your database, the number of EXISTING blocks does decrease, which is expected.
    Now, the other possibility is that Essbase calculates blocks according to their block number. Even though a sparse member may be dynamic, Essbase may still track the block number for this "dynamic" block in case it is required for a rollup or other calculation. The "storage" of the sparse member is merely to assign the sparse combination a block number.
    That is my best guess (aside from a bug)... I couldn't find an explanation in the DBA guide.
    Regards,
    Jade Cole
    Senior Business Intelligence Consultant
    Clarity

  • Attribute Dimension or Alternate Hierarchy

    Hello,
    We have a sparse dimension that has probably around 1500 members. We need to report on these members in different views than the current hierarchy holds. Is it best to create an alternate hierarchy with these 1500 members, or to tag these members with various attributes and then use the attributes in the different reports? I did some research and found that attribute dimensions often have slow FR performance since they are dynamic; however, I feel like 1500 shared members is a lot. Any tips, tricks, or advice would be greatly appreciated.

    Hi,
    I don't think 1500 shared members is a concern in and of itself (unless you are putting them all under one dynamic parent!). You are correct there could be some response time impact with attribute dimension retrievals, but then shared members under dynamic sparse parents may present similar concerns.
    I would encourage you to think about the structure and navigability desired. You should keep in mind that shared member rollups can only be used in reports consistent with the dimension they are in. One advantage of attribute dimensions over shared rollups is that, only with attribute dimensions, can you cross-tab report against the base dimension. If your users care about being able to see certain alternate groupings across the main dimension, then I would suggest going with attribute dimensions.
    There are other considerations. For instance, by adding a large number of shared rollups to your BSO anchor dimension (the physically last dimension with hierarchy), you put more demand on the Essbase calculator cache.
    Out of time for now. Just sharing some initial thoughts I hope you will find helpful.
    Darrell Barr

  • Business/Calc script

    Hi All,
    Request your inputs on the below.
    The code of the rule is as below:
    FIX ("Local","&STATmonth","HSP_Input",&STATYR1,"Forecast","Entity_default",@RELATIVE("product",0),"Version_Deault")
    "AC_BL1"
    if(@ISMBR(&firstmonth))
    AC_BL1 =AC_BL1->&firstmonth;
    else
    AC_BL1 = @MDSHIFT("AC_BL2",-1,"Period",) - @MDSHIFT("AC_BL3",-1,"Period",) + @MDSHIFT("AC_BL1",-1,"Period",) + @MDSHIFT("AC_BL4",-1,"Period",) ;
    endif
    ENDFIX
    FIX ("Local","&STATmonth","HSP_Input"",&STATYR2,"Forecast","Entity_default",@RELATIVE("product",0),"Version_Deault")
    "AC_BL1"
    IF(@ISMBR(Jan))
    AC_BL1 =AC_BL1->&lastmonth;
    else
    AC_BL1 = @MDSHIFT("AC_BL2",-1,"Period",) - @MDSHIFT("AC_BL3",-1,"Period",) - @MDSHIFT("AC_BL4",-1,"Period",) ;
    endif
    ENDFIX
    MODIFIED SCRIPT WITH NESTED FIX AS BELOW:
    FIX ("Local","&STATmonth","HSP_Input","Forecast","Entity_default",@RELATIVE("product",0),"Version_Deault")
              FIX (&STATYR1)
    "AC_BL1"
    if(@ISMBR(&firstmonth))
    AC_BL1 =AC_BL1->&firstmonth;
    else
    AC_BL1 = @MDSHIFT("AC_BL2",-1,"Period",) - @MDSHIFT("AC_BL3",-1,"Period",) + @MDSHIFT("AC_BL1",-1,"Period",) + @MDSHIFT("AC_BL4",-1,"Period",) ;
    endif
              ENDFIX
              FIX (&STATYR2)
    "AC_BL1"
    IF(@ISMBR(Jan))
    AC_BL1 =AC_BL1->&lastmonth;
    else
    AC_BL1 = @MDSHIFT("AC_BL2",-1,"Period",) - @MDSHIFT("AC_BL3",-1,"Period",) - @MDSHIFT("AC_BL4",-1,"Period",) ;
    endif
              ENDFIX
    ENDFIX
    I have taken the differences between the first two FIX statements, which are &STATYR1 and &STATYR2, and fixed on them separately in nested FIX statements.
    Is my logic correct? When I try to run this script it takes hours.
    Can someone please shed light on whether I am doing anything wrong?
    Thanks in Advance.

    Are Account and Period dense as is most typical?
    Of the two, I like your nested version better ... if nothing else than easier to read code.
    I might avoid a dense Period member &STATmonth in the FIX. Instead, join it with the other period tests in the IF statements: IF(@ISMBR(&firstmonth) AND @ISMBR(&STATmonth)). Try turning on SET MSG DETAIL; for a run. Caution ... it is VERY verbose, and your application log file can balloon up, but you will see exactly what member sets are being calculated. You may find that it is calculating JAN->Product1, then FEB->Product1, then MAR->Product1, repetitively reading and writing the same sparse member combination block over and over again.
    Sometimes outline math is a lot faster than calc script math and gets the same result. Try an alternate rollup in Account where you have:
    AC_BL_NET
    + AC_BL1
    + AC_BL2
    - AC_BL3
    + AC_BL4
    Then use a single @PRIOR(AC_BL_NET) instead of four @MDSHIFT statements. For extra credit, you could even read AC_BL_NET into an ARRAY. They are remarkably fast, but only handle one dimension ... in your case Product is the only dimension that changes.
    Isn't this the same as doing nothing?
    if(@ISMBR(&firstmonth)) AC_BL1 =AC_BL1->&firstmonth;
    Perhaps you want to skip that and just use IF(NOT(@ISMBR(&firstmonth))) then do the ELSE stuff.
    Regards,
    Mike Henderson
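    As a sketch of the rewrite Mike describes for the &STATYR1 block, assuming an alternate Account rollup member AC_BL_NET (+ AC_BL1 + AC_BL2 - AC_BL3 + AC_BL4) has been added to the outline and that Period is the dimension tagged as Time, so @PRIOR shifts one period by default. Member names and substitution variables are reused from the original script; the &STATmonth handling and the &STATYR2 block (which uses a different sign pattern) are left out:

    FIX ("Local","HSP_Input","Forecast","Entity_default",@RELATIVE("product",0),"Version_Deault")
        FIX (&STATYR1)
            "AC_BL1" (
                IF (NOT (@ISMBR(&firstmonth)))
                    /* one @PRIOR on the alternate rollup replaces the four @MDSHIFTs */
                    AC_BL1 = @PRIOR("AC_BL_NET");
                ENDIF
            )
        ENDFIX
    ENDFIX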

  • Calc Churning on Flat Hierarchies

    Outline is hourglass-shaped with 7 dims, 4 of them sparse. Three of these dimensions are very flat - 3,000, 8,000, and 13,000 members, with thousands of members rolling up to a single parent node. CALC DIM on the largest sparse dimension completed in 1 hour. CALC DIM on the 2nd dimension caused Essbase to churn for over 24 hours before I killed it. I used SET CACHE HIGH; (default settings), SET CALCHASHTBL ON; and 100MB allocated for the hash table in essbase.cfg. If I add additional levels to bring the number of children per parent under 400, the calc completes in 11 hrs, with 50,000,000 upper-level blocks created - a 32GB cube. I was interested to see how many upper-level blocks were created for these manufactured intermediate parents. Doing this also makes the cube more difficult for end users. Any suggestions?

    Hi,
    It is best, if possible, to keep the children of a sparse dimension to 100 members or less. This is not always possible. Is the level of detail you are storing required?
    Just some info...

  • Performance Question on Time dimension.

    Hi all,
    I have a cube with a date dimension which represents the snapshot date for open orders on the system (5 other dimensions apart from this one) - this is updated daily.
    Should I:
    (1) build all dates for the foreseeable future
    (2) add each new snapshot date in day by day.
    Each seems to have pros & cons: (1) doesn't need to restructure on every dimension build, but the calculation seems to take longer; (2) restructures the outline on every build, which takes longer, but the calculation is quicker.
    I'd value anyone's thoughts or experience on this.
    Essbase 11.1.1 on redhat LINUX.
    Thanks

    Why not just build the dimension (every two years?) and incorporate a defrag routine in your data load -- I'm guessing that something is fragmenting your db although a sparse restructure shouldn't be fixing that?
    This would be easy to test -- try the first approach and look at PAG file size, then try the second and look at the same. Again, a sparse restructure causes Essbase to rewrite the .IND file -- the .PAG file just gets added to as new sparse member data comes in. That should be the same behavior to .PAG whether the members exist or not -- Essbase doesn't create or rewrite to existing blocks until there's data.
    Or is it possible that you're using Intelligent Calc and Essbase is only calculating the dirty blocks? That sort-of, kind-of makes sense although I have to wonder why the number of dirty blocks would be different between the two approaches.
    Regards,
    Cameron Lackpour
