Essbase calc script performance issues

Hi,
I have Essbase 9.3 running on a Sun Solaris server (4 CPUs, 16 GB RAM). The "CALC ALL" calc script takes ~3 hrs to complete.
This is the calc script.
/ESS_LOCALE English_UnitedStates.US-ASCII@Binary
SET UPDATECALC OFF;
SET CALCPARALLEL 4;
SET CALCTASKDIMS 2;
CALC ALL;
We don't actually need to calc all dimensions, but even when we FIX on specific dimensions we get the same timing. Below is that script:
SET UPDATECALC OFF;
SET CALCPARALLEL 4;
SET CALCTASKDIMS 2;
FIX ("Y2009", "Actual");
CALC DIM("Data Source","Intercompany","LegalEntity","Site","Department","Entity");
ENDFIX
The ess00001.ind is 700 MB and the ess00001.pag is 2.1 GB.
In Admin Services, this is what I see for the caches:
1) Index cache size is 1 GB for this DB
2) Index cache current value is 1 GB
3) Datafile cache setting is 1.5 GB
4) Datafile cache current value is 0 (?? not sure why??)
5) Data cache setting is 4.1 GB
6) Index page setting is 8 KB
please help ...
Thanks
Moe

Moe,
I'm guessing you inherited this thing, else you would know why the cache settings are what they are, but here are some thoughts:
Caches:
3) Datafile cache setting is 1.5 GB
4) Datafile cache current value is 0 (?? not sure why??)
You're running the database with buffered I/O, so the data file cache is ignored.
1) Index cache size is 1 GB for this DB
2) Index cache current value is 1 GB
You have consumed all of the cache. I'm a little confused, as you state your .ind file to be 700 megabytes -- generally the index cache consumption doesn't go beyond the .ind file size. When you look at your hit ratio statistics in EAS, does it show a 1 against the index cache? If yes, then you don't need to look any further, as that's as good as it's going to get.
5) Data cache setting 4.1 GB
Unless you're using MEMSCALINGFACTOR, I don't think Essbase is actually addressing all of the memory you've assigned. What are you showing as actually used? In any case, having a data cache almost twice as big as the .pag files is a waste -- it's way too large.
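If the caches do turn out to need resizing, a MaxL sketch along these lines would do it -- app/db name and the sizes are placeholders, not recommendations; benchmark against your own .ind/.pag footprint:

```
/* MyApp.MyDb and the sizes below are illustrative only */
alter database MyApp.MyDb set index_cache_size 700mb;
alter database MyApp.MyDb set data_cache_size 2gb;
/* the data file cache only matters under direct I/O */
alter database MyApp.MyDb set data_file_cache_size 300mb;
```

Stop and restart the database afterwards so the new sizes take effect.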
Easy, off the cuff suggestions without knowing more about your db:
1) Try AGG instead of CALC DIM for sparse dimensions.
2) Try turning off (yes, turning off, you'd be surprised) parallel calc, and benchmark it. It will probably be slower, but it's nice to know.
3) Dimension order? Modified hourglass?
4) Tried defragmenting the database and benchmarking the performance?
5) What is your block size? Big? Small?
6) Are you calculating your Accounts/Measures dimension in this calc? If you are, and it's dense, could you make those Accounts dynamic calc? Dropping a dimension from the calc can be huge.
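Suggestion 1, applied to the script from the original post, could look like this -- a sketch that assumes all six dimensions being calced are sparse and carry no member formulas (AGG only aggregates; it skips formulas):

```
SET UPDATECALC OFF;
SET CALCPARALLEL 4;
SET CALCTASKDIMS 2;
FIX ("Y2009", "Actual")
    /* AGG rolls up sparse dimensions only and is often faster than CALC DIM */
    AGG ("Data Source","Intercompany","LegalEntity","Site","Department","Entity");
ENDFIX
```

Benchmark both versions; if any of those dimensions do carry member formulas, CALC DIM is still needed for those.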
I'm sure there will be other suggestions -- these are the easiest.
Regards,
Cameron Lackpour

Similar Messages

  • Calc script & performance issues

Hi All, we have a calc script which used to take only 10 minutes every day. But today it is taking a long time -- 4 hrs and still running. If I cancel that calc operation, what is the impact on the database? Earlier all users were happy with the speed, but suddenly everyone got annoyed -- it is taking a long time to retrieve data. Quickly, what are the parameters I need to check? Thanks in advance.

If you are using committed access then you can safely cancel the calculation. All data will be reverted back to what it was before the calculation. However, if you are using uncommitted access it is recommended not to cancel any running operation.
If you want to eliminate fragmentation, just export your level 0 data and import it again, and then do a CALC ALL. Doing so will remove any fragmentation. It is recommended to do this once in a while -- say every 2 months -- to get rid of fragmentation.
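The export/reload cycle could be scripted in MaxL along these lines (app/db and file names are placeholders):

```
/* MyApp.MyDb and 'lev0.txt' are placeholders */
export database MyApp.MyDb level0 data to data_file 'lev0.txt';
alter database MyApp.MyDb reset data;
import database MyApp.MyDb data from data_file 'lev0.txt' on error abort;
execute calculation default on MyApp.MyDb;
```

Run this in a maintenance window, since the reset empties the database until the import and calc complete.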

  • Enhanced Calc Script more flexible than native Essbase Calc Script?

    What makes an Enhanced Calc Script more flexible than native Essbase Calc Script?
    Run on Save or @CALCMODE function or Run time prompts or Can be run over the web or Substitution Variables or Custom Defined Functions.
Appreciate it if you reply ASAP!!
    Thanks in Advance!!!

    Some posts on the subject
    Business Rule
    Business rule
    Business rule
    Cheers
    John
    http://john-goodiwn.blogspot.com/

  • Will block size effect the calc script performance?

    Hi Experts,
I have a cube called RCI_LA:RCI_LA and I have created calc scripts that are working fine. But those calc scripts are taking much more time than expected (normally they should not take more than 15 min, but some are taking nearly 1 hr or more).
In database properties I found that the block size is 155,896 B, i.e. about 152 KB, but this size should be 8 to 100 KB. Block density is 0.72%.
If the block size exceeds 100 KB, will it impact the performance of calc scripts?
I think the answer to the above question is "yes". In this case, what should I do to improve calc script performance?
    Could you please share your experience here with me to come out of this problem?
    Thanks in advance.
    Ram

    I believe Sandeep was trying to say "Dynamic" rather than "Intelligent".
    The ideal block size is a factor in all calcs, but the contributing reasons are many (The main three are CPU caching, Data I/O overhead, Index I/O overhead).
Generally speaking, the ideal block size is achieved when you can minimize the combination of Data I/O overhead and Index I/O overhead. For this reason a block size that is too large will incur too much Data I/O, while a block size that is too small will incur too much Index I/O. If your index file is small, increasing your block size may help. Although the commonly acceptable block size is between 8 KB and 64 KB, this is just a guideline.
In other words, if you test it with something right in the middle and your index file is tiny, you might want to test it with a smaller block size. If your index file is very large (i.e. 400 MB or more), you may want to increase the block size and retest.
Ways to increase/decrease it are also many. Obviously, changing the dense/sparse settings is the main way, but there are some considerations that make this a touchy process. Another way is to use dynamic calc in the dense dimensions. I say start at the top of your smallest dense dimension and keep the number of DIMENSIONS that you use D-C on limited. Using D-C members in a dense dimension does NOT increase the index file, so it could be considered a "free" reduction in block size -- the penalty is paid on the retrieve side (there is no free ride).

  • Execute Essbase Calc Scripts from FDM

    Hi,
Can any of you let me know how to execute Essbase calc scripts from FDM? These calc scripts are on the Essbase server. Any help would be greatly appreciated.
    Thanks

    See the thread below:
    Re: FDM - Script

  • Report Script Performance Issues

    Essbase Nation,
    We have a report script that extracts a full 12 months worth of history in 7 minutes. The script that is used to extract the period dimension is as follows:
    <Link (<Descendants("Dec YTD") And <Lev("Period",0))
    The line above is then changed to pull just one month of data, and now the report script runs for 8 hours.
    Please advise as to why the difference in performance.
    Thank you.

    ID 581459.1:
    Goal
    How to optimize Hyperion Essbase Report Scripts?
    Solution
    To optimize your Report follow the suggested guidelines below:
    1. Decrease the amount of Dynamic Calcs in your outline. If you have to, make it dynamic calc and store.
    2. Use the <Sparse command at the beginning of the report script.
    3. Use the <Column command for the dense dimensions instead of using the Page command. The order of the dense dimensions in the Column command should
    be the same as the order of the dense dimension in the outline. (Ex. <Column (D1, D2)).
    4. Use the <Row command for the sparse dimensions. The order of the sparse dimensions in the Row command should be in the opposite order of the sparse
    dimension in the outline. (Ex. <Row (S3, S2, S1)). This is commonly called sparse bottom up method.
    5. If the user does not want to use the <Column command for the dense dimensions, then the dense dimensions should be placed at the end of the <Row command.
    (Ex. <Row (S3, S2, S1, D1, D2)).
    6. Do not use the Page command, use the Column command instead.
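Putting guidelines 2-4 together, a skeleton report script might look like this. The dimension names are placeholders; it assumes Year and Measures are dense in that outline order, and Scenario, Product, Market are sparse in that outline order:

```
// sparse bottom-up extraction skeleton
<Sparse
<Column ("Year", "Measures")            // dense dims, outline order
<Row ("Market", "Product", "Scenario")  // sparse dims, reverse outline order
!
```

The `!` terminates the report; member selection commands for each dimension would go before it.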

  • Outline Order, Calc Script Performance, Substitution Variables

    Hi All,
    I am currently looking in to the performance side.
    This is mainly about the calculation script performance.
There are a lot of questions in my mind and, as it is said, you can get the results only by testing.
1. Outline order should be from least sparse to most sparse
(other reason: to accommodate as many sparse members as possible into the calculator cache) -- correct me if I am wrong.
2. Is the index entry created based on the outline order? For example, if I have the outline order Scenarios, Products, Markets, will my index entry be like Scenario -> Products -> Markets?
3. Does this order have to match the order of members in the FIX statement of the calculation script?
4. I have 3 sparse dimensions: P (150 members), M (8 members), V (20 members).
I use substitution variables for these three in the calculation script, and these three are mandatory in my calculation script. Now when I look at the FIX statement, these three are the first 3 parameters of the FIX statement, and since I am fixing on specific members, will placing these three members as the first 3 sparse dimensions in the outline improve performance?
In one way, I can say that a member from P, M, V becomes my key for the data.
Theoretically, maybe it will... but in practical terms I don't see any such thing. Correct me if my thinking is wrong.
One more thing: I have a calc script with around 10 FIX statements, and P, M, V are used in every FIX statement. Since my entire calculation will be on only one P, one M, one V, can I put everything in one FIX at the beginning and exclude them from the remaining FIX statements?
5. I have a lot of cross-dimensional operations in my calc scripts for the Accounts dimension (500+ members).
    Is there a way to reduce these?
    6. My cube statistics..
    Cube size : 80 GB +
    Block Size : 18 KB (Approx)
Block density: 0.03. This is what I am most worried about; it really hurts.
This is one of the reasons why my calculation time is > 7 hours, and sometimes it is horrible when there is a huge amount of data (it takes around 20+ hours to calculate).
I would be looking forward to your suggestions.
It would be really appreciated if you are OK with sharing your contact number so that I can get in touch with you. That would be a great help.

    I have provided some answers below:
    There are lot of questions in my mind and as it is said you can get the results only by testing.
    ----------------------------You are absolutely right here but it helps to understand the underlying principles and best practices as you seem to understand.
    1. Outline order should be from least sparse to most sparse
(other reason: to accommodate as many sparse members as possible into the calculator cache) -- correct me if I am wrong.
----------------------------This is one reason, but another is to manage disk I/O during calculations. Especially when performing the initial calculation of a cube, the order of sparse dimensions from smallest to largest will measurably affect your calc times. There is another consideration here though. The smallest to largest (or least to most) sparse dimension argument assumes single threading of the calculations. You can gain improvements in calc time by multi-threading. Essbase will be able to make more effective use of multi-threading if the non-aggregating sparse dimensions are at the end of the outline.
    2. Is Index entry created based on the outline order. For example I have outline order as Scenarios, Products, Markets then does my index entry be like scenario -> Products -> Markets ?
    ----------------------------Index entry or block numbering is indeed based on outline order. However, you do not have to put the members in a cross-dimensional expression in the same order.
    3. Does this order has to match with the order of members in FIX Statement of calculation script?
    ----------------------------No it does not.
    4. I have 3 sparse dimensions. P (150 members), M (8 members), V (20 members).
I use substitution variables for these three in the calculation script, and these three are mandatory in my calculation script. Now when I look at the FIX statement, these three are the first 3 parameters of the FIX statement, and since I am fixing on specific members, will placing these three members as the first 3 sparse dimensions in the outline improve performance?
    --------------------------This will not necessarily improve performance in and of itself.
    In one way, I can say that a member from P, M,V becomes my key for the data.
Theoretically, maybe it will... but in practical terms I don't see any such thing. Correct me if my thinking is wrong.
One more thing: I have a calc script with around 10 FIX statements, and P, M, V are used in every FIX statement. Since my entire calculation will be on only one P, one M, one V, can I put everything in one FIX at the beginning and exclude them from the remaining FIX statements?
    --------------------------You would be well advised to do this and it would almost certainly improve performance. WARNING: There may be a reason for the multiple fix statements. Each fix statement is one pass on all of the blocks of the cube. If the calculation requires certain operations to happen before others, you may have to live with the multiple fix statements. A common example of this would be calculating totals in one pass and then allocating those totals in another pass. The allocation often cannot properly happen in one pass.
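A sketch of folding the common substitution variables into one outer FIX, per the advice above. Member names are placeholders, and the multi-pass warning still applies -- only merge passes that don't depend on each other's results:

```
FIX (&CurP, &CurM, &CurV)      /* common sparse members, fixed once */
    FIX ("Pass1_Members")      /* only what differed in the old FIX 1 */
        CALC DIM ("Accounts");
    ENDFIX
    FIX ("Pass2_Members")      /* only what differed in the old FIX 2 */
        CALC DIM ("Accounts");
    ENDFIX
ENDFIX
```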
    5. I have a lot of cross dimensional operations in my calc scripts for accounts dimension (500 + ) members.
    Is there a way to reduce these?
    -------------------------Without knowing more about the application, there is no way of knowing. Knowledge is power. You may want to look into taking the Calculate Databases class. It is a two day class that could help you gain a better understanding of the underlying calculation principles of Essbase.
    6. My cube statistics..
    Cube size : 80 GB +
    Block Size : 18 KB (Approx)
    Block density : 0.03 . This is what I am more worried about. This really hurts me.
This is one of the reasons why my calculation time is > 7 hours, and sometimes it is horrible when there is a huge amount of data (it takes around 20+ hours for calculation).
    ------------------------Your cube size is large and block density is quite low but there are too many other factors to consider to simply say that you should make changes based solely on these parameters. Too often we get focused on block density and ignore other factors. (To use an analogy from current events, this would be like making a decision on which car to buy solely based on gas mileage. You could do that but then how do you fit all four kids into the sub-compact you just bought?)
    Hope this helps.
    Brian

  • Retrieve imported and validated Entities for further ESSBASE calc Script

    Hi folks,
    once the FDM processing is finished:
    The Event Script AftConsolidate is executed.
It retrieves all unique Entity entries (trialbalance command), Period (POV), Scenario (POV), etc., and builds a dynamic Essbase calc script command which is afterwards executed to ensure that even the leaf members are correctly transferred to Essbase and the nodes are refreshed/aggregated as well.
    This works perfectly ;-)
    MY ISSUE:
    I want to clone this logic into a custom web script which then can be executed adhoc via webfrontend / Task Flow.
I tried to copy the AftConsolidate script into this custom web script. Unfortunately I get an error saying DATA ACCESS ERROR.
My assumption is that the trialbalance command does not work with custom web scripts.
Is that right? Are there any workarounds to retrieve the entity dimension from a custom web script and store the unique entity entries in an array?
    regards
    Hau

You don't need a custom script. FDM has functionality to call the consolidate action only; check the Activities menu.

  • Calc Script performance

Hello,
A customer has a cube that is taking a lot longer to calculate after each new load. The cube has 7 dimensions, monthly data from Jan 2005 on, 20 GB of data. It's taking around 14 hours to calculate, but if you load the data into an identical cube with no data, it is calculated in less than 2 hours.
The calc scripts include a FIX on a dense dimension, as shown below:

Fix (&CurrentYear, &CurrentMonth, Actual, Local) <--- sparse dims
Fix (@IDescendants("REVENUE"), "Qtd VP Interna") <--- dense dim members (Accounts)
Calc Dim (Presidencia, Product); <--- sparse dims
EndFix
EndFix

The question is: since FIXing on a dense dimension causes all data blocks to be touched, is the inner FIX causing a scan of all data blocks of the database, even if the outer FIX refers to sparse dims only?
And during the calc process, the Windows performance monitor shows very little CPU activity and only occasionally a disk read...
And since Calc Dim is not allowed within an IF command, is there another way to obtain that consolidation?
Thanks in advance!

Hello Gary!
I agree that calculating a new month's data in an empty cube should be faster than calculating the same data in a cube that already has 16 months of data, but I think it's taking much longer than expected. I expected it to be 50% slower, but not 700%!
I even recreated the production cube from scratch, loading and calculating one month at a time, on 2 different servers. The results are always the same: the new calc time is a lot longer than the previous one.
And when I use the Windows performance monitor to compare the server's behavior between the calcs of the empty cube and the production one, you can see that the server is either accessing the hard disk or calculating 100% of the time for the empty cube, but the graphs for the production cube indicate very low disk access and CPU activity. It seems to be waiting for something...
I have already made many configuration changes, such as resizing the index, data and data file caches (I'm using direct I/O), number of lock blocks, and compression mode among others, but the performance gains obtained for the calc on the empty cube are not reflected in the production cube, maybe because it's (apparently) doing nothing most of the time...
Is there a trace I can use to check what Essbase is doing during the calc? I have used SET MSG DETAIL but this didn't help.
Thank you for your help!
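One restructuring worth benchmarking, sketched below: since the dimensions being calced (Presidencia, Product) are sparse, the inner dense FIX does not reduce the set of blocks touched, so dropping it and aggregating inside the sparse-only FIX may be equivalent -- assuming the REVENUE accounts roll up purely by the outline and the dense FIX wasn't scoping any member formulas:

```
/* sketch only -- verify results match before adopting */
FIX (&CurrentYear, &CurrentMonth, "Actual", "Local")   /* sparse dims only */
    AGG ("Presidencia", "Product");   /* sparse rollup; no dense FIX needed */
ENDFIX
```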

  • Report Script- Performance Issue

    Hi,
I ran this report script and it is taking around 2 hours to complete. Is there any possibility to tune this script better? Please advise where we can tune it further.
    Thanks,
    UB.

ID 581459.1:
This is the same MOS note quoted in the "Report Script Performance Issues" thread above -- see the six report script optimization guidelines there.

  • Calc script prompt issue in workspace

    Hello Gurus,
    This issue is related to running Calc scripts from EPM Workspace 11.1.2.2.300.
    Browser : IE8/IE9
    We are facing an issue where users are able to run CALC from workspace successfully.
But they are not getting the prompt saying CALC XYZ ran successfully.
The browser just keeps showing that CALC XYZ is processing. If we check the logs, it shows the calc has already completed its execution.
Can this happen if we have a slow network? Or is it a bug, since the calc completes in the expected time in the background (per the logs) but the front-end process keeps going?
    Any help will be highly appreciated.
    Thanks,
    hyperionEPM

    Hi Rahul,
    Thanks for your quick reply..
    Yes we are on planning 11.1.2.2.300
    Following is what happens --
    <li> We open workspace and login to a planning application.
    <li> We navigate to the Business Rules from Tools.
    <li> We then run the Calc script from within Workspace for that application
What happens is the calc should run for ~3 mins (it runs for approximately that time when we check the logs); however, the workspace keeps showing that it is still in progress even after ~10 mins, giving us the impression that the calc is still running -- which is not the case, as the log file shows the time when it completed.
We then have to manually close the UI, which shows the calc as still running even though it has already completed.
    Any ideas on what could be causing this?
    Thanks,
    hyperionEPM

  • Call ODI 11g scenario from Essbase calc script/business rule using ODI SDK

    I am looking for any hints on how to use the ODI 11g SDK. I want to call a java application (CDF) that runs an ODI scenario using RUNJAVA in Essbase which I have successfully done in the 10g environment.
    The java application has the odi-core.jar included in the project and registers OK with Essbase and I have replicated code from the Oracle sample code site. When I run the application in a calc script I get the following error:
    EssbaseCluster-1.EFTS.EFTS.odi     Execute calculation script     June 17, 2011 10:20:40 AM NZST     Failed
    Error: 1200456 Problem running [indigo.essbase.odi.RunODIScenario]: [java.lang.NoClassDefFoundError: org/springframework/util/StringUtils]
When I comment out the code that creates the OdiInstance, the java app executes fine -- i.e. it writes something to the Essbase log.
The research I have done so far indicates that a classpath is incorrect. If that is the case, where do I start looking to correct the classpath? Is it the ODI classpath or the Essbase classpath?
Any tips would be appreciated.
    Thanks.

You need to add more jars to the classpath to execute this.
The following are the jars:
    1)     bsf.jar
    2)     bsh-2.0b2.jar
    3)     commons-collections-3.2.jar
    4)     eclipselink.jar
    5)     odi-core.jar
    6)     ojdl.jar
    7)     oracle.ucp_11.1.0.jar
    8)     persistence.jar
    9)     spring-beans.jar
    10)     spring-core.jar
    11)     spring-dao.jar
    12)     spring-jdbc.jar
Once you have these on the classpath, your scenario will execute.
    Hope this helps.

  • Restrict Planning Admin from seeing all Essbase calc scripts

Hi - I have a few people who are Administrators in Planning and they can see all of my calc scripts in Essbase. Is there a way to block them from seeing them? I have those calc scripts to run things that they do not need to run.
    I am using Planning 3.5.1, Essbase 6.5
    Thanks,
    Cindy

    Hi Cindy,
I assume the calc scripts you want to block from the Planning Admins are within the applications they are administrators of, correct?
The only way I can think to do this is to place them in another directory on the server, or on a share you have access to, and use ESSCMD scripts to execute them as a batch job.
    e.g.
    RUNCALC 3 C:\SERVER_DIR\mycalcs\calc
presuming the directory is on your Essbase server and you schedule this on your Essbase server.
This assumes your Planning Admins do not have access to your Essbase server. Of course, if this were Unix the path to the script would be different.
    You could also only have the scripts on your local machine.
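A minimal ESSCMD batch sketch for the above (server, credentials, app/db names, and the path are placeholders):

```
LOGIN "essserver" "admin" "password" ;
SELECT "MyApp" "MyDb" ;
RUNCALC 3 "C:\SERVER_DIR\mycalcs\calc" ;
EXIT ;
```

Save this as a script file and run it with `ESSCMD scriptfile.scr` from the scheduler; the Planning Admins never see the calc as a database object.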
    Regards,
    John A. Booth
    http://www.metavero.com

  • ALI Scripting - Performance Issues

We are using ALI scripting to raise events so that other portlets on the page can listen for them, and also passing some data to other portlets using the "PTPortlet.setSessionPref" scripting API. In my local testing, the scripting takes just 0.3 seconds; however, when we deployed this code to the common portal, which has a few other portlets, the scripting takes 5 seconds. Does anyone know of any best practices around ALI scripting to avoid performance issues, especially for transferring events between portlets?
    Thanks
    Sampath

Hi, I would like to provide additional information on the performance issue we are facing with ALI scripting. In the code below, we are using "setSessionPref" methods to set values in session variables using ALI scripting APIs and raising events for IPC.
    var prefName = 'selectionString';
    var prefValue = xmlFile;
    var gpConfigValue = "<%=gpPromptsConfigID%>";
    PTPortlet.setSessionPref(prefName,prefValue);
    PTPortlet.setSessionPref('gpFormObj',frmObj.name);
    PTPortlet.setSessionPref('gpPromptsConfigID', gpConfigValue);
    myportlet$$PORTLET_ID$$.raiseEvent('onSelectionSubmitFormObj',false);
    myportlet$$PORTLET_ID$$.raiseEvent('onSelectionSubmit', false);
This code does not take a lot of time (less than a second) to execute on my local machine; however, it takes a lot of time (around 5 seconds) in the integrated development environment.
After a lot of debugging, I observed that if I change "PTPortalContext.GET_SESSION_PREFS_URL" to point to http://localhost:7001 instead of the integrated development environment, the processing time is considerably reduced. The code below is inserted automatically by ALUI into every portal page. Does anyone know the significance of this code automatically inserted by ALUI and how it can impact IPC?
    // Define PTPortalContext for CSAPI
    PTPortalContext = new Object();
    PTPortalContext.GET_SESSION_PREFS_URL = 'http://dcdev.pg.com/portal/server.pt?space=SessionPrefs&control=SessionPrefs&action=getprefs';
    PTPortalContext.SET_SESSION_PREFS_URL = 'http://dcdev.pg.com/portal/server.pt?space=SessionPrefs&control=SessionPrefs&action=setprefs';
    Thanks
    Sampath

  • Peculiar problem with Essbase (Calc Script) - substitution variable / UDAs

This is odd, but I have a script like:
VAR iloop=1,break=0;
FIX(<required POV>)
LOOP (20,break)
VAR Country_total1,Country_total2,Country_total3;
FIX (@UDA(Entity,@ALIAS(@CONCATENATE("&Country",iloop)))) /* &Country1, &Country2, ... are substitution variables with UDAs stored as strings */
Statements;
/* statements for calculating total values for that country, stored against the variables */
Country_total1 = Country_total1 + <Calculation>;
ENDFIX
/* Second part: now the calculations stored in the variables are to be stored against specific entities */
FIX (@UDA(Entity,@ALIAS(@CONCATENATE("&Country",iloop))))
FIX(@ISUDA(Entity,<Check1>))
..... assign to relevant account
ENDFIX
ENDFIX
ENDLOOP
ENDFIX
    Now the problem is that the first fix statement works just fine, but the FIX statement in the 'second part' throws an error
    Error: 1200354 Error parsing formula for [FIX STATEMENT] (line 66): expected type [STRING] found [EXTVAR] ([iloop]) in function [@CONCATENATE]
    If I hard code the 'second part' FIX statement to the substitution variable directly - it works just fine.
    How can the first statement work and not the second one ? They are exactly the same.

    Glenn, thanks - I hadn't thought of that :).
    But it still does not entirely solve my problem (please see my previous post depicting a requirement similar to ours )
- We have lots of countries (50-60+, might be many more) and each country can have multiple entities (3-4 on average -- can go up to 7-8)
- so a good guess would be around 200 entities
- So say I have to do it for 2 countries only (two entity types). Then I need 4 variables -- 2 for each country ('Country 1 ET1 total', 'Country 1 ET2 total').
When the list is 20 countries, the variables become 40 :(.
- Still leaving aside the 40 variables for a bit --
There are subsequent steps of calculation which need to be done based on these totals (which are exactly the same for all countries) -- just that we need the correct totals to begin with; the rest is already stored in the DB.
So since I have a different variable for each country, I cannot write one single calculation block to use the variables sequentially one by one (can I?)
I might have to write a separate calculation block for each of these countries (20 separate blocks).
    That's what I was trying to avoid and simplify with the substitution variable (but is not working)
    - Create substitution variables - which would store the alias of the required countries (2/10/20 as many required)
    - Loop through these substitution variables - using them one by one
    - So I just need one single block of calculation with all the variable in the calc script being reused after each country calculation is done
    - and the user need not go into the script, as the only thing that will change are the countries. And he can change it easily through the substitution variable.
    Edited by: Ankur on Jun 27, 2012 12:53 PM
