Outline Order, Calc Script Performance, Substitution Variables

Hi All,
I am currently looking into performance, mainly calculation script performance.
There are a lot of questions in my mind, and as they say, you only get real answers by testing.
1. Outline order should be from least sparse to most sparse
(another reason: to accommodate as many sparse members as possible in the calculator cache). Correct me if I am wrong.
2. Is the index entry created based on the outline order? For example, if my outline order is Scenarios, Products, Markets, will my index entry be Scenarios -> Products -> Markets?
3. Does this order have to match the order of members in the FIX statement of the calculation script?
4. I have 3 sparse dimensions: P (150 members), M (8 members), V (20 members).
I use substitution variables for these three in the calculation script, and they are mandatory in every calculation. Looking at the FIX statement, they are its first three parameters, and since I am fixing on a specific member of each, will placing these three dimensions first among the sparse dimensions in the outline improve performance?
In a way, a member from P, M, V becomes my key for the data.
Theoretically I think it might, but in practical terms I don't see any such effect. Correct me if my thinking is wrong.
One more thing: I have a calc script with around 10 FIX statements, and P, M, V are used in every FIX statement. Since my entire calculation is only on one P, one M, one V, can I put them in one FIX at the beginning and exclude them from the remaining FIX statements?
5. I have a lot of cross-dimensional operations in my calc scripts for the Accounts dimension (500+ members).
Is there a way to reduce these?
6. My cube statistics:
Cube size: 80 GB+
Block size: 18 KB (approx.)
Block density: 0.03. This is what worries me most; it really hurts.
This is one of the reasons why my calculation time is over 7 hours, and sometimes it gets horrible when there is a huge amount of data (it takes around 20+ hours to calculate).
I look forward to your suggestions.
It would be really appreciated if you are OK with sharing your contact number so that I can get in touch with you. That would be a great help.

I have provided some answers below:
There are a lot of questions in my mind, and as they say, you only get real answers by testing.
----------------------------You are absolutely right here but it helps to understand the underlying principles and best practices as you seem to understand.
1. Outline order should be from least sparse to most sparse
(another reason: to accommodate as many sparse members as possible in the calculator cache). Correct me if I am wrong.
----------------------------This is one reason, but another is to manage disk I/O during calculations. Especially when performing the initial calculation of a cube, ordering the sparse dimensions from smallest to largest will measurably affect your calc times. There is another consideration here, though. The smallest-to-largest (or least-to-most) sparse dimension argument assumes single-threaded calculation. You can gain improvements in calc time by multi-threading, and Essbase will be able to make more effective use of multi-threading if the non-aggregating sparse dimensions are at the end of the outline.
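To illustrate (dimension names here are hypothetical, not from your outline), that reasoning gives the commonly cited "hourglass on a stick" ordering:
Accounts     (dense)
Time         (dense)
Markets      (sparse - smallest aggregating)
Products     (sparse - larger aggregating)
Customers    (sparse - largest aggregating)
Scenario     (sparse - non-aggregating)
Version      (sparse - non-aggregating)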
2. Is the index entry created based on the outline order? For example, if my outline order is Scenarios, Products, Markets, will my index entry be Scenarios -> Products -> Markets?
----------------------------The index entry (block numbering) is indeed based on outline order. However, you do not have to put the members of a cross-dimensional expression in the same order.
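For example (hypothetical member names), both of these assignments address the same block and cell, regardless of how Scenarios, Products and Markets are ordered in the outline:
"Sales" = "Budget"->"Cola"->"East" * 1.1;
"Sales" = "East"->"Budget"->"Cola" * 1.1;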
3. Does this order have to match the order of members in the FIX statement of the calculation script?
----------------------------No, it does not.
4. I have 3 sparse dimensions: P (150 members), M (8 members), V (20 members).
I use substitution variables for these three in the calculation script, and they are mandatory in every calculation. Looking at the FIX statement, they are its first three parameters, and since I am fixing on a specific member of each, will placing these three dimensions first among the sparse dimensions in the outline improve performance?
--------------------------This will not necessarily improve performance in and of itself.
In a way, a member from P, M, V becomes my key for the data.
Theoretically I think it might, but in practical terms I don't see any such effect. Correct me if my thinking is wrong.
One more thing: I have a calc script with around 10 FIX statements, and P, M, V are used in every FIX statement. Since my entire calculation is only on one P, one M, one V, can I put them in one FIX at the beginning and exclude them from the remaining FIX statements?
--------------------------You would be well advised to do this and it would almost certainly improve performance. WARNING: There may be a reason for the multiple fix statements. Each fix statement is one pass on all of the blocks of the cube. If the calculation requires certain operations to happen before others, you may have to live with the multiple fix statements. A common example of this would be calculating totals in one pass and then allocating those totals in another pass. The allocation often cannot properly happen in one pass.
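As a rough sketch of that consolidated structure (member and variable names here are hypothetical, not taken from your cube), with the multi-pass caveat above in mind:
SET UPDATECALC OFF;
/* one outer FIX on the three members that never change during the run */
FIX (&CurrP, &CurrM, &CurrV)
    /* pass 1: build the totals */
    FIX ("Actual")
        CALC DIM ("Accounts");
    ENDFIX
    /* pass 2: allocate the totals built in pass 1 */
    FIX ("Budget")
        "AllocatedExpense" = "TotalExpense" * "DriverPct";
    ENDFIX
ENDFIX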
5. I have a lot of cross-dimensional operations in my calc scripts for the Accounts dimension (500+ members).
Is there a way to reduce these?
-------------------------Without knowing more about the application, there is no way of knowing. Knowledge is power. You may want to look into taking the Calculate Databases class. It is a two day class that could help you gain a better understanding of the underlying calculation principles of Essbase.
6. My cube statistics:
Cube size: 80 GB+
Block size: 18 KB (approx.)
Block density: 0.03. This is what worries me most; it really hurts.
This is one of the reasons why my calculation time is over 7 hours, and sometimes it gets horrible when there is a huge amount of data (it takes around 20+ hours to calculate).
------------------------Your cube size is large and block density is quite low but there are too many other factors to consider to simply say that you should make changes based solely on these parameters. Too often we get focused on block density and ignore other factors. (To use an analogy from current events, this would be like making a decision on which car to buy solely based on gas mileage. You could do that but then how do you fit all four kids into the sub-compact you just bought?)
Hope this helps.
Brian

Similar Messages

  • Will block size effect the calc script performance?

    Hi Experts,
    I have a cube called RCI_LA:RCI_LA, and I have created calc scripts that work fine. But those calc scripts are taking much longer than expected (normally they should not take more than 15 minutes, but some are taking nearly an hour or more).
    In database properties I found that the block size is 155896 B, i.e. about 152 KB, whereas it should be 8 to 100 KB, and block density is 0.72%.
    If the block size exceeds 100 KB, will it impact the performance of calc scripts?
    I think the answer to the above question is "yes". In that case, what do I need to do to improve calc script performance?
    Could you please share your experience to help me get out of this problem?
    Thanks in advance.
    Ram

    I believe Sandeep was trying to say "Dynamic" rather than "Intelligent".
    The ideal block size is a factor in all calcs, but the contributing reasons are many (The main three are CPU caching, Data I/O overhead, Index I/O overhead).
    Generally speaking, the ideal block size is achieved when you can minimize the combination of data I/O overhead and index I/O overhead. For this reason a block size that is too large will incur too much data I/O, while a block size that is too small will incur too much index I/O. If your index file is small, increasing your block size may help; the commonly accepted block size is between 8 KB and 64 KB, but this is just a guideline.
    In other words, if you test it with something right in the middle and your index file is tiny, you might want to test it with a smaller block size. If your index file is very large (i.e. 400 MB or more), you may want to increase the block size and retest.
    Ways to increase/decrease it are also many. Obviously, changing the dense/sparse settings is the main way, but there are some considerations that make this a touchy process. Another way is to use Dynamic Calc in the dense dimensions. I say start at the top of your smallest dense dimension and keep the number of DIMENSIONS on which you use D-C limited. Using D-C members in a dense dimension does NOT increase the index file, so it could be considered a "free" reduction in block size -- the penalty is paid on the retrieve side (there is no free ride).
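    To put rough numbers on that (assuming the standard 8 bytes per stored cell in a block storage cube):
    block size = (stored members in dense dim 1) x (stored members in dense dim 2) x ... x 8 bytes
    155896 B / 8 bytes per cell = roughly 19,500 stored dense cells per block
    Every dense member you tag Dynamic Calc drops out of that product, so the block shrinks without the index file growing.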

  • Essbase calc script performance issues

    Hi,
    I have Essbase 9.3 running on a Sun Solaris server with 4 CPUs and 16 GB of RAM. The calc script "calc all" takes ~3 hrs to complete.
    This is the calc script.
    /ESS_LOCALE English_UnitedStates.US-ASCII@Binary
    SET UPDATECALC OFF;
    SET CALCPARALLEL 4;
    SET CALCTASKDIMS 2;
    CALC ALL;
    We don't have to calc all dimensions, but even with specific dimensions we get the same timing. Below is that script:
    SET UPDATECALC OFF;
    SET CALCPARALLEL 4;
    SET CALCTASKDIMS 2;
    FIX ("Y2009", "Actual");
    CALC DIM("Data Source","Intercompany","LegalEntity","Site","Department","Entity");
    ENDFIX
    The ess00001.ind is 700 Mb and ess00001.pag is 2.1 GB.
    In Admin services, this is what I see for caches
    1) Index cache size is 1 GB for this DB
    2) Index cache current value is 1gb
    3) Datafile cache setting is 1.5 GB
    4) Datafile cache current value is 0 (?? not sure why??)
    5) Data cache setting 4.1 GB
    6) Index page setting 8 kb
    please help ...
    Thanks
    Moe

    Moe,
    I'm guessing you inherited this thing, else you would know why the cache settings are what they are, but here are some thoughts:
    Caches:
    3) Datafile cache setting is 1.5 GB
    4) Datafile cache current value is 0 (?? not sure why??)
    You're running the database in buffered I/O, so the data file cache is ignored.
    1) Index cache size is 1 GB for this DB
    2) Index cache current value is 1gb
    You have consumed all of the cache -- I'm a little confused, as you state your .ind file to be 700 megabytes -- generally the index cache consumption doesn't go beyond the .ind file size. When you look at your hit ratio statistics in EAS, does it show a 1 against the index cache? If yes, then you don't need to look any further, as that's as good as it's going to get.
    5) Data cache setting 4.1 GB
    Unless you're using MEMSCALINGFACTOR, I don't think Essbase is actually addressing all of the memory you've assigned. What are you showing as actually used? In any case, having a data cache almost twice as big as the .pag files is a waste, as it's way too large.
    Easy, off the cuff suggestions without knowing more about your db:
    1) Try AGG instead of CALC DIM for sparse dimensions (a quick sketch follows below this reply).
    2) Try turning off (yes, turning off, you'd be surprised) parallel calc, and benchmark it. It will probably be slower, but it's nice to know.
    3) Dimension order? Modified hourglass?
    4) Tried defragmenting the database and benchmarking the performance?
    5) What is your block size? Big? Small?
    6) I think you are not calculating your Accounts/Measures dimension in your calc? If you are, and it's dense, could you make those Accounts dynamic calc -- dropping a dimension from the calc can be huge.
    I'm sure there will be other suggestions -- these are the easiest.
    Regards,
    Cameron Lackpour
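    To illustrate suggestion 1 with the FIX from the question (this assumes those six dimensions really are sparse and carry no member formulas that must run, since AGG only rolls up the hierarchy):
    SET UPDATECALC OFF;
    SET CALCPARALLEL 4;
    SET CALCTASKDIMS 2;
    FIX ("Y2009", "Actual")
        /* AGG aggregates sparse hierarchies only and skips member formulas,
           so it is often faster than CALC DIM on the same dimensions */
        AGG ("Data Source","Intercompany","LegalEntity","Site","Department","Entity");
    ENDFIX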

  • Report script - using substitution variable with multiple values

    Hi All,
    Substitution variable with multiple values is not working correctly with Report scripts. Can you please let me know what is the syntax to assign multiple values to a sub variable using maxl:
    alter database samp.samp set variable 'ExtractQuarter' 'Q1,Q2,Q3,Q4';
    alter database Samp.Samp set variable 'ExtractQuarter' 'Q1:Q4';
    I tried both of the above but they are errored out with the below error:
    Error: 1001005 - Unknown Member [Q1:Q4] in Report.
    My requirement is different for the Actual and Forecast data extracts, so I would like to use this variable to extract whole-year data for Forecast and current-quarter data for Actual without duplicating the report scripts for both processes.
    Thanks,
    PRaveen

    Hi,
    Please refer following thread,
    range of months in report script?
    Hope it helps.
    Regards
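    One workaround that is often suggested (worth testing on your version, since substitution variables are expanded as plain text) is to embed the quotes and commas in the variable value itself and reference &ExtractQuarter where the member list would normally go, e.g.
    alter database samp.samp set variable 'ExtractQuarter' '"Q1","Q2","Q3","Q4"';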

  • Calc Script performance

    Hello,
    A customer has a cube that is taking a lot longer to calculate after each new load. The cube has 7 dimensions, monthly data from Jan 2005 on, 20 GB of data. It takes around 14 hours to calculate, but if you load the data into an identical cube with no data, it is calculated in less than 2 hours.
    The calc script includes a FIX on a dense dimension, as shown below:
    Fix (&CurrentYear, &CurrentMonth, Actual, Local)          <--- sparse dims
        Fix (@IDescendants("REVENUE"), "Qtd VP Interna")      <--- dense dim members (Accounts)
            Calc Dim (Presidencia, Product);                  <--- sparse dims
        EndFix
    EndFix
    The question is: since FIXing on a dense dimension causes all data blocks to be touched, is the inner FIX causing a scan of all data blocks in the database, even if the outer FIX refers to sparse dims only?
    And during the calc process, the Windows performance monitor shows very little CPU activity and only occasionally a disk read...
    And since Calc Dim is not allowed within an IF command, is there another way to obtain that consolidation?
    Thanks in advance!

    Hello Gary!
    I agree that calculating a new month's data in an empty cube should be faster than calculating the same data in a cube that already has 16 months of data, but I think it's taking much longer than expected. I expected it to be 50% slower, but not 700%!
    I even recreated the production cube from scratch, loading and calculating one month at a time, on 2 different servers. The results are always the same: each new calc time is a lot longer than the previous one.
    And when I use the Windows performance monitor to compare the server's behavior between the calcs of the empty cube and the production one, you can see that the server is either accessing the hard disk or calculating 100% of the time for the empty cube, but the graphs for the production cube indicate very low disk access and CPU activity. It seems to be waiting for something...
    I have already made many configuration changes, such as resizing the index, data and data file caches (I'm using direct I/O), the number of lock blocks, the compression mode, among others, but the performance gains obtained for the calc in the empty cube are not reflected in the production cube, maybe because it's (apparently) doing nothing most of the time...
    Is there a trace I can use to check what Essbase is doing during the calc? I have used MSG Detail but this didn't help.
    Thank you for your help!

  • Calc script & performance issues

    Hi All,
    We have a calc script which used to take only 10 minutes every day, but today it is taking a long time: 4 hours and still running. If I cancel that calc operation, what is the impact on the database? Earlier all users were happy with the speed, but suddenly everyone is annoyed; it is taking a long time to retrieve data. Quickly, what are the parameters I need to check?
    Thanks in advance.

    If you are using committed access then you can safely cancel the calculation. All data will be reverted back to what it was before the calculation. However, if you are using uncommitted access, it is recommended not to cancel any running operation.
    If you want to eliminate fragmentation, just export your level 0 data, import it again, and then do a calc all.
    Doing so will remove any fragmentation. It is recommended to do this once in a while, say every 2 months, to get rid of fragmentation.
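    A sketch of that export/reload sequence in MaxL (application, database and file names are placeholders; adjust for your environment):
    export database Sample.Basic level0 data to data_file 'C:\temp\level0.txt';
    alter database Sample.Basic reset data;
    import database Sample.Basic data from data_file 'C:\temp\level0.txt' on error abort;
    execute calculation default on Sample.Basic;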

  • Maxl Clear Script with Substitution Variables

    Hi,
    I am new to Maxl and I am trying to write a clear script with Maxl on an ASO Cube. The script should only clear data on a certain version and year that changes every month. I am using the following syntax and it doesn't work.
    alter database ASOSamp.Sample clear data in region '{[&SV_CurrentPeriod]}';
    What is the correct syntax here?
    thanks,

    alter database ASOSamp.Sample clear data in region '{[&Test]}';
    ...with the subvar 'Test' set to Jan, works on my 11.1.2 test system.
    Incidentally, when you say "it doesn't work", do you mean you get a MaxL error, or it doesn't clear the expected region? I was assuming the former, but you don't say.
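    For the version-and-year requirement, a sketch along the same lines (variable and member names are placeholders, and this assumes both variables already exist at the database level; the region is just an MDX set, so a Crossjoin of the two sub vars should work):
    alter database ASOSamp.Sample set variable 'CurVersion' 'Working';
    alter database ASOSamp.Sample set variable 'CurYear' 'FY13';
    alter database ASOSamp.Sample clear data in region 'Crossjoin({[&CurVersion]}, {[&CurYear]})';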

  • How to backup Outline, Reports, Calc Scripts and Rule Files?

    just as the topic states, how do you do that for Essbase?

    Backup/Recovery guide -  http://docs.oracle.com/cd/E40248_01/epm.1112/epm_backup/launch.html
    Some options are to use file system copy or LCM
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Issue with substitution variable with @MDSHIFT in BR in Calc Manager

    Hi Experts,
    We have a BR in EAS which is working fine,
    and we exported it to Calc Manager. Now, during validation, it gives the error "@BaseScenario" is not found.
    the code is
    @MDSHIFT("LFLSalesIncVAT_GBP_C_DHP"->&BaseScenario->"Final"->"CD_FinalPlanPL",-1,"Year",,1,"Period",);
    If I hard-code it like @MDSHIFT("LFLSalesIncVAT_GBP_C_DHP"->"P7PP"->"Final"->"CD_FinalPlanPL",-1,"Year",,1,"Period",); it works fine.
    Is there any limitation with Calc Manager and substitution variables?
    Thanks
    GP

    There seem to be many issues with validations in Calc Manager; does the rule run if you don't validate?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Chose 'Export to Shared Services' on a calc script....where did it go?

    I accidentally right-clicked and chose 'Export to Shared Services' on a calc script in EAS.
    The help file doesn't say much... I am curious: where did this export go in Shared Services? What is the point of this feature?
    Thanks
    JTS

    Hi,
    I think it was Hyperion's original idea to share objects between applications; these were known as models, so you could export outlines and calc scripts to Shared Services and then share them across applications or synchronize different applications.
    It is something that I have never seen really used, and I am sure it was dropped after 9.2; the option to export still existed in EAS, but you couldn't manage models in 9.3.
    I think it has all been superseded by concepts like EPMA and LCM.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Problem setting the value of a substitution variable in a calc script

    Hi, All.
    I'm trying to update the value of a substitution variable inside an IF statement and I'm having trouble.
    Here's the code:
    *"Jan"(*
    IF  (LoopCounter = 1) &varEntity = "100";
    ENDIF;)
    the error I get back from EAS is:
    Error: 1200336 Error parsing formula for [Jan] (line 54): [(] without [)]

    I know CL knows a lot about HBR, if he's around today... LOL
    As a matter of fact, I just used an HBR (not using Calc Manager till I have to)'s macro functionality to drive fx for different Scenarios in a Planning app. Here's how I call the macro:
              /*     Call the macro mcrFxCoreOutYears for the four fx rate types for ALL 12 months of the year.     */
              FIX(&BudYear)
                   %mcrFxCoreOutYears(Constant)
                   %mcrFxCoreOutYears(Comparable)
                   %mcrFxCoreOutYears(Estimate)
                   %mcrFxCoreOutYears(Actual)
              ENDFIX
    I pass the Scenarios Constant, Comparable, Estimate, and Actual to the macro mcrFxCoreOutYears. You can apply HBRs against Essbase.
    I believe (I can't remember what exactly -- is it that templates don't accept parameters? That seems hard to believe. Looking at a CM template, that does seem to be the case. Bummer if I'm right.) that Calc Manager BRs have less functionality wrt templates, but I haven't worked with CM for over a year.
    It works really well -- a single place to maintain code and no appreciable performance cost.
    You cannot launch a HBR from a MaxL script but HBRs can be launched from command lines -- this may constrain where you place your automation (it basically has to run off of whatever the EAS server is. You may end up shelling out of MaxL to execute the HBR or even execute scripts across servers -- that does get more complicated.).
    Anyway, if you have any more questions, ask away -- it really is a very powerful component. I will have to look at CM more closely (sooner or later I will lose the HBR vs CM argument and at least need to know if templates support parameters) to see if I'm right on what I wrote above.
    Regards,
    Cameron Lackpour
    P.S. You cannot define an ARRAY in an HBR macro because ARRAY arrayname[value] gets read as a parameter. I ended up declaring the array in the calling HBR.
    P.P.S. I really need to write up the fx approach in a blog post -- it is wicked fast and really easy to use. Too many other posts in the queue already.
    Edited by: CL on Aug 24, 2011 1:48 PM

  • Unable to execute the substitution variable in calc scripts in essbase 11.1

    Unable to execute the substitution variable in calc scripts in essbase 11.1.3
    FIX(&CURRVERSION,COLA)
    Unit=units*Listprice;
    dataexport "file" "," "E:\NEW.TXT";
    ENDFIX
    Error: 1200471 Error parsing formula for FIX STATEMENT (line 1): expression expected before [)]
    This is error it throws when executing the calculation script
    I wonder whether it's a problem with the substitution variable; I want to know what went wrong inside the FIX statement.
    I created the substitution variable using MaxL.
    Essbase was installed in a custom manner and in standalone mode, not registered with Shared Services.
    Is this a problem with the custom installation of Essbase?
    Regards
    shenna

    If you remove the substitution variable and replace it with the actual value (whatever that is), does the code work? That will tell you if the issue is around the substitution variable or not.
    John -- First you race Glenn, then you race me -- and you always win. :)
    Regards,
    Cameron Lackpour
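    A quick way to rule the variable in or out (the database name below is a placeholder): list the substitution variables the server knows about, and set &CURRVERSION before running the calc, e.g.
    display variable;
    alter database Sample.Basic set variable 'CURRVERSION' 'Working';
    (use "add variable" instead of "set variable" if the variable does not exist yet)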

  • Peculiar problem with Essbase (Calc Script) - substitution variable / UDAs

    This is odd but I have a script like :
    VAR iloop=1,break=0;
    FIX(<required POV>)
    Loop (20,break)
    VAR Country_total1,Country_total2,Country_total3;
    FIX (@UDA(Entity,@ALIAS(@CONCATENATE("&Country",iloop)))) // &Country1, &Country2 - are substitution variables with UDAs stored as strings
    Statements;
    /* <statements for calculating total values for that country, stored against variables> */
    Country_total1 = Country_total1 + <Calculation>;
    ENDFIX
    /* Second part : Now again the calculations stored in the variables are to be stored against specific entities */
    FIX (@UDA(Entity,@ALIAS(@CONCATENATE("&Country",iloop))))
    FIX(@ISUDA(Entity,<Check1>)
    ..... Assign to relevant account
    ENDFIX
    ENDFIX
    ENDLOOP
    ENDFIX
    Now the problem is that the first fix statement works just fine, but the FIX statement in the 'second part' throws an error
    Error: 1200354 Error parsing formula for [FIX STATEMENT] (line 66): expected type [STRING] found [EXTVAR] ([iloop]) in function [@CONCATENATE]
    If I hard code the 'second part' FIX statement to the substitution variable directly - it works just fine.
    How can the first statement work and not the second one ? They are exactly the same.

    Glenn, thanks - I hadn't thought of that :).
    But it still does not entirely solve my problem (please see my previous post depicting a requirement similar to ours )
    - We have lots of countries (50-60+ might be much more) and each country can have multiple entities (3-4 on an average - can go unto 7-8)
    - so good guess would be around 200 entities
    - So say I have to do it for 2 countries only (two entity types). Then I need 4 variables - 2 for each country ('country 1 ET1 total', 'Country 1 ET2 Total')
    When the list is 20 countries, the variables become 40 :(.
    - Still leaving aside the 40 variables for a bit -
    There are subsequent steps of calculations which needs to be done based on these totals (which are exactly the same for all countries) - just that we need the correct totals to begin with and the rest is already stored in the DB
    So since I have a different variable for each country - I cannot write one single calculation block to use the variables sequentially one by one (can I ?)
    I might have to write a separate calculation block for each of these countries. (20 separate blocks)
    That's what I was trying to avoid and simplify with the substitution variable (but is not working)
    - Create substitution variables - which would store the alias of the required countries (2/10/20 as many required)
    - Loop through these substitution variables - using them one by one
    - So I just need one single block of calculation with all the variable in the calc script being reused after each country calculation is done
    - and the user need not go into the script, as the only thing that will change are the countries. And he can change it easily through the substitution variable.
    Edited by: Ankur on Jun 27, 2012 12:53 PM

  • A method for passing in the system date to either a substitution variable or directly into a calc script for use on the fix statement

    Does anyone have an idea of how to pass the server system date into a calc script or into a substitution variable, so that I can fully automate my calc script to calculate only the current day? Thanks very much for any assistance on this.

    unsure why I cannot attach the bat
    below is the raw code meant to be inserted into a .bat file
    ================================================
    code starts below this line
    ================================================
    ::
    :: pls ensure essbase server up and running
    :: batch file to upd subs var
    :: insert correct values below
    ::
    :: substitution variables set up in cube: curryr, lastyr
    :: substitution variables set up in cube: currmth, prevmth ...
    ::
    :: setting of local env vars
    setlocal
    ::
    :: setting of job control vars
    set svr=<< insert value here >>
    set uid=<< insert value here >>
    set pwd=<< insert value here >>
    ::
    :: setting of date and time vars
    for /F "tokens=1-4 delims=/ " %%i in ('date /t') do (
    set dayofweek=%%i
    set day=%%k
    set month=%%j
    set year=%%l
    set datestamp=%%l_%%j_%%k
    )
    for /F "tokens=1-2 delims=: " %%i in ('time /t') do (
    set hour=%%i
    set minute=%%j
    set timestamp=%%i_%%j
    )
    ::
    :: setting year vars
    set /a curryr=%year%
    set /a lastyr=(%year% - 1)
    ::
    :: setting paths and files
    set destpath=<< insert path here >>
    set errpath=<< insert path here >>
    set errfiles=%errpath%\*.err
    set errfiledir=%errpath%\%datestamp%_%timestamp%_err.dir
    set errfile=%destpath%\%datestamp%_%timestamp%_err.err
    set logfile=%destpath%\%datestamp%_%timeStamp%_log.log
    set upd_var_file=%destpath%\upd_var.txt
    ::
    :: initial housekeeping
    if exist %errfile% del %errfile%
    if exist %logfile% del %logfile%
    if exist %upd_var_file% del %upd_var_file%
    if exist %errfiledir% del %errfiledir%
    ::
    :: start all
    echo. >> %logfile%
    echo rem %0 >> %logfile%
    echo. >> %logfile%
    echo rem --- start all --- >> %logfile%
    date/t >> %logfile%
    time/t >> %logfile%
    :: dates
    echo rem --- dates --- >> %logfile%
    echo Curr Year = %curryr% >> %logfile%
    echo Last Year = %lastyr% >> %logfile%
    :: gen temp txt to upd subs vars
    echo rem --- upd subs vars start --- >> %logfile%
    echo. >> %upd_var_file%
    echo. >> %upd_var_file%
    echo login "%svr%" "%uid%" "%pwd%"; >> %upd_var_file%
    if %month% == 01 (
     echo updatevariable "CurrMth" "%svr%" "" "" "Dec"; >> %upd_var_file%
     echo updatevariable "PrevMth" "%svr%" "" "" "Nov"; >> %upd_var_file%
     echo updatevariable "CurrQtr" "%svr%" "" "" "Q4"; >> %upd_var_file%
    )
    if %month% == 02 (
     echo updatevariable "CurrMth" "%svr%" "" "" "Jan"; >> %upd_var_file%
     echo updatevariable "CurrQtr" "%svr%" "" "" "Q1"; >> %upd_var_file%
     echo updatevariable "CurrYr" "%svr%" "" "" "FY%curryr%"; >> %upd_var_file%
     echo updatevariable "NextYr" "%svr%" "" "" "FY%nextyr%"; >> %upd_var_file%
     echo updatevariable "NextY2" "%svr%" "" "" "FY%nexty2%"; >> %upd_var_file%
     echo updatevariable "LastYr" "%svr%" "" "" "FY%lastyr%"; >> %upd_var_file%
     echo updatevariable "LastY2" "%svr%" "" "" "FY%lasty2%"; >> %upd_var_file%
     echo updatevariable "LastY3" "%svr%" "" "" "FY%lasty3%"; >> %upd_var_file%
     echo updatevariable "PrevYr" "%svr%" "" "" "FY%lastyr%"; >> %upd_var_file%
     echo updatevariable "PrevMth" "%svr%" "" "" "Dec"; >> %upd_var_file%
    )
    if %month% == 03 (
     echo updatevariable "CurrMth" "%svr%" "" "" "Feb"; >> %upd_var_file%
     echo updatevariable "PrevMth" "%svr%" "" "" "Jan"; >> %upd_var_file%
     echo updatevariable "CurrQtr" "%svr%" "" "" "Q1"; >> %upd_var_file%
     echo updatevariable "PrevYr" "%svr%" "" "" "FY%curryr%"; >> %upd_var_file%
    )
    if %month% == 04 (
     echo updatevariable "CurrMth" "%svr%" "" "" "Mar"; >> %upd_var_file%
     echo updatevariable "PrevMth" "%svr%" "" "" "Feb"; >> %upd_var_file%
     echo updatevariable "CurrQtr" "%svr%" "" "" "Q1"; >> %upd_var_file%
    )
    if %month% == 05 (
     echo updatevariable "CurrMth" "%svr%" "" "" "Apr"; >> %upd_var_file%
     echo updatevariable "PrevMth" "%svr%" "" "" "Mar"; >> %upd_var_file%
     echo updatevariable "CurrQtr" "%svr%" "" "" "Q2"; >> %upd_var_file%
    )
    if %month% == 06 (
     echo updatevariable "CurrMth" "%svr%" "" "" "May"; >> %upd_var_file%
     echo updatevariable "PrevMth" "%svr%" "" "" "Apr"; >> %upd_var_file%
     echo updatevariable "CurrQtr" "%svr%" "" "" "Q2"; >> %upd_var_file%
    )
    if %month% == 07 (
     echo updatevariable "CurrMth" "%svr%" "" "" "Jun"; >> %upd_var_file%
     echo updatevariable "PrevMth" "%svr%" "" "" "May"; >> %upd_var_file%
     echo updatevariable "CurrQtr" "%svr%" "" "" "Q2"; >> %upd_var_file%
    )
    if %month% == 08 (
     echo updatevariable "CurrMth" "%svr%" "" "" "Jul"; >> %upd_var_file%
     echo updatevariable "PrevMth" "%svr%" "" "" "Jun"; >> %upd_var_file%
     echo updatevariable "CurrQtr" "%svr%" "" "" "Q3"; >> %upd_var_file%
    )
    if %month% == 09 (
     echo updatevariable "CurrMth" "%svr%" "" "" "Aug"; >> %upd_var_file%
     echo updatevariable "PrevMth" "%svr%" "" "" "Jul"; >> %upd_var_file%
     echo updatevariable "CurrQtr" "%svr%" "" "" "Q3"; >> %upd_var_file%
    )
    if %month% == 10 (
     echo updatevariable "CurrMth" "%svr%" "" "" "Sep"; >> %upd_var_file%
     echo updatevariable "PrevMth" "%svr%" "" "" "Aug"; >> %upd_var_file%
     echo updatevariable "CurrQtr" "%svr%" "" "" "Q3"; >> %upd_var_file%
    )
    if %month% == 11 (
     echo updatevariable "CurrMth" "%svr%" "" "" "Oct"; >> %upd_var_file%
     echo updatevariable "PrevMth" "%svr%" "" "" "Sep"; >> %upd_var_file%
     echo updatevariable "CurrQtr" "%svr%" "" "" "Q4"; >> %upd_var_file%
    )
    if %month% == 12 (
     echo updatevariable "CurrMth" "%svr%" "" "" "Nov"; >> %upd_var_file%
     echo updatevariable "PrevMth" "%svr%" "" "" "Oct"; >> %upd_var_file%
     echo updatevariable "CurrQtr" "%svr%" "" "" "Q4"; >> %upd_var_file%
    )
    echo. >> %upd_var_file%
    echo exit; >> %upd_var_file%
    :: run temp txt to upd subs vars
    esscmd %upd_var_file%
    echo rem --- update subs vars end --- >> %logfile%
    date/t >> %logfile%
    time/t >> %logfile%
    :: end all
    echo rem --- end all --- >> %logfile%
    date/t >> %logfile%
    time/t >> %logfile%
    : end_all
    endlocal
    ================================================

  • Write privileges on Essbase outline, substitution and calc script variables

    Hi,
    What exact privileges need to be provisioned to the user/group so they have write access to the outline, substitution variables and calc scripts, without provisioning them as Database Manager?
    Thanks.

    To view the content of a calc script, you'll have to edit it, and editing needs Database Manager privileges.
    I don't think you can achieve what you are looking for out of the box. So how about writing a bat file which copies the calc scripts to a shared folder where you can give the users read access so they can view the calc scripts? (They can use a text editor to view the script contents.)
    Regards
    Celvin
    http://www.orahyplabs.com
