AGG and CALC DIM Essbase script recently started growing our pag files

We have an Essbase calc script that does nothing but AGG and CALC DIM commands. It ran fine for months and did not grow our Workforce cube, but starting in late January it began growing the pag files. The Workforce cube was 7 GB in Dec 2010 and has grown to 10 GB today. When I tested it, the second run grew the pag files by 170 MB and the third run by another 70 MB. Has anyone seen this?

Thanks a million Cameron.
1) I do dense restructures every night - apparently that does not remove all fragmentation.
Last questions:
2) I exported level zero, cleared all data, then imported the level zero data. That should clear up all fragmentation, shouldn't it?
3) After importing level zero data, I ran a simple CALC DIM calc script on the Accounts dim only, on this Workforce BSO cube that is only 400 MB. It took over 30 mins. On my second and third runs of the same calc script, it took 9 mins, and my BSO cube grew a few MB. Can I assume that the blocks were built by the first run and that all subsequent runs will stay around 9 mins since the blocks now exist?
Here is the calc script:
SET CACHE HIGH;
SET UPDATECALC OFF;
SET CLEARUPDATESTATUS OFF;
SET LOCKBLOCK HIGH;
SET AGGMISSG ON;
SET CALCPARALLEL 3;
FIX (febscenario,Working)
  FIX(@RELATIVE(TTC,0),@RELATIVE(TCI,0),@LEVMBRS("Project",0),@RELATIVE("Total Employees",0))
    FIX(FY11, FY12, "Jan":"Dec")
      FIX("HSP_InputValue","Local","USD")
        CALC DIM ("Account");
        CALC TWOPASS;
      ENDFIX
    ENDFIX /* &YearNext */
  ENDFIX
ENDFIX
4) When I calc only FY11, it takes 3 seconds on the first through fourth runs of the above calc. However, when I calc FY12, it takes over 30 mins on the first calc and 9 mins subsequently. Why is that? Should I use SET CALCONMISSINGBLK in my calc script? (See the sketch below.)
5) I am running the calc as the Essbase admin user. The level zero text file I loaded is only 460 MB, and after the calc the BSO cube's pag files are only 420 MB. We are thinking of calc'ing older scenarios for historical purposes, but I am not sure whether that will degrade calc performance. My experience has been that increasing the size of the BSO cube by calc'ing degrades future calc times. Is that your experience?
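
For reference, here is a minimal sketch of the same calc driven by the &YearNext substitution variable mentioned in the ENDFIX comment, so the FY11 and FY12 runs in question 4 can share one script. It assumes &YearNext is defined on the application; it is an illustration, not the script actually in production.
SET CACHE HIGH;
SET UPDATECALC OFF;
SET CLEARUPDATESTATUS OFF;
SET LOCKBLOCK HIGH;
SET AGGMISSG ON;
SET CALCPARALLEL 3;
FIX (febscenario,Working)
  FIX(@RELATIVE(TTC,0),@RELATIVE(TCI,0),@LEVMBRS("Project",0),@RELATIVE("Total Employees",0))
    FIX(&YearNext, "Jan":"Dec")   /* substitution variable instead of hard-coded FY11/FY12 */
      FIX("HSP_InputValue","Local","USD")
        CALC DIM ("Account");
        CALC TWOPASS;
      ENDFIX
    ENDFIX /* &YearNext */
  ENDFIX
ENDFIX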

Similar Messages

  • Which is the optimal one among AGG and CALC DIM?

    Hi friends,
    I know the difference between AGG and CALC DIM, but I'm not sure which one is optimal. Is there any specific situation where we should use one over the other?
    Thanks

    Hi John,
    I am responding to this post after such a long period, but I found that it is perfectly relevant to the situation I am facing.
    We have been using the CALC DIM function for aggregation. Recently we replaced it with AGG, as I read that AGG is more optimal when you are aggregating sparse dimensions with fewer than 6 levels, and all the dimensions we are aggregating have fewer than 6 levels. But after using the AGG command my database size increased dramatically, from 53 GB to 90 GB. This was the only change I made in the process. Then I cleared all data, imported level 0, and did the aggregation using CALC DIM again, and the database size is back to normal.
    I am not able to understand why this happened in my case. Even the rules were taking more time to run compared to CALC DIM.
    If you can, please throw some light on this.
    Regards
    Vikas

  • What is wrong with this? AGG on sparse and Calc Dim on Dense?

    I have this....
    FIX(@IDESC(Sparse parent member))
    CALC DIM(Dense,Dense,Dense);
    Agg(Sparse,Sparse);
    ENDFIX
    No formulas in the outline - I just followed the simple performance-improvement technique, since, as they say, AGG is faster on sparse. Syntactically it is absolutely right, but the result is not right.
    What's the catch? Do I need to attend some basic classes? :-P

    Thanks Cameron and Tim for your reply!
    Perhaps I should have listed more details, my bad. Anyway, my idea was to find out whether the two fixes below produce the same result:
    Fix(Sparse)
    Calc dim(Dense,Dense,Dense,Sparse,Sparse);
    Endfix
    Fix(Sparse)
    Calc dim(Dense,Dense,Dense);
    Agg(Sparse,Sparse);
    Endfix
    Coming to my earlier mentioned FIX: it is the concluding FIX of my allocation script. I have two children of the Version dim - one is used for allocation and the other to hold adjustments. So in that FIX I am just aggregating all dimensions for both Version members (allocation + adjustment).

  • HT2518 I have tried 3 times to migrate my info from my PC to my Mac; it gets as far as starting to transfer the music files and then stops. Any ideas? Thanks

    Hello, I just got my MacBook Pro but can't migrate my PC info to the Mac. I've tried three times; it gets as far as trying to transfer music and then stops. Any ideas? Thanks.

    As an addendum - is there some way to "restart" a stalled restore?
    When I went to bed last night my picture download was in one position, and it was in the same position when I woke up. I am on the edge of cell service, and I have NO cell service on this device, as the phone service has been transferred to an iPhone 6.
    My phone is sitting 10-15 feet away from my wifi access point. There are approximately 1200 pictures, so the restore takes an inordinate amount of time - much more than the few hours stated.

  • Essbase Defrag and Calc Issue

    I have a calc script that runs against a BSO cube with 11 dimensions, and it runs for 15 mins. The script does some calculations and rollups (AGG and CALC DIM etc.). I can see the blocks go from 15k blocks to 500k blocks at the end.
    And then when I re-run the script, it runs for over 2 hours, and of course that is due to the block explosion. So I do a defrag and clear upper blocks by doing a:
    CLEARBLOCK UPPER;
    CLEARBLOCK EMPTY;
    After this the number of blocks goes way down to 25k blocks.
    And still the calc continues to run over 2 hours. So my question is: why this behavior? Is there anything I can do besides reloading the cube from scratch? I know if I do a complete clear it obviously runs with the original timing again, but I want to avoid having to do that. Is there a step I am missing?

    Here are some tips:
    - please review "intelligent calculation" - this feature helps when only a few blocks have changed, by limiting the recalculation to the affected blocks (sketched below)
    - your index cache may be too low - the index has grown with the number of blocks beyond your index cache size -> more disk IO than your setup can handle
    - overall design needs tuning (block size / density / dynamic calculations / ...) -> contact a consultant
    - what is the business case for 10 min calc time limit? Will users wait for calculation to finish?
    Sascha.
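
    As a rough illustration of the first tip, here is a minimal sketch of the intelligent-calculation settings; the member and dimension names are placeholders, not from the original post. Note the documented caveat that combining FIX with SET CLEARUPDATESTATUS AFTER can mark blocks clean even though only part of the database was calculated.
    SET UPDATECALC ON;            /* intelligent calculation: recalculate only blocks marked dirty */
    SET CLEARUPDATESTATUS AFTER;  /* mark the calculated blocks clean once the calc finishes */
    FIX("Actual","Working")       /* placeholder members */
        CALC DIM("Account");
        AGG("Entity","Product");  /* placeholder sparse dimensions */
    ENDFIX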

  • Calc Dim vs IDESCENDANTS performance

    I am trying to aggregate (LegalEntity, LegalEntityICP, EntityICP).
    script1:
    FIX(&vYear,&vScenario, &vVersion, ......)
        CALC DIM(LegalEntity, LegalEntityICP, EntityICP);
    ENDFIX
    Execute time: 1201sec
    I execute the same slice in script2:
    FIX(&vYear,&vScenario, &vVersion, ......)
        FIX(@IDESCENDANTS(LegalEntity),
            @IDESCENDANTS(LegalEntityICP))
            @IDESCENDANTS("EntityICP",-1);
        ENDFIX
        FIX(@IDESCENDANTS(LegalEntity),
            @IDESCENDANTS(EntityICP))
            @IDESCENDANTS("LegalEntityICP",-1);
        ENDFIX
        FIX(@IDESCENDANTS(LegalEntityICP),
            @IDESCENDANTS(EntityICP))
            @IDESCENDANTS("LegalEntity",-1);
        ENDFIX
    ENDFIX
    Execute time: 623sec
    Why does CALC DIM take so long? When I use AGG, the execute time is more than 1800 sec!
    I tried playing with the dimension order - I moved LegalEntity, LegalEntityICP, EntityICP to the top (just under the dense dimensions) - but the result is still the same.
    But when I reduce the aggregation slice, it works consistently fast:
    FIX(&vYear,&vScenario, &vVersion, ......, @IDESCENDANTS(LegalEntityICP))
        CALC DIM(LegalEntity, EntityICP);
    ENDFIX
    Execute time: 117sec
    FIX(&vYear,&vScenario, &vVersion, ......, @IDESCENDANTS(LegalEntityICP))
        AGG(LegalEntity, EntityICP);
    ENDFIX
    Execute time: 117sec
    FIX(&vYear,&vScenario, &vVersion, ......, @IDESCENDANTS(LegalEntityICP))
        FIX(@IDESCENDANTS(LegalEntity))
            @IDESCENDANTS("EntityICP",-1);
        ENDFIX
        FIX(@IDESCENDANTS(EntityICP))
            @IDESCENDANTS("LegalEntity",-1);
        ENDFIX
    ENDFIX
    Execute time: 116sec

    Wow, you are covering a bunch of different issues.
    Here's the issue -- the number of blocks the code touches. The more it has to address, the slower it goes. There's no way round that.
    There are a few things you can do:
    1) Introduce parallel calcs (I am assuming a calc script, not a user-run Business Rule). See the [Tech Ref|http://download.oracle.com/docs/cd/E12825_01/epm.111/esb_techref/set_calcparallel.htm] for more information. The DBAG also has information on this -- you should understand the theory behind it before you go full speed on it.
    2) Adjust your data and index cache (this one is only going to help a little). See the DBAG for more information on this. Do not dedicate your life to this one -- there are some quick hits that can help; going beyond will be an exercise of increasing effort for decreasing results.
    3) Make sure you FIX on level zero of all of your other dimensions.
    4) Think about what's in your block. Accounts and Time don't have to be the only ones. Do you know how densely populated your other dimensions are? Search this board for techniques to figure that out. The denser the dimension, the more likely it is to be in the block. Block size becomes a lot more plastic if you're on 64-bit Essbase.
    5) Btw, you have the right idea when it comes to limiting the scope of the calc to get better times. I use that approach for different purposes in Planning Business Rules as shown in my blog.
    Okay, enough of the general comments, I'll try your questions in order:
    FIX(&vYear,&vScenario, &vVersion, ......)
        CALC DIM(LegalEntity, LegalEntityICP, EntityICP);
    ENDFIX
    Execute time: 1201sec
    ^^^You're doing a full calculation of all dimensions. I'm a little surprised AGG is slower than CALC DIM, as the times are usually reversed, but I wonder if you did a restructure between runs -- my guess is not, so the AGG was maybe running on a fragmented db?
    FIX(&vYear,&vScenario, &vVersion, ......)
        FIX(@IDESCENDANTS(LegalEntity),
            @IDESCENDANTS(LegalEntityICP))
            @IDESCENDANTS("EntityICP",-1);
        ENDFIX
        FIX(@IDESCENDANTS(LegalEntity),
            @IDESCENDANTS(EntityICP))
            @IDESCENDANTS("LegalEntityICP",-1);
        ENDFIX
        FIX(@IDESCENDANTS(LegalEntityICP),
            @IDESCENDANTS(EntityICP))
            @IDESCENDANTS("LegalEntity",-1);
        ENDFIX
    ENDFIX
    Execute time: 623sec
    ^^^So it's faster, but of course you're not calculating level 0. Why? How do you even have any values at level 1? From the above AGG and CALC DIM tests, I would guess. In any case, you're doing three passes through the database, and these calcs are not going to work for you in the real world.
    FIX(&vYear,&vScenario, &vVersion, ......, @IDESCENDANTS(LegalEntityICP))
        CALC DIM(LegalEntity, EntityICP);
    ENDFIX
    Execute time: 117sec
    FIX(&vYear,&vScenario, &vVersion, ......, @IDESCENDANTS(LegalEntityICP))
        AGG(LegalEntity, EntityICP);
    ENDFIX
    Execute time: 117sec
    ^^^I'm not sure why you repeated yourself, but you know your code is not calculating LegalEntityICP.
    FIX(&vYear,&vScenario, &vVersion, ......, @IDESCENDANTS(LegalEntityICP))
        FIX(@IDESCENDANTS(LegalEntity))
            @IDESCENDANTS("EntityICP",-1);
        ENDFIX
        FIX(@IDESCENDANTS(EntityICP))
            @IDESCENDANTS("LegalEntity",-1);
        ENDFIX
    ENDFIX
    Execute time: 116sec
    ^^^So now you're taking a different tack, and still not calculating LegalEntityICP, which has to be calculated sooner or later.
    You've proved that the fewer blocks your code addresses, the faster it runs.
    However, since you need to calculate this data sooner or later, I'd go with one of the general recommendations (and I am sure others will chime in, too) and start working with those approaches.
    Good luck!
    Regards,
    Cameron Lackpour
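
    Pulling those recommendations together, a rough sketch: the three LegalEntity dimensions are from the thread, while the parallel settings and the level-0 FIX member ("Product") are placeholders for whatever the real outline contains, not the poster's actual code.
    SET CALCPARALLEL 4;          /* point 1: parallel calculation - tune to the server */
    SET CALCTASKDIMS 2;
    FIX(&vYear, &vScenario, &vVersion,
        @LEVMBRS("Product",0))   /* point 3: restrict the other sparse dimensions to level 0 */
        AGG(LegalEntity, LegalEntityICP, EntityICP);
    ENDFIX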

  • CALC DIM or AGG - Can we aggregate from higher levels?

    Hello everyone,
    We have a custom dimension called LOCATION, under which we have the continents ASIA and EUROPE. Under ASIA and EUROPE we again have countries (India & England), under which there are states. Now data may be entered at the continent level or the country level. If we do a normal AGG or CALC DIM operation, the higher-level data will get overwritten, since aggregation is done bottom-up from the level-zero members.
    Can we aggregate from a higher level (say country or continent) in the dimension up to the LOCATION level (i.e. override the default level-zero-up aggregation)?
    PS: Creating a dummy member would not be an elegant solution, since I have presented the problem in a much simplified way; there are many levels of hierarchy, and a dummy would be required under each hierarchy level.
    Any suggestion would be of great help.
    Thanks..
    Sayantan

    I know you are against creating dummy members but if you don't there is no way to know for sure you are aggregating correctly. Most people want to see the dummy members just so they know why the parent doesn't equal the sum of all the children.
    We have a million+ member Customer dim, and quite a few of them are loaded at the parent level. Our parent/child hierarchy (not the dim) gets built from the fact table plus the dim. So if the ETL finds a record in the fact table that has children in the dim table, it creates the dummy member to load the data to. This has worked well for us. Good luck.

  • CALC DIM Issue

    I have an Essbase (v11.1.1.3) calc script that I can't seem to get to do what I want.
    Here is my script:
    SET UPDATECALC OFF;
    SET UPTOLOCAL OFF;
    SET AGGMISSG ON;
    SET FRMLBOTTOMUP OFF;
    SET CALCPARALLEL 3;
    SET CALCTASKDIMS 2;
    FIX ("Customer1","May Scenario","Working","FY11","May-MTD",@DESCENDANTS ("Total Product Revenue"))
    CALC DIM ("Regions","Projects","Product Lines","Countries","Companies","Currencies");
    ENDFIX
    I have data loaded at level 0. Total Product Revenue is in my Accounts dim, and the sub-accounts (level 0 members) under it are not rolling up even though all the consolidation operators are set to +. The reason I am doing the @DESCENDANTS("Total Product Revenue") is that with this calc script I just want to aggregate Total Product Revenue and the accounts under it, not the other members in the Accounts dimension.
    If I have missed anything please let me know.
    Thanks,
    Ken

    One other note: the order in which you do things can also depend on what is dense and what is sparse. For example, if your Accounts dimension is dense, you would want to FIX on level-zero members of the other dimensions and do the AGG first; then do the FIX on the members of the dimensions, including Accounts, and CALC DIM the other dimensions. It is more efficient because you are only doing the calculation on the bottom level of the dense dimension and then calculating the sparse dimensions (one common variant of this two-pass pattern is sketched below).
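
    For illustration, one common variant of that two-pass pattern, using the member names from Ken's script. Treating Accounts as dense, the listed dimensions as sparse, and "Account" as the dimension name are assumptions, not something stated in the thread.
    /* Pass 1: aggregate the sparse dimensions at level 0 of the dense Accounts dimension */
    FIX("Customer1","May Scenario","Working","FY11","May-MTD",
        @LEVMBRS("Account",0))
        AGG("Regions","Projects","Product Lines","Countries","Companies","Currencies");
    ENDFIX
    /* Pass 2: roll up the Accounts branch of interest (@IDESCENDANTS includes Total Product Revenue itself) */
    FIX("Customer1","May Scenario","Working","FY11","May-MTD",
        @IDESCENDANTS("Total Product Revenue"))
        CALC DIM("Account");
    ENDFIX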

  • Updating Start, Finish, % Complete from an Excel file to MS Project

    Hi - this is my first post on this forum, which I see is a great way to share knowledge.
    I have a schedule of around 3000 lines, with resource names and a Responsible Person (whom the individual resources report to) maintained on the schedule.
    I need to work out a good way to get schedule updates from the different parties in an efficient way.
    My thought is to build a solution that generates the tasks which need to be updated based on the status date, creates an individual Excel file for each responsible person, gets their updates back in Excel, and updates the dates and % complete back into the MS Project schedule.
    The steps:
    a) Create a filter to list all the tasks which need to be updated based on the status date (tasks which are in progress, tasks in the past which have not started yet, tasks in the past which have finished, etc.), grouped per responsible person, and create a separate Excel file for each responsible person.
    b) The Excel file will contain (UID, task name, start date, finish date, % complete, resource name, responsible person).
    c) I would like to email these Excel files to each responsible person and get their updates for (start date, finish date, % complete).
    d) Create a VBA macro to let the user select the updated Excel file and update the data back into the MS Project file.
    Conditions while updating:
    a) Read the Excel file from top to bottom, find the correct record based on the UID, and update the duration in order to change the dates as below.
    b) If the task has started (the Excel file contains a % complete and a new start date later than the MS Project start date): update the MS Project file (Actual Start date and % Complete).
    c) If the task has started as scheduled (the Excel file contains a % complete and the Excel start date equals the MS Project start date): update the MS Project file (Actual Start date and % Complete).
    d) If the task has started and finished as scheduled (the Excel start and finish dates equal the MS Project dates but the Excel % complete is now 100%): update the MS Project file (Actual Start, Actual Finish, and % Complete = 100%).
    e) If the start date of the task has been rescheduled to a future date (the Excel start date is later than the MS Project start date and the Excel % complete is 0): update the MS Project file and insert a lag to match the Excel start date.
    I would like to know whether this would be a feasible solution, and if some of you have implemented this kind of thing, please share some code snippets.
    Thanks a lot

    Hi John, I've managed to progress further on the code: it splits the records per supervisor, creates an Excel file, and emails it to the supervisor.
    However, I ran into 2 problems.
    1) The code works sometimes without any error and sometimes it stalls. I have managed to narrow it down to the sorting code block in the ExportTaskstoBeUpdated method, where it selects the active workbook that has all the data based on the filter and sorts it by Supervisor Name + Resource Name. (I had marked the failing code bold and italic.)
    The error I'm getting is:
    Run-time error '1004'
    Method 'Range' of object '_Global' failed
    2) Currently I'm creating the master Excel file generated by the ExportTaskstoBeUpdated method, splitting it, and saving all the splits plus the master Excel file to a hard-coded location, which is the C:\GESProjectEmail folder.
    When saving, it prompts to overwrite the existing files; I don't want any prompts shown to the user.
    I've tried using Application.DisplayAlerts = False, but it does not work for me.
    In order to overcome this, I created a routine to delete the files after they have been emailed/saved as draft, but that too fails to delete the original master Excel file.
    Is there a way to generate the Excel files on the fly and attach them to the email without saving them?
    Or I would like to know your thoughts on a better way to handle this.
    Sub ExportTaskstoBeUpdated()
    'Start Excel and create a new workbook
    'Create column titles
    'Export data and the project title
    'Tidy up
    Dim xlApp As Excel.Application
    Dim xlRange As Excel.Range
    Dim Dept As Task
    Dim Check As String
    Dim Prime As String
    Dim PrimeEmail As String
    Dim OpeningParen, ClosingParen As Integer
    'Start Excel and create a new workbook
    Set xlApp = CreateObject("Excel.Application")
    xlApp.Visible = True
    xlApp.Workbooks.Add
    'Create column titles
    Set xlRange = xlApp.Range("A1")
    With xlRange
    '.Formula = "Master Schedule Report"
    .Font.Bold = True
    .Font.Size = 12
    .Select
    End With
    xlRange.Range("A1") = "Supervisor"
    xlRange.Range("B1") = "Resource Name"
    xlRange.Range("C1") = "UID"
    xlRange.Range("D1") = "Task Name"
    xlRange.Range("E1") = "Start Date"
    xlRange.Range("F1") = "Finish Date"
    xlRange.Range("G1") = "% Completed"
    xlRange.Range("H1") = "Project"
    xlRange.Range("I1") = "Supervisor Email"
    With xlRange.Range("A1:N1")
    .Font.Bold = True
    .HorizontalAlignment = xlHAlignCenter
    .VerticalAlignment = xlVAlignCenter
    .EntireColumn.AutoFit
    .Select
    End With
    'Export data and the project title
    Set xlRange = xlRange.Range("A2") 'Set cursor to the right spot in the worksheet
    ViewApply Name:="Task Sheet" 'Get the view that has the Text11 column to filter on
    OutlineShowAllTasks 'Any hidden tasks won't be selected, so be sure all tasks are showing
    FilterApply Name:=" Telstra - CHECK 5 - Unstatused Tasks" 'This custom filter selects "External"
    SelectTaskColumn ("Text2") 'Insures the For Each loop gets all of the filtered tasks, this may be redundant
    For Each Dept In ActiveSelection.Tasks 'Pulls data for each task into spreadsheet
    If Dept.Text4 <> "" Then ' If there is no Supervisor defined ignore the Task
    With xlRange
    .Range("A1") = Dept.Text4 ' Supervisor Name, which has a Lookup , where the description on the lookup is the Supervisor Email
    .Range("B1") = Dept.ResourceNames
    .Range("C1") = Dept.Text1
    .Range("D1") = Dept.Name
    .Range("E1") = Format(Dept.Start, "dd-mmm-yyyy")
    .Range("F1") = Format(Dept.Finish, "dd-mmm-yyyy")
    .Range("G1") = Dept.PercentComplete
    .Range("H1") = ActiveProject.Name
    'This below Code Developed by John Finds the lookup description value for a custom field value
    If Dept.Text4 <> "" Then 'This is not required now since its captured above
    Dim Found As Boolean
    Dim i As Integer, NumSup As Integer
    NumSup = ActiveProject.Resources.Count
    'Once you have that set up, now you can add this code to your macro to determine the value for the "I1" range.
    On Error Resume Next
    'cycle through each item in the value list to find the one selected for this task
    For i = 1 To NumSup
    'once the item is found exit the loop
    If CustomFieldValueListGetItem(pjCustomTaskText4, pjValueListValue, i) = _
    Dept.Text4 Then
    'the loop can exit for two reasons, one, the item has been found, two,
    ' the item was not found and an error occurred. We need to distinguish between the two
    If Err = 0 Then Found = True
    Exit For
    End If
    Next
    On Error GoTo 0 'this resets the err function
    'now that the corresponding email address is found, set the Excel range value
    If Found Then
    .Range("I1") = CustomFieldValueListGetItem(pjCustomTaskText4, pjValueListDescription, i)
    Else
    .Range("I1") = "No Email Defined"
    End If
    End If
    '=====================
    End With
    Set xlRange = xlRange.Offset(1, 0) 'Point to next row
    Else
    End If
    Next Dept
    'Tidy up
    FilterApply Name:="All Tasks"
    ViewApply Name:="Task Sheet"
    With xlRange
    .Range("A1").ColumnWidth = 30
    .Range("D1").ColumnWidth = 50
    .Range("E1").ColumnWidth = 20
    .Range("F1").ColumnWidth = 20
    .Range("G1").ColumnWidth = 20
    .Range("H1").ColumnWidth = 30
    End With
    'Sort the Active work book for Supervisor Name + Resource Name
    Range("A1:I1").Select
    ActiveWorkbook.Worksheets("Sheet1").Sort.SortFields.Add Key:=Range("A2:A500") _
    , SortOn:=xlSortOnValues, Order:=xlAscending, DataOption:=xlSortNormal
    ActiveWorkbook.Worksheets("Sheet1").Sort.SortFields.Add Key:=Range("B2:B500") _
    , SortOn:=xlSortOnValues, Order:=xlAscending, DataOption:=xlSortNormal
    With ActiveWorkbook.Worksheets("Sheet1").Sort
    .SetRange Range("A1:I500")
    .Header = xlYes
    .MatchCase = False
    .Orientation = xlTopToBottom
    .SortMethod = xlPinYin
    .Apply
    End With
    Set xlApp = Nothing
    'Call the method to split into separate files and email them
    SplitIntoSeparateFiles
    'Call the method to delete the Excel files generated by the above method
    Deletefiles
    End Sub
    Sub SplitIntoSeparateFiles()
    '* This method Split the Master excel file which is sorted by Supervisor Name + Resource Name
    Dim OutBook, MyWorkbook As Workbook
    Dim DataSheet As Worksheet, OutSheet As Worksheet
    Dim FilterRange As Range
    Dim UniqueNames As New Collection
    Dim LastRow As Long, LastCol As Long, _
    NameCol As Long, Index As Long
    Dim OutName, MasterOutName, SupervisorEmail As String
    'set references and variables up-front for ease-of-use
    'the current workbook is the one with the primary data, more workbooks will be created later
    Set MyWorkbook = ActiveWorkbook
    Set DataSheet = ActiveSheet
    NameCol = 1 ' This is supervisor Name which will be splitted
    LastRow = DataSheet.Cells.Find("*", SearchOrder:=xlByRows, SearchDirection:=xlPrevious).Row
    LastCol = DataSheet.Cells.Find("*", SearchOrder:=xlByColumns, SearchDirection:=xlPrevious).Column
    Set FilterRange = Range(DataSheet.Cells(1, NameCol), DataSheet.Cells(LastRow, LastCol))
    'loop through the name column and store unique names in a collection
    For Index = 2 To LastRow
    On Error Resume Next
    UniqueNames.Add Item:=DataSheet.Cells(Index, NameCol), Key:=DataSheet.Cells(Index, NameCol)
    On Error GoTo 0
    Next Index
    'iterate through the unique names collection, writing
    'to new workbooks and saving as the group name .xls
    Application.DisplayAlerts = False
    For Index = 1 To UniqueNames.Count
    Set OutBook = Workbooks.Add
    Set OutSheet = OutBook.Sheets(1)
    With FilterRange
    .AutoFilter Field:=NameCol, Criteria1:=UniqueNames(Index)
    .SpecialCells(xlCellTypeVisible).Copy OutSheet.Range("A1")
    End With
    OutName = "C:\GESProjectEmail" + "\" 'Path to Save the generated file
    SupervisorEmail = OutSheet.Range("I2") 'Supervisor Email
    MasterOutName = OutName & "Test" ' This is the First excel file generated by the ExportTasksto Be Updated Method
    OutName = OutName & UniqueNames(Index) & Trim(I2)
    Application.DisplayAlerts = False
    OutBook.SaveAs FileName:=OutName, FileFormat:=xlExcel8 'Create Excel files for each splitted files and save
    'Call Send Email method
    Send_Email_Current_Workbook (SupervisorEmail)
    OutBook.Close SaveChanges:=False
    Call ClearAllFilters(DataSheet)
    Next Index
    Application.DisplayAlerts = False
    ActiveWorkbook.SaveAs FileName:=MasterOutName, FileFormat:=xlExcel8
    ActiveWorkbook.Close SaveChanges:=False
    Application.DisplayAlerts = True
    End Sub
    Sub Send_Email_Current_Workbook(Email As String)
    Dim OutApp As Object
    Dim OutMail As Object
    Dim rng As Range
    Set OutApp = CreateObject("Outlook.Application")
    OutApp.Session.Logon
    Set OutMail = OutApp.CreateItem(0)
    On Error Resume Next
    With OutMail
    .To = Email
    .CC = ""
    .BCC = ""
    .Subject = "Project Status Update"
    .Body = "Please Update the attached file and email it back to the PM"
    .Attachments.Add ActiveWorkbook.FullName
    .Save
    End With
    On Error GoTo 0
    Set OutMail = Nothing
    Set OutApp = Nothing
    End Sub
    'safely clear all the filters on data sheet
    Sub ClearAllFilters(TargetSheet As Worksheet)
    With TargetSheet
    TargetSheet.AutoFilterMode = False
    If .FilterMode Then
    .ShowAllData
    End If
    End With
    End Sub
    Sub Deletefiles()
    ' This method deletes the Excel files once saved, to avoid the "Do you want to overwrite" prompt
    ' because Application.DisplayAlerts = False command still prompts the user to save
    On Error Resume Next
    Kill "C:\GESProjectEmail\*.xl*"
    On Error GoTo 0
    End Sub

  • Admin Server can't start after relocating the page file

    Hi all,
    I've recently installed a new hard disk in my PC at work and thought it was a good idea to put the page file there (don't use the same disk for the page file and the OS).
    However, probably after that change (I hadn't fired up my server just before the hard disk installation), WebLogic can't start up. I'm getting:
    Starting WLS with line:
    S:\bea\JDK160~1\bin\java -client   -Xms256m -Xmx1024m -XX:CompileThreshold=8000 -XX:PermSize=128m  -XX:MaxPermSize=512m  -Xverify:none  -da
    -Dplatform.home=S:\bea\WLSERV~1.3 -Dwls.home=S:\bea\WLSERV~1.3\server -Dweblogic.home=S:\bea\WLSERV~1.3\server  
    -Dweblogic.management.discover=true  -Dwlw.iterativeDev= -Dwlw.testConsole= -Dwlw.logErrorsToConsole=
    -Dweblogic.ext.dirs=S:\bea\patch_wls1030\profiles\default\sysext_manifest_classpath;S:\bea\patch_cie660\profiles\default\sysext_manifest_classpath
    -Djava.security.auth.login.config=P:/myproject/setup/jaas/jaas.config -Dweblogic.Name=AdminServer
    -Djava.security.policy=S:\bea\WLSERV~1.3\server\lib\weblogic.policy   weblogic.Server
    Error occurred during initialization of VM
    Could not reserve enough space for object heap
    Could not create the Java virtual machine.
    OS: Windows XP SP3 32-bit
    WebLogic Version: 10.3
    Has this happened to anyone else? How do I resolve this?

    Hi,
    The manually created WorkManager overwrites the built-in WorkManager from WLS, but this new one isn't targeted to the AdminServer. When the AdminServer tries to access the 'default' WorkManager, which isn't targeted to it, the DeploymentException error appears.
    Solution
    Comment out the default WorkManager and target the WebLogic WorkManager to the Admin Server.
    Regards,
    Kal

  • Power failure and lost Pages file

    The power on my laptop suddenly went off, and I had not saved a copy of a new Pages file I was working on. It did not appear in recent files or anywhere else when I restarted and opened Pages. Are temp files stored anywhere that I might look to recover the work I've done?

    Mark,
    No, unfortunately.
    You can set Pages to save a copy of any +previous version+ in your Preferences, however this wouldn't help in this case.
    Download (free):
    http://www.macupdate.com/info.php/id/29454/eversave
    To autosave everything you wish in your Mac.
    Best is to use Cmd-S regularly as you work; then you choose at what point it is saved.
    Peter

  • How to speed-up CALC Dim and/or AGG Dim

    Hello Everyone,
    I am new to Essbase and apologize for such a generic query. I came across a calculation script that takes more than 10 hours to execute. The crux of the script is CALC DIM (DIM), where DIM is a dense dimension with 11 levels (and a lot of calculated members). Can anyone guide me on the approach to take to optimize the script? I was fiddling around with the CALCCACHEHIGH parameter in the essbase.cfg file and the SET CACHE HIGH declaration in the script (see the sketch after this post). Will that help?
    Some details of the original script are outlined below. The basic optimization parameters are in place (for example, the dense dimensions are at the end of the FIX list, etc.).
    Thanks.
    Sayantan
    Script details:
    SET AGGMISSG ON;
    SET CREATENONMISSINGBLK ON;
    SET UPDATECALC OFF;
    SET FRMLBOTTOMUP ON ;
    SET CACHE ALL;
    SET CALCPARALLEL 5;
    SET CALCTASKDIMS 2;
    SET LOCKBLOCK HIGH;
    FIX (&pln_scenario,"BU Budget",&plan1_yr,@RELATIVE("Product",0), @RELATIVE("YearTotal",0),... etc
    CALC DIM ("Account");

    Hi,
    Thanks for your suggestions. I will definitely try to implement them and post the results. Meanwhile, here's another script which should not take too long to run. However, this script runs for hours. Any suggestions would be great (this does not have the cache options implemented yet). I have added some more details about the dimensions involved.
    Outline of the script:
    /*Script Begins*/
    SET UPDATECALC OFF;
    SET CACHE ALL;
    SET CALCPARALLEL 5;
    SET CALCTASKDIMS 2;
    SET NOTICE LOW;
    SET MSG SUMMARY;
    FIX (@ATTRIBUTE("Existing"),@Relative("DC",0),@RELATIVE("PCPB",0),&pln_scenario,"BU Budget",&plan1_yr,@RELATIVE("YearTotal",0))
    FIX(@RELATIVE ("RM",0))
    SET CREATENONMISSINGBLK ON;
    "RMQ" = "DC wise Sales Volume"->"UBOM" * "RMQ per Unit SKU"->"All DC"->"YearAgg";
    ENDFIX;
    ENDFIX;
    /*Script Ends*/
    Dimension details (the evaluation order has been thought through):
    Dimension   Members   Density
    Account     352       Dense
    Period      35        Dense
    Version     3         Sparse
    Scenario    8         Sparse
    DC          7         Sparse
    Year        9         Sparse
    Entity      20        Sparse
    Product     416       Sparse
    BOM         938       Sparse

  • Calc Dim vs Agg

    To my understanding, when you want to aggregate a dense dimension you use CALC DIM in the calc script, and when you want to aggregate a sparse dimension you use the AGG statement.
    What is the difference between the two processes?
    Please advise

    Does the definition of AGG from the documentation not help?
    "The AGG command performs a limited set of high-speed consolidations. Although AGG is faster than the CALC commands when calculating sparse dimensions, it cannot calculate formulas; it can only perform aggregations based on the database structure. AGG aggregates a list of sparse dimensions based on the hierarchy defined in the database outline. If a member has a formula, it is ignored, and the result does not match the relationship defined by the database outline."
    Cheers
    John
    http://john-goodwin.blogspot.com/
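
    To make the distinction concrete, a minimal sketch; the dimension and member names are placeholders, and per the documentation quoted above, AGG only consolidates the outline hierarchy while CALC DIM also executes member formulas.
    FIX("Actual")                  /* placeholder scenario */
        AGG("Market","Product");   /* fast hierarchy-only rollup of sparse dimensions */
        CALC DIM("Accounts");      /* full calculation of the dense dimension, including formulas */
    ENDFIX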

  • I am receiving a lot of pop-up ads on website pages. It has only recently started happening. They open new windows with ads for online gambling, online dating and other things. They are annoying and I want them shut off. Please help me

    I have recently started seeing millions of pop-up ads on website pages. It is not something that normally happens. They open new windows with ads for online gambling, online dating among other things. I am growing sick and tired of these ads, and have tried multiple things. I have uninstalled and reinstalled software, I have blocked pop-up ads and unenabled Java script. I have tried everything I could find, but nothing has worked. I have even downloaded AdBlock and Clean Genius. Nothing has even remotely changed this annoying trait. I hope this can be a quick fix I can perform and not something that will have to be done by a tech-hand. Please, if you know anything about this issue, let me know how to fix it.
    Thanks.

    Unfortunately, you have installed malware by clicking on a pop up ad.
    Removal instructions here >>  The Safe Mac » Adware Removal Guide
    Uninstall CleanGenius >  Mac App uninstaller: How to uninstall the Macintosh applications with CleanGenius uninstaller on Mac OS X?
    Third-party maintenance utilities are not necessary on a Mac, can actually do more harm than good, and cannot remove malware.

  • Hyperion business rules and calc scripts

    Hi, can anyone differentiate HBR and calc scripts? What advantage does HBR have over calc scripts? Replies will be highly appreciated.

    Hi,
    There are many differences; you can easily get the answer by reading through the documentation.
    The major difference is the runtime prompt in HBR, which differentiates it from a calc script.
    However, I recently learned that you can put runtime prompts in calc scripts using VBA macros.
    Good luck.
