The HFM Consolidation Mystery

Hi,
I am looking to optimize the rules file that is being used for consolidation. I would like a clear understanding of how consolidation actually happens: the concept of subcubes, performance best practices, the use of virtual and physical memory, detailed consolidation log files, and so on. Do we have some documentation that throws light on these topics?
I have referred to the admin guide, and it seems to explain the logical steps involved during consolidation rather than the more technical details like memory utilization and the concept of subcubes. Any help would be greatly appreciated.
Cheers!!!
Lijoy

Kelly,
Thanks for the help. But training does not seem an option as of now.
Just a few questions.
Does the consolidation time include the time needed to read/write the data to the database? Is this value considerable compared to the total time taken for consolidation?
I have tried adding timers to the rule files to check the exact time taken to consolidate each POV. However, the total of the times shown by the timers is merely half of the total consolidation time. What other processes may be taking the extra time?
I would like to know where the remaining time is being used.
Regards,
Lijoy
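As a rough way to frame the timer question above: once the per-POV timer readings are collected, the unaccounted share of the run (database reads/writes, subcube loading and unloading, queuing, and so on) is simply the difference from wall-clock time. A minimal sketch, assuming a hypothetical mapping of POV labels to measured seconds (the labels and numbers here are illustrative, not from any real log):

```python
# Sketch: quantify the gap between summed per-POV rule timers and the
# total consolidation wall time. The input format is hypothetical --
# adapt it to whatever your timer statements in the rules emit.

def unaccounted_time(pov_timings, total_seconds):
    """pov_timings: dict mapping a POV label to seconds measured in rules."""
    rule_time = sum(pov_timings.values())
    gap = total_seconds - rule_time
    return {
        "rule_seconds": rule_time,
        "unaccounted_seconds": gap,  # DB reads/writes, subcube load, queuing
        "unaccounted_pct": 100.0 * gap / total_seconds,
    }

timings = {"Actual.2009.Jan.Entity1": 42.0, "Actual.2009.Jan.Entity2": 58.0}
report = unaccounted_time(timings, total_seconds=210.0)
print(round(report["unaccounted_pct"], 1))  # 52.4 -> roughly half, as observed
```

If the unaccounted share is consistently large, the detailed consolidation log and database wait statistics are the next places to look.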

Similar Messages

  • Can ODI do the HFM consolidation?

    I read the ODI adapter for HFM document, and it says it can do the HFM consolidation. Once I looked closer, it has the options consolidate_only and consolidate_after_load for when you load data into HFM. What we need is to consolidate HFM without loading data, and consolidate_after_load looks like load-then-consolidate to me, so it doesn't look like the solution I need. As for consolidate_only, it is explained as "data is consolidated but not loaded", which looks to me as if it consolidates the source data and not the data in HFM.
    Do I understand that correctly? Most importantly, is there any way to do a consolidate-only for HFM in ODI?

    CONSOLIDATE_ONLY
    Valid values: Yes and No
    If set to Yes, data is consolidated but not loaded.
    The data is consolidated only within HFM; it has nothing to do with the source.
    Thanks,
    Denis

  • Unable to Open the HFM application through Workspace

    Hi,
    I am getting the error below when I try to open the HFM application through Workspace.
    I have allowed all the IIS extensions (the Allow option is checked), but the error still occurs and I am unable to open my application through Workspace, although I can do everything in HFM via the client. The error is below:
    There was some communication error. Response is : http://localhost:19000/hfm/GlobalWorkspaceNav/bpm/modules/com/hyperion/hfm/web/appcontainer/Adf.asp <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
    <HTML><HEAD><TITLE>The page cannot be found</TITLE>
    <META HTTP-EQUIV="Content-Type" Content="text/html; charset=Windows-1252">
    <STYLE type="text/css">
    BODY { font: 8pt/12pt verdana }
    H1 { font: 13pt/15pt verdana }
    H2 { font: 8pt/12pt verdana }
    A:link { color: red }
    A:visited { color: maroon }
    </STYLE>
    </HEAD><BODY><TABLE width=500 border=0 cellspacing=10><TR><TD>
    The page cannot be found
    The page you are looking for might have been removed, had its name changed, or is temporarily unavailable.
    <hr>
    Please try the following:
    Make sure that the Web site address displayed in the address bar of your browser is spelled and formatted correctly.
    If you reached this page by clicking a link, contact
    the Web site administrator to alert them that the link is incorrectly formatted.
    Click the Back button to try another link.
    HTTP Error 404 - File or directory not found.
    Internet Information Services (IIS)
    <hr>
    Technical Information (for support personnel)
    Go to Microsoft Product Support Services and perform a title search for the words HTTP and 404.
    Open IIS Help, which is accessible in IIS Manager (inetmgr),
    and search for topics titled Web Site Setup, Common Administrative Tasks, and About Custom Error Messages.
    </TD></TR></TABLE></BODY></HTML>

    There can be many causes for this error:
    1) First of all, check that the redirection from Apache port 19000 to the HFM folder within IIS is actually working. Try hitting http://apacheserver:19000/hfm and make sure you get an "hfm" response back.
    2) Check that ASP is turned on in IIS ("file / directory not found" suggests that it isn't).
    3) Check what URL you used when you created the application, or when you registered it with Shared Services as the "Financial Management Web Server URL for Security". It should normally be http://apacheservername:19000/hfm. If you got this wrong, re-register it with Shared Services via Navigate->Administer->Classic Application Administration->Consolidation Administration.
    4) Check that the application was registered with Shared Services in the first place.
    5) Make sure that on the HFM web server, the HFM application server/cluster is properly registered.
    6) Stop and restart all services and test again.
    7) Try not to use "localhost"; use proper fully qualified domain names ("servername.mydomain.com") wherever possible. You will have fewer problems. The only place where you should use a non-qualified name is in Relational Content links; see the HFM readme.
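    Points 3 and 7 can be sanity-checked with a small script. This is a sketch under the assumptions in this thread (default port 19000 and the /hfm context path), not an official check:

```python
# Sketch: sanity-check an HFM web server URL against the advice above:
# it should use a fully qualified host name (not "localhost") and end
# in /hfm. Port 19000 is the default from this thread, not universal.
from urllib.parse import urlparse

def check_hfm_url(url):
    problems = []
    parsed = urlparse(url)
    host = parsed.hostname or ""
    if host in ("localhost", "127.0.0.1"):
        problems.append("uses localhost; prefer a fully qualified domain name")
    elif "." not in host:
        problems.append("host name is not fully qualified")
    if parsed.path.rstrip("/") != "/hfm":
        problems.append("path should be /hfm")
    return problems

print(check_hfm_url("http://localhost:19000/hfm"))
print(check_hfm_url("http://apacheserver.mydomain.com:19000/hfm"))  # []
```

    An empty list means the URL at least matches the conventions described above; it does not prove the redirection itself works, which still needs the browser test from point 1.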

  • HFM Consolidations using Task Flow Automation

    I have created a task flow that has 5 stages (each stage having a success and a fail link). The first stage is set to consolidate (Impacted) our Actuals scenario for 6 years for 3 entity trees. The 2nd stage consolidates a different scenario for 3 years for 3 trees, etc. The task flow seems to "stop" in Stage 1 after consolidating all 3 trees for a few years. If I look at View Taskflow Status, the task flow still shows as active; however, under Running Tasks nothing is shown as running, and under Task Audit it shows the last consolidation finished. I run into this same problem both when I schedule the task flow and when I run it manually. It consolidates a few years and then stops (still in the first stage). I looked in the Event log on the HFM server and don't see anything suspicious. Has anyone else had this problem?
    Also, one more question: when I go to automate the task flow, why does the server time stamp listed there not match the clock on the server? I'm remoted into the server and the clock on the task bar says 9am, but the Server Date shown on the Starting Event tab shows 8am.
    Thanks.
    Terri

    I just got back to looking at this today. I ran the task flow to consolidate 2004-2009 Actuals for 3 trees for all 12 months. All 3 trees were consolidated for 2004. Two trees were consolidated for 2005 and then it stopped. Both 2004 and 2005 are in the same stage of the task flow, so if the settings were correct for 2004, they should be fine for 2005. The task audit screen shows that it started consolidating the 2nd tree of 2005 at 8:32am. It then shows my id logging off at 8:33am, the consolidation finishing at 8:38am, and my id logging off again at 8:47am. The task flow is set up so that my id logs in for the 1st stage and the following stages are set to use Initiator, but the flow isn't getting to the 2nd stage because it dies.
    If I view the status of my task flow, it still shows as active. I don't understand why the task audit shows my id logging off, because I still have the same 2 HFM sessions running as I did when I initiated the task flow, unless it's the session that starts up to run the task flow that's getting logged off; if so, I don't know why. Would the auto-logoff after so many minutes of inactivity cause this? Maybe that session appears inactive even though it's waiting for the task flow to do its thing.
    1:07pm follow-up: after the task flow hung this morning, I manually stopped it. I then kicked off the same task flow again and it completed successfully about 4 hours later, so it doesn't seem to be due to the auto-logoff (timeout) setting.
    Terri
    Edited by: Terri T on Apr 20, 2009 10:07 AM

  • Standard HFM consolidation rules for Calc Manager (EPMA)

    Can someone tell me how to get the standard HFM consolidation rules in Calc Manager format? I know you can get these in VB script, but I have not seen how to get them in Calc Manager format. Thanks!

    This might help: http://docs.oracle.com/cd/E17236_01/epm.1112/cmgr_admin/ch04.html
    It is pretty comprehensive and has a lot of examples in it.
    Cheers,
    Mehmet

  • Email FR reports after HFM Consolidation

    How do we trigger the FR batch jobs when an HFM consolidation runs successfully? We want to receive the reports by mail after the HFM consolidation runs.
    How can we achieve this? Any help would be appreciated.
    Thanks,

    I don't think such a trigger to automate this process is available in the Hyperion/EPM spectrum; however, you can schedule FR batches via the command line. Details are available at
    http://docs.oracle.com/cd/E17236_01/epm.1112/fr_webuser_epm.pdf
    starting at page 95.
    Let me know
    Regards

  • Urgent: Error while deploying the HFM application in EPMA

    Dear All
    Please advise on the error below. When I deploy the HFM application a second time, the system throws the following errors. What do I have to check?
    Error: The 'Test_YearCon' dimension has been modified since the first successful deployment of the application.
    Error: The 'Test_Period' dimension has been modified since the first successful deployment of the application.
    Thanks in advance.
    Regards.
    Rao

    You have made modifications to your Year and Period dimensions since the last time the application was deployed, possibly even just to the aliases.
    It is not allowed to make any modifications to these dimensions after the first deployment; HFM does not support it.
    To get a successful deployment, you will need to delete the application and redeploy.
    Also, please make sure you are using the latest Service Pack of your release of EPMA (e.g. if on 9.3.1, make sure you are using 9.3.1 Service Pack 3) and fully read the Readme documentation that comes with it.

  • Issue while Reverse Engineering the HFM Application

    Hi,
    We are in the process of integrating HFM (11.1.2.2.0) with ODI (11.1.1.6) for metadata upload from ODI to HFM, and I am getting the following error while reverse engineering the HFM application.
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 38, in <module>
         at com.hyperion.odi.hfm.ODIHFMAppConnection.<init>(ODIHFMAppConnection.java:58)
         at com.hyperion.odi.hfm.ODIHFMAppReverser.connect(ODIHFMAppReverser.java:27)
         at com.hyperion.odi.common.ODIModelImporter.importModels(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
    com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: Error occurred while loading driver.
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:662)
    Caused by: Traceback (most recent call last):
    File "<string>", line 38, in <module>
         at com.hyperion.odi.hfm.ODIHFMAppConnection.<init>(ODIHFMAppConnection.java:58)
         at com.hyperion.odi.hfm.ODIHFMAppReverser.connect(ODIHFMAppReverser.java:27)
         at com.hyperion.odi.common.ODIModelImporter.importModels(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
    com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: Error occurred while loading driver.
         at org.python.core.PyException.fillInStackTrace(PyException.java:70)
         at java.lang.Throwable.<init>(Throwable.java:181)
         at java.lang.Exception.<init>(Exception.java:29)
         at java.lang.RuntimeException.<init>(RuntimeException.java:32)
         at org.python.core.PyException.<init>(PyException.java:46)
         at org.python.core.PyException.<init>(PyException.java:43)
         at org.python.core.Py.JavaError(Py.java:455)
         at org.python.core.Py.JavaError(Py.java:448)
         at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:177)
         at org.python.core.PyObject.__call__(PyObject.java:355)
         at org.python.core.PyMethod.__call__(PyMethod.java:215)
         at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)
         at org.python.core.PyMethod.__call__(PyMethod.java:206)
         at org.python.core.PyObject.__call__(PyObject.java:381)
         at org.python.core.PyObject.__call__(PyObject.java:385)
         at org.python.pycode._pyx0.f$0(<string>:38)
         at org.python.pycode._pyx0.call_function(<string>)
         at org.python.core.PyTableCode.call(PyTableCode.java:165)
         at org.python.core.PyCode.call(PyCode.java:18)
         at org.python.core.Py.runCode(Py.java:1204)
         at org.python.core.Py.exec(Py.java:1248)
         at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
         ... 19 more
    Caused by: com.hyperion.odi.common.ODIHAppException: Error occurred while loading driver.
         at com.hyperion.odi.hfm.ODIHFMAppConnection.<init>(ODIHFMAppConnection.java:58)
         at com.hyperion.odi.hfm.ODIHFMAppReverser.connect(ODIHFMAppReverser.java:27)
         at com.hyperion.odi.common.ODIModelImporter.importModels(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)
         ... 33 more
    Caused by: com.hyperion.odi.hfm.wrapper.HFMException: Error occurred while loading driver.
         at com.hyperion.odi.hfm.wrapper.HFMServer.<init>(HFMServer.java:29)
         at com.hyperion.odi.hfm.ODIHFMAppConnection.<init>(ODIHFMAppConnection.java:51)
         ... 40 more
    Caused by: com.hyperion.odi.hfm.wrapper.HFMException: C:\Oracle\Middleware\Oracle_ODI1\oracledi\agent\drivers\HFMDriver.dll: Can't load IA 32-bit .dll on a AMD 64-bit platform
         at com.hyperion.odi.hfm.wrapper.HFMDriverJNI.getDriver(HFMDriverJNI.java:23)
         at com.hyperion.odi.hfm.wrapper.HFMServer.<init>(HFMServer.java:26)
         ... 41 more
    Can anyone please suggest a suitable solution for this?
    Thanks
    Abhi

    Hi,
    In the eRPI readme, http://docs.oracle.com/cd/E17236_01/epm.1112/readme/erpi_1112200_readme.html,
    I found this:
    ODI Setup Steps to Integrate with Financial Management
    To set up ODI to integrate with Financial Management:
    1. Rename odiparams.bat, located at <ODI_HOME>\oracledi\agent\bin, to odiparams.bat.backup.
       The default installation location is C:\app\Oracle\product\11.1.1\Oracle_ODI_1.
    2. Download Oracle Data Integrator Companion CD 11.1.1.6.2.
    3. Unzip the contents of the Companion CD to a temp directory.
    4. Unzip oracledi-agent-standalone.zip.
    5. Copy the contents to <ODI_HOME>.
    6. Make a backup of HFMDriver.dll.
    7. For a 32-bit operating system, copy HFMDriver32_11.1.2.2.dll as HFMDriver.dll.
       For a 64-bit operating system, copy HFMDriver64_11.1.2.2.dll as HFMDriver.dll.
    8. Rename odiparams.bat.backup from Step 1 back to odiparams.bat.
    9. Edit the PATH to include <ODI_HOME>\oracledi.sdk\lib.
    10. Restart the ODI Agent.
    I tried this procedure, but now I have a new error:
    - Unable to connect to "myhfmserver" with the "admin" user
    I think there is another step to do, because HFM 11.1.2.2 now uses the EPM Registry (in the Shared Services schema) directly to retrieve the HFM servers or clusters.
    But I can't find any information on how to set up this part in ODI.
    Or maybe we need to set up registry keys in the Windows registry, as in previous HFM releases?
    If you have any news, please let me know.

  • Conditional Mapping with script using BlOCKPROC to access the HFM API

    Hello,
    I have the following problem: during mapping/validation I need to map certain Custom1 members to None if the Top Member is None, and leave them as they are if not. This allocation depends on the account. Reading through the documentation, I stumbled across conditional mapping.
    So in the Custom1 map type Like I defined following:
    Rule Name: C1_Script
    Rule Definition: CXXX*
    TargetCustom1: #Script
    In the script I am trying to get the Custom1 top member for the account. The HFM API offers this function (fGetCustomTopMember), and with varValue(14) I get my TargetAccount. So everything I need seems to be there.
    The only problem is that I need to use BlOCKPROC to access the HFM API provided through the adapter, but neither way works.
    Trying "Set API.IntBlockMgr.IntegrationMgr.PobjIntegrate = BlOCKPROC.ActConnect("LookUP")" throws an "object required" error.
    Trying
    Dim BLOCKPROC
    Set BLOCKPROC = CreateObject("upsWBlockProcessorDM.clsBlockProcessor")
    BLOCKPROC.Initialize API, SCRIPTENG
    throws a type mismatch for "BLOCKPROC.Initialize".
    I am kind of at my wits' end. Any help/clue is much appreciated.
    And yes, the different spellings BlOCKPROC and BLOCKPROC are intentional!
    Thanks!

    Hello Tony,
    Many thanks for your answer. Just one last question: the event AftProcMap(strLoc, strDim) gives me the dimension, but how do I get the account?
    Can you point me to the right direction and I'll figure out the rest!
    Thanks a lot!

  • How to calculate the HFM Cube size in SQL Server-2005

    Hi
    How do I calculate the HFM cube size in SQL Server 2005?
    The query below is used for Oracle. What is the equivalent query for SQL Server?
    SQL> select sum(bytes/1024/1024) from dba_segments where segment_name like 'FINANCIAL_%' and owner='HFM';
    SUM(BYTES/1024/1024)
    SQL> select sum(bytes/1024/1024) from dba_segments where segment_name like 'HSV FINANCIAL%' and owner='HFM';
    SUM(BYTES/1024/1024)
    Regards
    Smilee

    What is your objective? The subcube in HFM is a concept that applies to the application tier, not so much to the database tier. The size of a subcube is the unique number of data strips (data values for January through December inclusive, for example) for the given entity/value pair or Parent.Child node. You also have to account for parent accounts and customs, which don't exist in the database but are generated in RAM in the application tier.
    So if your objective is to find the largest subcubes, you could do this by querying the database and counting the number of records per entity/value combination (DCE tables) or Parent.Child entity combination (DCN tables). I'm not versed in SQL, but I think the script above would just tell you the schema size, not the subcube sizes.
    Check out Accelatis.com for a third-party software product that can do this for you. The feature is called the Subcube Analyzer and was written by the same team that wrote HFM, so they ought to know how this works :-)
    --chris
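    The record-counting approach Chris describes can be sketched once the rows are pulled out of the database. This is illustrative only: the table and column layout of HFM schemas varies by application, so treat the input as whatever your DCE query returns, not as a fixed schema.

```python
# Sketch: rank subcubes by record count from rows fetched out of an HFM
# data table. The (entity, value) pair layout is an assumption; adapt
# the key to the actual columns your DCE/DCN query returns.
from collections import Counter

def subcube_sizes(rows):
    """rows: iterable of (entity_id, value_id) pairs, one per data record."""
    counts = Counter((entity, value) for entity, value in rows)
    return counts.most_common()  # largest subcubes first

rows = [(100, 1), (100, 1), (100, 2), (200, 1)]
print(subcube_sizes(rows))
```

    Feeding it the full result set of a query over a DCE table would surface the entity/value combinations with the most data strips, which is the application-tier notion of "largest subcube" rather than raw schema size.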

  • Error in Reverse engineering the HFM app in ODI

    Hi,
    I am getting the below error when trying to reverse engineer the HFM app in ODI.
    Can anyone help with this?
    com.hyperion.odi.common.ODIHAppException: Error occurred in driver while getting Dimension List.
    I am trying to reverse the sample app.
    Thanks in advance

    Try the Oracle Data Integrator Forum
    Data Integrator
    Edited by: Johnreardon on 13-Jan-2011 05:34

  • Is there any script to archive the HFM logs?

    Currently we archive the HFM logs from the servers manually. Is there an automated way to do this (i.e. a script for this process)?

    Please confirm which log you are talking about:
    - HsvEventLog.log
    - HFM Application specific Task Audit log stored in HFM database? (<APP_NAME>TASKAUDIT)
    - HFM Error Log stored in HFM database? (HFM_ERRORLOG)
    - Windows event logs referring to Hyperion/hfm?
    For the database tables, you could create SQL Server (or Oracle) jobs to copy the data to a different table, or export it somewhere.
    For files such as HsvEventLog.log, you could create a Windows scheduled task, or a cron job on Unix, to move the files somewhere.
    For the Windows event log, I believe you could create a job to export the log data, but I've never bothered with this log since most of the info you find there is also in the error log from HFM.

  • I can't see the released tasks in the "Consolidation" tab of the CMS

    Hi friends, I need your help. The tasks in the "Consolidation" tab have been deleted and I can no longer see them. I have the files with the .pra extension on the server; those files are used to deploy the tasks into the "Consolidation" tab, but I don't know what procedure I have to follow to see the tasks in that tab again.
    Please help me.
    Best regards.
    Gustavo.


  • ACCUM function ignores the time consolidation option

    Hi,
    We need to create YTD KPIs based on the Monthly KPIs.
    We are trying to use the ACCUM function to create the KPIs:
    calc KPI_ACC = ACCUM(KPI,yearly)
    but when the time consolidation method of the KPI is not SUM (e.g. LAST), the ACCUM function doesn't take this into account.
    Any ideas?
    Regards

    Again, I'm doing this from memory. Control variables can be used in all these places.
    I tend to use the ASK command to present a menu and set the dates. A single selection from ASK can set many variables.
    Then I have a shell proc with commands like
    set period Jan - &myvar
    calc var2 = average(var1) for &myvar
    In some cases this proc can be run as it stands. In other cases the control variable won't resolve. In that case I have a command that does something like
    copy proc myshellproc myexecuteproc resolve
    compile myexecuteproc
    exe myexecuteproc
    You'll need to play with the technique, but once you understand it you can do all sorts of magic.
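    The copy-with-resolve step can be pictured with a small sketch. The substitution semantics here are an assumption made for illustration; the real tool's resolve behaviour may differ, but the idea is the same: the shell proc is a template, and resolving makes an executable copy with the &variables filled in.

```python
# Sketch of the resolve-then-execute technique described above: a shell
# proc containing &control variables is copied with the current variable
# values substituted, and the resolved copy is then compiled/executed.

def resolve_proc(lines, variables):
    """Substitute &name occurrences with their current values."""
    out = []
    for line in lines:
        for name, value in variables.items():
            line = line.replace(f"&{name}", value)
        out.append(line)
    return out

shell_proc = [
    "set period Jan - &myvar",
    "calc var2 = average(var1) for &myvar",
]
print(resolve_proc(shell_proc, {"myvar": "Jun"}))
```

    The resolved copy corresponds to `myexecuteproc` in the commands above: it contains no control variables, so it can be compiled and run as-is.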

  • Scope Entities under the selected Consolidation Group

    Hi Experts,
    During the running of a consolidation, I have logic to compute the translation of Retained Earnings via Account Transformation. My problem is that the logic calculates for all entities, so data ends up in entities that are not part of the consolidation group. In the MS version we used the function below to filter the entities that are active under the selected consolidation group, but it seems this is not supported here in NW. Is there any function I can use to scope only the entities that are active in the selected conso group?
    To scope active Entities in MS:
    *XDIM_GETINPUTSET ENTITY=%MYENT%
    *APP=OWNERSHIP
    *XDIM_MEMBERSET GROUPS=%GROUPS_SET%
    *XDIM_MEMBERSET OWNACCOUNT=ACTIVE
    *CRITERIA SIGNEDDATA>=1
    *ENDXDIM
    Conso Script Logic:
    //CALCULATE RETAINED EARNINGS TRANSLATION
    *RUN_PROGRAM CALC_ACCOUNT
       CATEGORY = %CATEGORY_SET%
       CURRENCY = USD
       TID_RA = %TIME_SET%
       OTHER = [GROUPS=%GROUPS_SET%,NON_GROUP]
       CALC = RE_BEG
    *ENDRUN_PROGRAM
    //TRANSLATE TO GROUP CURRENCY
    *RUN_PROGRAM CURR_CONVERSION
       CATEGORY = %CATEGORY_SET% 
       GROUP = %GROUPS_SET%
       TID_RA = %TIME_SET%
       RATEENTITY = GLOBAL
       MEASURES=YTD
    *ENDRUN_PROGRAM
    //RUN CONSOLIDATION RULES
    *RUN_PROGRAM CONSOLIDATION
       CATEGORY = %CATEGORY_SET%
       GROUP = %GROUPS_SET%
       TID_RA = %TIME_SET%
       MEASURES=YTD
    *ENDRUN_PROGRAM
    Thanks in advance,
    Marvin

    Hi Marvin,
    Normally we also have a consolidation hierarchy structure in the Entity dimension, so you can add an entity filter in the script logic and the DM package, and run CALC_ACCOUNT for that group only.
    Anil
