Alternatives for loading forecasts into HFM (9.3.1)
Hi,
I'm hoping that you could provide some recommendations as to what is the best method for us to load data into Budget/Forecast scenarios.
As a bit of background to our HFM structure,
I am only concerned with loading data into one entity; however, we are using C3 to represent the different business units/cost centres of the business, of which there are approximately 300. Up until now, all budgeting and forecasting has been prepared by the individual business units/cost centres in Excel and "submitted" to central finance, where we would then consolidate the forecasts together in Excel before submitting them to HFM via FDM as a range load.
There is a strong desire to give users, at least at the divisional level, the ability to submit data directly to HFM so that they can control their own budgeting process. It is also likely that they will want to re-submit revised budgets to HFM repeatedly during the process.
There are a number of solutions I can see to allow the divisions to submit.
*1) Use data forms via Excel to push the data to HFM.*
This has the advantage that I can ensure no zero amounts are loaded into HFM; however, given the number of cost centres containing data (approx. 300), this would be cumbersome to say the least.
2) SetData in Smart View.
As above.
3) FDM using range load templates
Because all of the divisions exist under the same entity but want to submit their data at different times, they cannot use the replace load method; they would have to use the "merge" load method to ensure they do not remove other divisions' data. This presents a problem: every account would need to be loaded against, even where the value is zero, so that if an intersection changes to zero the total value of the TB remains correct.
A quick calculation: 300 cost centres * 200 accounts * 12 months = 720,000 data points at a minimum, for the P&L only. The vast majority will be zero.
How well does HFM deal with zero values? If 720k - 1m zero values are not an issue with regard to performance etc., then this approach may be acceptable. At a push I could always run a remove-zeros script once the budget/forecast is finalised.
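If zero-stripping turns out to be needed, the filtering itself is straightforward to script outside FDM. A minimal sketch, assuming a semicolon-delimited extract whose last column is the amount (the file names and layout here are hypothetical, not from this thread):
import csv

def strip_zero_rows(src_path, dst_path, tolerance=1e-9):
    # Copy a delimited load file, dropping rows whose amount is (near) zero.
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.reader(src, delimiter=";")
        writer = csv.writer(dst, delimiter=";")
        for row in reader:
            try:
                amount = float(row[-1])      # amount assumed to be the last column
            except (ValueError, IndexError):
                writer.writerow(row)         # header or non-numeric line: keep as-is
                continue
            if abs(amount) > tolerance:      # keep only genuine values
                writer.writerow(row)

strip_zero_rows("budget_extract.csv", "budget_load.csv")
Note that this only thins the initial load; it does not solve the merge-mode problem described above, where an intersection that previously held a value later becomes zero and still needs to be overwritten.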
If anyone has had a similar issue or can provide any other solutions or things for me to consider, then I welcome your feedback.
Hi,
Can you define "different domains" for me? Are they different physical machines? What type of database server are you using?
-Tony
Similar Messages
-
Plan/Forecast Scenario in HFM - best practice
Hello all,
We're currently working on adding a Plan and Forecast scenarios to HFM
to incorporate some budgeting, which had previously been done in
Excel/Access. For those of you who do planning in HFM, would you care
to briefly summarize your processes?
1) How do you load the data into your HFM plan/forecast scenario?
Excel spreadsheet? FDM Multi-Load? Webform?
2) If you use Excel, do you just allow the users to submit via the
template using SmartView or formulas? What about audit/controls?
3) Do you let the users consolidate whenever they want?
Any other tips would be greatly appreciated!
Thanks
We've been doing budgeting and forecasting in HFM for a while... however, we are leaving that and moving to Hyperion Planning...
Some things you are going to want to consider:
1) You will probably have more people entering budgets/forecasts than actuals, so there may be additional licensing costs.
2) Security will need to be considered: who gets access to what, and where.
3) Budgeting/forecasting is usually done at a more granular level and then rolled up. This could make your HFM application much larger or more detailed than necessary.
4) If you re-org, which we do, we have to reconsolidate all the scenarios and all the years, which is time consuming.
5) There is one rules file, which makes divisional or cross-company modelling difficult. For instance, one of our divisions has a very complex allocation model which we could not build in HFM.
6) It's not uncommon for users to ask for a rolling forecast, which can be done in HFM at the cost of more scenarios. Also, versioning is difficult in HFM.
7) Reporting will also need to be considered: who does it, how it looks and who gets which report.
With Planning, we don't have the above issues.
To answer your questions before moving to Planning:
1) How do you load the data into your HFM plan/forecast scenario: Excel spreadsheet, FDM Multi-Load, web form? 90% use the data grids, 5% FDM (international units or units not on our GL) and 5% Smart View (power users).
2) If you use Excel, do you just allow the users to submit via the template using Smart View or formulas? What about audit/controls? How do you handle that for actuals? Our controls are handled via security.
3) Do you let the users consolidate whenever they want? Yes, at the expense of users running consolidations over each other. -
Unable to load the data into HFM
Hello,
We created a new HFM app and configured it with FDM, generated an output file through FDM and loaded that file through HFM directly 5-6 times; there was no issue up to this point.
Then I loaded the file through FDM 4 times successfully, even for different months. But after 4 loads I started getting an error. Attached is the error log.
Please help as soon as possible.
** Begin fdmFM11XG6A Runtime Error Log Entry [2013-10-30-13:44:26] **
Error:
Code............-2147217873
Description.....System.Runtime.InteropServices.COMException (0x80040E2F): Exception from HRESULT: 0x80040E2F
at HSVCDATALOADLib.HsvcDataLoadClass.Load(String bstrClientFilename, String bstrClientLogFileName)
at fdmFM11XG6A.clsFMAdapter.fDBLoad(String strLoadFile, String strErrFile, String& strDelimiter, Int16& intMethod, Boolean& blnAccumFile, Boolean& blnHasShare, Int16& intMode)
Procedure.......clsHPDataManipulation.fDBLoad
Component.......E:\Opt\Shared\Apps\Hyperion\Install\Oracle\Middleware\EPMSystem11R1\products\FinancialDataQuality\SharedComponents\FM11X-G6-A_1016\AdapterComponents\fdmFM11XG6A\fdmFM11XG6A.dll
Version.........1116
Identification:
User............fdmadmin
Computer Name...EMSHALGADHYFD02
FINANCIAL MANAGEMENT Connection:
App Name........
Cluster Name....
Domain............
Connect Status.... Connection Open
Thanks,
Raam
We are working with the DB team, but they have confirmed that there is no issue with the TB. The process we have followed:
As a standard process, while loading data from FDM or manually to HFM, we don't write any SQL query. Using the web interface, data is loaded to the HFM application. This data can be viewed by different reporting tools (Smart View (Excel), HFR reports, etc.).
There are no official documents on the Oracle website which describe an INSERT SQL query used to load data into HFM tables. Hyperion does not provide much detail on its internal tables, nor much insight into the internal structure of the HFM system.
According to Hyperion blogs/forums on the internet, HFM stores the base-level data in so-called DCE tables (for example EMHFMFinal_DCE_1_2013, where EMHFMFinal is the application name, 1 identifies the Scenario and 2013 the Year). Each row in the DCE table contains data for all periods of a given combination of dimensions (also called an intersection).
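Just to restate that naming convention in code form (a sketch only; these are internal, unsupported tables and the convention is taken from the description above, not from official documentation):
def dce_table_name(app_name, scenario_id, year):
    # e.g. dce_table_name("EMHFMFinal", 1, 2013) -> "EMHFMFinal_DCE_1_2013"
    return "{0}_DCE_{1}_{2}".format(app_name, scenario_id, year)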
We are trying to load the same data file with the Replace option (it should delete the existing data before loading the data file). -
(Thanks for the posting suggestion Larry).
I followed prior suggestions to load exchange rates to HFM. So I have C1 as a "pass through" wildcard, C2 static to USD, and None set for ICP, Entity and Account. Now, how do I get past the validation stage to load this to HFM? Since I have other locations that need to be validated, I don't think I can shut validation off (and how would I do that?).
Thanks
I'm not sure what you mean by a pre-defined property, but the HFM administrator or someone must input the FX rates that you want to use in the system. If you go into the HFM client and manage metadata you will see a drop-down for Currencies. HFM is not going to create these members just based on the information you are trying to load. So, if you are trying to load CAD, EUR, NZD, etc., those members must be loaded into HFM prior to you attempting to load data. Just as an example, below is the POV you would load your FX rates to:
Scenario;Year;Period;View;Entity;Value;Account;ICP;C1;C2;C3;C4;Amount
Actual;2012;Jan;YTD;[None];<Entity Currency>;BSRate;[ICP None];USD;EUR;[None];[None];X.XXXX
Actual;2012;Jan;YTD;[None];<Entity Currency>;ISRate;[ICP None];USD;EUR;[None];[None];X.XXXX
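As an aside, a rate file in this layout for several currency pairs can be generated with a short script rather than typed by hand (a sketch only; the rates, file name and fixed POV members below are illustrative and simply follow the header row above):
# Hypothetical rates against USD; the figures are placeholders.
rates = {"EUR": 0.7723, "CAD": 1.0010, "NZD": 1.2850}

header = "Scenario;Year;Period;View;Entity;Value;Account;ICP;C1;C2;C3;C4;Amount"
pov = ("Actual", "2012", "Jan", "YTD", "[None]", "<Entity Currency>")

with open("fx_rates.csv", "w") as out:
    out.write(header + "\n")
    for currency, rate in rates.items():
        for account in ("BSRate", "ISRate"):
            fields = pov + (account, "[ICP None]", "USD", currency, "[None]", "[None]", "%.4f" % rate)
            out.write(";".join(fields) + "\n")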
Hope this helps.
Jason -
Error while loading .app file in HFM Classic
Hi,
I am getting an error while loading a sample HFM .app file which I have generated using DRM Export for the Account hierarchies. The log file gives the following error
The file is attached.
Please help! Thanks!
Hi there,
The issue is the question mark at the beginning of the sentence:
The correct format is:
Cheers,
Thanos -
Load Multiple Currencies in HFM?
Is it possible to load historical values for multiple currencies in HFM? We would like to load both CAD and USD values for prior periods; not just load the local currency (CAD) and translate to USD.
Edited by: user8947753 on Oct 1, 2010 1:12 PM
Edited by: user8947753 on Oct 1, 2010 1:17 PM
While you can only load data in the entity's own currency, it is possible to load data from a different currency and then write your own translation rules to manage the correct translations. This is a deeper design question you have to work through to determine the impact on roll-forwards and the currency translation adjustment, but at a basic level you can load data in different currencies. You just have to think through the metadata and rules needed to do this.
--Chris -
ODI Error while loading data to Classic HFM Application
Hello All,
I am performing an integration from a source (flat file) to an HFM application. I have created an integration for it, and while loading the data I get the following error:
org.apache.bsf.BSFException: exception from Jython:
Traceback (most recent call last):
File "<string>", line 3, in <module>
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: Properties [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] are mandatory.
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:577)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:468)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2128)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:662)
Caused by: Traceback (most recent call last):
File "<string>", line 3, in <module>
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: Properties [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] are mandatory.
at org.python.core.PyException.fillInStackTrace(PyException.java:70)
at java.lang.Throwable.<init>(Throwable.java:181)
at java.lang.Exception.<init>(Exception.java:29)
at java.lang.RuntimeException.<init>(RuntimeException.java:32)
at org.python.core.PyException.<init>(PyException.java:46)
at org.python.core.PyException.<init>(PyException.java:43)
at org.python.core.Py.JavaError(Py.java:455)
at org.python.core.Py.JavaError(Py.java:448)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:177)
at org.python.core.PyObject.__call__(PyObject.java:355)
at org.python.core.PyMethod.__call__(PyMethod.java:215)
at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)
at org.python.core.PyMethod.__call__(PyMethod.java:206)
at org.python.core.PyObject.__call__(PyObject.java:397)
at org.python.core.PyObject.__call__(PyObject.java:401)
at org.python.pycode._pyx15.f$0(<string>:6)
at org.python.pycode._pyx15.call_function(<string>)
at org.python.core.PyTableCode.call(PyTableCode.java:165)
at org.python.core.PyCode.call(PyCode.java:18)
at org.python.core.Py.runCode(Py.java:1204)
at org.python.core.Py.exec(Py.java:1248)
at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
... 19 more
Caused by: com.hyperion.odi.common.ODIHAppException: Properties [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] are mandatory.
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)
... 33 more
Caused by: com.hyperion.odi.common.ODIHAppException: Properties [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] are mandatory.
at com.hyperion.odi.hfm.ODIHFMUtil.validateRequiredProperties(ODIHFMUtil.java:87)
at com.hyperion.odi.hfm.ODIHFMMetadataLoader$OptionsMetadataLoad.validate(ODIHFMMetadataLoader.java:66)
at com.hyperion.odi.hfm.ODIHFMMetadataLoader.validateOptions(ODIHFMMetadataLoader.java:188)
at com.hyperion.odi.hfm.ODIHFMAppStatement.validateLoadOptions(ODIHFMAppStatement.java:168)
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:195)
... 38 more
Any help appreciated
Thanks,
Pratik
Hi Pratik02,
I don't have access to that IKM from here and I don't know much about Hyperion, but here is an article that can help you : http://john-goodwin.blogspot.be/2010/02/odi-series-loading-hfm-metadata.html
Can you see the two options mentioned in the error message, CLEAR_ALL_METADATA_BEFORE_LOAD and REPLACE_MODE? What are their values?
Are you loading a dimension? If so, shouldn't you use the "IKM SQL to Hyperion Financial Management Dimension" instead of the "IKM SQL to Hyperion Financial Management Data"?
Regards,
JeromeFr -
How can we load metadata automatically in HFM
Hi There
I am working with HFM 9.3.1.
I have an Account dimension in which new members are added every day.
I just want these new members to be processed so that the existing application is updated automatically. How can I do this? Is there any process in HFM 9.3.1 that I can use, or do I need to upgrade the existing system?
Please suggest me that could be great help.
Thanks
Hars
Have a look at www.epmmaestro.com
The EPM Maestro Suite has, among its many other features, the ability to automate both metadata extract and load.
You do not indicate how your metadata file is updated (i.e. how the new members are added), as this will still be a manual effort.
Another new feature of EPM Maestro (a web feature) is the Metadata Validator, which will ensure your attributes etc. are correct.
That said, have you worked out a plan since your posting? -
[SOLVED] Load module alternative
Hello,
I have small problems with one kernel module.
What I want: load the nvidia_acpi_call module to deactivate my discrete graphics card
from https://aur.archlinux.org/packages.php?ID=34499
1st alternative: load the module in rc.conf
Works, but I get dozens of error messages and long delays during boot
see: https://bbs.archlinux.org/viewtopic.php … 66#p787566
2nd alternative: modprobe it in rc.local
Works, but upowerd goes up to 100% cpu usage afterwards.
see: https://bbs.archlinux.org/viewtopic.php … 33#p994833
They say that the module should be loaded after xorg to avoid this.
3rd alternative: Load the module manually after system start
I am doing that at the moment, but it is quite annoying.
So do you have any other ideas?
Thx!
Last edited by Jankosevic (2012-01-21 13:59:36)
Jankosevic wrote: "module should be loaded after xorg"
In e.g. rc.local, start a process that waits, e.g.: (untried crappy bash code ahead)
(while [[ -z $(pidof -s /usr/bin/X) ]] ; do sleep 1 ; done && sleep 10 && modprobe whatever) &
I added a "sleep 10" to give it time to settle - maybe needed, maybe not, but in case of doubt, I'd do it.
Last edited by brebs (2012-01-17 17:26:46) -
HFM Data Form in Smart View - Improper Suppression
We have several data forms that are opened by a group of controllers every month to load forecast information to HFM. This data form is opened by all of these users in Excel and submitted back to HFM via Smart View.
We have instructed all of the users to turn off ALL suppression while working with the data forms because there are some invalid intersections for ICP based on the input entity. The form is rendering properly for all users but one. When he opens the data form it appears to show all of the lines, but then once it fully loads, it is removing/suppressing one line that is an invalid intersection. In checking his Smart View options, I can clearly say that he does not have ANY of the suppression options in use.
Has anyone experienced this issue? Are there any workarounds? Why would this only happen for one user when 10 others have no issue when using the same file/form/options?
Your help is appreciated, as always!
Scott
Hi Scott,
If this only happens for one user, I suspect something is wrong in his profile.
Ask your IT to have his profile recreated to see if the issue disappears.
Hope this helps,
Thank you,
Charles Babu J -
DATAEXPORT from Essbase to Oracle SQL - Issue with #Missing
Hi,
I'm exporting data from a Hyperion Planning application in order to transform and load forecast data into HFM.
To do so, I'm using ODI, which builds an Essbase calc script, runs that calc script in Essbase, transforms the data and then loads it into HFM.
The calc script exports data into an Oracle table. I have one problem with the precision of the extract: when I set a precision (properties DataExportDecimal & DataExportPrecision), #Missing values are exported as "0" into the Oracle table. I saw in the documentation that the precision properties are optional, but when I don't set them, #Missing is exported as "-0,000000000002".
I'd like to export #Missing as NULL and keep "0" where the data actually equals "0" (i.e. no transformation of "0" to NULL). Also, I can't change the architecture to a flat-file export/load.
I put the script I use below, thanks a lot for your help!
Stephane.
//ESS_LOCALE English_UnitedStates.Latin1@Binary
SET UPDATECALC OFF ;
SET CACHE HIGH;
SET CALCPARALLEL 4;
SET LOCKBLOCK HIGH ;
SET MSG SUMMARY ;
SET DATAEXPORTOPTIONS
{
DataExportLevel ALL;
DataExportDynamicCalc On;
DataExportDecimal 7;
DataExportPrecision 7;
DataExportColFormat Off;
DataExportRelationalFile On;
DataExportOverwriteFile On;
DataExportDryRun Off;
DataExportColHeader "PERIOD";
};
FIX(BegBalance, @RELATIVE(TOT_YEAR,0), &FY_N0, &FY_N1, &FY_N2, "FINAL", &CurrScenario, @RELATIVE("PL_SRC", 0), "ENTITY_CURR", @RELATIVE("SITES", 0), @RELATIVE("TECHPL", 0), "ND_COST_CENTER", "ND_COST_CENTER_IC", "IC_TOT", @LEVMBRS(BU, 0), "ALL_PROJECT", "PTD", @LEVMBRS("LEGAL_ENTITY", 0))
DATAEXPORT "DSN" "STAGING" "STG_BUDGET_EXTRACT_RPTING_HP" "#SCHEMA" "#PASSWD";
ENDFIX;
I can't really comment on why #Missing is not being output as you expect. Is it possible to run an update after the export to transform "-0,000000000002" to NULL?
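In practice that post-export clean-up can be a single UPDATE against the staging table the calc script writes to. A minimal sketch using cx_Oracle (the connection details, the AMOUNT column name and the 1e-6 threshold are assumptions, not from this thread):
import cx_Oracle

# Connection details are placeholders; the table name comes from the calc script above,
# and AMOUNT is a hypothetical name for the exported data column.
conn = cx_Oracle.connect("staging_user", "password", "dbhost:1521/ORCL")
cur = conn.cursor()
# NULL out the tiny residuals produced for #Missing, while leaving genuine
# zero balances (stored as exactly 0) untouched.
cur.execute("""
    UPDATE STG_BUDGET_EXTRACT_RPTING_HP
       SET AMOUNT = NULL
     WHERE AMOUNT <> 0
       AND ABS(AMOUNT) < 1e-6
""")
print(cur.rowcount, "rows updated")
conn.commit()
cur.close()
conn.close()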
Cheers
John
http://john-goodwin.blogspot.com/ -
Loading Metadata to HFM Classic from FDM
Hi,
Can anyone please explain is there any possibility to load Metadata (Members) in HFM Classic Application using FDM.
If so please explain in detailed.
Thanks in advance
FDM is not designed to load metadata, only data. It can be done, but it would involve a fair bit of custom code (including a wrapper DLL), as the metadata load functionality is not exposed by the HFM web SDK. I'd look at an alternative method for doing this and stick to loading data via FDM.
-
Unable to load data from FDMEE staging table to HFM tables
Hi,
We have installed EPM 11.1.2.3 with all latest related products (ODI/FDMEE) in our development environment
We are in the process of loading data from EBS R12 to HFM using ERPI (Data Management in EPM 11.1.2.3). We can import and validate the data, but when we try to export the data to HFM, the process keeps running for hours (it neither gives any error nor completes).
When we check the process details in the ODI work repository, the process statuses are as follows:
COMM_LOAD_BALANCES - Running .........(From past 1 day, still running)
EBS_GL_LOAD_BALANCES_DATA - Successful
COMM_LOAD_BALANCES - Successful
We can load data into the staging table of the FDMEE database schema, and we are even able to drill through to the source system (EBS R12) from the Data Load Workbench, but we are not able to load the data into the HFM application.
Log details from the process are below.
2013-11-05 17:04:59,747 INFO [AIF]: FDMEE Process Start, Process ID: 31
2013-11-05 17:04:59,747 INFO [AIF]: FDMEE Logging Level: 4
2013-11-05 17:04:59,748 INFO [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_31.log
2013-11-05 17:04:59,748 INFO [AIF]: User:admin
2013-11-05 17:04:59,748 INFO [AIF]: Location:VisionLoc (Partitionkey:1)
2013-11-05 17:04:59,749 INFO [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
2013-11-05 17:04:59,749 INFO [AIF]: Category Name:VisionCat (Category key:3)
2013-11-05 17:04:59,749 INFO [AIF]: Rule Name:VisionRule (Rule ID:2)
2013-11-05 17:05:00,844 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Oracle JRockit(R) (Oracle Corporation)]
2013-11-05 17:05:00,844 INFO [AIF]: Java Platform: java1.6.0_37
2013-11-05 17:05:02,910 INFO [AIF]: -------START IMPORT STEP-------
2013-11-05 17:05:02,953 INFO [AIF]: -------END IMPORT STEP-------
2013-11-05 17:05:03,030 INFO [AIF]: -------START EXPORT STEP-------
2013-11-05 17:05:03,108 INFO [AIF]:
Move Data for Period 'JAN'
Any help on above is much appreciated.
Thank you
Regards
Praneeth
Hi,
I have followed steps 1 & 2 above. Now the log shows something like the below:
2013-11-05 09:47:31,179 INFO [AIF]: FDMEE Process Start, Process ID: 22
2013-11-05 09:47:31,179 INFO [AIF]: FDMEE Logging Level: 4
2013-11-05 09:47:31,179 INFO [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_22.log
2013-11-05 09:47:31,180 INFO [AIF]: User:admin
2013-11-05 09:47:31,180 INFO [AIF]: Location:VisionLoc (Partitionkey:1)
2013-11-05 09:47:31,180 INFO [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
2013-11-05 09:47:31,180 INFO [AIF]: Category Name:VisionCat (Category key:3)
2013-11-05 09:47:31,181 INFO [AIF]: Rule Name:VisionRule (Rule ID:2)
2013-11-05 09:47:32,378 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Oracle JRockit(R) (Oracle Corporation)]
2013-11-05 09:47:32,378 INFO [AIF]: Java Platform: java1.6.0_37
2013-11-05 09:47:34,652 INFO [AIF]: -------START IMPORT STEP-------
2013-11-05 09:47:34,698 INFO [AIF]: -------END IMPORT STEP-------
2013-11-05 09:47:34,744 INFO [AIF]: -------START EXPORT STEP-------
2013-11-05 09:47:34,828 INFO [AIF]:
Move Data for Period 'JAN'
2013-11-08 11:49:10,478 INFO [AIF]: FDMEE Process Start, Process ID: 22
2013-11-08 11:49:10,493 INFO [AIF]: FDMEE Logging Level: 5
2013-11-08 11:49:10,493 INFO [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_22.log
2013-11-08 11:49:10,494 INFO [AIF]: User:admin
2013-11-08 11:49:10,494 INFO [AIF]: Location:VISIONLOC (Partitionkey:1)
2013-11-08 11:49:10,494 INFO [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
2013-11-08 11:49:10,495 INFO [AIF]: Category Name:VISIONCAT (Category key:1)
2013-11-08 11:49:10,495 INFO [AIF]: Rule Name:VISIONRULE (Rule ID:1)
2013-11-08 11:49:11,903 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Oracle JRockit(R) (Oracle Corporation)]
2013-11-08 11:49:11,909 INFO [AIF]: Java Platform: java1.6.0_37
2013-11-08 11:49:14,037 INFO [AIF]: -------START IMPORT STEP-------
2013-11-08 11:49:14,105 INFO [AIF]: -------END IMPORT STEP-------
2013-11-08 11:49:14,152 INFO [AIF]: -------START EXPORT STEP-------
2013-11-08 11:49:14,178 DEBUG [AIF]: CommData.exportData - START
2013-11-08 11:49:14,183 DEBUG [AIF]: CommData.getRuleInfo - START
2013-11-08 11:49:14,188 DEBUG [AIF]:
SELECT brl.RULE_ID
,br.RULE_NAME
,brl.PARTITIONKEY
,brl.CATKEY
,part.PARTVALGROUP
,br.SOURCE_SYSTEM_ID
,ss.SOURCE_SYSTEM_TYPE
,CASE
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'EBS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'PS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FUSION%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FILE%' THEN 'N'
ELSE 'Y'
END SOURCE_ADAPTER_FLAG
,app.APPLICATION_ID
,app.TARGET_APPLICATION_NAME
,app.TARGET_APPLICATION_TYPE
,app.DATA_LOAD_METHOD
,brl.PLAN_TYPE
,CASE brl.PLAN_TYPE
WHEN 'PLAN1' THEN 1
WHEN 'PLAN2' THEN 2
WHEN 'PLAN3' THEN 3
WHEN 'PLAN4' THEN 4
WHEN 'PLAN5' THEN 5
ELSE 0
END PLAN_NUMBER
,br.INCL_ZERO_BALANCE_FLAG
,br.PERIOD_MAPPING_TYPE
,br.INCLUDE_ADJ_PERIODS_FLAG
,br.BALANCE_TYPE ACTUAL_FLAG
,br.AMOUNT_TYPE
,br.BALANCE_SELECTION
,br.BALANCE_METHOD_CODE
,COALESCE(br.SIGNAGE_METHOD, 'ABSOLUTE') SIGNAGE_METHOD
,br.CURRENCY_CODE
,br.BAL_SEG_VALUE_OPTION_CODE
,brl.EXECUTION_MODE
,COALESCE(brl.IMPORT_FROM_SOURCE_FLAG, 'Y') IMPORT_FROM_SOURCE_FLAG
,COALESCE(brl.RECALCULATE_FLAG, 'N') RECALCULATE_FLAG
,COALESCE(brl.EXPORT_TO_TARGET_FLAG, 'N') EXPORT_TO_TARGET_FLAG
,CASE
WHEN (br.LEDGER_GROUP_ID IS NOT NULL) THEN 'MULTI'
WHEN (br.SOURCE_LEDGER_ID IS NOT NULL) THEN 'SINGLE'
ELSE 'NONE'
END LEDGER_GROUP_CODE
,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
,br.LEDGER_GROUP
,(SELECT brd.DETAIL_CODE
FROM AIF_BAL_RULE_DETAILS brd
WHERE brd.RULE_ID = br.RULE_ID
AND brd.DETAIL_TYPE = 'LEDGER'
) PS_LEDGER
,CASE lg.LEDGER_TEMPLATE
WHEN 'COMMITMENT' THEN 'Y'
ELSE 'N'
END KK_FLAG
,p.LAST_UPDATED_BY
,p.AIF_WEB_SERVICE_URL WEB_SERVICE_URL
,p.EPM_ORACLE_INSTANCE
FROM AIF_PROCESSES p
INNER JOIN AIF_BAL_RULE_LOADS brl
ON brl.LOADID = p.PROCESS_ID
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN AIF_SOURCE_SYSTEMS ss
ON ss.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
INNER JOIN AIF_TARGET_APPLICATIONS app
ON app.APPLICATION_ID = brl.APPLICATION_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = br.PARTITIONKEY
INNER JOIN TBHVIMPGROUP imp
ON imp.IMPGROUPKEY = part.PARTIMPGROUP
LEFT OUTER JOIN AIF_COA_LEDGERS l
ON l.SOURCE_SYSTEM_ID = p.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = COALESCE(br.SOURCE_LEDGER_ID,imp.IMPSOURCELEDGERID)
LEFT OUTER JOIN AIF_PS_SET_CNTRL_REC_STG scr
ON scr.SOURCE_SYSTEM_ID = l.SOURCE_SYSTEM_ID
AND scr.SETCNTRLVALUE = l.SOURCE_LEDGER_NAME
AND scr.RECNAME = 'LED_GRP_TBL'
LEFT OUTER JOIN AIF_PS_LED_GRP_TBL_STG lg
ON lg.SOURCE_SYSTEM_ID = scr.SOURCE_SYSTEM_ID
AND lg.SETID = scr.SETID
AND lg.LEDGER_GROUP = br.LEDGER_GROUP
WHERE p.PROCESS_ID = 22
2013-11-08 11:49:14,195 DEBUG [AIF]:
SELECT adim.BALANCE_COLUMN_NAME DIMNAME
,adim.DIMENSION_ID
,dim.TARGET_DIMENSION_CLASS_NAME
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1
) COA_SEGMENT_NAME1
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2
) COA_SEGMENT_NAME2
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3
) COA_SEGMENT_NAME3
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4
) COA_SEGMENT_NAME4
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5
) COA_SEGMENT_NAME5
,(SELECT CASE mdd.ORPHAN_OPTION_CODE
WHEN 'CHILD' THEN 'N'
WHEN 'ROOT' THEN 'N'
ELSE 'Y'
END DIMENSION_FILTER_FLAG
FROM AIF_MAP_DIM_DETAILS_V mdd
,AIF_MAPPING_RULES mr
WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
AND mdd.RULE_ID = mr.RULE_ID
AND mdd.DIMENSION_ID = adim.DIMENSION_ID
) DIMENSION_FILTER_FLAG
,tiie.IMPCONCATCHAR
FROM TPOVPARTITION tpp
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = 2
INNER JOIN AIF_DIMENSIONS dim
ON dim.DIMENSION_ID = adim.DIMENSION_ID
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
AND tiie.IMPMAPTYPE = 'ERP'
WHERE tpp.PARTITIONKEY = 1
AND adim.BALANCE_COLUMN_NAME IS NOT NULL
ORDER BY adim.BALANCE_COLUMN_NAME
2013-11-08 11:49:14,197 DEBUG [AIF]: {'APPLICATION_ID': 2L, 'IMPORT_FROM_SOURCE_FLAG': u'N', 'PLAN_TYPE': None, 'RULE_NAME': u'VISIONRULE', 'ACTUAL_FLAG': u'A', 'IS_INCREMENTAL_LOAD': False, 'EPM_ORACLE_INSTANCE': u'C:\\Oracle\\Middleware\\user_projects\\epmsystem1', 'CATKEY': 1L, 'BAL_SEG_VALUE_OPTION_CODE': u'ALL', 'INCLUDE_ADJ_PERIODS_FLAG': u'N', 'PERIOD_MAPPING_TYPE': u'EXPLICIT', 'SOURCE_SYSTEM_TYPE': u'EBS_R12', 'LEDGER_GROUP': None, 'TARGET_APPLICATION_NAME': u'OASIS', 'RECALCULATE_FLAG': u'N', 'SOURCE_SYSTEM_ID': 2L, 'TEMP_DATA_TABLE_NAME': 'TDATASEG_T', 'KK_FLAG': u'N', 'AMOUNT_TYPE': u'MONETARY', 'EXPORT_TO_TARGET_FLAG': u'Y', 'DATA_TABLE_NAME': 'TDATASEG', 'DIMNAME_LIST': [u'ACCOUNT', u'ENTITY', u'ICP', u'UD1', u'UD2', u'UD3', u'UD4'], 'TDATAMAPTYPE': 'ERP', 'LAST_UPDATED_BY': u'admin', 'DIMNAME_MAP': {u'UD3': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT5', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD3', 'DIMENSION_ID': 9L}, u'ICP': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'ICP', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT7', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ICP', 'DIMENSION_ID': 8L}, u'ENTITY': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Entity', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT1', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ENTITY', 'DIMENSION_ID': 12L}, u'UD2': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT4', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD2', 'DIMENSION_ID': 11L}, u'UD4': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT6', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD4', 'DIMENSION_ID': 1L}, u'ACCOUNT': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Account', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT3', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ACCOUNT', 'DIMENSION_ID': 10L}, u'UD1': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT2', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD1', 'DIMENSION_ID': 7L}}, 'TARGET_APPLICATION_TYPE': u'HFM', 'PARTITIONKEY': 1L, 'PARTVALGROUP': u'[NONE]', 'LEDGER_GROUP_CODE': u'SINGLE', 'INCLUDE_ZERO_BALANCE_FLAG': u'N', 'EXECUTION_MODE': u'SNAPSHOT', 'PLAN_NUMBER': 0L, 'PS_LEDGER': None, 'BALANCE_SELECTION': u'FUNCTIONAL', 'BALANCE_AMOUNT_IS': u'PERIODIC', 'RULE_ID': 1L, 'BALANCE_AMOUNT_BS': u'YTD', 'CURRENCY_CODE': None, 'SOURCE_ADAPTER_FLAG': u'N', 'BALANCE_METHOD_CODE': u'STANDARD', 'SIGNAGE_METHOD': u'SAME', 'WEB_SERVICE_URL': u'http://localhost:9000/aif', 'DATA_LOAD_METHOD': u'CLASSIC_VIA_EPMI'}
2013-11-08 11:49:14,197 DEBUG [AIF]: CommData.getRuleInfo - END
2013-11-08 11:49:14,224 DEBUG [AIF]: CommData.insertPeriods - START
2013-11-08 11:49:14,228 DEBUG [AIF]: CommData.getLedgerListAndMap - START
2013-11-08 11:49:14,229 DEBUG [AIF]: CommData.getLedgerSQL - START
2013-11-08 11:49:14,229 DEBUG [AIF]: CommData.getLedgerSQL - END
2013-11-08 11:49:14,229 DEBUG [AIF]:
SELECT l.SOURCE_LEDGER_ID
,l.SOURCE_LEDGER_NAME
,l.SOURCE_COA_ID
,l.CALENDAR_ID
,'0' SETID
,l.PERIOD_TYPE
,NULL LEDGER_TABLE_NAME
FROM AIF_BALANCE_RULES br
,AIF_COA_LEDGERS l
WHERE br.RULE_ID = 1
AND l.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = br.SOURCE_LEDGER_ID
2013-11-08 11:49:14,230 DEBUG [AIF]: CommData.getLedgerListAndMap - END
2013-11-08 11:49:14,232 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_PERIODS (
PROCESS_ID
,PERIODKEY
,PERIOD_ID
,ADJUSTMENT_PERIOD_FLAG
,GL_PERIOD_YEAR
,GL_PERIOD_NUM
,GL_PERIOD_NAME
,GL_PERIOD_CODE
,GL_EFFECTIVE_PERIOD_NUM
,YEARTARGET
,PERIODTARGET
,IMP_ENTITY_TYPE
,IMP_ENTITY_ID
,IMP_ENTITY_NAME
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_ID
,TRANS_ENTITY_NAME
,PRIOR_PERIOD_FLAG
,SOURCE_LEDGER_ID
SELECT DISTINCT brl.LOADID PROCESS_ID
,pp.PERIODKEY PERIODKEY
,prd.PERIOD_ID
,COALESCE(prd.ADJUSTMENT_PERIOD_FLAG, 'N') ADJUSTMENT_PERIOD_FLAG
,COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) GL_PERIOD_YEAR
,COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0) GL_PERIOD_NUM
,COALESCE(prd.PERIOD_NAME, ppsrc.GL_PERIOD_NAME,'0') GL_PERIOD_NAME
,COALESCE(prd.PERIOD_CODE, CAST(COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0) AS VARCHAR(38)),'0') GL_PERIOD_CODE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) GL_EFFECTIVE_PERIOD_NUM
,COALESCE(ppa.YEARTARGET, pp.YEARTARGET) YEARTARGET
,COALESCE(ppa.PERIODTARGET, pp.PERIODTARGET) PERIODTARGET
,'PROCESS_BAL_IMP' IMP_ENTITY_TYPE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) IMP_ENTITY_ID
,COALESCE(prd.PERIOD_NAME, ppsrc.GL_PERIOD_NAME,'0') IMP_ENTITY_NAME
,'PROCESS_BAL_TRANS' TRANS_ENTITY_TYPE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) TRANS_ENTITY_ID
,pp.PERIODDESC TRANS_ENTITY_NAME
,'N' PRIOR_PERIOD_FLAG
,2202 SOURCE_LEDGER_ID
FROM (
AIF_BAL_RULE_LOADS brl
INNER JOIN TPOVCATEGORY pc
ON pc.CATKEY = brl.CATKEY
INNER JOIN TPOVPERIOD_FLAT_V pp
ON pp.PERIODFREQ = pc.CATFREQ
AND pp.PERIODKEY >= brl.START_PERIODKEY
AND pp.PERIODKEY <= brl.END_PERIODKEY
LEFT OUTER JOIN TPOVPERIODADAPTOR_FLAT_V ppa
ON ppa.PERIODKEY = pp.PERIODKEY
AND ppa.PERIODFREQ = pp.PERIODFREQ
AND ppa.INTSYSTEMKEY = 'OASIS'
INNER JOIN TPOVPERIODSOURCE ppsrc
ON ppsrc.PERIODKEY = pp.PERIODKEY
AND ppsrc.MAPPING_TYPE = 'EXPLICIT'
AND ppsrc.SOURCE_SYSTEM_ID = 2
AND ppsrc.CALENDAR_ID IN ('29067')
LEFT OUTER JOIN AIF_GL_PERIODS_STG prd
ON prd.PERIOD_ID = ppsrc.PERIOD_ID
AND prd.SOURCE_SYSTEM_ID = ppsrc.SOURCE_SYSTEM_ID
AND prd.CALENDAR_ID = ppsrc.CALENDAR_ID
AND prd.SETID = '0'
AND prd.PERIOD_TYPE = '507'
AND prd.ADJUSTMENT_PERIOD_FLAG = 'N'
WHERE brl.LOADID = 22
ORDER BY pp.PERIODKEY
,GL_EFFECTIVE_PERIOD_NUM
2013-11-08 11:49:14,235 DEBUG [AIF]: CommData.insertPeriods - END
2013-11-08 11:49:14,240 DEBUG [AIF]: CommData.moveData - START
2013-11-08 11:49:14,242 DEBUG [AIF]: CommData.getPovList - START
2013-11-08 11:49:14,242 DEBUG [AIF]:
SELECT PARTITIONKEY
,PARTNAME
,CATKEY
,CATNAME
,PERIODKEY
,COALESCE(PERIODDESC, TO_CHAR(PERIODKEY,'YYYY-MM-DD HH24:MI:SS')) PERIODDESC
,RULE_ID
,RULE_NAME
FROM (
SELECT DISTINCT brl.PARTITIONKEY
,part.PARTNAME
,brl.CATKEY
,cat.CATNAME
,pprd.PERIODKEY
,pp.PERIODDESC
,brl.RULE_ID
,br.RULE_NAME
FROM AIF_BAL_RULE_LOADS brl
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = brl.PARTITIONKEY
INNER JOIN TPOVCATEGORY cat
ON cat.CATKEY = brl.CATKEY
INNER JOIN AIF_PROCESS_PERIODS pprd
ON pprd.PROCESS_ID = brl.LOADID
LEFT OUTER JOIN TPOVPERIOD pp
ON pp.PERIODKEY = pprd.PERIODKEY
WHERE brl.LOADID = 22
) q
ORDER BY PARTITIONKEY
,CATKEY
,PERIODKEY
,RULE_ID
2013-11-08 11:49:14,244 DEBUG [AIF]: CommData.getPovList - END
2013-11-08 11:49:14,245 INFO [AIF]:
Move Data for Period 'JAN'
2013-11-08 11:49:14,246 DEBUG [AIF]:
UPDATE TDATASEG
SET LOADID = 22
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND RULE_ID = 1
AND LOADID < 22
AND PERIODKEY = '2012-01-01'
2013-11-08 11:49:14,320 DEBUG [AIF]: Number of Rows updated on TDATASEG: 1842
2013-11-08 11:49:14,320 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_AUDIT (
LOADID
,TARGET_APPLICATION_TYPE
,TARGET_APPLICATION_NAME
,PLAN_TYPE
,SOURCE_LEDGER_ID
,EPM_YEAR
,EPM_PERIOD
,SNAPSHOT_FLAG
,SEGMENT_FILTER_FLAG
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,EXPORT_TO_TARGET_FLAG
SELECT DISTINCT 22
,TARGET_APPLICATION_TYPE
,TARGET_APPLICATION_NAME
,PLAN_TYPE
,SOURCE_LEDGER_ID
,EPM_YEAR
,EPM_PERIOD
,SNAPSHOT_FLAG
,SEGMENT_FILTER_FLAG
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,'N'
FROM AIF_APPL_LOAD_AUDIT
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND RULE_ID = 1
AND LOADID < 22
AND PERIODKEY = '2012-01-01'
2013-11-08 11:49:14,321 DEBUG [AIF]: Number of Rows inserted into AIF_APPL_LOAD_AUDIT: 1
2013-11-08 11:49:14,322 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_PRD_AUDIT (
LOADID
,GL_PERIOD_ID
,GL_PERIOD_YEAR
,DELTA_RUN_ID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
SELECT DISTINCT 22
,GL_PERIOD_ID
,GL_PERIOD_YEAR
,DELTA_RUN_ID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
FROM AIF_APPL_LOAD_PRD_AUDIT
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND RULE_ID = 1
AND LOADID < 22
AND PERIODKEY = '2012-01-01'
2013-11-08 11:49:14,323 DEBUG [AIF]: Number of Rows inserted into AIF_APPL_LOAD_PRD_AUDIT: 1
2013-11-08 11:49:14,325 DEBUG [AIF]: CommData.moveData - END
2013-11-08 11:49:14,332 DEBUG [AIF]: CommData.updateWorkflow - START
2013-11-08 11:49:14,332 DEBUG [AIF]:
SELECT tlp.PROCESSSTATUS
,tlps.PROCESSSTATUSDESC
,CASE WHEN (tlp.INTLOCKSTATE = 60) THEN 'Y' ELSE 'N' END LOCK_FLAG
FROM TLOGPROCESS tlp
,TLOGPROCESSSTATES tlps
WHERE tlp.PARTITIONKEY = 1
AND tlp.CATKEY = 1
AND tlp.PERIODKEY = '2012-01-01'
AND tlp.RULE_ID = 1
AND tlps.PROCESSSTATUSKEY = tlp.PROCESSSTATUS
2013-11-08 11:49:14,336 DEBUG [AIF]:
UPDATE TLOGPROCESS
SET PROCESSENDTIME = CURRENT_TIMESTAMP
,PROCESSSTATUS = 20
,PROCESSEXP = 0
,PROCESSENTLOAD = 0
,PROCESSENTVAL = 0
,PROCESSEXPNOTE = NULL
,PROCESSENTLOADNOTE = NULL
,PROCESSENTVALNOTE = NULL
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,338 DEBUG [AIF]: CommData.updateWorkflow - END
2013-11-08 11:49:14,339 DEBUG [AIF]: CommData.purgeInvalidRecordsTDATASEG - START
2013-11-08 11:49:14,341 DEBUG [AIF]:
DELETE FROM TDATASEG
WHERE LOADID = 22
AND (
PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
AND VALID_FLAG = 'N'
2013-11-08 11:49:14,342 DEBUG [AIF]: Number of Rows deleted from TDATASEG: 0
2013-11-08 11:49:14,342 DEBUG [AIF]: CommData.purgeInvalidRecordsTDATASEG - END
2013-11-08 11:49:14,344 DEBUG [AIF]: CommData.updateAppLoadAudit - START
2013-11-08 11:49:14,344 DEBUG [AIF]:
UPDATE AIF_APPL_LOAD_AUDIT
SET EXPORT_TO_TARGET_FLAG = 'Y'
WHERE LOADID = 22
AND PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY= '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,345 DEBUG [AIF]: Number of Rows updated on AIF_APPL_LOAD_AUDIT: 1
2013-11-08 11:49:14,345 DEBUG [AIF]: CommData.updateAppLoadAudit - END
2013-11-08 11:49:14,345 DEBUG [AIF]: CommData.updateWorkflow - START
2013-11-08 11:49:14,346 DEBUG [AIF]:
UPDATE TLOGPROCESS
SET PROCESSENDTIME = CURRENT_TIMESTAMP
,PROCESSSTATUS = 21
,PROCESSEXP = 1
,PROCESSEXPNOTE = 'Export Successful'
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,347 DEBUG [AIF]: CommData.updateWorkflow - END
2013-11-08 11:49:14,347 DEBUG [AIF]: CommData.exportData - END
2013-11-08 11:49:14,404 DEBUG [AIF]: HfmData.loadData - START
2013-11-08 11:49:14,404 DEBUG [AIF]: CommData.getRuleInfo - START
2013-11-08 11:49:14,404 DEBUG [AIF]:
SELECT brl.RULE_ID
,br.RULE_NAME
,brl.PARTITIONKEY
,brl.CATKEY
,part.PARTVALGROUP
,br.SOURCE_SYSTEM_ID
,ss.SOURCE_SYSTEM_TYPE
,CASE
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'EBS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'PS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FUSION%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FILE%' THEN 'N'
ELSE 'Y'
END SOURCE_ADAPTER_FLAG
,app.APPLICATION_ID
,app.TARGET_APPLICATION_NAME
,app.TARGET_APPLICATION_TYPE
,app.DATA_LOAD_METHOD
,brl.PLAN_TYPE
,CASE brl.PLAN_TYPE
WHEN 'PLAN1' THEN 1
WHEN 'PLAN2' THEN 2
WHEN 'PLAN3' THEN 3
WHEN 'PLAN4' THEN 4
WHEN 'PLAN5' THEN 5
ELSE 0
END PLAN_NUMBER
,br.INCL_ZERO_BALANCE_FLAG
,br.PERIOD_MAPPING_TYPE
,br.INCLUDE_ADJ_PERIODS_FLAG
,br.BALANCE_TYPE ACTUAL_FLAG
,br.AMOUNT_TYPE
,br.BALANCE_SELECTION
,br.BALANCE_METHOD_CODE
,COALESCE(br.SIGNAGE_METHOD, 'ABSOLUTE') SIGNAGE_METHOD
,br.CURRENCY_CODE
,br.BAL_SEG_VALUE_OPTION_CODE
,brl.EXECUTION_MODE
,COALESCE(brl.IMPORT_FROM_SOURCE_FLAG, 'Y') IMPORT_FROM_SOURCE_FLAG
,COALESCE(brl.RECALCULATE_FLAG, 'N') RECALCULATE_FLAG
,COALESCE(brl.EXPORT_TO_TARGET_FLAG, 'N') EXPORT_TO_TARGET_FLAG
,CASE
WHEN (br.LEDGER_GROUP_ID IS NOT NULL) THEN 'MULTI'
WHEN (br.SOURCE_LEDGER_ID IS NOT NULL) THEN 'SINGLE'
ELSE 'NONE'
END LEDGER_GROUP_CODE
,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
,br.LEDGER_GROUP
,(SELECT brd.DETAIL_CODE
FROM AIF_BAL_RULE_DETAILS brd
WHERE brd.RULE_ID = br.RULE_ID
AND brd.DETAIL_TYPE = 'LEDGER'
) PS_LEDGER
,CASE lg.LEDGER_TEMPLATE
WHEN 'COMMITMENT' THEN 'Y'
ELSE 'N'
END KK_FLAG
,p.LAST_UPDATED_BY
,p.AIF_WEB_SERVICE_URL WEB_SERVICE_URL
,p.EPM_ORACLE_INSTANCE
FROM AIF_PROCESSES p
INNER JOIN AIF_BAL_RULE_LOADS brl
ON brl.LOADID = p.PROCESS_ID
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN AIF_SOURCE_SYSTEMS ss
ON ss.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
INNER JOIN AIF_TARGET_APPLICATIONS app
ON app.APPLICATION_ID = brl.APPLICATION_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = br.PARTITIONKEY
INNER JOIN TBHVIMPGROUP imp
ON imp.IMPGROUPKEY = part.PARTIMPGROUP
LEFT OUTER JOIN AIF_COA_LEDGERS l
ON l.SOURCE_SYSTEM_ID = p.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = COALESCE(br.SOURCE_LEDGER_ID,imp.IMPSOURCELEDGERID)
LEFT OUTER JOIN AIF_PS_SET_CNTRL_REC_STG scr
ON scr.SOURCE_SYSTEM_ID = l.SOURCE_SYSTEM_ID
AND scr.SETCNTRLVALUE = l.SOURCE_LEDGER_NAME
AND scr.RECNAME = 'LED_GRP_TBL'
LEFT OUTER JOIN AIF_PS_LED_GRP_TBL_STG lg
ON lg.SOURCE_SYSTEM_ID = scr.SOURCE_SYSTEM_ID
AND lg.SETID = scr.SETID
AND lg.LEDGER_GROUP = br.LEDGER_GROUP
WHERE p.PROCESS_ID = 22
2013-11-08 11:49:14,406 DEBUG [AIF]:
SELECT adim.BALANCE_COLUMN_NAME DIMNAME
,adim.DIMENSION_ID
,dim.TARGET_DIMENSION_CLASS_NAME
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1
) COA_SEGMENT_NAME1
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2
) COA_SEGMENT_NAME2
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3
) COA_SEGMENT_NAME3
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4
) COA_SEGMENT_NAME4
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5
) COA_SEGMENT_NAME5
,(SELECT CASE mdd.ORPHAN_OPTION_CODE
WHEN 'CHILD' THEN 'N'
WHEN 'ROOT' THEN 'N'
ELSE 'Y'
END DIMENSION_FILTER_FLAG
FROM AIF_MAP_DIM_DETAILS_V mdd
,AIF_MAPPING_RULES mr
WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
AND mdd.RULE_ID = mr.RULE_ID
AND mdd.DIMENSION_ID = adim.DIMENSION_ID
) DIMENSION_FILTER_FLAG
,tiie.IMPCONCATCHAR
FROM TPOVPARTITION tpp
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = 2
INNER JOIN AIF_DIMENSIONS dim
ON dim.DIMENSION_ID = adim.DIMENSION_ID
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
AND tiie.IMPMAPTYPE = 'ERP'
WHERE tpp.PARTITIONKEY = 1
AND adim.BALANCE_COLUMN_NAME IS NOT NULL
ORDER BY adim.BALANCE_COLUMN_NAME
2013-11-08 11:49:14,407 DEBUG [AIF]: {'APPLICATION_ID': 2L, 'IMPORT_FROM_SOURCE_FLAG': u'N', 'PLAN_TYPE': None, 'RULE_NAME': u'VISIONRULE', 'ACTUAL_FLAG': u'A', 'IS_INCREMENTAL_LOAD': False, 'EPM_ORACLE_INSTANCE': u'C:\\Oracle\\Middleware\\user_projects\\epmsystem1', 'CATKEY': 1L, 'BAL_SEG_VALUE_OPTION_CODE': u'ALL', 'INCLUDE_ADJ_PERIODS_FLAG': u'N', 'PERIOD_MAPPING_TYPE': u'EXPLICIT', 'SOURCE_SYSTEM_TYPE': u'EBS_R12', 'LEDGER_GROUP': None, 'TARGET_APPLICATION_NAME': u'OASIS', 'RECALCULATE_FLAG': u'N', 'SOURCE_SYSTEM_ID': 2L, 'TEMP_DATA_TABLE_NAME': 'TDATASEG_T', 'KK_FLAG': u'N', 'AMOUNT_TYPE': u'MONETARY', 'EXPORT_TO_TARGET_FLAG': u'Y', 'DATA_TABLE_NAME': 'TDATASEG', 'DIMNAME_LIST': [u'ACCOUNT', u'ENTITY', u'ICP', u'UD1', u'UD2', u'UD3', u'UD4'], 'TDATAMAPTYPE': 'ERP', 'LAST_UPDATED_BY': u'admin', 'DIMNAME_MAP': {u'UD3': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT5', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD3', 'DIMENSION_ID': 9L}, u'ICP': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'ICP', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT7', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ICP', 'DIMENSION_ID': 8L}, u'ENTITY': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Entity', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT1', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ENTITY', 'DIMENSION_ID': 12L}, u'UD2': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT4', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD2', 'DIMENSION_ID': 11L}, u'UD4': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT6', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD4', 'DIMENSION_ID': 1L}, u'ACCOUNT': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Account', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT3', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ACCOUNT', 'DIMENSION_ID': 10L}, u'UD1': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT2', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD1', 'DIMENSION_ID': 7L}}, 'TARGET_APPLICATION_TYPE': u'HFM', 'PARTITIONKEY': 1L, 'PARTVALGROUP': u'[NONE]', 'LEDGER_GROUP_CODE': u'SINGLE', 'INCLUDE_ZERO_BALANCE_FLAG': u'N', 'EXECUTION_MODE': u'SNAPSHOT', 'PLAN_NUMBER': 0L, 'PS_LEDGER': None, 'BALANCE_SELECTION': u'FUNCTIONAL', 'BALANCE_AMOUNT_IS': u'PERIODIC', 'RULE_ID': 1L, 'BALANCE_AMOUNT_BS': u'YTD', 'CURRENCY_CODE': None, 'SOURCE_ADAPTER_FLAG': u'N', 'BALANCE_METHOD_CODE': u'STANDARD', 'SIGNAGE_METHOD': u'SAME', 'WEB_SERVICE_URL': u'http://localhost:9000/aif', 'DATA_LOAD_METHOD': u'CLASSIC_VIA_EPMI'}
2013-11-08 11:49:14,407 DEBUG [AIF]: CommData.getRuleInfo - END
2013-11-08 11:49:14,407 DEBUG [AIF]: CommData.getPovList - START
2013-11-08 11:49:14,407 DEBUG [AIF]:
SELECT PARTITIONKEY
,PARTNAME
,CATKEY
,CATNAME
,PERIODKEY
,COALESCE(PERIODDESC, TO_CHAR(PERIODKEY,'YYYY-MM-DD HH24:MI:SS')) PERIODDESC
,RULE_ID
,RULE_NAME
FROM (
SELECT DISTINCT brl.PARTITIONKEY
,part.PARTNAME
,brl.CATKEY
,cat.CATNAME
,pprd.PERIODKEY
,pp.PERIODDESC
,brl.RULE_ID
,br.RULE_NAME
FROM AIF_BAL_RULE_LOADS brl
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = brl.PARTITIONKEY
INNER JOIN TPOVCATEGORY cat
ON cat.CATKEY = brl.CATKEY
INNER JOIN AIF_PROCESS_PERIODS pprd
ON pprd.PROCESS_ID = brl.LOADID
LEFT OUTER JOIN TPOVPERIOD pp
ON pp.PERIODKEY = pprd.PERIODKEY
WHERE brl.LOADID = 22
) q
ORDER BY PARTITIONKEY
,CATKEY
,PERIODKEY
,RULE_ID
2013-11-08 11:49:14,409 DEBUG [AIF]: CommData.getPovList - END
2013-11-08 11:49:14,409 DEBUG [AIF]: CommData.updateWorkflow - START
2013-11-08 11:49:14,409 DEBUG [AIF]:
SELECT tlp.PROCESSSTATUS
,tlps.PROCESSSTATUSDESC
,CASE WHEN (tlp.INTLOCKSTATE = 60) THEN 'Y' ELSE 'N' END LOCK_FLAG
FROM TLOGPROCESS tlp
,TLOGPROCESSSTATES tlps
WHERE tlp.PARTITIONKEY = 1
AND tlp.CATKEY = 1
AND tlp.PERIODKEY = '2012-01-01'
AND tlp.RULE_ID = 1
AND tlps.PROCESSSTATUSKEY = tlp.PROCESSSTATUS
2013-11-08 11:49:14,410 DEBUG [AIF]:
UPDATE TLOGPROCESS
SET PROCESSENDTIME = CURRENT_TIMESTAMP
,PROCESSSTATUS = 30
,PROCESSENTLOAD = 0
,PROCESSENTVAL = 0
,PROCESSENTLOADNOTE = NULL
,PROCESSENTVALNOTE = NULL
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,411 DEBUG [AIF]: CommData.updateWorkflow - END
2013-11-08 11:49:14,412 DEBUG [AIF]:
SELECT COALESCE(usr.PROFILE_OPTION_VALUE, app.PROFILE_OPTION_VALUE, site.PROFILE_OPTION_VALUE) PROFILE_OPTION_VALUE
FROM AIF_PROFILE_OPTIONS po
LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES site
ON site.PROFILE_OPTION_NAME = po.PROFILE_OPTION_NAME
AND site.LEVEL_ID = 1000
AND site.LEVEL_VALUE = 0
AND site.LEVEL_ID <= po.MAX_LEVEL_ID
LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES app
ON app.PROFILE_OPTION_NAME = site.PROFILE_OPTION_NAME
AND app.LEVEL_ID = 1005
AND app.LEVEL_VALUE = NULL
AND app.LEVEL_ID <= po.MAX_LEVEL_ID
LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES usr
ON usr.PROFILE_OPTION_NAME = usr.PROFILE_OPTION_NAME
AND usr.LEVEL_ID = 1010
AND usr.LEVEL_VALUE = NULL
AND usr.LEVEL_ID <= po.MAX_LEVEL_ID
WHERE po.PROFILE_OPTION_NAME = 'JAVA_HOME'
2013-11-08 11:49:14,413 DEBUG [AIF]: HFM Load command:
%EPM_ORACLE_HOME%/products/FinancialDataQuality/bin/HFM_LOAD.vbs "22" "a9E3uvSJNhkFhEQTmuUFFUElfdQgKJKHrb1EsjRsL6yZJlXsOFcVPbGWHhpOQzl9zvHoo3s%2Bdq6R4yJhp0GMNWIKMTcizp3%2F8HASkA7rVufXDWEpAKAK%2BvcFmj4zLTI3rtrKHlVEYrOLMY453J2lXk6Cy771mNSD8X114CqaWSdUKGbKTRGNpgE3BfRGlEd1wZ3cra4ee0jUbT2aTaiqSN26oVe6dyxM3zolc%2BOPkjiDNk1MqwNr43tT3JsZz4qEQGF9d39DRN3CDjUuZRPt4SEKSSL35upncRJiw2uBOtV%2FvSuGLNpZ2J79v1%2Ba1Oo9c4Xhig7SFYbE6Jwk1yXRJLTSw0DKFu%2FEpcdjpOnx%2F6YawMBNIa5iu5L637S91jT1Xd3EGmxZFq%2Bi6bHdCJAC8g%3D%3D" "C%3A%5COracle%5CMiddleware%5Cuser_projects%5Cepmsystem1" "%25EPM_ORACLE_HOME%25%2F..%2Fjdk160_35"
Is there anywhere we need to set the EPM home? Also, could you let me know what PSU1 is?
Thanks in advance.
Praneeth -
Error While Loading data from HFM to Essbase
Hi,
I am trying to load data from an HFM application to an Essbase application. The LKM I am using is "HFM Data to SQL" and the IKM is "SQL to Hyperion Essbase (DATA)". I am also using a couple of mapping expressions like {CASEWHEN(HFMDATA."Account" = '3110', 'PL10100', HFMDATA."Account")}, {CASEWHEN(HFMDATA."Period" = 'January', 'Jan', HFMDATA."Period")}, etc.
The error I am getting looks like this:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
File "<string>", line 23, in ?
com.hyperion.odi.essbase.ODIEssbaseException: Invalid column type specified for Data column [Account]
at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
at org.python.core.PyMethod.__call__(PyMethod.java)
at org.python.core.PyObject.__call__(PyObject.java)
at org.python.core.PyInstance.invoke(PyInstance.java)
I am mapping the Account dimension of HFM to the Account dimension of Essbase. Both the source and target column types are "string" with length 80. If, instead of the Essbase application, I keep an Oracle table (whose columns are exactly like the Essbase application) as the target, ODI loads the data perfectly with proper conversion.
Can anybody please help me out.
Many thanks.
N. Laha.
Hi John,
Thank you very much for your response. Pasting here the SQL generated at step "8. Integration - Load data into Essbase", from the Description tab in the Operator, for your perusal -
from com.hyperion.odi.common import ODIConstants
from com.hyperion.odi.connection import HypAppConnectionFactory
from java.lang import Class
from java.lang import Boolean
from java.sql import *
from java.util import HashMap
# Get the select statement on the staging area:
sql= """select CASEWHEN(C3_ACCOUNT = '3110', 'PL10100', C3_ACCOUNT) "Account",CASEWHEN(C4_PERIOD = 'January', 'Jan', C4_PERIOD) "TimePeriod",'HSP_Inputvalue' "HSP_Rates",C8_SCENARIO "Scenario",CASEWHEN(C2_YEAR = '2007', 'FY09', C2_YEAR) "Years",CASEWHEN(C1_CUSTOM1 = '2012', 'Atacand', C1_CUSTOM1) "Product",CASEWHEN(C7_VIEW = 'YTD', 'Input', C7_VIEW) "Version",CASEWHEN(C6_VALUE = 'USD', 'Local', C6_VALUE) "Currency",C9_ENTITY "Entity",C5_DATAVALUE "Data" from "C$_0AZGRP_PLData" where (1=1) """
srcCx = odiRef.getJDBCConnection("SRC")
stmt = srcCx.createStatement()
srcFetchSize=30
stmt.setFetchSize(srcFetchSize)
rs = stmt.executeQuery(sql)
#load the data
stats = pWriter.loadData(rs)
#close the database result set, connection
rs.close()
stmt.close()
From the SQL you can see the CASEWHEN expressions. Yes, I added these in the interface mapping section with the help of the ODI expression editor. Below are the expressions I have used, one statement per mapping (a lookup-table sketch of the same translations follows the list). This is just to make the HFM data compatible with Essbase. For example, the account number in HFM from which I am extracting data is 3110, but there is no account '3110' in the Essbase application, so the Essbase application would not be able to load the data. To make the HFM data compatible with Essbase, these CASEWHEN statements have been used. I hope it's fine to use such statements in the mappings.
CASEWHEN(HFMDATA."Year" = '2007', 'FY09', HFMDATA."Year")
CASEWHEN(HFMDATA."View" = 'YTD', 'Input', HFMDATA."View")
CASEWHEN(HFMDATA."Entity" = '100106_LC', '1030GB', HFMDATA."Entity")
CASEWHEN(HFMDATA."Value" = 'USD', 'Local', HFMDATA."Value")
CASEWHEN(HFMDATA."Account" = '3110', 'PL10100', HFMDATA."Account")
CASEWHEN(HFMDATA."Custom1" = '2012', 'Atacand', HFMDATA."Custom1")
CASEWHEN(HFMDATA."Period" = 'January', 'Jan', HFMDATA."Period")
If you can get some idea from the SQL and help me out!
Many thanks.
N. Laha
Edited by: user8732338 on Sep 28, 2009 9:45 PM -
Hello All,
We had an EPMA-type HFM application in which all dimensions were local. The application validated and deployed successfully.
We tried loading data into the HFM application and the data load was successful.
Then we decided to convert all of the local dimensions of the above-mentioned HFM application into shared dimensions. After converting all the dimensions to shared successfully, we are getting errors while loading data into the same HFM application (the app still validates and can be deployed after the changes).
The Error log is below:
Load data started: 11/29/2014 10:53:15.
Line: 216, Error: Invalid cell for Period Oct.
ACTUAL;2014; Oct; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;11979
>>>>>>
Line: 217, Error: Invalid cell for Period Nov.
ACTUAL;2014; Nov; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;23544
>>>>>>
Line: 218, Error: Invalid cell for Period Dec.
ACTUAL;2014; Dec; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;58709
>>>>>>
Line: 219, Error: Invalid cell for Period Oct.
ACTUAL;2014; Oct; YTD; E_2100;<Entity Currency>;28050000; E_6000_20; [None]; [None]; [None]; FARM21000;-11979
>>>>>>
Line: 220, Error: Invalid cell for Period Nov.
ACTUAL;2014; Nov; YTD; E_2100;<Entity Currency>;28050000; E_6000_20; [None]; [None]; [None]; FARM21000;-11565
>>>>>>
I wanted to know whether there is something I might have missed while converting the local dimensions into shared ones (whether there is a sequence to follow, or any constraint that I may not be aware of, though the conversion looks fine since the application validates and deploys after the changes).
What can be the reason for the failed data load, can anyone help?
Thanks
Arpan
Hi,
I would look at the account properties for that account (89920000) and see the TopCustom1...4Member. You will find the reason behind the invalid cells.
When you converted the local dimensions to shared, did you check the 'Dimension Association' for Accounts and Entities?
It does seem to lose the dimension associations if a proper sequence is not followed.
Regards,
S