HFM Data Load Error in ODI
Hi,
I'm loading data into HFM from a flat file. When the interface is executed, only some of the data gets loaded. When I checked the log for errors, I found the following message:
'Line: 56, Error: Invalid cell for Period Apr'
Then I found that it is an invalid intersection in HFM that I am trying to load.
In FDM there is an option to validate invalid intersections during the data load.
I would like to know how to do this in ODI to overcome this kind of error, i.e. is there any option in ODI to ignore or handle such errors?
Kindly help me.
Thanks in advance
Hi,
I think even if the metadata exists, there might still be issues with HFM forbidden cells. HFM rules determine which intersections are loadable/editable and which are not. Please check with your HFM admin about the forbidden rules, or change the properties of the Custom dimensions so that they accept data in all intersections.
Thanks,
Debasis
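There is no IKM option to silently skip invalid intersections, but the load log that HFM writes can be scanned after each run so that rejected lines are at least reported (or the session failed deliberately). A minimal sketch, assuming the message format quoted above ('Line: 56, Error: Invalid cell for Period Apr'):

```python
import re

# Matches the HFM rejection messages quoted above (format assumed).
ERROR_RE = re.compile(r"Line:\s*(\d+),\s*Error:\s*(.+)")

def rejected_lines(log_text):
    """Return (line_number, message) tuples for every rejected record."""
    return [(int(m.group(1)), m.group(2).strip())
            for m in ERROR_RE.finditer(log_text)]
```

Running this over the load log after the interface step makes the rejects visible to the calling process instead of leaving them buried in the file.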
Similar Messages
-
Hi,
I'm new to Oracle data Integrator.
I have an issue with the log file name. I'm loading data into Hyperion Financial Management through ODI. In the interface, when we select the IKM SQL to HFM Data, there is an option to enable a log file. I set it to true and gave the log file name as 'HFM_dataload.log'. After executing the interface, when I navigate to that log folder and view the log file, the file is blank; instead, a new file 'HFM_dataloadHFM6064992926974374087.log' is created and the log details are written to it. Since I have to automate picking up the daily log file:
* I need the log details to be written to the specified log name, i.e. 'HFM_dataload.log'.
Also, I am not able to perform any action on the generated log file (copy it into another file, or send it by mail), since I cannot predict the numbers appended to the specified log file name.
Kindly help me to overcome this issue.
Thanks in advance.
Edited by: user13754156 on Jun 27, 2011 5:08 AM
Edited by: user13754156 on Jun 27, 2011 5:09 AM
Thanks a lot for the idea.
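Since the adapter appends a generated number to the configured log name, one workaround is to pick the newest file matching the prefix after each run and copy it to a fixed name for the downstream automation. A hedged sketch; the directory, prefix, and target name here are assumptions:

```python
import glob
import os
import shutil

def latest_log(log_dir, prefix="HFM_dataload"):
    """Return the most recently modified log file starting with the prefix."""
    candidates = glob.glob(os.path.join(log_dir, prefix + "*.log"))
    return max(candidates, key=os.path.getmtime) if candidates else None

def snapshot_log(log_dir, target="latest.log"):
    """Copy the newest generated log to a predictable name for automation."""
    src = latest_log(log_dir)
    if src:
        shutil.copyfile(src, os.path.join(log_dir, target))
    return src
```

A scheduled step calling `snapshot_log` right after the interface gives the mail/copy jobs a stable file name to work with.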
I have a question about HFM data loads. In ODI Operator, only a warning symbol is shown when a few records get rejected, instead of an error. Is it possible to make the session fail if one or more records are rejected?
I have experience with Essbase data loads: if the rejects reach a specified number of records, the operator step fails.
Please guide me if I am missing something.
Regards,
PrakashV -
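For the reject-on-warning question above: the HFM/Planning KMs only warn on partial rejects, but a follow-on procedure step can parse the load log and raise an error, which fails the session. A sketch, assuming the "Number of rows rejected: N" line these adapters write (such a line appears in a log further down this page):

```python
import re

def check_rejects(log_text, max_rejected=0):
    """Raise an error when the load log reports more rejects than allowed."""
    m = re.search(r"Number of rows rejected:\s*(\d+)", log_text)
    rejected = int(m.group(1)) if m else 0
    if rejected > max_rejected:
        raise RuntimeError("%d records rejected - failing the load" % rejected)
    return rejected
```

In ODI this would typically run as a Jython task right after the load step, mirroring the reject threshold the Essbase KMs offer.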
Most common BW data load errors in production and how to solve them ..
Hi All,
Most common BW data load errors in production and how to solve them ..
If there is any doc on it, please send it across to this id: [email protected]
Thanks in advance.
Rgrds
shobahi
1) RFC connection lost.
2) Invalid characters while loading.
3) ALEREMOTE user is locked.
4) Lower-case letters not allowed.
5) While loading the data, a message that a record was rejected because the field mentioned in the error message is not mapped to any InfoObject in the transfer rules.
6) Object locked.
7) "Non-updated IDocs found in source system".
8) While loading master data, one of the data packages has a red-light error message: master data/text of characteristic 'so and so' already deleted.
9) Extraction job aborted in R/3.
10) Request could not be activated because there is another request in the PSA with a smaller SID.
11) Repeat of last delta not possible.
12) DataSource not replicated.
13) DataSource/transfer structure not active.
14) IDoc or tRFC error.
15) ODS activation error -
What are the frequent data load errors in production support?
what are the frequent data load errors in production support?
It is a long list. Here are some of them:
1. IDoc not arriving from the source system.
2. Previous processes failed.
3. Change run unsuccessful or did not run properly, leaving the master data inconsistent.
4. Invalid characters in the data.
5. Duplicate records found.
...and on and on.
Ravi Thotahdri -
Data Loading Error: Too many error records.
Hi All,
I got a data load error when loading from a flat file.
The error message is:
Too many error records - update terminated
Error 18 in the update
No SID found for value '00111805' of characteristic ZUNIQUEID (message no. 70)
Can anybody help in resolving the issue?
Regards,
Chakravarthy
Hi,
Check the format of your characteristics and key figures.
Check that you put the data separators in your flat file appropriately.
For the particular characteristic ZUNIQUEID, ensure the data is consistent; check the related tables.
Assign points if useful
Regards
N Ganesh -
Data load error - please respond ASAP
Hi ,
When I am trying to load data for year '07 using rule files, it shows an error and aborts the data load.
Error message: "Data Value [2007] Encountered before all dimensions Selected, [5] Records Completed"
Any help greatly appreciated.
Thanks
The key here is that 5 records were loaded before failing on the 6th, indicating it's not a header issue in the load rule.
Check the first 5 records and see how the column containing 2007 differs between them and the failing 6th record. Perhaps the 6th record has a null (as indicated earlier), or perhaps 2007 is not the name of a member in the outline.
I have had both issues give similar error results.
J -
Data load error regarding timestamp
Hi All,
I am getting data load error related to time.
I am using 2LIS_02_SCN for the CNFTM (timestamp) field.
Some records come with the value "08:00:", which needs to be converted to "08:00:00"; the rest of the records come with a normal 6-digit time.
Please send your suggestions as early as possible.
Thanks
Rupa
Hi Rupa,
Try the following sample code in the transfer rule for the same InfoObject; it should work:
DATA lv_len TYPE i.
lv_len = strlen( str ).
IF lv_len = 4.
  CONCATENATE str '00' INTO str.
ENDIF.
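The same padding rule is easy to prototype outside BW; a Python sketch, under the assumption that the incoming value has its colons stripped and is then right-padded with zeros to six digits (HHMMSS):

```python
def normalize_time(value):
    """Strip colons and right-pad with zeros to a 6-digit HHMMSS string."""
    digits = value.replace(":", "")
    return digits.ljust(6, "0")
```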
Regards
Kapadia -
Common data load errors in BI 7
Hi, I would appreciate it if anyone could provide a list of the most common data load errors affecting both master and transactional data in SAP BI 7 or BW 3.5.
Hi AAL,
There are lots of loading-error posts in this forum. This one, I guess, is good for you:
production support errors
production support issues
Or you can search yourself using keywords like "common load error" etc.
Hope the above helps.
Regards,
Frank -
Sap bw 7.3 master data load error
sap bw 7.3 master data load error
Error: Exception in Substep Rules
Hi Jayram,
I am assuming that you are getting this error when you are loading data from the PSA to an InfoObject. If so, the error might be because of:
1. Duplicate records
The PSA might have duplicate records, but you will be able to load only one record into the master data InfoObject. In the DTP Update tab, there is an option "Handle Duplicate Record Keys". Just enable this and try to load again.
2. Erroneous records.
Some special records are allowed in the PSA that might not be allowed in master data/data targets, like lower-case letters or some special characters. If the error is because of this, you might need to correct the data at the PSA level and reload it to master data, or get it corrected in the source system itself and fetch the data into BW again, or write some code to take care of these special characters.
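The kind of cleansing code mentioned here can be sketched as follows; the allowed-character list is an assumption and should match the permitted characters configured in the target system (transaction RSKC):

```python
# Characters assumed permitted in BW characteristic values (see RSKC).
ALLOWED = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 -_")

def cleanse(value):
    """Uppercase the value and drop characters BW would reject."""
    return "".join(ch for ch in value.upper() if ch in ALLOWED)
```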
Hope it helps!
Regards,
Pavan -
hi friends ,
we are getting a data load error: Process 000087 determined for InfoObject DUVTYPE started (STATUS tab), and in the DETAILS tab, update (251 new / 160 changed): errors occurred.
What should I take from this information, what are the possible reasons for this load failure, and how can it be rectified?
This load is through process chains and is also a delta load.
Can I delete the request and start the InfoPackage manually? Also, will that make my process chain green automatically, or should I repeat from the subsequent process?
Please help.
I will award points.
regards,
siddartha
Hi siddharta,
Look in the details of the monitor to see the error. Be very cautious with deltas! They cannot always be repeated; it depends on the DataSource. What you can do in the meantime is delete the failed request from all(!) the data targets.
Regards,
Jzergen -
HFM DATA LOAD WITH ODI HANGS LONG TIME
Hi all,
There is a very strange problem when I load data from MS SQL Server to HFM with ODI. Specifically, there are 2 interfaces to 2 applications on the same HFM server, with data volumes of about 1,300,000 and 650,000 rows respectively.
The strange thing is that when I execute each interface individually, it sometimes works well. However, when I execute the package containing the 2 interfaces, the larger one almost always hangs for 10+ hours, whether I use an agent or not.
After some research, it seems that the session hangs because it cannot get return info from HFM even though the data load has already completed. I found some similar problems on OTN, like 64-bit driver and JRE compatibility errors or a deadlock on a table, but they differ from this one. So, can anyone help with this? Much appreciated in advance!
BTW, ODI and HFM are on the same server, but the ODI repository and the source of the interface are on another MS SQL data server. The versions are as below:
HFM 11.1.1.3.0.956
ODI 11.1.1.6.0
win server 2003 x86
MS SQLServer 2008 R2
win server 2008 x64
Regards,
Steve
Hi SH,
The source is MS SQL Server 2008 R2, the staging area is on the source side, and the target is HFM 11.1.1.3.0.956 based on SQL Server.
The KM is the standard 'IKM SQL to Hyperion Financial Management Data'.
There is no transformation logic, only a filter to select data in the current year.
Besides, I have done some performance tuning as the guide tells:
REM #
REM # Java virtual machine
REM #
set ODI_JAVA_HOME=D:\oracle\Java\jdk1.6.0_21
REM #
REM # Other Parameters
REM #
set ODI_INIT_HEAP=512m
set ODI_MAX_HEAP=1024m
set ODI_JMX_PROTOCOL=rmi
In Regedit:
EnableServerLocking: 1
MaxDataCacheSizeinMB: 1000
MaxNumDataRecordsInRAM: 2100000
MultiServerMaxSyncDelayForApplicationChanges: 300
MultiServerMaxSyncDelayForDataChanges: 300
After some research, I think the problem is located in the HFM-ODI adapter or on the HFM side (maybe HFM cannot send a completion message back to ODI). Do you have any idea? Thanks in advance -
Hi All,
I have created one interface for loading data into a Planning application. I have made the Account dimension the load dimension and Segment the driver dimension (the member is Audio).
I have created one form for this. Now when I run the interface it executes successfully, but the report statistics show the following problem:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
File "<string>", line 2, in ?
Planning Writer Load Summary:
Number of rows successfully processed: 0
Number of rows rejected: 48
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:345)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:169)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2374)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java:1615)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java:1580)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java:2755)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java:68)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2515)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:534)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:449)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1954)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:322)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:224)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:246)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:237)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:794)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:114)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:619)
I found a log file for it which says:
Account,Data Load Cube Name,MP3,Point-of-View,Error_Reason
Max Hits Per Day,Consol,Jan,"USD,Cannot load dimension member, error message is: java.lang.IllegalArgumentException: Unbalanced double quotes found at character 3 in the member list: "USD
Avg Hits Per Day,Consol,Jan,"USD,Cannot load dimension member, error message is: java.lang.IllegalArgumentException: Unbalanced double quotes found at character 3 in the member list: "USD
Can anybody tell me what I did wrong? My data file does not contain any double-quote mark with USD. The POV is like this: ---> USD,E01_0 ,Jan,Current,Working,FY11
Thanks
Gourav Atalkar
Edited by: Gourav Atalkar(ELT) on May 3, 2011 4:06 AM
As it is comma separated and all the members for the POV need to be together, try enclosing it in quotes, e.g. "USD,E01_0 ,Jan,Current,Working,FY11"
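The effect of that quoting can be checked with Python's csv module: with the POV enclosed in double quotes, the commas inside it no longer split the field (the row values here are adapted from the post above):

```python
import csv
import io

# A row whose last field is a comma-separated POV, written with quoting.
row = ["Max Hits Per Day", "Consol", "USD,E01_0,Jan,Current,Working,FY11"]

buf = io.StringIO()
csv.writer(buf, quoting=csv.QUOTE_MINIMAL, lineterminator="\n").writerow(row)
line = buf.getvalue().strip()

# Reading the line back yields three fields, not eight.
parsed = next(csv.reader(io.StringIO(line)))
```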
Cheers
John
http://john-goodwin.blogspot.com/ -
Hello All,
We had an EPMA-type HFM application whose dimensions were all local. The application validated and deployed successfully.
We tried loading data into the HFM application and the data load was successful.
Then we decided to convert all of the local dimensions of the above-mentioned HFM application to shared dimensions. After converting all the dimensions successfully, we are getting errors while loading data into the same HFM application (the app does validate and can be deployed after the changes).
The Error log is below:
Load data started: 11/29/2014 10:53:15.
Line: 216, Error: Invalid cell for Period Oct.
ACTUAL;2014; Oct; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;11979
>>>>>>
Line: 217, Error: Invalid cell for Period Nov.
ACTUAL;2014; Nov; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;23544
>>>>>>
Line: 218, Error: Invalid cell for Period Dec.
ACTUAL;2014; Dec; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;58709
>>>>>>
Line: 219, Error: Invalid cell for Period Oct.
ACTUAL;2014; Oct; YTD; E_2100;<Entity Currency>;28050000; E_6000_20; [None]; [None]; [None]; FARM21000;-11979
>>>>>>
Line: 220, Error: Invalid cell for Period Nov.
ACTUAL;2014; Nov; YTD; E_2100;<Entity Currency>;28050000; E_6000_20; [None]; [None]; [None]; FARM21000;-11565
>>>>>>
I wanted to know whether there is something I might have missed while converting local dimensions into shared ones (whether there is a sequence to follow, or any constraint I may not be aware of; the conversion looks fine, as the application validates and deploys after the changes).
What can be the reason for the failed data load? Can anyone help?
Thanks
Arpan
Hi,
I would look at the account properties for that account (89920000) and check the TopCustom1...4Member settings. You will find the reason behind the invalid cells there.
When you converted the local dimensions to shared, did you check the 'Dimension Association' for Accounts and Entities?
It does seem to lose the dimension association if a proper sequence is not followed.
Regards,
S -
Hello,
We are loading one month of data from PeopleSoft to a Planning application using ERPI 11.1.2.1. It takes about 1 hour to load one month of data. Is that normal? Looking at ODI, PS_GL_LOAD_BALANCES_DATA is always successful, but COMM_LOAD_BALANCES fails at the end with this error. Has anyone had this error before? The message in the HyS9aifWeb-sysout file is:
FROM AIF_HS_BALANCES
WHERE LOADID = 132
[EssbaseRuleFile] Locking rules file AIFData
[EssbaseRuleFile] Successfully locked rules file AIFData
[EssbaseRuleFile] Copying rules file OWBPLAND for data load as AIFData
[EssbaseRuleFile] Unlocking rules file AIFData
[EssbaseRuleFile] Successfully unlocked rules file AIFData
[EssbaseRuleFile] The data rules file has been created successfully.
[EssbaseRuleFile] Locking rules file AIFData
[EssbaseRuleFile] Successfully locked rules file AIFData
[EssbaseRuleFile] Load data into the cube by launching rules file...
<Mar 1, 2012 1:09:30 PM PST> <Error> <WebLogicServer> <BEA-000337> <[STUCK] ExecuteThread: '5' for queue: 'weblogic.kernel.Default (self-tuning)' has been busy for "625" seconds working on the request "weblogic.servlet.internal.ServletRequestImpl@3bc2189[
POST /aif/services/HPLService HTTP/1.1
Content-type: text/xml;charset="utf-8"
Accept: text/xml, multipart/related, text/html, image/gif, image/jpeg, *; q=.2, */*; q=.2
Soapaction: "http://DEVAPPHPN001.indymacdev.biz:19000/aif/services/HPLServiceexecuteDataLoad"
User-Agent: JAX-WS RI 2.1.7-b01-
Content-Length: 638
Connection: Keep-Alive
Proxy-Client-IP: 10.205.101.108
X-Forwarded-For: 10.205.101.108
X-WebLogic-KeepAliveSecs: 30
X-WebLogic-Force-JVMID: 2124105652
]", which is more than the configured time (StuckThreadMaxTime) of "600" seconds. Stack trace:
Thread-176 "[STUCK] ExecuteThread: '5' for queue: 'weblogic.kernel.Default (self-tuning)'" <alive, suspended, priority=1, DAEMON> {
jrockit.net.SocketNativeIO.readBytesPinned(SocketNativeIO.java:???)
jrockit.net.SocketNativeIO.socketRead(SocketNativeIO.java:24)
java.net.SocketInputStream.socketRead0(SocketInputStream.java:???)
java.net.SocketInputStream.read(SocketInputStream.java:107)
com.essbase.services.olap.main.main_direct.EssNetClient.readFrom(Unknown Source)
com.essbase.services.olap.main.main_direct.EssNetClient.readPackage(Unknown Source)
com.essbase.services.olap.main.main_direct.EssNetClient.receiveResponse2(Unknown Source)
com.essbase.services.olap.main.main_direct.EssNetClient.adNetReceiveResponse(Unknown Source)
com.essbase.services.olap.main.main_direct.EssAPIData._adImport(Unknown Source)
com.essbase.services.olap.main.main_direct.EssAPIData.ImportToEssbaseASO(Unknown Source)
com.essbase.services.olap.main.main_direct.EssMAPIDir.BeginDataload(Unknown Source)
com.essbase.server.framework.EssOlapMainService.BeginDataload(Unknown Source)
com.essbase.server.framework.EssOrbPluginDirect.essMainBeginDataload(Unknown Source)
com.essbase.api.session.EssOrbPlugin._invokeMainMethod(Unknown Source)
com.essbase.api.session.EssOrbPlugin._invokeMethod2(Unknown Source)
com.essbase.api.session.EssOrbPlugin._invokeMethod(Unknown Source)
com.essbase.server.framework.EssOrbPluginDirect._invokeProtected(Unknown Source)
com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
com.essbase.api.session.EssOrbPlugin.essMainBeginDataload(Unknown Source)
com.essbase.api.datasource.EssCube.beginDataload(Unknown Source)
com.hyperion.aif.essbase.EssbaseRuleFile.executeDataRuleFile(EssbaseRuleFile.java:314)
com.hyperion.aif.webservices.HPLService.executeDataLoad(HPLService.java:167)
sun.reflect.GeneratedMethodAccessor7310.invoke(Unknown Source)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
java.lang.reflect.Method.invoke(Method.java:575)
org.apache.axis.providers.java.RPCProvider.invokeMethod(RPCProvider.java:397)
org.apache.axis.providers.java.RPCProvider.processMessage(RPCProvider.java:71)
org.apache.axis.providers.java.JavaProvider.invoke(JavaProvider.java:265)
org.apache.axis.strategies.InvocationStrategy.visit(InvocationStrategy.java:32)
org.apache.axis.SimpleChain.doVisiting(SimpleChain.java:113)
org.apache.axis.SimpleChain.invoke(SimpleChain.java:78)
org.apache.axis.handlers.soap.SOAPService.invoke(SOAPService.java:435)
org.apache.axis.server.AxisServer.invoke(AxisServer.java:132)
org.apache.axis.transport.http.AxisServlet.doPost(AxisServlet.java:586)
javax.servlet.http.HttpServlet.service(HttpServlet.java:700)
org.apache.axis.transport.http.AxisServletBase.service(AxisServletBase.java:325)
javax.servlet.http.HttpServlet.service(HttpServlet.java:815)
weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:224)
weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:108)
weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:206)
weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:55)
oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:111)
oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:299)
oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:405)
oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:94)
oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:98)
oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:70)
weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:55)
oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:86)
weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:55)
weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:25)
weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:55)
weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3687)
weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:308)
weblogic.security.service.SecurityManager.runAs(SecurityManager.java:116)
weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2213)
weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2135)
weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1420)
weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
weblogic.work.ExecuteThread.run(ExecuteThread.java:168)
Any feedback is appreciated. Thank you for taking the time to read through.
Regards.
Hello,
Part of the message you are getting is because the WebLogic server monitors its request threads. It knows when something is running and waits; it also assumes anything running beyond a certain period of time is "stalled/stuck" and reports it. The time it waits depends on the WebLogic configuration (StuckThreadMaxTime).
As for the time it takes to load data: that is very hard to judge without knowing more about your environment. We know nothing about the hardware, platform, dataset size, etc. You would need to review ODI to find out which step is taking the long amount of time, and try to replicate it outside of ERPi/ODI to see if you get the same results.
Thank you, -
Hello,
We are trying to load data to HFM from MS SQL.
1. Successfully reverse engineered both SQL and HFM
2. User for SQL has DBO access
3. Successfully mapped source and target
4. In flow we are using dedicated SQL staging area
5. We are using LKM SQL to MSSQL and IKM SQL TO HFM Data
6. In IKM we are using all default settings for properties
7. When we execute; the interface is hung on the 5th step:
1. DROP WORK TABLE (Success)
2. CREATE WORK TABLE (Success)
3. LOAD DATA (Success)
4. SQL TO HFM PREPARE TO LOADING (Success)
*5. SQL TO HFM LOAD DATA TO HFM (RUNNING FOR 14+ hrs)*
To make sure it wasn't a large-volume issue (just 100k rows), we even created a filter to pull just a single entity with very few records; still the process doesn't complete even after 12+ hours.
We are using 10.1.3.6.0. Are there any known issues with IKM SQL to HFM Data in this version?
Please suggest.
Appreciate your responses.
Thanks
Hello,
Thanks for the response.
I looked into the logs and there is nothing that points to why it is hanging.
Here's the log; it says the connection to the source, the connection to HFM, the options, etc. are all good:
</Options>
2013-05-31 12:39:10,107 INFO [DwgCmdExecutionThread:null:0]: Load Options validated.
2013-05-31 12:39:10,302 INFO [DwgCmdExecutionThread:null:0]: Source data retrieved.
2013-05-31 12:39:10,303 INFO [DwgCmdExecutionThread:null:0]: Pre-load tasks completed.
2013-05-31 12:49:30,396 INFO [DwgCmdExecutionThread:odi_agent:2]: ODI Hyperion Financial Management Adapter Version 9.3.1
2013-05-31 12:49:30,398 INFO [DwgCmdExecutionThread:odi_agent:2]: Load task initialized.
2013-05-31 12:49:30,407 INFO [DwgCmdExecutionThread:odi_agent:2]: Connecting to Financial Management application [XXXXX] on [XXXXX] using user-name [XXXXX].
2013-05-31 12:49:30,923 INFO [DwgCmdExecutionThread:odi_agent:2]: Connected to Financial Management application.
2013-05-31 12:49:30,923 INFO [DwgCmdExecutionThread:odi_agent:2]: HFM Version: 11.1.2.1.0.
2013-05-31 12:49:30,923 INFO [DwgCmdExecutionThread:odi_agent:2]: Options for the Financial Management load task are:
<Options>
<Option name=LOG_FILE_NAME value=D:\LOGS_ERRORS\SQL_HFM_LOG.LOG/>
<Option name=IMPORT_MODE value=Merge/>
<Option name=CONSOLIDATE_ONLY value=false/>
<Option name=CONSOLIDATE_PARAMETERS value=""/>
<Option name=LOG_ENABLED value=true/>
<Option name=ACCUMULATE_WITHIN_FILE value=false/>
<Option name=CONSOLIDATE_AFTER_LOAD value=false/>
<Option name=FILE_CONTAINS_SHARE_DATA value=false/>
So, there is no clear info on why it is hanging on the load step.
Any suggestions, experts? Is it because the adapter version is 9.3.1 while the HFM version is 11.1.2.1.0?
Thanks for your inputs!