Log Issue in HFM data load

Hi,
I'm new to Oracle Data Integrator.
I have an issue with the log file name. I'm loading data into Hyperion Financial Management through ODI. In the interface, when we select the IKM SQL to HFM Data, there is an option to enable the log file. I set it to true and gave the log file name as 'HFM_dataload.log'. After executing the interface, when I navigate to the log folder and view the log file, that file is blank. A new file, 'HFM_dataloadHFM6064992926974374087.log', is also created, and the log details are written to it. Since I have to automate picking up the log file every day:
* I need the log details to be written to the specified log name, i.e. 'HFM_dataload.log'
Also, I am not able to perform any action on the newly generated log file (copy it to another file or send it by mail), since I cannot predict the numbers appended to the specified log file name.
Kindly help me overcome this issue.
Thanks in advance.
Edited by: user13754156 on Jun 27, 2011 5:08 AM

Thanks a lot for the idea.
I am wondering about HFM data loads: in ODI Operator, a warning symbol is shown even though a few records got rejected, instead of an error. Is it possible to make the load fail if one or more records get rejected?
In my experience with Essbase data loads, the step in Operator fails once a specified number of rejected records is reached.
Please guide me if I am missing something.
Regards,
PrakashV
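One common workaround (a sketch, not a built-in IKM option as far as I know) is to add a step after the interface that scans the HFM load log for rejected lines and raises an error so the session fails. The line pattern assumed below ("Line: N, Error: ...") matches the error logs quoted in the threads on this page; treat the pattern and threshold as assumptions to adapt:

```python
import re

# Rejected-record lines as they appear in the HFM load log,
# e.g. "Line: 216, Error: Invalid cell for Period Oct."
REJECT_PATTERN = re.compile(r"^Line:\s*\d+,\s*Error:", re.MULTILINE)

def fail_on_rejects(log_text, max_rejects=0):
    """Raise an error (failing the ODI step) if the log shows more than
    max_rejects rejected records; otherwise return the reject count."""
    rejects = len(REJECT_PATTERN.findall(log_text))
    if rejects > max_rejects:
        raise RuntimeError("%d record(s) rejected by HFM" % rejects)
    return rejects
```

Raising an exception from a scripted step makes Operator mark the session in error rather than warning, which is the behavior being asked for here.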

Similar Messages

  • HFM Data Load Issue

    Hello All,
    We had an EPMA-type HFM application whose dimensions were all local. The application validated and deployed successfully.
    We tried loading data into the HFM application and the data load was successful.
    Then we decided to convert all of the local dimensions of the above-mentioned HFM application to shared dimensions. After converting all the dimensions to shared successfully, we are getting an error while loading data into the same HFM application (the app still validates and deploys after the changes).
    The Error log is below:
    Load data started: 11/29/2014 10:53:15.
    Line: 216, Error: Invalid cell for Period Oct.
    ACTUAL;2014; Oct; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;11979
    >>>>>>
    Line: 217, Error: Invalid cell for Period Nov.
    ACTUAL;2014; Nov; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;23544
    >>>>>>
    Line: 218, Error: Invalid cell for Period Dec.
    ACTUAL;2014; Dec; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;58709
    >>>>>>
    Line: 219, Error: Invalid cell for Period Oct.
    ACTUAL;2014; Oct; YTD; E_2100;<Entity Currency>;28050000; E_6000_20; [None]; [None]; [None]; FARM21000;-11979
    >>>>>>
    Line: 220, Error: Invalid cell for Period Nov.
    ACTUAL;2014; Nov; YTD; E_2100;<Entity Currency>;28050000; E_6000_20; [None]; [None]; [None]; FARM21000;-11565
    >>>>>>
    I wanted to know whether there is something I might have missed while converting the local dimensions into shared ones (whether there is a sequence to follow, or a constraint I may not be aware of, though the conversion looks good, as the application validates and deploys after the changes).
    What could be the reason for the failed data load? Can anyone help?
    Thanks
    Arpan

    Hi,
    I would look at the account properties for that account (89920000) and check TopCustom1...4Member; there you will find the reason behind the invalid cells.
    When you converted the local dimensions to shared, did you check the 'Dimension Association' for Accounts and Entities?
    The application does seem to lose the dimension associations if a proper sequence is not followed.
    Regards,
    S

  • HFM Data Load Error in ODI

    Hi,
    I'm loading data into HFM from a flat file. When the interface is executed, only some of the data gets loaded. When I checked the log for errors, I found the following message:
    'Line: 56, Error: Invalid cell for Period Apr'
    Then I found that it is an invalid intersection in HFM that I am trying to load.
    In FDM there is an option to validate invalid intersections during the data load.
    I would like to know how to do the same in ODI to overcome this kind of error, i.e. is there any option in ODI to ignore this kind of error?
    Kindly help me.
    Thanks in advance

    Hi,
    I think even if the metadata exists, there might still be an issue with HFM forbidden cells. There are HFM rules that determine which intersections are editable/loadable and which are not. Please check with your HFM admin regarding the forbidden rules, or otherwise change the properties of the Custom dimensions so that they accept data at all intersections.
    Thanks,
    Debasis

  • HFM Data Load Hangs?

    Hello,
    We are trying to load data to HFM from MS SQL.
    1. Successfully reverse engineered both SQL and HFM
    2. User for SQL has DBO access
    3. Successfully mapped source and target
    4. In flow we are using dedicated SQL staging area
    5. We are using LKM SQL to MSSQL and IKM SQL TO HFM Data
    6. In IKM we are using all default settings for properties
    7. When we execute, the interface hangs on the 5th step:
    1. DROP WORK TABLE (Success)
    2. CREATE WORK TABLE (Success)
    3. LOAD DATA (Success)
    4. SQL TO HFM PREPARE TO LOADING (Success)
    *5. SQL TO HFM LOAD DATA TO HFM (RUNNING FOR 14+ hrs)*
    To make sure it wasn't a large-volume issue (just 100k rows), we even created a filter to pull a single entity with very few records; the process still doesn't complete even after 12+ hours...
    We are using ODI 10.1.3.6.0; are there any known issues with IKM SQL to HFM Data in this version?
    Please suggest.
    Appreciate your responses.
    Thanks

    Hello,
    Thanks for the response.
    Looked into the logs and found nothing that points to why it's hanging....
    Here's the log; it says the connection to the source, the connection to HFM, the options, etc. are all good...
    </Options>
    2013-05-31 12:39:10,107 INFO [DwgCmdExecutionThread:null:0]: Load Options validated.
    2013-05-31 12:39:10,302 INFO [DwgCmdExecutionThread:null:0]: Source data retrieved.
    2013-05-31 12:39:10,303 INFO [DwgCmdExecutionThread:null:0]: Pre-load tasks completed.
    2013-05-31 12:49:30,396 INFO [DwgCmdExecutionThread:odi_agent:2]: ODI Hyperion Financial Management Adapter Version 9.3.1
    2013-05-31 12:49:30,398 INFO [DwgCmdExecutionThread:odi_agent:2]: Load task initialized.
    2013-05-31 12:49:30,407 INFO [DwgCmdExecutionThread:odi_agent:2]: Connecting to Financial Management application [XXXXX] on [XXXXX] using user-name [XXXXX].
    2013-05-31 12:49:30,923 INFO [DwgCmdExecutionThread:odi_agent:2]: Connected to Financial Management application.
    2013-05-31 12:49:30,923 INFO [DwgCmdExecutionThread:odi_agent:2]: HFM Version: 11.1.2.1.0.
    2013-05-31 12:49:30,923 INFO [DwgCmdExecutionThread:odi_agent:2]: Options for the Financial Management load task are:
    <Options>
    <Option name=LOG_FILE_NAME value=D:\LOGS_ERRORS\SQL_HFM_LOG.LOG/>
    <Option name=IMPORT_MODE value=Merge/>
    <Option name=CONSOLIDATE_ONLY value=false/>
    <Option name=CONSOLIDATE_PARAMETERS value=""/>
    <Option name=LOG_ENABLED value=true/>
    <Option name=ACCUMULATE_WITHIN_FILE value=false/>
    <Option name=CONSOLIDATE_AFTER_LOAD value=false/>
    <Option name=FILE_CONTAINS_SHARE_DATA value=false/>
    </Options>
    So, no clear info on why it's hanging on the load step...
    Any suggestions, experts? Could it be because the adapter version is 9.3.1 while the HFM version is 11.1.2.1.0?
    Thanks for your inputs!

  • Issues with ondemand Data loader

    Hello,
    We are facing 2 issues with on demand data loader.
    Issue 1
    While inserting 'Contacts' and 'Assets', if the 'Account' information is wrong, the records are created without accounts even though "Account" is a required field.
    Issue 2
    While inserting records, the data loader is not checking for duplicates, so duplicate records are getting created.
    Kindly advise if anyone has come across similar issues. Thanks
    Dipu
    Edited by: user11097775 on Jun 20, 2011 11:46 PM


  • HFM DATA LOAD WITH ODI HANGS LONG TIME

    Hi all,
    There's a very strange problem when I load data from MS SQL Server to HFM with ODI. Specifically, there are 2 interfaces loading into 2 applications on the same HFM server, with data volumes of about 1,300,000 and 650,000 rows respectively.
    The strange thing is that when I execute each interface individually, it sometimes works well. However, when I execute the package that contains the 2 interfaces, the larger one hangs for about 10+ hours almost every time, whether I use an agent or not.
    After some research, it seems that the session hangs because it cannot get return info from HFM, even though the data load has already completed. I found some similar problems on OTN, like a 64-bit driver and JRE compatibility error, or a deadlock on a table, but those are different from this one. So, can anyone help with this? Much appreciated in advance!!!
    BTW, ODI and HFM are on the same server, but the ODI repository and the source of the interface are on another MS SQL data server. The versions are as below:
    HFM 11.1.1.3.0.956
    ODI 11.1.1.6.0
    win server 2003 x86
    MS SQLServer 2008 R2
    win server 2008 x64
    Regards,
    Steve

    Hi SH,
    the source is MS SQL Server 2008 R2, the staging area is on the source side, and the target is HFM 11.1.1.3.0.956 based on SQL Server.
    The KM is the standard 'IKM SQL to Hyperion Financial Management Data'.
    There is no transformation logic, only a filter to select data in the current year.
    Besides, I have done some performance tuning as the guide tells:
    REM #
    REM # Java virtual machine
    REM #
    set ODI_JAVA_HOME=D:\oracle\Java\jdk1.6.0_21
    REM #
    REM # Other Parameters
    REM #
    set ODI_INIT_HEAP=512m
    set ODI_MAX_HEAP=1024m
    set ODI_JMX_PROTOCOL=rmi
    In Regedit:
    EnableServerLocking: 1
    MaxDataCacheSizeinMB :1000
    MaxNumDataRecordsInRAM: 2100000
    MultiServerMaxSyncDelayForApplicationChanges:300
    MultiServerMaxSyncDelayForDataChanges:300
    After some research, I think the problem is located in the HFM-ODI adapter or on the HFM side (maybe HFM cannot send a completion message back to ODI). Do you have any idea? Thanks in advance

  • Performance issues with Planning data load & Agg in 11.1.2.3.500

    We recently upgraded from 11.1.1.3 to 11.1.2.3. Post-upgrade we face performance issues with one of our Planning jobs (e.g. Job E). It takes 3x the time to complete in our new environment (11.1.2.3) compared to the old one (11.1.1.3). This job loads the actual data and then does the aggregation. The pattern we noticed is: if we run a restructure on the application and execute this job immediately, it completes in the same time as on 11.1.1.3. However, in current production (11.1.1.3) the jobs run in the sequence Job A -> Job B -> Job C -> Job D -> Job E and complete on time, but if we run the same test in 11.1.2.3 in the above sequence, it takes 3x the time. We don't have a window to restructure the application before running Job E every time in production. The specs of the new environment are much higher than the old one's.
    We have Essbase clustering (MS active/passive) in the new environment, and the files are stored on a SAN drive. Could this be the cause? Has anyone faced performance issues in a clustered environment?

    Do you have exactly the same Essbase config settings and calculations performing the AGG? Remember, something very small like UPDATECALC ON/OFF can make a BIG difference in timing.

  • HFM data load "replace by security"

    Version 2.2 -> What is wrong? Even for an admin user, every second load clears the data and every second load imports it.

    Hi,
    The 'Replace by security' option is applicable to accounts only.
    For the C3 dimension, different users need to have access to different C3 members. First of all, please check in the HFM app settings whether the parameter "UseSecurityForCustom" is selected. This parameter is used if we need to apply security on custom dimensions. Also verify the parameter "SecurityClass", which identifies the users who can access the custom dimension data.
    Thanks..

  • IDOC ADAPTER Issue, some meta data loads, some doesn't

    Hey all,
    I just started trying to set up the IDoc adapter connection between our R/3 system and our XI system. I decided to try to pull in some metadata via IDX2 to start. The weird thing is that I get some IDoc structures successfully, but not others. I think it has to do with some requirements that I am missing on the R/3 side. For instance, outside of setting up the RFC connection, creating the tRFC port, and the partner profile for the logical system, what else is needed? Do I have to assign the basic type to the outbound partner profile to access it? Do I have to actually create IDocs for the basic type? Just curious. Here are the details of what I have done so far. A lot of this I pulled from other people's postings here on the forum; I think I am just missing a few steps!
    Our current configuration is as follows:
    SAP R/3 Enterprise to post IDOCs to XI .
    - R/3 Configuration:
    logical system created: DXICLNT200
    we20 : Partner Type LS: partner number: DXICLNT200.
    we21 : IDOC Ports: Transactional RFC: Port name: DXICLNT200. IDOC record type SAP Release 4.x. RFC Destination : DXICLNT200
    sm59 : create RFC destination : DXICLNT200. Target host: the XI box. Logon security: Client 200. XIAPPLUSER. PASSWORD (for XIAPPLUSER). Testing the connection works.
    - XIHOST Configuration:
    sm59 : created RFC destination to DDSCLNT210. Gateway options used; NO client or user specified. Are these needed, and if so, is there a particular user to use?
    Test connection works.
    IDX1 : Created port DDSCLNT210, client 210, RFC destination DDSCLNT210.
    IDX2 : Able to load metadata for IDoc type CHRMAS01 on port DDSCLNT210, but NOT ABLE TO IMPORT CREMAS03 or MATMAS03!
    Any help is greatly appreciated!
    Thanks,
    Chris

    There was a problem in the RFC connection; the target host was not populated correctly. All metadata can now be pulled.
    However, I am curious where the metadata for CHRMAS and CLSMAS came from. The RFC connection was NOT working, so where could it have pulled this information from?

  • Issue in master data loading

    Dear Experts
    When loading hierarchy data from CRM, I am facing the problem below for some of my master data.
    Every day I load my master data through process chains for some hierarchies, but the changes are not getting displayed.
    When checked in the InfoPackage monitor, it is green, but when I double-click and check the details, the overall status is red, and underneath that the third item, TRANSFER (IDOCS AND TRFC), is also showing red.
    If I later load manually, everything becomes green and I am able to see the changes. What could be the possible reason, and how can I solve it?
    Thanks and regards
    Neel

    Dear Neel,
    Thanks for the info. We have checked the option of activating the hierarchy in the InfoPackage itself, but even so, the load package shouldn't be red in overall status.
    Thanks and regards
    Neel

  • Master data loading issue

    Hi gurus,
    Presently I am working on BI 7.0. I have a small issue regarding master data loading.
    I have a generic DataSource for master data loading, and I have to fetch this data to the BW side. I always have to do a full data load to the master data object. The first time, I scheduled the InfoPackage and ran the DTP to load data to the master data object; no issues, the data loaded successfully. Whenever I run the InfoPackage a second time and run the DTP, I get an error saying that there are duplicate records.
    How can I handle this?
    Best Regards
    Prasad

    Hi Prasad,
    The following is happening in your case:
    <b>Loading 1st time:</b>
    1. Data is loaded to the PSA through the InfoPackage. It is a full load.
    2. Data is loaded to the InfoObject through the DTP.
    <b>Loading 2nd time:</b>
    1. Data is again loaded to the PSA. It is a full load.
    2. At this point, the data in the PSA itself is duplicated. So when you run the DTP, it picks up the data of both requests that were loaded to the PSA, and hence you get the duplicate record error.
    Please clear the PSA after the data is loaded to the InfoObject.
    Assign points if helpful.
    Regards,
    Tej Trivedi

  • DSO data Load issue

    Hi all,
    I have an issue with a DSO data load. While loading, the data comes to the PSA perfectly (238 records), but when I trigger the DTP, I get only 6 records.
    Can anyone please suggest what to check?
    Thanks,
    Gayatri.

    Hi Gayatri,
    If you have already loaded some data to the DSO and are now trying to do a delta load, it is possible that it is picking up only the delta data.
    (or)
    You may have start/end routines or rule routines written that delete records based on some conditions.
    (or)
    It also depends on the key fields you have selected in the DSO. If the key field you selected has repeated values, then while loading into the DSO the records will be aggregated, i.e. if you have 10 rows for a key field value of, say, 101, then the DSO will hold only one row with value 101 (10 rows becoming 1 row), with the key figure either summed or overwritten depending on what you selected in the rule details for the key figure (you can check it by right-clicking the key figure mapping > Rule Details > here you can see whether it is Overwrite/Summation).
    Also, as mentioned in the posts above, you can check the DSO > Manage > the number of rows transferred and the number of rows added.
    Hope it is clear & helpful!
    Regards,
    Pavan
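The key-field aggregation described above can be sketched in plain Python (the key values and modes here are illustrative, mirroring the Overwrite/Summation rule detail, not SAP code):

```python
from collections import OrderedDict

# Illustrative rows: (key_field_value, key_figure_value)
rows = [("101", 10), ("101", 20), ("102", 5)]

def load_to_dso(rows, mode="SUM"):
    """Collapse rows by key field the way a DSO does on activation:
    SUM adds the key figures, OVERWRITE keeps the last value loaded."""
    result = OrderedDict()
    for key, value in rows:
        if mode == "SUM":
            result[key] = result.get(key, 0) + value
        else:  # OVERWRITE
            result[key] = value
    return result
```

Three incoming rows collapse to two DSO rows here, which is exactly how 238 PSA records can shrink to far fewer after the load.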

  • ERPi Data load mapping Issue

    Hi,
    We are facing an issue with ERPi data load mappings. The mapping file (a txt file) has 36k records; whenever we try to load the mappings, it takes a very long time, nearly 1 hour 30 mins, and we want to reduce that time. Is there any way to reduce the data load mapping time?
    Hyperion verion: 11.1.2.2.300
    Please help, thanks in advance!!
    Thanks.

    Has anyone faced the same kind of issue?

  • SPM Data Loads : Less number of records getting loaded in the Invoice Inbound DSO

    Dear Experts,
    We are working on a project where data from different non-SAP source systems is being loaded into SPM via flat-file loads. We came across a very weird situation.
    For other master and transaction data objects it worked fine, but when we loaded the invoice file, fewer records got loaded into the inbound DSO. The invoice file contained 80,000 records, but the inbound DSO has only 78,500 records; we are losing 1,500 records.
    We are unable to figure out which 1,500 records we are missing. We couldn't find any logs in the inbound invoice DSO, and we are unable to tell whether the records are erroneous or there is an issue with something else. Is there a way to analyze the situation / the inbound invoice DSO?
    If there is an issue with the outbound DSO or cube, we know that it is possible to check the data load request, but for the inbound DSO we do not know how to analyze why it is taking fewer records.
    Regards
    Pankaj

    Hi,
    Yes, this can happen in a DSO: the data records share the same semantic keys, so after the key field selection you may end up with fewer records.
    If you have any routines, check the code (for any condition that filters out records).
    Regards.
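One way to find the "missing" 1,500 records (a sketch; the key field names below are hypothetical, substitute the inbound DSO's actual semantic key) is to count duplicates on that key in the source file before loading:

```python
import csv
from collections import Counter
from io import StringIO

def duplicate_key_report(csv_text, key_fields):
    """Count source rows per semantic key; any key with count > 1 will
    collapse into a single record in the inbound DSO."""
    reader = csv.DictReader(StringIO(csv_text))
    counts = Counter(tuple(row[f] for f in key_fields) for row in reader)
    return {key: n for key, n in counts.items() if n > 1}
```

Summing (count - 1) over the report gives the number of records the DSO will "lose", which can be compared against the 1,500 gap observed.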

  • Dynamo Log issue

    Hi,
    I have been using ATG 10.1.2 at work. Recently I am observing that my dynamo.log is not getting updated whenever I run any transaction or do anything. Whenever I restart that particular server, the log gets updated only up to "Nucleus Running", and anything I do after that in the application is not logged in dynamo.log. I have a custom std.out where everything gets logged, and there too the log is not getting updated. I am using WebLogic 10.3.5 with this ATG.
    Any help or Info appreciated.
    Thanks.

