HFM data load "replace by security"

Version 2.2 -> What is wrong: even for an admin user, every second load clears data and every second load imports.

Hi,
The Replace by Security option applies to accounts only.
For the C3 dimension, different users need access to different C3 members. First, check in the HFM application settings whether the parameter "UseSecurityForCustom" is enabled; this parameter is used when security must be applied to custom dimensions. Also verify the "SecurityClass" parameter, which identifies the users who can access the custom dimension data.
Thanks.

Similar Messages

  • Log Issue in HFM data load

    Hi,
I'm new to Oracle Data Integrator.
I have an issue with the log file name. I'm loading data into Hyperion Financial Management through ODI. In the interface, when we select IKM SQL to HFM Data, there is an option to enable a log file. I set it to true and gave the log file name as 'HFM_dataload.log'. After executing the interface, when I navigate to the log folder and view the log file, that file is blank; instead, a new file 'HFM_dataloadHFM6064992926974374087.log' is created and the log details are written to it. Since I have to automate picking up the daily log file:
* I need the log details to be written to the specified log name, i.e. 'HFM_dataload.log'
Also, I was not able to perform any action on that log file (copy the newly generated file elsewhere or send it by mail), since I cannot predict the numbers appended to the specified log file name.
    Kindly help me to overcome this issue.
    Thanks in advance.
    Edited by: user13754156 on Jun 27, 2011 5:08 AM
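Since the adapter appends an unpredictable numeric suffix to the configured name, one workaround sketch (assuming only that the generated files keep the 'HFM_dataload' prefix and a .log extension) is to pick the most recently modified match:

```python
import glob
import os

def latest_log(log_dir, prefix="HFM_dataload"):
    """Return the most recently modified log file matching the prefix,
    or None if no such file exists."""
    candidates = glob.glob(os.path.join(log_dir, prefix + "*.log"))
    if not candidates:
        return None
    return max(candidates, key=os.path.getmtime)

# A daily job could then copy the newest log to a stable name, e.g.:
# shutil.copy(latest_log(r"D:\logs"), r"D:\logs\HFM_dataload.log")
```

This sidesteps the unpredictable suffix rather than removing it, which the IKM option itself does not appear to allow.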

Thanks a lot for the idea.
I have a question about HFM data loads: in ODI Operator, the step shows a warning symbol when a few records get rejected, instead of an error. Is it possible to make the step fail if one or more records are rejected?
I have experience with Essbase data loads: there, if the number of rejected records reaches a specified limit, the operator step fails.
Please guide me if I am missing something.
    Regards,
    PrakashV

  • HFM Data Load Issue

    Hello All,
We had an EPMA-type HFM application in which all dimensions were local. The application validated and deployed successfully.
We tried loading data into the HFM application, and the data load was successful.
Then we decided to convert all of the local dimensions of the above-mentioned HFM application to shared dimensions. After converting all the dimensions successfully, we get an error while loading data into the same HFM application (the app still validates and can be deployed after the changes).
    The Error log is below:
    Load data started: 11/29/2014 10:53:15.
    Line: 216, Error: Invalid cell for Period Oct.
    ACTUAL;2014; Oct; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;11979
    >>>>>>
    Line: 217, Error: Invalid cell for Period Nov.
    ACTUAL;2014; Nov; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;23544
    >>>>>>
    Line: 218, Error: Invalid cell for Period Dec.
    ACTUAL;2014; Dec; YTD; E_2100;<Entity Currency>;89920000; [ICP None]; CORP; [None]; [None]; FARM21000;58709
    >>>>>>
    Line: 219, Error: Invalid cell for Period Oct.
    ACTUAL;2014; Oct; YTD; E_2100;<Entity Currency>;28050000; E_6000_20; [None]; [None]; [None]; FARM21000;-11979
    >>>>>>
    Line: 220, Error: Invalid cell for Period Nov.
    ACTUAL;2014; Nov; YTD; E_2100;<Entity Currency>;28050000; E_6000_20; [None]; [None]; [None]; FARM21000;-11565
    >>>>>>
I wanted to know whether there is something I might have missed while converting local dimensions into shared ones (whether there is a sequence to follow, or a constraint I may not be aware of, though the conversion looks good, as the application validates and deploys after the changes).
What can be the reason for the failed data load? Can anyone help?
    Thanks
    Arpan

    Hi,
I would look at the account properties for that account (89920000) and check TopCustom1...4Member. You will find the reason behind the invalid cells there.
When you converted the local dimensions to shared, did you check the 'Dimension Association' for Accounts and Entities?
The application does seem to lose the dimension associations if a proper sequence is not followed.
    Regards,
    S

  • HFM Data Load Error in ODI

    Hi,
I'm loading data into HFM from a flat file. When the interface is executed, only some of the data gets loaded. When I checked the log, I found the following error message:
'Line: 56, Error: Invalid cell for Period Apr'
I then found that it is an invalid intersection in HFM that I am trying to load.
In FDM there is an option to validate invalid intersections during data load.
I would like to know how to do this in ODI, i.e. whether there is any option in ODI to ignore this kind of error.
Kindly help me.
    Thanks in advance

    Hi,
I think even if the metadata exists, there might still be issues with HFM forbidden cells. There are HFM rules that determine which intersections are editable/loadable and which are not. Please check with your HFM admin regarding forbidden rules, or otherwise change the properties of the Custom dimensions so that they accept data at all intersections.
    Thanks,
    Debasis

  • HFM Data Load Hangs?

    Hello,
    We are trying to load data to HFM from MS SQL.
    1. Successfully reverse engineered both SQL and HFM
    2. User for SQL has DBO access
    3. Successfully mapped source and target
    4. In flow we are using dedicated SQL staging area
    5. We are using LKM SQL to MSSQL and IKM SQL TO HFM Data
    6. In IKM we are using all default settings for properties
7. When we execute, the interface hangs on the 5th step:
    1. DROP WORK TABLE (Success)
    2. CREATE WORK TABLE (Success)
    3. LOAD DATA (Success)
    4. SQL TO HFM PREPARE TO LOADING (Success)
    *5. SQL TO HFM LOAD DATA TO HFM (RUNNING FOR 14+ hrs)*
    To make sure it wasn't a large volume issue (just 100k rows), we even created a filter to pull just a single entity with very few records, still the process doesn't complete even after 12+ hours...
    We are using 10.1.3.6.0, are there any known issues with IKM SQL TO HFM DATA in this version?
    Please suggest.
    Appreciate your responses.
    Thanks

    Hello,
    Thanks for the response.
Looked into the logs and found nothing that points to why it's hanging...
Here's the log; it says the connection to the source, connection to HFM, options, etc. are all good...
    </Options>
    2013-05-31 12:39:10,107 INFO [DwgCmdExecutionThread:null:0]: Load Options validated.
    2013-05-31 12:39:10,302 INFO [DwgCmdExecutionThread:null:0]: Source data retrieved.
    2013-05-31 12:39:10,303 INFO [DwgCmdExecutionThread:null:0]: Pre-load tasks completed.
    2013-05-31 12:49:30,396 INFO [DwgCmdExecutionThread:odi_agent:2]: ODI Hyperion Financial Management Adapter Version 9.3.1
    2013-05-31 12:49:30,398 INFO [DwgCmdExecutionThread:odi_agent:2]: Load task initialized.
    2013-05-31 12:49:30,407 INFO [DwgCmdExecutionThread:odi_agent:2]: Connecting to Financial Management application [XXXXX] on [XXXXX] using user-name [XXXXX].
    2013-05-31 12:49:30,923 INFO [DwgCmdExecutionThread:odi_agent:2]: Connected to Financial Management application.
    2013-05-31 12:49:30,923 INFO [DwgCmdExecutionThread:odi_agent:2]: HFM Version: 11.1.2.1.0.
    2013-05-31 12:49:30,923 INFO [DwgCmdExecutionThread:odi_agent:2]: Options for the Financial Management load task are:
    <Options>
    <Option name=LOG_FILE_NAME value=D:\LOGS_ERRORS\SQL_HFM_LOG.LOG/>
    <Option name=IMPORT_MODE value=Merge/>
    <Option name=CONSOLIDATE_ONLY value=false/>
    <Option name=CONSOLIDATE_PARAMETERS value=""/>
    <Option name=LOG_ENABLED value=true/>
    <Option name=ACCUMULATE_WITHIN_FILE value=false/>
    <Option name=CONSOLIDATE_AFTER_LOAD value=false/>
    <Option name=FILE_CONTAINS_SHARE_DATA value=false/>
So, no clear info on why it's hanging on the load step...
Any suggestions, experts? Could it be because the adapter version is 9.3.1 while the HFM version is 11.1.2.1.0?
    Thanks for your inputs!

  • HFM DATA LOAD WITH ODI HANGS LONG TIME

    Hi all,
There's a very strange problem when I load data from MS SQL Server to HFM with ODI. Specifically, there are 2 interfaces loading into 2 applications on the same HFM server, with data volumes of about 1,300,000 and 650,000 rows respectively.
The strange thing is that when I execute each interface individually, it usually works well. However, when I execute the package containing both interfaces, the larger one almost always hangs for about 10+ hours, whether I use an agent or not.
After some research, it seems that the session hangs because it cannot get return info from HFM even though the data load has already completed. I found some similar problems on OTN, like a 64-bit driver/JRE compatibility error or a deadlock on a table, but they are different from this one. So, can anyone help on this? Much appreciated in advance!
BTW, ODI and HFM are on the same server, but the ODI repository and the source of the interface are on another MS SQL data server. The versions are as below:
    HFM 11.1.1.3.0.956
    ODI 11.1.1.6.0
    win server 2003 x86
    MS SQLServer 2008 R2
    win server 2008 x64
    Regards,
    Steve

    Hi SH,
the source is MS SQL Server 2008 R2, the staging area is on the source side, and the target is HFM 11.1.1.3.0.956 based on SQL Server.
The KM is the standard 'IKM SQL to Hyperion Financial Management Data'.
There is no transformation logic, only a filter to select data in the current year.
Besides, I have done some performance tuning as the guide suggests:
    REM #
    REM # Java virtual machine
    REM #
    set ODI_JAVA_HOME=D:\oracle\Java\jdk1.6.0_21
    REM #
    REM # Other Parameters
    REM #
    set ODI_INIT_HEAP=512m
    set ODI_MAX_HEAP=1024m
    set ODI_JMX_PROTOCOL=rmi
    In Regedit:
    EnableServerLocking: 1
    MaxDataCacheSizeinMB :1000
    MaxNumDataRecordsInRAM: 2100000
    MultiServerMaxSyncDelayForApplicationChanges:300
    MultiServerMaxSyncDelayForDataChanges:300
After some research, I think the problem is located in the HFM-ODI adapter or on the HFM side (maybe HFM cannot send a completion response to ODI). Do you have any idea? Thanks in advance.

  • Data Load: Replace with Rule

    Hi guys,
this might be a simple question, but I need to find out quickly:
Can Essbase load an export with a rule file even if it is not in column format?
The data source is an export and NOT in column format, and I need to load the data with a rule file to replace a member name. The member names in the new outline have changed a little bit.
    Old: Team_1
    New: Team_01
    Export is done by right click data base -> export.
    Any idea?
    Regards,
    Bernd
    Edited by: Bernd on Mar 15, 2012 4:13 PM

A data file in free-form format doesn't need a rule file to import. Given your case, my best guess is to just replace the old names in the export file with the new names and import it.
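If you go the replace-the-names route suggested above, a minimal sketch (the Team_1 -> Team_01 mapping is taken from the question; the word-boundary regex is an assumption to avoid partial matches) could be:

```python
import re

def rename_members(text, mapping):
    """Replace whole member names in an Essbase export dump.

    Word boundaries ensure that e.g. "Team_1" does not also match
    inside "Team_10", which a plain str.replace() would corrupt.
    """
    pattern = re.compile(r"\b(" + "|".join(map(re.escape, mapping)) + r")\b")
    return pattern.sub(lambda m: mapping[m.group(1)], text)

# rename_members('"Team_1" 100 200', {"Team_1": "Team_01"})
```

Run it over the export file contents, write the result out, and import the rewritten file free-form.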

  • FDQM Sample data load file for HFM

    Hi all,
I have just started working on FDQM 11.1.1.3. I have done the integration of an HFM application with FDQM. I am required to load data to the HFM application, but I don't have any GL file or CSV file with HFM dimensions, so can anyone help me get such a file?
And one more thing:
I just want to know the basic steps I need to perform to load data to the HFM application after creating the FDM application and integrating it with the HFM application.
    Thanks.....

    Hi,
After creating the FDM application and integrating it with the HFM application (which also includes setting the Target Application name in FDM),
the FDM application is ready with its target set to the HFM application.
1. You can create an import format for the application as below, using only the Custom1 and Custom2 dimensions out of the 4 available Custom dimensions (you can modify it according to your dimensions):
Use '|' (pipe) as the delimiter:
Account|Account Description|Entity|C1|C2|Amount
2. You can create a text file with data like the example below by making a combination of members from each dimension specified in the import format:
11001|Capex|111|000|000|500000
11002|b-Capex|111|000|000|600000
Note: these dimension members should be mapped in the 'Mappings' section; use the members specified in the file as the source and select any target HFM member for them.
3. Then tag this import format to a location.
4. Import the text file using the 'Browse' option.
5. Validate the loaded data.
Note: if mappings exist for all dimension members used in the file, validation will succeed.
6. Export the data
Now you can export, and hence load, the data to the HFM application using the Replace/Merge option.
    Here you are with the basic steps to load the data from FDM to HFM.
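As a quick sanity check before importing in step 4, the pipe-delimited layout can be parsed with a short sketch (the field names and the numeric Amount column are assumptions taken from the import format above):

```python
import csv

# Column order assumed from the import format in step 1
FIELDS = ["Account", "Description", "Entity", "C1", "C2", "Amount"]

def read_fdm_file(path):
    """Parse a pipe-delimited FDM load file into a list of row dicts,
    converting the Amount column to a number."""
    with open(path, newline="") as f:
        reader = csv.reader(f, delimiter="|")
        rows = []
        for raw in reader:
            row = dict(zip(FIELDS, raw))
            row["Amount"] = float(row["Amount"])  # will raise on bad amounts
            rows.append(row)
    return rows
```

Rows that fail the float() conversion would surface here rather than as mapping errors inside FDM.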
    Regards,
    J

  • Protecting the Data in HFM during loads ?

Is there a way to protect the data in HFM during FDM loads?
Some of the data is entered into HFM through forms, and I need to protect this data in HFM while loading through FDM.
    Thanks,
    Anil

Yes, you would need to enable the "Data Protection" integration setting within the FM adapter and also specify a protection value in the Protection Value option. Please refer to the FM adapter readme for the integration options:
    http://www.oracle.com/technology/documentation/epm.html

  • What is the difference between "Replace" and "Replace by Security"

    Hi all,
I am implementing FDM to load data to HFM, and I set them up with proper security in Shared Services and FDM. When I load data to HFM from FDM, there are 4 options to load data (Replace, Merge, Accumulate, Replace by Security).
The documentation states:
Replace: Clears all account values from the target application, and replaces the data in the application with the data in the load file.
Replace by Security: Loads data to the target application only for members to which you have access.
However, even when I used "Replace", it respected the security which I set up.
Does anyone know the difference between "Replace" and "Replace by Security"?
Is there any way to make FDM work without security?
    Thanks in advance
    taku

The fact that both methods produce the same results will either be coincidence, or perhaps because you are misinterpreting the difference between the two. When you run a Replace, HFM clears out all data based on the combination of the Scenario, Year, Period, Entity and Value dimensions; it does not care if you have set up security classes which restrict access to any of the other dimensions, i.e. Account and Custom1 through 4. When you use Replace by Security, HFM will still clear on the original criteria, but it will respect security class restrictions set up on the other dimensions. For example, if you are assigned to a security class which is restricted to accessing only a certain subset of accounts, HFM will not clear data for the accounts you don't have access to when you use Replace by Security.
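The clearing difference can be illustrated with a minimal sketch (a toy data model, not HFM's actual API; the account names are invented):

```python
def accounts_to_clear(cells, accessible_accounts, by_security):
    """Return the accounts that would be cleared within one
    Scenario/Year/Period/Entity/Value combination.

    cells: dict of account -> value in the target subcube.
    Replace clears every account; Replace by Security skips
    accounts outside the user's security classes.
    """
    if by_security:
        return {a for a in cells if a in accessible_accounts}
    return set(cells)

# A user restricted to {"Sales"}:
# Replace would clear both accounts, Replace by Security only "Sales".
```

The real clear also spans the Custom dimensions when custom security is enabled, which this toy model omits.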

  • Replace by Security

    Hi,
    Does "Replace by Security" respect access rights to security classes for Custom1-4 as well as for accounts? Looking at the HFM User Guide, it is not clear to me if this applies to the Accounts dimension only, or also to C1-4: "This option enables you to perform a data load in Replace mode even if you do not have access to all accounts"
    The underlying question is how to allow users loading data in Replace mode per individual C3 member. C3 represents different types of adjustments, for which different users are responsible.
    Many thanks!

    Hi,
The Replace by Security option applies to accounts only.
For the C3 dimension, different users need access to different C3 members. First, check in the HFM application settings whether the parameter "UseSecurityForCustom" is enabled; this parameter is used when security must be applied to custom dimensions. Also verify the "SecurityClass" parameter, which identifies the users who can access the custom dimension data.
Thanks.

  • Data Load Issue

    Dears,
we are trying to distribute the data load functionality to base users.
However, in our tests it seems that these users can only use the "Replace by Security" option.
If they try to run the data load without selecting this option, the system returns the following error:
"No write access to period December"
which is very strange, as the period December is available for input to the same users through data entry forms.
    Please help.

This comes up when the user does not have write access to all the intersections in the subcube.
Let's say you have security on Custom1 with members A, B, and C, and a file with some data.
If John only has read access to A, a Replace by Security load will clear and load members B and C only.
If John tries a Replace load, it will attempt to clear all members; however, because John does not have access to clear member A, this causes the error you're seeing.

  • HT1386 I have an older iPhone (3gs) and need to upgrade to a newer phone (4S).  I need to get my NOTES, CALENDAR, CONTACTS, PICTURES, etc backed up on iTunes so I can get that data loaded onto the new phone.  But not sure how to do that.

I have an older iPhone (3GS) and need to upgrade to a newer phone (4S). I need to get my NOTES, CALENDAR, CONTACTS, PICTURES, etc. backed up in iTunes so I can get that data loaded onto the new phone, but I'm not sure how to do that. When I open iTunes it has a button that says "Back Up iPhone", but I'm not sure what that does. When I go into the sync options it says I have another user account and asks me if I want to merge or replace. I'm assuming it's trying to tell me I have an older iTunes Library, but I don't know that. Geez, maybe people over 60 shouldn't have iPhones; iTunes just baffles me.

    http://manuals.info.apple.com/en_US/iphone_user_guide.pdf

  • Comparison of Data Loading techniques - Sql Loader & External Tables

Below are 2 techniques by which data can be loaded from flat files into Oracle tables.
1) SQL*Loader:
a. Place the flat file (.txt or .csv) in the desired location.
b. Create a control file, e.g.:
LOAD DATA
INFILE 'Mytextfile.txt' -- file containing the table data; specify the path correctly; it could be .csv as well
APPEND INTO TABLE oracle_table -- or TRUNCATE, based on requirement
FIELDS TERMINATED BY ',' -- or the delimiter used in the input file
OPTIONALLY ENCLOSED BY '"'
(field1, field2, field3)
c. Now run Oracle's sqlldr utility at the command prompt:
sqlldr username/password control=filename.ctl
d. The data can be verified by selecting from the table:
Select * from oracle_table;
2) External Table:
a. Place the flat file (.txt or .csv) in the desired location, e.g. abc.csv:
1,one,first
2,two,second
3,three,third
4,four,fourth
b. Create a directory:
create or replace directory ext_dir as '/home/rene/ext_dir'; -- path where the source file is kept
c. After granting appropriate permissions to the user, we can create an external table like below:
create table ext_table_csv (
i Number,
n Varchar2(20),
m Varchar2(20)
)
organization external (
type oracle_loader
default directory ext_dir
access parameters (
records delimited by newline
fields terminated by ','
missing field values are null
)
location ('abc.csv')
)
reject limit unlimited;
d. Verify the data by selecting it from the external table:
select * from ext_table_csv;
The external tables feature is a complement to the existing SQL*Loader functionality.
It allows you to:
•     Access data in external sources as if it were in a table in the database.
•     Merge a flat file with an existing table in one statement.
•     Sort a flat file on the way into a table you want compressed nicely.
•     Do a parallel direct path load without splitting up the input file.
Shortcomings:
•     External tables are read-only.
•     No data manipulation language (DML) operations or index creation is allowed on an external table.
Using SQL*Loader you can:
•     Load the data from a stored procedure or trigger (insert is not sqlldr).
•     Do multi-table inserts.
•     Flow the data through a pipelined PL/SQL function for cleansing/transformation.
    Comparison for data loading
To make the loading operation faster, the degree of parallelism can be set, e.g. to 4.
So, when you create the external table with parallelism, the database will divide the file to be read among four processes running in parallel. This parallelism happens automatically, with no additional effort on your part, and is really quite convenient. To parallelize such a load using SQL*Loader, you would have had to manually divide your input file into multiple smaller files.
    Conclusion:
    SQL*Loader may be the better choice in data loading situations that require additional indexing of the staging table. However, we can always copy the data from external tables to Oracle Tables using DB links.

    Please let me know your views on this.

  • Data Load behaviour in Essbase

    Hello all-
I am loading data from a flat file using a server rule file. In the rule file I have properties for a field where it replaces a name in the flat file with a member name in the outline, somewhat like this:
Replace With
Canada 00-200-SE
Belgium 00-300-SE
and so on.
Now in my flat file there was a new member, for example China, whose replacement was not present in the rule file. When the data was loaded, the system didn't reject that record; on the contrary, it loaded the values for China into
the region above it and overwrote the values for the original one.
Is this the normal behavior of Essbase? I was thinking that record should have been rejected.
I know when we do a Lock & Send via the Add-in, if a member is not present in the outline it gives you a warning when you lock that sheet, and eventually, if you don't delete that member from the template, it will load data against the member above it.
Is there a workaround for this problem, or is this just how it is?
I am on Hyperion Planning / Essbase version 9.3.1.
    Thanks

Still thinking about how these properties affect the way data is being loaded right now. I have gone through the DBAG and I don't see a reason why any of these properties might be affecting the load.

Here's what I think is happening: China is not getting mapped, but the replacement for Belgium occurs and resolves to a valid member name. Essbase sees China and doesn't recognize it (you knew all of this already).
When the load occurs, Essbase says (okay, I am anthropomorphizing, but you get the idea) "Eh, I have no idea what China is, but 00-300-SE is the last good Country member I have, so I will load there." Essbase picks the last valid member and loads to that. I liken it to a lock and send from Excel with nested dimensions and non-repeating members: Essbase "looks up" a row, finds the valid member, and loads there.
And yes, this is in the DBAG: http://download.oracle.com/docs/cd/E12825_01/epm.111/esb_dbag/ddlload.htm#ddlload1034271
Search for "Unknown Member Fields" -- it's all the way at the bottom of the above link.
In fact, to save you the trip, per the DBAG:
"If you are performing a data load and Essbase encounters an unknown member name, Essbase rejects the entire record. If there is a prior record with a member name for the missing member field, Essbase continues to the next record. If there is no prior record, the data load stops."
Regards,
    Cameron Lackpour
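The "last good member" behavior described above can be sketched as follows (purely illustrative; the mapping table is taken from the question, and real Essbase behavior also depends on the rule file and outline as the DBAG passage notes):

```python
# Rule-file replacement table from the question
REPLACEMENTS = {"Canada": "00-200-SE", "Belgium": "00-300-SE"}

def map_countries(rows):
    """Map (country, value) rows to outline members.

    An unmapped name (e.g. "China") falls back to the last valid
    member, mimicking the silent-overwrite behavior described above;
    with no prior valid member, the load cannot proceed.
    """
    mapped, last_member = [], None
    for country, value in rows:
        member = REPLACEMENTS.get(country, last_member)
        if member is None:
            raise ValueError("no prior valid member for " + country)
        mapped.append((member, value))
        last_member = member
    return mapped
```

Note how China's value lands on Belgium's member, silently overwriting it, which is exactly the symptom reported in the question.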
