Protecting the Data in HFM during loads?

Is there a way to protect the data in HFM during FDM loads?
Some of the data is entered into HFM through forms, and I need to protect this data in HFM while loading through FDM.
Thanks,
Anil

Yes, you would need to enable the "Data Protection" integration setting within the FM adapter and also specify a protection value in the "Protection Value" option. Please refer to the FM adapter readme for the integration options:
http://www.oracle.com/technology/documentation/epm.html
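As a concrete example (the member name here is hypothetical): if everything keyed in through forms carries the Custom4 member ManualInput, you would enable data protection and set the protection value to ManualInput. FDM then extracts the existing intersections containing that member from HFM before the load and appends them back once the load completes, so even a Replace load cannot wipe out the form input.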

Similar Messages

  • The data has been updated/loaded up to PSA only

    Hi,
    The data has been updated/loaded up to PSA only, even though the processing type is set to "update into PSA and then into data targets". Please let us know what the problem could be and what the solution is.
    Regards,
    Raghav

    Hi Kolli,
    Do you have the further update from PSA process in your process chain? You will need this if the InfoPackage has the 3rd option on the Processing tab (PSA and then to data targets).
    Hope this helps...

  • Unable to load the data into HFM

    Hello,
    We created a new HFM app, configured it with FDM, generated an output file through FDM and loaded that file through HFM directly 5-6 times; there was no issue up to this point.
    Then I loaded the file through FDM 4 times successfully, even for different months. But after 4 loads I started getting an error. Attached is the error log.
    Please help us at the earliest.
    ** Begin fdmFM11XG6A Runtime Error Log Entry [2013-10-30-13:44:26] **
    Error:
    Code............-2147217873
    Description.....System.Runtime.InteropServices.COMException (0x80040E2F): Exception from HRESULT: 0x80040E2F
       at HSVCDATALOADLib.HsvcDataLoadClass.Load(String bstrClientFilename, String bstrClientLogFileName)
       at fdmFM11XG6A.clsFMAdapter.fDBLoad(String strLoadFile, String strErrFile, String& strDelimiter, Int16& intMethod, Boolean& blnAccumFile, Boolean& blnHasShare, Int16& intMode)
    Procedure.......clsHPDataManipulation.fDBLoad
    Component.......E:\Opt\Shared\Apps\Hyperion\Install\Oracle\Middleware\EPMSystem11R1\products\FinancialDataQuality\SharedComponents\FM11X-G6-A_1016\AdapterComponents\fdmFM11XG6A\fdmFM11XG6A.dll
    Version.........1116
    Identification:
    User............fdmadmin
    Computer Name...EMSHALGADHYFD02
    FINANCIAL MANAGEMENT Connection:
    App Name........
    Cluster Name....
    Domain............
    Connect Status.... Connection Open
    Thanks,
    Raam

    We are working with the DB team, but they have confirmed that there is no issue with the TB. The process we have followed:
    As a standard process, while loading the data from FDM or manually to HFM, we don't write any SQL query. Using the web interface, data is loaded to the HFM application. This data can be viewed by different reporting tools (Smart View (Excel), HFR reports, etc.).
    There are no official documents on the Oracle website which talk about the INSERT SQL query used to put data into HFM tables. Hyperion does not provide much detail on the internal tables it uses, nor much insight into the internal structure of the HFM system.
    As per Hyperion blogs/forums on the internet, HFM stores the base-level data in so-called DCE tables (for example EMHFMFinal_DCE_1_2013, where EMHFMFinal is the application name, 1 identifies the Scenario and 2013 the Year). Each row in a DCE table contains data for all periods of a given combination of dimensions (also called an intersection).
    We are trying to load the same data file with the Replace option (it should delete the existing data before loading the data file).
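    If you only need to confirm where the base data lands, the DCE table named above can be inspected read-only; a minimal sketch, assuming the EMHFMFinal example application (the column layout is undocumented, so these tables should never be modified directly):

        -- Read-only sanity check against <app>_DCE_<scenario id>_<year>
        -- (here: application EMHFMFinal, Scenario 1, Year 2013).
        SELECT COUNT(*) AS intersection_rows
        FROM EMHFMFinal_DCE_1_2013;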

  • Need to skip fields during the data upload by SQL*Loader

    Hi all,
    I am not able to work out how to skip some fields in the file while uploading it through SQL*Loader. Let us look at the scenario below.
    The table has 4 columns, but the input file comes with 6 fields. Let us assume the four matching fields come first in order; then we can populate the data into the 4 columns of the table. At the same time we want to skip the remaining fields, which do not exist as columns in the database table.
    For example:
    create table data_temp(sno number,name varchar2(10 char),sex char(1),loc varchar2(20 char));
    Data file
    sno,name,sex,loc,organization,contact
    1,ram,M,India,HP,998976789
    2,Shesha,M,India,IBM,7890808098
    Control_file
    OPTIONS(SKIP=1)
    LOAD DATA
    INFILE *
    APPEND INTO TABLE data_temp
    FIELDS TERMINATED BY ',' optionally enclosed by '"'
    TRAILING NULLCOLS
    (
    sno "trim(:sno)",
    name "SUBSTR(trim(:name),1,20)",
    sex "SUBSTR(trim(:sex),1,1)",
    loc "SUBSTR(trim(:loc),1,20)"
    )
    Please suggest me how to implement the above scenario in the control file.
    Thanks in Advance!!
    Regards,
    Vissu.....

    Use FILLER. Control file:
    OPTIONS(SKIP=1)
    LOAD DATA
    INFILE *
    APPEND INTO TABLE data_temp
    FIELDS TERMINATED BY ',' optionally enclosed by '"'
    TRAILING NULLCOLS
    (
    sno "trim(:sno)",
    name "SUBSTR(trim(:name),1,20)",
    sex "SUBSTR(trim(:sex),1,1)",
    loc "SUBSTR(trim(:loc),1,20)",
    organization filler,
    contact filler
    )
    begindata
    sno,name,sex,loc,organization,contact
    1,ram,M,India,HP,998976789
    2,Shesha,M,India,IBM,7890808098
    Now:
    SQL> create table data_temp(sno number,name varchar2(10 char),sex char(1),loc varchar2(20 char));
    Table created.
    SQL> host sqlldr scott/tiger control=c:\temp\vissu.ctl log=c:\temp\vissu.log
    SQL> select * from data_temp
      2  /
           SNO NAME       S LOC
             1 ram        M India
             2 Shesha     M India
    SQL>
    SY.

  • How to synchronize with the database after pre-loading the data

    Hi,
    I have pre-loaded the data from the database table into the cache.
    If the key is not found in the cache, I want it to connect to the database and get the value from the table. How do I achieve this?

    Hi JK,
    I have pasted my cache loader code, config file and the main class in the other post, but I am not sure what the issue with it is. It's not working. Can you please tell me what might be wrong with that piece of code? I am not getting any exception either, but the load() or loadAll() method is not getting triggered at all when invoking the cache.get() or cache.getAll() method. What might be the cause of this issue?
    Can you give me the coherence-cache-config.xml contents?
    I am not sure whether it's an issue with the config file, because I have read somewhere that a refresh-ahead factor is required to trigger the loadAll() method.
    Edited by: 943300 on Jul 4, 2012 9:57 AM
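    A general note on Coherence read-through: load() and loadAll() are only invoked on a cache miss when the CacheLoader is wired into the cache's backing map, i.e. a <cachestore-scheme> pointing at the loader class inside a <read-write-backing-map-scheme> in coherence-cache-config.xml; a refresh-ahead factor is not required for this. If the loader class is only used to pre-populate the cache from application code, cache.get() will never call it.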

  • Concatenating the data file (Mbrs) before loading, due to a change in the Outline

    We have 8 dimensions (11.1.1.3).
    The data file has been coming from the source system in this format for 3 years, as below:
    D1 D2 D3 D4 D5 D6 Product, Currency
    . , . , . , . , . , . , a , USD
    . , . , . , . , . , . , a , EUR
    . , . , . , . , . , . , b , GBP
    . , . , . , . , . , . , b , INR
    Now the product name has been changed in the outline to:
    a_USD
    a_EUR
    b_GBP
    b_INR
    So, is there any way in the Hyperion suite (like in a rules file, or any other tool) where I can concatenate Product and Currency and get the file as:
    D1 D2 D3 D4 D5 D6 , Product , Currency
    . . . . . . , a_USD , USD
    . . . . . . , a_EUR , EUR
    . . . . . . , b_GBP , GBP
    . . . . . . , b_INR , INR
    Please do let me know
    Thanks in Advance.
    Edited by: 838300 on Sep 27, 2011 9:00 AM

    While what Mehmet wrote is correct, if this is anything more than a quick and dirty fix, may I suggest you go down another ETL path? (Yes, this is Cameron's load rule rant coming right at you, but in abbreviated form.)
    ETL in a load rule is great to get the job done, but is a pain to maintain. You would be (unpleasantly) surprised how these have a way of growing. Have you given a thought to fixing it at the source or doing some true ETL in a tool like ODI, or staging it in SQL and doing the mods there? I know, for a simple(ish) change, that seems overkill, but load rules for the purposes of ETL are Evil.
    Regards,
    Cameron Lackpour
    P.S. For anyone who must see me go bonkers over this, see: http://www.network54.com/Forum/58296/thread/1311019508/Data+Load+Optimization+-Headervs+Column
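    If you do stage it in SQL as Cameron suggests, the concatenation is a one-liner; a minimal sketch, assuming a hypothetical staging table stg_load whose columns mirror the file (d1 through d6, product, currency):

        -- Hypothetical staging table: build the new member name from the
        -- old product and currency fields, then extract this for the load.
        SELECT d1, d2, d3, d4, d5, d6,
               product || '_' || currency AS product,  -- a_USD, b_GBP, ...
               currency
        FROM stg_load;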

  • SQL*Loader to insert data file name during load

    I'd like to use a single control file to load data from different files (at different times) to the same table. I'd like this table to have a column to hold the name of the file the data came from. Is there a way for SQL*Loader to automatically do this? (I.e., as opposed to running an update query separately.) I can edit the control file before each load to set a CONSTANT to hold the new data file name, but I'd like it to pick this up automatically.
    Thanks for any help.
    -- Harvey

    Hello Harvey.
    I've previously attempted to store a value in a global/local OS variable and use it within a SQL*Loader control file (Unix OS and Oracle versions between 7.3.4 and 10g). I was unsuccessful in every attempt and approach I could imagine. It was very easy, however, to use a sed script to make a copy of the control file, changing a string within it.
    Do you really want to store a file name on each and every record? Perhaps an alternative would be to use a relational model: create a file upload log table that stores the file name and an upload #, and then have the SQL*Loader control file call a function that reads that table for the most recent upload #. You'll save some disk space too.
    Hope this helps,
    Luke
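    Luke's relational model can be sketched in a few statements (all names here are hypothetical):

        -- One row per file load; insert a row just before each sqlldr run.
        CREATE TABLE file_upload_log (
          upload_id  NUMBER PRIMARY KEY,
          file_name  VARCHAR2(255) NOT NULL,
          loaded_at  DATE DEFAULT SYSDATE
        );

        -- Function the control file calls to pick up the current upload #.
        CREATE OR REPLACE FUNCTION latest_upload_id RETURN NUMBER IS
          v_id NUMBER;
        BEGIN
          SELECT MAX(upload_id) INTO v_id FROM file_upload_log;
          RETURN v_id;
        END;
        /

        -- In the control file, the column spec would then read something like:
        --   upload_id "latest_upload_id()"

    Each data row then carries only the compact upload_id, and the file name is looked up in file_upload_log when needed.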

  • Remove the data which was recently loaded, without disturbing the original

    Hello Experts!!
    I have a few tables in Oracle 9i/10g, and they already have data in them. I am migrating data coming from various source systems into these Oracle tables. There is a chance that after loading I might get some unwanted data in these tables.
    How do I remove just the data which I have loaded recently, without disturbing the original data the tables already have?
    A lot of people have advised me to back up those tables and reload the data if there is any problem, but I am looking for a different approach; I just don't want to change the existing system, as a lot of users use it.
    Could anyone help me solve this issue?
    Thanks,
    Vishal

    Visu wrote:
    "How are you loading the data daily? Is it direct path?
    1) Capture the primary key or rowid information into an interim table while loading data into the target table (either trigger or stored procedure) - triggers will not be fired if the load is direct path; otherwise it is the feasible way, as there will be no impact on the existing code."

    That's not true. The trigger will disable the direct path, not the other way around.
    Simple proof.
    TUBBY_TUBBZ?create table for_direct_pathery
      2  (
      3     column1 number
      4  );
    Table created.
    Elapsed: 00:00:00.07
    TUBBY_TUBBZ?
    TUBBY_TUBBZ?insert --+ APPEND
      2  into for_direct_pathery
      3  select 1 from dual ;
    1 row created.
    Elapsed: 00:00:00.00
    TUBBY_TUBBZ?
    TUBBY_TUBBZ?select * from for_direct_pathery;
    select * from for_direct_pathery
    ERROR at line 1:
    ORA-12838: cannot read/modify an object after modifying it in parallel
    Elapsed: 00:00:00.01
    TUBBY_TUBBZ?
    TUBBY_TUBBZ?rollback;
    Rollback complete.
    Elapsed: 00:00:00.00
    TUBBY_TUBBZ?
    TUBBY_TUBBZ?create or replace trigger disable_direct_path_access
      2  before insert on for_direct_pathery
      3  begin
      4     null;
      5  end;
      6  /
    Trigger created.
    Elapsed: 00:00:00.09
    TUBBY_TUBBZ?
    TUBBY_TUBBZ?insert --+ APPEND
      2  into for_direct_pathery
      3  select 1 from dual ;
    1 row created.
    Elapsed: 00:00:00.00
    TUBBY_TUBBZ?
    TUBBY_TUBBZ?select * from for_direct_pathery;
               COLUMN1
                     1
    1 row selected.
    Elapsed: 00:00:00.00
    TUBBY_TUBBZ?

    Notice how you can select from the table (since it wasn't direct-path loaded) when the trigger is in place.

  • How to change the date and time during OVM installation for Fusion Apps

    Hi,
    The customer is using a Fusion instance which is a V1 (Build 19) OVM installation shipped by Oracle, with demo data (including seeded user IDs and roles) in it.
    They now have an issue with the date format: the application screens show MM/DD/YYYY, whereas the (Singapore) standard is DD/MM/YYYY.
    Q: How do we change the date/time in the OVM image?
    Refer SR: 3-5640792461
    Regards
    Ganesh Ananthapadmanaban

    Not a problem. There are a number of other system and network setup commands that you might find useful at some time. From within the Remote Desktop Admin in the Send Unix Command window type:
    networksetup -help
    systemsetup -help
    That will give you lists of the commands, including the one I posted above, with their basic syntax. They're handy to know.
    Cheers.

  • How can we restrict users from changing the data in HFM?

    Hi All,
    We have a requirement from users: they don't want the base data loaded from SAP to HFM (via FDM through ERPi) to be changed in HFM at <Entity Currency>. They want the data to be read-only, so that nobody can change it in Grids, Forms or Smart View. But if we restrict it via Shared Services access, then they can't change the Ownership Management values either.
    Regards,
    Sushil

    Hi Thanos, thanks for your reply.
    Yes, I am aware of security classes, so your suggestion is to use security classes to restrict users? And how can I use phased submission for the same? I am new to HFM, so please bear with me.
    One more question: my application is an HFM EPMA application. Is it necessary to be an Application Administrator to change hierarchies and deploy the application from EPMA?
    Thanks,
    Sushil

  • How can I lock/protect the data on my lost MacBook Pro, which runs OS X? I have installed MacKeeper and activated theft lock.

    I lost my MacBook Pro and want to protect all the data on it. I have installed MacKeeper on the MacBook Pro and have activated theft lock. Now to whom and where should I report the loss of my MacBook Pro?

    Police
    Change all of your passwords

  • Error While Loading data from HFM to Essbase

    Hi,
    I am trying to load data from an HFM application to an Essbase application. The LKM I am using is "HFM Data to SQL" and the IKM is "SQL to Hyperion Essbase (DATA)". I am also using a couple of mapping expressions like {CASEWHEN(HFMDATA."Account" = '3110', 'PL10100', HFMDATA."Account")}, {CASEWHEN(HFMDATA."Period" = 'January', 'Jan', HFMDATA."Period")} etc.
    The error i am getting looks like this -
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 23, in ?
    com.hyperion.odi.essbase.ODIEssbaseException: Invalid column type specified for Data column [Account]
         at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
         at java.lang.reflect.Method.invoke(Unknown Source)
         at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
         at org.python.core.PyMethod.__call__(PyMethod.java)
         at org.python.core.PyObject.__call__(PyObject.java)
         at org.python.core.PyInstance.invoke(PyInstance.java)
    I am mapping the Account dimension of HFM to the Account dimension of Essbase. Both the source and target column types are "string" with length 80. If, instead of keeping the Essbase application as the target, I use an Oracle table whose columns are exactly like the Essbase application, ODI loads the data perfectly with proper conversion.
    Can anybody please help me out?
    Many thanks.
    N. Laha.

    Hi John,
    Thank you very much for your response. Pasting here the SQL generated at step "8. Integration - Load data into Essbase" in the Operator (Description tab), for your perusal -
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select CASEWHEN(C3_ACCOUNT = '3110', 'PL10100', C3_ACCOUNT) "Account",CASEWHEN(C4_PERIOD = 'January', 'Jan', C4_PERIOD) "TimePeriod",'HSP_Inputvalue' "HSP_Rates",C8_SCENARIO "Scenario",CASEWHEN(C2_YEAR = '2007', 'FY09', C2_YEAR) "Years",CASEWHEN(C1_CUSTOM1 = '2012', 'Atacand', C1_CUSTOM1) "Product",CASEWHEN(C7_VIEW = 'YTD', 'Input', C7_VIEW) "Version",CASEWHEN(C6_VALUE = 'USD', 'Local', C6_VALUE) "Currency",C9_ENTITY "Entity",C5_DATAVALUE "Data" from "C$_0AZGRP_PLData" where      (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    stmt.setFetchSize(srcFetchSize)
    rs = stmt.executeQuery(sql)
    #load the data
    stats = pWriter.loadData(rs)
    #close the database result set, connection
    rs.close()
    stmt.close()
    From the SQL you can see the CASEWHEN expressions. Yes, I added these in the interface mapping section with the help of the ODI expression editor. The following are the expressions I have used, one statement per mapping. This is just to make the HFM data compatible with Essbase: e.g. the account number in HFM for which I am extracting the data is 3110, but there is no account '3110' in the Essbase application, so the Essbase application would not be able to load the data. I hope it's fine to use such statements in mappings.
    CASEWHEN(HFMDATA."Year" = '2007', 'FY09', HFMDATA."Year")
    CASEWHEN(HFMDATA."View" = 'YTD', 'Input', HFMDATA."View")
    CASEWHEN(HFMDATA."Entity" = '100106_LC', '1030GB', HFMDATA."Entity")
    CASEWHEN(HFMDATA."Value" = 'USD', 'Local', HFMDATA."Value")
    CASEWHEN(HFMDATA."Account" = '3110', 'PL10100', HFMDATA."Account")
    CASEWHEN(HFMDATA."Custom1" = '2012', 'Atacand', HFMDATA."Custom1")
    CASEWHEN(HFMDATA."Period" = 'January', 'Jan', HFMDATA."Period")
    If you can get some idea from the SQL, please help me out!
    Many thanks.
    N. Laha
    Edited by: user8732338 on Sep 28, 2009 9:45 PM

  • How can I switch the connection pool dynamically while a load happens

    Hi,
    I have two databases which hold the same data, i.e. Prod_db and Prod_db1.
    I want to switch the connection pool dynamically while a load happens.
    E.g. during the load I want to hit Prod_db1; after the load completes I want to hit Prod_db. How do I achieve this?

    Create a dynamic repository variable for the DSN using an init block, so that the value changes based on your timings, and use this variable in the connection pool.
    If you use the same user and password for both databases, that will be easy; otherwise follow the same approach for the user ID and password as well.
    That should work; if not, post an update.
    If this helps, please mark it correct/helpful.
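    A minimal sketch of the init block SQL, assuming a hypothetical etl_status table that your load process keeps up to date:

        -- Returns the DSN the connection pool should use right now.
        SELECT CASE WHEN load_running = 'Y' THEN 'PROD_DB1'
                    ELSE 'PROD_DB'
               END AS dsn_value
        FROM etl_status;

    Schedule the init block to refresh around your load window, then reference the variable in the connection pool (e.g. in the Data Source Name field) via VALUEOF().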

  • How to integrate Oracle EBS with FDQM to load the data to Essbase

    Hi Experts,
    I need a document regarding how to integrate Oracle EBS with FDQM. I am taking Essbase as the target. I know how to integrate with HFM and load the data into HFM, so please share a document on how to get a source file from E-Business Suite to load into Essbase, preferably with an example.
    Thanks in advance

    The following link:
    http://docs.oracle.com/cd/E24674_01/epm.1112/fdm_adapter_readmes/es_g4_h_target_adapter_readme.html
    You need to configure the target dimensions of Essbase within the Essbase adapter of FDQM.
    For source integration of EBS, you have two options:
    1. Use Oracle ERPI (ERP Integrator)
    2. Make a custom data pump import script in FDQM to retrieve data from the EBS database
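    For option 2, the import script would typically just run a query against the EBS GL tables; a minimal sketch, assuming the standard GL_BALANCES and GL_CODE_COMBINATIONS tables (segment meanings depend on your chart of accounts, and a real query would also filter on ledger/set of books; the filter values here are illustrative):

        -- Hypothetical EBS GL extract for one period of actuals.
        SELECT cc.segment1 AS company,
               cc.segment3 AS account,
               b.period_name,
               b.currency_code,
               NVL(b.period_net_dr, 0) - NVL(b.period_net_cr, 0) AS amount
        FROM gl_balances b
        JOIN gl_code_combinations cc
          ON cc.code_combination_id = b.code_combination_id
        WHERE b.actual_flag = 'A'
          AND b.period_name = 'JAN-13';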

  • Data is not getting loaded into the cube

    Hi Experts,
    I had a cube on which I created aggregates; then I deleted the data of the cube, made some changes to it, and am trying to load the data again from PSA.
    But the data is not getting loaded into the cube.
    Can anyone tell me whether I have to delete the aggregates before I load fresh data into the cube? If yes, can you tell me how to delete them and load again?
    If it is something else, please help me resolve my issue.
    Thanks

    Hi,
    Deactivate the aggregates and then load the data from PSA to the cube.
    Then reactivate the aggregates. While reactivating them, you can also ask for them to be filled.
    Regards,
    Anil Kumar Sharma. P
