Essbase data backup - 9.3.1

Hi All,
I have a question around Essbase data backups.
We are on 9.3.1 and the Essbase data files (.ind and .pag) are under the default location (Hyperion\Essbase\App etc).
One of our apps is 70 GB and gets backed up nightly.
Now, is it necessary to take an export of the data for the backup to be complete? My main concern is that exporting 70 GB of data will take hours.
If for some reason we need to restore this app to another server, will copying all files under the Essbase\App\Appname folder also restore the data?
Thanks for your help.
Seb

Provided the Essbase application and database are stopped when the backup occurs, backing up the .ind and .pag files, as well as the database outline (*.otl) and the *.tct, *.db, *.dbb, and *.esm files, should allow you to recover from a backup. I have had successful recoveries from standard file backups of these files without issue, and in pretty good time.
I have not tried to restore to a different server. Usually when migrating, I copy the outline over, then export data from one database and import it into the other.
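For illustration, here is a rough sketch of the kind of cold-backup sequence I mean, driving the MaxL shell (essmsh) from a small Python script. The application name, paths, server, and credentials are placeholders, and the exact essmsh location and ARBORPATH layout may differ in your install.

# Cold backup sketch: stop the application, copy the application folder, restart it.
# Assumes essmsh is on the PATH and the app lives under the default location
# discussed above (Hyperion\Essbase\App\<AppName>). All names are placeholders.
import os
import shutil
import subprocess
import tempfile

APP        = "MyApp"                                  # placeholder application name
APP_DIR    = r"C:\Hyperion\Essbase\App\MyApp"         # .ind/.pag/.otl/.tct/.db/.dbb/.esm live here
BACKUP_DIR = r"E:\backups\MyApp_cold"                 # copy target (must not already exist)
USER, PWD, SERVER = "admin", "password", "essbase01"  # placeholders

def run_maxl(statements):
    """Write a throwaway MaxL script and run it with the MaxL shell (essmsh)."""
    with tempfile.NamedTemporaryFile("w", suffix=".msh", delete=False) as f:
        f.write("login '%s' '%s' on '%s';\n" % (USER, PWD, SERVER))
        f.write(statements + "\nlogout;\nexit;\n")
        script = f.name
    try:
        subprocess.check_call(["essmsh", script])
    finally:
        os.remove(script)

# 1. Stop the application so the .ind/.pag files are consistent on disk.
run_maxl("alter system unload application '%s';" % APP)

# 2. File-level copy of everything under the application folder.
shutil.copytree(APP_DIR, BACKUP_DIR)

# 3. Bring the application back up.
run_maxl("alter system load application '%s';" % APP)

To restore, you would copy the saved folder back under Essbase\App\<AppName> on the target server while the application is stopped, then load the application again.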
Imran
Edited by: ImranS on Apr 7, 2009 9:51 AM

Similar Messages

  • Essbase cube empty vs. Essbase data file not empty. Is it possible?

    Hi Guys,
    This morning end users alerted us that our Planning cube has no data. I have verified that the export file was completely empty, even though the Essbase data file (ess00001.pag) is about 3 MB. How is that possible? I am trying to recover the proper situation from backups, but I don't know how to identify when the database was cleared (end users haven't accessed Essbase for the last two weeks).
    Thanks

    Could the level 0 data have been zeroed out, and no agg run to propagate the zeros (or #Missing) to upper levels?
    Did you at one time have some data in the database, and then do a lock and send that overwrote it?
    If you had performed a database reset, that would leave no page file. If you simply overwrote data, then you could have some fragmentation (when data is overwritten, the data is written to a new space in the page file with the index pointing to new data. This leaves the prior data behind with no index pointer to it, but it's still in the page file.)
    Was the export file of all data, or level zero data?
    If it was a level 0 export, perform an all-data export (3 MB is not that much) to verify whether there is any data. If not, I'd chalk it up to fragmentation.
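    For what it's worth, here is a rough sketch of running that full export (plus a block-statistics query) as a MaxL script driven from Python; the app/db names, credentials, and server are placeholders, and essmsh is assumed to be on the path:

    # Sketch: full (all-level) export plus block statistics for a BSO database.
    # 'MyApp'.'MyDb', the credentials and the server name are placeholders.
    import os
    import subprocess
    import tempfile

    maxl = """
    login 'admin' 'password' on 'essbase01';
    export database 'MyApp'.'MyDb' data to data_file 'alldata.txt';
    query database 'MyApp'.'MyDb' get dbstats data_block;
    logout;
    exit;
    """

    with tempfile.NamedTemporaryFile("w", suffix=".msh", delete=False) as f:
        f.write(maxl)
        script = f.name
    try:
        subprocess.check_call(["essmsh", script])  # statement output appears on the console
    finally:
        os.remove(script)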
    Robert

  • Has somebody already tried HTBase - Essbase Hot Backup Tool?

    Dear all,
    Do you guys know this tool (HTBase - Essbase Hot Backup)?
    Is it necessary to enable transaction logging in essbase.cfg to make it work?
    I look forward to your responses. Regards.

    I saw a demo of it by Bruno and it works. They have a few other products in the pipeline. I am almost certain you do not need to enable transaction logging, as we debated the benefit of the tool if transaction logging were enabled and regular data exports were performed. I do not know of anyone using it at the moment.
    Thanks,
    Nathan

  • Is it possible to extract essbase data as a web service

    Hi All,
    I would like to know if it is possible to extract Essbase data as a web service. What are the things to look at to achieve this functionality? I would also appreciate help with a simple example.
    Thanks for your help in advance.
    Thanks,
    Praveen

    1)http://docs.oracle.com/cd/E26232_01/doc.11122/aps_admin.pdf
    2)http://code.google.com/p/essbase-plsql-interface/downloads/list?deleted=1&ts=1331485947
    3)http://essbase.ru/archives/category/performance/essbase-api/xmla
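    The Provider Services (APS) guide in the first link covers exposing Essbase over XML for Analysis (XMLA), which is the usual web-service route. As a rough illustration only (the endpoint URL and port, data source, cube name, and authentication approach below are assumptions that depend on how APS is configured), an XMLA Execute call looks roughly like this:

    # Sketch: send an MDX query to Essbase through the APS XMLA web service.
    # The URL, data source and cube are placeholders; authentication is deployment-
    # specific (often HTTP Basic or request properties) and is omitted here.
    import urllib.request

    XMLA_URL = "http://apsserver:13080/aps/XMLA"   # assumed default APS port
    MDX = "SELECT {[Measures].[Msr_2]} ON COLUMNS, [Time].Children ON ROWS FROM [Sample.Basic]"

    envelope = """<?xml version="1.0" encoding="UTF-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <Execute xmlns="urn:schemas-microsoft-com:xml-analysis">
          <Command>
            <Statement>%s</Statement>
          </Command>
          <Properties>
            <PropertyList>
              <DataSourceInfo>Provider=Essbase;Data Source=essbaseserver</DataSourceInfo>
              <Format>Multidimensional</Format>
            </PropertyList>
          </Properties>
        </Execute>
      </soap:Body>
    </soap:Envelope>""" % MDX

    req = urllib.request.Request(
        XMLA_URL,
        data=envelope.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8",
                 "SOAPAction": "urn:schemas-microsoft-com:xml-analysis:Execute"},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode("utf-8"))         # raw SOAP response containing the cell set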

  • Odi 11g - IKM SQL to Hyperion Essbase (DATA) log file always empty

    In ODI 11g, when using *"IKM SQL to Hyperion Essbase (DATA)"* with "LOG_ENABLED" = true,
    only an empty file is generated.
    Just the "LOG_ERRORS" file (if errors occur) is created.
    Is this just my issue?
    Can someone help me?
    P.S.: I get the same issue with *"IKM SQL to Hyperion Planning"*.
    Thanks in advance, Paolo

    Thanks, John, for your suggestion.
    Here is the patch: *"Patch 10302682: IKM SQL TO PLANNING: LOG FILE IS CREATED BUT NOTHING INSIDE."*
    I didn't see any other one about Essbase...
    I've been checking the support site all day.
    Paolo
    Edited by: Paolo on 19-apr-2011 8.44

  • Time series functions are not working in OBIEE for ESSBASE data source

    Hi All,
    I am facing a problem in OBIEE: I am getting error messages for measure columns with time series functions (Ago, ToDate, and PeriodRolling) in both the RPD and Answers.
    The error is "Target database does not support Ago operation".
    But I understand that OBIEE supports time series functions for Essbase data sources.
    We are using Hyperion 9.3.1 as the data source and OBIEE 11.1.1.5.0 as the reporting tool.
    Appreciate your help.
    Thanks,
    Aravind

    Hi,
    This is because the time series functions are not supported for fragmented content; see this note from Oracle Support:
    The error occurs due to the fact that fragmented data sources are used for some time series measures. Time series measures (i.e. AGO) are not supported on fragmented data sources.
    Confirmation is documented in the following guide - Creating and Administering the Business Model and Mapping Layer in an Oracle BI Repository > Process of Creating and Administering Dimensions
    Ago or ToDate functionality is not supported on fragmented logical table sources. For more information, refer to “About Time Series Conversion Functions” on page 197.
    Regards,
    Gianluca

  • RELIABLE data backup software recommendation for MacBook Pro WANTED!

    I own Retrospect 6.x and I must say that on the Intel platform it is less than stellar. On my 15" PowerBook I would get about 90 MB/minute, and on my new 17" MacBook Pro I am getting about 30 MB/minute, which is taking FOREVER. Can someone PLEASE recommend some reliable data backup software in universal binary that will obtain backup speeds similar to what I used to get out of Retrospect on my PowerBook? Thank you kindly!
    Scott

    Welcome to the forum,
    If you don't have a .Mac account, you may want to consider it. It has a fairly respectable Backup feature that's quite good and very well integrated with your Mac. I strongly recommend it. I can't say much about its speed compared to Retrospect, although it does perform incremental backups and offers a host of other options. Hope you find this useful.
    References:
    http://www.mac.com/1/solutions/backup.html
    http://www.mac.com/WebObjects/Welcome.woa?aff=consumer&cty=US&lang=en
    Regards,
    2.16 MBP (FW 1.0.1) Week-12 build   Mac OS X (10.4.6)   G4 Tower (OS 9/10), Dell 620 WorkStation (XP Pro), Gateway P4 (XP Home)

  • How to use Bind Variables in Essbase data control

    Hi,
    I am trying to use bind variables in an MDX query while creating the Essbase Data Control. I have used the query below with a bind variable. The query works in the Essbase admin console, but it throws an error (*Invalid MDX Query*) while creating the Essbase Data Control in JDeveloper.
    MDX Query : SELECT {[Measures].Msr_2} ON COLUMNS, [Time].Children ON ROWS FROM cube
    where ($name)
    Could anybody suggest how to use bind variables with the Essbase Data Control?
    Thanks,
    Swathi

    Hello Swathi, can you please explain how you created the Essbase Data Control? Also, were you able to figure this out?
    Thanks, Praveen.

  • ODI 11.1.1.5 IKM SQL to Hyperion Essbase (DATA) ERROR

    Hello!
    I'm loading data from Oracle database table "nbkr_source" to Essbase database nbkr.bdr using IKM SQL to Hyperion Essbase (DATA).
    I'm using an interface that successfully loads data into Essbase in ODI 10.1.3.5, but the same interface doesn't work in ODI 11.1.1.5...
    The error appears at step 6, "load data into essbase":
    [2011-12-02T08:21:28.985-04:30] [] [NOTIFICATION:16] [ODI-1126] [] [tid: 49] [ecid: 0000JFyGjmU6UOYjLpATOA1Eq7wG00000U,0:482] Agent localagent started session nbkr_sql_to_essbase2 (16001) in work repository WORKREP1 using context GLOBAL.
    [2011-12-02T08:21:32.401-04:30] [] [ERROR] [ODI-1217] [] [tid: 49] [ecid: 0000JFyGjmU6UOYjLpATOA1Eq7wG00000U,0:482] Session nbkr_sql_to_essbase2 (16001) fails with return code 7000.[[
    ODI-1226: Step nbkr_sql_to_essbase2 fails after 1 attempt(s).
    ODI-1240: Flow nbkr_sql_to_essbase2 fails while performing a Integration operation. This flow loads target table nbkr_bdrData.
    Caused By: org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 26, in <module>
         at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Invalid column type specified for Data column [План счетов]
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2457)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:662)
    Code:
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select C1_HSP_RATES "HSP_Rates",C2____________ "План счетов",C3_______ "Период",C4_________ "Сценарий",C5_______ "Версия",C6_______ "Валюта",C7____ "Год",C8________________ "Структура банка",C9_DATA "Data" from "C$_0nbkr_bdrData" where      (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    #stmt.setFetchSize(srcFetchSize)
    stmt.setFetchSize(1)
    print "executing query"
    rs = stmt.executeQuery(sql)
    print "done executing query"
    #load the data
    print "loading data"
    stats = pWriter.loadData(rs)
    print "done loading data"
    #close the database result set, connection
    rs.close()
    stmt.close()

    There is not a set limit; you set it to the most optimal value for the database you are loading to, which you will find by performing tests with different values.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Essbase Date Measures  in OBIEE

    Does OBIEE support Essbase date measures? Can we have an Essbase date measure displayed as a DATE in OBIEE?
    Thanks

    Hi Nilaksha,
    I thought you might say that. I think this is because Essbase doesn't use data types as such; all data is stored as a value, and the characteristics of that data, or how it is treated, are more like metadata. This certainly rings true for the time dimensions I have seen. I think your best bet is to look at how you can format the numbers in the columns to make them appear in the format you require.
    Regards
    Ed

  • Users are unable to access Essbase data-corrupt group

    Someone moved a Hyperion user to an LDAP directory group that Hyperion couldn't access, which seems to have corrupted the group to which the ID belongs. I got rid of the ID, but the users are still experiencing problems accessing Essbase data. We are on 9.3.1, and I have refreshed security filters and recycled many times. Does anyone have any suggestions? Should I reimport the secfile.txt?

    Where do I need to check the logs in tbl Logs under Appserver?
    Well, I am unable to see the complete input schedule. I have unprotected the sheet; there is still data at N38, but I can only see rows before row 58 and am unable to expand.
    Kindly suggest.

  • Ideas for Providing User Level Data Backup and Restore

    I'm looking for ideas for implementing a user level application data backup and restore in an Apex app.
    What would be great is to provide the user with an export file and a way to import that file. A bit overkill, but hopefully never needed.
    Another option that is perfectly doable is a report that simply provides a means to create an export of the data. Since I already have an interface, I could use such an export to feed that interface.
    Any thoughts?
    Hopefully I'm missing something already there for an end user to use.

    jlincoln wrote:
    "Do you mean "export" and "import" colloquially, or in the specific sense of the exp/imp/datapump utilities?"': I mean as in imp/exp Oracle utilities. Generally speaking, it would be neat to be able to export and import via an Apex an application. In this hosted environment I don't have that access but would this be a bad idea if you don't care about the existing data in the schema in which the data resides?I can envisage a mechanism using <tt>exp/imp</tt>, but since it requires <tt>dbms_scheduler</tt> external jobs and access to the file system it's highly unlikely to be possible in a hosted environment. (Unless you're doing the hosting?)
    Backup: Necessary for piece of mind and flexibility. I am working on a VB/Access user who does this today to get to the point when they can be comfortable with the backups occurring regularly and by the hosting site's DBA group.
    Restore: Like I said. I am working on a VB/Access user who does this today to get to the point when they can be comfortable with the backups occurring regularly and by the hosting site's DBA group. This is a very small data set. A restore would simply remove existing data and replace it with the new data.My opinion is that time would be better spent working on the user rather than a redundant backup and restore feature. Involve them in a disaster recovery exercise with whoever is hosting the environment to prove that their data is safe. Normally the inclusion of data in regular, effective database backups is sold as a major feature of APEX solutions.
    "What about security/privacy when this data ends up in uncontrolled environments?": I don't understand the point of this question. The data should not end up in uncontrolled environments. Just like the data in the database or its backups.Again, having data in a central, shared location protected by multiple levels of application, database, and OS security is usually seen as a plus for APEX over VB/Access. Exporting the data in toto to a PC/laptop that can be stolen or lost, and where it can be copied to USB drives/phones/email loses this protection.
    User Level: Because the end user must have access to the backup and restore mechanisms of the application.
    Application Data: The application data. Less than 10MB. Very small. It can be exported in a flat file downloaded by the end user. This file can then be used to upload and import via an existing application interface. For example.
    "I'm struggling to parse this for meaning.": When I say I have an existing interface I am referring to a program residing in the Apex application that will take data from a flat table structure (i.e. interface table), validate the data, derive data, and load into the target table structure.Other than the report export capability linked to above, there's nothing built-in to APEX that comes close to your requirement. If the data is simple enough that it can be handled in such a report, and you have a process that can read and recreate this export, then you have your backup/restore capability. If the data can't be handled in a simple report, then you'll need a more complex PL/SQL process to generate the file.

  • OBIEE report with HFM and Essbase Data source

    Hi All,
    Just want your thoughts: can we get data from both HFM and Essbase data sources in a single report? For example, for some accounts I want to show budget data from HFM and forecast data from Essbase in a single OBIEE report. If yes, how can we model both data sources in the RPD?

    It is possible, but the challenge is to align the account hierarchy and the entity hierarchy between Essbase and HFM. HFM looks at the data at a top-side consolidation level, whereas Planning has the more granular detail needed to forecast correctly. The best solution is either to build a custom table to import both sources, to bring the HFM data into Essbase, or to use EPMA. The best-in-class solution is still to align hierarchies and metadata across applications. We implemented the BI Financials Dashboard for the General Ledger, built a custom dashboard connected in real time to Planning, and are installing OFMA to connect HFM. Once that is done, you can rationalize metadata (if not done before) and start to link the different data into single reports. You will also have to ensure the data is refreshed at the same time to avoid inconsistency.

  • Import data backup into another DB instance

    Hello all.
    I need to import a data backup into another database instance, MaxDB version 7.3.
    On higher versions of MaxDB I performed this operation without any problems (http://help.sap.com/erp2005_ehp_04/helpdata/EN/43/d5ebc2c9ed3ab3e10000000a422035/frameset.htm).
    For this version of MaxDB I have the following problem:
    The commands don't match the help description. I define a backup template, transfer the DB instance to the admin state, open a utility session, and try to start the recovery (there is no db_activate RECOVER <template> command):
    dbmcli on Q46>util_connect
    OK
    dbmcli on Q46>recover_start DEMO
    ERR
    -24988,ERR_SQL: sql error
    -903,Message not available,blockcount mismatch
    What can I do? How can I start a recovery from a data backup taken from another DB on this version of MaxDB?

    I have tried to perform a backup of the original system and a recovery to the other database one more time.
    There is another error:
    dbmcli on Q46>recover_start DemoD46_recover
    ERR
    -24988,ERR_SQL: sql error
    -3014,Invalid end of SQL statement

  • Essbase Data Cache Full Error

    Hi, All,
    I am encountering an Essbase "data cache full" error while running some calc scripts. I have already tried setting the data cache size to between 800 MB and 1.6 GB, and the index cache to around 100 MB. It is quite a large BSO Essbase database, with one dimension of over 1,000 members, another of about 2,500 members, and the last one of about 3,000 members.
    I have three similar scripts and each is for different entities to aggregate on some of those dimensions.
    For example, I started by unloading the app/db and running calc script 1 for Entity 1, which completed successfully. However, when I continued with calc script 2 for Entity 2, it showed the "data cache full" error. After I unloaded the app/db and then ran calc script 2 again, the calc script completed with no errors.
    I am running on Essbase 11.1.1.3 on AIX platform 32-bit Essbase.
    Has anyone encountered this before? Is it a problem with how Essbase handles RAM in this case?
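    For reference, the cache settings were applied persistently with MaxL along these lines (a rough sketch only; 'MyApp'.'MyDb', the credentials, and essmsh being on the path are assumptions):

    # Sketch: set the BSO caches persistently with MaxL before running the calc scripts.
    # The settings take effect the next time the application/database is started.
    import os
    import subprocess
    import tempfile

    maxl = """
    login 'admin' 'password' on 'essbase01';
    alter database 'MyApp'.'MyDb' set data_cache_size 1600mb;
    alter database 'MyApp'.'MyDb' set index_cache_size 100mb;
    alter system unload application 'MyApp';
    alter system load application 'MyApp';
    logout;
    exit;
    """

    with tempfile.NamedTemporaryFile("w", suffix=".msh", delete=False) as f:
        f.write(maxl)
        script = f.name
    try:
        subprocess.check_call(["essmsh", script])
    finally:
        os.remove(script)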
    Thanks in advance

    Thank you John,
    We have found that it is the entity dimension that is responsible for this problem.
    I remember we encountered this kind of problem before, when we aggregated an application whose entity dimension hierarchy mixed shared and stored instances of the same level 0 members. To put it simply, there are three members under the "Entity" dimension member, which represent different views of the entity hierarchy over the same level 0 members. The first one has stored level 0 entity members, while the other two have shared ones. At that time, our client added another hierarchy with shared level 0 members, but they did not put this tree under the "Entity" dimension member directly; rather, they put it under the first child of "Entity", which is the one with stored level 0 members.
    It is a little bit confusing to describe the situation only in text. Anyway, at that time, the first hierarchy had both stored and shared instances of the same group of level 0 members, and the data cache was always full when aggregating. After we moved the fourth hierarchy to another tree, so that under that hierarchy the level 0 members are all shared instances, the aggregation worked flawlessly.
    I wonder why this happened and suspect it is related to the detailed calculation logic of Essbase. Could you shed some light on this topic? Thank you with all my heart!
    Warm Regards,
    John
