Using HAL to build Essbase from Enterprise

Has anyone used HAL to update an Essbase cube with data and metadata from Enterprise? This update would include updating dimensions. I'm looking for a replacement for Enterprise Link, as it doesn't work with Enterprise 5.5.
Thanks,
Nate

Our company tried to use HAL to extract data from Enterprise and load it into Essbase. We had a Hyperion consultant spend a week trying to set this up for us. Finally, after many frustrating attempts and numerous calls to the Hyperion help line, the consultant informed us that the Enterprise building block for HAL does not work and we would have to use a workaround. To make matters worse, Hyperion then billed our company for the time the consultant spent finding out that there was a problem with the software. As far as I know, the problem with the Enterprise building block has not been fixed.

Similar Messages

  • Project Server 2010 - Build Team from Enterprise Greyed out

    I am trying to find out why the Enterprise Resources are greyed out when I try to build an Enterprise Team for a project.
    Starting from Project Server, I open a project in Project Pro.  Then go to Resource..Add Resources..Build Enterprise Team.  In the Build Team window, the Enterprise Resources are greyed out and I am unable to add them to be Project Resources.
    Any ideas?

    David --
    To add to the excellent replies from Guillaume, if you are a member of the Project Managers group, you should be able to add resources to your project team (the resources should not be grayed out in the Build Team dialog).  This is assuming, however,
    that your Project Server administrator has not edited the default permissions in the Project Managers group or in the My Organization and My Projects categories that are part of the Project Managers group.
    Assuming that your Project Server administrator has NOT modified the permissions in the default Project Managers group, the other action that can cause your problem is if you are a member of MULTIPLE security groups, and in at least one of those groups, your
    Project Server administrator has set a Denied value on one, two, or all three of the permissions that Guillaume references in his first post.
    So, to echo what Guillaume has told you, your situation is most likely caused by a permissions issue.  That being the cause, your Project Server administrator would need to track down and resolve the source of this issue.  Hope this helps.
    Dale A. Howard [MVP]
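    The Deny-wins behavior Dale describes can be sketched in a few lines; a minimal illustration of how an effective permission is resolved across multiple groups (hypothetical data model for illustration only, not the actual Project Server API):

```python
# Sketch of how Project Server resolves a permission across multiple security
# groups: an explicit Deny in ANY group overrides an Allow in every other group.
# The 'allow'/'deny'/'not set' strings are a made-up model for illustration.

def effective_permission(group_settings):
    """group_settings: one value per security group the user belongs to,
    each 'allow', 'deny', or 'not set'."""
    if "deny" in group_settings:
        return False  # Deny always wins, regardless of other groups
    return "allow" in group_settings  # otherwise at least one Allow is required

# A user in the Project Managers group (Allow) plus a custom group (Deny)
# ends up with the permission denied -- and Build Team greyed out.
print(effective_permission(["allow", "deny"]))     # False
print(effective_permission(["allow", "not set"]))  # True
```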

  • Using HAL to generate XML files

    Greetings,
    I'd like to use HAL to build XML files, feeding certain variables in while these files are generated. When an XML file is done being created, it will contain the variables passed. There are over 500 XML files that need to be built, and for each of these files only 2 or 3 variables must be passed. A template of the XML file was created using the Batch Scheduler from Hyperion Reports, and this is the file I'd like to use as a template to build a HAL adapter from.
    Does anyone have a sample of the XML adapter (via screen shot) that I can use as a starting guide?
    Many thanks in advance for your time.

    Hi,
    I'm running into the same problem deploying the classes generated by the class generator. Code works fine from JDeveloper, but had to put my DTD in the directory where my classes are. Deploying the classes with Apache's JServ gives me a NullPointer exception on the first addNode method. I guess it can't find the DTD. I tried to put the DTD in many locations but this didn't fix the problem. Any suggestions?
    Steve,
    Did you fix this problem? Thanx!

  • Building report from PL/SQL cursor

    Hello,
    is there any way to build APEX report using just PL/SQL cursor?
    I don't have the grant to SELECT from views or tables, but I can use some functions returning row types and cursors. I know I can use them to build a table from scratch with htp.p etc., but that's not very nice. I want to do it using APEX reporting with its filtering and pagination functionality.
    Is it possible?
    Regards,
    Przemek

    Apologies for the delay, I was out of the office.
    Below is a package serving as the basis for creating a view based on a pipelined function. The package is just a skeleton and will not compile in its current form, you will need to go through it filling in the blanks as described by the comments.
    If you want some control over which rows are returned by the view from a larger result set, use the set_parameters function in the WHERE clause, e.g.:
    select * from really_big_complicated_slow_view where 1 = view_pkg.set_parameters(something_that_reduces_the_view_result_set_size);
    Or, a more concrete example:
    select result_text from view_to_convert_to_csv where 1 = view_pkg.set_parameters(pi_table => 'my_table', pi_where => 'whatever');
    In the spirit of full disclosure, I got the idea for using the "set_parameters" function in the view WHERE clause from a post or blog somewhere a couple of years ago but have lost track of who actually deserves the credit for the good idea.
    -Tom
    create or replace package demo_vw as
       -- Package to serve as the basis for a view based on a function
       -- Customize this record so that it represents a row from this view...
       type row_type is record (
          -- record fields here
       );
       type table_type is table of row_type;
       -- This function is used in the DDL to define the view, for example:
       --   create or replace view my_view (col1, col2, ..., colN) as
       --   select * from table(my_view_vw.get_view);
       function get_view
          return table_type
          pipelined;
       -- Customize this function header to accept the parameters for this view,
       -- if any. If this view does not require any parameters, the
       -- set_parameters function may be deleted.
       -- This function should always return 1 and is called as follows:
       --   select <whatever>
       --     from my_view
       --    where 1 = my_view_vw.set_parameters(p1, p2, p3, ..., pN);
       function set_parameters (pi_whatever1 in whatever,
                                pi_whateverN in whatever)
          return number;
    end demo_vw;
    /
    show errors package demo_vw
    create or replace package body demo_vw as
       -- Customize this list of private global variables to match the parameters
       -- required by the view. These variables are set, reset, and validated by
       -- set_parameters, reset_parameters, and valid_parameters respectively...
       g_var1 whatever;
       g_varN whatever;
       -- Customize this function header to accept the parameters for this view,
       -- if any. This function should always return 1 and is called as follows:
       --   select col1, col2, ..., colN
       --     from my_view
       --    where 1 = my_view_vw.set_parameters(p1, p2, p3, ..., pN);
       function set_parameters (pi_whatever1 in whatever,
                                pi_whateverN in whatever)
          return number
       is
       begin
          g_var1 := pi_whatever1;
          g_varN := pi_whateverN;
          return 1;
       end set_parameters;
       -- Customize...
       -- Assumes that set_parameters has been called to set the value of the
       -- view parameters.
       function valid_parameters
          return boolean
       is
          l_valid boolean := true;
       begin
          return l_valid;
       end valid_parameters;
       -- Customize...
       -- This is called at the end of the get_view function to reset the view
       -- parameters for the next caller.
       procedure reset_parameters
       is
       begin
          g_var1 := null;
          g_varN := null;
       end reset_parameters;
       -- Build and return each row for the view...
       function get_view
          return table_type
          pipelined
       is
          l_row row_type;
       begin
          if valid_parameters then
             -- do your process to populate the l_row variable here...
             pipe row (l_row);
          end if;
          reset_parameters;
          return;
       exception
          when others then
             reset_parameters;
             raise;
       end get_view;
    end demo_vw;
    /
    show errors package body demo_vw
    create or replace view demo as
       select * from table(demo_vw.get_view);

  • Building index from multiple master and child relationship tables

    Hello,
    My question is:
    Is it possible to create the index for master and child tables?
    If yes, can you please point me out to any links or give me an example.
    Actually I just followed the link below to create the index using multiple tables:
    Building index from multiple tables for text search
    I am able to create the index using the above link, but a problem occurred: when I search for one master data column value, it returns many rows with the same master data, one for each child row.
    For example:
    SELECT
    a.conc_program_name,
    a.conc_program_desc,
    b.param_name
    FROM a_master a, b_child b
    WHERE b.report_dtls_id = a.report_id
    AND CONTAINS (a.dummy, 'PAY') > 0
    which returns
    PAY Master A
    PAY Master B
    PAY Master C
    Please let me know if there is any way I can restrict this to a single row with concatenation of the child data, like
    PAY Master A B C
    Another doubt: I have a column value like p_consolidation_set_id, and when I give this in CONTAINS (a.dummy, 'p_consolidation_set_id') > 0, I am not able to get any results.
    Please let me know what I should do about this issue.
    Thanks
    Message was edited by:
    user496798

    There are various ways to concatenate the values. One nice generic solution is to use Tom Kyte's stragg function:
    http://asktom.oracle.com/pls/ask/f?p=4950:8:::::F4950_P8_DISPLAYID:2196162600402
    If p_consolidation_set_id is a variable name, not a value, then do not put quotes around it.
    Message was edited by:
    Barbara Boehmer
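    As a rough illustration of what a string-aggregate function like stragg does with the master/child rows (pure Python, not Oracle Text or stragg itself):

```python
# Collapse the child rows for each master row into one concatenated value,
# turning three "PAY Master" rows into a single "PAY Master A B C"-style row.

def stragg(rows):
    """rows: (master, child) pairs -> one (master, joined_children) per master."""
    grouped = {}
    for master, child in rows:
        grouped.setdefault(master, []).append(child)
    return [(master, " ".join(children)) for master, children in grouped.items()]

rows = [("PAY Master", "A"), ("PAY Master", "B"), ("PAY Master", "C")]
print(stragg(rows))  # [('PAY Master', 'A B C')]
```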

  • Associating member attributes using HAL in Essbase and Planning

    Has anyone used HAL 7.0 to associate member attributes using either the Essbase Adapter or the Planning Adapter? We are trying to upload the attributes to the members using a flat file, but there isn't very good documentation in the HAL user guide regarding this. Any help would be appreciated.
    Thanks
    Chris

    Hi, These db's are recently migrated from back end. Are there any issues with the db's?

  • Delete Members from Planning web Using HAL

    Hi All,
    I have loaded approximately 150 members under a custom dimension (Customer; all of them are level 0 with the Never Share property) using HAL. I haven't refreshed to Essbase yet. Now I would like to delete all 150 of these members and load fresh Customer dimension members.
    The issue is how I can delete all 150 loaded members from Planning (it's very time-consuming to delete 150 members from the Planning web). Can that be done using HAL?
    Please suggest
    Thanks
    Madan

    Yes, it can be done with HAL. You would just supply the parent and member in a source file and then set the operation port to "Delete Descendants",
    e.g.
    Parent,Child,Operation
    Acc1,Acc12,Delete Descendants
    The operation port should accept the following values:
    * Update - adds, updates, or moves the member being loaded
    * Delete Level 0 - deletes the member being loaded if it has no children
    * Delete Idescendants - deletes the member being loaded and all of its descendants
    * Delete Descendants - deletes the descendants of the member being loaded, but does not delete the member itself
    Cheers
    John
    http://john-goodwin.blogspot.com/
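    If the 150 members are already known, the source file John describes can be generated rather than typed by hand. A Python sketch (the dimension and member names are examples, and "Delete Level 0" is used here since the members have no children):

```python
# Generate a Parent,Child,Operation source file for a HAL delete load,
# one row per member to remove. Names below are hypothetical examples.
import csv
import io

def build_delete_file(parent, members, operation="Delete Level 0"):
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Parent", "Child", "Operation"])
    for member in members:
        writer.writerow([parent, member, operation])
    return buf.getvalue()

print(build_delete_file("Customer", ["Cust001", "Cust002"]))
```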

  • I want to put windows 7 on my iMac. Do I use Windows 7, 64 bit System Builder or do I use a "retail copy" of the Windows 7, 64 bit ? Microsoft says I should use the System Builder copy. A technical support individual from Apple said use the "retail copy"

    A Microsoft technical advisor said that the "retail copy" of Windows 7, 64-bit Home Premium would not work with Boot Camp to put the Windows operating system on the iMac. She said that I would have to use the System Builder copy of Windows 7 to put it on my iMac. An Apple technical advisor told me that I had to use the "retail copy" of Windows 7 because that was the one used by Apple when they developed the Boot Camp program. I have also seen some articles from Apple that suggest both copies of Windows 7 would work OK. Does anyone have any experience that shows which one, or both, I should purchase to put on my computer?

    Welcome to ASC!
    I used the 64-bit System Builder Edition when I bootcamped my 2010 iMac a couple of years ago. Worked perfectly--installed a very lean system. Without all the extra cr@pware that comes preinstalled on name-brand Win computers, the SBE OS actually works darned well.
    However, that was under Mac OS 10.6.8, which remains on that computer for compatibility reasons not related to Windows. Haven't BCed any newer Mac OS, so your mileage may vary.

  • Theoretically, can Warehouse Builder be used instead of Enterprise Link?

    Theoretically, can Oracle Warehouse Builder be used instead of Oracle BAM Enterprise Link?
    Also interesting: what are Oracle's plans for improving BAM Enterprise Link? Is it possible to include Enterprise Link functionality in Warehouse Builder?
    I would be pleased to receive any answers and thoughts.
    With best wishes,
    Sergey G.

    Yes. We are looking at both the BPEL Process Manager as well as the Oracle Warehouse Builder (OWB) product as additional options to Enterprise Link. However, for the official plans around this, please contact [email protected] (Product Manager).
    Thanks,
    Vishal

  • [svn] 4463: Use svn:ignore to suppress some new build output from appearing in 'svn status'.

    Revision: 4463
    Author: [email protected]
    Date: 2009-01-08 18:33:41 -0800 (Thu, 08 Jan 2009)
    Log Message:
    Use svn:ignore to suppress some new build output from appearing in 'svn status'.
    AIR SDK Readme.txt
    frameworks/projects/air
    lib/fxgutils.jar
    modules/fxgutils/classes
    QE Notes: None
    Doc Notes: None
    Bugs: None
    Reviewer: None
    Property Changed:
    flex/sdk/trunk/
    flex/sdk/trunk/frameworks/projects/
    flex/sdk/trunk/lib/
    flex/sdk/trunk/modules/fxgutils/

    Remember that Arch Arm is a different distribution, but we try to bend the rules and provide limited support for them.  This may or may not be unique to Arch Arm, so you might try asking on their forums as well.

  • Couldn't copy large amount of data from enterprise DB to Oracle 10g

    Hi,
    I am using iBATIS to copy data from EnterpriseDB to Oracle and vice versa.
    The datatype of a field on EDB is 'text' and the datatype on Oracle is 'SYS.XMLTYPE'.
    I am binding these to a Java String property in a POJO to bind values.
    I can successfully copy a limited amount of data from EDB to Oracle, but when there is more data I get the following exceptions with different Oracle drivers (though I can read large amounts of data from EDB):
    --- Cause: java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
    at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
    at com.ibatis.sqlmap.engine.impl.SqlMapExecutorDelegate.update(SqlMapExecutorDelegate.java:457)
    at com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl.update(SqlMapSessionImpl.java:90)
    at com.ibatis.sqlmap.engine.impl.SqlMapClientImpl.update(SqlMapClientImpl.java:66)
    at com.aqa.pojos.OstBtlData.updateOracleFromEdbBtlWebservice(OstBtlData.java:282)
    at com.aqa.pojos.OstBtlData.searchEdbAndUpdateOracleBtlWebservice(OstBtlData.java:258)
    com.ibatis.common.jdbc.exception.NestedSQLException:
    --- The error occurred in com/aqa/sqlmaps/SQLMaps_OSTBTL_Oracle.xml.
    --- The error occurred while applying a parameter map.
    --- Check the updateOracleFromEDB-InlineParameterMap.
    --- Check the parameter mapping for the 'btlxml' property.
    --- Cause: java.sql.SQLException: setString can only process strings of less than 32766 chararacters
    at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
    at com.iba
    I have the latest Oracle 10g JDBC drivers.
    Remember, I can copy any amount of data from Oracle to EDB, but not the other way around.
    Please let me know if you have come across this issue; any recommendation is very much appreciated.
    Thanks,
    CK.

    Hi,
    I finally remembered how I solved this issue previously.
    The JDBC driver isn't able to directly call the insert with an XMLTYPE column. The solution I used was to build a wrapper procedure in PL/SQL.
    Here it is (for insert, but I suppose that update will be the same):
    create or replace procedure insertXML(file_no_in in number, program_no_in in varchar2, ost_XML_in in clob, btl_XML_in in clob) is
    begin
       insert into AQAOST_FILES (file_no, program_no, ost_xml, btl_xml)
       values (file_no_in, program_no_in, xmltype(ost_XML_in), xmltype(btl_XML_in));
    end insertXML;
    /
    Here is the sqlmap file I used:
    <?xml version="1.0" encoding="UTF-8" ?>
    <!DOCTYPE sqlMap
    PUBLIC "-//ibatis.apache.org//DTD SQL Map 2.0//EN"
    "http://ibatis.apache.org/dtd/sql-map-2.dtd">
    <sqlMap>
         <typeAlias alias="AqAost" type="com.sg2net.jdbc.AqAost" />
         <insert id="insert" parameterClass="AqAost">
              begin
                   insertxml(#fileNo#,#programNo#,#ostXML:CLOB#,#bltXML:CLOB#);
              end;
         </insert>
    </sqlMap>
    And here is a simple program:
    package com.sg2net.jdbc;

    import java.io.IOException;
    import java.io.Reader;
    import java.io.StringWriter;
    import java.sql.Connection;
    import oracle.jdbc.pool.OracleDataSource;
    import com.ibatis.common.resources.Resources;
    import com.ibatis.sqlmap.client.SqlMapClient;
    import com.ibatis.sqlmap.client.SqlMapClientBuilder;

    public class TestInsertXMLType {
        public static void main(String[] args) throws Exception {
            String resource = "sql-map-config-xmlt.xml";
            Reader reader = Resources.getResourceAsReader(resource);
            SqlMapClient sqlMap = SqlMapClientBuilder.buildSqlMapClient(reader);
            OracleDataSource dataSource = new OracleDataSource();
            dataSource.setUser("test");
            dataSource.setPassword("test");
            dataSource.setURL("jdbc:oracle:thin:@localhost:1521:orcl");
            Connection connection = dataSource.getConnection();
            sqlMap.setUserConnection(connection);
            AqAost aqAost = new AqAost();
            aqAost.setFileNo(3);
            aqAost.setProgramNo("prg");
            Reader ostXMLReader = Resources.getResourceAsReader("ostXML.xml");
            Reader bltXMLReader = Resources.getResourceAsReader("bstXML.xml");
            aqAost.setOstXML(readerToString(ostXMLReader));
            aqAost.setBltXML(readerToString(bltXMLReader));
            sqlMap.insert("insert", aqAost);
            connection.commit();
        }

        public static String readerToString(Reader reader) {
            StringWriter writer = new StringWriter();
            char[] buffer = new char[2048];
            int charsRead;
            try {
                while ((charsRead = reader.read(buffer)) > 0) {
                    writer.write(buffer, 0, charsRead);
                }
            } catch (IOException ioe) {
                throw new RuntimeException("error while converting reader to String", ioe);
            }
            return writer.toString();
        }
    }

    package com.sg2net.jdbc;

    public class AqAost {
        private long fileNo;
        private String programNo;
        private String ostXML;
        private String bltXML;

        public long getFileNo() { return fileNo; }
        public void setFileNo(long fileNo) { this.fileNo = fileNo; }
        public String getProgramNo() { return programNo; }
        public void setProgramNo(String programNo) { this.programNo = programNo; }
        public String getOstXML() { return ostXML; }
        public void setOstXML(String ostXML) { this.ostXML = ostXML; }
        public String getBltXML() { return bltXML; }
        public void setBltXML(String bltXML) { this.bltXML = bltXML; }
    }
    I tested the insert and it works correctly
    ciao,
    Giovanni
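    The size threshold in the second stack trace explains when the wrapper becomes necessary: below the setString limit a plain string bind works, above it the value must travel as a CLOB. A pure-Python sketch of that decision (illustrative only, not a real JDBC or driver API; the limit is the one quoted in the error message):

```python
# Why the inserts fail only for larger documents: the thin driver's setString
# bind tops out at 32,766 characters, so bigger values need a CLOB bind --
# which is what the PL/SQL wrapper procedure achieves.

SETSTRING_LIMIT = 32766  # limit quoted in the java.sql.SQLException above

def choose_bind(value: str) -> str:
    """Pick a bind strategy for a string value (hypothetical helper)."""
    return "setString" if len(value) < SETSTRING_LIMIT else "CLOB"

print(choose_bind("x" * 100))    # setString
print(choose_bind("x" * 50000))  # CLOB
```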

  • Using HAL to extract and then load metadata

    Hi!
    We are designing a process in which HAL is used to extract a dimension from Essbase, which is then loaded into another application using a load rule.
    We are struggling with our attribute extraction: we have multiple attributes on an entity, and when we extract it from Essbase it creates a string with a comma delimiter. Example:
    - Entity1, Store, +, "Attribute1,Attribute2,Attribute3",UDA1
    Any ideas how to remove the "? I have tried using formulas like Mid(String,1,Len(String)-1), but it ignores the ".
    Thanks

    You could run a macro in Excel's VBA editor that would extract each member of the dimension and put each one in a column in a spreadsheet. The macro is:
    Sub mainSub()
        x = EssVConnect("Sheet1", "admin", "password", "server", "appname", "dbname")
        Set rng = Range("A1")
        vt = EssVGetMemberInfo("Sheet1", "dimName", 2, False)
        If IsArray(vt) Then
            cbItems = UBound(vt) + 1
            For i = 0 To UBound(vt)
                rng.Value = vt(i)
                Set rng = rng.Offset(1, 0)
            Next
        End If
    End Sub
    Of course you would need to have the EssVGetMemberInfo function declared, but it is a nice neat way to extract the members into separate cells in an Excel worksheet. And then it could be imported to a db cube.
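    Alternatively, the quoted attribute field in the original extract is ordinary CSV quoting, so a CSV-aware parser splits it cleanly instead of string formulas fighting the " characters; a Python sketch:

```python
# Parse one line of the HAL extract with the csv module: the quoted field
# "Attribute1,Attribute2,Attribute3" comes back as a single field, which can
# then be split into the individual attributes.
import csv
import io

line = 'Entity1,Store,+,"Attribute1,Attribute2,Attribute3",UDA1'
fields = next(csv.reader(io.StringIO(line)))
print(fields)      # quotes are gone; the attribute list is one field
attributes = fields[3].split(",")
print(attributes)  # ['Attribute1', 'Attribute2', 'Attribute3']
```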

  • How to use JAPI to Access Essbase 6.5

    Does anyone know how to use the JAPI to access Essbase 6.5?
    I was told to install Enterprise Services 7.3.1.10.
    But when I test the EDS samples, I can connect to Essbase but cannot get data from Essbase.
    Do I need to change to another version of Enterprise Services (I was told 7.3.1.10 is the latest version), or do some configuration in 7.3.1.10?
    Thanks a lot in advance

    Gee is correct. You will need to use EAS 7.x to add an analytic server. You will also have to do a few other tasks such as synching security, etc.
    Further, as you are not accessing the exact same version of Essbase as your version of EDS, there is a property setting you will have to change to be successful. In eds\7.1.3\bin, open essbase.properties and modify the server.olap.direct setting to set it to false. What this setting does is force EDS to use its own private set of C API files for all calls to Essbase. Otherwise, EDS will attempt to use pure Java calls to the server. Unfortunately, the pure Java calls were implemented in an 'evolutionary' manner, so every Essbase version got more calls, until sometime in the System 9 timeframe. By the time 9.3 shipped (and maybe before, I would have to check), all calls to the server are pure Java and that property setting is no longer available.
    Tim Tow
    Oracle ACE
    Applied OLAP, Inc
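    The essbase.properties change Tim describes amounts to flipping one key. A small Python sketch of editing a properties file's contents (the surrounding property names here are examples, not the real file's full contents):

```python
# Set (or append) a key=value pair in Java-style properties text, e.g. to
# switch server.olap.direct to false as described above.

def set_property(text, key, value):
    lines, found = [], False
    for line in text.splitlines():
        if line.split("=", 1)[0].strip() == key:
            lines.append(f"{key}={value}")  # replace the existing setting
            found = True
        else:
            lines.append(line)
    if not found:
        lines.append(f"{key}={value}")      # append if the key was absent
    return "\n".join(lines)

props = "server.olap.direct=true\nserver.port=5001"
print(set_property(props, "server.olap.direct", "false"))
```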

  • DC Build Fail in Enterprise Portal DC

    DC Build Fails in Enterprise Portal DC
    The DC uses JCA to execute BAPIs in the backend system.
    What DC should be added to the Used DCs to make the DC build successful?
    Thanks and Regards,
    Prasanna Krishnamurthy

    Hello,
    You need to use external libraries that contain the required jar files for your par file. You can create a new project of type External Library, add the jar file to the specified folder, and then create a public part based on this jar file. Reference this public part from your portal DC. This will create the build-time dependency you require.
    I recently submitted a how-to paper to SDN on this topic. Hopefully it will be posted soon.
    Thanks,
    Marty

  • Using an Adaptive RFC Model from JSP

    I'm trying to use an Adaptive RFC Model I created for use in our Web Dynpros from a Portal Application project. I've created 2 DCs:
    1. A Web Dynpro DC, only containing the imported Adaptive RFC Model. The model has been added to the public parts.
    2. A Portal Application DC, with the model DC added as a DC usage (along with the other required DCs)
    Now, everything appears to work from inside NWDS. The dot operator works while editing my JSPDynPage class, and everything builds fine, but when I deploy and test in the portal, I get "package not found" errors on the actual JSP. I assume this is because the JSP is compiled at run-time. How do I expose these classes to the JSP compiler?
    It seems like using the same method to call BAPIs in Web Dynpros and Java iViews would be a good practice following SAP's "reusable code" policy, so there must be a way to do it...

    Jonathan,
    The Adaptive RFC Model works correctly only when running inside the Web Dynpro container. JSP runs outside that container (in the regular J2EE web application container).
    The error you receive is due to missing run-time references to container classes.
    Actually, what you are trying to do is a dead end, sorry.
    Use the SAP Enterprise Connector instead of the Adaptive RFC Model. Generated connector classes work from any environment. Also, they are so similar to Adaptive RFC that you won't even notice a difference.
    Valery Silaev
    EPAM Systems
    http://www.NetWeaverTeam.com
