ODI 11.1.1.5 IKM SQL to Hyperion Essbase (DATA) ERROR

Hello!
I'm loading data from the Oracle database table "nbkr_source" into the Essbase database nbkr.bdr using IKM SQL to Hyperion Essbase (DATA).
I'm using an interface that successfully loads data into Essbase in ODI 10.1.3.5, but the same interface doesn't work in ODI 11.1.1.5...
The error appears at step 6, "load data into essbase":
[2011-12-02T08:21:28.985-04:30] [] [NOTIFICATION:16] [ODI-1126] [] [tid: 49] [ecid: 0000JFyGjmU6UOYjLpATOA1Eq7wG00000U,0:482] Agent localagent started session nbkr_sql_to_essbase2 (16001) in work repository WORKREP1 using context GLOBAL.
[2011-12-02T08:21:32.401-04:30] [] [ERROR] [ODI-1217] [] [tid: 49] [ecid: 0000JFyGjmU6UOYjLpATOA1Eq7wG00000U,0:482] Session nbkr_sql_to_essbase2 (16001) fails with return code 7000.[[
ODI-1226: Step nbkr_sql_to_essbase2 fails after 1 attempt(s).
ODI-1240: Flow nbkr_sql_to_essbase2 fails while performing a Integration operation. This flow loads target table nbkr_bdrData.
Caused By: org.apache.bsf.BSFException: exception from Jython:
Traceback (most recent call last):
File "<string>", line 26, in <module>
     at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
     at java.lang.reflect.Method.invoke(Method.java:597)
com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Invalid column type specified for Data column [План счетов]
     at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
     at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
     at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
     at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2457)
     at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
     at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
     at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
     at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
     at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
     at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
     at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
     at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
     at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
     at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
     at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
     at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
     at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
     at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
     at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
     at java.lang.Thread.run(Thread.java:662)
Code:
from com.hyperion.odi.common import ODIConstants
from com.hyperion.odi.connection import HypAppConnectionFactory
from java.lang import Class
from java.lang import Boolean
from java.sql import *
from java.util import HashMap
# Get the select statement on the staging area:
sql= """select C1_HSP_RATES "HSP_Rates",C2____________ "План счетов",C3_______ "Период",C4_________ "Сценарий",C5_______ "Версия",C6_______ "Валюта",C7____ "Год",C8________________ "Структура банка",C9_DATA "Data" from "C$_0nbkr_bdrData" where      (1=1) """
srcCx = odiRef.getJDBCConnection("SRC")
stmt = srcCx.createStatement()
srcFetchSize=30
#stmt.setFetchSize(srcFetchSize)
stmt.setFetchSize(1)
print "executing query"
rs = stmt.executeQuery(sql)
print "done executing query"
#load the data
print "loading data"
stats = pWriter.loadData(rs)
print "done loading data"
#close the database result set, connection
rs.close()
stmt.close()
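As a quick check, you can print the JDBC types the staging query actually hands to loadData. Below is a minimal Jython sketch in the style of the generated code above; it assumes it runs in an ODI step where odiRef is available, and the select is abbreviated to a placeholder (paste the full staging select from the failing step). Dimension columns are normally character data and the data column numeric, so an unexpected type on "План счетов" or "Data" should show up here:
# placeholder: paste the full staging select from the step above
sql = 'select C1_HSP_RATES "HSP_Rates", C9_DATA "Data" from "C$_0nbkr_bdrData" where (1=1)'
srcCx = odiRef.getJDBCConnection("SRC")
stmt = srcCx.createStatement()
rs = stmt.executeQuery(sql)
meta = rs.getMetaData()
# print the name and JDBC type of every column the KM will receive
for i in range(1, meta.getColumnCount() + 1):
    print meta.getColumnName(i), meta.getColumnTypeName(i), meta.getColumnType(i)
rs.close()
stmt.close()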


Similar Messages

  • Odi 11g - IKM SQL to Hyperion Essbase (DATA) log file always empty

    In ODI 11g, when using *"IKM SQL to Hyperion Essbase (DATA)"* with "LOG_ENABLED" = true,
    only an empty log file is generated.
    Just the "LOG_ERRORS" file (if errors occur) is created.
    Is this just my issue?
    Can someone help me?
    P.S.: I get the same issue even with *"IKM SQL to Hyperion Planning"*.
    Thanks in advance, Paolo

    Thanks John for your suggestion.
    Here is the patch: *"Patch 10302682: IKM SQL TO PLANNING: LOG FILE IS CREATED BUT NOTHING INSIDE."*
    I didn't see any other patch about Essbase...
    I have been checking the support site all day.
    Paolo
    Edited by: Paolo on 19-apr-2011 8.44

  • ODI - IKM SQL to Hyperion Essbase (DATA)

    Hi,
    I am using IKM SQL to Hyperion Essbase (DATA) for loading data into Essbase.
    Can anyone tell me the highest value I can give for COMMIT_INTERVAL in the options of IKM SQL to Hyperion Essbase (DATA)?
    Thanks
    V D Reddy

    There is no set limit; you set it to the most optimal value for the database you are loading to, which you will find by performing tests with different values.
    Cheers
    John
    http://john-goodwin.blogspot.com/
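    One way to run those tests is to script them. Here is a rough sketch in plain Python that assumes the interface has been compiled into a scenario (LOAD_ESSBASE_DATA, version 001), that the KM's COMMIT_INTERVAL option has been set to reference an ODI variable (MYPROJ.P_COMMIT_INT), and that it is run from the agent's bin directory; all of those names are placeholders.
    import os, time

    # time the same scenario with progressively larger commit intervals
    for interval in (1000, 5000, 20000, 100000):
        start = time.time()
        rc = os.system('./startscen.sh LOAD_ESSBASE_DATA 001 GLOBAL '
                       '"-MYPROJ.P_COMMIT_INT=%d"' % interval)
        print "COMMIT_INTERVAL=%d -> return code %d, %.1f seconds" % (interval, rc, time.time() - start)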

  • Unable to load data to Essbase (IKM SQL to Hyperion Essbase (DATA))

    I'm trying to load data to Hyperion Essbase. Unfortunately it isn't going so well. I've followed the instructions but I keep getting this "BUFFER_SIZE" error. I haven't changed the default BUFFER_SIZE (it is set at <Default>:80) and I haven't changed any other parameters in the KM.
    Appreciate any thoughts...
    com.hyperion.odi.common.ODIConstants has no attribute *'BUFFER_SIZE'*
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 82, in ?
    AttributeError: class 'com.hyperion.odi.common.ODIConstants' has no attribute 'BUFFER_SIZE'
    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
    at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
    at com.sunopsis.dwg.cmd.e.i(e.java)
    at com.sunopsis.dwg.cmd.h.y(h.java)
    at com.sunopsis.dwg.cmd.e.run(e.java)
    at java.lang.Thread.run(Unknown Source)
    Edited by: Chris Rothermel on Apr 13, 2010 3:44 PM

    Yes! That's great but I'm not out of the woods yet. I had applied the patch incorrectly before -- I'd extracted it to the wrong directory. Now I get this new error: Missing standard dimension column for data load
    Any thoughts on how to get by this one?
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 23, in ?
    com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
    at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
    at org.python.core.PyMethod.__call__(PyMethod.java)
    at org.python.core.PyObject.__call__(PyObject.java)
    at org.python.core.PyInstance.invoke(PyInstance.java)
    at org.python.pycode._pyx1.f$0(<string>:23)
    at org.python.pycode._pyx1.call_function(<string>)
    at org.python.core.PyTableCode.call(PyTableCode.java)
    at org.python.core.PyCode.call(PyCode.java)
    at org.python.core.Py.runCode(Py.java)
    at org.python.core.Py.exec(Py.java)
    at org.python.util.PythonInterpreter.exec(PythonInterpreter.java)
    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
    at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
    at com.sunopsis.dwg.cmd.e.j(e.java)
    at com.sunopsis.dwg.cmd.g.z(g.java)
    at com.sunopsis.dwg.cmd.e.run(e.java)
    at java.lang.Thread.run(Unknown Source)
    Caused by: com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
    at com.hyperion.odi.essbase.ODIEssbaseDataWriter.validateColumns(Unknown Source)
    ... 32 more
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
    at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
    at com.sunopsis.dwg.cmd.e.j(e.java)
    at com.sunopsis.dwg.cmd.g.z(g.java)
    at com.sunopsis.dwg.cmd.e.run(e.java)
    at java.lang.Thread.run(Unknown Source)

  • ODI Error: Error when selecting IKM Selector: IKM SQL to Hyperion Planning

    Hi,
    I'm trying to build the Accounts dimension from a tab-delimited text file. I have set up the logical and physical architecture and contexts.
    On the 'Flow' tab of my Model, I click on my Target header. Then on the 'IKM Selector' dropdown, I select 'IKM SQL to Hyperion Planning'. It gives the error:
    Internal Error: ODI-15526: IKM IKM SQL to Hyperion Planning not accepted for staging and target technology
    Any ideas? I think I'm really close. I'm really new/green on ODI and I'm at the very end of trying to actually run my dimension build.
    All help is appreciated.

    The Hyperion technologies do not support being set as the target staging area, so you will always need to set a separate staging area; this can be a relational staging area such as Oracle, SQL Server, or the memory engine.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • IKM File to Hive (Load Data) error

    I use ODI version 11.1.1.6.
    I need to transfer a csv file from HDFS to a Hive table.
    My csv file is located in the HDFS path /user/oracle/inboxlog.
    My physical architecture is set up as follows:
    - File :
         Dataserver:
              Name : HDFSFILE
              jdbc Driver: empty
              jdbc Url: hdfs://mynode:8020
         Physical Schema:
              Directory Schema: /user/oracle/inboxlog
              Directory WorkSchema: /user/oracle/inboxlog
              Context: HADOOP
              Logical Schema: LogicalHdfs
    - Hive:
         Dataserver:
              Name : NEW_HIVE
              jdbc Driver: org.apache.hadoop.hive.jdbc.HiveDriver
              jdbc Url: jdbc:hive://mynode:10000/default
              hive metastore uri: thrift://mynode:10000
         Physical Schema:
              Directory Schema: default
              Directory WorkSchema: default
              Context: HADOOP
              Logical Schema: NEW_HIVE_LOGICAL
    My logical architecture is set up as follows:
    -File:
         Name: LogicalHdfs
         Context: HADOOP
         Physical Schema: HDFSFILE./user/oracle/inboxlog
    -Hive:
         Name: NEW_HIVE_LOGICAL
         Context: HADOOP
         Physical Schema: NEW_HIVE.default
    I created an interface that uses the csv file as the source and a Hive table as the target.
    The source file is defined in a datastore as follows:
      name: myfile.csv
      resource name : myfile.csv
      datastore type: table
      file format : delimited
      record separator: unix
      field separator: ,
      text delimited: "
    and it has 5 columns of string type.
    The target Hive table is defined in a datastore:
      name: mytable
      resource name : mytable
      datastore type: table
    and it has the same columns as myfile.csv.
    For the interface I use IKM File to Hive (Load Data) with these parameters:
      create target table: true;
      truncate :true;
      file is local: false;
      use staging table: false
    The remaining parameters are left at their defaults.
    When I start the interface, it fails at the third step with the following error:
    org.apache.bsf.BSFException: exception from Groovy: java.sql.SQLException: Exception encountered while submitting:
    load data  inpath 'hdfs://mynode:8020/user/oracle/inboxlog/myfile.csv' overwrite
    into table mytable
    Query returned non-zero code: 10028, cause: FAILED: SemanticException [Error 10028]: Line 1:18 Path is not legal ''hdfs://mynode:8020/user/oracle/inboxlog/myfile.csv'': Move from: hdfs://mynode:8020/user/oracle/inboxlog/myfile.csv to: hdfs://bdavm-ns/user/hive/warehouse/myfile is not valid. Please check that values for params "default.fs.name" and "hive.metastore.warehouse.dir" do not conflict.
      at org.codehaus.groovy.bsf.GroovyEngine.exec(GroovyEngine.java:110)
      at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
      at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
      at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
      at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
      at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
      at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
      at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
      at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
      at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
      at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
      at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
      at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
      at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
      at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
      at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
      at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
      at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
      at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
      at java.lang.Thread.run(Thread.java:662)
    Caused by: java.sql.SQLException: Exception encountered while submitting:
    load data  inpath 'hdfs://mynode:8020/user/oracle/inboxlog/myfile.csv' overwrite
    into table mytable
    Query returned non-zero code: 10028, cause: FAILED: SemanticException [Error 10028]: Line 1:18 Path is not legal ''hdfs://mynode:8020/user/oracle/inboxlog/myfile.csv'': Move from: hdfs://mynode:8020/user/oracle/inboxlog/myfile.csv to: hdfs://bdavm-ns/user/hive/warehouse/mytable is not valid. Please check that values for params "default.fs.name" and "hive.metastore.warehouse.dir" do not conflict.
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
      at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
      at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:77)
      at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:107)
      at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:52)
      at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:192)
      at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:200)
      at flexUtilHive.executeQuery(Prepare_Hive_session:53)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      at java.lang.reflect.Method.invoke(Method.java:597)
      at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrap.invoke(PogoMetaMethodSite.java:246)
      at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.call(PogoMetaMethodSite.java:63)
      at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:40)
      at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:117)
      at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
      at Load_data_file_s_.run(Load_data_file_s_:113)
      at groovy.lang.GroovyShell.evaluate(GroovyShell.java:576)
      at groovy.lang.GroovyShell.evaluate(GroovyShell.java:614)
      at groovy.lang.GroovyShell.evaluate(GroovyShell.java:595)
      at org.codehaus.groovy.bsf.GroovyEngine.exec(GroovyEngine.java:108)
      ... 19 more
    any idea?

    Hi all,
    for those who are interested, I solved the problem: it was the file permissions of the loaded file.
    Hive moves the input file from its original path to its warehouse directory.
    In order to move the file, the agent needs write permission on the HDFS input directory. Once I granted the permission, it was solved.
    Regards...
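    For reference, the permission change itself is a one-liner on the HDFS side. A minimal sketch follows; the mode and the owner/group are only examples, so use whatever matches the OS user the ODI agent runs as.
    import os

    # let the agent's OS user write to the HDFS input directory, since Hive's
    # LOAD DATA INPATH moves the file into the warehouse directory
    os.system("hdfs dfs -chmod -R 775 /user/oracle/inboxlog")
    # or hand the directory over to the agent user (user/group are placeholders)
    # os.system("hdfs dfs -chown -R oracle:hadoop /user/oracle/inboxlog")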

  • SQL to Access Essbase data

    I have a user who would like to use SQL to access Essbase cube data. Any suggestions? Thanks!

    The user could just use the MaxL interface and an interactive report script. It accomplishes the same goal of returning a result set.
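    That route can also be scripted end to end. Here is a rough sketch that assumes the MaxL shell (essmsh) is installed, that a report script named rep1 already exists on the server, and that the Sample.Basic demo cube is the source; the host, credentials and file paths are placeholders.
    import os

    # write a one-shot MaxL script that runs the server-side report script rep1
    # against Sample.Basic and spools the result set to a flat file
    maxl = ("login admin password on essbasehost;\n"
            "export database Sample.Basic using server report_file 'rep1'"
            " to data_file 'c:/temp/sample_export.txt';\n"
            "logout;\n")
    f = open("export_data.msh", "w")
    f.write(maxl)
    f.close()

    os.system("essmsh export_data.msh")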

  • Hyperion11.1.2.2 Sql Server and Essbase data load error

    Gentlemen,
    I have an issue loading data into Essbase from SQL Server: the load was running fine, and then all of a sudden I get network error 10054:
    network error 10054 failed to receive data / send data
    unexpected Essbase error 1042013
    Even when I try to load manually, it fails with: Data load buffer [9999] does not exist, Unexpected Essbase error 1270040
    and in the logs:
    Received client request: MaxL: Execute (from user [admin@Native Directory
    *Error writing to server*
    Also tried with MaxL; it's the same issue.
    Is the .esm file locked?
    Is there any issue with ODBC?
    Or is there an orphaned connection to SQL Server that is stopping Hyperion from processing another?
    Let me know your thoughts.
    thanks.

    Can you try limiting the query so it returns fewer records? Try adding a WHERE clause and see whether that works.
    Regards
    Celvin
    http://www.orahyplabs.com
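    A quick way to see whether the volume is what breaks the connection is to run the same query with a row cap from outside the load. Below is a minimal Jython/JDBC sketch; the host, database, credentials, table name and the TOP value are all placeholders. If the capped query (and a load of that capped set) goes through, timeouts on large volumes are the first suspect.
    from java.lang import Class
    from java.sql import DriverManager

    Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver")
    cx = DriverManager.getConnection(
        "jdbc:sqlserver://sqlhost:1433;databaseName=stagingdb", "odi_user", "password")
    stmt = cx.createStatement()
    # TOP caps the result set; raise it step by step to find where the load starts failing
    rs = stmt.executeQuery("select TOP 10000 * from essbase_load_source")
    rows = 0
    while rs.next():
        rows = rows + 1
    print "fetched", rows, "rows"
    rs.close()
    stmt.close()
    cx.close()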

  • ODI 10.1.3.5 - Text file to Hyperion Essbase - Metadata

    Hello folks,
    I tried to insert a new member into my outline in Essbase 9.3.1 using the following text file content:
    Member;Parent;Alias;Consolidation;Data Storage
    A;Account;A-Test;+;Never Share
    The error description follows below, but I could not understand the cause of the issue.
    When I use the same text file to load Hyperion Planning, it works fine.
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 20, in ?
    java.sql.SQLException: Unexpected token: PARENTNAME in statement [select   C3_PARENT    ""ParentName]
    Does somebody know what happened?
    Best Regards,
    Wallace Galvão
    Sao Paulo - Brazil

    Hello John,
    I don't have quotes in the column separator or table alias separator fields, but I did find quotes in the Object Delimiter field on the Language tab. I removed them and now get a new error; see below.
    File "<string>", line 23, in ?
    com.hyperion.odi.essbase.ODIEssbaseException: Cannot build dimension. Analytic Server Error(1030011): Invalid character in name (<Default>) in ESSAPI function adBuildDim
    SQL Generated:
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select C2_PARENT "ParentName",C3_MEMBER "MemberName",C1_ALIAS "Alias",C4_DATA_STORAGE "DataStorage",C5_CONSOLIDATION "Consolidation" from "C$_0Account" where      (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    stmt.setFetchSize(srcFetchSize)
    rs = stmt.executeQuery(sql)
    #load the data
    stats = pWriter.loadData(rs)
    #close the database result set, connection
    rs.close()
    stmt.close()
    Besides, it seemed strange to me to find two languages defined on the Essbase technology: Jython and SQL. On the Definition tab, the technology type was "database or file (JDBC/ODBC)"; I changed it to "bean scripting framework" (similar to the Hyperion Planning technology) and removed SQL, keeping only Jython, but the issue continues.
    Also, I replaced my odihapp_essbase.jar (date created 9/9/2009) and re-added the Essbase technology, but without success.
    Do you have any idea how to solve this issue?
    Regards,
    Wallace Galvão
    Sao Paulo - Brazil

  • ODI Error Handling: IKM for Essbase (Data), Check reject at Commit Intervls

    All,
    I am trying to see if there is a way I can handle errors in the ODI IKM for SQL to Hyperion Essbase (Data), so I can switch to using a load rule interface if there are rejects.
    I am thinking we could check for rejects after every commit interval (right now using the default of 1,000 records) and continue to the next set of 1,000 records only if there are no rejects.
    If there is even a way of aborting the interface run, i.e. preventing it from switching to loading records line by line when a reject occurs, I can check a log and kick off an interface that uses an Essbase load rule to continue loading.
    I don't know if it all sounds too hypothetical, but I want to see if anyone has ideas around this approach.
    Please share any thoughts.
    Thanks,
    Anindyo

    Thanks John, I was thinking along those lines.
    But it would help if there were a way of collecting information on what was rejected, without having to set up a new physical object to pull from the work repository or from the file.
    We are trying to get away from any KM customization.
    Do you know what we can check for here? Is there a way of refreshing a variable in case of a failure, which we can check in the next step?
    Thanks,
    Anindyo
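    One low-tech option that avoids customizing the KM: point the IKM's error-log option at a known file and add a small Jython procedure step after the load that fails when that file is non-empty, so the package can take the KO branch into the load-rule interface. A rough sketch follows; the path is a placeholder for wherever the KM writes its rejects.
    import os

    err_file = "/odi/logs/essbase_data_load.err"   # placeholder: the KM's error log file

    rejects = 0
    if os.path.exists(err_file):
        f = open(err_file)
        rejects = len([line for line in f.readlines() if line.strip()])
        f.close()

    if rejects > 0:
        # raising marks this step as failed, so the package can branch (KO link)
        # to the interface that loads through an Essbase load rule instead
        raise Exception("%d rejected records - switching to load rule path" % rejects)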

  • QUESTION:  Essbase data extraction and Installing ODI Agent??

    For extracting data from Essbase cubes, ODI has "LKM Hyperion Essbase DATA to SQL".
    We can use (1) a report script, (2) an MDX query, or (3) a calc script.
    For data extraction using a calc script, the ODI Agent must be running on the same server as the Essbase server.
    Does anyone know whether the ODI Agent is needed on the Essbase machine if we use the MDX query method for data extraction?
    We would like to avoid installing the ODI Agent just for Essbase data extraction.

    Thanks John.
    One related question: to move data from one Essbase cube to another Essbase cube using an ODI interface, can we do it efficiently through an MDX query?
    We want to avoid replicated partitioning or calc scripts, if possible.
    BTW... Your ODI/Hyperion blog is a bible for us.
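    For reference, the MDX route only needs a statement that returns the slice you want to move. Below is a small example of the kind of query the LKM's MDX extraction option can run, held in a Python string purely for illustration; the cube and member names come from the Sample.Basic demo outline.
    # pull first-quarter Sales and COGS for East from Sample.Basic;
    # the result set then flows through the staging area into the target cube
    mdx = """
    SELECT
      {[Jan], [Feb], [Mar]} ON COLUMNS,
      {[Sales], [COGS]} ON ROWS
    FROM Sample.Basic
    WHERE ([Actual], [East])
    """
    print mdx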

  • ODI - IKM SQL to File Append - Header not Generated

    I'm using ODI IKM SQL to File Append to create a text file, but the header is not being generated, even though GENERATE_HEADER is set to Yes. The file is tab delimited and the Heading (number of lines) is set to 1.
    It only seems to be an issue with HFM files coming from the Unix server.
    Any suggestions?
    Thanks, Mike

    Ok, getting the following error in step 6 - Integration - HFM_EA_Translate - Insert Column Headers
    java.lang.NumberFormatException
         at java.math.BigDecimal.<init>(BigDecimal.java:459)
         at java.math.BigDecimal.<init>(BigDecimal.java:728)
         at com.sunopsis.sql.SnpsQuery.updateExecStatement(SnpsQuery.java)
         at com.sunopsis.sql.SnpsQuery.addBatch(SnpsQuery.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.h.y(h.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Thread.java:662)
    The last two columns in the file are numeric, which seems to be causing the issue. I will change the formatting of the two columns (String, Numeric, etc.) to see if I can resolve the header issue.
    Thanks,
    Mike

  • ODI IKM SQL to Planning Issue

    Can anyone help me out with this one? I'm trying to run an IKM to Planning to update a Planning dimension's metadata outline and am using IKM SQL to Planning. After executing it, the Operator shows the status completed, but an exception occurred in one of the steps, "Report Statistics". I'm using the Sunopsis_Memory_Engine as my staging area, and my flow starts with a flat file (LKM File to SQL), the staging area is Sunopsis, and the target is Planning (IKM SQL to Planning). In my mapping diagram, I have all columns executing on the staging area, except for the parent and child columns, which I have executing on the source. Below is the execution message for the "Report Statistics" step that seems to fail. Does anyone know how to address this?
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 2, in ?
    Planning Writer Load Summary:
         Number of rows successfully processed: 0
         Number of rows rejected: 237
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)

    Hi,
    Have you also enabled the error logs in the IKM? That log should explain why the records are being rejected.
    In the Operator, a failure is a red icon; the icons you are pointing out are yellow. They may indicate an error, but they are not serious: for instance, ODI tries to drop a table, and if it doesn't exist it warns and carries on (if the table existed it would drop it). These are not serious failures; the ones to be concerned about are red.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Error in IKM SQL to JMS XML Append

    Hi
    I am transforming data from an Oracle table to XML and sending that XML to a JMS queue.
    So I am using 1) IKM SQL to JMS XML Append and 2) LKM SQL to SQL.
    In IKM SQL to JMS XML Append I set the following parameters:
    SYNCHRO_XML_TO_JMS =      true
    INITIALIZE_XML_SCHEMA =      true
    JMS_EXTRACT_MESSAGE = <Root element name>
    When I execute the interface, I get an error in the step "Insert into XML (JMS Message)":
    ODI-1228: Task INBOUND_XML_TEST (Integration) fails on the target JMS_QUEUE_XML connection INBOUND_XML.
    Caused By: java.sql.SQLException: java.sql.SQLException: Parameter not set
         at com.sunopsis.jdbc.driver.JMSXMLPreparedStatement.addBatch(JMSXMLPreparedStatement.java:62)
         at oracle.odi.runtime.agent.execution.sql.BatchSQLCommand.execute(BatchSQLCommand.java:42)
         at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)
         at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)
         at oracle.odi.runtime.agent.execution.DataMovementTaskExecutionHandler.handleTask(DataMovementTaskExecutionHandler.java:84)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740
    Regards,
    Ankush.

    Hi,
    Is this issue resolved? If so, can you provide the fix?
    I am getting the same error.
    Thanks,
    Ruby
    Edited by: Rubellah Rajakumar on Oct 1, 2012 5:21 PM

  • ODI 10.1.3.5 with SQL Server 2005

    Good afternoon,
    We recently installed 10.1.3.5, in a Windows environment with SQL Server 2005 and Hyperion Essbase/Planning/HFM 11.1.1. We were following John Goodwin's infamous blog (thanks, John!), but ran into a number of snags.
    For instance, when we set up interfaces with flat files as the source, loading metadata either to Planning or Essbase, ODI errors out when:
    1) our header name in the flat file has spaces (using double quotes didn't work)
    2) our header name in the flat file is different than the reverse property name from Essbase/Planning.
    Also, we're now seeing an issue in our Flat file to Essbase metadata interface where the SQL used in the IKM seems to be incorrect. Below is the error message we're receiving:
    "org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 20, in ?
    com.microsoft.sqlserver.jdbc.SQLServerException: An object or column name is missing or empty. For SELECT INTO statements, verify each column has a name. For other statements,
    look for empty alias names. Aliases defined as "" or [] are not allowed. Add a name or single space as the alias name."
    We are thinking that the culprit may be either the wrong .jar or .jre we're using with SQL Server 2005, as there seem to be inherent issues with the SQL generated in the LKM and IKM. Our questions to you:
    1. What sqljdbc.jar driver version are you using?
    2. What .jre version are you using?
    3. What environment does the above work correctly in? (perhaps an earlier version of ODI against an earlier version of Hyperion?)
    FYI - We have tried sqljdbc 1.0 and 2.0, with .jre 1.6.0_10.
    Thanks in advance!
    -O
    Edited by: Alapat on Jan 22, 2009 8:04 AM

    Thank you very much for responding!
    After a 2 week battle, we are now finally coming to a close on our issues (although we cannot say for sure what all of the causes were).
    Here is what we discovered about our questions, through other threads:
    1. John Goodwin was using JRE 1.6.x, so we ruled that out as a culprit.
    2. Others are using SQL Server 2005 just fine with ODI, so we ruled this out too. We chose to keep the sqljdbc 2.0 driver.
    Here are some issues we did uncover during our investigation:
    1. We were having issues with our agent, specifically regarding the HFM technology. We created a new one, added a few extra parameters, compiled the agent, and are now back up and running. We realized the problem when we switched to "Local (No Agent)" and saw that we were not having as many issues.
    2. We went into the physical agent and pressed "Update Scheduling". We do not know if this had any effect, but we suddenly noticed that our flat file header issues were resolved.
    3. The Essbase issue was due to a configuration error with our Hyperion Essbase physical technology. We went into that technology and changed a setting on the "Languages" tab, called "Object Delimiter". It had double-quotes in it - we removed that parameter and instantly saw a change in the SQL being executed.
    4. Some of the mappings in our interfaces needed to be executed on different environments. i.e. - We needed to switch to "staging" when the flat file header was different than the target column name. Newbie mistake - we thought ODI had built-in intelligence to choose correctly for us.
    5. When doing a reverse on Essbase models, we noticed that the Essbase columns would populate with an "Undefined" datatype, instead of the expected CHAR. We have to manually define all column datatypes after every Essbase reverse now. Must be a bug.
    6. We also noticed when doing a flat file reverse that all columns auto-populate with datatype STRING, length 50. We have to manually personalize that option for different Hyperion applications, based on need. i.e. - some require numeric datatypes, or smaller/larger lengths. This was obviously a newbie error. :)
    Thank you for your tip on the MS SS2005 JDBC drivers - we will look into your other options - IMHO, JTDS, etc.
    -O
