Errors in OLAP storage engine

I am running BPC MS 7.5, tried to do a full optimization, and got the error below.
Unfortunately it did not appear to put the data back to its original state and I had to restore from a backup!
Can anyone tell me how to resolve this error? I am not a techie, more an application man, so an answer in plain English would help me.
==============[System Error Tracing]==============
[System  Name] : OLAPServerHandler
[Message Type] : ErrorMessage
[Job Name]     : Olap9Manager : CubeProcess
[DateTime]     : 6/22/2011 11:45:32 AM
[UserId]       :
[Exception]
    DetailMsg  : {Microsoft.AnalysisServices.OperationException: Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'dbo_tblFactConsolidation', Column: 'ACCOUNT', Value: 'CTLICBFR010'. The attribute is 'Account_ID1'.
Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation.
Errors in the OLAP storage engine: An error occurred while processing the 'Consolidation' partition of the 'Consolidation' measure group for the 'Consolidation' cube from the EBIQUITYTRAIN database.
Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation.
Internal error: The operation terminated unsuccessfully.
Server: The operation has been cancelled.
Thanks

Hi,
Usually, this kind of error message comes from invalid records in your fact tables.
Check out SAP Note 1098683.
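If you want to check for such invalid records yourself before applying the note, a query along these lines finds them. This is only a sketch: the fact table and ACCOUNT column are taken from the error message above, while mbrAccount is an assumed name for the Account dimension table (following the usual BPC MS mbr<Dimension> convention).

-- Find fact rows whose ACCOUNT member (e.g. 'CTLICBFR010') has no matching
-- base member (CALC = 'N') in the assumed mbrAccount dimension table.
SELECT *
FROM dbo.tblFactConsolidation
WHERE ACCOUNT NOT IN (SELECT ID FROM mbrAccount WHERE CALC = 'N')

Any rows this returns are the invalid records the note talks about.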
Hope this will help you.
Kind Regards,
Patrick

Similar Messages

  • Errors in OLAP storage engine when processing application

    Hi
After changing a script logic in the application, we processed the application, but processing did not complete successfully due to the following error:
    Error message:: CreateOLAPCubeForApplication:CreateCube:Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'dbo_tblFactQuality', Column: 'QACCOUNT', Value: '%ACCOUNTS%'. The attribute is 'QAccount_ID1'.
    When trying other applications they showed errors with reference to the measure group, e.g.:
    Error message:: Errors in the metadata manager. No dimension relationships exist within the 'Ownership' measure group.
    Thanks for your help.
    Melanie

    Hi,
This kind of error comes up when you have an invalid member in your fact table. You can use the following SQL query template to check it:
SELECT * FROM tblFact<YourApplication> WHERE <Dimension> NOT IN (SELECT ID FROM mbr<Dimension> WHERE CALC = 'N')
The same query needs to be run against the write-back table (tblFactWB<YourApplication>) and the fac2 table (tblFac2<YourApplication>).
These selections should return 0 records.
If a query does return something, you have to delete those records (replace SELECT * with DELETE).
You can run this for all the applications and all the dimensions; in your case, I believe you should check the Quality application and the QACCOUNT dimension, as shown below.
The error message tells you for which dimension there is an invalid member. This member might be getting created through one of your script logics.
    Hope this helps.
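For example, instantiated for the Quality application and QACCOUNT dimension from the error above (mbrQAccount is the assumed dimension table name, per the mbr<Dimension> convention):

-- Check the main fact table; repeat for tblFactWBQuality and tblFac2Quality.
SELECT *
FROM tblFactQuality
WHERE QACCOUNT NOT IN (SELECT ID FROM mbrQAccount WHERE CALC = 'N')

-- Only if the check returns rows, remove them:
DELETE FROM tblFactQuality
WHERE QACCOUNT NOT IN (SELECT ID FROM mbrQAccount WHERE CALC = 'N')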

  • Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: performance_fact

    Hi
We are using SSAS 2008 R2 and have a cube on our data warehouse, which has been running fine for quite a while now.
However it failed yesterday with the error message:
Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'Performance_Fact', Column: 'accountexternalId', Value: '9474'. The attribute is 'External Id'.
Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation.
An error occurred while processing the '2011' partition of the 'Fact' measure group for the 'DB' cube from the DB database.
Please can someone help me resolve the error? I don't understand what it means.
    Thanks

    Hello,
The error means that you have a key in the fact data which doesn't exist in the related dimension. Ensure that the dimension is also processed, so that it contains all the keys used in the fact table.
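A minimal check you could run against the relational source, assuming the table and column names from the error message and a hypothetical DimAccount dimension table:

-- List fact keys that have no matching row in the dimension.
SELECT DISTINCT f.accountexternalId
FROM Performance_Fact AS f
LEFT JOIN DimAccount AS d
    ON d.ExternalId = f.accountexternalId
WHERE d.ExternalId IS NULL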
    Olaf Helper

  • Errors in the OLAP storage engine: An error occurred while processing the indexes for the partition of the measure group of the cube from the database.

    Errors in the OLAP storage engine: An error occurred while processing the indexes for the partition of the measure group of the cube from the database.
I dropped and recreated a fact table to refresh the data, because the table has identity columns and I could not insert the data directly. Since then the cube has been throwing the above error.
Please suggest.

    Hi Md,
It is hard to analyse the issue based on the limited information. Are there any error messages in the log? Under C:\Program Files\Microsoft SQL Server\MSAS11.\OLAP\Log (this could be different for your server), there are 3 log files that are generated:
msmdsrv.log
FlightRecorderCurrent.trc
FlightRecorderBack.trc
The msmdsrv.log simply points to the other two logs. There might be detailed information about this error in them. Please provide us with that detail so that we can analyse further.
    Regards,
    Charlie Liao
    TechNet Community Support

  • The attribute is 'Date Key'. Errors in the OLAP storage engine:

I have a datetime column 'LastUpdatedDateTime' in my source table, in the format '2012-08-15 14:58:42.467', and I have converted it to an integer in my fact table using a derived column in SSIS: (YEAR(LastUpdatedDateTime) * 10000) + (MONTH(LastUpdatedDateTime) * 100) + DAY(LastUpdatedDateTime). Now when I process it in SSAS I get an error:
Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'Fact table', Column: 'LastUpdatedDateKey', Value: '20120815'.
    Ebenezer

    Hi Ebenezer,
This error occurs when the value '20120815' exists in the fact table but is missing from the dimension, probably the Date dimension in your case.
Have you converted the date key in the Date dimension to the same format?
If so, please check whether the value '20120815' is actually present in the Date dimension table.
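A sketch of that check, assuming hypothetical dbo.FactTable and dbo.DimDate names; the key is the derived-column formula from the question, (YEAR * 10000) + (MONTH * 100) + DAY:

-- List derived date keys in the fact table that are missing from the
-- date dimension (the Value from the error, e.g. 20120815, should show up).
SELECT DISTINCT f.LastUpdatedDateKey
FROM dbo.FactTable AS f
LEFT JOIN dbo.DimDate AS d
    ON d.DateKey = f.LastUpdatedDateKey
WHERE d.DateKey IS NULL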
    Saurabh Kamath

  • Errors in the OLAP storage engine: An error occurred while processing the 'vFactUTC_201304_INCR_UPDATE_TEMP_qtfcw_' partition of the 'vFact' mea

    <Error
    ErrorCode="3240034318"
    Description="Errors in the OLAP storage engine: An error occurred while processing the 'vFactUTC_201304_INCR_UPDATE_TEMP_qtfcw_' partition of the 'vFact' measure
    group for the 'vAMGenericUTC' cube from the AC_OLAP database."
    Source="Microsoft SQL Server 2008 Analysis Services"
    HelpFile="" />
Any idea about this error?
The detailed error is below:
    <return
    xmlns="urn:schemas-microsoft-com:xml-analysis">
      <results
    xmlns="http://schemas.microsoft.com/analysisservices/2003/xmla-multipleresults">
        <root
    xmlns="urn:schemas-microsoft-com:xml-analysis:empty">
          <Exception
    xmlns="urn:schemas-microsoft-com:xml-analysis:exception"
    />
          <Messages
    xmlns="urn:schemas-microsoft-com:xml-analysis:exception">
            <Error
    ErrorCode="3238395904"
    Description="OLE DB error: OLE DB or ODBC error: Cannot execute the query
    &quot;SELECT
    &quot;Tbl1002&quot;.&quot;idUTCDateTime&quot;
    &quot;Col1145&quot;,&quot;Tbl1002&quot;.&quot;idLocalDateTime&quot;
    &quot;Col1146&quot;,&quot;Tbl1002&quot;.&quot;idDateKey&quot;
    &quot;Col1147&quot;,&quot;Tbl1002&quot;.&quot;idUTCDateKey&quot;
    &quot;Col1148&quot;,&quot;Tbl1002&quot;.&quot;idMinuteKey&quot;
    &quot;Col1149&quot;,&quot;Tbl1002&quot;.&quot;idUTCMinuteKey&quot;
    &quot;Col1150&quot;,&quot;Tbl1002&quot;.&quot;idInstance&quot;
    &quot;Col1151&quot;,&quot;Tbl1002&quot;.&quot;idMetricKey&quot;
    &quot;Col1152&quot;,&quot;Tbl1002&quot;.&quot;idKSKey&quot;
    &quot;Col1153&quot;,&quot;Tbl1002&quot;.&quot;idMaintenanceKey&quot;
    &quot;Col1154&quot;,&quot;Tbl1002&quot;.&quot;MetricMin&quot;
    &quot;Col1155&quot;,&quot;Tbl1002&quot;.&quot;MetricMax&quot;
    &quot;Col1156&quot;,&quot;Tbl1002&quot;.&quot;MetricCount&quot;
    &quot;Col1157&quot;,&quot;Tbl1002&quot;.&quot;MetricSum&quot;
    &quot;Col1158&quot;,&quot;Tbl1002&quot;.&quot;MetricLog&quot;
    &quot;Col1159&quot;,&quot;Tbl1002&quot;.&quot;MetricSumOfSquares&quot;
    &quot;Col1160&quot;,&quot;Tbl1002&quot;.&quot;MetricSumOfLogSquares&quot;
    &quot;Col1161&quot;,&quot;Tbl1002&quot;.&quot;idTextData&quot;
    &quot;Col1162&quot;,&quot;Tbl1002&quot;.&quot;idLoad&quot;
    &quot;Col1143&quot;,CONVERT(smalldatetime,&quot;Tbl1002&quot;.&quot;idDateKey&quot;,0)
    &quot;Expr1004&quot;,CONVERT(smallint,&quot;Tbl1002&quot;.&quot;idMinuteKey&quot;,0)
    &quot;Expr1006&quot;,(1)
    &quot;Expr1008&quot; FROM
    &quot;DM_VSQL51_SOPM33_QDB&quot;.&quot;dbo&quot;.&quot;vData&quot;
    &quot;Tbl1002&quot; WHERE
    &quot;Tbl1002&quot;.&quot;idLocalDateTime&quot;&gt;=(1362096000)
    AND &quot;Tbl1002&quot;.&quot;idLocalDateTime&quot;&lt;=(1364774399)
    AND &quot;Tbl1002&quot;.&quot;idLoad&quot;&gt;=(9243)
    AND &quot;Tbl1002&quot;.&quot;idLoad&quot;&lt;=(9304)&quot;
    against OLE DB provider &quot;SQLNCLI10&quot; for linked server
    &quot;DM_DS_SOPM33_QDB&quot;. ; 42000; The OLE DB provider
    &quot;SQLNCLI10&quot; for linked server
    &quot;DM_DS_SOPM33_QDB&quot; reported an error. Execution terminated by the provider because a resource limit was reached.;
    42000." Source="Microsoft SQL Server 2008 Analysis Services"
    HelpFile="" />
            <Error
    ErrorCode="3240034318"
    Description="Errors in the OLAP storage engine: An error occurred while processing the 'vFact_201303_INCR_UPDATE_TEMP_he3dx_' partition of the 'vFact' measure
    group for the 'vAMGeneric' cube from the AC_OLAP database."
    Source="Microsoft SQL Server 2008 Analysis Services"
    HelpFile="" />
            <Error
    ErrorCode="3238002695"
    Description="Internal error: The operation terminated unsuccessfully."
    Source="Microsoft SQL Server 2008 Analysis Services"
    HelpFile="" />
            <Error
    ErrorCode="3238002695"
    Description="Internal error: The operation terminated unsuccessfully."
    Source="Microsoft SQL Server 2008 Analysis Services"
    HelpFile="" />
            <Error
    ErrorCode="3239837698"
    Description="Server: The operation has been cancelled."
    Source="Microsoft SQL Server 2008 Analysis Services"
    HelpFile="" />
            <Error
    ErrorCode="3238395904"
    Description="OLE DB error: OLE DB or ODBC error: Operation canceled; HY008."
    Source="Microsoft SQL Server 2008 Analysis Services"
    HelpFile="" />
            <Error
    ErrorCode="3240034318"
    Description="Errors in the OLAP storage engine: An error occurred while processing the 'vFact_201304_INCR_UPDATE_TEMP_kq5tw_' partition of the 'vFact' measure
    group for the 'vAMGeneric' cube from the AC_OLAP database."
    Source="Microsoft SQL Server 2008 Analysis Services"
    HelpFile="" />
            <Error
    ErrorCode="3238002695"
    Description="Internal error: The operation terminated unsuccessfully."
    Source="Microsoft SQL Server 2008 Analysis Services"
    HelpFile="" />
            <Error
    ErrorCode="3238002695"
    Description="Internal error: The operation terminated unsuccessfully."
    Source="Microsoft SQL Server 2008 Analysis Services"
    HelpFile="" />
            <Error
    ErrorCode="3238395904"
    Description="OLE DB error: OLE DB or ODBC error: Operation canceled; HY008."
    Source="Microsoft SQL Server 2008 Analysis Services"
    HelpFile="" />
            <Error
    ErrorCode="3240034318"
    Description="Errors in the OLAP storage engine: An error occurred while processing the 'vFactUTC_201304_INCR_UPDATE_TEMP_qtfcw_' partition of the 'vFactUTC'
    measure group for the 'vAMGenericUTC' cube from the AC_OLAP database."
    Source="Microsoft SQL Server 2008 Analysis Services"
    HelpFile="" />
            <Error
    ErrorCode="3238002695"
    Description="Internal error: The operation terminated unsuccessfully."
    Source="Microsoft SQL Server 2008 Analysis Services"
    HelpFile="" />
            <Error
    ErrorCode="3238002695"
    Description="Internal error: The operation terminated unsuccessfully."
    Source="Microsoft SQL Server 2008 Analysis Services"
    HelpFile="" />
          </Messages>
        </root>
      </results>
    </return>

    Hi Born,
Have you checked your disk space? This issue might be caused by not enough disk space. Here is a similar thread on this topic:
    http://social.technet.microsoft.com/Forums/en-US/sqlanalysisservices/thread/1783c640-5d4d-4086-bbdc-05adc45e3816/
    Regards,
    Elvis Long
    TechNet Community Support

  • Errors in the OLAP storage engine: The attribute key cannot be found when processing

    this is the absolute worst error message in all of computing.  I despise it.  Here is my situation.
    SSAS 2008 R2.
I have one dimension. I have not even built my cube yet, only a dimension. I am trying to process it. I can process it when I have only a single attribute, the key (it is a composite key). When I add a new attribute (integer), I get the error message. There are no null values. There are no blanks, as it's an integer.
The attribute key cannot be found where? I'm processing the dimension, you idiot. There is not even a cube yet in which any key could be found or not.

    Hi Baracus,
    According to your description, you get the error "Errors in the OLAP storage engine: The attribute key cannot be found when processing" when processing your cube, right?
Generally, the detailed error message looks like this:
    Table: 'dbo_FactSales', Column: 'ProductID', Value: '1111'. The attribute is 'Product ID'
The above error says that the fact table named "FactSales" contains a ProductID column with the value "1111", but the same ProductID is not present in your dimension table. There is a primary key/foreign key relationship between the ProductID column of the dimension table and the fact table named "FactSales", and the cube is unable to find ProductID 1111 in the dimension table.
What we need to do is check whether both your dimension and fact tables contain the value mentioned in the error message (Value: '1111' in the above example). Here are some links about troubleshooting this issue, please see:
    http://www.businessintelligence-solutions.com/ssas-typical-error-attribute-key-processing/
    http://www.youtube.com/watch?v=5O7IAjvtAF4
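A minimal sketch of that check, reusing the hypothetical FactSales/ProductID names from the example above (DimProduct is an assumed dimension table name):

-- List fact rows whose ProductID has no matching row in the product dimension.
SELECT f.ProductID
FROM dbo.FactSales AS f
LEFT JOIN dbo.DimProduct AS d
    ON d.ProductID = f.ProductID
WHERE d.ProductID IS NULL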
If this is not what you want, please provide us with more information about your issue, so that we can analyse further.
    Regards,
    Charlie Liao
    TechNet Community Support

  • Errors in the OLAP storage engine

A job is scheduled to process a cube. The job failed with the errors below:
    Description: Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Dim ID', Name of 'Dim ID' was being processed.  
    End Error  Error: 2015-01-02 02:17:52.73     
    Code: 0xC11F000D     
    Source: Process Dimensions Analysis Services Execute DDL Task     
    Description: Errors in the OLAP storage engine: An error occurred while the 'HID' attribute of the 'Dim ID' dimension from the database was being processed.  
    End Error  Error: 2015-01-02 02:17:52.75     
    Code: 0xC1060000     
    Source: Process Dimensions Analysis Services Execute DDL Task     
    Description: OLE DB error: OLE DB or ODBC error: Operation canceled; HY008.  
    End Error  Error: 2015-01-02 02:17:52.76     
    Code: 0xC11F000C     
    Source: Process Dimensions Analysis Services Execute DDL Task     
    Description: Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Dim Nme', Name of 'Dim Nme' was being processed.  
    End Error  Error: 2015-01-02 02:17:52.76     
    Code: 0xC11F000D     
    Source: Process Dimensions Analysis Services Execute DDL Task    
    Description: Errors in the OLAP storage engine: An error occurred while the 'NME' attribute of the 'Dim Nme' dimension from the database was being processed.  
    End Error  Error: 2015-01-02 02:17:52.78     
    Code: 0xC1060000     
    Source: Process Dimensions Analysis Services Execute DDL Task     
    Description: OLE DB error: OLE DB or ODBC error: Operation canceled; HY008.  
    End Error  Error: 2015-01-02 02:17:52.78     
    Code: 0xC11F000C     
    Source: Process Dimensions Analysis Services Execute DDL Task     
    Description: Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Dim Ky', Name of 'Dim Ky' was being processed.  
    End Error  Error: 2015-01-02 02:17:52.79     
    Code: 0xC11F000D     
    Source: Process Dimensions Analysis Services Execute DDL Task     Description: Errors in the OLAP storage engine: An error occurred while the 'KY' attribute of the 'Dim Ky' dimension from the database was being processed.  
    End Error  Error: 2015-01-02 02:17:52.81     
    Code: 0xC11F000D     
    Source: Process Dimensions Analysis Services Execute DDL Task     
    Description: Errors in the OLAP storage engine: An error occurred while the 'Id' attribute of the 'Dim Txt' dimension from the database was being processed.  
    End Error  Error: 2015-01-02 02:17:52.81     
    Code: 0xC11F000D     
    Source: Process Dimensions Analysis Services Execute DDL Task     
    Description: Errors in the OLAP storage engine: An error occurred while the 'Id' attribute of the 'VWIprve' dimension from the database was being processed.  
    End Error  DTExec: The package execution returned DTSER_FAILURE (1).  
    Started:  1:00:22 AM  
    Finished: 2:18:02 AM  
    Elapsed:  4660.47 seconds.  The package execution failed.  The step failed.
    Please help!!!!!!

    Hi,
According to your description, you created an SSIS package to process an SSAS cube, and the problem is that you get the error "Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Dim Nme', Name of 'Dim Nme' was being processed" while executing the job, right?
Based on my research, the issue can be caused by a missing primary key on your dimension table. Here is a blog that describes this issue; please see the link and the sketch below it:
    http://blogs.microsoft.co.il/barbaro/2014/03/03/errors-in-the-olap-storage-engine-processing-a-large-dimension/
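As a hedged sketch with hypothetical table and column names, the fix amounts to something like:

-- Give the dimension table a primary key so the lookups SSAS issues during
-- processing use an index instead of scanning the whole table.
ALTER TABLE dbo.DimCustomer
ADD CONSTRAINT PK_DimCustomer PRIMARY KEY CLUSTERED (CustomerKey)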
    Regards
    Charlie Liao
    TechNet Community Support

  • Errors in the OLAP storage engine: A duplicate attribute key has been found when processing

    Hi dear MSDN Community,
    I am facing a problem while processing a cube with a customer hierarchy as follows:
    Global Account --> Main Customer --> Master Customer --> Customer
The data comes from a flattened parent-child table; that is, I create an extra column for every level of the hierarchy in the customer view. If a level is empty, the value is filled with the previous level's value. Then I can use the property:
HideMemberIf = OnlyChildWithParentName for the intermediate levels (Main and Master Customer)
HideMemberIf = ParentName for the leaves (Customer)
HideMemberIf = never for the root (Global Account)
Consider this example (screenshot not preserved in the archive): for the root level I am using the highlighted fields as a composite key in order to avoid duplicates. However, I am getting the error message "Errors in the OLAP storage engine: A duplicate attribute key has been found when processing" while processing.
    I analyzed the query that SSAS issues to the server (select distinct ....) and I think it should work but it is still failing.
    I had similar problems with the intermediate levels but I was able to solve it using a similar procedure.
    Any help will be appreciated.
    Kind Regards.

    When are you having this error? While processing the dimension or during cube processing?
    http://blog.oraylis.de/2013/08/a-duplicate-attribute-key-has-been-found-during-processing-revisited/
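One thing worth checking either way: SSAS builds each attribute with a SELECT DISTINCT over the key and name columns (as you saw in the query it issues), so a single key value that maps to more than one name produces exactly this duplicate-key error. A hedged sketch of that check, with vCustomer and the column names as assumptions:

-- Find keys whose name is not unique across the flattened view.
SELECT GlobalAccountKey, COUNT(DISTINCT GlobalAccountName) AS Names
FROM dbo.vCustomer
GROUP BY GlobalAccountKey
HAVING COUNT(DISTINCT GlobalAccountName) > 1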
    Visakh

  • Storage Engine Error in Render Plus QPAC!!

    Hi,
I am getting an error (pasted below) when using the Render Plus QPAC. I have set only 2 properties in it: tagged pdf = 1 and output form = PDF Form. I couldn't resolve this problem. The weird thing is that it occurs in only one workflow (the main workflow of my project); everywhere else it works fine!
Please help me.
    java.sql.SQLException: General error message from server: "Got error 139 from storage engine"
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:1997)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1167)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1278)
    at com.mysql.jdbc.Connection.execSQL(Connection.java:2247)
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:1772)
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:1619)
at org.jboss.resource.adapter.jdbc.CachedPreparedStatement.executeUpdate(CachedPreparedStatement.java:56)
at org.jboss.resource.adapter.jdbc.WrappedPreparedStatement.executeUpdate(WrappedPreparedStatement.java:335)
at com.adobe.pof.adapter.JDBCAdapter.updateObject(JDBCAdapter.java:519)
at com.adobe.pof.adapter.JDBCAdapter.updateObject(JDBCAdapter.java:442)
at com.adobe.pof.omapi.POFObjectManagerImpl.writeObject(POFObjectManagerImpl.java:254)
at com.adobe.pof.omapi.POFObjectManagerRemoteBean.writeObject(POFObjectManagerRemoteBean.java:274)
at sun.reflect.GeneratedMethodAccessor274.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:324)
at org.jboss.ejb.StatelessSessionContainer$ContainerInterceptor.invoke(StatelessSessionContainer.java:683)
at org.jboss.resource.connectionmanager.CachedConnectionInterceptor.invoke(CachedConnectionInterceptor.java:185)
at org.jboss.ejb.plugins.StatelessSessionInstanceInterceptor.invoke(StatelessSessionInstanceInterceptor.java:72)
at org.jboss.ejb.plugins.AbstractTxInterceptor.invokeNext(AbstractTxInterceptor.java:84)
at org.jboss.ejb.plugins.TxInterceptorCMT.runWithTransactions(TxInterceptorCMT.java:315)
at org.jboss.ejb.plugins.TxInterceptorCMT.invoke(TxInterceptorCMT.java:148)
at org.jboss.ejb.plugins.SecurityInterceptor.invoke(SecurityInterceptor.java:120)
at org.jboss.ejb.plugins.LogInterceptor.invoke(LogInterceptor.java:191)
at org.jboss.ejb.plugins.ProxyFactoryFinderInterceptor.invoke(ProxyFactoryFinderInterceptor.java:122)
at org.jboss.ejb.StatelessSessionContainer.internalInvoke(StatelessSessionContainer.java:331)
at org.jboss.ejb.Container.invoke(Container.java:723)
at org.jboss.ejb.plugins.local.BaseLocalProxyFactory.invoke(BaseLocalProxyFactory.java:359)
at org.jboss.ejb.plugins.local.StatelessSessionProxy.invoke(StatelessSessionProxy.java:83)
at $Proxy242.writeObject(Unknown Source)
at com.adobe.pof.omapi.POFObjectManagerLocalEJBAdapter.writeObject(POFObjectManagerLocalEJBAdapter.java:155)
at com.adobe.workflow.datatype.POFVariableContainer.write(POFVariableContainer.java:114)
at com.adobe.workflow.dom.InstanceDocument.persistNode(InstanceDocument.java:163)
at com.adobe.workflow.engine.ProcessEngineBMTBean.continueBranchAtAction(ProcessEngineBMTBean.java:2379)
at com.adobe.workflow.engine.ProcessEngineBMTBean.asyncContinueBranchCommand(ProcessEngineBMTBean.java:1954)
at sun.reflect.GeneratedMethodAccessor387.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:324)
at org.jboss.ejb.StatelessSessionContainer$ContainerInterceptor.invoke(StatelessSessionContainer.java:683)
at org.jboss.resource.connectionmanager.CachedConnectionInterceptor.invoke(CachedConnectionInterceptor.java:185)
at org.jboss.ejb.plugins.AbstractTxInterceptor.invokeNext(AbstractTxInterceptor.java:84)
at org.jboss.ejb.plugins.AbstractTxInterceptorBMT.invokeNext(AbstractTxInterceptorBMT.java:144)
    at

Yes Howard, I had a look at it. But is it going to be the same for any workflow that has more process variables? For the workflow I am doing, we need more process variables; to be exact, I have 10 form variables, 2 document, 3 int, and others, totalling below 20.
Now that I understand the cause of the error, I want to rectify it without changing anything in the database, as we have not been given rights to write to the database. Is there any way of doing that? If yes, please suggest one.
Thank you,
Raghava kumar V.S.S.

  • Error during Olap metadata retrieval

    Hi all,
We are getting the following error while trying to establish a connection to the OLAP Catalog to retrieve OLAP metadata information for BI Beans:
"Successfully connected to the Oracle database but failed to retrieve metadata. Although the database connection succeeded, metadata in the OLAP Catalog is not valid and could not be retrieved. In order to create OLAP queries against the database, you must ensure that the OLAP Catalog metadata is valid."
    On running the BI Beans configuration diagnostic utility we get the following error
    <?xml version="1.0" encoding="UTF-8" ?>
    <BICheckConfig version="1.0.0.0">
    <Check key="JDEV_ORACLE_HOME" value="D:\JDev_57_Oct14Class"/>
    <Check key="JAVA_HOME" value="D:\jdk1.3.1_04"/>
    <Check key="JDeveloper version" value="9.0.3.998"/>
    <Check key="BI Beans version" value="9.0.3.4.0"/>
    <Check key="BI Beans internal version" value="2.7.0.11.3"/>
    <Check key="host" value="ap950sun.us.oracle.com"/>
    <Check key="port" value="1521"/>
    <Check key="sid" value="HCXD2BI"/>
    <Check key="user" value="hbi"/>
    <Check key="Connecting to the database" value="Successful"/>
    <Check key="JDBC driver version" value="9.2.0.1.0"/>
    <Check key="JDBC JAR file location" value="D:\JDev_57_Oct14Class\jdbc\lib"/>
    <Check key="Database version" value="9.2.0.2.0"/>
    <Check key="OLAP Catalog version" value=""/>
    <Check key="OLAP AW Engine version" value=""/>
    <Check key="OLAP API Server version" value=""/>
    <Check key="BI Beans Catalog version" value="N/A; not installed in hbi"/>
    <Check key="OLAP API JAR file version" value="9.2"/>
    <Check key="OLAP API JAR file location" value="D:\JDev_57_Oct14Class\jdev\lib\ext"/>
    <Check key="OLAP API Metadata Load" value="Unsuccessful"/>
    <Check key="Failure Point" value="MDMLoadFailed"/>
    <Check key="StackTrace">
    <![CDATA[
    ============================================================================
    1) An error occurred during olap API metadata retrieval. This is probably caused by inconsistent metadata.
    1) An error occurred during olap API metadata retrieval. This is probably caused by inconsistent metadata.
    ============================================================================
    oracle.express.idl.util.OlapiException
         at oracle.express.idl.ExpressConnectionModule.ConnectionInterfaceStub.getDefaultDatabase(ConnectionInterfaceStub.java:1465)
         at oracle.express.mdm.MdmMetadataProvider.<init>(MdmMetadataProvider.java:200)
         at oracle.express.mdm.MdmMetadataProvider.<init>(MdmMetadataProvider.java:187)
         at oracle.express.olapi.data.full.ExpressDataProvider.getDefaultMetadataProvider(ExpressDataProvider.java:549)
         at oracle.dss.metadataManager.server.drivers.mdm._92.MDMMetadataDriverImpl_92.getMdmMetadataProvider(MDMMetadataDriverImpl_92.java:1134)
         at oracle.dss.metadataManager.server.drivers.mdm._92.MDMMetadataDriverImpl_92.attach(MDMMetadataDriverImpl_92.java:811)
         at oracle.dss.metadataManager.server.drivers.mdm.MDMMetadataDriverImpl.attach(MDMMetadataDriverImpl.java:133)
         at oracle.dss.metadataManager.server.MetadataManagerImpl.buildObjectModel(MetadataManagerImpl.java:1085)
         at oracle.dss.metadataManager.server.MetadataManagerImpl.attach(MetadataManagerImpl.java:962)
         at oracle.dss.metadataManager.client.MetadataManager.attach(MetadataManager.java:866)
         at oracle.dss.metadataManager.client.MetadataManager.attach(MetadataManager.java:792)
         at BICheckConfig.checkConnection(BICheckConfig.java:250)
         at BICheckConfig.main(BICheckConfig.java:1172)
    ]]>
    </Check>
    </BICheckConfig>
    We are getting the same error for 2 different users.
We are using JDeveloper 9.0.3 with BI Beans 9.0.3 and a 9.2.0.2 database.
Could someone help us with this?
    Thanks in advance.
    Nigel

    Hi,
The question here is whether the whole catalog is corrupt or just one schema. To determine the status of the catalog I would try the following:
1) Using OEM, remove all the objects you created.
2) I presume you created your database using the Database Configuration Assistant? You should have used the warehouse template.
3) Make sure the following accounts are unlocked and not expired: SH, OLAPSYS.
4) Make sure the password for the SH schema is SH.
5) Make sure the password for the OLAPSYS account is manager.
6) Install the BIBDEMO schema that is shipped with BI Beans, in jdev_home/bibeans/bibdemo_schema. The installation process will remove the SH schema from the OLAP Catalog.
7) Once this is installed, use JDeveloper to see if you can create a crosstab or graph.
8) If the BIBDEMO schema works, try creating your new schemas one at a time.
9) Make sure that if you define a dimension as type time, it has END_DATE (column type DATE) and TIME_SPAN (column type NUMBER) defined; see the sketch after this list. Otherwise, don't define the dimension as type time.
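A minimal sketch of a time dimension table satisfying point 9, with assumed names:

-- Hypothetical time dimension: END_DATE and TIME_SPAN are required before
-- the dimension can be mapped as type time in the OLAP Catalog.
CREATE TABLE time_dim (
  month_id   VARCHAR2(8)  NOT NULL PRIMARY KEY,
  month_desc VARCHAR2(30),
  end_date   DATE         NOT NULL,  -- last day of the period
  time_span  NUMBER       NOT NULL   -- length of the period in days
)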
    Hope this helps
    Keith Laker
    Product Manager
    Oracle Business Intelligence Beans

  • Can't get the INNODB storage engine to start with Leopard MySQL server...

As shipped by Apple. The /etc/my.cnf file says to just uncomment the stuff related to InnoDB and all will be well. I did that and restarted the MySQL server, but when I take a look at the available storage engines using phpMyAdmin, InnoDB is turned off...
    The following SQL runs:
    CREATE TABLE `Puzzles` (
    `id` int(10) unsigned NOT NULL auto_increment,
    `initialState` char(81) NOT NULL default '',
    `solution` char(81) NOT NULL default '',
    `levelOfDifficulty` int(11) NOT NULL default '5',
    `numberOfClues` smallint(5) unsigned NOT NULL default '0',
    PRIMARY KEY (`id`),
    UNIQUE KEY `initialState` (`initialState`),
    KEY `levelOfDifficulty` (`levelOfDifficulty`),
    KEY `numberOfClues` (`numberOfClues`)
    ) ENGINE=InnoDB DEFAULT CHARSET=latin1 AUTO_INCREMENT=780 ;
But when I look at the table, it's a MyISAM table.
Here's the InnoDB part of the /etc/my.cnf file:
    # Uncomment the following if you are using InnoDB tables
innodb_data_home_dir = /var/mysql/
innodb_data_file_path = ibdata1:2000M;ibdata2:10M:autoextend
innodb_log_group_home_dir = /var/mysql/
innodb_log_arch_dir = /var/mysql/
# You can set .._buffer_pool_size up to 50 - 80 %
# of RAM but beware of setting memory usage too high
innodb_buffer_pool_size = 384M
innodb_additional_mem_pool_size = 20M
# Set .._log_file_size to 25 % of buffer pool size
innodb_log_file_size = 100M
innodb_log_buffer_size = 8M
innodb_flush_log_at_trx_commit = 1
innodb_lock_wait_timeout = 50
    No errors in the logs visible through Server Admin. Did Apple put the MySQL configuration files somewhere non-standard and leave the /etc/my.cnf file around to fool us?
Any pointers would be helpful, or should I just build the Fink package and give up on the shipped Leopard package? (So far I'm not too impressed with Leopard.)
    Best,
    Dick Munroe

The problem turned out to be the sizes of the InnoDB log and data files. Apparently, the initial startup of MySQL causes these files to be built. The sizes you uncomment in my.cnf aren't related in any way to the sizes of the files that already exist, and the InnoDB storage engine won't start unless they match. Stop the server, delete the files, and restart the server, and you're off and running. The logs on Leopard Server have NO indication that this has occurred. I found the problem by installing the Mac OS X kit from mysql.com and looking in those logs. The interesting question is why that information is missing from the log on Leopard Server when the information is clearly available (as shown by running the MySQL server).
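A quick way to verify the engine actually came up after the restart, using standard MySQL statements:

-- InnoDB should be listed with Support = YES (or DEFAULT) once it starts cleanly.
SHOW ENGINES;
-- Confirm the settings from my.cnf were picked up.
SHOW VARIABLES LIKE 'innodb%';
-- The Engine column should now say InnoDB for the recreated table.
SHOW TABLE STATUS LIKE 'Puzzles';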
    Best,
    Dick Munroe

  • EJB + Mysql Storage engine choice

I'm developing an EJB 3 application using EclipseLink with MySQL and need to choose a suitable storage engine for MySQL.
I am considering MyISAM and InnoDB. As far as I can tell, the main difference is the transactional features of InnoDB.
    My understanding is that with EJB the transactions are managed by the EJB container.
    Is there any advantage in choosing InnoDB over MyISAM when working with EJB?
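For reference: the engine is chosen per table in MySQL, so the choice need not be global, and even with container-managed transactions the rollback only actually happens if the underlying engine supports transactions. A hedged sketch (Orders is a hypothetical table name):

-- Declare the engine explicitly when creating a table...
CREATE TABLE Orders (id INT PRIMARY KEY) ENGINE=InnoDB;
-- ...or convert an existing one.
ALTER TABLE Orders ENGINE=InnoDB;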


  • "error: thread-local storage not supported for this target"

I have a program that uses the __thread specifier, to be run on Solaris 9/UltraSparc.
I am not able to compile it with gcc 3.4.4 or 4.0.4; both emit the message "error: thread-local storage not supported for this target".
    xz@gamera% gcc -v -Wall -D_REENTRANT -c -o func_stack.o func_stack.c
    Reading specs from /opt/gcc/3.4.4/lib/gcc/sparc-sun-solaris2.8/3.4.4/specs
    Configured with: ../srcdir/configure --prefix=/opt/gcc/3.4.4 --disable-nls
    Thread model: posix
    gcc version 3.4.4
    /opt/gcc/3.4.4/libexec/gcc/sparc-sun-solaris2.8/3.4.4/cc1 -quiet -v -D_REENTRANT -DMESS func_stack.c -quiet -dumpbase func_stack.c -mcpu=v7 -auxbase-strip func_stack.o -Wall -version -o /var/tmp//cc0poHSN.s
    ignoring nonexistent directory "/usr/local/include"
    ignoring nonexistent directory "/opt/gcc/3.4.4/lib/gcc/sparc-sun-solaris2.8/3.4.4/../../../../sparc-sun-solaris2.8/include"
    #include "..." search starts here:
    #include <...> search starts here:
    /opt/gcc/3.4.4/include
    /opt/gcc/3.4.4/lib/gcc/sparc-sun-solaris2.8/3.4.4/include
    /usr/include
    End of search list.
    GNU C version 3.4.4 (sparc-sun-solaris2.8)
            compiled by GNU C version 3.4.4.
    GGC heuristics: --param ggc-min-expand=100 --param ggc-min-heapsize=131072
    func_stack.c:16: error: thread-local storage not supported for this target
    func_stack.c:17: error: thread-local storage not supported for this target
    func_stack.c:19: error: thread-local storage not supported for this target
    xs@gamera% gcc -v -D_REENTRANT  -c -o func_stack.o func_stack.c
    Using built-in specs.
    Target: sparc-sun-solaris2.9
    Configured with: /net/clpt-v490-0/export/data/bldmstr/20070711_mars_gcc/src/configure --prefix=/usr/sfw --enable-shared --with-system-zlib --enable-checking=release --disable-libmudflap --enable-languages=c,c++ --enable-version-specific-runtime-libs --with-cpu=v9 --with-ld=/usr/ccs/bin/ld --without-gnu-ld
    Thread model: posix
    gcc version 4.0.4 (gccfss)
    /pkg/gcc/4.0.4/bin/../libexec/gcc/sparc-sun-solaris2.9/4.0.4/cc1 -quiet -v -I. -iprefix /pkg/gcc/4.0.4/bin/../lib/gcc/sparc-sun-solaris2.9/4.0.4/ -D__sparcv8 -D_REENTRANT -DMESS func_stack.c -quiet -dumpbase func_stack.c -mcpu=v9 -auxbase-strip func_stack.o -version -m32 -o /tmp/ccjsdswh.s -r /tmp/cc2w4ZRo.ir
    ignoring nonexistent directory "/pkg/gcc/4.0.4/bin/../lib/gcc/sparc-sun-solaris2.9/4.0.4/../../../../sparc-sun-solaris2.9/include"
    ignoring nonexistent directory "/usr/local/include"
    ignoring nonexistent directory "/usr/sfw/lib/gcc/sparc-sun-solaris2.9/4.0.4/include"
    ignoring nonexistent directory "/usr/sfw/lib/../sparc-sun-solaris2.9/include"
    #include "..." search starts here:
    #include <...> search starts here:
    /pkg/gcc/4.0.4/bin/../lib/gcc/sparc-sun-solaris2.9/4.0.4/include
    /usr/sfw/include
    /usr/include
    End of search list.
    GNU C version 4.0.4 (gccfss) (sparc-sun-solaris2.9)
            compiled by GNU C version 4.0.4 (gccfss).
    GGC heuristics: --param ggc-min-expand=100 --param ggc-min-heapsize=131072
    func_stack.c:16: error: thread-local storage not supported for this target
    func_stack.c:17: error: thread-local storage not supported for this target
func_stack.c:19: error: thread-local storage not supported for this target
Just as a comparison, the corresponding output of compiling another file, which does not have a __thread declaration, is as follows:
    xz@gamera% gcc -v -Wall -D_REENTRANT -c -o common.o common.c
    Reading specs from /opt/gcc/3.4.4/lib/gcc/sparc-sun-solaris2.8/3.4.4/specs
    Configured with: ../srcdir/configure --prefix=/opt/gcc/3.4.4 --disable-nls
    Thread model: posix
    gcc version 3.4.4
    /opt/gcc/3.4.4/libexec/gcc/sparc-sun-solaris2.8/3.4.4/cc1 -quiet -v -D_REENTRANT -DMESS common.c -quiet -dumpbase common.c -mcpu=v7 -auxbase-strip common.o -Wall -version -o /var/tmp//cc4VxrLz.s
    ignoring nonexistent directory "/usr/local/include"
    ignoring nonexistent directory "/opt/gcc/3.4.4/lib/gcc/sparc-sun-solaris2.8/3.4.4/../../../../sparc-sun-solaris2.8/include"
    #include "..." search starts here:
    #include <...> search starts here:
    /opt/gcc/3.4.4/include
    /opt/gcc/3.4.4/lib/gcc/sparc-sun-solaris2.8/3.4.4/include
    /usr/include
    End of search list.
    GNU C version 3.4.4 (sparc-sun-solaris2.8)
            compiled by GNU C version 3.4.4.
    GGC heuristics: --param ggc-min-expand=100 --param ggc-min-heapsize=131072
    /usr/ccs/bin/as -V -Qy -s -xarch=v8 -o common.o /var/tmp//cc4VxrLz.s
/usr/ccs/bin/as: Sun WorkShop 6 update 2 Compiler Common 6.2 Solaris_9_CBE 2001/04/02
Note that the last 2 lines seem to suggest that a Sun assembler is used as the back end of gcc. I am not sure whether the failure to compile the first file (with __thread) was due to an incompatibility of this Sun assembler. In the first case, the error message was emitted before these 2 lines were printed.
I further read a post about gcc 3.3.3's inability to compile code that has __thread in it on HP-UX 11.11: http://forums12.itrc.hp.com/service/forums/questionanswer.do?admit=109447627+1216595175060+28353475&threadId=1148976 The conclusion seems to be that "the 2.17 GNU assembler did not support thread local storage", and that gcc sees this and thus disallows TLS.
If the assembler is the culprit, then does anyone know whether this "Sun WorkShop 6 update 2" assembler in my installation can work with TLS? And how come a Sun assembler became the back end of gcc? I read that gas (the GNU assembler) is the default back end of gcc. (How) can one specify the assembler to be used by gcc?
    As an aside, I am able to compile my file on this same Solaris 9/UltraSparc platform using the Sun Studio 12 C Compiler:
    xz@gamera% cc -V -# -D_REENTRANT  -c -o func_stack.o func_stack.c
    cc: Sun C 5.9 SunOS_sparc Patch 124867-01 2007/07/12
    ### Note: NLSPATH = /pkg/SUNWspro/12/prod/bin/../lib/locale/%L/LC_MESSAGES/%N.cat:/pkg/SUNWspro/12/prod/bin/../../lib/locale/%L/LC_MESSAGES/%N.cat
    ###     command line files and options (expanded):
    ### -c -D_REENTRANT  -V func_stack.c -o func_stack.o
    /pkg/SUNWspro/12/prod/bin/acomp -xldscope=global -i func_stack.c -y-fbe -y/pkg/SUNWspro/12/prod/bin/fbe -y-xarch=generic -y-xmemalign=8i -y-o -yfunc_stack.o -y-verbose -y-xthreadvar=no%dynamic -y-comdat -xdbggen=no%stabs+dwarf2+usedonly -V -D_REENTRANT  -m32 -fparam_ir -Qy -D__SunOS_5_9 -D__SUNPRO_C=0x590 -D__SVR4 -D__sun -D__SunOS -D__unix -D__sparc -D__BUILTIN_VA_ARG_INCR -D__C99FEATURES__ -Xa -D__PRAGMA_REDEFINE_EXTNAME -Dunix -Dsun -Dsparc -D__RESTRICT -xc99=%all,no%lib -D__FLT_EVAL_METHOD__=0 -I/pkg/SUNWspro/12/prod/include/cc "-g/pkg/SUNWspro/12/prod/bin/cc -V -D_REENTRANT  -c -o func_stack.o " -fsimple=0 -D__SUN_PREFETCH -destination_ir=yabe
acomp: Sun C 5.9 SunOS_sparc Patch 124867-01 2007/07/12
Interestingly, the output no longer mentions the "/usr/ccs/bin/as: Sun WorkShop 6 update 2" assembler.

Just as another comparison, I compiled a file without __thread on the Solaris 9/UltraSparc platform using gcc 4.0.4. Not surprisingly, it worked. But I no longer see the mention of the Sun assembler as in the gcc 3.4.4 case, nor the mention of the "GNU assembler" as in the gcc 4.0.4/Solaris 10/x86 case. Instead, I saw something called "iropt" and "cg". Does anyone know what they are?
    xz@gamera% gcc -v -Wall -D_REENTRANT -c -o common.o common.c
    Using built-in specs.
    Target: sparc-sun-solaris2.9
    Configured with: /net/clpt-v490-0/export/data/bldmstr/20070711_mars_gcc/src/configure --prefix=/usr/sfw --enable-shared --with-system-zlib --enable-checking=release --disable-libmudflap --enable-languages=c,c++ --enable-version-specific-runtime-libs --with-cpu=v9 --with-ld=/usr/ccs/bin/ld --without-gnu-ld
    Thread model: posix
    gcc version 4.0.4 (gccfss)
    /pkg/gcc/4.0.4/bin/../libexec/gcc/sparc-sun-solaris2.9/4.0.4/cc1 -quiet -v -iprefix /pkg/gcc/4.0.4/bin/../lib/gcc/sparc-sun-solaris2.9/4.0.4/ -D__sparcv8 -D_REENTRANT -DMESS common.c -quiet -dumpbase common.c -mcpu=v9 -auxbase-strip common.o -Wall -version -m32 -o /tmp/ccSGJIDD.s -r /tmp/ccKuJz76.ir
    ignoring nonexistent directory "/pkg/gcc/4.0.4/bin/../lib/gcc/sparc-sun-solaris2.9/4.0.4/../../../../sparc-sun-solaris2.9/include"
    ignoring nonexistent directory "/usr/local/include"
    ignoring nonexistent directory "/usr/sfw/lib/gcc/sparc-sun-solaris2.9/4.0.4/include"
    ignoring nonexistent directory "/usr/sfw/lib/../sparc-sun-solaris2.9/include"
    #include "..." search starts here:
    #include <...> search starts here:
    /pkg/gcc/4.0.4/bin/../lib/gcc/sparc-sun-solaris2.9/4.0.4/include
    /usr/sfw/include
    /usr/include
    End of search list.
    GNU C version 4.0.4 (gccfss) (sparc-sun-solaris2.9)
            compiled by GNU C version 4.0.4 (gccfss).
    GGC heuristics: --param ggc-min-expand=100 --param ggc-min-heapsize=131072
    /pkg/gcc/4.0.4/SUNW0scgfss/4.0.4/prod/bin/iropt -F -xarch=v8plus -xchip=generic -O1 -xvector=no -xbuiltin=%none -xcache=generic -Qy -h_gcc -o /tmp/ccUl4mVM.ircg /tmp/ccKuJz76.ir -N/dev/null -is /tmp/ccSGJIDD.s
    /pkg/gcc/4.0.4/SUNW0scgfss/4.0.4/prod/bin/cg -Qy -xarch=v8plus -xchip=generic -OO0 -T3 -Qiselect-C0 -Qrm:newregman:coalescing=0 -xcode=abs32 -xcache=generic -xmemalign=8i -il /pkg/gcc/4.0.4/bin/../lib/gcc/sparc-sun-solaris2.9/4.0.4//gccbuiltins.il -xvector=no -xthreadvar=no%dynamic -xbuiltin=%none -Qassembler-ounrefsym=0 -Qiselect-T0 -Qassembler-I -Qassembler-U -comdat -h_gcc -is /tmp/ccSGJIDD.s -ir /tmp/ccUl4mVM.ircg -oo common.o

  • Getting error in the adapter engine when sending a message

    Hi,
    I'm always getting this error in the adapter engine: Message processing failed. Cause: com.sap.aii.messaging.util.URI$MalformedURIException: no scheme
The communication channel ends up in error because of the messages failing with the above exception.
    Could someone please help?
    Thanks.
    Mike

Mike,
Please check these links.
Also check your adapter configuration and restart it.
    com.sap.aii.messaging.util.URI$MalformedURIException: invalid port number
    Re: Prerequisites to setup configure mail scenario.
    error in communication channel
    Error :Receiver File Channel not Initialized
