Cube processing failing

I am experiencing a strange issue. One of our cubes processes successfully when I run it from BIDS or Management Studio, but when I process the cube via XMLA it returns the strange errors below. This was working fine earlier.
<return
xmlns="urn:schemas-microsoft-com:xml-analysis">
  <results
xmlns="http://schemas.microsoft.com/analysisservices/2003/xmla-multipleresults">
    <root
xmlns="urn:schemas-microsoft-com:xml-analysis:empty">
      <Exception
xmlns="urn:schemas-microsoft-com:xml-analysis:exception"
/>
      <Messages
xmlns="urn:schemas-microsoft-com:xml-analysis:exception">
        <Error
ErrorCode="3238002695"
Description="Internal error: The operation terminated unsuccessfully."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Warning
WarningCode="1092550657"
Description="Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'DBID_LGCL_DATABASE_SYSTEM_MAP', Column: 'LGCL_DATABASE_KEY',
Value: '671991'. The attribute is 'LGCL DATABASE KEY'."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Warning
WarningCode="2166292482"
Description="Errors in the OLAP storage engine: The attribute key was converted to an unknown member because the attribute key was not found. Attribute LGCL
DATABASE KEY of Dimension: Logical Database from Database: Column_Analytics_QA, Cube: COLUMN_USAGE, Measure Group: LGCL DATABASE SYSTEM MAP, Partition: LGCL DATABASE SYSTEM MAP, Record: 94986."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034310"
Description="Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit
of allowable errors for the operation."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'LGCL DATABASE SYSTEM MAP' partition of the 'LGCL DATABASE SYSTEM MAP'
measure group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034310"
Description="Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit
of allowable errors for the operation."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3239837702"
Description="Server: The current operation was cancelled because another operation in the transaction failed."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_8474' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_8714' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_9102' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034310"
Description="Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit
of allowable errors for the operation."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_8186' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_8282' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_8530' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_9050' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_9002' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_9146' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_8770' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_8642' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_9058' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_8322' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_8658' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'COLUMN USAGE FACT_8410' partition of the 'SYBASE COLUMN USAGE' measure
group for the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034318"
Description="Errors in the OLAP storage engine: An error occurred while processing the 'BRDGE PHYS LGCL' partition of the 'BRDGE PHYS LGCL' measure group for
the 'COLUMN_USAGE' cube from the Column_Analytics_QA database."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
        <Error
ErrorCode="3240034310"
Description="Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit
of allowable errors for the operation."
Source="Microsoft SQL Server 2008 R2 Analysis Services"
HelpFile="" />
Any idea what might be the reason?

Please refer to another question with the same issue here.
Below is my answer from that post:
From my experience, this may happen because data is being loaded into the dimension or fact tables while you are processing the cube, or (in the worst case) the issue is not related to attribute keys at all. That would explain why reprocessing the cube succeeds on the same set of records.
First, identify the processing option used for your SSAS cube.
You can use the SSIS "Analysis Services Processing Task" to process dimensions and facts separately,
or you can process objects in batches (batch processing). Batch processing lets you select the objects to be processed and control the processing order.
Also, a batch can run as a series of stand-alone jobs, or as a transaction in which the failure of one process causes a rollback of the complete batch.
To summarize:
Ensure that you are not loading data into the fact and dimension tables while processing the cube.
Don't write queries that rely on dirty reads.
Remember that when you process a dimension with ProcessFull, the cube moves to an unprocessed state and cannot be queried (ProcessUpdate does not take the cube offline).
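To illustrate the batch option: below is a minimal XMLA sketch that processes the dimension before its measure group inside a single transaction, so a failure rolls the whole batch back. The DimensionID and MeasureGroupID values are placeholders based on the names in the error output; the real object IDs in your project may differ.

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine" Transaction="true">
  <!-- 1. Refresh the dimension first so every attribute key exists -->
  <Process>
    <Object>
      <DatabaseID>Column_Analytics_QA</DatabaseID>
      <DimensionID>Logical Database</DimensionID>
    </Object>
    <Type>ProcessUpdate</Type>
  </Process>
  <!-- 2. Then process the measure group that references it -->
  <Process>
    <Object>
      <DatabaseID>Column_Analytics_QA</DatabaseID>
      <CubeID>COLUMN_USAGE</CubeID>
      <MeasureGroupID>LGCL DATABASE SYSTEM MAP</MeasureGroupID>
    </Object>
    <Type>ProcessFull</Type>
  </Process>
</Batch>

Commands inside a Batch run sequentially unless wrapped in <Parallel>, so the dimension is guaranteed to finish before the measure group starts.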
 

Similar Messages

  • SCSM2012: Cube processing failing on two cubes - ConfigItemDimKey not found

    Hi
Two of our cubes (SystemCenterSoftwareUpdateCube and SystemCenterPowerManagementCube) have started to fail processing lately. In Service Manager the error is just "failed", but in SSAS there are a lot of errors.
Both cubes fail with the following error when SSAS processes them:
    "Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'ConfigItemDim', Column: 'ConfigItemDimKey', Value: '7200'. The attribute is 'ConfigItemDimKey'. Errors in the OLAP storage engine: The attribute key was converted
    to an unknown member because the attribute key was not found. Attribute ConfigItemDimKey of Dimension: ConfigItemDim from Database: DWASDataBase, Cube: SystemCenterSoftwareUpdateCube, Measure Group: ConfigItemDim, Partition: ConfigItemDim, Record: 7201. Errors
    in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation. Errors in the OLAP storage engine: An error occurred while processing the 'ConfigItemDim'
    partition of the 'ConfigItemDim' measure group for the 'SystemCenterSoftwareUpdateCube' cube from the DWASDataBase database."
=> My question: is it possible to recreate this ConfigItemDimKey manually (and how), or to delete those cubes and recreate them from scratch (back to out-of-the-box status)?
    Thanks.
    /Peter

    Hi Peter,
We recently had similar issues with our ChangeAndActivityManagementCube. After a conversation with a Microsoft supporter I was able to work around the problem, and so far it hasn't reappeared.
As you can read from the error message, the issue appears when Analysis Services tries to process the ConfigItemDim measure group. During processing it looks up each attribute key in the corresponding dimension. When the measure group is processed before that dimension, the attribute key may not exist in the dimension table yet, and this error occurs.
    What you have to do is the following:
    1. Process the dimensions manually using PowerShell and the following code (change the server address and SQL instance and execute it on the AS server):
    # Load the AMO (Analysis Management Objects) assembly
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") > $NULL
    # Connect to the Analysis Services instance that hosts the data warehouse
    $Server = New-Object Microsoft.AnalysisServices.Server
    $Server.Connect("YOUR_DW_AS_SERVER\DW_AS_INSTANCE")
    # Pick the data warehouse database
    $DWASDB = $Server.Databases["DWASDatabase"]
    # Fully process every dimension in the database
    foreach ($Dimension in $DWASDB.Dimensions) { $Dimension.Process("ProcessFull") }
2. Then process the affected measure group manually using SQL Server Management Studio. Connect to the AS engine, expand the DWASDatabase DB -> Cubes -> Measure Groups -> right-click the affected measure group, select Process, leave the standard settings in the next window, and press OK.
You have to repeat step 2 for each measure group mentioned in the event logs.
3. Now process the entire cube by right-clicking the cube in SQL Server Management Studio and selecting Process. The processing should now finish successfully.
    Since then the data warehouse cube processing jobs were working fine again too in our installation.
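    For reference, the manual SSMS action in step 2 corresponds roughly to an XMLA command like the sketch below (the IDs are taken from the error message and may differ from the actual object IDs in your installation):

      <Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
        <Object>
          <DatabaseID>DWASDataBase</DatabaseID>
          <CubeID>SystemCenterSoftwareUpdateCube</CubeID>
          <MeasureGroupID>ConfigItemDim</MeasureGroupID>
        </Object>
        <Type>ProcessFull</Type>
      </Process>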
    Hope this helps.
    Cheers
    Alex

  • Cube Process failed?

    How to fix this error?
Executed as user: TWCCORP\los.sql. Microsoft (R) SQL Server Execute Package Utility Version 10.0.5500.0 for 64-bit. Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 4:42:15 AM
Error: 2014-07-25 06:47:16.58 Code: 0xC1000007 Source: Process PaymentServices Cube Analysis Services Execute DDL Task Description: Internal error: The operation terminated unsuccessfully. End Error
Error: 2014-07-25 06:47:16.58 Code: 0xC1110078 Source: Process PaymentServices Cube Analysis Services Execute DDL Task Description: Errors in the back-end database access module. The read operation was cancelled due to an earlier error. End Error
Error: 2014-07-25 06:47:16.58 Code: 0xC11F000D Source: Process PaymentServices Cube Analysis Services Execute DDL Task Description: Errors in the OLAP storage engine: An error occurred while the 'Customer Key' attribute of the 'Subscriber' dimension from the 'PaymentServices' database was being processed. End Error
Error: 2014-07-25 06:47:16.58 Code: 0xC11F0006 Source: Process PaymentServices Cube Analysis Services Execute DDL Task Description: Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation. End Error
Error: 2014-07-25 06:47:16.67 Code: 0xC1020034 Source: Process PaymentServices Cube Analysis Services Execute DDL Task Description: File system error: The following file is corrupted: Physical file: \\?\G:\SQLANALYSIS\MSMDCacheRowset_4856_2240_ybmwas.tmp. Logical file . End Error
Error: 2014-07-25 06:47:16.69 Code: 0xC102003C Source: Process PaymentServices Cube Analysis Services Execute DDL Task Description: File system error: The background thread running lazy writer encountered an I/O error. Physical file: \\?\G:\SQLANALYSIS\MSMDCacheRowset_4856_2240_ybmwas.tmp. Logical file: . End Error
Error: 2014-07-25 06:47:16.69 Code: 0xC11F000D Source: Process PaymentServices Cube Analysis Services Execute DDL Task Description: Errors in the OLAP storage engine: An error occurred while the 'Customer Key' attribute of the 'Subscriber' dimension from the 'PaymentServices' database was being processed. End Error
Error: 2014-07-25 06:47:16.69 Code: 0xC11F0006 Source: Process PaymentServices Cube Analysis Services Execute DDL Task Description: Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation. End Error
Error: 2014-07-25 06:47:16.69 Code: 0xC11C0006 Source: Process PaymentServices Cube Analysis Services Execute DDL Task Description: Server: The current operation was cancelled because another operation in the transaction failed. End Error
DTExec: The package execution returned DTSER_FAILURE (1). Started: 4:42:15 AM Finished: 6:47:17 AM Elapsed: 7501.56 seconds. The package execution failed. The step failed.

    The error to pay attention to is:
    The following file is corrupted: Physical file: \\?\G:\SQLANALYSIS\MSMDCacheRowset_4856_2240_ybmwas.tmp
    I would recommend you delete that database from the SSAS server using Management Studio. Then redeploy and reprocess that database from source code.
    You may also want to stop SSAS and do a chkdsk on the G drive to be sure the media isn't failing.
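    If you prefer to script the cleanup instead of clicking through Management Studio, deleting the database can be done with a minimal XMLA command like the sketch below (assuming the SSAS database is named PaymentServices, as the log suggests):

      <Delete xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
        <Object>
          <DatabaseID>PaymentServices</DatabaseID>
        </Object>
      </Delete>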
    http://artisconsulting.com/Blogs/GregGalloway

  • Dimensions and Cubes Process OK / database 'Process Full' fails

    Hello,
    I am having trouble processing SSAS database (SQL 2008R2).
Within the BIDS environment I can process all dimensions and both cubes - no problem. But when I try to process the database (Process Full – Sequential / All in one transaction), I keep getting errors about one specific dimension (even though the very same dimension processes fine on its own).
    Any ideas why this would happen?
    Thanks,
    Lana

Thanks so much for all your replies! Your help is greatly appreciated.
Here is the error message I am getting (it is the same for a bunch of other attributes of the same dimension, called Item):
    Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Item', Name of 'Item' was being processed.
    Errors in the OLAP storage engine: An error occurred while the 'Product Group' attribute of the 'Item' dimension from the 'InvCube' database was being processed.
    OLE DB error: OLE DB or ODBC error: Operation canceled; HY008.
    Errors in the OLAP storage engine: A duplicate attribute key has been found when processing: Table: 'dbo_Item', Column: 'Colour', Value: ''. The attribute is 'Colour'.
Keep in mind that the key for this dimension is called ItemID and there are no duplicates (I checked). Before processing the cube, I have dozens of procedures (within the SSIS package that brings the data into the DW) that check whether all the keys (FK columns on the fact table) exist within the dim tables (PK columns). I also do not allow NULLs or blanks for the dim keys within fact tables; I replace them with special "DimName-000" keys that, again, exist within the dim tables.
What I am trying to figure out is why I can process the Item dimension just fine on its own (all the other dimensions also process OK), then process both cubes that use this (and other) dimension(s), and browse the cubes, getting breakdowns by Item / Colour / Product Group, etc. So everything seems to work perfectly. However, when I try to process the whole database (with all previously processed dimensions and both cubes - there are only two cubes for now, to keep it simple), the process fails, giving me an error about a duplicate attribute key for this specific Item dimension.
    Any thoughts?
    Thanks again!

  • SSAS Cube Dimension Processing Fails Reporting: File system error: A FileStore error from WriteFile occurred

    Hi,
    I have a SSAS Cube Database for which the processing had been stopped since the year 2012, but as the requirements have again come up, this cube needs to be processed on a daily basis again, to refresh the cube with new incoming data.
As I was testing the cube processing steps before finally re-activating cube processing in production, I came across a weird error while doing a manual process of a dimension.
    File system error: A FileStore error from WriteFile occurred. Physical file: \\?\D:\MSAS10.MSSQLSERVER\OLAP\Data\Citi.184.db\Dim Extended Data.0.dim\54.Extended Data.ksstore. Logical file: . . Errors in the OLAP storage engine: An error occurred
    while the 'Extended Data' attribute of the 'ExtendedData' dimension from the 'Citi' database was being processed.
This error appeared while I was doing a Process Update for the dimension; I even tried a Process Full for the same dimension, which did not help. I then did an Unprocess followed by a Process Full, which didn't work either.
Can anyone please help me with this issue? I am using SQL Server 2008, where the cube is hosted, and the processing used to work fine earlier, but now it fails with this error.
    Thanks

This looks like the 4GB string store limit. If you can upgrade to SQL Server 2012 Analysis Services, you can change a simple setting (the StringStoresCompatibilityLevel property) and grow beyond that 4GB string store limit.
    http://technet.microsoft.com/en-us/library/gg471589(SQL.110).aspx
    If you can't upgrade, what data is in that ExtendedData attribute? If it's a meaningless surrogate key that you don't need users to see, you can use this trick to avoid the 4GB string store limit:
    http://cwebbbi.wordpress.com/2012/08/10/storage-and-the-namecolumn-and-keycolumns-properties/ But if it's something the users need to see, there's not an easy way to fix it without upgrading to SSAS 2012.
    http://artisconsulting.com/Blogs/GregGalloway

  • Cube processing Error

    Hi,
I am using SSAS 2008 R2. We have a job scheduled at 3:00 AM, but today we got this error during cube processing. Please help me solve this ASAP.
    Error: 2015-01-07 08:26:49.08     Code: 0xC11F0006
     Source: Analysis Services Processing Task 1 Analysis Services Execute DDL Task   
     Description: Errors in the OLAP storage engine:
     The process operation ended because the number of errors encountered during processing reached the defined
     limit of allowable errors for the operation.

Hi Anu,
According to your description, you get errors when executing a cube processing task, right?
The error you posted only tells us that there were too many errors during processing. Pay attention to the errors around this message; those errors will show where the root issue is. We suggest you use SQL Server Profiler to monitor SSAS. See:
Use SQL Server Profiler to Monitor Analysis Services. Please share those detailed error messages with us. Also refer to the similar threads below; some of the advice in the links may help:
    https://social.msdn.microsoft.com/Forums/sqlserver/en-US/61f117bd-b6ac-493a-bbd5-3b912dbe05f0/cube-processing-issue
    https://social.msdn.microsoft.com/forums/sqlserver/en-US/006c849c-86e3-454a-8f27-429fadd76273/cube-processing-job-failed
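    As a side note, the "defined limit of allowable errors" in the message comes from the ErrorConfiguration of the processing command. A rough sketch (all object IDs are placeholders) that lifts the limit and logs every key error to a file, so you can see all underlying problems in a single run, might look like this:

      <Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
        <Object>
          <DatabaseID>YourDatabase</DatabaseID>
          <CubeID>YourCube</CubeID>
        </Object>
        <Type>ProcessFull</Type>
        <ErrorConfiguration>
          <KeyErrorLimit>-1</KeyErrorLimit>  <!-- -1 removes the error limit -->
          <KeyErrorLogFile>C:\Temp\KeyErrors.log</KeyErrorLogFile>
          <KeyNotFound>ReportAndContinue</KeyNotFound>
        </ErrorConfiguration>
      </Process>

    This only helps with diagnosis; the underlying key errors still need to be fixed.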
    If you have any question, please feel free to ask.
    Simon Hou
    TechNet Community Support

  • TIME dimension processing fails saying "..attribute key cannot be found.." in EPM 10

    After upgrading from version 7.5 to EPM 10, when we ran a ‘Full Process’ on the TIME dimension, it ran into an error saying “Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'dbo_tblFactCapEx', Column: 'TIMEID', Value: '20090013'. The attribute is 'Time_TIMEID'.  (1/13/2015 2:41:02 PM)”.
    Full error message is attached herewith – ‘Time Dimension Error.txt’
After researching, we did discover that MONTHNUM needed to be converted to BASE_PERIOD. Re-processing produced the same error.
Prior to the migration we ran a full process on the TIME dimension in version 7.5. It completed successfully, confirming the issue exists only in version 10.
    Confirmed we could see the TIMEID value of 20090013 in the following places:
    Time Dimension in the appropriate TIMEID attribute column.
    Confirmed mbrTIME table had base member ID with TIMEID attribute filled out correctly.
    Data in tblFactFINANCE could be pulled using that TIMEID
We truncated all the records in all the fact tables associated with this TIME dimension.
Eventually, when none of the tables had any records, the TIME dimension processed successfully.
We thus began to suspect the issue may not really be related to bad records.
    We conducted one more test to confirm this.
    Using an input form in EPM 10, we manually entered data in one of the models (at this point none of the fact tables have any records)
    Ran Full Optimize on that model with Compress Database and Index Defragmentation checked – This step failed with the error attached in ‘MatrixRateFullOptimize.txt’
    Ran Full process on Time Dimension – Failed indicating issue with TimeID 2012001 (that’s my manual entry). Attached error report ‘TimeDim Error MatrixRate.txt’
    At this point, the table only contains the manually entered records (no suspected bad records)
    We then suspected there could have been an issue with the upgrade process.
    So we reprocessed all the dimension and optimized all the models in version 7.5, made a new backup and restored it to version 10.
    The issue still persisted!
    At this point, we have tried all the possibilities we could think of. Each time the fact table is populated with records, the TIME dimension process fails indicating ‘the attribute key’ cannot be found.
    There is probably something in the OLAP partition that is not able to link the dimension attributes to the cubes.
    Additional Information:
    Please find attached the existing Time Dimension – TimeDimensionMembers.xlxs
    Version of Excel used: Excel 2007, SP3 MSO (12.0.6683.5000)
    System Specs: Please see screenshot below.

    Thank you all for responding! This issue is resolved.
    Here’s what the issue was:
    The time structure is TOTAL >> Years >> Quarters >> Months (e.g. T.ALL >> 2012.TOTAL >> 2012.Q1 >> 2012.P01)
    As shown in the screenshot below, the LEVEL for ‘T.ALL’ member was set to YEAR, which is incorrect (we can’t have Year rolling up to a Year)
    We changed the LEVEL to ‘TOTAL’ and this fixed the issue!!
    If only it gave a better error message than the “..attribute key not found” message

  • Cube Processing - process Dimension before Measure Groups? How to?

    Hi guys.
I noticed that when you choose "Process Cube", measure groups are processed before dimensions. Recently I ran into an issue caused by a change in data that made the Process Cube job fail, until I manually processed all dimensions and then ran the "Process Cube" job again.
Is there any way to configure it to process dimensions before measure groups?
    Cheers!

We use SSIS to automate cube processing using XMLA scripts. We have a control table in which we maintain the list of dimensions and measure-group partitions; SSIS iterates over it and processes each object. It also logs audit information such as when processing started, when it ended, and the result; a sketch of the generated XMLA follows below.
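    A single iteration of that SSIS loop might emit XMLA along these lines; every ID below is a hypothetical placeholder for whatever the control table supplies:

      <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
        <!-- dimensions first, in the order maintained in the control table -->
        <Process>
          <Object>
            <DatabaseID>SalesDW</DatabaseID>
            <DimensionID>Dim Product</DimensionID>
          </Object>
          <Type>ProcessUpdate</Type>
        </Process>
        <!-- then each measure-group partition -->
        <Process>
          <Object>
            <DatabaseID>SalesDW</DatabaseID>
            <CubeID>Sales</CubeID>
            <MeasureGroupID>Fact Sales</MeasureGroupID>
            <PartitionID>Fact Sales 2015</PartitionID>
          </Object>
          <Type>ProcessFull</Type>
        </Process>
      </Batch>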
Visakh

  • New field added to cube, delta DTP from DSO to cube is failing

    Dear all,
Scenario in BI 7.0:
data source --delta IPs--> DSO --delta DTP--> Cube.
Data loads run daily via a process chain.
We added a new field to the cube; the transformation from DSO to cube is active and the transport was successful.
Now the delta from DSO to cube is failing.
The errors are: Dereferencing of the NULL reference,
Error while extracting from source <DSO name>,
Inconsistent input parameter (parameter: Fieldname, value DATAPAKID).
My conclusion: the system is unable to load the delta because of the new field, and it wants us to initialize again (am I right?).
Is my only choice to delete the data from the cube and perform the init DTP again, or is there another way?
    Thanks in advance!
    Regards,
    Akshay Harshe

    Hi Durgesh / Murli,
    Thanks for quick response.
@durgesh: we have mapped an existing DSO field to a new field in the cube. So yes, in the DTP I can see the field in the filter, so I have to do a re-init.
@Murli: everything is active.
Actually there are further complications, as the cube has many more sources, so we wanted to avoid selective deletion.
    Regards,
    Akshay

  • Cube refresh fails with an error below

    Hi,
We are experiencing the problem below during a Planning application database refresh. We have been refreshing the database every day, but all of a sudden the error below started appearing in the log:
    Cube refresh failed with error: java.rmi.UnmarshalException: error unmarshalling return; nested exception is:
    java.io.EOFException
When the database refresh is done manually from Workspace, it completes successfully. But when triggered from a unix script, it throws the above error.
Is it related to some provisioning issue, for which the user has been removed from MSAD? Please help me out on this.
    Thanks,
    mani
    Edited by: sdid on Jul 29, 2012 11:16 PM

I work with 'sdid' and here is a better explanation of what exactly is going on.
As part of our nightly schedule we have a unix shell script that refreshes the Essbase cubes from Planning using the 'CubeRefresh.sh' shell script.
Here is what our shell command looks like:
    /opt/hyperion/Planning/bin/CubeRefresh.sh /A:<cube name> /U:<user id> /P:<password> /R /D /FS
    Here is what 'CubeRefresh.sh' looks like -
    PLN_JAR_PATH=/opt/hyperion/Planning/bin
    export PLN_JAR_PATH
    . "${PLN_JAR_PATH}/setHPenv.sh"
    "${HS_JAVA_HOME}/bin/java" -classpath ${CLASSPATH} com.hyperion.planning.HspCubeRefreshCmd $1 $2 $3 $4 $5 $6 $7
    And here is what 'setHPenv.sh' looks like -
    HS_JAVA_HOME=/opt/hyperion/common/JRE/Sun/1.5.0
    export HS_JAVA_HOME
    HYPERION_HOME=/opt/hyperion
    export HYPERION_HOME
    PLN_JAR_PATH=/opt/hyperion/Planning/lib
    export PLN_JAR_PATH
    PLN_PROPERTIES_PATH=/opt/hyperion/deployments/Tomcat5/HyperionPlanning/webapps/HyperionPlanning/WEB-INF/classes
    export PLN_PROPERTIES_PATH
CLASSPATH=${PLN_JAR_PATH}/HspJS.jar:${PLN_PROPERTIES_PATH}:${PLN_JAR_PATH}/hbrhppluginjar:${PLN_JAR_PATH}/jakarta-regexp-1.4.jar:${PLN_JAR_PATH}/hyjdbc.jar:${PLN_JAR_PATH}/iText.jar:${PLN_JAR_PATH}/iTextAsian.jar:${PLN_JAR_PATH}/mail.jar:${PLN_JAR_PATH}/jdom.jar:${PLN_JAR_PATH}/dom.jar:${PLN_JAR_PATH}/sax.jar:${PLN_JAR_PATH}/xercesImpl.jar:${PLN_JAR_PATH}/jaxp-api.jar:${PLN_JAR_PATH}/classes12.zip:${PLN_JAR_PATH}/db2java.zip:${PLN_JAR_PATH}/db2jcc.jar:${HYPERION_HOME}/common/CSS/9.3.1/lib/css-9_3_1.jar:${HYPERION_HOME}/common/CSS/9.3.1/lib/ldapbp.jar:${PLN_JAR_PATH}/log4j.jar:${PLN_JAR_PATH}/log4j-1.2.8.jar:${PLN_JAR_PATH}/hbrhppluginjar.jar:${PLN_JAR_PATH}/ess_japi.jar:${PLN_JAR_PATH}/ess_es_server.jar:${PLN_JAR_PATH}/commons-httpclient-3.0.jar:${PLN_JAR_PATH}/commons-codec-1.3.jar:${PLN_JAR_PATH}/jakarta-slide-webdavlib.jar:${PLN_JAR_PATH}/ognl-2.6.7.jar:${HYPERION_HOME}/common/CLS/9.3.1/lib/cls-9_3_1.jar:${HYPERION_HOME}/common/CLS/9.3.1/lib/EccpressoAll.jar:${HYPERION_HOME}/common/CLS/9.3.1/lib/flexlm.jar:${HYPERION_HOME}/common/CLS/9.3.1/lib/flexlmutil.jar:${HYPERION_HOME}/AdminServices/server/lib/easserverplugin.jar:${PLN_JAR_PATH}/interop-sdk.jar:${PLN_JAR_PATH}/HspCopyApp.jar:${PLN_JAR_PATH}/commons-logging.jar:${CLASSPATH}
export CLASSPATH
case $OS in
HP-UX)
SHLIB_PATH=${HYPERION_HOME}/common/EssbaseRTC/9.3.1/bin:${HYPERION_HOME}/Planning/lib:${SHLIB_PATH:-}
export SHLIB_PATH
;;
SunOS)
LD_LIBRARY_PATH=${HYPERION_HOME}/common/EssbaseRTC/9.3.1/bin:${HYPERION_HOME}/Planning/lib:${LD_LIBRARY_PATH:-}
export LD_LIBRARY_PATH
;;
AIX)
LIBPATH=${HYPERION_HOME}/common/EssbaseRTC/9.3.1/bin:${HYPERION_HOME}/Planning/lib:${LIBPATH:-}
export LIBPATH
;;
*)
echo "$OS is not supported"
;;
esac
    We have not made any changes to either the shell or 'CubeRefresh.sh' or 'setHPenv.sh'
For the past couple of days the shell that executes 'CubeRefresh.sh' has been failing with the error message below.
    Cube refresh failed with error: java.rmi.UnmarshalException: error unmarshalling return; nested exception is:
    java.io.EOFException
    This error is causing our Essbase cubes to not get refreshed from Planning cubes through these batch jobs.
On the other hand, the manual refresh from within Planning works.
    We are on Hyperion® Planning – System 9 - Version : 9.3.1.1.10
    Any help on this would be greatly appreciated.
    Thanks
    Andy
    Edited by: Andy_D on Jul 30, 2012 9:04 AM

  • Message processing failed if we receive attachments via E-Mail

    Dear ladies and gentlemen,
we have the following process:
We receive e-mails with an attachment. The e-mail sender channel picks up the messages and sends them to an RFC channel.
    I changed the payload via the localejbs/AF_Modules/PayloadSwapBean :
    localejbs/AF_Modules/PayloadSwapBean Local Enterprise Bean TRANSFORM
    and the module configuration looks follow:
    TRANSFORM swap.keyName payload-name
TRANSFORM swap.keyValue MailAttachment-1
This looks good: the payload is swapped correctly and the mapping is also okay.
But when we push the message to the RFC adapter, I get the following error message back:
    Message processing failed. Cause: com.sap.engine.interfaces.messaging.api.exception.MessagingException: com.sap.aii.adapter.rfc.afcommunication.RfcAFWException: error while processing message to remote system:com.sap.aii.adapter.rfc.core.client.RfcClientException: functiontemplate from repository was <null>
So now my question: is it possible that the adapter also tries to send the body of the e-mail to the target system? The message ID is the ID from the payload of the e-mail body.
Is it possible to keep the e-mail body out of the payload? We only need the attachment. Or do I have to set anything up via a bean in the RFC adapter?
    Thanks
Kind regards
    Stephan Kohler
P.S. The PI system is PI 7.1; the target system is R/3 4.6. I haven't had any success with RFC adapter version 7.10 or 6.40.

    > Message processing failed. Cause: com.sap.engine.interfaces.messaging.api.exception.MessagingException: com.sap.aii.adapter.rfc.afcommunication.RfcAFWException: error while processing message to remote system:com.sap.aii.adapter.rfc.core.client.RfcClientException: functiontemplate from repository was <null>
    >
Is your attachment XML? Is the XML structure the same as your mapping source? If so, it should be okay. Does this interface work with a non-mail sender? You might want to re-import the RFC structure into PI.
    VJ

  • Error in external tax system: SAX processing failed on input stream SAX pro

    Hi
When posting in T.Code FB70 (Customer Invoice), I get the error below.
    Error in external tax system: SAX processing failed on input stream SAX processi.
I ticked the Calculate Tax column and selected O1 (A/R Sales Taxable).
Please help me.
    Thanks
    Ranjith

    Hi Ranjith,
    I also face this problem in Production now.
    Could you kindly share with me how you resolved this issue?
    Thanks,
    Markus

  • Message processing failed, FTP Receiver Adapter error...

    Hello all,
    We have a Idoc to File(FTP) scenario using PI.
When PI tries to send the file out to the FTP site, we get the following message in the communication channel monitoring and the file never reaches the FTP site:
    Message processing failed. Cause: com.sap.aii.af.ra.ms.api.RecoverableException: Error when getting an FTP connection from connection pool: com.sap.aii.af.service.util.concurrent.ResourcePoolException: Unable to create new pooled resource: ConnectException: Connection timed out: connect
When we look at the detail display, we can see that the connection has been established with the FTP site, but the adapter is unable to deliver the file.
Any idea why?
    Thanks in advance.

    Hi ,
    There are two things that you can do
    1- check the connection of FTP from command prompt. If it is acceebile from command prompt then check for authorization that wether you have access to post the file at FTP or not (Full access READ , WRITE and EXCECUTE)
    2- in your adpater change the connection mode from Per file transfer to "Permanent".
    Please feel free to reply on this thread if you are not able to.
    Thanks

  • Upgrading Stellent 7.5 to OCS 10gR3 Import Process failing HELP NEEDED

    Hi,
    I am upgrading Stellent 7.5 to Oracle COntent Server 10gR3. Here is what I have done.
    1. Migrated all the configuration from Stellent to 10gR3
    2. Migrated the Folders from Stellent to 10gR3
    3. Migrated the content by creating an Archive and then importing the Archive in 10gR3.
I am seeing a lot of errors in the log file. The following are the errors I see:
    1.
    Could not send mail message from (null) with subject line: Content Release Notification. Could not get I/O for connection to: hpmail.rtp.ppdi.com java.net.ConnectException: Connection timed out
    2.
    Import error for archive 'ProductionContent' in collection 'prod_idc': Invalid Metadata for 'ID_000025'. Virtual folder does not exist.
    3.
    Import error for archive 'ProductionContent' in collection 'prod_idc': Content item 'ID_004118' was not successfully checked in. The primary file does not exist.
    4.
    Import error for archive 'ProductionContent' in collection 'prod_idc': Content item 'ID_004213' was not successfully checked in. IOException (System Error: /u01/app/oracle/prod/ucm/server/archives/productioncontent/09-dec-21_23.29.44_396/4/vault/dmc_unblinded_documents/4227 (No such file or directory)) java.io.FileNotFoundException: /u01/app/oracle/prod/ucm/server/archives/productioncontent/09-dec-21_23.29.44_396/4/vault/dmc_unblinded_documents/4227
    5.
    Import error for archive 'ProductionContent' in collection 'prod_idc': Content item 'ID_031414' with revision label '2' was not successfully checked in. The release date (11/4/08 9:12 AM) of the new revision is not later than the release date (11/4/08 9:12 AM) of the latest revision in the system.
    6.
    Import error for archive 'ProductionContent' in collection 'prod_idc': Invalid Metadata for 'ID_033551'. Item with name '07-0040_IC_Olive-View_UCLA_ERI_Cellulitis_2008-08-26.pdf' already exists in folder '/Contribution Folders/2007/07-0040/07-0040Site_Specific_Documents/07-0040Olive_View_UCLA_Medical_Center/07-0040Archive/07-0040Essential_Documents_ARC/07-0040Informed_Consent_ARC/'.
    7.
    Import error for archive 'ProductionContent' in collection 'prod_idc': Aborting. Too many errors.
    QUESTIONS:
Is there a way to keep the import process running even when errors occur? It looks like the import process stops in the middle when there are too many errors.
How do I find out the total number of folders and documents? I want to run the same query on Stellent 7.5 and on 10gR3 and compare the results, just to find out how much content was imported.
How do I run the import process over again? Half of the content was imported before the process failed in the middle; what settings do I need to provide to make sure no duplicates get created when running the process again?
    Any help is really appreciated.
    Thanks

    Hi
There are a couple of ways to get around the issues you are facing so that the import process is not interrupted. They are as follows:
1. Use the ArchiveReplicationException component. This will keep the import process running and log the failed items, which can be used for gauging the success of the import and what needs to be redone. I would suggest this option for your case.
2. Set the archiver exception config variable to 9999 so that the archive process stops only after hitting the limit of 9999 errors.
I would suggest going for option 1, as it is a much more foolproof and methodical way of knowing which items failed during import.
    Thanks
    Srinath

  • BPEL process fails in SOA (in UNIX), but works fine in SOA (in Windows) env

    Hello,
    BPEL process fails in SOA (in UNIX), but works fine in SOA (in Windows) environment
Step 1: Build an asynchronous BPEL process which has no extra nodes. Build and deploy it on the 'local Windows desktop SOA' server.
    The BPEL process has three nodes:
    a. client - on the left side of the swim lane
    b. receiveInput - first node in swim lane (client calls 'receiveInput')
    c. callbackClient - second and last node in the swim lane ('callbackClient' calls client)
    Step 2: Go to BPEL console and 'Initiate' the BPEL process -> 'Post XML Message'
    Step 3: Now, I can see the successfully completed BPEL instance in the BPEL console.
    Now,
    Step 4: Deploy the same BPEL process (dummy asynchronous) in the SOA server (hosted in unix box)
    Step 5: Go to BPEL console and 'Initiate' the BPEL process -> 'Post XML Message'
    Step 6: I find that the BPEL instance appears to have ended in error (on the second node i.e. callbackClient )
    With the following error message
<invalidVariables xmlns="http://schemas.oracle.com/bpel/extension"><part name="code"><code>9710</code>
</part><part name="summary"><summary>Invalid xml document.
According to the xml schemas, the xml document is invalid. The reason is: Error::cvc-complex-type.2.4.b: The content of element 'DummyBPELProcessProcessResponse' is not complete. One of '{"http://xmlns.oracle.com/DummyBPELProcess":result}' is expected.
Please make sure that the xml document is valid against your schemas.
</summary>
</part></invalidVariables>
Has anyone faced a similar issue, i.e. the process works fine in the Windows environment (local SOA) but fails on the SOA server in the UNIX environment?
    Appreciate your help in understanding this issue.
    Thanks,
    Santhosh

    Hello,
    The fix to this issue appears to have been as follows:
<schema attributeFormDefault="unqualified"
     elementFormDefault="qualified"
     targetNamespace="http://xmlns.oracle.com/DummyBPELProcess"
     xmlns="http://www.w3.org/2001/XMLSchema">
     <element name="DummyBPELProcessProcessRequest">
          <complexType>
               <sequence>
                    <element name="input" type="string"/>
               </sequence>
          </complexType>
     </element>
     <element name="DummyBPELProcessProcessResponse">
          <complexType>
               <sequence>
                    <element name=":result" type="string"/>
               </sequence>
          </complexType>
     </element>
</schema>
In DummyBPELProcess.xsd, modifying "result" to ":result" appears to have resolved the issue in SOA under the unix environment.
If anyone can explain why "result" works in SOA under Windows while ":result" works in SOA under unix, I would really appreciate your help.
    Thanks,
    Santhosh
