Analysis Services - Partition Design

Hi,
With a backup of an SAP BPC appset, the partition design is not maintained; on a restore it always reverts to the standard three partitions.
This means that if we ever move our application set, I need to recreate the partitions from scratch.
Within SQL Server Management Studio there is an option to "script cube as".
Can these scripts be safely used to move the partition design between servers?
Regards,
Mark

Hi Mark,
Yes, this is a good option.
However, my suggestion would be to also take a backup of the SSAS database whenever a backup is done using Server Manager.
In this way, after the restore with Data Manager, you can also restore the SSAS database.
Please change the connection in Appdef to use the right SQL Server, not the one from which you performed the backup.
Another point: script the partitions when you create them.
Right-click the partition and choose Script Partition as, then save to a file.
In this way you will have the partition scripts ready the next time you move to another environment.
After creating the partitions and deleting the "nameofapplication" partition, you have to process the entire cube.
I hope this helps you.
Regards
Sorin
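As a rough illustration of Sorin's suggestion, here is a minimal AMO sketch (VB.NET) of what recreating a partition in code can look like. The server name, database, cube, measure group and fact-table query below are placeholders, not values from the original appset, so adjust them to your environment.

' Minimal AMO sketch: recreate one partition and reprocess the cube after a restore.
' All object names and the source query are hypothetical placeholders.
Imports Microsoft.AnalysisServices

Module RecreatePartition
    Sub Main()
        Dim srv As New Server()
        srv.Connect("localhost")                 ' target SSAS instance
        Dim db As Database = srv.Databases("MyAppSet")
        Dim cube As Cube = db.Cubes("Finance")
        Dim mg As MeasureGroup = cube.MeasureGroups("Finance")

        ' Add a partition bound to a filtered query against the fact table.
        Dim p As Partition = mg.Partitions.Add("Finance_2013")
        p.Source = New QueryBinding(db.DataSources(0).ID, _
            "SELECT * FROM dbo.tblFactFinance WHERE TIMEID BETWEEN '20130100' AND '20131299'")
        p.StorageMode = StorageMode.Molap
        p.Update()

        ' Reprocess so the new partition picks up data, as Sorin recommends.
        cube.Process(ProcessType.ProcessFull)
        srv.Disconnect()
    End Sub
End Module

The XMLA files produced by "Script Partition as" in SSMS create the same objects declaratively; the AMO route is simply easier to parameterize when partition boundaries change between environments.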

Similar Messages

  • [Forum FAQ] How do I create calculated measure using AMO in SQL Server Analysis Services?

    Introduction
    In SQL Server Analysis Services (SSAS), you can create a calculated measure in SQL Server Data Tools (SSDT)/Business Intelligence Development Studio (BIDS). Sometimes you may need to create a calculated measure by using AMO in a C# or VB project.
    In this article, I will demonstrate how to create a calculated measure using AMO in SSAS.
    Prerequisites
    Before creating a calculated measure using AMO, ensure that the following components are installed on your server.
    The multidimensional database AdventureWorks Multidimensional Model 2012
    A SQL Server with SSIS and SSAS installed
    The AMO libraries installed:
    X86 Package (SQL_AS_AMO.msi)
    X64 Package (SQL_AS_AMO.msi)
    Solution
    Here are the detailed steps to create a calculated measure using AMO in SSAS.
    Open SSDT and create a new SSIS project.
    Drag Script Task to the design surface.
    Click SSIS -> Variables to open the Variables window and add two variables that will be used to connect to the server and database.
    Create a connection to connect to SSAS server.
    Rename the connection name to ssas.
    Double click the Script Task to open Script Task Editor.
    Add the Connection and Database variables to the ReadWriteVariables textbox and then click the Edit Script button.
    Add the AMO reference in the Solution Explorer window.
    Copy the script below and paste it into the Script Task.
    ' Declare the AMO objects used to reach the cube's MDX script.
    Dim objServer As Server
    Dim objDatabase As Database
    Dim strDataBaseID As String
    Dim objCube As Cube
    Dim objMdxScript As MdxScript
    Dim objCommand As Command
    Dim strCommand As String

    ' Connect to the SSAS instance and open the target database.
    objServer = New Server
    objServer.Connect("localhost")
    objDatabase = objServer.Databases("AdventureWorksDW2012Multidimensional-EE2")
    strDataBaseID = objDatabase.ID

    If objDatabase.Cubes.Count > 0 Then
        objCube = objDatabase.Cubes("Adventure Works")

        ' Reuse the cube's default MDX script, or create one if none exists.
        If objCube.MdxScripts.Count > 0 Then
            objMdxScript = objCube.MdxScripts(0)
        Else
            objCube.MdxScripts.Add("MdxScript", "MdxScript")
            objMdxScript = objCube.MdxScripts("MdxScript")
        End If

        ' Build the CREATE MEMBER statement for the calculated measure.
        objCommand = New Command
        strCommand = "CREATE MEMBER CURRENTCUBE.[Measures].[Multipy Measures By 3]"
        strCommand = strCommand & " AS [Measures].[Internet Sales Amount] * 3, "
        strCommand = strCommand & " VISIBLE = 1 ; "
        objCommand.Text = strCommand

        ' Append the command to the MDX script and save the changes to the server.
        objMdxScript.Commands.Add(objCommand)
        objMdxScript.Update()
        objCube.Update()
    End If

    objServer.Disconnect()
    Then you can run this SSIS package to create the calculated measure.
    Applies to
    Microsoft SQL Server 2005
    Microsoft SQL Server 2008
    Microsoft SQL Server 2008 R2
    Microsoft SQL Server 2012

    Thanks,
    Is this a supported scenario, or does it use unsupported features?
    For example, can we call exec [ReportServer].dbo.AddEvent @EventType='TimedSubscription', @EventData='b64ce7ec-d598-45cd-bbc2-ea202e0c129d'
    in a supported way?
    Thanks! Josh

  • Analysis Service Execute DDL Task throwing error with SourceType Variable

    Hi,
    I have configured the Analysis Services Execute DDL Task to use a Variable and process data (XMLA script) as below:
    When I execute this task I get the below error message:
    [Analysis Services Execute DDL Task] Error: The -->
    text node at line 23, column 3 cannot appear inside the DataSource element (namespace http://schemas.microsoft.com/analysisservices/2003/engine) under Envelope/Body/Execute/Command/Batch/Parallel/Process. This element can
    only have text nodes containing white-space characters.
    Can anyone please let me know how to resolve this.

    If I run it with the SourceType "Direct Input", the Analysis Services Execute DDL Task runs fine, but if I use the SourceType "Variable" it throws the error. Below is the XMLA script.
    Here is how my package looks and the XMLA script; it is failing at the "ProcessAdd" Analysis Services Execute DDL Task:
    "SELECT '<Batch xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">
    <ErrorConfiguration xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:ddl2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2\" xmlns:ddl2_2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2/2\" xmlns:ddl100_100=\"http://schemas.microsoft.com/analysisservices/2008/engine/100/100\" xmlns:ddl200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200\" xmlns:ddl200_200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200/200\" xmlns:ddl300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300\" xmlns:ddl300_300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300/300\" xmlns:ddl400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400\" xmlns:ddl400_400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400/400\">
    <KeyNotFound>IgnoreError</KeyNotFound>
    </ErrorConfiguration>
    <Parallel>
    <Process xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:ddl2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2\" xmlns:ddl2_2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2/2\" xmlns:ddl100_100=\"http://schemas.microsoft.com/analysisservices/2008/engine/100/100\" xmlns:ddl200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200\" xmlns:ddl200_200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200/200\" xmlns:ddl300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300\" xmlns:ddl300_300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300/300\" xmlns:ddl400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400\" xmlns:ddl400_400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400/400\">
    <Object>
    <DatabaseID>IIS_Version2</DatabaseID>
    <DimensionID>Application</DimensionID>
    </Object>
    <Type>ProcessAdd</Type>
    <DataSource xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:ddl2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2\" xmlns:ddl2_2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2/2\" xmlns:ddl100_100=\"http://schemas.microsoft.com/analysisservices/2008/engine/100/100\" xmlns:ddl200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200\" xmlns:ddl200_200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200/200\" xmlns:ddl300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300\" xmlns:ddl300_300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300/300\" xmlns:ddl400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400\" xmlns:ddl400_400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400/400\" xmlns:dwd=\"http://schemas.microsoft.com/DataWarehouse/Designer/1.0\" xsi:type=\"RelationalDataSource\" dwd:design-time-name=\"1a3cb292-9bce-4c59-a182-177d6b3506ff\" xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">
    <ID>IISDW</ID>
    <Name>IISDW</Name>
    <ConnectionString>Provider=SQLNCLI11.1;Data Source=CO1MSFTSQLHKT02;Integrated Security=SSPI;Initial Catalog=IISDW</ConnectionString>
    <Timeout>PT0S</Timeout>-->
    </DataSource>
    <DataSourceView xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:ddl2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2\" xmlns:ddl2_2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2/2\" xmlns:ddl100_100=\"http://schemas.microsoft.com/analysisservices/2008/engine/100/100\" xmlns:ddl200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200\" xmlns:ddl200_200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200/200\" xmlns:ddl300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300\" xmlns:ddl300_300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300/300\" xmlns:ddl400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400\" xmlns:ddl400_400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400/400\" xmlns:dwd=\"http://schemas.microsoft.com/DataWarehouse/Designer/1.0\" dwd:design-time-name=\"b0b61205-c64d-4e34-afae-6d4d48b93fb3\" xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">
    <ID>IISDW</ID>
    <Name>IISDW</Name>
    <DataSourceID>IISDW</DataSourceID>
    <Schema>
    <xs:schema id=\"IISDW_x0020_1\" xmlns=\"\" xmlns:xs=\"http://www.w3.org/2001/XMLSchema\" xmlns:msdata=\"urn:schemas-microsoft-com:xml-msdata\" xmlns:msprop=\"urn:schemas-microsoft-com:xml-msprop\">
    <xs:element name=\"IISDW_x0020_1\" msdata:IsDataSet=\"true\" msdata:UseCurrentLocale=\"true\" msprop:design-time-name=\"72037318-e316-469d-9a45-a10c77709b39\">
    <xs:complexType>
    <xs:choice minOccurs=\"0\" maxOccurs=\"unbounded\">
    <xs:element name=\"Application\" msprop:design-time-name=\"7f579e7e-e8b7-4a9d-8a93-a255fccbbfbe\" msprop:IsLogical=\"True\" msprop:FriendlyName=\"Application\" msprop:DbTableName=\"Application\" msprop:TableType=\"View\" msprop:Description=\"\" msprop:QueryDefinition=\"SELECT a.Application, DATEADD([hour], DATEDIFF([hour], 0, a.[Timestamp]), 0) AS [Timestamp], a.ServerName, CAST(a.ServerName AS char(3)) AS DataCenter, a.CS_URI_Stem, CAST(HashBytes(''MD5'', &#xD;&#xA; a.Application + a.ServerName + a.CS_URI_Stem + CAST(DATEADD([hour], DATEDIFF([hour], 0, a.[Timestamp]), 0) AS varchar(24))) AS uniqueidentifier) AS Server_URI_Identity&#xD;&#xA;FROM IIS_6_OLD AS a LEFT OUTER JOIN&#xD;&#xA; Dimension_Pointer AS b ON a.Application = b.Application&#xD;&#xA;WHERE (b.ProcessedFlag = 0) AND (a.Application IN ("+(DT_WSTR,100) @[User::strDistinctApplication]+"))&#xD;&#xA;GROUP BY a.Application, DATEADD([hour], DATEDIFF([hour], 0, a.[Timestamp]), 0), a.ServerName, a.CS_URI_Stem\" msprop:QueryBuilder=\"SpecificQueryBuilder\">
    <xs:complexType>
    <xs:sequence>
    <xs:element name=\"Application\" msprop:design-time-name=\"f3074e98-4a82-4bc5-a818-916203f7758b\" msprop:DbColumnName=\"Application\" msprop:FriendlyName=\"Application\" minOccurs=\"0\">
    <xs:simpleType>
    <xs:restriction base=\"xs:string\">
    <xs:maxLength value=\"255\" />
    </xs:restriction>
    </xs:simpleType>
    </xs:element>
    <xs:element name=\"Timestamp\" msdata:ReadOnly=\"true\" msprop:design-time-name=\"2662e3a8-8b1a-4d77-aecb-575329b84dc1\" msprop:DbColumnName=\"Timestamp\" msprop:FriendlyName=\"Timestamp\" type=\"xs:dateTime\" />
    <xs:element name=\"ServerName\" msprop:design-time-name=\"ced26d49-cd6e-4073-a40c-ff5ef70e4ef1\" msprop:DbColumnName=\"ServerName\" msprop:FriendlyName=\"ServerName\" minOccurs=\"0\">
    <xs:simpleType>
    <xs:restriction base=\"xs:string\">
    <xs:maxLength value=\"255\" />
    </xs:restriction>
    </xs:simpleType>
    </xs:element>
    <xs:element name=\"DataCenter\" msdata:ReadOnly=\"true\" msprop:design-time-name=\"4583e15a-dcf1-45a2-a30b-bd142ca8b778\" msprop:DbColumnName=\"DataCenter\" msprop:FriendlyName=\"DataCenter\" minOccurs=\"0\">
    <xs:simpleType>
    <xs:restriction base=\"xs:string\">
    <xs:maxLength value=\"3\" />
    </xs:restriction>
    </xs:simpleType>
    </xs:element>
    <xs:element name=\"CS_URI_Stem\" msprop:design-time-name=\"10db5a79-8d50-49d2-9376-a3b4d19864b9\" msprop:DbColumnName=\"CS_URI_Stem\" msprop:FriendlyName=\"CS_URI_Stem\" minOccurs=\"0\">
    <xs:simpleType>
    <xs:restriction base=\"xs:string\">
    <xs:maxLength value=\"4000\" />
    </xs:restriction>
    </xs:simpleType>
    </xs:element>
    <xs:element name=\"Server_URI_Identity\" msdata:DataType=\"System.Guid, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089\" msdata:ReadOnly=\"true\" msprop:design-time-name=\"018ede0a-e15e-47c2-851b-f4431e8c839c\" msprop:DbColumnName=\"Server_URI_Identity\" msprop:FriendlyName=\"Server_URI_Identity\" type=\"xs:string\" />
    </xs:sequence>
    </xs:complexType>
    </xs:element>
    </xs:choice>
    </xs:complexType>
    <xs:unique name=\"Constraint1\" msprop:IsLogical=\"True\" msdata:PrimaryKey=\"true\">
    <xs:selector xpath=\".//Application\" />
    <xs:field xpath=\"Server_URI_Identity\" />
    <xs:field xpath=\"Timestamp\" />
    </xs:unique>
    </xs:element>
    </xs:schema>
    <IISDW_x0020_1 xmlns=\"\" />
    </Schema>
    </DataSourceView>
    <WriteBackTableCreation>UseExisting</WriteBackTableCreation>
    </Process>
    </Parallel>
    </Batch>' as XMLAScript_ProcessData"
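    As an editorial aside, the quoted error says the DataSource element may contain only whitespace text nodes, so any stray text inside it (such as the dangling "-->" visible after the Timeout element above) will fail XMLA validation when the DDL task reads the variable. Below is a minimal, hedged sketch of populating the variable from a Script Task instead of a SQL query; the variable name is hypothetical and the out-of-line DataSource/DataSourceView from the post is omitted for brevity.

    ' VB.NET Script Task sketch: place inside the Script Task's ScriptMain class.
    ' The Execute DDL Task (SourceType = Variable) then reads User::xmlaProcessAdd.
    Public Sub Main()
        ' Build the ProcessAdd command as a plain string; keep only elements and
        ' whitespace inside DataSource if you append one - no stray text such as "-->".
        Dim xmla As String = "<Batch xmlns=""http://schemas.microsoft.com/analysisservices/2003/engine"">"
        xmla &= "<Parallel><Process>"
        xmla &= "<Object><DatabaseID>IIS_Version2</DatabaseID><DimensionID>Application</DimensionID></Object>"
        xmla &= "<Type>ProcessAdd</Type>"
        xmla &= "</Process></Parallel></Batch>"

        Dts.Variables("User::xmlaProcessAdd").Value = xmla
        Dts.TaskResult = ScriptResults.Success
    End Sub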

  • MSSQL2005 Analysis Service Distinct Count

    hi,
    I am currently trying to build a distinct count on my cube (MSSQL 2005 Analysis Services).
    But after I added the distinct count on the field I want and started the processing, the following errors appear.
    - Errors in the OLAP storage engine: The sort order specified for distinct count records is incorrect.
    - Errors in the OLAP storage engine: An error occurred while processing the 'FACT VIEW STATISTIC' partition of the 'FACT VIEW  STATISTIC 1' measure group for the 'Accident Statistic' cube from the OLAP_PROJECT database.
    The count measure works fine.
    I will appreciate any help on this distinct count problem.
    thanks in advance.
    HY

    I also received this error:
    "Errors in the OLAP storage engine: The sort order specified for distinct count records is incorrect. "
    Running SQL Server 2005 SP2 Enterprise Edition
    The collation between SQL Server and Analysis Services was the same.
    The distinct count was on a character data type.
    There were no NULLs in the data.
    The cube was processing fine until new data was added.
    After some investigation into the data, it seems the culprit was one row whose value was 13 characters long in the distinct count column; everything else was less than 13 characters (see results below). Updating this one row solved the problem. The exact value of the data is: '1-4296-175-9'
    Here is a result set:
    select len(columnname) as data_length, count(*) as count
    from [tablename]
    group by len(columnname)
    order by data_length
    data_length   count
    2    3
    5    1
    6    3
    7    2
    9    1
    10    856
    13    1
    My question, though, is: if SQL 2005 can do distinct counts on strings, why does it choke on one row with an extra length?

  • SAP HANA as a data source for Analysis Services

    I tried to use SAP's .net provider for HANA as a data source for an Analysis Services cube but to no avail.   I can connect, create a named query and preview data from the data source view designer but when SSAS actually tries to run the query,
    it wraps the named query in another query and produces a syntax error.  
    For example,  if the named query is "select * from ENTHANA.MARA", 
    SSAS sends a query like this "SELECT [test].* FROM (select * from ENTHANA.MARA) AS [test]"   (Test is the name of the named query).
    I know SSAS wraps queries and that's fine - except that brackets are not a valid way to quote an identifier in HANA.  HANA uses double quotes like Oracle. 
    Is there any setting in SSAS that can affect this behavior?   Will HANA ever be a fully supported data source for SSAS?  If so, when?

    Hi David,
    According to your description, you are creating a SQL Server Analysis Services project, and what you want is to use SAP HANA as the data source, right?
    SSAS supports many types of data sources. However, as you can see at the link below, the SAP HANA data source is not listed there, so this type of data source is not supported in the current version of SSAS. Microsoft will update that document when it is supported.
    http://msdn.microsoft.com/en-IN/library/ms175608.aspx
    Thank you for your understanding.
    Regards,
    Charlie Liao
    TechNet Community Support

  • What to configure in MS SQL 2008 Analysis Services

    Hello
    I have several cubes in MS Analysis Services 2008 that are available to users in Excel. I am trying to configure a universe in IDT (Edge 4.1) but cannot add it in IDT when I create a new OLAP connection (XMLA). I add the parameters (authentication, server, user name, password, language) but the cube selection appears blank. I followed the instructions in Information design tool 4.0: Create a connection to an OLAP data source.
    How should I configure Analysis Services to be able to connect to it from the IDT?
    Regards
    Hector

    Purchase SQL Server 2014 (or 2012) Developer Edition for around $50 (2008 may no longer be available).  Install it.  Also install AdventureWorks2012 sample database.
    SQL Server Management Studio is part of the client tools installation.  SSMS is your learning and later working environment.
    SSMS Object Explorer provides you access to all of the database objects.
    Practice T-SQL scripts:
    http://www.sqlusa.com/bestpractices/
    Kalman Toth Database & OLAP Architect
    SQL Server 2014 Design & Programming
    New Book / Kindle: Exam 70-461 Bootcamp: Querying Microsoft SQL Server 2012

  • Dynamic Parameters in MS Analysis Services and BPC

    Hello Experts !
    I need to execute a BPC Package, but in Microsoft Analysis Services I'm not able to have dynamic parameters, so I have to do it manually every time, because of the following problem when I try to execute it in Source Editor on Analysis Services.
    "Parameter cannot be extracted from the SQL Command. The provider might not help to parse parameter information from the commnand"
    The source is Sybase OLEDB
    Best Regards,
    Rodrigo

    Hi Rodrigo,
    I'm not sure I understood your question.
    I'm referring to this: "but in Microsoft Analysis Services I'm not able to have dynamic parameters"
    What do you mean?
    Are we speaking about tasks to process cubes or partitions in SSAS (SQL Server Analysis Services)?
    Or are you speaking about tasks from SSIS (SQL Server Integration Services)?
    In SSIS you can build tasks that accept parameters.
    You can also have tasks in SSIS that perform actions in SSAS.
    The MODIFYSCRIPT variable is used to send parameters to tasks.
    If you clarify your question, I hope I will be able to provide more details.
    Kind Regards
    Sorin Radulescu
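    As an editorial aside, one common SSIS workaround when an OLE DB provider cannot parse "?" parameters is to build the source SQL in a package variable and point the data flow source at "SQL command from variable". A minimal, hedged VB.NET Script Task sketch follows; the variable names and the query are hypothetical placeholders, not taken from the original BPC package.

    ' Place inside the Script Task's ScriptMain class, with both variables
    ' listed in ReadOnlyVariables/ReadWriteVariables as appropriate.
    Public Sub Main()
        ' Read the dynamic parameter value supplied to the package.
        Dim year As String = Dts.Variables("User::SelectedYear").Value.ToString()

        ' Compose the source query so the provider never has to parse parameters.
        Dts.Variables("User::SourceQuery").Value = _
            "SELECT ACCOUNT, ENTITY, SIGNEDDATA FROM FACT_FINANCE WHERE TIME_YEAR = '" & year & "'"

        Dts.TaskResult = ScriptResults.Success
    End Sub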

  • Can BO Universe extracted data to a SQL Server Analysis Services Cube

    Hello,
    I have a situation where all the business rules and aggregations are defined in the BO Universe layer; however, they now want to create cubes in SSAS (SQL Server Analysis Services) and build dashboards using Microsoft technology.
    I am not sure whether data and business rules, including hierarchies, can be extracted from the BO universe to SSAS. Any thoughts and suggestions?
    AP

    Hi,
    This is not provided out of the box but you can develop a custom application using Designer SDK.
    Didier

  • SQL Server Analysis Services (PowerPivot) instance returned the following error:

    SQL Server Analysis Services (PowerPivot) instance returned the following error: Error Code: 3241607174 Description: A connection corresponding to the embedded PowerPivot data was not found in the
    Excel workbook, file 'C:\Program Files\Microsoft SQL Server\MSAS10_50.POWERPIVOT\OLAP\Backup\Sandboxes\PowerPivotServiceApplication1\ReadOnlyExclusive-1096b00b-c487-46f9-afea-f717148863a8-Test_Sheet.xlsx'
    Hi, I have installed PowerPivot into an existing standalone farm.
    What should the connection in the Excel file be?
    Thanks
    Ravi

    Hi,
    According to your post, my understanding is that you got an error with SQL Server Analysis Services.
    Have you installed PowerPivot for Excel on a client machine and modeled a database as part of this workbook?
    The refresh in question is about refreshing the embedded Analysis Services database with data retrieved from the backend sources (defined when the database was originally modeled) and resaving the updated workbook. It should not be confused
    with Excel / Excel Services options to refresh the data connections in the workbook. That option is about refreshing data stored in Excel’s caches. With regards to PowerPivot, that would be refreshing from the embedded database, not the backend data from which
    the database was sourced. In summary, this feature will not work against a workbook which does not have an embedded database because there is no database to refresh. This is by design and should not be interpreted as indicating that there is something wrong
    with your system.
    More information:
    http://powerpivotgeek.com/2010/02/07/a-connection-corresponding-to-the-embedded-powerpivot-data-was-not-found-in-the-excel-workbook/
    There are some similar articles for your reference.
    http://technet.microsoft.com/en-us/library/ee210712(v=sql.105).aspx
    http://support.microsoft.com/kb/2761246
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/a324436c-9901-494e-9f9b-314399d65ccb/data-refresh-dont-work-for-serverhealthxlsx
    For the error “The data connection uses Windows Authentication and user credentials could not be delegated. The following connections failed to refresh”.
    There are multiple causes for this error message. The common factor behind all of them is that Excel Services cannot get a valid Windows user identity from a claims token in SharePoint.
    For more information:
    http://technet.microsoft.com/en-us/library/ff487975.aspx
    http://mmman.itgroove.net/2013/03/the-data-connection-uses-windows-authentication-and-user-credentials-could-not-be-delegated/
    Thanks & Regards,
    Jason
    Jason Guo
    TechNet Community Support

  • SSAS 2012 Tabular. Analysis Services Query Log not working

    Hello.
    Is it possible to configure the Analysis Services Query Log on an SSAS 2012 Tabular (11.0.3321) server, or does this feature work only in Multidimensional server mode?
    It is not a problem for me to configure the Query Log on a Multidimensional server, but when I try to do this on a Tabular server the result is always the same: the S_OlapQueryLog table is created but no data appears in it.
    This problem exists on our production server and is fully reproduced in a test environment. It was reproduced with SQL 2012 RTM, CU4, SP1 and SP1 CU1.
    I tried to configure the Analysis Services Query Log using a SQL account, and later using Windows authentication. The accounts were db owners; later I used accounts with sysadmin rights. The results were the same.
    In msmdsrv.log I can find only related events like this:
    (12/7/2012 9:22:59 PM) Message: The query log was started. (Source:
    \\?\P:\Olap\Log\msmdsrv.log, Type: 1, Category: 289, Event ID: 0x41210003)
    (12/7/2012 9:25:03 PM) Message: The query log was stopped. (Source:
    \\?\P:\Olap\Log\msmdsrv.log, Type: 1, Category: 289, Event ID: 0x41210004)
    I'm completely stuck with this functionality, so if someone has been luckier than I have, please write to me.
    Regards
    Audrius

    I agree with Gerhard's comments, but there are a couple of additional points.
    - I am not 100 percent sure if queries that are answered by the SE cache are recorded in the query log
    Queries answered by the SE cache are definitely not recorded in the QueryLog. So depending on the types of queries and the design of your cube, you could possibly miss a large proportion of the actual end-user queries.
    The QueryLog records QuerySubcube events, and there are zero to many of these generated for a given end-user query (zero if the query can be answered from cache). The optimizer may also choose to pre-fetch a wider range of data or to break a single range into a few smaller requests, so it is not a true indication of the actual query that the end user generated.
    Doing a trace is the only way to catch the actual queries submitted by end users.
    http://darren.gosbell.com - please mark correct answers
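    As an editorial aside, the query-log settings discussed in this thread can also be read or set programmatically through AMO rather than the SSMS Server Properties dialog. The sketch below is hedged: the instance name, connection string and sampling value are placeholders, and (per the reply above) the log is fed by QuerySubcube events, so it may stay empty on a Tabular-mode instance regardless of these settings.

    ' Minimal AMO sketch (VB.NET): inspect/set the msmdsrv query-log properties.
    Imports Microsoft.AnalysisServices

    Module QueryLogConfig
        Sub Main()
            Dim srv As New Server()
            srv.Connect("localhost\TABULAR")   ' placeholder instance name

            ' The same properties SSMS shows under Server Properties when
            ' "Show Advanced (All) Properties" is checked.
            srv.ServerProperties("Log\QueryLog\QueryLogConnectionString").Value = _
                "Provider=SQLNCLI11;Data Source=.;Integrated Security=SSPI;Initial Catalog=OlapLog"
            srv.ServerProperties("Log\QueryLog\QueryLogTableName").Value = "S_OlapQueryLog"
            srv.ServerProperties("Log\QueryLog\CreateQueryLogTable").Value = "true"
            srv.ServerProperties("Log\QueryLog\QueryLogSampling").Value = "10"

            srv.Update()        ' persist the property changes on the server
            srv.Disconnect()
        End Sub
    End Module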

  • Analysis Service Database missing post reboot

    Hi
    After a recycle of the Analysis Services service we found that one of the databases had gone missing, so we tried attaching it from the local folder. But it is failing with the error below:
    The detach log '\\?\E:\SalesCube_Monthly.0.db\SalesCube_Monthly.detach_log' could not be found in the specified database folder.
    The database cannot be attached because an error occurred while loading the detach log from the file '\\?\E:\SalesCube_Monthly.0.db\SalesCube_Monthly.detach_log'. One possible reason is that database is already attached in ReadWrite mode to another server instance.
    As we did not detach the database, the detach_log file was never created. Is there any other way we can attach or recover this database?

    Hi kalisubbu1,
    Please follow the steps suggested in this related thread
    http://social.msdn.microsoft.com/Forums/en-US/sqlanalysisservices/thread/231ba420-d8da-4945-b1e7-afc185128bb4  to see if it works for you.
    You could try stopping the SSAS service on the new server, copying both the <database>.db folder and the <database>.db.xml file to the data folder on the new server and then restarting the SSAS service.
    This should work as long as this database is not what is causing the other server not to be able to start and that you do not have any remote partitions or data stored outside of the data folder.
    Regards,
    Jerry
    TechNet Subscriber Support in forum
    If you have any feedback on our support, please contact
     [email protected]

  • SSDT Issues in VS2010 - Analysis Services Project

    Hi,
    I have installed SQL Server 2012 RC0 and CTP4 of SSDT on my laptop with an existing VS2010 Premium SP1 installation. I have opened an Analysis Services project and connected to a cube developed on SQL Server 2008 R2 and restored into the 2012 RC0 environment.
    When I try to open the DSV of the Analysis Services project I get the following error:
    ===================================
    An error prevented the view from loading. (Microsoft Visual Studio)
    ===================================
    The specified module could not be found. (Exception from HRESULT: 0x8007007E) (System.Windows.Forms)
    Program Location:
    at System.Windows.Forms.UnsafeNativeMethods.CoCreateInstance(Guid& clsid, Object punkOuter, Int32 context, Guid& iid)
    at System.Windows.Forms.AxHost.CreateWithLicense(String license, Guid clsid)
    at System.Windows.Forms.AxHost.CreateInstanceCore(Guid clsid)
    at System.Windows.Forms.AxHost.CreateInstance()
    at System.Windows.Forms.AxHost.GetOcxCreate()
    at System.Windows.Forms.AxHost.TransitionUpTo(Int32 state)
    at System.Windows.Forms.AxHost.CreateHandle()
    at System.Windows.Forms.Control.CreateControl(Boolean fIgnoreVisible)
    at System.Windows.Forms.Control.CreateControl()
    at Microsoft.DataWarehouse.Design.ComponentDiagram.CreateDdsView(Control parentControl)
    at Microsoft.AnalysisServices.Design.DataSourceDesignerView..ctor(DataSourceDesigner designer, DataSourceDiagram diagram, IComponent diagramOwnerComponent)
    at Microsoft.AnalysisServices.Design.DataSourceDesignerView..ctor(DataSourceDesigner designer)
    at Microsoft.AnalysisServices.Design.DataSourceDesigner.CreateDataSourceDesignerView(VsStyleToolBar pageViewToolBar)
    at Microsoft.DataWarehouse.Design.EditorWindow.EnsureViewIsLoaded(EditorView view)
    To date I have not been able to find a solution, which is frustrating. Any ideas how I can correct this issue?
    Regards,
    Alan

    The error message that was being returned was:
    An error prevented the view from loading. (Microsoft Visual Studio)
    ===================================
    An error occurred while attempting to start the report preview worker process. (Microsoft.ReportingServices.Designer)
    Given that this scenario worked with previous editions of Reporting Services and BI Development Studio, the error was unexpected.  On further investigation, the same behavior and error could be reproduced by launching a Remote Desktop session in Seamless
    or Remote Applications Integrated Locally (RAIL) mode.
    RAIL extends the RDP protocol to present a remote application running on a RAIL server as a local user application running on the client machine. This causes the remote application to appear as if it is running on the user's local computer rather
    than being presented in the desktop of the remote server.
    In this scenario, the behavior and the error message are both by design.
    SSDT is a multi-targeted assembly that is built around the .NET 4.0 Framework. Business Intelligence Development Studio (BIDS) is built around the .NET 3.5 Framework and there are differences in the way that the .NET 4.0 Framework deals with child processes
    when the parent process dies. As a result, there was a deliberate decision to change the behavior of the report viewer in preview mode. This is largely because under the .NET 4.0 Code Access Security policy, executing under the current AppDomain is no
    longer supported, necessitating use of a sandbox. You may wish to read Brian Hartman's blog posting "Expression Evaluation in Local Mode" for a more detailed discussion of Visual Studio 2010 defaulting to the .NET 4.0 CAS policy. The net result of
    the change is that when a report is previewed, SSDT spawns a child process to allow previewing of the report. In the RDP using Seamless or RAIL mode, it would still be necessary for SSDT to spin up a new instance of the Reporting Services Preview Processing
    Host (PreviewProcessingService.exe) separate from the SSDT process. In the event that the user were to close SSDT or SSDT were to crash, it would be possible for orphaned instances of the Reporting Services Preview Processing Host application to remain in
    memory and continue execution after SSDT was terminated.
    The change makes it cleaner to close SSDT and eliminates the possibility of orphaned child processes in the event that the SSDT environment is terminated for some reason. Had the change not been made, it would have been possible to allow orphaned
    child applications/processes to continue to execute, using CPU cycles and consuming server memory, until the user logged off the machine. If the user didn’t log off and rather elected to simply disconnect without logging off, any orphaned applications would
    have remained in memory for an indefinite period of time. In the scenario where a machine is being remotely accessed by multiple users, that could potentially mean thousands of orphaned applications would be in memory and actively executing at any given
    time.

  • Analysis Services 2012 hangs up

    Hi everybody,
    I am using SQL Server 2012 SP2 on a server with a 16-core CPU and 128 GB of memory.
    Both SQL Server and Analysis Services are installed on the same server.
    SQL Server is allocated 60 GB of memory, and the Analysis Services instance properties are set to default values.
    I have a cube which is accessed by multiple applications (Excel pivot tables, Java applications).
    The cube works fine when a small number of users connect to it, but it starts struggling when multiple users connect and the server doesn't respond.
    From the Java application, when one user runs the tool it fires around 2,000 MDX queries of a similar nature with only the WHERE condition changed. Similarly, during peak load up to 20 users are expected to use the Java application and fire queries against the cube, so roughly 20 * 2,000 queries in total.
    The same MDX queries, when run from SSMS, do not take much time; they respond in milliseconds when checked in the MDX editor.
    Other details:
    The cube is partitioned into 36 months, and we have created usage-based optimization (UBO) aggregations for the queries fired most often against the Analysis database. The cube is scheduled to process every 15 minutes.
    Please share me your thoughts.
    Below are the performance counters captured:
    Thanks
    Prasanna KJ
    Praxy

    Hi Praxy,
    Thank you for your question. 
    I am trying to involve someone more familiar with this topic for a further look at this issue. Some delay might be expected while the job is transferred. Your patience is greatly appreciated.
    Thank you for your understanding and support.
    Regards,
    Charlie Liao
    TechNet Community Support

  • Pass parameter to analysis services report to cube (Not filter)

    Hi, I am trying to pass a parameter from a report to a subreport. Both reports are built on Analysis Services cubes.
    I can pass the parameter value to a FILTER parameter, but I cannot pass it directly to parameters used in the cube query. Is it only possible to use the passed parameter as a filter?
    My subreport is a lot slower when I have to return everything and then filter it in Reporting Services, rather than returning only what I need.

    Hi Darkdushy,
    As you said, there are two ways to filter data using parameters. The first is to add a filter to the dataset or other report items. The second is to use the parameter in the query, whether the database is multidimensional or relational. Here is a sample
    query for your reference.
    SELECT
        { [Measures].[Internet Sales Amount] } ON COLUMNS,
        { [Date].[Date].MEMBERS } ON ROWS
    FROM (
        SELECT
            STRTOMEMBER("[Date].[Date].&[" + @StartDate + "]") :
            STRTOMEMBER("[Date].[Date].&[" + @EndDate + "]") ON COLUMNS
        FROM [Adventure Works]
    )
    Reference: Define Parameters in the MDX Query Designer for Analysis Services (Report Builder and SSRS)
    Regards,
    Charlie Liao
    TechNet Community Support

  • UNX configurations for MS Analysis Services is only through XMLA!!??

    In BOBJ 4.0, I just wanted to confirm that the UNX configuration for MS Analysis Services is only through XMLA.
    Is there any way BOBJ 4.0 can be configured to use the MSAS OLE DB for OLAP provider?
    One of our clients does not want to go through XMLA connectivity to their MSAS cubes; this is possible through the Universe Design Tool but not with IDT.
    Can anyone please help with this?

    UNX - XMLA
    UNV - OLEDB
    Those are the choices.
