Analysis Services Execute DDL Task

Hi,
I want to use the Analysis Services Execute DDL Task to create a cube via an XMLA script.
The cube may already exist, so I want to check whether it exists before running the task; otherwise the task will fail.
What would be the best way to perform this "if exists" check?
Dani

You can use the Alter element, as in this example:
<Alter ObjectExpansion="ObjectProperties" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>Adventure Works DW Multidimensional 2012</DatabaseID>
    <DataSourceID>AdventureWorksDW2012</DataSourceID>
  </Object>
  <ObjectDefinition>
    <DataSource xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:type="RelationalDataSource">
      <ID>AdventureWorksDW2012</ID>
      <Name>AdventureWorksDW2012</Name>
      <ConnectionString>Data Source=fr-dwk-02;Initial Catalog=AdventureWorksDW2008R2;Integrated Security=True</ConnectionString>
      <ManagedProvider>System.Data.SqlClient</ManagedProvider>
      <Timeout>PT30S</Timeout>
    </DataSource>
  </ObjectDefinition>
</Alter>
More info on the Alter element is available on MSDN:
http://technet.microsoft.com/en-us/library/ms186630.aspx
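If you would rather perform an explicit existence check than rely on Alter, you can put a Script Task in front of the Execute DDL task, test for the database/cube with AMO, and set a Boolean package variable that a precedence constraint (or an expression choosing between Create and Alter scripts) then evaluates. A minimal C# sketch, assuming a Script Task with a reference to Microsoft.AnalysisServices (AMO) and hypothetical variable and object names (User::ASServerName, User::ASDatabaseName, User::CubeExists, "MyCube"):

using Microsoft.AnalysisServices;   // AMO reference added to the Script Task project

public void Main()
{
    // Hypothetical package variables; adjust the names to your package.
    string serverName = (string)Dts.Variables["User::ASServerName"].Value;
    string databaseName = (string)Dts.Variables["User::ASDatabaseName"].Value;

    Server server = new Server();
    server.Connect(serverName);

    // FindByName returns null when the object does not exist.
    Database db = server.Databases.FindByName(databaseName);
    bool cubeExists = (db != null) && (db.Cubes.FindByName("MyCube") != null);   // "MyCube" is a placeholder

    Dts.Variables["User::CubeExists"].Value = cubeExists;
    server.Disconnect();

    Dts.TaskResult = (int)ScriptResults.Success;
}

The precedence constraint from the Script Task to the Execute DDL task can then use an expression such as @[User::CubeExists] == false for the Create path.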

Similar Messages

  • Analysis Services Execute DDL Task Internal error

    Hi all,
I need help solving this sporadic problem... The dimension below (blanked out as xxxx) is based on a view. The cube processes fine most of the time, but fails abruptly about once a week. It is being called from a SQL job.
Any ideas? Thanks in advance!
Error: 2010-06-28 15:45:17.69   Code: 0xC1000007   Source: xxxxxxx Analysis Services Execute DDL Task   Description: Internal error: The operation terminated unsuccessfully.  End Error
Error: 2010-06-28 15:45:17.69   Code: 0xC11F000D   Source: xxxxxxx Analysis Services Execute DDL Task   Description: Errors in the OLAP storage engine: An error occurred while the 'xxxxx' attribute of the 'xxxxx' dimension from the 'xxxxxx' database was being processed.  End Error
Error: 2010-06-28 15:45:17.69   Code: 0xC11F0006   Source: xxxxxxx Analysis Services Execute DDL Task   Description: Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation.  End Error
Error: 2010-06-28 15:45:17.69   Code: 0xC11C0002   Source: xxxxxx Analysis Services Execute DDL Task   Description: Server: The operation has been cancelled.  End Error
DTExec: The package execution returned DTSER_FAILURE (1).  Started: 3:45:00 PM  Finished: 3:45:17 PM  Elapsed: 17.328 seconds.  The package execution failed.  The step failed.
    Harsh B

    (From http://msdn.microsoft.com/en-us/library/cc966526.aspx)
    "ExternalCommandTimeout is a server property that is used to set the number of seconds that SSAS should wait to time out when issuing commands to external data sources, such as relational and other OLAP sources."
    "ExternalConnectionTimeout is a server property that is used to set the number of seconds, by default, that SSAS should wait to time out when connecting to external data sources, such as relational and other OLAP sources."
What you set them to is entirely up to you. But as I see it in your log, your task failed after 17 and 9 seconds, so even if you multiply the values by 10, I suspect the situation won't change. Still, it is worth checking at least.
So... how often does your fact table get new (degenerate) dimension reference values? Is it a ProcessUpdate or a ProcessFull command? I would really like to see it :)
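If you want to check or adjust those two server properties from code rather than SSMS, AMO exposes them through Server.ServerProperties. A small sketch under that assumption; the instance name and the 3600-second value are placeholders:

using System;
using Microsoft.AnalysisServices;   // AMO

class AdjustSsasTimeouts
{
    static void Main()
    {
        Server server = new Server();
        server.Connect("localhost");   // placeholder instance name

        // Both are documented SSAS server properties; values are in seconds.
        ServerProperty cmdTimeout  = server.ServerProperties["ExternalCommandTimeout"];
        ServerProperty connTimeout = server.ServerProperties["ExternalConnectionTimeout"];
        Console.WriteLine("ExternalCommandTimeout = {0}", cmdTimeout.Value);
        Console.WriteLine("ExternalConnectionTimeout = {0}", connTimeout.Value);

        // Example only: raise the command timeout before a long-running ProcessFull.
        cmdTimeout.Value = "3600";
        server.Update();               // persist the changed property on the server

        server.Disconnect();
    }
}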
    -- Zoltán Horváth
    -- MCITP SQL Server Business Intelligence Developer 2005, 2008
    -- Please mark posts as answered or helpful where appropriate.

  • Analysis Service Execute DDL Task throwing error with SourceType Variable

    Hi,
I have configured an Analysis Services Execute DDL Task to use a Variable as its source and to process data with the XMLA script below.
When I execute this task, I get the error message below:
    [Analysis Services Execute DDL Task] Error: The -->
    text node at line 23, column 3 cannot appear inside the DataSource element (namespace http://schemas.microsoft.com/analysisservices/2003/engine) under Envelope/Body/Execute/Command/Batch/Parallel/Process. This element can
    only have text nodes containing white-space characters.
Can anyone please let me know how to resolve this?

If I run with the SourceType "Direct Input", the Analysis Services Execute DDL Task runs fine, but if I use the SourceType "Variable" it throws the error. Below are my package layout and the XMLA script; it fails at the "ProcessAdd" Analysis Services Execute DDL Task:
    "SELECT '<Batch xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">
    <ErrorConfiguration xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:ddl2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2\" xmlns:ddl2_2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2/2\" xmlns:ddl100_100=\"http://schemas.microsoft.com/analysisservices/2008/engine/100/100\" xmlns:ddl200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200\" xmlns:ddl200_200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200/200\" xmlns:ddl300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300\" xmlns:ddl300_300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300/300\" xmlns:ddl400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400\" xmlns:ddl400_400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400/400\">
    <KeyNotFound>IgnoreError</KeyNotFound>
    </ErrorConfiguration>
    <Parallel>
    <Process xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:ddl2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2\" xmlns:ddl2_2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2/2\" xmlns:ddl100_100=\"http://schemas.microsoft.com/analysisservices/2008/engine/100/100\" xmlns:ddl200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200\" xmlns:ddl200_200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200/200\" xmlns:ddl300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300\" xmlns:ddl300_300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300/300\" xmlns:ddl400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400\" xmlns:ddl400_400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400/400\">
    <Object>
    <DatabaseID>IIS_Version2</DatabaseID>
    <DimensionID>Application</DimensionID>
    </Object>
    <Type>ProcessAdd</Type>
    <DataSource xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:ddl2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2\" xmlns:ddl2_2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2/2\" xmlns:ddl100_100=\"http://schemas.microsoft.com/analysisservices/2008/engine/100/100\" xmlns:ddl200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200\" xmlns:ddl200_200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200/200\" xmlns:ddl300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300\" xmlns:ddl300_300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300/300\" xmlns:ddl400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400\" xmlns:ddl400_400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400/400\" xmlns:dwd=\"http://schemas.microsoft.com/DataWarehouse/Designer/1.0\" xsi:type=\"RelationalDataSource\" dwd:design-time-name=\"1a3cb292-9bce-4c59-a182-177d6b3506ff\" xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">
    <ID>IISDW</ID>
    <Name>IISDW</Name>
    <ConnectionString>Provider=SQLNCLI11.1;Data Source=CO1MSFTSQLHKT02;Integrated Security=SSPI;Initial Catalog=IISDW</ConnectionString>
    <Timeout>PT0S</Timeout>-->
    </DataSource>
    <DataSourceView xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:ddl2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2\" xmlns:ddl2_2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2/2\" xmlns:ddl100_100=\"http://schemas.microsoft.com/analysisservices/2008/engine/100/100\" xmlns:ddl200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200\" xmlns:ddl200_200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200/200\" xmlns:ddl300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300\" xmlns:ddl300_300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300/300\" xmlns:ddl400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400\" xmlns:ddl400_400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400/400\" xmlns:dwd=\"http://schemas.microsoft.com/DataWarehouse/Designer/1.0\" dwd:design-time-name=\"b0b61205-c64d-4e34-afae-6d4d48b93fb3\" xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">
    <ID>IISDW</ID>
    <Name>IISDW</Name>
    <DataSourceID>IISDW</DataSourceID>
    <Schema>
    <xs:schema id=\"IISDW_x0020_1\" xmlns=\"\" xmlns:xs=\"http://www.w3.org/2001/XMLSchema\" xmlns:msdata=\"urn:schemas-microsoft-com:xml-msdata\" xmlns:msprop=\"urn:schemas-microsoft-com:xml-msprop\">
    <xs:element name=\"IISDW_x0020_1\" msdata:IsDataSet=\"true\" msdata:UseCurrentLocale=\"true\" msprop:design-time-name=\"72037318-e316-469d-9a45-a10c77709b39\">
    <xs:complexType>
    <xs:choice minOccurs=\"0\" maxOccurs=\"unbounded\">
    <xs:element name=\"Application\" msprop:design-time-name=\"7f579e7e-e8b7-4a9d-8a93-a255fccbbfbe\" msprop:IsLogical=\"True\" msprop:FriendlyName=\"Application\" msprop:DbTableName=\"Application\" msprop:TableType=\"View\" msprop:Description=\"\" msprop:QueryDefinition=\"SELECT a.Application, DATEADD([hour], DATEDIFF([hour], 0, a.[Timestamp]), 0) AS [Timestamp], a.ServerName, CAST(a.ServerName AS char(3)) AS DataCenter, a.CS_URI_Stem, CAST(HashBytes(''MD5'', &#xD;&#xA; a.Application + a.ServerName + a.CS_URI_Stem + CAST(DATEADD([hour], DATEDIFF([hour], 0, a.[Timestamp]), 0) AS varchar(24))) AS uniqueidentifier) AS Server_URI_Identity&#xD;&#xA;FROM IIS_6_OLD AS a LEFT OUTER JOIN&#xD;&#xA; Dimension_Pointer AS b ON a.Application = b.Application&#xD;&#xA;WHERE (b.ProcessedFlag = 0) AND (a.Application IN ("+(DT_WSTR,100) @[User::strDistinctApplication]+"))&#xD;&#xA;GROUP BY a.Application, DATEADD([hour], DATEDIFF([hour], 0, a.[Timestamp]), 0), a.ServerName, a.CS_URI_Stem\" msprop:QueryBuilder=\"SpecificQueryBuilder\">
    <xs:complexType>
    <xs:sequence>
    <xs:element name=\"Application\" msprop:design-time-name=\"f3074e98-4a82-4bc5-a818-916203f7758b\" msprop:DbColumnName=\"Application\" msprop:FriendlyName=\"Application\" minOccurs=\"0\">
    <xs:simpleType>
    <xs:restriction base=\"xs:string\">
    <xs:maxLength value=\"255\" />
    </xs:restriction>
    </xs:simpleType>
    </xs:element>
    <xs:element name=\"Timestamp\" msdata:ReadOnly=\"true\" msprop:design-time-name=\"2662e3a8-8b1a-4d77-aecb-575329b84dc1\" msprop:DbColumnName=\"Timestamp\" msprop:FriendlyName=\"Timestamp\" type=\"xs:dateTime\" />
    <xs:element name=\"ServerName\" msprop:design-time-name=\"ced26d49-cd6e-4073-a40c-ff5ef70e4ef1\" msprop:DbColumnName=\"ServerName\" msprop:FriendlyName=\"ServerName\" minOccurs=\"0\">
    <xs:simpleType>
    <xs:restriction base=\"xs:string\">
    <xs:maxLength value=\"255\" />
    </xs:restriction>
    </xs:simpleType>
    </xs:element>
    <xs:element name=\"DataCenter\" msdata:ReadOnly=\"true\" msprop:design-time-name=\"4583e15a-dcf1-45a2-a30b-bd142ca8b778\" msprop:DbColumnName=\"DataCenter\" msprop:FriendlyName=\"DataCenter\" minOccurs=\"0\">
    <xs:simpleType>
    <xs:restriction base=\"xs:string\">
    <xs:maxLength value=\"3\" />
    </xs:restriction>
    </xs:simpleType>
    </xs:element>
    <xs:element name=\"CS_URI_Stem\" msprop:design-time-name=\"10db5a79-8d50-49d2-9376-a3b4d19864b9\" msprop:DbColumnName=\"CS_URI_Stem\" msprop:FriendlyName=\"CS_URI_Stem\" minOccurs=\"0\">
    <xs:simpleType>
    <xs:restriction base=\"xs:string\">
    <xs:maxLength value=\"4000\" />
    </xs:restriction>
    </xs:simpleType>
    </xs:element>
    <xs:element name=\"Server_URI_Identity\" msdata:DataType=\"System.Guid, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089\" msdata:ReadOnly=\"true\" msprop:design-time-name=\"018ede0a-e15e-47c2-851b-f4431e8c839c\" msprop:DbColumnName=\"Server_URI_Identity\" msprop:FriendlyName=\"Server_URI_Identity\" type=\"xs:string\" />
    </xs:sequence>
    </xs:complexType>
    </xs:element>
    </xs:choice>
    </xs:complexType>
    <xs:unique name=\"Constraint1\" msprop:IsLogical=\"True\" msdata:PrimaryKey=\"true\">
    <xs:selector xpath=\".//Application\" />
    <xs:field xpath=\"Server_URI_Identity\" />
    <xs:field xpath=\"Timestamp\" />
    </xs:unique>
    </xs:element>
    </xs:schema>
    <IISDW_x0020_1 xmlns=\"\" />
    </Schema>
    </DataSourceView>
    <WriteBackTableCreation>UseExisting</WriteBackTableCreation>
    </Process>
    </Parallel>
    </Batch>' as XMLAScript_ProcessData"

  • SSAS cube sync between two servers using analysis service execute DDL task in SSIS

    hi folks:
I have two AS servers: TOBIDW and TOBIDW2. TOBIDW2 is the production server that users connect to with Excel. TOBIDW has been identified as a backup cube server in case TOBIDW2 is not available. My job is to create an SSIS package that syncs the cube database from TOBIDW2 to TOBIDW every day (one way).
The standard SSIS dev process for us is to develop in the dev environment and deploy to the production environment. The XMLA script was extracted using the Synchronize wizard of SSMS on the production cube server. Inside this XMLA, the connection string uses Integrated Security=SSPI.
When deploying to production, what username/password should I use to replace the integrated security?
Thanks
<Synchronize xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Source>
    <ConnectionString>Provider=MSOLAP.5;Data Source=TOBIDW2;Integrated Security=SSPI;Initial Catalog=ORDERDW</ConnectionString>
    <Object>
      <DatabaseID>ORDERDW</DatabaseID>
    </Object>
  </Source>
  <SynchronizeSecurity>CopyAll</SynchronizeSecurity>
  <ApplyCompression>true</ApplyCompression>
</Synchronize>
    --Currently using Reporting Service 2000; Visual Studio .NET 2003; Visual Source Safe SSIS 2008 SSAS 2008, SVN --

Hi cat_ca,
You don't need to replace Integrated Security=SSPI with a Windows username and password. Integrated Security means the connection uses the Windows account that launches the process (the dtexec.exe that runs your SSIS package) to connect to the server TOBIDW2 defined in your connection string.
Actually, for an SSAS connection string you don't normally need to specify a user name and password, because SSAS only accepts Windows Authentication and Integrated Security takes care of it automatically via SSPI. The only case where you need to enter a user name and password (a Windows user) manually is when you have configured an HTTP pump in front of your SSAS server and use http://servername/ to access it. This is because IIS can be configured for Basic Authentication, which requires explicitly supplying a Windows user name and password. In that case your connection string should look similar to this:
Provider=MSOLAP;Data Source=http://serverName/;Initial Catalog=myDataBase;
User Id=domain\user;Password=myPassword;
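For what it's worth, a connection string like that can be tested from a small ADOMD.NET console program before wiring it into a package. This is only a sketch; the URL (typically the msmdpump.dll endpoint), catalog and credentials are placeholders for whatever your HTTP pump setup uses:

using System;
using Microsoft.AnalysisServices.AdomdClient;   // ADOMD.NET client

class HttpPumpConnectionTest
{
    static void Main()
    {
        // Placeholder values; with Basic Authentication the Windows credentials
        // go directly into the connection string.
        string connStr = "Data Source=http://serverName/olap/msmdpump.dll;" +
                         "Initial Catalog=myDataBase;" +
                         @"User ID=domain\user;Password=myPassword;";

        using (AdomdConnection conn = new AdomdConnection(connStr))
        {
            conn.Open();   // IIS performs Basic Authentication with the supplied credentials
            Console.WriteLine("Connected, server version: " + conn.ServerVersion);
        }
    }
}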

  • Analysis Services Processing Task - Error processing

    Hi all,
I have an SSIS package containing an Analysis Services Processing Task. If it fails, I want to insert the error messages into a table. I have created an SSIS package that fails because of dimension values that are not in the fact table. The package fails and shows a number of errors in the debug window:
Error: 0xC1000007 at Create Cube, Analysis Services Execute DDL Task: Internal error: The operation terminated unsuccessfully.
Warning: 0x811F0001 at Create Cube, Analysis Services Execute DDL Task: Errors in the OLAP storage engine:
The attribute key cannot be found when processing: Table: 'dbo_SSIS_DATA', Column: 'PRODCODE', Value: 'HUME'. The attribute is 'PRODCODE'.
    SSIS package "Step 2 - Create Cube.dtsx" finished: Failure.
However, the Error List window shows 0 errors and 0 warnings.
So why doesn't an error message show up in the Error List when the package clearly fails? And how can I get the error messages from the log into a table? Thanks in advance.
    Hugo

The Error List does not reflect what gets logged; it is more for package validation.
The same errors will, however, show up in the status and in the Output window.
Furthermore, if your log is table-bound (e.g. the sysssislog table) and you chose to log the OnError event in the logging configuration, then these errors will be captured automatically.
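Once the SQL Server log provider is enabled with the OnError event selected, the messages land in the log table and can be pulled into your own table or report with a plain query. A sketch, assuming SSIS 2008's dbo.sysssislog table and a hypothetical log database name:

using System;
using System.Data.SqlClient;

class ReadSsisOnErrorLog
{
    static void Main()
    {
        // "MyLogDb" is a placeholder for the database the SQL Server log provider writes to.
        string connStr = "Data Source=.;Initial Catalog=MyLogDb;Integrated Security=SSPI;";

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT starttime, source, message FROM dbo.sysssislog " +
            "WHERE event = 'OnError' ORDER BY starttime DESC;", conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine("{0}  {1}: {2}",
                        reader.GetDateTime(0), reader.GetString(1), reader.GetString(2));
            }
        }
    }
}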
    Arthur
    MyBlog
    Twitter

  • [Forum FAQ] How do I send multiple rows returned by Execute SQL Task as Email content in SQL Server Integration Services?

    Question:
There is a scenario where users want to send multiple rows returned by an Execute SQL Task as email content. With the Execute SQL Task, the Full result set option is used when the query returns multiple rows; it must be mapped to a variable of the Object data type, and the returned result is a rowset object, so we cannot send the result variable directly as email content. Is there a way to extract the table row values that are stored in the Object variable and send them as email content?
    Answer:
To achieve this requirement, we can use a Foreach Loop container to extract the table row values stored in the Object variable into package variables, then use a Script Task to append the data stored in those package variables to a single variable, and finally set that variable as the MessageSource of the Send Mail Task.
Add four variables to the package as below:
Double-click the Execute SQL Task to open the Execute SQL Task Editor, then change the ResultSet property to "Full result set". Assume a SQL statement like the following:
    SELECT   Category, CntRecords
    FROM         [table_name]
In the Result Set pane, add a result as below (please note that we must use 0 as the result set name when the result set type is Full result set):
Drag a Foreach Loop Container onto the design surface and connect it to the Execute SQL Task.
Double-click the Foreach Loop Container to open the Foreach Loop Editor; on the Collection tab, change the Enumerator to Foreach ADO Enumerator, then select User::result as the ADO object source variable.
On the Variable Mappings pane, add two variables as below:
Drag a Script Task into the Foreach Loop Container.
The following C# code can be used in a Script Task in SSIS 2008 and above:
public void Main()
{
    // TODO: Add your code here
    Variables varCollection = null;
    string message = string.Empty;
    Dts.VariableDispenser.LockForWrite("User::Message");
    Dts.VariableDispenser.LockForWrite("User::Category");
    Dts.VariableDispenser.LockForWrite("User::CntRecords");
    Dts.VariableDispenser.GetVariables(ref varCollection);

    // Format the query result with tab delimiters
    message = string.Format("{0}\t{1}\n",
                            varCollection["User::Category"].Value,
                            varCollection["User::CntRecords"].Value);

    // Append the current row to the running Message variable
    varCollection["User::Message"].Value = (string)varCollection["User::Message"].Value + message;

    Dts.TaskResult = (int)ScriptResults.Success;
}
The following VB code can be used in a Script Task in SSIS 2005 and above; please note that in SSIS 2005 you should set the PrecompileScriptIntoBinaryCode property to False and the Run64BitRuntime property to False:
    Public Sub Main()
            ' Add your code here
            Dim varCollection As Variables = Nothing
            Dim message As String = String.Empty
            Dts.VariableDispenser.LockForWrite("User::Message")
            Dts.VariableDispenser.LockForWrite("User::Category")
            Dts.VariableDispenser.LockForWrite("User::CntRecords")
            Dts.VariableDispenser.GetVariables(varCollection)
            'Format the query result with tab delimiters
            message = String.Format("{0}" & vbTab & "{1}" & vbLf, varCollection("User::Category").Value, varCollection("User::CntRecords").Value)
            varCollection("User::Message").Value = DirectCast(varCollection("User::Message").Value,String) + message
            Dts.TaskResult = ScriptResults.Success
    End Sub
Drag a Send Mail Task onto the Control Flow pane and connect it to the Foreach Loop Container.
Double-click the Send Mail Task to specify the appropriate settings; then, on the Expressions tab, use the Message variable as the MessageSource property as below:
The final design surface looks like this:
    References:
    Result Sets in the Execute SQL Task
    Applies to:
    Integration Services 2005
    Integration Services 2008
    Integration Services 2008 R2
    Integration Services 2012
    Integration Services 2014
    Please click to vote if the post helps you. This can be beneficial to other community members reading the thread.

    Thanks,
    Is this a supported scenario, or does it use unsupported features?
    For example, can we call exec [ReportServer].dbo.AddEvent @EventType='TimedSubscription', @EventData='b64ce7ec-d598-45cd-bbc2-ea202e0c129d'
    in a supported way?
    Thanks! Josh

  • SQL Server 2005 agent job runs a SSIS package ( Analysis Services Processing Task) fails

Hi,
SQL Server 2005 Standard Edition.
I have an SSIS package which has an Analysis Services Processing Task. I have tested the package in BIDS and it runs OK. But when I create an agent job and run the package from the job, it reports this error:
Code: 0xC0012024     Source: Analysis Services Processing Task     Description: The task "Analysis Services Processing Task" cannot run on this edition of Integration Services. It requires a higher level edition.
    This is the result of select @@version
    Microsoft SQL Server 2005 - 9.00.4035.00 (Intel X86)   Nov 24 2008 13:01:59   Copyright (c) 1988-2005 Microsoft Corporation  Standard Edition on Windows NT 5.2 (Build 3790: Service Pack 1) 
    Any idea?

    Anyway, I have found a work around:
    http://technet.microsoft.com/en-us/library/ff929186.aspx

  • Does analysis service process task honor dynamic connections?

Simple question: I have an SSIS package which contains an Analysis Services Processing Task. It went well in the dev environment. Now I've deployed to production and the connection manager is set by configuration tables; however, it seems that the connection manager on production is still pointing to the dev environment. Any workaround?
thanks
    --Currently using Reporting Service 2000; Visual Studio .NET 2003; Visual Source Safe SSIS 2008 SSAS 2008, SVN --

    Hi cat_ca,
To achieve your goal, we can combine an Environment Variable configuration with a SQL Server configuration. In short, we create an Environment Variable configuration to define the connection string for the Connection Manager that the SQL Server configuration uses to connect to the configuration table. To create such an Environment Variable configuration, we need to create a system environment variable on each server to hold the connection string for the corresponding connection manager, such as:
Data Source=ServerName;Initial Catalog=TestDB;Provider=SQLNCLI11.1;Integrated Security=SSPI;Auto Translate=False;
Besides, we need to make sure the Environment Variable configuration is listed above the SQL Server configuration in the Package Configurations Organizer.
    Reference:
    http://amenjonathan.wordpress.com/2010/12/13/creating-a-solid-ssis-etl-solution-part-2-%E2%80%93-package-configurations/ 
    Regards,
    Mike Yin
    If you have any feedback on our support, please click here
    Mike Yin
    TechNet Community Support

  • SSIS Analysis Services Processing Task - which object is being processed?

    Hi
    I've just fallen down the hole dug by a gotcha in the SSIS Analysis Services Processing task (let's call it the ASPT as it's too long to type).
    I was doing testing yesterday, targeting a test database.  The way things are set up here is that we have two structurally-identical databases (let's call them SSASLive and SSASTest) on the same SSAS server.
    The SSIS package includes an Analysis Services Processing task that processes a partition of a measure group in a cube.
    So I changed the properties of the AS Connection Manager in the package to point to database SSASTest.  Then, because I'm paranoid, and don't trust the Editor for the ASPT, I edited the object to be processed, by deleting the existing object from the
    list and re-adding it (going down the tree: database SSASTest, cube TheCube, measure group TheMassiveMeasureGroup, partition CurrentMonth).
    This is an annoying necessity, because: the ASPT editor doesn't allow you to see exactly what the object(s) in the list to be processed are, by e.g. double-clicking on them.  All you can see is the lowest-level object name, which could be just
    "Current Month".  Current Month what?  There's no way of telling.
    Test worked fine.  I made sure the AS Connection manager was reset to point to SSASLive, and deployed the package.
    This morning, the package ran, and processed the partition in database
    SSASTest.  What happened?  Looking in the Code view of the package, I found the XMLA behind the Processing task: the database name is literally specified in the XMLA, and completely ignores the Catalog of the AS connection.  This information
    is not exposed anywhere in the UI view of the package.
It's a bit like the dangerous annoyance when testing SSIS package Execute SQL tasks where a previous developer has insisted on fully qualifying object names: SELECT * FROM LiveDatabase.dbo.ATable. You're working with a connection to a test database, but it makes no difference...
But at least that problem is clearly visible. I know that specifying the database name is normal in XMLA tasks, but at least the SSIS editor should allow you to see what the setting is.

    No, the point is that the Connection Manager's Initial Catalog setting (whether hard-coded or set by any of the configuration methods) has no effect on the database addressed by the Analysis Services Processing task.
    Steps to reproduce:
1. Create an Analysis Services database containing a cube (or select an existing one): let's call it ASDB1.
2. Make a copy of it on the same server, e.g. by backing up and restoring: let's call it ASDB_Copy.
3. Set up an SSIS package with a Connection Manager pointing to ASDB1.
4. Create an Analysis Services Processing task, using the Connection Manager, processing any particular SSAS object.
5. Run the package. The SSAS object in ASDB1 is processed (not the object in ASDB_Copy). (Confirm by checking the Last Processed date in SSMS, Properties.)
6. Change the Connection Manager's Initial Catalog setting to point to ASDB_Copy.
7. Run the package. The SSAS object in ASDB1 is processed, not the object in ASDB_Copy. The Connection Manager's Initial Catalog setting has no effect on the AS Processing Task. Examining the Code view of the package confirms that the XMLA behind the task specifies the database as well as the AS object (and this database setting is still ASDB1).
    The fix would be for SSIS to perform validation of AS objects referenced by the AS Processing Task whenever its Connection manager is pointed to a different database, and to update the XMLA accordingly.
    Also, it would be useful to be able to see exactly what the objects listed in the Processing Task are.  The information shown is just not sufficient to identify the object to the developer.  I may have dozens of measure groups with a partition
    called Current Month - but "Current Month" is all I can see in the editor, and there's no way to e.g. double-click on an item in the list and get the full object identification (Cube X, measure group Y, partition Current Month).  Sure,
    I can go and look at the XMLA, but that's fiddly: if there's going to be a wrapper around the XMLA (which is what the AS Processing Task is), then it should be unambiguous.
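One way to keep the target database configurable is to drop the Processing task and do the processing from a Script Task with AMO, taking the server and database names from package variables that your configurations already control. A sketch only, with hypothetical variable names and the object names from the example above:

using Microsoft.AnalysisServices;   // AMO

public void Main()
{
    // Hypothetical variables, set through your package configurations.
    string serverName   = (string)Dts.Variables["User::ASServer"].Value;
    string databaseName = (string)Dts.Variables["User::ASDatabase"].Value;

    Server server = new Server();
    server.Connect(serverName);

    // GetByName throws if the name is not found, which surfaces misconfiguration early.
    Database db = server.Databases.GetByName(databaseName);
    Partition partition = db.Cubes.GetByName("TheCube")
                            .MeasureGroups.GetByName("TheMassiveMeasureGroup")
                            .Partitions.GetByName("CurrentMonth");

    partition.Process(ProcessType.ProcessFull);   // processes the partition in whichever database was configured

    server.Disconnect();
    Dts.TaskResult = (int)ScriptResults.Success;
}

Because the database is resolved at run time from the variable, pointing the configuration at SSASTest or SSASLive actually changes what gets processed.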

  • [Execute SQL Task] Error: Executing the query "DECLARE_@XMLA nvarchar(3000) ,__@DateSerial nvarch..." failed with the following error: "Incorrect syntax near '-'.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly,

    Hi
    DECLARE @XMLA nvarchar(3000)
    , @DateSerial nvarchar(35);
    -- Change date to format YYYYMMDDHHMMSS
    SET @DateSerial = CAST(GETDATE() AS DATE);
    --SELECT @DateSerial
    Set @XMLA = 
    N' <Batch xmlns="http://schemas.microsoft.com/analysis services/2003/engine">
     <ErrorConfiguration xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ddl2="http://schemas.microsoft.com/analysisservices/2003/engine/2"
    xmlns:ddl2_2="http://schemas.microsoft.com/analysisservices/2003/engine/2/2" xmlns:ddl100_100="http://schemas.microsoft.com/analysisservices/2008/engine/100/100" xmlns:ddl200="http://schemas.microsoft.com/analysisservices/2010/engine/200"
    xmlns:ddl200_200="http://schemas.microsoft.com/analysisservices/2010/engine/200/200">
    <KeyErrorLimit>-1</KeyErrorLimit>
    <KeyNotFound>IgnoreError</KeyNotFound>
    <NullKeyNotAllowed>IgnoreError</NullKeyNotAllowed>
     </ErrorConfiguration>
     <Parallel>
    <Process xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ddl2="http://schemas.microsoft.com/analysisservices/2003/engine/2" xmlns:ddl2_2="http://schemas.microsoft.com/analysisservices/2003/engine/2/2"
    xmlns:ddl100_100="http://schemas.microsoft.com/analysisservices/2008/engine/100/100" xmlns:ddl200="http://schemas.microsoft.com/analysisservices/2010/engine/200" xmlns:ddl200_200="http://schemas.microsoft.com/analysisservices/2010/engine/200/200"
    xmlns:ddl300="http://schemas.microsoft.com/analysisservices/2011/engine/300" xmlns:ddl300_300="http://schemas.microsoft.com/analysisservices/2011/engine/300/300">
     <Object>
     <DatabaseID>MultidimensionalProject5</DatabaseID>
     <CubeID>giri</CubeID>
     <MeasureGroupID>Fact Internet Sales</MeasureGroupID>
     </Object>
     <Type>ProcessFull</Type>
     <WriteBackTableCreation>UseExisting</WriteBackTableCreation>
     </Process>
      </Parallel>
    </Batch>';
    EXEC (@XMLA) At SHALL-PCAdventureWorksDw ;
I am executing the query and I get the error below:
      [Execute SQL Task] Error: Executing the query "DECLARE
    @XMLA nvarchar(3000)
    , @DateSerial nvarch..." failed with the following error: "Incorrect syntax near '-'.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set
    correctly, or connection not established correctly. 
How can I solve this error?
Please help me.

What are you trying to do? What sort of data source is SHALL-PCAdventureWorksDw?
When you use EXEC() AT, I would expect to see a SQL string passed to EXEC(), but you are passing an XML string.
If you explain why you think this would work in the first place, maybe we can help you.
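If the goal is simply to run that XMLA batch against the Analysis Services instance, one alternative to pushing it through EXEC() AT a linked server is to send it directly with AMO's Server.Execute. A sketch; the instance name and file path are assumptions, since the post doesn't make them clear:

using System;
using System.IO;
using Microsoft.AnalysisServices;   // AMO

class RunXmlaBatch
{
    static void Main()
    {
        // Placeholder: load (or build) the XMLA <Batch> shown above.
        string xmla = File.ReadAllText("ProcessFactInternetSales.xmla");

        Server server = new Server();
        server.Connect("SHALL-PC");   // placeholder SSAS instance name

        // Execute returns one result per command; check Messages for errors and warnings.
        XmlaResultCollection results = server.Execute(xmla);
        foreach (XmlaResult result in results)
            foreach (XmlaMessage message in result.Messages)
                Console.WriteLine(message.Description);

        server.Disconnect();
    }
}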
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Execute SQL Tasks Failing for Duplicate Syntax Between DEV and Production DB

Newbie here... be patient with me!
I added tasks to refresh two tables (delete from, insert into, update) to an SSIS project. I have them running from the Windows XP scheduler. The issue:
In dev, the tasks integrate and execute successfully from the scheduler.
In prod, I can right-click and execute each of the three tasks without any problem, but the same tasks cause my project to fail when executed from the scheduler.
Questions:
Any ideas about what I am failing to see?
How do I get meaningful log messages from the tasks that are failing?
Thanks for your ideas...
    Installed Edition: IDE Standard
    SQL Server Analysis Services  
    Microsoft SQL Server Analysis Services Designer
    Version 9.00.1399.00
    SQL Server Integration Services  
    Microsoft SQL Server Integration Services Designer
    Version 9.00.1399.00
    SQL Server Reporting Services  
    Microsoft SQL Server Reporting Services Designers
    Version 9.00.1399.00
    Microsoft Visual Studio 2005
    Version 8.0.50727.42  (RTM.050727-4200)
    Microsoft .NET Framework
    Version 2.0.50727

If I understand correctly, you are trying to run an SSIS package from the Windows XP Task Scheduler. The failure may be due to permissions when accessing resources required by the package; however, it is difficult to say without looking at the error message.
Scheduled Tasks maintains a log file (Schedlgu.txt) in the C:\Windows folder. You can view the log from the Scheduled Tasks window by clicking View Log on the Advanced menu.
The log file size is 32 kilobytes (KB); when the file reaches its maximum size, it automatically starts recording new information at the beginning of the log file, overwriting the old information.
Refer to:
http://support.microsoft.com/kb/308558
http://msdn.microsoft.com/en-us/library/windows/desktop/aa383604(v=vs.85).aspx
    Regards, RSingh

  • Dynamic Parameters in MS Analysis Services and BPC

Hello Experts!
I need to execute a BPC package, but in Microsoft Analysis Services I'm not able to use dynamic parameters, so I have to run it manually every time because of the following problem when I try to execute it in the Source Editor in Analysis Services:
"Parameter cannot be extracted from the SQL command. The provider might not help to parse parameter information from the command."
The source is Sybase OLE DB.
    Best Regards,
    Rodrigo

    Hi Rodrigo,
I'm not sure I understood your question.
I'm referring to this: "but in Microsoft Analysis Services I'm not able to have dynamic parameters".
What do you mean?
Are we speaking about tasks that process cubes or partitions of SSAS (SQL Server Analysis Services)?
Are you speaking about tasks from SSIS (SQL Server Integration Services)?
In SSIS you can build tasks that accept parameters.
You can also have tasks in SSIS which perform actions in SSAS.
The MODIFYSCRIPT variable is used to send parameters to tasks.
If you clarify your question, I hope I will be able to provide more details.
    Kind Regards
    Sorin Radulescu

  • [Forum FAQ] How do I create calculated measure using AMO in SQL Server Analysis Services?

    Introduction
In SQL Server Analysis Services (SSAS), you can create a calculated measure in SQL Server Data Tools (SSDT)/Business Intelligence Development Studio (BIDS). Sometimes you may need to create a calculated measure by using AMO in a C# or VB project.
In this article, I will demonstrate how to create a calculated measure using AMO in SSAS.
    Prerequisites
Before creating a calculated measure using AMO, you need to ensure that the following components are installed on your server:
The multidimensional database AdventureWorks Multidimensional Model 2012
A SQL Server with SSIS and SSAS installed
The AMO libraries:
x86 package (SQL_AS_AMO.msi)
x64 package (SQL_AS_AMO.msi)
    Solution
Here are the detailed steps to create a calculated measure using AMO in SSAS.
Open SSDT and create a new SSIS project.
Drag a Script Task onto the design surface.
Click SSIS -> Variables to open the Variables window and add two variables that are used to connect to the server and database.
Create a connection to the SSAS server.
Rename the connection to ssas.
Double-click the Script Task to open the Script Task Editor.
Add the Connection and Database variables to the ReadWriteVariables textbox, then click the Edit Script button.
Add an AMO reference in the Solution Explorer window.
Copy the script below and paste it into the script:
Dim objServer As Server
Dim objDatabase As Database
Dim strDataBaseID As String
Dim objCube As Cube
Dim objMdxScript As MdxScript
Dim objCommand As Command
Dim strCommand As String

' Connect to the SSAS server and open the target database
objServer = New Server
objServer.Connect("localhost")
objDatabase = objServer.Databases("AdventureWorksDW2012Multidimensional-EE2")
strDataBaseID = objDatabase.ID

If objDatabase.Cubes.Count > 0 Then
    objCube = objDatabase.Cubes("Adventure Works")

    ' Reuse the cube's MDX script if it exists, otherwise create one
    If objCube.MdxScripts.Count > 0 Then
        objMdxScript = objCube.MdxScripts("MdxScript")
        objMdxScript = objCube.MdxScripts(0)
    Else
        objCube.MdxScripts.Add("MdxScript", "MdxScript")
        objMdxScript = objCube.MdxScripts("MdxScript")
    End If

    ' Build the CREATE MEMBER statement for the calculated measure
    objCommand = New Command
    strCommand = "CREATE MEMBER CURRENTCUBE.[Measures].[Multipy Measures By 3]"
    strCommand = strCommand & " AS [Measures].[Internet Sales Amount] * 3, "
    strCommand = strCommand & " VISIBLE = 1 ; "
    objCommand.Text = strCommand

    ' Add the command to the MDX script and push the change to the server
    objMdxScript.Commands.Add(objCommand)
    objMdxScript.Update()
    objCube.Update()
End If

objServer.Disconnect()
    Then you can run this SSIS package to create the calculated measure.
    Applies to
    Microsoft SQL Server 2005
    Microsoft SQL Server 2008
    Microsoft SQL Server 2008 R2
    Microsoft SQL Server 2012
    Please click to vote if the post helps you. This can be beneficial to other community members reading the thread.

    Thanks,
    Is this a supported scenario, or does it use unsupported features?
    For example, can we call exec [ReportServer].dbo.AddEvent @EventType='TimedSubscription', @EventData='b64ce7ec-d598-45cd-bbc2-ea202e0c129d'
    in a supported way?
    Thanks! Josh

  • SSIS Package Load Failure and is missing Execute SQL Task in Toolbox

    Also, BIDS closes after right-clicking the toolbox and selecting “Choose Items”.
    I’ve tried several ways to fix these issues with no success.
    I’m running SSIS/SSMS/SQL Server 2008 R2 (but not running an SQL instance on my PC) on a Windows XP with Service Pack 3 workstation. I get the following errors:
    WHEN OPENING BIDS:
    I get 2 errors titled “Package Load Failure”. They both reference ‘Visual Studio Explorers and Designers Package’ as being what failed to load. The first message references GUID 8D8529D3-625D-4496-8354-3DAD630ECC1B. The second references GUID AA612177-A69A-4391-B2F4-17F8A88F4BBA
    . They both mention contacting the package vendor, but I haven’t got any Add-ins loaded so I’m not sure what that means exactly. They also both ask if I’d like to skip loading the package in the future.
    ONCE BIDS IS OPEN:
    No error messages appear here. There is no ‘Execute SQL Task’ in the toolbox for Control Flows. Also, if I attempt to open a project containing this object (Execute SQL Task), I get this error in a modal window:
    TITLE: Microsoft Visual Studio
    There were errors while the package was being loaded.
    The package might be corrupted.
    See the Error List for details.
    BUTTONS:
    OK
    . . . and these errors in the Error List at the bottom of the IDE:
Error 1: Error loading Package.dtsx: Failed to load task "Execute SQL Task", type "Microsoft.SqlServer.Dts.Tasks.ExecuteSQLTask.ExecuteSQLTask, Microsoft.SqlServer.SQLTask, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91". The contact information for this task is "Execute SQL Task; Microsoft Corporation; Microsoft SQL Server 2008 R2; © 2007 Microsoft Corporation; All Rights Reserved;http://www.microsoft.com/sql/support/default.asp;1".  (C:\Documents and Settings\848862.VCHS_NT_DOMAIN\My Documents\Visual Studio 2008\Projects\LearnSSIS\LearnSSIS\Package.dtsx)
Error 2: Validation error. Execute SQL Task: The task has failed to load. The contact information for this task is "Execute SQL Task; Microsoft Corporation; Microsoft SQL Server 2008 R2; © 2007 Microsoft Corporation; All Rights Reserved;http://www.microsoft.com/sql/support/default.asp;1".  (Package.dtsx)
Error 3: Validation error. Execute SQL Task: There were errors during task validation.  (Package.dtsx)
    Finally, when I attempt to add the Execute SQL Task via the Tools menu, “Choose Toolbox Items” selection, I do get the tabbed box for adding things. However, they only come up after issuing this error:
    The following assemblies are installed SDK assemblies but could not be shown in the customize toolbox dialog because they are missing one or more components. Please make sure that all necessary libraries are available:
    Microsoft.SqlServer.SqlCEDest.dll
    Microsoft.SqlServer.Management.CollectorTasks.dll
    Microsoft.SqlServer.SQLTask.dll
    System.Data.SqlServerCe.dll
    Microsoft.Synchronization.Data.SqlServerCe.dll
    (above dll repeated at bottom of message)
After this message, I have my choice of .NET Framework Components. When I select COM Components, those come up with no problem. The WPF Components tab generates the same DLL error above. The Maintenance tab is empty. The SSIS Data Flow Items tab loads fine, as does the SSIS Control Flow Items tab, but there's no Execute SQL Task there either.
    If you open the toolbox (on a new project, with a project loaded, or without a project loaded) and right-click, then select “Choose Items”, the entire application quickly closes. This “closing behavior” is the same thing you get if you attempt to add toolbox
    items through either the Tools menu or the right-click method when you launch BIDS in SafeMode.
    In an effort to fix this – I’ve rolled back the .NET Framework version to 3.0 and reinstalled back up to 4; I’ve uninstalled and reinstalled SQL Server 2008 R2 shared services as well as the entire application; I’ve cleared the .tbd files from my user profile;
    and I’ve run several things from the command line (devenv.exe /resetuserdata, /setup, /resetskippkgs, etc.). Lastly – I’ve applied every Service Pack and update applicable to my PC (ok, it belongs to my employer – but no one here has been able to figure this
    out either).
    After having worked this for several days now and, I think, exhausting every available avenue from search engines – I turn to you. Please let me know if you can help. Also, I can provide any additional
    files or information you may require.
    CBS

    try: Devenv.exe /ResetSettings
    http://msdn.microsoft.com/en-us/library/ms241273(v=vs.80).aspx
Or try this (it's for 2005, but maybe it works for 2008 as well):
    http://geekswithblogs.net/cicorias/archive/2007/10/10/Restoring-Missing-Toolbox-Items-in-Visual-Studio-2005---just.aspx
    or try
    http://stackoverflow.com/questions/1268298/how-to-rebuild-the-visual-studio-toolbox
    "You can also go into the folder "C:\Documents and Settings\\Local Settings\Application Data\Microsoft\VisualStudio\9.0" and delete the *.tbd files.  I have had to do this a couple times to get rid of duplicate items and when Reset Toolbox crashes
    without warning"
    Please mark the post as answered if it answers your question | My SSIS Blog:
    http://microsoft-ssis.blogspot.com |
    Twitter

  • Permanently change default error configuration in Analysis Services 2005

    Hi,
    Currently, I am working on a BPC 5.1 application.  The data for this application is loaded(inserted via SQL statement) right to the FACT table and then a full process is run for that cube via an SSIS package using the Analysis Services Processing Task.  Often records are loaded this way where a dimension member for some of the records has not been added to the Account dimension yet.  These records after loading are considered 'orphan records' until the accounts are added to the account dimension.
    This loading process is used because of the volume of records loaded(over 2 million at a time) and the timing of the company's business process.  They will receive data sometimes weeks before the account dimension is updated in BPC with the new dimension members.
    If I try and process the application from the BPC Administration area with these orphan records in the FACT table, the processing stops and an error displays.  Then when I process the cube from Analysis services, an error is displayed telling me that orphan data was found.
    A temporary work-around is to go into the cube properties in Analysis Services 2005, click on Error Configuration, uncheck 'Use default error configuration' and select 'Ignore errors'. Then you can process the application from BPC's Administration page successfully.  But, the problem is that after processing the application successfully, the Analysis Services Error Configuration automatically switches back from 'Ignore errors' to 'Use default error configuration'.
    Does anyone have any suggestions on how to permanently keep the 'Ignore errors' configuration selected so it does not automatically switch back to 'Use default error configuration'?  Prior to BPC 5.0 this was not occurring.
    Also, does anyone know why this was changed in BPC 5.0/5.1?
    Thanks,
    Glenn

    Hi Glenn,
I understand the problem, but I would say it comes from a bad migration of the appset from 4.2 to 5.0.
In any case, they are using a DTS package to import data into the fact table. That means they have to add another step to that package to verify the records before inserting them into the fact table. The verifications can be done using the same mechanism as our standard import: just edit that package and add similar steps to the customer package.
Note that you need somebody with experience developing DTS packages for BPC to avoid other problems.
One of the big benefits of 5.X compared with 4.2 is that we are able to use the optimization schema and aggregations for cubes. With those orphan records it is not possible to use the optimization schema for cubes, and you are not able to create good aggregations for your cube.
So my suggestion is to provide all of this information to the customer and try to modify that package, rather than enabling that option, which can cause many other issues.
Sorin
