Analysis Services cubes - Utilisation report: how to extract resource and team assignment information

Hi,
I am using Excel 2013 and have connected to an external data source: the Analysis Services OLAP cube that Project Server 2013 builds.
The cube I am using is Portfolio Analyser.
Firstly, I would like to categorise the resources by the team they belong to, e.g. Project Management, Technical Services, Delivery.
I have created a PowerPivot table and would like to group each user by team and then by resource name, capacity, work and availability, with time across the top, so I can see yearly, monthly and weekly utilisation.
How can I get the team that each resource is assigned to from this cube?
Thanks. 

Hi Guillaume,
I think the field I require is listed in the Enterprise Custom Fields and Lookup Tables section of PWA; however, I believe it was a default field that came pre-configured out of the box after install. The exact field I require is located under PWA Settings
> Security > Manage Users > (select a user) > Team Details > Team Name.
As you can see from the description next to Team Name, there are instructions advising how to populate the team names for your organisation, so it was a pre-configured field.
Team Details    
Team Details are optional and are used to define team membership and the team resource that represents a team. Before you set these options use "Server Settings"/"Enterprise Custom Fields and Lookup Tables" to create a lookup table that
contains your team names, and edit the "Team Name" resource custom field to use this lookup table.
Team Name is used to indicate team membership - each resource in a team will have the same value for Team Name. 
The Team Assignment Pool check box is selected for the team resource, used when assigning tasks to the team. Often a generic resource will be used with the assignment owner field set as the team manager.
So I have entered all of my organisation's team names in the lookup table for the Team custom field.
I have assigned each resource to one of the team names.
Now I would like to create a PowerPivot table and incorporate the resource's team, so that I can create a utilisation report for each team within my organisation.
After receiving the link from you above, I have gone to Central Admin > Service Applications > Project Server Service Application > OLAP Database Management > Configuration of the cube > Cube Dimensions > Resource, and
I have added "Team_Resource" to the selected dimensions list. Then I ran a "Build Now".
When I go back to my PowerPivot Excel sheet and refresh the data I can see the new field; however, it is not the field that I need.
Can you confirm which field I would need to add?
Is this a multi-value field?
Appreciate your assistance.
Thanks
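For reference, what I am ultimately after would correspond to an MDX query roughly like the one below, assuming the Team custom field ends up exposed as its own hierarchy on the resource dimension after the cube rebuild. All of the dimension, hierarchy, measure and member names here are guesses based on a default Portfolio Analyser cube, so they would need to be checked against what actually appears in the PivotTable field list.

    SELECT
        { [Measures].[Capacity], [Measures].[Work], [Measures].[Availability] } ON COLUMNS,
        NON EMPTY
            [Resource List].[Team].[Team].MEMBERS *
            [Resource List].[Resource List].[Resource List].MEMBERS ON ROWS
    FROM [Portfolio Analyzer]
    // The year slicer is only an example; swap in whichever Time level is needed
    WHERE ( [Time].[Year].&[2015] )

In the PivotTable itself the equivalent layout would be the team field above the resource name on rows, the Time hierarchy on columns, and Capacity, Work and Availability as values.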

Similar Messages

  • Error when trying to view reports or manually processing the TFS data warehouse and analysis services cube

    Hello Guys,
    I am trying to configure reporting for TFS using SQL Server, but I get the following error when viewing any report (see screenshot).
    So I tried to manually process the cube to check whether it works. I am following this article: https://msdn.microsoft.com/en-us/library/ff400237.aspx
    When I click on GetProcessingStatus and invoke it (with the last field set to TRUE) I get the following error (see screenshot).
    Please advise how to resolve this issue so that I can see the reports.

    I have managed to resolve this issue. Note that, for the purposes of this question and answer, mydomain\tfsadmin is a generic user (used to install all the software) in a proof-of-concept environment, for test purposes.
    The issue was that during installation of SQL Analysis Services I had given the username mydomain\tfsadmin (a generic user for testing) as the Analysis Services administrator, instead of the 'Domain admin' group.
    Despite that, I managed to resolve it.
    Steps:
    1. Make sure that the user (mydomain\tfsadmin) is a member of Analysis Server -> TFS_Analysis db -> Roles -> TFSWarehouseAdministrator and TFSWarehouseDataReader. (This actually happens automatically when you run the TFS Admin Console, configure
    reporting, and provide the username that will access the Analysis db; in my case the user is 'mydomain\tfsadmin'.)
    2. It is bad practice to manually process the cube (you can do it to make sure that there are no errors, but only after completing the following steps, up to step 5).
    3. Also make sure that NT AUTHORITY\NETWORK SERVICE is a member of the Analysis db -> Roles -> TFSWarehouseAdministrator;
    this resolves the error which appears in the 2nd screenshot in the question.
    4. Then you can right-click on the Analysis db and run Process. If this throws the error shown in my reply above, you need to follow https://msdn.microsoft.com/en-us/library/vstudio/ff400237.aspx:
    a. Browse to http://localhost:8080/tfs/TeamFoundation/Administration/v3.0/WarehouseControlService.asmx
    b. Choose ProcessWarehouse and run it.
    c. Choose ProcessAnalysisDatabase, type 'Full' and run it.
    d. Choose GetProcessingStatus, enter 'TRUE' in the last field and run it.
    e. I don't get any errors at this point.
    5. Now you can connect to the Analysis Server via SQL Server Management Studio, right-click on the Analysis db (TFS_Analysis) and click Process. It all works fine.
    6. Now you can browse to the report URL (to get this URL you can open Team Explorer 2013, connect to your team project, click 'Reports' on the right-hand side, then click 'Go to Site').
    DONE.

  • Can BO Universe data be extracted to a SQL Server Analysis Services cube

    Hello,
    I have a situation where all the business rules and aggregations are defined in the BO Universe layer; however, they now want to create cubes in SSAS (SQL Server Analysis Services) and build dashboards using Microsoft technology.
    I am not sure whether data and business rules, including hierarchies, can be extracted from the BO Universe to SSAS. Any thoughts and suggestions?
    AP

    Hi,
    This is not provided out of the box but you can develop a custom application using Designer SDK.
    Didier

  • MS Excel cannot read the Roles definition on Analysis Service cube

    Hello,
    While using an Analysis Services connection via Excel, it apparently does not automatically provide Roles or EffectiveUserName in the connection string based on the username/password we provide; we must manually edit the connection string.
    Is this already a known bug, or is there another explanation? Is there anything related to this?
    Thanks.

    Hi Prayijana,
    In your scenario, you said that you need to manually edit the connection string for different users to connect to a SQL Server Analysis Services multidimensional database, right?
    When using Excel to interact with an SSAS cube, we can use the default setting "Use the authenticated user's account". In this case, Excel will use the current account to connect to the SSAS cube. If that account has a matching role on the database,
    then it can access the cube.
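    If you really do need to target a specific role or impersonate another account, those are ordinary connection-string properties rather than something Excel fills in for you. A minimal sketch (the server, database, account and role names below are placeholders, not values from your environment):
        Provider=MSOLAP;Data Source=MyServerName;Initial Catalog=MySSASDatabase;EffectiveUserName=CONTOSO\someuser;Roles=LimitedRole
    Note that EffectiveUserName only takes effect when the connecting account is an administrator on the SSAS instance, and Roles simply restricts the connection to the roles you list.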
    If this is not what you want, please provide us with more information so that we can make further analysis.
    Regards,
    Charlie Liao
    TechNet Community Support

  • String measures in Analysis services cube

    Hi,
    I have a requirement where I have to use strings as measures. Is this possible in Analysis Services 2000?
    Any help on this will be greatly appreciated.

    Hi,
     Darren Gosbell wrote:
    Have you seen this site? http://www.sqlserveranalysisservices.com
    It has a couple of links on the lower left that discuss various cell annotation approaches in more detail than the discussion in this thread.
    Thanks; following the article "More on cell annotation" I have been able to get string measures working.
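    For anyone else landing here: if all you need is a string value displayed next to numeric measures (rather than the full cell-annotation approach from that article), a calculated member that returns a string is another simple option. This is only a sketch, using names from the later Adventure Works sample purely as an illustration; the WITH MEMBER style of calculated member was already available back in Analysis Services 2000:
        WITH MEMBER [Measures].[Sales Band] AS
            IIF([Measures].[Internet Sales Amount] > 100000, "High", "Low")
        SELECT
            { [Measures].[Internet Sales Amount], [Measures].[Sales Band] } ON COLUMNS,
            [Date].[Calendar Year].[Calendar Year].MEMBERS ON ROWS
        FROM [Adventure Works]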
    a+, =)
    -=Clément=-

  • Cannot connect to SQL 2000 Analysis Services cube from 32-bit Excel 2010

    Excel 2010 32-bit is running on a Windows 7 64-bit PC. The user has a new PC, upgraded from XP; on the old XP machine there was no connection issue.
    Initial Error
    "The following system error occurred: No connection could be made because the target machine actively refused it"
    I installed Office web component 2003 as advised in forums.
    Next Error
    "initialization of the data source failed.  Check the database server or contact your database administrator. Make sure the external database is available, and then try the operation again. If you see this message again, create a new datasource
    to connect tot he database."
    I have trawled the forums for the past 2 days looking for a solution.  Any help would be appreciated.
    Niall

    Hello,
    Please check the link below; maybe it can help you:
    https://fawzi.wordpress.com/2013/05/15/sql-2012-analysis-services-error-no-connection-could-be-made-because-the-target-machine-actively-refused-it/

  • How to extract text and image information from postscript file

    I want to write a program to extract text and image information from a PostScript file using Java. Is this possible? How do I extract it?
    Thanks!

    First of all, PostScript is not a "text" file. It can and often does contain binary data. Since PostScript streams often contain nested procedures, unless you process the procedure definitions and can "execute" them, you cannot simply "scan" a file to get what you want. No, I can't talk about this in detail since it is quite complex. But Adobe does have the
    PostScript Language Reference Manual available on-line for download.
    Look that over and you will have a fairly healthy respect for the task involved.
    - Dov

  • Grid Report: How to get user-specific information

    Hello,
    We log in to Grid Control using a particular user ID and password. Is there any variable in Grid Control which will hold the value of the user ID that was entered?
    -Sai .

    What are you trying to achieve with it?

  • Pass parameter to analysis services report to cube (Not filter)

    Hi I am trying to pass a parameter from a report to a subreport. Both reports are built on Analysis Services cubes.
    I can pass the parameter value to a FILTER parameter, but I cannot pass it directly to the parameters used in the cube query. Is it only possible to use the passed parameter as a filter?
    My subreport is a lot slower when I have to return everything then filter it in Reporting Services, rather than only returning what I need.

    Hi Darkdushy,
    As you said, there are two ways to filter data using parameters. The first is to add a filter to the dataset or other report items. The second is to use the parameter in the query itself, whether the database is multidimensional or relational. Here is a sample
    query for your reference:
    select
        { [Measures].[Internet Sales Amount] } on columns,
        { [Date].[Date].members } on rows
    from
    (
        select
            STRTOMEMBER("[Date].[Date].&[" + @StartDate + "]") :
            STRTOMEMBER("[Date].[Date].&[" + @EndDate + "]") on columns
        from [Adventure Works]
    )
    Reference: Define Parameters in the MDX Query Designer for Analysis Services (Report Builder and SSRS)
    Regards,
    Charlie Liao
    TechNet Community Support

  • Reporting Services Cube Parameters not generated automatically

    Hi, I have a few reports which display data from an Analysis Services cube.
    For example, my report uses Dataset1 as the source of the report.
    When I modify Dataset1 by adding a new parameter in the Query Designer, a new (hidden) dataset is created, called CustomerAgeCustomerAge.
    But a new parameter on the report is not automatically generated. In Dataset1 - Dataset Properties \ Parameters there is an entry for the parameter which should be created, e.g. [@CustomerAgeCustomerAge]. I can work around this, but I'd like to find the
    cause. Is there a maximum number of parameters which are generated automatically? My report has 12.

    Hi darkdusky,
    Based on my understanding, when you add a new parameter in the dataset query, the parameter is not generated automatically in the report parameters.
    In Reporting Services, when we define a dataset query that contains a query variable, the query command is parsed, and for each query variable a corresponding dataset parameter and report parameter are created automatically. As we tested in our environment,
    even when we create thirteen parameters in the dataset query, the corresponding parameters automatically appear both in the report parameters and in the dataset parameters in the dataset properties.
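    For comparison, a dataset query of the kind that triggers this auto-generation looks like the sketch below. The cube, measure and dimension names are taken from the Adventure Works sample purely as an illustration; the point is simply that the query variable appears in the query text itself:
        SELECT NON EMPTY { [Measures].[Internet Sales Amount] } ON COLUMNS,
               NON EMPTY { [Customer].[Customer].[Customer].MEMBERS } ON ROWS
        FROM [Adventure Works]
        WHERE ( STRTOMEMBER(@CustomerAgeCustomerAge, CONSTRAINED) )
    When a query like this is parsed, both the dataset parameter and the matching report parameter should be created for @CustomerAgeCustomerAge.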
    So in your scenario, I would like to see your query with those parameters. If possible, please provide some screenshots of the results before and after you add a new parameter to the dataset query.
    If you have any question, please feel free to ask.
    Best regards,
    Qiuyun Yu

  • Connecting SQL Analysis Services 2005 cube with obiee 10g

    Hello experts,
    I'm trying to link a SQL Analysis Services 2005 cube with OBIEE 10g.
    In Oracle BI Administration, when I choose File -> Import -> From
    Multi-dimensional, I have to fill in the following fields:
    Provider Type: Analysis Services 2005
    URL: ***********
    User: blank
    Password: blank
    Before doing that, I need to create the URL path.
    To create the URL, I have followed this link:
    http://erpthings.blogspot.com/2008/08/obiee-connection-to-msas-2005-cube.html
    But it hasn't worked, because when configuring the HTTP
    access to SQL Server 2005 Analysis Services we have to perform a test to ensure that it has been configured correctly. In IIS -> Web Sites -> Default Web Site -> olap, we should browse the document called "msmdpump.dll" and this should show an XML document, but it doesn't.
    Thanks!

    Hi Alex
    It sounds like your MS data pump might not be configured correctly. First ensure that it is working before trying to put the values into OBIEE. Don't worry about the 500 error - I get that too instead of an XML page.
    You can read a bit more about SSAS support on my blog here:
    http://total-bi.com/2010/12/obiee-sql-server-analysis-services-cubes/
    Here's the link to Microsoft's description of setting up msmdpump.dll
    http://technet.microsoft.com/en-us/library/cc917711.aspx
    Paul

  • Refreshing the Data Source View in Analysis Services

    I have added columns to the SQL Database table that is used as a dimension in an Analysis Services Cube.  The new columns will be used as additional Property Fields for the dimension.  When I attempted to refresh the Data Source view so that the additional columns are present, I am given the following error:
    System.Data
    Property not accessible because 'Parent Columns and Child Columns don't have type-matching columns'
    I have done nothing to the columns used for the parent or child, and the error message gives nothing to go on. Does anyone have any ideas?
    Gary

    Olga,
    Thanks for your response. I will try to answer your questions:
    1) I have not tried removing the columns yet. I will try that this afternoon, but have limited hope. The two columns I added are simple text columns that will be used as attributes in the dimension. I have made no change to the parent or child columns.
    2) The table I modified is the source table for a parent-child dimension.
    3) The reference to the "check list" does not take me to any kind of check list.
    4) The parent-child dimensions I am trying to modify have been in use for months, and the parent and child columns do have the same data types.
    5) I have also checked the data types between the dimension table and the fact table; they use the same data type (smallint).
    6) I have not made a collection for the parent key; it is a single column. The remainder of your last paragraph is not clear to me. Can you give me an example?
    I am fairly inexperienced with Analysis Services, please talk slow and use small words  :-)
    Thanks again for your help!
    Gary

  • SQL Server analysis service command does not retry in agent job

    Hello,
    I'm working on SQL Server 2008 R2. I have scheduled a SQL Server Agent job to process an Analysis Services cube using an Analysis Services command. Sometimes this job fails due to network load, so I set the job step to retry 2 times at a 10-minute interval.
    There were no retry attempts when the job failed, and there was no error message.
    I simulated this scenario using a simple Analysis Services command which I intentionally set to fail, and confirmed it: retries do not happen for Analysis Services commands. Is there any workaround for this? Any suggestions?
    Thanks in advance.

    Hi Anush87,
    In your scenario, you have set the step to retry 2 times at a 10-minute interval; however, there were no retry attempts when the job failed, and you can reproduce this issue, right?
    Since there is no error message, it's hard to give you the root cause of this issue. Based on my research, many other people have encountered this issue, as you can see at the link below:
    http://social.technet.microsoft.com/Forums/en-US/543acccb-f107-420b-9652-53856c9137bb/sql-server-agent-job-retry-not-working?forum=sqldatabaseengine. You can also submit feedback at
    http://connect.microsoft.com/SQLServer/Feedback
    so that Microsoft can confirm whether this issue is a bug in SQL Server Agent jobs.
    However, in order to troubleshoot this issue, you can query the Agent job information to ensure the job configuration settings are correct; you can refer to the links below to check it.
    http://www.mssqltips.com/sqlservertip/2561/querying-sql-server-agent-job-information/
    http://www.sqlservercentral.com/blogs/hugo/2009/05/27/configuring-auto-retry-on-sql-server-agent/
    Hope this helps.
    Regards,
    Charlie Liao
    TechNet Community Support

  • Issue in migrating analysis service job from one server to another

    Hi All,
    I'm migrating Analysis Services from one server to another. On the source server there is a job named
    denreportingprocess which executes a batch file on a local drive on the server. I have moved all the Analysis Services cubes from the source server to the destination, and all the databases on the source server have been restored to the destination. I verified that the
    cubes are pointing to the correct database by modifying the connection string within the data source part of the cube databases.
    I have copied the entire process folder from source to destination and edited the code within the run file, and the code within the job, so that the locations point to the new server.
    Below is the job command; it runs a batch file placed on the E drive:
    E:\MSSQL10_50.MSSQLSERVER\process\run.cmd
    Below is the content of the batch file (Windows command script):
    E:\MSSQL10_50.MSSQLSERVER\process\ASCMD -S "servername" -i "E:\MSSQL10_50.MSSQLSERVER\process\Process.xmla"
    -o "E:\MSSQL10_50.MSSQLSERVER\process\Log.txt"
    Below is the list of files within the folder E:\MSSQL10_50.MSSQLSERVER\process:
    ASCMD
    process.xmla
    Logs.txt
    run
    When I run the job, it fails with the error message below:
    Ascmd: Exception trying to impersonate user: Access to the path 'E:\MSSQL10_50.MSQLSERVER\process\log.txt' is denied.
    Ascmd: Execution failed: Access to the path 'E:\MSSQL10_50.MSQLSERVER\process\log.txt' is denied.
    Thanks in Advance.
    Regards, Kranthi

    Hi Visakh,
    Thanks for the reply. I have mapped the account under which the job runs to the D drive; it can now access the file, but the job is still failing and the error is not that informative.
    Below is the error message with which the job is failing. Where can I find this output file with the errors?
    Executed as user abc_agent. C:\Windows\system32>E:\MSSQL10_50.MSSQLSERVER\process\ASCmd -S "servername" -i "E:\MSSQL10_50.MSSQLSERVER\process\process.xmla" -o "E:\MSSQL10_50.MSSQLSERVER\process\Log.txt"   Microsoft
    (R) Analysis Services 2008 Command Line Tool  Version 10.0.87.5 X86  Copyright (C) 2008 Microsoft Corporation.  All Rights Reserved.Ascmd: Check the output file for errors.  Process Exit Code 1.  The step failed.

  • SAP HANA as a data source for Analysis Services

    I tried to use SAP's .net provider for HANA as a data source for an Analysis Services cube but to no avail.   I can connect, create a named query and preview data from the data source view designer but when SSAS actually tries to run the query,
    it wraps the named query in another query and produces a syntax error.  
    For example,  if the named query is "select * from ENTHANA.MARA", 
    SSAS sends a query like this "SELECT [test].* FROM (select * from ENTHANA.MARA) AS [test]"   (Test is the name of the named query).
    I know SSAS wraps queries and that's fine - except that brackets are not a valid way to quote an identifier in HANA.  HANA uses double quotes like Oracle. 
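    For example, the wrapper would presumably only parse on HANA if SSAS quoted the alias with double quotes instead, i.e. something like SELECT "test".* FROM (select * from ENTHANA.MARA) AS "test".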
    Is there any setting in SSAS that can affect this behavior?   Will HANA ever be a fully supported data source for SSAS?  If so, when?

    Hi David,
    According to your description, you are creating a SQL Server Analysis Services project, and what you want is to use SAP HANA as the data source, right?
    SSAS supports many types of data sources. However, as you can see at the link below, SAP HANA is not listed there, so this type of data source is not supported in the current version of SSAS. Microsoft will update that document when it is supported.
    http://msdn.microsoft.com/en-IN/library/ms175608.aspx
    Thank you for your understanding.
    Regards,
    Charlie Liao
    TechNet Community Support
