Performance of Deliverables (KPA/KPI)

Hi,
In Oracle Projects R12, I wish to be able to track the performance of project deliverables (whether the deliverable has been shipped by the due date, or whether the deliverable has been billed by the due date).
Can anyone suggest a way to do this please?
Or perhaps, if this is not possible using deliverables/actions, the second-best way to do this would be to use shipping milestones in tasks. Is there a way to track just the performance (like a KPA) for task milestones such as shipping dates?
Many thanks,
Matt

Hi
You may consider the exceptions client extension.
Oracle allows you to develop custom KPIs / measures.
You may code the logic so the system calculates the actual shipping date and actual invoice generation date and compares those to the deliverable due date.
You can set up thresholds so the system shows indicators (green, yellow, red) based on the difference between the actual completion date and the due date.
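A minimal sketch of the comparison logic Dina describes (the function name and threshold values are made up for illustration; a real implementation would live in the exceptions client extension, typically in PL/SQL):

```python
from datetime import date
from typing import Optional

# Hypothetical threshold values (days past due) for the indicator bands;
# in a real extension these would come from the threshold setup.
YELLOW_AFTER = 0   # any slip past the due date turns yellow
RED_AFTER = 7      # more than a week late turns red

def deliverable_status(due_date: date, actual_date: Optional[date], as_of: date) -> str:
    """Compare the actual ship/bill date (or today, if still open) to the due date."""
    days_late = ((actual_date or as_of) - due_date).days
    if days_late <= YELLOW_AFTER:
        return "green"
    if days_late <= RED_AFTER:
        return "yellow"
    return "red"
```

The same banding would be applied once for the shipping action and once for the billing action of each deliverable.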
Dina

Similar Messages

  • Weird "Trend" metric calculation in Tabular KPI

    Hi Experts ,
    We have a tabular model in which we designed a KPI to show the actual metric value versus its target, with a status indicator to show how well the metric has performed. In the KPI calculation window of the tabular model there are only placeholders to calculate Value, Status and Target, but not "Trend". Even though we didn't code anything specific for the "Trend" calculation, under the newly created KPI we see "Trend" along with Value, Goal and Status. But the "Trend" is behaving strangely in the tabular KPI: the trend indicator is shown for every dimension attribute that is sliced with the KPI, regardless of whether it has a metric value or not. I searched many websites to understand how this "Trend" is calculated in a KPI, but none of them shed any light on the "Trend" calculation. In this scenario, please suggest a way to work around the issue:
    How to hide the "Trend" indicator from the newly created KPI, as I think we cannot define a "Trend" calculation in tabular as we can in multidimensional cubes
    Understand the reason why "Trend" is displayed in tabular models
    Below is a snapshot of our KPI when viewed through Excel.
    Can you please help with how to hide the "Trend" expression from tabular models, so that our users aren't confused by an unwanted metric in the KPI.
    Rajesh Nedunuri.

    Hi NedunuriRajesh,
    According to your description, since you haven't specified any expression for the Trend calculation, you want to hide the Trend option. Right?
    In Analysis Services Tabular, the Value, Goal, Status and Trend in a KPI are based on the Base Value, Target Value and Status Threshold. Whether or not you specify a Trend Expression, the Trend box is always displayed in the KPI pane, and it will do the calculation automatically. This is by design; there's no way to edit or modify it, so your requirement can't be achieved currently.
    I recommend you submit a feature request at https://connect.microsoft.com/SQLServer so that we can try to modify and expand the product features based on your needs.
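As background, the Status part of a KPI is essentially a banding of Value against Target; a rough sketch of such a banding (the threshold percentages here are invented for illustration, not the model's actual settings):

```python
def kpi_status(value: float, target: float, low: float = 0.8, high: float = 1.0) -> int:
    """Band value/target into -1 (red), 0 (yellow) or 1 (green),
    mirroring how a Status expression compares Value against Target."""
    ratio = value / target
    if ratio >= high:
        return 1
    if ratio >= low:
        return 0
    return -1
```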
    Best Regards,
    Simon Hou
    TechNet Community Support

  • Connect a Date Dimension to a cube without relationship

    Hi everybody,
    I would like to address one business requirement.
    I created a cube that models the following event: a customer sends a product from one agency to another customer, who receives it in another agency.
    So I have a fact table with only two measures
    Amount
    Count
    which is connected to these dimensions
    Product
    Sending Date
    Receiving Date
    Sender (Customer)
    Receiver (Customer)
    Sender (Agency)
    Receiver (Agency)
    The users would like to analyse the following KPIs, at a specific date:
    Number of transactions sent, and the amount
    Number of transactions received, and the amount
    Number of transactions pending, and the amount
    To meet this business requirement, I have added a new date dimension to the cube with no relationship, so that the user can select a date from this independent dimension and get the different KPIs.
    But I don't get any results.
    Is this a good model? How can I make it possible for the user to use the independent date dimension to analyze the different KPIs?

    Is this a good model? How can I make it possible for the user to use the independent date dimension to analyze the different KPIs?
    Hi Meal,
    According to your description, you want to know whether it is possible for the user to use an independent date dimension to analyze different KPIs, right?
    As per my understanding, we cannot do this without a relationship between the independent date table and the other tables. However, we can add relationships between the added date table and the fact table's Sending Date and Receiving Date columns. Please refer to the link below for details.
    http://msdn.microsoft.com/en-us/library/ms175427.aspx
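The three measures can be prototyped outside the cube to validate the model; a sketch using invented transaction records (a transaction is pending at a date if it has been sent but not yet received):

```python
from datetime import date

# Made-up transaction records: (sending_date, receiving_date or None, amount)
transactions = [
    (date(2024, 1, 5), date(2024, 1, 8), 100.0),
    (date(2024, 1, 6), None,             250.0),
    (date(2024, 1, 9), date(2024, 1, 12), 80.0),
]

def kpis_at(as_of):
    """Sent / received / pending counts and amounts as of a reference date."""
    sent = [t for t in transactions if t[0] <= as_of]
    received = [t for t in transactions if t[1] is not None and t[1] <= as_of]
    pending = [t for t in sent if t[1] is None or t[1] > as_of]
    total = lambda ts: sum(t[2] for t in ts)
    return {
        "sent": (len(sent), total(sent)),
        "received": (len(received), total(received)),
        "pending": (len(pending), total(pending)),
    }
```

In the cube, this corresponds to using the independent date as the reference date and filtering the fact table on Sending Date and Receiving Date relative to it.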
    Regards,
    Charlie Liao
    TechNet Community Support

  • Scorecards and Dashboards

    Scorecards and Dashboards
    According to the Harvard Business Review, the Balanced Scorecard concept is the most influential management idea of the past 75 years.
    The performance of an organization is tracked against four key perspectives: Finance, Customers, Internal Processes, and finally Learning, Innovation and Growth. These perspectives are divided into multiple sets of Key Performance Indicators (KPIs), grouped against these perspectives and further broken down from the top level to employee-level KPIs. Companies the world over are moving towards implementing enterprise-wide organizational Balanced Scorecards.
    Simplicity, personalization and empowerment are the keys when building agent dashboards. These goals are achieved through the application of Business Intelligence (BI) concepts. A contact center's performance is an aggregation of every agent's performance.
    An agent should be able to customize and get a unified view of the aggregated metrics and track them on a daily basis. How much has he achieved with respect to his target? How does his performance benchmark against overall call center performance? Is there performance improvement over a period of time? He can drill down and see which particular KPI is bringing his overall score down, and see how he has or has not been able to improve his performance over time.
    A supervisor should be able to see how the agents under him are performing. His dashboard should show an aggregated score of all his agents while simultaneously showing tabular information comparing the scores of all agents working under him. He can drill down on the scores of agents who are not performing well, see individual KPIs of agents, and study whether there are improvements over time under the respective KPI.
    A manager would like to see dashboards of aggregated supervisor scores on a weekly rather than daily basis. He can drill down to an agent's individual KPI level if he desires. A manager could also drill down across different channels and see the performance of different supervisors under different channels.
    The call center head would like to see the aggregated performance of his senior managers across channels, and will be interested in tracking performance against KPIs on a monthly basis. He can drill down from the manager level right down to the agent level and their individual KPIs to see where the actual pains are located.
    Quality managers would like to see how agents are performing against quality-related KPIs. They would further like to analyze poorly performing KPIs by type of service request, type of channel, and perhaps by important sets of clients, so that specific training programs can be designed for specific sets of agents rather than a plain blanket approach where everyone is trained on everything.
    In all, the performance metrics can be analyzed against multiple dimensions such as channels, customers, types of service, location, etc. Users down the line are empowered to analyze the data and get operational and strategic insights. Hundreds of variants of dashboards and reports can be created from a single, simple, user-friendly interactive interface.
    All of the above can be achieved through a web browser and a single interactive user interface. Every person in the organization can personalize dashboards or reports from this single interface. The power of analysis is coupled with the reporting and dashboards. It is this Business Intelligence capability that creates a unique customer experience in gaining business insights.
    By:- Sanjay Yadav

    Hi Pradeep,
    there are several how-tos available on SDN and also the xApps are available on the Service Marketplace as example.
    Also check the Visual Composer Site:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/webcontent/uuid/8196d631-0e01-0010-0590-d9ec6e1693a5 [original link is broken]
    Best Regards,
    Marcel

  • Abandoned Lightroom 1.0 - my list of critical showstoppers

    I have seen several recent examples of users venting on this forum but sadly, I have to sympathise, as in my own experience the quality of the Lightroom 1.0 release, specifically the Library module, does in fact justify this level of frustration.
    I was looking forward to Lightroom becoming the new gold standard and the app that I would spend days and nights in when I was not out shooting. As it is, I have abandoned my trial with Lightroom after three days. I never got past the Library module (and since Lightroom can't function without a library, that sealed its fate).
    The stability showstoppers:
    - Importing while generating previews always crashed after a few thousand images; that there is no option to generate 1:1 previews during import is completely baffling*
    - Importing without generating previews would randomly fail to import some of the images; deleting the library and starting again would cause a DIFFERENT set of images to be randomly skipped - no reason given, and the images that could not be imported opened just fine in Camera Raw, and Bridge would happily generate previews for them
    - Generating previews after importing always crashed after a few thousand images
    - Frequent out-of-memory errors
    That's just the MAJOR stability issues. Then there is performance (I have about 30,000 images, not that I could ever import all of them into Lightroom...)
    And then there are the functional showstoppers:
    - I will absolutely NOT trust Adobe with my images or my metadata (not after the fiasco with Bridge corrupting its caches in EVERY release to date, or the issues with files that were renamed on import getting corrupted; and an application that comes with a built-in tool to check for library corruption doesn't exactly inspire confidence either), so I import in place and want all of my metadata to be stored in *.xmp files. BUT WHY OH WHY does generating previews cause Lightroom to write out an *.xmp file for every file it generated a preview for!?! Even Bridge was smart enough to only write out *.xmp files when I actually did something to the image!
    - Since there is no way to disable 'folder aggregation', the folders view is useless for trying to organize images. Try this: take a folder with 1000 images, create three subfolders and try to organize the images into the subfolders. Every time you click on the parent folder you see all 1000 images. You have no idea which ones you've already moved and which are still in the parent.
    - Why can't renaming 'files' just rename the files in the file system!?! Since Lightroom is NOT a browser, it MUST NOT force me to organize / rename my images outside of it, since it then loses track of the files.
    Yes, I have logged all of these issues with Adobe. Have I heard anything from Adobe? No. Lightroom has been out for FOUR months. Given all of the stability problems that have been widely reported, you would expect that a company in touch with its users would in fact react and put out a couple of dot-dot releases (1.0.1, 1.0.2) to address the stability showstoppers. That would at least let them get a feel for whether they're in fact addressing the most severe issues their user base is experiencing. But no, they chose to put their heads in the sand, pretend the users are happy, and continue working on adding new functionality. Here is what's going to happen: they'll release 1.1 and half my showstoppers will still be there. And who knows how long we'll have to wait before the next release comes out.
    *Yes, I am aware how much disk space 1:1 previews would use but I don't care, I have a 30" screen and I have no problem throwing disk space, cpu power or machine time at the problem but I ABSOLUTELY REFUSE to waste my own time waiting for Lightroom to perform processing 'on the fly' that could have been performed in bulk while I sleep.
    Yours Extremely Frustrated,
    Cezary

    Lee,
    With all due respect, I think that you're still missing my point and in doing so I believe you're making my case stronger! The disclaimer is that I don't apply any metadata to the image during importing (my camera already embedded my name in every raw file and keywording at this point doesn't make sense in my workflow).
    The application of defaults is EXACTLY why I don't want the *.xmp files written out until 'I' have manually modified the image in some way. Applying the tone curve is a great example. So today Lightroom applies a medium curve. Maybe tomorrow version 1.1 is going to get smarter and apply different curves depending on the contrast of the image (or some other default behavior will change, like what happened to the Shadows slider in Bridge 1.x: it used to default to 0, now it defaults to 5). But then we have the question of compatibility! Which images should it apply the new smarter defaults to? The natural choice is: images that the user modified should stay the same; new images and those that have not been modified in any way should automatically benefit from the new defaults. Committing to defaults at the time of importing seems completely unnecessary. If they're defaults, why write them out at ALL? I wouldn't mind as much if the *.xmp files were 'empty' so to speak. But if I recall correctly, they're not; they include all the ACR settings at the 'current' default level as if I had set them manually.
    As for 'What's the point of importing your images into the LR database if this is the case?'. I never said I like it using a database but I appreciate the pros and cons of this approach. I would use Bridge instead but it has a different showstopper: the generated previews are too low quality to judge image sharpness (see BreezeBrowser in High Quality mode for what a preview should look like, I posted about this in the Bridge forum recently). The low quality previews are THE reason I am looking at Lightroom but of course it also promises to be 'so much more' than Bridge.
    Of all the issues that I consider to be showstoppers in the library module of Lightroom 1.0 this is the least important and I will agree to disagree with you. If 1.1 fixes the stability issues, improves performance and delivers a usable interface for managing files and folders that keeps them consistent with the file system I'll be happy to give Lightroom another shot even if it means *.xmp files being generated for every image as it's imported.
    Cezary

  • Java APM events do not appear in the diagnostics / advisor console

    Hi
    I configured the Java APM and installed / am monitoring an application (Struts). I can see all 4 performance rules for the application and I also enabled the Java APM monitors for exception and performance events. I am now hammering my application to make it throw exceptions. I get up to 60 exceptions / sec, the alert appears in the SCOM console and the Performance view also delivers results, but I cannot find any visible event in either the APM Diagnostics or the Advisor console.
    1) Should this Integration work in the preview?
    2) Is there any configuration I missed?
    Thank you for your help,
    Stefan
    Blog: http://blog.scomfaq.ch

    Hi Stefan,
    I read your great post
    http://stefanroth.net/2013/08/10/scom-2012-r2-java-apm-mp-installation-and-configuration-part-3/
    I can see that the performance counters and availability events are also monitored, but I still cannot get Java events in AppDiagnostics. Can you help me, or update the Java events in AppDiagnostics part on your blog? Thanks in advance!
    I had configured JEE Invoke Account as "tomcat" within tomcat-users.
    Here is my tomcat-users.xml
    <tomcat-users>
    <!--
      NOTE:  By default, no user is included in the "manager-gui" role required
      to operate the "/manager/html" web application.  If you wish to use this app,
      you must define such a user - the username and password are arbitrary.
    -->
    <!--
      NOTE:  The sample user and role entries below are wrapped in a comment
      and thus are ignored when reading this file. Do not forget to remove
      <!.. ..> that surrounds them.
    -->
    <role rolename="manager-gui"/>
    <user username="tomcat" password="Passw0rd" roles="manager-gui"/>
    </tomcat-users>
    Regards,
    Sam

  • What is the difference between Delivery Type Classes?

    Hi all,
    A class of delivery can be defined with one of 3 type classes:
    1. Item
    2. Document
    3. Others
    Can you explain this to me? I cannot find this information in the Oracle documentation.
    Many Thanks.

    Hi,
    As per the Projects Implementation Guide:
    A deliverable type class determines what functions you can perform on deliverable actions.
    You can plan, ship, procure, and bill an item deliverable.
    You can ship, procure and bill a document deliverable or other deliverable.
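The rule above can be captured as a simple lookup (a hypothetical helper, purely for illustration):

```python
# Allowed actions per deliverable type class, per the guide quoted above.
ALLOWED_ACTIONS = {
    "Item":     {"plan", "ship", "procure", "bill"},
    "Document": {"ship", "procure", "bill"},
    "Other":    {"ship", "procure", "bill"},
}

def can_perform(type_class: str, action: str) -> bool:
    """True if the given deliverable type class supports the action."""
    return action in ALLOWED_ACTIONS.get(type_class, set())
```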
    Thanks
    Govind

  • Refresh Pivot Bug in Power Query version 2.16.3822.242

    I have an Excel file with Power Query and several Pivot Tables built from the Power Query data. The Pivots are built from the table loaded to a worksheet, and there is a timeline date slicer connected to all those various Pivot Tables.
    In build 2.16.3822.242, refreshing the Pivot Table (either by right-click/Refresh, or programmatically via macro) causes the date timeline slicer to disappear (literally be deleted from the sheet).  The Pivot Refresh succeeds fine.
    This does not occur when using Power Query Version: 2.16.3785.242. So for the time being, my only workaround is to avoid installing the newer build.
    Any ideas what's causing it? I told my coworker (the primary user of the file) that she could still use the file and filter every pivot's date range separately rather than with the slicer, but that is laborious, and for now she is just going to remote-desktop into a computer that doesn't have build 3822 installed yet.
    Shawn Keene

    Hey Ed, I apologize for un-proposing your answer so quickly; I was on my Windows Phone but the forum's loading spinner wouldn't leave the text entry field, so I couldn't input a reply until I got home.
    I actually found a solution just today. This may not be the 'only' solution, but it worked well. When I had previously thought I had a solution, it only 'worked' because my subsequent refresh after making the table change didn't add any new rows of data (all the rows it would have returned that day were already there). Today I made the change and then refreshed, and the new rows were formatted as the Power Query intended them to be.
    The settings I used were to set the table to "do not preserve formatting" and to "overwrite existing cells for new data, clear unused cells". Now my inserted Power Query table is plain white (no banded lines or colors), but I don't care
    because it's on a hidden sheet that no one will ever see.
    Here's the table settings that seemed to work well for me.
    And to be honest (although I haven't extensively tested), I think this was only a problem if the table was created using an old version of Power Query, then refreshed with the new version.  Anyway, now I can easily refresh my Query and then the Pivot
    from it without trouble.
    And since I suck at brevity: I also think it'd be beneficial for me to re-create these workbooks and have the Power Query only add the data to the data model, and source the Pivot directly from that, rather than indirectly sourcing the pivot from a populated table. Basically save a step, and then the Pivot will refresh together with the connection, rather than having to be refreshed separately after the query is done.
    In any case, I'm loving this tool and I'm practically a Power Query evangelist to every other department, because it empowers the knowledge worker to investigate data without waiting weeks for a DBA to have time to write a query and send results. Plus we power an awesome real-time performance dashboard showing live KPIs of 15 metrics, and we ditched our Domo.com account, saving upwards of $100k per year, just by using Power Query. Using nothing but plain Excel and Power Query, it publishes the graphics to our intranet and to TVs around the building, live all day. It's amazing.
    Shawn Keene

  • What is App $S

    What does this mean? It has appeared since Win7 was installed on the machine, running the 64-bit version.

    KPI stands for Key Performance Indicators.
    These are values companies use to manage their business, e.g. net profit.
    In detail:
    Stands for Key Performance Indicators. A KPI is used to measure how well an organization or individual is accomplishing its goals and objectives. Organizations and businesses typically outline a number of KPIs to evaluate progress made in areas where performance is harder to measure.
    For example, job performance, consumer satisfaction and public reputation can be determined using a set of defined KPIs. Additionally, KPI can be used to specify objective organizational and individual goals such as sales, earnings, profits, market share and similar objectives.
    The KPIs selected must reflect the organization's goals, they must be key to its success, and they must be measurable. Key performance indicators are usually long-term considerations for an organization.

  • The Superdrive is driving me nuts!

    Hi all. I've been trying to rip audio books to my iTunes library and I've run into a problem. Disc 1 will rip just fine. Disc 2 is not recognized at all; the drive makes a lawnmower-like noise, iTunes locks up, and you have to restart to eject the disc and get iTunes to stop... no force-quitting on this one. Disc 3 rips just fine. Discs 4-10... varying degrees of Disc 2-type issues: sometimes clicking sounds, a lot like when you run a laser cleaning disc through your Xbox etc., all the way to ferocious spinning and lawnmower noises.
    There is nothing wrong with these discs. They are brand new, from the same set, and rip into my girlfriend's XP library just fine. They play in normal CD players. Why has my Harry Potter audiobook forsaken my beloved MBP?


  • The heat is driving me nuts!!!

    I've tried to put up with the heat, but now it's officially driving me bananas. Apple had better come up with a response to this problem asap. Stop dodging the issue!
    P-oed!

    The heat is an aesthetic issue; don't put it on your lap. Yes, my MacBook zooms up to the 100C ceiling momentarily when I run the double load test, then equilibrates down to 90C. The Core Duo can handle that level of heat; it is hardware-controlled such that it won't cook itself. Processors can take a lot - I know that my old 2100+ came under-greased from the factory when I got an AMD Athlon XP-powered PC back in the day; it consistently ran at 160F, maybe 70C or so, and this was near-death temperature, yet it didn't hurt its lifespan.
    My temps are getting better. I now idle (running chat, Safari, Mail, Preview) at 56C or so, which I consider good, since it's within the temp range of my old 12" G4, which went up to 60C at times. But it's still a "hot" week-12 machine capable of pop-tart-cooking temperatures of 100C.
    Advice: run in power-saving mode, unless you are running an app that lags when this is on, which is unlikely. I guess this just throttles things down more so the processor doesn't get extra exercise. Also keep your house's AC up, as ambient air temperature affects heat. If the case is exposed to a muggy 78 degrees, it will be warmer than in a 68-degree house.
    Anyway, the palmrests and keyboard don't get hot. Only the upper edge and the speakers get hot. Don't put it on your lap, or set your legs far enough apart that the cooler edges are where the weight is borne. So how does this really affect the user, apart from the belief that one's MacBook is "bad"? For the performance it delivers, it's nearly worth the warmth.

  • How to set up a custom KPI (Key Performance Indicator) icon set in Excel

    Hi, how do I set up a KPI (Key Performance Indicator) using a custom KPI icon set in Excel, as requested by my company? Should it display the custom KPI icon with Excel Services on a SharePoint site?

    Hi,
    Thank you for posting in the MSDN Forum.
    Since the issue is more related to the end-user, I'd like to move it to the Excel IT Pro forum.
    The reason we recommend posting appropriately is that you will reach the most qualified pool of respondents, and other partners who read the forums regularly can either share their knowledge or learn from your interaction with us.
    Thanks for your understanding.
    Best regards
    Fei

  • Portal performance KPIs

    I need to define an SLA for Portal performance.
    What would be the top KPIs or metrics for this?
    What are the tools/reports to measure it?

    Hi,
    Usually customers expect the SLA to define:
    a) an average response time (or processing time) KPI
    This KPI should be carefully defined for well-described process dialog steps (for UI-based scenarios), as well as for a well-defined amount of processed data.
    For background jobs, a processing time can be defined, with a direct relation to the volume of data to be processed.
    The network environment (LAN/WAN) has a significant impact on response time, as does the system hardware (CPU, memory, disk I/O). This means that you have to describe the conditions (fast CPU, enough memory, good network, etc.) with parameters as concrete as possible, directly in the SLA together with the KPI.
    b) processing throughput or capacity, e.g. number of concurrent users, parallel tasks, orders per hour, and so on
    How many users the system can support is sometimes a KPI that customers require. This, like the response time KPI, should be documented together with the hardware requirements and the data volume requirements.
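As a concrete illustration of (a), a rough response-time check against SLA targets (the sample data, percentile choice, and thresholds are invented for this sketch):

```python
def response_time_kpi(samples_ms, sla_avg_ms=800.0, sla_p95_ms=2000.0):
    """Check average and (roughly) 95th-percentile response times
    against SLA targets; the thresholds here are invented examples."""
    ordered = sorted(samples_ms)
    avg = sum(ordered) / len(ordered)
    # Nearest-rank style 95th percentile, clamped to the last sample.
    p95 = ordered[min(len(ordered) - 1, int(0.95 * len(ordered)))]
    return {"avg_ms": avg, "p95_ms": p95,
            "met": avg <= sla_avg_ms and p95 <= sla_p95_ms}
```

A percentile target alongside the average matters because a few very slow dialog steps can hide behind a healthy-looking mean.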
    Regards,
    Markus

  • KPI rating in individual goals in Predefined Performance Management

    Hi
    We are using the predefined Performance Management process and need to add a KPI rating for each goal created under individual goals. At present it only gives the option to rate all the new goals in one place in the individual goals section.
    Please let me know how to configure this.
    Thanks
    Srikant

    Hi,
    We are using EHP7. I wanted to know if we can customize the template UI.
    Srikant

  • DBA Staff KPI(s) - Measuring Performance

    DBA Staff KPI(s) - Measuring Performance
    Hi,
    I am an IT manager, managing two solutions, "SCM" and "WMS (Warehouse Management System)", and there is a team for each solution.
    DBA staff are involved with almost everything (database administration, handling system/application performance and stability, application support, creating and automating reports for all levels, monitoring interface issues, troubleshooting, handling investigations for operations/management/business/research/system issues, etc.).
    It's not easy to identify their KPI(s) and additional responsibilities.
    If you know some common KPI(s) that have been used before, I would really appreciate it if you could share them with me, so I can evaluate my team properly.
    Thanks ......
    Regards,
    Ala'

    Just a comment that while your Subject says "DBA", the job responsibilities include many other things which are outside the definition of a DBA role, eg Application support, creating "reports" for "all" levels, troubleshooting "business" issues, etc
    So what does "all" levels mean? Does that mean reports from the Application? or just the technical aspects of database administration like I/O, memory, latches etc. What does troubleshooting "business" issues mean? Does that mean identifying why the accounting books don't balance?
    I understand that while the DBA role can be clearly defined in theory (Oracle Database Administrator's Guide - Task of a DBA), in practice many organizations, especially the smaller ones, are closer to what you have described. In such cases, the approach I'm familiar with has been to define as much as practicable the primary and secondary responsibilities of each position, and define the KPIs for the primary and make rather vague statements for the secondary role(s).
    Thank you, that was valuable.
    So what does "all" levels mean? Answer: Application reports go to all levels of management, from department managers to senior management and directors.
    It includes,
    - Productivity measurements.
    - Identifying some of the business requirements.
    - Creating/Building estimates for any CR.
    - Being involved with warehouse expansion plans.
    - Reporting critical operations & inventory activities.
    - Building interactive reports.
    - Cost allocation, ... etc.
    Then comes the normal DBA work, which includes I/O, memory, DB performance, application & DB users, etc.
    You are right: I can take their primary responsibilities to define their KPIs and leave the secondary responsibilities as vague statements for the secondary role(s); that way I can evaluate them according to the primary part and leave the secondary part for them to compete on.
    Thanks again
