Performance counters for SQL database

I need to carry out a performance test on our SQL database. What are the counters that I need to check, and are there any tools that would assist me with this?
mayooran99

Hello,
Please refer to the following series of articles.
• Beginners
http://blogs.msdn.com/b/john_daskalakis/archive/2013/10/07/how-to-troubleshoot-sql-server-performance-issues-with-simple-tools-part-1-how-to-collect-a-detailed-perfmon-trace.aspx
http://blogs.msdn.com/b/john_daskalakis/archive/2013/10/14/how-to-troubleshoot-sql-server-performance-issues-with-simple-tools-part-2-how-to-analyze-the-perfmon-trace-and-detect-io-bottlenecks.aspx
http://blogs.msdn.com/b/john_daskalakis/archive/2013/10/21/how-to-troubleshoot-sql-server-performance-issues-with-simple-tools-part-2-how-to-analyze-the-perfmon-trace-and-detect-sql-server-performance-issues.aspx
http://blogs.msdn.com/b/john_daskalakis/archive/2013/10/30/how-to-troubleshoot-sql-server-performance-issues-with-simple-tools-part-3-the-profiler.aspx

• Advanced
http://blogs.msdn.com/b/john_daskalakis/archive/2013/11/04/specialized-performance-troubleshooting-part-1-how-to-troubleshoot-forwarded-records.aspx
http://blogs.msdn.com/b/john_daskalakis/archive/2013/11/11/specialized-performance-troubleshooting-part-2-how-to-troubleshoot-memory-problems-in-sql-server.aspx
http://blogs.msdn.com/b/john_daskalakis/archive/2013/11/18/specialized-performance-troubleshooting-part-3-how-to-identify-storage-issues-at-a-sql-server-box.aspx
Hope this helps.
Regards,
Alberto Morillo
SQLCoffee.com

Similar Messages

  • Top 5 Performance Counters for SSAS Multidimensional

    I have SharePoint 2013 and SQL Server 2014 based reporting solution on Azure.
    I have dedicated VM for SSAS.
    I would like to collect performance data (CPU, RAM etc...)
    What are the top 5 most important performance counters for SSAS (PerfMon)?
    Kenny_I

    Hi,
    Every counter has its own use and significance; which ones matter most will depend on your requirements and on what exactly you want to monitor.
    Apart from the above, the most useful counters here would be in the MSASXX:Processing group: Rows read/sec and Rows written/sec, which will give you an idea of how fast SSAS is reading data from the data source. As a general rule you should expect to get 40,000-60,000 rows/sec from SQL Server; if you're getting less, you should try to tune the database you're getting data from rather than SSAS itself. Partitioning measure groups in SSAS and processing partitions in parallel is also a good way of improving processing performance.
    Please refer to the link below for a better understanding of these performance counters:
    http://ms-olap.blogspot.in/2009/04/performance-counters-for-ssas-2008.html
    Thanks
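    The 40,000-60,000 rows/sec guideline above can be sanity-checked from two samples of a cumulative PerfMon reading. A minimal sketch (the figures and function names here are illustrative, not from the thread):

```python
# Estimate processing throughput from two cumulative readings, as you
# might take from a PerfMon log of the MSASXX:Processing counters.
def rows_per_sec(rows_start, rows_end, seconds):
    """Average rows processed per second over the interval."""
    return (rows_end - rows_start) / seconds

def meets_guideline(rate, low=40_000):
    """True if throughput reaches the ~40-60k rows/sec range quoted above."""
    return rate >= low

rate = rows_per_sec(0, 2_700_000, 60)  # 2.7M rows read in one minute
print(rate)                   # 45000.0
print(meets_guideline(rate))  # True
```

    If the rate comes out well below the guideline, the advice above is to tune the source database (or partition and process in parallel) rather than SSAS itself.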

  • Performance counters for exchange 2013 mailbox & cas servers?

    What are the performance counters for Exchange 2013 Mailbox & CAS servers?
    Similar to RPC Requests for troubleshooting Exchange slowness; I haven't found any TechNet article for Exchange 2013.

    Hi,
    Please see this:
    Ask the Perf Guy: Sizing Exchange 2013 Deployments
    http://blogs.technet.com/b/exchange/archive/2013/05/06/ask-the-perf-guy-sizing-exchange-2013-deployments.aspx
    Hope it is what you need.
    Thanks
    Mavis
    Mavis Huang
    TechNet Community Support

  • SharePoint performance counters for publishing cache

    Hi,
    I added some performance counters for publishing cache to Windows Performance Monitor. The total values are filled, but for specific sites they stay empty. Any ideas?

    Hi Hanny,
    Please check the links below for configuration:
    http://blogs.technet.com/b/rgullick/archive/2014/01/10/sharepoint-performance-monitoring.aspx
    http://blogs.msdn.com/b/russmax/archive/2009/05/27/configuring-performance-monitor-log-for-sharepoint-performance-issues.aspx
    https://technet.microsoft.com/en-us/library/ff758658.aspx?f=255&MSPPError=-2147217396
    Please remember to click 'Mark as Answer' on the answer if it helps you

  • Changes in resource names in Monitor Metrics for SQL Database @ Azure portal

    hi
    Exactly 4 months after the start of the new SQL Database tiers, the resource metric names changed.
    Old names:
    CPU percentage
    Log Write percentage
    Physical Data Read percentage
    New names:
    CPU
    LogIO
    DataIO
    My question is: are they equal?
    CPU percentage = CPU (looks OK)
    Log Write percentage = LogIO (Log Writes means IN, but IO means both in and out)
    Physical Data Read percentage = DataIO (Read means OUT, but IO means both in and out)
    If they are different, then what changed?

    Hi,
    We use the sys.resource_stats catalog view to monitor the resource usage of your Azure SQL Database and compare current resource utilization to different performance levels. The CPU Percentage, Physical Data Reads Percentage, and Log Writes Percentage columns show the average utilization percentage relative to the DTU of your database. For more information, see:
    http://msdn.microsoft.com/en-us/library/azure/dn369873.aspx
    We have now updated the monitoring metric names for the CPU and IO activity of Azure SQL Database. If you add the Data IO or Log IO metric individually on the Monitor page, it shows under the same name as Physical Data Reads Percentage or Log Writes Percentage; you can refer to the following screenshot. After I add Data IO, it shows as Physical Data Reads Percentage on the Monitor page.
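    As a rough illustration of working with those values programmatically, here is a minimal sketch; the sample rows are invented, and in practice each tuple would come from the avg_cpu_percent, avg_data_io_percent, and avg_log_write_percent columns of sys.resource_stats:

```python
# Minimal sketch: summarize DTU-relative utilization the way you would
# from sys.resource_stats output. The sample rows below are made up.
rows = [
    (1.2, 0.0, 0.1),
    (3.3, 0.01, 0.05),
    (0.5, 0.0, 0.0),
]

def summarize(rows):
    """Per-metric average and maximum across the sampled intervals."""
    cols = list(zip(*rows))
    return {name: {"avg": sum(c) / len(c), "max": max(c)}
            for name, c in zip(("cpu", "data_io", "log_io"), cols)}

def needs_higher_tier(summary, threshold=80.0):
    """Rule of thumb: consider a higher tier when any metric nears its DTU limit."""
    return any(m["max"] >= threshold for m in summary.values())

print(needs_higher_tier(summarize(rows)))  # False for these low readings
```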
    Regards,
    Sofiya Li
    TechNet Community Support

  • EKM using the Azure Key Vault is now available for SQL Database and SQL Server running in Azure VMs

    In preview today, you can create keys in the Azure Key Vault and use them with Azure SQL Database or SQL Server running in an Azure VM. Use Extensible Key Management (EKM) for TDE, backup encryption, or cell-level encryption. For more information, see Extensible Key Management Using Azure Key Vault (SQL Server):
    http://msdn.microsoft.com/en-us/library/dn198405.aspx
    The announcement:
    Azure Key Vault in public preview
    Key Vault offers an easy, cost-effective way to safeguard keys and other sensitive data used by cloud applications and services. Included are the following features:
    Enhance data protection and compliance: Protect cryptographic keys and sensitive data like passwords with keys stored in Hardware Security Modules (HSMs). For added assurance, import or generate your keys in HSMs certified to FIPS 140-2 Level 2 and Common Criteria EAL4 standards, so that keys stay within the HSM boundary. Key Vault is designed so that Microsoft doesn't see or extract your keys.
    All the control, none of the work: Provision new vaults and keys in minutes and centrally manage keys, sensitive data, and policies. You maintain control over your encrypted data; simply grant permission for your own and third-party applications to use keys as needed. Enable developers to easily manage keys used for dev/test and migrate seamlessly to production keys managed by security operations.
    Boost performance and achieve global scale: Improve performance and reduce latency of cloud applications by storing cryptographic keys in the cloud (versus on-premises). Key Vault rapidly scales to meet the cryptographic needs of your cloud applications and match peak demand.
    Get started with Azure Key Vault by creating keys for applications you develop, SQL Server encryption (TDE, CLE, and backup), and partner solutions like CloudLink SecureVM.
    Key Vault is available now at no charge, with discounted preview pricing starting on January 15, 2015.
    For more information, please visit the Key Vault webpage. For a comprehensive look at pricing, please visit the Key Vault Pricing webpage.
    Rick Byham, Microsoft, SQL Server Books Online, Implies no warranty

    Thank you for sharing this Rick.
    Regards,
    Alberto Morillo
    SQLCoffee.com

  • No size history for SQL Database in dbacockpit for new system

    Hello,
    we installed a new ERP 6.0 system on Windows with SQL Server 2005 (3 months ago).
    Now we don't have any data in the database size history in the DBA Cockpit.
    The message is: No size history. Check the DB Collector status.
    We already checked the state of the collector. It runs every 20 minutes and collects performance data, but not the growth of the database.
    We also deleted the job in SQL Agent and scheduled it again with the report MSSPROCS.
    Does anyone have an idea what the problem is?
    Best regards
    Petra Wöritz

    Hello all together,
    we are still facing the same problem.
    I checked all the points you suggested:
    - the performance collector job runs in client 000
    - the time zone is set to CET
    - SAPOSCOL is running
    - no errors when executing the SQL statement EXECUTE sap_dbcoll
    - no errors in the SQL job SAP_sid_SID_MSSQL_COLLECTOR
    We still don't have any data in the size history.
    But we found an error message in dev_w0:
    C  dbdsmss: DBSL26 SQL3621C  Violation of PRIMARY KEY constraint 'PK__#perfinfo_________0C5BC11B'. Cannot insert duplicate key in object 'dbo.#perfinfo'. The statement has been terminated.
    It occurs when we try to get the history data in DB02 (or the DBA Cockpit).
    Note 1171828 doesn't apply; we already have SAP_BASIS 701 0007.
    We've got a system with SAP_BASIS 702 0006 where the error is displayed directly in the DBA Cockpit:
    SQL error 2627: [Microsoft][SQL Native Client][SQL Server]Violation of PRIMARY KEY constraint 'PK__#perfinfo_________5A303401'. Cannot insert duplicate key in object
    Any ideas?
    Best regards
    Petra

  • Hardware requirements for SQL database used with TestStand

    We want to set up a SQL Server instance to store the data from TestStand.
    How do we determine the hardware requirements for this server? It will be used with 10-30 machines running tests and logging data, and another 5-10 machines running queries to pull the data back out for analysis. The result data size will range from 50 to 25,000 results per run (run times are 1 minute for the 50-result tests and 5 hours for the 25,000-result tests).
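    A quick back-of-the-envelope estimate of the insert load implied by those figures (a sketch only; it ignores result-row width and batching):

```python
# Back-of-the-envelope logging-rate estimate from the figures above:
# 50 results per 1-minute run, 25,000 results per 5-hour run,
# and up to 30 machines logging concurrently.
def results_per_sec(results, run_seconds):
    """Average insert rate a single test machine generates."""
    return results / run_seconds

short_run = results_per_sec(50, 60)           # ~0.83 results/sec
long_run = results_per_sec(25_000, 5 * 3600)  # ~1.39 results/sec
worst_case = 30 * max(short_run, long_run)    # whole fleet on the heavier profile
print(round(worst_case, 1))  # roughly 41.7 inserts/sec across 30 machines
```

    A few dozen small inserts per second is modest by database-server standards, which supports the answer below that RAM, network, and storage bandwidth matter more than raw CPU here.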

    Hi,
    database design and hardware requirements are never easy. There are a lot of scientific papers on workload tests and requirement assumptions. I cannot give a short answer on which machine to use, just some ideas and starting points.
    The most important parts of a database system with large data sets are network bandwidth, RAM, and storage bandwidth. With smaller data sets and more complex transactions, the CPU becomes more important.
    In this TestStand case the data sets are usually rather small. If the queries are not too complex, the requirements seem not too high.
    Database performance is usually measured in transactions per minute (tpm). Specialized database servers can perform several thousand tpm, at costs starting at about $15 per tpm. See Microsoft's benchmark page for a good starting point: http://www.microsoft.com/sql/evaluation/compare/benchmarks.asp
    You may also visit the TPC.org homepage.
    To be more specific, I'd choose a modern Intel-based system like a 3 GHz P4 (with virtual multiprocessors, i.e. Hyper-Threading), at least 512 MB RAM, and a RAID 5 storage system (not necessarily SCSI) with at least 3 separate disks. Use at least a 100 Mbit LAN connection, ideally with a switch. Don't forget backup!
    Check also the pages of your preferred database provider.
    I am at a starting point here too. We have chosen MySQL, which runs (at least for now) on the very same machine where TestStand & LabVIEW are running. We plan to test this setup under increasing load to get a practical estimate of the hardware requirements. The planned final setup will have up to 5 test stations and 5 query stations. We'll run about 50 rather complex tests of about 4 hours each that operate in parallel on the test stations.
    HTH and
    Greetings from Germany
    Uwe

  • Performance issues for sql developer

    1. Does leaving a DB connection open from SQL Navigator have any impact on DB resources and performance?

    That depends.
    If you run a query or browse a table with parallel setup and leave a grid open - you could leave many processes sitting idle on the server waiting for you to fetch the records down.
    But in general, 'no.'
    Also, what do you mean by a 'SQLNavigator' - you mean SQL Developer, yes?

  • Performance Tuning For 11g Database

    I've heard some Oracle DBAs suggest that specific parameters be tuned in the OS when using Linux as a dedicated Oracle 11g database server. I'm guessing those parameters vary greatly based on utilization, hardware, and various other factors. I just wanted to ask whether there are some soft or safe performance adjustments I can make which, regardless of the other unknown factors, would most likely benefit or improve database performance?
    Here is what I know:
    - OS = RHEL 6.2 or OEL 6.2 64-bit
    - RAM = 4 GB
    - x2 Quad Core Xeon CPUs
    Thanks for shedding any light on this for me...

    >
    I've heard some Oracle DBA's suggest / recommend that specific parameters be tuned in the O.S. for using Linux as a dedicated Oracle 11g database server. I'm guess those parameters vary greatly based on utilization, hardware, and various other factors. I just wanted to ask if there are some soft or safe performance adjustments I can make which regardless of other various factors that are unknown would most likely benefit and or improve database performance?
    >
    ALWAYS read and follow the instructions provided by Oracle in the installation guide. It discusses the very issue you mention.
    http://docs.oracle.com/cd/E11882_01/install.112/e24321.pdf
    There is even a section for 'Configuring Kernel Parameters for Linux'
    After you read the installation guide and evaluate your own installation environment if you have any specific questions then post them. And if you intend to 'guess those parameters vary greatly based on utilization, hardware, and various other factors' then it should be obvious that you need to provide information on 'utilization, hardware, and various other factors' or we won't be able to help you with your specific use case.
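    For reference, the 'Configuring Kernel Parameters for Linux' section of that guide revolves around /etc/sysctl.conf entries along these lines (typical 11gR2 minimum values; always verify them against the guide for your exact release, and size kernel.shmmax/kernel.shmall to your RAM before applying with `sysctl -p`):

```
# /etc/sysctl.conf -- typical minimum values from the Oracle 11gR2
# installation guide for Linux; check the guide for your release.
fs.aio-max-nr = 1048576
fs.file-max = 6815744
kernel.shmmni = 4096
kernel.sem = 250 32000 100 128
net.ipv4.ip_local_port_range = 9000 65500
net.core.rmem_default = 262144
net.core.rmem_max = 4194304
net.core.wmem_default = 262144
net.core.wmem_max = 1048576
```

    These are starting points, not tuning advice; as the answer above says, the installation guide is the authoritative source.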

  • How to configure SharePoint 2010 / 2013 Search for SQL Database Contents and Oracle Database Contents?

    Hi All,
    We are planning to maintain the contents in SQL Server / Oracle. Could anyone please suggest which is best for SharePoint 2010 / 2013 Search? How do we configure search for an external content source?
    Thanks & Regards,
    Prakash

    This link explains the supported and non-supported scenarios for using Oracle with BCS:
    http://social.technet.microsoft.com/Forums/sharepoint/en-US/453a3a05-bc50-45d0-8be8-cbb4e7fe7027/oracle-db-as-external-content-type-in-sharepoint-2013
    And here is more on it
    http://msdn.microsoft.com/en-us/library/ff464424%28office.14%29.aspx 
    And here how you can connect Oracle to SharePoint for BCS functionality
    http://lightningtools.com/bcs/business-connectivity-services-in-sharepoint-2013-and-oracle-using-meta-man/
    Overall, it seems SQL Server doesn't require any special arrangement to connect BCS to SharePoint.
    Regards,
    Pratik Vyas | SharePoint Consultant |
    http://sharepointpratik.blogspot.com
    Posting is provided AS IS with no warranties, and confers no rights
    Please remember to click Mark As Answer if a post solves your problem or
    Vote As Helpful if it was useful.

  • Performance analysis for oracle database on weblogic 10.3 version

    Hi All,
    We are migrating our servers from Windows to Linux. In the process, WebLogic 8.1 is being upgraded to WebLogic 10.3.
    How can I analyze the performance of Oracle 10g on WebLogic 10.3?
    Do we have any predefined test scripts in place to perform the necessary analysis?
    Thanks in anticipation.
    Pavan

    We do not provide any test scripts. As you can imagine, each WLP deployment is unique to the customer environment and the business problems being solved; generic test scripts would not reflect this.
    However, you might wish to read the following documentation which discusses database performance with regard to WLP.
    http://download.oracle.com/docs/cd/E13155_01/wlp/docs103/db/db_architecture.html#wp1069661
    Brad

  • SQL 2005 Standard Edition for SQL and AS databases with BPC 7.0 SP3?

    hello
    The install guide says that we must use SQL 2005 Enterprise Edition with BPC, except for Reporting Services.
    The customer asks whether he could use SQL 2005 Standard Edition for the SQL database and Analysis Services.
    Thanks.

    Hello Sorin. Two more questions, please:
    1) what do you mean by "Also into Application server you need EE because only in this version you have control flow." ?
    Is "control flow" the new "business process flow" functionality?
    2) On the other hand, I don't think that the BPC admin task is able to handle lite or full optimize without an Enterprise Edition client.
    But the customer asks whether it is possible to install on the application server:
    - SSIS standard edition for SSIS service
    - SSRS standard edition for Reporting Services
    - SQL client Enterprise Edition to let BPC admin task handle lite and full optimize properly.
    Why all this? Because Enterprise Edition costs ten times more than Standard Edition, and in a multi-server configuration the customer has to buy several SQL Server licenses, one per server running at least one SQL Server service. (For example, with SQL, SSIS, and OLAP on one server, and SSIS and SSRS running on the application server, the customer has to buy two SQL Server licenses.)
    Thanks. R.

  • Some Performance Counters are not displayed in PerfMon

    I have SQL Server 2014 installed on my machine. I need to monitor some performance counters like \SQLServer:Database Mirroring\Log Send Queue KB. When I open the Add Counters dialog of PerfMon, the counter is displayed in the list, but when I add the counter and click OK, it is not added to the list of counters being monitored. Can anyone tell me why this happens? The same issue occurs with some other counters, like Replication Dist.\Dist:Delivery Latency. My OS is Windows Server Standard.

    OK, I understand, but you would need at least 2 instances of SQL Server.
    http://www.mssqltips.com/sqlservertip/2464/configure-sql-server-database-mirroring-using-ssms/
    http://www.codeproject.com/Articles/715550/SQL-Server-Replication-Step-by-Step
    It would be easier for you, if you have servers with these setups in your environment, to check the counters or run your program there to test; counters for features like mirroring or replication typically only report instances once the feature is actually configured.

  • Problem in Azure SQL Database Basic Tier

    I have a WinForms application that was using Azure SQL Database (Web edition).
    It was working fine with the Web edition (for the last 2.5 years).
    After Microsoft's new offering of Azure SQL Database tiers, I tried the Basic tier.
    Observations for the last 24 hours, according to the Azure portal, with the database in the BASIC state:
    Successful Connections: 2 (Max), 1.5 (Avg), 3 (Total)
    Failed Connections: 0 (Max), 0 (Avg), 0 (Total)
    Throttled Connections: 0 (Max), 0 (Avg), 0 (Total)
    Deadlocks: 0 (Max), 0 (Avg), 0 (Total)
    Storage: 23.38 MB (Min), 23.75 MB (Max), 23.47 MB (Avg)
    CPU Percentage: 0 (Min), 3.34 (Max), 1.14 (Avg)
    Log Writes Percentage: 0 (Min), 0.05 (Max), 0.02 (Avg)
    Physical Data Reads Percentage: 0 (Min), 0.01 (Max), 0 (Avg)
    The WinForms application works fine, but there is a problem:
    when a report is generated (local .rdlc), the connection gives "Timeout Expired".
    This problem occurs in only 25% of cases (in 75% of cases it works fine) with the same data.
    If I upgrade the database to Standard (S1), it works fine.
    QUESTION IS:
    The resource stats show it is using very few resources,
    like CPU Percentage 3.34 (Max), Log Writes Percentage 0.05 (Max),
    and Physical Data Reads Percentage 0.01 (Max).
    Why does the report fail in 25% of cases in the BASIC state but work in the STANDARD S1 state?
    The price of Standard S1 is 8 times higher than Basic.
    Thanks

    Hello Fanny,
    Thanks for reply.
    I know that, and that's why I mentioned the different resource stats.
    My database (Basic) is using only:
    CPU Percentage: 3.34 (Max)
    Log Writes Percentage: 0.05 (Max)
    Physical Data Reads Percentage: 0.01 (Max)
    Microsoft says that if any parameter goes beyond 80% usage, then you need to upgrade the tier.
    Please see the following video on Channel 9:
    Azure SQL Database and the new Database Throughput Unit, by Tobias Ternstrom, Principal Program Manager Lead for performance in Azure SQL Database
    Thanks
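    Since the resource stats quoted above are nowhere near the 80% mark, the intermittent "Timeout Expired" reads like a transient issue, which client-side retry logic usually absorbs. A hedged sketch (`run_report` is a placeholder standing in for the .rdlc data query, not a real API):

```python
import random
import time

def with_retries(fn, attempts=4, base_delay=0.5):
    """Retry fn on TimeoutError with exponential backoff plus jitter."""
    for attempt in range(attempts):
        try:
            return fn()
        except TimeoutError:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the timeout
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))

# Simulated flaky call: fails twice with a timeout, then succeeds.
calls = {"n": 0}
def run_report():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("Timeout Expired")
    return "report data"

print(with_retries(run_report, base_delay=0.01))  # "report data" on the 3rd try
```

    One plausible explanation for the Basic-vs-S1 difference is that the Basic tier's much smaller DTU allowance makes an individual heavy query slower, tripping the client's command timeout even while averaged utilization looks tiny; retries and a longer command timeout are the usual first steps before upgrading the tier.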

    Hi guru's, When i am trying to activating delta in lis environment(2lis_03_s036) it is activating.But when i try to run init it is showing error 'lis dalta is still activated'.But when i run full upload data is loding(around 88 records).But i want to