SharePoint 2010 - SQL Server - Logs filling up drive space very quickly - please help

Hi there,
On my DEV SharePoint SQL Server, the logs have suddenly been filling up drive space very quickly - this morning I added 10 GB and it is full again after just a few hours.
What action should I take?
- Is it okay to switch Recovery Model from Full to Simple? (It is DEV server)
- Anything else?
Thanks.

Hi frob,
Can you please check the links below? They might be useful.
https://support.microsoft.com/kb/110139?wa=wsignin1.0
http://www.mssqltips.com/sqlservertip/2092/sql-server-transaction-log-grows-and-fills-up-drive/
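Before switching, it may help to confirm what is actually holding the log. A minimal sketch (WSS_Content is an assumed content database name; substitute your own):
-- Check each database's recovery model and what is preventing log truncation
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases;
GO
-- On a DEV server where point-in-time restore is not needed, SIMPLE recovery
-- lets the log truncate automatically at checkpoints
ALTER DATABASE WSS_Content SET RECOVERY SIMPLE;
GO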
If you think this is helpful, vote it as helpful. And if you see a reply that answers the question of the thread, click Mark As Answer.
Mahesh

Similar Messages

  • Split TempDb in SharePoint 2010 Sql Server DB

    Hi ,
    We have a SharePoint farm environment with one database server. We plan to split tempdb according to Microsoft best practices.
    At this time our largest content database is 260 GB in size and the SQL Server has a 2-core processor.
    So tempdb would be split into two files of 40 GB per .mdf file.
    Kindly suggest how to split tempdb.
    Hasan Jamal Siddiqui(MCTS,MCPD,ITIL@V3),Sharepoint and EPM Consultant,TCS

    Hi Shanky,
    We are not facing any issue, but we have taken Microsoft RaaS services, and the Microsoft consultant suggested that we should break up our tempdb.
    RaaS ISSUE/RISK
    Initial size of tempdb is too small.
    TempDB database size may be too small.
    SQL Server instance has only one tempdb data file while there is more than one scheduler in use.
    One or more user database files are placed on the same disk volume as the tempdb database files.
    REMEDIATION COMMENTS
    The size and physical placement of the tempdb database can affect the performance of a SharePoint farm. For example, if the size that is defined for tempdb is too small, part of the system-processing load may be taken up with autogrowing tempdb to the size required to support the workload every time you restart the instance of SQL Server. You can avoid this overhead by increasing the sizes of the tempdb data and log files.
    Currently the .mdf and .ldf files are on the D volume along with all other SharePoint databases, and there is just one tempdb .mdf file.
    More information:
    http://technet.microsoft.com/en-us/library/ms175527(v=SQL.105).aspx
    ACTION PLAN
    Set the recovery model of tempdb to the SIMPLE recovery model.
    Create as many files as needed to maximize disk bandwidth. Using multiple files reduces tempdb storage contention and yields significantly better scalability. However, do not create too many files, because this can reduce performance and increase management overhead. As a general guideline, use # of tempdb files = # of processor cores / 4, and then adjust the number of files up or down as necessary.
    Preallocate space for all tempdb files by setting the file size to a value large enough to accommodate the typical workload in the environment. You can go by a rule of thumb that the size should be 25% of the largest SharePoint content database.
    Important: Make each tempdb data file the same size.
    Allow your tempdb files to grow automatically and monitor the disk free space; set the file growth to a fixed size, not a percentage. We recommend the following general guidelines for setting the FILEGROWTH increment for tempdb files:
    tempdb file size from 0 to 100 MB - FILEGROWTH increment of 10 MB
    tempdb file size from 100 to 200 MB - FILEGROWTH increment of 20 MB
    tempdb file size of 200 MB or more - FILEGROWTH increment of 10%*
    * The FILEGROWTH increment should be set to a maximum of 6 GB, regardless of the tempdb file size.
    Put the tempdb database on a fast I/O subsystem. Use disk striping if there are many directly attached disks.
    Put the tempdb database on disks that differ from those that are used by SharePoint.
    Important: This action plan requires downtime for your SharePoint farm, as you will have to stop the SQL Server services to move data and transaction log files to different disks.
    At the time of this remediation the largest content database is 300 GB in size and the SQL Server has a 2-core processor, so tempdb would be split into two files of 40 GB per .mdf file.
    STATUS
    SharePoint administrators will work on the action plan.
    Downtime required to apply this action plan? YES
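    A minimal T-SQL sketch of the file changes described in the action plan (the logical file names, sizes, growth increment, and the H:\ path are assumptions; check sys.master_files for the actual names and adjust to your sizing):
    USE master;
    GO
    -- Preallocate the existing primary tempdb data file (logical name is typically 'tempdev')
    ALTER DATABASE tempdb
    MODIFY FILE (NAME = tempdev, SIZE = 40GB, FILEGROWTH = 500MB);
    GO
    -- Add a second, equally sized data file, ideally on a dedicated volume
    ALTER DATABASE tempdb
    ADD FILE (NAME = tempdev2, FILENAME = N'H:\TempDB\tempdev2.ndf', SIZE = 40GB, FILEGROWTH = 500MB);
    GO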
    Hasan Jamal Siddiqui(MCTS,MCPD,ITIL@V3),Sharepoint and EPM Consultant,TCS

  • SharePoint 2010 - SQL Server Service Application Server Appears on Server without SSRS Installed

    I have an SSRS instance installed on Server2 in integrated mode, and reports within SharePoint are working as expected.  But when you go to manage the SQL Server service application it throws a 503 error, and I believe it is related to the fact that the service also shows up on another application server, Server1, that does not have SSRS installed.  The service is disabled and it shows up in the list of services for the server, but I believe SharePoint thinks it's the primary service and cannot access it.  Is there any way to remove the service from Server2?  I attempted the PowerShell commands below to remove it, but they give me an error.  Any suggestions would be appreciated.
    Get-SPRSServiceApplicationServers
    Address
    Server1
    Server2
    get-spserviceinstance -all |where {$_.TypeName -like "SQL Server Reporting*"}
    TypeName                         Status   Id
    SQL Server Reporting Services... Disabled a5179dce-2d6c-476b-a74b-764375d70a94
    SQL Server Reporting Services... Online   64a99ed8-d31c-4dd3-b9f7-b6d946e41e16
    Remove-SPServiceApplication a5179dce-2d6c-476b-a74b64375d70a94 -RemoveData
    Remove-SPServiceApplication : Object not found.
    At line:1 char:28
    + Remove-SPServiceApplication <<<<  a5179dce-2d6c-476b-a74b-764375d70a94 -RemoveData
        + CategoryInfo          : ObjectNotFound: (Microsoft.Share...viceApplication:SPCmdletRemoveServiceApplication) [Remove-SPServiceApplication], InvalidOperationException
        + FullyQualifiedErrorId : Microsoft.SharePoint.PowerShell.SPCmdletRemoveServiceApplication
    Remove-SPServiceApplication : Object reference not set to an instance of an object.
    At line:1 char:28
    + Remove-SPServiceApplication <<<<  a5179dce-2d6c-476b-a74b-764375d70a94 -RemoveData
        + CategoryInfo          : InvalidData: (Microsoft.Share...viceApplication:SPCmdletRemoveServiceApplication) [Remove-SPServiceApplication], NullReferenceException
        + FullyQualifiedErrorId : Microsoft.SharePoint.PowerShell.SPCmdletRemoveServiceApplication

    You're getting a Service Instance object and trying to remove it with a Service Application cmdlet... This won't quite work :)  Can you double check to validate that SSRS 2008 R2 or higher bits were not accidentally laid down on the problem host?
    Trevor Seward, MCC
    Follow or contact me at...
    This post is my own opinion and does not necessarily reflect the opinion or view of Microsoft, its employees, or other MVPs.

  • SharePoint 2010 SQL Server 2008 R2 SSRS configuration

    Hello,
    I need to configure SSRS in a multi-server architecture farm. The client has given me a user with admin rights, but I cannot connect to the server in the configuration manager; at first it raised a time span exception.
    The user has admin rights and I am running with "Run as administrator".
    Please help.
    Thanks

    Hi,
    According to your post, my understanding is that you want to configure SSRS in a multi-server architecture farm.
    Please make sure you configure it correctly. You can refer to:
    How to: Install and Configure SharePoint Integration on Multiple Servers
    Most time-out errors occur during query processing. If you are encountering time-out errors, try increasing the query time-out value in the report dataset.
    Setting Time-out Values for Report and Shared Dataset Processing (SSRS)
    Best Regards,
    Linda Li
    TechNet Community Support

  • SharePoint 2010 & SQL Server Report Builder 3.0 - HTTP 404 Error

    Hi all,
    Apologies if this post is not in the correct forum. 
    I have recently upgraded my SQL installation on my SharePoint server from Express 2008 R2 to Standard 2008 R2 so I could set up Reporting Services integration in Central Administration.  I followed instructions I found on the web and all appeared to be good.
    Installed Report Builder on my PC and set up a data source to a List I have on one of my site collections - "Test Connection" works fine.  If I then try to create a Dataset using Query Designer, I get the error below -
    "An error occurred when accessing the specified SharePoint list.  The connection string might not be valid.  Verify that the connection string is correct - The request failed with HTTP status 404: Not Found. (System.Web.Services)". 
    I can't even get to a point where I can see any fields from my list.
    I am happy to provide any further information to try and clarify my problem.  Any help would be most appreciated!

    Just an update.  I resolved my issue.  The problem appeared to be that when creating the data source, I had been adding the URL of the actual list (e.g. http://servername/sitename/listname) as the connection string, where in fact I should have just added the server name and site - e.g. http://servername/sitename.
    Hopefully this may help somebody else :-)

  • Reporting Services 2012 in SharePoint 2010 - SQL database edition?

    We currently have a SharePoint 2010 farm, running on a SQL Server 2008 R2 Standard database.  On one of the SharePoint 2010 servers, we have an instance of Reporting Services 2008 Enterprise running in SharePoint integrated mode.  (We require Enterprise
    for data-driven subscriptions)
    We'd like to upgrade to Reporting Services 2012 for Data Alerts and a few other new features.  However, I noticed something on
    this blog stating that the SharePoint 2010 database server edition needs to match the Reporting Services edition. My concern is that I've not seen any official mention of this in Microsoft's documentation, and would like to avoid upgrading to Enterprise
    for the SharePoint 2010 database if I can avoid it.
    So, what I'd like to run is:
    SharePoint 2010
    SharePoint database: SQL Server 2008 R2 Standard
    Reporting Services 2012 Enterprise
    Reporting Services database: same as SharePoint database, so SQL Server 2008 R2 Standard
    Is this possible?  Or do I need to upgrade the SharePoint database to SQL Server 2008 R2 Enterprise?  
    Alternatively, could I install a separate database engine just for Reporting Services to host its databases (for example, SQL Server 2012 Enterprise), so I don't have to mess around with upgrading the SP database?

    While you don't need to upgrade your SQL 2008 R2 to 2012, you will need to install an instance of SQL 2012 & SSRS 2012. They can be side-by-side and independent of each other. The following blog post provides steps to do so,
    http://www.madronasg.com/blog/how-configure-sql-server-reporting-services-2012-use-sharepoint-2010#.UuEVd2Ao74Y
    Dimitri Ayrapetov (MCSE: SharePoint)

  • SQL Server Logs

    Hi,
    I am getting errors in the SQL Server logs.
    They are eating up all my space on the C: drive.
    Now it says 0 bytes left.
    What could be the issue?
    Please help, it's urgent.
    Regards, Kunjay Shah

    Hello,
    Meanwhile you can limit the error log size with the following instruction:
    USE [master];
    GO
    EXEC xp_instance_regwrite
        N'HKEY_LOCAL_MACHINE',
        N'Software\Microsoft\MSSQLServer\MSSQLServer',
        N'ErrorLogSizeInKb',
        REG_DWORD, 5120;
    GO
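    A complementary step, if the current error log has already filled the drive, is to cycle the error log so SQL Server closes the current file and starts a new one (older ERRORLOG.n files can then be archived or deleted):
    -- Close the current error log and start a new one
    EXEC sp_cycle_errorlog;
    GO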
    For more information, please read the following article:
    http://support.microsoft.com/kb/2199578/en-us
    Hope this helps.
    Regards,
    Alberto Morillo
    SQLCoffee.com

  • Warning: Failure to calculate super-latch promotion threshold. appears in SQL Server log

    We are running SQL Server 2008 R2 and have just applied Service Pack 2 to our QA environment.
    Today we noticed this message in the SQL Server log:
    Warning: Failure to calculate super-latch promotion threshold.
    Can someone please tell us what this means and how to fix whatever it is?
    There is no associated error message number.
    We believe this message is new with Service Pack 2.  Can someone tell us for sure?
    This entry has occurred in the middle of the night when nothing was running that we could see and during the day.
    Environment
    SQL server 10.50.4000
    Windows Server 2008 R2 Standard Service pack 1
    The server has 4 processors and 8 GB of RAM, with 4 GB set for SQL Server.  Usage is generally light as it is a test environment.
    KD

    It's just a warning message about super-latches.
    Since SQL Server 2005, SuperLatches have been available to improve performance when accessing shared pages in the database under a highly concurrent workload, where many worker threads require SH (shared) latches. There is no configuration to set or enable; the promotion of a page latch to a super-latch is performed automatically and dynamically based on the multi-CPU configuration. The SuperLatch then partitions the single latch into an array of sublatch structures, one per CPU core.
    Super-latches are a way of promoting a latch across an entire NUMA node to reduce contention.
    You can also read the following detailed article:
    http://blogs.msdn.com/b/psssql/archive/2010/03/24/how-it-works-bob-dorr-s-sql-server-i-o-presentation.aspx
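    If you want to see whether latch waits are actually a problem on this instance, a quick check (a standard DMV, not specific to this warning) is:
    -- Aggregate latch wait statistics since the last instance restart
    SELECT latch_class, waiting_requests_count, wait_time_ms
    FROM sys.dm_os_latch_stats
    ORDER BY wait_time_ms DESC;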
    Soldier..Sir we are surrounded from all sides by enemy.. Major: Good, we can attack in any direction Thats attitude..

  • SQL Logs filling up disk space

    Hi there,
    On my DEV SQL Server, the logs have suddenly been filling up space very quickly - this morning I added 10 GB and it is full again after just a few hours.
    What action should I take?
    - Is it okay to switch Recovery Model from Full to Simple? (It is DEV server)
    - Anything else?
    Thanks.

    Hi frob,
    For development databases, if you don't care whether recent data changes are lost, you can change the recovery model from full to simple, and then shrink the transaction log file to a reasonable size. Below is an example; note that the file cannot be shrunk below the amount of space currently in use.
    USE AdventureWorks2012;
    GO
    -- Truncate the log by changing the database recovery model to SIMPLE.
    ALTER DATABASE AdventureWorks2012
    SET RECOVERY SIMPLE;
    GO
    -- Shrink the truncated log file to 1024 MB.
    DBCC SHRINKFILE (AdventureWorks2012_Log, 1024);
    GO
    Additionally, there are other options for dealing with a SQL Server log file that grows out of control:
    • Backing up transaction logs frequently.
    • Adding a log file on a different disk.
    • Completing or killing a long-running transaction.
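    For the first option, a minimal sketch of a routine log backup while staying in the FULL recovery model (the database name and backup path are assumptions):
    -- Back up the transaction log so its inactive portion can be truncated and reused
    BACKUP LOG AdventureWorks2012
    TO DISK = N'D:\Backups\AdventureWorks2012_log.trn';
    GO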
    Reference:
    Troubleshoot a Full Transaction Log
    SQL Server Runaway Transaction Logs
    Managing the SQL Server Transaction Log: Dealing with Explosive Log Growth
    Thanks,
    Lydia Zhang

  • Crystal Report 2008 results in "Password did not match" error in SQL Server Log

    Hi,
    I am trying to develop some reports using Crystal Reports 2008 SP4, connecting to SQL Server 2008 through RDO (ODBC). My problem is that any time Crystal Reports tries to establish a connection with SQL Server, it first tries to connect with the wrong password, which results in the following error message in the SQL Server log:
    Login failed for user 'peyman'. Reason: Password did not match that for the login provided. [CLIENT: 192.168.2.198]
    The login name 'peyman' is the right one, as I used the same one when setting up the ODBC System DSN using "SQL Server Native Client 10". But somehow Crystal Reports takes its chances and tries to connect before prompting me for the password. After this error is logged on the SQL Server side, Crystal Reports pops up the prompt to enter the DSN password. After I supply the password to this prompt, Crystal Reports works fine, pulls the data, and renders the report without any more incorrect password errors logged to SQL Server.
    The attached file shows the steps I am taking to reproduce the issue. I need this fixed, because any time anyone uses one of these reports and tries to render it with the Crystal runtime engine, the same error message is raised on the SQL Server side and logged in the log file.
    Thanks,

    Hi Peyman,
    This is the way it should work. In the CR designer, when you open the report it does nothing. As soon as you hit the Refresh button, CR tries to use the connection saved with the report. It assumes what is saved with the report is a valid server and connection info; CR simply tries to verify the server is still active.
    If you don't want it to fail in the CR designer, then before opening any report click the File, Log on Database option and connect. Now when refreshing reports it won't log the failed attempt.
    The SDK does the same thing: it assumes the server info is the same and tries to connect to verify that the server exists. It only takes a few milliseconds to do this.
    To stop this from happening, set the log on info first; then it won't log the attempt to connect, which is checking to see if Trusted is allowed.
    TIMESTAMP    THREAD_ID    FILENAME    LINE_NUMBER    LOGGED_DATA    LEVEL
    2014-6-2 8:30:11.439    57320    .\QESession.cpp    444    Set Product View Locale: 4105    20
    2014-6-2 8:30:11.439    57320    .\QESession.cpp    478    Set Preferred View Locale: 4105    20
    2014-6-2 8:30:11.439    57320    .\QESession.cpp    500    Set Process Locale: 4105    20
    2014-6-2 8:30:11.440    57320    .\qecommon.cpp    117    This property is currently in a read-only state and cannot be modified. File Name: ".\QEProperty.cpp". Line: 217    1
    2014-6-2 8:30:11.967    57208    .\QESession.cpp    444    Set Product View Locale: 4105    20
    2014-6-2 8:30:11.967    57208    .\QESession.cpp    478    Set Preferred View Locale: 4105    20
    2014-6-2 8:30:11.967    57208    .\QESession.cpp    500    Set Process Locale: 4105    20
    2014-6-2 8:30:11.968    57208    .\qecommon.cpp    117    This property is currently in a read-only state and cannot be modified. File Name: ".\QEProperty.cpp". Line: 217    1
    2014-6-2 8:30:11.999    57208    .\QESession.cpp    444    Set Product View Locale: 1033    20
    2014-6-2 8:30:12.4    57320    .\qecommon.cpp    117    This value is write-only. File Name: ".\QEProperty.cpp". Line: 145    1
    2014-6-2 8:30:56.278    57208    .\odbcapi.cpp    301    Beginning COdbcapi::DriverConnect    20
    2014-6-2 8:30:56.342    57208    .\odbcapi.cpp    335    Ending COdbcapi::DriverConnect    20
    2014-6-2 8:30:56.342    57208    .\connect.cpp    2170    SQLDriverConnect succeeded: DSN = 192.168.13.172, User ID = sa, Password = ********    10
    2014-6-2 8:30:56.348    57208    .\qecommon.cpp    117     File Name: ".\QEQueryInfo.cpp". Line: 826    1
    2014-6-2 8:30:56.348    57208    .\qecommon.cpp    117     File Name: ".\QEQueryInfo.cpp". Line: 854    1
    2014-6-2 8:30:56.348    57208    .\qecommon.cpp    117     File Name: ".\QEQueryInfo.cpp". Line: 826    1
    2014-6-2 8:30:56.348    57208    .\qecommon.cpp    117     File Name: ".\QEQueryInfo.cpp". Line: 826    1
    2014-6-2 8:30:56.348    57208    .\qecommon.cpp    117     File Name: ".\QEQueryInfo.cpp". Line: 854    1
    2014-6-2 8:30:56.348    57208    .\qecommon.cpp    117     File Name: ".\QEQueryInfo.cpp". Line: 826    1
    2014-6-2 8:30:56.348    57208    .\qecommon.cpp    117     File Name: ".\QEQueryInfo.cpp". Line: 854    1
    2014-6-2 8:30:56.348    57208    .\qecommon.cpp    117     File Name: ".\QEQueryInfo.cpp". Line: 916    1
    2014-6-2 8:30:56.348    57208    .\QERowset.cpp    1184    Beginning CQERowset::readFirstRecord    20
    2014-6-2 8:30:56.348    57208    .\QERowset.cpp    2149    Beginning CQERowset::restart    20
    2014-6-2 8:30:56.348    57208    .\QERowset.cpp    2370    Beginning CQERowset::execute    20
    2014-6-2 8:30:56.353    57208    .\DbQueryBuilder.cpp    514    Query Targets: sqlncli10, ODBC3SQLServer    10
    2014-6-2 8:30:56.353    57208    .\DbQueryBuilder.cpp    525    Successfully built query:    SELECT "Orders"."Customer ID", "Orders"."Employee ID"   FROM   "xtreme"."dbo"."Orders" "Orders"    10
    2014-6-2 8:30:56.353    57208    .\odbcapi.cpp    875    Beginning COdbcapi::ExecDirect    20
    2014-6-2 8:30:56.354    57208    .\odbcapi.cpp    884    Finishing COdbcapi::ExecDirect    20
    2014-6-2 8:30:56.354    57208    .\rowset.cpp    220    SQLExecDirect succeeded:  SELECT "Orders"."Customer ID", "Orders"."Employee ID" FROM   "xtreme"."dbo"."Orders" "Orders"    10
    2014-6-2 8:30:56.354    57208    .\QERowset.cpp    2814    bindToField succeeded: Orders.Customer ID is using client buffer    10
    2014-6-2 8:30:56.354    57208    .\QERowset.cpp    2814    bindToField succeeded: Orders.Employee ID is using client buffer    10
    Notice it doesn't try to connect first if I set the log on info first using code.
    So there is nothing we can do to stop SQL Server from logging this info. Check with the DBA; possibly they can "filter" out the application's attempts to connect, or change your workflow in the app.
    If you have a Preview button to view the report, then simply add your database log on info prompt there if ConnectionInfo.Trusted is not true:
    bool mainSecureDB = rpt.Database.Tables[tableIndex].LogOnInfo.ConnectionInfo.IntegratedSecurity;
    If mainSecureDB is false, then prompt the user for log on info and set it accordingly; if it is true, then it should not fail when it connects.
    This is simply a matter of changing your app workflow...
    Thanks
    Don

  • SharePoint 2013 SQL Server Edition for BI Features - must be on SharePoint SQL Server?

    I need to install SQL Server 2012 for a new SharePoint 2013 installation.
    Let's say I want to use the BI features of SharePoint 2013 like PowerView.
    I already have a separate SQL Server running SQL Server 2012 BI Edition that is used as the database server for our data warehouse and some apps.  But this SQL Server will not be used to house the SharePoint 2013 databases.
    Do I need to install SQL Server 2012 BI Edition on the SharePoint 2013 SQL Server (where the SharePoint 2013 databases will be housed), or can I use SQL Server 2012 Standard Edition on that server and utilize the BI Edition on the data warehouse server to use the BI features of SharePoint?

    Yes, BI or Enterprise must be installed on the SharePoint server in order to integrate SSRS. PowerPivot can be on a separate server with just a download (http://www.microsoft.com/en-us/download/details.aspx?id=35577) for certain components residing on
    the SharePoint server. This will give you PowerView, as well.
    Trevor Seward
    Follow or contact me at...
    This post is my own opinion and does not necessarily reflect the opinion or view of Microsoft, its employees, or other MVPs.

  • SQL Server log shipping space consumption

    I have implemented SQL Server log shipping from the HQ server to the DR server.
    The secondary databases are in standby mode.
    The issue is that after configuring it, my DR server is running out of space very rapidly.
    I have checked the log shipping folders where the .trn files reside; they are of a very decent size, and the retention is configured for twenty-four hours.
    I checked the secondary databases and their size is exactly the same as that of the corresponding primary databases.
    Could you please help me identify the reason behind this odd space increase?
    I would be grateful if you could point me to some online resources that explain this matter in depth.

    The retention is happening. I have checked the folders and they do not have files older than 24 hours.
    I don't know, maybe it's because on the secondary server (DR) there is no full backup job running; is the .ldf file getting bigger and bigger because of this? But again, as far as my understanding goes, we cannot take a full database backup in standby mode.
    The TLog files of log-shipped DBs on the secondary will be the same size as those on the primary. The only way to shrink the TLog files on the secondary (I am not advising you to do this) is to shrink them on the primary, force-start the TLog backup job, then the copy job, then the restore job on the secondary, which will sync the size of the TLog file on the secondary.
    If you have allocated the same sized disk on both primary and secondary for TLog files, then check if Volume Shadow Copy is consuming the space on the secondary.
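    To see which files are actually consuming the space on the DR server, a quick check (standard commands; database names will differ in your environment):
    -- Size and path of every data and log file on the instance
    SELECT DB_NAME(database_id) AS database_name, name, physical_name,
           size * 8 / 1024 AS size_mb
    FROM sys.master_files
    ORDER BY size DESC;
    -- Log size and percentage used per database
    DBCC SQLPERF(LOGSPACE);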
    Satish Kartan www.sqlfood.com

  • SharePoint 2013 - SQL Server BCS Model Incremental Crawl content doesn't show up in Search results

    SharePoint 2013 - SQL Server BCS model incremental crawl content doesn't show up in search results. The incremental crawl is working fine, i.e., it's picking up the newly added records in the table, but the newly added content is not available on the search results page.
    But when I do a full crawl, search results show up with the appropriate content.
    What could be the issue here?
    Suresh Kumar Udatha.

    This time on the full crawl I got only 62 warnings, 12 errors, and ~537,000 successes. The warnings were about truncating the crawled documents because their content length exceeded the limit configured for the crawl. The 12 errors were "Processing this item failed because of a timeout when parsing its contents." and "The content processing pipeline failed to process the item.". I think 12 errors is not enough to justify re-executing a full crawl. The site collection has one SP site group (with the Read permission level). In this site group I have only one AD group added, so a permission change is not a possible reason for a re-crawl, plus nobody changed anything in this AD group. All documents are stored in 2 document libraries and there are no sub-sites. I want to access these documents through search (a custom managed property restriction in KQL), but this way I have no mechanism to quickly re-crawl only the error documents from the first full crawl (those 12). This is very strange and makes SP 2013 Search almost unusable for my scenario.
    Thanks,
    Darko
    Darko Milevski http://mkdot.net/blogs/darko/

  • Possible to view MS SQL Server Log in SAP GUI?

    Hi,
    I am wondering if I can view the SQL Server log in SAP GUI using a transaction code.
    I don't readily see how to do this with the TEs.  It would be helpful to see things like who has tried to log into the SQL Server via a database management tool outside of SAP, such as SQL Server Management Studio, DBArtisan, or other services.
    thanks
    Chris

    Hi Chris,
    Yes, you can view the SQL Server log in transaction code ST04 and the DBA Cockpit in SAP GUI.
    Kindly refer to the SAP notes:
    139945 - SAP Database Monitor for MS SQL Server
    1027512 - MSSQL: DBA cockpit for basis release 7.00 and later
    Regards
    Ram

  • "HadrLogCapture" messages in SQL Server Log

    Hi,
    After activating AlwaysOn Availability Groups in SQL Server 2012 R2 and adding databases to the availability group, SQL Server started generating lots of messages in the SQL Server logs.
    Some example messages are: "DbId (1466) log capture is rescheduled with partner generation: 45, log consumer id: 9828",
    "HadrLogCapture::BuildMsgBodyForSend - DbId [2535]->[1807D374-DA55-4CAC-A831-AA3C3A8B8EC4] Send From saved msg - LogBlockId (0x000221e800000010)"
    Is there a possibility to stop generating these messages?

    I guess suppressing them would be the wrong approach; there may be some underlying problem with the data movement between your primary and secondary replicas.
    Here are my questions:
    What is the sync setting and how many replicas do you have?
    What is the SQL version? (Output of the SELECT @@VERSION query)
    Do you see any latency on the Dashboard? Check if the redo queue or log send queue is high (see the query below).
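    A minimal sketch for checking the send and redo queues per database (standard AlwaysOn DMVs; run it on the primary replica):
    -- Log send queue and redo queue sizes, in KB, for each availability database
    SELECT DB_NAME(drs.database_id) AS database_name,
           ar.replica_server_name,
           drs.log_send_queue_size,
           drs.redo_queue_size,
           drs.synchronization_state_desc
    FROM sys.dm_hadr_database_replica_states AS drs
    JOIN sys.availability_replicas AS ar
        ON drs.replica_id = ar.replica_id;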
    Balmukund Lakhani
    Please mark solved if I've answered your question, vote for it as helpful to help other users find a solution quicker
    This posting is provided "AS IS" with no warranties, and confers no rights.
    My Blog | Team Blog | @Twitter | Facebook
    Author: SQL Server 2012 AlwaysOn - Paperback, Kindle
