Using PowerShell to execute DBCC CHECKDB from SQL Agent.
So, I have a weird situation that I think is tied to the resource pool and PowerShell, but I am having trouble determining whether that is actually the cause. I run DBCC CHECKDB using a resource pool and a secondary account. The account has permissions to do the work,
and if I log onto the server and run the procedure locally, it runs fine under the resource pool configuration. However, when I kick it off from the SQL Agent job using a PowerShell step, it only runs CHECKDB against the first 3 system databases and then stops once it
hits a user database. I am not seeing any errors or messages; it just stops. I ran Profiler and I can see it do the work, get to the first user database, issue this statement from DBCC, and then just stop, and the job ends.
SELECT @BlobEater = CheckIndex (ROWSET_COLUMN_FACT_BLOB)
FROM { IRowset 0xD0936A7902000000 }
GROUP BY ROWSET_COLUMN_FACT_KEY
WITH ORDER BY
ROWSET_COLUMN_FACT_KEY,
ROWSET_COLUMN_SLOT_ID,
ROWSET_COLUMN_COMBINED_ID,
ROWSET_COLUMN_FACT_BLOB
OPTION (ORDER GROUP)
I am not doing anything special in my code that would limit which databases to process. As I said earlier, executing the call to the procedure from a query window runs as expected and processes all of the databases.
Here is the Agent Code calling powershell:
[string] $DayOfWeek = ""
$DayOfWeek = (Get-Date).DayOfWeek.ToString()
$DayOfWeek
if ($DayOfWeek -eq 'Sunday')
{
    Invoke-Sqlcmd -Database sysadm -ServerInstance HQIOSQLDEV01\DEV01 -Query "exec ConsistencyCheck.upConsistencyCheck NULL, 'N', 'Y', 'N', 'N', 'N'"
}
else
{
    Invoke-Sqlcmd -Database sysadm -ServerInstance HQIOSQLDEV01\DEV01 -Query "exec ConsistencyCheck.upConsistencyCheck NULL, 'Y', 'N', 'N', 'N', 'N'"
}
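One thing worth ruling out (a sketch under assumptions, not a confirmed fix): the SQLPS job step can swallow errors and informational messages from Invoke-Sqlcmd, and a long-running call can be cut off by a query timeout. A more defensive version of the step, where the -QueryTimeout, -Verbose, and try/catch are additions of mine and not part of the original job, might look like this:

```powershell
[string] $DayOfWeek = (Get-Date).DayOfWeek.ToString()
$DayOfWeek

# 'N'/'Y' flags mirror the original job step; -QueryTimeout 65535 avoids an
# early timeout, and -Verbose/-ErrorAction Stop surface DBCC output and errors.
$flags = if ($DayOfWeek -eq 'Sunday') { "NULL, 'N', 'Y', 'N', 'N', 'N'" }
         else                         { "NULL, 'Y', 'N', 'N', 'N', 'N'" }

try {
    Invoke-Sqlcmd -Database sysadm -ServerInstance 'HQIOSQLDEV01\DEV01' `
        -Query "exec ConsistencyCheck.upConsistencyCheck $flags" `
        -QueryTimeout 65535 -Verbose -ErrorAction Stop
}
catch {
    # Re-throw so the Agent job step is marked failed instead of ending silently
    throw
}
```

If the job still stops at the first user database with this in place, that points away from swallowed errors and back toward the resource pool or the step's security context.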
John M. Couch
There are 3 additional databases. The last known good is today, as I am able to execute the procedure via a query window just fine. It is only when executed from the SQL Agent job as above that it stops after only doing the system databases. The largest database
is 130 GB in size, with the largest table being 62 GB.
-- Create Procedures
raiserror('Creating Procedure ''%s''', 0, 1, '[ConsistencyCheck].[upConsistencyCheck]')
go
/*==============================================================================
Procedure: upConsistencyCheck
Schema: ConsistencyCheck
Database: SysAdm
Owner: dbo
Application: dbo
Inputs: Catalogue : nvarchar(128) : NULL = All Databases
Physical Only : nchar(1) : Y/N, NULL = N
Data Purity : nchar(1) : Y/N, NULL = N
No Index : nchar(1) : Y/N, NULL = N
Extended Logical Checks : nchar(1) : Y/N, NULL = N
Table Lock : nchar(1) : Y/N, NULL = N
Outputs: (0 = Success, !=0 = failure)
Result Set: N/A
Usage: declare @ii_Rc int
,@invc_Catalogue nvarchar(128)
,@inc_PhysicalOnly nchar(1)
,@inc_DataPurity nchar(1)
,@inc_NoIndex nchar(1)
,@inc_ExtendedLogicalChecks nchar(1)
,@inc_TabLock nchar(1)
select @invc_Catalogue = NULL
,@inc_PhysicalOnly = 'Y'
,@inc_DataPurity = 'N'
,@inc_NoIndex = 'N'
,@inc_ExtendedLogicalChecks = 'N'
,@inc_TabLock = 'N'
exec @ii_Rc = ConsistencyCheck.upConsistencyCheck @invc_Catalogue
, @inc_PhysicalOnly
, @inc_DataPurity
, @inc_NoIndex
, @inc_ExtendedLogicalChecks
, @inc_TabLock
print 'Return Code: ' + convert(varchar, @ii_Rc)
Description: This Procedure is used to run DBCC CheckDB on 1 or all Databases
on the existing instance.
Version: 1.00.00
Compatibility: SQL Server 2008 (100)
Created By: John M. Couch
Created On: 04-26-2012
================================================================================
Notes
1. Some logic was taken directly from Ola Hallengren's Maintenance Script.
http://ola.hallengren.com
================================================================================
History: (Format)
When Who Version Code Tag What
04-26-2012 John Couch 1.00.00 (None) Initial Revision
==============================================================================*/
alter procedure ConsistencyCheck.upConsistencyCheck (@invc_Catalogue nvarchar(128)
,@inc_PhysicalOnly nchar(1)
,@inc_DataPurity nchar(1)
,@inc_NoIndex nchar(1)
,@inc_ExtendedLogicalChecks nchar(1)
,@inc_TabLock nchar(1)) as
/*==============================================================================
Variable Declarations & Temporary Tables
==============================================================================*/
declare @li_Rc int = 0
,@lnvc_ExecutedBy nvarchar(128) = user_name()
,@ldt_ExecutedOn datetime = getdate()
,@lnvc_Catalogue nvarchar(128) = @invc_Catalogue
,@lnc_PhysicalOnly nchar(1) = coalesce(@inc_PhysicalOnly, 'N')
,@lnc_DataPurity nchar(1) = coalesce(@inc_DataPurity, 'N')
,@lnc_NoIndex nchar(1) = coalesce(@inc_NoIndex, 'N')
,@lnc_ExtendedLogicalChecks nchar(1) = coalesce(@inc_ExtendedLogicalChecks, 'N')
,@lnc_TabLock nchar(1) = coalesce(@inc_TabLock, 'N')
,@lnvc_Instance nvarchar(128) = cast(serverproperty('ServerName') as nvarchar)
,@lnvc_Version nvarchar(40) = cast(serverproperty('ProductVersion') as nvarchar)
,@lnvc_Edition nvarchar(40) = cast(serverproperty('Edition') as nvarchar)
,@li_Compatibility int
,@ldt_CreateDate datetime
,@lnvc_UserAccess nvarchar(35)
,@lnvc_StateDescription nvarchar(35)
,@lnvc_PageVerifyOption nvarchar(35)
,@lti_IsReadOnly tinyint
,@lti_IsInStandBy tinyint
,@lnvc_Recipients nvarchar(2000) = '[email protected]'
,@lnvc_Subject nvarchar(128)
,@lnvc_ErrorMessage nvarchar(4000)
,@lnvc_SQL nvarchar(max)
,@lnvc_ManualSQL nvarchar(max)
,@lnvc_Query nvarchar(2048)
,@li_ConsistencyCheckID int
,@ldt_ExecutionStart datetime
,@ldt_ExecutionFinish datetime
declare @ltbl_Catalogue table (Catalogue sysname
,CompatibilityLevel int
,CreateDate datetime
,UserAccess nvarchar(35) -- MULTI_USER, SINGLE_USER, RESTRICTED_USER
,StateDescription nvarchar(35) -- ONLINE, RESTORING, RECOVERING, RECOVERY_PENDING, SUSPECT, EMERGENCY, OFFLINE
,PageVerifyOption nvarchar(35) -- NONE, TORN_PAGE_DETECTION, CHECKSUM
,IsReadOnly tinyint
,IsInStandBy tinyint
,IsAutoShrink tinyint
,IsAutoClose tinyint
,Flag bit default 0)
create table #ltbl_Output (Error int,
Level int,
State int,
MessageText nvarchar(max),
RepairLevel nvarchar(30),
Status int,
DBID smallint,
ObjectID int,
IndexID smallint,
PartitionID bigint,
AllocunitID bigint,
[File] int,
Page int,
Slot int,
RefFile int,
RefPage int,
RefSlot int,
Allocation int)
/*==============================================================================
Initialize Environment
==============================================================================*/
set nocount on
set quoted_identifier on
/*==============================================================================
Parameter Validation
==============================================================================*/
-- Configure Alert Mail Subject Line
set @lnvc_Subject = 'Check consistency parameter validation error occurred on ' + cast(@@servername as nvarchar)
if @lnc_PhysicalOnly not in ('Y','N')
begin
set @lnvc_ErrorMessage = N'The value for parameter @inc_PhysicalOnly is not supported.' + char(13) + char(10) + ' '
set @li_Rc = -1
end
if @lnc_DataPurity not in ('Y','N') and @li_Rc = 0
begin
set @lnvc_ErrorMessage = N'The value for parameter @inc_DataPurity is not supported.' + char(13) + char(10) + ' '
set @li_Rc = -1
end
if @lnc_NoIndex not in ('Y','N') and @li_Rc = 0
begin
set @lnvc_ErrorMessage = N'The value for parameter @inc_NoIndex is not supported.' + char(13) + char(10) + ' '
set @li_Rc = -1
end
if @lnc_ExtendedLogicalChecks not in ('Y','N') and @li_Rc = 0
begin
set @lnvc_ErrorMessage = N'The value for parameter @inc_ExtendedLogicalChecks is not supported.' + char(13) + char(10) + ' '
set @li_Rc = -1
end
if @lnc_TabLock not in ('Y','N') and @li_Rc = 0
begin
set @lnvc_ErrorMessage = N'The value for parameter @inc_TabLock is not supported.' + char(13) + char(10) + ' '
set @li_Rc = -1
end
if @lnc_ExtendedLogicalChecks = 'Y' and @lnc_PhysicalOnly = 'Y' and @li_Rc = 0
begin
set @lnvc_ErrorMessage = N'Extended Logical Checks and Physical Only cannot be used together.' + char(13) + char(10) + ' '
set @li_Rc = -1
end
if @lnc_DataPurity = 'Y' and @lnc_PhysicalOnly = 'Y' and @li_Rc = 0
begin
set @lnvc_ErrorMessage = N'Physical Only and Data Purity cannot be used together.' + char(13) + char(10) + ' '
set @li_Rc = -1
end
if @li_Rc != 0
goto errlog
/*==============================================================================
Code Section
==============================================================================*/
select @lnvc_SQL = N'select d.name, d.compatibility_level, d.create_date, d.user_access_desc, d.state_desc,
d.page_verify_option_desc, cast(d.is_in_standby as tinyint), cast(d.is_read_only as tinyint),
cast(databasepropertyex(d.name, ''IsAutoShrink'') as tinyint),
cast(databasepropertyex(d.name, ''IsAutoClose'') as tinyint),
0
from master.sys.databases d
where d.name = ' + case when isnull(@lnvc_Catalogue, '') = '' then ' d.name'
else '''' + @lnvc_Catalogue + ''''
end + '
and d.name != ''tempdb'''
insert into @ltbl_Catalogue (Catalogue, CompatibilityLevel, CreateDate, UserAccess,
StateDescription, PageVerifyOption, IsReadOnly, IsInStandBy,
IsAutoShrink, IsAutoClose, Flag)
exec sp_executesql @lnvc_SQL
while (select top 1 1
from @ltbl_Catalogue c
where c.Flag = 0) = 1
begin
select top 1 @lnvc_Catalogue = c.Catalogue
,@li_Compatibility = c.CompatibilityLevel
,@ldt_CreateDate = c.CreateDate
,@lnvc_UserAccess = c.UserAccess
,@lnvc_StateDescription = c.StateDescription
,@lnvc_PageVerifyOption = c.PageVerifyOption
,@lti_IsReadOnly = c.IsReadOnly
,@lti_IsInStandBy = c.IsInStandBy
from @ltbl_Catalogue c
where c.Flag = 0
select top 1 @lnvc_Catalogue
,@li_Compatibility
,@ldt_CreateDate
,@lnvc_UserAccess
,@lnvc_StateDescription
,@lnvc_PageVerifyOption
,@lti_IsReadOnly
,@lti_IsInStandBy
if @lnvc_StateDescription = 'ONLINE' and @lnvc_UserAccess != 'SINGLE_USER'
begin
-- Build Execution String
  select @lnvc_SQL = N'dbcc checkdb (' + quotename(@lnvc_Catalogue) + case when @lnc_NoIndex = 'Y' then ', noindex'
                                                                          else ''
                                                                     end + ') with tableresults, no_infomsgs, all_errormsgs '
+ case when @lnc_PhysicalOnly = 'Y' then ', physical_only'
else ''
end
+ case when @lnc_DataPurity = 'Y' then ', data_purity'
else ''
end
-- Option not supported with Compatibility < 100 (SQL Server 2005 and below)
+ case when @lnc_ExtendedLogicalChecks = 'Y' then
       case when @li_Compatibility >= 100 then ', extended_logical_checks'
else ''
end
else ''
end
+ case when @lnc_TabLock = 'Y' then ', tablock'
else ''
end
-- Prepare Processing Environment
truncate table #ltbl_Output
set @ldt_ExecutionFinish = null
set @ldt_ExecutionStart = null
-- Capture Start Time.
set @ldt_ExecutionStart = getdate()
-- Execute the Command.
insert into #ltbl_Output(Error, Level, State, MessageText, RepairLevel, Status,
DBID, ObjectID, IndexID, PartitionID, AllocunitID,
[File], Page, Slot, RefFile, RefPage, RefSlot, Allocation)
exec sp_executesql @lnvc_SQL
-- Capture Completion Time.
set @ldt_ExecutionFinish = getdate()
-- Add Header Record to Confirm Execution.
insert into SysAdm.ConsistencyCheck.ExecutionLog(Instance, [Version], Edition, Catalogue, PhysicalOnly,
NoIndex, ExtendedLogicalChecks, DataPurity, [TabLock],
Command, ExecutionStart, ExecutionFinish,
CreatedOn, CreatedBy)
values(@lnvc_Instance, @lnvc_Version, @lnvc_Edition, @lnvc_Catalogue, @lnc_PhysicalOnly,
@lnc_NoIndex, @lnc_ExtendedLogicalChecks, @lnc_DataPurity, @lnc_TabLock,
@lnvc_SQL, @ldt_ExecutionStart, @ldt_ExecutionFinish,
@ldt_ExecutedOn, @lnvc_ExecutedBy)
-- Capture Header Record ID
  select @li_ConsistencyCheckID = scope_identity()
-- Were there errors?
if (select top 1 1
from #ltbl_Output t) = 1
begin
select @li_ConsistencyCheckID, t.Error, t.Level, t.State, t.MessageText, t.RepairLevel,
t.Status, t.ObjectID, t.IndexID, t.PartitionID, t.AllocunitID, t.[File], t.Page,
t.Slot, t.RefFile, t.RefPage, t.RefSlot, t.Allocation, @ldt_ExecutedOn, @lnvc_ExecutedBy
from #ltbl_Output t
-- Log Failure Entries
insert into SysAdm.ConsistencyCheck.ErrorLog (ExecutionLogID, Error, Severity, [State]
,ErrorMessage, RepairLevel, [Status], ObjectID
,IndexID, PartitionID, AllocationUnitID, FileID
,Page, Slot, RefFileID, RefPage, RefSlot, Allocation
,CreatedOn, CreatedBy)
select @li_ConsistencyCheckID, t.Error, t.Level, t.State, t.MessageText, t.RepairLevel,
t.Status, t.ObjectID, t.IndexID, t.PartitionID, t.AllocunitID, t.[File], t.Page,
t.Slot, t.RefFile, t.RefPage, t.RefSlot, t.Allocation, @ldt_ExecutedOn, @lnvc_ExecutedBy
from #ltbl_Output t
-- Configure Alert Mail Subject Line
set @lnvc_Subject = 'Consistency Check failure for Database ' + quotename(@lnvc_Catalogue) + ' on Instance ' + quotename(@lnvc_Instance) + ' !'
set @lnvc_ErrorMessage = 'To view more details, logon to the Instance and execute the query: ' +
+ char(13) + char(10) + char(13) + char(10) +
'select * from sysadm.consistencycheck.ErrorLog where ExecutionLogID = ' + cast(@li_ConsistencyCheckID as varchar) + ''
+ char(13) + char(10) + char(13) + char(10) +
'Consistency Check Output: ' +
+ char(13) + char(10)
select @lnvc_SQL = N'set nocount on;select ltrim(rtrim(ErrorMessage)) as Message
from SysAdm.ConsistencyCheck.ErrorLog r
where r.ExecutionLogID = ''' + cast(@li_ConsistencyCheckID as nvarchar(128)) + ''''
exec msdb.dbo.sp_send_dbmail @profile_name = 'SQL Mailbox',
@recipients = @lnvc_Recipients,
@subject = @lnvc_Subject,
@body = @lnvc_ErrorMessage,
@query = @lnvc_SQL,
@execute_query_database = @lnvc_Catalogue,
@query_result_header = 1,
@query_result_width = 32767,
@query_no_truncate = 1,
@body_format = 'TEXT',
@importance = 'High'
end
end
else
begin
  -- If we get here, the database was not processed because its status was something other than ONLINE, it was in SINGLE_USER mode, or
  -- it had a compatibility level < 100
set @lnvc_Subject = 'Unable to perform consistency checks for Database ' + quotename(@lnvc_Catalogue) + ' on ' + quotename(@lnvc_Instance) + '!'
set @lnvc_ErrorMessage = 'One of the following conditions was not met. ' + CHAR(13) + CHAR(10) + CHAR(13) + CHAR(10) +
       '* Database must be ONLINE. Its current state is: ' + @lnvc_StateDescription + CHAR(13) + CHAR(10) +
       '* Database access level cannot be set to SINGLE_USER. Its current access level is: ' + @lnvc_UserAccess
exec msdb.dbo.sp_send_dbmail @profile_name = 'SQL Mailbox',
@recipients = @lnvc_Recipients,
@subject = @lnvc_Subject,
@body = @lnvc_ErrorMessage,
@body_format = 'TEXT',
@importance = 'High'
end
update t
set Flag = 1
from @ltbl_Catalogue t
where t.Catalogue = @lnvc_Catalogue
end
/*==============================================================================
Cleanup Temp Tables
==============================================================================*/
cleanup:
if object_id('tempdb..#ltbl_Output') is not null
drop table #ltbl_Output
/*==============================================================================
Exit Procedure
==============================================================================*/
quit:
return @li_Rc
/*==============================================================================
Error Processing
==============================================================================*/
errlog:
-- Raise Error, and write to Application Event Log
raiserror (@lnvc_ErrorMessage, 16, 1) with log, nowait
-- Send Email Notification of Error
 exec msdb.dbo.sp_send_dbmail @profile_name = 'SQL Mailbox',
@recipients = @lnvc_Recipients,
@subject = @lnvc_Subject,
@body = @lnvc_ErrorMessage,
@importance = 'High'
goto cleanup
go
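For reference, here is roughly what @lnvc_SQL ends up containing for each database once the case expressions are applied ([AdventureWorks] is an illustrative name; whitespace tidied):

```sql
-- Weekday run (@inc_PhysicalOnly = 'Y'):
dbcc checkdb ([AdventureWorks]) with tableresults, no_infomsgs, all_errormsgs, physical_only

-- Sunday run (@inc_DataPurity = 'Y'):
dbcc checkdb ([AdventureWorks]) with tableresults, no_infomsgs, all_errormsgs, data_purity
```

TABLERESULTS is what lets the procedure capture the output into #ltbl_Output via INSERT ... EXEC.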
John M. Couch
Similar Messages
-
What's the difference between executing a package from SQL and Visual Studio?
Hi,
We have a package that currently fails to run when deployed to SQL Server. It has been tried from a schedule and executed manually; both fail.
I have tried from Visual Studio running on various machines (Windows 8, Server 2012) and in all cases it runs perfectly.
I am trying to understand the differences between deploying to SQL Server and running from VS so maybe I can figure out why this is happening.
I have the following errors when I run from SQL.
DTS_E_PROCESSINPUTFAILED - all errors like this point to the 'Sort' tasks in the script
dts_e_processinputerror not enough storage is available
I have tested in four environments and all fail from SQL but not from VS!
Last night I tried from my laptop and executed the package from SQL Server - it didn't fail but was still running in the morning, so I terminated it. Note this takes around 20 minutes running from VS! Why would it be so quick from VS but fail or take so long on SQL Server?
The test running at the moment is on a server with dynamic memory currently at 14 GB. I decreased SQL Server's RAM to 4 GB and it hasn't failed yet but has been running for two hours. Before changing this, the package failed after a short time.
I thought it may have something to do with running from a virtual machine, but that doesn't explain why it couldn't run locally on my laptop.
All ideas welcome :) Many thanks,
Davina
I will try to address the issues one by one:
The error doesn't seem to be related to the SSISDB configuration mode. It may be because of:
Change in package definition (please confirm that the package you are running on your laptop is the same as the one in msdb - reload your SQL package from msdb in BIDS/SSDT and recreate your source and destination components)
As your error message shows, "not enough memory available" (maybe because of the multicast) - usually you can get past this error by restarting SQL Server [not an optimal solution, but if it works it shows that your package fills the memory - keep an eye on Task Manager]
Make sure the statistics on your tables are updated [run EXEC sp_updatestats]
Make sure your indexes are not fragmented [rebuild indexes]
If you are dealing with many rows - maybe 40,000 * 12 (because of the multicast) = 480,000 rows - then try to split the package.
Check that your Excel file format/column data types are correct.
Check that the user running the job has the required permissions on the folder/file.
Understand that Sort is a blocking transformation and it requires all your data in memory before it sorts. So, if there is a large number of rows, your entire memory allocation can be consumed.
Difference between Visual Studio & BIDS/SSDT:
Nothing much, other than BIDS running 32-bit while jobs run 64-bit (their run-time environments are configurable, though).
There shouldn't be any performance difference unless your package is on a network location, because transferring data over the network may slow the package. If the package runs on SQL Server it uses the SQL buffer pool to fetch data via SQL statements, and then SSIS works within its own memory allotment.
Hope this will help you to understand and fix the issue.
Glad to help! Please remember to accept the answer if you found it helpful. It will be useful for future readers having same issue. -
I am running a SQL Agent job that executes an SSIS process from SQL Server 1. The SSIS process executes its SQL/tables/stored procedures against another server, SQL Server 2.
I get an error after adding data flow tasks with TransactionOption Supported inside a sequence with TransactionOption Required. The error: "The SSIS Runtime has failed to enlist the OLE DB connection in a distributed transaction with error 0x8004D024 'The transaction
manager has disabled its support for remote/network transactions'"
Prior to adding this sequence everything was working from SQL Agent, and there were other sequences with OLE DB destinations.
Everything works when running the package from within SSIS.
I see this article on similar issue,
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/0bfa2569-8849-4884-8f68-8edf98a9b4fe/problem-to-execute-a-package-and-tasks-with-a-certain-transactionoption-property-i-need-help?forum=sqlintegrationservices
“I had similar issue and solved by setting the following on both the machines. Allow Remote Clients, Allow Remote Administration,
Allow Inbound Clients, Allow Outbound Clients, and TIP are enabled in Component Services/My Computer/Properties/MSDTC/Security Configuration.”
I don’t want to remove transaction required for the “Remove Duplicates from Staging” sequence.
Anyone seen this?
Greg Hanson
DTC was running on the remote computer. The problem was that it was no longer accepting transactions from remote servers. This was in SSIS, so I had to set "Transaction Supported" for all data flow transactions.
Greg Hanson -
Hey, guys!
Tried to find an answer, but nothing works for me.
So, on one of our servers, when I try to run DBCC CHECKDB it throws two errors:
Msg 8921, Level 16, State 1, Line 1
Check terminated. A failure was detected while collecting facts. Possibly tempdb out of space or a system table is inconsistent. Check previous errors.
Msg 701, Level 17, State 123, Line 1
There is insufficient system memory in resource pool 'internal' to run this query.
This is a VM hosted on a Hyper-V Server 2012 R2 host. The VM has Windows Server 2012 R2 and SQL Server 2012 Standard. The VM had 8 GB of RAM; I increased it to 12 GB (static, not dynamic). I also increased the paging file size in Windows and the size of tempdb, and recreated tempdb.
I also tried restoring the database that throws the error on another server. On that server DBCC CHECKDB works fine, but that didn't help - I still receive the same error. Can you suggest anything, please?
Hi,
I agree with you. It is probably a memory issue. First, we need to verify whether it is an OS memory issue or caused by SQL Server itself.
Need to use Performance Monitor:
SQLServer:Memory
Memory
Dynamic Management Views:
sys.dm_os_sys_info
sys.dm_exec_query_memory_grants
1. Use Performance Monitor to check OS memory: Available Memory (MB). Monitor the OS memory status before the query and while running the query. If it does not change, we can exclude the OS memory factor and conclude
that the memory issue is internal to SQL Server. Also, check whether there is a memory leak on your system.
2. Use the below script in SQL Server Management Studio and Result to Text.
while(1=1)
begin
print getdate()
print '*****sys.dm_exec_query_memory_grants******'
select * from sys.dm_exec_query_memory_grants
print 'DBCC memorystatus'
dbcc memorystatus
waitfor delay '00:00:01'
end
Then, check SQLServer:Memory-Granted Workspace Memory (KB) when the issue occurs which specifies the total amount of memory currently granted to executing processes, such as hash, sort, bulk copy, and index creation operations.
And compared with the information got in
sys.dm_exec_query_memory_grants.
3. In addition, use sys.dm_os_sys_info to identify bpool_commit_target and bpool_committed.
In SQL Server 2012, these columns have been renamed committed_target_kb and committed_kb.
committed_kb represents the committed memory in kilobytes (KB) in the memory manager. Does not include reserved memory in the memory manager.
committed_target_kb represents the amount of memory, in kilobytes (KB), that can be consumed by SQL Server memory manager. The target amount is calculated using a variety of
inputs like:
the current state of the system including its load
the memory requested by current processes
the amount of memory installed on the computer
configuration parameters
If committed_target_kb is larger than committed_kb, the memory manager will try to obtain additional memory. If committed_target_kb is smaller than committed_kb, the memory manager will try to shrink the amount of memory committed. The committed_target_kb value always includes stolen and reserved memory.
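A quick way to compare the two values side by side (column names as on SQL Server 2012 and later; on 2008 substitute the bpool_* names):

```sql
select committed_kb,
       committed_target_kb,
       committed_target_kb - committed_kb as headroom_kb
from sys.dm_os_sys_info;
```

A persistently negative headroom_kb means the memory manager is under pressure and trying to shrink its commitment, which is consistent with error 701.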
MSSQLSERVER_701
http://msdn.microsoft.com/en-us/library/aa337311.aspx
An in-depth look at SQL Server Memory–Part 3
http://blogs.msdn.com/b/sqljourney/archive/2013/11/02/10402729.aspx
INF: Using DBCC MEMORYSTATUS to Monitor SQL Server Memory Usage
http://support.microsoft.com/kb/271624/en-us
Hope it helps.
Tracy Cai
TechNet Community Support -
Error when execute a package from SQL Server Agent
We have the next problem:
When we execute a package from a SQL Server Agent job, it shows the success message, but reviewing the results, the package did not do all of the tasks.
When we run the package manually from SSIS, it shows the success message and works fine.
The workflow of the package is:
1) Shrink the databases (executing a SQL file)
2) Back up the databases (Back Up Database task in SSIS)
3) Rename the files to the .BAK extension (via a Foreach Loop container and File System task)
4) Execute a command to compress them (via a .bat)
5) Move the compressed file to another location (via another Foreach Loop)
Run manually it works correctly, but when a SQL Agent job executes the package it does only the first 2 steps.
We are using Microsoft SQL Server 2008 R2 (SP1) - 10.50.2500.0 (X64) Jun 17 2011 00:54:03 Copyright (c) Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1)
We are using a user with administrator privileges.
Cheers
Can you check whether the account running the package has proper access? You may need to define a proxy account for that.
See
http://www.databasejournal.com/features/mssql/article.php/3789881/Proxy-Accounts-in-SQL-Server.htm
http://gqbi.wordpress.com/2014/01/30/setting-up-a-proxy-account-to-run-sql-server-integration-services-ssis-2012-packages/
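The gist of those articles, sketched in T-SQL (the credential, Windows account, and proxy names below are placeholders, not anything from the original thread):

```sql
use msdb;
go
-- A credential wrapping the Windows account the package should run as
create credential SSISRunAsCredential
    with identity = N'DOMAIN\ssis_runner', secret = N'<password>';
go
-- A SQL Agent proxy tied to that credential, granted to the SSIS subsystem
exec dbo.sp_add_proxy
    @proxy_name = N'SSISProxy',
    @credential_name = N'SSISRunAsCredential',
    @enabled = 1;
exec dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'SSISProxy',
    @subsystem_name = N'SSIS';
```

Once the proxy exists, the job step's "Run as" dropdown can be switched from the SQL Agent service account to the proxy.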
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs -
Hello,
The use case is each row logs a cumulative data point, like an odometer, and I need to be able to subtract a previous row from a following row in order to see the change between two rows.
I can do this if I create a Power Query "From Table," but if I do the same thing when the data source is SQL, I get an error message "invalid attempt to call Read when reader is closed".
Given a trivial data table, this works:
let
Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
#"Added Index" = Table.AddIndexColumn(Source, "Index", 0, 1),
#"Added Index1" = Table.AddIndexColumn(#"Added Index", "Index.1", 1, 1),
Merge = Table.NestedJoin(#"Added Index1",{"Index.1"},#"Added Index1",{"Index"},"NewColumn"),
#"Expand NewColumn" = Table.ExpandTableColumn(Merge, "NewColumn", {"Odometer"}, {"NewColumn.Odometer"})
in
#"Expand NewColumn"
But attempting the same technique against data from SQL, I get the above error "invalid attempt to call Read when reader is closed".
Any suggestions? If this is a feature (or bug) that can't be overcome, is there another way to compare values between two rows?
Thanks,
Ed
Please use "send a frown" to report the bug so we can fix it. Is the data very big? The simplest workaround is likely to use Table.Buffer to buffer the table locally before doing this work; that will cause us to stop any attempt to do the processing on the server. If the table is very big, though, this isn't an attractive approach.
Using PowerShell to delete all users from the Portal
Summary
This script will delete all users from the Portal except for Administrator and the Built-In Sync account.
Based on Markus's "Delete a User" script.
Useful when developing your system if you want to quickly clear out the data and start again.
Set-Variable -Name URI -Value "http://localhost:5725/resourcemanagementservice" -Option Constant
function DeleteObject
{
    PARAM($objectType, $objectId)
    END
    {
        $importObject = New-Object Microsoft.ResourceManagement.Automation.ObjectModel.ImportObject
        $importObject.ObjectType = $objectType
        $importObject.TargetObjectIdentifier = $objectId
        $importObject.SourceObjectIdentifier = $objectId
        $importObject.State = 2
        $importObject | Import-FIMConfig -Uri $URI
    }
}
if(@(Get-PSSnapin | Where-Object {$_.Name -eq "FIMAutomation"}).Count -eq 0) {Add-PSSnapin FIMAutomation}
$allobjects = Export-FIMConfig -Uri $URI `
    -OnlyBaseResources `
    -CustomConfig "/Person"
$allobjects | Foreach-Object {
    $displayName = $_.ResourceManagementObject.ResourceManagementAttributes | `
        Where-Object {$_.AttributeName -eq "DisplayName"}
    if([string]::Compare($displayName.Value, "Administrator", $True) -eq 0)
        {Write-Host "Administrator NOT deleted"}
    elseif([string]::Compare($displayName.Value, "Built-in Synchronization Account", $True) -eq 0)
        {Write-Host "Built-in Synchronization Account NOT deleted"}
    else {
        $objectId = (($_.ResourceManagementObject.ObjectIdentifier).split(":"))[2]
        DeleteObject -objectType "Person" `
            -objectId $objectId
        Write-Host "`nObject deleted`n" $displayName.Value
    }
}
Go to the FIM ScriptBox
http://www.wapshere.com/missmiis
The DeleteObject function opens and closes a connection for each object. This approach is faster:
http://social.technet.microsoft.com/wiki/contents/articles/23570.how-to-use-powershell-to-delete-fim-users-that-have-a-null-attribute-name.aspx
Mike Crowley | MVP
My Blog --
Planet Technologies -
Pass value from SQL agent job step
Hi
I have created an SSIS package and I am scheduling it using a SQL Agent job. Now I want to pass a value from the SQL Agent job, use that value in the SSIS package, and then run the package. Can someone point me to the solution?
Aniruddha http://aniruddhathengadi.blogspot.com/
I have created one parameter on the SSIS package named strValue and assigned it an empty value "". After that I created one SQL job and step where I set the value to "Nike" for the parameter under the Parameters tab.
Now I am expecting the Nike value that I set on the SQL job step to be reflected in the SSIS package when I run the job. Am I doing anything wrong?
Aniruddha http://aniruddhathengadi.blogspot.com/
Not sure what's going wrong, but you can have a quick look at the step-by-step tutorial below:
Parameterizing Connections and Values at Runtime Using SSIS Environment Variables (via SQL Agent)
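If the project is deployed to the SSIS catalog, the parameter can also be set directly with the catalog stored procedures instead of on the job step. A sketch, where the folder, project, and package names are placeholders for your own:

```sql
exec SSISDB.catalog.set_object_parameter_value
    @object_type     = 30,              -- 30 = package-level parameter
    @folder_name     = N'MyFolder',
    @project_name    = N'MyProject',
    @object_name     = N'MyPackage.dtsx',
    @parameter_name  = N'strValue',
    @parameter_value = N'Nike';
```

Values set this way become the server default for the parameter, which any Agent job running the package will then pick up unless the job step overrides it.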
Cheers,
Vaibhav Chaudhari -
Getting Error while trying to execute SSIS package through sql agent
Hi,
I have created a package in SSIS 2008.
I have created a SQL Agent job which runs perfectly on my PC.
I tried to create a new job on another PC which does not have SSIS.
When I tried to run the job,
I got the following error:
Executed as user: LT-MAGFIH$. tion. End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 8:51:31 PM Finished: 8:51:35 PM Elapsed: 4.024 seconds. The package execution failed. The step failed.
Please let me know how I can solve this.
Hi AjayChigurupati,
I would suggest you check you are install or use the dtexec utility correctly:
On a 64-bit computer, Integration Services installs a 64-bit version of the
dtexec utility (dtexec.exe). If you have to run certain packages in 32-bit mode, you will have to install the 32-bit version of the
dtexec utility. To install the 32-bit version of the
dtexec utility, you must select either Client Tools or Business Intelligence Development Studio during setup.
By default, a 64-bit computer that has both the 64-bit and 32-bit versions of an Integration Services command prompt utility installed will run the 32-bit version at the command prompt. The 32-bit version runs because the directory path for the 32-bit
version appears in the PATH environment variable before the directory path for the 64-bit version. (Typically, the 32-bit directory path is
<drive>:\Program Files(x86)\Microsoft SQL Server\100\DTS\Binn, while the 64-bit directory path is
<drive>:\Program Files\Microsoft SQL Server\100\DTS\Binn.)
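In practice, that means invoking the 32-bit executable by its full path when a package must run in 32-bit mode (the path shown is the SQL Server 2008 default install location; the package path is a placeholder):

```bat
"C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\DTExec.exe" /FILE "D:\Packages\MyPackage.dtsx"
```

In a SQL Agent job step of type "SQL Server Integration Services Package", the equivalent is ticking "Use 32 bit runtime" on the Execution options tab.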
For detail information, please see:
http://technet.microsoft.com/en-us/library/ms162810(v=sql.105).aspx
To using SQL Server Agent to Run a Package, please refer to the steps in th article below:
http://technet.microsoft.com/en-us/library/ms138023(v=sql.105).aspx
If you have any feedback on our support, please click
here.
Elvis Long
TechNet Community Support -
Suggestions for how to execute file maintenance in SQL Agent Job
I need to create a SQL Agent Job (it has to be a SQL agent job) to zip all files in a directory, delete all files in the directory but the zip file and then copy the zip file to another location across a UNC path. What are some suggestions for getting
this done?
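One way to sketch this as a PowerShell job step (assumes PowerShell 5.0+ for Compress-Archive; all paths are placeholders, not from the original question):

```powershell
# Hypothetical paths - adjust to your environment
$source  = 'D:\Exports'
$zipPath = Join-Path $source ("archive_{0:yyyyMMdd}.zip" -f (Get-Date))
$dest    = '\\FileServer\Backups'

# Zip everything in the directory (excluding any existing zips)
Get-ChildItem -Path $source -File | Where-Object { $_.Extension -ne '.zip' } |
    Compress-Archive -DestinationPath $zipPath

# Delete the originals, keeping only the zip
Get-ChildItem -Path $source -File | Where-Object { $_.Extension -ne '.zip' } |
    Remove-Item

# Copy the zip across the UNC path
Copy-Item -Path $zipPath -Destination $dest
```

Note the Agent service account (or a proxy) needs write permission on the UNC share for the final copy to succeed.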
Thanks.
This is about a previous topic ...
"1. Please close your previous thread by marking the answers that you got, and not your own summary. Actually, you can mark your summary as well if you feel that this closes the thread, but still, it is a summary of the answers that you got! The people
that helped you invested time to help you. Feel free to
start voting for helpful responses as well :-)"
How do you mark useful answers? I have not seen how to do that.
On the left of each response you have an icon (small image).
I have asked lots of questions on the TechNet and other forums. (One day I hope to even answer some.) That was the first time I had ever marked my own as the answer. Honestly, I did not use any of those well-meaning replies and did not feel that any of them answered my question. When researching a topic, I, like most people, look at the marked answer first and assume it is the answer. There is a great push on this forum to mark the answer, and there are droves of people clamoring for points (BTW, you are not one of them). I review each of my questions, always follow up on any replies, post my solution if appropriate, and always try to close the topic. But this one was not answered.
OK, that is logical, if you don't think that you used the other responses as part of the answer. In my opinion (and you don't have to accept it), if an answer is built from 3 responses together, then we need to mark all three (and mention in the summary what we have done). If you find that those responses can't be called "answers" since they do not cover enough, then maybe your call was correct.
Please close the thread in the way that you feel is OK :-)
(You don't have to accept my opinion, since you have your explanation, but I do think that some of the responses there are answers.)
Using PowerShell to import CSV data from Vendor database to manipulate Active Directory Users
Hello,
I have a big project I am trying to automate. I am working in a K-12 public education IT Dept. and have been tasked with importing data that has been exported from a vendor database via .csv file into Active Directory to manage student accounts.
My client wants to use this data to make bulk changes to student user accounts in AD such as moving accounts from one OU to another, modifying account attributes based on State ID, lunchroom ID, School, Grade, etc. and adding new accounts / disabling
accounts for students no longer enrolled.
The .csv that is exported doesn't have headers that match what is needed for importing into AD, so those have to be modified in this process, or set as variables to get the correct info into the correct attributes in AD, or else this whole project is a bust. He is tired of manually manipulating the .csv data and trying to get it into AD with few or no errors, hence the reason it has been passed off to me.
Since this information changes practically daily, I need a way to automate user management by accomplishing the following on a scheduled basis.
Process must:
  Check to see if Student Number already exists
    If yes, then modify the account:
      Update {School Name}, {Site Code}, {School Number}, {Grade Level} (variables)
      Add correct group memberships (School / Grade specific)
      Move account to correct OU (OU={Grade},OU=Students,OU=Users,OU={SiteCode},DC=Domain,DC=net)
      Remove incorrect group memberships (School / Grade specific)
      Set account status (enabled / disabled)
    If no, create the account:
      Import Student #
      Import CNP #
      Import Student name
        Extract first and middle initial
        If a duplicate name exists, create a log entry for review
      Import School, School Number, Grade Level
      Add correct group memberships (School / Grade specific)
      Set correct OU (OU={Grade},OU=Students,OU=Users,OU={SiteCode},DC=Domain,DC=net)
      Set account status
I am not familiar with Powershell, but have researched enough to know that it will be the best option for this project. I have seen some partial solutions in VB, but I am more of an infrastructure person instead of scripting / software development.
I have just started creating a script and already have hit a snag. Maybe one of you could help.
#Connect to Active Directory
Import-Module ActiveDirectory
# Import iNOW user information
$Users = import-csv C:\ADUpdate\INOW_export.csv
#Check to see if the account already exists in AD
ForEach ( $user in $users )
{
    #Assign the content to variables
    $Attr_employeeID = $users."Student Number"
    $Attr_givenName = $users."First Name"
    $Attr_middleName = $users."Middle Name"
    $Attr_sn = $users."Last Name"
    $Attr_postaldeliveryOfficeName = $users.School
    $Attr_company = $users."School Number"
    $Attr_department = $users."Grade Level"
    $Attr_cn = $Attr_givenName.Substring(0,1) + $Attr_middleName.Substring(0,1) + $Attr_sn
    IF (Get-ADUser $Attr_cn)
    {Write-Host "$Attr_cn already exists in Active Directory"}
}

Thank you for helping me with that before it became an issue later on; however, even when modified to use $Attr_sAMAccountName I still get errors.
#Connect to Active Directory
Import-Module ActiveDirectory
# Import iNOW user information
$Users = import-csv D:\ADUpdate\Data\INOW_export.csv
#Check to see if the account already exists in AD
ForEach ( $user in $users )
{
    #Assign the content to variables
    $Attr_employeeID = $users."Student Number"
    $Attr_givenName = $users."First Name"
    $Attr_middleName = $users."Middle Name"
    $Attr_sn = $users."Last Name"
    $Attr_postaldeliveryOfficeName = $users.School
    $Attr_company = $users."School Number"
    $Attr_department = $users."Grade Level"
    $Attr_sAMAccountName = $Attr_givenName.Substring(0,1) + $Attr_middleName.Substring(0,1) + $Attr_sn
    IF (Get-ADUser $Attr_sAMAccountName)
    {Write-Host "$Attr_sAMAccountName already exists in Active Directory"}
}
PS C:\Windows\system32> D:\ADUpdate\Scripts\INOW-AD.ps1
Get-ADUser : Cannot convert 'System.Object[]' to the type 'Microsoft.ActiveDirectory.Management.ADUser'
required by parameter 'Identity'. Specified method is not supported.
At D:\ADUpdate\Scripts\INOW-AD.ps1:28 char:28
+ IF (Get-ADUser $Attr_sAMAccountName)
+ ~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [Get-ADUser], ParameterBindingException
+ FullyQualifiedErrorId : CannotConvertArgument,Microsoft.ActiveDirectory.Management.Commands.GetADUser
-
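The error above is consistent with $Attr_sAMAccountName being an array rather than a single string: inside the loop the assignments read from the whole $users collection instead of the current $user row. A minimal corrected sketch (untested here; it assumes the same CSV headers, and uses -Filter so that a non-existent account returns $null instead of throwing):

```powershell
Import-Module ActiveDirectory

$Users = Import-Csv D:\ADUpdate\Data\INOW_export.csv
ForEach ($user in $Users)
{
    # Read from the single $user row, not the whole $Users collection
    $Attr_givenName  = $user."First Name"
    $Attr_middleName = $user."Middle Name"
    $Attr_sn         = $user."Last Name"
    $Attr_sAMAccountName = $Attr_givenName.Substring(0,1) +
                           $Attr_middleName.Substring(0,1) + $Attr_sn

    # -Filter avoids the ParameterBindingException and the "not found" error
    IF (Get-ADUser -Filter "sAMAccountName -eq '$Attr_sAMAccountName'")
    { Write-Host "$Attr_sAMAccountName already exists in Active Directory" }
}
```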
Executing OS command from sql procedure
I want to execute an OS command from a procedure. Can I do it this way? It looks like it is not executing the move command....
declare
begin
dbms_output.put_line(' moving...');
host;
move c:\file1\test.txt C:\moved
exit;
/

True. But that job is going to run in a separate session, some time after the current transaction commits (depending on the number of jobs queued up to run). So if you need to pass information back from the job, or you want the procedure to handle exceptions thrown by the job, or you want to tie job-related failures back to a particular application-level event, or you want the rest of your procedure to wait for the operating system command to finish, using DBMS_SCHEDULER for this sort of thing is going to require a fair amount of additional coordination/monitoring/infrastructure code. It's certainly not a Herculean task to write that additional code, and it's reasonable in some situations, but the Java stored procedure approach strikes me as substantially easier to deal with in most cases.
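For reference, a rough sketch of the DBMS_SCHEDULER route discussed here (untested; the job name and the way the command line is split into arguments are illustrative, and a Windows host is assumed):

```sql
BEGIN
  -- One-off job that shells out to cmd.exe to move the file
  DBMS_SCHEDULER.CREATE_JOB(
    job_name            => 'MOVE_FILE_JOB',        -- illustrative name
    job_type            => 'EXECUTABLE',
    job_action          => 'C:\Windows\System32\cmd.exe',
    number_of_arguments => 2,
    enabled             => FALSE);
  DBMS_SCHEDULER.SET_JOB_ARGUMENT_VALUE('MOVE_FILE_JOB', 1, '/c');
  DBMS_SCHEDULER.SET_JOB_ARGUMENT_VALUE('MOVE_FILE_JOB', 2,
      'move c:\file1\test.txt C:\moved');
  DBMS_SCHEDULER.RUN_JOB('MOVE_FILE_JOB');
  DBMS_SCHEDULER.DROP_JOB('MOVE_FILE_JOB');
END;
/
```

As noted above, this runs outside the calling session, so error handling and waiting for completion need extra plumbing.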
Justin -
Can I use attunity CDC to propagate changes from sql server to a sybase database
Can I use Attunity CDC components in SQL Server to propagate changes to a Sybase destination? I did some experimenting with CDC in SSIS 2012 and got it working to send changes to Sybase, but I need to do this for approx. 5000 tables, so I am looking for a product that can do this for me.
Any help welcome
thanks
Geert Vanhove DCOD ------ http://geertvanhove.wordpress.com/ ----------- Please click the Mark as Answer or Vote As Helpful button if a post solves your problem or is helpful!

Hello,
It seems that you are trying to capture SQL Server data changes and move them to a Sybase database with Attunity CDC. Since it is a third-party tool, I suggest that you post the question on the
Attunity site for support.
Reference: Attunity CDC
Regards,
Fanny Liu
TechNet Community Support -
What is the best way to handle executing multiple packages from the Agent?
I have several packages that have to be executed in sequence. I thought the best way to do that was by creating a job for each package then have a master job that executes the other packages. In my master job, I'm using sp_start_job to call the other
jobs. The problem is, the master job moves from step to step without waiting for the child jobs to finish; basically they all execute together.
That is the way I've seen it done in other places so I feel like I'm doing something wrong. In the alternative, I know it's possible to set the individual steps up so they execute the packages directly without calling an external job. I prefer the first
way though.
Which way should I jump on this?

So basically what I'm hearing is to just call the packages in a multi-step job. Creating a master package and calling child packages sounds a little crazy and unscalable, especially considering that the packages have master-child relationships within themselves. It's SSIS Package Inception.
Sorry, what's the issue with that?
Provided you're setting the package sequence correctly based on your dependencies, it will work fine, as the loop iterates based on how you set the package list.
What we have is an audit and control mechanism which even has details on the dependencies, so based on the dependencies set, the packages get ordered and listed for the loop to iterate through and execute. Also, if tomorrow a new package comes into being, all it takes for us is to tweak the audit table to add a new entry for the package and set its dependency, and it will continue to work fine, including the new package, without touching the existing job for any modification whatsoever.
Another advantage of this table is that we also capture audit details in it, like the date when a package was last executed, its execution status, rows processed (inserted, modified, deleted), etc., which can be used for easy monitoring of the data processing tasks as well.
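On the original master-job idea: sp_start_job returns as soon as the child job starts, which is why the steps all run together. If you keep that approach, each master step has to poll for the child job to finish. A rough T-SQL sketch (untested; the job name is illustrative):

```sql
DECLARE @job sysname = N'ChildPackageJob';  -- illustrative child job name
EXEC msdb.dbo.sp_start_job @job_name = @job;

-- Poll until the running activity row for this job reports a stop time
WHILE EXISTS (
    SELECT 1
    FROM msdb.dbo.sysjobactivity a
    JOIN msdb.dbo.sysjobs j ON j.job_id = a.job_id
    WHERE j.name = @job
      AND a.start_execution_date IS NOT NULL
      AND a.stop_execution_date IS NULL)
BEGIN
    WAITFOR DELAY '00:00:10';
END
```

This is more moving parts than simply putting the packages in sequence as steps of one job, which is why the multi-step job is usually preferred.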
Please Mark This As Answer if it solved your issue
Please Vote This As Helpful if it helps to solve your issue
Visakh
My Wiki User Page
My MSDN Page
My Personal Blog
My Facebook Page -
Error Using New-WebServiceProxy cmdlet with SQL Agent Powershell Subsystem
I created a powershell script to connect to SSRS via web service so I can cache reports in the database and expose them directly through our application. The script works fine when I run it through Powershell directly, but when I try running it through the
SQL Agent Powershell subsystem I am getting a strange error:
"A job step received an error at line 61 in a PowerShell script. The corresponding line is '$RS = New-WebServiceProxy -Class 'RS' -Namespace 'RS' -Uri $reportServerURI -UseDefaultCredential '. Correct the script and reschedule the job. The error
information returned by PowerShell is: 'Could not load file or assembly 'file:///C:\WINDOWS\TEMP\yfqiivtg.dll' or one of its dependencies. The system cannot find the file specified. '. Process Exit Code -1. The step failed."
I am using SQL Server 2014, SSRS 2014, Windows 8.1. The only difference I can think of is that when I run Powershell from the OS, I am using v 4.0 whereas when I run it from SQL Agent it loads v 2.0. My understanding is that v 2.0 supports the New-WebServiceProxy
cmdlet, so I'm not convinced the version discrepancy is the culprit. Any ideas what might be causing this?
On a side note, is there a way to have SQL Agent use PowerShell 4.0 for the subsystem? v 2.0 feels a little dated for SQL Server 2014.

Hi WilliamW,
When creating a PowerShell job step, there is only one security context available, which is the "SQL Server Agent Service Account." That means that if you intend to execute PowerShell scripts from SQL Agent Job steps, the SQL Server Agent
service account must have appropriate permissions.
According to your error message, I recommend to check if the SQL Server Agent service account has access to the folder where the scripts live, as well as the folder C:\WINDOWS\TEMP.
In addition, when we execute a PowerShell job step in SQL Server, the SQL Server Agent subsystem runs the sqlps utility, and the sqlps utility launches PowerShell 2.0 and imports the sqlps module. If you need to run a PowerShell v4.0 script from a SQL Server Agent job, you can create a proxy account to run the agent job which contains the PowerShell script. For more details, please review this similar blog:
Run a PowerShell v3 Script From a SQL Server Agent Job.
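Another common workaround (a sketch; the script path is illustrative) is to use an Operating system (CmdExec) job step that invokes powershell.exe directly, which picks up the OS-installed PowerShell version rather than the sqlps-hosted v2.0:

```
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\CacheReports.ps1"
```

The same permission caveats apply: whichever account runs the step still needs access to the script folder and to C:\WINDOWS\TEMP for New-WebServiceProxy's generated assembly.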
Thanks,
Lydia Zhang