Scheduling a PowerShell script as a batch job
I need to run a PowerShell script as a batch job to be executed every 5 minutes on a Windows Server 2012 R2 server.
I want to use a scheduled task.
I have read about different ways to solve this on the web, but I was unable to make any of them work.
What is the simplest way to run a PowerShell file as a scheduled job?
Regards
Mario
Hi Mario,
The quickest method is to write your script and save it on the server. Then create a new scheduled task like so:
Actions -
Start a program
Program/script : C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Add arguments : -File C:\path\to\script.ps1
That's all you need to do for simple scripts (make sure your execution policy is set right too).
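If you'd rather create the whole thing from the command line, here is a sketch using schtasks (the task name and script path are placeholders, not from your question):

```powershell
# Registers a task that runs the script every 5 minutes.
# /SC MINUTE /MO 5 means "every 5 minutes"; adjust the path to your script.
schtasks /Create /TN "RunMyScript" /SC MINUTE /MO 5 /TR "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\path\to\script.ps1"
```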
-
Similar Messages
-
Hi all,
I have a problem scheduling a PowerShell script using a batch file.
I've tried to do it with these two methods:
powershell -c ". 'C:\Program Files\Microsoft\Exchange Server\V14\bin\RemoteExchange.ps1'"
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe "'C:\Program Files\Microsoft\Exchange Server\V14\bin\RemoteExchange.ps1'"
The job works but inside the script there is the instruction:
$Path = Get-Location
This is because the script creates a Log folder in the same location as the PS1 file.
When the script is scheduled using either of the methods above, it creates the Log folder in:
C:\Windows\System32
How can I prevent this and make the script create the folder in the same location as the PS1 file?
I hope I have explained my problem clearly.
Thanks

If you're using Windows PowerShell 3.0, then you can take advantage of the automatic variable $PSScriptRoot to determine the location of the script. For instance, if I have a script that runs from C:\Scripts\myscript.ps1 and I want it to create a text file in the same location, I do not need to hard-code that path. I just use $PSScriptRoot, such as:
Set-Content -Path "$PSScriptRoot\logfile.txt" -Value $Value
This would create a text file called logfile.txt in C:\Scripts.
Since you're working with Exchange (and possibly Windows PowerShell 2.0) you may need to stick with the $MyInvocation automatic variable and the Split-Path cmdlet, such as:
Split-Path $MyInvocation.MyCommand.Path
Test these outside of your script so you can learn a little about them; just be sure you put them into their own saved .ps1 file. They will not return anything if you run them directly in the ISE or the console unless they are part of a .ps1 file.
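The two approaches can be combined into one version-tolerant pattern; a sketch (the log file name is illustrative, and it must run from a saved .ps1, not from pasted lines):

```powershell
# $PSScriptRoot is populated automatically inside scripts on PowerShell 3.0+.
# On 2.0 it is empty, so fall back to $MyInvocation.
if (-not $PSScriptRoot) {
    $PSScriptRoot = Split-Path $MyInvocation.MyCommand.Path
}
Set-Content -Path (Join-Path $PSScriptRoot 'logfile.txt') -Value 'log entry'
```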
Edit: Added script blocks for readability. -
Calling a PowerShell script from a batch file
Hello All,
I have a batch script that calls a PowerShell script. Before calling the script I set the execution policy to unrestricted, but when it gets to the line that calls the PowerShell script I still get this confirmation in the command window: "Do you want to perform this operation?" I then have to press Y for the PS script to run, and then my batch script finishes.
Does anyone know the setting I need to change in order to suppress the confirmation?
Note: this is Windows 8, see code below
set THIS_DIR=%~dp0
powershell Set-ExecutionPolicy unrestricted
powershell %THIS_DIR%MyScript.ps1 "param1"

I may sound like a jerk, but you really want to look at PowerShell.exe /?
PowerShell[.exe] [-PSConsoleFile <file> | -Version <version>]
[-NoLogo] [-NoExit] [-Sta] [-Mta] [-NoProfile] [-NonInteractive]
[-InputFormat {Text | XML}] [-OutputFormat {Text | XML}]
[-WindowStyle <style>] [-EncodedCommand <Base64EncodedCommand>]
[-File <filePath> <args>] [-ExecutionPolicy <ExecutionPolicy>]
[-Command { - | <script-block> [-args <arg-array>]
| <string> [<CommandParameters>] } ]
PowerShell[.exe] -Help | -? | /?
-PSConsoleFile
Loads the specified Windows PowerShell console file. To create a console
file, use Export-Console in Windows PowerShell.
-Version
Starts the specified version of Windows PowerShell.
Enter a version number with the parameter, such as "-version 2.0".
-NoLogo
Hides the copyright banner at startup.
-NoExit
Does not exit after running startup commands.
-Sta
Starts the shell using a single-threaded apartment.
Single-threaded apartment (STA) is the default.
-Mta
Starts the shell using a multithreaded apartment.
-NoProfile
Does not load the Windows PowerShell profile.
-NonInteractive
Does not present an interactive prompt to the user.
-InputFormat
Describes the format of data sent to Windows PowerShell. Valid values are
"Text" (text strings) or "XML" (serialized CLIXML format).
-OutputFormat
Determines how output from Windows PowerShell is formatted. Valid values
are "Text" (text strings) or "XML" (serialized CLIXML format).
-WindowStyle
Sets the window style to Normal, Minimized, Maximized or Hidden.
-EncodedCommand
Accepts a base-64-encoded string version of a command. Use this parameter
to submit commands to Windows PowerShell that require complex quotation
marks or curly braces.
-File
Runs the specified script in the local scope ("dot-sourced"), so that the
functions and variables that the script creates are available in the
current session. Enter the script file path and any parameters.
File must be the last parameter in the command, because all characters
typed after the File parameter name are interpreted
as the script file path followed by the script parameters.
-ExecutionPolicy
Sets the default execution policy for the current session and saves it
in the $env:PSExecutionPolicyPreference environment variable.
This parameter does not change the Windows PowerShell execution policy
that is set in the registry.
-Command
Executes the specified commands (and any parameters) as though they were
typed at the Windows PowerShell command prompt, and then exits, unless
NoExit is specified. The value of Command can be "-", a string, or a script block.
If the value of Command is "-", the command text is read from standard
input.
If the value of Command is a script block, the script block must be enclosed
in braces ({}). You can specify a script block only when running PowerShell.exe
in Windows PowerShell. The results of the script block are returned to the
parent shell as deserialized XML objects, not live objects.
If the value of Command is a string, Command must be the last parameter
in the command, because any characters typed after the command are
interpreted as the command arguments.
To write a string that runs a Windows PowerShell command, use the format:
"& {<command>}"
where the quotation marks indicate a string and the invoke operator (&)
causes the command to be executed.
-Help, -?, /?
Shows this message. If you are typing a PowerShell.exe command in Windows
PowerShell, prepend the command parameters with a hyphen (-), not a forward
slash (/). You can use either a hyphen or forward slash in Cmd.exe.
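Applied to the batch file in the question, the per-process execution policy avoids the Set-ExecutionPolicy confirmation entirely; a sketch:

```powershell
# Replaces both lines of the batch file: -ExecutionPolicy Bypass applies only
# to this process (no machine-wide change, so no "Do you want to perform this
# operation?" prompt), and -NonInteractive stops PowerShell waiting on input.
powershell.exe -NoProfile -NonInteractive -ExecutionPolicy Bypass -File "%THIS_DIR%MyScript.ps1" "param1"
```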
Hope that helps! Jason -
Scheduled task PowerShell script can't write file
Hello experts
I have a scheduled task that runs a PowerShell script, but it can't write the CSV file. Below is my PS script:
$ExemptGroup = Get-ADGroup app_users
Get-ADUser -Filter { -not (memberOf -RecursiveMatch $ExemptGroup.DistinguishedName) } -Properties * |
Select-Object -Property DisplayName,SamAccountName,WhenCreated,@{Name='Last Logon';Expression={[System.DateTime]::FromFileTime($_.LastLogon).ToString('g')}},LogonCount,@{N='Status';E={
If ( $_.useraccountControl -match '^(?:514|546|66050|66082)$' ) { 'Disabled' } Else { 'Enabled' } }} |
Sort-Object -Property DisplayName | Export-Csv C:\Users\22041912\Documents\User_statis_list.csv
When I run it in PowerShell, my script works normally and writes the CSV file, but as a scheduled task it can't write the CSV file. The task history tells me it finished successfully.
Does anyone have a suggestion? What is wrong?

Copy this into a batch file and point the Task Scheduler to the batch file:
PowerShell.exe -WindowStyle Hidden -File E:\Shell\OdmaaGet-Aduser.ps1
Check if it works.
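A side note (my guess, not from the thread): when the task runs under another account, a path under C:\Users\<name> may not be writable, and the history still reports success. Wrapping the script in a transcript makes the real error visible; a sketch with an assumed log path:

```powershell
# Writes everything the script prints, including errors, to a log file
# that the scheduled task's account can reach.
Start-Transcript -Path 'E:\Shell\task-log.txt' -Append
try {
    & 'E:\Shell\OdmaaGet-Aduser.ps1'
} finally {
    Stop-Transcript
}
```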
Every second counts..make use of it. Disclaimer: This posting is provided AS IS with no warranties or guarantees and confers no rights.
IT Stuff Quick Bytes -
I've been having a lot of problems trying to get an old batch file we have lying around to run from my PowerShell script. The batch file actually asks for two inputs from the user. I've managed to put together a PowerShell command that echoes a response, but of course that only answers one of the prompts. As usual, I've simplified things here to get my testing done. The batch file looks like this:
@ECHO OFF
SET /P CUSTID=Customer Number:
SET /P DBCOUNT=Number of Live Databases:
ECHO Customer Id was set to : %CUSTID%
ECHO Database Count was set to : %DBCOUNT%
Two inputs, two echos to verify values have been set. Now, the powershell looks like this:
Param(
[string]$ClientADG,
[string]$ClientDBCount,
[scriptblock]$Command
)
$ClientADG = '1013'
$ClientDBCount = '2'
$Response = $ClientADG + "`r`n" + $ClientDBCount
$Command = 'Invoke-Command -ComputerName localhost -ScriptBlock {cmd /c "echo ' + $ClientADG + ' | E:\Scripts\Setup\Company\DatabaseSetupTest.bat"}'
powershell -command $Command
Output looks like:
Customer Number: Number of Live Databases: Customer Id was set to : 1013
Database Count was set to :
As expected, since I'm only passing in one value. I can't figure out how to get a second value passed in for the second prompt. Instead of $ClientADG, I tried to mash the two values together in the $Response variable with a CR/LF, a CR, or an LF in between, but no go there either: the first input gets set to the second value, and the second input is still blank. In the interest of time, I need to get this batch file called from PowerShell to keep some folks productive while I rewrite what the batch file does in another PowerShell script, so it can be integrated into other things. (I'm automating what a bunch of people spend hours doing into multiple scripts, and eventually one BIG script, so they can focus on doing their real jobs instead.)
How do I get this right in powershell? I don't want to modify the batch file at all at this point, just get it running from the powershell.
Thanks in advance!
mpleaf

It's a "simple" test so I can figure out how to get the arguments passed from PS to BAT. The BAT file looks like this:
@ECHO OFF
SET CUSTID = %1
SET DBCOUNT = %2
ECHO Customer Id was set to : %CUSTID%
ECHO Database Count was set to : %DBCOUNT%
That's it. The PS script looks like this:
Invoke-Command -ComputerName myserver -ScriptBlock { cmd /c "E:\Scripts\Setup\Company\DatabaseSetupTest.bat 1013 2" }
That's it. The bat file exists on "myserver", and I'm getting the echo back, but without the values.
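Two observations (mine, not from the thread): in batch, SET CUSTID = %1 defines a variable literally named "CUSTID " (with a trailing space), so %CUSTID% echoes empty; writing SET CUSTID=%1 with no spaces around = fixes the positional version. And for the original SET /P version, each line piped into the batch file answers one prompt in order, so both prompts can be fed at once; a sketch using the path from the thread:

```powershell
# Each string becomes one line on cmd's stdin; each SET /P consumes one line.
"1013", "2" | cmd /c "E:\Scripts\Setup\Company\DatabaseSetupTest.bat"
```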
mpleaf -
What is the best way to run a PowerShell script with parameters in the Task Scheduler?
Hello,
Want to run the following from a scheduled task in the Task Scheduler on a server. What is the best approach?
.\pscript.ps1 -csvfile "\\Srv1\Share\File.txt"
Thanks for your help! SdeDot

Hi,
To run a powershell script with parameters in the Task Scheduler:
Program: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Add argument (optional): -Command "& c:\scripts\test.ps1 -par1 2 -par2 3"
Hope the below two articles be helpful for you:
Schedule PowerShell Scripts that Require Input Values
https://blogs.technet.com/b/heyscriptingguy/archive/2011/01/12/schedule-powershell-scripts-that-require-input-values.aspx
How to Schedule a PowerShell Script
http://dmitrysotnikov.wordpress.com/2011/02/03/how-to-schedule-a-powershell-script/
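An alternative sketch using -File, which passes script parameters without the quoting gymnastics of -Command (the C:\scripts location is an assumption; the parameter comes from the question above):

```powershell
# Task Scheduler action: Program = powershell.exe, Add arguments = the rest.
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\scripts\pscript.ps1" -csvfile "\\Srv1\Share\File.txt"
```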
Regards,
Yan Li
-
PowerShell script not running in the Task Scheduler...
I've created a .ps1 script to transfer a file using WinSCP, and I can run it in the ISE environment, in the PS window, and with the Run command. I've transferred the command I used in the Run prompt to the Task Scheduler, but it is not running there. It runs everywhere else, just not in the Scheduler. It says that it completes okay and gives a return code of OpCode=2.
The action is set to run this: c:\Windows\System32\WindowsPowerShell\v1.0\Powershell.exe
The Arguments: -ExecutionPolicy Bypass -file "C:\Users\me\scriptsWCP\FileTransferPS.ps1"
I also have it running with the highest privileges and as SYSTEM.

Hi,
To run a powershell script with parameters in the Task Scheduler:
Program: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Add argument (optional): -Command "& c:\scripts\test.ps1 -par1 2 -par2 3"
Hope the below two articles be helpful for you:
Schedule PowerShell Scripts that Require Input Values
https://blogs.technet.com/b/heyscriptingguy/archive/2011/01/12/schedule-powershell-scripts-that-require-input-values.aspx
How to Schedule a PowerShell Script
http://dmitrysotnikov.wordpress.com/2011/02/03/how-to-schedule-a-powershell-script/
Regards,
Yan Li
-
Unable to Configure a PowerShell Script to Run as a Scheduled Task
Hi,
I have a PowerShell script from Dieter's blog http://scug.be/dieter/2011/07/06/scom-2007-how-to-backup-your-unsealed-management-packs/#comment-1058 which backs up the unsealed management packs in SCOM. I am able to run the script successfully at the PowerShell prompt, but when I schedule it as a task it fails with an error code 1.
"Task Scheduler successfully completed task "\Backup Unsealed Management Pack" , instance "{xxxxxxxxxxxxxxxxxxxxxxxxxxxx}" , action "Powershell.exe" with return
code 1."
I am running the task using the highest privileges and have tried bypassing the execution policy with the argument below, configured as per http://www.metalogix.com/help/Content%20Matrix%20Console/SharePoint%20Edition/002_HowTo/004_SharePointActions/012_SchedulingPowerShell.htm
-ExecutionPolicy Bypass c:\scripts\myscript.ps1
Still I am unable to run the task without errors. Kindly assist.
Jesty

Hi Jesty,
Please make sure you have followed these steps to schedule a powershell script task:
1. Save the powershell script as .ps1 in local computer.
2. Right click the .ps1 file and run with powershell and check if the script can run successfully.
3. Create the scheduled task -> Action: Start a program -> Program/script: powershell.exe -> Add arguments: -ExecutionPolicy Bypass c:\scripts\myscript.ps1
Best Regards,
Anna -
How to export PowerShell script information to SharePoint?
I'm trying to export information gathered by a PowerShell script to a SharePoint list. I've got a couple of PowerShell scripts that gather general server information (e.g. server uptime, disk space, service tag) and export it to a CSV file. What I would like to do is output the same information gathered by the PowerShell scripts directly to a SharePoint list, if at all possible.
Ex:
# all this does is reads from a list and runs a script call "boottime.ps1"
get-content "\\%Data-Path-Of-List%\computers.txt" | %Data-Path-Of-Script%\boottime.ps1 | Export-csv %Data-Path-For-CSV\Computers.csv
# then just exports the information from the boottime.ps1 script to a csv file
#I also have a script that will upload the information to a sharepoint list.
# I found that I have to run this in version 2 of powershell, so I just open a DOS prompt in Admin Priv's and type the following
powershell.exe -version 2.0
# Next I make sure the Sharepoint snap-in is loaded
if ( (Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null) {
Add-PsSnapin Microsoft.SharePoint.PowerShell
}
$spweb = get-SPweb $spserver
$spdata =$spweb.getlist("%URL_Of_My_List%")
# this is the same location from the orginal Powershell script previously stated.
$ComputerInfoFile = "%Data-Path-For-CSV%\Computers.csv"
foreach ($row in $tblData) {
# here is where I add the information from my csv file
# 2 things need to be present
# 1st the columns have to be present in the sharepoint site before I can upload the information
# 2nd the columns have to be the headers in my csv file
$spItem = $spData.AddItem()
$SpItem["ServerName"] = $row."ServerName".toString()
$SpItem["Uptime"] = $row."Uptime".toString()
$SpItem.Update()
}
# this just disconnects from Sharepoint
$spWeb.Dispose()
Please excuse all the comments; they just help me understand what the code is doing. Also, if this is not the correct place to post this question, I apologize in advance and ask that you provide me a link to where I can post such questions.

Sorry for the delay in posting this, but I ended up getting it working. I'll post it in the hopes that my head-scratching will save someone else some head-scratching:
I ended up writing 3 PS scripts and one batch job.
1st: the batch file
powershell.exe -version 2.0 -command \\%Script-Location\Get-Server-Infor-4-SP.ps1
powershell.exe -version 2.0 -command \\%Script-Location\Delete-list-Items.ps1
powershell.exe -version 2.0 -command \\%Script-Location\Populate-SP.ps1
1st PS script that gets the info:
get-content \\%Location-Of-My-File-With-List-Of-Servers%\%name-of-file%.txt | \\%Location-Of-My-Script-To-get-the-Information-I-want | Export-csv \\%location-of-my-output\%filename%.csv
Ex: get-content C:\scripts\computers.txt | C:\scripts\boottime.ps1 | export-csv C:\scripts\computer.csv
2nd PS script Delete-List-Items.ps1
# http:
#Script 1 Boottime.ps1:
# This script permits to get UpTime from localHost or a set of remote Computer
# usage
# localHost
# .\BootTime.ps1
# set of remote computers
# get-content .\MyserverList.txt | .\boottime.ps1
# Optionally pipe output to Export-Csv, ConverTo-Html
Process {
$ServerName = $_
if ($serverName -eq $Null) {
$serverName= $env:COMPUTERNAME
}
$timeVal = (Get-WmiObject -ComputerName $ServerName -Query "SELECT LastBootUpTime FROM Win32_OperatingSystem").LastBootUpTime
#$timeVal
$DbPoint = [char]58
$Years = $timeVal.substring(0,4)
$Months = $timeVal.substring(4,2)
$Days = $timeVal.substring(6,2)
$Hours = $timeVal.substring(8,2)
$Mins = $timeVal.substring(10,2)
$Secondes = $timeVal.substring(12,2)
$dayDiff = New-TimeSpan $(Get-Date -month $Months -day $Days -year $Years -hour $Hours -minute $Mins -Second $Secondes) $(Get-Date)
$Info = "" | select ServerName, Uptime
$Info.servername = $servername
$d =$dayDiff.days
$h =$dayDiff.hours
$m =$dayDiff.Minutes
$s = $daydiff.Seconds
$info.Uptime = "$d Days $h Hours $m Min $s Sec"
$Info
}
#Script 2: Delete-List-Items.ps1
# http://markimarta.com/sharepoint/delete-all-items-in-sharepoint-list-using-powershell/
# there seems to be a problem with running this script in version 3 or later, the workaround is to run it in version 2
# below is the cmd for doing so, just open up a DOS prompt with Admin Privileges Start-->Run-->cmd
# type then copy and paste the following line the DOS window then you can run this script
#powershell.exe -version 2.0
# make sure that the Microsoft.SharePoint.PowerShell Snap-in is installed as well
if ( (Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null ) {
Add-PsSnapin Microsoft.SharePoint.PowerShell
}
# "Enter the site URL instead http://serverurl"
$SITEURL = "%http://serverurl%"
$site = new-object Microsoft.SharePoint.SPSite ( $SITEURL )
$web = $site.OpenWeb()
"Web is : " + $web.Title
# Enter name of the List below in the ["%List-Name%"]
$oList = $web.Lists["%List-Name%"];
# This echo out the name of the list its going to be deleting the records from
"List is :" + $oList.Title + " with item count " + $oList.ItemCount
# It’s just counting the rows/records
$collListItems = $oList.Items;
$count = $collListItems.Count - 1
# Here is where it is actually deleting the records and then out put the number or the record it deleted
for($intIndex = $count; $intIndex -gt -1; $intIndex--) {
"Deleting record: " + $intIndex
$collListItems.Delete($intIndex);
}
#Script 3: Populate-SP_Test.ps1
# http://blogs.technet.com/b/stuffstevesays/archive/2013/07/10/3577320.aspx
# there seems to be a problem with running this script in version 3 or later, the workaround is to run it in veriosn 2
# below is the cmd for doing so, just open up a DOS prompt with Admin Privileges Start-->Run-->cmd
# type then copy and paste the following line the the DOS window then you can run this script
#powershell.exe -version 2.0
# make sure that the Microsoft.SharePoint.PowerShell Snap-in is installed
if ( (Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null ) {
Add-PsSnapin Microsoft.SharePoint.PowerShell
}
# Here are some more varables that can be added I was not able to get this to work
#$SPComputerInfo="/Lists/PowershellTest/"
# Here is where we connect and Open SharePoint List via Powershell
$SPServer = "%http://serverurl%"
$spWeb = Get-SPWeb $SPServer
$spData = $spWeb.GetList("%List-Name%")
# This is the variable for the path that has the file I want to input to SharePoint List
$InvFile="\\%location-ofList%\computers.csv"
# This is just some error checking to make sure the file exist
$FileExists = (Test-Path $InvFile -PathType Leaf)
if ($FileExists) {
"Loading $InvFile for processing..."
$tblData = import-csv $InvFile
} else {
"$InvFile not found - stopping import!"
exit
}
# Loop through Applications add each one to SharePoint
"Uploading data to SharePoint...."
foreach ($row in $tblData) {
#Here is where I add the information from my CSV file
#2 things have to be present
# 1. the columns have to be in the sharepoint site before I can import the information
# 2. columns have to be the headers in my csv file
#"Adding entry for "+$row."Computer Information".ToString()
$spItem = $spData.AddItem()
$spItem["ServerName"] = $row."ServerName".ToString()
$spItem["Uptime"] = $row."Uptime".ToString()
#$spItem["DNSHostName"] = $row."DNSHostName".ToString()
#$spItem["DistinguishedName"] = $row."DistinguishedName".ToString()
$spItem.Update()
}
# This is just disconnecting from SharePoint
$spWeb.Dispose()
Enjoy, and if anyone has a better way of doing this I'm interested in knowing, thanks again
Thanks in advance -
Help modifying a PowerShell script
Hello,
I have recently been given a task to write or find a script capable of performing full and incremental backups. I found a script that does exactly what I need; however, it requires user input. I need this to be a scheduled task, and therefore I need the input to be a static path. Here is the script I am talking about:
#region Params
param(
[Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
[ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
[System.String]
$SourceDir,
[Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
[System.String]
$DestDir,
[Parameter(Position=2, Mandatory=$false,ValueFromPipeline=$false)]
[ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
[System.String]
$HashPath,
[Parameter(Position=3, Mandatory=$false,ValueFromPipeline=$false)]
[ValidateSet("Full","Incremental","Differential")]
[System.String]
$BackupType="Full",
[Parameter(Position=4, Mandatory=$false,ValueFromPipeline=$false)]
[ValidateNotNullOrEmpty()]
[System.String]
$LogFile=".\Backup-Files.log",
[Parameter(Position=5, Mandatory=$false,ValueFromPipeline=$false)]
[System.Management.Automation.SwitchParameter]
$SwitchToFull
)
#endregion
begin{
function Write-Log {
#region Params
[CmdletBinding()]
[OutputType([System.String])]
param(
[Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
[ValidateNotNullOrEmpty()]
[System.String]
$Message,
[Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$true)]
[ValidateNotNullOrEmpty()]
[System.String]
$LogFile
)
#endregion
try{
Write-Host $Message
Out-File -InputObject $Message -Append $LogFile
}
catch {throw $_}
}
function Get-Hash {
#region Params
[CmdletBinding()]
[OutputType([System.String])]
param(
[Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
[ValidateNotNullOrEmpty()]
[System.String]
$HashTarget,
[Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateSet("File","String")]
[System.String]
$HashType
)
#endregion
begin{
try{ $objGetHashMD5 = New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider }
catch {throw $_ }
}
process{
try {
#Checking hash target is file or just string
switch($HashType){
"String" {
$objGetHashUtf8 = New-Object -TypeName System.Text.UTF8Encoding
$arrayGetHashHash = $objGetHashMD5.ComputeHash($objGetHashUtf8.GetBytes($HashTarget.ToUpper()))
break
}
"File" {
$arrayGetHashHash = $objGetHashMD5.ComputeHash([System.IO.File]::ReadAllBytes($HashTarget))
break
}
}
#Return hash
Write-Output $([System.Convert]::ToBase64String($arrayGetHashHash))
}
catch { throw $_ }
}
}
function Copy-File {
#region Params
[CmdletBinding()]
[OutputType([System.String])]
param(
[Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateScript({Test-Path -LiteralPath $_ -PathType 'Any'})]
[System.String]
$SourceFile,
[Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateNotNullOrEmpty()]
[System.String]
$DestFile
)
#endregion
try{
#The script fails when folder being copied to file. So the item will be removed to avoid the error.
if(Test-Path -LiteralPath $DestFile -PathType Any){
Remove-Item -LiteralPath $DestFile -Force -Recurse
}
#Creating destination if doesn't exist. It's required because Copy-Item doesn't create destination folder
if(Test-Path -LiteralPath $SourceFile -PathType Leaf){
New-Item -ItemType "File" -Path $DestFile -Force
}
#Copying file to destination directory
Copy-Item -LiteralPath $SourceFile -Destination $DestFile -Force
}
catch{ throw $_ }
}
function Backup-Files {
#region Params
[CmdletBinding()]
[OutputType([System.String])]
param(
[Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
[System.String]
$SourceDir,
[Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateNotNullOrEmpty()]
[System.String]
$DestDir,
[Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateNotNull()]
[System.Collections.Hashtable]
$HashTable
)
#endregion
try{
$xmlBackupFilesHashFile = $HashTable
Write-Host "Backup started"
Get-ChildItem -Recurse -Path $SourceDir|ForEach-Object{
$currentBackupFilesItem = $_
#Full path to source and destination item
$strBackupFilesSourceFullPath = $currentBackupFilesItem.FullName
$strBackupFilesDestFullPath = $currentBackupFilesItem.FullName.Replace($SourceDir,$DestDir)
#Checking that the current item is file and not directory. True - the item is file.
$bBackupFilesFile = $($($currentBackupFilesItem.Attributes -band [System.IO.FileAttributes]::Directory) -ne [System.IO.FileAttributes]::Directory)
Write-Host -NoNewline ">>>Processing item $strBackupFilesSourceFullPath..."
#Generating path hash
$hashBackupFilesPath = $(Get-Hash -HashTarget $strBackupFilesSourceFullPath -HashType "String")
$hashBackupFilesFile = "d"
#If the item is file then generate hash for file content
if($bBackupFilesFile){
$hashBackupFilesFile = $(Get-Hash -HashTarget $strBackupFilesSourceFullPath -HashType "File")
}
#Checking that the file has been copied
if($xmlBackupFilesHashFile[$hashBackupFilesPath] -ne $hashBackupFilesFile){
Write-Host -NoNewline $("hash changed=>$hashBackupFilesFile...")
Copy-File -SourceFile $strBackupFilesSourceFullPath $strBackupFilesDestFullPath|Out-Null
#Returning result
Write-Output @{$hashBackupFilesPath=$hashBackupFilesFile}
}
else{
Write-Host -NoNewline "not changed..."
}
Write-Host "done"
}
Write-Host "Backup completed"
}
catch { throw $_ }
}
function Backup-Full {
[CmdletBinding()]
[OutputType([System.String])]
#region Params
param(
[Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
[System.String]
$SourceDir,
[Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateNotNullOrEmpty()]
[System.String]
$DestDir,
[Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateNotNullOrEmpty()]
[System.String]
$HashFile,
[Parameter(Position=3, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateNotNullOrEmpty()]
[System.String]
$ChainKey
)
#endregion
try{
#Creating an empty hash table
$xmlBackupFullHashFile = @{}
#Starting directory lookup
$uintBackupFullCount = 0
Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$ChainKey\Full_" + $(Get-Date -Format "ddMMyyyy")) -HashTable $xmlBackupFullHashFile|`
ForEach-Object{
$xmlBackupFullHashFile.Add([string]$_.Keys,[string]$_.Values)
$uintBackupFullCount++
}
#Saving chain key.
$xmlBackupFullHashFile.Add("ChainKey",$ChainKey)
Write-Host -NoNewline "Saving XML file to $HashFile..."
Export-Clixml -Path $HashFile -InputObject $xmlBackupFullHashFile -Force
Write-Host "done"
Write-Output $uintBackupFullCount
}
catch { throw $_ }
}
function Backup-Diff {
#region Params
[CmdletBinding()]
[OutputType([System.String])]
param(
[Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
[System.String]
$SourceDir,
[Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateNotNullOrEmpty()]
[System.String]
$DestDir,
[Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateScript({Test-Path -LiteralPath $_ -PathType 'leaf'})]
[System.String]
$HashFile
)
#endregion
try{
#Loading hash table
$xmlBackupDiffHashFile = Import-Clixml $HashFile
$chainKeyBackupDiffDifferential = $xmlBackupDiffHashFile["ChainKey"]
$uintBackupDiffCount = 0
#Starting directory lookup
Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$chainKeyBackupDiffDifferential\Differential_" + $(Get-Date -Format "ddMMyyyy.HHmm")) -HashTable $xmlBackupDiffHashFile|`
ForEach-Object{ $uintBackupDiffCount++ }
Write-Output $uintBackupDiffCount
}
catch { throw $_ }
}
function Backup-Inc {
#region Params
[CmdletBinding()]
[OutputType([System.String])]
param(
[Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
[System.String]
$SourceDir,
[Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateNotNullOrEmpty()]
[System.String]
$DestDir,
[Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
[ValidateScript({Test-Path -LiteralPath $_ -PathType 'leaf'})]
[System.String]
$HashFile
)
#endregion
try{
#Loading hash table
$xmlBackupIncHashFile = Import-Clixml $HashFile
$chainKeyBackupIncIncremental = $xmlBackupIncHashFile["ChainKey"]
$uintBackupIncCount = 0
#Starting directory lookup
Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$chainKeyBackupIncIncremental\Incremental_" + $(Get-Date -Format "ddMMyyyy.HHmm")) -HashTable $xmlBackupIncHashFile|`
ForEach-Object{
$xmlBackupIncHashFile[[string]$_.Keys]=[string]$_.Values
$uintBackupIncCount++
}
Write-Host -NoNewline "Saving XML file to $HashFile..."
Export-Clixml -Path $HashFile -InputObject $xmlBackupIncHashFile -Force
Write-Host "Done"
Write-Output $uintBackupIncCount
}
catch { throw $_ }
}
#0 - is OK. 1 - some error
$exitValue=0
}
process{
try{
$filesCopied=0
$strSourceFolderName = $(Get-Item $SourceDir).Name
$strHasFile = $("$HashPath\Hash_$strSourceFolderName.xml")
$strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir started")
#Automatically switch to full backup
$bSwitch = $(!$(Test-Path -LiteralPath $strHasFile -PathType "Leaf") -and $SwitchToFull)
Write-Log -Message $strMessage -LogFile $LogFile
switch($true){
$($BackupType -eq "Full" -or $bSwitch) {
$filesCopied = Backup-Full -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHasFile -ChainKey $("Backup_$strSourceFolderName" + "_" + $(Get-Date -Format "ddMMyyyy"))
break
}
$($BackupType -eq "Incremental") {
$filesCopied = Backup-Inc -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHasFile
break
}
$($BackupType -eq "Differential") {
$filesCopied = Backup-Diff -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHasFile
break
}
}
$strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir completed successfully. $filesCopied items were copied.")
Write-Log -Message $strMessage -LogFile $LogFile
Write-Output $filesCopied
}
catch {
$strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir failed:" + $_)
Write-Log -Message $strMessage -LogFile $LogFile
$exitValue = 1
}
}
end{exit $exitValue}
I have some experience writing PowerShell scripts, but I am lost as to how this script prompts for Source and Destination paths. I tried modifying the Param section, but this didn't work, and until now I thought the only way you could get a prompt was with
"Read-Host". Any and all education on this matter would be greatly appreciated. (Side note: I have posted this question on the forum where I found it and have not gotten an answer yet.)
param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
    [System.String]
    $SourceDir,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
    [System.String]
    $DestDir,
    [Parameter(Position=2, Mandatory=$false,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
    [System.String]
    $HashPath,
    [Parameter(Position=3, Mandatory=$false,ValueFromPipeline=$false)]
    [ValidateSet("Full","Incremental","Differential")]
    [System.String]
    $BackupType="Full",
    [Parameter(Position=4, Mandatory=$false,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $LogFile=".\Backup-Files.log",
    [Parameter(Position=5, Mandatory=$false,ValueFromPipeline=$false)]
    [System.Management.Automation.SwitchParameter]
    $SwitchToFull
)
#endregion
begin{
function Write-Log
{
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
        [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $Message,
        [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$true)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $LogFile
    )
    #endregion
    try{
        Write-Host $Message
        Out-File -InputObject $Message -Append $LogFile
    }
    catch {throw $_}
}
function Get-Hash
{
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
        [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $HashTarget,
        [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateSet("File","String")]
        [System.String]
        $HashType
    )
    #endregion
    begin{
        try{ $objGetHashMD5 = New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider }
        catch {throw $_ }
    }
    process{
        try {
            #Checking whether the hash target is a file or just a string
            switch($HashType){
                "String" {
                    $objGetHashUtf8 = New-Object -TypeName System.Text.UTF8Encoding
                    $arrayGetHashHash = $objGetHashMD5.ComputeHash($objGetHashUtf8.GetBytes($HashTarget.ToUpper()))
                    break
                }
                "File" {
                    $arrayGetHashHash = $objGetHashMD5.ComputeHash([System.IO.File]::ReadAllBytes($HashTarget))
                    break
                }
            }
            #Return hash
            Write-Output $([System.Convert]::ToBase64String($arrayGetHashHash))
        }
        catch { throw $_ }
    }
}
function Copy-File
{
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
        [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Any'})]
        [System.String]
        $SourceFile,
        [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $DestFile
    )
    #endregion
    try{
        #The script fails when a folder is being copied over a file, so the item is removed first to avoid the error.
        if(Test-Path -LiteralPath $DestFile -PathType Any){
            Remove-Item -LiteralPath $DestFile -Force -Recurse
        }
        #Creating the destination if it doesn't exist. This is required because Copy-Item doesn't create the destination folder.
        if(Test-Path -LiteralPath $SourceFile -PathType Leaf){
            New-Item -ItemType "File" -Path $DestFile -Force
        }
        #Copying the file to the destination directory
        Copy-Item -LiteralPath $SourceFile -Destination $DestFile -Force
    }
    catch{ throw $_ }
}
function Backup-Files
{
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
        [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
        [System.String]
        $SourceDir,
        [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $DestDir,
        [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateNotNull()]
        [System.Collections.Hashtable]
        $HashTable
    )
    #endregion
    try{
        $xmlBackupFilesHashFile = $HashTable
        Write-Host "Backup started"
        Get-ChildItem -Recurse -Path $SourceDir|ForEach-Object{
            $currentBackupFilesItem = $_
            #Full path to the source and destination item
            $strBackupFilesSourceFullPath = $currentBackupFilesItem.FullName
            $strBackupFilesDestFullPath = $currentBackupFilesItem.FullName.Replace($SourceDir,$DestDir)
            #Checking whether the current item is a file or a directory. True - the item is a file.
            $bBackupFilesFile = $($($currentBackupFilesItem.Attributes -band [System.IO.FileAttributes]::Directory) -ne [System.IO.FileAttributes]::Directory)
            Write-Host -NoNewline ">>>Processing item $strBackupFilesSourceFullPath..."
            #Generating path hash
            $hashBackupFilesPath = $(Get-Hash -HashTarget $strBackupFilesSourceFullPath -HashType "String")
            $hashBackupFilesFile = "d"
            #If the item is a file, generate a hash for the file content
            if($bBackupFilesFile){
                $hashBackupFilesFile = $(Get-Hash -HashTarget $strBackupFilesSourceFullPath -HashType "File")
            }
            #Checking whether the file has already been copied
            if($xmlBackupFilesHashFile[$hashBackupFilesPath] -ne $hashBackupFilesFile){
                Write-Host -NoNewline $("hash changed=>$hashBackupFilesFile...")
                Copy-File -SourceFile $strBackupFilesSourceFullPath -DestFile $strBackupFilesDestFullPath|Out-Null
                #Returning result
                Write-Output @{$hashBackupFilesPath=$hashBackupFilesFile}
            }
            else{
                Write-Host -NoNewline "not changed..."
            }
            Write-Host "done"
        }
        Write-Host "Backup completed"
    }
    catch { throw $_ }
}
function Backup-Full
{
    [CmdletBinding()]
    [OutputType([System.String])]
    #region Params
    param(
        [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
        [System.String]
        $SourceDir,
        [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $DestDir,
        [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $HashFile,
        [Parameter(Position=3, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $ChainKey
    )
    #endregion
    try{
        #Creating an empty hash table
        $xmlBackupFullHashFile = @{}
        #Starting directory lookup
        $uintBackupFullCount = 0
        Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$ChainKey\Full_" + $(Get-Date -Format "ddMMyyyy")) -HashTable $xmlBackupFullHashFile|`
        ForEach-Object{
            $xmlBackupFullHashFile.Add([string]$_.Keys,[string]$_.Values)
            $uintBackupFullCount++
        }
        #Saving the chain key.
        $xmlBackupFullHashFile.Add("ChainKey",$ChainKey)
        Write-Host -NoNewline "Saving XML file to $HashFile..."
        Export-Clixml -Path $HashFile -InputObject $xmlBackupFullHashFile -Force
        Write-Host "done"
        Write-Output $uintBackupFullCount
    }
    catch { throw $_ }
}
function Backup-Diff
{
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
        [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
        [System.String]
        $SourceDir,
        [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $DestDir,
        [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Leaf'})]
        [System.String]
        $HashFile
    )
    #endregion
    try{
        #Loading the hash table
        $xmlBackupDiffHashFile = Import-Clixml $HashFile
        $chainKeyBackupDiffDifferential = $xmlBackupDiffHashFile["ChainKey"]
        $uintBackupDiffCount = 0
        #Starting directory lookup
        Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$chainKeyBackupDiffDifferential\Differential_" + $(Get-Date -Format "ddMMyyyy.HHmm")) -HashTable $xmlBackupDiffHashFile|`
        ForEach-Object{ $uintBackupDiffCount++ }
        Write-Output $uintBackupDiffCount
    }
    catch { throw $_ }
}
function Backup-Inc
{
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
        [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})]
        [System.String]
        $SourceDir,
        [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $DestDir,
        [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
        [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Leaf'})]
        [System.String]
        $HashFile
    )
    #endregion
    try{
        #Loading the hash table
        $xmlBackupIncHashFile = Import-Clixml $HashFile
        $chainKeyBackupIncIncremental = $xmlBackupIncHashFile["ChainKey"]
        $uintBackupIncCount = 0
        #Starting directory lookup
        Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$chainKeyBackupIncIncremental\Incremental_" + $(Get-Date -Format "ddMMyyyy.HHmm")) -HashTable $xmlBackupIncHashFile|`
        ForEach-Object{
            $xmlBackupIncHashFile[[string]$_.Keys]=[string]$_.Values
            $uintBackupIncCount++
        }
        Write-Host -NoNewline "Saving XML file to $HashFile..."
        Export-Clixml -Path $HashFile -InputObject $xmlBackupIncHashFile -Force
        Write-Host "Done"
        Write-Output $uintBackupIncCount
    }
    catch { throw $_ }
}
    #0 - is OK. 1 - some error
    $exitValue=0
}
process{
    try{
        $filesCopied=0
        $strSourceFolderName = $(Get-Item $SourceDir).Name
        $strHashFile = $("$HashPath\Hash_$strSourceFolderName.xml")
        $strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir started")
        #Automatically switch to a full backup if no hash file exists yet
        $bSwitch = $(!$(Test-Path -LiteralPath $strHashFile -PathType "Leaf") -and $SwitchToFull)
        Write-Log -Message $strMessage -LogFile $LogFile
        switch($true){
            $($BackupType -eq "Full" -or $bSwitch) {
                $filesCopied = Backup-Full -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHashFile -ChainKey $("Backup_$strSourceFolderName" + "_" + $(Get-Date -Format "ddMMyyyy"))
                break
            }
            $($BackupType -eq "Incremental") {
                $filesCopied = Backup-Inc -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHashFile
                break
            }
            $($BackupType -eq "Differential") {
                $filesCopied = Backup-Diff -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHashFile
                break
            }
        }
        $strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir completed successfully. $filesCopied items were copied.")
        Write-Log -Message $strMessage -LogFile $LogFile
        Write-Output $filesCopied
    }
    catch {
        $strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir failed:" + $_)
        Write-Log -Message $strMessage -LogFile $LogFile
        $exitValue = 1
    }
}
end{exit $exitValue}

Hi Ryan Blaeholder,
Thanks for your posting.
To schedule a PowerShell script that takes input values, instead of modifying the script above you can supply the inputs when creating the scheduled task, like this (save the script above as D:\backup.ps1):
-command "& 'D:\backup.ps1' 'input1' 'input2'"
For more detailed information, please refer to this article to complete:
Schedule PowerShell Scripts that Require Input Values:
http://blogs.technet.com/b/heyscriptingguy/archive/2011/01/12/schedule-powershell-scripts-that-require-input-values.aspx
I hope this helps.
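As a concrete illustration of that approach, the whole task can also be registered from PowerShell on Server 2012 R2 and later using the ScheduledTasks module. This is only a sketch: the task name, trigger time, and the two argument paths below are placeholders, not values from the original thread.

```powershell
# Sketch: register a task that runs backup.ps1 nightly with two positional arguments.
# 'NightlyBackup', 2am, and both paths are illustrative placeholders.
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -ExecutionPolicy Bypass -Command "& ''D:\backup.ps1'' ''D:\Data'' ''E:\Backups''"'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'NightlyBackup' -Action $action -Trigger $trigger
```

The nested single quotes are doubled because the argument string itself is single-quoted; schtasks.exe would work equally well here.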
-
How to create batch job based on sales organization?
I am Sending data using RFC function module, within a report program.
Now according to functional requirement I have to create two batch job scheduling scenarios:
1. Provide a batch job to be run towards month end (2X, 1X) to send a delta record message using the report. This is scheduled on demand or requested on demand. Recommend 1 message per sales org.
Q: What does "scheduled on demand or requested on demand" mean? Secondly, we have 10 sales orgs, and in the variants for job scheduling we can assign only 1 at a time; how can I schedule per sales organization? Do I need to create a different job for each sales organization?
What is the meaning of the notation (2X, 1X)?
2. Create a new nightly batch job, run after the daily billing batch, to send a delta records message using the report. Recommend 1 message per sales org.
Could anyone please provide me some details on this issue?
Regards
Anshul -
Hi
What is the difference between a batch job and a background job?
thanks
kumar

hi kumar,
As far as I know, batch jobs are jobs created with sessions. You can process those sessions in SM35. These are not direct update methods.
Background jobs, on the other hand, are jobs that run in the background without user interaction. Once you schedule a background job, it runs according to the given schedule without any user interaction.
You can also run batch jobs in the background.
Dear Scripting Guys,
I am working on an AD migration project (migration from old legacy AD domains to a single AD domain) and am in the transition phase. Our infrastructure contains lots of users, servers and workstations. Authentication is done through AD only. Many UNIX and Linux based boxes are authenticated through an AD bridge.
We have a lot of applications in our environment. Many applications are configured to use Managed Service Accounts (MSAs). Many workstations and servers run batch jobs with AD user credentials, and many applications use AD user accounts to carry out their processes.
We need to find all the AD users that are configured as MSAs, that are configured for batch jobs, and that are used by different applications on our network (this needs to be determined for every machine on the network).
These identified AD users will be migrated to the new domain with top priority. I am stuck with this requirement, and your support will be deeply appreciated.
I hope a well designed PS script can achieve this.
Thanks in advance...
Thanks & Regards Bedanta S Mishra

Hey Satyajit,
Thank you for your valuable reply. It is really a great notion to enable account logon auditing and collect those events for analysis. But as you know, it becomes a tedious job when thousands of users come into the picture. You can imagine how complex this analysis will be when more than 200,000 users are logging in through AD. It is true that when a batch job, MSA, or application successfully uses a domain user's credentials, a successful logon event is automatically triggered in the associated DC. But there are also many users that are not part of these accounts, i.e. not MSAs, not batch jobs, and not linked to any application. In that case we have to wade through unwanted events.
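For what it's worth, the audit-collection idea can be sketched in PowerShell by pulling 4624 (successful logon) events from a DC's Security log and keeping only logon types 4 (batch) and 5 (service). The DC name and the event cap below are placeholder values, not part of the original discussion:

```powershell
# Sketch: list batch (type 4) and service (type 5) logons from a DC's Security log.
# 'DC01' and -MaxEvents 500 are illustrative placeholders.
Get-WinEvent -ComputerName 'DC01' -FilterHashtable @{LogName='Security'; Id=4624} -MaxEvents 500 |
    ForEach-Object {
        $xml  = [xml]$_.ToXml()
        $data = @{}
        foreach($d in $xml.Event.EventData.Data){ $data[$d.Name] = $d.'#text' }
        if($data['LogonType'] -eq '4' -or $data['LogonType'] -eq '5'){
            [pscustomobject]@{
                Time      = $_.TimeCreated
                Account   = "$($data['TargetDomainName'])\$($data['TargetUserName'])"
                LogonType = $data['LogonType']
            }
        }
    }
```

Parsing the event XML avoids depending on named-data filtering, which older Get-WinEvent versions do not support in -FilterHashtable.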
Recently jrv provided me a beautiful script to find all MSAs on a machine, or on a list of machines, in an AD environment (this covers the MSA part):
$Report= 'Audit_Report.html'
$Computers= Get-ADComputer -Filter 'Enabled -eq $True' | Select -Expand Name
$head=@'
<title>Non-Standard Service Accounts</title>
<style>
BODY{background-color:#FFFFFF}
TABLE{border-width:thin;border-style:solid;border-color:Black;border-collapse:collapse;}
TH{border-width:1px;padding:2px;border-style:solid;border-color:black;background-color:ThreeDShadow}
TD{border-width:1px;padding:2px;border-style:solid;border-color:black;background-color:Transparent}
</style>
'@
$sections=@()
foreach($computer in $Computers){
    $sections+=Get-WmiObject -ComputerName $Computer -Class Win32_Service -ErrorAction SilentlyContinue |
        Select-Object -Property StartName,Name,DisplayName |
        ConvertTo-Html -PreContent "<H2>Non-Standard Service Accounts on '$Computer'</H2>" -Fragment
}
$body=$sections | Out-String
ConvertTo-Html -Body $body -Head $head | Out-File $Report
Invoke-Item $Report
A script can be designed to get all scheduled background batch jobs on a machine, from which the author/owner of each scheduled job can be extracted, like the one below:
Function Get-ScheduledTasks
{
    Param
    (
        [Alias("Computer","ComputerName")]
        [Parameter(Position=1,ValuefromPipeline=$true,ValuefromPipelineByPropertyName=$true)]
        [string[]]$Name = $env:COMPUTERNAME,
        [switch]$RootOnly = $false
    )
    Begin
    {
        $tasks = @()
        $schedule = New-Object -ComObject "Schedule.Service"
    }
    Process
    {
        Function Get-Tasks
        {
            Param($path)
            $out = @()
            $schedule.GetFolder($path).GetTasks(0) | % {
                $xml = [xml]$_.xml
                $out += New-Object psobject -Property @{
                    "ComputerName" = $Computer
                    "Name" = $_.Name
                    "Path" = $_.Path
                    "LastRunTime" = $_.LastRunTime
                    "NextRunTime" = $_.NextRunTime
                    "Actions" = ($xml.Task.Actions.Exec | % { "$($_.Command) $($_.Arguments)" }) -join "`n"
                    "Triggers" = $(If($xml.task.triggers){ForEach($task in ($xml.task.triggers | gm | Where{$_.membertype -eq "Property"})){$xml.task.triggers.$($task.name)}})
                    "Enabled" = $xml.task.settings.enabled
                    "Author" = $xml.task.principals.Principal.UserID
                    "Description" = $xml.task.registrationInfo.Description
                    "LastTaskResult" = $_.LastTaskResult
                    "RunAs" = $xml.task.principals.principal.userid
                }
            }
            If(!$RootOnly)
            {
                $schedule.GetFolder($path).GetFolders(0) | % {
                    $out += Get-Tasks($_.Path)
                }
            }
            $out
        }
        ForEach($Computer in $Name)
        {
            If(Test-Connection $Computer -Count 1 -Quiet)
            {
                $schedule.Connect($Computer)
                $tasks += Get-Tasks "\"
            }
            Else
            {
                Write-Error "Cannot connect to $Computer. Please check its network connectivity."
                Break
            }
        }
        $tasks
    }
    End
    {
        [System.Runtime.Interopservices.Marshal]::ReleaseComObject($schedule) | Out-Null
        Remove-Variable schedule
    }
}
Get-ScheduledTasks -RootOnly | Format-Table -Wrap -AutoSize -Property RunAs,ComputerName,Actions
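Building on that, one possible way to use the function for this migration scenario is to pipe several computer names in and filter on the RunAs account. The computer names and the 'CONTOSO' domain below are placeholders:

```powershell
# Sketch: find tasks on several machines that run under a domain account.
# 'SRV01','SRV02' and 'CONTOSO' are illustrative placeholders.
'SRV01','SRV02' | Get-ScheduledTasks |
    Where-Object { $_.RunAs -like 'CONTOSO\*' } |
    Format-Table -AutoSize -Property ComputerName,Name,RunAs,Actions
```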
So I am wondering whether a PS script can be designed to report all running applications that use domain accounts for authentication. From that result we could filter out the AD accounts being used by those applications. After that, these three individual modules could be combined into a single script to provide the desired output as a single report.
Thanks & Regards Bedanta S Mishra -
How can I setup a scheduled task to run a Powershell Script delivered as a Group Policy Preference
I have a PowerShell script I want to run only once when a user logs onto their system. This script would move all the PST files from the local drive and the home drive to a folder location within the user's profile. I wanted to run this as a Windows 7 Scheduled Task using Group Policy Preferences. How can I get this to happen short of a logon script? I have updated all the machines to WMF 4.0, so could I use a Scheduled Job instead? I wanted to run the script as the logon user but elevated.

#Start Outlook and disconnect attached PST files.
$Outlook = New-Object -ComObject Outlook.Application
$namespace = $outlook.getnamespace("MAPI")
$folder = $namespace.GetDefaultFolder("olFolderInbox")
$explorer = $folder.GetExplorer()
$explorer.Display()
$myArray= @()
$outlook.Session.Stores | where{ ($_.FilePath -like '*.PST') } | foreach{ [array]$myArray += $_.FilePath }
for ($x=0; $x -le $myArray.Length-1; $x++)
{
    $PSTPath= $myArray[$x]
    $PST= $namespace.Stores | ?{ $_.FilePath -like $PSTPath }
    $PSTRoot= $PST.GetRootFolder() #Get Root Folder name of PST
    $PSTFolder= $Namespace.Folders.Item($PSTRoot.Name) #Bind to PST for disconnection
    $Namespace.GetType().InvokeMember('RemoveStore',[System.Reflection.BindingFlags]::InvokeMethod,$null,$Namespace,($PSTFolder)) #Disconnect .PST
}
#Move All PST files to the default location while deleting the PST files from their original location.
$SourceList = ("$env:SystemDrive", "$env:HOMEDRIVE")
$Destination = ("$env:USERPROFILE\MyOutlookFiles")
(Get-ChildItem -Path $SourceList -Recurse -Filter *.PST) | Move-Item -Destination $Destination
#Attach all PST files from the default location.
Add-Type -Assembly "Microsoft.Office.Interop.Outlook" | Out-Null
$outlook = New-Object -ComObject Outlook.Application
$namespace = $outlook.GetNameSpace("MAPI")
dir "$env:USERPROFILE\MyOutlookFiles\*.pst" | % { $namespace.AddStore($_.FullName) }

Mike,
I do not understand what appears to be a regular expression above. I did add the PowerShell script to the HKCU RunOnce Key as suggested.
Windows Registry Editor Version 5.00
C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe -noprofile -sta -WindowStyle Hidden -ExecutionPolicy RemoteSigned -File "C:\scripts\Windows PowerShell\Move-PST.ps1"
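For completeness, the same RunOnce entry can be created from PowerShell rather than a .reg import; the value name 'MovePST' below is a placeholder, not the name used in the original post:

```powershell
# Sketch: register the script under HKCU RunOnce so it runs once at the next logon.
# The value name 'MovePST' is an illustrative placeholder.
New-ItemProperty -Path 'HKCU:\Software\Microsoft\Windows\CurrentVersion\RunOnce' `
    -Name 'MovePST' -PropertyType String `
    -Value 'powershell.exe -NoProfile -WindowStyle Hidden -ExecutionPolicy RemoteSigned -File "C:\scripts\Windows PowerShell\Move-PST.ps1"'
```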
I'm delivering this using Group Policy Preferences. It seems to fail or time out when run because the behavior is different if I run the script from within the PowerShell IDE. I added the parameters to the script and will try it again in the morning. -
System.Drawing.Bitmap in a scheduled powershell script
I've written a powershell script to date stamp multipage tiffs, but I check to make sure the file name follows the correct format before doing so. The file name must contain the date, sequence number, and number of pages. The script works fine when run manually,
but when run from task scheduler it fails to query the number of pages in the TIFF. Any ideas why the .NET features wouldn't work from a powershell script run as a scheduled task?
I am putting the page number in the variable "count" by doing the following:
$i=[System.Drawing.Bitmap]::FromFile($file.Fullname);$i.GetFrameCount($i.FrameDimensionsList[0])
$count=$i.GetFrameCount([System.Drawing.Imaging.FrameDimension]::Page)
FULL SCRIPT FOLLOWS
#Define the input and output folders and date format
$Original_TIFFs="C:\scans"
$Modified_TIFFs=";\\test\Shared\SDS\"
$date = get-date -Format d
$datename=Get-Date -format yyyyMMdd
Set-Location $Original_TIFFs
#Configure email settings
$emailFrom = "removed"
$emailTo = "removed"
$smtpServer = "removed"
$body = "Rename scanned claims file to the correct format. This email was sent from: ", $env:computername
#Define the location of the TIFF command line executable and its parameters
$200DLL='C:\TiffDLL200Commandline\Cmd200.exe '
$arg1='"FILE='
#Modify arg2 to put the output directory in front of the ; if you don't want to overwrite the current file
#$arg2=';|OW=Yes|BITS=2|TEXT=2;Received Date: '
$arg2=$modified_TIFFs
$arg3=';|BITS=2|TEXT=2;Received Date: '
$arg4='|TEXTOPS=-5;;10;14;"'
$files=Get-ChildItem $Original_TIFFs -Filter *.tif
if ($files -eq $null)
{
    $subject = "No files to process today, directory empty."
    $smtp = new-object Net.Mail.SmtpClient($smtpServer)
    $body = "No files were processed today. This email was sent from: ", $env:computername
    $smtp.Send($emailFrom, $emailTo, $subject, $body)
}
else
{
    foreach ($file in $files)
    {
        #Begin loop to check each file and process
        #Loads subsystems for opening TIFFs; the second line puts the number of images into a variable
        $i=[System.Drawing.Bitmap]::FromFile($file.Fullname);$i.GetFrameCount($i.FrameDimensionsList[0])
        $count=$i.GetFrameCount([System.Drawing.Imaging.FrameDimension]::Page)
        #If statement checks whether the filename format is correct
        if ($file -match '^\d{8}\d{3}_H_S_\d+_\d{8}[.tif]{4}$')
        {
            $file.name -match '^(?<date1>\d{8})\d{3}_H_S_(?<page_count>\d+)_(?<date2>\d{8})[.tif]{4}$' #Regex to put tests in $matches to check against
            if (($matches.date1 -eq $datename) -and ($matches.date2 -eq $datename)) #Check if filename contains correct date
            {
                if ($count -eq $matches.page_count) #Check if filename contains the correct page count
                {
                    #Insert TIFF modification
                    $allargs=$200Dll+$arg1+$file+$arg2+$file+$arg3+$date+$arg4
                    cmd /c $allargs
                    #cmd /c xcopy $file \\test\shared\SDS #Deprecated because now having TIFF200DLL create a new file rather than overwrite
                    $i.Dispose()
                    #Close file stream so file can be deleted: http://support.microsoft.com/kb/814675
                    Remove-Item $file.Name
                    #Next section is for a different output directory; set up a separate batch file to delete original TIFFs in the middle of the night
                    <#
                    $allargs="cmd200 "+$arg1+$file+";"+$Modified_TIFFs+";"+$arg2+$date+$arg3
                    cmd /c $allargs
                    #>
                }
                else
                {
                    #Else statement to send out an error message if the number of pages differs from the name
                    $subject = "The number of pages in the file ", $file.FullName, "differs from the actual count of ", $count, ". File will not be sent, please correct before tomorrow for processing."
                    $smtp = new-object Net.Mail.SmtpClient($smtpServer)
                    $smtp.Send($emailFrom, $emailTo, $subject, $body)
                }
            } #Close IF/THEN for correct date is in filename
            else
            {
                $subject = "Date portion of filename is incorrect, please fix. File will not be sent to SDS", $file.FullName," ."
                $smtp = new-object Net.Mail.SmtpClient($smtpServer)
                $smtp.Send($emailFrom, $emailTo, $subject, $body)
            }
        } #Close IF/THEN for initial filename check
        else
        {
            $subject = "File does not meet proper naming convention and will not be stamped nor sent to SDS", $file.FullName, " ."
            $smtp = new-object Net.Mail.SmtpClient($smtpServer)
            $smtp.Send($emailFrom, $emailTo, $subject, $body)
        }
    } #Close FOR loop
} #Close Else for check if FILES=NULL

You are building this in the ISE?
You need to add:
add-type -AssemblyName System.Drawing
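In context, the fix looks roughly like this (the sample .tif path is a placeholder): load System.Drawing before the first [System.Drawing.Bitmap] call, since a scheduled console session does not preload the assemblies the ISE does:

```powershell
# Sketch: load System.Drawing explicitly before using Bitmap outside the ISE.
# The .tif path is an illustrative placeholder.
Add-Type -AssemblyName System.Drawing
$i = [System.Drawing.Bitmap]::FromFile('C:\scans\example.tif')
$count = $i.GetFrameCount([System.Drawing.Imaging.FrameDimension]::Page)
$i.Dispose()
```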
¯\_(ツ)_/¯