System.Drawing.Bitmap in a scheduled powershell script

I've written a PowerShell script to date-stamp multipage TIFFs, but before doing so I check that the file name follows the correct format. The file name must contain the date, sequence number, and number of pages. The script works fine when run manually,
but when run from Task Scheduler it fails to query the number of pages in the TIFF. Any ideas why the .NET features wouldn't work from a PowerShell script run as a scheduled task?
I am putting the page count in the variable $count by doing the following:
 $i=[System.Drawing.Bitmap]::FromFile($file.Fullname);$i.GetFrameCount($i.FrameDimensionsList[0]) 
 $count=$i.GetFrameCount([System.Drawing.Imaging.FrameDimension]::Page)
FULL SCRIPT FOLLOWS
#Define the input and output folders and date format
$Original_TIFFs="C:\scans"
$Modified_TIFFs=";\\test\Shared\SDS\"
$date = get-date -Format d
$datename=Get-Date -format yyyyMMdd
Set-Location $Original_TIFFs
#Configure email settings
$emailFrom = "removed"
$emailTo = "removed"
$smtpServer = "removed"
$body = "Rename scanned claims file to the correct format. This email was sent from: ", $env:computername
#Define the location of the TIFF command line executable and its parameters
$200DLL='C:\TiffDLL200Commandline\Cmd200.exe '
$arg1='"FILE='
#Modify arg2 to put the output directory in front of the ; if you don't want to overwrite the current file
#$arg2=';|OW=Yes|BITS=2|TEXT=2;Received Date: '
$arg2=$modified_TIFFs
$arg3=';|BITS=2|TEXT=2;Received Date: '
$arg4='|TEXTOPS=-5;;10;14;"'
$files=Get-ChildItem $Original_TIFFs -Filter *.tif
if ($files -eq $null)
{
  $subject = "No files to process today, directory empty."
  $smtp = new-object Net.Mail.SmtpClient($smtpServer)
  $body = "No files were processed today. This email was sent from: ", $env:computername
  $smtp.Send($emailFrom, $emailTo, $subject, $body)
}
else
{
 foreach ($file in $files)                                                                     #Begin loop to check each file and process
 {
  #Loads subsystems for opening TIFFs and second line puts the number of images into variable
  $i=[System.Drawing.Bitmap]::FromFile($file.Fullname);$i.GetFrameCount($i.FrameDimensionsList[0])
  $count=$i.GetFrameCount([System.Drawing.Imaging.FrameDimension]::Page)
  #If statement checks if filename format is correct
  if ($file -match '^\d{8}\d{3}_H_S_\d+_\d{8}[.tif]{4}$')
  {
   $file.name -match '^(?<date1>\d{8})\d{3}_H_S_(?<page_count>\d+)_(?<date2>\d{8})[.tif]{4}$'  #Regex to put tests in $matches to check against
   if (($matches.date1 -eq $datename) -and ($matches.date2 -eq $datename))                     #Check if filename contains correct date
   {
    if ($count -eq $matches.page_count)                                                        #Check if filename contains the correct page count
    {
     #Insert TIFF modification
     $allargs=$200DLL+$arg1+$file+$arg2+$file+$arg3+$date+$arg4
     cmd /c $allargs
     #cmd /c xcopy $file \\test\shared\SDS                                                     #Deprecated because TiffDLL200 now creates a new file rather than overwriting
     $i.Dispose()                                                                              #Close file stream so file can be deleted: http://support.microsoft.com/kb/814675
     Remove-Item $file.Name
     #Next section is for a different output directory; set up a separate batch file to delete original TIFFs in the middle of the night
     <#
     $allargs="cmd200 "+$arg1+$file+";"+$Modified_TIFFs+";"+$arg2+$date+$arg3
     cmd /c $allargs
     #>
    }
    else                                                                                       #Send an error message if the number of pages differs from the name
    {
     $subject = "The number of pages in the file ", $file.FullName, "differs from the actual count of ", $count, ". File will not be sent, please correct before tomorrow for processing."
     $smtp = new-object Net.Mail.SmtpClient($smtpServer)
     $smtp.Send($emailFrom, $emailTo, $subject, $body)
    }
   }                                                                                           #Close IF/THEN for correct date in filename
   else
   {
    $subject = "Date portion of filename is incorrect, please fix. File will not be sent to SDS", $file.FullName," ."
    $smtp = new-object Net.Mail.SmtpClient($smtpServer)
    $smtp.Send($emailFrom, $emailTo, $subject, $body)
   }
  }                                                                                            #Close IF/THEN for initial filename check
  else
  {
   $subject = "File does not meet proper naming convention and will not be stamped nor sent to SDS", $file.FullName, " ."
   $smtp = new-object Net.Mail.SmtpClient($smtpServer)
   $smtp.Send($emailFrom, $emailTo, $subject, $body)
  }
 }                                                                                             #Close FOREACH loop
}                                                                                              #Close ELSE for check if FILES=NULL

You are building this in the ISE?
You need to add:
add-type -AssemblyName System.Drawing
¯\_(ツ)_/¯
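For completeness, here is a minimal sketch of what the frame-count section can look like once the assembly is loaded explicitly ($file is the loop variable from the existing Get-ChildItem loop; the try/finally Dispose is my addition, not part of the original script):

#Scheduled tasks start a bare PowerShell host, so load System.Drawing explicitly
Add-Type -AssemblyName System.Drawing

$i = [System.Drawing.Bitmap]::FromFile($file.FullName)
try {
    #Number of pages (frames) in the multipage TIFF
    $count = $i.GetFrameCount([System.Drawing.Imaging.FrameDimension]::Page)
}
finally {
    $i.Dispose()   #Release the file handle even if GetFrameCount throws
}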

Similar Messages

  • System.Drawing.Bitmap($_.FullName) fails with "Online only" files stored on OneDrive

    Hi,
    I want to retrieve the DateTaken from JPG files that are stored on my OneDrive. If the file is available offline, the code works fine. However, if the file is "Online only" the code below breaks when creating the New-Object System.Drawing.Bitmap
    and generates the following error: "New-Object : Exception calling ".ctor" with "1" argument(s): "Parameter is not valid."
    Is there any way to work around this problem?
    [reflection.assembly]::loadwithpartialname("System.Drawing") | out-null
    Get-ChildItem ($Path + "\*") -Include @('*.jpg', '*.jpeg') -Force | ForEach-Object {
        if (! $DateTimeArg) {
            $pic = New-Object System.Drawing.Bitmap($_.FullName)
            $ExifDate = $pic.GetPropertyItem(36867)
            $DateTaken = (New-Object System.Text.UTF8Encoding).GetString($ExifDate.Value)
            $DateTime = [datetime]::ParseExact($DateTaken, "yyyy:MM:dd HH:mm:ss`0", $Null)
            $pic.Dispose()
        }
    }

    >> If the file is available offline, the code works fine.
    Right, if the file is available offline, it means the JPG file is stored in your local computer, and consequently, the variable $Path is meaningful, because it refers to a folder in your local computer.
    >> However, if the file is "Online only" the code below breaks when creating the New-Object System.Drawing.Bitmap and generates the following error: "New-Object : Exception calling ".ctor"
    with "1" argument(s): "Parameter is not valid."
    Right, if the file is available online only, it means the JPG file is
    not stored in your local computer, and consequently, the variable $Path is meaningless, because it refers to a folder in your local computer (it doesn't matter which) that
    does not contain the image.
    >> Is there any way to work around this problem?
    Yes: right-click the JPG file in File Explorer and make it available offline.
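    If making everything available offline is not practical, a possible workaround (my own sketch, assuming OneDrive marks online-only placeholders with the Offline file attribute) is to skip those files up front:
    Get-ChildItem ($Path + "\*") -Include @('*.jpg', '*.jpeg') -Force | ForEach-Object {
        #Online-only placeholders carry the Offline attribute; skip them instead of letting the Bitmap constructor fail
        if ($_.Attributes -band [System.IO.FileAttributes]::Offline) {
            Write-Warning "Skipping online-only file: $($_.FullName)"
            return
        }
        $pic = New-Object System.Drawing.Bitmap($_.FullName)
        try {
            $ExifDate  = $pic.GetPropertyItem(36867)
            $DateTaken = (New-Object System.Text.UTF8Encoding).GetString($ExifDate.Value)
            [datetime]::ParseExact($DateTaken, "yyyy:MM:dd HH:mm:ss`0", $Null)
        }
        finally {
            $pic.Dispose()
        }
    }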

  • System.Drawing.Image to System.Drawing.Bitmap

    How can I convert a System.Drawing.Image to a System.Drawing.Bitmap?

    Yes, I tried, but it doesn't work.
    I'm trying to do it a different way but I'm getting an error;
    see the attached VI...
    Attachments:
    image to bitmap.vi (9 KB)
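    For what it's worth, in .NET this is usually a plain copy construction, since System.Drawing.Bitmap has a constructor that accepts a System.Drawing.Image. A PowerShell sketch (the file path is only an example):
    Add-Type -AssemblyName System.Drawing
    $image  = [System.Drawing.Image]::FromFile('C:\temp\example.jpg')   #example path
    $bitmap = New-Object System.Drawing.Bitmap($image)                  #Bitmap(Image) copy constructor
    $image.Dispose()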

  • What is the best way to run a powershell script with parameters in the Task Scheduler?

    Hello, 
    I want to run the following from a scheduled task in the Task Scheduler on a server. What is the best approach?
    .\pscript.ps1 -csvfile "\\Srv1\Share\File.txt"
    Thanks for your help! SdeDot

    Hi,
    To run a powershell script with parameters in the Task Scheduler:
    Program: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
    Add argument (optional): -Command "& c:\scripts\test.ps1 -par1 2 -par2 3"
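    Applied to the original question, the action might look like this (the script folder is only an example; adjust it to wherever pscript.ps1 actually lives):
    Program/script:           C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
    Add arguments (optional): -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\pscript.ps1" -csvfile "\\Srv1\Share\File.txt"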
    Hope the two articles below are helpful for you:
    Schedule PowerShell Scripts that Require Input Values
    https://blogs.technet.com/b/heyscriptingguy/archive/2011/01/12/schedule-powershell-scripts-that-require-input-values.aspx
    How to Schedule a PowerShell Script
    http://dmitrysotnikov.wordpress.com/2011/02/03/how-to-schedule-a-powershell-script/
    Regards,
    Yan Li

  • Powershell script not running in the task scheduler...

    I've created a .ps1 script to transfer a file using WinSCP and can run it in the ISE environment, in the PS window, and with the Run command. I've transferred the command I used in the Run prompt to the Task Scheduler but it is not running. It runs everywhere
    else, just not in the scheduler. It says that it completes okay and gives a return code of OpCode=2.
    The action is set to run this: c:\Windows\System32\WindowsPowerShell\v1.0\Powershell.exe
    The Arguments: -ExecutionPolicy Bypass -file "C:\Users\me\scriptsWCP\FileTransferPS.ps1"
    Also have it running with the highest permission and as SYSTEM

    Hi,
    To run a powershell script with parameters in the Task Scheduler:
    Program: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
    Add argument (optional): -Command "& c:\scripts\test.ps1 -par1 2 -par2 3"
    Hope the two articles below are helpful for you:
    Schedule PowerShell Scripts that Require Input Values
    https://blogs.technet.com/b/heyscriptingguy/archive/2011/01/12/schedule-powershell-scripts-that-require-input-values.aspx
    How to Schedule a PowerShell Script
    http://dmitrysotnikov.wordpress.com/2011/02/03/how-to-schedule-a-powershell-script/
    Regards,
    Yan Li

  • Help modifying a powershell script

    Hello,
    I have recently been given a task to write/find a script that is capable of performing Full and Incremental backups. I found a script that does exactly what I need, however, it requires user input. I need this to be a scheduled task and therefore I need
    the input to be a static path. Here is the script I am talking about:
    #region Params
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})] 
    [System.String]
    $SourceDir,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})] 
    [System.String]
    $DestDir,
    [Parameter(Position=2, Mandatory=$false,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})] 
    [System.String]
    $HashPath,
    [Parameter(Position=3, Mandatory=$false,ValueFromPipeline=$false)]
    [ValidateSet("Full","Incremental","Differential")] 
    [System.String]
    $BackupType="Full",
    [Parameter(Position=4, Mandatory=$false,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()]  
    [System.String]
    $LogFile=".\Backup-Files.log",
    [Parameter(Position=5, Mandatory=$false,ValueFromPipeline=$false)]
    [System.Management.Automation.SwitchParameter]
    $SwitchToFull
    #endregion 
    begin{
    function Write-Log
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $Message,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$true)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $LogFile
    #endregion
    try{
    Write-Host $Message
    Out-File -InputObject $Message -Append $LogFile
    catch {throw $_}
    function Get-Hash 
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$true)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $HashTarget,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateSet("File","String")]
    [System.String]
    $HashType
    #endregion
    begin{
    try{ $objGetHashMD5 = New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider } 
    catch {throw $_ }
    process{
    try {
    #Checking hash target is file or just string
    switch($HashType){
    "String" {
    $objGetHashUtf8 = New-Object -TypeName System.Text.UTF8Encoding
    $arrayGetHashHash = $objGetHashMD5.ComputeHash($objGetHashUtf8.GetBytes($HashTarget.ToUpper()))
    break
    "File" {
    $arrayGetHashHash = $objGetHashMD5.ComputeHash([System.IO.File]::ReadAllBytes($HashTarget))
    break
    #Return hash
    Write-Output $([System.Convert]::ToBase64String($arrayGetHashHash))
    catch { throw $_ }
    function Copy-File
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Any'})] 
    [System.String]
    $SourceFile,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()] 
    [System.String]
    $DestFile
    #endregion
    try{
    #The script fails when folder being copied to file. So the item will be removed to avoid the error.
    if(Test-Path -LiteralPath $DestFile -PathType Any){
    Remove-Item -LiteralPath $DestFile -Force -Recurse
    #Creating destination if doesn't exist. It's required because Copy-Item doesn't create destination folder
    if(Test-Path -LiteralPath $SourceFile -PathType Leaf){
    New-Item -ItemType "File" -Path $DestFile -Force
    #Copying file to destination directory
    Copy-Item -LiteralPath $SourceFile -Destination $DestFile -Force
    catch{ throw $_ }
    function Backup-Files
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})] 
    [System.String]
    $SourceDir,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()] 
    [System.String]
    $DestDir,
    [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNull()] 
    [System.Collections.Hashtable]
    $HashTable
    #endregion
    try{
    $xmlBackupFilesHashFile = $HashTable
    Write-Host "Backup started" 
    Get-ChildItem -Recurse -Path $SourceDir|ForEach-Object{
    $currentBackupFilesItem = $_
    #Full path to source and destination item
    $strBackupFilesSourceFullPath = $currentBackupFilesItem.FullName
    $strBackupFilesDestFullPath = $currentBackupFilesItem.FullName.Replace($SourceDir,$DestDir)
    #Checking that the current item is file and not directory. True - the item is file. 
    $bBackupFilesFile = $($($currentBackupFilesItem.Attributes -band [System.IO.FileAttributes]::Directory) -ne [System.IO.FileAttributes]::Directory)
    Write-Host -NoNewline ">>>Processing item $strBackupFilesSourceFullPath..."
    #Generating path hash
    $hashBackupFilesPath = $(Get-Hash -HashTarget $strBackupFilesSourceFullPath -HashType "String")
    $hashBackupFilesFile = "d"
    #If the item is file then generate hash for file content
    if($bBackupFilesFile){
    $hashBackupFilesFile = $(Get-Hash -HashTarget $strBackupFilesSourceFullPath -HashType "File")
    #Checking that the file has been copied
    if($xmlBackupFilesHashFile[$hashBackupFilesPath] -ne $hashBackupFilesFile){
    Write-Host -NoNewline $("hash changed=>$hashBackupFilesFile...")
    Copy-File -SourceFile $strBackupFilesSourceFullPath $strBackupFilesDestFullPath|Out-Null
    #Returning result
    Write-Output @{$hashBackupFilesPath=$hashBackupFilesFile}
    else{
    Write-Host -NoNewline "not changed..."
    Write-Host "done"
    Write-Host "Backup completed"
    catch { throw $_ }
    function Backup-Full
    [CmdletBinding()]
    [OutputType([System.String])]
    #region Params
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})] 
    [System.String]
    $SourceDir,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()] 
    [System.String]
    $DestDir,
    [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()] 
    [System.String]
    $HashFile,
    [Parameter(Position=3, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()]  
    [System.String]
    $ChainKey
    #endregion
    try{
    #Creating an empty hash table
    $xmlBackupFullHashFile = @{}
    #Starting directory lookup 
    $uintBackupFullCount = 0
    Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$ChainKey\Full_" + $(Get-Date -Format "ddMMyyyy")) -HashTable $xmlBackupFullHashFile|`
    ForEach-Object{ 
    $xmlBackupFullHashFile.Add([string]$_.Keys,[string]$_.Values) 
    $uintBackupFullCount++
    #Saving chain key.
    $xmlBackupFullHashFile.Add("ChainKey",$ChainKey)
    Write-Host -NoNewline "Saving XML file to $HashFile..."
    Export-Clixml -Path $HashFile -InputObject $xmlBackupFullHashFile -Force
    Write-Host "done"
    Write-Output $uintBackupFullCount
    catch { throw $_ }
    function Backup-Diff
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})] 
    [System.String]
    $SourceDir,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()] 
    [System.String]
    $DestDir,
    [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'leaf'})]
    [System.String]
    $HashFile
    #endregion
    try{
    #Loading hash table
    $xmlBackupDiffHashFile = Import-Clixml $HashFile
    $chainKeyBackupDiffDifferential = $xmlBackupDiffHashFile["ChainKey"]
    $uintBackupDiffCount = 0
    #Starting directory lookup 
    Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$chainKeyBackupDiffDifferential\Differential_" + $(Get-Date -Format "ddMMyyyy.HHmm")) -HashTable $xmlBackupDiffHashFile|`
    ForEach-Object{ $uintBackupDiffCount++ }
    Write-Output $uintBackupDiffCount
    catch { throw $_ }
    function Backup-Inc
    #region Params
    [CmdletBinding()]
    [OutputType([System.String])]
    param(
    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'Container'})] 
    [System.String]
    $SourceDir,
    [Parameter(Position=1, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateNotNullOrEmpty()] 
    [System.String]
    $DestDir,
    [Parameter(Position=2, Mandatory=$true,ValueFromPipeline=$false)]
    [ValidateScript({Test-Path -LiteralPath $_ -PathType 'leaf'})]
    [System.String]
    $HashFile
    #endregion
    try{
    #Loading hash table
    $xmlBackupIncHashFile = Import-Clixml $HashFile
    $chainKeyBackupIncIncremental = $xmlBackupIncHashFile["ChainKey"]
    $uintBackupIncCount = 0
    #Starting directory lookup 
    Backup-Files -SourceDir $SourceDir -DestDir $("$DestDir\$chainKeyBackupIncIncremental\Incremental_" + $(Get-Date -Format "ddMMyyyy.HHmm")) -HashTable $xmlBackupIncHashFile|`
    ForEach-Object{ 
    $xmlBackupIncHashFile[[string]$_.Keys]=[string]$_.Values
    $uintBackupIncCount++
    Write-Host -NoNewline "Saving XML file to $HashFile..."
    Export-Clixml -Path $HashFile -InputObject $xmlBackupIncHashFile -Force
    Write-Host "Done"
    Write-Output $uintBackupIncCount
    catch { throw $_ }
    #0 - is OK. 1 - some error
    $exitValue=0
    process{
    try{
    $filesCopied=0
    $strSourceFolderName = $(Get-Item $SourceDir).Name
    $strHasFile = $("$HashPath\Hash_$strSourceFolderName.xml")
    $strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir started")
    #Automatically switch to full backup
    $bSwitch = $(!$(Test-Path -LiteralPath $strHasFile -PathType "Leaf") -and $SwitchToFull)
    Write-Log -Message $strMessage -LogFile $LogFile
    switch($true){
    $($BackupType -eq "Full" -or $bSwitch) {
    $filesCopied = Backup-Full -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHasFile -ChainKey $("Backup_$strSourceFolderName" + "_" + $(Get-Date -Format "ddMMyyyy"))
    break
    $($BackupType -eq "Incremental") {
    $filesCopied = Backup-Inc -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHasFile 
    break
    $($BackupType -eq "Differential") {
    $filesCopied = Backup-Diff -SourceDir $SourceDir -DestDir $DestDir -HashFile $strHasFile 
    break
    $strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir completed successfully. $filesCopied items were copied.")
    Write-Log -Message $strMessage -LogFile $LogFile
    Write-Output $filesCopied
    catch { 
    $strMessage = $($(Get-Date -Format "HH:mm_dd.MM.yyyy;") + "$BackupType backup of $SourceDir failed:" + $_)
    Write-Log -Message $strMessage -LogFile $LogFile
    $exitValue = 1
    end{exit $exitValue}
    I have some experience writing Powershell scripts, but I am lost at how this script prompts for Source and Destination paths. I tried modifying the Param section, but this didn't work, and up until now I thought the only way you could get a prompt was with
    "read-host". Any and all education on this matter would be greatly appreciated. (Side note: I have posted this question on the forum in which I found it and have not gotten an answer yet.)

    Hi Ryan Blaeholder,
    Thanks for your posting.
    To schedule a PowerShell script with input values, instead of modifying the script above, you can also add the input when creating the scheduled task, like this (save the script above as D:\backup.ps1):
    -command "& 'D:\backup.ps1' 'input1' 'input2'"
    For more detailed information, please refer to this article:
    Schedule PowerShell Scripts that Require Input Values:
    http://blogs.technet.com/b/heyscriptingguy/archive/2011/01/12/schedule-powershell-scripts-that-require-input-values.aspx
    I hope this helps.

  • I'm trying to run a scheduled task as system, but it will not run my powershell script

    I have a powershell script I created that uses credentials within the script to connect to a network resource to copy files every night. It works fine if I just run the script manually. but I need to run this as a scheduled task, preferably as system. I
    have checked "run with highest privileges" still no luck. 
    If I change the user to my account it works fine, but I need it to work without the use of my account. Is this some kind of security policy problem or what would cause this? I figure having the credentials within the script would work, but not so much.
    Paul Arbogast

    Is the Execution Policy set for the script?
    From the information given, that sounds like the likely cause.
    Take a look @
    http://technet.microsoft.com/en-us/library/ee176961.aspx
    Best Wishes,

  • Run Powershell script from Scheduled Task as "NT Authority \ SYSTEM"

    Hello, dear Colleagues.
    Cannot make a PowerShell script run from a Scheduled Task as "NT AUTHORITY\SYSTEM".
    Action: Start a program - 
    C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -command "C:\script.ps1"
    The thing is, the script itself works; moreover, if the task is run with a domain account it works too.
    Checked "Run with highest privileges", changed the "Configure for" field, tried different arguments (-NoProfile, -NoExit, -ExecutionPolicy Bypass, -Command, -File) - no luck.
    Has anyone managed to make it work with the SYSTEM account?
    Thanks.

    Hi fapq,
    Try this link task schedulers
    Note
    To identify tasks that run with system permissions, use a verbose query (/query/v). In a verbose query display of a system-run task, the Run As User field has a value of NT AUTHORITY\SYSTEM and
    the Logon Mode field has a value of Background only.
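    For example (the task name is only a placeholder):
    schtasks /Query /TN "\MyScriptTask" /V /FO LIST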
    Naveen Basati

  • How can I setup a scheduled task to run a Powershell Script delivered as a Group Policy Preference

    I have a Powershell script I want to run only once when a user logs onto their system. This script would move all the PST files from the Local drive and the Home drive to a folder location within the user's profile. I wanted to run this as a Windows 7 Scheduled Task using Group Policy Preferences. How can I get this to happen short of a logon script? I have updated all the machines to WMF 4.0 so could I use a Scheduled Job instead? I wanted to run the script as the logon user but elevated.
    #Start Outlook and Disconnect attached PST files.
    $Outlook = New-Object -ComObject Outlook.Application
    $namespace = $outlook.getnamespace("MAPI")
    $folder = $namespace.GetDefaultFolder("olFolderInbox")
    $explorer = $folder.GetExplorer()
    $explorer.Display()
    $myArray= @()
    $outlook.Session.Stores | where{ ($_.FilePath -like'*.PST') } | foreach{[array]$myArray+= $_.FilePath}
    for ($x = 0; $x -le $myArray.Length - 1; $x++)
    {
        $PSTPath = $myArray[$x]
        $PST = $namespace.Stores | ?{ $_.FilePath -like $PSTPath }
        $PSTRoot = $PST.GetRootFolder()                       #Get Root Folder name of PST
        $PSTFolder = $Namespace.Folders.Item($PSTRoot.Name)   #Bind to PST for disconnection
        $Namespace.GetType().InvokeMember('RemoveStore',[System.Reflection.BindingFlags]::InvokeMethod,$null,$Namespace,($PSTFolder)) #Disconnect .PST
    }
    #Move All PST files to the default location while deleting the PST files from their original location.
    $SourceList = ("$env:SystemDrive", "$env:HOMEDRIVE")
    $Destination = ("$env:USERPROFILE\MyOutlookFiles")
    (Get-ChildItem -Path $SourceList -Recurse -Filter *.PST) | Move-Item -Destination $Destination
    #Attach all PST files from the default location.
    Add-type -assembly "Microsoft.Office.Interop.Outlook" | out-null
    $outlook = new-object -comobject outlook.application
    $namespace = $outlook.GetNameSpace("MAPI")
    dir "$env:USERPROFILE\MyOutlookFiles\*.pst" | % { $namespace.AddStore($_.FullName) }

    Mike,
    I do not understand what appears to be a regular expression above. I did add the PowerShell script to the HKCU RunOnce Key as suggested.
    Windows Registry Editor Version 5.00
    C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe -noprofile -sta -WindowStyle Hidden -ExecutionPolicy RemoteSigned -File "C:\scripts\Windows PowerShell\Move-PST.ps1"
     I'm delivering this using Group Policy Preferences. It seems to fail or time out when run because the behavior is different if I run the script from within the PowerShell IDE. I added the parameters to the script and will try it again in the morning.

  • Workflow does not start when PowerShell Script is run from Task Scheduler

    I have a PS script that updates an item in a SP2010 list so that a workflow will be started.  When I run the PS script manually from the PS window on the server it resides, the script runs flawlessly.  If I set a scheduled task on the same server
    to run the script with the same credentials as are being used in the PS window, the script runs, updated the info on the list, but DOES NOT start the workflow.  As we know, SPD workflows cannot be started by anonymous or system accounts.  It is as
    if the task scheduler adds a bit of information that makes the SP2010 list think the information was updated by one of these accounts even though the field in the list for the item being updated shows the correct account.
    HELP!!!
    D

    Hi,
    The issue might be related to the script or to the scheduled task settings you configured. You'd better check the settings.
    For example, which option did you set to run the scheduled task? You may select "Run whether user is logged on or not" instead of "Run only when user is logged on", as shown in this article:
    http://blog.pointbeyond.com/2010/04/23/run-powershell-script-using-windows-server-2008-task-scheduler/
    Hope it helps.
    Best Regards,
    Sally Tang

  • Executing powershell scripts via Task Scheduler

    Hi,
    I have a powershell script that I wrote that when executed from the shell works fine but when executed from task scheduler does not work.
    In my script, an email is sent out based on the results of the execution. When I run this from the shell, the email goes out; when scheduled, no email is sent and there is no indication of errors having occurred anywhere in the system.
    Has anyone run into a similar issue?
    I did change my powershell execution policy to be unrestricted (both in the x86 and x64 consoles).  I am running Windows 2008 R2.
    Thanks - Greg.

    Hi,
    In addition to the above suggestions, please also refer to the threads below:
    Using Task Scheduler for a powershell script on server 2008
    http://social.technet.microsoft.com/Forums/en-US/ITCG/thread/e298d613-47b8-4492-92d1-0b55cc8497c1
     Using Windows Task Scheduler to execute Powershell Script frequently
    http://social.technet.microsoft.com/Forums/en-US/winserverpowershell/thread/5901a6ad-ba18-4817-82a9-f75d2d6b439f
    Hope this helps.
    Best Regards,
    Yan Li
    TechNet Community Support

  • Schedule a powershell script

    Hi all,
    I have a problem scheduling a Powershell script using a batch file:
    I've tried to do it with these two methods:
    powershell -c ". 'C:\Program Files\Microsoft\Exchange Server\V14\bin\RemoteExchange.ps1'"
    C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe "'C:\Program Files\Microsoft\Exchange Server\V14\bin\RemoteExchange.ps1"
    The job works, but inside the script there is this instruction:
    $Path = Get-Location
    This is because the script creates a Log folder in the same location as the PS1 file.
    Now, when the script is scheduled using one of the methods listed, it creates the Log folder in:
    C:\Windows\System32
    How can I prevent this and make the script create the folder in the same location as the PS1 file?
    I hope this explains my problem.
    Thanks

    If you're using Windows PowerShell 3.0, then you can take advantage of the automatic variable $PSScriptRoot to determine the location of the script. For instance, if I have a script that runs from C:\Scripts\myscript.ps1 and I want my script to create a
    text file in the same location, then I do not need to hard code this value. I just use $PSScriptRoot, such as: 
    Set-Content -Path "$PSScriptRoot\logfile.txt" -Value $Value
    This would create a text file called logfile.txt in C:\Scripts.
    Since you're working with Exchange (and possibly Windows PowerShell 2.0) you may need to stick with the $MyInvocation automatic variable and the Split-Path cmdlet, such as:
    Split-Path $MyInvocation.MyCommand.Path
    Test these outside of your script so you can learn a little about them, just be sure you put them into their own
    saved .ps1 file. They will not return anything if you throw them into the ISE or shell, unless they are part of a .ps1 file.
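    As a sketch of how that could look at the top of the scheduled Exchange script (assuming PowerShell 2.0, where $PSScriptRoot is not available):
    $scriptDir = Split-Path -Parent $MyInvocation.MyCommand.Path   #folder containing this .ps1
    $logDir    = Join-Path $scriptDir 'Log'
    if (-not (Test-Path $logDir)) { New-Item -ItemType Directory -Path $logDir | Out-Null }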
    Edit: Added script blocks for readability.

  • Scheduling a Powershell script as a batch job

    I need to run a PowerShell script as a batch job to be executed every 5 minutes on a Windows Server 2012 R2 server.
    I want to use a scheduled task.
    I read on the web different ways to solve the problem but I was unable to make them work.
    What is the simplest way to run a PowerShell file as a scheduled job?
    Regards
    Mario

    Hi Mario,
    The quickest method is to write your script and save it on the server. Then create a new scheduled task like so:
    Actions -
    Start a program
    Program/script : C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
    Add arguments : -File C:\path\to\script.ps1
    That's all you need to do for simple scripts (make sure your execution policy is set right too).
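    If you prefer to create the task from the command line, something like this should give the 5-minute repetition (the task name is only an example):
    schtasks /Create /TN "Every5MinScript" /SC MINUTE /MO 5 /TR "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\path\to\script.ps1"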

  • Unable to Configure a PowerShell Script to Run as a Scheduled Task

    Hi ,
    I have a powershell script from Dieter's blog http://scug.be/dieter/2011/07/06/scom-2007-how-to-backup-your-unsealed-management-packs/#comment-1058 which backs up the unsealed management packs in SCOM. I am able to run the script successfully at the PowerShell
    prompt, but when I schedule it as a task it fails with an error code 1.
    "Task Scheduler successfully completed task "\Backup Unsealed Management Pack" , instance "{xxxxxxxxxxxxxxxxxxxxxxxxxxxx}" , action "Powershell.exe" with return
    code 1."
    I am running the task using the highest privileges and have tried bypassing the execution policy, configured as per http://www.metalogix.com/help/Content%20Matrix%20Console/SharePoint%20Edition/002_HowTo/004_SharePointActions/012_SchedulingPowerShell.htm, using:
    -ExecutionPolicy Bypass c:\scripts\myscript.ps1
    Still i am unable to run the task without errors. Kindly assist.
    Jesty

    Hi Jesty,
    Please make sure you have followed these steps to schedule a powershell script task:
    1.  Save the powershell script as .ps1 in local computer.
    2.  Right click the .ps1 file and run with powershell and check if the script can run successfully.
    3.  schedule task->action:Start a program->Program/script:powershell.exe->Add argument:-ExecutionPolicy Bypass c:\scripts\myscript.ps1
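    If the task still returns code 1, one more thing worth trying (my suggestion, not from the original thread) is pointing the task at the script with -File, which avoids the path being re-parsed as a command:
    powershell.exe -NoProfile -ExecutionPolicy Bypass -File "c:\scripts\myscript.ps1"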
    Best Regards,
    Anna

  • PowerShell script doesn't appear to work as scheduled task in sharepoint 2013

    PowerShell script doesn't appear to work as a scheduled task in SharePoint 2013; it works with normal manual execution.
    MCTS Sharepoint 2010, MCAD dotnet, MCPDEA, SharePoint Lead

    Hi,
    To run a PowerShell script as a scheduled task in SharePoint 2013, you can try the demo below:
    http://blogs.technet.com/b/meamcs/archive/2013/02/23/sharepoint-2013-backup-with-powershell-and-task-scheduler-for-beginners.aspx
    Thanks
    Patrick Liang
    TechNet Community Support
