PowerShell: get scheduled tasks on a remote computer

Hello guys!
I couldn't find any answer or question about this. I need to query scheduled tasks on a server (2008 R2) whose names contain variables.
I can get the list of tasks, but I don't know how to display only the ones I need. For example, there is a task named $something.$RandomText.$RandomNumbers, so I'm trying to filter tasks by name ($something).
The script I found is:
$sched = New-Object -ComObject "Schedule.Service"
$sched.Connect('remote server')
$out = @()
$sched.GetFolder("\").GetTasks(0) | % {
    $xml = [xml]$_.Xml
    $out += New-Object psobject -Property @{
        "Name"          = $_.Name
        "Status"        = switch ($_.State) {0 {"Unknown"} 1 {"Disabled"} 2 {"Queued"} 3 {"Ready"} 4 {"Running"}}
        "NextRunTime"   = $_.NextRunTime
        "LastRunTime"   = $_.LastRunTime
        "LastRunResult" = $_.LastTaskResult
        "Author"        = $xml.Task.Principals.Principal.UserId
        "Created"       = $xml.Task.RegistrationInfo.Date
    }
}
And one more thing: I'm using Windows 7, so I can't use the Get-ScheduledTask cmdlet.
Maybe there is some simpler way to do this?
Best regards,
Ronald

Just look at the values in $out and check the Name property; the name may differ. No results will be produced if it's not equal to your $something.
$sched = New-Object -ComObject "Schedule.Service"
$sched.Connect($env:ComputerName)
$something = 'GoogleUpdateTaskMachineCore'
$out = @()
$sched.GetFolder("\").GetTasks(0) | % {
    $xml = [xml]$_.Xml
    $out += New-Object psobject -Property @{
        "Name"          = $_.Name
        "Status"        = switch ($_.State) {0 {"Unknown"} 1 {"Disabled"} 2 {"Queued"} 3 {"Ready"} 4 {"Running"}}
        "NextRunTime"   = $_.NextRunTime
        "LastRunTime"   = $_.LastRunTime
        "LastRunResult" = $_.LastTaskResult
        "Author"        = $xml.Task.Principals.Principal.UserId
        "Created"       = $xml.Task.RegistrationInfo.Date
    }
}
$out | ? {$_.Name -eq "$something"}
Regards Chen V [MCTS SharePoint 2010]
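
Since the full task name is $something.$RandomText.$RandomNumbers, an exact -eq comparison will only match if $something is the entire name. A wildcard or regex match against $out is probably what's wanted here (a small sketch, building on the script above):

```powershell
# Match tasks whose name starts with $something...
$out | Where-Object { $_.Name -like "$something*" }

# ...or tasks whose name contains it anywhere (escaped for regex safety):
$out | Where-Object { $_.Name -match [regex]::Escape($something) }
```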

Similar Messages

  • What permissions are required to see scheduled tasks on a remote server using SCHTASKS /S {server}?

    When I attempt to run SCHTASKS /S {server} some servers return:
    ERROR: Access is denied.
On servers where I am a member of the local Administrators group, I can run this command successfully. On many servers it is not appropriate for me to be a local Administrator, but it would be fine for me to monitor the status of scheduled tasks on the server.
What permissions need to be granted to me to see the scheduled tasks without being a local administrator? Is this even possible?
    Thanks,
    Matthew

Hi Matthew,
Based on my research, any user can schedule a task on the local computer, and they can view and change the tasks that they scheduled; however, to schedule, view, or change a task on a remote computer, we must be a member of the Administrators group on the remote computer.
Therefore, regarding "What permissions need to be granted to me to see the scheduled tasks and not be a local administrator? Is this even possible?": it is not possible.
    More information for you:
    Schtasks
    http://technet.microsoft.com/en-us/library/cc725744(WS.10).aspx
    Best Regards,
    Amy
    Please remember to mark the replies as answers if they help and un-mark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact [email protected]

  • Get Operating System info for remote computer?

I'm trying to figure out how to pass credentials to this command to get the operating system of a remote computer.
get-wmiObject win32_operatingsystem -Credential domain\username
Does anyone know how to include the password as well? You can see in my picture below that it prompts. If I enter the password in the prompt, the command works fine.
Any suggestions? Thanks

    Hi Romatio,
    where there is a will, there is a script:
    $pwd = ConvertTo-SecureString "password" -AsPlainText -Force
    $user = "Domain\User"
    $cred = New-Object System.Management.Automation.PSCredential($user, $pwd)
    Make sure you store the script securely though, the password is readable in plain text after all.
    Cheers,
    Fred
    Edit: Just in case: Use the $cred variable then for the -Credential Parameter
    There's no place like 127.0.0.1
You can also store the password securely this way:
http://social.technet.microsoft.com/wiki/contents/articles/4546.working-with-passwords-secure-strings-and-credentials-in-windows-powershell.aspx
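
If storing a plain-text password in the script is a concern, one commonly used alternative (a sketch; the file path and server name are placeholders) is to cache the credential with Export-Clixml, which protects it via DPAPI so only the same user on the same machine can decrypt it:

```powershell
# One-time step: prompt for the credential and cache it encrypted on disk
Get-Credential | Export-Clixml -Path 'C:\scripts\cred.xml'

# In the script that runs later: load it back and use it
$cred = Import-Clixml -Path 'C:\scripts\cred.xml'
Get-WmiObject Win32_OperatingSystem -ComputerName 'Server01' -Credential $cred
```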

  • List all running scheduled tasks on local/remote machine

    Hi All,
I'm in need of a PS script that can list all the active/running scheduled tasks on either the local or a remote machine.
    Please share if you have one readily available.
    Thanks in advance!!!
    Pavan

    Hi Pavan,
    Have you checked the script repository? That's a good place to find prewritten scripts:
    http://gallery.technet.microsoft.com/scriptcenter
    Don't retire TechNet! -
    (Don't give up yet - 13,085+ strong and growing)
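
For reference, a minimal sketch that lists only running tasks via the Schedule.Service COM object (usable from Windows 7, where Get-ScheduledTask is not available; $computer is a placeholder, and this enumerates only the root task folder):

```powershell
$computer = $env:COMPUTERNAME
$sched = New-Object -ComObject 'Schedule.Service'
$sched.Connect($computer)
$sched.GetFolder('\').GetTasks(0) |
    Where-Object { $_.State -eq 4 } |   # State 4 = Running
    Select-Object Name, Path, LastRunTime
```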

  • Getting Worklist Tasks in a Remote Server

Is it possible to build an application that runs in a remote container and references the BPEL container to get the user tasks assigned to a user?
How should I write the API to get the user tasks from a remote BPEL PM?

Refer to the remote API section in the "Worklist Application" chapter of the BPEL Developer's Guide.

  • Cannot get scheduled task to run

    Hi Folks,
Googled a lot for this but can't find the answer. Sorry to have to post.
When I browse to the file it works fine, but it won't execute from the ColdFusion Administrator. Can anyone help?
I've checked permissions and they seem to be OK.
Any help much appreciated.
Thank you

Start by checking the "Output to file" checkbox, then indicate a text file to write to.
Schedule the task for some point in the next five minutes.
Wait for the time to pass.
Read the text file that was created.
Often it will reveal that an error occurred, and say what type of error.
As for permissions, are you checking the permissions in your web server (IIS?) or in your Windows folder?
You need to go to the web server interface and make sure the file in question is allowed to be run by whatever account is used to start the ColdFusion service, since THAT is the user as far as the execution of this file is concerned.

  • Alternatives to PowerShell automation of Excel (Scheduled Task) in a Windows Server 2012 R2 world

    I have a handful of PowerShell scripts that ran as Scheduled Tasks on an old Windows 2008 SP2 server, which output reports as Excel workbooks with multiple worksheets, via a COM object. After migrating these scripts to a new Windows 2012 R2 server,
    these scripts no longer function. The Server 2008 system was running Office 2007, and the new Server 2012 R2 server has Office 2013 installed.
    What is everyone using on their Server 2012 R2 systems to automate generation of multi-sheet reports in Excel via PowerShell and Scheduled Tasks?
I know that Microsoft has said that automating Office applications server-side in non-interactive mode is unsupported, but it has worked in the past, until now. (https://support.microsoft.com/kb/257757)
    I've tried the OpenXML PowerTools for PowerShell (https://powertools.codeplex.com/), but Export-OpenXMLSpreadsheet will only export a single worksheet, and they don't support adding worksheets to existing files
(am I missing something?). I understand the OpenXML PowerTools can be extended in C#, but I don't know C#. The PowerShell cmdlets were released in Jan 2012, but don't look to have been updated since then (updates have been to the core C# code).
    I've seen recommendations to add C:\Windows\(System32 or SysWOW64)\config\systemprofile\Desktop, but this hasn't worked on Server 2012.
    Has anyone been able to get Excel 2013 to cooperate on Server 2012 R2, or come up with a suitable alternative? I'd rather not start running these scripts from my workstation.
    EDIT:
    I'm not necessarily looking for an Excel answer, since there are OpenXML ways of doing things now without the Excel application.
    Here are some of the errors I'm getting when approaching this from different directions. When I use:
$excel = New-Object -ComObject Excel.Application
$workbook = $excel.Workbooks.Add()
    Results in error:
    Exception calling "Add" with "0" argument(s): "Microsoft Excel cannot open or
    save any more documents because there is not enough available memory or disk
    space.
    • To make more memory available, close workbooks or programs you no longer
    need.
    • To free disk space, delete files you no longer need from the disk you are
    saving to."
    At C:\path\to\script.ps1:21 char:2
    + $workbook = $excel.Workbooks.Add()
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : ComMethodTargetInvocation
    The server is definitely not lacking for resources.
    I can copy a blank XLSX file and use .Open($xlFile) instead of .Add(), but then when I go to open a CSV file to copy/paste the content to a worksheet, I get this one:
    Exception calling "Open" with "1" argument(s): "The server threw an exception.
    (Exception from HRESULT: 0x80010105 (RPC_E_SERVERFAULT))"
    At C:\path\to\script.ps1:65 char:2
    + $tempcsv = $excel.Workbooks.Open($CSVFile)
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : ComMethodTargetInvocation
    Things that I've tried so far, with no success:
    I created C:\Windows\System32\config\systemprofile\Desktop and C:\Windows\SysWOW64\config\systemprofile\Desktop as suggested here:
    http://social.technet.microsoft.com/Forums/windowsserver/en-US/0751119d-84d5-4a77-8240-1c4802f97375/powershell-scheduled-tasks-wont-start-excel?forum=winserverpowershell
    Add Local Launch and Local Activation permissions to the Microsoft Excel Application DCOM Config, Grant "Read & Execute, List folder contents, Read" permissions on the \config\systemprofile\Desktop folder. Grant "Modify, Read & Execute,
    List Folder Content, Read, Write" permissions for the account on the following folders:
    \config\systemprofile\AppData\Roaming\Microsoft
    \config\systemprofile\AppData\Local\Microsoft
    as suggested here:
    http://social.technet.microsoft.com/Forums/windowsserver/en-US/aede572b-4c1f-4729-bc9d-899fed5fad02/run-powershell-script-as-scheduled-task-that-uses-excel-com-object?forum=winserverpowershell
    @ jrv - I will try posting a similar question in the Excel 2013 forum, but since I'm looking for a scripting solution to produce an OpenXML spreadsheet, and not something that necessarily uses the Excel application (though such a solution would not be turned
    away), I thought this the more appropriate venue.
    (to reiterate from OP) I have seen and acknowledge Microsoft's statements regarding the unsupported nature of automating Office applications in non-interactive environments, but the truth of the matter is, supported or not, it worked in the past (Server
    2008), it's what I inherited from my predecessor, and I know I'm not the only one who has been using Excel in this way. I'm only asking fellow scripters, some of whom must also be using Excel in this "unsupported" fashion, how they are automating
    creation of their spreadsheet reports after moving to Server 2012. A different system modification to make the Excel comObject continue working as before? Direct manipulation of the OpenXML document? Other solutions that may or may not require launching the
    Excel application?

#1 - Ask in the Excel 2013 forum.
#2 - Microsoft has repeatedly noted that this is NOT a supported configuration for Office products.
#3 - What errors are you getting?
#4 - What have you done to debug this?
#5 - No one can be of much help unless you post a very simple example of how this fails.
    ¯\_(ツ)_/¯

  • Executing powershell script from remote computer using RSAT

    Hi.
I want to execute a PowerShell script on the AD server from a remote computer (in the same domain). I installed and tested RSAT; it is working fine. But I can't execute PS from C# code.
    ps.Commands.AddCommand("Import-Module").AddArgument("ActiveDirectory");
    ps.Invoke();
    ps.Commands.AddCommand("Get-ADOrganizationalUnit -Filter 'Name -like \"*\"'");
    var res = ps.Invoke();
And I get an exception:
    An unhandled exception of type 'System.Management.Automation.CommandNotFoundException' occurred in System.Management.Automation.dll
    Additional information: The term 'Get-SBNamespace' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

Hi CapitanPlanet,
The CommandNotFoundException means the command cannot be found.
On the other hand, your issue is about PowerShell; if you still have the issue, I suggest that you post it in the PowerShell forum for a more efficient response.
Here is some useful information, please check:
PowerShell commands from C# not working (System.Management.Automation.CommandNotFoundException)
PowerShell, Service Bus For Windows Server Programmatically: Command found, module could not be loaded
    https://msdn.microsoft.com/en-us/library/dn282152.aspx
    Best regards,
    Kristin
    We are trying to better understand customer views on social support experience, so your participation in this interview project would be greatly appreciated if you have time. Thanks for helping make community forums a great place.

  • Get ip address of remote computer

How can I get the IP address of a remote computer? I only found the InetAddress.getLocalHost() method, which gets the local IP.
What should I do?

    You can use the jCIFS NTLM filter to authenticate the user against a
    domain and get their NT username via request.getRemoteUser(); this is available from:
    http://jcifs.samba.org
    The filter is transparent to your web application, so you shouldn't need to make any additional code changes.
    Eric

  • IDoc getting scheduled instead of processed due to no resources available

    I know this is supposed to go into the Basis forum, but I can't find an applicable one. If you can redirect me, please do so.
    Setup:
    Solution Manager, CUA & ECC6
    Issue:
    Sometimes when triggering changes from CUA to an ECC client several IDocs are fired towards that client.
    (User Master Record changes, unlocks, role assignments, etc.).
    Looking in BD87 I sometimes see that of these IDocs, some are processed directly (transferred to application),
    However some of them remain in Status 64 (IDoc ready to be transferred to application).
    If I Right-Click -> Process, they are handled perfectly so why did they have to be waiting?
    The reason SAP throws for these IDocs is:
    Immediate Processing in Job '0000000000005010 '
    Message no. EA187
    Diagnosis
    At the time of IDoc generation, there were no resources for immediate processing and your system is configured so that if this happens, IDoc processing should be started in a job. Therefore, the job '0000000000005010' was started.
    Procedure
    If the IDoc still has status 64, look for the cause of the error in the log for that job.
    Question:
    We have via WE20 profiles set to process all IDocs immediately, then why do some of them still get
    scheduled in jobs? (Due to lack of resources for immediate processing?).
    Where can I check these 'resources' ?
    Edited by: Mujo on Jan 4, 2012 2:41 PM

    Hey Alex,
    Yes, I have been looking into the available work processes; all seem fine. I've also been checking for a (to-be-)scheduled job to pick up those IDocs that are in wait status.
    However a few questions:
    - is it common that CUA changes fire multiple IDocs? (in my case 3, only for something trivial as removing 1 composite role
    on a CUA-child system with 1 end user
    - what I don't really understand yet is if this is fired in batches of (in this case) only 3 IDocs, how come the first is processed and
      the other 2 are not? What is blocking that? Is 1 worker process used for processing each IDoc or 1 worker process for all 3 IDocs? To jump into your comment of "when a lot of IDocs are fired", as said: they're only 3.
    - can it be that the 'program" that processes the 1st IDoc change to the user master record is then 'occupying' the change mode and that therefore the other 2 are not able to being processed or can it be that the changes just come too fast after one another?
    - I would think that when CUA is installed and RFCs and IDoc types are correctly configured this resource problem
      should not act up with standard SAP ECC6 settings, for firing something trivial/simple as a composite role change
    - last question: if I should schedule a job to pick up the IDocs that are in waiting-to-be-processed status, what is common
      interval for this? (every minute? every 5 minutes?) and what's the program to schedule?
    EDIT: found at least answer to this question:
    RBDAPP01. Processes inbound IDocs (status 64) that have been buffered to support mass
    processing. The use of RBDAPP01 in the process flow is described in Chapter 4, "The Inbound EDI
    Process." Steps to execute the RBDAPP01 program are listed in Chapter 10, "Testing the EDI
    Interface."
    Looks strange to me as then the point of CUA making direct changes is somewhat undermined. In this case our CUA is being fed from a Sun IDM system, and the changes on SAP landscape will be 'less near-realtime', if I express that correctly.
    Your thoughts are highly appreciated.
    Edited by: Mujo on Jan 5, 2012 9:47 AM
    Edited by: Mujo on Jan 5, 2012 10:21 AM

  • Need help sending dialog to remote computer and saving results

    As the title states, what I want to do is send a dialog box to all remote computers listed in the array $compnames, receive a yes or no from each, and save the results as a file or array passed back to my local computer, so I can shut down the machines depending on the answer received. I have the following code segment; I am missing the no-answer-received part, and I am not sure how to receive the responses from each computer.
    Invoke-Command -ComputerName $element -ScriptBlock {
        [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms")
        $useOutput = [System.Windows.Forms.MessageBox]::Show("The Lab is being shut down. Press YES to abort.", "WARNING", 'YesNo', 'Exclamation')
        if ($useOutput -eq [Windows.Forms.DialogResult]::Yes) {
            $sendBack = 0
            Write-Output $sendBack
        }
        else {
            $sendBack = 1
            Write-Output $sendBack
        }
    }
    <# TODO: 1) write computer name and whether a user clicked yes to abort to a csv file (save just no's for logs)
    1.1) maybe add computer name to
    2) immediately shut down any computer where user allows shutdown
    3) after 30 seconds, add remaining computers with no action to csv file to shut down
    4) Open messagebox popup on host computer to ask if want to shutdown remaining computers
    5) Shut down remaining computers
    #>
    if ($sendBack -eq 1) {
        # the computer is ready to turn off: shut it down
    }
    else {
        # add to array or file to be shut down later
    }
    Any guidance or advice would be great! 
    -------Edit 1-----------
    In accordance with what the first two commenters have posted, I am thinking of a quirky solution.
    What if I have a script on the remote computer, called with Invoke-Command, that opens a message box and then, depending on the response, uses the msg.exe utility to send either a 1 or a 0 back? The host computer then takes these answers and creates two arrays, one for yeses and one for nos. I guess the only issue would be receiving the messages in an orderly manner over the course of 30 seconds, and having the messages be 'silent' so the script picks them up but no explicit message pops up on my computer.
    I have found the following from the invoke-command section of ss64.com:
    "Run the Test.ps1 script on the Server64 computer. The script is located on the local computer. The script runs on the remote computer and the results are returned to the local computer:PS C:\> invoke-command -filepath c:\scripts\test.ps1 -computerName
    Server64"
    So could I not send a message box through this and have the result returned to me?

    Actually, it might be doable (albeit not all that ... cleanly):
    Create a task on the remote computer that will run under any currently logged-in user, which displays a message box that returns values and writes the result to a file. (This will of course require a folder that all users have write permission on, to write the files in.)
    Trigger the task
    Check for result files
    Cheers,
    Fred
    There's no place like 127.0.0.1
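
On the result-gathering side, Invoke-Command already tags each returned object with the originating computer, which makes sorting the answers into yes/no lists straightforward. A sketch, assuming the triggered task wrote its answer ("0" for yes, "1" for no, per the code above) to a known file (the path is a placeholder):

```powershell
$results = Invoke-Command -ComputerName $compnames -ScriptBlock {
    # Read whatever the message-box task wrote; a missing file means no answer yet
    Get-Content 'C:\Temp\shutdown-answer.txt' -ErrorAction SilentlyContinue
}
# PSComputerName is added automatically to objects returned from remote sessions
$yes = $results | Where-Object { $_ -eq '0' } | ForEach-Object { $_.PSComputerName }
$no  = $results | Where-Object { $_ -eq '1' } | ForEach-Object { $_.PSComputerName }
```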

  • W7 64 Home: Task Scheduler looking for remote computer, can't defrag.

    Needed to defragment the HD; went to Task Scheduler and found that it is looking for a remote computer, and the options to schedule tasks are grayed out.
    The option in the right panel for the local computer was selected (there are no "apply" buttons or such). Closed Task Scheduler, reopened, and found that it is still looking for a remote computer; options to schedule tasks are still grayed out.
    Logged in as Admin. Not on a network.
    Thank you

    In Windows 7, Disk Defragmenter runs at regular intervals when your computer is turned on, so you don't have to remember to run it. It's scheduled to run once a week at an early hour in the morning. However, you can change how often Disk Defragmenter runs, and at what time of day:
    1. Open Disk Defragmenter by clicking the Start button. In the search box, type Disk Defragmenter, and then, in the list of results, click Disk Defragmenter. If you're prompted for an administrator password or confirmation, type the password or provide confirmation.
    2. Click Configure schedule, and then do one of the following:
       To change how often Disk Defragmenter runs, click the menu next to Frequency, and then click Daily, Weekly, or Monthly.
       If you set the frequency to Weekly or Monthly, click the menu next to Day to choose the day of the week or month that you want Disk Defragmenter to run.
       To change the time of day when Disk Defragmenter runs, click the menu next to Time, and then choose a time.
       To change the volumes that are scheduled to be defragmented, click Select disks, and then follow the instructions.
    3. Click OK.
    Arnav Sharma | http://arnavsharma.net/ Please remember to click “Mark as Answer” on the post that helps you, and to click “Unmark as Answer” if a marked post does not actually answer your question. This can be beneficial to other community members reading
    the thread.
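
If you'd rather inspect or re-run the underlying task directly, the built-in defrag task can be queried from an elevated prompt (a sketch; the task path shown is the Windows 7 default):

```powershell
schtasks /Query /TN "\Microsoft\Windows\Defrag\ScheduledDefrag" /V /FO LIST
```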

  • How can I setup a scheduled task to run a Powershell Script delivered as a Group Policy Preference

    I have a PowerShell script I want to run only once, when a user logs onto their system. The script would move all the PST files from the local drive and the home drive to a folder within the user's profile. I wanted to run this as a Windows 7 scheduled task delivered via Group Policy Preferences. How can I get this to happen, short of a logon script? I have updated all the machines to WMF 4.0, so could I use a Scheduled Job instead? I want to run the script as the logged-on user, but elevated.
    #Start Outlook and disconnect attached PST files.
    $Outlook = New-Object -ComObject Outlook.Application
    $namespace = $Outlook.GetNamespace("MAPI")
    $folder = $namespace.GetDefaultFolder("olFolderInbox")
    $explorer = $folder.GetExplorer()
    $explorer.Display()
    $myArray = @()
    $Outlook.Session.Stores | Where-Object { $_.FilePath -like '*.PST' } | ForEach-Object { [array]$myArray += $_.FilePath }
    for ($x = 0; $x -le $myArray.Length - 1; $x++) {
        $PSTPath = $myArray[$x]
        $PST = $namespace.Stores | ? { $_.FilePath -like $PSTPath }
        $PSTRoot = $PST.GetRootFolder()   # Get root folder name of PST
        $PSTFolder = $namespace.Folders.Item($PSTRoot.Name)   # Bind to PST for disconnection
        $namespace.GetType().InvokeMember('RemoveStore', [System.Reflection.BindingFlags]::InvokeMethod, $null, $namespace, ($PSTFolder))   # Disconnect .PST
    }
    #Move all PST files to the default location, deleting the PST files from their original location.
    $SourceList = ("$env:SystemDrive", "$env:HOMEDRIVE")
    $Destination = ("$env:USERPROFILE\MyOutlookFiles")
    (Get-ChildItem -Path $SourceList -Recurse -Filter *.PST) | Move-Item -Destination $Destination
    #Attach all PST files from the default location.
    Add-Type -Assembly "Microsoft.Office.Interop.Outlook" | Out-Null
    $outlook = New-Object -ComObject Outlook.Application
    $namespace = $outlook.GetNameSpace("MAPI")
    dir "$env:USERPROFILE\MyOutlookFiles\*.pst" | % { $namespace.AddStore($_.FullName) }

    Mike,
    I do not understand what appears to be a regular expression above. I did add the PowerShell script to the HKCU RunOnce key as suggested.
    Windows Registry Editor Version 5.00
    C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe -noprofile -sta -WindowStyle Hidden -ExecutionPolicy RemoteSigned -File "C:\scripts\Windows PowerShell\Move-PST.ps1"
    I'm delivering this using Group Policy Preferences. It seems to fail or time out when run, because the behavior is different when I run the script from within the PowerShell ISE. I added the parameters to the script and will try it again in the morning.

  • Get-DfsrBacklog cmdlet doesn't work from remote computer (pssession)

    Hi there!
    I'm trying to manage our Server 2012 R2 boxes from a Windows 7 remote machine using PowerShell. For some reason the cmdlet Get-DfsrBacklog does not seem to work remotely. The same cmdlet works when logging in locally to the server(s) with the same credentials. UAC is turned off on the target machines, and I have local admin permissions on these servers using my domain account.
    What I do is:
    Enter-PSSession <servername>
    Get-DfsrBacklog -SourceComputerName <servername> -DestinationComputerName <servername>
    Then I receive the following error:
    Get-DfsrBacklog : Could not retrieve the backlog information. Replication group: "*" Replicated folder: "*" Source
    computer: <servername> Destination computer: <servername> Confirm that you are running in an elevated Windows PowerShell
    session and are a member of the local Administrators group on the destination computer. The destination computer must
    also be accessible over the network, and have the DFSR service running. This cmdlet does not support WMI calls for the
    following or earlier operating systems: Windows Server 2012. Details: WinRM cannot process the request. The following
    error with errorcode 0x8009030e occurred while using Kerberos authentication: A specified logon session does not
    exist. It may already have been terminated.
     Possible causes are:
      -The user name or password specified are invalid.
      -Kerberos is used when no authentication method and no user name are specified.
      -Kerberos accepts domain user names, but not local user names.
      -The Service Principal Name (SPN) for the remote computer name and port does not exist.
      -The client and remote computers are in different domains and there is no trust between the two domains.
     After checking for the above issues, try the following:
      -Check the Event Viewer for events related to authentication.
      -Change the authentication method; add the destination computer to the WinRM TrustedHosts configuration setting or
    use HTTPS transport.
     Note that computers in the TrustedHosts list might not be authenticated.
       -For more information about WinRM configuration, run the following command: winrm help config.
        + CategoryInfo          : ProtocolError: (zursf1003:String) [Get-DfsrBacklog], DfsrException
        + FullyQualifiedErrorId : Get-DfsrBacklog.CimException,Microsoft.DistributedFileSystemReplication.Commands.GetDfsr
       BacklogCommand
    Any ideas?

    This article suggests that you're logged into your Win7 management machine with local credentials. You should try the Get-DfsrBacklog command with domain credentials:
    "Client is in a domain: Attempting to connect to a remote server by using implicit credentials that are the local administrator's credentials on the client. Instead, use domain credentials that are recognized by the domain of the target server, or right-click the server entry in the Servers tile, click Manage As, and then specify credentials of an administrator on the target server."
    Sam Boutros, Senior Consultant, Software Logic, KOP, PA http://superwidgets.wordpress.com (Please take a moment to Vote as Helpful and/or Mark as Answer, where applicable) _________________________________________________________________________________
    Powershell: Learn it before it's an emergency http://technet.microsoft.com/en-us/scriptcenter/powershell.aspx http://technet.microsoft.com/en-us/scriptcenter/dd793612.aspx
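
A minimal sketch of the suggested approach, running the cmdlet with explicit domain credentials rather than implicit local ones (the server names and account are placeholders; note that querying a second machine from inside the remote session can still hit Kerberos double-hop restrictions):

```powershell
$cred = Get-Credential 'DOMAIN\admin.user'
Invoke-Command -ComputerName 'server01' -Credential $cred -ScriptBlock {
    Get-DfsrBacklog -SourceComputerName 'server01' -DestinationComputerName 'server02'
}
```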

  • Using different credetial to connect to remote computer to get Cert information

    I am using the following function to get certificate details from remote computers. How do I change it to accept credentials to connect to the remote computer?
    PS: The remote computers are Windows 2003 machines without PowerShell installed.
    function Get-Cert ($computer = $env:COMPUTERNAME) {
        $ro = [System.Security.Cryptography.X509Certificates.OpenFlags]"ReadOnly"
        $lm = [System.Security.Cryptography.X509Certificates.StoreLocation]"LocalMachine"
        $store = New-Object System.Security.Cryptography.X509Certificates.X509Store("\\$computer\my", $lm)
        $store.Open($ro)
        $store.Certificates
    }
    Get-Cert ServerA
    -Edatha-

    Certainly. Here's a quick example that uses the PowerShell registry provider as a quick and easy way to enumerate the subkeys of HKLM\Software\Microsoft\SystemCertificates\MY\Certificates\ and to read the Blob value as a byte array. In your case, you'd probably want to look into using something like WMI's StdRegProv class to accomplish the same thing.
    Once you have the byte array, just pass it to the constructor of X509Certificate2. (Note the unary comma in that constructor call; we're binding an object to the -ArgumentList parameter of New-Object, and when you're calling constructors that take a single array argument, you need the comma.)
    Get-ChildItem HKLM:\SOFTWARE\Microsoft\SystemCertificates\MY\Certificates |
    ForEach-Object {
        $byteArray = (Get-ItemProperty -Path $_.PSPath -Name Blob).Blob
        $certificate = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2(,$byteArray)
        $certificate
    }
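
Pulling the two suggestions together, a sketch of reading the same Blob values from a remote Windows 2003 machine with alternate credentials via WMI's StdRegProv ($computer is a placeholder; 2147483650 is HKEY_LOCAL_MACHINE):

```powershell
$HKLM = [uint32]2147483650
$key  = 'SOFTWARE\Microsoft\SystemCertificates\MY\Certificates'
$reg  = Get-WmiObject -List StdRegProv -Namespace root\default `
            -ComputerName $computer -Credential (Get-Credential)
foreach ($name in ($reg.EnumKey($HKLM, $key)).sNames) {
    $blob = ($reg.GetBinaryValue($HKLM, "$key\$name", 'Blob')).uValue
    New-Object System.Security.Cryptography.X509Certificates.X509Certificate2(,[byte[]]$blob)
}
```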
