Capturing log files from multiple .ps1 scripts called from within a .bat file
I am trying to invoke multiple instances of a powershell script and capture individual log files from each of them. I can start the multiple instances by calling 'start powershell' several times, but am unable to capture logging. If I use 'call powershell'
I can capture the log files, but the batch file won't continue until that current 'call powershell' has completed.
i.e., within Test.bat:
start powershell .\Automation.ps1 %1 %2 %3 %4 %5 %6 > a.log 2>&1
timeout /t 60
start powershell .\Automation.ps1 %1 %2 %3 %4 %5 %6 > b.log 2>&1
timeout /t 60
start powershell .\Automation.ps1 %1 %2 %3 %4 %5 %6 > c.log 2>&1
timeout /t 60
start powershell .\Automation.ps1 %1 %2 %3 %4 %5 %6 > d.log 2>&1
timeout /t 60
start powershell .\Automation.ps1 %1 %2 %3 %4 %5 %6 > e.log 2>&1
timeout /t 60
start powershell .\Automation.ps1 %1 %2 %3 %4 %5 %6 > f.log 2>&1
The log files get created, but they are empty. If I invoke 'call' instead of 'start' I get the log data, but I need the scripts to run in parallel, not sequentially.
call powershell .\Automation.ps1 %1 %2 %3 %4 %5 %6 > a.log 2>&1
timeout /t 60
call powershell .\Automation.ps1 %1 %2 %3 %4 %5 %6 > b.log 2>&1
timeout /t 60
call powershell .\Automation.ps1 %1 %2 %3 %4 %5 %6 > c.log 2>&1
timeout /t 60
call powershell .\Automation.ps1 %1 %2 %3 %4 %5 %6 > d.log 2>&1
timeout /t 60
call powershell .\Automation.ps1 %1 %2 %3 %4 %5 %6 > e.log 2>&1
Any suggestions of how to get this to work?
Batch files are sequential by design (batch up a bunch of statements and execute them). call doesn't run in a different process, so when you use it the batch file waits for the called program to exit. From the documentation for CALL:
Calls one batch program from another without stopping the parent batch program
I was hoping the documentation would say that the batch file waits for CALL to return, but this is as close as it gets.
start, which "Starts a separate window to run a specified program or command", runs in parallel because once it launches the target application, start itself ends and the batch file continues. It has no idea about the powershell.exe process you kicked off, and for that reason you can't pipe the output.
Update: I was wrong, you can totally redirect the output of what you run with start.exe.
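For example (my addition, not from the original answer), one way to do it is to hand the whole command line, redirection included, to a fresh cmd instance; start itself still redirects nothing, but the inner cmd performs the redirection:

```bat
start "" cmd /c "powershell .\Automation.ps1 %1 %2 %3 %4 %5 %6 > a.log 2>&1"
```

The empty quotes are a window title; without them, start would treat the first quoted argument as the title.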
How about instead of running a batch file you run a PowerShell script? You can run script blocks or call individual scripts in parallel with the
Start-Job cmdlet.
You can monitor the jobs and when they complete, pipe them to
Receive-Job to see their output.
For example:
$sb = {
    Write-Output "Hello"
    Sleep -Seconds 10
    Write-Output "Goodbye"
}
Start-Job -Scriptblock $sb
Start-Job -Scriptblock $sb
Here's a script that runs the script block $sb. The script block outputs the text "Hello", waits 10 seconds, and then outputs the text "Goodbye".
Then it starts two jobs (in this case both running the same script block).
When you run this you receive this for output:
PS> $sb = {
>> Write-Output "Hello"
>> Sleep -Seconds 10
>> Write-Output "Goodbye"
>> }
>>
PS> Start-Job -Scriptblock $sb
Id Name State HasMoreData Location Command
1 Job1 Running True localhost ...
PS> Start-Job -Scriptblock $sb
Id Name State HasMoreData Location Command
3 Job3 Running True localhost ...
PS>
When you run Start-Job it will execute your script or scriptblock in a new process and continue to the next line in the script.
You can see the jobs with
Get-Job:
PS> Get-Job
Id Name State HasMoreData Location Command
1 Job1 Running True localhost ...
3 Job3 Running True localhost ...
OK, that's great. But we need to know when a job is done. The job's State property will tell us this (we're looking for a state of "Completed"), so we can build a loop and check:
$Completed = $false
while (!$Completed) {
    # get all the jobs that haven't yet completed
    $jobs = Get-Job | where {$_.State.ToString() -ne "Completed"}
    # if Get-Job doesn't return any jobs (i.e. they are all completed)
    if ($jobs -eq $null) {
        $Completed = $true
    } else {
        # otherwise update the screen
        Write-Output "Waiting for $($jobs.Count) jobs"
        sleep -s 1
    }
}
This will output something like this:
Waiting for 2 jobs
Waiting for 2 jobs
Waiting for 2 jobs
Waiting for 2 jobs
Waiting for 2 jobs
Waiting for 2 jobs
Waiting for 2 jobs
Waiting for 2 jobs
Waiting for 2 jobs
Waiting for 2 jobs
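As an aside (my addition, not part of the original answer): if you don't need the "Waiting for..." progress messages, the built-in Wait-Job cmdlet can replace the polling loop entirely:

```powershell
# Wait-Job blocks until the piped jobs reach a terminal state;
# Receive-Job then collects their output in the same pipeline
Get-Job | Wait-Job | Receive-Job
```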
When it's done, we can see the jobs have completed:
PS> Get-Job
Id Name State HasMoreData Location Command
1 Job1 Completed True localhost ...
3 Job3 Completed True localhost ...
PS>
Now at this point we could pipe the jobs to Receive-Job:
PS> Get-Job | Receive-Job
Hello
Goodbye
Hello
Goodbye
PS>
But as you can see it's not obvious which script is which. In your real scripts you could include some identifiers to distinguish them.
Another way would be to grab the output of each job one at a time:
foreach ($job in $jobs) {
    $job | Receive-Job
}
You could store the output in a variable, or save it to a log file with Out-File. The trick is matching the jobs up to their output. Something like this may work:
$a_sb = {
    Write-Output "Hello A"
    Sleep -Seconds 10
    Write-Output "Goodbye A"
}
$b_sb = {
    Write-Output "Hello B"
    Sleep -Seconds 5
    Write-Output "Goodbye B"
}
$job = Start-Job -Scriptblock $a_sb
$a_log = $job.Name
$job = Start-Job -Scriptblock $b_sb
$b_log = $job.Name
$Completed = $false
while (!$Completed) {
    $jobs = Get-Job | where {$_.State.ToString() -ne "Completed"}
    if ($jobs -eq $null) {
        $Completed = $true
    } else {
        Write-Output "Waiting for $($jobs.Count) jobs"
        sleep -s 1
    }
}
Get-Job | where {$_.Name -eq $a_log} | Receive-Job | Out-File .\a.log
Get-Job | where {$_.Name -eq $b_log} | Receive-Job | Out-File .\b.log
If you check the folder you'll see the log files, and they contain the scripts' output:
PS> dir *.log
Directory: C:\Users\jwarren
Mode LastWriteTime Length Name
-a--- 1/15/2014 7:53 PM 42 a.log
-a--- 1/15/2014 7:53 PM 42 b.log
PS> Get-Content .\a.log
Hello A
Goodbye A
PS> Get-Content .\b.log
Hello B
Goodbye B
PS>
The trouble though is you won't get a log file until the job has completed. If you use your log files to monitor progress this may not be suitable.
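One workaround for that (my suggestion, not from the original answer) is to skip jobs and launch each script with Start-Process, redirecting its output streams to files. The processes still run in parallel, but each log file fills while its script runs. The script path and log names below are assumptions mirroring the question:

```powershell
# Each process writes to its own log as it runs, not at completion
foreach ($name in 'a', 'b', 'c') {
    Start-Process -FilePath powershell.exe `
        -ArgumentList '-NoProfile', '-File', '.\Automation.ps1' `
        -RedirectStandardOutput ".\$name.log" `
        -RedirectStandardError ".\$name.err.log" `
        -WindowStyle Hidden
}
```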
Jason Warren
@jaspnwarren
jasonwarren.ca
habaneroconsulting.com/Insights
Similar Messages
-
Currently logged in from within the app itself
I need to find out who or how many users are currently logged into my app from within the app itself and display it within a report for the app administrators. Any thoughts or ideas.
For an answer to this question see this discussion.
query apex users and their session
Todd -
Capturing JCheckBox edit from within a JTable cell
Hey,
I've gone through as many posts as I can from this forum and I have managed to understand quite a bit, but there is just this one problem that I have.
What I have is a checkbox in a cell in a JTable that causes a JPanel elsewhere to repaint in various ways depending on whether the checkbox is checked or not.
The code below is for the table editor. The model is my table model. aCanvas is the panel being repainted. completeVisible is the array that aCanvas accesses to repaint the panel.
The code below works, but only after the second click: the first two times I click the checkbox the value printed is true, and only afterwards does it start alternating and working correctly. Can anyone please tell me why this is?
Thanks in advance... :o)
private class TSPCellEditor extends AbstractCellEditor implements TableCellEditor, ActionListener {
    Boolean visible;
    JCheckBox check;
    String TEST = "TEST";

    public TSPCellEditor() {
        check = new JCheckBox();
        check.setActionCommand(TEST);
        check.addActionListener(this);
    }

    public void actionPerformed(ActionEvent e) {
        if (TEST.equals(e.getActionCommand())) {
            visible = new Boolean(check.isSelected());
            fireEditingStopped();
        }
    }

    public Object getCellEditorValue() {
        return visible;
    }

    public Component getTableCellEditorComponent(JTable table, Object value, boolean isSelected, int row, int column) {
        visible = (Boolean) value;
        //boolean temp = visible.booleanValue();
        //temp = !temp;
        //visible = Boolean.valueOf(temp);
        check.setSelected(visible.booleanValue());
        System.out.println(completeVisible[row] + " " + row);
        completeVisible[row] = visible.booleanValue();
        model.setValueAt(visible, row, column);
        aCanvas.repaint();
        return check;
    }
}
I can't answer your question, but I think a better way to implement your solution is to use a TableModelListener. An event is fired whenever the contents of the table model are changed, so you can do your processing there. This thread (http://forum.java.sun.com/thread.jsp?forum=57&thread=418560) gives a simple example.
-
The log file behavior does not follow the logging preferences I set
I set my log file parameters to capture a large amount of information.
Specifically, I wanted to capture log files as big as 1GB and keep them
for 3 sets of backups. The settings I used are as follows:
logfile.http.maxlogfilesize 1073741824
logfile.http.maxlogsize 4294967296
However, after setting these values, I can see only two log files, the file
for today and the file for yesterday.
(See attachment)
I've given full read and write privileges.
To whom? And as whom are you connecting?
-
Shell Script to grep Job File name and Log File name from crontab -l
Hello,
I am new to shell scripting. I need to write a shell script where I can grep the name of the file, i.e. the .sh file, and the log file, from crontab -l.
#51 18 * * * /home/oracle/refresh/refresh_ug634.sh > /home/oracle/refresh/refresh_ug634.sh.log 2>&1
#40 17 * * * /home/oracle/refresh/refresh_ux634.sh > /home/oracle/refresh/refresh_ux634.sh.log 2>&1
In crontab -l, there are many jobs, i need to grep job name like 'refresh_ug634.sh' and corresponding log name like 'refresh_ug634.sh.log '.
I am thinking of making a universal script that can grep job name, log name and hostname for one server.
Then, suppose i modify the refresh_ug634.sh script and call that universal script and echo those values when the script gets executed.
Please can anyone help.
All i need to do is have footer in all the scripts running in crontab in one server.
job file name
log file name
hostname
Please suggest if there is any better solution. Thanks.
957704 wrote:
I need help how to grep that information from crontab -l
Please can you provide some insight how to grep that shell script name from list of crontab -l jobs
crontab -l > cron.log -- exporting the contents to a file
cat cron.log | grep something -- need some commands to grep that info
You are missing the point. This forum is for discussion of SQL and PL/SQL questions. What does your question have to do with SQL or PL/SQL?
It's like you just walked into a hardware store and asked where they keep the fresh produce.
I will point out one thing about your question. You are assuming every entry in the crontab has exactly the same format. Consider this crontab:
#=========================================================================
# NOTE: If this is on a clustered environment, all changes to this crontab
# must be replicated on all other nodes of the cluster!
# minute (0 thru 59)
# hour (0 thru 23)
# day-of-month (1 thru 31)
# month (1 thru 12)
# weekday (0 thru 6, sunday thru saturday)
# command
#=========================================================================
00 01 1-2 * 1,3,5,7 /u01/scripts/myscript01 5 orcl dev
00 04 * * * /u01/scripts/myscript02 hr 365 >/u01/logs/myscript2.lis
00 6 * * * /u01/scripts/myscript03 >/u01/logs/myscript3.lis
The variations are endless.
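For what it's worth (my addition): for entries that do follow the single "script.sh > script.sh.log" layout shown in the question, a sed sketch like this can pull out the script and log paths, while entries in any other format simply produce no output:

```shell
# Print "script log" for crontab entries shaped like the question's;
# the pattern assumes an absolute .sh path redirected to a .log file.
extract_jobs() {
    sed -n 's/.* \(\/[^ ]*\.sh\) *> *\([^ >]*\.log\).*/\1 \2/p'
}

# Usage: crontab -l | extract_jobs
echo '#51 18 * * * /home/oracle/refresh/refresh_ug634.sh > /home/oracle/refresh/refresh_ug634.sh.log 2>&1' | extract_jobs
```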
When you get to an appropriate forum (this one is not it) it will be helpful to explain your business requirement, not just your proposed technical solution.
-
Log messages from multiple instances in single file.
Hi!
I have a requirement that I need to log messages from multiple instances of the same object in a file. A new file will be created every day. Likewise, multiple objects might have several instances each.
One class
->multiple instances
-> log message stored in single file.
Note :
I am using the Message driven bean. I need to log from the bean class. JDK 1.3.
If you could help me out that would be great.
As long as they are all from the same OS program (a single Java VM), that's OK - you can use Log4j, and use a rotating file logger.
If you point two different virtual machine processes at the same file, one may have it open when the other is trying to rotate it, and your rotation may fail (at best) and/or you may lose the old log (the worst case).
If you need to collect log messages from multiple processes (or even multiple machines), use a syslog-based logger (Log4j has a SyslogAppender) or use Log4j's SocketAppender to write to a Log4j built-in log listener (SocketNode).
-
How to send output from SQL script to the specified log file (not *.sql)
## 1 -I write sql command into sql file
echo "SELECT * FROM DBA_USERS;">results.sql
echo "quit;">>results.sql
##2- RUN sqlplus, run sql file and get output/results into jo.log file
%ORACLE_HOME/bin/sqlplus / as sysdba<results.sql>>jo.log
It doesn't work, please advise.
$ echo "set pages 9999" >results.sql ### this is only to make the output more readable
$ echo "SELECT * FROM DBA_USERS;" >>results.sql
$ echo "quit" >>results.sql
$ cat results.sql
set pages 9999
SELECT * FROM DBA_USERS;
quit
$ sqlplus -s "/ as sysdba" @results >jo.log
$ cat jo.log
USERNAME USER_ID PASSWORD
ACCOUNT_STATUS LOCK_DATE EXPIRY_DAT
DEFAULT_TABLESPACE TEMPORARY_TABLESPACE CREATED
PROFILE INITIAL_RSRC_CONSUMER_GROUP
EXTERNAL_NAME
SYS 0 D4C5016086B2DC6A
OPEN
SYSTEM TEMP 06/12/2003
DEFAULT SYS_GROUP
SYSTEM 5 D4DF7931AB130E37
OPEN
SYSTEM TEMP 06/12/2003
DEFAULT SYS_GROUP
DBSNMP 19 E066D214D5421CCC
OPEN
SYSTEM TEMP 06/12/2003
DEFAULT DEFAULT_CONSUMER_GROUP
SCOTT 60 F894844C34402B67
OPEN
USERS TEMP 06/12/2003
DEFAULT DEFAULT_CONSUMER_GROUP
HR 47 4C6D73C3E8B0F0DA
OPEN
EXAMPLE TEMP 06/12/2003
DEFAULT DEFAULT_CONSUMER_GROUP
That's only a part of the file, it's too long :-)
-
How to tail log files from particular string
Hello,
We would like to tail several log files "live" in PowerShell for a particular string. We have tried to use the "Get-Content" command, but without luck: every time, we received results from only one file. I assume that it was caused by the "-Wait" parameter. Is there any other way to tail multiple files?
Our sample script below
dir d:\test\*.txt -include *.txt | Get-Content -Wait | select-string "windows" |ForEach-Object {Write-EventLog -LogName Application -Source "Application error" -EntryType information -EventId 999 -Message $_}
Any help will be appreciated.
Mac
Because we want to capture a particular string from those files. The application writes some string from time to time, and when the string appears we want to catch it and send an event to the application log; after that our Nagios system will raise an alarm.
Mac
Alright, this is my answer, but I think you won't like it.
Run this PowerShell code in PowerShell ISE:
$file1='C:\Temp\TFile1.txt'
'' > $file1
$file2='C:\Temp\TFile2.txt'
'' > $file2
$special='windowswodniw'
$exit='exit'
$sb1={
    gc $using:file1 -Wait | %{
        if($_ -eq $using:exit){
            exit
        }else{
            sls $using:special -InputObject $_ -SimpleMatch
        }
    } | %{
        Write-Host '(1) found special string: ' $_
    }
}
$sb2={
    gc $using:file2 -Wait | %{
        if($_ -eq $using:exit){
            exit
        }else{
            sls $using:special -InputObject $_ -SimpleMatch
        }
    } | %{
        Write-Host '(2) found special string: ' $_
    }
}
sajb $sb1
sajb $sb2
In this code, $file1 and 2 are the files being waited for.
As I understood you, you care only for the special string, which is in the variable $special.
All other variables, will be discarded.
Also, whenever a string equal to $exit is written to a file, the background job corresponding to that file will be terminated, automatically! (simple, right?)
In the example above, I use only 2 files (being watched) but you can extend it, easily, to any number (as long as you understand the code).
If you are following my instructions, at this point you have PowerShell ISE running, with 2 background jobs,
waiting for data being input to $file1 and 2.
Now, it's time to send data to $file1 and 2.
Start PowerShell Console to send data to those files.
From its command line, execute these commands:
$file1 = 'C:\Temp\TFile1.txt'
$file2='C:\Temp\TFile2.txt'
$exit='exit'
Notice that $file1 and 2 are exactly the same as those defined in PowerShell ISE, and that I've defined the string that will terminate the background jobs.
Enter these commands in the PowerShell Console:
'more' >> $file1
'less' >> $file1
'more' >> $file2
'less' >> $file2
These commands will provoke no consequences, because these strings will be discarded (they do not contain the special string).
Now, enter these commands in the PowerShell Console:
'windowswodniw' >> $file1
'1 windowswodniw 2' >> $file1
'more windowswodniw less' >> $file1
'windowswodniw' >> $file2
'1 windowswodniw 2' >> $file2
'more windowswodniw less' >> $file2
All these will be caught by the (my) code, because they contain the special string.
Now, let's finish the background jobs with these commands:
$exit >> $file1
$exit >> $file2
The test I'm explaining, now is DONE, TERMINATED, FINISHED, COMPLETED, ...
Time to get back to PowerShell ISE.
You'll notice that it printed out this (right at the beginning):
Id Name PSJobTypeName State HasMoreData Location Command
1 Job1 BackgroundJob Running True localhost ...
2 Job2 BackgroundJob Running True localhost ...
At PowerShell ISE's console, type this:
gjb
And you'll see output like:
Id Name PSJobTypeName State HasMoreData Location Command
1 Job1 BackgroundJob Completed True localhost ...
2 Job2 BackgroundJob Completed True localhost ...
( They are completed! )
Which means the background jobs are completed.
See the background jobs' outputs, commanding this:
gjb | rcjb
The output, will be something like this:
(1) found special string: windowswodniw
(1) found special string: 1 windowswodniw 2
(1) found special string: more windowswodniw less
(2) found special string: windowswodniw
(2) found special string: 1 windowswodniw 2
(2) found special string: more windowswodniw less
I hope you are able to understand all this (the rubbishell coders, surely, are not).
In my examples, the strings caught are written to host's console, but you can change it to do anything you want.
P.S.: I'm using PowerShell, but I'm pretty sure you can use older PowerShell (version 3). Anything less is not PowerShell anymore. We can call it RubbiShell.
-
Help with Script created to check log files.
Hi,
I have a program we use in our organization on multiple workstations that connects to a MS SQL 2005 database on a virtual Microsoft 2008 R2 server. The program is quite old and was designed in the days when serial connections were the most efficient means of connecting to a device. If for any reason the network, the virtual server, or the SAN the virtual server runs on reaches roughly 25% utilization or higher, the program on the workstations times out from the SQL database and drops from it completely, rendering it useless. The program is not smart enough to resync itself to the SQL database; it just sits there with "connection failed" until human interaction. A simple restart of the program reconnects it to the SQL database without any issues. This is fine when staff are onsite, but the program runs out of hours when the site is unmanned.
The utilization of the server environment is more than sufficient if not it has double the recommended resources needed for the program. I am in regular contact with the support for the program and it is a known issue for them which i believe they do not
have any desire to fix in the near future.
I wish to create a simple script that checks the log files on each workstation or server the program runs on, and emails me if a specific word comes up in a log file. The word will only show when a connection failure has occurred.
After the email is sent I wish for the script to close the program and reopen it to resync the connection.
I will schedule the script to run every 15 minutes.
I posted this in a previous post about a month ago, but I went on holidays over Xmas and the post died from my lack of response.
Below is what I have so far as my script. I have only completed the monitoring of the log file and the email portion of it. I had some help from a guy on this forum to get the script to where it is now. I know basic to intermediate scripting, so sorry for my crudity, if any.
The program is called "wasteman2G" and the log file is located in \\servername\WasteMan2G\Config\DCS\DCS_IN\alert.txt
I would like to get the email side of this script working first and then move on to getting the restart of the program running after.
At the moment I am not receiving an error from the script. It runs but doesn't complete what it should.
Could someone please help?
Const strMailto = "[email protected]"
Const strMailFrom = "[email protected]"
Const strSMTPServer = "mrc1tpv002.XXXX.local"
Const FileToRead = "\\Mrctpv005\WasteMan2G\Config\DCS\DCS_IN\alert.txt"
arrTextToScanFor = Array("SVR2006","SVR2008")
Set WshShell = WScript.CreateObject("WScript.Shell")
Set objFSO = WScript.CreateObject("Scripting.FileSystemObject")
Set oFile = objFSO.GetFile(FileToRead)
dLastCreateDate = CDate(WshShell.RegRead("HKLM\Software\RDScripts\CheckTXTFile\CreateDate"))
If oFile.DateCreated = dLastCreateDate Then
intStartAtLine = CInt(WshShell.RegRead("HKLM\Software\RDScripts\CheckTXTFile\LastLineChecked"))
Else
intStartAtLine = 0
End If
i = 0
Set objTextFile = oFile.OpenAsTextStream()
Do While Not objTextFile.AtEndOfStream
If i < intStartAtLine Then
objTextFile.SkipLine
Else
strNextLine = objTextFile.Readline()
For each strItem in arrTextToScanFor
If InStr(LCase(strNextLine),LCase(strItem)) Then
strResults = strNextLine & vbcrlf & strResults
End If
Next
End If
i = i + 1
Loop
objTextFile.close
WshShell.RegWrite "HKLM\Software\RDScripts\CheckTXTFile\FileChecked", FileToRead, "REG_SZ"
WshShell.RegWrite "HKLM\Software\RDScripts\CheckTXTFile\CreateDate", oFile.DateCreated, "REG_SZ"
WshShell.RegWrite "HKLM\Software\RDScripts\CheckTXTFile\LastLineChecked", i, "REG_DWORD"
WshShell.RegWrite "HKLM\Software\RDScripts\CheckTXTFile\LastScanned", Now, "REG_SZ"
If strResults <> "" Then
SendCDOMail strMailFrom,strMailto,"VPN Logfile scan alert",strResults,"","",strSMTPServer
End If
Function SendCDOMail( strFrom, strSendTo, strSubject, strMessage , strUser, strPassword, strSMTP )
With CreateObject("CDO.Message")
.Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/sendusing") = 2
.Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/smtpserver") = strSMTP
.Configuration.Fields.item("http://schemas.microsoft.com/cdo/configuration/smtpauthenticate") = 1 'basic
.Configuration.Fields.item("http://schemas.microsoft.com/cdo/configuration/sendusername") = strUser
.Configuration.Fields.item("http://schemas.microsoft.com/cdo/configuration/sendpassword") = strPassword
.Configuration.Fields.Update
.From = strFrom
.To = strSendTo
.Subject = strSubject
.TextBody = strMessage
On Error Resume Next
.Send
If Err.Number <> 0 Then
WScript.Echo "SendMail Failed:" & Err.Description
End If
End With
End Function
Thank you for that link, it did help quite a bit. What I wanted was to move it to cscript so I could run wscript.echo on the command line. It all took too long, and I found a way to complete it via batch. I do have a problem with my script though, and you might be able to help.
What I am doing is searching the log file, finding the specific words, then outputting them to an email. I haven't used bmail before, so that's probably my problem, but I'm using bmail to send me the results.
Then I'm clearing the log file so the next day it is empty, so that when I search it every 15 minutes it's clean, and only when an error occurs will it email me.
Could you help me send the output via email using bmail or blat?
@echo off
echo Wasteman Logfile checker
echo Created by: Reece Vellios
echo Date: 08/01/2014
findstr "SRV2006 & SRV2008" \\Mrctpv005\WasteMan2G\Config\DCS\DCS_IN\Alert.Txt > c:\log4mail.txt
if %errorlevel%==0 "C:\Documents and Settings\rvellios\Desktop\DCS Checker\bmail.exe" -s mrc1tpv002.xxx.local -t [email protected] -f [email protected] -h -a "Process Dump" -m c:\log4mail.txt -c
for %%G in (\\Mrctpv005\WasteMan2G\Config\DCS\DCS_IN\Alert.Txt) do (copy /Y nul "%%G")
This the working script without bmail
@echo off
echo Wasteman Logfile checker
echo Created by: Reece Vellios
echo Date: 08/01/2014
findstr "SRV2006 & SRV2008" \\Mrctpv005\WasteMan2G\Config\DCS\DCS_IN\Alert.Txt > C:\log4mail.txt
if %errorlevel%==0 (echo Connection error)
for %%G in (\\Mrctpv005\WasteMan2G\Config\DCS\DCS_IN\Alert.Txt) do (copy /Y nul "%%G")
I need to make this happen:
If an error occurs at "%errorlevel%==0" then it will output c:\log4mail.txt via SMTP email using bmail.
-
Amending script to read list of computers, run script and output to log file
Hello all,
I have cobbled together a script that runs and does what I want. Now I would like to amend the script to read a list of computers rather than use the msg box it currently uses for strComputer; if a computer doesn't respond to a ping, log that, and if it does, continue with the script and, when it is complete, log a success or failure. I have just started scripting and would really appreciate some help on this one, thanks. I created the script to fix an SCCM updates issue and failing task sequences, so it may prove useful to others.
There are msg box entries that can be removed that were originally in there for the user running the script.
'setting objects
Dim objnet, objFSO, objshell
Dim objFile, strLine, intResult
Set objnet = CreateObject("wscript.network")
Set objFSO = CreateObject("scripting.filesystemobject")
Set objshell = CreateObject("wscript.shell")
strfile = "c:\wuafix\wuafix.vbs"
strUser = "domain\user"
strPassword = "password"
'getting server name or IP address
strComputer=InputBox("Enter the IP or computer name of the remote machine on which to repair the WUA agent:", "Starting WUA Fix")
'check to see if the server can be reached
Dim strPingResults
Set pingExec = objshell.Exec("ping -n 3 -w 2000 " & strComputer) 'send 3 echo requests, waiting 2secs each
strPingResults = LCase(pingExec.StdOut.ReadAll)
If Not InStr(strPingResults, "reply from")>0 Then
WScript.Echo strComputer & " did not respond to ping."
WScript.Quit
End If
'Check if source file exists
If Not objFSO.FileExists(strFile) Then
WScript.Echo "The source file does not exist"
WScript.Quit
End If
MsgBox "The WUA Fix is in process. Please wait.", 64, "Script Message"
'mapping drive to remote machine
If objFSO.DriveExists("Z:") Then
objnet.RemoveNetworkDrive "Z:","True","True"
End If
objnet.MapNetworkDrive "Z:", "\\" & strComputer & "\c$", True
'creating folder for install exe on remote machine
If (objFSO.FolderExists("Z:\wuafix\") = False) Then
objFSO.CreateFolder "Z:\wuafix"
End If
'copying vbs to remote machine
objFSO.CopyFile strFile, "Z:\wuafix\wuafix.vbs"
'set command line executable to run a silent install remotely
strInstaller1 = "cscript.exe c:\wuafix\wuafix.vbs"
'strInstaller2 = "c:\wuafix\wuafix.vbs"
strExec = "c:\pstools\PsExec.exe "
'objshell.Run strExec & " \\" & strComputer & strInstaller1
On Error Resume Next
result = objshell.Run(strExec & " \\" & strComputer & " " & strInstaller1)
If Err.Number = 0 Then
WScript.Echo "PSXEC Runing WUA fix remotely"
Else MsgBox Err.Number
MsgBox result
End If
Set objWMIService = GetObject("winmgmts:" _
& "{impersonationLevel=impersonate}!\\" & strComputer & "\root\cimv2")
Set colLoggedEvents = objWMIService.ExecQuery _
("SELECT * FROM Win32_NTLogEvent WHERE Logfile = 'Application' AND " _
& "EventCode = '4'")
Wscript.Echo "Event Viewer checked and Fix Applied:" & colLoggedEvents.Count
MsgBox "Removing mapped drive Please wait.", 64, "Script Message"
If objFSO.DriveExists("Z:") Then
objnet.RemoveNetworkDrive "Z:","True","True"
End If
MsgBox "The WUA Fix has been applied.", 64, "Script Message"
WScript.Quit
Any help appreciated, and explanations of the process would be great as I would like to learn the process involved, which is difficult when working during the day.
Many thanks
Hi Bill,
Long story short, I have approx. 2800 clients with an old entry in WMI for updates that the SCCM client cannot clear or run because the updates do not exist anymore, so the client will not run updates or use a task sequence because of this. My script fixes this and does a couple of other things. I have found another way to do this by running a different script that uses WMI to call a cscript function that uses the wuafix.vbs that is copied to the machine. I am also changing the echo entries to output to a log file instead, so that I can track which clients have run the fix and which ones haven't.
If you have any suggestions then please let me know, nothing nefarious :)
many thanks
Shrink file (log) from within a procedure
I'd like to incorporate the DBCC shrinkfile command to my maintenance procedure. This procedure gets called after I've finished my weekly importing process. I only need to shrink the log files as almost all the modifications are either a record update or
an insert (there are very few deletions done). I need to do this across several databases and for software maintainability would prefer to have only the one procedure.
My issue is that there does not seem to be a way to point to the various databases from within a procedure to perform this operation. Also, the maintenance plan modules have a shrink database operation, but I don't see a shrink file operation, so that doesn't appear to be an option.
Have I overlooked something, or is it not possible to perform a shrink file operation on the transaction log files for multiple databases?
Developer Frog Haven Enterprises
Thank you for your response. While I did not use your answer verbatim, it did lead me to my solution, as I only need to perform the shrink operation on 4 out of the 7 databases in my SQL instance.
FYI my final solution was...
-- shrink the log files
DECLARE @sql nvarchar(500);
SET @sql = 'USE [vp]; DBCC SHRINKFILE (2, 100);';
EXEC (@sql);
SET @sql = 'USE [vp_arrow]; DBCC SHRINKFILE (2, 100);';
EXEC (@sql);
Developer Frog Haven Enterprises -
I've been having a lot of problems trying to get an old batch file we have laying around to run from my PowerShell script. The batch file actually asks for two inputs from the user. I've managed to put together a PowerShell script that echoes a response, but of course that only answers one of the prompts. As usual, I've simplified things here to get my testing done. The batch file looks like this:
@ECHO OFF
SET /P CUSTID=Customer Number:
SET /P DBCOUNT=Number of Live Databases:
ECHO Customer Id was set to : %CUSTID%
ECHO Database Count was set to : %DBCOUNT%
Two inputs, two echos to verify values have been set. Now, the powershell looks like this:
Param(
    [string]$ClientADG,
    [string]$ClientDBCount,
    [scriptblock]$Command
)
$ClientADG = '1013'
$ClientDBCount = '2'
$Response = $ClientADG + "`r`n" + $ClientDBCount
$Command = 'Invoke-Command -ComputerName localhost -ScriptBlock {cmd /c "echo ' + $ClientADG + ' | E:\Scripts\Setup\Company\DatabaseSetupTest.bat"}'
powershell -command $Command
Output looks like:
Customer Number: Number of Live Databases: Customer Id was set to : 1013
Database Count was set to :
As expected, as I'm only passing in one value. I can't figure out how to get a second value passed in for the second prompt. Instead of $ClientADG, I tried to mash the two values together in the $Response variable with a cr/lf, a cr, or an lf in between, but no go there either. In the essence of time, I need to get this batch file called from PowerShell to get some folks productive while I actually rewrite what the batch file does in another PowerShell script, so it can be integrated into other things. (I'm automating what a bunch of people spend hours doing into multiple scripts, and eventually one BIG script, so they can focus on doing their real jobs instead.)
How do I get this right in powershell? I don't want to modify the batch file at all at this point, just get it running from the powershell.
Thanks in advance!
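One approach that fits the symptoms described above (a sketch, not tested against the real batch file): each SET /P in the batch file reads one line from standard input, so piping one line per prompt into the batch file should answer both prompts in order. The path is the one from the question.

```powershell
# Each SET /P consumes one line of stdin, so piping two lines
# (one per prompt) should satisfy both prompts in order.
$ClientADG = '1013'
$ClientDBCount = '2'
$ClientADG, $ClientDBCount |
    cmd /c "E:\Scripts\Setup\Company\DatabaseSetupTest.bat"
```

PowerShell writes each object in the pipeline as its own line to the native command's standard input, which is why the array form works where a single string with an embedded CR/LF may behave differently.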
mpleaf -
It's a "simple" test so I can figure out how to get the arguments passed from PS to BAT. The bat file looks like this:
@ECHO OFF
SET CUSTID = %1
SET DBCOUNT = %2
ECHO Customer Id was set to : %CUSTID%
ECHO Database Count was set to : %DBCOUNT%
That's it. The PS script looks like this:
Invoke-Command -ComputerName myserver -ScriptBlock { cmd /c "E:\Scripts\Setup\Company\DatabaseSetupTest.bat 1013 2" }
That's it. The bat file exists on "myserver", and I'm getting the echo back, but without the values.
mpleaf -
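A likely culprit (my assumption, not confirmed in the thread): cmd.exe includes everything before the `=` in the variable name, so `SET CUSTID = %1` defines a variable named "CUSTID " (with a trailing space), and `%CUSTID%` stays empty. A quick way to check from PowerShell, using a throwaway copy of the batch file with the spaces removed:

```powershell
# Write a corrected copy of the batch file (no spaces around "=")
# to a temp path and run it with the same two arguments.
$bat = Join-Path $env:TEMP 'DatabaseSetupTest.bat'
@"
@ECHO OFF
SET CUSTID=%1
SET DBCOUNT=%2
ECHO Customer Id was set to : %CUSTID%
ECHO Database Count was set to : %DBCOUNT%
"@ | Set-Content $bat -Encoding Ascii
# With the spaces removed, both values should echo back.
cmd /c $bat 1013 2
```

Note that `%1` and `%2` are literal inside a PowerShell string (PowerShell does not expand `%`), so the here-string writes the batch file verbatim.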
I need two things for my PowerShell script:
1 - show a success message if the script runs successfully
2 - if the script runs with an error, write the error to a log file
Please guide me.
$Folders1 = "C:\inetpub\temp\YYY"
cd $Folders1
md $Folders1\Change
Copy-Item * Change -recurse -Force -Exclude Change
md $Folders1\Change\IPC
Move-Item Change\ZZZ Change\ZZZ
xcopy ZZZ Change\IPC\ZZZ /s /i
xcopy OrgFundamental Change\IPC\OrgFundamental /s /i
xcopy Card Change\Card-ib /s /i
$Folders2 = Get-ChildItem $Folders1\Change
foreach ($f in $Folders2)
{
    if ($f.name -notlike "IPC" -and $f.name -notlike "CardScheduler" -and $f.name -notlike "Scheduler")
    {
        md Change\$f\bin
        Move-Item Change\$f\*.dll Change\$f\bin
        Get-ChildItem -Path Change\$f "*.exe.config" | Rename-Item -NewName web.config
    }
}
This is sort of separate from error handling, but when it comes to logging what happens in my scripts, I have two approaches. If the system is running PowerShell 3.0 or later, and I don't care about console output (say, if the script is running as a scheduled
task, and no one will be looking at it interactively anyway), then sometimes I'll run the script like this:
PowerShell.exe -NoProfile -File 'c:\path\to\myScript.ps1' *> 'c:\Logs\someLogFile.txt'
The *> operator (introduced in PowerShell 3.0) redirects the Output, Error, Warning, Verbose and Debug streams, in this case, redirecting them all to a file.
More frequently, though, I use the
PowerShellLogging module. This has a few advantages over the redirection operator:
It works with PowerShell 2.0.
Output can be displayed both on-screen and sent to a log file, with minimal modification to the script code.
You can easily control which streams go to which files in any combination you like.
You can control the content of what gets sent to the log file. When using the stock Enable-LogFile cmdlet, you automatically get date and timestamps prepended to each non-blank line in the file.
Using this module in a script only requires two lines of code (and possibly only one, if you're running PowerShell 3.0 or later with module auto-loading enabled):
Import-Module PowerShellLogging
$logFile = Enable-LogFile -Path 'c:\Logs\someLogFile.txt'
It's also a good idea (but not strictly required) to pass your $logFile object to Disable-LogFile when your script finishes, so no other console output bleeds into your log file before the garbage collector comes along and stops that from happening. -
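To tie this back to the original two requirements (a success message, and errors written to a log file), a minimal try/catch sketch could look like the following; the log path is an example, and the copy step stands in for the real work of the script above.

```powershell
# Minimal sketch: print a success message, or append the error
# record to a log file on failure. Paths here are examples.
$errorLog = 'C:\Logs\errors.log'
try {
    # ... the folder copy/move work from the script goes here, e.g.:
    Copy-Item * Change -Recurse -Force -Exclude Change -ErrorAction Stop
    Write-Host 'Script completed successfully.'
}
catch {
    $_ | Out-File -FilePath $errorLog -Append
    Write-Host "Script failed; details written to $errorLog"
}
```

The `-ErrorAction Stop` matters: without it, many cmdlet errors are non-terminating and will not trigger the catch block.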
Redirect DBMS_OUTPUT to calling application and to log file
Hi,
I have a procedure that inserts a set of records into a table using a MERGE statement and captures the inserted record count.
Currently I display the record count using DBMS_OUTPUT in the Oracle SQL Developer tool, using DBMS_OUTPUT.ENABLE.
How do I redirect this output to both the calling application and a log file on the Unix server?
I have more DBMS_OUTPUT statements in the exception handling to handle failed inserts. How do I redirect these statements to the calling application and a log file on Unix?
Can we send an email to a group from PL/SQL if the program fails and the exception handler is triggered, or if the program completes successfully?
I appreciate your responses.
DBMS_OUTPUT is not the correct tool to be using for outputting information. It writes data to a buffer on the server, and then it's up to the client tool to read the data out of that buffer using the DBMS_OUTPUT.GET_LINE call.
You could try implementing something like that in your own application if you wanted, but in truth, if you're wanting to capture some trace of what's happening in your application, you are better off logging those things to a table using an autonomous transaction procedure, and then having whatever application you want just query that table. -
I just updated an existing slide show that was created several months back. Since loading my new set of files to our web server, we keep getting the following error in our server log files when someone loads our page in IE8 and IE9:
File does not exist: /www/public_html/null, referer:
One of our programmers tracked the error to this line of code, but we're not sure what's causing it:
Muse.Utils.addSelectorFn('#slideshowu70', function(elem) { new WebPro.Widget.ContentSlideShow(elem, {autoPlay:true,displayInterval:6000,slideLinkStopsSlideShow:false,transitionStyle:'horizontal',lightboxEnabled_runtime:false,shuffle:false}); });/* #slideshowu70 */
Any suggestions?
Thanks!
The errors aren't showing up on the client side, only in the server access logs. Every time the page is loaded with IE8 or IE9 and the Muse.Utils.addSelectorFn line with #slideshowu70 is hit, it generates four errors in the server access logs.
Our admin guy pulled the errors generated when the link was clicked from the Adobe forum. He said it also appeared that you tested it with www added to the URL (if this was when you were testing it)...
[Mon Feb 04 13:58:57 2013] [error] [client 121.242.198.2] File does not exist: /www/public_html/null, referer:http://stingrayboats.com/
[Mon Feb 04 13:59:00 2013] [error] [client 121.242.198.2] File does not exist: /www/public_html/null, referer:http://stingrayboats.com/
[Mon Feb 04 13:59:08 2013] [error] [client 121.242.198.2] File does not exist: /www/public_html/null, referer:http://stingrayboats.com/
[Mon Feb 04 13:59:33 2013] [error] [client 121.242.198.2] File does not exist: /www/public_html/null, referer:http://www.stingrayboats.com/
[Mon Feb 04 13:59:43 2013] [error] [client 121.242.198.2] File does not exist: /www/public_html/null, referer:http://www.stingrayboats.com/
The reason we find it strange is because it did not result in errors with the previous version of the slideshow. The only thing that changed this time around is that I removed some slides and added additional ones, but I noticed that there is a lot of difference in the code and even the scripts that are used. And I did copy over all of the scripts, css files, etc., so it's using all of the new source files.
I was hoping it was something that you guys had noticed if you review the access logs when you're testing. While it works perfectly on the client side, the IT guys do go through our access logs, so it would be nice to eliminate the errors.
Thanks for looking at it!