Regex to parse a stack trace in a log file
Hi,
How can I parse the stack trace below from a log file?
Here is the stack trace:
2012-08-10 08:19:17 java.lang.NullPointerException
2012-08-10 08:19:17 at net.minecraft.server.World.tickEntities(World.java:1146)
2012-08-10 08:19:17 at net.minecraft.server.MinecraftServer.q(MinecraftServer.java:567)
2012-08-10 08:19:17 at net.minecraft.server.DedicatedServer.q(DedicatedServer.java:212)
2012-08-10 08:19:17 at net.minecraft.server.MinecraftServer.p(MinecraftServer.java:476)
2012-08-10 08:19:17 at net.minecraft.server.MinecraftServer.run(MinecraftServer.java:408)
2012-08-10 08:19:17 at net.minecraft.server.ThreadServerApplication.run(SourceFile:539)
How can I parse only the stack trace using regex? And if I put the stack trace into a separate log file, what configuration is needed in log4j to capture only the stack trace?
Thanks.
> How to parse only the stack trace using regex?
From the entire file? I doubt that is practical.
Log files are normally rather large. You would need to read the entire file into memory and then apply a multiline regex, which means scanning and rescanning the text multiple times. If it doesn't just die, it will probably take a long time.
Better to build a parser that reads the file line by line and does it.
And of course the given example presumes that NO threads are in use. If multiple threads are logging, it is not deterministically possible to parse the file correctly, because the lines carry no thread indicator.
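A line-by-line scanner only needs one regex per line and never holds the whole file in memory. A minimal sketch in Python; the timestamp prefix and the exception/frame patterns are assumptions based on the sample above:

```python
import re

# Assumed timestamp prefix, matching the sample log above.
TS = r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2} "
EXC_RE = re.compile(TS + r"(\S+(?:Exception|Error)\b.*)")  # first line of a trace
FRAME_RE = re.compile(TS + r"(at \S+\(.*\))")              # "at ..." frame lines

def extract_traces(lines):
    """Collect stack traces by scanning line by line, no multiline regex."""
    traces, current = [], None
    for line in lines:
        m = EXC_RE.match(line)
        if m:                            # a new trace starts here
            if current:
                traces.append(current)
            current = [m.group(1)]
            continue
        m = FRAME_RE.match(line)
        if m and current is not None:
            current.append(m.group(1))   # frame belongs to the open trace
        elif current:                    # any other line closes the trace
            traces.append(current)
            current = None
    if current:
        traces.append(current)
    return traces

sample = [
    "2012-08-10 08:19:17 java.lang.NullPointerException",
    "2012-08-10 08:19:17 at net.minecraft.server.World.tickEntities(World.java:1146)",
    "2012-08-10 08:19:17 at net.minecraft.server.MinecraftServer.run(MinecraftServer.java:408)",
]
traces = extract_traces(sample)
```

This sidesteps the thread-interleaving problem only if traces are not interleaved; with multiple threads writing, the caveat above still applies.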
Similar Messages
-
Parse log file using powershell
Hi,
I am pretty new to PowerShell and would appreciate assistance in setting up a script that parses a log file and produces output for my requirements below.
I would like to parse the main log file for the Barra Aegis application (shown below) using PowerShell.
Main log = C:\BARRALIN\barralin.log
Model specific log = C:\BARRALIN\log\WG*.log
Requirements :
1. Scroll to the bottom of the log file, look for the name "GL Daily", and find its latest date, which in the example log below is "20150203".
Note: the name "GL Daily" and the date keep changing in the log file.
2. Once the entry is found, check that all 3 entries PREPROCESS, TRANSFER, and POSTPROCESS are success.
3. If all 3 are success, the script should identify the respective model-specific log number and print it out.
E.g. in the sample log below, "GL Daily" is preceded by the number "1718", so the script should combine the model log path with "WG00" and 1718; the final result should look like C:\BARRALIN\log\WG001718.log.
4. If all 3 items, or any one of them, are in a "failed" state, print the same log file info with WG001718.log.
Any help on this would be much appreciated.
Thank You.
Main log file :
START BARRALINK Check Auto Update 1716
43006 20150203
Trgt/Arch c:\barralin
PREPROCESS success 0 preprocessor: no error
TRANSFER success 1 Host success: files received
POSTPROCESS success 0 Postprocessor: no error
CONFIRMATION success 2 No Confirm needed
STOP 43105 20150203
START Aegis GL Monthly
1716 43117 20150203
Trgt/Arch K:\barraeqr\aegis\qnt\gleqty
PREPROCESS success 0 preprocessor: no error
TRANSFER success 1 Host success: files received
POSTPROCESS success 0 Postprocessor: no error
CONFIRMATION success 2 No Confirm needed
STOP 44435 20150203
START Aegis UB Daily
1717 43107 20150203
Trgt/Arch K:\barraeqr\aegis\qnt\gleqty
PREPROCESS success 0 preprocessor: no error
TRANSFER success 1 Host success: files received
POSTPROCESS success 0 Postprocessor: no error
CONFIRMATION success 2 No Confirm needed
STOP 44435 20150203
START Aegis GL Daily
1718 44437 20150203
Trgt/Arch K:\barraeqr\aegis\qnt\gleqty
PREPROCESS success 0 preprocessor: no error
TRANSFER success 1 Host success: files received
POSTPROCESS success 0 Postprocessor: no error
CONFIRMATION success 2 No Confirm needed
STOP 50309 20150203
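Assuming, as in the sample above, that each job block begins with a START line and the model number is the first field of the following line, the required logic might be sketched like this (Python used for brevity; the PowerShell translation is direct):

```python
import re

def find_model_log(log_text, job_name="GL Daily", log_dir="C:\\BARRALIN\\log"):
    """Find the latest block for job_name, check the three statuses,
    and build the model-specific log path (e.g. ...\\WG001718.log)."""
    blocks = re.split(r"(?m)^START ", log_text)[1:]
    for block in reversed(blocks):                   # latest entry first
        if job_name not in block.splitlines()[0]:
            continue
        num = re.search(r"(?m)^(\d+)\s", block)      # model number line
        statuses = re.findall(
            r"(?m)^(PREPROCESS|TRANSFER|POSTPROCESS)\s+(\w+)", block)
        all_ok = len(statuses) == 3 and all(s == "success" for _, s in statuses)
        if num:
            return "%s\\WG00%s.log" % (log_dir, num.group(1)), all_ok
    return None, False

sample = """START Aegis UB Daily
1717 43107 20150203
PREPROCESS success 0 preprocessor: no error
TRANSFER success 1 Host success: files received
POSTPROCESS success 0 Postprocessor: no error
STOP 44435 20150203
START Aegis GL Daily
1718 44437 20150203
PREPROCESS success 0 preprocessor: no error
TRANSFER success 1 Host success: files received
POSTPROCESS success 0 Postprocessor: no error
STOP 50309 20150203
"""
path, ok = find_model_log(sample)
```

The boolean distinguishes the "all success" case from the "failed" case; either way the same WG00… path is produced, matching requirement 4.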
Hi All,
I was writing a function in PowerShell to send email, and I want to append lines to the body of the email as needed, but I am not able to get this done. Here's my code:
Function Email ()
{
    $MailMessage = New-Object System.Net.Mail.MailMessage
    $SMTPClient = New-Object System.Net.Mail.SmtpClient -ArgumentList "mailhost.xxx.com"
    $Recipient = "[email protected]"
    If ($MessageBody -ne $null)
    {
        $MessageBody = "The details of Barra $strsessionProduct model is listed below
        `r HostName : $localhost
        `r Model Run Date : $Date
        `r Model Data Date : $DateList1
        `r`n Click for full job log" + "\\" + $localhost + "\E$\Local\Scripts\Logs "
        $MailMessage.Body = $MessageBody
    }
    If ($Subject -ne $null) {
        $MailMessage.Subject = $Subject
    }
    $Sender = "[email protected]"
    $MailMessage.Sender = $Sender
    $MailMessage.From = $Sender
    $MailMessage.to.Add($Recipient)
    If ($AttachmentFile -ne $null) { $MailMessage.Attachments.add($AttachmentFile) }
    $SMTPClient.Send($MailMessage)
}

$Subject = "Hello"
$AttachmentFile = ".\barralin.log"
$MessageBody = "Add this line to Body of email along with existing"
Email -Recipient "" -Subject $Subject -MessageBody $MessageBody -AttachmentFile $AttachmentFile
As you can see, before calling the Email function I added a line to $MessageBody, expecting the function to include it in the email body along with the existing lines. But that's not the case.
I have tried making $MessageBody an array and then adding content to it:
$MessageBody += "Add this line to Body of email along with existing"
$MessageBody = $MessageBody | out-string
Even this didn't work for me. Please suggest any other means to get this done.
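The likely culprit is that `Function Email ()` declares no parameters, so the `-Recipient`, `-MessageBody`, and other arguments in the call are never bound, and the function then overwrites `$MessageBody` with its own hard-coded text. Passing the body lines in explicitly avoids this. A hypothetical sketch of the same idea using Python's stdlib email classes (addresses are placeholders):

```python
from email.message import EmailMessage

def build_email(subject, body_lines,
                sender="sender@example.com", recipient="user@example.com"):
    """Build the message from an explicit list of body lines.

    The body is whatever the caller passes in, so lines appended before
    the call are kept instead of being overwritten inside the function.
    """
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content("\n".join(body_lines))
    return msg

body = ["The details of the Barra model run are listed below",
        "HostName : myhost"]
body.append("Add this line to Body of email along with existing")
msg = build_email("Hello", body)
```

In the PowerShell version, the equivalent fix is a `param(...)` block inside the function and removing the re-assignment of `$MessageBody`.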
Thank you. -
Parse robocopy log file - new value
Hello,
I have found a script that parses the robocopy log file, which looks like this:
ROBOCOPY :: Robust File Copy for Windows
Started : Thu Aug 07 09:30:18 2014
Source : e:\testfolder\
Dest : w:\testfolder\
Files : *.*
Options : *.* /V /NDL /S /E /COPYALL /NP /IS /R:1 /W:5
Same 14.6 g e:\testfolder\bigfile - Copy (5).out
Same 14.6 g e:\testfolder\bigfile - Copy.out
Same 14.6 g e:\testfolder\bigfile.out
            Total    Copied   Skipped  Mismatch    FAILED    Extras
 Dirs :         1         0         1         0         0         0
Files :         3         3         0         0         0         0
Bytes :  43.969 g  43.969 g         0         0         0         0
Times :   0:05:44   0:05:43                       0:00:00   0:00:00
Speed : 137258891 Bytes/sec.
Speed : 7854.016 MegaBytes/min.
Ended : Thu Aug 07 09:36:02 2014
Most of the values end up in the output file, but the two speed parameters do not.
How can I get these two speed parameters into the output file?
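One likely reason the script misses them is that it keys each summary field by a unique label, and both speed lines share the same label "Speed", so one would overwrite the other. Pulling both out directly is straightforward; a sketch in Python against the footer shown above:

```python
import re

footer = """\
Times : 0:05:44 0:05:43 0:00:00 0:00:00
Speed : 137258891 Bytes/sec.
Speed : 7854.016 MegaBytes/min.
Ended : Thu Aug 07 09:36:02 2014
"""

# Both lines share the label "Speed", so collect them in order of appearance
# as (value, unit) pairs instead of keying them by label.
speeds = re.findall(r"Speed :\s+([\d.]+)\s+(\S+)", footer)
```

In the PowerShell script this corresponds to giving the second occurrence its own key (e.g. "Speednew"), as the modified script later in this thread does.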
Here is the script:
param(
[parameter(Position=0,Mandatory=$true,ValueFromPipeline=$false,HelpMessage='Source Path with no trailing slash')][string]$SourcePath,
[switch]$fp
write-host "Robocopy log parser. $(if($fp){"Parsing file entries"} else {"Parsing summaries only, use -fp to parse file entries"})"
#Arguments
# -fp File parse. Counts status flags and oldest file Slower on big files.
$ElapsedTime = [System.Diagnostics.Stopwatch]::StartNew()
$refreshrate=1 # progress counter refreshes this often when parsing files (in seconds)
# These summary fields always appear in this order in a robocopy log
$HeaderParams = @{
"04|Started" = "date";
"01|Source" = "string";
"02|Dest" = "string";
"03|Options" = "string";
"07|Dirs" = "counts";
"08|Files" = "counts";
"09|Bytes" = "counts";
"10|Times" = "counts";
"05|Ended" = "date";
#"06|Duration" = "string"
$ProcessCounts = @{
"Processed" = 0;
"Error" = 0;
"Incomplete" = 0
$tab=[char]9
$files=get-childitem $SourcePath
$writer=new-object System.IO.StreamWriter("$(get-location)\robocopy-$(get-date -format "dd-MM-yyyy_HH-mm-ss").csv")
function Get-Tail([object]$reader, [int]$count = 10) {
$lineCount = 0
[long]$pos = $reader.BaseStream.Length - 1
while($pos -gt 0)
$reader.BaseStream.position=$pos
# 0x0D (#13) = CR
# 0x0A (#10) = LF
if ($reader.BaseStream.ReadByte() -eq 10)
$lineCount++
if ($lineCount -ge $count) { break }
$pos--
# tests for file shorter than requested tail
if ($lineCount -lt $count -or $pos -ge $reader.BaseStream.Length - 1) {
$reader.BaseStream.Position=0
} else {
# $reader.BaseStream.Position = $pos+1
$lines=@()
while(!$reader.EndOfStream) {
$lines += $reader.ReadLine()
return $lines
function Get-Top([object]$reader, [int]$count = 10)
$lines=@()
$lineCount = 0
$reader.BaseStream.Position=0
while(($linecount -lt $count) -and !$reader.EndOfStream) {
$lineCount++
$lines += $reader.ReadLine()
return $lines
function RemoveKey ( $name ) {
if ( $name -match "\|") {   # escape the pipe; an unescaped "|" matches every string
return $name.split("|")[1]
} else {
return ( $name )
function GetValue ( $line, $variable ) {
if ($line -like "*$variable*" -and $line -like "* : *" ) {
$result = $line.substring( $line.IndexOf(":")+1 )
return $result
} else {
return $null
function UnBodgeDate ( $dt ) {
# Fixes RoboCopy botched date-times in format Sat Feb 16 00:16:49 2013
if ( $dt -match ".{3} .{3} \d{2} \d{2}:\d{2}:\d{2} \d{4}" ) {
$dt=$dt.split(" ")
$dt=$dt[2],$dt[1],$dt[4],$dt[3]
$dt=$dt -join " "
if ( $dt -as [DateTime] ) {
return ([DateTime]$dt).ToString("dd/MM/yyyy hh:mm:ss")
} else {
return $null
function UnpackParams ($params ) {
# Unpacks file count bloc in the format
# Dirs : 1827 0 1827 0 0 0
# Files : 9791 0 9791 0 0 0
# Bytes : 165.24 m 0 165.24 m 0 0 0
# Times : 1:11:23 0:00:00 0:00:00 1:11:23
# Parameter name already removed
if ( $params.length -ge 58 ) {
$params = $params.ToCharArray()
$result=(0..5)
for ( $i = 0; $i -le 5; $i++ ) {
$result[$i]=$($params[$($i*10 + 1) .. $($i*10 + 9)] -join "").trim()
$result=$result -join ","
} else {
$result = ",,,,,"
return $result
$sourcecount = 0
$targetcount = 1
# Write the header line
$writer.Write("File")
foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
if ( $HeaderParam.value -eq "counts" ) {
$tmp="~ Total,~ Copied,~ Skipped,~ Mismatch,~ Failed,~ Extras"
$tmp=$tmp.replace("~","$(removekey $headerparam.name)")
$writer.write(",$($tmp)")
} else {
$writer.write(",$(removekey $HeaderParam.name)")
if($fp){
$writer.write(",Scanned,Newest,Summary")
$writer.WriteLine()
$filecount=0
# Enumerate the files
foreach ($file in $files) {
$filecount++
write-host "$filecount/$($files.count) $($file.name) ($($file.length) bytes)"
$results=@{}
$Stream = $file.Open([System.IO.FileMode]::Open,
[System.IO.FileAccess]::Read,
[System.IO.FileShare]::ReadWrite)
$reader = New-Object System.IO.StreamReader($Stream)
#$filestream=new-object -typename System.IO.StreamReader -argumentlist $file, $true, [System.IO.FileAccess]::Read
$HeaderFooter = Get-Top $reader 16
if ( $HeaderFooter -match "ROBOCOPY :: Robust File Copy for Windows" ) {
if ( $HeaderFooter -match "Files : " ) {
$HeaderFooter = $HeaderFooter -notmatch "Files : "
[long]$ReaderEndHeader=$reader.BaseStream.position
$Footer = Get-Tail $reader 16
$ErrorFooter = $Footer -match "ERROR \d \(0x000000\d\d\) Accessing Source Directory"
if ($ErrorFooter) {
$ProcessCounts["Error"]++
write-host -foregroundcolor red "`t $ErrorFooter"
} elseif ( $footer -match "---------------" ) {
$ProcessCounts["Processed"]++
$i=$Footer.count
while ( !($Footer[$i] -like "*----------------------*") -or $i -lt 1 ) { $i-- }
$Footer=$Footer[$i..$Footer.Count]
$HeaderFooter+=$Footer
} else {
$ProcessCounts["Incomplete"]++
write-host -foregroundcolor yellow "`t Log file $file is missing the footer and may be incomplete"
foreach ( $HeaderParam in $headerparams.GetEnumerator() | Sort-Object Name ) {
$name = "$(removekey $HeaderParam.Name)"
$tmp = GetValue $($HeaderFooter -match "$name : ") $name
if ( $tmp -ne "" -and $tmp -ne $null ) {
switch ( $HeaderParam.value ) {
"date" { $results[$name]=UnBodgeDate $tmp.trim() }
"counts" { $results[$name]=UnpackParams $tmp }
"string" { $results[$name] = """$($tmp.trim())""" }
default { $results[$name] = $tmp.trim() }
if ( $fp ) {
write-host "Parsing $($reader.BaseStream.Length) bytes"
# Now go through the file line by line
$reader.BaseStream.Position=0
$filesdone = $false
$linenumber=0
$FileResults=@{}
$newest=[datetime]"1/1/1900"
$linecount++
$firsttick=$elapsedtime.elapsed.TotalSeconds
$tick=$firsttick+$refreshrate
$LastLineLength=1
try {
do {
$line = $reader.ReadLine()
$linenumber++
if (($line -eq "-------------------------------------------------------------------------------" -and $linenumber -gt 16) ) {
# line is end of job
$filesdone=$true
} elseif ($linenumber -gt 16 -and $line -gt "" ) {
$buckets=$line.split($tab)
# this test will pass if the line is a file, fail if a directory
if ( $buckets.count -gt 3 ) {
$status=$buckets[1].trim()
$FileResults["$status"]++
$SizeDateTime=$buckets[3].trim()
if ($sizedatetime.length -gt 19 ) {
$DateTime = $sizedatetime.substring($sizedatetime.length -19)
if ( $DateTime -as [DateTime] ){
$DateTimeValue=[datetime]$DateTime
if ( $DateTimeValue -gt $newest ) { $newest = $DateTimeValue }
if ( $elapsedtime.elapsed.TotalSeconds -gt $tick ) {
$line=$line.Trim()
if ( $line.Length -gt 48 ) {
$line="[...]"+$line.substring($line.Length-48)
$line="$([char]13)Parsing > $($linenumber) ($(($reader.BaseStream.Position/$reader.BaseStream.length).tostring("P1"))) - $line"
write-host $line.PadRight($LastLineLength) -NoNewLine
$LastLineLength = $line.length
$tick=$tick+$refreshrate
} until ($filesdone -or $reader.endofstream)
finally {
$reader.Close()
$line=$($([string][char]13)).padright($lastlinelength)+$([char]13)
write-host $line -NoNewLine
$writer.Write("`"$file`"")
foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
$name = "$(removekey $HeaderParam.Name)"
if ( $results[$name] ) {
$writer.Write(",$($results[$name])")
} else {
if ( $ErrorFooter ) {
#placeholder
} elseif ( $HeaderParam.Value -eq "counts" ) {
$writer.Write(",,,,,,")
} else {
$writer.Write(",")
if ( $ErrorFooter ) {
$tmp = $($ErrorFooter -join "").substring(20)
$tmp=$tmp.substring(0,$tmp.indexof(")")+1)+","+$tmp
$writer.write(",,$tmp")
} elseif ( $fp ) {
$writer.write(",$LineCount,$($newest.ToString('dd/MM/yyyy hh:mm:ss'))")
foreach ( $FileResult in $FileResults.GetEnumerator() ) {
$writer.write(",$($FileResult.Name): $($FileResult.Value);")
$writer.WriteLine()
} else {
write-host -foregroundcolor darkgray "$($file.name) is not recognised as a RoboCopy log file"
write-host "$filecount files scanned in $($elapsedtime.elapsed.tostring()), $($ProcessCounts["Processed"]) complete, $($ProcessCounts["Error"]) have errors, $($ProcessCounts["Incomplete"]) incomplete"
write-host "Results written to $($writer.basestream.name)"
$writer.close()
I hope somebody can help me,
Horst
Thanks, Horst. MOSS 2007 Farm; MOSS 2010 Farm; TFS 2010; TFS 2013; IIS 7.5
Hi Horst,
To convert multiple robocopy log files to a .csv file with the "speed" values included, the script below may be helpful. I tested it with a single robocopy log file, and the .csv file is written to "D:\":
$SourcePath="e:\1\1.txt" #robocopy log file
write-host "Robocopy log parser. $(if($fp){"Parsing file entries"} else {"Parsing summaries only, use -fp to parse file entries"})"
#Arguments
# -fp File parse. Counts status flags and oldest file Slower on big files.
$ElapsedTime = [System.Diagnostics.Stopwatch]::StartNew()
$refreshrate=1 # progress counter refreshes this often when parsing files (in seconds)
# These summary fields always appear in this order in a robocopy log
$HeaderParams = @{
"04|Started" = "date";
"01|Source" = "string";
"02|Dest" = "string";
"03|Options" = "string";
"09|Dirs" = "counts";
"10|Files" = "counts";
"11|Bytes" = "counts";
"12|Times" = "counts";
"05|Ended" = "date";
"07|Speed" = "default";
"08|Speednew" = "default"
$ProcessCounts = @{
"Processed" = 0;
"Error" = 0;
"Incomplete" = 0
$tab=[char]9
$files=get-childitem $SourcePath
$writer=new-object System.IO.StreamWriter("D:\robocopy-$(get-date -format "dd-MM-yyyy_HH-mm-ss").csv")
function Get-Tail([object]$reader, [int]$count = 10) {
$lineCount = 0
[long]$pos = $reader.BaseStream.Length - 1
while($pos -gt 0)
$reader.BaseStream.position=$pos
# 0x0D (#13) = CR
# 0x0A (#10) = LF
if ($reader.BaseStream.ReadByte() -eq 10)
$lineCount++
if ($lineCount -ge $count) { break }
$pos--
# tests for file shorter than requested tail
if ($lineCount -lt $count -or $pos -ge $reader.BaseStream.Length - 1) {
$reader.BaseStream.Position=0
} else {
# $reader.BaseStream.Position = $pos+1
$lines=@()
while(!$reader.EndOfStream) {
$lines += $reader.ReadLine()
return $lines
function Get-Top([object]$reader, [int]$count = 10)
$lines=@()
$lineCount = 0
$reader.BaseStream.Position=0
while(($linecount -lt $count) -and !$reader.EndOfStream) {
$lineCount++
$lines += $reader.ReadLine()
return $lines
function RemoveKey ( $name ) {
if ( $name -match "\|") {   # escape the pipe; an unescaped "|" matches every string
return $name.split("|")[1]
} else {
return ( $name )
function GetValue ( $line, $variable ) {
if ($line -like "*$variable*" -and $line -like "* : *" ) {
$result = $line.substring( $line.IndexOf(":")+1 )
return $result
} else {
return $null
}
function UnBodgeDate ( $dt ) {
# Fixes RoboCopy botched date-times in format Sat Feb 16 00:16:49 2013
if ( $dt -match ".{3} .{3} \d{2} \d{2}:\d{2}:\d{2} \d{4}" ) {
$dt=$dt.split(" ")
$dt=$dt[2],$dt[1],$dt[4],$dt[3]
$dt=$dt -join " "
if ( $dt -as [DateTime] ) {
return ([DateTime]$dt).ToString("dd/MM/yyyy hh:mm:ss")
} else {
return $null
function UnpackParams ($params ) {
# Unpacks file count bloc in the format
# Dirs : 1827 0 1827 0 0 0
# Files : 9791 0 9791 0 0 0
# Bytes : 165.24 m 0 165.24 m 0 0 0
# Times : 1:11:23 0:00:00 0:00:00 1:11:23
# Parameter name already removed
if ( $params.length -ge 58 ) {
$params = $params.ToCharArray()
$result=(0..5)
for ( $i = 0; $i -le 5; $i++ ) {
$result[$i]=$($params[$($i*10 + 1) .. $($i*10 + 9)] -join "").trim()
$result=$result -join ","
} else {
$result = ",,,,,"
return $result
$sourcecount = 0
$targetcount = 1
# Write the header line
$writer.Write("File")
foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
if ( $HeaderParam.value -eq "counts" ) {
$tmp="~ Total,~ Copied,~ Skipped,~ Mismatch,~ Failed,~ Extras"
$tmp=$tmp.replace("~","$(removekey $headerparam.name)")
$writer.write(",$($tmp)")
} else {
$writer.write(",$(removekey $HeaderParam.name)")
if($fp){
$writer.write(",Scanned,Newest,Summary")
$writer.WriteLine()
$filecount=0
# Enumerate the files
foreach ($file in $files) {
$filecount++
write-host "$filecount/$($files.count) $($file.name) ($($file.length) bytes)"
$results=@{}
$Stream = $file.Open([System.IO.FileMode]::Open,
[System.IO.FileAccess]::Read,
[System.IO.FileShare]::ReadWrite)
$reader = New-Object System.IO.StreamReader($Stream)
#$filestream=new-object -typename System.IO.StreamReader -argumentlist $file, $true, [System.IO.FileAccess]::Read
$HeaderFooter = Get-Top $reader 16
if ( $HeaderFooter -match "ROBOCOPY :: Robust File Copy for Windows" ) {
if ( $HeaderFooter -match "Files : " ) {
$HeaderFooter = $HeaderFooter -notmatch "Files : "
[long]$ReaderEndHeader=$reader.BaseStream.position
$Footer = Get-Tail $reader 16
$ErrorFooter = $Footer -match "ERROR \d \(0x000000\d\d\) Accessing Source Directory"
if ($ErrorFooter) {
$ProcessCounts["Error"]++
write-host -foregroundcolor red "`t $ErrorFooter"
} elseif ( $footer -match "---------------" ) {
$ProcessCounts["Processed"]++
$i=$Footer.count
while ( !($Footer[$i] -like "*----------------------*") -or $i -lt 1 ) { $i-- }
$Footer=$Footer[$i..$Footer.Count]
$HeaderFooter+=$Footer
} else {
$ProcessCounts["Incomplete"]++
write-host -foregroundcolor yellow "`t Log file $file is missing the footer and may be incomplete"
foreach ( $HeaderParam in $headerparams.GetEnumerator() | Sort-Object Name ) {
$name = "$(removekey $HeaderParam.Name)"
if ($name -eq "speed") { # handle the two Speed lines
($HeaderFooter -match "$name : ") | foreach {
$tmp = GetValue $_ "speed"
$results[$name] = $tmp.trim()
$name += "new"
}
}
elseif ($name -eq "speednew") {} # second Speed line already captured above
else {
$tmp = GetValue $($HeaderFooter -match "$name : ") $name
if ( $tmp -ne "" -and $tmp -ne $null ) {
switch ( $HeaderParam.value ) {
"date" { $results[$name]=UnBodgeDate $tmp.trim() }
"counts" { $results[$name]=UnpackParams $tmp }
"string" { $results[$name] = """$($tmp.trim())""" }
default { $results[$name] = $tmp.trim() }
if ( $fp ) {
write-host "Parsing $($reader.BaseStream.Length) bytes"
# Now go through the file line by line
$reader.BaseStream.Position=0
$filesdone = $false
$linenumber=0
$FileResults=@{}
$newest=[datetime]"1/1/1900"
$linecount++
$firsttick=$elapsedtime.elapsed.TotalSeconds
$tick=$firsttick+$refreshrate
$LastLineLength=1
try {
do {
$line = $reader.ReadLine()
$linenumber++
if (($line -eq "-------------------------------------------------------------------------------" -and $linenumber -gt 16) ) {
# line is end of job
$filesdone=$true
} elseif ($linenumber -gt 16 -and $line -gt "" ) {
$buckets=$line.split($tab)
# this test will pass if the line is a file, fail if a directory
if ( $buckets.count -gt 3 ) {
$status=$buckets[1].trim()
$FileResults["$status"]++
$SizeDateTime=$buckets[3].trim()
if ($sizedatetime.length -gt 19 ) {
$DateTime = $sizedatetime.substring($sizedatetime.length -19)
if ( $DateTime -as [DateTime] ){
$DateTimeValue=[datetime]$DateTime
if ( $DateTimeValue -gt $newest ) { $newest = $DateTimeValue }
if ( $elapsedtime.elapsed.TotalSeconds -gt $tick ) {
$line=$line.Trim()
if ( $line.Length -gt 48 ) {
$line="[...]"+$line.substring($line.Length-48)
$line="$([char]13)Parsing > $($linenumber) ($(($reader.BaseStream.Position/$reader.BaseStream.length).tostring("P1"))) - $line"
write-host $line.PadRight($LastLineLength) -NoNewLine
$LastLineLength = $line.length
$tick=$tick+$refreshrate
} until ($filesdone -or $reader.endofstream)
finally {
$reader.Close()
$line=$($([string][char]13)).padright($lastlinelength)+$([char]13)
write-host $line -NoNewLine
$writer.Write("`"$file`"")
foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
$name = "$(removekey $HeaderParam.Name)"
if ( $results[$name] ) {
$writer.Write(",$($results[$name])")
} else {
if ( $ErrorFooter ) {
#placeholder
} elseif ( $HeaderParam.Value -eq "counts" ) {
$writer.Write(",,,,,,")
} else {
$writer.Write(",")
if ( $ErrorFooter ) {
$tmp = $($ErrorFooter -join "").substring(20)
$tmp=$tmp.substring(0,$tmp.indexof(")")+1)+","+$tmp
$writer.write(",,$tmp")
} elseif ( $fp ) {
$writer.write(",$LineCount,$($newest.ToString('dd/MM/yyyy hh:mm:ss'))")
foreach ( $FileResult in $FileResults.GetEnumerator() ) {
$writer.write(",$($FileResult.Name): $($FileResult.Value);")
$writer.WriteLine()
} else {
write-host -foregroundcolor darkgray "$($file.name) is not recognised as a RoboCopy log file"
write-host "$filecount files scanned in $($elapsedtime.elapsed.tostring()), $($ProcessCounts["Processed"]) complete, $($ProcessCounts["Error"]) have errors, $($ProcessCounts["Incomplete"]) incomplete"
write-host "Results written to $($writer.basestream.name)"
$writer.close()
If you have any other questions, please feel free to let me know.
Best Regards,
Anna Wang
TechNet Community Support -
Parsing a log file on Weblogic
Hi!
I'd like to know how to get started on parsing a log file present in the default directory of WebLogic (version 6.1, to be precise).
I thought of using regular expressions via java.util.regex, but that is only supported from JDK 1.4 onwards, whereas WebLogic 6.1 supports JDK 1.3.
If you can also provide a code template for this, that would be nice.
Thanks in advance,
Deepthy.
uncle_alice wrote:
> String regex = "([^\"\\\\]++|\\\\.)++";
> The trick is to match anything except a quotation mark or a backslash, OR match a backslash followed by anything (because the backslash is usually used to escape other characters as well, including backslashes).
Superb! Thanks! I have to admit I've never used ++ before (only the greedy quantifiers), but that's exactly the thing I was looking for.
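For comparison, Python's standard `re` module only gained possessive quantifiers in 3.11, but the same escaped-string idea works with ordinary greedy quantifiers; a small sketch (the sample line is made up):

```python
import re

# Match a double-quoted field: runs of characters that are neither a quote
# nor a backslash, or a backslash followed by any character (an escape).
QUOTED = re.compile(r'"((?:[^"\\]|\\.)*)"')

line = r'GET "/index.html" "a \"quoted\" value" 200'
fields = QUOTED.findall(line)
```

Without the possessive `++` this version can backtrack on pathological input, which is exactly the inefficiency the possessive form is designed to rule out.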
Just for the completeness, this is the whole thing that's able to parse a log line:
{code}
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LogParser {
private static final String NOSPACE_PARAM = "([^ ]++)";
private static final String DATE_PARAM = "([^\\]]++)";
private static final String ESCAPED_PARAM = "((?:[^\"\\\\]++|\\\\.)++)";
private static final String PATTERN_STRING = NOSPACE_PARAM
+ " " + NOSPACE_PARAM
+ " " + NOSPACE_PARAM
+ " \\[" + DATE_PARAM + "\\]"
+ " \"" + ESCAPED_PARAM + "\""
+ " " + NOSPACE_PARAM
+ " " + NOSPACE_PARAM
+ " \"" + ESCAPED_PARAM + "\""
+ " \"" + ESCAPED_PARAM + "\""
+ " \"" + ESCAPED_PARAM + "\""
+ " \"" + ESCAPED_PARAM + "\""
+ " \"" + ESCAPED_PARAM + "\""
+ " " + NOSPACE_PARAM
+ " \"" + ESCAPED_PARAM + "\""
+ " \"" + ESCAPED_PARAM + "\""
+ " \"" + ESCAPED_PARAM + "\""
+ " \"" + ESCAPED_PARAM + "\"";
private static final Pattern PATTERN = Pattern.compile(PATTERN_STRING);
public static String[] parse(String line) {
    Matcher m = PATTERN.matcher(line);
    if (m.matches()) {
        String[] result = new String[m.groupCount()];
        for (int i = 0; i < m.groupCount();) {
            result[i] = m.group(++i);
        }
        return result;
    }
    return null;
}
}
{code}
Any idea about the efficiency of this thing? -
Performance Issue: Wait event "log file sync" and "Execute to Parse %"
In one of our test environments, users are complaining about slow response times.
In the Statspack report, the following are the top 5 wait events:
Event Waits Time (cs) Wt Time
log file parallel write 1,046 988 37.71
log file sync 775 774 29.54
db file scattered read 4,946 248 9.47
db file parallel write 66 248 9.47
control file parallel write 188 152 5.80
And after running the same application 4 times, we are getting Execute to Parse % = 0.10. Cursor sharing is forced and query rewrite is enabled.
When I view v$sql, the following statement is parsed frequently:
EXECUTIONS PARSE_CALLS
SQL_TEXT
93380 93380
select SEQ_ORDO_PRC.nextval from DUAL
Please suggest what should be the method to troubleshoot this and if I need to check some more information
Regards,
Sudhanshu Bhandari
Well, of course, you probably can't eliminate this sort of thing entirely: a setup such as yours is inevitably a compromise. What you can do is make sure your log buffer is a good size (say 10 MB or so); that your redo logs are large (at least 100 MB each, and preferably large enough to hold an hour or so of redo produced at the busiest time for your database without filling up); and finally set ARCHIVE_LAG_TARGET to something like 1800 seconds or more to ensure a regular, routine, predictable log switch.
It won't cure every ill, but that sort of setup often means the redo subsystem ceases to be a regular driver of foreground waits. -
Parsing sendmail log file in a Java application
Are there any good parsing libraries for sendmail log files?
I already found these libraries but I'm not sure if they do what I need:
http://www.opennms.org/documentation/java-apidocs-stable/org/opennms/netmgt/syslogd/package-summary.html
http://code.google.com/p/jsyslogd/
I've written a simple text editor, and this editor saves files with a particular extension. It can both open and save files. I've put the text editor program in a JAR. What I'd like to do, if possible, is associate the file extension with the text editor program. That is, when I click on a file with that extension, I'd like the text editor to come up with the file opened in it.
Can anyone give me ideas on how to do this, please?
If the editor is launched using webstart, the launch file can suggest a file association.
Note that an application that accesses the local file system needs to be digitally signed before it can break out of the applet like 'sandbox' in which it runs, unless it uses the JNLP API to access the files. The JNLP API is available to any app. launched using webstart.
There is an example of both claiming a file extension, and accessing files using the JNLP API, in this [File Service Demo|http://pscode.org/jws/api.html#fs]. The complete source and a build file can be downloaded from the filetest.zip just near the launch buttons.
I suggest you try the sandboxed version first - if you think that will suit your user, go with that.
As an aside, for best chance of a solution, I recommend folks add [Duke stars|http://wikis.sun.com/display/SunForums/Duke+Stars+Program+Overview] to match the importance of the task. -
Parse a log file...
Hi All,
1. I want to parse the content of a log file, but when I open the log file it does not show field names.
It starts with rows containing the contents directly, and I want to read and process only three of the fields.
I have written code that works on IIS logs; the log I want to parse uses a single whitespace ' ' as the field separator.
2. Some of the log files are zipped, so I am unable to open and read them in order to parse them.
Does anyone have a clue or code that could help me out? Thanks!
bhatnagarudit, here is a suggested algorithm (I don't want to write the code for you :-)).
You have the following format.
314159b66967d86f031c7249d1d9a8024.. mybucket [04/Aug/2006:22:34:02 +0000] 72.21.206.5 314159b66967d86f031c724... 3E57427F33A59F07 REST.PUT.OBJECT /photos/2006/08/puppy.jpg "GET /mybucket/photos/2006/08/puppy.jpg?x-foo=bar" 200 NoSuchBucket 2662992 3462992 70 10 "http://www.amazon.com/webservices" "curl/7.15.1"
Read the file in and go through the lines one by one. For each line:
1. Get the content in the first pair of square brackets. Regular expression: \[(.*?)\]
2. From there, get the fourth word separated by spaces.
3. From there, get the content in the first pair of double quotes. Regular expression: "(.*?)"
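Those three steps might look like this in Python (the log line below is abbreviated from the example, with placeholder fields where the original is elided):

```python
import re

line = ('314159b66967d86f031c7249d1d9a8024 mybucket '
        '[04/Aug/2006:22:34:02 +0000] 72.21.206.5 requester '
        '3E57427F33A59F07 REST.PUT.OBJECT /photos/2006/08/puppy.jpg '
        '"GET /mybucket/photos/2006/08/puppy.jpg?x-foo=bar" 200')

# Step 1: content of the first pair of square brackets.
timestamp = re.search(r"\[([^\]]*)\]", line).group(1)
# Step 2: from there, the fourth space-separated word.
fourth = line[line.index("]") + 1:].split()[3]
# Step 3: content of the first pair of double quotes.
request = re.search(r'"([^"]*)"', line).group(1)
```

Because the fields are space-separated but a few of them (the bracketed date and the quoted request) contain spaces, extracting those two delimited pieces first keeps the simple `split()` usable for the rest.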
Can console.app parse log files?
The Console app displays the system.log file differently from other files ending in .log - it can display different columns with Time, sender, Message, host, etc. Additionally you can search for specific messages.
Here is my question:
Is it possible to display other log files like this? Maybe it's possible to create a "pattern file" that teaches Console how to display my files nicely? I am a web programmer and would like to read the custom log files from my applications in Console in a nicely formatted way.
I don't think the Console application will do what you want. The view that you describe is coming out of a database (/var/log/asl.db); it's not being displayed by directly parsing the text files.
Now, you can add to this asl.db database by utilizing syslogd (see "man asl.conf" and "man syslogd" for more information), which could in theory control what you see in Console. However, that's not going to be as clean or as easy as I think you'd like.
Hope this helps a little... -
Parsing Log file with PowerShell
Hey guys, I have the following lines in a txt file (log file):
2012-08-14 18:00:00 [ERROR] . Exception SQL error 1
2012-08-14 18:10:00 [ERROR] . Exception SQL error 2
2012-08-15 18:00:00 [INFO] . Started
- Check the most recent entries from the last 24 hours
- If there's an error [ERROR], write out a statement that says (Critical) with the date-time of the error
- If there are no errors, write out (Ok)
So far I have written this much and would like to learn more from you:
$file = "C:\Users\example\Documents\Log.txt"
cat $file | Select-String "ERROR" -SimpleMatch
Hello,
I am new to PowerShell, and looking for same requirement, here is my function.
Function CheckLogs()
{
    param ([string] $logfile)
    if (!$logfile) { write-host "Usage: ""<Log file path>"""; exit }
    cat $logfile | Select-String "ERROR" -SimpleMatch | select -expand line |
    foreach {
        $_ -match '(.+)\s\[(ERROR)\]\s(.+)' | Out-Null
        new-object psobject -Property @{Timestamp = [datetime]$matches[1]; Error = $matches[2]} |
            where {$_.timestamp -gt (get-date).AddDays(-1)}
        $error_time = [datetime]($matches[1])
        if ($error_time -gt (Get-Date).AddDays(-1))
        {
            write-output "CRITICAL: There is an error in the log file $logfile around $($error_time.ToShortTimeString())"; exit(2)
        }
    }
    write-output "OK: There were no errors in the past 24 hours."
}
CheckLogs "C:\Log.txt" # Function call
The content of my log file is as follows:
[ERROR] 2013-12-23 19:46:32
[ERROR] 2013-12-24 19:46:35
[ERROR] 2013-12-24 19:48:56
[ERROR] 2013-12-24 20:13:07
After executing the above script I get the error below; can you please correct me?
$error_time = [datetime]($matches[1])
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : NullArray
Cannot index into a null array.
At C:\PS\LogTest.ps1:10 char:21
+ new-object psobject -Property @{Timestamp =
[datetime]$match ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : NullArray
Cannot index into a null array.
At C:\Test\LogTest.ps1:12 char:21
+ $error_time = [datetime]($matches[1])
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : NullArray -
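The root cause of the "Cannot index into a null array" error: the pattern '(.+)\s\[(ERROR)\]\s(.+)' expects a timestamp *before* the [ERROR] tag, but your log lines start with [ERROR]. So -match returns $false, $matches is never populated, and indexing into it fails. A level-first pattern fixes it. Here is a quick sketch of the difference (in Java, since this page mixes PowerShell and Java; the class and helper names are mine, not from the thread):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LevelFirstDemo {

    // Pattern from the thread: expects "<timestamp> [ERROR] <message>".
    static final Pattern TS_FIRST = Pattern.compile("(.+)\\s\\[(ERROR)\\]\\s(.+)");

    // Variant matching the poster's actual lines: "[ERROR] <timestamp>".
    static final Pattern LEVEL_FIRST =
        Pattern.compile("\\[(ERROR)\\]\\s(\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2})");

    // Returns the timestamp of a level-first line, or null if the line doesn't match.
    static String timestampOf(String line) {
        Matcher m = LEVEL_FIRST.matcher(line);
        return m.matches() ? m.group(2) : null;
    }

    public static void main(String[] args) {
        String line = "[ERROR] 2013-12-24 19:46:35";
        // The thread's pattern never matches this line, which is why $matches stays null.
        System.out.println("timestamp-first matches: " + TS_FIRST.matcher(line).matches());
        System.out.println("level-first timestamp:   " + timestampOf(line));
    }
}
```

In the PowerShell function the equivalent fix is to swap the pattern to '\[(ERROR)\]\s(.+)' and take the timestamp from $matches[2].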
Parse XMLFormatter log files back into LogRecords
My app uses a java.util.logging.FileHandler and XMLFormatter to generate XML log files, which is great. Now I want to display those logs in my app's web admin console.
I assumed that there would be an easy way to read my XML log files back into a List of LogRecord objects, but my initial investigation has revealed nothing. Anyone have any advice?

If you remove an "active" log file, then this can cause problems. If you remove an archived log file, then it is OK.
If you change the log directory, then you SHOULD inform all your applications that use this directory... Depending on the service, this information is usually stored inside the config files of the services.
Mihalis. -
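For the XMLFormatter question above: I'm not aware of a built-in reader that turns XMLFormatter output back into LogRecord objects, but the <record> layout is simple enough that a small DOM pass can rebuild them. A rough sketch, not a definitive implementation (class name and sample are mine; real files also carry a DOCTYPE referencing logger.dtd, which is why external-DTD loading is switched off here):

```java
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class XmlLogParser {

    // A tiny sample in XMLFormatter's shape (real files also start with a DOCTYPE).
    static final String SAMPLE =
        "<log><record>"
        + "<millis>1344583157000</millis>"
        + "<logger>demo</logger>"
        + "<level>SEVERE</level>"
        + "<message>hello</message>"
        + "</record></log>";

    public static List<LogRecord> parse(InputStream in) throws Exception {
        DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
        // Don't try to fetch the logger.dtd referenced by real XMLFormatter files.
        f.setFeature("http://apache.org/xml/features/nonvalidating/load-external-dtd", false);
        DocumentBuilder b = f.newDocumentBuilder();
        Document doc = b.parse(in);
        List<LogRecord> records = new ArrayList<>();
        NodeList nodes = doc.getElementsByTagName("record");
        for (int i = 0; i < nodes.getLength(); i++) {
            Element rec = (Element) nodes.item(i);
            LogRecord r = new LogRecord(
                Level.parse(text(rec, "level")), text(rec, "message"));
            r.setMillis(Long.parseLong(text(rec, "millis")));
            r.setLoggerName(text(rec, "logger"));
            records.add(r);
        }
        return records;
    }

    private static String text(Element rec, String tag) {
        NodeList n = rec.getElementsByTagName(tag);
        return n.getLength() > 0 ? n.item(0).getTextContent() : "";
    }

    public static void main(String[] args) throws Exception {
        List<LogRecord> recs =
            parse(new java.io.ByteArrayInputStream(SAMPLE.getBytes()));
        System.out.println(recs.size() + " record(s); first message: "
            + recs.get(0).getMessage());
    }
}
```

Fields such as <class>, <method> and <thread> can be mapped the same way if you need them in the web console.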
Most efficient way to consume log files
Hello everyone,
I've been absent from the forums for awhile but I'm back at it now...
I have a question about the most efficient way to consume log files. I read in PowerShell in Action, by Bruce Payette, that using a switch statement with a regex works pretty well; that said, I haven't tried it yet. Select-String is working pretty well for me, but I have about 10 different entry types that I need to search the logs for every 5 minutes, and I'm scanning about 15 GB of logs at every interval. Anyway, if anyone has information about how to do something like that as quickly as possible, I'd appreciate it.
1. piping log files that meet my criteria to select-string
- This seems to work well but I don't like searching the same files over and over again
2. running logs through get-content and then building a filter statement
- This is ok but it seems to use up a fair bit of memory
3. Some other approach that I haven't thought of yet.
Anyway, I know this is a relatively nebulous question, sorry about that. I'm hoping that someone on here knows a really good way to find strings in log files quickly. Hope that helps! Jason

You can sometimes squeeze out more speed at the expense of memory usage, but filters are pretty fast. I don't see a benefit to holding the whole file in memory, in this case.
As I mentioned earlier, though, C# code will usually blow PowerShell away in terms of execution time. Here's a rewrite of what I just did (just for the INI Section pattern, to keep the post size down):
$string = @'
#Comment Line
[Ini-Style Section Line]
Key = Value Line
192.168.0.1 localhost
Some line that doesn't match anything.
'@

Set-Content -Path .\test.txt -Value $string
Add-Type -TypeDefinition @'
using System;
using System.Text.RegularExpressions;
using System.Collections;
using System.IO;

public interface ILineParser
{
    object ParseLine(string line);
}

public class IniSection
{
    public string Section;
}

public class IniSectionParser : ILineParser
{
    public object ParseLine(string line)
    {
        object o = null;
        Match match = Regex.Match(line, @"^\s*\[([^\]]+)\]\s*$");
        if (match.Success)
        {
            o = new IniSection() { Section = match.Groups[1].Value };
        }
        return o;
    }
}

public class LogParser
{
    public static IEnumerable ParseFile(string fileName, ILineParser[] lineParsers)
    {
        using (StreamReader sr = File.OpenText(fileName))
        {
            string line;
            while ((line = sr.ReadLine()) != null)
            {
                foreach (ILineParser parser in lineParsers)
                {
                    object result = parser.ParseLine(line);
                    if (result != null)
                    {
                        yield return result;
                    }
                }
            }
        }
    }
}
'@

$parsers = @(
    New-Object IniSectionParser
)

$results = [LogParser]::ParseFile("$pwd\test.txt", $parsers)
$results
Instead of defining separate classes for each type of line and output object, you could probably do something more generic with delegates (similar to how I used ScriptBlock.Invoke() in the PowerShell example), but it might sacrifice some speed to do so. -
Reading .log file & Sorting input
Hello all
Currently I'm working on a project where I have to read an "in.log" file, sort it, and save it to another "out.log" file. The contents of the in.log file are:
[204.0.44.73]: Dir: path
[204.0.44.73]: Dir: path
[204.0.44.74]: Dir: path
[204.0.44.73]: Dir: path
[204.0.44.74]: Dir: path
and so on. Now, what I have to end up with is this in the out.log:
#1
[204.0.44.73]: Dir: path
[204.0.44.73]: Dir: path
[204.0.44.73]: Dir: path
Count = 3
#2
[204.0.44.74]: Dir: path
[204.0.44.74]: Dir: path
Count = 2
It's for the system administrator at school, who doesn't want to do it himself but wants to pay me a (very small) amount to do it. I'll pay max duke dollars for a reply which can help me.
Please help, thx in advance

I must be really bored... Use it with "java Parser <logname>":
import java.io.*;
import java.util.*;

public class Parser {

    public void splitLog(File input) {
        try {
            FileInputStream fis = new FileInputStream(input);
            BufferedReader br = new BufferedReader(new InputStreamReader(fis));
            Hashtable hosts = new Hashtable();
            String line;
            while ((line = br.readLine()) != null) {
                StringTokenizer st = new StringTokenizer(line, "]");
                if (st.hasMoreTokens()) {
                    String host = st.nextToken();
                    if (host.trim().startsWith("[")) {
                        host = host.trim().substring(1);
                        if (!hosts.containsKey(host)) {
                            hosts.put(host, new Vector());
                        }
                        Vector v = (Vector) hosts.get(host);
                        v.addElement(line);
                    }
                }
            }
            Enumeration e = hosts.keys(); // "enum" is a reserved word in current Java, so the variable is renamed
            while (e.hasMoreElements()) {
                String host = (String) e.nextElement();
                Vector v = (Vector) hosts.get(host);
                FileOutputStream fos = new FileOutputStream(host + ".log");
                for (int i = 0; i < v.size(); i++) {
                    line = (String) v.elementAt(i);
                    fos.write((line + "\r\n").getBytes());
                }
                fos.close();
            }
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }

    public static void main(String[] args) {
        File input = new File(args[0]);
        Parser parser = new Parser();
        parser.splitLog(input);
    }
} -
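The splitter above writes one file per host, but the requested out.log layout (each group numbered, followed by a Count line) only needs the grouping plus a counter. A minimal sketch (class and method names are mine; writing the result to out.log instead of returning a String is a trivial change):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class GroupByHost {

    // Group log lines by the leading [ip] token and render the requested
    // "#n ... Count = n" layout.
    public static String group(List<String> lines) {
        Map<String, List<String>> byHost = new LinkedHashMap<>();
        for (String line : lines) {
            int end = line.indexOf(']');
            if (!line.startsWith("[") || end < 0) continue; // skip malformed lines
            String host = line.substring(1, end);
            byHost.computeIfAbsent(host, k -> new java.util.ArrayList<>()).add(line);
        }
        StringBuilder out = new StringBuilder();
        int n = 1;
        for (Map.Entry<String, List<String>> e : byHost.entrySet()) {
            out.append("#").append(n++).append("\n");
            for (String line : e.getValue()) out.append(line).append("\n");
            out.append("Count = ").append(e.getValue().size()).append("\n");
        }
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.print(group(java.util.Arrays.asList(
            "[204.0.44.73]: Dir: path",
            "[204.0.44.74]: Dir: path",
            "[204.0.44.73]: Dir: path")));
    }
}
```

To sort groups by IP rather than first-seen order, swap the LinkedHashMap for a TreeMap.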
System Update 5.01.0005 hangs every attempt on my X121e (log files inline)
Hi, I installed TVSU on my x121e laptop and it hangs at 95% on every attempt - like 5 times now. I have tried to uninstall and reinstall, and I am using the latest version, which I think is 5.01.0005.
Note that the system totally freezes on me. Nothing responds and Ctrl+Alt+Delete does not do anything. Only way is to shut down using the power button.
System:
ThinkPad x121e (30456UG)
Windows 8 Pro 64-bit (v6.2 build 9200)
Any ideas on what I can do to get it running? I don't know how to attach log files so I pasted them in full.
tvsu_exe.log:
TSS(L) 1: 22:07:56:0953 tvsu.cpp(1825): Installdir: C:\Program Files (x86)\Lenovo\System Update
TSS(L) 1: 22:07:57:0281 tvsu.cpp(1754): Installdir: C:\Program Files (x86)\Lenovo\System Update
TSS(L) 1: 22:07:57:0281 tvsu.cpp(1775): szCmd: C:\Program Files (x86)\Lenovo\System Update\TvsuCommandLauncher.exe
TSS(L) 1: 22:08:02:0579 tvsu.cpp(579): StartTvsukernel : admin
TSS(L) 1: 22:08:02:0594 tvsu.cpp(586): Begin InvokeUacSdkViaService
TSS(L) 1: 22:08:02:0626 tvsu.cpp(431): Installdir: C:\Program Files (x86)\Lenovo\System Update
TSS(L) 1: 22:08:02:0626 tvsu.cpp(466): szCmd: C:\Program Files (x86)\Lenovo\System Update\TvsuCommandLauncher.exe
ServiceStart.log = empty
UACSdk.txt:
TSS(L) 0: 22:08:06:0188 uacsdk.cpp(170): User: ADMIN
TSS(L) 0: 22:08:06:0204 uacsdk.cpp(171): TempRegFile:
TSS(L) 0: 22:08:06:0204 uacsdk.cpp(172): ComdLine:
TSS(L) 0: 22:08:06:0220 uacsdk.cpp(180): ComdLine tvsukernel: C:\Program Files (x86)\Lenovo\System Update\\Tvsukernel.exe
TSS(L) 0: 22:08:06:0220 uacsdk.cpp(190): Vista admin
TSS(L) 0: 22:08:06:0220 vista.cpp(62): StartUIOnVistaWithAdminLongon ComdLine: C:\Program Files (x86)\Lenovo\System Update\\Tvsukernel.exe
TSS(L) 0: 22:08:06:0220 vista.cpp(70): WTSGetActiveConsoleSessionId dwSessionId: 1
TSS(L) 0: 22:08:06:0235 vista.cpp(124): ImpersonateLoggedOnUser succeed!
TSS(L) 0: 22:08:06:0235 vista.cpp(129): CreateProcessAsUser CommandLine : C:\Program Files (x86)\Lenovo\System Update\\Tvsukernel.exe
TSS(L) 0: 22:08:06:0423 vista.cpp(154): CreateProcessAsUser succeed!

Hmm, I can in fact not paste the log, since the message exceeds 20,000 chars. Where do I find the "attach file" icon? I'm looking everywhere.
Pasting bits and pieces from tvsu_log_121219220812.txt:
Severe 2012-12-19 , 10:08:13
vid Tvsu.Sdk.SuSdk.StartApplication()
Message: Application runs with the framework: 2.0.50727.6400
Severe 2012-12-19 , 10:09:04 vid Tvsu.FileDownloader.HttpsDownload.Init(FileDownloadInfo fileInfo) Message: Debug Log: Init method:GET
Severe 2012-12-19 , 10:09:05 vid Tvsu.FileDownloader.HttpsDownload.doDownloadByHttps(FileDownloadInfo fileInfo, downloadingDelegate downDelegate) Message: Debug Log: doDownloadByHttps InterException is null, uri:https://download.lenovo.com/ibmdl/pub/pc/pccbbs/agent/SSClientCommon/HelloLevel_9_01_00.xml
Severe 2012-12-19 , 10:09:05 vid Tvsu.FileDownloader.HttpsDownload.doDownloadByHttps(FileDownloadInfo fileInfo, downloadingDelegate downDelegate) Message: Debug Log doDownloadByHttps webException message:Fjärrservern returnerade ett fel: (404) Kunde inte hittas.
Severe 2012-12-19 , 10:09:05 vid Tvsu.FileDownloader.HttpsDownload.doDownloadByHttps(FileDownloadInfo fileInfo, downloadingDelegate downDelegate) Message: Debug Log server path: https://download.lenovo.com/ibmdl/pub/pc/pccbbs/agent/SSClientCommon/HelloLevel_9_01_00.xml responseStatus:404
Severe 2012-12-19 , 10:09:05 vid Tvsu.FileDownloader.HttpsDownload.doDownloadByHttps(FileDownloadInfo fileInfo, downloadingDelegate downDelegate) Message: Debug Log server path: https://download.lenovo.com/ibmdl/pub/pc/pccbbs/agent/SSClientCommon/HelloLevel_9_01_00.xml webException.StackTrace: vid System.Net.HttpWebRequest.GetResponse() vid Tvsu.FileDownloader.HttpsDownload.doDownloadByHttps(FileDownloadInfo fileInfo, downloadingDelegate downDelegate)
Severe 2012-12-19 , 10:09:05
vid Tvsu.Engine.Process.HelloProcess.Start()
Message: Could't connect to the HelloServer, no UDF file was downloaded
Severe 2012-12-19 , 10:09:31
vid Tvsu.FileDownloader.HttpsDownload.Init(FileDownloadInfo fileInfo)
Message: Debug Log: Init method:GET
Then just TONS of Info messages and here is the end of the file
Info 2012-12-19 , 10:09:37 vid Tvsukernel.CustomControls.Step.<>c__DisplayClass7.<set_Image>b__6() Message: Setting DONE status.
Info 2012-12-19 , 10:09:37 vid Tvsukernel.CustomControls.Step.<>c__DisplayClass7.<set_Image>b__6() Message: Setting PROCESSING status.
Info 2012-12-19 , 10:09:37 vid Tvsu.Coreq.LoadCoreqsProcessor.ProcessUpdatesImplementation(Update[] ups) Message: Candidate list: Critical patch to fix TVSU UTS issue[reboot type 1] ThinkVantage Fingerprint Software for Windows 7/8 32-bit[reboot type 3] Lenovo Power Management Driver - 8 [32,64][reboot type 3] ThinkVantage Aktivt skyddssystem (64bit)[reboot type 3] ThinkVantage Fingerprint Software for Windows 7/8 64-bit[reboot type 3] WiMAX Driver for Intel Cards (For Win8 64bit)[reboot type 3] Intel Management Engine Interface driver for Win8 32/64 bit[reboot type 3] Synaptics ThinkPad UltraNav Driver - Vista/7/8 [32,64][reboot type 3] Realtek Card Reader Driver for Windows 8 32/64 bit -WW-[reboot type 3] WiFi driver for Intel cards Driver - 8 [64][reboot type 3] Lenovo Settings Dependency Pacakge - 8 [32,64][reboot type 3] ThinkVantage Aktivt skyddssystem (64bit)[reboot type 3] AMD USB Filter Driver for Windows 8 32/64bit[reboot type 3] ThinkVantage Aktivt skyddssystem (32bit)[reboot type 3] WiFi driver for Intel cards Driver - 8 [32][reboot type 3] WiMAX Driver for Intel Cards (For Win8 32bit)[reboot type 3]
Info 2012-12-19 , 10:09:37 vid Tvsu.Coreq.LoadCoreqsProcessor.ProcessUpdatesImplementation(Update[] ups) Message: Resulted order of candidate list: ThinkVantage Fingerprint Software for Windows 7/8 32-bit[reboot type 3] Lenovo Power Management Driver - 8 [32,64][reboot type 3] ThinkVantage Aktivt skyddssystem (64bit)[reboot type 3] ThinkVantage Fingerprint Software for Windows 7/8 64-bit[reboot type 3] WiMAX Driver for Intel Cards (For Win8 64bit)[reboot type 3] Intel Management Engine Interface driver for Win8 32/64 bit[reboot type 3] Synaptics ThinkPad UltraNav Driver - Vista/7/8 [32,64][reboot type 3] Realtek Card Reader Driver for Windows 8 32/64 bit -WW-[reboot type 3] WiFi driver for Intel cards Driver - 8 [64][reboot type 3] Lenovo Settings Dependency Pacakge - 8 [32,64][reboot type 3] ThinkVantage Aktivt skyddssystem (64bit)[reboot type 3] AMD USB Filter Driver for Windows 8 32/64bit[reboot type 3] ThinkVantage Aktivt skyddssystem (32bit)[reboot type 3] WiFi driver for Intel cards Driver - 8 [32][reboot type 3] WiMAX Driver for Intel Cards (For Win8 32bit)[reboot type 3] Critical patch to fix TVSU UTS issue[reboot type 1]
Info 2012-12-19 , 10:09:37 vid Tvsu.Coreq.CoreqProcessor.ProcessUpdatesImplementation(Update[] updates) Message: Beginning Coreq process: Recieved 16Updates -
Hi,
A WebLogic 10.3.2.0 server is hanging at startup. There are no error messages. The last line in the startup window is:
"The server log file <log file dest> is opened. All server side log events will be written to this file."
I think the next line should be:
"Security initializing using security realm realm."
Any ideas on what could be the issue? For instance, what resources should be accessed at that point in time? There is sufficient space left on the (virtual machine) disk. The VM is configured with 8 GB of memory. Could it still be performance related?
Following is written to the log file:
####<12.aug.2010 kl 09.47 CEST> <Info> <WebLogicServer> <oim> <> <Main Thread> <> <> <> <1281599254656> <BEA-000214> <WebLogic Server "AdminServer" version:
WebLogic Server 10.3.2.0 Tue Oct 20 12:16:15 PDT 2009 1267925 Copyright (c) 1995, 2009, Oracle and/or its affiliates. All rights reserved.>
####<12.aug.2010 kl 09.47 CEST> <Notice> <Log Management> <oim> <> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <> <> <> <1281599255312> <BEA-170019> <The server log file ....logs\AdminServer.log is opened. All server side log events will be written to this file.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Log Management> <oim> <> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <> <> <> <1281599255390> <BEA-170023> <The Server Logging is initialized with Java Logging API implementation.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Diagnostics> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599255671> <BEA-320001> <The ServerDebug service initialized successfully.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Store> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599256515> <BEA-280050> <Persistent store "WLS_DIAGNOSTICS" opened: directory="....s\domains\oim\servers\AdminServer\data\store\diagnostics" writePolicy="Disabled" blockSize=512 directIO=false driver="wlfileio2">
####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257140> <BEA-002622> <The protocol "t3" is now configured.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257140> <BEA-002622> <The protocol "t3s" is now configured.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257140> <BEA-002622> <The protocol "http" is now configured.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257140> <BEA-002622> <The protocol "https" is now configured.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257140> <BEA-002622> <The protocol "iiop" is now configured.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257156> <BEA-002622> <The protocol "iiops" is now configured.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257156> <BEA-002622> <The protocol "ldap" is now configured.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257156> <BEA-002622> <The protocol "ldaps" is now configured.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257187> <BEA-002622> <The protocol "cluster" is now configured.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257187> <BEA-002622> <The protocol "clusters" is now configured.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257218> <BEA-002622> <The protocol "snmp" is now configured.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257218> <BEA-002622> <The protocol "admin" is now configured.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257218> <BEA-002624> <The administration protocol is "t3s" and is now configured.>
####<12.aug.2010 kl 09.47 CEST> <Info> <RJVM> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257468> <BEA-000570> <Network Configuration for Channel "AdminServer"
Listen Address :7001
Public Address N/A
Http Enabled true
Tunneling Enabled false
Outbound Enabled false
Admin Traffic Enabled true>
####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257687> <BEA-002609> <Channel Service initialized.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Socket> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599258000> <BEA-000406> <NTSocketMuxer was built on Jan 13 2005 17:47:03
####<12.aug.2010 kl 09.47 CEST> <Info> <Socket> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599258078> <BEA-000436> <Allocating 3 reader threads.>
####<12.aug.2010 kl 09.47 CEST> <Info> <Socket> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599258078> <BEA-000446> <Native IO Enabled.>
####<12.aug.2010 kl 09.47 CEST> <Info> <IIOP> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599259500> <BEA-002014> <IIOP subsystem enabled.>
Thanks!!

Tried both of these, still having the same error as below:
<Sep 8, 2010 1:32:37 PM IST> <Critical> <Security> <BEA-090402> <Authentication denied: Boot identity not valid; The user name and/or password from the boot identity file (boot.properties) is not valid. The boot identity may have been changed since the boot identity file was created. Please edit and update the boot identity file with the proper values of username and password. The first time the updated boot identity file is used to start the server, these new values are encrypted.>
<Sep 8, 2010 1:32:37 PM IST> <Critical> <WebLogicServer> <BEA-000386> <Server subsystem failed. Reason: weblogic.security.SecurityInitializationException: Authentication denied: Boot identity not valid; The user name and/or password from the boot identity file (boot.properties) is not valid. The boot identity may have been changed since the boot identity file was created. Please edit and update the boot identity file with the proper values of username and password. The first time the updated boot identity file is used to start the server, these new values are encrypted.
weblogic.security.SecurityInitializationException: Authentication denied: Boot identity not valid; The user name and/or password from the boot identity file (boot.properties) is not valid. The boot identity may have been changed since the boot identity file was created. Please edit and update the boot identity file with the proper values of username and password. The first time the updated boot identity file is used to start the server, these new values are encrypted.
at weblogic.security.service.CommonSecurityServiceManagerDelegateImpl.doBootAuthorization(CommonSecurityServiceManagerDelegateImpl.java:959)
at weblogic.security.service.CommonSecurityServiceManagerDelegateImpl.initialize(CommonSecurityServiceManagerDelegateImpl.java:1050)
at weblogic.security.service.SecurityServiceManager.initialize(SecurityServiceManager.java:875)
at weblogic.security.SecurityService.start(SecurityService.java:141)
at weblogic.t3.srvr.SubsystemRequest.run(SubsystemRequest.java:64)
Truncated. see log file for complete stacktrace
Caused By: javax.security.auth.login.FailedLoginException: [Security:090304]Authentication Failed: User weblogic2 javax.security.auth.login.FailedLoginException: [Security:090302]Authentication Failed: User weblogic2 denied
at weblogic.security.providers.authentication.LDAPAtnLoginModuleImpl.login(LDAPAtnLoginModuleImpl.java:250)
at com.bea.common.security.internal.service.LoginModuleWrapper$1.run(LoginModuleWrapper.java:110)
at java.security.AccessController.doPrivileged(Native Method)
at com.bea.common.security.internal.service.LoginModuleWrapper.login(LoginModuleWrapper.java:106)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
Truncated. see log file for complete stacktrace
>
<Sep 8, 2010 1:32:37 PM IST> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to FAILED>
<Sep 8, 2010 1:32:37 PM IST> <Error> <WebLogicServer> <BEA-000383> <A critical service failed. The server will shut itself down>
<Sep 8, 2010 1:32:37 PM IST> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to FORCE_SHUTTING_DOWN>
Pls help me out ASAP... -
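For the boot-identity error quoted above, the usual remedy is the one the message itself suggests: recreate the boot identity file under the server's security directory with plain-text credentials and restart; WebLogic encrypts the values on the first successful boot. A sketch (the values below are placeholders, not your real credentials):

```properties
# DOMAIN_HOME/servers/AdminServer/security/boot.properties
# Plain text on first use; WebLogic encrypts both values after a successful start.
username=weblogic
password=<your-admin-password>
```

Also double-check that the user in the file (here "weblogic2" per the stack trace) actually exists in the security realm with that password.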
How to read Skype connection logs (.log files in /...
Hi all,
A funny problem but really important - please help!
I set up logging to find out causes of some tech problems occuring during my calls, as it described here: https://support.skype.com/en/faq/FA12321/how-do-i-create-log-files
So, I get now /Logs folder with .log files.
But they are not text files, and I don't know what this format is or what tool can read them.
Skype tech support doesn't know as well. Please help, who knows, how to read these .log files?
Thanks!
PS: my OS is Ubuntu Linux

It's somewhat odd that the Skype tech support staffer you spoke with doesn't know what to do with the logs, as the instructions you posted say to place them in a ZIP file and send it to them. Since the log files I observed start with 'BLOGBEGIN,' there must be software to parse the file. Also, you never described the tech problems during calls which prompted you to enable logging, and it seems doubtful Skype Support suggested it, given they didn't know what to do with the files. Perhaps you can provide a description of the issue?