Parsing a log file on Weblogic

Hi!
I'd like to know how to get started on parsing a log file present in the default directory of WebLogic (version 6.1, to be precise).
I thought of using regular expressions via java.util.regex, but that package is only available from JDK 1.4 onwards, whereas WebLogic 6.1 supports JDK 1.3.
If you could also provide a code template for this, that would be nice.
Thanks in advance,
Deepthy.
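
Since java.util.regex only appeared in JDK 1.4, one way to get started on JDK 1.3 is with plain java.io and java.util.StringTokenizer, which are available there. The sketch below is only an illustration under assumptions of mine: the file name is a placeholder, and it simply treats each line as whitespace-separated fields rather than the real WebLogic log layout.
{code}
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.StringTokenizer;

// Minimal JDK 1.3-compatible sketch: read a log file line by line and
// split each line on whitespace. The path below is a placeholder.
public class SimpleLogReader {
    public static void main(String[] args) throws IOException {
        BufferedReader in = new BufferedReader(
                new FileReader("mydomain/weblogic.log")); // hypothetical log file
        try {
            String line;
            while ((line = in.readLine()) != null) {
                StringTokenizer st = new StringTokenizer(line, " ");
                while (st.hasMoreTokens()) {
                    // process each whitespace-separated field here
                    System.out.print(st.nextToken() + " | ");
                }
                System.out.println();
            }
        } finally {
            in.close();
        }
    }
}
{code}
If you need real regular expressions on JDK 1.3, third-party packages such as Jakarta ORO or Jakarta Regexp run on that JDK.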

uncle_alice wrote:
{code}
String regex = "([^\"\\\\]++|\\\\.)++";
{code}
The trick is to match anything except a quotation mark or a backslash, OR match a backslash followed by anything (because the backslash is usually used to escape other characters as well, including backslashes).

Superb! Thanks! I have to admit I've never used the ++ quantifier before (only the greedy ones), but that's exactly what I was looking for.
Just for completeness, this is the whole thing that is able to parse a log line:
{code}
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LogParser {

    private static final String NOSPACE_PARAM = "([^ ]++)";
    private static final String DATE_PARAM = "([^\\]]++)";
    private static final String ESCAPED_PARAM = "((?:[^\"\\\\]++|\\\\.)++)";

    // One capture group per field of the log line: plain tokens, the bracketed
    // date, and quoted fields that may contain escaped characters.
    private static final String PATTERN_STRING = NOSPACE_PARAM
            + " " + NOSPACE_PARAM
            + " " + NOSPACE_PARAM
            + " \\[" + DATE_PARAM + "\\]"
            + " \"" + ESCAPED_PARAM + "\""
            + " " + NOSPACE_PARAM
            + " " + NOSPACE_PARAM
            + " \"" + ESCAPED_PARAM + "\""
            + " \"" + ESCAPED_PARAM + "\""
            + " \"" + ESCAPED_PARAM + "\""
            + " \"" + ESCAPED_PARAM + "\""
            + " \"" + ESCAPED_PARAM + "\""
            + " " + NOSPACE_PARAM
            + " \"" + ESCAPED_PARAM + "\""
            + " \"" + ESCAPED_PARAM + "\""
            + " \"" + ESCAPED_PARAM + "\""
            + " \"" + ESCAPED_PARAM + "\"";

    private static final Pattern PATTERN = Pattern.compile(PATTERN_STRING);

    // Returns the captured fields, or null if the line does not match.
    public static String[] parse(String line) {
        Matcher m = PATTERN.matcher(line);
        if (m.matches()) {
            String[] result = new String[m.groupCount()];
            for (int i = 0; i < m.groupCount();) {
                result[i] = m.group(++i);
            }
            return result;
        }
        return null;
    }
}
{code}
Any idea about the efficiency of this thing?
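Regarding efficiency: since PATTERN is compiled once into a static field, the per-line cost is just the match itself, and the possessive quantifiers (++) avoid most backtracking. For reference, a small driver that feeds a log file through the parser could look like the sketch below; the file name is a placeholder and the choice of which groups to print is arbitrary.
{code}
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

// Hypothetical driver for the LogParser class above; "access.log" is a placeholder.
public class LogParserDemo {
    public static void main(String[] args) throws IOException {
        BufferedReader in = new BufferedReader(new FileReader("access.log"));
        try {
            String line;
            while ((line = in.readLine()) != null) {
                String[] fields = LogParser.parse(line);
                if (fields == null) {
                    System.err.println("Unparsable line: " + line);
                    continue;
                }
                // fields[] holds the captured groups in order of appearance
                System.out.println(fields[3] + " " + fields[4]);
            }
        } finally {
            in.close();
        }
    }
}
{code}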

Similar Messages

  • Parse robocopy Log File - new value

    Hello,
    I have found a script that parses the robocopy log file, which looks like this:
       ROBOCOPY     ::     Robust File Copy for Windows                             
      Started : Thu Aug 07 09:30:18 2014
       Source : e:\testfolder\
         Dest : w:\testfolder\
        Files : *.*
      Options : *.* /V /NDL /S /E /COPYALL /NP /IS /R:1 /W:5
         Same          14.6 g e:\testfolder\bigfile - Copy (5).out
         Same          14.6 g e:\testfolder\bigfile - Copy.out
         Same          14.6 g e:\testfolder\bigfile.out
                   Total    Copied   Skipped  Mismatch    FAILED    Extras
        Dirs :         1         0         1         0         0         0
       Files :         3         3         0         0         0         0
       Bytes :  43.969 g  43.969 g         0         0         0         0
       Times :   0:05:44   0:05:43                       0:00:00   0:00:00
       Speed :           137258891 Bytes/sec.
       Speed :            7854.016 MegaBytes/min.
       Ended : Thu Aug 07 09:36:02 2014
    Most values are included in the output file, but the two Speed parameters are not.
    How can I get these two Speed parameters into the output file?
    Here is the script:
    param(
    [parameter(Position=0,Mandatory=$true,ValueFromPipeline=$false,HelpMessage='Source Path with no trailing slash')][string]$SourcePath,
    [switch]$fp
    write-host "Robocopy log parser. $(if($fp){"Parsing file entries"} else {"Parsing summaries only, use -fp to parse file entries"})"
    #Arguments
    # -fp File parse. Counts status flags and oldest file Slower on big files.
    $ElapsedTime = [System.Diagnostics.Stopwatch]::StartNew()
    $refreshrate=1 # progress counter refreshes this often when parsing files (in seconds)
    # These summary fields always appear in this order in a robocopy log
    $HeaderParams = @{
    "04|Started" = "date";
    "01|Source" = "string";
    "02|Dest" = "string";
    "03|Options" = "string";
    "07|Dirs" = "counts";
    "08|Files" = "counts";
    "09|Bytes" = "counts";
    "10|Times" = "counts";
    "05|Ended" = "date";
    #"06|Duration" = "string"
    $ProcessCounts = @{
    "Processed" = 0;
    "Error" = 0;
    "Incomplete" = 0
    $tab=[char]9
    $files=get-childitem $SourcePath
    $writer=new-object System.IO.StreamWriter("$(get-location)\robocopy-$(get-date -format "dd-MM-yyyy_HH-mm-ss").csv")
    function Get-Tail([object]$reader, [int]$count = 10) {
    $lineCount = 0
    [long]$pos = $reader.BaseStream.Length - 1
    while($pos -gt 0)
    $reader.BaseStream.position=$pos
    # 0x0D (#13) = CR
    # 0x0A (#10) = LF
    if ($reader.BaseStream.ReadByte() -eq 10)
    $lineCount++
    if ($lineCount -ge $count) { break }
    $pos--
    # tests for file shorter than requested tail
    if ($lineCount -lt $count -or $pos -ge $reader.BaseStream.Length - 1) {
    $reader.BaseStream.Position=0
    } else {
    # $reader.BaseStream.Position = $pos+1
    $lines=@()
    while(!$reader.EndOfStream) {
    $lines += $reader.ReadLine()
    return $lines
    function Get-Top([object]$reader, [int]$count = 10)
    $lines=@()
    $lineCount = 0
    $reader.BaseStream.Position=0
    while(($linecount -lt $count) -and !$reader.EndOfStream) {
    $lineCount++
    $lines += $reader.ReadLine()
    return $lines
    function RemoveKey ( $name ) {
    if ( $name -match "|") {
    return $name.split("|")[1]
    } else {
    return ( $name )
    function GetValue ( $line, $variable ) {
    if ($line -like "*$variable*" -and $line -like "* : *" ) {
    $result = $line.substring( $line.IndexOf(":")+1 )
    return $result
    } else {
    return $null
    function UnBodgeDate ( $dt ) {
    # Fixes RoboCopy botched date-times in format Sat Feb 16 00:16:49 2013
    if ( $dt -match ".{3} .{3} \d{2} \d{2}:\d{2}:\d{2} \d{4}" ) {
    $dt=$dt.split(" ")
    $dt=$dt[2],$dt[1],$dt[4],$dt[3]
    $dt -join " "
    if ( $dt -as [DateTime] ) {
     return $dt.ToString("dd/MM/yyyy hh:mm:ss")
    } else {
    return $null
    function UnpackParams ($params ) {
    # Unpacks file count bloc in the format
    # Dirs : 1827 0 1827 0 0 0
    # Files : 9791 0 9791 0 0 0
    # Bytes : 165.24 m 0 165.24 m 0 0 0
    # Times : 1:11:23 0:00:00 0:00:00 1:11:23
    # Parameter name already removed
    if ( $params.length -ge 58 ) {
    $params = $params.ToCharArray()
    $result=(0..5)
    for ( $i = 0; $i -le 5; $i++ ) {
    $result[$i]=$($params[$($i*10 + 1) .. $($i*10 + 9)] -join "").trim()
    $result=$result -join ","
    } else {
    $result = ",,,,,"
    return $result
    $sourcecount = 0
    $targetcount = 1
    # Write the header line
    $writer.Write("File")
    foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
    if ( $HeaderParam.value -eq "counts" ) {
    $tmp="~ Total,~ Copied,~ Skipped,~ Mismatch,~ Failed,~ Extras"
    $tmp=$tmp.replace("~","$(removekey $headerparam.name)")
    $writer.write(",$($tmp)")
    } else {
    $writer.write(",$(removekey $HeaderParam.name)")
    if($fp){
    $writer.write(",Scanned,Newest,Summary")
    $writer.WriteLine()
    $filecount=0
    # Enumerate the files
    foreach ($file in $files) {
    $filecount++
    write-host "$filecount/$($files.count) $($file.name) ($($file.length) bytes)"
    $results=@{}
    $Stream = $file.Open([System.IO.FileMode]::Open,
    [System.IO.FileAccess]::Read,
    [System.IO.FileShare]::ReadWrite)
    $reader = New-Object System.IO.StreamReader($Stream)
    #$filestream=new-object -typename System.IO.StreamReader -argumentlist $file, $true, [System.IO.FileAccess]::Read
    $HeaderFooter = Get-Top $reader 16
    if ( $HeaderFooter -match "ROBOCOPY :: Robust File Copy for Windows" ) {
    if ( $HeaderFooter -match "Files : " ) {
    $HeaderFooter = $HeaderFooter -notmatch "Files : "
    [long]$ReaderEndHeader=$reader.BaseStream.position
    $Footer = Get-Tail $reader 16
    $ErrorFooter = $Footer -match "ERROR \d \(0x000000\d\d\) Accessing Source Directory"
    if ($ErrorFooter) {
    $ProcessCounts["Error"]++
    write-host -foregroundcolor red "`t $ErrorFooter"
    } elseif ( $footer -match "---------------" ) {
    $ProcessCounts["Processed"]++
    $i=$Footer.count
    while ( !($Footer[$i] -like "*----------------------*") -or $i -lt 1 ) { $i-- }
    $Footer=$Footer[$i..$Footer.Count]
    $HeaderFooter+=$Footer
    } else {
    $ProcessCounts["Incomplete"]++
    write-host -foregroundcolor yellow "`t Log file $file is missing the footer and may be incomplete"
    foreach ( $HeaderParam in $headerparams.GetEnumerator() | Sort-Object Name ) {
    $name = "$(removekey $HeaderParam.Name)"
    $tmp = GetValue $($HeaderFooter -match "$name : ") $name
    if ( $tmp -ne "" -and $tmp -ne $null ) {
    switch ( $HeaderParam.value ) {
    "date" { $results[$name]=UnBodgeDate $tmp.trim() }
    "counts" { $results[$name]=UnpackParams $tmp }
    "string" { $results[$name] = """$($tmp.trim())""" }
    default { $results[$name] = $tmp.trim() }
    if ( $fp ) {
    write-host "Parsing $($reader.BaseStream.Length) bytes"
    # Now go through the file line by line
    $reader.BaseStream.Position=0
    $filesdone = $false
    $linenumber=0
    $FileResults=@{}
    $newest=[datetime]"1/1/1900"
    $linecount++
    $firsttick=$elapsedtime.elapsed.TotalSeconds
    $tick=$firsttick+$refreshrate
    $LastLineLength=1
    try {
    do {
    $line = $reader.ReadLine()
    $linenumber++
    if (($line -eq "-------------------------------------------------------------------------------" -and $linenumber -gt 16) ) {
    # line is end of job
    $filesdone=$true
    } elseif ($linenumber -gt 16 -and $line -gt "" ) {
    $buckets=$line.split($tab)
    # this test will pass if the line is a file, fail if a directory
    if ( $buckets.count -gt 3 ) {
    $status=$buckets[1].trim()
    $FileResults["$status"]++
    $SizeDateTime=$buckets[3].trim()
    if ($sizedatetime.length -gt 19 ) {
    $DateTime = $sizedatetime.substring($sizedatetime.length -19)
    if ( $DateTime -as [DateTime] ){
    $DateTimeValue=[datetime]$DateTime
    if ( $DateTimeValue -gt $newest ) { $newest = $DateTimeValue }
    if ( $elapsedtime.elapsed.TotalSeconds -gt $tick ) {
    $line=$line.Trim()
    if ( $line.Length -gt 48 ) {
    $line="[...]"+$line.substring($line.Length-48)
    $line="$([char]13)Parsing > $($linenumber) ($(($reader.BaseStream.Position/$reader.BaseStream.length).tostring("P1"))) - $line"
    write-host $line.PadRight($LastLineLength) -NoNewLine
    $LastLineLength = $line.length
    $tick=$tick+$refreshrate
    } until ($filesdone -or $reader.endofstream)
    finally {
    $reader.Close()
    $line=$($([string][char]13)).padright($lastlinelength)+$([char]13)
    write-host $line -NoNewLine
    $writer.Write("`"$file`"")
    foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
    $name = "$(removekey $HeaderParam.Name)"
    if ( $results[$name] ) {
    $writer.Write(",$($results[$name])")
    } else {
    if ( $ErrorFooter ) {
    #placeholder
    } elseif ( $HeaderParam.Value -eq "counts" ) {
    $writer.Write(",,,,,,")
    } else {
    $writer.Write(",")
    if ( $ErrorFooter ) {
    $tmp = $($ErrorFooter -join "").substring(20)
    $tmp=$tmp.substring(0,$tmp.indexof(")")+1)+","+$tmp
    $writer.write(",,$tmp")
    } elseif ( $fp ) {
    $writer.write(",$LineCount,$($newest.ToString('dd/MM/yyyy hh:mm:ss'))")
    foreach ( $FileResult in $FileResults.GetEnumerator() ) {
    $writer.write(",$($FileResult.Name): $($FileResult.Value);")
    $writer.WriteLine()
    } else {
    write-host -foregroundcolor darkgray "$($file.name) is not recognised as a RoboCopy log file"
    write-host "$filecount files scanned in $($elapsedtime.elapsed.tostring()), $($ProcessCounts["Processed"]) complete, $($ProcessCounts["Error"]) have errors, $($ProcessCounts["Incomplete"]) incomplete"
    write-host "Results written to $($writer.basestream.name)"
    $writer.close()
    I hope somebody can help me,
    Horst
    Thanks Horst MOSS 2007 Farm; MOSS 2010 Farm; TFS 2010; TFS 2013; IIS 7.5

    Hi Horst,
    To convert multiple robocopy log files to a .csv file with the "speed" option, the script below may be helpful for you. I tested it with a single robocopy log file, and the .csv file will be output to "D:\":
    $SourcePath="e:\1\1.txt" #robocopy log file
    write-host "Robocopy log parser. $(if($fp){"Parsing file entries"} else {"Parsing summaries only, use -fp to parse file entries"})"
    #Arguments
    # -fp File parse. Counts status flags and oldest file Slower on big files.
    $ElapsedTime = [System.Diagnostics.Stopwatch]::StartNew()
    $refreshrate=1 # progress counter refreshes this often when parsing files (in seconds)
    # These summary fields always appear in this order in a robocopy log
    $HeaderParams = @{
     "04|Started" = "date"; 
     "01|Source" = "string";
     "02|Dest" = "string";
     "03|Options" = "string";
     "09|Dirs" = "counts";
     "10|Files" = "counts";
     "11|Bytes" = "counts";
     "12|Times" = "counts";
     "05|Ended" = "date";
     "07|Speed" = "default";
     "08|Speednew" = "default"
    $ProcessCounts = @{
     "Processed" = 0;
     "Error" = 0;
     "Incomplete" = 0
    $tab=[char]9
    $files=get-childitem $SourcePath
    $writer=new-object System.IO.StreamWriter("D:\robocopy-$(get-date -format "dd-MM-yyyy_HH-mm-ss").csv")
    function Get-Tail([object]$reader, [int]$count = 10) {
     $lineCount = 0
     [long]$pos = $reader.BaseStream.Length - 1
     while($pos -gt 0)
      $reader.BaseStream.position=$pos
      # 0x0D (#13) = CR
      # 0x0A (#10) = LF
      if ($reader.BaseStream.ReadByte() -eq 10)
       $lineCount++
       if ($lineCount -ge $count) { break }
      $pos--
     # tests for file shorter than requested tail
     if ($lineCount -lt $count -or $pos -ge $reader.BaseStream.Length - 1) {
      $reader.BaseStream.Position=0
     } else {
      # $reader.BaseStream.Position = $pos+1
     $lines=@()
     while(!$reader.EndOfStream) {
      $lines += $reader.ReadLine()
     return $lines
    function Get-Top([object]$reader, [int]$count = 10)
     $lines=@()
     $lineCount = 0
     $reader.BaseStream.Position=0
     while(($linecount -lt $count) -and !$reader.EndOfStream) {
      $lineCount++
      $lines += $reader.ReadLine()  
     return $lines
    function RemoveKey ( $name ) {
     if ( $name -match "|") {
      return $name.split("|")[1]
     } else {
      return ( $name )
    function GetValue ( $line, $variable ) {
     if ($line -like "*$variable*" -and $line -like "* : *" ) {
      $result = $line.substring( $line.IndexOf(":")+1 )
      return $result
     } else {
      return $null
     }
     function UnBodgeDate ( $dt ) {
     # Fixes RoboCopy botched date-times in format Sat Feb 16 00:16:49 2013
     if ( $dt -match ".{3} .{3} \d{2} \d{2}:\d{2}:\d{2} \d{4}" ) {
      $dt=$dt.split(" ")
      $dt=$dt[2],$dt[1],$dt[4],$dt[3]
      $dt -join " "
     if ( $dt -as [DateTime] ) {
      return $dt.ToString("dd/MM/yyyy hh:mm:ss")
     } else {
      return $null
    function UnpackParams ($params ) {
     # Unpacks file count bloc in the format
     # Dirs :      1827         0      1827         0         0         0
     # Files :      9791         0      9791         0         0         0
     # Bytes :  165.24 m         0  165.24 m         0         0         0
     # Times :   1:11:23   0:00:00                       0:00:00   1:11:23
     # Parameter name already removed
     if ( $params.length -ge 58 ) {
      $params = $params.ToCharArray()
      $result=(0..5)
      for ( $i = 0; $i -le 5; $i++ ) {
       $result[$i]=$($params[$($i*10 + 1) .. $($i*10 + 9)] -join "").trim()
      $result=$result -join ","
     } else {
      $result = ",,,,,"
     return $result
    $sourcecount = 0
    $targetcount = 1
    # Write the header line
    $writer.Write("File")
    foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
     if ( $HeaderParam.value -eq "counts" ) {
      $tmp="~ Total,~ Copied,~ Skipped,~ Mismatch,~ Failed,~ Extras"
      $tmp=$tmp.replace("~","$(removekey $headerparam.name)")
      $writer.write(",$($tmp)")
     } else {
      $writer.write(",$(removekey $HeaderParam.name)")
    if($fp){
     $writer.write(",Scanned,Newest,Summary")
    $writer.WriteLine()
    $filecount=0
    # Enumerate the files
    foreach ($file in $files) { 
     $filecount++
        write-host "$filecount/$($files.count) $($file.name) ($($file.length) bytes)"
     $results=@{}
    $Stream = $file.Open([System.IO.FileMode]::Open,
                       [System.IO.FileAccess]::Read,
                        [System.IO.FileShare]::ReadWrite)
     $reader = New-Object System.IO.StreamReader($Stream)
     #$filestream=new-object -typename System.IO.StreamReader -argumentlist $file, $true, [System.IO.FileAccess]::Read
     $HeaderFooter = Get-Top $reader 16
     if ( $HeaderFooter -match "ROBOCOPY     ::     Robust File Copy for Windows" ) {
      if ( $HeaderFooter -match "Files : " ) {
       $HeaderFooter = $HeaderFooter -notmatch "Files : "
      [long]$ReaderEndHeader=$reader.BaseStream.position
      $Footer = Get-Tail $reader 16
      $ErrorFooter = $Footer -match "ERROR \d \(0x000000\d\d\) Accessing Source Directory"
      if ($ErrorFooter) {
       $ProcessCounts["Error"]++
       write-host -foregroundcolor red "`t $ErrorFooter"
      } elseif ( $footer -match "---------------" ) {
       $ProcessCounts["Processed"]++
       $i=$Footer.count
       while ( !($Footer[$i] -like "*----------------------*") -or $i -lt 1 ) { $i-- }
       $Footer=$Footer[$i..$Footer.Count]
       $HeaderFooter+=$Footer
      } else {
       $ProcessCounts["Incomplete"]++
       write-host -foregroundcolor yellow "`t Log file $file is missing the footer and may be incomplete"
      foreach ( $HeaderParam in $headerparams.GetEnumerator() | Sort-Object Name ) {
       $name = "$(removekey $HeaderParam.Name)"
                            if ($name -eq "speed"){ #handle two speed
                            ($HeaderFooter -match "$name : ")|foreach{
                             $tmp=GetValue $_ "speed"
                             $results[$name] = $tmp.trim()
                             $name+="new"}
                            elseif ($name -eq "speednew"){} #handle two speed
                            else{
       $tmp = GetValue $($HeaderFooter -match "$name : ") $name
       if ( $tmp -ne "" -and $tmp -ne $null ) {
        switch ( $HeaderParam.value ) {
         "date" { $results[$name]=UnBodgeDate $tmp.trim() }
         "counts" { $results[$name]=UnpackParams $tmp }
         "string" { $results[$name] = """$($tmp.trim())""" }  
         default { $results[$name] = $tmp.trim() }  
      if ( $fp ) {
       write-host "Parsing $($reader.BaseStream.Length) bytes"
       # Now go through the file line by line
       $reader.BaseStream.Position=0
       $filesdone = $false
       $linenumber=0
       $FileResults=@{}
       $newest=[datetime]"1/1/1900"
       $linecount++
       $firsttick=$elapsedtime.elapsed.TotalSeconds
       $tick=$firsttick+$refreshrate
       $LastLineLength=1
       try {
        do {
         $line = $reader.ReadLine()
         $linenumber++
         if (($line -eq "-------------------------------------------------------------------------------" -and $linenumber -gt 16)  ) {
          # line is end of job
          $filesdone=$true
         } elseif ($linenumber -gt 16 -and $line -gt "" ) {
          $buckets=$line.split($tab)
          # this test will pass if the line is a file, fail if a directory
          if ( $buckets.count -gt 3 ) {
           $status=$buckets[1].trim()
           $FileResults["$status"]++
           $SizeDateTime=$buckets[3].trim()
           if ($sizedatetime.length -gt 19 ) {
            $DateTime = $sizedatetime.substring($sizedatetime.length -19)
            if ( $DateTime -as [DateTime] ){
             $DateTimeValue=[datetime]$DateTime
             if ( $DateTimeValue -gt $newest ) { $newest = $DateTimeValue }
         if ( $elapsedtime.elapsed.TotalSeconds -gt $tick ) {
          $line=$line.Trim()
          if ( $line.Length -gt 48 ) {
           $line="[...]"+$line.substring($line.Length-48)
          $line="$([char]13)Parsing > $($linenumber) ($(($reader.BaseStream.Position/$reader.BaseStream.length).tostring("P1"))) - $line"
          write-host $line.PadRight($LastLineLength) -NoNewLine
          $LastLineLength = $line.length
          $tick=$tick+$refreshrate      
        } until ($filesdone -or $reader.endofstream)
       finally {
        $reader.Close()
       $line=$($([string][char]13)).padright($lastlinelength)+$([char]13)
       write-host $line -NoNewLine
      $writer.Write("`"$file`"")
      foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
       $name = "$(removekey $HeaderParam.Name)"
       if ( $results[$name] ) {
        $writer.Write(",$($results[$name])")
       } else {
        if ( $ErrorFooter ) {
         #placeholder
        } elseif ( $HeaderParam.Value -eq "counts" ) {
         $writer.Write(",,,,,,")
        } else {
         $writer.Write(",")
      if ( $ErrorFooter ) {
       $tmp = $($ErrorFooter -join "").substring(20)
       $tmp=$tmp.substring(0,$tmp.indexof(")")+1)+","+$tmp
       $writer.write(",,$tmp")
      } elseif ( $fp ) {
       $writer.write(",$LineCount,$($newest.ToString('dd/MM/yyyy hh:mm:ss'))")   
       foreach ( $FileResult in $FileResults.GetEnumerator() ) {
        $writer.write(",$($FileResult.Name): $($FileResult.Value);")
      $writer.WriteLine()
     } else {
      write-host -foregroundcolor darkgray "$($file.name) is not recognised as a RoboCopy log file"
    write-host "$filecount files scanned in $($elapsedtime.elapsed.tostring()), $($ProcessCounts["Processed"]) complete, $($ProcessCounts["Error"]) have errors, $($ProcessCounts["Incomplete"]) incomplete"
    write-host  "Results written to $($writer.basestream.name)"
    $writer.close()
    If you have any other questions, please feel free to let me know.
    If you have any feedback on our support,
    please click here.
    Best Regards,
    Anna Wang
    TechNet Community Support

  • Parsing sendmail log file in a Java application

    Are there any good parsing libraries for sendmail log files?
    I already found these libraries but I'm not sure if they do what I need:
    http://www.opennms.org/documentation/java-apidocs-stable/org/opennms/netmgt/syslogd/package-summary.html
    http://code.google.com/p/jsyslogd/
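    If a dedicated library turns out to be more than you need, a plain java.util.regex sketch along these lines may already be enough. The sample line and the field layout are assumptions based on the usual syslog-style sendmail format, not on your actual files.
    {code}
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Hypothetical sketch: split a syslog-style sendmail line into timestamp,
    // host, pid, queue id, and the remaining key=value payload.
    public class SendmailLineParser {
        private static final Pattern LINE = Pattern.compile(
                "^(\\w{3} +\\d+ \\d{2}:\\d{2}:\\d{2}) (\\S+) sendmail\\[(\\d+)\\]: (\\S+): (.*)$");

        public static void main(String[] args) {
            String sample = "Oct 11 22:14:15 mailhost sendmail[2703]: "
                    + "s9BKEFab002703: from=<user@example.com>, size=1234, nrcpts=1";
            Matcher m = LINE.matcher(sample);
            if (m.matches()) {
                System.out.println("time  = " + m.group(1));
                System.out.println("host  = " + m.group(2));
                System.out.println("pid   = " + m.group(3));
                System.out.println("queue = " + m.group(4));
                System.out.println("rest  = " + m.group(5));
            }
        }
    }
    {code}
    The trailing key=value payload could then be split on ", " and collected into a Map if needed.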

    >
    I've written a simple text editor, and this editor saves files with a particular extension. It can both open and save files. I've put the text editor program in a JAR. What I'd like to do, if possible, is associate the file extension with the text editor program. That is, I'd like to, when I click on a file with the extension, have the text editor come up with the file opened in it.
    Can anyone give me ideas on how to do this, please? >If the editor is launched using webstart, the launch file can suggest a file association.
    Note that an application that accesses the local file system needs to be digitally signed before it can break out of the applet like 'sandbox' in which it runs, unless it uses the JNLP API to access the files. The JNLP API is available to any app. launched using webstart.
    There is an example of both claiming a file extension, and accessing files using the JNLP API, in this [File Service Demo|http://pscode.org/jws/api.html#fs]. The complete source and a build file can be downloaded from the filetest.zip just near the launch buttons.
    I suggest you try the sandboxed version first - if you think that will suit your user, go with that.
    As an aside, for best chance of a solution, I recommend folks add [Duke stars|http://wikis.sun.com/display/SunForums/Duke+Stars+Program+Overview] to match the importance of the task.

  • Parse a log file...

    Hi All,
    1. I want to parse the content of a log file, but when I open the log file it does not show me field names.
    It starts with a row containing the contents directly, and I want to read and process only three of the fields, at arbitrary positions.
    I have written code that works on IIS logs; the log I want to parse uses a single white space ' ' as the field separator.
    2. Some of the log files are zipped, so I am unable to open and read them in order to parse them.
    Does anyone have any clue or code that could help me out?
    Thanks!

    bhatnagarudit wrote:
    Hi All,
    1. I want to parse the content of a log file, but when I open the log file it does not show me field names.
    It starts with a row containing the contents directly, and I want to read and process only three of the fields, at arbitrary positions.
    I have written code that works on IIS logs; the log I want to parse uses a single white space ' ' as the field separator.
    2. Some of the log files are zipped, so I am unable to open and read them in order to parse them.
    Does anyone have any clue or code that could help me out?
    Thanks!
    Here is a suggested algorithm (I don't want to write the code for you :-)):
    You have the following format.
    314159b66967d86f031c7249d1d9a8024.. mybucket [04/Aug/2006:22:34:02 +0000] 72.21.206.5 314159b66967d86f031c724... 3E57427F33A59F07 REST.PUT.OBJECT /photos/2006/08/puppy.jpg "GET /mybucket/photos/2006/08/puppy.jpg?x-foo=bar" 200 NoSuchBucket 2662992 3462992 70 10 "http://www.amazon.com/webservices" "curl/7.15.1"
    Read the file in, go thru lines one by one. For each line,
    1. Get the content in the first square brackets. Regular Expression: [&].
    2. From there, get the fourth (4th) word separated by space.
    3. From there, get the content in the first pair of double quotes. Regular Expression: \"&\".
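    A sketch of that algorithm in Java could look like the following; the concrete regular expressions, the field index, and the GZIP handling for the zipped files mentioned in point 2 are assumptions of mine rather than part of the suggestion above.
    {code}
    import java.io.BufferedReader;
    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;
    import java.util.zip.GZIPInputStream;

    // Hypothetical sketch: bracketed timestamp, a whitespace-separated field by
    // position, and the first double-quoted request from each log line.
    public class SpaceSeparatedLogParser {
        private static final Pattern BRACKETS = Pattern.compile("\\[([^\\]]*)\\]");
        private static final Pattern QUOTED = Pattern.compile("\"([^\"]*)\"");

        public static void main(String[] args) throws Exception {
            String file = "access.log.gz"; // placeholder; plain or gzipped log file
            InputStream raw = new FileInputStream(file);
            if (file.endsWith(".gz")) {
                raw = new GZIPInputStream(raw); // point 2: zipped logs
            }
            BufferedReader in = new BufferedReader(new InputStreamReader(raw));
            String line;
            while ((line = in.readLine()) != null) {
                // 1. content of the first pair of square brackets (the timestamp)
                Matcher b = BRACKETS.matcher(line);
                String timestamp = b.find() ? b.group(1) : "";
                // 2. a field by position, e.g. the fourth whitespace-separated token
                String[] tokens = line.split(" +");
                String fourth = tokens.length > 3 ? tokens[3] : "";
                // 3. content of the first pair of double quotes (the request)
                Matcher q = QUOTED.matcher(line);
                String request = q.find() ? q.group(1) : "";
                System.out.println(timestamp + " | " + fourth + " | " + request);
            }
            in.close();
        }
    }
    {code}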

  • How do I move the location of my log files in WebLogic

    I am using WebLogic on Windows.
    I have WebLogic installed on the C: drive.
    I want to keep WebLogic on the C: drive, but move my log files to an alternate drive.
    How can I do this?
    Can this be done from the console?
    Thanks

    You can refer to "Changing the Name and Location of the Server Log File" in the following document :-
    Link : [http://e-docs.bea.com/wls/docs81/ConsoleHelp/logging.html#1050020]
    Link : [http://e-docs.bea.com/wls/docs92/ConsoleHelp/taskhelp/logging/ChangeServerLogFileNamesAndLocations.html]
    Edited by: Manish Chellappan on Jul 14, 2009 8:16 PM

  • How to rotate .out ( stdout) log file in weblogic 9.2 (solaris/linux).

    Hi,
    Is there a way we can rotate the .out log file?
    I have written a shell script which will back up the log file and then truncate the .out file.
    #!/bin/bash
    FILE=/opt/bea10/user_projects/domains/wl102xdomain/servers/ManagedServer1/logs/ManagedServer1.log
    if cp $FILE ${FILE}.`date +%m%d`; then
    cp /dev/null $FILE
    fi
    and run this as cron job
    Can we use this (cp /dev/null)? It will truncate the .out file without restarting the server.
    Thanks,
    Krishna.

    Yes, John, I am using startNodemanager.sh.
    Is this the change (OutFile=$ServerDir/logs/$ServerName.out) you are talking about?
    wlscontrol.sh file:
    # Directory and file names
    ServerDir=$DomainDir/servers/$ServerName
    SaltFile=$DomainDir/security/SerializedSystemIni.dat
    OldSaltFile=$DomainDir/SerializedSystemIni.dat
    StateFile=$ServerDir/data/nodemanager/$ServerName.state
    PropsFile=$ServerDir/data/nodemanager/startup.properties
    PidFile=$ServerDir/data/nodemanager/$ServerName.pid
    LockFile=$ServerDir/data/nodemanager/$ServerName.lck
    BootFile=$ServerDir/security/boot.properties
    RelBootFile=servers/$ServerName/security/boot.properties
    NMBootFile=$ServerDir/data/nodemanager/boot.properties
    RelNMBootFile=servers/$ServerName/data/nodemanager/boot.properties
    OutFile=$ServerDir/logs/$ServerName.out
    SetDomainEnvScript=$DomainDir/bin/setDomainEnv.sh
    StartWebLogicScript=$DomainDir/bin/startWebLogic.sh
    MigrationScriptDir=$DomainDir/bin/service_migration
    Thanks,
    Krish

  • Parse XMLFormatter log files back into LogRecords

    My app uses a java.util.logging.FileHandler and XMLFormatter to generate XML log files, which is great. Now I want to display those logs in my app's web admin console.
    I assumed that there would be an easy way to read my XML logs files back into a List of LogRecord objects, but my initial investigation has revealed nothing.
    Anyone have any advice?
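    There is no built-in reverse operation for XMLFormatter, so one option is to parse the XML yourself and rebuild LogRecord objects. Below is a minimal DOM-based sketch under that assumption; it restores only a few of the fields, skips resolving the logger.dtd reference that XMLFormatter emits, and expects a complete file (one that already has its closing </log> tag).
    {code}
    import java.io.File;
    import java.io.StringReader;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.logging.Level;
    import java.util.logging.LogRecord;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;
    import org.xml.sax.EntityResolver;
    import org.xml.sax.InputSource;

    // Hypothetical sketch: reads java.util.logging XMLFormatter output and
    // rebuilds LogRecord objects from the <record> elements.
    public class XmlLogReader {

        public static List<LogRecord> read(File xmlLog) throws Exception {
            DocumentBuilder builder =
                    DocumentBuilderFactory.newInstance().newDocumentBuilder();
            // XMLFormatter files declare a logger.dtd DOCTYPE; return an empty
            // entity so parsing does not fail when the DTD file is absent.
            builder.setEntityResolver(new EntityResolver() {
                public InputSource resolveEntity(String publicId, String systemId) {
                    return new InputSource(new StringReader(""));
                }
            });
            Document doc = builder.parse(xmlLog);
            NodeList records = doc.getElementsByTagName("record");
            List<LogRecord> result = new ArrayList<LogRecord>();
            for (int i = 0; i < records.getLength(); i++) {
                Element rec = (Element) records.item(i);
                LogRecord lr = new LogRecord(
                        Level.parse(text(rec, "level")), text(rec, "message"));
                lr.setMillis(Long.parseLong(text(rec, "millis")));
                lr.setLoggerName(text(rec, "logger"));
                result.add(lr);
            }
            return result;
        }

        // Text content of the first child element with the given tag, or "".
        private static String text(Element rec, String tag) {
            NodeList nodes = rec.getElementsByTagName(tag);
            return nodes.getLength() > 0 ? nodes.item(0).getTextContent() : "";
        }
    }
    {code}
    The resulting records can then be rendered for the web console with any java.util.logging.Formatter, for example SimpleFormatter.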

    If you remove an "active" log file, then this can cause problems. If you remove an archived log file, then it is OK.
    If you change the log directory, then you SHOULD inform all your applications that use this directory... Depending on the service, this information is usually stored inside the config files of the services.
    Mihalis.

  • Changing log file name (weblogic.log)

     

    hi!
    maybe you could try putting this in the weblogic.properties file or the
    start shell script:
    weblogic.system.logFile= [logfile name]
    Hope this helps!
    Mauricio

  • How to configure weblogic log file?

    Hi,
    How do I configure the WebLogic server log file to log WebLogic-related information as well as my application's?
    I need to maintain one log file for both WebLogic and my application.
    Thanks,
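    One approach that may help is WebLogic's NonCatalogLogger, which writes application messages into the server's own log so that server and application messages end up in the same file. A minimal sketch, assuming the weblogic.logging.NonCatalogLogger class from weblogic.jar is on the classpath:
    {code}
    import weblogic.logging.NonCatalogLogger;

    // Minimal sketch: send application messages to the WebLogic server log
    // so that WebLogic and application entries share one log file.
    public class AppLogging {
        private static final NonCatalogLogger LOG = new NonCatalogLogger("MyApplication");

        public static void main(String[] args) {
            LOG.info("Application started"); // appears in the server log
            try {
                // ... application work ...
            } catch (Exception e) {
                LOG.error("Unexpected failure", e); // message plus stack trace
            }
        }
    }
    {code}
    Messages logged this way show up in the regular server log alongside WebLogic's own entries.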

    The GlassFish instance is either not configured yet or installed somewhere else. Try looking under your user's home directory, like C:\Documents and Settings\<username>\Application Data\glassfish\domains\domain1\logs

  • WEBlogic webserver LOG file examples

    Does anyone have a sample WebLogic webserver log file I can use for a test
    of another application I am creating?
    I am also looking for a portal log file. Any help is appreciated. I am trying
    to create some reports based on these outputs.
    Thanks
    Travis Giffin


  • WebLogic Installation Log File

    Where is the installation log file after WebLogic installation?
    I am installing WebLogic 6.1. Can I specify where the installation log file should
    go? Where is it, anyway?
    I searched a few directories but do not see where it placed the installation log file.

    No, not as I know of. If you are experiencing some problems and you really want to
    find more information about a problematic installation, you may do the following to
    generate the debug output and you can send that debug info to me:
    Windows: pressing down control key at the beginning of the installation until a
    debug console window pops up.
    Unix Installer ( .bin files): set LAXDEBUG=true in your environment before starting
    your installation. tee or script the output to a file:
    export LAXDEBUG=true
    sh weblogic610.bin -i console 2>&1 | tee buildlog
    Pure java installer (.zip file):
    touch ia_debug
    $JAVA_HOME/java -cp weblogic610.zip -i console 2>&1 | tee buildlog
    The upgrade installer should have a clear log to tell you what went wrong in an
    installation or uninstallation process. And the installation/uninstallation process
    would be rolled back if anything went wrong. The exception is if you manually
    stopped it in some way, which we may catch and process later.
    -Dan
    Ashique wrote:
    Yes, I am trying to do a full installation (in silent mode).
    Every software has an installation log. Is that not pretty much standard?
    How come weblogic does not have log? If something goes wrong during silent
    installation, I would like to see a log file.
    Are you sure that there is absolutely no log file for a full installation?
    Dan Bai <[email protected]> wrote:
    There is no log file for a full installer installation.
    The log file for a service pack upgrade installation lives under $BEA_HOME/logs.
    Look into the directory corresponding to that servicepack.
    Are you experiencing any problems with the full installer?
    -Dan
    Ashique wrote:
    Where is the installation log file after WebLogic installation?
    I am installing WebLogic 6.1. Can I specify where the installation log file should
    go? Where is it, anyway?
    I searched a few directories but do not see where it placed the installation log file.

  • Application to read 2 log files from internet

    Hi,
    Could anybody tell me how to develop this project? I'm short of time. Thanks.
    You have been asked into a company called Broken Arrow Software as a Senior Software Engineer and Management consultant. You are being paid very handsomely and have been asked to write an application program in Java for parsing a log file.
    • The program will have a standard menu using the AWT only.
    • ALL LAYOUTS MUST USE ONLY THE BORDER-LAYOUT OR GRID-LAYOUT OR A COMBINATION OF BOTH (Any other layouts score zero).
    This application reads two Log files from the internet. The first file contains USA state names and the abbreviation used for that state. This information will be used in displaying totals from another network file and should be stored in an array (or two arrays, or a two-dimensional array) in the program. The second file should extract the "Reversed subdomain" section from a Log file (a section is shown later) and store this in an array in the application. This should be processed and only those domains beginning with "us." should be processed. Each "us." has an abbreviated USA state after it ("tx" is an abbreviation for Texas). The abbreviations are sorted and all states are on consecutive lines and each displays a number of accesses for that state. The accesses should be totalled and displayed in the current Frame for each state in the form of Java List components (these do not need to be synchronised so that they all scroll in unison).
    The application will be a Java application with the following menus and MenuItems:
    • Splash screen (10%)
    • Application (20%)
    o Open USA abbreviation file
    o Clear screen
    o Exit
    • File (25%)
    o Open network log file
    o Open locally saved report file
    o Recent report files
    o Save as report file
    • Graph (20%)
    o Plot
    • Help (15%)
    o Help on Application
    o About
    A basic pass for the application will be for a basic implementation of an application with "help" options and some basic implementations of a Splash screen and some basic file and application options. Very high marks will be awarded for processing network files, saving and opening files, very good HCI, application design and error handling, excellent OO design for classes, gorgeous layout and commenting of code and excellent graphing capabilities.
    Error handling dialogs and overall application design (10%)
    Every class must be in a separate file.
    Inheritance should be used for WindowListeners of Frames and Dialogs.
    Up to 10% can also be lost by unprofessional code layout and lack of professional standards. Always adhere to standards taught throughout the module and your time at the University of Northumbria.
    Examples of non-professionalism would include bad indentation, no comments, meaningless variable names, politically incorrect graphics, commented out code, empty .java files, .java files which are not part of the project etc. Remember your application is your livelihood and your company depends on your application standards.
    The splash screen must be a Frame with a Canvas as part of it showing your own logo. Your logo should be individual to you but does not need to win the computing equivalent of the Turner prize. The application should be displayed behind the Splash screen and both should be visible. The application must not be able to be brought to the front and used without the Splash screen being disposed of.
    The application should only enable the "Open USA abbreviation file", "Open locally saved report file", "Help" options and "Exit" Menus and MenuItems, when the application starts. On opening a valid USA abbreviation file, then the other Menus and MenuItems should be enabled.
    The abbreviations should be read into an array. These should be used in displaying the totals for the reversed subdomain totals for each USA state. A total for all USA states should be displayed at the bottom of the current Frame, with a suitable Label (this design is your own). This current Frame should display a series of Lists starting with a List of USA state number (1 to n). A List of USA state abbreviation should be next followed by a List of the actual USA state name, followed by a List of the total accesses for that state.
    The report file should be an ASCII file that can be printed out from an ASCII text editor such as DOS edit or Microsoft NotePad.
    The "Open" network files should display a Dialog asking for the http:// address of the file, with "OK" and "Cancel" options. It is helpful if the user can hit "return" instead of clicking on "OK" and "Escape" instead of "Cancel". Error Dialogs should be used to indicate any errors that may occur and the state of the application should be reset to that of before displaying the Dialog.
    When "Save as report file" is chosen a FileDialog box should be used for the user to choose both directory and filename. The file should be able to be saved as a ".rpt" file.
    Open report file should display the report in a Frame; the design of which is your own.
    Plotting the graph should pass a two dimensional array to a Frame with a Canvas. The Canvas should have a Paint method that draws the axis for the graph and any suitable Headings etc. The graph should draw a histogram of totals per USA states. The graph design is your own but you may wish to use Microsoft Excel as a good example of drawing a histogram.
    The "Clear screen" option should clear any data off the current screen.
    The "Exit" option should quit the application but it may be helpful to ask the user if they really want to exit the application.
    Help must be Java code and not linking into HTML. It should display help in a well designed screen. The most basic implementation might use a scrollable TextArea for a basic mark.
    See other software for a good "About" screen. The most basic should display your name, date, version and company.
    • Help should display your help on using the application. As a senior software engineer, the design is your own, based on experience of using applications, as is the opening splash screen. You may use other applications for inspiration only, as these will make up your experience.
    • Your good knowledge gained from HCI units studied should prove invaluable in the interface design and the usability of the application.
    • The design (Screens and classes) and quality and documentation of code throughout the application will be marked. The experience gained from programming 1 and 2 and Object Oriented Programming should prove invaluable throughout the application, as should any GUI units studied.
    The log file can be accessed at:
    http://computing.unn.ac.uk/staff/cgpb2/public_html/log.html

    You would really gain ever so much more from this exercise if you would write a couple of classes, then come back with some specific questions. If you're completely lost, try starting with the GUI first. It's not the best practice, always, but it is easy to visualize.
    On a side note, I wish I'd had assignments even half this interesting when I was in my Java classes...

  • Java.io.IOException: Failed to rename log file on attempt to rotate logs

    Hello.
    I'm currently using Weblogic 5.1 SP6 on WinNT Server 4.0 SP6.
    I set the weblogic.properties file like this so that the "access.log" will
    be rotated every day at midnight.
    -- weblogic.properties --
    weblogic.httpd.enableLogFile=true
    weblogic.httpd.logFileName=D:/WLSlog/access.log
    weblogic.httpd.logFileFlushSecs=60
    weblogic.httpd.logRotationType=date
    weblogic.httpd.logRotationPeriodMins=1440
    weblogic.httpd.logRotationBeginTime=11-01-2000-00:00:00
    -- weblogic.properties <end>--
    The rotation has been working well, but one day when I checked my
    weblogic.log, I was getting some errors.
    I found out that my "access.log" wasn't being rotated (nor being written or
    flushed) after this error came out.
    After rebooting WebLogic, this problem went away.
    Does anyone have any clues about why WebLogic failed to "rename log file"?
    -- weblogic.log --
    ? 2 04 00:00:00 JST 2001:<E> <HTTP> Exception flushing HTTP log file
    java.io.IOException: Failed to rename log file on attempt to rotate logs
    at weblogic.t3.srvr.httplog.LogManagerHttp.rotateLog(LogManagerHttp.java, Compiled Code)
    at java.lang.Exception.<init>(Exception.java, Compiled Code)
    at java.io.IOException.<init>(IOException.java, Compiled Code)
    at weblogic.t3.srvr.httplog.LogManagerHttp.rotateLog(LogManagerHttp.java, Compiled Code)
    at weblogic.t3.srvr.httplog.LogManagerHttp.access$2(LogManagerHttp.java:271)
    at weblogic.t3.srvr.httplog.LogManagerHttp$RotateLogTrigger.trigger(LogManagerHttp.java:539)
    at weblogic.time.common.internal.ScheduledTrigger.executeLocally(ScheduledTrigger.java, Compiled Code)
    at weblogic.time.common.internal.ScheduledTrigger.execute(ScheduledTrigger.java, Compiled Code)
    at weblogic.time.server.ScheduledTrigger.execute(ScheduledTrigger.java, Compiled Code)
    at weblogic.kernel.ExecuteThread.run(ExecuteThread.java, Compiled Code)
    ? 2 04 00:00:25 JST 2001:<E> <HTTP> Exception flushing HTTP log file
    java.io.IOException: Bad file descriptor
    at java.io.FileOutputStream.writeBytes(Native Method)
    at java.io.FileOutputStream.write(FileOutputStream.java, Compiled Code)
    at weblogic.utils.io.DoubleBufferedOutputStream.flushBuffer(DoubleBufferedOutputStream.java, Compiled Code)
    at weblogic.utils.io.DoubleBufferedOutputStream.flush(DoubleBufferedOutputStream.java, Compiled Code)
    at weblogic.t3.srvr.httplog.LogManagerHttp$FlushLogStreamTrigger.trigger(LogManagerHttp.java, Compiled Code)
    at weblogic.time.common.internal.ScheduledTrigger.executeLocally(ScheduledTrigger.java, Compiled Code)
    at weblogic.time.common.internal.ScheduledTrigger.execute(ScheduledTrigger.java, Compiled Code)
    at weblogic.time.server.ScheduledTrigger.execute(ScheduledTrigger.java, Compiled Code)
    at weblogic.kernel.ExecuteThread.run(ExecuteThread.java, Compiled Code)
    -- weblogic.log <end> --
    note:
    ? 2 04 00:00:25 JST 2001:<E> <HTTP> Exception flushing HTTP log file
    java.io.IOException: Bad file descriptor
    keeps coming out every minute from then on.
    I suppose this is because I have set the HTTP log to be flushed every one
    minute.
    Thanks in advance.
    Ryotaro

    I'm also getting this error on Weblogic 6.1.1.
    It only occurs if you set the format to "extended".
    Is there any fix or workaround for this?

  • Parse Webi Logs

    Has anyone ever attempted to parse a BO webi log file with powershell (or other windows scripting language) to get something in a more friendly output that could be passed on to an internal support team for review?  For example, when I review our log files I see a variety of error messages that might be something I need to pay attention to as a system admin or they might be 'training' type errors that I can hand off to our support team to get with the user.  I want to parse out the 'useful' info and ignore the noise ( such as excess info, 'soft errors', or successes).  Plus, the support team does not have access to our webi logs/webi servers.
    I am self-teaching as I go with powershell and since I have other scripts running in powershell, I started with that (there could be a better way).  I would just like to extract the date, server, full error information and username and format it to a list.  There is just no easy way that I can figure out to identify a single log entry line... or to know where the error information 'really' ends.  Maybe I'm overlooking the obvious though - which is very possible as these webi logs make my eyes go buggy!
    Has anyone ever tried anything like this and can share some insight? we are on 4.0 sp7 patch 7 if it makes a difference in log format or anything.
    thanks in advance,
    Missy

    Hi Mani,
    Thanks for the link to the trace settings - yet another good resource you have provided.  Our ini file is set as follows:
    active = true;
    importance = xl;
    alert = true;
    severity = 'E';
    //keep = false;
    //size = 100 * 1000;
    So, mostly defaults.  Not sure the exact difference in context between s, m, l, and xl, but even with these settings noted above, in our webiserver log, for example, I will get a line that looks like this:
    |12671c8e-cc1c-5164-2b7f-d36028aea63d|2014 07 30 17:14:58:479|-0400|Error| |>>| | |webiserver_SERVERA.WebIntelligenceProcessingServer| 9984|3984|| |2|0|2|0|BIlaunchpad.WebApp|SERVERA:7616:65.106749:1|Webi SDK.CorbaServerImpl.doProcess()|SERVERA:7616:65.106749:2|webiserver_SERVERA.WebIntelligenceProcessingServer.loadStateMDP|localhost:9984:3984.119276:1|CtC8xxN8nEPIkk05usjbm5s1a0fb|||||||||||AsyncCaller:WebiSession[AS6LS2uphS9GilVoKXDMP9k]: loadState by user 'userA' on document 'Webi Report A'  AsyncCaller.cpp:181:bool __cdecl async::AsyncCaller::startRequest(const char *,int): TraceLog message 21020
    So what part of this is an error? To me, this looks like normal processing steps. Another example, would be:
    |1404a918-2cad-9184-6a6b-decec93efe1e|2014 07 30 17:58:06:861|-0400|Error|Error|>>|E| |webiserver_SERVERA.WebIntelligenceProcessingServer| 37332|35556|| |12|0|2|0|BIlaunchpad.WebApp|SERVERB:8700:75751.68935664:1|Webi SDK.CorbaServerImpl.doProcess()|SERVERB:8700:75751.68935664:3|webiserver_SERVERA.WebIntelligenceProcessingServer3.loadStateMDP|localhost:37332:35556.58574:1|CoWFD4_SSEn5rjTHJ2XQgkU41bdff6|||||||||||**ERROR:RequestProc:user: userC, doc: "", error stream: [kctRequestProc.cpp;777]  SharedContextImpl.cpp:227:__cdecl SharedContextImpl::~SharedContextImpl(void): TraceLog message 38279
    That one, I can see how it is an error, generic or otherwise, it is clearly an error.
    What we have been doing with our vb script, is parsing the log files by searching line by line for the word 'user' and then outputting a subset of that line to excel, figuring this gives us errors the user experienced.  What we were surprised to see is the sheer number of errors that the user is supposedly experiencing.  What is also surprising, is that it doesn't appear that the user sees these messages on screen all the time, given that they don't complain and/or we having someone working with the user who verifies no messages on screen appeared, but I can match up timestamps with the log files and their actions performed and see that an error was hit.
    Our initial goal with this was to see what users may need more training, but we also have been using it as a quick way to monitor things too.  So for example, if a user calls and complains they got an error, I'll use this first to help identify which server they were on and check out server related 'things' and then I use the glf viewer to view the full log if I need to analyze things further.
    Also, here's the link I have for the error message guide:
    http://help.sap.com/businessobject/product_guides/boexir4/en/xi4_error_messages_en.pdf
    For example, WIS 30600 is not in there.  Is there a different guide?
    Thanks,
    Missy

  • WebLogic 10.3.2.0 hanging at startup after "The server log file is opened."

    Hi,
    A WebLogic 10.3.2.0 server is hanging at startup. There are no error messages. The last message in the startup window is:
    "The server log file <log file dest> is opened. All server side log events will be written to this file."
    I think the next line should be:
    "Security initializing using security realm realm."
    Any ideas on what the issue could be? For instance, what resources should be accessed at that point in time? There is sufficient space left on the (virtual machine) disk. The VM is configured with 8 GB of memory. Could it still be performance related?
    Following is written to the log file:
    ####<12.aug.2010 kl 09.47 CEST> <Info> <WebLogicServer> <oim> <> <Main Thread> <> <> <> <1281599254656> <BEA-000214> <WebLogic Server "AdminServer" version:
    WebLogic Server 10.3.2.0 Tue Oct 20 12:16:15 PDT 2009 1267925 Copyright (c) 1995, 2009, Oracle and/or its affiliates. All rights reserved.> ####<12.aug.2010 kl 09.47 CEST> <Notice> <Log Management> <oim> <> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <> <> <> <1281599255312> <BEA-170019> <The server log file ....logs\AdminServer.log is opened. All server side log events will be written to this file.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Log Management> <oim> <> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <> <> <> <1281599255390> <BEA-170023> <The Server Logging is initialized with Java Logging API implementation.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Diagnostics> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599255671> <BEA-320001> <The ServerDebug service initialized successfully.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Store> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599256515> <BEA-280050> <Persistent store "WLS_DIAGNOSTICS" opened: directory="....s\domains\oim\servers\AdminServer\data\store\diagnostics" writePolicy="Disabled" blockSize=512 directIO=false driver="wlfileio2"> ####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257140> <BEA-002622> <The protocol "t3" is now configured.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257140> <BEA-002622> <The protocol "t3s" is now configured.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257140> <BEA-002622> <The protocol "http" is now configured.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257140> <BEA-002622> <The protocol "https" is now configured.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257140> <BEA-002622> <The protocol "iiop" is now configured.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257156> <BEA-002622> <The protocol "iiops" is now configured.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257156> <BEA-002622> <The protocol "ldap" is now configured.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257156> <BEA-002622> <The protocol "ldaps" is now configured.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257187> <BEA-002622> <The protocol "cluster" is now configured.> ####<12.aug.2010 kl 09.47 CEST> <Info> 
<Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257187> <BEA-002622> <The protocol "clusters" is now configured.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257218> <BEA-002622> <The protocol "snmp" is now configured.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257218> <BEA-002622> <The protocol "admin" is now configured.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257218> <BEA-002624> <The administration protocol is "t3s" and is now configured.> ####<12.aug.2010 kl 09.47 CEST> <Info> <RJVM> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257468> <BEA-000570> <Network Configuration for Channel "AdminServer"
    Listen Address          :7001
    Public Address          N/A
    Http Enabled          true
    Tunneling Enabled     false
    Outbound Enabled     false
    Admin Traffic Enabled     true>
    ####<12.aug.2010 kl 09.47 CEST> <Info> <Server> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599257687> <BEA-002609> <Channel Service initialized.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Socket> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599258000> <BEA-000406> <NTSocketMuxer was built on Jan 13 2005 17:47:03
    ####<12.aug.2010 kl 09.47 CEST> <Info> <Socket> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599258078> <BEA-000436> <Allocating 3 reader threads.> ####<12.aug.2010 kl 09.47 CEST> <Info> <Socket> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599258078> <BEA-000446> <Native IO Enabled.> ####<12.aug.2010 kl 09.47 CEST> <Info> <IIOP> <oim> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1281599259500> <BEA-002014> <IIOP subsystem enabled.>
    Thanks!!

    I tried both of these, and am still having the same error as below:
    <Sep 8, 2010 1:32:37 PM IST> <Critical> <Security> <BEA-090402> <Authentication denied: Boot identity not valid; The user name and/or password from the boot identity file (boot.properties) is not valid. The boot identity may have been changed since the boot identity file was created. Please edit and update the boot identity file with the proper values of username and password. The first time the updated boot identity file is used to start the server, these new values are encrypted.>
    <Sep 8, 2010 1:32:37 PM IST> <Critical> <WebLogicServer> <BEA-000386> <Server subsystem failed. Reason: weblogic.security.SecurityInitializationException: Authentication denied: Boot identity not valid; The user name and/or password from the boot identity file (boot.properties) is not valid. The boot identity may have been changed since the boot identity file was created. Please edit and update the boot identity file with the proper values of username and password. The first time the updated boot identity file is used to start the server, these new values are encrypted.
    weblogic.security.SecurityInitializationException: Authentication denied: Boot identity not valid; The user name and/or password from the boot identity file (boot.properties) is not valid. The boot identity may have been changed since the boot identity file was created. Please edit and update the boot identity file with the proper values of username and password. The first time the updated boot identity file is used to start the server, these new values are encrypted.
    at weblogic.security.service.CommonSecurityServiceManagerDelegateImpl.doBootAuthorization(CommonSecurityServiceManagerDelegateImpl.java:959)
    at weblogic.security.service.CommonSecurityServiceManagerDelegateImpl.initialize(CommonSecurityServiceManagerDelegateImpl.java:1050)
    at weblogic.security.service.SecurityServiceManager.initialize(SecurityServiceManager.java:875)
    at weblogic.security.SecurityService.start(SecurityService.java:141)
    at weblogic.t3.srvr.SubsystemRequest.run(SubsystemRequest.java:64)
    Truncated. see log file for complete stacktrace
    Caused By: javax.security.auth.login.FailedLoginException: [Security:090304]Authentication Failed: User weblogic2 javax.security.auth.login.FailedLoginException: [Security:090302]Authentication Failed: User weblogic2 denied
    at weblogic.security.providers.authentication.LDAPAtnLoginModuleImpl.login(LDAPAtnLoginModuleImpl.java:250)
    at com.bea.common.security.internal.service.LoginModuleWrapper$1.run(LoginModuleWrapper.java:110)
    at java.security.AccessController.doPrivileged(Native Method)
    at com.bea.common.security.internal.service.LoginModuleWrapper.login(LoginModuleWrapper.java:106)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    Truncated. see log file for complete stacktrace
    >
    <Sep 8, 2010 1:32:37 PM IST> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to FAILED>
    <Sep 8, 2010 1:32:37 PM IST> <Error> <WebLogicServer> <BEA-000383> <A critical service failed. The server will shut itself down>
    <Sep 8, 2010 1:32:37 PM IST> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to FORCE_SHUTTING_DOWN>
    Please help me out ASAP...
