Parse robocopy Log File - new value

Hello,
I have found a script that parses the robocopy log file; the log looks like this:
   ROBOCOPY     ::     Robust File Copy for Windows                             
  Started : Thu Aug 07 09:30:18 2014
   Source : e:\testfolder\
     Dest : w:\testfolder\
    Files : *.*
  Options : *.* /V /NDL /S /E /COPYALL /NP /IS /R:1 /W:5
     Same          14.6 g e:\testfolder\bigfile - Copy (5).out
     Same          14.6 g e:\testfolder\bigfile - Copy.out
     Same          14.6 g e:\testfolder\bigfile.out
               Total    Copied   Skipped  Mismatch    FAILED    Extras
    Dirs :         1         0         1         0         0         0
   Files :         3         3         0         0         0         0
   Bytes :  43.969 g  43.969 g         0         0         0         0
   Times :   0:05:44   0:05:43                       0:00:00   0:00:00
   Speed :           137258891 Bytes/sec.
   Speed :            7854.016 MegaBytes/min.
   Ended : Thu Aug 07 09:36:02 2014
Most values are included in the output file, but the two speed parameters are not.
How can I get these two speed parameters into the output file?
Here is the script:
param(
	[parameter(Position=0,Mandatory=$true,ValueFromPipeline=$false,HelpMessage='Source Path with no trailing slash')][string]$SourcePath,
	[switch]$fp
)
write-host "Robocopy log parser. $(if($fp){"Parsing file entries"} else {"Parsing summaries only, use -fp to parse file entries"})"
#Arguments
# -fp File parse. Counts status flags and oldest file Slower on big files.
$ElapsedTime = [System.Diagnostics.Stopwatch]::StartNew()
$refreshrate=1 # progress counter refreshes this often when parsing files (in seconds)
# These summary fields always appear in this order in a robocopy log
$HeaderParams = @{
	"04|Started" = "date";
	"01|Source" = "string";
	"02|Dest" = "string";
	"03|Options" = "string";
	"07|Dirs" = "counts";
	"08|Files" = "counts";
	"09|Bytes" = "counts";
	"10|Times" = "counts";
	"05|Ended" = "date";
	#"06|Duration" = "string"
}
$ProcessCounts = @{
	"Processed" = 0;
	"Error" = 0;
	"Incomplete" = 0
}
$tab=[char]9
$files=get-childitem $SourcePath
$writer=new-object System.IO.StreamWriter("$(get-location)\robocopy-$(get-date -format "dd-MM-yyyy_HH-mm-ss").csv")
function Get-Tail([object]$reader, [int]$count = 10) {
	$lineCount = 0
	[long]$pos = $reader.BaseStream.Length - 1
	while($pos -gt 0) {
		$reader.BaseStream.position=$pos
		# 0x0D (#13) = CR
		# 0x0A (#10) = LF
		if ($reader.BaseStream.ReadByte() -eq 10) {
			$lineCount++
			if ($lineCount -ge $count) { break }
		}
		$pos--
	}
	# tests for file shorter than requested tail
	if ($lineCount -lt $count -or $pos -ge $reader.BaseStream.Length - 1) {
		$reader.BaseStream.Position=0
	} else {
		# $reader.BaseStream.Position = $pos+1
	}
	$lines=@()
	while(!$reader.EndOfStream) {
		$lines += $reader.ReadLine()
	}
	return $lines
}
function Get-Top([object]$reader, [int]$count = 10) {
	$lines=@()
	$lineCount = 0
	$reader.BaseStream.Position=0
	while(($linecount -lt $count) -and !$reader.EndOfStream) {
		$lineCount++
		$lines += $reader.ReadLine()
	}
	return $lines
}
function RemoveKey ( $name ) {
	if ( $name -match "\|" ) {
		return $name.split("|")[1]
	} else {
		return ( $name )
	}
}
function GetValue ( $line, $variable ) {
	if ($line -like "*$variable*" -and $line -like "* : *" ) {
		$result = $line.substring( $line.IndexOf(":")+1 )
		return $result
	} else {
		return $null
	}
}
function UnBodgeDate ( $dt ) {
	# Fixes RoboCopy botched date-times in format Sat Feb 16 00:16:49 2013
	if ( $dt -match ".{3} .{3} \d{2} \d{2}:\d{2}:\d{2} \d{4}" ) {
		$dt=$dt.split(" ")
		$dt=$dt[2],$dt[1],$dt[4],$dt[3] -join " "
	}
	if ( $dt -as [DateTime] ) {
		return ([DateTime]$dt).ToString("dd/MM/yyyy hh:mm:ss")
	} else {
		return $null
	}
}
function UnpackParams ($params ) {
	# Unpacks the file count block in the format
	# Dirs :      1827         0      1827         0         0         0
	# Files :      9791         0      9791         0         0         0
	# Bytes :  165.24 m         0  165.24 m         0         0         0
	# Times :   1:11:23   0:00:00                       0:00:00   1:11:23
	# The parameter name has already been removed; columns are 10 characters wide
	if ( $params.length -ge 58 ) {
		$params = $params.ToCharArray()
		$result=(0..5)
		for ( $i = 0; $i -le 5; $i++ ) {
			$result[$i]=$($params[$($i*10 + 1) .. $($i*10 + 9)] -join "").trim()
		}
		$result=$result -join ","
	} else {
		$result = ",,,,,"
	}
	return $result
}
$sourcecount = 0
$targetcount = 1
# Write the header line
$writer.Write("File")
foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
	if ( $HeaderParam.value -eq "counts" ) {
		$tmp="~ Total,~ Copied,~ Skipped,~ Mismatch,~ Failed,~ Extras"
		$tmp=$tmp.replace("~","$(removekey $headerparam.name)")
		$writer.write(",$($tmp)")
	} else {
		$writer.write(",$(removekey $HeaderParam.name)")
	}
}
if($fp){
	$writer.write(",Scanned,Newest,Summary")
}
$writer.WriteLine()
$filecount=0
# Enumerate the files
foreach ($file in $files) {
	$filecount++
	write-host "$filecount/$($files.count) $($file.name) ($($file.length) bytes)"
	$results=@{}
	$Stream = $file.Open([System.IO.FileMode]::Open,
	                     [System.IO.FileAccess]::Read,
	                     [System.IO.FileShare]::ReadWrite)
	$reader = New-Object System.IO.StreamReader($Stream)
	#$filestream=new-object -typename System.IO.StreamReader -argumentlist $file, $true, [System.IO.FileAccess]::Read
	$HeaderFooter = Get-Top $reader 16
	if ( $HeaderFooter -match "ROBOCOPY     ::     Robust File Copy for Windows" ) {
		if ( $HeaderFooter -match "Files : " ) {
			$HeaderFooter = $HeaderFooter -notmatch "Files : "
		}
		[long]$ReaderEndHeader=$reader.BaseStream.position
		$Footer = Get-Tail $reader 16
		$ErrorFooter = $Footer -match "ERROR \d \(0x000000\d\d\) Accessing Source Directory"
		if ($ErrorFooter) {
			$ProcessCounts["Error"]++
			write-host -foregroundcolor red "`t $ErrorFooter"
		} elseif ( $footer -match "---------------" ) {
			$ProcessCounts["Processed"]++
			$i=$Footer.count
			while ( !($Footer[$i] -like "*----------------------*") -or $i -lt 1 ) { $i-- }
			$Footer=$Footer[$i..$Footer.Count]
			$HeaderFooter+=$Footer
		} else {
			$ProcessCounts["Incomplete"]++
			write-host -foregroundcolor yellow "`t Log file $file is missing the footer and may be incomplete"
		}
		foreach ( $HeaderParam in $headerparams.GetEnumerator() | Sort-Object Name ) {
			$name = "$(removekey $HeaderParam.Name)"
			$tmp = GetValue $($HeaderFooter -match "$name : ") $name
			if ( $tmp -ne "" -and $tmp -ne $null ) {
				switch ( $HeaderParam.value ) {
					"date" { $results[$name]=UnBodgeDate $tmp.trim() }
					"counts" { $results[$name]=UnpackParams $tmp }
					"string" { $results[$name] = """$($tmp.trim())""" }
					default { $results[$name] = $tmp.trim() }
				}
			}
		}
		if ( $fp ) {
			write-host "Parsing $($reader.BaseStream.Length) bytes"
			# Now go through the file line by line
			$reader.BaseStream.Position=0
			$filesdone = $false
			$linenumber=0
			$FileResults=@{}
			$newest=[datetime]"1/1/1900"
			$linecount++
			$firsttick=$elapsedtime.elapsed.TotalSeconds
			$tick=$firsttick+$refreshrate
			$LastLineLength=1
			try {
				do {
					$line = $reader.ReadLine()
					$linenumber++
					if ( $line -eq "-------------------------------------------------------------------------------" -and $linenumber -gt 16 ) {
						# line is end of job
						$filesdone=$true
					} elseif ($linenumber -gt 16 -and $line -gt "" ) {
						$buckets=$line.split($tab)
						# this test will pass if the line is a file, fail if a directory
						if ( $buckets.count -gt 3 ) {
							$status=$buckets[1].trim()
							$FileResults["$status"]++
							$SizeDateTime=$buckets[3].trim()
							if ($sizedatetime.length -gt 19 ) {
								$DateTime = $sizedatetime.substring($sizedatetime.length -19)
								if ( $DateTime -as [DateTime] ){
									$DateTimeValue=[datetime]$DateTime
									if ( $DateTimeValue -gt $newest ) { $newest = $DateTimeValue }
								}
							}
						}
					}
					if ( $elapsedtime.elapsed.TotalSeconds -gt $tick ) {
						$line=$line.Trim()
						if ( $line.Length -gt 48 ) {
							$line="[...]"+$line.substring($line.Length-48)
						}
						$line="$([char]13)Parsing > $($linenumber) ($(($reader.BaseStream.Position/$reader.BaseStream.length).tostring("P1"))) - $line"
						write-host $line.PadRight($LastLineLength) -NoNewLine
						$LastLineLength = $line.length
						$tick=$tick+$refreshrate
					}
				} until ($filesdone -or $reader.endofstream)
			}
			finally {
				$reader.Close()
			}
			$line=$($([string][char]13)).padright($lastlinelength)+$([char]13)
			write-host $line -NoNewLine
		}
		$writer.Write("`"$file`"")
		foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
			$name = "$(removekey $HeaderParam.Name)"
			if ( $results[$name] ) {
				$writer.Write(",$($results[$name])")
			} else {
				if ( $ErrorFooter ) {
					#placeholder
				} elseif ( $HeaderParam.Value -eq "counts" ) {
					$writer.Write(",,,,,,")
				} else {
					$writer.Write(",")
				}
			}
		}
		if ( $ErrorFooter ) {
			$tmp = $($ErrorFooter -join "").substring(20)
			$tmp=$tmp.substring(0,$tmp.indexof(")")+1)+","+$tmp
			$writer.write(",,$tmp")
		} elseif ( $fp ) {
			$writer.write(",$LineCount,$($newest.ToString('dd/MM/yyyy hh:mm:ss'))")
			foreach ( $FileResult in $FileResults.GetEnumerator() ) {
				$writer.write(",$($FileResult.Name): $($FileResult.Value);")
			}
		}
		$writer.WriteLine()
	} else {
		write-host -foregroundcolor darkgray "$($file.name) is not recognised as a RoboCopy log file"
	}
}
write-host "$filecount files scanned in $($elapsedtime.elapsed.tostring()), $($ProcessCounts["Processed"]) complete, $($ProcessCounts["Error"]) have errors, $($ProcessCounts["Incomplete"]) incomplete"
write-host "Results written to $($writer.basestream.name)"
$writer.close()
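For reference, once saved the script is invoked like this (a usage sketch; the file name Parse-RobocopyLog.ps1 is my own):

.\Parse-RobocopyLog.ps1 -SourcePath C:\logs        # summaries only
.\Parse-RobocopyLog.ps1 -SourcePath C:\logs -fp    # also count per-file status flags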
I hope somebody can help me.
Thanks,
Horst
MOSS 2007 Farm; MOSS 2010 Farm; TFS 2010; TFS 2013; IIS 7.5

Hi Horst,
To convert multiple robocopy log files to a .csv file including the two "Speed" values, the script below may be helpful. I tested it with a single robocopy log file; the .csv file is written to "D:\":
$SourcePath="e:\1\1.txt" #robocopy log file
write-host "Robocopy log parser. $(if($fp){"Parsing file entries"} else {"Parsing summaries only, use -fp to parse file entries"})"
#Arguments
# -fp File parse. Counts status flags and oldest file Slower on big files.
$ElapsedTime = [System.Diagnostics.Stopwatch]::StartNew()
$refreshrate=1 # progress counter refreshes this often when parsing files (in seconds)
# These summary fields always appear in this order in a robocopy log
$HeaderParams = @{
	"04|Started" = "date";
	"01|Source" = "string";
	"02|Dest" = "string";
	"03|Options" = "string";
	"09|Dirs" = "counts";
	"10|Files" = "counts";
	"11|Bytes" = "counts";
	"12|Times" = "counts";
	"05|Ended" = "date";
	"07|Speed" = "default";
	"08|Speednew" = "default"
}
$ProcessCounts = @{
	"Processed" = 0;
	"Error" = 0;
	"Incomplete" = 0
}
$tab=[char]9
$files=get-childitem $SourcePath
$writer=new-object System.IO.StreamWriter("D:\robocopy-$(get-date -format "dd-MM-yyyy_HH-mm-ss").csv")
function Get-Tail([object]$reader, [int]$count = 10) {
	$lineCount = 0
	[long]$pos = $reader.BaseStream.Length - 1
	while($pos -gt 0) {
		$reader.BaseStream.position=$pos
		# 0x0D (#13) = CR
		# 0x0A (#10) = LF
		if ($reader.BaseStream.ReadByte() -eq 10) {
			$lineCount++
			if ($lineCount -ge $count) { break }
		}
		$pos--
	}
	# tests for file shorter than requested tail
	if ($lineCount -lt $count -or $pos -ge $reader.BaseStream.Length - 1) {
		$reader.BaseStream.Position=0
	} else {
		# $reader.BaseStream.Position = $pos+1
	}
	$lines=@()
	while(!$reader.EndOfStream) {
		$lines += $reader.ReadLine()
	}
	return $lines
}
function Get-Top([object]$reader, [int]$count = 10) {
	$lines=@()
	$lineCount = 0
	$reader.BaseStream.Position=0
	while(($linecount -lt $count) -and !$reader.EndOfStream) {
		$lineCount++
		$lines += $reader.ReadLine()
	}
	return $lines
}
function RemoveKey ( $name ) {
	if ( $name -match "\|" ) {
		return $name.split("|")[1]
	} else {
		return ( $name )
	}
}
function GetValue ( $line, $variable ) {
	if ($line -like "*$variable*" -and $line -like "* : *" ) {
		$result = $line.substring( $line.IndexOf(":")+1 )
		return $result
	} else {
		return $null
	}
}
function UnBodgeDate ( $dt ) {
	# Fixes RoboCopy botched date-times in format Sat Feb 16 00:16:49 2013
	if ( $dt -match ".{3} .{3} \d{2} \d{2}:\d{2}:\d{2} \d{4}" ) {
		$dt=$dt.split(" ")
		$dt=$dt[2],$dt[1],$dt[4],$dt[3] -join " "
	}
	if ( $dt -as [DateTime] ) {
		return ([DateTime]$dt).ToString("dd/MM/yyyy hh:mm:ss")
	} else {
		return $null
	}
}
function UnpackParams ($params ) {
	# Unpacks the file count block in the format
	# Dirs :      1827         0      1827         0         0         0
	# Files :      9791         0      9791         0         0         0
	# Bytes :  165.24 m         0  165.24 m         0         0         0
	# Times :   1:11:23   0:00:00                       0:00:00   1:11:23
	# The parameter name has already been removed; columns are 10 characters wide
	if ( $params.length -ge 58 ) {
		$params = $params.ToCharArray()
		$result=(0..5)
		for ( $i = 0; $i -le 5; $i++ ) {
			$result[$i]=$($params[$($i*10 + 1) .. $($i*10 + 9)] -join "").trim()
		}
		$result=$result -join ","
	} else {
		$result = ",,,,,"
	}
	return $result
}
$sourcecount = 0
$targetcount = 1
# Write the header line
$writer.Write("File")
foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
	if ( $HeaderParam.value -eq "counts" ) {
		$tmp="~ Total,~ Copied,~ Skipped,~ Mismatch,~ Failed,~ Extras"
		$tmp=$tmp.replace("~","$(removekey $headerparam.name)")
		$writer.write(",$($tmp)")
	} else {
		$writer.write(",$(removekey $HeaderParam.name)")
	}
}
if($fp){
	$writer.write(",Scanned,Newest,Summary")
}
$writer.WriteLine()
$filecount=0
# Enumerate the files
foreach ($file in $files) {
	$filecount++
	write-host "$filecount/$($files.count) $($file.name) ($($file.length) bytes)"
	$results=@{}
	$Stream = $file.Open([System.IO.FileMode]::Open,
	                     [System.IO.FileAccess]::Read,
	                     [System.IO.FileShare]::ReadWrite)
	$reader = New-Object System.IO.StreamReader($Stream)
	#$filestream=new-object -typename System.IO.StreamReader -argumentlist $file, $true, [System.IO.FileAccess]::Read
	$HeaderFooter = Get-Top $reader 16
	if ( $HeaderFooter -match "ROBOCOPY     ::     Robust File Copy for Windows" ) {
		if ( $HeaderFooter -match "Files : " ) {
			$HeaderFooter = $HeaderFooter -notmatch "Files : "
		}
		[long]$ReaderEndHeader=$reader.BaseStream.position
		$Footer = Get-Tail $reader 16
		$ErrorFooter = $Footer -match "ERROR \d \(0x000000\d\d\) Accessing Source Directory"
		if ($ErrorFooter) {
			$ProcessCounts["Error"]++
			write-host -foregroundcolor red "`t $ErrorFooter"
		} elseif ( $footer -match "---------------" ) {
			$ProcessCounts["Processed"]++
			$i=$Footer.count
			while ( !($Footer[$i] -like "*----------------------*") -or $i -lt 1 ) { $i-- }
			$Footer=$Footer[$i..$Footer.Count]
			$HeaderFooter+=$Footer
		} else {
			$ProcessCounts["Incomplete"]++
			write-host -foregroundcolor yellow "`t Log file $file is missing the footer and may be incomplete"
		}
		foreach ( $HeaderParam in $headerparams.GetEnumerator() | Sort-Object Name ) {
			$name = "$(removekey $HeaderParam.Name)"
			if ($name -eq "Speed") { # handle the two Speed lines
				($HeaderFooter -match "$name : ") | foreach {
					$tmp = GetValue $_ "Speed"
					$results[$name] = $tmp.trim()
					$name += "new" # the second Speed line lands in "Speednew"
				}
			} elseif ($name -eq "Speednew") {
				# already filled by the "Speed" pass above
			} else {
				$tmp = GetValue $($HeaderFooter -match "$name : ") $name
				if ( $tmp -ne "" -and $tmp -ne $null ) {
					switch ( $HeaderParam.value ) {
						"date" { $results[$name]=UnBodgeDate $tmp.trim() }
						"counts" { $results[$name]=UnpackParams $tmp }
						"string" { $results[$name] = """$($tmp.trim())""" }
						default { $results[$name] = $tmp.trim() }
					}
				}
			}
		}
		if ( $fp ) {
			write-host "Parsing $($reader.BaseStream.Length) bytes"
			# Now go through the file line by line
			$reader.BaseStream.Position=0
			$filesdone = $false
			$linenumber=0
			$FileResults=@{}
			$newest=[datetime]"1/1/1900"
			$linecount++
			$firsttick=$elapsedtime.elapsed.TotalSeconds
			$tick=$firsttick+$refreshrate
			$LastLineLength=1
			try {
				do {
					$line = $reader.ReadLine()
					$linenumber++
					if ( $line -eq "-------------------------------------------------------------------------------" -and $linenumber -gt 16 ) {
						# line is end of job
						$filesdone=$true
					} elseif ($linenumber -gt 16 -and $line -gt "" ) {
						$buckets=$line.split($tab)
						# this test will pass if the line is a file, fail if a directory
						if ( $buckets.count -gt 3 ) {
							$status=$buckets[1].trim()
							$FileResults["$status"]++
							$SizeDateTime=$buckets[3].trim()
							if ($sizedatetime.length -gt 19 ) {
								$DateTime = $sizedatetime.substring($sizedatetime.length -19)
								if ( $DateTime -as [DateTime] ){
									$DateTimeValue=[datetime]$DateTime
									if ( $DateTimeValue -gt $newest ) { $newest = $DateTimeValue }
								}
							}
						}
					}
					if ( $elapsedtime.elapsed.TotalSeconds -gt $tick ) {
						$line=$line.Trim()
						if ( $line.Length -gt 48 ) {
							$line="[...]"+$line.substring($line.Length-48)
						}
						$line="$([char]13)Parsing > $($linenumber) ($(($reader.BaseStream.Position/$reader.BaseStream.length).tostring("P1"))) - $line"
						write-host $line.PadRight($LastLineLength) -NoNewLine
						$LastLineLength = $line.length
						$tick=$tick+$refreshrate
					}
				} until ($filesdone -or $reader.endofstream)
			}
			finally {
				$reader.Close()
			}
			$line=$($([string][char]13)).padright($lastlinelength)+$([char]13)
			write-host $line -NoNewLine
		}
		$writer.Write("`"$file`"")
		foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
			$name = "$(removekey $HeaderParam.Name)"
			if ( $results[$name] ) {
				$writer.Write(",$($results[$name])")
			} else {
				if ( $ErrorFooter ) {
					#placeholder
				} elseif ( $HeaderParam.Value -eq "counts" ) {
					$writer.Write(",,,,,,")
				} else {
					$writer.Write(",")
				}
			}
		}
		if ( $ErrorFooter ) {
			$tmp = $($ErrorFooter -join "").substring(20)
			$tmp=$tmp.substring(0,$tmp.indexof(")")+1)+","+$tmp
			$writer.write(",,$tmp")
		} elseif ( $fp ) {
			$writer.write(",$LineCount,$($newest.ToString('dd/MM/yyyy hh:mm:ss'))")
			foreach ( $FileResult in $FileResults.GetEnumerator() ) {
				$writer.write(",$($FileResult.Name): $($FileResult.Value);")
			}
		}
		$writer.WriteLine()
	} else {
		write-host -foregroundcolor darkgray "$($file.name) is not recognised as a RoboCopy log file"
	}
}
write-host "$filecount files scanned in $($elapsedtime.elapsed.tostring()), $($ProcessCounts["Processed"]) complete, $($ProcessCounts["Error"]) have errors, $($ProcessCounts["Incomplete"]) incomplete"
write-host "Results written to $($writer.basestream.name)"
$writer.close()
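For a quick check without the full parser, a minimal sketch that pulls just the two Speed summary lines from a single log (log path as in the example above):

# Grab the two "Speed :" lines and keep only the values after the colon
$speed = Select-String -Path 'e:\1\1.txt' -Pattern '^\s*Speed :' |
         ForEach-Object { ($_.Line -split ':', 2)[1].Trim() }
$speed    # e.g. "137258891 Bytes/sec." and "7854.016 MegaBytes/min."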
If you have any other questions, please feel free to let me know.
Best Regards,
Anna Wang
TechNet Community Support

Similar Messages

  • Parsing a log file on Weblogic

    Hi!
    I'd like to know how to get started on parsing a log file present in the default directory of Weblogic (ver 6.1 to be precise).
    I thought of using regular expressions, and use java.util.regex , but that is supported from JDK1.5 onwards, whereas WL6.1 supports JDK1.3.
    If u can also provide the code template for the same , that would be nice.
    Thanks in advance,
    Deepthy.

    uncle_alice wrote:
    String regex = "([^\"\\\\]++|\\\\.)++"{code} The trick is to match anything except a quotation mark or a backslash, OR match a backslash followed by anything (because the backslash is usually used to escape other characters as well, including backslashes).Superb! Thanks! I have to admit I've never used the ++ before (only the greedies), but that's the thing I was looking for.
    Just for the completeness, this is the whole thing that's able to parse a log line:
    {code}
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class LogParser {
        private static final String NOSPACE_PARAM = "([^ ]++)";
        private static final String DATE_PARAM = "([^\\]]++)";
        private static final String ESCAPED_PARAM = "((?:[^\"\\\\]++|\\\\.)++)";
        private static final String PATTERN_STRING = NOSPACE_PARAM
            + " " + NOSPACE_PARAM
            + " " + NOSPACE_PARAM
            + " \\[" + DATE_PARAM + "\\]"
            + " \"" + ESCAPED_PARAM + "\""
            + " " + NOSPACE_PARAM
            + " " + NOSPACE_PARAM
            + " \"" + ESCAPED_PARAM + "\""
            + " \"" + ESCAPED_PARAM + "\""
            + " \"" + ESCAPED_PARAM + "\""
            + " \"" + ESCAPED_PARAM + "\""
            + " \"" + ESCAPED_PARAM + "\""
            + " " + NOSPACE_PARAM
            + " \"" + ESCAPED_PARAM + "\""
            + " \"" + ESCAPED_PARAM + "\""
            + " \"" + ESCAPED_PARAM + "\""
            + " \"" + ESCAPED_PARAM + "\"";
        private static final Pattern PATTERN = Pattern.compile(PATTERN_STRING);

        public static String[] parse(String line) {
            Matcher m = PATTERN.matcher(line);
            if (m.matches()) {
                String[] result = new String[m.groupCount()];
                for (int i = 0; i < m.groupCount();) {
                    result[i] = m.group(++i);
                }
                return result;
            }
            return null;
        }
    }
    {code}
    Any idea about the efficiency of this thing?

  • Robocopy log file

    I use the following robocopy command to copy files from old to a new server:
    @ECHO OFF
    SET _source=d:\data\
    SET _dest=\\newserver\DATA\
    SET _what=/COPYALL /ZB /secfix /SEC /MIR
    SET _options=/R:0 /W:0 /LOG:C:\MyLogfile.txt 
    ROBOCOPY %_source% %_dest% %_what% %_options%
    How can I change this command to log only changed/new files and directories? Right now it is logging everything.
    Thanks
    patyk

    Hi,
    You can exclude the lines in the log where the word “same” is present. For more detailed information, please refer to the article below:
    Robocopy –>Logging only differences
    http://msexchange.me/2011/04/03/robocopy-logging-only-differences/
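    If you only need to thin out a log you already have, a minimal sketch (using the log path from your command; the -diff file name is mine) that drops the "Same" entries:
    # Keep every line that is not a "Same" entry
    Get-Content C:\MyLogfile.txt |
        Where-Object { $_ -notmatch '^\s+Same\s' } |
        Set-Content C:\MyLogfile-diff.txt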
    Please Note: Microsoft is providing this information as a convenience to you. The sites are not controlled by Microsoft. Microsoft cannot make any representations regarding the quality, safety, or suitability of any software or information found there.
    Best Regards,
    Mandy
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact [email protected]

  • Robocopy Log File - Skipped files - Interpreting the Log file

    Hey all,
    I am migrating our main file server, which contains approximately 8TB of data, a few large folders at a time. The folder below is about 1.2TB. Looking at the log file (which is over 330MB) I can see it skipped a large number of files,
    however I haven't found text in the file that specifies what was skipped. Any idea what I should search for?
    I used the following Robocopy command to transfer the data:
    robocopy E:\DATA Z:\DATA /MIR /SEC /W:5 /R:3 /LOG:"Z:\Log\data\log.txt"
    The final log output is:
                    Total    Copied   Skipped  Mismatch    FAILED    Extras
         Dirs :    141093    134629      6464         0         0         0
        Files :   1498053   1310982    160208         0     26863       231
        Bytes :2024.244 g1894.768 g 117.468 g         0  12.007 g  505.38 m
        Times :   0:00:00  18:15:41                       0:01:00 -18:-16:-41
        Speed :            30946657 Bytes/sec.
        Speed :            1770.781 MegaBytes/min.
        Ended : Thu Jul 03 04:05:33 2014
    I assume some are files that are in use but others may be permissions issues, does the log file detail why a file is not copied?
    TIA
    Carl

    Hi.
    Files that are skipped are files that already exist. Files that are open or have permission problems will be listed under failed. As Noah said, use /V to see which files were skipped. From robocopy /?:
    :: Logging Options :
    /V :: produce Verbose output, showing skipped files.
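    To pull the failure lines out of the log you already have, a rough sketch (log path taken from your command):
    # List the lines robocopy flagged with ERROR (the failed items)
    Select-String -Path 'Z:\Log\data\log.txt' -Pattern 'ERROR \d+ \(0x' |
        ForEach-Object { $_.Line }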
    Please remember to click “Mark as Answer” on the post that helps you, and to click “Unmark as Answer” if a marked post does not actually answer your question. Even if you are not the author of a thread you can always help others by voting as Helpful. This can
    be beneficial to other community members reading the thread.
    Oscar Virot

  • Parsing sendmail log file in a Java application

    Are there any good parsing libraries for sendmail log files?
    I already found these libraries but I'm not sure if they do what I need:
    http://www.opennms.org/documentation/java-apidocs-stable/org/opennms/netmgt/syslogd/package-summary.html
    http://code.google.com/p/jsyslogd/

    > I've written a simple text editor, and this editor saves files with a particular extension. It can both open and save files. I've put the text editor program in a JAR. What I'd like to do, if possible, is associate the file extension with the text editor program. That is, I'd like to, when I click on a file with the extension, have the text editor come up with the file opened in it. Can anyone give me ideas on how to do this, please?
    If the editor is launched using webstart, the launch file can suggest a file association.
    Note that an application that accesses the local file system needs to be digitally signed before it can break out of the applet like 'sandbox' in which it runs, unless it uses the JNLP API to access the files. The JNLP API is available to any app. launched using webstart.
    There is an example of both claiming a file extension, and accessing files using the JNLP API, in this [File Service Demo|http://pscode.org/jws/api.html#fs]. The complete source and a build file can be downloaded from the filetest.zip just near the launch buttons.
    I suggest you try the sandboxed version first - if you think that will suit your user, go with that.
    As an aside, for best chance of a solution, I recommend folks add [Duke stars|http://wikis.sun.com/display/SunForums/Duke+Stars+Program+Overview] to match the importance of the task.

  • Parse a log file...

    Hi All,
    1. I want to parse the content of a log file, but when I open the log file it does not show field names.
    It starts with a row containing the contents directly, and I want to read and process only three fields from each row.
    I have written code that works on IIS logs; the log I want to parse uses a single white space ' ' as the field separator.
    2. Some of the log files are zipped, so I am unable to open and read them in order to parse them.
    Does anyone have a clue or code that could help me out?
    thanks!

    bhatnagarudit wrote the question quoted above. Here is a suggested algorithm (I don't want to write the code for you :-)).
    You have the following format.
    314159b66967d86f031c7249d1d9a8024.. mybucket +[04/Aug/2006:22:34:02 +0000]+ 72.21.206.5 314159b66967d86f031c724... 3E57427F33A59F07 REST.PUT.OBJECT* /photos/2006/08/puppy.jpg +"GET /mybucket/photos/2006/08/puppy.jpg?x-foo=bar"+ 200 NoSuchBucket 2662992 3462992 70 10 "http://www.amazon.com/webservices" "curl/7.15.1"
    Read the file in and go through the lines one by one. For each line:
    1. Get the content in the first pair of square brackets.
    2. From there, get the fourth (4th) word separated by spaces.
    3. From there, get the content in the first pair of double quotes.
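    The same three steps as a PowerShell sketch (the file name and the regexes are mine, based on the sample line above):
    $line = Get-Content 'C:\logs\access.log' -TotalCount 1
    if ($line -match '\[([^\]]+)\]') { $stamp = $Matches[1] }     # 1. content of the first [...] block
    $fourth = ($line -split ' +')[3]                              # 2. fourth space-separated word
    if ($line -match '"([^"]*)"') { $quoted = $Matches[1] }       # 3. content of the first "..." pair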

  • Parse XMLFormatter log files back into LogRecords

    My app uses a java.util.logging.FileHandler and XMLFormatter to generate XML log files, which is great. Now I want to display those logs in my app's web admin console.
    I assumed that there would be an easy way to read my XML logs files back into a List of LogRecord objects, but my initial investigation has revealed nothing.
    Anyone have any advice?

    If you remove an "active" log file, then this can cause problems. If you remove an archieved log file, then it is OK.
    If you change the log directory, then you SHOULD inform all your applications that use this directory... Depending on the service, this information is usually stored inside the config files of the services.
    Mihalis.

  • Application to read 2 log files from internet

    Hi,
    Could anybody tell me how to develop this project? I'm short on time. Thanks.
    You have been asked into a company called Broken Arrow Software as a Senior Software Engineer and Management consultant. You are being paid very handsomely and have been asked to write an application program in Java for parsing a log file.
    • The program will have a standard menu using the AWT only.
    • ALL LAYOUTS MUST USE ONLY THE BORDER-LAYOUT OR GRID-LAYOUT OR A COMBINATION OF BOTH (Any other layouts score zero).
    This application reads two Log files from the internet. The first file contains USA state names and the abbreviation used for that state. This information will be used in displaying totals from another network file and should be stored in an array (or two arrays, or a two dimensional array) in the program. The second file should extract the "Reversed subdomain" section from a Log file (a section is shown later) and store this in an array in the application. This should be processed and only those domains beginning with "us." should be processed. Each "us." has an abbreviated USA state after it ("tx" is an abbreviation for Texas). The abbreviations are sorted and all states are on consecutive lines and each displays a number of accesses for that state. The accesses should be totalled and displayed in the current Frame for each state in the form of Java List components (these do not need to be synchronised so that they all scroll in unison).
    The application will be a Java application with the following menus and MenuItems:
    • Splash screen (10%)
    • Application (20%)
    o Open USA abbreviation file
    o Clear screen
    o Exit
    • File (25%)
    o Open network log file
    o Open locally saved report file
    o Recent report files
    o Save as report file
    • Graph (20%)
    o Plot
    • Help (15%)
    o Help on Application
    o About
    A basic pass for the application will be for a basic implementation of an application with "help" options and some basic implementations of a Splash screen and some basic file and application options. Very high marks will be awarded for processing network files, saving and opening files, very good HCI, application design and error handling, excellent OO design for classes, gorgeous layout and commenting of code and excellent graphing capabilities.
    Error handling dialogs and overall application design (10%)
    Every class must be in a separate file.
    Inheritance should be used for WindowListeners of Frames and Dialogs.
    Up to 10% can also be lost by unprofessional code layout and lack of professional standards. Always adhere to standards taught throughout the module and your time at the University of Northumbria.
    Examples of non-professionalism would include bad indentation, no comments, meaningless variable names, politically incorrect graphics, commented out code, empty .java files, .java files which are not part of the project etc. Remember your application is your livelihood and your company depends on your application standards.
    The splash screen must be a Frame with a Canvas as part of it showing your own logo. Your logo should be individual to you but does not need to win the computing equivalent of the Turner prize. The application should be displayed behind the Splash screen and both should be visible. The application must not be able to be brought to the front and used without the Splash screen being disposed of.
    The application should only enable the "Open USA abbreviation file", "Open locally saved report file", "Help" options and "Exit" Menus and MenuItems, when the application starts. On opening a valid USA abbreviation file, then the other Menus and MenuItems should be enabled.
    The abbreviations should be read into an array. These should be used in displaying the totals for the reversed subdomain totals for each USA state. A total for all USA states should be displayed at the bottom of the current Frame, with a suitable Label (this design is your own). This current Frame should display a series of Lists starting with a List of USA state number (1 to n). A List of USA state abbreviation should be next followed by a List of the actual USA state name, followed by a List of the total accesses for that state.
    The report file should be an ASCII file that can be printed out from an ASCII text editor such as DOS edit or Microsoft NotePad.
    The "Open" network files should display a Dialog asking for the http:// address of the file, with "OK" and "Cancel" options. It is helpful if the user can hit "return" instead of clicking on "OK" and "Escape" instead of "Cancel". Error Dialogs should be used to indicate any errors that may occur and the state of the application should be reset to that of before displaying the Dialog.
    When "Save as report file" is chosen a FileDialog box should be used for the user to choose both directory and filename. The file should be able to be saved as a ".rpt" file.
    Open report file should display the report in a Frame; the design of which is your own.
    Plotting the graph should pass a two dimensional array to a Frame with a Canvas. The Canvas should have a Paint method that draws the axis for the graph and any suitable Headings etc. The graph should draw a histogram of totals per USA states. The graph design is your own but you may wish to use Microsoft Excel as a good example of drawing a histogram.
    The "Clear screen" option should clear any data off the current screen.
    The "Exit" option should quit the application but it may be helpful to ask the user if they really want to exit the application.
    Help must be Java code and not linking into HTML. It should display help in a well designed screen. The most basic implementation might use a scrollable TextArea for a basic mark.
    See other software for a good "About" screen. The most basic should display your name, date, version and company.
    • Help should display your help on using the application. As a senior software engineer, the design is your own, based on experience of using applications, as is the opening splash screen. You may use other applications for inspiration only, as these will make up your experience.
    • Your good knowledge gained from HCI units studied should prove invaluable in the interface design and the usability of the application.
    • The design (Screens and classes) and quality and documentation of code throughout the application will be marked. The experience gained from programming 1 and 2 and Object Oriented Programming should prove invaluable throughout the application, as should any GUI units studied.
    The log file can be accessed at:
    http://computing.unn.ac.uk/staff/cgpb2/public_html/log.html

    You would really gain ever so much more from this exercise if you would write a couple of classes, then come back with some specific questions. If you're completely lost, try starting with the GUI first. It's not the best practice, always, but it is easy to visualize.
    On a side note, I wish I'd had assignments even half this intersting when I was in my Java classes...

  • Parse Webi Logs

    Has anyone ever attempted to parse a BO webi log file with powershell (or other windows scripting language) to get something in a more friendly output that could be passed on to an internal support team for review?  For example, when I review our log files I see a variety of error messages that might be something I need to pay attention to as a system admin or they might be 'training' type errors that I can hand off to our support team to get with the user.  I want to parse out the 'useful' info and ignore the noise ( such as excess info, 'soft errors', or successes).  Plus, the support team does not have access to our webi logs/webi servers.
    I am self-teaching as I go with powershell and since I have other scripts running in powershell, I started with that (there could be a better way).  I would just like to extract the date, server, full error information and username and format it to a list.  There is just no easy way that I can figure out to identify a single log entry line... or to know where the error information 'really' ends.  Maybe I'm overlooking the obvious though - which is very possible as these webi logs make my eyes go buggy!
    Has anyone ever tried anything like this and can share some insight? we are on 4.0 sp7 patch 7 if it makes a difference in log format or anything.
    thanks in advance,
    Missy

    Hi Mani,
    Thanks for the link to the trace settings - yet another good resource you have provided.  Our ini file is set as follows:
    active = true;
    importance = xl;
    alert = true;
    severity = 'E';
    //keep = false;
    //size = 100 * 1000;
    So, mostly defaults.  Not sure the exact difference in context between s, m, l, and xl, but even with these settings noted above, in our webiserver log, for example, I will get a line that looks like this:
    |12671c8e-cc1c-5164-2b7f-d36028aea63d|2014 07 30 17:14:58:479|-0400|Error| |>>| | |webiserver_SERVERA.WebIntelligenceProcessingServer| 9984|3984|| |2|0|2|0|BIlaunchpad.WebApp|SERVERA:7616:65.106749:1|Webi SDK.CorbaServerImpl.doProcess()|SERVERA:7616:65.106749:2|webiserver_SERVERA.WebIntelligenceProcessingServer.loadStateMDP|localhost:9984:3984.119276:1|CtC8xxN8nEPIkk05usjbm5s1a0fb|||||||||||AsyncCaller:WebiSession[AS6LS2uphS9GilVoKXDMP9k]: loadState by user 'userA' on document 'Webi Report A'  AsyncCaller.cpp:181:bool __cdecl async::AsyncCaller::startRequest(const char *,int): TraceLog message 21020
    So what part of this is an error? To me, this looks like normal processing steps. Another example, would be:
    |1404a918-2cad-9184-6a6b-decec93efe1e|2014 07 30 17:58:06:861|-0400|Error|Error|>>|E| |webiserver_SERVERA.WebIntelligenceProcessingServer| 37332|35556|| |12|0|2|0|BIlaunchpad.WebApp|SERVERB:8700:75751.68935664:1|Webi SDK.CorbaServerImpl.doProcess()|SERVERB:8700:75751.68935664:3|webiserver_SERVERA.WebIntelligenceProcessingServer3.loadStateMDP|localhost:37332:35556.58574:1|CoWFD4_SSEn5rjTHJ2XQgkU41bdff6|||||||||||**ERROR:RequestProc:user: userC, doc: "", error stream: [kctRequestProc.cpp;777]  SharedContextImpl.cpp:227:__cdecl SharedContextImpl::~SharedContextImpl(void): TraceLog message 38279
    That one, I can see how it is an error, generic or otherwise, it is clearly an error.
    What we have been doing with our vb script, is parsing the log files by searching line by line for the word 'user' and then outputting a subset of that line to excel, figuring this gives us errors the user experienced.  What we were surprised to see is the sheer number of errors that the user is supposedly experiencing.  What is also surprising, is that it doesn't appear that the user sees these messages on screen all the time, given that they don't complain and/or we having someone working with the user who verifies no messages on screen appeared, but I can match up timestamps with the log files and their actions performed and see that an error was hit.
    Our initial goal with this was to see what users may need more training, but we also have been using it as a quick way to monitor things too.  So for example, if a user calls and complains they got an error, I'll use this first to help identify which server they were on and check out server related 'things' and then I use the glf viewer to view the full log if I need to analyze things further.
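    For what it's worth, that line-by-line search translates to PowerShell roughly like this (a sketch; the log path is hypothetical and the field positions are guesses from the two sample lines above):
    # Split each line mentioning 'user' on '|' and keep a few useful fields
    Select-String -Path 'C:\BOLogs\webiserver_*.glf' -Pattern 'user' | ForEach-Object {
        $f = $_.Line -split '\|'
        [pscustomobject]@{
            TimeStamp = $f[2]
            Severity  = $f[4]
            Server    = $f[9]
            Message   = $f[-1]
        }
    }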
    Also, here's the link I have for the error message guide:
    http://help.sap.com/businessobject/product_guides/boexir4/en/xi4_error_messages_en.pdf
    For example, WIS 30600 is not in there.  Is there a different guide?
    Thanks,
    Missy

  • Problem in Rolling to new a log file only when it exceeds max size (Log4net library)

    Hello,
    I am using log4net library to create log files.
    My requirement is roll to a new log file with name appended with timestamp only when file size exceeds max size (file name ex: log_2014_12_11_12:34:45 etc).
    My config is as follow
     <appender name="LogFileAppender"
                          type="log4net.Appender.RollingFileAppender" >
            <param name="File" value="logging\log.txt" />
            <param name="AppendToFile" value="true" />
            <rollingStyle value="Size" />
            <maxSizeRollBackups value="2" />
            <maximumFileSize value="2MB" />
            <staticLogFileName value="true" />
            <lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
            <layout type="log4net.Layout.PatternLayout">
              <param name="ConversionPattern"
                   value="%-5p%d{yyyy-MM-dd hh:mm:ss} – %m%n" />
              <conversionPattern
                   value="%newline%newline%date %newline%logger 
                           [%property{NDC}] %newline>> %message%newline" />
            </layout>
          </appender>
    The issue is that the date/time is not appended to the file name.
    But if I set "rollingStyle" to "Date" or "Composite", the file name does get the timestamp appended, but a new file gets created before reaching the max file size (because a file is created whenever the date changes, which I don't want).
    Please help me in solving this issue?
    Thanks

    Hello,
    I'd ask the logfornet people: http://logging.apache.org/log4net/
    Or search on codeproject - there may be some tutorials that would help you.
    http://www.codeproject.com/Articles/140911/log-net-Tutorial
    http://www.codeproject.com/Articles/14819/How-to-use-log-net
    Karl
    When you see answers and helpful posts, please click Vote As Helpful, Propose As Answer, and/or Mark As Answer.
    My Blog: Unlock PowerShell
    My Book:
    Windows PowerShell 2.0 Bible
    My E-mail: -join ('6F6C646B61726C406F75746C6F6F6B2E636F6D'-split'(?<=\G.{2})'|%{if($_){[char][int]"0x$_"}})

  • Parse log file using powershell

    Hi,
    I am pretty new to PowerShell and would appreciate any assistance in setting up a script which parses through a log file and provides output for my requirements below.
    I would like to parse the main log file for the Barra Aegis application (shown below) using PowerShell.
    Main log = C:\BARRALIN\barralin.log
    Model specific log = C:\BARRALIN\log\WG*.log
    Requirements :
    1. Scroll to the bottom of the log file and look for the name "GL Daily" and its latest date, which in the example log below is "20150203".
    note : the name "GL Daily" and the date keep changing in the log file
    2. Once the entry is found, check that all 3 entries PREPROCESS, TRANSFER, POSTPROCESS are success.
    3. If all 3 are success, the script should identify the respective model-specific log number and print it out.
    E.g. in the sample log below, "GL Daily" is preceded by the number "1718", so the script should append "1718" to the model log prefix "WG00"; it should end up looking like C:\BARRALIN\log\WG001718.log.
    4. If all 3 items, or any one of them, are in "failed" state, then print the same log file info with WG001718.log.
    Any help on this would be much appreciated.
    Thank You.
    Main log file :
    START BARRALINK            Check Auto Update                                                1716  
    43006  20150203 
        Trgt/Arch c:\barralin                                               
        PREPROCESS           success   0 preprocessor: no error                   
        TRANSFER             success   1 Host success: files received             
        POSTPROCESS          success   0 Postprocessor: no error                  
        CONFIRMATION         success   2 No Confirm needed                        
    STOP  43105  20150203 
    START Aegis                GL Monthly                                                    
      1716   43117  20150203 
        Trgt/Arch K:\barraeqr\aegis\qnt\gleqty                              
        PREPROCESS           success   0 preprocessor: no error                   
        TRANSFER             success   1 Host success: files received             
        POSTPROCESS          success   0 Postprocessor: no error                  
        CONFIRMATION         success   2 No Confirm needed                        
    STOP  44435  20150203
    START Aegis                UB Daily                                                    
      1717   43107  20150203 
        Trgt/Arch K:\barraeqr\aegis\qnt\gleqty                              
        PREPROCESS           success   0 preprocessor: no error                   
        TRANSFER             success   1 Host success: files received             
        POSTPROCESS          success   0 Postprocessor: no error                  
        CONFIRMATION         success   2 No Confirm needed                        
    STOP  44435  20150203 
    START Aegis                GL Daily                                                    
        1718   44437  20150203 
        Trgt/Arch K:\barraeqr\aegis\qnt\gleqty                              
        PREPROCESS           success   0 preprocessor: no error                   
        TRANSFER             success   1 Host success: files received             
        POSTPROCESS          success   0 Postprocessor: no error                  
        CONFIRMATION         success   2 No Confirm needed                        
    STOP  50309  20150203 
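    A starting-point sketch against the sample above (paths are from the question; the 8-line block size and the regexes are my assumptions):
    # Find the last "GL Daily" block in the main log and check its three steps
    $log   = Get-Content 'C:\BARRALIN\barralin.log'
    $start = ($log | Select-String 'GL Daily' | Select-Object -Last 1).LineNumber - 1
    $block = $log[$start..($start + 7)]                        # START line through its STOP line
    $numLine = ($block -match '^\s*\d{4}\s+\d+\s+\d{8}') | Select-Object -First 1
    $num     = ($numLine.Trim() -split '\s+')[0]               # e.g. 1718
    $allOk = @('PREPROCESS','TRANSFER','POSTPROCESS') |
             ForEach-Object { ($block -match "$_\s+success").Count -gt 0 }
    $modelLog = "C:\BARRALIN\log\WG00$num.log"
    if ($allOk -notcontains $false) { "All three steps succeeded: $modelLog" }
    else { "One or more steps failed, check: $modelLog" }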
     

    Hi All,
    I was writing a function in PowerShell to send email, and I want to append lines to the body of the email as and when required, but I am not able to get this done. Here's my code:
    Function Email () {
        $MailMessage = New-Object System.Net.Mail.MailMessage
        $SMTPClient = New-Object System.Net.Mail.SmtpClient -ArgumentList "mailhost.xxx.com"
        $Recipient = "[email protected]"
        If ($MessageBody -ne $null) {
            $MessageBody = "The details of Barra $strsessionProduct model is listed below
            `rHostName : $localhost
            `r Model Run Date : $Date
            `r Model Data Date : $DateList1
            `r`n Click for full job log"+ "\\"+$localhost+"\E$\Local\Scripts\Logs "
            $MailMessage.Body = $MessageBody
        }
        If ($Subject -ne $null) {
            $MailMessage.Subject = $Subject
        }
        $Sender = "[email protected]"
        $MailMessage.Sender = $Sender
        $MailMessage.From = $Sender
        $MailMessage.to.Add($Recipient)
        If ($AttachmentFile -ne $null) { $MailMessage.Attachments.add($AttachmentFile) }
        $SMTPClient.Send($MailMessage)
    }
    $Subject = "Hello"
    $AttachmentFile = ".\barralin.log"
    $MessageBody = "Add this line to Body of email along with existing"
    Email -Recipient "" -Subject $Subject -MessageBody $MessageBody -AttachmentFile $AttachmentFile
    As you can see, before calling the Email function I add a line to $MessageBody and was expecting it to appear in the email body along with the existing text, but that's not the case.
    I have tried making $MessageBody an array and then adding contents to the array:
    $MessageBody += "Add this line to Body of email along with existing"
    $MessageBody = $MessageBody | out-string
    Even this didn't work for me. Please suggest any other means to get this done.
    Thank You
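    The function as posted never declares parameters and rebuilds $MessageBody inside its If block, so whatever the caller passes in is ignored. A trimmed sketch of the same function with an explicit param() block (names and SMTP host kept from the post; the addresses are the redacted placeholders from it):
    Function Email {
        param($Recipient, $Subject, $MessageBody, $AttachmentFile)
        $MailMessage = New-Object System.Net.Mail.MailMessage
        $SMTPClient  = New-Object System.Net.Mail.SmtpClient -ArgumentList "mailhost.xxx.com"
        $Sender = "[email protected]"
        $MailMessage.Sender = $Sender
        $MailMessage.From = $Sender
        $MailMessage.To.Add($Recipient)
        if ($Subject)     { $MailMessage.Subject = $Subject }
        if ($MessageBody) { $MailMessage.Body = $MessageBody }   # use the body passed in; do not rebuild it
        if ($AttachmentFile) { $MailMessage.Attachments.Add((New-Object System.Net.Mail.Attachment $AttachmentFile)) }
        $SMTPClient.Send($MailMessage)
    }
    $body = "The details of the Barra model run are listed below`r`n" +
            "Add this line to Body of email along with existing"
    Email -Recipient "[email protected]" -Subject "Hello" -MessageBody $body -AttachmentFile ".\barralin.log"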

  • Performance Issue: Wait event "log file sync" and "Execute to Parse %"

    In one of our test environments users are complaining about slow response.
    In statspack report folowing are the top-5 wait events
    Event                         Waits   Time (cs)   Wt Time
    log file parallel write       1,046         988     37.71
    log file sync                   775         774     29.54
    db file scattered read        4,946         248      9.47
    db file parallel write           66         248      9.47
    control file parallel write     188         152      5.80
    And after runing the same application 4 times, we are geting Execute to Parse % = 0.10. Cursor sharing is forced and query rewrite is enabled
    When I view v$sql, following command is parsed frequently
    EXECUTIONS  PARSE_CALLS  SQL_TEXT
         93380        93380  select SEQ_ORDO_PRC.nextval from DUAL
    Please suggest what should be the method to troubleshoot this and if I need to check some more information
    Regards,
    Sudhanshu Bhandari

    Well, of course, you probably can't eliminate this sort of thing entirely: a setup such as yours is inevitably a compromise. What you can do is make sure your log buffer is a good size (say 10MB or so); that your redo logs are large (at least 100MB each, and preferably large enough to hold one hour or so of redo produced at the busiest time for your database without filling up); and finally set ARCHIVE_LAG_TARGET to something like 1800 seconds or more to ensure a regular, routine, predictable log switch.
    It won't cure every ill, but that sort of setup often means the redo subsystem ceases to be a regular driver of foreground waits.

  • Not able to add new log file to the 11g database.

    Hi DBA's
    I am not able to add the log file; I am getting an error while altering the database.
    SQL> alter database add logfile group 3 ('/oracle/DEV/db/apps_st/data/log03a.dbf','/oracle/DEV/db/apps_st/data/log03a.dbf') size 50m reuse;
    alter database add logfile group 3 ('/oracle/DEV/db/apps_st/data/log03a.dbf','/oracle/DEV/db/apps_st/data/log03a.dbf') size 50m reuse
    ERROR at line 1:
    ORA-01505: error in adding log files
    ORA-01577: cannot add log file '/oracle/DEV/db/apps_st/data/log03a.dbf' - file
    already part of database
    SQL> select a.group#, member, a.status from v$log a, v$logfile b where a.group# = b.group# order by 1;
    GROUP#  MEMBER                                   STATUS
         1  /oracle/DEV/db/apps_st/data/log01a.dbf   ACTIVE
         1  /oracle/DEV/db/apps_st/data/log01b.dbf   ACTIVE
         2  /oracle/DEV/db/apps_st/data/log02a.dbf   CURRENT
         2  /oracle/DEV/db/apps_st/data/log02b.dbf   CURRENT
    Kindly help me to add the new log file to my database.
    Thanks,
    SG

    Hi Sawwan,
    V$LOGMEMBER was mentioned in the document;
    I queried the log members as below:
    1)select a.group#, member, a.status from v$log a, v$logfile b where a.group# = b.group# order by 1;
    GROUP#  MEMBER                                   STATUS
         1  /oracle/DEV/db/apps_st/data/log01a.dbf   INACTIVE
         1  /oracle/DEV/db/apps_st/data/log01b.dbf   INACTIVE
         2  /oracle/DEV/db/apps_st/data/log02a.dbf   CURRENT
         2  /oracle/DEV/db/apps_st/data/log02b.dbf   CURRENT
    2)SQL> select group#,member,status from v$logfile;
    GROUP#  MEMBER                                   STATUS
         2  /oracle/DEV/db/apps_st/data/log02a.dbf
         2  /oracle/DEV/db/apps_st/data/log02b.dbf
         1  /oracle/DEV/db/apps_st/data/log01a.dbf
         1  /oracle/DEV/db/apps_st/data/log01b.dbf
    But I am a little bit confused: as per the above queries there is no group or log file called "Group 3" or "log03a.dbf", so how can I drop that group and file?
    I also cross-verified in the data top whether the files exist, and they do not, but I still get the same error that the file I want to create already exists.
    Can I issue the query below to drop that group, which I don't think exists?
    SQL>alter database drop logfile group 3;
    Thanks in advance.
    Regards,
    SG

  • Windows 8 installation error: "We couldn't create a new partition or locate an existing one. For more information, see the Setup log files."

    I recently tried testing Server 2012 for installation and I had a major issue.  Windows 8 also uses the same method for installation so this should also apply to Windows 8.  I thought this might be helpful for other people who have the same
    problem.  My setup is a 1U server with no DVD drive, no hard drives, and only USB.  I have a new hard drive that I wanted to install Server onto, so I put it in and ran the installer off a thumb drive.  I kept receiving the following error when
    I tried to format the hard drive and proceed with the installation:
    "We couldn't create a new partition or locate an existing one. For more information, see the Setup log files."
    After hours of troubleshooting, I determined my problem was I was using a thumb drive to install it, and Windows didn't like it.  Some people had similar issues when similar devices were plugged in (i.e. external hard drives, SD cards, CF cards, etc). 
    In my case, it was the installation drive itself was a thumb drive.  I ended up digging up an external DVD drive and placed the installation onto a DVD.  Worked flawlessly.  Now I am up and running.
    I tried
    the steps here to no avail.
    Hope this helps anyone else with the same problems.

    Hi,
    Thanks for sharing. We really appreciate your time and efforts. Hope your experience will help other community members facing similar problems.
    Leo Huang
    TechNet Community Support

  • I recently upgraded my old powerbook to an iMac.  I've dumped the files of my backup drive into the new iMac. Is there a way to have access to all the files (new and old computers) when logged on as the same user vs. logging in and out to access each?

    I am new to posting to this support community but have often referred to it for answers.  So thank you all who've contributed; you've been a great help!
    I recently upgraded my old PowerBook to an iMac. I've dumped the files of my backup drive onto the new iMac. Is there a way to have access to/merge all the files (new and old computers) together so that when I'm logged in I can access all of them?
    Thanks!
    M

    Sure-glad to help you. You will not lose any data by changing synching to MacBook Pro from imac. You have set up Time Machine, right? that's how you'd do your backup, so I was told, and how I do my backup on my mac.  You should be able to set a password for it. Save it.  Your stuff should be saved there. So if you want to make your MacBook Pro your primary computer,  I suppose,  back up your stuff with Time machine, turn off Time machine on the iMac, turn it on on the new MacBook Pro, select the hard drive in your Time Capsule, enter your password, and do a backup from there. It might work, and it might take a while, but it should go. As for clogging the hard drive, I can't say. Depends how much stuff you have, and the hard drive's capacity.  As for moving syncing from your iMac to your macbook pro, should be the same. Your phone uses iTunes to sync and so that data should be in the cloud. You can move your iTunes Library to your new Macbook pro
    you should be able to sync your phone on your new MacBook Pro. Don't know if you can move the older backups yet-maybe try someone else, anyways,
    This handy article from Apple explains how
    How to move your iTunes library to a new computer - Apple Support''
    don't forget to de-authorize your iMac if you don't want to play purchased stuff there
    and re-authorize your new macBook Pro
    time machine is an application, and should be found in the Applications folder. it is built in to OS X, so there is nothing else to buy. double click on it, get it going, choose the Hard drive in your Time capsule/Airport as your backup Time Machine  and go for it.  You should see a circle with an arrow on the top right hand of your screen (the Desktop), next to the bluetooth icon, and just after the wifi and eject key (looks sorta like a clock face). This will do automatic backups  of your stuff.
