Best way to append null bytes to a byte array?

I have a fixed-length byte array and need to add two null bytes to its end. I have an implementation that I wrote to do this, though I was wondering whether someone could improve on what I have:
public byte[] appendNullBytes( byte[] bytes ) {
   byte[] copyBytes = new byte[bytes.length+2];
   System.arraycopy(bytes, 0, copyBytes, 0, bytes.length);
   copyBytes[copyBytes.length-1] = 0;
   copyBytes[copyBytes.length-2] = 0;
   return copyBytes;
}
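
For comparison, java.util.Arrays.copyOf (available since Java 6) does the copy and the zero padding in one call, since the new positions in the returned array are zero-filled automatically. A minimal sketch of the same method using it:

public byte[] appendNullBytes( byte[] bytes ) {
   // Arrays.copyOf pads the two extra positions with zero bytes
   return java.util.Arrays.copyOf(bytes, bytes.length + 2);
}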

ajmasx wrote:
jverd wrote:
ajmasx wrote:
jverd wrote:
Those aren't null bytes, they're zero bytes. A byte can never have the value null in Java. / I was using the term null interchangeably with 'zero'. / Which is not generally applicable in Java. / Well, it all depends what you are doing:
- A null is a zero value. / Not in general in Java. Null is null and zero is zero.
- char c = '\0' defines a null character. / Yes, that's the only use of "null" for primitive types.
- A null pointer is where the address is zero and not pointing to some location in memory. / Nope. A null pointer is simply one that doesn't point to any value. Nothing in the spec says it's zero.
- A null reference is more or less the same thing (there are certain differences in the details). / Nope. A null pointer and a null reference in Java are the same thing.
- A null terminated string is one where null defines the end of the string. / This does not exist in Java. As the above smartass pointed out, you can put the null character at the end of a String, but that character is part of the String, not a terminator.
If you are dealing with files and other types of I/O then this concept most certainly exists. / Not in Java.
Just because you are limiting yourself to high-level use of Java does not mean they aren't present in Java. / The fact that null is precisely defined in the JLS and does not include these things means they aren't present in Java.
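
To make the distinction concrete, a small Java snippet (an illustration added here, not from the thread): primitives such as byte can hold zero but never null, only references can be null, and a '\0' inside a String is just another character rather than a terminator.

byte b = 0;           // a zero byte; "byte b = null;" would not compile
Byte boxed = null;    // only references (including boxed types) can be null
char c = '\0';        // the null character, numeric value 0
String s = "abc\0";   // s.length() is 4; the '\0' is part of the String, not a terminator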

Similar Messages

  • What is the best way to append data from one field to another?

    I have the following table, table1:
    Name Null? Type
    MAIL_ID NOT NULL NUMBER(10)
    LAST_NAME VARCHAR2(45)
    FIRST_NAME VARCHAR2(45)
    MIDDLE_INITIAL VARCHAR2(1)
    ADDRESS_1 VARCHAR2(45)
    CITY VARCHAR2(35)
    STATE VARCHAR2(2)
    ZIP VARCHAR2(10)
    REMARKS VARCHAR2(200)
    The table has duplicate entries that need to be removed. The records that will be removed need the
    data in the Remarks column appended to the Remarks data of the record that is not deleted.
    For example, the following listing shows a sample of the duplicate records.
    Mail ID Last Name First Name M Address City St ZIP Remarks
    189 BROWN STEPHEN 6706 MOESER LN EL CERRITO CA 94530-2909 Sf7#s124,f16#d7996(NML)[Cl#117][Ml#1649][NMf1#d288][NCf9#d319][SNl#e62]
    211023 BROWN STEPHEN B 6706 MOESER LN EL CERRITO CA 94530 RLl#a12047[IDl#i398]
    287796 BROWN STEPHEN B 6706 MOESER LN EL CERRITO CA 94530 SNl#e1163
    The following listing shows how the kept record should appear after the duplicate records are deleted.
    Mail ID Last Name First Name M Address City St ZIP Remarks
    189 BROWN STEPHEN 6706 MOESER LN EL CERRITO CA 94530-2909 Sf7#s124,f16#d7996(NML)[Cl#117][Ml#1649][NMf1#d288][NCf9#d319][SNl#e62]RLl#a12047[IDl#i398]SNl#e1163
    I have the process of deleting duplicates working but have yet to determine the best way to move
    the Remarks data from the deleted records to the preserved record.
    I know there are probably various ways to approach this.
    Any suggestions will be greatly appreciated!
    Here is the sql for deleting duplicates.
    DELETE FROM table1
    WHERE mail_id in (SELECT mail_id FROM table1
              where not first_name = 'Null' and
    not last_name = 'Null' and
              not city = 'Null' and
              not state = 'Null'and
    not last_name = 'Anon'
              minus
              select min(mail_id) from table1
              group by first_name, last_name, city, state, address_1, organization, title);
    THANKS in advance!!!!

    Here's a quick and dirty example; there's probably a better way to do it, but this is what I came up with quickly.
    My table looks like this:
    MAIL_ID LAST FIRST PHONE REMARKS
    123 Ruff Shawn 555-555-5555 Called 10-10-04
    135 Ruff Shawn 555-555-5555 Called 10-12-04
    201 Ruff Shawn 555-555-5555 Called 10-19-04
    The code below will concatenate the remarks column from the rows, and delete the 135 and 201 rows, then update the 123 row with the concatenated remarks.
    declare
      l_remarks varchar2(500);
      l_min_mail_id number;
    begin
      select min(mail_id) into l_min_mail_id
        from test
       group by last, first, phone;
      select remarks into l_remarks from test where mail_id = l_min_mail_id;
      for i in (select mail_id, remarks from test
                 where last = 'Ruff'
                   and first = 'Shawn'
                   and phone = '555-555-5555'
                   and mail_id <> l_min_mail_id)
      loop
        l_remarks := l_remarks||','||i.remarks;
        delete from test where mail_id = i.mail_id;
      end loop;
      update test set remarks = l_remarks where mail_id = l_min_mail_id;
      commit;
    end;
    Hope this helps.

  • What is the proper and best way to destroy a java.util.List or Array for GC

    Hi,
    If I have a java.util.List of objects, let's say:
    List<File> files = new ArrayList();
    The List contains 1000 File objects. When my class finishes, is it enough to do
    files = null;
    to make GC able to release it from memory?
    Because it seems like I can't do the following:
    for(int i = 0; i < extracted_files.size(); i++){
                extracted_files.get(i) = null;
    }
    Or should I use this one:
    files.clear()
    What is the proper and best way to do this in order to avoid memory leaks? How about normal arrays like File[]?
    Edited by: TolvanTolvanTolvan on 2009-sep-10 16:58

    TolvanTolvanTolvan wrote:
    Thanks for the info!
    But what if the List is a class variable running in a Web Service for like months without terminating? You mean if the List variable is a member variable of a long-lived object? Then presumably the list needs to live as long as the object does. In the unlikely scenario that the object needs to live on but its list member variable does not, then yes, setting the member to null will be needed to make the list eligible for GC. But I highly doubt this is your situation, and if it is, you probably have a design flaw.
    And if the list is referenced only by a local variable, then, as I already said, when the method ends, the variable goes out of scope, and the List is eligible for GC.
    Now, a slightly different situation is when the List needs to live a long time, and at some point during its life, you're done using some object that it refers to. In that situation, simply remove the element from the list (or set the element to null if it's an array rather than a list), and then if it's not referenced anywhere else, it can be GCed.
    How about Arrays like File[]? Do I need to iterate through the array and null all the objects or is it enough to just null the Array? YOU. DON'T. NEED. TO. HELP. GC. You don't need to set the elements to null, and you most likely don't even need to set the array variable to null.
    Also note that only references can be null, not objects.
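
    A short Java sketch of the two situations described above (class and method names are my own, for illustration only): a list held only by a local variable becomes eligible for GC when the method returns, while a long-lived list just needs individual elements removed once you are done with them.

    import java.io.File;
    import java.util.ArrayList;
    import java.util.List;

    class GcExample {
        // long-lived member: this list lives as long as the owning object does
        private final List<File> cache = new ArrayList<File>();

        void processOnce() {
            List<File> files = new ArrayList<File>(); // local reference only
            // ... use files ...
        } // "files" goes out of scope here; the list becomes eligible for GC

        void doneWith(File f) {
            cache.remove(f); // removing the element is enough; no nulling required
        }
    }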

  • What is the best way of dealing with an "implicit coercion" of an array to a sprite?

    Hello everyone!
         With continued help from this forum I am getting closer to having a working program. I look forward to being able to help others like myself once I finish learning the AS3 ropes.
         I will briefly explain what I am trying to achieve and then follow it up with my question.
    Background
     I have created a 12 x 9 random number grid that populates each cell with a corresponding image based on each cell's numeric value. I have also created a shuffle button that randomizes the numbers in the grid. The problem I am running into is getting my button-click event to clear the current images off the grid in order to assign new ones (i.e. deleting the display stack objects in order to place new ones in the same locations).
    Question
         My question is this: what is the best way to handle an implicit coercion from an array to a sprite? I have pasted my entire code below so that you can see how the functions are supposed to work together. My trouble apparently lies with not being able to use an array value with a sprite (the sprite represents the actual arrangement of the grid on the display stack while the array starts out as a number than gets assigned an image which should be passed to the sprite).
    ============================================================================
    package 
    import flash.display.MovieClip;
    import flash.display.DisplayObject;
    import flash.events.MouseEvent;
    import flash.display.Sprite;
    import flash.text.TextField;
    import flash.text.TextFormat;
    import flash.utils.getDefinitionByName;
    public class Blanko extends MovieClip
          // Holds 12*9 grid of cells.
          var grid:Sprite;
          // Holds the shuffle button.
          var shuffleButton:Sprite;
          // Equals 12 columns, 9 rows.
          var cols:int = 12;
          var rows:int = 9;
          // Equals number of cells in grid (108).
          var cells:int = cols * rows;
          // Sets cell width and height to 40 pixels.
          var cellW:int = 40;
          var cellH:int = 40;
          // Holds 108 cell images.
          var imageArray:Array = [];
          // Holds 108 numerical values for the cells in the grid.
          var cellNumbers:Array = [];
          // Constructor calls "generateGrid" and "makeShuffleButton" functions.
          public function Blanko()
               generateGrid();
               makeShuffleButton();
      // Creates and displays the 12*9 grid.
      private function generateGrid():void
           grid = new Sprite;
           var i:int = 0;
           for (i = 0; i < cells; i++)
                cellNumbers.push(i % 9 + 1);
           trace("Before shuffle: ", cellNumbers);
           shuffleCells(cellNumbers);
           trace("After shuffle: ", cellNumbers);
           var _cell:Sprite;
           for (i = 0; i < cells; i++)
                 // This next line is where the implicit coercion occurs. "_cell" is a sprite that
                 // tries to temporarily equal an array value.
                _cell = drawCells(cellNumbers[i]);
                _cell.x = (i % cols) * cellW;
                _cell.y = (i / cols) * cellH;
                grid.addChild(_cell);
      // Creates a "shuffle" button and adds an on-click mouse event.
      private function makeShuffleButton():void
           var _label:TextField = new TextField();
           _label.autoSize = "center";
           TextField(_label).multiline = TextField(_label).wordWrap = false;
           TextField(_label).defaultTextFormat = new TextFormat("Arial", 11, 0xFFFFFF, "bold");
           _label.text = "SHUFFLE";
           _label.x = 4;
           _label.y = 2;
           shuffleButton = new Sprite();
           shuffleButton.graphics.beginFill(0x484848);
           shuffleButton.graphics.drawRoundRect(0, 0, _label.width + _label.x * 2, _label.height +
                                                _label.y * 2, 10);
           shuffleButton.addChild(_label);
           shuffleButton.buttonMode = shuffleButton.useHandCursor = true;
           shuffleButton.mouseChildren = false;
           shuffleButton.x = grid.x + 30 + grid.width - shuffleButton.width;
           shuffleButton.y = grid.y + grid.height + 10;
           this.addChild(shuffleButton);
           shuffleButton.addEventListener(MouseEvent.CLICK, onShuffleButtonClick);
      // Clears cell images, shuffles their numbers and then assigns them new images.
      private function onShuffleButtonClick():void
       eraseCells();
       shuffleCells(cellNumbers);
       trace("After shuffle: ", cellNumbers);
       for (var i:int = 0; i < cells; i++)
        drawCells(cellNumbers[i]);
      // Removes any existing cell images from the display stack.
      private function eraseCells(): void
       while (imageArray.numChildren > 0)
        imageArray.removeChildAt(0);
      // Shuffles cell numbers (randomizes array).
      private function shuffleCells(_array:Array):void
       var _number:int = 0;
       var _a:int = 0;
       var _b:int = 0;
       var _rand:int = 0;
       for (var i:int = _array.length - 1; i > 0; i--)
        _rand = Math.random() * (i - 1);
        _a = _array[i];
        _b = _array[_rand];
        _array[i] = _b;
        _array[_rand] = _a;
      // Retrieves and assigns a custom image to a cell based on its numerical value.
      private function drawCells(_numeral:int):Array
       var _classRef:Class = Class(getDefinitionByName("skin" + _numeral));
       _classRef.x = 30;
       imageArray.push(_classRef);
       imageArray.addChild(_classRef);
       return imageArray;
    ===========================================================================
         Any help with this is greatly appreciated. Thanks!

    Rothrock,
         Thank you for the reply. Let me address a few things here in the hopes of allowing you (and others) to better understand my reasoning for doing things in this manner (admittedly, there is probably a much better/easier approach to what I am trying to accomplish which is one of the things I hope to learn/discover from these posts).
         The elements inside my "imageArray" are all individual graphics that I had imported, changed their type to movie clips using .Sprite as their base class (instead of .MovieClip) and then saved as classes. The reason I did this was because the classes could then be referenced via "getDefinitionByName" by each cell value that was being passed to it. In this grid every number from 1 to 9 appears randomly 12 times each (making the 108 cells which populate the grid). I did not, at the time (nor do I now), know of a better method to implement for making sure that each image appears in the cell that has the corresponding value (i.e. every time a cell has the value of 8 then the custom graphic/class "skin8" will be assigned to it so that the viewer will be able to see a more aesthetically pleasing numerical representation, that is to say a slightly more fancy looking number with a picture behind it). I was advised to store these images in an array so that I could destroy them when I reshuffle the grid in order to make room for the new images (but I probably messed up the instructions).
         If the "drawCell" function only returns a sprite rather than the image array itself, doesn't that mean that my "eraseCells" function won't be able to delete the array's children as their values weren't first returned to the global variable which my erasing function is accessing?
         As for the function name "drawCells," you have to keep in mind that a) my program has been redesigned in stages as I add new functionality/remove old functionality (such as removing text labels and formatting which were originally in this function) and b) that my program is called "Blanko."
         I will try and attach an Illustrator exported JPG file that contains the image I am using as the class "skin7" just to give you an example of what I'm trying to use as labels (although it won't let me insert it here in this post, so I will try it in the next post).
    Thank you for your help!

  • Best way to make a 2 dim byte array

    I have a byte array of 16384 bytes but i want to make
    a 2 dimensional array with a size of 256*64 out of it,
    what is the best way to do this?

    A simple way is to use the "Reshape Array" function (Functions>Array>Reshape Array).
    =====================================================
    Fading out. " ... J. Arthur Rank on gong."
    Attachments:
    Reshape_array.gif ‏3 KB
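
    In plain Java (rather than LabVIEW), the same reshape is a per-row System.arraycopy; a minimal sketch, assuming row-major order and a 256 x 64 target (256 * 64 = 16384):

    // Reshape a 16384-byte flat array into 256 rows of 64 bytes each (row-major).
    byte[][] reshape(byte[] flat) {
        byte[][] grid = new byte[256][64];
        for (int row = 0; row < 256; row++) {
            System.arraycopy(flat, row * 64, grid[row], 0, 64);
        }
        return grid;
    }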

  • Is this the best way to measure the speed of an input stream?

    Hi guys,
    I have written the following method to read the source of a web page. I have added the functionality to calculate the speed.
    public StringBuffer read(String url)
            int lc = 0;
            long lastSpeed = System.currentTimeMillis();
            //string buffer for reading in the characters
            StringBuffer buffer = new StringBuffer();
            try
                //try to set URL
                URL URL = new URL(url);
                //create input streams
                InputStream content = (InputStream) URL.getContent();
                BufferedReader in = new BufferedReader(new InputStreamReader(content));
                //in line
                String line;
                //while still reading in
                while ((line = in.readLine()) != null)
                    lc++;
                    if ((lc % _Sample_Rate) == 0)
                        this.setSpeed(System.currentTimeMillis() - lastSpeed);
                        lastSpeed = System.currentTimeMillis();
                    //add character to string buffer
                    buffer.append(line);
            //catch errors
            catch (MalformedURLException e)
                System.out.println("Invalid URL - " + e);
            catch (IOException e)
                System.out.println("Invalid URL - " + e);
            //return source
            return buffer;
        }
    Is it faster to read bytes rather than characters?
    This method is a very important part of my project and must be as quick as possible.
    Any ideas on how I can make it quicker? Is my approach to calculating the speed the best way to it?
    Any help/suggestions would be great.
    thanks
    alex

    sigh
    reading bytes might be slightly faster than reading chars, since you don't have to do the conversion and you don't have to make String objects. Certainly, you don't want to use readLine. If you're using a reader, use read(buf, offset, length).
    My suggestion:
    Get your InputStream, put a BufferedInputStream over it, and use the loadAll method from my IOUtils class.
    IOUtils is given freely, but please do not change its package or submit this as your own work.
    ====
    package tjacobs;
    import java.awt.Component;
    import java.io.*;
    import java.util.ArrayList;
    import java.util.Iterator;
    import java.util.List;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;
    import javax.swing.JOptionPane;
    public class IOUtils {
         public static final int DEFAULT_BUFFER_SIZE = (int) Math.pow(2, 20); //1 MByte
         public static final int DEFAULT_WAIT_TIME = 30 * 1000; // 30 Seconds
         public static final int NO_TIMEOUT = -1;
         public static final boolean ALWAYS_BACKUP = false;
         public static String loadTextFile(File f) throws IOException {
              BufferedReader br = new BufferedReader(new FileReader(f));
              char data[] = new char[(int)f.length()];
              int got = 0;
              do {
                   got += br.read(data, got, data.length - got);
               } while (got < data.length);
              return new String(data);
         public static class TIMEOUT implements Runnable {
              private long mWaitTime;
              private boolean mRunning = true;
              private Thread mMyThread;
              public TIMEOUT() {
                   this(DEFAULT_WAIT_TIME);
              public TIMEOUT(int timeToWait) {
                   mWaitTime = timeToWait;
              public void stop() {
                   mRunning = false;
                   mMyThread.interrupt();
              public void run () {
                   mMyThread = Thread.currentThread();
                   while (true) {
                        try {
                             Thread.sleep(mWaitTime);
                        catch (InterruptedException ex) {
                             if (!mRunning) {
                                  return;
         public static InfoFetcher loadData(InputStream in) {
              byte buf[] = new byte[DEFAULT_BUFFER_SIZE]; // 1 MByte
              return loadData(in, buf);
         public static InfoFetcher loadData(InputStream in, byte buf[]) {
              return loadData(in, buf, DEFAULT_WAIT_TIME);
         public static InfoFetcher loadData(InputStream in, byte buf[], int waitTime) {
              return new InfoFetcher(in, buf, waitTime);
         public static String loadAllString(InputStream in) {
              InfoFetcher fetcher = loadData(in);
              fetcher.run();
              return new String(fetcher.buf, 0, fetcher.got);
         public static byte[] loadAll(InputStream in) {
              InfoFetcher fetcher = loadData(in);
              fetcher.run();
              byte bytes[] = new byte[fetcher.got];
              for (int i = 0; i < fetcher.got; i++) {
                    bytes[i] = fetcher.buf[i];
              return bytes;
         public static class PartialReadException extends RuntimeException {
              public PartialReadException(int got, int total) {
                   super("Got " + got + " of " + total + " bytes");
         public static class InfoFetcher implements Runnable {
              public byte[] buf;
              public InputStream in;
              public int waitTime;
              private ArrayList mListeners;
              public int got = 0;
              protected boolean mClearBufferFlag = false;
              public InfoFetcher(InputStream in, byte[] buf, int waitTime) {
                   this.buf = buf;
                   this.in = in;
                   this.waitTime = waitTime;
              public void addInputStreamListener(InputStreamListener fll) {
                   if (mListeners == null) {
                        mListeners = new ArrayList(2);
                   if (!mListeners.contains(fll)) {
                        mListeners.add(fll);
              public void removeInputStreamListener(InputStreamListener fll) {
                   if (mListeners == null) {
                        return;
                   mListeners.remove(fll);
              public byte[] readCompletely() {
                   run();
                   return buf;
              public int got() {
                   return got;
              public void run() {
                   if (waitTime > 0) {
                        TIMEOUT to = new TIMEOUT(waitTime);
                        Thread t = new Thread(to);
                        t.start();
                   int b;
                   try {
                        while ((b = in.read()) != -1) {
                             if (got + 1 > buf.length) {
                                  buf = expandBuf(buf);
                             buf[got++] = (byte) b;
                             int available = in.available();
                             if (got + available > buf.length) {
                                  buf = expandBuf(buf);
                             got += in.read(buf, got, available);
                             signalListeners(false);
                             if (mClearBufferFlag) {
                                  mClearBufferFlag = false;
                                  got = 0;
                   } catch (IOException iox) {
                        throw new PartialReadException(got, buf.length);
                   } finally {
                        buf = trimBuf(buf, got);
                        signalListeners(true);
              private void setClearBufferFlag(boolean status) {
                   mClearBufferFlag = status;
              public void clearBuffer() {
                   setClearBufferFlag(true);
              private void signalListeners(boolean over) {
                   if (mListeners != null) {
                        Iterator i = mListeners.iterator();
                        InputStreamEvent ev = new InputStreamEvent(got, buf);
                        //System.out.println("got: " + got + " buf = " + new String(buf, 0, 20));
                        while (i.hasNext()) {
                             InputStreamListener fll = (InputStreamListener) i.next();
                             if (over) {
                                  fll.gotAll(ev);
                             } else {
                                  fll.gotMore(ev);
         public static interface InputStreamListener {
              public void gotMore(InputStreamEvent ev);
              public void gotAll(InputStreamEvent ev);
         public static class InputStreamEvent {
              public int totalBytesRetrieved;
              public byte buffer[];
              public InputStreamEvent (int bytes, byte buf[]) {
                   totalBytesRetrieved = bytes;
                   buffer = buf;
              public int getBytesRetrieved() {
                   return totalBytesRetrieved;
              public byte[] getBytes() {
                   return buffer;
         public static void copyBufs(byte src[], byte target[]) {
              int length = Math.min(src.length, target.length);
              for (int i = 0; i < length; i++) {
                   target[i] = src[i];
         public static byte[] expandBuf(byte array[]) {
              int length = array.length;
              byte newbuf[] = new byte[length *2];
              copyBufs(array, newbuf);
              return newbuf;
         public static byte[] trimBuf(byte[] array, int size) {
              byte[] newbuf = new byte[size];
              for (int i = 0; i < size; i++) {
                   newbuf[i] = array[i];
              return newbuf;

  • What is the best way to resize a JPEG and store it in the Filesystem

    Hi All,
    I have developed a CMS system that renders JPEGs if it does not already have the images available at the desired width. Within my development setup (Dell Latitude D800 with Ubuntu Dapper Drake) everything works fine and fast, as expected. Then I uploaded the application to my V20z server with 4 GB RAM and the system's performance goes to its knees. I hooked in a Java profiler to see where the problem is, and it showed me that it is hanging within
    sun.java2d.SunGraphics2D.drawImage(Image, int, int, ImageObserver), which I use to draw my image to a BufferedImage. Below is the complete source code that I am using, plus the profiling results.
    Do not be confused that I am using the Turbine framework, which gives me a RawScreen, which in turn gives me access to the HttpServletResponse...
    package de.ellumination.carmen.modules.screens;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.util.Iterator;
    import java.util.Locale;
    import javax.imageio.IIOImage;
    import javax.imageio.ImageIO;
    import javax.imageio.ImageWriteParam;
    import javax.imageio.ImageWriter;
    import javax.imageio.plugins.jpeg.JPEGImageWriteParam;
    import javax.imageio.stream.ImageOutputStream;
    import javax.servlet.http.HttpServletResponse;
    import org.apache.log4j.Logger;
    import org.apache.turbine.modules.screens.RawScreen;
    import org.apache.turbine.util.RunData;
    import de.ellumination.carmen.om.ImagePeer;
    public class Image extends RawScreen
    public static final float DEFAULT_COMPRESSION_QUALITY = 1.0F;
    * Logger for this class
    private static final Logger log = Logger.getLogger(Image.class);
    @Override
    protected String getContentType(RunData data)
    return "image/jpeg";
    @Override
    protected void doOutput(RunData data) throws Exception
    int imageId = data.getParameters().getInt("id");
    int width = data.getParameters().getInt("width", -1);
    int height = data.getParameters().getInt("height", -1);
    HttpServletResponse response = data.getResponse();
    de.ellumination.carmen.om.Image image = ImagePeer.retrieveByPK(imageId);
    File imgFile = new File(image.getLocation());
    if(width > 0 || height > 0)
    outputScaledImage(imgFile, response, width, height);
    else
    outputImage(imgFile, response);
    private void outputScaledImage(File imageFile, HttpServletResponse response, int width, int height) throws Exception
    File scaledFile = new File(imageFile.getParent() + System.getProperty("file.separator") + width + "_" + imageFile.getName());
    if(scaledFile.exists())
    outputImage(scaledFile, response);
    else
    scaleImage(imageFile, scaledFile, width);
    outputImage(scaledFile, response);
    private void outputImage(File imageFile, HttpServletResponse response) throws Exception
    FileInputStream in = new FileInputStream(imageFile);
    response.setContentLength((int) imageFile.length());
    OutputStream out = response.getOutputStream();
    int bSize = 10240;
    byte[] buffer = new byte[bSize];
    int inBuffer = 0;
    while (inBuffer >= 0)
    inBuffer = in.read(buffer);
    if (inBuffer > 0)
    out.write(buffer, 0, inBuffer);
    * scales the image to its new size. while scaling the Image, the code first resizes the image using the new Width Parameter.
    * If the Image is to high after scaling, it then uses the Images height to determin the scaling Factor.
    * @param inputFile the original Image
    * @param outputFile the File to store the scaled image to
    * @param compressionQuality the compression Quality to use
    * @param newWidth the desired images width
    * @param newHeight the desired images height
    public static void scaleImage(File inputFile, File outputFile, float compressionQuality, int newWidth, int newHeight)
    try
    if (inputFile.exists())
    BufferedImage hiRes = ImageIO.read(inputFile);
    double scaleFactor = (double) newWidth / (double) hiRes.getWidth(null);
    int tempHeight = (int) (hiRes.getHeight(null) * scaleFactor);
    if (tempHeight > newHeight)
    scaleFactor = (double) newHeight / (double) hiRes.getHeight(null);
    int width = (int) (hiRes.getWidth(null) * scaleFactor);
    int height = (int) (hiRes.getHeight(null) * scaleFactor);
    scaleImage(outputFile, compressionQuality, hiRes, width, height);
    catch (IOException e)
    log.error("Unable to create the thumbnail " + outputFile.getAbsolutePath() + " from " + inputFile.getAbsolutePath() + " because of the following Reason.", e);
    * scales the image to its new size. while scaling the Image, the code first resizes the image using the new Width Parameter.
    * If the Image is to high after scaling, it then uses the Images height to determine the scaling Factor. This method uses the
    * default compression quality to store image data.
    * @param inputFile the original Image
    * @param outputFile the File to store the scaled image to
    * @param newWidth the desired images width
    * @param newHeight the desired images height
    public static void scaleImage(File inputFile, File outputFile, int newWidth, int newHeight)
    scaleImage(inputFile, outputFile, DEFAULT_COMPRESSION_QUALITY, newWidth, newHeight);
    * scales the image to its new size. while scaling the Image, the code first resizes the image using the new Width Parameter.
    * uses the highest image compression quality by default.
    * @param inputFile the original Image
    * @param outputFile the File to store the scaled image to
    * @param compressionQuality the compression Quality of the new Image
    * @param newWidth the desired images width
    public static void scaleImage(File inputFile, File outputFile, float compressionQuality, int newWidth)
    try
    if (inputFile.exists())
    BufferedImage hiRes = ImageIO.read(inputFile);
    double scaleFactor = (double) newWidth / (double) hiRes.getWidth(null);
    int width = (int) (hiRes.getWidth(null) * scaleFactor);
    int height = (int) (hiRes.getHeight(null) * scaleFactor);
    // draw original image to thumbnail image object and
    // scale it to the new size on-the-fly
    scaleImage(outputFile, compressionQuality, hiRes, width, height);
    else
    log.error("Unable to create the thumbnail " + outputFile.getAbsolutePath() + " from " + inputFile.getAbsolutePath() + " because inputFile not exists: " + inputFile.getName());
    catch (IOException e)
    log.error("Unable to create the thumbnail " + outputFile.getAbsolutePath() + " from " + inputFile.getAbsolutePath() + " because of the following Reason.", e);
    * scales the image to its new size. while scaling the Image, the code first resizes the image using the new Width Parameter.
    * uses the highest image compression quality by default.
    * @param inputFile the original Image
    * @param outputFile the File to store the scaled image to
    * @param newWidth the desired images width
    public static void scaleImage(File inputFile, File outputFile, int newWidth)
    scaleImage(inputFile, outputFile, DEFAULT_COMPRESSION_QUALITY, newWidth);
    * This private method actually scales the inputImage to the desired height, width and compression Quality
    * @param outputFile The File in which the Image should be stored.
    * @param compressionQuality The Compression Quality to be applied to the image
    * @param inputImage the original input Image
    * @param width the height of the new Image
    * @param height the width of the new Image
    * @throws IOException
    private static void scaleImage(File outputFile, float compressionQuality, BufferedImage inputImage, int width, int height) throws IOException
    BufferedImage lowRes = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
    java.awt.Image image = inputImage.getScaledInstance(width, height, java.awt.Image.SCALE_SMOOTH);
    ImageWriter writer = null;
    Iterator iter = ImageIO.getImageWritersByFormatName("jpg");
    if (iter.hasNext()) writer = (ImageWriter) iter.next();
    File outputPath = outputFile.getParentFile();
    if (outputPath != null)
    if (!outputPath.exists()) outputPath.mkdirs();
    lowRes.getGraphics().drawImage(image, 0, 0, null);
    ImageOutputStream ios = ImageIO.createImageOutputStream(outputFile);
    writer.setOutput(ios);
    ImageWriteParam iwparam = new JPEGImageWriteParam(Locale.getDefault());
    iwparam.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
    iwparam.setCompressionQuality(compressionQuality);
    // save thumbnail image to OUTFILE
    writer.write(null, new IIOImage(lowRes, null, null), iwparam);
    writer.dispose();
    ios.close();
    image.flush();
    inputImage.flush();
    lowRes.flush();
    * scales the image to its new size. while scaling the Image, the code first resizes the image using the new Width Parameter.
    * If the Image is to high after scaling, it then uses the Images height to determin the scaling Factor.
    * @param inputImage the original Image
    * @param outputFile the File to store the scaled image to
    * @param compressionQuality the compression Quality to use
    * @param newWidth the desired images width
    * @param newHeight the desired images height
    public static void scaleImage(BufferedImage inputImage, File outputFile, float compressionQuality, int newWidth, int newHeight)
    try
    double scaleFactor = (double) newWidth / (double) inputImage.getWidth(null);
    int tempHeight = (int) (inputImage.getHeight(null) * scaleFactor);
    if (tempHeight > newHeight)
    scaleFactor = (double) newHeight / (double) inputImage.getHeight(null);
    int width = (int) (inputImage.getWidth(null) * scaleFactor);
    int height = (int) (inputImage.getHeight(null) * scaleFactor);
    scaleImage(outputFile, compressionQuality, inputImage, width, height);
    catch (IOException e)
    log.error("Unable to create the thumbnail " + outputFile.getAbsolutePath() + " because of the following Reason.", e);
    All Threads     702.570     100 %
    java.lang.Thread.run()     551.322     78 %
    de.ellumination.carmen.modules.screens.Image.doOutput(RunData)     170.666     24 %
    de.ellumination.carmen.modules.screens.Image.outputScaledImage(File, HttpServletResponse, int, int)     170.108     24 %
                             de.ellumination.carmen.modules.screens.Image.scaleImage(File, File, int)     170.108     24 %
                                  de.ellumination.carmen.modules.screens.Image.scaleImage(File, File, float, int)     170.108     24 %
                                       de.ellumination.carmen.modules.screens.Image.scaleImage(File, float, BufferedImage, int, int)     165.787     24 %
                                            sun.java2d.SunGraphics2D.drawImage(Image, int, int, ImageObserver)     165.189     24 %
                                            com.sun.imageio.plugins.jpeg.JPEGImageWriter.write(IIOMetadata, IIOImage, ImageWriteParam)     397     0 %
                                            javax.imageio.ImageIO$ImageWriterIterator.next()     69     0 %
                                            javax.imageio.ImageIO.createImageOutputStream(Object)     47     0 %
                                            java.awt.image.BufferedImage.<init>(int, int, int)     36     0 %
                                            java.awt.Image.getScaledInstance(int, int, int)     23     0 %
                                            java.awt.image.BufferedImage.getGraphics()     21     0 %
                                       javax.imageio.ImageIO.read(File)     4.320     1 %
                        de.ellumination.carmen.om.BaseImagePeer.retrieveByPK(int)     557     0 %
                   de.ellumination.carmen.modules.screens.Index.doBuildTemplate(RunData, Context)     1.673     0 %
              org.apache.catalina.startup.Bootstrap.main(String[])     151.225     22 %
              org.quartz.core.QuartzSchedulerThread.run()     22     0 %
    Now I am looking for the best way to solve my problem. Maybe I am wrong from the get-go.
    Runtime setup: Java 1.5.0_04, Tomcat 5.5.12, V20z (AMD64 Opteron, 4 GB RAM)
    Any help is highly appreciated.
    Kind regards

    This is a bad thing to do with JPEGs. You're better off just reducing the 'q' if you want a smaller/faster/lower resolution image. That way you're throwing away resolution intelligently. Using scaling you're throwing resolution away unintelligently. I was on a project where 40,000 images were scaled when they should have been low-q'd. Don't do it.
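
    If the goal is a smaller file rather than smaller pixel dimensions, re-encoding at a lower JPEG quality avoids the expensive drawImage scaling path entirely. A minimal sketch using the standard ImageIO / ImageWriteParam API (file names and the quality value are illustrative, and error handling is omitted):

    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.util.Iterator;
    import javax.imageio.IIOImage;
    import javax.imageio.ImageIO;
    import javax.imageio.ImageWriteParam;
    import javax.imageio.ImageWriter;
    import javax.imageio.stream.ImageOutputStream;

    public class JpegRecompress {
        // Re-encode src as a JPEG at the given quality (0.0 - 1.0) without rescaling it.
        static void recompress(File src, File dest, float quality) throws Exception {
            BufferedImage image = ImageIO.read(src);
            Iterator<ImageWriter> writers = ImageIO.getImageWritersByFormatName("jpg");
            ImageWriter writer = writers.next();
            ImageWriteParam param = writer.getDefaultWriteParam();
            param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
            param.setCompressionQuality(quality);
            ImageOutputStream ios = ImageIO.createImageOutputStream(dest);
            writer.setOutput(ios);
            writer.write(null, new IIOImage(image, null, null), param);
            writer.dispose();
            ios.close();
        }
    }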

  • Best way to log data

    I have an application that receives data from a source, and I would
    like to create and maintain a log file with this data. Small bits of
    data will be received at least every minute, and often once every
    second. The user can also view the logged data at any time.
    Lastly, each entry is timestamped, and I would like to be able to
    remove log entries that are x hour/days old.
    What I would like advice on is: what is the best way to accomplish this?
    Clearly a bad way to do this is every time I get data, read the log in,
    remove old entries, rewrite all the remaining entries, and finally
    append the new entry. This uses way too many file operations.
    I have briefly looked at the java.util.logging stuff, but it appears that
    the purpose of that is more for code debugging/etc. It also doesn't
    appear that there are methods to read the logs. So, this does not
    look like an "easy way out".
    I know how to append to a file, but I am not sure if I should leave
    the file open during execution of the program and simply flush
    the stream/buffer, or open and close the file everytime I write.
    The latter is clearly more expensive, but safer.
    I don't know how to delete data from the beginning of a file.
    To summarize, what is the best way to read and write data to
    a file such that:
    1) the data is appended to the end of the file (easy)
    2) the data is removed from the beginning of the file
    3) the data that is written is not lost if the program crashes/etc.
    4) the whole logging business is not too expensive
    Any suggestions?
    Thanks,
    Chuck.

    Here's what we did at my last job:
    Incoming messages are logged to disk via FileOutputStream (maybe with BufferedOutputStream encasing that, I don't remember) this outputstream is not closed until the applet is stopped. System.out is also reassigned to this fileoutputstream to capture all messages.
    At the same time, there is an InputStream reading from that file. As data is read, we stick the text into a TextArea.
    When the TextArea size hits some certain amount (maybe 20,000 characters) we'd remove ... maybe 5000 from the top. Log stays the same tho
    When the day rolls over, we switch log files
    Your questions:
    1) the data is appended to the end of the file (easy)
    Yes. But if you're not using streams, you should check that out.
    2) the data is removed from the beginning of the file
    That's pretty easy. Open the file for reading, open a new file for writing. Read the first n bytes but don't write them. Then write the rest
    3) the data that is written is not lost of the program crashes/etc.
    Data should be fine if it's logged to disk
    4) the whole logging business is not too expensive
    Saving to disk isn't very expensive at all. In fact, normally writing to disk is slow enough that stuff just gets put into a buffer somewhere to get sent to the disk and the processor goes on with the program.
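
    A minimal Java sketch of the append-and-trim scheme described above (the file name and the timestamped line format are assumptions made for illustration): append and flush each entry as it arrives, and periodically rewrite the file keeping only the entries newer than the cutoff.

    import java.io.*;
    import java.util.ArrayList;
    import java.util.List;

    class SimpleLog {
        private final File file = new File("data.log"); // illustrative name

        // Append one timestamped entry; open/close per write keeps it crash-safe.
        void append(String entry) throws IOException {
            PrintWriter out = new PrintWriter(new FileWriter(file, true));
            out.println(System.currentTimeMillis() + "\t" + entry);
            out.close();
        }

        // Drop entries older than maxAgeMillis by rewriting the file.
        void trim(long maxAgeMillis) throws IOException {
            long cutoff = System.currentTimeMillis() - maxAgeMillis;
            List<String> keep = new ArrayList<String>();
            BufferedReader in = new BufferedReader(new FileReader(file));
            for (String line = in.readLine(); line != null; line = in.readLine()) {
                long stamp = Long.parseLong(line.substring(0, line.indexOf('\t')));
                if (stamp >= cutoff) keep.add(line);
            }
            in.close();
            PrintWriter out = new PrintWriter(new FileWriter(file, false)); // truncate and rewrite
            for (String kept : keep) out.println(kept);
            out.close();
        }
    }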

  • Best way to insert in a table througth a database link

    Hi all,
    i have two databases (oracle 10g, windows 2003 server)
    Database A and Database B
    i need to insert data in a table (Table_A) that lives in database A
    The data i have to insert is in a table that lives a database B
    I have a database link from database B to database A
    so, connected to database B, i'm trying the following :
    insert into table_a@database_link_to_A
    select col1
    from table_b
    where col1 is not null
    This query is taking forever, and i have to cancel it.
    I'd like to hear from your experience. Which is the best way to accomplish this task?
    Best Regards
    Rui Madaleno
    NOTE: I forgot to mention that Table_A does not have any triggers or indexes.
    Edited by: ruival on Jan 20, 2010 3:51 PM

    Try this:
    insert /*+ append */ into table_a@database_link_to_A select col1 from table_b where col1 is not null
    Or SQL*Plus COPY is another option.
    [http://www.praetoriate.com/oracle_tips_dm_sqlplus_copy.htm]
    HTH
    -Anantha

  • What is the best way to send a file?

    i am writing a program and i want to transfer a file from a client class to a server class... what is the best way to do that?
    convert the file to bytes using the following
    File file=new File("jobs.xml");
               byte buffer[]=new byte[(int)file.length()];
               try {
                    BufferedInputStream input=new BufferedInputStream(new FileInputStream("jobs.xml"));
                    input.read(buffer,0,buffer.length);
                    input.close();
               } catch(Exception e) { // TODO: fix the messages
                  System.out.println("reading jobs.xml->buffer: "+e.getMessage());
                  e.printStackTrace();
               }
               firstServerRef.translationService(theCallbackObjectRef, buffer);
    For some reason I don't like that I am reading the file again to put it in the buffer and send the buffer... Are my worries reasonable or not? Is there any other better way to do that?

    Use a smaller buffer, repeatedly read and
    write, and print the exception's stack trace.
    What do you mean by reading again, by the way? I only
    see you reading the file once. Hmm, you mean use a smaller buffer and call the function with the smaller buffer many times in a while loop?
    The client and the server are not on a single machine and I want to call the function only once... could you clarify what you said?
    Yes, you are correct that you see only one read because I haven't pasted the rest of the code, which is something like...
    FileWriter fw = new FileWriter("jobs.xml");
               ObjectOutputStream out = xstream.createObjectOutputStream(fw);
         //      out.writeObject(new Jobb("ougk2", "Walnes",null));
              for(int i=0;i<nameOfServices.length;i++){
                   Jobb translationJob=new Jobb();
                   //find the service !
                 NameComponent nc = new NameComponent(nameOfServices, " ");
    // Resolve the object reference in naming
    NameComponent path[] = {nc};
    //create a ref for the servant of the service
    ServiceOperations theRemoteObjRef = ServiceHelper.narrow(ncRef.resolve(path));
    // JobOperations theRemoteObjRef = JobHelper.narrow(ncRef.resolve(path));
    translationJob.setObjServerRef(theRemoteObjRef.toString());
    if(i==0){ //this is the first job
         translationJob.setForTranslation(wordForTranslation);
         firstServerRef=theRemoteObjRef;
         out.writeObject(translationJob);
         out.close();
         File file=new File("jobs.xml");
              byte buffer[]=new byte[(int)file.length()];
              try {
                   BufferedInputStream input=new BufferedInputStream(new FileInputStream("jobs.xml"));
                   input.read(buffer,0,buffer.length);
                   input.close();
          } catch(Exception e) { // TODO: fix the messages
         System.out.println("reading jobs.xml->buffer: "+e.getMessage());
         e.printStackTrace();
          }
          firstServerRef.translationService(theCallbackObjectRef, buffer);
    which i believe is bad....
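
    For what it's worth, if the transport lets you write to a stream, sending the file in fixed-size chunks avoids building the whole file in memory and avoids relying on a single read() call returning every byte. A minimal sketch over a plain OutputStream (the stream itself is an assumption here, since the original code passes the buffer to a CORBA callback):

    import java.io.*;

    class FileSender {
        // Copy a file to an output stream in 8 KB chunks.
        static void send(File file, OutputStream out) throws IOException {
            InputStream in = new BufferedInputStream(new FileInputStream(file));
            byte[] chunk = new byte[8192];
            int read;
            while ((read = in.read(chunk)) != -1) {
                out.write(chunk, 0, read); // write only the bytes actually read
            }
            out.flush();
            in.close();
        }
    }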

  • What's the best way to delete 2.4 million of records from table?

    We have two tables: one is the production table and the other is a temp table whose data we want to insert into the production table. The temp table has 2.5 million records and the production table has billions of records. What we want to do is simply delete the already-existing records and then insert the remaining records from the temp table into the production table.
    Can anyone guide what's the best way to do this?
    Thanks,
    Waheed.

    Waheed Azhar wrote:
    the production table is live and data is appended to this table on a random basis. If I go to insert data from temp to prod table, a PK violation exception occurs because a record already exists in the prod table that we are going to insert from temp to prod.
    If you really just want to insert the records and don't want to update the matching ones and you're already on 10g you could use the "DML error logging" facility of the INSERT command, which would log all failed records but succeeds for the remaining ones.
    You can create a suitable exception table using the DBMS_ERRLOG.CREATE_ERROR_LOG procedure and then use the "LOG ERRORS INTO" clause of the INSERT command. Note that you can't use the "direct-path" insert mode (APPEND hint) if you expect to encounter UNIQUE CONSTRAINT violations, because this can't be logged and cause the direct-path insert to fail. Since this is a "live" table you probably don't want to use the direct-path insert anyway.
    See the manuals for more information: http://download.oracle.com/docs/cd/B19306_01/server.102/b14200/statements_9014.htm#BGBEIACB
    Sample taken from 10g manuals:
    CREATE TABLE raises (emp_id NUMBER, sal NUMBER
       CONSTRAINT check_sal CHECK(sal > 8000));
    EXECUTE DBMS_ERRLOG.CREATE_ERROR_LOG('raises', 'errlog');
    INSERT INTO raises
       SELECT employee_id, salary*1.1 FROM employees
       WHERE commission_pct > .2
       LOG ERRORS INTO errlog ('my_bad') REJECT LIMIT 10;
    SELECT ORA_ERR_MESG$, ORA_ERR_TAG$, emp_id, sal FROM errlog;
    ORA_ERR_MESG$                                           ORA_ERR_TAG$   EMP_ID   SAL
    ORA-02290: check constraint (HR.SYS_C004266) violated   my_bad         161      7700
    If the number of rows in the temp table is not too large and you have a suitable index on the large table for the lookup, you could also try to use a NOT EXISTS clause in the insert command:
    INSERT INTO <large_table>
    SELECT ...
    FROM TEMP A
    WHERE NOT EXISTS (
    SELECT NULL
    FROM <large_table> B
    WHERE B.<lookup> = A.<key>
    );
    But you need to check the execution plan, because a hash join using a full table scan on the <large_table> is probably something you want to avoid.
    Regards,
    Randolf
    Oracle related stuff blog:
    http://oracle-randolf.blogspot.com/
    SQLTools++ for Oracle (Open source Oracle GUI for Windows):
    http://www.sqltools-plusplus.org:7676/
    http://sourceforge.net/projects/sqlt-pp/

  • Best way to implement a basic text output window

    Hello,
    I want to monitor the activity of hardware devices which send text, in some cases a lot of it and fairly fast. So far I send this to the standard output and this is fine.
    Now I need to support several devices concurrently, so I need to create a very basic window which just displays some read-only text with a scroll bar (I must have only one instance of my app running on a system). As there are hundreds of ways to do that in Java, I would like some advice on making something which minimizes latency and resource consumption (e.g. using a JTextPane with a StyledDocument might not be optimal for this...)
    Sebastien

    As recommended, I used a JTextArea. With that I use the default document and "Piped" streams.
    I tested it with the little loop below:
            String text = "";
            PipedOutputStream pos=new PipedOutputStream();
            final int PIPE_BUFFER_SIZE=0x100000;
            dialog.monitorStream(new PipedInputStream(pos,PIPE_BUFFER_SIZE));
            long start=System.nanoTime();
            for (int i = 0; i < 0x100000; i++) {
                String zeroes="";
                for(int j=0;j<Integer.numberOfLeadingZeros(i);j++)
                    zeroes+='0';
                text = zeroes+Integer.toBinaryString(i) + "\n";
                pos.write(text.getBytes());//use JTextArea/PipedOutputStream
                //System.out.print(text);//use console
            long end=System.nanoTime();
            System.out.println("exectime: "+Long.toString((end-start)/1000000));
    With the (win32) console it took around 195 seconds on my laptop.
    I got a similar timing with NetBean's output window (but I must say that it is quite good result if we take into account that it keeps all the data while win32 console keeps only the 300 last lines!)
    Using the JTextArea/PipedOutputStream, times fall down to 22 seconds !! (I made it to accept up to 8MB of data and to keep at least the last 4MB chunk of data)
    Ok, I will never put JTextArea performance in question...
    Thanks for your answers.
    Note for those who do similar things: the PIPE_BUFFER_SIZE is a key parameter for the speed of operations, the bigger the better. For example, when I double its size, the times fall down to 15 seconds !
    Still for those interested in that kind of stuff, the core of the monitoring thread:
        protected class StreamMonitor implements Runnable {
            //This will display correctly only ANSI characters
            //"안녕" is displayed as "??"
            public StreamMonitor() {
            public void run() {
                int totalCount=0;
                try {
                    byte[] buf=new byte[0x8000];
                    StringBuffer sb=new StringBuffer();
                    sb.ensureCapacity(0x10000);
                    int emptyCnt=0;
                    while (true) {
                        if (toMonitor.available()>0) {
                            emptyCnt=0;
                            int n = toMonitor.read(buf);
                            totalCount+=n;
                            for (int i = 0; i < n; i++) {
                                sb.append((char)buf[i]);
    if(sb.length()>0x8000)
    int MAX_SIZE=0x0800000;
    if(totalCount>MAX_SIZE)
    doc.remove(0,(MAX_SIZE / 2));
    totalCount-=MAX_SIZE / 2;
    doc.insertString(doc.getLength(), sb.toString(), null);
    outputTextArea.setCaretPosition(doc.getLength());
    sb.setLength(0);
    else
    doc.insertString(doc.getLength(), sb.toString(), null);
    outputTextArea.setCaretPosition(doc.getLength());
    sb.setLength(0);
    emptyCnt++;
    if(emptyCnt<1000){
    Thread.yield();
    else{
    //if we execute this thread repeatedly for nothing,
    //make it sleep a while to reduce CPU load
    Thread.sleep(20);
    } catch (Exception ex) {
    Logger.getLogger(ATextScreenOutput.class.getName()).log(Level.SEVERE, null, ex);
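
    For reference, the core pattern this thread converges on can be reduced to a few lines: read on a background thread and hand the text to the JTextArea on the Event Dispatch Thread. This is a minimal sketch of my own rather than the poster's full monitor (no buffering, trimming, or pipe sizing):

    import java.io.BufferedReader;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    import javax.swing.JTextArea;
    import javax.swing.SwingUtilities;

    class TextAreaMonitor implements Runnable {
        private final InputStream in;
        private final JTextArea area;

        TextAreaMonitor(InputStream in, JTextArea area) {
            this.in = in;
            this.area = area;
        }

        public void run() {
            try {
                BufferedReader reader = new BufferedReader(new InputStreamReader(in));
                String line;
                while ((line = reader.readLine()) != null) {
                    final String text = line + "\n";
                    // All Swing updates must happen on the EDT.
                    SwingUtilities.invokeLater(new Runnable() {
                        public void run() {
                            area.append(text);
                            area.setCaretPosition(area.getDocument().getLength());
                        }
                    });
                }
            } catch (Exception ex) {
                ex.printStackTrace();
            }
        }
    }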

  • Best way to parse data

    Hi, I'm fairly new to java programming coming from a midrange, COBOL background.
    I need to take data from a legacy program and use it in an online java program. The data is stored in a table that occurs 1-25 times:
    05  DATA-TABLE OCCURS 25 TIMES.
           10  FIELD1                       PIC X(25).
           10  FIELD2                       PIC X(25).
           10  FIELD3                       PIC X(50).
    So, I could have between 100 and 2500 bytes of data to deal with in the Java program. Can anyone point me in the right direction on the best way to handle this? I thought about creating an initial array that has 25 elements and then substring-ing the returned data into that. Then, what would be the best way to break that data down into the individual components?
    If you have any solutions or pointers, I would definitely appreciate it.
    Thanks!
    bfrmbama

    I would suggest you start by giving meaning to your data and putting it in classes. For example, if your data is about pet animals and you have the animal's name, nick and description:
    package example;
    /**
     * @author leonardo     12/11/2004
     * @version 1.0
     */
    public class Pet {
        private String name;
        private String nick;
        private String description;
        /**
         * Create a pet from the string with the triplet attributes.
         * @param triplet The triplet data.
         */
        public Pet( String triplet ) {
            tokenize( triplet );
        }
        /**
         * Method for tokenizing the pet data.
         * @param triplet the data.
         */
        private void tokenize(String triplet) {
            /*
             * I am assuming the data is space separated,
             * look at the api to see a complete usage of
             * split and maybe the string tokenizer.
             */
            String[] strings = triplet.split(" ");
            name = strings[0];
            nick = strings[1];
            description = strings[2];
        }
        /*
         * Its always a good idea to encapsulate your data
         * and give access to it through accessors and mutators.
         */
        public String getDescription() {
            return description;
        }
        public void setDescription(String description) {
            this.description = description;
        }
        public String getName() {
            return name;
        }
        public void setName(String name) {
            this.name = name;
        }
        public String getNick() {
            return nick;
        }
        public void setNick(String nick) {
            this.nick = nick;
        }
    }
    then when you tokenize your file you can do:
    // ... Incomplete ...
    // I am assuming you will write a CobolFileReader that implements the file reading and line
    // extraction logic. When there is no more data to be read it returns null.
       List list = new ArrayList();
        CobolFileReader cobolFileReader = new CobolFileReader("TheFile.txt");
            for(String triplet = cobolFileReader.getNextTriplet(); triplet != null; triplet = cobolFileReader.getNextTriplet()) {
                list.add( new Pet( triplet ) );
            }
      This way you will have a list as big as it needs to be.
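
    Since the COBOL layout is fixed-width (25 + 25 + 50 bytes per occurrence) rather than delimiter-separated, a substring-based parser may match the actual data more closely; a short sketch under that assumption (class and field names are illustrative):

    // Parse fixed-width records laid out as FIELD1 PIC X(25), FIELD2 PIC X(25), FIELD3 PIC X(50).
    class LegacyRecord {
        final String field1;
        final String field2;
        final String field3;

        LegacyRecord(String raw) {
            field1 = raw.substring(0, 25).trim();
            field2 = raw.substring(25, 50).trim();
            field3 = raw.substring(50, 100).trim();
        }

        // Split the 100-2500 character buffer into 1-25 records of 100 characters each.
        static java.util.List<LegacyRecord> parseAll(String data) {
            java.util.List<LegacyRecord> records = new java.util.ArrayList<LegacyRecord>();
            for (int offset = 0; offset + 100 <= data.length(); offset += 100) {
                records.add(new LegacyRecord(data.substring(offset, offset + 100)));
            }
            return records;
        }
    }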

  • Best way to correct Tablespace fragmentation?

    Dear DBAs,
    The following query returns tablespace fragmentation:
    select
    tablespace_name,
    count(*) free_chunks,
    decode(
    round((max(bytes) / 1024000),2),
    null,0,
    round((max(bytes) / 1024000),2)) largest_chunk,
    nvl(round(sqrt(max(blocks)/sum(blocks))*(100/sqrt(sqrt(count(blocks)) )),2),
    0) fragmentation_index
    from
    sys.dba_free_space
    group by
    tablespace_name
    order by
    2 desc, 1;
    I have 3 tablespaces that are Locally Managed and suffer heavy fragmentation. I just received the task of tuning this database and from what I could tell the fragmentation originated from a number of bad practices by past DBAs and also some bizarre structures in certain tables.
    My question is: what would be the best way to correct tablespace fragmentation LMT? I'm thinking export - drop - import to reorganize all blocks within the tablespace?

    Alvaro wrote:
    Dear DBAs,
    The following query returns tablespace fragmentation:
    select
    tablespace_name,
    count(*) free_chunks,
    decode(
    round((max(bytes) / 1024000),2),
    null,0,
    round((max(bytes) / 1024000),2)) largest_chunk,
    nvl(round(sqrt(max(blocks)/sum(blocks))*(100/sqrt(sqrt(count(blocks)) )),2),
    0) fragmentation_index
    from
    sys.dba_free_space
    group by
    tablespace_name
    order by
    2 desc, 1;
    I have 3 tablespaces that are Locally Managed and suffer heavy fragmentation. I just received the task of tuning this database, and from what I could tell the fragmentation originated from a number of bad practices by past DBAs and also some bizarre structures in certain tables.
    My question is: what would be the best way to correct tablespace fragmentation in an LMT? I'm thinking export - drop - import to reorganize all blocks within the tablespace?
    In addition to what everyone else has noted about the unlikelihood of fragmentation, there may be implicit compaction and skew in your data distribution that could be impacted by export/import, for better or worse.
    What bad practices are you referring to? Sometimes practices are inappropriate, sometimes useless, sometimes perfectly understandable. If you don't know about the fragmentation issue, I wonder how well you judge practices. And yes, there are some really bad practices floating about.
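    If, after investigation, a reorganization really is warranted, a lighter-weight alternative to a full export - drop - import is to move the affected segments and rebuild their indexes. A minimal sketch (the schema, table, index, and tablespace names are placeholders, not from the original post):
    -- Rewriting a table's blocks with MOVE compacts the segment; moving it into a
    -- freshly created locally managed tablespace also consolidates the free space.
    ALTER TABLE my_schema.my_table MOVE TABLESPACE users_reorg;

    -- MOVE leaves the table's indexes UNUSABLE, so rebuild them afterwards.
    ALTER INDEX my_schema.my_table_pk REBUILD TABLESPACE users_reorg;

    -- Check whether any indexes are still unusable after the reorganization.
    SELECT owner, index_name, status
      FROM dba_indexes
     WHERE status = 'UNUSABLE';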

  • Is this REALLY the easiest/best way to do this?

    I'm no SQL guru by any means, but I can get by. Here is some background.
    I am running Perfmon Data Collector Sets to collect performance counters.
    I am using Relog.exe to take the .blg files and pushing them into a database
    I am attempting to write a query that will ultimately get the information out of the database, and see if I can aggregate the data a bit as well. If the last part is better done in Excel (where this data will ultimately end up for charting), then I will do that part there.
    Here is the table layouts when you import the .blg files:
    Table name: CounterData
    Columns:
    GUID
    CounterID
    RecordIndex
    CounterDateTime
    CounterValue
    FirstValueA
    FirstValueB
    SecondValueA
    SecondValueB
    MultiCount
    Table Name: CounterDetails
    CounterID
    MachineName
    ObjectName
    CounterName
    CounterType
    DefaultScale
    InstanceName
    InstanceIndex
    ParentName
    ParentObjectID
    TimeBaseA
    TimeBaseB
    I need to pull multiple sets of data out of these tables, so I built the following query:
    USE PDB
    SELECT
    CAST(LEFT(CounterDateTime, 16) as smalldatetime) AS CounterDateTime,
    REPLACE(CounterDetails.MachineName,'\\','') AS ComputerName,
    CounterDetails.ObjectName + ISNULL('(' + CounterDetails.InstanceName + ')','') + '\' + CounterDetails.CounterName AS [Counter],
    CounterData.CounterValue
    FROM CounterData
    FULL OUTER JOIN CounterDetails ON CounterData.CounterID = CounterDetails.CounterID
    FULL OUTER JOIN DisplayToID ON CounterData.GUID = DisplayToID.GUID
    WHERE CounterDetails.ObjectName = 'Processor'
    AND CounterDetails.CounterName = '% Processor Time'
    AND CounterDetails.InstanceName = '_Total'
    UNION
    SELECT
    CAST(LEFT(CounterDateTime, 16) as smalldatetime) AS CounterDateTime,
    REPLACE(CounterDetails.MachineName,'\\','') AS ComputerName,
    CounterDetails.ObjectName + ISNULL('(' + CounterDetails.InstanceName + ')','') + '\' + CounterDetails.CounterName AS [Counter],
    CounterData.CounterValue
    FROM CounterData
    FULL OUTER JOIN CounterDetails ON CounterData.CounterID = CounterDetails.CounterID
    FULL OUTER JOIN DisplayToID ON CounterData.GUID = DisplayToID.GUID
    WHERE CounterDetails.ObjectName = 'Web Service'
    AND CounterDetails.CounterName = 'Bytes Received/sec'
    AND CounterDetails.InstanceName = 'AppName'
    ORDER BY 1
    Is this REALLY the best way to do this?
    Also, I'm trying to figure out how to build in data aggregation in 5-minute average blocks.
    Any assistance appreciated!
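    One possible simplification, sketched against the same tables: because the WHERE clause already discards rows without a matching CounterDetails entry, the FULL OUTER JOINs behave like inner joins, and the two UNIONed branches can be collapsed into a single SELECT by listing both counter combinations (the DisplayToID join is dropped here because nothing is selected from it). This is only a sketch of that idea, not a tested replacement:
    SELECT
        CAST(LEFT(cd.CounterDateTime, 16) AS smalldatetime) AS CounterDateTime,
        REPLACE(det.MachineName, '\\', '') AS ComputerName,
        det.ObjectName + ISNULL('(' + det.InstanceName + ')', '') + '\' + det.CounterName AS [Counter],
        cd.CounterValue
    FROM CounterData AS cd
    INNER JOIN CounterDetails AS det ON cd.CounterID = det.CounterID
    WHERE (det.ObjectName = 'Processor'   AND det.CounterName = '% Processor Time'   AND det.InstanceName = '_Total')
       OR (det.ObjectName = 'Web Service' AND det.CounterName = 'Bytes Received/sec' AND det.InstanceName = 'AppName')
    ORDER BY 1;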

    Erland,
    Thanks for your response.
    The tables are related by CounterID. In a nutshell, what I want to do is the following:
    Get the data from the CounterDetails table, which holds the MachineName, CounterName, CounterType, and so forth.
    CounterID MachineName ObjectName CounterName CounterType DefaultScale InstanceName InstanceIndex ParentName ParentObjectID TimeBaseA TimeBaseB
    1 \\ServerName Web Service Bytes Received/sec 272696576 -4 Carnival NULL NULL NULL 14318180 0
    2 \\ServerName Web Service Bytes Sent/sec 272696576 -4 AppName NULL NULL NULL 14318180 0
    3 \\ServerName Web Service Current Connections 65536 0 AppName NULL NULL NULL 14318180 0
    4 \\ServerName Web Service Bytes Total/sec 272696576 -4 AppName NULL NULL NULL 14318180 0
    5 \\ServerName Web Service Get Requests/sec 272696320 0 AppName NULL NULL NULL 14318180 0
    6 \\ServerName Web Service Head Requests/sec 272696320 0 AppName NULL NULL NULL 14318180 0
    7 \\ServerName Processor % Processor Time 558957824 0 _Total NULL NULL NULL 10000000 0
    8 \\ServerName PhysicalDisk Disk Reads/sec 272696320 0 _Total NULL NULL -1 14318180 0
    9 \\ServerName PhysicalDisk Disk Reads/sec 272696320 0 0 C: D: E: NULL NULL -1 14318180 0
    10 \\ServerName PhysicalDisk Disk Read Bytes/sec 272696576 -4 _Total NULL NULL -1 14318180 0
    11 \\ServerName PhysicalDisk Disk Read Bytes/sec 272696576 -4 0 C: D: E: NULL NULL -1 14318180 0
    12 \\ServerName PhysicalDisk Disk Writes/sec 272696320 0 _Total NULL NULL -1 14318180 0
    13 \\ServerName PhysicalDisk Disk Writes/sec 272696320 0 0 C: D: E: NULL NULL -1 14318180 0
    14 \\ServerName PhysicalDisk Current Disk Queue Length 65536 1 _Total NULL NULL -1 14318180 0
    15 \\ServerName PhysicalDisk Current Disk Queue Length 65536 1 0 C: D: E: NULL NULL -1 14318180 0
    16 \\ServerName PhysicalDisk % Disk Read Time 542573824 0 _Total NULL NULL -1 10000000 0
    17 \\ServerName PhysicalDisk % Disk Read Time 542573824 0 0 C: D: E: NULL NULL -1 10000000 0
    18 \\ServerName PhysicalDisk Disk Write Bytes/sec 272696576 -4 _Total NULL NULL -1 14318180 0
    19 \\ServerName PhysicalDisk Disk Write Bytes/sec 272696576 -4 0 C: D: E: NULL NULL -1 14318180 0
    20 \\ServerName PhysicalDisk Disk Transfers/sec 272696320 0 _Total NULL NULL -1 14318180 0
    21 \\ServerName PhysicalDisk Disk Transfers/sec 272696320 0 0 C: D: E: NULL NULL -1 14318180 0
    22 \\ServerName PhysicalDisk % Disk Write Time 542573824 0 _Total NULL NULL -1 10000000 0
    23 \\ServerName PhysicalDisk % Disk Write Time 542573824 0 0 C: D: E: NULL NULL -1 10000000 0
    24 \\ServerName Network Interface Bytes Received/sec 272696576 -4 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter NULL NULL -1 14318180 0
    25 \\ServerName Network Interface Bytes Received/sec 272696576 -4 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] NULL NULL -1 14318180 0
    26 \\ServerName Network Interface Bytes Received/sec 272696576 -4 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _3 NULL NULL -1 14318180 0
    27 \\ServerName Network Interface Bytes Received/sec 272696576 -4 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _2 NULL NULL -1 14318180 0
    28 \\ServerName Network Interface Bytes Received/sec 272696576 -4 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _4 NULL NULL -1 14318180 0
    29 \\ServerName Network Interface Bytes Received/sec 272696576 -4 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _4 NULL NULL -1 14318180 0
    30 \\ServerName Network Interface Bytes Received/sec 272696576 -4 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _3 NULL NULL -1 14318180 0
    31 \\ServerName Network Interface Bytes Received/sec 272696576 -4 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _2 NULL NULL -1 14318180 0
    32 \\ServerName Network Interface Bytes Sent/sec 272696576 -4 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter NULL NULL -1 14318180 0
    33 \\ServerName Network Interface Bytes Sent/sec 272696576 -4 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] NULL NULL -1 14318180 0
    34 \\ServerName Network Interface Bytes Sent/sec 272696576 -4 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _3 NULL NULL -1 14318180 0
    35 \\ServerName Network Interface Bytes Sent/sec 272696576 -4 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _2 NULL NULL -1 14318180 0
    36 \\ServerName Network Interface Bytes Sent/sec 272696576 -4 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _4 NULL NULL -1 14318180 0
    37 \\ServerName Network Interface Bytes Sent/sec 272696576 -4 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _4 NULL NULL -1 14318180 0
    38 \\ServerName Network Interface Bytes Sent/sec 272696576 -4 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _3 NULL NULL -1 14318180 0
    39 \\ServerName Network Interface Bytes Sent/sec 272696576 -4 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _2 NULL NULL -1 14318180 0
    40 \\ServerName Network Interface Bytes Total/sec 272696576 -4 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter NULL NULL -1 14318180 0
    41 \\ServerName Network Interface Bytes Total/sec 272696576 -4 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] NULL NULL -1 14318180 0
    42 \\ServerName Network Interface Bytes Total/sec 272696576 -4 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _3 NULL NULL -1 14318180 0
    43 \\ServerName Network Interface Bytes Total/sec 272696576 -4 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _2 NULL NULL -1 14318180 0
    44 \\ServerName Network Interface Bytes Total/sec 272696576 -4 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _4 NULL NULL -1 14318180 0
    45 \\ServerName Network Interface Bytes Total/sec 272696576 -4 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _4 NULL NULL -1 14318180 0
    46 \\ServerName Network Interface Bytes Total/sec 272696576 -4 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _3 NULL NULL -1 14318180 0
    47 \\ServerName Network Interface Bytes Total/sec 272696576 -4 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _2 NULL NULL -1 14318180 0
    48 \\ServerName Network Interface Output Queue Length 65792 0 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter NULL NULL -1 14318180 0
    49 \\ServerName Network Interface Output Queue Length 65792 0 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] NULL NULL -1 14318180 0
    50 \\ServerName Network Interface Output Queue Length 65792 0 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _3 NULL NULL -1 14318180 0
    51 \\ServerName Network Interface Output Queue Length 65792 0 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _2 NULL NULL -1 14318180 0
    52 \\ServerName Network Interface Output Queue Length 65792 0 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _4 NULL NULL -1 14318180 0
    53 \\ServerName Network Interface Output Queue Length 65792 0 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _4 NULL NULL -1 14318180 0
    54 \\ServerName Network Interface Output Queue Length 65792 0 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _3 NULL NULL -1 14318180 0
    55 \\ServerName Network Interface Output Queue Length 65792 0 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _2 NULL NULL -1 14318180 0
    56 \\ServerName Network Interface Current Bandwidth 65792 -6 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter NULL NULL -1 14318180 0
    57 \\ServerName Network Interface Current Bandwidth 65792 -6 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] NULL NULL -1 14318180 0
    58 \\ServerName Network Interface Current Bandwidth 65792 -6 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _3 NULL NULL -1 14318180 0
    59 \\ServerName Network Interface Current Bandwidth 65792 -6 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _2 NULL NULL -1 14318180 0
    60 \\ServerName Network Interface Current Bandwidth 65792 -6 TEAM : Team _0 - Intel[R] PRO_1000 PT Dual Port Server Adapter _4 NULL NULL -1 14318180 0
    61 \\ServerName Network Interface Current Bandwidth 65792 -6 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _4 NULL NULL -1 14318180 0
    62 \\ServerName Network Interface Current Bandwidth 65792 -6 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _3 NULL NULL -1 14318180 0
    63 \\ServerName Network Interface Current Bandwidth 65792 -6 Broadcom BCM5708C NetXtreme II GigE [NDIS VBD Client] _2 NULL NULL -1 14318180 0
    64 \\ServerName Memory % Committed Bytes In Use 537003008 0 NULL NULL NULL NULL 14318180 0
    65 \\ServerName Memory Available MBytes 65792 0 NULL NULL NULL NULL 14318180 0
    66 \\ServerName Memory Committed Bytes 65792 -6 NULL NULL NULL NULL 14318180 0
    67 \\ServerName ASP.NET Requests Current 65536 -1 NULL NULL NULL NULL 14318180 0
    68 \\ServerName ASP.NET Worker Process Restarts 65536 -1 NULL NULL NULL NULL 14318180 0
    69 \\ServerName ASP.NET Applications Running 65536 -1 NULL NULL NULL NULL 14318180 0
    70 \\ServerName ASP.NET Requests Queued 65536 -1 NULL NULL NULL NULL 14318180 0
    71 \\ServerName ASP.NET Application Restarts 65536 -1 NULL NULL NULL NULL 14318180 0
    72 \\ServerName ASP.NET Worker Processes Running 65536 -1 NULL NULL NULL NULL 14318180 0
    73 \\ServerName ASP.NET Request Execution Time 65536 -1 NULL NULL NULL NULL 14318180 0
    Put that together with the actual counter data held in the appropriately named CounterData table.
    GUID CounterID RecordIndex CounterDateTime CounterValue FirstValueA FirstValueB SecondValueA SecondValueB MultiCount
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 1 2015-04-30 13:01:17.165 0 -1927979745 2 -598728243 706 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 2 2015-04-30 13:01:22.173 67581.3633745449 -1927642227 2 -527219720 706 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 3 2015-04-30 13:01:27.165 94686.4063445727 -1927169543 2 -455741935 706 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 4 2015-04-30 13:01:32.172 152203.041104636 -1926407371 2 -384042212 706 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 5 2015-04-30 13:01:37.180 165447.09804292 -1925578898 2 -312344215 706 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 6 2015-04-30 13:01:42.172 171837.776053684 -1924721043 2 -240864459 706 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 7 2015-04-30 13:01:47.180 173383.630948422 -1923852824 2 -169166134 706 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 8 2015-04-30 13:01:52.172 144598.914838348 -1923130989 2 -97690055 706 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 9 2015-04-30 13:01:57.179 174737.727857169 -1922255986 2 -25991455 706 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 10 2015-04-30 13:02:02.171 169321.861725293 -1921410708 2 45486867 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 11 2015-04-30 13:02:07.179 208117.127073016 -1920368562 2 117185118 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 12 2015-04-30 13:02:12.171 141008.757157554 -1919664632 2 188662923 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 13 2015-04-30 13:02:17.178 149495.458222544 -1918916026 2 260361927 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 14 2015-04-30 13:02:22.170 174341.539879002 -1918045605 2 331847154 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 15 2015-04-30 13:02:27.178 143957.916530014 -1917324818 2 403537258 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 16 2015-04-30 13:02:32.170 130619.882518362 -1916672720 2 475018386 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 17 2015-04-30 13:02:37.178 142332.32395318 -1915959971 2 546718673 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 18 2015-04-30 13:02:42.170 184550.944997403 -1915038722 2 618192753 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 19 2015-04-30 13:02:47.177 154267.317838657 -1914266244 2 689889592 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 20 2015-04-30 13:02:52.169 149238.629713526 -1913521218 2 761368514 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 21 2015-04-30 13:02:57.177 188584.542348845 -1912576880 2 833066869 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 22 2015-04-30 13:03:02.169 176918.705027469 -1911693639 2 904548308 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 23 2015-04-30 13:03:07.176 188369.179859497 -1910750431 2 976242743 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 24 2015-04-30 13:03:12.168 148606.905306921 -1910008537 2 1047723754 707 1
    8ADCC3A7-4D90-45A3-B912-FB18C9CB3646 1 25 2015-04-30 13:03:17.176 200078.077196397 -1909006668 2 1119420468 707 1
    The query above is working, but I feel there is a better way to get this done.
    Sample Output:
    CounterDateTime ComputerName Counter CounterValue
    2015-04-30 13:01:00 ServerName Memory\Committed Bytes 23836753920
    2015-04-30 13:01:00 ServerName Memory\Committed Bytes 23837396992
    2015-04-30 13:01:00 ServerName Memory\Committed Bytes 23842693120
    2015-04-30 13:01:00 ServerName Memory\Committed Bytes 23843172352
    2015-04-30 13:01:00 ServerName Memory\Committed Bytes 23861657600
    2015-04-30 13:01:00 ServerName Memory\Committed Bytes 23872827392
    2015-04-30 13:01:00 ServerName Memory\Committed Bytes 23909138432
    2015-04-30 13:01:00 ServerName Memory\Committed Bytes 23960690688
    2015-04-30 13:01:00 ServerName Memory\Committed Bytes 23972872192
    2015-04-30 13:01:00 ServerName PhysicalDisk(_Total)\Current Disk Queue Length 0
    2015-04-30 13:01:00 ServerName Processor(_Total)\% Processor Time 0
    2015-04-30 13:01:00 ServerName Processor(_Total)\% Processor Time 8.65725297547727
    2015-04-30 13:01:00 ServerName Processor(_Total)\% Processor Time 9.34837740384615
    2015-04-30 13:01:00 ServerName Processor(_Total)\% Processor Time 10.45515625
    2015-04-30 13:01:00 ServerName Processor(_Total)\% Processor Time 11.3926622596154
    2015-04-30 13:01:00 ServerName Processor(_Total)\% Processor Time 11.4480309928908
    2015-04-30 13:01:00 ServerName Processor(_Total)\% Processor Time 11.8893621695024
    2015-04-30 13:01:00 ServerName Processor(_Total)\% Processor Time 12.3306933461139
    2015-04-30 13:01:00 ServerName Processor(_Total)\% Processor Time 13.3301821231728
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Received/sec 0
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Received/sec 67581.3633745449
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Received/sec 94686.4063445727
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Received/sec 144598.914838348
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Received/sec 152203.041104636
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Received/sec 165447.09804292
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Received/sec 171837.776053684
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Received/sec 173383.630948422
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Received/sec 174737.727857169
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Total/sec 0
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Total/sec 354821.47994974
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Total/sec 533111.927106303
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Total/sec 849787.130317823
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Total/sec 1015485.82303199
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Total/sec 1286054.48388504
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Total/sec 1528398.33137765
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Total/sec 1600789.68540725
    2015-04-30 13:01:00 ServerName Web Service(AppName)\Bytes Total/sec 1690894.89372096
    2015-04-30 13:02:00 ServerName Memory\Committed Bytes 23781527552
    2015-04-30 13:02:00 ServerName Memory\Committed Bytes 23802056704
    2015-04-30 13:02:00 ServerName Memory\Committed Bytes 23803797504
    2015-04-30 13:02:00 ServerName Memory\Committed Bytes 23821389824
    2015-04-30 13:02:00 ServerName Memory\Committed Bytes 23831420928
    2015-04-30 13:02:00 ServerName Memory\Committed Bytes 23835803648
    2015-04-30 13:02:00 ServerName Memory\Committed Bytes 23850049536
    2015-04-30 13:02:00 ServerName Memory\Committed Bytes 23863857152
    2015-04-30 13:02:00 ServerName Memory\Committed Bytes 23875534848
    2015-04-30 13:02:00 ServerName Memory\Committed Bytes 23917281280
    2015-04-30 13:02:00 ServerName Memory\Committed Bytes 23933739008
    2015-04-30 13:02:00 ServerName Memory\Committed Bytes 23978917888
    I hope this additional information I have provided will help. :)
    Thanks for your time!
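    As for the 5-minute averaging, one common T-SQL pattern is to bucket the timestamp with DATEDIFF/DATEADD and group on the bucket. A sketch for a single counter, reusing the relog column names above (everything else is illustrative and has not been run against this database):
    SELECT
        -- Round each sample down to the start of its 5-minute interval.
        DATEADD(minute,
                (DATEDIFF(minute, 0, CAST(LEFT(cd.CounterDateTime, 16) AS smalldatetime)) / 5) * 5, 0) AS IntervalStart,
        REPLACE(det.MachineName, '\\', '') AS ComputerName,
        det.ObjectName + ISNULL('(' + det.InstanceName + ')', '') + '\' + det.CounterName AS [Counter],
        AVG(cd.CounterValue) AS AvgCounterValue
    FROM CounterData AS cd
    INNER JOIN CounterDetails AS det ON cd.CounterID = det.CounterID
    WHERE det.ObjectName = 'Processor'
      AND det.CounterName = '% Processor Time'
      AND det.InstanceName = '_Total'
    GROUP BY
        DATEADD(minute,
                (DATEDIFF(minute, 0, CAST(LEFT(cd.CounterDateTime, 16) AS smalldatetime)) / 5) * 5, 0),
        REPLACE(det.MachineName, '\\', ''),
        det.ObjectName + ISNULL('(' + det.InstanceName + ')', '') + '\' + det.CounterName
    ORDER BY IntervalStart;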
