Fast acquiring of images using JMF

Hello!
I have been trying to write a class that reads frames from a video camera using JMF, but BufferToImage.createImage turned out to be very slow -- the code runs fast if that call is commented out. How can I get the images faster? Could I perhaps read the pixels directly from the video stream buffer, or is there another problem with the code? Constructing the BufferToImage from Buffer.getFormat was just as slow.
I attach the code that I have used for reading the frames.
Sincerely,
Artur Rataj
/**
 * A video input driver that uses Java Media Framework.
 * This file is provided under the terms of the GNU General Public License.
 * version 0.1, date 2004-07-21, author Artur Rataj
 */
package video.drivers;

import java.io.*;
import java.awt.*;
import java.awt.image.*;
import javax.media.*;
import javax.media.protocol.*;
import javax.media.control.*;
import javax.media.format.*;
import javax.media.util.BufferToImage;

/**
 * This is a video driver that uses Java Media Framework to access video input devices.
 */
public class JMFDriver extends VideoDriver implements ControllerListener {
     /** The input stream processor. */
     Processor processor;
     /** The stream from the input video device. */
     PushBufferStream frameStream;
     /** A converter from stream data to an image. */
     BufferToImage frameStreamConverter;
     /** A video stream buffer. */
     Buffer videoBuffer;

     /**
      * Constructs a new driver.
      */
     public JMFDriver(String deviceAddress, int width, int height, int bitsPerPixel)
          throws VideoDriverException {
          super(deviceAddress, width, height, bitsPerPixel);
          MediaLocator locator = new MediaLocator(deviceAddress);
          if(locator == null)
               throw new VideoDriverException("Device not found: " + deviceAddress);
          javax.media.protocol.DataSource source;
          try {
               source = javax.media.Manager.createDataSource(locator);
          } catch(IOException e) {
               throw new VideoDriverException("Could not read device " + deviceAddress +
                    ": " + e.toString());
          } catch(NoDataSourceException e) {
               throw new VideoDriverException("Could not read device " + deviceAddress +
                    ": " + e.toString());
          }
          if(!( source instanceof CaptureDevice ))
               throw new VideoDriverException("The device " + deviceAddress +
                    " not recognized as a video input one.");
          FormatControl[] formatControls =
               ((CaptureDevice)source).getFormatControls();
          if(formatControls == null || formatControls.length == 0)
               throw new VideoDriverException("Could not set the format of images from " +
                    deviceAddress + ".");
          VideoFormat videoFormat = null;
searchFormat:
          for(int i = 0; i < formatControls.length; ++i) {
               FormatControl c = formatControls[i];
               Format[] formats = c.getSupportedFormats();
               for(int j = 0; j < formats.length; ++j)
                    if(formats[j] instanceof VideoFormat) {
                         VideoFormat f = (VideoFormat)formats[j];
                         if(f.getSize().getWidth() == this.width &&
                              f.getSize().getHeight() == this.height) {
                              int bpp = -1;
                              if(f instanceof RGBFormat)
                                   bpp = ((RGBFormat)f).getBitsPerPixel();
                              else if(f instanceof YUVFormat) {
                                   YUVFormat yuv = (YUVFormat)f;
                                   bpp = (yuv.getStrideY() + yuv.getStrideUV())*8/
                                             this.width;
                              }
                              if(bpp == bitsPerPixel) {
                                   videoFormat = (VideoFormat)c.setFormat(f);
                                   break searchFormat;
                              }
                         }
                    }
          }
          if(videoFormat == null)
               throw new VideoDriverException("Could not find the format of images from " +
                    deviceAddress + " at " + bitsPerPixel + " bits per pixel.");
          try {
               source.connect();
          } catch(IOException e) {
               throw new VideoDriverException("Could not connect to the device " +
                    deviceAddress + ": " + e.toString());
          }
          try {
               processor = Manager.createProcessor(source);
          } catch(IOException e) {
               throw new VideoDriverException("Could not initialize the processing " +
                    "of images from " + deviceAddress + ": " + e.toString());
          } catch(NoProcessorException e) {
               throw new VideoDriverException("Could not initialize the processing " +
                    "of images from " + deviceAddress + ": " + e.toString());
          }
          processor.addControllerListener(this);
          synchronized(this) {
               processor.realize();
               try {
                    // wait until controllerUpdate() signals the RealizeCompleteEvent
                    wait();
               } catch(InterruptedException e) {
                    throw new VideoDriverException("Could not initialize the processing " +
                         "of images from " + deviceAddress + ": " + e.toString());
               }
          }
          processor.start();
          PushBufferDataSource frameSource = null;
          try {
               frameSource = (PushBufferDataSource)processor.getDataOutput();
          } catch(NotRealizedError e) {
               /* empty */
          }
          PushBufferStream[] frameStreams = frameSource.getStreams();
          for(int i = 0; i < frameStreams.length; ++i)
               if(frameStreams[i].getFormat() instanceof VideoFormat) {
                    frameStream = frameStreams[i];
                    break;
               }
          videoBuffer = new Buffer();
          videoBuffer.setTimeStamp(0);
          videoBuffer.setFlags(videoBuffer.getFlags() |
               Buffer.FLAG_NO_WAIT |
               Buffer.FLAG_NO_DROP |
               Buffer.FLAG_NO_SYNC);
          processor.prefetch();
     }

     public void controllerUpdate(ControllerEvent event) {
          if(event instanceof RealizeCompleteEvent)
               synchronized(this) {
                    notifyAll();
               }
     }

     /**
      * Acquires an image from the input video device.
      */
     public Image acquireImage() throws VideoDriverException {
          try {
               frameStream.read(videoBuffer);
          } catch(IOException e) {
               throw new VideoDriverException("Could not acquire a video frame: " +
                    e.toString());
          }
          frameStreamConverter = new BufferToImage(
               (VideoFormat)frameStream.getFormat());
          Image out = frameStreamConverter.createImage(videoBuffer);
          return out;
     }

     /**
      * Closes the input video device.
      */
     public void close() {
          processor.close();
     }

     public static void main(String[] args) throws Exception {
          JMFDriver driver = new JMFDriver("v4l://0", 768, 576, 16);
          while(true) {
               Image image = driver.acquireImage();
               System.out.print(".");
               System.out.flush();
          }
     }
}

This is how you do it:
First, you need to know up front what format your camera is using; the valid image types are static constants on BufferedImage.
Next, create a BufferedImage of the matching type.
Here is an example:
My Creative webcam is set to 640x480, RGB, 24-bit color, so I create a BufferedImage into which I will copy the buffer:
BufferedImage buff = new BufferedImage(640, 480, BufferedImage.TYPE_3BYTE_BGR);
Then comes the copy operation, where cbuffer is my javax.media.Buffer from which I copy the image data:
System.arraycopy((byte[])cbuffer.getData(), 0, ((DataBufferByte)buff.getRaster().getDataBuffer()).getData(), 0, img_size_bytes);
Things to note: I cast cbuffer.getData() to byte[] because that is the type of the data, and I cast buff.getRaster().getDataBuffer() to DataBufferByte because the BufferedImage is of type TYPE_3BYTE_BGR.
Also, img_size_bytes equals 640*480*3.
This copies the raw buffer into your BufferedImage and avoids BufferToImage entirely.
The only problem is that the image will be vertically flipped, but just flip the webcam to correct this =)
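
Building on that, acquireImage() in the driver above could skip BufferToImage and copy each frame straight into a reusable BufferedImage. A minimal sketch, assuming the driver is configured for a 24-bit RGB format whose byte layout matches TYPE_3BYTE_BGR (the frameImage field and the length guard are my additions, not part of the original code):

     /** Reused for every frame, so there is no per-frame image allocation or conversion. */
     private BufferedImage frameImage;

     public Image acquireImage() throws VideoDriverException {
          try {
               frameStream.read(videoBuffer);
          } catch(IOException e) {
               throw new VideoDriverException("Could not acquire a video frame: " +
                    e.toString());
          }
          if(frameImage == null)
               frameImage = new BufferedImage(width, height,
                    BufferedImage.TYPE_3BYTE_BGR);
          // copy the raw frame bytes directly into the image's backing array
          byte[] source = (byte[])videoBuffer.getData();
          byte[] target = ((DataBufferByte)frameImage.getRaster().getDataBuffer()).getData();
          int length = Math.min(videoBuffer.getLength(), target.length);
          System.arraycopy(source, videoBuffer.getOffset(), target, 0, length);
          return frameImage;
     }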

Similar Messages

  • How can you stretch an image using JMF API

    How is it possible to stretch or zoom the video using the JMF API? Stretching may not distort the resolution, but zooming might, I imagine.

    You can zoom with something like this:
    public void zoomTo(float z) {
         if (visualComp != null) {
              insets = getInsets();
              Dimension d = visualComp.getPreferredSize();
              d.width = (int) (d.width * z);
              d.height = (int) (d.height * z);
              if (controlComp != null)
                   d.height += controlComp.getPreferredSize().height;
              setSize(d.width + insets.left + insets.right,
                   d.height + insets.top + insets.bottom);
         }
    }
    I suppose visualComp is the visual component of a Player, and this zoomTo(..) method is in a Component, so you can call its setSize(..) method.
    Of course, the float z is the zoom scale.
    ... i hope!

  • Capture Image from Video Using JMF

    Dears
    I want to extract an image from a video file on my file system; how can I do that using JMF or any other API?
    My main goal is to create thumbnails from video files uploaded by the customer.
    thanks

    abo_habibah wrote:
    so if i will have java project run in background under aix machine, this code will work without any problem?
    If the AIX machine uses an X server, then you'll probably get a "headless exception" thrown...but if it's not X-server based, you should be fine.
    also what is the package for the Time object?
    I believe you mean "Timer", which you can certainly look up yourself in the API:
    [http://java.sun.com/javase/6/docs/api/]
    (it's java.util.Timer, but you should still learn to look it up yourself...)
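
    As for the original question of grabbing a thumbnail frame, the usual JMF route is a FrameGrabbingControl on a realized Player plus BufferToImage. A minimal sketch, where the file name, the two-second warm-up and the 160x120 thumbnail size are placeholders (and the renderer is assumed to actually expose the control):

    import java.awt.Graphics2D;
    import java.awt.Image;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import javax.imageio.ImageIO;
    import javax.media.*;
    import javax.media.control.FrameGrabbingControl;
    import javax.media.format.VideoFormat;
    import javax.media.util.BufferToImage;

    public class ThumbnailGrabber {
        public static void main(String[] args) throws Exception {
            Player player = Manager.createRealizedPlayer(new MediaLocator("file:video.avi"));
            player.start();
            Thread.sleep(2000);    // crude: let a few frames go by first

            FrameGrabbingControl fgc = (FrameGrabbingControl)
                    player.getControl("javax.media.control.FrameGrabbingControl");
            if (fgc == null)
                throw new IllegalStateException("renderer does not support frame grabbing");
            Buffer frame = fgc.grabFrame();
            Image img = new BufferToImage((VideoFormat) frame.getFormat()).createImage(frame);

            // scale down to thumbnail size and save it
            BufferedImage thumb = new BufferedImage(160, 120, BufferedImage.TYPE_INT_RGB);
            Graphics2D g = thumb.createGraphics();
            g.drawImage(img, 0, 0, 160, 120, null);
            g.dispose();
            ImageIO.write(thumb, "png", new File("thumb.png"));

            player.close();
        }
    }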

  • Converting video from images without using JMF

    Dear All,
    I want to create a video from a sequence of images.
    I have used the JMF sample for doing the same. It works pretty well.
    But I want to do the same without using JMF.
    Has anyone here used other libraries, like FFMPEG for Java or Theora, to achieve the same?
    If you have any idea about it, please reply.
    Yours Sincerely,
    Ashish Nijai

    I've done this in the past by calling out to ffmpeg as an external application. There's nothing Java specific in that other than the act of calling an external binary - for which the ProcessBuilder API should be your starting point. If you have questions about ProcessBuilder start a new thread in "Java Programming" or "New To Java" but remember to google first - it's a common topic.
    A bit of Googling suggests that there might be some JNI wrappers for ffmpeg too.
    Note that on the project where I used ffmpeg we eventually went over to running it in batches from a cron script with no Java components at all.
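
    A minimal sketch of that approach, assuming ffmpeg is on the PATH and the frames are numbered frame0001.png, frame0002.png, ... (the file names, frame rate and ffmpeg flags are illustrative, not from the original posts):

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;

    public class FramesToVideo {
        public static void main(String[] args) throws IOException, InterruptedException {
            // Build the external command; ProcessBuilder takes the program and its arguments.
            ProcessBuilder pb = new ProcessBuilder(
                    "ffmpeg", "-y",
                    "-framerate", "25",
                    "-i", "frame%04d.png",
                    "out.avi");
            pb.redirectErrorStream(true);    // merge ffmpeg's stderr into its stdout

            Process process = pb.start();
            // Drain the output so the child process cannot block on a full pipe.
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(process.getInputStream()));
            String line;
            while ((line = reader.readLine()) != null)
                System.out.println(line);
            int exitCode = process.waitFor();
            System.out.println("ffmpeg exited with code " + exitCode);
        }
    }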

  • Faster broadcasting using JMF

    hi anyone ^^
    I have a problem with broadcasting audio. My program has some delay (about 2 or more seconds) when I send/transmit to another computer. How can I reduce the delay?
    I am using JMF.
    Thanks very much for your help :)

    810605 wrote:
    I have a problem with broadcasting audio.
    Ok.
    My program has some delay (about 2 or more seconds) when I send/transmit to another computer.
    How far away is the other computer? In networking terms...
    How can I reduce the delay?
    Depends on why there's a delay.
    There's going to be an inherent delay due simply to processing time, network speed, etc...and there will be additional delay added every time the data is buffered somewhere, i.e. at the sending end and the receiving end...
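
    If the extra delay comes from receive-side buffering, JMF exposes a BufferControl on the RTP session manager that lets you trade smoothness for latency. A minimal sketch, assuming an RTPManager that has already been initialized for the receiving session (the 100 ms and 50 ms values are just examples):

    import javax.media.control.BufferControl;
    import javax.media.rtp.RTPManager;

    public class ReceiveBufferTuning {
        /** Shrinks the receive-side jitter buffer; smaller buffers mean lower latency. */
        public static void shortenReceiveBuffer(RTPManager rtpManager) {
            BufferControl bc = (BufferControl)
                    rtpManager.getControl("javax.media.control.BufferControl");
            if (bc != null) {
                bc.setBufferLength(100);       // buffer length in milliseconds (example)
                bc.setMinimumThreshold(50);    // start playback once 50 ms are buffered
            }
        }
    }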

  • Capture an image using the web camera from a web application

    Hi All,
    Could anyone please share the steps that have to be followed to capture an image using a web camera from a web application?
    I have a similar requirement like this,
    1) Detect the webcam on the user's machine from the web application. (To be more clear: when the user clicks the 'Add Photo' tool in the web application.)
    2) When the user confirms the save, save the image on the user's machine at some temporary location with a unique file name.
    3) Upload the Image to the server from the temporary location.
    Please share the details like, what can be used and how it can be used etc...
    Thanks,
    Suman

    1) Detect the webcam on the user's machine from the web application.
    There's not really any good way to do this with JMF. You'd have to somehow create a JMF Web Start application that installs the native JMF binaries, then kick off the capture device scanning code from the application, and then scan through the list of devices found to get the MediaLocator of the web cam (see the sketch below).
    2) When the user confirms the save, save the image on the user's machine at some temporary location with a unique file name.
    You'd probably be displaying a "preview" window and then you'd just want to capture the image. There are a handful of ways you could capture the image, but it really depends on your situation.
    3) Upload the image to the server from the temporary location.
    You can find out how to do this on Google.
    All things told, this application is probably better suited to being an FMJ (Freedom for Media in Java) application than a JMF application. JMF relies on native code to capture from web cams, whereas FMJ does not.
    Alternately, you might want to look into Adobe Flex for this particular application.
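
    A minimal sketch of that device-scanning step, using CaptureDeviceManager to look up the first registered video capture device (nothing here is specific to the original poster's setup):

    import java.util.Vector;
    import javax.media.CaptureDeviceInfo;
    import javax.media.CaptureDeviceManager;
    import javax.media.MediaLocator;
    import javax.media.format.VideoFormat;

    public class WebcamFinder {
        /** Returns the MediaLocator of the first video capture device JMF knows about, or null. */
        public static MediaLocator findWebcam() {
            // A bare VideoFormat matches any video-capable capture device.
            Vector devices = CaptureDeviceManager.getDeviceList(new VideoFormat(null));
            if (devices.isEmpty())
                return null;
            CaptureDeviceInfo info = (CaptureDeviceInfo) devices.elementAt(0);
            System.out.println("Found capture device: " + info.getName());
            return info.getLocator();
        }

        public static void main(String[] args) {
            MediaLocator locator = findWebcam();
            System.out.println(locator != null ? locator.toString() : "no video capture device found");
        }
    }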

  • Playing a wav file (byte array) using JMF

    Hi,
    I want to play a wav file in the form of a byte array using JMF. I have 2 classes, MyDataSource and MyPullBufferStream. MyDataSource extends PullBufferDataSource, and MyPullBufferStream implements PullBufferStream. When I run the following piece of code, I get an error saying "EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x7c9108b2, pid=3800, tid=1111". Any idea what might be the problem? Thanks.
    File file = new File(filename);
    byte[] data = FileUtils.readFileToByteArray(file);
    MyDataSource ds = new MyDataSource(data);
    ds.connect();
    try {
        player = Manager.createPlayer(ds);
    } catch (NoPlayerException e) {
        e.printStackTrace();
    }
    if (player != null) {
        this.filename = filename;
        JMFrame jmframe = new JMFrame(player, filename);
        desktop.add(jmframe);
    }
    import java.io.IOException;
    import javax.media.Time;
    import javax.media.protocol.PullBufferDataSource;
    import javax.media.protocol.PullBufferStream;

    public class MyDataSource extends PullBufferDataSource {
        protected Object[] controls = new Object[0];
        protected boolean started = false;
        protected String contentType = "raw";
        protected boolean connected = false;
        protected Time duration = DURATION_UNKNOWN;
        protected PullBufferStream[] streams = null;
        protected PullBufferStream stream = null;
        protected final byte[] data;

        public MyDataSource(final byte[] data) {
            this.data = data;
        }

        public String getContentType() {
            if (!connected) {
                System.err.println("Error: DataSource not connected");
                return null;
            }
            return contentType;
        }

        public void connect() throws IOException {
            if (connected)
                return;
            stream = new MyPullBufferStream(data);
            streams = new MyPullBufferStream[1];
            streams[0] = this.stream;
            connected = true;
        }

        public void disconnect() {
            try {
                if (started)
                    stop();
            } catch (IOException e) {
            }
            connected = false;
        }

        public void start() throws IOException {
            // we need to throw an error if connect() has not been called
            if (!connected)
                throw new java.lang.Error(
                        "DataSource must be connected before it can be started");
            if (started)
                return;
            started = true;
        }

        public void stop() throws IOException {
            if (!connected || !started)
                return;
            started = false;
        }

        public Object[] getControls() {
            return controls;
        }

        public Object getControl(String controlType) {
            try {
                Class cls = Class.forName(controlType);
                Object cs[] = getControls();
                for (int i = 0; i < cs.length; i++)
                    if (cls.isInstance(cs[i]))
                        return cs[i];
                return null;
            } catch (Exception e) {
                // no such controlType or no such control
                return null;
            }
        }

        public Time getDuration() {
            return duration;
        }

        public PullBufferStream[] getStreams() {
            if (streams == null) {
                streams = new MyPullBufferStream[1];
                stream = streams[0] = new MyPullBufferStream(data);
            }
            return streams;
        }
    }
    import java.io.ByteArrayInputStream;
    import java.io.IOException;
    import javax.media.Buffer;
    import javax.media.Control;
    import javax.media.Format;
    import javax.media.format.AudioFormat;
    import javax.media.protocol.ContentDescriptor;
    import javax.media.protocol.PullBufferStream;

    public class MyPullBufferStream implements PullBufferStream {
        private static final int BLOCK_SIZE = 500;
        protected final ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW);
        protected AudioFormat audioFormat = new AudioFormat(AudioFormat.GSM_MS, 8000.0, 8, 1,
                Format.NOT_SPECIFIED, AudioFormat.SIGNED, 8, Format.NOT_SPECIFIED,
                Format.byteArray);
        private int seqNo = 0;
        private final byte[] data;
        private final ByteArrayInputStream bais;
        protected Control[] controls = new Control[0];

        public MyPullBufferStream(final byte[] data) {
            this.data = data;
            bais = new ByteArrayInputStream(data);
        }

        public Format getFormat() {
            return audioFormat;
        }

        public void read(Buffer buffer) throws IOException {
            synchronized (this) {
                Object outdata = buffer.getData();
                if (outdata == null || !(outdata.getClass() == Format.byteArray)
                        || ((byte[]) outdata).length < BLOCK_SIZE) {
                    outdata = new byte[BLOCK_SIZE];
                    buffer.setData(outdata);
                }
                byte[] data = (byte[]) buffer.getData();
                int bytes = bais.read(data);
                buffer.setData(data);
                buffer.setFormat(audioFormat);
                buffer.setTimeStamp(System.currentTimeMillis());
                buffer.setSequenceNumber(seqNo);
                buffer.setLength(BLOCK_SIZE);
                buffer.setFlags(0);
                buffer.setHeader(null);
                seqNo++;
            }
        }

        public boolean willReadBlock() {
            return bais.available() > 0;
        }

        public boolean endOfStream() {
            return willReadBlock();
        }

        public ContentDescriptor getContentDescriptor() {
            return cd;
        }

        public long getContentLength() {
            return (long) data.length;
        }

        public Object getControl(String controlType) {
            try {
                Class cls = Class.forName(controlType);
                Object cs[] = getControls();
                for (int i = 0; i < cs.length; i++)
                    if (cls.isInstance(cs[i]))
                        return cs[i];
                return null;
            } catch (Exception e) {
                // no such controlType or no such control
                return null;
            }
        }

        public Object[] getControls() {
            return controls;
        }
    }

    Here's some additional information. After making the following changes to MyPullBufferStream class, I can play a wav file with gsm-ms encoding with one issue: the wav file is played many times faster.
    protected AudioFormat audioFormat = new AudioFormat(AudioFormat.GSM, 8000.0, 8, 1,
                Format.NOT_SPECIFIED, AudioFormat.SIGNED, 8, Format.NOT_SPECIFIED,
                Format.byteArray);
    // put the entire byte array into the buffer in one shot instead of
    // giving a portion of it multiple times
    public void read(Buffer buffer) throws IOException {
        synchronized (this) {
            Object outdata = buffer.getData();
            if (outdata == null || !(outdata.getClass() == Format.byteArray)
                    || ((byte[]) outdata).length < BLOCK_SIZE) {
                outdata = new byte[BLOCK_SIZE];
                buffer.setData(outdata);
            }
            buffer.setLength(this.data.length);
            buffer.setOffset(0);
            buffer.setFormat(audioFormat);
            buffer.setData(this.data);
            seqNo++;
        }
    }

  • How to use JMF to convert RTP packets (captured by jpcap) into a wav file?

    I used jpcap to capture the RTP packets (payload: ITU-T G.711 PCMU, from VoIP),
    and now I want to use JMF to read that data and convert it into a wav file.
    How do I do this? Please help me.

    pedrorp wrote:
    Hi Captfoss!
    I fixed it but now I have another problem. My application send me this message:
    Cannot initialize audio renderer with format: LINEAR, Unknown Sample Rate, 16-bit, Mono, LittleEndian, Signed
    Unable to handle format: ALAW/rtp, Unknown Sample Rate, 8-bit, Mono, FrameSize=8 bits
    Failed to prefetch: com.sun.media.PlaybackEngine@1b45ddc
    Error: Unable to prefetch com.sun.media.PlaybackEngine@1b45ddc
    This time the failure is in prefetching. I have no idea what the problem is. Could you help me?
    The system can't play an audio file/stream if it doesn't know the sample rate...somewhere along the way, in your code, the sample rate got lost. Sample rates are highly important, because they tell the system how fast to play the file.
    You need to go look through your code and find where the sample rate information is getting lost...
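
    As a minimal sketch of what "not losing the sample rate" means in practice: when you build the Format for the incoming stream yourself, spell the rate out instead of leaving it unspecified (G.711 mu-law at 8 kHz is assumed here to match the original question):

    import javax.media.format.AudioFormat;

    public class UlawFormatExample {
        public static void main(String[] args) {
            // Fully specified format for 8 kHz G.711 mu-law audio; an unspecified
            // sample rate is what shows up later as "Unknown Sample Rate".
            AudioFormat ulaw8k = new AudioFormat(
                    AudioFormat.ULAW,   // encoding
                    8000.0,             // sample rate in Hz -- must be explicit
                    8,                  // bits per sample
                    1);                 // mono
            System.out.println(ulaw8k);
        }
    }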

  • Problem with M-JPEG using JMF and JPEGCodec

    Hi, there,
    I want to implement M-JPEG using JMF and JPEGCodec, is that possible? (I am already stuck.)
    My problem is that I have a video clip which is an AVI file; the video format is the following:
    Video format: RGB, 160x120, FrameRate=14.9, Length=57600, 24-bit, Masks=3:2:1, PixelStride=3, LineStride=480, Flipped.
    I have already converted a frame to an Image object (video formats with JPEG and CVID don't work),
    and I can also convert this Image back to a Buffer; that works fine for me. But to use JPEGCodec (provided by com.sun.image.codec.jpeg) I need to convert an Image to a BufferedImage, so I use the following definition:
    BufferedImage bImage = new BufferedImage(frameImage.getWidth(null), frameImage.getHeight(null), BufferedImage.TYPE_INT_RGB);
    It seems to work, but when I use JPEGImageEncoder to encode this bImage and save it as a jpg file, everything is black.
    I also need to cast the BufferedImage to an Image: frameImage = (Image) bImage; then I convert frameImage back to a Buffer. My video clip is still running, but every frame now becomes black.
    Can someone help me? Thanks in advance.

    I solved this problem, but I ran into a new one.
    I converted the above video clip into JPEG and I want to create a DataSink for it. The message is:
    Video format: JPEG, 160x120, FrameRate=12.0, Length=3574
    - set content descriptor to: AVI
    - set track format to: JPEG
    Cannot transcode any track to: JPEG
    Cannot create the DataSink: javax.media.NoDataSinkException: Cannot find a DataS
    ink for: com.sun.media.multiplexer.RawBufferMux$RawBufferDataSource@2b7eea
    Transcoding failed
    Hope some Java experts can help me.
    Regards.

  • Using JMF to create AVIs

    Hello all, I am new to JMF and I was wondering if anyone could point me in the right direction for using JMF to create AVIs on the fly... I have short[] buffers that contain image data, and I need to convert this into an .avi file, but I will not have the buffers all at once; rather, I will be passing them one at a time. So I need the AVI renderer/writer to be able to handle incremental input of buffers, frame by frame.
    If anyone could give me a hand or tell me where to look, I would be much obliged!
    thanks so much,
    ben

    sheik2311 wrote:
    I must do it in an applet since it should work on the web.
    Actually, no...if it "should work on the web", it either needs to be an applet (a pain in the butt for JMF development) or a Web Start application (not a pain in the butt). I'd personally recommend using a Web Start application to avoid all of the hassles of trying to put JMF in an applet.
    Second, I'm not sure where your sample code came from, but you definitely need some more robust sample code to work with...so...in the order you should play with them...
    To show a video in a Swing component
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/SwingJMF.html]
    To transmit audio/video
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/AVTransmit.html]
    To receive audio/video
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/AVReceive.html]
    To save a received video to disk
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/RTPExport.html]
    To show and record video
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/JVidCap.html]

  • Select images using keywords and view slideshow...using what?

    I have a library full of jpeg images that were created either using the export command from LR or by editing in Br/Ps and then saved as jpeg. Now I want to watch slideshows on my monitor (30" Apple HD Cinema Display), and I have the following dilemma:
    - I can start a Search Command in Bridge and narrow it down to exactly the images I want in my slide show. Then from the View menu select Slideshow. The IQ is terrible. True sharpness is only revealed when zooming in 100% but that's not the idea of a slide show.
    - I can also use Apple OS's (Tiger) built-in slide show and start it from the finder. That way I get high quality, fast dissolving slide shows, that are unfortunately limited to the first 100 images in the Finder view. However, in the Finder I cannot select my images using keywords and IPTC info.
    So could there be a simple solution to the problems I'm running into with Bridge?
    Or could there be a nice viewer for MacOS that features the search option I want and renders good images and dissolve option?
    Thanks.

    @Ann Shelbourne: Thanks for your reply. I'm hesitant to upgrade to CS4 at this point. That would mean my only option is to look for a jpeg viewer other than Adobe that can select images based on keywords and iptc data. Would you know one that does?
    @Ramon G Castaneda: I apologise for being an inexperienced forum visitor. I will take note of your advice and follow the posting instructions.
    I checked FAQ's both here and at Adobe, as well as the knowledge base. No results. I also searched the forum.
    My setup: MPB dual core Intel, 2GB RAM, 30GB free disk space. Adobe CS3, just updated to Bridge version 2.1.1.9. No improvement at all so far. According to Ann only CS4 will solve the problem.
    Thank you.

  • How to Access Canon Visualizer Using JMF

    HI all,
    I am using a Canon Visualizer RE-455X; it is also a video camera used for document image capturing. I am trying to access it using JMF, but my capture device is not detected by JSF. Please, can anyone help me to fix this? I have installed JMF but I am not sure the installation went well.

    but my capture device is not detected by JSF.
    Perhaps you meant JMF..... JMF does not support all cameras. [This page|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/formats.html] lists supported capture devices at the bottom. In general, on Windows, all cameras supporting VFW are supported. Check whether your camera supports it.
    please can anyone help me to fix this.
    If JMF can't detect it, then no one can possibly help..... Have you installed the driver for your camera? Have you tried clicking the 'File > Preferences > Capture Devices > Detect Capture Devices' button in JMStudio?
    I have installed JMF but I am not sure the installation went well.
    If JMF can play audio/video files for you, then it's installed correctly.
    Thanks!
    Thanks!

  • How to decrease the size of an image using a Java program

    Hi All,
    I want to decrease the size of an image from MB to KB. I have some images around 1MB to 3MB in size. While displaying all those images in the browser, it takes a long time to download and display them. So I want to write a Java program that makes duplicate copies of those images that are smaller in height and width, and display those smaller copies. Basically I need to display the list of images as a thumbnail view or list view with faster speed.
    Thanks in advance..
    Upendra P.

    I read the documentation. There is no corresponding Image class. The other classes are like BufferedImage and all.
    Yes, BufferedImage sounds like what you want.
    You could (1) read the image, (2) obtain its dimensions and create a blank scaled image, (3) draw a scaled version of the original image using the Graphics2D obtained from the smaller one, and (4) save the smaller image.
    The javax.imageio.ImageIO class will help with reading and writing the data, and Sun's Tutorial has a section on "2D Graphics".
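
    A minimal sketch of those four steps (the file names and the 160-pixel target width are placeholders, not from the thread):

    import java.awt.Graphics2D;
    import java.awt.RenderingHints;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.io.IOException;
    import javax.imageio.ImageIO;

    public class ThumbnailMaker {
        public static void main(String[] args) throws IOException {
            // (1) Read the original image; "photo.jpg" is a placeholder name.
            BufferedImage original = ImageIO.read(new File("photo.jpg"));

            // (2) Compute a scaled size and create a blank image for it.
            int targetWidth = 160;
            int targetHeight = original.getHeight() * targetWidth / original.getWidth();
            BufferedImage thumb = new BufferedImage(targetWidth, targetHeight,
                    BufferedImage.TYPE_INT_RGB);

            // (3) Draw a scaled version of the original into the smaller image.
            Graphics2D g = thumb.createGraphics();
            g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                    RenderingHints.VALUE_INTERPOLATION_BILINEAR);
            g.drawImage(original, 0, 0, targetWidth, targetHeight, null);
            g.dispose();

            // (4) Save the smaller image.
            ImageIO.write(thumb, "jpg", new File("photo_thumb.jpg"));
        }
    }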

  • How can a client send video data from its camera without installing JMF?

    Clients A and B don't have the JMF software installed. How can client A send video data from its camera to client B, when both clients log in to a web server on which the JMF software has been installed? Of course, all of them have the JRE.
    I am writing a video exchange application using JMF. I hope that clients using the video exchange function will not need to install the JMF software themselves when the web server has JMF installed.

    doudouhaha wrote:
    Clients A and B don't have the JMF software installed. How can client A send video data from its camera to client B, when both clients log in to a web server on which the JMF software has been installed?
    They can't.
    I hope that clients using the video exchange function will not need to install the JMF software themselves when the web server has JMF installed.
    Sorry.
    For playing back, maybe it is possible to "jar" the JMF along with the application and upload it to the server; then it may recognize all the classes, but there can still be a problem because JMF depends on many ".dll" files which are stored in system folders, as I have seen. So,........
    This is correct, mostly. Here goes a captfoss info rant...
    JMF is a collection of Java files (jmf.jar) that perform a certain set of functions, and it's also a set of native library files. You can use the Java JMF stuff with Web Start, so the clients wouldn't have to have JMF installed to use the Java JMF stuff. The problem with this approach is that you always have to ask yourself "Can I do what I want to do without the native library stuff?"
    To answer this question, it's important to take a look at the JMF supported formats page.
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/formats.html]
    If you look in the column "JMF 2.1.1 Cross Platform Version", it'll tell you what you can do in pure JMF, without any of the native libraries. In the case of capturing data, the only thing you can capture without native code is audio through the JavaSound system.
    If you take a look at the RTP formats listed, you'll see a problem...the cross-platform version can't send any kind of video data to itself, which means you'd have to send MPEG1 to a transcoding server to bounce it back to the second client as a JPEG image...and that's if you somehow had the video data you want to transmit, which you cannot get from the cross-platform pack anyway.
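
    For completeness, a minimal sketch of the one capture path the cross-platform pack supports on its own: audio capture through JavaSound (the 44100 Hz rate in the locator and the five-second monitoring loop are just examples):

    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.Player;
    import javax.media.protocol.DataSource;

    public class JavaSoundCapture {
        public static void main(String[] args) throws Exception {
            // "javasound://44100" requests a 44100 Hz audio capture DataSource
            // from the pure-Java JavaSound capturer; no native JMF pack is needed.
            MediaLocator locator = new MediaLocator("javasound://44100");
            DataSource source = Manager.createDataSource(locator);

            // Play the captured audio back locally as a quick sanity check.
            Player player = Manager.createRealizedPlayer(source);
            player.start();
            Thread.sleep(5000);    // monitor the microphone for five seconds
            player.close();
        }
    }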

  • Stream image with jmf

    Hi guys,
    1) do you know how I can create in JMF a DataSource or processor with the info of a YUV image?
    Usually programmers who use JMF initialize the DataSource by using a capture device or a MediaLocator. What I need now is simply to create a DataSource with a YUV image info as data to transmit..
    2) Do you know how to stream a YUV image from one PC to another?
    Thanks guys & cheers
    CM

    Have you solved this problem? I am having the same problem right now and haven't been able to solve it yet.
    One thing in my mind is that first of all I need to convert the JPEG images to a movie and after that transmit it...
    If you have solved this problem, I'll be grateful if you can help me...
    Thanks
    Sco
