Using sockets to relay a video stream

Hi there,
I've been working with Java for 4 years now, and I never had trouble building the applications I wanted with it... until my last project :(
I hope you, gurus of the Java community forum, can help me sort this out, so let me explain:
I'd like to stream video content (let's say from a webcam) to several spectators. But my upload bandwidth (ADSL line) is limited to 512kbps, which is far from enough to send a video stream (with a bitrate of ~400kbps) to 10 or more viewers.
That's why I decided to rent a remote server (with a bandwidth of 5Mbps download/upload) and run a Java application on it that broadcasts the stream to all the viewers.
So my computer (connected to ADSL) captures the video from the webcam and sends it via a socket to the rented server, which broadcasts the video to all the spectators. If no video is being sent to the rented server, it should broadcast local files to the spectators instead.
Here is a diagram of the project so you can understand it easily (my English is not as good as I'd like it to be, so words alone may not be clear):
http://www.bf-stream.com/schema1.gif
My application runs quite well with 5 spectators, but as soon as more viewers come to watch the video, it starts hanging, and all the viewers suffer lag and buffering, which is not exactly what I expected...
Of course, I checked the administration interface on the rented server to see if this was due to a CPU/RAM/bandwidth problem. With no success: with 10 viewers connected, the app uses less than 1% of the CPU power, something like 200KB of RAM, and 3-4Mbps of bandwidth (which is still under the actual bandwidth capacity of the server).
That's why I came to the conclusion that my Java app was not as good as I thought, and decided to show it to you so you can help me make it better.
So here is a diagram showing the classes and their interactions:
http://www.bf-stream.com/schema2.gif
And here are the sources of the classes:
StreamRelay.java:
package com.RR4;
import java.net.Socket;
import java.util.Vector;
import java.util.Observable;
import java.util.Observer;
/**
 * StreamRelay:
 *  Main class controlling every aspect of the stream
 *  (live content, or files read from the hard disk)
 */
public class StreamRelay extends Observable implements Observer {
     static public String VERSION = "0.5";
     static public int BUFFER_SIZE = 16;
     private StreamServer streamServer;
     private LiveServer liveServer;
     LocalFile localFile;
     Vector clients;
     boolean streamFromSocket;

     public StreamRelay(int livePort, int relayPort) {
          streamFromSocket = false;
          // clients: vector containing every StreamClient connected
          clients = new Vector();
          // localFile: class reading local files when no live content is sent to the relay
          localFile = new LocalFile(this);
          new Thread(localFile).start();
          // streamServer: server listening for new StreamClients
          streamServer = new StreamServer(this, relayPort);
          new Thread(streamServer).start();
          // liveServer: server listening for new sources of live content
          liveServer = new LiveServer(this, livePort);
          new Thread(liveServer).start();
     }

     /** addClient: adds a new StreamClient to the 'clients' vector (a StreamClient is a 'spectator') */
     public void addClient(Socket clientSocket) {
          // the StreamClient registers itself in 'clients' and as an observer from its constructor
          StreamClient client = new StreamClient(this, clientSocket);
     }

     /** removeClient: removes a StreamClient from the 'clients' vector */
     public void removeClient(int index) {
          clients.remove(index);
     }

     /** update: updates every StreamClient (they are all Observers of StreamRelay) */
     public void update(Observable observable, Object obj) {
          if ((observable instanceof LocalFile && !streamFromSocket) || (observable instanceof LiveStream)) {
               // setChanged() must be called before notifyObservers(),
               // otherwise the notification is silently dropped
               this.setChanged();
               this.notifyObservers((byte[]) obj);
          }
     }

     public static void main(String[] args) {
          new StreamRelay(8000, 8080);
     }
}
LocalFile.java:
package com.RR4;
import java.util.Observable;
import java.io.*;
/** LocalFile: class reading local files when no live content is sent to the relay */
public class LocalFile extends Observable implements Runnable {
     // filenames: array containing the list of files to be read when no live content is sent
     private String[] filenames = {
          "file1",
          "file2"
     };
     private InputStream stream;
     boolean streamOpened;
     private StreamRelay parent;
     int fileIndex;

     // Constructor: initialises LocalFile,
     // sets fileIndex to 0, and sets the main StreamRelay as Observer
     public LocalFile(StreamRelay parent) {
          this.parent = parent;
          fileIndex = 0;
          this.addObserver(parent);
          initStream();
     }

     /** initStream: initialises the stream to read the next file in the 'filenames' array */
     public void initStream() {
          try {
               stream = new BufferedInputStream(new FileInputStream(filenames[fileIndex]), StreamRelay.BUFFER_SIZE);
               streamOpened = true;
               fileIndex++;
               if (fileIndex == filenames.length)
                    fileIndex = 0;
          } catch (Exception exp) {
               exp.printStackTrace();
               streamOpened = false;
          }
     }

     /**
      * run: main loop of the class.
      *     The file is actually read: a buffer of up to BUFFER_SIZE bytes is filled
      *     and sent to the observer (StreamRelay).
      */
     public void run() {
          byte[] buffer = new byte[StreamRelay.BUFFER_SIZE];
          byte[] bufferToSend;
          boolean quit = false;
          int actualBufferSize = 0;
          while (!quit) {
               try {
                    this.setChanged();
                    // Bytes are read until the buffer is filled with BUFFER_SIZE bytes.
                    // Only then is it sent to the observer...
                    while (streamOpened && ((actualBufferSize = stream.read(buffer, 0, StreamRelay.BUFFER_SIZE)) > 0)) {
                         if (parent.clients.size() > 0 && (!parent.streamFromSocket)) {
                              if (actualBufferSize < StreamRelay.BUFFER_SIZE) {
                                   bufferToSend = new byte[actualBufferSize];
                                   System.arraycopy(buffer, 0, bufferToSend, 0, actualBufferSize);
                                   this.notifyObservers(bufferToSend);
                              } else {
                                   this.notifyObservers(buffer);
                              }
                              this.setChanged();
                         } else {
                              try { Thread.sleep(100); } catch (Exception exp) {}
                         }
                    }
               } catch (Exception exp) {
                    exp.printStackTrace();
               }
               try { stream.close(); } catch (Exception exp) {}
               initStream();
          }
     }
}
StreamServer.java:
package com.RR4;
import java.net.ServerSocket;
public class StreamServer extends Thread {
     private ServerSocket serverSocket;
     private boolean socketOpened;
     private StreamRelay parent;
     /** StreamServer: server listening for new StreamClients */
     public StreamServer(StreamRelay parent, int relayPort) {
          this.parent = parent;
          try {
               serverSocket = new ServerSocket(relayPort);
               socketOpened = true;
          } catch (Exception exp) {
               exp.printStackTrace();
               socketOpened = false;
          }
     }

     public void run() {
          try {
               while (socketOpened) {
                    parent.addClient(serverSocket.accept());
               }
          } catch (Exception exp) {
               exp.printStackTrace();
          }
     }
}
StreamClient.java:
package com.RR4;
import java.net.Socket;
import java.util.Observer;
import java.util.Observable;
import java.io.*;
/**
 * StreamClient: class representing a spectator connected to view the video content,
 *     whether it is live content or local files
 */
public class StreamClient implements Observer {
     private Socket socket;
     private OutputStream outStream;
     private boolean connected = true;
     private StreamRelay parent;

     public StreamClient(StreamRelay parent, Socket socket) {
          this.parent = parent;
          this.socket = socket;
          try {
               // initialise the OutputStream from the socket
               outStream = socket.getOutputStream();
          } catch (Exception exp) {
               try { outStream.close(); } catch (Exception e) {}
               try { socket.close(); } catch (Exception e) {}
               exp.printStackTrace();
               connected = false;
          }
          if (connected) {
               // if initialising the OutputStream didn't fail,
               // add this client to the StreamRelay 'clients' vector
               // and register it as an observer of StreamRelay
               parent.clients.add(this);
               parent.addObserver(this);
          }
     }

     /** update: actually sends the read bytes to the client */
     public void update(Observable observable, Object obj) {
          try {
               outStream.write((byte[]) obj);
               outStream.flush();
          } catch (Exception exp) {
               // if the bytes couldn't be sent,
               // remove this client from the clients list and from the observers of StreamRelay,
               // and try to close the OutputStream and the Socket
               connected = false;
               try { parent.deleteObserver(this); } catch (Exception e) {}
               try { parent.clients.remove(this); } catch (Exception e) {}
               try { outStream.close(); } catch (Exception e) {}
               try { socket.close(); } catch (Exception e) {}
          }
     }
}
LiveServer.java:
package com.RR4;
import java.net.ServerSocket;
/**
 * LiveServer:
 *     ServerSocket listening for new 'live streams'
 */
public class LiveServer extends Thread {
     private ServerSocket liveSocket;
     private boolean liveServerOpened;
     private StreamRelay parent;

     public LiveServer(StreamRelay parent, int livePort) {
          this.parent = parent;
          try {
               liveSocket = new ServerSocket(livePort);
               liveServerOpened = true;
          } catch (Exception exp) {
               exp.printStackTrace();
               liveServerOpened = false;
          }
     }

     public void run() {
          LiveStream liveStream;
          try {
               while (liveServerOpened) {
                    liveStream = new LiveStream(parent, liveSocket.accept());
                    new Thread(liveStream).start();
               }
          } catch (Exception exp) {
               exp.printStackTrace();
          }
     }
}
LiveStream.java:
package com.RR4;
import java.util.Observable;
import java.io.*;
import java.net.Socket;
/**
 *     LiveStream:
 *          Socket receiving live content from another, distant computer,
 *          to broadcast it instead of the local files.
 */
public class LiveStream extends Observable implements Runnable {
     private InputStream stream;
     boolean streamOpened;
     private StreamRelay parent;

     public LiveStream(StreamRelay parent, Socket socket) {
          this.parent = parent;
          this.addObserver(parent);
          try {
               stream = new BufferedInputStream(socket.getInputStream(), StreamRelay.BUFFER_SIZE);
               streamOpened = true;
          } catch (Exception exp) {
               exp.printStackTrace();
               streamOpened = false;
          }
     }

     public void run() {
          byte[] buffer = new byte[StreamRelay.BUFFER_SIZE];
          byte[] bufferToSend;
          int actualBufferSize = 0;
          try {
               this.setChanged();
               // Bytes are read until the buffer is filled with BUFFER_SIZE bytes.
               // Only then is it sent to the observer...
               while (streamOpened && ((actualBufferSize = stream.read(buffer, 0, StreamRelay.BUFFER_SIZE)) > 0)) {
                    if (!parent.streamFromSocket)
                         parent.streamFromSocket = true;
                    if (parent.clients.size() > 0) {
                         if (actualBufferSize < StreamRelay.BUFFER_SIZE) {
                              bufferToSend = new byte[actualBufferSize];
                              System.arraycopy(buffer, 0, bufferToSend, 0, actualBufferSize);
                              this.notifyObservers(bufferToSend);
                         } else {
                              this.notifyObservers(buffer);
                         }
                         this.setChanged();
                    } else {
                         try { Thread.sleep(100); } catch (Exception exp) {}
                    }
               }
          } catch (Exception exp) {
               exp.printStackTrace();
          } finally {
               try { stream.close(); } catch (Exception exp) {}
               this.deleteObserver(parent);
               parent.streamFromSocket = false;
          }
     }
}

For your information, I use NSV/VP6 VBR as the video codec (but this should have no impact, since the app only takes the video stream from a socket and broadcasts it to other sockets, without analysing or modifying it). The Java app is hosted on a Celeron 2.6 GHz with 128MB of RAM.
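In case it helps you reproduce the setup: the live source is simply a socket writing raw NSV bytes to the live port (8000). For testing, I can push a local file to the relay with a minimal feeder like this (the host name and file name are just examples):

import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;

// Sketch: push a local NSV file to the relay's live port, as the webcam encoder would.
public class LiveFeeder {
     public static void main(String[] args) throws Exception {
          Socket socket = new Socket("my.rented.server", 8000); // the LiveServer port
          OutputStream out = socket.getOutputStream();
          InputStream in = new FileInputStream("test.nsv");     // example file
          byte[] buffer = new byte[4096];
          int read;
          while ((read = in.read(buffer)) > 0) {
               out.write(buffer, 0, read);
               out.flush();
          }
          in.close();
          socket.close();
     }
}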
I really hope you'll be able to help me with this project, as it is really important to me...
I've tried several Stream types available in the JDK without any success... I've also played with the BUFFER_SIZE parameter, unsuccessfully too...
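For what it's worth, here is one idea I have not tried yet: since StreamRelay notifies every StreamClient from the same thread, a single slow spectator delays everyone behind it. Each client could get its own small queue and writer thread instead, something like this sketch (class and field names are just illustrative, this is not part of the code above):

import java.io.OutputStream;
import java.util.LinkedList;

// Sketch: a spectator with its own buffer queue and writer thread,
// so one slow client cannot stall the broadcast loop.
public class QueuedClient implements Runnable {
     private final OutputStream outStream;
     private final LinkedList queue = new LinkedList(); // pending byte[] chunks
     private static final int MAX_QUEUED = 256;         // drop data rather than stall
     private boolean connected = true;

     public QueuedClient(OutputStream outStream) {
          this.outStream = outStream;
          new Thread(this).start();
     }

     // called from the broadcast loop: never blocks, only enqueues
     public synchronized void send(byte[] chunk) {
          if (queue.size() < MAX_QUEUED)
               queue.addLast(chunk);
          notify();
     }

     private synchronized byte[] take() throws InterruptedException {
          while (queue.isEmpty())
               wait();
          return (byte[]) queue.removeFirst();
     }

     // writer thread: the only place that touches the (blocking) socket stream
     public void run() {
          try {
               while (connected) {
                    outStream.write(take());
                    outStream.flush();
               }
          } catch (Exception exp) {
               connected = false;
               try { outStream.close(); } catch (Exception e) {}
          }
     }
}

The broadcast loop would then call send() instead of writing to the socket directly.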
Anyway, thanks in advance for reading this far... and, I hope, for helping me... I know the Java community is strong, and I hope I won't have to rewrite this in C or C++ :(

Hi again :)
I've been focusing on the local part of the stream (no live video, just playing local files and sending them to clients) using NIO.
NIOStreamRelay.java:
package com.RR4;
import java.io.*;
import java.nio.*;
import java.nio.channels.*;
import java.nio.channels.spi.*;
import java.net.*;
import java.util.*;
public class NIOStreamRelay {
     static int BUFFER_SIZE = 1024;
     LocalFile localFile;
     Vector clients;

     public NIOStreamRelay() throws IOException {
          localFile = new LocalFile(this);
          new Thread(localFile).start();
          clients = new Vector();
          acceptConnections();
     }

     public void acceptConnections() throws IOException {
          // Selector for incoming requests
          Selector acceptSelector = SelectorProvider.provider().openSelector();
          // Create a new server socket and set it to non-blocking mode
          ServerSocketChannel ssc = ServerSocketChannel.open();
          ssc.configureBlocking(false);
          // Bind the server socket to the local host and port
          InetAddress lh = InetAddress.getLocalHost();
          InetSocketAddress isa = new InetSocketAddress(lh, 8080);
          ssc.socket().bind(isa);
          // Register accepts on the server socket with the selector. This
          // step tells the selector that the socket wants to be put on the
          // ready list when accept operations occur, thus allowing multiplexed
          // non-blocking I/O to take place.
          SelectionKey acceptKey = ssc.register(acceptSelector, SelectionKey.OP_ACCEPT);
          int keysAdded = 0;
          // Here's where everything happens. The select method will
          // return when any operations registered above have occurred, the
          // thread has been interrupted, etc.
          while ((keysAdded = acceptSelector.select()) > 0) {
               // Someone is ready for I/O, get the ready keys
               Set readyKeys = acceptSelector.selectedKeys();
               Iterator i = readyKeys.iterator();
               // Walk through the ready keys collection and process the connection requests
               while (i.hasNext()) {
                    SelectionKey sk = (SelectionKey) i.next();
                    i.remove();
                    // The key indexes into the selector, so you
                    // can retrieve the socket that's ready for I/O
                    ServerSocketChannel nextReady = (ServerSocketChannel) sk.channel();
                    // Accept the connection and register the new spectator
                    Socket s = nextReady.accept().socket();
                    OutputStream socketStream = s.getOutputStream();
                    StreamClient newClient = new StreamClient(socketStream, this);
                    localFile.addObserver(newClient);
                    clients.add(newClient);
               }
          }
     }

     public static void main(String[] args) {
          try {
               NIOStreamRelay streamRelay = new NIOStreamRelay();
          } catch (Exception e) {
               e.printStackTrace();
          }
     }
}
LocalFile.java:
package com.RR4;
import java.util.Observable;
import java.io.*;
/** LocalFile: class reading local files when no live content is sent to the relay */
public class LocalFile extends Observable implements Runnable {
     // filenames: array containing the list of files to be read when no live content is sent
     private String[] filenames = {
          "test.nsv",
          "test2.nsv"
     };
     private InputStream stream;
     boolean streamOpened;
     int fileIndex;
     NIOStreamRelay parent;

     // Constructor: initialises LocalFile and sets fileIndex to 0
     // (the StreamClients are added as observers by NIOStreamRelay)
     public LocalFile(NIOStreamRelay parent) {
          this.parent = parent;
          fileIndex = 0;
          initStream();
     }

     /** initStream: initialises the stream to read the next file in the 'filenames' array */
     public void initStream() {
          try {
               stream = new BufferedInputStream(new FileInputStream(filenames[fileIndex]), NIOStreamRelay.BUFFER_SIZE);
               streamOpened = true;
               fileIndex++;
               if (fileIndex == filenames.length)
                    fileIndex = 0;
          } catch (Exception exp) {
               exp.printStackTrace();
               streamOpened = false;
          }
     }

     /**
      * run: main loop of the class.
      *     The file is actually read: a buffer of up to BUFFER_SIZE bytes is filled
      *     and sent to the observers (the StreamClients).
      */
     public void run() {
          byte[] buffer = new byte[NIOStreamRelay.BUFFER_SIZE];
          byte[] bufferToSend;
          boolean quit = false;
          int actualBufferSize = 0;
          while (!quit) {
               try {
                    this.setChanged();
                    // Bytes are read until the buffer is filled with BUFFER_SIZE bytes.
                    // Only then is it sent to the observers...
                    while (streamOpened && ((actualBufferSize = stream.read(buffer, 0, NIOStreamRelay.BUFFER_SIZE)) > 0)) {
                         if (parent.clients.size() > 0) {
                              if (actualBufferSize < NIOStreamRelay.BUFFER_SIZE) {
                                   bufferToSend = new byte[actualBufferSize];
                                   System.arraycopy(buffer, 0, bufferToSend, 0, actualBufferSize);
                                   this.notifyObservers(bufferToSend);
                              } else {
                                   this.notifyObservers(buffer);
                              }
                              this.setChanged();
                         } else {
                              try { Thread.sleep(100); } catch (Exception exp) {}
                         }
                    }
               } catch (Exception exp) {
                    exp.printStackTrace();
               }
               try { stream.close(); } catch (Exception exp) {}
               initStream();
          }
     }
}
StreamClient.java:
package com.RR4;
import java.io.*;
import java.net.*;
import java.util.*;
public class StreamClient implements Observer {
     OutputStream out;
     NIOStreamRelay parent;

     public StreamClient(OutputStream out, NIOStreamRelay parent) {
          this.out = out;
          this.parent = parent;
     }

     public void update(Observable observable, Object obj) {
          try {
               out.write((byte[]) obj);
               out.flush();
          } catch (Exception exp) {
               // if the bytes couldn't be sent,
               // remove this client from the clients list and from the observers of LocalFile,
               // and try to close the OutputStream
               try { parent.localFile.deleteObserver(this); } catch (Exception e) {}
               try { parent.clients.remove(this); } catch (Exception e) {}
               try { out.close(); } catch (Exception e) {}
          }
     }
}

Does it look better to you?
I know I'm still using a single thread, but since the I/O should be non-blocking, I guess it should be better.
Furthermore, I tried it locally and was able to launch 30+ clients without buffering problems (huh, OK, my CPU here is only a 1.6GHz, so the display was a bit laggy, but it didn't seem to buffer at all).
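One thing I'm not sure about, though: I still write to the spectators through the accepted socket's OutputStream, which blocks exactly like before; only the accept() is non-blocking. If I understand the NIO docs correctly, a truly non-blocking write would go through the SocketChannel itself. Roughly (just a sketch, not part of my app yet; 'channel' would be the SocketChannel returned by accept(), kept in non-blocking mode instead of unwrapping its Socket):

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.SocketChannel;

// Sketch: write a chunk to a spectator without ever blocking the broadcast loop.
public class NonBlockingWriter {
     public static int writeChunk(SocketChannel channel, byte[] chunk) throws IOException {
          ByteBuffer buf = ByteBuffer.wrap(chunk);
          // write() returns immediately with the number of bytes the OS accepted;
          // it may be less than chunk.length if the client's TCP send buffer is full
          int written = channel.write(buf);
          // a real relay would register OP_WRITE with the selector and retry the
          // remainder of 'buf' once the channel becomes writable again
          return written;
     }
}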

Similar Messages

  • Flash Player 10.1: when using it to view web video streams, can I download at the same time?

    Flash Player is often the player used when viewing web video streams. As with RealPlayer, is it possible to download at the same time?

    Yes on the restart or reboot.
    Add ons Shockwave Flash Object and Shockwave ActiveX Control both enabled.
    Add or Remove Programs list Adobe Air, Adobe Flash Player 10 ActiveX, Adobe Reader X and Adobe Shockwave Player 11.5.
    You were saying that the "version numbers must match"? Please clarify. It appears I have non-matching versions... is that correct?
    I thank you for your time, it is greatly appreciated!

  • Video streaming from database in adf

    Hi, I want to stream videos stored in an Oracle database in a web application using ADF. Is there any way to do this? Please reply with a complete description.

    I have stored the video in a BLOB column in the database, but whenever I try to stream this video using a servlet, the browser always hangs for some time and then shows corrupted data on the screen.
    I don't know why this is happening; by the way, I am showing images using the same approach, but I don't know what is wrong with video.
    I am using the media tag for video streaming; the media tag works fine with videos stored on the file system.
    If you have a demo application, please share it with me.
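    For context, a minimal version of such a streaming servlet could look roughly like this (a sketch only; the table and column names are made up, and getConnection() stands in for your data source). Setting the content type and content length to match the stored video is often what makes the browser stop hanging:

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.sql.Blob;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Sketch: stream a video BLOB from the database to the browser in chunks.
    public class VideoServlet extends HttpServlet {
         protected void doGet(HttpServletRequest req, HttpServletResponse resp) {
              try {
                   Connection conn = getConnection(); // obtain from your data source
                   PreparedStatement ps = conn.prepareStatement(
                        "SELECT video FROM media WHERE id = ?"); // hypothetical table/column
                   ps.setString(1, req.getParameter("id"));
                   ResultSet rs = ps.executeQuery();
                   if (rs.next()) {
                        Blob blob = rs.getBlob(1);
                        resp.setContentType("video/mp4"); // must match the stored format
                        resp.setContentLength((int) blob.length());
                        InputStream in = blob.getBinaryStream();
                        OutputStream out = resp.getOutputStream();
                        byte[] buf = new byte[8192];
                        int n;
                        while ((n = in.read(buf)) > 0)
                             out.write(buf, 0, n);
                        in.close();
                   }
                   rs.close(); ps.close(); conn.close();
              } catch (Exception e) {
                   e.printStackTrace();
              }
         }
         private Connection getConnection() throws Exception {
              throw new UnsupportedOperationException("wire up your DataSource here");
         }
    }

    The browser's media tag may also request byte ranges when seeking; handling the Range header is a further step this sketch leaves out.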

  • Playing video streaming

    I'm beginning to implement a video chat in Java. I used JMF for capturing the video stream from a digital camera and the RTP protocol for transmitting it. I want to play the stream in an Applet (on the client side). I don't want to use JMF on the client side (an RTP Applet player or something like that) because that would mean the client needs to install the JMF package. I need a solution to play the stream in an applet without the client having to install any software. I can use anything on the server side (from where I transmit)... Is this possible? Please help me. Thank you.

    I've only done some web cam capturing code, but I thought you only need the "performance pack" of JMF installed on a machine if you want to perform capturing.
    So maybe you can still use JMF in an applet without a formal installation on the client PC.
    Just try including jmf.jar and any codec class files necessary to replay your video stream?
    It's worth a shot... although someone may very well correct me on this.
    regards,
    Owen

  • Capture video stream from network

    Hi all,
    I have an application to broadcast video to the network.
    And I want to write an app that uses JMF to capture the video stream.
    If you have any sample source code, please send it to me
    (my mail: [email protected]).
    Thank you.

    Yep,
    AVTransmit2 is giving an ERROR (line 123: "Couldn't create DataSource"), so I think the MediaLocator is not found and therefore the DataSource is not created, but the variable locator is not NULL.
    I am trying to transmit from a webcam (real time).

  • Use Time Capsule for video...

    Has anyone used Time Capsule for video streaming? How do I connect it to a TV if it's pre-digital?

    Time Capsule is not a media player.. it has zero ability to stream video. It has zero ability to do anything but hold files. It will sit there and hold your video files very happily.. but the dumb-as-a-board hard disk and TC controls can do nothing with them.
    You need a media player. And you need one that is able to be hacked or is not Apple. Apple works on iTunes, although it is possible to use an Apple TV in the mix if you send video to it from an iPad using AirPlay.
    If by pre-digital you mean the TV doesn't have HDMI input, then buy yourself an older media player that has component out.. there is no point buying expensive HDMI-to-analogue converters. You will eventually replace the TV and will definitely want to go digital. So get an Apple TV 1 that still works from eBay, and use one of the hacking methods. It has component video, and you can also get composite out of it. And it will play files stored on the network. Other non-Apple players will do the same. Just find one that is an older design with component out; an ATV1 can still work well. I use one with a Crystal HD acceleration card running the CrystalBuntu firmware.

  • Video Stream from Webcam with socket possible?

    Hello there,
    I want to stream a live feed from my webcam in a chat, but I have no idea how to manage it... (I don't want RTP)...
    I can already display it on my screen, but how do I stream it?
    Can I create a socket stream from it?
    And how can I transmit the format info to the other side?
    What I have is the following:
    The component of the camera:
    Component live = player.getVisualComponent();
    And the formats:
    private static Format[] availableFormats;
    availableFormats = deviceInfo.getFormats();
    Any ideas?

    Hi,
    I want to do the same thing ... I was able to stream audio through the socket but could not stream video.
    I posted my question at http://forum.java.sun.com/thread.jspa?threadID=674676&tstart=15
    and presented a sketch of my implementation design. But until now no one has answered my question.
    This is how you should hook up the components on the transmitter side
    Webcam --> Processor --> Custom DataSink --> PushBufferStream --> PushBufferHandler --> Socket --> Network.
    You can implement the transferData() method of the PushBufferHandler to write to a socket.
    And on the Receiver side the components should be hooked up like this
    Network --> Socket --> PullBufferStream --> custom DataSource --> Player
    Now as I mentioned in
    http://forum.java.sun.com/thread.jspa?threadID=674676&tstart=15
    the audio works fine but the video does not. The video does not render on the receiver side.
    Note: I tried TCP sockets with audio and it is terrible (not practical at all), so I reverted to UDP datagrams and the result was better than I expected. I didn't take care of the ordering of received packets at the receiver, but it still worked fine.
    I am still waiting for some help on why the video is not rendering and how I can solve this problem.
    I am working on a project and time is running out.
    Any help?
    -Adel
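    To make the transmitter side concrete, the handler hooked to the PushBufferStream could look roughly like this (a sketch; the length-prefix framing is my own choice, not something JMF mandates):

    import java.io.DataOutputStream;
    import java.io.IOException;
    import javax.media.Buffer;
    import javax.media.protocol.BufferTransferHandler;
    import javax.media.protocol.PushBufferStream;

    // Sketch: transferData() pulls the freshly available buffer and writes it,
    // length-prefixed, to a socket's DataOutputStream so the receiver can re-frame it.
    public class SocketTransferHandler implements BufferTransferHandler {
         private final DataOutputStream out;
         private final Buffer buffer = new Buffer();

         public SocketTransferHandler(DataOutputStream out) {
              this.out = out;
         }

         public void transferData(PushBufferStream stream) {
              try {
                   stream.read(buffer);                       // fetch the new frame data
                   byte[] data = (byte[]) buffer.getData();
                   out.writeInt(buffer.getLength());          // frame boundary for the receiver
                   out.write(data, buffer.getOffset(), buffer.getLength());
                   out.flush();
              } catch (IOException e) {
                   e.printStackTrace();
              }
         }
    }

    It would be registered with stream.setTransferHandler(new SocketTransferHandler(out)); the receiver reads the length, then that many bytes, to rebuild each buffer.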

  • Video streamed from YouTube using the Apple Composite AV Cable

    I used to stream video from YouTube on my iPad to my TV using the Apple Composite AV Cable. This worked in previous versions, but since the iOS 6 release, YouTube no longer streams video through this cable :'(
    Very annoying, really, to lose a feature through a software upgrade!
    Please help me.

    Looks like the connector that goes into the iPad doesn't precisely connect all pins and lock the whole thing in place as well as we would expect of an Apple-branded product. I discovered after four hours that if I pull my AV composite cable from the iPad without turning the iPad off (not the Apple-recommended way - perhaps there's a better way, like replugging with it off?) and plug it back in, *poof* I get video again. Now, Netflix still isn't working, but YouTube videos and photo slideshows are back with awesome-quality sound. I believe I'll get similar results when I try this on my iPhone 4.
    I guess Apple TV is really the only way to get back the luxury of being able to stay focused on your entertainment and not this equipment, because this was supposed to be a movie night for me four hours ago. I just wish Apple TV had as much content as the Roku.
    Anyway, just plug and re-plug, but don't be too hard on it. You might also have to turn widescreen off and back on again.

  • Full screen videos/streams on second monitor won't stay full screen while using another window on other monitor

    Whenever I open and then full-screen a video or stream of any kind, I am unable to use another Firefox window while the video/stream stays full screen. When I click off the video/stream, it reverts back to the smaller size. Is there any way to change this?

    Create a new profile as a test to check if your current profile is causing the problems.
    See "Creating a profile":
    *https://support.mozilla.org/kb/profile-manager-create-and-remove-firefox-profiles
    *http://kb.mozillazine.org/Standard_diagnostic_-_Firefox#Profile_issues
    Profile Backup and Restore
    *http://kb.mozillazine.org/Profile_backup
    *https://support.mozilla.org/en-US/kb/back-and-restore-information-firefox-profiles
    *http://kb.mozillazine.org/Transferring_data_to_a_new_profile_-_Firefox

  • Can PQA use a single reference frame against a test video stream?

    For R&D testing of video "set-top" devices I want to initiate some internal processing on our UUT, then using a PXI-1491 analyze the digital (HDMI) video output of the UUT for some large number of seconds (180 seconds or more for example).
    The video that is being analyzed will have static image content. The amount of time I want to analyze the test stream after doing some stuff on the UUT is variable, but I always anticipate it being fairly long.
    It currently appears that I have to have a reference stream that contains exactly the same number of frames as the test stream.
    This makes the reference vbf files very large. My test requirements include a large number of resolutions that must be tested through the UUT. Having an extensive library of very large vbf files is logistically difficult; it would be much easier to maintain such a library made up of single "golden" frame reference files instead. Additionally, since my analysis time needs to be variable depending on the test setup and UUT processing options, it would be better for me to have a single golden reference frame and validate a lot of test frames against it.
    Since the analyzed video will have static images, is it possible to run a test video stream against a single reference frame (reduced reference) instead of having a full reference stream frame by frame?

    Doing exactly what you are asking is not within the design of PQA.  Of course, the best option that we would recommend when working with video test is a large capacity hard drive, probably in a RAID configuration for more space and better performance, and then just taking a golden reference with 10k frames, and doing what you originally suggested.  
    One method of achieving what you are looking for without using a large reference file is through offline processing.  This would allow you to acquire 10,000 consecutive frames, which I understand is one of your concerns, and then after the fact run them back through the analyzer.  You would still be performing a looping type of action in TestStand or LabVIEW.  The process would be:
    1) Acquire your source from your 1491 provider with no processors, and saving the media stream to disk.
    Loop:
    2) Load PQA with the disk buffer provider instead of the 1491, the disk buffer file will be the acquisition from above
    3) Point the start frame to your current location.
    4) Perform your processor with the results.
    5) Repeat and iterate to a new start frame location
    This process is going to be much slower as you load and unload resources every time.  If you choose to go this way, a better implementation would be to maybe consider doing 100 frames at a time, or some number larger than 1, because the processing time of handling the extra frames is going to be less than loading/unloading PQA.
    Your second option is going to require more work outside of PQA to implement custom functionality through a Custom User Processor.  Effectively this allows you to come up with a way to process incoming data in ways that you'd like.  To do this you will need:
    1) LabVIEW 2011 - Custom User Processors for PQA can only be developed in LabVIEW 2011.  If you are under a SSP agreement with NI and only have LabVIEW 2012 currently, you still have access to older versions.
    2) Vision Development Module - To perform your image processing
    3) An unencrypted video source - Due to limitations of HDCP we can not expose the raw video feed to user processors.
    In this user processor you would:
    1) Load your static image/frame in directly, you wouldn't need to use the Media Ref input
    Loop
    2) Load the current frame from the incoming video array
    3) Perform your video measurement with the Vision function, likely PNSR or SSIM since these are currently the only 2 referenced measurements in PQA.
    To learn more about Custom User Processors, in the PQA help check out: NI PQA Executive and the NI PQA Configuration Panel > NI PQA Tabs > Processors tab > Customizable Output Processors > User Processor, as well as: http://digital.ni.com/public.nsf/allkb/514058CC830D86EE86257881004CB45F
    Paul Davidson
    Sound and Vibration Software Staff Product Support Engineer
    National Instruments

  • How can I use QuickTime SDK to get Compress format of Video stream?

    I can get an uncompressed video stream like this:
    QDerr = NewGWorldFromPtr(&m_hGWorld, k32BGRAPixelFormat, &rcMac,
                             NULL, NULL, 0, (char*)m_pVideoBuffer,
                             m_sAVFileInfo.sResInfo.ulWidth * 4);
    The m_pVideoBuffer is then an uncompressed video stream. How can I use the QuickTime SDK to get a compressed format of the video stream? Could I change "k32BGRAPixelFormat" to another format, use another API, or something?
    Does someone have a method? Best wishes!

    Firefox, I would make sure you update your Desktop to Firefox 14.0.1, as the version you are running is quite old and out of date.
    Then, do you have your Sync Recovery key, username and password? If you do, you can re-add Firefox Sync to your computer.

  • How to use vivado hls::mat with AXI-Stream interfaces (not AXI4 video stream) ?

      Hello, everyone. I am trying to design an image processing IP core with Vivado HLS 2014.4. From xapp1167, I have learned that the video functions provided by Vivado HLS should be used with AXI4 video streams and VDMA. However, I want to write/read image data to/from the IP core through AXI-Stream interfaces and AXI DMA, for some specific reasons.
      To verify the feasibility, a test IP core named detectTest was designed as follows. The function of this IP core is to read a 320x240 8-bit gray image (bits 7-0 of INPUT_STREAM_TDATA) from the AXIS port "INPUT_STREAM" and then output it with no changes. I built a Vivado project for the ZedBoard and tested the IP core with an AXI DMA. Experimental results show that the IP core works normally, so it seems possible to use hls::Mat with AXIS.
    #include "hls_video.h"
    #include "hls_math.h"
    typedef ap_axiu<32, 1, 1, 1> AXI_VAL;
    typedef hls::Scalar<HLS_MAT_CN(HLS_8U), HLS_TNAME(HLS_8U)> GRAY_PIXEL;
    typedef hls::Mat<240, 320, HLS_8U> GRAY_IMAGE;
    #define HEIGHT 240
    #define WIDTH 320
    #define COMPRESS_SIZE 2
    template<typename T, int U, int TI, int TD>
    inline T pop_stream(ap_axiu<sizeof(T) * 8, U, TI, TD> const &e) {
    #pragma HLS INLINE off
        assert(sizeof(T) == sizeof(int));
        union {
            int ival;
            T oval;
        } converter;
        converter.ival = e.data;
        T ret = converter.oval;
        // consume the side-channel signals so they are not optimised away
        volatile ap_uint<sizeof(T)> strb = e.strb;
        volatile ap_uint<sizeof(T)> keep = e.keep;
        volatile ap_uint<U> user = e.user;
        volatile ap_uint<1> last = e.last;
        volatile ap_uint<TI> id = e.id;
        volatile ap_uint<TD> dest = e.dest;
        return ret;
    }
    template<typename T, int U, int TI, int TD>
    inline ap_axiu<sizeof(T) * 8, U, TI, TD> push_stream(T const &v, bool last = false) {
    #pragma HLS INLINE off
        ap_axiu<sizeof(T) * 8, U, TI, TD> e;
        assert(sizeof(T) == sizeof(int));
        union {
            int oval;
            T ival;
        } converter;
        converter.ival = v;
        e.data = converter.oval;
        // set it to sizeof(T) ones
        e.strb = -1;
        e.keep = 15; //e.strb;
        e.user = 0;
        e.last = last ? 1 : 0;
        e.id = 0;
        e.dest = 0;
        return e;
    }
    GRAY_IMAGE mframe(HEIGHT, WIDTH);

    void detectTest(AXI_VAL INPUT_STREAM[HEIGHT * WIDTH], AXI_VAL RESULT_STREAM[HEIGHT * WIDTH]) {
    #pragma HLS INTERFACE ap_fifo port=RESULT_STREAM
    #pragma HLS INTERFACE ap_fifo port=INPUT_STREAM
    #pragma HLS RESOURCE variable=RESULT_STREAM core=AXI4Stream metadata="-bus_bundle RESULT_STREAM"
    #pragma HLS RESOURCE variable=INPUT_STREAM core=AXI4Stream metadata="-bus_bundle INPUT_STREAM"
    #pragma HLS RESOURCE variable=return core=AXI4LiteS metadata="-bus_bundle CONTROL_STREAM"
        int i, j;
        for (i = 0; i < HEIGHT * WIDTH; i++) {
            unsigned int instream_value = pop_stream<unsigned int, 1, 1, 1>(INPUT_STREAM[i]);
            hls::Scalar<HLS_MAT_CN(HLS_8U), HLS_TNAME(HLS_8U)> pixel_in;
            *(pixel_in.val) = (unsigned char) instream_value;
            mframe << pixel_in;
            hls::Scalar<HLS_MAT_CN(HLS_8U), HLS_TNAME(HLS_8U)> pixel_out;
            mframe >> pixel_out;
            unsigned int outstream_value = (unsigned int) *(pixel_out.val);
            RESULT_STREAM[i] = push_stream<unsigned int, 1, 1, 1>(
                (unsigned int) outstream_value, i == HEIGHT * WIDTH - 1);
        }
        return;
    }
      Then I tried to modify the function of detectTest as follows. The function of the modified IP core is to resize the input image and then recover its original size. However, it did not work in the AXI-DMA test. The waveform captured by ChipScope shows that the ready signal of INPUT_STREAM was cleared after receiving several pixels.
    GRAY_IMAGE mframe(HEIGHT, WIDTH);
    GRAY_IMAGE mframe_resize(HEIGHT / COMPRESS_SIZE, WIDTH / COMPRESS_SIZE);

    void detectTest(AXI_VAL INPUT_STREAM[HEIGHT * WIDTH], AXI_VAL RESULT_STREAM[HEIGHT * WIDTH]) {
    #pragma HLS INTERFACE ap_fifo port=RESULT_STREAM
    #pragma HLS INTERFACE ap_fifo port=INPUT_STREAM
    #pragma HLS RESOURCE variable=RESULT_STREAM core=AXI4Stream metadata="-bus_bundle RESULT_STREAM"
    #pragma HLS RESOURCE variable=INPUT_STREAM core=AXI4Stream metadata="-bus_bundle INPUT_STREAM"
    #pragma HLS RESOURCE variable=return core=AXI4LiteS metadata="-bus_bundle CONTROL_STREAM"
        int i, j;
        for (i = 0; i < HEIGHT * WIDTH; i++) { // receiving block
            unsigned int instream_value = pop_stream<unsigned int, 1, 1, 1>(INPUT_STREAM[i]);
            hls::Scalar<HLS_MAT_CN(HLS_8U), HLS_TNAME(HLS_8U)> pixel_in;
            *(pixel_in.val) = (unsigned char) instream_value;
            mframe << pixel_in;
        }
        hls::Resize(mframe, mframe_resize);
        hls::Resize(mframe_resize, mframe);
        for (i = 0; i < HEIGHT * WIDTH; i++) { // transmitting block
            hls::Scalar<HLS_MAT_CN(HLS_8U), HLS_TNAME(HLS_8U)> pixel_out;
            mframe >> pixel_out;
            unsigned char outstream_value = *(pixel_out.val);
            RESULT_STREAM[i] = push_stream<unsigned int, 1, 1, 1>((unsigned int) outstream_value, i == HEIGHT * WIDTH - 1);
        }
        return;
    }
      I also tried to delete or modify the following 2 lines in the modified IP core, but the transmitting problem persisted. It seems that the IP core cannot work normally if the receiving block and the transmitting block are in different "for" loops. But if I cannot solve this problem, the image processing functions cannot be added to the IP core either. The document xapp1167 mentions that "the hls::Mat<> datatype used to model images is internally defined as a stream of pixels". Does that cause the problem? And how can I solve it? Thanks a lot!
    hls::Resize(mframe, mframe_resize);
    hls::Resize(mframe_resize, mframe);
     

    Hello
    So the major concept that you need to learn/remember is that hls::Mat<> is basically "only" an HLS stream -- hls::stream<> -- it's actually an array of N channels (and you have N=1).
    Next, streams are FIFOs; in software they are modeled as infinite queues, but in HW they have a finite size.
    The default value is a depth of 2 (IIRC).
    in your first code you do :
    for all pixels loop {
      .. something to read pixel_in
       mframe takes pixel_in
       pixel_out is read from mframe
       .. wirte out pixel_out
    } // end loop
    If you notice, mframe never holds more than one pixel element, since as soon as you write to it, you unload it. In other terms, mframe never contains a full frame of pixels (but a full frame flows through it!).
    In your second coding, mframe has to actually contain all the pixels, as you have 2 for loops and you don't start unloading the pixels until the first loop completes.
    Needless to say, your FIFO has a depth of 2, so actually you never read more than 3 pixels in.
    That's why you see the ready signal of the input stream drop after a few pixels; that's the back pressure being applied by the VHLS block.
    Where to go from there?
    Well, first: stop doing FPGA tests and ChipScope if you have not run cosim first and seen it pass.
    Had you run cosim and it failed - or got stuck - you would have debugged there, rather than waiting for a bitstream to implement.
    Check UG902 about cosim and self-checking testbenches. Maybe for video you can't have self-checking, so at least you need visual checks of the generated pictures - you can adapt XAPP1167 for that.
    For your design, you could increase the depth of the stream - XAPP1167 explains that - but here it's impractical or sometimes impossible to buffer a full-size frame.
    If you check the XAPP carefully, the design operates in "dataflow" mode; check UG902 for what this means.
    In short, dataflow means that the HW functions operate in parallel: here, the second loop starts executing as soon as data has been generated by the first loop. The link between the loops is a stream/FIFO, so as soon as a datum is generated in the first loop, the second loop can process it; this is possible because the processing happens in sequential order.
    Well I leave you to read more.
    I hope this helps....

  • P2p video streaming using jmf (is it possible to "forward" the stream ?)

    Hello
    In my project, a peer will start streaming captured video from the webcam to his neighbors, and then his neighbors will have to display the video and forward (stream) it to their neighbors, and so on. So my question is: can I do this using JMF? A simple scenario would be: peer_1 streams to peer_2, and then peer_2 forwards (streams) the video received from peer_1 to peer_3.
    I've read the jmf2_0 guide and I've seen that it's only possible to stream from a server to a client, and that's about it, but I also want the client to pass the stream forward to another client... like for example [http://img72.imageshack.us/img72/593/p2pjmf.gif|http://img72.imageshack.us/img72/593/p2pjmf.gif]
    I want to know at least if this is possible with JMF, or should I start looking for another solution? And do you have any examples of such projects, or examples of forwarding the stream with JMF?
    thanks for any suggestions

    _Cris_ wrote:
    I want to know at least if this is possible with JMF, or should I start looking for another solution? And do you have any examples of such projects, or examples of forwarding the stream with JMF?
    You can do that with JMF. Once you receive the stream, it's just a video stream. You can do anything you want with it... display it, record it, or send it as an RTP stream.
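    For the forwarding step, the DataSource of the received stream can be fed straight into a new RTP session. Roughly (a sketch; the host and ports are examples):

    import java.net.InetAddress;
    import javax.media.protocol.DataSource;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.ReceiveStream;
    import javax.media.rtp.SendStream;
    import javax.media.rtp.SessionAddress;

    // Sketch: forward a stream received from peer_1 on to peer_3 over a new RTP session.
    public class Forwarder {
         public static void forward(ReceiveStream received) throws Exception {
              DataSource ds = received.getDataSource();       // the incoming video, as a DataSource
              RTPManager mgr = RTPManager.newInstance();
              mgr.initialize(new SessionAddress(InetAddress.getLocalHost(), 42050));            // local RTP port (example)
              mgr.addTarget(new SessionAddress(InetAddress.getByName("peer3.example.org"), 42052)); // next peer (example)
              SendStream out = mgr.createSendStream(ds, 0);   // stream index 0 of the DataSource
              out.start();
         }
    }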

  • Using a video stream (NOT streaming a video clip) in Flash?

    Hello,
    I need to get an RTSP video stream into Flash, but I can't
    figure out if it is possible. Also, the stream is encoded
    using H.264 for video and AAC for audio. Any suggestions on how to
    make this fly are more than welcome.
    Best regards from a sunny Stockholm - spring is finally here!
    /Cristian

    Just check it with .flv files. If it works fine for them,
    then your code is fine, and you have to search for your format or convert it to FLV by any means.
    with Regards,
    Shardul Singh Bartwal

  • Director 11 using a webcam or live video streaming input

    I am looking for a solution to use input from a webcam or from any other live video stream in Director 11.
    I have tried using a Flash component, but this does not work.
    The MyronXtra does not work with Director 11 either!
    Does anyone have an idea?
    Thank you

    You can use this demo from Valentin:
    http://valentin.dasdeck.com/xtras/ar_xtra/win/arxtra_win_v019.zip
    It could help.
    Brahim AYI
