Receiving Video RTP Stream (JMF) in J2ME (MMAPI) - URGENT!!!

Hi Folks...
I'm trying to develop an application that sends the images from a webcam connected to the computer to a PDA, so that the images can be viewed by the user...
My code for the JMF RTP video stream is as follows:
Processor proc = null;
javax.media.protocol.DataSource ds = null;
TrackControl[] tc = null;
boolean encodingOk = false;
Vector<javax.media.protocol.DataSource> datasources = new Vector<javax.media.protocol.DataSource>();

for (int x = 0; x < camerasInfo.length; x++) {
    try {
        proc = Manager.createProcessor(camerasInfo[x].getLocator());
    } catch (NoProcessorException e) {
        System.out.println("Error instantiating PROCESSOR: " + e);
    } catch (IOException e) {
        System.out.println("Error instantiating PROCESSOR: " + e);
    }

    proc.configure();
    try {
        Thread.sleep(2000); // crude wait for the Configured state
    } catch (InterruptedException e1) {
        e1.printStackTrace();
    }

    proc.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));

    tc = proc.getTrackControls();
    for (int y = 0; y < tc.length; y++) {
        if (!encodingOk && tc[y] instanceof FormatControl) {
            if (((FormatControl) tc[y]).setFormat(new VideoFormat(VideoFormat.RGB)) != null) {
                tc[y].setEnabled(true);
                encodingOk = true; // RGB format accepted on this track
            } else {
                tc[y].setEnabled(false);
            }
        } else {
            tc[y].setEnabled(false);
        }
    }

    proc.realize();
    try {
        Thread.sleep(2000); // crude wait for the Realized state
    } catch (InterruptedException e1) {
        e1.printStackTrace();
    }

    try {
        ds = proc.getDataOutput();
    } catch (NotRealizedError e) {
        System.out.println("ERROR getting the data output: " + e);
    } catch (ClassCastException e) {
        System.out.println("Error getting the data output: " + e);
    }

    datasources.add(ds);
    System.out.println(ds.getLocator());
    encodingOk = false;
}

MediaLocator ml = new MediaLocator("rtp://10.1.1.100:99/video");
try {
    DataSink datSink = Manager.createDataSink(ds, ml);
    datSink.open();
    datSink.start();
    // NB: proc is never started (proc.start()), so no data will flow yet
} catch (NoDataSinkException e) {
    System.out.println("Error instantiating DataSink: " + e);
} catch (SecurityException e) {
    System.out.println("Error starting DataSink: " + e);
} catch (IOException e) {
    System.out.println("Error starting DataSink: " + e);
}
I'm not sure whether this code is correct... is it?
So... the next part of the system runs on the PDA.
The code that accesses this RTP stream is as follows:
VideoControl c = null;
try {
    player = Manager.createPlayer("rtp://10.1.1.100:99/video");
    c = (VideoControl) player.getControl("VideoControl");
    tela = (Item) c.initDisplayMode(GUIControl.USE_GUI_PRIMITIVE, null);
    player.start();
    append(tela);
} catch (IOException e) {
    str.setText(e.toString());
    append(str);
} catch (MediaException e) {
    str.setText(e.toString());
    append(str);
}
So when the app tries to create a Player for "rtp://10.1.1.100:99/video", a MediaException is thrown:
javax.microedition.media.MediaException: Unable to create Player for the locator: rtp://10.1.1.100:99/video
So... I don't know what is happening =/
Is the error in the PDA module, or in the computer's initialization of the RTP video streaming?
I need to finish this job by next week... so any help is useful.
Waiting for answers
Rodrigo Kerkhoff

First of all: before going on to the J2ME part, make sure the server works before doing anything else! Apparently, it doesn't...
The MediaLocator is generally used to specify a kind of URL describing where the data put into the DataSink should go. In your case, this cannot be just some URL where you want it to act as an RTP server. You'll need to implement that server yourself, I guess.
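For reference, the usual JMF route when there is no standalone RTP server is to hand the Processor's RAW_RTP output to an RTPManager rather than a DataSink. A minimal sketch, assuming ds is the data output of the realized Processor above, with a placeholder PDA address and port (note that RTP data ports are conventionally even, with RTCP on the next odd port, so the port 99 in the locator is itself suspect):

    import java.net.InetAddress;
    import javax.media.protocol.DataSource;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.SendStream;
    import javax.media.rtp.SessionAddress;

    static void transmit(DataSource ds) throws Exception {
        RTPManager mgr = RTPManager.newInstance();
        SessionAddress local = new SessionAddress(InetAddress.getLocalHost(), 10000);
        SessionAddress dest = new SessionAddress(InetAddress.getByName("10.1.1.101"), 10000); // the PDA (placeholder)
        mgr.initialize(local);   // bind the local RTP/RTCP port pair
        mgr.addTarget(dest);     // where the packets go
        SendStream stream = mgr.createSendStream(ds, 0); // first stream of the DataSource
        stream.start();
    }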

Similar Messages

  • How to synchronize audio and video RTP streams

    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In the JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive RTP streams and then I create processors for the incoming streams' DataSources, and they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    How are you supposed to sync audio and video if not with as above ?

    camelstrike wrote:
    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In the JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive RTP streams and then I create processors for the incoming streams' DataSources, and they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?

    The RTP packets are timestamped when they leave, and they are played in order. You can't change the time base on an RTP stream because, for all intents and purposes, it's "live" data. It plays at the speed the transmitting RTP server wants it to play, and it never gets behind because it drops out-of-order and old data packets.
    If your RTP streams aren't synced correctly on the receiving end, it means they aren't being synced on the transmitting end... so you should look into the other side of the equation.
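    For completeness, a minimal sketch of the time-base step quoted from the guide; as explained above, it will fail with IncompatibleTimeBaseException for live (push) RTP players:

        import javax.media.IncompatibleTimeBaseException;
        import javax.media.Player;

        static void syncPlayers(Player driver, Player follower) {
            try {
                follower.setTimeBase(driver.getTimeBase()); // drive both players from one clock
                driver.start();
                follower.start();
            } catch (IncompatibleTimeBaseException e) {
                // push-data-source (live RTP) players cannot accept a foreign time base
            }
        }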

  • Watching a received video through RTP at a bigger size

    Hello,
    so here is the thing,
    My code was based on AVReceive2: I receive an RTP stream, then play the video in a player.
    I tried to change the size of almost everything I could: the player's visual component, the panel it was added to, setBounds, etc... It seems like the visual component gets bigger for a really, really short moment, then goes back to the original streamed video size.
    Is it something I should add in the repaint() function of the player ?
    I just want to watch the video in a bigger size, like a fullscreen mode.
    Do you guys have any ideas? Thanks a lot

    First, you need to make sure you're using lightweight renderers...
    Manager.setHint(Manager.LIGHTWEIGHT_RENDERER, new Boolean(true)); [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/SwingJMF.html]
    Second, you're going to want to add the visual component to the center region of a JPanel with a border layout manager.
    Third, you're going to want to set the size of the outside JPanel to be the size you want the video to be. Resizing the outside JPanel should cause the video to resize...
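    A minimal sketch of that arrangement, assuming an already-realized Player named player (the 800x600 size is a placeholder):

        import java.awt.BorderLayout;
        import java.awt.Dimension;
        import javax.media.Manager;
        import javax.media.Player;
        import javax.swing.JPanel;

        static JPanel wrapVideo(Player player) {
            // the hint should ideally be set before the Player is created
            Manager.setHint(Manager.LIGHTWEIGHT_RENDERER, Boolean.TRUE);
            JPanel panel = new JPanel(new BorderLayout());
            panel.add(player.getVisualComponent(), BorderLayout.CENTER); // CENTER stretches the child
            panel.setPreferredSize(new Dimension(800, 600)); // the size you want the video to be
            return panel;
        }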

  • Having multiple threads for receiving RTP streams

    Hello,
    Developing an audio conference server, I have come to think that if I manage to separate the different audio receivers that receive the RTP streams, the performance could improve.
    At the moment I have the main program, the main thread let's say, which initializes a new audioRx object for each remote client.
    Would separating the different receivers into different threads improve the application's performance?
    has anyone thought of this? or done something similar?
    Thanks for your help.
    bgl
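    A minimal sketch of the separation described above, with one worker thread per receiver (AudioRx and RemoteClient stand in for the question's own classes and are assumed, not shown); note that JMF's RTP stack already runs its own internal receive threads, so the gain may be modest:

        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;

        // one worker per remote client, so one slow receiver cannot stall the others
        ExecutorService pool = Executors.newCachedThreadPool();
        for (final RemoteClient client : clients) {
            pool.submit(new Runnable() {
                public void run() {
                    new AudioRx(client).receive(); // hypothetical per-client receive loop
                }
            });
        }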

    I need help with the RTP stream from the same port.

  • How do I transmit an RTP stream using only a DataSource?

    I've been following this tutorial: http://members.jcom.home.ne.jp/3117216601/jmf2guide/RTPPresenting.html
    And I have clients that can send me RTP audio streams, which are received in the update method.
    Now, update generates a NewReceiveStreamEvent, which has a DataSource that I can play.
    But what I want to do is send the DataSource back to that client. (The IP address is unknown... but that shouldn't matter, as RTP is two-way.)
    I want to send the DataSource back to the client because, later, we will be routing different clients and who they can talk to and such. But I want to start with the basics.

    captfoss wrote:
    DerNalia wrote:
    But what I want to do is send the DataSource back to that client. (The IP address is unknown... but that shouldn't matter, as RTP is two-way.)
    RTP isn't two-way; RTP is one-way... RTCP is two-way. The data transport is simplex, the QOS/control messages are duplex...
    So you do need to know the IP address of the client in order to send anything to him...
    I want to send the DataSource back to the client, because later, we will be routing different clients and who they can talk to and such. But I want to start with the basics.
    You should just be able to turn around and give the DataSource that's receiving the RTP stream to an RTPManager and broadcast it back out... assuming you're not transcoding it out of its RTP format or anything.

    I don't have the IP address. So is this even possible without switching to RTCP? If so, an example would be awesome. Also, an example of two-way communication with RTCP would be awesome too.
    Does RTCP use the same sort of structure that RTP does? Like, is there a DataSource that could be merged with other DataSources and sent back to where one of the DataSources came from?
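    A minimal sketch of that "turn around and rebroadcast" suggestion, assuming an RTPManager (echoMgr here, a placeholder) that was initialized and targeted at the client once its address became known:

        import javax.media.protocol.DataSource;
        import javax.media.rtp.SendStream;
        import javax.media.rtp.event.NewReceiveStreamEvent;
        import javax.media.rtp.event.ReceiveStreamEvent;

        public void update(ReceiveStreamEvent evt) {
            if (evt instanceof NewReceiveStreamEvent) {
                try {
                    DataSource ds = evt.getReceiveStream().getDataSource(); // still RTP-formatted
                    SendStream back = echoMgr.createSendStream(ds, 0);      // echoMgr targets the client
                    back.start();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }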

  • Save an RTP stream to a file

    hi,
    I can send and receive an RTP stream. Now I want to save a received RTP stream to a file, but I got an exception when creating a DataSink:
    public synchronized void update(ReceiveStreamEvent evt) {
        RTPManager mgr = (RTPManager) evt.getSource();
        Participant participant = evt.getParticipant();
        ReceiveStream stream = evt.getReceiveStream();
        if (evt instanceof NewReceiveStreamEvent) {
            DataSource ds = stream.getDataSource();
        }
    }

    Well gee whiz, I sure would like to help you with the exception from your DataSink...
    But you didn't post what exception you were getting, so I can't help you...
    And the code you posted has nothing to do with a DataSink to begin with, so I can't help you again...
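    For what it's worth, a sketch of what saving a received stream could look like: the RTP-formatted DataSource is first run through a Processor to produce a file-writable format, then handed to a file DataSink (the .avi container and path are placeholder choices):

        import javax.media.*;
        import javax.media.protocol.*;

        static DataSink saveToFile(DataSource rtpSource) throws Exception {
            // transcode the RTP-formatted source into a file container first
            Processor p = Manager.createRealizedProcessor(new ProcessorModel(
                    rtpSource, null, new FileTypeDescriptor(FileTypeDescriptor.MSVIDEO)));
            DataSink sink = Manager.createDataSink(p.getDataOutput(),
                    new MediaLocator("file://capture.avi")); // placeholder path
            sink.open();
            sink.start();
            p.start();
            return sink;
        }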

  • Retransmitting the received RTP streams

    hi !
    Program A transmits media data to another program B using RTPSessionMgr. What B has to do is send the received streams to another receiver, program C, which has to play the stream. That is, I am trying to implement a client-router-server model.
    So I maintained a session between A & B and also between B & C. Once a NewReceiveStream event occurs at B, it retrieves the DataSource from the stream, uses the createSendStream() method of the session manager, and sends it to C.
    My problem is that C receives the audio and video streams but never plays them. I get a pink screen for video and no audio.
    Can somebody help me out in pointing out my mistake, or tell me a new strategy to implement this?
    Actually, I have tried to use only datagram sockets on the router side, i.e. I created 2 sockets for B to receive data and forwarded the data to C using 2 other sockets. And this did work. But the client does not know the sender details, and I need to maintain the sender/receiver reports. So I went for the session manager (RTPManager/RTPSessionMgr).
    Kindly help.

    Hi all!
    Nice to meet you. It is very exciting that I got a project integrating JMF and JXTA: it gets RTP streams from the JMF framework, sends them to the JXTA framework, and on to any peer in some groups for visualization. Some aspects of this are similar to yours, so would you mind adding me to your MSN contact list (my MSN account is: [email protected])?

  • Multiple RTP streams + local video Player - S.O.S.

    Hi guys,
    I ran out of ideas, so I need your help now. It's gonna be a long one...
    Given:
    MediaLocator(vfw://0) --> DataSource(video) --> Processor(video)
    Then I send the processor's output (videoProcessor.getDataOutput()) over RTP to multiple destinations. Of course, I am using cloneable DataSources for this, because there's no other way. So, now I also need to have a local video feed just for self-reference, but it seems to be a tough one! Let me go over what I've tried till now:
    1. Using a DataSource clone from the video processor, I tried to create a Player (i.e. Manager.createPlayer(clonedDS)). BOOM!
    javax.media.NoPlayerException: Cannot find a Player for: com.ibm.media.protocol.SuperCloneableDataSource$PushBufferDataSourceSlave@126c6ea
         at javax.media.Manager.createPlayerForSource(Manager.java:1512)
         at javax.media.Manager.createPlayer(Manager.java:500)
         at org.interlab.mc.media.MediaManager.setVideoProcessorEnabled(MediaManager.java:2046)
         at org.interlab.mc.media.MediaManager.openVideoStreams(MediaManager.java:1807)
         at org.interlab.mc.UserAgent.callStateChanged(UserAgent.java:2516)
         at org.interlab.mc.call.Call.fireCallStateChangedEvent(Call.java:366)
         at org.interlab.mc.call.Call.setState(Call.java:244)
         at org.interlab.mc.call.CallManager.processInviteOK(CallManager.java:330)
         at org.interlab.mc.UserAgent.processResponse(UserAgent.java:1632)
         at gov.nist.javax.sip.EventScanner.deliverEvent(EventScanner.java:288)
         at gov.nist.javax.sip.EventScanner.run(EventScanner.java:489)
         at java.lang.Thread.run(Unknown Source)
    So I checked. If I create a clone from my DataSource(video), then Manager would return this instance:
    class com.sun.media.multiplexer.RawBufferMux$RawBufferDataSource
    But if, like now, I create a clone from the video processor's DataSource (i.e. Processor.getDataOutput()), then I get this instance:
    com.ibm.media.protocol.SuperCloneableDataSource$PushBufferDataSourceSlave
    Hope you noticed the difference. :) Now, from the latter one I can create neither a Player nor a Processor. Too bad. So, I tried another one.
    2. I took my DataSource(video), created a cloneable (just like in Clone.java) and from there I finally got my precious Player! Yes!! But not so fast... The Player was working, but the rest now wasn't. Once I had my DataSource(video) cloned, my Processor(video)'s data output got starved - I could not send video anymore. What a life?! Fine... Let's try another one.
    3. Just like I send video remotely, I decided to send the video locally in a loop, for instance from 203.159.2.3:25000 to 203.159.2.3:25002. And what do you think? It still doesn't work!! This "loop" stream is not detected (i.e. the controller's update doesn't sense anything). But if I open JMStudio and start listening (i.e. open an RTP session) on 203.159.2.3:25002, I get my stream showing nice and clear!
    Someone, anyone, please, help! Point my nose to some document, piece of code - anything to make this work.
    Kind regards.

    I'd better post here, so everyone else can get it too. So here it goes:
    That's the RTPSocketAdapter:
    import java.io.IOException;
    import java.net.InetAddress;
    import java.net.DatagramSocket;
    import java.net.MulticastSocket;
    import java.net.DatagramPacket;
    import java.net.SocketException;
    import javax.media.protocol.PushSourceStream;
    import javax.media.protocol.ContentDescriptor;
    import javax.media.protocol.SourceTransferHandler;
    import javax.media.rtp.RTPConnector;
    import javax.media.rtp.OutputDataStream;

    /**
     * An implementation of RTPConnector based on UDP sockets.
     */
    public class RTPSocketAdapter implements RTPConnector {

        DatagramSocket dataSock = null;
        DatagramSocket ctrlSock = null;
        InetAddress remoteAddress = null;
        int remotePort = 0;
        InetAddress localAddress = null;
        int localPort = 0;
        SockInputStream dataInStrm = null, ctrlInStrm = null;
        SockOutputStream dataOutStrm = null, ctrlOutStrm = null;

        public RTPSocketAdapter(InetAddress localAddress, int localPort,
                                InetAddress remoteAddress, int remotePort) throws IOException {
            this(localAddress, localPort, remoteAddress, remotePort, 1);
        }

        public RTPSocketAdapter(InetAddress localAddress, int localPort,
                                InetAddress remoteAddress, int remotePort, int ttl) throws IOException {
            try {
                if (remoteAddress.isMulticastAddress()) {
                    dataSock = new MulticastSocket(localPort);
                    ctrlSock = new MulticastSocket(localPort + 1);
                    ((MulticastSocket) dataSock).joinGroup(remoteAddress);
                    ((MulticastSocket) dataSock).setTimeToLive(ttl);
                    ((MulticastSocket) ctrlSock).joinGroup(remoteAddress);
                    ((MulticastSocket) ctrlSock).setTimeToLive(ttl);
                } else {
                    dataSock = new DatagramSocket(localPort, localAddress);
                    ctrlSock = new DatagramSocket(localPort + 1, localAddress);
                }
            } catch (SocketException e) {
                throw new IOException(e.getMessage());
            }
            this.localAddress = localAddress;
            this.localPort = localPort;
            this.remoteAddress = remoteAddress;
            this.remotePort = remotePort;
        }

        /**
         * Returns an input stream to receive the RTP data.
         */
        public PushSourceStream getDataInputStream() throws IOException {
            if (dataInStrm == null) {
                dataInStrm = new SockInputStream(dataSock, remoteAddress, remotePort);
                dataInStrm.start();
            }
            return dataInStrm;
        }

        /**
         * Returns an output stream to send the RTP data.
         */
        public OutputDataStream getDataOutputStream() throws IOException {
            if (dataOutStrm == null)
                dataOutStrm = new SockOutputStream(dataSock, remoteAddress, remotePort);
            return dataOutStrm;
        }

        /**
         * Returns an input stream to receive the RTCP data.
         */
        public PushSourceStream getControlInputStream() throws IOException {
            if (ctrlInStrm == null) {
                ctrlInStrm = new SockInputStream(ctrlSock, remoteAddress, remotePort + 1);
                ctrlInStrm.start();
            }
            return ctrlInStrm;
        }

        /**
         * Returns an output stream to send the RTCP data.
         */
        public OutputDataStream getControlOutputStream() throws IOException {
            if (ctrlOutStrm == null)
                ctrlOutStrm = new SockOutputStream(ctrlSock, remoteAddress, remotePort + 1);
            return ctrlOutStrm;
        }

        /**
         * Close all the RTP, RTCP streams.
         */
        public void close() {
            if (dataInStrm != null)
                dataInStrm.kill();
            if (ctrlInStrm != null)
                ctrlInStrm.kill();
            dataSock.close();
            ctrlSock.close();
        }

        /**
         * Set the receive buffer size of the RTP data channel.
         * This is only a hint to the implementation. The actual implementation
         * may not be able to do anything to this.
         */
        public void setReceiveBufferSize(int size) throws IOException {
            dataSock.setReceiveBufferSize(size);
        }

        /**
         * Get the receive buffer size set on the RTP data channel.
         * Return -1 if the receive buffer size is not applicable for
         * the implementation.
         */
        public int getReceiveBufferSize() {
            try {
                return dataSock.getReceiveBufferSize();
            } catch (Exception e) {
                return -1;
            }
        }

        /**
         * Set the send buffer size of the RTP data channel.
         * This is only a hint to the implementation. The actual implementation
         * may not be able to do anything to this.
         */
        public void setSendBufferSize(int size) throws IOException {
            dataSock.setSendBufferSize(size);
        }

        /**
         * Get the send buffer size set on the RTP data channel.
         * Return -1 if the send buffer size is not applicable for
         * the implementation.
         */
        public int getSendBufferSize() {
            try {
                return dataSock.getSendBufferSize();
            } catch (Exception e) {
                return -1;
            }
        }

        /**
         * Return the RTCP bandwidth fraction. This value is used to
         * initialize the RTPManager. Check RTPManager for more details.
         * Return -1 to use the default values.
         */
        public double getRTCPBandwidthFraction() {
            return -1;
        }

        /**
         * Return the RTCP sender bandwidth fraction. This value is used to
         * initialize the RTPManager. Check RTPManager for more details.
         * Return -1 to use the default values.
         */
        public double getRTCPSenderBandwidthFraction() {
            return -1;
        }

        /**
         * An inner class to implement an OutputDataStream based on UDP sockets.
         */
        class SockOutputStream implements OutputDataStream {

            DatagramSocket sock;
            InetAddress addr;
            int port;

            public SockOutputStream(DatagramSocket sock, InetAddress addr, int port) {
                this.sock = sock;
                this.addr = addr;
                this.port = port;
            }

            public int write(byte data[], int offset, int len) {
                try {
                    sock.send(new DatagramPacket(data, offset, len, addr, port));
                } catch (Exception e) {
                    return -1;
                }
                return len;
            }
        }

        /**
         * An inner class to implement a PushSourceStream based on UDP sockets.
         */
        class SockInputStream extends Thread implements PushSourceStream {

            DatagramSocket sock;
            InetAddress addr;
            int port;
            boolean done = false;
            boolean dataRead = false;
            SourceTransferHandler sth = null;

            public SockInputStream(DatagramSocket sock, InetAddress addr, int port) {
                this.sock = sock;
                this.addr = addr;
                this.port = port;
            }

            public int read(byte buffer[], int offset, int length) {
                DatagramPacket p = new DatagramPacket(buffer, offset, length, addr, port);
                try {
                    sock.receive(p);
                } catch (IOException e) {
                    return -1;
                }
                synchronized (this) {
                    dataRead = true;
                    notify();
                }
                return p.getLength();
            }

            public synchronized void start() {
                super.start();
                if (sth != null) {
                    dataRead = true;
                    notify();
                }
            }

            public synchronized void kill() {
                done = true;
                notify();
            }

            public int getMinimumTransferSize() {
                return 2 * 1024;     // twice the MTU size, just to be safe.
            }

            public synchronized void setTransferHandler(SourceTransferHandler sth) {
                this.sth = sth;
                dataRead = true;
                notify();
            }

            // Not applicable.
            public ContentDescriptor getContentDescriptor() {
                return null;
            }

            // Not applicable.
            public long getContentLength() {
                return LENGTH_UNKNOWN;
            }

            // Not applicable.
            public boolean endOfStream() {
                return false;
            }

            // Not applicable.
            public Object[] getControls() {
                return new Object[0];
            }

            // Not applicable.
            public Object getControl(String type) {
                return null;
            }

            /**
             * Loop and notify the transfer handler of new data.
             */
            public void run() {
                while (!done) {
                    synchronized (this) {
                        while (!dataRead && !done) {
                            try {
                                wait();
                            } catch (InterruptedException e) { }
                        }
                        dataRead = false;
                    }
                    if (sth != null && !done) {
                        sth.transferData(this);
                    }
                }
            }
        }
    }

    See how to create the RTPManager in the next post.
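    That follow-up post isn't included in this thread dump; a minimal sketch of wiring the adapter into an RTPManager (reusing the loop addresses from the question above) would be along these lines:

        import java.net.InetAddress;
        import javax.media.rtp.RTPManager;

        static RTPManager createManager() throws Exception {
            RTPSocketAdapter connector = new RTPSocketAdapter(
                    InetAddress.getLocalHost(), 25000,
                    InetAddress.getByName("203.159.2.3"), 25002);
            RTPManager mgr = RTPManager.newInstance();
            mgr.initialize(connector); // all RTP/RTCP traffic now flows through the custom sockets
            // then mgr.createSendStream(...) to transmit, or add a ReceiveStreamListener to receive
            return mgr;
        }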

  • Get an event for a new received RTP stream in a player

    Hi!
    I'm trying to implement an RTP player that receives an AV stream and plays it. The special thing about this player should be that even if the stream is interrupted, the player waits on the same IP and port for a new stream and opens it in the SAME frame (not like JMStudio, in a new window).
    I try to catch the ReceiveStreamEvent so I can restart the player, but I don't get any events for this. I tried to do it with an RTPManager, but I don't know how.
    Does anybody have an example of how to get the ReceiveStreamEvent, so I know the RTP stream has been interrupted?
    Thanks
    Adam

    See AVReceive2 in the JMF Solutions, on the JMF web site.
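    In outline, what AVReceive2 does is register a ReceiveStreamListener on the RTPManager; a minimal sketch of catching both new streams and end-of-stream events:

        import javax.media.rtp.ReceiveStreamListener;
        import javax.media.rtp.event.ByeEvent;
        import javax.media.rtp.event.InactiveReceiveStreamEvent;
        import javax.media.rtp.event.NewReceiveStreamEvent;
        import javax.media.rtp.event.ReceiveStreamEvent;

        class StreamWatcher implements ReceiveStreamListener {
            public void update(ReceiveStreamEvent evt) {
                if (evt instanceof NewReceiveStreamEvent) {
                    // a (new) stream arrived: (re)create the Player in the existing frame
                } else if (evt instanceof ByeEvent || evt instanceof InactiveReceiveStreamEvent) {
                    // the sender left or went quiet: dispose the Player, keep waiting
                }
            }
        }
        // registration on an initialized manager: mgr.addReceiveStreamListener(new StreamWatcher());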

  • Can a PC outside the subnet, but in the same LAN, receive the video stream?

    Can a PC outside the subnet, but in the same LAN, receive the video stream?
    In the LAN, the host sets the transmit IP to something like 10.13.22.255; all the PCs with IP addresses from 10.13.22.1 to 10.13.22.255 can receive the video stream.
    But how can a PC outside this subnet, such as one with IP 10.14.14.22, receive it?
    Thanks a lot in advance.
    Myshen

    does anyone know?
    help me please.
    Thanks very much!
    Myshen

  • Problem creating an RTP stream on Linux using JMStudio

    Hello All,
    I am trying to use JMStudio in order to create an RTP stream, but I have a problem with capture.
    I ran jmfinit and got the following messages:
    JavaSound Capture Supported = true
    JavaSoundAuto: Committed ok
    java.lang.Error: Can't open video card 0
    java.lang.Error: Can't open video card 1
    java.lang.Error: Can't open video card 2
    java.lang.Error: Can't open video card 3
    java.lang.Error: Can't open video card 4
    java.lang.Error: Can't open video card 5
    java.lang.Error: Can't open video card 6
    java.lang.Error: Can't open video card 7
    java.lang.Error: Can't open video card 8
    java.lang.Error: Can't open video card 9
    It seems to me that I have no direct sound capture device, therefore I cannot create the RTP package stream.
    How can I add direct sound capture to my capture devices?
    Is there any suggestion for this issue?
    Kind regards
    BEKIR BALCIK
    ARGELA TECHNOLOGIES


  • How to extract data from a Buffer and create an RTP stream

    Hi
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    Is it possible to create an RTP stream, only having access to the byte array "data" in the Buffer?
    Thanks in advance.

    camelstrike wrote:
    Hi
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    There are a couple of different ways to get the data. Reading it from inside a DataSink is perfectly fine...
    Is it possible to create an RTP stream, only having access to the byte array "data" in the Buffer?
    Yes and no.
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/CustomPayload.html]
    You need to know the format of the media in addition to the actual media data...
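    To make that "yes and no" concrete: a sketch of rebuilding a Buffer on the receiving side from the transported bytes; the format (JPEG/RTP here is a placeholder) and the timing must travel with, or be agreed alongside, the data:

        import javax.media.Buffer;
        import javax.media.format.VideoFormat;

        Buffer buf = new Buffer();
        buf.setFormat(new VideoFormat(VideoFormat.JPEG_RTP)); // must match the sender's real format
        buf.setData(data);             // byte[] received over the custom protocol
        buf.setOffset(0);
        buf.setLength(data.length);
        buf.setTimeStamp(timeStamp);   // timing information also has to be transported
        // a custom PushBufferStream can then feed such Buffers to a Processor or RTPManager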

  • Writing a conference server for RTP streams

    Hello,
    I'm trying to write a conference server which accepts multiple RTP streams (one for each participant), creates a mixed RTP stream of all other participants and sends that stream back to each participant.
    For 2 participants, I was able to correctly receive and send the stream of the other participant to each party.
    For 3 participants, creating the merging data source does not seem to work - i.e. no data is received by the participants.
    I tried creating cloneable data sources instead, thinking that this may be the root cause, but when creating cloneable data sources from incoming RTP sources, I am unable to get the Processor into the Configured state; it seems to deadlock. Here's the code outline:
        Iterator pIt = participants.iterator();
        List dataSources = new ArrayList();
        while (pIt.hasNext()) {
            Party p = (Party) pIt.next();
            if (p != dest) {
                DataSource ds = p.getDataSource();
                DataSource cds = Manager.createCloneableDataSource(ds);
                DataSource clone = ((SourceCloneable) cds).createClone();
                dataSources.add(clone);
            }
        }
        Object[] sources = dataSources.toArray(new DataSource[0]);
        DataSource dataSource = Manager.createMergingDataSource((DataSource[]) sources);
        Processor p = Manager.createProcessor(dataSource);
        MixControllerListener cl = new MixControllerListener();
        p.addControllerListener(cl);
        // Put the Processor into the configured state.
        p.configure();
        if (!cl.waitForState(p, p.Configured)) {
            System.err.println("Failed to configure the processor.");
            assert false;
        }

    Here are a couple of stack traces:
    "RTPEventHandler" daemon prio=1 tid=0x081d6828 nid=0x3ea6 in Object.wait() [98246000..98247238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f37e4a8> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at demo.Mixer$MixControllerListener.waitForState(Mixer.java:248)
            - locked <0x9f37e4a8> (a java.lang.Object)
            at demo.Mixer.createMergedDataSource(Mixer.java:202)
            at demo.Mixer.createSendStreams(Mixer.java:165)
            at demo.Mixer.createSendStreamsWhenAllJoined(Mixer.java:157)
            - locked <0x9f481840> (a demo.Mixer)
            at demo.Mixer.update(Mixer.java:123)
            at com.sun.media.rtp.RTPEventHandler.processEvent(RTPEventHandler.java:62)
            at com.sun.media.rtp.RTPEventHandler.dispatchEvents(RTPEventHandler.java:96)
            at com.sun.media.rtp.RTPEventHandler.run(RTPEventHandler.java:115)
    "JMF thread: com.sun.media.ProcessEngine@a3c5b6[ com.sun.media.ProcessEngine@a3c5b6 ] ( configureThread)" daemon prio=1 tid=0x082fe3c8 nid=0x3ea6 in Object.wait() [977e0000..977e1238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f387560> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at com.sun.media.parser.RawBufferParser$FrameTrack.parse(RawBufferParser.java:247)
            - locked <0x9f387560> (a java.lang.Object)
            at com.sun.media.parser.RawBufferParser.getTracks(RawBufferParser.java:112)
            at com.sun.media.BasicSourceModule.doRealize(BasicSourceModule.java:180)
            at com.sun.media.PlaybackEngine.doConfigure1(PlaybackEngine.java:229)
            at com.sun.media.ProcessEngine.doConfigure(ProcessEngine.java:43)
            at com.sun.media.ConfigureWorkThread.process(BasicController.java:1370)
            at com.sun.media.StateTransitionWorkThread.run(BasicController.java:1339)
    "JMF thread" daemon prio=1 tid=0x080db410 nid=0x3ea6 in Object.wait() [97f41000..97f41238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Object.wait(Object.java:429)
            at com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave.run(CloneableSourceStreamAdapter.java:375)
            - locked <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Thread.run(Thread.java:534)

    Any ideas?
    Thanks,
    Jarek

    bgl,
    I was able to get past the cloning issue by following the Clone.java example to the letter :)
    It turns out that the cloneable data source must be added as a send stream first, and then the cloned data source. Now, for each party in the call, the conference server does the following:
        Party(RTPManager mgr, DataSource ds) {
            this.mgr = mgr;
            this.ds = Manager.createCloneableDataSource(ds);
        }

        synchronized DataSource cloneDataSource() {
            DataSource retVal;
            if (getNeedsCloning()) {
                retVal = ((SourceCloneable) ds).createClone();
            } else {
                retVal = ds;
                setNeedsCloning();
            }
            return retVal;
        }

        private void setNeedsCloning() {
            needsCloning = true;
        }

        private boolean getNeedsCloning() {
            return needsCloning;
        }

        private synchronized void addSendStreamFromNewParticipant(Party newOne) throws UnsupportedFormatException, IOException {
            debug("*** - New one joined. Creating the send streams. Curr count :" + participants.size());
            Iterator pIt = participants.iterator();
            while (pIt.hasNext()) {
                Party p = (Party) pIt.next();
                assert p != newOne;
                // update existing participant
                SendStream sendStream = p.getMgr().createSendStream(newOne.cloneDataSource(), 0);
                sendStream.start();
                // send data from existing participant to the new one
                sendStream = newOne.getMgr().createSendStream(p.cloneDataSource(), 0);
                sendStream.start();
            }
            debug("*** - Done creating the streams.");
        }

    So I made some progress, but I'm still not quite there.
    The RTP manager JavaDoc for createSendStream states the following :
    * This method is used to create a sending stream within the RTP
    * session. For each time the call is made, a new sending stream
    * will be created. This stream will use the SDES items as entered
    * in the initialize() call for all its RTCP messages. Each stream
    * is sent out with a new SSRC (Synchronisation SouRCe
    * identifier), but from the same participant i.e. local
    * participant. <BR>
    For 3 participants, my conference server creates 2 send streams to every one of them, so I'd expect 2 SSRCs on the wire. Examining the RTP packets in Ethereal, I only see 1 SSRC, as if the 2nd createSendStream call failed. Consequently, each participant in the conference is able to receive voice from only 1 other participant, even though I create an RTPManager instance for each participant and add 2 send streams.
    Any ideas ?
    Thanks,
    Jarek

  • RTP stream applet

    Hi !!
    I'm new to JMF. I'd like to play an RTP stream on a remote machine by using an applet. I've searched a lot for a Java applet which could receive an RTP stream, but couldn't find easy-to-understand tutorial code etc.
    If you have any good links, please post them on the forum or paste code.
    Thank you for any answers!!!

    an applet needs nothing great ...
    As someone who ..
    - has deployed 1.1-1.4 compatible applets,
    applets that interact with JavaScript, full (browser)
    window applets, applets that load screensavers
    and launch applications, web-started applets,
    applets that change PLAF..
    - has dealt with (literally) hundreds of applet
    related problems on usenet, including the
    investigation of applet bugs, and (particularly)
    bugs caused by the MSVM.
    ..I can assure you.
    1) Applets are anything but easy to deploy
    successfully on the wilds of the internet.
    2) As someone who has dealt a little with the
    JMF, I can add that JMF applets add a whole
    new level of complications to the mix.
    Generally, the end user needs to have
    pre-installed JMF in order to see JMF
    based applets. Even then, the browser
    might not detect the JMF classes.
    It is generally far more sensible, and reliable,
    to launch such projects in a free floating frame
    using web-start.
    Here is an example of a webstart launch.
    <http://www.javasaver.com/testjs/jmf/#test3>
    Note that any such web-start launch might be
    optimised in a variety of ways (to download less).
    The most obvious would be to use the
    customizer.jar to make a cut-down version
    that handles only the required sources (e.g. RTP)
    and formats (e.g. MOV).

  • Video RTP transmission problem

    Hey guys (and girls),
    I'm writing two programs that either send or receive an RTP video stream captured with a webcam. Using JMStudio I can capture and transmit the video and happily receive it with my client program, but when I try to capture and transmit the video using my server program, the client can't realize because it hasn't got any data.
    Here is my transmitter code; my server class creates a new instance of the camTransmitter class.
    import java.io.*;
    import java.awt.*;
    import java.util.Vector;
    import javax.swing.*;
    import java.awt.event.*;
    import javax.media.*;
    import javax.media.format.*;
    import javax.media.protocol.*;
    import javax.media.control.*;
    import javax.media.ControllerListener;
    import javax.media.rtp.*;
    import javax.media.rtp.rtcp.*;
    import javax.media.rtp.event.*;
    import java.net.InetAddress;

    public class camTransmitter implements ControllerListener {

        Processor p = null;
        CaptureDeviceInfo di = null;
        boolean configured, realized = false;
        public String ipAddress = "192.168.123.150";
        public int port = 22222;
        public RTPManager rtpMgrs[];
        public DataSource vDS;
        public SessionAddress localAddr, destAddr;
        public InetAddress ipAddr;
        public SendStream sendStream;

        public camTransmitter() {
            // First, we'll need a DataSource that captures live video
            Format vFormat = new VideoFormat(VideoFormat.RGB);
            Vector devices = CaptureDeviceManager.getDeviceList(vFormat);

            if (devices.size() > 0) {
                di = (CaptureDeviceInfo) devices.elementAt(0);
            } else {
                // exit if we could not find the relevant capture device.
                System.out.println("no devices");
                System.exit(-1);
            }

            MediaLocator camLocation = di.getLocator();

            // Create a processor for this capture device & exit if we cannot create it
            try {
                vDS = Manager.createDataSource(camLocation);
                p = Manager.createProcessor(vDS);
                p.addControllerListener(this);
            } catch (IOException e) {
                System.exit(-1);
            } catch (NoProcessorException e) {
                System.exit(-1);
            } catch (NoDataSourceException e) {
                System.exit(-1);
            }

            // at this point, we have successfully created the processor.
            // Configure it and busy-wait until it is configured.
            p.configure();
            while (configured != true) { }

            p.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));

            TrackControl track[] = p.getTrackControls();
            boolean encodingOk = false;

            // Go through the tracks and try to program one of them to output RGB data.
            for (int i = 0; i < track.length; i++) {
                if (!encodingOk && track[i] instanceof FormatControl) {
                    if (((FormatControl) track[i]).setFormat(vFormat) == null) {
                        track[i].setEnabled(false);
                    } else {
                        encodingOk = true;
                    }
                } else {
                    // we could not set this track to the desired format, so disable it
                    track[i].setEnabled(false);
                }
            }

            // Realize it and busy-wait until it is realized.
            p.realize();
            while (realized != true) { }

            // get the output DataSource of the processor and exit if we fail
            DataSource ds = null;
            try {
                ds = p.getDataOutput();
            } catch (NotRealizedError e) {
                System.exit(-1);
            }

            PushBufferDataSource pbds = (PushBufferDataSource) ds;
            PushBufferStream pbss[] = pbds.getStreams();
            rtpMgrs = new RTPManager[pbss.length];

            for (int i = 0; i < pbss.length; i++) {
                try {
                    rtpMgrs[i] = RTPManager.newInstance();
                    ipAddr = InetAddress.getByName(ipAddress);
                    localAddr = new SessionAddress(InetAddress.getLocalHost(), port);
                    destAddr = new SessionAddress(ipAddr, port);
                    System.out.println(localAddr);
                    rtpMgrs[i].initialize(destAddr);
                    rtpMgrs[i].addTarget(destAddr);
                    System.out.println("Created RTP session: " + ipAddress + " " + port);
                    sendStream = rtpMgrs[i].createSendStream(ds, i);
                    sendStream.start();
                } catch (Exception e) {
                    System.err.println(e.getMessage());
                }
            }
            // NB: the Processor itself is never started here (p.start()),
            // so no frames are pushed into the send streams.
            //System.out.println("RTP Stream created and running");
        }

        public void controllerUpdate(ControllerEvent e) {
            if (e instanceof ConfigureCompleteEvent) {
                System.out.println("Configure Complete");
                configured = true;
            }
            if (e instanceof RealizeCompleteEvent) {
                System.out.println("Realizing complete");
                realized = true;
            }
        }
    }
    When I run this, the processor realizes and the RTP seems to stream, but I have no idea why I can't receive the video from this.
    Please help if anyone can. Thanks

    Dear andreyvk ,
    I've read your post
    http://forum.java.sun.com/thread.jspa?threadID=785134&tstart=165
    about how to use a single RTP session for both media reception and transmission (I'm referring to your modified RTPSocketAdapter version), but at the moment I receive a BIND error.
    I think that your post is an EXCELLENT solution. I modified AVReceive3 and AVTransmit3 in order to accept all parameters (local IP & port, remote IP & port).
    Can you please give me a simple scenario so I can understand what the mistake is?
    I use AVTransmit3 and AVReceive3 from different prompts, and if I run these 2 classes at the same time on 2 different PCs (172.17.32.27 and 172.17.32.30) I can transmit the media (vfw://0 for example) using AVTransmit3, but I can't receive anything if I also run AVReceive3 on the same PC.
    What's the problem? Furthermore, if I first run AVReceive3 from an MS-DOS prompt and subsequently run AVTransmit3 from another prompt, I see a BIND error (port already in use).
    How can I use your modified RTPSocketAdapter in order to send and receive a single media stream from the same port (e.g. 7500)?
    I've used this scenario - PC1: IP 172.17.32.30, local port 5000, and PC2: IP 172.17.32.27, local port 10000.
    So on PC1 I run:
    AVTransmit3 vfw://0 <Local IP 172.17.32.30> <5000> <Remote IP 172.17.32.27> <10000>
    AVReceive3 <Local IP 172.17.32.30/5000> <Remote IP 172.17.32.27/10000>
    and on PC2:
    AVTransmit3 vfw://0 <Local IP 172.17.32.27> <10000> <Remote IP 172.17.32.30> <5000>
    AVReceive3 <Local IP 172.17.32.27/10000> <Remote IP 172.17.32.30/5000>
    I'd like to use the same port 5000 (on PC1) and 10000 (on PC2) in order to transmit and receive RTP packets. How can I do that without receiving a bind error? How can I receive packets (and play the media, if audio and/or video) from the same port used to send the stream over the network?
    How can I obtain an RTP symmetric transmission/reception solution?
    Please give me a hint. If you can't post, this is my email: [email protected]
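    For the record, the bind error comes from running transmit and receive as two separate processes that each try to open the same local port; a sketch of the symmetric setup using one RTPManager per PC, so the port is bound exactly once and used for both directions (addresses taken from the PC1 scenario above):

        import java.net.InetAddress;
        import javax.media.rtp.RTPManager;
        import javax.media.rtp.SessionAddress;

        static RTPManager openSymmetricSession() throws Exception {
            RTPManager mgr = RTPManager.newInstance();
            SessionAddress local = new SessionAddress(InetAddress.getByName("172.17.32.30"), 5000);
            SessionAddress remote = new SessionAddress(InetAddress.getByName("172.17.32.27"), 10000);
            mgr.initialize(local);  // single bind of port 5000 in this process
            mgr.addTarget(remote);
            // use mgr.createSendStream(ds, 0) to transmit and
            // mgr.addReceiveStreamListener(...) to receive on the same session
            return mgr;
        }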
