Mixing audio RTP streams for a conference

I'm trying to create a telephone conference application with JMF.
My problem:
I need to mix multiple incoming RTP streams (audio, G711_ulaw) into one outgoing RTP stream.
If I use a MergingDataSource I can get one DataSource with multiple tracks, but I'm not able to send them out together over one RTP session.
Is there any way to do this?
I have also thought about using RawBufferMux or WAVMux to multiplex the tracks contained in the MergingDataSource, but I can't find out how to use them.
Or is there an easier way to create a conference over RTP?
I need to connect to Cisco IP phones, so I can only use the G.711 u-law codec.
Is there anyone out there who can help me?

I'm sorry, but I ran into this problem in the past and found it impossible to resolve.
I also thought about MergingDataSource, about the Many2One class and more and more, but found nothing I could use. And nobody could answer this question in this forum half a year ago.
Oh, I nearly forgot: it is possible to write merged streams to a file, so it seems possible to write them to a file and then transmit to the net from that file. But how do you manage more than one Processor?..
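For whatever it's worth, here is a minimal sketch of the transmit side as I understand it, assuming the per-caller DataSources have already been collected (the addresses, the port, and the incoming array are placeholders). One caveat: a MergingDataSource only bundles the tracks side by side, so each track still goes out as its own send stream within the single session; true sample-level mixing would need a custom mixing Effect, which stock JMF does not provide.

    import javax.media.*;
    import javax.media.format.AudioFormat;
    import javax.media.protocol.*;
    import javax.media.rtp.*;
    import java.net.InetAddress;

    public class MergeAndTransmit {
        // Merge the per-caller DataSources and push them out over a single RTP session.
        public static void send(DataSource[] incoming) throws Exception {
            DataSource merged = Manager.createMergingDataSource(incoming);

            // Force every track to G.711 u-law RTP payload, raw RTP output.
            Format[] formats = new Format[incoming.length];
            for (int i = 0; i < formats.length; i++)
                formats[i] = new AudioFormat(AudioFormat.ULAW_RTP, 8000, 8, 1);
            Processor p = Manager.createRealizedProcessor(new ProcessorModel(
                    merged, formats, new ContentDescriptor(ContentDescriptor.RAW_RTP)));

            PushBufferDataSource out = (PushBufferDataSource) p.getDataOutput();

            RTPManager mgr = RTPManager.newInstance();
            mgr.initialize(new SessionAddress(InetAddress.getLocalHost(), 42050));
            mgr.addTarget(new SessionAddress(InetAddress.getByName("192.168.0.20"), 42050));

            // One send stream per merged track -- they share the session
            // but keep separate SSRCs.
            for (int i = 0; i < out.getStreams().length; i++)
                mgr.createSendStream(out, i).start();
            p.start();
        }
    }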

Similar Messages

  • Writing a conference server for RTP streams

    Hello,
    I'm trying to write a conference server which accepts multiple RTP streams (one for each participant), creates a mixed RTP stream of all other participants and sends that stream back to each participant.
    For 2 participants, I was able to correctly receive and send the stream of the other participant to each party.
    For 3 participants, creating the merging data source does not seem to work - i.e. no data is received by the participants.
    I tried creating cloneable data sources instead, thinking that this may be the root cause, but when creating cloneable data sources from incoming RTP sources, I am unable to get the Processor into the Configured state; it seems to deadlock. Here's the code outline:
        Iterator pIt = participants.iterator();
        List dataSources = new ArrayList();
        while (pIt.hasNext()) {
          Party p = (Party) pIt.next();
          if (p != dest) {
            DataSource ds = p.getDataSource();
            DataSource cds = Manager.createCloneableDataSource(ds);
            DataSource clone = ((SourceCloneable) cds).createClone();
            dataSources.add(clone);
          }
        }
        Object[] sources = dataSources.toArray(new DataSource[0]);
        DataSource dataSource = Manager.createMergingDataSource((DataSource[]) sources);
        Processor p = Manager.createProcessor(dataSource);
        MixControllerListener cl = new MixControllerListener();
        p.addControllerListener(cl);
        // Put the Processor into the configured state.
        p.configure();
        if (!cl.waitForState(p, Processor.Configured)) {
            System.err.println("Failed to configure the processor.");
            assert false;
        }
    Here are a couple of stack traces:
    "RTPEventHandler" daemon prio=1 tid=0x081d6828 nid=0x3ea6 in Object.wait() [98246000..98247238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f37e4a8> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at demo.Mixer$MixControllerListener.waitForState(Mixer.java:248)
            - locked <0x9f37e4a8> (a java.lang.Object)
            at demo.Mixer.createMergedDataSource(Mixer.java:202)
            at demo.Mixer.createSendStreams(Mixer.java:165)
            at demo.Mixer.createSendStreamsWhenAllJoined(Mixer.java:157)
            - locked <0x9f481840> (a demo.Mixer)
            at demo.Mixer.update(Mixer.java:123)
            at com.sun.media.rtp.RTPEventHandler.processEvent(RTPEventHandler.java:62)
            at com.sun.media.rtp.RTPEventHandler.dispatchEvents(RTPEventHandler.java:96)
            at com.sun.media.rtp.RTPEventHandler.run(RTPEventHandler.java:115)
    "JMF thread: com.sun.media.ProcessEngine@a3c5b6[ com.sun.media.ProcessEngine@a3c5b6 ] ( configureThread)" daemon prio=1 tid=0x082fe3c8 nid=0x3ea6 in Object.wait() [977e0000..977e1238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f387560> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at com.sun.media.parser.RawBufferParser$FrameTrack.parse(RawBufferParser.java:247)
            - locked <0x9f387560> (a java.lang.Object)
            at com.sun.media.parser.RawBufferParser.getTracks(RawBufferParser.java:112)
            at com.sun.media.BasicSourceModule.doRealize(BasicSourceModule.java:180)
            at com.sun.media.PlaybackEngine.doConfigure1(PlaybackEngine.java:229)
            at com.sun.media.ProcessEngine.doConfigure(ProcessEngine.java:43)
            at com.sun.media.ConfigureWorkThread.process(BasicController.java:1370)
            at com.sun.media.StateTransitionWorkThread.run(BasicController.java:1339)
    "JMF thread" daemon prio=1 tid=0x080db410 nid=0x3ea6 in Object.wait() [97f41000..97f41238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Object.wait(Object.java:429)
            at com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave.run(CloneableSourceStreamAdapter.java:375)
            - locked <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Thread.run(Thread.java:534)
    Any ideas?
    Thanks,
    Jarek

    bgl,
    I was able to get past the cloning issue by following the Clone.java example to the letter :)
    Turns out that the cloneable data source must be added as a send stream first, and then the cloned data source. Now for each party in the call the conf. server does the following:
    Party(RTPManager mgr, DataSource ds) {
        this.mgr = mgr;
        this.ds = Manager.createCloneableDataSource(ds);
    }

    synchronized DataSource cloneDataSource() {
        DataSource retVal;
        if (getNeedsCloning()) {
            retVal = ((SourceCloneable) ds).createClone();
        } else {
            retVal = ds;
            setNeedsCloning();
        }
        return retVal;
    }

    private void setNeedsCloning() {
        needsCloning = true;
    }

    private boolean getNeedsCloning() {
        return needsCloning;
    }

    private synchronized void addSendStreamFromNewParticipant(Party newOne)
            throws UnsupportedFormatException, IOException {
        debug("*** - New one joined. Creating the send streams. Curr count: " + participants.size());
        Iterator pIt = participants.iterator();
        while (pIt.hasNext()) {
            Party p = (Party) pIt.next();
            assert p != newOne;
            // send data from the new participant to the existing one
            SendStream sendStream = p.getMgr().createSendStream(newOne.cloneDataSource(), 0);
            sendStream.start();
            // send data from the existing participant to the new one
            sendStream = newOne.getMgr().createSendStream(p.cloneDataSource(), 0);
            sendStream.start();
        }
        debug("*** - Done creating the streams.");
    }
    So I made some progress, but I'm still not quite there.
    The RTPManager JavaDoc for createSendStream states the following:
    * This method is used to create a sending stream within the RTP
    * session. For each time the call is made, a new sending stream
    * will be created. This stream will use the SDES items as entered
    * in the initialize() call for all its RTCP messages. Each stream
    * is sent out with a new SSRC (Synchronisation SouRCe
    * identifier), but from the same participant i.e. local
    * participant. <BR>
    For 3 participants, my conf. server creates 2 send streams to every one of them, so I'd expect 2 SSRCs on the wire. Examining the RTP packets in Ethereal, I only see 1 SSRC, as if the 2nd createSendStream call failed. Consequently, each participant in the conference is able to receive voice from only 1 other participant, even though I create an RTPManager instance for each participant and add 2 send streams.
    Any ideas ?
    Thanks,
    Jarek

  • How to synchronize audio and video RTP streams

    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive the RTP streams, and then I create processors for the incoming streams' DataSources; they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    How are you supposed to sync audio and video if not as above?
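    For reference, the mechanism the API Guide describes boils down to a few calls (a sketch only; per the guide the push-source player must be the master, and as the reply below points out, RTP sources are effectively live, so this may still fail):

        import javax.media.*;

        class SyncHelper {
            /** Slave one player to another's clock; both must already be realized. */
            static void sync(Player master, Player slave)
                    throws IncompatibleTimeBaseException {
                // A push-source (e.g. RTP) player cannot adopt a new time base,
                // so it has to be the master here.
                slave.setTimeBase(master.getTimeBase());
                master.addController(slave); // master now drives start/stop/rate
            }
        }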

    camelstrike wrote:
    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive the RTP streams, and then I create processors for the incoming streams' DataSources; they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    The RTP packets are timestamped when they leave, and they are played in order. You can't change the time base on an RTP stream because, for all intents and purposes, it's "live" data. It plays at the speed the transmitting RTP server wants it to play, and it never gets behind because it drops out-of-order and old data packets.
    If your RTP streams aren't synced correctly on the receiving end, it means they aren't being synced on the transmitting end...so you should look into the other side of the equation.

  • Write (export) the RTP stream from a SIP call to an audio file

    I am working on a telephony system that needs to record each call to a file.
    I get the RTP stream in the ReceiveStreamEvent handler and start recording.
    The problem is that I get a file and can play it, but there is no sound
    when I set FileTypeDescriptor.WAVE and AudioFormat.ULAW.
    I tried FileTypeDescriptor.WAVE with AudioFormat.IMA4_MS and got a fixed-size file of 60 KB.
    With FileTypeDescriptor.MPEG_AUDIO and AudioFormat.MPEG, the processor cannot be realized;
    the thread is blocked!!
    Could anyone save me? Thanks in advance!
    =================================Code===================================================
    public void update(ReceiveStreamEvent evt) {
        logger.debug("received a new incoming stream. " + evt);
        RTPManager mgr = (RTPManager) evt.getSource();
        Participant participant = evt.getParticipant(); // could be null.
        ReceiveStream stream = evt.getReceiveStream(); // could be null.
        if (evt instanceof NewReceiveStreamEvent) {
            try {
                stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
                DataSource ds = stream.getDataSource();
                recorder = new CallRecorderImpl(ds);
                recorder.start();
            } catch (Exception e) {
                logger.error("NewReceiveStreamEvent exception ", e);
            }
            return;
        }
    }
    ========================================================================================
    This is the CallRecorderImpl class:
    public class CallRecorderImpl implements DataSinkListener, ControllerListener {
        private Processor processor = null;
        private DataSink dataSink = null;
        private DataSource source = null;
        private boolean bRecording = false;

        FileTypeDescriptor contentType = new FileTypeDescriptor(FileTypeDescriptor.WAVE);
        Format encoding = new AudioFormat(AudioFormat.IMA4_MS);
        MediaLocator dest = new MediaLocator("file:/c:/bar.wav");

        public CallRecorderImpl(DataSource ds) {
            this.source = ds;
        }

        public void start() {
            try {
                processor = Manager.createProcessor(source);
                processor.addControllerListener(this);
                processor.configure();
            } catch (Exception e) {
                System.out.println("exception:" + e);
            }
        }

        public void stop() {
            try {
                System.out.println("stopping");
                this.dataSink.stop();
                this.dataSink.close();
            } catch (Exception ep) {
                ep.printStackTrace();
            }
        }

        public void controllerUpdate(ControllerEvent evt) {
            Processor pr = (Processor) evt.getSourceController();
            if (evt instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                processConfigured(pr);
            }
            if (evt instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                processRealized(pr);
            }
            if (evt instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (evt instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                pr.stop();
            }
            if (evt instanceof StopEvent) {
                System.out.println("StopEvent");
                pr.close();
                try {
                    dataSink.stop();
                    dataSink.close();
                } catch (Exception ee) {
                    ee.printStackTrace();
                }
            }
        }

        public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                try {
                    System.out.println("EndOfStreamEvent");
                    dataSink.stop();
                    dataSink.close();
                    System.exit(1);
                } catch (Exception e) {
                }
            }
        }

        public void processConfigured(Processor p) {
            // Set the output content type
            p.setContentDescriptor(this.contentType);
            // Get the track control objects
            TrackControl track[] = p.getTrackControls();
            boolean encodingPossible = false;
            // Go through the tracks and try to program one of them
            // to output ima4 data.
            for (int i = 0; i < track.length; i++) {
                try {
                    track[i].setFormat(this.encoding);
                    encodingPossible = true;
                } catch (Exception e) {
                    track[i].setEnabled(false);
                }
            }
            if (!encodingPossible) {
                System.out.println("cannot encode to " + this.encoding);
                return;
            }
            p.realize();
        }

        public void processRealized(Processor p) {
            System.out.println("Entered processRealized");
            DataSource source = p.getDataOutput();
            try {
                dataSink = Manager.createDataSink(source, dest);
                dataSink.open();
                dataSink.start();
                dataSink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            dataSink.close();
                        }
                    }
                });
            } catch (Exception e) {
                e.printStackTrace();
                return;
            }
            p.start();
        }
    }
    ============================================
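    A side note on the silent WAV above, strictly an assumption on my part: the DataSource coming off an RTP stream carries an RTP payload format, and not every WAVE writer copes with it, so transcoding the track to plain 16-bit linear PCM before the WAVE multiplexer is worth a try. A drop-in replacement for the track loop in processConfigured:

        import javax.media.control.TrackControl;
        import javax.media.format.AudioFormat;

        // Returns true if at least one track could be set to linear PCM.
        static boolean forceLinearPcm(TrackControl[] tracks) {
            boolean ok = false;
            for (int i = 0; i < tracks.length; i++) {
                try {
                    tracks[i].setFormat(new AudioFormat(AudioFormat.LINEAR,
                            8000, 16, 1, AudioFormat.LITTLE_ENDIAN, AudioFormat.SIGNED));
                    ok = true;
                } catch (Exception e) {
                    tracks[i].setEnabled(false);
                }
            }
            return ok;
        }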

    That shows that the output stream cannot be of that particular format and descriptor.
    Look at this code:
    import javax.swing.*;
    import javax.media.*;
    import java.net.*;
    import java.io.*;
    import javax.media.datasink.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.*;

    class Abc implements ControllerListener {
        DataSource ds;
        DataSink sink;
        Processor pr;
        MediaLocator mc;

        public void maam() throws Exception {
            mc = new MediaLocator("file:C:/Workspace/aaaaa.mpg");
            pr = Manager.createProcessor(new URL("file:G:/java files1/jmf/aa.mp3"));
            pr.addControllerListener(this);
            pr.configure();
        }

        public void controllerUpdate(ControllerEvent e) {
            if (e instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processConfigured(p);
            }
            if (e instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processRealized(p);
            }
            if (e instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (e instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                Processor p = (Processor) e.getSourceController();
                p.stop();
            }
            if (e instanceof StopEvent) {
                System.out.println("StopEvent");
                Processor p = (Processor) e.getSourceController();
                p.close();
                try {
                    sink.stop();
                    sink.close();
                } catch (Exception ee) {
                }
            }
        }

        public void processConfigured(Processor p) {
            System.out.println("Entered processConfigured");
            p.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.MPEG_AUDIO));
            /*  TrackControl track[] = p.getTrackControls();
                boolean encodingPossible = false;
                for (int i = 0; i < track.length; i++) {
                    try {
                        track[i].setFormat(new VideoFormat(VideoFormat.MPEG));
                        encodingPossible = true;
                    } catch (Exception e) {
                        track[i].setEnabled(false);
                    }
                } */
            p.realize();
        }

        public void processRealized(Processor p) {
            pr = p;
            System.out.println("Entered processRealized");
            try {
                MediaLocator dest = new MediaLocator("file:C:/Workspace/ring1.mpg");
                sink = Manager.createDataSink(p.getDataOutput(), dest);
                sink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            sink.close();
                        }
                    }
                });
                sink.open();
                sink.start();
            } catch (Exception eX) {
            }
            System.out.println("Just before start");
            p.start();
        }

    /*  public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                try {
                    System.out.println("EndOfStreamEvent");
                    dsk.stop();
                    dsk.close();
                    System.exit(1);
                } catch (Exception e) {
                }
            }
        } */
    }

    public class JMFCapture6 {
        public static void main(String args[]) throws Exception {
            Abc a = new Abc();
            a.maam();
        }
    }

  • Having multiple threads for receiving RTP streams

    Hello,
    Developing an audio conference server, I have come to think that if I manage to separate the different audio receivers that receive the RTP streams, the performance could improve.
    At the moment I have the main program, the main thread let's say, which initializes a new audioRx object for each remote client.
    Would separating the different receivers into different threads improve the application's performance?
    Has anyone thought about this, or done something similar?
    Thanks for your help.
    bgl
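    As a sketch of the thread-per-receiver layout (the Runnable receivers stand in for your audioRx objects). Worth noting: JMF already runs RTP reception and event dispatch on internal threads of its own, so extra threads mainly pay off if your per-client setup or processing does blocking work of its own.

        import java.util.List;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;

        class ReceiverPool {
            /** Start each receiver (your audioRx instances) on its own worker thread. */
            static ExecutorService startAll(List<Runnable> receivers) {
                ExecutorService pool = Executors.newCachedThreadPool();
                for (Runnable rx : receivers)
                    pool.execute(rx); // each one blocks on its own session independently
                return pool;
            }
        }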

    I need help too, about receiving RTP streams from the same port.

  • How to extract data from Buffer and create an RTP stream

    Hi
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    Is it possible to create an RTP stream, only having access to the byte array "data" in Buffer?
    Thanks in advance.

    camelstrike wrote:
    Hi
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    There are a couple of different ways to get the data. Reading it from inside a DataSink is perfectly fine...
    Is it possible to create an RTP stream, only having access to the byte array "data" in Buffer?
    Yes and no.
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/CustomPayload.html]
    You need to know the format of the media in addition to the actual media data...
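    As a concrete illustration of "format plus data": raw chunks pulled out of a Buffer can be re-wrapped in a PushBufferDataSource and handed back to JMF. A sketch; the G.711 format here is an assumption to be replaced with your stream's real format:

        import javax.media.*;
        import javax.media.format.AudioFormat;
        import javax.media.protocol.*;
        import java.io.IOException;

        /** Feeds byte[] chunks back into JMF as a push stream. */
        class RawDataSource extends PushBufferDataSource {
            private final RawStream stream = new RawStream();

            public void push(byte[] data, int len) { stream.push(data, len); }

            public PushBufferStream[] getStreams() { return new PushBufferStream[] { stream }; }
            public String getContentType() { return ContentDescriptor.RAW; }
            public void connect() {}
            public void disconnect() {}
            public void start() {}
            public void stop() {}
            public Object getControl(String type) { return null; }
            public Object[] getControls() { return new Object[0]; }
            public Time getDuration() { return DURATION_UNKNOWN; }

            private static class RawStream implements PushBufferStream {
                private final Format format =
                        new AudioFormat(AudioFormat.ULAW, 8000, 8, 1); // assumed format
                private byte[] pending;
                private int pendingLen;
                private BufferTransferHandler handler;

                synchronized void push(byte[] data, int len) {
                    pending = data;
                    pendingLen = len;
                    if (handler != null) handler.transferData(this); // wake the consumer
                }

                public synchronized void read(Buffer buf) throws IOException {
                    if (pending == null) { buf.setDiscard(true); return; }
                    buf.setData(pending);
                    buf.setOffset(0);
                    buf.setLength(pendingLen);
                    buf.setFormat(format);
                }

                public Format getFormat() { return format; }
                public void setTransferHandler(BufferTransferHandler h) { handler = h; }
                public ContentDescriptor getContentDescriptor() {
                    return new ContentDescriptor(ContentDescriptor.RAW);
                }
                public long getContentLength() { return LENGTH_UNKNOWN; }
                public boolean endOfStream() { return false; }
                public Object getControl(String type) { return null; }
                public Object[] getControls() { return new Object[0]; }
            }
        }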

  • To retransmit the RTP streams received

    Hi!
    Program A transmits media data to another program B using RTPSessionMgr. What B has to do is send the received streams to another receiver, program C, which has to play the stream. That is, I am trying to implement a client-router-server model.
    So I maintained a session between A and B and also between B and C. Once a NewReceiveStream event occurs at B, it retrieves the DataSource from the stream, uses the createSendStream() method of the session manager, and sends it to C.
    My problem is that C receives the audio and video streams but never plays them. I get a pink screen for video and no audio.
    Can somebody help me out in pointing out my mistake, or tell me a new strategy to implement this?
    Actually, I have tried to use only datagram sockets on the router side, i.e. I created 2 sockets for B to receive data and forwarded it to C using 2 other sockets. And this did work. But then the client does not know the sender details, and I need to maintain the sender/receiver reports. So I went for the session manager (RTPManager/RTPSessionMgr).
    Kindly help.
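    The forwarding step at B usually comes down to a few lines in the update() handler; a sketch (the RTPManager for the B-to-C session is assumed to be initialized elsewhere with C's address):

        import javax.media.rtp.*;
        import javax.media.rtp.event.*;

        class Forwarder implements ReceiveStreamListener {
            private final RTPManager mgrToC; // assumed: already initialized/targeted at C

            Forwarder(RTPManager mgrToC) { this.mgrToC = mgrToC; }

            public synchronized void update(ReceiveStreamEvent evt) {
                if (evt instanceof NewReceiveStreamEvent) {
                    try {
                        ReceiveStream rs = ((NewReceiveStreamEvent) evt).getReceiveStream();
                        // Hand A's incoming stream straight to the session towards C.
                        mgrToC.createSendStream(rs.getDataSource(), 0).start();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        }

    If C then shows pink video or silent audio, the payload format is the first thing I would check (e.g. pass the DataSource through a Processor forced to an RTP format before resending) -- but that is speculation, not a verified fix.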

    Hi all!
    Nice to meet you. It is very exciting that I got a project integrating JMF and JXTA: it gets RTP streams from the JMF framework and sends them to the JXTA framework, on to any peer in some groups for visualization. So some aspects of it are similar to yours; would you mind adding me to your MSN contact list (my MSN account is: [email protected])?

  • RTP stream applet

    Hi!!
    I'm new to JMF. I'd like to play an RTP stream on a remote machine by using an applet. I've searched a lot for a Java applet which could receive an RTP stream, but couldn't find easy-to-understand tutorial code etc.
    If you have any good links, please post them on the forum or paste code.
    Thank you for any answers!!!

    an applet needs nothing great ...
    As someone who...
    - has deployed 1.1-1.4 compatible applets, applets that interact with JavaScript, full (browser) window applets, applets that load screensavers and launch applications, web-started applets, applets that change PLAF...
    - has dealt with (literally) hundreds of applet-related problems on usenet, including the investigation of applet bugs, and (particularly) bugs caused by the MSVM...
    ...I can assure you:
    1) Applets are anything but easy to deploy successfully in the wilds of the internet.
    2) As someone who has dealt a little with the JMF, I can add that JMF applets add a whole new level of complications to the mix. Generally, the end user needs to have pre-installed JMF in order to see JMF-based applets. Even then, the browser might not detect the JMF classes.
    It is generally far more sensible, and reliable, to launch such projects in a free-floating frame using web-start.
    Here is an example of a webstart launch:
    <http://www.javasaver.com/testjs/jmf/#test3>
    Note that any such web-start launch might be optimised in a variety of ways (to download less). The most obvious would be to use the customizer.jar to make a cut-down version that handles only the required sources (e.g. RTP) and formats (e.g. MOV).

  • Simultaneous recording to file and playing an incoming RTP stream

    I want simultaneous recording to a file and playback of an incoming RTP stream. I have cloned the data source. After cloning I created two data sinks and two processors. One processor/data sink is meant for recording the stream to a file, and the other processor/data sink is used to play the RTP through the speaker.
    But at any one time I am only able to achieve one operation, i.e. either recording or playing. For recording into the WAV file I have to put in Thread.sleep(30) so that I can record the RTP into the file; during this time I am not able to play the audio through the speaker. I also created a separate thread, but this does not help.
    Please give suggestions on how to resolve this issue.

    thanks for the reply, but I think I've already solved the problem.

  • RTP stream - record and listen simultaneously ...

    Hi!
    I would like to record and listen to an RTP stream simultaneously, but I don't know how... I can record to a file OR listen in headphones, but only separately.
    If I create a DataSink for recording on a port, I cannot use that port to listen. But in that case how can I listen to the audio stream on headphones?
    Maybe the right solution would be to clone the stream, but I cannot, because I get an exception:
    javax.media.NoProcessorException: Cannot find a Processor for: com.ibm.media.protocol.CloneablePushBufferDataSource@1d8957f
    This is the code:
        DataSource ds = Manager.createCloneableDataSource(Manager.createDataSource(media_locator));
        processor = Manager.createRealizedProcessor(new ProcessorModel(ds, formats, outputType));
    Has anybody got any idea?
    Thanks!
    Ric Flair

    I'd point you in this direction initially.
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/Clone.java]
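    Following that Clone.java example, the ordering seems to matter: the cloneable source feeds the first consumer, and the clone (created up front) feeds the second. A sketch under that assumption, reusing the poster's variables as parameters:

        import javax.media.*;
        import javax.media.protocol.*;

        class RecordAndListen {
            /** formats/outputType are the same ProcessorModel arguments as above. */
            static void start(MediaLocator locator, Format[] formats,
                              ContentDescriptor outputType) throws Exception {
                DataSource ds = Manager.createCloneableDataSource(
                        Manager.createDataSource(locator));
                // Create the clone before anything starts pulling on ds.
                DataSource clone = ((SourceCloneable) ds).createClone();

                // Consumer 1: recording Processor on the cloneable source itself.
                Processor rec = Manager.createRealizedProcessor(
                        new ProcessorModel(ds, formats, outputType));
                // Consumer 2: headphone playback on the clone.
                Player speaker = Manager.createRealizedPlayer(clone);

                rec.start();     // hook a DataSink to rec.getDataOutput() for the file
                speaker.start();
            }
        }

    If createRealizedProcessor still throws NoProcessorException on the cloneable source, falling back to Manager.createProcessor(ds) and driving the configure/realize states by hand is the usual workaround.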

  • Using RTP stream as data source

    I have successfully written a program that reads G.711 audio from a wav file and sends it out over the network.
    I'm trying to modify that program so that, instead of getting its source audio from a file, it receives the audio from an RTP stream. So, in effect, it simply receives audio from one socket and sends it back out another.
    The error I'm encountering:
    [java] javax.media.NoProcessorException: Cannot find a Processor for: rtp://172.30.18.140:32916
    [java] at javax.media.Manager.createProcessorForContent(Manager.java:1663)
    [java] at javax.media.Manager.createProcessor(Manager.java:627)
    Here is the code fragment where the exception occurs:
    processor = javax.media.Manager.createProcessor(new MediaLocator("rtp://172.30.18.140:32916"));
    Can anybody help? Thanks!

    Hi,
    I had a very similar problem, but in addition I had to store data in a buffer before retransmission. I resolved it by employing low-level classes. At this time, I still do not know any other solution, and nobody helped me in any way, also in this forum...
    So, first you have to access single frames of the stream (Buffer or ExtBuffer objects) using a RawBufferParser, then you have to reconstruct a DataSource by multiplexing the frames (use an appropriate Multiplexer class, such as RTPSynchBufferMux).
    The big problem is that there is no documentation available about how to use these classes: I have learned all I know by "reverse engineering" (lots of hours spent reading very long, annoying code). A little help can be had from the PlugIn Viewer of JMStudio, which tells you which classes are to be employed.
    I have not tried to retransmit data "straightforwardly", without accessing single frames, because that was not my task. Maybe if you use RTPManager, it is possible to get a DataSource from the ReceiveStream object and then pass it to another RTPManager to create a SendStream.
    I do not know any simpler way to do that, but this does not mean that it does not exist...!
    good luck.
    Alberto M. (Italy)
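    Spelling out Alberto's last suggestion: rather than pointing createProcessor at an rtp:// locator, join the session with an RTPManager and take the DataSource from the ReceiveStream (the addresses below are placeholders):

        import javax.media.protocol.DataSource;
        import javax.media.rtp.*;
        import javax.media.rtp.event.*;
        import java.net.InetAddress;

        class RtpReceiver implements ReceiveStreamListener {
            private final RTPManager mgr = RTPManager.newInstance();

            void listen() throws Exception {
                mgr.addReceiveStreamListener(this);
                mgr.initialize(new SessionAddress(InetAddress.getLocalHost(), 32916));
                mgr.addTarget(new SessionAddress(
                        InetAddress.getByName("172.30.18.140"), 32916));
            }

            public void update(ReceiveStreamEvent evt) {
                if (evt instanceof NewReceiveStreamEvent) {
                    DataSource ds = ((NewReceiveStreamEvent) evt)
                            .getReceiveStream().getDataSource();
                    // ds is now a valid input for Manager.createProcessor(ds)
                    // or for a second RTPManager's createSendStream(ds, 0).
                }
            }
        }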

  • RTP streaming

    Hi guys,
    what I need to do is to stream YUV images between two JMF applications. I am using the VideoTransmit application (taken from this site) to transmit and JMStudio to receive. From a little research I have noticed the following steps to transmit:
    1) Create the DataSource using the Manager.CreateDataSource(MediaLocator)
    2) create processor using the above created datasource
    3) create player etc.
    My question now is the following:
    How can I transmit a YUV file? How can I create a DataSource over this YUV file without causing problems for the further creation of the processor etc.?
    Thanks in advance guys,
    Chris

    Ok, I think I know what was confusing me. Apparently setRate() isn't affecting the rate of the stream. The audio file I am streaming usually runs about five minutes, so I sped the rate up to 10. That sped up the process of writing to a file significantly, but writing to an RTP stream seems to take the full run-time of the file. Does this sound right, or is there something else I'm missing?
    Thanks,
    Khanathor

  • How do I transmit an RTP stream using only a datasource?

    I've been following this tutorial: http://members.jcom.home.ne.jp/3117216601/jmf2guide/RTPPresenting.html
    And I have clients that can send me RTP audio streams, which are received in the update method.
    Now, update generates a NewReceiveStreamEvent, which has a DataSource that I can play.
    But what I want to do is send the DataSource back to that client. (The IP address is unknown... but that shouldn't matter, as RTP is two-way.)
    I want to send the DataSource back to the client because later we will be routing different clients, and who they can talk to and such. But I want to start with the basics.

    captfoss wrote:
    DerNalia wrote:
    But what I want to do is send the DataSource back to that client. (The IP address is unknown... but that shouldn't matter, as RTP is two-way.)
    RTP isn't two-way; RTP is one-way... RTCP is two-way. The data transport is simplex; the QOS/control messages are duplex...
    So you do need to know the IP address of the client in order to send anything to him...
    I want to send the DataSource back to the client because later we will be routing different clients, and who they can talk to and such. But I want to start with the basics.
    You should just be able to turn around and give the DataSource that's receiving the RTP stream to an RTPManager and broadcast it back out... assuming you're not transcoding it out of its RTP format or anything.
    I don't have the IP address. So is this even possible without switching to RTCP? If so, if you have an example, that would be awesome. Also an example of two-way communication with RTCP would be awesome too.
    Does RTCP use the same sort of structure that RTP does? Like, is there a DataSource that could be merged with other DataSources and sent back to where one of the DataSources came from?

  • RTP-Streaming (MP3)

    Dear Java-Users,
    we would like to build a little RTP streaming radio.
    We found a lot of code samples, for example the AVTransmit/AVReceive from java.sun.
    Our client is working well and is able to receive RTP streams.
    But our server isn't running at the moment.
    When we start both (server and client) on one local machine, the client can receive the stream. When we start the server on another machine on the same network, the client cannot receive the stream.
    Well, there are two problems left:
    1) As I said, the server does not work except from localhost.
    2) We read that there is no MP3 support in Java RTP streaming. But as you know, MP3 is the most used audio format, and that's why we would like to implement it.
    If somebody could help me, I would be really glad.
    Please send me an email or post here. ([email protected])
    I appreciate your time. With good greetings,
    Tobias Belch

    Really, I have to pay to use MP3?
    No, I didn't know this.
    I want to use MP3 because I want to build a simple and small client/server for RTP streaming over a standard network, and MP3 is the most used format, so it is simply the most comfortable for the user.
    But thank you very much for your hint.
    Greetings

  • RTP streaming noise from JMStudio

    I am using the downloaded AVTransmit3 code (I also tried AVTransmit2) to send an RTP stream to JMStudio. It does recognize it as ULAW, as does Ethereal; however, JMStudio only plays noise. This is a priority project, so any help would be much appreciated.
    java AVTransmit3 rtp:172.16.85.2:8000/audio 192.168.10.223 9000
    Track 0 is set to transmit as:
    ULAW/rtp, 8000.0 Hz, 8-bit, Mono
    Created RTP session: 192.168.10.223 9000
    Start transmission for 60 seconds...
    ...transmission ended.

    Change the video format to JPEG/RTP
