RTP Streaming seek & stop question.

Is it possible to seek in an RTP stream received at the client? Since RTP works on a push-data concept, is there a way to implement this?
I also want the server to stop streaming when the client exits. Normally the sender sends a BYE event to the receiver. Is it possible to do this the other way around, so the receiver can tell the sender to stop streaming?
Thanks.

From what I understand, RTP on its own is basically a blind transmission, like watching TV or listening to the radio: if the RTP stream (analogous to the TV signal or radio waves) is being broadcast, you can receive it. With this method you have no real control; you are just given whatever is being streamed.
RTSP, however, lets you send commands to a server that streams RTP content: you can start, stop, and pause the stream, and in some cases seek forward or backward to a particular time in the sequence.
You may find the following link useful: http://www.cs.columbia.edu/~hgs/teaching/ais/slides/RTSP.pdf
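To make the difference concrete, here is a minimal sketch of what those RTSP commands look like on the wire, sent over a plain socket in Java. The host, media URL, and session ID are made up for the example; a real client would take the session ID from the server's earlier SETUP response.

    import java.io.*;
    import java.net.Socket;

    public class RtspControlSketch {
        public static void main(String[] args) throws IOException {
            try (Socket s = new Socket("example.com", 554); // 554 = standard RTSP port
                 PrintWriter out = new PrintWriter(s.getOutputStream());
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(s.getInputStream()))) {

                // "Seek": PLAY with a Range header asks the server to stream from 25s in.
                out.print("PLAY rtsp://example.com/media RTSP/1.0\r\n"
                        + "CSeq: 3\r\n"
                        + "Session: 12345678\r\n"
                        + "Range: npt=25-\r\n\r\n");
                out.flush();
                System.out.println(in.readLine()); // e.g. "RTSP/1.0 200 OK"

                // "Stop": TEARDOWN tells the server to stop streaming entirely.
                out.print("TEARDOWN rtsp://example.com/media RTSP/1.0\r\n"
                        + "CSeq: 4\r\n"
                        + "Session: 12345678\r\n\r\n");
                out.flush();
            }
        }
    }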
Hope this helps,
Stef

Similar Messages

  • RTP audio stream suddenly stopped

    Hello,
    I've developed an audio chat application with JMF and tested it in LAN and Internet environments.
    It runs fine with a local server.
    But when we connect to a public server on the Internet, the sound coming from my peer always stops suddenly after a few seconds (more than 10 but less than 30). A sniffer program shows that RTP packets are still being received by my computer even after the voice is lost. The strange thing is that only I lose my peer's voice, while my voice can still be heard by my peer.
    We then tested running both applications from outside my network, and everything worked.
    I'm still curious what could possibly have caused this; I'm afraid that if the packets travel through the Internet, the problem could arise again.
    I've read another topic in this forum describing a similar problem. Some suggested increasing the TTL of the RTP packets. I'll give it a try, but note that I still see the packets arriving at my computer even when the voice is lost.
    Thank you
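    For anyone trying the TTL suggestion: in JMF the TTL is a parameter of the session address used to initialize the RTPManager. A minimal sketch, with an assumed port and TTL value:

        import java.net.InetAddress;
        import javax.media.rtp.RTPManager;
        import javax.media.rtp.SessionAddress;

        void initWithTtl() throws Exception {
            RTPManager mgr = RTPManager.newInstance();
            // The third constructor argument is the TTL; raise it if packets cross many hops.
            SessionAddress local = new SessionAddress(InetAddress.getLocalHost(), 42050, 127);
            mgr.initialize(local);
        }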

    thesti wrote:
    i just wonder how should we handle StreamMappedEvent ?
    That's a pretty application-specific question, isn't it?
    The StreamMappedEvent occurs when a new RTP stream is received without a participant attached to it... so you're getting a stream, but you don't know who it's from. Subsequently, the participant's identity will be received, and a "StreamMapped" event will occur.
    Some applications will refuse to play an RTP stream until the participant is identified, because they care who the streams are from... but bear in mind that if the participant is already known when the stream is first received, no "StreamMapped" event will occur. It's only when you don't know at first that you'll get a "StreamMapped" event later.
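    To make that concrete, here is a minimal sketch of an update() handler that defers playback until the mapping arrives. The pendingStreams map and playStream() helper are made up for the illustration:

        // Field assumed elsewhere in the class:
        // Map<Long, ReceiveStream> pendingStreams = new HashMap<>();

        public void update(ReceiveStreamEvent evt) {
            if (evt instanceof NewReceiveStreamEvent) {
                ReceiveStream stream = evt.getReceiveStream();
                if (evt.getParticipant() == null) {
                    // Identity unknown so far; hold the stream until it is mapped.
                    pendingStreams.put(stream.getSSRC(), stream);
                } else {
                    playStream(stream); // already known: no StreamMappedEvent will follow
                }
            } else if (evt instanceof StreamMappedEvent) {
                // The participant's identity has now arrived via RTCP.
                ReceiveStream stream = evt.getReceiveStream();
                System.out.println("Stream " + stream.getSSRC()
                        + " mapped to " + evt.getParticipant().getCNAME());
                playStream(pendingStreams.remove(stream.getSSRC()));
            }
        }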

  • RTP stream question

    Hello community, I have a question.
    The setup is this: a Cisco SCCP phone is located in Russia and registered to a CallManager located in Denmark. The phone dials a number that is translated and routed, via a route list, to a local gateway in Russia. On that local Cisco gateway the dialed number matches a dial peer with a SIP trunk to a Lync mediation server, also located in Denmark.
    What path will the final RTP stream take from the Cisco phone to the Lync server?
    Br. Kasper

    Hi. It actually all comes down to the reporting in Lync. The issue is that Lync reports "bad" conferences per SIP trunk. I want to make a more detailed design where I place a SIP trunk in every affiliate around Europe in their local gateway. But as the phones are registered in the CUCM cluster in Denmark, and the Lync mediation server (the SIP trunk connection to the local gateway) is also located in Denmark, I need a redesign of the RTP path, including the MTP. That is why I want to put the MTP resource in the local affiliate gateway (in this example, Russia). By doing this, the RTP path would be phone -> local MTP -> local VGW SIP trunk -> Lync mediation in Denmark. I just needed a second opinion on the design.
    The current setup reports all "bad" conferences as being in Denmark, even if they originate in other countries. The local dial pattern is just call-forwarded to the DK SIP trunk -> Lync mediation. The SIP trunks in CUCM are limited to only one connection to the Lync mediation server.

  • Mixing audio RTP streams into a conference

    I'm trying to create a telephone conference application with JMF.
    My problem:
    I need to mix multiple incoming RTP streams (Audio/G711_ulaw) together into one outgoing RTP stream.
    If I use a MergingDataSource I can get one DataSource with multiple tracks, but I'm not able to send them out together over one RTP session.
    Is there any way to do this?
    I have also thought about using RawBufferMux or WAVMux to multiplex the tracks contained in the MergingDataSource, but I can't find out how to use them.
    Or is there an easier way to create a conference over RTP?
    I need to connect to Cisco IP phones, so I can only use the G711_ulaw codec.
    Is there anyone out there who can help me?

    I'm sorry, but I ran into this problem in the past and found it impossible to resolve.
    I also thought about MergingDataSource, about a Many2One class, and more and more, but nothing I could use. And nobody could answer this question in this forum half a year ago.
    Oh, I nearly forgot: it is possible to write the merged streams to a file, so it seems possible to write them to a file and then transmit to the network from that file. But how do you manage more than one Processor?..
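    For later readers, this is roughly what the MergingDataSource attempt looks like (assuming a list of incoming DataSources called incoming). Note the caveat discussed above: merging yields one DataSource with several tracks, and stock JMF has no mixer that collapses them into a single G.711 track, so each track still goes out as its own RTP stream:

        DataSource[] inputs = (DataSource[]) incoming.toArray(new DataSource[0]);
        DataSource merged = Manager.createMergingDataSource(inputs);
        Processor p = Manager.createProcessor(merged);
        // ... wait for the Configured state via a ControllerListener ...
        p.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
        TrackControl[] tracks = p.getTrackControls();
        for (int i = 0; i < tracks.length; i++) {
            // Each track is programmed to u-law; they are NOT mixed into one track.
            tracks[i].setFormat(new AudioFormat(AudioFormat.ULAW_RTP, 8000, 8, 1));
        }
        // ... realize, then hand p.getDataOutput() to RTPManager.createSendStream() ...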

  • How can I manage controls in an RTP stream

    Hello,
    I'm currently using RTP streaming to play music. I started with Sun's AVReceive2 and AVTransmit2 examples, and by adding a plugin I managed to play MP3. So far so good.
    Because I didn't want to use the view components provided by Sun, I created my own GUI for my player and use it instead of the one in Sun's examples.
    I use listeners for my "buttons" (I prefer using a MouseListener and Panel, as in Sun's Jamp example), but I can't find a way to make those buttons take effect.
    If I press pause, it has to pause the streaming in AVTransmit2.
    If I press next or previous, it has to change what AVTransmit2 is processing.
    Each control has to be carried out in AVTransmit2, but my player is part of AVReceive2.
    My first idea was to use an event to tell my AVTransmit2 object to execute my controls, but I haven't found a way to tell my AVTransmit2 object which button was pressed.
    I eventually gave up trying to transmit my commands via events.
    I then tried to find a way to obtain my AVTransmit2 object inside my AVReceive2 object so that I could call methods on it, but I failed :(
    How can I manage my controls so that they call methods on my server and not on my client?
    Thanks :)
    Shad.

    Hi again,
    For the next & previous buttons, I was thinking of something.
    Do you think it would be good to create a new class RTPClientManager, which extends RTPManager and has two booleans, previous & next, that are set to true when I press the buttons?
    From there, couldn't I modify my function in AVTransmit2 like this (I implement ReceiveStreamListener on AVTransmit2)?
         @Override
         public void update(ReceiveStreamEvent evt) {
              RTPClientManager mgr = (RTPClientManager) evt.getSource();
              Participant participant = evt.getParticipant();  // could be null.
              ReceiveStream stream = evt.getReceiveStream();   // could be null.
              /* Detect the client closing its connection so we can stop the streaming transmission. */
              if (evt instanceof ByeEvent) {
                   System.err.println("  - Got \"bye\" from: " + participant.getCNAME());
                   if (mgr.isNext()) {
                        this.stop();
                        /* SOME STUFF FOR NEXT */
                   } else if (mgr.isPrevious()) {
                        this.stop();
                        /* SOME STUFF FOR PREVIOUS */
                   } else { // means it's the stop button that was pressed
                        this.stop();
                   }
              }
         }
    Would it work?
    If you have better ideas I am ready to hear them :P
    But if this works, I will be able to manage stop, next and previous; how could I manage the play & pause buttons, though?
    I can't find a proper solution to manage my controls :(
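    A pattern that may help here (not from the original thread): keep the RTP session for media only, and open a small out-of-band TCP control channel between the receiver's GUI and the transmitter. A rough sketch; the port, the command names, and any pause()/next()/previous() methods on AVTransmit2 are assumptions (Sun's stock AVTransmit2 only provides stop()):

        // Assumes the usual java.io.* and java.net.* imports.

        // Client side, called from AVReceive2's button handlers:
        void sendCommand(String host, String cmd) throws IOException {
            try (Socket s = new Socket(host, 5000);
                 PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                out.println(cmd); // "PAUSE", "NEXT", "PREVIOUS" or "STOP"
            }
        }

        // Server side, running in a thread next to AVTransmit2:
        void controlLoop(AVTransmit2 transmitter) throws IOException {
            try (ServerSocket server = new ServerSocket(5000)) {
                while (true) {
                    try (Socket s = server.accept();
                         BufferedReader in = new BufferedReader(
                                 new InputStreamReader(s.getInputStream()))) {
                        String cmd = in.readLine();
                        if ("STOP".equals(cmd)) {
                            transmitter.stop(); // exists in Sun's example
                        }
                        // "PAUSE"/"NEXT"/"PREVIOUS" would call methods you add yourself.
                    }
                }
            }
        }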

  • OSMF HTTP Stream seek bug...

    (This was also posted in Flash as I couldn't find the OSMF forum!)
    Hi,
    I have built an OSMF player that allows users to seek to where they were when resuming, but also to scrub back to the beginning.
    EG:
    A stream is 1 minute long.
    User starts play at 25 seconds.
    User can scrub back to 5 seconds or forward to 55 seconds.
    This works perfectly with RTMP - TimeTrait reports correctly and video displays correctly.
    EG:
    Start at 25 seconds, TimeTrait reports: duration = 60, current position = 25 and displaying video is 25.
    Seek to 5 seconds rewinds to current duration = 60, position = 5, displaying video is 5.
    But...
    With HTTP streams...
    Start at 25 seconds. TimeTrait reports: duration = 60, current position = 25 and displaying video is 50.
    Seek to 5 seconds rewinds to before the start of the stream and gives me a media complete... end of stream.
    Start at 30 seconds, TimeTrait reports duration = 60, current position = 30 and displaying video is 60 so gives me a media complete... end of stream.
    It is pretty clear that what is happening is that HTTP stream seeking is resetting my start index to the seek point, as if I had started play with:
    NetStream.play(seekPoint);
    Rather than:
    NetStream.play();
    NetStream.seek(seekPoint);
    This, I hasten to add, is a horrendous issue from my client's point of view as it effectively means I cannot allow resume of content.
    Does anybody have any ideas, workarounds?
    G

    Hi,
    I can't use HTTPNetStream.as directly, because I'm using the SWC component for the entire OSMF framework application. Can you please advise how to use it?
    I am using the code below to play HTTP MP4 video and local video, i.e. c:\myfolder\myvideo.mp4 or myfolder/myvideo.mp4, etc.
         netLoader = new NetLoader();
         mediaElement = new VideoElement(urlResource, netLoader);
         initOSMFPlayer(mediaElement, true);

         private function initOSMFPlayer(medElement:*, autoStart:Boolean = false):void {
             //trace("Media Element changed successfully " + medElement);
             try {
                 mediaPlayer = new MediaPlayer(medElement);
                 mediaPlayer.addEventListener(DisplayObjectEvent.MEDIA_SIZE_CHANGE, _onSizeChange);
                 mediaPlayer.addEventListener(MediaErrorEvent.MEDIA_ERROR, onMediaError, false, 0, true);
                 mediaPlayer.addEventListener(MediaPlayerStateChangeEvent.MEDIA_PLAYER_STATE_CHANGE, playerStateChange, false, 0, true);
                 mediaPlayer.addEventListener(PlayEvent.PLAY_STATE_CHANGE, currentState, false, 0, true);
                 mediaPlayer.addEventListener(TimeEvent.COMPLETE, videoComplete, false, 0, true);
                 mediaPlayer.addEventListener(MediaPlayerCapabilityChangeEvent.CAN_SEEK_CHANGE, canSeekChange, false, 0, true);
                 mediaPlayer.autoPlay = autoStart;
                 mediaPlayer.bufferTime = 1;
                 mediaContainer.visible = false;
                 mediaContainer.addMediaElement(medElement);
                 if ((urlType == "HTTP" || urlType == "Undefined") && !autoStart) { mediaPlayer.stop(); }
                 stageResize();
             } catch (e:*) {
                 trace("Error found in loading OSMF Framework");
                 //controller.updatePlayerError(ErrorType.UNSUPPORTED_FORMAT);
             }
         }
    Please advise if anything else needs to change to use the actual HTTPNetStream.as (org.osmf.net.httpstreaming.HTTPNetStream).

  • Write (export) the RTP stream from a SIP call to an audio file

    I am working on a telephony system that needs to record each call to a file.
    I get the RTP stream in the ReceiveStreamEvent handler and start recording.
    The problem: when I set FileTypeDescriptor.WAVE and AudioFormat.ULAW, I get a file, but when I play it there is no sound.
    When I try FileTypeDescriptor.WAVE with AudioFormat.IMA4_MS, I get a fixed-size file of 60 KB.
    With FileTypeDescriptor.MPEG_AUDIO and AudioFormat.MPEG, the processor cannot be realized;
    the thread is blocked!!
    Could anyone save me? Thanks in advance!
    =================================Code===================================================
    public void update(ReceiveStreamEvent evt) {
        logger.debug("received a new incoming stream. " + evt);
        RTPManager mgr = (RTPManager) evt.getSource();
        Participant participant = evt.getParticipant(); // could be null.
        ReceiveStream stream = evt.getReceiveStream();  // could be null.
        if (evt instanceof NewReceiveStreamEvent) {
            try {
                stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
                DataSource ds = stream.getDataSource();
                recorder = new CallRecorderImpl(ds);
                recorder.start();
            } catch (Exception e) {
                logger.error("NewReceiveStreamEvent exception ", e);
                return;
            }
        }
    }
    ========================================================================================
    This is the CallRecorderImpl class:
    public class CallRecorderImpl implements DataSinkListener, ControllerListener {
        private Processor processor = null;
        private DataSink dataSink = null;
        private DataSource source = null;
        private boolean bRecording = false;

        FileTypeDescriptor contentType = new FileTypeDescriptor(FileTypeDescriptor.WAVE);
        Format encoding = new AudioFormat(AudioFormat.IMA4_MS);
        MediaLocator dest = new MediaLocator("file:/c:/bar.wav");

        public CallRecorderImpl(DataSource ds) {
            this.source = ds;
        }

        public void start() {
            try {
                processor = Manager.createProcessor(source);
                processor.addControllerListener(this);
                processor.configure();
            } catch (Exception e) {
                System.out.println("exception:" + e);
            }
        }

        public void stop() {
            try {
                System.out.println("stopping");
                this.dataSink.stop();
                this.dataSink.close();
            } catch (Exception ep) {
                ep.printStackTrace();
            }
        }

        public void controllerUpdate(ControllerEvent evt) {
            Processor pr = (Processor) evt.getSourceController();
            if (evt instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                processConfigured(pr);
            }
            if (evt instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                processRealized(pr);
            }
            if (evt instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (evt instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                pr.stop();
            }
            if (evt instanceof StopEvent) {
                System.out.println("StopEvent");
                pr.close();
                try {
                    dataSink.stop();
                    dataSink.close();
                } catch (Exception ee) {
                    ee.printStackTrace();
                }
            }
        }

        public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                try {
                    System.out.println("EndOfStreamEvent");
                    dataSink.stop();
                    dataSink.close();
                    System.exit(1);
                } catch (Exception e) {
                }
            }
        }

        public void processConfigured(Processor p) {
            // Set the output content type
            p.setContentDescriptor(this.contentType);
            // Get the track control objects
            TrackControl track[] = p.getTrackControls();
            boolean encodingPossible = false;
            // Go through the tracks and try to program one of them
            // to output ima4 data.
            for (int i = 0; i < track.length; i++) {
                try {
                    track[i].setFormat(this.encoding);
                    encodingPossible = true;
                } catch (Exception e) {
                    track[i].setEnabled(false);
                }
            }
            if (!encodingPossible) {
                System.out.println("cannot encode to " + this.encoding);
                return;
            }
            p.realize();
        }

        public void processRealized(Processor p) {
            System.out.println("Entered processRealized");
            DataSource source = p.getDataOutput();
            try {
                dataSink = Manager.createDataSink(source, dest);
                dataSink.open();
                dataSink.start();
                dataSink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            dataSink.close();
                        }
                    }
                });
            } catch (Exception e) {
                e.printStackTrace();
                return;
            }
            p.start();
        }
    }
    ============================================

    Which shows that the output stream cannot be of that particular format and descriptor.
    Look at this code:
    import javax.swing.*;
    import javax.media.*;
    import java.net.*;
    import java.io.*;
    import javax.media.datasink.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.*;

    class Abc implements ControllerListener {
        DataSource ds;
        DataSink sink;
        Processor pr;
        MediaLocator mc;

        public void maam() throws Exception {
            mc = new MediaLocator("file:C:/Workspace/aaaaa.mpg");
            pr = Manager.createProcessor(new URL("file:G:/java files1/jmf/aa.mp3"));
            pr.addControllerListener(this);
            pr.configure();
        }

        public void controllerUpdate(ControllerEvent e) {
            if (e instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processConfigured(p);
            }
            if (e instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processRealized(p);
            }
            if (e instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (e instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                Processor p = (Processor) e.getSourceController();
                p.stop();
            }
            if (e instanceof StopEvent) {
                System.out.println("StopEvent");
                Processor p = (Processor) e.getSourceController();
                p.close();
                try {
                    sink.stop();
                    sink.close();
                } catch (Exception ee) {
                }
            }
        }

        public void processConfigured(Processor p) {
            System.out.println("Entered processConfigured");
            p.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.MPEG_AUDIO));
            /*
            TrackControl track[] = p.getTrackControls();
            boolean encodingPossible = false;
            for (int i = 0; i < track.length; i++) {
                try {
                    track[i].setFormat(new VideoFormat(VideoFormat.MPEG));
                    encodingPossible = true;
                } catch (Exception e) {
                    track[i].setEnabled(false);
                }
            }
            */
            p.realize();
        }

        public void processRealized(Processor p) {
            pr = p;
            System.out.println("Entered processRealized");
            try {
                MediaLocator dest = new MediaLocator("file:C:/Workspace/ring1.mpg");
                sink = Manager.createDataSink(p.getDataOutput(), dest);
                sink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            sink.close();
                        }
                    }
                });
                sink.open();
                sink.start();
            } catch (Exception eX) {
            }
            System.out.println("Just before start");
            p.start();
        }

        /*
        public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                try {
                    System.out.println("EndOfStreamEvent");
                    dsk.stop();
                    dsk.close();
                    System.exit(1);
                } catch (Exception e) {
                }
            }
        }
        */
    }

    public class JMFCapture6 {
        public static void main(String args[]) throws Exception {
            Abc a = new Abc();
            a.maam();
        }
    }
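    A side note for anyone with the same silent-WAV symptom: the transcode target generally reported to work for u-law RTP audio is linear PCM in a WAVE container, e.g.

        FileTypeDescriptor contentType = new FileTypeDescriptor(FileTypeDescriptor.WAVE);
        Format encoding = new AudioFormat(AudioFormat.LINEAR); // instead of IMA4_MS / ULAW

    That combination is an assumption to verify against your own streams, not something confirmed in this thread.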

  • RTP streaming

    Hi guys,
    What I need to do is stream YUV images between two JMF applications. I am using the VideoTransmit application (taken from this site) to transmit, and JMStudio to receive. From a little research I have noted the following steps to transmit:
    1) Create the DataSource using Manager.createDataSource(MediaLocator)
    2) Create a Processor using the above-created DataSource
    3) Create a Player, etc.
    My question is the following:
    How can I transmit a YUV file? How can I create a DataSource backed by this YUV file without causing problems for the subsequent creation of the Processor, etc.?
    Thanks in advance guys,
    Chris

    OK, I think I know what was confusing me. Apparently setRate() doesn't affect the rate of the stream. The audio file I am streaming usually runs about five minutes, so I sped the rate up to 10. That sped up the process of writing to a file significantly, but writing to an RTP stream seems to take the full run-time of the file. Does this sound right, or is there something else I'm missing?
    Thanks,
    Khanathor

  • No Inter-Cluster RTP Stream with Gatekeepers

    Hello,
    Firstly, I am no expert in Cisco telephony, as we have just recently migrated to a full Cisco solution, so apologies if I ask a fundamental question.
    Client A = Site A
    Client B = Site B
    Client C = Site C
    Site A and Site B are in 1 cluster
    Site C is in another cluster
    Intra-Cluster Traffic Works
    Client A -> Client B within the same cluster (across 2 sites with a low-latency link): the RTP stream comes up and the call functions as expected.
    Inter-Cluster Traffic Fails (GK to GK)
    Client C -> Client A: this works; the RTP stream comes up and the call functions as expected.
    Client A -> Client C: this call connects, but there is no RTP stream.
    We are using G.711 across the board, and I have taken a Wireshark capture of a failed Client A -> Client C call.
    Going through this capture, when I filter on H.225 (for the gatekeepers) I see the following –
    CS: setup
    RAS: admissionRequest
    RAS: admissionConfirm
    CS: callProceeding
    CS: alerting
    CS: notify
    RAS: registrationRequest
    RAS: registrationConfirm
    CS: notify
    CS: connect
    CS: notify
    CS: releaseComplete
    RAS: disengageRequest  (DISCONECT_REASON=2,TIME=1321266127,DURATION=24,DISCONNECT_STRING=no resource,ORIGIN=0,LINE_NUMBER=GK,OUTBUND_GW_IP=..
    RAS: disengageConfirm
    There are firewalls in between, and these were the first thing I looked at, but I don't even see any RTP stream being initiated from the far side. Would anyone have any ideas where I could start looking?
    Thanks,
    Peter

    Pat,
    I can't speak to the UC540, but I ran into a situation recently where the SIP gateway was sending out the private extension of the phone number instead of the full DID that was registered with the provider. The provider was then blocking the call.
    In our SIP debugs we saw the RDNIS information with the private extension, I believe. The error code we were getting back from the SP was 404 or something along those lines.
    I recommend you run some debugs and track where the call fails, compare the SNR call versus a normal call in the debugs, and then, if you still get stuck, post your running configs and debugs back here.

  • How to extract data from Buffer and create an RTP stream

    Hi
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    Is it possible to create an RTP stream having access only to the byte array "data" in a Buffer?
    Thanks in advance.

    camelstrike wrote:
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    There are a couple of different ways to get the data. Reading it from inside a DataSink is perfectly fine...
    Is it possible to create an RTP stream having access only to the byte array "data" in a Buffer?
    Yes and no.
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/CustomPayload.html]
    You need to know the format of the media in addition to the actual media data...
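    To sketch what "yes" can look like in practice: the usual trick is to wrap the recovered byte chunks in a custom PushBufferDataSource whose stream hands each chunk to JMF with the correct Format attached, then feed that DataSource to a Processor that outputs RAW_RTP. The class below is an illustration under that assumption (a single audio stream, format known out of band), not a complete transport:

        import java.io.IOException;
        import javax.media.*;
        import javax.media.format.AudioFormat;
        import javax.media.protocol.*;

        class ByteArrayDataSource extends PushBufferDataSource {
            private final ChunkStream stream;

            ByteArrayDataSource(AudioFormat fmt) { stream = new ChunkStream(fmt); }

            // Feed one recovered chunk (e.g. Buffer.getData() from the sender side).
            void push(byte[] data, int len) { stream.push(data, len); }

            public PushBufferStream[] getStreams() { return new PushBufferStream[] { stream }; }
            public String getContentType() { return ContentDescriptor.RAW; }
            public void connect() {}
            public void disconnect() {}
            public void start() {}
            public void stop() {}
            public Object getControl(String type) { return null; }
            public Object[] getControls() { return new Object[0]; }
            public Time getDuration() { return DURATION_UNKNOWN; }

            static class ChunkStream implements PushBufferStream {
                private final AudioFormat fmt;
                private BufferTransferHandler handler;
                private byte[] pending;
                private int pendingLen;

                ChunkStream(AudioFormat fmt) { this.fmt = fmt; }

                synchronized void push(byte[] data, int len) {
                    pending = data;
                    pendingLen = len;
                    if (handler != null) handler.transferData(this); // tell JMF to read()
                }

                public synchronized void read(Buffer buf) throws IOException {
                    buf.setData(pending);
                    buf.setOffset(0);
                    buf.setLength(pendingLen);
                    buf.setFormat(fmt); // the format MUST match the real media
                }

                public void setTransferHandler(BufferTransferHandler h) { handler = h; }
                public Format getFormat() { return fmt; }
                public ContentDescriptor getContentDescriptor() {
                    return new ContentDescriptor(ContentDescriptor.RAW);
                }
                public long getContentLength() { return LENGTH_UNKNOWN; }
                public boolean endOfStream() { return false; }
                public Object getControl(String type) { return null; }
                public Object[] getControls() { return new Object[0]; }
            }
        }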

  • Writing a conference server for RTP streams

    Hello,
    I'm trying to write a conference server which accepts multiple RTP streams (one for each participant), creates a mixed RTP stream of all other participants and sends that stream back to each participant.
    For 2 participants, I was able to correctly receive and send the stream of the other participant to each party.
    For 3 participants, creating the merging data source does not seem to work - i.e. no data is received by the participants.
    I tried creating cloneable data sources instead, thinking the merging data source might be the root cause, but when creating cloneable data sources from incoming RTP sources I am unable to get the Processor into the Configured state; it seems to deadlock. Here's the code outline:
        Iterator pIt = participants.iterator();
        List dataSources = new ArrayList();
        while (pIt.hasNext()) {
            Party p = (Party) pIt.next();
            if (p != dest) {
                DataSource ds = p.getDataSource();
                DataSource cds = Manager.createCloneableDataSource(ds);
                DataSource clone = ((SourceCloneable) cds).createClone();
                dataSources.add(clone);
            }
        }
        Object[] sources = dataSources.toArray(new DataSource[0]);
        DataSource dataSource = Manager.createMergingDataSource((DataSource[]) sources);
        Processor p = Manager.createProcessor(dataSource);
        MixControllerListener cl = new MixControllerListener();
        p.addControllerListener(cl);
        // Put the Processor into configured state.
        p.configure();
        if (!cl.waitForState(p, p.Configured)) {
            System.err.println("Failed to configure the processor.");
            assert false;
        }
    Here are a couple of stack traces:
    "RTPEventHandler" daemon prio=1 tid=0x081d6828 nid=0x3ea6 in Object.wait() [98246000..98247238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f37e4a8> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at demo.Mixer$MixControllerListener.waitForState(Mixer.java:248)
            - locked <0x9f37e4a8> (a java.lang.Object)
            at demo.Mixer.createMergedDataSource(Mixer.java:202)
            at demo.Mixer.createSendStreams(Mixer.java:165)
            at demo.Mixer.createSendStreamsWhenAllJoined(Mixer.java:157)
            - locked <0x9f481840> (a demo.Mixer)
            at demo.Mixer.update(Mixer.java:123)
            at com.sun.media.rtp.RTPEventHandler.processEvent(RTPEventHandler.java:62)
            at com.sun.media.rtp.RTPEventHandler.dispatchEvents(RTPEventHandler.java:96)
            at com.sun.media.rtp.RTPEventHandler.run(RTPEventHandler.java:115)
    "JMF thread: com.sun.media.ProcessEngine@a3c5b6[ com.sun.media.ProcessEngine@a3c5b6 ] ( configureThread)" daemon prio=1 tid=0x082fe3c8 nid=0x3ea6 in Object.wait() [977e0000..977e1238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f387560> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at com.sun.media.parser.RawBufferParser$FrameTrack.parse(RawBufferParser.java:247)
            - locked <0x9f387560> (a java.lang.Object)
            at com.sun.media.parser.RawBufferParser.getTracks(RawBufferParser.java:112)
            at com.sun.media.BasicSourceModule.doRealize(BasicSourceModule.java:180)
            at com.sun.media.PlaybackEngine.doConfigure1(PlaybackEngine.java:229)
            at com.sun.media.ProcessEngine.doConfigure(ProcessEngine.java:43)
            at com.sun.media.ConfigureWorkThread.process(BasicController.java:1370)
            at com.sun.media.StateTransitionWorkThread.run(BasicController.java:1339)
    "JMF thread" daemon prio=1 tid=0x080db410 nid=0x3ea6 in Object.wait() [97f41000..97f41238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Object.wait(Object.java:429)
            at com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave.run(CloneableSourceStreamAdapter.java:375)
            - locked <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Thread.run(Thread.java:534)
    Any ideas?
    Thanks,
    Jarek

    bgl,
    I was able to get past the cloning issue by following the Clone.java example to the letter :)
    It turns out that the cloneable data source must be added as a send stream first, and only then the cloned data source. Now for each party in the call the conference server does the following:
        Party(RTPManager mgr, DataSource ds) {
            this.mgr = mgr;
            this.ds = Manager.createCloneableDataSource(ds);
        }

        synchronized DataSource cloneDataSource() {
            DataSource retVal;
            if (getNeedsCloning()) {
                retVal = ((SourceCloneable) ds).createClone();
            } else {
                retVal = ds;
                setNeedsCloning();
            }
            return retVal;
        }

        private void setNeedsCloning() {
            needsCloning = true;
        }

        private boolean getNeedsCloning() {
            return needsCloning;
        }

        private synchronized void addSendStreamFromNewParticipant(Party newOne)
                throws UnsupportedFormatException, IOException {
            debug("*** - New one joined. Creating the send streams. Curr count :" + participants.size());
            Iterator pIt = participants.iterator();
            while (pIt.hasNext()) {
                Party p = (Party) pIt.next();
                assert p != newOne;
                // update existing participant
                SendStream sendStream = p.getMgr().createSendStream(newOne.cloneDataSource(), 0);
                sendStream.start();
                // send data from existing participant to the new one
                sendStream = newOne.getMgr().createSendStream(p.cloneDataSource(), 0);
                sendStream.start();
            }
            debug("*** - Done creating the streams.");
        }
    So I made some progress, but I'm still not quite there.
    The RTPManager JavaDoc for createSendStream states the following:
    * This method is used to create a sending stream within the RTP
    * session. For each time the call is made, a new sending stream
    * will be created. This stream will use the SDES items as entered
    * in the initialize() call for all its RTCP messages. Each stream
    * is sent out with a new SSRC (Synchronisation SouRCe
    * identifier), but from the same participant i.e. local
    * participant. <BR>
    For 3 participants, my conference server creates 2 send streams to each of them, so I'd expect 2 SSRCs on the wire. Examining the RTP packets in Ethereal, I only see 1 SSRC, as if the 2nd createSendStream call had failed. Consequently, each participant in the conference can receive voice from only 1 other participant, even though I create an RTPManager instance for each participant and add 2 send streams.
    Any ideas ?
    Thanks,
    Jarek

  • Can I play and save the incoming RTP streams simultaneously

    Hi,
    I want to know whether I can play and save the incoming RTP stream simultaneously using
    JMF 2.1. The idea is that by saving it to a file, I can play back the same file at a later time.
    Is there any example code available?
    Thanks in Advance
    bye
    Srikanth

    I think this is, regrettably, impossible today. However much I have tried, I have not been able to achieve it.
    If there is someone who has managed it, please help us with this subject. It's very important.
    Thanks
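    One pattern worth trying, assuming JMF's cloneable data sources behave on your platform: clone the incoming RTP DataSource, play one copy live, and push the clone through a transcoding Processor into a DataSink. A minimal sketch; the file path and formats are placeholders:

        DataSource ds = stream.getDataSource();               // from the ReceiveStream
        DataSource cloneable = Manager.createCloneableDataSource(ds);
        DataSource forFile = ((SourceCloneable) cloneable).createClone();

        Player player = Manager.createRealizedPlayer(cloneable); // play it live
        player.start();

        Processor rec = Manager.createRealizedProcessor(new ProcessorModel(
                forFile,
                new Format[] { new AudioFormat(AudioFormat.LINEAR) },
                new FileTypeDescriptor(FileTypeDescriptor.WAVE)));
        DataSink sink = Manager.createDataSink(rec.getDataOutput(),
                new MediaLocator("file:/tmp/capture.wav"));
        sink.open();
        sink.start();
        rec.start();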

  • Apple Photo Stream has stopped working

    My photo stream has stopped working all of a sudden. It had been working, and I can't figure out how to get it back. Any suggestions?

    Uninstall and then reinstall the iCloud control panel on your computer.

  • How to synchronize audio and video RTP streams

    Hi everyone!
    I've tried to make one of the processors, processor1, control the other, processor2, with processor1.addController(processor2), but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video RTP streams because they are sometimes completely out of sync.
    In the JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive RTP streams, and then I create Processors for the incoming streams' DataSources, which are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way to change them into Pull...DataSources?
    How are you supposed to sync audio and video if not as above?

    camelstrike wrote:
    I'm using a custom AVReceive3 to receive RTP streams, and then I create Processors for the incoming streams' DataSources, which are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way to change them into Pull...DataSources?
    The RTP packets are timestamped when they leave, and they are played in order. You can't change the time base on an RTP stream because, for all intents and purposes, it's "live" data. It plays at the speed the transmitting RTP server wants it to play, and it never gets behind because it drops out-of-order and stale data packets.
    If your RTP streams aren't synced correctly on the receiving end, it means they aren't being synced on the transmitting end... so you should look into the other side of the equation.

  • To retransmit the RTP streams received

    Hi!
    Program A transmits media data to program B using RTPSessionMgr. What B has to do is send the received streams on to another receiver, program C, which has to play the stream. That is, I am trying to implement a client-router-server model.
    So I maintain a session between A & B and also between B & C. Once a NewReceiveStream event occurs at B, it retrieves the DataSource from the stream, uses the createSendStream() method of the session manager, and sends it to C.
    My problem is that C receives the audio and video streams but never plays them. I get a pink screen for video and no audio.
    Can somebody help me by pointing out my mistake, or suggest a new strategy to implement this?
    Actually, I have tried using only datagram sockets on the router side, i.e. I created 2 sockets for B to receive data and forwarded it to C using 2 other sockets. And this did work. But then the client does not know the sender details, and I need to maintain the sender and receiver reports. So I went for the session manager (RTPManager / RTPSessionMgr).
    Kindly help.
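    For reference, the forwarding step at B described above looks roughly like this (managerToC is an RTPManager already initialized towards C; the names are illustrative):

        public void update(ReceiveStreamEvent evt) {
            if (evt instanceof NewReceiveStreamEvent) {
                try {
                    ReceiveStream rs = ((NewReceiveStreamEvent) evt).getReceiveStream();
                    DataSource ds = rs.getDataSource();
                    // Re-send the received stream towards C over the B<->C session.
                    SendStream out = managerToC.createSendStream(ds, 0);
                    out.start();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }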

    Hi all!
    Nice to meet you. It's very exciting: I have a project integrating JMF and JXTA. It takes RTP streams from the JMF framework, sends them into the JXTA framework, and on to any peer in certain groups for visualization. Some aspects of it are similar to yours; would you mind adding me to your MSN contact list? (My MSN account is [email protected].)
