RTP stream question

Hello community, I have a question.
The setup is this: a Cisco SCCP phone is located in Russia and registered to a CallManager located in Denmark. The phone dials a number that is translated and routed via a route list to a local gateway in Russia. The dialed number matches a dial peer in the local Cisco gateway with a SIP trunk to a Lync mediation server, also located in Denmark.
What path will the final RTP stream take from the Cisco phone to the Lync server?
Br. Kasper

Hi. It actually all comes down to the reporting in Lync. The issue is that Lync reports "bad" conferences per SIP trunk. I want to make a more detailed design where I place a SIP trunk in every affiliate around Europe, in its local gateway. But as the phones are registered to a CUCM cluster in Denmark and the Lync mediation server (the SIP trunk connection to the local gateway) is also located in Denmark, I need a redesign of the RTP path, including the MTP. That is why I want to put the MTP resource in the local affiliate gateway (in this example, Russia). By doing this the RTP path would be phone -> local MTP -> local VGW SIP trunk -> Lync mediation in Denmark. I just needed a second opinion on the design.
The current setup would report all "bad" conferences as being in Denmark, even if they originate in other countries. The local dial pattern is simply call-forwarded to the Danish SIP trunk -> Lync mediation. The SIP trunk in CUCM is limited to only one connection to the Lync mediation server.

Similar Messages

  • RTP Streaming seek & stop question.

    Is it possible to seek within an RTP stream received at the client? Since RTP works on a push-data concept, is there a way to implement this?
    I also want the server to stop streaming when the client exits. Normally the sender sends a Bye event to the receiver. Is it possible to do it the other way around, i.e. for the receiver to tell the sender to stop streaming?
    Thanks.

    From what I understand, RTP on its own is basically a blind transmission, like watching TV or listening to the radio: if the RTP stream (analogous to the TV signal or radio waves) is being broadcast, you can receive it. With this method you have no real control; you are just given whatever is being streamed.
    RTSP, however, allows you to send commands to a server that streams RTP content, so you can start, stop, and pause the stream, and in some cases move forward or backward to a particular time in the sequence.
    You may find the following link useful: http://www.cs.columbia.edu/~hgs/teaching/ais/slides/RTSP.pdf
    hope this helps,
    Stef
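
    To make the RTSP idea concrete, here is a minimal sketch (untested; the host, port, URL and session id are placeholders, not from the original post) of issuing a raw RTSP PAUSE request over TCP:

        import java.io.*;
        import java.net.Socket;

        // Sketch: pause a stream by speaking RTSP directly over a TCP socket.
        public class RtspPauseSketch {
            public static void main(String[] args) throws IOException {
                try (Socket s = new Socket("example.com", 554)) {   // placeholder server
                    Writer out = new OutputStreamWriter(s.getOutputStream(), "US-ASCII");
                    out.write("PAUSE rtsp://example.com/stream RTSP/1.0\r\n"
                            + "CSeq: 3\r\n"
                            + "Session: 12345678\r\n\r\n");         // placeholder session id
                    out.flush();
                    BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()));
                    for (String line; (line = in.readLine()) != null && !line.isEmpty(); ) {
                        System.out.println(line);                   // e.g. "RTSP/1.0 200 OK"
                    }
                }
            }
        }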

  • Mixing audio RTP-Streams to conference

    I'm trying to create a telephone conference application with JMF.
    My problem:
    I need to mix multiple incoming RTP streams (audio, G.711 u-law) together into one outgoing RTP stream.
    If I use a MergingDataSource I can get one DataSource with multiple tracks, but I'm not able to send them out together over one RTP session.
    Is there any way to do this?
    I have also thought about using RawBufferMux or WAVMux to multiplex the tracks included in the MergingDataSource, but I can't find out how to use them.
    Or is there an easier way to create a conference over RTP?
    I need to connect to Cisco IP phones, so I can only use the G.711 u-law codec.
    Is there anyone out there who can help me?

    I'm sorry, but I ran into this problem in the past and found it impossible to resolve.
    I also thought about MergingDataSource, about a Many2One class, and more, but there was nothing I could use, and nobody could answer this question in this forum half a year ago either.
    Oh, I nearly forgot: it is possible to write merged streams to a file, so it seems feasible to write them to a file and then transmit to the network from that file. But how do you manage more than one Processor?
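
    For reference, the JMF approach usually discussed for this looks roughly like the sketch below (untested; waitForState is a hypothetical helper that blocks on ControllerListener events, and the merge-then-transcode step is the part posters report trouble with):

        import javax.media.Manager;
        import javax.media.Processor;
        import javax.media.control.TrackControl;
        import javax.media.format.AudioFormat;
        import javax.media.protocol.ContentDescriptor;
        import javax.media.protocol.DataSource;

        // Sketch: merge several receive DataSources and force G.711 u-law RTP output.
        public class MixSketch {
            public static Processor buildMixProcessor(DataSource[] incoming) throws Exception {
                DataSource merged = Manager.createMergingDataSource(incoming);
                Processor p = Manager.createProcessor(merged);
                p.configure();
                waitForState(p, Processor.Configured);   // hypothetical blocking helper
                for (TrackControl t : p.getTrackControls()) {
                    t.setFormat(new AudioFormat(AudioFormat.ULAW_RTP, 8000, 8, 1));
                }
                p.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
                p.realize();
                waitForState(p, Processor.Realized);     // hypothetical blocking helper
                return p;  // p.getDataOutput() can then feed RTPManager.createSendStream(...)
            }
            private static void waitForState(Processor p, int state) { /* omitted */ }
        }

    Note that a MergingDataSource produces one track per input rather than a single mixed track, which matches the behaviour described above.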

  • RTP streaming

    Hi guys,
    what I need to do is stream YUV images between two JMF applications. I am using the VideoTransmit application (taken from this site) to transmit and JMStudio to receive. From a little research I have noted the following steps to transmit:
    1) Create the DataSource using Manager.createDataSource(MediaLocator)
    2) Create a Processor using the DataSource created above
    3) Create a player, etc. (sketched below)
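
    In code, those steps look roughly like this (a minimal sketch; the file URL is a placeholder and the state transitions are abbreviated):

        import javax.media.Manager;
        import javax.media.MediaLocator;
        import javax.media.Processor;
        import javax.media.protocol.DataSource;

        // Sketch of the three transmit steps.
        public class TransmitSketch {
            public static Processor build() throws Exception {
                MediaLocator src = new MediaLocator("file:///tmp/clip.mov"); // placeholder
                DataSource ds = Manager.createDataSource(src);               // step 1
                Processor proc = Manager.createProcessor(ds);                // step 2
                // step 3: configure proc, set a RAW_RTP ContentDescriptor, realize it,
                // then hand proc.getDataOutput() to a DataSink or a Player.
                return proc;
            }
        }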
    My question now is the following:
    How can I transmit a YUV file? How can I create a DataSource for this YUV file without causing problems for the subsequent creation of the Processor, etc.?
    Thanks in advance guys,
    Chris

    Ok, I think I know what was confusing me. Apparently setRate() isn't affecting the rate of the stream. The audio file I am streaming usually runs about five minutes, so I sped the rate up to 10. That sped up writing to a file significantly, but writing to an RTP stream still seems to take the full run time of the file. Does this sound right, or is there something else I'm missing?
    Thanks,
    Khanathor

  • No Inter-Cluster RTP Stream with Gatekeepers

    Hello,
    Firstly, I am no expert in Cisco telephony, as we have only recently migrated to a full Cisco solution, so apologies if I ask a fundamental question.
    Client A = Site A
    Client B = Site B
    Client C = Site C
    Site A and Site B are in 1 cluster
    Site C is in another cluster
    Intra-Cluster Traffic Works
    Client A -> Client B within the same cluster (across two sites with a low-latency link): the RTP stream comes up and the call functions as expected.
    Inter-Cluster Traffic Fails (GK to GK)
    Client C -> Client A works: the RTP stream comes up and the call functions as expected.
    Client A -> Client C connects, but there is no RTP stream.
    We are using G.711 across the board, and I have taken a Wireshark capture of a failed Client A -> Client C call.
    I have been going through this capture and noticed that when I filter on H.225 (for the gatekeepers) I see the following:
    CS: setup
    RAS: admissionRequest
    RAS: admissionConfirm
    CS: callProceeding
    CS: alerting
    CS: notify
    RAS: registrationRequest
    RAS: registrationConfirm
    CS: notify
    CS: connect
    CS: notify
    CS: releaseComplete
    RAS: disengageRequest  (DISCONECT_REASON=2,TIME=1321266127,DURATION=24,DISCONNECT_STRING=no resource,ORIGIN=0,LINE_NUMBER=GK,OUTBUND_GW_IP=..
    RAS: disengageConfirm
    There are firewalls in between, and these were the first thing I looked at, but I don't even see any RTP stream being attempted from the far side. Would anyone have any ideas where I could start looking?
    Thanks,
    Peter

    Pat,
    I can't speak to the UC540, but I ran into a situation recently where the SIP gateway was sending out the private extension of the phone instead of the full DID registered with the provider. The provider was then blocking the call.
    In our SIP debugs we saw the RDNIS information with the private extension, I believe. The error code we were getting back from the SP was 404 or something along those lines.
    I recommend you run some debugs and track where the call fails, compare the SNR call against a normal call in the debugs, and then, if you still get stuck, post your running configs and debugs back here.

  • How to extract data from a Buffer and create an RTP stream

    Hi
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    Is it possible to create an RTP stream having access only to the byte array "data" in the Buffer?
    Thanks in advance.

    camelstrike wrote:
    Hi
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    There are a couple of different ways to get the data. Reading it from inside a DataSink is perfectly fine...
    Is it possible to create an RTP stream having access only to the byte array "data" in the Buffer?
    Yes and no.
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/CustomPayload.html]
    You need to know the format of the media in addition to the actual media data...
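
    For the "yes" part, a small sketch of pulling the raw payload out of a Buffer (for instance from inside a DataSink, as suggested above):

        import javax.media.Buffer;

        // Sketch: copy the valid region of a JMF Buffer into a standalone byte array.
        public class BufferBytes {
            public static byte[] payloadOf(Buffer buffer) {
                byte[] data = (byte[]) buffer.getData();           // raw backing array
                byte[] payload = new byte[buffer.getLength()];
                System.arraycopy(data, buffer.getOffset(), payload, 0, buffer.getLength());
                return payload;                                    // only the valid bytes
            }
        }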

  • Photo stream questions, sharing, etc.

    Shared photo stream questions
    Camera Roll (CR), Photo Stream (PS)
    If I create a shared PS and share it with a friend, is this temporary like the main PS (most recent 1000 images)?
    For example, if I put one photo in a shared PS and then take 1000 new photos (without adding them to the shared PS), presumably the photo will disappear from my main PS, but will it also disappear from the shared PS?
    I noticed that each month I have a new monthly event titled, for example, "August photo stream", etc. Are these temporary, too? If I take 1000 photos in September, will the August PS event start to lose images, or will this only occur in the main PS section?
    Apple's web site says "iCloud stores new photos for 30 days, so you have plenty of time to connect your iOS device to Wi-Fi and make sure you always have your most recent shots handy." What does this mean? If I take a photo on iPhone it's permanently in CR (until I sync with iTunes on OS X and then iPhoto offers to remove them from CR), but they're also in PS, which is temporary, right? If so, as long as CR is permanent, how big an issue is the 30 day rule?
    Here's where I'm (further) confused...
    On iPad, my PS section (the tab on the bottom) only has 30 relatively recent photos, each taken with iPhone.
    However, on iPhone, my PS section has 547 images. Some of these were taken as screenshots on iPad (even though they're not in iPad's PS??). Some of the images were "Save As..." image files from OS X, and some were photos imported into desktop iPhoto from a camera. Many of the images were taken years ago. In desktop iPhoto I have over 3000 images, but the images in iPhone's PS do not appear to be the most recent 1000 images imported into iPhoto.
    So why did iPad's screenshots disappear from iPad's PS yet appear in iPhone's PS?
    Why aren't the two iOS devices' PS sections the same?
    Is this confusing to everyone, or am I just dense?
    Thanks!

    You are most likely using the same Apple ID under Settings > iCloud. If you just want to stop the photo sharing, you can open Settings > iCloud and turn off Photo Stream. If you want to keep using Photo Stream and still not share, you need to set up an alternate Apple ID to use under Settings > iCloud.

  • Writing a conference server for RTP streams

    Hello,
    I'm trying to write a conference server which accepts multiple RTP streams (one for each participant), creates a mixed RTP stream of all the other participants, and sends that stream back to each participant.
    For two participants, I was able to correctly receive and send the stream of the other participant to each party.
    For three participants, creating the merging data source does not seem to work, i.e. no data is received by the participants.
    I tried creating cloneable data sources instead, thinking that this might be the root cause, but when creating cloneable data sources from incoming RTP sources I am unable to get the Processor into the Configured state; it seems to deadlock. Here's the code outline:
        Iterator pIt = participants.iterator();
        List dataSources = new ArrayList();
        while (pIt.hasNext()) {
          Party p = (Party) pIt.next();
          if (p != dest) {
            DataSource ds = p.getDataSource();
            DataSource cds = Manager.createCloneableDataSource(ds);
            DataSource clone = ((SourceCloneable) cds).createClone();
            dataSources.add(clone);
          }
        }
        Object[] sources = dataSources.toArray(new DataSource[0]);
        DataSource dataSource = Manager.createMergingDataSource((DataSource[]) sources);
        Processor p = Manager.createProcessor(dataSource);
        MixControllerListener cl = new MixControllerListener();
        p.addControllerListener(cl);
        // Put the Processor into the Configured state.
        p.configure();
        if (!cl.waitForState(p, p.Configured)) {
            System.err.println("Failed to configure the processor.");
            assert false;
        }

    Here are a couple of stack traces:
    "RTPEventHandler" daemon prio=1 tid=0x081d6828 nid=0x3ea6 in Object.wait() [98246000..98247238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f37e4a8> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at demo.Mixer$MixControllerListener.waitForState(Mixer.java:248)
            - locked <0x9f37e4a8> (a java.lang.Object)
            at demo.Mixer.createMergedDataSource(Mixer.java:202)
            at demo.Mixer.createSendStreams(Mixer.java:165)
            at demo.Mixer.createSendStreamsWhenAllJoined(Mixer.java:157)
            - locked <0x9f481840> (a demo.Mixer)
            at demo.Mixer.update(Mixer.java:123)
            at com.sun.media.rtp.RTPEventHandler.processEvent(RTPEventHandler.java:62)
            at com.sun.media.rtp.RTPEventHandler.dispatchEvents(RTPEventHandler.java:96)
            at com.sun.media.rtp.RTPEventHandler.run(RTPEventHandler.java:115)
    "JMF thread: com.sun.media.ProcessEngine@a3c5b6[ com.sun.media.ProcessEngine@a3c5b6 ] ( configureThread)" daemon prio=1 tid=0x082fe3c8 nid=0x3ea6 in Object.wait() [977e0000..977e1238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f387560> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at com.sun.media.parser.RawBufferParser$FrameTrack.parse(RawBufferParser.java:247)
            - locked <0x9f387560> (a java.lang.Object)
            at com.sun.media.parser.RawBufferParser.getTracks(RawBufferParser.java:112)
            at com.sun.media.BasicSourceModule.doRealize(BasicSourceModule.java:180)
            at com.sun.media.PlaybackEngine.doConfigure1(PlaybackEngine.java:229)
            at com.sun.media.ProcessEngine.doConfigure(ProcessEngine.java:43)
            at com.sun.media.ConfigureWorkThread.process(BasicController.java:1370)
            at com.sun.media.StateTransitionWorkThread.run(BasicController.java:1339)
    "JMF thread" daemon prio=1 tid=0x080db410 nid=0x3ea6 in Object.wait() [97f41000..97f41238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Object.wait(Object.java:429)
            at com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave.run(CloneableSourceStreamAdapter.java:375)
            - locked <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Thread.run(Thread.java:534)

    Any ideas?
    Thanks,
    Jarek

    bgl,
    I was able to get past the cloning issue by following the Clone.java example to the letter :)
    It turns out that the cloneable data source must be added as a send stream first, and then the cloned data source. Now for each party in the call the conference server does the following:
        Party(RTPManager mgr, DataSource ds) {
          this.mgr = mgr;
          this.ds = Manager.createCloneableDataSource(ds);
        }

        synchronized DataSource cloneDataSource() {
          DataSource retVal;
          if (getNeedsCloning()) {
            retVal = ((SourceCloneable) ds).createClone();
          } else {
            retVal = ds;
            setNeedsCloning();
          }
          return retVal;
        }

        private void setNeedsCloning() {
          needsCloning = true;
        }

        private boolean getNeedsCloning() {
          return needsCloning;
        }

        private synchronized void addSendStreamFromNewParticipant(Party newOne)
            throws UnsupportedFormatException, IOException {
          debug("*** - New one joined. Creating the send streams. Curr count: " + participants.size());
          Iterator pIt = participants.iterator();
          while (pIt.hasNext()) {
            Party p = (Party) pIt.next();
            assert p != newOne;
            // update the existing participant with the new one's stream
            SendStream sendStream = p.getMgr().createSendStream(newOne.cloneDataSource(), 0);
            sendStream.start();
            // send data from the existing participant to the new one
            sendStream = newOne.getMgr().createSendStream(p.cloneDataSource(), 0);
            sendStream.start();
          }
          debug("*** - Done creating the streams.");
        }

    So I made some progress, but I'm still not quite there.
    The RTPManager JavaDoc for createSendStream states the following:
    * This method is used to create a sending stream within the RTP
    * session. For each time the call is made, a new sending stream
    * will be created. This stream will use the SDES items as entered
    * in the initialize() call for all its RTCP messages. Each stream
    * is sent out with a new SSRC (Synchronisation SouRCe
    * identifier), but from the same participant i.e. local
    * participant. <BR>
    For 3 participants, my conference server creates 2 send streams to each of them, so I'd expect 2 SSRCs on the wire. Examining the RTP packets in Ethereal, I only see 1 SSRC, as if the 2nd createSendStream call had failed. Consequently, each participant in the conference can receive voice from only 1 other participant, even though I create an RTPManager instance for each participant and add 2 send streams.
    Any ideas ?
    Thanks,
    Jarek

  • Can I play and save incoming RTP streams simultaneously

    Hi,
    I want to know whether I can play and save the incoming RTP stream simultaneously using JMF 2.1. The idea is that by saving it to a file, I can play the same file back at a later time.
    Is there any example code available?
    Thanks in Advance
    bye
    Srikanth

    I think that today, lamentably, this is impossible. I have tried it, and I have not been able to achieve it.
    If there is someone who has achieved it, please help us with this subject. It's very important.
    Thanks
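
    For what it's worth, the JMF pattern usually suggested for this is to clone the incoming DataSource and feed one copy to a Player and the other to a DataSink. A rough sketch (untested; in practice a Processor normally sits between the clone and the DataSink to pick a file format, and the output path is a placeholder):

        import javax.media.DataSink;
        import javax.media.Manager;
        import javax.media.MediaLocator;
        import javax.media.Player;
        import javax.media.protocol.DataSource;
        import javax.media.protocol.SourceCloneable;

        // Sketch: play and record one incoming RTP DataSource at the same time.
        public class PlayAndSave {
            public static void handle(DataSource incoming) throws Exception {
                DataSource cloneable = Manager.createCloneableDataSource(incoming);
                DataSource forFile = ((SourceCloneable) cloneable).createClone();
                Player player = Manager.createPlayer(cloneable);      // play one copy
                player.start();
                DataSink sink = Manager.createDataSink(forFile,
                        new MediaLocator("file:///tmp/capture.raw")); // placeholder path
                sink.open();
                sink.start();                                         // save the other copy
            }
        }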

  • How to synchronize audio and video rtp-streams

    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2), but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video RTP streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive the RTP streams, and then I create processors for the incoming streams' data sources; they are ALL PushBufferDataSources.
    Does this mean I can't synchronize them? If so, is there any way I can change them into Pull...DataSources?
    How are you supposed to sync audio and video, if not as above?
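
    For reference, step 2 from the guide translates to something like this (a sketch; it throws IncompatibleTimeBaseException precisely when a player cannot adopt a foreign time base, as with push sources):

        import javax.media.IncompatibleTimeBaseException;
        import javax.media.Player;

        // Sketch of the guide's step 2: drive the other players from one master time base.
        public class SyncSketch {
            public static void syncTo(Player master, Player[] others)
                    throws IncompatibleTimeBaseException {
                for (Player p : others) {
                    p.setTimeBase(master.getTimeBase());
                }
            }
        }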

    camelstrike wrote:
    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2), but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video RTP streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive the RTP streams, and then I create processors for the incoming streams' data sources; they are ALL PushBufferDataSources.
    Does this mean I can't synchronize them? If so, is there any way I can change them into Pull...DataSources?
    The RTP packets are timestamped when they leave, and they are played in order. You can't change the time base on an RTP stream because, for all intents and purposes, it's "live" data. It plays at the speed the transmitting RTP server wants it to play, and it never gets behind because it drops out-of-order and old data packets.
    If your RTP streams aren't synced correctly on the receiving end, it means they aren't being synced on the transmitting end, so you should look into the other side of the equation.

  • To retransmit the rtp streams received

    Hi!
    Program A transmits media data to another program, B, using RTPSessionMgr. What B has to do is send the received streams to another receiver, program C, which has to play the stream. That is, I am trying to implement a client-router-server model.
    So I maintained a session between A & B and also between B & C. Once a NewReceiveStream event occurs at B, it retrieves the DataSource from the stream, uses the createSendStream() method of the session manager, and sends it to C.
    My problem is that C receives the audio and video streams but never plays them. I get a pink screen for video and no audio.
    Can somebody help me point out my mistake, or suggest a new strategy to implement this?
    Actually, I have tried using only datagram sockets on the router side, i.e. I created two sockets for B to receive data and forwarded it to C using two other sockets, and this did work. But then the client does not know the sender details, and I need to maintain the sender and receiver reports. So I went for the session manager (RTPManager/RTPSessionMgr).
    Kindly help.
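
    For context, the forwarding step described above usually looks something like the sketch below (assuming mgrToC is an RTPManager already initialized for the B-to-C session; this mirrors the description, it is not a verified fix for the pink-screen problem):

        import javax.media.protocol.DataSource;
        import javax.media.rtp.RTPManager;
        import javax.media.rtp.ReceiveStream;
        import javax.media.rtp.ReceiveStreamListener;
        import javax.media.rtp.SendStream;
        import javax.media.rtp.event.NewReceiveStreamEvent;
        import javax.media.rtp.event.ReceiveStreamEvent;

        // Sketch: on B, forward each newly received stream into the session toward C.
        public class Relay implements ReceiveStreamListener {
            private final RTPManager mgrToC;  // assumed: already initialized toward C

            public Relay(RTPManager mgrToC) {
                this.mgrToC = mgrToC;
            }

            public synchronized void update(ReceiveStreamEvent evt) {
                if (evt instanceof NewReceiveStreamEvent) {
                    try {
                        ReceiveStream in = evt.getReceiveStream();
                        DataSource ds = in.getDataSource();              // RTP-wrapped source
                        SendStream out = mgrToC.createSendStream(ds, 0); // first stream index
                        out.start();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        }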

    Hi all!
    Nice to meet you. It is very exciting that I got a project integrating JMF and JXTA: it takes RTP streams from the JMF framework and sends them through the JXTA framework to peers in certain groups for visualization. Some aspects of it are similar to yours; would you mind adding me to your MSN contact list? (My MSN account is: [email protected])

  • How can I manage controls in an RTP Streaming

    Hello,
    I'm currently using RTP streaming to play some music. I started with AVReceive2 and AVTransmit2 from Sun, and by adding a plugin I managed to play MP3. So far so good.
    Because I didn't want to use the view components provided by Sun, I created my own GUI for my player and use it instead of the one in Sun's examples.
    I use listeners for my "buttons" (I prefer using MouseListener and Panel, as in Sun's Jamp example), but I can't find a way to make those buttons take effect.
    If I press pause, it has to pause the streaming in AVTransmit2.
    If I press next or previous, it has to change the process in AVTransmit2.
    Each control has to be handled in AVTransmit2, but my player is part of AVReceive2.
    My first guess was to check whether I could use an event to tell my AVTransmit2 object to execute my controls, but I haven't found a way to tell my AVTransmit2 object which button was pressed.
    I eventually gave up on transmitting my orders through events.
    I then tried to find a way to obtain my AVTransmit2 object inside my AVReceive2 object so that I could call methods on it, but I failed :(
    How can I manage my controls so that they call methods on my server and not on my client?
    Thanks :)
    Shad.
    Edited by: Shadwolf on Feb 9, 2010 2:15 AM

    Hi, again,
    For the next & previous buttons, I was thinking of something.
    Do you think it would be good to create a new class, RTPClientManager, which extends RTPManager and holds two booleans, previous & next, that are set to true when I press the buttons?
    From there, couldn't I modify my function in AVTransmit2 like this (I handle ReceiveStreamEvent in AVTransmit2):
        @Override
        public void update(ReceiveStreamEvent evt) {
            RTPClientManager mgr = (RTPClientManager) evt.getSource();
            Participant participant = evt.getParticipant();   // could be null.
            ReceiveStream stream = evt.getReceiveStream();    // could be null.
            // Detect the client closing the connection so the streaming transmission can be shut down.
            if (evt instanceof ByeEvent) {
                System.err.println("  - Got \"bye\" from: " + participant.getCNAME());
                if (mgr.isNext()) {
                    this.stop();
                    /* SOME STUFF FOR NEXT */
                } else if (mgr.isPrevious()) {
                    this.stop();
                    /* SOME STUFF FOR PREVIOUS */
                } else { // means it's the stop button that was pressed
                    this.stop();
                }
            }
        }

    Would it work?
    If you have better ideas, I am ready to hear them :P
    But if this works I will be able to manage stop, next and previous; how, though, could I manage the play & pause buttons?
    I can't find a proper solution for managing my controls :(

  • Receiving Video RTP Stream (JMF) in JME ( MMAPI ) - URGENT !!!

    Hi Folks...
    I'm trying to develop an application that sends the images from a web cam connected to the computer to a PDA, so that the images can be viewed by the user...
    My code for the JMF RTP video stream is as follows:
        Processor proc = null;
        javax.media.protocol.DataSource ds = null;
        TrackControl[] tc = null;
        int y;
        boolean encodingOk = false;
        Vector<javax.media.protocol.DataSource> datasources = new Vector<javax.media.protocol.DataSource>();
        for (int x = 0; x < camerasInfo.length; x++) {
            try {
                proc = Manager.createProcessor(camerasInfo[x].getLocator());
            } catch (NoProcessorException e) {
                System.out.println("Error instantiating PROCESSOR: " + e);
            } catch (IOException e) {
                System.out.println("Error instantiating PROCESSOR: " + e);
            }
            proc.configure();
            try {
                Thread.sleep(2000);
            } catch (InterruptedException e1) {
                e1.printStackTrace();
            }
            proc.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
            tc = proc.getTrackControls();
            for (y = 0; y < tc.length; y++) {
                if (!encodingOk && tc[y] instanceof FormatControl) {
                    if (((FormatControl) tc[y]).setFormat(new VideoFormat(VideoFormat.RGB)) != null) {
                        tc[y].setEnabled(true);
                    } else {
                        encodingOk = true;
                    }
                } else {
                    tc[y].setEnabled(false);
                }
            }
            proc.realize();
            try {
                Thread.sleep(2000);
            } catch (InterruptedException e1) {
                e1.printStackTrace();
            }
            try {
                ds = proc.getDataOutput();
            } catch (NotRealizedError e) {
                System.out.println("Error getting the datasource: " + e);
            } catch (ClassCastException e) {
                System.out.println("Error getting the datasource: " + e);
            }
            datasources.add(ds);
            System.out.println(ds.getLocator());
            encodingOk = false;
        }
        MediaLocator ml = new MediaLocator("rtp://10.1.1.100:99/video");
        try {
            DataSink datSink = Manager.createDataSink(ds, ml);
            datSink.open();
            datSink.start();
        } catch (NoDataSinkException e) {
            System.out.println("Error instantiating DataSink: " + e);
        } catch (SecurityException e) {
            System.out.println("Error starting DataSink: " + e);
        } catch (IOException e) {
            System.out.println("Error starting DataSink: " + e);
        }

    I'm not sure whether this code is correct... is it?
    So... the next part of the system runs on the PDA.
    The code that accesses this RTP stream is as follows:
        VideoControl c = null;
        try {
            player = Manager.createPlayer("rtp://10.1.1.100:99/video");
            c = (VideoControl) player.getControl("VideoControl");
            tela = (Item) c.initDisplayMode(GUIControl.USE_GUI_PRIMITIVE, null);
            player.start();
            append(tela);
        } catch (IOException e) {
            str.setText(e.toString());
            append(str);
        } catch (MediaException e) {
            str.setText(e.toString());
            append(str);
        }

    So when the app tries to create a Player for "rtp://10.1.1.100:99/video", a MediaException is thrown:
    javax.microedition.media.MediaException: Unable to create Player for the locator: rtp://10.1.1.100:99/video
    So... I don't know what is happening =/
    Is the error in the PDA module, or in the computer's initialization of the RTP video streaming?
    I need to finish this job by next week... so any help is useful.
    Waiting for answers
    Rodrigo Kerkhoff

    First of all: before going on to the J2ME part, make sure the server works before doing anything else! Apparently, it doesn't...
    The MediaLocator is generally used to specify a kind of URL indicating where the data put into the DataSink should go. In your case, it cannot be just some URL that you want to act as an RTSP server. You'll need to implement that server yourself, I guess.
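
    For reference, the rtp:// locator handed to Manager.createDataSink names the receiver's address and port, so the transmit side has to point at the PDA rather than at itself. A sketch (assuming, hypothetically, that 10.1.1.200 is the PDA's address and that it listens on port 22224):

        import javax.media.DataSink;
        import javax.media.Manager;
        import javax.media.MediaLocator;
        import javax.media.Processor;
        import javax.media.protocol.DataSource;

        // Sketch: send a realized processor's RAW_RTP output toward the receiver.
        public class SendToPda {
            public static DataSink transmit(Processor proc) throws Exception {
                DataSource output = proc.getDataOutput();
                MediaLocator target = new MediaLocator("rtp://10.1.1.200:22224/video"); // placeholder
                DataSink sink = Manager.createDataSink(output, target);
                sink.open();
                sink.start();
                return sink;
            }
        }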

  • Adding Effect on an incoming H.263/RTP stream

    Hi,
    I am playing an H.263/RTP stream which is multicast over a network. I want to play the stream with rotation or some other effect added to it. I checked the following link for help (as the forums suggested): http://java.sun.com/products/java-media/jmf/2.1.1/solutions/RotationEffect.html. But the link is disabled, and the new link they provide does not contain the example.
    Can anyone help me with how to add an effect on H.263/RTP?
    I have already created my effect class and tried adding it, but I could not add the effect.
    Kindly give me some directions.

    Is there some error in the above code, or is there some other issue?
    There's an error in the above code, because you're assuming I'm being imprecise with my language. I am not.
    You must copy the data from the input buffer to the output buffer...
    Buffer objects are wrappers around byte[], and you cannot copy data from one array to another by modifying the object references, which is what your code does.
    public int process(Buffer inBuffer, Buffer outBuffer) {
        // Make sure the output buffer will hold data
        if (!(outBuffer.getData() instanceof byte[])) {
            outBuffer.setData(new byte[inBuffer.getLength()]);
            outBuffer.setLength(inBuffer.getLength());
            outBuffer.setOffset(0);
        }
        // Make sure the output buffer is long enough
        if (outBuffer.getLength() + outBuffer.getOffset() < inBuffer.getLength()) {
            int offset = outBuffer.getOffset();
            int length = inBuffer.getLength() + outBuffer.getOffset();
            byte[] oldData = (byte[]) outBuffer.getData();
            byte[] newData = new byte[length];
            // Copy the previous data
            for (int i = 0; i < offset; i++) {
                newData[i] = oldData[i];
            }
            // Set the variables for the output buffer
            outBuffer.setData(newData);
            outBuffer.setLength(length);
            outBuffer.setOffset(offset);
        }
        // Copy the data from in to out
        int j = outBuffer.getOffset();
        for (int i = inBuffer.getOffset(); i < inBuffer.getOffset() + inBuffer.getLength(); i++) {
            ((byte[]) outBuffer.getData())[j++] = ((byte[]) inBuffer.getData())[i];
        }
        return BUFFER_PROCESSED_OK;
    }
    That's untested code above; fix my dumb mistakes as necessary.

  • RTP stream being stripped

    How can I check whether 802.1Q VLAN tagging information is being stripped from the RTP stream by the workstation's NIC?

    Some network cards strip the dot1q headers, so your packet capture won't have this data. If you're not seeing dot1q headers, please check the following link from the Wireshark wiki:
    http://wiki.wireshark.org/CaptureSetup/VLAN#head-81781716144f2855ab0aff2f8b752e95f2562efb
    Apply the previous settings to the machine you're capturing with.
