RTP stream to 7920 phone

I have a JTAPI app that mimics a walkie-talkie system. Pretty basic stuff going on.
I'm using the multicast custom softkeys to transmit and receive audio.
So the Cisco docs say that the 7920s can "only support one incoming and one outgoing unicast stream."
However, I'm able to transmit multicast fine from the 7920; it's just that I'm not receiving anything.
Here are my two CiscoTerminal.sendData() calls that are pushed to a phone when it joins a walkie-talkie channel:
CT_.sendData("<CiscoIPPhoneExecute>" +
"<ExecuteItem URL=\'RTPMRx:" + SHOUTIP + ":" + (Integer.parseInt(SHOUTPORT)+channel_*2) + "\'/>" +
"</CiscoIPPhoneExecute>");
CT_.sendData("<CiscoIPPhoneText>" +
"<Title>" + channelname + "</Title>" + "<Prompt></Prompt>" +
"<Text>Hold down the [Talk] key to talk.\nPress the [Stop] key to exit the channel.</Text>" +
"<SoftKeyItem>" +
"<Name>Talk</Name>" +
"<URL>RTPTx:Stop</URL>" +
"<URLDown>RTPMTx:" + SHOUTIP + ":" + (Integer.parseInt(SHOUTPORT)+channel_*2) + "</URLDown>" +
"<Position>1</Position>" + "</SoftKeyItem>" +
"<SoftKeyItem>" +
"<Name>Stop</Name>" +
"<URL>http://" + server_ + ":8080" + path_ + "?CHANNEL=" + channel_ + "&TASK=1&DEVICE=" + SEP_ + "</URL>" +
"<Position>2</Position>" +
"</SoftKeyItem>" +
"</CiscoIPPhoneText>");
Is there some trick to receiving RTP on the 7920? I discovered that you have to wrap the CiscoIPPhoneExecute URL in single quotes instead of double quotes, or the 7920 won't do anything with it. Is there something else I need to call to turn on the earpiece?
I'm seeing the animated RTP-receive icon on the phone when it joins the walkie-talkie channel, and like I said, I can hold down the "Talk" softkey and hear audio on my desk IP phones.
Additional info: the SHOUTIP variable is 230.20.20.21 and the SHOUTPORT variable is 20480. My channel numbers run linearly from 0 on up; you'll notice I multiply them by 2 to stay on even ports for broadcasting.
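To make the scheme concrete, here's a small sketch of how the receive push is assembled (the method and parameter names are illustrative, not my actual code):

static String buildReceivePayload(String shoutIp, int shoutPort, int channel) {
    // Even RTP port per channel: 20480, 20482, 20484, ...
    int port = shoutPort + channel * 2;
    // Note the single quotes around the URL value; the 7920 ignores the push otherwise
    return "<CiscoIPPhoneExecute>" +
           "<ExecuteItem URL='RTPMRx:" + shoutIp + ":" + port + "'/>" +
           "</CiscoIPPhoneExecute>";
}

// e.g. buildReceivePayload("230.20.20.21", 20480, 3) yields
// <CiscoIPPhoneExecute><ExecuteItem URL='RTPMRx:230.20.20.21:20486'/></CiscoIPPhoneExecute>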
Any help is appreciated,
DB

Pshh...never mind. I updated the firmware this morning from 3.0 to 3.01, and it works now.
Curiously, we have Syn-Apps SA-Announce, and it had no problem doing paging and walkie-talkie to the 7920 before this morning's firmware upgrade...so more power to them for figuring out the pre-3.01 trick to multicasting to the 7920s.
Coincidentally, this whole thing came about because we sold an SA-Announce server to a company that requires paging and a service that works EXACTLY like a walkie-talkie system. Since the Syn-Apps walkie-talkie requires one person to always be in a phone call to keep it active, and can't rejoin members once they've left, I had to write this app to go along with it.
I'm going to donate the code to you guys for those who are curious. It was written in two days, including fighting against stupid issues, so it's not well documented and it hasn't been optimized at all.
I'm zipping up the entire NetBeans project folder.
There's a WT.war file in the dist subfolder, but our CallManager and push user/pass are hardcoded in, so you'll need to update those and rebuild.
The services URL to use is:
http://your-servlet-server:8080/WT/servlet?DEVICE=#DEVICENAME#
This code provided as-is blah blah blah...
DB

Similar Messages

  • Mixing audio RTP streams into a conference

    I'm trying to create a telephone conference application with JMF.
    My problem:
    I need to mix multiple incoming RTP streams (audio/G711_ulaw) together into one outgoing RTP stream.
    If I use a MergingDataSource I can get one DataSource with multiple tracks, but I'm not able to send them out together over one RTP session.
    Is there any way to do this?
    I have also thought about using RawBufferMux or WAVMux to multiplex the tracks included in the MergingDataSource, but I can't find out how to use them.
    Or is there an easier way to create a conference over RTP?
    I need to connect to Cisco IP phones, so I can only use the G711_ulaw codec.
    Is there anyone out there who can help me?

    I'm sorry, but I ran into this problem in the past and found it impossible to resolve.
    I also thought about MergingDataSource, about a Many2One class, and more, but found nothing I could use. And nobody could answer this question in this forum half a year ago.
    Oh, I nearly forgot: it is possible to write merged streams to a file, so it seems possible to write them to a file and then transmit to the net from that file. But how do you realize more than one Processor?
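    For anyone pursuing the write-to-file idea above, a minimal sketch might look like the following (untested; the ProcessorModel route is used so JMF handles the configure/realize states itself, and the file path is a placeholder):

    import javax.media.*;
    import javax.media.format.FileTypeDescriptor;
    import javax.media.protocol.DataSource;

    static void mergeToWav(DataSource[] incomingSources) throws Exception {
        // Merge the incoming audio DataSources into one multi-track source
        DataSource merged = Manager.createMergingDataSource(incomingSources);
        // Let JMF drive the configure/realize state machine for us
        ProcessorModel model = new ProcessorModel(
                merged, null, new FileTypeDescriptor(FileTypeDescriptor.WAVE));
        Processor p = Manager.createRealizedProcessor(model);
        // Write the processor's output to a file
        DataSink sink = Manager.createDataSink(p.getDataOutput(),
                                               new MediaLocator("file:/C:/merged.wav"));
        sink.open();
        sink.start();
        p.start();
    }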

  • RTP for Cisco IP Phones

    Hello friends!
    I'm developing an application for Cisco IP Phones that works with JMF. Since there are some programmers in this forum who know Cisco IP phones, I'm sending my question here.
    I successfully sent the XML message (CiscoIPPhoneExecute) to the phone, I customized the packet size to 160, etc. Everything apparently works, but the audio stream never reaches the phone. Can anyone tell me why?
    Below is my source code.
    Thanks!
    Max.
    import java.io.*;
    import java.net.*;
    import java.util.*;
    import org.apache.xerces.impl.dv.util.Base64;
    import javax.media.*;
    import javax.media.control.*;
    import javax.media.format.*;
    import javax.media.protocol.*;

    /** @author Administrator */
    public class MMHTTPPost {

        /** Creates a new instance of MMHTTPPost */
        public MMHTTPPost() {
            try {
                // Send the CiscoIPPhoneExecute to the phone
                String xml = "<CiscoIPPhoneExecute>" +
                             "<ExecuteItem Priority=\"0\" URL=\"RTPRx:Stop\"/>" +
                             "<ExecuteItem Priority=\"1\" URL=\"RTPRx:10.1.15.10:23480\"/>" +
                             "</CiscoIPPhoneExecute>";
                String userId   = "Max",
                       password = "12345",
                       params   = "XML=" + URLEncoder.encode(xml, "ISO-8859-1");
                byte[] bytes = params.getBytes();

                // Create a URL pointing to the phone's Execute CGI and open an HttpURLConnection on that URL
                URL url = new URL("http://10.1.15.56/CGI/Execute");
                HttpURLConnection con = (HttpURLConnection) url.openConnection();

                // Indicate that you will be doing input and output, that the method
                // is POST, and that the content length is the length of the byte array
                con.setDoOutput(true);
                con.setDoInput(true);
                con.setRequestMethod("POST");
                con.setRequestProperty("Content-length", String.valueOf(bytes.length));

                // Create the Basic Auth header
                String basicAuth = Base64.encode((userId + ":" + password).getBytes());
                System.out.println("Encoded: " + basicAuth);
                con.setRequestProperty("Authorization", "Basic " + basicAuth.trim());

                // Write the parameters to the URL output stream
                OutputStream output = con.getOutputStream();
                output.write(bytes);
                output.flush();

                // Read the response
                BufferedReader input = new BufferedReader(new InputStreamReader(con.getInputStream()));
                while (true) {
                    String line = input.readLine();
                    if (line == null)
                        break;
                    System.out.println(line);
                }
                input.close();
                output.close();
                con.disconnect();

                // The RTP/JMF part starts here.
                // Create a Processor for the selected file. Exit if the
                // Processor cannot be created.
                Processor processor = null;
                try {
                    String mediaUrl = "file:\\C:\\pruebas execute\\spacemusic.au";
                    processor = Manager.createProcessor(new MediaLocator(mediaUrl));
                } catch (IOException e) {
                    System.out.println("Exception occurred (1a): " + e);
                } catch (NoProcessorException e) {
                    System.out.println("Exception occurred (1b): " + e);
                }

                // Configure the processor and block until it reaches the Configured state
                boolean result = waitForState(processor, Processor.Configured);
                if (result == false)
                    System.out.println("Could not configure the processor...");

                TrackControl track[] = processor.getTrackControls();
                boolean encodingOk = false;

                // Go through the tracks and try to program one of them to output ulaw data
                for (int i = 0; i < track.length; i++) {
                    if (!encodingOk && track[i] instanceof FormatControl) {
                        if (((FormatControl) track[i]).setFormat(
                                new AudioFormat(AudioFormat.ULAW_RTP, 8000, 8, 1)) == null) {
                            track[i].setEnabled(false);
                        } else {
                            encodingOk = true;
                        }
                    } else {
                        // we could not set this track to ulaw, so disable it
                        track[i].setEnabled(false);
                    }

                    // Here the packet size is changed to 160
                    if (track[i].isEnabled()) {
                        try {
                            System.out.println("Changing the codec chain...");
                            Codec codec[] = new Codec[3];
                            codec[0] = new com.ibm.media.codec.audio.rc.RCModule();
                            codec[1] = new com.ibm.media.codec.audio.ulaw.JavaEncoder();
                            codec[2] = new com.sun.media.codec.audio.ulaw.Packetizer();
                            ((com.sun.media.codec.audio.ulaw.Packetizer) codec[2]).setPacketSize(160);
                            track[i].setCodecChain(codec);
                        } catch (Exception e) {
                            System.out.println("Error changing the packet size: " + e);
                        }
                    }
                }
                System.out.println("Encoding ok?: " + encodingOk);

                // At this point, we have determined whether we can send out ulaw data or not
                if (encodingOk) {
                    // Realize the processor and block until it reaches the Realized state
                    result = waitForState(processor, Processor.Realized);
                    if (result == false)
                        System.out.println("Could not realize the processor...");

                    // Get the output DataSource of the processor and exit if we fail
                    DataSource ds = null;
                    try {
                        ds = processor.getDataOutput();
                    } catch (NotRealizedError e) {
                        System.out.println("Exception occurred (2): " + e);
                    }

                    // Hand this DataSource to the Manager for creating an RTP
                    // DataSink; our RTP DataSink will multicast the audio
                    try {
                        String mediaUrl = "rtp://10.1.15.56:23480/audio/1";
                        MediaLocator m = new MediaLocator(mediaUrl);
                        DataSink d = Manager.createDataSink(ds, m);
                        d.open();
                        d.start();
                    } catch (Exception e) {
                        System.out.println("Exception occurred (3): " + e);
                    }
                }
            } catch (MalformedURLException murlex) {
                System.out.println(murlex);
            } catch (IOException ioex) {
                System.out.println(ioex);
            }
        }

        /** @param args the command line arguments */
        public static void main(String[] args) {
            new MMHTTPPost();
        }

        // Convenience members to handle the processor's state changes
        private Integer stateLock = new Integer(0);
        private boolean failed = false;

        Integer getStateLock() {
            return stateLock;
        }

        void setFailed() {
            failed = true;
        }

        private synchronized boolean waitForState(Processor p, int state) {
            p.addControllerListener(new StateListener());
            failed = false;
            // Call the required method on the processor
            if (state == Processor.Configured) {
                p.configure();
            } else if (state == Processor.Realized) {
                p.realize();
            }
            // Wait until we get an event that confirms the success of the
            // method, or a failure event. See the StateListener inner class.
            while (p.getState() < state && !failed) {
                synchronized (getStateLock()) {
                    try {
                        getStateLock().wait();
                    } catch (InterruptedException ie) {
                        return false;
                    }
                }
            }
            if (failed)
                return false;
            else
                return true;
        }

        // Inner class
        class StateListener implements ControllerListener {
            public void controllerUpdate(ControllerEvent ce) {
                // If there was an error during configure or
                // realize, the processor will be closed
                if (ce instanceof ControllerClosedEvent)
                    setFailed();
                // On every controller event, send a notification
                // to the waiting thread in waitForState
                if (ce instanceof ControllerEvent) {
                    synchronized (getStateLock()) {
                        getStateLock().notifyAll();
                    }
                }
            }
        }
    }

    Ignore my last post... I didn't properly read the source code.
    Anyway, I got G.711 streaming to work. I started out with the AVTransmit2 sample, but set the audio output format to the one used in the source code posted here. I also applied the same codec chain described above, and that was it.
    I could never get streaming to work using the first RTP streaming method described in the JMF Programmer's Guide (which is the method used in the source posted above), but using the RTPManager, things work just fine.
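    For reference, the RTPManager-based transmit path mentioned above looks roughly like this (a sketch; the addresses are placeholders, and the DataSource is assumed to be the processor's ULAW/RTP output):

    import java.net.InetAddress;
    import javax.media.protocol.DataSource;
    import javax.media.rtp.*;

    static void transmit(DataSource ulawRtpOutput) throws Exception {
        RTPManager mgr = RTPManager.newInstance();
        // Local and remote session addresses (RTP on even ports by convention)
        SessionAddress local  = new SessionAddress(InetAddress.getLocalHost(), 22222);
        SessionAddress target = new SessionAddress(InetAddress.getByName("10.1.15.56"), 23480);
        mgr.initialize(local);
        mgr.addTarget(target);
        // Send stream index 0 of the processor's output
        SendStream stream = mgr.createSendStream(ulawRtpOutput, 0);
        stream.start();
    }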

  • Attempting to write RTP stream to wave

    I have been working long and hard trying to figure out how to save an RTP stream from a Cisco phone to a wave file. I need the wave file to be in ULAW, 8 kHz, 8-bit mono format.
    Through much research this is the code I came up with, but it does not work. It states that I am trying to get a DataSink for null when I try to create the file-writer DataSink.
    I have also gotten the following error message with other versions of the code below:
    newReceiveStreamEvent exception Cannot find a DataSink for: com.sun.media.protocol.rtp.DataSource@3680c1
    When I get the RTP stream, the following is reported as the format of the input stream:
    - Received new RTP stream: ULAW/rtp, 8000.0 Hz, 8-bit, Mono
    else if (evt instanceof NewReceiveStreamEvent) {
        try {
            stream = ((NewReceiveStreamEvent) evt).getReceiveStream();

            Format formats[] = new Format[1];
            formats[0] = new AudioFormat(AudioFormat.LINEAR, 8000, 8, 1);
            FileTypeDescriptor outputType = new FileTypeDescriptor(FileTypeDescriptor.WAVE);
            DataSource ds = stream.getDataSource();

            // Find out the formats.
            RTPControl ctl = (RTPControl) ds.getControl("javax.media.rtp.RTPControl");
            if (ctl != null) {
                System.err.println(" - Received new RTP stream: " + ctl.getFormat());
            } else {
                System.err.println(" - Received new RTP stream");
            }

            if (participant == null) {
                System.err.println(" The sender of this stream has yet to be identified.");
            } else {
                System.err.println(" The stream comes from: " + participant.getCNAME());
            }

            ProcessorModel processorModel = new ProcessorModel(ds, formats, null);
            Processor processor = Manager.createRealizedProcessor(processorModel);
            processor.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));

            MediaLocator f = new MediaLocator("file:/C:/output.wav");
            DataSink filewriter = null;
            DataSource tempds = processor.getDataOutput();
            filewriter = Manager.createDataSink(tempds, f);
            filewriter.open();

    The content descriptor is designed to specify the output format, not the input format. The input format is specified by the DataSource automatically, and you set the content descriptor to whatever you want the output format to be. In this case, you'd want to set it to a WAV file.
    You'd then need to go through all of the Track objects on the processor and set their output format to the ULAW specification you want.
    [http://www.cs.odu.edu/~cs778/spring04/lectures/jmfsolutions/examplesindex.html]
    There's actually an example program there that exports RTP streams; you can probably use that code without modification to fit your needs.
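    Putting that advice together, a corrected outline might look like this (a sketch, not tested against your stream; ProcessorModel is used so the configure/realize states are handled for you):

    import javax.media.*;
    import javax.media.format.AudioFormat;
    import javax.media.format.FileTypeDescriptor;
    import javax.media.protocol.DataSource;

    static void saveToWav(DataSource ds) throws Exception {
        // Output track format: ULAW, 8 kHz, 8-bit mono; output container: WAVE
        Format[] formats = { new AudioFormat(AudioFormat.ULAW, 8000, 8, 1) };
        ProcessorModel model = new ProcessorModel(
                ds, formats, new FileTypeDescriptor(FileTypeDescriptor.WAVE));
        Processor processor = Manager.createRealizedProcessor(model);

        // The processor's output is now WAV data, which the file DataSink accepts
        DataSink filewriter = Manager.createDataSink(
                processor.getDataOutput(), new MediaLocator("file:/C:/output.wav"));
        filewriter.open();
        filewriter.start();
        processor.start();
    }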

  • No Inter-Cluster RTP Stream with Gatekeepers

    Hello,
    Firstly, I am no expert in Cisco telephony, as we have only recently migrated to a full Cisco solution, so apologies if I ask a fundamental question.
    Client A = Site A
    Client B = Site B
    Client C = Site C
    Site A and Site B are in 1 cluster
    Site C is in another cluster
    Intra-Cluster Traffic Works
    Client A -> Client B within the same cluster (across 2 sites with a low-latency link): the RTP stream comes up and the call functions as expected.
    Inter-Cluster Traffic Fails (GK to GK)
    Client C -> Client A works; the RTP stream comes up and the call functions as expected.
    Client A -> Client C connects, but there is no RTP stream.
    We are using G711 across the board, and I have taken a Wireshark capture of a failed Client A -> Client C call.
    I have been going through this capture and noticed that when I filter on H225 (for the gatekeepers) I see the following:
    CS: setup
    RAS: admissionRequest
    RAS: admissionConfirm
    CS: callProceeding
    CS: alerting
    CS: notify
    RAS: registrationRequest
    RAS: registrationConfirm
    CS: notify
    CS: connect
    CS: notify
    CS: releaseComplete
    RAS: disengageRequest  (DISCONECT_REASON=2,TIME=1321266127,DURATION=24,DISCONNECT_STRING=no resource,ORIGIN=0,LINE_NUMBER=GK,OUTBUND_GW_IP=..
    RAS: disengageConfirm
    There are firewalls in between, and these were the first thing I looked at, but I don't even see any RTP stream trying to be initiated from the far side. Would anyone have any ideas where I could start looking?
    Thanks,
    Peter

    Pat,
    I cannot speak to the UC540, but I ran into a situation recently where the SIP gateway was sending out the private extension of the phone instead of the full DID registered with the provider. The provider was then blocking the call.
    In our SIP debugs we saw the RDNIS information of the private extension, I believe. The error code we were getting back from the SP was 404 or something along those lines.
    I recommend you run some debugs and track where the call fails, compare the SNR call versus a normal call in the debugs, and then, if you still get stuck, post your running configs and debugs back here.

  • RTP stream question

    Hello community, I have a question.
    The setup is this: a Cisco SCCP phone is located in Russia and registered to a CallManager located in Denmark. The phone dials a number that is translated and routed via a route list to a local gateway in Russia. In the local Cisco gateway, the dialed number matches a dial peer with a SIP trunk to a Lync mediation server, also located in Denmark.
    How is the final RTP stream going to flow from the Cisco phone to the Lync server?
    Br. Kasper

    Hi. It actually all comes down to the reporting in Lync. The issue is that Lync reports "bad" conferences per SIP trunk. I want to make a more detailed design where I place a SIP trunk in every affiliate around Europe in its local gateway, but as the phones are registered to the CUCM cluster in Denmark and the Lync mediation server (the SIP trunk connection to the local gateway) is also located in Denmark, I need a redesign of the RTP path, including the MTP. That is why I want to put the MTP resource in the local affiliate gateway (in this example, Russia). By doing this, the RTP path would be phone -> local MTP -> local VGW SIP trunk -> Lync mediation in Denmark. I just needed a second opinion on the design.
    The current setup would report all "bad" conferences as being in Denmark, even if they originated in other countries. The local dial pattern is just call-forwarded to the DK SIP trunk -> Lync mediation. The SIP trunks in CUCM are limited to only one connection to the Lync mediation server.

  • How can I use Shoutcast streams on Windows Phone?

    I have an idea for an application, but I don't know how to use Shoutcast streams on Windows Phone. Is there an API or something that I need to follow?
    Alican

    At the moment, Shoutcast audio streams are not supported natively on Windows Phone. You will need to write your own implementation of IMediaStreamSource to parse the audio stream and also handle the HTTP connections to the streaming server.
    Another solution is to use a third-party implementation such as www.jupitersdk.com.
    It offers a well-documented API, is easy to integrate into your project, and there is a trial version for testing purposes.
    Jupiter Sdk has successfully been used in several radio-streaming apps which can be found in the Windows Phone Store. By the way, it allows you not only to play Shoutcast and Icecast audio streams on Windows Phone 8.1, but also to retrieve the stream metadata (i.e. current title and artist), among other interesting features.
    Disclaimer: I am a member of the Jupiter Sdk development team. For information and support you can use the contact form at www.jupitersdk.com.

  • My wife and I both have the new iPhones. Is there a way to set up two separate iCloud accounts but have Photo Stream from both phones sync with one family Mac? We don't want to share contacts or anything else, just photos... Thanks in advance


    If you turned off Contacts, wouldn't that mean that the Contacts would no longer be backed up to iCloud as well? That would make it a pain when upgrading a phone, since you couldn't easily pull contacts back down.

  • How to extract data from Buffer and create an RTP stream

    Hi
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and, on the receiving side, reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    Is it possible to create an RTP stream having access only to the byte array "data" in a Buffer?
    Thanks in advance.

    camelstrike wrote:
    Hi
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and, on the receiving side, reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    There are a couple of different ways to get the data. Reading it from inside a DataSink is perfectly fine...
    Is it possible to create an RTP stream having access only to the byte array "data" in a Buffer?
    Yes and no.
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/CustomPayload.html]
    You need to know the format of the media in addition to the actual media data...
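    To give a feel for the "yes" part, here is a bare-bones sketch of wrapping raw byte chunks in a PushBufferDataSource so a Processor or RTPManager can consume them (an illustration only; it assumes the bytes are 8 kHz/8-bit/mono ULAW, as elsewhere on this page, and omits error handling):

    import java.io.IOException;
    import javax.media.*;
    import javax.media.format.AudioFormat;
    import javax.media.protocol.*;

    public class ByteArrayPushSource extends PushBufferDataSource {
        private final RawStream stream = new RawStream();

        public PushBufferStream[] getStreams() { return new PushBufferStream[] { stream }; }
        public String getContentType() { return ContentDescriptor.RAW; }
        public void connect() {}
        public void disconnect() {}
        public void start() {}
        public void stop() {}
        public Object getControl(String type) { return null; }
        public Object[] getControls() { return new Object[0]; }
        public Time getDuration() { return DURATION_UNKNOWN; }

        /** Feed each extracted Buffer's byte array into the stream. */
        public void push(byte[] data, int len) { stream.push(data, len); }

        private static class RawStream implements PushBufferStream {
            private final Format format = new AudioFormat(AudioFormat.ULAW, 8000, 8, 1);
            private BufferTransferHandler handler;
            private byte[] pending;
            private int pendingLen;

            synchronized void push(byte[] data, int len) {
                pending = data;
                pendingLen = len;
                if (handler != null) handler.transferData(this); // tell the consumer to read()
            }
            public synchronized void read(Buffer buf) throws IOException {
                buf.setData(pending);
                buf.setOffset(0);
                buf.setLength(pendingLen);
                buf.setFormat(format);
            }
            public Format getFormat() { return format; }
            public void setTransferHandler(BufferTransferHandler h) { handler = h; }
            public ContentDescriptor getContentDescriptor() { return new ContentDescriptor(ContentDescriptor.RAW); }
            public long getContentLength() { return LENGTH_UNKNOWN; }
            public boolean endOfStream() { return false; }
            public Object getControl(String type) { return null; }
            public Object[] getControls() { return new Object[0]; }
        }
    }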

  • My photos on Photo Stream on my phone have transferred, but not all have gone to my laptop's Photo Stream. Help.


    How many photos are you trying to add to the stream?  There is a limit of 1000.

  • Writing a conference server for RTP streams

    Hello,
    I'm trying to write a conference server which accepts multiple RTP streams (one for each participant), creates a mixed RTP stream of all the other participants, and sends that stream back to each participant.
    For 2 participants, I was able to correctly receive and send the stream of the other participant to each party.
    For 3 participants, creating the merging data source does not seem to work, i.e. no data is received by the participants.
    I tried creating cloneable data sources instead, thinking that this might be the root cause, but when creating cloneable data sources from incoming RTP sources, I am unable to get the Processor into the Configured state; it seems to deadlock. Here's the code outline:
        Iterator pIt = participants.iterator();
        List dataSources = new ArrayList();
        while (pIt.hasNext()) {
            Party p = (Party) pIt.next();
            if (p != dest) {
                DataSource ds = p.getDataSource();
                DataSource cds = Manager.createCloneableDataSource(ds);
                DataSource clone = ((SourceCloneable) cds).createClone();
                dataSources.add(clone);
            }
        }
        Object[] sources = dataSources.toArray(new DataSource[0]);
        DataSource dataSource = Manager.createMergingDataSource((DataSource[]) sources);
        Processor p = Manager.createProcessor(dataSource);
        MixControllerListener cl = new MixControllerListener();
        p.addControllerListener(cl);
        // Put the Processor into the Configured state.
        p.configure();
        if (!cl.waitForState(p, p.Configured)) {
            System.err.println("Failed to configure the processor.");
            assert false;
        }

    Here are a couple of stack traces:
    "RTPEventHandler" daemon prio=1 tid=0x081d6828 nid=0x3ea6 in Object.wait() [98246000..98247238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f37e4a8> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at demo.Mixer$MixControllerListener.waitForState(Mixer.java:248)
            - locked <0x9f37e4a8> (a java.lang.Object)
            at demo.Mixer.createMergedDataSource(Mixer.java:202)
            at demo.Mixer.createSendStreams(Mixer.java:165)
            at demo.Mixer.createSendStreamsWhenAllJoined(Mixer.java:157)
            - locked <0x9f481840> (a demo.Mixer)
            at demo.Mixer.update(Mixer.java:123)
            at com.sun.media.rtp.RTPEventHandler.processEvent(RTPEventHandler.java:62)
            at com.sun.media.rtp.RTPEventHandler.dispatchEvents(RTPEventHandler.java:96)
            at com.sun.media.rtp.RTPEventHandler.run(RTPEventHandler.java:115)
    "JMF thread: com.sun.media.ProcessEngine@a3c5b6[ com.sun.media.ProcessEngine@a3c5b6 ] ( configureThread)" daemon prio=1 tid=0x082fe3c8 nid=0x3ea6 in Object.wait() [977e0000..977e1238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f387560> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at com.sun.media.parser.RawBufferParser$FrameTrack.parse(RawBufferParser.java:247)
            - locked <0x9f387560> (a java.lang.Object)
            at com.sun.media.parser.RawBufferParser.getTracks(RawBufferParser.java:112)
            at com.sun.media.BasicSourceModule.doRealize(BasicSourceModule.java:180)
            at com.sun.media.PlaybackEngine.doConfigure1(PlaybackEngine.java:229)
            at com.sun.media.ProcessEngine.doConfigure(ProcessEngine.java:43)
            at com.sun.media.ConfigureWorkThread.process(BasicController.java:1370)
            at com.sun.media.StateTransitionWorkThread.run(BasicController.java:1339)
    "JMF thread" daemon prio=1 tid=0x080db410 nid=0x3ea6 in Object.wait() [97f41000..97f41238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Object.wait(Object.java:429)
            at com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave.run(CloneableSourceStreamAdapter.java:375)
            - locked <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Thread.run(Thread.java:534)Any ideas ?
    Thanks,
    Jarek

    bgl,
    I was able to get past the cloning issue by following the Clone.java example to the letter :)
    It turns out that the cloneable data source must be added as a send stream first, and then the cloned data source. Now for each party in the call the conference server does the following:
    Party(RTPManager mgr, DataSource ds) {
        this.mgr = mgr;
        this.ds = Manager.createCloneableDataSource(ds);
    }

    synchronized DataSource cloneDataSource() {
        DataSource retVal;
        if (getNeedsCloning()) {
            retVal = ((SourceCloneable) ds).createClone();
        } else {
            retVal = ds;
            setNeedsCloning();
        }
        return retVal;
    }

    private void setNeedsCloning() {
        needsCloning = true;
    }

    private boolean getNeedsCloning() {
        return needsCloning;
    }

    private synchronized void addSendStreamFromNewParticipant(Party newOne)
            throws UnsupportedFormatException, IOException {
        debug("*** - New one joined. Creating the send streams. Curr count :" + participants.size());
        Iterator pIt = participants.iterator();
        while (pIt.hasNext()) {
            Party p = (Party) pIt.next();
            assert p != newOne;
            // update existing participant
            SendStream sendStream = p.getMgr().createSendStream(newOne.cloneDataSource(), 0);
            sendStream.start();
            // send data from existing participant to the new one
            sendStream = newOne.getMgr().createSendStream(p.cloneDataSource(), 0);
            sendStream.start();
        }
        debug("*** - Done creating the streams.");
    }

    So I made some progress, but I'm still not quite there.
    The RTPManager JavaDoc for createSendStream states the following:
    * This method is used to create a sending stream within the RTP
    * session. For each time the call is made, a new sending stream
    * will be created. This stream will use the SDES items as entered
    * in the initialize() call for all its RTCP messages. Each stream
    * is sent out with a new SSRC (Synchronisation SouRCe
    * identifier), but from the same participant i.e. local
    * participant. <BR>
    For 3 participants, my conference server creates 2 send streams to each of them, so I'd expect 2 SSRCs on the wire. Examining the RTP packets in Ethereal, I only see 1 SSRC, as if the 2nd createSendStream call failed. Consequently, each participant in the conference is able to receive voice from only 1 other participant, even though I create an RTPManager instance for each participant and add 2 send streams.
    Any ideas ?
    Thanks,
    Jarek
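    One quick check that might narrow this down (a diagnostic sketch; mgr, partyA and partyB stand in for one party's RTPManager and the two other parties from the code above): print the SSRC each SendStream reports and compare with what Ethereal shows on the wire.

    // Create both send streams on the SAME manager and compare their SSRCs
    SendStream toA = mgr.createSendStream(partyA.cloneDataSource(), 0);
    SendStream toB = mgr.createSendStream(partyB.cloneDataSource(), 0);
    toA.start();
    toB.start();
    // RTPStream.getSSRC() exposes the SSRC JMF assigned to each stream
    System.out.println("SSRC 1: " + toA.getSSRC());
    System.out.println("SSRC 2: " + toB.getSSRC());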

  • Streaming video from phone

    I want to stream video from the phone's camera to a server.
    I found MMAPI; with RecordControl I can do the following:
    try {
        Player player = Manager.createPlayer("capture://video?encoding=gray8");
        player.realize();
        RecordControl rc = (RecordControl) player.getControl("RecordControl");
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        rc.setRecordStream(output);
        rc.startRecord();
        player.start();
        // Thread.currentThread().sleep(5000);
        rc.commit();
        player.close();
    } catch (Exception e) {
    }
    This way the video is in "output" only after commit(), and only then could I send it. Is there any way to have real streaming?

    What do you mean by "real streaming"? Do you refer to the frequency of transmission? Real-time streaming is "not so real": if you use a small sleep time you will get a higher frequency, and that will simulate "real-time" streaming. To speed up your transmission, use the UDP protocol instead of TCP, because media streaming in practically all technologies runs on top of UDP.
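    To illustrate the chunked approach described above, here is a rough sketch (J2ME; the server address and chunk duration are assumptions, and MMAPI was not designed for live streaming, so treat this as an approximation only):

    import java.io.ByteArrayOutputStream;
    import javax.microedition.io.Connector;
    import javax.microedition.io.DatagramConnection;
    import javax.microedition.media.Manager;
    import javax.microedition.media.Player;
    import javax.microedition.media.control.RecordControl;

    void streamInChunks() throws Exception {
        DatagramConnection conn =
                (DatagramConnection) Connector.open("datagram://myserver:5000");
        Player player = Manager.createPlayer("capture://video?encoding=gray8");
        player.realize();
        RecordControl rc = (RecordControl) player.getControl("RecordControl");
        player.start();
        boolean keepStreaming = true;
        while (keepStreaming) {
            ByteArrayOutputStream chunk = new ByteArrayOutputStream();
            rc.setRecordStream(chunk);
            rc.startRecord();
            Thread.sleep(500);   // shorter sleep = higher send frequency
            rc.stopRecord();
            rc.commit();         // chunk now holds a short, complete recording
            byte[] bytes = chunk.toByteArray();
            // NOTE: a real implementation must split bytes into datagrams no
            // larger than conn.getMaximumLength()
            conn.send(conn.newDatagram(bytes, bytes.length));
        }
    }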

  • Can I play and save the incoming RTP streams simultaneously?

    Hi,
    I want to know whether I can play and save the incoming RTP stream simultaneously using JMF 2.1. The idea is that by saving it to a file, I can play back the same file at a later time.
    Is there any example code available?
    Thanks in advance.
    Bye,
    Srikanth

    Lamentably, I think this is impossible today. However much I have tried it, I have not been able to achieve it.
    If someone out there has achieved it, please help us with this subject. It's very important.
    Thanks
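    For what it's worth, one approach that might be worth trying (a sketch only, unverified): clone the incoming RTP DataSource, play one branch live, and hand the other to a file-writing Processor, similar to the cloneable-DataSource code in the conference-server thread on this page. Note that the cloneable (master) source is used first, as that thread recommends.

    import javax.media.*;
    import javax.media.format.AudioFormat;
    import javax.media.format.FileTypeDescriptor;
    import javax.media.protocol.*;

    static void playAndSave(DataSource incoming) throws Exception {
        DataSource master = Manager.createCloneableDataSource(incoming);
        DataSource clone  = ((SourceCloneable) master).createClone();

        // Branch 1: live playback from the master source
        Player player = Manager.createRealizedPlayer(master);
        player.start();

        // Branch 2: transcode the clone to linear audio and write a WAV file
        ProcessorModel model = new ProcessorModel(
                clone,
                new Format[] { new AudioFormat(AudioFormat.LINEAR, 8000, 8, 1) },
                new FileTypeDescriptor(FileTypeDescriptor.WAVE));
        Processor proc = Manager.createRealizedProcessor(model);
        DataSink sink = Manager.createDataSink(proc.getDataOutput(),
                                               new MediaLocator("file:/C:/saved.wav"));
        sink.open();
        sink.start();
        proc.start();
    }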

  • Only two calls on 7920 phone

    I configured the maximum number of calls to 4 and the busy trigger to 4 on the CallManager for the 7920 phone.
    Now, the problem is that when two calls are active on the 7920 phone, it is not possible to transfer one of the calls to an internal number. The phone shows a message that the internal number is busy.
    Why can only two calls be handled on my 7920 phone?
    Regards,
    Pascal

    Try configuring a second line on the 7920; it can have up to 6 multiline appearance extensions or speed dials. Set the max calls/busy trigger and try the transfer again.

  • How to synchronize audio and video rtp-streams

    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2), but I get a javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video RTP streams because they are sometimes completely out of sync.
    In the JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive RTP streams, and then I create processors for the incoming streams' DataSources, and they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    How are you supposed to sync audio and video if not as above?

    camelstrike wrote:
    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2), but I get a javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video RTP streams because they are sometimes completely out of sync.
    In the JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive RTP streams, and then I create processors for the incoming streams' DataSources, and they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    The RTP packets are timestamped when they leave, and they are played in order. You can't change the time base on an RTP stream because, for all intents and purposes, it's "live" data. It plays at the speed the transmitting RTP server wants it to play, and it never gets behind because it drops out-of-order and old data packets.
    If your RTP streams aren't synced correctly on the receiving end, it means they aren't being synced on the transmitting end, so you should look into the other side of the equation.
