To retransmit the RTP streams received

Hi!
Program A transmits media data to another program B using RTPSessionMgr. What B has to do is forward the received streams to another receiver, program C, which then has to play the stream. In other words, I am trying to implement a client-router-server model.
So I maintained a session between A & B and also between B & C. Once a NewReceiveStream event occurs at B, it retrieves the DataSource from the stream, uses the createSendStream() method of the session manager, and sends it to C.
My problem is that C receives the audio and video streams but never plays them. I get a pink screen for video and no audio.
Can somebody help me point out my mistake, or suggest a new strategy to implement this?
Actually, I have tried using only datagram sockets on the router side, i.e. I created 2 sockets for B to receive data and forwarded it to C using 2 other sockets. That did work, but then the client does not know the sender details, and I need to maintain the sender and receiver reports. So I went for the session manager (RTPManager / RTPSessionMgr).
Kindly help.
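
Below is a minimal, untested sketch of the B-side relay described above, assuming B already holds a second RTPManager (called sessionToC here, a placeholder name) that has been initialized and had C added as a target. It simply forwards the received DataSource with createSendStream(), without decoding or re-encoding the payload, so C must be able to handle the original RTP format.

import javax.media.protocol.DataSource;
import javax.media.rtp.*;
import javax.media.rtp.event.*;

public class RelaySketch implements ReceiveStreamListener {

    private final RTPManager sessionToC;   // assumed: already initialized, with C added via addTarget()

    public RelaySketch(RTPManager sessionToC) {
        this.sessionToC = sessionToC;
    }

    public void update(ReceiveStreamEvent evt) {
        if (evt instanceof NewReceiveStreamEvent) {
            try {
                ReceiveStream stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
                DataSource ds = stream.getDataSource();

                // Forward the stream as-is; the payload is not decoded or re-encoded.
                SendStream out = sessionToC.createSendStream(ds, 0);
                out.start();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}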

Hi all!
Nice to meet you. It is very exciting that I got a project which will integrate JMF and JXTA: it gets RTP streams from the JMF framework, sends them to the JXTA framework, and then on to any peer in some groups to visualize them. Some aspects of this are similar to yours, so would you mind adding me to your MSN contact list (my MSN account is: [email protected])?

Similar Messages

  • Write (Export) the RTP Stream from a SIP Call to an audio file

    I am working on a telephony system that needs to record each call to a file.
    I get the RTP stream in the ReceiveStreamEvent handler and start to record.
    The problem is that I get a file and can play it, but there is no sound
    when I set FileTypeDescriptor.WAVE and AudioFormat.ULAW.
    If I try FileTypeDescriptor.WAVE and AudioFormat.IMA4_MS, I get a fixed-size file of 60 KB.
    With FileTypeDescriptor.MPEG_AUDIO and AudioFormat.MPEG, the processor cannot be realized and
    the thread is blocked!!
    Could anyone save me? Thanks in advance!
    =================================Code===================================================
    public void update(ReceiveStreamEvent evt) {
        logger.debug("received a new incoming stream. " + evt);
        RTPManager mgr = (RTPManager) evt.getSource();
        Participant participant = evt.getParticipant();   // could be null.
        ReceiveStream stream = evt.getReceiveStream();     // could be null.
        if (evt instanceof NewReceiveStreamEvent) {
            try {
                stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
                DataSource ds = stream.getDataSource();
                recorder = new CallRecorderImpl(ds);
                recorder.start();
            } catch (Exception e) {
                logger.error("NewReceiveStreamEvent exception ", e);
                return;
            }
        }
    }
    ========================================================================================
    This is the CallRecorderImpl class:
    public class CallRecorderImpl implements DataSinkListener, ControllerListener {
        private Processor processor = null;
        private DataSink dataSink = null;
        private DataSource source = null;
        private boolean bRecording = false;

        FileTypeDescriptor contentType = new FileTypeDescriptor(FileTypeDescriptor.WAVE);
        Format encoding = new AudioFormat(AudioFormat.IMA4_MS);
        MediaLocator dest = new MediaLocator("file:/c:/bar.wav");

        public CallRecorderImpl(DataSource ds) {
            this.source = ds;
        }

        public void start() {
            try {
                processor = Manager.createProcessor(source);
                processor.addControllerListener(this);
                processor.configure();
            } catch (Exception e) {
                System.out.println("exception:" + e);
            }
        }

        public void stop() {
            try {
                System.out.println("stopping");
                this.dataSink.stop();
                this.dataSink.close();
            } catch (Exception ep) {
                ep.printStackTrace();
            }
        }

        public void controllerUpdate(ControllerEvent evt) {
            Processor pr = (Processor) evt.getSourceController();
            if (evt instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                processConfigured(pr);
            }
            if (evt instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                processRealized(pr);
            }
            if (evt instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (evt instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                pr.stop();
            }
            if (evt instanceof StopEvent) {
                System.out.println("StopEvent");
                pr.close();
                try {
                    dataSink.stop();
                    dataSink.close();
                } catch (Exception ee) {
                    ee.printStackTrace();
                }
            }
        }

        public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                try {
                    System.out.println("EndOfStreamEvent");
                    dataSink.stop();
                    dataSink.close();
                    System.exit(1);
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }

        public void processConfigured(Processor p) {
            // Set the output content type
            p.setContentDescriptor(this.contentType);
            // Get the track control objects
            TrackControl track[] = p.getTrackControls();
            boolean encodingPossible = false;
            // Go through the tracks and try to program one of them
            // to output IMA4 data.
            for (int i = 0; i < track.length; i++) {
                try {
                    track[i].setFormat(this.encoding);
                    encodingPossible = true;
                } catch (Exception e) {
                    track[i].setEnabled(false);
                }
            }
            if (!encodingPossible) {
                System.out.println("cannot encode to " + this.encoding);
                return;
            }
            p.realize();
        }

        public void processRealized(Processor p) {
            System.out.println("Entered processRealized");
            DataSource source = p.getDataOutput();
            try {
                dataSink = Manager.createDataSink(source, dest);
                dataSink.open();
                dataSink.start();
                dataSink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            dataSink.close();
                        }
                    }
                });
            } catch (Exception e) {
                e.printStackTrace();
                return;
            }
            p.start();
        }
    }
    ============================================

    This shows that the output stream cannot be of that particular format and descriptor.
    Look at this code:
    import javax.swing.*;
    import javax.media.*;
    import java.net.*;
    import java.io.*;
    import javax.media.datasink.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.*;

    class Abc implements ControllerListener {
        DataSource ds;
        DataSink sink;
        Processor pr;
        MediaLocator mc;

        public void maam() throws Exception {
            mc = new MediaLocator("file:C:/Workspace/aaaaa.mpg");
            pr = Manager.createProcessor(new URL("file:G:/java files1/jmf/aa.mp3"));
            pr.addControllerListener(this);
            pr.configure();
        }

        public void controllerUpdate(ControllerEvent e) {
            if (e instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processConfigured(p);
            }
            if (e instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processRealized(p);
            }
            if (e instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (e instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                Processor p = (Processor) e.getSourceController();
                p.stop();
            }
            if (e instanceof StopEvent) {
                System.out.println("StopEvent");
                Processor p = (Processor) e.getSourceController();
                p.close();
                try {
                    sink.stop();
                    sink.close();
                } catch (Exception ee) {
                    ee.printStackTrace();
                }
            }
        }

        public void processConfigured(Processor p) {
            System.out.println("Entered processConfigured");
            p.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.MPEG_AUDIO));
            /*
            TrackControl track[] = p.getTrackControls();
            boolean encodingPossible = false;
            for (int i = 0; i < track.length; i++) {
                try {
                    track[i].setFormat(new VideoFormat(VideoFormat.MPEG));
                    encodingPossible = true;
                } catch (Exception e) {
                    track[i].setEnabled(false);
                }
            }
            */
            p.realize();
        }

        public void processRealized(Processor p) {
            pr = p;
            System.out.println("Entered processRealized");
            try {
                MediaLocator dest = new MediaLocator("file:C:/Workspace/ring1.mpg");
                sink = Manager.createDataSink(p.getDataOutput(), dest);
                sink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            sink.close();
                        }
                    }
                });
                sink.open();
                sink.start();
            } catch (Exception eX) {
                eX.printStackTrace();
            }
            System.out.println("Just before start");
            p.start();
        }

        /*
        public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                try {
                    System.out.println("EndOfStreamEvent");
                    dsk.stop();
                    dsk.close();
                    System.exit(1);
                } catch (Exception e) {
                }
            }
        }
        */
    }

    public class JMFCapture6 {
        public static void main(String args[]) throws Exception {
            Abc a = new Abc();
            a.maam();
        }
    }

  • Save the RTP streaming

    Hi,
    If my data source type is
    "com.sun.media.protocol.vfw.DataSource@cd5f8b",
    I can use the following method to grab an image from the real-time video that my webcam captured:
    cds = (PushBufferDataSource) ds;
    cds.getStreams()[0].setTransferHandler(this);
    cds.start();
    Buffer buf = new Buffer();

    public void transferData(PushBufferStream pbs) {
        try {
            pbs.read(buf);
        } catch (java.io.IOException ioe) {
            System.err.println(ioe);
        }
        if (bti == null) {
            VideoFormat vf = (VideoFormat) buf.getFormat();
            bti = new BufferToImage(vf);
        }
        Image im = bti.createImage(buf);
        try {
            BufferedImage tag = new BufferedImage(320, 240, BufferedImage.TYPE_INT_RGB);
            tag.getGraphics().drawImage(im, 0, 0, 320, 240, null);
            FileOutputStream out = new FileOutputStream("test.jpg");
            JPEGImageEncoder encoder = JPEGCodec.createJPEGEncoder(out);
            encoder.encode(tag);
            out.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    But if my data source type is "com.sun.media.protocol.rtp.DataSource@14c194d", the previous method does not work.
    Does anyone know how I should modify my method when the data source is RTP, for grabbing images and saving the video?
    Thanks!
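
    One approach worth trying (a hedged sketch, not from the original post): instead of using setTransferHandler on a PushBufferStream, build a Player on the RTP DataSource and pull frames through its FrameGrabbingControl, since an RTP source typically delivers encoded video rather than raw frames. The class and method names below are illustrative only.

    import java.awt.Image;
    import javax.media.*;
    import javax.media.control.FrameGrabbingControl;
    import javax.media.format.VideoFormat;
    import javax.media.protocol.DataSource;
    import javax.media.util.BufferToImage;

    public class RtpFrameGrabSketch {

        public static Image grabOneFrame(DataSource rtpDataSource) throws Exception {
            // Realizing may block until RTP data actually starts arriving.
            Player player = Manager.createRealizedPlayer(rtpDataSource);
            player.start();

            FrameGrabbingControl fg = (FrameGrabbingControl)
                    player.getControl("javax.media.control.FrameGrabbingControl");
            if (fg == null) {
                throw new IllegalStateException("FrameGrabbingControl not supported for this stream");
            }

            // Grab the current decoded frame and convert it to an AWT Image.
            Buffer buf = fg.grabFrame();
            BufferToImage bti = new BufferToImage((VideoFormat) buf.getFormat());
            return bti.createImage(buf);
        }
    }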

    natdeamer wrote:
    "Having some problems streaming over the internet - probably because I'm doing it wrong."
    That's correct, you're doing it wrong.
    "I'm using www.whatismyip.com to get the internet IP of the 2 computers I'm trying to stream to and from - and using these in the code, but nothing happens."
    99% of the time, your public IP actually addresses your router, rather than your computer. That means your computer is not publicly addressable by its IP address alone. You'll need to do something called a "NAT holepunch", which you can look up online. Also, I've included two links to discussions I've had with people about the same issue.
    [http://forums.sun.com/thread.jspa?forumID=28&threadID=5355413]
    [http://forums.sun.com/thread.jspa?forumID=28&threadID=5356672]

  • Having multiple threads for receiving RTP streams

    Hello,
    Developing an audio conference server, I have come to think that if I manage to separate the different audio receivers that receive the RTP streams, the performance could improve.
    At the moment I have the main program, the main thread let's say, which initializes a new audioRx object for each remote client.
    Would separating the different receivers into different threads improve the application's performance?
    Has anyone thought of this, or done something similar?
    Thanks for your help.
    bgl

    I need help with receiving RTP streams from the same port.
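
    Regarding the threading question above, a minimal sketch of the idea, assuming the poster's audioRx class can be wrapped as a Runnable (both names are assumptions). Note that JMF's RTPManager generally handles packet reception on its own internal threads, so whether this helps depends on where the real bottleneck is.

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class ReceiverPool {

        private final ExecutorService pool = Executors.newCachedThreadPool();

        // audioRx is the poster's receiver class, assumed here to be wrapped as a Runnable.
        public void addClient(Runnable audioRx) {
            pool.submit(audioRx);
        }

        public void shutdown() {
            pool.shutdownNow();
        }
    }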

  • Get an event for a newly received RTP stream in a player

    Hi!
    I'm trying to implement an RTP player that receives an AV stream and plays it. The special thing about this player should be that even if the stream is interrupted, the player waits on the same IP and port for a new stream and opens it in the SAME frame (not, like JMStudio, in a new window).
    I try to catch the "ReceiveStreamEvent" so I can restart the player, but I don't get any events for this. I tried to do it with an "RTPManager", but I don't know how.
    Does anybody have an example of how to get the "ReceiveStreamEvent", so I know the RTP stream has been interrupted?
    Thanks
    Adam

    See AVReceive2 in the JMF Solutions section of the JMF web site.
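
    For reference, a minimal sketch along the lines of the AVReceive2 pattern: register a ReceiveStreamListener on an RTPManager so that NewReceiveStreamEvent (new stream) and ByeEvent/InactiveReceiveStreamEvent (stream interrupted) can both be handled without tearing the session down. The address and port below are placeholders.

    import java.net.InetAddress;
    import javax.media.rtp.*;
    import javax.media.rtp.event.*;

    public class ReceiverSketch implements ReceiveStreamListener {

        public void listen() throws Exception {
            RTPManager mgr = RTPManager.newInstance();
            mgr.addReceiveStreamListener(this);

            InetAddress addr = InetAddress.getByName("224.1.1.1");   // placeholder session address
            SessionAddress local = new SessionAddress(addr, 42050);  // placeholder port
            mgr.initialize(local);
            mgr.addTarget(local);
        }

        public void update(ReceiveStreamEvent evt) {
            if (evt instanceof NewReceiveStreamEvent) {
                // A new stream has arrived: create (or re-create) the Player here,
                // reusing the existing frame instead of opening a new window.
            } else if (evt instanceof ByeEvent || evt instanceof InactiveReceiveStreamEvent) {
                // The stream was interrupted: keep the RTPManager open and simply
                // wait for the next NewReceiveStreamEvent on the same IP and port.
            }
        }
    }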

  • How do I transmit an RTP stream using only a datasource?

    I've been following this tutorial: http://members.jcom.home.ne.jp/3117216601/jmf2guide/RTPPresenting.html
    And I have clients that can send me RTP audio streams, which are received in the update method.
    Now, update generates a NewReceiveStreamEvent, which has a DataSource that I can play.
    But what I want to do is send the DataSource back to that client. (The IP address is unknown... but that shouldn't matter, as RTP is two-way.)
    I want to send the DataSource back to the client because, later, we will be routing different clients and controlling who they can talk to and such. But I want to start with the basics.

    captfoss wrote:
    DerNalia wrote:
    "But what I want to do is send the DataSource back to that client. (The IP address is unknown... but that shouldn't matter, as RTP is two-way.)"
    RTP isn't two-way, RTP is one-way... RTCP is two-way. The data transport is one-way; the QOS/control messages are duplex...
    So you do need to know the IP address of the client in order to send anything to him...
    "I want to send the DataSource back to the client, because later, we will be routing different clients, and who they can talk to and such. But I want to start with the basics."
    You should just be able to turn around and give the DataSource that's receiving the RTP stream to an RTPManager and broadcast it back out... assuming you're not transcoding it out of its RTP format or anything.
    I don't have the IP address. So is this even possible without switching to RTCP? If so, an example would be awesome. Also, an example of two-way communication with RTCP would be awesome too.
    Does RTCP use the same sort of structure that RTP does? Like, is there a DataSource that could be merged with other DataSources and sent back to where one of the DataSources came from?
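
    A hedged sketch of what captfoss describes above, assuming the client's address is known (the clientHost/clientPort parameters are placeholders): hand the received DataSource to a second RTPManager and transmit it back out unchanged.

    import java.net.InetAddress;
    import javax.media.protocol.DataSource;
    import javax.media.rtp.*;

    public class ForwardSketch {

        public void forward(DataSource receivedDs, String clientHost, int clientPort) throws Exception {
            RTPManager out = RTPManager.newInstance();

            SessionAddress local  = new SessionAddress(InetAddress.getLocalHost(), SessionAddress.ANY_PORT);
            SessionAddress target = new SessionAddress(InetAddress.getByName(clientHost), clientPort);

            out.initialize(local);
            out.addTarget(target);

            // Stream index 0: the first (and only) stream of the received DataSource.
            SendStream send = out.createSendStream(receivedDs, 0);
            send.start();
        }
    }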

  • RTP Transmission, Receive Woes

    Hi ppl,
    I developed an app to transmit captured video and audio as RTP streams. I made a monitor of the video and audio being captured, which is displayed on the Frame. I also made a player for the RTP data received from another system on the network which transmits the data as RTP streams.
    My problem is that as soon as the player is realized and starts, the other components in the window, say a JList control or JButton controls, become invisible, and the JFrame upon which the Monitor and Player are placed is also unresponsive. But the Monitor still displays the video/audio being captured and the Player plays the video/audio being received from the network.
    Can anybody offer some help on this? Please...

    Hi!
    You are probably suffering from the problem of mixing lightweight components (that is, Swing components) and heavyweight components (the getVideo..... method components). Heavyweight components do not work properly alongside lightweight components. To read more, look for the article titled something like "Heavyweight and lightweight components" on this site.
    I'm not sure, but this might show you some direction to work properly.

  • RTP Streaming seek & stop question.

    Is it possible to seek in an RTP stream received at the client? Since RTP works on a push-data concept, is there a way to implement this?
    I want the server to stop streaming when the client exits. Normally the sender sends a Bye event to the receiver. Is it possible, the other way around, to tell the sender to stop streaming?
    Thanks.

    From what I understand, RTP on its own is basically a blind transmission, like watching TV or listening to the radio: if the RTP stream (analogous to the TV signal or radio waves) is being broadcast, you can receive it. With this method you have no control really; you are just given what is being streamed.
    However, RTSP allows you to send commands to a server that will stream RTP content, which you can start, stop, pause, and in some cases move forward or backward to a particular time in the sequence.
    You may find the following link useful: http://www.cs.columbia.edu/~hgs/teaching/ais/slides/RTSP.pdf
    Hope this helps,
    Stef
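
    To make the RTSP point concrete, a small sketch, assuming the server exposes an rtsp: URL (the URL below is a placeholder): JMF can act as a basic RTSP client simply by creating a Player from an rtsp: MediaLocator, which gives start/stop control (and, where the server supports it, repositioning) rather than a one-way RTP session.

    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.Player;

    public class RtspPlaySketch {
        public static void main(String[] args) throws Exception {
            MediaLocator loc = new MediaLocator("rtsp://server.example.com/media/sample.mov");
            Player player = Manager.createRealizedPlayer(loc);
            player.start();        // begin playback
            Thread.sleep(10000);   // let it play for a while
            player.stop();         // pause the stream
            player.close();        // tear the session down
        }
    }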

  • RTP stream being stripped

    How can I check to see if 802.1Q VLAN tagging information is being stripped from the RTP stream by the workstation's NIC?

    Some network cards strip the dot1q headers, so your packet capture won't have this data. If you're not seeing dot1q headers, please check the following link from the Wireshark wiki:
    http://wiki.wireshark.org/CaptureSetup/VLAN#head-81781716144f2855ab0aff2f8b752e95f2562efb
    Apply the previous settings to the machine you're capturing with.

  • Setting RTP Stream Properties FrameRate/Size/Format

    I am having problems setting the frame rate on the RTP stream. I can set the quality of the JPEG and set the size of the stream; I've tried sizes 160x120, 320x240, and 640x480.
    But as soon as I try to set the frame rate, it dies.
    Any ideas?

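    One thing worth checking (a hedged sketch, not a confirmed fix): in JMF the frame rate is usually requested as part of the VideoFormat applied to the track while the processor is in the Configured state, rather than set separately afterwards. The class name, size, and rate below are illustrative, and whether the capture device honours the requested rate depends on the driver.

    import java.awt.Dimension;
    import javax.media.Format;
    import javax.media.Processor;
    import javax.media.control.TrackControl;
    import javax.media.format.VideoFormat;

    public class FrameRateSketch {
        public static void requestJpegRtp(Processor configuredProcessor) {
            // JPEG/RTP output at 320x240 and a requested 15 fps.
            Format jpegRtp = new VideoFormat(
                    VideoFormat.JPEG_RTP,
                    new Dimension(320, 240),   // frame size
                    Format.NOT_SPECIFIED,      // max data length
                    Format.byteArray,
                    15.0f);                    // requested frame rate in fps
            for (TrackControl t : configuredProcessor.getTrackControls()) {
                if (t.getFormat() instanceof VideoFormat) {
                    t.setFormat(jpegRtp);
                }
            }
        }
    }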

  • Merging RTP streams and writing to file

    I'm in the middle of a project using JMF. I'm trying to merge incoming RTP streams and write them to a file, but I always have problems with formats. I created a processor with the merged DataSource, set the ContentDescriptor to different FileTypeDescriptors, and tried to transcode the RTP streams to different formats, but the processor could never realize because of this error:
    Unable to handle format: LINEAR, 8000.0 Hz, 16-bit, Mono, LittleEndian, Signed
    Unable to handle format: LINEAR, 8000.0 Hz, 16-bit, Mono, LittleEndian, Signed
    Could someone tell me what content descriptors and formats I need to use to solve this problem? Any help will be appreciated.
    Message was edited by:
    kaligula

    Hello,
    Here is a procedure I wrote to write the content of any table to text files (and also generate the INSERT SQL order):
    http://oracle.developpez.com/sources/?page=developpement#Extraction_table
    You can download the script file here: http://sheikyerbouti.developpez.com/src/extraction_table.zip
    Hope this helps you.
    Francois

  • Problems Presenting RTP Stream

    Hi, I have a video streaming server developed in Java using the Java Media Framework, and I need to view the videos with a Flash player, but I know that Flash works with RTMP and JMF doesn't support it. Is there any way to present the RTP streams with Flash?
    Thanks.

    Flash Player only supports streams delivered over the RTMP
    protocol from Flash Media Server.

  • Attempting to write RTP stream to wave

    I have been working long and hard trying to figure out how to save an RTP stream from a Cisco phone to a wave file. I need the wave file to be in the format ULAW, 8 kHz, 8-bit, mono.
    Through much research, this is the code I came up with, but it does not work. It states that I am trying to get a DataSink for null when I try to create the file-writer DataSink.
    I have also gotten the following error message with another version of the code below:
    newReceiveStreamEvent exception Cannot find a DataSink for: com.sun.media.protocol.rtp.DataSource@3680c1
    When I get the RTP stream, I get the following as the format of the input stream:
    - Recevied new RTP stream: ULAW/rtp, 8000.0 Hz, 8-bit, Mono
    else if (evt instanceof NewReceiveStreamEvent) {
        try {
            stream = ((NewReceiveStreamEvent) evt).getReceiveStream();

            Format formats[] = new Format[1];
            formats[0] = new AudioFormat(AudioFormat.LINEAR, 8000, 8, 1);
            FileTypeDescriptor outputType = new FileTypeDescriptor(FileTypeDescriptor.WAVE);

            DataSource ds = stream.getDataSource();

            // Find out the formats.
            RTPControl ctl = (RTPControl) ds.getControl("javax.media.rtp.RTPControl");
            if (ctl != null) {
                System.err.println(" - Recevied new RTP stream: " + ctl.getFormat());
            } else {
                System.err.println(" - Recevied new RTP stream");
            }

            if (participant == null) {
                System.err.println(" The sender of this stream had yet to be identified.");
            } else {
                System.err.println(" The stream comes from: " + participant.getCNAME());
            }

            ProcessorModel processorModel = new ProcessorModel(ds, formats, null);
            Processor processor = Manager.createRealizedProcessor(processorModel);
            processor.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));

            MediaLocator f = new MediaLocator("file:/C:/output.wav");
            DataSink filewriter = null;
            DataSource tempds = processor.getDataOutput();
            filewriter = Manager.createDataSink(tempds, f);
            filewriter.open();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    Edited by: phodat on Sep 16, 2010 10:17 PM
    Edited by: phodat on Sep 16, 2010 10:20 PM

    The content descriptor is designed to specify the output format, not the input format. The input format is specified by the DataSource automatically, and you set the content descriptor to whatever you want the output format to be. In this case, you'd want to set it to a WAV file.
    You'd then need to go through all of the Track objects on the processor and set their output format to the ULAW specification you want.
    [http://www.cs.odu.edu/~cs778/spring04/lectures/jmfsolutions/examplesindex.html]
    There's actually an example program there that exports RTP streams; you can probably use that code without modification to fit your needs.
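
    A hedged sketch of what the reply above describes (not the poster's final code): make the output a WAV file and transcode each audio track to 8 kHz / 8-bit / mono ULAW, then write the processor's output with a DataSink. The file path is a placeholder, and the state-waiting helper is deliberately crude; a real implementation would use a ControllerListener as in the linked examples.

    import javax.media.*;
    import javax.media.control.TrackControl;
    import javax.media.format.AudioFormat;
    import javax.media.protocol.DataSource;
    import javax.media.protocol.FileTypeDescriptor;

    public class UlawWavSketch {

        public static void record(DataSource rtpSource) throws Exception {
            Processor p = Manager.createProcessor(rtpSource);
            p.configure();
            waitForState(p, Processor.Configured);

            // Output container: WAVE, not RAW_RTP.
            p.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.WAVE));

            // Output format per track: ULAW, 8000 Hz, 8-bit, mono.
            AudioFormat ulaw = new AudioFormat(AudioFormat.ULAW, 8000, 8, 1);
            for (TrackControl t : p.getTrackControls()) {
                t.setFormat(ulaw);
            }

            p.realize();
            waitForState(p, Controller.Realized);

            DataSink sink = Manager.createDataSink(
                    p.getDataOutput(), new MediaLocator("file:/C:/output.wav"));
            sink.open();
            sink.start();
            p.start();
        }

        // Crude polling wait; shown only to keep the sketch short.
        private static void waitForState(Processor p, int state) throws InterruptedException {
            while (p.getState() < state) {
                Thread.sleep(50);
            }
        }
    }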

  • Monitoring RTP streams

    I have downloaded RTPMonitor.java to monitor RTP streams. It executes fine, but when I give the session address and the session name as arguments and click 'Start', it gives:
    SessionManagerException creating RtpMonitorManager:
    Control Port must be valid and odd.
    Please help me out on this.
    Thanks,
    Meghana

    Where can I download RTPMonitor.java?
    I'm interested in reading all the fields in the RTP header. Can I do that with it?
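
    For context on that error (a hedged note, not from the original thread): by RTP convention the data port is even and the RTCP control port is the next (odd) port, and RTPMonitor appears to enforce this. Below is a small illustrative sketch of building a session address that satisfies the rule; the address and ports are placeholders.

    import java.net.InetAddress;
    import javax.media.rtp.SessionAddress;

    public class MonitorAddressSketch {
        public static SessionAddress buildSessionAddress() throws Exception {
            InetAddress session = InetAddress.getByName("224.144.251.104"); // placeholder session address
            int dataPort = 49150;               // even: RTP data port
            int controlPort = dataPort + 1;     // odd: RTCP control port
            return new SessionAddress(session, dataPort, session, controlPort);
        }
    }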

  • No Inter-Cluster RTP Stream with Gatekeepers

    Hello,
    Firstly I am no expert in Cisco telephony as we have just recently migrated to a full Cisco solution, so apologies if I ask a fundamental question.
    Client A = Site A
    Client B = Site B
    Client C = Site C
    Site A and Site B are in 1 cluster
    Site C is in another cluster
    Intra-Cluster Traffic Works
    Client A -> Client B within the same cluster (across 2 sites with a low-latency link): the RTP stream comes up and the call functions as expected.
    Inter-Cluster Traffic Fails (GK to GK)
    Client C -> Client A: this works, the RTP stream comes up and the call functions as expected.
    Client A -> Client C: this call connects but there is no RTP stream.
    We are using G.711 across the board and I have taken a Wireshark capture from a failed Client A -> Client C call.
    I have been going through this capture and noticed that when I filter on H225 (for the gatekeepers) I see the following:
    CS: setup
    RAS: admissionRequest
    RAS: admissionConfirm
    CS: callProceeding
    CS: alerting
    CS: notify
    RAS: registrationRequest
    RAS: registrationConfirm
    CS: notify
    CS: connect
    CS: notify
    CS: releaseComplete
    RAS: disengageRequest  (DISCONECT_REASON=2,TIME=1321266127,DURATION=24,DISCONNECT_STRING=no resource,ORIGIN=0,LINE_NUMBER=GK,OUTBUND_GW_IP=..
    RAS: disengageConfirm
    There are firewalls in between, and these were the first thing I looked at, but I don't even see any RTP stream trying to be initiated from the far side. Would anyone have any ideas where I could start looking?
    Thanks,
    Peter

    Pat,
    I can't speak to the UC540, but I ran into a situation recently where the SIP gateway was sending out the private extension of the phone number instead of the full DID that was registered with the provider. The provider was then blocking the call.
    In our SIP debugs we saw the RDNIS information of the private extension, I believe. The error code we were getting back from the SP was 404 or something along those lines.
    I recommend you run some debugs and track where the call fails, compare the SNR call versus a normal call in the debugs, and then if you still get stuck, post running configs and debugs back here.
