Save an RTP stream to a file

Hi,
I can send and receive an RTP stream. Now I want to save a received RTP stream to a file, but I get an exception when creating
a DataSink:
public synchronized void update(ReceiveStreamEvent evt) {
        RTPManager mgr = (RTPManager) evt.getSource();
        Participant participant = evt.getParticipant();
        ReceiveStream stream = evt.getReceiveStream();
        if (evt instanceof NewReceiveStreamEvent) {
            DataSource ds = stream.getDataSource();
        }
}

Well gee whiz, I sure would like to help you with the exception from your DataSink...
But you didn't post which exception you were getting, so I can't help you...
And the code you posted has nothing to do with a DataSink to begin with, so I can't help you there either...

Similar Messages

  • Capture RTP stream in a file

    I am developing a phone client application, and one of the features it should include is recording conversations. My problem is that I can't find out how to capture the audio stream sent via RTP and record it into a .wav file. I'm using JMF for streaming.
    I know that this is possible using Ethereal, which captures the stream to .au files that you just have to convert to .wav;
    isn't it possible to do it using JMF?
    Thank you in advance

    Hello!
    Do you already know how to do it?
    I think you can create a media locator on the server and save the file on the client. For one stream, you can create a DataSink to a file. But if there is more than one stream at the same time, you can create a SessionManager.
    In this SessionManager, implement the update method to catch the events of a new stream.
    For every new stream, you can play the stream and save it.
    But this is just what I've read; I'm planning to do it too. Do you have it done already?
    Thanks!
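
    For what it's worth, here is a rough sketch of the per-stream approach described above: every NewReceiveStreamEvent gets its own Processor and DataSink. This is only a sketch, not verified here; the LINEAR/WAVE formats, the output directory and the class name are illustrative assumptions, so adjust them to whatever your streams actually carry.

    import javax.media.*;
    import javax.media.format.*;
    import javax.media.protocol.*;
    import javax.media.rtp.*;
    import javax.media.rtp.event.*;

    // One Processor + DataSink per incoming stream; formats and the output
    // path below are illustrative assumptions, not taken from the thread.
    public class StreamSaver implements ReceiveStreamListener {
        private int count = 0;

        public synchronized void update(ReceiveStreamEvent evt) {
            if (!(evt instanceof NewReceiveStreamEvent))
                return;
            try {
                ReceiveStream stream = evt.getReceiveStream();
                DataSource ds = stream.getDataSource();

                // transcode the RTP payload to linear audio inside a WAV container
                Format[] outFormats = { new AudioFormat(AudioFormat.LINEAR, 8000, 8, 1) };
                ProcessorModel model = new ProcessorModel(ds, outFormats,
                        new FileTypeDescriptor(FileTypeDescriptor.WAVE));
                Processor p = Manager.createRealizedProcessor(model);

                // one file per stream (hypothetical location)
                MediaLocator dest = new MediaLocator("file:/tmp/stream" + (count++) + ".wav");
                DataSink sink = Manager.createDataSink(p.getDataOutput(), dest);
                sink.open();
                sink.start();
                p.start();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }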

  • RTP streaming from local file

    I have a problem with JMF, in particular when I transmit a stream from a local file that is still being updated by a receiving process. The file is "uLaw 8kHz, 8-bit".
    It seems that when the processor starts running, it takes a snapshot of the current file size; this means that only the part of the file present at that moment is sent. The rest of the file is discarded and JMF raises the EndOfStream event.
    Here's the source:
    // encoding = ULAW, sampleRate = 8000.0Hz, sampleSizeInBits = 8bit, channels = mono(1) or stereo(2)
    this.format = new AudioFormat(this.audioFormat, 8000, 8, 1);
    MediaLocator mlIn = null;
    if (rtpFilename != null && rtpFilename.startsWith("file://"))
        mlIn = new MediaLocator(rtpFilename);
    else
        mlIn = new MediaLocator("file://" + rtpFilename);
    logger.debug("Input Media locator URL: " + mlIn);
    if ((processor = createProcessor(mlIn)) != null) {
        // configure the processor
        stateHelper = new StateHelper(processor);
        if (stateHelper.configure(10000)) {
            processor.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
            // Go through the tracks and try to program one of them to output gsm data
            boolean encodingOk = false;
            TrackControl track[] = processor.getTrackControls();
            if (track != null) {
                logger.debug("Track found: " + track.length);
                for (int i = 0; i < track.length; i++) {
                    if (!encodingOk && track[i] instanceof FormatControl) {
                        if (((FormatControl) track[i]).setFormat(format) == null) {
                            track[i].setEnabled(false);
                        } else {
                            encodingOk = true;
                        }
                    } else {
                        // we could not set this track to gsm, so disable it
                        track[i].setEnabled(false);
                    }
                }
            }
            // At this point, we have determined whether we can send our format data or not
            logger.debug("Encoding " + this.format.getEncoding() + " result: " + encodingOk);
            if (encodingOk) {
                // realize the processor
                if (stateHelper.realize(10000)) {
                    try {
                        // hand this datasource to manager for creating an RTP
                        // datasink; our RTP datasink will multicast the audio
                        String locator = "rtp://" + this.ip + ":" + this.port + "/audio/1";
                        MediaLocator mlOut = new MediaLocator(locator);
                        logger.error("Output Media locator URL: " + mlOut);
                        // create a send stream for the output data source of a processor and start it
                        dsource = processor.getDataOutput();
                        dsink = Manager.createDataSink(dsource, mlOut);
                        dsink.open();
                        // now start the datasink
                        dsink.start();
                        logger.debug("Data sink created for Media locator: " + mlOut);
                        if (stateHelper.prefetch(10000)) {
                            stateHelper.playToEndOfMedia(60000);
                        } else {
                            logger.warn("Processor prefetch failed");
                        }
                    } catch (Exception e) {
                        logger.error(e.getMessage(), e);
                        running = false;
                    } finally {
                        if (stateHelper != null)
                            stateHelper.close();
                        if (dsink != null) {
                            dsink.close();
                            logger.debug("Datasink closed");
                        }
                        try {
                            if (dsource != null)
                                dsource.stop();
                        } catch (IOException e) {
                            logger.debug(e.getMessage(), e);
                        }
                    }
                } else {
                    logger.warn("Processor realization failed");
                }
            } else {
                logger.warn("Encoding failed");
            }
        } else {
            logger.warn("Processor configuration failed");
        }
    } else {
        logger.warn("Processor creation failed");
    }
    Thanks in advance mariusv5.

    Hi,
    I think you should go for BDC; using transaction 'SE11' you can upload data from a flat file, and use FM gui_download to download the data from the Z table.
    Thanks,
    Prashant

  • Save the RTP stream

    Hi,
    If my data source type is
    "com.sun.media.protocol.vfw.DataSource@cd5f8b" ,
    I can use the following method to grab an image from the real-time video that my webcam captured:
    cds = (PushBufferDataSource) ds;
    cds.getStreams()[0].setTransferHandler(this);
    cds.start();
    Buffer buf = new Buffer();
    BufferToImage bti = null;

    public void transferData(PushBufferStream pbs) {
        try {
            pbs.read(buf);
        } catch (java.io.IOException ioe) {
            System.err.println(ioe);
        }
        if (bti == null) {
            VideoFormat vf = (VideoFormat) buf.getFormat();
            bti = new BufferToImage(vf);
        }
        Image im = bti.createImage(buf);
        try {
            BufferedImage tag = new BufferedImage(320, 240, BufferedImage.TYPE_INT_RGB);
            tag.getGraphics().drawImage(im, 0, 0, 320, 240, null);
            FileOutputStream out = new FileOutputStream("test.jpg");
            JPEGImageEncoder encoder = JPEGCodec.createJPEGEncoder(out);
            encoder.encode(tag);
            out.close();
        } catch (Exception e) {
        }
    }
    But if my data source type is "com.sun.media.protocol.rtp.DataSource@14c194d", the previous method does not work.
    Does anyone know how I should modify my method when the data source is RTP, to grab images and save the video?
    Thanks!
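
    One approach that is sometimes suggested (just a sketch, not verified against this setup) is to create a Player on the RTP DataSource and ask it for a FrameGrabbingControl instead of reading the PushBufferStream yourself. The control is not guaranteed to be offered by every codec/renderer chain, so check for null; the class name below is made up.

    import java.awt.Image;
    import javax.media.*;
    import javax.media.control.FrameGrabbingControl;
    import javax.media.format.VideoFormat;
    import javax.media.util.BufferToImage;

    public class FrameGrabber {
        // player should already be realized/started on the RTP DataSource
        public static Image grabFrame(Player player) {
            FrameGrabbingControl fg = (FrameGrabbingControl)
                    player.getControl("javax.media.control.FrameGrabbingControl");
            if (fg == null)
                return null;                      // control not offered by this chain
            Buffer buf = fg.grabFrame();          // one decoded video frame
            BufferToImage bti = new BufferToImage((VideoFormat) buf.getFormat());
            return bti.createImage(buf);          // java.awt.Image, ready for JPEG encoding
        }
    }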

    natdeamer wrote:
    Having some problems streaming over the internet - probably because I'm doing it wrong.

    That's correct, you're doing it wrong.

    I'm using www.whatismyip.com to get the internet IP of the 2 computers I'm trying to stream to and from - and using these in the code, but nothing happens.

    99% of the time, your public IP actually addresses your router, rather than your computer. That means your computer is not publicly addressable by its IP address alone. You'll need to do something called a "NAT holepunch", which you can look up online. Also, I've included two links to discussions I've had with people about the same issue.
    [http://forums.sun.com/thread.jspa?forumID=28&threadID=5355413]
    [http://forums.sun.com/thread.jspa?forumID=28&threadID=5356672]

  • Attempting to write RTP stream to wave

    I have been working long and hard trying to figure out how to save an RTP stream from a Cisco phone to a wave file. I need the wave file to be in the format ULAW, 8 kHz, 8-bit mono.
    Through much research this is the code I came up with, but it does not work. It states that I am trying to get a DataSink for null when I try to create the file-writer DataSink.
    I have also gotten the following error message with other versions of the code below:
    newReceiveStreamEvent exception Cannot find a DataSink for: com.sun.media.protocol.rtp.DataSource@3680c1
    When I get the RTP stream I get the following as the format of the input stream:
    - Received new RTP stream: ULAW/rtp, 8000.0 Hz, 8-bit, Mono
    else if (evt instanceof NewReceiveStreamEvent) {
        try {
            stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
            Format formats[] = new Format[1];
            formats[0] = new AudioFormat(AudioFormat.LINEAR, 8000, 8, 1);
            FileTypeDescriptor outputType = new FileTypeDescriptor(FileTypeDescriptor.WAVE);
            DataSource ds = stream.getDataSource();
            // Find out the formats.
            RTPControl ctl = (RTPControl) ds.getControl("javax.media.rtp.RTPControl");
            if (ctl != null) {
                System.err.println(" - Received new RTP stream: " + ctl.getFormat());
            } else {
                System.err.println(" - Received new RTP stream");
            }
            if (participant == null) {
                System.err.println(" The sender of this stream has yet to be identified.");
            } else {
                System.err.println(" The stream comes from: " + participant.getCNAME());
            }
            ProcessorModel processorModel = new ProcessorModel(ds, formats, null);
            Processor processor = Manager.createRealizedProcessor(processorModel);
            processor.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
            MediaLocator f = new MediaLocator("file:/C:/output.wav");
            DataSink filewriter = null;
            DataSource tempds = processor.getDataOutput();
            filewriter = Manager.createDataSink(tempds, f);
            filewriter.open();
        } catch (Exception e) {
            System.err.println("newReceiveStreamEvent exception " + e.getMessage());
        }
    }

    The content descriptor is designed to specify the output format, not the input format. The input format is specified by the DataSource automatically, and you set the content descriptor to whatever you want the output format to be. In this case, you'd want to set it to a WAV file.
    You'd then need to go through all of the Track objects on the processor and set their output format to the ULAW specification you want.
    [http://www.cs.odu.edu/~cs778/spring04/lectures/jmfsolutions/examplesindex.html]
    There's actually an example program that exports RTP streams; you can probably use that code without modification to fit your needs.
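
    As a rough illustration of that suggestion (a sketch only: the crude polling wait stands in for a proper ControllerListener, and the class name and output path are made up), the chain would look something like this:

    import javax.media.*;
    import javax.media.control.TrackControl;
    import javax.media.format.*;
    import javax.media.protocol.*;

    public class RtpToWav {
        public static void save(DataSource ds) throws Exception {
            Processor p = Manager.createProcessor(ds);
            p.configure();
            waitFor(p, Processor.Configured);

            // WAVE is the *output* descriptor; ULAW is set per track
            p.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.WAVE));
            for (TrackControl tc : p.getTrackControls()) {
                if (tc.setFormat(new AudioFormat(AudioFormat.ULAW, 8000, 8, 1)) == null)
                    tc.setEnabled(false);          // drop tracks that cannot be converted
            }

            p.realize();
            waitFor(p, Processor.Realized);

            DataSink sink = Manager.createDataSink(p.getDataOutput(),
                    new MediaLocator("file:/C:/output.wav"));
            sink.open();
            sink.start();
            p.start();
        }

        // crude wait; a ControllerListener is the cleaner way to do this
        private static void waitFor(Processor p, int state) throws InterruptedException {
            while (p.getState() < state)
                Thread.sleep(50);
        }
    }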

  • Can I play and save the incoming RTP streams simultaneously

    Hi,
    I want to know whether I can play and save the incoming RTP stream simultaneously using
    JMF 2.1. The idea is that by saving it to a file, I can play back the same file at a later time.
    Is there any example code available?
    Thanks in advance
    bye
    Srikanth

    I think that today this is unfortunately impossible. As much as I have tried, I have not been able to achieve it.
    If there is someone who has achieved it, please help us with this subject. It's very important.
    Thanks

  • Write(Export) the RTP Stream from a SIP Call to an audio file

    I am working on a telephony system that needs to record each call to a file.
    I get the RTP stream in the ReceiveStreamEvent handler and start to record.
    The problem is that I get a file and can play it, but there is no sound
    when I set FileTypeDescriptor.WAVE and AudioFormat.ULAW.
    When I try FileTypeDescriptor.WAVE and AudioFormat.IMA4_MS, I get a fixed-size file of 60 KB.
    With FileTypeDescriptor.MPEG_AUDIO and AudioFormat.MPEG, the processor cannot be realized
    and the thread is blocked!!
    Could anyone save me? Thanks in advance!
    =================================Code===================================================
    public void update(ReceiveStreamEvent evt) {
        logger.debug("received a new incoming stream. " + evt);
        RTPManager mgr = (RTPManager) evt.getSource();
        Participant participant = evt.getParticipant(); // could be null.
        ReceiveStream stream = evt.getReceiveStream(); // could be null.
        if (evt instanceof NewReceiveStreamEvent) {
            try {
                stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
                DataSource ds = stream.getDataSource();
                recorder = new CallRecorderImpl(ds);
                recorder.start();
            } catch (Exception e) {
                logger.error("NewReceiveStreamEvent exception ", e);
                return;
            }
        }
    }
    ========================================================================================
    This is the CallRecorderImpl class:
    public class CallRecorderImpl implements DataSinkListener, ControllerListener {
        private Processor processor = null;
        private DataSink dataSink = null;
        private DataSource source = null;
        private boolean bRecording = false;
        FileTypeDescriptor contentType = new FileTypeDescriptor(FileTypeDescriptor.WAVE);
        Format encoding = new AudioFormat(AudioFormat.IMA4_MS);
        MediaLocator dest = new MediaLocator("file:/c:/bar.wav");

        public CallRecorderImpl(DataSource ds) {
            this.source = ds;
        }

        public void start() {
            try {
                processor = Manager.createProcessor(source);
                processor.addControllerListener(this);
                processor.configure();
            } catch (Exception e) {
                System.out.println("exception:" + e);
            }
        }

        public void stop() {
            try {
                System.out.println("stopping");
                this.dataSink.stop();
                this.dataSink.close();
            } catch (Exception ep) {
                ep.printStackTrace();
            }
        }

        public void controllerUpdate(ControllerEvent evt) {
            Processor pr = (Processor) evt.getSourceController();
            if (evt instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                processConfigured(pr);
            }
            if (evt instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                processRealized(pr);
            }
            if (evt instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (evt instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                pr.stop();
            }
            if (evt instanceof StopEvent) {
                System.out.println("StopEvent");
                pr.close();
                try {
                    dataSink.stop();
                    dataSink.close();
                } catch (Exception ee) {
                    ee.printStackTrace();
                }
            }
        }

        public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                try {
                    System.out.println("EndOfStreamEvent");
                    dataSink.stop();
                    dataSink.close();
                    System.exit(1);
                } catch (Exception e) {
                }
            }
        }

        public void processConfigured(Processor p) {
            // Set the output content type
            p.setContentDescriptor(this.contentType);
            // Get the track control objects
            TrackControl track[] = p.getTrackControls();
            boolean encodingPossible = false;
            // Go through the tracks and try to program one of them
            // to output ima4 data.
            for (int i = 0; i < track.length; i++) {
                try {
                    track[i].setFormat(this.encoding);
                    encodingPossible = true;
                } catch (Exception e) {
                    track[i].setEnabled(false);
                }
            }
            if (!encodingPossible) {
                System.out.println("cannot encode to " + this.encoding);
                return;
            }
            p.realize();
        }

        public void processRealized(Processor p) {
            System.out.println("Entered processRealized");
            DataSource source = p.getDataOutput();
            try {
                dataSink = Manager.createDataSink(source, dest);
                dataSink.open();
                dataSink.start();
                dataSink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            dataSink.close();
                        }
                    }
                });
            } catch (Exception e) {
                e.printStackTrace();
                return;
            }
            p.start();
        }
    }
    ============================================

    Which shows that the output stream cannot be of that particular format and descriptor.
    Look at this code:
    import javax.swing.*;
    import javax.media.*;
    import java.net.*;
    import java.io.*;
    import javax.media.datasink.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.*;

    class Abc implements ControllerListener {
        DataSource ds;
        DataSink sink;
        Processor pr;
        MediaLocator mc;

        public void maam() throws Exception {
            mc = new MediaLocator("file:C:/Workspace/aaaaa.mpg");
            pr = Manager.createProcessor(new URL("file:G:/java files1/jmf/aa.mp3"));
            pr.addControllerListener(this);
            pr.configure();
        }

        public void controllerUpdate(ControllerEvent e) {
            if (e instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processConfigured(p);
            }
            if (e instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processRealized(p);
            }
            if (e instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (e instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                Processor p = (Processor) e.getSourceController();
                p.stop();
            }
            if (e instanceof StopEvent) {
                System.out.println("StopEvent");
                Processor p = (Processor) e.getSourceController();
                p.close();
                try {
                    sink.stop();
                    sink.close();
                } catch (Exception ee) {
                }
            }
        }

        public void processConfigured(Processor p) {
            System.out.println("Entered processConfigured");
            p.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.MPEG_AUDIO));
            /*
            TrackControl track[] = p.getTrackControls();
            boolean encodingPossible = false;
            for (int i = 0; i < track.length; i++) {
                try {
                    track[i].setFormat(new VideoFormat(VideoFormat.MPEG));
                    encodingPossible = true;
                } catch (Exception e) {
                    track[i].setEnabled(false);
                }
            }
            */
            p.realize();
        }

        public void processRealized(Processor p) {
            pr = p;
            System.out.println("Entered processRealized");
            try {
                MediaLocator dest = new MediaLocator("file:C:/Workspace/ring1.mpg");
                sink = Manager.createDataSink(p.getDataOutput(), dest);
                sink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            sink.close();
                        }
                    }
                });
                sink.open();
                sink.start();
            } catch (Exception eX) {
            }
            System.out.println("Just before start");
            p.start();
        }

        /*
        public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                try {
                    System.out.println("EndOfStreamEvent");
                    dsk.stop();
                    dsk.close();
                    System.exit(1);
                } catch (Exception e) {
                }
            }
        }
        */
    }

    public class JMFCapture6 {
        public static void main(String args[]) throws Exception {
            Abc a = new Abc();
            a.maam();
        }
    }

  • Save RTP audio in a file

    Hi,
    I am on PC A and want to receive an audio stream from PC B and store it in a file on PC B. AVReceive2 works, but it plays the audio and doesn't save it to a file.
    Please, can someone tell me how to save RTP audio in a file?
    Thanks

    Hello!
    Do you already know how to do it?
    I think you can create a media locator on the server and save the file on the client. For one stream, you can create a DataSink to a file. But if there is more than one stream at the same time, you can create a SessionManager.
    In this SessionManager, implement the update method to catch the events of a new stream.
    For every new stream, you can play the stream and save it.
    But this is just what I've read; I'm planning to do it too. Do you have it done already?
    Thanks!
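
    For completeness, here is a small sketch of how such a listener gets registered with the session. RTPManager has replaced the older SessionManager in JMF 2.1.1; the class name, addresses and ports below are illustrative, loosely following the AVReceive2 pattern.

    import java.net.InetAddress;
    import javax.media.rtp.*;

    public class ReceiverSetup {
        public static RTPManager listen(String localIp, int localPort,
                                        String senderIp, int senderPort,
                                        ReceiveStreamListener saver) throws Exception {
            RTPManager mgr = RTPManager.newInstance();
            mgr.addReceiveStreamListener(saver);   // e.g. the update() handler that writes each stream

            // bind the local session address, then point at the sender
            mgr.initialize(new SessionAddress(InetAddress.getByName(localIp), localPort));
            mgr.addTarget(new SessionAddress(InetAddress.getByName(senderIp), senderPort));
            return mgr;
        }
    }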

  • Play and save RTP streams

    Hello
    I need help. Can someone tell me how I can play and save an RTP stream at the same time?
    Thanks

    Thanks for your response.
    Very much appreciated. Was very informative.
    This is my current situation with your 3 suggestions:
    Daniele,
    Your suggestion 1’s result:
    In Wireshark ----> Under Statistics --->I have VoIP calls.
    (I don’t see VoIP calls under Telephony –> may be a different version of Wireshark).
    Anyway, there is only one call because the Wireshark had a Capture Filter to track information between one source and one destination IP address. So I select that call and click on Player button and then click on Decode button. Then I select the forward stream (From IP1 to IP2) and click on play and I don’t hear anything at all. All silence. Same when I select the reverse stream from IP2 to IP1 and play.
    Your suggestion 2’s result:
    In Wireshark ---> Under Statistics ---> I selected Stream Analysis (did not select Show All Streams – not sure what the difference is), then ---> Save Payload ----> selected "au" instead of raw, and it says: "Can't save in a file: saving in au format supported only for alaw / ulaw stream"
    Your suggestion 3’s result:
    Saved the file in .raw format, opened Audacity and imported the file as raw. FIRST I specified the A-Law codec for G.711A and selected 8000 Hz, and that didn't work; SECOND I tried the u-Law coding for G.711u and again selected a sample frequency of 8000 Hz, and that didn't work either.
    Didn't work means:
    When I played the imported information I get all noise (like heavy metallic sound) and no voice.
    So my guess is that this capture is neither A-Law nor u-Law codec - right? This capture was given to me by a customer.
    Any other suggestions – much appreciated Daniele.

  • Simultaneous recording to a file and playing of an incoming RTP stream

    I want to simultaneously record to a file and play an incoming RTP stream. I have cloned the data source. After cloning, I created two data sinks and two processors. One processor/data sink is meant for recording the stream to a file, and the other processor/data sink is used to play the RTP through the speaker.
    But at a time I am only able to achieve one operation, i.e. either recording or playing. For recording into the wav file I have to put Thread.sleep(30) so that I can record the RTP into the file. During this time I am not able to play the audio through the speaker. I also created a separate thread, but this does not help.
    Please give suggestions on how to resolve this issue.
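
    For reference, a minimal sketch of the cloning split described above (the class and method names are made up, and the recording branch is only indicated by a comment):

    import javax.media.*;
    import javax.media.protocol.DataSource;
    import javax.media.protocol.SourceCloneable;

    public class PlayAndRecord {
        public static void split(DataSource rtpDataSource) throws Exception {
            // make the incoming RTP DataSource cloneable, then take one clone
            DataSource cloneable = Manager.createCloneableDataSource(rtpDataSource);
            DataSource recordBranch = ((SourceCloneable) cloneable).createClone();

            // playback branch: the cloneable source itself drives the speakers
            Player player = Manager.createRealizedPlayer(cloneable);
            player.start();   // the clone only produces data once its parent is started

            // recording branch: hand recordBranch to a Processor + DataSink chain
            // (FileTypeDescriptor.WAVE and a file: MediaLocator, as in the examples above)
        }
    }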

    thanks for the reply, but I think I've already solved the problem.

  • Merging RTP streams and writing to file

    I am in the middle of a project using JMF. I'm trying to merge incoming RTP streams and write them to a file, but I always have problems with formats. I created a processor with the merged DataSource, set the ContentDescriptor to different FileTypeDescriptors and tried to transcode the RTP streams to different formats, but the processor could never realize because of this error:
    Unable to handle format: LINEAR, 8000.0 Hz, 16-bit, Mono, LittleEndian, Signed
    Unable to handle format: LINEAR, 8000.0 Hz, 16-bit, Mono, LittleEndian, Signed
    Could someone tell me what content descriptors I need to use and what formats will solve this problem? Any help will be appreciated.

    Hello,
    Here is a procedure I wrote to write the content of any table to text files (and also generate the INSERT SQL order):
    http://oracle.developpez.com/sources/?page=developpement#Extraction_table
    You can download the script file here http://sheikyerbouti.developpez.com/src/extraction_table.zip
    Hope this helps you.
    Francois

  • Play an RTP stream and save at the same time

    Hello
    How can I play and save an RTP stream at the same time?
    Can anyone help me with this?
    Thanks

  • How can I save my photo stream pictures?

    So, Apple in its infinite wisdom has nearly ruined my honeymoon and I'm at my wits' end to save what is left of my digital archive of the trip.  How on earth can I SAVE my photo stream pictures OFF of an iDevice??  I have 210 pictures that I desperately need to save onto my computer and the only place they exist at the moment is inside the photo stream.
    Not that it matters, but let me vent... the reason they only exist in the photo stream is because of Apple's under-prepared servers for iOS 5 deployment and a glitch in iTunes that wiped my phone immediately as I plugged it in today (no sync, no confirm, just plug, beep, and then a wiped phone in under 5 seconds).
    I lost all of my phone content, but luckily I'm an Exchange user so the most important stuff is always duplicated off the device.  But my pictures and video from my week-long honeymoon were on my phone, and while SOME of them did sync to iCloud, not all of them did (still don't know why THAT is - but that's the next mission) and NONE of my videos were in the cloud.  On top of that, no backup has been done by iCloud in over a week even though I have been on wifi and charging several nights on the trip, and all last night once I got home.  No backup.  Wiped phone.  *** Apple.
    Where I stand right now is a photo stream full of pictures that I want to save and cannot seem to get to.  First thought was that since iCloud is a cloud-based service, I bet I can log into a web site, see all of my content and save it from there.  Nope.  This is Apple so of course that option would make too much sense.  All I can do from the web portal is look at pretty icons and delete my pictures.  I can't even browse them, much less save them.  Again, ***.
    So next on my now restored phone (from a week old backup) I can see my photo stream and a nice little save button exists.  Good, that'll save it to the camera roll and I can copy them off like the other pictures.  Nope.  I select some pictures, click save, and nothing happens.  I select 20 or so pictures, click save, and 2 pictures show up in my camera roll??  Again, *** Apple.
    So there is also an option to add the pictures to a new album.  Ok, here we go.  Select all 210 pictures, add to a new album, and yes it makes a new album.  Now to save that album off to my PC... NOPE.  The album is nowhere to be found in the file system (where the camera roll pictures live).
    Ok, so at least I can now back up my phone so I know that these pictures won't just fall off the iCloud in 30 days.  Backup goes fine... and is 30M in size.  Son of a...!  There is no way that backup includes all those pictures.
    I installed the Windows iCloud manager - that sounds like it would let you MANAGE your iCloud content, but nope.  You can manage your payment info, and have it stream NEW photos to you, but you cannot get to your existing photos!  One last time Apple... W T F.
    How can I retrieve photos that are in my photo stream?

    Welcome to the Apple community.
    Yes, you can do that. Select the rectangle with the arrow in it in the top right-hand corner of your photo stream screen, select the pictures you wish to save to your camera roll, then tap the save button at the bottom of the screen and choose Camera Roll from the options that appear.

  • How to stream multiple audio files over an IP network in one session

    Hi everybody,
    I am going to build an internet radio application as my next project. I thought of using JMF and RTP streaming for this purpose. But the problem I faced during development was that it plays only one audio stream per session. The thing is, I want to send more than one audio stream in a given RTP session, using a single port. What my radio does is simply play mp3 music all day long. How can I do this, and what other tools should I use? Please reply ASAP. Thank you in advance,
    Thusira..

    You'll need to implement what I call a "jukebox" DataSource... i.e., the DataSource changes which file it's outputting, but the Processor and RTP DataSink associated with it are unaware of the change... so they keep right on transmitting like nothing happened, even though the actual file being output has changed...
    http://web.archive.org/web/20080316203600/java.sun.com/products/java-media/jmf/2.1.1/solutions/Concat.html
    The concat example calls it a "superglue" datasource and uses it to concat outputs to a file... exact same concept, except you'll want to use a queue you can append to dynamically rather than an array created at compile time... and you'll want to output to RTP instead of to a file. You'll probably also want your DataSource to stay alive waiting for the next thing to be enqueued rather than shutting down when it reaches the end of the list.
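
    Only the play-list half of that idea is easy to sketch generically. The custom DataSource itself (not shown here, and the hard part) would call nextTrack() whenever the current file hits end of media, while the Processor and RTP DataSink built on top of it keep running unchanged. The class and method names below are made up:

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;
    import javax.media.MediaLocator;

    public class Playlist {
        private final BlockingQueue<MediaLocator> queue = new LinkedBlockingQueue<MediaLocator>();

        public void enqueue(String fileUrl) {
            queue.add(new MediaLocator(fileUrl));   // e.g. "file:/radio/song1.mp3"
        }

        public MediaLocator nextTrack() throws InterruptedException {
            return queue.take();                    // block until the next song is queued
        }
    }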

  • Mixing audio RTP-Streams to conference

    I'm trying to create a telephone conference application with JMF.
    My problem:
    I need to mix multiple incoming RTP streams (Audio/G711_ulaw) together into one outgoing RTP stream.
    If I use a MergingDataSource I can get one DataSource with multiple tracks, but I'm not able to send them out together over one RTP session.
    Is there any way to do this?
    I have also thought about using RawBufferMux or WAVMux to multiplex the tracks included in the MergingDataSource, but I can't find out how to use them.
    Or is there an easier way to create a conference over RTP?
    I need to connect to Cisco IP phones, so I can only use the g711_ulaw codec.
    Is there anyone out there who can help me?

    I'm sorry, but I met this problem in the past and found it impossible to resolve.
    I also thought about MergingDataSource, about the Many2One class and more, but there was nothing I could use. And nobody could answer this question in this forum half a year ago.
    Oh, nearly forgot: it is possible to write merged streams to a file, so it seems possible to write them to a file first and then transmit that file to the net. But how to realize more than one Processor?..
