Capture RTP stream in a file

I am developing a phone client application, and one of the features it should include is call recording. My problem is that I can't find out how to capture the audio stream sent via RTP and record it into a .wav file. I'm using JMF for streaming.
I know that this is possible using Ethereal, which captures the stream into a .au file that you then just have to convert to a .wav.
Isn't it possible to do this using JMF?
Thank you in advance

Hello!
Have you already figured out how to do it?
I think you can create a media locator on the server and save the file on the client. For a single stream, you can create a DataSink to a file. If there is more than one stream at the same time, you can create a SessionManager instead.
In that SessionManager, implement the update method to catch the events for each new stream.
For every new stream, you can then play the stream and save it; a minimal listener sketch is shown below.
But this is just what I've read. I'm planning to do it too. Have you already got it working?
Thanks!
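For the listener part described above, a rough skeleton using the JMF RTP API might look like the following. This is only a sketch: the class name, address handling and port are assumptions, not tested code.

    import java.net.InetAddress;
    import javax.media.protocol.DataSource;
    import javax.media.rtp.*;
    import javax.media.rtp.event.*;

    // Hypothetical skeleton: listens on an RTP session and reacts to new streams.
    public class RtpRecorderSkeleton implements ReceiveStreamListener {

        public void listen(String address, int port) throws Exception {
            RTPManager mgr = RTPManager.newInstance();
            mgr.addReceiveStreamListener(this);
            // Local and remote session addresses (placeholder values).
            SessionAddress local = new SessionAddress(InetAddress.getLocalHost(), port);
            SessionAddress remote = new SessionAddress(InetAddress.getByName(address), port);
            mgr.initialize(local);
            mgr.addTarget(remote);
        }

        // Called by the RTPManager for every event on a receive stream.
        public void update(ReceiveStreamEvent evt) {
            if (evt instanceof NewReceiveStreamEvent) {
                ReceiveStream stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
                DataSource ds = stream.getDataSource();
                // ds can now be handed to a Player for playback and/or a DataSink
                // for recording (see the DataSink sketch further down this page).
            }
        }
    }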

Similar Messages

  • Save a rtp stream to a file

    hi,
    I can send and receive an RTP stream. Now I want to save a received RTP stream to a file, but I got an exception when creating
    a DataSink:
    public synchronized void update(ReceiveStreamEvent evt) {
        RTPManager mgr = (RTPManager) evt.getSource();
        Participant participant = evt.getParticipant();
        ReceiveStream stream = evt.getReceiveStream();
        if (evt instanceof NewReceiveStreamEvent) {
            DataSource ds = stream.getDataSource();
        }
    }

    Well gee whiz, I sure would like to help you with your exception with your DataSink...
    But you didn't post what exception you were getting, so I can't help you...
    And the code you posted has nothing to do with a DataSink to begin with, so I can't help you there either...
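    For reference, creating the DataSink itself usually looks roughly like the sketch below (the output path and the error handling are assumptions). In practice the raw RTP DataSource is normally run through a Processor first so it can be transcoded into a file format such as WAVE; handing the RTP source straight to Manager.createDataSink is one common cause of exceptions here.

    import javax.media.DataSink;
    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.protocol.DataSource;

    // ds is usually the data output of a Processor fed from ReceiveStream.getDataSource().
    void saveToFile(DataSource ds) {
        try {
            MediaLocator dest = new MediaLocator("file:/tmp/capture.wav"); // assumed path
            DataSink sink = Manager.createDataSink(ds, dest);
            sink.open();
            sink.start();
        } catch (Exception e) {
            // NoDataSinkException, IOException or SecurityException can surface here.
            e.printStackTrace();
        }
    }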

  • RTP streaming from local file

    I have a problem with JMF, in particular when I transmit a stream from a local file that is being updated by the receiving process. The file is uLaw, 8 kHz, 8-bit.
    It seems that when the processor starts running, a snapshot of the current file size is taken; this implies that only the part of the file present at that moment is sent. The rest of the file is discarded and JMF raises the EndOfStream event.
    Here's the source:
    // encoding = ULAW, sampleRate = 8000.0Hz, sampleSizeInBits = 8bit, channels = mono(1) or stereo(2)
    this.format = new AudioFormat(this.audioFormat, 8000, 8, 1);
    MediaLocator mlIn = null;
    if (rtpFilename != null && rtpFilename.startsWith("file://"))
        mlIn = new MediaLocator(rtpFilename);
    else
        mlIn = new MediaLocator("file://" + rtpFilename);
    logger.debug("Input Media locator URL: " + mlIn);
    if ((processor = createProcessor(mlIn)) != null) {
        // configure the processor
        stateHelper = new StateHelper(processor);
        if (stateHelper.configure(10000)) {
            processor.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
            // Go through the tracks and try to program one of them to output the desired audio data
            boolean encodingOk = false;
            TrackControl track[] = processor.getTrackControls();
            if (track != null) {
                logger.debug("Track found: " + track.length);
                for (int i = 0; i < track.length; i++) {
                    if (!encodingOk && track[i] instanceof FormatControl) {
                        if (((FormatControl) track[i]).setFormat(format) == null) {
                            track[i].setEnabled(false);
                        } else {
                            encodingOk = true;
                        }
                    } else {
                        // we could not set this track to the desired format, so disable it
                        track[i].setEnabled(false);
                    }
                }
            }
            // At this point, we have determined whether we can send out our format data or not
            logger.debug("Encoding " + this.format.getEncoding() + " result: " + encodingOk);
            if (encodingOk) {
                // realize the processor
                if (stateHelper.realize(10000)) {
                    try {
                        // hand this datasource to manager for creating an RTP
                        // datasink; our RTP datasink will multicast the audio
                        String locator = "rtp://" + this.ip + ":" + this.port + "/audio/1";
                        MediaLocator mlOut = new MediaLocator(locator);
                        logger.error("Output Media locator URL: " + mlOut);
                        // create a send stream for the output data source of the processor and start it
                        dsource = processor.getDataOutput();
                        dsink = Manager.createDataSink(dsource, mlOut);
                        dsink.open();
                        // now start the datasink
                        dsink.start();
                        logger.debug("Data sink created for Media locator: " + mlOut);
                        if (stateHelper.prefetch(10000)) {
                            stateHelper.playToEndOfMedia(60000);
                        } else {
                            logger.warn("Processor prefetch failed");
                        }
                    } catch (Exception e) {
                        logger.error(e.getMessage(), e);
                        running = false;
                    } finally {
                        if (stateHelper != null)
                            stateHelper.close();
                        if (dsink != null) {
                            dsink.close();
                            logger.debug("Datasink closed");
                        }
                        try {
                            if (dsource != null)
                                dsource.stop();
                        } catch (IOException e) {
                            logger.debug(e.getMessage(), e);
                        }
                    }
                } else {
                    logger.warn("Processor realization failed");
                }
            } else {
                logger.warn("Encoding failed");
            }
        } else {
            logger.warn("Processor configuration failed");
        }
    } else {
        logger.warn("Processor creation failed");
    }
    Thanks in advance mariusv5.


  • Cisco RTP Stream Capture Utility

    Hi everybody
    I'm working on VoIP with CME,
    and I want to identify the noises on my line during conversations.
    I read in the current Cisco documentation that there is a Cisco utility to capture the RTP stream so that it can be sent to the TAC.
    http://www.cisco.com/en/US/tech/tk652/tk698/technologies_white_paper09186a00801545e4.shtml
    Does anybody know where I can download it, or have a link to this tool?
    Thanks a lot,
    have a nice day

    Ethereal is one tool that will allow you to do this.
    http://www.ethereal.com/
    Hope this helps. If so, please rate the post.
    Brandon

  • Eavesdrop on RTP stream

    Hi,
    I was wondering, does anyone know of good tools (for Windows) to reconstruct a captured RTP stream into an audible format?
    I am trying to assess the level of IPT security in our Enterprise and one thing I would like to show is how easy it is for hackers to capture unencrypted RTP traffic, reconstruct the conversation and play it out.
    I think that such tools are available, so I am just looking for the recommended ones.
    Thanks,
    David

    There are freeware sniffers like ettercap that can be used to perform man-in-the-middle attacks and capture VoIP traffic. I don't remember the name of the tool available in Linux that can decode a VoIP sniffer dump; I will see if I can find it.
    It's not at all hard for a person with enough knowledge to be dangerous to sniff on your network. Again, VoIP security is something that very much depends on your LAN network security as well.
    HTH
    Sankar

  • Audio video capture and stream

    hi anyone, I need some source code for my application.
    I've got a task from my school to build an application that can capture and stream audio/video files.
    If you could help me, please send me the code (simpler code is better, because I'm a beginner).
    By the way, I use JMF 2.1.1e.
    Thanks in advance

    hello, does anybody know how to read a .wmv file using JMF?
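    As a starting point for the capture half of the task, a minimal sketch that looks up an audio capture device with JMF and plays it locally is shown below; the class name and the choice of the first device are assumptions, and video capture would follow the same pattern with a VideoFormat.

    import java.util.Vector;
    import javax.media.CaptureDeviceInfo;
    import javax.media.CaptureDeviceManager;
    import javax.media.Manager;
    import javax.media.Player;
    import javax.media.format.AudioFormat;

    public class CaptureSketch {
        public static void main(String[] args) throws Exception {
            // Find any device that can capture linear PCM audio.
            Vector devices = CaptureDeviceManager.getDeviceList(new AudioFormat(AudioFormat.LINEAR));
            if (devices.isEmpty()) {
                System.out.println("No audio capture device found");
                return;
            }
            CaptureDeviceInfo info = (CaptureDeviceInfo) devices.get(0);
            // Play the captured audio locally; for streaming, a Processor with a
            // RAW_RTP content descriptor would be created from the same locator instead.
            Player player = Manager.createRealizedPlayer(info.getLocator());
            player.start();
        }
    }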

  • How to capture the video in a file using JMF-RTP?

    Please, can someone help with capturing a live stream into a file using JMF?
    Thanks..

    Hi, I have a problem with RTPExport's output video files. One side streams H.263/RTP (AVTransmit2.java) and the other side writes this stream to a file with RTPExport.java. When network conditions are ideal, the output video file has the same fps and the same number of frames as the original file. The problem occurs when there is packet loss in the network: then the output file has a different fps and also fewer frames than the original video (because the missing frames are not written to the file, which is why it gets shorter). Please, how can I achieve an output file with the same fps as the original one? How do I write to a file an identical copy of what I see while receiving video with AVReceive2.java? Is there a way to modify RTPExport or AVReceive2 to do this? Thanks!

  • Simultaneous Recording in file and  playing incoming rtp stream

    I want to record an incoming RTP stream to a file and play it simultaneously. I have cloned the data source. After cloning, I created two data sinks and two processors: one processor/datasink is meant for recording the stream to a file and the other processor/datasink is used to play the RTP through the speaker.
    But I am only able to achieve one operation at a time, i.e. either recording or playing. To record into the wav file I have to put in a Thread.sleep(30) so that the RTP gets written to the file, and during this time I am not able to play the audio through the speaker. I also created a separate thread, but this does not help.
    Please give suggestions on how to resolve this issue.

    thanks for the reply, but I think I've already solved the problem.
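    For anyone hitting the same wall, here is a minimal sketch of the cloning approach described above (the output path, the helper name, and the blocking createRealized* calls are assumptions): the original source feeds a Player for the speaker while the clone feeds a Processor and DataSink for the WAV file, and no Thread.sleep is needed because both run on JMF's own threads once started.

    import javax.media.DataSink;
    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.Player;
    import javax.media.Processor;
    import javax.media.ProcessorModel;
    import javax.media.protocol.DataSource;
    import javax.media.protocol.FileTypeDescriptor;
    import javax.media.protocol.SourceCloneable;

    // ds is the DataSource taken from ReceiveStream.getDataSource().
    void playAndRecord(DataSource ds) throws Exception {
        // Wrap the source so that it can be cloned.
        DataSource cloneable = Manager.createCloneableDataSource(ds);
        DataSource clone = ((SourceCloneable) cloneable).createClone();

        // Play the original through the speaker.
        Player player = Manager.createRealizedPlayer(cloneable);
        player.start();

        // Record the clone to a WAV file via a Processor.
        Processor p = Manager.createRealizedProcessor(
                new ProcessorModel(clone, null, new FileTypeDescriptor(FileTypeDescriptor.WAVE)));
        DataSink sink = Manager.createDataSink(p.getDataOutput(),
                new MediaLocator("file:/tmp/call.wav")); // assumed path
        sink.open();
        sink.start();
        p.start();
    }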

  • Merging RTP streams and writing to file

    I'm in the middle of a project using JMF. I'm trying to merge incoming RTP streams and write them to a file, but I always have problems with formats. I created a processor with the merged DataSource, set its ContentDescriptor to different FileTypeDescriptors and tried to transcode the RTP streams to different formats, but the processor could never realize because of this error:
    Unable to handle format: LINEAR, 8000.0 Hz, 16-bit, Mono, LittleEndian, Signed
    Unable to handle format: LINEAR, 8000.0 Hz, 16-bit, Mono, LittleEndian, Signed
    Could someone tell me what content descriptors and what formats I need to use to solve this problem? Any help will be appreciated.
    Message was edited by:
    kaligula
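    One way to narrow this down (a hedged sketch, not a guaranteed fix) is to set the content descriptor first and then ask each track which formats it can actually be programmed to, instead of guessing; anything returned by getSupportedFormats() should let the processor realize:

    import javax.media.Format;
    import javax.media.Processor;
    import javax.media.control.TrackControl;
    import javax.media.protocol.FileTypeDescriptor;

    // p is a configured Processor created from the merging DataSource.
    void chooseSupportedFormats(Processor p) {
        p.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.WAVE));
        TrackControl[] tracks = p.getTrackControls();
        for (int i = 0; i < tracks.length; i++) {
            // The supported formats reflect both the codecs and the chosen multiplexer.
            Format[] supported = tracks[i].getSupportedFormats();
            if (supported.length > 0) {
                tracks[i].setFormat(supported[0]);
            } else {
                tracks[i].setEnabled(false);
            }
        }
    }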


  • Write(Export) the RTP Stream from a SIP Call to an audio file

    I am working on a telephony system that needs to record each call to a file.
    I get the RTP stream in the ReceiveStreamEvent handler and start to record.
    The problem is that I get a file, but when I play it there is no sound at all
    when I set FileTypeDescriptor.WAVE and AudioFormat.ULAW.
    If I try FileTypeDescriptor.WAVE with AudioFormat.IMA4_MS, I get a fixed-size file of 60 KB.
    With FileTypeDescriptor.MPEG_AUDIO and AudioFormat.MPEG, the processor cannot be realized
    and the thread is blocked!!
    Could anyone save me? Thanks in advance!
    =================================Code===================================================
    public void update(ReceiveStreamEvent evt) {
        logger.debug("received a new incoming stream. " + evt);
        RTPManager mgr = (RTPManager) evt.getSource();
        Participant participant = evt.getParticipant(); // could be null.
        ReceiveStream stream = evt.getReceiveStream();   // could be null.
        if (evt instanceof NewReceiveStreamEvent) {
            try {
                stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
                DataSource ds = stream.getDataSource();
                recorder = new CallRecorderImpl(ds);
                recorder.start();
            } catch (Exception e) {
                logger.error("NewReceiveStreamEvent exception ", e);
                return;
            }
        }
    }
    ========================================================================================
    This is the CallRecorderImpl class:
    public class CallRecorderImpl implements DataSinkListener, ControllerListener {
        private Processor processor = null;
        private DataSink dataSink = null;
        private DataSource source = null;
        private boolean bRecording = false;

        FileTypeDescriptor contentType = new FileTypeDescriptor(FileTypeDescriptor.WAVE);
        Format encoding = new AudioFormat(AudioFormat.IMA4_MS);
        MediaLocator dest = new MediaLocator("file:/c:/bar.wav");

        public CallRecorderImpl(DataSource ds) {
            this.source = ds;
        }

        public void start() {
            try {
                processor = Manager.createProcessor(source);
                processor.addControllerListener(this);
                processor.configure();
            } catch (Exception e) {
                System.out.println("exception:" + e);
            }
        }

        public void stop() {
            try {
                System.out.println("stopping");
                this.dataSink.stop();
                this.dataSink.close();
            } catch (Exception ep) {
                ep.printStackTrace();
            }
        }

        public void controllerUpdate(ControllerEvent evt) {
            Processor pr = (Processor) evt.getSourceController();
            if (evt instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                processConfigured(pr);
            }
            if (evt instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                processRealized(pr);
            }
            if (evt instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (evt instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                pr.stop();
            }
            if (evt instanceof StopEvent) {
                System.out.println("StopEvent");
                pr.close();
                try {
                    dataSink.stop();
                    dataSink.close();
                } catch (Exception ee) {
                    ee.printStackTrace();
                }
            }
        }

        public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                try {
                    System.out.println("EndOfStreamEvent");
                    dataSink.stop();
                    dataSink.close();
                    System.exit(1);
                } catch (Exception e) {
                }
            }
        }

        public void processConfigured(Processor p) {
            // Set the output content type
            p.setContentDescriptor(this.contentType);
            // Get the track control objects
            TrackControl track[] = p.getTrackControls();
            boolean encodingPossible = false;
            // Go through the tracks and try to program one of them
            // to output ima4 data.
            for (int i = 0; i < track.length; i++) {
                try {
                    track[i].setFormat(this.encoding);
                    encodingPossible = true;
                } catch (Exception e) {
                    track[i].setEnabled(false);
                }
            }
            if (!encodingPossible) {
                System.out.println("cannot encode to " + this.encoding);
                return;
            }
            p.realize();
        }

        public void processRealized(Processor p) {
            System.out.println("Entered processRealized");
            DataSource source = p.getDataOutput();
            try {
                dataSink = Manager.createDataSink(source, dest);
                dataSink.open();
                dataSink.start();
                dataSink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            dataSink.close();
                        }
                    }
                });
            } catch (Exception e) {
                e.printStackTrace();
                return;
            }
            p.start();
        }
    }
    ============================================

    This shows that the output stream cannot be of that particular format and descriptor.
    Look at this code:
    import javax.swing.*;
    import javax.media.*;
    import java.net.*;
    import java.io.*;
    import javax.media.datasink.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.*;

    class Abc implements ControllerListener {
        DataSource ds;
        DataSink sink;
        Processor pr;
        MediaLocator mc;

        public void maam() throws Exception {
            mc = new MediaLocator("file:C:/Workspace/aaaaa.mpg");
            pr = Manager.createProcessor(new URL("file:G:/java files1/jmf/aa.mp3"));
            pr.addControllerListener(this);
            pr.configure();
        }

        public void controllerUpdate(ControllerEvent e) {
            if (e instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processConfigured(p);
            }
            if (e instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processRealized(p);
            }
            if (e instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (e instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                Processor p = (Processor) e.getSourceController();
                p.stop();
            }
            if (e instanceof StopEvent) {
                System.out.println("StopEvent");
                Processor p = (Processor) e.getSourceController();
                p.close();
                try {
                    sink.stop();
                    sink.close();
                } catch (Exception ee) {
                }
            }
        }

        public void processConfigured(Processor p) {
            System.out.println("Entered processConfigured");
            p.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.MPEG_AUDIO));
            /*  TrackControl track[] = p.getTrackControls();
                boolean encodingPossible = false;
                for (int i = 0; i < track.length; i++) {
                    try {
                        track[i].setFormat(new VideoFormat(VideoFormat.MPEG));
                        encodingPossible = true;
                    } catch (Exception e) {
                        track[i].setEnabled(false);
                    }
                } */
            p.realize();
        }

        public void processRealized(Processor p) {
            pr = p;
            System.out.println("Entered processRealized");
            try {
                MediaLocator dest = new MediaLocator("file:C:/Workspace/ring1.mpg");
                sink = Manager.createDataSink(p.getDataOutput(), dest);
                sink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            sink.close();
                        }
                    }
                });
                sink.open();
                sink.start();
            } catch (Exception eX) {
            }
            System.out.println("Just before start");
            p.start();
        }

        /*  public void dataSinkUpdate(DataSinkEvent event) {
                if (event instanceof EndOfStreamEvent) {
                    try {
                        System.out.println("EndOfStreamEvent");
                        dsk.stop();
                        dsk.close();
                        System.exit(1);
                    } catch (Exception e) {
                    }
                }
            } */
    }

    public class JMFCapture6 {
        public static void main(String args[]) throws Exception {
            Abc a = new Abc();
            a.maam();
        }
    }
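    Back to the original silent-WAV question: a commonly suggested variation, offered here only as an assumption to try rather than a verified fix, is to transcode the incoming ULAW track to linear PCM before handing it to the WAVE multiplexer, since the RTP-packaged ULAW format is not always accepted directly. Inside processConfigured() that would look roughly like:

    // Hedged sketch: try linear PCM as the track format inside processConfigured().
    Format linear = new AudioFormat(AudioFormat.LINEAR, 8000, 16, 1);
    TrackControl[] tracks = p.getTrackControls();
    for (int i = 0; i < tracks.length; i++) {
        if (tracks[i].setFormat(linear) == null) {
            // LINEAR was refused for this track; leave its format unchanged.
            System.out.println("Track " + i + " refused LINEAR");
        }
    }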

  • Problem to create RTP stream on Linux by using JMF studio

    Hello All,
    I am trying to use JMStudio in order to create an RTP stream.
    I have a problem with capture.
    When I ran jmfinit I got the following messages:
    JavaSound Capture Supported = true
    JavaSoundAuto: Committed ok
    java.lang.Error: Can't open video card 0
    java.lang.Error: Can't open video card 1
    java.lang.Error: Can't open video card 2
    java.lang.Error: Can't open video card 3
    java.lang.Error: Can't open video card 4
    java.lang.Error: Can't open video card 5
    java.lang.Error: Can't open video card 6
    java.lang.Error: Can't open video card 7
    java.lang.Error: Can't open video card 8
    java.lang.Error: Can't open video card 9
    It seems to me that I have no DirectSound capture device,
    and therefore I cannot create the RTP packet stream.
    How can I add a direct sound capture device to my capture devices?
    Are there any suggestions for this issue?
    Kind regards
    BEKIR BALCIK
    ARGELA TECHNOLOGIES


  • Play and save rtp streams

    Hello
    I need help. Can someone tell me how I can play and save an RTP stream at the same time?
    Thanks

    Thanks for your response.
    Very much appreciated. Was very informative.
    This is my current situation with 3 your suggestions:
    Daniele,
    Your suggestion 1’s result:
    In Wireshark ----> Under Statistics --->I have VoIP calls.
    (I don’t see VoIP calls under Telephony –> may be a different version of Wireshark).
    Anyway, there is only one call because the Wireshark had a Capture Filter to track information between one source and one destination IP address. So I select that call and click on Player button and then click on Decode button. Then I select the forward stream (From IP1 to IP2) and click on play and I don’t hear anything at all. All silence. Same when I select the reverse stream from IP2 to IP1 and play.
    Your suggestion 2’s result:
    In Wireshark ---> Under Statistics ---> I selected Stream Analysis (did not select Show All Streams – not sure what the difference is), then ---> Save Payload ---> selected “au” instead of raw, and it says: “Can’t save in a file: saving in au format supported only for alaw/ulaw streams”
    Your suggestion 3’s result:
    Saved the file in .raw format. Opened Audacity and imported the file as raw and specified FIRST the A-Law codec for G.711A and selected 8000hz and that didn’t work and SECOND tried the u-Law coding for G.711u and selected the sample frequency again equal to 8000 Hz and that didn't work.
    Didn't work means:
    When I played the imported information I get all noise (like heavy metallic sound) and no voice.
    So my guess is that this capture is neither A-Law nor u-Law codec – right? This capture was given to me by a customer.
    Any other suggestions – much appreciated Daniele.

  • How to stream multiple audio files over a IP network in one session

    Hi everybody,
    i am going to build an internet radio application as my next project. I thought of using JMF and RTP streaming for this purpose. But the problem I faced during development was that it plays only one audio stream per session. The thing is, I want to send more than one audio stream in a given RTP session, using a single port. What my radio does is simply play mp3 music all day long. How can I do this, and what other tools should I use? Please reply ASAP. Thank you in advance,
    Thusira..

    You'll need to implement what I call a "jukebox" DataSource... i.e., the DataSource changes which file it's outputting, but the Processor and RTP DataSink associated with it are unaware of the change... so they keep right on transmitting like nothing happened, even though the actual file being output has changed...
    http://web.archive.org/web/20080316203600/java.sun.com/products/java-media/jmf/2.1.1/solutions/Concat.html
    The concat example calls it a "superglue" DataSource and uses it to concatenate outputs to a file... exact same concept, except you'll want to use a queue you can append to dynamically rather than an array created at compile time... and you'll want to output to RTP instead of to a file. You'll probably also want your DataSource to stay alive waiting for the next thing to be enqueued rather than shutting down when it reaches the end of the list.

  • Mixing audio RTP-Streams to conference

    I'm trying to create a telephone conference application with JMF.
    My problem:
    I need to mix multiple incoming RTP streams (Audio/G711_ulaw) together into one outgoing RTP stream.
    If I use a MergingDataSource I can get one DataSource with multiple tracks, but I'm not able to send them out together over one RTP session.
    Is there any way to do this?
    I have also thought about using RawBufferMux or WAVMux to multiplex the tracks included in the MergingDataSource, but I can't find out how to use them.
    Or is there an easier way to create a conference over RTP?
    I need to connect to Cisco IP phones, so I can only use the G711_ulaw codec.
    Is there anyone out there who can help me?

    I'm sorry, but I ran into this problem in the past and found it impossible to resolve.
    I also thought about MergingDataSource, about a Many2One class and more, but there was nothing I could use. And nobody could answer this question in this forum half a year ago.
    Oh, I nearly forgot: it is possible to write the merged streams to a file, so it seems possible to write them to a file and then transmit them to the network from that file. But how do you realize more than one Processor?..
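    For what it's worth, the merging step itself is short; a hedged sketch follows (the helper name is made up, and whether the merged result can then be pushed out over a single RTP session is exactly the open question discussed above):

    import javax.media.Manager;
    import javax.media.Processor;
    import javax.media.protocol.DataSource;

    // dataSources are the DataSources of the individual incoming streams.
    Processor mergeStreams(DataSource[] dataSources) throws Exception {
        // Combine the sources into one DataSource with one track per input.
        DataSource merged = Manager.createMergingDataSource(dataSources);
        Processor p = Manager.createProcessor(merged);
        // configure() the processor, then set RAW_RTP as the content descriptor
        // before realizing, e.g.:
        // p.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
        return p;
    }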

  • Can I play and save the incoming RTP streams simultenously

    Hi,
    I want to know whether I can play and save an incoming RTP stream simultaneously using
    JMF 2.1. The idea is that by saving it to a file, I can play back the same file at a later time.
    Is there any example code available?
    Thanks in Advance
    bye
    Srikanth

    I think that, today, this is regrettably impossible. For all that I have tried, I have not been able to achieve it.
    If there is someone who has achieved it, please help us with this subject. It's very important.
    Thanks
