Playing an RTP stream in an applet

Hi,
I am building a small application that captures audio from a small client app running on machine1, sends it through a DataSink to an RTP session running on machine2, and then connects to that machine2 RTP session from machine3 via an applet to play the stream. The first two steps (capturing, transmitting to the session, etc.) seem to be successful, but I have not been able to play the stream from the applet running on machine3. The player never finishes realizing, and it looks like that's because it isn't detecting a stream in the machine2 RTP session. I know that machine2 is talking to machine3 to some level, because when I kill the applet on machine3, a ByeEvent is registered on machine2. Does anybody know what I might be doing wrong? Or does anybody have some code similar to my scenario? thanks ...
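
For anyone comparing notes, here is roughly the receive side I would expect inside the applet (a sketch only; the host name "machine2" and port 22224 are placeholders for the real session address). The Player is created from the rtp:// session URL and started once it realizes; if no stream is ever detected in the session, the RealizeCompleteEvent never arrives, which matches the symptom above.

    import java.applet.Applet;
    import javax.media.*;

    // Minimal sketch, not production code: play one audio stream from an RTP session.
    public class RtpPlayerApplet extends Applet implements ControllerListener {

        private Player player;

        public void start() {
            try {
                // "machine2" and 22224 are placeholders for the real session address/port.
                MediaLocator loc = new MediaLocator("rtp://machine2:22224/audio");
                player = Manager.createPlayer(loc);
                player.addControllerListener(this);
                player.realize();   // asynchronous; never completes if no stream is detected
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        public void controllerUpdate(ControllerEvent evt) {
            if (evt instanceof RealizeCompleteEvent) {
                player.start();
            }
        }

        public void stop() {
            if (player != null) {
                player.stop();
                player.close();   // this is what triggers the BYE seen on machine2
            }
        }
    }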

thanks for the reply, but I think I've already solved the problem.

Similar Messages

  • Simultaneous Recording in file and playing incoming RTP stream

    I want to simultaneously record an incoming RTP stream to a file and play it. I have cloned the data source; after cloning I created two data sinks and two processors. One processor/data sink pair is meant for recording the stream to a file and the other processor/data sink pair is used to play the RTP audio through the speaker.
    But at any given time I am only able to achieve one operation, i.e. either recording or playing. To record into the wav file I have to put Thread.sleep(30) so that I can record the RTP into the file, and during this time I am not able to play the audio through the speaker. I also created a separate thread, but this does not help.
    Please give suggestions on how to resolve this issue.
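
    For reference, here is a rough sketch of the cloned-DataSource approach that avoids the Thread.sleep() trick, assuming the incoming stream's DataSource is already in hand (the class name, output file name and variable names are illustrative, not from the poster's code). One branch is played directly, the other is pushed through a Processor and a DataSink into a wav file, and both are simply left running until you stop them:

        import javax.media.*;
        import javax.media.format.AudioFormat;
        import javax.media.protocol.*;

        // Illustrative sketch: play an incoming stream and record it to a wav file at once.
        public class PlayAndRecord {
            public static void playAndRecord(DataSource ds) throws Exception {
                DataSource cloneable = Manager.createCloneableDataSource(ds);
                DataSource fileBranch = ((SourceCloneable) cloneable).createClone();

                // Branch 1: play the original through the speaker.
                Player player = Manager.createRealizedPlayer(cloneable);
                player.start();

                // Branch 2: transcode the clone to linear audio and sink it to a file.
                Processor proc = Manager.createRealizedProcessor(new ProcessorModel(
                        fileBranch,
                        new Format[] { new AudioFormat(AudioFormat.LINEAR) },
                        new FileTypeDescriptor(FileTypeDescriptor.WAVE)));
                DataSink sink = Manager.createDataSink(proc.getDataOutput(),
                        new MediaLocator("file://capture.wav"));
                sink.open();
                sink.start();
                proc.start();
                // Later: proc.stop(); sink.stop(); sink.close(); player.close();
            }
        }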

    thanks for the reply, but I think I've already solved the problem.

  • Play an rtp stream and save at the same time

    Hello
    How can I play and save an RTP stream at the same time?
    Can anyone help me with this?
    Thanks

  • Using JMF playing AMR RTP Stream

    Hi,
    I'm wondering if anyone can help me. I used an extended decoder to decode an AMR stream in JMF, and after decoding I got 13-bit uniform PCM. It seems JMF doesn't support playing this 13-bit linear PCM directly, so I used the G711 codec to convert the linear PCM to 8-bit A-law (I also tried u-law), but the sound is not correct. I don't know if there's another way to play this 13-bit linear PCM, or to convert it to A/u-law PCM and then play it in JMF?
    Thanking in advance!
    Your sincerely!
    Jane feng

    Hi,
    I am crying because nobody wants to help me.
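
    One guess, offered only as an assumption rather than a confirmed fix: the decoder's 13-bit samples probably need to be scaled up to the full 16-bit range (a left shift by 3) before being played as 16-bit linear PCM or handed to a G.711 A-law/u-law converter; feeding the raw 13-bit values straight through would sound quiet and distorted. A small sketch of that scaling step (the method name is made up):

        // Illustrative sketch: widen 13-bit uniform PCM (one sample per short, range
        // -4096..4095) to 16-bit signed little-endian linear PCM before playback or
        // G.711 conversion.
        public static byte[] pcm13ToPcm16(short[] samples13) {
            byte[] out = new byte[samples13.length * 2];
            for (int i = 0; i < samples13.length; i++) {
                int s16 = samples13[i] << 3;            // scale 13-bit range up to 16-bit range
                out[2 * i]     = (byte) (s16 & 0xff);   // little-endian byte order assumed
                out[2 * i + 1] = (byte) (s16 >> 8);
            }
            return out;
        }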

  • Rtp stream applet

    Hi !!
    I'm new to JMF. I'd like to play an RTP stream on a remote machine by using an applet. I've searched a lot for a Java applet that could receive an RTP stream, but couldn't find easy-to-understand tutorial code etc.
    If you have any good links, please post them on the forum or paste some code.
    Thank you for any answers !!!

    an applet needs nothing great ...
    As someone who..
    - has deployed 1.1-1.4 compatible applets, applets that interact with JavaScript, full (browser) window applets, applets that load screensavers and launch applications, web-started applets, applets that change PLAF..
    - has dealt with (literally) hundreds of applet related problems on usenet, including the investigation of applet bugs, and (particularly) bugs caused by the MSVM.
    ..I can assure you:
    1) Applets are anything but easy to deploy successfully in the wilds of the internet.
    2) As someone who has dealt a little with the JMF, I can add that JMF applets add a whole new level of complications to the mix. Generally, the end user needs to have pre-installed JMF in order to see JMF based applets. Even then, the browser might not detect the JMF classes.
    It is generally far more sensible, and reliable, to launch such projects in a free-floating frame using Web Start.
    Here is an example of a Web Start launch.
    <http://www.javasaver.com/testjs/jmf/#test3>
    Note that any such Web Start launch might be optimised in a variety of ways (to download less). The most obvious would be to use the customizer.jar to make a cut-down version that handles only the required sources (e.g. RTP) and formats (e.g. MOV).

  • Can I play and save the incoming RTP streams simultaneously

    Hi,
    I want to know whether I can play and save the incoming RTP stream simultaneously using JMF 2.1. The idea is that by saving to a file, I can play back the same file at a later time.
    Is there any example code available?
    Thanks in Advance
    bye
    Srikanth

    I think that, lamentably, this is impossible today. However much I have tried, I have not been able to achieve it.
    If there is someone who has achieved it, please help us with this subject. It's very important.
    Thanks

  • Play and save rtp streams

    Hello
    I need help. Can someone tell me how to play and save an RTP stream at the same time?
    Thanks

    Thanks for your response.
    Very much appreciated. Was very informative.
    This is my current situation with your 3 suggestions:
    Daniele,
    Your suggestion 1’s result:
    In Wireshark ----> Under Statistics --->I have VoIP calls.
    (I don’t see VoIP calls under Telephony –> may be a different version of Wireshark).
    Anyway, there is only one call because the Wireshark had a Capture Filter to track information between one source and one destination IP address. So I select that call and click on Player button and then click on Decode button. Then I select the forward stream (From IP1 to IP2) and click on play and I don’t hear anything at all. All silence. Same when I select the reverse stream from IP2 to IP1 and play.
    Your suggestion 2’s result:
    In Wireshark ---> Under Statistics ---> I selected Stream Analysis (I did not select Show All Streams; not sure what the difference is), then ---> Save Payload ---> selected "au" instead of raw, and it says: "Can't save in a file: saving in au format supported only for alaw/ulaw streams".
    Your suggestion 3’s result:
    Saved the file in .raw format. Opened Audacity and imported the file as raw, FIRST specifying the A-Law codec for G.711A at 8000 Hz, which didn't work, and SECOND trying the u-Law coding for G.711u, again with a sample frequency of 8000 Hz, which didn't work either.
    "Didn't work" means: when I play the imported information I get all noise (like a heavy metallic sound) and no voice.
    So my guess is that this capture is neither A-Law nor u-Law codec, right? This capture was given to me by a customer.
    Any other suggestions – much appreciated Daniele.

  • Playing an audio file from applet

    Hi friends,
    I have a problem to be solved .
    Please help me.
    I have to play an audio file from an applet. When the applet loads it should start playing. There is a progress bar in the applet; it should also move along with the playback.
    How can I do that? Give me an idea.
    I know nothing about JMF.
    I am using j2sdk1.4, Windows 2000 Server.
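
    Since the progress-bar part never got a direct answer, here is a minimal sketch using only javax.sound.sampled and Swing, under the assumptions that a wav file is served next to the applet (the name "sound.wav" is a placeholder) and that the runtime is Java 5 or later for AudioSystem.getClip(); on 1.4 you would obtain the Clip through a DataLine.Info instead. The clip starts when the applet starts, and a Swing timer polls its position to move the bar.

        import java.applet.Applet;
        import java.awt.event.ActionEvent;
        import java.awt.event.ActionListener;
        import javax.sound.sampled.*;
        import javax.swing.*;

        // Minimal sketch: play an audio clip on load and drive a progress bar while it plays.
        public class AudioProgressApplet extends Applet {
            private Clip clip;
            private JProgressBar bar = new JProgressBar(0, 1000);
            private Timer timer;

            public void init() {
                add(bar);
                try {
                    // "sound.wav" is a placeholder for a wav file served alongside the applet.
                    AudioInputStream ais = AudioSystem.getAudioInputStream(
                            new java.net.URL(getDocumentBase(), "sound.wav"));
                    clip = AudioSystem.getClip();
                    clip.open(ais);
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }

            public void start() {
                if (clip == null) return;
                clip.start();
                timer = new Timer(200, new ActionListener() {
                    public void actionPerformed(ActionEvent e) {
                        bar.setValue((int) (1000L * clip.getMicrosecondPosition()
                                / clip.getMicrosecondLength()));
                    }
                });
                timer.start();
            }

            public void stop() {
                if (timer != null) timer.stop();
                if (clip != null) clip.stop();
            }
        }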

    Andrew,
    Forgive my naivety, but I struggle with Java! I am desperately trying to create a Java plugin for SERVOY and have to date been successful with a few things: record, playback, convert to .spx.
    I am really struggling with implementing a rewind and fast-forward function of the type you have. I thought it would be easy, but it appears not.
    At the bottom is the complete code from my class that is called by the Servoy plugin wrapper.
    I had thought that adding a simple jump of 500 ms would have been easy in something like this:
    public AudioStream js_FastForward (AudioStream as) throws IOException {
    !!! Line or two here to move the play head forward !!!
         AudioPlayer.player.start(as);
         return as;
    Can you give me any pointers on how to code that fast-forward bit?
    many thanks
    David
    package com.d2e.MyPlugin;
    import com.servoy.j2db.scripting.IScriptObject;
    import javax.swing.*;
    import java.awt.*;
    import java.awt.event.*;
    import java.io.*;
    import java.awt.Button;
    import javax.sound.sampled.*;
    import org.xiph.speex.*;
    import org.xiph.speex.spi.*;
    import  sun.audio.*;    //import the sun.audio package
    public class MyPluginProvider implements IScriptObject  {
          public void JSencode()
             throws IOException{
               /** Version of the Speex Encoder */
                final String VERSION = "Java Speex Command Line Encoder v0.9.7 ($Revision: 1.5 $)";
                /** Copyright display String */
                final String COPYRIGHT = "Copyright (C) 2002-2004 Wimba S.A.";
                /** Print level for messages : Print debug information */
                final int DEBUG = 0;
                /** Print level for messages : Print basic information */
                final int INFO  = 1;
                /** Print level for messages : Print only warnings and errors */
                final int WARN  = 2;
                /** Print level for messages : Print only errors */
                final int ERROR = 3;
                int printlevel = INFO;
                /** File format for input or output audio file: Raw */
                final int FILE_FORMAT_RAW  = 0;
                /** File format for input or output audio file: Ogg */
                final int FILE_FORMAT_OGG  = 1;
                /** File format for input or output audio file: Wave */
                final int FILE_FORMAT_WAVE = 2;
                int srcFormat  = FILE_FORMAT_OGG;
                int destFormat = FILE_FORMAT_WAVE;
                int mode       = -1;
                int quality    = 4;
                /** Defines the encoders algorithmic complexity. */
                 int complexity = 3;
                /** Defines the number of frames per speex packet. */
                 int nframes    = 1;
                /** Defines the desired bitrate for the encoded audio. */
                 int bitrate    = -1;
                /** Defines the sampling rate of the audio input. */
                 int sampleRate = -1;
                /** Defines the number of channels of the audio input (1=mono, 2=stereo). */
                 int channels   = 1;
                /** Defines the encoder VBR quality setting (float from 0 to 10). */
                 float vbr_quality = -1;
                /** Defines whether or not to use VBR (Variable Bit Rate). */
                 boolean vbr    = false;
                /** Defines whether or not to use VAD (Voice Activity Detection). */
                 boolean vad    = false;
                /** Defines whether or not to use DTX (Discontinuous Transmission). */
                 boolean dtx    = false;
              //HArd code src format and dest
               // private String srcPath ="junk.wav";
              srcFormat = FILE_FORMAT_WAVE;
              destFormat = FILE_FORMAT_OGG;
              //destPath="junk.spx";
             byte[] temp    = new byte[2560]; // stereo UWB requires one to read 2560b
             final int HEADERSIZE = 8;
             final String RIFF      = "RIFF";
             final String WAVE      = "WAVE";
             final String FORMAT    = "fmt ";
             final String DATA      = "data";
             final int WAVE_FORMAT_PCM = 0x0001;
             // Open the input stream
             DataInputStream dis = new DataInputStream(new FileInputStream("junk.wav"));
             // Prepare input stream
           //DP - Sort out the Wave File
               // read the WAVE header
               dis.readFully(temp, 0, HEADERSIZE+4);
               // Read other header chunks
               dis.readFully(temp, 0, HEADERSIZE);
               String chunk = new String(temp, 0, 4);
               int size = readInt(temp, 4);
               while (!chunk.equals(DATA)) {
                 dis.readFully(temp, 0, size);
                 if (chunk.equals(FORMAT)) {
                   /* Layout of the WAVE format chunk (from the Windows headers):
                      typedef struct waveformat_extended_tag {
                        WORD  wFormatTag;        // format type
                        WORD  nChannels;         // number of channels (i.e. mono, stereo...)
                        DWORD nSamplesPerSec;    // sample rate
                        DWORD nAvgBytesPerSec;   // for buffer estimation
                        WORD  nBlockAlign;       // block size of data
                        WORD  wBitsPerSample;    // number of bits per sample of mono data
                        WORD  cbSize;            // the count in bytes of the extra size
                      } WAVEFORMATEX; */
                   if (readShort(temp, 0) != WAVE_FORMAT_PCM) {
                     System.err.println("Not a PCM file");
                     return;
                   }
                   channels = readShort(temp, 2);
                   sampleRate = readInt(temp, 4);
                   if (readShort(temp, 14) != 16) {
                     System.err.println("Not a 16 bit file " + readShort(temp, 18));
                     return;
                   }
                   // Display audio info
                   if (printlevel <= DEBUG) {
                     System.out.println("File Format: PCM wave");
                     System.out.println("Sample Rate: " + sampleRate);
                     System.out.println("Channels: " + channels);
                   }
                 }
                 dis.readFully(temp, 0, HEADERSIZE);
                 chunk = new String(temp, 0, 4);
                 size = readInt(temp, 4);
               }
             if (printlevel <= DEBUG) System.out.println("Data size: " + size);
           //DP ENd sort wave file 
           //Now Choose the mode , we have a file sampled at 44100
                 mode = 2; // Ultra-wideband
             // Construct a new encoder
             SpeexEncoder speexEncoder = new SpeexEncoder();
             speexEncoder.init(mode, quality, sampleRate, channels);
             if (complexity > 0) {
               speexEncoder.getEncoder().setComplexity(complexity);
             }
             if (bitrate > 0) {
               speexEncoder.getEncoder().setBitRate(bitrate);
             }
             if (vbr) {
               speexEncoder.getEncoder().setVbr(vbr);
               if (vbr_quality > 0) {
                 speexEncoder.getEncoder().setVbrQuality(vbr_quality);
               }
             }
             if (vad) {
               speexEncoder.getEncoder().setVad(vad);
             }
             if (dtx) {
               speexEncoder.getEncoder().setDtx(dtx);
             }
             // Display info
             // Open the file writer
             AudioFileWriter writer;
             if (destFormat == FILE_FORMAT_OGG) {
               writer = new OggSpeexWriter(mode, sampleRate, channels, nframes, vbr);
             }
             else if (destFormat == FILE_FORMAT_WAVE) {
               nframes = PcmWaveWriter.WAVE_FRAME_SIZES[mode-1][channels-1][quality];
               writer = new PcmWaveWriter(mode, quality, sampleRate, channels, nframes, vbr);
             }
             else {
               writer = new RawWriter();
             }
             writer.open("junk.spx");
             writer.writeHeader("Encoded with: " + VERSION);
             int pcmPacketSize = 2 * channels * speexEncoder.getFrameSize();
             try {
               // read until we get to EOF
               while (true) {
                 dis.readFully(temp, 0, nframes*pcmPacketSize);
                 for (int i=0; i<nframes; i++)
                   speexEncoder.processData(temp, i*pcmPacketSize, pcmPacketSize);
                 int encsize = speexEncoder.getProcessedData(temp, 0);
                 if (encsize > 0) {
                   writer.writePacket(temp, 0, encsize);
                 }
               }
             }
             catch (EOFException e) {}
             writer.close();
             dis.close();
           }
           /**
            * Converts Little Endian (Windows) bytes to an int (Java uses Big Endian).
            * @param data the data to read.
            * @param offset the offset from which to start reading.
            * @return the integer value of the reassembled bytes.
            */
           protected static int readInt(final byte[] data, final int offset) {
             return (data[offset] & 0xff) |
                    ((data[offset+1] & 0xff) <<  8) |
                    ((data[offset+2] & 0xff) << 16) |
                    (data[offset+3] << 24); // no 0xff on the last one to keep the sign
           }
           /**
            * Converts Little Endian (Windows) bytes to a short (Java uses Big Endian).
            * @param data the data to read.
            * @param offset the offset from which to start reading.
            * @return the integer value of the reassembled bytes.
            */
           protected static int readShort(final byte[] data, final int offset) {
             return (data[offset] & 0xff) |
                    (data[offset+1] << 8); // no 0xff on the last one to keep the sign
           }
         AudioFormat audioFormat;
         TargetDataLine targetDataLine;
         public Class[] getAllReturnedTypes() {
              // TODO Auto-generated method stub
              return null;
         }
         public String[] getParameterNames(String arg0) {
              // TODO Auto-generated method stub
              return null;
         }
         public String getSample(String arg0) {
              // TODO Auto-generated method stub
              return null;
         }
         public String getToolTip(String arg0) {
              // TODO Auto-generated method stub
              return null;
         }
         public boolean isDeprecated(String arg0) {
              // TODO Auto-generated method stub
              return false;
         }
         public String js_Record (String name){
              captureAudio();
              return "Started Recording " + name;
         }
         public String js_StopRecord (String name) throws IOException{
              //new ActionListener(){
              //   public void actionPerformed(ActionEvent e)
                   //Terminate the capturing of input data
                   // from the microphone.
                   targetDataLine.stop();
                   targetDataLine.close();
              //  }//end actionPerformed
              //};//end ActionListener
                  // JSpeexEnc ("junk.wav","output.spx");
                   JSencode();
              return "Stop Records " + name;
         }
         //Play audio file
         public AudioStream js_Playback (String name) throws IOException{
              InputStream in = new FileInputStream("junk.wav");
              AudioStream as = new AudioStream(in);
              AudioPlayer.player.start(as);
              return as;
         }
         public AudioStream js_ContPlay (AudioStream as) throws IOException {
              AudioPlayer.player.start(as);
              return as;
         }
         //Stop Play
         public AudioStream js_Stop_Playback (AudioStream as) throws IOException{
              AudioPlayer.player.stop(as);
              return as;
         }
         //This method captures audio input from a
        // microphone and saves it in an audio file.
        private void captureAudio(){
          try{
            //Get things set up for capture
            audioFormat = getAudioFormat();
            DataLine.Info dataLineInfo =
                                new DataLine.Info(
                                  TargetDataLine.class,
                                  audioFormat);
            targetDataLine = (TargetDataLine)
                     AudioSystem.getLine(dataLineInfo);
            //Create a thread to capture the microphone
            // data into an audio file and start the
            // thread running.  It will run until the
            // Stop button is clicked.  This method
            // will return after starting the thread.
            new CaptureThread().start();
          }catch (Exception e) {
            e.printStackTrace();
            System.exit(0);
          }//end catch
        }//end captureAudio method  
    //  This method creates and returns an
        // AudioFormat object for a given set of format
        // parameters.  If these parameters don't work
        // well for you, try some of the other
        // allowable parameter values, which are shown
        // in comments following the declarations.
        private AudioFormat getAudioFormat(){
          float sampleRate = 44100.0F;
          //8000,11025,16000,22050,44100
          int sampleSizeInBits = 16;
          //8,16
          int channels = 1;
          //1,2
          boolean signed = true;
          //true,false
          boolean bigEndian = true;
          //true,false
          return new AudioFormat(sampleRate,
                                 sampleSizeInBits,
                                 channels,
                                 signed,
                                 bigEndian);
        }//end getAudioFormat
        class CaptureThread extends Thread{
               public void run(){
                 AudioFileFormat.Type fileType = null;
                 File audioFile = null;
                 //Set the file type and the file extension
                 // based on the selected radio button.
                 //if(aifcBtn.isSelected()){
                  // fileType = AudioFileFormat.Type.AIFC;
                  // audioFile = new File("junk.aifc");
                // }else if(aiffBtn.isSelected()){
                 //  fileType = AudioFileFormat.Type.AIFF;
                 //  audioFile = new File("junk.aif");
                // }else if(auBtn.isSelected()){
                 //  fileType = AudioFileFormat.Type.AU;
                 //  audioFile = new File("junk.au");
                // }else if(sndBtn.isSelected()){
                //   fileType = AudioFileFormat.Type.SND;
                 //  audioFile = new File("junk.snd");
                // }else if(waveBtn.isSelected()){
                 fileType = AudioFileFormat.Type.WAVE;
                 audioFile = new File("junk.wav");
                // }//end if
                 try{
                   targetDataLine.open(audioFormat);
                   targetDataLine.start();
                   AudioSystem.write(
                         new AudioInputStream(targetDataLine),
                         fileType,
                         audioFile);
                 }catch (Exception e){
                   e.printStackTrace();
                 }//end catch
               }//end run
             }//end inner class CaptureThread
    }
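
    On the fast-forward question itself, one possible approach (a sketch only, under the assumption that moving from sun.audio to javax.sound.sampled is acceptable; the class, method and file names are made up): work out how many bytes correspond to the jump from the stream's format and skip() them on an AudioInputStream before resuming playback.

        import java.io.File;
        import javax.sound.sampled.*;

        // Illustrative sketch: jump roughly the requested number of milliseconds forward
        // in a PCM wav stream, then hand the stream back for playback.
        public class FastForwardSketch {
            public static AudioInputStream fastForward(File wavFile, long millis) throws Exception {
                AudioInputStream ais = AudioSystem.getAudioInputStream(wavFile);
                AudioFormat fmt = ais.getFormat();
                // bytes per second = frame size * frames per second
                long bytesToSkip = (long) (fmt.getFrameSize() * fmt.getFrameRate() * (millis / 1000.0));
                long skipped = 0;
                while (skipped < bytesToSkip) {
                    long n = ais.skip(bytesToSkip - skipped);
                    if (n <= 0) break;   // reached the end of the stream
                    skipped += n;
                }
                return ais;              // feed this to a Clip or SourceDataLine to resume playing
            }
        }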

  • How to synchronize audio and video rtp-streams

    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive RTP streams, and then I create processors for the incoming streams' DataSources; they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    How are you supposed to sync audio and video if not as above?
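
    For what it's worth, the time-base recipe quoted from the JMF API Guide looks roughly like the sketch below (illustrative only, with made-up variable names; as the reply underneath explains, it may simply not apply to live RTP data):

        import javax.media.*;

        // Illustrative sketch of the guide's recipe: drive both players from one time base.
        public class SyncSketch {
            public static void syncStart(Player audioPlayer, Player videoPlayer) {
                try {
                    videoPlayer.setTimeBase(audioPlayer.getTimeBase()); // audio's clock drives both
                } catch (IncompatibleTimeBaseException e) {
                    // A push-data-source player may refuse a foreign time base, as happened here.
                    e.printStackTrace();
                    return;
                }
                // Start both against the same point on the shared time base, slightly in the future.
                Time startAt = new Time(audioPlayer.getTimeBase().getNanoseconds() + 500000000L);
                audioPlayer.syncStart(startAt);
                videoPlayer.syncStart(startAt);
            }
        }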

    camelstrike wrote:
    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive rtp-streams and then I create processors for the incoming stream's datasource's, and they are ALL PushBufferDataSource's.
    Does this mean I can't synchronize these. If so is there any way I can change them into Pull...DataSources ?
    The RTP packets are timestamped when they leave, and they are played in order. You can't change the time base on an RTP stream because, for all intents and purposes, it's "live" data. It plays at the speed the transmitting RTP server wants it to play, and it never gets behind because it drops out-of-order and old data packets.
    If your RTP streams aren't synced correctly on the receiving end, it means they aren't being synced on the transmitting end... so you should look into the other side of the equation.

  • To retransmit the rtp streams received

    hi !
    Program A transmits media data to another program B using RTPSessionMgr. What B has to do is send the received streams to another receiver, program C, which has to play the stream. That is, I am trying to implement a client-router-server model.
    So I maintained a session between A & B and also between B & C. Once a NewReceiveStream event occurs at B, it retrieves the DataSource from the stream and uses the createSendStream() method of the session manager to send to C.
    My problem is that C receives the audio and video streams but never plays them. I get a pink screen for video and no audio.
    Can somebody help me by pointing out my mistake, or tell me a new strategy to implement this?
    Actually, I have tried using only datagram sockets on the router side, i.e. I created 2 sockets for B to receive data and forwarded to C using 2 other sockets, and this did work. But then the client does not know the sender details, and I need to maintain the sender/receiver reports. So I went for the session manager (RTPManager/RTPSessionMgr).
    kindly help.
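
    A rough sketch of the forwarding step at B as described above, assuming an RTPManager already initialized and addressed toward C (the class name and field names are placeholders). Note that a pink screen and silence at C often means C cannot negotiate the incoming format, so the formats of the forwarded streams are worth checking too:

        import javax.media.protocol.DataSource;
        import javax.media.rtp.*;
        import javax.media.rtp.event.*;

        // Illustrative sketch: on B, forward every stream received from A on to C.
        public class Forwarder implements ReceiveStreamListener {
            private final RTPManager toC;   // assumed already initialized toward C

            public Forwarder(RTPManager toC) {
                this.toC = toC;
            }

            public void update(ReceiveStreamEvent evt) {
                if (evt instanceof NewReceiveStreamEvent) {
                    try {
                        ReceiveStream stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
                        DataSource ds = stream.getDataSource();
                        SendStream out = toC.createSendStream(ds, 0);  // forward the same payload
                        out.start();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        }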

    Hi all!
    Nice to meet you. It's very exciting that I got a project that will integrate JMF and JXTA: it gets RTP streams from the JMF framework and then sends them to the JXTA framework, to be delivered to any peer in some groups for visualization. Some aspects of this are similar to yours, so would you mind adding me to your MSN contact list (my MSN account is: [email protected])?

  • How can I manage controls in an RTP Streaming

    Hello,
    I'm currently using an RTP Streaming in order to play some music. I started with AVReceive2 and AVTransmit2 from sun and by adding a plugin, I managed to play MP3. So far so good.
    Because I didn't want to use view components given by sun, I created my own GUI for my player and I use it instead of using the one in sun's examples.
    I use listeners for my "buttons" (I prefer using mouselistener and Panel like in the sun's example Jamp) but I can't find a way to execute those buttons.
    If I press pause, it has to pause the streaming in AVTransmit2.
    If I press next or previous, it has to change the AVTransmit2 process.
    Each control has to be done in AVTransmit2, but my player is part of AVReceive2.
    My first guess was to check if I could use an event to tell my AVTransmit2 object to execute my controls, but I haven't found a way to tell my AVTransmit2 object which button I pressed.
    I finally stopped trying to transmit my orders through events.
    I then tried to find a way to get my AVTransmit2 object into my AVReceive2 object so that I could call methods on it, but I failed :(
    How can i manage my controls so that it will call methods on my server and not my client ?
    Thanks :)
    Shad.
    Edited by: Shadwolf on Feb 9, 2010 2:15 AM

    Hi, again,
    For the next & previous button, I was thinking in something.
    Do you think it would be good to create a new class RTPClientManager which extends RTPManager and has 2 booleans, previous & next, that are set to true when I press the buttons?
    From here, couldn't I modify my function in AVTransmit2 like this (I implement ReceiveStreamListener on AVTransmit2):
    @Override
    public void update(ReceiveStreamEvent evt)
    {
         RTPClientManager mgr = (RTPClientManager) evt.getSource();
         Participant participant = evt.getParticipant();     // could be null.
         ReceiveStream stream = evt.getReceiveStream();       // could be null.
         /*
          * Detect the client closing its connection so the streaming transmission can be shut down.
          */
         if (evt instanceof ByeEvent)
         {
              System.err.println("  - Got \"bye\" from: " + participant.getCNAME());
              if (mgr.isNext()) {
                   this.stop();
                   /* SOME STUFF FOR NEXT */
              }
              else if (mgr.isPrevious()) {
                   this.stop();
                   /* SOME STUFF FOR PREVIOUS */
              }
              else { // means it's the stop button that was pressed
                   this.stop();
              }
         }
    }
    Would it work?
    If you have better ideas I am ready to hear them :P
    But if this works, I will manage stop, next and previous; but how could I manage the play & pause buttons?
    I can't find a proper solution to manage my controls :(

  • Adding Effect on an incoming H.263/RTP stream

    Hi,
    I am playing an H.263/RTP stream which is multicast over a network. I want to play it with rotation or some other effect added. I tried the following link for help (as the forums suggested): http://java.sun.com/products/java-media/jmf/2.1.1/solutions/RotationEffect.html. But the link is disabled, and the new link they provide does not contain the example.
    Can anyone help me with how to add an effect on H.263/RTP?
    I have already created my effect class and tried adding it, but could not add the effect.
    Kindly give me some directions.

    Is there some error in the above code, or is there some other issue?
    There's an error in the above code because you're assuming I'm being imprecise with my language. I am not.
    You must copy the data from the input buffer to the output buffer...
    Buffer objects are wrappers around byte[], and you cannot copy data from one array to another by modifying the object references... which is what your code does.
    public int process(Buffer inBuffer, Buffer outBuffer) {
        // Make sure the output buffer will hold data
        if (!(outBuffer.getData() instanceof byte[])) {
            outBuffer.setData(new byte[inBuffer.getLength()]);
            outBuffer.setLength(inBuffer.getLength());
            outBuffer.setOffset(0);
        }
        // Make sure the output buffer is long enough
        if (outBuffer.getLength() + outBuffer.getOffset() < inBuffer.getLength()) {
            int offset = outBuffer.getOffset();
            int length = inBuffer.getLength() + outBuffer.getOffset();
            byte[] oldData = (byte[]) outBuffer.getData();
            byte[] newData = new byte[length];
            // Copy the previous data
            for (int i = 0; i < offset; i++) {
                newData[i] = oldData[i];
            }
            // Set the variables for the output buffer
            outBuffer.setData(newData);
            outBuffer.setLength(length);
            outBuffer.setOffset(offset);
        }
        // Copy the data from in to out
        int j = outBuffer.getOffset();
        for (int i = inBuffer.getOffset(); i < inBuffer.getOffset() + inBuffer.getLength(); i++) {
            ((byte[]) outBuffer.getData())[j++] = ((byte[]) inBuffer.getData())[i];
        }
        return BUFFER_PROCESSED_OK;
    }
    That's untested code above, fix my dumb mistakes as necessary.
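
    On the original question of actually attaching the effect, a common pattern (sketched here under the assumption that you have a Processor in the Configured state over the received stream and a custom Effect implementation; the names are placeholders) is to set the effect as a codec chain on the video track:

        import javax.media.*;
        import javax.media.control.TrackControl;
        import javax.media.format.VideoFormat;

        // Illustrative sketch: insert a custom Effect into the video track of a Processor.
        public class EffectWiring {
            public static void addEffect(Processor p, Effect myEffect) throws Exception {
                // p is assumed to already be in the Configured state.
                TrackControl[] tracks = p.getTrackControls();
                for (int i = 0; i < tracks.length; i++) {
                    if (tracks[i].getFormat() instanceof VideoFormat) {
                        tracks[i].setCodecChain(new Codec[] { myEffect });  // Effect extends Codec
                    }
                }
                p.realize();  // then start the processor/player as usual
            }
        }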

  • Some questions while using JMF play mpeg-1 stream media?

    I use Helix Server to provide unicast and use JMF to play the stream media, but JMF only plays the audio stream transferred by RTP. I used the video player's getVisualComponent() method to get the visual component and added it to the frame, but I get a pink screen, so I think something is wrong with the video player.
    Who can give me an answer?
    1. Can JMF play MPEG-1 video stream media? (Audio can be heard. The video stream comes to my computer but can't be displayed.)
    2. Which method should be used to synchronize the video stream and the audio stream?
    BTW: In JMF, when the RTSP player's start() method is invoked, the player creates players according to the stream's track numbers. The video stream and the audio stream have different players.

    Try posting your question under the Java Media Framework forum instead.
    You'll get much better responses if you ask the right people !
    regards,
    Owen

  • Bug: Quicktime 7.1.6 won't play MPA audio streams

    Since upgrading to 7.1.6, Quicktime will no longer play RTP streams encoded as Mpeg-1 Layer 2 audio. The stream will run, and the info window will show the bitrate, but no sound is heard. The behavior is now the same as Layer 3 (mp3) streams, which is a much older bug. Any chance for a fix?
    Thanks.

    Please ignore this post, it looks like I was completely mistaken, it's working now.

  • Eavesdrop on RTP stream

    Hi,
    I was wondering, does anyone know of good tools (for Windows) to reconstruct a captured RTP stream into an audible format?
    I am trying to assess the level of IPT security in our enterprise, and one thing I would like to show is how easy it is for hackers to capture unencrypted RTP traffic, reconstruct the conversation and play it out.
    I think that such tools are available, so I am just looking for the recommended ones.
    Thanks,
    David

    There are freeware sniffers like ettercap that can be used to perform man-in-the-middle attacks and capture VoIP traffic. I don't remember the name of the Linux tool that can decode a VoIP sniffer dump; I will see if I can find it.
    It's not at all hard for a person with enough knowledge to be dangerous to sniff your network. Again, VoIP security depends very much on your LAN network security as well.
    HTH
    Sankar
