Play an audio file over HTTP

Hello everybody,
I'm trying to play an audio file over the HTTP protocol. The code is the following:
import java.io.IOException;
import java.net.MalformedURLException;
import java.net.URL;

import javax.media.Manager;
import javax.media.NoPlayerException;
import javax.media.Player;
import javax.media.protocol.DataSource;
import javax.media.protocol.URLDataSource;

public class HTTPClientJMF {

    static String url = "http://localhost/audio/Reklam1.wav";
    static String urlFile = "file:///C://tmp/audio/Reklam1.wav";

    public static void main(String[] args) {
        try {
            // Wrap the URL in a JMF DataSource and hand it to a Player.
            DataSource dataS = new URLDataSource(new URL(url));
            dataS.connect();
            Player player = Manager.createPlayer(dataS);
            player.start();
        } catch (MalformedURLException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } catch (NoPlayerException e) {
            e.printStackTrace();
        }
    }
}
Now, if I play the file from the local disk (urlFile, using the file protocol), everything goes well, but if I try to play the same file from the network (using the http protocol), I get the following exception:
javax.media.NoPlayerException: Cannot find a Player for: javax.media.protocol.URLDataSource@fa9cf
Can somebody tell me what I'm doing wrong?
Thank you!
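
One thing that often sidesteps a NoPlayerException is to let JMF resolve the handler itself from a MediaLocator and to block until the player is realized, instead of handing it a pre-connected URLDataSource. A minimal sketch along those lines, reusing the same test URL (whether the local JMF install actually ships an HTTP-capable WAV handler is an assumption):

import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.Player;

public class HTTPClientJMF2 {
    public static void main(String[] args) throws Exception {
        // Let JMF pick the protocol handler and the parser itself.
        MediaLocator locator = new MediaLocator("http://localhost/audio/Reklam1.wav");
        // createRealizedPlayer blocks until the player is realized,
        // so start() is never called on an unrealized player.
        Player player = Manager.createRealizedPlayer(locator);
        player.start();
        // Keep the JVM alive long enough for playback to finish.
        Thread.sleep(10000);
        player.close();
    }
}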

Ah, okay. I constructed the DataSource as follows:
Buffer mediaBuffer = new Buffer();
String mediaURL = "http://ares.inescn.pt/video/Reklam1.wav";
URL url;
try {
    url = new URL(mediaURL);
    InputStream in = url.openStream();
    BufferedInputStream bufIn = new BufferedInputStream(in);
    for (;;) {
        int data = bufIn.read();
        // Check for EOF
        if (data == -1)
            break;
        else
            mediaBuffer.setData(data);
    }
    System.out.println(mediaBuffer.getLength());
    if (mediaBuffer.getLength() != 0) {
        DataSource ds = new DataSource();
        HttpStream[] httpStream = ds.getStreams();
        System.out.println(httpStream.length);
        httpStream[0].read(mediaBuffer);
        ds.connect();
        ds.start();
        Player player = Manager.createPlayer(ds);
        player.start();
        // (rest of the try/catch block was cut off in the post)
But I still cannot play an audio file (WAV format) over HTTP. The application starts, but nothing happens.
I made the modification you suggested (in the DataSource, more exactly in the getStreams() method).
I also renamed the HttpDatasource to HttpStream.
Did you actually try the code? I've been reading the instructions on how to test the code, but I was not able to run the example.
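
If JMF still cannot find a handler for the HTTP DataSource, a fallback that needs no custom DataSource at all is to open the URL directly and hand the stream to Java Sound. A rough sketch, assuming the server returns a plain PCM WAV (same test file as above):

import java.io.BufferedInputStream;
import java.net.URL;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;
import javax.sound.sampled.DataLine;

public class HttpWavFallback {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://localhost/audio/Reklam1.wav");
        // BufferedInputStream provides the mark/reset support that
        // AudioSystem needs in order to sniff the file format.
        AudioInputStream ais = AudioSystem.getAudioInputStream(
                new BufferedInputStream(url.openStream()));
        DataLine.Info info = new DataLine.Info(Clip.class, ais.getFormat());
        Clip clip = (Clip) AudioSystem.getLine(info);
        clip.open(ais);                                    // loads the whole stream
        clip.start();
        Thread.sleep(clip.getMicrosecondLength() / 1000);  // wait for playback to end
        clip.close();
    }
}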

Similar Messages

  • Sending audio data over http problem

    Hi Guys,
    We are trying to create a little servlet in Tomcat which is capable of sending audio files over HTTP to an embedded media player. The definition of the player looks like this:
    <OBJECT ID="Mp" CLASSID="CLSID:6BF52A52-394A-11d3-B153-00C04F79FAA6" TYPE="application/x-oleobject" WIDTH="0" HEIGHT="0">
    <PARAM name="uiMode" value="none">
    <PARAM NAME="ShowControls" VALUE="0">
    <PARAM NAME="AutoStart" VALUE="1">
    <PARAM NAME="ShowPositionControls" VALUE="0">
    <PARAM NAME="ShowStatusBar" VALUE="0">
    <PARAM NAME="ShowDisplay" VALUE="0">
    </OBJECT>
    <script language="javascript">document.Mp.URL = "here comes the url of the servlet with item ID";</script>
    The servlet reads the audio file and writes its content to the response with the following http header settings:
    getResponse().setContentType("audio/x-wav");
    getResponse().setHeader("Content-Transfer-Encoding", "binary");
    getResponse().setHeader("Pragma", "Public");
    getResponse().setHeader("Cache-Control", "must-revalidate, post-check=0, pre-check=0");
    getResponse().setHeader("Content-Disposition", "inline; filename=Media.wav");
    getResponse().setHeader("Content-Length", new Integer(MediaBytes.length).toString());
    getResponse().setHeader("Accept-Ranges", "bytes");
    So everything works fine for WAV files in Internet Explorer, but we are facing problems with Firefox, where it does not work. The embedded Media Player says "Windows Media Player cannot play the file. One or more codecs required to play the file could not be found."
    But if we set the URL directly to the file on the server, everything works fine.
    We have analyzed the HTTP traffic in both situations, but we cannot understand how Internet Explorer/Firefox and Media Player work together:
    - how does Media Player know that the audio file is playable?
    - if the URL points directly to the file, the HTTP headers do not contain any information about the file type and only the extension is available; does Media Player check the file extension in the URL?
    - if the URL points to the servlet, why can Media Player in Firefox not determine the file type, and why does it throw an error?
    Any help is greatly appreciated!
    Thanks!
    Gabor
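
    The servlet code itself is not shown above, so here is a rough sketch of the kind of doGet the headers imply, with the Content-Type set before any bytes are written; the file path and class name are invented for illustration:

    import java.io.*;
    import javax.servlet.http.*;

    public class WavStreamServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            File wav = new File("/data/media/Media.wav");   // hypothetical location
            resp.setContentType("audio/x-wav");
            resp.setContentLength((int) wav.length());
            resp.setHeader("Content-Disposition", "inline; filename=Media.wav");

            InputStream in = new BufferedInputStream(new FileInputStream(wav));
            OutputStream out = resp.getOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            in.close();
            out.flush();
        }
    }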

    If you haven't already, I would try breaking down the problem. First confirm you're getting serial data then confirm that netcat can send some data. Like this:
    xxd < /dev/tty.usbmodemfa121 | less
    nc -u 10.0.1.3 7000 <<< 'hello over there'

  • Help me play an audio file in my GUI

    Hi
    I've been having some trouble getting code to play an audio file in my GUI
    Basically all I want to do is play a file in the same directory as the program, and have it loop. No controls, no buttons to change the play state.
    Could someone point me to or help me implement this with some example code? I'm really stuck and I'm just not sure how to implement it into the existing GUI class.
    Thanks for any help

    Hi
    I've done so, and the method gets called without any problems, but it doesn't play anything.
    Have a look -
    public void audioFile() {
        try {
            // From file
            AudioInputStream stream = AudioSystem.getAudioInputStream(new File("test.wav"));
            // From URL
            //stream = AudioSystem.getAudioInputStream(new URL("http://hostname/audiofile"));

            // At present, ALAW and ULAW encodings must be converted
            // to PCM_SIGNED before they can be played
            AudioFormat format = stream.getFormat();
            if (format.getEncoding() != AudioFormat.Encoding.PCM_SIGNED) {
                format = new AudioFormat(
                        AudioFormat.Encoding.PCM_SIGNED,
                        format.getSampleRate(),
                        format.getSampleSizeInBits() * 2,
                        format.getChannels(),
                        format.getFrameSize() * 2,
                        format.getFrameRate(),
                        true);        // big endian
                stream = AudioSystem.getAudioInputStream(format, stream);
            }
            // Create the clip
            DataLine.Info info = new DataLine.Info(
                    Clip.class, stream.getFormat(),
                    ((int) stream.getFrameLength() * format.getFrameSize()));
            Clip clip = (Clip) AudioSystem.getLine(info);
            // This method does not return until the audio file is completely loaded
            clip.open(stream);
            // Start playing
            clip.start();
        } catch (MalformedURLException e) {
            e.printStackTrace();   // was silently swallowed; print so failures are visible
        } catch (IOException e) {
            e.printStackTrace();
        } catch (LineUnavailableException e) {
            e.printStackTrace();
        } catch (UnsupportedAudioFileException e) {
            e.printStackTrace();
        }
    }
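
    One note against the original goal: the code above only calls clip.start(), which plays the file once. For continuous looping, Clip already has a loop method; the line below would go in place of the plain clip.start() call:

        // Loop forever instead of playing once.
        clip.loop(Clip.LOOP_CONTINUOUSLY);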

  • How do I play an audio file as a soundtrack for entire keynot

    How do I play an audio file as a soundtrack for entire keynote?

    https://developer.apple.com/mac/library/samplecode/PlayFile/Introduction/Intro.html
    Docs are your friend... Mac OS has its own, just like the iPhone.

  • Subsequent play of WAV files from HTTP

    Hi,
    I have to play many WAV files one after another from HTTP in a Nokia Series 60 emulator.
    Everything is OK for the first two files, but on the third I catch an exception:
    "java.io.IOException: exceeded the configured maximum number of connections"
    I tried two ways to load the files from HTTP:
    Manager.createPlayer(InputStream...)
    Manager.createPlayer("http://...")
    Below is the code of the two methods I use. Please note that I try in every way to close the connection before opening a new one!
    public void Method_1(String src, String type) {
        int iLen = 0;
        is = null;
        httpCon = null;
        byte[] bBuf;   // declared but not used here
        try {
            // Tear down any previous player, stream and connection first.
            if (p != null) {
                if (p.getState() == Player.STARTED) {
                    p.stop();
                }
                p.removePlayerListener(this);
                p.deallocate();
                p.close();
                p = null;
            }
            if (is != null) {
                is.close();
                is = null;
            }
            if (httpCon != null) {
                httpCon.close();
                httpCon = null;
            }
            // Open a new connection and create a player from its stream.
            httpCon = (HttpConnection) Connector.open(src);
            if (httpCon.getResponseCode() != HttpConnection.HTTP_OK) {
                System.out.println("Bad response code");
                return;
            }
            System.out.println("Connection OK! (audio)");
            iLen = (int) httpCon.getLength();
            System.out.println("audio file length : " + iLen);
            is = httpCon.openInputStream();
            p = Manager.createPlayer(is, type);
            p.addPlayerListener(this);
            p.setLoopCount(1);
            p.prefetch();
            p.start();
            vc = (VolumeControl) p.getControl("VolumeControl");
        } catch (MediaException ex) {
            ex.printStackTrace();
        } catch (IOException ex) {
            ex.printStackTrace();
        }
    }

    public void Method_2(String src) {
        try {
            if (p != null) {
                if (p.getState() == Player.STARTED)
                    p.stop();
                p.deallocate();
                p.close();
                p = null;
            }
            p = Manager.createPlayer(src);
            p.addPlayerListener(this);
            p.setLoopCount(1);
            p.start();
            vc = (VolumeControl) p.getControl("VolumeControl");
        } catch (java.io.IOException e) {
            e.printStackTrace();
        } catch (MediaException e) {
            e.printStackTrace();
        }
    }

    My question is:
    how can I load (from an HTTP server) and play more than two WAV files without crashing?
    Do I have to close the connection in a better way in order to re-open it without problems (and more than two times)?
    Or is it simply an emulator bug?
    Thanks.
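
    As a sketch of the cleanup order that usually avoids the connection-limit error on MIDP: close the Player first (it owns the InputStream), then the stream, then the HttpConnection, each step guarded so one failure cannot leak the connection. This is meant as a method inside the same MIDlet, reusing the p, is and httpCon fields from the code above; it has not been tested on the Series 60 emulator:

    private void closeEverything() {
        // Release the player before the stream it reads from.
        if (p != null) {
            try {
                if (p.getState() == Player.STARTED) {
                    p.stop();
                }
            } catch (MediaException me) {
                me.printStackTrace();
            }
            p.close();      // close() releases the Player's resources
            p = null;
        }
        if (is != null) {
            try { is.close(); } catch (IOException ioe) { /* ignore */ }
            is = null;
        }
        if (httpCon != null) {
            try { httpCon.close(); } catch (IOException ioe) { /* ignore */ }
            httpCon = null;
        }
    }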

  • Hi, I am using HP11 and iPlanet web server. When trying to upload files over HTTP using FORM ENCTYPE="multipart/form-data" that are bigger than a few kilobytes I get a 408 error (client timeout).

    Hi, I am using HP11 and iPlanet web server. When trying to upload files over HTTP using FORM ENCTYPE="multipart/form-data" that are bigger than a few kilobytes, I get a 408 error (client timeout). It is as if the server has decided that the client has timed out during the file upload. The default setting is 30 seconds for AcceptTimeout in the magnus.conf file. This should be ample to get the file across; even increasing it to 2 minutes just produces the same error after 2 minutes. Any help appreciated. Apologies if this is not the correct forum for this, I couldn't see one for iPlanet and Web. Many thanks, Kieran.

    Hi,
    You didn't mention which version of iWS you are using. Follow these steps:
    (1) Go to the Web Server Administration Server and select the server you want to manage.
    (2) Select Preferences >> Performance Tuning.
    (3) Set HTTP Persistent Connection Timeout to a value of your choice (e.g. 180 sec for three minutes).
    (4) Apply the changes and restart the server.
    Note: setting the timeout to a lower value, however, may prevent the transfer of large files, as the timeout does not refer to the time that the connection has been idle. For example, if you are using a 2400 baud modem and the request timeout is set to 180 seconds, then the maximum file size that can be transferred before the connection is closed is 432,000 bits (2400 multiplied by 180).
    Regards
    T.Raghulan
    [email protected]

  • Flat file over HTTP or SOAP

    Hey Guys,
    I need to post a flat file over HTTP (or SOAP). Is this possible without developing my own adapter module?
    I just need to get a flat file from an FTP server and post it to another server via HTTP. Since there is no message mapping involved, I developed the scenario without any Integration Repository objects; it is just a pass-through scenario.
    Now I am stuck on the receiver side, since I am unable to post the flat file over HTTP.
    Secondly, I have a Login URL, a Logout URL and an Upload URL from the receiver system. I don't see any place in the receiver HTTP adapter to put these 3 URLs; can I use the SOAP adapter to put them anywhere?
    Any help would be appreciated.
    Thanks
    Saif
    Edited by: Saif Manzar on Jan 19, 2010 2:51 AM


  • Retrieving files over HTTP or FTP

    I was wondering what program I should use to retrieve files over HTTP or FTP. Previously I had used wget, per my hosting provider's recommendation. It worked when I was logged in via SSH to their server (via Mac Terminal). However, when I try using wget on my local Mac it says "command not found".

    Thanks. So if I specify a file name (-o /path/to/file), does the incoming file get renamed to that (and put in that location) or does this specify the directory (-o /path/to/directory) that the incoming file will go to? I wasn't quite clear on that.
    Also, I keep hearing about stdout. What is it exactly? I assumed it was just the Terminal window itself, the alternative being things like | more or | nano or something like that... Or am I totally up the wrong tree?

  • Playing an audio file from applet

    Hi friends,
    I have a problem to be solved .
    Please help me.
    I have to play an audio file from an applet. When the applet loads, it should start playing. There is a progress bar in the applet; it should also move along with the playback.
    How can I do that? Give me an idea.
    I know nothing about jmf.
    I am using j2sdk1.4,windows 2000 server.
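
    If JMF is not a hard requirement, the plain applet API already covers the load-and-loop part; the progress bar would still need a Swing timer of your own, since AudioClip exposes no playback position. A minimal sketch (the clip file name is made up):

    import java.applet.Applet;
    import java.applet.AudioClip;

    public class LoopingAudioApplet extends Applet {
        private AudioClip clip;

        public void start() {
            // getAudioClip resolves the file relative to the applet's codebase.
            clip = getAudioClip(getCodeBase(), "music.au");   // hypothetical file
            clip.loop();   // plays continuously until stop() is called
        }

        public void stop() {
            if (clip != null) {
                clip.stop();
            }
        }
    }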

    Andrew,
    Forgive my naivety, but I struggle with Java! I am desperately trying to create a Java plugin for SERVOY and have to date been successful with a few things: record, playback, convert to .spx.
    I am really struggling with implementing a rewind and fast-forward function of the type you have. I thought it would be easy, but it appears not.
    At the bottom is the complete code from my class that is called by the Servoy plugin wrapper.
    I had thought that adding a simple 500 ms jump would have been easy in something like this:
    public AudioStream js_FastForward (AudioStream as) throws IOException {
         // !!! Line or two here to move the play head forward !!!
         AudioPlayer.player.start(as);
         return as;
    }
    Can you give me any pointers on how to code that fast-forward bit?
    many thanks
    David
    package com.d2e.MyPlugin;
    import com.servoy.j2db.scripting.IScriptObject;
    import javax.swing.*;
    import java.awt.*;
    import java.awt.event.*;
    import java.io.*;
    import java.awt.Button;
    import javax.sound.sampled.*;
    import org.xiph.speex.*;
    import org.xiph.speex.spi.*;
    import  sun.audio.*;    //import the sun.audio package
    public class MyPluginProvider implements IScriptObject  {
          public void JSencode()
             throws IOException{
               /** Version of the Speex Encoder */
                final String VERSION = "Java Speex Command Line Encoder v0.9.7 ($Revision: 1.5 $)";
                /** Copyright display String */
                final String COPYRIGHT = "Copyright (C) 2002-2004 Wimba S.A.";
                /** Print level for messages : Print debug information */
                final int DEBUG = 0;
                /** Print level for messages : Print basic information */
                final int INFO  = 1;
                /** Print level for messages : Print only warnings and errors */
                final int WARN  = 2;
                /** Print level for messages : Print only errors */
                final int ERROR = 3;
                int printlevel = INFO;
                /** File format for input or output audio file: Raw */
                final int FILE_FORMAT_RAW  = 0;
                /** File format for input or output audio file: Ogg */
                final int FILE_FORMAT_OGG  = 1;
                /** File format for input or output audio file: Wave */
                final int FILE_FORMAT_WAVE = 2;
                int srcFormat  = FILE_FORMAT_OGG;
                int destFormat = FILE_FORMAT_WAVE;
                int mode       = -1;
                int quality    = 4;
                /** Defines the encoders algorithmic complexity. */
                 int complexity = 3;
                /** Defines the number of frames per speex packet. */
                 int nframes    = 1;
                /** Defines the desired bitrate for the encoded audio. */
                 int bitrate    = -1;
                /** Defines the sampling rate of the audio input. */
                 int sampleRate = -1;
                /** Defines the number of channels of the audio input (1=mono, 2=stereo). */
                 int channels   = 1;
                /** Defines the encoder VBR quality setting (float from 0 to 10). */
                 float vbr_quality = -1;
                /** Defines whether or not to use VBR (Variable Bit Rate). */
                 boolean vbr    = false;
                /** Defines whether or not to use VAD (Voice Activity Detection). */
                 boolean vad    = false;
                /** Defines whether or not to use DTX (Discontinuous Transmission). */
                 boolean dtx    = false;
              //HArd code src format and dest
               // private String srcPath ="junk.wav";
              srcFormat = FILE_FORMAT_WAVE;
              destFormat = FILE_FORMAT_OGG;
              //destPath="junk.spx";
             byte[] temp    = new byte[2560]; // stereo UWB requires one to read 2560b
             final int HEADERSIZE = 8;
             final String RIFF      = "RIFF";
             final String WAVE      = "WAVE";
             final String FORMAT    = "fmt ";
             final String DATA      = "data";
             final int WAVE_FORMAT_PCM = 0x0001;
             // Open the input stream
             DataInputStream dis = new DataInputStream(new FileInputStream("junk.wav"));
             // Prepare input stream
           //DP - Sort out the Wave File
               // read the WAVE header
               dis.readFully(temp, 0, HEADERSIZE+4);
               // Read other header chunks
               dis.readFully(temp, 0, HEADERSIZE);
               String chunk = new String(temp, 0, 4);
               int size = readInt(temp, 4);
               while (!chunk.equals(DATA)) {
                 dis.readFully(temp, 0, size);
                 if (chunk.equals(FORMAT)) {
                    /* WAVE format header layout (from the Windows SDK), kept
                     * here as a reference comment rather than code:
                     * typedef struct waveformat_extended_tag {
                     *   WORD  wFormatTag;        // format type
                     *   WORD  nChannels;         // number of channels (i.e. mono, stereo...)
                     *   DWORD nSamplesPerSec;    // sample rate
                     *   DWORD nAvgBytesPerSec;   // for buffer estimation
                     *   WORD  nBlockAlign;       // block size of data
                     *   WORD  wBitsPerSample;    // number of bits per sample of mono data
                     *   WORD  cbSize;            // the count in bytes of the extra size
                     * } WAVEFORMATEX;
                     */
                   if (readShort(temp, 0) != WAVE_FORMAT_PCM) {
                     System.err.println("Not a PCM file");
                     return;
                   channels = readShort(temp, 2);
                   sampleRate = readInt(temp, 4);
                   if (readShort(temp, 14) != 16) {
                     System.err.println("Not a 16 bit file " + readShort(temp, 18));
                     return;
                   // Display audio info
                   if (printlevel <= DEBUG) {
                     System.out.println("File Format: PCM wave");
                     System.out.println("Sample Rate: " + sampleRate);
                     System.out.println("Channels: " + channels);
                 dis.readFully(temp, 0, HEADERSIZE);
                 chunk = new String(temp, 0, 4);
                 size = readInt(temp, 4);
               if (printlevel <= DEBUG) System.out.println("Data size: " + size);
           //DP ENd sort wave file 
           //Now Choose the mode , we have a file sampled at 44100
                 mode = 2; // Ultra-wideband
             // Construct a new encoder
             SpeexEncoder speexEncoder = new SpeexEncoder();
             speexEncoder.init(mode, quality, sampleRate, channels);
             if (complexity > 0) {
               speexEncoder.getEncoder().setComplexity(complexity);
             if (bitrate > 0) {
               speexEncoder.getEncoder().setBitRate(bitrate);
             if (vbr) {
               speexEncoder.getEncoder().setVbr(vbr);
               if (vbr_quality > 0) {
                 speexEncoder.getEncoder().setVbrQuality(vbr_quality);
             if (vad) {
               speexEncoder.getEncoder().setVad(vad);
             if (dtx) {
               speexEncoder.getEncoder().setDtx(dtx);
             // Display info
             // Open the file writer
             AudioFileWriter writer;
             if (destFormat == FILE_FORMAT_OGG) {
               writer = new OggSpeexWriter(mode, sampleRate, channels, nframes, vbr);
             else if (destFormat == FILE_FORMAT_WAVE) {
               nframes = PcmWaveWriter.WAVE_FRAME_SIZES[mode-1][channels-1][quality];
               writer = new PcmWaveWriter(mode, quality, sampleRate, channels, nframes, vbr);
             else {
               writer = new RawWriter();
             writer.open("junk.spx");
             writer.writeHeader("Encoded with: " + VERSION);
             int pcmPacketSize = 2 * channels * speexEncoder.getFrameSize();
             try {
               // read until we get to EOF
               while (true) {
                 dis.readFully(temp, 0, nframes*pcmPacketSize);
                 for (int i=0; i<nframes; i++)
                   speexEncoder.processData(temp, i*pcmPacketSize, pcmPacketSize);
                 int encsize = speexEncoder.getProcessedData(temp, 0);
                 if (encsize > 0) {
                   writer.writePacket(temp, 0, encsize);
             catch (EOFException e) {}
             writer.close();
             dis.close();
            * Converts Little Endian (Windows) bytes to an int (Java uses Big Endian).
            * @param data the data to read.
            * @param offset the offset from which to start reading.
            * @return the integer value of the reassembled bytes.
           protected static int readInt(final byte[] data, final int offset)
             return (data[offset] & 0xff) |
                    ((data[offset+1] & 0xff) <<  8) |
                    ((data[offset+2] & 0xff) << 16) |
                    (data[offset+3] << 24); // no 0xff on the last one to keep the sign
            * Converts Little Endian (Windows) bytes to an short (Java uses Big Endian).
            * @param data the data to read.
            * @param offset the offset from which to start reading.
            * @return the integer value of the reassembled bytes.
           protected static int readShort(final byte[] data, final int offset)
             return (data[offset] & 0xff) |
                    (data[offset+1] << 8); // no 0xff on the last one to keep the sign
         AudioFormat audioFormat;
           TargetDataLine targetDataLine;
         public Class[] getAllReturnedTypes() {
              // TODO Auto-generated method stub
              return null;
         public String[] getParameterNames(String arg0) {
              // TODO Auto-generated method stub
              return null;
         public String getSample(String arg0) {
              // TODO Auto-generated method stub
              return null;
         public String getToolTip(String arg0) {
              // TODO Auto-generated method stub
              return null;
         public boolean isDeprecated(String arg0) {
              // TODO Auto-generated method stub
              return false;
         public String js_Record (String name){
              captureAudio();
              return "Started Recording " +name;
        public String js_StopRecord (String name) throws IOException{
             //new ActionListener(){
             //   public void actionPerformed(ActionEvent e)
                  //Terminate the capturing of input data
                  // from the microphone.
                  targetDataLine.stop();
                  targetDataLine.close();
              //  }//end actionPerformed
             //};//end ActionListener
                 // JSpeexEnc ("junk.wav","output.spx");
                  JSencode();
              return "Stop Records " +name;
        //Play audio file
        public AudioStream js_Playback (String name) throws IOException{
             InputStream in = new FileInputStream("junk.wav");
             AudioStream as = new AudioStream(in);        
             AudioPlayer.player.start(as);           
             return as;
        public AudioStream js_ContPlay (AudioStream as) throws IOException {
             AudioPlayer.player.start(as);           
             return as;
         //Stop Play
        public AudioStream js_Stop_Playback (AudioStream as) throws IOException{
             AudioPlayer.player.stop(as);
              return as;
         //This method captures audio input from a
        // microphone and saves it in an audio file.
        private void captureAudio(){
          try{
            //Get things set up for capture
            audioFormat = getAudioFormat();
            DataLine.Info dataLineInfo =
                                new DataLine.Info(
                                  TargetDataLine.class,
                                  audioFormat);
            targetDataLine = (TargetDataLine)
                     AudioSystem.getLine(dataLineInfo);
            //Create a thread to capture the microphone
            // data into an audio file and start the
            // thread running.  It will run until the
            // Stop button is clicked.  This method
            // will return after starting the thread.
            new CaptureThread().start();
          }catch (Exception e) {
            e.printStackTrace();
            System.exit(0);
          }//end catch
        }//end captureAudio method  
    //  This method creates and returns an
        // AudioFormat object for a given set of format
        // parameters.  If these parameters don't work
        // well for you, try some of the other
        // allowable parameter values, which are shown
        // in comments following the declarations.
        private AudioFormat getAudioFormat(){
          float sampleRate = 44100.0F;
          //8000,11025,16000,22050,44100
          int sampleSizeInBits = 16;
          //8,16
          int channels = 1;
          //1,2
          boolean signed = true;
          //true,false
          boolean bigEndian = true;
          //true,false
          return new AudioFormat(sampleRate,
                                 sampleSizeInBits,
                                 channels,
                                 signed,
                                 bigEndian);
        }//end getAudioFormat
        class CaptureThread extends Thread{
               public void run(){
                 AudioFileFormat.Type fileType = null;
                 File audioFile = null;
                 //Set the file type and the file extension
                 // based on the selected radio button.
                 //if(aifcBtn.isSelected()){
                  // fileType = AudioFileFormat.Type.AIFC;
                  // audioFile = new File("junk.aifc");
                // }else if(aiffBtn.isSelected()){
                 //  fileType = AudioFileFormat.Type.AIFF;
                 //  audioFile = new File("junk.aif");
                // }else if(auBtn.isSelected()){
                 //  fileType = AudioFileFormat.Type.AU;
                 //  audioFile = new File("junk.au");
                // }else if(sndBtn.isSelected()){
                //   fileType = AudioFileFormat.Type.SND;
                 //  audioFile = new File("junk.snd");
                // }else if(waveBtn.isSelected()){
                 fileType = AudioFileFormat.Type.WAVE;
                 audioFile = new File("junk.wav");
                // }//end if
                 try{
                   targetDataLine.open(audioFormat);
                   targetDataLine.start();
                   AudioSystem.write(
                         new AudioInputStream(targetDataLine),
                         fileType,
                         audioFile);
                 }catch (Exception e){
                   e.printStackTrace();
                 }//end catch
               }//end run
             }//end inner class CaptureThread
    }
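
    On the fast-forward question: sun.audio.AudioStream exposes no seek, so one common route is to play through javax.sound.sampled instead, where a Clip can jump to an absolute position (AudioSystem.getClip() needs Java 5 or later). A rough sketch of the idea, not wired into the plugin above:

    import java.io.File;
    import javax.sound.sampled.AudioInputStream;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.Clip;

    public class SkippablePlayback {
        public static void main(String[] args) throws Exception {
            AudioInputStream ais = AudioSystem.getAudioInputStream(new File("junk.wav"));
            Clip clip = AudioSystem.getClip();
            clip.open(ais);
            clip.start();
            Thread.sleep(2000);                     // play for a bit
            // "Fast forward": jump 500 ms ahead of the current position.
            long newPos = clip.getMicrosecondPosition() + 500000L;
            clip.setMicrosecondPosition(Math.min(newPos, clip.getMicrosecondLength()));
            Thread.sleep(5000);
            clip.close();
        }
    }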

  • Playing the audio file from the point of mark

    Hi all, I want to play an audio file starting from a marked point.
    I have used the mark() method to mark the point, but when I play the audio it still plays from the beginning. Please help me.
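
    mark() on an InputStream only remembers a position for a later reset(); it does not tell the player where to begin. If the goal is to resume from a remembered point, one option is to record the playback position and seek back to it; a sketch with javax.sound.sampled, the file name invented:

    import java.io.File;
    import javax.sound.sampled.AudioInputStream;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.Clip;

    public class ResumeFromMark {
        public static void main(String[] args) throws Exception {
            AudioInputStream ais = AudioSystem.getAudioInputStream(new File("speech.wav")); // hypothetical file
            Clip clip = AudioSystem.getClip();
            clip.open(ais);

            clip.start();
            Thread.sleep(3000);
            long mark = clip.getMicrosecondPosition();   // remember ("mark") this spot
            clip.stop();

            // Later: resume from the marked position instead of the start.
            clip.setMicrosecondPosition(mark);
            clip.start();
            Thread.sleep(3000);
            clip.close();
        }
    }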

    Hi,
    Question:
    1. You have a folder for each DVD whose video contents you edited.
    Answer: Yes
    Question:
    You talk about projects 34, 35, 36, 37. Are these projects associated with only one specific DVD?
    What is supposed to be in each project - just edited VTS_01_1.VOB or edits from the others in the series
    such as VTS_01_2.VOB, VTS_01_3.VOB, etc.
    Answer:
    Project 34 has edited Premiere Elements file 34 and the VOB files (VST_01_1.VOB, VTS_01_2.VOB, etc) from DVD#34
    Project 35 has edited Premiere Elements file 35 and the VOB files (VST_01_1.VOB, VTS_01_2.VOB, etc) from DVD#35
    Project 36 has edited Premiere Elements file 36 and the VOB files (VST_01_1.VOB, VTS_01_2.VOB, etc) from DVD#36
    etc.
    Question:
    2. If you screen each DVD video file VOB for a given DVD disc, does the content follow sequentially and cleanly?
    Answer:
    Yes
    When I run the VST_01_1.VOB, VTS_01_2.VOB, etc from each project the audio and the video are clean and match.
    When I run the edited Premiere Elements files that were created after project 34's file (projects 35, 36, 37, etc.), I am getting the audio from project 34.
    When I run the edited Premiere Elements files that were created before project 34's file (projects 1, 2, all the way to 33), I am getting the correct audio.
    Thanks,
    Ron

  • Playing an audio file in a certain location

    Hi Everyone...
    I would like to play an audio file (edc.caf) that is located in this directory....
    /DCIM/TONY/IOW/AUDIO
    When I run it through the NSURL processor...
    [code]
    NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
    [/code]
    the resulting value of soundFileURL is
    file://localhost/DCIM/TONY/IOW/AUDIO/edc.caf
    But I do not hear any audio out of the iPod touch.
    I've tried two different methods of playing the audio,
    and both are failing.
    I know the audio is being recorded, as I can bring it back to the PC and play it fine.
    My initial thought is that the location of the file cannot be found, but I am not sure.
    Can anyone confirm the location is correct?
    thanks
    tony

    Small edit... the resulting value is a little different from what I posted. The actual string is:
    file://localhost/var/mobile/Media/DCIM/TONY/AUDIO/edc.caf
    Sorry about the confusion

  • Streaming audio file over the network w JMF. How to know when the file end

    Hi
    I am streaming an audio file over the network using JMF. I want to know when the file ends so I can close the streaming session.
    Can some one please help
    Thanks

    If you put a ControllerListener on the Processor that's associated with generating the RTP stream, it'll generate an "EndOfMedia" event when the end of the file is reached.
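
    A sketch of such a listener, assuming the Processor feeding the RTP stream already exists (names are illustrative):

    import javax.media.ControllerEvent;
    import javax.media.ControllerListener;
    import javax.media.EndOfMediaEvent;
    import javax.media.Processor;

    public class EndOfStreamWatcher implements ControllerListener {
        private final Processor processor;

        public EndOfStreamWatcher(Processor processor) {
            this.processor = processor;
            processor.addControllerListener(this);
        }

        public void controllerUpdate(ControllerEvent event) {
            if (event instanceof EndOfMediaEvent) {
                // The source file has been read to the end; tear the session down.
                processor.stop();
                processor.close();
                // ...close the RTP session manager / send streams here as well.
            }
        }
    }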

  • Is it possible to play an audio file out of labview (using myDAQ)?

    Hi all,
    I am using a myDAQ (first time). I am trying to play an audio file at a certain time in my VI.
    All the examples I have seen use the audio in and then play out of the audio out port. Is it not possible to simply point to the file location and play it out of the audio out under the given conditions in the VI, or even to import the sound? I am not trying to play large files or large quantities such as songs. I only want a congratulatory sound when the task is complete (2/3s).
    Any help would be appreciated (even to say it isn't possible!)
    Cheers 
    Solved!
    Go to Solution.

    Why not just play a WAV file? Use Play Sound File.vi as shown here.
    Richard

  • Is it possible to play two audio files simultaneously.

    Hi,
    Is it possible to play two audio files simultaneously.
    I modified the 'SpeakHere' example to play two audio files simultaneously; it worked perfectly on the simulator but hangs on the iPhone.
    To address this I implemented an NSTimer, which checks both player instances every second. As soon as either player stops, the timer task forcefully stops the other one and then restarts both.
    thanks.

    Hi Jay
    Do you have the eLearning Suite? If so, you have Adobe Presenter available to you. If you don't, you should be able to download the Presenter installation file and evaluate it.
    Presenter is an add in for PowerPoint. The neat thing about it is that it provides for a Table of Contents type of functionality as you see is configured in Captivate. But the twist with that is you are able to do something you cannot do in Captivate. You are able to insert videos inside the TOC area!
    Just musing out loud... Rick
    Helpful and Handy Links
    Captivate Wish Form/Bug Reporting Form
    Adobe Certified Captivate Training
    SorcerStone Blog
    Captivate eBooks

  • Downloading Text file over http

    Hi,
    I am trying to display some plain text files over HTTP to the client using a web browser. Everything works fine as long as the file size is small, but for files over 1 MB the front end just dies. Presumably the display happens only after the file has been completely transferred.
    Does anyone have a solution by which the text files can be downloaded incrementally, the way HTML files are? I have code for transferring an HTML file over HTTP as Explorer does it, but the same does not seem to work for plain text files.
    Please help
    Sachin

    I do not have any remote server written for this. What I am doing is: the directory in which the files are stored is exposed as a virtual directory through a web server (Apache), and then from the client I am just placing an HTTP request to open the file.
    So what can I do in this situation to handle the large files?
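
    If a servlet does end up in front of those files, streaming in fixed-size chunks and flushing as you go is what lets the browser render incrementally instead of waiting for the whole 1 MB. A rough sketch; the path and class names are invented:

    import java.io.*;
    import javax.servlet.http.*;

    public class TextStreamServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            File file = new File("/var/data/big.txt");   // hypothetical file
            resp.setContentType("text/plain");
            // Not setting Content-Length lets the container use chunked transfer.
            Reader in = new BufferedReader(new FileReader(file));
            PrintWriter out = resp.getWriter();
            char[] buf = new char[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
                out.flush();           // push each chunk to the client immediately
            }
            in.close();
        }
    }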
