Is it possible to play .amr audio files on Nokia Lumi...

Is it possible to play .amr audio files on the Nokia Lumia 800? Thank you.

Hi,
The Lumia, being a Windows Phone, supports these audio formats:
AAC, AMR-NB, AMR-WB, HE-AAC v1, HE-AAC v2, M4A, MP3, WAV, WMA 10 Pro, WMA 9
So yes, AMR is supported.

Similar Messages

  • Is it possible to play two audio files simultaneously?

    Hi,
    Is it possible to play two audio files simultaneously?
    I modified the 'SpeakHere' sample to play two audio files simultaneously; it worked perfectly in the simulator but hangs on the iPhone.
    To work around this I implemented an NSTimer that checks both player instances every second. As soon as either player stops, the timer task forcefully stops the second one and then restarts both.
    Thanks.

    Hi Jay
    Do you have the eLearning Suite? If so, you have Adobe Presenter available to you. If you don't, you should be able to download the Presenter installation file and evaluate it.
    Presenter is an add-in for PowerPoint. The neat thing about it is that it provides Table of Contents functionality like what you see configured in Captivate. But the twist is that you can do something you cannot do in Captivate: you are able to insert videos inside the TOC area!
    Just musing out loud... Rick

  • Is it possible to play an audio file out of labview (using myDAQ)?

    Hi all,
    I am using a myDAQ (first time). I am trying to play an audio file at a certain time in my VI.
    All the examples I have seen use the audio in and then play out of the audio out port. Is it not possible to simply find the file location and then play it out of the audio out under the given conditions in the VI or even import the sound? I am not trying to play large files or large quantities such as songs. I only want a congratulatory sound when the task is complete (2/3s).
    Any help would be appreciated (even if it's just to say it isn't possible!)
    Cheers 

    Why not just play a WAV file? Use Play Sound File.vi as shown here.
    Richard

  • Is it possible to edit/change audio files in a published html5 file?

    Is it possible to edit/change audio files in a published html5 file?

    Theoretically, YES, but they would need to be in exactly the same format and have the same filename.  Why would you want to do this?  If the new files are not the same playing length, this could potentially throw out any synchronisation with other elements in the presentation.

  • Playing an audio file from applet

    Hi friends,
    I have a problem to be solved .
    Please help me.
    I have to play an audio file from an applet. When the applet loads it should start playing. There is a progress bar in the applet that should also move correspondingly.
    How can I do that? Give me an idea.
    I know nothing about JMF.
    I am using J2SDK 1.4 and Windows 2000 Server.

    Andrew,
    Forgive my naivety, but I struggle with Java! I am desperately trying to create a Java plugin for Servoy and have to date been successful with a few things: record, playback, convert to .spx.
    I am really struggling with implementing rewind and fast-forward functions of the type you have. I thought it would be easy, but it appears not.
    At the bottom is the complete code from my class that is called by the Servoy plugin wrapper.
    I had thought that adding a simple 500 ms jump would be easy in something like this:
    public AudioStream js_FastForward (AudioStream as) throws IOException {
    !!! Line or two here to move the play head forward !!!
         AudioPlayer.player.start(as);
         return as;
    }
    Can you give me any pointers on how to code that fast-forward bit? (One possible approach is sketched after the code listing below.)
    many thanks
    David
    package com.d2e.MyPlugin;
    import com.servoy.j2db.scripting.IScriptObject;
    import javax.swing.*;
    import java.awt.*;
    import java.awt.event.*;
    import java.io.*;
    import java.awt.Button;
    import javax.sound.sampled.*;
    import org.xiph.speex.*;
    import org.xiph.speex.spi.*;
    import  sun.audio.*;    //import the sun.audio package
    public class MyPluginProvider implements IScriptObject  {
          public void JSencode()
             throws IOException{
               /** Version of the Speex Encoder */
                final String VERSION = "Java Speex Command Line Encoder v0.9.7 ($Revision: 1.5 $)";
                /** Copyright display String */
                final String COPYRIGHT = "Copyright (C) 2002-2004 Wimba S.A.";
                /** Print level for messages : Print debug information */
                final int DEBUG = 0;
                /** Print level for messages : Print basic information */
                final int INFO  = 1;
                /** Print level for messages : Print only warnings and errors */
                final int WARN  = 2;
                /** Print level for messages : Print only errors */
                final int ERROR = 3;
                int printlevel = INFO;
                /** File format for input or output audio file: Raw */
                final int FILE_FORMAT_RAW  = 0;
                /** File format for input or output audio file: Ogg */
                final int FILE_FORMAT_OGG  = 1;
                /** File format for input or output audio file: Wave */
                final int FILE_FORMAT_WAVE = 2;
                int srcFormat  = FILE_FORMAT_OGG;
                int destFormat = FILE_FORMAT_WAVE;
                int mode       = -1;
                int quality    = 4;
                /** Defines the encoders algorithmic complexity. */
                 int complexity = 3;
                /** Defines the number of frames per speex packet. */
                 int nframes    = 1;
                /** Defines the desired bitrate for the encoded audio. */
                 int bitrate    = -1;
                /** Defines the sampling rate of the audio input. */
                 int sampleRate = -1;
                /** Defines the number of channels of the audio input (1=mono, 2=stereo). */
                 int channels   = 1;
                /** Defines the encoder VBR quality setting (float from 0 to 10). */
                 float vbr_quality = -1;
                /** Defines whether or not to use VBR (Variable Bit Rate). */
                 boolean vbr    = false;
                /** Defines whether or not to use VAD (Voice Activity Detection). */
                 boolean vad    = false;
                /** Defines whether or not to use DTX (Discontinuous Transmission). */
                 boolean dtx    = false;
              //Hard code src format and dest
               // private String srcPath ="junk.wav";
              srcFormat = FILE_FORMAT_WAVE;
              destFormat = FILE_FORMAT_OGG;
              //destPath="junk.spx";
             byte[] temp    = new byte[2560]; // stereo UWB requires one to read 2560b
             final int HEADERSIZE = 8;
             final String RIFF      = "RIFF";
             final String WAVE      = "WAVE";
             final String FORMAT    = "fmt ";
             final String DATA      = "data";
             final int WAVE_FORMAT_PCM = 0x0001;
             // Open the input stream
             DataInputStream dis = new DataInputStream(new FileInputStream("junk.wav"));
             // Prepare input stream
           //DP - Sort out the Wave File
               // read the WAVE header
               dis.readFully(temp, 0, HEADERSIZE+4);
               // Read other header chunks
               dis.readFully(temp, 0, HEADERSIZE);
               String chunk = new String(temp, 0, 4);
               int size = readInt(temp, 4);
               while (!chunk.equals(DATA)) {
                 dis.readFully(temp, 0, size);
                 if (chunk.equals(FORMAT)) {
                   /* Layout of the "fmt " chunk (WAVEFORMATEX):
                      typedef struct waveformat_extended_tag {
                        WORD  wFormatTag;        // format type
                        WORD  nChannels;         // number of channels (i.e. mono, stereo...)
                        DWORD nSamplesPerSec;    // sample rate
                        DWORD nAvgBytesPerSec;   // for buffer estimation
                        WORD  nBlockAlign;       // block size of data
                        WORD  wBitsPerSample;    // number of bits per sample of mono data
                        WORD  cbSize;            // the count in bytes of the extra size
                      } WAVEFORMATEX;
                   */
                   if (readShort(temp, 0) != WAVE_FORMAT_PCM) {
                     System.err.println("Not a PCM file");
                     return;
                   }
                   channels = readShort(temp, 2);
                   sampleRate = readInt(temp, 4);
                   if (readShort(temp, 14) != 16) {
                     System.err.println("Not a 16 bit file " + readShort(temp, 18));
                     return;
                   }
                   // Display audio info
                   if (printlevel <= DEBUG) {
                     System.out.println("File Format: PCM wave");
                     System.out.println("Sample Rate: " + sampleRate);
                     System.out.println("Channels: " + channels);
                   }
                 }
                 dis.readFully(temp, 0, HEADERSIZE);
                 chunk = new String(temp, 0, 4);
                 size = readInt(temp, 4);
               }
               if (printlevel <= DEBUG) System.out.println("Data size: " + size);
           //DP End sort wave file
           //Now choose the mode; we have a file sampled at 44100
                 mode = 2; // Ultra-wideband
             // Construct a new encoder
             SpeexEncoder speexEncoder = new SpeexEncoder();
             speexEncoder.init(mode, quality, sampleRate, channels);
             if (complexity > 0) {
               speexEncoder.getEncoder().setComplexity(complexity);
             }
             if (bitrate > 0) {
               speexEncoder.getEncoder().setBitRate(bitrate);
             }
             if (vbr) {
               speexEncoder.getEncoder().setVbr(vbr);
               if (vbr_quality > 0) {
                 speexEncoder.getEncoder().setVbrQuality(vbr_quality);
               }
             }
             if (vad) {
               speexEncoder.getEncoder().setVad(vad);
             }
             if (dtx) {
               speexEncoder.getEncoder().setDtx(dtx);
             }
             // Display info
             // Open the file writer
             AudioFileWriter writer;
             if (destFormat == FILE_FORMAT_OGG) {
               writer = new OggSpeexWriter(mode, sampleRate, channels, nframes, vbr);
             }
             else if (destFormat == FILE_FORMAT_WAVE) {
               nframes = PcmWaveWriter.WAVE_FRAME_SIZES[mode-1][channels-1][quality];
               writer = new PcmWaveWriter(mode, quality, sampleRate, channels, nframes, vbr);
             }
             else {
               writer = new RawWriter();
             }
             writer.open("junk.spx");
             writer.writeHeader("Encoded with: " + VERSION);
             int pcmPacketSize = 2 * channels * speexEncoder.getFrameSize();
             try {
               // read until we get to EOF
               while (true) {
                 dis.readFully(temp, 0, nframes*pcmPacketSize);
                 for (int i=0; i<nframes; i++)
                   speexEncoder.processData(temp, i*pcmPacketSize, pcmPacketSize);
                 int encsize = speexEncoder.getProcessedData(temp, 0);
                 if (encsize > 0) {
                   writer.writePacket(temp, 0, encsize);
                 }
               }
             }
             catch (EOFException e) {}
             writer.close();
             dis.close();
           } // end JSencode
           /**
            * Converts Little Endian (Windows) bytes to an int (Java uses Big Endian).
            * @param data the data to read.
            * @param offset the offset from which to start reading.
            * @return the integer value of the reassembled bytes.
            */
           protected static int readInt(final byte[] data, final int offset) {
             return (data[offset] & 0xff) |
                    ((data[offset+1] & 0xff) <<  8) |
                    ((data[offset+2] & 0xff) << 16) |
                    (data[offset+3] << 24); // no 0xff on the last one to keep the sign
           }

           /**
            * Converts Little Endian (Windows) bytes to a short (Java uses Big Endian).
            * @param data the data to read.
            * @param offset the offset from which to start reading.
            * @return the integer value of the reassembled bytes.
            */
           protected static int readShort(final byte[] data, final int offset) {
             return (data[offset] & 0xff) |
                    (data[offset+1] << 8); // no 0xff on the last one to keep the sign
           }
         AudioFormat audioFormat;
           TargetDataLine targetDataLine;
          public Class[] getAllReturnedTypes() {
               // TODO Auto-generated method stub
               return null;
          }
          public String[] getParameterNames(String arg0) {
               // TODO Auto-generated method stub
               return null;
          }
          public String getSample(String arg0) {
               // TODO Auto-generated method stub
               return null;
          }
          public String getToolTip(String arg0) {
               // TODO Auto-generated method stub
               return null;
          }
          public boolean isDeprecated(String arg0) {
               // TODO Auto-generated method stub
               return false;
          }
          public String js_Record (String name){
               captureAudio();
               return "Started Recording " + name;
          }
         public String js_StopRecord (String name) throws IOException{
              //new ActionListener(){
              //   public void actionPerformed(ActionEvent e)
                   //Terminate the capturing of input data
                   // from the microphone.
                   targetDataLine.stop();
                   targetDataLine.close();
               //  }//end actionPerformed
              //};//end ActionListener
                  // JSpeexEnc ("junk.wav","output.spx");
                   JSencode();
              return "Stop Records " + name;
         }
         //Play audio file
         public AudioStream js_Playback (String name) throws IOException{
              InputStream in = new FileInputStream("junk.wav");
              AudioStream as = new AudioStream(in);
              AudioPlayer.player.start(as);
              return as;
         }
         public AudioStream js_ContPlay (AudioStream as) throws IOException {
              AudioPlayer.player.start(as);
              return as;
         }
          //Stop Play
         public AudioStream js_Stop_Playback (AudioStream as) throws IOException{
              AudioPlayer.player.stop(as);
              return as;
         }
         //This method captures audio input from a
        // microphone and saves it in an audio file.
        private void captureAudio(){
          try{
            //Get things set up for capture
            audioFormat = getAudioFormat();
            DataLine.Info dataLineInfo =
                                new DataLine.Info(
                                  TargetDataLine.class,
                                  audioFormat);
            targetDataLine = (TargetDataLine)
                     AudioSystem.getLine(dataLineInfo);
            //Create a thread to capture the microphone
            // data into an audio file and start the
            // thread running.  It will run until the
            // Stop button is clicked.  This method
            // will return after starting the thread.
            new CaptureThread().start();
          }catch (Exception e) {
            e.printStackTrace();
            System.exit(0);
          }//end catch
        }//end captureAudio method  
    //  This method creates and returns an
        // AudioFormat object for a given set of format
        // parameters.  If these parameters don't work
        // well for you, try some of the other
        // allowable parameter values, which are shown
        // in comments following the declarations.
        private AudioFormat getAudioFormat(){
          float sampleRate = 44100.0F;
          //8000,11025,16000,22050,44100
          int sampleSizeInBits = 16;
          //8,16
          int channels = 1;
          //1,2
          boolean signed = true;
          //true,false
          boolean bigEndian = true;
          //true,false
          return new AudioFormat(sampleRate,
                                 sampleSizeInBits,
                                 channels,
                                 signed,
                                 bigEndian);
        }//end getAudioFormat
        class CaptureThread extends Thread{
               public void run(){
                 AudioFileFormat.Type fileType = null;
                 File audioFile = null;
                 //Set the file type and the file extension
                 // based on the selected radio button.
                 //if(aifcBtn.isSelected()){
                  // fileType = AudioFileFormat.Type.AIFC;
                  // audioFile = new File("junk.aifc");
                // }else if(aiffBtn.isSelected()){
                 //  fileType = AudioFileFormat.Type.AIFF;
                 //  audioFile = new File("junk.aif");
                // }else if(auBtn.isSelected()){
                 //  fileType = AudioFileFormat.Type.AU;
                 //  audioFile = new File("junk.au");
                // }else if(sndBtn.isSelected()){
                //   fileType = AudioFileFormat.Type.SND;
                 //  audioFile = new File("junk.snd");
                // }else if(waveBtn.isSelected()){
                 fileType = AudioFileFormat.Type.WAVE;
                 audioFile = new File("junk.wav");
                // }//end if
                 try{
                   targetDataLine.open(audioFormat);
                   targetDataLine.start();
                   AudioSystem.write(
                         new AudioInputStream(targetDataLine),
                         fileType,
                         audioFile);
                 }catch (Exception e){
                   e.printStackTrace();
                 }//end catch
               }//end run
             }//end inner class CaptureThread
    }
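    A note on the fast-forward question above: sun.audio.AudioStream has no public seek API, so the cleanest route is to move playback over to javax.sound.sampled, where Clip exposes getMicrosecondPosition and setMicrosecondPosition. The sketch below is only an illustration under that assumption; the class and file names are made up and are not part of the Servoy plugin above.
    import java.io.File;
    import javax.sound.sampled.AudioInputStream;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.Clip;

    public class SeekableWavPlayer {
        private final Clip clip;

        public SeekableWavPlayer(File wavFile) throws Exception {
            AudioInputStream stream = AudioSystem.getAudioInputStream(wavFile);
            clip = AudioSystem.getClip();
            clip.open(stream); // loads the whole file; fine for short recordings
        }

        public void play()  { clip.start(); }
        public void pause() { clip.stop(); }

        // Jump forward by the given number of milliseconds, clamped to the clip length.
        public void fastForward(long millis) {
            long target = clip.getMicrosecondPosition() + millis * 1000L;
            clip.setMicrosecondPosition(Math.min(target, clip.getMicrosecondLength()));
        }

        // Jump backwards, clamped to the start of the clip.
        public void rewind(long millis) {
            long target = clip.getMicrosecondPosition() - millis * 1000L;
            clip.setMicrosecondPosition(Math.max(target, 0L));
        }
    }
    Whether this drops into the Servoy wrapper cleanly is untested; the point is only that seeking becomes a one-liner once the player exposes a position in microseconds.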

  • Playing the audio file from the point of mark

    Hi all, I want to play an audio file from a marked point.
    I have used the mark() method to set the mark, but when I play the audio it still plays from the starting point. Please help me.
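    For what it's worth, a minimal sketch assuming javax.sound.sampled: InputStream.mark() only remembers a position for a later reset(); it does not move the playback position. To start playback at a given point, set the frame position on the Clip before starting it. The file name below is just a placeholder.
    import java.io.File;
    import javax.sound.sampled.AudioInputStream;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.Clip;

    public class PlayFromMark {
        public static void main(String[] args) throws Exception {
            AudioInputStream in = AudioSystem.getAudioInputStream(new File("speech.wav"));
            Clip clip = AudioSystem.getClip();
            clip.open(in);
            // "Mark" a position in frames (here: 3 seconds into the file) and start there.
            int markFrame = (int) (3.0 * clip.getFormat().getFrameRate());
            clip.setFramePosition(markFrame);
            clip.start();
            Thread.sleep(5000); // keep the JVM alive long enough to hear it (demo only)
            clip.close();
        }
    }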

    Hi,
    Question:
    1. You have a folder for each DVD whose video contents you edited.
    Answer: Yes
    Question:
    You talk about projects 34, 35, 36, 37. Are these projects associated with only one specific DVD?
    What is supposed to be in each project - just edited VTS_01_1.VOB or edits from the others in the series
    such as VTS_01_2.VOB, VTS_01_3.VOB, etc.
    Answer:
    Project 34 has edited Premiere Elements file 34 and the VOB files (VST_01_1.VOB, VTS_01_2.VOB, etc) from DVD#34
    Project 35 has edited Premiere Elements file 35 and the VOB files (VST_01_1.VOB, VTS_01_2.VOB, etc) from DVD#35
    Project 36 has edited Premiere Elements file 36 and the VOB files (VST_01_1.VOB, VTS_01_2.VOB, etc) from DVD#36
    etc.
    Question:
    2. If you screen each DVD video file VOB for a given DVD disc, does the content follow sequentially and cleanly?
    Answer:
    Yes
    When I run the VST_01_1.VOB, VTS_01_2.VOB, etc from each project the audio and the video are clean and match.
    When I run the edited Premiere Elements files that were created after project 34's edited file (projects 35, 36, 37, etc.), I am getting the audio from project 34.
    When I run the edited Premiere Elements files that were created before project 34's edited file (projects 1, 2, all the way to 33), I am getting the correct audio.
    Thanks,
    Ron

  • Playing an audio file in a certain location

    Hi Everyone...
    I would like to play an audio file (edc.caf) that is located in this directory....
    /DCIM/TONY/IOW/AUDIO
    When i run it through the NSURL processor....
    [code]
    NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
    [/code]
    the resulting value of soundFileURL is
    file://localhost/DCIM/TONY/IOW/AUDIO/edc.caf
    But I do not hear any audio out of the iPod touch.
    I've tried two different methods of playing the audio, and both are failing.
    I know the audio is being recorded, as I can bring it back to the PC and play it fine.
    My initial thought is that it cannot find the location of the file, but I am not sure.
    Can anyone confirm the location is correct?
    thanks
    tony

    Small edit....
    the resulting value is a little different from what I posted....
    the actual string is...
    file://localhost/var/mobile/Media/DCIM/TONY/AUDIO/edc.caf
    Sorry about the confusion

  • Help me play an audio file in my GUI

    Hi
    I've been having some trouble getting code to play an audio file in my GUI
    Basically all I want to do is play a file in the same directory as the program, and have it loop. No controls, no buttons to change the play state.
    Could someone point me to or help me implement this with some example code? I'm really stuck and I'm just not sure how to implement it into the existing GUI class.
    Thanks for any help

    Hi
    I've done so, and the method gets called without any problems, but it doesn't play anything.
    Have a look -
    public void audioFile() {
        try {
            // From file
            AudioInputStream stream = AudioSystem.getAudioInputStream(new File("test.wav"));
            // From URL
            //stream = AudioSystem.getAudioInputStream(new URL("http://hostname/audiofile"));
            // At present, ALAW and ULAW encodings must be converted
            // to PCM_SIGNED before they can be played
            AudioFormat format = stream.getFormat();
            if (format.getEncoding() != AudioFormat.Encoding.PCM_SIGNED) {
                format = new AudioFormat(
                        AudioFormat.Encoding.PCM_SIGNED,
                        format.getSampleRate(),
                        format.getSampleSizeInBits()*2,
                        format.getChannels(),
                        format.getFrameSize()*2,
                        format.getFrameRate(),
                        true);        // big endian
                stream = AudioSystem.getAudioInputStream(format, stream);
            }
            // Create the clip
            DataLine.Info info = new DataLine.Info(
                Clip.class, stream.getFormat(), ((int)stream.getFrameLength()*format.getFrameSize()));
            Clip clip = (Clip) AudioSystem.getLine(info);
            // This method does not return until the audio file is completely loaded
            clip.open(stream);
            // Start playing
            clip.start();
        } catch (MalformedURLException e) {
            e.printStackTrace();   // don't swallow exceptions silently; they explain why nothing plays
        } catch (IOException e) {
            e.printStackTrace();
        } catch (LineUnavailableException e) {
            e.printStackTrace();
        } catch (UnsupportedAudioFileException e) {
            e.printStackTrace();
        }
    }
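    Since the original goal was a looping background track, one further change (not shown in the thread) would be to loop the clip instead of playing it once:
    // Loop until clip.stop() is called, instead of playing a single pass.
    clip.loop(Clip.LOOP_CONTINUOUSLY);
    Note that start() is not needed when loop() is used; the clip begins playing as soon as loop() is called.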

  • Play an audio file over HTTP

    Hello everybody,
    I'm trying to play an audio file over HTTP protocol. The code is the following:
    public class HTTPClientJMF {
         static String url = "http://localhost/audio/Reklam1.wav";
         static String urlFile = "file:///C://tmp/audio/Reklam1.wav";

         /**
          * @param args
          */
         public static void main(String[] args) {
              try {
                   DataSource dataS = new URLDataSource(new URL(url));
                   dataS.connect();
                   Player player = Manager.createPlayer(dataS);
                   player.start();
              } catch (MalformedURLException e) {
                   e.printStackTrace();
              } catch (IOException e) {
                   e.printStackTrace();
              } catch (NoPlayerException e) {
                   e.printStackTrace();
              }
         }
    }
    Now, if I try to play the file from the local disk (urlFile, using the file protocol), everything goes well, but if I try to play the same file from the network (using the http protocol) I get the following exception:
    javax.media.NoPlayerException: Cannot find a Player for: javax.media.protocol.URLDataSource@fa9cf
    Can somebody tell me what I'm doing wrong?
    Thank you!

    Ah, okay. I constructed the DataSource as follows:
    Buffer mediaBuffer = new Buffer();
    String mediaURL = "http://ares.inescn.pt/video/Reklam1.wav";
    URL url;
    try {
         url = new URL(mediaURL);
         InputStream in = url.openStream();
         BufferedInputStream bufIn = new BufferedInputStream(in);
         for (;;) {
              int data = bufIn.read();
              // Check for EOF
              if (data == -1)
                   break;
              else
                   mediaBuffer.setData(data);
         }
         System.out.println(mediaBuffer.getLength());
         if (mediaBuffer.getLength() != 0) {
              DataSource ds = new DataSource();
              HttpStream[] httpStream = ds.getStreams();
              System.out.println(httpStream.length);
              httpStream[0].read(mediaBuffer);
              ds.connect();
              ds.start();
              Player player = Manager.createPlayer(ds);
              player.start();
         }
    } catch (Exception e) {
         e.printStackTrace();
    }
    But I still cannot play an audio file (WAV format) over HTTP. The application starts but nothing happens.
    I made the modification you suggested (in the DataSource, more exactly in the getStreams() method).
    I renamed HttpDatasource to HttpStream as well.
    Did you actually try the code? I've been reading the instructions on how to test it, but I was not able to run the example.
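    If JMF keeps refusing to create a Player for the HTTP source, one workaround for plain PCM WAV files is to skip JMF and hand the URL stream to javax.sound.sampled directly. A minimal sketch (the URL is just the one from the post above):
    import java.io.BufferedInputStream;
    import java.net.URL;
    import javax.sound.sampled.AudioInputStream;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.Clip;

    public class HttpWavPlayer {
        public static void main(String[] args) throws Exception {
            URL url = new URL("http://ares.inescn.pt/video/Reklam1.wav");
            // AudioSystem needs mark/reset support, hence the BufferedInputStream.
            try (BufferedInputStream in = new BufferedInputStream(url.openStream())) {
                AudioInputStream audio = AudioSystem.getAudioInputStream(in);
                Clip clip = AudioSystem.getClip();
                clip.open(audio);                                  // buffers the whole file in memory
                clip.start();
                Thread.sleep(clip.getMicrosecondLength() / 1000);  // crude wait until playback ends
            }
        }
    }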

  • How do I play an audio file as a soundtrack for an entire Keynote

    How do I play an audio file as a soundtrack for entire keynote?

    https://developer.apple.com/mac/library/samplecode/PlayFile/Introduction/Intro.html
    Docs are your friend... Mac OS has its own, just like the iPhone.

  • Is it possible to play MP3 audio type file in Applet?

    Hi,
    I am not familiar with audio files. I guess an MP3 file has the extension .mp3.
    Can I still play it with AudioClip?
    Thanks!

    It's got to be possible. It's not built into standard Java, but you can probably find classes for it on the web.
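    For MP3 specifically, one commonly used third-party decoder is JLayer. A minimal sketch, assuming the JLayer jar is on the classpath and with a placeholder file name:
    import java.io.BufferedInputStream;
    import java.io.FileInputStream;
    import javazoom.jl.player.Player;   // JLayer's MP3 player class

    public class Mp3Demo {
        public static void main(String[] args) throws Exception {
            // "song.mp3" is just a placeholder file name.
            try (BufferedInputStream in = new BufferedInputStream(new FileInputStream("song.mp3"))) {
                new Player(in).play();   // blocks until playback finishes
            }
        }
    }
    In an applet you would read the MP3 from a URL or a resource stream instead of a FileInputStream, but the Player usage is the same.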

  • How do i play multiple audio files simultaneously using AudioQueue?

    I've basically gutted what I don't need from the "Speakhere" sample app and I'm left with something I can use to play audio files. I converted wav files to aiff and then renamed the aiff to caf (Just in case this matters). I even tried shortening the files to 10 seconds long, but no difference. I am able to play files singly, but am unable to play more than one file at the same time. When I try to play multiple files, the first one plays correctly but the second will play a horrible long beeping sound. Sounds almost like a buzzer and sometimes will sound like a high pitched beep. My code has all the audio components like the speakhere app (AudioQueueObject, AudioPlayer, AudioViewController). I've changed the AudioViewController to be just my "AudioController" class where when I initialize, I pass in the name of the file and it replaces this line of code:
    CFStringRef fileString = (CFStringRef) [NSString stringWithFormat: @"%@/Recording.caf", self.recordingDirectory];
    with this:
    CFStringRef fileString = (CFStringRef) [NSString stringWithFormat: fileName, self.recordingDirectory];
    Any suggestions? I've been yanking my hair out trying to figure this out and I simply can't. I'm about ready to pull an "Office Space" but instead of beating up a printer, I'm going to destroy my macbook.
    *NOTE: I have only tried this in the simulator, but I'm assuming it will do this on the real thing so I haven't bothered to try.

    Unless I'm mistaken, you need one AudioQueue object for each audio file you want to play. In one of my apps, I basically copied the code provided in the [AudioQueue programming guide|http://developer.apple.com/iphone/library/documentation/MusicAudio/Conceptual/AudioQueueProgrammingGuide/AQPlayback/chapter4_section_1.html#//appleref/doc/uid/TP40005343-CH3-SW1] and made a wrapper object for playback. So then you can do something like this:
    MyAudioManager *manager1 = [[MyAudioManager alloc] initWithFile:@"file1.caf"];
    [manager1 play];
    MyAudioManager *manager2 = [[MyAudioManager alloc] initWithFile:@"file2.caf"];
    [manager2 play];
    // etc...
    If you want to play short files (~10 s or less), you can use the simpler AudioServices functions along with SystemSoundIDs. It will work fine if you have playback from AudioQueue and AudioServices at the same time.

  • HT3775 How can I play an audio file in 3GP format?

    Hi,
    I have an audio file in 3GP format, which was recorded in a mobile phone and transferred to me.
    I am not able to play it using QuickTime or VLC Player.
    Can someone please help me how to play this?
    Regards
    Sharath Arun

    Is that file played in a tab in the current window or in a separate pop-up window?
    Are you allowing pop-ups to modify the window size (dom.disable_window_move_resize = false)?
    *http://kb.mozillazine.org/JavaScript#Advanced_JavaScript_settings
    You can try setting the Boolean pref <b>media.windows-media-foundation.enabled</b> to <i>false</i> on the <b>about:config</b> page to disable the built-in HTML5 media player.
    *http://kb.mozillazine.org/about:config

  • How to play the audio file please help me

    I would like to play audio files such as WAV files, AU files, etc.
    I would like to know how to play an audio file easily in applets and applications.
    Thank you

    import sun.audio.*; //import the sun.audio package
    import java.io.*;
    //** add this into your application code as appropriate
    // Open an input stream to the audio file.
    InputStream in = new FileInputStream(Filename);
    // Create an AudioStream object from the input stream.
    AudioStream as = new AudioStream(in);
    // Use the static class member "player" from class AudioPlayer to play the clip.
    AudioPlayer.player.start(as);
    // Similarly, to stop the audio.
    AudioPlayer.player.stop(as);
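    Keep in mind that sun.audio is an internal, unsupported API (and has been removed from current JDKs). A sketch of the same idea with the standard javax.sound.sampled API, using a placeholder file name:
    import java.io.File;
    import javax.sound.sampled.AudioInputStream;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.Clip;

    public class PlayWav {
        public static void main(String[] args) throws Exception {
            // "sound.wav" is a placeholder; WAV, AU and AIFF files are supported out of the box.
            AudioInputStream in = AudioSystem.getAudioInputStream(new File("sound.wav"));
            Clip clip = AudioSystem.getClip();
            clip.open(in);
            clip.start();                                     // begin playback
            Thread.sleep(clip.getMicrosecondLength() / 1000); // wait for it to finish
            clip.close();                                     // similarly, stop()/close() to stop the audio
        }
    }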

  • Playing an audio file in the mainBundle through an NSDictionary

    I have an audio file audio1.caf in my mainBundle.
    In the mainViewController I want to use an NSDictionary with audio1.caf being the object for the key audio1Key
    In my detailViewController I want to have the user tap a button and the audio plays.
    This sounds like a very simple  thing but search as I might I can't seem to find the answer.
    Here is the code for my dictionary in the mainViewController
    if ([vc isKindOfClass:[detailViewController class]]) {
        detailViewController *lcVC = (detailViewController *)vc;
        NSMutableDictionary *myDictionary = [NSMutableDictionary dictionaryWithObjectsAndKeys:
                                               @"audio1.caf", @"firstAudioKey", nil];
        lcVC.myDictionary = myDictionary;
        [self.navigationController pushViewController:vc animated:YES];
    }
    Then in my detailViewController I use
    -(IBAction)buttonPressed:(id)sender {
        NSError *error = nil;
        NSString *firstAudio = [myDictionary objectForKey:@"firstAudioKey"];
        NSURL *url = [NSURL fileURLWithPath:firstAudio];
        theAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
        theAudio.delegate =self;
        theAudio.volume = 1.0;
        theAudio.numberOfLoops = 0;
         NSLog (@"%@",url);
        if ([theAudio isPlaying]){[theAudio pause]; }else{[theAudio play];}}
    So I know things are getting passed by the dictionary ok because other things are showing up (text, and other audio) stored in the documents directory.
    However this shows that the url path is audio1.caf -- file://localhost/ So I am getting the name of the objectForKey but not the whole path.  I think I need to look in the mainBundle and get the path to the file, but I cannot seem to figure this out.  I have looked all over the internet for a discussion or tutorial, but no luck.  Can someone point out where I'm going wrong or suggest where to look?  Thanks

    I finally figured this out DUH!  Here's my code if anyone wants to look at it and tell me if I'm doing something wrong.
    -(IBAction)buttonPressed2:(id)sender {
        NSError *error = nil;
        NSString *someAudio = [[NSBundle mainBundle]bundlePath];
        NSString *audioPath = [someAudio stringByAppendingPathComponent:[myDictionary objectForKey:@"firstAudioKey"]];
        NSURL *url = [NSURL fileURLWithPath:audioPath];
        theAudio7 = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
        theAudio7.delegate =self;
        theAudio7.volume = 10.0;
        theAudio7.numberOfLoops = 0;
        NSLog (@"%@",firstAudioKey);
        if ([theAudio7 isPlaying]){[theAudio7 pause]; }else{[theAudio7 play];}}
    Thanks to everyone.

Maybe you are looking for

  • How to create transaction code for a Z-table

    How to create a transaction code for a Z-table? SE93 --> then which radio button is to be selected? And what is the program name to be given?

  • How to upload a file to database in Apex 4.2.2?

    How to upload a file to the database in Apex 4.2.2 in an existing application? Also, how to view the uploaded file within this application? Any help with this question is very appreciated. Thanks, Prak.

  • How to save as rtf. in CS4

    Probably a stupid question, but how do I save an InDesign document as an RTF document? In CS3 it is under the Export menu but not in CS4. Any help would be greatly appreciated. Brock

  • Pdf viewing in mobile

    How to open and read a PDF in Flex Mobile, inside Flex? Is it possible? ....help

  • Forms2xml

    Hello, I'm trying to convert a legacy Forms application over to APEX. This seems pretty straightforward with all the documentation I've found online; however, they all start with 'run the Forms2XML utility'. I am having trouble finding this short of d