A challenging question on audio streaming

I'm very curious whether it's possible to directly stream a local mp3 file to an audience using Flash Player and Flash Media Server (through the browser).

Thinking at a higher level, this should be possible: if Flash can make a live stream from the microphone output, why shouldn't it be able to stream an mp3 sitting on the user's hard drive?

Maybe Flash doesn't have such built-in functionality, but perhaps there is a workaround or hack to make this happen in a not-too-complicated way, using only the browser (for example, making Flash think that the mp3's audio output is the microphone output; I have no idea how).

One important requirement is that the stream should be live, i.e. it should start with only a short lag, a couple of seconds at most, after the user picks the mp3 file.

I'd say putting the file on the server and then streaming it wouldn't work, because uploading the mp3 takes a while. It could be acceptable if the server were able to stream the file before the upload finishes, i.e. start the stream as soon as the user starts the upload.

But I think the ideal solution would not involve uploading the file to the server at all. This is because I'm also looking for mixer functionality: the user mixes multiple mp3 tracks and the microphone output at the same time, and their combined audio output is streamed.

Answers will be extremely appreciated.
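For reference, the live baseline the question builds on (publishing the microphone to Flash Media Server from the browser) looks roughly like the ActionScript 3 sketch below. The server URL and stream name are placeholders, not anything from this thread. As far as I know, NetStream.attachAudio() accepts only a Microphone object in the browser player, which is why routing a locally decoded mp3 (or a mix of mp3 tracks and the mic) into a live publish has no direct, supported path; a common workaround is a virtual audio device at the operating-system level that makes the mp3 playback show up as a selectable microphone.

import flash.events.NetStatusEvent;
import flash.media.Microphone;
import flash.net.NetConnection;
import flash.net.NetStream;

// Minimal live audio publish to a media server (placeholder URL and stream name).
var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmp://example.com/live");

function onStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        var mic:Microphone = Microphone.getMicrophone();   // triggers the player's mic-access prompt
        ns.attachAudio(mic);                               // only a Microphone can be attached here
        ns.publish("userStream", "live");                  // listeners play the same stream name
    }
}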


Similar Messages

  • Audio streaming - I have a problem and don't know where to ask the question

    I have a MacBook Pro, and I also have a PreSonus 16.4.2 FireWire mixing console. My band is going to play an online show that will stream through ustream.tv. When I am connected to the mixer via FireWire, the board is receiving audio, and on ustream's web app there is a place where I can select the PreSonus for audio, but when I select it, it does not show any audio input. Does anyone know how to make it work so I can use the sound coming from the mixer as the sound being broadcast for the stream? I've never tried using the 1/8-inch audio input on the Mac, but do you think taking two mono 1/4-inch cables from the mains of the mixer into a stereo 1/8-inch would work? Any help is appreciated; the show is this Thursday and I'd like to make it sound as good as possible and avoid having to use the internal microphone. Or if any of you know a better forum to ask this question, please direct me.
    Thanks,
    Aaron

    This will work to get the audio in: go to System Preferences, set the input to Line In, and use the slider for level control.
    If you go to the Apple site, under Support, and search "streaming audio", there are a number of articles that might help.
    Also try searching "audio streaming firewire".
    Too much to list here.

  • Real time audio streaming

    Hi, I need to develop a J2ME softphone. Can someone tell me how to do audio streaming? I have already read JSR 135 but found nothing helpful.
    I would appreciate any help.

    Hi Daniel,
    I'm encouraged to receive a response!
    I can't see how the advent of a new codec will affect the issues that I mention above. You still need an unreliable protocol to transport the Speex data! Datagram support is optional in MIDP 2.0, and it's my understanding that it is rarely implemented. The objection about Bluetooth remains: L2CAP support is optional, and setting the flush timeout on an L2CAP connection is impossible.
    Then there is the question of playing and recording at the same time. The anecdotal evidence seems to be that even though you can use threading to start two players at the same time, as soon as you try to record and play at the same time, an exception is thrown (sometimes the emulators work, but after deploying on the device the exception is thrown).
    Finally, you mention that there are a number of Java ME VOIP applications out there. Do you have links? Personally I can't find any. What I can find is a number of products that operate on different platforms, e.g. Fring, which has Symbian, iPhone OS, Windows Mobile and Java ME versions.
    What is telling is that the Java ME version does not support voice, while the other versions do. The same applies to Nimbuzz and Talkonaut. Extensive googling has not revealed any exceptions to this rule.
    If you can prove me wrong it'll make my day :).
    Cheers,
    Fritz

  • C# WebBrowser using VLC plugin to play online audio stream

    Hello all,
    I have been trying to get VLC to play an audio stream from a URL address.
    I created a forms application and added a WebBrowser control from the toolbox. I can navigate with no problem, but if I go to a link like
    http://audio.wgbh.org/otherWaysToListen/wcai.m3u
    I get a popup dialog asking whether I want to open this file. If I accept, VLC opens and it plays with no issue.
    However, what I want is for it to play without the popup, and in my application rather than in the VLC installed on my computer.
    I have seen many things about VLC selecting files, but I have not been able to find a reference for pulling a stream with my C# form when browsing with the WebBrowser item on my form.
    Any help would be appreciated.

    Hi Badidea,
    >> However, what I want is for it to play without the popup, and in my application rather than in the VLC installed on my computer.
    Do you want to open the URL without the popup? If so, I think you could block downloads from the WebBrowser control; the link below shows the details:
    # How to block downloads in .NET WebBrowser control?
    http://stackoverflow.com/questions/483262/how-to-block-downloads-in-net-webbrowser-control
    But with this approach, I am afraid that you could not open the audio. If you want to open the audio without the popup, I think you could download the file automatically; for more information, see the link below:
    # Suppressing the Save/Open Dialog box in a WebBrowser control
    http://stackoverflow.com/questions/3538874/suppressing-the-save-open-dialog-box-in-a-webbrowser-control
    But if you want to open the audio in your application without VLC, I am afraid that you cannot achieve that. As you know, the audio from that URL needs its dedicated player, so it is restricted if you want to open the audio without the player.
    Sorry for the inconvenience.
    Best Regards,
    Edward

  • Synchronization of Multiple Audio Streams

    Hi,
    I'm trying to synchronize several audio streams during playback and recording. I have several questions and am not sure if this is actually feasible or how best to address it:
    1/ When playing back 2 audio streams, how can I ensure the playback is fully synchronized (<20ms difference) when the client receives the 2 streams in playback? Basically I am trying to simulate a mix-down with 2 parallel streams. Ultimately I want to achieve this with about 5 audio streams playing back and cannot tolerate any delay. I suspect I need to synchronize the start of each stream so that they all start together, but I have not been very successful so far. Alternatively, is there a way to mix down 2 streams on the server and send back a single mixed-down audio stream?
    2/ More tricky, I believe: if I play back 1 audio stream and start recording a separate audio stream, I'm getting desynchronised again. Later, if I replay the first audio stream and the newly recorded one, the two are no longer aligned. I suspect (but am not sure) that the delay and desynchronization are introduced during the recording, which is the reason I asked the first question...
    If someone knows the answers or could provide some direction of work, I would greatly appreciate it.
    Thanks

    When it comes to FMS streams, there really is no way of accurately synchronizing streams. Since you can't keep data in the buffer when pausing the stream, and you have to rebuffer any time you seek, it's impossible.
    The only way I've been able to sync FLVs is using progressive download. With progressive download you can sync streams, but the accuracy is limited to the keyframe interval of the FLV files. For example, if you have 2 keyframes per second, you can achieve sync with 1 second accuracy. With 4 keyframes per second, you can get the offset down to 500ms. If you make every frame a keyframe (which makes for a huge file), you can get frame-accurate sync.
    The theory is to build a class that monitors the time and buffer length properties of your two streams, pausing and/or seeking when needed to maintain sync.
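    A rough ActionScript 3 illustration of the monitoring class described above might look like the sketch below. The two NetStream variables, the 500 ms tolerance and the 250 ms polling interval are assumptions made for the example, and both streams are assumed to already be playing progressive-download FLVs; as noted, the achievable accuracy is still bounded by the keyframe spacing.

    import flash.events.TimerEvent;
    import flash.net.NetStream;
    import flash.utils.Timer;

    var nsA:NetStream;   // assumed: already created, attached and playing a progressive FLV
    var nsB:NetStream;   // assumed: the second stream, playing the same timeline

    // Poll the playheads and re-seek whichever stream has drifted behind.
    var syncTimer:Timer = new Timer(250);
    syncTimer.addEventListener(TimerEvent.TIMER, checkSync);
    syncTimer.start();

    function checkSync(e:TimerEvent):void {
        var drift:Number = nsA.time - nsB.time;   // NetStream.time is in seconds
        if (Math.abs(drift) > 0.5) {              // ~500 ms tolerance
            if (drift > 0) nsB.seek(nsA.time);    // B is behind: jump it forward
            else nsA.seek(nsB.time);              // A is behind: jump it forward
        }
    }

    A fuller version would also compare the bufferLength of both streams and pause the leading stream instead of seeking while the other is still buffering, which is closer to what the post describes.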

  • I recently restored my iTunes library and many songs imported from CD are now marked MPEG audio stream. I can't play them. Why did this happen?

    I recently restored my iTunes library and many songs imported from CD are now marked MPEG audio stream. I can't play them. Other imports are just fine. Why did this happen?

    If there is a cure, you may find that the folks over here have more information for you. Repost your question here:
    https://discussions.apple.com/community/itunes/itunes_for_mac?view=discussions

  • Troubles with Multiple Audio Streams [ca0106]

    Hey everyone.
    I have a few issues on my hands that I just cannot seem to get a straightforward answer to.
    First off, I have a Sound Blaster Live! 24-bit PCI sound card. Sound plays perfectly, as does surround sound.
    My issue is that I cannot get multiple things to play sound at once. All of the correct modules are loaded just fine and working well. I do believe that when I was using Gentoo and could play multiple audio streams, I had ALSA built into my kernel... in Arch I don't believe I do.
    I'm trying not to build my kernel all over again in Arch, because I think it's something in Arch I may be doing wrong...
    Thanks
    Anthony
    Last edited by anthonyclark (2008-04-02 13:30:58)

    Thanks for the reply.
    I know for a fact I'm using ALSA. Since I made that post, I've been doing as much research as possible. I found that on the Ubuntu forums someone had posted some configs for his ca0106 sound card. I tried them and they take me to the same fork:
    Surround sound .... or .... multiple audio streams.
    So the big question is: is it possible to have both? And would compiling a new kernel with ALSA and support for my card make a difference?
    Thanks!
    anthony
    Last edited by anthonyclark (2008-04-02 15:25:48)

  • Audio streaming using DMS

    I have a large financial shop that wants to do audio streaming, i.e. Bloomberg audio feeds to their clients. Do you think the DMS solution would be a fit here?
    Second question: can DMS stream audio to an IP phone? Or would one need a CUAE-like application for this?
    Any insight would be very much appreciated...
    Thank you, Lawence

    Are you talking about the video portal or digital signage?
    The answer for the video portal is "yes": just use a Windows *.wma file codec, and have the portal listen to its URL source or mount the file on the DMM for AoD playback.
    For a Digital Media Player for digital signage, the answer would be no for streaming in the true sense of streaming audio.
    The DMP will play back the following audio codecs:
    MPEG-1 Audio Layers 1 and 2
    MPEG-4 AAC Low Complexity
    AC-3
    DMS cannot stream video to an IP phone; there is a one-to-one relationship between the DMM and the video portal. The endpoints are either the video portal or a Digital Media Player.

  • Audio stream - record and listen simultaneously

    Hi!
    I would like to record and listen to an RTP stream simultaneously, but I don't know how... I can record to a file OR listen in headphones, but only separately.
    If I create a DataSink for recording on a port, I cannot use that port to listen. But in that case, how can I listen to the audio stream in headphones?
    Maybe the right solution for me would be to clone the stream, but I cannot, because I get an exception:
    javax.media.NoProcessorException: Cannot find a Processor for: com.ibm.media.protocol.CloneablePushBufferDataSource@1d8957f
    This is the code:
    DataSource ds = Manager.createCloneableDataSource(Manager.createDataSource(media_locator));
    processor = Manager.createRealizedProcessor(new ProcessorModel(ds, formats, outputType));
    Has anybody got any idea?
    Thanks!
    Ric Flair

    rflair wrote:
    Thank you, it was not a big help for me...
    That's too bad. But considering that JavaSound doesn't handle RTP, and all of the code you've posted is JMF code, that doesn't change the fact that it's a JMF question...

  • Transmitting audio stream by RTP

    Hello:
    I am trying to transmit an audio stream with this code, but I can't get it to work well.
    The output is:
    C:\RUN\EX14>java AVTransmit2 file:/c:/run/format/au/drip.au 1.1.9.147 45200
    Track 0 is set to transmit as:
    ULAW/rtp, 8000.0 Hz, 8-bit, Mono, FrameSize=8 bits
    Created RTP session: 1.1.9.147 45200
    Start transmission for 60 seconds...
    Code from AVTransmit2.java:
    /**
     * Use the RTPManager API to create sessions for each media track of the processor.
     */
    private String createTransmitter() {
        // Cheated. Should have checked the type.
        PushBufferDataSource pbds = (PushBufferDataSource) dataOutput;
        PushBufferStream pbss[] = pbds.getStreams();
        rtpMgrs = new RTPManager[pbss.length];
        SessionAddress localAddr, destAddr;
        InetAddress ipAddr;
        SendStream sendStream;
        int port;
        SourceDescription srcDesList[];

        for (int i = 0; i < pbss.length; i++) {
            try {
                rtpMgrs[i] = RTPManager.newInstance();
                // The local session address will be created on the
                // same port as the target port. This is necessary
                // if you use AVTransmit2 in conjunction with JMStudio.
                // JMStudio assumes - in a unicast session - that the
                // transmitter transmits from the same port it is receiving
                // on and sends RTCP Receiver Reports back to this port of
                // the transmitting host.
                port = portBase + 2 * i;
                ipAddr = InetAddress.getByName(ipAddress);
                localAddr = new SessionAddress();
                //* InetAddress.getLocalHost(),
                //*                          port);
                destAddr = new SessionAddress(ipAddr, port);

                rtpMgrs[i].initialize(localAddr);
                rtpMgrs[i].addTarget(destAddr);

                System.err.println("Created RTP session: " + ipAddress + " " + port);

                sendStream = rtpMgrs[i].createSendStream(dataOutput, i);
                sendStream.start();
            } catch (Exception e) {
                return e.getMessage();
            }
        }
        return null;
    }
    The receiver code is working well, because I tried it with JMStudio.
    Why doesn't it transmit?
    I would really appreciate your help.
    Thanks.

    What is different from your previous question?

  • Send video & audio stream to PC

    Is it theoretically possible with MHP to filter a video & audio stream out of the transport stream and send it straight away to a PC via Ethernet to record it there?
    It's for a student project - thanks a lot

    If I have understood your question: yes, it's possible. You can use VLC (File -> Wizard -> Diffusion...), but it doesn't include SI information, and I'm facing a little problem using it with XletView (the emulator for MHP applications). You can also use a very interesting tool, JustDVB-IT 2.0, which you can get from www.cineca.tv (Italian web site, registration required; there is a free demo). If you face a problem with codecs you can use RADTools.exe, which you can find easily on the net.
    Good work.
    I'm also a student, and I'm working on an iTV project: OpenMHP, Xlet, JavaTV... What are you working on?
    Edited by: nidou01 on Apr 3, 2008 1:37 AM

  • Compressor stops working at audio stream

    I've been having some Compressor problems, which I think I have narrowed down, at least in this one instance, to a corrupt audio file. I submitted an hour-long self-contained QT movie 2 nights ago, and it successfully encoded the MPEG-2 file, but stopped when it came to the AC-3 file. I uninstalled and reinstalled Compressor from the original FCP 2 discs, so I'm now running 3.0, on OS 10.4.11, on a G5 dual 1.8 machine.
    So last night I submitted the same QT movie that had stopped compressing the night before, and it did the same thing again: it successfully encoded the video stream, but NOT the audio. Compressor stopped working in exactly the same place as it did the night before. This morning I used Jon Chappell's app Compressor Repair, and am up and running again, compressing another movie that seems to be working just fine.
    But I still have the problem with the particular movie with the bad audio file. On another forum, someone had a similar problem 3 years ago with a corrupted audio file, which I think is what my problem is now, for at least one video I'm trying to encode.
    So the questions are: how do you find the corrupted file, and what do you do about it? How can I encode the audio stream for this file?
    Of course I have a deadline for Monday, so any suggestions are most welcome!
    Thanks very much
    Melinda

    Try exporting a self-contained QT movie that only includes the audio from your sequence. Set it to "current settings"; it will then match your sequence settings. If it exports OK, just put this file in the sequence for compression.
    Micheal's solution is actually much the same as mine... When you use the sequence settings, the QT is nothing more than a QT movie wrapper around the AIFF.
    QT is a container for media files of any kind actually.
    Jerry
    Message was edited by: Jerry Hofmann
    Message was edited by: Jerry Hofmann

  • Audio Streaming with NetStream on LCDS or FDS

    Hi. I can't get audio streaming working on FDS or on LCDS. I want to record audio on the server.
    When I try:
    quote:
    new NetStream(this.netConnection);
    I get
    quote:
    01:19:48,140 INFO [STDOUT] [Flex] Thread[RTMP-Worker-2,5,jboss]: Closing due to unhandled exception in reader thread: java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
    at java.util.ArrayList.RangeCheck(Unknown Source)
    at java.util.ArrayList.get(Unknown Source)
    at flex.messaging.io.tcchunk.TCCommand.getArg(TCCommand.java:260)
    at flex.messaging.endpoints.rtmp.AbstractRTMPServer.dispatchMessage(AbstractRTMPServer.java:821)
    at flex.messaging.endpoints.rtmp.NIORTMPConnection$RTMPReader.run(NIORTMPConnection.java:424)
    at edu.emory.mathcs.backport.java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:665)
    at edu.emory.mathcs.backport.java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:690)
    at java.lang.Thread.run(Unknown Source)
    The same code works fine on RED5.
    What's wrong?
    RTMP channel definition is :
    quote:
    <channel-definition id="my-rtmp" class="mx.messaging.channels.RTMPChannel">
        <endpoint uri="rtmp://{server.name}:2040" class="flex.messaging.endpoints.RTMPEndpoint"/>
        <properties>
            <idle-timeout-minutes>20</idle-timeout-minutes>
            <client-to-server-maxbps>100K</client-to-server-maxbps>
            <server-to-client-maxbps>100K</server-to-client-maxbps>
        </properties>
    </channel-definition>
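    For context, the client-side pattern the poster is presumably using for server-side recording over RTMP (the pattern that works on Red5 and on Flash Media Server) is roughly the sketch below. The application name and stream name are placeholders, and whether the FDS/LCDS RTMP endpoint accepts NetStream publishes at all is exactly what the exception above puts in doubt.

    import flash.events.NetStatusEvent;
    import flash.media.Microphone;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://server.name:2040/recorder");   // placeholder application name; port matches the channel definition

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.attachAudio(Microphone.getMicrophone());
            // "record" (rather than "live") asks the server to write the stream to disk
            ns.publish("myRecording", "record");
        }
    }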

    There are several apps that play music over WiFi or 3G:
    Pandora, I(heart)music, Bing100, all free.

  • Can I deal with the audio stream while recording? How?

    Hi, All,
    I am doing a project on a SIP phone using J2ME. I am quite new to this area. I want to mask the audio stream before it is sent out. I want to know if this can be achieved in J2ME. If yes, would you give me some reference? Thanks for any reply.

    Does anyone know how to achieve this?

  • I want to be able to use airplay to stream audio to another iOS device, and then use that audio stream to be used when recording video instead of the built in microphone / microphone input. Is this possible?

    I want to be able to use airplay to stream audio to another iOS device, and then use that audio stream to be used when recording video instead of the built in microphone / microphone input. Is this possible?

    A third-party app probably cannot obtain a stream from another app. To the best of my knowledge, such a capability is not provided in the software development kit, apps being "sandboxed" from each other and so allowed to communicate only in very specific and limited ways.
    I'm not completely sure what you mean by "limitations on video capture". An iPhone, to the best of my knowledge, can natively record video only through its built-in camera, and audio while doing video recording only through the built-in microphone or mic/headphone jack. There might be a video recording app that would allow audio input from an external device connected to the dock connector, but I'm not sure.
    Regards.
