Streaming live audio to a Helix server

Can I use JMF to stream audio to a Helix server? If not, which streaming servers are best to use with JMF?
The audio is being captured on a mobile phone and sent to the web server via HTTP POSTs (to a servlet). I want the servlet to break it up into packets and stream it live to a streaming server, where people can listen to it on the web.
Thanks.
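
For reference, a rough sketch of what the servlet-to-server republishing step could look like in JMF, assuming you have already wrapped the POSTed bytes in a javax.media.protocol.DataSource (the class name, GSM payload choice, and addresses below are made up, and whether Helix will accept a raw RTP feed like this is exactly the open question):

import java.net.InetAddress;
import javax.media.*;
import javax.media.format.AudioFormat;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.DataSource;
import javax.media.rtp.*;

public class RtpRepublisher {
    // Transcode the source to an RTP audio payload and send it to host:port.
    public static void republish(DataSource source, String host, int port) throws Exception {
        Format[] outputFormats = { new AudioFormat(AudioFormat.GSM_RTP) };
        ContentDescriptor rtp = new ContentDescriptor(ContentDescriptor.RAW_RTP);
        Processor p = Manager.createRealizedProcessor(
                new ProcessorModel(source, outputFormats, rtp));
        p.start();

        // Open an RTP session and point it at the streaming server.
        RTPManager session = RTPManager.newInstance();
        session.initialize(new SessionAddress(InetAddress.getLocalHost(),
                SessionAddress.ANY_PORT));
        session.addTarget(new SessionAddress(InetAddress.getByName(host), port));
        SendStream out = session.createSendStream(p.getDataOutput(), 0);
        out.start();
    }
}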

Removed; didn't notice the "live audio" when I made my suggestions.
Message was edited by: Dave Sawyer.

Similar Messages

  • Streaming live audio without flash media server

    I have an MP3 player that I would like to add live streams to.
    Here is an example URL of what I'd like to stream: http://85.17.174.181:6968
    Is there a way to do that in Flash AS3?

    >>don't tell me I need the server to include me in their cross-domain policy???
    Always... though servers that allow public access will typically have a wildcarded cross-domain policy. In that case you just need to tell the sound load to check the policy file using SoundLoaderContext, like so:
    // first argument is the buffer time in milliseconds; passing true
    // tells the player to check the server's policy file before playing
    var context:SoundLoaderContext = new SoundLoaderContext(100, true);
    var a:Sound = new Sound();
    a.load(new URLRequest("http://85.17.174.181:6968/"), context);
    a.play();
    It doesn't look like they have a crossdomain.xml file in place, though...

  • Using iTunes/Airplay for streaming live audio/video on local network?

    I'm wondering if I can use iTunes to stream live audio and/or video to other computers, Airport Express, or Apple TVs on a local wifi network. I'm willing to jailbreak or hack anything if necessary.
    So this is what I am hoping to do:
    Camera-->Streaming Computer-->Airplay-->Local Wi-fi network-->Devices receiving audio/video

    This device should be doing UPnP by default, so you will not need most of the info on this page.
    What the link does have is the default access info and pictures.
    On the WAN page, turn off the DoS or SPI firewall (it will have one or the other): check the box to disable it, then save settings.
    If this device is behind a cable modem, then also allow it to respond to internet pings on the same page (this is a box to check), and save settings again if you changed them above.

  • Streaming live audio in a flash movie

    Hi, I'm working on a website for a local radio station, and I want to be able to stream a live audio feed of the radio station to a Flash movie. Right now all I really know is that the URL to get this live feed is
    http://kpacradio.pacific.edu:8000
    and if you put that into Windows Media Player it will start the radio broadcast. Is it possible to play this through Flash, so that the user can just stream it from the website rather than having to involve Windows Media Player? Thanks a lot.
    David

    No. The Flash Player can't connect to a Windows Media server. You'd need to capture the stream and republish it to FMS, or broadcast directly to FMS instead of WMS.

  • Streaming live audio via wifi to home audio system

    I would like to attach a device to my stereo that would act as a WiFi audio receiver and play whatever audio my MacBook is currently playing.
    Does such a thing exist? Basically it should act like a wireless version of an audio cable that I would plug into my Mac's audio out and into my stereo.
    I can't tell if the AirPort Express will do this.
    I own a Roku and I know I can get it set up to play certain kinds of media files over the network, but I want the freedom to be able to play live streams over the internet from any number of sources.
    I would accept being able to live stream from iTunes, but only if it's a true live stream and not just letting me select iTunes files. (I want it to know where I am in a long podcast, for example.)
    Thanks!

    Just for completeness, Airfoil + AirPort Express indeed does exactly what I wanted:
    - I purchased an AirPort Express, a mini stereo->phono Y-splitter cable, and the Airfoil application from Rogue Amoeba.
    - I configured my AirPort to join my existing network and plugged its audio out jack into my stereo receiver using the Y-splitter cable. At this point I could play iTunes to my stereo by selecting the AirPort as the "speaker" using the little menu in the lower right corner of the iTunes window. (iTunes found the AirPort Express automatically.)
    - I launched Airfoil and told it to stream "system audio" (as opposed to the other choice of streaming from a single application), which required a brief install and reboot of some system-level software, to my AirPort Express, which it had located automatically.
    Now I can stream any audio from my Mac to my stereo, not just iTunes! Rock!
    The only sadness? If the Time Capsule supported an audio out jack I would not have needed to buy the AirPort Express.
    In the long run maybe it would have been smarter to buy an Apple TV.

  • Can an iPod Touch Stream Live Audio via Wifi through airplay?

    I would like to stream the audio from a PA system running in one building to another building on the same site. The distance is too far for a wireless speaker system, as well as for a strictly AirPort-based network. What I am wondering is if it would be possible to use an iPod Touch to input audio from the PA and use AirPlay to output the audio via an AirPort Express in the other building?

    What I am wondering is if it would be possible to use an iPod Touch to input audio from the PA and use AirPlay to output the audio via an AirPort Express in the other building?
    Sorry, but no. The basic issue would be how to get the PA audio "into" the iPod Touch.

  • Streaming Live Audio on J2ME

    I am able to play audio/video on my device using MMAPI, but I am confused about how to access the data source so I can send it to my destination. Can anyone help me with how to do it?
    Thanks a lot for your support!
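
    Since the original reply was removed, here is a rough sketch of the usual MMAPI pattern for getting hold of captured audio bytes, assuming the device supports audio capture and exposes a RecordControl (the class name and timing are illustrative only):

    import java.io.ByteArrayOutputStream;
    import javax.microedition.media.Manager;
    import javax.microedition.media.Player;
    import javax.microedition.media.control.RecordControl;

    public class CaptureSketch {
        public byte[] captureSomeAudio() throws Exception {
            // Ask MMAPI for an audio capture Player.
            Player p = Manager.createPlayer("capture://audio");
            p.realize();
            // RecordControl is the hook for reaching the captured data.
            RecordControl rc = (RecordControl) p.getControl("RecordControl");
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            rc.setRecordStream(buf);
            rc.startRecord();
            p.start();
            Thread.sleep(5000);       // capture roughly five seconds
            rc.commit();              // flush the recording into the stream
            p.close();
            return buf.toByteArray(); // bytes you can now send to your destination
        }
    }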


  • Streaming Live Audio...skipping?

    Coding world... I'm losing my wits... Does anybody have a clue, has anybody experienced the same problem I'm having, or does anybody know how to solve it? I am streaming audio live from my server and it keeps going in and out every so often, almost like a buffer issue. I can't seem to find any documentation as to why this keeps occurring, and I'm pretty certain my Flash code has nothing to do with it, but I can't seem to find any other server explanation. I'm using a fast DSL connection, so I seriously doubt it's that, and I'm at wits' end.
    Does anybody know what could potentially be the problem with the connection, or is this just a side effect of using FMS? Is there anything I can do to minimize the skipping during a live continuous audio stream?
    ...I'm at the edge of the cliff... ready to jump.

    For example, you can mess with the camera settings:
    // -------------- Camera Settings -------------
    var bandwidth:int = 0;
    // Specifies the maximum amount of bandwidth that the current outgoing video feed can use, in bytes per second. To specify that Flash Player video can use as much bandwidth as needed to maintain the value of quality , pass 0 for bandwidth . The default value is 16384.
    var quality:int = 50;
    // this value is 0-100 with 1 being the lowest quality. Pass 0 if you want the quality to vary to keep better framerates
    var camera:Camera = Camera.getCamera();
    camera.setQuality(bandwidth, quality);
    camera.setMode(320,240,15,true);
    // setMode(videoWidth, videoHeight, video fps, favor area)
    And of course some of the mic settings as well... you just have to play with them to get the best results. It's a lot easier with already-recorded videos that you stream; for live material I definitely haven't found many settings to adjust.

  • How to Stream live Audio and metadata

    Hello there,
    I'm trying to build some kind of online radio, so I'll be streaming audio (MP3) in real time.
    OK, so here's my problem: I have a playlist on my server and the idea is for this playlist to play all day long, but on the client side the user can only play and stop the music.
    The problem is that when one song stops and the next one starts, I need the server to send not only the new audio but also all the info about the file: duration, author, and song name. This way my player can show play/stop and all the info about the current song playing.
    Until now I've only come across video live streaming, but very little information on how to do a live transmission of audio.
    PS: the only info on streaming audio that I've found so far is about audio on demand, but the way I want things to happen, the client can't choose the music.
    Any ideas?
    Thanks in advance!
    Marco Monteiro

    I think you can use the server-side Stream object to create a playlist and other functions on the server side.
    Basically you add the MP3 files to be played, just their names, to an array object, and update it on the fly.
    Then take the first file from the array to create the playlist, and keep adding new files from the names array.
    You can also define onPlayStatus on your Stream object to track when one song finishes and another starts.
    Also, since you have the names of the files, you can set an interval based on the current song's duration which would get triggered and send the information about the next song to the client. For this you can use the onId3 method, as the MP3 files would have ID3 metadata defined.

  • Streaming "desktop" audio to an icecast server

    I want to stream whatever I can hear on my machine to an icecast server.  How can I do this?

    What audio system do you use? (ALSA, OSS, PulseAudio, JACK, etc.)
    With PulseAudio:
    - Run paman (http://aur.archlinux.org/packages.php?ID=9045), find and select the 'sink' you are sending audio to, click 'Properties' (in the lower-right corner) and select/copy the 'Monitor Source' name.
    - To get at the audio, run:
    parec -d MONITOR-SOURCE-NAME --raw |COMMAND-THAT-STREAMS-STDIN-TO-ICECAST
    I think you'll have to use ices for this, not darkice.  The default output format of parec is 44100 Hz stereo, native-endian signed 16-bit; use 'parec --help' for documentation (the man page seems to be out-of-date).
    EDIT: The package you need for ices is ices2 (http://aur.archlinux.org/packages.php?ID=24156).

  • Streaming live Audio/Video

    I have to transmit captured audio and video.
    Here are my steps:
    a) capture audio: create a DataSource for the capturing device
    b) capture video: create a DataSource for the capturing device
    c) merge both DataSources to get a merged DataSource
    I have finished up to this point.
    Now I want to transmit the media using the merged DataSource. Is it possible using AVTransmit2?
    If yes, how do I do this?
    Help me.

    now i want to transmit the media using merged datasource.. is it possible using AVtransmit2
    Why do you want to transmit the merged datasource? The RTP protocol requires that audio and video streams be transmitted separately, so you'd essentially be merging the video and audio only to have them sent as separate RTP streams, and received as separate RTP streams...
    if yes, how to do this?
    You'd transmit a merged datasource just like any other datasource...
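
    For illustration, a rough AVTransmit2-style sketch of that point (audioSource and videoSource stand for the two capture DataSources from steps a and b; the output formats are arbitrary examples):

    import javax.media.*;
    import javax.media.format.AudioFormat;
    import javax.media.format.VideoFormat;
    import javax.media.protocol.ContentDescriptor;
    import javax.media.protocol.DataSource;

    public class MergedTransmitSketch {
        public static Processor buildRtpProcessor(DataSource audioSource,
                                                  DataSource videoSource) throws Exception {
            // Step c: merge the two capture DataSources into one.
            DataSource merged = Manager.createMergingDataSource(
                    new DataSource[] { audioSource, videoSource });
            // One output format per track: JMF packetizes each track on its own,
            // so the merged source still goes out as two separate RTP streams.
            Format[] perTrack = {
                new AudioFormat(AudioFormat.GSM_RTP),   // track 0 -> audio RTP stream
                new VideoFormat(VideoFormat.JPEG_RTP)   // track 1 -> video RTP stream
            };
            return Manager.createRealizedProcessor(new ProcessorModel(
                    merged, perTrack, new ContentDescriptor(ContentDescriptor.RAW_RTP)));
        }
    }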

  • IPhone docking and streaming live audio capabilities?

    I want to set up the following devices and want them to work flawlessly: an HP Media Center EX 485 connected to a Linksys Wireless-N Gigabit router (WRT310). I want to access the Media Center through the iPhone using iStream, play the music in iTunes, and hear the sound when the iPhone is docked in either a Bose SoundDock or a B&W Zeppelin. Is this possible???
    I would appreciate comments / observations / solutions / suggestions.

    Yes, that is happening to me too.

  • Steps to Publish Live Audio for HTTP Streaming

    I wanted to confirm some steps in order to stream live audio from FMS to a mobile device using HTTP streaming and HTML5 audio/video tags. I have a system set up like below:
    Let's say my application which receives the live audio is called CLASSROOM.
    Let's say my application instance is called "TEST".
    So I have application code like:
    In "onAppStart()"
    this.broadcast_live_stream = Stream.get("f4f:livestream");
    this.broadcast_live_stream.liveEvent = "TEST";
    this.broadcast_live_stream.record("record");
    To play the incoming audio:
    application.broadcast_live_stream.play(streamName); // name of the stream sending audio to the server
    Thus, I have a folder:
    Flash Media Server 4.5\applications\CLASSROOM\streams\TEST\livestream
    which contains bootstrap, control, meta, f4f, and f4x files.
    Since I specified the Event name as "TEST", I also created the folder:
    Flash Media Server 4.5\applications\livepkgr\events\_definst_\TEST
    Which contains event.xml:
    <Event>
    <EventID>TEST</EventID>
    <Recording>
    <FragmentDuration>4000</FragmentDuration>
    <SegmentDuration>400000</SegmentDuration>
    <DiskManagementDuration>3</DiskManagementDuration>
    </Recording>
    </Event>
    and manifest.xml:
    <manifest xmlns="http://ns.adobe.com/f4m/1.0">
    <media streamId="livestream" bitrate="100" />
    </manifest>
    Should I then be able to create a video tag and hear the live audio:
    <video id='slide1' controls='controls'><source src='http://fms-dns/hls-live/livepkgr/_definst_/Test/livestream.m3u8'></video>
    I am not sure what other steps are needed, or a good way to test.  I don't understand how the Live Media Encoder would be needed in this situation.
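
    As for a way to test: one sanity check (a suggestion, not something from the thread) is to fetch the playlist yourself before involving a browser, and confirm FMS actually returns an #EXTM3U playlist rather than an error page, e.g. with a small Java snippet:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URL;

    public class PlaylistCheck {
        public static void main(String[] args) throws Exception {
            // Same URL as in the video tag above; replace fms-dns with the real host.
            URL m3u8 = new URL("http://fms-dns/hls-live/livepkgr/_definst_/Test/livestream.m3u8");
            BufferedReader in = new BufferedReader(new InputStreamReader(m3u8.openStream()));
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // expect #EXTM3U plus segment URIs
            }
            in.close();
        }
    }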

    So I implemented the above change, using code from the livepkgr.
    My custom application is called Unity3, the instance is Test, and I am trying to publish to an event called "Test".
    I now have:
    applications\Unity3\streams\Test\livestream
    Which contains segment files, which update when I send audio in my application.
    I have:
    C:\Program Files\Adobe\Flash Media Server 4.5\applications\Unity3\events\Test\Test
    which contains my event.xml (EventID Test) and manifest.xml (streamId livestream). I also have a ".stream" file in this folder, the contents of which are:
    <?xml version="1.0" encoding="UTF-8"?>
    <stream xmlns="http://ns.adobe.com/liveevent/1.0">
    <type>
    f4f
    </type>
    <name>
    livestream
    </name>
    <path>
    C:\Program Files\Adobe\Flash Media Server 4.5\applications\Unity3\streams\Test\livestream
    </path>
    </stream>
    So everything there seems to be good.  I try to use:
    <video id='live' controls='controls'><source src='http://[fms-dns]/hls-live/Unity3/Test/Test/livestream.m3u8'></video>
    or
    <audio id='live' controls='controls'><source src='http://[fms-dns]/hls-live/Unity3/Test/Test/livestream.m3u8'></audio>
    I don't hear audio, and the controls don't respond as though they recognize a source. Are there certain browsers that are confirmed to work? Do any other mistakes stand out?
    Thanks.

  • LIVE audio stereo stream

    Hi there,
    I'm looking for a way to offer a LIVE stereo audio stream to the web client using the Shockwave plug-in.
    I was "banging my head off forums and search engines" (as flashcomguru nicely puts it) for some weeks to find a way to stream LIVE stereo audio to Flash Media Server, but I couldn't find one viable direct solution for this platform.
    The client Flash Media Server app uses the Microphone.get method to capture sound from the soundboard, which unfortunately is mono. It uses a speech Nellymoser codec, at which music sounds pretty OK at 44 kHz.
    Possible solution: I can capture the LIVE line-in stereo sound using two soundcards and thus be able to create two separate mono audio streams, one for left and one for right, using the only available Microphone.get method. It may then need some synchronization server scripts later.
    The trick is: in Director I use two .swf movies (ActionScript controlled) to connect to the server and play the streams, but I couldn't find a way to direct the streams to a specific channel (one of the 8 available), from which point I could use the Lingo sound channel pan property, like this:
    sound(1).pan = 100 for right
    sound(2).pan = -100 for left
    From all that I read I came to the conclusion that you can apply the sound channel pan property only to imported, prerecorded .swa audio cast members, not to a live audio stream, which is played roughly in the main sound output channel, to which I can't apply the pan property.
    The key is to route those two streams left and right in Director.
    Any hints?
    Thanks for reading,
    My deepest respect,
    hOver

    The microphone code is very similar to what you have posted.  I can successfully use the enhanced microphone.  When it is enabled, the issue I am having is exhibited.
    A simple test I am using:
    Use ffmpeg to stream a stereo mp3 file to the media server.  I am using the following ffmpeg command line:
    ffmpeg -re -i ~/alone.mp3 -vn -acodec copy -ac 2 -f flv rtmp://myserver.com:1935/video/3Daudio/alone
    In this case the file is encoded at 44.1 kHz.
    The client uses a NetStream to play with bufferTime = 0.
    Without the microphone, the playback is as expected.  With a normal microphone, not the enhanced microphone, the playback is as expected but there is little to no echo cancellation.
    When the enhanced microphone is enabled, again using similar code to your post, the mp3 playback becomes severely distorted and is unacceptable.
    In my opinion, this is an issue with the AEC algorithms of the enhancedMicrophone and stereo playback of a 'live' stream.  If I modify the client playback code to bufferTime > 0, the mp3 playback is normal but there is no echo cancellation.
    Thanks,
    Sam

  • Live audio streaming over wifi network

    What I would like to know is: is it possible to stream live audio over a WiFi network?
    PC A records a stream from a mic and then simultaneously broadcasts that same stream live over the entire WiFi network, sorta like a radio station, only on a much smaller and simpler scale.
    Is this possible?
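
    In principle, yes. Just to illustrate the idea (this is separate from the Icecast approach discussed below), here is a bare-bones Java sketch that captures the default mic and pushes raw PCM over UDP to a second PC; the address, port, and audio format are made-up values:

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import javax.sound.sampled.AudioFormat;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.TargetDataLine;

    public class MicBroadcast {
        public static void main(String[] args) throws Exception {
            // 44.1 kHz, 16-bit, mono, signed, little-endian PCM from the default mic.
            AudioFormat fmt = new AudioFormat(44100f, 16, 1, true, false);
            TargetDataLine mic = AudioSystem.getTargetDataLine(fmt);
            mic.open(fmt);
            mic.start();

            DatagramSocket socket = new DatagramSocket();
            InetAddress pcB = InetAddress.getByName("192.168.0.20"); // hypothetical PC B
            byte[] buf = new byte[1024];
            while (true) {
                int n = mic.read(buf, 0, buf.length);
                socket.send(new DatagramPacket(buf, 0, n, pcB, 9999));
            }
        }
    }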

    It seems like I'm missing something with Icecast. If I understand it correctly, you can only play an existing media file, with no real-time live broadcasting?
    I have finished setting it up and can listen in over the network, but I need real-time broadcasting. Is there a way to do this in Icecast, or should I look at something else?
    SYNOPSIS:
    While someone is talking over the mic, the server records the stream with Audacity for editing later on. The problem is that I want this stream to be simultaneously streamed live to another PC so that people next door to the meeting can hear it in real time, with no delays. A few seconds' delay is not critical, but I cannot record an hour-long session and then broadcast it; it has to be real time.
    To complicate things, there is no internet available; we are in the middle of nowhere at a secluded convention centre, so that is why it seemed to me like the easiest option would be to broadcast via WiFi.
