LIVE audio stereo stream

Hi there,
I'm looking for a way to offer a LIVE stereo audio stream to a web client using the Shockwave plug-in.
I've been "banging my head off forums and search engines" (as flashcomguru nicely puts it) for some weeks trying to find a way to stream live stereo audio to Flash Media Server, but I couldn't find a single viable solution for this platform.
The client-side Flash Media Server app uses the Microphone.get() method to capture sound from the sound card, which unfortunately is mono only. It uses the Nellymoser speech codec, with which music sounds pretty OK at 44 kHz.
Possible solution: I can capture the LIVE line-in stereo sound using two sound cards, and thus create two separate mono audio streams, one for left and one for right, using the only available Microphone.get() method. It may also need some synchronization server scripts later.
The trick is: in Director I use two .swf movies (ActionScript controlled) to connect to the server and play the streams, but I couldn't find a way to direct the streams to a specific sound channel (one of the 8 available), from which point I could use the Lingo sound channel PAN property, like this:
sound(1).pan = 100 for right
sound(2).pan = -100 for left
From all that I've read, I've come to the conclusion that the sound channel PAN property can be applied only to imported, prerecorded .swa cast members, not to a live audio stream, which plays straight into the main sound output, where I can't apply the PAN property.
The key is to route those two streams, left and right, in Director.
Any hints?
Thanks for reading,
My deepest respect,
hOver
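For what it's worth, the left/right split itself is mechanically simple once you have interleaved stereo PCM; the hard part in Flash is only that Microphone.get() captures mono. A minimal sketch of the split (Python, illustrative only, not Flash/Lingo; assumes 16-bit samples interleaved L, R, L, R, ...):

```python
import array

def split_stereo(interleaved):
    """Split interleaved 16-bit stereo samples (L, R, L, R, ...)
    into two mono sample sequences, one per channel."""
    left = array.array("h", interleaved[0::2])   # even indices = left
    right = array.array("h", interleaved[1::2])  # odd indices = right
    return left, right

# Four stereo frames: left channel ramps up, right channel ramps down.
frames = array.array("h", [0, 30, 1, 20, 2, 10, 3, 0])
left, right = split_stereo(frames)
```

The two resulting mono buffers would then correspond to the two streams you'd pan hard left and hard right on playback.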

The microphone code is very similar to what you posted. I can successfully enable the enhanced microphone, and it is when the enhanced microphone is enabled that the issue appears.
A simple test I am using:
Use ffmpeg to stream a stereo mp3 file to the media server.  I am using the following ffmpeg command line:
ffmpeg -re -i ~/alone.mp3 -vn -acodec copy -ac 2 -f flv rtmp://myserver.com:1935/video/3Daudio/alone
In this case the file is encoded at 44.1 kHz.
The client plays the stream through a NetStream with bufferTime = 0.
Without the microphone, the playback is as expected.  With a normal microphone, not the enhanced microphone, the playback is as expected but there is little to no echo cancellation.
When the enhanced microphone is enabled, again using similar code to your post, the mp3 playback becomes severely distorted and is unacceptable.
In my opinion, this is an issue with the AEC algorithms of the enhancedMicrophone and stereo playback of a 'live' stream.  If I modify the client playback code to bufferTime > 0, the mp3 playback is normal but there is no echo cancellation.
Thanks,
Sam

Similar Messages

  • Live Audio / Video Streaming Very Basic

    I need to stream live audio and video. I found a web site that gives some code:
    [http://www.cs.odu.edu/~cs778/spring04/lectures/jmfsolutions/examplesindex.html#transmitboth]
    But I don't know how to run the application, because we need two port addresses, one each for audio and video.
    How do you compile this and capture the stream from elsewhere using JMStudio?
    Please help me, I am an absolute beginner with JMF.

    Please don't take this to be offensive, but if you're not able to figure this out on your own, you have absolutely no business playing around with something as advanced as JMF. It's not a question of intelligence or insult, but simply stated: if you don't understand the concept of a URL, and you don't have the ability to extrapolate beyond the exact command-line input you've been given, you need to go and learn the basics, because you lack the grasp of the fundamentals required for advanced programming.
    With that in mind, the following is the answer to your question. If you can't understand it, it means that you lack the fundamentals necessary for JMF programming, and you need to invest a significant amount of time acquiring them. My explanation should be quite clear to anyone with the proper Java programming fundamentals.
    AVTransmit2 is sample code that can broadcast a single media source (live audio, live video, an audio file, a video file, or a video/audio file). It does not have the capability to broadcast more than one source, which is what live audio-and-video support requires. It is designed to take in a single media locator and broadcast it.
    To meet your specifications, you will need to modify the main method so it can accept multiple media locators and create multiple instances of the AVTransmit2 class. To do this, you will need to modify the command-line argument logic so it supports parsing multiple source arguments simultaneously.
    Or you could just rip out the command-line stuff and hard-code it for live audio and video. That's the "easy" way.
    The default media locator for audio capture is javasound://0, and the default media locator for video capture (under Windows) is vfw://0.
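    To illustrate the "two instances" idea without modifying the sample at all: you can simply launch AVTransmit2 twice, once per media locator, on different ports. A hypothetical sketch that just builds the two command lines (Python for illustration; the argument order of media locator, destination host, and port is an assumption about the sample's main(), and the host/ports are made-up examples, so verify against the actual code before relying on it):

```python
# Hypothetical launcher for running the AVTransmit2 JMF sample twice,
# once for audio and once for video. The argument order (locator, host,
# port) is assumed, not confirmed; check the sample's main() first.
def transmit_command(locator, host, port):
    return ["java", "AVTransmit2", locator, host, str(port)]

# Default capture locators from the post above:
# javasound://0 for audio, vfw://0 for video (Windows).
audio_cmd = transmit_command("javasound://0", "192.168.0.2", 42050)
video_cmd = transmit_command("vfw://0", "192.168.0.2", 42052)
```

    Each command list could then be handed to a process launcher (e.g. subprocess.Popen) so both transmitters run at once, which is all "live audio and video" requires here.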

  • Play only audio from RTMP Live audio/video stream

    Hi,
    I am working on a Flex/FMS video-conference app. Is there any way to stop the video and listen only to the audio of a broadcaster over RTMP live, without the broadcaster detaching the camera from the stream, and without affecting other viewers' video feeds?
    Thanx & Regards
    Pavan

    Actually, iTunes does this automatically. If you go to Music > Songs and scroll down, you should see everything in your library, including music videos and podcasts. When you select the music video or podcast of your choice in the music menu, it will play it as an audio file and not video.

  • One way live audio-video streaming

    Hi
    First of all I want to be honest: I am a beginner with FMS2, and I must have missed a lot by not using it so far. So excuse me, I am sure you will find my problem very easy to solve.
    Having a little experience with Flash, I tried to do the following: I have two PCs on a LAN. One of them has a camera and microphone. I want to stream audio and video to the other computer; I only need a one-way live audio-video streaming connection.
    I have read some docs about streaming with FMS2, but I couldn't find out which of the PCs should run FMS2 (and the web server): the one with the camera and microphone, or the other. And if the camera and microphone are on the server, how should audio and video be captured and streamed to the client?
    I really need your help. Any idea would be appreciated.
    Thanks in advance!

    Thank you, friends!
    Actually, I managed to sort out the problem, and the problem was that I had never used FMS before at all. After reading more documentation, I established the connection using two PCs (one to publish and one to play) and a third for FMS and the web server.
    By the way, there was a little confusion about local versus network access of .swf files, but now it is okay.
    Now I have a new challenge: recording the published stream to files, for example about 30 minutes long each. I want to record and store them all continuously, keeping all the records for, say, 3 days. I am not sure yet how to do that, but I am working on it.
    Anyway, thank you for your assistance!

  • Flash Player - LIVE Stereo Streaming Upload?

    Hi all,
    Is Flash Player 12 (for streaming live uploads) mono only?
    I run a music studio and have been putting on live video/audio concert streams from our facility, mixing multiple microphone inputs to a stereo output via a studio console and back into the computer for audio streaming.
    We have been using Concert Window (www.concertwindow.com) for our shows, but I noticed that the Flash Player 12.0.0.70 used in Concert Window to set up video and audio streams IGNORES the right channel of my audio feed.
    This is regardless of whether I use my iMac's stereo line-in, an Avid MBox2 USB audio interface, or an RME Fireface 800.
    If I use Flash Media Live Encoder, it streams audio in stereo fine, but not so the in-app setup in Concert Window.
    I can select the connected audio devices under the "microphone input" on the Flash Player window in Concert Window, but I cannot get it to stream the right channel no matter what.
    As FMLE works fine, I assume this is an issue with Flash 12 on the Concert Window site. Or is this a limitation?
    System specs:
    iMac running OSX 10.6.8
    Firefox 27.0.1, Chrome 33.0.1750.117 or Safari 5.1.10 all running Adobe Flash 12.0.0.70
    any advice greatly appreciated
    Barks

    Hi Mike, thanks for getting back.
    The problem is that I am inputting a live audio feed with video. Are you streaming live from video cameras with audio, all in real time?
    I put up the mics and the band plays. While they are playing, it is broadcast with video via Concert Window. There is no rendering, as the stream is not recorded; it is a real-time transmission that viewers elsewhere can interact with.
    It works fine with both channels going into my computer, then into Flash Media Live Encoder, and broadcast. FMLE does seem to be fussy with external USB cameras, though.
    The issue is the way it is broadcast via Flash 12.0.0.70 in Concert Window (see pic above): only the left channel is accepted.
    I tried the standalone player, but I cannot see how to test a live input (not a recorded or rendered file) with it.

  • Streaming live audio in a flash movie

    Hi, I'm working on a website for a local radio station, and I
    want to be able to stream a live audio feed of the radio station to
    a flash movie. Right now all I really know is that the URL to get
    this live feed is
    http://kpacradio.pacific.edu:8000
    - if you put that into Windows Media Player it will start the radio
    broadcast. Is it possible to play this through flash so that the
    user can just stream it from a website rather than having to
    introduce windows media player into it? Thanks a lot.
    David

    No. The Flash Player can't connect to a Windows Media server. You'd need to capture the stream and republish it to FMS, or broadcast directly to FMS instead of WMS.

  • How to enable live audio streaming in FF running Windows 7

    I am unable to listen to live audio streams, please help !

    Which plugin is used for those streams?
    Your More system details list shows the WMP plugin and the Flash plugin as commonly used plugins to play streams.
    Start Firefox in [[Safe Mode]] to check if one of your add-ons is causing your problem (switch to the DEFAULT theme: Tools > Add-ons > Themes).
    * Don't make any changes on the Safe mode start window.
    See [[Troubleshooting extensions and themes]] and [[Troubleshooting plugins]]

  • I want to stream all audio to my stereo via AirPort. Right now audio only streams with iTunes, not when I'm online (YouTube, SoundCloud, Spotify). What must I do? Thanks! 2011 macbook pro 10.6.8


    Thanks, Bob. I've seen this suggested before. However, I'm at a friend's place (cat-sitting). He owns an older MacBook (almost positive) and he streams audio without Airfoil. All he uses is AirPort Express. He told me to simply Option-click the volume icon; a menu drops down, and he selects "AirPort Express". But it never appears for me: "AirPort Express" isn't an option in the menu.
    Why is that? Is there something fundamental I'm not doing?

  • Visualization of Audio from live RTMP video stream.

    I need to do basic remote monitoring: confirm that video is present and show audio levels.
    I had hoped to use http://www.longtailvideo.com/addons/plugins/247/SUB-Equalizer but it seems to be incompatible with live streams.
    The VU-level requirement is basic: just an indication that there is a live audio feed; it does not need to be fancy at all.
    We currently use JWPlayer, but I am not tied to it for this project.
    Cheers for the help
    Robert

    Would you mind reposting this question over on the Flash Professional forums?  This forum is primarily for end users, the Pro forums will get you in touch with a wider developer audience.
    Thanks,
    Chris

  • Steps to Publish Live Audio for HTTP Streaming

    I wanted to confirm some steps for streaming live audio from FMS to a mobile device using HTTP streaming and HTML5 audio/video tags. I have a system set up like below:
    Let's say my application which receives the live audio is called CLASSROOM, and my application instance is called "TEST".
    So I have application code like:
    In "onAppStart()"
    this.broadcast_live_stream = Stream.get("f4f:livestream");
    this.broadcast_live_stream.liveEvent = "TEST";
    this.broadcast_live_stream.record("record");
    To play the incoming audio:
    application.broadcast_live_stream.play(streamName); // name of the stream sending audio to the server
    Thus, I have a folder:
    Flash Media Server 4.5\applications\CLASSROOM\streams\TEST\livestream
    which contains a bootstrap, control, meta, f4f, and f4x files.
    Since I specified the Event name as "TEST", I also created the folder:
    Flash Media Server 4.5\applications\livepkgr\events\_definst_\TEST
    Which contains event.xml:
    <Event>
    <EventID>TEST</EventID>
    <Recording>
    <FragmentDuration>4000</FragmentDuration>
    <SegmentDuration>400000</SegmentDuration>
    <DiskManagementDuration>3</DiskManagementDuration>
    </Recording>
    </Event>
    and manifest.xml:
    <manifest xmlns="http://ns.adobe.com/f4m/1.0">
    <media streamId="livestream" bitrate="100" />
    </manifest>
    Should I then be able to create a video tag and hear the live audio:
    <video id='slide1' controls='controls'><source src='http://fms-dns/hls-live/livepkgr/_definst_/Test/livestream.m3u8'></video>
    I am not sure what other steps are needed, or of a good way to test. I also don't understand how the Flash Media Live Encoder would be needed in this situation.

    So I implemented the above change, using code from the livepkgr.
    My custom application is called Unity3, the instance is Test, and I am trying to publish to an event called "Test".
    I now have:
    applications\Unity3\streams\Test\livestream
    Which contains segment files, which update when I send audio in my application.
    I have:
    C:\Program Files\Adobe\Flash Media Server 4.5\applications\Unity3\events\Test\Test
    Which contains my event.xml (EventID Test) and manifest.xml (streamId livestream). I also have a ".stream" file in this folder, the contents of which are:
    <?xml version="1.0" encoding="UTF-8"?>
    <stream xmlns="http://ns.adobe.com/liveevent/1.0">
    <type>
    f4f
    </type>
    <name>
    livestream
    </name>
    <path>
    C:\Program Files\Adobe\Flash Media Server 4.5\applications\Unity3\streams\Test\livestream
    </path>
    </stream>
    So everything there seems to be good.  I try to use:
    <video id='live' controls='controls'><source src='http://[fms-dns]/hls-live/Unity3/Test/Test/livestream.m3u8'></video>
    or
    <audio id='live' controls='controls'><source src='http://[fms-dns]/hls-live/Unity3/Test/Test/livestream.m3u8'></audio>
    I don't hear audio, and the controls don't respond as though they recognize a source. Are there certain browsers that are confirmed to work? Do any other mistakes stand out?
    Thanks.

  • How to merge multiple live audio streams into a single stream in FMS?

    I need to merge multiple live audio streams into a single stream so that I can pass this stream as input to VoIP through a softphone.
    For this I tried the following approach:
    I created a new stream (str1) on FMS in onAppStart and recorded the live streams (sent through the microphone) into that new stream.
    Below is the code:
    application.onAppStart = function()
    {
        // server-side stream that should collect everything published
        application.myStream = Stream.get("foo");
        application.myStream.record();
    };
    application.onPublish = function (client, stream)
    {
        var streamName = stream.name;
        application.myStream.play(streamName, -2, -1);
    };
    The problem is that Stream.play() plays only one live stream at a time. As soon as a second live stream is sent to FMS, Stream.play() stops playing the previous one, so only the latest live stream gets recorded, whereas I need to record all the live streams into the new stream simultaneously.
    Any pointers regarding this are welcome.
    Thanks

    Well, I tried this once for one of my scripts, and the final conclusion is that it's not possible to combine two streams into one. How would you time and encode the two voices? There is no known solution to this in Flash. If you press on despite this and find a solution, please post it so we can explain it to the rest of the world.
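    Outside of Flash/FMS, the merge itself is conceptually simple: decode each stream to PCM, align the sample clocks, and sum sample-by-sample with clipping. A minimal sketch of just the summing step (Python, illustrative only; real streams would first need decoding and resampling to a common rate):

```python
def mix(streams, lo=-32768, hi=32767):
    """Mix equal-length 16-bit PCM sample lists by summation with clipping."""
    mixed = []
    for samples in zip(*streams):  # one sample from each stream, time-aligned
        s = sum(samples)
        mixed.append(max(lo, min(hi, s)))  # clamp to the 16-bit range
    return mixed

a = [1000, -2000, 30000]
b = [500, -500, 10000]
print(mix([a, b]))  # third sample clips at 32767
```

    The timing problem the reply raises is exactly the alignment step: the summation only makes sense once all inputs are at the same sample rate and offset, which is why a server-side mixer (rather than FMS stream scripting) is usually where this gets done.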

  • Using iTunes/Airplay for streaming live audio/video on local network?

    I'm wondering if I can use iTunes to stream live audio and/or video to other computers, Airport Express, or Apple TVs on a local wifi network. I'm willing to jailbreak or hack anything if necessary.
    So this is what I am hoping to do:
    Camera-->Streaming Computer-->Airplay-->Local Wi-fi network-->Devices recieving audio/video

    This device should be doing UPnP by default, so you will not need most of the info on this page.
    What the link does have is the default access info and pictures.
    On the WAN page, turn off the DoS or SPI firewall (it will have one or the other): check the box to disable it, then save settings.
    If this device is behind a cable modem, then also allow it to respond to internet pings on the same page (this is a check-the-box item), and save settings again if you changed anything above.
    7:30 PM Monday; April 28, 2008

  • Streaming live audio to a Helix server

    Can I use JMF to stream audio to a Helix server? If, not which streaming servers are the best to use with JMF?
    The audio is being captured on a mobile phone and sent to the web server via HTTP POSTs (to a Servlet). I want the servlet to break it up into packets and stream it live to a streaming server, where people can listen to it on the web.
    Thanks.

    Removed; didn't notice the "live audio" when I made my suggestions.
    Message was edited by: Dave Sawyer.

  • Live audio streaming over wifi network

    What I would like to know is: is it possible to stream live audio over a WiFi network?
    PC A records a stream from a mic and then simultaneously broadcasts that same stream live over the entire WiFi network, sorta like a radio station, only on a much smaller and simpler scale.
    Is this possible?

    It seems like I'm missing something with Icecast. If I understand it correctly, you can only play an existing media file, with no real-time live broadcasting?
    I have finished setting it up and can listen in over the network, but I need real-time broadcasting. Is there a way to do this in Icecast, or should I look at something else?
    SYNOPSIS:
    While someone is talking over the mic, the server records the stream with Audacity for later editing. The problem is that I want this stream to be simultaneously streamed live to another PC so that people next door to the meeting can hear it in real time, with no delays. A few seconds of delay is not critical, but I cannot record an hour-long session and then broadcast it; it has to be real time.
    To complicate things, there is no internet available; we are in the middle of nowhere at a secluded convention centre, which is why broadcasting via WiFi seemed the easiest option.
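    At its core, a LAN "radio" like this is just pushing captured audio chunks to listener machines as they arrive; Icecast adds the codec, container, and HTTP listener handling on top. A bare-bones illustrative sketch of the relay step (Python; the addresses and chunk contents are made-up, and real audio would need a proper codec and packet sequencing):

```python
import socket

# One-way LAN "radio": push raw audio chunks to known listeners over UDP.
# Illustrative only; Icecast/SHOUTcast handle encoding, buffering, and
# listener discovery for you, which this sketch does not.
def relay_chunk(sock, chunk, listeners):
    """Send one audio chunk to every listener address (best-effort UDP)."""
    for addr in listeners:
        sock.sendto(chunk, addr)

# Loopback demo: a "listener" socket receives what the relay sends.
listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", 0))  # OS picks a free port
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
relay_chunk(sender, b"\x00\x01mic-buffer", [listener.getsockname()])
data, _ = listener.recvfrom(2048)
```

    This keeps latency to network transit time, which matches the "a few seconds is fine, an hour is not" requirement; Icecast in live-source mode would be the maintained way to get the same effect.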

  • Encoder live audio in stereo

    Hi, sorry for my English. I'm trying to create an online radio station, but I can't find the best option for transmitting audio in stereo. I tried Flash Media Encoder, but its audio-encoder configuration is rather limited and I believe it encodes only in mono.
    Is it possible to encode audio in stereo with FME?
    What is the best option for encoding live audio for a web radio with FMS?
    Thanks in advance.

    And how do I increase the sampling rate and/or the bitrate of the audio codec?
    Sampling at 22050 Hz with a bitrate ceiling of 32 kbps is quite limiting. Is there any chance of at least increasing the bitrate to something like 128 kbps or more?
    Thanks in advance.
