Live Audio / Video Streaming Very Basic
I need to stream live audio and video. I went through a web site, and it gives this code:
[http://www.cs.odu.edu/~cs778/spring04/lectures/jmfsolutions/examplesindex.html#transmitboth]
In this, I don't know how to run the application, because we need two port addresses, one each for audio and video.
How do you compile this and capture the stream from elsewhere using JMStudio?
Please help me; I am an absolute beginner to JMF.
Please don't take this to be offensive, but if you're not able to figure this out on your own, you have no business playing around with something as advanced as JMF. It's not a question of intelligence or insult; simply stated, if you don't understand the concept of a URL, and you don't have the ability to extrapolate beyond the exact command-line input you've been given... you need to go and learn the basics, because you lack the grasp of the fundamentals required for advanced programming.
With that in mind, the following is the answer to your question. If you can't understand it, it means that you lack the fundamentals necessary for JMF programming, and you need to invest a significant amount of time acquiring them. My explanation should be quite clear to anyone with the proper Java programming fundamentals.
AVTransmit2 is sample code that can broadcast a single media source (live audio, live video, an audio file, a video file, or a combined video/audio file). It does not have the capability to broadcast more than one source, which is what live audio plus video support requires. It is designed to take in a single media locator and broadcast it.
To meet your specifications, you will need to modify the main method so it is capable of accepting multiple media locators, and thus creating multiple instances of the AVTransmit2 class. To do this, you can modify the command-line argument logic so it supports parsing multiple source arguments simultaneously.
Or, you could just rip out the command-line stuff and hard-code it for live audio and video. That's the "easy" way.
The default media locator for audio capture is javasound://0 and the default media locator for video capture (under Windows) is vfw://0
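A minimal, JMF-free sketch of the modified argument handling (the class name, the -source flag, and the constructor call in the comment are illustrative, not part of the actual AVTransmit2 sample):

```java
import java.util.ArrayList;
import java.util.List;

public class MultiLocatorArgs {

    // Collect every "-source <locator>" pair from the command line,
    // falling back to the default live-capture locators when none are given.
    public static List<String> parseLocators(String[] args) {
        List<String> locators = new ArrayList<>();
        for (int i = 0; i < args.length - 1; i++) {
            if ("-source".equals(args[i])) {
                locators.add(args[++i]);
            }
        }
        if (locators.isEmpty()) {           // the "easy" hard-coded way
            locators.add("javasound://0");  // default live audio capture
            locators.add("vfw://0");        // default live video capture (Windows)
        }
        return locators;
    }

    public static void main(String[] args) {
        for (String locator : parseLocators(args)) {
            // In the real program you would do something like:
            //   new AVTransmit2(new MediaLocator(locator), ipAddress, port);
            // here we only print what each instance would transmit.
            System.out.println("would transmit: " + locator);
        }
    }
}
```

Remember that each AVTransmit2 instance still needs its own destination port pair, one for the audio session and one for the video session.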
Similar Messages
-
One way live audio-video streaming
Hi
First of all, I want to be honest that I am a beginner with
FMS2; I must have missed a lot by not using it so far. So
excuse me; I am sure you will find my problem very easy to solve.
Having a little experience with Flash I tried to do the
following: I have 2 PC's in LAN. One of them has camera and
microphone. I want to stream audio and video to the other computer
- I only need 1-way live streaming audio-video connection.
I have read some docs about streaming with FMS2 but I
couldn't find out which of the PCs should have FMS2 (and the web
server): the one with the camera and microphone, or the other. And if
the camera and microphone are on the server, how should audio and video
be captured and streamed to the client?
I really need your help. Any idea would be appreciated.
Thanks in advance!
Thank you, friends!
Actually I managed to sort out the problem. And the problem
was that I had not used before FMS at all. After I have read more
documentation, I established the connection using 2 PCs (one to
publish and one to play) and a third for FMS and the web server.
By the way there was a little confusion about local and
network access of .swf files, but now it is okay.
Now I have a new challenge - to record published stream to
files, for example about 30 minutes long. I want to record and
store all of them continuously, keeping all the records for 3 days, for
example. I am not sure how to do that yet, but I am working on it.
Anyway, thank you for your assistance! -
Play only audio from RTMP Live audio/video stream
Hi,
I am working on a Flex FMS video-conference app. Is there any possibility to stop the video and listen only to the audio of a broadcaster with RTMP live, without the broadcaster detaching the camera from the stream, and without affecting other viewers' video feeds?
Thanx & Regards
Pavan
Actually, iTunes does this automatically. If you go to Music > Songs and scroll down, you should see everything in your library, including music videos and podcasts. When you select the music video or podcast of your choice in the music menu, it will play it as an audio file and not video.
-
Please, help with Live Audio/Video example from jmf solutions
Hello,
I'm desperately looking for a solution to a particular problem.
I'm trying to feed JMF with an AudioInputStream generated via Java Sound, so that I can send it via RTP. The problem is that I don't know how to properly create a DataSource from an InputStream. I know the Live Audio/Video Data example from the JMF Solutions focuses on something similar.
The problem is that I don't know exactly how it works, so the question is: how can I modify that example in order to create a proper DataSource from the AudioInputStream, and then send it via RTP?
I think I managed to create a DataSource and pass it to the AVTransmit2 class from the JMF examples, and from that DataSource create a processor, which is created successfully; I then find a corresponding format and try to send it. But when I try to send or play it I get garbage sound, so I'm not really sure whether I create the DataSource correctly, as I've made some changes to the Live Audio/Video Data example to construct a LiveStream from the AudioInputStream. Actually, I don't understand where in the code the DataSource is constructed from an InputStream, because there is no constructor like DataSource(InputStream), nor anything similar.
Please help me, as I'm getting very stuck with this. I would really appreciate your help.
Thanks for your time. Bye.

import javax.media.*;
import javax.media.format.*;
import javax.media.protocol.*;
import java.io.IOException;
import javax.sound.sampled.AudioInputStream;

public class LiveAudioStream implements PushBufferStream, Runnable {

    protected ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW);
    protected int maxDataLength;
    protected int vez = 0;
    protected AudioInputStream data;
    public AudioInputStream audioStream;
    protected byte[] audioBuffer;
    protected javax.media.format.AudioFormat audioFormat;
    protected boolean started;
    protected Thread thread;
    protected float frameRate = 20f;
    protected BufferTransferHandler transferHandler;
    protected Control[] controls = new Control[0];

    public LiveAudioStream(byte[] audioBuf) {
        audioBuffer = audioBuf;
        audioFormat = new AudioFormat(AudioFormat.ULAW,
                                      8000.0,
                                      8,
                                      1,
                                      Format.NOT_SPECIFIED,
                                      AudioFormat.SIGNED,
                                      8,
                                      Format.NOT_SPECIFIED,
                                      Format.byteArray);
        maxDataLength = 40764;
        thread = new Thread(this);
    }

    /***************************************************************************
     * SourceStream
     ***************************************************************************/

    public ContentDescriptor getContentDescriptor() {
        return cd;
    }

    public long getContentLength() {
        return LENGTH_UNKNOWN;
    }

    public boolean endOfStream() {
        return false;
    }

    /***************************************************************************
     * PushBufferStream
     ***************************************************************************/

    int seqNo = 0;
    double freq = 2.0;

    public Format getFormat() {
        return audioFormat;
    }

    public void read(Buffer buffer) throws IOException {
        synchronized (this) {
            Object outdata = buffer.getData();
            if (outdata == null || !(outdata.getClass() == Format.byteArray) ||
                ((byte[]) outdata).length < maxDataLength) {
                outdata = new byte[maxDataLength];
            }
            buffer.setData(audioBuffer);
            buffer.setFormat(audioFormat);
            buffer.setTimeStamp(1000000000 / 8);
            buffer.setSequenceNumber(seqNo);
            buffer.setLength(maxDataLength);
            buffer.setFlags(0);
            buffer.setHeader(null);
            seqNo++;
        }
    }

    public void setTransferHandler(BufferTransferHandler transferHandler) {
        synchronized (this) {
            this.transferHandler = transferHandler;
            notifyAll();
        }
    }

    void start(boolean started) {
        synchronized (this) {
            this.started = started;
            if (started && !thread.isAlive()) {
                thread = new Thread(this);
                thread.start();
            }
            notifyAll();
        }
    }

    /***************************************************************************
     * Runnable
     ***************************************************************************/

    public void run() {
        while (started) {
            synchronized (this) {
                while (transferHandler == null && started) {
                    try {
                        wait(1000);
                    } catch (InterruptedException ie) {
                    }
                } // while
            }
            if (started && transferHandler != null) {
                transferHandler.transferData(this);
                try {
                    Thread.sleep(10);
                } catch (InterruptedException ise) {
                }
            }
        } // while (started)
    } // run

    // Controls

    public Object[] getControls() {
        return controls;
    }

    public Object getControl(String controlType) {
        try {
            Class cls = Class.forName(controlType);
            Object cs[] = getControls();
            for (int i = 0; i < cs.length; i++) {
                if (cls.isInstance(cs[i]))
                    return cs[i];
            }
            return null;
        } catch (Exception e) { // no such controlType or such control
            return null;
        }
    }
}
and the other one, the DataSource:

import javax.media.Time;
import javax.media.protocol.*;
import java.io.IOException;
import java.io.InputStream;
import javax.sound.sampled.AudioInputStream;

public class CustomDataSource extends PushBufferDataSource {

    protected Object[] controls = new Object[0];
    protected boolean started = false;
    protected String contentType = "raw";
    protected boolean connected = false;
    protected Time duration = DURATION_UNKNOWN;
    protected LiveAudioStream[] streams = null;
    protected LiveAudioStream stream = null;

    public CustomDataSource(LiveAudioStream ls) {
        streams = new LiveAudioStream[1];
        stream = streams[0] = ls;
    }

    public String getContentType() {
        if (!connected) {
            System.err.println("Error: DataSource not connected");
            return null;
        }
        return contentType;
    }

    public byte[] getData() {
        return stream.audioBuffer;
    }

    public void connect() throws IOException {
        if (connected)
            return;
        connected = true;
    }

    public void disconnect() {
        try {
            if (started)
                stop();
        } catch (IOException e) {}
        connected = false;
    }

    public void start() throws IOException {
        // we need to throw an error if connect() has not been called
        if (!connected)
            throw new java.lang.Error("DataSource must be connected before it can be started");
        if (started)
            return;
        started = true;
        stream.start(true);
    }

    public void stop() throws IOException {
        if ((!connected) || (!started))
            return;
        started = false;
        stream.start(false);
    }

    public Object[] getControls() {
        return controls;
    }

    public Object getControl(String controlType) {
        try {
            Class cls = Class.forName(controlType);
            Object cs[] = getControls();
            for (int i = 0; i < cs.length; i++) {
                if (cls.isInstance(cs[i]))
                    return cs[i];
            }
            return null;
        } catch (Exception e) { // no such controlType or such control
            return null;
        }
    }

    public Time getDuration() {
        return duration;
    }

    public PushBufferStream[] getStreams() {
        return streams;
    }
}
hope this helps -
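One way to bridge from an AudioInputStream to the byte[] constructor shown above is to drain the stream into a buffer before building the LiveAudioStream. A JMF-free sketch of that step (the helper class name is illustrative):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamDrainer {

    // Read an InputStream (e.g. a javax.sound.sampled.AudioInputStream)
    // to the end and return its contents as a single byte array, suitable
    // for passing to a constructor such as LiveAudioStream(byte[]).
    public static byte[] drain(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        int n;
        while ((n = in.read(chunk)) != -1) {
            out.write(chunk, 0, n);
        }
        return out.toByteArray();
    }
}
```

As for the garbage sound: that is usually a sign that the javax.media.format.AudioFormat handed to JMF (encoding, sample rate, sample size, signedness, endianness) does not match the actual bytes produced by Java Sound, so it is worth double-checking both sides before suspecting the DataSource plumbing.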
Hi there,
I'm looking for a way to offer a LIVE audio stereo stream to
the web client using Shockwave plug-in.
I was "banging my head off forums and search engines" (as
flashcomguru nicely puts it) for some weeks
to find a way to stream LIVE audio stereo to Flash Media
Server but I couldn't find one viable direct
solution for this platform.
The client Flash Media Server app uses the Microphone.get method
to capture sound from the sound card,
which unfortunately is mono. It uses the speech Nellymoser
codec, with which music sounds pretty OK at 44 kHz.
Possible solution: I can capture the live line-in stereo
sound using two sound cards, and thus be able to
create two separate mono audio streams, one for left and one for
right, using the only available Microphone.get method.
It may then need some synchronization server scripts
later.
The trick is: In Director I use two .swf movies (action
script controlled) to connect to server
and play the streams but I couldn't find a way to direct the
streams to a specific channel
(one of the 8 available) from which point I can use the LINGO
Sound Channel PAN property, like this:
sound(1).pan = 100 for right
sound(2).pan = -100 for left
From all that I read, I came to the conclusion that you can
use the Sound Channel PAN property only with imported,
prerecorded .swa audio cast members, not with the
live audio stream, which is played roughly
in the main sound output channel, to which I can't apply the
Sound Channel PAN property.
The key is to route those two streams left and right in
Director.
Any hints?
Thanks for reading,
My deepest respect,
hOver
The microphone code is very similar to what you have posted. I can successfully use the enhanced microphone. When it is enabled, the issue I am having is exhibited.
A simple test I am using:
Use ffmpeg to stream a stereo mp3 file to the media server. I am using the following ffmpeg command line:
ffmpeg -re -i ~/alone.mp3 -vn -acodec copy -ac 2 -f flv rtmp://myserver.com:1935/video/3Daudio/alone
In this case the file is encoded at 44.1 kHz.
The client uses a netstream to play with bufferTime = 0
Without the microphone, the playback is as expected. With a normal microphone, not the enhanced microphone, the playback is as expected but there is little to no echo cancellation.
When the enhanced microphone is enabled, again using similar code to your post, the mp3 playback becomes severely distorted and is unacceptable.
In my opinion, this is an issue with the AEC algorithms of the enhancedMicrophone and stereo playback of a 'live' stream. If I modify the client playback code to bufferTime > 0, the mp3 playback is normal but there is no echo cancellation.
Thanks,
Sam -
Can i make Live Audio/Video application between 2 users
Hello,
I am new to FLASH and FLEX.
I want to know if I can make a Live Audio/Video application
(using a microphone and camera) for a website by using FMS. If yes,
then should I use FMSS or FMIS? I will be using the Flex Builder IDE.
Any one who has made this type of application or link to a
tutorial.
What I would like to make is an application like a webcam
where 2 users can see/view each other and also talk on a web site. And
also, how can I embed this application in a Java (EE) project?
I would be very thankful if you could guide me with this
problem.
Hopefully I have explained my problem.
Regards,
Saad
Yes, you can make a Live A/V app with FMS! That is exactly
what it was designed for. You would need FMIS, as that is the
interactive version that enables live capabilities. -
How do I stream live audio/video on my nokia N70 Music edition?
I can't give you any code, but you should take a look at the Java Media Framework:
http://java.sun.com/products/java-media/jmf/index.html
It could be interesting for you. -
Problem with running example 'Generating Live Audio/Video Data'
Hello,
Concerning the example 'Generating Live Audio/Video Data', I'm having trouble with the run instructions.
http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/LiveData.html
How does JMFRegistry know about the location of jmfsample?
How is 'live' resolved as a URL?
2.Register the package prefix for the new data source using JMFRegistry
- Run JMFRegistry
- In the Protocol Prefix List section add "jmfsample" and hit Commit.
4.Select File->Open URL and enter "live:"
Much thanks,
Ben
I'm getting the following error message: "Could not create player for live:"
Implies you've either not registered the "live:" protocol prefix in the JMF Registry, or it couldn't load the class you registered for it... or it might be erroring out inside the actual live protocol. I'm not sure what that would look like, but a System.err.println statement in the constructor of both of those classes might be a good idea.
I added the output of javac (DataSource.class and LiveStream.class) to a directory on the classpath.
C:\Program Files\JMF2.1.1e\lib\jmfsample\media\protocol\live
Eh, that looks a little questionable to me. I'm not 100% sure that the JRE will automatically descend into the package subdirectories like that, looking for class files, for every folder on the path. I am, of course, fully open to the idea that it does and I just never thought about it... but I guess I thought it only did that for JAR files, not CLASS files. Regardless, I'd recommend:
1) Make sure you've registered the protocol prefix "live:" correctly in JMF Registry
2) Try to run it with the 2 compiled class files in the same folder as your project
3) Try to run it with the 2 compiled class files in the lib directory, if that's on the classpath
4) Try to run it with the 2 compiled class files installed in the JRE as an extension (google for how to do this because I don't remember off the top of my head)
5) Reinstall JMF and see if that helps -
Why is live and video streaming so slow on my MacBook Pro?
Hi redngreen,
Does this happen with Wi-Fi and ethernet?
Is your network slow with other activities?
There's a section in this article for Wi-Fi network seems slow. Some of it applies to Ethernet as well.
Wi-Fi: How to troubleshoot Wi-Fi connectivity - Apple Support
Best regards,
Nubz -
Audio/Video stream & MulticastSocket
Hello to everybody!
I want to send to many clients an audio/video stream (captured from mic/webcam) using MulticastSocket.
I found in some code examples that MulticastSocket is used setting a byte array with a particular size for buffering data (sending and receiving phases).
I want to ask what buffer size I should use to transmit the captured audio/video stream WELL. What is the best configuration?
Thanks in advance!
Best Regards.
Luca
Hi,
have you checked with your Internet Service Provider (ISP) yet ?
Some ISPs make a disconnect at a specific time in order to prevent users from using it for purposes not allowed by the ISP.
At least here in Germany this is common behaviour with most DSL provider.
Regards
Stefan -
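Regarding Luca's buffer-size question above: there is no single correct number, but common RTP-style practice is to keep each datagram payload under the path MTU (around 1400 bytes on typical Ethernet) and split larger captured frames across several packets. A sketch of that chunking, independent of any capture code (the 1400-byte figure is a rule of thumb, not a value from a spec):

```java
import java.util.ArrayList;
import java.util.List;

public class MediaChunker {

    static final int MAX_PAYLOAD = 1400; // stay under a ~1500-byte Ethernet MTU

    // Split one captured audio/video frame into datagram-sized payloads.
    public static List<byte[]> chunk(byte[] frame) {
        List<byte[]> packets = new ArrayList<>();
        for (int off = 0; off < frame.length; off += MAX_PAYLOAD) {
            int len = Math.min(MAX_PAYLOAD, frame.length - off);
            byte[] payload = new byte[len];
            System.arraycopy(frame, off, payload, 0, len);
            packets.add(payload);
        }
        return packets;
    }
}
```

Each resulting byte[] would then be wrapped in a DatagramPacket and sent through the MulticastSocket; on the receiving side, the buffer passed to receive() must be at least MAX_PAYLOAD bytes long, or the datagram will be silently truncated.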
P2p audio video streaming throws permission error.
hi,
I am trying to do p2p audio/video streaming using the latest API & player version. But for guests, although the room has auto-promote=true, it throws a permission error.
Error: You need to have owner permissions for changing Multicast property
at com.adobe.rtc.sharedManagers::StreamManager/set streamMulticast()
at traineravc/onSessionCreation()
at traineravc/__cSession_creationComplete()
at flash.events::EventDispatcher/dispatchEventFunction()
at flash.events::EventDispatcher/dispatchEvent()
at mx.core::UIComponent/dispatchEvent()
at mx.core::UIComponent/set initialized()
at mx.managers::LayoutManager/doPhasedInstantiation()
at mx.managers::LayoutManager/validateNow()
at mx.core::Application/resizeHandler()
But when I log in using my Adobe developer account it seems to work fine (at least the above error is not shown).
What am I missing?
Hi,
You are trying to set the streamMulticast property, and as it is a room-level property you need to be a host to do that. Only users with the owner role can set/unset multicast in a room.
It's a one-time setting and need not be set by each user, so removing the code that sets streamMulticast for all guests would fix your issue.
Thanks
Arun -
Visualization of Audio from live RTMP video stream.
I need to do basic remote monitoring: that video is present, and the audio levels.
I had hoped to used http://www.longtailvideo.com/addons/plugins/247/SUB-Equalizer
but it seems to be incompatible with live streams.
The VU-level requirement is basic: just that there is a live audio feed. It does not need to be fancy at all.
We currently use JWPlayer but am not tied to it for this project.
Cheers for the help
Robert
Would you mind reposting this question over on the Flash Professional forums? This forum is primarily for end users; the Pro forums will get you in touch with a wider developer audience.
Thanks,
Chris -
Using iTunes/Airplay for streaming live audio/video on local network?
I'm wondering if I can use iTunes to stream live audio and/or video to other computers, Airport Express, or Apple TVs on a local wifi network. I'm willing to jailbreak or hack anything if necessary.
So this is what I am hoping to do:
Camera --> Streaming Computer --> AirPlay --> Local Wi-Fi network --> Devices receiving audio/video
This device should be doing UPnP by default, so you will not need most of the info on this page.
What the Link does have is the default access info and pictures.
In the Wan page turn Off the DOS or SPI firewall. (it will have one or the other)
Check in the Box to Disable.
Save Settings.
If this device is behind a Cable modem then also Allow it to Respond to Internet Pings on the same page.
(this a Check the Box to Do)
Save settings (Again if you did them above)
7:30 PM Monday; April 28, 2008 -
Streaming live Audio/Video
I have to transmit captured audio and video.
Here are my steps:
a) capture audio: create a datasource for the capturing device
b) capture video: create a datasource for the capturing device
c) merge both datasources to get a merged datasource
I have finished up to this point.
Now I want to transmit the media using the merged datasource. Is it possible using AVTransmit2?
If yes, how to do this?
Help me.
"now I want to transmit the media using the merged datasource.. is it possible using AVTransmit2"
Why do you want to transmit the merged datasource? The RTP protocol requires that audio and video streams be transmitted separately, so you'd essentially be merging the video & audio only to have them sent in separate RTP streams, and received as separate RTP streams...
"if yes, how to do this?"
You'd transmit a merged datasource just like any other datasource... -
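To make the answer above concrete: even when you hand AVTransmit2 a merged datasource, RTP still ends up with one session per track, conventionally on consecutive even port numbers (the odd port above each carries RTCP). A JMF-free sketch of that port bookkeeping (the class name and base port are illustrative):

```java
public class RtpPortPlan {

    // One even RTP port per stream; the odd port above each is for RTCP.
    public static int[] sessionPorts(int portBase, int trackCount) {
        int[] ports = new int[trackCount];
        for (int i = 0; i < trackCount; i++) {
            ports[i] = portBase + 2 * i;
        }
        return ports;
    }

    public static void main(String[] args) {
        // A merged audio+video source still needs two sessions:
        for (int port : sessionPorts(42050, 2)) {
            System.out.println("RTP session on port " + port);
        }
    }
}
```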
No audio video streams upon import of .mov files?
Hi... I recently upgraded the OS onto an Intel 240 GB SSD. Running Windows 7 64-bit, clean re-install, 24 GB RAM, NVIDIA Quadro FX 4800. I have been trying to optimize the system for maximum GPU-accelerated effects while also taking advantage of the high-speed read and write of my disk. The OS runs on the SSD, the Adobe media cache is on a separate 2 TB eSATA connection, and media is stored on an external USB 3.0 drive. Without getting into the fine details, this is a snapshot of my workflow.
The few problems I am encountering are that some .mov files cannot be imported into PP. The error reads "no audio or video stream" in the files. I suspected metadata issues, so I tried to import with the media browser, and still no luck. These files worked perfectly before I began surgery on my computer 2 days ago. I have installed the latest version of QuickTime, yet to no avail. Any recommendations regarding my issue with imports, or suggestions to benefit from my hardware, would be appreciated.
MOV is a wrapper; what is inside YOUR wrapper? Exactly what is INSIDE the video you are editing?
Codec & Format information, with 2 links inside for you to read http://forums.adobe.com/thread/1270588
Report back with the codec details of your file, use the programs below... A screen shot works well to SHOW people what you are doing
https://forums.adobe.com/thread/1070933 for screen shot instructions
Free programs to get file information for PC/Mac http://mediaarea.net/en/MediaInfo/Download