Streaming Live Audio/Video

I have to transmit captured audio and video.
Here are my steps:
a) capture audio: create a DataSource for the capture device
b) capture video: create a DataSource for the capture device
c) merge both DataSources to get a merged DataSource
I have finished up to this point.
Now I want to transmit the media using the merged DataSource. Is this possible using AVTransmit2?
If yes, how do I do it?
Please help me.

"now i want to transmit the media using merged datasource.. is it possible using AVtransmit2"
Why do you want to transmit the merged datasource? The RTP protocol requires that audio and video streams be transmitted separately, so you'd essentially be merging the video & audio only to have them sent in separate RTP streams, and received as separate RTP streams...
"if yes, how to do this?"
You'd transmit a merged datasource just like any other datasource...

Similar Messages

  • Using iTunes/Airplay for streaming live audio/video on local network?

    I'm wondering if I can use iTunes to stream live audio and/or video to other computers, Airport Express, or Apple TVs on a local wifi network. I'm willing to jailbreak or hack anything if necessary.
    So this is what I am hoping to do:
    Camera-->Streaming Computer-->Airplay-->Local Wi-fi network-->Devices receiving audio/video

    This device should be doing UPnP by default, so you will not need most of the info on this page.
    What the link does have is the default access info and pictures.
    On the WAN page, turn off the DoS or SPI firewall (it will have one or the other): check the box to disable it, then save settings.
    If this device is behind a cable modem, then also allow it to respond to Internet pings on the same page (this is a checkbox to tick).
    Save settings again if you changed anything above.
    7:30 PM Monday; April 28, 2008

  • Live Audio / Video Streaming Very Basic

    I need to stream live audio and video. I went through a web site that gives some code:
    [http://www.cs.odu.edu/~cs778/spring04/lectures/jmfsolutions/examplesindex.html#transmitboth]
    I don't know how to run the application, because we need two port addresses, one each for audio and video.
    How do you compile this and capture the stream from elsewhere using JMStudio?
    Please help me, I am an absolute beginner to JMF.

    Please don't take this to be offensive, but if you're not able to figure this out on your own, you have absolutely no business playing around with something as advanced as JMF. It's not a question of intelligence or insult, but simply stated, if you don't understand the concept of a URL, and you don't have the ability to extrapolate beyond the exact command-line input you've been given... you need to go and learn the basics, because you lack the grasp of the fundamentals required for advanced programming.
    With that in mind, the following is the answer to your question. If you can't understand it, it means that you lack the fundamentals necessary for JMF programming, and you need to invest a significant amount of time acquiring those. My explanation should be quite clear to anyone with the proper Java programming fundamentals.
    AVTransmit2 is sample code that can broadcast a single media source (live audio, live video, an audio file, a video file, or a video/audio file). It does not have the capability to broadcast more than one source, which is what live audio plus video support requires. It is designed to take in a single media locator and broadcast it.
    To meet your specifications, you will need to modify the main method so it is capable of passing in multiple media locators, and thus creating multiple instances of the AVTransmit2 class. To do this, you will need to change the command-line argument logic so it supports parsing multiple source arguments at once.
    Or, you could just rip out the command-line handling and hard-code it for live audio and video. That's the "easy" way.
    The default media locator for audio capture is javasound://0 and the default media locator for video capture (under Windows) is vfw://0
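    A minimal sketch of that "hard-coded" route. The class name DualTransmitPlan, the base port 42050, and the destination host are made up for illustration, and the commented-out lines assume the AVTransmit2 constructor from the Sun sample - adjust them to match your copy:

```java
// Hypothetical helper: plans the two RTP sessions (audio + video) that a
// modified AVTransmit2 launcher would open. RTP ports are conventionally
// even, with RTCP on port+1, so the two streams are spaced two ports apart.
public class DualTransmitPlan {

    static int[] portsFor(int basePort) {
        if (basePort % 2 != 0)
            throw new IllegalArgumentException("RTP base port must be even");
        return new int[] { basePort, basePort + 2 };
    }

    public static void main(String[] args) {
        int[] ports = portsFor(42050); // illustrative base port
        System.out.println("audio: javasound://0 -> port " + ports[0]);
        System.out.println("video: vfw://0      -> port " + ports[1]);
        // With JMF on the classpath, each locator/port pair would feed one
        // AVTransmit2 instance (constructor signature as in the Sun sample):
        //   AVTransmit2 audioTx = new AVTransmit2(
        //           new MediaLocator("javasound://0"),
        //           "destination-host", String.valueOf(ports[0]), null);
        //   audioTx.start();
        // ...and likewise for vfw://0 on ports[1].
    }
}
```

    Each AVTransmit2 instance then owns its own RTP session, which is exactly where the "two port addresses" the original poster asked about come from.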

  • Please, help with Live Audio/Video example from jmf solutions

    Hello,
    I'm desperately looking for a solution to a particular problem.
    I'm trying to feed JMF with an AudioInputStream generated via Java Sound, so that I can send it via RTP. The problem is that I don't know how to properly create a DataSource from an InputStream. I know the Live Audio/Video Data example from the JMF Solutions focuses on something similar.
    The problem is that I don't know exactly how it works, so the question is: how can I modify that example to create a proper DataSource from the AudioInputStream, and then send it via RTP?
    I think I managed to create a DataSource and pass it to the AVTransmit2 class from the JMF examples. From that DataSource I create a processor, which is created successfully; then I find a corresponding format and try to send it. But when I try to send or play it, I get garbage sound, so I'm not really sure whether I create the DataSource correctly, as I've made some changes to the Live Audio/Video Data example to construct a live stream from the AudioInputStream. Actually, I don't understand where in the code the DataSource is constructed from the live stream or from an InputStream, because there's no constructor like DataSource(InputStream) or anything similar.
    Please help me, as I'm getting very stuck with this; I would really appreciate your help.
    thanks for your time, bye.

    import javax.media.*;
    import javax.media.format.*;
    import javax.media.protocol.*;
    import java.io.IOException;
    import javax.sound.sampled.AudioInputStream;

    public class LiveAudioStream implements PushBufferStream, Runnable {

        protected ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW);
        protected int maxDataLength;
        protected AudioInputStream data;
        public AudioInputStream audioStream;
        protected byte[] audioBuffer;
        protected javax.media.format.AudioFormat audioFormat;
        protected boolean started;
        protected Thread thread;
        protected BufferTransferHandler transferHandler;
        protected Control[] controls = new Control[0];
        int seqNo = 0;

        public LiveAudioStream(byte[] audioBuf) {
            audioBuffer = audioBuf;
            audioFormat = new AudioFormat(AudioFormat.ULAW,
                    8000.0,
                    8,
                    1,
                    Format.NOT_SPECIFIED,
                    AudioFormat.SIGNED,
                    8,
                    Format.NOT_SPECIFIED,
                    Format.byteArray);
            maxDataLength = 40764;
            thread = new Thread(this);
        }

        // SourceStream methods
        public ContentDescriptor getContentDescriptor() {
            return cd;
        }

        public long getContentLength() {
            return LENGTH_UNKNOWN;
        }

        public boolean endOfStream() {
            return false;
        }

        // PushBufferStream methods
        public Format getFormat() {
            return audioFormat;
        }

        public void read(Buffer buffer) throws IOException {
            synchronized (this) {
                Object outdata = buffer.getData();
                if (outdata == null || !(outdata.getClass() == Format.byteArray) ||
                        ((byte[]) outdata).length < maxDataLength) {
                    outdata = new byte[maxDataLength];
                }
                // hand the wrapped audio buffer to the consumer
                buffer.setData(audioBuffer);
                buffer.setFormat(audioFormat);
                buffer.setTimeStamp(1000000000 / 8);
                buffer.setSequenceNumber(seqNo);
                buffer.setLength(maxDataLength);
                buffer.setFlags(0);
                buffer.setHeader(null);
                seqNo++;
            }
        }

        public void setTransferHandler(BufferTransferHandler transferHandler) {
            synchronized (this) {
                this.transferHandler = transferHandler;
                notifyAll();
            }
        }

        void start(boolean started) {
            synchronized (this) {
                this.started = started;
                if (started && !thread.isAlive()) {
                    thread = new Thread(this);
                    thread.start();
                }
                notifyAll();
            }
        }

        // Runnable
        public void run() {
            while (started) {
                synchronized (this) {
                    while (transferHandler == null && started) {
                        try {
                            wait(1000);
                        } catch (InterruptedException ie) {
                        }
                    }
                }
                if (started && transferHandler != null) {
                    transferHandler.transferData(this);
                    try {
                        Thread.sleep(10);
                    } catch (InterruptedException ise) {
                    }
                }
            }
        }

        // Controls
        public Object[] getControls() {
            return controls;
        }

        public Object getControl(String controlType) {
            try {
                Class cls = Class.forName(controlType);
                Object cs[] = getControls();
                for (int i = 0; i < cs.length; i++) {
                    if (cls.isInstance(cs[i]))   // was cls.isInstance(cs) - a bug
                        return cs[i];
                }
                return null;
            } catch (Exception e) { // no such controlType or no such control
                return null;
            }
        }
    }
    and the other one, the DataSource,
    import javax.media.Time;
    import javax.media.protocol.*;
    import java.io.IOException;

    public class CustomDataSource extends PushBufferDataSource {

        protected Object[] controls = new Object[0];
        protected boolean started = false;
        protected String contentType = "raw";
        protected boolean connected = false;
        protected Time duration = DURATION_UNKNOWN;
        protected LiveAudioStream[] streams = null;
        protected LiveAudioStream stream = null;

        public CustomDataSource(LiveAudioStream ls) {
            streams = new LiveAudioStream[1];
            stream = streams[0] = ls;
        }

        public String getContentType() {
            if (!connected) {
                System.err.println("Error: DataSource not connected");
                return null;
            }
            return contentType;
        }

        public byte[] getData() {
            return stream.audioBuffer;
        }

        public void connect() throws IOException {
            if (connected)
                return;
            connected = true;
        }

        public void disconnect() {
            try {
                if (started)
                    stop();
            } catch (IOException e) {
            }
            connected = false;
        }

        public void start() throws IOException {
            // an error must be thrown if connect() has not been called
            if (!connected)
                throw new java.lang.Error("DataSource must be connected before it can be started");
            if (started)
                return;
            started = true;
            stream.start(true);
        }

        public void stop() throws IOException {
            if (!connected || !started)
                return;
            started = false;
            stream.start(false);
        }

        public Object[] getControls() {
            return controls;
        }

        public Object getControl(String controlType) {
            try {
                Class cls = Class.forName(controlType);
                Object cs[] = getControls();
                for (int i = 0; i < cs.length; i++) {
                    if (cls.isInstance(cs[i]))   // was cls.isInstance(cs) - a bug
                        return cs[i];
                }
                return null;
            } catch (Exception e) { // no such controlType or no such control
                return null;
            }
        }

        public Time getDuration() {
            return duration;
        }

        public PushBufferStream[] getStreams() {
            return streams;
        }
    }
    hope this helps
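    For what it's worth, the missing link in the poster's setup is usually just draining the AudioInputStream into the byte[] that LiveAudioStream wraps - there is indeed no DataSource(InputStream) constructor, so you read the stream out yourself. A sketch using only javax.sound.sampled (the class name StreamToBuffer and the buffer sizes are illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;

public class StreamToBuffer {

    // Drain an AudioInputStream into a byte[] suitable for LiveAudioStream.
    static byte[] drain(AudioInputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        int n;
        while ((n = in.read(chunk)) != -1)
            out.write(chunk, 0, n);
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Illustrative 8 kHz / 8-bit / mono ULAW format, matching the
        // javax.media AudioFormat used by LiveAudioStream above.
        AudioFormat fmt = new AudioFormat(AudioFormat.Encoding.ULAW,
                8000f, 8, 1, 1, 8000f, false);
        byte[] raw = new byte[16000]; // two seconds of dummy data
        AudioInputStream ais =
                new AudioInputStream(new ByteArrayInputStream(raw), fmt, raw.length);
        byte[] buf = drain(ais);
        System.out.println("buffered " + buf.length + " bytes");
        // new CustomDataSource(new LiveAudioStream(buf)) could then be
        // handed to AVTransmit2 in place of a MediaLocator-based source.
    }
}
```

    Note that this buffers the whole stream up front; for genuinely live capture you would instead have LiveAudioStream's read() pull fresh chunks from the AudioInputStream on each call.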

  • Can i make Live Audio/Video application between 2 users

    Hello,
    I am new to Flash and Flex.
    I want to know if I can make a live audio/video application (using a microphone and camera) for a website by using FMS. If yes, should I use FMSS or FMIS? I will be using the Flex Builder IDE.
    Has anyone made this type of application, or does anyone have a link to a tutorial?
    What I would like to make is an application like a webcam where 2 users can see/view each other and also talk on the web site. And also, how can I embed this application in a Java (EE) project?
    I would be very thankful if you people could guide me with this problem.
    Hopefully I have explained my problem.
    Regards,
    Saad

    Yes, you can make a live A/V app with FMS! That is exactly what it was designed for. You would need FMIS, as that is the interactive version that enables live capabilities.

  • Problem with running example 'Generating Live Audio/Video Data'

    Hello,
    Concerning the example 'Generating Live Audio/Video Data', I'm having trouble with the run instructions.
    http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/LiveData.html
    How does JMFRegistry know about the location of jmfsample?
    How is 'live' resolved as a URL?
    2. Register the package prefix for the new data source using JMFRegistry:
    - Run JMFRegistry
    - In the Protocol Prefix List section, add "jmfsample" and hit Commit.
    4. Select File->Open URL and enter "live:"
    Much thanks,
    Ben

    "I'm getting the following error message: "Could not create player for live""
    That implies you've either not registered the "live:" protocol prefix in the JMF Registry, or it couldn't load the class you registered for it... or it might be erroring out inside the actual live protocol. I'm not sure what that would look like, but a System.err.println statement in the constructor of both of those classes might be a good idea.
    "I added the output of javac (DataSource.class and LiveStream.class) to a directory on the classpath: C:\Program Files\JMF2.1.1e\lib\jmfsample\media\protocol\live"
    Eh, that looks a little questionable to me. I'm not 100% sure that the JRE will automatically descend into package subdirectories like that, looking for class files, for every folder on the path. I am, of course, fully open to the idea that it does and I just never thought about it... but I guess I thought it only did that for JAR files, not CLASS files. Regardless, I'd recommend:
    1) Make sure you've registered the protocol prefix "live:" correctly in JMF Registry
    2) Try to run it with the 2 compiled class files in the same folder as your project
    3) Try to run it with the 2 compiled class files in the lib directory, if that's on the classpath
    4) Try to run it with the 2 compiled class files installed in the JRE as an extension (google for how to do this because I don't remember off the top of my head)
    5) Reinstall JMF and see if that helps
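    For reference, the reason "live:" resolves at all is that JMF turns each registered prefix into a package name: for protocol "live" it looks for a class called <prefix>.media.protocol.live.DataSource on the classpath. So the on-disk layout has to match the package declaration exactly (the directory names below are illustrative):

```shell
# The prefix "jmfsample" registered in JMFRegistry makes JMF look for
# jmfsample.media.protocol.live.DataSource, so the compiled classes must
# sit in a matching package directory under some classpath root:
mkdir -p classes/jmfsample/media/protocol/live
ls -d classes/jmfsample/media/protocol/live
# Compile the two sources into that root (their package declaration must be
# "package jmfsample.media.protocol.live;"):
#   javac -d classes DataSource.java LiveStream.java
# Then put the *root* (not the leaf folder) on the classpath, e.g.:
#   java -cp "classes;%JMF_HOME%\lib\jmf.jar" JMStudio
```

    Option 2 in the list above works for the same reason: the current directory is usually on the classpath, so a correct package subtree beneath it will be found.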

  • Streaming live desktop video.

    I'm trying to stream live desktop video to a remote client. I can stream for about 3 seconds, then it gives me a memory error saying that there isn't enough memory... Is there a way to flush the memory instead of trying to retain it? Because I'm sure that if I could just flush the data after I transmit it, the memory error would go away. I'm using the DataSource.java and LiveStream.java from the screen grabber example. Oh, by the way, I'm transmitting at 30 fps; that's why it takes so little time for the memory to run out. Anyway, thanks in advance, Dana Garcia
    Edited by: Dana_Garcia on Sep 10, 2009 8:17 AM
    Edited by: Dana_Garcia on Sep 10, 2009 8:18 AM

    That's strange; I wouldn't expect that to happen if you're streaming.
    Can you post your code? No need to include the 2 sample files...

  • One way live audio-video streaming

    Hi
    First of all, I want to be honest: I am a beginner with FMS2. Actually, I must have missed a lot by not using it so far. So excuse me - I am sure you will find my problem very easy to solve.
    Having a little experience with Flash, I tried the following: I have 2 PCs on a LAN. One of them has a camera and microphone. I want to stream audio and video to the other computer - I only need a one-way live audio-video streaming connection.
    I have read some docs about streaming with FMS2, but I couldn't find out which of the PCs should have FMS2 (and the web server) - the one with the camera and microphone, or the other. And if the camera and microphone are on the server, how should audio and video be captured and streamed to the client?
    I really need your help. Any idea would be appreciated.
    Thanks in advance!

    Thank you friends!
    Actually, I managed to sort out the problem. The problem was that I had never used FMS before. After I read more documentation, I established the connection using 2 PCs (one to publish and one to play) and a third for the FMS and web server.
    By the way, there was a little confusion about local versus network access of .swf files, but now it is okay.
    Now I have a new challenge - to record the published stream to files, for example about 30 minutes long each. I want to record and store them all continuously - keeping all the records for 3 days, for example. I am not sure yet how to do that, but I am working on it.
    Anyway, thank you for your assistance!

  • Streaming live audio in a flash movie

    Hi, I'm working on a website for a local radio station, and I want to be able to stream a live audio feed of the radio station to a Flash movie. Right now all I really know is that the URL for this live feed is
    http://kpacradio.pacific.edu:8000
    - if you put that into Windows Media Player it will start the radio broadcast. Is it possible to play this through Flash, so that the user can just stream it from the website rather than having to bring Windows Media Player into it? Thanks a lot.
    David

    No. The Flash player can't connect to a Windows Media server. You'd need to capture the stream and republish to FMS, or broadcast directly to FMS instead of WMS.

  • Streaming live audio to a Helix server

    Can I use JMF to stream audio to a Helix server? If not, which streaming servers are best to use with JMF?
    The audio is being captured on a mobile phone and sent to the web server via HTTP POSTs (to a servlet). I want the servlet to break it up into packets and stream it live to a streaming server, where people can listen to it on the web.
    Thanks.

    Removed; didn't notice the "live audio" when I made my suggestions.
    Message was edited by: Dave Sawyer.

  • Streaming live audio via wifi to home audio system

    I would like to attach a device to my stereo that would act as a wifi audio receiver and play whatever audio my macbook is currently playing.
    Does such a thing exist?  Basically it should act like a wireless version of an audio cable that I would plug into my mac's audio out and into my stereo.
    I can't tell if the airport express will do this.
    I own a Roku and I know I can get it set up to play certain kinds of media files over the network, but I want the freedom to be able to play live streams over the internet from any number of sources.
    I would accept being able to live stream from itunes, but only if it's a true live stream and not just letting me select itunes files.  (I want it to know where I am in a long podcast, for example.)
    Thanks!

    Just for completeness, airfoil + airport express indeed does exactly what I wanted:
    - I purchased an airport express, a mini stereo->phono Y splitter cable, and the airfoil application from Rogue Amoeba.
    - I configured my airport to join my existing network and plugged its audio out jack into my stereo receiver using the Y splitter cable.  At this point I could play itunes to my stereo by selecting the airport as the "speaker" using the little menu in the lower right corner of the itunes window.  (itunes found the airport express automatically)
    - I launched Airfoil and told it to stream "system audio" (as opposed to the other choice of streaming from a single application), which required a brief install and reboot of some system level software, to my airport express, which it had located automatically.
    Now I can stream any audio from my mac to my stereo, not just itunes!  Rock!
    The only sadness?  If the Time Capsule supported an audio out jack I would not have needed to buy the Airport Express.
    In the long run maybe it would have been smarter to buy an Apple TV.

  • Streaming Live Audio on J2ME

    I am able to play the audio/video on my device using MMAPI, but I am confused about how to access the data source to send it to my destination. Can anyone help me with how to do it?
    Thanks a lot for your support!

    Removed; didn't notice the "live audio" when I made my suggestions.
    Message was edited by: Dave Sawyer.

  • Uploading Stream (Images/Audio/Video files/Doc files) to Windows Azure Storage Blob by using SharePoint Online 2013 Library

    Dear All,
    How can I store images/audio/video/document files in blob storage with the help of a SharePoint document library, and keep a reference to them in SharePoint by putting metadata in a SharePoint library?
    I searched a lot, but did not find a suitable source.
    Thanks,
    Sharad
    sharadpatil

    Hi,
    Based on my experience, you could use an Azure storage reference in a SharePoint app. I suggest this blog: (http://sachintana.blogspot.com/2012/08/azure-blob-storage-for-sharepoint.html).
    Regards,
    Will

  • Can an iPod Touch Stream Live Audio via Wifi through airplay?

    I would like to stream the audio from a PA system running in one building to another building on the same site. The distance is too far for a wireless speaker system, as well as for a strictly Airport-based network. What I am wondering is whether it would be possible to use an iPod Touch to take audio input from the PA and use AirPlay to output the audio via an Airport Express in the other building.

    What I am wondering is if it would be possible to use an Ipod Touch to input audio from the PA and use airplay to output the audio via an Airport Express in the other building?
    Sorry, but no. The basic issue would be how to get the PA audio "into" the iPod Touch.

  • Play only audio from RTMP Live audio/video stream

    Hi,
    I am working on a Flex FMS video conference app. Is there any way to stop the video and listen only to the audio of a broadcaster over RTMP Live, without the broadcaster detaching the camera from the stream, and without affecting other viewers' video feeds?
    Thanx & Regards
    Pavan

    Actually, iTunes does this automatically. If you go to music>songs>and scroll down you should see everything in your library including music videos and podcasts. When you select the music video or podcast of your choice in the music menu it will play it as an audio file and not video.
