Sound Channels and Live Video Feeds

Hello All,
I spent the better part of the afternoon researching this one and have come up empty-handed. The app I am designing displays two live webcam streams from an FMS. What I want to do is control the audio from each stream so that camera1's audio plays in the left speaker only and camera2's audio plays in the right speaker only. I know this is done via SoundChannels and SoundTransforms, but how do you associate a SoundChannel with the audio from a live webcam stream? All of the documentation I have read illustrates how to associate a SoundChannel with a static mp3 file, e.g.:
[code]
var sc:SoundChannel = s.play();
[/code]
The following code is what I am working with thus far. All I have been able to do is successfully control the volume with a SoundTransform, but I can't seem to do anything with pan, and I think it is because I am lacking the knowledge concerning creating a SoundChannel to control a live webcam stream:
[code]
var inStream1:NetStream = new NetStream(client_nc);
incomingVid1.attachNetStream(inStream1);
var trans1:SoundTransform = new SoundTransform();
trans1.volume = .5;
trans1.pan = -1; //doesn't work
inStream1.soundTransform = trans1;
inStream1.play(video1);

var inStream2:NetStream = new NetStream(client_nc);
incomingVid2.attachNetStream(inStream2);
var trans2:SoundTransform = new SoundTransform();
trans2.volume = .5;
trans2.pan = 1; //doesn't work
inStream2.soundTransform = trans2;
inStream2.play(video2);
[/code]
Thank you for your time and energy!
Nate
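A sketch of one possible fix, for anyone searching later. With a live stream there is no SoundChannel at all: the NetStream carries its own soundTransform, so pan is applied directly to the stream. Two things worth trying (both are assumptions about SoundTransform behavior, not tested against an FMS): assign the transform after play() has started, and reassign it every time you change a property, since edits to the object alone are not picked up by the stream. If the stream's audio is stereo and pan has no effect, the channel-mix properties can force it hard left:

```actionscript
import flash.net.NetStream;
import flash.media.SoundTransform;

// Sketch only: client_nc, incomingVid1, and video1 follow the post above.
var inStream1:NetStream = new NetStream(client_nc);
incomingVid1.attachNetStream(inStream1);
inStream1.play(video1);

// volume 0.5, full pan left; assigned AFTER play() has started
inStream1.soundTransform = new SoundTransform(0.5, -1);

// Fallback for stereo audio, where pan may be ignored:
var hardLeft:SoundTransform = new SoundTransform(0.5);
hardLeft.leftToLeft = 1;    // input left channel  -> left speaker
hardLeft.rightToLeft = 1;   // input right channel -> left speaker
hardLeft.leftToRight = 0;   // nothing to the right speaker
hardLeft.rightToRight = 0;
inStream1.soundTransform = hardLeft; // reassign to apply the change
```

Mirror the values (pan = 1, the ...ToRight properties set to 1, the ...ToLeft properties set to 0) for the second stream.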

Similar Messages

  • Hello. I want to use my iPhone as a viewing screen for a USB endoscope (it's a small camera on a long cable for inspecting down piping and such). With an app, is it possible to attach the USB to my iPhone using an adapter to watch the live video feed on the iPhone?

    Hello. I want to use my iPhone as a viewing screen for a USB endoscope (it's a small camera on a long cable for inspecting down piping and such). With your app, is it possible to attach the USB to my iPhone using an adapter to watch the live video feed on the iPhone?
    Thank you.


  • Live video feed from camera and other PC

    My Mac Pro is being shipped as we speak. I'm new to this, and not up with all the lingo so please bear with me. I'm hoping you all can help me figure out what I'm trying to accomplish:
    I currently VJ (Video Jockey) with certain software on a PC and send that video mix out (DVI or VGA) to a Plasma TV. I just bought a Canon A1 camera and a Mac Pro w/FCP studio. How can I mix the live video feed of the Canon with the video feed from PC (preferably in HD)?
    I also want to be able to record and edit the footage from the Canon later.
    I was told by the guys at the Apple store that I could do this using FCP multicam. Is that true? I read a couple posts on here that say FCP is just editing software and can't do live feed. Then what is the purpose of multicam? Please help. Thanks-
    mac pro   Mac OS X (10.4.8)   Two 2.66GHz Dual-Core

    The Apple people probably didn't understand that you wanted a live mix. FCP can't help you there.
    MultiCam is an FCP feature for "post-switching" multicamera recordings for that live-to-tape look. It is fairly complex, and especially with only two "cameras" you would be much better off starting by editing them on the timeline.
    As for a live switch, you need a video switcher. That's outside the realm of FCP, but http://bhphotovideo.com is a great place to do research.
    Then, if you're really serious, you can record both feeds plus, optionally, the live-switched output for further manipulation in FCP or elsewhere.

  • Is there a way to use Illustrator to draw over a live video feed coming from my microscope?

    Hi, we use Illustrator to draw (trace) over static images of mite specimens taken under a microscope. Although this works, it would be much better to be able to draw directly from the live video feed coming from my microscope.
    The specimens are dead, so they are not moving around; however, we need to focus down through the specimen in order to see all details. We normally take montage images, but because the specimens are clear there are many artifacts present, and we frequently have to consult the specimen under the microscope to double-check things.
    My students and I have hundreds of drawings to complete and are looking for a way to streamline the process. If we could avoid making montage images and draw directly from the video feed, it would revolutionize the way we approach our research. Is there a way to display the video feed and then overlay Illustrator so that we can directly trace over the image (we use a Wacom tablet, but that shouldn't make a difference) so we can focus at different levels and draw the entire organism?
    Thanks for your help
    Ashley 

    Maybe it's possible to have live video in Illustrator, but you would most certainly need to write your own plugin.
    So this community would be suitable:
    http://forums.adobe.com/community/illustrator/illustrator_sdk?view=discussions

  • Is it possible to record a live video feed online? PLEASE ... ASAP!!

    There is a live video feed today marking Carroll Shelby's wake. I will be away from my computer but would VERY MUCH like to view it at a later time. How can I do this? I have a MacBook and a Mac mini, both of which are running the latest OS.
    Thank you!

    There are apps out there that can do this. I found this:
    http://mac.eltima.com/streaming-video-downloader.html

  • Insert live video feed in Keynote

    Hello,
    How could I insert a live video feed in a running Keynote presentation ?
    Source could be webcam, iPhone video output, etc.
    I'd like to keep the background of the Keynote presentation while the live video plays on top.
    Thx for your help.
    Benoit

    Here's how I got it working today (slight modifications to Unami's guide):
    1) Install Quartz Composer (it comes with the Xcode development utilities on your OS X installation disc)
    *2) Download the KinemeCore ( http://kineme.net/files/release/KinemeCore/KinemeCore-0.5.1bInstaller.pkg.zip ) and install.
    3) Open Quartz Composer and create a blank composition.
    *4) Click on the KinemeCore menu in the top menu bar (between Window and Help) and select KinemeCore Preferences. Click on the Unsafe Mode tab. For each of the required applications on the left, check "Safe" for Video Input on the right. (I checked Video Input as "Safe" for every application, just in case.)
    5) Open the Patch Library, then add a "Video Input" and a "Billboard" patch
    (To select a video input device other than the default, select the Video Input patch, open the Patch Inspector and change the input device in the Settings tab.)
    6) Connect the "Image" output of the Video Input Patch to the "Image" input of the "Billboard" Patch (click and drag from the white dot next to "Image")
    7) Save this composition and close Quartz Composer
    8) Now, open Keynote and drag the .qtz file you just saved into your presentation
    For some reason, Keynote will occasionally not allow me to drag/drop the .qtz file into its window. Closing that window and starting a new document fixes this.
    Hope that helps.
    -jeremy

  • Using FCP for live video feed

    Hi,
    I'm currently shooting with the new HVX200 Panasonic camera. We're using a small, 7" hand-held HD monitor to use for focus-pulling, but I'd like to be able to use a computer monitor or laptop for a live video feed. (director's monitor)
    We've considered using the component-out to go into a Component-VGA adapter for a computer monitor, but I'm wondering if we can do a direct feed using FireWire into FCP or something.
    Anyone have any experience doing a computer-monitor setup for use with these kinds of cameras?
    Thanks for the help!
    -RF
    Everything   Mac OS X (10.4.6)  

    hey there RF,
    if you've tried something similar on your iMac and had no problems, it's reasonable to think it will work on your MBP, with this caveat: the 400 and 800 ports are actually on the same bus, so the 800 port will handle its data at the lower (400) rate if there is a device on each port. That shouldn't affect any editing with the drive once the cam is removed from the MBP; it should then run at the higher rate.
    good luck.

  • Can iMac receive a live video feed

    Is iMac capable of receiving a live video feed? Hoping to use a live HDMI feed from a video switcher, but iMac doesn't have an HDMI "in".
    Any ideas?
    Thanks, Russ

    You'll need one of these devices to input video/HDMI into the Mac:
    hdmi input for macs.
    A simple adapter won't work.

  • Camera live video feed to mac?

    My camera outputs an HDMI live video feed. Can I display this on my iMac 21.5" (2011)? I want to use my camera as a stop-motion capture device, so essentially it would be a webcam. Can you help? Oliver.
    My camera is a Sony Cyber-shot HX200V.

    Is HDMI its only output or port?
    A cheap alternative for webcam & mic...
    http://eshop.macsales.com/item/MacAlly/MEGACAM/

  • Live video feed using Flex/Flash

    Hi,
    I have a question, if someone doesn't mind giving me a "yes" or "no" answer, please.
    My developers have created a program for me that lets me broadcast my LIVE webcam feed right on my web site in real time so I can provide customer support and engage with web site visitors. We are using Flash, Flex, and Adobe AIR.
    The broadcast window size is 320 x 240 pixels. The problem is the frame rate. I am only able to broadcast between 5 and 9 frames per second, whereas other people have been able to broadcast at around 14 FPS.
    Question: is it possible to use these applications (Flash/Flex/Adobe AIR) to stream live video at 320 x 240 at frame rates of 24 FPS or higher?
    Thanks in advance for an answer. I don't expect a solution, just a yes or no from some experienced developer(s).
    All the best.
    Rodney
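    On the frame-rate half of the question, a minimal ActionScript 3 sketch of how a higher capture rate is requested. The fps passed to setMode() is only a ceiling; lighting, CPU load, and the setQuality() settings determine what the camera actually delivers, which currentFPS reports. Whether 24 FPS is reached then depends on the webcam, CPU, and upstream bandwidth rather than on Flash/Flex/AIR themselves:

    ```actionscript
    import flash.media.Camera;

    // Sketch only: request 24 fps at 320x240 from the default camera.
    var cam:Camera = Camera.getCamera();
    cam.setMode(320, 240, 24);      // width, height, requested fps
    cam.setQuality(0, 80);          // no bandwidth cap; picture quality 0-100
    cam.setKeyFrameInterval(24);    // roughly one keyframe per second at 24 fps
    trace("requested 24 fps, currently getting " + cam.currentFPS);
    ```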

    Lucas,
    Here's what I do for multi-camera live shoots:
    1. Connect all cameras to a switcher/DVE (looped through preview monitors) - I use a Panasonic WJ-MX50 or Sony DFS-300. Since you have five cameras, you'll obviously need a switcher with more inputs, or gang two of the above together. You'll need a switcher capable of the signal output your cameras offer, i.e. composite, Y/C, component, triax ... whatever.
    2a. Output of switcher goes to DVCam VTR.
    2b. Audio is fed through a mixer and output to the VTR.
    3. Firewire out from the VTR to the Mac/FCP.
    4. In FCP, choose "Non-Controllable Device" and use Capture Now.
    5. Use one of the analog outputs from the VTR to feed the projector(s) and another analog output to feed a program monitor for the TD.
    I always run safety tapes in each camera while the VTR is recording a switched master tape in case there's a problem with the capture in FCP (it can happen).
    I also equip each cam-op with an 8" or 9" monitor, typically using the camera's composite output. And everyone is able to communicate using a Clear-Com system.
    In 5 years, doing several shows per year, the capture in FCP has only failed three times, and all of those times were when I was using a laptop. Whenever I've used my old G4 tower, capturing to one of the internal drives, it's worked flawlessly each time. Fortunately, having a master DVCam tape saved the day. The camera safety tapes are used just in case of a bad switch by the TD (usually me).
    -DH
    PS I should add that if you're feeding multiple projectors, you'll need a simple DA.

  • Overlay live video feed with flash animation

    I am using a webcam to allow people to see themselves on a screen. I would like to overlay the video with some Flash animation (simple graphics of planes, cars, etc.) going across the screen.
    How do I achieve this?

    Hi IEC,
    Having just finished an app that has two ROI overlay layers on top of a static image, I might be able to give you some ideas. All of the below assumes that you know basic LabVIEW programming.
    - I have not used IMAQ yet, and so am not sure what front panel control type will render the live video images. But I'm not sure that matters.
    - If you can place an XY Graph over your video display on the front panel, then we have it made.
    - stretch the size of the XY Graph so that the plot area matches the size and placement of your video image panel
    - make the plot area of the XY graph transparent.
    - you can also make the border of the XY graph transparent
    - if you want a scale for your video then use the scale of the XY Graph to show the scale of your video images
    - then it's just a matter of defining the XY points of the ROIs that you want to plot on the XY Graph
    - if you want multiple ROIs then each can be a different color by making them different channels to the XY Graph
    - if you just want a few straight lines, and want to be able to move the lines around in realtime, then you can get a little fancier:
           - turn on the cursor legend for the XY Graph
           - create two cursors for each line that you need (a start point cursor and an end point cursor)
           - set the color and cursor style as desired
           - in your main loop:
                  - use the XY Graph "crsr list" property to get the XY positions of the various cursors
                  - those cursor XY positions become points of line segments fed to the XY Graph to plot (draws a line between the cursors)
           - in realtime, as you move the cursors, the line segments move accordingly
    In my particular app I had to provide a mechanism for the user to
    create ovals, rectangles, polygons and freehand ROIs. Then I also
    needed mechanisms to rotate, pan or expand/compress the ROIs. This is a
    bit more challenging, but very do-able.
    Hope that helps,
    RMP

  • Website comes up fine, but I'm unable to receive live video feed.

    The target website comes up just as it has for the last two years or so, and I can log in and be accepted by the software. However, none of the live streaming video from any of the cameras on the site is received. Instead I have only a white 4x4 square where the video feed normally shows.

    See this thread:
    https://discussions.apple.com/thread/5306216?tstart=0
    FIX:
    1. You need Time Machine.
    2. Go to the folder /Library/CoreMediaIO/Plug-Ins/DAL/
    3. Copy AppleCamera.plugin to a safe place (a USB memory stick is the best place).
    4. In Time Machine, go back to a date when Skype worked fine.
    5. Replace AppleCamera.plugin with the file from Time Machine.
    6. Restart the system. Skype should now work with the camera.
    If that doesn't work, I suggest looking here; much success has been found:
    http://community.skype.com/t5/Mac/OS-X-10-8-5-broke-Video-on-MacBook-Air/td-p/1891729/page/4

  • Live Video Feed

    Is there a way to connect a camera to your Mac to run live video without using FireWire? Is it possible to use an S-Video connection to a video card? I want to be able to move away from my Mac, and FireWire doesn't give the distance I need. S-Video cable can be up to 150 feet. I have a small program called Video Viewer for live video. It works well.
    Any Thoughts?
    Thanks

    You can go to 150 feet with FireWire. I think the limit is 300 feet. You just need to buy a cable with an amp in it. Just do a Google search and read.

  • Custom (non RTMP) FMS live video feed

    Is there a way to feed the FMS with live video created by custom software that does not 'speak' RTMP? How can that be accomplished?

    Not so far, I guess.

  • I have an Apple TV, a sound system and a video projector. The video projector and the sound system are separated by 5 meters and I cannot run a wire. What would be the wireless possibilities?

    Should I put the Apple TV next to the video projector and send the sound wirelessly to the sound system? But how do I do this from an optical output? Or I can put it next to the sound system and send the HD video signal wirelessly, but I have not found a solution under $100, and people seem to be complaining. Any ideas? Thanks in advance.

    Read: http://support.apple.com/kb/TS4436
    It is pretty short so here is the entire text:
    Symptoms
    A purplish or other colored flare, haze, or spot is imaged from out-of-scene bright light sources during still image or video capture.
    Resolution
    Most small cameras, including those in every generation of iPhone, may exhibit some form of flare at the edge of the frame when capturing an image with out-of-scene light sources. This can happen when a light source is positioned at an angle (usually just outside the field of view) so that it causes a reflection off the surfaces inside the camera module and onto the camera sensor. Moving the camera slightly to change the position at which the bright light is entering the lens, or shielding the lens with your hand, should minimize or eliminate the effect.
    Which is exactly what eelw wrote.
    No one on Apple Support Communities has authority to speak on behalf of Apple, unless their avatar is accompanied by this Apple image:
    ... identifying themselves as Apple employees.
