I don't support live video according to Buddy on PC

My buddy and I can video chat using Macs (both on laptops at work), but when we attempt the same from home on his PC laptop using AIM, he gets the dialog "Buddy does not support live video" even though I am using the same computer.
Where's the problem?

Hi Sticky,
Welcome to the Apple Discussion Pages.
On his PC laptop, has he allowed the AIM application through the XP firewall? See here
Ralph

Similar Messages

  • iPhone 4S: why doesn't it support iPhoto?

    iPhone 4S: why doesn't it support iPhoto?

    According to the quote below from https://itunes.apple.com/us/app/iphoto/id497786065?mt=8 it is optimized for iPhone 5 but only requires iOS 7.0 or later. I think it should work on a 4S -- what problem are you seeing?
    Compatibility: Requires iOS 7.0 or later. Compatible with iPhone, iPad, and iPod touch. This app is optimized for iPhone 5.

  • Can I connect an external video camera to a MacBook Pro for viewing live video so I don't have to use the little screen on the camera?

    Can I connect an external video camera to a MacBook Pro for viewing live video so I don't have to use the little screen on the camera?

    Shadow30 wrote:
    Can I connect an external video camera to a MacBook Pro for viewing live video so I don't have to use the little screen on the camera?
    • Only if your video camera supports it.  Depending on how your camera works, you may need to add preview software to your Mac to use this feature.
    • Alternatively, your camera may also be able to record directly into your Mac while you are monitoring the Mac's display.
    • If you are using a consumer camcorder that supports HDMI monitoring, a small HDTV might be a simpler solution than your Mac.  For an example of how this is done, see http://youtu.be/GVpSkZD6qE4.
    • The retailer who sold you your video camera may have other suggestions.
    • If you need more professional results, an external preview monitor will offer more capability on compatible cameras.
    Mac OSX 10.9.4

  • I have a JVC video camera and it records the video to a .mod format (which Apple devices don't support). I figured out how to get them to iPhoto using iMovie but I want to be able to get the videos on my iPad 2 but it won't let it sync. Any help?

    I have a JVC video camera and it records the video to a .mod format (which Apple devices don't support). I figured out how to get them to iPhoto using iMovie but I want to be able to get the videos on my iPad 2 but it won't let it sync. Any help?

    Thanks for that. Much more constructive than the last comment. It's only the restriction code I can't recall, not the access passcode, so I can currently access the device, just not age-restricted content. Does that make a difference? I also wondered if anyone knew how many attempts you get. I've now tried 21 times and so far nothing bad has happened, but I am concerned I'll eventually be completely locked out of the device. That doesn't seem in the spirit of things, though: surely it's foreseeable that a child could repeatedly try to guess the code, so I can't see that it would be right to lock the device down completely in that circumstance, particularly if the access code is being typed in correctly every time.
    Thanks

  • How can I view live videos that require Adobe Flash on my iPad 2?

    Need help viewing live videos that require Adobe Flash.

    iSwifter, Photon, Skyfire, and Puffin are all browsers that MAY work for you. The iPad does not run Flash and never has. See if one of these browsers suits your needs.

  • Live video feed using Flex/Flash

    Hi,
    I have a question if someone doesn't mind giving me a "yes"
    or "no" answer, please.
    My developers have created a program for me that lets me
    broadcast my LIVE web cam feed right on my web site in real time so
    I can provide customer support and allow me to engage with web site
    visitors. We are using Flash, Flex, and Adobe AIR.
    The broadcast window size is 320 x 240 pixels. The problem is
    the frame rate. I am only able to broadcast between 5 - 9 frames
    per second whereas other people have been able to broadcast at
    around 14 FPS.
    Question: is it possible to use these applications
    (Flash/Flex/Adobe AIR) to stream live video at 320 x 240 at frame
    rates of 24 or higher?
    Thanks in advance for an answer. I don't expect a solution.
    Just a yes or no from some experience(d) developer(s).
    All the best.
    Rodney
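    In principle yes: AS3 lets you request 24 fps from the camera, though the delivered rate also depends on the camera hardware, lighting, and upstream bandwidth. A minimal sketch of requesting that mode (the quality settings are illustrative, not a recommendation):

    ```actionscript
    import flash.media.Camera;

    var cam:Camera = Camera.getCamera();
    if (cam != null) {
        // The driver treats these values as a request, not a guarantee:
        // low light or a slow sensor can still cap the delivered rate.
        cam.setMode(320, 240, 24);   // width, height, requested fps
        cam.setQuality(0, 80);       // 0 = no bandwidth cap; quality 1-100
        trace("Requested fps: " + cam.fps);
        // cam.currentFPS reports the rate actually being delivered
    }
    ```

    Comparing cam.fps (requested) against cam.currentFPS (delivered) while publishing is a quick way to tell whether the bottleneck is capture or encoding/upload.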

    Lucas,
    Here's what I do for multi-camera live shoots:
    1. Connect all cameras to a switcher/DVE (looped through preview monitors). I use a Panasonic WJ-MX50 or Sony DFS-300. Since you have five cameras, you'll obviously need a switcher with more inputs, or gang two of the above together. You'll need a switcher capable of the signal output your cameras offer, i.e., composite, Y/C, component, triax .. whatever.
    2a. Output of switcher goes to DVCam VTR.
    2b. Audio is fed through a mixer and output to the VTR.
    3. Firewire out from the VTR to the Mac/FCP.
    4. In FCP, choose "Non-Controllable Device" and use Capture Now.
    5. Use one of the analog outputs from the VTR to feed the projector(s) and another analog output to a program monitor for the TD.
    I always run safety tapes in each camera while the VTR is recording a switched master tape in case there's a problem with the capture in FCP (it can happen).
    I also equip each cam-op with an 8" or 9" monitor, typically using the camera's composite output. And everyone is able to communicate using a Clear-Com system.
    In 5 years, doing several shows per year, the capture in FCP has only failed three times, and all of those times were when I was using a laptop. Whenever I've used my old G4 tower, capturing to one of the internal drives, it's worked flawlessly each time. Fortunately, having a master DVCam tape saved the day. The camera safety tapes are used just in case of a bad switch by the TD (usually me).
    -DH
    PS I should add that if you're feeding multiple projectors, you'll need a simple DA.

  • How do you auto-reconnect a live video stream broadcast in Flash ActionScript 3?

    How do you auto-reconnect a live video stream broadcast in Flash ActionScript 3,
    so I don't have to ask people to refresh the page if the connection drops?
    I copy-pasted the live video stream broadcast files and script from here:
    http://www.adobe.com/devnet/adobe-media-server/articles/beginner_live_fms3.html
    http://www.adobe.com/content/dotcom/en/devnet/adobe-media-server/articles/beginner_live_fms3/_jcr_content/articlePrerequistes/multiplefiles/node_1278314297096/file.res/beginner_live_fms3.zip
    I don't know what I'm doing.

    Why don't you use several layers with appropriate alpha properties, and move these layers according to the mouse events?
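    One common approach to the reconnect question itself: listen for NetStatusEvent on the NetConnection and retry after a delay whenever the connection closes or fails. This is a hedged sketch, not tested against the linked tutorial code; the server URI and the 5-second delay are placeholders you would adapt:

    ```actionscript
    import flash.events.NetStatusEvent;
    import flash.net.NetConnection;
    import flash.utils.setTimeout;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    connect();

    function connect():void {
        nc.connect("rtmp://yourserver/live");   // placeholder FMS URI
    }

    function onStatus(e:NetStatusEvent):void {
        switch (e.info.code) {
            case "NetConnection.Connect.Success":
                // re-create the NetStream and re-play the live stream here
                break;
            case "NetConnection.Connect.Closed":
            case "NetConnection.Connect.Failed":
                // retry quietly instead of asking viewers to refresh the page
                setTimeout(connect, 5000);
                break;
        }
    }
    ```

    A real player would also cap the number of retries or back off exponentially so a downed server isn't hammered forever.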

  • Live-Video and ACNS: Different documentation?

    I had found two statements regarding live- video and ETV:
    First in
    http://www.cisco.com/en/US/partner/prod/collateral/video/ps9339/ps7063/data_sheet_c78-504500.html
    "... Cisco ACNS provides both live unicast and multicast streaming services and on-demand access in which digital media files are cached locally for retrieval and viewed over the WAN at LAN speeds (Figure 3)."
    And the second statement in
    "Cisco Digital Media System 5.1 Design
    Guide for Enterprise Medianet":
    "... For Digital Signage and Enterprise TV live content, multicast is required as the transport. Cisco ACNS
    does not support unicast to multicast conversion for the delivery method used by Digital Signage and
    Enterprise TV. For this reason, Cisco ACNS does not provide any benefit for live content...."
    Is there a difference, or is one of them an error?

    I think the first statement should read: "ACNS does support unicast and multicast splitting for live broadcasts TO THE DESKTOP and with Windows Media streaming."
    The second statement hints at high-definition streaming, which is mostly compressed with H.264/MPEG-4, and I don't think ACNS is able to split this (or convert unicast -> multicast).

  • Stream live video - use http or rtmp protocols from player code?

    I'm trying to stream live video for the first time. I've installed FMIS 3.5 without Apache 2.2. I'm confused about the RTMP/HTTP protocol usage;
    I can only have port 80 open on our FMIS 3.5 box.
    1. I've created a new directory under "C:\Program Files\Adobe\Flash Media Server 3.5\applications" called "livedev" as my publishing point.
    2. Question:
         Do I need to copy main.asc, Application.xml, allowedHTMLdomains.txt, allowedSWFdomains.txt to this directory as well?
    3. Question:
         Using FMLE 3, which protocol do I need for the FMS URL parameter, and does it matter: 'http' or 'rtmp'? - http://localhost/livedev or rtmp://localhost/livedev
         Does RTMP use port 80?
    4. Question:
         In my Flash Player code. I dragged an instance of FLVPlayback component, what URL do I need for the 'Source' parameter?
         Which protocol do I need to use, and does it matter: 'http' or 'rtmp'? - "http://localhost/livedev/livestream" or "rtmp://localhost/livedev/livestream"
    Thanks,
    Dave

    A:
    2.  You don't need to copy main.asc unless you have logic in there that you want in your new application.  It's a text file, so you can open and read it to get familiar.  allowedDomains is up to your security policy, but if you want it to apply to that application, then yes.  Finally, the Application.xml in there is an override of the default one in the server's conf directory - so you only need to include it if you plan on overriding settings from the default.
    3.  Choosing which type dictates whether you're doing HTTP (not streaming) or RTMP (streaming); however, you've added a complication in that you're only allowed to use port 80 - which RTMP does not use by default.  FMLE doesn't support HTTP push to FMS, so the FMS URI must be some form of RTMP, which leaves you with some good choices:
    Use RTMP over port 80, like so: rtmp://myServerName:80/appName/ <- notice the coded port number
    Use RTMPT, which automatically uses port 80 and will get around most proxies that demand HTTP traffic - however, there's more performance overhead than RTMP (think maybe 25% more overhead, for a wild guess on my part)
    I don't know the answer for your FLVPlayback component, but I'm going to take a guess that the issue is about the same here, and you should use the same parameter that you're using for your RTMP ingest.  At worst, I'm pretty sure that it will work, and that's not so bad.
    Asa
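    Putting that answer into code: a hedged sketch of pointing the FLVPlayback component at the live stream over RTMP on port 80. The server name, application name ("livedev"), and stream name ("livestream") come from the question above; whether FLVPlayback accepts the explicit port this way should be verified against your player version:

    ```actionscript
    import fl.video.FLVPlayback;

    var player:FLVPlayback = new FLVPlayback();
    player.isLive = true;
    // Explicit :80 in the URI, matching the single open port on the FMS box.
    player.source = "rtmp://myServerName:80/livedev/livestream";
    // If a proxy blocks raw RTMP, RTMPT tunnels over HTTP on port 80 instead:
    // player.source = "rtmpt://myServerName/livedev/livestream";
    addChild(player);
    ```

    The http:// form would only be correct for progressive download of a file, not a live stream, which is why both the FMLE ingest and the player source stay on an RTMP-family scheme.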

  • Live video application for multiple users with RTMFP

    Hi
    We developed a live video application in Flex that uses RTMP. In one session, up to 5 users are able to publish their streams and up to 20 users can join a group. To save bandwidth costs and make use of UDP's lower latency, we are checking out RTMFP. From reading the documentation, it seems that RTMFP multicast is the way to go.
    Our first results and additional research surfaced two issues:
    - Streams have a live delay (which seems to be multicastWindowDuration + bufferTime) that takes up to 30 sec to disappear but seems to increase again
    - The application allows one user to change the size of a published video (camera.setMode). This causes a freeze with RTMFP for the publisher and all subscribers. Changing the size several times can make the video work again for the publisher and subscribers (some bugs are already reported here)
    To solve these issues, I wonder:
    1.) Is RTMFP multicast able to deliver the low latency needed for face-to-face a/v communication?
    2.) If yes, are there any settings suggested by Adobe for the RTMFP NetStream multicast properties (like the multicastWindowDuration property)? If not, can you give me some hints on what would reduce the latency enough that face-to-face communication becomes acceptable?
    3.) How do I deal with the setMode issue - does this happen because I misconfigured RTMFP, or is this API not supported? Will unpublish and publish solve the issue (and what API series do I need to use to do so)?
    Best regards,
    Marc

    1) Any practical use of P2P multicast will have a delay measured in seconds. It shouldn't go to 30 seconds unless you have a large NetStream.bufferTime.
    2) I strongly recommend against changing the multicast stream parameters; they all interact with each other. Decreasing the window duration will decrease the effective reliability of the stream by reducing the amount of time available to receive all of the fragments of the stream as they pass through the P2P group -- especially when people join or leave the group, which can disrupt the low-latency "push" trees that are constructed in the group.
    3) You should be able to change the encoding parameters on the fly without a major disruption. Are you using H.264 or Sorenson Spark?  It's possible that your camera doesn't like to have its capture size changed and is freezing temporarily.  I'm not having any trouble changing the capture size with my camera (using H.264).  I recall there being some issue with changing the frames-per-second on the fly (at least some time ago, and I don't remember what the issue was) -- there was enough of an issue that I disabled it in my little app while actively publishing.
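    If the freeze does persist, one workaround for question 3 is the unpublish-and-republish route the asker mentions: tear the stream down around the setMode call. This is an untested sketch; it assumes the connected NetConnection and the GroupSpecifier string ("groupspec") from your existing app are in scope:

    ```actionscript
    import flash.media.Camera;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    // Hypothetical workaround: republish after a resolution change so
    // subscribers pick up a stream with consistent encoding parameters.
    function changeCaptureSize(nc:NetConnection, cam:Camera,
                               ns:NetStream, w:int, h:int):NetStream {
        ns.attachCamera(null);            // stop feeding frames
        ns.close();                       // unpublish the old multicast stream
        cam.setMode(w, h, cam.fps);       // request the new capture size
        var fresh:NetStream = new NetStream(nc, groupspec);
        fresh.attachCamera(cam);
        fresh.publish("multicastStream"); // stream name is a placeholder
        return fresh;
    }
    ```

    Subscribers would see a brief gap while the group re-establishes the new stream, which may still be preferable to an indefinite freeze.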

  • CNN Live Video Streaming Error!

    I can't seem to get Safari 3.1 (Windows) to play live streaming from cnn.com.
    I installed the recommended "Turner" plugin, but that failed to resolve the issue; in fact, the plugin link leads to a Firefox plugin, which I don't even use.
    I considered the Safari "RealPlayer" plugin, but that is not even a plugin; it's the complete full version of RealPlayer Gold, which I don't want installed for security reasons.
    What is going on here? Can I view cnn live video with Safari 3.1?

    Check with CNN if the "Turner" plugin is compatible with Safari 3.1 for Windows.
    They might have written it for IE and Firefox only, with Safari and Opera support coming up next?

  • Making QtWebkit support HTML5 video and audio

    For those who don't know, HTML5 enables most current browsers to play video and audio without the need for any plugin. QtWebkit supports this feature, so browsers using it, such as Rekonq and Arora, should support it too. However, on Arch Linux this does not work out of the box.
    I've tried to make this work and made a little progress, and I'm wondering if anyone has made similar attempts, has suggestions, or is willing to try this out. What I've done, basically, is to compile Qt with Phonon, and then remove the Phonon files when building the package, so that it uses the Phonon that comes with KDE. I included the PKGBUILD below; you just need to copy the extra/qt directory from abs, and override the PKGBUILD or apply the patch.
    I also installed phonon-vlc-qt from AUR, which according to this post contains some fixes to support HTML5 video/audio. However, neither this nor the default gstreamer backend works for me. With vlc the video doesn't play, and with gstreamer the browser crashes. To try different backends for Phonon, open KDE's System Settings - Multimedia - Backends, and use the Prefer and Defer buttons.
    Some links for testing HTML5 video and audio:
    http://html5test.com - only uses JavaScript to ask the browser if it supports the feature. What I've done does make these tests pass.
    http://html5video.org
    http://youtube.com/html5 - join the beta, then open some YouTube videos and look for HTML5 in the player's control bar
    The PKGBUILD:
    # $Id: PKGBUILD 98480 2010-11-09 22:17:20Z andrea $
    # Maintainer: Andrea Scarpino <[email protected]>
    # Contributor: Pierre Schmitz <[email protected]>
    pkgname=qt
    pkgver=4.7.1
    pkgrel=2
    pkgdesc='A cross-platform application and UI framework'
    arch=('i686' 'x86_64')
    url='http://qt.nokia.com/'
    license=('GPL3' 'LGPL')
    depends=('libtiff' 'libpng' 'libmng' 'sqlite3' 'ca-certificates' 'glib2' 'dbus'
    'fontconfig' 'libgl' 'libsm' 'libxrandr' 'libxv' 'libxi' 'alsa-lib'
    'xdg-utils' 'hicolor-icon-theme')
    optdepends=('postgresql-libs: PostgreSQL driver'
    'libmysqlclient: MySQL driver'
    'unixodbc: ODBC driver'
    'libxinerama: Xinerama support'
    'libxcursor: Xcursor support'
    'libxfixes: Xfixes support')
    makedepends=('mesa' 'postgresql-libs' 'mysql' 'unixodbc' 'cups' 'gtk2')
    install="${pkgname}.install"
    options=('!libtool')
    _pkgfqn="qt-everywhere-opensource-src-${pkgver}"
    source=("ftp://ftp.qt.nokia.com/qt/source/${_pkgfqn}.tar.gz"
    'assistant.desktop' 'designer.desktop' 'linguist.desktop'
    'qtconfig.desktop')
    md5sums=('6f88d96507c84e9fea5bf3a71ebeb6d7'
    'fc211414130ab2764132e7370f8e5caa'
    '85179f5e0437514f8639957e1d8baf62'
    'f11852b97583610f3dbb669ebc3e21bc'
    '6b771c8a81dd90b45e8a79afa0e5bbfd')
    build() {
    unset QMAKESPEC
    export QT4DIR=$srcdir/$_pkgfqn
    export PATH=${QT4DIR}/bin:${PATH}
    export LD_LIBRARY_PATH=${QT4DIR}/lib:${LD_LIBRARY_PATH}
    cd $srcdir/$_pkgfqn
    sed -i "s|-O2|$CXXFLAGS|" mkspecs/common/g++.conf
    sed -i "/^QMAKE_RPATH/s| -Wl,-rpath,||g" mkspecs/common/g++.conf
    sed -i "/^QMAKE_LFLAGS\s/s|+=|+= $LDFLAGS|g" mkspecs/common/g++.conf
    ./configure -confirm-license -opensource \
    -prefix /usr \
    -docdir /usr/share/doc/qt \
    -plugindir /usr/lib/qt/plugins \
    -importdir /usr/lib/qt/imports \
    -datadir /usr/share/qt \
    -translationdir /usr/share/qt/translations \
    -sysconfdir /etc \
    -examplesdir /usr/share/doc/qt/examples \
    -demosdir /usr/share/doc/qt/demos \
    -largefile \
    -plugin-sql-{psql,mysql,sqlite,odbc} \
    -system-sqlite \
    -xmlpatterns \
    -phonon \
    -phonon-backend \
    -svg \
    -webkit \
    -script \
    -scripttools \
    -system-zlib \
    -system-libtiff \
    -system-libpng \
    -system-libmng \
    -system-libjpeg \
    -nomake demos \
    -nomake examples \
    -nomake docs \
    -no-rpath \
    -openssl-linked \
    -silent \
    -optimized-qmake \
    -dbus \
    -reduce-relocations \
    -no-separate-debug-info \
    -gtkstyle \
    -opengl \
    -no-openvg \
    -glib
    make
    }
    package() {
    cd $srcdir/$_pkgfqn
    make INSTALL_ROOT=$pkgdir install
    # install missing icons and desktop files
    for icon in tools/linguist/linguist/images/icons/linguist-*-32.png ; do
    size=$(echo $(basename ${icon}) | cut -d- -f2)
    install -p -D -m644 ${icon} ${pkgdir}/usr/share/icons/hicolor/${size}x${size}/apps/linguist.png
    done
    install -p -D -m644 src/gui/dialogs/images/qtlogo-64.png ${pkgdir}/usr/share/icons/hicolor/64x64/apps/qtlogo.png
    install -p -D -m644 tools/assistant/tools/assistant/images/assistant.png ${pkgdir}/usr/share/icons/hicolor/32x32/apps/assistant.png
    install -p -D -m644 tools/designer/src/designer/images/designer.png ${pkgdir}/usr/share/icons/hicolor/128x128/apps/designer.png
    install -d ${pkgdir}/usr/share/applications
    install -m644 ${srcdir}/{linguist,designer,assistant,qtconfig}.desktop ${pkgdir}/usr/share/applications/
    # install license addition
    install -D -m644 LGPL_EXCEPTION.txt ${pkgdir}/usr/share/licenses/qt/LGPL_EXCEPTION.txt
    # Fix wrong path in pkgconfig files
    find ${pkgdir}/usr/lib/pkgconfig -type f -name '*.pc' \
    -exec perl -pi -e "s, -L${srcdir}/?\S+,,g" {} \;
    # Fix wrong path in prl files
    find ${pkgdir}/usr/lib -type f -name '*.prl' \
    -exec sed -i -e '/^QMAKE_PRL_BUILD_DIR/d;s/\(QMAKE_PRL_LIBS =\).*/\1/' {} \;
    # Remove phonon files which are installed by KDE
    rm -r ${pkgdir}/usr/{include/phonon/,lib/libphonon*,lib/pkgconfig/phonon.pc}
    }
    Diff from qt's PKGBUILD in ABS:
    --- PKGBUILD.orig 2010-12-15 13:18:19.000000000 -0500
    +++ PKGBUILD 2010-12-15 13:19:15.000000000 -0500
    @@ -57,8 +57,8 @@
    -plugin-sql-{psql,mysql,sqlite,odbc} \
    -system-sqlite \
    -xmlpatterns \
    - -no-phonon \
    - -no-phonon-backend \
    + -phonon \
    + -phonon-backend \
    -svg \
    -webkit \
    -script \
    @@ -109,4 +109,6 @@
    # Fix wrong path in prl files
    find ${pkgdir}/usr/lib -type f -name '*.prl' \
    -exec sed -i -e '/^QMAKE_PRL_BUILD_DIR/d;s/\(QMAKE_PRL_LIBS =\).*/\1/' {} \;
    + # Remove phonon files which are installed by KDE
    + rm -r ${pkgdir}/usr/{include/phonon/,lib/libphonon*,lib/pkgconfig/phonon.pc}

    Use a Mini DisplayPort-to-HDMI cable for video:
    http://www.cnaweb.com/6-mini-displayport-male-to-hdmi-male-cable.aspx?gclid=CMbnpNem57wCFRQS7AodzDQACg
    A 2009 MBP does not support audio via HDMI, so you will have to tap the Audio Output port.  Use a 3.5 mm-to-RCA cable and tap the RCA audio inputs on the TV, such as this:
    http://www.monoprice.com/Product?c_id=102&cp_id=10218&cs_id=1021817&p_id=9768&seq=1&format=2&utm_expid=58369800-11.R-enhtUGRrSdHz5vzpVS2g.0&utm_referrer=http%3A%2F%2Fwww.monoprice.com%2FCategory%3Fc_id%3D102%26cp_id%3D10256%26gclid%3DCM2Vroyn57wCFZLm7AodHjkAlQ
    Ciao.

  • Live video using multiple cameras

    I am trying live video streaming using Flash Lite + FMS3 +
    Flash Media Encoder.
    If I use multiple cameras, how do I switch between them?
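    If the cameras are all attached to the machine running the Flash client (rather than feeding FMLE, which takes a single device input), a hedged AS3 sketch of cutting between them on one published stream might look like this. "nc" is assumed to be a NetConnection already connected to your FMS3 application, and the stream name is a placeholder:

    ```actionscript
    import flash.media.Camera;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var ns:NetStream = new NetStream(nc);
    ns.publish("live", "live");

    function switchToCamera(index:int):void {
        // Camera.getCamera takes a string index into Camera.names
        var cam:Camera = Camera.getCamera(String(index));
        if (cam != null) {
            cam.setMode(320, 240, 15);
            ns.attachCamera(cam);   // subscribers now see the new camera
        }
    }
    ```

    With FMLE in the chain, switching instead usually happens upstream of the encoder, using a hardware or software video switcher, since FMLE publishes from one capture device at a time.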

    Almost every frame grabber displays live images on the screen. I use the
    Falcon board and control it via its DLL. I give it a handle to a window
    from LV to show the image there. The DLL supplies me with a pointer to the pixel
    data, as does IMAQ Vision. That's all I need to move a frame into an
    IMAQ picture for further processing.
    Detlef
    Paul schrieb:
    > Does someone know how I can watch a live image from my camera in
    > LabVIEW? I have IMAQ, but I don't know if it would be easier using an
    > ActiveX component.
    > Thanks
    >
    > --
    > [email protected]

  • How can I capture live video and still image by a DirectX compatible USB webcam using LabView ?

    Dear forum members
       I'd like to design a user interface which shows live video and captures still images from a DirectX-compatible USB webcam using LabVIEW or the NI Vision Toolkit. How can I do this? And if this is possible, how can I reach the webcam's DirectX filter properties and set them using LabVIEW? I would be grateful to anyone who can lead me to the correct solution.
        Sincerely
        Cem DEMiRKIR

    Cem,
        With our example programs you can acquire and save an image (IMAQ >> File Input and Output >> Snap and Save to File.vi) from one camera, if you double the VI's to create the image and the image task, you can easily have one program acquire and save for two cameras.
        I may need more information on what you mean by calibration.  Do you mean basic camera setup?  That can be done in Measurement and Automation Explorer by setting up the options within the IMAQ settings for each camera.  Or do you mean more complex calibration for special types of images?  The more description of what you mean, the better I will be able to help you get it done.  You mentioned zoom and motion parameters, what kind of motion?  Our vision drivers only control triggering the camera to acquire, then analyzing, processing and displaying the resulting image.
        If you have more details on exactly what you want to do, that would be great.  Let me know if you have more questions, thanks!
    -Allison S.
    Applications Engineering

  • How to send live video from an iPhone to an iPad

    I have an iPhone 3G and a 3GS.
    Is it possible to take live video from the iPhone and send it to an iPad 2?
    Thanks

    I don't know which iPhones support FaceTime, but if either of yours does, you could use that.
