Streaming with H.264 (jiggle)

Hello,
After streaming with Flash Media Encoder 3.1 and ManyCam/Splitcam using VP6 at 320x240, 30 fps (works well!!),
I tried to stream in HQ with H.264, 30 fps, 640x360, 700 kbps...
But after pressing Start, both windows (Input and Output) jiggle/stutter very badly (and in the Justin.tv stream too, of course).
Looking at the encoding log, it shows an output of only 11-12 fps.
So where is the problem? (PC, internet...?)
My system:
Laptop with Core 2 Duo T5250 @ 1.50 GHz
3 GB RAM
Windows Vista
Plenty of free hard disk space
ATI Mobility Radeon X2300
DSL 20,000/1,024 kbit/s (down/up)
Thanks a lot =)

I think the problem is that the H.264 codec needs more CPU resources than On2 VP6.
I tried 640x480, 25 fps, 200 kbps with both codecs (VP6 and H.264). With H.264 I get a delay of about 3 seconds, while with VP6 the delay is less than 1 second.
So yes, you need more CPU resources with H.264: the CPU works much harder with that codec, which causes the extra delay (and the dropped frame rate you are seeing).
http://help.adobe.com/es_ES/Flash/10.0_UsingFlash/WS9222D73A-676D-41cd-9222-A4884858BBA3.html
Enrique Figueroa

Similar Messages

  • Record stream with H.264 and PCMU: is it supported?

    I tried to record a stream which is published with H.264 codec and the telephony (PCMU) codec with a Flash Player 11+ client. The stream plays fine live but when I try to record it, I get the following error on the status of the stream: NetStream.Record.NoAccess. The file is created on disk but has only 21 KB, never more, never less. This is my log:
    10:21:49.550  info   Stream (record_Aaj3g/u28_T32l87h.f4v) status: NetStream.Data.Start (level: undefined, length: 0 sec.)
    10:21:49.554  info   Stream (record_Aaj3g/u28_T32l87h.f4v) status: NetStream.Publish.Start (level: status, length: 0 sec.)
    10:21:49.556  info   Stream (record_Aaj3g/u28_T32l87h.f4v) status: NetStream.Play.Reset (level: status, length: 0 sec.)
    10:21:50.107  info   Stream (record_Aaj3g/u28_T32l87h.f4v) status: NetStream.Record.Start (level: status, length: 0 sec.)
    10:21:59.962  info   Stream (record_Aaj3g/u28_T32l87h.f4v) status: NetStream.Record.NoAccess (level: status, length: 0 sec.)
    10:21:59.966  info   Stream (record_Aaj3g/u28_T32l87h.f4v) status: NetStream.Record.Stop (level: status, length: 0 sec.)
    I dug into the problem a little and found that I also have an error in the Windows Event Viewer, from AMSCore:
    Error from libmp4.dll:
    Received an audio packet of unknown type..
    I even tried to publish the stream in "record" mode instead of "live" directly from the client to see if it does the same thing. It does, I get exactly the same result.
    I searched on the web hoping to find something about the problem but without any result. The only reference I see is a table in the AMS documentation listing the supported file types and codecs, but there is no mention of the PCMU/PCMA codecs. Here: http://help.adobe.com/en_US/adobemediaserver/techoverview/WS07865d390fac8e1f-4c43d6e71321ec235dd-7fff.2.3.html#WS5b3ccc516d4fbf351e63e3d119ed944a1a-7fe7.2.3
    As a last resort, I tried publishing the stream as RAW. It seems to work, but I can hardly use this format since, as far as I know, it is only usable by Flash Player; I cannot convert or reuse the file on another platform.
    By the way, I had the same result on FMS 3.5.3 and AMS 5.0.1
    Any idea? Is this normal or what?

    Update: I discovered several other recordings that were also 21 KB in size, also F4V but with audio only (Nellymoser). I did not get any error in the logs or the Event Viewer, which makes this tough to debug! What could cause such a thing? I understand that there is no data recorded in the stream file, except probably metadata. I tried to convert such a file with ffmpeg and got an error saying there is no stream data found in the file.
    The method we use for publishing/recording is the following: the stream is published from the client in live mode (no record at all). On the server side, I create a new Stream, attach the publishing stream to it, and then record the new stream (roughly the flow sketched below). It seems to work great in most cases, but sometimes, for unknown reasons, it does not.
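    For reference, a minimal sketch of that server-side flow (not the poster's actual code; the "mp4:" prefix is what asks FMS/AMS to record into an F4V container, and the recordings map is just made-up bookkeeping):
    application.onAppStart = function() {
            this.recordings = {}; // bookkeeping: live stream name -> server-side recording Stream
    };
    application.onPublish = function(client, liveStream) {
            // create a server-side Stream that will hold the recording
            var rec = Stream.get("mp4:record_" + liveStream.name + ".f4v");
            if (rec) {
                    rec.record();                       // start recording
                    rec.play(liveStream.name, -1, -1);  // attach the live stream being published
                    this.recordings[liveStream.name] = rec;
            }
    };
    application.onUnpublish = function(client, liveStream) {
            var rec = this.recordings[liveStream.name];
            if (rec) {
                    rec.record(false); // stop recording
                    rec.play(false);   // detach the live source
                    delete this.recordings[liveStream.name];
            }
    };
    This only illustrates the flow; by itself it would not change the NetStream.Record.NoAccess result if the F4V recorder cannot mux the PCMU audio packets.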
    Anybody, an idea?

  • Auto-archiving streams with H.264 video

    My CDN is running FMS, and my customers and I stream a lot of live traffic through it. We've got FMS configured to auto-archive our live streams, but that only works for streams using VP6 video. When I look at the archived files captured from H.264 streams, I get audio but no video.
    Is it possible to get FMS to auto-archive H.264 streams? Or is this planned for a future release of FMS?

    Yes, that does help, thanks!
    We do indeed have a script running to do our auto-archiving. It currently doesn't recognize H.264, but with the info you provided I should be able to modify it to do so.
    What we are currently running is:
    /*
    * (C) Copyright 2007 Adobe Systems Incorporated. All Rights Reserved.
    * NOTICE:  Adobe permits you to use, modify, and distribute this file in accordance with the
    * terms of the Adobe license agreement accompanying it.  If you have received this file from a
    * source other than Adobe, then your use, modification, or distribution of it requires the prior
    * written permission of Adobe.
    * THIS CODE AND INFORMATION IS PROVIDED "AS-IS" WITHOUT WARRANTY OF
    * ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO
    * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A
    * PARTICULAR PURPOSE.
    * THIS CODE IS NOT SUPPORTED BY Adobe Systems Incorporated.
    */
    /* Adobe Flash Functions */
    /* Load webservices.asc */
    load("webservices/WebServices.asc");
    trace(" * Loaded webservices/WebServices.asc");
    trace(" * Started Application");
    application.onConnect = function( client ) {
            /* Connection has started */
            client.getStreamLength = function( streamName ) {
                    trace(" * Length => " + Stream.length( streamName ));
                    return Stream.length( streamName );
            };
            trace(" * Camera Connected");
            application.acceptConnection( client );
    };
    application.onPublish = function (p_c, p_stream) {
            /* Publishing has started */
            trace(" * p_c => " + p_c);
            trace(" * p_stream => " + p_stream);
            trace(" * Publishing started for stream " + p_stream.name);
            p_stream.archiveStart();
    };
    application.onUnpublish = function (p_c, p_stream) {
            /* Publishing has stopped */
            trace(" * Publishing stopped for stream " + p_stream.name);
            p_stream.archive();
    };
    /* Our Custom Functions */
    /* Last Modified Tue Dec 1 2:55 pm */
    trace(" * Loading Custom Functions ... ");
    Date.prototype.unixTime = function () {
            /* Create and return a UNIX timestamp */
            trace(" * Getting UNIX Timestamp ... ");
            return Math.round(this.getTime() / 1000);
    };
    Stream.prototype.archiveStart = function() {
            /* Start archiving stream */
            this.record(); // start recording live stream
            trace(" * Recording started for stream");
            this.startTime = (new Date()).unixTime(); // attach start time in UNIX time to the stream object
    };
    Stream.prototype.archive = function() {
            trace(" * Function => Stream.prototype.archive");
            this.record(false); // stop recording live stream
            trace(" * Setting this.record to false");
            f = "/streams/_definst_/" + this.name + ".flv"; // current name of file that was recorded
            trace(" * f => " + f);
            src = new File(f); // create a new File object for the source
            if (src.exists) {
                    /* src file exists */
                    trace(" * Src for " + f + " exists");
                    /* Create new unique name for recorded stream */
                    f1 = '/streams/_definst_/live_event_archives/auto_archive_' + this.name + '_' + this.startTime + ".flv";
                    trace(" * New file name => " + f1);
                    /* Copy src file to new file */
                    try {
                            src.copyTo(f1);
                            trace(" * Copied src to " + f1);
                    } catch (error) {
                            trace(" * ERROR copying src to " + f1 + ": " + error.message);
                    }
                    /* Remove src file */
                    try {
                            src.remove(); // remove source file
                            trace(" * Removed src");
                    } catch (error) {
                            trace(" * ERROR deleting file: " + error.message);
                    }
                    /* Make web service request to vod server */
                    trace(" * Preparing to send to vod server ... ");
                    this.sendToVOD(this.name, f1, this.startTime, new Date().unixTime());
            } else {
                    trace(" * ERROR! Src does not exist! " + src);
            }
    };
    Stream.prototype.sendToVOD = function(n, f, s, e) {
            trace(" * Preparing to submit data to vod server ... ");
            trace(" * n => " + n );
            trace(" * f => " + f );
            trace(" * s => " + s );
            trace(" * e => " + e );
            /* URL for web service call */
            var cst_url = "http://cst.{OUR DOMAIN}.net/gateway/point/?name=" + n + '&file=' + f + '&start=' + s + '&end=' + e;
            trace(" * URL => " + cst_url);
            /* Init new webservice url call */
            var web = new WebService(cst_url);
            /* Make web service request */
            web.onLoad = function(wsdl) {
                    trace(" * Connected to vod web service");
            };
            web.onFault = function (wsdl) {
                    trace(" * Could not connect to vod web service!");
            };
    };
    So, I will see if I can modify it to decide whether it needs to record mp4 or flv, and see if that helps.
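    One possible direction for that change, sketched loosely against the script above (not from the thread; the helper and the naming convention are assumptions - it presumes H.264 publishers use an "mp4:" stream-name prefix, which is what lets FMS record them as F4V instead of FLV):
    /* Hypothetical helper: decide the recorded container from the stream name */
    function archiveExtension(streamName) {
            // streams published as "mp4:something" are H.264/AAC and get recorded as .f4v
            return (streamName.indexOf("mp4:") == 0) ? ".f4v" : ".flv";
    }
    Stream.prototype.archiveStart = function() {
            this.ext = archiveExtension(this.name);   // remember which container this recording uses
            this.record();                            // start recording the live stream
            this.startTime = (new Date()).unixTime();
            trace(" * Recording started for stream " + this.name + " as " + this.ext);
    };
    Stream.prototype.archive would then build f and f1 with this.ext instead of the hard-coded ".flv" (and would need to strip the "mp4:" prefix when building the on-disk path).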

  • Receiving a stream published with H.264 settings

    I publish the video stream with H.264 settings to FMS 3.5 as recorded, and on the receiver's end play it as live. The problem is that on the receiver's end the stream starts playing 4-5 seconds behind. I want it to play from the current position of the live stream.
    Any Help please???
    // subscriber:
    netstrm = new NetStream(nc);
    netstrm.play("mp4:" + instanceName + ".f4v", -1); // -1 = play the live stream only
    // publisher:
    h264Settings = new H264VideoStreamSettings();
    h264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_1_1);
    nsPublish.videoStreamSettings = h264Settings;
    nsPublish.publish("mp4:" + instanceName + ".f4v", "record"); // publish and record on the server

    I believe it's a matter of playing around with the stream.play parameters and the bufferLength/bufferTime properties on the stream. Though I don't have the code to look into it, so I may be wrong.
    C
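    As a rough sketch of that suggestion (assumptions only, not tested against this setup; nc and instanceName come from the snippet above):
    netstrm = new NetStream(nc);
    netstrm.bufferTime = 2; // small playout buffer: lower latency, but more risk of stutter
    netstrm.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
            if (e.info.code == "NetStream.Buffer.Full")  netstrm.bufferTime = 6; // grow the buffer once playback is stable
            if (e.info.code == "NetStream.Buffer.Empty") netstrm.bufferTime = 2; // shrink it again after an underrun
    });
    netstrm.play("mp4:" + instanceName + ".f4v", -1); // -1 = subscribe to the live stream only, never the recorded part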

  • DLSR issue causing dropped streams with Helix Server

    It appears from my testing, research and confirmation from Real Networks that the QuickTime 7 Player has not followed the RFC for DLSR (Delay Since Last Sender Report) time stamps. I think this is causing movies created with QuickTime 7 Pro and streamed from Real Helix Server (in my test, 11.1.1) to be dropped. Below is a post from another QuickTime user experiencing the same thing, with some detail regarding the time-stamp difference. In my case, I am streaming some H.264 QuickTime movies from a Helix server and the streams are being dropped consistently. Real and WMV files are streaming flawlessly.
    My questions are:
    Does anyone know if this is indeed a problem with QuickTime?
    Has Apple acknowleged this to be a problem?
    Does anyone know a workaround?
    What is very strange about the problem is that it does not occur every time on some of my test computers. Out of 7 computers tested (both Mac and Windows), 2 have not displayed the problem on first try, but I cannot for the life of me figure out what the difference(s) might be and why the time-stamp issue would not cause the first stream to drop.
    http://lists.apple.com/archives/streaming-server-dev/2006/Nov/msg00027.html
    Thanks to anyone who can help.

    anystreamcarl:
    Did you ever sort this out? We have an Axis camera we're testing and trying to get it to stream through our Helix server. The player always stops showing content at 65 seconds but displays no other indications of problems (timer continues, etc). Since RealPlayer is actually calling QuickTime to display this, we have the same problem with using RealPlayer.
    The same stream works fine with a VLC media player but I believe it's using a different protocol to talk to the server.
    We've worked with the RealMedia support people now for more than a month and they have no clue what's happening.
    rtsp://vm.gaslightmedia.com/rtpencoder/petoskey_pointe.sdp
    Does this sound at all similar to your problems? The only difference from what you describe is that it always stops at 65 seconds for any QuickTime player on any system. We have no trouble with any other types of streams and other QuickTime streams run fine.

  • Issue with H.264/AVC export

    I am a beginner with Adobe Premiere so please excuse me if this is a naive question and terminology.
    The first thing I tried to achieve with Premiere is to convert/compress videos that I capture with my camera. I have been doing this for a long time using NCH Prism Video Converter, and it does a good job: it uses an AVI container with the H.264 codec and compresses a lot without much perceptible quality loss. I have been trying to export a video to the same level of compression, with similar quality, from Premiere, but I haven't been successful yet. The exported files are always larger and lower in quality. I have uploaded the original captured video as well as samples of the videos compressed by Prism and Premiere to my Dropbox:
    Original (.mov 131.72 MB): https://db.tt/Qp7TUkPt 
    Prism (.AVI 18.94 MB): https://db.tt/sREwmMdD
    Premiere (.MP4 20.67 MB): https://db.tt/fhmiRUNk
    What I'd like to find is the right format in Premiere to get roughly the same quality and size as I get with Prism. As you can see in the videos above, the Premiere one is larger and noisier (pixels are distorted, especially when the camera is moving, and you can see noise that makes pixels look jumpy/jaggy - sorry, I don't know the exact name of this kind of artifact).
    I tried MediaInfo to see what codec Prism is exactly using. It seems that it is using AVC, but I have no idea on how I can install/use AVC codec in Premiere. Below you can find the detailed information about the codec/format that Prism uses.
    I really appreciate any suggestion.
    Video
    ID                                       : 0
    Format                                   : AVC
    Format/Info                              : Advanced Video Codec
    Format profile                           : [email protected]
    Format settings, CABAC                   : Yes
    Format settings, ReFrames                : 3 frames
    Codec ID                                 : H264
    Duration                                 : 1mn 3s
    Bit rate                                 : 2 371 Kbps
    Width                                    : 1 920 pixels
    Height                                   : 1 080 pixels
    Display aspect ratio                     : 16:9
    Frame rate mode                          : Variable
    Frame rate                               : 23.976 fps
    Color space                              : YUV
    Chroma subsampling                       : 4:2:0
    Bit depth                                : 8 bits
    Scan type                                : Progressive
    Bits/(Pixel*Frame)                       : 0.048
    Stream size                              : 17.8 MiB (94%)
    Writing library                          : x264 core 125 r2209 68dfb7b

    Are you doing anything to the footage whilst in PrPro, or just looking to convert & smallify? Just curious ...
    Neil

  • Something causes video to skip with FMS if we are starting our player after we started a stream with

    Hi, I'm playing a live stream from FMS, published with FMLE. There is no problem with playback if we start the player first and then start the stream with FMLE. The video skipping problem occurs only if we start the stream with FMLE first and then our player.
    The versions:  FMLE 3.0.1.5963, FMIS 4.0.0, Flash Player 10.1.53.64, and codec h.264
    Is this a bug in FMS/FMLE, or am I doing something wrong?
    Regards, kissk

    Ideally the order of publish and play should not matter - whichever way, it should work fine. When you say "video skipping problem", what exactly do you mean - do you get only audio? Also, are you playing a live or a DVR stream?

  • How exactly does AMS sync a video stream with another, audio-only stream?

    I have the following situation: 2 instances of FMLE on one machine and another machine with AMS Starter.
    The instances of the FMLE are:
    1. Video H.264 + audio MP3
    2. Audio only MP3 - track 1
    The application on AMS is livepkgr/_definst_ and I am using HDS with a live event.
    I have an OSMF based player to consume the streams and to do audio switch using late binding audio.
    When I consume the video with the default audio, everything works just fine. The problem is that when I try to consume the video with the other audio track, the audio track has a 1-second delay compared to the video.
    I have tried to fix this by changing the configuration in FMLE, but it doesn't seem to work. So I really want to know how exactly AMS synchronizes different streams, in this case a video stream with a separate audio stream.

    First, what is the file format, and its specs. for the remote recorder?
    Though the rates of modern cameras and digital recorders SHOULD match 100%, in the real world they seldom do. This makes perfect syncing a labor-intensive process.
    One should do tests between the cameras and the recorders to find out the percentage of error, then adjust their audio to match first. As an example, one user in the PrPro forum tested his Panasonic cameras and his Zoom recorders, and found that the Zooms were off by a constant 0.04% (about 1.4 seconds of drift over an hour of footage). To correct this, he used Time Remapping (with Maintain Pitch checked) and applied that adjustment to all of the Zooms' files - perfect sync.
    As for the Waveform not displaying for your remote recorder, did you allow those files to completely Conform (creation of the CFA and PEK files, the latter is the Waveform Display)? See this article: http://forums.adobe.com/thread/726693?tstart=30
    Also, you should be able to increase the vertical zoom of your Audio Tracks, and hence any Clips on them, by hovering the Cursor over the junction between Tracks, in the Track Header, and when it turns into a = sign, with up/down arrows, click+drag. Then, you should be able to find commonality in the beginning, to sync up to. Tip: for this critical work, toggle the Snap feature OFF, with the S key.
    Good luck,
    Hunt

  • Strange behaviour of iPad Photos app/My Photo Stream with iOS 8.1 and Yosemite

    My requirement is surely simple and commonplace, yet is proving impossible to achieve thanks to Apple's weird design of the Photos app and the way My Photo Stream works.  I'm very technically savvy, but I'm quite prepared to admit I'm being stupid if someone can explain what I'm doing wrong!  I have an iPad Air running iOS 8.1 and a MacBook Pro Retina with OS X 10.10 (Yosemite).  I've enabled iCloud Drive on both.  iPhoto on the Mac is at v9.6.
    I have a selection of 109 photos from a recent holiday in a folder on my MacBook, which I would like in a folder on my iPad as a nice way to show them to friends.   iTunes may work but seems so incredibly clunky and old-fashioned I hoped I could do it over the network or via the cloud.  I would use (and often have used) Dropbox, which is reliable and logical but rather tedious as you have to save each photo one by one. 
    So, with a little research and playing around, I found that I should be able to get them to the iPad using iPhoto on the Mac with My Photo Stream enabled. However, it doesn't seem to work. The photos were all created yesterday, so they had yesterday's date (I'd removed the EXIF data that contained various earlier dates).
    I've just done the following test:
    I deleted all photos from the iPad (using the Photos app)
    On the Mac, I deleted all photos from iPhoto
    In iPhoto Preferences on the Mac, I turned on “My photo stream”, with the two sub-options also enabled
    I imported the 109 photos into iPhoto
    On the iPad, in My Photo Stream, I saw the photos gradually appear over the next few minutes
    A minute or two later, when no more seemed to be coming, I found that 103 of the original 109 were in My Photo Stream. So, strangely, 6 photos were missing
    Even stranger, in the “Photos” tab, under “Yesterday” 37 of them could be seen.  If I understand this correctly (which I may not), that means 37 of them had been copied from the My Photo Stream to the iPad itself.  Am I right?  Why only 37?
    On the iPad, I added a new Album called Summer 2014
    I selected all 103 photos from My Photo Stream to add to the album
    In the Summer 2014 album, I briefly see “103 Photos” at the bottom of the album, but this quickly changes to 66 as photos that briefly appeared now disappear, leaving only 66 left in the album
    Checking in Photos/Collections/Yesterday, I find there are now 66 photos (rather than 37 as there were earlier) - the same ones that are showing in my Summer 2014 album.
    If I attempt to manually add any of the missing photos to the album, nothing happens - the album refuses to allow any more than the 66 it has.
    In iPhoto on the Mac, if I click on the SHARED/iCloud entry on the left, I see all 109 photos, which I think means iPhoto has correctly shared them all, so I suspect the problem is with the iPad.
    Yesterday, when I was playing with this, the album allowed 68 photos.  I tried deleting one and adding one of the missing photos, but it wouldn't allow this either, so I don't think it's objecting to the number of photos so much as disliking particular ones.
    I've got 7.7GB free space on the iPad so I don't think it's a space issue.
    Can anyone help?  If it's a bug in iOS, iCloud or OS X, how can I report it, given that I'm out of my 3 months support for both my machines?
    Thank you!

    Well done lumacj! This is a good workaround. Forgive me if I offer a simplified version here (this works if the original photos are in Photo Stream):
    1. In "My photo stream", tap on select at top of screen
    2. Tap on each photo you want to send via WhatsApp (circle with "√" shows for each photo)
    3. Tap on square box with arrow at bottom of screen (Share) - NOT "Add To"
    4. Tap on "Save image(s)"
    5. Check that selected photos now appear on "Camera Roll"
    6. From WhatsApp, send photos as you would have normally

  • How can I do live streaming with a Mac Book Pro 2011, using a new model of Sony HD camcorder which does not have firewire out/input? it comes only with a component video output, USB, HDMI and composite RCA output?

    I need to do live streaming with a MacBook Pro 2011, using a new model of Sony HD camcorder (http://store.sony.co...ber=HDRAX2000/H). This camcorder model does not have FireWire output/input; it comes only with component video, USB, HDMI and composite A/V outputs.
    I wonder how I can connect this camcorder to the FireWire port of my laptop? Browsing the internet, I found that Grass Valley produces this converter: http://www.amazon.co...=A17MC6HOH9AVE6 ..but I am not sure - not even after checking the Amazon reviews - whether this device will send the video signal over FireWire to my laptop so I can live stream properly. Anyone in this forum who could help me, please?
    Thanx

    I can't broadcast with the built-in iSight webcam... how would I zoom in or out? Or how would I pan? I've seen people doing it walking around with their laptops, but that's not an option for me. There's nothing wrong with my USB ports, but that's not an option for streaming video either, because as far as I know you can't use USB for live video input on Apple operating systems. You can certainly plug a video or photo camera in through USB, but only as a drive to transfer data, not as a live video source. FireWire is an older interface developed by Apple through which you can connect all sorts of cameras to Apple computers... unfortunately my new Sony HDR-AX2000 camcorder doesn't have FireWire output...
    thanx

  • I have two different iCloud accounts. I can't sign into photo stream with my personal account and have tried deleting from my iPhone first, then the MacBook Pro. Still won't let me sign in with the personal account. Please help.

    I have two different iCloud accounts - business and personal. I can't sign into Photo Stream with my personal account because it says my business account is my primary account. They are separate but equal accounts. I have tried deleting the iCloud account from my iPhone, then my MacBook Pro, and signing in again on both devices. The iPhone says that my primary account is my personal account (which is good), but my MacBook Pro still will not sign into that account so I can use Photo Stream.
    Any suggestions? This iCloud stuff is getting annoying.

    you have a ps cs4 license for mac and you have some way of finding your serial number.
    if yes, download and install the installation file.  if you already have the installation file, what is its name and file extension?  (eg, if it's one file, it should be a dmg file.)
    if you need the installation file,
    Downloads available:
    Suites and Programs:  CC 2014 | CC | CS6 | CS5.5 | CS5 | CS4 | CS3
    Acrobat:  XI, X | 9,8 | 9 standard
    Premiere Elements:  13 | 12 | 11, 10 | 9, 8, 7 win | 8 mac | 7 mac
    Photoshop Elements:  13 |12 | 11, 10 | 9,8,7 win | 8 mac | 7 mac
    Lightroom:  5.6| 5 | 4 | 3
    Captivate:  8 | 7 | 6 | 5
    Contribute:  CS5 | CS4, CS3
    Download and installation help for Adobe links
    Download and installation help for Prodesigntools links are listed on most linked pages.  They are critical; especially steps 1, 2 and 3.  If you click a link that does not have those steps listed, open a second window using the Lightroom 3 link to see those 'Important Instructions'.

  • I share a photo stream with a family member. Is there a way to make it so we have separate photo stream but still use the same apple id?

    I share a photo stream with a family member. Is there a way to make it so we have separate photo streams but still use the same Apple ID?

    You'll want a separate iCloud account for each user in addition to the main Apple ID used for purchases.
    Let's say you have a family of four (husband, wife, and two children). Here is what you will need:
    Main Apple ID for shared purchases in iTunes (everybody will be using this)
    iCloud (Apple ID) account for husband
    iCloud (Apple ID) account for wife
    iCloud (Apple ID) account for child 1
    iCloud (Apple ID) account for child 2
    Yes, you can have both an Apple ID and iCloud account setup on the same device. On each iOS device, setup the main Apple ID account for the following services:
    iCloud (Photo Stream and Backup are the only items selected here).
    Music Home Sharing
    Video Home Sharing
    Photo Stream
    Additionally, on each iOS device, add a new iCloud account (Under Mail, Contacts, Calendar) that is unique to each user for the following services:
    Mail (if using @me.com)
    Contacts
    Calendars
    Reminders
    Bookmarks
    Notes
    Find My iPhone
    Messages
    Find Friends
    FaceTime
    On your individual Macs (or Mac accounts), use the main Apple ID in iTunes. Then, use the unique iCloud account in/for iCloud.
    Now that everybody is using separate iCloud accounts for contacts, calendars, and other things, it will all be kept separate.
    The only thing that is now 'shared' would be iTunes content (apps and other purchases), backups via iCloud (if enabled), and Photo Stream (if enabled).

  • HT204053 i have 2 apple ids and one is from years ago when icloud wasnt a thought but now somehow it became my primary apple id and i cant photo stream with the apple id i use. Even if im signed into my old one the photo stream doesnt automatically go to

    I have 2 Apple IDs; one is from years ago, before iCloud was even a thought, but somehow it has become my primary Apple ID, and I can't use Photo Stream with the Apple ID I actually use. Even when I'm signed into my old one, the photo stream doesn't automatically go to my Mac. I'm not exactly technology savvy, but I'd love to figure this out.

    I have also noticed the same "problem".  I do not fully understand the impact of using the same ID on multiple computers and devices if I want to keep them all in sync.

  • How can I read a binary file stream with many data type, as with AcqKnowledge physio binary data file?

    I would like to read in and write physiological data files which were saved by BioPac's AcqKnowledge 3.8.1 software, in conjunction with their MP150 acquisition system. To start with, I'd like to write a converter from different physio-data file formats into the AcqKnowledge binary file format for versions 3.5 to 3.7 (including 3.7.3). It will allow us to read different file formats into an analysis package which can only read files written by AcqKnowledge versions 3.5 to 3.7 (including 3.7.3).
    I attempted to write a reader following the Application Note AS156 entitled "AcqKnowledge File Format for PC with Windows" (see http://biopac.com/AppNotes/app156FileFormat/FileFormat.htm). Note the link for the Mac file format is very instructive too - it is presented in a different style and might make sense to some people, with a C-library-like look (http://biopac.com/AppNotes/app155macffmt/macff.htm).
    I guess the problem I had was that I could not manage to read all the different byte data streams with File.vi. This is easy in C, but I did not get very far in LabVIEW 7.0. Also, I was a little unsure which LabVIEW data types correspond to int, char, short, long, double, byte, RGB and Rect. And, since it is for PC, I am also assuming the data to be written as "little endian" integers, and thus I also used the byte-swap VI.
    Two sample *.acq binary files are attached to this post. Demo.acq is for versions 3.7-3.7.2, while SCR_EKGtest1b.acq was recorded and saved with AcqKnowledge 3.8.1, whose version number is 41.
    I would be grateful if someone could explain how to handle such a binary file stream with LabVIEW and send an example to illustrate it.
    Many thanks in advance for your help.
    Donat-Pierre
    Attachments:
    Demo.acq ‏248 KB
    SCR_EKG_test1b.acq ‏97 KB

    Reading doubles is also straightforward: just use a double float wired to the type cast node, after reversing the string (endian conversion).
    See the attached example.
    The measure of skin thickness is based on OCT (optical coherence tomography = interferometry): an optical fiber system sends and receives light to/from the skin at a distance of a few centimeters. A profile of the skin structure is then computed from the optical signal.
    CC
    Chilly Charly    (aka CC)
             E-List Master - Kudos glutton - Press the yellow button on the left...        
    Attachments:
    Read_AK_time_info.vi.zip ‏9 KB
