Creating multiple live stream graphs...

I want to be able to add live charts with dynamic data (so the values constantly change). I can easily add charts to the series now, but the problem is that if I do something like myNewLiveSeries.dataProvider = new ArrayCollection(), the array is not Bindable, so when I then call LiveSeries.dataProvider.push(newDataPoint), the chart does nothing. How do I overcome this? Any solutions?
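For what it's worth, the usual pattern (a sketch, Flex-style; liveData, init, and onNewDataPoint are hypothetical names) is to keep a [Bindable] ArrayCollection as the dataProvider and call addItem() on it instead of push() on the raw array, because addItem() dispatches the collectionChange event the chart listens for:

```actionscript
import mx.collections.ArrayCollection;

// Sketch: the ArrayCollection (not its raw source array) is what the chart
// binds to; addItem() fires a collectionChange event, so the chart redraws.
[Bindable]
private var liveData:ArrayCollection = new ArrayCollection();

private function init():void {
    myNewLiveSeries.dataProvider = liveData;
}

private function onNewDataPoint(newDataPoint:Object):void {
    liveData.addItem(newDataPoint);      // chart updates automatically
    // liveData.source.push(newDataPoint) would bypass the event dispatch;
    // after editing the raw array directly you would have to call
    // liveData.refresh() to force the chart to notice.
}
```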

Adam's right! However, editing the QuickTime Broadcaster file is a little tricky, I found. Using QuickTime Pro I am able to set new "in" and "out" points and then choose "Save As" as a self-contained movie. This creates a new edited file pretty quickly. Choosing "Export" will trigger a complete re-rendering, taking a long time. The only issue with the "Save As" method is that the resulting edited file will not have its hint track enabled by default. This means it will not stream from QuickTime Streaming Server or Darwin Streaming Server; you will only see a portion of the first frame of video. To correct this, I reopen the edited file in QuickTime Pro and choose Export (Movie to Hinted Movie). This creates a third file, this time with a valid hint track. This third file can then be uploaded to the server and will stream fine. The whole process takes only a few minutes for an hour of video, so it's manageable. Much better than doing a transcode.

Similar Messages

  • How do I create multiple live USBs of Operating systems to boot from on one flashdrive?

    I am attempting to create multiple live USBs within one flash drive so that I can boot various operating systems without having to buy multiple USB sticks. I am currently on Mac OS X Snow Leopard 10.6.8, using the Terminal provided by the OS.
    I would like to try Debian, Kali, Precise Puppy, and Sabayon, with one space left empty for whatever Linux OS I would like to try next. I have partitioned an 8 GB flash drive into the following sections:
    Intended OS          Space allocated
    1) Debian            1.32 GB
    2) Kali              2.61 GB
    3) Precise Puppy     163.7 MB
    4) Sabayon           1.77 GB
    5) TBD               2.14 GB
    I have figured out how to change the .iso files of the operating systems into .img files with the following commands:
    hdiutil convert -format UDRW /path/to/input.iso -o /path/to/output.img
    This creates a .img.dmg file; the extension is fixed by the next command. (Please note: I am a complete beginner at programming and would greatly appreciate any help. Thank you!)
    mv /path/to/output/file.img{.dmg,}
    This is where I started to get confused:
    I entered diskutil list to find the location of the partitions on my flashdrive, and this is the relevant section:
    /dev/disk1
       #:                       TYPE NAME                    SIZE       IDENTIFIER
       0:     FDisk_partition_scheme                        *8.0 GB     disk1
       1:                 DOS_FAT_32 DEBIAN               1.3 GB     disk1s1
       2:                 DOS_FAT_32 KALI                    2.6 GB     disk1s2
       3:                 DOS_FAT_32 PRECISE            163.7 MB   disk1s3
       4:                 DOS_FAT_32 SABAYON           1.8 GB     disk1s5
       5:                 DOS_FAT_32 TBD                     2.1 GB     disk1s6
    This is the code I attempted to use and the subsequent result:
    charles-burtons-mac-mini:~ charlesburton$ sudo dd if=/Users/charlesburton/Desktop/debian-live-7.2-i386-gnome-desktop.img of=/dev/disk1s1 bs=1m
    dd: /dev/disk1s1: Resource busy
    *At this point I went to the Disk Utility GUI and unmounted only the DEBIAN partition; now back to the Terminal.*
    charles-burtons-mac-mini:~ charlesburton$ sudo dd if=/Users/charlesburton/Desktop/debian-live-7.2-i386-gnome-desktop.img of=/dev/disk1s1 bs=1m
    dd: /dev/disk1s1: end of device
    1259+0 records in
    1258+1 records out
    1320005632 bytes transferred in 1011.966891 secs (1304396 bytes/sec)
    Could someone please explain why dd reported a "+1" in the record counts, and how I can make this work properly?
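A sketch of what the dd numbers mean, using only the figures reported above: with bs=1m, "1259+0 records in" says dd read 1259 full 1 MiB blocks from the image, while "1258+1 records out" says only 1258 full blocks plus one partial block fit before "end of device". In other words, the image is slightly larger than the 1.3 GB partition, so the write was truncated:

```shell
#!/bin/sh
# Interpreting dd's output: the image supplied at least 1259 full 1 MiB
# blocks, but the partition held only 1258 full blocks plus a partial one.
BS=$((1024 * 1024))            # bs=1m means 1 MiB blocks
WRITTEN=1320005632             # dd's "bytes transferred" = partition capacity
FULL_OUT=1258                  # full blocks written ("1258+1 records out")
FULL_IN=1259                   # full blocks read ("1259+0 records in")

PARTIAL=$((WRITTEN - FULL_OUT * BS))
echo "partial final block: ${PARTIAL} bytes"

# The image needs more space than the partition offers, so dd cannot finish:
test $((FULL_IN * BS)) -gt "$WRITTEN" && echo "image larger than partition"
```

So the fix is to enlarge the Debian partition (or shrink the image) so the whole image fits before running dd again.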

    I would like to know the answer to this as well. I tried renaming the 3D version by adding "3D" after it, but it is still not a separate movie in iTunes.
    It's putting them in the same folder.

  • Listen to multiple Live Streams at once

    I am developing an application where I need musicians to play together and I want to have an audience listen to it. I have created my publisher clients and they can create the streams, but now I want to take those multiple streams and combine them so that you can hear the guitar, singer, etc. all together.
    Is this possible, do I need to tell the server side script to combine them and create a new stream? Any help would be appreciated!
    Thanks

    I haven't used it myself, but the info on the ejamming site indicates that they're using a P2P protocol, and I suspect it's UDP-based. UDP is more suitable for a use case such as this, as UDP is a lossy protocol. Additionally, the P2P nature of ejamming's application takes the trip to the server out of the picture, as the data doesn't need to bounce off the server. This reduces the time it takes to get the data from one client to another.
    FMS, on the other hand, uses TCP/IP, which is a lossless protocol. Any data that doesn't make it to the server on the first trip (and in the proper sequence) has to be re-sent, which adds to latency on the stream. That, combined with encoding and transmission latency, really removes reliability from an application that depends on consistently low latency. That's not to say it isn't possible; it just isn't reliable.
    All that said, you might consider testing with RTMFP. RTMFP is a UDP based P2P protocol supported by Flashplayer 10 (http://labs.adobe.com/technologies/stratus/). RTMFP requires that you use a rendezvous service, and currently, the only such service is Adobe's stratus service. The word is that future versions of FMS will support RTMFP rendezvous services, but there is no official word on whether "future versions" means the next version, or a version further down the road. Of course, all of this introduces a few new considerations:
    1. Each musician will need a lot more upload bandwidth, as each subscriber would need to connect to the publisher to receive the stream. For example, let's say you have 5 musicians. That means each musician needs to serve his/her stream to each of the other 4 musicians (RTMFP does not support multicast/swarming).
    2. This approach doesn't satisfy the need to play the streams to a listening audience (RTMFP is not suitable for serving streams to a large audience). You'd still need to publish a stream from each musician to FMS over RTMP, come up with some sort of timecode logic to track latency, and then try to sync those streams on the playback client.
    I tend to shy away from the term "impossible" (that word tends to lead to foot-in-mouth), but in this case, it seems to me that an FMS-centric solution would be less than reliable.
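The upload cost in point 1 is easy to quantify (a sketch; the 5-musician mesh comes from the example above, while the 500 kbps per-stream bitrate is an assumption):

```shell
#!/bin/sh
# RTMFP full-mesh upload: without multicast/swarming, each publisher sends
# a separate copy of their stream to every other peer.
MUSICIANS=5            # from the example above
STREAM_KBPS=500        # hypothetical per-stream bitrate
UPLOAD_KBPS=$(( (MUSICIANS - 1) * STREAM_KBPS ))
echo "each musician uploads ${UPLOAD_KBPS} kbps"
```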

  • Multiple Live Streaming Video in flash?

    Hello Everyone,
    I have this project in mind which I want to create a video
    montage of different live streaming videos from a few different
    users with their webcam in different locations. There will be the
    website which the participants will be able to view the live video
    montage in real time. However, I am unsure of where to start with
    (live video & server). Therefore I will be really thrilled if
    someone knows an example similar to this or someone who can offer
    me a lead in this topic!!
    I have knowledge of Flash and a bit of ActionScript.
    Much THANKS!!
    jen

    The 2-way video chat module example in the sample chapter
    from "Learning Flash Media Server 3" should give you some ideas.
    Obviously, you only want the 1-way (user to server) part of
    that, as you'll be displaying the video montage I guess in a web
    page as a set of regular flash streams.
    Main page
    http://www.adobe.com/devnet/flashmediaserver/articles/ora_learning_fms3.html
    Sample chapter (PDF) :
    http://www.adobe.com/devnet/flashmediaserver/articles/ora_learning_fms3/learning_fms3_ch05.pdf
    It looks to be an O'Reilly book, so you may be able to get
    the whole thing online, or get it delivered from O'Reilly or Amazon
    in a few days.

  • FMLE to stream multiple audio streams

    Is it possible to stream multiple audio streams together with one video stream using FMLE?
    I need to create a live streaming application that should be available in many languages. My idea is to send 1 video stream + n audio streams.
    Any ideas are appreciated.

    FMLE is not able to stream multiple audio tracks.
    Perhaps a third-party encoder is able to, but I think playback is also a problem, because Adobe's plugin does not support multiple audio tracks either.

  • Creating a trendline in a LabVIEW graph

    Hi,
    does anyone know whether it is possible to create a trendline in a LabVIEW graph?
    Kind of like in Excel you can create a trendline, but I want to do it in LabVIEW. Is it possible?

    You can use the curve fitting functions to find a curve to fit the graph and then add it as a plot to the graph. Examples ship with LabVIEW on using the fit functions and on creating multiple plots for graphs.

  • Remove Lag from Live Streams

    I am publishing 2 live streams from a computer with 2 video capture cards in it, and I get a lag every 30 seconds or so on the subscribers' side. I have tried adjusting the camera quality and setMode properties, but the lag persists both inside and outside the LAN. Is there a way to create a buffer on the server, or to adjust the way the live stream is received on the subscribers' side, so there is no noticeable lag? I saw something about editing Application.xml to adjust the queue and a suggested bitrate, but I'm not sure whether that is applicable; here is the link:
    http://www.adobe.com/devnet/flashmediaserver/articles/dynstream_live.html
    Here is my setup:
    The publishing computer:
    2 PCI-e x1 cards, one takes S-Video (480i) and the other DVI (720p)
    Windows 7 64bit
    Intel i7
    6 GB RAM
    GB NIC and Switch
    From the switch it is one hop to a GB router and out to a 10 MB pipe which leads to our datacenter 30 miles away. The swf on this side just gets the 2 cameras, sets the quality to (0,80) and mode to (640,480,25,false) (I have played with these settings a little) and creates 2 live streams on the FMS.
    The FMS:
    I am running Flash Media Interactive Server 3.5 on my own server with two 3.6 Dual Core Xeon processors and 4 GB RAM. This server resides in a Datacenter and has a 100 MB burstable pipe. From the FMS administration console I am barely using 4MB total bandwidth, 1% CPU usage, 2% RAM.
    The subscribing computer:
    I have used many different types of hardwired PC's within the same LAN and outside the LAN, results are the same.
    The swf on this side just creates 2 new video instances and attaches the 2 netstreams to them. They are placed side by side and the height and width of these videos are undefined.
    Does anyone know where this lag could be coming from, is there any settings I can adjust to improve performance while maintaining a minimum S-Video quality level?
    Thanks

    Hi,
    Thanks for the detailed information in your first post.
    Coming to the latency... it is affected by various factors: the stream bitrate, FMS server-side settings, the subscriber's client-side settings, and on top of these the network bandwidth.
    I need to know the NetStream.bufferTime set in the subscriber's application. Try setting it to 3 when you issue the play; later you can increase it to 8 or so to stabilize the playback.
    Also, can you try to subscribe to a single stream and check whether your client bandwidth is able to play back a single stream?
    The link which you mentioned is a good reference for tuning your server-side settings.
    Regards,
    Janaki L
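Janaki's bufferTime suggestion looks roughly like this on the subscriber side (a sketch; nc is assumed to be an already-connected NetConnection, video an existing Video instance, and "livestream1" a hypothetical stream name):

```actionscript
// Subscriber-side sketch: start with a small buffer for low startup
// latency, then enlarge it once the buffer fills to smooth playback.
var ns:NetStream = new NetStream(nc);   // nc: a connected NetConnection
ns.bufferTime = 3;                      // small initial buffer, as suggested
ns.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
    if (e.info.code == "NetStream.Buffer.Full") {
        ns.bufferTime = 8;              // larger steady-state buffer
    }
});
ns.play("livestream1");                 // hypothetical stream name
video.attachNetStream(ns);              // video: a Video instance on stage
```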

  • How to secure publishing of live streams

    Hi,
    I have installed Flash Media Server on our server.  When I load up the application home page, I see the demo video of a train.
    Then I click on the "Interactive" tab on the right hand side.  I was SHOCKED to see that I can create a live stream from my local camera without any  credentials at all.  Anyone who visits this webpage can publish a live video stream on our Flash Media Server?
    Please tell me there is a simple configuration change I can make to close up this gaping security hole.
    Thanks

    jephperro,
    The advised approach for this is SWF Verification for your general content, to ensure that unintended SWFs cannot perform undesired actions on your FMS installation. If you make use of live publishing, then the FMLE authentication plugin is a necessary component as well.
    All this information will be forthcoming in the FMS hardening guide - but for now I can help detail for you the solution you'll need to make sure that only your intended targets are using your FMS installation. Tell me a little about your setup, or email me off-list to do so @ [email protected], and we'll get you sorted.
    Asa

  • Multiple Audio Streams?

    I'm wondering if there is an application that will allow the computer to route multiple audio streams out of my Mac Pro, or if anyone knows how to create an easily switchable audio routing panel rather than going through Preferences.
    E.g. I'm playing iTunes (or Hulu or Netflix) through my audio line-out, through my Samsung TV, through a speaker system, to watch online content on the TV, and I want to work on files on the computer at the same time, say editing video in FCP and routing that audio through USB speakers at the desk.
    Being able to create multiple audio stream routings would be a HUGE asset for someone like me. Even if it's not possible, I would certainly love it if Apple would incorporate a quick-switch selector in the pulldown menu for audio volume...
    Thoughts? Anyone?
    Thanks!

    Thanks, the "auxiliary effect" totally helped me stream iTunes music to my iSight with Soundflower, giving webcam streaming my system audio, which I couldn't do before without hearing anything... ugh, so complicated! But this works!! Sometimes Apple is so ridiculous, and I know there are more features they could build in.
    Thanks!

  • Mediafilesegmenter - HTTP Live Streaming

    I am confused, it can't be this difficult to create HTTP Live Streams.
    I have 10.7 on my Mac Pro; it has variantplaylistcreator and mediafilesegmenter installed in /usr/bin. variantplaylistcreator works, mediafilesegmenter crashes. My MacBook Pro has 10.8; there the tools are absent. I luckily have a MacBook with 10.6; on this variantplaylistcreator crashes and mediafilesegmenter works.
    The "Prepare for HTTP Live Streaming" job option in Compressor 4 fails dismally: it won't work with existing files and doesn't seem to want to create anything when you start from scratch with a video.
    At the moment I run a script on the MacBook, followed by one on the Mac Pro, to create the necessary files from the already-created H.264 videos.
    I have an active iOS Developer Program membership, but the "latest" tools are nowhere to be found.
    I cannot believe it is this difficult to deliver video, in the Apple approved fashion, for the iOS platform.
    Anyone got any better experiences?

    This is the iPhone (not iPad), user-to-user tech support (not developers) forum. You're much more likely to get help in the dev forum:
    https://devforums.apple.com/community/iphone

  • Use HTTP Dynamic Streaming (HDS) and HTTP Live Streaming (HLS) to serve live streams to clients over HTTP

    I have created a live stream of a video and it gets stored in the live folder.
    Now I need to use HTTP Dynamic Streaming (HDS) and HTTP Live Streaming (HLS) to serve live streams to clients over HTTP, publishing the streams to the HTTP Live Packager service on Flash Media Server.
    What steps do I need to follow to do that?

    You need to generate a manifest file using the Configurator tool and place it under the webroot directory. The tool is located at:
    C:\Program Files\Adobe\Flash Media Server 4.5\tools\f4mconfig\configurator
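For reference, with the default livepkgr application the playback URLs typically follow this pattern (a sketch; the server name and the stream/event names are placeholders, and the exact paths depend on your web server configuration):

```
HDS (Flash clients):
http://yourserver/hds-live/livepkgr/_definst_/liveevent/livestream.f4m

HLS (Apple clients):
http://yourserver/hls-live/livepkgr/_definst_/liveevent/livestream.m3u8
```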

  • Synchronous live streams

    Wondering if FMS 3.5 is capable of turning multiple live streams from webcams into a single stream to go out to client machines? I'm not interested in concatenating the streams, but rather in overlapping them.
    If this is not available through FMS 3.5, are there other tools available by which this could be achieved programmatically?
    Thanks.

    I would work backward in this case. Take, for instance, a home DSL connection with 1.5 Mbps down. Given that constraint, you would need to aggregate 5 webcam streams over it. This will compromise the bitrate of the content (i.e. visual quality), as each stream would get less than 300 kbps once you account for TCP overhead. This means the webcams need to encode the content to meet these requirements.
    Now by doing so, you have 5 separate streams coming from the webcams, which you now want to turn into a "single one". You can multiplex these 5 streams over a single NetConnection (or 5 streams over 5 different NetConnections), but the network footprint will not be any smaller, because those streams have already been encoded at that bitrate.
    So the moral of the story is:  find a good encoder.
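The budget in the reply above works out like this (a sketch; the 1.5 Mbps downlink and 5 streams come from the reply, while the ~10% overhead factor is an assumption):

```shell
#!/bin/sh
# Budgeting 5 webcam streams over a 1.5 Mbps downlink: each stream's share,
# minus an assumed ~10% TCP/RTMP overhead, bounds the encoder bitrate.
DOWN_KBPS=1500
STREAMS=5
SHARE_KBPS=$(( DOWN_KBPS / STREAMS ))       # 300 kbps raw share per stream
BUDGET_KBPS=$(( SHARE_KBPS * 90 / 100 ))    # ~270 kbps after overhead
echo "encode each webcam at or below ${BUDGET_KBPS} kbps"
```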

  • How to merge multiple live audio streams into a single stream in FMS?

    I need to merge multiple live audio streams into a single stream so that I can pass this stream as input to VoIP through a softphone.
    For this I tried the following approach:
    I created a new stream on FMS in onAppStart and recorded the live streams (sent through the microphone) into that new stream.
    Below is the code :
    application.onAppStart = function()
    {
        application.myStream = Stream.get("foo");
        application.myStream.record();
    };
    application.onPublish = function(client, stream)
    {
        var streamName = stream.name;
        application.myStream.play(streamName, -2, -1);
    };
    The problem is that Stream.play plays only one live stream at a time. As soon as the 2nd live stream is sent to FMS, Stream.play stops playing the previous live stream, so only the latest live stream gets recorded, whereas I need to record all the live streams into the new stream simultaneously.
    Any pointers regarding this are welcome.
    Thanks

    Well, I tried this once for one of my scripts, and the final conclusion is that it's not possible to combine two streams into one. How would you time/encode the two voices? There is no known solution to this in Flash. If you continue on despite me and find a solution, please post it so we can explain it to the rest of the world.

  • Creating an audio-only stream from a live stream

    I am using Flash Media Live Encoder (3.2) to send multiple bitrate streams to an instance of Adobe Media Server 5. In addition to the video streams, I am trying to generate an audio-only stream.
    I am using the default 'livepkgr' application. The input stream contains H.264 video + AAC audio.
    The stream name that I set in FMLE is set to this:
    livestream%i?adbe-live-event=liveevent&adbe-audio-stream-name=livestream1_audio_only&adbe-audio-stream-src=livestream1
    I created a variant playlist to include the audio-only stream. It looks like this:
    #EXTM3U
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=150000
    /hls-live/livepkgr/_definst_/liveevent/livestream1.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=500000
    /hls-live/livepkgr/_definst_/liveevent/livestream2.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=700000
    /hls-live/livepkgr/_definst_/liveevent/livestream3.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=64000,CODECS="mp4a.40.2"
    /hls-live/livepkgr/_definst_/liveevent/livestream1_audio_only.m3u8
    However, when I retrieve the audio-only playlist it contains a list of .ts files. If it was working correctly I would expect there to be a list of .aac files. Here's an extract from the playlist (livestream1_audio_only.m3u8):
    #EXTM3U
    #EXT-X-MEDIA-SEQUENCE:1927
    #EXT-X-ALLOW-CACHE:NO
    #EXT-X-VERSION:2
    #EXT-X-TARGETDURATION:8
    #EXTINF:8,
    ../../../../hls-live/streams/livepkgr/events/_definst_/liveevent/livestream1_audio_onlyNum1927.ts
    #EXTINF:8,
    ../../../../hls-live/streams/livepkgr/events/_definst_/liveevent/livestream1_audio_onlyNum1928.ts
    #EXTINF:8,
    ../../../../hls-live/streams/livepkgr/events/_definst_/liveevent/livestream1_audio_onlyNum1929.ts
    #EXTINF:8,
    ../../../../hls-live/streams/livepkgr/events/_definst_/liveevent/livestream1_audio_onlyNum1930.ts
    Any ideas why it doesn't seem to be working?
    Thanks

    Hi
    No !
    You first have to back-engineer the DVD to something the Mac can use, e.g. StreamingDV
    (I use Roxio Toast™ to do this), then convert this via QuickTime to .aiff 16-bit 44.1 kHz,
    then burn this onto a CD via e.g. iTunes.
    An analog copy from a DVD player to the Mac line-in, into an audio editor such as Audacity (free on the internet), and then using that for the CD is easier and about as good. (I can't tell the difference.)
    Yours Bengt W

  • How do I use multiple cameras to live stream through FME?

    I am looking to live stream pool tournaments from multiple angles but don't know what software or hardware I might need. Anybody have any good how to advice or links they might share? I stream through Ustream.tv if that makes a difference. Should I look for something different? Thanks

    I am working on getting just the counter working by using
    the program posted previously, and I am running into issues. Periodically I get
    the error:
    Error -200141
    occurred at DAQmx Read (Counter DBL 1Chan 1Samp).vi
    Possible reason(s):
    Data was overwritten
    before it could be read by the system.
    If Data Transfer
    Mechanism is Interrupts, try using DMA. Otherwise, divide the input signal
    before taking the measurement.
    It seems to work better if I use cascaded counters, but I need timer 0 for
    analog channels when I run this code along with the program for the other
    measurements.
    I have tried averaging, and selecting different values for the millisecond
    timer, and these did not seem to have an effect.
    I tried different DAQms configurations and "Counter DBL 1Samp" seemed
    to work the best.
    The program will work for a while and then it will give me the above error
    message.
    If I use counter 0 as a cascaded counter input, the program runs fine. If I run
    this with other analog channels, it errors out because the analog channels use
    counter 0.
    If I use counter 1 as a cascaded counter input, it seems to work better than a
    single channel, but it will still error out with the above error.
    If I use only counter 1, I get the error above even faster.
    Also, none of the
    configurations give measurements outside the While Loop.
    The only place I can add a speed dial for the front panel is within the While
    Loop.
    Is there some way to get the signal to continuously send out of the while loop?
    I thought if I could get the signal out of the while loop, I could condition it
    anyway I wanted without the program erroring out.
    Any suggestions would be much appreciated.
    Thank you.
    Attachments:
    Counter_error.jpg 45 KB
