Dynamic Bitrate Switching on Live Stream

I have FMIS 3.5. I've installed it with pretty much all the default values. I haven't changed any of the settings in either the LIVE or the VOD applications.
Dynamic bitrate switching is working well for VOD, but not working at all for LIVE streams. Doing a regular bandwidth detection on both the LIVE and the VOD applications gives similar, high bandwidth results. However, on the LIVE application NetStreamInfo.maxBytesPerSecond shows a very low bandwidth capability of around maxBytesPerSecond = 19016, whereas for VOD it's achieving around 637110. I can play a single high-quality LIVE stream smoothly without any errors.
I don't know if this is relevant, but I'm occasionally getting error messages in the log of the live application saying: "Dropping application (live/_definst_) message. Clients not allowed to broadcast message." These messages aren't consistent, and don't coincide with attempts to use bitrate switching.
I have tried downloading the Adobe sample StreamSwitching.fla and it won't play the LIVE streams at all. Using the open-source Longtail Video player, it always defaults to the lowest stream. Here is an example: http://www.ltscotland.org.uk/testbed/live/livestream2.asp
Can anyone suggest what the problem might be here?  And any possible solutions?

Thanks.
I have read that article. Based on that article, NetStreamInfo.maxBytesPerSecond is not an accurate measurement to base dynamic switching on, yet that seems to be the basis of the bitrate switching in both the Longtail player and the Adobe examples that I have tried. The article suggests using the dropped-frames property, in conjunction with bufferLength, to determine whether switching is necessary. Unfortunately I can't seem to find a player online which handles this successfully. That said, I can't believe I'm the only person trying to implement dynamic bitrate switching for live streams, so surely there are some players out there which can do this successfully? If anyone knows of any code available which does this, I would appreciate knowing where! The examples provided by Adobe (https://www.adobe.com/cfusion/entitlement/index.cfm?e=fms35) unfortunately don't work either.
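For illustration, here is a rough client-side sketch of the dropped-frames/bufferLength approach that article describes. It assumes a NetStream (ns) that is already playing one of a set of live renditions published to FMS; the stream names, thresholds and polling interval below are made up and would need tuning:

    import flash.events.TimerEvent;
    import flash.net.NetStream;
    import flash.net.NetStreamPlayOptions;
    import flash.net.NetStreamPlayTransitions;
    import flash.utils.Timer;

    var ns:NetStream;                    // assumed: already connected and playing streams[currentIndex]
    var streams:Array = ["livestream_500", "livestream_1000", "livestream_1500"]; // low -> high (names made up)
    var currentIndex:int = streams.length - 1;
    var lastDropped:Number = 0;

    // Poll droppedFrames and bufferLength instead of trusting maxBytesPerSecond.
    var monitor:Timer = new Timer(2000);
    monitor.addEventListener(TimerEvent.TIMER, checkQuality);
    monitor.start();

    function checkQuality(e:TimerEvent):void {
        var droppedDelta:Number = ns.info.droppedFrames - lastDropped;
        lastDropped = ns.info.droppedFrames;
        // If frames are being dropped or the buffer is running dry, step down one rendition.
        if ((droppedDelta > 10 || ns.bufferLength < 2) && currentIndex > 0) {
            switchTo(currentIndex - 1);
        }
    }

    function switchTo(index:int):void {
        var options:NetStreamPlayOptions = new NetStreamPlayOptions();
        options.oldStreamName = streams[currentIndex];
        options.streamName = streams[index];
        options.transition = NetStreamPlayTransitions.SWITCH;
        options.start = -1;              // live stream
        ns.play2(options);               // requires FMS 3.5+ and Flash Player 10
        currentIndex = index;
    }

A corresponding switch-up test (few dropped frames and a healthy buffer over some window) would go in the same timer handler.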

Similar Messages

  • Flash Media Player which handles bitrate switching for live streams?

    Hello. I've got a very short timescale to find a solution for displaying live streams with bitrate switching. Does anyone
    know of any open-source players which can do this effectively? Or do the built-in components in CS4 handle this OK?

    Thanks.
    I have read that article. Based on that article, NetStreamInfo.maxBytesPerSecond is not an accurate measurement to base dynamic switching on, yet that seems to be the basis of the bitrate switching in both the Longtail player and the Adobe examples that I have tried. The article suggests using the dropped-frames property, in conjunction with bufferLength, to determine whether switching is necessary. Unfortunately I can't seem to find a player online which handles this successfully. That said, I can't believe I'm the only person trying to implement dynamic bitrate switching for live streams, so surely there are some players out there which can do this successfully? If anyone knows of any code available which does this, I would appreciate knowing where! The examples provided by Adobe (https://www.adobe.com/cfusion/entitlement/index.cfm?e=fms35) unfortunately don't work either.

  • Only 1st RTMP bitrate for HTTP Live Streaming works

    Using this on AWS with CloudFront/CloudFormation.
    When I create a new CloudFormation stack, my first RTMP publish works, e.g. I publish at 500 kbps.
    Then if I stop and send a single bitrate at, say, 800 kbps, it does not work, i.e. OSMF keeps buffering.
    I SSH into my EC2 instance and restart the FMS server; it makes no difference. When I republish at 500 kbps, it works again.
    Then I create a fresh new CloudFormation stack and send 800 kbps this time, and it works fine.
    So my conclusion is that I can only send the one known bitrate and cannot change to another (but I can change resolution).
    Something must be wrong here. What exactly is happening?

    Hi,
    When you first publish at 500 Kbps and then superimpose a publish at 800 Kbps, it may not allow you to do so, since on republishing it appends to the previous streams, and that might not work with streams of two different bitrates. My suggestion is to try publishing in record mode. This will clear the old streams and should work like a dream.
    To publish in record mode, use this publishing string:
    livestream?adbe-live-event=liveevent&adbe-record-mode=record
    Thanks,
    Shiven
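    For what it's worth, a minimal ActionScript publishing sketch using that string (the server address is a placeholder; with FMLE you would paste the string into the Stream field instead):
    import flash.events.NetStatusEvent;
    import flash.media.Camera;
    import flash.media.Microphone;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, function (e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.attachCamera(Camera.getCamera());
            ns.attachAudio(Microphone.getMicrophone());
            // Record mode asks livepkgr to clear the previously packaged segments for this stream.
            ns.publish("livestream?adbe-live-event=liveevent&adbe-record-mode=record");
        }
    });
    nc.connect("rtmp://your-fms-server/livepkgr");   // placeholder address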

  • Live stream switching

    hi!
    I have a problem. I want to set up a live streaming experience where streams will be switched live by a server-side application written in .NET. The Flash clients will have no control over which stream they watch. The setup is as follows: video streams from two cameras are encoded in Live Encoder. FMS should stream only a single stream to the clients, and which stream is shown should be chosen by a .NET application.
    How do I go about accomplishing such a task? As I understand it, FMS uses RTMP to communicate with clients. Are there any other ways to contact FMS and tell it to switch between two streams, e.g. an XMLSocket connection? This would be perfect since it would integrate with the .NET server perfectly. I do not like the prospect of using RTMP to contact FMS and tell it to switch between streams. Is it even possible, or can the stream switching be performed only per client?

    You can use an XML socket, but the FMS application will need to initiate the connection, as FMS has no support for listening for anything other than RTMP and HTTP requests.
    Stream switching can happen on the FMS side. In your FMS application, you'll create a server-side stream and use the Stream.play method to play other sources (live streams or recorded FLV/H.264 files) over that stream. Your subscribers will connect to the server-side stream.
    See the FMS docs for the Stream class and the XMLSocket class.
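    A minimal Server-Side ActionScript sketch of that approach (the stream name "switched", the source names and the controller host/port are all made up):
    // main.asc -- server-side switching controlled by an external application
    var outStream;   // the single stream that subscribers play
    var control;     // outbound XMLSocket to the .NET application

    application.onAppStart = function () {
        // Subscribers all play "switched"; the server decides what goes over it.
        outStream = Stream.get("switched");
        outStream.play("camera1", -1, -1, true);       // start on the first live source

        // FMS must initiate the socket connection; it cannot listen for raw sockets.
        control = new XMLSocket();
        control.onXML = function (xml) {
            // Expecting something like <switch source="camera2"/> from the .NET side.
            var src = xml.firstChild.attributes.source;
            if (src) {
                outStream.play(src, -1, -1, true);     // reset and play the new live source
            }
        };
        control.connect("dotnet-controller.example.com", 4444);
    };
    Clients then simply play "switched" and see whichever source the .NET application last selected.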

  • Publishing multi-bitrate live streams for iOS

    I'm having difficulties publishing multi-bitrate live streams that can be viewed on an iPad/iPhone.
    I'm running Flash Media Server 4.5, and have carefully followed Adobe's tutorial on streaming live media (HTTP) using HLS found here: http://help.adobe.com/en_US/flashmediaserver/devguide/WSd391de4d9c7bd609-52e437a812a3725dfa0-8000.html#WS0432746db30523c21e63e3d12e8340f669-8000
    After completing the above tutorial, the video can be seen just fine on my desktop computer via the flash plug-in, but it's not working in iOS...
    When I go to what I believe to be the proper URL, (http://myflashmediaserver.net/myevent.m3u8),  I get an error message on my iPhone and iPad saying "The operation could not be completed".
    I have created two set-level manifest files (one .f4m and one .m3u8) for the live stream using the Set Level F4M/M3U8 File Generator and saved them to the WebRoot directory, but alas, no love.
    Any help would be greatly appreciated!
    Mike

    I just finished going through the short and sweet tutorial on the Adobe website "Capture, encode and stream live multi-bitrate video over HTTP to Flash and iOS", which confirmed that I seem to be doing everything right, but I'm still getting the "The operation could not be completed" error message on both iPad and iPhone.
    Grasping at straws, I'm wondering if it could have something to do with some of the "hacks" I was asked to make in the other tutorials (which, oddly enough, weren't mentioned in the tutorial above). Specifically:
         • Edit FMLE config file on the Mac I'm encoding the live video on (change <streamsynchronization> from false to true)
         • Delete/Rename the "manifest.xml" file in applications/livepkgr/events/_definst_/liveevent directory
         • Edit "event.xml" in applications/livepkgr/events/_definst_/liveevent (change <segmentduration> from 40,000 to 16,000)
    However, I've tried running this with the above hacks as well as in their non-hacked state and am still not seeing success.
    Please advise.  Thanks!

  • Playback of low bitrate flv or f4v from live stream in FMS causes player buffer to empty

    We are experiencing a consistent issue when playing a low-bitrate (300 kbps or less) flv in a live stream from FMS. Basically, the player will start off with the appropriate buffer, say 5 seconds, then the buffer begins dropping until it empties out, and playback has to rebuffer. We've tried with a variety of flv and f4v files, all of which are 300 kbps or less, and we consistently get the issue. Is this something Adobe can investigate in FMS? Or are there any suggestions on how we can get around the issue?

    Hey, I've got a similar problem; the log looks like this:
    2012-11-12 18:50:12 23434 (e)2661034 Connect failed ( , 5779212 ) : Connect failed: Connection refused (111)
    2012-11-12 18:50:54 23434 (e)2661034 Connect failed ( , 5779212 ) : Connect failed: Connection refused (111)
    2012-11-12 18:51:36 23434 (e)2661034 Connect failed ( , 1166880400 ) : Connect failed: Connection refused (111)
    2012-11-12 18:54:14 23434 (e)2661034 Connect failed ( , 1175301776 ) : Connect failed: Connection refused (111)
    2012-11-12 18:54:55 23434 (e)2661034 Connect failed ( , 1164775056 ) : Connect failed: Connection refused (111)
    2012-11-12 18:55:37 23434 (e)2661034 Connect failed ( , 16 ) : Connect failed: Connection refused (111)
    2012-11-12 19:13:08 23434 (e)2661034 Connect failed ( , 1158459024 ) : Connect failed: Connection refused (111)
    it seems that the port number is invalid, but we never use such ports.

  • Does FMS support live stream switching

    I read a blog article (
    http://www.peachpit.com/articles/article.aspx?p=665127)
    talking about switching between different bitrate video streams of the same video content on the fly. But I cannot find such a capability in the RTMP specification. I am wondering if such functionality has to be implemented by users in ActionScript? Thanks!
    Yue

    Yes, FMS does support live stream switching, but it's not at the protocol level, which is why you won't find it in the RTMP specification.
    Please read the details here:
    http://help.adobe.com/en_US/FlashMediaServer/3.5_Deving/WS5b3ccc516d4fbf351e63e3d11a0773d56e-7fea.html
    Hope you find it useful.

  • Use HTTP Dynamic Streaming (HDS) and HTTP Live Streaming (HLS) to serve live streams to clients over HTTP

    I have created a live stream of a video and it gets stored in the live folder.
    Now I need to use HTTP Dynamic Streaming (HDS) and HTTP Live Streaming (HLS) to serve live streams to clients over HTTP, publishing the streams to the HTTP Live Packager service on Flash Media Server.
    What steps do I need to follow to do that?

    You need to generate a manifest file using the Configurator tool and place it under the webroot directory. The Configurator is located at:
    C:\Program Files\Adobe\Flash Media Server 4.5\tools\f4mconfig\configurator
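    Once the manifest is in webroot, a Flash client can play the HDS side with OSMF (1.6 or later); a minimal sketch, where the f4m URL is a placeholder following the usual livepkgr layout:
    package {
        import flash.display.Sprite;
        import org.osmf.media.MediaPlayerSprite;
        import org.osmf.media.URLResource;

        public class HDSLivePlayer extends Sprite {
            public function HDSLivePlayer() {
                var player:MediaPlayerSprite = new MediaPlayerSprite();
                addChild(player);
                // Point OSMF at the set-level (or single-stream) f4m served by Apache.
                player.resource = new URLResource("http://your-fms-server/hds-live/livepkgr/_definst_/liveevent/livestream.f4m");
            }
        }
    }
    iOS devices would instead be pointed at the corresponding .m3u8 served from the hls-live location.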

  • Live stream freezes after bitrate change : Audio remains

    I've set up a very simple OSMF test player, but there appears to be an issue when certain bitrate switches occur, with no errors reported. The video just freezes, but you can still hear audio in the background.
    Some details:
    - We're currently using OSMF 2.0, and the Flex 4.9.1 sdk and streaming an rtmpt stream.
    - For the encoding, the keyframes are 3 seconds apart
    - It appears that setting the bufferTime above 4 seconds will sometimes make the issue even worse
    Encode settings:
    Stream 1 640x360    1000 kbps 30fps 3s kf  96 kbps audio
    Stream 2 640x360    700 kbps 30fps  3s kf  64 kbps audio
    Stream 3 640x360    440 kbps 15fps  3s kf  64 kbps audio
    Stream 4 320x180    240 kbps 15fps  3s kf  32 kbps audio
    It appears that when we play the stream through an Ooyala player (which does not seem to use OSMF), these issues go away.
    Here is a very bare-bones implementation we're using to isolate the issue:
    http://pastebin.com/esWNCEfr
    I was able to replicate the issue only once using the player at OSMF.org:
    http://osmf.org/dev/2.0gm/debug.html?src=
    Here is the sequence of onNetStatus events when it freezes (see also the diagnostic sketch after the list):
    ***** onNetStatus ****** :NetStream.Play.Reset
    ***** onNetStatus ****** :NetStream.Play.Start
    ***** onNetStatus ****** :NetStream.Buffer.Full
    ***** onNetStatus ****** :NetStream.Buffer.Empty
    ***** onNetStatus ****** :NetStream.Play.Transition
    ***** onNetStatus ****** :NetStream.Buffer.Full
    ***** onNetStatus ****** :NetStream.Buffer.Empty
    ***** onNetStatus ****** :NetStream.Buffer.Full
    ***** onNetStatus ****** :NetStream.Play.Transition -- Frozen - still hear audio
    ***** onNetStatus ****** :NetStream.Play.Transition -- Still Frozen.
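    If you drop down to a plain NetStream outside OSMF for debugging, the client object's onPlayStatus callback is worth watching as well, since the completion of a switch is reported there rather than through onNetStatus. A rough sketch (nc is assumed to be an existing connected NetConnection):
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var ns:NetStream = new NetStream(nc);
    var client:Object = new Object();
    client.onPlayStatus = function (info:Object):void {
        trace("***** onPlayStatus ****** :" + info.code);
        // NetStream.Play.TransitionComplete here confirms the switch actually finished;
        // if it never arrives after a Play.Transition, the switch stalled on the server
        // (e.g. waiting on a keyframe) rather than in the renderer.
    };
    client.onMetaData = function (info:Object):void {};   // avoid async errors
    ns.client = client;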

    I've been working a little on my problem.
    I put a text area in my script to see which stream is being used.
    I slowed the bandwidth down to 100, 200 and 350 kbps and, surprise surprise, only the 100 kbps stream was ever used by my app.
    Does nobody have any idea what the problem could be?
    Regards,
    Daniel

  • Remove Lag from Live Streams

    I am publishing 2 live streams from a computer with 2 video capture cards in it, and I get a lag every 30 seconds or so on the subscribers' side. I have tried adjusting the camera quality and setMode properties, but the lag persists both inside and outside the LAN. Is there a way to create a buffer on the server, or to adjust the way the live stream is received on the subscribers' side, so there is no noticeable lag? I saw something about editing application.xml to adjust the queue and a suggested bitrate, but I'm not sure if this is applicable; here is the link:
    http://www.adobe.com/devnet/flashmediaserver/articles/dynstream_live.html
    Here is my setup:
    The publishing computer:
    2 PCI-e x1 cards, one takes S-Video (480i) and the other DVI (720p)
    Windows 7 64bit
    Intel i7
    6 GB RAM
    GB NIC and Switch
    From the switch it is one hop to a GB router and out to a 10 MB pipe which leads to our datacenter 30 miles away. The swf on this side just gets the 2 cameras, sets the quality to (0,80) and mode to (640,480,25,false) (I have played with these settings a little) and creates 2 live streams on the FMS.
    The FMS:
    I am running Flash Media Interactive Server 3.5 on my own server with two 3.6 Dual Core Xeon processors and 4 GB RAM. This server resides in a Datacenter and has a 100 MB burstable pipe. From the FMS administration console I am barely using 4MB total bandwidth, 1% CPU usage, 2% RAM.
    The subscribing computer:
    I have used many different types of hardwired PC's within the same LAN and outside the LAN, results are the same.
    The swf on this side just creates 2 new video instances and attaches the 2 netstreams to them. They are placed side by side and the height and width of these videos are undefined.
    Does anyone know where this lag could be coming from? Are there any settings I can adjust to improve performance while maintaining a minimum S-Video quality level?
    Thanks

    Hi,
    Thanks for the detailed information in your first post.
    Coming to the latency: it is affected by various factors, including the stream bitrate, the FMS server-side settings, the subscriber's client-side settings and, on top of these, the network bandwidth.
    I need to know the NetStream.bufferTime set in the subscriber's application. Try setting it to 3 when you issue the play; later you can increase it to 8 or so to stabilize playback (see the sketch below).
    Also, can you try subscribing to a single stream and check whether your client bandwidth is able to play back a single stream?
    The link which you mentioned is a good reference for tuning your server-side settings.
    Regards,
    Janaki L
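    A minimal subscriber-side sketch of that bufferTime suggestion (the stream name and values are illustrative):
    import flash.events.NetStatusEvent;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var ns:NetStream = new NetStream(nc);   // nc: an already-connected NetConnection
    ns.bufferTime = 3;                      // small buffer when issuing the play
    ns.addEventListener(NetStatusEvent.NET_STATUS, function (e:NetStatusEvent):void {
        if (e.info.code == "NetStream.Buffer.Full") {
            ns.bufferTime = 8;              // grow the buffer once playback has stabilized
        }
    });
    ns.play("livestream1", -1);             // -1 = play the live stream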

  • Live stream problem with FMS 4.5 + FMLE

    Hi,
    I've got a live stream (and DVR) setup using FMS 4.5 and FMLE.
    The hardware for the FMS server we use is 2x Xeon E5645 2.4GHz, 32GB RAM, OCZ RevoDrive R3 X2 PCI-e SSD. Installed OS is CentOS 5.8.
    We have 6 workstation PCs, each equipped with 4 BlackMagic Intensity PRO capture cards and FMLE 3.2. Using these PCs we stream live video to the FMS server.
    The video stream is grabbed over HDMI by the BlackMagic cards and encoded by FMLE in H.264 format at a 700 kbps bitrate per stream.
    Video stream is published to server through RTMP stream, which is later saved for DVR in RAW format. We only keep 3 days of recording and delete the old ones.
    Server and Encoder PCs are in the same network, connected by gigabit managed switch.
    The problem I'm having is that after 10-15 hours FMLE starts to drop frames, because its video buffer builds up. What I observed is that this happens immediately when the server CPU load increases above 60%.
    Based on the above observation I decreased the number of channels streamed by the server to 10, which reduced CPU load. But the problem still persists.
    Whenever I restart FMS and delete all DVR data, the CPU load (when streaming 10 live channels) is only 1%, but after 2 days CPU load increases to 50-60%.
    Whenever I restart FMS and don't delete DVR data, the CPU load is 5-10%, and after 2 days it still increases to 60-70%.
    Another thing I observed is that there is only a single fmscore process running, but it has lots of threads which are switched on and off in split seconds. These threads are launched on different CPU cores, but at any given point in time the load is not distributed equally among the cores. This leads to certain CPU cores being loaded by more than 60%, and frame drops start to occur.
    For the moment there are just 10 users watching this service, so I don't think this load accounts for the problem.
    Has anybody had a similar problem, or does anyone know how I can optimize or fine-tune the system to run without problems? I would appreciate any suggestions.
    Another thing I noticed over the last couple of days:
    When I restarted FMS, the CPU load used to drop to 30%, but after a week has passed, restarting only brings the CPU load down to 75%. Everything is the same, nothing has changed, and there are no disk I/O issues involved.
    P.S. I've modified application.xml using these values:
    <Scope>vhost</Scope>
    <Distribute numprocs="5">app</Distribute>
    <LifeTime>
    <RollOver></RollOver>
    <MaxCores></MaxCores>
    </LifeTime>

    Hi
    How many channels are you publishing?
    If there are too many channels, it is recommended to start one FMSCore process for each of them. To do so, you will have to change the scope to app.
    ---snippet---
    <Application>
    <Process>
        <Scope>app</Scope>
       <Distribute numprocs="3">inst</Distribute>
    </Process>
    </Application>
    Also, to delete older content, you will have to enable disk management. Refer http://help.adobe.com/en_US/flashmediaserver/devguide/WSeb6b7485f9649bf23d103e5512e08f3a338-8000.html#WSec225f632fa00875-23954b6f1300b641158-8000 for more info.

  • How does the VideoDisplay component select between live streams?

    I am passing to the source property of a VideoDisplay component a DynamicStreamingVideoSource object with 3 different dynamic live stream items, described by this XML, for your consideration:
       <video src="rtmp://88.87.56.214:1935/live/fashiontv_tmo_h.stream" system-bitrate="19200"/>
       <video src="rtmp://88.87.56.214:1935/live/fashiontv_tmo_m.stream" system-bitrate="9000"/>
       <video src="rtmp://88.87.56.214:1935/live/fashiontv_tmo_l.stream" system-bitrate="3600"/>
    But the player then runs the stream with the lowest bitrate of those 3. Wasn't it supposed to go for the highest-bitrate stream that is viewable by the end user? All 3 streams have been individually tested and they are all viewable.
    There is the matter of the initialIndex property of the DynamicStreamingVideoSource class, which acts as a preferred first-attempted stream index to play. But:
        • This is set to 0 by default, even if there is no actual 'preferred' initial index.
        • The streams are sorted internally by the VideoDisplay class, from lowest bitrate to highest; should we use an initialIndex value equal to the number of streams MINUS 1, so as to ask for the highest bitrate first? And if that is not viewable, which stream will it try then?
        • Due to stream sorting, the value for the initial index may be misleading if the streams had initially been given in a different order, e.g. from highest bitrate to lowest.
    All in all, the VideoDisplay component and its multi-bitrate support are sadly only briefly described in the doc pages. Anyone with anything to contribute on the matter is very welcome.
    Thanks,
    Liviu
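    For reference, a rough ActionScript sketch of building the same source with initialIndex pointed at the highest bitrate; whether that index survives the internal sort is exactly the open question above (myVideoDisplay is assumed to be the Spark VideoDisplay from the existing layout):
    import spark.components.mediaClasses.DynamicStreamingVideoItem;
    import spark.components.mediaClasses.DynamicStreamingVideoSource;

    var source:DynamicStreamingVideoSource = new DynamicStreamingVideoSource();
    source.host = "rtmp://88.87.56.214:1935/live";

    var items:Vector.<DynamicStreamingVideoItem> = new Vector.<DynamicStreamingVideoItem>();
    var names:Array = ["fashiontv_tmo_l.stream", "fashiontv_tmo_m.stream", "fashiontv_tmo_h.stream"];
    var rates:Array = [3600, 9000, 19200];   // same system-bitrate values as the XML above
    for (var i:int = 0; i < names.length; i++) {
        var item:DynamicStreamingVideoItem = new DynamicStreamingVideoItem();
        item.streamName = names[i];
        item.bitrate = rates[i];
        items.push(item);
    }
    source.streamItems = items;
    source.initialIndex = items.length - 1;  // ask for the highest bitrate first

    myVideoDisplay.source = source;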


  • HDS live streaming to Flash not working

    Adobe Flash Media Server 4.5.5 r4013
    Windows 2008
    Sources:
    http://help.adobe.com/en_US/flashmediaserver/devguide/WSd391de4d9c7bd609-52e437a812a3725dfa0-8000.html
    http://www.adobe.com/devnet/adobe-media-server/articles/live-multi-bitrate-video-http-flash-ios.html
    Live streaming a single or multi-bitrate video over HTTP to Flash does not work. I have followed the instructions on the 2 sources listed above repeatedly, but I can’t get live streaming over HTTP to Flash to work. Live streaming to iOS over HTTP works with no problems (single and multi-bitrate streams).
    I have tried the troubleshooting steps from the following:
    http://help.adobe.com/en_US/flashmediaserver/devguide/WS0432746db30523c21e63e3d12efac195bd-8000.html
    Troubleshoot live streaming (HTTP)
    1.      Services window (Windows): Flash Media Server (FMS), Flash Media Administration Server, and FMSHttpd services are running. ✓
    2.      Verified that the request URL is correct. ✓
    3.      Configured ports:
    a.      Configure Apache to use port 80. Open rootinstall/Apache2.2/conf/httpd.conf in a text editor. Change the line Listen 8134 to Listen 80.
    b.     Configure Flash Media Server not to use port 80. Open rootinstall/conf/fms.ini in a text editor. Remove 80 from the ADAPTOR.HOSTPORT parameter so the parameter looks like the following: ADAPTOR.HOSTPORT = :1935 ✓
    4.      Placed a crossdomain.xml file to the rootinstall/webroot directory. ✓
    5.      In Flash Media Live Encoder, select the Encoding Options tab, choose Output from the Panel options menu, and verify the following:
    a) The value of FMS URL is rtmp://fms-dns-or-ip/livepkgr. If you’re testing on the same server as Flash Media Server, you can use the value localhost for fms-dns-or-ip. ✓
    b) For a single stream, the value of Stream is livestream?adbe-live-event=liveevent. ✓
    c) For adaptive bitrate streaming, the value of Stream is livestream%i?adbe-live-event=liveevent. ✓
    Flash Media Live Encoder uses this value to create unique stream names. To use another encoder, provide your own unique stream names, for example, livestream1?adbe-live-event=liveevent, livestream2?adbe-live-event=liveevent.
    The encoder is showing all 3 streams being published and streaming.
    6. Check Administration Console: the livepkgr application and the 3 streams are running. ✓
    7. Check the logs for errors. Flash Media Server logs are located in the rootinstall/logs folder. The master.xx.log file and the core.xx.log file show startup failures. Apache logs are located in the rootinstall/Apache2.2/logs folder. X
    a)   core00.log: these errors did not occur every time that I tried playing the live stream but these are the only relevant errors in the logs.
    1. 7968 (w)2611179     Warning from libf4f.dll: [Utils] [livestream2] Discarded all queued Media Messages received before first Video Keyframe Message
    2. 7968 (w)2611179     Warning from libf4f.dll: [Utils] [livestream3] Discarded all queued Media Messages received before first Video Keyframe Message
    b) edge00.log:
    13:33:57 4492          (w)2641213 Connection rejected by server. Reason : [ Server.Reject ] : (_defaultRoot_, _defaultVHost_) : Application (hds-live) is not defined.          -
    c) Apache-Error:
    1.     [warn]  Checking if stream is disabled but bootstrap path in event file is empty for event:livepkgr/events/_definst_/liveevent stream name:livestream
    2.     [warn] bootstrap path is in event file is empty for event:livepkgr/events/_definst_/liveevent stream name:livestream1
    As I mentioned, everything works on iOS and FMS seems to be creating all of the stream segments and meta files:
    a.     The 3 streams are being created in: HD:\Program Files\Adobe\Flash Media Server 4.5\applications\livepkgr\streams\_definst_
    b.    FMS is creating the following files in each stream folder (livestream1, livestream2, livestream 3):
    1. livestream1.bootstrap
    2. livestream1.control
    3. livestream1.meta
    4. .f4f segments
    5. .f4x segments
    The appropriate files are also being created in the HD:\Program Files\Adobe\Flash Media Server 4.5\applications\livepkgr\events\_definst_\liveevent folder, in which I have the following Manifest.xml and Event.xml files:
    <manifest xmlns="http://ns.adobe.com/f4m/1.0">
      <media streamId="livestream1" bitrate="200" />
      <media streamId="livestream2" bitrate="500" />
      <media streamId="livestream3" bitrate="1000" />
    </manifest>
    <Event>
      <EventID>liveevent</EventID>
      <Recording>
    <FragmentDuration>4000</FragmentDuration>
    <SegmentDuration>16000</SegmentDuration>
        <DiskManagementDuration>3</DiskManagementDuration>
      </Recording>
    </Event>
    I’ve tried clearing the contents of both streams\_definst_ and events\_definst_\liveevent (keeping the xml files) after restarting the encoder, and creating a different event definst for the streams (liveevent2 for example).
    We have an event in 2 weeks that we would like to stream to both Flash and iOS. Any help in solving this problem will be greatly appreciated.

    One step closer:
    Changed the crossdomain.xml file (more permissive settings).
    Changed the encoding on FMLE to vp6. Working somewhat (don't know what I did to make it start streaming through hds).
    But at least now I can get the individual streams in the set manifest file to work:
    http://localhost/hds-live/livepkgr/_definst_/livevent/livestream1.f4m
    http://localhost/hds-live/livepkgr/_definst_/livevent/livestream2.f4m
    http://localhost/hds-live/livepkgr/_definst_/livevent/livestream3.f4m
    BUT when I try to play the streams through the set manifest file from http://localhost/liveevent.f4m I'm getting the following error:
    "The F4m document contains errors URL missing from Media tag." I'll search the forums to see if anyone else has come across this problem.
    I used the f4m config tool to make the file. These are the file's contents:
    <manifest xmlns="http://ns.adobe.com/f4m/2.0">
      <baseURL>http://localhost/hds-live/livepkgr/_definst_/liveevent/</baseURL>
      <media href="livestream1.f4m " bitrate="200"/>
      <media href="livestream2.f4m " bitrate="500"/>
      <media href="livestream3.f4m " bitrate="1000"/>
    </manifest>
    Thanks

  • Running a live stream from an XML playlist

    I have just successfully installed Adobe FMS on my server.
    I would now like to know how to program a script to run a continuous live stream of MP4 videos from an XML playlist.
    Can anyone tell me how to do this? ...or point me to a good tutorial? (I am a complete newbie to ActionScript.)
    Thanks in advance...

    application.allowDebug = true;
    application.onAppStart = function(){
    this.userID =0;
    this.playObj = new Object();
    this.timObj = new Object();
    this.passCli = new Object();
    this.couObj = new Object();
    this.couObj.count = 1;
    application.so0 = SharedObject.get("so",false);
    this.dates = new Object;
    this.dates.dat0 = new Date().valueOf()+"a";
    this.dates.dat1 = new Date().valueOf()+"b";
    this.dates.dat2 = new Date().valueOf()+"c";
    this.dates.dat3 = new Date().valueOf()+"d";
    this.myStream = new Object;
    this.myStream.st = Stream.get (this.dates.dat0.toString());
    this.myStream.st1 = Stream.get (this.dates.dat1.toString());
    this.myStream.st2 = Stream.get (this.dates.dat2.toString());
    this.myStream.st3 = Stream.get (this.dates.dat3.toString());
    this.int0
    this.int1
    this.int2
    this.int3
    this.int4
    this.lock0=0;
    this.lock1=0;
    this.lock2=0;
    this.lock3=0;
    this.lock4=0;
    listen();
    function listen(){
    clearInterval(application.int3);
    application.int0 = setInterval(time,1000,application.myStream.st);
    application.myStream.st.onStatus = function(info){
    if(info.code == "NetStream.Play.Stop"&&application.lock0==0){
      trace("code0"+info.code);
      clearInterval(application.int0);
      application.timObj.tim = 0;
      application.int1 = setInterval(time,1000,application.myStream.st1);
      application.couObj.count = 2;
      playcurr(application.passCli.cli);
      switchStream(application.so0);
      listen1(application.myStream.st1);
      application.lock0=1;
      function listen1(mystreamst1){
      mystreamst1.onStatus = function(info){
    if(info.code == "NetStream.Play.Stop"&&application.lock1==0){
      trace("code1"+info.code);
      mystreamst1 = null;
      clearInterval(application.int1);
      application.timObj.tim = 0;
      application.int2 = setInterval(time,1000,application.myStream.st2);
      application.couObj.count = 3;
      playcurr(application.passCli.cli);
      switchStream(application.so0);
      listen2(application.myStream.st2);
      application.lock1=1
      function listen2 (mystream2){
    mystream2.onStatus = function(info){
    trace("code2"+info.code);
    if(info.code == "NetStream.Play.Stop"&&application.lock2==0){
      clearInterval(application.int2);
      application.mystream2 = null;
      application.timObj.tim = 0;
      //application.int3 = setInterval(time,1000,application.myStream.st3);
      application.couObj.count = 4;
      playcurr(application.passCli.cli);
      switchStream(application.so0);
         application.lock2=1;
      listen3(application.myStream.st3);
      function listen3(mystream3){
    mystream3.onStatus = function(info){
    trace("code3"+info.code);
    if(info.code == "NetStream.Play.Stop"&&application.lock3==0){
      trace("yes yes yes yes yes yes");
      clearInterval(application.int3);
      application.couObj.count = 1;
      mystream3 = null;
      application.timObj.tim = 0;
      //application.int4 = setInterval(time,1000,application.myStream.st);
      playcurr(application.passCli.cli);
      switchStream(application.so0);
      application.lock0=0;
      application.lock1=0;
      application.lock2=0;
      application.lock3=0;
    application.dates.dat0 = new Date().valueOf()+"e";
    application.dates.dat1 = new Date().valueOf()+"f";
    application.dates.dat2 = new Date().valueOf()+"g";
    application.dates.dat3 = new Date().valueOf()+"h";
    application.myStream.st = Stream.get (application.dates.dat0.toString());
    application.myStream.st1 = Stream.get (application.dates.dat1.toString());
    application.myStream.st2 = Stream.get (application.dates.dat2.toString());
    application.myStream.st3 = Stream.get (application.dates.dat3.toString());
    application.myStream.st.play(application.playObj.vid[0],0,-1,0);
    application.myStream.st1.play(application.playObj.vid[1],0,-1,0);
    application.myStream.st2.play(application.playObj.vid[2],0,-1,0);
    application.myStream.st3.play(application.playObj.vid[3],0,-1,0);
    listen();
    ///here next
    application.onConnect = function(client){
    application.acceptConnection(client);
    application.passCli.cli = client;
    client.call("setUserID",null,this.userID);
    this.userID++;
    if(application.clients.length == 1 ){
    videoArray = new Array();
    var playlist = new XML();
    playlist.ignoreWhite = true;
    //parse xml play list for individual elements
    playlist.onLoad = function( success ) {
    if(playlist.loaded == true) {
    if (playlist.firstChild.hasChildNodes()) {
    for (var aNode = playlist.firstChild.firstChild; aNode != null; aNode=aNode.nextSibling) {
    if (aNode.nodeType == 1) {
    //create array from parsed xml elements.
    videoArray[aNode.attributes.id] = aNode.attributes.name ;
    //pass array out of onload function
    application.playObj.vid = videoArray;
    application.myStream.st.play(application.playObj.vid[0],0,-1,0);
    application.myStream.st1.play(application.playObj.vid[1],0,-1,0);
    application.myStream.st2.play(application.playObj.vid[2],0,-1,0);
    application.myStream.st3.play(application.playObj.vid[3],0,-1,0);
    pass0(videoArray);
    //play first video on playlist
    playlist.load("http://www.privatechatnow.com/fmsuser/playlist.xml");
    }//end onetime if statement
    function pass0(videoArray){
      //receive array
      //play intial video
      if(application.clients.length == 1){
    // application.playObj.vid=videoArray;
    playcurr(application.passCli.cli);
    for (var key in application.playObj){
    trace(key + ": " + application.playObj[key]);
       //put currently playing videio into object
      //isolate playlist switching loop for each connected client
      //listen to currently playing stream with onStatus
      //change to next video in playlist
      //use onStatus and current duration and seek to scrub to cuurently playin video each time a user connects.
      //continue untill playlist is played then loop back to first video in playlist.
        //onConnect play currently playing video
    if (application.clients.length >1){
    playcurr(application.passCli.cli);
    //message client with currently play flv
    //message client when flv changes
    //message client with metadata
    application.onPublish = function(clientObject, streamObject){
    trace("Stream name :: "+streamObject.name);
    function switchStream(so0){
    if(application.couObj.count == 1){
        clength = application.timObj.tim-3;
    currlen = application.playObj.vid[0].length;
        nextlen = application.playObj.vid[1].length;
    so0.send("playSecond",application.playObj.vid[0],clength,currlen,nextlen);
    if(application.couObj.count == 2){
        clength = application.timObj.tim-3;
    currlen = application.playObj.vid[1].length;
    nextlen = application.playObj.vid[2].length;
        so0.send("playSecond",application.playObj.vid[1],clength,currlen,nextlen);
    if(application.couObj.count == 3){
    clength = application.timObj.tim-3;
    currlen = application.playObj.vid[2].length;
      nextlen = application.playObj.vid[3].length;
        so0.send("playSecond",application.playObj.vid[2],clength,currlen,nextlen);
    if(application.couObj.count == 4){
        clength = application.timObj.tim-3;
    currlen = application.playObj.vid[3].length;
      nextlen = application.playObj.vid[0].length;
        so0.send("playSecond",application.playObj.vid[3],clength,currlen,nextlen);
    function playcurr(client){
    trace("count = "+application.couObj.count.toString());
    if(application.couObj.count ==1){
        clength = application.timObj.tim-3;
    currlen = application.playObj.vid[0].length;
      nextlen = application.playObj.vid[1].length;
        client.call("playZero",null,application.playObj.vid[0],clength,currlen,nextlen); 
    if(application.couObj.count ==2){
        clength = application.timObj.tim-3;
    currlen = application.playObj.vid[1].length;
      nextlen = application.playObj.vid[2].length;
        client.call("playZero",null,application.playObj.vid[1],clength,currlen,nextlen); 
    if(application.couObj.count ==3){
        clength = application.timObj.tim-3;
    currlen = application.playObj.vid[2].length;
      nextlen = application.playObj.vid[3].length;
        client.call("playZero",null,application.playObj.vid[2],clength,currlen,nextlen); 
    if(application.couObj.count ==4){
        clength = application.timObj.tim-3;
    currlen = application.playObj.vid[3].length;
      nextlen = application.playObj.vid[0].length;
        client.call("playZero",null,application.playObj.vid[3],clength,currlen,nextlen); 
    application.onDisconnect = function(oldclient){
    if(application.clients.length ==0){
    this.userID--;
    function time(myStream){
    application.timObj.tim = myStream.time;
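    For comparison, a much smaller Server-Side ActionScript sketch of the same idea: one server-side stream, fed clip after clip from an XML playlist, which clients play as if it were live. The playlist URL, its element names and the stream name are assumptions:
    // main.asc
    var playlist = [];      // e.g. ["mp4:clip1.mp4", "mp4:clip2.mp4"]
    var index = 0;
    var liveStream;

    application.onAppStart = function () {
        var xml = new XML();
        xml.ignoreWhite = true;
        xml.onLoad = function (success) {
            if (!success) return;
            // Assumes a playlist shaped like <playlist><video name="mp4:clip1.mp4"/>...</playlist>
            for (var node = xml.firstChild.firstChild; node != null; node = node.nextSibling) {
                if (node.nodeType == 1) playlist.push(node.attributes.name);
            }
            liveStream = Stream.get("playlistlive");       // clients play "playlistlive"
            liveStream.onStatus = function (info) {
                if (info.code == "NetStream.Play.Stop") {  // current clip finished
                    index = (index + 1) % playlist.length;
                    liveStream.play(playlist[index], 0, -1, true);
                }
            };
            liveStream.play(playlist[index], 0, -1, true);
        };
        xml.load("http://your-server/playlist.xml");
    };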

  • Trying to add metadata to a live stream completely fails

    Hi. I've been following this document http://help.adobe.com/en_US/flashmediaserver/devguide/WS5b3ccc516d4fbf351e63e3d11a0773d56e-7ff6Dev.html for adding metadata to an FMS live stream from a Flash widget.
    I've created a stripped-down version of the code (listed below); the same as the original code except the controls are removed.  It works fine only if I comment out the ns.send calls.  If I leave in the ns.send calls, no clients are ever able to view anything.  I'm using a stock FMS 4.5 install on Amazon EC2 -- the AMI is ami-904f08c2.  And I'm compiling the swf using Flex SDK 4.6 on Linux with the command "mxmlc -compiler.library-path+=./playerglobal11_0.swc -swf-version=13 -static-link-runtime-shared-libraries Broadcaster.as".
    package {
        import flash.display.MovieClip;
        import flash.net.NetConnection;
        import flash.events.NetStatusEvent;
        import flash.events.MouseEvent;
        import flash.events.AsyncErrorEvent;
        import flash.net.NetStream;
        import flash.media.Video;
        import flash.media.Camera;
        import flash.media.Microphone;

        public class Broadcaster extends MovieClip {
            private var nc:NetConnection;
            private var ns:NetStream;
            private var nsPlayer:NetStream;
            private var vid:Video;
            private var vidPlayer:Video;
            private var cam:Camera;
            private var mic:Microphone;
            private var myMetadata:Object;

            public function Broadcaster(){
                setupUI();
                nc = new NetConnection();
                nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
                nc.connect("rtmp://myserver/live");
            }

            /**
             *  Clear the MetaData associated with the stream
             */
            private function clearHandler(event:MouseEvent):void {
                if (ns){
                    trace("Clearing MetaData");
                    ns.send("@clearDataFrame", "onMetaData");
                }
            }

            private function startHandler(event:MouseEvent):void {
                displayPlaybackVideo();
            }

            private function onNetStatus(event:NetStatusEvent):void {
                trace(event.target + ": " + event.info.code);
                switch (event.info.code)
                {
                    case "NetConnection.Connect.Success":
                        publishCamera();
                        displayPublishingVideo();
                        break;
                    case "NetStream.Publish.Start":
                        sendMetadata();
                        break;
                }
            }

            private function asyncErrorHandler(event:AsyncErrorEvent):void {
                trace(event.text);
            }

            private function sendMetadata():void {
                trace("sendMetaData() called");
                myMetadata = new Object();
                myMetadata.customProp = "Welcome to the Live feed of YOUR LIFE, already in progress.";
                ns.send("@setDataFrame", "onMetaData", myMetadata);
            }

            private function publishCamera():void {
                cam = Camera.getCamera();
                mic = Microphone.getMicrophone();
                ns = new NetStream(nc);
                ns.client = this;
                ns.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
                ns.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
                ns.attachCamera(cam);
                ns.attachAudio(mic);
                ns.publish("livestream", "live");
            }

            private function displayPublishingVideo():void {
                vid = new Video(cam.width, cam.height);
                vid.x = 10;
                vid.y = 10;
                vid.attachCamera(cam);
                addChild(vid);
            }

            private function displayPlaybackVideo():void {
                nsPlayer = new NetStream(nc);
                nsPlayer.client = this;
                nsPlayer.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
                nsPlayer.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
                nsPlayer.play("myCamera", 0);
                vidPlayer = new Video(cam.width, cam.height);
                vidPlayer.x = cam.width + 100;
                vidPlayer.y = 10;
                vidPlayer.attachNetStream(nsPlayer);
                addChild(vidPlayer);
            }

            private function setupUI():void {
            }

            public function onMetaData(info:Object):void {
            }
        }
    }

    Also, emitting other events in the ns.send calls works; e.g., if I do ns.send("blah", "onMetaData", myMetadata), nothing happens (because there's no "blah" handler to do anything), but at least it doesn't cause the entire stream to fail.
