Streaming Multiple Live Events to 1 EC2 Instance

Hello, we are new to the Flash Media Server world.  We are using it with Amazon EC2 integration.  We currently need to stream multiple streams from various locations at the same live event: we have 3 stages to stream at a music festival.  How can we stream these different stages to one EC2 instance running Flash Media Server?  Currently it seems we are only able to generate one .m3u8 file with the CloudFormation template.  Ultimately we'd like to have a playlist in our end player where our subscribers can access the multiple stages.  It seems this should be possible.  Are there any tutorials on how to do this with Amazon EC2, CloudFront and Flash Media Server?

I just figured it out after hours of testing. It's actually quite simple.
Create your Amazon FMS stack.
In FMLE on each computer, make sure the stream name ends with (anyname)?adbe-live-event=liveevent
The ?adbe-live-event=liveevent part is the key.
You can name the stream anything before the ? (for example, stage1?adbe-live-event=liveevent on computer 1 and stage2?adbe-live-event=liveevent on computer 2).
The RTMP address is the server address from your CloudFormation stack (not the raw EC2 instance address), with /livepkgr on the end.
(You must have /livepkgr at the end of your RTMP URL. You can find the address under CloudFormation in the region you created the stack in.)
Then use whatever RTMP player you like and enter the different stream names with the same RTMP address.
I use a player where clients can pick different parts of an event broadcast, just like your example.
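For example (host names below are placeholders, and the exact paths depend on your FMS version and CloudFormation template), the setup on each encoder might look like this:
FMLE on computer 1:
  FMS URL: rtmp://your-fms-host-from-cloudformation/livepkgr
  Stream:  stage1?adbe-live-event=liveevent
FMLE on computer 2:
  FMS URL: rtmp://your-fms-host-from-cloudformation/livepkgr
  Stream:  stage2?adbe-live-event=liveevent
HLS playback (directly from the instance or through your CloudFront distribution) is then usually something like:
  http://your-host-or-cloudfront-domain/hls-live/livepkgr/_definst_/liveevent/stage1.m3u8
  http://your-host-or-cloudfront-domain/hls-live/livepkgr/_definst_/liveevent/stage2.m3u8
so the playlist in your player just points at one URL per stage.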
Best of luck.
[email protected]

Similar Messages

  • Problem with FMIS 4 and streaming of live events

    We have a problem on our platform and it's driving us nuts... no seriously... NUTS.
    We have triple-checked every possible component, from the hardware level up to the software configuration level.
    The problem: our platform consists of 2 origin servers with 6 edges talking to them (really beefy hardware).  Once we inject a live stream into our two origins, we can successfully get the stream out via the edges and play it via our player.  Once we hit around 2200 concurrent connections, the FMIS servers drop all the connections busy with streams.  From the logs the only thing we can see is tons of disconnects with status code 103, which according to the online documentation means "Client disconnected due to server shutdown (or application unloaded)".
    We simulated the scenario with the FMS load simulator utility... and we start seeing errors + all connections dropped around the 2200 mark.
    The machines are Dell blades with dual CPU Xeons (quad cores) with around 50 gigs of ram per server... The edges are all on 10 Gb/s ethernet interfaces as well. 
    We managed to generate a nice big fat core dump on one of the origins, and the only thing visible from inspecting the core dumps + logs is the following:
    2011-10-05 15:44:10  22353  (e)2641112  JavaScript runtime is out of memory; server shutting down instance (Adaptor: _defaultRoot_, VHost: _defaultVHost_, App: livestreamcast_origin/_definst_). Check the JavaScript runtime size for this application in the configuration file.
    And from the core dump :
    warning: no loadable sections found in added symbol-file system-supplied DSO at 0x7fff9ddfc000
    Core was generated by `/opt/adobe/fms/fmscore -adaptor _defaultRoot_ -vhost _defaultVHost_ -app -inst'.
    Program terminated with signal 11, Segmentation fault.
    #0  0x00002aaaab19ab22 in js_MarkGCThing () from /opt/adobe/fms/modules/scriptengines/libasc.so
    (gdb) bt
    #0  0x00002aaaab19ab22 in js_MarkGCThing () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #1  0x00002aaaab196b63 in ?? () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #2  0x00002aaaab1b316f in js_Mark () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #3  0x00002aaaab19a673 in ?? () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #4  0x00002aaaab19a6f7 in ?? () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #5  0x00002aaaab19ab3d in js_MarkGCThing () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #6  0x00002aaaab19abbe in ?? () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #7  0x00002aaaab185bbe in JS_DHashTableEnumerate () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #8  0x00002aaaab19b39d in js_GC () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #9  0x00002aaaab17e6d7 in js_DestroyContext () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #10 0x00002aaaab176bf4 in JS_DestroyContext () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #11 0x00002aaaab14f5e3 in ?? () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #12 0x00002aaaab14fabd in JScriptVMImpl::resetContext() () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #13 0x00002aaaab1527b4 in JScriptVMImpl::postProcessCbk(unsigned int, bool, int) ()
       from /opt/adobe/fms/modules/scriptengines/libasc.so
    #14 0x00002aaaab1035c7 in boost::detail::function::void_function_obj_invoker3<boost::_bi::bind_t<void, boost::_mfi::mf3<void, IJScriptVM, unsigned int, bool, int>, boost::_bi::list4<boost::_bi::value<IJScriptVM*>, boost::arg<1>, boost::arg<2>, boost::arg<3> > >, void, unsigned int, bool, int>::invoke(boost::detail::function::function_buffer&, unsigned int, bool, int) ()
       from /opt/adobe/fms/modules/scriptengines/libasc.so
    #15 0x00002aaaab0fddf6 in boost::function3<void, unsigned int, bool, int>::operator()(unsigned int, bool, int) const ()
       from /opt/adobe/fms/modules/scriptengines/libasc.so
    #16 0x00002aaaab0fbd9d in fms::script::AscRequestQ::run() () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #17 0x00002aaaab0fd0eb in boost::detail::function::function_obj_invoker0<boost::_bi::bind_t<bool, boost::_mfi::mf0<bool, fms::script::AscRequestQ>, boost::_bi::list1<boost::_bi::value<fms::script::IntrusivePtr<fms::script::AscRequestQ> > > >, bool>::invoke(boost::detail::function::function_buffer&) () from /opt/adobe/fms/modules/scriptengines/libasc.so
    #18 0x00000000009c7327 in boost::function0<bool>::operator()() const ()
    #19 0x00000000009c7529 in fms::script::QueueRequest::run() ()
    #20 0x00000000008b868a in TCThreadPool::launchThreadRun(void*) ()
    #21 0x00000000008b8bd6 in TCThreadPool::__ThreadStaticPoolEntry(void*) ()
    #22 0x00000000008ba496 in launchThreadRun(void*) ()
    #23 0x00000000008bb44f in __TCThreadEntry(void*) ()
    #24 0x000000390ca0673d in start_thread () from /lib64/libpthread.so.0
    #25 0x000000390bed44bd in clone () from /lib64/libc.so.6
    From the backtrace above, FMS is hard crashing (segfaulting) inside the script engine's garbage collector (js_MarkGCThing / js_GC) while tearing down a JavaScript context; the start_thread()/clone() frames at the bottom are just the thread entry points. This ties in with the "JavaScript runtime is out of memory" message in the log.
    I am really hoping there is someone out there who can guide us in the right direction with regards to how we can pinpoint why our platform cannot cope with a pathetic 2200 connections before the FMIS daemon drops all connected streams.
    There has to be someone out there who has run into this or a similar problem...  HELP !!!!
    Any feedback / ideas would be greatly appreciated.

    Thank you very much for the reply :-)
    We have been fiddling with the platform on many levels yesterday, and one thing we did was bump that value up from 1024 to 8192... This made a HUGE improvement: the platform now holds the live streaming connections (up to 8000 per edge).
    For future reference, and to help other people who might run into this problem, it's a good idea to increase this value.  From what we have seen, read and heard, the default is fairly conservative; it is supposed to grow when load demands it, but if you have a large number of connections coming in at once from multiple locations it can grow too quickly, which can cause the application to be reloaded (disconnecting all users, i.e. all edge servers connected to this origin).
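    For reference, the value being discussed usually lives in the application's Application.xml under <JSEngine> (a sketch assuming a stock install; double-check the element location and units for your FMS version):
    <JSEngine>
        <!-- JavaScript runtime size for this application; the default is 1024 -->
        <RuntimeSize>8192</RuntimeSize>
    </JSEngine>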
    Another option we were recommended to modify was the following :
    In adaptor.xml you currently have this line:
    <HostPort name="edge1" ctl_channel="localhost:19350" rtmfp="${ADAPTOR.HOSTPORT}">${ADAPTOR.HOSTPORT}</HostPort>
    You can set this to:
    <HostPort name="edge1" ctl_channel="localhost:19350" rtmfp=":80">:80</HostPort>
    <HostPort name="edge2" ctl_channel="localhost:19351" rtmfp=":1935">:1935</HostPort>
    This will create two edge processes for both ports 80 and 1935. Currently both ports are served from the same fmsedge process.
    Of course this is not a huge performance improvement, but it should further distribute the load over more processes, which is a good thing, especially when there are that many connections.
    This setting can be made on all machines (origin + edge).
    Hopefully this could help other people also running into the same problems we have seen ...

  • Multiple output streams of differing quality for live events

    I'm going to be working on multicasting a few live events for my company. I've configured multicast to work throughout our networks, and it works great. But we do have a few remote employees who need unicast streams over the public internet (via QT Streaming Server).
    I want to deliver the local clients a high-quality ~2-3mbit stream, and the remote users a smaller 150-300k stream.
    Is there any way to get QT Streaming Server to recode its stream? Or a way to make Broadcaster output two simultaneous streams? Or do I need two QT Broadcaster boxes?
    TIA,
    -porkchop

    I've found no way to do this with broadcaster, but I have found that WireCast (http://www.telestream.net/wire-cast/) does exactly what I need. Combined with QT Streaming Server, I'll be able to support both multicast and unicast clients with streams of various sizes.

  • How to merge multiple live audio streams into a single stream in FMS?

    I need to merge multiple live audio streams into a single stream so that I can pass this stream as input to VOIP through a softphone.
    For this I tried the following approach:
    I created a new stream (str1) on FMS in onAppStart and recorded the live streams (sent through the microphone) into that new stream.
    Below is the code :
    application.onAppStart = function()
    {
        application.myStream = Stream.get("foo");
        application.myStream.record();
    };
    application.onPublish = function (client, stream)
    {
        var streamName = stream.name;
        application.myStream.play(streamName, -2, -1);
    };
    The problem is that Stream.play() plays only one live stream at a time. As soon as the second live stream is sent to FMS, Stream.play() stops playing the previous live stream, so only the latest live stream is getting recorded, whereas I need to record all the live streams into the new stream simultaneously.
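    For context, the server-side Stream.play() signature also takes a fourth "reset" argument; below is a minimal sketch reusing the myStream from above. Note that even with reset set to false, the new source is only appended to a playlist and played sequentially, so the streams are still not mixed into one track.
    application.onPublish = function (client, stream)
    {
        // reset = true (the default) replaces whatever myStream is currently playing;
        // reset = false appends stream.name to myStream's playlist -- sequential, not mixed.
        application.myStream.play(stream.name, -2, -1, false);
    };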
    Any pointers regarding this are welcome.
    Thanks

    Well, I tried this once for one of my scripts, and the final conclusion is that it's not possible to combine two streams into one. How would you time/encode the two voices? There is no known solution to this in Flash. If you continue on despite me and find a solution, please post it so we can explain it to the rest of the world.

  • Live event: same channel same port fail to stream two differ stream type

    Hi All,
    When I create a live event on the same channel and same port with different streaming types (unicast and multicast), I can't start the DME, so I cannot start the broadcast either and the video does not stream.
    I also tried a non-DME live event; the result is still the same and the video does not stream in SnS. I usually use this approach with the last DMM version (5.0.2), but today with SnS version 5.2.1 it is not running and the video is blank.
    anybody know what happening ???
    thanks
    RD

    RD,
    Let's step back a little and get some more details:
    * What type of DME are you using?
    * What is connected to the DME providing the LIVE stream?
    * Specifically, how do you have the DME configured for your setup?
    * Please provide screen captures of your DME configuration on the SNS
    * Have you configured a "Basic" configuration (single channel/port only) on the DME
    and SNS with 5.2.1 and gotten it working?  Have you ever gotten the DME working with SnS 5.2.1?
    Thanks!
    T.

  • FMS3: Multiple live video streams

    Hi,
    After reading the features and documentation I have a doubt:
    Can I have multiple live video streams with FMSS at the same time?
    E.g.: different TV stations broadcast to a single server, and
    then the user chooses which station they want to watch.
    Thanks,
    Oriol

    Contact them and ask them which video streaming standards they use.  The most popular are:
    Windows Media, Windows Media with DRM enabled, MPEG-4, RealPlayer, Adobe Flash, and Silverlight. Microsoft has released a Silverlight update for Lion.  http://www.flip4mac.com/ allows native non-DRM Windows Media playback. Alternatively, you may need virtualization. RealPlayer's latest update is Lion compatible. For MPEG-4, make sure the version of QuickTime you have installed is the latest; the same goes for Adobe Flash. Also ask them whether their website depends on ActiveX; if it does, virtualization may be your only solution.

  • Multiple signups at live event & automatically have form reset

    Is there a way to have my form automatically reset itself (or open a new form) every time a signup is completed, so the next person can sign up right away? This is needed for a live event with many new signups.

    Please see the following FAQ: http://forums.adobe.com/thread/869660 to see if it works for you.
    Thanks
    Roman

  • Check if a stream is live via PHP?

    Hi all!
    Is there any way to check if a certain RTMP stream is live,
    via PHP?
    The homepage of my site (which is built with PHP) will change
    depending on what streams are live (there are multiple streams).
    It's not enough to be able to check in a swf file as I can't then
    use the information to change the homepage or do other cool stuff
    like add to an RSS feed, or notify people via SMS/IM.
    Is there any way to check this?
    Thanks!

    Unfortunately, the applications running on the FMS VHost (the FMS core) can't listen for anything other than RTMP requests, so your FMS app has to make the request.
    What you might want to try is building a little socket server in PHP, and programming your FMS application to connect to it when it starts up. That way, you can keep a stateful connection between PHP and the FMS app instance, and PHP can make requests over the socket.
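    A minimal sketch of the FMS side of that idea, as server-side ActionScript in main.asc (the host, port, and message format are placeholders, and this assumes the server-side XMLSocket class available in your FMS version):
    var phpSocket;
    application.onAppStart = function ()
    {
        // Open a persistent connection to the PHP socket server when the app loads.
        phpSocket = new XMLSocket();
        phpSocket.onConnect = function (success)
        {
            trace("PHP notifier connected: " + success);
        };
        phpSocket.connect("php.example.com", 9999);
    };
    application.onPublish = function (client, stream)
    {
        // Tell PHP that this stream just went live, so it can update the homepage, RSS feed, etc.
        phpSocket.send("publish:" + stream.name + "\n");
    };
    application.onUnpublish = function (client, stream)
    {
        phpSocket.send("unpublish:" + stream.name + "\n");
    };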

  • Hyperion Smart Cut Session Issue streaming multiple reports at a time using

    This is a lengthy post with details, please review it with patience:
    We are trying to integrate Workspace-based Web Analysis reports into our intranet portal through portlets (made from individual Workspace smart cut links) so that users can customize their page as needed. Once they log in through the intranet, a cookie is set and passed to succeeding smart cut links via secured HTTP headers. This part is fine, but we mainly run into issues streaming multiple Web Analysis reports at a time: we use iframes to display and stream multiple individual reports on one page, but unfortunately all iframes share one session rather than creating individual sessions, mostly because of limitations of the current Web Analysis architecture and IE browser behavior with sessions.
    Which Hyperion reporting tool allows streaming multiple reports at a time? Ideally we want to create an iframe for each Workspace report so that users can customize and pick portlets as needed.
    Problem Description: Hyperion Web Analysis smart cut reports incorrectly render images and intermix dashboards on POV changes when multiple reports are displayed at one time through iframes. The issue seems to mostly relate to Hyperion using a single session rather than multiple sessions when displaying several web reports at a time.
    Steps Needed To Reproduce:
    The user logs into Workspace.
    The user clicks on Explore and is able to view the report without any issues.
    Right-click on any of the reports and select Properties.
    Select the SmartCut URL and paste it into Internet Explorer.
    The Web Analysis report opens without any issues.
    Similarly, the user copies the SmartCut URL for a different report and pastes the link into a different tab or a new window of the same browser.
    The report on the first tab gets overwritten by the report on the second tab of the browser.
    The issue happens when two different reports are opened in a browser with multiple tabs. The reports get overwritten.
    Oracle development answer:
    In the described scenario opening new browser window does not spawn a separate independent browser session. A new browser window is supplied with URL pointing to the same domain, so all browser cookies including the "ORA_WA_SID" (WA session cookie) are shared between the windows. But multiple application instances cannot run in the same browser session simultaneously because of session sharing. This is not actually a defect, but rather a limitation of current system architecture and browser behavior. And there is no way to fix it programmatically on the product side.

    Hi MeHyper,
    Your current arrangement puts most of the session handling and persistence on the client. This forces you to accept whatever the client decides to persist and propagate between browser elements (in this case iframes).
    My approach would be to:
    Consider coding your portlets to manage the sessions. Manage them using a somewhat stateless strategy, so that each request to the server performs a login/connect, retrieve, and disconnect. Each iframe should access a different portlet URL (or provide a different query) based on its content.
    This way, in theory, you can invalidate the sessions and related cookies and update the various report elements independently.
    Is smartview out of the question here?
    Regards,
    Robb Salzmann

  • How do you get multiple live instruments to put out sound?

    I just got an M-Audio 61 keyboard from my wife for our anniversary. I love it and it works great with GarageBand 1.X. I also have a guitar I use with GB; it is plugged into the 1/8 in. mini jack input in the back of the Mac. I can't, however, get both to put out sound at the same time. Let's say, for instance, that I would play the guitar and my wife the keys. How can I get both to put out sound at the same time?
    Thanks for any help you can offer.
    Your friend in Mac OS X,
    Jose

    Thanks, Christoph!
    I was glad to get your response. I thought I was not doing something right in GarageBand. I'll get GB2 and try out multiple live sources.
    Once again thanks.
    Your friend in Mac OS X,
    Jose Mauricio Cuervo

  • Multiple start events in a process

    How does one add more than one start event to a process?
    "6.2.1.2 Using Multiple Start Events in a Process" of the modeling and implementation guide suggests this should be possible. I'd like to be able to create process instances using a none start event followed by a user task defined with the initiator pattern, so users can initiate the process. I also need to support creation of process instances via web service call. I should be able to accomplish this using a none start event followed by a receive task. However, I can't seem to get both in a single process. It only seems to allow a single start event.

    Sorry, I didn't realize that my component palette was being hidden - so I was missing a lot of activity/flow object types.
    Why is there only a small subset of flow objects displayed in the header of the process editor? (the one with the swimlanes)

  • How to use flash media server with cisco show n share live event module?

    Hello all,
    Is it possible to use Flash Media Server with a Show and Share (non-DME) live event? I've configured Flash Media Server and can receive multicast streams, but I have no idea what to write in the video URL field in the live event basic setup.

    Hi Temur,
    You should be able to stream from a Flash Media Server.
    The URL should look like this:
    rtmp://xxx.yyy.com//flv:
    You can do some tests from a PC to get the exact URL that would suit your environment.
    Regards,
    Nicolas

  • Live stream not live- plays like VOD

    Hi All,
    I'm not sure if this belongs in the AS3 forum or the streaming forum, but here it is.
    I wrote a pair of CS3/AS3 players to send and receive a live video stream (webcam) from a browser.  The first is called broadcaster.  It makes a NetConnection to my server, publishes a stream, and attaches the camera and microphone.  This works fine. The stream is live.
    The second is called receiver.  Using an instance of flash.media.Video on the stage, it also makes a NetConnection, attaches a stream, and calls play().  This also works, BUT the receiver stream is not live.  It starts playing at time=0 like a VOD.
    --- broadcaster.swf ---
    var connection:NetConnection = new NetConnection();
    connection.connect(connect_url);  // my server/application/instance
    var stream:NetStream = new NetStream(connection);
    stream.publish("mass", "live");  // live means stream live without recording on the server
    stream.attachCamera(Camera.getCamera());
    stream.attachAudio( Microphone.getMicrophone() );
    --- receiver.swf ---
    var connection:NetConnection = new NetConnection();
    connection.connect(connect_url);
    var stream:NetStream = new NetStream(connection);
    video1.attachNetStream(stream); // video1 is on the stage (flash.media.Video, not FLVPlayback)
    stream.play("mass", -1, -1, true); // -1 means play live stream only, -1 means start at 'wherever the live stream is'
    How can the receiver not obey the -1, -1 parameters, especially if the broadcaster is not recording on the server? Why am I seeing the beginning of the stream?
    I believe I read that the Flash plugin 9.0+ uses the On2 codec, and I also saw some discussion about On2 not being able to stream live -- or at least not live from the browser plugin, only from the Flash Media Encoder.
    Can anybody tell me what is going on?
    Thanks
    Ted

    I tried streaming with Adobe Flash Media Live Encoder 3 instead of my Broadcaster.swf.  This connects fine and pushes H.264 with no problem.
    But the receiver is still seeing VOD, not live video.
    Is there something wrong with my receiver?

  • Where can i watch the live event today (oct. 4th)

    Where can I watch the live event today (Oct. 4th)? I know it's going on now. All the links I have tried will not open, even in different browsers. Maybe I am trying the wrong ones. Any ideas? Thanks.

    You can't. Apple is not streaming today's event live; they haven't streamed any event live for at least a couple of years. For blog/twitter-style updates, try:
    http://live.appleinsider.com
    which still seems to be working. Other sites are getting overloaded.
    Regards.

  • Multicast Show and Share live event from TCS

    Hi
    I have a deployment with DMM, SnS, and TCS and C90 Telepresence codec.
    The TCS is sending the live stream to the SnS server. The question is whether it is possible to multicast that live event to the SnS clients, or whether unicast is the only option here.
    If unicast is the only option with the current deployment, is it possible to make a multicast stream with a different deployment?
    I would appreciate any thoughts on this.
    Regards,
    Ola Dallokken

    Hi, yes you can use multicast to do this (but as Gabriel says, it's not SnS that streams it; it is merely a container looking at the raw stream from the TCS or wherever you choose to pull it from). However, it's a complete waste of time if your network isn't set up for multicast and if you have WAN sites running slow links. TCS uses Microsoft Windows Media Server for unicast/multicast and is inherently terrible at sending multicast traffic. If you have Wireshark, have a look for yourself: it can deal with unicast and will send a nice consistent stream, but multicast bursts all over the shop (so a 500k stream can burst to 2 Mb, which is not a good thing if you know about routing congestion, etc.). It took me months of working out the root cause of the issue and eventually getting Cisco to fess up and say "yes", it's an issue. It's really a Microsoft issue, and unfortunately that's what Cisco's TCS platform sits on.
    Forget reading the 'standard' Cisco blurb, as it will say try this and try that, tweak this and tweak that, but in the end you just can't use it (unless, as I say, you're on a LAN or your WAN has big links and can deal with the overheads). Also, unless you have a TAC engineer who has actually deployed this and not just read the manual, they won't have working knowledge of the issue either. It took me a long time to find someone at Cisco who actually knew the issue and would admit it was an issue. I guess that's why they developed ECDS. Now that's a good product, but as usual not cheap at all.
    That's why Cisco talk about 'Wowza' (or at least they use it as an example in their blurb for streaming, and not the TCS), as it can deal with multicast fine, but the catch is that it's not supported by Cisco. The only real solution is to use a content delivery system like Cisco ECDS. Believe me, I have spent months going through this with TAC engineers, and I have the current 'Cisco expert' in this field who has helped me a lot with this. It's a very sad thing to find out after the fact.
    Also, the new SnS will be very different and will lend itself to ECDS in the long run anyway, so all of this we are discussing becomes a moot point.
    Hope this helps.  If you need any assistance apart from TAC, let me know, as I have just spent a fair while deploying this myself and working out all the undocumented issues, of which there seem to be a lot. Once it's in, it seems to work OK though.
