Re-Start a Live Stream using AMS 5 through CloudFront

We have AMS 5 running on an Amazon EC2 instance. We send a live stream from an FMLE encoder using H.264/AAC.
If we do an origin pull from that instance running AMS 5 and live stream through one of our commercial CDN accounts, we can stop and re-start our live stream without issue. It stops and re-starts as it should. When we re-connect to the AMS using FMLE through the origin pull, the stream re-starts in either the OSMF or JW Player 6 in under 30 seconds.
If we use that same AMS 5 on the same EC2 instance and live stream through an Amazon CloudFront distribution, then when we stop our live stream and re-start it, it takes an unacceptably long time for the stream to refresh - if it refreshes at all. Basically, after the re-start of the live stream through FMLE, the live stream at the player plays the first 10 or 15 seconds of the initial live stream and then stops. If you refresh the player, it will play that same 10 or 15 seconds again - and stop again.
It would seem that the AMS 5 is set up correctly; it works through our commercial CDN.
It also would seem that the issue is with the cache on Amazon CloudFront. Our assumption is that the CloudFront cache is not updating properly. But we have tried just about every variable in every setting for CloudFront that we can find. We can't get past this.
Any suggestions?
Thanks
As a follow-up to this discussion, you can follow the issue on the AWS forums at https://forums.aws.amazon.com/thread.jspa?threadID=120448&tstart=0

We also have other threads going.
http://forums.adobe.com/thread/1180721?tstart=0
There appear to be two different issues - one with Amazon CloudFront and a second with Adobe Media Server. Unfortunately, they are tied together in the real-world application of these programs.
In AMS, when we use adbe-record-mode=record, the AMS works fine because it clears the streams on each re-publish. Unfortunately, this causes a status-code error on Amazon CloudFront. And that particular status-code error from a custom origin server causes CloudFront to "override the minimum TTL settings for applicable cache settings". As the programmers explained, basically what happens is that CloudFront overrides our 30-second cache settings and falls back to a standard cache setting of 5 minutes. And waiting 5 minutes to re-start a live stream just doesn't work in the real world for real customers.
We can fix that CloudFront status-code error by using adbe-record-mode=append on the AMS. With that setting we can stop a live stream and re-start it, the CloudFront cache clears, and everything works - unless you also change the stream settings in your FMLE. If you stop the stream and change the video input or output size in FMLE, then the AMS gets stuck and can't/won't append to the f4f record file.
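For reference, the record mode in a livepkgr-style setup is selected from FMLE by appending query parameters to the stream name. A sketch of the two variants being compared (the host, event, and stream names are placeholders, not our actual values):

```
FMS URL: rtmp://<ec2-host>/livepkgr
Stream:  livestream?adbe-live-event=liveevent&adbe-record-mode=record   (clears the stream on each re-publish)
Stream:  livestream?adbe-live-event=liveevent&adbe-record-mode=append   (appends to the existing f4f file)
```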
Does anyone have any ideas about how we can adjust all of this so that it works together?
Thanks

Similar Messages

  • How to use Phone camera for live streaming rather than using WireCast camera in Live streaming using Azure Media Services

    Hi,
    I am planning to develop a live streaming app using C# and XAML.
    I have gone through the sample below and am able to stream live content using a Wirecast camera.
    Azure Media Services Live Streaming Sample
    But I want to use the phone camera to stream the live video rather than the Wirecast camera.
    How can I do that? And how do I get the published URL from the C# code - is there an API for that?
    Can anybody please help me?
    Regards,
    Santhosh

    Hello,
    There are no in-box components that will allow you to transmit captured video from a Windows Phone device to Azure Media Services. You will need to create your own implementation. You can do this by creating a custom Media Foundation sink that can plug into
    the MediaCapture element's pipeline. You will then need to modify the captured and compressed video to align with one of the two protocols that Azure Media Services can use. At this time Azure Media Services can use either Smooth Streaming or RTMP for ingestion.
    You will need to create your own packager (or find compatible packager code from a third party) to package the MP4-encoded data into the necessary format.
    Here is an example of how to create a custom Media Foundation sink (please note you must write your Media Foundation components using C++ and COM; Media Foundation components written in managed languages are not supported):
    Real-time communication sample
    I hope this helps,
    James
    Windows SDK Technologies - Microsoft Developer Services - http://blogs.msdn.com/mediasdkstuff/

  • Problem relaying live stream using different urls

    Hi,
    I have set up QT Broadcaster to announce a stream to my QTSS.
    I announced the stream to a directory live/stream.sdp.
    This works without a hitch when using the same URL for viewing, e.g.
    rtsp://rtsp.myqttss.com/live/stream.sdp
    If I create a symlink (or simply copy the sdp file) from live/stream.sdp to test/stream.sdp and try to access the stream through
    rtsp://rtsp.myqttss.com/test/stream.sdp
    it does not work anymore. QT Viewer will connect, but does not play the stream.
    I am wondering why this is. Is the mount point relevant for accessing the stream? All necessary information should be contained in the sdp file.
    The reason I would like to do this is that we have a payment module for QTSS that uses a 'virtual' address to dispatch a stream to the client. This works perfectly for non-live content.
    Thanks
    NikWest

    Ok.
    No problem now.
    Thanks.

  • Can storing a live stream using actionscript fail by the cache filling up with overhead?

    Hi,
    Lately we have been seeing a problem with the archives of live streams we create using FMS. We use FMS for live streaming and concurrently store the stream in a file using ActionScript. We use the following code to record:
    var s2 = Stream.get('mp4:' + mp4name);  // look up / create the server-side stream
    application.publishedStreamMP4 = s2;
    application.publishedStreamMP4.record();  // start recording to the mp4 file
    application.publishedStreamMP4.play(application.publishedStream.name, -1, -1);  // feed it the live stream
    (some lines have been removed that are used for logging, etc).
    Sometimes some of these functions fail and return false. In these cases FMS's core log shows that the cache is full:
    2013-06-11 11:45:55        13863   (w)2611372      The FLV segment cache is full.  -
    In investigating this issue I have not yet been able to recreate this exact situation. By lowering the cache to 1MB I have however been able to create a situation where storing a stream can stop because the cache is full. The situation occurs as follows:
    * The server is restarted, the cache is empty.
    * A live stream is started, which is also recorded.
    * Via the Administration API the cache values <bytes> and <bytes_inuse> show to be exactly the same as the <overhead> of the object that relates to the file being saved. The <bytes> and <bytes_inuse> values of the object are 0.
    * This continues in the same way until the cache is full.
    * When the limit of the cache is reached the message
    2013-06-11 12:07:35        13863   (w)2611372      The FLV segment cache is full.  -
    is shown in the core log, and storing of the file stops. The instance log also shows status changes:
    2013-06-11 12:07:35        13863   (s)2641173      MP4 recording status for livestream.mp4: Code: NetStream.Record.NoAccess Level: status Details:         -
    2013-06-11 12:07:35        13863   (s)2641173      MP4 recording status for livestream.mp4: Code: NetStream.Record.Stop Level: status Details:     -
    In the filesystem I can confirm that the last change of the file is on this moment (in this case 12:07). The live stream continues without problems.
    I have reproduced this several times. Though I can understand that caches can fill up and cause trouble, this situation feels like a bug in FMS. The cache fills up with overhead, which is apparently reserved until writing the file ends.
    I hope someone here can help out. Has anyone seen a situation like this and is there any remedy for it? Or even a workaround where this overhead in the cache can be released so the cache does not fill up?
    We use FMS version 4.5.1.

    You can use an XML socket, but the FMS application will need to initiate the connection, as FMS has no support for listening for anything other than RTMP and HTTP requests.
    Stream switching can happen on the FMS side. In your FMS application, you'll create a server side stream, and use the Stream.play method for playing other sources (live streams or recorded flv/h.264 files) over that stream. Your subscribers will connect to the server side stream
    See the FMS docs for the Stream class and the XMLSocket class.
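    As a rough sketch of that server-side switching (server-side ActionScript for a main.asc; the application and stream names here are made up, and this is an untested outline, not a drop-in implementation):

    ```actionscript
    // main.asc -- one server-side stream that every subscriber plays
    var outStream;

    application.onAppStart = function () {
        outStream = Stream.get("switched");   // subscribers connect and play "switched"
    };

    // switch whichever live or recorded source feeds the subscribers
    function switchTo(sourceName) {
        // -1, -1 = play a live stream from now; true resets the playlist
        outStream.play(sourceName, -1, -1, true);
    }
    ```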

  • Live Streaming Using my Webcam As a Security Camera

    I'm lost here, someone please help: I'm trying to use my iSight as a security camera over the internet so I can watch the dog I just got.
    So the question is: what software out there can I use to turn my webcam into a camera I can watch live? (Again, my PowerBook is kind of old; I'm still running Panther, 10.3.9, not 10.4 - can I work around this?)
    I know I need my IP address, and I have that, but when I hook everything else up, where do I go when I'm somewhere else and want to look at my live stream from work to my house? Please, can someone help me? Thank you if you can. : )

    Hello, keith28
    Until we hear back from juanorfrankie, here is my suggestion for your question:
     I would like to access the isight using my iPhone.
     Can anyone make some suggestions?
    If the iPhone can support viewing a streaming webcam page, use EvoCam with your iSight to publish a webcam page with streaming video. That would probably be the simplest way to do what you want, but because I do not have an iPhone, I cannot test this.
    See the instructions and FAQs at EvoCam for system requirements and details on how to set up your webcam page.
    EZ Jim
    PowerBook 1.67 GHz   Mac OS X (10.4.10)    G5 DP 1.8  External iSight

  • Live stream using CDN

    Hello!
    I'm having problems with live streaming.
    When I use my own Flash Media Server at localhost it works perfectly, but when I use my CDN and OSMF I get the following error message:
    Error: Connection refused by FMS.
    It works perfectly if I use NetConnection and NetStream.
    This is my piece of code:
    var container:MediaContainer = new MediaContainer();
    addChildAt(container, 0);
    _mp = new MediaPlayer();
    _mp.autoRewind = false;
    var videoElementStatic:VideoElement = new VideoElement(new StreamingURLResource("rtmp://myhost/myapp/live/livestream",StreamType.LIVE_OR_RECORDED), new NetLoader());
    _mp.media = videoElementStatic;
    container.addMediaElement(videoElementStatic);
    P.D.: My CDN is Level 3
    Any ideas?
    thanks.

    I'm not sure whether Level 3 requires the FMS application instance.  But you might want to try the instruction in this thread:
    http://forums.adobe.com/message/2782150#2782150
    I doubt that's the problem though, because that should prevent the connection from happening entirely.  It sounds more like an authentication issue, you might need to follow up with Level 3 to make sure you have any necessary token in your URL.
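    If Level 3 does use URL tokens, the fix would just be a change to the resource URL in the snippet above. A sketch (the token parameter name and format are made up here - your CDN will specify the real ones):

    ```actionscript
    // hypothetical auth token appended to the stream URL
    var resource:StreamingURLResource = new StreamingURLResource(
        "rtmp://myhost/myapp/live/livestream?token=YOUR_TOKEN",
        StreamType.LIVE_OR_RECORDED);
    var videoElement:VideoElement = new VideoElement(resource, new NetLoader());
    _mp.media = videoElement;
    ```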

  • Timecode/TimeStamp in RTMP live streaming using Adobe Media Server and FMLE

    HI There,
    I am trying to stream video to Adobe Media Server (using RTMP) through FMLE and play it on my web site using JW Player. I could stream the video successfully, but now I want to get the timestamp/timecode on the stream. Do Adobe Media Server and FMLE support this kind of use case? If not, is there any other way to achieve the same thing? Any comments/suggestions/pointers appreciated.
    Thanks in Advance
    Regards
    Deepak

    If you're talking about nonstop, continuous streaming, your subscribing client will need to close and reconnect the stream every couple of days, as the server will run out of timestamps for the stream.

  • Feasibility of Live Streaming using FMSS 3.5

    Is Live Streaming Feasible using FMSS 3.5 on a web portal which is developed using J2EE?

    Yes, very much so. There is no correlation between the technology used to develop the web portal and FMS. The relevant web pages simply need to embed the necessary SWF client, which takes responsibility for communicating with FMS.

  • Dynamic live streaming on AMS 5 on AWS buffering and skipping

    I'm having a lot of trouble getting Adobe Media Server 5 on Amazon Web Services working with live dynamic streaming. I've done a lot of this on Flash Media Server 4.5 and am trying to duplicate what I did there on AMS, since I got an email that FMS 4.5 will be dropped soon.
    I have tried replicating my own setup from FMS 4.5, and I have also tried using the base install and following Adobe's tutorial. I get the same results both ways.
    First, the HLS streaming appears to work fine. I've watched it on several iOS devices with no problem.
    I use StrobeMediaPlayback for HDS streaming, and that's where the problems are. The streams usually play fine at first and even transition well; then after a few minutes I start getting a lot of buffering and then video skipping. The longer it goes on, the more it buffers and the larger the chunks of video that get skipped. Sometimes I also get an error that says "TypeError: Error #1009".
    I am using FMLE, and both it and the server are set to use absolute time.
    Any help would be appreciated!!

    Hi,
    Welcome to Adobe forums.
    May I know the FMS URL and the stream name you are using inside FMLE? Make sure you have full permissions on the Adobe Media Server folder. Also check the streams folder inside the livepkgr folder and see whether all the files are being created.
    And would you mind sending me the logs folder at [email protected] so that I can analyse this issue at my end?
    Regards,
    Puspendra

  • Inserting Metadata events in a live stream using non-flash client app

    Hi all,
    I wish to insert captions into a live Flash video stream.
    I found an example here: http://www.adobe.com/devnet/flashmediaserver/articles/metadata_video_streaming_print.html
    but this example uses a Flash client app which can invoke something like
    video_nc.call("sendDataEvent", null, inputfield_txt.text);
    How can I do this without any Flash client/environment? (I am, of course, still using FMS 3.5.)
    I would like to use a piece of Python (or PHP... whatever) code to extract captions from VBI in the incoming video stream and insert them into the Flash stream.
    Any help / experience appreciated.
    Regards
    Michel

    Well, I'll ask it a different way:
    Is there any documentation on the protocol used between the Flash client and the FMSI so that I can "fake" the Flash client using PHP?
    Is there a way to call the sendDataEvent function on the FMSI without using a Flash client?
    Thanks
    Regards

  • Live streaming using PPLive SOPcast etc. to watch TV online

    I have subscribed to a football online web service, so I can watch live games that aren't on TV, etc. However, Panther only seems to want to run the bad-quality streams. I want to use PPLive or SOPcast streams, but Safari won't even attempt to open them (I think the file type or web link is of the .mms type). Any ideas? I have only been successful using Windows Media Player (for Mac, naturally) to view anything. What exactly are these programs PPLive, SOPcast, etc.? Do they run as part of a media player, or are they separate programs? Any help would be appreciated. To make it clear: you go to the different stream portals, open one, and Safari opens up the media player to try to play it. Should QuickTime be able to support this?
    Cheers,
    Cowan

    Yes you can have a UK itms account if you have a UK credit card and address.
    No it doesn't matter where you buy the tv from, although you will need to change the plug if you buy it in the UK.

  • Live streaming not working using REOPS

    I made live streaming work using Live Encoder and a Flash player I created with the FLVPlayback component. It works fine. Now I'd like to try live streaming using REOPS. I created a config XML file using settings similar to src/assets/data/Playerconfig_LiveStreaming.xml:
    isLive = true
    <streamtype>live</streamtype>
    part of my xml file is like this:
    <?xml version="1.0" encoding="UTF-8"?>
    <player
        width="640" height="480"
        scaleMode="stretch" isLive="true" autoPlay="true" updateInterval="250"
        hasCaptions="false">
        <mediaElement>
            <mediaElement>
                <id>seeker</id>
                <mimeType>video/x-flv</mimeType>
                <streamType>live</streamType>
                <deliveryType>streaming</deliveryType>
                <media url="rtmp://<myMediaStreamingServerName>:1935/<myapplicationName>/live/livestream" />
            </mediaElement>
        </mediaElement>
    </player>
    in adobe live encoder, I have setting like this:
    URL: rtmp://<myMediaStreamingServerName>:1935/<myapplicationName>/live
    stream:  livestream
    but no matter how hard I tried, it never worked for me. I used the same settings for my own player, which works fine with live streaming.
    Anyone has any ideas on how to make live streaming work with REOPS?
    Thanks a lot.

    Is the XML a copy/paste? If so you may have a typo in your media url. There is a space in "liv e". That could be the issue.
    hth!
    hth!

  • Can you live stream video to computer using Sony Action Cam X1000V 4K?

    I'm a computer programmer and I need a high-quality camera, and the Sony Action Cam X1000V suits me best. I would like to know whether I can stream live video to my computer using HDMI/USB/WiFi/anything from the Sony Action Cam X1000V.

    Hi PatrikWWDC,
    Welcome to the Sony Community!
    Yes, you can do live streaming using the FDRX1000V action cam. You can use UStream to stream your videos live online. Here is a guide on how to prepare the camera for Live Streaming.
    For further assistance with your concern, we recommend visiting our Sony Global Web site for information on contacting the Sony Support Center in your region at http://www.sony.net/SonyInfo/Support/.
    If my post answers your question, please mark it as "Accept as Solution". Thanks_Mitch

  • Live Streaming incomplete

    Hi,
    Going through a couple of tutorials on the Adobe site, I wanted to try out how live streaming works.
    So I set up FMS 3 on the system which also hosts my web server (IIS).
    I set up FME 2.5 on my system (with an attached web cam).
    Later I started the live streaming by providing the FMS URL "rtmp://ip_address/live/instance1" and the stream name "myFirstStream".
    I was able to confirm that my FMS is able to receive the stream by checking the FMS Admin console on the server.
    Second part:
    I created a "myStream.fla" having the FLV playback object on the timeline, with "livestream" as its instance name. I did not provide any source/contentpath in the properties window for the FLV playback object.
    Then I created a "streamaction.as" file and included it in the ActionScript window in timeline2-frame1 using the following code syntax: include "streamaction.as"
    The "streamaction.as" file contains the following 3 lines of code:
    livestream.width=320;
    livestream.height=240;
    livesteam.load("rtmp://ip_address/live/instance1/myFirstStream", true);
    "true" in the above line indicates the "isLive" value.
    I created a "test.html" page in which I used the <object> and <embed> tags to embed the "myStream.swf" file.
    Issue 1: But when I opened "test.html" in my browser, no streaming was visible except the placeholder/blank Flash object in the page, even though FME and FMS were displaying the stream from my webcam.
    I even modified the code as follows:
    livestream.width=320;
    livestream.height=240;
    //livesteam.load("rtmp://ip_address/live/instance1/myFirstStream", true);
    livesteam.contentpath = "rtmp://ip_address/live/instance1/myFirstStream";
    livesteam.isLive = true;
    Also, can anyone suggest how to access the Flash variable so that I can pass the URL value from my web page itself to the Flash object? I had come across the following way... could this be the right approach (read FlashVars in the tags below)? ref: http://www.permadi.com/tutorial/flashVars/index.html
    <OBJECT classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000"
        codebase="http://macromedia.com/cabs/swflash.cab#version=9,0,0,0"
        ID=flaMovie WIDTH=250 HEIGHT=250>
        <PARAM NAME=movie VALUE="myStream.swf">
        <PARAM NAME=FlashVars VALUE="streamName=myFirstStream">
        <PARAM NAME=quality VALUE=medium>
        <PARAM NAME=bgcolor VALUE=#99CC33>
        <EMBED src="myStream.swf"
            FlashVars="streamName=myFirstStream"
            bgcolor=#99CC33 WIDTH=250 HEIGHT=250
            TYPE="application/x-shockwave-flash">
        </EMBED>
    </OBJECT>
    and in the "streamaction.as" file change line 3 to:
    livesteam.load("rtmp://ip_address/live/instance1/" + _root.streamName, true);
    will this work?

    Finally figured out what was incorrect in my code. Here is the working code (AS3):
    var paramObj:Object = LoaderInfo(this.root.loaderInfo).parameters;
    myVideo.autoPlay = false;
    myVideo.isLive = true;
    myVideo.source = "rtmp://ip_address/live/instance1/" + paramObj.streamName;
    myVideo.autoPlay = true;
    Solution: isLive has to be set to true before setting the source of the FLV component.

  • About Flash P2P live streaming from non-webcam sources

    Hello, I am a university student. Our lab is attempting to build P2P live streaming using the Flash P2P features. The media source is not a webcam but a file from a certain server, which is not directly supported by any of the 4 methods Flash Player offers (posting, direct routing, object replication, and multicast). As some of the forum threads mentioned, our method is to use NetStream.publish() to publish a stream and use send("callbackname", data) to all subscribers that have joined a NetGroup, so we can use P2P transmission. Now here are our questions:
    1. We know from MAX 2009 that Flash P2P camera video multicasting implements a pull-push mechanism internally which makes full use of P2P features like buffer map exchange. If we use the NetStream.send() API to send non-webcam source data, will it also make use of this pull-push mechanism to spread data among all peers (subscribers) with proper data exchange?
    Or, in more detail:
    I saw the P2P Gaming Libs at flashrealtime.com. They use DIRECT_CONNECTIONS when creating the publishing NetStream in order to send data with the lowest latency. My question is: if I do not use DIRECT_CONNECTIONS, then when the NetGroup grows large (1000), will those peers that are not direct neighbors of the publisher in the same group relay data to each other using pull-push via the buffer map?
    2. I have written a sample app and use send() to deliver data only (NetStream.bufferTime = 0), without camera video or audio. When the publisher sends the data in a "for()" statement 20 times, the subscribers only receive about 8 of them. But when I set a timer and send them periodically (e.g., every 500 ms), the subscribers receive them correctly. My questions are: Is the packet loss caused by UDP unreliability? Can it be solved by setting NetStream.dataReliable = true? Can it be totally solved by setting a timer? How should I set the delay property of the timer? Are all data sent received in order?
    I would be very appreciative if you could answer my questions. Thanks.

    1. NetStream.send() data is delivered in the order it was sent, whether in client-server, DIRECT_CONNECTIONS, or multicast mode.
    2. the lowest latency will be achieved with DIRECT_CONNECTIONS.  however, that isn't scalable to large numbers of recipients.  the P2P multicast system trades low latency for scalability.
    3. there are no plans to support a natural NetStream streaming source that's not a camera/microphone.  Flash Media Server can perform that function.
    4. you should never need to implement "buffer map exchange"; the object replication system handles all the communication needed to accomplish its function. typically you will divide a file into segments, giving each one an index number. object replication uses ranges internally, so the index numbers should be contiguous.  nodes that have indices use NetGroup.addHaveObjects().  nodes that want indices they don't have use NetGroup.addWantObjects().  nodes that have objects will send them to nodes that want the objects.  when a node receives an object, it's automatically removed from its "want" set; once the node decides the object is acceptable, it adds it to its "have" set, at which point it can answer requests from other nodes who still "want" it.  eventually every node will have all the pieces and not want any.  for an example, please see Tom Krcha's articles on using object replication:
       http://www.flashrealtime.com/file-share-object-replication-flash-p2p/
       http://www.flashrealtime.com/video-on-demand-over-p2p-in-flash-player-101-with-object-replication/
    5. please see Tom Krcha's articles.  for VOD you will probably want to use the "LOWEST_FIRST" replication strategy, as that may allow you to start playing earlier.  for general file replication you'll get better sharing if you use RAREST_FIRST.
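    As a minimal sketch of the have/want flow described in point 4 (assuming a NetGroup named group that is already connected, a file already split into n numbered segments, and a placeholder store() function for persisting them; untested):

    ```actionscript
    // a downloading node asks peers for every segment it is missing
    group.replicationStrategy = NetGroupReplicationStrategy.LOWEST_FIRST; // good for VOD playback
    group.addWantObjects(0, n - 1);
    // a seeding node instead announces what it already has:
    // group.addHaveObjects(0, n - 1);

    group.addEventListener(NetStatusEvent.NET_STATUS, function (e:NetStatusEvent):void {
        if (e.info.code == "NetGroup.Replication.Fetch.Result") {
            store(e.info.index, e.info.object);               // keep the received segment
            group.addHaveObjects(e.info.index, e.info.index); // now serve it to other nodes
        }
    });
    ```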
