Protecting Live Stream Encoding URL

We set up a Flash Media Server and are trying to figure out how to protect our live encoding URL. It seems that anyone with Flash Media Encoder who knows the location of our server could publish a live stream. Can we password-protect this in any way?

Asa,
Are you able to share details of what the solution being developed by Adobe will involve, and which versions of FMS it will apply to (an update for 3.0, 3.5, or just a feature of 4.0)?
Thanks,
Michael

Similar Messages

  • Flash live streaming encoder question

    Does Flash 8 Pro encode STREAMING video as VP6 or Sorenson?
    Non-live streaming is VP6, but there seems to be no conclusive answer as to whether live streaming uses the same encoder.
    I guess not; otherwise something like On2 Flix Live would have no reason to exist if you already own Flash Pro...

    The answer was to go back to Wowza.  Adobe dropped the ball.
    RT

  • Webcam live stream encoding

    I am developing a video streaming application using Flex (AS 3.0) and Flash Media Server. I would like to know whether it is possible to encode the video from the web camera on the fly (using H.264, VP7, or Sorenson Spark) and stream it to FMS without using Flash Media Live Encoder.
    Is it possible?
    Please help...
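
    For reference, a minimal AS3 sketch (not from the original thread) of publishing the webcam directly from a SWF without FMLE is shown below. Assumptions: the FMS URL and stream name are placeholders, and the Player encodes the camera with its built-in codec (Sorenson Spark/H.263 in this era; on-the-fly H.264 or VP7 encoding was not available in the browser player).

    import flash.events.NetStatusEvent;
    import flash.media.Camera;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onConnectionStatus);
    nc.connect("rtmp://yourserver/live");           // placeholder FMS URL

    function onConnectionStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var cam:Camera = Camera.getCamera();
            var ns:NetStream = new NetStream(nc);
            ns.attachCamera(cam);                   // Player's built-in camera encoding
            ns.publish("webcamStream", "live");     // publish live, no server-side recording
        }
    }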


  • Protecting live encoding server

    We set up a Flash Media Server and are trying to figure out how to protect our live encoding URL. It seems that anyone with Flash Media Encoder who knows the location of our server could publish a live stream. Can we password-protect this in any way?

    You can install the Authentication Add-in on FMS and allow only authorized FMLE clients to publish. Visit the FMLE install page on adobe.com to download the add-in, and refer to the FMLE help file.
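
    If your edition can also run custom server-side scripts (the add-in route above requires none), one alternative sketch is to leave playback open but grant publish rights only to connections that present a shared secret in the connect URL. This is an illustration only; the parameter name and token are placeholders, and it assumes the query string is visible to the script via client.uri.

    // main.asc -- hypothetical sketch: viewers may play, only tokened clients may publish
    application.onConnect = function(client) {
        client.readAccess = "/";                        // anyone may play streams
        if (client.uri.indexOf("pub=SECRET") == -1) {   // placeholder token check
            client.writeAccess = "";                    // no publishing rights without it
        }
        application.acceptConnection(client);
    };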

  • User Auth. for Live Streaming with Media Encoder 2.5 and FMSS

    Hi,
    I am currently thinking of buying the Flash Media STREAMING Server (~995 $), and to decide whether this is the right choice I am also looking at the Wowza Media Server for comparison.
    I want to support live streaming from events with Flash Media Encoder 2.5, and for this I need user authentication so that only designated publishing users get access. Wowza Media Server supports user authentication by parsing the connecting URL, which the Media Encoder supplies when it starts a connection, and on that basis grants or denies access to the publishing function. Only users with the right access permissions, passed via a password and the onPublish command in the URL, can publish live streams. Everyone can watch the live stream if no restriction was placed on the onConnect command.
    The problem in this case is that I would need separate application folders for every user who should be able to start a live stream. That would be a horrible effort...
    more info ==>
    http://www.wowzamedia.com/forums/showthread.php?t=1281
    Is there a similar way to support authorization for starting to publish a live stream to the Flash Media Streaming Server, or do I need the Flash Media Interactive Server?
    How can I restrict access for users who should not be able to start a live stream on my FMSS?
    Thx for your help,
    Tobi

    quote:
    Is there a similar way to support authorization for starting to publish a live stream to the Flash Media Streaming Server, or do I need the Flash Media Interactive Server?
    Unfortunately no, there isn't. I struggled with the same issue earlier, and we were forced to purchase the Interactive edition. As the 'Streaming' version only runs Adobe-signed apps, there is no way to change what happens within the onConnect handler. Furthermore, the folks at Adobe haven't provided even simple access control through configuration files for the built-in apps that come with the Streaming Server edition.
    - Jakki
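
    For illustration only (the conclusion above stands: the Streaming edition cannot run custom scripts, so this needs the Interactive edition), a Wowza-style URL-credential check in main.asc could look like the sketch below. The parameter name, the secret, and the assumption that the query string is exposed through client.uri are all placeholders.

    // main.asc -- hypothetical sketch: accept a connection only if the connect URL
    // carries the expected credential, e.g. rtmp://myserver/live?pubtoken=SECRET
    application.onConnect = function(client) {
        var token = "";
        var q = client.uri.indexOf("?");
        if (q != -1) {
            var pairs = client.uri.substr(q + 1).split("&");
            for (var i = 0; i < pairs.length; i++) {
                var kv = pairs[i].split("=");
                if (kv[0] == "pubtoken") token = kv[1];  // placeholder parameter name
            }
        }
        if (token == "SECRET") {                         // compare against your real secret
            application.acceptConnection(client);
        } else {
            application.rejectConnection(client);
        }
    };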

  • Flash media live encoder, play live streaming failed!

    Hi, I installed FMS 3.5 and FMLE 3.1 on my computer. In FMLE, I set the video format to H.264, the FMS URL to rtmp://192.168.8.6/live, and the stream name to livestream. The video in the input and output windows looks fine.
    The following is fmle log:
    Tue Jun 22 2010 12:25:19 : Primary - Connected
    Tue Jun 22 2010 12:25:19 : Primary - Network Command: onBWDone
    Tue Jun 22 2010 12:25:19 : Primary - Stream[11] Status: Success
    Tue Jun 22 2010 12:25:19 : Primary - Network Command: onFCPublish
    Tue Jun 22 2010 12:25:19 : Primary - Stream[11] Status: NetStream.Publish.Start
    Tue Jun 22 2010 12:25:19 : Session Started
    Tue Jun 22 2010 12:25:19 : Video Encoding Started
    The following is my source code to play the live stream:
    <?xml version="1.0" encoding="utf-8"?>
    <mx:Application xmlns:mx="http://www.adobe.com/2006/mxml" layout="absolute">
        <mx:Script>
            <![CDATA[
             import mx.controls.Alert;
             private function showFlv():void {
                //myVideo.source = "rtmp://192.168.8.6:1935/vod/test.flv";
                myVideo.source = "rtmp://192.168.8.6:1935/live/livestream";
                Dumper.info(myVideo.source); // Dumper is the poster's own logging helper
             }
            ]]>
        </mx:Script>
        <mx:Panel width="100%" height="100%">
          <mx:VBox width="100%" horizontalAlign="center">
            <mx:Text text="code display"/>
            <mx:Button label="play flv" click="showFlv();"/>
          </mx:VBox>
          <mx:VideoDisplay width="100%" height="100%" id="myVideo"/>
        </mx:Panel>
    </mx:Application>
    If I use myVideo.source = "rtmp://192.168.8.6:1935/vod/test.flv", it can play the video. With myVideo.source = "rtmp://192.168.8.6:1935/live/livestream", I can't get any video. Why?
    I just want to play live video on a PC.

    Who can help me??
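
    One thing worth checking (an assumption, not a confirmed answer from this thread): mx:VideoDisplay treats its source as a recorded file unless it is told the source is live, so for an RTMP live stream the live flag should be set before assigning the source.

    private function showFlv():void {
        myVideo.live = true;   // hypothetical fix: mark the source as a live stream
        myVideo.source = "rtmp://192.168.8.6:1935/live/livestream";
    }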

  • Problem relaying live stream using different urls

    Hi,
    I have setup QT Broadcaster to announce a stream to my qtss.
    I announced the stream to a directory live/stream.sdp
    This works without a hitch when using the same URL for viewing, e.g.
    rtsp://rtsp.myqttss.com/live/stream.sdp
    If I create a symlink (or simply copy the sdp file) from live/stream.sdp to test/stream.sdp and try to access the stream through
    rtsp://rtsp.myqttss.com/test/stream.sdp
    it does not work anymore. QT Viewer will connect, but does not play the stream.
    I am wondering why this is. Is the mountpoint relevant for accessing the stream? All necessary information should be contained in the sdp file.
    The reason I would like to do this is that we have a payment module for QTSS that uses a 'virtual' address to dispatch a stream to the client. This works perfectly for non-live content.
    Thanks
    NikWest

    Ok.
    No problem now.
    Thanks.

  • FME encoding live streams

    Hi there
    I recently read the "Stream Live Video with Flash Media Encoder.pdf" that comes with the FMS 3 Interactive server. Since I have been looking for a video encoding tool for a long time, one paragraph in the above-mentioned PDF got all my attention:
    "Flash Media Encoder 2 can also be tightly integrated into your streaming pipeline with command-line control both locally and through a remote connection."
    I'm not sure I understood correctly, but if I'm not wrong that means I can plug FME into my existing streams and encode them before they reach the end user.
    I, for instance, am using a one-to-many live streaming platform, where the publisher streams his/her webcam video to FMS and a virtually unlimited number of subscribers can watch it.
    Would FME be able to support this kind of implementation, so I can have the video encoded at any level before the subscribers actually see it (without having to deploy a desktop application that the publisher needs to use)?
    Looking forward to hearing your opinions.
    Regards
    Andy

    Hi
    Thank you for your answer. It might not be what I wanted to hear. However, I was looking through On2's product list and tried to contact them about a solution like the one I explained in the first post. Unfortunately they have not answered (yet).
    Does anyone know whether any of the On2 products can be used to encode the live stream before it is published and made available to the subscribers?
    Looking forward to hearing from someone ... anyone.
    Regards
    Andy

  • Encode archive of live stream on FMS 4

    From what I had initially read I thought this was possible, but I am not sure.
    What we would like to do is have the FMS 4 server create an archive (F4V or MP4) of any live stream that is pushed from our endpoints, primarily as a backup and, if necessary, as an editable file within CS4 or CS5.
    We are at a university and capture lecture PowerPoints natively, and we sometimes need to edit the files. FLV is not enough, and we would like to take advantage of MP4 and the FMS server while we use it for overflow as a recorder.
    Can anyone clarify if this can be done?
    Thanks
    -Brian

    Recording can be triggered via a client-side publish call or via a server-side script. From the publish command, you specify the "record" option. For example,
    ns.publish("mp4:foo", "record"); // ns is a NetStream; the "mp4:" prefix tells FMS to record using the MP4 container
    Otherwise, if you want to trigger the recording from the server side as Jay says, you can simply publish the stream as
    ns.publish("mp4:foo");
    and then, in server-side ActionScript, do something like:
    // this handler is called when a stream is published
    // clientObj is the client that is publishing the stream
    // streamObj is the stream that is being published
    application.onPublish = function(clientObj, streamObj) {
        streamObj.record("record"); // start recording the stream
    };

  • Configure Live Streaming FLV Encoding -- Adobe -- End user

    Hello,
    While looking on the web, there are no clear-cut instructions on how to configure Adobe Media Server for live streaming.
    What I would like to do is take a stream coming from a live FLV streaming source and have Adobe Media Server configured so that users can connect to the media manager and watch the live stream.
    Could someone provide directions on this?
    Thank you,
    Scott

    Hi,
    This article has information to help you get started with live streaming on FMS, http://www.adobe.com/devnet/flashmediaserver/articles/beginner_live_fms3.html .
    Regards
    Mamata

  • Flex mobile live streaming

    Hi All,
      I am trying to display a live stream from an RTMP server. Below is my code.
    <?xml version="1.0" encoding="utf-8"?>
    <s:Group height="100%" width="100%"
             xmlns:fx="http://ns.adobe.com/mxml/2009"
             xmlns:s="library://ns.adobe.com/flex/spark"
             xmlns:mx="library://ns.adobe.com/flex/mx">
        <fx:Script>
            <![CDATA[
                import mx.core.FlexGlobals;
                private const URL:String    = "rtmp://eigr2.flashlive.bigCDN.com/20175D";//"http://www.helpexamples.com/flash/video/cuepoints.flv";
                private const STREAM_KEY:String = "eigr8";
                private var _video:Video = new Video();
                private var _nc:NetConnection;
                private var _custClient:Object;
                private var _ns:NetStream;
                private var _streamKey:String;
                public function playVideo():void {
                    var url:String = URL;
                    if (videoHandler.numChildren == 0) {
                        _nc = new NetConnection();
                        _nc.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler);
                        _nc.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
                        _nc.connect(URL);
                    }
                }

                private function netStatusHandler(event:NetStatusEvent):void {
                    trace("NC " + event.info.code);
                    switch (event.info.code) {
                        case "NetConnection.Connect.Success":
                            _video = new Video();
                            _video.smoothing = true;
                            var custClient:Object = new Object();
                            custClient.onMetaData = metaDataHandler;
                            _ns = new NetStream(_nc);
                            _ns.client = custClient;
                            _ns.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
                            _ns.addEventListener(DRMErrorEvent.DRM_ERROR, drmerror);
                            _ns.addEventListener(DRMStatusEvent.DRM_STATUS, drmStatus);
                            _ns.addEventListener(StatusEvent.STATUS, status);
                            _ns.addEventListener(NetStatusEvent.NET_STATUS, target);
                            _ns.bufferTime = 2;
                            _ns.receiveVideo(true);
                            var net:NetStreamPlayOptions = new NetStreamPlayOptions();
                            //net.len = 1;
                            //net.transition = NetStreamPlayTransitions.APPEND;
                            net.streamName = STREAM_KEY;
                            _ns.play(STREAM_KEY);
                            _video.attachNetStream(_ns);
                            videoHandler.addChild(_video);
                            videoHandler.visible = true;
                            //background.visible = true;
                            break;
                        case "NetConnection.Connect.Failed":
                            //shownoVideoText();
                            break;
                        case "NetStream.Play.StreamNotFound":
                            trace("Unable to locate video: ");
                            break;
                    }
                }

                private function status(event:StatusEvent):void {
                    trace("Status Event : " + event.toString());
                }

                private function drmerror(event:DRMErrorEvent):void {
                    trace("DRM Error : " + event.toString());
                }

                private function drmStatus(event:DRMStatusEvent):void {
                    trace("DRM Status : " + event.toString());
                }

                private function videoStates(event:Event):void {
                    trace(event.toString());
                }

                public function stopVideo():void {
                    _nc.close();
                    _nc = null;
                    _ns = null;
                    this.videoHandler.removeChild(_video);
                }

                private function asyncErrorHandler(event:AsyncErrorEvent):void {
                    trace("Async error");
                }

                private function target(e:NetStatusEvent):void {
                    trace("Started playing : " + e.info.code);
                }

                private function metaDataHandler(infoObject:Object):void {
                    // setVideoSize is assumed to be defined elsewhere in the poster's project
                    this.setVideoSize(infoObject.height, infoObject.width);
                }

                /* Earlier version, kept for reference:
                public function playVideo():void {
                    custClient = new Object();
                    custClient.onMetaData = metaDataHandler;
                    nc = new NetConnection();
                    nc.connect(null);
                    nc.client = { onBWDone: function():void{} };
                    //nc.addEventListener(NetStatusEvent.NET_STATUS, target);
                    nc.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
                    ns = new DynamicCustomNetStream(nc);
                    ns.client = custClient;
                    ns.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
                    ns.addEventListener(NetStatusEvent.NET_STATUS, target);
                    ns.play(URL);
                    videoPlayer.attachNetStream(ns);
                    videoPlayer.smoothing = true;
                    if (!videoHandler.contains(videoPlayer)) {
                        videoHandler.addChild(videoPlayer);
                    }
                }
                private function metaDataHandler(infoObject:Object):void {
                    this.setVideoSize(infoObject.height, infoObject.width);
                }
                */

                override public function set currentState(value:String):void {
                    super.currentState = value;
                    this.videoHandler.percentHeight = value == "fullScreen" ? 100 : 50;
                }

                public function onFCSubscribe(info:Object):void {
                    trace("onFCSubscribe - successful");
                }

                protected function goBack():void {
                    FlexGlobals.topLevelApplication.showVideo(false);
                }
            ]]>
        </fx:Script>
        <fx:Declarations>
            <!-- Place non-visual elements (e.g., services, value objects) here -->
        </fx:Declarations>
        <s:Button click="goBack()"
                  horizontalCenter="0"
                  label="Go Back" bottom="50"/>
        <mx:UIComponent width="100%"
                        height="50%"
                        top="0" left="0"
                        id="videoHandler"/>
    </s:Group>
    When I try to run this, I can connect to the server successfully but I do not get the data. If I open the URL in a browser, PublishNotify gets dispatched and I get the stream; as soon as I close it, UnpublishNotify gets dispatched and I no longer get the stream. Can someone tell me the possible reasons and a solution?

    Unless I am mistaken here, you must implement Apple's HTTP Live Streaming protocol for streams over ten minutes in length. There are a few options you can take:
    1. Split your streams into 9-minute segments and feed them consecutively in groups. (Apple actually offers a segmenting tool that even generates an index file, via their Developer tools.)
    2. Only enable full-length streaming over WiFi.
    3. Fully implement HTTP Live Streaming....
    Also, do not forget that you will not only have to implement HTTP Live Streaming, you will ALSO have to include an audio-only stream.

  • RTMP Live Streaming Works, HTTP HLS/HDS Streaming Doesn't

    I am running Flash Media Server on my Windows 2008 R2 server and am trying to get HTTP live streaming working. I am using Flash Media Live Encoder to stream to the server. I can get a basic RTMP stream to work without issue; however, any time I set up a basic HTTP live stream, I get the generic "We are having problems with playback. We apologize for the inconvenience." in the basic Flash Media Playback player. I can confirm in the Admin Console that the stream is being published to the server through the livepkgr application successfully, and I can even get an RTMP stream to work when it is streaming to the livepkgr application.
    Here are the FMLE settings I am using to publish the stream:
    Stream URL: rtmp://184.69.238.58/livepkgr
    (yes, that's my IP; the stream is live continuously, test for yourself!)
    Stream Name: livestream?adbe-live-event=liveevent
    My embedded HTML for viewing the stream is as follows:
    <object width="640" height="480"> <param name="movie" value="http://fpdownload.adobe.com/strobe/FlashMediaPlayback_101.swf"> </param> <param name="flashvars" value="src=http://<myipgoeshere>/livepkgr/liveevent/livestream.f4m"></param> <param name="allowFullScreen" value="true"></param> <param name="allowscriptaccess" value="always"></param>  <embed src="http://fpdownload.adobe.com/strobe/FlashMediaPlayback_101.swf" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="700" height="500" flashvars="src=http://<myipgoeshere>/livepkgr/liveevent/livestream.f4m">
    While this gives me the error, the RTMP version (publishing to the livepkgr) works fine like this:
    <object width="640" height="480"> <param name="movie" value="http://fpdownload.adobe.com/strobe/FlashMediaPlayback_101.swf"> </param> <param name="flashvars" value="src=rtmp://<myipgoeshere>/livepkgr/livestream"> </param> <param name="allowFullScreen" value="true"></param> <param name="allowscriptaccess" value="always"></param>  <embed src="http://fpdownload.adobe.com/strobe/FlashMediaPlayback_101.swf" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="700" height="500" flashvars="src=rtmp://<myipgoeshere>/livepkgr/livestream"> </embed>  </object>
    Can anyone explain to me what is happening here? I need HTTP streaming so I can eventually handle multi-bitrate streaming. Is there any further configuration beyond the standard "out of the box" FMS installation that I need to consider on my server? There are no firewall issues at play here, as all ports for standard streaming are open.

    I had about 2 hours of downtime last night to move my server. Try again? It will be live all day.
    Dustin Rogers
    [email protected]
    780.293.6632

  • HDS live streaming to Flash not working

    Adobe Flash Media Server 4.5.5 r4013
    Windows 2008
    Sources:
    http://help.adobe.com/en_US/flashmediaserver/devguide/WSd391de4d9c7bd609-52e437a812a3725dfa0-8000.html
    http://www.adobe.com/devnet/adobe-media-server/articles/live-multi-bitrate-video-http-flash-ios.html
    Live streaming a single or multi-bitrate video over HTTP to Flash does not work. I have followed the instructions on the 2 sources listed above repeatedly, but I can’t get live streaming over HTTP to Flash to work. Live streaming to iOS over HTTP works with no problems (single and multi-bitrate streams).
    I have tried the troubleshooting steps from the following:
    http://help.adobe.com/en_US/flashmediaserver/devguide/WS0432746db30523c21e63e3d12efac195bd-8000.html
    Troubleshoot live streaming (HTTP)
    1.      Services window (Windows): Flash Media Server (FMS), Flash Media Administration Server, and FMSHttpd services are running. ✓
    2.      Verified that the request URL is correct. ✓
    3.      Configured ports:
    a.      Configure Apache to use port 80. Open rootinstall/Apache2.2/conf/httpd.conf in a text editor. Change the line Listen 8134 to Listen 80.
    b.     Configure Flash Media Server not to use port 80. Open rootinstall/conf/fms.ini in a text editor. Remove 80 from the ADAPTOR.HOSTPORT parameter so the parameter looks like the following: ADAPTOR.HOSTPORT = :1935 ✓
    4.      Placed a crossdomain.xml file to the rootinstall/webroot directory. ✓
    5.      In Flash Media Live Encoder, select the Encoding Options tab, choose Output from the Panel options menu, and verify the following:
    a) The value of FMS URL is rtmp://fms-dns-or-ip/livepkgr. If you’re testing on the same server as Flash Media Server, you can use the value localhost for fms-dns-or-ip. ✓
    b) For a single stream, the value of Stream is livestream?adbe-live-event=liveevent. ✓
    c) For adaptive bitrate streaming, the value of Stream is livestream%i?adbe-live-event=liveevent. ✓
    Flash Media Live Encoder uses this value to create unique stream names. To use another encoder, provide your own unique stream names, for example, livestream1?adbe-live-event=liveevent, livestream2?adbe-live-event=liveevent.
    The encoder is showing all 3 streams being published and streaming.
    6. Check Administration Console: the livepkgr application and the 3 streams are running. ✓
    7. Check the logs for errors. Flash Media Server logs are located in the rootinstall/logs folder. The master.xx.log file and the core.xx.log file show startup failures. Apache logs are located in the rootinstall/Apache2.2/logs folder. X
    a)   core00.log: these errors did not occur every time that I tried playing the live stream but these are the only relevant errors in the logs.
    1. 7968 (w)2611179     Warning from libf4f.dll: [Utils] [livestream2] Discarded all queued Media Messages received before first Video Keyframe Message
    2. 7968 (w)2611179     Warning from libf4f.dll: [Utils] [livestream3] Discarded all queued Media Messages received before first Video Keyframe Message
    b) edge00.log:
    13:33:57 4492          (w)2641213 Connection rejected by server. Reason : [ Server.Reject ] : (_defaultRoot_, _defaultVHost_) : Application (hds-live) is not defined.          -
    c) Apache-Error:
    1.     [warn]  Checking if stream is disabled but bootstrap path in event file is empty for event:livepkgr/events/_definst_/liveevent stream name:livestream
    2.     [warn] bootstrap path is in event file is empty for event:livepkgr/events/_definst_/liveevent stream name:livestream1
    As I mentioned, everything works on iOS and FMS seems to be creating all of the stream segments and meta files:
    a.     The 3 streams are being created in: HD:\Program Files\Adobe\Flash Media Server 4.5\applications\livepkgr\streams\_definst_
    b.    FMS is creating the following files in each stream folder (livestream1, livestream2, livestream 3):
    1. livestream1.bootstrap
    2. livestream1.control
    3. livestream1.meta
    4. .f4f segments
    5. .f4x segments
    The appropriate files are also being created in the HD:\Program Files\Adobe\Flash Media Server 4.5\applications\livepkgr\events\_definst_\liveevent folder, in which I have the following Manifest.xml and Event.xml files:
    <manifest xmlns="http://ns.adobe.com/f4m/1.0">
      <media streamId="livestream1" bitrate="200" />
      <media streamId="livestream2" bitrate="500" />
      <media streamId="livestream3" bitrate="1000" />
    </manifest>
    <Event>
      <EventID>liveevent</EventID>
      <Recording>
        <FragmentDuration>4000</FragmentDuration>
        <SegmentDuration>16000</SegmentDuration>
        <DiskManagementDuration>3</DiskManagementDuration>
      </Recording>
    </Event>
    I’ve tried clearing the contents of both streams\_definst_ and events\_definst_\liveevent (keeping the xml files) after restarting the encoder, and creating a different event definst for the streams (liveevent2 for example).
    We have an event in 2 weeks that we would like to stream to both Flash and iOS. Any help in solving this problem will be greatly appreciated.

    One step closer:
    Changed the crossdomain.xml file (more permissive settings).
    Changed the encoding on FMLE to VP6. It is working somewhat (I don't know what I did to make it start streaming through HDS).
    But at least now I can get the individual streams in the set manifest file to work:
    http://localhost/hds-live/livepkgr/_definst_/livevent/livestream1.f4m
    http://localhost/hds-live/livepkgr/_definst_/livevent/livestream2.f4m
    http://localhost/hds-live/livepkgr/_definst_/livevent/livestream3.f4m
    BUT when I try to play the streams through the set-level manifest file from http://localhost/liveevent.f4m I get the following error:
    "The F4M document contains errors: URL missing from Media tag." I'll search the forums to see if anyone else has come across this problem.
    I used the F4M configuration tool to make the file. These are the file's contents:
    <manifest xmlns="http://ns.adobe.com/f4m/2.0">
      <baseURL>http://localhost/hds-live/livepkgr/_definst_/liveevent/</baseURL>
      <media href="livestream1.f4m " bitrate="200"/>
      <media href="livestream2.f4m " bitrate="500"/>
      <media href="livestream3.f4m " bitrate="1000"/>
    </manifest>
    Thanks
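
    One detail worth double-checking (an observation about the pasted file above, not a confirmed fix): each href value ends with a trailing space before the closing quote, which could plausibly make the parser see the media URL as missing. A cleaned-up set-level manifest would look like this:

    <manifest xmlns="http://ns.adobe.com/f4m/2.0">
      <baseURL>http://localhost/hds-live/livepkgr/_definst_/liveevent/</baseURL>
      <media href="livestream1.f4m" bitrate="200"/>
      <media href="livestream2.f4m" bitrate="500"/>
      <media href="livestream3.f4m" bitrate="1000"/>
    </manifest>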

  • How to use Phone camera for live streaming rather than using WireCast camera in Live streaming using Azure Media Services

    Hi,
    I am planning to develop a live streaming app using C# and XAML.
    I have gone through the sample below and am able to stream live content using a Wirecast camera.
    Azure Media Services Live Streaming Sample
    But I want to use the phone camera to stream the live video rather than a Wirecast camera.
    How can I do that? How do I get the published URL from the C# code; is there an API for that?
    Anybody please help me.
    Regards,
    Santhosh

    Hello,
    There are no in-box components that will allow you to transmit captured video from a Windows Phone device to Azure Media Services. You will need to create your own implementation. You can do this by creating a custom Media Foundation sink that can plug into the MediaCapture element's pipeline. You will then need to modify the captured and compressed video to align with one of the two protocols that Azure Media Services can use. At this time Azure Media Services can use either Smooth Streaming or RTMP for ingestion.
    You will need to create your own packager (or find compatible packager code from a 3rd party) to package the MP4-encoded data into the necessary format.
    Here is an example of how to create a custom Media Foundation sink (please note that you must write your Media Foundation components in C++ and COM; we do not support Media Foundation components written in managed languages):
    Real-time communication sample
    I hope this helps,
    James
    Windows SDK Technologies - Microsoft Developer Services - http://blogs.msdn.com/mediasdkstuff/

  • Jerkiness observed when live video is encoded in FMLE

    Hi All,
    I am trying to encode a live stream using FMLE 3.2 on a Mac mini.
    The input is 1.5 Mbps at 640x480. It comes in through a Canopus converter and FireWire to the Mac mini.
    I have set my bitrates to
    1. 800 kbps H.264 Main @3.1, resolution 640x480
    2. 500 kbps H.264 Main @3.1, resolution 480x360
    3. 300 kbps H.264 Main @3.1, resolution 320x240
    My FMS parameters are: FMS URL rtmp://localhost/live, stream name livestream%i
    The live video encodes fine when I have only one stream encoding, but jerkiness appears when I have all 3 bitrates enabled on the encoder. What could be the issue?
    I thought I could use FMLE for adaptive streaming.
    I will be using 3 different streams for PC and mobile. Please let me know if I have any issues on the encoder side.
    Any pointers would be useful.
    Thanks.

    For VP6 MBR:
    640x480 size or below; 2 or 3 streams at 25 fps: Dual Core (Core 2 Duo 3GHz, 2GB RAM)
    768x576 size; 3 streams at 25 fps: 8 Core (Xeon 3GHz, 3.25 GB RAM)
    Input stream 640x480 @25 fps; 3 output streams: 176x144 @25 fps_150 kbps, 320x240 @25 fps_300 kbps and 640x480 @25 fps_700 kbps: Dual Core 3GHz, 2GB RAM
    For H264 MBR:
    320x240 size ( Main or base profile); 3 streams at 25 fps: Quad Core Xeon 2GHz, 4GB RAM.
    640x480 size ( Main or base profile); 2 streams at 25 fps: 8 core Xeon 3GHz, 3.25 GB RAM.
    768x576 size ( Main or base profile); 2 streams at 25 fps: 8 core Xeon 3GHz, 3.25 GB RAM.

Maybe you are looking for

  • Purchase ledger and General Ledger account codes

    I am looking for information about how Oracle Financials works in respect of "codes" if that is what they are called within the Module. I am an accountant as well as a developer, and I have equivalent information about a number of other accounting pa

  • Unable to create DC using software component

    Hi, I have created Software component, and later downloaded track into my NWDS but while I am trying to create DC using above Software Component which is in gray out. The software component does not support the selected development component type. At

  • Image source on network

    Hi, I can't use images which are not in the ressource folder ... And i need to access images from network, with a link like: \\Axpc2\doc\xx.jpg How can i do this ? Regards, Julien

  • Help in finding lens profile

    I looked in adobe lens profile creator and was unable and could not find a profile for Nikon D90 camera and Nikkor 300mm prime lens ! Where else to look and find what I'm looking 4 ?

  • NAC appliance(security policy/update-files)

    Does anyone know something concerning to the following issues? Please teach me what I can refer to on the WEB,if possible. 1. Is there any way to apply the policy(checking OS/AV) to the kind of client devices which CAA hadn't been installed such like