Debugging live streaming ingest protocol

Hi everyone,
I posted a separate question last week about live streaming protocols. I had been searching for documentation on fMP4 live streaming ingestion and found nothing, but afterward I found the Smooth Streaming protocol specification (MS-SSTR; the forum won't
let me link to it directly), which seems to apply to my use case. I've now spent a few days trying to make it work and so far haven't been able to.
I'm not looking for a canned solution; I'm looking for resources to help me debug the problem.
According to the specification, I should be able to make a long-lived, chunked HTTP POST request to my ingest endpoint, sending the fragmented MP4 data as the request body. I've tried to implement this without success: I open the request stream and
write 1000-byte chunks at a time, but the connection is reset after only one or two chunks. I can't tell whether the problem is the data, the chunk size, or something else.
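For reference, here's a stripped-down version of what my sender does (the ingest URL and file name are placeholders):

    // Minimal version of my sender: one long-lived chunked POST, writing the
    // fragmented MP4 in 1000-byte pieces. The ingest URL and file name are placeholders.
    using System;
    using System.IO;
    using System.Net;

    class ChunkedIngestTest
    {
        static void Main()
        {
            var request = (HttpWebRequest)WebRequest.Create(
                "http://mychannel.channel.mediaservices.windows.net/ingest.isml/streams(video)");
            request.Method = "POST";
            request.SendChunked = true;                 // Transfer-Encoding: chunked
            request.AllowWriteStreamBuffering = false;  // don't buffer the whole body locally
            request.KeepAlive = true;

            using (var body = request.GetRequestStream())
            using (var source = File.OpenRead("output.ismv"))
            {
                var buffer = new byte[1000];
                int read;
                while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
                {
                    body.Write(buffer, 0, read);        // roughly one chunk per write
                    body.Flush();
                }
            }

            // Usually the connection has been reset long before this point.
            using (var response = (HttpWebResponse)request.GetResponse())
            {
                Console.WriteLine(response.StatusCode);
            }
        }
    }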
And if I mimic Expression Encoder 4, which seems to diverge from the spec and not send its data chunked at all, the connection isn't reset no matter how much data I send, but the video never makes it to the preview stream on the Azure portal.
So the question is, how can I debug this? Is there any error log or diagnostic output I can access? Each time I try something new, it doesn't work, but I have no way of knowing whether I am getting closer to or further from a working solution. I'm a bit
lost here, so help would be greatly appreciated.

Hi Jason, and thanks for getting back to me.
The IP security setting is a good thought, but I think I have that part taken care of already. I saw the option in the Portal to restrict ingest to the IP address the channel was configured from, and I made sure it was set to Off. Also, I can stream successfully
from Expression, which would have the same source IP.
As for why I want to use Smooth Streaming, there are basically two reasons. First, it seems like the conceptually simpler protocol. I'm already generating fragmented MP4 files (though I think they might be packaged wrong; I can't really tell which part is
causing problems), and it seems simple enough that I could implement it without turning to a third-party library (the part that's feeding the stream is written in C#, though I'd be willing to rewrite it in C++ if necessary, and there's no RTMP client library
that seems trustworthy; FluorineFX, for instance, seems to have ceased development years ago).
Second, while reading the RTMP spec, I realized that even if I had such a library it isn't clear how I would need to use it with Azure. It allows creating objects or invoking methods on the server, but I couldn't find any guide to which objects and methods
might be supported. And the spec also doesn't define specifically what video format, codec options, or packaging method I need to use; essentially I would seem to have all the same problems I'm having with Smooth, only with the added difficulty of needing
a third-party library to implement the spec properly.
I guess the main problem is that I don't really know how close or far I am from success. Like I said, I think I am doing something wrong in how the video is encoded, but for all I know I'm being dropped because I'm feeding the data too quickly
or too slowly. (I can't find any documentation on how much data Azure can buffer, or on how it handles underruns when the video is generated on the fly rather than tied to strictly realtime output.) If there's more documentation on how specifically
Azure uses or expects data from these protocols, it'd be very helpful.
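In case it's relevant to the packaging concern I mentioned, this is the quick sanity check I run over the fragments I generate (the file name is just an example); it just lists the top-level boxes so I can eyeball whether the ftyp/moov header and the moof/mdat pairs are where I expect them:

    // Walk the top-level MP4 boxes of a fragment file and print them.
    // (File name is just an example; this is only a rough structural check.)
    using System;
    using System.IO;
    using System.Text;

    class BoxLister
    {
        static void Main()
        {
            using (var fs = File.OpenRead("fragment.ismv"))
            using (var reader = new BinaryReader(fs))
            {
                while (fs.Position + 8 <= fs.Length)
                {
                    // Each box starts with a 32-bit big-endian size and a 4-character type.
                    var sizeBytes = reader.ReadBytes(4);
                    Array.Reverse(sizeBytes);              // big-endian box size
                    uint size = BitConverter.ToUInt32(sizeBytes, 0);
                    string type = Encoding.ASCII.GetString(reader.ReadBytes(4));
                    Console.WriteLine("{0} ({1} bytes)", type, size);
                    if (size < 8) break;                   // size 0/1 (extended) or malformed: stop the simple check
                    fs.Seek(size - 8, SeekOrigin.Current); // skip to the next box
                }
            }
        }
    }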

Similar Messages

  • Publish Live stream to FMS on RTMPE protocol

    Hi,
    I am trying to publish Live Stream to FMS using FMLE or third party encoders.
    Is it possible to have RTMPE or RTMPTE protocol used between encoder and FMS?
    Very urgent.
    thanks,
    Pooja Jain

    FMLE can only publish via rtmp or rtmpt.

  • Suitable protocol to test live streaming applications like ESPN3

    Hi,
    I am using the LoadRunner 11.5 performance testing tool and have a requirement to test live streaming applications. Please suggest which protocol is suitable for this testing, and provide useful links for scripting and testing using LR.
    Regards,
    Sreek666

    Hello Sreek,
    You are looking for the LoadRunner forum; this is just the consumer-level forum.
    Granted, HP does not make it easy to find if you don't know about it.
    You may have to re-register to post questions as it is a separate forum.
    http://h30499.www3.hp.com/t5/LoadRunner-Support-Forum/bd-p/sws-LoadRunner_SF
    and check this one out too...
    http://h30499.www3.hp.com/t5/Performance-Center-Support-and/bd-p/itrc-915
    Good luck.
    If this is helpful, click the thumbs up button.

  • P2P-based live streaming protocol

    In a P2P-based live streaming system, a video stream is divided into segments of uniform length; each segment contains one second of video data. Every peer in the overlay owns a buffer to save segments temporarily. What kind of protocol is appropriate to transmit the one-second video data: UDP or RTP?
    If RTP is used, what kind of processing should be done on video data that is sent to the buffer but not played immediately?

    So you know, there are no Macromedia guys... there are only
    Adobe guys, and I wouldn't expect answers from any of them here.
    This is a user-to-user forum.
    I don't work with mobile devices much, but Flash Lite
    supports the Flash 7 standard, so I would imagine the
    NetConnection and Video classes are in there. If that's the case,
    then you should be able to deliver Sorenson Spark encoded FLV files
    (the Flash 6/7 standard).
    Try here:
    http://www.adobe.com/mobile/

  • How to use Phone camera for live streaming rather than using WireCast camera in Live streaming using Azure Media Services

    Hi,
    I am planning to develop a live streaming app using C# and XAML.
    I have gone through the sample below and am able to stream live content using a WireCast camera:
    Azure Media Services Live Streaming Sample
    But I want to use the phone camera to stream the live video rather than the WireCast camera.
    How can I do that? And how do I get the published URL from the C# code; is there any API to get that?
    Can anybody please help me?
    Regards,
    Santhosh

    Hello,
    There are no in-box components that will allow you to transmit captured video from a Windows Phone device to Azure Media Services. You will need to create your own implementation. You can do this by creating a custom Media Foundation sink that can plug into
    the MediaCapture element's pipeline. You will then need to modify the captured and compressed video to align with one of the two protocols that Azure Media Services can use for ingestion; at this time that is either Smooth Streaming or RTMP.
    You will need to create your own packager (or find compatible packager code from a 3rd party) to package the MP4 encoded data into the necessary format.
    Here is an example of how to create a custom Media Foundation sink (please note you must write your Media Foundation components using C++ and COM; we do not support Media Foundation components written in managed languages):
    Real-time communication sample
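    As a rough illustration of the managed side only (the sink itself must be the C++ Media Foundation component described above), the hand-off from MediaCapture could look something like the sketch below; the sink's activatable class ID and the settings key are placeholders you would replace with your own:

        // Sketch of the managed side only; "MyApp.SmoothIngestSink" and the
        // "IngestUrl" settings key are placeholders for your own C++ sink.
        using System.Threading.Tasks;
        using Windows.Foundation.Collections;
        using Windows.Media.Capture;
        using Windows.Media.MediaProperties;

        public static class CaptureToCustomSink
        {
            public static async Task StartAsync(string ingestUrl)
            {
                var capture = new MediaCapture();
                await capture.InitializeAsync();

                // Compressed H.264/AAC output that the custom sink re-packages
                // into Smooth Streaming (or RTMP) for the Azure Media Services channel.
                var profile = MediaEncodingProfile.CreateMp4(VideoEncodingQuality.HD720p);

                // Settings handed to the custom sink when it is activated.
                var sinkSettings = new PropertySet();
                sinkSettings.Add("IngestUrl", ingestUrl);

                // Route the recording into the custom Media Foundation sink instead of a file.
                await capture.StartRecordToCustomSinkAsync(
                    profile, "MyApp.SmoothIngestSink", sinkSettings);
            }
        }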
    I hope this helps,
    James
    Windows SDK Technologies - Microsoft Developer Services - http://blogs.msdn.com/mediasdkstuff/

  • About Live stream with FMS

    Hi,
    I am doing something with FMS. I downloaded FMS 4 and Adobe Flash Media Encoder 3.5 here.
    I installed them on my PC and they work without any configuration.
    Now I want to develop a client based on GStreamer to play the rtmp:// videos.
    My client plays VOD video on FMS, as rtmp://MYIP/vod/sample.flv, without any problem.
    When I change to playing the live video published by Adobe Flash Media Encoder 3.5, rtmp://MYIP/live/livestream.flv, it starts but stops after about 2 seconds.
    I did some debugging, and it seems the server stops sending data to my client. After this stop it cannot start again; I must reboot my client.
    I have checked the server with rtmpdump + VLC, and it works fine.
    The log in FMS is:
    Accepted a connection from IP:192.168.0.95, referrer: , pageurl:
    Sending error message: Method not found (FCSubscribe).
    Sending error message: Response object not found (_result:2147483647).
    "Sending error message: Method not found (FCSubscribe)." also appears with rtmpdump.
    "Sending error message: Response object not found (_result:2147483647)." appears only with my client; what's the reason?
    In my client, I see the following after about 2 seconds:
    WARNING: Stream corrupt?!
    ERROR: Wrong data size (16717496), stream corrupted, aborting!
    What happens when my client connects to FMS? What's the difference between VOD and LIVE?
    Has anyone tried this with GStreamer?

    Thanks a lot for your reply.
    With the tool you linked, it can play the live video from my server.
    But my client is based on Linux and cannot use a web browser, so I am trying to implement it with GStreamer.
    I published the video with the default settings of Adobe Flash Media Encoder 2.5, just setting the video format to H264.
    Today I did some more debugging; please help me analyse it, maybe you can find something in the log:
    DEBUG: Parsing...
    DEBUG: Parsed protocol: 0
    DEBUG: Parsed host    : 192.168.0.143
    DEBUG: Parsed app     : live
    DEBUG: Protocol : RTMP
    DEBUG: Hostname : 192.168.0.143
    DEBUG: Port     : 1935
    DEBUG: Playpath : livestream
    DEBUG: tcUrl    : rtmp://192.168.0.143:1935/live
    DEBUG: app      : live
    DEBUG: live     : yes
    DEBUG: timeout  : 30 sec
    DEBUG: Setting buffer time to: 36000000ms
    DEBUG: RTMP_Connect1, ... connected, handshaking
    DEBUG: HandShake: Type Answer   : 03
    DEBUG: HandShake: Server Uptime : 558990173
    DEBUG: HandShake: FMS Version   : 4.0.0.1
    DEBUG: HandShake: Handshaking finished....
    DEBUG: RTMP_Connect1, handshaked
    DEBUG: Invoking connect
    Pipeline is PREROLLING ...
    DEBUG: HandleServerBW: server BW = 2500000
    DEBUG: HandleClientBW: client BW = 2500000 2
    DEBUG: HandleChangeChunkSize, received: chunk size change to 4096
    DEBUG: RTMP_ClientPacket, received: invoke 242 bytes
    DEBUG: (object begin)
    DEBUG: (object begin)
    DEBUG: Property: <Name:             fmsVer, STRING:     FMS/4,0,0,1121>
    DEBUG: Property: <Name:       capabilities, NUMBER:     255.00>
    DEBUG: Property: <Name:               mode, NUMBER:     1.00>
    DEBUG: (object end)
    DEBUG: (object begin)
    DEBUG: Property: <Name:              level, STRING:     status>
    DEBUG: Property: <Name:               code, STRING:     NetConnection.Connect.Success>
    DEBUG: Property: <Name:        description, STRING:     Connection succeeded.>
    DEBUG: Property: <Name:     objectEncoding, NUMBER:     0.00>
    DEBUG: Property: <Name:               data, OBJECT>
    DEBUG: (object begin)
    DEBUG: Property: <Name:            version, STRING:     4,0,0,1121>
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <_result>
    DEBUG: HandleInvoke, received result for method call <connect>
    DEBUG: sending ctrl. type: 0x0003,nTime: 0x12c
    DEBUG: Invoking createStream
    DEBUG: FCSubscribe: livestream
    DEBUG: Invoking FCSubscribe
    DEBUG: RTMP_ClientPacket, received: invoke 21 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <onBWDone>
    DEBUG: Invoking _checkbw
    DEBUG: RTMP_ClientPacket, received: invoke 29 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <_result>
    DEBUG: HandleInvoke, received result for method call <createStream>
    DEBUG: SendPlay, seekTime=0, stopTime=0, sending play: livestream
    DEBUG: Invoking play
    DEBUG: sending ctrl. type: 0x0003,nTime: 0x2255100
    DEBUG: RTMP_ClientPacket, received: invoke 119 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object begin)
    DEBUG: Property: <Name:              level, STRING:     error>
    DEBUG: Property: <Name:               code, STRING:     NetConnection.Call.Failed>
    DEBUG: Property: <Name:        description, STRING:     Method not found (FCSubscribe).>
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <_error>
    ERROR: rtmp server sent error
    DEBUG: RTMP_ClientPacket, received: invoke 16419 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <_onbwcheck>
    DEBUG: Invoking _result
    DEBUG: HandleChangeChunkSize, received: chunk size change to 4096
    DEBUG: HandleCtrl, received ctrl. type: 0, len: 6
    DEBUG: HandleCtrl, Stream Begin 1
    DEBUG: RTMP_ClientPacket, received: invoke 162 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object begin)
    DEBUG: Property: <Name:              level, STRING:     status>
    DEBUG: Property: <Name:               code, STRING:     NetStream.Play.Reset>
    DEBUG: Property: <Name:        description, STRING:     Playing and resetting livestream.>
    DEBUG: Property: <Name:            details, STRING:     livestream>
    DEBUG: Property: <Name:           clientid, STRING:     oAA7AAAA>
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <onStatus>
    DEBUG: HandleInvoke, onStatus: NetStream.Play.Reset
    DEBUG: RTMP_ClientPacket, received: invoke 156 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object begin)
    DEBUG: Property: <Name:              level, STRING:     status>
    DEBUG: Property: <Name:               code, STRING:     NetStream.Play.Start>
    DEBUG: Property: <Name:        description, STRING:     Started playing livestream.>
    DEBUG: Property: <Name:            details, STRING:     livestream>
    DEBUG: Property: <Name:           clientid, STRING:     oAA7AAAA>
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <onStatus>
    DEBUG: HandleInvoke, onStatus: NetStream.Play.Start
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: RTMP_ClientPacket, received: notify 24 bytes
    DEBUG: (object begin)
    DEBUG: (object end)
    DEBUG: RTMP_ClientPacket, received: notify 487 bytes
    DEBUG: (object begin)
    DEBUG: (object begin)
    DEBUG: Property: <Name:             author, STRING:     >
    DEBUG: Property: <Name:          copyright, STRING:     >
    DEBUG: Property: <Name:        description, STRING:     >
    DEBUG: Property: <Name:           keywords, STRING:     >
    DEBUG: Property: <Name:             rating, STRING:     >
    DEBUG: Property: <Name:              title, STRING:     >
    DEBUG: Property: <Name:         presetname, STRING:     Custom>
    DEBUG: Property: <Name:       creationdate, STRING:     Thu Jun 09 08:55:02 2011
    >
    DEBUG: Property: <Name:        videodevice, STRING:     Syntek STK1150>
    DEBUG: Property: <Name:          framerate, NUMBER:     25.00>
    DEBUG: Property: <Name:              width, NUMBER:     320.00>
    DEBUG: Property: <Name:             height, NUMBER:     240.00>
    DEBUG: Property: <Name:       videocodecid, NUMBER:     7.00>
    DEBUG: Property: <Name:      videodatarate, NUMBER:     200.00>
    DEBUG: Property: <Name:           avclevel, NUMBER:     31.00>
    DEBUG: Property: <Name:         avcprofile, NUMBER:     66.00>
    DEBUG: Property: <Name:        audiodevice, STRING:     Realtek HD Audio Input>
    DEBUG: Property: <Name:    audiosamplerate, NUMBER:     22050.00>
    DEBUG: Property: <Name:      audiochannels, NUMBER:     1.00>
    DEBUG: Property: <Name:   audioinputvolume, NUMBER:     100.00>
    DEBUG: Property: <Name:       audiocodecid, NUMBER:     2.00>
    DEBUG: Property: <Name:      audiodatarate, NUMBER:     32.00>
    DEBUG: (object end)
    DEBUG: (object end)
    INFO: Metadata:
    INFO:   author
    INFO:   copyright
    INFO:   description
    INFO:   keywords
    INFO:   rating
    INFO:   title
    INFO:   presetname            Custom
    INFO:   creationdate          Thu Jun 09 08:55:02 2011
    INFO:   videodevice           Syntek STK1150
    INFO:   framerate             25.00
    INFO:   width                 320.00
    INFO:   height                240.00
    INFO:   videocodecid          7.00
    INFO:   videodatarate         200.00
    INFO:   avclevel              31.00
    INFO:   avcprofile            66.00
    INFO:   audiodevice           Realtek HD Audio Input
    INFO:   audiosamplerate       22050.00
    INFO:   audiochannels         1.00
    INFO:   audioinputvolume      100.00
    INFO:   audiocodecid          2.00
    INFO:   audiodatarate         32.00
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: ignoring too small video packet: size: 2
    Pipeline is PREROLLED ...
    Setting pipeline to PLAYING ...
    New clock: GstSystemClock
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: ignoring too small video packet: size: 2
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: ignoring too small audio packet: size: 0
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    [the BufferEmpty / BufferReady control pair above repeats another 23 times]
    WARNING: Stream corrupt?!
    ERROR: Wrong data size (8059624), stream corrupted, aborting!
    DEBUG: RTMP_ReadPacket, m_nChannel: 167
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0xe3
    DEBUG: RTMP_ReadPacket, m_nChannel: 6ad6
    DEBUG: RTMP_ReadPacket, m_nChannel: 3d7c
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x00
    DEBUG: HandleCtrl, received ctrl. type: 8760, len: 6
    DEBUG: HandleCtrl, Stream xx -1742407864
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0xff
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0xff
    DEBUG: RTMP_ReadPacket, m_nChannel: 5260
    DEBUG: RTMP_ReadPacket, failed to allocate packet
    Got EOS from element "pipeline0".
    Execution ended after 69266417190 ns.
    Setting pipeline to PAUSED ...
    Setting pipeline to READY ...
    DEBUG: Invoking deleteStream
    Setting pipeline to NULL ...
    FREEING pipeline ...
    After a short transfer, the client cannot get any more data. It seems the connection was reset by the peer.

  • Blur Effect in live Streaming publish through camera

    Hi Guys
    I am currently working on a video chat application through FMS. The live stream is playing and publishing perfectly on both ends, but I have to apply a blur effect to a particular portion of that live stream, for example a face or some moving object. Is it possible to apply a blur to the live stream? If yes, please tell me how. Thanks in advance; I am totally confused.
    Thanks and Regards
    Vineet Osho

  • How to play live streaming in windows phone 8.1 silverlight app?

    Hi,
    I am developing a Windows Phone 8.1 Silverlight app in which I want to show a live streaming YouTube channel; I think it is not possible.
    Actually that YouTube channel is a television channel, and I want to stream that live TV in my app. (I tried to load the YouTube channel in a WebBrowser in an iframe tag, but it does not open.)
    Can anybody help me play live TV or live streaming in my app?
    Thanks..
    Suresh.M

    Hello,
    You will likely need to write a custom MediaStreamSource that can access the media stream and parse it. Windows Phone supports h.264 natively, and as long as the site serves up a media stream that contains h.264 frames you can parse it and have our built-in decoder decode and display it. You
    will need to have intimate knowledge of the streaming protocol used by the website that you are trying to play. You must also make sure that the website is not using protected content. IANAL, so I would recommend that you have your local law professional
    (lawyer) review the licensing of the website that you are attempting to connect to and stream from before continuing your development; your lawyer can make sure their licensing allows you to do this.
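    As a very rough sketch of that approach (the sample-producing helper below is hypothetical; it stands in for whatever code you write to download and parse the site's stream):

        // Rough sketch: a MediaStreamSource fed from a custom protocol parser.
        // GetNextH264Sample is a hypothetical placeholder for your own parsing code.
        using Windows.Media.Core;
        using Windows.Media.MediaProperties;

        public static class LiveTvSource
        {
            public static MediaStreamSource Create()
            {
                // Describe the elementary stream we will feed (h.264, 1280x720 assumed).
                var videoProps = VideoEncodingProperties.CreateH264();
                videoProps.Width = 1280;
                videoProps.Height = 720;

                var source = new MediaStreamSource(new VideoStreamDescriptor(videoProps));
                source.SampleRequested += (sender, args) =>
                {
                    // Hand the next parsed frame to the built-in decoder.
                    args.Request.Sample = GetNextH264Sample();
                };
                return source;
            }

            // Hypothetical: pulls the next frame from your protocol parser.
            static MediaStreamSample GetNextH264Sample()
            {
                throw new System.NotImplementedException("Parse the site's stream here.");
            }
        }

        // In the XAML code-behind: mediaElement.SetMediaStreamSource(LiveTvSource.Create());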
    I hope this helps,
    James
    Windows SDK Technologies - Microsoft Developer Services - http://blogs.msdn.com/mediasdkstuff/

  • Flex mobile live streaming

    Hi All,
      I am trying to display a live stream from an RTMP server. Below is my code.
    <?xml version="1.0" encoding="utf-8"?>
    <s:Group height="100%" width="100%"
             xmlns:fx="http://ns.adobe.com/mxml/2009"
             xmlns:s="library://ns.adobe.com/flex/spark"
             xmlns:mx="library://ns.adobe.com/flex/mx">
        <fx:Script>
            <![CDATA[
            import mx.core.FlexGlobals;
            import flash.events.AsyncErrorEvent;
            import flash.events.DRMErrorEvent;
            import flash.events.DRMStatusEvent;
            import flash.events.Event;
            import flash.events.NetStatusEvent;
            import flash.events.StatusEvent;
            import flash.media.Video;
            import flash.net.NetConnection;
            import flash.net.NetStream;
            import flash.net.NetStreamPlayOptions;

            private const URL:String = "rtmp://eigr2.flashlive.bigCDN.com/20175D"; //"http://www.helpexamples.com/flash/video/cuepoints.flv";
            private const STREAM_KEY:String = "eigr8";

            private var _video:Video = new Video();
            private var _nc:NetConnection;
            private var _custClient:Object;
            private var _ns:NetStream;
            private var _streamKey:String;

            public function playVideo():void
            {
                var url:String = URL;
                if (videoHandler.numChildren == 0)
                {
                    _nc = new NetConnection();
                    _nc.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler);
                    _nc.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
                    _nc.connect(URL);
                }
            }

            private function netStatusHandler(event:NetStatusEvent):void
            {
                trace("NC " + event.info.code);
                switch (event.info.code)
                {
                    case "NetConnection.Connect.Success":
                        _video = new Video();
                        _video.smoothing = true;
                        var custClient:Object = new Object();
                        custClient.onMetaData = metaDataHandler;
                        _ns = new NetStream(_nc);
                        _ns.client = custClient;
                        _ns.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
                        _ns.addEventListener(DRMErrorEvent.DRM_ERROR, drmerror);
                        _ns.addEventListener(DRMStatusEvent.DRM_STATUS, drmStatus);
                        _ns.addEventListener(StatusEvent.STATUS, status);
                        _ns.addEventListener(NetStatusEvent.NET_STATUS, target);
                        _ns.bufferTime = 2;
                        _ns.receiveVideo(true);
                        var net:NetStreamPlayOptions = new NetStreamPlayOptions();
                        //net.len = 1;
                        //net.transition = NetStreamPlayTransitions.APPEND;
                        net.streamName = STREAM_KEY;
                        _ns.play(STREAM_KEY);
                        _video.attachNetStream(_ns);
                        videoHandler.addChild(_video);
                        videoHandler.visible = true;
                        //background.visible = true;
                        break;
                    case "NetConnection.Connect.Failed":
                        //shownoVideoText();
                        break;
                    case "NetStream.Play.StreamNotFound":
                        trace("Unable to locate video: ");
                        break;
                }
            }

            private function status(event:StatusEvent):void
            {
                trace("Status Event : " + event.toString());
            }

            private function drmerror(event:DRMErrorEvent):void
            {
                trace("DRM Error : " + event.toString());
            }

            private function drmStatus(event:DRMStatusEvent):void
            {
                trace("DRM Status : " + event.toString());
            }

            private function videoStates(event:Event):void
            {
                trace(event.toString());
            }

            public function stopVideo():void
            {
                _nc.close();
                _nc = null;
                _ns = null;
                this.videoHandler.removeChild(_video);
            }

            private function asyncErrorHandler(event:AsyncErrorEvent):void
            {
                trace("Async errrorrrrrrrrrrrrrrrrrrrr");
            }

            private function target(e:NetStatusEvent):void
            {
                trace("Started playing : " + e.info.code);
            }

            private function metaDataHandler(infoObject:Object):void
            {
                this.setVideoSize(infoObject.height, infoObject.width);
            }

            /*
            public function playVideo():void
            {
                custClient = new Object();
                custClient.onMetaData = metaDataHandler;
                nc = new NetConnection();
                nc.connect(null);
                nc.client = { onBWDone: function():void{} };
                //nc.addEventListener(NetStatusEvent.NET_STATUS, target);
                nc.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
                ns = new DynamicCustomNetStream(nc);
                ns.client = custClient;
                ns.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
                ns.addEventListener(NetStatusEvent.NET_STATUS, target);
                ns.play(URL);
                videoPlayer.attachNetStream(ns);
                videoPlayer.smoothing = true;
                if (!videoHandler.contains(videoPlayer))
                    videoHandler.addChild(videoPlayer);
            }

            private function metaDataHandler(infoObject:Object):void
            {
                this.setVideoSize(infoObject.height, infoObject.width);
            }
            */

            override public function set currentState(value:String):void
            {
                super.currentState = value;
                this.videoHandler.percentHeight = value == "fullScreen" ? 100 : 50;
            }

            public function onFCSubscribe(info:Object):void
            {
                trace("onFCSubscribe - succesful");
            }

            protected function goBack():void
            {
                FlexGlobals.topLevelApplication.showVideo(false);
            }
            ]]>
        </fx:Script>
        <fx:Declarations>
            <!-- Place non-visual elements (e.g., services, value objects) here -->
        </fx:Declarations>
        <s:Button click="goBack()"
                  horizontalCenter="0"
                  label="Go Back" bottom="50"/>
        <mx:UIComponent width="100%"
                        height="50%"
                        top="0" left="0"
                        id="videoHandler"/>
    </s:Group>
    When I run this, I can successfully connect to the server but I don't get any data. If I open the URL in a browser, PublishNotify gets dispatched and I get the stream; as soon as I close it, UnpublishNotify gets dispatched and I no longer get the stream. What are the possible reasons, and is there a solution?

    Unless I am mistaken here, you must implement Apple's HTTP Live Streaming protocol for streams over ten minutes in length. There are a few options you can take:
    1. Split your streams into nine-minute segments and feed them consecutively in groups. (Apple actually offers a segmenting tool that even generates an index file, via their Developer tools.)
    2. Only enable full-length streaming over WiFi.
    3. Fully implement HTTP Live Streaming...
    Also, do not forget that not only are you going to have to implement HTTP Live Streaming, you are ALSO going to have to include an audio-only stream.

  • IOS RTMP live stream audio

    Hi guys, hopefully some of you will be able to help me.
    I am trying to stream audio from a live RTMP feed using the NetConnection and NetStream classes. I've managed to get my app running with no problem on Android, however I am having some major difficulties getting it to play the audio back on iPad. Interestingly it works in the device emulators when debugging, however I'm assuming this is not really an accurate representation. I've tried streaming the RTMP in both AAC and MP3, but with no luck from either. I can verify through debugging that it has connected to the stream, however I just get no audio playing.
    Everything I've read seems to suggest that this is possible on iOS, as I'm only interested in audio and not video. Can anyone help?
    Code sample below (it's quick and dirty!).
    Thanks in advance!
    <?xml version="1.0" encoding="utf-8"?>
    <s:View xmlns:fx="http://ns.adobe.com/mxml/2009"
                        xmlns:s="library://ns.adobe.com/flex/spark" title="Audio" creationComplete="init()">
              <s:layout>
                        <s:VerticalLayout paddingLeft="10" paddingRight="10"
                                                                  paddingTop="10" paddingBottom="10"/>
              </s:layout>
              <fx:Script>
                        <![CDATA[
            import flash.events.AsyncErrorEvent;
            import flash.events.NetStatusEvent;
            import flash.events.SecurityErrorEvent;
            import flash.media.Video;
            import flash.net.NetConnection;
            import flash.net.NetStream;
            import flash.net.ObjectEncoding;
            import flash.utils.setInterval;
            import mx.core.UIComponent;

            private var vid:Video;
            private var videoHolder:UIComponent;
            private var nc:NetConnection;
            private var defaultURL:String = "[STREAM]";
            private var streamName:String = "[STREAMNAME]";
            private var ns:NetStream;
            private var msg:Boolean;
            private var intervalMonitorBufferLengthEverySecond:uint;

            private function init():void
            {
                vid = new Video();
                vid.width = 864;
                vid.height = 576;
                vid.smoothing = true;
                // Attach the video to the stage
                videoHolder = new UIComponent();
                videoHolder.addChild(vid);
                addEventListener(SecurityErrorEvent.SECURITY_ERROR, onSecurityError);
                grpVideo.addElement(videoHolder);
                connect();
            }

            public function onSecurityError(e:SecurityErrorEvent):void
            {
                trace("Security error: ");
            }

            public function connect():void
            {
                nc = new NetConnection();
                nc.client = this;
                nc.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler);
                nc.objectEncoding = ObjectEncoding.AMF0;
                nc.connect(defaultURL);
            }

            public function netStatusHandler(e:NetStatusEvent):void
            {
                switch (e.info.code) {
                    case "NetConnection.Connect.Success":
                        trace("audio - Connected successfully");
                        createNS();
                        break;
                    case "NetConnection.Connect.Closed":
                        trace("audio - Connection closed");
                        //connect();
                        break;
                    case "NetConnection.Connect.Failed":
                        trace("audio - Connection failed");
                        break;
                    case "NetConnection.Connect.Rejected":
                        trace("audio - Connection rejected");
                        break;
                    case "NetConnection.Connect.AppShutdown":
                        trace("audio - App shutdown");
                        break;
                    case "NetConnection.Connect.InvalidApp":
                        trace("audio - Connection invalid app");
                        break;
                    default:
                        trace("audio - " + e.info.code + "-" + e.info.description);
                        break;
                }
            }

            public function createNS():void
            {
                trace("Creating NetStream");
                ns = new NetStream(nc);
                //nc.call("FCSubscribe", null, "live_production"); // Only use this if your CDN requires it
                ns.addEventListener(NetStatusEvent.NET_STATUS, netStreamStatusHandler);
                vid.attachNetStream(ns);
                //Handle onMetaData and onCuePoint event callbacks: solution at http://tinyurl.com/mkadas
                //See another solution at http://www.adobe.com/devnet/flash/quickstart/metadata_cue_points/
                var infoClient:Object = new Object();
                infoClient.onMetaData = function oMD():void {};
                infoClient.onCuePoint = function oCP():void {};
                ns.client = infoClient;
                ns.bufferTime = 0;
                ns.play(streamName);
                ns.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
                function asyncErrorHandler(event:AsyncErrorEvent):void {
                    trace(event.text);
                }
                intervalMonitorBufferLengthEverySecond = setInterval(monPlayback, 1000);
            }

            public function netStreamStatusHandler(e:NetStatusEvent):void
            {
                switch (e.info.code) {
                    case "NetStream.Buffer.Empty":
                        trace("audio - Buffer empty: ");
                        break;
                    case "NetStream.Buffer.Full":
                        trace("audio - Buffer full:");
                        break;
                    case "NetStream.Play.Start":
                        trace("audio - Play start:");
                        break;
                    default:
                        trace("audio - " + e.info.code + "-" + e.info.description);
                        break;
                }
            }

            public function monPlayback():void
            {
                // Print current buffer length
                trace("audio - Buffer length: " + ns.bufferLength);
                trace("audio - FPS: " + ns.currentFPS);
                trace("audio - Live delay: " + ns.liveDelay);
            }

            public function onBWDone():void
            {
                // Do nothing
            }

            public function onFCSubscribe(info:Object):void
            {
                // Do nothing. Prevents error if connecting to CDN.
            }

            public function onFCUnsubscribe(info:Object):void
            {
                // Do nothing. Prevents error if connecting to CDN.
            }
                        ]]>
              </fx:Script>
              <s:Group id="grpVideo">
              </s:Group>
    </s:View>

    Just an update on this for anyone coming along after me.
    I've managed to get this working with MP3 but not AAC (I guess AAC just isn't supported?).
    My problem was the bufferTime. The docs seemed to indicate it should be set to 0 for live streaming, however switching it to "1" solved my problem on the particular stream I was pointing at.
    So essentially I just changed the line above:
                                            ns.bufferTime = 0;  
    to
                                            ns.bufferTime = 1;  
    Would be great to find out if anyone has gotten AAC working however...

  • Live streaming ( play && publish ) over rtmp/rtmfp with codecs ( h264 && PCMU ) on Android/iOS

    I have a question: on which operating systems (Android, iOS) can a live stream with the H264 and PCMU codecs be played and published over the RTMP and RTMFP protocols?
    As far as I found out, with these codecs the stream can be played on Android (RTMP protocol). On iOS the video is not displayed; as I understand it, the AIR environment is cutting it.
    Another question regarding video texture: will support for live playing of H264+PCMU on iOS be included in future releases?

    On iOS, you'll need to be playing an HLS stream for the h264 to decode when streaming from a remote server.

  • How to integrate flash media server 4.0 live streaming for iOS devices ?

    Hi All,
    I have a website which has a live streaming module, and it's working fine; I want to integrate the same module for iOS devices. For live video streaming we are using FMS 4.0, so please let me know how we can integrate this for iOS devices using Flash Media Server 4.0.
    Thanks in advance
    Mohammad Sharique

    You need to place the crossdomain.xml in the webroot folder. Create a text file in the webroot folder using notepad, and call it crossdomain.xml. The text below will give you a wide open access policy, which is fine for testing.
    <?xml version="1.0"?>
    <!DOCTYPE cross-domain-policy SYSTEM "http://www.macromedia.com/xml/dtds/cross-domain-policy.dtd">
    <cross-domain-policy>
              <allow-access-from domain="*" />
    </cross-domain-policy>
    For debugging HTTP streaming I recommend you get hold of something like Charles or Fiddler. These will greatly assist in pinpointing any issues.

  • FMS 3.0.4 - Possible to have captions with live stream?

    Is it possible to have captions with a live stream? I can do it just fine with a VOD stream, but have been unable to figure out how to do it with a LIVE stream, if it's even possible.

    So you are saying your custom video player shows no video playback, but in the same setup the stock video component you are talking about works fine. Is the server under your control? Did you try debugging your custom video player? When you say there is no video playback, do you see that the connection is still alive, or does the connection close? Is there anything logged in the FMS logs to indicate an issue? Check the application logs of the application you are connecting to, and also other logs such as the core/edge/master logs.

  • Help with setup for live streaming

    I'm trying to set up a live stream that's viewable on my website.
    I have two machines behind a router, with one being the webserver running FMS (Windows 2003) and the other being the encoding machine running FME (Windows XP).
    When I set up FME, I put in the following as the URL to the server, "rtmp://computernamewebserver/live/", with the session name being "test".
    This has no problems connecting to FMS.
    On the webserver/FMS, I created a *.swf file which basically contains just a FLVPlayback object with the URL set to "rtmp://computernamewebserver/live/test".
    Now after I publish this, I can view the video via the published html file from within the network, but when I try outside of the network, it doesn't work and I get no error messages, so it's hard to debug.
    I have port 80 forwarded to my webserver/FMS machine. I tried forwarding port 1935 to both my webserver/FMS machine and the FME machine with no success.
    I've tried playing around with the rtmp URL by changing the computernamewebserver from the internally recognized computer name to the externally recognized URL to my webserver.
    The solution is probably something simple, but since this is my first attempt at flash streaming, it's not obvious to me and I've tried searching for the solution. My suspicion is that there's some config file somewhere that I need to config that I don't know about, or maybe it's some port issue. Neither of my machines have windows based firewall disabled.

    "Now after I publish this, I can view the video..." Well this
    shows that FMS is working fine, so I am guessing that it is an
    outside network issue.
    "Neither of my machines have windows based firewall disabled.
    " I don't know if this is the cause either, as a host based
    firewall should prevent any connections from arriving, either by
    the internal lan or the external (well, at least I think that is
    the case, I dont know if windows still allows hosts on the same
    subnet to have local lan permissions or what they do with their
    crazy security model. That said it shouldn't be a problem.)
    I would try to just telnet to the port that you have open to
    make sure the connection can be established.
    telnet computernamewebserver 1935 (this should open up a
    socket, otherwise your port forwarding isnt working correctly.)
    If telnet works ok, perhaps its an issue with the swf. I
    would code the minimal things needed to just get a stream playing
    from the external network (perhaps just using the vod service, not
    the live, to make things simple).
    Hope that helps.

  • Rtmp live streaming of videos!

    I am loving the Azure platform, but I am having certain problems and I hope you can help me resolve them.
    Problem: I want to live stream videos using RTMP. The input for RTMP will be generated by ffmpeg running in my Azure VM. I am new to using RTMP. Any help is welcome. Thank you.

    Hi, I met a problem:
    ffmpeg -v verbose -y -i sample.mp4  -strict -2 -c:a aac -b:a 16k -ar 24000 -r 30 -g 60 -b:v 2m -c:v libx264 -f flv rtmp://livetest1-tracyone.channel.mediaservices.chinacloudapi.cn:1935/live/1ea048c582e34115b9d76cb98c6e9fb6
    Parsing...
    Parsed protocol: 0
    Parsed host    : livetest1-tracyone.channel.mediaservices.chinacloudapi.cn
    Parsed app     : live
    RTMP_Connect1, ... connected, handshaking
    RTMP_Connect1, handshake failed.
    rtmp://livetest1-tracyone.channel.mediaservices.chinacloudapi.cn:1935/live/1ea048c582e34115b9d76cb98c6e9fb6: Unknown error occurred
