P2P-based live streaming protocol
In a P2P-based live streaming system, a video stream is divided into segments of uniform length, each containing 1 second of video data. Every peer in the overlay owns a buffer to store segments temporarily. What kind of protocol is appropriate for transmitting the 1-second video data: UDP or RTP?
If RTP is used, how should we handle video data that has been delivered to the buffer but is not due to be played immediately?
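RTP is usually the better fit here: it runs on top of UDP, but adds the sequence numbers and timestamps a receiver needs to reorder packets and hold early segments in the buffer until their playback turn. A minimal illustrative sketch in Python (not a full RTP stack; the SSRC and payload-type values are arbitrary placeholders):

```python
import struct

RTP_HEADER = struct.Struct("!BBHII")  # flags, payload type, seq, timestamp, SSRC

def make_rtp_packet(seq, timestamp, payload, pt=96):
    """Build a minimal RTP packet (version 2, no padding/extension/CSRC)."""
    return RTP_HEADER.pack(0x80, pt, seq & 0xFFFF, timestamp, 0x1234) + payload

def parse_rtp_packet(data):
    """Return (seq, timestamp, payload) from a minimal RTP packet."""
    flags, pt, seq, ts, ssrc = RTP_HEADER.unpack(data[:RTP_HEADER.size])
    return seq, ts, data[RTP_HEADER.size:]

class SegmentBuffer:
    """Reorders RTP payloads by sequence number and releases them in order,
    so segments that arrive early are held until it is their turn to play."""
    def __init__(self, first_seq=0):
        self.next_seq = first_seq
        self.pending = {}  # seq -> payload, waiting for earlier packets

    def push(self, packet):
        seq, ts, payload = parse_rtp_packet(packet)
        self.pending[seq] = payload

    def pop_ready(self):
        """Return payloads that are now contiguous (ready to play)."""
        out = []
        while self.next_seq in self.pending:
            out.append(self.pending.pop(self.next_seq))
            self.next_seq = (self.next_seq + 1) & 0xFFFF
        return out
```

The buffer in the question maps onto `pending`: a segment sent over RTP but not yet due simply stays there until `pop_ready()` reaches its sequence number.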
So you know, there are no Macromedia guys... there are only
Adobe guys, and I wouldn't expect answers from any of them here.
This is a user-to-user forum.
I don't work with mobile devices much, but Flash Lite
supports the Flash 7 standard, so I would imagine the
NetConnection and Video classes are in there. If that's the case,
then you should be able to deliver Sorenson Spark encoded FLV files
(the Flash 6/7 standard).
Try here:
http://www.adobe.com/mobile/
Similar Messages
-
P2P-based live streaming media data
In a P2P-based live streaming system, what should we do before transmitting media data to the overlay network? What kind of transport protocol should be used? Is it possible to get the content from a data source that is transmitted over RTP?

Chicago has a terrible new tax that went into effect on Wednesday... an additional 9% tax on streaming services (e.g. Netflix) and on accessing data from a provider's computer (e.g. research databases)
http://www.chicagotribune.com/bluesky/originals/ct-chicago-cloud-tax-bsi-20150701-story.html
fta: "watching electronically delivered television shows, movies or videos," "listening to electronically delivered music" and "participating in games, on-line or otherwise." ...since the taxes will be levied based on Chicago billing addresses, there's nothing to stop individuals from moving outside city limits or listing billing addresses in other towns. The same is true for businesses... anything that requires a user to do a search or make a request is subject to the tax
EDIT: added link.
This topic first appeared in the Spiceworks Community -
Debugging live streaming ingest protocol
Hi everyone,
I posted a separate question last week about live streaming protocols. I had been searching for documentation on fMP4 live streaming ingestion and found nothing, but afterward I found the Smooth Streaming protocol specification (MS-SSTR; the forum won't let me link to it directly), which seems to apply to my use case. Now I've spent a few days trying to make it work, and so far I haven't been able to.
I'm not looking for a canned solution; I'm looking for resources to help me debug the problem.
According to the specification, I should be able to make a long-lived, chunked HTTP POST request to my ingest endpoint, sending the fragmented mp4 data as the request body. I've tried to implement this without success: I open the request stream and write 1000-byte chunks at a time, but the connection is reset after only one or two chunks. I can't tell if the problem is the data, the chunk size, or something else.
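For what it's worth, the client side of that attempt can be reproduced outside the encoder to rule the chunking itself out. A hedged Python sketch (the host and path are placeholders, and the 1000-byte chunk size just mirrors the post; this is not a statement of what the ingest endpoint actually expects):

```python
def iter_chunks(data, chunk_size=1000):
    """Split a byte string into fixed-size chunks (last one may be shorter)."""
    for i in range(0, len(data), chunk_size):
        yield data[i:i + chunk_size]

def post_chunked(host, path, body_iter):
    """Long-lived chunked POST: with encode_chunked=True and an iterable body,
    http.client adds the Transfer-Encoding chunk framing itself."""
    import http.client
    conn = http.client.HTTPConnection(host)
    conn.request("POST", path, body=body_iter,
                 headers={"Transfer-Encoding": "chunked"},
                 encode_chunked=True)
    return conn.getresponse()

# usage sketch (hypothetical endpoint):
#   fmp4_bytes = open("fragmented.mp4", "rb").read()
#   resp = post_chunked("example-ingest.invalid", "/ingest", iter_chunks(fmp4_bytes))
```

Feeding the same chunk stream to a local echo server (or a packet capture) separates "my chunk framing is wrong" from "the server dislikes my fMP4 payload."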
And if I mimic Expression Encoder 4, which seems to diverge from the spec and not send its data chunked at all, the connection isn't reset no matter how much data I send, but the video never makes it to the preview stream on the Azure portal.
So the question is, how can I debug this? Is there any error log or diagnostic output I can access? Each time I try something new, it doesn't work, but I have no way of knowing whether I am getting closer to or further from a working solution. I'm a bit
lost here, so help would be greatly appreciated.

Hi Jason, and thanks for getting back to me.
The IP security thing is a good thought, but I think I have that part taken care of already. I saw the option to restrict ingest to the IP I used to configure the channel from in the Portal, and I made sure to set it to Off. Also, I can stream successfully
from Expression, which would have the same source IP.
As for why I want to use Smooth Streaming, basically there are two reasons. First, it seems like the simpler protocol conceptually. I'm already generating fragmented mp4 files (though I think they might be packaged wrong; I can't really tell which part is
causing problems), and it seems simple enough that I could implement it without turning to a third-party library (the part that's feeding the stream is written in C#, though I'd be willing to rewrite in C++ if necessary, and there's no RTMP client
that seems trustworthy-- FluorineFX, for instance, seems to have ceased development years ago now).
Second, while reading the RTMP spec, I realized that even if I had such a library it isn't clear how I would need to use it with Azure. It allows creating objects or invoking methods on the server, but I couldn't find any guide to which objects and methods
might be supported. And the spec also doesn't define specifically what video format, codec options, or packaging method I need to use; essentially I would seem to have all the same problems I'm having with Smooth, only with the added difficulty of needing
a third-party library to implement the spec properly.
I guess the main problem is that I don't really know how close or far I am from success. Like I said, I think I am doing something wrong in how the video is encoded, but for all I know I'm being dropped because I'm feeding the data too quickly
or slowly. (I can't find any documentation on how much data Azure can buffer or how it handles underruns when the video to be streamed live is generated on the fly and not tied to strictly realtime output). If there's more documentation on specifically how
Azure uses or expects data from these protocols, it'd be very helpful. -
About Flash P2P live streaming from non-webcam sources
Hello, I am a university student. Our lab is attempting to build p2p live streaming using the Flash p2p features. The media source is not a webcam but a file on a certain server, which is not directly supported by any of the 4 methods the Flash Player offers (posting, direct routing, object replication, and multicast). As some forum threads mentioned, our method is to publish a stream with NetStream.publish() and use send("callbackname", data) to all subscribers that have joined a NetGroup, so we can use p2p transmission. Now here are our questions:
1. We know from MAX 2009 that flash p2p camera video multicasting implements pull-push mechanism inside which makes full use of p2p features like buffer map exchange. If we use NetStream.send() API to send non-webcam source data, will it also make use of this pull-push mechanism to spread data among all peers(subscribers) with proper data exchange?
or, in more detail:
I saw the P2P Gaming Libs at flashrealtime.com. They use DIRECT_CONNECTIONS when creating the publishing NetStream in order to send data with the lowest latency. My question is: if I do not use DIRECT_CONNECTIONS, then when the NetGroup grows large (e.g., 1000 peers), will the peers that are not direct neighbors of the publisher in the same group relay data to each other using pull-push via buffer maps?
2. I have written a sample app and use send() to deliver data only (NetStream.bufferTime = 0), without camera video and audio. When the publisher sends the data in a "for()" statement 20 times, the subscribers receive only about 8 of the messages. But when I set a timer and send them periodically (e.g., every 500 ms), the subscribers receive them correctly. My questions are: Is the packet loss caused by UDP unreliability? Can this be solved by setting NetStream.dataReliable = true? Can it be completely solved with a timer, and how should the timer's delay be chosen? Is all data received in the order it was sent?
I would really appreciate it if you could answer these questions. Thanks.

1. NetStream.send() data is delivered in the order it was sent, whether in client-server, DIRECT_CONNECTIONS, or multicast mode.
2. the lowest latency will be achieved with DIRECT_CONNECTIONS. however, that isn't scalable to large numbers of recipients. the P2P multicast system trades low latency for scalability.
3. there are no plans to support a natural NetStream streaming source that's not a camera/microphone. Flash Media Server can perform that function.
4. you should never need to implement "bitmap exchange"; the object replication system handles all the communication needed to accomplish its function. typically you will divide a file into segments, giving each one an index number. object replication uses ranges internally, so the index numbers should be contiguous. nodes that have indices use NetGroup.addHaveObjects(). nodes that want indices they don't have use NetGroup.addWantObjects(). nodes that have objects will send them to nodes that want the objects. when a node receives an object, it's automatically removed from its "want" set; once the node decides the object is acceptable, it adds it to its "have" set, at which point it can answer requests from other nodes who still "want" it. eventually every node will have all the pieces and not want any. for an example, please see Tom Krcha's articles on using object replication:
http://www.flashrealtime.com/file-share-object-replication-flash-p2p/
http://www.flashrealtime.com/video-on-demand-over-p2p-in-flash-player-101-with-object-repl ication/
5. please see Tom Krcha's articles. for VOD you will probably want to use the "LOWEST_FIRST" replication strategy, as that may allow you to start playing earlier. for general file replication you'll get better sharing if you use RAREST_FIRST. -
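The have/want bookkeeping described in point 4 can be modeled in a few lines. A toy Python simulation (not the Flash API; `Node`, `exchange`, and the single-seed setup are invented for illustration) showing how pieces spread until every node has the full set and wants nothing:

```python
class Node:
    """A peer in the object replication group."""
    def __init__(self, name, have=()):
        self.name = name
        self.have = set(have)  # indices this node can serve (addHaveObjects)
        self.want = set()      # indices this node is asking for (addWantObjects)

    def want_missing(self, total):
        self.want = set(range(total)) - self.have

def exchange(nodes):
    """Repeatedly let any node that has an object send it to any node that
    wants it; a received object leaves 'want' and joins 'have', so the
    receiver can in turn serve other nodes. Stops when nothing moves."""
    progress = True
    while progress:
        progress = False
        for sender in nodes:
            for receiver in nodes:
                deliverable = sender.have & receiver.want
                if deliverable:
                    idx = min(deliverable)  # LOWEST_FIRST-style strategy
                    receiver.have.add(idx)
                    receiver.want.discard(idx)
                    progress = True
```

Each delivery strictly shrinks the total "want" count, so the loop terminates; that mirrors the answer's point that eventually every node has all the pieces and wants none.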
Publish Live stream to FMS on RTMPE protocol
Hi,
I am trying to publish Live Stream to FMS using FMLE or third party encoders.
Is it possible to have RTMPE or RTMPTE protocol used between encoder and FMS?
Very urgent.
thanks,
Pooja Jain

FMLE can only publish via rtmp or rtmpt.
-
Suitable protocol to test live streaming applications like ESPN3
Hi,
I am using the LoadRunner 11.5 performance testing tool. I have a requirement to test live streaming applications. Please suggest which protocol is suitable for testing, and provide useful links for scripting and testing using LR.
Regards,
Sreek666

Hello Sreek,
You are looking for the LoadRunner forum. This is just the consumer-level forum.
Granted, HP does not make it easy to find if you don't know about it.
You may have to re-register to post questions, as it is a separate forum.
http://h30499.www3.hp.com/t5/LoadRunner-Support-Forum/bd-p/sws-LoadRunner_SF
and check this one out too...
http://h30499.www3.hp.com/t5/Performance-Center-Support-and/bd-p/itrc-915
Good luck.
If this is helpful, click the thumbs up button. -
Hi,
I am doing something with FMS. I downloaded FMS 4 and Adobe Flash Media Encoder 3.5 here.
I installed them on my PC, and they work without any configuration.
Now I want to develop a client based on GStreamer to play rtmp:// videos.
My client plays VOD video on FMS as rtmp://MYIP/vod/sample.flv without any problem.
When I switch to the live video published by Adobe Flash Media Encoder 3.5, rtmp://MYIP/live/livestream.flv, it starts, but stops after about 2 seconds.
From some debugging, it seems the server stops sending data to my client. And after this stop it cannot start again; I must reboot my client.
I have checked the server with rtmpdump + VLC, and it works fine.
The Log in FMS is:
Accepted a connection from IP:192.168.0.95, referrer: , pageurl:
Sending error message: Method not found (FCSubscribe).
Sending error message: Response object not found (_result:2147483647).
"Sending error message: Method not found (FCSubscribe)." also appears with rtmpdump.
"Sending error message: Response object not found (_result:2147483647)." appears only with my client. What's the reason?
In my client,I see the following after about 2 second:
WARNING: Stream corrupt?!
ERROR: Wrong data size (16717496), stream corrupted, aborting!
What happens when my client connects to FMS? What's the difference between VOD and LIVE?
Has anyone tried this with GStreamer?

Thanks a lot for your reply.
With the tool you linked, it can play the live video from my server.
But my client is Linux-based and cannot use a web browser, so I am trying to implement it with GStreamer.
I published the video with the default settings of Adobe Flash Media Encoder 2.5, just setting the video format to H.264.
Today I did some more debugging;
please help analyze it. Maybe you can find something in the log:
DEBUG: Parsing...
DEBUG: Parsed protocol: 0
DEBUG: Parsed host : 192.168.0.143
DEBUG: Parsed app : live
DEBUG: Protocol : RTMP
DEBUG: Hostname : 192.168.0.143
DEBUG: Port : 1935
DEBUG: Playpath : livestream
DEBUG: tcUrl : rtmp://192.168.0.143:1935/live
DEBUG: app : live
DEBUG: live : yes
DEBUG: timeout : 30 sec
DEBUG: Setting buffer time to: 36000000ms
DEBUG: RTMP_Connect1, ... connected, handshaking
DEBUG: HandShake: Type Answer : 03
DEBUG: HandShake: Server Uptime : 558990173
DEBUG: HandShake: FMS Version : 4.0.0.1
DEBUG: HandShake: Handshaking finished....
DEBUG: RTMP_Connect1, handshaked
DEBUG: Invoking connect
Pipeline is PREROLLING ...
DEBUG: HandleServerBW: server BW = 2500000
DEBUG: HandleClientBW: client BW = 2500000 2
DEBUG: HandleChangeChunkSize, received: chunk size change to 4096
DEBUG: RTMP_ClientPacket, received: invoke 242 bytes
DEBUG: (object begin)
DEBUG: (object begin)
DEBUG: Property: <Name: fmsVer, STRING: FMS/4,0,0,1121>
DEBUG: Property: <Name: capabilities, NUMBER: 255.00>
DEBUG: Property: <Name: mode, NUMBER: 1.00>
DEBUG: (object end)
DEBUG: (object begin)
DEBUG: Property: <Name: level, STRING: status>
DEBUG: Property: <Name: code, STRING: NetConnection.Connect.Success>
DEBUG: Property: <Name: description, STRING: Connection succeeded.>
DEBUG: Property: <Name: objectEncoding, NUMBER: 0.00>
DEBUG: Property: <Name: data, OBJECT>
DEBUG: (object begin)
DEBUG: Property: <Name: version, STRING: 4,0,0,1121>
DEBUG: (object end)
DEBUG: (object end)
DEBUG: (object end)
DEBUG: HandleInvoke, server invoking <_result>
DEBUG: HandleInvoke, received result for method call <connect>
DEBUG: sending ctrl. type: 0x0003,nTime: 0x12c
DEBUG: Invoking createStream
DEBUG: FCSubscribe: livestream
DEBUG: Invoking FCSubscribe
DEBUG: RTMP_ClientPacket, received: invoke 21 bytes
DEBUG: (object begin)
DEBUG: Property: NULL
DEBUG: (object end)
DEBUG: HandleInvoke, server invoking <onBWDone>
DEBUG: Invoking _checkbw
DEBUG: RTMP_ClientPacket, received: invoke 29 bytes
DEBUG: (object begin)
DEBUG: Property: NULL
DEBUG: (object end)
DEBUG: HandleInvoke, server invoking <_result>
DEBUG: HandleInvoke, received result for method call <createStream>
DEBUG: SendPlay, seekTime=0, stopTime=0, sending play: livestream
DEBUG: Invoking play
DEBUG: sending ctrl. type: 0x0003,nTime: 0x2255100
DEBUG: RTMP_ClientPacket, received: invoke 119 bytes
DEBUG: (object begin)
DEBUG: Property: NULL
DEBUG: (object begin)
DEBUG: Property: <Name: level, STRING: error>
DEBUG: Property: <Name: code, STRING: NetConnection.Call.Failed>
DEBUG: Property: <Name: description, STRING: Method not found (FCSubscribe).>
DEBUG: (object end)
DEBUG: (object end)
DEBUG: HandleInvoke, server invoking <_error>
ERROR: rtmp server sent error
DEBUG: RTMP_ClientPacket, received: invoke 16419 bytes
DEBUG: (object begin)
DEBUG: Property: NULL
DEBUG: (object end)
DEBUG: HandleInvoke, server invoking <_onbwcheck>
DEBUG: Invoking _result
DEBUG: HandleChangeChunkSize, received: chunk size change to 4096
DEBUG: HandleCtrl, received ctrl. type: 0, len: 6
DEBUG: HandleCtrl, Stream Begin 1
DEBUG: RTMP_ClientPacket, received: invoke 162 bytes
DEBUG: (object begin)
DEBUG: Property: NULL
DEBUG: (object begin)
DEBUG: Property: <Name: level, STRING: status>
DEBUG: Property: <Name: code, STRING: NetStream.Play.Reset>
DEBUG: Property: <Name: description, STRING: Playing and resetting livestream.>
DEBUG: Property: <Name: details, STRING: livestream>
DEBUG: Property: <Name: clientid, STRING: oAA7AAAA>
DEBUG: (object end)
DEBUG: (object end)
DEBUG: HandleInvoke, server invoking <onStatus>
DEBUG: HandleInvoke, onStatus: NetStream.Play.Reset
DEBUG: RTMP_ClientPacket, received: invoke 156 bytes
DEBUG: (object begin)
DEBUG: Property: NULL
DEBUG: (object begin)
DEBUG: Property: <Name: level, STRING: status>
DEBUG: Property: <Name: code, STRING: NetStream.Play.Start>
DEBUG: Property: <Name: description, STRING: Started playing livestream.>
DEBUG: Property: <Name: details, STRING: livestream>
DEBUG: Property: <Name: clientid, STRING: oAA7AAAA>
DEBUG: (object end)
DEBUG: (object end)
DEBUG: HandleInvoke, server invoking <onStatus>
DEBUG: HandleInvoke, onStatus: NetStream.Play.Start
DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
DEBUG: HandleCtrl, Stream BufferEmpty 1
DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
DEBUG: HandleCtrl, Stream BufferReady 1
DEBUG: RTMP_ClientPacket, received: notify 24 bytes
DEBUG: (object begin)
DEBUG: (object end)
DEBUG: RTMP_ClientPacket, received: notify 487 bytes
DEBUG: (object begin)
DEBUG: (object begin)
DEBUG: Property: <Name: author, STRING: >
DEBUG: Property: <Name: copyright, STRING: >
DEBUG: Property: <Name: description, STRING: >
DEBUG: Property: <Name: keywords, STRING: >
DEBUG: Property: <Name: rating, STRING: >
DEBUG: Property: <Name: title, STRING: >
DEBUG: Property: <Name: presetname, STRING: Custom>
DEBUG: Property: <Name: creationdate, STRING: Thu Jun 09 08:55:02 2011
>
DEBUG: Property: <Name: videodevice, STRING: Syntek STK1150>
DEBUG: Property: <Name: framerate, NUMBER: 25.00>
DEBUG: Property: <Name: width, NUMBER: 320.00>
DEBUG: Property: <Name: height, NUMBER: 240.00>
DEBUG: Property: <Name: videocodecid, NUMBER: 7.00>
DEBUG: Property: <Name: videodatarate, NUMBER: 200.00>
DEBUG: Property: <Name: avclevel, NUMBER: 31.00>
DEBUG: Property: <Name: avcprofile, NUMBER: 66.00>
DEBUG: Property: <Name: audiodevice, STRING: Realtek HD Audio Input>
DEBUG: Property: <Name: audiosamplerate, NUMBER: 22050.00>
DEBUG: Property: <Name: audiochannels, NUMBER: 1.00>
DEBUG: Property: <Name: audioinputvolume, NUMBER: 100.00>
DEBUG: Property: <Name: audiocodecid, NUMBER: 2.00>
DEBUG: Property: <Name: audiodatarate, NUMBER: 32.00>
DEBUG: (object end)
DEBUG: (object end)
INFO: Metadata:
INFO: author
INFO: copyright
INFO: description
INFO: keywords
INFO: rating
INFO: title
INFO: presetname Custom
INFO: creationdate Thu Jun 09 08:55:02 2011
INFO: videodevice Syntek STK1150
INFO: framerate 25.00
INFO: width 320.00
INFO: height 240.00
INFO: videocodecid 7.00
INFO: videodatarate 200.00
INFO: avclevel 31.00
INFO: avcprofile 66.00
INFO: audiodevice Realtek HD Audio Input
INFO: audiosamplerate 22050.00
INFO: audiochannels 1.00
INFO: audioinputvolume 100.00
INFO: audiocodecid 2.00
INFO: audiodatarate 32.00
DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
DEBUG: HandleCtrl, Stream BufferEmpty 1
DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
DEBUG: HandleCtrl, Stream BufferReady 1
DEBUG: ignoring too small video packet: size: 2
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
DEBUG: HandleCtrl, Stream BufferEmpty 1
DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
DEBUG: HandleCtrl, Stream BufferReady 1
DEBUG: ignoring too small video packet: size: 2
DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
DEBUG: HandleCtrl, Stream BufferEmpty 1
DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
DEBUG: HandleCtrl, Stream BufferReady 1
DEBUG: ignoring too small audio packet: size: 0
DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
DEBUG: HandleCtrl, Stream BufferEmpty 1
DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
DEBUG: HandleCtrl, Stream BufferReady 1
DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
DEBUG: HandleCtrl, Stream BufferEmpty 1
DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
DEBUG: HandleCtrl, Stream BufferReady 1
[... the same BufferEmpty/BufferReady ctrl pair repeats another 22 times; snipped ...]
WARNING: Stream corrupt?!
ERROR: Wrong data size (8059624), stream corrupted, aborting!
DEBUG: RTMP_ReadPacket, m_nChannel: 167
DEBUG: RTMP_ClientPacket, unknown packet type received: 0xe3
DEBUG: RTMP_ReadPacket, m_nChannel: 6ad6
DEBUG: RTMP_ReadPacket, m_nChannel: 3d7c
DEBUG: RTMP_ClientPacket, unknown packet type received: 0x00
DEBUG: HandleCtrl, received ctrl. type: 8760, len: 6
DEBUG: HandleCtrl, Stream xx -1742407864
DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
DEBUG: RTMP_ClientPacket, unknown packet type received: 0xff
DEBUG: RTMP_ClientPacket, unknown packet type received: 0xff
DEBUG: RTMP_ReadPacket, m_nChannel: 5260
DEBUG: RTMP_ReadPacket, failed to allocate packet
Got EOS from element "pipeline0".
Execution ended after 69266417190 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
DEBUG: Invoking deleteStream
Setting pipeline to NULL ...
FREEING pipeline ...
After a short transfer, the client can no longer get any data. It seems the connection is reset by the peer. -
Apple iPhone Http Live Streaming in Flash (Free OSMF library released)
Hello everyone, recently I've read (http://www.flashcomguru.com/index.cfm...) that Matthew Kaufman has developed an AS3 library which adds support for Apple HTTP Live Streaming (HLS) to OSMF and released it under the MPL open-source license.
HLS (which is a proposed RFC open standard) is the required protocol for delivering VOD or live streams to Apple devices, and it's also supported on Android; having it supported in a Flash-based media player lets you reach both desktop and mobile users using just HLS, without needing to offer different distribution methods.
Benefits:
Less complexity for content distributor and lower distribution costs!
What about integrating HLS in StrobeMediaPlayback? Maybe using the cited MPL AS3 library. I'm not an AS expert and the library lacks documentation, but it should be possible, shouldn't it?

You need an iPhone Developer Account to download ... if you can't see the link then you don't have access or aren't logged in.
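For reference, the media-playlist side of HLS is plain text, which is part of why it is feasible to support in other players. A minimal Python sketch of parsing the two tags a basic client needs first (illustrative only; real playlists carry many more tags, and the segment names below are made up):

```python
def parse_media_playlist(text):
    """Return (target_duration, [(segment_duration, uri), ...]) from an
    HLS media playlist. Handles only #EXT-X-TARGETDURATION and #EXTINF."""
    target, segments, pending = None, [], None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXT-X-TARGETDURATION:"):
            target = int(line.split(":", 1)[1])
        elif line.startswith("#EXTINF:"):
            # "#EXTINF:<duration>,[<title>]" -> keep the duration
            pending = float(line.split(":", 1)[1].split(",")[0])
        elif line and not line.startswith("#"):
            segments.append((pending, line))  # URI line follows its #EXTINF
            pending = None
    return target, segments
```

A live client would re-fetch the playlist roughly every target duration and download any segment URIs it has not seen yet.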
-
Live Streaming w/ RTMPE Only?
I'm hoping there's a simple answer to this. I'm attempting to stream a live event using FMLE and FMS. The first thing that throws me off is that I cannot push the stream to the server using RTMPE, only RTMP[T]. I assume this is because the encryption is handled by the server and not the encoder. However, for the sake of security I would like to disable RTMP connections from the Flash player to the server. Yet it would seem that if I incorporate the server-side scripting described here http://kb2.adobe.com/cps/405/kb405456.html (namely the changes to my main.asc file), my server would reject an RTMP connection from FMLE. Is that correct? If so, where do I go from here? Can I broadcast a live stream using RTMPE only? Do I need a different encoder, or am I just missing a very simple point?
You're correct that FMLE can't publish RTMPE.
What you can do is, in your logic for inspecting the client protocol for the connection, make an exception for FMLE user-agents. Based on the example code in the doc you linked to, you could do something like:
application.onConnect = function(clientObj) {
    // allow encrypted connections, plus unencrypted connections from the FMLE/FME encoders
    if (clientObj.protocol == "rtmpe" || clientObj.protocol == "rtmpte" ||
        clientObj.agent.indexOf("FMLE") > -1 || clientObj.agent.indexOf("FME") > -1) {
        return true;
    }
    return false;
}; -
How to play live streaming in windows phone 8.1 silverlight app?
Hi,
I am developing a Windows Phone 8.1 Silverlight app. In my app I want to show a live-streaming YouTube channel; I think it is not possible.
Actually that YouTube channel is a television channel, and I want to stream that live TV in my app. (I tried to load the YouTube channel in a WebBrowser in an iframe tag, but it does not open.)
Anybody help me how to play live tv or live streaming in my app,
Thanks..
Suresh.MHello,
You will likely need to write a custom MediaStreamSource that can access the media stream and parse it. Windows Phone supports H.264 natively, and as long as the site serves up a media stream that contains H.264 frames, you can parse it and have our built-in decoder decode and display it. You will need to have intimate knowledge of the streaming protocol used by the website that you are trying to play. You must also make sure that the website is not using protected content. IANAL, so I would recommend that you have your local law professional (lawyer) review the licensing of the website that you are attempting to connect to and stream from before continuing your development. Your lawyer can make sure their licensing allows you to do this.
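As a concrete example of the parsing involved: if the site serves an H.264 elementary stream in Annex B format, the first step is splitting it on start codes into NAL units. A minimal Python sketch (illustrative only; the byte values in the test are a made-up SPS/PPS pair, and a real parser must also handle emulation-prevention bytes):

```python
import re

# Annex B start code: 00 00 01, optionally preceded by an extra zero byte
START_CODE = re.compile(b"\x00\x00\x00?\x01")

def split_annexb_nals(stream):
    """Return the NAL units of an Annex B byte stream, start codes removed."""
    matches = list(START_CODE.finditer(stream))
    nals = []
    for i, m in enumerate(matches):
        # a NAL runs from the end of its start code to the next start code
        end = matches[i + 1].start() if i + 1 < len(matches) else len(stream)
        nals.append(stream[m.end():end])
    return nals
```

Once split, `nal[0] & 0x1F` gives the NAL unit type (7 = SPS, 8 = PPS, 5 = IDR slice), which is what a MediaStreamSource needs to decide how to hand frames to the decoder.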
I hope this helps,
James
Windows SDK Technologies - Microsoft Developer Services - http://blogs.msdn.com/mediasdkstuff/ -
Hi,
I am planning to develop a live streaming app using C# and XAML.
I have gone through the sample below and am able to stream live content using a Wirecast camera.
Azure Media Services Live Streaming Sample
But I want to use the phone camera to stream the live video rather than using a Wirecast camera.
How can I do that? How can I get the published URL from C# code; is there an API for that?
Anybody please help me
Regards,
Santhosh

Hello,
There are no in-box components that will allow you to transmit captured video from a Windows Phone device to Azure Media Services. You will need to create your own implementation. You can do this by creating a custom Media Foundation sink that can plug into the MediaCapture element's pipeline. You will then need to modify the captured and compressed video to align to one of the two protocols that Azure Media Services can use. At this time Azure Media Services can use either Smooth Streaming or RTMP for ingestion.
You will need to create your own packager (or find compatible packager code from a 3rd party) to package the MP4-encoded data into the necessary format.
Here is an example of how to create a custom Media Foundation sink (please note you must write your Media Foundation components in C++ and COM; we do not support Media Foundation components written in managed languages):
Real-time communication sample
I hope this helps,
James
Windows SDK Technologies - Microsoft Developer Services - http://blogs.msdn.com/mediasdkstuff/ -
Dynamic Bitrate Switching on Live Stream
I have FMIS 3.5. I've installed it with pretty much all the default values. I haven't changed any of the settings in either the LIVE or the VOD applications.
Dynamic bitrate switching is working well for VOD, but not working at all for LIVE streams. Doing a regular bandwidth detection on both the LIVE and the VOD applications gives similar, high-bandwidth results. However, the LIVE application's NetStreamInfo.maxBytesPerSecond shows a very low bandwidth capability of around maxBytesPerSecond = 19016, whereas the VOD application achieves around 637110. I can play a single high-quality LIVE stream smoothly without any error.
I don't know if this is relevant, but I occasionally get error messages in the log of the live application saying: Dropping application (live/_definst_) message. Clients not allowed to broadcast message. These messages aren't consistent and don't coincide with attempts to use bitrate switching.
I have tried downloading the Adobe sample StreamSwitching.fla and it won't play the LIVE streams at all. Using the opensource Longtail Video player it just always defaults to the lowest stream. Here is an example: http://www.ltscotland.org.uk/testbed/live/livestream2.asp
Can anyone suggest what the problem might be here? And any possible solutions? Thanks.

I have read that article. Based on that article, NetStreamInfo.maxBytesPerSecond is not an accurate measurement to base dynamic switching on. This seems to be the basis of the bitrate switching in both the Longtail player and the Adobe examples that I have tried. That article suggests using the dropped-frames property, in conjunction with bufferLength, to determine whether switching is necessary. Unfortunately I can't seem to find a player online which handles this successfully. That being said, I can't believe I'm the only person trying to implement dynamic bitrate switching for live streams, so surely there are some players out there which can do this successfully? If anyone knows of any code available which does this successfully, I would appreciate knowing where! The examples provided by Adobe https://www.adobe.com/cfusion/entitlement/index.cfm?e=fms35 unfortunately don't work either.
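The heuristic that article describes (dropped frames plus buffer length) is simple to state in code. A Python sketch of the decision logic only; the threshold values are invented placeholders, not tuned recommendations:

```python
def choose_switch(dropped_fps, buffer_len, min_buffer=2.0, drop_limit=5.0):
    """Decide a bitrate move from two playback health signals.

    dropped_fps: frames dropped per second (NetStreamInfo.droppedFrames delta)
    buffer_len:  current NetStream.bufferLength in seconds
    Switch down when frames are dropping or the buffer is draining;
    switch up only when both signals look comfortably healthy.
    """
    if dropped_fps > drop_limit or buffer_len < min_buffer:
        return "down"
    if dropped_fps == 0 and buffer_len >= 2 * min_buffer:
        return "up"
    return "hold"
```

The asymmetry is deliberate: switching down is triggered eagerly by either bad signal, while switching up requires both to be good, which avoids oscillating between renditions.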
-
Hi All,
I am trying to display a live stream from rtmp server. Below is my code.
<?xml version="1.0" encoding="utf-8"?>
<s:Group height="100%" width="100%"
xmlns:fx="http://ns.adobe.com/mxml/2009"
xmlns:s="library://ns.adobe.com/flex/spark"
xmlns:mx="library://ns.adobe.com/flex/mx">
<fx:Script>
<![CDATA[
import flash.events.AsyncErrorEvent;
import flash.events.DRMErrorEvent;
import flash.events.DRMStatusEvent;
import flash.events.Event;
import flash.events.NetStatusEvent;
import flash.events.StatusEvent;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.net.NetStreamPlayOptions;
import mx.core.FlexGlobals;
private const URL:String = "rtmp://eigr2.flashlive.bigCDN.com/20175D";//"http://www.helpexamples.com/flash/video/cuepoints.flv";
private const STREAM_KEY:String = "eigr8";
private var _video:Video = new Video();
private var _nc:NetConnection;
private var _custClient:Object;
private var _ns:NetStream;
private var _streamKey:String;
public function playVideo():void
{
    if (videoHandler.numChildren == 0)
    {
        _nc = new NetConnection();
        _nc.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler);
        _nc.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
        _nc.connect(URL);
    }
}
private function netStatusHandler(event:NetStatusEvent):void
{
    trace("NC " + event.info.code);
    switch (event.info.code)
    {
        case "NetConnection.Connect.Success":
            _video = new Video();
            _video.smoothing = true;
            var custClient:Object = new Object();
            custClient.onMetaData = metaDataHandler;
            _ns = new NetStream(_nc);
            _ns.client = custClient;
            _ns.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
            _ns.addEventListener(DRMErrorEvent.DRM_ERROR, drmerror);
            _ns.addEventListener(DRMStatusEvent.DRM_STATUS, drmStatus);
            _ns.addEventListener(StatusEvent.STATUS, status);
            _ns.addEventListener(NetStatusEvent.NET_STATUS, target);
            _ns.bufferTime = 2;
            _ns.receiveVideo(true);
            var net:NetStreamPlayOptions = new NetStreamPlayOptions();
            //net.len = 1;
            //net.transition = NetStreamPlayTransitions.APPEND;
            net.streamName = STREAM_KEY;
            // note: 'net' is never passed to the stream; _ns.play2(net)
            // would be required to actually apply these options
            _ns.play(STREAM_KEY);
            _video.attachNetStream(_ns);
            videoHandler.addChild(_video);
            videoHandler.visible = true;
            break;
        case "NetConnection.Connect.Failed":
            //showNoVideoText();
            break;
        case "NetStream.Play.StreamNotFound":
            trace("Unable to locate video: " + STREAM_KEY);
            break;
    }
}
private function status(event:StatusEvent):void
{
    trace("Status Event: " + event.toString());
}
private function drmerror(event:DRMErrorEvent):void
{
    trace("DRM Error: " + event.toString());
}
private function drmStatus(event:DRMStatusEvent):void
{
    trace("DRM Status: " + event.toString());
}
private function videoStates(event:Event):void
{
    trace(event.toString());
}
public function stopVideo():void
{
    _ns = null;
    _nc.close();
    _nc = null;
    this.videoHandler.removeChild(_video);
}
private function asyncErrorHandler(event:AsyncErrorEvent):void
{
    trace("Async error: " + event.error);
}
private function target(e:NetStatusEvent):void
{
    trace("NS status: " + e.info.code);
}
private function metaDataHandler(infoObject:Object):void
{
    // size the Video directly; note a live stream may never send onMetaData
    _video.width = infoObject.width;
    _video.height = infoObject.height;
}
/* Earlier attempt, kept for reference:
public function playVideo():void
{
    custClient = new Object();
    custClient.onMetaData = metaDataHandler;
    nc = new NetConnection();
    nc.connect(null);
    nc.client = { onBWDone: function():void {} };
    //nc.addEventListener(NetStatusEvent.NET_STATUS, target);
    nc.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
    ns = new DynamicCustomNetStream(nc);
    ns.client = custClient;
    ns.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);
    ns.addEventListener(NetStatusEvent.NET_STATUS, target);
    ns.play(URL);
    videoPlayer.attachNetStream(ns);
    videoPlayer.smoothing = true;
    if (!videoHandler.contains(videoPlayer))
        videoHandler.addChild(videoPlayer);
}
*/
override public function set currentState(value:String):void
{
    super.currentState = value;
    this.videoHandler.percentHeight = (value == "fullScreen") ? 100 : 50;
}
public function onFCSubscribe(info:Object):void
{
    trace("onFCSubscribe - successful");
}
protected function goBack():void
{
    FlexGlobals.topLevelApplication.showVideo(false);
}
]]>
</fx:Script>
<fx:Declarations>
<!-- Place non-visual elements (e.g., services, value objects) here -->
</fx:Declarations>
<s:Button click="goBack()"
horizontalCenter="0"
label="Go Back" bottom="50"/>
<mx:UIComponent width="100%"
height="50%"
top="0" left="0"
id="videoHandler"/>
</s:Group>
When I run this, I can connect to the server successfully but I don't receive any data. If I open the URL in a browser, PublishNotify is dispatched and I get the stream; as soon as I close it, UnpublishNotify is dispatched and the stream stops. Can anyone suggest possible reasons and a solution?

Unless I am mistaken here, you must implement Apple's HTTP Live Streaming protocol for streams over ten minutes in length. There are a few options you can take:
1. Split your streams into 9-minute segments and feed them consecutively in groups. (Apple actually offers a segmenting tool that even generates an index file, via their Developer tools.)
2. Only enable full-length streaming over Wi-Fi.
3. Fully implement HTTP Live Streaming.
Also, do not forget that besides implementing HTTP Live Streaming, you are ALSO going to have to include an audio-only stream. -
Flash Media Player which handles bitrate switching for live streams?
Hello. I've got a very short timescale to find a solution for a way to display livestreams with bitrate switching. Does anyone
know of any open-source players which can do this effectively? Or do the inbuilt components in CS4 deal with this OK? Thanks.
-
I have a question: on which operating systems (Android, iOS) can a live stream be played and published with the H.264 and PCMU codecs over the RTMP and RTMFP protocols?
As far as I can tell, these codecs can be played on Android over RTMP. On iOS the video is not displayed; as I understand it, the AIR runtime cuts it out.
Another question regarding video texture: will support for live playback of H.264+PCMU on iOS be included in future releases?

On iOS, you'll need to be playing an HLS stream for H.264 to decode when streaming from a remote server.
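For what it's worth, the variant (master) playlist that ties the H.264 renditions and the audio-only fallback together looks along these lines. The paths, bandwidth figures, and codec strings here are made up for illustration, and note that HLS audio must be AAC or MP3, not PCMU:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=64000,CODECS="mp4a.40.2"
audio_only/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=400000,CODECS="avc1.42e01e,mp4a.40.2"
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1200000,CODECS="avc1.42e01e,mp4a.40.2"
high/index.m3u8
```

The device then picks whichever rendition its measured bandwidth supports, falling back to the audio-only entry on very poor connections.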