FMS 4.5 - H264?

I have an app that uses FMS 4.5 to record webcam video to the local hard drive. I've had no luck at all getting decent output using H264/MP4 directly: I get video, but it's full of artifacts, while FLV output is quite nice. Right now I'm using FFmpeg to convert the FLVs to MP4s, but it would be better to have MP4s from the start. Can anyone offer settings that work for them? I'm recording at 768x432...
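
For reference, a minimal client-side sketch of direct H.264/F4V recording. This is only a starting point: the profile/level, bitrate, and keyframe interval below are assumptions to tune, and nc is assumed to be a NetConnection already connected to the recording application.

    import flash.media.Camera;
    import flash.media.H264Level;
    import flash.media.H264Profile;
    import flash.media.H264VideoStreamSettings;
    import flash.net.NetStream;

    var cam:Camera = Camera.getCamera();
    cam.setMode(768, 432, 25);             // match the capture size mentioned above
    cam.setQuality(500 * 1000, 0);         // cap camera bandwidth, let quality float

    var h264:H264VideoStreamSettings = new H264VideoStreamSettings();
    h264.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);
    h264.setMode(768, 432, 25);
    h264.setKeyFrameInterval(25);          // one keyframe per second at 25 fps
    h264.setQuality(500 * 1000, 0);        // keep encoder settings consistent with the camera

    var ns:NetStream = new NetStream(nc);  // nc: connected NetConnection (assumption)
    ns.videoStreamSettings = h264;
    ns.attachCamera(cam);
    ns.publish("mp4:recording.f4v", "record"); // "mp4:" prefix records into an MP4/F4V container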

What do you mean? Are PCMU/PCMA not supported on FMS?
So why can I stream H264 with PCMU live, but not when recording?
Also, custom metadata doesn't work at all for a recorded stream in MP4.

Similar Messages

  • Air 3 + FMS 4.5 + HLS = iOS app play live video? h264+aac?

    OK, I tried before, with AIR 2.6-2.7, to submit a live video app (VP6+MP3) to the App Store and it got rejected because of the HTTP Live Streaming requirement.
    So today I installed FMS 4.5 and published an h264+AAC stream to rtmp://myserverip/livepkgr with the stream name mystream?adbe-live-event=liveevent.
    I tried opening it in iPhone Safari with this URL and it plays with the default player: http://myserverip/hls-live/livepkgr/_definst_/liveevent/mystream.m3u8
    Then I tried the same URL in my iPhone app (AIR 3.0 SDK) to open the stream and it doesn't work; I also tried the f4f and f4m extensions, which don't work either.
    What URL do I have to load from my app's AS3 to bring this HLS stream into my Air 3 iOS app?
    I read somewhere that it is now possible, but is it actually possible yet to make an app that plays HLS?

    OK, I just made it work (kind of) with StageWebView. In a way, with the iPhone default player it is working smoothly with h264 video and AAC audio.
    - Is this the only way to bring HLS (HTTP Live Streaming) live video to Air 3 SDK based iOS apps? I looked through the HLS and HDS documents, and they don't say anything about how to load the stream into AS3 code with an rtmp://ip/live/stream kind of URL.
    - Are there any live TV apps approved for the Apple App Store with this StageWebView implementation? I am worried that it will be rejected, because it is basically just loading an HTML page.
    Appreciate the help in advance.
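
    For reference, the StageWebView approach looks roughly like this (a sketch only; the host, event, and stream names are placeholders taken from the URL above, and it assumes the code runs in the document class so stage is available):

        import flash.geom.Rectangle;
        import flash.media.StageWebView;

        var webView:StageWebView = new StageWebView();
        webView.stage = this.stage;
        webView.viewPort = new Rectangle(0, 0, stage.stageWidth, stage.stageHeight);
        // iOS hands the .m3u8 URL to its built-in HLS player inside the web view:
        webView.loadURL("http://myserverip/hls-live/livepkgr/_definst_/liveevent/mystream.m3u8");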

  • FMS H264 recording FLV format

    I am trying to record an H264/AAC stream in FLV format on my FMS server and all I get in the recording is the metadata but no media content. Is H264 recording only supported in F4V?
    Siva.

    Yes, the FLV format does not support H.264/AAC content. You need to use the F4V container (or an MPEG-4 container) to record H.264/AAC content. From FMS 3.5.3 onwards, the RAW file format is also supported, which can hold H.264/AAC content. You can read more about it at http://help.adobe.com/en_US/FlashMediaServer/3.5.3/FeaturesGuide/flashmediaserver_3.5.3_features.pdf
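
    For example, a minimal server-side (main.asc) sketch that records a published live stream into the MP4/F4V container could look like this; the archive file name here is a placeholder, not from the original post:

        application.onPublish = function(client, stream)
        {
            // "mp4:" selects the MP4/F4V container, which accepts H.264/AAC
            var s = Stream.get("mp4:archive.f4v");
            if (s)
            {
                s.record();
                s.play(stream.name, -1); // -1 = copy live data only into the recording
            }
        };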
    Regards
    Mamata

  • Please prefix 'mp4:' to the stream name to record H264/AAC/HE-AAC encoded data at FMS using DVR...

    I was able to modify the main.asc file in the applications/livepkgr directory to include:
    /*
     * DVRSetStreamInfo:
     * This prototype was created to allow DVR recording functionality from FMLE to FMS.
     */
    Client.prototype.DVRSetStreamInfo = function (info)
    {
        var s = Stream.get("mp4:" + info.streamname + ".f4v");
        if (s)
        {
            if (info.append)
                s.record("append");
            else
                s.record();
            s.play(info.streamname);
        }
    };
    I get three status messages in the FMLE encoding log:
    Requested DVR command has been successfully issued to Primary FMS server for stream livestream1
    Requested DVR command has been successfully issued to Primary FMS server for stream livestream2
    Please prefix 'mp4:' to the stream name to record H264/AAC/HE-AAC encoded data at FMS using DVR feature
    Now I have a few issues:
    1. How do I fix the issue with the 3rd status message ("Please prefix 'mp4:' to the stream name to record H264/AAC/HE-AAC encoded data at FMS using DVR feature"), since the code above is already prefixing "mp4:" to the stream name?
    2. In the applications/livepkgr/streams directory there is a file called "undefined.f4v" which is telling me the code above isn't passing the info.streamname variable.
    3. Also, what do I need to do to playback this .f4v file. I've tried opening it with Adobe Media Player, but it doesn't recognize it.
    I'm obviously using multi-bitrate streaming and that is working flawlessly.  My goal is to record this livestream to later playback as an mp4 file.
    Any ideas?
    UPDATE:
    I know that on page 15 of the FMS 4.5.1 developer's guide, under "Configure DVR (HDS)" > "Publish a DVR stream", it states: "To publish a DVR stream from FMLE, DO NOT check Record OR check DVR Auto Record. Publish the stream just as you publish any live stream." In the next section, "Play DVR streams", it states that I can use SMP (Strobe Media Playback), which I am. So that brings up two more questions:
    1. Why is my SMP player not displaying the DVR functionality (see image):
    In my SMP configuration, my streamType is "LiveOrRecorded" - default. There is a streamType = "dvr", but will I lose live funtionality?
    2. If I want to later on, package the stream into an mp4 file and play it back later for those who missed the live stream, what is the best approach for that?

    Two things: you need to have the "DVRCast" application on the server side, i.e. "dvrcast_origin", for DVR recording to work; and secondly, for MP4 recording your stream name should be "mp4:<streamname>.mp4" while publishing.
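
    Applied to the handler above, that advice would look roughly like this (a sketch; it assumes the DVRCast "dvrcast_origin" application is in place and that info.streamname is the name FMLE sends):

        Client.prototype.DVRSetStreamInfo = function (info)
        {
            var s = Stream.get("mp4:" + info.streamname + ".mp4"); // ".mp4" suffix per the advice above
            if (s)
            {
                if (info.append)
                    s.record("append");
                else
                    s.record();
                s.play(info.streamname, -1); // -1 = record live data only
            }
        };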

  • Archiving live stream at FMS and injecting metadata: VP6 good h264 not

    When I record a live stream at FMS, one in which I've injected metadata in my main.asc file, the archived file plays back fine. The metadata plays back too. I'm able to retrieve it just fine - if I encode VP6.
    If I encode h.264 the file plays back but  the metadata does not.  The fact that the archived file is created and  plays back tells me things are wired correctly.  The only thing I  changed is the format.
    According to FMS docs (http://help.adobe.com/en_US/FlashMediaServer/3.5_SS_ASD/WS5b3ccc516d4fbf351e63e3d11a11afc95e-7e42.html#WS5b3ccc516d4fbf351e63e3d11a11afc95e-7f35)
    ..."The recording format is determined by the filename you pass to the Stream.get()
    method."
    So my record code looks like the following:
    application.onPublish = function(client, stream) {
        trace("onPublish");
        s = Stream.get("mp4:streamname.f4v");
        if (s) {
            s.record();
        }
        this.doRepublish(this.nc, stream);
    };
    My code that injects the data into the stream looks like this:
    Client.prototype.sendDataEvent = function(data) {
        trace("Call to sendDataEvent...");
        this.newStream = Stream.get("mp4:streamname.f4v");
        this.newStream.send("onTextData", data);
    };
    All must be wired correctly because the metadata comes through during the live stream. On playback of the archive though, the metadata doesn't appear to be there.
    Any thoughts?
    Thanks

    My apologies on the s.play() confusion.  I had been trying different versions of the code and posted the one without it.
    Whether I include s.play() or not the file gets created.  Here are the various versions of the onPublish() function I've tried (differences in red):
    1.
    application.onPublish = function(client, stream) {
        trace("onPublish");
        s = Stream.get("mp4:streamname.f4v");
        if (s) {
            s.record();
            s.play("mp4:streamname.f4v");
        }
        this.doRepublish(this.nc, stream);
    };
    2.
    application.onPublish = function(client, stream) {
        trace("onPublish");
        s = Stream.get("mp4:streamname.f4v");
        if (s) {
            s.record();
            s.play(stream);
        }
        this.doRepublish(this.nc, stream);
    };
    3.
    application.onPublish = function(client, stream) {
        trace("onPublish");
        s = Stream.get("mp4:streamname.f4v");
        if (s) {
            s.record();
        }
        this.doRepublish(this.nc, stream);
    };
    All produce the same result - an archived file called mp4:streamname.f4v in my streams folder.  This file plays back fine but does not play back the commands.
    On your other question, about things working fine for VP6, it works fine for FLV.  A file called streamname.flv is produced.  This file plays back fine and does indeed play back commands baked into the file as well.  This is what makes me believe the code is not the problem.  If it works perfectly for one format, there would seem to be very little I could do in my code to break things for the other.
    Can you try this using the record() code snippets in the live docs Stream.record() section?
    http://help.adobe.com/en_US/FlashMediaServer/3.5_SS_ASD/WS5b3ccc516d4fbf351e63e3d11a11afc95e-7e42.html#WS5b3ccc516d4fbf351e63e3d11a11afc95e-7f35
    All you'd need is the code snippets there to record your live stream and another server side function to inject commands into that live stream. Here is that function:
    Client.prototype.sendDataEvent = function(data) {
        trace("Call to sendDataEvent...");
        this.newStream = Stream.get("mp4:streamname.f4v");
        this.newStream.send("onTextData", data);
    };
    Do something simple like call onTextData and pass some text in the data parameter.  Then on the client side viewer, handle the onTextData method.  It will receive the text.  Display it in a text area or something.
    If you record while injecting this text into your stream, the text should display on playback of the archived file.  It will if you encode VP6/FLV, but not if you encode H.264/F4V.
    Let me know what you discover.
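
    As a rough client-side sketch (assumptions: nc is a connected NetConnection, the stream name is the placeholder used above, and this runs in a display object so addChild is available), handling onTextData could look like this:

        import flash.net.NetStream;
        import flash.text.TextField;

        var textArea:TextField = new TextField();
        textArea.width = 400;
        textArea.height = 200;
        addChild(textArea);

        var ns:NetStream = new NetStream(nc);
        ns.client = {
            // receives whatever the server passed to Stream.send("onTextData", data)
            onTextData: function(data:Object):void {
                textArea.appendText(String(data) + "\n");
            },
            onMetaData: function(info:Object):void {} // absorb metadata callbacks
        };
        ns.play("mp4:streamname.f4v");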

  • H264 From Axis Encoder feed to FMS 3.5

    Hello, my name is Dave and I am working on an issue that I have very little knowledge in. Here is the problem space.
    I have a feed from an Axis Encoder that I would like to display in Flex 3.4. As far as I know there is no direct way of doing this. If this is not true, please let me know.
    So I started to explore the possibility of using Flash Media Server 3.5 to receive this feed from the Axis Encoder and, using NetConnection and NetStream, publish this feed to Flex…
    Is this possible? I wish I had the time to research this to the point where I could fully understand the technical aspects, but I just do not have that luxury.
    I have an axrtsp://MY_IP:POR/mpeg4/1/media.amp feed. Can FMS understand this?
    Thanks for your input and time,
    Dk.

    OK, I just solved my problem.
    I am running Vista and FMS is installed in ... "program files" ... etc.
    So, when adding a user through the "users.exe" command, users.dat remains empty even though it reports "new user successfully added".
    The problem was Vista write protection.

  • FMS 3.5 says 'Bad network data': error in handling RTMP extended timestamps / chunkSize?

    Hello all,
    For a client, I am working on a project where a live RTMP stream is published to an Adobe FMS 3.5.6 server from a java application, using Red5 0.9.1 RTMPClient code.
    This works fine, until the timestamp becomes higher than 0xFFFFFF after 4.6 hours, and the RTMP extended timestamp field starts being used. I have already found: when the extended timestamp was written after the header, the last 4 bytes of the data were being cut off. I have fixed this locally, and now the data being sent seems to me to be conformant to the spec. However, FMS still throws an error message in the core log and then kills the connection from the Red5 client. Here is the error message:
    2011-06-03     14:28:02     13060     (e)2611029     Bad network data; terminating connection : chunkstream error:message length 11893407 is longer than max rtmp packet length     -
    2011-06-03     14:28:02     13060     (e)2631029     Bad network data; terminating connection : (Adaptor: _defaultRoot_, VHost: _defaultVHost_, IP: 127.0.0.1, App: live/_definst_, Protocol: rtmp, Client: 5290168480216205379, Handle: 2147942405) : 05 FF FF FF 00 13 = 09 01 00 00 00 01 00 01 01 ' 01 00 00 00 00 00 13 4 09 0 00 00 01 ! 9A & L 0F FA F6 12 , B4 A6 CE H 8A AB DC G BB d k 1B 9F ) 13 13 D2 9A E5 t 8 B8 8D 94 ! 8A AE F6 AF } " U 0 D3 Q EF FF ~ 8D 97 D9 FF BE A3 F3 C9 97 o 9D # F9 7F h A4 F7 } / FB & F1 DC 9C BF   BD D3 E7 CA 97 FE E2 B9 E4 F7 9E 1A F6 BA } C9 w FC _ / / w FE n EF D7 P 9C F4 BE 82 8E F7 | BE 97 B4 BB D7 FE ED I / FB D1 93 9A F9 X \ 85 BD DD I E3 4 E8 M 13 D3 " ) BE A9 92 E5 83 D4 B4 12 DE D5 A3 E6 F4 k DE BF Q 3 A0 g r A4 f D9 BD w * } F7 r 8A S 2 . AB BD EE ^ l f AF E1 0B $ AF 9D D7 - BF E8 ! D3 } D3 i E3 B8 F2 M A8 " B1 A5 EF s ] A5 BC 96 E5 u e X q D2 F1 r F9 i 92 b EE Z d F9 * A6 BB FD 17 w 4 DD 3 o u EB ] ] EF FE B5 B1 0A F2 A0 DD FD B2 98 DF E8 e F6 CB FD 96 V % A5 D5 k ] FD w EF AF k v AA E8 ! 9F / w BE FA 9A _ E F2 D3 , ? 17 } AD 7 EC B3   } 07 B5 | z { { A5 = 11 90 CF BF ; 4 FE EF 95 F7 E7 DF B9 , AF z 91 CF C9 BD DE CB { F5 17 } F2 E5 D7 DF z E6 [ 96 > Y m 9F EB AF DD D8 E8 v B9 A8 E9 % A7 | 1 CF 8B D Z k N DF F8 N FA S R FE . ~ CB A 9 E1 ) 8F 8E BB EC c 6 13 F1 AC FD FD FC 8A F7 F3 K B9 FA ^ / A4 FC B9 AA F6 DE C2 [ 1A E c r B3 BF E5 EC B5 x 94 FD . A9 t I Q % EA EC DE | K FE z A4 97 F9 " 1 0F CA FB F5 F5 p 9E 99 3 - ; B8 F4 F1 FF t A3 EC BC # DE AC 91 13 19 o < 06 F5 FD 7F 7 _ $ D B t B5 0D 8A C1 C1 BA 0B FE DB B7 83 _ } BD z F7 CB { FC M A9 8D = D5 B1 < 85 = EF E1 ; BA H y FC BC B4 C A2 D9 ` e E4 94 H 5 13 ' 93 93 8E E C2 1C R 97 9 X B7 FF 10 9F { ) F1 CF AB AC ] EE H A2 DE D3 C5 m F6 K A2 A7 A2 89 D2 z EB DF 97 ^ k 9E 99 BB E7 B6 97 w { ~ + C7 B2 } FE ' C4 | B6 o H DD r A8 9F DC FF F9 Q b l 93 T B6 EE FF 11 j CD s P C F1 3 R I F8 D8 R 9D 93 AA D5 + DE FC BE " B9 E1 ` CB BD 0F F5 C7 AA w CF 8D p 9A F7 g f N FF 84 B7 K Q 93 g E1 - D3 s } w v AE 96 98 ED CF BA E9 2 . f 99 95 97 o 13 CA F7 s e $ F4 B5 15 C4 A8 DE M F7 w \ 8D 00 C6 C2 b D3 / 7 w F2 ' BF CD 89 FF > D7 FB BC A2 S N FB A5 CD AF D3 F9 9D DF AE B5 17 CF 9D B7 , B9 9 ^ 7F [ 93 84 F7 } _ EA DF u \ 99 Z t E CA M EF 7 " AD FE 92 9E n 7F EB D8 C { 99 8B 9E w H BF B1 | g 9F F3 FA E1 - E5 CB BB x CF p 8B D2 w v EF w FA E2 F7 s C5 AC $ FC B4 DB BE G E4 DC F0 A0 96 F3 ! t DC FF % A5 CB A4 ^ AB D2 BD E7 9A E ' 08 + AF U 17 EB 8A w A7 N E4 A5 x 93 12 _ - ; 09 DD DF m 11 BE w \ } BA D3 t BC D9 97 9B C5 7F D8 H F1 D 7 8A ^ FA n F0 B8 W E6 84 5 - 8 B5 h o C4 F7 83 P 88 CB AE m t BB L 95 A9 s 90 A2 Y o DF K _ / l D2 D1 C9 91 ' E4 BD / / D 97 m BB E7 14 93 % C5 ; DD CF D8 : ~ B5 4 F FA U F0 8F w w DC FD 83 FC 13 EF w p DA A5 07 _ * - 1D 14 9D D5 84 F E6 F0 FF E4 15 w n A5 9F DE d AE F5 " - f D2 AE 96 1F # FA F1 x C1 L DF l M 06 8A E4 z DB 17 BA l DA e 15 CD 85 86 1F 09 82 h ] C6 { E7 C5 AF Z C5 B0 83 v D9 03 FC / ~      -
    The message for which the hex dump is displayed, is a video message of size 4925 bytes. Below is the basic logging in my application:
    *** Event sent to RTMP connector: Video - ts: 16777473 length: 4925. Waiting time: -57937, event timestamp: 16777473
    14:28:02.045 [RtmpPublisher-workerThread] DEBUG o.r.s.s.consumer.ConnectionConsumer - Message timestamp: 16777473
    14:28:02.045 [RtmpPublisher-workerThread] DEBUG o.r.s.n.r.codec.RTMPProtocolEncoder - Channel id: 5
    14:28:02.045 [RtmpPublisher-workerThread] DEBUG o.r.s.n.r.codec.RTMPProtocolEncoder - Last ping time for connection: -1
    14:28:02.045 [RtmpPublisher-workerThread] DEBUG o.r.s.n.r.codec.RTMPProtocolEncoder - Client buffer duration: 0
    14:28:02.046 [RtmpPublisher-workerThread] DEBUG o.r.s.n.r.codec.RTMPProtocolEncoder - Packet timestamp: 16777473; tardiness: -30892; now: 1307104082045; message clock time: 1307104051152, dropLiveFuturefalse
    14:28:02.046 [RtmpPublisher-workerThread] DEBUG o.r.s.n.r.codec.RTMPProtocolEncoder - !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!12b Wrote expanded timestamp field
    14:28:02.046 [NioProcessor-22] DEBUG o.r.server.net.rtmp.BaseRTMPHandler - Message sent
    I have captured the entire frame containing this message with wireshark, and annotated it a bit. You can find it here:
    http://pastebin.com/iVtphPgU
    The video message of 4925 bytes (hex 00 13 3D) is cut up into chunks of 1024 bytes (chunkSize 1024 set by Red5 client and sent to FMS). Indeed, after the 12-byte header and the 4-byte extended timestamp, there are 1024 bytes before the 1-byte header for the next chunk (hex C5). The chunks after that also contain 1024 bytes after the chunk header. This appears correct to me (though please correct me if I'm wrong).
    When we look at the error message in the core log, the hex dump displayed also contains 1024 bytes, but it starts from the beginning of the message header. The last 16 bytes of the message chunk itself are not shown.
    My question is this: is the hex dump in the error message always capped to 1024 bytes, or did FMS really read too little data?
    Something that may be of help is the reported 'too long' message length 11893407. This corresponds to hex B5 7A 9F, which can also be found in the packet, namely at row 0c60 (I've annotated it as [b5 7a 9f]). This location is exactly 16 bytes after the start of the 4th chunk data, not really a place to look for timestamps.
    My assumptions during this bug hunting session were the following (would be nice if someone could validate these for me):
    - message length, as specified in the RTMP 12-byte and 8-byte headers, defines the total number of data bytes for the message, NOT including the header of the first message chunk, its extended timestamp field, or the 1-byte headers for subsequent chunks. The behaviour is the same whether or not the message has an extended timestamp.
    - chunk size, as set by the chunkSize message, defines the total number of data bytes for the chunk, not including the header or extended timestamp field. The behaviour is the same whether or not the message has an extended timestamp (see the small worked example below).
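
    As a concrete illustration of those assumptions (a hypothetical helper for the numbers discussed above, not part of Red5 or FMS):

        // 4925-byte video message, 1024-byte chunk size
        function describeChunking(messageLength:int, chunkSize:int):String {
            var fullChunks:int = int(messageLength / chunkSize);
            var remainder:int = messageLength % chunkSize;
            return fullChunks + " chunks of " + chunkSize + " bytes" +
                   (remainder > 0 ? " + a final chunk of " + remainder + " bytes" : "");
        }
        trace(describeChunking(4925, 1024)); // "4 chunks of 1024 bytes + a final chunk of 829 bytes"
        // Under the assumptions above, the 12-byte header and 4-byte extended timestamp on the
        // first chunk, and the 1-byte headers on later chunks, are counted in neither the
        // message length nor the chunk size.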
    I believe I've chased this problem as far as I can without having access to the FMS 3.5 code, or at least being able to crank up the debug logging to the per-message level. I realize it's a pretty detailed issue and a long shot, but being able to publish a stream continuously 24/7 is critical for the project.
    I would be very grateful if someone could have a look at this hex dump to see if the message itself is correct, and if so, to have a look at how FMS3.5.6 handles this.
    Don't hesitate to ask me for more info if it can help.
    Thanks in advance
    Davy Herben
    Solidity

    Hello,
    It took a bit longer than expected, but I have managed to create a minimal test application that will reproduce the error condition on all machines I've tested on. The application will simply read an H264 file and publish it to an FMS as a live stream. To hit the error condition faster, without having to wait 4.6 hours, the application will add a fixed offset to all timestamps before sending it to the FMS.
    I have created two files:
    http://www.solidity.be/publishtest.jar : Runnable java archive with all libraries built in
    http://www.solidity.be/publishtest.zip : Zip file containing sources and libraries
    You can run the jar as follows:
    java -jar publishtest.jar <inputFile> <server> <port> <application> <stream> <timestampOffset>
    - inputFile: path to an H264 input video file
    - server: hostname or IP of FMS server to publish to
    - port: port number to publish to (1935)
    - application: application to publish to (live)
    - stream: stream to publish to (output)
    - timestampOffset: number of milliseconds to add to the timestamp of each event, in hexadecimal format. Putting FFFFFF here will cause the server to reject the connection immediately, while FFFF00 or FFF000 will allow the publishing to run for a while before FMS kills it
    Example of a complete command line:
    java -jar publishtest.jar /home/myuser/Desktop/movie.mp4 localhost 1935 live output FFF000
    Good luck with the bug hunting. Let me know if there is anything I can help you with.
    Kind regards,
    Davy Herben

  • Adobe Air 3.0 iOS streaming H264/Speex over RTMP -- No Video

    Problem:
    H264/Speex RTMP stream from Flash Media Server doesn't display video.  Audio plays fine.
    Conditions:
    Adobe Air 3.0
    iOS device (iPad 2, iOS4.3)
    App settings:
    <renderMode>direct</renderMode>
    Flash Builder 4.5.1 compiler settings:
    -swf-version=13
    -target-player=11.0.0
    I have tried using both stageVideo and the regular Video object to render an H264 signal streaming from Flash Media Server over rtmp, but with no luck.  I can hear the audio, but the video is never rendered.
    I CAN see H263 video when streamed over RTMP using this setup (with just the Video object). 
    I can also stream a locally stored mp4 (H264/AAC) file over rtmp from the iOS device and play it locally just fine (using a stageVideo object).
    I have attempted this with both stageVideo (which works fine when streaming an mp4 file from the iOS device) and with the regular Video object (the regular Video object handles H263 just fine streaming down from FMS over rtmp).  I've also played around with backgroundAlpha = "0" for this case, all with no luck.
    I've followed all the online instructions and tutorials to get this working for the past week, but have not had any luck. I'm happy to post more code, but this approach is fairly straightforward and has been described multiple times around these forums and in the Adobe blog posts. It goes like this:
         - instantiate NetConnection (_nc) to FMS and wait for successful connection:
                    _nc = new NetConnection();
                   _nc.connect(serverAddress);
         - instantiate NetStream (_ns) and connect using NetConnection: 
                   _ns = new NetStream(_nc);
         - setup listener for StageVideoAvailabilityEvent; if stageVideo becomes available, attach _ns: 
                  _stageVideo = stage.stageVideos[0];
                   _stageVideo.attachNetStream(_ns);
         - if stageVideo ISN'T available, fall back to Video object:
                   _video = new Video();
                   _video.attachNetStream(_ns);
                   stage.addChild(_video);
         - play netStream app:
                   _ns.play(appName);
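
     Pulled together, those steps look roughly like this (a sketch only; serverAddress and appName are placeholders, and it assumes the code runs in the document class or a frame script with stage access):

          import flash.display.Stage;
          import flash.events.NetStatusEvent;
          import flash.events.StageVideoAvailabilityEvent;
          import flash.geom.Rectangle;
          import flash.media.StageVideo;
          import flash.media.StageVideoAvailability;
          import flash.media.Video;
          import flash.net.NetConnection;
          import flash.net.NetStream;

          var theStage:Stage = stage;                 // captured for use inside the listeners below
          var _nc:NetConnection = new NetConnection();
          var _ns:NetStream;

          _nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
              if (e.info.code != "NetConnection.Connect.Success") return;
              _ns = new NetStream(_nc);
              _ns.client = {};                        // absorb onMetaData and friends
              theStage.addEventListener(StageVideoAvailabilityEvent.STAGE_VIDEO_AVAILABILITY,
                  function(ev:StageVideoAvailabilityEvent):void {
                      if (ev.availability == StageVideoAvailability.AVAILABLE && theStage.stageVideos.length > 0) {
                          var sv:StageVideo = theStage.stageVideos[0];                    // hardware path
                          sv.viewPort = new Rectangle(0, 0, theStage.stageWidth, theStage.stageHeight);
                          sv.attachNetStream(_ns);
                      } else {
                          var video:Video = new Video(theStage.stageWidth, theStage.stageHeight); // software fallback
                          video.attachNetStream(_ns);
                          theStage.addChild(video);
                      }
                      _ns.play(appName);
                  });
          });
          _nc.connect(serverAddress);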
    My renderMode is set correctly ("direct"), and from what I can tell, this isn't a backgroundAlpha issue.
    I have seen this problem described elsewhere on this forums and on the net. 
    I'd like to know from Adobe: are there any compatibility issues with this approach?  There doesn't seem to be any type of compatibility chart listing which video codecs and transport protocols are available on the different mobile platforms.  Since mobile deployment is so heavily fragmented, a descriptive chart or table like this would be helpful for those of us developing -- at least for the main mobile platforms predominantly in use.
    Other forum posts similar to this problem:
    http://forums.adobe.com/message/3981541#3981541
    http://forums.adobe.com/message/3954578#3954578
    Thanks in advance!

    I do hope someone from Adobe is "hearing" this, guys. The lack of RTMP-based H.264 video on the Air for iOS is a major problem, indeed.
    As Fabio Sonnati mentioned in http://sonnati.wordpress.com/2011/04/26/air-2-6-for-ios-and-video-playback/, AIR for iOS does support HTTP streaming (via HLS) of h.264 videos. However, when streaming via RTMP, AIR for iOS only supports VP6 and Spark – a couple of old, retired codecs.
    While HTTP streaming (HLS) seems to be a good option for those who simply want to “play a video” in iOS, I do believe it has some severe limitations, especially for live-communications. I’d like to share some of these thoughts with you.
    1. HLS has ridiculously high latency for live videos (around 40 seconds), when compared to RTMP. Although this may not be a problem for on-demand videos, it sure is a great problem for anyone doing serious live-communications applications (such as webconferencing, live webcasting with audience interaction or Skype-like video chats), which require near-zero latency.
    2. Perhaps someone can correct me on this (hopefully!), but as far as I know, HTTP streaming will not allow cuepoints to be read from videos. This is particularly painful for anyone doing video-triggered actions, such as slide changes (for webinar apps), subtitling or live closed captioning, etc. I read somewhere that OSMF player allows cuepoints (or "temporal metadata". See http://blogs.adobe.com/osmf/2009/11/cue_point_support_in_osmf.html), but I haven't been able to test it myself.
    3. Although HLS it is quite compatible with firewalls (since it flows through port 80), RTMP with tunnelling also flows through port 80 or 443, which adds great compatibility, even on very restricted networks. Our experience with very large clients proves that, hands down.
    In other words, HTTP/HLS streaming is Ok. But it simply does *not* fit into every shoe that RTMP does. We do believe that RTMP remains as our best option for live streaming or serious streaming-oriented *apps* (in which things more complex than “mere video playing in a window” actually happen).
    That all said, I do believe we should let Adobe know about this need. The fact that RTMP streaming in AIR for iOS is limited to VP6 and Spark, which are two “dead” codecs, still puts us, Air developers, in a very fragile position in terms of what we can accomplish with video in iOS.
    I’m sure some of you cheered when you heard about Flash Player 11 having h.264 video encoding. This, (plus the echo cancellation feature that came in 10.3) opened great doors for great Unified-Communication applications to be developed for Flash/Air. Now, it’s undeniable that clients want those applications running on tablets, especially the iPad.
    Not being able to use h.264 via RTMP on iOS is certainly a huge step backwards. Anyone shares this same opinion? What do you guys believe to be the best option to let Adobe really know about this need? Is this limitation a simple lack-of-a-feature (which can be fixed by Adobe) or is this some imposed thing by Apple?
    Just one final note: Air for Android does *not* have the same limitation. It does allow RTMP streaming of h.264.
    Thanks for your attention,
    Helder Conde

  • Error: TSorensonVideoSmartQueue would have dropped H264 sequence header

    I'm seeing this error flooding my core logs.
    TSorensonVideoSmartQueue would have dropped H264 sequence header
    I'm running FMS 4.5.1 on Windows.  Live streams are sent to it via FMLE and FP11 H.264.  The FMLE streams are not new, but the implementation of FP11 using the H.264 codec is (used to use Sorenson).  The errors used to be rare but now there is a constant stream of them.
    Any idea what is causing this?

    I am currently getting this warning flooding my logs, and it eventually causes the core to crash.
    I have been trying to narrow down any specific settings (h264 level, resolution, fps) that work without triggering this warning, but have not found anything definite yet.
    I only get this warning on a remote AMS 5.0.3 over the internet, and cannot reproduce it locally on my LAN.
    It seems to only happen when there are subscribers to the stream, since I do not see it when I am only publishing.
    ANY info is greatly appreciated! This is very frustrating and is one of the few things holding back my progress.
    thanks!

  • H264 video published with FP11 to FMS4.5 not playing anywhere else

    I have created an swf where I can record the webcam picture as H264 video to a FMS 4.5 (I am using the developer version).
    My code is looking like this:
    var h264Settings = new H264VideoStreamSettings();
    h264Settings.setProfileLevel("baseline", "1.2");
    stream.videoStreamSettings = h264Settings;
    stream.publish("mp4:mystream.f4v", "record");
    I can then replay the video from FMS just fine, but if I try to copy the video from the FMS application directory into a local project and try to play the video with the FLVPlayback component, or with the Adobe Media Player, it is not playing at all.
    Is this to be expected? Can't I record a webcam video with FMS in H264 and use that video later without FMS?

    This might be expected. H.264 files recorded on FMS may not play in other players by default - they need some post-processing. You can download the ADOBE F4V POST PROCESSOR tool from here: http://www.adobe.com/products/flashmediaserver/tool_downloads/. Once you process the file using this tool, it should be playable.

  • Record Audio in mp3 file format using FMS

    I am recording audio using FMS, which records the audio in FLV format by default, but I would like to record in MP3 format. Is that possible?
    I also record video in MP4 format using the H264 encoder, which gives me a video file in MP4 format, but when I try to play that MP4 file it doesn't play. So I post-processed the video with a tool called F4V Post Processor. Now the MP4 video file plays, but there is no sound in the video.
    My code is:
            var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
            h264Settings.setProfileLevel(H264Profile.MAIN, H264Level.LEVEL_3);
            ns.videoStreamSettings = h264Settings;
            ns.publish("mp4:"+FILENAME+".mp4","record");
    I have searched a lot but haven't found a solution, so please help with both issues.
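
    For the audio-only part, a minimal client-side sketch (nc is assumed to be a connected NetConnection; with no camera attached, the server records the audio into an FLV by default, as noted above):

        import flash.media.Microphone;
        import flash.net.NetStream;

        var mic:Microphone = Microphone.getMicrophone();
        var ns:NetStream = new NetStream(nc);
        ns.attachAudio(mic);                 // audio only - no camera attached
        ns.publish("audioOnly", "record");   // server records audioOnly.flv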

    The .flv format is a "container" format, not a video format.
    So that means that inside the container you can have a video file and/or an audio file, or even just an audio file... which most likely is already an mp3 file. Use something like "FLV Extract" to break open the container and see:
    http://moitah.net/
    But really, for any Web delivery of audio, the .flv (audio only) method works just fine. Lots of media players available for the Flash .flv file.
    For example:
    Sounds from my bunk on the Lady Washington
    http://exploreolympics.com/reports/?p=2857
    Scroll down to third media player.
    A little quiet to start with... but this is a .flv file with just an mp3 audio file inside.
    Best of luck!
    Adninjastrator

  • H264 live encoding in Flash Player 11 slowly?

    Hi,
    I am using the new Flash Player 11 ability to encode H264 video within the Flash Player. I am streaming the webcam via RTMP to a FMS. I am using the following settings:
    H.264 Baseline
    Level 5.1
    Keyframe freq. 5 seconds
    25 FPS
    Input size: 1280 x 720 (720p)
    Output size: 1280 x 720 (720p)
    Bit Rate: 1000Kbps
    When I use these settings with FMLE 3.2, the final stream plays very fluently with a good frame rate. If I use Flash Player 11 for the encoding job instead, the final video is not as fluent as before and achieves a lower frame rate (maybe 5 FPS).
    So the question is: is Flash Player 11 not as good as FMLE 3.2, or am I missing some configuration? FP11 should be capable of encoding and streaming a fluent 720p stream, shouldn't it?
    Thanks a lot.
    Best
    Malte

    Addition: I've found out that the incoming bandwidth is very constant when streaming and encoding from FMLE. It stays relatively constant around 1500 kbps.
    When I use Flash Player 11 for encoding and streaming, the incoming bandwidth looks like the Rocky Mountains: it jumps between 1000 and 6000 kbps during the whole stream.
    That is definitely the reason why it "stutters" when using Flash Player 11. But I have no idea what I am doing wrong.
    I am using these encoding settings for FP11 ...
    var videoStreamSettings : H264VideoStreamSettings = new H264VideoStreamSettings();
    videoStreamSettings.setKeyFrameInterval(5);
    videoStreamSettings.setMode(1280,720, 25);
    videoStreamSettings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_5_1);
    videoStreamSettings.setQuality(1000000, 0);
    ... and these settings for the camera:
    var camera : Camera = Camera.getCamera(0);
    camera.setQuality(1000000, 0);
    camera.setKeyFrameInterval(5);
    camera.setMode(1280, 720, 25);
    Any idea why the bandwidth is not constant in FP11? The function camera.setQuality() should do the job, right (http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/media/Camera.html#setQuality%28%29)?
    Thanks
    Malte

  • Recording/Streaming h264 in AMS 5.01 tries to save flv

    I'm trying to record a live stream to the livepkgr application using the following code:
    var cam:Camera;
    var mic:Microphone;
    var video:Video;
    var nc:NetConnection;
    var ns:NetStream;
    var h264Settings:H264VideoStreamSettings;
    cam = Camera.getCamera();
    mic = Microphone.getMicrophone();
    cam.setMode(320, 240, 30, true);
    cam.setQuality(90000,90);
    cam.setMotionLevel(100,200);
    cam.setKeyFrameInterval(keyframe);
    mic.setLoopBack(false);
    mic.rate = 44;
    video = new Video(vidWidth,vidHeight);
    video.attachCamera(cam);
    nc=new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, netStatus);
    nc.connect("rtmp://localhost/livepkgr");
    function netStatus(event:NetStatusEvent):void
    {
        var info:Object = event.info;
        if (info.code == "NetConnection.Connect.Success")
        {
            ns = new NetStream(nc);
            ns.attachAudio(mic);
            ns.attachCamera(cam);
            h264Settings = new H264VideoStreamSettings();
            h264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);
            h264Settings.setMode(320, 240, 30);
            h264Settings.setQuality(90000, 90);
            ns.videoStreamSettings = h264Settings;
            ns.publish('mystream1', 'record');
            var metaData:Object = new Object();
            metaData.codec = ns.videoStreamSettings.codec;
            metaData.profile = h264Settings.profile;
            metaData.level = h264Settings.level;
            metaData.fps = cam.fps;
            metaData.height = cam.height;
            metaData.width = cam.width;
            metaData.keyFrameInterval = cam.keyFrameInterval;
            ns.send("@setDataFrame", "onMetaData", metaData);
        }
    }
    I get this error in the server core.00.log
    [root logs]# tail -n1000 -f core.00.log
    #Version: 1.0
    #Start-Date: 2013-07-17 00:18:51
    #Software: Adobe Media Server 5.0.1 r1076 x64
    2013-07-17          00:19:41          6724          (w)2581173          /opt/adobe/ams/applications/livepkgr/events/_definst_/mystream1/Event.xml does not exist or is invalid.          -
    2013-07-17          00:19:43          6724          (w)2611179          Warning from libf4f.so: [Utils] [mystream1] Discarded all Media Messages till first Video Key Frame. Total duration of discarded Messages - 0 ms
    2013-07-17          00:19:47          6724          (w)2611179          Warning from libflv.so: Recording H264 to FLV is unsupported, tried in FLV : /opt/adobe/ams/applications/livepkgr/streams/_definst_/mystream1.flv.          -
    If I try to publish like this:
    ns.publish('mp4:mystream1.f4v','record');
    then I get this error in my flash player:
    NetStream.Record.NoAccess
    and the following in the server core.00.log
    2013-07-17          00:25:19          6724          (w)2581173          /opt/adobe/ams/applications/livepkgr/events/_definst_/mystream1.f4v/Event.xml does not exist or is invalid.
    2013-07-17          00:25:25          6724          (e)2611082          Failed to record mystream1.f4v.
    Can anyone help in this regard?

    btw looks like the feedback form is broken
    Delivery has failed to these recipients or distribution lists:
    Sent by Microsoft Exchange Server 2007
    Feedback Report            : ******BUG******
    Concise problem statement:
    File gets corrupted; this has happened with both video and audio-only streams, over a long period of live recording. The stream is sent via netstream.append and all users can subscribe and play without issue. However, upon playback the file only plays partially, and in many cases the file size is grossly oversized.
    Steps to reproduce bug:
    1. Record a stream to FMS using Flash Player.
    2. After 1 hour of recording, stop publishing the stream.
    3. Wait 20 seconds, then append the stream and let it publish for 19 minutes, then stop for 20; continue doing this until you hit the 2.5-hour mark.
    Results: Some recordings work as expected, while for about 1 in 15 the file will only play up to a random point rather than the full duration. The file size is also much greater than an equivalent file of the same duration.
    Expected results:
    The file should play the full duration and the file size shouldn't be so drastically larger than a file that recorded the same equivalent time.

  • Flash player drops frames with playing live h264 frames

    Hi
    I have been trying to reach an Adobe representative about this for a long time, but unfortunately no one responds fast enough. I am facing a terrible issue: when broadcasting h264 live video from a Flash-based encoder client, I notice severe frame drops on the playback end no matter what player I use. It even happens if I use the FMS development server on my own system. Video becomes smooth if I add a 1-second buffer to the player, but then the video is no longer live.
    So my question is: is there something wrong with h264 encoding in the Flash Player, or is that the way it is supposed to be? At zero buffer, frame drops are severe. Sample config:
    320 x 240 @ 24 fps @ 90000 bps @ 100 KFI @ 3.1 level @ baseline profile.
    I sincerely hope someone at Adobe will respond. There are many stranded questions on this topic on the forum and on the web already.
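
    For reference, the buffer compromise mentioned above is just a couple of properties on the subscribing NetStream (ns here is assumed to be the playback stream; the exact values are a trade-off to tune):

        ns.bufferTime = 0.5;     // half a second of buffer instead of a full second
        ns.bufferTimeMax = 1.0;  // let playback catch back up so it never drifts far behind live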

    The CPU usage is 2-6% on the i7 920 vista 64bit computer. The audio doesn't seem to be dropping. Some more clues that might help you help me
    - The quality of the video doesn't matter, 360p drops frames just as often as 720p or 1080p.
    - The dropped frames still occur even when the video is completely downloaded and replayed without any buffering needed.
    One thing that does fix the dropped frames is completely disabling my sound card in the windows playback sound options. This also fixes the second computer's dropped frames which only has onboard realtek audio. I have tried using a Creative PCI sound card and the built in realtek sound but they both behave the same. The latest drivers were installed for both.

  • About Live stream with FMS

    Hi,
    I am doing something with FMS. I downloaded FMS 4 and Adobe Flash Media Encoder 3.5 here.
    I installed them on my PC and they work without any configuration.
    Now I want to develop a client based on gstreamer to play the rtmp:// videos.
    My client plays VOD video on FMS as rtmp://MYIP/vod/sample.flv without any problem.
    When I switch to playing the live video published by Adobe Flash Media Encoder 3.5, rtmp://MYIP/live/livestream.flv, it starts but stops after about 2 seconds.
    I did some debugging, and it seems the server stops sending data to my client. After it stops, it cannot start again; I must reboot my client.
    I have checked the server with rtmpdump + VLC, and it works fine.
    The Log in FMS is:
    Accepted a connection from IP:192.168.0.95, referrer: , pageurl:
    Sending error message: Method not found (FCSubscribe).
    Sending error message: Response object not found (_result:2147483647).
    "Sending error message: Method not found (FCSubscribe)." - this also appears with rtmpdump.
    "Sending error message: Response object not found (_result:2147483647)." - this appears only with my client; what's the reason?
    In my client, I see the following after about 2 seconds:
    WARNING: Stream corrupt?!
    ERROR: Wrong data size (16717496), stream corrupted, aborting!
    What happens when my client connects to FMS? What's the difference between VOD and LIVE?
    Anyone try this with gstreamer?

    Thanks a lot for your reply.
    With your tool link, it can play the live video from my server.
    But my client is based on Linux and cannot run a web browser, so I am trying to implement it with gstreamer.
    I published the video with the default settings of Adobe Flash Media Encoder 2.5, just setting the video format to H264.
    Today I did some more debugging;
    please help to analyse it; maybe you can find something from the log:
    DEBUG: Parsing...
    DEBUG: Parsed protocol: 0
    DEBUG: Parsed host    : 192.168.0.143
    DEBUG: Parsed app     : live
    DEBUG: Protocol : RTMP
    DEBUG: Hostname : 192.168.0.143
    DEBUG: Port     : 1935
    DEBUG: Playpath : livestream
    DEBUG: tcUrl    : rtmp://192.168.0.143:1935/live
    DEBUG: app      : live
    DEBUG: live     : yes
    DEBUG: timeout  : 30 sec
    DEBUG: Setting buffer time to: 36000000ms
    DEBUG: RTMP_Connect1, ... connected, handshaking
    DEBUG: HandShake: Type Answer   : 03
    DEBUG: HandShake: Server Uptime : 558990173
    DEBUG: HandShake: FMS Version   : 4.0.0.1
    DEBUG: HandShake: Handshaking finished....
    DEBUG: RTMP_Connect1, handshaked
    DEBUG: Invoking connect
    Pipeline is PREROLLING ...
    DEBUG: HandleServerBW: server BW = 2500000
    DEBUG: HandleClientBW: client BW = 2500000 2
    DEBUG: HandleChangeChunkSize, received: chunk size change to 4096
    DEBUG: RTMP_ClientPacket, received: invoke 242 bytes
    DEBUG: (object begin)
    DEBUG: (object begin)
    DEBUG: Property: <Name:             fmsVer, STRING:     FMS/4,0,0,1121>
    DEBUG: Property: <Name:       capabilities, NUMBER:     255.00>
    DEBUG: Property: <Name:               mode, NUMBER:     1.00>
    DEBUG: (object end)
    DEBUG: (object begin)
    DEBUG: Property: <Name:              level, STRING:     status>
    DEBUG: Property: <Name:               code, STRING:     NetConnection.Connect.Success>
    DEBUG: Property: <Name:        description, STRING:     Connection succeeded.>
    DEBUG: Property: <Name:     objectEncoding, NUMBER:     0.00>
    DEBUG: Property: <Name:               data, OBJECT>
    DEBUG: (object begin)
    DEBUG: Property: <Name:            version, STRING:     4,0,0,1121>
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <_result>
    DEBUG: HandleInvoke, received result for method call <connect>
    DEBUG: sending ctrl. type: 0x0003,nTime: 0x12c
    DEBUG: Invoking createStream
    DEBUG: FCSubscribe: livestream
    DEBUG: Invoking FCSubscribe
    DEBUG: RTMP_ClientPacket, received: invoke 21 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <onBWDone>
    DEBUG: Invoking _checkbw
    DEBUG: RTMP_ClientPacket, received: invoke 29 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <_result>
    DEBUG: HandleInvoke, received result for method call <createStream>
    DEBUG: SendPlay, seekTime=0, stopTime=0, sending play: livestream
    DEBUG: Invoking play
    DEBUG: sending ctrl. type: 0x0003,nTime: 0x2255100
    DEBUG: RTMP_ClientPacket, received: invoke 119 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object begin)
    DEBUG: Property: <Name:              level, STRING:     error>
    DEBUG: Property: <Name:               code, STRING:     NetConnection.Call.Failed>
    DEBUG: Property: <Name:        description, STRING:     Method not found (FCSubscribe).>
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <_error>
    ERROR: rtmp server sent error
    DEBUG: RTMP_ClientPacket, received: invoke 16419 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <_onbwcheck>
    DEBUG: Invoking _result
    DEBUG: HandleChangeChunkSize, received: chunk size change to 4096
    DEBUG: HandleCtrl, received ctrl. type: 0, len: 6
    DEBUG: HandleCtrl, Stream Begin 1
    DEBUG: RTMP_ClientPacket, received: invoke 162 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object begin)
    DEBUG: Property: <Name:              level, STRING:     status>
    DEBUG: Property: <Name:               code, STRING:     NetStream.Play.Reset>
    DEBUG: Property: <Name:        description, STRING:     Playing and resetting livestream.>
    DEBUG: Property: <Name:            details, STRING:     livestream>
    DEBUG: Property: <Name:           clientid, STRING:     oAA7AAAA>
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <onStatus>
    DEBUG: HandleInvoke, onStatus: NetStream.Play.Reset
    DEBUG: RTMP_ClientPacket, received: invoke 156 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object begin)
    DEBUG: Property: <Name:              level, STRING:     status>
    DEBUG: Property: <Name:               code, STRING:     NetStream.Play.Start>
    DEBUG: Property: <Name:        description, STRING:     Started playing livestream.>
    DEBUG: Property: <Name:            details, STRING:     livestream>
    DEBUG: Property: <Name:           clientid, STRING:     oAA7AAAA>
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <onStatus>
    DEBUG: HandleInvoke, onStatus: NetStream.Play.Start
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: RTMP_ClientPacket, received: notify 24 bytes
    DEBUG: (object begin)
    DEBUG: (object end)
    DEBUG: RTMP_ClientPacket, received: notify 487 bytes
    DEBUG: (object begin)
    DEBUG: (object begin)
    DEBUG: Property: <Name:             author, STRING:     >
    DEBUG: Property: <Name:          copyright, STRING:     >
    DEBUG: Property: <Name:        description, STRING:     >
    DEBUG: Property: <Name:           keywords, STRING:     >
    DEBUG: Property: <Name:             rating, STRING:     >
    DEBUG: Property: <Name:              title, STRING:     >
    DEBUG: Property: <Name:         presetname, STRING:     Custom>
    DEBUG: Property: <Name:       creationdate, STRING:     Thu Jun 09 08:55:02 2011
    >
    DEBUG: Property: <Name:        videodevice, STRING:     Syntek STK1150>
    DEBUG: Property: <Name:          framerate, NUMBER:     25.00>
    DEBUG: Property: <Name:              width, NUMBER:     320.00>
    DEBUG: Property: <Name:             height, NUMBER:     240.00>
    DEBUG: Property: <Name:       videocodecid, NUMBER:     7.00>
    DEBUG: Property: <Name:      videodatarate, NUMBER:     200.00>
    DEBUG: Property: <Name:           avclevel, NUMBER:     31.00>
    DEBUG: Property: <Name:         avcprofile, NUMBER:     66.00>
    DEBUG: Property: <Name:        audiodevice, STRING:     Realtek HD Audio Input>
    DEBUG: Property: <Name:    audiosamplerate, NUMBER:     22050.00>
    DEBUG: Property: <Name:      audiochannels, NUMBER:     1.00>
    DEBUG: Property: <Name:   audioinputvolume, NUMBER:     100.00>
    DEBUG: Property: <Name:       audiocodecid, NUMBER:     2.00>
    DEBUG: Property: <Name:      audiodatarate, NUMBER:     32.00>
    DEBUG: (object end)
    DEBUG: (object end)
    INFO: Metadata:
    INFO:   author
    INFO:   copyright
    INFO:   description
    INFO:   keywords
    INFO:   rating
    INFO:   title
    INFO:   presetname            Custom
    INFO:   creationdate          Thu Jun 09 08:55:02 2011
    INFO:   videodevice           Syntek STK1150
    INFO:   framerate             25.00
    INFO:   width                 320.00
    INFO:   height                240.00
    INFO:   videocodecid          7.00
    INFO:   videodatarate         200.00
    INFO:   avclevel              31.00
    INFO:   avcprofile            66.00
    INFO:   audiodevice           Realtek HD Audio Input
    INFO:   audiosamplerate       22050.00
    INFO:   audiochannels         1.00
    INFO:   audioinputvolume      100.00
    INFO:   audiocodecid          2.00
    INFO:   audiodatarate         32.00
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: ignoring too small video packet: size: 2
    Pipeline is PREROLLED ...
    Setting pipeline to PLAYING ...
    New clock: GstSystemClock
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: ignoring too small video packet: size: 2
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: ignoring too small audio packet: size: 0
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    WARNING: Stream corrupt?!
    ERROR: Wrong data size (8059624), stream corrupted, aborting!
    DEBUG: RTMP_ReadPacket, m_nChannel: 167
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0xe3
    DEBUG: RTMP_ReadPacket, m_nChannel: 6ad6
    DEBUG: RTMP_ReadPacket, m_nChannel: 3d7c
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x00
    DEBUG: HandleCtrl, received ctrl. type: 8760, len: 6
    DEBUG: HandleCtrl, Stream xx -1742407864
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0xff
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0xff
    DEBUG: RTMP_ReadPacket, m_nChannel: 5260
    DEBUG: RTMP_ReadPacket, failed to allocate packet
    Got EOS from element "pipeline0".
    Execution ended after 69266417190 ns.
    Setting pipeline to PAUSED ...
    Setting pipeline to READY ...
    DEBUG: Invoking deleteStream
    Setting pipeline to NULL ...
    FREEING pipeline ...
    After a short transfer, the client cannot get any more data. It seems the connection was reset by the peer.
