Archiving a live stream at FMS and injecting metadata: VP6 works, H.264 doesn't

When I record a live stream at FMS, one in which I've injected metadata in my main.asc file, the archived file plays back fine. The metadata plays back too, and I'm able to retrieve it just fine - if I encode VP6.
If I encode H.264, the file plays back but the metadata does not. The fact that the archived file is created and plays back tells me things are wired correctly. The only thing I changed is the format.
According to the FMS docs (http://help.adobe.com/en_US/FlashMediaServer/3.5_SS_ASD/WS5b3ccc516d4fbf351e63e3d11a11afc95e-7e42.html#WS5b3ccc516d4fbf351e63e3d11a11afc95e-7f35):
..."The recording format is determined by the filename you pass to the Stream.get() method."
So my record code looks like the following:
application.onPublish = function(client, stream) {
    trace("onPublish");
    s = Stream.get("mp4:streamname.f4v");
    if (s) {
        s.record();
    }
    this.doRepublish(this.nc, stream);
};
My code that injects the data in to the stream looks like this:
Client.prototype.sendDataEvent = function(data) {
    trace("Call to sendDataEvent...");
    this.newStream = Stream.get("mp4:streamname.f4v");
    this.newStream.send("onTextData", data);
};
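(For reference, a publishing client can trigger a Client.prototype method like this over its NetConnection - the payload shape below is just an assumption for illustration:)
// Client-side (AS3): invoke the server-side sendDataEvent defined above
// 'nc' is the already-connected NetConnection used by the publisher
nc.call("sendDataEvent", null, {text: "cue point one"});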
All must be wired correctly because the metadata comes through during the live stream. On playback of the archive, though, the metadata doesn't appear to be there.
Any thoughts?
Thanks

My apologies for the s.play() confusion. I had been trying different versions of the code and posted the one without it.
Whether I include s.play() or not, the file gets created. Here are the various versions of the onPublish() function I've tried (they differ only in the s.play() call after s.record()):
1.
application.onPublish = function(client, stream) {
    trace("onPublish");
    s = Stream.get("mp4:streamname.f4v");
    if (s) {
        s.record();
        s.play("mp4:streamname.f4v");
    }
    this.doRepublish(this.nc, stream);
};
2.
application.onPublish = function(client, stream) {
    trace("onPublish");
    s = Stream.get("mp4:streamname.f4v");
    if (s) {
        s.record();
        s.play(stream);
    }
    this.doRepublish(this.nc, stream);
};
3.
application.onPublish = function(client, stream) {
    trace("onPublish");
    s = Stream.get("mp4:streamname.f4v");
    if (s) {
        s.record();
    }
    this.doRepublish(this.nc, stream);
};
All produce the same result - an archived file called mp4:streamname.f4v in my streams folder.  This file plays back fine but does not play back the commands.
On your other question, about things working fine for VP6, it works fine for FLV.  A file called streamname.flv is produced.  This file plays back fine and does indeed play back commands baked into the file as well.  This is what makes me believe the code is not the problem.  If it works perfectly for one format, there would seem to be very little I could do in my code to break things for the other.
Can you try this using the record() code snippets in the live docs Stream.record() section?
http://help.adobe.com/en_US/FlashMediaServer/3.5_SS_ASD/WS5b3ccc516d4fbf351e63e3d11a11afc95e-7e42.html#WS5b3ccc516d4fbf351e63e3d11a11afc95e-7f35
All you'd need is the code snippets there to record your live stream and another server side function to inject commands into that live stream. Here is that function:
Client.prototype.sendDataEvent = function(data) {
    trace("Call to sendDataEvent...");
    this.newStream = Stream.get("mp4:streamname.f4v");
    this.newStream.send("onTextData", data);
};
Do something simple like call onTextData and pass some text in the data parameter. Then on the client-side viewer, handle the onTextData method. It will receive the text; display it in a text area or something.
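For example, a minimal client-side handler might look like this (assuming your player's NetStream is called ns and you have a TextArea named statusText - both names are placeholders):
// Client-side (AS3): receive the onTextData messages sent from main.asc
var streamClient:Object = new Object();
streamClient.onTextData = function(data:Object):void {
    // 'text' is whatever property you packed into the data parameter server-side
    statusText.text = String(data.text);
};
ns.client = streamClient;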
If you record while injecting this text into your stream, the text should display on playback of the archived file.  It will if you encode VP6/FLV, but not if you encode H.264/F4V.
Let me know what you discover.

Similar Messages

  • Publish Live stream to FMS on RTMPE protocol

    Hi,
    I am trying to publish Live Stream to FMS using FMLE or third party encoders.
    Is it possible to have RTMPE or RTMPTE protocol used between encoder and FMS?
    Very urgent.
    thanks,
    Pooja Jain

    FMLE can only publish via rtmp or rtmpt.

  • Playback of low bitrate flv or f4v from live stream in FMS causes player buffer to empty

    We are experiencing a consistent issue when playing a low bitrate (300 kbps or less) FLV in a live stream from FMS. Basically, the player will start off with the appropriate buffer, say 5 seconds, then begin dropping until it empties out and has to rebuffer. We've tried a variety of FLV and F4V files, all of which are 300 kbps or less, and we consistently get the issue. Is this something Adobe can investigate in FMS? Or are there any suggestions on how we can get around the issue?

    Hey, I got a similar problem; the log looks like this:
    2012-11-12 18:50:12 23434 (e)2661034 Connect failed ( , 5779212 ) : Connect failed: Connection refused (111)
    2012-11-12 18:50:54 23434 (e)2661034 Connect failed ( , 5779212 ) : Connect failed: Connection refused (111)
    2012-11-12 18:51:36 23434 (e)2661034 Connect failed ( , 1166880400 ) : Connect failed: Connection refused (111)
    2012-11-12 18:54:14 23434 (e)2661034 Connect failed ( , 1175301776 ) : Connect failed: Connection refused (111)
    2012-11-12 18:54:55 23434 (e)2661034 Connect failed ( , 1164775056 ) : Connect failed: Connection refused (111)
    2012-11-12 18:55:37 23434 (e)2661034 Connect failed ( , 16 ) : Connect failed: Connection refused (111)
    2012-11-12 19:13:08 23434 (e)2661034 Connect failed ( , 1158459024 ) : Connect failed: Connection refused (111)
    It seems that the port number is invalid, but we never use such ports.

  • Problem playing remote live stream from FMS

    Hello all,
    I'm having problems playing remote live streams from FMS (server-side) that I can play with any other player (client-side)
    Example of server-side application which plays a remote live stream:
    nc = new NetConnection();
    nc.connect("rtmp://remoteserver/live");
    stream = Stream.get("localStream");
    stream.play("remoteStream.flv", -1, -1, true, nc);
    This code works only sometimes.
    Most of the time, FMS is able to connect to the remote stream and the localStream dispatches these events:
       NetStream.Publish.Start
       NetStream.Play.Reset
    In this case, FMS is publishing the local stream but it is not playing the remoteStream on it.
    The rest of the time, localStream dispatches these events:
       NetStream.Publish.Start
       NetStream.Play.Reset
       NetStream.Play.Start
       NetStream.Data.Start
    In this case FMS plays the remoteStream correctly.
    Any hint to solve this issue?
    Regards.

    Thanks, I tried your code and it works when playing a remote live stream on another FMS.
    But the remote live stream is not on an FMS; it is on a Wowza Server which re-streams an RTSP stream over RTMP.
    With this code:
    var nc;
    var myStream;
    application.onAppStart = function(){
         nc = new NetConnection();
         myStream = Stream.get("localstream");
         myStream.onStatus = function(info){
               trace(info.code);
         };
         nc.onStatus = function(info){
               trace(info.code);
               if (info.code == "NetConnection.Connect.Success"){
                    myStream.play("remoteLive.sdp", -1, -1, true, nc);
               }
         };
         nc.connect("rtmp://remoteServer/live");
    };
    Every live stream player I tried was able to play the remote stream "remoteLive.sdp", but FMS plays it only sometimes. This is the output log of the FMS application:
    NetConnection.Connect.Success
    NetStream.Publish.Start
    NetStream.Play.Reset        <--------------------------------- Stuck there, stream won't play
    Unloaded application instance wowza4/_definst_ <--- Reload app
    NetConnection.Connect.Success
    NetStream.Publish.Start
    NetStream.Play.Reset        <--------------------------------- Stuck there, stream won't play
    Unloaded application instance wowza4/_definst_ <--- Reload app
    NetConnection.Connect.Success
    NetStream.Publish.Start
    NetStream.Play.Reset
    NetStream.Play.Start         <--------------------------------- Stream is playing just fine
    NetStream.Data.Start
    Unloaded application instance wowza4/_definst_ <--- Reload app
    NetConnection.Connect.Success
    NetStream.Publish.Start
    NetStream.Play.Reset       <--------------------------------- Stuck there, stream won't play
    Any idea why FMS won't play it every time?
    Regards

  • Encode archive of live stream on FMS 4

    From what I had initially read I thought this was possible, but I am not sure.
    What we would like to do is create an archive (F4V or MP4) on the FMS 4 server of any live stream that is pushed from our endpoints - primarily as a backup and, if necessary, an editable file within CS4 or CS5.
    We are at a university and capture lecture PowerPoints natively, and we sometimes need to edit the files. FLV is not enough, and we would like to take advantage of MP4 and the FMS server while we use it for overflow as a recorder.
    Can anyone clarify if this can be done?
    Thanks
    -Brian

    Recording can be triggered via a client-side publish call, or via a server-side script. From the publish call, you specify the "record" option. For example,
    ns.publish("mp4:foo", "record"); // ns is a NetStream, "mp4:" prefix tells FMS to record using MP4 container
    Otherwise, if you want to trigger the record via server-side as Jay says, you can simply publish the stream as,
    ns.publish("mp4:foo");
    then from server-side AS, do something like,
    // this handler is called when a stream is published
    // clientObj is the client that is publishing the stream
    // streamObj is the stream that is being published
    application.onPublish = function(clientObj, streamObj)
    {
        streamObj.record("record"); // start recording the stream
    };
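    A note in the same spirit - a minimal, untested sketch: to stop that server-side recording when the encoder stops publishing, pass false to Stream.record() from onUnpublish:
    // called when the client stops publishing the stream
    application.onUnpublish = function(clientObj, streamObj)
    {
        streamObj.record(false); // stop recording
    };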

  • About Live stream with FMS

    Hi,
    I am doing something with FMS. I downloaded FMS 4 and Adobe Flash Media Encoder 3.5 here.
    I installed them on my PC and they work without any configuration.
    Now I want to develop a client based on GStreamer to play the rtmp:// videos.
    My client plays VOD video on FMS, as rtmp://MYIP/vod/sample.flv, without any problem.
    When I switch to playing the live video published by Adobe Flash Media Encoder 3.5, rtmp://MYIP/live/livestream.flv, it starts, but it stops after about 2 seconds.
    I did some debugging; it seems the server stops sending data to my client. And after it stops, it cannot start again - I must restart my client.
    I have checked the server with rtmpdump + VLC; it works fine.
    The log in FMS is:
    Accepted a connection from IP:192.168.0.95, referrer: , pageurl:
    Sending error message: Method not found (FCSubscribe).
    Sending error message: Response object not found (_result:2147483647).
    "Sending error message: Method not found (FCSubscribe)." this display also with rtmpdump.
    "Sending error message: Response object not found (_result:2147483647)." this only with my client,what's the reason?
    In my client,I see the following after about 2 second:
    WARNING: Stream corrupt?!
    ERROR: Wrong data size (16717496), stream corrupted, aborting!
    What happens when my client connects to FMS? What's the difference between VOD and LIVE?
    Has anyone tried this with GStreamer?

    Thanks a lot for your reply.
    With your tool link, it can play the live video from my server.
    But my client is based on Linux and cannot support a web browser, so I am trying to implement it with GStreamer.
    I published the video with the default settings of Adobe Flash Media Encoder 2.5, just setting the video format to H264.
    Today I did some more debugging;
    please help to analyse it - maybe you can find something from the log:
    DEBUG: Parsing...
    DEBUG: Parsed protocol: 0
    DEBUG: Parsed host    : 192.168.0.143
    DEBUG: Parsed app     : live
    DEBUG: Protocol : RTMP
    DEBUG: Hostname : 192.168.0.143
    DEBUG: Port     : 1935
    DEBUG: Playpath : livestream
    DEBUG: tcUrl    : rtmp://192.168.0.143:1935/live
    DEBUG: app      : live
    DEBUG: live     : yes
    DEBUG: timeout  : 30 sec
    DEBUG: Setting buffer time to: 36000000ms
    DEBUG: RTMP_Connect1, ... connected, handshaking
    DEBUG: HandShake: Type Answer   : 03
    DEBUG: HandShake: Server Uptime : 558990173
    DEBUG: HandShake: FMS Version   : 4.0.0.1
    DEBUG: HandShake: Handshaking finished....
    DEBUG: RTMP_Connect1, handshaked
    DEBUG: Invoking connect
    Pipeline is PREROLLING ...
    DEBUG: HandleServerBW: server BW = 2500000
    DEBUG: HandleClientBW: client BW = 2500000 2
    DEBUG: HandleChangeChunkSize, received: chunk size change to 4096
    DEBUG: RTMP_ClientPacket, received: invoke 242 bytes
    DEBUG: (object begin)
    DEBUG: (object begin)
    DEBUG: Property: <Name:             fmsVer, STRING:     FMS/4,0,0,1121>
    DEBUG: Property: <Name:       capabilities, NUMBER:     255.00>
    DEBUG: Property: <Name:               mode, NUMBER:     1.00>
    DEBUG: (object end)
    DEBUG: (object begin)
    DEBUG: Property: <Name:              level, STRING:     status>
    DEBUG: Property: <Name:               code, STRING:     NetConnection.Connect.Success>
    DEBUG: Property: <Name:        description, STRING:     Connection succeeded.>
    DEBUG: Property: <Name:     objectEncoding, NUMBER:     0.00>
    DEBUG: Property: <Name:               data, OBJECT>
    DEBUG: (object begin)
    DEBUG: Property: <Name:            version, STRING:     4,0,0,1121>
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <_result>
    DEBUG: HandleInvoke, received result for method call <connect>
    DEBUG: sending ctrl. type: 0x0003,nTime: 0x12c
    DEBUG: Invoking createStream
    DEBUG: FCSubscribe: livestream
    DEBUG: Invoking FCSubscribe
    DEBUG: RTMP_ClientPacket, received: invoke 21 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <onBWDone>
    DEBUG: Invoking _checkbw
    DEBUG: RTMP_ClientPacket, received: invoke 29 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <_result>
    DEBUG: HandleInvoke, received result for method call <createStream>
    DEBUG: SendPlay, seekTime=0, stopTime=0, sending play: livestream
    DEBUG: Invoking play
    DEBUG: sending ctrl. type: 0x0003,nTime: 0x2255100
    DEBUG: RTMP_ClientPacket, received: invoke 119 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object begin)
    DEBUG: Property: <Name:              level, STRING:     error>
    DEBUG: Property: <Name:               code, STRING:     NetConnection.Call.Failed>
    DEBUG: Property: <Name:        description, STRING:     Method not found (FCSubscribe).>
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <_error>
    ERROR: rtmp server sent error
    DEBUG: RTMP_ClientPacket, received: invoke 16419 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <_onbwcheck>
    DEBUG: Invoking _result
    DEBUG: HandleChangeChunkSize, received: chunk size change to 4096
    DEBUG: HandleCtrl, received ctrl. type: 0, len: 6
    DEBUG: HandleCtrl, Stream Begin 1
    DEBUG: RTMP_ClientPacket, received: invoke 162 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object begin)
    DEBUG: Property: <Name:              level, STRING:     status>
    DEBUG: Property: <Name:               code, STRING:     NetStream.Play.Reset>
    DEBUG: Property: <Name:        description, STRING:     Playing and resetting livestream.>
    DEBUG: Property: <Name:            details, STRING:     livestream>
    DEBUG: Property: <Name:           clientid, STRING:     oAA7AAAA>
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <onStatus>
    DEBUG: HandleInvoke, onStatus: NetStream.Play.Reset
    DEBUG: RTMP_ClientPacket, received: invoke 156 bytes
    DEBUG: (object begin)
    DEBUG: Property: NULL
    DEBUG: (object begin)
    DEBUG: Property: <Name:              level, STRING:     status>
    DEBUG: Property: <Name:               code, STRING:     NetStream.Play.Start>
    DEBUG: Property: <Name:        description, STRING:     Started playing livestream.>
    DEBUG: Property: <Name:            details, STRING:     livestream>
    DEBUG: Property: <Name:           clientid, STRING:     oAA7AAAA>
    DEBUG: (object end)
    DEBUG: (object end)
    DEBUG: HandleInvoke, server invoking <onStatus>
    DEBUG: HandleInvoke, onStatus: NetStream.Play.Start
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: RTMP_ClientPacket, received: notify 24 bytes
    DEBUG: (object begin)
    DEBUG: (object end)
    DEBUG: RTMP_ClientPacket, received: notify 487 bytes
    DEBUG: (object begin)
    DEBUG: (object begin)
    DEBUG: Property: <Name:             author, STRING:     >
    DEBUG: Property: <Name:          copyright, STRING:     >
    DEBUG: Property: <Name:        description, STRING:     >
    DEBUG: Property: <Name:           keywords, STRING:     >
    DEBUG: Property: <Name:             rating, STRING:     >
    DEBUG: Property: <Name:              title, STRING:     >
    DEBUG: Property: <Name:         presetname, STRING:     Custom>
    DEBUG: Property: <Name:       creationdate, STRING:     Thu Jun 09 08:55:02 2011>
    DEBUG: Property: <Name:        videodevice, STRING:     Syntek STK1150>
    DEBUG: Property: <Name:          framerate, NUMBER:     25.00>
    DEBUG: Property: <Name:              width, NUMBER:     320.00>
    DEBUG: Property: <Name:             height, NUMBER:     240.00>
    DEBUG: Property: <Name:       videocodecid, NUMBER:     7.00>
    DEBUG: Property: <Name:      videodatarate, NUMBER:     200.00>
    DEBUG: Property: <Name:           avclevel, NUMBER:     31.00>
    DEBUG: Property: <Name:         avcprofile, NUMBER:     66.00>
    DEBUG: Property: <Name:        audiodevice, STRING:     Realtek HD Audio Input>
    DEBUG: Property: <Name:    audiosamplerate, NUMBER:     22050.00>
    DEBUG: Property: <Name:      audiochannels, NUMBER:     1.00>
    DEBUG: Property: <Name:   audioinputvolume, NUMBER:     100.00>
    DEBUG: Property: <Name:       audiocodecid, NUMBER:     2.00>
    DEBUG: Property: <Name:      audiodatarate, NUMBER:     32.00>
    DEBUG: (object end)
    DEBUG: (object end)
    INFO: Metadata:
    INFO:   author
    INFO:   copyright
    INFO:   description
    INFO:   keywords
    INFO:   rating
    INFO:   title
    INFO:   presetname            Custom
    INFO:   creationdate          Thu Jun 09 08:55:02 2011
    INFO:   videodevice           Syntek STK1150
    INFO:   framerate             25.00
    INFO:   width                 320.00
    INFO:   height                240.00
    INFO:   videocodecid          7.00
    INFO:   videodatarate         200.00
    INFO:   avclevel              31.00
    INFO:   avcprofile            66.00
    INFO:   audiodevice           Realtek HD Audio Input
    INFO:   audiosamplerate       22050.00
    INFO:   audiochannels         1.00
    INFO:   audioinputvolume      100.00
    INFO:   audiocodecid          2.00
    INFO:   audiodatarate         32.00
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: ignoring too small video packet: size: 2
    Pipeline is PREROLLED ...
    Setting pipeline to PLAYING ...
    New clock: GstSystemClock
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: ignoring too small video packet: size: 2
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: ignoring too small audio packet: size: 0
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    DEBUG: HandleCtrl, received ctrl. type: 31, len: 6
    DEBUG: HandleCtrl, Stream BufferEmpty 1
    DEBUG: HandleCtrl, received ctrl. type: 32, len: 6
    DEBUG: HandleCtrl, Stream BufferReady 1
    WARNING: Stream corrupt?!
    ERROR: Wrong data size (8059624), stream corrupted, aborting!
    DEBUG: RTMP_ReadPacket, m_nChannel: 167
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0xe3
    DEBUG: RTMP_ReadPacket, m_nChannel: 6ad6
    DEBUG: RTMP_ReadPacket, m_nChannel: 3d7c
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x00
    DEBUG: HandleCtrl, received ctrl. type: 8760, len: 6
    DEBUG: HandleCtrl, Stream xx -1742407864
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0x2a
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0xff
    DEBUG: RTMP_ClientPacket, unknown packet type received: 0xff
    DEBUG: RTMP_ReadPacket, m_nChannel: 5260
    DEBUG: RTMP_ReadPacket, failed to allocate packet
    Got EOS from element "pipeline0".
    Execution ended after 69266417190 ns.
    Setting pipeline to PAUSED ...
    Setting pipeline to READY ...
    DEBUG: Invoking deleteStream
    Setting pipeline to NULL ...
    FREEING pipeline ...
    After a short transfer, the client cannot get any more data. It seems that the connection was reset by the peer.

  • Recording Live Streams on FMS remotely

    Hi,
    We're using FMSS 3 to stream live feeds from cameras.
    However, we'd also like to have the ability to (via an
    administration site) record these live streams and allow the user
    to view pre-recorded streams instead of the live ones.
    Does anyone know of an example of how I can tell FMS to start
    recording a stream (via a web service or AS), and then how to stop
    it? Just for clarification - we don't want FMS to record a stream
    from the user's own camera. We want to record the live streams that
    are already streaming on FMS.
    Thanks,
    Filip Stanek
    bloodforge.com

    You can create an app folder in the fms folder, which is inside your setup folder.
    For details you can mail [email protected]
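    For what it's worth, here is a rough server-side sketch of the kind of thing the question asks for: two methods an administration client could call over NetConnection.call() to start and stop recording a live stream that is already being published (stream and method names are placeholders, and this is untested):
    // main.asc sketch: record an already-published live stream on demand
    Client.prototype.startRecording = function(liveStreamName) {
        // open a server-side stream that the recording will be written to
        this.recStream = Stream.get("mp4:" + liveStreamName + ".f4v");
        if (this.recStream) {
            this.recStream.record();                      // start writing the file
            this.recStream.play(liveStreamName, -1, -1);  // pipe the live stream into it
        }
    };
    Client.prototype.stopRecording = function(liveStreamName) {
        if (this.recStream) {
            this.recStream.play(false);    // stop pulling from the live stream
            this.recStream.record(false);  // close the recording
        }
    };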

  • LIVE Streaming Video-Equipment and Cables Needed?????

    I am planning on starting a 24/7 live online tv station through ustream or livestream. I will be using my apple Macbook Pro to do this. However, I would like to be able to play DVD's, and other recordings from a DVD player through a video switch board/video mixer into my computer and into ustream. Can anyone point me in the right direction as to what equipment I am going to need to do this. I already have everything to produce my shows. I just need to know what kind of video switch board/video mixer I will need and what connection cables I will need to run the video mixer to my Macbook Pro. If there is a better way to do this, I am open to suggestions. I am on a tight budget, but am willing to spend some money. Please keep in mind that I would like this station to be a 24/7 programming station on ustream or livestream and that I need to be able to set video up to run on an automated schedule. Thank You very much for any and all suggestions.

    What you need is to grab video/sound from the camera/mic, publish it on FMS, and let the other clients play it. In a few steps:
    // get camera and mic
    var cam:Camera = Camera.getCamera(); // get default camera
    var mic:Microphone = Microphone.getMicrophone(); // get default mic
    // if you have a VideoDisplay, for monitoring
    videoDisplay.attachCamera(cam);
    // create a new RTMP connection to FMS/your app
    var nc:NetConnection = new NetConnection();
    // the new stream should be created *after* nc has connected, not before,
    // so this runs once the nc.connect() below succeeds
    nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.attachCamera(cam);
            ns.attachAudio(mic);
            ns.publish("streamname", "live"); // or "record" if you want to live+rec
        }
    });
    // connect to the default instance of app 'appname' on your FMS
    nc.connect("rtmp://fms.ip.address/appname");
    If you're using Flex 2 and FMS 2 (the latest FMS is 3), you might need to change the connection encoding to AMF0 (which is what FMS 2 uses) in order for this to work.
    Hope this helps; I've written it from my head so it probably has mistakes, but it's the general workflow; the docs in the language reference for the specific functions mentioned here will also help.

  • Need license for using H.264 Video codec for live stream with FMS 3.5 series

    Hi,
    I am creating a live audio-video chat application in which I use the H.264 video codec for live streaming. I am using Flash Media Server version 3.5.7. I read somewhere that H.264 video streaming using FMS requires a separate license. So I want to confirm: is it true that H.264 video streaming requires a separate license? If yes, what is the procedure to get the license?
    Please let me know, it's urgent.
    Regards,

    Hi Again,
    Thanks for your reply.
    Here are some articles/posts about the royalties for using H.264:
    http://www.mpegla.com/Lists/MPEG%20LA%20News%20List/Attachments/226/n-10-02-02.pdf
    http://www.mpegla.com/main/programs/AVC/Documents/AVC_TermsSummary.pdf
    http://www.streamingmedia.com/Articles/ReadArticle.aspx?ArticleID=65403&PageNum=3
    http://www.mpegla.com/main/programs/avc/Documents/AVC_TermsSummary.pdf
    According to these, any product/service provider needs to pay a one-time fee based on the number of subscribers if it is using H.264 for encoding and is charging a fee from its users.
    About our business: we are a web-conferencing solution provider and are using Flash Media Server version 3.5.7 for audio/video. Please help me figure out whether H.264 is really free for us or whether we need to buy a license to use it.
    Looking forward to hearing from you soon.
    Regards,

  • Live stream on FMS issues when reaching 1600 concurrent users

    Hey all,
    We have an FMS 4.5.2 installation with 1x Origin and 2x Edge, set up only for live streaming. We are using the latest version of FMLE to stream to the Origin server, and the Edge servers connect to the Origin server. We are running four applications - stream1, stream2, stream3 and stream4 - all for live streaming (copies of the live application). All servers have the default configuration except for the edge/origin setup.
    All servers have 4 Gbit connections in bonding, network traffic is nicely distributed across all bonds, and the uplink to the ISP is 10 Gbit.
    Each server is a 2 CPU / quad-core HP DL380 with 64 GB of memory running Ubuntu 10.04 LTS.
    Now to the problem.
    We are streaming a live stream from FMLE at 700 kbit/s to, for example, the stream1 application, and when we hit around 1600 concurrent users with 50% on edge1 and 50% on edge2 (800 concurrent users on each edge), the player buffer drops and all viewers experience buffering issues at roughly 30 second to 2 minute intervals. (The network is not congested, because if we do an SCP from the server to an outside network at that time, it copies a 1 GByte file within seconds.)
    The funny thing is that if I start a second FMLE and stream a live stream to, for example, the stream2 application at the same time, and open a second player on the client, that video runs great without any buffer issues from the same server at the same time.
    edge1 server:
    application: stream1 with 800 concurrent users, player has buffer issues
    application: stream2 with 4 concurrent users, player has no issues
    edge2 server
    same as above
    So my current conclusion is that it has to be something per application, since the other application does not have these issues when running simultaneously. We tried changing the fmscore number settings and some buffer settings, but nothing helped.
    At the times when we get buffering issues, these are the only problematic entries in the logs; they appear in edge.00.log on both Edge servers, but not at the same times:
    Edge1:
    2012-04-23    12:50:21    29270    (e)2661034    Connect failed ( , 8134 ) : Connect failed: Connection refused (111)    -
    2012-04-23    12:55:30    29270    (e)2661034    Connect failed ( , 8134 ) : Connect failed: Connection refused (111)    -
    2012-04-23    12:56:42    29270    (e)2661034    Connect failed ( , 8134 ) : Connect failed: Connection refused (111)    -
    2012-04-23    12:56:42    29270    (e)2661034    Connect failed ( , 8134 ) : Connect failed: Connection refused (111)    -
    2012-04-23    13:14:40    29270    (e)2661034    Connect failed ( , 8134 ) : Connect failed: Connection refused (111)    -
    2012-04-23    13:20:30    29270    (e)2661034    Connect failed ( , 8134 ) : Connect failed: Connection refused (111)    -
    Edge2:
    2012-04-23    12:56:32    9625    (e)2661034    Connect failed ( , 8134 ) : Connect failed: Connection refused (111)    -
    2012-04-23    13:02:23    9625    (e)2661034    Connect failed ( , 8134 ) : Connect failed: Connection refused (111)    -
    2012-04-23    13:08:03    9625    (e)2661034    Connect failed ( , 8134 ) : Connect failed: Connection refused (111)    -
    There is no packet loss between the edge and origin servers, latency is at 0.2 ms, and there is nothing in the logs of the origin server.
    We even tried to deploy Wowza Media Servers with an edge/origin setup and were able to handle around 4000 concurrent users (2000 on one edge and 2000 on the second) without any issues.
    Does anyone have any ideas, or at least suggestions for what our next options are and what settings to change on the FMS? Or how to debug and what to check when the buffering issue appears? Is there any more debugging we can enable on the FMS to see whether we are hitting some kind of limit somewhere?
    thanks

    Hey, I got a similar problem; the log looks like this:
    2012-11-12 18:50:12 23434 (e)2661034 Connect failed ( , 5779212 ) : Connect failed: Connection refused (111)
    2012-11-12 18:50:54 23434 (e)2661034 Connect failed ( , 5779212 ) : Connect failed: Connection refused (111)
    2012-11-12 18:51:36 23434 (e)2661034 Connect failed ( , 1166880400 ) : Connect failed: Connection refused (111)
    2012-11-12 18:54:14 23434 (e)2661034 Connect failed ( , 1175301776 ) : Connect failed: Connection refused (111)
    2012-11-12 18:54:55 23434 (e)2661034 Connect failed ( , 1164775056 ) : Connect failed: Connection refused (111)
    2012-11-12 18:55:37 23434 (e)2661034 Connect failed ( , 16 ) : Connect failed: Connection refused (111)
    2012-11-12 19:13:08 23434 (e)2661034 Connect failed ( , 1158459024 ) : Connect failed: Connection refused (111)
    It seems that the port number is invalid, but we never use such ports.

  • Recording RTMFP P2P stream by sending simultaneous stream to FMS (and other things...)

    Only just getting started on this whole domain of learning, so go easy!
    If I set up a P2P video/audio chat (similar to the sample VideoPhone thing on the Cirrus site), can I get the stream from both parties to send to a server at the same time so that I can record it? If so, would I have to use a FMS to stream it to and perform the recording (and if so which version could I get away with)? Are there any (preferably free, or just tutorialised) solutions for the recording side of things?
    Currently it seems like the only option for doing the P2P thing is to use Stratus/Cirrus unless I use FMS4 Enterprise. Is that correct? 
    Is there any example code out there that can help? Has anyone got any experience of how effective this kind of situation can be, in terms of quality of the stream and recording? Does any of this make sense?
    The Enterprise version of FMS is "Call for price": can anyone give me ball-park figures?
    Would appreciate any advice or pointers on these very general questions.

    can I get the stream from both parties to send to a server at the same time so that I can record it?
         I really do not know how one could do that - it would be like creating a separate stream to the server and replicating the same data you send to the peer (see the sketch after these answers), and I do not know how useful that would be; again, it would increase the load on the server, which is exactly what we are trying to avoid by using P2P.
    If so, would I have to use a FMS to stream it to and perform the recording (and if so which version could I get away with)?
         I think the above answer takes care of this. I am not an expert on P2P, so someone can correct me if I am completely off.
    Are there any (preferably free, or just tutorialised) solutions for the recording side of things?
         Not something I am aware of.
    Currently it seems like the only option for doing the P2P thing is to use Stratus/Cirrus unless I use FMS4 Enterprise. Is that correct?
         Yes - as of now only the FMS4 Enterprise edition supports P2P.
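    As mentioned above, here is a rough AS3 sketch of what replicating the data to a server would look like: the same camera and microphone attached to both the P2P stream and a second stream published to FMS for recording (connection URLs and stream names are placeholders, and both connections must have reported NetConnection.Connect.Success first):
    // 'cam' and 'mic' obtained earlier via Camera.getCamera() / Microphone.getMicrophone()
    // P2P connection (Cirrus/Stratus) for the chat itself
    var p2pNC:NetConnection = new NetConnection();
    p2pNC.connect("rtmfp://p2p.rtmfp.net/yourDeveloperKey/");
    // separate RTMP connection to FMS just for the recording
    var fmsNC:NetConnection = new NetConnection();
    fmsNC.connect("rtmp://your.fms.server/recorderApp");
    // ...once both connections have succeeded:
    var p2pStream:NetStream = new NetStream(p2pNC, NetStream.DIRECT_CONNECTIONS);
    p2pStream.attachCamera(cam);
    p2pStream.attachAudio(mic);
    p2pStream.publish("chat");                        // the far peer plays this directly
    var recStream:NetStream = new NetStream(fmsNC);   // same camera/mic, second stream
    recStream.attachCamera(cam);
    recStream.attachAudio(mic);
    recStream.publish("mp4:chat_copy.f4v", "record"); // FMS records this copy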

  • Live Stream from FMS

    I have installed Flash Media Server and Flash Media Live Encoder. Everything runs perfectly when I test the live stream through localhost, but if I try to embed it on another machine, it fails. Does anybody have an idea how to fix it?

    Have you made sure your firewall allows incoming traffic on port 1935? I think it would be a firewall issue.

  • Live streaming ( play && publish ) over rtmp/rtmfp with codecs ( h264 && PCMU ) on Android/iOS

    I have a question: on which operating systems (Android, iOS) can a stream be played and published with the H264 and PCMU codecs over the rtmp and rtmfp protocols (live streaming)?
         As far as I have found out, with these codecs a stream can be played on Android (rtmp protocol). On iOS the video is not displayed; as I understand it, the AIR environment is cutting it.
    Another question regarding video texture: will support for live playing of H264+PCMU on iOS be included in future releases?

    On iOS, you'll need to be playing an HLS stream for H.264 to decode when streaming from a remote server.

  • Live streaming to FMS server from analog cable?

    I need to stream an analog cable signal (it may consist of 60-70 channels in VL and VHF) to an FMS server. I'm doing a small-scale project, so I'd like some hardware that supports this; please suggest some reasonable options.
    Is it possible to do the same with a TV tuner card?

    For HLS streaming, AAC/HE-AAC are the only supported audio codecs. Though FMS does not disallow packaging MP3, playback mostly depends on whether iOS devices can play it back. We don't recommend MP3 as a codec for iOS playback.
    Thanks

  • Settings of  live stream for FMS 4.5 on Amazon

    Hi. We currently have an issue with setting up an outgoing stream for a live chat. We are using FMS 4.5 on Amazon and the OSMF player. When streaming from FMLE everything works fine. But when streaming from AS3 we get a buffering status in the OSMF player and warnings on the Amazon server:
    Warning from libf4f.so: [Utils] [1cbec29179f3616a6b284fc9ecc029d8] [1cbec29179f3616a6b284fc9ecc029d8] FMS F4F recording received timestamps with absolute time flowing backward lastTime held was 2109435 and the most recent arrived was 2107876.  This message is being ignored - recording will continue.
    Stream settings are:
    _streamCam = Camera.getCamera();
    _streamCam.setMode (640,480,30,true);
    _streamCam.setQuality (1200 * 1024, 90);
    _streamCam.setKeyFrameInterval(4);
    var pH264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
    pH264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);
    pH264Settings.setQuality(_streamCam.bandwidth,_streamCam.quality);
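    (For completeness, these settings only take effect once they are applied to the publishing NetStream before publish; a one-line sketch, assuming the stream is called ns:)
    ns.videoStreamSettings = pH264Settings; // apply the H.264 settings before ns.publish()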
    Also, Adobe recommends enabling streamsynchronization as is done for FMLE:
    http://help.adobe.com/en_US/flashmediaserver/devguide/WSd391de4d9c7bd609-52e437a812a3725dfa0-8000.html
    <flashmedialiveencoder_config>
        <mbrconfig>
            <streamsynchronization>
                <!-- "true" to enable this feature, "false" to disable.                    -->
                <enable>true</enable>
    Is it possible to do the same for an AS3 NetStream? If so, please tell me the name of the property.
    Please kindly advise.

