Live Stream Pre-Recorded FLV File

Hi, I'm using FME 2.5 to stream a webcam feed (to Akamai).
Has anyone ever used this set-up (or something similar) to deliver
a pre-recorded video file? Basically, I'd like to use a file as the
video input device.

Normally, only a device that appears to the system as a video camera
driver is supported for live streaming. That includes USB webcams
and FireWire cameras.
However, when using programs like MSN or Yahoo chat, many
users have successfully used "virtual camera driver" software,
which supplies images from a file source but appears to the system
to be a physical camera device.
So, for example, without making a specific recommendation,
"Broadcaster StudioPRO" appears to be one such virtual camera
driver:
http://www.snapfiles.com/Freeware/webpublish/fwwebcam.html
Probably the best approach is to search on (insert favourite
search engine) for the term above, in conjunction with "MSN
Messenger" or "Yahoo Messenger" to narrow the results.

Similar Messages

  • Folder location of pre-recorded flv file

    I have a pre-recorded flv file that I've placed in a
    sub-folder within the applications folder of Media Server. When I
    run my Flash movie within Flash or on my localhost using IIS, the
    Media Server Management Console shows my flv streaming. This only
    seems to work, though, if I also have the flv file in the same
    folder or sub-folder relative to my swf and html files and the
    contentPath parameter of the on-stage video component points to
    this flv folder/sub-folder. Question: does the flv need to be in
    both these locations? The documentation is not clear about
    this.

    quote:
    This only seems to work, though, if I also have the flv file
    in the same folder or sub-folder relative to my swf and html files
    and the contentPath parameter of the on-stage video component
    points to this flv folder/sub-folder. Question: does the flv need
    to be in both these locations? The documentation is not clear about
    this.
    This makes me suspect that you're not actually connecting to
    FMS, but rather just playing the progressive HTTP video. If you
    leave the flv in the HTTP directory (and remove the file from the
    FMS directory), does it still play?
    If you want to connect to an FMS stream, your content path
    will be an rtmp location, not an http location. It will look
    something like this:
    rtmp://myfmsserver.com/myApplication/myVideo
    I don't use the prefab media components, so I don't know if
    it wants the .flv extension at the end of the url.
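    As a rough illustration (not part of the original reply), here is how the two cases differ in AS2, assuming the on-stage component is an FLVPlayback instance named myVideo; the host, application and file names are placeholders:
    // Progressive download: the player fetches the .flv over HTTP from the web server
    myVideo.contentPath = "http://www.example.com/videos/myVideo.flv";
    // FMS streaming: the player connects over RTMP to an application on the server;
    // depending on the component, the .flv extension may need to be omitted
    myVideo.contentPath = "rtmp://myfmsserver.com/myApplication/myVideo";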

  • Recorded FLV files contain corrupt timestamps

    Howdy -
    I'm running FMS 4.5.3 r2005 on a standard AWS Linux box.  I'm recording the streams using NetStream.publish("appendWithGap") from a small custom Flash app running on IE+Windows.
    Usually, the generated .flv file is fine. However, I've found a number of cases where the generated .flv file was corrupt.  Using ffprobe -show_packets, I'm able to see that the presentation timestamps (PTS) of sequential packets occasionally remain the same.  This happens for both audio and video at times. For example, consider the output of the command:
    $ ffprobe -show_packets saved.flv | egrep -e '(pts=|codec)'
    codec_type=video
    pts=0
    codec_type=video
    pts=0
    codec_type=video
    pts=0
    codec_type=video
    pts=0
    codec_type=video
    pts=0
    codec_type=video
    pts=0
    codec_type=video
    pts=3912
    codec_type=video
    pts=3944
    codec_type=video
    pts=3944
    codec_type=video
    pts=3944
    codec_type=video
    pts=3944
    codec_type=video
    pts=3944
    codec_type=video
    pts=3944
    codec_type=video
    pts=3944
    codec_type=video
    pts=3944
    I really don't know much about the internals of the FLV format, but I assume that invalid timestamps like these make for a non-playable (corrupt) video.
    Any idea what may cause FMS to record such a video?
    What kind of information would be helpful to debug this?
    Obviously, my client expects every video recorded to be one that can be played back, and so these corrupt videos are alarming, to say the least.
    FYI, due to privacy restrictions I can't share the recorded video without a signed NDA.  If it would help, I can provide the entire output of ffprobe.
    Thanks,
    Ben

    Flvcheck has validated the format of the file, and it is correct. The "Adding Silence message" log entries indicate that an audio message was expected but not received after a particular time check, which is why the video may appear stalled.
    Flvcheck may claim the file is valid, but even after running flvcheck -s, the video is corrupt.  According to ffprobe, many of the timestamps are still the same. When I attempt to play the file in an FLV player, the entire video is run through up front in a sort of fast-forward view, and then the audio plays out more or less normally.
    This makes sense, as the video player is probably playing all these frames with the same timestamp sequentially, which makes for the fast-forward effect.
    To debug this issue, a good starting point would be validating the input to FMS. FMS records what is published to it.
    Does FMS tinker with the video frame timestamps, or are they just passed through to the file system?
    Since you are able to reproduce this issue, can you please enable livestream logging and check if input stream to FMS is different from what it is recording.
    Unfortunately I can't reliably reproduce it - though it does happen on a somewhat regular basis (it was happening a couple of times a week).
    Is it safe to turn live stream logging on for a few days to see if I can catch this issue happening? Or, will live logging chew up resources and cause issues?
    Thanks for the assistance!
    -Ben

  • Live Streaming from static video file

    Hi everybody,
    I'm using Amazon Cloud & Flash Media Server 4.5.
    I would like to take one of my existing videos (a static file) and turn it into a live stream, so people can view the video synchronously. I'm using Flash Media Live Encoder 3.2, but it only captures video from devices such as cameras...
    How can I do it?
    Cheers.

    Hi Huy,
    Please find the zip attached, where I have written the scripts for you to deploy on your local FMS server, in the context of option 2:
    You may like to follow these steps:
    1. unzip the folder.
    2. deploy FLVpublishonLoad (if your recorded file is an FLV) on your local server
    3. deploy toPublish on your remote aws server
    4. make sure to correct the path for aws in FLVPublishonLoad application @ line#19:
    // Please mention your aws instance hostname instead of localhost , and application name ....
    nc.connect("rtmp://localhost/toPublish");
    5. go to the admin console of your local development server and load the instance of FLVPublishonLoad as shown in the howtoLoadfromAdminConsole.png image attached for reference.
    6. it will automatically connect to the "toPublish" app on your remote server and start publishing your local vod file as a live stream to your aws instance
    7. start your subscriber app and subscribe to the stream you have used in the FLVPublishonLoad app for the Stream.get() method; see line#29 and the subscriberScreenExample.png file attached for illustration.
    A few points to note here:
    1. I have removed the sample.flv and "sample1_1500kbps.f4v" for keeping the zip size lower.
    2. You need to put your stream name at line#55 where you call mystream.play()
    3. The stream name that you specify in Stream.get() will be used by subscribers.
    4. if your use case is mp4, then please use "MP4PublishonLoad" instead of "FLVPublishonLoad" application.
    =============
    As I am unable to attach the files here, I am copy-pasting the code for you and others to reuse:
    =======
    main.asc code for FLVPublishonLoad:
    var nc;
    var ns;
    application.onAppStart = function()
    {
        trace("hello client: ");
        publishIt();
    };
    function publishIt()
    {
        trace("publishing");
        nc = new NetConnection();
        // Please mention your aws instance hostname instead of localhost, and your application name ....
        nc.connect("rtmp://localhost/toPublish");
        nc.onStatus = function(info)
        {
            trace(info.code);
        };
        ns = new NetStream(nc);
        // Exact stream name available for subscribers .....
        mystream = Stream.get("myvodfile");
        mystream.onStatus = function(sinfo)
        {
            trace("mystream.onStatus: " + sinfo.code);
            if (sinfo.code == "NetStream.Publish.Start")
            {
                attach_retVal = ns.attach(mystream);
                if (attach_retVal == true)
                {
                    trace("stream attach was successful ...");
                    startPublish();
                }
                else
                {
                    trace("The attempt to attach stream source to NetStream failed");
                }
            }
        };
        // Put the stream name (in double quotes) that you want to publish and that exists in the streams/_definst_ folder ......
        mystream.play("sample", 0, -1, true);
    }
    function startPublish()
    {
        ns.publish(mystream.name, "live");
    }
    main.asc code for MP4PublishonLoad:
    var nc;
    var ns;
    application.onAppStart = function()
    {
        trace("hello client ");
        publishIt();
    };
    function publishIt()
    {
        trace("publishing");
        nc = new NetConnection();
        // Please mention your aws instance hostname instead of localhost, and your application name ....
        nc.connect("rtmp://localhost/toPublish");
        nc.onStatus = function(info)
        {
            trace(info.code);
        };
        ns = new NetStream(nc);
        // Exact stream name available for subscribers .....
        mystream = Stream.get("mp4:myvodfile.f4v");
        mystream.onStatus = function(sinfo)
        {
            trace("mystream.onStatus: " + sinfo.code);
            if (sinfo.code == "NetStream.Publish.Start")
            {
                attach_retVal = ns.attach(mystream);
                if (attach_retVal == true)
                {
                    trace("stream attach was successful ...");
                    startPublish();
                }
                else
                {
                    trace("The attempt to attach stream source to NetStream failed");
                }
            }
        };
        // Put the stream name that you want to publish and that exists in the streams/_definst_ folder ......
        mystream.play("mp4:sample1_1500kbps.f4v", 0, -1, true);
    }
    function startPublish()
    {
        trace("#### " + mystream.name);
        ns.publish("mp4:" + mystream.name, "live");
    }
    main.asc code for "toPublish" app
    =====================
    application.onPublish = function(clientObj, streamObj)
    {
        trace("published: " + streamObj.name);
    };
    =====================
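    For reference, here is a minimal client-side subscriber sketch (AS3, written as a frame script) under the same assumptions: the stream name passed to Stream.get() was "myvodfile", the remote application is "toPublish", and your-aws-host is a placeholder for your instance hostname:
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.events.NetStatusEvent;
    import flash.media.Video;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onNCStatus);
    nc.connect("rtmp://your-aws-host/toPublish");

    function onNCStatus(e:NetStatusEvent):void
    {
        if (e.info.code == "NetConnection.Connect.Success")
        {
            var ns:NetStream = new NetStream(nc);
            // client object avoids async errors when onMetaData arrives
            ns.client = { onMetaData: function(md:Object):void { trace(md.duration); } };
            var vid:Video = new Video(640, 480);
            vid.attachNetStream(ns);
            addChild(vid);
            ns.play("myvodfile", -1);   // -1 = play the live stream only
        }
    }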
    Please get back to me in case of further queries.
    Regards,
    Shiraz Anwar

  • Pre-load .flv file

    Hello again,
    Separate question:
    I have created an SWF file that streams an FLV file.
    What I would like to do is have the SWF load the FLV all at once at the beginning (the way a flash app might pre-load the interface).  Is this possible?
    This way, when the SWF starts playing, the FLV will be loaded already, and will not have to buffer during playtime.
    The FLV is only 300KB, and is an element in the interface itself, so I can't have it stuttering.
    Thanks in advance!

    Attach this to the first frame where your component exists and its contentPath is defined; disable autoStart and, if flv is your component instance name, use:
    flv.play();
    flv.pause();
    this.onEnterFrame = function(){
        if(flv.bytesLoaded >= flv.bytesTotal){
            delete this.onEnterFrame;
            flv.play();
        }
    };

  • Can't seem to stream (RTMP) an FLV file in Flash Media Playback from the Amazon Cloud

    I seem to be having trouble streaming any FLV file using RTMP from the Amazon Cloud.  Other file formats (i.e., MP4, F4V) seem to work just fine.
    I tried using your Flash Media Playback demo page and still had the same problem:
    http://www.osmf.org/configurator/fmp/
    Without publicly displaying the actual connection path here, this is what I am using for the Video Source (URL):
    rtmp://xxxxxxxxxxxxxx.cloudfront.net/cfx/st/_definst_/MyVideo.flv
    If I change the "MyVideo.flv" to "mp4:MyVideo.f4v", for example, then it works.  But when I try to reference any FLV file that I have located on the Amazon Cloud then it just shows continuous buffering.
    The FLV files that I am trying to use with both the Strobe and Flash Media Playback(s) all stream just fine when I connect to them via other Flash based applications/players that I have.
    Do you have any suggestions?
    Thank you very much.

    Hello,
    The link to the previous post does not seem to apply or help in this case.  Please let me know if I am wrong.
    I am merely trying to stream any FLV file using your Flash Media Playback demo page.  What am I missing here?
    Thanks again.

  • Live vs. archive of .flv files

    I am playing some short videos (.flv files) in a presentation. Within the live presentation, the audio plays, but the screen is black. However, on playback of the archive, everything looks great, both video and audio. Any idea why this might be happening?
    I should add that these files are being played from the content library.


  • Trying to view the recorded or live stream while recording the live stream doesn't work

    The workflow:
    I push a live feed to the rtmp url rtmp://localhost/my-app/tester where my-app is a copy of the live app in AMS but with the following addition of code in main.asc:
    application.onPublish = function( p_client, p_stream) {
       s = Stream.get("my_hello5");
       trace( s.name + " = name, " + s.publishQueryString + " = pqs");
       if(s) {
         s.record();
         trace("recording...");
         s.play(p_stream.name);
       }
    };
    application.onUnpublish = function( p_client, p_stream) {
       s = Stream.get("my_hello5");
       trace("getting stream my_hello5...");
       if(s) {
         trace("stopping recording...");
         s.record(false);
       }
    };
    And I am facing multiple issues:
    * The file my_hello5.flv gets generated in applications/my-app/streams/_definst_/. Once the recording is complete, if I open the URL below in the media player I have on Linux (Totem movie player on Ubuntu), it plays. But if I open the same URL in the OSMF Flash Media Playback setup page (http://www.osmf.org/configurator/fmp/# ), it always shows "buffering".
    The flv url I used is: rtmp://localhost/my-app/my_hello5
    * If I try to open either the flv URL or the live feed URL inside the OSMF page while the recording is still going on, then I am not able to view either. It just shows "buffering" all the time.
    Your help is valuable and much appreciated.

    As a good practice, you should not play a file while it is still being recorded. On the other hand, you can programmatically create chunks of the recording at periodic intervals and use those chunks for playback, or copy them to a different folder served by a different application and make them available to subscribers for RTMP streaming; a rough sketch of this idea follows.
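    Here is a minimal main.asc sketch of that chunking idea (untested; the chunk naming and the five-minute interval are arbitrary assumptions):
    var chunk;
    var chunkIndex = 0;
    var chunkTimer;

    function startNewChunk(liveName)
    {
        if (chunk) chunk.record(false);              // close the previous chunk file
        chunkIndex++;
        chunk = Stream.get("chunk_" + chunkIndex);   // e.g. chunk_1.flv, chunk_2.flv, ...
        if (chunk)
        {
            chunk.record();                          // start a fresh recording
            chunk.play(liveName, -1, -1);            // copy the incoming live feed into it
        }
    }

    application.onPublish = function(p_client, p_stream)
    {
        startNewChunk(p_stream.name);
        chunkTimer = setInterval(startNewChunk, 300000, p_stream.name);  // rotate every 5 minutes
    };

    application.onUnpublish = function(p_client, p_stream)
    {
        clearInterval(chunkTimer);
        if (chunk) chunk.record(false);
    };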

  • Question about streaming h264 in flv file

    I read an article at
    http://www.kaourantin.net/2007/08/what-just-happened-to-video-on-web_20.html
    which is a blog post by a software engineer at Adobe. The author says
    in the article that "...There are functional limits with the FLV
    structure when streaming H.264 which we could not overcome without
    a redesign of the file format..." and "...Specifically dealing with
    sequence headers and enders is tricky with FLV streams...". Does
    anybody know what exactly the author means? What are the "functional
    limits"? How does one "deal with sequence headers and enders"?
    Is there a way to work around the "functional limits"?
    Thanks!

    Hi, can anybody help me?

  • Live stream switching

    hi!
    I have a problem. I want to set up a live streaming experience where streams will be switched live by a server-side application written in .NET. The Flash clients will have no control over which stream they watch. The setup is as follows: video streams from two cameras are encoded in Live Encoder. FMS should deliver only a single stream to the clients, and which stream is shown should be chosen by the .NET application.
    How do I go about accomplishing such a task? As I understand it, FMS uses RTMP to communicate with clients. Are there any other ways to contact FMS and tell it to switch between two streams, e.g. an XMLSocket connection? That would be perfect, since it would integrate nicely with the .NET server. I do not like the prospect of using RTMP to contact FMS and tell it to switch between streams. Is this even possible, or can the stream switching be performed only per client?

    You can use an XML socket, but the FMS application will need to initiate the connection, as FMS has no support for listening for anything other than RTMP and HTTP requests.
    Stream switching can happen on the FMS side. In your FMS application, you'll create a server-side stream and use the Stream.play method for playing other sources (live streams or recorded FLV/H.264 files) over that stream. Your subscribers will connect to the server-side stream.
    See the FMS docs for the Stream class and the XMLSocket class.
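    To make that concrete, here is a minimal main.asc sketch of the idea (the stream and camera names are hypothetical; the switch command would arrive over the XMLSocket connection that the application itself opens to the .NET server):
    var outputStream;

    application.onAppStart = function()
    {
        // Server-side stream that every subscriber plays
        outputStream = Stream.get("outputStream");
    };

    // Call with "camera1" or "camera2" (the live stream names published by the encoder)
    function switchTo(cameraStreamName)
    {
        if (outputStream)
        {
            // -1 = live source; reset = true so the new camera replaces the old one
            outputStream.play(cameraStreamName, -1, -1, true);
        }
    }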

  • Can storing a live stream using ActionScript fail because the cache fills up with overhead?

    Hi,
    Lately we have been seeing a problem with the archives of live streams we create using FMS. We use FMS for live streaming and concurrently store the stream in a file using ActionScript. We use the following code to record:
    var s2 = Stream.get('mp4:' + mp4name);
    application.publishedStreamMP4= s2;
    application.publishedStreamMP4.record();
    application.publishedStreamMP4.play(application.publishedStream.name,-1,-1);
    (some lines have been removed that are used for logging, etc).
    Sometimes some of these functions fail and return false. In these cases FMS's core log shows that the cache is full:
    2013-06-11 11:45:55        13863   (w)2611372      The FLV segment cache is full.  -
    In investigating this issue I have not yet been able to recreate this exact situation. By lowering the cache to 1MB I have however been able to create a situation where storing a stream can stop because the cache is full. The situation occurs as follows:
    * The server is restarted, the cache is empty.
    * A live stream is started, which is also recorded.
    * Via the Administration API, the cache values <bytes> and <bytes_inuse> show up as exactly the same as the <overhead> of the object that relates to the file being saved. The <bytes> and <bytes_inuse> values of the object itself are 0.
    * This continues in the same way until the cache is full.
    * When the limit of the cache is reached the message
    2013-06-11 12:07:35        13863   (w)2611372      The FLV segment cache is full.  -
    is shown in the core log and storing of the file stops. The instance log also shows status changes:
    2013-06-11 12:07:35        13863   (s)2641173      MP4 recording status for livestream.mp4: Code: NetStream.Record.NoAccess Level: status Details:         -
    2013-06-11 12:07:35        13863   (s)2641173      MP4 recording status for livestream.mp4: Code: NetStream.Record.Stop Level: status Details:     -
    In the filesystem I can confirm that the last change to the file is at this moment (in this case 12:07). The live stream continues without problems.
    I have reproduced this several times. Though I can understand that caches can fill up and cause trouble, I feel this situation is a bug in FMS. The cache fills up with overhead, which is apparently reserved until writing of the file ends.
    I hope someone here can help out. Has anyone seen a situation like this and is there any remedy for it? Or even a workaround where this overhead in the cache can be released so the cache does not fill up?
    We use FMS version 4.5.1.


  • Custom metadata and recording a live stream

    I tried to inject my own metadata into a live stream (webcam) recorded on FMS 4.5.1,
    with no success. I was only able to change the standard metadata; not all of my custom ones were recorded,
    despite all the examples on Adobe LiveDocs (AS2 and AS3). No luck.
    This example
    http://help.adobe.com/en_US/flashmediaserver/devguide/WS5b3ccc516d4fbf351e63e3d11a0773d56e-7ff6Dev.html
    doesn't work at all for custom metadata, and not even for special metadata like createdby and creationdate.
    example:
    metaData.createdby = String(streamTitle + " of "+fullName);
    metaData.creationdate = String(now.toString());
    metaData.tags = {createdby:metaData.createdby, creationdate:metaData.creationdate};
    metaData.width = Number(devicesQuality.width);
    metaData.height = Number(devicesQuality.height);
    nodeSendStream.send("@setDataFrame", "onMetaData", metaData);
    Once the stream is recorded, I can see only metaData.width and height; createdby, creationdate and tags don't exist.
    Note: I encode and stream from FP 11.1 in H.264 with Speex or PCMU.
    Thanks

    well,
    I followed the example at the link you gave exactly, including modifying the Application.xml, but if it were a segment problem,
    how do you explain that the NON-custom metadata are successfully modified?
    Please check this link:
    http://help.adobe.com/en_US/flashmediaserver/devguide/WS5b3ccc516d4fbf351e63e3d11a0773d56e-7ff6Dev.html
    I even tried the official properties called createdby and creationdate without success. I was only able to change width and height.
    The code I use on the client is exactly the example from the Adobe manual, in AS2 and AS3.
    Thanks

  • Append stream object to existing file

    Hello, I'm trying to append a live stream to an existing file, but I can't figure out how to open the existing file and then append the live stream. This is server-side as well.
    I know I need to open the existing file using stream = Stream.get("file"), but I don't understand how to associate that stream with the live stream coming in.
    PLEASE HELP~!!!

    This is what I have now, and it still overwrites :*(
    application.onPublish = function (p_c, p_stream)
    {
        //check for the file helpers
        filePath = new File("/streams/_definst_/");
        fileName = p_stream.name + ".flv";
        fileTest = filePath.fileName;
        if (fileTest.exists == false){
            p_stream.record();
        } else {
            mynewstream = Stream.get(p_stream.name);
            mynewstream.onStatus = function(info){
                 trace(info.code);
            };
            mynewstream.record("append");
            mynewstream.play(p_stream.name, -1, -1, true);   // where p_stream.name is the name of the stream being published
        }
    };
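    For comparison, here is a minimal sketch of the append idea, assuming FMS 3.5 or later where Stream.record("append") is available; the "_archive" stream name is hypothetical and chosen so the recording does not clash with the name of the incoming live stream:
    application.onPublish = function(p_client, p_stream)
    {
        // File paths are relative to the application folder
        var flvFile = new File("streams/_definst_/" + p_stream.name + "_archive.flv");
        var archive = Stream.get(p_stream.name + "_archive");
        if (archive)
        {
            archive.onStatus = function(info) { trace(info.code); };
            // "append" adds to an existing file; fall back to "record" (create/overwrite) if it doesn't exist yet
            archive.record(flvFile.exists ? "append" : "record");
            archive.play(p_stream.name, -1, -1);   // copy the incoming live stream into the file
        }
    };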

  • Live stream and codecs

    Hello.
    Here it is described how to use different codecs for different
    clients (Sorenson Spark for Flash Player 7 and earlier, On2 VP6 for
    Flash Player 8 and later):
    http://livedocs.adobe.com/fms/2/docs/00000081.html
    But as far as I can see, it is only suitable for recorded .flv
    files.
    Is there any possibility to customize stream delivery for
    live streams?
    Can FMS2 transcode streams on the fly?
    Thank you.

    FMS can't transcode or do any sort of recompression. It's
    bits in, bits out.
    If you publish through the Flash Player, you will always have
    Sorenson Spark encoded video (the Flash Player only has the decoder
    half of the VP6 codec). If you want to publish live VP6 video, look
    into the Adobe Flash Video Encoder or On2 Flix Live.

  • Can I publish static content to Live stream

    I have a requirement to publish static content, such as the video files listed in a playlist, as a live stream. It is like creating a TV station with pre-recorded content, but delivered as a live stream. It would be great if anyone could let me know whether this is possible with FMS.

    Yes, it's very much possible.
    What you need to do is place all your files on the server and play them via a server-side stream. You can go through the FMS documentation and you should be able to get it to work. However, I will tell you briefly here how to go about it.
    Create an application folder under the applications directory of FMS. Let us name it "test". Create a folder called "streams" under "test", and then under "streams" create a "room" folder. So the structure would be as follows: <FMS installation directory>/applications/test/streams/room. Now place all your static content here; say you place one.mp4, two.mp4, three.mp4.
    Now you want to give effect of live stream.
    Create a file called main.asc under "test" folder.
    Write below code in main.asc:
    var mylivestream;
    application.onAppStart = function(){
         mylivestream = Stream.get("mp4:livestream.mp4");
         mylivestream.play("mp4:one.mp4",0,-1,true);
         mylivestream.play("mp4:two.mp4",0,-1,false);
         mylivestream.play("mp4:three.mp4",0,-1,false);
    The above code will play all three static files one after the other. Subscriber clients would subscribe in the following manner:
    ns.play("mp4:livestream.mp4",-1,-1)
    Please note that the above is sample code, not necessarily optimal or complete, and it will need tweaking depending on your requirements.
    Hope this helps
