Live stream in different resolutions
Hello,
I have an urgent question about the following scenario. We plan to develop a solution consisting of sender software that streams live video to a streaming server. More than one client should then be able to connect to the server's stream. Because the clients connect at different bandwidths, the server should be able to convert the video stream on the fly (e.g. the video width and height, frame rate, and quality). For example, a client with a better-than-DSL connection should get the video at 640x480, a DSL client at 352x288, and an ISDN client at 176x144.
Is there a way to do this with Flash Media Server using server-side ActionScript? Or can I limit the requested stream size on the client side? Or is there anything else I can do?
I hope somebody can answer my question quickly; it's really urgent.
Thanks in advance
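For what it's worth, FMS does not re-encode video on the fly; the usual pattern is to publish the same feed at several resolutions from the encoder and let each client subscribe to the one matching its measured bandwidth. A hypothetical client-side sketch (ActionScript 3; the thresholds, stream names, and the measuredKbps variable are all made up for illustration):

```actionscript
// Hypothetical: map a measured bandwidth (kbit/s) to one of several
// pre-published renditions matching the tiers described above.
function streamForBandwidth(kbps:Number):String {
    if (kbps >= 1000) return "livestream_640x480";  // better than DSL
    if (kbps >= 300)  return "livestream_352x288";  // DSL
    return "livestream_176x144";                    // ISDN
}

var ns:NetStream = new NetStream(nc);       // nc: an already-connected NetConnection
ns.play(streamForBandwidth(measuredKbps));  // measuredKbps: from your own bandwidth check
```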
Hi fmslove,
Is there a more elegant way to do this these days?
Regards.
Similar Messages
-
Problem relaying live stream using different URLs
Hi,
I have setup QT Broadcaster to announce a stream to my qtss.
I announced the stream to a directory live/stream.sdp
This works without a hitch when using the same URL for viewing, e.g.
rtsp://rtsp.myqttss.com/live/stream.sdp
If I create a symlink (or simply copy the sdp file) from live/stream.sdp to test/stream.sdp and try to access the stream through
rtsp://rtsp.myqttss.com/test/stream.sdp
it does not work anymore. QT Viewer will connect, but does not play the stream.
I am wondering why this is. Is the mountpoint relevant for accessing the stream? All necessary information should be contained in the SDP file.
The reason I would like to do that is because we have a payment module for QTSS that will use a 'virtual' address to dispatch a stream to the client. This works perfectly for non live content.
Thanks
NikWest
Ok.
No problem now.
Thanks. -
Duplicating stream object to new stream with lower resolution
Hi all, a general question: is it possible for AMS to split my original live stream into a new stream with a lower resolution, so that I have two streams, one high-resolution and one low-resolution?
If not, how does Adobe Flash Media Live Encoder do it? It can stream up to 3 different resolutions from one camera object. Is this also possible from Flash?
I want to build a multi-user conference application, but I have some CPU usage issues, so I need an additional lower-resolution stream to use for the miniature views.
Could you help me with this? Any tips on how I could do such a thing?
I am using the RTMP protocol and Adobe Media Server 5.
Sorry if a similar thread was already published on this forum.
Thanks for the reply.
So maybe I could do something like this:
I am able to send two streams from one camera, but they both have the same resolution.
Is there any way, from a Flash app, to send two streams from one camera with different resolutions, like FMLE does? -
HDS live streaming to Flash not working
Adobe Flash Media Server 4.5.5 r4013
Windows 2008
Sources:
http://help.adobe.com/en_US/flashmediaserver/devguide/WSd391de4d9c7bd609-52e437a812a3725df a0-8000.html
http://www.adobe.com/devnet/adobe-media-server/articles/live-multi-bitrate-video-http-flas h-ios.html
Live streaming a single or multi-bitrate video over HTTP to Flash does not work. I have followed the instructions on the 2 sources listed above repeatedly, but I can’t get live streaming over HTTP to Flash to work. Live streaming to iOS over HTTP works with no problems (single and multi-bitrate streams).
I have tried the troubleshooting steps from the following:
http://help.adobe.com/en_US/flashmediaserver/devguide/WS0432746db30523c21e63e3d12efac195bd -8000.html
Troubleshoot live streaming (HTTP)
1. Services window (Windows): Flash Media Server (FMS), Flash Media Administration Server, and FMSHttpd services are running. ✓
2. Verified that the request URL is correct. ✓
3. Configured ports:
a. Configure Apache to use port 80. Open rootinstall/Apache2.2/conf/httpd.conf in a text editor. Change the line Listen 8134 to Listen 80.
b. Configure Flash Media Server not to use port 80. Open rootinstall/conf/fms.ini in a text editor. Remove 80 from the ADAPTOR.HOSTPORT parameter so the parameter looks like the following: ADAPTOR.HOSTPORT = :1935 ✓
4. Placed a crossdomain.xml file to the rootinstall/webroot directory. ✓
5. In Flash Media Live Encoder, select the Encoding Options tab, choose Output from the Panel options menu, and verify the following:
a) The value of FMS URL is rtmp://fms-dns-or-ip/livepkgr. If you’re testing on the same server as Flash Media Server, you can use the value localhost for fms-dns-or-ip. ✓
b) For a single stream, the value of Stream is livestream?adbe-live-event=liveevent. ✓
c) For adaptive bitrate streaming, the value of Stream is livestream%i?adbe-live-event=liveevent. ✓
Flash Media Live Encoder uses this value to create unique stream names. To use another encoder, provide your own unique stream names, for example, livestream1?adbe-live-event=liveevent, livestream2?adbe-live-event=liveevent.
The encoder is showing all 3 streams being published and streaming.
6. Check Administration Console: the livepkgr application and the 3 streams are running. ✓
7. Check the logs for errors. Flash Media Server logs are located in the rootinstall/logs folder. The master.xx.log file and the core.xx.log file show startup failures. Apache logs are located in the rootinstall/Apache2.2/logs folder. X
a) core00.log: these errors did not occur every time that I tried playing the live stream but these are the only relevant errors in the logs.
1. 7968 (w)2611179 Warning from libf4f.dll: [Utils] [livestream2] Discarded all queued Media Messages received before first Video Keyframe Message
2. 7968 (w)2611179 Warning from libf4f.dll: [Utils] [livestream3] Discarded all queued Media Messages received before first Video Keyframe Message
b) edge00.log:
13:33:57 4492 (w)2641213 Connection rejected by server. Reason : [ Server.Reject ] : (_defaultRoot_, _defaultVHost_) : Application (hds-live) is not defined. -
c) Apache-Error:
1. [warn] Checking if stream is disabled but bootstrap path in event file is empty for event:livepkgr/events/_definst_/liveevent stream name:livestream
2. [warn] bootstrap path is in event file is empty for event:livepkgr/events/_definst_/liveevent stream name:livestream1
As I mentioned, everything works on iOS and FMS seems to be creating all of the stream segments and meta files:
a. The 3 streams are being created in: HD:\Program Files\Adobe\Flash Media Server 4.5\applications\livepkgr\streams\_definst_
b. FMS is creating the following files in each stream folder (livestream1, livestream2, livestream 3):
1. livestream1.bootstrap
2. livestream1.control
3. livestream1.meta
4. .f4f segments
5. .f4x segments
The appropriate files are also being created in the HD:\Program Files\Adobe\Flash Media Server 4.5\applications\livepkgr\events\_definst_\liveevent folder, in which I have the following Manifest.xml and Event.xml files:
<manifest xmlns="http://ns.adobe.com/f4m/1.0">
<media streamId="livestream1" bitrate="200" />
<media streamId="livestream2" bitrate="500" />
<media streamId="livestream3" bitrate="1000" />
</manifest>
<Event>
<EventID>liveevent</EventID>
<Recording>
<FragmentDuration>4000</FragmentDuration>
<SegmentDuration>16000</SegmentDuration>
<DiskManagementDuration>3</DiskManagementDuration>
</Recording>
</Event>
I’ve tried clearing the contents of both streams\_definst_ and events\_definst_\liveevent (keeping the xml files) after restarting the encoder, and creating a different event definst for the streams (liveevent2 for example).
We have an event in 2 weeks that we would like to stream to both Flash and iOS. Any help in solving this problem will be greatly appreciated.
One step closer:
Changed the crossdomain.xml file (more permissive settings).
Changed the encoding in FMLE to VP6. Working somewhat (don't know what I did to make it start streaming through HDS).
But at least now I can get the individual streams in the set manifest file to work:
http://localhost/hds-live/livepkgr/_definst_/livevent/livestream1.f4m
http://localhost/hds-live/livepkgr/_definst_/livevent/livestream2.f4m
http://localhost/hds-live/livepkgr/_definst_/livevent/livestream3.f4m
BUT when I try to play the streams through the set manifest file from http://localhost/liveevent.f4m I'm getting the following error:
"The F4m document contains errors URL missing from Media tag." I'll search the forums to see if anyone else has come across this problem.
I used the f4m config tool to make the file. These are the file's contents:
<manifest xmlns="http://ns.adobe.com/f4m/2.0">
<baseURL>http://localhost/hds-live/livepkgr/_definst_/liveevent/</baseURL>
<media href="livestream1.f4m " bitrate="200"/>
<media href="livestream2.f4m " bitrate="500"/>
<media href="livestream3.f4m " bitrate="1000"/>
</manifest>
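One thing worth ruling out, judging from the pasted manifest: each href value contains a trailing space before the closing quote ("livestream1.f4m "), which some f4m parsers may treat as part of the URL and so fail to resolve. (Incidentally, the individual stream URLs above use the path segment livevent while this baseURL says liveevent; worth double-checking which is correct.) A cleaned-up version of the same set-level manifest:

```xml
<manifest xmlns="http://ns.adobe.com/f4m/2.0">
    <baseURL>http://localhost/hds-live/livepkgr/_definst_/liveevent/</baseURL>
    <media href="livestream1.f4m" bitrate="200"/>
    <media href="livestream2.f4m" bitrate="500"/>
    <media href="livestream3.f4m" bitrate="1000"/>
</manifest>
```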
Thanks -
How do I use multiple cameras to live stream through FME?
I am looking to live stream pool tournaments from multiple angles but don't know what software or hardware I might need. Anybody have any good how to advice or links they might share? I stream through Ustream.tv if that makes a difference. Should I look for something different? Thanks
I am working on getting just the counter working by using
the program posted previously, and I am running into issues. Periodically I get
the error:
Error -200141
occurred at DAQmx Read (Counter DBL 1Chan 1Samp).vi
Possible reason(s):
Data was overwritten
before it could be read by the system.
If Data Transfer
Mechanism is Interrupts, try using DMA. Otherwise, divide the input signal
before taking the measurement.
It seems to work better if I use cascaded counters, but I need timer 0 for
analog channels when I run this code along with the program for the other
measurements.
I have tried averaging, and selecting different values for the millisecond
timer, and these did not seem to have an effect.
I tried different DAQms configurations and "Counter DBL 1Samp" seemed
to work the best.
The program will work for a while and then it will give me the above error
message.
If I use counter 0 as a cascaded counter input, the program runs fine. If I run
this with other analog channels, it errors out because the analog channels use
counter 0.
If I use counter 1 as a cascaded counter input, it seems to work better than a
single channel, but it will still error out with the above error.
If I use only counter 1, I get the error above even faster.
Also, none of the
configurations give measurements outside the While Loop.
The only place I can add a speed dial for the front panel is within the While
Loop.
Is there some way to get the signal to continuously send out of the while loop?
I thought if I could get the signal out of the while loop, I could condition it
anyway I wanted without the program erroring out.
Any suggestions would be much appreciated.
Thank you.
Attachments:
Counter_error.jpg 45 KB -
I see the same photos with different resolutions in my iPhoto library
I am trying to organize the photos in my iPhoto library by deleting those that are stored twice or more. However, for many of the duplicates I found that some exist in my library at different resolutions. My photos are imported either from my camera, from my iPhone, from my iPad, or from the shared photo streams, which include photos imported from various Apple devices.
What I see, for example, is that some photos taken with an iPhone exist in both 2048x1536 and 3264x2448 resolution, and some others in 1224x1632 and 1536x2048 (I guess these 2 resolutions are for the vertically taken photos while the other 2 are for horizontal ones). For iPad photos I see 2056x1536 and 2592x1936 for the same photos. All duplicates have different names (e.g. IMG_007 and IMG_1056 for the same photo). Furthermore, some duplicates include info about the camera (e.g. iPad back camera 4.28mm f/2.4) and some others don't (only info about the device, e.g. iPad).
Why is this mess happening in my library? I want to keep in my library only the maximum available resolution of each photo, for better quality.
In relation to that I have the following questions:
1. When a photo or a video is shared in a photo stream, is its resolution decreased to a lower one?
2. Is there any way to see what the resolution is of the original photo stored on the iPhone or iPad, so that I can compare it with the same ones I have in my computer's iPhoto library?
3. Is there any way to see the resolution and other info for the photos in a shared Photo Stream which are not yet imported locally on my computer?
Thanks in advance.
Happy New Year to all!
I ran a test and it confirms what I suspected in the first place.
I uploaded a photo taken from my iPhone to the shared photo stream. Then I imported this photo from the photo stream to my iPhoto library on my Mac. The resolution in which this photo is stored locally is 2048x1536 and its size 961 KB.
Then I sent the same photo from my iPhone via e-mail. I opened the email on my computer and downloaded the photo. Then I imported it into my Mac's iPhoto library. I checked the info, and the resolution is now 3264x2448 and the size 2.2 MB.
Therefore the quality and resolution of photos uploaded to the iCloud photo stream are NOT the same as the original.
I don't know whether there is an option somewhere to change that, but this is what happens for me right now. -
My Maxx plays some live news videos and sometimes it does not, depending on the site. I tried different video players, like MX and others, to no avail. I tried Chrome and other browsers.
Not sure which way to go with this.
Any ideas would be much appreciated.
I went to a Verizon store (agent) yesterday after work and tried the S4. They didn't have a live S5, and it did play the sites I normally use. The iPhone was able to live stream them also. I think it's the Razr Maxx that's missing whatever it needs to play those videos: plug-ins, etc.
It's mind-boggling with all these phones out there.
Anyway, thanks for your input, Suzy. -
PLS playlist of MP3 live stream doesn't work properly, what to do?
Can others confirm the following issue with a live MP3 stream in version 7.0.2.16 on Windows XP (SP2):
Go to wnyc.org and click on the AM or FM live stream (32k) links at the top left of the page. They should link to either of the following urls:
http://www.wnyc.org/stream/fm.pls
http://www.wnyc.org/stream/am.pls
Open these playlists with the current iTunes for windows. You should hear a 15 second sponsor pre-roll for this NPR station, and the live stream should start after that. I only hear the pre-roll, and then iTunes skips to the next item in the library.
Do you only hear the sponsor pre-roll and then nothing, OR does it work properly for you? If not, what to do?
Reboot your computer in Safe Mode and then try to reset all Skype settings.
Go to Windows Start and in the Search/Run box type %appdata% and then press Enter or click the OK button. The Windows File Explorer will pop up. There locate a folder named “Skype”. Rename this folder to something different, e.g. Skype_old.
Next go to Windows Start and in the Search/Run box type %temp%\skype and then press Enter or click the OK button. Delete the DbTemp folder.
Reboot in normal mode and try to start Skype.
N.B. If needed, you will still be able to re-establish your call and chat history. All data is still saved in the Skype_old folder. -
Recommended setup for live streaming of HD video and HE-AAC audio (5.1 channels)
Hi,
Could anybody please post a working hardware configuration for streaming HD video (720p) and 5.1 channel audio using HE-AAC via FMLE3?
What is your experience and what kind of professional equipment would you recommend (camera, capture card, type of cables used, PC specs incl. CPU and memory, etc.)?
I'm looking for a portable solution that ideally supports two concurrent streams (at different bit-rates) and multi-pass encoding (yes, I know that professional hardware supports that even for a live setting with a little delay).
FMLE (or is it Windows?) seems to be very picky as far as the recognition of HD data from specific input sources is concerned. Hence my question before investing into new hardware.
Thank you.
Flash Player does not support 5.1 audio at this time, only 2-channel audio.
For what you describe, you may want to consider one of the streaming appliances or solutions from companies like ViewCast, Digital Rapids, or Inlet. ViewCast and Digital Rapids both offer portable appliances.
Flash Media Live Encoder does not support two-pass encoding, the only solution that I'm aware of that does this for Flash is Kulabyte that offers a 2-pass software solution as well as a complete solution with hardware.
If you use a capture card with FMLE you'll be fine with HD inputs, the problem seems to be with cameras that only offer HD source as MPEG-2 and FMLE only supports RAW input at this time.
Laurel Reitman
Sr. Product Manager -
I am publishing 2 live streams from a computer with 2 video capture cards in it, and I get a lag every 30 seconds or so on the subscriber's side. I have tried adjusting the camera quality and setMode properties, but the lag persists both inside and outside the LAN. Is there a way to create a buffer on the server, or to adjust how the live stream is received on the subscriber's side, so there is no noticeable lag? I saw something about editing Application.xml to adjust the queue and a suggested bitrate, but I'm not sure if this is applicable; here is the link:
http://www.adobe.com/devnet/flashmediaserver/articles/dynstream_live.html
Here is my setup:
The publishing computer:
2 PCI-e x1 cards, one takes S-Video (480i) and the other DVI (720p)
Windows 7 64bit
Intel i7
6 GB RAM
GB NIC and Switch
From the switch it is one hop to a GB router and out to a 10 MB pipe which leads to our datacenter 30 miles away. The swf on this side just gets the 2 cameras, sets the quality to (0,80) and the mode to (640,480,25,false) (I have played with these settings a little) and creates 2 live streams on the FMS.
The FMS:
I am running Flash Media Interactive Server 3.5 on my own server with two 3.6 Dual Core Xeon processors and 4 GB RAM. This server resides in a Datacenter and has a 100 MB burstable pipe. From the FMS administration console I am barely using 4MB total bandwidth, 1% CPU usage, 2% RAM.
The subscribing computer:
I have used many different types of hardwired PC's within the same LAN and outside the LAN, results are the same.
The swf on this side just creates 2 new video instances and attaches the 2 netstreams to them. They are placed side by side and the height and width of these videos are undefined.
Does anyone know where this lag could be coming from, is there any settings I can adjust to improve performance while maintaining a minimum S-Video quality level?
Thanks
Hi,
Thanks for the detailed information in your first post.
Coming to the latency... it is affected by various factors: the stream bitrate, FMS server-side settings, the subscriber's client-side settings, and on top of these the network bandwidth.
I need to know the NetStream.bufferTime set at the subscriber's application. Try setting it to 3 when you issue the play, later you can increase it to 8 or so to stabilize the play back.
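The two-step buffer tuning described here might look like this on the subscriber side (ActionScript 3 sketch; the stream name and the nc connection are assumptions):

```actionscript
// Start with a small buffer for a quick start, then grow it once full
// to stabilize playback (values from the advice above: 3 s, then 8 s).
var ns:NetStream = new NetStream(nc);  // nc: an already-connected NetConnection
ns.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
ns.bufferTime = 3;
ns.play("livestream1");  // hypothetical stream name

function onStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetStream.Buffer.Full") {
        ns.bufferTime = 8;  // larger buffer smooths long-running playback
    }
}
```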
Also, can you try subscribing to a single stream and check whether your client bandwidth is able to play back a single stream?
The link which you have mentioned below is a good reference to tune your server side settings.
Regards,
Janaki L -
Re-Start a Live Stream using AMS 5 through Cloudfront
We have AMS 5 running on an Amazon EC2 instance. We send a Live Stream from an FMLE encoder using H.264/AAC.
If we do an Origin Pull from that instance running AMS 5 and live stream through one of our commercial CDN accounts, we can stop and re-start our live stream without issue. It stops and re-starts as it should. When we re-connect to the AMS with FMLE, using the Origin Pull, the stream re-starts in either the OSMF or JW Player 6 in under 30 seconds.
If we use that same AMS 5 on the same EC2 instance and Live stream through an Amazon Cloudfront distribution, if we stop our Live stream, when we re-start the stream it takes an unacceptably long time for the stream to refresh - if ever. Basically, after the re-start of the Live stream through FMLE, the Live stream at the player plays the first 10 or 15 seconds of the intial Live stream and then stops. If you refresh the player, it will play that same 10 or 15 seconds again - and stop again.
It would seem that the AMS 5 is setup correctly. It works through our commercial CDN.
It also would seem that the issue is with the cache on Amazon Cloudfront. Our assumption is that the Cloudfront cache is not properly updating. But we have tried about every variable in every setting for Cloudfront that we can find. We can't get past this.
Any suggestions?
Thanks
As a followup to this discussion, you can follow the issue through the AWS/Amazon forums at https://forums.aws.amazon.com/thread.jspa?threadID=120448&tstart=0
We also have other threads going.
http://forums.adobe.com/thread/1180721?tstart=0
https://forums.aws.amazon.com/thread.jspa?threadID=120448&tstart=0
There appear to be 2 different issues: one issue with Amazon Cloudfront and a second issue with Adobe Media Server. Unfortunately, they are tied together in the real-world application of these programs.
In AMS, when we use adbe-record-mode=record, AMS works fine because it clears the streams on each re-publish. Unfortunately, this causes a status-code error on Amazon Cloudfront. And that particular status-code error from a custom origin server causes Cloudfront to "override the minimum TTL settings for applicable cache settings". As the programmers explained, basically what happens is that Cloudfront overrides our 30-second cache settings and falls back to a standard cache setting of 5 minutes. And waiting 5 minutes to re-start a live stream just doesn't work in the real world for real customers.
We can fix that Cloudfront status code error issue by using adbe-record-mode=append on the AMS. If we use that AMS setting, then we can stop a Live stream, re-start the stream, the Cloudfront cache clears and everything works - unless you also change the stream settings in your FMLE. If you stop the stream and change the video input or output size in the FMLE, then the AMS server gets stuck and can't/won't append to the f4f record file.
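For reference, the two record modes discussed here are selected through the stream name the encoder publishes, the same way the adbe-live-event parameter is passed earlier in this collection (stream and event names as used in this thread; exact syntax worth verifying against the AMS docs):

```text
Clear the recording on each re-publish (works for us, but trips Cloudfront's TTL override):
    livestream?adbe-live-event=liveevent&adbe-record-mode=record

Append on re-publish (keeps Cloudfront happy, but breaks if encoder settings change):
    livestream?adbe-live-event=liveevent&adbe-record-mode=append
```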
Does anyone have any ideas about how we can adjust all of this so that it works together?
Thanks -
We're encountering a problem with live audio streams going
through FMS. We play back the streams with a generous buffer on the
client to smooth out any hiccups in the end-user's connection. But
over long periods of time (10-30 minutes) the amount of data in the
listener's buffer (monitored with NetStream.bufferLength) gradually
shrinks until it hits 0, pauses to buffer again, and refills. This
occurs on several different internet connections, and there appears
to be plenty of bandwidth available to refill the buffer even if
momentary network problems occur.
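Since bufferLength is already being monitored, one client-side mitigation to experiment with (a sketch, not a confirmed fix; the interval and thresholds are made up) is a watchdog that grows bufferTime before the buffer fully drains:

```actionscript
// Hypothetical watchdog: if the monitored buffer has drained below half of
// its target, grow the target (up to a 30 s cap) instead of waiting for a stall.
var watchdog:Timer = new Timer(5000);  // check every 5 s
watchdog.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
    if (ns.bufferLength < ns.bufferTime / 2) {      // ns: the playing NetStream
        ns.bufferTime = Math.min(ns.bufferTime + 5, 30);
    }
});
watchdog.start();
```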
Is there anything we can configure in FMS or the client players to prevent this? Thanks in advance for any suggestions.
No problems here viewing CNN live streams. According to the CNN FAQ, you need Flip4Mac to view their live streams/videos.
"You may watch live video using the Flip4Mac plugin and Quicktime 7.1.2+"
http://www.cnn.com/help/live.html -
Importance of Key Frame Interval in live streaming
What is the significance of the key frame interval in live streaming? In VOD I know that media players seek between these keyframes, but what is its use in live streaming? Any effect on quality?
Please help
-Prem
Sorry the recommendations didn't work. I tested two 1080p files, one 23.98 fps and one 29.97 fps, at those settings. No loss of sync whatsoever here.
So perhaps there is something about your source file that's a problem. Try another file and see.
On a different, but related topic, the MPEG4 encoding that Compressor 3.5 uses is kind of long in the tooth. You'll note in the user manual that all the discussion about features and settings seems to presume that the file will not be re-encoded. (Hence, my question about whether it was for your own Web Site.)
All the video services take whatever we upload and re-encode them to their proprietary specs. So generally it's best to upload as much info as possible. (David Brewer, who frequently contributes to these boards, uploads very large Pro Res files with good results.)
YT's advanced encoding guidelines recommend the MP4 container, but with the h.264 codec. That is the Part 10 MPEG standard, which Compressor 3.0 and 3.5 cannot do. (However, Compressor 4.0 can do that encode, which is one of benefits of having that version.)
FWIW, all videos that I prepare for the Web are h.264 .mov files (not mp4) and I've not had any quality issues with video or audio. Just personal preference.
Perhaps someone else will have some ideas why you lose sync other than sample rate and/or change in frame rate.
Good luck.
Russ -
So far, I have only found tutorials on controlling these attributes while publishing the stream; how can I adjust them while playing?
That article judges the bandwidth and assigns a proper video to play (say these videos are prepared beforehand).
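With FMS 3.5 dynamic streaming, a playing client can also switch between pre-published renditions mid-stream via NetStream.play2(); a minimal sketch (the rendition name is hypothetical):

```actionscript
// Switch an already-playing NetStream to a lower-bitrate rendition.
var opts:NetStreamPlayOptions = new NetStreamPlayOptions();
opts.streamName = "livestream_352x288";             // hypothetical rendition name
opts.transition = NetStreamPlayTransitions.SWITCH;  // seamless mid-stream switch
ns.play2(opts);  // ns: the NetStream currently playing the high-res rendition
```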
But I can't get multiple live video streams with different settings out of a single pc camera.
And I can't require each user to install multiple cameras to use the service. -
Archiving live stream at FMS and injecting metadata: VP6 good h264 not
When I record a live stream at FMS, one in which I've injected metadata in my main.asc file, the archived file plays back fine. The metadata plays back too; I'm able to retrieve it just fine, if I encode VP6.
If I encode h.264 the file plays back but the metadata does not. The fact that the archived file is created and plays back tells me things are wired correctly. The only thing I changed is the format.
According to FMS docs (http://help.adobe.com/en_US/FlashMediaServer/3.5_SS_ASD/WS5b3ccc516d4fbf351e63e3d11a11afc9 5e-7e42.html#WS5b3ccc516d4fbf351e63e3d11a11afc95e-7f35)
..."The recording format is determined by the filename you pass to the Stream.get()
method."
So my record code looks like the following:
application.onPublish = function(client, stream) {
    trace("onPublish");
    s = Stream.get("mp4:streamname.f4v");
    if (s) {
        s.record();
        this.doRepublish(this.nc, stream);
    }
};
My code that injects the data into the stream looks like this:
Client.prototype.sendDataEvent = function(data) {
    trace("Call to sendDataEvent...");
    this.newStream = Stream.get("mp4:streamname.f4v");
    this.newStream.send("onTextData", data);
};
All must be wired correctly, because the metadata comes through during the live stream. On playback of the archive, though, the metadata doesn't appear to be there.
Any thoughts?
Thanks
My apologies on the s.play() confusion. I had been trying different versions of the code and posted the one without it.
Whether I include s.play() or not, the file gets created. Here are the various versions of the onPublish() function I've tried (the differences were marked in red in the original post):
1.
application.onPublish = function(client, stream) {
    trace("onPublish");
    s = Stream.get("mp4:streamname.f4v");
    if (s) {
        s.record();
        s.play("mp4:streamname.f4v");
        this.doRepublish(this.nc, stream);
    }
};
2.
application.onPublish = function(client, stream) {
    trace("onPublish");
    s = Stream.get("mp4:streamname.f4v");
    if (s) {
        s.record();
        s.play(stream);
        this.doRepublish(this.nc, stream);
    }
};
3.
application.onPublish = function(client, stream) {
    trace("onPublish");
    s = Stream.get("mp4:streamname.f4v");
    if (s) {
        s.record();
        this.doRepublish(this.nc, stream);
    }
};
All produce the same result - an archived file called mp4:streamname.f4v in my streams folder. This file plays back fine but does not play back the commands.
On your other question, about things working fine for VP6, it works fine for FLV. A file called streamname.flv is produced. This file plays back fine and does indeed play back commands baked into the file as well. This is what makes me believe the code is not the problem. If it works perfectly for one format, there would seem to be very little I could do in my code to break things for the other.
Can you try this using the record() code snippets in the live docs Stream.record() section?
http://help.adobe.com/en_US/FlashMediaServer/3.5_SS_ASD/WS5b3ccc516d4fbf351e63e3d11a11afc9 5e-7e42.html#WS5b3ccc516d4fbf351e63e3d11a11afc95e-7f35
All you'd need is the code snippets there to record your live stream and another server side function to inject commands into that live stream. Here is that function:
Client.prototype.sendDataEvent = function(data) {
    trace("Call to sendDataEvent...");
    this.newStream = Stream.get("mp4:streamname.f4v");
    this.newStream.send("onTextData", data);
};
Do something simple like call onTextData and pass some text in the data parameter. Then on the client side viewer, handle the onTextData method. It will receive the text. Display it in a text area or something.
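On the client side, handling that onTextData callback might look like this (AS3 sketch; the ns variable is the subscribing NetStream):

```actionscript
// Receive the injected onTextData messages on the subscriber side.
var streamClient:Object = {};
streamClient.onTextData = function(data:Object):void {
    trace("onTextData: " + data);  // e.g. display it in a TextArea
};
ns.client = streamClient;  // register the callback object on the NetStream
```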
If you record while injecting this text into your stream, the text should display on playback of the archived file. It will if you encode VP6/FLV, but not if you encode H.264/F4V.
Let me know what you discover.