Issues publishing an F4V live stream
I use FMLE to push an F4V live stream to my FMS 4.0 "livevideos" app, then I set Strobe Media Playback to play the live video. Unfortunately, I cannot view the live video in my Strobe player. Then I make a change in FMLE to push an FLV live stream instead (just deleting the stream type "mp4" and the ".f4v" extension, and switching from H.264 to the VP6 format), and configure the src of the Strobe player accordingly (just deleting the stream type "mp4" and the ".f4v" extension); this time I get the live stream. What is the matter?
BTW, after I get the FLV stream in my Strobe player, I restore FMLE to push an F4V stream to my FMS and restore the src on my Strobe player, and this time I can get the F4V live stream.
The main.asc on my FMS 4.0 is very simple, and I am sure the configuration of my FMLE is correct.
Thank you for your response. My Linux OS version is RHEL 5.4. The Adobe official website recommends RHEL 5.3. Does that matter?
My main.asc is very simple, but it adds some authentication via a web service. FMLE's FMS URL setting is a string like the following:
rtmp://192.168.1.2/livestreams/266?username=abc&videoId=89edrs. The FMS dev guide says that the FMS URL should look like rtmp://192.168.1.2/livestreams/266, without the query string. The query string is added to send some useful information to FMS 4.0, so that I can parse it via client.uri and use that information to authorize the publisher. Does the query string matter?
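For reference, a minimal sketch of how the query string could be parsed on the server. main.asc is server-side ActionScript, which is close to JavaScript, so the helper below is pure ECMAScript; the property names (username, videoId) and the authorizePublisher call are just placeholders taken from the example URI above, not a real FMS API.

```javascript
// Parse the query-string portion of a client URI such as
// "rtmp://192.168.1.2/livestreams/266?username=abc&videoId=89edrs"
// into a plain object of key/value pairs.
function parseQueryString(uri) {
    var params = {};
    var qIndex = uri.indexOf("?");
    if (qIndex < 0) return params;              // no query string present
    var pairs = uri.substring(qIndex + 1).split("&");
    for (var i = 0; i < pairs.length; i++) {
        var kv = pairs[i].split("=");
        if (kv[0].length > 0) params[kv[0]] = kv[1] || "";
    }
    return params;
}

// Hypothetical use inside application.onConnect in main.asc:
//   var p = parseQueryString(clientObj.uri);
//   if (!authorizePublisher(p.username, p.videoId)) { /* reject client */ }
```

Because the helper has no Flash-specific dependencies, it can be dropped into main.asc as-is or unit-tested outside FMS.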
The live application seems OK; I just copied the live application's Application.xml and made some changes to the "LIVE_DIR". All three occasions used a different OS on a different machine, from the same XP client.
Similar Messages
-
Publishing multi-bitrate live streams for iOS
I'm having difficulties publishing multi-bitrate live streams that can be viewed on an iPad/iPhone.
I'm running Flash Media Server 4.5, and have carefully followed Adobe's Tutorial on streaming live media (http) using hls found here: http://help.adobe.com/en_US/flashmediaserver/devguide/WSd391de4d9c7bd609-52e437a812a3725df a0-8000.html#WS0432746db30523c21e63e3d12e8340f669-8000
After completing the above tutorial, the video can be seen just fine on my desktop computer via the flash plug-in, but it's not working in iOS...
When I go to what I believe to be the proper URL, (http://myflashmediaserver.net/myevent.m3u8), I get an error message on my iPhone and iPad saying "The operation could not be completed".
I have created two set-level manifest files (one .f4m and one .m3u8) for the live stream using the Set Level F4M/M3U8 File Generator and saved them to the WebRoot directory, but alas, no love.
Any help would be greatly appreciated!
Mike
I just finished going through the short and sweet tutorial on the Adobe website, "Capture, encode and stream live multi-bitrate video over HTTP to Flash and iOS", which confirmed that I seem to be doing everything right, but I'm still getting the "The operation could not be completed" error message on both iPad and iPhone.
Grasping at straws, I'm wondering if it could have something to do with some of the "hacks" I was asked to make in the other tutorials, (which, oddly enough, weren't mentioned in the tutorial mentioned above). Specifically:
• Edit FMLE config file on the Mac I'm encoding the live video on (change <streamsynchronization> from false to true)
• Delete/Rename the "manifest.xml" file in applications/livepkgr/events/_definst_/liveevent directory
• Edit "event.xml" in applications/livepkgr/events/_definst_/liveevent (change <segmentduration> from 40,000 to 16,000)
However, I've tried running this with the above hacks as well as in their non-hacked state and am still not seeing success.
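For reference, the Event.xml tweak from the bullets above would look roughly like this (a sketch based on the usual livepkgr Event.xml layout; the EventID and FragmentDuration values shown are assumptions, only the 16,000 ms SegmentDuration comes from the tutorial):

```
<Event>
  <EventID>liveevent</EventID>
  <Recording>
    <FragmentDuration>4000</FragmentDuration>
    <SegmentDuration>16000</SegmentDuration>
  </Recording>
</Event>
```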
Please advise. Thanks! -
Issues while watching content after live streaming operation
Hi,
1) We performed live streaming of an event using Azure Media Services; we streamed the entire event, which was about 5-6 hours long. But at the end we found that only the last hour of the entire event had been streamed.
Can you let me know what the issue might be?
2) We performed live streaming of an event in two parts using Azure Media Services. After two hours we stopped the streaming of the first part and deleted its channel, and we started streaming the second part by creating another channel. We are unable to view the first part through its content URL, while we are able to view the second part through its content URL.
Can you let me know what the issue might be?
Thanks,
Pushkar
The fan is speeding up to cool the GPU. The GPU (graphics chip) heats up because videos and games work the GPU much harder than normal desktop activities.
When was the last time you checked and cleaned the air intake grill on the bottom of your iMac?
I shut down (stopping the fans), disconnect the power, and vigorously vacuum out the bottom grill work on my Macs a couple of times a year to keep them running cool. -
Publish H.264 live stream
Hi all,
I'm currently trying Flash Media Server 3.0 with CS3 and
Flash Player 9.0.115. Our goal is to implement a videoconferencing
application with H.264 encoded video.
In the feature list, it's clearly stated that the server
allows live publishing of H.264 content. I haven't seen any
documentation on how to do this yet. I don't know whether the
player publishes in H.264 by default or not?
Is it possible to do this?
You should use an H.264-compatible encoder.
FMS only delivers content; the player only plays the stream.
Flix Engine, from On2, has an H.264 encoder. -
Can I publish static content to Live stream
I have a requirement to publish static content, like video files listed in a playlist, to a live stream. It is like creating a TV station with pre-recorded content, but as a live stream. It would be great if anyone could let me know whether this is possible with FMS?
Yes, it's very much possible.
What you need to do is place all your files on the server and play them via a server-side stream. You can go through the FMS documentation and you should be able to get it working. However, I will briefly tell you here how to go about it.
Create an application folder under the applications directory of FMS; let us name it "test". Create a folder called "streams" under "test", and then under "streams" create a "room" folder. So the structure would be as follows: <FMS installation directory>/applications/test/streams/room. Now place all your static content here; say you place one.mp4, two.mp4, three.mp4.
Now you want to give effect of live stream.
Create a file called main.asc under "test" folder.
Write the code below in main.asc:
var mylivestream;
application.onAppStart = function(){
    // Queue the three recorded files on a server-side live stream
    mylivestream = Stream.get("mp4:livestream.mp4");
    mylivestream.play("mp4:one.mp4", 0, -1, true);    // reset the playlist
    mylivestream.play("mp4:two.mp4", 0, -1, false);   // append
    mylivestream.play("mp4:three.mp4", 0, -1, false); // append
};
The above code would play all three static files one after the other. A subscriber client would have to subscribe in the following manner:
ns.play("mp4:livestream.mp4",-1,-1)
Please note that the above code is sample code, not the most optimal or correct, and it will need tweaking depending on your requirements.
Hope this helps -
Live Streaming from Video File
Hi everyone, I'm a newbie at programming streaming video with Flash. After reading the documentation I tried the instructions in the docs, and I have a question I haven't been able to solve: how do I make a "live stream" from a "video file"? It is like a live stream from a camera, but it uses a video file as the source. Many thanks.
Do you mean you have a video file but you want to publish it as a live stream, am I right? If yes, the following is the solution, but before that let me clarify that you will need the server to do it; there is no way a Flash client (i.e. a .swf file running in Flash Player) can publish a file from local disk.
Now let me tell you how to do it from server:
Place your video file under the "streams" folder of your application, i.e. say "myVideo.flv" under <FMS root>/applications/myApp/streams/_definst_
Now write the following code in your main.asc file (this file goes under <FMS root>/applications/myApp):
var mystream;
application.onAppStart = function(){
    // Republish the recorded file as a live stream named "livestream"
    mystream = Stream.get("livestream");
    mystream.play("myVideo", 0, -1, true);
};
This will publish the "myVideo" file as a live stream under the name "livestream". The client would have to subscribe using NetStream and
use ns.play("livestream", -1, -1)
You might have to write some extra code if you want to loop the same file; otherwise, once playback of the recorded file is over, the live publish will automatically stop.
Let me know if this solves your problem -
Advice on Snaphots of live streams
I've got a streaming video/audio app (chat), and everything is
working nicely. I'm going to add a snapshot function to it, and I'd
love some pointers. There will be approx. 100 users; each user will
have the option of taking snapshots, either single shots or a few
automated ones. I'm using FMS2.
1. Are there any known issues recording parts of a live stream
while it's streaming?
2. Should I use the same NC and record the FLV file/chat
instance, then do FLV -> FFmpeg -> JPG?
3. Should I make a new NC (and stop the live stream) to a
different app (snapshot) and do the FLV -> JPG conversion there?
And if you have any other good advice, please let me know.
BR
/A
Well, I kind of figured it out: using a setInterval on the
server side and instantiating a Stream from the server, then playing
the live stream and recording 1-2 s FLV files to be converted
by FFmpeg.
/A -
Publish an audio-only stream (HLS)
Hi,
the documentation says this:
"To serve streams over a cellular network, one of the streams must be audio-only. For more information, see HTTP Live Streaming Overview.
To publish an audio-only stream, enter the following in the Flash Media Encoder Stream field:
livestream%i?adbe-live-event=liveevent&adbe-audio-stream-name=livestream1_audio_only&adbe-audio-stream-src=livestream1
If the encoder specifies individual query strings for each stream, use individual stream names instead of the variable %i:
livestream1?adbe-live-event=liveevent&adbe-audio-stream-name=livestream1_audio_only
livestream2?adbe-live-event=liveevent&adbe-audio-stream-name=livestream2_audio_only
To generate a set-level variant playlist when using an audio-only stream, specify the audio codec of the audio-only stream. Specify the audio and the video codec of the streams that contain audio and video. For more information about using the Set-level F4M/M3U8 File Generator, see Publish and play live multi-bitrate streams over HTTP.
Does this mean that using one FMLE encoder (with the AAC plugin installed) I can stream BOTH an audio+video AND an audio-only stream via HLS to iOS clients?
If yes, how do I connect to the a+v stream and how do I connect to the audio-only stream in iOS? (To be more precise, I would need two m3u8 files; how do I create them in the provided generator?)
What about the need to provide multiple bitrates (for the audio)? Surely 64 isn't enough: multiple encoder machines or multiple encoder instances?
You are right. You don't need to provide multiple-bitrate audio in the variant playlist. So while specifying on FMLE
livestream%i?adbe-live-event=liveevent&adbe-audio-stream-name=livestream1_audio_only&adbe-audio-stream-src=livestream1
set adbe-audio-stream-src to an a+v stream whose audio bitrate is 64 kbps.
This will create 4 streams for you on FMS
livestream1, livestream2, livestream3, livestream1_audio_only
On encoders other than FMLE, where publishing multiple streams with a single publish operation is not supported, you may publish
1. livestream1?adbe-live-event=liveevent&adbe-audio-stream-name=livestream1_audio_only
2. livestream2?adbe-live-event=liveevent
3. livestream3?adbe-live-event=liveevent
This will create 4 streams for you on the FMS
livestream1, livestream2, livestream3, livestream1_audio_only
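As an illustration of the set-level variant playlist asked about above, it could look roughly like the following (a sketch; the host name, bitrates, CODECS strings, and URL path are assumptions and depend on your FMS HLS configuration, only the stream names come from the answer above):

```
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=64000,CODECS="mp4a.40.2"
http://fms.example.com/hls-live/livepkgr/_definst_/liveevent/livestream1_audio_only.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=500000,CODECS="avc1.66.30,mp4a.40.2"
http://fms.example.com/hls-live/livepkgr/_definst_/liveevent/livestream1.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1000000,CODECS="avc1.66.30,mp4a.40.2"
http://fms.example.com/hls-live/livepkgr/_definst_/liveevent/livestream2.m3u8
```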
Yes, there is no point publishing an audio-only live stream corresponding to each a+v stream. The Apple spec mandates only one audio-only stream of 64 kbps in the variant playlist. -
Playback of low bitrate flv or f4v from live stream in FMS causes player buffer to empty
We are experiencing a consistent issue when playing a low-bitrate (300 kbps or less) FLV in a live stream from FMS. Basically, the player will start off with the appropriate buffer, say 5 seconds, then begin dropping until it empties out, and will have to rebuffer. We've tried with a variety of FLV and F4V files, all 300 kbps or less, and we consistently get the issue. Is this something Adobe can investigate in FMS? Or are there any suggestions on how we can get around the issue?
Hey, I got a similar problem; the log looks like this:
2012-11-12 18:50:12 23434 (e)2661034 Connect failed ( , 5779212 ) : Connect failed: Connection refused (111)
2012-11-12 18:50:54 23434 (e)2661034 Connect failed ( , 5779212 ) : Connect failed: Connection refused (111)
2012-11-12 18:51:36 23434 (e)2661034 Connect failed ( , 1166880400 ) : Connect failed: Connection refused (111)
2012-11-12 18:54:14 23434 (e)2661034 Connect failed ( , 1175301776 ) : Connect failed: Connection refused (111)
2012-11-12 18:54:55 23434 (e)2661034 Connect failed ( , 1164775056 ) : Connect failed: Connection refused (111)
2012-11-12 18:55:37 23434 (e)2661034 Connect failed ( , 16 ) : Connect failed: Connection refused (111)
2012-11-12 19:13:08 23434 (e)2661034 Connect failed ( , 1158459024 ) : Connect failed: Connection refused (111)
it seems that the port number is invalid, but we never use such ports. -
Live Stream Issues with FP 11.2
We're seeing lots of bugs in NetStream when connecting live streams and publishing/playing the stream via direct peer connections.
Specifically, when attaching a new Camera, the receiving peer can no longer see the video stream. Also, when toggling publish(null)/publish(streamName) from the publisher side, and similarly toggling play(null)/play(streamName) from the subscriber side, the streams don't play (render video) anymore, even though I can see through the network that the bytes are coming through.
Along with this, we seem to be getting new kinds of NetStatus errors ever since we started testing: "NetConnection.Call.BadVersion" was one we've seen but can't make any sense of, given we're using DIRECT peer-to-peer connections for the NetStream classes.
Lastly, in general we're seeing lots of instability issues with the h264Settings applied to the NetStream; is this still in beta? Instability = crashing browser, mostly when showing multiple videos, such as in a group video chat.
Are any of you seeing this as well?
Hi Damorgan,
As you suggested, I did a fresh 11.2.0.2 installation and found that the mentioned components are valid in v$registry.
The only desupported component is Ultra Search, which I was asked to remove during the upgrade, and I did so.
COMP_NAME                  VERSION      STATUS
OLAP Catalog               11.2.0.2.0   VALID
Spatial                    11.2.0.2.0   VALID
OLAP Analytic Workspace    11.2.0.2.0   VALID
Also, the Metalink note ID 270588.1 indicates that the mentioned components exist for 11gR2.
DBUA was asking to install these components prior to upgrading. My first note has that DBUA instruction.
Is there any way to select the mentioned components with a "software only" installation prior to upgrading?
Thanks. -
Publish Live Stream with VP6 or H264
Hi all,
I am publishing a live stream using a web camera, but the quality with respect to bandwidth is not good.
I want to publish the live image using the VP6 or H.264 codec. How can it be done? Please help.
It works well with FMLE, but I am trying to do it in AS3.
Thanks in advance.
FMLE can only publish via rtmp or rtmpt.
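As a side note on the AS3 part of the question above: Flash Player 11 and later can encode the camera to H.264 in the player itself via NetStream.videoStreamSettings. A rough sketch, assuming an already-connected NetConnection named nc and a stream name of "livestream" (both are placeholders, and the resolution/bitrate numbers are only examples):

```actionscript
// Assumes Flash Player 11+; nc is an already-connected NetConnection.
import flash.media.Camera;
import flash.media.H264Level;
import flash.media.H264Profile;
import flash.media.H264VideoStreamSettings;
import flash.net.NetStream;

var cam:Camera = Camera.getCamera();
cam.setMode(640, 480, 25);          // resolution and fps
cam.setQuality(300 * 1000 / 8, 0);  // ~300 kbps bandwidth cap, variable quality

var ns:NetStream = new NetStream(nc);
var h264:H264VideoStreamSettings = new H264VideoStreamSettings();
h264.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);
ns.videoStreamSettings = h264;      // ask the player to encode H.264

ns.attachCamera(cam);
ns.publish("livestream", "live");
```

On Flash Player 9/10 this API does not exist, which is why older setups had to fall back to VP6 or an external encoder such as FMLE.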
-
I have a question: on what operating systems (Android, iOS) can a stream with the H.264 and PCMU codecs be played and published over the rtmp and rtmfp protocols (live streaming)?
As far as I found out, with these codecs it can be played on Android (rtmp protocol). On iOS the video is not displayed; as I understand it, the AIR environment is cutting it.
Another question regarding video texture: will support for live playing of H.264+PCMU on iOS be included in future releases?
On iOS, you'll need to be playing an HLS stream for H.264 to decode when streaming from a remote server.
-
Publish Live stream to FMS on RTMPE protocol
Hi,
I am trying to publish a live stream to FMS using FMLE or third-party encoders.
Is it possible to have the RTMPE or RTMPTE protocol used between the encoder and FMS?
Very urgent.
Thanks,
Pooja Jain
FMLE can only publish via rtmp or rtmpt.
-
How to secure publishing of live streams
Hi,
I have installed Flash Media Server on our server. When I load up the application home page, I see the demo video of a train.
Then I click on the "Interactive" tab on the right-hand side. I was SHOCKED to see that I can create a live stream from my local camera without any credentials at all. Anyone who visits this webpage can publish a live video stream on our Flash Media Server?
Please tell me there is a simple configuration change I can make to close up this gaping security hole.
Thanks
jephperro,
The advised approach for this is a combination of SWF Verification for your general content, to ensure that unintended SWFs cannot perform undesired actions on your FMS installation. If you make use of live publishing, then the FMLE authentication plugin is a necessary component as well.
All this information will be forthcoming in the FMS hardening guide - but for now I can help detail for you the solution you'll need to make sure that only your intended targets are using your FMS installation. Tell me a little about your setup, or email me offlist to do so @ [email protected] and we'll get you sorted.
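For context, SWF Verification is switched on per application in that application's Application.xml; a sketch of the relevant fragment is below (the folder path is a placeholder, and you should confirm the exact element names against the FMS configuration reference for your server version):

```
<Application>
  <SWFVerification enabled="true">
    <!-- Folder holding copies of the SWFs allowed to connect -->
    <SWFFolder>C:\allowed_swfs</SWFFolder>
  </SWFVerification>
</Application>
```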
Asa -
Blur Effect in live Streaming publish through camera
Hi Guys
I am currently working on a video chat application through FMS. The live stream is playing and publishing perfectly on both ends, but the thing is that I have to apply a blur effect to a particular portion of that live stream, I mean just like (a thief's face) or some moving object. Is it possible to apply blur to that live stream? If yes, then please tell me. Thanks in advance; I am totally confused.
Thanks and Regards
Vineet Osho
Hello,
There are no in-box components that will allow you to transmit captured video from a Windows Phone device to Azure Media Services. You will need to create your own implementation. You can do this by creating a custom Media Foundation sink that can plug into the MediaCapture element's pipeline. You will then need to modify the captured and compressed video to align with one of the two protocols that Azure Media Services can use. At this time Azure Media Services can use either Smooth Streaming or RTMP for ingestion.
You will need to create your own packager (or find compatible packager code from a 3rd party) to package the MP4-encoded data into the necessary format.
Here is an example of how to create a custom Media Foundation sink (please note you must write your Media Foundation components in C++/COM; we do not support Media Foundation components written in managed languages):
Real-time communication sample
I hope this helps,
James
Windows SDK Technologies - Microsoft Developer Services - http://blogs.msdn.com/mediasdkstuff/