Blur effect on a live stream published through the camera

Hi guys,
               I am currently working on a video chat application through FMS. The live stream plays and publishes perfectly on both ends, but I need to apply a blur effect to a particular portion of that live stream, for example a person's face or some other moving object. Is it possible to apply a blur to a live stream? If so, please tell me how. I am totally confused. Thanks in advance.
Thanks and Regards
Vineet Osho
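
For what it's worth, a minimal client-side sketch (not from this thread; it assumes the subscriber already has a playing NetStream named ns): applying a BlurFilter to the Video display object blurs the local rendering of the stream, though not the transmitted video itself.

```actionscript
import flash.filters.BlurFilter;
import flash.media.Video;

// Assumes ns is a NetStream already playing the live stream
var video:Video = new Video(320, 240);
video.attachNetStream(ns);
addChild(video); // inside a Sprite or the document class

// Blurs only the local display of the stream, not what is published
video.filters = [new BlurFilter(8, 8, 2)];
```

To blur only a region (a face, say), one option is to draw the Video into a BitmapData each frame, blur just that rectangle with BitmapData.applyFilter, and display the result instead of the Video object.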


Similar Messages

  • Can I publish static content to Live stream

    I have a requirement to publish static content, such as the video files listed in a playlist, to a live stream. It is like creating a TV station with pre-recorded content delivered as a live stream. It would be great if anyone could let me know whether this is possible with FMS.

    Yes, it is very much possible.
    What you need to do is place all your files on the server and play them via a server-side stream. You can go through the FMS documentation and you should be able to get it working, but I will briefly tell you here how to go about it.
    Create an application folder under the applications directory of FMS; let us name it "test". Create a folder called "streams" under "test", and then under "streams" create a "room" folder. So the structure is as follows: <FMS installation directory>/applications/test/streams/room. Now place all your static content here, say one.mp4, two.mp4 and three.mp4.
    Now you want to give effect of live stream.
    Create a file called main.asc under "test" folder.
    Write below code in main.asc:
    var mylivestream;
    application.onAppStart = function(){
         mylivestream = Stream.get("mp4:livestream.mp4");
         mylivestream.play("mp4:one.mp4", 0, -1, true);
         mylivestream.play("mp4:two.mp4", 0, -1, false);
         mylivestream.play("mp4:three.mp4", 0, -1, false);
    };
    The above code plays the three static files one after another. A subscriber client would then subscribe in the following manner:
    ns.play("mp4:livestream.mp4", -1, -1);
    Please note that the above is sample code, not the most optimal or complete; it will need tweaking depending on your requirements.
    Hope this helps

  • Publish Live Stream with VP6 or H264

    Hi all,
            I am publishing a live stream using a web camera, but the quality relative to the bandwidth used is not good.
            I want to publish the live video using the VP6 or H.264 codec. How can this be done? Please help.
           It works well with FMLE, but I am trying to do it in AS3.
    Thanks in advance.

    FMLE can only publish via RTMP or RTMPT.
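
    As a point of reference: since Flash Player 11, AS3 itself can request H.264 encoding for a published camera stream via H264VideoStreamSettings. A hedged sketch (the server URI and stream name are placeholders); note that AS3 camera capture cannot encode VP6, so the choice is the default Sorenson Spark or, from Flash Player 11 onward, H.264:

    ```actionscript
    import flash.media.Camera;
    import flash.media.H264Level;
    import flash.media.H264Profile;
    import flash.media.H264VideoStreamSettings;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    nc.connect("rtmp://your-fms-host/live"); // placeholder URI

    // After NetConnection.Connect.Success arrives:
    var ns:NetStream = new NetStream(nc);
    var h264:H264VideoStreamSettings = new H264VideoStreamSettings();
    h264.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);
    ns.videoStreamSettings = h264;       // request H.264 camera encoding
    ns.attachCamera(Camera.getCamera());
    ns.publish("livestream", "live");
    ```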

  • How do I create a variable video delay of a live stream...

    I have a live video feed from my webcam in my Flash application, and a second video window next to it in which I want to show a variably delayed copy of the live stream. I therefore need to save the video stream to memory or disk and create this variable delay, say 5-60 seconds, while continuing to capture the live stream. The camera I am using supports H.264 encoding.
    How do I create a variable-length queue or buffer to hold the video stream coming into the Flash application? Do I create a memory variable or write this to disk? I have been looking over the ActionScript 3.0 documentation and I cannot figure out how to code this either way (as a memory variable or as a write-to-disk queue).
    I want to be able to change the delay, view the delayed stream in slow motion, and scrub through the delayed stream.
    I would like to do this with out having to use Flash Media Server.
    Thanks,
    Bob

    I am sure it is practically impossible to accomplish that on the client side. It is theoretically conceivable to use NetStream.appendBytes(), but it would require an extremely complex implementation.
    I suggest you look into FMS DVR capacities:
    http://help.adobe.com/en_US/flashmediaserver/devguide/WS236AE81A-5319-4327-9E44-310A93CA09C6Dev.html

  • Synchronous live streams

    Wondering if FMS 3.5 is capable of turning multiple live streams from web cams into a single stream to go out to client machines? I am not interested in concatenating the streams, but rather overlapping them.
    If this is not available through FMS 3.5, are there other tools available by which this could be achieved programmatically?
    Thanks.

    I would work backward in this case. Take for instance a home DSL connection with 1.5 Mbps downstream. Given that constraint, you would need to aggregate 5 webcam streams over it. This compromises the bitrate (i.e. visual quality) of the content, as each stream could take up less than 300 kbps once you account for TCP overhead. This means the webcams need to encode the content to meet these requirements.
    Now by doing so, you have 5 separate streams coming from the webcams, which you want to turn into a "single one". You can multiplex these 5 streams over a single NetConnection (or 5 streams over 5 different NetConnections), but the network footprint will not be any smaller, because those streams have already been encoded at that bitrate.
    So the moral of the story is:  find a good encoder.

  • Mirroring a live stream

    I need to take a live stream from a camera attached to server A and mirror it to server B, so that local users on server B can view it. I don't want it saved to server B; I want it to remain live. I'm using a push model: server A sets this up.
    I think that if I create a NetConnection on server B looking back at server A, I can use it to add a NetStream attached directly to server A's camera. Is this correct?
    I am as yet a very inexperienced Flash developer, and my efforts are complicated by the fact that I am modifying someone else's code using someone else's design patterns.
    Thanks.

    Right.... server B needs to make a NetConnection to the server A application, and then play the stream using Stream.play. Making a NetConnection in SSAS is almost identical to CSAS. See the docs for the server-side NetConnection and Stream classes; there are some examples in there.

  • AIR SDK 16 (and 15): live stream video pauses when from non-iOS to iOS

    Using either sdk 15 or 16, when live streaming video through media server, if I go from iOS to iOS devices, no issue. If I go from non-iOS device to iOS device, the non-iOS device receives the video and audio fine from the iOS device, but on the iOS device the video it receives freezes (almost from the start, after a couple of frames, then long, long pause of freeze - minutes long - then a few more frames, then freezing, etc.) - the audio is fine, it appears to be just the video. I have tested this with android to android (fine), android to desktop (fine), iOS to iOS (fine), android to iOS (freezing video only on iOS side), iOS to desktop (freezing only on iOS side).

    VideoTexture is a HUGE improvement for video on Windows and iOS. I am testing on Windows 8 and iPhone 6, but VideoTexture is still in beta and will take a few more months to get the kinks worked out. It is working for me on iOS sometimes, and whenever it works the pixels have a transparency applied to them. I have logged that pixel transparency bug here:
    https://bugbase.adobe.com/index.cfm?event=bug&id=3936111
    Also you will notice that Adobe has mentioned the following in the known issues section:
    [Air Desktop] [Video Texture] Video is not playing if texture is used as ConcreteTexture of starling.(3949908)
         (Release Notes | Flash Player® 17 AIR® 17)
    On iOS, VideoTexture seems to work for MP4s loaded directly from the file system, but RTMP support is not ready yet. When it is ready, I suggest you adopt it immediately, since VideoTexture is where Adobe is focusing its video efforts.

  • Consuming live streaming in Windows Store app with C#

    I need to know how I can use a URL for camera live streaming. The camera is located in a remote place and is used in a radar installation on the street. How do I consume the live stream from this camera?
    Ahmed Abd El-Karim

    Hi Ahmed,
    Real Time Messaging Protocol (RTMP) seems to be what you need; the URL would start with "rtmp://".
    However, this is not supported in Windows Store apps; if you need this functionality, you will probably have to write your own code. See more information at:
    https://social.msdn.microsoft.com/Forums/en-US/f1e3aff5-3a12-492b-b383-7886c2eb9f49/playing-rtmp-live-video-stream-in-windows-store-app-c?forum=winappswithcsharp
    --James

  • How do I use multiple cameras to live stream through FME?

    I am looking to live stream pool tournaments from multiple angles, but I don't know what software or hardware I might need. Does anybody have any good how-to advice or links they might share? I stream through Ustream.tv, if that makes a difference. Should I look for something different? Thanks

    I am working on getting just the counter working by using the program posted previously, and I am running into issues. Periodically I get the error:
    Error -200141 occurred at DAQmx Read (Counter DBL 1Chan 1Samp).vi
    Possible reason(s): Data was overwritten before it could be read by the system. If Data Transfer Mechanism is Interrupts, try using DMA. Otherwise, divide the input signal before taking the measurement.
    It seems to work better if I use cascaded counters, but I need timer 0 for analog channels when I run this code along with the program for the other measurements.
    I have tried averaging, and selecting different values for the millisecond timer, and these did not seem to have an effect. I tried different DAQmx configurations, and "Counter DBL 1Samp" seemed to work the best. The program will work for a while and then give me the above error message.
    If I use counter 0 as a cascaded counter input, the program runs fine. If I run this with other analog channels, it errors out, because the analog channels use counter 0. If I use counter 1 as a cascaded counter input, it seems to work better than a single channel, but it will still error out with the above error. If I use only counter 1, I get the above error even faster.
    Also, none of the configurations give measurements outside the While Loop. The only place I can add a speed dial for the front panel is within the While Loop. Is there some way to get the signal to continuously send out of the While Loop? I thought if I could get the signal out of the While Loop, I could condition it any way I wanted without the program erroring out.
    Any suggestions would be much appreciated. Thank you.
    Attachments:
    Counter_error.jpg (45 KB)

  • How to use Phone camera for live streaming rather than using WireCast camera in Live streaming using Azure Media Services

    Hi,
    I am planning to develop a live streaming app using C# and XAML.
    I have gone through the sample below and am able to stream live content using a Wirecast camera.
    Azure Media Services Live Streaming Sample
    But I want to use the phone camera to stream the live video rather than a Wirecast camera.
    How can I do that? And how do I get the published URL from the C# code; is there an API for that?
    Anybody, please help me.
    Regards,
    Santhosh

    Hello,
    There are no in-box components that will allow you to transmit captured video from a Windows Phone device to Azure Media Services. You will need to create your own implementation. You can do this by creating a custom Media Foundation sink that can plug into the MediaCapture element's pipeline. You will then need to modify the captured and compressed video to align with one of the two protocols that Azure Media Services can use for ingestion: at this time, either Smooth Streaming or RTMP.
    You will need to create your own packager (or find compatible packager code from a 3rd party) to package the MP4-encoded data into the necessary format.
    Here is an example of how to create a custom Media Foundation sink (please note that you must write your Media Foundation components in C++/COM; Media Foundation components written in managed languages are not supported):
    Real-time communication sample
    I hope this helps,
    James
    Windows SDK Technologies - Microsoft Developer Services - http://blogs.msdn.com/mediasdkstuff/

  • Live Streaming Using my Webcam As a secruity camera

    I'm lost here; someone please help. I'm trying to use my iSight as a security camera over the Internet so I can watch the dog I just got.
    So the question is: what software can I use to make my webcam act as a camera I can watch live? (My PowerBook is kind of old; I'm still running Panther, 10.3.9, not 10.4. Can I work around this somehow?)
    I know I need my IP address, and I have that, but once I hook everything else up, where do I go when I'm somewhere else, say at work, and want to look at my live stream at home? Thank you to anyone who can help. : )

    Hello, keith28
    Until we hear back from juanorfrankie, here is my suggestion for your question:
     I would like to access the isight using my iPhone.
     Can anyone make some suggestions?
    If the iPhone can support viewing a streaming webcam page, use EvoCam with your iSight to publish a webcam page with streaming video. That would probably be the simplest way to do what you want, but because I do not have an iPhone, I cannot test this.
    See the instructions and FAQs at EvoCam for system requirements and details on how to set up your webcam page.
    EZ Jim
    PowerBook 1.67 GHz   Mac OS X (10.4.10)    G5 DP 1.8  External iSight

  • Must i have FMS to publish a stream from my camera ?

    Hello,
    I am building a live streaming website and I am using:
    1- FMS
    2- Apache web server
    I have made the subscriber and it works well,
    but I need to build the publisher to allow users to broadcast the stream from their cameras.
    I have tested a publisher that was built using ActionScript, and it didn't work until I installed FMS on my localhost; but I need a publisher that any user can use in his web browser.

    Are you asking whether each publisher needs to have FMS on their machine in order to publish a stream? Then the answer is NO. There is absolutely no need for someone to have FMS installed on their machine to publish a stream; all they need is a microphone and camera.
    Probably your publisher code was wrong and hence it did not work. Maybe you just need to put the correct URI of your FMS machine in your connection URI instead of "localhost".
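
    A minimal browser-based publisher along those lines (a sketch; the URI and stream name are placeholders to replace with your own):

    ```actionscript
    import flash.events.NetStatusEvent;
    import flash.media.Camera;
    import flash.media.Microphone;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://your-fms-host/publishApp"); // placeholder URI

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.attachCamera(Camera.getCamera());
            ns.attachAudio(Microphone.getMicrophone());
            ns.publish("userStream", "live"); // viewers play "userStream"
        }
    }
    ```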

  • Re-Start a Live Stream using AMS 5 through Cloudfront

    We have AMS 5 running on an Amazon EC2 instance. We send a Live Stream from an FMLE encoder using H.264/AAC.
    If we do an Origin Pull from that instance running AMS 5 and live stream through one of our commercial CDN accounts, we can stop and re-start our live stream without issue. It stops and re-starts as it should. When we re-connect to the AMS using FMLE, via the Origin Pull, the stream re-starts in either the OSMF or JW Player 6 in under 30 seconds.
    If we use that same AMS 5 on the same EC2 instance and live stream through an Amazon Cloudfront distribution, then when we stop and re-start our live stream, it takes an unacceptably long time for the stream to refresh, if it ever does. Basically, after the re-start of the live stream through FMLE, the live stream at the player plays the first 10 or 15 seconds of the initial live stream and then stops. If you refresh the player, it will play that same 10 or 15 seconds again, and stop again.
    It would seem that the AMS 5 is setup correctly. It works through our commercial CDN.
    It also would seem that the issue is with the cache on Amazon Cloudfront. Our assumption is that the Cloudfront cache is not properly updating. But we have tried about every variable in every setting for Cloudfront that we can find. We can't get past this.
    Any suggestions?
    Thanks
    As a followup to this discussion, you can follow the issue through the AWS.Amazon forums at https://forums.aws.amazon.com/thread.jspa?threadID=120448&tstart=0

    We also have other threads going.
    http://forums.adobe.com/thread/1180721?tstart=0
    https://forums.aws.amazon.com/thread.jspa?threadID=120448&tstart=0
    There appear to be two different issues: one issue with Amazon Cloudfront and a second issue with Adobe Media Server. Unfortunately, they are tied together in the real-world application of these programs.
    In AMS, when we use adbe-record-mode=record, the AMS works fine because it clears the streams on each re-publish. Unfortunately, this causes a status-code error on Amazon Cloudfront. And that particular status-code error from a custom origin server causes Cloudfront to "override the minimum TTL settings for applicable cache settings". As the programmers explained, what basically happens is that Cloudfront overrides our 30-second cache settings and falls back to a standard cache setting of 5 minutes. And waiting 5 minutes to re-start a live stream just doesn't work in the real world for real customers.
    We can fix that Cloudfront status code error issue by using adbe-record-mode=append on the AMS. If we use that AMS setting, then we can stop a Live stream, re-start the stream, the Cloudfront cache clears and everything works - unless you also change the stream settings in your FMLE. If you stop the stream and change the video input or output size in the FMLE, then the AMS server gets stuck and can't/won't append to the f4f record file.
    Does anyone have any ideas about how we can adjust all of this so that it works together?
    Thanks

  • How to secure publishing of live streams

    Hi,
    I have installed Flash Media Server on our server.  When I load up the application home page, I see the demo video of a train.
    Then I click on the "Interactive" tab on the right-hand side. I was SHOCKED to see that I can create a live stream from my local camera without any credentials at all. Anyone who visits this web page can publish a live video stream on our Flash Media Server?
    Please tell me there is a simple configuration change I can make to close up this gaping security hole.
    Thanks

    jephperro,
    The advised approach is a combination of SWF Verification for your general content, to ensure that unintended SWFs cannot perform undesired actions on your FMS installation. If you make use of live publishing, then the FMLE authentication plugin is a necessary component as well.
    All this information will be forthcoming in the FMS hardening guide, but for now I can help detail the solution you'll need to make sure that only your intended targets are using your FMS installation. Tell me a little about your setup, or email me off-list at [email protected], and we'll get you sorted.
    Asa

  • Publishing multi-bitrate live streams for iOS

    I'm having difficulties publishing multi-bitrate live streams that can be viewed on an iPad/iPhone.
    I'm running Flash Media Server 4.5, and have carefully followed Adobe's tutorial on streaming live media over HTTP using HLS, found here: http://help.adobe.com/en_US/flashmediaserver/devguide/WSd391de4d9c7bd609-52e437a812a3725dfa0-8000.html#WS0432746db30523c21e63e3d12e8340f669-8000
    After completing the above tutorial, the video can be seen just fine on my desktop computer via the flash plug-in, but it's not working in iOS...
    When I go to what I believe to be the proper URL, (http://myflashmediaserver.net/myevent.m3u8),  I get an error message on my iPhone and iPad saying "The operation could not be completed".
    I have created two set-level manifest files (one .f4m and one .m3u8) for the live stream using the Set Level F4M/M3U8 File Generator and saved them to the WebRoot directory, but alas, no love.
    Any help would be greatly appreciated!
    Mike

    I just finished going through the short and sweet tutorial on the Adobe website "Capture, encode and stream live multi-bitrate video over HTTP to Flash and iOS", which confirmed that I seem to be doing everything right, but I'm still getting the "The operation could not be completed" error message on both iPad and iPhone.
    Grasping at straws, I'm wondering if it could have something to do with some of the "hacks" I was asked to make in the other tutorials, (which, oddly enough, weren't mentioned in the tutorial mentioned above).  Specifically:
         • Edit FMLE config file on the Mac I'm encoding the live video on (change <streamsynchronization> from false to true)
         • Delete/Rename the "manifest.xml" file in applications/livepkgr/events/_definst_/liveevent directory
         • Edit "event.xml" in applications/livepkgr/events/_definst_/liveevent (change <segmentduration> from 40,000 to 16,000)
    However, I've tried running this with the above hacks as well as in their non-hacked state and am still not seeing success.
    Please advise.  Thanks!
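
    For reference, the <segmentduration> hack in the bullet list above corresponds to an Event.xml roughly like the following (a sketch based on the default FMS 4.5 livepkgr event file; element names and values should be verified against your own install):

    ```xml
    <!-- applications/livepkgr/events/_definst_/liveevent/Event.xml -->
    <Event>
      <EventID>liveevent</EventID>
      <Recording>
        <FragmentDuration>4000</FragmentDuration>
        <SegmentDuration>16000</SegmentDuration>
      </Recording>
    </Event>
    ```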
