Webcam video streaming question

Hello,
Before I buy Flash Media Server, I would like to know:
- Is it possible to crop the webcam video (because I need just one part of the whole picture) before streaming it from the client's Flash Player to the Media Server? (It would reduce the bandwidth a lot.)
- If not, can the player store some parts of the video until it is fully streamed, with all its data (30 fps), to the server? (E.g. a client with low bandwidth: I need all the frames the webcam records, and real time doesn't matter; if it takes 1 minute for a 20-second video, that would be OK.)
Thanks,
Peter

It is not possible to crop the video in the Flash Player. You can use the Camera.setMode method with the favorArea flag set to true to force an unsupported resolution, but that often results in lowered image quality.
You can't store video locally, but you can increase the buffer to minimize frame drops (see the bufferTime property of the NetStream class). When using a publishing buffer, the Flash Player will only start dropping frames when the buffer is full. I've never tried using a buffer of more than 10 seconds on a publishing stream, so I don't know if there is an upper limit (nothing in the docs about that). That said, if the user does not have sufficient bandwidth to publish in real time, frames will start dropping eventually... there is no way to completely avoid that AFAIK, other than reducing the publishing bitrate.
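For reference, here is a minimal publishing sketch along those lines (the server URL, application name and stream name are placeholders, not something from the original post):

import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.net.NetConnection;
import flash.net.NetStream;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmp://localhost/live"); // placeholder FMS application

function onStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var cam:Camera = Camera.getCamera();
        // Request 320x240 at 30 fps; the fourth (favorArea) argument
        // controls whether the Player favors the requested capture area
        // over the frame rate when no native capture mode matches.
        cam.setMode(320, 240, 30, true);
        var ns:NetStream = new NetStream(nc);
        // Publishing buffer: outgoing frames queue here and are only
        // dropped once the buffer is full.
        ns.bufferTime = 10;
        ns.attachCamera(cam);
        ns.publish("webcamStream", "live"); // placeholder stream name
    }
}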

Similar Messages

  • Flex Mobile iOS app, and http live video streaming Question.

    I have submitted our app (which is mostly a video playing app) to Apple, only to have it rejected for the "9.4: Video streaming content over a cellular network longer than 10 minutes must use HTTP Live and include a baseline 64 kbps audio-only HTTP Live stream" reason. I was doing progressive download of the files, but now it looks like we will need to re-encode all of our media, which will probably take months and months.
    My question is this: we currently use Akamai as our CDN, with their HD Flash delivery system. Their HD Flash is built to do dynamic bitrate streaming similar to Apple's HTTP Live. If we re-encode our video to multi-bitrate and use the Akamai technology to do the adaptive streaming, will this be good enough for Apple, or will we in fact need to use their HTTP Live protocol? Has anyone encountered this problem, or have any experience? I would like to know before we re-encode all of our media only to find out that we will be forced to use Apple's technology instead. Any help is much appreciated!

    Unless I am mistaken here, you must implement Apple's HTTP Live protocol for streams over ten minutes in length. There are a few options you can take:
    1. Split your streams into 9-minute segments and feed them consecutively in groups. (Apple actually offers a segmenting tool that even generates an index file, via their Developer tools.)
    2. Only enable full-length streaming over WiFi.
    3. Fully implement HTTP Live....
    Also, do not forget that not only are you going to have to implement HTTP Live, but you are ALSO going to have to include an audio-only stream.

  • Webcam video streaming

    hi,
    I want to use my web camera to live stream. I will use my web camera to get video, and I will show the users who visit the site my webcam's display. How can I do this with Flex? I get my webcam display with this:
    private function videocreate():void {
        var camera:Camera = Camera.getCamera();
        if (camera) {
            videoDisplay.attachCamera(camera);
        } else {
            Alert.show("You don't have a camera");
        }
    }

    Thanks myIP,
    after your message I installed Flash Media Encoder 2.5 and Flash Media Server 3. Can you explain the main workflow to me? I understand it like this:
    I use the Media Encoder to capture the video; it gives me a link like this: rtmp://localhost/live/livestream, and then I use it in Flex 3 like this:
    <mx:VideoDisplay source="rtmp://localhost/live/livestream"
    height="254" width="200"/>
    If I use the encoder and server, do I need to use the NetStream and NetConnection classes?
    And if I do these steps on the server, can I show the display (webcam plugged into the server) from the Flex client? This is very important for me. Thanks again for your help.
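    For what it's worth, here is a sketch of the NetConnection/NetStream approach asked about above, using the same rtmp://localhost/live/livestream stream; treat it as an illustration under those assumptions rather than a confirmed answer:
    import flash.events.NetStatusEvent;
    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://localhost/live");
    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            var video:Video = new Video(320, 240);
            video.attachNetStream(ns);
            addChild(video); // in a Flex app, add the Video to a UIComponent and add that instead
            ns.play("livestream");
        }
    }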

  • Saving a webcam video stream

    I'm pretty new to Flash. I've figured out how to add a webcam, but is there a way to save a video from the webcam feed onto the PC running the Flash app?

    As far as I know, Flash Player doesn't have permission or functionality to write files to your hard drive, as this would present a big security problem. You might have to create an Adobe AIR application to get this kind of functionality.
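    As a rough AIR-only sketch (assuming AIR 3.3+ for BitmapData.encode; the file name and location are made up for illustration), you could at least grab snapshots of the webcam feed and write them to disk. Saving a full video file would likely still require server-side recording or extra encoding work:
    import flash.display.BitmapData;
    import flash.display.PNGEncoderOptions;
    import flash.filesystem.File;
    import flash.filesystem.FileMode;
    import flash.filesystem.FileStream;
    import flash.media.Video;
    import flash.utils.ByteArray;
    function saveSnapshot(video:Video):void {
        // Copy the current video frame into a bitmap.
        var bmp:BitmapData = new BitmapData(video.width, video.height);
        bmp.draw(video);
        // Encode it as PNG and write it to the user's documents folder.
        var png:ByteArray = bmp.encode(bmp.rect, new PNGEncoderOptions());
        var file:File = File.documentsDirectory.resolvePath("snapshot.png");
        var fs:FileStream = new FileStream();
        fs.open(file, FileMode.WRITE);
        fs.writeBytes(png);
        fs.close();
    }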

  • Saving video stream from webcam periodically!!!

    Hi there, I want to capture the video stream from the webcam (or from a video file) in one-minute chunks and save each chunk into a separate file. Of course, I would like to have the possibility of saving many streams simultaneously. Any ideas?
    Best regards

    Hi,
    I want to do the same thing ... I was able to stream audio through the socket but could not stream video.
    I posted my question at http://forum.java.sun.com/thread.jspa?threadID=674676&tstart=15
    and presented a sketch of my implementation design. But until now no one has answered my question.
    This is how you should hook up the components on the transmitter side
    Webcam --> Processor --> Custom DataSink --> PushBufferStream --> PushBufferHandler --> Socket --> Network.
    You can implement the transferData() method of the PushBufferHandler to write to a socket.
    And on the Receiver side the components should be hooked up like this
    Network --> Socket --> PullBufferStream --> custom DataSource --> Player
    Now as I mentioned in
    http://forum.java.sun.com/thread.jspa?threadID=674676&tstart=15
    the audio works fine but the video does not. The video does not render on the receiver side.
    Note: I tried TCP sockets with audio and it is terrible (not practical at all), so I reverted to UDP datagrams and the result was better than I expected. I didn't take care of the ordering of received packets at the receiver, but it still worked fine.
    I am still waiting for some help on why the video is not rendering and how I can solve this problem.
    I am working on a project and time is running out.
    Any help?
    -Adel

  • Video stream recording (not webcam) help required

    I am making a media player / recorder type web application in Flex 4.
    The application plays video streams streamed from my Red5 server.
    I want to give users the ability to make / record clips of the video they are viewing.
    I know that NetStream.publish is used to record/save video on the server.
    All the help I've found on Google so far only gives examples of saving video from a webcam,
    but I want to save video from a stream coming from the Red5 server and being played
    in a VideoPlayer control in Flex.
    All help is greatly appreciated.
    Thanks.

    Thanks for the reply, but I think you misunderstood my problem.
    I am not receiving an RTP stream but an RTMP stream. RTMP is a proprietary protocol of Adobe for streaming Flash video from Flash Media Server (FMS).
    To restate the problem:
    I am getting an RTMP stream, and I want to convert it into an RTP stream so that I can process or transmit it further. I want to convert this RTMP stream to RTP format.
    Can anyone please help me out? I have been stuck on this for the past 2 days and it is very critical for my project. :-(

  • Director 11 using a webcam or live video streaming input

    I am looking for a solution to use input from a webcam or from any other live video stream in Director 11.
    I have tried using a Flash component but this does not work.
    The MyronXtra does not work with Director 11 either!
    Does anyone have an idea?
    Thank you

    You can use this demo from Valentin:
    http://valentin.dasdeck.com/xtras/ar_xtra/win/arxtra_win_v019.zip
    It could help.
    Brahim AYI

  • Multiple webcams composite video streaming

    I am relatively new to Media Foundation, and I have just finished building my first single-webcam streaming application using 'Developing Microsoft Media Foundation Applications'. I have multiple dynamic webcam sources (the number of webcams can change), and I need to combine them using the video mixer to form a composite video stream, much like multiple CCTV live streams shown on one screen. At the same time, I need to be able to access each individual video stream and highlight / bring into focus / zoom in when the cursor is hovering over a particular video stream (event listening on the video stream).
    I need a set of guidelines for working with the media session, topology, video mixer, and video presenter to implement the functionality above. Any help is greatly appreciated. I am not looking to use DirectX to achieve this; I want to stay within the native Media Foundation architecture.

  • Capturing, timestamping and saving a video stream from a webcam ?

    Hello everybody,
    Glad to be in this forum.
    I have to write a program in Java that does the following:
    - capture a video stream from a webcam
    - add the time and date to it
    - save the video stream in a video format compatible with Linux and Windows
    Is it possible to do that in Java (using JMF, I guess)?
    Could anyone help me make progress on my project?
    Thanks a lot
    agussi

    Hi agussi, yes, it's possible to do that in Java using JMF.
    I'm not a JMF expert, but I've found this code very useful: http://forum.java.sun.com/thread.jspa?threadID=570463&tstart=50
    This piece of code shows the webcam stream, and you can take snapshots with it. With a little tweaking, you can do what you need.
    I hope this helps.
    Ivan

  • How to take advantage of H.264 video streaming of webcam BCC950

    We used Flex 4.6 to develop a web app using the Logitech BCC950 webcam. We used GraphEdit from DirectShow to learn that the BCC950 has pin 0 supporting i320, RGB24, and MJPEG streams, and pin 3 supporting an H.264 stream. We also set H264VideoStreamSettings in AS code to ask for H.264 data directly from the webcam, but judging from the CPU consumption, Flash Player only fetches the pin 0 uncompressed data rather than H.264 data directly. We need to take advantage of the BCC950's H.264 support to get hardware-compressed H.264 data from the webcam and avoid compression by CPU/software. As far as we know, Flash does not give programmers a choice of which pin to get data from, so what do we need to do to make sure we get the H.264 video stream from the webcam directly? Thanks. Henry
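    For context, this is roughly how H264VideoStreamSettings is applied on the publishing side in Flash Player 11+ (the server URL and stream name below are invented for illustration); it configures the encoder inside the Player rather than selecting a capture pin on the camera, which matches what the post observes:
    import flash.events.NetStatusEvent;
    import flash.media.Camera;
    import flash.media.H264Level;
    import flash.media.H264Profile;
    import flash.media.H264VideoStreamSettings;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://localhost/live"); // placeholder server URL
    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            var h264:H264VideoStreamSettings = new H264VideoStreamSettings();
            h264.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);
            // Applied to the publishing stream: the Player encodes the
            // captured frames itself; it does not pull pin 3 of the BCC950.
            ns.videoStreamSettings = h264;
            ns.attachCamera(Camera.getCamera());
            ns.publish("bcc950", "live"); // placeholder stream name
        }
    }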

    ...well, I did follow that article on playing an H.264/MPEG-4 video in Flash. It works well, but how would you add playback controls?
    The FLVPlayback component only allows you to source an FLV video, not H.264/MPEG-4.
    This is probably a simple AS3 answer... but I am not knowledgeable enough in AS3 to know what code to add.
    Any suggestions or directions I should look in would be great.
    In the meantime I am researching ActionScript 3 to find an answer.
    E.
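    A minimal sketch of playing an H.264/MP4 file through NetStream with a basic pause/resume toggle, as a stand-in for FLVPlayback-style controls (the file name is a placeholder):
    import flash.events.MouseEvent;
    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    var nc:NetConnection = new NetConnection();
    nc.connect(null); // progressive download, no media server
    var ns:NetStream = new NetStream(nc);
    ns.client = { onMetaData: function(info:Object):void {} };
    var video:Video = new Video(640, 360);
    video.attachNetStream(ns);
    addChild(video);
    ns.play("video.mp4"); // placeholder path to the H.264/MP4 file
    var paused:Boolean = false;
    stage.addEventListener(MouseEvent.CLICK, function(e:MouseEvent):void {
        // Toggle pause/resume on click; a real player would wire this
        // to proper buttons and a seek bar.
        if (paused) { ns.resume(); } else { ns.pause(); }
        paused = !paused;
    });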

  • Synchronize on video stream with mplayer

    Hello,
    I recently bought an EasyCap DC60+ USB adapter. It is an RCA-to-USB adapter which is recognized as a USB webcam by the kernel.
    I installed the easycapd60 module and it works fine for video with no sound, but I can't get audio working correctly. It looks like my audio stream is 1 second late on the video stream (I get the sound one second after the image).
    The problem is that mplayer is synchronizing the playback on the audio stream (I think). That means that my video stream is delayed in order for it to be in sync with the audio stream. As I want to use this DC60+ adapter to play video games on my computer screen, I can't tolerate a 1 second delay.
    A solution to this problem would be to ask mplayer to synchronize on the video stream instead of the audio stream, then play with the A/V sync parameters to get the audio synchronized. But I can't find the parameter to do this in the mplayer manpage.
    So my question is the following: does this parameter exist? And is there another solution I didn't think of which could solve my problem?

    Did you have to do anything special to get the DC60+ to work?  I've got one installed using the em28xx driver and it eventually makes my USB mouse stop responding, and sometimes the whole system crashes after that.

  • Video stream java code

    hey
    I am trying to find out how to obtain a video stream from a webcam using Java code.
    If you can help me, please do.

    ars_03 wrote:
    i am trying to find how to obtian a video stream from a webcam using java code
    if u can help me please.
    General questions usually stimulate general answers, such as "google it". If you truly want help, you will put more effort into your question, such as answering these questions:
    * What do you know already about doing this?
    * What sources have you looked up and studied?
    * Of this, what specifically do you not understand?
    * What is the current state of your program?
    * Is it running? compiling?
    * Are there errors? if so, post them.
    etc...
    The bottom line here is usually the more thought and effort you put into creating your question, the better your chances are of a volunteer here taking the time and effort to consider it and give you a helpful answer. You would benefit greatly by reading this link (one I try to re-study at least every other week as it helps me):
    [http://www.catb.org/~esr/faqs/smart-questions.html]

  • P2p video streaming using jmf (is it possible to "forward" the stream ?)

    Hello
    In my project, a peer will start streaming captured video from the webcam to his neighbors, and then his neighbors will have to display the video and forward (stream) it to their neighbors, and so on. So my question is: can I do this using JMF? A simple scenario would be: peer_1 streams to peer_2, and then peer_2 forwards (streams) the video received from peer_1 to peer_3.
    I've read the JMF 2.0 guide and I've seen that it's only possible to stream from a server to a client and that's about it, but I also want the client to pass the stream forward to another client... like, for example, [http://img72.imageshack.us/img72/593/p2pjmf.gif|http://img72.imageshack.us/img72/593/p2pjmf.gif]
    I want to know at least if this is possible with JMF or if I should start looking for another solution. Do you have any examples of such projects, or examples of forwarding the stream with JMF?
    thanks for any suggestions

    _Cris_ wrote:
    I want to know at least if this is possible with JMF or if I should start looking for another solution. Do you have any examples of such projects, or examples of forwarding the stream with JMF?
    You can do that with JMF. Once you receive the stream, it's just a video stream. You can do anything you want with it... display it, record it, or send it as an RTP stream.

  • Hardware to attach to Macbook pro for live video stream?

    Thinking of buying a MacBook Pro for video streaming of the live events and conferences we do.
    My question is: what USB video add-ons are compatible with Flash Media Live Encoder? I can't find a list anywhere. Are there any cheap USB video input devices that will work with the software for live streaming?
    Thanks

    I think anything with a properly installed driver for your OS will work in the encoder; the iSight on my MacBook Pro works fine, as do 2 other cheap webcams on Windows.
    My suggestion is to purchase a cam from a retailer that offers refunds for pretty much any reason; you'll probably find something that works on the first try, and if not, try again.

  • Video Streaming from multiple camera to cell phone.

    I have about 5 webcams on a network in a different city. For one of these webcams I have been successful at setting up live video streaming to my cell phone. However, when I tried to set up and view a second camera (same model), I could not get it to successfully load and view. The second camera also has a different access code.
    Any Suggestions?

    Hi,
    the cameras are WVC54GCA.
    The router is a WRT300N.
    Like I stated earlier, one of these cams I can stream fine to my BlackBerry Storm (port 554 is forwarded to the IP address of the camera), but I can't seem to configure the other for cell phone viewing.
    All cameras can be viewed over HTTP from computers.
