C-series h.264 streaming ~ broadcast confidence monitor

Hi all ~
Has anyone simply viewed the H.264 stream with some type of software-based decoder/codec? If so, what was the path/URL to view it? Any comments will be appreciated!
I've built a custom Java application for a broadcast studio handling call & cam control with on-screen diagnostics etc.
- they use a lot of Sonys but are now bringing in Cisco C-series.
The application as it stands grabs the far-end camera's snapshot.jpg and displays it on the user interface in a little
.jpg viewer with PTZ/call controls... just to let operators know the video is happy...
Well, now with the C-series I don't get to do that, since TC5 and above... well, within budget anyways...
To get the C-series snapshot I have two options: build a Base64 .json decoder which would emulate
how the web interface on the C40 displays the snapshot now (very costly), or grab it via SCP out of the root/tmp/snapshots folder...
The issue here is that I have to copy it from the codec and put it on a server, which would not be so bad, but the overall design
allows for a lot of codecs... so, ideally, in three years I could have 40+ codecs to copy that .jpeg
from to the server, constantly overwriting the file etc... not pretty...
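For what it's worth, the constant-overwrite problem can be avoided by keying each server-side file to its codec; a rough sketch of the SCP pull, assuming key-based auth and where the hostnames and exact remote path are assumptions, not from the codec docs:

```python
import subprocess

# Hypothetical codec hostnames; in practice these would come from the app's config.
CODECS = ["codec-01.example.com", "codec-02.example.com"]

def dest_filename(codec_host):
    """Server-side filename keyed to the codec, so 40+ units never collide."""
    return "snapshots/%s.jpg" % codec_host.split(".")[0]

def pull_snapshot(codec_host):
    """SCP the latest snapshot off one codec (exact remote path is an assumption)."""
    src = "root@%s:/tmp/snapshots/snapshot.jpg" % codec_host
    subprocess.check_call(["scp", src, dest_filename(codec_host)])
```

Run `pull_snapshot()` over `CODECS` on a timer and each codec overwrites only its own file on the server.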
So my only last hope is to grab an H.264 stream (I only need the video when in a call anyways / H.323 only),
but I don't know enough about H.264 to know if it's even possible...
I'm suspecting I'm not going to get my way on this...
Thanks in advance for your time!!
Tom Brittingham
Progicon Systems

If the C-series has TC6 or greater software, you can get the snapshots at the following addresses, one for local camera video and the other for incoming remote video.  Note, web snapshots will have to be enabled on the codec for snapshots to work, of course.
/web/api/snapshot/get?SourceType=localMain
/web/api/snapshot/get?SourceType=remoteMain
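With those endpoints, polling the snapshot from an application is just an HTTP GET; a minimal Python sketch, where the host and credentials are placeholders and basic auth is an assumption (the codec may require digest auth or a session cookie depending on configuration):

```python
import urllib.request

def snapshot_url(host, source_type="remoteMain"):
    """Build the TC6+ snapshot URL for a codec ("localMain" or "remoteMain")."""
    return "http://%s/web/api/snapshot/get?SourceType=%s" % (host, source_type)

def fetch_snapshot(host, user, password, source_type="remoteMain"):
    """GET the JPEG bytes, assuming the codec accepts HTTP basic auth."""
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, "http://" + host, user, password)
    opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(mgr))
    with opener.open(snapshot_url(host, source_type)) as resp:
        return resp.read()
```

The returned bytes are the JPEG itself, so they can be written straight into the existing .jpg viewer without any SCP hop.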

Similar Messages

  • H.264 streaming on SAFARI

    Hello guys.
    I am a university student and I am doing a project about getting a live stream from an IP camera via VLC and sending it to a browser using an HTML5 video player (Projekktor and Video.js). Basically I receive RTSP live streaming from the IP camera and re-stream it in an FLV container via VLC. After that I send it to the browsers using the HTML5 video player. FLV works on browsers such as IE and Chrome; however, it does not work in Safari. I think it is because Apple does not support Flash. So is there any way to get H.264 streaming working in Safari?
    I would appreciate it if anyone replies with a solution. Thank you so much.
    p.s this is on the computer or laptop, not iphone or ipad


  • Pro color profile (i1xtreme) vs broadcast ref monitor

    OK, I've had the privilege of working in production long enough that I know the importance of using a properly setup broadcast ref monitor, but is there a better way today for those of us doing web only delivery? In what little I've googled I'm guessing there is not, but here are my thoughts.
    We've just switched to the Production Premium suite of software, and I'm wondering whether, using our photojournalist's i1Xtreme monitor calibration system, I can get a decent setup on a DVI/Mini DisplayPort-connected monitor. He's pretty sure he can hit the Rec. 709 spec, but I'm not sure how I would do the other adjustments. I understand that the max I could get is 8-bit. We had considered the MXO2, but now I read on the Matrox site that there is a lag from computer to MXO2 monitor that their engineers are trying to figure out. Seems like I read somewhere that AJA is having problems with some codecs also lagging from computer to ref monitor.
    MacPro 5,1, dual quad-core Xeon, 12gb mem, 2 Dell 2407WFP monitors, editing HDV, but switching to H.264 via Panasonic AVCCAM camcorder (want to edit natively)
    We do have an old Sony CRT SD broadcast monitor & LHe that was pulled from the Mac due to unresolved crashing issues editing HDV natively in FCP7.

    Premiere Pro could really use your help in getting a proper monitoring solution implemented - without third party cards/codecs.
    http://forums.adobe.com/thread/900221?tstart=0

  • As posted in another forum, AirPort 7.5.2 may affect H.264 streaming

    http://www.macintouch.com/readerreports/airport802_11n/index.html#d20dec2010
    Is there anyone successfully running h.264 streaming with the new airport firmware update?

    also working fine for me - although, all my Macs are hardwired to my TC.
    JGG

  • How do you auto reconnect a live video stream broadcast in flash action script 3?

    How do you auto-reconnect a live video stream broadcast in Flash ActionScript 3,
    so I don't have to ask people to refresh the page if the connection drops?
    I copy-pasted the live video stream broadcast files and script from here:
    http://www.adobe.com/devnet/adobe-media-server/articles/beginner_live_fms3.html
    http://www.adobe.com/content/dotcom/en/devnet/adobe-media-server/articles/beginner_live_fms3/_jcr_content/articlePrerequistes/multiplefiles/node_1278314297096/file.res/beginner_live_fms3.zip
    I don't know what I'm doing.

    Why don't you use several layers with appropriate alpha properties, and move these layers according to the mouse events?

  • H.264 Streaming to QT

    Dear Experts,
    I am trying to stream H.264 to QuickTime. I have generated the correct SDP information (i.e. sprop-parameter-sets) and am streaming from my server (which is based on liveMedia).
    QT is able to receive the actual dimensions etc. which came as part of the SDP and resizes the window accordingly. Also, the RTCP channel is working, as pausing QT, restarting QT, or killing a session and starting another one are all registered by the liveMedia server.
    When I stream the data, I observe green, or a mixture of green and pink, patches. This is with the workaround in QT Preferences -> Advanced -> Safe Mode (GDI only) enabled. Once in a while, with a little luck, I am able to view the first frame of the stream. Otherwise I get only a green screen.
    I am able to get MPEG-4 streaming working fine, but H.264 is something I am unable to get any hold of.
    Inputs and suggestions would be of great use. Many thanks in advance.
    Thanks.

    I had this exact same problem, and I was equally overwhelmed with responses... (search for "crashing QuickTime").
    However, this persistent and annoying bug was cured only AFTER I upgraded to 7.1.3. So that's a great help, then!
    All I can suggest, having purged logs and opened new user accounts to no effect, is to try reinstalling 7.1.3 somehow (i.e. download the update, rather than use Software Update). I'm supposing some file somewhere is corrupted, but I don't know enough to speculate what file or where. Trashing the prefs and so on didn't work for me, but it's always worth doing a good debug as suggested previously on these pages (see http://forums.osxfaq.com/viewtopic.php?t=7269 for example).
    Good luck, and let us know how you get on. I at least will be listening

  • QuickTime Broadcaster, from MPEG-2 to H.264 stream

    Can QT Broadcaster connect to an MPEG-2 stream, convert it to H.264 MPEG-4, and broadcast the new stream on the fly?

    I could be missing something, but to the best of my knowledge, no, QT Broadcaster cannot. It's not a transcoder; it's intended to capture live audio and video from Firewire, USB or an analog input.
    There is at least one real-time MPEG-2 to H.264 transcoder available, from Media Excel, but it's not cheap; ca. $13,500 US. If there's anything less expensive that can convert MPEG-2 to H.264 in real time, I haven't been able to find it.
    Your only other option, I think, would be to capture a video output stream from your MPEG-2 content to an analog or DV output, then use QT Broadcaster or another real-time compression tool to handle the H.264 output.
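As an aside, a software route exists today: ffmpeg can transcode MPEG-2 to H.264 in real time on commodity hardware. A sketch that only builds the command line; the stream URLs and preset choice are illustrative, not from the thread:

```python
def transcode_cmd(src_url, dst_url):
    """ffmpeg arguments for a live MPEG-2 -> H.264 re-stream (URLs are placeholders)."""
    return [
        "ffmpeg",
        "-i", src_url,            # e.g. an MPEG-2 transport stream over UDP
        "-c:v", "libx264",        # re-encode video as H.264
        "-preset", "veryfast",    # favor real-time speed over compression efficiency
        "-c:a", "aac",            # re-encode audio
        "-f", "mpegts", dst_url,  # emit a new transport stream
    ]
```

Handing this list to `subprocess.Popen` would start the transcode; the preset can be tightened if the CPU keeps up.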

  • H.264 streaming in flash player motion blur

    Hi ,
    We are using H.264 for live streaming in a Flash window. The encoding works fine, but when there is a lot of fast movement in the stream there is a degradation in the quality of the video. When the fast movement stops the video restores itself, but it is very noticeable. Does anyone know how to resolve this issue and eliminate it altogether?
    Thanks
    peter

    Thanks. We were able to solve the issue by increasing the bandwidth. The next issue we are facing is that we want to connect multiple cams to the same machine and have fast switching between cams back and forth. The issue is that there is a freeze on the current stream before switching to the new stream, and from the broadcaster side it takes a bit before the cam switches. Getting fast switching is our goal at this point.

  • H.264 streaming

    Hi!
    I'd like to compress a movie I have captured with SnapZ Pro with the H.264 codec. I have two questions:
    First: What must I do to prepare the movie for streaming? I already chose Quickstart with compressed header in the codec settings pane. Is there anything else I should do?
    Second: What must I do to actually make the movie stream on the web? Must I use the embed tag, or can I use an a href tag as well?
    Take care,
    Christian

    Mike,
    Since it seems like you would only need a certain app open, why don't you select that window within SnapZ Pro? Why do you need your whole desktop?
    I have to for two reasons:
    1. The app I demo is a multi-window application.
    2. I have to record those movies on my old iMac G4 800MHz because my client doesn't want the yellow mouse pointer I'd get if I recorded with my more powerful Intel iMac. However, the G4 iMac is barely fast enough for such a task. Reducing the captured area would force me to move the application's windows around while recording. My old iMac is not powerful enough for this; the result would be stuttering video.
    I only ask because if you just selected what you needed, rather than the whole screen, then you could up your resolution. What is your current monitor's resolution?
    1024x640
    I know that's large. That's why I tried to downscale the video during the compression process. Isn't it possible to downscale the picture without blurring it? I mean, every image processing app can downscale still images without much quality loss.
    Also, you said that the H.264 codec straight from SnapZ Pro's output selector is processor intensive. It will be just as intense if you compress it in Compressor.
    It's not the CPU usage that bothers me. You see, I don't record those tutorials in one take; I break them up into several rather short parts. Therefore SnapZ's rendering time with H.264 would end up creating long breaks in my workflow between each recording.
    I would still recommend the Animation codec @ 10fps, but with ONLY the application window you need, not your WHOLE desktop. That will help you out a lot.
    Again, recording only a part of the screen isn't possible. That's why I tried downscaling.
    Mikey M.
    Take care,
    Christian

  • How do you watch and record a live h.264 stream at the same time using fmis and FMLE?

    I've read all the similar posts but could not find a solution that actually works, and please, no "just use the DVR or record on the FMLE" suggestions, as neither will work for me.
    Some posts I've read suggest naming the stream in the FMLE to something like mp4:mystream.mp4 or mp4:mystream.f4v, but then you can no longer view the stream, and according to the admin console the stream will not even publish using this syntax.
    I can view the published stream if I name the FMLE stream something like mystream.mp4 or just mystream, but then it doesn't record at all, or records as an FLV file, depending on how I code the main.asc file.
    Here's one version of my main.asc, this one correctly publishes the live stream after the client triggers the joinStreams function but will not record it.
    application.onConnect = function (client, userType) {
        trace("userType is " + userType);
        this.acceptConnection(client);
        //this.clientCount++;
        client.joinStreams = function (channel) {
            trace("joinStreams on channel " + channel);
            liveVid = Stream.get("livevideo/" + channel + ".mp4");
            liveVid.play("hdvideo/" + channel, -1, -1);
            liveVid.record("append");
        };
        client.clearStream = function (channel) {
            trace("clearStream on channel " + channel);
            liveVid = Stream.get("livevideo/" + channel + ".mp4");
            liveVid.play(false);
            delete liveVid;
        };
    };
    I've been working on this for 48 hours straight, please advise.

    I think let's keep it simple; I will explain what you have for watching live streams and recorded streams.
    Live Publish and Play:
         FMLE Settings:
    Video Codec: H.264
    Audio Codec:<any of your choice>
    Server URI : Please put your server uri with application name here , for example i will use "rtmp://myServer/myApp"
    Stream name: livestream
        Subscriber Settings:
              Server URI :  "rtmp://myServer/myApp"          Stream name: livestream
              Mode: "live" (i.e. ns.play("livestream", -1, -1, true))
    Playing VOD H.264 file:
         Subscriber settings:
              Server URI :  "rtmp://myServer/myApp"
              Stream name: if file name is "sample", use "mp4:sample", if file name is sample.f4v, use "mp4:sample.f4v"
              Mode: "record" (i.e. ns.play("mp4:sample.f4v", 0, -1, true))
    For DVR (Record-in-progress stream)
    If you are using FMLE and want to use the "Record" option of FMLE, then you need the dvrcast_origin application, and the subscriber needs the FLVPlayback 2.5 component.
    If you want to use that option, do let me know i will give details later.
    For now we will use simple server-side code and FMLE as publisher.
    main.asc of "myApp"
    application.onPublish = function(myclient, mystream){
         mystream.record();
    };
    application.onUnpublish = function(myclient, mystream){
         mystream.record(false);
    };
    FMLE Settings:
    Video Codec: H.264
    Audio Codec:<any of your choice>
    Server URI : Please put your server uri with application name here , for example i will use "rtmp://myServer/myApp"
    Stream name: mp4:mydvrstream.f4v
        Subscriber Settings:
              Server URI :  "rtmp://myServer/myApp"          Stream name:  mp4:mydvrstream.f4v
              Mode: "live" (i.e. ns.play("mp4:mydvrstream.f4v", 0, -1, true))
    Try out the above and let me know whether or not it works for you.
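The mp4: naming rule used throughout the answer above (prefix the stream name so FMS records H.264 into an F4V/MP4 container) can be captured in one small helper; a sketch in Python, where the function name is ours:

```python
def play_name(file_name):
    """Map an H.264 file/stream name to the name used in ns.play() and FMLE.

    Per the rule above: "sample" -> "mp4:sample",
    "sample.f4v" -> "mp4:sample.f4v"; an already-prefixed name is left alone.
    """
    return file_name if file_name.startswith("mp4:") else "mp4:" + file_name
```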

  • W530 w/ Series 3 Dock - Driving Three 4K Monitors

    Hey All,
    Interesting question for you... My normal monitor configuration on my desktop machine is 3 Monitors in landscape + 1 in portrait. I am considering upgrading my 3 landscape monitors to either 2560x1440 screens or 4k.
    I use a W530 with a Series 3 Dock for work and would like to be able to connect it to my home setup. I am pretty positive that I could drive the 2560x1440 screens, since I already drive three 1920x1200 screens at work via the two DPs on the dock plus the Mini DP on the laptop.
    My question, though, is whether it will run 4K screens in the same configuration. I know they will have to be run at 30Hz, but will it be able to drive that quantity? The W530 is equipped with a K2000 in case that matters.

    The W530 should be able to drive 3840x2160 at a full 60Hz:
    "Maximum external resolution: 3840x2160@60Hz (DisplayPort via optional Mini DP cable);
    1920x1200@60Hz (single-link DVI-I via optional Mini DP cable); 2048x1536@85Hz (VGA)"
    W520: i7-2720QM, Q2000M at 1080/688/1376, 21GB RAM, 500GB + 750GB HDD, FHD screen
    X61T: L7500, 3GB RAM, 500GB HDD, XGA screen, Ultrabase
    Y3P: 5Y70, 8GB RAM, 256GB SSD, QHD+ screen

  • Broadcast studio monitor

    Any suggestions/alternatives for an HD broadcast monitor that's not going to break the bank? I have a small wedding videography business and I can't see spending a mint on a monitor. I do however see the benefit of having a broadcast monitor for color correction purposes. Can you help? Thanks in advance.

    The HP DreamColor is not a broadcast monitor, but with an MXO2 it comes somewhat close. If you're serious, the best deal you'll find is a Flanders Scientific monitor (17" should work just fine), running off an AJA KONA LHi card. It won't get any cheaper than that, if you want quality.

  • Multiple Video Streams on Same Monitor or Screen

    Is it possible to merge multiple video streams on the same timeline in PP CS5? I have a customer that wants me to do this for him.
    This would be like a security monitor with three or four small images on the same screen.
    C. Barker

    Charles,
    What the client is asking for is PiP (Picture in Picture), and Colin has outlined the method for creating the PiP.
    I also find it easiest, when doing a lot of PiP work, to use alignment grids, to help me with both the Scaling and the Position. This ARTICLE goes into a bit more detail. For PiP, I will usually create a custom alignment grid, similar to this:
    Good luck, and hope that this helps,
    Hunt
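The security-monitor layout Hunt and Colin describe is just quarter-scale clips parked at the four corners; a rough sketch of the math, assuming a 1920x1080 sequence with the origin at the top-left (Premiere itself positions by clip center, so the numbers would shift accordingly):

```python
def quad_layout(frame_w=1920, frame_h=1080):
    """Top-left (x, y) offsets for four quarter-scale PiP clips in a 2x2 grid,
    plus the scale percentage to apply to each clip."""
    w, h = frame_w // 2, frame_h // 2   # each PiP covers half the frame per axis
    positions = [(0, 0), (w, 0), (0, h), (w, h)]
    return positions, 50                # 50% scale on both axes
```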

  • Does anyone here have any experience with live stream broadcasting

    Good day.
    Does anyone here have any experience with broadcasting a live event using an Apple Mac and associated software?
    We have our own domains, desktops etc., but we need to broadcast an event AND charge for this specific event, i.e., pay direct, PayPal, or is there a service that does this?
    Any advice, guidance and/or links would help.
    Thank you very much

    It's "Kappy".
    Here are some links I found that may help:
    http://images.apple.com/server/pdfs/L31754AQTStreaming_TBfinal.pdf
    http://docs.info.apple.com/article.html?artnum=75002
    http://developer.apple.com/opensource/server/streaming/index.html
    http://www.soundscreen.com/index.html
    http://www.apple.com/quicktime/streamingserver/faq.html

  • Using netStream.time on rtmp live stream (broadcaster!=receiver)

    I am trying to synchronize a live stream (which is broadcast from the Flash Player plugin) with some scripted actions. The problem is that the displayed frames of the RTMP stream do not correlate to the netStream.time property on the receiver side. In fact, I do not understand at all how the property is derived on the receiver, since I cannot recognize any dependency on e.g. bufferLength.
    Does anybody know how it is calculated? I tried with Red5 and Wowza and they seem to behave similarly. I guess FMS would not make any difference (if so, please let me know!), since I assume that the property is rendered during the encoding process, i.e. by the plugin.

    Hello Jay,
    Thank you for your answer! NetStream.send() seems to be at least a possibility to solve my problem with a workaround.
    I just want to synchronize the time information of the up- and downstream: both should have the same time information when they show the same visual content (which I called a "frame" in the former post).
    What I would expect netStream.time to be is this: if I shake my camera at upstream.time = 10.0, then downstream.time should equal 10.0 when this camera shaking is played back. This is the behaviour I am used to from streaming prerecorded FLVs, but with live streams things work out differently.
    In fact my downstream.time is bigger than upstream.time, and I cannot imagine how this can be. If I streamed up for, let's say, 20 seconds and I start the downstream after about 10 seconds, I would expect my downstream.time to start at 10 seconds and then increase continually. But instead, my downstream.time starts at something above 20. How can this be? Or, back to my initial question: how is downstream.time derived?
    This behaviour does not seem to depend on downstream.bufferTime.
    With netStream.send() I could send the upstream.time information via the upstream to compute an offset for downstream.time on the receiver side. This should work (I have to check it), but it is a workaround, not a "clean" solution.
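The NetStream.send() workaround described above reduces to a single subtraction on the receiver once an upstream.time sample arrives in-band; an illustrative model in Python (the class and method names are ours, not Flash API):

```python
class StreamClock:
    """Map downstream.time back onto the broadcaster's timeline using an
    upstream.time sample carried in a NetStream.send() message."""

    def __init__(self):
        self.offset = None  # unknown until the first in-band time sample arrives

    def on_upstream_time(self, upstream_time, downstream_time):
        # Called when the sender's clock sample arrives; note the local stream time.
        self.offset = downstream_time - upstream_time

    def corrected(self, downstream_time):
        # Receiver time translated to sender time (unchanged until calibrated).
        if self.offset is None:
            return downstream_time
        return downstream_time - self.offset
```

With this, a scripted action scheduled at sender time t fires when `corrected(downstream.time)` reaches t, regardless of when the receiver joined.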
