No Live Stream on Cameras (New Service) - Intermittent
I just had 3 exterior cameras installed yesterday and was wondering if intermittent loss of the live video stream is a common occurrence with new installations. I also need tips on how to set up a camera to record and send a notification when it detects motion. And finally, does anyone know if it is possible to take a snapshot, upload the .jpeg file, and zoom into the image? The cameras I have do not have "zoom-in" capability. Thanks
See this article: iPhone: Troubleshooting No Service.
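Since the cameras themselves have no zoom, a snapshot can be "zoomed" after the fact by cropping a region of the image and upscaling it in software. Below is a minimal pure-Python sketch of that idea using nearest-neighbour upscaling on a grid of grayscale pixel values; a real photo would go through an imaging library such as Pillow, but the principle is the same.

```python
def digital_zoom(pixels, top, left, height, width, factor):
    """Crop a region from a 2-D pixel grid and upscale it by an integer
    factor using nearest-neighbour interpolation (simple 'digital zoom')."""
    crop = [row[left:left + width] for row in pixels[top:top + height]]
    zoomed = []
    for row in crop:
        # Repeat each pixel `factor` times horizontally...
        stretched = [value for value in row for _ in range(factor)]
        # ...and each stretched row `factor` times vertically.
        for _ in range(factor):
            zoomed.append(list(stretched))
    return zoomed

# Example: zoom 2x into the centre of a tiny 4x4 grayscale "snapshot".
image = [
    [0, 1, 2, 3],
    [4, 5, 6, 7],
    [8, 9, 10, 11],
    [12, 13, 14, 15],
]
print(digital_zoom(image, 1, 1, 2, 2, 2))
# -> [[5, 5, 6, 6], [5, 5, 6, 6], [9, 9, 10, 10], [9, 9, 10, 10]]
```

The trade-off is the usual one for digital zoom: the crop gets bigger on screen, but no new detail is created, so heavy zoom factors look blocky.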
Similar Messages
-
Hi,
I am planning to develop a live streaming app using C# and XAML.
I have gone through the sample below and am able to stream live content using a Wirecast camera.
Azure Media Services Live Streaming Sample
But I want to use the phone camera to stream the live video rather than the Wirecast camera.
How can I do that? How do I get the published URL from the C# code — is there an API to get that?
Can anybody please help me?
Regards,
Santhosh
Hello,
There are no in-box components that will allow you to transmit captured video from a Windows Phone device to Azure Media Services. You will need to create your own implementation. You can do this by creating a custom Media Foundation sink that can plug into
the MediaCapture element's pipeline. You will then need to modify the captured and compressed video to align with one of the two protocols that Azure Media Services can use for ingestion: at this time, either Smooth Streaming or RTMP.
You will need to create your own packager (or find compatible packager code from a third party) to package the MP4-encoded data into the necessary format.
Here is an example of how to create a custom Media Foundation sink (please note that you must write your Media Foundation components in C++/COM; Media Foundation components written in managed languages are not supported):
Real-time communication sample
I hope this helps,
James
Windows SDK Technologies - Microsoft Developer Services - http://blogs.msdn.com/mediasdkstuff/ -
How to copy recorded live stream to new drive
I was wondering if anyone out there knows how to copy a recorded live stream to a new drive on the server? I can copy the file within the virtual app directory but not to another drive.
The only solution I found was to edit Application.xml:
<StreamManager>
<StorageDir>e:\Sync\</StorageDir>
</StreamManager>
But unfortunately I need to rename the file once it's recorded, and I'm having a hard time accessing the drive/directory where the file is located.
Any help is greatly appreciated.
Thanks
Hi,
Based on my understanding of your query, here is a solution:
1. You can use the following in Application.xml to copy the recorded live stream to a new drive/directory:
<StreamManager>
<StorageDir>e:\Sync\</StorageDir>
</StreamManager>
2. By default, a script can access files and directories only within the application directory of the hosting application. A server administrator can grant access to additional directories by specifying virtual directory mappings for File object paths. This is done in the FileObject tag in the Application.xml file, as shown in the following example:
<FileObject>
<VirtualDirectory>/_definst_;e:\Sync\_definst_\</VirtualDirectory>
</FileObject>
3. Create an application folder (say, test) and then create a main.asc file inside it with the following code:
application.onConnect = function(client) {
    application.acceptConnection(client);
    var fileObj = new File("_definst_");
    var fl = new Array();
    trace("File name : " + fileObj.toString());
    if (fileObj != null) {
        if (fileObj.isDirectory) {
            fl = fileObj.list();
            for (var i in fl) {
                trace("Directory : " + fl[i].name);
                if (fl[i].isFile) {
                    trace("File : " + fl[i].name);
                    trace(fl[i].renameTo("dvrstream" + i));
                } // if (isFile) closes
            } // for closes
        } // if (isDirectory) closes
    } // if (fileObj != null) closes
}; // onConnect closes
This server-side application renames the files; because the folder we had to access was outside the hosting application's directory, we mapped it using step 2.
If this solution does not answer your query, please elaborate on exactly what you are trying to achieve. That will help me answer your query correctly.
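As an aside, the renaming performed by the main.asc script in step 3 is easy to prototype outside FMS as well. The following is a hedged Python sketch of the same idea; the dvrstream prefix mirrors the script above, and the directory layout is hypothetical.

```python
import os

def rename_recordings(directory, prefix="dvrstream"):
    """Rename every regular file in `directory` to prefix0, prefix1, ...,
    mirroring the renameTo() loop in the main.asc script above."""
    files = [n for n in sorted(os.listdir(directory))
             if os.path.isfile(os.path.join(directory, n))]
    renamed = []
    for i, name in enumerate(files):
        src = os.path.join(directory, name)
        dst = os.path.join(directory, prefix + str(i))
        os.rename(src, dst)
        renamed.append(prefix + str(i))
    return renamed
```

Like the server-side script, this assumes the process has write access to the target directory, which is exactly what the FileObject virtual-directory mapping in step 2 grants inside FMS.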
Regards,
Amit -
How to use the YouTube Live Streaming API?
I want to make a WPF application that gets video from my IP camera and sends it live to my YouTube channel. I have looked all over the web, but there is no example of how to live stream video to YouTube with C#. There are examples on Google's website,
but they are written in PHP, Java, and Python, which I don't know, so I couldn't use the API.
I tried to write a little bit, but it didn't work. Here's the code I wrote while looking through the Java example.
string devkey = "AIzaSyCbxm6g9orAw9PF3MkzTb_0PGbpD3Xo1Qg";
string username = "MyYoutubeChannelEmailAdress";
string password = "MyPassword";
YouTubeRequestSettings youtubereqsetting = new YouTubeRequestSettings("API Project", devkey, username, password);
YouTubeRequest youtubereq = new YouTubeRequest(youtubereqsetting);
LiveBroadcastSnippet broadcastSnippet = new LiveBroadcastSnippet();
broadcastSnippet.Title = "Test Live Stream";
broadcastSnippet.ScheduledStartTime = new DateTime(2015, 3, 12, 19, 00, 00);
broadcastSnippet.ScheduledEndTime = new DateTime(2015, 3, 12, 20, 00, 00);
LiveBroadcastStatus status = new LiveBroadcastStatus();
status.PrivacyStatus = "Private";
LiveBroadcast broadcast = new LiveBroadcast();
broadcast.Kind = "youtube#liveBroadcast";
broadcast.Snippet = broadcastSnippet;
broadcast.Status = status;
Google.Apis.YouTube.v3.LiveBroadcastsResource.InsertRequest liveBroadcastInsert = new Google.Apis.YouTube.v3.LiveBroadcastsResource.InsertRequest(service, broadcast, "");
LiveBroadcast returnLiveBroadcast = liveBroadcastInsert.Execute();This article should get you going:
https://developers.google.com/youtube/v3/code_samples/dotnet
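While porting from the Java sample, two details in the C# snippet above are worth double-checking against the v3 reference: the documented privacyStatus values are lowercase strings ("private", not "Private"), and scheduled times are ISO 8601 timestamps. A hedged Python sketch of the JSON body that a liveBroadcasts.insert call carries (field names per the v3 reference; verify against the current docs before relying on them):

```python
import json
from datetime import datetime, timezone

def build_broadcast_body(title, start, end):
    """Build the request body for a YouTube v3 liveBroadcasts.insert call
    (POST .../youtube/v3/liveBroadcasts?part=snippet,status)."""
    return {
        "snippet": {
            "title": title,
            # v3 expects ISO 8601 timestamps, not bare DateTime values.
            "scheduledStartTime": start.isoformat(),
            "scheduledEndTime": end.isoformat(),
        },
        # Documented privacyStatus values are lowercase:
        # "public", "private", "unlisted".
        "status": {"privacyStatus": "private"},
    }

body = build_broadcast_body(
    "Test Live Stream",
    datetime(2015, 3, 12, 19, 0, tzinfo=timezone.utc),
    datetime(2015, 3, 12, 20, 0, tzinfo=timezone.utc),
)
print(json.dumps(body, indent=2))
```

The same structure applies regardless of client language, so this is a useful cross-check when the .NET client library's property names obscure what actually goes over the wire.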
Also, questions on YouTube API issues should get the best service here:
https://groups.google.com/forum/#!forum/google-api-dotnet-client -
Duplicating stream object to new stream with lower resolution
Hi all, a general question: is it possible for AMS to split my original live stream into a new stream with lower resolution, so that I have two streams, one with high resolution and one with low resolution?
If not, how does Adobe Flash Media Live Encoder do it? It can stream up to 3 different resolutions from one camera object. Is that also possible from Flash?
I want to build a multi-user conference application, but I have some CPU usage issues, so I need an additional lower-resolution stream to use for the miniature views.
Could You help me with this, any tips how could I do such a thing?
I am using the RTMP protocol to communicate, and Adobe Media Server 5.
Sorry if a similar thread was already posted on this forum.
Thanks for the reply.
So maybe I could do something like this:
I am able to send two streams from one camera but they both have the same resolution.
Is there any way, from a Flash app, to send two streams from one camera with different resolutions, like FMLE does? -
HDS live streaming to Flash not working
Adobe Flash Media Server 4.5.5 r4013
Windows 2008
Sources:
http://help.adobe.com/en_US/flashmediaserver/devguide/WSd391de4d9c7bd609-52e437a812a3725dfa0-8000.html
http://www.adobe.com/devnet/adobe-media-server/articles/live-multi-bitrate-video-http-flash-ios.html
Live streaming a single or multi-bitrate video over HTTP to Flash does not work. I have followed the instructions on the 2 sources listed above repeatedly, but I can’t get live streaming over HTTP to Flash to work. Live streaming to iOS over HTTP works with no problems (single and multi-bitrate streams).
I have tried the troubleshooting steps from the following:
http://help.adobe.com/en_US/flashmediaserver/devguide/WS0432746db30523c21e63e3d12efac195bd-8000.html
Troubleshoot live streaming (HTTP)
1. Services window (Windows): Flash Media Server (FMS), Flash Media Administration Server, and FMSHttpd services are running. ✓
2. Verified that the request URL is correct. ✓
3. Configured ports:
a. Configure Apache to use port 80. Open rootinstall/Apache2.2/conf/httpd.conf in a text editor. Change the line Listen 8134 to Listen 80.
b. Configure Flash Media Server not to use port 80. Open rootinstall/conf/fms.ini in a text editor. Remove 80 from the ADAPTOR.HOSTPORT parameter so the parameter looks like the following: ADAPTOR.HOSTPORT = :1935 ✓
4. Placed a crossdomain.xml file to the rootinstall/webroot directory. ✓
5. In Flash Media Live Encoder, select the Encoding Options tab, choose Output from the Panel options menu, and verify the following:
a) The value of FMS URL is rtmp://fms-dns-or-ip/livepkgr. If you’re testing on the same server as Flash Media Server, you can use the value localhost for fms-dns-or-ip. ✓
b) For a single stream, the value of Stream is livestream?adbe-live-event=liveevent. ✓
c) For adaptive bitrate streaming, the value of Stream is livestream%i?adbe-live-event=liveevent. ✓
Flash Media Live Encoder uses this value to create unique stream names. To use another encoder, provide your own unique stream names, for example, livestream1?adbe-live-event=liveevent, livestream2?adbe-live-event=liveevent.
The encoder is showing all 3 streams being published and streaming.
6. Check Administration Console: the livepkgr application and the 3 streams are running. ✓
7. Check the logs for errors. Flash Media Server logs are located in the rootinstall/logs folder. The master.xx.log file and the core.xx.log file show startup failures. Apache logs are located in the rootinstall/Apache2.2/logs folder. X
a) core00.log: these errors did not occur every time that I tried playing the live stream but these are the only relevant errors in the logs.
1. 7968 (w)2611179 Warning from libf4f.dll: [Utils] [livestream2] Discarded all queued Media Messages received before first Video Keyframe Message
2. 7968 (w)2611179 Warning from libf4f.dll: [Utils] [livestream3] Discarded all queued Media Messages received before first Video Keyframe Message
b) edge00.log:
13:33:57 4492 (w)2641213 Connection rejected by server. Reason : [ Server.Reject ] : (_defaultRoot_, _defaultVHost_) : Application (hds-live) is not defined. -
c) Apache-Error:
1. [warn] Checking if stream is disabled but bootstrap path in event file is empty for event:livepkgr/events/_definst_/liveevent stream name:livestream
2. [warn] bootstrap path is in event file is empty for event:livepkgr/events/_definst_/liveevent stream name:livestream1
As I mentioned, everything works on iOS and FMS seems to be creating all of the stream segments and meta files:
a. The 3 streams are being created in: HD:\Program Files\Adobe\Flash Media Server 4.5\applications\livepkgr\streams\_definst_
b. FMS is creating the following files in each stream folder (livestream1, livestream2, livestream 3):
1. livestream1.bootstrap
2. livestream1.control
3. livestream1.meta
4. .f4f segments
5. .f4x segments
The appropriate files are also being created in the HD:\Program Files\Adobe\Flash Media Server 4.5\applications\livepkgr\events\_definst_\liveevent folder, in which I have the following Manifest.xml and Event.xml files:
<manifest xmlns="http://ns.adobe.com/f4m/1.0">
<media streamId="livestream1" bitrate="200" />
<media streamId="livestream2" bitrate="500" />
<media streamId="livestream3" bitrate="1000" />
</manifest>
<Event>
<EventID>liveevent</EventID>
<Recording>
<FragmentDuration>4000</FragmentDuration>
<SegmentDuration>16000</SegmentDuration>
<DiskManagementDuration>3</DiskManagementDuration>
</Recording>
</Event>
I’ve tried clearing the contents of both streams\_definst_ and events\_definst_\liveevent (keeping the xml files) after restarting the encoder, and creating a different event definst for the streams (liveevent2 for example).
We have an event in 2 weeks that we would like to stream to both Flash and iOS. Any help in solving this problem will be greatly appreciated.
One step closer:
Changed the crossdomain.xml file (more permissive settings).
Changed the encoding in FMLE to VP6. It's somewhat working now (I don't know what I did to make it start streaming through HDS).
But at least now I can get the individual streams in the set manifest file to work:
http://localhost/hds-live/livepkgr/_definst_/livevent/livestream1.f4m
http://localhost/hds-live/livepkgr/_definst_/livevent/livestream2.f4m
http://localhost/hds-live/livepkgr/_definst_/livevent/livestream3.f4m
BUT when I try to play the streams through the set-level manifest file from http://localhost/liveevent.f4m I get the following error:
"The F4m document contains errors: URL missing from Media tag." I'll search the forums to see if anyone else has come across this problem.
I used the f4m config tool to make the file. These are the file's contents:
<manifest xmlns="http://ns.adobe.com/f4m/2.0">
<baseURL>http://localhost/hds-live/livepkgr/_definst_/liveevent/</baseURL>
<media href="livestream1.f4m " bitrate="200"/>
<media href="livestream2.f4m " bitrate="500"/>
<media href="livestream3.f4m " bitrate="1000"/>
</manifest>
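One thing worth checking in the set-level manifest above: the href attributes contain a trailing space ("livestream1.f4m "), which a strict player could well treat as a missing or invalid URL. A small stdlib Python checker for exactly this kind of problem (namespace and tag names taken from the f4m 2.0 sample above):

```python
import xml.etree.ElementTree as ET

F4M_NS = "{http://ns.adobe.com/f4m/2.0}"

def check_manifest(xml_text):
    """Return a list of problems with <media> entries in a set-level f4m
    manifest: missing href/url attributes, or values with stray whitespace."""
    problems = []
    root = ET.fromstring(xml_text)
    for media in root.iter(F4M_NS + "media"):
        ref = media.get("href") or media.get("url")
        if ref is None:
            problems.append("media tag with no href/url")
        elif ref != ref.strip():
            problems.append("href has surrounding whitespace: %r" % ref)
    return problems

manifest = """<manifest xmlns="http://ns.adobe.com/f4m/2.0">
<baseURL>http://localhost/hds-live/livepkgr/_definst_/liveevent/</baseURL>
<media href="livestream1.f4m " bitrate="200"/>
<media href="livestream2.f4m" bitrate="500"/>
</manifest>"""
print(check_manifest(manifest))
# -> ["href has surrounding whitespace: 'livestream1.f4m '"]
```

Running this over a manifest before deploying it is a cheap way to rule out whitespace and attribute-name slips as the cause of "URL missing from Media tag"-style errors.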
Thanks -
Live Streaming from Video File
Hi everyone, I'm a newbie at programming streaming video with Flash. After reading the documentation and trying its instructions, I still have a question I've not been able to solve: how do I make a "live stream" from a "video file"? It is like a live stream from a camera, but it uses a video file as the source. Many thanks.
Do you mean you have a video file but want to publish it as a live stream? If yes, the following is the solution, but first let me clarify that you will need a server to do it: there is no way a Flash client (i.e., a .swf file running in Flash Player) can publish a file from local disk.
Now let me tell you how to do it from server:
Place your video file under the "streams" folder of your application, e.g. "myVideo.flv" under <FMS root>/applications/myApp/streams/_definst_
Now write the following code in your main.asc file (this file would be under <FMS root>/applications/myApp):
var mystream;
application.onAppStart = function() {
    mystream = Stream.get("livestream");
    mystream.play("myVideo", 0, -1, true);
};
This will publish the "myVideo" file as a live stream under the name "livestream". The client would then subscribe using NetStream and
use ns.play("livestream", -1, -1).
You might have to write some extra code if you want to loop the same file; otherwise, once playback of the recorded file is over, the live publish will automatically stop.
Let me know if this solves your problem. -
How to delay live streaming?
Hi everybody,
I followed this tutorial on HTTP live streaming from a camera, using FMS 4.5 and FMLE 3.2, and it worked.
Now I need to add a delay (1–10 minutes) to the HTTP live stream. For example, while my camera is capturing right now, the client (end user) should only see the video that was captured by the camera 10 minutes ago.
P.S.: I'm using Amazon Cloud, specifically EC2 (including FMS 4.5).
Can I do that ?
Thanks,
Huy Mai
Hi Bharat,
I read your link and focused on MaxQueueDelay and MaxQueueSize, but it seems they cannot be used in my case. I tried configuring Application.xml accordingly, without success for my purpose. I think MaxQueueDelay specifies how often the server will flush the message queue, in milliseconds.
My purpose is for the client to view video 10 minutes later than what is currently captured by the camera.
Am I missing anything?
Mai Huy -
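The delay Huy Mai is asking about can be pictured as a time-shifted buffer sitting between ingest and playback: incoming segments are held until they are at least N seconds old, and only then served to viewers. A minimal, server-agnostic Python sketch of the idea (timestamps in seconds; this illustrates the concept and is not an FMS feature):

```python
from collections import deque

class DelayBuffer:
    """Hold incoming stream segments and release them only after a fixed
    delay, so viewers see the feed `delay_seconds` behind the camera."""
    def __init__(self, delay_seconds):
        self.delay = delay_seconds
        self.queue = deque()          # (arrival_timestamp, segment) pairs

    def push(self, timestamp, segment):
        """Record a segment as it arrives from the encoder."""
        self.queue.append((timestamp, segment))

    def pop_ready(self, now):
        """Return, in order, all segments whose age has reached the delay."""
        ready = []
        while self.queue and now - self.queue[0][0] >= self.delay:
            ready.append(self.queue.popleft()[1])
        return ready

buf = DelayBuffer(delay_seconds=600)   # 10-minute delay
buf.push(0, "seg0")
buf.push(4, "seg1")
print(buf.pop_ready(300))   # -> [] (nothing is 600 s old yet)
print(buf.pop_ready(600))   # -> ['seg0']
```

MaxQueueDelay, as noted above, only controls how often the server flushes its message queue; a viewer-facing delay of minutes needs buffering like this somewhere between ingest and delivery (or a DVR-style time-shifted playback window).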
Hello,
I am creating a script to function like Ustream.tv or
Justin.tv and it is currently configured at www.Toxin.tv. What I
need to do is retrieve the status of a stream (amount of users and
whether or not it's live) more importantly, though, whether or not
it's live in JavaScript or PHP - if this is at all possible.
Please help me, If I can't figure it out I'll just pay
someone to do it :P but I'd really like to try to do it myself
first. Thanks so much in advance.You can't do it at the application instance level (the FMS
core can't listen for HTTP requests), but you can get a list of
live streams from the admin service via http request.
See the docs for the HTTP api of the admin service,
specifically the getLiveStreams method. -
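For reference, the admin-service call mentioned above is a plain HTTP GET, so it can be issued from PHP, JavaScript, or anything else that can fetch a URL. A hedged Python sketch that only builds the request URL (the parameter names auser/apswd/appInst and default port 1111 follow the FMS Administration API conventions; confirm them against your server version's documentation):

```python
from urllib.parse import urlencode

def live_streams_url(host, admin_user, admin_pass, app_instance, port=1111):
    """Build the HTTP request URL for the admin service's getLiveStreams
    call. Parameter names follow the FMS Administration API docs; verify
    against your server version before use."""
    query = urlencode({
        "auser": admin_user,       # admin username
        "apswd": admin_pass,       # admin password
        "appInst": app_instance,   # e.g. "live/_definst_"
    })
    return "http://%s:%d/admin/getLiveStreams?%s" % (host, port, query)

print(live_streams_url("www.toxin.tv", "admin", "secret", "live/_definst_"))
# -> http://www.toxin.tv:1111/admin/getLiveStreams?auser=admin&apswd=secret&appInst=live%2F_definst_
```

The server answers with an XML document listing the live stream names; counting entries (or checking for a particular name) gives the "is it live" flag the question asks for.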
Live streaming with multiple remote cameras?
Look up LTS Security; I found it to be an easy, high-quality, low-cost, great solution.
I didn't see any discussions on SpiceWorks that covered this question. I'm involved in live streaming of my church's services, and we got hit by lightning 2 weeks ago. We had 3 cameras which could be remotely controlled by 1 Telemetrics box and which sent video feed into a digital video mixer. Well, one of the cameras and the remote-control box have died, so we are looking to replace the setup with better equipment. I'm able to run the equipment, and even troubleshoot it to an extent, but have no idea what's even out there. Does anyone have any recommendations on a setup for at least 3 remotely controllable (pan/tilt/zoom) HD cameras? The remote control could be either hardware- or software-based. I have no idea how much the church could budget for a new system, so a range of low-to-high-cost options would be ideal for me to...
This topic first appeared in the Spiceworks Community -
I need to do live streaming with a MacBook Pro 2011, using a new model of Sony HD camcorder (http://store.sony.co...ber=HDRAX2000/H). This camcorder model does not have FireWire input/output; it comes only with component video, USB, HDMI, and composite A/V outputs.
I wonder how I can connect this camcorder to the FireWire port of my laptop? Browsing the internet, I found that the Grass Valley company produces this converter: http://www.amazon.co...=A17MC6HOH9AVE6, but I am not sure, not even after checking the Amazon reviews, whether this device will send the video signal through FireWire to my laptop so I can live stream properly. Could anyone in this forum help me, please?
Thanx
I can't broadcast with the built-in iSight webcam... how would I zoom in or out? Or pan? I've seen people doing it while walking with their laptops, but that's not an option for me... There's nothing wrong with my USB ports, but they aren't an option for streaming video either, because as far as I know you can't connect video through USB in Apple operating systems... you can plug any video or photo camera in through USB, but as a drive to transfer data, not as a live video camera... FireWire is an older interface developed by Apple that lets you connect all sorts of cameras to Apple computers... unfortunately, my new Sony HDR-AX2000 camcorder doesn't have FireWire output...
thanx -
Live Streaming Using My Webcam As a Security Camera
I'm lost here, someone please help: I'm trying to use my iSight as a security camera over the internet so I can watch my dog; I just got her.
So the question is, what software out there can I use to turn my webcam into a camera I can watch live? (Again, my PowerBook is kind of old; I'm still running Panther, 10.3.9, not 10.4. Can I work around this?)
Can someone help me? I know I need my IP address, and I have that, but once I hook everything else up, where do I go when I'm somewhere else and want to look at my live stream, say from work to my house? Please, can someone help me? Thank you. : )
Hello, keith28
Until we hear back from juanorfrankie, here is my suggestion for your question:
I would like to access the iSight using my iPhone.
Can anyone make some suggestions?
If the iPhone can support viewing a streaming webcam page, use EvoCam with your iSight to publish a webcam with streaming video. That would probably be the simplest way to do what you want, but because I do not have an iPhone, I cannot test this.
See the instructions and FAQs at EvoCam for system requirements and details on how to set up your webcam page.
EZ Jim
PowerBook 1.67 GHz Mac OS X (10.4.10) G5 DP 1.8 External iSight -
I recently purchased a Mac Pro (one month old). Since day one I have found that when trying to use FaceTime the camera is very intermittent. Originally I thought it was a problem with the software, but I have since realized that if I place slight pressure on the right-hand corner of the monitor, or wiggle the screen, the camera will begin working, so it has to be a poor connection. Has anyone else experienced this problem?
Call AppleCare: 1-800-275-2273 or if the Apple store is nearby, take it there to have it checked out.
A new Mac comes with 90 days of free tech support.
Best. -
Blur Effect on a Live Stream Published Through a Camera
Hi Guys
I am currently working on a video chat application through FMS. The live stream is playing and publishing perfectly on both ends, but the thing is that I have to apply a blur effect to a particular portion of that live stream, I mean something like (a thief's face) or some moving object. Is it possible to apply blur to that live stream? If yes, then please tell me how. Thanks in advance; I am totally confused.
Thanks and Regards
Vineet Osho
Maybe you are looking for
-
How do I set up analog 0-10 VDC output pulses with varying duty cycles?
I am trying to control a 400 watt laser so that it can pulse on and off within a few milliseconds. The laser controller reads a 0-10 VDC signal, with 0V being 0 watts and 10V being 400 watts. I am using a PCI-6221 DAQ and LabVIEW 8.0. It seems that a
-
Can't access movies on my Mac with my iPad!
So I have movies that I purchased off of iTunes stored on my Mac; they are actually on an external hard drive, but I can no longer access my stored movies from my iPad. The message "request timed out" appears after a short while of waiting. plea
-
The computer will chime and will sit there for a minute, and then a red light will appear inside the case. I had unplugged it for a while and it booted once, but shortly after, the fans whipped up and there was no response. Now the unplugging trick won't work an
-
Is it possible to do total integration with only BPEL or ESB, not using both?
I am doing simple integration. For this I am using BPEL for orchestration and ESB for routing and transformation. But I am also getting these things done by using BPEL without ESB, so why, or when, is there a need to use ESB? Can anyone please
-
Ever since I updated Bejeweled Blitz on my 3G iPod, the game kept crashing, so I deleted and re-installed it. Sadly, I now can't play the game on my iPod anymore because the new version is compatible with the 4G & iOS 4 or higher only. What should I do to