Live streaming to Android devices? It shouldn't be this difficult.

Why doesn't Adobe post a simple tutorial on how to stream live video from FMS 4.5 to an Android device? The developer center has a tutorial series that hasn't been updated in years; Part 8: Streaming to Android Devices has been marked "(to Come)" for a year and a half. The server is useless if we can't watch video on Android devices. Surely someone out there has successfully streamed live video to an Android device and can explain how to do it?
Do we need to create an Android AIR app to accomplish this?
Thanks for any help!
Dave

As an avid Wowza developer/integrator/consultant (as well as the occasional AMS work), I concur with JayCharles on all of his assertions. The transcoder is completely optional, and you can actually make your own Wowza transcoder module with FFmpeg if the Transcoder AddOn costs are really too steep for deployment.
Having said that, my current solution for live streaming to Android with an off-the-shelf (OTS) player is JW Player, and setting its fallback property to false. It looks something like this:
<div id="videoPlayer">
     <!-- Code for non-supported JW Player case. JW Player 6 will not show HLS source on Android. -->
     <video src="http://media_server/app_name/_definst_/mp4:stream_ref/playlist.m3u8" controls ></video>
     <p>Video not playing? <a href="rtsp://media_server/app_name/_definst_/mp4:stream_ref">Click this link to play in an external media player.</a></p>
</div>
<script type="text/javascript">
jwplayer("videoPlayer").setup({
    playlist: [{
        sources: [
            { file: "rtmp://media_server/appname/_definst_/mp4:streamname" },
            { file: "http://media_server/appname/_definst_/mp4:streamname/playlist.m3u8" }
        ],
        title: "My Sample Live Stream"
    }],
    width: 320,
    height: 180,
    primary: "html5",
    autostart: false,
    fallback: false
});
</script>
HTH.

Similar Messages

  • Streaming for Android devices using Amazon AWS and Adobe FMS 4.5

    I have created a live stream using Amazon Web Services and Adobe Flash Media Server 4.5.
    AWS provides me with both a .f4m and .m3u8 file, to use in  <object><embed> and  <video> tags, respectively.
    The .f4m loads fine on my desktop browser, and the .m3u8 file loads fine on my iOS device. However, my Android devices will not load either file.
    What code/solutions are there to get this to play on Android devices?
    My current .f4m code (retrieved from http://www.osmf.org/configurator/fmp):
    <object width="600" height="409">
        <param name="movie" value="http://fpdownload.adobe.com/strobe/FlashMediaPlayback_101.swf"></param>
        <param name="flashvars" value="src=http%3A%2F%2F<myinfo>.cloudfront.net%2Fhds-live%2Flivepkgr%2F_definst_%2Fliveevent%2Flivestream.f4m"></param>
        <param name="allowFullScreen" value="true"></param><param name="allowscriptaccess" value="always"></param>
        <embed src="http://fpdownload.adobe.com/strobe/FlashMediaPlayback_101.swf" type="application/x-shockwave-flash" allowscriptaccess="always"
                    allowfullscreen="true" width="600" height="409" flashvars="src=http%3A%2F%2F<myinfo>.cloudfront.net%2Fhds-live%2Flivepkgr%2F_definst_%2Fliveevent%2Flivestream.f4m">
        </embed>
    </object>
    My current .m3u8 code:
    <video src="http://<myinfo>.cloudfront.net/hls-live/livepkgr/_definst_/liveevent/livestream.m3u8" height="300" width="400"> </video>

    A few additional things I came across while setting up a live stream on my Android from AWS:
    1. The first time I tried to run the live stream in my Android app, it didn't run at all, and for a moment I thought your concerns were valid :-) .
    2. Later, I killed the application from the task manager, restarted it cleanly, provided the URL again, and it streamed without issues.
    Please also ensure that:
    - You have put a crossdomain.xml file under the <fmsinstalldirectory>/webroot folder.
    - You have provided the correct stream name in the <media> tag in the Manifest.xml file inside the <fmsinstalldirectory>/livepkgr/events/_definst_/liveevent folder
    (hopefully you have done this already, since you are playing it on a PC).
    You may also like to check the Apache logs inside the <fmsinstalldirectory>/Apache2.2/logs/access_logs folder to find out whether your Android device's request is reaching the server.
    Hope these steps will help you in isolating the problem.
    Regards,
    Shiraz Anwar
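The server-side checks above can be complemented from the client side. As a rough sketch (Python used purely for illustration; the URL is hypothetical), you can fetch the .m3u8 over plain HTTP and confirm the response actually looks like a playlist before blaming the Android device:

```python
import urllib.request

def looks_like_hls_playlist(body: str) -> bool:
    """Minimal sanity check: a valid HLS playlist must start with #EXTM3U."""
    lines = [ln.strip() for ln in body.splitlines() if ln.strip()]
    return bool(lines) and lines[0] == "#EXTM3U"

def fetch_playlist(url: str) -> str:
    """Fetch the playlist body. An HTTP error here is itself diagnostic:
    a 404 usually means livepkgr is not publishing the event, while a hang
    or connection reset suggests the request never reaches Apache at all."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

# Example usage (hypothetical host -- substitute your CloudFront/FMS URL):
#   body = fetch_playlist("http://example.cloudfront.net/hls-live/livepkgr/_definst_/liveevent/livestream.m3u8")
#   print("valid playlist:", looks_like_hls_playlist(body))
```

If the playlist fetches fine from a desktop but not from the device, the problem is likely network-side (crossdomain.xml does not apply to the native Android player, but DNS and firewall rules do).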

  • How to integrate flash media server 4.0 live streaming for iOS devices ?

    Hi All,
    I have a website with a live streaming module that is working fine; I want to integrate the same module for iOS devices. For live video streaming we are using FMS 4.0. Please let me know how we can integrate this for iOS devices using Flash Media Server 4.0.
    Thanks in Advance
    Mohammad Sharique

    You need to place the crossdomain.xml in the webroot folder. Create a text file in the webroot folder using notepad, and call it crossdomain.xml. The text below will give you a wide open access policy, which is fine for testing.
    <?xml version="1.0"?>
    <!DOCTYPE cross-domain-policy SYSTEM "http://www.macromedia.com/xml/dtds/cross-domain-policy.dtd">
    <cross-domain-policy>
              <allow-access-from domain="*" />
    </cross-domain-policy>
    For debugging HTTP streaming I recommend you get hold of something like Charles or Fiddler. These will greatly assist in pinpointing any issues.
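As a quick sanity check alongside Charles or Fiddler, the served policy file can be parsed to confirm the wide-open rule is actually in place (a sketch; Python here just for the check):

```python
import xml.etree.ElementTree as ET

def is_wide_open_policy(xml_text: str) -> bool:
    """True if the crossdomain.xml grants access to every domain ('*')."""
    root = ET.fromstring(xml_text)
    return any(el.get("domain") == "*" for el in root.iter("allow-access-from"))

# The policy body from above (DOCTYPE omitted for brevity):
policy = """<?xml version="1.0"?>
<cross-domain-policy>
    <allow-access-from domain="*" />
</cross-domain-policy>"""

print(is_wide_open_policy(policy))
```

In practice you would fetch http://your-server/crossdomain.xml and run the same check on the response body, which also verifies the file is being served from the webroot at all.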

  • Exporting Widescreen - It shouldn't be this difficult!

    Hello all,
    I really hope someone can help me here. I'm a long-term user of Premiere Pro, yet I recently upgraded to CS4 with the new Media Encoder, and simple operations now seem impossible; in this scenario: exporting a widescreen video...in widescreen!
    I have a video, about 1 minute long, shot in widescreen. It was edited in a Premiere widescreen project, and for the export settings: MOV and/or MPEG and/or WMV, with the DV PAL Widescreen preset, which then gives me the (unchangeable) frame size of 720x576. I choose Square Pixels for the ratio selection and export. I get a widescreen video plus letterboxing to give me a 4:3 screen. Seeing as I can't change the frame size, I changed the pixels to Widescreen, and it gives me a 4:3 video without letterboxing, thereby distorting it horribly. I have tried changing everything! Different codecs give me the correct size but a file hundreds of MB in size. Seriously, it shouldn't be this hard!!
    Please someone, talk some sense!
    Ryan

    The other formats won't let me change the frame size from 720x576.
    If you're attempting to go from a DV source (judging by the screenshot you posted, you are) to a DV output (which you are if you're going to MS DV AVI or to QuickTime DV MOV), you have no other option than to encode at 720x576, period. That's what DV is (in PALLand, anyway; in NTSCVille, it's 720x480). It doesn't matter if you're coming from or going to a standard 4:3 aspect ratio or a widescreen 16:9 aspect ratio, as long as you're using a DV codec your files will alwaysalwaysalways be 720x576. The differentiation between the display aspect ratio or DAR (4:3 or 16:9), therefore, comes from the pixel aspect ratio, or PAR--in other words, what shape are the pixels? For standard DV sources, the pixels' width-to-height ratio is smaller; for widescreen DV sources, the pixels' width-to-height ratio is larger. Note that these are relative, and not absolute, as I'm speaking in generalities about DV--NTSC DV and PAL DV PARs differ. As Jim pointed out, some programs are able to properly read the PAR written into a DV (or other video) file, while some simply read the raw pixel count, which in your case is 720x576. That's why it looks weird and not what you want when played back--the footage is all there, but not drawn as you'd like.
    The only real safe bet, in my mind, is to use square pixels when encoding for computer playback, especially when dealing with less-than-technically-savvy clients. Computers use square pixels, and that's what they like. Encode using square pixels for web playback and local playback formats (like WMV, FLV, H.264, some QT flavors), and you'll avoid the weird stretchies. This does, of course, involve some small amount of computation on your part to determine what is the proper pixel count when using square pixels, but in the case of 16:9 widescreen display, it's usually just a case of dividing your height by 9, and multiplying that result by 16 to get the width. Punch those values into AME, and set your pixel aspect ratio to Square Pixels, and you should be in good shape... pun slightly intended.
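The divide-by-9-multiply-by-16 arithmetic in that last paragraph can be sketched as a tiny helper (Python purely for illustration):

```python
def square_pixel_width(height: int, dar_w: int = 16, dar_h: int = 9) -> int:
    """Width in square pixels for a given height and display aspect ratio:
    divide the height by dar_h, multiply by dar_w, round to the nearest pixel."""
    return round(height * dar_w / dar_h)

print(square_pixel_width(576))  # PAL at 16:9 -> 1024, so encode 1024x576
print(square_pixel_width(480))  # NTSC at 16:9 -> 853; encoders often bump this to 854 to keep it even
```

Punch the resulting width and your chosen height into AME with Square Pixels selected, exactly as described above.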

  • Is there a way to view the Photo Stream using Android device?

    My iPhone 4 is currently broken at the moment and I want access to my Photo Stream. Is there a way I can access my iCloud Photo Stream using my Samsung Galaxy tab?

    Not that I'm aware of.

  • It can't (shouldn't) be this difficult...

    I've been using a simple home network with my G4 iGloo and a G4 iBook. The iGloo connects to an Ethernet Switch connected to an Airport Extreme which connects to the cable modem. iBook has an Airport card and is used to administer the AEBS. Everything has been great. I even managed to set up both Macs with fixed IP numbers which allows me to have aliases on each Desktop which will accept drag-n-drop file transfers. Now, even if the power fails and the AEBS loses power, it doesn't reassign IP addresses to the Macs.
    OK! Everything is running fine...but I agreed to demo setting up an Airport Network for our User Group tomorrow night! No problem, this stuff is a snap! Or so I thought.
    New Macs to connect to the existing network: G4 mini and a new Intel mini.
    All Macs are using 10.4.10 except the iBook which is still at 10.4.8.
    All Macs can connect to the Internet. All display the other Macs in the "Browse" window (and also in the "Network" Sidebar item in a Finder window).
    From the Intel mini:
    I can connect to my iGloo (wireless to the Airport, Ethernet to the iGloo). I can even connect to a FireWire drive attached to the iGloo and any of the three partitions on the iGloo. Works just like it should!
    I CANNOT connect to the G4 mini (same basic route as above).
    I CANNOT connect to the iBook (wireless to wireless).
    In the last two cases I get the standard, "Unknown user, incorrect password, or login is disabled" dialog.
    From the G4 mini:
    I CANNOT connect to any other Mac. Same standard dialog as above.
    Once again, this Mac DOES connect to the Internet.
    From the iGloo:
    I get the same "Unknown user..." dialog for the G4 mini.
    I can, as stated earlier, connect to either the admin user or the Hard Drive of the iBook.
    The Intel mini presents an "FTP Authentication" dialog(?). What's THAT about?
    From the iBook:
    Connection to the iGloo is perfect.
    It CANNOT connect to either the G4 or the Intel mini.
    Both the minis have "Using:" DHCP. All the machines are within 10 feet of each other. I found one bad Ethernet cable.
    How many permutations will it take to discover the right combination of settings! Need to have this working within 16 hours, of course! "The difficult will be done immediately, the impossible will take slightly longer!"
    Sincere thanks to anyone who wants to tackle this! 8-)

    As an indication of my frustration level, I clicked on the "Mark as answered" link thinking it said "Mark has answered"! Doh! Stupid me! Again!

  • SSL shouldn't be this difficult

    Hello,
    First off, SSL is NOT really difficult... it is that I am just frustrated with the "service provider" that I have to connect to. They are of no help whatsoever when it comes to trying to help me figure out what is going on with the SSL connection.
    OK, all the service provider has "provided" me is their address and port to connect to... which is for example https://xxx.yyy.zzz 5000
    They say that I need to connect into this server on this port in order to send and receive secure messages... So with that I put this little test program together...
    ----------8<----------
    import java.io.*;
    import java.net.*;
    import java.security.*;
    import javax.net.*;
    import javax.net.ssl.*;
    public class SSLSocketClient {
        public static void main(String[] args) {
            SSLSocket s = null;
            PrintStream out = System.out;
            out.println("\nTesting socket factory with SSLContext:");
            try {
                SSLContext sc = SSLContext.getInstance("SSLv3");
                KeyManagerFactory kmf = KeyManagerFactory.getInstance("SunX509");
                String ksName = "test.keystore";
                char ksPass[] = "password".toCharArray();
                char ctPass[] = "password".toCharArray();
                KeyStore ks = KeyStore.getInstance("JKS");
                ks.load(new FileInputStream(ksName), ksPass);
                // Generating KeyManager list
                kmf.init(ks, ctPass);
                KeyManager[] kmList = kmf.getKeyManagers();
                // Generating SSLSocketFactory
                sc.init(kmList, null, null);
                SSLSocketFactory sf = sc.getSocketFactory();
                // Generating SSLSocket and forcing the handshake
                s = (SSLSocket) sf.createSocket("ssltest.tnsi.com", 5004);
                s.startHandshake();
                InputStream inputstream = s.getInputStream();
                InputStreamReader inputstreamreader = new InputStreamReader(inputstream);
                BufferedReader bufferedreader = new BufferedReader(inputstreamreader);
                OutputStream outputstream = s.getOutputStream();
                OutputStreamWriter outputstreamwriter = new OutputStreamWriter(outputstream);
                BufferedWriter bufferedwriter = new BufferedWriter(outputstreamwriter);
                // "HELLO" in ASCII, followed by a newline
                char[] message = {0x48, 0x45, 0x4C, 0x4C, 0x4F};
                bufferedwriter.write(message, 0, message.length);
                bufferedwriter.newLine();
                bufferedwriter.flush();
                int x;
                while ((x = bufferedreader.read()) != -1) {
                    System.out.println(x);
                }
            } catch (Exception e) {
                System.err.println(e.toString());
            } finally {
                try {
                    if (s != null) {
                        s.close();
                    }
                } catch (Exception e) {
                }
            }
        }
    }
    ----------8<----------
    I have created a keystore with my client key pair in it.
    When I run the program I receive no errors or exceptions. All I receive is 5 ENQs and then the program exits...
    My question is, since I have not received an exception, can I assume that I have actually connected to the server? They will not tell me if I have connected or not...
    Thanks...

    If you managed to send and receive data you have negotiated the SSL handshake and made the connection.
    After you do that, get the SSLSession from the SSLSocket and have a look at the various things it gives you, such as the peer certificates and peer principal. This stuff comes from the server.
    It's interesting that you didn't need to define a truststore. Try it without the keystore too, to see if they are doing client authentication. If it succeeds without the keystore, they aren't.
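In Java that inspection is s.getSession().getPeerCertificates() and getPeerPrincipal(). The same idea, sketched in Python's ssl module for brevity (the host is hypothetical; only the pure cert-summarizing helper is new here):

```python
import ssl

def peer_common_name(cert: dict):
    """Extract the subject commonName from an ssl.getpeercert()-style dict,
    i.e. a tuple of RDNs, each a tuple of (key, value) pairs."""
    for rdn in cert.get("subject", ()):
        for key, value in rdn:
            if key == "commonName":
                return value
    return None

# After a successful handshake you would inspect the session like so
# (mirrors Java's SSLSession.getPeerCertificates()):
#   import socket
#   ctx = ssl.create_default_context()
#   with ctx.wrap_socket(socket.create_connection((host, 5000)), server_hostname=host) as s:
#       print(peer_common_name(s.getpeercert()))

sample = {"subject": ((("commonName", "ssltest.example.com"),),)}
print(peer_common_name(sample))
```

If the peer certificate comes back populated, the handshake definitely completed and you reached the real server.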

  • Apple iPhone Http Live Streaming in Flash (Free OSMF library released)

    Hello everyone, recently I've read (http://www.flashcomguru.com/index.cfm...) that Matthew Kaufman has developed an AS3 library which adds support for Apple HTTP Live Streaming (HLS) to OSMF and released it under the MPL open-source license.
    HLS (which is a proposed open standard, submitted as an RFC draft) is the required protocol to deliver VOD or live streams to Apple devices, and it's also supported on Android; having it supported in a Flash-based media player can let you reach both desktop and mobile users using just HLS, without the need to offer different distribution methods.
    Benefits:
    Less complexity for content distributors and lower distribution costs!
    What about integrating HLS in StrobeMediaPlayback? Maybe using the cited MPL AS3 library; I'm not an AS expert and the library lacks documentation, but
    it should be possible, shouldn't it?

    You need an iPhone Developer Account to download ... if you can't see the link then you don't have access or aren't logged in.

  • How to package multibitrate live stream as audio-only for Apple compliance

    Hi
    From the AMS documentation, I can package VOD content as audio-only by adding "audio-only" to the URL, for instance:
    http://<ams server ip>/hls-vod/audio-only-aac/vod.m3u8
    Similarly I can package live content (single bitrate) as audio-only for instance:
    http://<ams server ip>/hls-live/audio-only-aac/livepkgr/_definst_/liveevent/livestream1.m3u8
    In my case, I have 3 bitrates being published for a live audio-video event. The variant playlist for this event is created by the hls_http module by referring to Manifest.xml.
    I could not figure out a way to insert an audio-only stream to this variant playlist created by hls_http module for live stream.
    Please let me know if I am missing something, because this seems to be a pretty obvious requirement for live streaming to iOS devices.
    Thanks,
    Nitin

    Who can answer my question? Is the question a difficult one?
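While waiting for an answer: one workaround is to author the master (variant) playlist by hand instead of relying on the generated one, adding an audio-only entry alongside the video renditions. A sketch (bandwidths and stream names are hypothetical; per the HLS spec, a CODECS attribute of "mp4a.40.2" with no video codec marks an AAC audio-only rendition, which is what Apple's cellular compliance rule asks for):

```text
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=964000
livestream1.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=564000
livestream2.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=264000
livestream3.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=64000,CODECS="mp4a.40.2"
../audio-only-aac/livepkgr/_definst_/liveevent/livestream1.m3u8
```

The audio-only URI reuses the same hls-live audio-only-aac path pattern shown in the question, so the packager does the audio extraction; only the master playlist is maintained by hand.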

  • Support for live wallpapers for s60 device

    Nokia S60 devices should support live wallpapers, like Android devices do.

    Sadly they don't as yet, so if that's what you want in a phone, stick to Android for now.
    If I have helped at all, a click on the white star below would be nice. Thanks.
    Now using the Lumia 1520

  • I no longer can access my old desktop at all, is there a way to get the bookmarks from just the Sync account or an Android device to a new machine?

    My desktop that I use for my primary device's hard disk died recently and my laptop shortly after got hit with a virus. This has left me unable to get to any of my bookmarks that were stored on my Sync account.
    However, my cell phone still has all the old desktop links. Is there a way to get Sync to get those from the cloud onto the deskop, or do I have to start a slow process of sending links from my phone to my PC one at a time?
    Thank you for your help on this.

    Hi!
    You will need to get your Sync Key from your Android device and then follow these simple steps:
    https://support.mozilla.com/en-US/kb/How%20to%20sync%20Firefox%20settings%20between%20computers#w_what-if-im-not-near-my-first-computer
    In order to get your Sync Key from your Android Device you will need this add-on:
    https://addons.mozilla.org/en-US/mobile/addon/aboutsynckey/
    This should work. Let me know if you need further help.

  • Let clients record & save part of live stream

    Hi,
    I'm looking for a solution where clients watch a live stream video and can record part of this stream while watching and save this recording to disk. I can think of several solutions but not sure if any of them is possible:
    - The selected part is recorded locally and saved through FileRefence.
    - The selected part is recorded with FMS though a second NetStream and saved with a unique ID. The user can retrieve the recorded video by download.
    - Start-time and end-time of the recording are send to the server where some script extracts the requested video from the live stream recording file and return it as a download.
    Would any of these solutions work and what would be the best approach?
    Thanks!

    Thanks for the reply.
    I'm indeed going for the last option, where the live stream is being recorded as FLV on the server and FFmpeg copies the requested part and converts it to MP4. I still need to do some testing and optimization, but this seems to work quite well.
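For reference, that extract-and-convert step can be sketched like this (Python building the FFmpeg command line; paths and timestamps are hypothetical, and note that with stream copy the cut points snap to keyframes, so drop `-c copy` if you need frame-accurate cuts at the cost of a transcode):

```python
import subprocess

def build_clip_cmd(src_flv, start, end, out_mp4):
    """Build an FFmpeg command that cuts [start, end] (HH:MM:SS strings)
    out of a server-side FLV recording and remuxes it to MP4."""
    return [
        "ffmpeg",
        "-i", src_flv,              # the recorded live stream on the server
        "-ss", start,               # clip start
        "-to", end,                 # clip end
        "-c", "copy",               # stream copy only; remove to re-encode
        "-movflags", "+faststart",  # moov atom up front so the MP4 downloads cleanly
        out_mp4,
    ]

cmd = build_clip_cmd("/recordings/livestream.flv", "00:10:00", "00:12:30", "/clips/clip.mp4")
print(" ".join(cmd))
# To actually run it (requires ffmpeg on PATH):
#   subprocess.run(cmd, check=True)
```

The start/end times would come straight from the client request described in the original question.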

  • Configure Live streaming  FLV Encodering -- Adobe -- End user

    Hello,
    While looking on the web, I found no clear-cut instructions on how to configure Adobe Media Server for live streaming.
    What I would like to do is take a stream coming from a live FLV streaming source, and have Adobe Media Server configured to allow users to connect to the media manager and watch the live stream.
    Could someone provide directions on this?
    Thank you,
    Scott

    Hi,
    This article has information to help you get started with live streaming on FMS, http://www.adobe.com/devnet/flashmediaserver/articles/beginner_live_fms3.html .
    Regards
    Mamata

  • Is it possible to stream live video & audio from device cameras & mics to a server?

    I would like to know if I can use Flash Builder to stream live video and audio from Android & iOS device cameras and mics to a media streaming server such as Flash Media Server or Wowza. I know the Android & iOS APIs allow for this, but can it be done using Flex/ActionScript? The key here is "live", so you wouldn't want to have to wait for video and audio files to be completed on disk before sending them out. Ideally you would send it out via RTMP or HTTP streaming, but any stream would do, since once it gets to the server you can encode it.

    Hi
    Yes, it can be done, but there is a problem on iOS 8: when switching cameras, the microphone is muted, and the sound only returns when I press the home button on my iPhone and then reopen the app after a few seconds.
    You can code it on mobile the same way as Flash desktop video streaming.
    Good luck
    Zing1911

  • HTTP Live streaming test in Flash Player, HTML5 player, iphone, android and other smart phones.

    Hi,
    Can anyone please tell me how I can play HTTP live streaming from FMLE 3.2 to Flash Player, iPhone, iPad, an HTML5 player, Android, and other devices?
    I have tried my best to play the live stream using FMLE 3.2 with the help of "http://help.adobe.com/en_US/flashmediaserver/devguide/WSd391de4d9c7bd609-52e437a812a3725dfa0-8000.html" but I am not able to play video over HTTP.
    When I tried to play the URL "http://localhost/hds-live/livepkgr/_definst_/liveevent/livestream.f4m", even in Strobe Media Player it just showed "Buffering", that's it. But I can play the same RTMP URL in Flash AS3 using NetStream just fine.
    Now I want to test the livepkgr content via HTTP to Flash Player, HTML5 Player (on Web browser and smartphones), iphone and other devices.
    Thanks
    Best regards,
    Sunil Kumar

    Hi Sunil,
    It's difficult to debug an issue when you don't have access to the machine and you can't see what's happening, so I request you to please have a little patience.
    For playback just check the following steps and make sure if you have followed them :
    1) Delete the streams folder and any .stream file in the livepkgr
    2) Restart FMS
    3) Make sure you have a crossdomain.xml under root_install/webroot
    4) Publish a stream from FMLE as 'livestream?adbe-live-event=liveevent'
    5) Use this player to playback the stream : http://osmf.org/dev/2.0gm/StrobeMediaPlayback.html?src=http://<your-ip>/hds-live/livepkgr/_definst_/liveevent/livestream.f4m
    If there is no playback, here are some checks you can do :
    1) Request for the http://<your-ip>/hds-live/livepkgr/_definst_/liveevent/livestream.f4m in your browser. You should receive an xml that looks something like this :
    <?xml version="1.0" encoding="UTF-8" ?>
         <manifest xmlns="http://ns.adobe.com/f4m/1.0">
              <id>livepkgr/events/_definst_/liveevent</id>
                      <mimeType />
              <streamType>live</streamType>
              <duration>0</duration>
              <bootstrapInfo profile="named" url="../../../streams/livepkgr/events/_definst_/liveevent/livestream.bootstrap" id="bootstrap7158" />
              <media streamId="livestream" url="../../../streams/livepkgr/events/_definst_/liveevent/livestream" bootstrapInfoId="bootstrap7158">  
              <metadata>AgAKb25NZXRhRGF0YQgAAAAAAAhkdXJhdGlvbgBAJUUeuFHrhQAFd2lkdGgAQHQAAAAAAA=</metadata>
                </media>
    </manifest>
    2) Check the access log under Apache to see if the .f4m, .bootstrap and Frag files have been requested and delivered. If a particular entry has a status code of 200 then the file has been served by FMS. You can confirm the same on the client side using a software like fiddler.
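Check 1 can be automated; here is a sketch (Python used just for the check) that parses the returned .f4m and confirms it is a live manifest with a bootstrap entry, using the namespace shown in the sample above:

```python
import xml.etree.ElementTree as ET

F4M_NS = "{http://ns.adobe.com/f4m/1.0}"

def summarize_f4m(xml_text):
    """Pull the fields that matter for debugging out of an .f4m manifest."""
    root = ET.fromstring(xml_text)
    return {
        "streamType": root.findtext(F4M_NS + "streamType"),
        "has_bootstrap": root.find(F4M_NS + "bootstrapInfo") is not None,
        "media_ids": [m.get("streamId") for m in root.findall(F4M_NS + "media")],
    }

# A trimmed-down version of the manifest shown above:
sample = """<manifest xmlns="http://ns.adobe.com/f4m/1.0">
  <id>livepkgr/events/_definst_/liveevent</id>
  <streamType>live</streamType>
  <bootstrapInfo profile="named" url="../livestream.bootstrap" id="bootstrap7158" />
  <media streamId="livestream" url="../livestream" bootstrapInfoId="bootstrap7158" />
</manifest>"""

print(summarize_f4m(sample))
```

If streamType is not "live" or the bootstrapInfo element is missing, the livepkgr event is misconfigured and no player will get past buffering.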
    Coming to the player issue, a playback of *f4v via the vod directive is a progressive download, whereas playback of an *f4m file via hds-vod is HTTP streaming. 
    I have personally used Spark VideoPlayer and hence suggested the same.
    You need to add a component like :
    <s:VideoPlayer width="100%" height="30%" chromeColor="#CCCCCC" color="#000000"
                     fontSize="12"
                     source="http://<your-ip>/hds-live/livepkgr/_definst_/liveevent/livestream.f4m"
                        symbolColor="#000000"/>
    Let me know if any of these suggestions help.
    Thanks,
    Apurva
