Server-side streaming protocols

Does the server side of Flash Media Server support RTP or RTSP (Real Time Streaming Protocol)? I know that from the media server to the player it is only RTMP, but is it possible for the server to receive an RTSP stream and then convert it to RTMP for the client side?

I'll toss in my two cents' worth, but it's not particularly new information. I'm a recent MBP purchaser, and installed my ATV2 yesterday. In addition, I have an ancient Dell Pentium 4 desktop (XP Pro SP3) and a Dell Studio 15 laptop running 64-bit Win 7 on my network, the router for which is a D-Link DGL-4500 b/g/n wireless with 4 gigabit Ethernet ports. I'm using a Netgear ReadyNAS Duo for my backups, including Time Machine, and I'm using the built-in iTunes server on the NAS to serve my audio files to the various iTunes installations (all three computers). In addition, all my photos are on the NAS box as well, and fetched into either Picasa (Dells) or iPhoto (MBP).
Currently, the ATV2 is operating wirelessly, but I will be pulling some cable to it to take advantage of its gigabit interface.
I too was hoping that the ATV2 would 'see' my NAS music library, but as you have discovered, iTunes needs to serve the data to the ATV2. I did see some significant network slowdowns while using the MBP to serve up music videos while trying to run Safari on the computer.
My latest scheme is to use the P4 desktop (wired Ethernet) to serve the iTunes music and photos to the ATV2. That at least keeps the MBP out of the loop and reduces some of the wireless network usage.
I don't have videos (movies) on my network, but I did stream two Netflix movies last night to the ATV2 and it worked without any skips or delays.
Hope this is useful to someone,
Tom

Similar Messages

  • TT12241: Backup stream protocol error

    Hi there,
    I get the following error when replicating a TT 6.04 datastore:
    $ /data/TimesTen/6.04/bin/ttRepAdmin -duplicate -from UsageCollector -host NODE_B -setMasterRepStart -delXla -localhost NODE_A UsageCollector
    TT12241: Backup stream protocol error: bad log data packet length -- file "restore.c", lineno 2865, procedure "restoreLogFileNS"
    Unfortunately this error, TT12241, is not described in the documentation. Oracle Support and Google searches don’t return any relevant information.
    Has anyone come across this problem before?
    Thanks,
    Igor

    I can't find any previous SR reports of TT12241, and no probable candidate bugs are returned either. However, I have not seen the -delXla option used before, so it may have something to do with this option. Are you able to exclude it from the ttRepAdmin -duplicate and allow the bookmarks to be copied over? You could then connect to the newly created datastore and try to manually delete the bookmarks using "xladeletebookmark".

  • How to broadcast/livestream virtual server-side streams?

    Hello,
    I was wondering if the following is possible with FMS:
    Currently I have an application which performs some rendering through OpenGL.
    I would like to stream the output of this app as a live movie and broadcast it through FMS.
    The application runs server-side and I want to maximize the frame rate, so I would like to reduce overhead as much as possible.
    What would be the best way to stream the rendered frames to FMS?
    Is there an API I might use to access the server's frame encoder?
    (I would like to avoid having to write a minidriver, which would create a virtual webcam.)
    Thank you,
    -S

    Try this:
    http://help.adobe.com/en_US/FlashMediaServer/3.5_Deving/WS5b3ccc516d4fbf351e63e3d11a0773d56e-7ff0.html
    and let me know if this is what you were looking for.
    By the way, I am not sure if you were asking in the context of FMS only.

  • P2P-based live streaming protocol

    In a P2P-based live streaming system, a video stream is divided into segments of uniform length. Each segment contains 1 second of video data. Every peer in the overlay owns a buffer to save segments temporarily. Then, what kind of protocol is appropriate to transmit the 1-second video data: UDP or RTP?
    If RTP is used, what kind of processing should take place for video data that is sent to the buffer but not played at once?
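    As a rough illustration of the raw-UDP option in the question above, here is a minimal Java sketch that pushes one segment per second with a made-up 4-byte sequence-number header so the receiving peer can slot each segment into its buffer. RTP essentially standardizes this header (sequence number, timestamp, payload type), which is the usual argument for using RTP rather than bare UDP. The peer address, port, and segment payload are placeholders.

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.nio.ByteBuffer;

    public class SegmentSender {
        public static void main(String[] args) throws Exception {
            InetAddress peer = InetAddress.getByName("198.51.100.10"); // placeholder peer address
            int port = 9000;                                           // placeholder port
            try (DatagramSocket socket = new DatagramSocket()) {
                for (int seq = 0; seq < 60; seq++) {
                    byte[] segment = new byte[1024];                   // stand-in for 1 second of encoded video
                    ByteBuffer buf = ByteBuffer.allocate(4 + segment.length);
                    buf.putInt(seq);                                   // sequence number lets the peer order and buffer segments
                    buf.put(segment);
                    DatagramPacket packet = new DatagramPacket(buf.array(), buf.position(), peer, port);
                    socket.send(packet);
                    Thread.sleep(1000);                                // one segment per second
                }
            }
        }
    }

    The receiving peer would buffer segments by sequence number and only hand them to the player once its playout buffer is full enough; with RTP you would read the same information from the RTP header instead of a home-grown one.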

    So you know, there are no Macromedia guys... there are only Adobe guys, and I wouldn't expect answers from any of them here. This is a user-to-user forum.
    I don't work with mobile devices much, but Flash Lite supports the Flash 7 standard, so I would imagine the NetConnection and Video classes are in there. If that's the case, then you should be able to deliver Sorenson Spark encoded FLV files (the Flash 6/7 standard).
    Try here:
    http://www.adobe.com/mobile/

  • Object serialization stream protocol grammar parsing

    I am new to RMI. I am invoking a remote method and capturing the traffic at the same time using Wireshark (Ethereal). I want to parse the TCP data to get the class name, the method invoked, and the parameters passed. Can anybody tell me how to do it?

    Why? This is doing it the hard way by several orders of magnitude. Set -Djava.rmi.server.logCalls=true at the server, and/or -Dsun.rmi.client.logCalls=true at the client, and you will get a log of all remote method activity.
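    For example, here is a minimal sketch of an RMI server with call logging switched on; the Echo interface, its method, and the registry port are made up for illustration, but java.rmi.server.logCalls and RemoteServer.setLog are the standard hooks the reply above refers to.

    import java.rmi.Remote;
    import java.rmi.RemoteException;
    import java.rmi.registry.LocateRegistry;
    import java.rmi.registry.Registry;
    import java.rmi.server.RemoteServer;
    import java.rmi.server.UnicastRemoteObject;

    // Hypothetical remote interface, used only to show where the logging hooks in.
    interface Echo extends Remote {
        String echo(String msg) throws RemoteException;
    }

    public class LoggingServer implements Echo {
        public String echo(String msg) { return msg; }

        public static void main(String[] args) throws Exception {
            // Equivalent of passing -Djava.rmi.server.logCalls=true on the command line.
            System.setProperty("java.rmi.server.logCalls", "true");
            // Send the RMI call log to stderr (any OutputStream works).
            RemoteServer.setLog(System.err);

            Echo stub = (Echo) UnicastRemoteObject.exportObject(new LoggingServer(), 0);
            Registry registry = LocateRegistry.createRegistry(1099);
            registry.rebind("Echo", stub);
            // Each incoming call (client host plus method) now appears in the log,
            // which is much easier to read than reverse-engineering JRMP packets in Wireshark.
        }
    }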

  • Debugging live streaming ingest protocol

    Hi everyone,
    I posted a separate question last week about live streaming protocols. I had been searching for documentation on fMP4 live streaming ingestion and found nothing, but afterward I found the Smooth Streaming protocol specification (MS-SSTR -- the forum won't let me link to it directly), which seems to apply to my use case. Now I've spent a few days trying to make it work and so far haven't been able to do so.
    I'm not looking for a canned solution; I'm looking for resources to help me debug the problem.
    According to the specification, I should be able to make a long-lived, chunked HTTP POST request to my ingest endpoint, sending the fragmented MP4 data as the request body. I've tried to implement this without success -- I open the request stream and write 1000-byte chunks at a time, but the connection is reset after only one or two chunks. I can't tell if the problem is the data or the chunk size or something else.
    And if I mimic Expression Encoder 4, which seems to diverge from the spec and not send its data chunked at all, the connection isn't reset no matter how much data I send, but the video never makes it to the preview stream on the Azure portal.
    So the question is, how can I debug this? Is there any error log or diagnostic output I can access? Each time I try something new, it doesn't work, but I have no way of knowing whether I am getting closer to or further from a working solution. I'm a bit lost here, so help would be greatly appreciated.

    Hi Jason, and thanks for getting back to me.
    The IP security thing is a good thought, but I think I have that part taken care of already. I saw the option in the Portal to restrict ingest to the IP I used to configure the channel, and I made sure to set it to Off. Also, I can stream successfully from Expression, which would have the same source IP.
    As for why I want to use Smooth Streaming, basically there are two reasons. First, it seems like the simpler protocol conceptually. I'm already generating fragmented MP4 files (though I think they might be packaged wrong; I can't really tell which part is causing problems), and it seems simple enough that I could implement it without turning to a third-party library (the part that's feeding the stream is written in C#, though I'd be willing to rewrite in C++ if necessary, and there's no RTMP client library that seems trustworthy -- FluorineFX, for instance, seems to have ceased development years ago).
    Second, while reading the RTMP spec, I realized that even if I had such a library it isn't clear how I would need to use it with Azure. It allows creating objects or invoking methods on the server, but I couldn't find any guide to which objects and methods might be supported. And the spec also doesn't define specifically what video format, codec options, or packaging method I need to use; essentially I would have all the same problems I'm having with Smooth, only with the added difficulty of needing a third-party library to implement the spec properly.
    I guess the main problem is that I don't really know how close or far I am from success. Like I said, I think I am doing something wrong in how the video is encoded, but for all I know I'm being dropped because I'm feeding the data too quickly or slowly. (I can't find any documentation on how much data Azure can buffer or how it handles underruns when the video to be streamed live is generated on the fly and not tied to strictly realtime output.) If there's more documentation on specifically how Azure uses or expects data from these protocols, it'd be very helpful.
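    For what it's worth, here is a minimal Java sketch of the long-lived chunked POST described above, using HttpURLConnection; the ingest URL and the local fragments.ismv file are placeholders rather than the real Azure endpoint format, and the chunk size is only an example.

    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ChunkedIngest {
        public static void main(String[] args) throws Exception {
            // Placeholder ingest URL and local fMP4 source file.
            URL ingest = new URL("http://example-channel.example.net/ingest.isml/Streams(video)");
            HttpURLConnection conn = (HttpURLConnection) ingest.openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/octet-stream");
            // Send the body with Transfer-Encoding: chunked; 64 KB per chunk is an arbitrary choice.
            conn.setChunkedStreamingMode(64 * 1024);

            try (InputStream src = new FileInputStream("fragments.ismv");
                 OutputStream out = conn.getOutputStream()) {
                byte[] buf = new byte[64 * 1024];
                int n;
                while ((n = src.read(buf)) != -1) {
                    out.write(buf, 0, n);   // each write/flush goes out as an HTTP/1.1 chunk
                    out.flush();
                }
            }
            System.out.println("Ingest endpoint answered: " + conn.getResponseCode());
        }
    }

    Capturing what this sends and comparing it against a working Expression Encoder session is one way to narrow down whether the resets come from the transfer encoding or from the payload itself.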

  • I recently purchased an Onkyo stereo receiver with a wireless USB adapter, UWF-1, and want to stream audio from my MacBook Pro through my stereo system. The receiver tells me it is connected to my wireless LAN, but I cannot get music to play?

    New to the community here, thanks for your patience.  I recently purchased an Onkyo TX-8050 Network Stereo Receiver with the UWF-1 USB adapter with the intent to stream audio from my MacBook Pro to the stereo. I have gone through the setup instructions for the adapter and the receiver gives me a message saying that it is connected to the password protected network, but that is as far as I can get.
    I have gone into iTunes and clicked on the "Open Stream" option, but I just get a pop-up box labeled "URL" that is empty. I have Home Sharing turned on and File Sharing turned on, but I am not seeing the receiver in my Finder window or as an iTunes device.
    I would appreciate any and all advice and assistance.
    Thanks so much.
    Dan

    So I have been searching the iStore for AirPlay and I find lots of information about what it is, but nothing about how much it costs, how/where to buy it, etc.
    Fortunately, AirPlay does not cost anything, as it is a streaming protocol used by Apple. Currently, the AirPort Express and Apple TV are AirPlay-ready. A number of other companies, like JBL, iHome, Denon and Klipsch, are also rumored to be working on AirPlay versions of their products.
    Your other option would be to get an AirPort Express, connect it to your Onkyo's analog or optical digital audio input, and stream from iTunes that way. I currently do this with my 10+ year old Harman Kardon HT receiver.

  • How to stream from iPad to Apple TV via AirPort Express, with the AirPort Express being connected to the internet

    I have an Apple TV, iPad 3 and AirPort Express. I'm at school and the Apple TV is technically not supported by the network. I've read that one could use AirPlay to the Apple TV via the AE.
    Can someone help me make this work? A step-by-step process if possible.
    Thanks in advance,

    The router does provide Wi-Fi, but the thing is the Apple TV will be out of reach. I want to avoid having two routers / networks. That's why the Apple TV is hardwired. The AirPort (also hardwired) should only work as a receiver for the iPhone, streaming iPhone content to the Apple TV.
    BOTH the Apple TV and the AirPort Express are AirPlay "speakers." That is, you can stream from iTunes to either device. You don't need both if the Apple TV is hardwired to an existing router; that provides the necessary networking for AirPlay to stream over. Wi-Fi is NOT required for AirPlay if you have a wired connection between the AirPlay "server" (i.e., the iTunes host) and the AirPlay "speaker."
    The Apple TV would work as a "receiver" for your iPhone.
    If AirPlay is application-layer software, how may two AirPlay devices communicate directly? Normally all communication is relayed by the access point.
    AirPlay is a streaming protocol that runs over TCP/IP. AirPlay "speakers" DO NOT communicate with each other directly. The basic communication path is between an AirPlay "server" and an AirPlay "speaker" or multiple "speakers." Again, the path between them can be either wired or wireless or some combination of the two.

  • How to play live streaming in a Windows Phone 8.1 Silverlight app?

    Hi,
    I am developing a Windows Phone 8.1 Silverlight app. In my app I want to show a live streaming YouTube channel; I think it is not possible.
    Actually that YouTube channel is a television channel, and I want to stream that live TV in my app. (I tried to load the YouTube channel in a web browser in an iframe tag, but it does not open.)
    Can anybody help me play live TV or live streaming in my app?
    Thanks..
    Suresh.M

    Hello,
    You will likely need to write a custom MediaStreamSource that can access the media stream and parse it. Windows Phone supports H.264 natively, and as long as the site serves up a media stream that contains H.264 frames you can parse it and have our built-in decoder decode and display it. You will need to have intimate knowledge of the streaming protocol used by the website that you are trying to play. You must also make sure that the website is not using protected content. IANAL, so I would recommend that you have your local law professional (lawyer) review the licensing of the website that you are attempting to connect to and stream from before continuing your development. Your lawyer can make sure their licensing allows you to do this.
    I hope this helps,
    James
    Windows SDK Technologies - Microsoft Developer Services - http://blogs.msdn.com/mediasdkstuff/

  • Apple TV stream is lagging

    I have a 2013 MacBook Air and an Apple TV 3.
    My Wi-Fi connection is 14 Mbit/s in both download and upload.
    I have some trouble when using the Apple TV to mirror my MacBook Air desktop onto my Samsung TV. Mostly I am trying to stream some kind of show in a browser, but there are always huge lags on the TV, which is extremely annoying and therefore makes the Apple TV worthless. I had the same issue when I had an Apple TV a couple of years ago at my old apartment, so I sold it. But now I have moved, got a new laptop and a new internet connection, so I thought everything would be good, but it isn't.
    What can be the issue? Do I need to change my router, internet speed, etc.? My friend has the AirPort Time Capsule and everything runs really smoothly at his place when he is streaming something onto his TV, but the Time Capsule is way out of my price range.
    If I need a new router, what kind would be the sure choice to get this working?
    The picture is from Network Utilities.

    Here are my analysis and test results:
    Tested with the following to find the root cause...
    Test 1:
    1. Changed my router to an Asus AC router (RT-87U) and am getting excellent bandwidth (50 Mbps with AT&T U-verse).
    2. Connected the Apple TV to the 5 GHz band on the router.
    3. Connected the MacBook Pro (OS X Mavericks) to the same band on the router.
    4. Switched the MacBook Pro to the Apple TV (AirPlay display with mirroring).
    Result:
    Observed lag during video streaming. Audio seems to be OK, though I found a few instances where there were breaks in the audio signal.
    Test 2:
    1. Connected an iPhone 6 Plus to the same 5 GHz band on the router.
    2. Switched streaming from the iPhone to the Apple TV.
    Result:
    No lag observed; both audio and video streaming seem to be working OK.
    Test 3:
    1. Switched to the MacBook Pro (OS X Yosemite) and connected to the Apple TV.
    2. Ran the same streaming video and audio test.
    Result:
    The video almost freezes for a few minutes and the result is the worst of the three. The Apple TV is unusable for video streaming with OS X Yosemite on the MacBook Pro.
    Notes:
    1. Tested Netflix and YouTube directly from the Apple TV and they work fine without any issues in HD mode.
    2. Tested TV channels (ABC News) from the Apple TV and found no issues using my network.
    My Conclusion:
    Apple TV / iPhone works fine.
    Apple TV / MacBook Pro doesn't work well with video streaming (periodic lags in the video signal).
    The problem is not due to any issue with the router or network but is clearly with the streaming protocol between the MacBook and the Apple TV.
    Clearly Apple needs to resolve this.
    Hope this helps.

  • AppleTV 2G - horrible streaming performance!

    I currently have 3 of the original Apple TVs and have never had any issues streaming movies and TV shows from my iTunes library on my Mac Pro to these devices. Yesterday I replaced all of them with the new 2G ATV and now streaming content from my Mac Pro is unwatchable. Just trying to stream a TV show took about 5 minutes for the program to start; then, after playing 3 or 4 minutes, the show would stop and start buffering again, and then I couldn't get it to resume playing.
    I am so disappointed with these devices! Is anyone else having issues like this? Content streaming from Netflix is fine. I am thinking it has something to do with the change to the Apple Home Sharing interface instead of whatever protocol the 1G units were using to connect to the iTunes library.
    Ryan

    Ryan Nowlin wrote:
    Tim - I have been streaming with AppleTV 1st gen via 802.11g through my Airport Extreme base station for 3 years and performance has NEVER been an issue. After connecting my new 2nd gen AppleTV to iTunes via Home Sharing, it is unusable. I also tried connecting the ATV directly into the base station via Ethernet and the performance was no better, so the issue is not network bandwidth. Something has changed with how they are doing the streaming -- my guess is that it's the Home Sharing streaming protocol vs. however it used to be done directly through iTunes.
    My understanding is that home-sharing is really just a difference in how the Apple TV (or any other computer with iTunes) finds the computers and establishes that it's ok to share content. Previously, it couldn't be assumed that another computer with iTunes sharing enabled was owned by the same owner. You needed a password to connect to the share, and even then, that just gave you the right to 'stream' but not copy. Home-sharing establishes that the content is owned by the same iTunes account holder... thus you can select from any home-share source and not just stream... but copy or sync as well (although the new Apple TV doesn't have enough local storage to copy & sync so it uses its storage as a filesystem cache for streamed content.) Apart from how it establishes the 'right' to stream, I thought the streaming protocol was the same. (BTW, I do not know this for certain.)
    Operating on the assumption that nothing is broken (yeah, I know... but we have to start somewhere.) My first 'guess' would be network (but it sounds like you've isolated that by comparing to the 1st Gen Apple TV) and my second 'guess' is that the source computer is too busy to stream. For example... I have a Mac Mini with a 2TB external firewire disk connected. My purchased iTunes content lives there. I recently purchased a season pass for a show and decided to start watching an episode. It was horrible. I discovered that (a) the Mac Mini was downloading everything it could from Apple. All the while, as soon as a new episode finished loading, it would 'sync' the show to a 1st Gen Apple TV upstairs (which wasn't in use... but it still syncs to it periodically as new content shows up.) Meanwhile, Time Machine decided to start a backup and since the files are huge, well... you can see where this is going. Basically I'm downloading these 2GB files while syncing a 2GB file to an Apple TV while backing up a 2GB file to a NAS device on the network (e.g. Time Capsule) WHILE trying to play and watch the episode. It's no surprise performance was poor. The home network was completely saturated and the disk on the Mac Mini wasn't keeping up.
    There's quite a number of people complaining of the issue here -- so it's entirely possible that this is just a bug that manifests under certain circumstances and nobody has managed to pin down what causes it. Not everyone is having the issue and I've yet to read a review where the reviewers ran into this problem. That leaves us with (a) some Apple TVs are defective, (b) all Apple TVs are the same but there's a bug that manifests itself only under certain conditions or (c) there is no bug but some people are experiencing predictable performance problems due to load on their network or load & performance on their host computer.
    In the meantime... I'd suggest trying to isolate the issue by carefully scrutinizing your iTunes server. Check to see if you've got backups or anything else happening while you are also trying to stream to Apple TV. Do you still get the issue when backups are disabled and you aren't downloading any new content from Apple to the iTunes client while the iTunes client is trying to stream to the Apple TV?
    You might also open the 'Console' utility on the Mac and watch the "All Messages" aggregated list to see if iTunes is complaining of anything while you stream. Warning: If you're not familiar with the 'Console' utility, don't be alarmed when you see lots of messages related to various things... no, your Mac is NOT about to explode. It logs every little thing and most of the messages in the logs are harmless. It cleans its own logs from time to time so you shouldn't try to 'manage' anything... look but don't touch.

  • Streaming Mac content to iPad

    Hi
    I've spent time searching for a simple solution, but so far all I have found is Apple TV. I was hoping to avoid shelling out another £100. I want to stream content from my Mac to my iPad 2 while in my house, over my home Wi-Fi network. I'd also like to be able to stream content from the Mac to my Samsung internet TV (which is connected to the network via an Ethernet cable connected to the router).
    Is there a simple solution?
    Thanks
    Neil

    I want to stream content from my Mac to my iPad 2 while in my house over my home Wi-Fi network.
    Just use iTunes Home Sharing to stream all your Mac iTunes content to your iPad.
    http://support.apple.com/kb/HT3819
    I'd also like to be able to stream content from the Mac to my Samsung internet TV
    That depends on what media formats your TV supports and what streaming protocols it uses. Most internet TVs use the UPnP/DLNA protocol, and for that you'd need a DLNA server running somewhere on your network (Mac or NAS drive) that 'serves' your media content to DLNA-compliant devices.
    Personally, I have used Playback: http://www.yazsoft.com/products/playback/

  • Will upgrading to wireless N make iTunes streaming faster?

    I have an AirPort Express (wireless N). My laptop and router are wireless G. When I stream an iTunes song, the first song always starts a few seconds into the song. If I upgrade everything to wireless N, will it start faster? If so, do I only need to upgrade my router or do I also need a wireless N adapter for the laptop?

    Of course, it is impossible to guarantee that upgrading will definitely resolve the issue.
    It's important to step back and understand the basics of how iTunes streams to the AirPort Express Base Station (AX) to see where there may be potential issues.
    The AX works only with iTunes v4.6+ and is limited to music files that iTunes can read; i.e., 16-bit data only. (An exception is that you can use a third-party product like Rogue Amoeba's Airfoil to stream other non-iTunes sources.) These data, though, can be in any file format that iTunes recognizes, from lossy MP3s at the low-quality end of the spectrum to Apple Lossless and lossless AIFF or WAV files at the high end. It is also important to note that the AX functions only at a 44.1 kHz sample rate. When you play 32 kHz or 48 kHz data, iTunes sample-rate-converts the data in real time before sending it to the AX.
    iTunes uses a QuickTime codec to convert audio files to Apple Lossless, and then uses AirTunes/AirPlay to send them to the AX. In turn, the AX uses built-in software that converts the Apple Lossless to an encoded digital audio format. From there, the digital audio is sent to an optical transceiver to convert the electrical signal to an optical one before sending it to the innermost part of the audio port. For analog, the AX has a built-in DAC to convert the encoded digital audio to analog, which is sent to the same audio port.
    One operational glitch is the fact that, as the AX doesn't have a local clock circuit, when the incoming data is interrupted, as it is when you change songs in iTunes, there is no longer a digital output to feed the DAC, which loses lock as a result.
    ... but how does iTunes/AirTunes/AirPlay actually stream to the AX?
    It's actually a two-step process: 1) Establish an encrypted connection, and 2) Stream
    Upon startup, iTunes initiates an RTSP connection to the AirPort on port 5000. The AX replies with a response (Apple-Response), which is the challenge encrypted using the private key stored in the AX. iTunes verifies this value using the public-key part of the asymmetric key pair. Note that this step is performed by iTunes to verify that it is talking to a real AX. The connection is torn down after this exchange. Note: it could be at this point that you are seeing the delay.
    Once iTunes establishes a connection to the AX, AirTunes/AirPlay uses the Remote Audio Access Protocol (RAOP) to stream music to the AX. RAOP is based on the Real Time Streaming Protocol (RTSP), but with an extra challenge-response based authentication step. In turn, RAOP uses two channels for streaming music: a control channel which uses RTSP and a data channel for sending the raw data. Periodically, iTunes communicates with the AX on the control channel to verify that it still has an active connection.
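    To make the first step concrete, here is a rough Java sketch of just that challenge exchange: open a TCP connection to port 5000 on the AX and send an RTSP OPTIONS request carrying a random challenge in an Apple-Challenge header, then read back the response, which should carry the Apple-Response header that iTunes verifies. The hostname is a placeholder, the header name is based on how RAOP is commonly described rather than any official documentation, and this is nowhere near a complete RAOP client; it only illustrates the RTSP-style handshake.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.OutputStream;
    import java.net.Socket;
    import java.nio.charset.StandardCharsets;
    import java.security.SecureRandom;
    import java.util.Base64;

    public class RaopChallenge {
        public static void main(String[] args) throws Exception {
            String host = "airport-express.local";   // placeholder hostname for the AX
            byte[] nonce = new byte[16];
            new SecureRandom().nextBytes(nonce);     // random challenge, sent base64-encoded

            try (Socket s = new Socket(host, 5000)) {
                String request =
                    "OPTIONS * RTSP/1.0\r\n" +
                    "CSeq: 1\r\n" +
                    "Apple-Challenge: " + Base64.getEncoder().encodeToString(nonce) + "\r\n" +
                    "\r\n";
                OutputStream out = s.getOutputStream();
                out.write(request.getBytes(StandardCharsets.US_ASCII));
                out.flush();

                // Expect "RTSP/1.0 200 OK" plus an Apple-Response header, which iTunes
                // checks against the device's public key before streaming begins.
                BufferedReader in = new BufferedReader(
                    new InputStreamReader(s.getInputStream(), StandardCharsets.US_ASCII));
                for (String line; (line = in.readLine()) != null && !line.isEmpty(); ) {
                    System.out.println(line);
                }
            }
        }
    }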

  • Streaming/playing movies with Quicktime

    Whenever I click on a movie clip, the computer starts downloading the movie instead of just playing it. How can I tell QT to just play the clip without downloading/saving it?
    Thank you.

    I would click on a certain clip and it would play in WM or QT (without being saved).
    Whether or not a cached movie can be saved is another matter. Just because a movie is cached for playback does not necessarily mean it can be saved as an independent, self-sourced file, QTM, or reference file. As far as viewing on a PC is concerned, you have yet to indicate whether the files you were viewing were in fact real-time streaming or being temporarily cached the same as in QT, or whether the files you are viewing in QT are real-time streaming, being cached as progressive downloads for "fast playback," actually being saved as "standalone" files (which requires a manual save on your part), or merely QTML/Ref/Clippings which cache locally but actually reference data that remains online. Both the Mac and PC, as far as I know, only cache the data temporarily until the cache is filled and then reuse the cache memory applying a FIFO strategy. Under Leopard, for instance, I allocate 4 GB of memory to caching movies so I can recall them without having to return to their source URL and reload the data.
    What exactly is an RTSP file?
    Real Time Streaming Protocol (RTSP) "is a network control protocol... used to control streaming media servers." (See RTSP on Wikipedia.) Most QT content is dispensed online using normal (non-RTSS) servers to progressively download files for "fast start" playback. That is, the files load to your computer, but in such a way that you don't have to wait for the entire file to download before you can begin playback. On the other hand, RTSP controls the dispensation of "packets" of data from special Real Time Streaming Servers (RTSS) that play immediately on your computer and are, for all intents and purposes, gone from the computer afterwards. These types of files cannot be saved locally and must be screen-captured if you want to copy them. These true streaming files are the type of files found on most news, sports, and similar commercial web sites. WMV and Real content were specifically designed for this purpose. QT was designed more to emulate the manner in which files are physically stored on magnetic or optical media, as files of data rather than packets of data.
    Not a specialist/technician speaking here, so I hope this relatively simplistic explanation suffices. That is, there are two ways of serving data and each is somewhat different. Most users prefer non-streaming data if they intend to save it (e.g., those who download and save movie trailers). Most site managers prefer streaming data when they want to prevent users from having the ability to save the content (e.g., news, sports, music radio, etc. web sites).

  • Live streaming on iOS using FMS

    Dear All,
    I have made an application using CS5.5 and Java with Struts 2 for live streaming through FMS.
    The architecture is like this:
    The Flash player gets the stream from the camera and then transfers the stream to FMS. This Flash player component is embedded in an HTML page.
    The Flash player contains code for the whole workflow: making the connection to FMS, interacting with the user via JavaScript, and capturing the live stream from the camera. This application works well on IE and Mozilla.
    But it fails on iOS, as iOS does not support the Flash player. I have gone through many solutions provided on different forums, which suggest using Adobe AIR, making the connection with the HTTP Live Streaming protocol, and then packaging the solution for iOS. But from my point of view this would work as an app for iOS rather than as a browser-based application, as in my previous case. Can anyone suggest an alternate solution to get live video streaming working in the iOS browser? If I am wrong about how FMS works with iOS via the Flash player or Adobe AIR, please let me know.
    Thanks
    Nitesh Kumar

    I realize that using JW Player might not be the exact solution you are looking for (it sounds like you're trying to do something customized for a particular need), but using JW Player can help you test your FMS set-up to make sure everything is working. We've used JW Player to stream multi-bitrate live events to both Flash and iOS devices at the same time.
    > So how can we make a connection with FMS and publish a stream through JW Player?
    Step 1: Set up Flash Media Live Encoder and create the live streaming manifest files for JW Player. These instructions are taken from this page:
    http://help.adobe.com/en_US/flashmediaserver/devguide/WSd391de4d9c7bd609-52e437a812a3725dfa0-8000.html
    FMS live streaming instructions:
    Install Flash Media Live Encoder (FMLE) on the encoding computer (the computer that will be encoding the live stream): http://www.adobe.com/eeurope/products/flash-media-encoder.html
    Open the Flash Media Live Encoder rootinstall\conf\config.xml file in a text editor.
    Windows is C:\Program Files\Adobe\Flash Media Live Encoder 3.2
    Mac OS is Macintosh HD:Applications:Adobe:Flash Media Live Encoder 3.2.
    Set the tag //flashmedialiveencoder_config/mbrconfig/streamsynchronization/enable to true.
    Save the file.
    Open FMLE and set up a single-bitrate stream.
    Format: Click the wrench and change to the appropriate settings: Profile (Main), Level 4.1, Keyframe frequency of 4 seconds or a multiple of <FragmentDuration> in the applications/livepkgr/events/_definst_/liveevent/Event.xml file. The default value of <FragmentDuration> is 4000 milliseconds.
    FMS URL: rtmp://localhost/livepkgr
    Stream: livestream?adbe-live-event=liveevent&adbe-record-mode=record
    i.     IMPORTANT: &adbe-record-mode=record query is needed to avoid the problems discussed in the threads below.
    ii.     See http://forums.adobe.com/thread/959974
    iii.     http://forums.adobe.com/message/4311876
    iv.     http://forums.adobe.com/thread/981286?tstart=0
    Click Start.
    Test streaming: http://www.osmf.org/configurator/fmp/:
    i.     http://localhost/hds-live/livepkgr/_definst_/liveevent/livestream.f4m
    ii.     iPad: http://localhost/hls-live/livepkgr/_definst_/liveevent/livestream.m3u8
    Publish and play live multi-bitrate streams over HTTP
    Live multi-bitrate streams:
    Edit rootinstall\conf\config.xml file (step 2 above) on the FMLE computer.
    On the streaming server: go to FMS 4.5/applications/livepkgr/events/_definst_/liveevent
    a. Remove the Manifest.xml file from the liveevent folder or rename it.
    b.     Browse to rootinstall/applications/livepkgr/events/_definst_/liveevent and edit the Event.xml file to look like the following:
    <Event>
      <EventID>liveevent</EventID>
      <Recording>
        <FragmentDuration>4000</FragmentDuration>
        <SegmentDuration>16000</SegmentDuration>
        <DiskManagementDuration>3</DiskManagementDuration>
      </Recording>
    </Event>
    3. Create the f4m and m3u8 manifest files: NOTE: JW player needs a .smil file, not an f4m manifest.
    a. On the FMS 4.5 computer, open rootinstall/tools/f4mconfig/configurator/f4mconfig.html in a browser.
    b. F4M manifest file:
    Stream uri: livestream1.f4m 150kbps, livestream2.f4m 500kbps, livestream3.f4m 700kbps.
    Base uri: http://localhost/hls-live/livepkgr/_definst_/liveevent (replace localhost with FMS internal/external IP address).
    save the file as liveevent.f4m to rootinstall/webroot or on the webserver.
    JW Player does not support f4m files. Must use a smil file for this.
    c. m3u8 manifest file: Open rootinstall/tools/f4mconfig/configurator/f4mconfig.html in a browser.
    http://localhost/hls-live/livepkgr/_definst_/liveevent/livestream1.m3u8 150kbps
    http://localhost/hls-live/livepkgr/_definst_/liveevent/livestream2.m3u8 500kbps
    http://localhost/hls-live/livepkgr/_definst_/liveevent/livestream3.m3u8 700kbps
    Save file as liveevent.m3u8:
    #EXTM3U
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=150000
    http://localhost/hls-live/livepkgr/_definst_/liveevent/livestream1.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=500000
    http://localhost/hls-live/livepkgr/_definst_/liveevent/livestream2.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=700000
    http://localhost/hls-live/livepkgr/_definst_/liveevent/livestream3.m3u8
    4. Publish live streams to FMS:
    a. Start FMLE.
    b. Click the wrench next to Format, open Advanced Encoder Settings, and set Keyframe frequency to 4 seconds. This value matches the <FragmentDuration> value in the applications/livepkgr/events/_definst_/liveevent/Event.xml file. The <FragmentDuration> value is in milliseconds.
    c. FMS URL field: rtmp://localhost/livepkgr
    d. Stream field: livestream%i?adbe-live-event=liveevent OR livestream%i?adbe-live-event=liveevent&adbe-record-mode=record. Change liveevent to whatever name you want (e.g. Test).
    Flash Media Live Encoder uses the variable %i to create multiple stream names: livestream1, livestream2, livestream3
    To use another encoder, provide your own unique stream names, for example, livestream1?adbe-live-event=liveevent, livestream2?adbe-live-event=liveevent.
    PUBLISH URL: rtmp://localhost/livepkgr/livestream%i?adbe-live-event=liveevent
    IMPORTANT: use the following query parameter: &adbe-record-mode=record
    5. Test live streams:
    a. Flash:
    Open rootinstall/samples/videoPlayer
    In Video Source, enter the following: http://localhost/liveevent.f4m
    b. iOS: Open Safari.
    Open http://localhost/liveevent.m3u8
    Once you create the live streams and the smil and .m3u8 manifest files following the steps described above, set up a test page using JW player (see their documentation and download the files referred to below from their page).
    This is the basic set-up that we use to stream live multi-bitrate live events (using the smil and m3u8 manifest files created above):
    1. Use JW Player with its JavaScript file: put <script type='text/javascript' src='jwplayer/jwplayer.js'></script> in the <head> section of the page. I've renamed the JavaScript file jwplayer.js (look in the JW folder to find the appropriate js file).
    2. Embed the player into the page: put the following code in the <body> of the page:
    <div id="jwplayer"></div>
    <script type="text/javascript">
    jwplayer('jwplayer').setup({
      'autostart': false,
      sources: [
        { 'file': "Manifests/liveevent.smil" },
        { 'file': "Manifests/liveevent.m3u8" }
      ],
      rtmp: {
        'bufferlength': 5
      },
      'width': "640",
      'height': "480",
      'image': "Splash/live-poster.png",
      'primary': "flash",
      'stretching': 'exactfit'
    });
    </script>
    Sample .m3u8 manifest file: (replace localhost with your ip, Test with whatever you named the stream in FMLE).
    #EXTM3U
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=200000
    http://localhost/hls-live/livepkgr/_definst_/Test/livestream1.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=400000
    http://localhost/hls-live/livepkgr/_definst_/Test/livestream2.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=800000
    http://localhost/hls-live/livepkgr/_definst_/Test/livestream3.m3u8
    Sample .smil file:
    <smil>
              <head>
                        <meta base="rtmp://localhost/hds-live/livepkgr/_definst_/Test/"/>
              </head>
              <body>
                        <switch>
                                  <video src="livestream1.f4m" system-bitrate="200000" />
                                  <video src="livestream2.f4m" system-bitrate="400000" />
                                  <video src="livestream3.f4m" system-bitrate="800000" />
                        </switch>
        </body>
    </smil>
    You have to follow Adobe's FMLE instructions (linked above) and the JW documentation to get this to work.
    http://www.longtailvideo.com/support/forums/jw-player/setup-issues-and-embedding/22581/embedding-jw-player-and-using-with-fms-45-livestreaming/
    Good luck.
