
Using FMS v4.5.5 with a browser-based audio player running on iPad/iPhone

  • February 4, 2013
  • 1 reply
  • 2470 views

Hello,

I’m seeking some clarity/guidance on using FMS v4.5.5 (Streaming Edition) with a browser-based audio player running on iPad/iPhone (i.e. under iOS).

Background

I have a Microsoft Windows 2008 R2 Web Server running IIS v7.5 and hosting a website built with ASP.NET and C#. Parts of the website allow a visitor to play a selection of audio clips from our bespoke MP3 library. Due to UK music licensing constraints, we must stream each clip to the client browser – we must not download any part of the source MP3 file(s). We therefore had a custom audio player built for us some time ago using Flash/AS3. This has worked, and continues to work, very well, using RTMP to stream the clips from FMS to the client browser.

As our site has become more established, the number of visitors has steadily grown. In the early days the proportion of visitors on the iOS platform was relatively small but, last month, it had risen to 40%! As Flash is not supported on iOS, this growing group of visitors is unable to take full advantage of the website, which I am now trying to address.

What I’m Trying To Do…

From what I’ve researched and read, I believe that FMS v4.5.5 supports HLS and that I need to use this (instead of RTMP) over HTTP between FMS and the iOS Safari client. I have been experimenting with an open-source player (JPlayer) and a commercial player (JWPlayer), and appear to have these playing an MP3 file as a “progressive download” (I think that’s the right term), since all I’m doing is specifying the file path (no HTTP prefix etc.) – i.e. the audio is not really being streamed as I require. Unfortunately, I’ve not worked out how to configure my FMS to respond to a path request along the lines of:

http://www.myserver.com/live/audio1.mp3

Hopefully someone reading this who is much more expert on FMS, HLS, HDS, HTML5, etc. than me can point me in the right direction!

What I’m confused about…

1) Can all my MP3 assets stay as they are and still be streamed to the iOS Safari client? To the best of my knowledge, the key properties of my MP3 files are CBR, 44,100 Hz, 320 kbps.

2) If not, can they be converted on-the-fly by FMS or do I need to create additional audio assets to support HLS etc.?

3) If I have to manually create additional audio assets, how do I do that (I’ve seen references to tools from both Adobe and Apple but not actually found the kits)?

4) Some of the Adobe documentation I’ve read implies that FMS support for HLS etc. is only available when FMS is used in conjunction with the bundled Apache web server. As stated earlier, my entire site runs on IIS so does that mean to achieve what I want I’ll have to run two web servers on my physical server – IIS for the website itself and Apache for RTMP/HLS/etc.?

5) What changes do I need to make to my current FMS configuration to get HLS to work? I’m sure someone has been here already and can enlighten me with their experience, and hopefully point me to some straightforward documentation to follow!

6) Lastly, and really part of (5), once I’ve got FMS configured correctly, what would be the typical path to a given MP3 audio file that I would give the iOS Safari browser to make this all work?

I appreciate this is rather a lengthy first post but hopefully the background puts my questions (and confusion!) in context. Bottom line is… I simply want to be able to stream audio clips only (no video) to iOS Safari browsers from FMS v4.5.5 running on a Win2K8R2 platform with IIS v7.5, and with no audio content being stored (even temporarily) on the iOS platform.

My thanks and appreciation in advance to anyone who can help me achieve this.

Steve Barker

Business IT Solutions

Correct answer

Hi Apurva,

Thank you for your reply. Unfortunately I am not much further forward, despite having read more Adobe documentation (FMS v4.5 Configuration and Administration Guide, FMS v4.5 Developer’s Guide) and trying different configuration options.

First to confirm a point you made when answering point 4 yesterday. Yes, in my current RTMP scenario I have several MP3 assets (let’s call them Audio1.mp3, Audio2.mp3, Audio3.mp3, etc.) that are made available to a bespoke Flash/AS3 player via the LIVE application. I understand that as I purchased the Streaming Edition of FMS v4.5, a Flash-based player is restricted to using the LIVE application. You go on to say that I “publish” these assets to the LIVE application. I set this up a while back but do not recall having to “publish” anything. All I recall doing was to ensure that all the assets (the .mp3 files) were in a single folder and that the folder was identified by the <Streams> tag in the following file (example shown):

Flash Media Server 4.5\applications\live\Application.xml

    <StreamManager>
        <VirtualDirectory>
            <Streams>/;C:\MP3Library</Streams>
        </VirtualDirectory>
    </StreamManager>

To the best of my knowledge, I didn’t have to do anything else to make the current solution work. Having read more of the documentation I am still none the wiser as to what I need to do (step-by-step) to “publish” my converted MP3 assets (see next paragraph) to the LIVEPKGR application. Please could you explain these steps to me by way of an example.

WRT your answer to point 1 yesterday, I wish to clarify the exact output I need to create when converting my MP3 assets. Am I expected to produce an Audio1.aac file, an Audio1.mp4 file, or an Audio1.m4a file, etc.? Or am I meant to be using ffmpeg (either directly or indirectly) with a much more sophisticated command line that somehow generates stream-related info/files as well? An example of how you would use ffmpeg to convert one of my MP3 assets would be very much appreciated.

Lastly, the coexistence of IIS and Apache (ref your answer to point 2 yesterday). That all made sense but given all the testing/changes I’ve been making I thought it wise to start afresh! So I completely removed FMS from my dev server and re-installed Flash Media Streaming Server v4.5.5 with the bundled Apache v2.2 server. When presented with the dialogue “Would you like for Apache to listen on port 80. If not FMS will be using port 80 instead.” I answered NO. When presented with the dialogue regarding FMS server ports, I entered 1935 NOT 1935,80 (should I have done that as the preamble makes reference to HTTP webserver proxy and HTTP Dynamic Streaming origin services?). I then modified the Application.xml file as described above.

To test the new set-up, I did the following. First, the test environment…

DEV Server

Windows Web Server 2008 R2 SP1

IIS v7.5

ASP.NET v4.0

Flash Media Streaming Server v4.5.5

Apache v2.2

IE v9.0

Adobe Flash Player 11 ActiveX v11.5.502.146

TEST Client

Apple iMac (Intel)

Mac OS X v10.6.8

Safari v5.1.7

Adobe Flash Player 11 Plug-in v11.5.502.146

On DEV Server… launched IE and loaded the home page of my website, which has two elements on it – one to load/run our bespoke Flash player, the other to run JPlayer (configured to use HTTP). I selected our bespoke Flash player and it played an MP3 audio file. I verified this in real time via the FMS Admin Console on the server.

On TEST Client… entered http://192.168.20.250:8134 into Safari – it comes up with the FMS install completed / test your server page. In the middle of the video display area a black box appears saying “10000 OK” – I clicked OK. I then clicked HDS Single Bitrate – the video plays (Not Using Hardware Acceleration is briefly displayed). There is NO activity logged in the FMS Admin Console back on the DEV server under “live”, “livepkgr” or “vod”. I assume this is because I’ve gone straight to the Apache Server via the port reference – is that correct? Next I tried the HLS Single Bitrate – same results.

The HDS Single Bitrate logs the following in the Apache access_log file:

"GET /hds-vod/sample1_1500kbps.f4v.f4m HTTP/1.1" 304 - "-" "Apple Mac OS X v10.6.8 CoreMedia v1.0.0.10K549"

The HLS Single Bitrate logs this:

"GET /hds-vod/sample1_1500kbps.f4v.f4m HTTP/1.1" 304 - "-" "Apple Mac OS X v10.6.8 CoreMedia v1.0.0.10K549"

"GET /hls-vod/sample1_1500kbps.f4v.m3u8 HTTP/1.1" 304 - "-" "Apple Mac OS X v10.6.8 CoreMedia v1.0.0.10K549"

"GET /hls-vod/sample1_1500kbps.f4v.m3u8 HTTP/1.1" 200 816 "-" "Apple Mac OS X v10.6.8 CoreMedia v1.0.0.10K549"

"GET /hls-vod/sample1_1500kbps.f4vFrag1Num0.ts HTTP/1.1" 200 1143416 "-" "Apple Mac OS X v10.6.8 CoreMedia v1.0.0.10K549"

"GET /hls-vod/sample1_1500kbps.f4vFrag1Num1.ts HTTP/1.1" 200 873072 "-" "Apple Mac OS X v10.6.8 CoreMedia v1.0.0.10K549"

"GET /hls-vod/sample1_1500kbps.f4vFrag2Num2.ts HTTP/1.1" 200 1591984 "-" "Apple Mac OS X v10.6.8 CoreMedia v1.0.0.10K549"

etc…

At this point I’m stuck once again since I’m not clear how to construct the JPlayer HTTP reference. You have suggested:

http://192.168.20.250:8134/hls-live/livepkgr/_definst_/livestream/livestream.m3u8

That produces the following in the Apache error_log file:

[error] [client 192.168.20.4] File does not exist: D:/Program Files/Adobe/Flash Media Server 4.5/webroot/hls-live

If I change the JPlayer HTTP reference to:

http://192.168.20.250:8134/hls-vod/sample1_1500kbps.f4v.m3u8

the audio from the video plays.

Also, if I look in the applications\livepkgr folder structure there is only \events (with sub-folder _definst_ and sub-sub-folder liveevent) – there are no _definst_, livestream, or streams folders, etc. So besides not quite understanding the logic/flow here, I seem to be missing a few folders/files too! I’d be very grateful for any light you can shed on all this, plus an example of what my HTTP string should ultimately look like; for example:

http://192.168.20.250:8134/hls-live/livepkgr/_definst_/livestream/Audio1.m3u8

Apologies for such a long reply but I’ve been at this for over 10 hours today! I feel I’m getting closer but clearly there are still some gaps and configuration errors.

Thanks again for your time, support, patience and understanding.

Kind Regards, Steve


Hi Steve,

When you told me you were using the live application I assumed you were doing a live publish of some sort. We generally use the live application for live publishing and the vod application to stream VOD assets such as your MP3 files. This is why I had suggested using the livepkgr folder and the http://192.168.20.250:8134/hls-live/livepkgr/_definst_/livestream/livestream.m3u8 URL format. But now that I have a better idea of your set-up, please forget about that.

You require HTTP streaming of your VOD assets to iOS devices, correct? This is the method you have to use:

1. Install FMS with bundled Apache - Done

2. Have Apache listen on 8134 only and FMS on 1935 - Done

3. Test that everything is correctly installed by playing the sample streams on the home page - Done

4. Test HLS-VOD (HTTP streaming of VOD assets to Apple devices) - In your Safari browser, try to play a stream like http://<server>:8134/hls-vod/sample1_1000kbps.f4v.m3u8

Is the audio-video playback correct?

5. Next, open the <root_install>/Apache2.2/conf/httpd.conf file in a text editor and search for <Location /hls-vod>. Under this Location directive there is a 'HttpStreamingContentPath' entry that points to the folder from which the assets will be picked up. Change that to whatever folder you want and restart FMS (there is a sketch of the edited block after these steps).

6. Now try to play back a stream from this new folder of yours (for testing purposes try an mp4/f4v file) as http://<server>:8134/hls-vod/<stream-name-with-extension>.m3u8. Is the playback correct?

If everything works as expected up to this point, your FMS set-up is ready.
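For step 5, the edited block should end up looking roughly like the sketch below. I'm assuming here that your converted assets will sit in C:\MP3Library (the folder you already map for RTMP); only the HttpStreamingContentPath line changes, and the other directives the installer placed inside <Location /hls-vod> should be left exactly as they are:

    <Location /hls-vod>
        # ... keep the installer's other directives in this block unchanged ...
        # Point the content path at your own media folder (assumed: C:\MP3Library)
        HttpStreamingContentPath "C:/MP3Library"
    </Location>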

Now coming to converting your assets to AAC. I suggest you have a look here: http://superuser.com/questions/370625/ffmpeg-command-to-convert-mp3-to-aac. The basic commands are explained there. I would recommend mp4/f4v. I haven't tried this on my end, so treat the sketch below as a starting point rather than a tested recipe.
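Purely as an illustration (the file names and bitrate are placeholders, and I haven't verified this against your library), a single MP3 could be converted to AAC in an .m4a container with something like:

    ffmpeg -i Audio1.mp3 -vn -c:a aac -b:a 128k Audio1.m4a

Depending on how your ffmpeg build was compiled, you may need to add -strict experimental after -c:a aac, or use -c:a libfaac instead; adjust the bitrate to suit your 320 kbps sources.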

As far as JWPlayer is concerned, a simple .m3u8 URL should be sufficient to play back a stream. You can read about HLS streaming with JW Player here: http://www.longtailvideo.com/support/jw-player/28856/using-apple-hls-streaming/
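As a rough illustration only (I'm assuming a JW Player 6-style embed; the script path, element id, URL and dimensions are all placeholders, and on iOS the player simply hands the .m3u8 to Safari's native HLS support):

    <script src="jwplayer.js"></script>
    <div id="audioPlayer"></div>
    <script type="text/javascript">
        // Sketch: "audioPlayer" and the stream URL are placeholders.
        jwplayer("audioPlayer").setup({
            file: "http://192.168.20.250:8134/hls-vod/Audio1.m4a.m3u8",
            width: 480,
            height: 40
        });
    </script>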

Do try once again and let me know if it works.

Thanks,

Apurva

1 reply

February 5, 2013

Hi Steve,

To answer your questions:

1. Apple recommends the AAC/HE-AAC audio codecs for streaming to iOS devices, so your MP3 files may not all play as desired.

2. FMS currently has no transcoding capabilities, so this cannot be done by FMS on the fly. You'll have to create these assets manually.

3. You can use a free tool like ffmpeg for transcoding. See here: http://superuser.com/questions/370625/ffmpeg-command-to-convert-mp3-to-aac

4. HTTP streaming (HDS/HLS) in FMS works only with the Apache server, as the packaging modules are Apache modules. You will need to run Apache (either the one bundled with FMS or an external one) for HLS.

5/6. For simple, unencrypted streaming you won't have to make any configuration changes. Instead of publishing to the live application, publish to the livepkgr application. If your stream name is livestream then the URL will be:

http://<server>/hls-live/livepkgr/_definst_/livestream/livestream.m3u8

Hope this helps. Let me know if you have any other questions.

Thanks,

Apurva

SteveB-UK (Author)
February 5, 2013

Hi Apurva,

Thank you for your answers.

WRT the audio assets, I believe you are saying that I'll need to manually convert all of them into .aac format (is .m4a the same thing? I've seen references to that here and there) for them to be reliably streamed to iOS devices, and that tools such as ffmpeg (or ConverterLite (www.converterlite.com), which provides a GUI but uses ffmpeg under the bonnet) will help me do this - correct?

WRT FMS and HDS/HLS, I appear to have to use Apache to service these streams. OK, happy to give this a go, although I've never configured/managed an Apache server! However, before I get started, I just want to make sure my thinking is right here. As stated, the entire site is built on ASP.NET, so I'm minded to continue serving the site via IIS v7.5 (does Apache even support ASP.NET?). So HTTP:80 requests will arrive at my server and be handled by IIS. I believe I can intercept these requests using the IIS rewrite functionality, pick off requests to http://<my-server>/hls-live/... and redirect them to the same path on port 81 (say), and configure the Apache server – which would run in parallel to IIS on the same physical server (I only have the one!) – to listen only on port 81. The Apache server would then handle the request, invoking the FMS Apache modules in the process. Is that right, and would that config work?
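To make my thinking concrete, the sort of rule I have in mind would go in the site's web.config and look something like the sketch below (the rule name, paths and port are placeholders, and I understand IIS also needs the URL Rewrite module plus Application Request Routing (ARR) installed, with proxying enabled, before a rewrite to another port is actually forwarded):

    <system.webServer>
        <rewrite>
            <rules>
                <!-- Sketch only: forward HLS requests to the FMS-bundled Apache,
                     assumed here to be listening on port 81 as described above. -->
                <rule name="ProxyHlsToApache" stopProcessing="true">
                    <match url="^(hls-vod|hls-live)/.*" />
                    <action type="Rewrite" url="http://localhost:81/{R:0}" />
                </rule>
            </rules>
        </rewrite>
    </system.webServer>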

Lastly, where do the .m3u8 files fit into all this? Do I have to create these or does this get done for me somewhere?

Bit of a steep learning curve this but I'm up for giving it a shot!

Many Thanks, Steve