Steps to Publish Live Audio for HTTP Streaming
I wanted to confirm some steps for streaming live audio from FMS to a mobile device using HTTP streaming and HTML5 audio/video tags. My setup is as follows:
Let's say the application that receives the live audio is called CLASSROOM, and its application instance is called "TEST".
So I have application code like:
In onAppStart():
this.broadcast_live_stream = Stream.get("f4f:livestream");
this.broadcast_live_stream.liveEvent = "TEST";
this.broadcast_live_stream.record("record");
To play the incoming audio:
application.broadcast_live_stream.play(streamName); // name of the stream sending audio to the server
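For reference, the two snippets above could be combined into a minimal main.asc along these lines. This is a sketch: the stream and event names are the ones assumed in this post, and the onPublish hook is an assumption about where the incoming stream name becomes available (the livepkgr application that ships with FMS contains a more complete version of this pattern):

```actionscript
// main.asc (sketch) -- repackages an incoming live stream as an f4f
// live event so it can be served over HTTP (HLS/HDS).
application.onAppStart = function () {
    // The "f4f:" prefix tells the server to record the stream in
    // fragmented-MP4 format for HTTP streaming.
    this.broadcast_live_stream = Stream.get("f4f:livestream");
    this.broadcast_live_stream.liveEvent = "TEST";
    this.broadcast_live_stream.record("record");
};

application.onPublish = function (client, stream) {
    // Re-publish whatever the client publishes into the recorded
    // live-event stream. stream.name is the publisher's stream name.
    application.broadcast_live_stream.play(stream.name);
};
```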
Thus, I have a folder:
Flash Media Server 4.5\applications\CLASSROOM\streams\TEST\livestream
which contains bootstrap, control, meta, f4f, and f4x files.
Since I specified the event name as "TEST", I also created the folder:
Flash Media Server 4.5\applications\livepkgr\events\_definst_\TEST
which contains event.xml:
<Event>
<Recording>
</Recording>
</Event>
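For comparison, the sample event.xml that ships with livepkgr has roughly this shape; the EventID is expected to match the liveEvent name set in the application code, and the duration values below are illustrative, not required settings:

```xml
<Event>
  <!-- EventID should match the liveEvent name ("TEST" in this post) -->
  <EventID>TEST</EventID>
  <Recording>
    <!-- fragment/segment lengths in milliseconds (illustrative values) -->
    <FragmentDuration>4000</FragmentDuration>
    <SegmentDuration>16000</SegmentDuration>
    <!-- how long recorded content is kept, in hours (illustrative) -->
    <DiskManagementDuration>3</DiskManagementDuration>
  </Recording>
</Event>
```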
and manifest.xml:
<manifest xmlns="http://ns.adobe.com/f4m/1.0">
</manifest>
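The live manifest normally also declares the stream it describes. A minimal manifest.xml for this setup might look like the following, assuming the stream name "livestream" from above; the bitrate value is illustrative (it matters mainly for multi-bitrate setups):

```xml
<manifest xmlns="http://ns.adobe.com/f4m/1.0">
  <!-- streamId must match the published stream name; bitrate in kbps -->
  <media streamId="livestream" bitrate="150" />
</manifest>
```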
Should I then be able to create a video tag like the following and hear the live audio?
<video id='slide1' controls='controls'><source src='http://fms-dns/hls-live/livepkgr/_definst_/TEST/livestream.m3u8'></video>
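Since the stream is audio-only, an audio tag may be a closer fit. A hedged example follows, assuming the hls-live URL takes the form /hls-live/<application>/<instance>/<event>/<stream>.m3u8 and that the event folder lives under the CLASSROOM application (rather than livepkgr, as in the folder path above):

```html
<!-- Audio-only playback of the live event; URL path segments are
     application / instance / event / stream name (assumed layout). -->
<audio id='slide1-audio' controls='controls'>
  <source src='http://fms-dns/hls-live/CLASSROOM/TEST/TEST/livestream.m3u8'
          type='application/x-mpegURL'>
</audio>
```

As a quick sanity check, requesting the .m3u8 URL directly in a browser or with curl and confirming it returns a playlist starting with #EXTM3U tells you whether the packaging side is working before involving a player.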
I am not sure what other steps are needed, or what a good way to test would be. I also don't understand whether Flash Media Live Encoder is needed in this situation.
