Participant
July 13, 2015
Question

How to encode mp4 files to work with livepkgr


Hi guys,

I have two servers, Server 1 with FMS 4.5 and Server 2 with AMS 5, both on Ubuntu 14.04. On Server 1 I built a custom application; Server 2 uses just the livepkgr application for serving clients.

My application on Server 1 receives 3 streams from FMLE. In this application I have a "chain" of streams that handles each stream independently. At the end of the stream chain I republish each stream to the livepkgr application on Server 2.

So far this is working great with live streams; I can serve RTMP, HTTP, HLS and HDS.

On Server 1, somewhere in the stream chain, I take a stream and play an mp4 file into it. The livepkgr application on Server 2 handles this "switch" well over RTMP, but it doesn't record the stream during the switch, so on mobile I don't see it... I just see a green screen until the file ends and the stream that switched to the file goes back to live.

I tried encoding a file with Adobe Media Encoder, H.264 Main profile at Level 3.2, and it works great on mobile too.

I tried to encode the same file with ffmpeg, HandBrake... and it doesn't work on mobile.

So I need a way to re-encode files on Linux such that the output files work with the AMS livepkgr application.
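For reference, here is a minimal sketch of what such a re-encode could look like with ffmpeg on Linux, mirroring the Adobe Media Encoder settings that worked for me (H.264 Main profile, Level 3.2, AAC audio, fixed keyframe interval). File names and bitrates are placeholders; the first command just synthesizes a short test clip so the example is self-contained:

```shell
# Synthesize a 2-second test clip to stand in for the real source file.
ffmpeg -y -v error -f lavfi -i testsrc2=duration=2:size=640x360:rate=30 \
       -f lavfi -i sine=frequency=440:duration=2 \
       -c:v libx264 -c:a aac -shortest source.mp4

# Re-encode to H.264 Main profile, Level 3.2, with AAC audio and a fixed
# GOP length (a regular keyframe interval helps the packager cut clean fragments).
ffmpeg -y -v error -i source.mp4 \
       -c:v libx264 -profile:v main -level 3.2 -pix_fmt yuv420p \
       -g 60 -keyint_min 60 -sc_threshold 0 \
       -c:a aac -b:a 128k -ar 44100 \
       -movflags +faststart \
       output.mp4
```

Note that older ffmpeg builds (circa 2015) may need `-strict experimental` to enable the native AAC encoder, or `-c:a libfdk_aac` if compiled in.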

Thanks.

    This topic has been closed for replies.

    1 reply

    Inspiring
    August 3, 2015

    How exactly do you do the relay, i.e. republish to the second server's livepkgr?

    sshlord (Author)
    Participant
    August 3, 2015

    Well, it's an easy job, though it took me some time to figure out. I found the solution here on the forum.

    Assume that you have a multi-bitrate stream, so you have 3 streams with 3 distinct names. Create a function on the server side and pass every one of the streams to it.

    // Server-Side ActionScript: grab the local stream and republish it
    // to the remote livepkgr application.
    theStream1 = Stream.get("stream1");

    application.nc1 = new NetConnection();
    application.nc1.connect("rtmp://example.com/livepkgr");
    application.nc1.onBWDone = function(){};

    application.ns1 = new NetStream(application.nc1);
    application.ns1.onStatus = function(info){
        if (info.code == "NetStream.Publish.Start"){
            trace("stream1 is now being republished to livepkgr");
        }
    };
    application.ns1.setBufferTime(5);
    application.ns1.attach(theStream1);
    application.ns1.publish("republishStream1?adbe-live-event=republishEvent&adbe-record-mode=record", "live");

    Inspiring
    August 3, 2015

    Is that server-side app running on the host server or on the edge (Server 2)? Is this scheme resilient, e.g. will the app reconnect on network or encoder issues?

    I am relaying RTMP streams (MBR, encoded by FMLE as Baseline H.264/AAC) to livepkgr like this:

    ffmpeg -v warning -re -i "rtmp://host/live/stream1 live=1" -c copy -flags +global_header -copyts -f flv "rtmp://edge/livepkgr/stream1?adbe-live-event=liveevent"

    ffmpeg -v warning -re -i "rtmp://host/live/stream2 live=1" -c copy -flags +global_header -copyts -f flv "rtmp://edge/livepkgr/stream2?adbe-live-event=liveevent"

    livepkgr generally does the job, but the streams are broken.

    ffprobe -v verbose -i http://edge/live.m3u8

    prints a lot of errors:

    [NULL @ 0000000002d658e0] non-existing PPS 0 referenced

    [hls,applehttp @ 0000000000326f00] Invalid stream index 2

    ...

    [NULL @ 0000000002d658e0] non-existing PPS 0 referenced

    [hls,applehttp @ 0000000000326f00] Invalid stream index 2

    My guess is that livepkgr needs some "Adobe" way to connect/disconnect, i.e. it's a compatibility problem.

    Fortunately, ffmpeg does HLS itself, so I run it like this:

    ffmpeg -v warning -re -i "rtmp://edge/live/stream1 live=1" -c copy -bsf:v h264_mp4toannexb -hls_time 8 -hls_flags delete_segments -hls_wrap 9 -f hls stream1.m3u8

    ffmpeg -v warning -re -i "rtmp://edge/live/stream2 live=1" -c copy -bsf:v h264_mp4toannexb -hls_time 8 -hls_flags delete_segments -hls_wrap 9 -f hls stream2.m3u8
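    To expose the two renditions produced above as a single multi-bitrate HLS stream, a hand-written master playlist can point at both media playlists. This is a sketch; the BANDWIDTH and RESOLUTION values below are placeholders that should match your actual encodes:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
stream1.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=1280x720
stream2.m3u8
```

    Serve this master playlist from the same directory as the variant playlists so the relative URIs resolve.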