I have a need to record and play back (live or recorded) videos with DVR controls. As such, I have downloaded and installed the DVRCast application for Flash Media Server 4.5. The trick is that I need to support videos from two different sources: Flash Media Live Encoder and a VBrick hardware encoder. While it is possible to have FMLE autostart the recording (live playback works properly in this case), I cannot have the VBrick device do the same.
I have spent more time than I care to admit consulting resources all over the internet and trying to configure DVRCast to start recording automatically in the onPublish event and stop recording in the onUnpublish event, but something is always missing. For example, live playback (with a StrobeMedia-based player) does not automatically seek to the live point, and there is no live section on the scrub bar even though the video duration appears to keep increasing. I know this functionality works on the player side because it behaves correctly when streaming live video from Wowza Media Server. Streaming to Flash Media Server from the VBrick yields even stranger behavior: videos played back from that source start 43 seconds in and don't allow seeking to any earlier point, and the reported duration is all over the map (sometimes showing 22 hours when the video is only 30 minutes long).
I tried adding the following to onPublish:
var s = Stream.get("mp4:" + stream.name + ".flv");
if (s == undefined) {
    trace("s is undefined in onPublish");
    return;
}
s.record("append");           // append to any existing recording
s.play(stream.name, -1, -1);  // pull the incoming live stream into the recording
and the following to onUnPublish:
var s = Stream.get("mp4:" + stream.name + ".flv");
if (s == undefined) {
    trace("s is undefined in onUnpublish");
    return;
}
s.record(false);  // stop recording
s.play(false);    // stop the server-side stream
That produced the issues described above, so something isn't right. Alternatively, I tried having onPublish call the setStreamInfo function in ExDVRStream.asc to approximate the "start recording" command, but I couldn't find the right way to call it either.
What am I missing? It really doesn't seem like it should be this hard to do what I'm trying to do and I doubt I'm the only person who's ever needed to do that.
Thanks,
Jim
There are a lot of other things happening in onPublish/onUnpublish and onStartRecord/onStopRecord.
Though I haven't tried it myself, I would suggest you call strm.onStartRecord in onPublish, keeping everything else the same.
Thanks for the replies. I added the following lines:
strm.getStreamInfo();
strm.streamInfo.append = true;
strm.onStartRecord();
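To make it concrete, this is roughly where those lines land in the DVRCast onPublish handler. This is a sketch, not the actual DVRCast source - strm is assumed to be the ExDVRStream object that the stock onPublish already creates for the incoming stream:

application.onPublish = function (client, stream) {
    // ... stock DVRCast setup runs first and creates strm ...
    strm.getStreamInfo();           // load any stored stream info
    strm.streamInfo.append = true;  // record in append mode
    strm.onStartRecord();           // start recording as if the encoder had requested it
};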
The problem now is that if I set the stream type to DVR in the Strobe player, I get the following odd behavior for video from FMLE:
- With autoPlay enabled in the player
- With autoPlay disabled in the player
In addition to those issues, the code changes above don't seem to fix the crazy video durations when recording and playing back from the VBrick (e.g. 268 hours when the video is only a few minutes long). Assuming I could actually get videos to record and play back with DVR controls from FMLE (which I am becoming increasingly doubtful of), is there a way to have the server insert the timecode or whatever else is missing so it doesn't get confused? I've looked at the logs and added more trace statements, and the server appears to be using a reasonable startRec time, so this might be some sort of missing VITC timecode issue.
I still find it extremely hard to believe that I'm the only person trying to record and play back video with DVR controls from FMS without an explicit DVR record start/stop, and that I'm also the only person doing it from video sources other than FMLE. What am I missing? Why is this so hard?
The DVR app from Wowza combined with Wowza Media Server provided this functionality out of the box, and the only reason we're not 100% sold on that as our solution is that we can't get the live latency below 20 seconds, because it can only do HTTP out instead of RTMP out.
Thanks,
Jim
Like Nitin, I haven't tried what I'm about to suggest, but I suspect I know what might be going wrong.
The way it works is that FMLE sends a "streamInfo" object to the DVRCast application - this is the primary thing that needs to happen for DVRCast to work smoothly. What is happening here is that your VBrick encoder is not sending this, so things get a little messed up. Also, if I'm not wrong, the way DVRCast is coded it probably expects the encoder and AMS to be in the same time zone (are they, in your case?).
Since the correct DVRCast workflow isn't clear to you - and it might take someone here quite a while to work out as well, since it's a legacy application - what I would suggest is developing an intermediate application that takes the stream from the VBrick encoder and then pushes it to DVRCast, making sure to set the correct "streamInfo" and call the appropriate handler. That might help things work better than they are right now.
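I haven't tried this either, but a minimal sketch of such a relay, using the server-side multipoint publishing API (NetConnection plus server-side NetStream, available since FMS 3.5), might look like the following. The application name, origin URI, handler name ("DVRSetStreamInfo"), and the streamInfo fields are all assumptions - verify every one of them against the DVRCast main.asc before relying on this:

// main.asc of a hypothetical "vbrick_relay" application (Server-Side ActionScript).
// The VBrick publishes here; we re-publish the stream to the DVRCast origin.
application.onAppStart = function () {
    application.relays = {};  // one outbound connection per incoming stream
};

application.onPublish = function (client, stream) {
    var nc = new NetConnection();
    nc.onStatus = function (info) {
        if (info.code == "NetConnection.Connect.Success") {
            var ns = new NetStream(nc);
            ns.attach(stream);               // forward the incoming live stream
            ns.publish(stream.name, "live");
            // Tell DVRCast what FMLE would normally tell it. The handler name
            // and the fields below are assumptions - check the DVRCast main.asc
            // for the exact contract.
            nc.call("DVRSetStreamInfo", null, {
                streamname: stream.name,
                append: true,
                offline: false
            });
        }
    };
    nc.connect("rtmp://localhost/dvrcast_origin");  // assumed origin URI
    application.relays[stream.name] = nc;
};

application.onUnpublish = function (client, stream) {
    var nc = application.relays[stream.name];
    if (nc != undefined) {
        nc.close();  // tears down the outbound republish
        delete application.relays[stream.name];
    }
};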
Hi, you are not alone - I need this solution now. I tried adding your code to my dvrcast files. PixelWizard, did anything change in your code?