How can I detect packet loss or measure other quality metrics for inbound streams?
So we're trying to come up with a way to automatically assess the "quality" of streams published to an FMS. So far, I've narrowed it down to three statistics:
* Round-trip ping time to the client (using the Client.ping() method and Client.getStats().ping_rtt). Not a great indicator of stream quality on its own, but it could help identify potential network issues.
* A home-grown jitter calculation that measures the deviation between wallclock time and media time for an inbound stream, i.e., I receive N±M seconds of media over a period of N seconds of wallclock time, and I want to know how big M is on average (uses Stream.time and Date.getTime(); see the sketch after this list).
* How much packet loss are we getting?
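To make the jitter idea a bit more concrete, here is a minimal sketch of the kind of server-side sampling I have in mind (Server-Side ActionScript in main.asc). The 5-second interval and the trace output are arbitrary, it assumes one published stream per client, and it piggybacks a Client.ping() call so ping_rtt stays fresh:

```actionscript
// main.asc -- rough sketch of the jitter sampling described above.
// Assumes one published stream per client; the interval length is arbitrary.
application.onPublish = function(client, stream) {
    var lastWallMs   = (new Date()).getTime();
    var lastMediaSec = stream.time;

    client.qualityTimer = setInterval(function() {
        var nowMs    = (new Date()).getTime();
        var mediaSec = stream.time;

        var wallDeltaSec  = (nowMs - lastWallMs) / 1000;
        var mediaDeltaSec = mediaSec - lastMediaSec;

        // "M" from the bullet above: how far media time drifted from
        // wallclock time over this sampling window.
        var driftSec = mediaDeltaSec - wallDeltaSec;

        // Keep the RTT measurement fresh while we're at it.
        client.ping();
        trace(stream.name + ": drift=" + driftSec + "s over " + wallDeltaSec +
              "s, rtt=" + client.getStats().ping_rtt);

        lastWallMs   = nowMs;
        lastMediaSec = mediaSec;
    }, 5000);
};

application.onUnpublish = function(client, stream) {
    clearInterval(client.qualityTimer);
};
```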
For this third metric, I have the client sending up periodic reports containing tuples of <media_time, cum_bytes_sent>, where media_time is provided by NetStream.time and cum_bytes_sent is provided by NetStreamInfo.byteCount. I'm then comparing these samples to a similar set of samples obtained on the server via Stream.time and Client.getStats().bytes_in. Unfortunately, these two data sources do not seem to be measuring exactly the same thing: while they are close, the received bandwidth (i.e., as measured by the server) always seems to exceed the sent bandwidth (as measured by the client) by a small amount (roughly 0.5% to 2%), even on a clean network.
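On the client side, the report is essentially just a NetConnection.call carrying NetStream.time and NetStream.info.byteCount; the handler name "reportQuality" and the 5-second period here are placeholders:

```actionscript
// Client-side (ActionScript 3) sketch of the periodic report described above.
// nc/ns are the publishing NetConnection and NetStream.
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.utils.setInterval;

function startQualityReports(nc:NetConnection, ns:NetStream):void {
    setInterval(function():void {
        // <media_time, cum_bytes_sent>: NetStream.time (seconds) and
        // NetStreamInfo.byteCount (cumulative bytes sent on this stream).
        nc.call("reportQuality", null, ns.time, ns.info.byteCount);
    }, 5000);
}
```

And on the server, the matching samples come from the published Stream and Client.getStats().bytes_in; stashing the Stream on the Client in onPublish is just my own bookkeeping:

```actionscript
// main.asc -- sketch of the server half of the comparison. "reportQuality"
// matches the hypothetical client call above.
application.onPublish = function(client, stream) {
    client.publishedStream = stream;  // in practice, merged with the jitter handler above
};

Client.prototype.reportQuality = function(clientMediaTimeSec, clientBytesSent) {
    var stream  = this.publishedStream;
    var bytesIn = this.getStats().bytes_in;  // cumulative bytes received from this client

    trace("client: t=" + clientMediaTimeSec + "s, sent=" + clientBytesSent +
          " | server: t=" + (stream ? stream.time : -1) + "s, recv=" + bytesIn);

    // A real check would difference successive samples to compare per-interval
    // send vs. receive bandwidth, allowing for the ~0.5-2% offset noted above.
};
```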
The goal of this whole exercise is to enable the server to automatically detect when a client is having problems (either network or local CPU). Among other things, this would trigger us to tell the client to try sending a lower-bitrate stream.
So, my questions:
* Is there a way for me to reliably calculate (or obtain) a packet loss factor? With something like RTP, I could simply look for gaps in the sequence numbers, but I don't see any APIs that would give me that sort of information via FMS (our inbound streams are usually RTMFP).
* If there isn't a way for me to get a clean packet loss factor, is there another quality metric I should be looking at?
