Question
Why does Flash only start decoding frames once the buffer holds about 64 frames?
I use an FLV player to play video from my camera, with an HTTP server I built sitting between the camera and the player. I noticed that the startup delay before playback begins changes with the frame rate, so I printed the NetStream's bufferLength and got this table:
Frame rate (fps)    bufferLength when playback starts (s)
25                  2.5
20                  3.1
15                  4.2
In every case, frame rate × bufferLength ≈ 64 frames.
Does anyone know the reason for this, and how to change the 64-frame threshold?
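
For reference, here is a minimal ActionScript 3 sketch of how readings like the ones above can be collected; the class name, stream URL, and handler names are placeholders, not my exact code:

    package {
        import flash.display.Sprite;
        import flash.events.NetStatusEvent;
        import flash.events.TimerEvent;
        import flash.net.NetConnection;
        import flash.net.NetStream;
        import flash.utils.Timer;

        public class BufferProbe extends Sprite {
            private var conn:NetConnection;
            private var stream:NetStream;
            private var timer:Timer = new Timer(100); // poll bufferLength every 100 ms

            public function BufferProbe() {
                conn = new NetConnection();
                conn.connect(null);                         // progressive / HTTP playback
                stream = new NetStream(conn);
                stream.client = { onMetaData: function(md:Object):void {} };
                stream.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
                stream.bufferTime = 0.1;                    // requested buffer is small
                stream.play("http://camera-host/live.flv"); // placeholder URL

                timer.addEventListener(TimerEvent.TIMER, onTick);
                timer.start();
            }

            private function onTick(e:TimerEvent):void {
                trace("bufferLength =", stream.bufferLength); // seconds of data buffered
            }

            private function onStatus(e:NetStatusEvent):void {
                // "NetStream.Play.Start" fires when playback actually begins
                trace(e.info.code, "bufferLength =", stream.bufferLength);
            }
        }
    }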
