Participant
September 24, 2011
Question

Why does Flash only start decoding frames once bufferLength reaches about 64 frames?


When I use an FLV player to play my camera's video, I put an HTTP server between the camera and the FLV player. I find that the startup delay before playback begins varies with the frame rate, so I printed the NetStream's bufferLength and got this table:
Framerate    bufferLength when playing starts
25           2.5
20           3.1
15           4.2
In every case, framerate × bufferLength ≈ 64.
Does anyone know the reason for this, and how to change the 64-frame threshold?
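For reference, a minimal AS3 sketch of the setup described above: playing an FLV over HTTP and logging bufferLength each second. The stream URL is hypothetical, and while bufferTime is the documented NetStream property for requesting a smaller playback buffer, it is not certain it overrides the internal ~64-frame decode threshold observed here.

```actionscript
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.utils.setInterval;

// null connection for progressive/HTTP FLV playback
var nc:NetConnection = new NetConnection();
nc.connect(null);

var ns:NetStream = new NetStream(nc);
// Provide a client so onMetaData callbacks don't raise async errors.
ns.client = { onMetaData: function(md:Object):void {} };

ns.bufferTime = 0.1;                          // request a small buffer, in seconds
ns.play("http://camera-host/stream.flv");     // hypothetical camera stream URL

// Log bufferLength once per second to reproduce the table above.
setInterval(function():void {
    trace("bufferLength =", ns.bufferLength);
}, 1000);
```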

    This topic has been closed for replies.

    1 reply

    September 24, 2011

    It might be best to post this question in the Flash forum; this forum is for Adobe Connect-specific questions.