I'll try to make myself as clear as possible. I am streaming a video, and for each time t in the video I'd like to know the number of bytes that should have been loaded on the client's machine to play the video from the start up to time t. I want this information as an array (or some similar data structure) for various values of t.
I'm trying to achieve this with the help of the YouTube API, but I'm running into problems. What I did was seek the video to various times and call getVideoStartBytes() at each one. This looks correct according to the YouTube API's documentation, but the resulting graph is weird: it looks like a parabola! That can't be right, because the number of bytes needed at time 25 can't be less than the number of bytes needed at time 15.
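Here's roughly what my sampling loop looks like (a simplified sketch: the sample times, the fixed 2-second delay, and the `player` variable, which I get from the old JavaScript Player API's ready callback, are placeholders for my actual setup; seekTo() and getVideoStartBytes() are the documented, now-deprecated player calls):

    // Sample the byte offset the player starts loading from after each seek.
    var sampleTimes = [5, 10, 15, 20, 25, 30]; // seconds (placeholder values)
    var startBytes = [];

    function sampleAt(i) {
      if (i >= sampleTimes.length) {
        console.log(startBytes); // one {t, bytes} pair per sample time
        return;
      }
      // allowSeekAhead = true so the player issues a new request
      // even for parts of the video that aren't buffered yet.
      player.seekTo(sampleTimes[i], true);
      // Give the player a moment to start the request before reading
      // the byte offset it began loading from.
      setTimeout(function () {
        startBytes.push({ t: sampleTimes[i], bytes: player.getVideoStartBytes() });
        sampleAt(i + 1);
      }, 2000);
    }

    sampleAt(0);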
Earlier I did a bitrate calculation using VLC, and that graph came out roughly like y = x^2, which was at least plausible since it is monotonically increasing.
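To spell out why the parabola can't be right: the cumulative byte count is just the running sum of the per-interval bitrate, so it can only go up. A tiny sanity check with made-up bitrate numbers:

    // Cumulative bytes = running sum of per-second bitrate samples,
    // so the curve is necessarily non-decreasing.
    var bitrateKBps = [50, 60, 80, 110, 150]; // hypothetical kB/s per second
    var cumulativeKB = [];
    bitrateKBps.reduce(function (sum, r, t) {
      cumulativeKB[t] = sum + r; // kB needed to play up to second t+1
      return cumulativeKB[t];
    }, 0);
    console.log(cumulativeKB); // [50, 110, 190, 300, 450] -- monotone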
Is this some kind of bug in YouTube, or is something else going on? Please help.