I searched online but there is very little information about this.
I have a live broadcasting app where I send H.264 video frames and AAC audio chunks, encoded from the camera and microphone with the Android MediaCodec API, over an RTMP stack.
My live streams are 720p and I aim for high quality at 2500 kbps. This obviously requires a very good network connection, which means 4G if you are on a data plan.
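For reference, the video encoder is set up more or less like this (a simplified sketch: the 30 fps frame rate and the Surface input are assumptions for illustration, while the 2-second keyframe interval matches what I describe further down):

// android.media.* imports omitted
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
format.setInteger(MediaFormat.KEY_BIT_RATE, 2_500_000);       // 2500 kbps target
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);            // illustrative value
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);       // keyframe every 2 seconds
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface); // camera feeds an input Surface

MediaCodec videoEncoder = MediaCodec.createEncoderByType("video/avc");
videoEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);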
The problem is that even with the best connection there will be throughput dips and congestion, so there will be moments when the network can't sustain such a heavy stream. Because I want to offer high reliability, I want to include automatic adaptive bitrate in my app, so that image quality is dropped in favor of reliability.
The thing is: how do I achieve this automatic adaptation to the network conditions without losing frames? Is it even possible? I've used professional encoding devices like Cerevo and they never seem to lose frames; with my app, however, I always get some awful dragging due to P-frames being lost on the network.
This is what I currently have:
private long adaptBitrate(long idleNanos, Frame frame) {
    int bytes = frame.getSize();
    long nowNanos = System.nanoTime();
    // Re-evaluate roughly once per second
    if (nowNanos - mLastNanos > 1000L * 1000 * 1000) {
        // Fraction of the last interval spent waiting for a new frame
        double idle = (double) idleNanos / (double) (nowNanos - mLastNanos);
        float actualBitrate = newBitrate;
        int size = mBuffer.size();
        String s = "Bitrate: " + actualBitrate / 1000
                + " kbps In-Flight:" + bytes
                + " idle: " + idle;
        if (size > MAX_BUF_SIZE && size > mLastSize) {
            // Outgoing buffer is over the threshold and still growing: back off
            Log.i(TAG, "adaptBitrate: Dropping bitrate");
            newBitrate = (int) ((double) actualBitrate * BITRATE_DROP_MULTIPLIER);
            if (newBitrate < MIN_BITRATE) {
                newBitrate = MIN_BITRATE;
            }
            s += " late => " + newBitrate;
            mRtmpHandler.requestBitrate(newBitrate);
        } else if (size <= 2 && idle > IDLE_THRESHOLD) {
            // Buffer is nearly empty and we were mostly idle: after enough
            // consecutive idle intervals, probe a higher bitrate
            mIdleFrames++;
            if (mIdleFrames >= MIN_IDLE_FRAMES) {
                Log.i(TAG, "adaptBitrate: Raising bitrate");
                newBitrate = (int) ((double) newBitrate * BITRATE_RAISE_MULTIPLIER);
                if (newBitrate > MAX_BITRATE) {
                    newBitrate = MAX_BITRATE;
                }
                s += " idle => " + newBitrate;
                mRtmpHandler.requestBitrate(newBitrate);
                mIdleFrames = 0;
            }
        }
        debugThread(Log.VERBOSE, s);
        mLastNanos = System.nanoTime();
        mLastSize = size;
        idleNanos = 0;
    }
    return idleNanos;
}
So if my outgoing buffer exceeds a threshold and keeps growing, I lower the bitrate. If my app spends most of its time waiting for a new frame, for a number of consecutive intervals, then I raise the bitrate.
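For completeness, the new bitrate reaches the running encoder roughly like this (a minimal sketch of the standard MediaCodec setParameters() call, available from API 19; the helper name is just for illustration and my handler plumbing and threading are left out):

// Minimal sketch: apply a new target bitrate to a running encoder (API 19+).
// 'videoEncoder' is the configured-and-started MediaCodec instance from above.
private void applyVideoBitrate(MediaCodec videoEncoder, int bitrateBps) {
    Bundle params = new Bundle();  // android.os.Bundle
    params.putInt(MediaCodec.PARAMETER_KEY_VIDEO_BITRATE, bitrateBps);
    videoEncoder.setParameters(params);
}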
No matter how cautious I am with the threshold values, I always end up losing important information and my stream breaks until the next keyframe arrives (every 2 seconds). Sometimes it seems the network can hold a certain bitrate (stable at 1500 kbps, for instance), but the image will still show some dragging, as though a frame were lost along the way. With good network conditions, everything is smooth.
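(For the record, the encoder can also be asked for an immediate keyframe, which would shorten that 2-second recovery window after a drop; a sketch is below. It only speeds up recovery, though, and does nothing about the underlying frame loss.)

// Sketch: request an immediate keyframe from the running encoder (API 19+),
// e.g. right after congestion forced queued frames to be dropped.
Bundle syncRequest = new Bundle();
syncRequest.putInt(MediaCodec.PARAMETER_KEY_REQUEST_SYNC_FRAME, 0); // value is ignored
videoEncoder.setParameters(syncRequest);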
How do these streaming devices handle these situations? It always looks great with them, no dragging or skipped frames at all...