I have been working on a hardware-accelerated H.264 encoder using VideoToolbox's VTCompressionSession for a while now, and a consistent problem has been the unreliable bitrate coming out of it. I have read many forum posts and looked through existing code, and tried to follow suit, but the bitrate out of my encoder is almost always somewhere between 5% and 50% off the target. On occasion I've seen some huge errors, even 400% overshoot, where a single frame is twice the size of the given average bitrate.

My session is set up as follows (sketched in code after the list):

  • kVTCompressionPropertyKey_AverageBitRate = desired bitrate
  • kVTCompressionPropertyKey_DataRateLimits = [desired bitrate / 8, 1]; bytes per second over a 1-second window, accounting for bits vs. bytes
  • kVTCompressionPropertyKey_ExpectedFrameRate = framerate (30, 15, 5, or 1 fps)
  • kVTCompressionPropertyKey_MaxKeyFrameInterval = 1500
  • kVTCompressionPropertyKey_MaxKeyFrameIntervalDuration = 1500 / framerate
  • kVTCompressionPropertyKey_AllowFrameReordering = NO
  • kVTCompressionPropertyKey_ProfileLevel = kVTProfileLevel_H264_Main_AutoLevel
  • kVTCompressionPropertyKey_RealTime = YES
  • kVTCompressionPropertyKey_H264EntropyMode = kVTH264EntropyMode_CABAC
  • kVTCompressionPropertyKey_BaseLayerFrameRate = framerate / 2

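In code, the configuration is assembled and applied roughly like this (a simplified sketch rather than my exact code; `session`, `desiredBitrate`, and `framerate` stand in for my real variables, and error checking and CFRelease calls are omitted):

    #include <VideoToolbox/VideoToolbox.h>

    // Build the whole configuration as one dictionary and apply it in a single call.
    static OSStatus ApplyEncoderConfig(VTCompressionSessionRef session,
                                       int32_t desiredBitrate, int32_t framerate)
    {
        int32_t bytesPerSecond   = desiredBitrate / 8;  // DataRateLimits is expressed in bytes
        int32_t windowSeconds    = 1;
        int32_t keyFrameInterval = 1500;
        int32_t keyFrameSeconds  = 1500 / framerate;
        int32_t baseLayerRate    = framerate / 2;

        CFNumberRef bitrateNum = CFNumberCreate(NULL, kCFNumberSInt32Type, &desiredBitrate);
        CFNumberRef bytesNum   = CFNumberCreate(NULL, kCFNumberSInt32Type, &bytesPerSecond);
        CFNumberRef windowNum  = CFNumberCreate(NULL, kCFNumberSInt32Type, &windowSeconds);
        CFNumberRef fpsNum     = CFNumberCreate(NULL, kCFNumberSInt32Type, &framerate);
        CFNumberRef kfiNum     = CFNumberCreate(NULL, kCFNumberSInt32Type, &keyFrameInterval);
        CFNumberRef kfiSecNum  = CFNumberCreate(NULL, kCFNumberSInt32Type, &keyFrameSeconds);
        CFNumberRef baseNum    = CFNumberCreate(NULL, kCFNumberSInt32Type, &baseLayerRate);

        const void *limitValues[] = { bytesNum, windowNum };
        CFArrayRef dataRateLimits = CFArrayCreate(NULL, limitValues, 2, &kCFTypeArrayCallBacks);

        const void *keys[] = {
            kVTCompressionPropertyKey_AverageBitRate,
            kVTCompressionPropertyKey_DataRateLimits,
            kVTCompressionPropertyKey_ExpectedFrameRate,
            kVTCompressionPropertyKey_MaxKeyFrameInterval,
            kVTCompressionPropertyKey_MaxKeyFrameIntervalDuration,
            kVTCompressionPropertyKey_AllowFrameReordering,
            kVTCompressionPropertyKey_ProfileLevel,
            kVTCompressionPropertyKey_RealTime,
            kVTCompressionPropertyKey_H264EntropyMode,
            kVTCompressionPropertyKey_BaseLayerFrameRate
        };
        const void *values[] = {
            bitrateNum, dataRateLimits, fpsNum, kfiNum, kfiSecNum,
            kCFBooleanFalse, kVTProfileLevel_H264_Main_AutoLevel,
            kCFBooleanTrue, kVTH264EntropyMode_CABAC, baseNum
        };

        CFDictionaryRef config = CFDictionaryCreate(NULL, keys, values, 10,
                                                    &kCFTypeDictionaryKeyCallBacks,
                                                    &kCFTypeDictionaryValueCallBacks);
        return VTSessionSetProperties(session, config);
    }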
I also adjust the average bitrate and data-rate-limit values throughout the session to try to compensate for the volatility: if the measured bitrate is too high, I reduce them a bit; if it's too low, I increase them, with limits on how far they can move in either direction. I create the session, apply the above configuration as a single dictionary using VTSessionSetProperties (as sketched above), and then feed frames into it like this:

VTCompressionSessionEncodeFrame(compressionSessionRef, static_cast<CVImageBufferRef>(pixelBuffer), CMTimeMake(captureTime, 1000), kCMTimeInvalid, frameProperties, frameDetailsStruct, &encodeInfoFlags);

So I am supplying timing information as the API documentation says to do. To measure the result, I add up the size of the encoded output for each frame and divide by a periodic measurement window to determine the outgoing bitrate and its error from the target. This is where I see the significant volatility.
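The accounting itself looks roughly like this (a simplified sketch; the helper name and the window bookkeeping variables are placeholders for my real code, and it is called from the compression output callback with each encoded sample buffer):

    #include <CoreMedia/CoreMedia.h>

    // Simplified sketch of the per-window bitrate measurement.
    static size_t bytesInWindow   = 0;
    static double windowStartSecs = 0.0;
    static const double kWindowSecs = 1.0;   // measure over 1-second windows

    static void AccumulateEncodedFrame(CMSampleBufferRef sampleBuffer, int32_t targetBitrate)
    {
        // Total size of the encoded frame, in bytes.
        bytesInWindow += CMSampleBufferGetTotalSampleSize(sampleBuffer);

        double pts = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer));
        if (pts - windowStartSecs >= kWindowSecs) {
            double measuredBps = (bytesInWindow * 8.0) / (pts - windowStartSecs);
            double errorPct = 100.0 * (measuredBps - targetBitrate) / targetBitrate;
            // ... log errorPct and nudge AverageBitRate / DataRateLimits up or down ...
            bytesInWindow   = 0;
            windowStartSecs = pts;
        }
    }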

I'm looking for any help in getting the bitrate under control, as I'm not sure what to do at this point. Thank you!

LarryW

1 Answer

I think you should check the frame timestamp you pass to VTCompressionSessionEncodeFrame; it seems to affect the bitrate. If you change the frame rate, change the frame timestamps to match.
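Something along these lines (a rough sketch; `frameCount` and `frameRate` are placeholder names, and the other arguments are simplified):

    // Derive the presentation timestamp from a running frame count and the
    // configured frame rate so the timestamps match the real frame pacing.
    CMTime pts = CMTimeMake(frameCount, frameRate);           // frame N at `frameRate` fps
    VTCompressionSessionEncodeFrame(compressionSessionRef,
                                    pixelBuffer,
                                    pts,
                                    CMTimeMake(1, frameRate), // per-frame duration
                                    NULL,                     // frame properties
                                    NULL,                     // source frame refcon
                                    &encodeInfoFlags);
    frameCount += 1;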

  • Thanks for the suggestion, but can you please clarify what I should be checking about the `presentationTimeStamp` (I assume that's the parameter you're talking about)? I am updating it with every frame using `CMTimeMake(captureTime, 1000)`, and I am not changing the frames per second during an encoding session. – LarryW Feb 22 '19 at 23:49
  • Use `CMTimeMake(totalFrameCount, frameRate)` – xin xin Fang Mar 06 '19 at 09:19
  • Thank you, but I tried that and it didn't seem to change the result. Sometimes the encoder still spikes to even 100% over the target bitrate. – LarryW Mar 08 '19 at 03:18