So while I'm sure I'm not about to provide enough info for anyone to fix my specific code, what I am itching to know is this:
Does anyone know what might have changed in iOS 14 regarding HEVC decoding requirements?
I have a decoder built with VideoToolbox for an HEVC-encoded video stream coming over the network. It was and is working fine on iOS 13 devices and in the iOS 14 simulator, but it fails most of the time on iOS 14 devices (up to 14.4 at time of writing). "Most of the time", because sometimes it does just work, depending on where in the stream I try to begin decoding.
An error I'm occasionally getting from my decompression output callback record is OSStatus -12909 (`kVTVideoDecoderBadDataErr`). So far, so unhelpful.
Or I may get no error output at all, as in a unit test that takes fixed packets of data in and should always generate video frames out. (That test likewise fails to generate the expected frames on iOS 14 devices.)
Anyone else had any issues with HEVC decoding in iOS 14 specifically? I'm literally fishing for clues here... I've tried toggling all the usual input flags for `VTDecompressionSessionDecodeFrame()` (`._EnableAsynchronousDecompression`, `._EnableTemporalProcessing`, ...).
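For context, the decode call itself is roughly the following (a minimal sketch; `session`, `sampleBuffer`, and the function name are stand-ins for my real objects, not the exact code):

```swift
import VideoToolbox

// Sketch of the decode call with the flags I've been toggling.
// `session` is an already-configured VTDecompressionSession whose output
// callback was set at creation time; `sampleBuffer` wraps one packetized
// HEVC access unit.
func decodeFrame(_ sampleBuffer: CMSampleBuffer,
                 with session: VTDecompressionSession) {
    var infoFlags = VTDecodeInfoFlags()
    let flags: VTDecodeFrameFlags = [._EnableAsynchronousDecompression,
                                     ._EnableTemporalProcessing]
    let status = VTDecompressionSessionDecodeFrame(session,
                                                   sampleBuffer: sampleBuffer,
                                                   flags: flags,
                                                   frameRefcon: nil,
                                                   infoFlagsOut: &infoFlags)
    if status != noErr {
        // -12909 is kVTVideoDecoderBadDataErr, the error I sometimes see.
        print("decode failed with OSStatus \(status)")
    }
}
```

Flipping those flags individually or together makes no difference to the failures on iOS 14 devices.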
I've also tried redoing my entire rendering layer to use `AVSampleBufferDisplayLayer` with the raw `CMSampleBuffer`s. It decodes perfectly! But I can't use it, because I need to micromanage the timing of the output frames myself (and they're not always in order).
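The display-layer experiment amounted to something like this (a sketch under the assumption that the layer is installed in the view hierarchy; `enqueueIfReady` is an illustrative helper name):

```swift
import AVFoundation

// Sketch of the AVSampleBufferDisplayLayer path that decoded fine on iOS 14.
// The layer is assumed to have been added to the hosting view's layer tree.
let displayLayer = AVSampleBufferDisplayLayer()

func enqueueIfReady(_ sampleBuffer: CMSampleBuffer) {
    // The layer does its own decode and timing from the buffer's timestamps,
    // which is exactly what I can't use: I need to control presentation
    // timing (and reorder frames) myself.
    if displayLayer.isReadyForMoreMediaData {
        displayLayer.enqueue(sampleBuffer)
    }
}
```

The fact that this path works suggests the buffers themselves are well-formed, and that whatever iOS 14 changed is specific to `VTDecompressionSession`.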
(If it helps, the fixed input packets in my unit test include NALUs of the following types, in order: `NAL_UNIT_VPS`, `NAL_UNIT_SPS`, `NAL_UNIT_PPS`, `NAL_UNIT_PREFIX_SEI`, `NAL_UNIT_CODED_SLICE_CRA`, and finally `NAL_UNIT_CODED_SLICE_TRAIL_N` and `NAL_UNIT_CODED_SLICE_TRAIL_R`. I captured these from a working network stream at some point in the past to serve as a basic sanity test.)
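In case the problem is in session setup rather than the slice data: I build the format description from the VPS/SPS/PPS at the head of the stream roughly like this (a sketch; `vps`/`sps`/`pps` are the raw NAL unit payloads without start codes, and the 4-byte length-prefix assumption matches my AVCC/HVCC-style packetization):

```swift
import CoreMedia

// Sketch: building the CMVideoFormatDescription from the three parameter-set
// NALUs, which the VTDecompressionSession is then created against.
func makeFormatDescription(vps: [UInt8], sps: [UInt8], pps: [UInt8]) -> CMVideoFormatDescription? {
    var formatDescription: CMVideoFormatDescription?
    let status = vps.withUnsafeBufferPointer { vpsPtr in
        sps.withUnsafeBufferPointer { spsPtr in
            pps.withUnsafeBufferPointer { ppsPtr in
                CMVideoFormatDescriptionCreateFromHEVCParameterSets(
                    allocator: kCFAllocatorDefault,
                    parameterSetCount: 3,
                    parameterSetPointers: [vpsPtr.baseAddress!,
                                           spsPtr.baseAddress!,
                                           ppsPtr.baseAddress!],
                    parameterSetSizes: [vps.count, sps.count, pps.count],
                    nalUnitHeaderLength: 4, // 4-byte length prefixes in my stream
                    extensions: nil,
                    formatDescriptionOut: &formatDescription)
            }
        }
    }
    return status == noErr ? formatDescription : nil
}
```

This call succeeds on both iOS 13 and iOS 14, so if the new OS is stricter about something, it seems to be at decode time rather than at format-description creation.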