My project currently streams YUV 420 planar (I420) frames over WebRTC. I wrap the I420-format ByteBuffer that I obtain from an external camera source in WebRTC's JavaI420Buffer class and pass the resulting frame to my video source's capturer observer.
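For reference, this is roughly the plane math I rely on when slicing the packed I420 ByteBuffer before wrapping it (a pure-Java sketch; the commented-out lines show where the actual org.webrtc calls would go, and the buffer/observer names there are illustrative):

```java
public class I420Layout {
    // Byte offsets of the Y, U, and V planes inside a packed I420 buffer.
    public static int[] planeOffsets(int width, int height) {
        int strideY = width;
        int strideU = (width + 1) / 2;            // chroma is subsampled 2x2
        int ySize = strideY * height;
        int uSize = strideU * ((height + 1) / 2);
        // Packed I420 layout: full-resolution Y plane, then U, then V.
        return new int[] { 0, ySize, ySize + uSize };
    }

    public static void main(String[] args) {
        int[] off = planeOffsets(640, 480);
        System.out.println(off[0] + " " + off[1] + " " + off[2]);
        // With the three planes sliced out at these offsets, the real path
        // does roughly (requires org.webrtc on the classpath):
        // VideoFrame.Buffer buf = JavaI420Buffer.wrap(640, 480,
        //     yBuf, 640, uBuf, 320, vBuf, 320, releaseCallback);
        // capturerObserver.onFrameCaptured(new VideoFrame(buf, 0, timestampNs));
    }
}
```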
Now, however, I want to support another camera feed that gives me input in YUV NV12 format. Is there any way of sending these frames directly over WebRTC, without first converting NV12 to 420 planar?
(Converting the ByteBuffer naively on the CPU, i.e. with a couple of for-loops, predictably gives me horrible results at 30 frames per second. And I'm not entirely sure whether I need to go down this conversion route at all, hence this question.)
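For concreteness, the naive conversion I mean is essentially the following (a pure-Java sketch; note that only the chroma plane needs rearranging, since the Y plane is byte-identical in NV12 and I420):

```java
public class Nv12ToI420 {
    // NV12: Y plane followed by interleaved UVUVUV...
    // I420: Y plane, then the U plane, then the V plane.
    public static byte[] convert(byte[] nv12, int width, int height) {
        int ySize = width * height;
        int chromaSize = ySize / 4;                 // assumes even dimensions
        byte[] i420 = new byte[ySize + 2 * chromaSize];
        // Y plane is the same in both formats, so a bulk copy suffices.
        System.arraycopy(nv12, 0, i420, 0, ySize);
        // De-interleave UV pairs into separate U and V planes.
        for (int i = 0; i < chromaSize; i++) {
            i420[ySize + i] = nv12[ySize + 2 * i];                  // U
            i420[ySize + chromaSize + i] = nv12[ySize + 2 * i + 1]; // V
        }
        return i420;
    }
}
```

Even with the bulk Y copy, the per-byte chroma loop is what dominates at 30 fps on larger frames.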