I am writing code to convert a frame from an MP4 file into an OpenGL ES texture, and I am using AVAssetReaderTrackOutput to access the pixel buffers. What is the best pixel buffer format to request? Right now I am reusing my old code that converts YUV420P to RGB in an OpenGL ES shader, since I previously fed it from libav. Now that I am moving to AVFoundation, I am wondering whether my OpenGL ES shader is faster than simply setting the pixel buffer format to RGBA, or whether I should request a YUV format and keep my shader.
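
For context, here is a rough sketch of the setup I mean (Swift; the file URL is a placeholder and error handling is omitted), showing the two output settings I am choosing between:

```swift
import AVFoundation
import CoreMedia
import CoreVideo

// Rough sketch: pull decoded frames out of an MP4 with AVAssetReader.
// "movie.mp4" is a placeholder path; error handling is omitted.
let asset = AVAsset(url: URL(fileURLWithPath: "movie.mp4"))
guard let videoTrack = asset.tracks(withMediaType: .video).first else {
    fatalError("no video track")
}
let reader = try! AVAssetReader(asset: asset)

// Option A: request bi-planar 4:2:0 YCbCr and convert to RGB in my shader.
let yuvSettings: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String:
        kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
]

// Option B: request BGRA so each frame can be uploaded as a single texture
// with no conversion shader.
let bgraSettings: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]

let output = AVAssetReaderTrackOutput(track: videoTrack,
                                      outputSettings: yuvSettings) // or bgraSettings
reader.add(output)
reader.startReading()

while reader.status == .reading,
      let sampleBuffer = output.copyNextSampleBuffer(),
      let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
    // pixelBuffer is a CVPixelBuffer in the requested format;
    // this is where it gets handed to the OpenGL ES upload path.
    _ = pixelBuffer
}
```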

Thanks

Cthutu

1 Answer

I guess this depends on what the destination of your data is. If all you are doing is passing the data through to the screen, native YUV should be faster than BGRA. If you need to read the data back as RGBA or BGRA, I'd stick with BGRA and use an OpenGL ES texture cache rather than glReadPixels().
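
For illustration, a rough Swift sketch of the texture-cache route for a BGRA buffer (the EAGLContext and the pixel buffer come from your own reader code; the names here are placeholders):

```swift
import CoreVideo
import OpenGLES

// Sketch only: map a BGRA CVPixelBuffer into a GL texture via the texture cache,
// avoiding a glReadPixels()/glTexImage2D copy. In real code, create the cache
// once and reuse it for every frame.
func texture(from pixelBuffer: CVPixelBuffer,
             context eaglContext: EAGLContext) -> CVOpenGLESTexture? {
    var cache: CVOpenGLESTextureCache?
    CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, eaglContext, nil, &cache)
    guard let textureCache = cache else { return nil }

    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)

    var glTexture: CVOpenGLESTexture?
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                 textureCache,
                                                 pixelBuffer,
                                                 nil,
                                                 GLenum(GL_TEXTURE_2D),
                                                 GL_RGBA,
                                                 GLsizei(width),
                                                 GLsizei(height),
                                                 GLenum(GL_BGRA),
                                                 GLenum(GL_UNSIGNED_BYTE),
                                                 0,
                                                 &glTexture)

    if let glTexture = glTexture {
        // Bind and set filtering; the texture's backing store is the pixel buffer itself.
        glBindTexture(CVOpenGLESTextureGetTarget(glTexture),
                      CVOpenGLESTextureGetName(glTexture))
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)
    }
    return glTexture
}
```

In a real pipeline you would create the cache once and reuse it, calling CVOpenGLESTextureCacheFlush() as frames are released. The same cache also covers the YUV path: you create one texture per plane (plane indices 0 and 1 of the bi-planar buffer) and sample both in your conversion shader.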

I recommend reading the answer to this SO question on the YUV method. Quote:

"Video frames need to go to the GPU in any case: using YCbCr saves you 25% bus bandwidth if your video has 4:2:0 sampled chrominance."

BlueVoodoo