
I've been using CoreGraphics to create a bitmap context around the pixel buffer, extracting an image with CGBitmapContextCreateImage, and then using CGContextDrawImage to draw that image into another (grayscale) buffer.
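Roughly what that path looks like (a minimal sketch, not my exact code; it assumes a kCVPixelFormatType_32BGRA buffer such as the one AVCaptureVideoDataOutput can deliver, and the function name is just a placeholder):

```c
#include <CoreVideo/CoreVideo.h>
#include <CoreGraphics/CoreGraphics.h>
#include <stdint.h>
#include <stdlib.h>

static void convertViaCoreGraphics(CVPixelBufferRef pixelBuffer)
{
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);

    // Wrap the BGRA pixel buffer in a bitmap context and pull out a CGImage.
    CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
    CGContextRef srcCtx = CGBitmapContextCreate(
        CVPixelBufferGetBaseAddress(pixelBuffer),
        width, height, 8,
        CVPixelBufferGetBytesPerRow(pixelBuffer),
        rgb,
        kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst);
    CGImageRef image = CGBitmapContextCreateImage(srcCtx);

    // Draw that image into an 8-bit grayscale context; this is the
    // CGContextDrawImage call that shows up in the profile.
    uint8_t *gray = malloc(width * height);
    CGColorSpaceRef grayCS = CGColorSpaceCreateDeviceGray();
    CGContextRef dstCtx = CGBitmapContextCreate(
        gray, width, height, 8, width, grayCS, kCGImageAlphaNone);
    CGContextDrawImage(dstCtx, CGRectMake(0, 0, width, height), image);

    // ... process the grayscale pixels in `gray` here ...

    CGContextRelease(dstCtx);
    CGColorSpaceRelease(grayCS);
    CGImageRelease(image);
    CGContextRelease(srcCtx);
    CGColorSpaceRelease(rgb);
    free(gray);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}
```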
Profiling shows that CGContextDrawImage takes a lot of time, so I thought I would avoid it by reading the original CVPixelBuffer's bytes directly. But it turns out that's actually much slower!
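The direct-access version is roughly this (again only a sketch, under the same 32BGRA assumption; the per-pixel averaging just stands in for the real processing):

```c
#include <CoreVideo/CoreVideo.h>
#include <stdint.h>
#include <stdlib.h>

static void convertDirectly(CVPixelBufferRef pixelBuffer)
{
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    size_t width       = CVPixelBufferGetWidth(pixelBuffer);
    size_t height      = CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    uint8_t *base      = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);

    uint8_t *gray = malloc(width * height);

    for (size_t y = 0; y < height; y++) {
        uint8_t *row = base + y * bytesPerRow;   // rows may be padded
        for (size_t x = 0; x < width; x++) {
            uint8_t b = row[4 * x + 0];
            uint8_t g = row[4 * x + 1];
            uint8_t r = row[4 * x + 2];
            gray[y * width + x] = (uint8_t)((r + g + b) / 3);
        }
    }

    // ... process `gray` here ...

    free(gray);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}
```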

I guess that buffer lies in some special memory area that is slow to access for some reason.

What's the fastest way to get those pixels into a place where I can process them quickly, then?

LaC
