
I want to access the average colour value of a specific area of a CVPixelBuffer that I get from an ARFrame in real time. I managed to crop the image, apply a filter to calculate the average colour, and, after converting to a CGImage, read the value from the pixel, but unfortunately this affects the performance of my app (FPS drops below 30). I think the reason is the CGImage conversion step. Is there a way to access the colour without converting the CIImage to a CGImage? This is the code I'm using at the moment:

// `image`, `inputImageRect`, and `context` (a CIContext) are properties set elsewhere.
func averageColor() -> UIColor? {
  let cropVector = CIVector(cgRect: inputImageRect)

  guard let filter = CIFilter(
    name: "CIAreaAverage",
    parameters: [kCIInputImageKey: image, kCIInputExtentKey: cropVector]
  ),
    let outputImage = filter.outputImage,
    // CIAreaAverage produces a 1×1 image containing the average colour.
    let cgImage = context.createCGImage(
      outputImage,
      from: CGRect(x: 0, y: 0, width: 1, height: 1)
    ),
    let dataProvider = cgImage.dataProvider,
    let data = CFDataGetBytePtr(dataProvider.data) else { return nil }

  return UIColor(
    red: CGFloat(data[0]) / 255,
    green: CGFloat(data[1]) / 255,
    blue: CGFloat(data[2]) / 255,
    alpha: CGFloat(data[3]) / 255
  )
}
RealUglyDuck

1 Answer


I don't think there is much you can do here – reduction operations like this are inherently expensive.

A few things you can try:

  • Set up your CIContext so that it doesn't perform any color management, by setting the .workingColorSpace and .outputColorSpace options to NSNull().
  • Render directly into a piece of memory instead of going through a CGImage. The context has the method render(_ image: CIImage, toBitmap data: UnsafeMutableRawPointer, rowBytes: Int, bounds: CGRect, format: CIFormat, colorSpace: CGColorSpace?) you can use for that. Pass nil as the color space here as well. You should be able to just pass a pointer to a simd_uchar4 var as data; rowBytes would be 4 and format would be .BGRA8 in this case, I think (see the sketch after this list).
  • You can also try scaling down your image (scaling is itself a reduction operation) before you do the average calculation. It wouldn't yield exactly the same value, but a fair approximation – and it might be faster.
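
A minimal, untested sketch of the first two suggestions combined, reusing image and inputImageRect from the question (the function name is mine):

import CoreImage
import UIKit
import simd

// Create the context once and reuse it; recreating it per frame is expensive.
// NSNull() for both colour-space options disables colour management.
let context = CIContext(options: [
  .workingColorSpace: NSNull(),
  .outputColorSpace: NSNull()
])

func fastAverageColor(of image: CIImage, in inputImageRect: CGRect) -> UIColor? {
  guard let filter = CIFilter(
    name: "CIAreaAverage",
    parameters: [
      kCIInputImageKey: image,
      kCIInputExtentKey: CIVector(cgRect: inputImageRect)
    ]
  ), let outputImage = filter.outputImage else { return nil }

  // Render the 1×1 average straight into 4 bytes – no CGImage needed.
  var pixel = simd_uchar4()
  withUnsafeMutableBytes(of: &pixel) { buffer in
    context.render(
      outputImage,
      toBitmap: buffer.baseAddress!,
      rowBytes: 4,
      bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
      format: .BGRA8,
      colorSpace: nil // nil skips colour matching on output as well
    )
  }

  // .BGRA8 lays the bytes out as blue, green, red, alpha.
  return UIColor(
    red: CGFloat(pixel.z) / 255,
    green: CGFloat(pixel.y) / 255,
    blue: CGFloat(pixel.x) / 255,
    alpha: CGFloat(pixel.w) / 255
  )
}

For the third suggestion, something like image.transformed(by: CGAffineTransform(scaleX: 0.25, y: 0.25)) before the filter (with inputImageRect scaled to match) would trade a little accuracy for speed.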
Frank Rupprecht
  • Thanks Frank for the answer. Unfortunately it didn't help much (maybe improved performance a bit) – I still get cases where frames drop to 18 fps. I captured a frame with the Metal Frame Debugger and got the issue "Encoder: 'mainMetalEntryPoint' has unused texture: 'com.apple.CoreImage'" for the cropped image I calculate the average from. Do you know any other way to get pixel RGB values from a CVPixelBuffer? My goal is to compare its values with a reference RGB colour. – RealUglyDuck Dec 19 '19 at 11:40
  • You can access the raw data with `CVPixelBufferGetBaseAddress` (see the sketch after these comments). Concerning the performance: have you tried _disabling_ "GPU Frame Capture" and "Metal API Validation" in your scheme settings? Those interfere heavily with any performance benchmark. – Frank Rupprecht Dec 19 '19 at 11:55
  • @FrankSchlegel What happens if we do not specify the workingColorSpace (or outputColorSpace) key when creating a CIContext? – Deepak Sharma Jan 06 '22 at 15:26
  • @DeepakSharma The working space defaults to extended linear sRGB. Same for the output space, I think. But it is rarely used, since most of the context's output methods require explicitly specifying it anyway. – Frank Rupprecht Jan 06 '22 at 15:34
  • @FrankSchlegel So the conversion to sRGB and back on input and output – is that what we call **color management**? – Deepak Sharma Jan 06 '22 at 15:53
  • @DeepakSharma Right, also called ColorSync by Apple. – Frank Rupprecht Jan 06 '22 at 15:55
  • @DeepakSharma Maybe it would be better if you asked that in a new question. This comment section is probably not the ideal place to discuss this... – Frank Rupprecht Jan 06 '22 at 16:02
  • OK, I will frame a new question and post it. – Deepak Sharma Jan 06 '22 at 16:09
  • @FrankSchlegel I tried my best to formulate it [here](https://stackoverflow.com/questions/70620628/core-image-workingcolorspace-outputcolorspace). Have a look please. – Deepak Sharma Jan 07 '22 at 11:34
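
A minimal sketch of the raw-buffer approach mentioned in the comments above. It assumes a kCVPixelFormatType_32BGRA buffer (note that ARFrame's capturedImage is YCbCr by default and would need converting first); the function name and parameters are illustrative:

import CoreVideo
import UIKit

func color(at x: Int, _ y: Int, in pixelBuffer: CVPixelBuffer) -> UIColor? {
  // Lock the buffer before touching its memory from the CPU.
  CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
  defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

  guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
  let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)

  // 4 bytes per pixel, laid out B, G, R, A.
  let pixel = base.advanced(by: y * bytesPerRow + x * 4)
    .assumingMemoryBound(to: UInt8.self)

  return UIColor(
    red: CGFloat(pixel[2]) / 255,
    green: CGFloat(pixel[1]) / 255,
    blue: CGFloat(pixel[0]) / 255,
    alpha: CGFloat(pixel[3]) / 255
  )
}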