
I am developing an app that sends pixel buffers from a Broadcast Upload Extension to OpenTok. When I run my broadcast extension, it hits its memory limit within seconds. I have been looking for ways to reduce the size of the CMSampleBuffers, and ended up first converting them to CIImage, then scaling them, and then converting them to CVPixelBuffers to send to the OpenTok servers. Unfortunately, the extension still crashes, even though the pixel buffers are scaled down. My code follows:

First I convert the CMSampleBuffer to a CVPixelBuffer in the sample handler's processSampleBuffer function, then pass the CVPixelBuffer to my function along with a timestamp. There I convert the CVPixelBuffer to a CIImage and scale it with a CIFilter (CILanczosScaleTransform). After that, I render a new pixel buffer from the CIImage using a pixel buffer pool and a CIContext, and send the new buffer to the OpenTok servers via videoCaptureConsumer.

    func processPixelBuffer(pixelBuffer: CVPixelBuffer, timeStamp ts: CMTime) {

        // Wrap the incoming buffer in a CIImage and scale it down.
        let sourceImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let ciImage = self.scaleFilterImage(inputImage: sourceImage,
                                                  withAspectRatio: 1.0,
                                                  scale: CGFloat(kVideoFrameScaleFactor)) else { return }

        let scaledWidth = Int(ciImage.extent.size.width)
        let scaledHeight = Int(ciImage.extent.size.height)

        // Recreate the pool only when it is missing or the *scaled* output
        // size changed. Comparing against the incoming buffer's size would
        // recreate the pool on every frame, because the cached buffer is
        // already scaled down.
        if self.pixelBufferPool == nil ||
            self.pixelBuffer.map(CVPixelBufferGetWidth) != scaledWidth ||
            self.pixelBuffer.map(CVPixelBufferGetHeight) != scaledHeight {

            self.destroyPixelBuffers()
            self.updateBufferPool(newWidth: scaledWidth, newHeight: scaledHeight)

            guard let pool = self.pixelBufferPool,
                  CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &self.pixelBuffer) == kCVReturnSuccess
            else { return }
        }

        // Render the scaled image into the pooled buffer and hand that
        // buffer (not the original full-size one) to OpenTok.
        guard let outputBuffer = self.pixelBuffer else { return }
        context?.render(ciImage, to: outputBuffer)

        self.videoCaptureConsumer?.consumeImageBuffer(outputBuffer,
                                                      orientation: .up,
                                                      timestamp: ts,
                                                      metadata: nil)
    }

If the pixelBufferPool is nil, or the size of the scaled buffer changes, I update the pool.

    private func updateBufferPool(newWidth: Int, newHeight: Int) {

        let pixelBufferAttributes: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: UInt(self.videoFormat),
            kCVPixelBufferWidthKey as String: newWidth,
            kCVPixelBufferHeightKey as String: newHeight,
            kCVPixelBufferIOSurfacePropertiesKey as String: [:]
        ]

        CVPixelBufferPoolCreate(nil, nil, pixelBufferAttributes as NSDictionary?, &pixelBufferPool)
    }
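
destroyPixelBuffers is not shown in the original post; presumably it releases the cached buffer and flushes the pool before a new one is created. A minimal sketch of what it might look like:

    private func destroyPixelBuffers() {
        // Drop the cached buffer and flush the pool so the backing
        // IOSurfaces can be reclaimed before a new pool is allocated.
        pixelBuffer = nil
        if let pool = pixelBufferPool {
            CVPixelBufferPoolFlush(pool, .excessBuffers)
        }
        pixelBufferPool = nil
    }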

This is the function I use to scale the CIImage:

    func scaleFilterImage(inputImage: CIImage, withAspectRatio aspectRatio: CGFloat, scale: CGFloat) -> CIImage? {
        scaleFilter?.setValue(inputImage, forKey: kCIInputImageKey)
        scaleFilter?.setValue(scale, forKey: kCIInputScaleKey)
        scaleFilter?.setValue(aspectRatio, forKey: kCIInputAspectRatioKey)
        return scaleFilter?.outputImage
    }
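
scaleFilter and context are created elsewhere (not shown above). A minimal sketch of that setup, assuming the filter is built once and reused; the .cacheIntermediates option is a suggestion, not part of the original code:

    // Created once and reused for every frame; creating a CIFilter per
    // frame would add avoidable per-frame allocations.
    private let scaleFilter = CIFilter(name: "CILanczosScaleTransform")

    // Assumption: disabling intermediate caching keeps Core Image's
    // working set smaller inside the extension's tight memory budget.
    private let context = CIContext(options: [.cacheIntermediates: false])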
My question is: why does it still keep crashing, and is there another way to reduce the CVPixelBuffer size without hitting the memory limit?

I would appreciate any help on this. Swift or Objective-C, I am open to all suggestions.

Mehmet Baykar

1 Answer


I don't know whether Core Image copies or references the data when it creates a CIImage from a CVPixelBuffer. However, Accelerate's vImage library offers a no-copy approach that may solve this. The vImageBuffer_InitForCopyFromCVPixelBuffer function (combined with the kvImageNoAllocate flag) initializes a vImage_Buffer that shares its data with a CVPixelBuffer. Note that you need to lock the Core Video pixel buffer with CVPixelBufferLockBaseAddress before accessing it.

You can take an even more direct route by initializing a vImage_Buffer using:

    let buffer = vImage_Buffer(data: data,
                               height: vImagePixelCount(height),
                               width: vImagePixelCount(width),
                               rowBytes: bytesPerRow)

Pass CVPixelBufferGetBaseAddress(_:) for data, CVPixelBufferGetWidth(_:) and CVPixelBufferGetHeight(_:) for the dimensions, and CVPixelBufferGetBytesPerRow(_:) for rowBytes. You'll still need to keep the CVPixelBuffer locked while the vImage_Buffer references it.
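
Putting those calls together, a minimal sketch of wrapping a locked CVPixelBuffer (inside some frame-processing function; the surrounding names are illustrative):

    import Accelerate
    import CoreVideo

    // Keep the buffer locked while vImage reads through its base address.
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    var srcBuffer = vImage_Buffer(
        data: CVPixelBufferGetBaseAddress(pixelBuffer),
        height: vImagePixelCount(CVPixelBufferGetHeight(pixelBuffer)),
        width: vImagePixelCount(CVPixelBufferGetWidth(pixelBuffer)),
        rowBytes: CVPixelBufferGetBytesPerRow(pixelBuffer))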

You can create source and destination vImage buffers that reference the source and destination Core Video pixel buffers.

vImage provides scale functions for different pixel formats: https://developer.apple.com/documentation/accelerate/vimage/vimage_operations/image_scaling. The scale factor is determined by the ratio between the source and destination buffer dimensions.
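
For example, assuming both pixel buffers hold 8-bit, 4-channel data (e.g. kCVPixelFormatType_32BGRA; the ARGB8888 routines don't care about channel order), the scale step might look like this, where outputPixelBuffer is a hypothetical smaller destination buffer:

    // Wrap the smaller destination CVPixelBuffer the same way as srcBuffer,
    // locked without .readOnly since we write into it.
    CVPixelBufferLockBaseAddress(outputPixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(outputPixelBuffer, []) }

    var destBuffer = vImage_Buffer(
        data: CVPixelBufferGetBaseAddress(outputPixelBuffer),
        height: vImagePixelCount(CVPixelBufferGetHeight(outputPixelBuffer)),
        width: vImagePixelCount(CVPixelBufferGetWidth(outputPixelBuffer)),
        rowBytes: CVPixelBufferGetBytesPerRow(outputPixelBuffer))

    // vImage infers the scale factor from the two buffers' dimensions.
    let error = vImageScale_ARGB8888(&srcBuffer, &destBuffer, nil,
                                     vImage_Flags(kvImageHighQualityResampling))
    assert(error == kvImageNoError)

Because both vImage_Buffers merely reference the CVPixelBuffers' memory, nothing is copied beyond the resampling write itself.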

Flex Monkey
  • Thanks, but I am looking for solutions on the GPU, not the CPU. When I use Accelerate it causes high CPU consumption and renders buffers really slowly. Do you know how I can implement this in Metal or VideoToolbox? – Mehmet Baykar Sep 03 '21 at 15:39
  • That’s odd, because I use Accelerate for real-time video processing without issue. You could post your code here or in the Apple developer forums to ensure you’re using Accelerate optimally. – Flex Monkey Sep 04 '21 at 18:01
  • The problem is that if you run this code on devices like the iPhone 7 or 8, they can’t keep up because of the high CPU usage and eventually disconnect from the WebRTC service. That is why I am looking for a GPU solution. – Mehmet Baykar May 23 '22 at 22:58