
I'm making a video effect on iOS (using Metal) that requires accessing pixel data from the current video frame as well as some number of previous frames. To do this I'm storing pixel buffers in an Array property that behaves like a stack. When rendering, I cycle through the pixel buffers and create a MTLTexture for each. These textures then get sent to my Metal shader as a texture2d_array.
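For reference, frames arrive via the capture delegate and get pushed into the pixel buffer property shown below — a hedged sketch, with the delegate wiring assumed rather than copied from my project:

```swift
import AVFoundation

// Hedged sketch of the capture callback feeding the stack below.
// Assumes this class is set as the AVCaptureVideoDataOutputSampleBufferDelegate.
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    // Note: the returned image buffer is owned by the capture pipeline,
    // not by this code (see the comment thread below).
    guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    self.pixelBuffer = buffer
}
```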

It all works great, except that as soon as I try to display data from more than 12 distinct pixel buffers at a time, new frames stop arriving from my capture output and the video appears frozen. There are no warnings or crashes. I've been testing on an iPhone 8.

It seems like I'm hitting some hard limit, though I haven't been able to determine its nature. I'm fairly new to graphics programming, so I may well be doing something bad. I would be enormously grateful for any help or ideas.

Here's my pixel buffer stack:

// number of pixel buffers to retain; anything over 12 causes the "freeze"
let maxTextures = 12
var pixelBuffers: [CVPixelBuffer]?
var pixelBuffer: CVPixelBuffer? {
    set {
        guard let newBuffer = newValue else { return }

        // On the first frame, fill the whole stack with copies of that frame
        if pixelBuffers == nil {
            pixelBuffers = Array(repeating: newBuffer, count: maxTextures)
        }

        // Push the newest frame and drop the oldest
        pixelBuffers!.append(newBuffer)
        pixelBuffers!.removeFirst()

        DispatchQueue.main.async {
            self.setNeedsDisplay()
        }
    }
    get {
        return pixelBuffers?.last
    }
}

Here's where I create the textures:

for i in 0..<maxTextures {

    guard let pixelBuffer = self.pixelBuffers?[i] else { return }

    width = CVPixelBufferGetWidth(pixelBuffer)
    height = CVPixelBufferGetHeight(pixelBuffer)

    var cvTextureOut: CVMetalTexture?
    let status = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, self.textureCache!, pixelBuffer, nil, .bgra8Unorm, width, height, 0, &cvTextureOut)
    guard status == kCVReturnSuccess,
          let cvTexture = cvTextureOut,
          let texture = CVMetalTextureGetTexture(cvTexture) else {
        print("Failed to create metal texture")
        return
    }

    // Lazily create the backing array on the first pass
    if textures == nil {
        textures = Array(repeating: texture, count: maxTextures)
    }

    textures![i] = texture

}
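The binding step that hands these textures to the shader isn't shown above; roughly, it looks like this — a hedged sketch, where `encoder` (the render command encoder for the current pass) is assumed:

```swift
// Hedged sketch: binding the per-frame textures before drawing.
// Assumes `encoder` is the MTLRenderCommandEncoder for the current pass
// and `textures` is the [MTLTexture] built in the loop above.
if let textures = textures {
    // Bind each frame's texture to consecutive fragment texture slots,
    // which the shader then reads as an array of textures.
    encoder.setFragmentTextures(textures, range: 0..<maxTextures)
}
```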

See this code in context here: https://gist.github.com/peeinears/5444692079d011a0ca3947c4e49efd47

I'm happy to share more if it's helpful.

Thanks!

Ian Pearce
  • The most likely situation is that you are using up all the memory resources and your app is getting stalled as a result. There is another process that typically vends you the Core Video buffers; if you hold on to too many (sent to your process), your app is often just killed by iOS. There is also only so much main memory you can allocate before bad and strange things begin to happen to your app. – MoDJ Mar 03 '19 at 07:44
  • @MoDJ Yeah, that was my first thought too. Do you have any suggestions for how I might confirm whether that's the case? My application doesn't receive any memory warnings. The app doesn't crash. While the video "freezes", the rest of the app's UI is still interactive. I tried reducing the size of the input video (which I thought would reduce the data size of each pixel buffer) but I hit the same 12 pixel buffer limit. Thank you! – Ian Pearce Mar 03 '19 at 17:45
  • I just read this comment on the `CMSampleBufferGetImageBuffer` docs: "The caller does not own the returned buffer, and must retain it explicitly if the caller needs to maintain a reference to it." So maybe there's some underlying buffer pool that maxes out at 12 before buffers are released. (Side note: I've tried using `CVPixelBufferLockBaseAddress` but to no avail.) Perhaps I can try storing pixel data in a type I can guarantee will be owned... – Ian Pearce Mar 03 '19 at 20:37
  • Update: I tried storing and creating textures from CGImages instead of CVPixelBuffers and I was able to go well past 12 frames! So it does seem like it's related to how pixel buffers are managed in memory. Thanks @MoDJ. – Ian Pearce Mar 03 '19 at 22:19
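Following up on the pool-ownership theory in the comments above: one way to avoid retaining buffers that belong to the capture session's fixed-size pool is to deep-copy each incoming `CVPixelBuffer` into a buffer you allocate yourself before pushing it onto the stack. This is a hedged sketch, not code from the question: the helper name `copyPixelBuffer` is invented here, and it assumes a simple single-plane (e.g. BGRA) buffer.

```swift
import CoreVideo
import Foundation

// Hypothetical helper: deep-copies a single-plane pixel buffer into a
// buffer we own, so the capture session's pooled buffer can be recycled.
func copyPixelBuffer(_ source: CVPixelBuffer) -> CVPixelBuffer? {
    let width = CVPixelBufferGetWidth(source)
    let height = CVPixelBufferGetHeight(source)
    let format = CVPixelBufferGetPixelFormatType(source)

    var copyOut: CVPixelBuffer?
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height, format, nil, &copyOut) == kCVReturnSuccess,
          let copy = copyOut else { return nil }

    CVPixelBufferLockBaseAddress(source, .readOnly)
    CVPixelBufferLockBaseAddress(copy, [])
    defer {
        CVPixelBufferUnlockBaseAddress(copy, [])
        CVPixelBufferUnlockBaseAddress(source, .readOnly)
    }

    guard let src = CVPixelBufferGetBaseAddress(source),
          let dst = CVPixelBufferGetBaseAddress(copy) else { return nil }

    // Rows may be padded differently in the two buffers, so copy row by row.
    let srcStride = CVPixelBufferGetBytesPerRow(source)
    let dstStride = CVPixelBufferGetBytesPerRow(copy)
    for row in 0..<height {
        memcpy(dst + row * dstStride, src + row * srcStride, min(srcStride, dstStride))
    }
    return copy
}
```

In practice the `nil` attributes passed to `CVPixelBufferCreate` would likely need to be replaced with a dictionary requesting Metal compatibility (`kCVPixelBufferMetalCompatibilityKey`) so the copy can still feed `CVMetalTextureCacheCreateTextureFromImage`.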
