I want to render a CIImage into a pixel buffer of type kCVPixelFormatType_128RGBAFloat, but CIContext.render() fails with "unsupported format". I'm testing on an iPhone 7 Plus running iOS 11.
Here's my code:
let context = CIContext()
var buffer: CVPixelBuffer? = nil
// CVPixelBufferCreate returns a CVReturn status; the buffer comes back via the out-parameter.
let status = CVPixelBufferCreate(nil,
                                 width,
                                 height,
                                 kCVPixelFormatType_128RGBAFloat,
                                 nil,
                                 &buffer)
assert(status == kCVReturnSuccess && buffer != nil, "Couldn't create buffer")
context.render(ciImage, to: buffer!)
The buffer is created successfully — the assertion doesn't fire. It's only the rendering in the last line that fails saying "unsupported format".
I also tried creating an IOSurface-backed CVPixelBuffer by replacing the second nil with [kCVPixelBufferIOSurfacePropertiesKey: [:]] as CFDictionary, but it didn't help.
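For reference, the IOSurface-backed attempt looked roughly like this (a sketch; the 16×16 dimensions are stand-ins for my real ones):

```swift
import CoreVideo

let width = 16, height = 16   // stand-ins for the real dimensions
// Passing an (empty) IOSurface properties dictionary asks CoreVideo
// to back the pixel buffer with an IOSurface.
let attrs = [kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary] as CFDictionary
var ioBuffer: CVPixelBuffer? = nil
let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                 width,
                                 height,
                                 kCVPixelFormatType_128RGBAFloat,
                                 attrs,
                                 &ioBuffer)
```

The buffer is created with kCVReturnSuccess either way; only the subsequent render fails.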
How do I get this to work?
The format needs to be kCVPixelFormatType_128RGBAFloat for reasons that are too complex to get into here; the short version is that the pixel values have a greater range than 0-255, including fractional values that cannot be rounded.
I tried some more things:
- kCVPixelFormatType_64ARGB
- The software renderer
- Creating the CIContext backed by an EAGLContext
- Creating the CIContext backed by an MTLDevice
- Calling CIContext.createCGImage()
- Rendering to an MTLTexture, though I couldn't figure out how to create one
- Rendering to an IOSurface
- Calling clearCaches() on the CIContext
- Calling reclaimResources(), though that's not available on iOS
- Checking that my input is smaller than CIContext.inputImageMaximumSize() and outputImageMaximumSize()
- Rendering to a raw byte array
None of these worked. Is rendering to 32-bit-per-channel floats, or 16-bit-per-channel ints, simply not supported by Core Image?
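For completeness, the raw-byte-array attempt used CIContext's render(_:toBitmap:rowBytes:bounds:format:colorSpace:) with the kCIFormatRGBAf bitmap format (128 bits per pixel, four 32-bit float channels). Roughly (a sketch; the solid-color test image and 16×16 size stand in for my real input):

```swift
import CoreImage

let context = CIContext()
let width = 16, height = 16   // stand-ins for the real dimensions
// A solid-color test image in place of the real ciImage.
let ciImage = CIImage(color: CIColor(red: 1, green: 0, blue: 0))
    .cropped(to: CGRect(x: 0, y: 0, width: width, height: height))

let bytesPerPixel = 16 // 4 float channels x 4 bytes each
var data = Data(count: width * height * bytesPerPixel)
data.withUnsafeMutableBytes { (ptr: UnsafeMutableRawBufferPointer) in
    // Render directly into client memory in 32-bit-float RGBA layout.
    context.render(ciImage,
                   toBitmap: ptr.baseAddress!,
                   rowBytes: width * bytesPerPixel,
                   bounds: CGRect(x: 0, y: 0, width: width, height: height),
                   format: kCIFormatRGBAf,
                   colorSpace: CGColorSpaceCreateDeviceRGB())
}
```

This is the same "unsupported format" failure path on my device, so I'm including it only to show exactly what I tried.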