
I am currently working on live filters using Metal. After defining my CIImage, I render the image to an MTLTexture.

Below is my rendering code. context is a CIContext backed by Metal; targetTexture is an alias for the texture attached to the currentDrawable property of my MTKView instance:

context?.render(drawImage, to: targetTexture, commandBuffer: commandBuffer, bounds: targetRect, colorSpace: colorSpace)
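For reference, here is a simplified sketch of how everything is wired up (this is not my exact code; the class name and some details are approximate):

import MetalKit
import CoreImage

// Simplified sketch of the rendering setup. drawImage is the filtered CIImage built earlier.
final class FilterRenderer: NSObject, MTKViewDelegate {
    let device = MTLCreateSystemDefaultDevice()!
    lazy var commandQueue = device.makeCommandQueue()!
    lazy var context = CIContext(mtlDevice: device)   // CIContext backed by Metal
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    var drawImage: CIImage?

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

    func draw(in view: MTKView) {
        guard let drawImage = drawImage,
              let drawable = view.currentDrawable,
              let commandBuffer = commandQueue.makeCommandBuffer() else { return }

        // targetTexture is the texture attached to the current drawable.
        // The view's framebufferOnly is false so Core Image can write into it.
        let targetTexture = drawable.texture
        let targetRect = CGRect(x: 0, y: 0,
                                width: targetTexture.width, height: targetTexture.height)

        context.render(drawImage, to: targetTexture, commandBuffer: commandBuffer,
                       bounds: targetRect, colorSpace: colorSpace)

        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}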

It renders correctly; I can see the image being displayed in the Metal view.

The problem is that after rendering the image (and displaying it), I want to extract a CVPixelBuffer from it and save it to disk using AVAssetWriter.

Another alternative would be to have two rendering steps: one rendering to the texture and another rendering to a CVPixelBuffer. (But it isn't clear how to create such a buffer, or what impact two rendering steps would have on the frame rate; a rough sketch of what I have in mind follows.)
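What I have in mind for that second step is roughly the following (untested sketch; the 32BGRA format and the compatibility attributes are guesses on my part):

import CoreImage
import CoreVideo

// Untested sketch: render the filtered CIImage into a freshly created CVPixelBuffer,
// which could then be appended to an AVAssetWriterInputPixelBufferAdaptor.
func makePixelBuffer(from image: CIImage, context: CIContext, colorSpace: CGColorSpace) -> CVPixelBuffer? {
    let attributes = [
        kCVPixelBufferCGImageCompatibilityKey as String: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey as String: true,
        kCVPixelBufferMetalCompatibilityKey as String: true
    ] as CFDictionary

    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     Int(image.extent.width), Int(image.extent.height),
                                     kCVPixelFormatType_32BGRA, attributes, &pixelBuffer)
    guard status == kCVReturnSuccess, let buffer = pixelBuffer else { return nil }

    // Second render pass: Core Image writes the image directly into the pixel buffer.
    context.render(image, to: buffer, bounds: image.extent, colorSpace: colorSpace)
    return buffer
}

For real-time recording I suppose the per-frame allocation could be avoided by drawing into buffers from a CVPixelBufferPool (for example the one vended by AVAssetWriterInputPixelBufferAdaptor), but I haven't measured what the extra render pass costs.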

Any help will be appreciated. Thanks!


2 Answers


You can try to copy the raw data from the MTLTexture like this:

var outPixelbuffer: CVPixelBuffer?
// buffer is non-nil only if the texture was created from an MTLBuffer.
if let bytes = targetTexture.buffer?.contents() {
    // The pixel format constant must match the texture's pixel format.
    CVPixelBufferCreateWithBytes(kCFAllocatorDefault, targetTexture.width,
                                 targetTexture.height, kCVPixelFormatType_64RGBAHalf, bytes,
                                 targetTexture.bufferBytesPerRow, nil, nil, nil, &outPixelbuffer)
}
Stephen Rauch
  • 47,830
  • 31
  • 106
  • 135
Abs
  • 41
  • 3
  • 2
  • Oh, I've just found another way: Core Image can create a CIImage from an MTLTexture without caring about the pixel format. – Abs May 16 '18 at 10:23
  • Note that `buffer` will be nil if the texture was not created from a buffer in the first place. – Ash Dec 08 '18 at 14:11
  • @Ash any idea how to create a pixel buffer if the texture was not created from a buffer? – prabhu Dec 08 '20 at 13:57
  • @prabhu use `CVPixelBufferCreate()` instead. The first four and last two arguments are the same as in the code above. – Ash Dec 09 '20 at 09:03
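For reference, a rough Swift sketch of the `CVPixelBufferCreate()` route from the last comment, assuming a BGRA8 texture whose contents are CPU-readable (i.e. not framebufferOnly):

import Metal
import CoreVideo

// Sketch: allocate a CVPixelBuffer that owns its own memory, then copy the
// texture's pixels into it with getBytes. Assumes a BGRA8 (4 bytes/pixel) texture.
func pixelBuffer(from texture: MTLTexture) -> CVPixelBuffer? {
    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     texture.width, texture.height,
                                     kCVPixelFormatType_32BGRA, nil, &pixelBuffer)
    guard status == kCVReturnSuccess, let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    guard let baseAddress = CVPixelBufferGetBaseAddress(buffer) else { return nil }
    let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
    let region = MTLRegionMake2D(0, 0, texture.width, texture.height)

    // Copy the rendered pixels out of the texture into the pixel buffer's backing store.
    texture.getBytes(baseAddress, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)
    return buffer
}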
+ (void)getPixelBufferFromBGRAMTLTexture:(id<MTLTexture>)texture result:(void(^)(CVPixelBufferRef pixelBuffer))block {

    CVPixelBufferRef pxbuffer = NULL;
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];

    // Copy the texture's contents into a temporary CPU-side buffer.
    // Assumes a BGRA8 texture (4 bytes per pixel) whose contents are CPU-readable.
    size_t imageByteCount = texture.width * texture.height * 4;
    void *imageBytes = malloc(imageByteCount);
    NSUInteger bytesPerRow = texture.width * 4;
    MTLRegion region = MTLRegionMake2D(0, 0, texture.width, texture.height);
    [texture getBytes:imageBytes bytesPerRow:bytesPerRow fromRegion:region mipmapLevel:0];

    // Wrap the copied bytes in a CVPixelBuffer. The buffer does not own the memory
    // (no release callback is passed), so it is only valid inside the block below.
    CVPixelBufferCreateWithBytes(kCFAllocatorDefault, texture.width, texture.height,
                                 kCVPixelFormatType_32BGRA, imageBytes, bytesPerRow,
                                 NULL, NULL, (__bridge CFDictionaryRef)options, &pxbuffer);

    if (block) {
        block(pxbuffer);
    }
    CVPixelBufferRelease(pxbuffer);
    free(imageBytes);
}