
I have a Flutter plugin where I need to do some basic 3D rendering on iOS. I decided to go with the Metal API because OpenGL ES is deprecated on the platform.

Before implementing the plugin I implemented the rendering in a standalone iOS application, where it works without problems.

But when rendering to the texture in the plugin, I get the whole area filled with black.

//preparation
Vertices = [Vertex(x:  1, y: -1, tx: 1, ty: 1),
            Vertex(x:  1, y:  1, tx: 1, ty: 0),
            Vertex(x: -1, y:  1, tx: 0, ty: 0),
            Vertex(x: -1, y: -1, tx: 0, ty: 1)]
Indices = [0, 1, 2, 2, 3, 0]

let d = [
    kCVPixelBufferOpenGLCompatibilityKey : true,
    kCVPixelBufferMetalCompatibilityKey : true
]
var cvret = CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32BGRA, d as CFDictionary, &pixelBuffer) // FIXME: which pixel format?
if cvret != kCVReturnSuccess {
    print("failed to create pixel buffer")
}

metalDevice = MTLCreateSystemDefaultDevice()! 

let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: MTLPixelFormat.rgba8Unorm, width: width, height: height, mipmapped: false)
desc.usage = MTLTextureUsage.renderTarget.union( MTLTextureUsage.shaderRead )
targetTexture = metalDevice.makeTexture(descriptor: desc)
metalCommandQueue = metalDevice.makeCommandQueue()!  
ciCtx = CIContext.init(mtlDevice: metalDevice)

let vertexBufferSize = Vertices.count * MemoryLayout<Vertex>.stride
vertexBuffer = metalDevice.makeBuffer(bytes: &Vertices, length: vertexBufferSize, options: .storageModeShared)

let indicesBufferSize = Indices.count * MemoryLayout<UInt32>.stride // Indices is [UInt32], to match .uint32 below
indicesBuffer = metalDevice.makeBuffer(bytes: &Indices, length: indicesBufferSize, options: .storageModeShared)

let defaultLibrary = metalDevice.makeDefaultLibrary()!
let txProgram = defaultLibrary.makeFunction(name: "basic_fragment")
let vertexProgram = defaultLibrary.makeFunction(name: "basic_vertex") 

let pipelineStateDescriptor = MTLRenderPipelineDescriptor()
pipelineStateDescriptor.sampleCount = 1
pipelineStateDescriptor.vertexFunction = vertexProgram
pipelineStateDescriptor.fragmentFunction = txProgram
pipelineStateDescriptor.colorAttachments[0].pixelFormat = .rgba8Unorm

pipelineState = try! metalDevice.makeRenderPipelineState(descriptor: pipelineStateDescriptor)

//drawing
let renderPassDescriptor = MTLRenderPassDescriptor() 
renderPassDescriptor.colorAttachments[0].texture = targetTexture 
renderPassDescriptor.colorAttachments[0].loadAction = .clear 
renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(red: 0.85, green: 0.85, blue: 0.85, alpha: 0.5) 
renderPassDescriptor.colorAttachments[0].storeAction = MTLStoreAction.store
renderPassDescriptor.renderTargetWidth = width
renderPassDescriptor.renderTargetHeight = height

guard let commandBuffer = metalCommandQueue.makeCommandBuffer() else { return }

guard let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor) else { return }
renderEncoder.label = "Offscreen render pass"
renderEncoder.setVertexBuffer(vertexBuffer, offset: 0, index: 0) 

renderEncoder.setRenderPipelineState(pipelineState)
renderEncoder.drawIndexedPrimitives(type: .triangle, indexCount: Indices.count, indexType: .uint32, indexBuffer: indicesBuffer, indexBufferOffset: 0)

renderEncoder.endEncoding() 
commandBuffer.commit() 

//copy to pixel buffer
guard let img = CIImage(mtlTexture: targetTexture) else { return }
ciCtx.render(img, to: pixelBuffer!)
Cezary Butler

2 Answers


I'm pretty sure that creating a separate MTLTexture and then blitting it into a CVPixelBuffer is not the way to go. You are basically rendering into an MTLTexture and then using that result only to copy it out into the CVPixelBuffer via a CIImage.

Instead, you can make them share an IOSurface underneath: create a CVPixelBuffer with CVPixelBufferCreateWithIOSurface and a corresponding MTLTexture with makeTexture(descriptor:iosurface:plane:).
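A minimal sketch of that route (assuming iOS 11+; the helper name `makeSharedTargets` and the error handling are illustrative):

```swift
import CoreVideo
import IOSurface
import Metal

// Sketch: one IOSurface backs both a CVPixelBuffer (for copyPixelBuffer)
// and an MTLTexture (the render target), so no copy is needed.
func makeSharedTargets(device: MTLDevice, width: Int, height: Int)
        -> (CVPixelBuffer, MTLTexture)? {
    guard let surface = IOSurfaceCreate([
        kIOSurfaceWidth: width,
        kIOSurfaceHeight: height,
        kIOSurfaceBytesPerElement: 4,
        kIOSurfacePixelFormat: kCVPixelFormatType_32BGRA
    ] as [CFString: Any] as CFDictionary) else { return nil }

    var unmanaged: Unmanaged<CVPixelBuffer>?
    guard CVPixelBufferCreateWithIOSurface(
        kCFAllocatorDefault, surface,
        [kCVPixelBufferMetalCompatibilityKey: true] as CFDictionary,
        &unmanaged) == kCVReturnSuccess,
        let pixelBuffer = unmanaged?.takeRetainedValue() else { return nil }

    let desc = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .bgra8Unorm,  // must match the surface's pixel format
        width: width, height: height, mipmapped: false)
    desc.usage = [.renderTarget, .shaderRead]
    guard let texture = device.makeTexture(descriptor: desc,
                                           iosurface: surface, plane: 0)
    else { return nil }
    return (pixelBuffer, texture)
}
```

Because there is no conversion pass with a shared surface, the Metal pixel format has to match the surface format: .bgra8Unorm for kCVPixelFormatType_32BGRA (note that the code in the question pairs a BGRA pixel buffer with an .rgba8Unorm texture).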

Or you can create an MTLBuffer that aliases the same memory as the CVPixelBuffer, then create an MTLTexture from that MTLBuffer. If you go this route, I would suggest also using MTLBlitCommandEncoder's optimizeContentsForGPUAccess and optimizeContentsForCPUAccess methods: first optimizeContentsForGPUAccess, then use the texture on the GPU, then twiddle the pixels back into a CPU-readable format with optimizeContentsForCPUAccess. That way you don't lose performance when rendering to the texture.
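A sketch of the buffer-aliasing route, with two caveats stated up front: makeBuffer(bytesNoCopy:) requires a page-aligned base address and length, which CVPixelBuffer does not guarantee, and not every GPU allows .renderTarget usage on linear (buffer-backed) textures:

```swift
import CoreVideo
import Metal

// Round bytes-per-row up to the device's linear-texture alignment (256 is
// a common safe value; query minimumLinearTextureAlignment(for:) in practice).
func alignedBytesPerRow(_ bytesPerRow: Int, alignment: Int = 256) -> Int {
    ((bytesPerRow + alignment - 1) / alignment) * alignment
}

// Sketch: wrap the CVPixelBuffer's backing memory in an MTLBuffer without
// copying, then view that buffer as a linear 2D texture.
func makeAliasedTexture(device: MTLDevice, pixelBuffer: CVPixelBuffer) -> MTLTexture? {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }
    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }

    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let length = bytesPerRow * CVPixelBufferGetHeight(pixelBuffer)
    // bytesNoCopy needs page-aligned pointer and length; treat as best-effort.
    guard let mtlBuffer = device.makeBuffer(bytesNoCopy: base,
                                            length: length,
                                            options: .storageModeShared,
                                            deallocator: nil) else { return nil }

    let desc = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .bgra8Unorm,
        width: CVPixelBufferGetWidth(pixelBuffer),
        height: CVPixelBufferGetHeight(pixelBuffer),
        mipmapped: false)
    // Linear render targets are not supported on every GPU; check the device.
    desc.usage = [.renderTarget, .shaderRead]
    return mtlBuffer.makeTexture(descriptor: desc, offset: 0, bytesPerRow: bytesPerRow)
    // After GPU work, the answer suggests e.g.
    // blitEncoder.optimizeContentsForCPUAccess(texture: texture)
}
```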

JustSomeGuy
  • Thanks. If I could access the IOSurface from the Flutter plugin I would; I wouldn't even need offscreen rendering then. If you know a way to do that, please let me know. For now I'll try the MTLBuffer approach. – Cezary Butler Dec 21 '21 at 21:36
  • I'm not sure what Flutter is, but what do you have as an interface? Which objects are exposed to your plugin? – JustSomeGuy Dec 21 '21 at 21:54
  • Flutter is a Google SDK for developing multi-platform UI. As far as I know the only way to render anything on iOS from within a plugin is by implementing the FlutterTexture protocol. That means adding a `func copyPixelBuffer() -> Unmanaged<CVPixelBuffer>?` method. – Cezary Butler Dec 22 '21 at 09:49

Yes, using a Flutter Texture requires a native implementation of the FlutterTexture protocol, i.e. implementing the `- (CVPixelBufferRef _Nullable)copyPixelBuffer;` method.

Code below:

import Foundation
import CoreVideo
import IOSurface
import Metal
import Flutter

class MetalTexture: NSObject {
    var sourceImageBuf: CVMetalTexture?
    var pixelBuf: Unmanaged<CVPixelBuffer>?
    var textureCache: CVMetalTextureCache?
    
    init(width: Int, height: Int) {
        super.init()
        
        guard let defaultDevice = MTLCreateSystemDefaultDevice() else {
            fatalError("Could not create Metal Device")
        }
        
        guard let ioSurface = IOSurfaceCreate([
            kIOSurfaceWidth: width,
            kIOSurfaceHeight: height,
            kIOSurfaceBytesPerElement: 4,
            kIOSurfacePixelFormat: kCVPixelFormatType_32BGRA] as [CFString : Any] as CFDictionary) else {
            fatalError("IOSurfaceCreate error.")
        }
        
        guard CVPixelBufferCreateWithIOSurface(
            kCFAllocatorDefault,
            ioSurface,
            [kCVPixelBufferMetalCompatibilityKey: true] as CFDictionary,
            &pixelBuf) == kCVReturnSuccess else {
            fatalError("CVPixelBufferCreateWithIOSurface create CVPixelBuffer error")
        }
        
        guard CVMetalTextureCacheCreate(kCFAllocatorDefault,
                                        nil,
                                        defaultDevice,
                                        nil,
                                        &textureCache) == kCVReturnSuccess else {
            fatalError("Failed to create texture cache")
        }
        guard let textureCache = textureCache else { return }
        
        guard CVMetalTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault,
            textureCache,
            pixelBuf!.takeUnretainedValue(),
            nil,
            .bgra8Unorm,
            width,
            height,
            0,
            &sourceImageBuf) == kCVReturnSuccess else {
            fatalError("CVMetalTextureCacheCreateTextureFromImage bind CVPixelBuffer to metal texture error")
        }
        if let image = sourceImageBuf {
            // sharedContext is app-defined state (not shown here) that hands
            // the MTLTexture to the rendering code.
            sharedContext.metalTexture = CVMetalTextureGetTexture(image)
        }
    }
}

extension MetalTexture: FlutterTexture {
    func copyPixelBuffer() -> Unmanaged<CVPixelBuffer>? {
        if let pixelBuf = pixelBuf?.takeUnretainedValue() {
            return Unmanaged.passRetained(pixelBuf)
        } else {
            return nil
        }
    }
}

Apach3
  • Could you put the code directly within the answer? It would be good to point out what the solution consists of, or what the problem was, instead of just posting code on a domain which is not internationally recognized. – Cezary Butler Aug 09 '23 at 07:14