
I am using MetalKit and have a complex rendering pipeline. The results are rendered into a MTKView.

Now I would like to feed the contents of the MTKView into an SCNScene and use an SCNCamera to perform post-processing effects like HDR.

How is this possible?

I do not want general directions. I want specific calls if possible.

user16217248
Summon
  • I assume by "post-process effects like HDR," you mean post-processing effects that are only applicable in an HDR workflow, like tone-mapping. This implies you're already rendering to HDR targets before presenting your scene in an MTKView; correct? In general, you don't have access to the render targets managed by SceneKit, so the chief question here is how to get your HDR targets into "SceneKit world" in a performant way. Which specific effects are you trying to achieve? – warrenm May 17 '18 at 08:31
  • @warrenm Yes, I have a single output in MTLPixelFormatRGBA16Float format which is ready to be processed by an HDR flow. I want to apply HDR. So, my thought was to perform rendering in MTKView or a texture and pass that texture to an SCNScene which has a plane node with the texture from my pipeline applied and an SCNCamera node looking at that plane which will apply the HDR effect. – Summon May 17 '18 at 08:39

1 Answer


You should ideally perform your post-processing as part of your Metal rendering pipeline. The procedure you are suggesting wastes resources, since you would be rendering a 2D plane in a 3D SceneKit scene just to apply some HDR effects.

Nevertheless, you can achieve what you want by rendering your Metal pipeline output to a texture and then simply applying it to a plane in SceneKit.

First assign your texture:

plane.materials.first?.diffuse.contents = offscreenTexture
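The offscreenTexture is assumed to exist already; a minimal sketch of creating it, using `.rgba16Float` to match the HDR format mentioned in the question (`device`, `textureSizeX`, and `textureSizeY` are your own names, not from the answer):

```swift
// Sketch: create an HDR-capable offscreen color target.
// Assumes `device` is your MTLDevice and textureSizeX/Y are Ints.
let descriptor = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .rgba16Float,   // keeps values above 1.0 for the HDR pass
    width: textureSizeX,
    height: textureSizeY,
    mipmapped: false)
descriptor.usage = [.renderTarget, .shaderRead] // written by Metal, sampled by SceneKit
let offscreenTexture = device.makeTexture(descriptor: descriptor)!
```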

Then hook your Metal rendering into SceneKit's render loop by implementing the SCNSceneRendererDelegate callback, so your pass is encoded before SceneKit draws each frame:

func renderer(_ renderer: SCNSceneRenderer, willRenderScene scene: SCNScene, atTime time: TimeInterval) {
    doRender()
}
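Note that this callback only fires if the view's delegate is wired up, which the answer assumes (here `scnView` stands for the SCNView presenting the final scene):

```swift
// Assumes the containing class conforms to SCNSceneRendererDelegate.
scnView.delegate = self
```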

Then perform your Metal rendering with the texture as the render target, and once that is done render your SceneKit scene:

func doRender() {
    // rendering to a MTLTexture, so the viewport is the size of this texture
    let viewport = CGRect(x: 0, y: 0, width: CGFloat(textureSizeX), height: CGFloat(textureSizeY))

    // write to offscreenTexture: clear it before rendering (to green here), store the result
    let renderPassDescriptor = MTLRenderPassDescriptor()
    renderPassDescriptor.colorAttachments[0].texture = offscreenTexture
    renderPassDescriptor.colorAttachments[0].loadAction = .clear
    renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColorMake(0, 1, 0, 1.0) // green
    renderPassDescriptor.colorAttachments[0].storeAction = .store

    guard let commandBuffer = commandQueue.makeCommandBuffer() else { return }

    // here a second SceneKit scene (scene1) is rendered into the texture via SCNRenderer;
    // in your case, encode your own Metal pipeline into this command buffer instead
    renderer.scene = scene1
    renderer.pointOfView = scnView1.pointOfView
    renderer.render(atTime: 0, viewport: viewport, commandBuffer: commandBuffer, passDescriptor: renderPassDescriptor)

    commandBuffer.commit()
}
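Since the goal is HDR post-processing, it is worth noting that SCNCamera has a built-in HDR pipeline that can do the tone mapping once the texture is in the scene. A sketch, assuming a `cameraNode` looking at the plane (the property values are illustrative, not from the answer):

```swift
// Sketch: enable SceneKit's HDR tone mapping on the camera viewing the plane.
let camera = SCNCamera()
camera.wantsHDR = true          // turns on SceneKit's HDR pipeline (iOS 10+/macOS 10.12+)
camera.exposureOffset = -1.0    // EV bias, tune to taste
camera.bloomIntensity = 0.5     // optional bloom on bright regions
camera.bloomThreshold = 1.0
cameraNode.camera = camera
```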

Full example project:

https://github.com/lachlanhurst/SceneKitOffscreenRendering

Oskar