
I added an SCNNode to an ARSCNView:

 func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
  guard let faceAnchor = anchor as? ARFaceAnchor else { return nil }
  guard let device = sceneView.device else { return nil }
  guard let faceGeometry = ARSCNFaceGeometry(device: device, fillMesh: true) else { return nil }

  let faceNode = FaceNode(faceGeometry)

  // Node is a custom SCNNode class
  let glassesNode = Node(image: UIImage(named: "glasses")!, position: .glasses, anchor: faceAnchor)
  faceNode.addChildNode(glassesNode)

  return faceNode
 }

Then I use this session delegate method to apply a filter to each frame. The problem is that the SCNNode I added, glassesNode, does not appear in frame.capturedImage, so the filter is applied to everything except that node.

 func session(_ session: ARSession, didUpdate frame: ARFrame) {

  // CIImage(cvImageBuffer:) is not a failable initializer, so no guard is needed
  let image = CIImage(cvImageBuffer: frame.capturedImage)

  let filterImage = setFilter(image)

  sceneView.scene.background.contents = context.createCGImage(filterImage, from: filterImage.extent)
 }
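
setFilter(_:) and context are not shown above: setFilter takes a CIImage and returns the filtered CIImage, and context is a stored CIContext. A minimal sketch of both (CISepiaTone is only a placeholder for the filter actually being used):

 let context = CIContext()

 func setFilter(_ image: CIImage) -> CIImage {
  // Placeholder filter; the real filter chain goes here
  guard let filter = CIFilter(name: "CISepiaTone") else { return image }
  filter.setValue(image, forKey: kCIInputImageKey)
  filter.setValue(0.9, forKey: kCIInputIntensityKey)
  return filter.outputImage ?? image
 }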

The only workaround I've found is ARSCNView's snapshot() method. It works well and the SCNNode gets filtered, but it introduces too much latency, which is a problem with UI elements on the screen.
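
For reference, the snapshot version looks roughly like this. filterView here is a hypothetical UIImageView laid over the scene view, just to show where the filtered image could go; the exact display path isn't the point:

 func session(_ session: ARSession, didUpdate frame: ARFrame) {
  // snapshot() renders the full SceneKit scene, so glassesNode is included
  guard let snapshotImage = CIImage(image: sceneView.snapshot()) else { return }
  let filterImage = setFilter(snapshotImage)
  if let cgImage = context.createCGImage(filterImage, from: filterImage.extent) {
   filterView.image = UIImage(cgImage: cgImage)
  }
 }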

Is there any way I can use ARFrame's capturedImage and have it include the SCNNode that is on the ARSCNView?

Bobby

1 Answer


This is not possible using the capturedImage of the ARFrame. ARKit itself is not aware that you are using SceneKit to render your 3D content, so capturedImage only contains the camera image; your SceneKit nodes are composited on top of it afterwards by the ARSCNView.
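
To make the difference concrete (using the names from the question): capturedImage is the raw camera buffer that ARKit delivers before SceneKit draws anything, while the rendered nodes only exist in what the ARSCNView itself produces, e.g. via snapshot():

 func session(_ session: ARSession, didUpdate frame: ARFrame) {
  // Camera pixels only: filled in by ARKit before any SceneKit rendering,
  // so glassesNode can never appear here
  let cameraOnly = CIImage(cvImageBuffer: frame.capturedImage)

  // Rendered composite: camera background plus every SCNNode, produced by
  // the ARSCNView, which is why it is slower than reading capturedImage
  let composited = sceneView.snapshot()

  _ = (cameraOnly, composited) // silence "unused" warnings in this sketch
 }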