
I'm taking a snapshot of every frame, applying a filter, and updating the background contents of the ARSCNView with the filtered image. Everything works, but there is a lot of latency on all the UI elements on the screen; there is no latency on the ARSCNView itself.

// `context` is a CIContext created once and reused across frames.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    guard let image = CIImage(image: sceneView.snapshot()) else { return }

    // I apply a filter to each image here, which has no effect on the latency.

    sceneView.scene.background.contents = context.createCGImage(image, from: image.extent)
}

I know I can use frame.capturedImage, which makes the latency go away. However, frame.capturedImage doesn't include the AR objects I place on the screen, and sceneView.scene.background.contents cannot be reset to its original source, so I cannot turn the image filter off. That's why I need to take a snapshot.
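For reference, the low-latency path looks roughly like this (a minimal sketch; the CIContext property and the CIPhotoEffectNoir filter are stand-ins for my actual setup):

let context = CIContext()   // created once, reused every frame

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // Raw camera pixels only -- no SceneKit content is rendered into this buffer.
    var image = CIImage(cvPixelBuffer: frame.capturedImage)

    // Stand-in filter; any CIFilter chain goes here.
    if let filter = CIFilter(name: "CIPhotoEffectNoir") {
        filter.setValue(image, forKey: kCIInputImageKey)
        image = filter.outputImage ?? image
    }

    sceneView.scene.background.contents = context.createCGImage(image, from: image.extent)
}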

Is there anything I can do that will reduce latency on the UI elements? I have a few UIScrollViews on the screen that have tremendous lag.

Bobby
  • Snapshotting Metal content (which is what the SceneKit view is drawing) and bringing it back to the CPU is always going to be expensive. What’s the actual Core Image filtering you’re applying? There may be a way to do something similar with a Metal shader in an [SCNTechnique](https://developer.apple.com/documentation/scenekit/scntechnique). – Noah Witherspoon Apr 09 '20 at 17:04
  • I did look into SCNTechnique, but it seems like a lot to learn. I'm using a lot of CIFilters, mostly from these categories: CICategoryColorAdjustment, CICategoryColorEffect, CICategoryDistortionEffect, CICategoryStylize. Currently, I'm using frame.capturedImage and applying CIFilters, but frame.capturedImage doesn't include the SCNNodes I place on the screen. This would solve my problem as well: https://stackoverflow.com/questions/61068625/scnnode-not-showing-in-arframes-capturedimage – Bobby Apr 09 '20 at 22:57
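
For illustration, a chain of filters from those categories might look roughly like this (the filter names and parameters here are illustrative picks, not necessarily the ones in use):

// Illustrative CIFilter chain; names are example picks from the
// categories mentioned in the comment above.
func applyFilters(to input: CIImage) -> CIImage {
    var image = input

    // CICategoryColorAdjustment
    if let exposure = CIFilter(name: "CIExposureAdjust") {
        exposure.setValue(image, forKey: kCIInputImageKey)
        exposure.setValue(0.5, forKey: kCIInputEVKey)
        image = exposure.outputImage ?? image
    }

    // CICategoryColorEffect
    if let effect = CIFilter(name: "CIPhotoEffectNoir") {
        effect.setValue(image, forKey: kCIInputImageKey)
        image = effect.outputImage ?? image
    }

    return image
}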

1 Answer


I'm also in the middle of looking for a way to do this with no lag, but I was able to at least reduce the lag by rendering the view into an image manually:

extension ARSCNView {
    /// Renders the view hierarchy into an image manually. Seems faster than the
    /// built-in snapshot() function, but the lag is still somewhat noticeable.
    var snapshot: UIImage? {
        let renderer = UIGraphicsImageRenderer(size: self.bounds.size)
        let image = renderer.image { _ in
            // drawHierarchy(in:afterScreenUpdates:) draws into the renderer's current context.
            self.drawHierarchy(in: self.bounds, afterScreenUpdates: true)
        }
        return image
    }
}

It's frustrating that this is faster than the built-in snapshot function, but it seems to be, and it still captures all the SceneKit graphics in the snapshot. (Doing this every frame will still be expensive, though; the only real solution for that would likely be a custom Metal shader.)
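For context, a rough sketch of how the property above would slot into the question's frame callback (assuming the same reusable CIContext named `context`):

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // `snapshot` here is the computed property from the extension above.
    guard let uiImage = sceneView.snapshot,
          let image = CIImage(image: uiImage) else { return }

    // ...apply whatever CIFilter chain you need to `image` here...

    sceneView.scene.background.contents = context.createCGImage(image, from: image.extent)
}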

I'm also trying to work with ARSCNView.snapshotView(afterScreenUpdates:), because that seems to have essentially no lag for my purposes, but whenever I try to turn the resulting view into a UIImage, it comes out totally blank. Either way, the method above cut the lag roughly in half for me, so you might have some luck with it.

Michael
  • Thanks for the response! Unfortunately, I did try this, but for me it's actually slower than the snapshot function. Would using frame.capturedImage work for you? That works fast for me, but I'm placing SCNNodes on the screen, which it doesn't capture. – Bobby Apr 14 '20 at 09:10
  • Yeah, frame.capturedImage will work well depending on your purposes, but it specifically captures the raw camera frame, *not* the SceneKit models, unfortunately. (And similarly, using something like an SCNTechnique on the SceneKit objects will only process the SceneKit geometries, not the camera frame.) So to process all of them together, you'd probably want to pass both to a custom Metal shader. :( – Michael Apr 14 '20 at 18:29