
I am working on an application with the ARKit and SceneKit frameworks. In my application I have enabled surface detection (I followed the placing-objects sample provided by Apple). How can I find out whether a detected surface is no longer available? That is, I only allow the user to place a 3D object once a surface has been detected in the ARSession.

But if the user moves rapidly or points the camera elsewhere, the detected surface gets lost. In that case, if the user tries to place another object, I shouldn't allow it until they scan the floor again and the surface is re-detected.

Is there a delegate method available to let us know that the detected surface is no longer available?

Andy Jazz
Vidhya Sri

2 Answers

There are delegate functions that you can use. The delegate protocol is ARSCNViewDelegate.

It has a method, renderer(_:didRemove:for:), that fires when an ARAnchor has been removed. You can use this method to perform some operation when a surface gets removed.

ARSCNViewDelegate Link
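A minimal sketch of how that callback might be wired up (the `canPlaceObjects` flag is a hypothetical name for illustration, not part of any API):

```swift
import ARKit
import SceneKit

class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    // Hypothetical flag: gate object placement on having a detected plane.
    var canPlaceObjects = false

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
    }

    // Called when ARKit adds a new anchor (e.g. a newly detected plane).
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        canPlaceObjects = true
    }

    // Called when an anchor is removed from the session.
    func renderer(_ renderer: SCNSceneRenderer, didRemove node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // A detected plane was removed: disallow placement until rescan.
        canPlaceObjects = false
    }
}
```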

Alan
  • Do you mean that this method will get called if there is no surface detection? I tried putting a log in this method. What I tried: I focused on the floor, surface detection happened, and then I focused somewhere else that is not a horizontal surface, and this method was not getting called. – Vidhya Sri Sep 05 '17 at 10:43
  • This method should get called if any surface or plane that you found gets removed from the scene itself. If it gets 'lost' and the phone has lost the position of the plane/surface after shaking, it could still be in the scene and available in the AR world; the phone has just lost its tracking and thinks the surface is elsewhere. – Alan Sep 05 '17 at 10:46
  • So this method will probably get called when the surface tracking is lost due to shaking. Am I right? – Vidhya Sri Sep 05 '17 at 11:31
  • This method will get called when a surface gets fully removed from the AR scene. If it's getting lost, it might still be in the scene, but the phone has lost its orientation or position and can't accurately place the surface anymore. – Alan Sep 05 '17 at 11:33
  • If it lost the orientation or position, will this method get called? Or only if the surface is fully removed? Like I said, let us assume I am focusing on the floor and a surface got detected, then I focus somewhere else where there is no proper plane surface. Let us assume the already scanned surface is still there but the phone just lost its tracking. In this case, will this method be called? – Vidhya Sri Sep 05 '17 at 11:44
  • No, if the phone loses its orientation or position this function probably won't get called. The surface would still be in the AR scene, but it would just not be displayed in the correct position due to this loss. Whereas if, for some reason, a surface/plane/node was removed due to memory constraints or an action by the user, this method should get called. I'm sorry that this discussion might not have answered your question; I just figured when I answered that we were both talking about the same issue. – Alan Sep 05 '17 at 11:51

There are two ways to “lose” a surface, so there’s more than one approach to dealing with such a problem.


As noted in the other answer, there’s an ARSCNViewDelegate method that ARKit calls when an anchor is removed from the AR session. However, ARKit doesn’t remove plane anchors during a running session — once it’s detected a plane, it assumes the plane is always there. So that method gets called only if:

  1. You remove the plane anchor directly by passing it to session.remove(anchor:), or
  2. You reset the session by running it again with the .removeExistingAnchors option.

I’m not sure the former is a good idea, but the latter is important to handle, so you probably want your delegate to handle it well.
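The reset case might look something like this sketch (assuming `sceneView` is your ARSCNView; restarting with these options discards existing plane anchors, which triggers renderer(_:didRemove:for:) for each of them):

```swift
import ARKit

func resetTracking() {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = .horizontal
    // .removeExistingAnchors discards all current anchors, so the
    // didRemove delegate callback fires for every detected plane.
    sceneView.session.run(configuration,
                          options: [.resetTracking, .removeExistingAnchors])
}
```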


You can also “lose” a surface by having it pass out of view — for example, ARKit detects a table, and then the user turns around so the camera isn’t pointed at or near the table anymore.

ARKit itself doesn’t offer you any help for dealing with this problem. It gives you all the info you need to do the math yourself, though. You get the plane anchor’s position, orientation, and size, so you can calculate its four corner points. And you get the camera’s projection matrix, so you can check for whether any point is in the viewing frustum.
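Doing that math yourself might look roughly like this sketch (the function name and the viewport-size parameter are assumptions; it uses the anchor's center/extent and ARCamera's projectPoint(_:orientation:viewportSize:)):

```swift
import ARKit
import simd

/// Returns true if any corner of the plane anchor projects into the viewport.
func isPlaneVisible(_ plane: ARPlaneAnchor, camera: ARCamera, viewportSize: CGSize) -> Bool {
    let halfX = plane.extent.x / 2
    let halfZ = plane.extent.z / 2
    // Four corners of the plane in the anchor's local coordinate space.
    let localCorners: [simd_float4] = [
        simd_float4(plane.center.x - halfX, plane.center.y, plane.center.z - halfZ, 1),
        simd_float4(plane.center.x + halfX, plane.center.y, plane.center.z - halfZ, 1),
        simd_float4(plane.center.x - halfX, plane.center.y, plane.center.z + halfZ, 1),
        simd_float4(plane.center.x + halfX, plane.center.y, plane.center.z + halfZ, 1)
    ]
    let worldToCamera = camera.transform.inverse
    for corner in localCorners {
        let world = plane.transform * corner
        // Skip corners behind the camera (the camera looks down -Z in its own space).
        let inCameraSpace = worldToCamera * world
        guard inCameraSpace.z < 0 else { continue }
        let point = camera.projectPoint(simd_float3(world.x, world.y, world.z),
                                        orientation: .portrait,
                                        viewportSize: viewportSize)
        if point.x >= 0, point.x <= viewportSize.width,
           point.y >= 0, point.y <= viewportSize.height {
            return true
        }
    }
    return false
}
```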

Since you’re already using SceneKit, though, there are also ways to get SceneKit to do the math for you... Working backwards:

  1. SceneKit gives you an isNode(_:insideFrustumOf:) test, so if you have a SCNNode whose bounding box matches the extent of your plane anchor, you can pass that along with the camera (view.pointOfView) to find out if the node is visible.
  2. To get a node whose bounding box matches a plane anchor, implement the ARSCNViewDelegate didAdd and didUpdate callbacks to create/update an SCNPlane whose position and dimensions match the ARPlaneAnchor’s center and extent. (Don’t forget to flip the plane sideways, since SCNPlane is vertically oriented by default.)
  3. If you don’t want that plane visible in the AR view, set its materials to be transparent.
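The steps above might be sketched like so (a rough outline, not a definitive implementation; the `planeIsInView` helper name is an assumption):

```swift
import ARKit
import SceneKit

// Steps 2 and 3: keep an invisible SCNPlane in sync with each ARPlaneAnchor.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                         height: CGFloat(planeAnchor.extent.z))
    plane.firstMaterial?.transparency = 0   // invisible, but still in the scene
    let planeNode = SCNNode(geometry: plane)
    planeNode.simdPosition = planeAnchor.center
    planeNode.eulerAngles.x = -.pi / 2      // SCNPlane is vertical by default
    node.addChildNode(planeNode)
}

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor,
          let planeNode = node.childNodes.first,
          let plane = planeNode.geometry as? SCNPlane else { return }
    plane.width = CGFloat(planeAnchor.extent.x)
    plane.height = CGFloat(planeAnchor.extent.z)
    planeNode.simdPosition = planeAnchor.center
}

// Step 1: test the tracking node against the camera's frustum.
func planeIsInView(_ planeNode: SCNNode, in view: ARSCNView) -> Bool {
    guard let pointOfView = view.pointOfView else { return false }
    return view.isNode(planeNode, insideFrustumOf: pointOfView)
}
```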
rickster