
When I touch the screen, I use a raycast to place an object (an entity and an anchor) at the raycast result's worldTransform.

Then I try to touch this object again to get its anchor (or its own entity).

I am trying to find the previously placed anchor with raycast (or hitTest), but everything returns nil.

This is my onTap code:

@IBAction func onTap(_ sender: UITapGestureRecognizer) {
    let tapLocation = sender.location(in: arView)

    guard let result = arView.raycast(from: tapLocation, allowing: .estimatedPlane, alignment: .any).first else { return }
    print("we have anchor??")
    print(result.anchor) // always nil

    // never hit
    if let existResult = arView.hitTest(tapLocation).first {
        print("hit test trigger")
        let entity = existResult.entity
        NSLog("we have an entity \(entity.name)")
        ...
    }
}

And this is how I create the object and its anchor:

let anchor = AnchorEntity(world: position)
anchor.addChild(myObj)
arView.scene.anchors.append(anchor)
// myObj is now visible

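For context, the two steps above can be combined into a single tap-to-place flow. This is a minimal sketch, not the asker's exact code: the `place(_:at:in:)` helper name is hypothetical, and `arView` and `myObj` are assumed to already exist.

```swift
import ARKit
import RealityKit

// Sketch: raycast from a screen tap and anchor an existing entity
// at the hit's worldTransform, as described in the question.
func place(_ myObj: ModelEntity, at tapLocation: CGPoint, in arView: ARView) {
    guard let result = arView.raycast(from: tapLocation,
                                      allowing: .estimatedPlane,
                                      alignment: .any).first else { return }
    let anchor = AnchorEntity(world: result.worldTransform)
    anchor.addChild(myObj)
    arView.scene.anchors.append(anchor)
}
```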
Do you have any idea why I can't get the anchor I touch?

EDIT :

ARView configuration:


        arView.session.delegate = self
        arView.automaticallyConfigureSession = true

        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]

        NSLog("FINISHED INIT")

        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh // .meshWithClassification
            arView.environment.sceneUnderstanding.options.insert(.occlusion)

            arView.debugOptions.insert(.showSceneUnderstanding)
            NSLog("FINISHED with scene reco")
        } else {
            NSLog("does not support scene reconstruction")
        }

        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(onTap))
        arView.addGestureRecognizer(tapGesture)
        arView.session.run(config)



F4Ke

2 Answers


I finally managed to find a solution:

My (anchored) ModelEntity had to have a collision shape!

So the fix was simply adding entity.generateCollisionShapes(recursive: true).

This is how I generate a simple box :


    let box: MeshResource = .generateBox(width: width, height: height, depth: length)
    var material = SimpleMaterial()
    material.tintColor = color
    let entity = ModelEntity(mesh: box, materials: [material])
    entity.generateCollisionShapes(recursive: true) // very important: activates collision and hit-testing!
    return entity
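Wrapped in a complete helper, the snippet above might look like this. The function name and parameter list are illustrative (not from the original answer), and note that `tintColor` on `SimpleMaterial` is deprecated in newer RealityKit versions, where the `color` property is used instead.

```swift
import RealityKit
import UIKit

// Hypothetical wrapper around the box-generation snippet above.
func makeBox(width: Float, height: Float, length: Float, color: UIColor) -> ModelEntity {
    let box: MeshResource = .generateBox(width: width, height: height, depth: length)
    var material = SimpleMaterial()
    material.tintColor = color // deprecated in newer RealityKit; `material.color` replaces it
    let entity = ModelEntity(mesh: box, materials: [material])
    entity.generateCollisionShapes(recursive: true) // without this, entity(at:) and hitTest find nothing
    return entity
}
```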

And after that we must tell the arView to listen for gestures:

arView.installGestures(.all, for: entity)

and finally :


    @IBAction func onTap(_ sender: UITapGestureRecognizer) {
        let tapLocation = sender.location(in: arView)

        if let hitEntity = arView.entity(at: tapLocation) {
            print("touched")
            print(hitEntity.name)
            return
        }
    }

F4Ke

Raycasting is only possible after ARKit detects planes; it can only raycast against planes or feature points. So make sure you run the AR configuration with plane detection (vertical or horizontal, depending on your case):

let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal

You can check whether a plane anchor was added in the renderer(_:didAdd:for:) delegate method of ARSCNViewDelegate:

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    if let planeAnchor = anchor as? ARPlaneAnchor {
        // plane detected
    }
}
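Note that the question uses RealityKit's ARView rather than SceneKit's ARSCNView, so ARSCNViewDelegate does not apply there. A sketch of the equivalent check with ARSessionDelegate (which the question's configuration already sets via arView.session.delegate = self):

```swift
import ARKit

// ARSessionDelegate equivalent of the ARSCNViewDelegate callback above,
// for use with a RealityKit ARView.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors {
        if let planeAnchor = anchor as? ARPlaneAnchor {
            print("plane detected: \(planeAnchor.alignment)")
        }
    }
}
```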
Ozgur Sahin
  • Hi, yes, raycast works fine when I want to add a new anchor; however, here I want to detect when I touch an already existing object – F4Ke Jan 22 '21 at 12:11