
I have a RealityKit ModelEntity that lets a user adjust its scale and translation along a plane through gestures. It was generated as follows:

let aPlane = MeshResource.generatePlane(width: width, depth: depth)
let model = ModelEntity(mesh: aPlane, materials: [someMaterial])
model.generateCollisionShapes(recursive: true) // collision shapes are required for gesture hit-testing
arView.installGestures([.scale, .translation], for: model)

I would like to retrieve its width and depth while the user is pinching to resize it. The goal is to display the scale of the entity (e.g. 1:1, 1:2, 3.5:1).
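
To make that concrete, a small helper like the following (illustrative, not existing code) is what I plan to feed the live value into, assuming I can read a single uniform scale factor off the entity:

// Illustrative helper: turn a uniform scale factor into a ratio label,
// e.g. 1.0 -> "1:1", 0.5 -> "1:2", 3.5 -> "3.5:1".
func ratioLabel(for scale: Float) -> String {
    if scale >= 1 {
        return String(format: "%g:1", Double(scale))
    } else {
        return String(format: "1:%g", Double(1 / scale))
    }
}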

Any idea how I can get the width and depth values as the entity is being resized? I can't seem to find any relevant attributes in the documentation.

I have also tried overriding touchesBegan and printing the boundingRadius, but it prints a constant value even after the entity has been pinched to a new size. Presumably the gesture only changes the entity's transform scale, while the mesh itself (and therefore mesh.bounds) stays the same.

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesBegan(touches, with: event)
    // mesh.bounds is expressed in the mesh's local space, so it is
    // unaffected by the scaling the gesture applies to the transform.
    let model = modelEntities[0].children.first!.components[ModelComponent.self] as? ModelComponent
    print(model!.mesh.bounds.boundingRadius)
}
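
For what it's worth, the closest scale-aware query I have found is visualBounds(relativeTo:), which returns bounds with the entity's transform folded in, unlike mesh.bounds. This is a sketch of the check I have in mind, not something I have verified mid-gesture:

// visualBounds(relativeTo: nil) returns bounds in world space, so the
// scale applied by the pinch gesture should be reflected in the extents.
let worldBounds = modelEntities[0].visualBounds(relativeTo: nil)
print(worldBounds.extents.x, worldBounds.extents.z) // current width and depth
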
defn

1 Answer


ARView.installGestures(_:for:) returns an array of EntityGestureRecognizer, each of which is a UIGestureRecognizer subclass.

You can set yourself as their delegate and use the UIGestureRecognizerDelegate methods to detect when a touch begins, and go from there.
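
For illustration, here is a minimal sketch of that idea. Instead of the delegate it attaches a plain target/action to the recognizers that installGestures returns (they are ordinary UIGestureRecognizers, so addTarget(_:action:) is available) and reads the entity's transform scale on every update. The stored planeWidth/planeDepth values and the handler name are illustrative stand-ins for the question's own constants:

import RealityKit
import UIKit

class GestureScaleViewController: UIViewController {

    @IBOutlet var arView: ARView!
    var model: ModelEntity!

    // Illustrative: the values originally passed to generatePlane(width:depth:).
    let planeWidth: Float = 0.5
    let planeDepth: Float = 0.5

    func installEntityGestures() {
        let recognizers = arView.installGestures([.scale, .translation], for: model)
        for recognizer in recognizers {
            // Fires repeatedly while a gesture is in progress.
            recognizer.addTarget(self, action: #selector(entityGestureDidUpdate(_:)))
        }
    }

    @objc func entityGestureDidUpdate(_ recognizer: UIGestureRecognizer) {
        // The gesture scales the entity's transform, not its mesh, so the
        // current size is the original size multiplied by the scale factors.
        let currentWidth = planeWidth * model.scale.x
        let currentDepth = planeDepth * model.scale.z
        print("current size: \(currentWidth) x \(currentDepth) m, scale \(model.scale.x)")
    }
}

The same handler is also a natural place to build the 1:1 / 3.5:1 label the question asks about.
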

maxxfrazer
Hey Max! (You are this Max Fraser, right? https://medium.com/@maxxfrazer/realitykit-touch-gestures-1ecddb9f9e15) I ended up following your guide: I added a UITapGestureRecognizer, grabbed the entity at the gesture location, and then handled the scaling in the various cases. Thanks for your articles, they were very helpful! – defn Jun 04 '20 at 14:09
That’s me, glad they’ve been helpful for you! – maxxfrazer Jun 06 '20 at 11:02