I've been researching Apple's Vision framework and Core ML for hand detection. All I'd like to do is touch a real-life object with my hand (or even just a detected plane) and have that trigger a behavior, such as loading a new scene. This is easily accomplished in Reality Composer for tapping objects on screen (using some ray-casting, I imagine), but after a few weeks of research I haven't come across anything like it for pure AR.
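For reference, here's a rough sketch of the direction I've been experimenting with: run `VNDetectHumanHandPoseRequest` on the current `ARFrame`, project the index fingertip into screen space, and raycast from that point onto detected planes. The Vision-to-view coordinate conversion here is deliberately simplified (it ignores the frame's display transform and aspect-fill cropping), and this only finds where the fingertip's line of sight intersects a plane, not an actual 3D touch, so treat it as a starting point rather than a working solution:

```swift
import ARKit
import RealityKit
import Vision

final class HandTouchDetector {
    // Vision request for hand landmark detection (iOS 14+).
    private let handPoseRequest: VNDetectHumanHandPoseRequest = {
        let request = VNDetectHumanHandPoseRequest()
        request.maximumHandCount = 1
        return request
    }()

    /// Detects the index fingertip in the camera frame and raycasts from
    /// its screen position onto detected planes, returning the hit's
    /// world position if any.
    /// NOTE: assumes a portrait back camera whose image fills the view;
    /// a real app should convert via the ARFrame's displayTransform.
    func fingertipHit(in arView: ARView, frame: ARFrame) -> SIMD3<Float>? {
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                            orientation: .right)
        try? handler.perform([handPoseRequest])

        guard let observation = handPoseRequest.results?.first,
              let tip = try? observation.recognizedPoint(.indexTip),
              tip.confidence > 0.5 else { return nil }

        // Vision points are normalized with origin at bottom-left;
        // convert to UIKit-style view coordinates (origin top-left).
        let viewSize = arView.bounds.size
        let screenPoint = CGPoint(x: tip.location.x * viewSize.width,
                                  y: (1 - tip.location.y) * viewSize.height)

        // Raycast from the fingertip's screen location onto planes.
        guard let result = arView.raycast(from: screenPoint,
                                          allowing: .estimatedPlane,
                                          alignment: .any).first else { return nil }
        let t = result.worldTransform.columns.3
        return SIMD3<Float>(t.x, t.y, t.z)
    }
}
```

Even with this, deciding that the hand has actually *touched* the surface (rather than merely passing in front of it from the camera's point of view) seems to require depth information, e.g. comparing LiDAR scene depth at the fingertip against the raycast distance, which is the part I'm stuck on.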
