
I'm using ARKit to detect whether the mouth is open or not.

Mouth Open: This value is controlled by how much you open your mouth.

Range: 0.0 to 1.0

However, when I yawn, the value reports that the mouth is closed.

I'm receiving the values from faceAnchor.blendShapes inside this delegate method:

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let faceGeometry = node.geometry as? ARSCNFaceGeometry else {
            return
        }
        // Blend shape coefficients for the current frame, each in the range 0.0 to 1.0.
        print(faceAnchor.blendShapes)
        faceGeometry.update(from: faceAnchor.geometry)
    }

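For completeness, this is roughly how I read individual coefficients out of that dictionary. It's only a sketch: jawOpen and mouthClose are just the keys I've been experimenting with, and the 0.5 threshold is arbitrary.

    import ARKit

    // Sketch only: reading individual blend shape coefficients.
    // jawOpen and mouthClose are real ARFaceAnchor.BlendShapeLocation keys,
    // but using them this way (and the 0.5 threshold) is just my assumption.
    func checkMouthOpen(for faceAnchor: ARFaceAnchor) {
        let blendShapes = faceAnchor.blendShapes
        let jawOpen = blendShapes[.jawOpen]?.floatValue ?? 0.0        // lower-jaw opening, 0.0 to 1.0
        let mouthClose = blendShapes[.mouthClose]?.floatValue ?? 0.0  // lip closure independent of the jaw, 0.0 to 1.0

        // Treat the mouth as "open" when the jaw is open and the lips are not held together.
        let isMouthOpen = jawOpen > 0.5 && mouthClose < 0.5
        print("jawOpen: \(jawOpen), mouthClose: \(mouthClose), open: \(isMouthOpen)")
    }

My understanding of the docs is that mouthClose describes closure of the lips independent of jaw position, so a yawn with the lips apart should show a high jawOpen together with a low mouthClose, but I may be misreading it.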
I saw the Android equivalent in this post: Android Mobile Vision API detect mouth is open.
