Using ARKit for face tracking, I get a faceAnchor (ARFaceAnchor) as soon as a face is detected, which provides a simd_float4x4 transform matrix. I know about transformation matrices, and I'm aware the topic has been partially addressed elsewhere (here: How to get values from simd_float4 in objective-c, and here: simd_float4x4 Columns), but is there a straightforward way to get yaw/pitch/roll values from the face anchor? (I want to feed them into the yaw/pitch/roll placeholders in the code below.)
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    // Only proceed when the updated anchor is actually a face anchor,
    // instead of force-unwrapping (which would crash on other anchor types)
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }
    let transform = faceAnchor.transform  // simd_float4x4
    print(transform)

    // yawValue / pitchValue / rollValue are the values I want to extract
    let message = OSCMessage(
        OSCAddressPattern("/orientation"),
        yawValue,
        pitchValue,
        rollValue
    )
    client.send(message)
    print(message)
}
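For reference, this is the kind of extraction I have in mind, as a rough sketch only: the helper name is mine, and it assumes an intrinsic Z-Y'-X'' rotation order on the column-major rotation part of the matrix, which may not match ARKit's actual convention (axes may need swapping or negating in practice):

```swift
import Foundation
import simd

// Hypothetical helper: pull yaw/pitch/roll (radians) out of the upper-left
// 3x3 rotation block of a column-major simd_float4x4.
// Assumes an intrinsic Z-Y'-X'' (yaw, then pitch, then roll) order.
func yawPitchRoll(from m: simd_float4x4) -> (yaw: Float, pitch: Float, roll: Float) {
    let yaw   = atan2(m.columns.0.y, m.columns.0.x)  // rotation about z
    let pitch = asin(-m.columns.0.z)                 // rotation about y
    let roll  = atan2(m.columns.1.z, m.columns.2.z)  // rotation about x
    return (yaw, pitch, roll)
}
```

An alternative I've considered is letting SceneKit do the decomposition: assign faceAnchor.transform to a throwaway SCNNode's simdTransform and read back its eulerAngles, but I'm unsure whether that matches the convention I need either.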
FYI, OSCMessage comes from the SwiftOSC framework, which is embedded in my project.