
Sometimes the expression and emotion metrics drop to zero even though the face is tracked.

In this example video, the value of the smile classifier drops to zero when the head pitch angle changes.


1 Answer


Currently, expression and emotion tracking is limited to near-frontal orientations of the face. The SDK estimates the head pose and stops reporting metrics when the head pitch or yaw exceeds +/- 25 degrees. This is done to limit the number of false positives caused by changes in head pose. We are planning to make our classifiers more robust to head orientation in future releases.
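In the meantime, you can guard against misreading these zeroed-out metrics by checking the head pose yourself before using the expression values. The sketch below is a minimal illustration, not the SDK's actual API: the `face` object and its `pitch`, `yaw`, and `smile` attributes are hypothetical stand-ins for whatever pose and metric fields your SDK exposes.

    # Minimal sketch (hypothetical field names, not the SDK's real API):
    # gate expression/emotion metrics on the estimated head pose so a
    # zeroed metric near the +/- 25 degree limit isn't read as "no smile".

    PITCH_LIMIT_DEG = 25.0
    YAW_LIMIT_DEG = 25.0

    def is_pose_trackable(pitch_deg: float, yaw_deg: float) -> bool:
        """True if the head pose is within the near-frontal range in which
        the SDK reports expression/emotion metrics."""
        return abs(pitch_deg) <= PITCH_LIMIT_DEG and abs(yaw_deg) <= YAW_LIMIT_DEG

    def smile_value_or_none(face):
        """Return the smile score only when the pose is near frontal,
        otherwise None, so callers can tell 'not reported' from 'zero'."""
        if is_pose_trackable(face.pitch, face.yaw):
            return face.smile
        return None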
