I downloaded the sample Affectiva camera test iOS app from the GitHub repository below.
https://github.com/Affectiva/ios-sdk-samples
It works nicely. I am able to see logs for emotions such as anger, contempt, disgust, fear, joy, and surprise, but not for the "sadness" emotion.
I would like to know: does this sample project use Affectiva's cloud service for detecting emotions, or does it work natively on iOS? Does it use OpenCV internally? How does it work internally? I want to modify it a bit to explore the features, but first I want to understand how this program detects emotions.
Please explain.
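
For context, this is roughly how I am reading the per-frame emotion scores in the detector delegate callback. It is only a minimal sketch in Swift; the `Affdex` module name, the `AFDXDetector`/`AFDXFace` types, and the exact delegate method signature are my reading of the Affdex SDK headers and may not match the sample project exactly.

```swift
import UIKit
import Affdex  // Affectiva (Affdex) SDK framework; the module name may differ

class EmotionLogger: NSObject, AFDXDetectorDelegate {

    // Called by the SDK for each processed camera frame.
    // The Swift-imported signature of the Objective-C method
    // detector:hasResults:forImage:atTime: may differ slightly from this.
    func detector(_ detector: AFDXDetector,
                  hasResults faces: NSMutableDictionary?,
                  forImage image: UIImage,
                  atTime time: TimeInterval) {
        // A nil dictionary means the frame was not processed.
        guard let faces = faces else { return }

        for value in faces.allValues {
            guard let face = value as? AFDXFace else { continue }
            // I believe each emotion score is reported as a value in 0...100.
            print("joy: \(face.emotions.joy), anger: \(face.emotions.anger), sadness: \(face.emotions.sadness)")
        }
    }
}
```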