
I downloaded the sample Affectiva camera test iOS app from the GitHub repository below.

https://github.com/Affectiva/ios-sdk-samples

It works nicely. I can see logs for emotions such as anger, contempt, disgust, fear, joy, and surprise, but not for the "sadness" emotion.

I would like to know: does this sample project use Affectiva's cloud service for detecting emotions, or does it work natively on iOS? Does it use OpenCV internally? How does it work internally? I want to modify it a bit to explore the features, but first I want to understand how this program detects emotions.

Please explain.

Stella

2 Answers


The Affectiva iOS SDK doesn't require cloud service calls to analyze the face. The current version of the SDK uses computer vision to extract HOG features from the face and SVM classifiers to classify the facial expressions and emotions. The classification is done on the iPhone/iPad; no images are uploaded to the cloud.
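
To make the pipeline concrete, here is a minimal sketch of the general HOG + SVM approach described above. This is not Affectiva's actual code; the crop size, HOG parameters, and choice of a linear SVM are all assumptions, and it uses scikit-image and scikit-learn in Python rather than the SDK's native implementation:

```python
# Hypothetical illustration of a HOG + SVM expression classifier.
# NOT Affectiva's code; parameters below are assumptions for demo purposes.
import numpy as np
from skimage.feature import hog
from skimage.transform import resize
from sklearn.svm import LinearSVC

def extract_hog(face_img):
    """Resize a grayscale face crop and compute its HOG descriptor."""
    face = resize(face_img, (64, 64))       # normalize crop size (assumed)
    return hog(face,
               orientations=9,
               pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))      # flattened 1-D feature vector

def train(train_faces, train_labels):
    """Fit a linear SVM on HOG features of labeled face crops."""
    X = np.array([extract_hog(f) for f in train_faces])
    clf = LinearSVC()                       # linear SVM classifier
    clf.fit(X, train_labels)                # labels e.g. "joy", "anger"
    return clf

def predict_expression(clf, face_img):
    """Classify a single face crop into an expression label."""
    return clf.predict([extract_hog(face_img)])[0]
```

Presumably the SDK ships with pre-trained models, so on the device it only performs the equivalent of the feature-extraction and prediction steps; the training happens offline on Affectiva's own datasets.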

ahamino

I don't think the Affectiva iOS SDK works without a cloud service call. It doesn't use OpenCV either.

user1953977