
I created a sample app to test the image sentiment analysis of the Google Cloud Vision API, but I'm not getting good results. The app is running on App Engine: https://feel-vision.appspot.com

The likelihoods of sorrow, anger, and surprise are really hard to move past the "Very unlikely" threshold, no matter how sad, angry, or surprised I try to look in the photo.

The likelihood of joy, on the other hand, moves easily with just a fake smile.

Is there something I can do to improve the sentiment analysis results?

Currently I'm calling the API this way (using the Java client):

// dataUrl is the base64-encoded JPEG sent by the client.
// Note: Image.setContent expects raw base64 data, without any
// "data:image/jpeg;base64," data-URL prefix.
AnnotateImageRequest req = new AnnotateImageRequest()
        .setImage(new Image().setContent(dataUrl))
        .setFeatures(Arrays.asList(
                new Feature().setType("LABEL_DETECTION"),
                new Feature().setType("FACE_DETECTION")));

Vision.Images.Annotate annotate = vision.images().annotate(
        new BatchAnnotateImagesRequest().setRequests(Arrays.asList(req)));
BatchAnnotateImagesResponse batchResponse = annotate.execute();
// process the response (see below)
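
For completeness, this is roughly how I read the likelihood values from the response. It's a minimal sketch of my response handling; the getters come from the v1 Java client's FaceAnnotation model:

List<AnnotateImageResponse> responses = batchResponse.getResponses();
if (responses != null && !responses.isEmpty()) {
    List<FaceAnnotation> faces = responses.get(0).getFaceAnnotations();
    if (faces != null) {
        for (FaceAnnotation face : faces) {
            // Each likelihood is a string such as "VERY_UNLIKELY",
            // "UNLIKELY", "POSSIBLE", "LIKELY" or "VERY_LIKELY".
            System.out.println("joy: " + face.getJoyLikelihood());
            System.out.println("sorrow: " + face.getSorrowLikelihood());
            System.out.println("anger: " + face.getAngerLikelihood());
            System.out.println("surprise: " + face.getSurpriseLikelihood());
        }
    }
}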
Gilberto Torrezan
  • I've had the same kind of results. We've run a number of tests on every type of sentiment that is analysed, and Joy is tracked consistently, but the rest have reached Possible at best and more often than not come back Very Unlikely. It could be that the API is in beta and not completely plugged in yet, but we're not quite sure what a shocked face could be other than how most people would imagine it. – Dave Starling Mar 09 '16 at 15:23
  • Anger and Surprise seem to depend mostly on your mouth being open plus eyebrow expression. Not sure if that helps. – Belfordz Mar 10 '16 at 00:36

0 Answers