
I'm using the Azure Face Recognition API in an iPhone app. It's working just fine when I take pictures with the back camera but when I use the front-facing one, the API fails to detect faces.

I tried transferring the (front-facing) photo to my laptop and dragging it into the test area in the documentation, and there the face was detected just fine.

This leads me to believe that there may be some metadata or flags specific to front-facing photos that confuse the API, and that those are stripped when the image is uploaded through a browser?

UPDATE

Here's how I'm uploading the file using Alamofire:

let data = UIImageJPEGRepresentation(photo, 0.5)!  // force-unwrapped; upload(_:to:) takes non-optional Data
let url = "https://.../detect"
let octetHeaders = ["Content-Type": "application/octet-stream", "Ocp-Apim-Subscription-Key": "..."]
Alamofire.upload(data, to: url, method: .post, headers: octetHeaders)

Thanks!

mikker
How is the image uploaded to Face API? The service determines the rotation according to the EXIF information, so I suspect the app you use does not upload that information correctly. – Xuan Hu Aug 15 '17 at 10:10
  • You might be right – I've updated the question. – mikker Aug 16 '17 at 10:48
  • 1
I am not an iOS development expert, but it seems that UIImageJPEGRepresentation produces a re-encoded image that has already lost its EXIF information. It would be better to upload the original file stream instead of the raw image data (which only contains pixel information). – Xuan Hu Aug 16 '17 at 15:32
Can you help on some points? 1) Did you create a project on the Azure portal and use that API key to recognize the person's name? 2) Did you add your login user's photos folder-wise on the portal? – Jamshed Alam Nov 27 '19 at 06:03

1 Answer


Xuan Hu was right in the comments. It turns out the iPhone doesn't rotate images – it just sets an orientation EXIF tag.

Hard-rotating the photo before uploading made it all work:

func normalizeImageRotation(_ image: UIImage) -> UIImage {
    // Already upright – nothing to do
    if (image.imageOrientation == UIImageOrientation.up) { return image }

    // Redraw the image into a new context; draw(in:) honors the
    // orientation flag, so the rotation gets baked into the pixel data
    UIGraphicsBeginImageContextWithOptions(image.size, false, image.scale)
    image.draw(in: CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
    let normalizedImage = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()
    return normalizedImage
}
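For completeness, a minimal sketch of how this slots into the upload code from the question (the variables `photo`, `url`, and `octetHeaders` are assumed to be the same ones defined there):

```swift
// Normalize first, then JPEG-encode. The re-encoded data no longer
// depends on the EXIF orientation tag because the pixels are upright.
let upright = normalizeImageRotation(photo)
guard let data = UIImageJPEGRepresentation(upright, 0.5) else { return }
Alamofire.upload(data, to: url, method: .post, headers: octetHeaders)
```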