I am creating an iOS app that uses Firebase ML Kit Face Detection, and I am trying to let users take a photo with their camera and check whether there is a face in it. I followed the documentation and some YouTube videos, but it just doesn't seem to work properly or accurately for me.

I did some testing with images from my photo library, not only pictures I take myself, and found that it works well with selfies downloaded from Google, but it never seems to work with selfies I take on my own device. I noticed that when I take a selfie with my camera it does a "mirror" kind of thing where it flips the image, but I even took a picture of my friend using the front-facing camera and it still didn't work. So I am not sure whether I implemented this wrong or what else is going on.

I have attached the relevant code below to show how it was implemented. Thanks to anyone who takes the time to help out; I am a novice at iOS development, so hopefully this isn't a waste of your time.
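For context, the photo is taken with the device camera. Roughly, the capture flow looks like the sketch below (simplified, and assuming the standard UIImagePickerController; the class, outlet, and action names here are placeholders rather than my exact code):

import UIKit

class VerifyViewController: UIViewController,
                            UIImagePickerControllerDelegate,
                            UINavigationControllerDelegate {

    @IBOutlet weak var image_one: UIImageView!

    // Present the front camera so the user can take a selfie.
    @IBAction func takePhotoTapped(_ sender: Any) {
        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.cameraDevice = .front
        picker.delegate = self
        present(picker, animated: true)
    }

    // Put the captured photo into the image view, then run the face check.
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        if let photo = info[.originalImage] as? UIImage {
            image_one.image = photo
        }
        picker.dismiss(animated: true) {
            self.photoVerification()
        }
    }
}

And this is the face detection check that runs on the captured photo: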
func photoVerification() {
    let options = VisionFaceDetectorOptions()
    let vision = Vision.vision()
    let faceDetector = vision.faceDetector(options: options)
    let image = VisionImage(image: image_one.image!)
    faceDetector.process(image) { (faces, error) in
        guard error == nil, let faces = faces, !faces.isEmpty else {
            // No face detected; show an error on the image
            print("No face detected!")
            self.markImage(isVerified: false)
            return
        }
        // Face has been detected; offer the Verified tag to the user
        print("Face detected!")
        self.markImage(isVerified: true)
    }
}
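The thing I keep coming back to is the "mirror"/flip behavior of camera photos. Photos taken with the camera carry orientation metadata (the UIImage's imageOrientation is usually not .up), and I'm wondering if that is what breaks detection on my own pictures. Would redrawing the image so its orientation is .up before passing it to VisionImage be the right approach? I mean something like this (just a sketch of the idea, not code I'm sure is needed):

import UIKit

// Redraw a UIImage so that its pixel data is upright and its
// imageOrientation becomes .up, dropping the camera's orientation metadata.
func normalizedImage(from image: UIImage) -> UIImage {
    guard image.imageOrientation != .up else { return image }
    let renderer = UIGraphicsImageRenderer(size: image.size)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: image.size))
    }
}

// Then I would pass the normalized image to ML Kit instead:
// let image = VisionImage(image: normalizedImage(from: image_one.image!))

Is something like that necessary here, or is my problem elsewhere in how I set up the detector?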