I am using Firebase ML Kit for Face Detection, and in the documentation it says:
If necessary, rotate the image so that its imageOrientation property is .up. Create a VisionImage object using the correctly-rotated UIImage. Do not specify any rotation metadata—the default value, .topLeft, must be used.
I am running into a problem where photos I download from the internet tend to work properly, but photos I take with my camera do not. I have a feeling it is due to the way the images are oriented, and I can't figure out how to check the images to ensure the two requirements listed above are satisfied. I tried printing out the image's imageOrientation, but it wasn't helping me much, and for some reason I could not use UIImageOrientationUp, which I saw used in a different Stack Overflow answer (I believe in newer Swift it is written as UIImage.Orientation.up).
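In case it helps, this is roughly how I'm checking the orientation and handing the image to ML Kit right now (profileImage and the function name are just simplified from my own project):

```swift
import UIKit
import FirebaseMLVision

func detectFaces(in profileImage: UIImage) {
    // Inspecting the orientation (raw values: 0 = .up, 3 = .right, etc.)
    print(profileImage.imageOrientation.rawValue)

    // Per the docs: no rotation metadata, just the UIImage itself
    let visionImage = VisionImage(image: profileImage)
    let faceDetector = Vision.vision().faceDetector()

    faceDetector.process(visionImage) { faces, error in
        // Internet photos return faces here; camera photos come back empty
        print("faces:", faces?.count ?? 0, "error:", error ?? "none")
    }
}
```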
This is what gets printed when I try to print the image's orientation:
"<NSLayoutConstraint:0x2809f9a40 'UISV-alignment' UIImageView:0x13de4d4b0.bottom == UILabel:0x13dec1630'orient's Profile'.bottom (active)>",
"<NSLayoutConstraint:0x2809f9a90 'UISV-alignment' UIImageView:0x13de4d4b0.top == UILabel:0x13dec1630'orient's Profile'.top (active)>",
Anyway, if someone could help me write a function that I can use to ensure the image I am about to pass to ML Kit is oriented correctly, I would really appreciate it. Thanks! I am an iOS novice and this is my first "real" app, so I am sorry if there is a better or easier way to accomplish my goal.
*** So I found that when I take a picture with my camera, its orientation is .right, but it looks fine in the actual imageView. I tried changing the orientation to .up, but then the image is actually rotated to the right and the detection still fails... I think I need to change the orientation to .up without visually rotating the image, if that is possible, because when I try to set imageOrientation directly it says it is a get-only property.
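Since imageOrientation is read-only, my current guess is that I need to redraw the image into a new context so the pixel data itself ends up upright, something like this (untested, pieced together from other answers, so I'm not sure it's right):

```swift
import UIKit

extension UIImage {
    /// Redraws the image so the pixel data itself is upright and the
    /// returned copy's imageOrientation is .up, without visibly rotating it.
    func orientedUp() -> UIImage {
        // Already upright: nothing to do
        guard imageOrientation != .up else { return self }

        let format = UIGraphicsImageRendererFormat.default()
        format.scale = scale
        let renderer = UIGraphicsImageRenderer(size: size, format: format)

        // draw(in:) applies the orientation transform for us, so the
        // rendered copy looks identical but reports .up
        return renderer.image { _ in
            draw(in: CGRect(origin: .zero, size: size))
        }
    }
}
```

The idea would then be to pass image.orientedUp() into VisionImage(image:) instead of the raw camera photo, but I'd appreciate confirmation that this is the right approach.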