
I am trying to use Core Image's face detection in iOS 5, but it never detects anything. I'm trying to detect faces in an image that was just captured by the camera, using this code:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    NSDictionary *detectorOptions = [[NSDictionary alloc] initWithObjectsAndKeys:CIDetectorAccuracyHigh, CIDetectorAccuracy, nil];     
    CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];
    NSArray *features = [faceDetector featuresInImage:image.CIImage];
    NSLog(@"Features = %@", features);
    [self dismissModalViewControllerAnimated:YES];
}

This compiles and runs fine, but the features array is always empty, regardless of what's in the image... Any ideas?

Vic320

4 Answers


I can't reply to your @14:52 comment directly, Vic320, but I've been playing with the front camera for face detection - I went around in circles because I couldn't get the front camera to pick up my face at all...

It turns out the detector is very sensitive to rotation - when holding my iPad 2 in portrait (as you'd expect while using the front camera), I was getting less than 10% recognition accuracy. On a whim, I turned it sideways and got 100% recognition with the front camera.

The simple fix, if you always use the front camera in portrait, is to add this little snippet:

// 6 = EXIF orientation for portrait (0th row on the right, 0th column at the top)
NSDictionary *imageOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:6] forKey:CIDetectorImageOrientation];
NSArray *features = [detector featuresInImage:image options:imageOptions];

That 6 forces the detector to operate in portrait mode. Apple's SquareCam sample has a whole bunch of utility methods to determine the orientation dynamically if you need that.
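
If you need the dynamic version, here is a minimal sketch of that mapping (the helper name is mine, and it covers the back camera only - SquareCam also swaps the two landscape values for the mirrored front camera):

// Map the device orientation to the EXIF value that CIDetectorImageOrientation
// expects (back camera only; the front camera's landscape cases are mirrored).
- (NSNumber *)exifOrientationForDeviceOrientation:(UIDeviceOrientation)deviceOrientation {
    int exifOrientation;
    switch (deviceOrientation) {
        case UIDeviceOrientationPortraitUpsideDown:
            exifOrientation = 8; // 0th row on the left, 0th column at the bottom
            break;
        case UIDeviceOrientationLandscapeLeft:
            exifOrientation = 1; // 0th row at the top, 0th column on the left
            break;
        case UIDeviceOrientationLandscapeRight:
            exifOrientation = 3; // 0th row at the bottom, 0th column on the right
            break;
        case UIDeviceOrientationPortrait:
        default:
            exifOrientation = 6; // 0th row on the right, 0th column at the top
            break;
    }
    return [NSNumber numberWithInt:exifOrientation];
}

Pass the result as the CIDetectorImageOrientation value in the options dictionary, exactly as the snippet above does with the hard-coded 6.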

robwormald

OK, it's always helpful to read the documentation CAREFULLY. The UIImage docs say, under the CIImage property: "If the UIImage object was initialized using a CGImageRef, the value of the property is nil." Apparently UIImagePickerController does initialize the image from a CGImageRef, because this property is indeed nil. To make the code above work, you need to add:

CIImage *ciImage = [CIImage imageWithCGImage:image.CGImage];

and change this line:

NSArray *features = [faceDetector featuresInImage:ciImage];
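
Putting it together, the corrected delegate method from the question looks something like this (same high-accuracy options as before; the only changes are the two lines above):

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    // image.CIImage is nil for camera photos, so build the CIImage from the CGImage
    CIImage *ciImage = [CIImage imageWithCGImage:image.CGImage];
    NSDictionary *detectorOptions = [[NSDictionary alloc] initWithObjectsAndKeys:CIDetectorAccuracyHigh, CIDetectorAccuracy, nil];
    CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];
    NSArray *features = [faceDetector featuresInImage:ciImage];
    NSLog(@"Features = %@", features);
    [self dismissModalViewControllerAnimated:YES];
}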

Another BIG thing I noticed is that face detection from a still image does not really work on low-res images from the front camera! It works every time when I use the back, high-res camera. Perhaps the algorithm is tuned for high-res...

Vic320

Try the following, assuming that you've loaded the photo into the image variable:

NSDictionary *options = [NSDictionary dictionaryWithObject:CIDetectorAccuracyLow forKey:CIDetectorAccuracy];
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:options];

// image.CIImage is nil for camera photos, so create the CIImage from the CGImage
CIImage *ciImage = [CIImage imageWithCGImage:[image CGImage]];
// Note: UIImageOrientationUp (0) + 1 gives EXIF 1, but the two enumerations
// do not line up for every orientation - use a full mapping if you need all eight
NSNumber *orientation = [NSNumber numberWithInt:[image imageOrientation] + 1];
NSDictionary *fOptions = [NSDictionary dictionaryWithObject:orientation forKey:CIDetectorImageOrientation];
NSArray *features = [detector featuresInImage:ciImage options:fOptions];
for (CIFaceFeature *f in features) {
    NSLog(@"left eye found: %@", (f.hasLeftEyePosition ? @"YES" : @"NO"));
    NSLog(@"right eye found: %@", (f.hasRightEyePosition ? @"YES" : @"NO"));
    NSLog(@"mouth found: %@", (f.hasMouthPosition ? @"YES" : @"NO"));
    if (f.hasLeftEyePosition)
        NSLog(@"left eye position x = %f, y = %f", f.leftEyePosition.x, f.leftEyePosition.y);
    if (f.hasRightEyePosition)
        NSLog(@"right eye position x = %f, y = %f", f.rightEyePosition.x, f.rightEyePosition.y);
    if (f.hasMouthPosition)
        NSLog(@"mouth position x = %f, y = %f", f.mouthPosition.x, f.mouthPosition.y);
}
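
One caveat once detection works: featuresInImage: returns everything in Core Image coordinates, which put the origin at the bottom-left, while UIKit puts it at the top-left. If you want to draw over the detected faces in a view, flip the y-axis - a minimal sketch, assuming you draw at the CIImage's own size:

// Convert a face rect from Core Image coordinates (bottom-left origin)
// to UIKit coordinates (top-left origin) before drawing it.
CGRect faceRect = f.bounds;
faceRect.origin.y = ciImage.extent.size.height - faceRect.origin.y - faceRect.size.height;
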
Ram G.

None of the answers above worked for me (iOS 8.4, iPad mini and iPad Air 2).

I had the same observation as robwormald: face detection worked when the iPad was rotated, so I rotated the ciImage instead :)

// (pixelBuffer and attachments are assumed to come from a video capture callback)
let ciImage = CIImage(CVPixelBuffer: pixelBuffer, options: attachments)
// rotate the image 90 degrees so the detector sees the face upright
let angle = CGFloat(-M_PI/2)
let rotatedImage = ciImage.imageByApplyingTransform(CGAffineTransformMakeRotation(angle))
Riskov