
I have a problem. I am working with two images: one downloaded from the internet, the other captured with the iPhone camera. I use CIDetector to detect faces in both. It works perfectly on the downloaded image, but on the camera image it either detects nothing or detects the wrong region.

I have checked many images; the result is always the same.


3 Answers


Try this:

    NSDictionary *options = [NSDictionary dictionaryWithObject:CIDetectorAccuracyLow forKey:CIDetectorAccuracy];
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:options];

    CIImage *ciImage = [CIImage imageWithCGImage:[image CGImage]];
    NSNumber *orientation = [NSNumber numberWithInt:[image imageOrientation] + 1];
    NSDictionary *fOptions = [NSDictionary dictionaryWithObject:orientation forKey:CIDetectorImageOrientation];
    NSArray *features = [detector featuresInImage:ciImage options:fOptions];
    for (CIFaceFeature *f in features) {
        NSLog(@"left eye found: %@", (f.hasLeftEyePosition ? @"YES" : @"NO"));
        NSLog(@"right eye found: %@", (f.hasRightEyePosition ? @"YES" : @"NO"));
        NSLog(@"mouth found: %@", (f.hasMouthPosition ? @"YES" : @"NO"));

        if (f.hasLeftEyePosition)
            NSLog(@"left eye position x = %f , y = %f", f.leftEyePosition.x, f.leftEyePosition.y);
        if (f.hasRightEyePosition)
            NSLog(@"right eye position x = %f , y = %f", f.rightEyePosition.x, f.rightEyePosition.y);
        if (f.hasMouthPosition)
            NSLog(@"mouth position x = %f , y = %f", f.mouthPosition.x, f.mouthPosition.y);
    }

If you're always using the front camera in portrait, add this:

      NSDictionary *imageOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:6] forKey:CIDetectorImageOrientation];
      NSArray *features = [detector featuresInImage:ciImage options:imageOptions];
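As an aside, the `[image imageOrientation] + 1` expression in the first snippet only happens to give the right EXIF value for some orientations, because `UIImageOrientation` and the EXIF orientation numbers are ordered differently. A more robust approach is an explicit lookup table. The sketch below is an assumption, not part of the original answer; the helper name is hypothetical, and the raw values follow UIKit's documented enum order (`UIImageOrientationUp = 0` through `UIImageOrientationRightMirrored = 7`):

```c
/* Hypothetical helper: maps UIImageOrientation raw values
 * (Up = 0, Down = 1, Left = 2, Right = 3, UpMirrored = 4,
 * DownMirrored = 5, LeftMirrored = 6, RightMirrored = 7)
 * to the EXIF orientation numbers (1-8) that the
 * CIDetectorImageOrientation option expects. */
static int exifOrientationForUIImageOrientation(int uiOrientation) {
    switch (uiOrientation) {
        case 0: return 1; /* Up            -> EXIF 1 */
        case 1: return 3; /* Down          -> EXIF 3 (180 degrees) */
        case 2: return 8; /* Left          -> EXIF 8 (90 CCW)      */
        case 3: return 6; /* Right         -> EXIF 6 (90 CW)       */
        case 4: return 2; /* UpMirrored    -> EXIF 2 */
        case 5: return 4; /* DownMirrored  -> EXIF 4 */
        case 6: return 5; /* LeftMirrored  -> EXIF 5 */
        case 7: return 7; /* RightMirrored -> EXIF 7 */
        default: return 1; /* treat unknown values as Up */
    }
}
```

With a mapping like this, the orientation option in the first snippet would be built as `[NSNumber numberWithInt:exifOrientationForUIImageOrientation(image.imageOrientation)]` instead of adding 1.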

For more info:

sample: https://github.com/beetlebugorg/PictureMe

iOS Face Detection Issue

Face Detection issue using CIDetector

https://stackoverflow.com/questions/4332868/detect-face-in-iphone?rq=1

  • The big problem is that it can't detect faces when I use an image just captured by the iPhone camera – Nhat Huy Nov 09 '12 at 04:41
  • But if I set **NSDictionary* imageOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:6] forKey:CIDetectorImageOrientation];**, images downloaded from the internet can't be detected when they are landscape.........:( – Nhat Huy Nov 09 '12 at 04:54
  • Have you tried the above? Is it working now for images captured from the iPhone? Update your question with the code you used for face detection. – Ramz Nov 09 '12 at 04:57
  • Can we know whether the image captured by the iPhone was taken with the front camera or the back camera? – Nhat Huy Nov 09 '12 at 08:13

I tried the code above. It can detect images captured by the iPhone, but it can't detect images downloaded from the internet. This is my code:

NSDictionary *options = [NSDictionary dictionaryWithObject:CIDetectorAccuracyLow forKey:CIDetectorAccuracy];
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:options];

CIImage *ciImage = [CIImage imageWithCGImage:[facePicture CGImage]];
NSDictionary *imageOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:6] forKey:CIDetectorImageOrientation];
NSArray *features = [detector featuresInImage:ciImage options:imageOptions];

And when a face is detected, I draw it with this code:

for (CIFaceFeature *feature in features) {

    CGRect faceRect = [feature bounds];

    // Semi-transparent fill with a white stroke around the face
    CGContextSetRGBFillColor(context, 0.0f, 0.0f, 0.0f, 0.5f);
    CGContextSetStrokeColorWithColor(context, [UIColor whiteColor].CGColor);
    CGContextSetLineWidth(context, 2.0f * scale);
    CGContextAddRect(context, feature.bounds);
    CGContextDrawPath(context, kCGPathFillStroke);
    CGContextDrawImage(context, faceRect, [imgDraw CGImage]);
}
But the rectangle is not in the right position; it is shifted to the right by some distance.
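For what it's worth, one common cause of misplaced face rectangles (a guess; the poster never confirmed the cause here) is that `CIFaceFeature` bounds use Core Image's bottom-left origin, while a UIKit drawing context puts the origin at the top-left, so the rect must be flipped vertically against the image height before drawing. A minimal plain-C sketch of that flip, with hypothetical names:

```c
/* Sketch of the usual Core Image -> UIKit rectangle conversion.
 * CIDetector reports feature bounds with the origin at the bottom-left
 * of the image; a UIKit context uses a top-left origin, so the y origin
 * must be mirrored against the image height before drawing. */
typedef struct { double x, y, w, h; } FaceRect;

static FaceRect flipRectVertically(FaceRect r, double imageHeight) {
    FaceRect out = r;
    out.y = imageHeight - r.y - r.h; /* mirror the y origin */
    return out;
}
```

In the drawing loop above, the equivalent Core Graphics line would be `faceRect.origin.y = imageHeight - faceRect.origin.y - faceRect.size.height;` before `CGContextAddRect`.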


I had the same problem. You can redraw the image before running detection:

// Redrawing into a new bitmap context applies the UIImage's orientation
// transform, so the resulting image is oriented "up" before detection.
CGSize size = CGSizeMake(cameraCaptureImage.size.width, cameraCaptureImage.size.height);
UIGraphicsBeginImageContext(size);
[cameraCaptureImage drawInRect:CGRectMake(0, 0, size.width, size.height)];
cameraCaptureImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();