I have followed a tutorial to detect a face within an image, and it works: it draws a red rectangle around the face by creating a UIView *faceView. Now I am trying to obtain the coordinates of the detected face, but the results returned are slightly off on the y-axis. How can I fix this? Where am I going wrong?

This is what I have attempted:

CGRect newBounds = CGRectMake(faceFeature.bounds.origin.x, 
                              imageView.bounds.size.height - faceFeature.bounds.origin.y - faceFeature.bounds.size.height,
                              faceFeature.bounds.size.width, 
                              faceFeature.bounds.size.height);

This is the source code for the detection:

-(void)markFaces:(UIImageView *)facePicture
{
    // draw a CI image with the previously loaded face detection picture
    CIImage* image = [CIImage imageWithCGImage:facePicture.image.CGImage];

    // create a face detector - since speed is not an issue we'll use a high accuracy
    // detector
    CIDetector* detector = [CIDetector detectorOfType:CIDetectorTypeFace 
                                              context:nil options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh forKey:CIDetectorAccuracy]];

    // create an array containing all the detected faces from the detector    
    NSArray* features = [detector featuresInImage:image];

    // we'll iterate through every detected face. CIFaceFeature provides us
    // with the bounds of the entire face, and the coordinates of each eye
    // and the mouth if detected. Also provided are BOOLs for the eyes and
    // mouth so we can check whether they were detected.
    for(CIFaceFeature* faceFeature in features)
    {
        // get the width of the face
        CGFloat faceWidth = faceFeature.bounds.size.width;

        // create a UIView using the bounds of the face
        UIView* faceView = [[UIView alloc] initWithFrame:faceFeature.bounds];

        // add a border around the newly created UIView
        faceView.layer.borderWidth = 1;
        faceView.layer.borderColor = [[UIColor redColor] CGColor];


        CGRect newBounds = CGRectMake(faceFeature.bounds.origin.x,
                                      imageView.bounds.size.height - faceFeature.bounds.origin.y - faceFeature.bounds.size.height,
                                      faceFeature.bounds.size.width,
                                      faceFeature.bounds.size.height);

        NSLog(@"My view frame: %@", NSStringFromCGRect(newBounds));

        [self.view addSubview:faceView];

        if(faceFeature.hasLeftEyePosition)
        {
        }

        if(faceFeature.hasRightEyePosition)
        {
        }

        if(faceFeature.hasMouthPosition)
        {
        }
    }
}

-(void)faceDetector
{
    // Load the picture for face detection
    UIImageView* image = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"jolie.jpg"]];

    // Draw the face detection image
    [self.view addSubview:image];

    // flip image on y-axis to match coordinate system used by core image
    [image setTransform:CGAffineTransformMakeScale(1, -1)];

    // flip the entire window to make everything right side up
    [self.view setTransform:CGAffineTransformMakeScale(1, -1)];

    // Execute the method used to markFaces in background
    [self performSelectorInBackground:@selector(markFaces:) withObject:image];
}
  • Generally in face detection we have to invert the view on the y-axis; that's why you are getting the wrong y value. See this - http://stackoverflow.com/questions/11154585/face-detection-issue-using-cidetector - same problem :( – TheTiger Aug 20 '12 at 15:24
  • I know, but I read on forums that the y-axis calculation I performed should have fixed it; however, it only partially works =/ – Rory Lester Aug 20 '12 at 15:29
  • I'm still searching for it :/ – TheTiger Aug 20 '12 at 15:32
  • I found this http://www.cluttr.com/?p=291 but it is what I tried above. Please do let me know if you find the answer – Rory Lester Aug 20 '12 at 15:36
  • Oh, why not! Sure! – TheTiger Aug 20 '12 at 16:33

1 Answer

The Core Image coordinate system and the UIKit coordinate system are quite different. CIFaceFeature reports its coordinates in the Core Image coordinate system, so for your purposes you need to convert them into the UIKit coordinate system.
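
Concretely, for an image view of height H, a face rect {x, y, w, h} in Core Image coordinates should end up at {x, H - y - h, w, h} in UIKit coordinates: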

// The Core Image coordinate system origin is at the bottom-left corner and UIKit's is at the top-left corner
// So we need to translate the feature positions before drawing them on screen
// In order to do so we make an affine transform
// **Note**
// It's better to convert Core Image coordinates to UIKit coordinates and
// not the other way around, because flipping the views instead could affect
// other drawings (i.e. in the original sample project the image ends up at
// the bottom, which looks odd)
CGAffineTransform transform = CGAffineTransformMakeScale(1, -1);
transform = CGAffineTransformTranslate(transform, 0, -_pickerImageView.bounds.size.height);

for(CIFaceFeature* faceFeature in features)
{
    // Translate Core Image coordinates to UIKit coordinates
    const CGRect faceRect = CGRectApplyAffineTransform(faceFeature.bounds, transform);

    // create a UIView using the bounds of the face
    UIView* faceView = [[UIView alloc] initWithFrame:faceRect];
    faceView.layer.borderWidth = 1;
    faceView.layer.borderColor = [[UIColor redColor] CGColor];

    // get the width of the face
    CGFloat faceWidth = faceFeature.bounds.size.width;

    // add the new view to create a box around the face
    [_pickerImageView addSubview:faceView];

    if(faceFeature.hasLeftEyePosition)
    {
        // Get the left eye position: translate Core Image coordinates to UIKit coordinates
        const CGPoint leftEyePos = CGPointApplyAffineTransform(faceFeature.leftEyePosition, transform);

        // Note1:
        // If you want to add this to the faceView instead of the image view, you need to
        // translate its coordinates a bit more: {-x, -y}, in other words
        // {-faceFeature.bounds.origin.x, -faceFeature.bounds.origin.y}.
        // You can do the same for the other eye and the mouth too.

        // Create a UIView to represent the left eye; its size depends on the width of the
        // face (EYE_SIZE_RATE is a constant defined elsewhere in the sample, e.g. #define EYE_SIZE_RATE 0.25f)
        UIView* leftEyeView = [[UIView alloc] initWithFrame:CGRectMake(leftEyePos.x - faceWidth*EYE_SIZE_RATE*0.5f /*- faceFeature.bounds.origin.x*/, // See Note1
                                                                       leftEyePos.y - faceWidth*EYE_SIZE_RATE*0.5f /*- faceFeature.bounds.origin.y*/, // See Note1
                                                                       faceWidth*EYE_SIZE_RATE,
                                                                       faceWidth*EYE_SIZE_RATE)];
        leftEyeView.backgroundColor = [[UIColor magentaColor] colorWithAlphaComponent:0.3];
        leftEyeView.layer.cornerRadius = faceWidth*EYE_SIZE_RATE*0.5;
        //[faceView addSubview:leftEyeView];  // See Note1
        [_pickerImageView addSubview:leftEyeView];
    }
}
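
As a quick sanity check, here is a minimal standalone sketch of the same transform (the 400-point view height and the face rect are made-up values for illustration):

// Minimal sketch with made-up values: verify the Core Image -> UIKit flip
CGFloat H = 400.0f;  // hypothetical image view height
CGAffineTransform transform = CGAffineTransformMakeScale(1, -1);
transform = CGAffineTransformTranslate(transform, 0, -H);

// A face at {100, 120, 80, 80} in Core Image coordinates (origin at the bottom-left)...
CGRect ciRect = CGRectMake(100, 120, 80, 80);
CGRect uiRect = CGRectApplyAffineTransform(ciRect, transform);

// ...lands at {{100, 200}, {80, 80}} in UIKit coordinates, since 400 - 120 - 80 = 200
NSLog(@"%@", NSStringFromCGRect(uiRect));

One caveat: faceFeature.bounds is expressed in the image's own coordinate space, so this assumes the image view is sized to match the image (as with initWithImage: in your code); if the view scales its image, the converted rects will be proportionally off as well.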