
I'm trying to make a face recognition app that identifies a person once it detects their face. I've already done the face detection part, but I couldn't find a way to compare the detected face against a photo from an album stored in the app.

Here is the face detection code:

-(void)markFaces:(UIImageView *)facePicture
{
    // create a CIImage from the previously loaded face detection picture
    CIImage* image = [CIImage imageWithCGImage:facePicture.image.CGImage];

    // create a face detector with high accuracy
    CIDetector* detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];

    // get an array of all the faces detected in the image
    NSArray* features = [detector featuresInImage:image];

    for (CIFaceFeature* faceFeature in features)
    {
        // width of the face, used to scale the feature markers
        CGFloat faceWidth = faceFeature.bounds.size.width;

        // create a UIView using the bounds of the face
        UIView* faceView = [[UIView alloc] initWithFrame:faceFeature.bounds];

        // add a red border around the face and put the box on screen
        faceView.layer.borderWidth = 1;
        faceView.layer.borderColor = [[UIColor redColor] CGColor];
        [self.view addSubview:faceView];

        if (faceFeature.hasLeftEyePosition)
        {
            // semi-transparent blue circle over the left eye, sized relative to the face
            UIView* leftEyeView = [[UIView alloc] initWithFrame:CGRectMake(faceFeature.leftEyePosition.x - faceWidth*0.15, faceFeature.leftEyePosition.y - faceWidth*0.15, faceWidth*0.3, faceWidth*0.3)];
            [leftEyeView setBackgroundColor:[[UIColor blueColor] colorWithAlphaComponent:0.3]];
            [leftEyeView setCenter:faceFeature.leftEyePosition];
            leftEyeView.layer.cornerRadius = faceWidth*0.15;
            [self.view addSubview:leftEyeView];
        }

        if (faceFeature.hasRightEyePosition)
        {
            // semi-transparent blue circle over the right eye
            UIView* rightEyeView = [[UIView alloc] initWithFrame:CGRectMake(faceFeature.rightEyePosition.x - faceWidth*0.15, faceFeature.rightEyePosition.y - faceWidth*0.15, faceWidth*0.3, faceWidth*0.3)];
            [rightEyeView setBackgroundColor:[[UIColor blueColor] colorWithAlphaComponent:0.3]];
            [rightEyeView setCenter:faceFeature.rightEyePosition];
            rightEyeView.layer.cornerRadius = faceWidth*0.15;
            [self.view addSubview:rightEyeView];
        }

        if (faceFeature.hasMouthPosition)
        {
            // semi-transparent green circle over the mouth
            UIView* mouthView = [[UIView alloc] initWithFrame:CGRectMake(faceFeature.mouthPosition.x - faceWidth*0.2, faceFeature.mouthPosition.y - faceWidth*0.2, faceWidth*0.4, faceWidth*0.4)];
            [mouthView setBackgroundColor:[[UIColor greenColor] colorWithAlphaComponent:0.3]];
            [mouthView setCenter:faceFeature.mouthPosition];
            mouthView.layer.cornerRadius = faceWidth*0.2;
            [self.view addSubview:mouthView];
        }
    }
}



-(void)faceDetector
{
    // Load the picture for face detection
    UIImageView* image = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"testpicture.png"]];

    // Draw the face detection image
    [self.view addSubview:image];

    // Mark the faces. UIKit views must be created and added on the main
    // thread, so call this directly rather than with
    // performSelectorInBackground:
    [self markFaces:image];

    // flip the image on the y-axis to match the coordinate system used by Core Image
    [image setTransform:CGAffineTransformMakeScale(1, -1)];

    // flip the entire view to make everything right side up again
    [self.view setTransform:CGAffineTransformMakeScale(1, -1)];
}
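
Instead of flipping the image view and the whole view hierarchy, a common alternative is to convert each Core Image rectangle into UIKit coordinates before creating the marker views. A minimal sketch, assuming the image view sits at the origin and displays the image at its native size:

```objc
// Core Image uses a bottom-left origin; UIKit uses a top-left origin.
// Build a transform that flips the y-axis over the image's height, then
// apply it to the detected face rectangle before creating the marker view.
CGAffineTransform transform = CGAffineTransformMakeScale(1, -1);
transform = CGAffineTransformTranslate(transform, 0, -facePicture.image.size.height);
CGRect faceRectInView = CGRectApplyAffineTransform(faceFeature.bounds, transform);

UIView* faceView = [[UIView alloc] initWithFrame:faceRectInView];
```

With this approach the rest of the UI stays untouched, and only the detection results are remapped.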
malcolm

1 Answer


From Documentation:

Core Image can analyze and find human faces in an image. It performs face detection, not recognition. Face detection is the identification of rectangles that contain human face features, whereas face recognition is the identification of specific human faces (John, Mary, and so on). After Core Image detects a face, it can provide information about face features, such as eye and mouth positions. It can also track the position of an identified face in a video.

Unfortunately, Apple does not yet provide an API for face recognition. You might look into third-party libraries.
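
To give an idea of the shape such an integration usually takes, here is a minimal Objective-C sketch. The `FaceRecognizer` class, the `Person` model, and the `cropImage:toRect:` helper are all hypothetical placeholders for whatever third-party library and glue code you end up using (OpenCV's LBPH face recognizer, for example, is often wrapped in Objective-C++ this way); none of this is an Apple API.

```objc
// Hypothetical wrapper around a third-party recognition library.
// The class and method names below are placeholders, not a real API.
FaceRecognizer* recognizer = [[FaceRecognizer alloc] init];

// Enroll each known person using photos from the app's album
for (Person* person in knownPeople) {
    [recognizer enrollImage:person.photo withLabel:person.name];
}

// After Core Image detects a face, crop it out of the source image
// and ask the recognizer for the closest enrolled match
UIImage* croppedFace = [self cropImage:facePicture.image toRect:faceFeature.bounds];
NSString* match = [recognizer identifyFace:croppedFace];
NSLog(@"Best match: %@", match);
```

The general pattern is the same regardless of library: Core Image (or any detector) finds the face rectangle, you crop that region, and the third-party recognizer compares the crop against previously enrolled images.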

Teja Nandamuri