
I'm working on an app in which I have to detect the left eye, right eye, and mouth positions. I have an imageView on my self.view, and the imageView contains a face image; now I want the coordinates of both eyes and the mouth. I have seen 2-3 sample codes for this, but they are all roughly the same: in every one, you have to invert the view so the coordinates match, which I don't want to do because my view has other controls on it. One more thing: they all use

UIImageView *imageView = [[UIImageView alloc]initWithImage:[UIImage imageNamed:@"image.png"]];

but my imageView already has a frame, and I can't init it with an image. When I do so, the faceFeature's eye and mouth coordinates come out wrong.

I started my code from this sample code, but in it, too, the view's Y coordinate gets inverted.

Can anyone help me detect the eye and mouth positions on a UIImageView's image without inverting my self.view?

Please let me know if my question is not clear enough.

TheTiger

2 Answers


The trick here is to transform the points and bounds returned by CIDetector into your coordinate system, instead of flipping your own view. CIImage has its origin at the bottom left, which you need to transform to the top left:

// Height of the image in the detector's coordinate space.
// (This is the video-capture case; for a still image, use
// yourCIImage.extent.size.height instead of a pixel buffer.)
CGFloat height = CVPixelBufferGetHeight(pixelBuffer);

// Flip the Y axis: mirror vertically, then shift back down by the
// image height so the origin lands at the top left.
CGAffineTransform transform = CGAffineTransformMakeScale(1, -1);
transform = CGAffineTransformTranslate(transform, 0, -height);

/* Do your face detection */

CGRect faceRect = CGRectApplyAffineTransform(feature.bounds, transform);
CGPoint mouthPoint = CGPointApplyAffineTransform(feature.mouthPosition, transform);
// Same for leftEyePosition / rightEyePosition, etc.

For your second question, about the UIImageView: after you have initialized the image view with a frame, just assign the image:

imageView.image = yourImage;

Jack

Worked it out! I edited the class to have a faceContainer which contains all of the face objects (the mouth and eyes); then only this container is rotated, and that's all. Obviously this is quite crude, but it works. Here is a link, http://www.jonathanlking.com/download/AppDelegate.m. Replace the app delegate from the sample code with it.

-- OLD POST --

Have a look at this Apple documentation, and slide 42 onwards from this Apple talk. You should probably also watch the talk, as it includes a demo of what you are trying to achieve; it's called "Using Core Image on iOS & Mac OS X" and is here.

Jonathan King
  • Thanks for your reply, but what are you trying to explain? I know how to detect faceFeature eye and mouth positions with a CGImage; my question is different: I have an issue with the UIImageView's UIImage. – TheTiger Jun 22 '12 at 11:29
  • Sorry, completely misunderstood your question! – Jonathan King Jun 22 '12 at 11:49
  • @VakulSaini did you find my new answer helpful? And if so, can you mark it as the correct answer. – Jonathan King Jun 23 '12 at 08:42
  • Thank you @jonathanlking, you wrote code for me and I found it good... you made a new container view, and instead of inverting the whole view you inverted the container view... but one problem remains: as I said in my question, my imageView has its own frame and I don't want to init it with an image as you do in your code... but thanks a lot for your help. – TheTiger Jun 23 '12 at 10:45