
I am trying to do the following:

  1. Start front facing camera
  2. Detect a person’s face
  3. Place 3D/image overlay over their face (a hat, for example)

Using the example project from Apple (here), the first two steps work fine. However, I now want to complete the third step: add a UIImageView as a subview and have it move in sync with the face, so that the hat (for example) stays in place on top of the user's head while they move the camera.
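For context, the detection in steps 1–2 boils down to running a `VNDetectFaceLandmarksRequest` on each captured frame, roughly like this (the `pixelBuffer` parameter and the `.leftMirrored` orientation are assumptions about the capture pipeline in the sample, which uses the front camera in portrait):

```swift
import Vision

// Rough sketch of the face detection driving steps 1–2.
// `pixelBuffer` comes from the AVCaptureVideoDataOutput delegate.
func detectFace(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectFaceLandmarksRequest { request, error in
        guard let results = request.results as? [VNFaceObservation] else { return }
        for face in results {
            // face.boundingBox is normalized to [0, 1] with a
            // bottom-left origin; landmarks are normalized within it.
            print(face.boundingBox)
        }
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                        orientation: .leftMirrored,
                                        options: [:])
    try? handler.perform([request])
}
```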

I am able to add the image and have it move with the face, somewhat, by editing this code from the example project:

fileprivate func addIndicators(to faceRectanglePath: CGMutablePath, faceLandmarksPath: CGMutablePath, for faceObservation: VNFaceObservation) {
    let displaySize = self.captureDeviceResolution

    let faceBounds = VNImageRectForNormalizedRect(faceObservation.boundingBox, Int(displaySize.width), Int(displaySize.height))
    faceRectanglePath.addRect(faceBounds)

    if let landmarks = faceObservation.landmarks {
        // Landmarks are relative to -- and normalized within -- face bounds
        let affineTransform = CGAffineTransform(translationX: faceBounds.origin.x, y: faceBounds.origin.y)
            .scaledBy(x: faceBounds.size.width, y: faceBounds.size.height)

        // Treat eyebrows and lines as open-ended regions when drawing paths.
        let openLandmarkRegions: [VNFaceLandmarkRegion2D] = [
            landmarks.leftEyebrow,
            landmarks.rightEyebrow,
            landmarks.faceContour,
            landmarks.noseCrest,
            landmarks.medianLine
        ].compactMap { $0 }
        for openLandmarkRegion in openLandmarkRegions {
            self.addPoints(in: openLandmarkRegion, to: faceLandmarksPath, applying: affineTransform, closingWhenComplete: false)
        }

        // Draw eyes, lips, and nose as closed regions.
        let closedLandmarkRegions: [VNFaceLandmarkRegion2D] = [
            landmarks.leftEye,
            landmarks.rightEye,
            landmarks.outerLips,
            landmarks.innerLips,
            landmarks.nose
        ].compactMap { $0 }
        for closedLandmarkRegion in closedLandmarkRegions {
            self.addPoints(in: closedLandmarkRegion, to: faceLandmarksPath, applying: affineTransform, closingWhenComplete: true)
        }

        // Place the overlay image at the detected face origin.
        // Note: faceBounds is derived from captureDeviceResolution (pixel
        // coordinates), while previewView lays out in view coordinates.
        self.imageView.frame = CGRect(x: faceBounds.origin.x, y: faceBounds.origin.y, width: 300, height: 165)
        self.previewView?.addSubview(self.imageView)

    }
}

But I am not sure what x and y values to use for imageView.frame, because with faceBounds.origin the result is not very consistent.
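From what I've read, the inconsistency may be because faceBounds is in capture-resolution pixel coordinates while previewView works in layer points, so a conversion through the preview layer would be needed. A minimal sketch of what I think that conversion looks like (`previewLayer` is an assumption about the AVCaptureVideoPreviewLayer in the sample, and the axis flip may need adjusting for device orientation):

```swift
import AVFoundation

// Hypothetical helper: convert a Vision bounding box (normalized,
// bottom-left origin) into the preview layer's coordinate space.
func layerRect(for boundingBox: CGRect,
               in previewLayer: AVCaptureVideoPreviewLayer) -> CGRect {
    // Flip the Y axis: Vision uses a bottom-left origin, while
    // AVFoundation metadata rects use a top-left origin.
    let metadataRect = CGRect(x: boundingBox.origin.x,
                              y: 1 - boundingBox.origin.y - boundingBox.height,
                              width: boundingBox.width,
                              height: boundingBox.height)
    return previewLayer.layerRectConverted(fromMetadataOutputRect: metadataRect)
}
```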

Here is the link to the modified sample project if you want to try it: https://files.fm/f/zqzrd6gf

Thanks in advance.

Hols
  • Did you check this? https://developer.apple.com/documentation/arkit/creating_face-based_ar_experiences – canister_exister Jan 10 '19 at 20:46
  • @canister_exister I saw it but I don't have access to any of the devices it is supported by at the moment. – Hols Jan 10 '19 at 21:00
  • I can't check the full project, but if you already have faceBounds and you can move a UIView with the face, what is the problem with placing the view on the face? – canister_exister Jan 10 '19 at 21:21
  • @canister_exister the issue is that the image view coordinates do not match the face coordinates, so it does not fall in the right place. As I wrote in my question, I think the values I am using for the x/y coordinates of the image are not correct and give an inconsistent result. If you check the full project linked you'll see. – Hols Jan 10 '19 at 21:30
  • https://stackoverflow.com/questions/45151218/vnfaceobservation-boundingbox-not-scaling-in-portrait-mode – canister_exister Jan 10 '19 at 21:33
  • @canister_exister I'll check that out. – Hols Jan 11 '19 at 13:04

0 Answers