I want to add a Snapchat-like effect to live video based on tracking the user's face in real time. My design places streams of particles coming from the eyebrows, eyes, or lips. I already have a flexible effects library that can place the desired streams at any chosen points on the screen and update them in real time.
Apple provides a Swift demo project, which I downloaded from this link: https://developer.apple.com/documentation/vision/tracking_the_user_s_face_in_real_time
If you download and run that project without any changes, it shows an overlay of face landmarks (left and right eyebrows, eyes, nose, lips, etc.) that tracks a person's face in real time.
There isn't much documentation on the coordinate system, layer drawing, etc., that would let me extract CGPoint values corresponding to face landmarks, such as the points on the left eyebrow.
I made some progress in analyzing the drawing code used in the Apple demo but have had only limited success in getting the desired coordinates.
The left eyebrow appears to consist of an array of 6 points on a path connected by lines. I would just like to get a CGPoint that indicates the current location for one of the points on the left eyebrow.
Apple provides a routine called addPoints, which is called for each face landmark, both open and closed. Since the eyebrow is not a closed path, it is handled in the openLandmarkRegions group. The mouth and eyes are handled in a slightly different group, closedLandmarkRegions, since they are closed paths where the start point and end point are the same.
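For reference, the demo groups the VNFaceLandmarks2D regions into those two collections roughly like this (reconstructed from memory of the sample, so the exact array contents may differ slightly):

```swift
import Vision

// Sketch of how the demo partitions the landmark regions, given
// `landmarks: VNFaceLandmarks2D` from a VNFaceObservation.
let openLandmarkRegions: [VNFaceLandmarkRegion2D?] = [
    landmarks.leftEyebrow,   // open path: start and end points differ
    landmarks.rightEyebrow,
    landmarks.faceContour,
    landmarks.noseCrest,
    landmarks.medianLine
]
let closedLandmarkRegions: [VNFaceLandmarkRegion2D?] = [
    landmarks.leftEye,       // closed path: the path loops back to its start
    landmarks.rightEye,
    landmarks.outerLips,
    landmarks.innerLips,
    landmarks.nose
]
```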
fileprivate func addPoints(in landmarkRegion: VNFaceLandmarkRegion2D, to path: CGMutablePath, applying affineTransform: CGAffineTransform, closingWhenComplete closePath: Bool)
It really doesn’t matter if the path is open or closed. All I care about is getting a valid CGPoint on any of the landmarks. Eventually I will have some effects for the eyes and mouth as well, as soon as I figure out how to get a valid CGPoint for just one of the face landmarks.
This is what I tried. I declared some global variables and added some logic inside Apple's drawing code to help pick out CGPoints on the left eyebrow.
var sampleLeftEyebrowPoint = false
var mostRecentLeftEyebrowPoint = CGPoint()
Since addPoints is called inside for-loops over all the landmarks, I had to pick out the loop iteration that corresponded to the left eyebrow.
In addPoints, Apple has this line of code where they read the points on any given landmark:
let points: [CGPoint] = landmarkRegion.normalizedPoints
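Worth noting: normalizedPoints are normalized to the face's bounding box, which is why the demo then applies an affine transform built from the face bounds. VNFaceLandmarkRegion2D also offers pointsInImage(imageSize:), which maps the points into image coordinates directly. A hypothetical sketch (function name and index guard are mine, not from the demo):

```swift
import Vision

// Grab the 2nd left-eyebrow point, already mapped into image coordinates.
// pointsInImage(imageSize:) is the Vision API; imageSize would be the
// capture device resolution the demo draws against.
func secondLeftEyebrowImagePoint(from observation: VNFaceObservation,
                                 imageSize: CGSize) -> CGPoint? {
    guard let eyebrow = observation.landmarks?.leftEyebrow,
          eyebrow.pointCount > 1 else { return nil }
    return eyebrow.pointsInImage(imageSize: imageSize)[1]
}
```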
I added this code snippet just after that line of code:
if sampleLeftEyebrowPoint {
    mostRecentLeftEyebrowPoint = points[1]
    mostRecentLeftEyebrowPoint = mostRecentLeftEyebrowPoint.applying(affineTransform)
    sampleLeftEyebrowPoint = false
}
Note that points[1] is the 2nd point on the eyebrow, which is one of the middle points.
Note that I apply the same affine transform to the single point that Apple applies in their logic.
I set sampleLeftEyebrowPoint to true in this Apple routine, with some logic that determines whether the left eyebrow is currently being looped over:
fileprivate func addIndicators(to faceRectanglePath: CGMutablePath, faceLandmarksPath: CGMutablePath, for faceObservation: VNFaceObservation)
In that routine Apple has a for-loop over the open landmarks, shown below. I added logic to set sampleLeftEyebrowPoint so that addPoints will recognize that the left eyebrow is currently being processed and can sample a point.
for openLandmarkRegion in openLandmarkRegions where openLandmarkRegion != nil {
    if openLandmarkRegion == landmarks.leftEyebrow {
        sampleLeftEyebrowPoint = true
    }
    self.addPoints(in: openLandmarkRegion!, to: faceLandmarksPath, applying: affineTransform, closingWhenComplete: false)
}
The mostRecentLeftEyebrowPoint that I obtain correlates only partially with my desired CGPoint. The X coordinate seems to track but needs some scaling, while the Y coordinate seems inverted, with maybe something else going on.
Can anyone provide a routine that will get me the desired CGPoint corresponding to mostRecentLeftEyebrowPoint?
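My best guess at what is going on, sketched as a helper (names like viewSize are mine and this is untested against the demo): Vision's image coordinates have their origin at the bottom-left, while UIKit layers use a top-left origin, which would explain the inverted Y; the X just needs the image-to-view scale.

```swift
import UIKit

// Convert a point in Vision image coordinates (bottom-left origin, sized to
// the capture device resolution) into UIKit view coordinates (top-left
// origin, sized to the on-screen view). `imageSize` and `viewSize` are my
// own parameter names, not from the Apple demo.
func viewPoint(fromImagePoint p: CGPoint,
               imageSize: CGSize,
               viewSize: CGSize) -> CGPoint {
    CGPoint(x: p.x * viewSize.width / imageSize.width,
            y: (imageSize.height - p.y) * viewSize.height / imageSize.height)
}
```

Two caveats I am not sure about: the front-camera preview is usually mirrored, so X may also need flipping, and if the preview layer uses an aspect-fill video gravity, the crop introduces an additional offset this simple scale does not account for.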
Once I have that, I have already figured out how to hide the face landmark overlay, so that only my effect will be visible while tracking the left eyebrow in real time. To hide the face detection lines, just comment out Apple's call to:
// self.updateLayerGeometry()