I'm currently trying to implement face tracking using optical flow with OpenCV.
To achieve this, I detect faces with the OpenCV face detector, I determine features to track inside the detected areas by calling goodFeaturesToTrack, and I track them by calling calcOpticalFlowPyrLK.
It gives good results.
However, I'd like to know when the face I'm tracking is no longer visible (the person leaves the room, is hidden behind an object or behind another person, ...), but calcOpticalFlowPyrLK tells me nothing about it. Its status parameter rarely reports errors for a tracked feature, so even if the person disappears I still get a good number of "valid" features to track.
I've tried computing a displacement vector for each feature between the previous and the current frame (for example, determining that some point of the face has moved to the left between the two frames) and then computing the variance of these vectors (if the vectors mostly differ, the variance is high; otherwise it is low), but it did not give the expected results (good in some situations, bad in others).
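For reference, the variance test I describe can be written compactly like this (the point arrays are illustrative, not real tracker output):

```python
import numpy as np

def flow_variance(old_pts, new_pts):
    """Total variance of the per-feature displacement vectors
    between two frames: sum of the variances of dx and dy.
    Low when the motion is coherent, high when features scatter."""
    d = new_pts.reshape(-1, 2) - old_pts.reshape(-1, 2)
    return d.var(axis=0).sum()

old = np.array([[10, 10], [20, 10], [10, 20], [20, 20]], dtype=np.float32)
# Coherent motion: every feature moved by (3, 0) -> variance is 0.
coherent = old + np.array([3, 0], dtype=np.float32)
# Scattered motion: each feature moved differently -> variance is large.
scattered = old + np.array([[3, 0], [-5, 2], [0, 9], [7, -4]], dtype=np.float32)

print(flow_variance(old, coherent))   # 0.0
print(flow_variance(old, scattered))  # large
```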
What would be a good condition for deciding when the optical flow tracking should be stopped?
I've thought of some possible solutions, such as:
- Variance of the displacement distances of the tracked features (if the motion is rigid, the distances should be nearly the same, but if something happened, the distances will differ).
- Comparing the shape and size of the area containing the original positions of the tracked features with the area containing the current ones. At the beginning we have a rectangle around the features of the face, but if the person leaves the room, the shape of that area can become deformed.