I have a stationary video of me pedaling a bike, and my goal is to track the positions of my joints and then compute some angles, etc. The joints are marked with colored tape. Based on the color of the tape, I threshold the frames, detect contours, and compute the centroids of those contours. These are my detections. This approach was suggested to me here: Methods to track marked points in a stationary video?, where I first asked about this topic. It works well, but I'm struggling with tracking those detections across the video.
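For reference, the detection step is roughly the following (a simplified OpenCV.js sketch, not the exact code from index.html; `detectMarkers`, `lowHsv`, and `highHsv` are placeholder names for whatever the thresholding adapts to when a point is clicked):

```js
// Sketch of the detection step (not the exact code from index.html):
// threshold on the tape color in HSV, find contours, and return one centroid
// per contour. `frame` is assumed to be an RGBA cv.Mat read from the canvas;
// `lowHsv`/`highHsv` are 3-element arrays, e.g. derived from the clicked pixels.
function detectMarkers(frame, lowHsv, highHsv) {
    const rgb = new cv.Mat(), hsv = new cv.Mat(), mask = new cv.Mat();
    cv.cvtColor(frame, rgb, cv.COLOR_RGBA2RGB);
    cv.cvtColor(rgb, hsv, cv.COLOR_RGB2HSV);

    // OpenCV.js inRange takes Mats for the bounds, not scalars
    const low = new cv.Mat(hsv.rows, hsv.cols, hsv.type(), [...lowHsv, 0]);
    const high = new cv.Mat(hsv.rows, hsv.cols, hsv.type(), [...highHsv, 0]);
    cv.inRange(hsv, low, high, mask);

    const contours = new cv.MatVector();
    const hierarchy = new cv.Mat();
    cv.findContours(mask, contours, hierarchy, cv.RETR_EXTERNAL, cv.CHAIN_APPROX_SIMPLE);

    const centroids = [];
    for (let i = 0; i < contours.size(); i++) {
        const cnt = contours.get(i);
        const m = cv.moments(cnt, false);
        if (m.m00 > 0) {
            centroids.push({ x: m.m10 / m.m00, y: m.m01 / m.m00 });
        }
        cnt.delete();
    }

    [rgb, hsv, mask, low, high, hierarchy, contours].forEach(m => m.delete());
    return centroids;
}
```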
I tried computing the distance from every tracked point to all detections in the next frame and then greedily assigning each tracked point to its nearest detection. However, this does not work well when multiple points are close together (on the foot), because nearby points "switch" between frames (the ankle point becomes the toe point, etc.).
Here is a video showing the switching: https://youtu.be/g0kED6EBg54
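The matching step I'm describing is essentially this (a simplified sketch with placeholder names, not my exact code):

```js
// Greedy matching sketch: each tracked point grabs its nearest remaining
// detection. When two tracked points are close together (ankle/toe), whichever
// is processed first can "steal" the other's detection, causing the switching.
function matchGreedy(trackedPoints, detections) {
    const free = detections.slice();
    return trackedPoints.map(point => {
        if (free.length === 0) return null;
        let bestIdx = 0;
        let bestDist = Infinity;
        free.forEach((det, i) => {
            const d = Math.hypot(det.x - point.x, det.y - point.y);
            if (d < bestDist) { bestDist = d; bestIdx = i; }
        });
        return free.splice(bestIdx, 1)[0]; // assign and remove the detection
    });
}
```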
I tried to use this implementation of a Kalman filter to better predict the next location of each tracked point, but I don't really understand how it works, so I'm not sure how to tune it. These are the settings I'm using:
// new Kalman filter
kalman_filters.push(new KalmanFilter({
    observation: 2,
    dynamic: {
        name: 'constant-speed', // works better than constant-position and constant-acceleration, not sure why
        timeStep: 5, // works better than the default of 1, not sure why
    }
}));
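As far as I understand the library's README, it is driven by a predict/correct cycle every frame, roughly like the sketch below (simplified to a single tracked point; names like `updateTrackedPoint` and `previousCorrected` are placeholders, not my actual code):

```js
// Sketch of the per-frame predict/correct cycle for one tracked point,
// based on the kalman-filter package's README (not my exact code).
const kf = new KalmanFilter({
    observation: 2,
    dynamic: { name: 'constant-speed', timeStep: 5 }
});

let previousCorrected = null; // corrected state carried over from the last frame

function updateTrackedPoint(matchedDetection) {
    // 1. Predict where the point should be in this frame
    //    (this is what gets drawn as the hollow circle).
    const predicted = kf.predict({ previousCorrected });

    // 2. Correct the prediction with the matched detection [x, y]
    //    (this becomes the filled circle).
    previousCorrected = kf.correct({ predicted, observation: matchedDetection });

    // For the 2-D constant-speed model the state mean is [x, y, vx, vy],
    // stored as column vectors, if I read the docs correctly.
    return { x: previousCorrected.mean[0][0], y: previousCorrected.mean[1][0] };
}
```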
How can I track the points more robustly, so that points which are close together don't switch with each other?
The whole codebase is on GitHub (only index.html is relevant) and is deployed here: https://vlasakjiri.github.io/opencv.js/
How to use: the top canvas shows the current frame of the video. Small green circles are the detected contours, filled colorful circles are the matched points, and hollow colorful circles are the predictions from the Kalman filter. The bottom canvas shows the thresholding result.

1. Click on some points marked with the red tape. The thresholding will adjust, and those points will then be tracked.
2. Play the video.