I'm using a drone to monitor tulip fields weekly. Each week I capture images along a fixed flight path with my DJI Phantom 4, and stitch them with Agisoft Metashape into an orthomosaic (a large geo-referenced .tif file, ~2 GB).
I would like to compare orthomosaics from different weeks. Unfortunately they are not exactly aligned: when I overlay this week's orthomosaic on last week's, features are offset. Like so:
Two orthomosaics, same location, one week later. Detail shows orthos are not properly aligned.
As the detail shows, the images are misaligned. I need them aligned for proper inspection, growth tracking, etc. I would like to create an automatic alignment algorithm that fits week 2 onto week 1 (using translation, rotation and possibly some stretching). The difficulty is that the tulips themselves change over time, so the alignment should rely on non-changing features like the sewer, paths and crop rows. I would also like to extend the method to other crops.
What would be a suitable method for aligning the orthomosaics?
UPDATE: I tested two methods:
- Detecting keypoints, matching them, and using RANSAC to select inlier matches and estimate a homography matrix. Warping the second image with the resulting matrix gives a better fit, but not a great one.
- Optimizing the homography matrix directly by minimizing the MSE between the two images in grayscale. Results are similar: slightly better, but far from perfect.
I think the main 'culprit' here is that the images are not a perfect match to begin with, no matter the homography. Also, the keypoint method tends to pick up small details as features, whereas the larger structures (like the sewer) would make much better matchable features. I think there may be some value in smart pre-processing.
So I'm still working on this and would be thankful for any advice some of you may have!