I am basically extracting many keypoints with SURF from similar images and adding their descriptors to a BFMatcher(NORM_L2).
At runtime it can happen that I add new descriptors to the matcher with matcher->add(myNewDescriptors);
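For context, this is roughly how the descriptors are extracted and added (a simplified sketch; the hessian threshold and the helper name addTrainImage are just illustrative, not my actual code):

#include <opencv2/core/core.hpp>
#include <opencv2/features2d/features2d.hpp>
#include <opencv2/nonfree/features2d.hpp>   // SURF lives in nonfree in 2.4.x

cv::SURF surf(400.0);   // hessian threshold, example value
cv::Ptr<cv::DescriptorMatcher> matcher(new cv::BFMatcher(cv::NORM_L2));

void addTrainImage(const cv::Mat& img)
{
    std::vector<cv::KeyPoint> keypoints;
    cv::Mat descriptors;
    surf(img, cv::Mat(), keypoints, descriptors);          // detect keypoints + compute descriptors

    matcher->add(std::vector<cv::Mat>(1, descriptors));    // the train set grows at runtime
}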
Now, once I have added an image with only 1 keypoint/descriptor and I use knnMatch, it returns no matches:
matcher->knnMatch(queryDesc, matches, 2);
After that call I get, for every query descriptor, a vector with 0 nearest neighbours:

for (size_t i = 0; i < matches.size(); i++) {
    cout << "matches size: " << matches[i].size() << endl;
    // PRINTS: "matches size: 0"
}
This happens only after I have inserted an image with only 1 keypoint/descriptor; before that, knnMatch works fine.
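To make the failure case concrete, here is a sketch of the sequence that triggers it (degenerateImage stands for whatever training image happens to yield a single descriptor):

cv::Mat singleDescriptor;                                  // ends up 1 x 128, CV_32F
std::vector<cv::KeyPoint> kp;
surf(degenerateImage, cv::Mat(), kp, singleDescriptor);    // kp.size() == 1
matcher->add(std::vector<cv::Mat>(1, singleDescriptor));

std::vector<std::vector<cv::DMatch> > matches;
matcher->knnMatch(queryDesc, matches, 2);                  // every matches[i] is now empty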
I have checked whether matcher->getTrainDescriptors() still contains my descriptors, and it does contain everything. For example, if I do:
cout << matcher->getTrainDescriptors().at(0).size(); // size of the descriptor Mat of the first training image
I get [128 x 32], i.e. 32 descriptors of 128 dimensions. So the descriptors are there, but knnMatch still returns empty vectors.
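For completeness, running the same check over every training image (sketch) shows that all descriptor Mats are present:

const std::vector<cv::Mat>& train = matcher->getTrainDescriptors();
for (size_t i = 0; i < train.size(); i++) {
    cout << "train image " << i << ": " << train[i].size() << endl;   // e.g. [128 x 32]
}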
Also note that if I replace knnMatch with a simple match, the matcher returns all DMatches normally! The code fails only with knnMatch.
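In other words, with exactly the same matcher state (names as above):

std::vector<cv::DMatch> simpleMatches;
matcher->match(queryDesc, simpleMatches);                  // works: one DMatch per query descriptor
cout << "match() returned " << simpleMatches.size() << " DMatches" << endl;

std::vector<std::vector<cv::DMatch> > knnMatches;
matcher->knnMatch(queryDesc, knnMatches, 2);               // fails: only empty inner vectors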
- OpenCV: 2.4.5
- Training image:
- Query image: the same image scaled down by a factor of 0.4