I have a set of more than 1000 images. For every image I extract SURF descriptors. Then I add a query image and try to find the most similar image in the set. For performance and memory reasons I extract only 200 keypoints with descriptors per image. And this is more or less my problem. At the moment I filter the matches like this:
Symmetry matching: simple brute-force matching in both directions, i.e. from Image1 to Image2 and from Image2 to Image1. I keep only the matches that exist in both directions.
List<Matches> match1 = BruteForceMatching.BFMatch(act.interestPoints, query.interestPoints);
List<Matches> match2 = BruteForceMatching.BFMatch(query.interestPoints, act.interestPoints);
List<Matches> finalMatch = FeatureMatchFilter.DoSymmetryTest(match1, match2);
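In case it helps to see the idea spelled out, here is a minimal sketch of what I mean by the symmetry test. The `Match` class and its `queryIdx`/`trainIdx` fields are just stand-ins for illustration, not my actual classes:

```java
import java.util.ArrayList;
import java.util.List;

public class SymmetryTest {

    // Minimal stand-in for a match: indices into the two keypoint lists plus the descriptor distance
    static class Match {
        final int queryIdx;
        final int trainIdx;
        final float distance;

        Match(int queryIdx, int trainIdx, float distance) {
            this.queryIdx = queryIdx;
            this.trainIdx = trainIdx;
            this.distance = distance;
        }
    }

    // Keep a forward match (a -> b) only if the backward list contains the mirrored pair (b -> a)
    static List<Match> doSymmetryTest(List<Match> forward, List<Match> backward) {
        List<Match> symmetric = new ArrayList<>();
        for (Match f : forward) {
            for (Match b : backward) {
                if (f.queryIdx == b.trainIdx && f.trainIdx == b.queryIdx) {
                    symmetric.add(f);
                    break;
                }
            }
        }
        return symmetric;
    }
}
```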
float distance = 0;
for (int i = 0; i < finalMatch.size(); i++) {
    distance += finalMatch.get(i).distance;
}
act.pic.distance = distance * (float) query.interestPoints.size() / (float) finalMatch.size();
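For clarity, the weighting I use can be restated as a single helper with a guard for the case where no symmetric match survives (the helper name and the `float[]` input are just for illustration):

```java
public class MatchScore {

    // Summed descriptor distance, scaled up when only a small fraction of the
    // query's keypoints found a symmetric match; lower score = more similar.
    static float scoreImage(float[] matchDistances, int queryKeypointCount) {
        if (matchDistances.length == 0) {
            return Float.MAX_VALUE; // no overlap at all: treat as maximally dissimilar
        }
        float total = 0f;
        for (float d : matchDistances) {
            total += d;
        }
        return total * (float) queryKeypointCount / matchDistances.length;
    }
}
```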
I know there are more filter methods. As you can see, I try to weight the distances by the number of final matches, but I don't have the feeling I am doing this correctly. When I look at other approaches, it seems they all compute with all of the extracted interest points in the image. Does anyone have a good approach for this, or a good idea for weighting the distances?
I know there is no golden solution, but some experiences, ideas, and other approaches would be really helpful.