In KNN (K nearest neighbour) classifiers, if an even value of K is chosen, what would the prediction be under the majority voting rule or under a Euclidean distance rule? For example, if there are 3 classes, say
- Iris-setosa
- Iris-versicolor
- Iris-virginica
And now say we have n_neighbors = 6. There is a fair chance of getting a tie under the majority voting rule. In most visualisations this region is shown in white, indicating that no decision could be reached. But what would the actual prediction be in the case of a tie? This problem is fairly conceptual and hard to emulate directly.
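
Below is a minimal sketch (my own construction, not taken from any documentation) that deliberately forces a 3-vs-3 tie with n_neighbors = 6 so the behaviour can be observed; the toy coordinates and labels are assumptions chosen only so that every neighbour of the query point is (almost) equidistant from it:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Six training points, three per class, all at (nearly) the same distance
# from the query point at the origin, so the 6 votes split exactly 3-3.
X = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0],
              [0.0, -1.0], [0.7071, 0.7071], [-0.7071, -0.7071]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = KNeighborsClassifier(n_neighbors=6, weights="uniform")
clf.fit(X, y)

query = np.array([[0.0, 0.0]])
print(clf.predict(query))        # a single label is still returned
print(clf.predict_proba(query))  # the vote itself is split: [0.5, 0.5]
```

Running something like this shows that predict() still returns a single label for the tied query rather than refusing to decide, which is exactly the case I am asking about.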
Also, does an odd value of n_neighbors solve/reduce this issue? Do you think that, instead of simple majority voting, Euclidean/Manhattan distance weighting would handle this better? The sklearn docs, however, do not mention this at all.
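
For the second question, the comparison I have in mind looks roughly like the sketch below; KNeighborsClassifier and its weights/metric parameters are standard scikit-learn, but the choice of the Iris dataset, cv=5 and n_neighbors=6 are just assumptions for illustration:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# weights="uniform" is plain majority voting; weights="distance" weights each
# neighbour's vote by the inverse of its distance, so exact ties become rare.
for weights in ("uniform", "distance"):
    clf = KNeighborsClassifier(n_neighbors=6, weights=weights)
    scores = cross_val_score(clf, X, y, cv=5)
    print(weights, round(scores.mean(), 3))
```

Here weights="distance" seems to be the built-in way to let Euclidean (or Manhattan, via the metric parameter) distances influence the vote instead of a plain count. Is that the intended way to handle ties, or is there a documented tie-breaking rule I have missed?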