I built a classifier using KNN learners in an ensemble based on the random subspace method.
I have three predictors, each with 541 samples, and I set up an optimization procedure to find the best k (number of neighbours). I chose the k that maximizes the AUC of the classifier, with performance estimated by 10-fold cross-validation. The optimization returned k = 269 for every single weak learner (there are 60 learners, a number chosen by a similar optimization).
Now, my question is: are 269 neighbours too many? I trust the results of the optimization, but I have never used so many neighbours and I am worried about overfitting.
Thank you in advance, MP