I have 9 features and want to keep the 6 most important ones and discard the other 3. What is the best method to do this? I have seen methods that rank features by recursive feature elimination (e.g. RFECV). Can I use a random forest classifier to rank the features, select the important ones, and then use them in the random forest classifier? My question is: when using a random forest for feature selection, how can I make sure I have used the best hyperparameters? Is it valid to use an untuned random forest classifier to decide feature importance? Are there any other methods for selecting important features?
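A minimal sketch of the kind of pipeline I have in mind, assuming scikit-learn; the data here is a synthetic placeholder for my real 9-feature dataset, and the forest uses default-ish hyperparameters, which is exactly the part I am unsure about:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFECV
from sklearn.model_selection import StratifiedKFold

# Placeholder data: 9 features, binary target (stand-in for the real dataset)
X, y = make_classification(n_samples=500, n_features=9, n_informative=6,
                           random_state=0)

# Rank features by recursive feature elimination, using a random forest
# whose hyperparameters have not been tuned
rf = RandomForestClassifier(n_estimators=200, random_state=0)
selector = RFECV(estimator=rf, min_features_to_select=6,
                 cv=StratifiedKFold(5), scoring="accuracy")
selector.fit(X, y)

print("Selected features:", selector.support_)   # boolean mask of kept features
print("Feature ranking:  ", selector.ranking_)   # 1 = selected
```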
- Why do you want to discard 3 features if you have them? Just try a grid search over some hyperparameters of your random forest (sketched after these comments) and see which works best. If 3 features are useless, the random forest won't use them to classify your input, since they won't change the entropy / whatever criterion you are using to pick the feature to split on. – Alberto Sinigaglia Jan 27 '22 at 01:25
- The model overfits, so I want to reduce the number of features and check whether the accuracy improves. Moreover, I need to drop 2-3 features from the model because I am not confident about their data; therefore, I was looking for a method for ranking the features. – lsr729 Jan 27 '22 at 04:43
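A minimal sketch of the grid search suggested in the first comment, assuming scikit-learn; the data and the parameter grid values are placeholders, not recommendations. Once the search has picked the best forest, its `feature_importances_` can be used to rank the features:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Placeholder data: 9 features (stand-in for the real dataset)
X, y = make_classification(n_samples=500, n_features=9, n_informative=6,
                           random_state=0)

# Illustrative grid; adjust the values to the actual problem
param_grid = {
    "n_estimators": [100, 300],
    "max_depth": [None, 5, 10],
    "min_samples_leaf": [1, 5],
}

search = GridSearchCV(RandomForestClassifier(random_state=0),
                      param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print("Best hyperparameters:", search.best_params_)

# Rank features by importance from the tuned forest
best_rf = search.best_estimator_
ranked = sorted(enumerate(best_rf.feature_importances_),
                key=lambda t: t[1], reverse=True)
for idx, imp in ranked:
    print(f"feature {idx}: importance {imp:.3f}")
```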