
I have been using the scikit-learn RandomForestClassifier for a project. I am wondering if there is a tool or trick to grow a "full" forest and then experiment with certain hyperparameters ex post. For example, is there a way to quickly chop every fully grown tree to a depth of, say, 10, and check the resulting performance on a test set? I imagine this could be done in a computationally feasible way for any hyperparameter that limits tree growth (e.g. max_depth, min_samples_leaf, min_samples_split); a rough sketch of what I mean is below.
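
To illustrate, here is roughly what I have in mind, written by hand as a sketch (the helper name `predict_proba_truncated` and the depth cap are my own; it walks each fitted tree via `tree_.children_left` / `children_right` and simply stops descending at the cap, which I believe mimics `max_depth` without refitting):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def predict_proba_truncated(forest, X, depth_cap):
    """Average class probabilities over trees, descending at most depth_cap levels."""
    X = np.asarray(X, dtype=np.float32)
    proba = np.zeros((X.shape[0], forest.n_classes_))
    for est in forest.estimators_:
        tree = est.tree_
        for i, x in enumerate(X):
            node, depth = 0, 0
            # Descend until we hit a leaf (children_left == -1) or the depth cap.
            while tree.children_left[node] != -1 and depth < depth_cap:
                if x[tree.feature[node]] <= tree.threshold[node]:
                    node = tree.children_left[node]
                else:
                    node = tree.children_right[node]
                depth += 1
            counts = tree.value[node][0]        # per-class weights at this node
            proba[i] += counts / counts.sum()   # normalise to probabilities
    return proba / len(forest.estimators_)

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Evaluate the same fully grown forest at several effective depths.
for d in (5, 10, rf.estimators_[0].tree_.max_depth):
    pred = rf.classes_[predict_proba_truncated(rf, X_te, d).argmax(axis=1)]
    print(f"depth cap {d}: accuracy {np.mean(pred == y_te):.3f}")
```

This is obviously slow in pure Python; I am hoping there is a built-in or vectorised way to do the same thing.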

I currently use GridSearchCV to find the best configuration of max_depth, max_features, and max_samples, but with three hyperparameters the search takes a long time, since every candidate refits the whole forest from scratch.
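
For reference, my current search looks roughly like this (the grid values are just examples, not my exact settings):

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

param_grid = {
    "max_depth": [5, 10, 20, None],
    "max_features": ["sqrt", 0.3, 0.6],
    "max_samples": [0.5, 0.75, None],
}
search = GridSearchCV(
    RandomForestClassifier(n_estimators=200, random_state=0),
    param_grid,
    cv=5,
    n_jobs=-1,
)
# search.fit(X_train, y_train)  # each candidate grows a brand-new forest, hence the cost
```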

DavidSilverberg
  • Not sure if this feature is currently available, though there appears to be interest in it. https://stackoverflow.com/questions/49419703/scikit-learn-post-pruning-in-randomforestclassifier – Capybara Nov 20 '22 at 02:45

0 Answers