
Until recently I was using GridSearchCV to get the standard deviation of the cross-validation scores, as pointed out here.

So basically I was doing this:

grid_search.cv_results_['std_test_score'][grid_search.best_index_]

But now I get a KeyError telling me that 'std_test_score' is not a key.

This is how I call GridSearchCV:

from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.metrics import (accuracy_score, balanced_accuracy_score,
                             make_scorer, matthews_corrcoef)

splitter = StratifiedKFold(n_splits=5, shuffle=True, random_state=11)

scoring_functions = {'mcc': make_scorer(matthews_corrcoef),
                     'accuracy': make_scorer(accuracy_score),
                     'balanced_accuracy': make_scorer(balanced_accuracy_score)}

grid_search = GridSearchCV(pipeline, param_grid=grid, scoring=scoring_functions,
                           n_jobs=-1, cv=splitter, refit='mcc')
Atirag

1 Answer


From the documentation, the cv_results_ attribute description:

For multi-metric evaluation, the scores for all the scorers are available in the cv_results_ dict at the keys ending with that scorer's name ('_<scorer_name>') instead of '_score' shown above. ('split0_test_precision', 'mean_train_precision' etc.)

You can get there pretty easily on your own, with grid_search.cv_results_.keys() to see what's available. It ought to be e.g. grid_search.cv_results_['std_test_mcc'].
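A minimal sketch of that inspection, assuming grid_search has already been fitted on some X and y (both hypothetical here); the scorer names 'mcc', 'accuracy', and 'balanced_accuracy' come from the scoring_functions dict in the question:

# List the result keys; with multi-metric scoring each scorer gets its own
# suffix, e.g. 'std_test_mcc', 'std_test_accuracy', 'std_test_balanced_accuracy',
# and there is no generic 'std_test_score'
print(sorted(grid_search.cv_results_.keys()))

# best_index_ points at the parameter setting chosen by the refit metric ('mcc')
std_mcc = grid_search.cv_results_['std_test_mcc'][grid_search.best_index_]
print(f"std of MCC across folds for the best params: {std_mcc:.4f}")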

Ben Reiniger
  • Yeah I just finished looking into the keys and came here to write the solution but you beat me to it. Thx! – Atirag Feb 08 '21 at 20:53