
I'm using scikit-learn's LassoCV estimator. During cross-validation, what scoring metric is used by default?

I would like cross-validation to be based on "Mean squared error regression loss". Can this metric be used with LassoCV? A scoring metric can be specified for LogisticRegressionCV, so perhaps the same is possible with LassoCV?

Oliver Angelil
  • Not possible in the current implementation. You can file this as an issue on the scikit-learn GitHub page and see what the response is. – Vivek Kumar May 22 '17 at 05:48
  • Do you know what the current scoring metric is? – Oliver Angelil May 22 '17 at 07:31
  • R2 is the default metric for most regression estimators. See the description of [score() for LassoCV](http://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LassoCV.html#sklearn.linear_model.LassoCV.score) – Vivek Kumar May 22 '17 at 07:39

2 Answers


LassoCV uses R^2 as the scoring metric. From the docs:

By default, parameter search uses the score function of the estimator to evaluate a parameter setting. These are the sklearn.metrics.accuracy_score for classification and sklearn.metrics.r2_score for regression.
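
As a quick check (a minimal sketch; the synthetic dataset and CV settings below are illustrative, not from the question), the estimator's `score()` method returns the same value as `sklearn.metrics.r2_score`:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV
from sklearn.metrics import r2_score

X, y = make_regression(n_samples=200, n_features=10, noise=5.0, random_state=0)
model = LassoCV(cv=5).fit(X, y)

# LassoCV.score() is the coefficient of determination R^2,
# identical to r2_score computed on the predictions.
print(model.score(X, y))
print(r2_score(y, model.predict(X)))
```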

To use an alternative scoring metric, such as mean squared error, you need to use GridSearchCV or RandomizedSearchCV (instead of LassoCV) and specify the scoring parameter as scoring='neg_mean_squared_error'. From the docs:

An alternative scoring function can be specified via the scoring parameter to GridSearchCV, RandomizedSearchCV and many of the specialized cross-validation tools described below.
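
For example (a sketch only; the data and alpha grid are placeholders), you can cross-validate a plain Lasso over a grid of alpha values and score each candidate by mean squared error:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=200, n_features=10, noise=5.0, random_state=0)

# Grid-search alpha, scoring each fit by (negated) mean squared error.
param_grid = {"alpha": np.logspace(-3, 1, 20)}
search = GridSearchCV(Lasso(max_iter=10000), param_grid,
                      scoring="neg_mean_squared_error", cv=5)
search.fit(X, y)

print(search.best_params_)   # alpha with the lowest cross-validated MSE
print(-search.best_score_)   # that MSE, with the sign flipped back
```

Note that scikit-learn negates error metrics so that higher scores are always better, hence 'neg_mean_squared_error'.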

DontDivideByZero

I think the accepted answer is wrong, as it quotes the grid-search documentation, but LassoCV uses regularisation paths, not grid search. In fact, the docs page for LassoCV says that the objective function is:

(1 / (2 * n_samples)) * ||y - Xw||^2_2 + alpha * ||w||_1

Meaning that it's minimising the MSE (plus the LASSO penalty term).
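
You can check this directly (a small sketch with synthetic data): a fitted LassoCV exposes mse_path_, the held-out mean squared error for every candidate alpha on every fold, and the selected alpha_ is the grid point minimising its average across folds:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV

X, y = make_regression(n_samples=200, n_features=10, noise=5.0, random_state=0)
model = LassoCV(cv=5).fit(X, y)

# mse_path_ has shape (n_alphas, n_folds): test-fold MSE per candidate alpha.
mean_mse = model.mse_path_.mean(axis=1)

# The chosen alpha_ is the grid point with the smallest mean CV MSE.
print(model.alpha_)
print(model.alphas_[np.argmin(mean_mse)])
```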

Alberto Santini
  • I don't think it's wrong. I think you two are answering different questions, with the accepted answer answering the OP's question, and you answering a different question. The OP asked for the "scoring metric" DURING CROSS VALIDATION. That's not equivalent to asking what objective function is solved, which is what you're answering, I think? – 24n8 Apr 24 '21 at 20:05