
Here is my code. When I run it, the ridge works fine, but for the lasso I get this warning:

ConvergenceWarning: Objective did not converge. You might want to increase the number of iterations.

Please help.

from sklearn.linear_model import LinearRegression, Lasso, Ridge, RidgeCV, LassoCV
from sklearn.model_selection import cross_val_score
import numpy as np
import sys

dataset = np.loadtxt(sys.argv[1], delimiter = ',')
X = dataset[:,:10]
y = dataset[:,10]

ridge_cv = RidgeCV(alphas=[1e-3, 1e-2, 1e-1, 1, 10, 100]).fit(X,y)

lasso_cv = LassoCV(alphas=[1e-3, 1e-2, 1e-1, 1, 10, 100]).fit(X,y)

lin_reg = LinearRegression()
ridge_reg = Ridge(alpha = ridge_cv.alpha_)
lasso_reg = Lasso(alpha = lasso_cv.alpha_)

print(cross_val_score(lin_reg, X, y, cv=2).mean())
print(cross_val_score(ridge_reg, X, y, cv=2).mean())
print(cross_val_score(lasso_reg, X, y, cv=2).mean())
StupidWolf
slowpoking9

1 Answer


Increase the tolerance (`tol`).

From the documentation: "The tolerance for the optimization: if the updates are smaller than `tol`, the optimization code checks the dual gap for optimality and continues until it is smaller than `tol`."

This parameter is basically what governs convergence.
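As a sketch (on synthetic data standing in for your CSV, since the original dataset isn't available), raising `tol` and, as a fallback, `max_iter` on `LassoCV` is one way to silence the warning; whether it helps depends on your data:

```python
import numpy as np
from sklearn.linear_model import LassoCV

# Synthetic stand-in for the original file: 10 features, 1 target column
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = X @ rng.normal(size=10) + rng.normal(scale=0.1, size=100)

# Loosen tol (default 1e-4) and raise max_iter (default 1000) so the
# coordinate-descent solver has room to reach the stopping criterion
lasso_cv = LassoCV(alphas=[1e-3, 1e-2, 1e-1, 1, 10, 100],
                   tol=1e-2, max_iter=10_000).fit(X, y)
print(lasso_cv.alpha_)
```

Note that a very loose `tol` trades away solution accuracy; standardizing the features first (e.g. with `StandardScaler`) often fixes the convergence problem at the default tolerance.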

Noah Weber
  • Please don't post a comment as an answer unless you test and see that your suggestion, with specific code, answers the question. Your post is merely an untested suggestion at the moment. – FatihAkici Dec 18 '19 at 20:02
  • Fair point, sorry. I am 80% sure it's correct. How do you expect me to replicate it without the dataset? That's what convergence depends on. – Noah Weber Dec 18 '19 at 20:07