
When selecting features in machine learning, one can use Lasso Regression to identify the least important features by looking for the smallest coefficients. But it seems we can do the same thing with plain Linear Regression.

Linear Regression:

Y = x0 + x1*b1 + x2*b2 + ... + xn*bn

Here x1, x2, ..., xn are the coefficients and b1, b2, ..., bn are the features. Using gradient descent we get the best coefficients, so we could simply remove the features with the smallest coefficients. If this is possible with Linear Regression, why should one use Lasso Regression? Am I missing something? Please help.
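
To make the comparison concrete, here is a minimal sketch of what I mean (assuming scikit-learn; the synthetic dataset and alpha value are just for illustration):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Lasso
from sklearn.preprocessing import StandardScaler

# Toy data: 10 features, only 3 of which actually drive the target.
X, y = make_regression(n_samples=200, n_features=10, n_informative=3,
                       noise=5.0, random_state=0)
X = StandardScaler().fit_transform(X)  # scale features so coefficients are comparable

# Plain Linear Regression: inspect the coefficients and drop the smallest ones.
lin = LinearRegression().fit(X, y)
print("Linear coefficients:", np.round(lin.coef_, 2))

# Lasso: same inspection, but with an L1 penalty applied during fitting.
lasso = Lasso(alpha=1.0).fit(X, y)
print("Lasso coefficients: ", np.round(lasso.coef_, 2))
```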

Lijin Durairaj

1 Answer


Lasso is a regularization technique for avoiding overfitting when you train your model. When you do not use any regularization, your loss function only tries to minimize the difference between the predicted and real values, min |y_pred - y|. To minimize this loss, gradient descent adjusts the coefficients of your model, and because the optimization cares only about matching the training data, this step can cause overfitting. To solve this, regularization techniques add a penalty term to the loss function based on the values of the coefficients. Lasso specifically adds the L1 norm (the sum of the absolute values of the coefficients), so while your model tries to minimize the difference between predicted and real values, it also avoids letting the coefficients grow too large. In fact, the L1 penalty pushes some coefficients exactly to zero, which is what makes Lasso a genuine feature selection method, whereas plain Linear Regression coefficients are rarely exactly zero and you must pick a cutoff by hand.
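
Concretely, here is a minimal numeric sketch of that combined loss (plain NumPy; the alpha value and exact scaling of the error term are illustrative, e.g. scikit-learn divides the squared error by 2 * n_samples):

```python
import numpy as np

def lasso_loss(y_true, y_pred, coefs, alpha=1.0):
    """Squared-error loss plus the Lasso (L1) penalty on the coefficients."""
    # Data-fit term: how far the predictions are from the real values.
    fit = np.mean((y_true - y_pred) ** 2)
    # Penalty term: total absolute size of the coefficients.
    penalty = alpha * np.sum(np.abs(coefs))
    return fit + penalty

y_true = np.array([3.0, 5.0, 7.0])
y_pred = np.array([2.8, 5.3, 6.9])
small_coefs = np.array([1.0, 0.5, 0.0])
large_coefs = np.array([10.0, -8.0, 6.0])

# Same prediction error, but the larger coefficients pay a larger penalty.
print(lasso_loss(y_true, y_pred, small_coefs))  # lower total loss
print(lasso_loss(y_true, y_pred, large_coefs))  # higher total loss
```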

As you mentioned, you can select features in both ways; however, the Lasso technique also takes care of the overfitting problem at the same time.

Batuhan B