
In this article the author says:

...without applying regularization we also run the risk of underfitting...

Why might we get underfitting without regularization? Regularization "makes" the network simpler in order to avoid overfitting, not underfitting. So if we don't apply regularization, that shouldn't cause underfitting.

theateist

1 Answer


We require regularization when our model is overfitting, i.e. our training accuracy is considerably higher than our test accuracy.

When our model is underfitting, we need to increase the complexity of the model (by, say, adding new features).

Hence, regularization is not a solution to underfitting, and that is what the author is trying to say.
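You can see this directly in a minimal sketch (my own toy setup, not from the article): fit a plain line to nonlinear data using closed-form ridge regression, w = (XᵀX + λI)⁻¹Xᵀy. Ordinary least squares (λ = 0) already minimizes training error, so any λ > 0 can only match or worsen the fit. Shrinking weights removes capacity; it never adds the capacity an underfitting model lacks.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 50)
y = np.sin(x) + 0.1 * rng.standard_normal(50)  # nonlinear target

# Linear features only -> the model underfits sin(x) no matter what.
X = np.column_stack([np.ones_like(x), x])

def ridge_fit(X, y, lam):
    """Closed-form ridge solution: (X^T X + lam*I)^(-1) X^T y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def train_mse(lam):
    w = ridge_fit(X, y, lam)
    return np.mean((X @ w - y) ** 2)

# OLS (lam=0) is the training-error minimizer, so regularization
# can only keep or increase training error on an underfitting model.
print(train_mse(0.0) <= train_mse(10.0))  # True
```

The same penalty *does* help when the model has excess capacity (e.g. high-degree polynomial features), which is exactly the overfitting case the answer describes.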

Faraz Gerrard Jamal
  • So, by "applying regularization" he meant "adding new features"? I thought he meant applying the l1-norm or l2-norm to avoid underfitting, and that is why it was confusing. – theateist Jun 18 '18 at 18:13
  • @theateist No, he means that regularization is not the solution if your model is underfitting. You apply l1 or l2 regularization when your model is overfitting. – Faraz Gerrard Jamal Jun 18 '18 at 18:29
  • I understand that we apply l1 or l2 when the model is overfitting. It's just that when someone says "without doing A we might get B", I read the sentence as "if I don't want to get B, I need to do A". That's why I understood that sentence as "if I don't want underfitting, I need regularization", which is confusing, since regularization is needed when the model is overfitting, not underfitting. I hope I explained why that particular sentence confused me. – theateist Jun 18 '18 at 18:40
  • @theateist The author is just saying that if we don't apply regularization, it may be because our model is underfitting, i.e. our model DOES NOT require regularization. – Faraz Gerrard Jamal Jun 18 '18 at 18:49
  • I think this is a correct answer, but I also have to say that the author's writing is really easy to misunderstand. – Long Oct 22 '19 at 06:23