
I am reading the paper 'Cyclical Learning Rates for Training Neural Networks' (https://arxiv.org/abs/1506.01186).

Does it make sense to use the learning rate finder if the model is over-fitting? Other than reducing the number of iterations before the model over-fits, will using the learning rate finder prevent over-fitting?

From reading the paper, there is no suggestion that this method reduces over-fitting. Is my interpretation correct?
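For context, this is roughly what the learning rate finder (the 'LR range test' described in the paper) does, as I understand it; a minimal sketch assuming PyTorch, with `model`, `loader`, and `criterion` left as placeholders:

```python
import torch

def lr_range_test(model, loader, criterion, lr_min=1e-6, lr_max=1.0, num_steps=100):
    # Increase the learning rate linearly each batch and record the loss;
    # plotting loss against LR then shows where training starts to diverge,
    # which bounds the useful learning-rate range.
    optimizer = torch.optim.SGD(model.parameters(), lr=lr_min)
    history = []
    for step, (inputs, targets) in enumerate(loader):
        if step >= num_steps:
            break
        lr = lr_min + (lr_max - lr_min) * step / num_steps
        for group in optimizer.param_groups:
            group["lr"] = lr
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()
        history.append((lr, loss.item()))
    return history
```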

blue-sky
  • Hard to say. As the learning rate changes according to the defined cycle, it may escape local minima and find more optimal values. It depends on the nature of your function, but the idea behind it is not related to preventing over-fitting; you would use other approaches for that, such as `dropout`. – Szymon Maszke Feb 10 '19 at 18:56
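For reference, the cycle the comment refers to is the triangular policy defined in Section 3.1 of the paper. A minimal sketch of that formula; the hyperparameter values here are only illustrative, and the paper suggests choosing them from the range test:

```python
import math

def triangular_lr(iteration, step_size=2000, base_lr=1e-4, max_lr=1e-2):
    # Triangular cyclical learning rate from the paper: the LR rises linearly
    # from base_lr to max_lr over step_size iterations, falls back over the
    # next step_size iterations, then repeats.
    cycle = math.floor(1 + iteration / (2 * step_size))
    x = abs(iteration / step_size - 2 * cycle + 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1 - x)
```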

1 Answer


I don't think changing the learning rate reduces over-fitting. To avoid over-fitting you might want to use L1/L2 regularization and dropout, or one of their variants.
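A minimal sketch of what that looks like in practice, assuming PyTorch (the layer sizes and coefficients are placeholders, not recommendations):

```python
import torch
import torch.nn as nn

# Dropout inside the model and L2 regularization via the optimizer's
# weight_decay, the two techniques suggested above.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # randomly zeroes activations during training
    nn.Linear(256, 10),
)

# weight_decay adds an L2 penalty on the weights to each update;
# 1e-4 is only a common starting point.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, weight_decay=1e-4)
```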

Umang Gupta