
Is it possible to change (decrease) the 'learning rate' parameter, i.e. the gradient step coefficient, during training of a CatBoostRegressor() model? Would that reduce the number of iterations and speed up training?

2 Answers


The smaller the gradient step size, the more iterations you need to train the model. This increases training time, but can help minimize the average error of your loss function more accurately. Read the official recommendations for tuning your CatBoostRegressor model.
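For illustration, a fixed (non-decaying) smaller step is simply a constructor parameter traded off against the iteration count; the values below are arbitrary examples, not recommendations:

```python
from catboost import CatBoostRegressor

# Smaller step, more rounds: as a rough rule of thumb, halving the
# learning rate needs roughly twice as many iterations to reach a
# similar training loss. All values here are illustrative.
model = CatBoostRegressor(
    learning_rate=0.015,   # smaller gradient step...
    iterations=2000,       # ...compensated by more boosting rounds
    loss_function='RMSE',
    verbose=False,
)
```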

  • I mean changing this parameter, for example, linearly DURING training: on the first iteration it would be, say, 0.03, then decrease toward 0.003 in steps of 0.001 on each iteration – Sergey Novozhilov Mar 02 '18 at 14:38

This does not seem to be implemented yet.

In this issue thread, the CatBoost developers report that learning rate decay generally degrades performance in their experiments.

If you want learning rate decay regardless, you can approximate it by (i) training a model with a higher learning rate and then (ii) passing that trained model as an initialiser (via fit()'s init_model argument) to a second model that you train with a lower learning rate, and so on.
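A minimal sketch of that staged workaround, assuming a CatBoost version recent enough to support training continuation via fit()'s init_model argument (CPU training); the data and all parameter values are made up for illustration:

```python
import numpy as np
from catboost import CatBoostRegressor

# Toy data just to make the sketch runnable.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = X[:, 0] + 0.1 * rng.normal(size=1000)

# Stage 1: coarse fit with a larger gradient step.
stage1 = CatBoostRegressor(learning_rate=0.1, iterations=300, verbose=False)
stage1.fit(X, y)

# Stage 2: continue from stage 1 with a smaller step. Passing the
# stage-1 model as init_model makes the new trees correct the
# residuals of its predictions, giving a two-step approximation of decay.
stage2 = CatBoostRegressor(learning_rate=0.01, iterations=300, verbose=False)
stage2.fit(X, y, init_model=stage1)
```

Each stage keeps the trees already built, so later stages only refine the residuals left by the earlier ones; adding more stages with progressively smaller rates approximates a finer decay schedule.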

– Matsaulait