
I want to write a learning rate schedule driven by logged metrics. The API docs provide an example of linear decay (https://stable-baselines3.readthedocs.io/en/master/guide/examples.html#learning-rate-schedule), and examples of exponential decay can be found elsewhere. Is there a way to perform metric-based decay (e.g. based on the loss)? And is it even relevant with the Adam optimizer, which is already adaptive?
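For context, here is the progress-based pattern from the linked docs (SB3's `learning_rate` accepts a callable of `progress_remaining` only, not of any logged metric), plus a standalone sketch of the loss-based decay I have in mind. The `PlateauDecay` class is hypothetical, not an SB3 API; presumably it would have to be wired in through a custom callback that writes to `model.policy.optimizer.param_groups`.

```python
def linear_schedule(initial_value: float):
    """SB3-style schedule: progress_remaining goes from 1 to 0 during training."""
    def func(progress_remaining: float) -> float:
        return progress_remaining * initial_value
    return func


class PlateauDecay:
    """Hypothetical loss-based decay: shrink the LR when the loss stops improving."""

    def __init__(self, lr: float, factor: float = 0.5,
                 patience: int = 3, min_lr: float = 1e-6):
        self.lr = lr
        self.factor = factor        # multiplicative decay applied on a plateau
        self.patience = patience    # non-improving steps tolerated before decaying
        self.min_lr = min_lr
        self.best = float("inf")
        self.bad_steps = 0

    def step(self, loss: float) -> float:
        # Track the best loss seen so far; count steps without improvement.
        if loss < self.best:
            self.best = loss
            self.bad_steps = 0
        else:
            self.bad_steps += 1
            if self.bad_steps > self.patience:
                self.lr = max(self.lr * self.factor, self.min_lr)
                self.bad_steps = 0
        return self.lr
```

This is essentially what PyTorch's `ReduceLROnPlateau` does; my question is whether there is a supported way to hook something like it into SB3's training loop.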

Thanks for your answer.

GerardL
