For debugging purposes, I would like to monitor the learning rate in my fit callback function, to make sure my mx.lr_scheduler.MultiFactorScheduler is doing its job as expected.

Unfortunately, the learning rate does not seem to be accessible in the params. Is there a way to access the learning rate actually used for the current batch?

Many thanks!

HALMTL

1 Answer


One possible solution is to subclass the optimizer you currently use and override its update method to log the current learning rate on every update.
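A minimal sketch of this subclass-and-log pattern. Note the classes below are plain-Python stand-ins, not the real MXNet API: the real optimizer would be something like mx.optimizer.SGD, whose update(index, weight, grad, state) is called once per weight array per batch, and weights/grads would be NDArrays rather than lists. The stand-ins keep the example runnable without MXNet installed.

```python
class SGD:
    """Hypothetical stand-in for mx.optimizer.SGD: holds a base lr
    and optionally defers to a scheduler for the effective lr."""

    def __init__(self, learning_rate=0.1, lr_scheduler=None):
        self.lr = learning_rate
        self.lr_scheduler = lr_scheduler
        self.num_update = 0

    def _get_lr(self):
        # Resolve the effective lr for the current update count.
        if self.lr_scheduler is not None:
            return self.lr_scheduler(self.num_update)
        return self.lr

    def update(self, index, weight, grad, state):
        self.num_update += 1
        lr = self._get_lr()
        # Plain SGD step on Python lists as a stand-in for NDArrays.
        for i in range(len(weight)):
            weight[i] -= lr * grad[i]


class LoggingSGD(SGD):
    """Subclass that records the lr actually used for every update."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.lr_history = []

    def update(self, index, weight, grad, state):
        super().update(index, weight, grad, state)
        # num_update was bumped inside the parent's update, so this
        # is exactly the lr the parent just applied.
        self.lr_history.append(self._get_lr())


# Usage: a toy step-down schedule at the third update.
opt = LoggingSGD(learning_rate=0.1,
                 lr_scheduler=lambda n: 0.1 if n < 3 else 0.01)
w, g = [1.0], [0.5]
for _ in range(5):
    opt.update(0, w, g, None)
print(opt.lr_history)  # -> [0.1, 0.1, 0.01, 0.01, 0.01]
```

With a real MXNet optimizer the override would be the same shape: call the parent's update, then read the scheduler (or self.lr) to log the value just used.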

kevinthesun
  • Well, I have written my own scheduler to store the learning rates over time. My hope was that the model would return the current value in the params so it could be tracked in the callback function. Thank you for your answer, Kevin! – HALMTL Dec 30 '16 at 05:13
  • It seems difficult to access the lr directly in the params. But if you use the optimizer interface, you can override the update method and log the lr. If you have your own lr_scheduler, you can create an optimizer with it. http://mxnet.io/api/python/model.html?highlight=optimizer#optimizer-api-reference Hope it works for you :) – kevinthesun Jan 03 '17 at 19:30
  • Many thanks for your advice and time, Kevin. I will dig into this! – HALMTL Jan 04 '17 at 04:22
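For reference, the "own scheduler that stores learning rates over time" idea from the comments can be sketched in plain Python. MultiFactorScheduler below is a hypothetical stand-in approximating the behavior of mx.lr_scheduler.MultiFactorScheduler (multiply the base lr by factor each time num_update passes a step boundary), and RecordingScheduler is an invented name for the logging wrapper:

```python
class MultiFactorScheduler:
    """Stand-in approximating mx.lr_scheduler.MultiFactorScheduler:
    the lr is base_lr multiplied by `factor` once for every entry in
    `step` that num_update has passed."""

    def __init__(self, step, factor=1.0, base_lr=0.01):
        self.step = step          # e.g. [2, 4]: drop lr after updates 2 and 4
        self.factor = factor
        self.base_lr = base_lr

    def __call__(self, num_update):
        lr = self.base_lr
        for s in self.step:
            if num_update > s:
                lr *= self.factor
        return lr


class RecordingScheduler(MultiFactorScheduler):
    """Wrapper that records (num_update, lr) every time it is queried,
    so the schedule can be inspected from a callback afterwards."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.history = []

    def __call__(self, num_update):
        lr = super().__call__(num_update)
        self.history.append((num_update, lr))
        return lr


# Usage: halve the lr after updates 2 and 4.
sched = RecordingScheduler(step=[2, 4], factor=0.5, base_lr=1.0)
lrs = [sched(n) for n in range(1, 6)]
print(lrs)  # -> [1.0, 1.0, 0.5, 0.5, 0.25]
```

Passing such a scheduler to the optimizer (via lr_scheduler) and reading its history from the callback gives the per-batch lr without touching the params at all.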