
I am using a custom learning rate scheduler. The code is below (the same as in https://www.tensorflow.org/tutorials/text/transformer#optimizer).

import tensorflow as tf

class CustomSchedule(tf.keras.optimizers.schedules.LearningRateSchedule):
    def __init__(self, d_model, warmup_steps=4000):
        super(CustomSchedule, self).__init__()

        # Cast once so rsqrt below operates on a float tensor
        self.d_model = tf.cast(d_model, tf.float32)
        self.warmup_steps = warmup_steps

    def __call__(self, step):
        # Inverse-square-root decay after a linear warmup phase
        arg1 = tf.math.rsqrt(step)
        arg2 = step * (self.warmup_steps ** -1.5)

        return tf.math.rsqrt(self.d_model) * tf.math.minimum(arg1, arg2)
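For context, I pass this schedule straight to the optimizer roughly as the tutorial does (a sketch; d_model=128 and the model object are assumed example values, the Adam hyperparameters are the tutorial's):

    learning_rate = CustomSchedule(d_model=128)  # d_model=128 is an assumed example
    optimizer = tf.keras.optimizers.Adam(learning_rate,
                                         beta_1=0.9, beta_2=0.98, epsilon=1e-9)
    model.compile(optimizer=optimizer, loss="sparse_categorical_crossentropy")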

I am training with model.fit and want to print the learning rate at the end of each epoch.
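Something along these lines is what I have in mind (a rough, untested sketch; PrintLR is just a placeholder name I made up). Since the optimizer's lr attribute holds the schedule object itself when a LearningRateSchedule is passed in, the schedule has to be called at the optimizer's current step to get a concrete value:

    import tensorflow as tf

    class PrintLR(tf.keras.callbacks.Callback):
        # Prints the current learning rate at the end of every epoch.
        def on_epoch_end(self, epoch, logs=None):
            lr = self.model.optimizer.lr
            # With a schedule, optimizer.lr is the schedule object, so evaluate
            # it at the current step; cast iterations (int64) to float32 first,
            # because the schedule's rsqrt expects a float tensor.
            if isinstance(lr, tf.keras.optimizers.schedules.LearningRateSchedule):
                lr = lr(tf.cast(self.model.optimizer.iterations, tf.float32))
            print(f"\nEpoch {epoch + 1}: learning rate = {float(lr):.6e}")

    # x_train/y_train are assumed placeholders for the training data
    model.fit(x_train, y_train, epochs=10, callbacks=[PrintLR()])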

    Does this answer your question? [Getting the current learning rate from a tf.train.AdamOptimizer](https://stackoverflow.com/questions/36990476/getting-the-current-learning-rate-from-a-tf-train-adamoptimizer) – desertnaut Apr 08 '21 at 10:54
  • Does [this](https://stackoverflow.com/a/61780874/14290681) answer your query? – Apr 21 '21 at 09:34

0 Answers