This question is a possible duplicate of another question on Stack Overflow.

However, the answer there doesn't show how to modify the optimizer's learning rate with a scheduler (one that could be implemented in plain Python).

I'm training a TensorFlow model from scratch with a custom training loop, as explained here. The optimizer is defined as optimizer = keras.optimizers.SGD(learning_rate=1e-3), so the learning rate is fixed when the optimizer is created. However, I'd like to use a learning rate schedule such as tf.keras.optimizers.schedules.ExponentialDecay. How can I change the optimizer's learning rate from within the training loop?
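For what it's worth, I know the built-in schedule can be constructed like this (the decay values are just placeholders), but I don't see how to hook it into my loop:

lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3,
    decay_steps=1000,
    decay_rate=0.9)
optimizer = keras.optimizers.SGD(learning_rate=lr_schedule)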

Please note that I am not using model.fit in this case.

Any help is much appreciated.

1 Answer

Try this code:

import tensorflow as tf

# A custom schedule: the optimizer calls it with the current training
# step and uses the returned value as the learning rate.
class CustomSchedule(tf.keras.optimizers.schedules.LearningRateSchedule):
  def __init__(self, drop=0.5):
    super().__init__()
    self.drop = drop

  def __call__(self, step):
    # Exponential decay: lr = drop ** step. Cast the step so the
    # power op receives a float, whatever dtype the optimizer passes.
    return tf.math.pow(self.drop, tf.cast(step, tf.float32))

def train_step(images, labels):
  with tf.GradientTape() as tape:
    predictions = model(images, training=True)
    loss = tf.losses.mse(labels, predictions)
  gradients = tape.gradient(loss, model.trainable_variables)
  # apply_gradients advances optimizer.iterations, which is the step
  # the schedule receives on the next call.
  optimizer.apply_gradients(zip(gradients, model.trainable_variables))

learning_rate = CustomSchedule(0.5)
optimizer = tf.keras.optimizers.SGD(learning_rate=learning_rate)
model = tf.keras.Sequential([
  tf.keras.layers.Dense(10)
])
inputs = tf.random.uniform([10, 10])
# Use float labels so they match the model's float predictions.
labels = tf.random.uniform([10, 10], 0, 10)
train_step(inputs, labels)
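
If you'd rather change the rate from plain Python inside the loop instead of subclassing LearningRateSchedule, you can also overwrite the optimizer's learning rate between steps. This is a minimal sketch; it assumes the optimizer was created with a plain float learning rate (not a schedule object), and num_epochs is just an example value:

# Manual alternative: set the learning rate from the training loop.
optimizer = tf.keras.optimizers.SGD(learning_rate=1e-3)
num_epochs = 5
for epoch in range(num_epochs):
  new_lr = 1e-3 * (0.5 ** epoch)  # any Python logic works here
  tf.keras.backend.set_value(optimizer.learning_rate, new_lr)
  train_step(inputs, labels)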