
I am using Python 3.7 and TensorFlow 2.0, and I have to train a neural network for 160 epochs with the following learning rate schedule:

Decrease the learning rate by a factor of 10 at epochs 80 and 120, with an initial learning rate of 0.01.

How can I write a function to implement this learning rate schedule? This is what I have so far:

def scheduler(epoch):
    if epoch < 80:
        return 0.01
    elif epoch >= 80 and epoch < 120:
        return 0.01 / 10
    elif epoch >= 120:
        return 0.01 / 100

callback = tf.keras.callbacks.LearningRateScheduler(scheduler)

model.fit(
    x = data, y = labels,
    epochs=160, callbacks=[callback],
    validation_data=(val_data, val_labels))

Is this a correct implementation?

Thanks!

Arun

1 Answer


The schedule function passed to tf.keras.callbacks.LearningRateScheduler takes the epoch index (an integer, indexed from 0) and the current learning rate as inputs, and returns the new learning rate (a float):

def scheduler(epoch, current_learning_rate):
    # Epoch indices are 0-based, so indices 79 and 119 are the 80th and 120th epochs.
    if epoch == 79 or epoch == 119:
        return current_learning_rate / 10
    else:
        # Leave the learning rate unchanged in every other epoch.
        return current_learning_rate

This reduces the learning rate by a factor of 10 at the start of the 80th and 120th epochs (indices 79 and 119, since Keras passes a 0-based epoch index) and leaves it unchanged in all other epochs.
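
If you want to confirm the drops land where you expect before running the full training, here is a small sanity check (just a sketch, reusing the scheduler above and the initial learning rate of 0.01 from the question) that simulates the per-epoch updates the callback would apply:

lr = 0.01
for epoch in range(160):
    # The callback calls the schedule at the start of each epoch with the current rate.
    lr = scheduler(epoch, lr)
    if epoch in (0, 79, 80, 119, 120, 159):
        print(f"epoch {epoch}: lr = {lr}")

Passing verbose=1 to tf.keras.callbacks.LearningRateScheduler(scheduler, verbose=1) will also print the learning rate it sets at the start of each epoch during model.fit.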

ITiger
  • How can I use the 'scheduler()' function for learning rate scheduling with 'GradientTape'? Or should I open a new question for it? Thanks! – Arun Mar 22 '20 at 16:08
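
Regarding the follow-up comment: a custom GradientTape training loop has no callback machinery, but one option (a sketch, not part of the answer above; model, train_dataset, and loss_fn are assumed placeholders) is to call the same scheduler at the start of each epoch and write the result back into the optimizer:

import tensorflow as tf

optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

for epoch in range(160):
    # Mirror what LearningRateScheduler does: compute the rate for this epoch
    # and assign it to the optimizer before the epoch's batches are processed.
    current_lr = float(tf.keras.backend.get_value(optimizer.learning_rate))
    tf.keras.backend.set_value(optimizer.learning_rate, scheduler(epoch, current_lr))

    for x_batch, y_batch in train_dataset:
        with tf.GradientTape() as tape:
            predictions = model(x_batch, training=True)
            loss = loss_fn(y_batch, predictions)
        gradients = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(gradients, model.trainable_variables))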