I am using Python 3.7 and TensorFlow 2.0. I need to train a neural network for 160 epochs with the following learning rate schedule: the initial learning rate is 0.01, and it is decreased by a factor of 10 at epochs 80 and 120.
Here is the function I wrote to implement this schedule:
def scheduler(epoch):
    if epoch < 80:
        return 0.01
    elif epoch < 120:
        return 0.01 / 10
    else:
        return 0.01 / 100

callback = tf.keras.callbacks.LearningRateScheduler(scheduler)
model.fit(
    x=data, y=labels,
    epochs=160, callbacks=[callback],
    validation_data=(val_data, val_labels))
Is this a correct implementation?
Thanks!
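As a quick sanity check, the scheduler function can be exercised on its own, outside of training, since it is plain Python (a standalone sketch; the epoch/value pairs below just restate the schedule described above):

```python
# Step schedule: 0.01 until epoch 80, then /10, then /10 again at epoch 120.
def scheduler(epoch):
    if epoch < 80:
        return 0.01
    elif epoch < 120:
        return 0.01 / 10
    else:
        return 0.01 / 100

# Check the boundaries on both sides of each drop.
expected = [(0, 0.01), (79, 0.01), (80, 0.01 / 10),
            (119, 0.01 / 10), (120, 0.01 / 100), (159, 0.01 / 100)]
for epoch, lr in expected:
    assert scheduler(epoch) == lr
```

Because `tf.keras.callbacks.LearningRateScheduler` simply calls this function at the start of each epoch, verifying the function directly verifies the schedule itself.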