This question may be a duplicate of another Stack Overflow question. However, that answer doesn't show how to change the optimizer's learning rate using a scheduler (one that could be implemented in plain Python).
I'm training a TensorFlow model from scratch, as explained here. The optimizer is defined as `optimizer = keras.optimizers.SGD(learning_rate=1e-3)`, so the learning rate is fixed at the start. However, I'd like to use a learning rate schedule such as `tf.keras.optimizers.schedules.ExponentialDecay`. How can I change the optimizer's learning rate from within the training loop?
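To illustrate the kind of schedule I mean, here is a minimal sketch of exponential decay in plain Python (the function name and parameters are my own; it's meant to mirror the continuous, non-staircase behaviour of `tf.keras.optimizers.schedules.ExponentialDecay`):

```python
def exponential_decay(initial_lr, decay_rate, decay_steps, step):
    """Return the decayed learning rate at a given training step.

    Mirrors the formula used by non-staircase ExponentialDecay:
        lr = initial_lr * decay_rate ** (step / decay_steps)
    """
    return initial_lr * decay_rate ** (step / decay_steps)

# Example: start at 1e-3 and halve the rate every 1000 steps.
lr_at_start = exponential_decay(1e-3, 0.5, 1000, step=0)     # 1e-3
lr_at_2000 = exponential_decay(1e-3, 0.5, 1000, step=2000)   # 1e-3 * 0.25 = 2.5e-4
```

What I don't know is how to push the value computed this way back into the optimizer on each iteration of the loop.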
Please note that I am not using `model.fit` in this case.
Any help is much appreciated.