I am having difficulty understanding how training resumes when a model is loaded from disk and a learning-rate scheduler like the one below is used.
import tensorflow

learning_rate_scheduler = tensorflow.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.01,
    decay_steps=1000,
    decay_rate=0.96,
    staircase=True)
Consider this hypothetical situation: I train the model for one epoch and save it. Later I load the model and call fit again. In this case, will training resume from the learning rate that was in effect when the model was saved, or will it start over from the scheduler's initial configuration?
Edit
I am saving my model in the standard way:
model.save("model")
Below is the optimizer config after loading. The learning-rate config is the same as in the original definition.
hour_glass_model.optimizer.get_config()
{'amsgrad': False,
 'beta_1': 0.9,
 'beta_2': 0.999,
 'decay': 0.0,
 'epsilon': 1e-07,
 'learning_rate': {'class_name': 'ExponentialDecay',
  'config': {'decay_rate': 0.96,
   'decay_steps': 1000,
   'initial_learning_rate': 0.01,
   'name': None,
   'staircase': True}},
 'name': 'Adam'}
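For reference, here is a plain-Python mirror of what I understand the staircase schedule to compute (the function name is mine, not a Keras API). The point is that the schedule itself is a pure function of the step number, so the config above would look identical whether or not training progress was preserved; what matters is which step count the optimizer feeds into it after loading.

```python
import math

def exponential_decay_lr(step, initial_lr=0.01, decay_steps=1000,
                         decay_rate=0.96, staircase=True):
    """Plain-Python mirror of the ExponentialDecay formula:
    lr = initial_lr * decay_rate ** (step / decay_steps),
    with the exponent floored when staircase=True."""
    exponent = step / decay_steps
    if staircase:
        exponent = math.floor(exponent)
    return initial_lr * decay_rate ** exponent

# Same config, different step counts -> different learning rates:
lr_fresh = exponential_decay_lr(0)       # start of training
lr_resumed = exponential_decay_lr(2500)  # e.g. if the step counter survived saving
print(lr_fresh, lr_resumed)
```

So the question boils down to whether the optimizer's internal step counter is restored along with the model, since the schedule config alone cannot distinguish a fresh start from a resumed run.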