
This is my first question here. I'm playing with tensorflow.keras, doing some CNNs, and I would like to know if anyone understands why this conflict arises. Thanks.

from tensorflow.keras.optimizers import Nadam
from tensorflow.keras.optimizers.schedules import ExponentialDecay 

initial_learning_rate = 0.1
lr_schedule = ExponentialDecay(
    initial_learning_rate,
    decay_steps=100000, decay_rate=0.96, staircase=True)


model.compile(
    optimizer=Nadam(learning_rate=lr_schedule),
    loss='categorical_crossentropy',
    metrics=['accuracy'])
    Please clarify your specific problem or provide additional details to highlight exactly what you need. As it's currently written, it's hard to tell exactly what you're asking. – Community Mar 10 '22 at 15:31
  • What is the conflict? I am also learning to use keras.optimizers.schedules. I'm thinking of doing a grid search, but I still have no idea how to do it. – robintux Apr 07 '22 at 02:53

1 Answer


The error ValueError: The Nadam optimizer does not support tf.keras.optimizers.LearningRateSchedules as the learning rate is raised because the Nadam optimizer does not accept a LearningRateSchedule object, unlike the other Keras optimizers.

You can use any of the other optimizers instead of Nadam; they do support schedules (see the sketch after the list):

  • Adadelta
  • Adagrad
  • Adam
  • Adamax
  • Ftrl
  • RMSprop
  • SGD
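
For example, you can keep the exact same ExponentialDecay schedule and only swap the optimizer. A minimal sketch, assuming model is an already-built Keras model as in the question:

from tensorflow.keras.optimizers import Adam
from tensorflow.keras.optimizers.schedules import ExponentialDecay

# Same schedule as in the question
lr_schedule = ExponentialDecay(
    initial_learning_rate=0.1,
    decay_steps=100000,
    decay_rate=0.96,
    staircase=True)

# Adam accepts a LearningRateSchedule directly,
# so compile() no longer raises the ValueError.
model.compile(
    optimizer=Adam(learning_rate=lr_schedule),
    loss='categorical_crossentropy',
    metrics=['accuracy'])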