
I wrote a CosineAnnealingLRScheduler:

import numpy as np
import tensorflow as tf
from tensorflow.keras import optimizers

class CosineAnnealingLRScheduler(optimizers.schedules.LearningRateSchedule):
    def __init__(self, epochs, train_step, lr_max, lr_min, warmth_rate=0.2):
        super().__init__()

        self.total_step = epochs * train_step
        self.warm_step = int(self.total_step * warmth_rate)
        self.lr_max = lr_max
        self.lr_min = lr_min

    @tf.function
    def __call__(self, step):
        # step arrives as an integer tensor (optimizer.iterations); cast so the
        # arithmetic below stays in float32
        step = tf.cast(step, tf.float32)
        if step < self.warm_step:
            # linear warm-up from 0 to lr_max
            lr = self.lr_max / self.warm_step * step
        else:
            # cosine decay from lr_max toward lr_min
            lr = self.lr_min + 0.5 * (self.lr_max - self.lr_min) * (1.0 + tf.cos((step - self.warm_step) / self.total_step * np.pi))

        return lr
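To sanity-check the schedule shape without running TensorFlow, here is a plain-NumPy re-implementation of the same formula (the concrete numbers `total_step=4880`, `warm_step=976`, `lr_max=1e-4`, `lr_min=1e-6` are only example values I picked, not from the question). One thing it makes visible: because the cosine phase is normalized by `total_step` rather than by the remaining `total_step - warm_step`, the final learning rate ends above `lr_min`.

```python
import numpy as np

def cosine_schedule(step, total_step, warm_step, lr_max, lr_min):
    """NumPy version of the scheduler above, for quick inspection."""
    if step < warm_step:
        # linear warm-up from 0 to lr_max
        return lr_max / warm_step * step
    # cosine decay; note the divisor is total_step, not (total_step - warm_step)
    return lr_min + 0.5 * (lr_max - lr_min) * (
        1.0 + np.cos((step - warm_step) / total_step * np.pi))

total_step, warm_step = 4880, 976   # e.g. epochs=10, train_step=488, warmth_rate=0.2
lr_max, lr_min = 1e-4, 1e-6

print(cosine_schedule(0, total_step, warm_step, lr_max, lr_min))     # 0.0 at the start
print(cosine_schedule(976, total_step, warm_step, lr_max, lr_min))   # lr_max at end of warm-up
print(cosine_schedule(4880, total_step, warm_step, lr_max, lr_min))  # final lr, still above lr_min
```

With these example numbers the last value is about 1.05e-05, an order of magnitude above `lr_min`; dividing by `total_step - warm_step` instead would land exactly on `lr_min` at the last step.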

I wanted to know whether the learning rate decays correctly, so I wrote a metric to show my lr:

def print_lr(optimizer):
    # metric closure: ignores y_true/y_pred and reports the current decayed lr
    # (note _decayed_lr is a private Keras optimizer API)
    def lr(y_true, y_pred):
        return optimizer._decayed_lr("float32")
    return lr

And the output is:

Epoch 1/10 488/488 [==============================] - 51s 104ms/step - loss: 2.0970 - lr: 2.5051e-05 - val_loss: 1.3853 - val_lr: 5.0000e-05

Epoch 2/10 488/488 [==============================] - 50s 102ms/step - loss: 1.1636 - lr: 7.5051e-05 - val_loss: 1.2225 - val_lr: 1.0000e-04

So I want to know: does Keras model.fit() call the optimizer during validation? I think the optimizer should only be called during the training phase, not during the validation phase.
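The logged numbers are consistent with the optimizer not stepping during validation: Keras still evaluates the metric on validation batches, and the metric just reads the decayed lr at the current (frozen, end-of-epoch) value of optimizer.iterations, while the train-time `lr` column is the running average of the per-batch metric over the epoch. The sketch below reproduces the exact logged values under assumed settings (`epochs=10`, `train_step=488`, `warmth_rate=0.2`, `lr_max=1e-4`, and iterations read after each update; none of these are confirmed by the question, they are just the values that make the arithmetic match).

```python
import numpy as np

# Hypothetical settings that reproduce the logged numbers:
train_step = 488                    # batches per epoch (from the progress bar)
total_step = 10 * train_step        # epochs=10 (assumed)
warm_step = int(total_step * 0.2)   # 976, so both logged epochs are in warm-up
lr_max = 1e-4                       # assumed

def warmup_lr(step):
    # linear warm-up branch of the schedule
    return lr_max / warm_step * step

# Keras averages a metric over the epoch's batches...
epoch1_mean = np.mean([warmup_lr(s) for s in range(1, 489)])
epoch2_mean = np.mean([warmup_lr(s) for s in range(489, 977)])
# ...while val_lr is read with optimizer.iterations already at the epoch's end.
val1 = warmup_lr(488)
val2 = warmup_lr(976)

print(epoch1_mean, val1)   # ~2.5051e-05, 5.0000e-05 (matches epoch 1)
print(epoch2_mean, val2)   # ~7.5051e-05, 1.0000e-04 (matches epoch 2)
```

So `val_lr` is not evidence that the optimizer takes steps during validation; it only shows that the metric is also computed on validation batches, at the step count reached by the end of training for that epoch.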
