
How can I create a TensorFlow Keras API callback that, for every epoch, will add the learning rate value to the csv file created by tf.keras.callbacks.CSVLogger?

With the callback below I can print the learning rate after each epoch and add it to the history, but I can't figure out how to get it into the CSVLogger csv file. It seems there is a logs dict stored somewhere, holding the values that CSVLogger writes out, but I don't understand where it lives or how to add to it.

import tensorflow as tf

class Print_lr(tf.keras.callbacks.Callback):
    def on_epoch_begin(self, epoch, logs=None):
        # Print the current learning rate and append it to the History object.
        print('lr = %f' % self.model.optimizer.lr.numpy())
        if 'lr' not in self.model.history.history:
            self.model.history.history['lr'] = []
        self.model.history.history['lr'].append(self.model.optimizer.lr.numpy())
mattroos
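
A note on the mechanism the question is after: Keras hands the same per-epoch `logs` dict to every callback, in the order the callbacks are passed to `model.fit()`, and CSVLogger writes out whatever keys that dict contains when its own `on_epoch_end` runs. Below is a minimal sketch of a callback that injects the learning rate into `logs`; the class name `LogLR` and the usage line are illustrative assumptions, and it assumes TF 2.x eager execution with a plain learning-rate variable rather than a schedule.

import tensorflow as tf

class LogLR(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        # The same `logs` dict is passed to every callback for this epoch,
        # so a key added here is visible to callbacks that run later.
        if logs is not None:
            logs['lr'] = float(self.model.optimizer.lr.numpy())

# Illustrative usage: place this callback *before* CSVLogger so that 'lr'
# is already in `logs` when CSVLogger writes its row.
# model.fit(x, y, epochs=10,
#           callbacks=[LogLR(), tf.keras.callbacks.CSVLogger('training.csv')])

CSVLogger determines its column headers from the first epoch's `logs`, so the extra key needs to be present from epoch 1 onward.
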
  • By default I believe it already includes the learning rate. My test shows `epoch;accuracy;loss;lr;val_accuracy;val_loss` as the values logged. However if you do require custom logging, consider a [LambdaCallback](https://stackoverflow.com/a/60037721/8471799). More examples in the [documentation](https://keras.io/api/callbacks/lambda_callback/). – cddt Sep 10 '20 at 00:18
  • Thanks, @cddt. I've never seen `lr` in my history or in the CSVLogger csv file. Are you using a LearningRateScheduler callback? Thanks for the LambdaCallback idea. Not exactly what I was hoping for, but it might suffice. Will try it out. – mattroos Sep 10 '20 at 03:54
  • You're right, I was using `ReduceLROnPlateau`. I'm not sure whether that's an option for you. – cddt Sep 10 '20 at 08:05
  • It is, @cddt. So I tried using `ReduceLROnPlateau`, but `lr` still didn't appear in the `CSVLogger` csv file. I've tried TF 2.1 and 2.3. What version are you using? – mattroos Sep 10 '20 at 20:55
  • I found the answer here: https://stackoverflow.com/questions/48488549/keras-append-to-logs-from-callback – mattroos Sep 10 '20 at 21:30
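
The LambdaCallback route suggested in the comments can do the same thing without subclassing. A minimal sketch, assuming a compiled model named `model` is already in scope (the variable name and its placement in the callbacks list are illustrative):

import tensorflow as tf

# Adds 'lr' to the per-epoch logs; put this before CSVLogger in the callbacks
# list so the key is present when CSVLogger writes its row.
log_lr_cb = tf.keras.callbacks.LambdaCallback(
    on_epoch_end=lambda epoch, logs: logs.update(
        lr=float(model.optimizer.lr.numpy())))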

0 Answers