
I am training my model with Keras (TensorFlow backend) in a Jupyter notebook. While the MNIST example updates a single line of the training log in place after each batch, my model on a different dataset prints a new line for every batch. Rather than switching to verbose=2, I would like to see one progress line that updates after every batch.

My call to fit looks like this:

    model.fit(X, y_train, validation_split=0.33, epochs=1, batch_size=200, verbose=1)

The output looks like this:

    Train on 16415 samples, validate on 8085 samples
    Epoch 1/1
    16415/16415 [==============================] - 
    ETA: 73s - loss: 9.0281 - acc: 0.44 - ETA: 49s - loss: 9.0271 - acc: 0.44 -
    ETA: 36s - loss: 8.7043 - acc: 0.46 - ETA: 33s - loss: 8.3979 - acc: 0.47 -
    ETA: 31s - loss: 8.3549 - acc: 0.48 - ETA: 29s - loss: 8.3011 - acc: 0.48 -
    ETA: 28s - loss: 8.1802 - acc: 0.49 - ETA: 27s - loss: 8.1220 - acc: 0.49 -
    ETA: 26s - loss: 8.0995 - acc: 0.49 - ETA: 26s - loss: 8.1178 - acc: 0.49 -
    ETA: 25s - loss: 8.1264 - acc: 0.49 - ETA: 24s - loss: 8.1274 - acc: 0.49 -
    ETA: 24s - loss: 8.0880 - acc: 0.49 - ETA: 23s - loss: 8.0860 - acc: 0.49 -
    ETA: 23s - loss: 8.0894 - acc: 0.49 - ETA: 22s - loss: 8.1303 - acc: 0.49 -
    ...

However, I would like to see only one line that updates after each batch like so:

    Epoch 1/1
    16415/16415 [==============================] - 
    ETA: 23s - loss: 9.0281 - acc: 0.44 - ETA: 22s - loss: 9.0271 - acc: 0.49

I can't find any option in the Keras documentation other than verbose=2, but that does not update the log during training.

AaronDT
  • I think it's a problem with `stdout`; if you ran your code from the command line, I think you would see the output as you expect. This is something that bothers me too... – DJK Sep 25 '17 at 03:35
  • Possible duplicate of [Keras verbose training progress bar writing a new line on each batch issue](https://stackoverflow.com/questions/41442276/keras-verbose-training-progress-bar-writing-a-new-line-on-each-batch-issue) – DJK Sep 25 '17 at 03:38

2 Answers


You can use a LambdaCallback to call custom functions at the end (or beginning) of each batch and epoch.

Use the on_batch_end parameter to pass the function to call:

    from keras.callbacks import LambdaCallback

    def batchOutput(batch, logs):
        print("Finished batch: " + str(batch))
        print(logs)

    batchLogCallback = LambdaCallback(on_batch_end=batchOutput)

    model.fit(x, y, ..., callbacks=[batchLogCallback])
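
If the goal is the single line from the question that refreshes in place, the same callback can overwrite the current line with a carriage return instead of printing new ones. A minimal sketch, assuming accuracy was compiled as a metric (the 'acc' log key and the formatting are assumptions, not part of the original answer); verbose=0 keeps Keras's own progress bar from interleaving with the custom line:

    import sys
    from keras.callbacks import LambdaCallback

    def batch_output(batch, logs):
        # '\r' moves the cursor back to the start of the line, so each batch
        # overwrites the previous status instead of appending a new line.
        sys.stdout.write("\rbatch {:5d} - loss: {:.4f} - acc: {:.4f}".format(
            batch, logs.get('loss', 0.0), logs.get('acc', 0.0)))
        sys.stdout.flush()

    single_line_callback = LambdaCallback(on_batch_end=batch_output)

    # model.fit(X, y_train, validation_split=0.33, epochs=1, batch_size=200,
    #           verbose=0, callbacks=[single_line_callback])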
Daniel Möller

At the time the question was asked there was no built-in solution; you would have had to subclass tf.keras.callbacks.TensorBoard and implement an on_batch_end method yourself. Per-batch logging was added to Keras in 2018.

In TensorFlow you can now use the update_freq parameter of the tf.keras.callbacks.TensorBoard callback and set it to 'batch'.

Like so:

    tensorboard_callback = tf.keras.callbacks.TensorBoard(update_freq='batch')
    tf_model.fit(x, y, epochs=1, callbacks=[tensorboard_callback])
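
A slightly fuller sketch of the same idea; the log directory, the import, and the commented commands are illustrative additions, not part of the original answer:

    import tensorflow as tf

    # Write loss/metric summaries after every batch instead of once per epoch.
    tensorboard_callback = tf.keras.callbacks.TensorBoard(
        log_dir='./logs',
        update_freq='batch')

    # tf_model.fit(x, y, epochs=1, callbacks=[tensorboard_callback])
    # Inspect the per-batch curves with:  tensorboard --logdir ./logs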
Ladislav Ondris