
I have searched everywhere for the meaning of the number that appears below the epoch number while training a model.

For example, what does 1563/1563 mean in this case?

I am using a CNN model, but I don't think the model type matters here, because I see this output with every deep learning algorithm.

Epoch 1/10
1563/1563 [==============================] - 29s 18ms/step - loss: 1.4878 - accuracy: 0.46430s - loss: 1.4912 - ac
This is the total number of data points divided by the batch size, i.e. how many batches pass through your network. The first batch is 1/1563, the second batch is 2/1563, and so on. If you have 50,016 data points and a batch size of 32, you will need 50016/32 = 1563 back-propagation steps in 1 epoch. – Adarsh Wase Sep 28 '21 at 08:04

1 Answer


It is the current batch being processed out of the total number of batches per epoch. The total depends on your batch size and the number of training instances.
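
As a minimal sketch of the arithmetic (the 50,000 samples and batch size of 32 below are illustrative assumptions; they happen to match the 1563 in your output), the number on the right of the progress bar is just the batch count per epoch, rounded up so the final partial batch is included:

    import math

    # Illustrative numbers (assumption): a 50,000-sample training set and
    # the Keras default batch size of 32.
    num_samples = 50_000
    batch_size = 32

    # Keras runs one step per batch, including a final partial batch when the
    # sample count is not an exact multiple of the batch size, so the total
    # shown in the progress bar is the ceiling of the division.
    steps_per_epoch = math.ceil(num_samples / batch_size)
    print(steps_per_epoch)  # 1563 -> displayed as "1563/1563" during training
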

– Gaussian Prior