Good morning!
I just want to clarify: does the batch_size parameter in model.fit specify how many samples go into each batch, or does it specify the number of batches, so that x/batch_size samples go in at a time (where x is the total number of samples)?
That is, suppose we have 20,000 samples and set a batch size of 100. Does that mean 200 samples are passed in at a time (i.e., 100 batches per epoch), or 100 samples at a time (i.e., 200 batches per epoch)?
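For concreteness, this is roughly the call I have in mind (the model and data here are just placeholders, not my actual code):

```python
import numpy as np
from tensorflow import keras

# Placeholder data: 20,000 samples with 10 features each.
x_train = np.random.rand(20000, 10)
y_train = np.random.rand(20000, 1)

# Placeholder model, just so fit() has something to train.
model = keras.Sequential([
    keras.layers.Input(shape=(10,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Is batch_size=100 saying "100 samples per weight update" (200 batches
# per epoch), or "split the data into 100 batches" (200 samples per update)?
model.fit(x_train, y_train, batch_size=100, epochs=1)
```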
I ask because https://deeplizard.com/learn/video/Skc8nqJirJg says "If we passed our entire training set to the model at once (batch_size=1), then the process we just went over for calculating the loss will occur at the end of each epoch during training", implying that batch_size counts batches, so batch_size=1 means one batch. However, the name batch_size seems to mean something different, so I wanted to clarify.
Thank you!
Note: there is another question like this, but it wasn't answered: How BatchSize in Keras works ? LSTM-WithState-Time Series
That question also asks: how are the samples for each batch chosen?