For most purposes the accepted answer is the best one: don't change the batch size. 99% of the time this question comes up, there's a better way to do what you're after.
For the 1% who do have an exceptional case where changing the batch size mid-network is appropriate, there's a GitHub discussion that addresses this well:
https://github.com/keras-team/keras/issues/4807
To summarize it: Keras doesn't want you to change the batch size, so you have to cheat by adding a dimension and telling Keras it's working with a batch_size of 1. For example, your batch of 10 CIFAR-10 images was sized [10, 32, 32, 3]; it now becomes [1, 10, 32, 32, 3]. You'll need to reshape appropriately throughout the network; use tf.expand_dims and tf.squeeze to add and remove that extra dimension.
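
A minimal sketch of the trick, assuming TF 2.x / tf.keras; the specific layers and shapes here are illustrative, not from the linked issue:

```python
import tensorflow as tf

inner_batch = 10  # the "real" batch: 10 CIFAR-10 images

# Tell Keras the batch size is 1; the real batch hides in the next axis.
inputs = tf.keras.Input(shape=(inner_batch, 32, 32, 3), batch_size=1)

# Drop the fake batch axis so the inner layers see [10, 32, 32, 3].
x = tf.keras.layers.Lambda(lambda t: tf.squeeze(t, axis=0))(inputs)
x = tf.keras.layers.Conv2D(16, 3, activation="relu")(x)
x = tf.keras.layers.GlobalAveragePooling2D()(x)

# Re-add the fake batch axis before handing control back to Keras.
outputs = tf.keras.layers.Lambda(lambda t: tf.expand_dims(t, axis=0))(x)

model = tf.keras.Model(inputs, outputs)

# One Keras "sample" is now the whole batch of 10 images.
fake_batch = tf.random.normal([1, inner_batch, 32, 32, 3])
print(model(fake_batch).shape)  # (1, 10, 16)
```

Wrapping tf.squeeze and tf.expand_dims in Lambda layers is just one way to splice the reshapes into a functional model; the same idea works in a custom layer's call method.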