
The Keras code I have looked at or written uses a fixed batch size during training (e.g. 32, 64, 128, ...). I am wondering if it is possible to have a dynamic batch size (for example, 104 in the first iteration, 82 in the next, 95 in the one after that, and so on).

I am currently using the TensorFlow backend.

user3377018

2 Answers


It is possible if you train in a loop instead of with a single fit call. An example:

from random import shuffle

# (start, end) index pairs defining batches of different sizes
dataSlices = [(0, 104), (104, 186), (186, 218)]

for epoch in range(10):
    shuffle(dataSlices)  # visit the slices in a new order each epoch
    for start, end in dataSlices:
        x, y = X[start:end, :], Y[start:end, :]
        model.fit(x, y, epochs=1, batch_size=x.shape[0])
        # OR, as suggested by Daniel Moller:
        # model.train_on_batch(x, y)

This assumes your data is stored in 2D NumPy arrays. The idea can be extended further to use fit_generator() in place of the for loop if you choose (see the docs), as sketched below.
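For instance, a rough sketch of the generator version, assuming (as above) that X and Y are 2D NumPy arrays and model is already compiled:

from random import shuffle

def variable_batch_generator(X, Y, slices):
    # Keras expects the generator to loop over the data indefinitely
    while True:
        shuffle(slices)
        for start, end in slices:
            yield X[start:end, :], Y[start:end, :]

dataSlices = [(0, 104), (104, 186), (186, 218)]
model.fit_generator(variable_batch_generator(X, Y, dataSlices),
                    steps_per_epoch=len(dataSlices),  # batches per epoch
                    epochs=10)

Each yield produces one batch, so the batch size is simply the length of whatever slice you yield.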

DJK

Use None for the batch dimension when defining the model's input shape; the model will then accept any batch size.
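As a minimal sketch (the layer sizes here are arbitrary, just for illustration): with the functional API, Input(shape=...) omits the batch dimension, so it is left as None automatically:

import numpy as np
from keras.models import Model
from keras.layers import Input, Dense

inp = Input(shape=(10,))   # batch dimension is left as None
out = Dense(1)(inp)
model = Model(inp, out)
model.compile(optimizer='adam', loss='mse')

print(model.input_shape)   # (None, 10): any batch size is accepted

# train_on_batch can now be fed batches of different sizes
for size in (104, 82, 95):
    model.train_on_batch(np.random.rand(size, 10), np.random.rand(size, 1))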

prosti