How can I effectively increase the mini-batch size when using the StandardUpdater class in Chainer?
In PyTorch, I can effectively increase the mini-batch size by accumulating gradients:
- Execute loss.backward() every iteration.
- Execute optimizer.step() / optimizer.zero_grad() only once every three iterations. This effectively triples the mini-batch size (see the sketch below).
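For reference, here is a minimal runnable sketch of what I mean; the toy model, `loader`, and `criterion` are just stand-ins, not from any particular codebase:

```python
import torch
import torch.nn as nn

net = nn.Linear(10, 2)                         # toy model as a stand-in
optimizer = torch.optim.SGD(net.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()
loader = [(torch.randn(4, 10), torch.randint(0, 2, (4,))) for _ in range(6)]

accum_steps = 3  # apply the optimizer once every 3 mini-batches

optimizer.zero_grad()
for i, (x, t) in enumerate(loader):
    loss = criterion(net(x), t) / accum_steps  # scale so summed grads approximate the mean
    loss.backward()                            # .grad buffers accumulate across calls
    if (i + 1) % accum_steps == 0:
        optimizer.step()                       # update with the accumulated gradients
        optimizer.zero_grad()                  # reset for the next accumulation cycle
```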
Question 1. In Chainer, is it possible to effectively increase the mini-batch size the same way?
- Execute loss.backward() every iteration.
- Execute net.cleargrads() / optimizer.update() only once every three iterations. Would this effectively increase the mini-batch size (see the sketch after this list)?
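Here is roughly what I have in mind. This is only a sketch, under the assumption that the model is a Link returning the loss (e.g. an L.Classifier) and that Chainer's backward() keeps accumulating gradients until cleargrads() is called:

```python
import numpy as np
import chainer.links as L
from chainer import optimizers

model = L.Classifier(L.Linear(10, 2))      # toy model as a stand-in
optimizer = optimizers.SGD(lr=0.01)
optimizer.setup(model)

accum_steps = 3  # apply the optimizer once every 3 mini-batches

model.cleargrads()
for i in range(6):
    x = np.random.randn(4, 10).astype(np.float32)
    t = np.random.randint(0, 2, size=4).astype(np.int32)
    loss = model(x, t) / accum_steps       # scale so summed grads approximate the mean
    loss.backward()                        # grads accumulate until cleargrads()
    if (i + 1) % accum_steps == 0:
        optimizer.update()                 # no-arg update() applies the current grads
        model.cleargrads()                 # reset for the next accumulation cycle
```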
Question 2. In fact, I'm training with the StandardUpdater class. Is there any hyperparameter of StandardUpdater that effectively increases the mini-batch size? Or should I write my own class that inherits from StandardUpdater and implements the logic above?
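In case it clarifies the question, something like the following subclass is what I imagine. `AccumGradUpdater` and `n_accum` are hypothetical names I made up, and I'm assuming the optimizer's target returns the loss when called with the converted batch:

```python
from chainer.training import StandardUpdater

class AccumGradUpdater(StandardUpdater):
    """Hypothetical updater: applies the optimizer once every n_accum iterations."""

    def __init__(self, iterator, optimizer, n_accum=3, **kwargs):
        super(AccumGradUpdater, self).__init__(iterator, optimizer, **kwargs)
        self.n_accum = n_accum
        self._count = 0

    def update_core(self):
        iterator = self.get_iterator('main')
        optimizer = self.get_optimizer('main')
        model = optimizer.target           # assumes target(x, t) returns the loss

        batch = iterator.next()
        x, t = self.converter(batch, self.device)

        loss = model(x, t) / self.n_accum  # scale so summed grads approximate the mean
        loss.backward()                    # grads keep accumulating across iterations

        self._count += 1
        if self._count % self.n_accum == 0:
            optimizer.update()             # apply the accumulated gradients
            model.cleargrads()             # reset for the next accumulation cycle
```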
I'm sorry if these questions have already been asked. I would appreciate any advice.