
I am following this multiprocessing notebook, and I want to understand how the model's batch_size parameter is distributed across multiple environments.

I have a model trained with 1 worker on 1 environment and batch_size = 64; I understand this to mean the network is updated in batches of 64 samples/timesteps.

Now, what if that same model is trained with 4 workers on 4 environments, with batch_size still set to 64? Is the model now actually updated with 64*4 samples/timesteps? Or is the batch size of 64 split 4 ways, so the model is updated with 64 samples, 16 from each environment?
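To make the question concrete, here is a small NumPy sketch of how I imagine the rollout buffer might work. It assumes Stable-Baselines3-style PPO semantics (n_steps transitions collected per environment, then minibatches of batch_size drawn from the combined, flattened buffer); the parameter names and values here are illustrative assumptions, not something I have confirmed from the source.

```python
import numpy as np

# Assumed (SB3-style) semantics, for illustration only:
# each of n_envs environments contributes n_steps transitions per rollout,
# and batch_size is the minibatch size drawn from the *combined* buffer.
n_envs = 4
n_steps = 128     # hypothetical: steps collected per environment per rollout
batch_size = 64   # minibatch size used for each gradient update

# rollout buffer: one entry per (env, step) transition, flattened together
buffer = np.arange(n_envs * n_steps)   # 512 transition indices
rng = np.random.default_rng(0)
perm = rng.permutation(buffer)

# under this assumption, each gradient update uses exactly `batch_size`
# samples, regardless of which environment each sample came from
minibatches = [perm[i:i + batch_size] for i in range(0, len(perm), batch_size)]
print(len(minibatches))      # 512 / 64 = 8 minibatches per epoch
print(len(minibatches[0]))   # each update still uses 64 samples
```

Is this picture correct, or does each environment instead contribute a fixed share (e.g. 16 of the 64) to every update?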

Thank you!

Vladimir Belik
