
I'm using Stable-Baselines to train an A2C model.

My data length is 9000, so how many total_timesteps should I set in model.learn?

model.learn(total_timesteps = 9000) # ?

I did some research: some suggest around 10,000 and others suggest 1 million. I'm really confused.

Any suggestions?

  • Not a *programming* question, hence off-topic here; please see the NOTE in https://stackoverflow.com/tags/reinforcement-learning/info . Also, why both `pytorch` and `tensorflow` tags? – desertnaut Sep 02 '22 at 22:49

1 Answer


The total_timesteps argument in Stable-Baselines is the total number of samples (environment steps) to train on, so it does not necessarily have to equal the length of your episode. If one episode is 9000 steps, a larger total_timesteps simply means the environment is reset and run through multiple episodes during training.
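
As a minimal sketch (assuming Stable-Baselines3 with a Gymnasium environment; "CartPole-v1" is only a placeholder for your own environment, and 90_000 is an arbitrary example value, not a recommendation):

import gymnasium as gym
from stable_baselines3 import A2C

# Placeholder environment; substitute your own env here.
env = gym.make("CartPole-v1")
model = A2C("MlpPolicy", env, verbose=1)

# total_timesteps counts individual environment steps across ALL episodes,
# not the length of a single episode. If one episode is ~9000 steps,
# training for 90_000 timesteps just replays the environment ~10 times.
model.learn(total_timesteps=90_000)

In practice people pick total_timesteps large enough for the reward curve to converge (which is why suggestions range from 10,000 to 1 million); monitoring training reward, e.g. with TensorBoard, is the usual way to decide.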