
I am using the "neuralnet" package in R to train a neural network. My training set has 6000 data points. After training finishes, I can check the results to see how many steps were needed until all partial derivatives reached the defined threshold. I wonder why the number of steps is always much smaller (for example, 800 steps) than the number of data points in my training set. Does that mean the algorithm doesn't pass through the whole training set before convergence is reached? That would be bad, because it would then miss a lot of information. Or do I have a wrong understanding of what the number of steps means?

McKoppter
  • It's most likely that steps == the number of training epochs or batches. – Eli Korvigo Jan 16 '18 at 15:31
  • @EliKorvigo: Apparently, they are different. https://stackoverflow.com/questions/38340311/what-is-the-difference-between-steps-and-epochs-in-tensorflow @McKoppter This should resolve some or all of your doubts as well. – Shridhar R Kulkarni Jan 16 '18 at 16:56
  • @ShridharR.Kulkarni so basically, it is the number of batches as I've assumed. – Eli Korvigo Jan 16 '18 at 17:01
  • Thank you for your help, it's much clearer now. During one step, one batch is presented to the net, and one batch consists of several data points. Unfortunately, there is no parameter that defines the batch size in the neuralnet package of R. – McKoppter Jan 16 '18 at 18:15
  • @ShridharR.Kulkarni how does your message contradict mine? I stated that the number of steps is the number of batches. You call me wrong, yet you write the same thing. – Eli Korvigo Jan 16 '18 at 18:29
  • @EliKorvigo: Sorry, you are right! Removing the earlier comment. – Shridhar R Kulkarni Jan 16 '18 at 18:43
  • @McKoppter FYI https://stats.stackexchange.com/questions/215320/default-batch-size-in-r-nnet-package – Shridhar R Kulkarni Jan 16 '18 at 18:51
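As a concrete illustration of where neuralnet reports the step count, here is a minimal sketch (the data set, formula, and hidden-layer size are placeholders, not the asker's setup). Note that neuralnet's default training algorithm is resilient backpropagation ("rprop+"), a full-batch method: each step uses the entire training set, so the step count has no fixed relationship to the number of data points.

```r
# Minimal sketch: train a small network and inspect the reported step count.
library(neuralnet)

# Toy data (hypothetical): learn y = x1 XOR x2.
df <- data.frame(x1 = c(0, 0, 1, 1),
                 x2 = c(0, 1, 0, 1),
                 y  = c(0, 1, 1, 0))

set.seed(1)
nn <- neuralnet(y ~ x1 + x2, data = df, hidden = 3, threshold = 0.01)

# "steps" is one row of the result matrix, alongside "error" and
# "reached.threshold". With the default rprop+ algorithm, every step
# is one pass over the full training set.
nn$result.matrix["steps", ]
```

So a run on 6000 data points that stops after 800 steps has in fact seen every data point 800 times; no information is skipped.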

0 Answers