
How many times do I use a sample of training data in one training cycle? Say I have 60 rows of training data. I go through the 1st row, do a forward pass, and adjust the weights using the results from the backward pass, using the sigmoid function as below:

Forward pass 
Si = sum over j of (Wij * Uj)
Ui = f(Si) = 1 / (1 + e^(-Si))

Backward pass 
Output cell delta = (expected - Ui) * f'(Si), where
f'(Si) = Ui * (1 - Ui)
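Written out as code, the forward pass and the output-cell delta above might look like this (a minimal sketch for a single sigmoid unit; the function names are my own, not part of the question):

```python
import math

def forward(weights, inputs):
    # Si = sum over j of (Wij * Uj), then Ui = f(Si) = 1 / (1 + e^(-Si))
    s = sum(w * u for w, u in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-s))

def output_delta(expected, ui):
    # (expected - Ui) * f'(Si), with f'(Si) = Ui * (1 - Ui)
    return (expected - ui) * ui * (1.0 - ui)
```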

Do I then go through the 2nd row and repeat the same process, or do I keep going over the 1st row until its error is small enough?

I hope someone can help please

obsessiveCookie

1 Answer


Training the network

You should use each instance of the training set once per training epoch.

A training epoch is a complete cycle through your dataset.

After you've looped through the dataset and calculated the deltas, you should adjust the weights of the network. Then you may perform a new forward pass on the neural network and do another training epoch, looping through your training dataset.

Graphical representation
A really great graphical representation of backpropagation may be found at this link.


Single-step training

There are two approaches to training your network to perform classification on a dataset. The simplest is called single-step or online learning. This is the method you will find in most literature, and it is also the fastest to converge. As you train your network, you calculate the deltas for each layer and adjust the weights for each instance of your dataset.

Thus if you have a dataset of 60 instances, this means you should have adjusted the weights 60 times before the training epoch is over.
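As a sketch of single-step (online) training, here is a single sigmoid unit with a bias input; the dataset, learning rate and function names are illustrative, not from the answer:

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def train_online(dataset, weights, learning_rate=0.5, epochs=5000):
    """Single-step (online) learning: the weights are adjusted
    immediately after every instance of the dataset."""
    for _ in range(epochs):
        for inputs, expected in dataset:  # one epoch = one full pass
            s = sum(w * u for w, u in zip(weights, inputs))
            out = sigmoid(s)
            delta = (expected - out) * out * (1.0 - out)
            # adjust the weights right away, once per instance
            for j, u in enumerate(inputs):
                weights[j] += learning_rate * delta * u
    return weights

# illustrative example: learning logical OR (last input is a bias fixed at 1.0)
data = [([0.0, 0.0, 1.0], 0.0), ([0.0, 1.0, 1.0], 1.0),
        ([1.0, 0.0, 1.0], 1.0), ([1.0, 1.0, 1.0], 1.0)]
w = train_online(data, [0.0, 0.0, 0.0])
```

With a 4-instance dataset, each epoch here performs 4 weight adjustments; with 60 instances it would perform 60.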

Batch training

The other approach is called batch training or offline learning. This approach often yields a network with a lower residual error. When you train the network you should calculate the deltas for each layer for every instance of the dataset, and then finally average the individual deltas and correct the weights once per epoch.

If you have a dataset of 60 instances, this means you should have adjusted the weights once before the training epoch is over.
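A sketch of batch training with the same illustrative single-unit setup (again, the names, dataset and learning rate are mine, not from the answer):

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def sum_squared_error(dataset, weights):
    err = 0.0
    for inputs, expected in dataset:
        out = sigmoid(sum(w * u for w, u in zip(weights, inputs)))
        err += (expected - out) ** 2
    return err

def train_batch(dataset, weights, learning_rate=0.5, epochs=1000):
    """Batch (offline) learning: accumulate the deltas over the whole
    dataset, then correct the weights once per epoch."""
    for _ in range(epochs):
        grad = [0.0] * len(weights)
        for inputs, expected in dataset:
            out = sigmoid(sum(w * u for w, u in zip(weights, inputs)))
            delta = (expected - out) * out * (1.0 - out)
            for j, u in enumerate(inputs):
                grad[j] += delta * u
        # one adjustment per epoch, using the averaged deltas
        for j in range(len(weights)):
            weights[j] += learning_rate * grad[j] / len(dataset)
    return weights

# same illustrative OR dataset (last input is a bias fixed at 1.0)
data = [([0.0, 0.0, 1.0], 0.0), ([0.0, 1.0, 1.0], 1.0),
        ([1.0, 0.0, 1.0], 1.0), ([1.0, 1.0, 1.0], 1.0)]
w = [0.0, 0.0, 0.0]
before = sum_squared_error(data, list(w))
w = train_batch(data, w)
after = sum_squared_error(data, w)
```

Note that the inner loop is the same as in online learning; only the placement of the weight update differs, which is the whole distinction between the two approaches.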

jorgenkg
  • Really appreciate your help. Are you saying I should loop through all 60 of my data and calculate the deltas and once the loop has finished I adjust the weights? Or do I loop through 60 of my data and right after calculating the delta I adjust the weights? – obsessiveCookie Mar 24 '14 at 19:03
  • I've extended my answer, since this couldn't be explained with 600 inline characters. I think you should go for the online version! – jorgenkg Mar 25 '14 at 06:07
  • Just to iron out your main question: *How many times do I use a sample of training data in one training cycle?* **Once**, almost always. – jorgenkg Mar 25 '14 at 06:23
  • Thanks. I understand how it works if I use only one set of data. The confusion starts when I've got multiple training data. I think I'm trying to go for single step training. I'm not sure what you meant by "adjust the weights for each instance of your network." I thought we only use one instance of a network to train – obsessiveCookie Mar 25 '14 at 09:55
  • What jorgenkp calls "Single-step training" is in most literature called stochastic training. In nearly all cases stochastic training will beat batch training. – Øystein Schønning-Johansen Mar 25 '14 at 10:52
  • @oysteijo That's only partly true. Stochastic training is when you choose the dataset instance to train the network on stochastically, instead of looping through the dataset and choosing the instances deterministically. But both of them are *single-step* training. – jorgenkg Mar 25 '14 at 12:33
  • @obsessiveCookie sorry for the massive typo! "For each instance in the dataset"! You only have one instance of the network of course. I should be on stackoverflow at 0600am – jorgenkg Mar 25 '14 at 12:37
  • I think, In the online training, each time the weights are updated according to the new data entry, and your curve may relocate and this can increase the overall error. Please correct me If I'm wrong. (I'm just learning the NN) – Muaaz Khalid Mar 29 '17 at 06:25