
I'm building a multilayer neural network and have a question regarding the training process. I have a set of training data with desired outputs, and I am using the backpropagation algorithm to update the connection weights.

Should the network train on each training example separately? E.g., the network takes one input, and once it finds connection weights that make the actual output equal the desired output, it moves on to another training input.

Is this correct?

Makaveli
  • I don't understand the question. What does "should" have to do with the training? The network adjusts the weights to minimize the difference between the predicted and observed values over all the training data. – Gordon Linoff Oct 16 '16 at 18:20
  • You're right, but I am not sure about the way the network should adjust the weights. Does the network take only one input (from the training data) at a time, and after it finds the proper weights, take another input? – Makaveli Oct 16 '16 at 18:30
  • That is conceptually how it works. There are methods of accumulating information and then adjusting the weights less frequently. – Gordon Linoff Oct 17 '16 at 01:11

1 Answer

No. Regardless of whether the actual output equals the target output, the backpropagation algorithm should move on to the next element of the training set. The weights/parameters are then updated after a certain number of training cases have been processed, determined by the batch size you specify. With each training iteration, the average total error should normally be lower than in the previous iteration.
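
To make the batching concrete, here is a minimal sketch of that idea in Python/NumPy (not part of the original answer; the network shape, the toy XOR data, and names such as batch_size and W1/W2 are made up for illustration): a one-hidden-layer network trained with the squared-error cost, where the loop always moves on to the next example and the weights change only once per mini-batch.

```python
import numpy as np

# Toy dataset (XOR); purely illustrative, not from the original post.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5          # learning rate
batch_size = 2    # update the weights only after this many examples

for epoch in range(5000):
    total_error = 0.0
    # gradient accumulators for the current mini-batch
    gW1, gb1 = np.zeros_like(W1), np.zeros_like(b1)
    gW2, gb2 = np.zeros_like(W2), np.zeros_like(b2)
    for i in range(len(X)):
        x, y = X[i:i+1], Y[i:i+1]

        # forward pass
        h = sigmoid(x @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        total_error += 0.5 * np.sum((y - out) ** 2)

        # backward pass (chain rule for the squared-error cost)
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        gW2 += h.T @ d_out
        gb2 += d_out
        gW1 += x.T @ d_h
        gb1 += d_h

        # move to the next example regardless of how good this output was;
        # the weights change only once per mini-batch
        if (i + 1) % batch_size == 0:
            W2 -= lr * gW2 / batch_size; b2 -= lr * gb2 / batch_size
            W1 -= lr * gW1 / batch_size; b1 -= lr * gb1 / batch_size
            gW1[:], gb1[:], gW2[:], gb2[:] = 0, 0, 0, 0
    # total_error for this epoch should normally be lower than for the last
```

Averaging the gradient over several examples before each update is the "accumulating information" mentioned in the comments above; with batch_size = 1 this reduces to updating after every single training case.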

atjua
  • Thanks! This is what I was looking for! So the error rate can be computed as follows: 0.5*[SUM(target-actual)^2]? – Makaveli Oct 16 '16 at 18:32
  • @Makaveli Yes, for backpropagation we'll need to define an error cost function for error correction. The "0.5*[SUM(target-actual)^2]" you mentioned is one usable cost function, a squared-error cost closely related to mean squared error. There are many more, but that particular cost function is sufficient for most general cases (a small worked example follows these comments). – atjua Oct 16 '16 at 19:02
  • Thank you very much! This helped a lot! In the testing process the data do not have outputs, so how can we use the error rate to predict the output? – Makaveli Oct 16 '16 at 19:11
  • @Makaveli Since backpropagation is a supervised learning algorithm, you'll always need to provide a target training output for the error correction. I'd suggest you read more about the algorithm; only then will you understand it fully. – atjua Oct 16 '16 at 22:15
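
As a small addendum to the comment thread (not from the original posts), here is a quick numeric check of the 0.5*[SUM(target-actual)^2] cost; the helper name squared_error_cost is hypothetical. Note that at test time there is no target output, so the error is not used for prediction at all: you simply run the trained network's forward pass and take its output as the prediction.

```python
import numpy as np

def squared_error_cost(target, actual):
    # 0.5 * SUM((target - actual)^2), the cost mentioned in the comments above
    target, actual = np.asarray(target, float), np.asarray(actual, float)
    return 0.5 * np.sum((target - actual) ** 2)

# quick numeric check: 0.5 * ((1 - 0.8)^2 + (0 - 0.2)^2) = 0.04
print(squared_error_cost([1.0, 0.0], [0.8, 0.2]))
```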