
I've recently completed Professor Ng's Machine Learning course on Coursera, but I have some trouble understanding the backpropagation algorithm, so I tried to read Bishop's code for backpropagation using a sigmoid function. I searched and found clean code that tries to explain what backpropagation does, but I still have trouble understanding it.

Can anyone explain to me what backpropagation really does, and also walk me through the code?

Here is the code that I found on GitHub, which I mentioned above:

mkafiyan

1 Answer


You have an error at the network's output. The first step of backpropagation is to compute each neuron's portion of the "guilt" for that error. The goal is to express the error as a function of the weights (the parameters you can change), so the core backprop quantity is the partial derivative of the error with respect to the weights.

First step: error signal = (desired result − output of output neuron) × activation′(x), where x is the net input to the output neuron and activation′ is the derivative of the activation function. That is the portion of guilt for the output neuron.
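A minimal sketch of this output-layer step, assuming a sigmoid activation and a single scalar net input (the values and names here are my own, for illustration):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(x):
    # derivative of the sigmoid expressed via the sigmoid itself
    s = sigmoid(x)
    return s * (1.0 - s)

x = 0.3            # net input to the output neuron (hypothetical)
desired = 1.0      # target value
output = sigmoid(x)

# error signal = (desired - output) * activation'(x)
error_signal = (desired - output) * sigmoid_deriv(x)
```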

The next step is to compute the portion of guilt for the hidden units. The first part of this step is a summation over the error signals of the next layer, each multiplied by the weight connecting the hidden unit to that next-layer unit. The rest is the partial derivative of the activation function: error signal = sum(next_layer_error × weight) × activation′(x).
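The hidden-unit step can be sketched the same way, again assuming a sigmoid activation; the error values and weights below are made up for illustration:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(x):
    s = sigmoid(x)
    return s * (1.0 - s)

# error signals already computed for the next layer (hypothetical values)
next_layer_errors = np.array([0.1, -0.05])
# weights connecting this hidden unit to each next-layer unit
weights_to_next = np.array([0.4, 0.7])
x_hidden = 0.2  # net input to the hidden unit

# error signal = sum(next_layer_error * weight) * activation'(x)
hidden_error = np.dot(next_layer_errors, weights_to_next) * sigmoid_deriv(x_hidden)
```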

The final step is the adaptation of the weights:

Δw_ij = error_signal_i × learning_rate × output_of_neuron_j
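Putting the three steps together, here is a minimal sketch of one full backprop training loop on a tiny 2-2-1 network with sigmoid units. The network shape, learning rate, and training pair are my own assumptions, not taken from the linked implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 2))   # input -> hidden weights (hypothetical shape)
W2 = rng.normal(size=(1, 2))   # hidden -> output weights
lr = 0.5                       # learning rate

x = np.array([0.5, -0.2])      # one training input
t = np.array([1.0])            # desired output

for step in range(1000):
    # forward pass
    z1 = W1 @ x
    h = sigmoid(z1)
    z2 = W2 @ h
    y = sigmoid(z2)

    # step 1: error signal (guilt) of the output neuron
    delta2 = (t - y) * y * (1 - y)
    # step 2: error signals of the hidden units
    delta1 = (W2.T @ delta2) * h * (1 - h)

    # step 3: weight adaptation, delta_w = error_signal * lr * output_of_source_neuron
    W2 += lr * np.outer(delta2, h)
    W1 += lr * np.outer(delta1, x)
```

After enough iterations the output y moves toward the target t, which is the whole point of distributing the "guilt" backwards and nudging each weight in proportion to it.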

My implementation of BP in a Matlab NN

viceriel