
I think I've understood each step of the backpropagation algorithm except the most important one: how do the weights actually get updated? Like at the end of this tutorial? http://home.agh.edu.pl/~vlsi/AI/backp_t_en/backprop.html

– Snate

1 Answer


The weight updates are done via the equations written in the last part of the page you linked (the backpropagation section). Let me elaborate a little bit:

New weights = Old weights − learning rate × ∂(loss)/∂(weights)

For a given weight, calculate the partial derivative of the loss with respect to that weight, ∂(loss)/∂(weight) (which can be done easily by backpropagating the error). The gradient is nothing but the direction of steepest ascent of the loss function, so you subtract a scaled version of it, the scale factor being the step size, i.e. how large a step you want to take in that direction. And here is a little clarification which I felt you might need, given the way you asked the question:
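To illustrate just the update rule, here is a minimal Python sketch with a single weight and a hand-written derivative. The toy loss, `grad_loss`, and the learning rate are made up for this example; in a real network, backpropagation is what supplies the derivative:

```python
# Minimal sketch of gradient descent on a single weight, assuming a
# toy loss L(w) = (w - 3)^2 whose derivative we can write by hand.

def grad_loss(w):
    # dL/dw for L(w) = (w - 3)^2
    return 2.0 * (w - 3.0)

lr = 0.1   # learning rate (step size)
w = 0.0    # old weight

for _ in range(50):
    w = w - lr * grad_loss(w)   # new weight = old weight - lr * dL/dw

print(w)   # approaches 3.0, the minimizer of the toy loss
```

Note that the gradient points uphill, which is why the update subtracts it.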

What exactly is backpropagation?

Backpropagation is just a trick to quickly evaluate the partial derivatives of the loss function w.r.t. all the weights. It has nothing to do with updating the weights; that is part of the gradient descent algorithm.
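To make that separation concrete, here is a minimal sketch in Python/NumPy, assuming a one-layer sigmoid network with squared-error loss. The names `backprop` and `gradient_descent_step`, the shapes, and the sample data are all illustrative choices for this sketch, not anything from the tutorial:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop(W, x, target):
    """Backpropagation: computes dL/dW only; it updates nothing."""
    y = sigmoid(W @ x)                  # forward pass
    delta = (y - target) * y * (1 - y)  # error signal at the output layer
    return np.outer(delta, x)           # dL/dW via the chain rule

def gradient_descent_step(W, grad, lr=0.5):
    """Gradient descent: the part that actually updates the weights."""
    return W - lr * grad

rng = np.random.default_rng(0)
W = rng.normal(size=(2, 3))             # weights of a 3-input, 2-output layer
x = np.array([0.5, -1.0, 2.0])
target = np.array([0.0, 1.0])

for _ in range(200):
    W = gradient_descent_step(W, backprop(W, x, target))

print(sigmoid(W @ x))                   # output moves toward the target
```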

– ayandas
  • why is it New_Weight = Old_Weight MINUS... instead of Old_Weight PLUS...? – Dee Dec 18 '18 at 07:22
  • 1
    @datdinhquoc It depends on how you've calculated the error/loss. For eg. If you did `obtained_value - target values`, you need to add the weights, otherwise subtract them. – SUB0DH Jul 16 '19 at 10:34
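To see that sign convention play out numerically, here is a toy linear model (all values made up for illustration):

```python
# Toy model: obtained = w * x, squared loss L = 0.5 * (obtained - target)^2.
w, x, target, lr = 0.0, 1.0, 5.0, 0.1

for _ in range(100):
    obtained = w * x
    grad = (obtained - target) * x   # dL/dw with the (obtained - target) convention
    w = w - lr * grad                # so the update uses MINUS

print(w)  # converges to 5.0

# Had we instead propagated error = target - obtained (the tutorial's
# convention), grad would flip sign and the update would use PLUS.
```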