
Can you break down 'Backpropagation' into its simplest form? I'm OK with math, but I'm just trying to get a general idea of the term.

I'm reading this article: The mostly complete chart of Neural Networks, explained

In the last paragraph on the Feed Forward network, it mentions 'Backpropagation'. I have no clue what it is. Please help.

Maxim

1 Answer


Backpropagation is an algorithm for training a neural network. During training, the network makes a prediction and incurs some "cost" or "loss" based on how far that prediction is from the right answer. We want the network to adjust based on this loss, so we use backpropagation to update the individual neurons in the network in order to make a better prediction (hopefully) on the next data point.
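To make "predict, measure loss, update" concrete, here's a minimal sketch of one training step for a single-weight "network" in plain Python. The data point, starting weight, and learning rate are all made up for illustration; the gradients are worked out by hand here, which is exactly the bookkeeping backpropagation automates for bigger networks:

```python
w, b = 0.5, 0.0          # hypothetical starting weight and bias
x, y_true = 2.0, 3.0     # hypothetical data point
lr = 0.1                 # hypothetical learning rate

# Forward pass: the network makes a prediction.
y_pred = w * x + b

# Loss: squared error between prediction and the right answer.
loss = (y_pred - y_true) ** 2

# Backward pass: gradients of the loss w.r.t. w and b.
dloss_dpred = 2 * (y_pred - y_true)
dloss_dw = dloss_dpred * x
dloss_db = dloss_dpred * 1.0

# Update: nudge the parameters in the direction that reduces the loss.
w -= lr * dloss_dw
b -= lr * dloss_db
print(w, b, loss)
```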

It's called backpropagation because the algorithm starts at the end of the network, with the single loss value based on the output, and updates neurons in reverse order, so the neurons at the start of the network are updated last. The algorithm makes heavy use of the chain rule, so you can think of the gradient as "propagating" backward through the network.
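Here's a tiny worked example of that backward flow, in the spirit of the CS231n notes linked below (the values are hypothetical). The forward pass computes f = (x + y) * z; the backward pass then applies the chain rule starting from the output and working back to the inputs, in reverse order of the forward computation:

```python
x, y, z = -2.0, 5.0, -4.0

# Forward pass, saving the intermediate value.
q = x + y        # q = 3.0
f = q * z        # f = -12.0

# Backward pass: start at the output and apply the chain rule.
df_df = 1.0            # gradient of f with respect to itself
df_dq = z * df_df      # f = q * z  ->  df/dq = z = -4.0
df_dz = q * df_df      # f = q * z  ->  df/dz = q = 3.0
df_dx = 1.0 * df_dq    # q = x + y  ->  dq/dx = 1, chained with df/dq
df_dy = 1.0 * df_dq    # q = x + y  ->  dq/dy = 1, chained with df/dq

print(df_dx, df_dy, df_dz)   # -4.0 -4.0 3.0
```

Notice that df/dx reuses df/dq rather than being derived from scratch; that reuse of already-computed gradients is what makes the backward ordering efficient.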

http://karpathy.github.io/neuralnets/ and http://cs231n.github.io/optimization-2/ are great starting points for a (much) better explanation.

Robbie Jones