
I have searched this topic everywhere, but I couldn't find the exact solution I was looking for, so I am still quite confused about the two terms back-propagation and recurrent neural networks. I have read that back-propagation is used after the feed-forward step of a neural network in order to update the weights. Then where and how are recurrent neural networks used?

Vivek

1 Answer


Back-propagation is an algorithm used to compute parameter gradients in a neural network. For the purposes of the algorithm, we view the computation as a pass through a directed acyclic graph in which the leaves are parameters and inputs and the inner nodes are operations. Applying the chain rule for derivatives then corresponds to a backward pass through this graph. The algorithm is general and can be applied to any neural network (feed-forward, recurrent, or convolutional).
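A minimal sketch of this idea (the graph, network, and names here are made up for illustration): a forward pass through a tiny graph `loss = (sigmoid(w*x + b) - y)**2` that stores intermediate values, followed by a backward pass that applies the chain rule from the output node down to the parameter leaves.

```python
import math

# Tiny computation graph: loss = (sigmoid(w * x + b) - y)**2
# Forward pass stores intermediate node values; backward pass
# applies the chain rule from the output back to the leaves.
def forward_backward(w, b, x, y):
    # forward pass through the graph
    z = w * x + b                 # linear node
    a = 1 / (1 + math.exp(-z))    # sigmoid node
    loss = (a - y) ** 2           # squared-error node

    # backward pass (chain rule, output -> leaves)
    dloss_da = 2 * (a - y)
    da_dz = a * (1 - a)           # derivative of sigmoid
    dloss_dz = dloss_da * da_dz
    dloss_dw = dloss_dz * x       # gradient for parameter w
    dloss_db = dloss_dz * 1.0     # gradient for parameter b
    return loss, dloss_dw, dloss_db
```

The returned gradients can be checked against finite differences, which is a common sanity test for a hand-written backward pass.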

Recurrent neural networks are a type of neural network that repeatedly applies the same function to its hidden state and the next input in a sequence. The computation of an RNN can also be described as a computation graph (by unrolling it over the time steps), so the back-propagation algorithm can be applied to compute the gradients of its parameters too.
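To make the connection concrete, here is a toy scalar RNN (the recurrence, weight names, and loss are invented for this sketch, not taken from any particular library): the same function `tanh(w_h*h + w_x*x)` is applied at every step, and back-propagation through the unrolled graph accumulates gradients for the shared weights.

```python
import math

# Scalar RNN: the same function f(h, x) = tanh(w_h*h + w_x*x)
# is applied at every time step, so w_h and w_x are shared.
def rnn_loss_and_grads(w_h, w_x, xs, y):
    # forward pass: unroll the recurrence, keeping all states
    hs = [0.0]                              # initial state h_0
    for x in xs:
        hs.append(math.tanh(w_h * hs[-1] + w_x * x))
    loss = (hs[-1] - y) ** 2                # loss on the final state

    # backward pass through time: the chain rule walks the
    # unrolled graph in reverse; gradients for the shared
    # weights accumulate across the time steps.
    dh = 2 * (hs[-1] - y)
    dw_h = dw_x = 0.0
    for t in range(len(xs), 0, -1):
        dz = dh * (1 - hs[t] ** 2)          # tanh'(z) = 1 - tanh(z)^2
        dw_h += dz * hs[t - 1]
        dw_x += dz * xs[t - 1]
        dh = dz * w_h                       # pass gradient to h_{t-1}
    return loss, dw_h, dw_x
```

This reverse walk over the unrolled steps is exactly the "backpropagation through time" view: plain back-propagation applied to the RNN's computation graph.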

Jindřich