
I am having a hard time finding resources online about how to perform backpropagation with the bias in a convolutional neural network. By bias I mean the number added to every number resulting from a convolution.

Here is a picture further explaining

I know how to calculate the gradient for the filter's weights, but I am not sure what to do about the biases. Right now I am just adjusting each bias by the average error for that layer. Is this correct?
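To make the setup concrete, here is a minimal NumPy sketch of the forward pass I mean (all names and shapes are illustrative assumptions, not from the question): one scalar bias per filter, added to every element of that filter's convolution output.

```python
import numpy as np

# Illustrative sketch: one scalar bias per filter, added to every
# element of that filter's convolution output.
conv_out = np.random.randn(8, 8)   # hypothetical 8x8 output of one filter
b = 0.5                            # the single bias for this filter
out = conv_out + b                 # b is broadcast-added to every position
```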


1 Answer


It is similar to the bias gradient in standard neural networks, but here we sum the gradients w.r.t. the convolution output over all positions:

$$\frac{\partial L}{\partial b} = \sum_{w}\sum_{h}\frac{\partial L}{\partial x_{w,h}}$$

where $L$ is the loss function, $w$ and $h$ index the width and height of the conv output $x$, and $\frac{\partial L}{\partial x_{w,h}}$ is the gradient of the loss w.r.t. the conv output at position $(w, h)$.

Thus, the gradient of $b$ is computed by summing the gradients of the loss $L$ w.r.t. the conv output over all positions $(w, h)$.
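As a concrete sketch of this formula (not from the original answer; `dL_dout`, its shape, and the one-bias-per-channel layout are assumptions), the bias gradient is just a sum of the upstream gradient over the spatial positions, and over the batch axis as well when you process a batch:

```python
import numpy as np

# Sketch: dL_dout is the upstream gradient of the loss w.r.t. the conv
# output, assumed shape (batch, height, width, channels), with one
# scalar bias per output channel.
def bias_gradient(dL_dout):
    # Sum over the batch and every spatial position (w, h),
    # leaving one gradient value per channel/bias.
    return dL_dout.sum(axis=(0, 1, 2))

# Example usage with a random upstream gradient.
dL_dout = np.random.randn(4, 8, 8, 3)   # batch=4, 8x8 output, 3 channels
db = bias_gradient(dL_dout)             # shape (3,), one entry per bias
print(db.shape)
```

Per the formula above, it is a sum of the output gradients over positions, not an average.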

Hope this helps.