
I'm learning about neural networks. I've watched many videos and read PDFs explaining how the weights are adjusted during backpropagation, but nowhere have I seen anyone explain how adjusting the biases works.

Can anyone here give me a brief explanation on how the biases are adjusted?

Ruthvik
  • exactly the same way. Bias is just another weight. You can think of it as a weight multiplied by a constant extra "1" input/feature. – lejlot May 30 '22 at 21:58
  • ^^^ yup, think of biases as another input only the input is always 1. – Bhupen May 31 '22 at 03:49
  • @lejlot, But weights are multiplied while biases are added, so how can they be adjusted the same way without interfering with each other? – Ruthvik May 31 '22 at 15:58
  • That's not true. They look different, but are the same. Look at a linear layer: `f(x) = <w, x> + b = SUM_{i=1} w_i x_i + b = SUM_{i=0} w_i x_i`, where `x_0 := 1, w_0 := b` – lejlot May 31 '22 at 19:08
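
The identity in lejlot's comment can be checked numerically. Below is a minimal NumPy sketch (the data, learning rate, and variable names are made up for illustration) that trains one linear neuron two ways: once with an explicit bias updated by its own gradient, and once with the bias folded in as an extra weight on a constant "1" input. Both versions produce identical parameters, showing the bias really is adjusted exactly like any other weight.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))                 # 8 samples, 3 features (made-up data)
y = X @ np.array([1.5, -2.0, 0.5]) + 4.0    # true weights and a true bias of 4.0

lr = 0.1  # illustrative learning rate

# Version A: explicit weights and bias, each updated by its own gradient.
w = np.zeros(3)
b = 0.0
for _ in range(500):
    err = (X @ w + b) - y                   # residual for L = 0.5 * mean(err^2)
    grad_w = X.T @ err / len(y)             # chain rule through the multiply
    grad_b = err.mean()                     # chain rule through the add: d(pred)/db = 1
    w -= lr * grad_w
    b -= lr * grad_b                        # same update rule as the weights

# Version B: bias as weight w1[0] on a constant-1 column (x_0 := 1, w_0 := b).
X1 = np.hstack([np.ones((len(X), 1)), X])
w1 = np.zeros(4)
for _ in range(500):
    err = X1 @ w1 - y
    w1 -= lr * (X1.T @ err / len(y))        # one update covers bias and weights

print(np.allclose(b, w1[0]))                # the two biases match
print(np.allclose(w, w1[1:]))               # so do the weights
```

Note that `grad_b = err.mean()` in Version A is exactly the gradient Version B computes for the constant-1 column (`X1[:, 0].T @ err / n`), which is why the two runs stay in lockstep at every step.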
