
Is the above statement true? The bias and \beta_0 are both weights that are independent of the input, so they allow the model to add some constant value to the output.

Quastiat
  • Consider a case where the network trains equally well either with or without an output bias neuron. In such a case, the answer is no even if the bias neuron output is not zero. – James Phillips Aug 02 '19 at 02:08

1 Answer


Firstly, linear regression tries to estimate a function, while a single neuron divides the input space into two sub-spaces, so they perform essentially different tasks.
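
To illustrate that first point, here is a minimal sketch (NumPy, with made-up weights and a hard-threshold activation as assumptions): the neuron's output depends only on which side of the hyperplane w·x + b = 0 the input falls, and the bias b is what shifts that hyperplane away from the origin.

```python
import numpy as np

# A single neuron with a hard-threshold activation acts as a binary
# classifier: it splits the input space along the hyperplane w.x + b = 0.
def neuron(x, w, b):
    return 1 if np.dot(w, x) + b > 0 else 0

w = np.array([1.0, 1.0])
x = np.array([0.5, 0.5])

# Without a bias, the boundary x1 + x2 = 0 must pass through the origin.
print(neuron(x, w, b=0.0))   # 1: the point lies on the positive side
# A nonzero bias shifts the boundary away from the origin, the same way an
# intercept shifts a regression line up or down.
print(neuron(x, w, b=-2.0))  # 0: the shifted boundary puts it on the other side
```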

Having said that, the \beta_0 in a neuron and the y-intercept in linear regression are both biases: constants added to the output regardless of the input (just for clarification: they depend on the input only in the sense that the model learns them from the input data).
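
To make the analogy concrete, here is a minimal sketch on made-up 1-D data (NumPy only; the toy data, learning rate, and iteration count are assumptions): ordinary least squares recovers \beta_0 as the intercept, while a linear "neuron" trained by gradient descent recovers the same constant as its bias b. Both are learned from the data but are added independently of any particular input.

```python
import numpy as np

# Hypothetical 1-D data generated from y = 2x + 3 plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 2 * x + 3 + 0.1 * rng.normal(size=100)

# Linear regression: fit beta_1 (slope) and beta_0 (intercept) by least squares.
X = np.column_stack([x, np.ones_like(x)])
beta_1, beta_0 = np.linalg.lstsq(X, y, rcond=None)[0]

# A "neuron" with identity activation trained by gradient descent on squared
# error learns the same two numbers: a weight w and a bias b.
w, b = 0.0, 0.0
for _ in range(2000):
    err = (w * x + b) - y
    w -= 0.1 * np.mean(err * x)   # gradient of the loss w.r.t. the weight
    b -= 0.1 * np.mean(err)       # gradient of the loss w.r.t. the bias

print(beta_0, b)  # both close to 3: the learned, input-independent offset
print(beta_1, w)  # both close to 2: the learned slope / weight
```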

Bahman Rouhani