
I have just been reading Bishop's book Pattern Recognition and Machine Learning, specifically Section 5.3 on backpropagation. It says that in a general feed-forward network, each unit computes a weighted sum of its inputs of the form $$a_j=\sum\limits_{i}w_{ji}z_i$$

The book then says that the sum in the above equation is transformed by a nonlinear activation function $h(\cdot)$ to give the activation $z_j$ of unit $j$ in the form $$z_j=h(a_j)$$

I find the notation somewhat awkward. Suppose I want to compute $a_2$; then $$a_2=w_{21}z_1+w_{22}z_2+\dots$$

Since $z_2=h(a_2)$, this would mean that $$a_2=w_{21}z_1+w_{22}h(a_2)+\dots$$ Does that mean neuron 2 is connected to itself?

Please correct me if I am wrong.

Brian Tompsett - 汤莱恩
kkr4k

2 Answers


You are wrong, but I agree that the notation can easily cause this kind of confusion. The tricky bit is that the sum is over $i$, yet Bishop never claims it runs over all natural numbers. In each equation he omits the index set that says which neurons are connected to which one; written out fully it would be something like $\sum_{i \,\in\, \text{set\_of\_neurons\_connected\_to}(j)}$, and obviously this does not imply any self-loops unless you specify such a graph. This notation is quite common in ML, not only in Bishop's book.
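
As a rough illustration (my own sketch, not code from the book, with made-up names like `inputs_to`), here is a tiny forward pass in Python where the connectivity is stored explicitly, so the sum for $a_j$ only runs over the units that actually feed into unit $j$ and no self-loop can appear unless you put one into the graph:

```python
import numpy as np

def forward_unit(j, inputs_to, w, z, h=np.tanh):
    """Compute z_j = h(a_j), where a_j = sum of w[j, i] * z[i] over i in inputs_to[j]."""
    a_j = sum(w[(j, i)] * z[i] for i in inputs_to[j])
    return h(a_j)

# Hypothetical tiny network: units 0 and 1 are inputs, unit 2 is a hidden unit.
inputs_to = {2: [0, 1]}          # unit 2 receives only from units 0 and 1
w = {(2, 0): 0.5, (2, 1): -1.2}  # weights w_{ji}
z = {0: 1.0, 1: 2.0}             # activations of the input units

z[2] = forward_unit(2, inputs_to, w, z)
print(z[2])  # depends only on z[0] and z[1], never on z[2] itself
```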

lejlot

No, the neuron does not connect to itself. If you are referring to the example in Section 5.3.2, in particular formulas (5.62)-(5.64), the confusion comes from the fact that you are reading about a shallow network. If you add a layer and compute $z_{j+1}$, you will see that this time the $x_i$ is indeed $z_j$.
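
To make that concrete, here is a small sketch (my own, with made-up shapes, not the book's code) of a two-layer forward pass: the hidden activations $z$ computed in the first layer become the inputs to the second layer's weighted sums, which is where the $z_i$ on the right-hand side comes from.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)        # input vector
W1 = rng.normal(size=(4, 3))  # hidden-layer weights w_{ji}
W2 = rng.normal(size=(2, 4))  # output-layer weights

a1 = W1 @ x        # a_j = sum_i w_{ji} x_i
z1 = np.tanh(a1)   # z_j = h(a_j)
a2 = W2 @ z1       # next layer: the previous z's are now the inputs
y = a2             # e.g. linear output units

print(y.shape)  # (2,)
```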