I have searched the internet a lot but could not figure out why we use weights in each layer of the backpropagation algorithm. I know that the weights are multiplied by the output of the previous layer to get the input of the next layer, but I do not understand why we need these weights. Please help. Thanks, Ark
1 Answer
Without the weights there could be no learning. The weights are the values that are adjusted during the backpropagation learning process. A neural network is nothing more than a function, and the weights parametrize the behavior of that function.
To get a better feel for this, first look at a single-layer perceptron such as the ADALINE; a small sketch follows below.
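As a minimal sketch (assuming the classic delta-rule formulation of ADALINE and a toy bipolar AND dataset; names such as `lr` and `epochs` are just illustrative choices), the code below shows both roles of the weights: they scale the inputs in the forward pass, and they are the quantities the learning rule adjusts to reduce the error.

```python
# Minimal ADALINE sketch: a single linear unit trained with the delta (LMS) rule.
import numpy as np

# Toy data: bipolar inputs and targets for the AND problem.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
t = np.array([-1, -1, -1, 1], dtype=float)

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=2)  # the weights: the only values learning can change
b = 0.0
lr = 0.1

for epoch in range(50):
    for x_i, t_i in zip(X, t):
        y = w @ x_i + b        # forward pass: weights scale the unit's inputs
        error = t_i - y        # how far the current weights are from the target
        w += lr * error * x_i  # delta rule: adjust weights to reduce the error
        b += lr * error

print("learned weights:", w, "bias:", b)
print("predictions:", np.sign(X @ w + b))  # should recover the targets [-1 -1 -1 1]
```

With fixed weights the output of the unit would never change, no matter how many examples it sees; it is only by nudging `w` and `b` after each example that the function the network computes gets closer to the target mapping. Backpropagation does the same thing for multi-layer networks, layer by layer.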

Louis Hugues