I am confused about the dying ReLU problem. Does ReLU kill the neuron only during the forward pass, or also during the backward pass?
1 Answer
A combination of unlucky random initialization (at the very beginning of training) and the gradient updates of earlier backward passes can push a unit into a state where, during the forward pass, its pre-activation is negative for every input, so the unit is never activated (the neuron never fires). Because ReLU's gradient is zero for negative pre-activations, the subsequent backward passes then deliver no gradient to that unit's weights, so they are unlikely ever to change and the unit stays dead. In that sense the death shows up in both passes: the forward pass produces a constant zero output, and the backward pass produces a zero gradient that keeps it that way.

FluidCode