Suppose I have a deep neural network for classification with two hidden layers of 50 neurons each and ReLU as the activation function. In my model the activation outputs happen to range from 0 to +1. Is there any way to say that a neuron with a higher activation value (say, close to 1) is more important than a neuron with a lower activation value (close to 0)?
- Please clarify what is meant by “important.” – John Wu Mar 02 '20 at 19:37
- @JohnWu I want to find out whether there are neurons that activate for a particular class, and I was thinking of using the activation values as a measure of that. So does a higher activation value mean something? – Sakib Mostafa Mar 02 '20 at 19:52
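One way to probe the idea raised in the comments, namely whether some neurons fire preferentially for a particular class, is to compare each neuron's *mean* activation per class rather than reading individual activation values in isolation. Below is a minimal NumPy sketch; the weights are random and the data is a toy batch, purely for illustration (in practice the weights would come from your trained 50-50 ReLU network and the inputs from your dataset):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-hidden-layer ReLU network. The random weights below
# stand in for a trained model; shapes match the 50-neuron layers
# described in the question (input dimension 10 is an assumption).
W1 = rng.normal(size=(10, 50))   # input -> first hidden layer
W2 = rng.normal(size=(50, 50))   # first -> second hidden layer

def relu(x):
    return np.maximum(0.0, x)

def hidden_activations(X):
    """Return the second hidden layer's ReLU activations for a batch X."""
    h1 = relu(X @ W1)
    return relu(h1 @ W2)

# Toy batch with binary class labels (assumed data, for illustration).
X = rng.normal(size=(200, 10))
y = rng.integers(0, 2, size=200)

acts = hidden_activations(X)          # shape (200, 50)

# Mean activation of each neuron, computed separately per class.
per_class_mean = np.stack([acts[y == c].mean(axis=0) for c in (0, 1)])

# Neurons whose mean activation differs most between the classes are
# candidates for being "class selective"; a large raw activation alone
# is not evidence of importance if it is large for every class.
selectivity = np.abs(per_class_mean[0] - per_class_mean[1])
top_neurons = np.argsort(selectivity)[::-1][:5]
print(top_neurons)
```

The design point: comparing per-class means controls for neurons that simply have a large activation scale for all inputs, which a single activation value close to 1 cannot distinguish from genuine class selectivity.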