Usually, an activation function is applied to all neurons of a given layer, as in
layer = tf.nn.relu(layer)
How can I apply an activation function to say the second neuron only?
How can I apply a specific transformation (say tf.exp()) to a specific neuron only?
Slicing a column does not seem to apply here, since slicing a column requires knowing the number of rows, which is unknown at construction time.
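For concreteness, here is the behaviour I am after, sketched in NumPy rather than TensorFlow (the array values are just an example): the output should equal the input everywhere except the second neuron's column, which passes through the transformation.

```python
import numpy as np

# Toy "layer" activations: 2 examples (rows), 3 neurons (columns).
layer = np.array([[1.0, -2.0, 3.0],
                  [0.5,  4.0, -1.0]])

out = layer.copy()
out[:, 1] = np.exp(out[:, 1])  # transform only the second neuron (column index 1)
```

I want the equivalent of this on a TensorFlow tensor whose first (batch) dimension is unknown at graph-construction time.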