I have a custom Keras layer and I need to create a custom activation function. Is it possible to assign fixed activations to different neurons in the same layer? For example, say I have something like a Dense layer with 3 units, and I want the activation of the first unit to be a relu, the second a tanh, and the third a sigmoid, independently of the value of x. So this is not what I mean:
def myactivation(x):
    if x something:  # i.e. a condition on the value of x
        return relu(x)
    elif something_else:
        return another_activation(x)
What I want instead is to apply an activation to a specific neuron, as in:
def myactivation(x):
    if x == neuron0:
        return relu(x)
    elif x == neuron1:
        return tanh(x)
    else:
        return sigmoid(x)
Is this possible? Or is there another way to implement something like this?
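The closest idea I have is to slice the layer's output tensor and apply a different activation to each slice, roughly like this (untested sketch, assuming tf.keras; the input shape and the name myactivation are just placeholders):

import tensorflow as tf
from tensorflow import keras

def myactivation(x):
    # x has shape (batch, 3): one column per unit
    unit0 = tf.nn.relu(x[:, 0:1])     # first unit: relu
    unit1 = tf.nn.tanh(x[:, 1:2])     # second unit: tanh
    unit2 = tf.nn.sigmoid(x[:, 2:3])  # third unit: sigmoid
    # stitch the per-unit activations back together
    return tf.concat([unit0, unit1, unit2], axis=-1)

model = keras.Sequential([
    keras.layers.Dense(3, activation=myactivation, input_shape=(4,)),
])

But I'm not sure whether slicing like this is correct or idiomatic, or whether Keras supports per-unit activations more directly.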