Is it possible to create a custom activation function of the form:
    def newactivation(x):
        if x <= -1:
            return -1
        elif -1 < x <= 1:
            return x
        else:
            return 1
So basically it would be a piecewise-linear version of tanh(x).
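For reference, the piecewise rule is the same as clipping x to the interval [-1, 1], so a vectorized version could be as simple as this (a NumPy sketch; `newactivation_np` is just a name I made up):

    import numpy as np

    def newactivation_np(x):
        # Element-wise clip to [-1, 1]; identical to the piecewise
        # definition above, but works on whole arrays.
        return np.clip(x, -1.0, 1.0)

    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(newactivation_np(x))  # -> [-1.  -0.5  0.   0.5  1. ]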
Is it a problem during optimization that the function is non-differentiable at the two points x = -1 and x = 1?
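My understanding (an assumption worth checking) is that autodiff frameworks treat these kinks the same way they treat ReLU at 0: they just return one of the one-sided derivatives there. For example, inspecting the gradient with TensorFlow:

    import tensorflow as tf

    x = tf.Variable([-2.0, -1.0, 0.0, 1.0, 2.0])
    with tf.GradientTape() as tape:
        y = tf.clip_by_value(x, -1.0, 1.0)

    # Gradient is 0 in the saturated regions and 1 on the linear part;
    # the value reported exactly at the kinks is an implementation detail.
    print(tape.gradient(y, x).numpy())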
How could I implement this?
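Here is a minimal sketch of what I have in mind, assuming Keras with the TensorFlow backend (the layer sizes and input shape are just placeholders):

    import tensorflow as tf
    from tensorflow import keras

    def newactivation(x):
        # Vectorized, tensor-friendly version of the piecewise function.
        return tf.clip_by_value(x, -1.0, 1.0)

    # A custom activation can be passed anywhere Keras accepts a callable.
    model = keras.Sequential([
        keras.layers.Dense(64, activation=newactivation, input_shape=(10,)),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")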