I would like to implement the following activation function in pytorch:
x = T if abs(x) > T else x
I could get close with torch.clamp(x, min=-T, max=T), but it's not exactly the behavior I want: clamp matches the function above for x > -T, but returns -T for x < -T, whereas I want T there. Is there any torch function that could help me achieve this?
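To make the target behavior concrete, here is a sketch of what I'm after using torch.where (I'm not sure this is the most idiomatic way; the function name and example threshold are just for illustration):

```python
import torch

def thresh_act(x: torch.Tensor, T: float) -> torch.Tensor:
    # Return T wherever |x| > T, otherwise pass x through unchanged.
    # full_like keeps the fill value on the same dtype/device as x.
    return torch.where(x.abs() > T, torch.full_like(x, T), x)

x = torch.tensor([-2.0, -0.5, 0.5, 2.0])
y = thresh_act(x, T=1.0)
# clamp would give [-1.0, -0.5, 0.5, 1.0]; this gives [1.0, -0.5, 0.5, 1.0]
```

Note that unlike clamp, this mapping is discontinuous at x = -T (it jumps from -T to T), which may matter for training.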