I need to create a tent activation function, as shown in the following image. I was wondering whether there is a built-in tent activation function in torch? If not, is there any way to create this activation function?
Thanks
I think this can be used; the comparisons mask each linear piece of the tent (x >= 0 keeps the peak value 1 at x = 0):
y = (x >= 0) * ((1 - x) > 0) * (1 - x) + (x < 0) * ((1 + x) > 0) * (1 + x)
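A minimal sketch of that idea as a standalone function, assuming x is a torch tensor (the name tent and the exact masking are my choice):

import torch

def tent(x: torch.Tensor) -> torch.Tensor:
    # Mask each linear piece: 1 - x on [0, 1), 1 + x on (-1, 0), 0 elsewhere.
    right = (x >= 0) & (x < 1)
    left = (x < 0) & (x > -1)
    return right * (1 - x) + left * (1 + x)

x = torch.linspace(-2, 2, 9)
print(tent(x))  # peak of 1 at x = 0, zero outside (-1, 1)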
I found a simple solution: implement the tent activation inside the forward function.
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

    def forward(self, input):
        # relu(x + 1) - 2 * relu(x) + relu(x - 1) equals 1 + x on [-1, 0],
        # 1 - x on [0, 1], and 0 elsewhere, which is the tent shape.
        out = nn.functional.relu(input + 1) - 2 * nn.functional.relu(input) + nn.functional.relu(input - 1)
        return out
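A quick check of the module above on a few sample points:

net = Net()
x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])
print(net(x))  # tensor([0.0000, 0.5000, 1.0000, 0.5000, 0.0000])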
The same solution could be implemented using nn.ReLU() modules as layers of the network; a sketch follows below.
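A sketch of that variant, assuming a single reusable nn.ReLU() module (the class name TentNet and the attribute self.relu are my choices):

import torch
import torch.nn as nn

class TentNet(nn.Module):
    def __init__(self):
        super(TentNet, self).__init__()
        self.relu = nn.ReLU()  # one ReLU module reused for every shifted term

    def forward(self, input):
        # Same combination of shifted ReLUs as in the forward-function version.
        return self.relu(input + 1) - 2 * self.relu(input) + self.relu(input - 1)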