
I need to create a tent activation function, as shown in the image below. Is there a built-in tent activation function in PyTorch? If not, is there any way to create this activation function?

[image: plot of the tent activation function]

Thanks

user1538653
3 Answers


I think this can be used.

y = (x >= 0) * ((1 - x) > 0) * (1 - x) + (x < 0) * ((1 + x) > 0) * (1 + x)
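
A minimal check of this expression (the sample tensor is my own, for illustration; the boolean masks are promoted to floats when multiplied):

import torch

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])
# (1 - x) on [0, 1), (1 + x) on (-1, 0), 0 elsewhere
y = (x >= 0) * ((1 - x) > 0) * (1 - x) + (x < 0) * ((1 + x) > 0) * (1 + x)
print(y)  # expected: 0, 0.5, 1, 0.5, 0 -- the tent shape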

I found a simple solution: implement the tent activation function directly in the forward function.

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

    def forward(self, input):
        # relu(x + 1) - 2 * relu(x) + relu(x - 1) equals the tent function max(0, 1 - |x|)
        out = nn.functional.relu(input + 1) - 2 * nn.functional.relu(input) + nn.functional.relu(input - 1)
        return out

The same function could be implemented with nn.ReLU() modules as layers of the network, as sketched below.
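
A minimal sketch of that module-based variant (the class name Tent and the test values are my own, for illustration):

import torch
import torch.nn as nn

class Tent(nn.Module):
    def __init__(self):
        super().__init__()
        self.relu = nn.ReLU()

    def forward(self, x):
        # relu(x + 1) - 2 * relu(x) + relu(x - 1) equals max(0, 1 - |x|)
        return self.relu(x + 1) - 2 * self.relu(x) + self.relu(x - 1)

tent = Tent()
print(tent(torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])))  # expected: 0, 0.5, 1, 0.5, 0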

user1538653

This should work:

torch.maximum(1 - x.abs(), torch.zeros_like(x))  # torch.maximum expects a tensor, not the Python scalar 0
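
A quick check of this one-liner (the sample tensor is my own; torch.clamp gives an equivalent formulation):

import torch

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])
print(torch.maximum(1 - x.abs(), torch.zeros_like(x)))  # expected: 0, 0.5, 1, 0.5, 0
print(torch.clamp(1 - x.abs(), min=0))                  # same values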
Sia Rezaei