
I would like to implement a custom activation function in TensorFlow. The idea is that the activation should learn how linear it is, using the following function:

tanh(x*w)/w   for w != 0
x             for w == 0

The parameter w should be learned. However, I do not know how to implement this in TensorFlow.

codeprof

1 Answer


An activation function is just part of the model, so here is code for the function you described. One detail to get right: the branch condition is on w, not on the input x, and since tanh(w*x)/w approaches x as w goes to 0, the two branches join continuously.

import tensorflow as tf
from tensorflow.keras import Model

class MyModel(Model):
    def __init__(self):
        super().__init__()
        # Some layers
        # Trainable parameter w, one value per activation here
        self.W = tf.Variable(tf.constant([[0.1, 0.1], [0.1, 0.1]]))

    def call(self, x):
        # Some transformations with your layers
        # tanh(w*x)/w where w != 0, plain x where w == 0.
        # Zeros in W are swapped for ones before dividing, so no NaN/Inf
        # ever enters the computation (tf.where alone would still let
        # NaNs from the unselected branch leak into the gradient).
        safe_w = tf.where(self.W == 0, tf.ones_like(self.W), self.W)
        x = tf.where(self.W == 0, x, tf.tanh(safe_w * x) / safe_w)
        return x

So, with the non-zero weights above, MyModel()(tf.constant([[1.0, 2.0], [3.0, 4.0]])) returns

<tf.Tensor: shape=(2, 2), dtype=float32, numpy=
array([[0.9966799, 1.9737529],
       [2.913126 , 3.79949  ]], dtype=float32)>

For a zero input matrix, MyModel()(tf.constant([[0.0, 0.0], [0.0, 0.0]])) returns zeros, since tanh(0)/w = 0:

<tf.Tensor: shape=(2, 2), dtype=float32, numpy=
array([[0., 0.],
       [0., 0.]], dtype=float32)>
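If you want the activation to be reusable across models rather than baked into one, the same idea can be wrapped in a custom Keras Layer. This is a minimal sketch, assuming a single scalar w shared by all units (the name LearnableTanh is made up for illustration):

```python
import tensorflow as tf

class LearnableTanh(tf.keras.layers.Layer):
    """Activation tanh(w*x)/w that degenerates to the identity as w -> 0."""

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # Single trainable scalar w, shared by every unit.
        self.w = tf.Variable(0.5, trainable=True, name="w")

    def call(self, x):
        # Guard against division by zero: where w == 0 the function is
        # defined as x, which also matches the limit of tanh(w*x)/w.
        safe_w = tf.where(self.w == 0.0, 1.0, self.w)
        return tf.where(self.w == 0.0, x, tf.tanh(safe_w * x) / safe_w)

# The parameter receives a gradient, so any optimizer can update it:
layer = LearnableTanh()
with tf.GradientTape() as tape:
    y = layer(tf.constant([1.0, 2.0, 3.0]))
    loss = tf.reduce_sum(y)
grad = tape.gradient(loss, layer.w)
```

Because the layer subclasses tf.keras.layers.Layer, the variable shows up in layer.trainable_variables and is trained like any other weight when the layer is placed inside a Keras model.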
PermanentPon