An activation function is just another part of the model's forward pass, so you can write it directly in `call()`. Here is the code for the function you described:
```python
import tensorflow as tf
from tensorflow.keras import Model

class MyModel(Model):
    def __init__(self):
        super().__init__()
        # Some layers; W is the trainable parameter of the activation
        self.W = tf.Variable(tf.constant([[0.1, 0.1], [0.1, 0.1]]))

    def call(self, x):
        # Some transformations with your layers, then the activation
        # tanh(W * x) / W, passing x through unchanged where x == 0
        x = tf.where(x == 0, x, tf.tanh(self.W * x) / self.W)
        return x
```
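Since `W` is a `tf.Variable`, it is trainable. A quick standalone check (a sketch, using a scalar `W` instead of the matrix above) confirms that gradients flow through `tanh(W * x) / W`:

```python
import tensorflow as tf

# Standalone sketch of the same activation with a trainable scalar W,
# to verify that gradients reach W through tanh(W * x) / W
W = tf.Variable(0.1)
x = tf.constant([[1.0, 2.0], [3.0, 4.0]])

with tf.GradientTape() as tape:
    y = tf.where(x == 0, x, tf.tanh(W * x) / W)
    loss = tf.reduce_sum(y)

# A non-None, finite gradient means W will be updated during training
grad = tape.gradient(loss, W)
```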
So, for a non-zero matrix,

```python
MyModel()(tf.constant([[1.0, 2.0], [3.0, 4.0]]))
```

it returns

```
<tf.Tensor: shape=(2, 2), dtype=float32, numpy=
array([[0.9966799, 1.9737529],
       [2.913126 , 3.79949  ]], dtype=float32)>
```
For a zero matrix,

```python
MyModel()(tf.constant([[0.0, 0.0], [0.0, 0.0]]))
```

it returns zeros:

```
<tf.Tensor: shape=(2, 2), dtype=float32, numpy=
array([[0., 0.],
       [0., 0.]], dtype=float32)>
```
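If you want to reuse this activation across several models, the same idea can be packaged as a custom Keras layer. This is only a sketch: the class name `ScaledTanh` and the per-feature weight shape are my own choices, not anything from your code:

```python
import tensorflow as tf

class ScaledTanh(tf.keras.layers.Layer):
    """Applies tanh(W * x) / W elementwise with a trainable W."""

    def build(self, input_shape):
        # One trainable scale per feature, initialized away from zero
        self.W = self.add_weight(
            name="W",
            shape=input_shape[1:],
            initializer=tf.keras.initializers.Constant(0.1),
            trainable=True,
        )

    def call(self, x):
        return tf.where(x == 0, x, tf.tanh(self.W * x) / self.W)

layer = ScaledTanh()
out = layer(tf.constant([[1.0, 2.0], [3.0, 4.0]]))
```

Because `build()` creates `W` lazily from the input shape, the layer can be dropped into any `Sequential` or functional model without hard-coding the dimensions.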