I am trying to implement a simple exponential-decay (gamma-decay) function in TensorFlow as Y = e^(-gamma * X),
where gamma is in the range (0, 1). I am using layer sub-classing, so the layer looks like this:
import tensorflow as tf
from tensorflow import keras

class Test_Layer(keras.layers.Layer):
    def __init__(self):
        super(Test_Layer, self).__init__()
        self.name_ = 'Decay'

    def build(self, input_shape):
        # One raw (unbounded) coefficient per input feature.
        self.gamma_raw = tf.Variable(
            initial_value=tf.random.normal((1, input_shape[-1])),
            trainable=True, name=self.name_ + '_gamma')

    def call(self, inputs):
        return inputs * (1 + 1 / (1 - self.gamma_raw))
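For context, a minimal usage sketch (the input width and the dummy data below are placeholders, not my real setup):

import numpy as np

# Hypothetical usage: wrap the layer in a tiny model and fit it on random data.
inp = keras.Input(shape=(4,))
out = Test_Layer()(inp)
model = keras.Model(inp, out)
model.compile(optimizer='adam', loss='mse')

X = np.random.rand(32, 4).astype('float32')
Y = np.exp(-0.5 * X)  # placeholder targets from Y = e^(-gamma * X) with gamma = 0.5
model.fit(X, Y, epochs=2, verbose=0)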
The problem is that I want to bound the learnt coefficients to the range (0, 1) and, in every epoch, use this bounded Variable to compute y_hat and the loss.
For this, I tried to convert the raw coefficients using the sigmoid function:
self.gamma = tf.nn.sigmoid(self.gamma_raw)
But when I replace gamma_raw with gamma in the call(self, inputs)
definition, the layer ends up learning nothing, because after the sigmoid transform gamma is a Tensor rather than a Variable.
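To make the failure mode concrete, here is roughly the variant that does not learn; in this sketch the sigmoid is applied once in build(), so gamma is an ordinary Tensor by the time call() uses it:

    def build(self, input_shape):
        self.gamma_raw = tf.Variable(
            initial_value=tf.random.normal((1, input_shape[-1])),
            trainable=True, name=self.name_ + '_gamma')
        # Sigmoid of the *initial* value: this is a plain Tensor, not a Variable.
        self.gamma = tf.nn.sigmoid(self.gamma_raw)

    def call(self, inputs):
        # gamma is a constant Tensor here, so no gradient ever reaches gamma_raw.
        return inputs * (1 + 1 / (1 - self.gamma))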
Is there a fix for this problem with the current approach? Or is there a way to solve it using some other implementation?