
I'm trying to create a leaky ReLU that has the same gradient for values > 1 as for values < 0.
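
In other words (restating the code below as a formula), the target activation is piece-wise linear:

    f(x) = 0.1*x          for x < 0
    f(x) = x              for 0 <= x <= 1
    f(x) = 0.9 + 0.1*x    for x > 1

(continuous at x = 1 because 0.9 + 0.1*1 = 1).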

I have an implementation that seems to work, but it's about 50% slower than the standard leaky ReLU, so I suspect there must be a better way.

Here is a minimal example:

##############################################################################

import tensorflow as tf
import tensorflow.keras as ke
import tensorflow.keras.layers as l

##############################################################################

def myRelu(x):
    # Slope 0.1 below 0, identity on [0, 1], slope 0.1 again above 1.
    return tf.where(x < 0, x * 0.1,
                    tf.where(tf.math.logical_and(x >= 0, x <= 1), x, 0.9 + x * 0.1))

##############################################################################
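
For comparison, the same piece-wise function can be written with no branching at all, as 0.1*x plus 0.9 times x clipped to [0, 1]. This is only a sketch (not benchmarked here), but it is algebraically equivalent:

##############################################################################

def myRelu_clip(x):
    # x < 0:        0.1*x + 0.9*0 = 0.1*x
    # 0 <= x <= 1:  0.1*x + 0.9*x = x
    # x > 1:        0.1*x + 0.9*1 = 0.9 + 0.1*x
    return 0.1 * x + 0.9 * tf.clip_by_value(x, 0.0, 1.0)

##############################################################################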

def build_model_1():

    model_input = l.Input(shape=(None, 365, 15, 26, 2))

    x = l.Dense(1, activation='linear')(model_input)
    x = l.Lambda(myRelu)(x)
    # x = l.Activation(myRelu)(x) # or this

    model = ke.Model(inputs=[model_input], outputs=[x])
    model.compile(optimizer='Adam', loss='mean_squared_error')
    
    return model

##############################################################################
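
A quick smoke test of the model above (hypothetical shapes; the leading None is a variable-length dimension, set to 4 here):

##############################################################################

model = build_model_1()
dummy = tf.random.normal((1, 4, 365, 15, 26, 2))
print(model(dummy).shape)  # expected: (1, 4, 365, 15, 26, 1)

##############################################################################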

I've already searched the internet for a couple of hours but haven't found an easy or clear solution yet. I'm aware that the standard tf.keras.layers.ReLU supports a max_value, which I could set to 1, but I want to avoid that because the zero gradient above the cap reintroduces the dying-ReLU problem.
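
To illustrate the difference (a sketch of the built-in layer, not a solution to the question):

##############################################################################

# The built-in layer is leaky below 0 but completely flat above max_value,
# so the gradient for x > 1 is 0 rather than 0.1.
capped = l.ReLU(max_value=1.0, negative_slope=0.1)

##############################################################################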

I hope someone can help me out or point me in the right direction.

Siano
  • Quick update: This works a lot better already:

    ##############################################################################

    import numpy as np
    from tensorflow.keras import backend

    def myRelu_2():
        # One tf.where instead of two: leaky ReLU up to 1, then slope 0.1.
        data = np.arange(-5, 5 + 1, 0.5).astype(np.float64)
        print(data)
        output = tf.where(data <= 1, backend.relu(data, alpha=0.1), 0.9 + data * 0.1)
        print(output)

    ##############################################################################

    Still about 30% slower than the normal leaky ReLU though. – Siano Jun 23 '21 at 18:19
  • Second update: It turns out TensorLayer implements a 'Twice Leaky ReLU6': https://tensorlayer.readthedocs.io/en/latest/modules/activation.html#twice-leaky-relu6 Almost perfect, but it seems you can't change the cutoff from 6 to 1 to get a 'Twice Leaky ReLU1' (see the generalized sketch below). – Siano Jun 24 '21 at 10:47
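
For reference, the min/max formulation TensorLayer uses can be generalized to an arbitrary cap. This is only a sketch; the helper below is hypothetical, not part of TensorLayer's API:

##############################################################################

def twice_leaky_relu(x, alpha=0.1, cap=1.0):
    # max(alpha*x, x) is the ordinary leaky ReLU; taking the minimum with
    # the line cap*(1 - alpha) + alpha*x bends the slope back down to alpha
    # above the cap. Continuous at x = cap since cap*(1 - alpha) + alpha*cap == cap.
    return tf.minimum(tf.maximum(alpha * x, x), cap * (1.0 - alpha) + alpha * x)

##############################################################################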

0 Answers