Using TensorFlow 1.5, I am trying to add a leaky_relu activation to the output of a dense layer, while still being able to change the alpha of leaky_relu (check here). I know I can do it as follows:
output = tf.layers.dense(input, n_units)
output = tf.nn.leaky_relu(output, alpha=0.01)
I was wondering if there is a way to write this in one line, as we can do for relu:
output = tf.layers.dense(input, n_units, activation=tf.nn.relu)
I tried the following, but I get an error:
output = tf.layers.dense(input, n_units, activation=tf.nn.leaky_relu(alpha=0.01))
TypeError: leaky_relu() missing 1 required positional argument: 'features'
Is there a way to do this?
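My guess is that the activation argument expects a callable, and tf.nn.leaky_relu(alpha=0.01) tries to call the function immediately, without its required features argument. A sketch of what I am considering instead, assuming activation accepts any callable of one tensor argument (I have not confirmed this is the idiomatic way), is to bind alpha ahead of time with a lambda or functools.partial:
import functools
import tensorflow as tf
# Wrap the call in a lambda so activation receives a one-argument callable.
output = tf.layers.dense(input, n_units,
                         activation=lambda x: tf.nn.leaky_relu(x, alpha=0.01))
# Equivalently, pre-fill the alpha keyword with functools.partial.
output = tf.layers.dense(input, n_units,
                         activation=functools.partial(tf.nn.leaky_relu, alpha=0.01))
If this works, I would still like to know whether there is a more direct way.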