
In the Keras documentation, the function keras.activations.relu(x, alpha=0.0, max_value=None, threshold=0.0) is defined as:

f(x) = max_value for x >= max_value,
f(x) = x for threshold <= x < max_value,
f(x) = alpha * (x - threshold) otherwise.

I did a small test with alpha=0.01, threshold=5.0 and max_value=100.0 and for x=5.0 the output I get is f(x)=0.0.
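Roughly, the test looks like this (a minimal sketch; shown here with tf.keras.activations.relu, which has the same signature, and with a few extra x values added for illustration):

import tensorflow as tf

x = tf.constant([4.0, 5.0, 6.0, 200.0])
y = tf.keras.activations.relu(x, alpha=0.01, max_value=100.0, threshold=5.0)
print(y.numpy())
# -> [-0.01, 0.0, 6.0, 100.0]; note that f(5.0) is 0.0, not 5.0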

If I am not mistaken, since x == threshold, I should get f(x)=x=5.0.

Can anyone explain please?

Thanks,

Julien

1 Answer


The documentation in the source code is wrong. (Also, you should be moving to tf.keras instead of keras.) It should be:

f(x) = max_value for x >= max_value,
--> f(x) = x for threshold < x < max_value,
f(x) = alpha * (x - threshold) otherwise.

So when x == threshold, it falls into the third case, where x - threshold is 0, so f(x) = alpha * 0 = 0. This is why you get 0.

If you need the documented behavior, the relevant line in the Keras source (which currently uses tf.greater) needs to change to:

x = x * tf.cast(tf.greater_equal(x, threshold), floatx())
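For comparison, here is a minimal sketch of a relu that follows the documented behavior (a hypothetical helper, not the actual Keras source), using tf.greater_equal so that x == threshold returns x:

import tensorflow as tf

def relu_documented(x, alpha=0.0, max_value=None, threshold=0.0):
    # Hypothetical re-implementation of the documented behavior; not the real Keras code.
    x = tf.convert_to_tensor(x, dtype=tf.float32)
    above = tf.cast(tf.greater_equal(x, threshold), x.dtype)   # 1.0 where x >= threshold
    out = x * above + alpha * (x - threshold) * (1.0 - above)  # leaky part strictly below threshold
    if max_value is not None:
        out = tf.minimum(out, max_value)                       # clip at max_value
    return out

print(relu_documented(5.0, alpha=0.01, max_value=100.0, threshold=5.0).numpy())  # 5.0, as documented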

thushv89