
If I set alpha = 1, then I simply extend the normal ReLU function downward into the negative domain. Should this function then be viewed as a linear activation function?

janwe

1 Answer


Yes, Leaky ReLU with alpha equal to 1 is in fact the linear (identity) activation function. Leaky ReLU is defined as f(x) = x for x >= 0 and f(x) = alpha * x for x < 0, so with alpha = 1 both branches reduce to f(x) = x.
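
For illustration, here is a minimal NumPy sketch (the function name `leaky_relu` and the test values are my own, not from the original post) showing that alpha = 1 collapses Leaky ReLU to the identity:

```python
import numpy as np

def leaky_relu(x, alpha):
    # Leaky ReLU: x for x >= 0, alpha * x for x < 0
    return np.where(x >= 0, x, alpha * x)

x = np.linspace(-3, 3, 7)
print(leaky_relu(x, alpha=1.0))                  # same values as x
print(np.allclose(leaky_relu(x, alpha=1.0), x))  # True: identity function
```

Because the activation becomes the identity, a network built only from such layers computes a composition of affine maps, which is itself affine, so the nonlinearity that activations normally provide is lost.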

Ignacio Peletier