If I set alpha = 1, then I simply extend the normal ReLU function to the negative inputs as well. Should this function then be viewed as a linear activation function?
1 Answer
Yes, Leaky ReLU with alpha equal to 1 is in fact a linear activation function: it reduces to the identity, f(x) = x, for all inputs.
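As a quick check, here is a minimal NumPy sketch (the function name and test values are just illustrative) showing that Leaky ReLU with alpha = 1 returns its input unchanged:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: x for x >= 0, alpha * x for x < 0
    return np.where(x >= 0, x, alpha * x)

x = np.linspace(-3, 3, 7)
print(leaky_relu(x, alpha=1.0))                  # same values as x
print(np.allclose(leaky_relu(x, alpha=1.0), x))  # True: with alpha = 1 it is the identity
```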

Ignacio Peletier
You are welcome! If the post answered your question, please mark it as answered. – Ignacio Peletier Jul 13 '20 at 10:23