What does the negative slope in a LeakyReLU function refer to?
The term "negative slope" is used in the documentation of both TensorFlow and PyTorch, but it does not seem to match reality.
The slope of a LeakyReLU function for both positive and negative inputs is generally non-negative.
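To make the terminology concrete, here is a minimal sketch of a leaky ReLU (not the actual framework implementation), where the parameter in question is the slope applied *to negative inputs*, and is itself a positive number:

```python
def leaky_relu(x, negative_slope=0.01):
    # For x >= 0 the output is x (slope 1).
    # For x < 0 the output is negative_slope * x, so the graph has a
    # small but *positive* slope there; "negative slope" names the
    # slope of the negative-input region, not a slope that is negative.
    return x if x >= 0 else negative_slope * x

print(leaky_relu(2.0))   # 2.0
print(leaky_relu(-2.0))  # -0.02
```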
The PyTorch and TensorFlow docs provide examples of setting the negative slope, and both use a positive value; TensorFlow even explicitly enforces non-negative values (see below).
Are they just wrong or am I missing something?
From the PyTorch docs:

`torch.nn.LeakyReLU(negative_slope=0.01, inplace=False)`
From the TensorFlow docs (`tf.keras.layers.LeakyReLU`):

`alpha`: Float >= 0. Negative slope coefficient. Defaults to 0.3.