I am using the pylearn2 library to design a CNN, and I want to use leaky ReLUs as the activation function in one layer. Is there any way to do this with pylearn2? Do I have to write a custom function for it, or does pylearn2 have built-in functions for that? If so, how do I write the custom code? Can anyone help me out here?
1 Answer
The ConvElemwise superclass is a generic convolutional elemwise layer. Among its subclasses, ConvRectifiedLinear is a convolutional rectified linear layer that uses the RectifierConvNonlinearity class.
In its apply() method:

    p = linear_response * (linear_response > 0.) + \
        self.left_slope * linear_response * (linear_response < 0.)
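So a leaky ReLU is already reachable: pass a RectifierConvNonlinearity with a non-zero left_slope to a ConvElemwise layer. A minimal sketch, assuming the constructor arguments below match your pylearn2 version (the pool settings, irange, and slope value are placeholders, not recommendations):

    # Sketch: a ConvElemwise layer with a leaky rectifier nonlinearity.
    from pylearn2.models.mlp import ConvElemwise, RectifierConvNonlinearity

    leaky_conv = ConvElemwise(
        layer_name='conv_leaky',
        output_channels=64,
        kernel_shape=(5, 5),
        pool_type='max',
        pool_shape=(2, 2),
        pool_stride=(2, 2),
        irange=0.05,
        # left_slope > 0 gives f(x) = x for x > 0 and left_slope * x otherwise
        nonlinearity=RectifierConvNonlinearity(left_slope=0.01))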
As this gentle review points out:
... Maxout neuron (introduced recently by Goodfellow et al.) that generalizes the ReLU and its leaky version.
Examples are MaxoutLocalC01B and MaxoutConvC01B; see the sketch below.
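If you go the maxout route instead, a layer can be built along these lines. A sketch, assuming MaxoutConvC01B accepts the arguments shown (the values are placeholders; note that the C01B layers require the cuda-convnet GPU backend):

    # Sketch: a maxout convolutional layer; each unit takes the max over
    # num_pieces linear filters, which subsumes ReLU-like shapes.
    from pylearn2.models.maxout import MaxoutConvC01B

    maxout_conv = MaxoutConvC01B(
        layer_name='conv_maxout',
        num_channels=64,
        num_pieces=2,
        kernel_shape=(5, 5),
        pool_shape=(2, 2),
        pool_stride=(2, 2),
        irange=0.05)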
The reason for the lack of an answer on the pylearn2-users list may be that pylearn2 is mostly written by researchers at the LISA lab, so the threshold for point 13 in the FAQ may be high.

Nicu Tofan
- This works. Thanks heaps. Yes, it was really hard to find any info about this issue on the internet. – SameeraR Apr 22 '15 at 05:35
- I had a hard time figuring out the workings of PyLearn2, too. I had to spend some time reading the source code. Pretty enlightening, though. – Nicu Tofan Apr 22 '15 at 09:40