I am building a simple convolutional network with the Lasagne package and want to add a ReLU layer with a trainable threshold, i.e. max(0, x - threshold). However, I could only find rectifiers without a trainable parameter (lasagne.layers.NonlinearityLayer) or with a parameter that is multiplied rather than subtracted (lasagne.layers.ParametricRectifierLayer). Does such a layer already exist, or am I missing something obvious?
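To make clear what I mean, here is a rough, untested sketch of the kind of layer I have in mind, following Lasagne's usual custom-layer pattern (subclass lasagne.layers.Layer, register parameters with add_param, implement get_output_for). The class name is just a placeholder, and I have kept the threshold as a single scalar for simplicity:

```python
import theano.tensor as T
from lasagne import init
from lasagne.layers import Layer


class ThresholdedRectifierLayer(Layer):  # placeholder name
    """Applies max(0, x - threshold) with one trainable scalar threshold."""

    def __init__(self, incoming, threshold=init.Constant(0.), **kwargs):
        super(ThresholdedRectifierLayer, self).__init__(incoming, **kwargs)
        # register the threshold as a trainable scalar parameter
        self.threshold = self.add_param(threshold, (), name='threshold')

    def get_output_for(self, input, **kwargs):
        # shift the input by the learned threshold, then rectify
        return T.maximum(0, input - self.threshold)
```

I would stack this after a Conv2DLayer built with nonlinearity=None, so the rectification (and its threshold) is handled entirely by this layer. Is there a built-in layer that already does this?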
Thank you for any help! Terry