I am building a simple convolutional network using the Lasagne package and wanted to add a ReLU layer with a simple trainable threshold [max(0, x - threshold)], but could only find rectifiers without a trainable parameter (lasagne.layers.NonlinearityLayer) or ones whose trainable parameter multiplies the input (lasagne.layers.ParametricRectifierLayer). Does this layer exist, or am I missing something obvious?
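For reference, here is a minimal sketch of the two options I found; the input shape is just for illustration:

    import lasagne
    from lasagne.layers import InputLayer, NonlinearityLayer, ParametricRectifierLayer
    from lasagne.nonlinearities import rectify

    l_in = InputLayer(shape=(None, 1, 28, 28))

    # Plain rectifier, no trainable parameter: computes max(0, x)
    l_relu = NonlinearityLayer(l_in, nonlinearity=rectify)

    # PReLU: the trainable alpha multiplies the negative part,
    # max(0, x) + alpha * min(0, x), rather than shifting x
    l_prelu = ParametricRectifierLayer(l_in)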

Thank you for any help! Terry

1 Answer

I don't think that layer exists, because you usually have a trainable layer before the ReLU (e.g. a convolutional or fully connected layer) which already includes a bias. Shifting the data by a bias is equivalent to applying a threshold at the ReLU: max(0, x + b) equals max(0, x - threshold) with threshold = -b. If you don't have a trainable layer before the ReLU, you can explicitly add a lasagne.layers.BiasLayer (http://lasagne.readthedocs.org/en/latest/modules/layers/special.html).
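For example, a minimal sketch of this, assuming a single-channel 2D input (the shape is just for illustration):

    import lasagne
    from lasagne.layers import InputLayer, BiasLayer, NonlinearityLayer
    from lasagne.nonlinearities import rectify
    from lasagne.init import Constant

    l_in = InputLayer(shape=(None, 1, 28, 28))

    # BiasLayer adds a trainable bias b (one per channel by default),
    # so the rectifier below computes max(0, x + b), which is exactly
    # max(0, x - threshold) with threshold = -b
    l_bias = BiasLayer(l_in, b=Constant(0.0))
    l_relu = NonlinearityLayer(l_bias, nonlinearity=rectify)

The bias is trained along with the rest of the network, so the effective threshold -b is learned.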

Hope this helps

Michael
