I'm working with Lasagne on a neural network.
In the first layer, I want the same weight to be applied to many neurons of the input layer (and, of course, I want the weight updates to take into account the contribution of all of these neurons).
This is because my input has many symmetries: there are 24*n different inputs, but I want only 4*n distinct weights (n is a parameter I still have to decide).
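To make the symmetry concrete, here is a rough plain-Theano sketch of the structure I have in mind (the names, the placeholder values for n and num_units, and the grouping into 6 symmetric copies via T.tile are just illustrative, not my actual setup); what I'm asking is how to get the same effect through Lasagne's layers.

```python
import numpy as np
import theano
import theano.tensor as T

n = 2                        # placeholder; the actual value of n is still open
num_units = 10               # hypothetical size of the first hidden layer

x = T.matrix('x')            # input batch of shape (batch_size, 24*n)

# Only 4*n free weight rows; the 24*n input positions reuse them
# because of the symmetry (6 symmetric groups in this sketch).
W_free = theano.shared(
    np.random.normal(0, 0.01, (4 * n, num_units)).astype(theano.config.floatX),
    name='W_free')

# Full (24*n, num_units) weight matrix built by repeating the free block.
W_full = T.tile(W_free, (6, 1))

hidden = T.nnet.sigmoid(T.dot(x, W_full))

# Gradients w.r.t. W_free automatically sum over every position where a
# tied weight appears, which is the update behaviour I want.
grad = T.grad(hidden.sum(), W_free)
```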
How can I do this in Lasagne?