I am using Nesterov momentum to update the weights of a convolutional neural network that I am building with Lasagne. How can I implement learning rate decay after every epoch?
import theano
import lasagne

# network, input_var_1, input_var_2 and target_var are defined earlier in my script
prediction = lasagne.layers.get_output(network)
loss = lasagne.objectives.categorical_crossentropy(prediction, target_var)
loss = loss.mean()  # nesterov_momentum needs a scalar loss to take gradients of
params = lasagne.layers.get_all_params(network, trainable=True)
updates = lasagne.updates.nesterov_momentum(loss, params, learning_rate=learning_rate, momentum=0.9)
train_fn = theano.function([input_var_1, input_var_2, target_var], loss, updates=updates)
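From reading the Theano docs, my guess is that I should make learning_rate a theano.shared variable, pass it to nesterov_momentum, and shrink it with set_value at the end of each epoch. Below is a rough sketch of what I have in mind (the decay factor, num_epochs and iterate_minibatches are just placeholders, not working code). Is this the right approach, or does Lasagne have a built-in way to do per-epoch decay?

import numpy as np
import theano
import lasagne

# Assumption: a shared variable lets me change the learning rate between epochs
# without recompiling train_fn.
learning_rate = theano.shared(np.float32(0.01))
decay = np.float32(0.95)  # hypothetical per-epoch decay factor

updates = lasagne.updates.nesterov_momentum(loss, params, learning_rate=learning_rate, momentum=0.9)
train_fn = theano.function([input_var_1, input_var_2, target_var], loss, updates=updates)

for epoch in range(num_epochs):  # num_epochs is a placeholder
    for batch_1, batch_2, targets in iterate_minibatches():  # placeholder batch iterator
        train_fn(batch_1, batch_2, targets)
    # decay the learning rate once per epoch
    learning_rate.set_value(learning_rate.get_value() * decay)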