
I am using the Lasagne and Theano libraries to build my own deep learning model, following the MNIST example. Can anyone tell me how to adaptively change the learning rate?

Avijit Dasgupta

2 Answers


I recommend having a look at https://github.com/Lasagne/Lasagne/blob/master/lasagne/updates.py.

If you are using SGD, you can add a momentum term (e.g. https://github.com/Lasagne/Lasagne/blob/master/lasagne/updates.py#L156) to adaptively scale the updates. If you want to do anything non-standard, the momentum implementation gives you enough hints on how to create something similar on your own.
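As a rough illustration of what the momentum update does (plain Python, not Lasagne itself; the function name and toy problem here are my own), the velocity term accumulates past gradients, so steps grow along consistent descent directions and the effective step size adapts even though the base learning rate is fixed:

```python
def momentum_step(param, grad, velocity, learning_rate=0.05, momentum=0.9):
    """One SGD-with-momentum step in the same form as Lasagne's
    momentum update: v := momentum * v - learning_rate * grad,
    then param := param + v."""
    velocity = momentum * velocity - learning_rate * grad
    return param + velocity, velocity

# Toy problem: minimize f(x) = x**2, whose gradient is 2*x.
x, v = 5.0, 0.0
for _ in range(100):
    x, v = momentum_step(x, 2 * x, v)

print(x)  # close to the minimum at 0
```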

Martin Thoma

I think the best way to do this is to create a Theano shared variable for your learning rate, pass the shared variable to the updates function, and change it through the set_value method, as follows:

import numpy as np
import theano
import lasagne

lr_shared = theano.shared(np.array(0.1, dtype=theano.config.floatX))
updates = lasagne.updates.rmsprop(..., learning_rate=lr_shared)

...

for epoch in range(num_epochs):
    # Divide the learning rate by 10 every 10 epochs; the epoch > 0
    # guard keeps the first 10 epochs at the initial rate of 0.1
    if epoch % 10 == 0 and epoch > 0:
        lr_shared.set_value(lr_shared.get_value() / 10)

Of course, you can change the optimizer and the if condition; this is just an example.
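To see what schedule such a loop produces, here is a pure-Python sketch (no Theano needed) of a step decay that divides the rate by 10 every 10 epochs; the `epoch > 0` guard is my own assumption so the first 10 epochs run at the initial rate:

```python
# Step decay: start at 0.1, divide by 10 every 10 epochs.
lr = 0.1
schedule = []
for epoch in range(30):
    if epoch % 10 == 0 and epoch > 0:
        lr = lr / 10
    schedule.append(lr)

print(schedule[0], schedule[10], schedule[20])  # 0.1 0.01 0.001
```

In the real training loop you would call `lr_shared.set_value(...)` instead of appending to a list, and the compiled Theano function picks up the new value on the next update.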

Roxana Istrate