
I'm using LambdaLR as a learning rate scheduler:

import torch
import matplotlib.pyplot as plt

model = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
lambda1 = lambda epoch: 0.99 ** epoch  # decay factor applied to the initial lr
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda1, last_epoch=-1)

lrs = []
for i in range(2001):
    optimizer.step()
    lrs.append(optimizer.param_groups[0]["lr"])  # record the lr used this epoch
    scheduler.step()

plt.plot(lrs)
plt.show()

[Plot: the learning rate decays exponentially from 0.01 toward 0]

I'm trying to set a minimum learning rate so it doesn't decay all the way to 0. How can I do that?

Penguin

1 Answer


The new learning rate is always computed from the initial learning rate, not from the previous one:

lr_epoch = initial_lr * lambda(epoch)

And by the initial learning rate they mean the one the optimizer was created with (lr=0.01 above), not the last one that was used.
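A quick way to see this is with a constant factor (a standalone sketch with a hypothetical factor of 0.5, just for illustration): if LambdaLR multiplied the last learning rate, the rate would halve every epoch, but because it multiplies the initial one, it stays fixed at 0.005.

import torch

model = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
# Constant factor: the lr should stay at 0.01 * 0.5 = 0.005 on every epoch.
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda epoch: 0.5)

for _ in range(3):
    optimizer.step()
    scheduler.step()
    print(optimizer.param_groups[0]["lr"])  # 0.005 each time, not 0.005, 0.0025, 0.00125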

That means we can just write:

INITIAL_LEARNING_RATE = 0.01
your_min_lr = 0.0001

# Clamp the decay factor so the lr never falls below your_min_lr.
lambda1 = lambda epoch: max(0.99 ** epoch, your_min_lr / INITIAL_LEARNING_RATE)

Then you get your_min_lr back as soon as INITIAL_LEARNING_RATE * 0.99 ** epoch drops below it, because INITIAL_LEARNING_RATE * (your_min_lr / INITIAL_LEARNING_RATE) is exactly your_min_lr.
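Plugging the clamped lambda back into the loop from the question (same model, optimizer, and plotting code as above), the curve now flattens out at the floor instead of decaying to 0; with these numbers the floor is reached after roughly 460 epochs:

import torch
import matplotlib.pyplot as plt

INITIAL_LEARNING_RATE = 0.01
your_min_lr = 0.0001

model = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=INITIAL_LEARNING_RATE)
lambda1 = lambda epoch: max(0.99 ** epoch, your_min_lr / INITIAL_LEARNING_RATE)
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda1)

lrs = []
for _ in range(2001):
    optimizer.step()
    lrs.append(optimizer.param_groups[0]["lr"])
    scheduler.step()

plt.plot(lrs)  # exponential decay down to 0.0001, then a flat line
plt.show()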

Tobi