With plain SGD the learning rate should not change between epochs, yet it does. Please help me understand why this happens and how to prevent the learning rate from changing.
import torch
params = [torch.nn.Parameter(torch.randn(1, 1))]
optimizer = torch.optim.SGD(params, lr=0.9)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, 1, gamma=0.9)
for epoch in range(5):
    print(scheduler.get_lr())
    scheduler.step()
The output is:
[0.9]
[0.7290000000000001]
[0.6561000000000001]
[0.5904900000000002]
[0.5314410000000002]
My torch version is 1.4.0.
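
For comparison, here is a minimal variant of the same loop that also prints the learning rate stored on the optimizer itself (read from optimizer.param_groups), in case that helps show where the change comes from. It is just a sketch with the same dummy parameter setup as above:

import torch

# Same setup as in the snippet above: one dummy parameter,
# SGD with lr=0.9, StepLR with step_size=1 and gamma=0.9
params = [torch.nn.Parameter(torch.randn(1, 1))]
optimizer = torch.optim.SGD(params, lr=0.9)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.9)

for epoch in range(5):
    # Learning rate actually stored on the optimizer for this epoch
    print("optimizer lr:", optimizer.param_groups[0]["lr"])
    # Learning rate as reported by the scheduler
    print("scheduler get_lr():", scheduler.get_lr())
    scheduler.step()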