Questions tagged [learning-rate]
83 questions
2 votes · 0 answers
StepLR learning rate scheduler applies an almost infinitesimally small decrease, and too early
I am using the StepLR scheduler with the Adam optimizer:
optimizer = torch.optim.Adam(model.parameters(), lr=LrMax, weight_decay=decay) # , betas=(args.beta1, args.beta2)
print(f'Optimizer = {repr(optimizer)}')
scheduler =…

WestCoastProjects · 58,982
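
For reference, a minimal StepLR setup (step_size and gamma here are illustrative assumptions, not the asker's values). A frequent cause of a decay that looks both too early and vanishingly small is calling scheduler.step() once per batch instead of once per epoch:

import torch

model = torch.nn.Linear(10, 1)  # stand-in model for illustration
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-5)
# Multiply the learning rate by gamma once every step_size epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(100):
    # ... run the full training epoch (optimizer.step() per batch) ...
    optimizer.step()
    scheduler.step()  # once per epoch, after the epoch's optimizer steps
    print(epoch, scheduler.get_last_lr())
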
2 votes · 2 answers
Can I specify kernel-weight specific learning rates in PyTorch?
I would like to set a specific learning rate for each parameter at the lowest level, i.e. each value in a kernel's weights and biases should have its own learning rate.
I can specify filter-wise learning rates like this:
optim =…

oezguensi · 930
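
Parameter groups in torch.optim only go down to per-tensor granularity; a common workaround for per-element rates, sketched below with a hypothetical lr_map, is to scale each element's gradient with a hook. The equivalence is exact for plain SGD, not for adaptive optimizers like Adam:

import torch

conv = torch.nn.Conv2d(3, 16, kernel_size=3)
base_lr = 1e-3
lr_map = torch.rand_like(conv.weight)  # one relative factor per weight value (illustrative)

# SGD updates with lr * grad, so scaling the gradient element-wise behaves
# like an element-wise learning rate of base_lr * lr_map for this tensor.
conv.weight.register_hook(lambda grad: grad * lr_map)
optimizer = torch.optim.SGD(conv.parameters(), lr=base_lr)
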
1 vote · 1 answer
Getting rid of the clutter of `.lr_find_` in pytorch lightning?
When using Lightning's built-in LR finder:
# Create a Tuner
tuner = Tuner(trainer)
# finds learning rate automatically
# sets hparams.lr or hparams.learning_rate to that learning rate
tuner.lr_find(model)
a lot of checkpoint files lr_find_XXX.ckpt…

Gabi Gubu · 25
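
lr_find writes a temporary checkpoint so it can restore the model's initial state, and it normally deletes the file itself; leftovers usually indicate an interrupted run. A minimal cleanup sketch, assuming the default naming and the current working directory:

import glob
import os

# Remove leftover temporary checkpoints from interrupted lr_find runs.
for ckpt in glob.glob("*lr_find_*.ckpt"):
    os.remove(ckpt)
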
1 vote · 1 answer
Argument must be a string or a number, not 'ExponentialDecay'
I am on TensorFlow 2.4.0 and tried to apply exponential decay to the learning rate as follows:
learning_rate_scheduler = tf.keras.optimizers.schedules.ExponentialDecay(initial_learning_rate=0.1, decay_steps=1000, decay_rate=0.97,…

mad · 2,677
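
This TypeError is typically raised by code that expects a plain float learning rate, such as the TF1-style optimizers under tf.compat.v1.train, rather than a schedule object. The tf.keras optimizers accept the schedule directly; a minimal sketch:

import tensorflow as tf

learning_rate_scheduler = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1, decay_steps=1000, decay_rate=0.97)
# tf.keras optimizers evaluate the schedule at each step; handing it to a
# TF1-style optimizer (tf.compat.v1.train.*) raises exactly this TypeError.
optimizer = tf.keras.optimizers.SGD(learning_rate=learning_rate_scheduler)
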
1 vote · 1 answer
How to use OneCycleLR?
I want to train on CIFAR-10, say for 200 epochs.
This is my optimizer:
optimizer = optim.Adam([x for x in model.parameters() if x.requires_grad], lr=0.001)
I want to use OneCycleLR as scheduler. Now, according to the documentation, these are the…

CasellaJr · 378
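
A minimal sketch of the usual OneCycleLR wiring (steps_per_epoch here assumes CIFAR-10 with batch size 128; substitute len(train_loader)). The key point from the documentation is that OneCycleLR steps once per batch, not once per epoch:

import torch

model = torch.nn.Linear(3 * 32 * 32, 10)  # stand-in for the CIFAR-10 model
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

epochs = 200
steps_per_epoch = 391  # e.g. len(train_loader): 50,000 images / batch size 128
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=0.001, epochs=epochs, steps_per_epoch=steps_per_epoch)

for epoch in range(epochs):
    for batch in range(steps_per_epoch):
        # forward pass, loss.backward() ...
        optimizer.step()
        scheduler.step()  # OneCycleLR advances once per batch
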
1 vote · 1 answer
StableBaselines3 - Can I adaptively decrease the learning rate?
I am working with the StableBaselines3 package. I know that I can schedule the learning rate by passing a function as the "learning_rate" argument. However, what I want is to adaptively decrease the learning rate. Instead of…

Vladimir Belik · 280
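
The learning_rate callable in StableBaselines3 only receives progress_remaining (1.0 → 0.0), so it cannot react to the loss or any other metric. One hedged workaround, since SB3 re-reads model.lr_schedule before every update, is to point that schedule at a mutable value owned by a callback; metric_stalled below is a hypothetical check you would implement yourself:

from stable_baselines3.common.callbacks import BaseCallback

class AdaptiveLR(BaseCallback):
    def __init__(self, initial_lr=3e-4, verbose=0):
        super().__init__(verbose)
        self.current_lr = initial_lr

    def _on_training_start(self) -> None:
        # SB3 applies lr_schedule(progress_remaining) to the optimizer before
        # each update, so patching param_groups directly would be overwritten.
        self.model.lr_schedule = lambda _progress: self.current_lr

    def _on_step(self) -> bool:
        if self.metric_stalled():  # hypothetical: your own plateau test
            self.current_lr *= 0.5
        return True

    def metric_stalled(self) -> bool:
        return False  # placeholder
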
1 vote · 0 answers
Is there any way to gradually increase the learning rate using the TFOD API?
I am training CenterNet models using the TensorFlow Object Detection API. I need to find a better learning rate range. I have used a learning rate finder with Keras models before, but I couldn't find any way to implement the same strategy with the TFOD API. I tried…

Ahmet Mert Saygu · 31
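
The TFOD API has no LR-finder hook, but a gradual ramp-up is built into its learning rate protos: the warmup fields linearly increase the rate from warmup_learning_rate to learning_rate_base over warmup_steps. A sketch of the relevant pipeline.config fragment, with all numbers illustrative:

optimizer {
  adam_optimizer {
    learning_rate {
      cosine_decay_learning_rate {
        learning_rate_base: 1e-3
        total_steps: 50000
        warmup_learning_rate: 1e-5
        warmup_steps: 2000
      }
    }
  }
}
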
1 vote · 0 answers
How to use a variable learning rate that decreases with the loss in pytorch-geometric?
I have the following code snippet from a PyTorch Geometric example. I want to use a learning rate that decreases as the training loss decreases. I tried using a scheduler, but that didn't work for me.
A clean code-snippet is below. Can…

Astra Uvarova - Saturn's star · 677
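
ReduceLROnPlateau is the stock way in plain PyTorch to tie the learning rate to the training loss, and nothing PyG-specific is needed. A minimal sketch, where the model and the constant loss stand in for the PyG example's training step:

import torch

model = torch.nn.Linear(16, 2)  # stand-in for the PyG model
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.5, patience=10)

for epoch in range(200):
    loss = torch.tensor(1.0)  # stand-in for the epoch's training loss
    scheduler.step(loss)      # halves the lr when the loss stops improving
    print(epoch, optimizer.param_groups[0]["lr"])
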
1 vote · 0 answers
What does the global step for the learning rate decay do?
I am following this tutorial:
https://cloud.google.com/architecture/clv-prediction-with-offline-training-train#introduction
and I am rewriting some of the code on Google Colab.
They are using the following for a learning rate decay:
initial_lr =…

timmy · 21
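
global_step is simply the training step counter that the schedule indexes into: with exponential decay the rate at any moment is initial_lr * decay_rate ** (global_step / decay_steps), and passing global_step to minimize() is what makes the optimizer advance the counter once per update. A TF1-style sketch with illustrative numbers:

import tensorflow as tf
tf.compat.v1.disable_eager_execution()

global_step = tf.compat.v1.train.get_or_create_global_step()
lr = tf.compat.v1.train.exponential_decay(
    0.01, global_step, decay_steps=1000, decay_rate=0.96)
# With global_step passed to minimize(), each update advances the counter,
# so lr follows 0.01 * 0.96 ** (global_step / 1000) through training.
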
1 vote · 1 answer
Plotting learning rate vs Loss
I am trying to find the best learning rate by multiplying the learning rate by a constant factor and then training the model at the varying learning rates. I need to choose the learning rate at the turning point, where the loss starts to increase…

Adarsh Singh · 11
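
A minimal version of that LR range test on synthetic data (the model, factor, and step count are all illustrative): multiply the rate each batch, record (lr, loss) pairs, plot on a log-x axis, and pick a rate slightly below the turning point.

import matplotlib.pyplot as plt
import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-6)
lrs, losses = [], []

for step in range(150):
    x, y = torch.randn(32, 10), torch.randn(32, 1)
    loss = torch.nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    lrs.append(optimizer.param_groups[0]["lr"])
    losses.append(loss.item())
    for group in optimizer.param_groups:
        group["lr"] *= 1.1  # the constant multiplicative factor

plt.plot(lrs, losses)
plt.xscale("log")  # the elbow is easiest to see on a log axis
plt.xlabel("learning rate")
plt.ylabel("loss")
plt.show()
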
1 vote · 0 answers
Is there a way to schedule the learning rate based on step and epoch in Keras
I know about the Keras learning rate scheduler and tf.keras.optimizers.schedules.InverseTimeDecay, but they take only the current epoch or only the current step as an argument. What I would like is for my learning rate to stay at its initial value up to the tenth epoch…

Corentin Salomon · 90
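
A custom LearningRateSchedule only ever sees the global step, but if steps_per_epoch is known the epoch can be recovered from it. A sketch that holds the initial rate for the first ten epochs and then applies inverse-time decay (all hyperparameters illustrative):

import tensorflow as tf

class EpochAwareSchedule(tf.keras.optimizers.schedules.LearningRateSchedule):
    def __init__(self, initial_lr, steps_per_epoch, hold_epochs=10, decay_rate=0.5):
        self.initial_lr = initial_lr
        self.steps_per_epoch = steps_per_epoch
        self.hold_steps = float(hold_epochs * steps_per_epoch)
        self.decay_rate = decay_rate

    def __call__(self, step):
        step = tf.cast(step, tf.float32)
        # epoch = step / steps_per_epoch; decay counts only post-hold epochs
        decayed = self.initial_lr / (
            1.0 + self.decay_rate * (step - self.hold_steps) / self.steps_per_epoch)
        return tf.where(step < self.hold_steps, self.initial_lr, decayed)

optimizer = tf.keras.optimizers.Adam(EpochAwareSchedule(1e-3, steps_per_epoch=100))
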
1 vote · 0 answers
How to print the learning rate every epoch when a functional learning rate scheduler is used?
I have used a custom learning rate scheduler. The code is as follows (the same as https://www.tensorflow.org/tutorials/text/transformer#optimizer):
class CustomSchedule(tf.keras.optimizers.schedules.LearningRateSchedule):
    def __init__(self, d_model,…

Atanu Mandal · 11
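
With a schedule-based rate there is no fixed optimizer.lr value to read, but the schedule can be evaluated at the optimizer's current step from a callback. A sketch:

import tensorflow as tf

class LRLogger(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        lr = self.model.optimizer.learning_rate
        if isinstance(lr, tf.keras.optimizers.schedules.LearningRateSchedule):
            lr = lr(self.model.optimizer.iterations)  # evaluate at current step
        print(f"epoch {epoch + 1}: learning rate = {float(lr):.6g}")

# usage: model.fit(..., callbacks=[LRLogger()])
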
1 vote · 1 answer
How to resolve 'RuntimeError: Trying to eval in EAGER mode' while using a custom learning rate?
I am working with a custom learning rate scheduler and I am running into the error RuntimeError: Trying to eval in EAGER mode. I have made a function that calculates the learning rate for an epoch, and I've used the…

Ravish Jha · 481
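
Without the full traceback this is only a guess, but that RuntimeError usually means graph-style evaluation (K.eval / Tensor.eval, often via standalone keras imports mixed with tf.keras) is being applied to a tensor while TF2's eager execution is active; in eager mode the value can be read directly. A sketch of the eager-safe read:

import tensorflow as tf

lr_tensor = tf.constant(1e-3)  # stand-in for the schedule's computed lr
# Graph-style evaluation of this tensor is what typically raises
# "Trying to eval in EAGER mode" under TF2.
lr_value = float(lr_tensor)    # or lr_tensor.numpy(): the eager-mode read
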
1 vote · 2 answers
What is the behaviour when resuming the training, when learning rate decay is used, in TensorFlow/Keras?
I am having difficulty understanding how training will resume when the model is loaded from disk and a scheduler like the one below is used.
learning_rate_scheduler = tensorflow.keras.optimizers.schedules.ExponentialDecay(
    0.01,…

sreagm · 378
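
The short version: ExponentialDecay is a pure function of optimizer.iterations, so resumption behaves correctly exactly when that counter is restored. Saving the whole model (optimizer state included) round-trips it; weights-only checkpoints do not. A sketch:

import tensorflow as tf

schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    0.01, decay_steps=1000, decay_rate=0.9)
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer=tf.keras.optimizers.Adam(schedule), loss="mse")
# ... model.fit(...) ...
model.save("model.h5")  # full save keeps optimizer state by default

restored = tf.keras.models.load_model("model.h5")
# iterations resumes from the saved step, so the decayed lr picks up
# where it left off rather than restarting at 0.01.
print(int(restored.optimizer.iterations))
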
1 vote · 1 answer
Learning rate finder for CNNLstm model
I have a CNNLstm model as follows.
class CNN(nn.Module):
    def __init__(self):
        super(CNN, self).__init__()
        self.conv1 = nn.Sequential(
            nn.Conv2d(
                in_channels=3,
                …

batuman · 7,066
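
Any model with a standard forward pass can go through an LR range test; one option is the third-party torch-lr-finder package (pip install torch-lr-finder). A sketch, where CNNLstm and train_loader are placeholders for the asker's model and data loader:

import torch
from torch_lr_finder import LRFinder

model = CNNLstm()          # hypothetical: the asker's model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-7)
criterion = torch.nn.CrossEntropyLoss()

lr_finder = LRFinder(model, optimizer, criterion, device="cuda")
lr_finder.range_test(train_loader, end_lr=1, num_iter=100)
lr_finder.plot()   # inspect loss vs lr; pick a rate just before the minimum
lr_finder.reset()  # restore model and optimizer to their initial state
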