Status: Closed
Labels: question (further information is requested)
Description
❓ Questions and Help
When restarting training on a model that uses a learning rate scheduler, it seems like the original learning rate is used rather than the scheduler-updated learning rate.
Code
For example, a model with the following configure_optimizers:
def configure_optimizers(self):
    optimizer = optim.Adam(self.parameters(), lr=self.hparams.learning_rate)
    scheduler = optim.lr_scheduler.ExponentialLR(
        optimizer, gamma=self.hparams.learning_gamma
    )
    return [optimizer], [scheduler]
With learning_gamma != 1.0, when restarting training, e.g.:
model = myModel.load_from_checkpoint(ckpt_fname)
lr_monitor = LearningRateMonitor(logging_interval="epoch")
trainer = Trainer(resume_from_checkpoint=ckpt_fname, callbacks=[lr_monitor])
trainer.fit(model)
The logged learning rate equals the original initial learning rate rather than the scheduler-updated learning rate.
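For reference, here is a minimal plain-PyTorch sketch (outside Lightning; the tiny model, gamma, and step counts are illustrative only) of the behaviour expected on resume: restoring the optimizer and scheduler state_dicts is what carries the decayed learning rate across a restart, so the resumed LR should match the decayed value rather than the initial one.

from torch import nn, optim

model = nn.Linear(4, 1)
optimizer = optim.Adam(model.parameters(), lr=1e-3)
scheduler = optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)

# Simulate a few epochs of decay before "checkpointing".
for _ in range(5):
    optimizer.step()   # step the optimizer first to avoid the ordering warning
    scheduler.step()
print(scheduler.get_last_lr())  # ~1e-3 * 0.9**5

ckpt = {"optimizer": optimizer.state_dict(), "scheduler": scheduler.state_dict()}

# "Restart": rebuild with the original lr, then load the saved states.
new_optimizer = optim.Adam(model.parameters(), lr=1e-3)
new_scheduler = optim.lr_scheduler.ExponentialLR(new_optimizer, gamma=0.9)
new_optimizer.load_state_dict(ckpt["optimizer"])
new_scheduler.load_state_dict(ckpt["scheduler"])
print(new_scheduler.get_last_lr())  # decayed value, not the initial 1e-3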
What's your environment?
- OS: Linux
- Packaging: conda
- Version: 1.1.0