Description
In the documentation it says that to use the ReduceLROnPlateau scheduler we should do the following:
# The ReduceLROnPlateau scheduler requires a monitor
def configure_optimizers(self):
    return {
        'optimizer': Adam(...),
        'scheduler': ReduceLROnPlateau(optimizer, ...),
        'monitor': 'metric_to_track'
    }
But no variable named optimizer is initialized for 'scheduler': ReduceLROnPlateau(optimizer, ...), so I did:
def configure_optimizers(self):
    optimizer = torch.optim.Adam(
        params=self.parameters(),
        lr=LEARNING_RATE,
        weight_decay=WEIGHT_DECAY
    )
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
        optimizer,
        patience=1,
        verbose=True
    )
    return {
        'optimizer': optimizer,
        'scheduler': scheduler,
        'monitor': 'val_loss'
    }
But for some reason the scheduler was not working. I created a callback to check, and trainer.lr_schedulers gives an empty list.
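Roughly, the check looked like this (a minimal sketch; the class name SchedulerCheck and the choice of the on_train_epoch_start hook are just placeholders):

from pytorch_lightning.callbacks import Callback

class SchedulerCheck(Callback):
    # Prints whatever schedulers the Trainer has registered.
    # With the 'scheduler' key above, this printed an empty list.
    def on_train_epoch_start(self, trainer, pl_module):
        print(trainer.lr_schedulers)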
So I tried:
def configure_optimizers(self):
    optimizer = torch.optim.Adam(
        params=self.parameters(),
        lr=LEARNING_RATE,
        weight_decay=WEIGHT_DECAY
    )
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
        optimizer,
        patience=1,
        verbose=True
    )
    return {
        'optimizer': optimizer,
        'lr_scheduler': scheduler,  # changed 'scheduler' to 'lr_scheduler'
        'monitor': 'val_loss'
    }
And now it's working.
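For anyone hitting the same thing: as far as I can tell, Lightning also accepts a nested lr_scheduler dictionary, which keeps the monitor tied to the scheduler it controls. A sketch under that assumption:

def configure_optimizers(self):
    optimizer = torch.optim.Adam(
        params=self.parameters(),
        lr=LEARNING_RATE,
        weight_decay=WEIGHT_DECAY
    )
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
        optimizer,
        patience=1,
        verbose=True
    )
    return {
        'optimizer': optimizer,
        # nested form: the monitor lives next to the scheduler it drives
        'lr_scheduler': {
            'scheduler': scheduler,
            'monitor': 'val_loss'
        }
    }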