
ReduceLROnPlateau Scheduler documentation problem #4454

@KevinMathewT

Description

The documentation says that to use the ReduceLROnPlateau scheduler, we should do the following:

# The ReduceLROnPlateau scheduler requires a monitor
def configure_optimizers(self):
   return {
       'optimizer': Adam(...),
       'scheduler': ReduceLROnPlateau(optimizer, ...),
       'monitor': 'metric_to_track'
   }

But there is no variable named optimizer that 'scheduler': ReduceLROnPlateau(optimizer, ...) could refer to, so I did:

def configure_optimizers(self):
    optimizer = torch.optim.Adam(
        params=self.parameters(),
        lr=LEARNING_RATE,
        weight_decay=WEIGHT_DECAY
    )
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
        optimizer,
        patience=1,
        verbose=True
    )
    return {
        'optimizer': optimizer,
        'scheduler': scheduler,
        'monitor': 'val_loss'
    }

But for some reason the scheduler was not working. I created a Callback to check, and trainer.lr_schedulers gave an empty list.
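
A minimal sketch of the kind of callback I mean (the class name and the choice of hook are illustrative, not the exact code I used):

import pytorch_lightning as pl

class SchedulerCheck(pl.Callback):
    # Illustrative callback: print the configured schedulers at train start.
    def on_train_start(self, trainer, pl_module):
        # With the 'scheduler' key from the snippet above, this prints [],
        # i.e. the scheduler was silently ignored.
        print(trainer.lr_schedulers)

# Attach it when building the trainer:
# trainer = pl.Trainer(callbacks=[SchedulerCheck()])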
So I tried:

def configure_optimizers(self):
    optimizer = torch.optim.Adam(
        params=self.parameters(),
        lr=LEARNING_RATE,
        weight_decay=WEIGHT_DECAY
    )
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
        optimizer,
        patience=1,
        verbose=True
    )
    return {
        'optimizer': optimizer,
        'lr_scheduler': scheduler,  # changed 'scheduler' to 'lr_scheduler'
        'monitor': 'val_loss'
    }

And now it's working.
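
For anyone hitting the same problem: the 'monitor' key only has an effect if the named metric is actually logged. A minimal sketch of a matching validation_step, assuming a classification loss (the loss function here is illustrative):

import torch.nn.functional as F

def validation_step(self, batch, batch_idx):
    x, y = batch
    logits = self(x)
    loss = F.cross_entropy(logits, y)
    # The logged name must match the 'monitor' key so ReduceLROnPlateau
    # has a metric to track at the end of each validation epoch.
    self.log('val_loss', loss)
    return loss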

Labels: docs (Documentation related)
