Labels: bug, help wanted, priority: 0
🐛 Bug (#614)
To Reproduce
Steps to reproduce the behavior:
```python
import torch
import pytorch_lightning as pl

class TestModel(pl.LightningModule):
    # ... rest of the model elided in the report ...

    def configure_optimizers(self):
        optim = torch.optim.Adam(self.parameters(), lr=self.lr)
        sched = torch.optim.lr_scheduler.ReduceLROnPlateau(optim, 'min')
        return [optim], [sched]

model = TestModel()
trainer = pl.Trainer(gpus=1, default_save_path=exp_path, max_epochs=100)

# Run the learning rate finder
lr_finder = trainer.lr_find(model)

# Results can be found in
lr_finder.results

# Plot with
fig = lr_finder.plot(suggest=True)
fig.show()
```
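For context, a minimal self-contained model along these lines matches the setup described above. This is a hypothetical sketch, since the report does not show the real TestModel: the Linear layer, the random dataset, and the training_step body are stand-ins. Only `configure_optimizers` and the `self.lr` attribute are taken from the report itself.

```python
import torch
import pytorch_lightning as pl

class TestModel(pl.LightningModule):
    """Hypothetical minimal model; the actual TestModel is not shown in the report."""

    def __init__(self, lr=1e-3):
        super().__init__()
        self.lr = lr  # attribute that lr_find varies
        self.layer = torch.nn.Linear(32, 2)  # stand-in layer

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.cross_entropy(self(x), y)
        return {'loss': loss}  # dict return style used in PL 0.7.x

    def train_dataloader(self):
        # Random stand-in data, just enough for the LR finder to step through
        dataset = torch.utils.data.TensorDataset(
            torch.randn(64, 32), torch.randint(0, 2, (64,)))
        return torch.utils.data.DataLoader(dataset, batch_size=8)

    def configure_optimizers(self):
        optim = torch.optim.Adam(self.parameters(), lr=self.lr)
        sched = torch.optim.lr_scheduler.ReduceLROnPlateau(optim, 'min')
        return [optim], [sched]
```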
Calling trainer.lr_find(model) consistently fails with:

optimizer got an empty parameter list

The regular .fit() method works as expected.
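Since the error claims the parameter list is empty, a quick check can rule out the model definition itself. This is a debugging sketch, not part of the original report:

```python
# Debugging sketch: confirm the model exposes parameters before lr_find runs.
model = TestModel()
n_tensors = len(list(model.parameters()))
n_elems = sum(p.numel() for p in model.parameters())
print(f"trainable tensors: {n_tensors}, total elements: {n_elems}")
# A non-zero count here would suggest the empty-parameter error originates
# inside trainer.lr_find rather than from the model definition, consistent
# with .fit() working on the same model.
```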
PyTorch Lightning version: 0.7.6