Status: Closed
Labels: bug, help wanted, lr scheduler, priority: 2, strategy: fairscale sharded (removed)
Description
I am using a learning rate scheduler that is called every step:
def configure_optimizers(self):
    optimizer = Adam(...)
    scheduler = LambdaLR(optimizer, ...)
    return {
        "optimizer": optimizer,
        "lr_scheduler": {"scheduler": scheduler, "interval": "step"},
    }

When using DDPPlugin, there is no issue, but when using DDPShardedPlugin, I get the following warning:
lib/python3.8/site-packages/torch/optim/lr_scheduler.py:129: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
warnings.warn("Detected call of `lr_scheduler.step()` before `optimizer.step()`. "
Environment:
PyTorch 1.8.1
Lightning 1.2.9
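For reference, a minimal, self-contained reproduction sketch of my setup (the module, random data, lr_lambda, and the two-GPU Trainer arguments with plugins="ddp_sharded" are placeholders I have assumed for illustration; the relevant parts are the "interval": "step" scheduler config and the sharded plugin):

import torch
from torch.optim import Adam
from torch.optim.lr_scheduler import LambdaLR
from torch.utils.data import DataLoader, TensorDataset

import pytorch_lightning as pl


class Repro(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        # random data, so the "loss" is just the sum of the outputs
        return self.layer(batch[0]).sum()

    def configure_optimizers(self):
        optimizer = Adam(self.parameters(), lr=1e-3)
        scheduler = LambdaLR(optimizer, lr_lambda=lambda step: 0.99 ** step)
        return {
            "optimizer": optimizer,
            "lr_scheduler": {"scheduler": scheduler, "interval": "step"},
        }


if __name__ == "__main__":
    train = DataLoader(TensorDataset(torch.randn(64, 32)), batch_size=8)
    # plugins="ddp_sharded" selects the sharded plugin; dropping it and
    # running with plain DDP does not print the warning, as described above.
    trainer = pl.Trainer(gpus=2, accelerator="ddp", plugins="ddp_sharded", max_steps=4)
    trainer.fit(Repro(), train)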