Description
🐛 Bug
The `LearningRateMonitor` callback (LR Monitor) does not work with Stochastic Weight Averaging (SWA).
To Reproduce
https://colab.research.google.com/drive/1sksfJGj5d-_KJGVz3tM_ccOMnk9TPkys?usp=sharing
These lines in particular:

```python
lrmcb = pl.callbacks.LearningRateMonitor(logging_interval='step')
trainer = pl.Trainer(stochastic_weight_avg=True, callbacks=[lrmcb])
```
Expected behavior
Not have this warning:

```
/usr/local/lib/python3.7/dist-packages/pytorch_lightning/callbacks/lr_monitor.py:116: RuntimeWarning: You are using `LearningRateMonitor` callback with models that have no learning rate schedulers. Please see documentation for `configure_optimizers` method.
```
Because SWA does use a learning rate scheduler (`SWALR`), according to this: https://pytorch.org/blog/pytorch-1.6-now-includes-stochastic-weight-averaging/#how-to-use-swa-in-pytorch
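As a minimal sketch of the plain-PyTorch recipe from that blog post (the model and hyperparameters here are illustrative, not taken from the Colab): `SWALR` is a real `torch.optim` scheduler that anneals the learning rate towards the SWA learning rate, so an LR monitor should have something to record while SWA is active.

```python
import torch
from torch.optim.swa_utils import AveragedModel, SWALR

model = torch.nn.Linear(32, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# SWA keeps a running average of the model weights...
swa_model = AveragedModel(model)
# ...and SWALR is a genuine LR scheduler that anneals the optimizer's
# LR from its initial value (0.1) down to swa_lr (0.05).
swa_scheduler = SWALR(optimizer, swa_lr=0.05, anneal_epochs=5,
                      anneal_strategy='linear')

for _ in range(10):           # stand-in for the SWA phase of training
    optimizer.step()
    swa_scheduler.step()      # each step moves the LR towards swa_lr
    swa_model.update_parameters(model)
```

After the anneal period the optimizer's `param_groups[0]['lr']` sits at `swa_lr`, which is exactly the value one would expect `LearningRateMonitor` to log.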
And not have an empty list of learning rates when inspecting the `LearningRateMonitor` callback instance:

```python
for k, v in vars(lrmcb).items():
    print(k, v)
```

OUTPUT:

```
logging_interval None
log_momentum False
lrs {'lr-SGD': []}
lr_sch_names []
log <bound method LightningModule.log of BoringModel( (layer): Linear(in_features=32, out_features=2, bias=True) )>
log_dict <bound method LightningModule.log_dict of BoringModel( (layer): Linear(in_features=32, out_features=2, bias=True) )>
last_momentum_values {}
```

The line showing the LRs is an empty list:

```
lrs {'lr-SGD': []}
```
Environment
- CUDA:
- GPU:
- available: False
- version: 10.2
- Packages:
- numpy: 1.19.5
- pyTorch_debug: False
- pyTorch_version: 1.9.0+cu102
- pytorch-lightning: 1.4.6
- tqdm: 4.62.0
- System:
- OS: Linux
- architecture:
- 64bit
- processor: x86_64
- python: 3.7.11
- version: #1 SMP Sat Jun 5 09:50:34 PDT 2021