
Commit 218c650

Merge branch 'master' into after-v1.3.6
2 parents: cc908dd + 7978a53

File tree

70 files changed: +3283 / -1671 lines


.gitignore

Lines changed: 1 addition & 3 deletions
@@ -8,9 +8,7 @@ lightning_logs/
 .vscode/
 
 # Test-tube
-test_tube_logs/
-test_tube_data/
-test_tube_exp/
+test_tube_*/
 
 # Documentations
 docs/source/api

.gitmodules

Lines changed: 2 additions & 2 deletions
@@ -1,4 +1,4 @@
-[submodule "notebooks"]
-	path = notebooks
+[submodule "_notebooks"]
+	path = _notebooks
 	url = https://github.com/PyTorchLightning/lightning-tutorials.git
 	branch = publication

.pre-commit-config.yaml

Lines changed: 1 addition & 0 deletions
@@ -56,6 +56,7 @@ repos:
     rev: 'v2.3'
     hooks:
       - id: vulture
+        name: Check dead code
 
   - repo: https://github.com/PyCQA/flake8
     rev: 3.9.2

CHANGELOG.md

Lines changed: 40 additions & 2 deletions
@@ -9,6 +9,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 ### Added
 
+- Add support for named parameter groups in `LearningRateMonitor` ([#7987](https://github.com/PyTorchLightning/pytorch-lightning/pull/7987))
+
+
 - Add `dataclass` support for `pytorch_lightning.utilities.apply_to_collection` ([#7935](https://github.com/PyTorchLightning/pytorch-lightning/pull/7935))
 
 
@@ -77,18 +80,31 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Added IPU Accelerator ([#7867](https://github.com/PyTorchLightning/pytorch-lightning/pull/7867))
 
 
+- Fault-tolerant training
+    * Add `{,load_}state_dict` to `ResultCollection` ([#7948](https://github.com/PyTorchLightning/pytorch-lightning/pull/7948))
+
+
 - Added a warning if `Trainer(log_every_n_steps)` is a value too high for the training dataloader ([#7734](https://github.com/PyTorchLightning/pytorch-lightning/pull/7734))
 
 
 - Added LightningCLI support for argument links applied on instantiation ([#7895](https://github.com/PyTorchLightning/pytorch-lightning/pull/7895))
 
 
+- Added LightningCLI support for configurable callbacks that should always be present ([#7964](https://github.com/PyTorchLightning/pytorch-lightning/pull/7964))
+
+
 - Added DeepSpeed Infinity Support, and updated to DeepSpeed 0.4.0 ([#7234](https://github.com/PyTorchLightning/pytorch-lightning/pull/7234))
 
 
 - Added support for `torch.nn.UninitializedParameter` in `ModelSummary` ([#7642](https://github.com/PyTorchLightning/pytorch-lightning/pull/7642))
 
 
+- Added support `LightningModule.save_hyperparameters` when `LightningModule` is a dataclass ([#7992](https://github.com/PyTorchLightning/pytorch-lightning/pull/7992))
+
+
+- Add support for overriding `optimizer_zero_grad` and `optimizer_step` when using accumulate_grad_batches ([#7980](https://github.com/PyTorchLightning/pytorch-lightning/pull/7980))
+
+
 ### Changed
 
 
@@ -119,8 +135,10 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
     * Moved attributes `hiddens` and `split_idx` to TrainLoop ([#7507](https://github.com/PyTorchLightning/pytorch-lightning/pull/7507))
     * Refactored the logic around manual and automatic optimization inside the optimizer loop ([#7526](https://github.com/PyTorchLightning/pytorch-lightning/pull/7526))
     * Simplified "should run validation" logic ([#7682](https://github.com/PyTorchLightning/pytorch-lightning/pull/7682))
-    * Refactored "should run validation" logic when the trainer is signaled to stop ([#7701](https://github.com/PyTorchLightning/pytorch-lightning/pull/7701))
-
+    * Simplified logic for updating the learning rate for schedulers ([#7682](https://github.com/PyTorchLightning/pytorch-lightning/pull/7682))
+    * Removed the `on_epoch` guard from the "should stop" validation check ([#7701](https://github.com/PyTorchLightning/pytorch-lightning/pull/7701))
+    * Refactored internal loop interface; added new classes `FitLoop`, `TrainingEpochLoop`, `TrainingBatchLoop` ([#7871](https://github.com/PyTorchLightning/pytorch-lightning/pull/7871))
+    * Removed `pytorch_lightning/trainer/training_loop.py` ([#7985](https://github.com/PyTorchLightning/pytorch-lightning/pull/7985))
 
 - Refactored logging
     * Renamed and moved `core/step_result.py` to `trainer/connectors/logger_connector/result.py` ([#7736](https://github.com/PyTorchLightning/pytorch-lightning/pull/7736))
@@ -156,6 +174,8 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 
 - Changed `WandbLogger(log_model={True/'all'})` to log models as artifacts ([#6231](https://github.com/PyTorchLightning/pytorch-lightning/pull/6231))
+
+
 - MLFlowLogger now accepts `run_name` as an constructor argument ([#7622](https://github.com/PyTorchLightning/pytorch-lightning/issues/7622))
 
 
@@ -174,6 +194,12 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Added `on_load_checkpoint` and `on_save_checkpoint` hooks to the `PrecisionPlugin` base class ([#7831](https://github.com/PyTorchLightning/pytorch-lightning/pull/7831))
 
 
+- `LightningCLI` now aborts with a clearer message if config already exists and disables save config during `fast_dev_run`([#7963](https://github.com/PyTorchLightning/pytorch-lightning/pull/7963))
+
+
+- `Trainer(resume_from_checkpoint=...)` now restores the model directly after `LightningModule.setup()`, which is before `LightningModule.configure_sharded_model()` ([#7652](https://github.com/PyTorchLightning/pytorch-lightning/pull/7652))
+
+
 ### Deprecated
 
 
@@ -239,9 +265,18 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Fixed a bug where `precision=64` with `accelerator='ddp_spawn'` would throw a pickle error ([#6924](https://github.com/PyTorchLightning/pytorch-lightning/pull/6924))
 
 
+- Do not override the existing `epoch` value in `logged_metrics` when already logged by the user ([#7982](https://github.com/PyTorchLightning/pytorch-lightning/issues/7982))
+
+
+- Support manual optimization with DeepSpeed ([#7970](https://github.com/PyTorchLightning/pytorch-lightning/pull/7970))
+
+
 - Fixed `dataloader_idx` argument value when predicting with only one `DataLoader` ([#7941](https://github.com/PyTorchLightning/pytorch-lightning/pull/7941))
 
 
+- Pass the `stage` argument of `Callback.{setup,teardown}` as a keyword ([#7973](https://github.com/PyTorchLightning/pytorch-lightning/pull/7973))
+
+
 ## [1.3.6] - 2021-06-15
 
 ### Fixed
@@ -295,6 +330,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Fixed training loop total batch counter when accumulate grad batches was enabled ([#7692](https://github.com/PyTorchLightning/pytorch-lightning/pull/7692))
 
 
+- Fixed a bug where skipping an optimizer while using amp causes amp to trigger an assertion error ([#7975](https://github.com/PyTorchLightning/pytorch-lightning/pull/7975))
+
+
 ## [1.3.2] - 2021-05-18
 
 ### Changed
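
The `LearningRateMonitor` entry added above (#7987) refers to named optimizer parameter groups. Below is a minimal sketch of how such groups might be set up, assuming the callback picks up a custom "name" key placed in each parameter group; the module layout and group names are illustrative, not taken from this commit.

import torch
from torch import nn
import pytorch_lightning as pl
from pytorch_lightning.callbacks import LearningRateMonitor


class TwoGroupModel(pl.LightningModule):
    """Toy LightningModule with two parameter groups so their learning rates can be tracked separately."""

    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(32, 16)
        self.head = nn.Linear(16, 2)

    def configure_optimizers(self):
        # Extra keys in a param group are preserved by torch optimizers; the assumption
        # here is that LearningRateMonitor uses the "name" key when logging each group's lr.
        return torch.optim.SGD(
            [
                {"params": self.backbone.parameters(), "lr": 1e-3, "name": "backbone"},
                {"params": self.head.parameters(), "lr": 1e-2, "name": "head"},
            ],
            lr=1e-3,
        )


# Attach the callback as usual; each group's learning rate would then be logged under its own name.
trainer = pl.Trainer(callbacks=[LearningRateMonitor(logging_interval="step")], max_epochs=1)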

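Likewise, the `apply_to_collection` entry (#7935) means dataclass instances can now be traversed like dicts, lists, and tuples. A rough sketch under that assumption follows; the `Batch` dataclass is hypothetical and only serves to show the call.

from dataclasses import dataclass

import torch
from pytorch_lightning.utilities.apply_func import apply_to_collection


@dataclass
class Batch:
    # Hypothetical container; any dataclass whose fields hold tensors would do.
    inputs: torch.Tensor
    targets: torch.Tensor


batch = Batch(inputs=torch.zeros(4, 3), targets=torch.ones(4))

# Apply a function to every tensor inside the dataclass, just as for a dict or list,
# getting a new Batch back with the transformed fields.
doubled = apply_to_collection(batch, dtype=torch.Tensor, function=lambda t: t * 2)
print(doubled.targets)  # tensor([2., 2., 2., 2.])
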
MANIFEST.in

Lines changed: 1 addition & 1 deletion
@@ -61,7 +61,7 @@ exclude .pyrightconfig.json
 
 # Exclude submodules
 exclude .gitmodules
-exclude notebooks
+exclude _notebooks
 
 # Exclude Makefile
 exclude Makefile

_notebooks

Submodule _notebooks added at 3321b46

docs/source/advanced/amp.rst

Lines changed: 0 additions & 94 deletions
This file was deleted.
