
Commit f78fa8d

Merge branch 'master' into fix_module_flatten
2 parents: b2e45cb + 9fbe724

File tree

2 files changed: +9 -12 lines
- CHANGELOG.md
- tests/checkpointing/test_legacy_checkpoints.py

CHANGELOG.md

Lines changed: 8 additions & 12 deletions
```diff
@@ -180,9 +180,6 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Set better defaults for `rank_zero_only.rank` when training is launched with SLURM and torchelastic ([#6802](https://github.com/PyTorchLightning/pytorch-lightning/pull/6802/))
 
 
-- Sanitize `None` params during pruning ([#6836](https://github.com/PyTorchLightning/pytorch-lightning/pull/6836))
-
-
 - Made the `Plugin.reduce` method more consistent across all Plugins to reflect a mean-reduction by default ([#6011](https://github.com/PyTorchLightning/pytorch-lightning/pull/6011))
 
 
@@ -210,21 +207,21 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Fixed torch distributed not available in setup hook for DDP ([#6506](https://github.com/PyTorchLightning/pytorch-lightning/pull/6506))
 
 
-- Fixed TPU Colab hang issue, post training ([#6816](https://github.com/PyTorchLightning/pytorch-lightning/pull/6816))
-
-
-- Enforce an epoch scheduler interval when using SWA ([#6588](https://github.com/PyTorchLightning/pytorch-lightning/pull/6588))
-
-
-- Fixed an issue with `IterableDataset` when `__len__` is not defined ([#6828](https://github.com/PyTorchLightning/pytorch-lightning/pull/6828))
+- Fixed bug where `BaseFinetuning.flatten_modules()` was duplicating leaf node parameters ([#6879](https://github.com/PyTorchLightning/pytorch-lightning/pull/6879))
 
 
 - Fixed `EarlyStopping` logic when `min_epochs` or `min_steps` requirement is not met ([#6705](https://github.com/PyTorchLightning/pytorch-lightning/pull/6705))
 
 
-- Fixed bug where `BaseFinetuning.flatten_modules()` was duplicating leaf node parameters ([#6879](https://github.com/PyTorchLightning/pytorch-lightning/pull/6879))
+## [1.2.7] - 2021-04-06
 
+### Fixed
 
+- Fixed resolve a bug with omegaconf and xm.save ([#6741](https://github.com/PyTorchLightning/pytorch-lightning/pull/6741))
+- Fixed an issue with IterableDataset when __len__ is not defined ([#6828](https://github.com/PyTorchLightning/pytorch-lightning/pull/6828))
+- Sanitize None params during pruning ([#6836](https://github.com/PyTorchLightning/pytorch-lightning/pull/6836))
+- Enforce an epoch scheduler interval when using SWA ([#6588](https://github.com/PyTorchLightning/pytorch-lightning/pull/6588))
+- Fixed TPU Colab hang issue, post training ([#6816](https://github.com/PyTorchLightning/pytorch-lightning/pull/6816))
 - Fixed a bug where `TensorBoardLogger` would give a warning and not log correctly to a symbolic link `save_dir` ([#6730](https://github.com/PyTorchLightning/pytorch-lightning/pull/6730))
 
 
@@ -260,7 +257,6 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Fixed comparing required versions ([#6434](https://github.com/PyTorchLightning/pytorch-lightning/pull/6434))
 - Fixed duplicate logs appearing in console when using the python logging module ([#6275](https://github.com/PyTorchLightning/pytorch-lightning/pull/6275))
 - Added Autocast in validation, test and predict modes for Native AMP ([#6565](https://github.com/PyTorchLightning/pytorch-lightning/pull/6565))
-- Fixed resolve a bug with omegaconf and xm.save ([#6741](https://github.com/PyTorchLightning/pytorch-lightning/pull/6741))
 
 
 ## [1.2.4] - 2021-03-16
```
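For context on the `BaseFinetuning.flatten_modules()` entry above, here is a minimal sketch of the leaf-module idea, assuming the duplication came from container modules re-exposing their children's parameters. The helper `flatten_leaf_modules` is a hypothetical stand-in, not the library API:

```python
import torch.nn as nn

def flatten_leaf_modules(module: nn.Module):
    """Collect only leaf modules (those with no children), so that each
    parameter appears exactly once when iterating over the result.

    Hypothetical helper mirroring the idea behind
    ``BaseFinetuning.flatten_modules()``; not the library code.
    """
    return [m for m in module.modules() if not list(m.children())]

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
leaves = flatten_leaf_modules(model)
# Without the leaf filter, the nn.Sequential container itself would be kept,
# and its parameters would be counted again on top of each child's.
params = [p for m in leaves for p in m.parameters()]
assert len(params) == len(list(model.parameters()))  # no duplicates
```

Filtering to modules without children guarantees each parameter is visited exactly once, which is the invariant the changelog entry for [#6879](https://github.com/PyTorchLightning/pytorch-lightning/pull/6879) describes restoring.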

tests/checkpointing/test_legacy_checkpoints.py

Lines changed: 1 addition & 0 deletions
```diff
@@ -59,6 +59,7 @@
         "1.2.4",
         "1.2.5",
         "1.2.6",
+        "1.2.7",
     ]
 )
 def test_resume_legacy_checkpoints(tmpdir, pl_version: str):
```
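Since the diff shows only a fragment of the parametrization, here is a sketch of how the changed line sits in the test, assuming the standard `pytest.mark.parametrize` layout; the decorator arguments and the elided body are reconstructed, not shown in this diff:

```python
import pytest

@pytest.mark.parametrize(
    "pl_version",
    [
        # ...earlier releases elided...
        "1.2.4",
        "1.2.5",
        "1.2.6",
        "1.2.7",  # added by this commit so 1.2.7 checkpoints are exercised
    ]
)
def test_resume_legacy_checkpoints(tmpdir, pl_version: str):
    # Body unchanged by this commit; assumed to load and resume a
    # checkpoint saved by the given `pl_version`.
    ...
```

Each new release is appended to this list so that checkpoints written by that version keep loading in later ones.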
