diff --git a/.github/PULL_REQUEST_TEMPLATE.md b/.github/PULL_REQUEST_TEMPLATE.md
index c2ce4a5e8bf26..ada6c6b8c62bd 100644
--- a/.github/PULL_REQUEST_TEMPLATE.md
+++ b/.github/PULL_REQUEST_TEMPLATE.md
@@ -16,26 +16,26 @@ If we didn't discuss your PR in Github issues there's a high chance it will not
 Fixes # (issue) <- this [links related issue to this PR](https://docs.github.com/en/free-pro-team@latest/github/managing-your-work-on-github/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword)
 
 ## Before submitting
-- [ ] Was this discussed/approved via a Github issue? (no need for typos and docs improvements)
-- [ ] Did you read the [contributor guideline](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/.github/CONTRIBUTING.md), Pull Request section?
+- [ ] Was this discussed/approved via a GitHub issue? (not for typos and docs)
+- [ ] Did you read the [contributor guideline](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/.github/CONTRIBUTING.md), **Pull Request** section?
 - [ ] Did you make sure your PR does only one thing, instead of bundling different changes together?
-- [ ] Did you make sure to update the documentation with your changes [if needed]?
-- [ ] Did you write any new necessary tests [no need for typos, docs]?
+- [ ] Did you make sure to update the documentation with your changes? (if necessary)
+- [ ] Did you write any new necessary tests? (not for typos and docs)
 - [ ] Did you verify new and existing tests pass locally with your changes?
-- [ ] If you made a notable change (that affects users), did you update the [CHANGELOG](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/CHANGELOG.md)?
+- [ ] Did you update the [CHANGELOG](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/CHANGELOG.md)? (not for typos, docs, test updates, or internal minor changes/refactorings)
 
 ## PR review
 Anyone in the community is free to review the PR once the tests have passed.
-Before you start reviewing make sure you have read [Review guidelines](https://github.com/PyTorchLightning/pytorch-lightning/wiki/Review-guidelines). In short, see the following bullet-list:
+Before you start reviewing make sure you have read [Review guidelines](https://github.com/PyTorchLightning/pytorch-lightning/wiki/Review-guidelines). In short, see the following bullet-list:
 
 - [ ] Is this pull request ready for review? (if not, please submit in draft mode)
 - [ ] Check that all items from **Before submitting** are resolved
 - [ ] Make sure the title is self-explanatory and the description concisely explains the PR
 - [ ] Add labels and milestones (and optionally projects) to the PR so it can be classified
-  - [ ] **Check that target branch and milestone are aligned!**
-
+  - [ ] **Check that target branch and milestone match!**
+
 
 ## Did you have fun?
 
 Make sure you had fun coding 🙃
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 8bb0d31169b87..869f71f5cc3c0 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -9,28 +9,25 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 ### Added
 
-- Added a check for optimizer attached to lr_scheduler ([#5338](https://github.com/PyTorchLightning/pytorch-lightning/pull/5338))
-
-- Added `resume_from_checkpoint` accept non-existing file path ([#4402](https://github.com/PyTorchLightning/pytorch-lightning/pull/4402))
-
+- Added a check for optimizer attached to `lr_scheduler` ([#5338](https://github.com/PyTorchLightning/pytorch-lightning/pull/5338))
+- Added support for passing non-existing filepaths to `resume_from_checkpoint` ([#4402](https://github.com/PyTorchLightning/pytorch-lightning/pull/4402))
 
 ### Changed
 
-
-### Deprecated
-
-
-### Removed
-
-
-### Fixed
-
-- Skip restore from `resume_from_checkpoint` in while `testing` ([#5161](https://github.com/PyTorchLightning/pytorch-lightning/pull/5161))
-
+- Skip restore from `resume_from_checkpoint` while `testing` ([#5161](https://github.com/PyTorchLightning/pytorch-lightning/pull/5161))
 - Allowed `log_momentum` for adaptive optimizers in `LearningRateMonitor` ([#5333](https://github.com/PyTorchLightning/pytorch-lightning/pull/5333))
+- Disabled checkpointing, earlystopping and logging with `fast_dev_run` ([#5277](https://github.com/PyTorchLightning/pytorch-lightning/pull/5277))
+- Distributed group defaults to `WORLD` if `None` ([#5125](https://github.com/PyTorchLightning/pytorch-lightning/pull/5125))
 
-- Disabled checkpointing, earlystopping and logger with `fast_dev_run` ([#5277](https://github.com/PyTorchLightning/pytorch-lightning/pull/5277))
+### Fixed
 
+- Fixed `trainer.test` returning non-test metrics ([#5214](https://github.com/PyTorchLightning/pytorch-lightning/pull/5214))
+- Fixed metric state reset ([#5273](https://github.com/PyTorchLightning/pytorch-lightning/pull/5273))
+- Fixed `--num-nodes` on `DDPSequentialPlugin` ([#5327](https://github.com/PyTorchLightning/pytorch-lightning/pull/5327))
+- Fixed invalid value for `weights_summary` ([#5296](https://github.com/PyTorchLightning/pytorch-lightning/pull/5296))
+- Fixed `Trainer.test` not using the latest `best_model_path` ([#5161](https://github.com/PyTorchLightning/pytorch-lightning/pull/5161))
+- Fixed existence check for hparams not using underlying filesystem ([#5250](https://github.com/PyTorchLightning/pytorch-lightning/pull/5250))
+- Fixed `LightningOptimizer` AMP bug ([#5191](https://github.com/PyTorchLightning/pytorch-lightning/pull/5191))
 - Fixed casted key to string in `_flatten_dict` ([#5354](https://github.com/PyTorchLightning/pytorch-lightning/pull/5354))
diff --git a/pl_examples/basic_examples/mnist_datamodule.py b/pl_examples/basic_examples/mnist_datamodule.py
index 95e20d22e1fdd..27a7590b64ee9 100644
--- a/pl_examples/basic_examples/mnist_datamodule.py
+++ b/pl_examples/basic_examples/mnist_datamodule.py
@@ -11,7 +11,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-
+import platform
 from typing import Optional
 
 from torch.utils.data import DataLoader, random_split
@@ -55,6 +55,9 @@ def __init__(
             normalize: If true applies image normalize
         """
         super().__init__(*args, **kwargs)
+        if platform.system() == "Windows":
+            # see: https://stackoverflow.com/a/59680818/4521646
+            num_workers = 0
 
         self.dims = (1, 28, 28)
         self.data_dir = data_dir
diff --git a/pytorch_lightning/__init__.py b/pytorch_lightning/__init__.py
index d1da4da1963ac..5f7ae6bdee9d2 100644
--- a/pytorch_lightning/__init__.py
+++ b/pytorch_lightning/__init__.py
@@ -1,6 +1,6 @@
 """Root package info."""
 
-__version__ = '1.1.2'
+__version__ = '1.1.3'
 __author__ = 'William Falcon et al.'
 __author_email__ = 'waf2107@columbia.edu'
 __license__ = 'Apache-2.0'
diff --git a/pytorch_lightning/plugins/rpc_plugin.py b/pytorch_lightning/plugins/rpc_plugin.py
index a1464f3c70e0b..223a1f0a13110 100644
--- a/pytorch_lightning/plugins/rpc_plugin.py
+++ b/pytorch_lightning/plugins/rpc_plugin.py
@@ -12,18 +12,19 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 import os
+from contextlib import suppress
 from typing import Optional
 
 import torch
 
 from pytorch_lightning.core.lightning import LightningModule
 from pytorch_lightning.plugins.ddp_plugin import DDPPlugin
-from pytorch_lightning.utilities import _module_available, RPC_AVAILABLE
+from pytorch_lightning.utilities import RPC_AVAILABLE
 
 DEFAULT_RPC_TIMEOUT_SEC = 60.
 if RPC_AVAILABLE:
     from torch.distributed import rpc
-    if _module_available("torch.distributed.rpc.constants") and hasattr(torch.distributed.rpc.constants, "DEFAULT_RPC_TIMEOUT_SEC"):
+    with suppress(ModuleNotFoundError, ImportError):
         from torch.distributed.rpc.constants import DEFAULT_RPC_TIMEOUT_SEC
diff --git a/tests/checkpointing/test_model_checkpoint.py b/tests/checkpointing/test_model_checkpoint.py
index 3adb45c0b1869..8d4a859a88784 100644
--- a/tests/checkpointing/test_model_checkpoint.py
+++ b/tests/checkpointing/test_model_checkpoint.py
@@ -11,20 +11,20 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from argparse import Namespace
 import os
-from pathlib import Path
 import pickle
 import platform
 import re
+from argparse import Namespace
+from pathlib import Path
 from unittest import mock
 from unittest.mock import Mock
 
 import cloudpickle
-from omegaconf import Container, OmegaConf
 import pytest
 import torch
 import yaml
+from omegaconf import Container, OmegaConf
 
 import pytorch_lightning as pl
 import tests.base.develop_utils as tutils
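Note on the `mnist_datamodule.py` hunk: on Windows, `DataLoader` worker processes are spawned rather than forked, which can fail when loader internals are not picklable, so the patch forces `num_workers = 0` there. Below is a minimal standalone sketch of the same guard; `RangeDataset` and `make_loader` are illustrative names, not part of the patch:

```python
import platform

from torch.utils.data import DataLoader, Dataset


class RangeDataset(Dataset):
    """Tiny illustrative dataset; stands in for MNIST here."""

    def __len__(self):
        return 8

    def __getitem__(self, idx):
        return idx


def make_loader(dataset: Dataset, num_workers: int = 4) -> DataLoader:
    # Windows spawns (rather than forks) worker processes, which can break
    # pickling of loader state; fall back to loading in the main process.
    # See https://stackoverflow.com/a/59680818/4521646
    if platform.system() == "Windows":
        num_workers = 0
    return DataLoader(dataset, batch_size=4, num_workers=num_workers)


if __name__ == "__main__":
    print(next(iter(make_loader(RangeDataset()))))  # tensor([0, 1, 2, 3])
```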
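Note on the `rpc_plugin.py` hunk: it replaces an explicit `_module_available(...) and hasattr(...)` probe with `contextlib.suppress`, i.e. bind a safe default first, then let the optional import override it only if it succeeds. A self-contained sketch of that pattern, under the same assumption as the patch that `torch.distributed.rpc.constants` may or may not be importable:

```python
from contextlib import suppress

# Bind a safe default first so the name always exists.
DEFAULT_RPC_TIMEOUT_SEC = 60.

# If the constants module is importable, its value wins; otherwise the
# default above stands. ModuleNotFoundError is a subclass of ImportError,
# so listing both is redundant but mirrors the patch exactly.
with suppress(ModuleNotFoundError, ImportError):
    from torch.distributed.rpc.constants import DEFAULT_RPC_TIMEOUT_SEC

print(DEFAULT_RPC_TIMEOUT_SEC)
```

The upside of this pattern over the probe it replaces is that the import itself is the availability check, so the constant's location can move between torch versions without the guard going stale.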