Merged
7 changes: 7 additions & 0 deletions CHANGELOG.md
@@ -15,6 +15,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

### Changed


- Log epoch metrics before the `on_evaluation_end` hook ([#7272](https://github.com/PyTorchLightning/pytorch-lightning/pull/7272))


@@ -61,16 +62,22 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Removed deprecated trainer attributes - `get_model` and `accelerator_backend` ([#7502](https://github.com/PyTorchLightning/pytorch-lightning/pull/7502))


- Removed deprecated utils modules `model_utils`, `warning_utils`, `xla_device_utils` and partially `argparse_utils` ([#7503](https://github.com/PyTorchLightning/pytorch-lightning/pull/7503))


- Removed deprecated trainer attributes - `on_cpu`, `on_tpu`, `use_tpu`, `on_gpu`, `use_dp`, `use_ddp`, `use_ddp2`, `use_horovod`, `use_single_gpu` ([#7501](https://github.com/PyTorchLightning/pytorch-lightning/pull/7501))


### Fixed


- Fixed parsing of multiple training dataloaders ([#7433](https://github.com/PyTorchLightning/pytorch-lightning/pull/7433))


- Fixed recursive passing of `wrong_type` keyword argument in `pytorch_lightning.utilities.apply_to_collection` ([#7433](https://github.com/PyTorchLightning/pytorch-lightning/pull/7433))



## [1.3.1] - 2021-05-11

### Fixed
2 changes: 1 addition & 1 deletion dockers/tpu-tests/tpu_test_cases.jsonnet
@@ -22,7 +22,7 @@ local tputests = base.BaseTest {
|||
cd pytorch-lightning
coverage run --source=pytorch_lightning -m pytest -v --capture=no \
pytorch_lightning/utilities/xla_device_utils.py \
pytorch_lightning/utilities/xla_device.py \
tests/accelerators/test_tpu_backend.py \
tests/models/test_tpu.py
test_exit_code=$?
2 changes: 0 additions & 2 deletions pytorch_lightning/utilities/argparse_utils.py
@@ -2,8 +2,6 @@

rank_zero_deprecation("`argparse_utils` package has been renamed to `argparse` since v1.2 and will be removed in v1.4")

from pytorch_lightning.utilities.argparse import * # noqa: F403 E402 F401

# for backward compatibility with old checkpoints (versions < 1.2.0)
# that need to be able to unpickle the function from the checkpoint
from pytorch_lightning.utilities.argparse import _gpus_arg_default # noqa: E402 F401 # isort: skip
Comment on lines -5 to 7
Contributor
@carmocca In theory this should be fine because we still have the import below here for backward compatibility.
Unless we remove the argparse_utils.py file itself, we don't need to worry (yet).

Contributor
The problem here is that the deprecation tests have been removed but not the deprecation message.

And if we do, this will be forgotten for eternity haha

Is that okay?

Collaborator Author

I think the best approach would be to add a legacy-checkpoint test that covers this use case.
At the moment we only have a very simple legacy model to test with...
So I would merge this as it is and raise a priority issue to update the legacy checkpoint tests 🐰

Contributor @awaelchli (May 12, 2021)

> And if we do, this will be forgotten for eternity haha
>
> Is that okay?

No, I will never forget the pain it has caused me. 🤣

If you want, I can generate one of these checkpoints and we add a test that loads it? That way, if someone removes the line, the test is guaranteed to break.

Contributor

Nah, no need for that. If we spend time on it, it should be on figuring out the monkey patching.

Collaborator Author

Yes, can you prepare a model that covers this issue?
https://github.com/PyTorchLightning/pytorch-lightning/blob/master/legacy/zero_training.py
Then we can regenerate all the legacy checkpoints.

Contributor @awaelchli (May 12, 2021)

All we need to do is add `self.save_hyperparameters()` to the zero_training model and generate the default hparams from `Trainer.parse_argparse_args`. The problem is that `save_hyperparameters()` hasn't existed for very long and has changed over time, so I fear we won't be able to get this working for versions of Lightning that are too old. We may need multiple versions of the scripts as the Lightning feature set evolves over time.

Collaborator Author

We can regenerate just some "recent" checkpoints...
cc: @kaushikb11
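
The backward-compatibility constraint discussed in this thread can be demonstrated with plain `pickle`: a checkpoint that pickled a function stores only the function's import path, so the attribute must stay importable for old checkpoints to load. A minimal sketch (the module and function names below are illustrative stand-ins, not Lightning's real ones):

```python
import pickle
import sys
import types

# Toy stand-in for pytorch_lightning.utilities.argparse_utils; names
# here are illustrative, not the library's actual identifiers.
legacy_mod = types.ModuleType("legacy_mod")


def _gpus_default(value):
    return value


# Register the function under the toy module so pickle resolves it there.
_gpus_default.__module__ = "legacy_mod"
legacy_mod._gpus_default = _gpus_default
sys.modules["legacy_mod"] = legacy_mod

# Pickling the function records only "legacy_mod._gpus_default".
blob = pickle.dumps(legacy_mod._gpus_default)

# Unpickling works while the attribute is still importable...
assert pickle.loads(blob)(5) == 5

# ...but breaks as soon as the re-export is removed.
del legacy_mod._gpus_default
try:
    pickle.loads(blob)
except AttributeError:
    pass
else:
    raise AssertionError("expected AttributeError")
```

This is why the `_gpus_arg_default` import in argparse_utils.py must survive even after the deprecation tests are gone: removing it silently breaks unpickling of pre-1.2.0 checkpoints.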

7 changes: 0 additions & 7 deletions pytorch_lightning/utilities/model_utils.py

This file was deleted.

5 changes: 0 additions & 5 deletions pytorch_lightning/utilities/warning_utils.py

This file was deleted.

20 changes: 0 additions & 20 deletions pytorch_lightning/utilities/xla_device_utils.py

This file was deleted.

1 change: 0 additions & 1 deletion setup.cfg
@@ -44,7 +44,6 @@ exclude_lines =
# *metrics (94%+) are temporarily removed from testing while tests speed up
omit =
pytorch_lightning/cluster_environments/*.py
pytorch_lightning/utilities/xla_device_utils.py
pytorch_lightning/utilities/distributed.py
pytorch_lightning/tuner/auto_gpu_select.py

14 changes: 1 addition & 13 deletions tests/deprecated_api/test_remove_1-4.py
@@ -33,19 +33,7 @@
def test_v1_4_0_deprecated_imports():
_soft_unimport_module('pytorch_lightning.utilities.argparse_utils')
with pytest.deprecated_call(match='will be removed in v1.4'):
from pytorch_lightning.utilities.argparse_utils import from_argparse_args # noqa: F811 F401

_soft_unimport_module('pytorch_lightning.utilities.model_utils')
with pytest.deprecated_call(match='will be removed in v1.4'):
from pytorch_lightning.utilities.model_utils import is_overridden # noqa: F811 F401

_soft_unimport_module('pytorch_lightning.utilities.warning_utils')
with pytest.deprecated_call(match='will be removed in v1.4'):
from pytorch_lightning.utilities.warning_utils import WarningCache # noqa: F811 F401

_soft_unimport_module('pytorch_lightning.utilities.xla_device_utils')
with pytest.deprecated_call(match='will be removed in v1.4'):
from pytorch_lightning.utilities.xla_device_utils import XLADeviceUtils # noqa: F811 F401
from pytorch_lightning.utilities.argparse_utils import _gpus_arg_default # noqa: F811 F401


class CustomDDPPlugin(DDPSpawnPlugin):
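
The deprecation tests removed above rely on two pieces: evicting the shim module from `sys.modules` so a re-import re-runs its top-level code, and asserting that the import emits the deprecation warning. A rough sketch of that pattern using only the standard library (the helper names are stand-ins for Lightning's, and `warnings.catch_warnings` stands in for `pytest.deprecated_call`):

```python
import sys
import warnings


def _soft_unimport_module(name):
    # Stand-in for the test helper: evict a module from the import cache
    # so the next import re-executes its top-level code (and its warning).
    sys.modules.pop(name, None)


def rank_zero_deprecation(msg):
    # Minimal stand-in for Lightning's deprecation helper.
    warnings.warn(msg, DeprecationWarning)


# Roughly what pytest.deprecated_call(match='will be removed in v1.4') checks:
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    rank_zero_deprecation(
        "`argparse_utils` package has been renamed to `argparse` since v1.2 "
        "and will be removed in v1.4"
    )

assert len(caught) == 1
assert issubclass(caught[0].category, DeprecationWarning)
assert "will be removed in v1.4" in str(caught[0].message)

# Evicting the shim means a subsequent import would warn again.
_soft_unimport_module("pytorch_lightning.utilities.argparse_utils")
```

The `_soft_unimport_module` step matters because module-level warnings only fire when the module body executes; a cached module would make `pytest.deprecated_call` fail on the second test run.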