CHANGELOG.md (27 changes: 9 additions & 18 deletions)

@@ -4,20 +4,8 @@ All notable changes to this project will be documented in this file.
 
 The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
-## [unreleased] - YYYY-MM-DD
-
-### Added
-
-### Changed
-
-### Deprecated
-
-### Removed
-
-### Fixed
-
-
-## [0.8.0] - 2020-06-DD
+## [0.8.0] - 2020-06-18
 
 ### Added
 
@@ -40,6 +28,8 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Added loading checkpoints from URLs ([#1667](https://github.com/PyTorchLightning/pytorch-lightning/issues/1667))
 - Added a callback method `on_keyboard_interrupt` for handling KeyboardInterrupt events during training ([#2134](https://github.com/PyTorchLightning/pytorch-lightning/pull/2134))
 - Added a decorator `auto_move_data` that moves data to the correct device when using the LightningModule for inference ([#1905](https://github.com/PyTorchLightning/pytorch-lightning/pull/1905))
+- Added `ckpt_path` option to `LightningModule.test(...)` to load particular checkpoint ([#2190](https://github.com/PyTorchLightning/pytorch-lightning/issues/2190))
+- Added `setup` and `teardown` hooks for model ([#2229](https://github.com/PyTorchLightning/pytorch-lightning/issues/2229))
 
 ### Changed
 
@@ -52,14 +42,14 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Raise an error when lightning replaces an existing sampler ([#2020](https://github.com/PyTorchLightning/pytorch-lightning/pull/2020))
 - Enabled prepare_data from correct processes - clarify local vs global rank ([#2166](https://github.com/PyTorchLightning/pytorch-lightning/pull/2166))
 - Remove explicit flush from tensorboard logger ([#2126](https://github.com/PyTorchLightning/pytorch-lightning/pull/2126))
-- Changed epoch/step indexing from 1 instead of 0 ([#2206](https://github.com/PyTorchLightning/pytorch-lightning/pull/2206))
+- Changed epoch indexing from 1 instead of 0 ([#2206](https://github.com/PyTorchLightning/pytorch-lightning/pull/2206))
 
 ### Deprecated
 
 - Deprecated flags: ([#2213](https://github.com/PyTorchLightning/pytorch-lightning/pull/2213))
-  * `overfit_pct` >> `overfit_batches`
-  * `val_percent_check` >> `limit_val_batches`
-  * `test_percent_check` >> `limit_test_batches`
+  * `overfit_pct` in favour of `overfit_batches`
+  * `val_percent_check` in favour of `limit_val_batches`
+  * `test_percent_check` in favour of `limit_test_batches`
 - Deprecated `ModelCheckpoint`'s attributes `best` and `kth_best_model` ([#1799](https://github.com/PyTorchLightning/pytorch-lightning/pull/1799))
 - Dropped official support/testing for older PyTorch versions <1.3 ([#1917](https://github.com/PyTorchLightning/pytorch-lightning/pull/1917))
 
@@ -89,8 +79,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Fixed an issue with `_auto_collect_arguments` collecting local variables that are not constructor arguments and not working for signatures that have the instance not named `self` ([#2048](https://github.com/PyTorchLightning/pytorch-lightning/pull/2048))
 - Fixed mistake in parameters' grad norm tracking ([#2012](https://github.com/PyTorchLightning/pytorch-lightning/pull/2012))
 - Fixed CPU and hanging GPU crash ([#2118](https://github.com/PyTorchLightning/pytorch-lightning/pull/2118))
-
 - Fixed an issue with the model summary and `example_input_array` depending on a specific ordering of the submodules in a LightningModule ([#1773](https://github.com/PyTorchLightning/pytorch-lightning/pull/1773))
+- Fixed Tpu logging ([#2230](https://github.com/PyTorchLightning/pytorch-lightning/pull/2230))
+- Fixed Pid port + duplicate `rank_zero` logging ([#2140](https://github.com/PyTorchLightning/pytorch-lightning/pull/2140), [#2231](https://github.com/PyTorchLightning/pytorch-lightning/pull/2231))
 
 ## [0.7.6] - 2020-05-16
 
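The entries above rename several Trainer flags and add a `ckpt_path` argument to `test(...)`. Below is a minimal sketch of what a 0.8.0-style script might look like; the model (`LitRegression`), its data, and the flag values are invented for illustration, and the assumption that `ckpt_path` accepts `'best'` in addition to an explicit checkpoint path is flagged in the comments.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

import pytorch_lightning as pl


class LitRegression(pl.LightningModule):
    """Tiny illustrative model; not part of the release itself."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return {'loss': torch.nn.functional.mse_loss(self(x), y)}

    def validation_step(self, batch, batch_idx):
        x, y = batch
        return {'val_loss': torch.nn.functional.mse_loss(self(x), y)}

    def validation_epoch_end(self, outputs):
        # Expose val_loss so checkpointing has something to monitor.
        return {'val_loss': torch.stack([o['val_loss'] for o in outputs]).mean()}

    def test_step(self, batch, batch_idx):
        x, y = batch
        return {'test_loss': torch.nn.functional.mse_loss(self(x), y)}

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.01)

    def _random_loader(self):
        x, y = torch.randn(64, 4), torch.randn(64, 1)
        return DataLoader(TensorDataset(x, y), batch_size=8)

    def train_dataloader(self):
        return self._random_loader()

    def val_dataloader(self):
        return self._random_loader()

    def test_dataloader(self):
        return self._random_loader()


# The 0.8.0 names replacing the flags deprecated by #2213.
trainer = pl.Trainer(
    max_epochs=2,
    overfit_batches=0.0,      # replaces overfit_pct
    limit_val_batches=0.5,    # replaces val_percent_check
    limit_test_batches=0.5,   # replaces test_percent_check
)
trainer.fit(LitRegression())

# ckpt_path (#2190) restores a particular checkpoint before testing; 'best' is
# assumed here to select the best checkpoint saved during fit.
trainer.test(ckpt_path='best')
```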
README.md (2 changes: 2 additions & 0 deletions)

@@ -47,6 +47,8 @@ conda install pytorch-lightning -c conda-forge
 
 ## Docs
 - [master](https://pytorch-lightning.readthedocs.io/en/latest)
+- [stable](https://pytorch-lightning.readthedocs.io/en/stable)
+- [0.8.0](https://pytorch-lightning.readthedocs.io/en/0.8.0/)
 - [0.7.6](https://pytorch-lightning.readthedocs.io/en/0.7.6/)
 - [0.7.5](https://pytorch-lightning.readthedocs.io/en/0.7.5/)
 - [0.7.3](https://pytorch-lightning.readthedocs.io/en/0.7.3/)
pytorch_lightning/__init__.py (7 changes: 3 additions & 4 deletions)

@@ -1,6 +1,6 @@
 """Root package info."""
 
-__version__ = '0.8.0rc4'
+__version__ = '0.8.0'
 __author__ = 'William Falcon et al.'
 __author_email__ = '[email protected]'
 __license__ = 'Apache-2.0'
@@ -50,11 +50,10 @@
     sys.stdout.write(f'Partial import of `{__name__}` during the build process.\n')  # pragma: no-cover
     # We are not importing the rest of the lightning during the build process, as it may not be compiled yet
 else:
-    from pytorch_lightning.core import LightningModule
+    from pytorch_lightning.core import LightningModule, data_loader
+    from pytorch_lightning.callbacks import Callback
     from pytorch_lightning.trainer import Trainer
     from pytorch_lightning.utilities.seed import seed_everything
-    from pytorch_lightning.callbacks import Callback
-    from pytorch_lightning.core import data_loader
 
 __all__ = [
     'Trainer',
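The import reshuffle above also documents what can be pulled straight from the package root. A short usage sketch (the seed value is arbitrary):

```python
from pytorch_lightning import Callback, LightningModule, Trainer, seed_everything

# seed_everything seeds Python, NumPy and PyTorch RNGs in one call,
# which makes runs reproducible before a Trainer is even built.
seed_everything(42)
```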
pytorch_lightning/callbacks/base.py (4 changes: 2 additions & 2 deletions)

@@ -14,11 +14,11 @@ class Callback(abc.ABC):
     Abstract base class used to build new callbacks.
     """
 
-    def setup(self, trainer, step: str):
+    def setup(self, trainer, stage: str):
         """Called when fit or test begins"""
         pass
 
-    def teardown(self, trainer, step: str):
+    def teardown(self, trainer, stage: str):
         """Called when fit or test ends"""
         pass
 
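The only change here is the rename of the hook argument from `step` to `stage`. A hypothetical callback written against the new signature (the class name and timing logic are illustrative, not part of the PR):

```python
import time

from pytorch_lightning.callbacks import Callback


class TimerCallback(Callback):
    """Hypothetical callback using the renamed `stage` argument ('fit' or 'test')."""

    def setup(self, trainer, stage: str):
        # Called once when fit or test begins.
        self._start = time.time()

    def teardown(self, trainer, stage: str):
        # Called once when fit or test ends.
        print(f'{stage} took {time.time() - self._start:.1f} seconds')
```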
pytorch_lightning/core/hooks.py (4 changes: 2 additions & 2 deletions)

@@ -17,15 +17,15 @@
 
 class ModelHooks(Module):
 
-    def setup(self, step: str):
+    def setup(self, stage: str):
         """
         Called at the beginning of fit and test.
 
         Args:
             step: either 'fit' or 'test'
         """
 
-    def teardown(self, step: str):
+    def teardown(self, stage: str):
         """
         Called at the end of fit and test.
 
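`ModelHooks` gets the same rename, so a LightningModule can override `setup(stage)` and `teardown(stage)` directly. A sketch of a hypothetical module that builds its datasets per stage; the dataset shapes and attribute names are made up for illustration:

```python
import torch
from torch.utils.data import TensorDataset

import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    """Hypothetical module overriding the new model-level hooks (#2229)."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(16, 1)

    def setup(self, stage: str):
        # stage is either 'fit' or 'test', mirroring the docstring above.
        if stage == 'fit':
            self.train_set = TensorDataset(torch.randn(256, 16), torch.randn(256, 1))
        elif stage == 'test':
            self.test_set = TensorDataset(torch.randn(64, 16), torch.randn(64, 1))

    def teardown(self, stage: str):
        # Called at the end of fit or test; release anything built in setup().
        self.train_set = None
        self.test_set = None
```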
pytorch_lightning/trainer/callback_hook.py (8 changes: 4 additions & 4 deletions)

@@ -11,15 +11,15 @@ class TrainerCallbackHookMixin(ABC):
     callbacks: List[Callback] = []
     get_model: Callable = ...
 
-    def setup(self, step: str):
+    def setup(self, stage: str):
         """Called in the beginning of fit and test"""
         for callback in self.callbacks:
-            callback.setup(self, step)
+            callback.setup(self, stage)
 
-    def teardown(self, step: str):
+    def teardown(self, stage: str):
         """Called at the end of fit and test"""
         for callback in self.callbacks:
-            callback.teardown(self, step)
+            callback.teardown(self, stage)
 
     def on_init_start(self):
         """Called when the trainer initialization begins, model has not yet been set."""
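Seen from user code, the mixin above simply fans the stage string out to every registered callback. A small sketch under the assumption that `fit()` and `test()` trigger `setup`/`teardown` with `'fit'` and `'test'` respectively, as the hook docstrings state; the callback itself is hypothetical:

```python
import pytorch_lightning as pl
from pytorch_lightning.callbacks import Callback


class PrintStage(Callback):
    """Hypothetical callback that only reports which stage it is given."""

    def setup(self, trainer, stage: str):
        print(f'callback setup: {stage}')

    def teardown(self, trainer, stage: str):
        print(f'callback teardown: {stage}')


# TrainerCallbackHookMixin loops over trainer.callbacks, so every registered
# callback sees the same stage value once per fit or test run.
trainer = pl.Trainer(callbacks=[PrintStage()])
```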