6 changes: 3 additions & 3 deletions CHANGELOG.md
@@ -96,9 +96,6 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

- Raise an error if there are insufficient training batches when using a float value of `limit_train_batches` ([#12885](https://github.com/PyTorchLightning/pytorch-lightning/pull/12885))

-
-- Changed `pytorch_lightning.core.lightning` to `pytorch_lightning.core.module` ([#12740](https://github.com/PyTorchLightning/pytorch-lightning/pull/12740))
-
### Deprecated

- Deprecated `pytorch_lightning.loggers.base.LightningLoggerBase` in favor of `pytorch_lightning.loggers.logger.Logger`, and deprecated `pytorch_lightning.loggers.base` in favor of `pytorch_lightning.loggers.logger` ([#12014](https://github.com/PyTorchLightning/pytorch-lightning/pull/12014))
@@ -116,6 +113,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Deprecated `pytorch_lightning.core.lightning.LightningModule` in favor of `pytorch_lightning.core.module.LightningModule` ([#12740](https://github.com/PyTorchLightning/pytorch-lightning/pull/12740))

+
+- Deprecated `pytorch_lightning.loops.base.Loop` in favor of `pytorch_lightning.loops.loop.Loop` ([#13043](https://github.com/PyTorchLightning/pytorch-lightning/pull/13043))
+

- Deprecated `Trainer.reset_train_val_dataloaders()` in favor of `Trainer.reset_{train,val}_dataloader` ([#12184](https://github.com/PyTorchLightning/pytorch-lightning/pull/12184))

### Removed
22 changes: 11 additions & 11 deletions docs/source/extensions/loops.rst
@@ -81,7 +81,7 @@ The core research logic is simply shifted to the :class:`~pytorch_lightning.core
loss.backward()
optimizer.step()

-Under the hood, the above loop is implemented using the :class:`~pytorch_lightning.loops.base.Loop` API like so:
+Under the hood, the above loop is implemented using the :class:`~pytorch_lightning.loops.loop.Loop` API like so:

.. code-block:: python

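    # NOTE: the body of this block is collapsed in the diff view. What follows is
    # an illustrative sketch only (not the file's actual content), assuming the
    # standard Loop interface of `reset`, `advance` and `done`; all names below
    # are hypothetical.
    class MyOptimizationLoop(Loop):
        @property
        def done(self) -> bool:
            # stop once every batch has been consumed
            return self.batch_idx >= len(self.dataloader)

        def reset(self) -> None:
            self.batch_idx = 0
            self._iterator = iter(self.dataloader)

        def advance(self) -> None:
            # one optimization step: forward/backward, then parameter update
            batch = next(self._iterator)
            loss = self.model.training_step(batch, self.batch_idx)
            loss.backward()
            self.optimizer.step()
            self.optimizer.zero_grad()
            self.batch_idx += 1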
@@ -183,7 +183,7 @@ Now your code is FULLY flexible and you can still leverage ALL the best parts of
Creating a New Loop From Scratch
--------------------------------

-You can also go wild and implement a full loop from scratch by sub-classing the :class:`~pytorch_lightning.loops.base.Loop` base class.
+You can also go wild and implement a full loop from scratch by sub-classing the :class:`~pytorch_lightning.loops.loop.Loop` base class.
You will need to override a minimum of two things:

.. code-block:: python
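
    # Collapsed in the diff view. A minimal sketch of the two required
    # overrides, assuming the interface documented under "Loop API" below
    # (the class name is hypothetical):
    from pytorch_lightning.loops.loop import Loop


    class MyFancyLoop(Loop):
        @property
        def done(self):
            """Provide a condition to stop the loop."""

        def advance(self, *args, **kwargs):
            """Access your dataloader/batch here and implement the logic
            for a single iteration of the loop."""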
@@ -222,7 +222,7 @@ Loop API
--------
Here is the full API of methods available in the Loop base class.

-The :class:`~pytorch_lightning.loops.base.Loop` class is the base of all loops in the same way as the :class:`~pytorch_lightning.core.module.LightningModule` is the base of all models.
+The :class:`~pytorch_lightning.loops.loop.Loop` class is the base of all loops in the same way as the :class:`~pytorch_lightning.core.module.LightningModule` is the base of all models.
It defines a public interface that each loop implementation must follow; the key ones are:

Properties
@@ -231,13 +231,13 @@ Properties
done
~~~~

-.. autoattribute:: pytorch_lightning.loops.base.Loop.done
+.. autoattribute:: pytorch_lightning.loops.loop.Loop.done
:noindex:

skip (optional)
~~~~~~~~~~~~~~~

-.. autoattribute:: pytorch_lightning.loops.base.Loop.skip
+.. autoattribute:: pytorch_lightning.loops.loop.Loop.skip
:noindex:

Methods
@@ -246,19 +246,19 @@ Methods
reset (optional)
~~~~~~~~~~~~~~~~

-.. automethod:: pytorch_lightning.loops.base.Loop.reset
+.. automethod:: pytorch_lightning.loops.loop.Loop.reset
:noindex:

advance
~~~~~~~

-.. automethod:: pytorch_lightning.loops.base.Loop.advance
+.. automethod:: pytorch_lightning.loops.loop.Loop.advance
:noindex:

run (optional)
~~~~~~~~~~~~~~

-.. automethod:: pytorch_lightning.loops.base.Loop.run
+.. automethod:: pytorch_lightning.loops.loop.Loop.run
:noindex:

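To see how these pieces fit together, the following is a hedged paraphrase of how a ``Loop.run`` default typically composes ``skip``, ``reset``, ``done``, and ``advance`` (illustrative pseudocode, not the exact source, which also invokes start/end hooks and handles ``StopIteration``):

.. code-block:: python

    def run(self, *args, **kwargs):
        if self.skip:
            return self.on_skip()  # bypass the loop entirely
        self.reset()
        while not self.done:
            self.advance(*args, **kwargs)  # one unit of work per iteration
        return self.on_run_end()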

@@ -267,7 +267,7 @@ run (optional)
Subloops
--------

-When you want to customize nested loops within loops, use the :meth:`~pytorch_lightning.loops.base.Loop.replace` method:
+When you want to customize nested loops within loops, use the :meth:`~pytorch_lightning.loops.loop.Loop.replace` method:

.. code-block:: python

@@ -276,7 +276,7 @@ When you want to customize nested loops within loops, use the :meth:`~pytorch_li
# Trainer runs the fit loop with your new epoch loop!
trainer.fit(model)
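
The top of this code block is collapsed in the diff view; a hedged sketch of the call that would precede the two lines above, where ``MyEpochLoop`` stands in for a user-defined epoch-loop subclass:

.. code-block:: python

    # Sketch only: `replace` instantiates the given loop class and wires up
    # the trainer references for you.
    trainer.fit_loop.replace(epoch_loop=MyEpochLoop)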

-Alternatively, for more fine-grained control, use the :meth:`~pytorch_lightning.loops.base.Loop.connect` method:
+Alternatively, for more fine-grained control, use the :meth:`~pytorch_lightning.loops.loop.Loop.connect` method:

.. code-block:: python

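    # Collapsed in the diff view; a hedged sketch of `connect` usage. Unlike
    # `replace`, you instantiate the custom loop yourself and attach it
    # explicitly (`MyEpochLoop` and the child-loop wiring are illustrative):
    epoch_loop = MyEpochLoop()
    epoch_loop.connect(
        batch_loop=trainer.fit_loop.epoch_loop.batch_loop,
        val_loop=trainer.fit_loop.epoch_loop.val_loop,
    )
    trainer.fit_loop.connect(epoch_loop=epoch_loop)
    trainer.fit(model)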
@@ -326,7 +326,7 @@ Here is what the structure would look like in plain Python:
...


-Each of these :code:`for`-loops represents a class implementing the :class:`~pytorch_lightning.loops.base.Loop` interface.
+Each of these :code:`for`-loops represents a class implementing the :class:`~pytorch_lightning.loops.loop.Loop` interface.


.. list-table:: Trainer entry points and associated loops
6 changes: 3 additions & 3 deletions docs/source/extensions/loops_advanced.rst
@@ -18,7 +18,7 @@ A powerful property of the class-based loop interface is that it can own an inte
Loop instances can save their state to the checkpoint through corresponding hooks and, if implemented accordingly, resume execution at the appropriate place.
This design is particularly interesting for fault-tolerant training, an experimental feature released in Lightning v1.5.

-The two hooks :meth:`~pytorch_lightning.loops.base.Loop.on_save_checkpoint` and :meth:`~pytorch_lightning.loops.base.Loop.on_load_checkpoint` function very similarly to how LightningModules and Callbacks save and load state.
+The two hooks :meth:`~pytorch_lightning.loops.loop.Loop.on_save_checkpoint` and :meth:`~pytorch_lightning.loops.loop.Loop.on_load_checkpoint` function very similarly to how LightningModules and Callbacks save and load state.

.. code-block:: python

@@ -30,9 +30,9 @@ The two hooks :meth:`~pytorch_lightning.loops.base.Loop.on_save_checkpoint` and
def on_load_checkpoint(self, state_dict):
self.iteration = state_dict["iteration"]
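
Only the loading half of the example is visible after the collapsed region above; a hedged reconstruction of the full pair, assuming the single ``iteration`` counter used in the surrounding text:

.. code-block:: python

    # Sketch: persist and restore a loop-internal counter across checkpoints.
    def on_save_checkpoint(self):
        return {"iteration": self.iteration}

    def on_load_checkpoint(self, state_dict):
        self.iteration = state_dict["iteration"]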

-When the Trainer is restarting from a checkpoint (e.g., through :code:`trainer.fit(ckpt_path=...)`), the loop exposes a boolean attribute :attr:`~pytorch_lightning.loops.base.Loop.restarting`.
+When the Trainer is restarting from a checkpoint (e.g., through :code:`trainer.fit(ckpt_path=...)`), the loop exposes a boolean attribute :attr:`~pytorch_lightning.loops.loop.Loop.restarting`.
Based on the value of this variable, the user can write the loop in such a way that it can restart from an arbitrary point given the state loaded from the checkpoint.
-For example, the implementation of the :meth:`~pytorch_lightning.loops.base.Loop.reset` method could look like this given our previous example:
+For example, the implementation of the :meth:`~pytorch_lightning.loops.loop.Loop.reset` method could look like this given our previous example:

.. code-block:: python

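    # Collapsed in the diff view; a hedged sketch of a restart-aware `reset`,
    # continuing the `iteration` example above:
    def reset(self):
        if not self.restarting:
            # zero the counter only on a fresh start, not when resuming
            self.iteration = 0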
2 changes: 1 addition & 1 deletion pl_examples/loop_examples/kfold.py
@@ -31,8 +31,8 @@
from pl_examples.basic_examples.mnist_examples.image_classifier_4_lightning_module import ImageClassifier
from pytorch_lightning import LightningDataModule, seed_everything, Trainer
from pytorch_lightning.core.module import LightningModule
-from pytorch_lightning.loops.base import Loop
from pytorch_lightning.loops.fit_loop import FitLoop
+from pytorch_lightning.loops.loop import Loop
from pytorch_lightning.trainer.states import TrainerFn

#############################################################################################
6 changes: 2 additions & 4 deletions pytorch_lightning/loops/__init__.py
@@ -11,11 +11,9 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
-
-from pytorch_lightning.loops.base import Loop  # noqa: F401
-from pytorch_lightning.loops.batch import ManualOptimization  # noqa: F401
+from pytorch_lightning.loops.loop import Loop  # noqa: F401 isort: skip (avoids circular imports)
from pytorch_lightning.loops.batch import TrainingBatchLoop # noqa: F401
from pytorch_lightning.loops.dataloader import DataLoaderLoop, EvaluationLoop, PredictionLoop # noqa: F401
from pytorch_lightning.loops.epoch import EvaluationEpochLoop, PredictionEpochLoop, TrainingEpochLoop # noqa: F401
from pytorch_lightning.loops.fit_loop import FitLoop # noqa: F401
-from pytorch_lightning.loops.optimization.optimizer_loop import OptimizerLoop  # noqa: F401
+from pytorch_lightning.loops.optimization import ManualOptimization, OptimizerLoop  # noqa: F401