Proposed refactor
I propose to deprecate and remove the `on_epoch_start`/`on_epoch_end` and `on_batch_start`/`on_batch_end` hooks.
Motivation
- `on_epoch_start`/`on_epoch_end`: These two hooks currently run within each mode (train/val/test). We already have `on_train_epoch_start`/`on_val_epoch_start`/`on_test_epoch_start`. The reason we kept them was to have a common hook called in every mode, so that users can configure operations that need to happen in each mode. But that is a fairly specific use case and can easily be configured by the user without this hook (see the sketch after this list). It can also be confusing when referring to val/test, because `epoch` doesn't mean anything during evaluation. Moreover, many other mode-specific hooks don't have a special version that runs for all modes. For example, we don't have separate `on_start`/`on_end` hooks or an `on_dataloader` hook that run alongside `on_{train/val/test}_start`/`end`, so why the special treatment here?
- `on_batch_start`/`on_batch_end`: They run along with `on_train_batch_start`/`on_train_batch_end` and don't provide any other significance, so we should remove them as well. We could make them run within each mode, but then the same points from above apply: do we even need them?
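To illustrate that the "common epoch start" behavior is easy to replicate without the shared hook, here is a minimal sketch of what a user could do with only the existing mode-specific hooks (the helper name `_shared_epoch_start` is hypothetical, not an API):

```python
# Minimal sketch (assumption, not part of the proposal itself): recover the
# "runs in every mode" behavior with a user-defined helper called from the
# existing mode-specific hooks.
import pytorch_lightning as pl


class MyModel(pl.LightningModule):
    def _shared_epoch_start(self, mode: str) -> None:
        # common setup that previously lived in `on_epoch_start`
        print(f"starting an epoch in {mode} mode")

    def on_train_epoch_start(self) -> None:
        self._shared_epoch_start("train")

    def on_validation_epoch_start(self) -> None:
        self._shared_epoch_start("val")

    def on_test_epoch_start(self) -> None:
        self._shared_epoch_start("test")
```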
History:
All these hooks seem to have been added at the project's inception; the best I could find is this PR, which is very old.
The behavior that enabled `on_epoch_start`/`on_epoch_end` to run in every mode was, I think, discussed/approved over Slack, and I added it 😅 a while back: #6498
Pitch
Simply deprecate and remove.
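As a rough sketch of the deprecation phase, assuming the existing `is_overridden` and `rank_zero_deprecation` utilities (the function name and call site below are illustrative, not the actual implementation):

```python
# Illustrative only: warn users who still override the deprecated hooks.
from pytorch_lightning.utilities import rank_zero_deprecation
from pytorch_lightning.utilities.model_helpers import is_overridden


def _check_deprecated_epoch_hooks(model) -> None:
    # hypothetical helper; where exactly this would be called from is an assumption
    if is_overridden("on_epoch_start", model) or is_overridden("on_epoch_end", model):
        rank_zero_deprecation(
            "`on_epoch_start`/`on_epoch_end` are deprecated and will be removed."
            " Use `on_<train/validation/test>_epoch_start`/`_end` instead."
        )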
Additional context
If you enjoy Lightning, check out our other projects! ⚡
- Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
- Lite: Enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
- Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
- Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
- Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers, leveraging PyTorch Lightning, Transformers, and Hydra.
cc @tchaton @carmocca @awaelchli @Borda @ninginthecloud @justusschock @akihironitta