Description
Proposed refactor
Part 3 of Accelerator and Plugin refactor #10416
Motivation
Moving towards stable version
After step 2, "Move Precision Plugin into TTP" (Precision Plugins should be part of Training Type Plugins #7324), the Accelerator is no longer the routing layer for strategy and precision: optimizer-related logic, steps, and hooks have all moved into the strategy.
The Accelerator now only holds device information, so the strategy should own the accelerator. This reduces code complexity and improves code maintainability.
Pitch
Move the accelerator into the TTP/strategy as a device_plugin (similar to checkpoint_io) and update the logic in the accelerator connector, training, and loops accordingly; a rough sketch follows.
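A minimal sketch of the proposed ownership, using simplified hypothetical stand-in classes (the real Lightning interfaces carry far more responsibility). The `accelerator` constructor argument mirrors how `checkpoint_io` is already injected into the training type plugin:

```python
from typing import Optional

import torch


class CheckpointIO:
    """Stand-in for Lightning's checkpoint IO plugin (hypothetical here)."""


class Accelerator:
    """Stand-in device plugin: holds device information only."""

    def root_device(self) -> torch.device:
        return torch.device("cpu")


class TrainingTypePlugin:
    """Strategy that owns its accelerator, just as it owns checkpoint_io."""

    def __init__(
        self,
        accelerator: Optional[Accelerator] = None,
        checkpoint_io: Optional[CheckpointIO] = None,
    ) -> None:
        # the strategy, not the trainer, owns the device plugin
        self.accelerator = accelerator or Accelerator()
        self.checkpoint_io = checkpoint_io or CheckpointIO()

    @property
    def root_device(self) -> torch.device:
        # device questions are delegated to the owned accelerator
        return self.accelerator.root_device()


# usage: the accelerator connector would build the strategy with a device plugin
strategy = TrainingTypePlugin(accelerator=Accelerator())
print(strategy.root_device)  # cpu
```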
[RFC] Should we have a new name for Accelerator?
Additional context
If you enjoy Lightning, check out our other projects! ⚡
- Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
- Lite: Enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
- Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
- Bolts: Pretrained SOTA deep learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
- Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers, leveraging PyTorch Lightning, Transformers, and Hydra.
cc @justusschock @awaelchli @akihironitta @kaushikb11 @ananthsub