🚀 Feature
We've decided to run CI testing against both the PyTorch LTS and stable releases (1.8 and 1.11 as of now), and we've already seen some issues arise while trying to enable this in #12373.
TODO
Known issues with PL with PyTorch 1.11
- Fix false positive deprecation warning from `register_ddp_comm_hook` (#12846)
- Update `deepspeed` and `fairscale` versions (#12860)
- Fix an issue with fitting a model initialised in `init_meta_context` (Fix `materialize_module` recursively setting its child module, #12870)
- Fix an issue with DDP comm tests on some newer PyTorch versions (Fix tests related to DDP communication hooks, #12878)
- Fix an issue with inference mode with FSDP
Motivation
To test, in CI, new features (e.g. meta init and native FSDP) that are only available in newer PyTorch versions.
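Since some of these features exist only in newer PyTorch releases, running one test suite against both 1.8 and 1.11 typically relies on version-gated tests. A minimal sketch of such a gate (a hypothetical helper, not Lightning's actual utility):

```python
# Hypothetical helper: decide whether a test that needs a newer PyTorch
# feature should run, given the installed version string. Not Lightning's
# actual implementation; shown only to illustrate the gating idea.
def meets_min_version(installed: str, required: str) -> bool:
    # Strip any local build suffix, e.g. "1.11.0+cu113" -> "1.11.0"
    base = installed.split("+")[0]

    def to_tuple(version: str) -> tuple:
        return tuple(int(part) for part in version.split("."))

    return to_tuple(base) >= to_tuple(required)


# e.g. native FSDP requires torch >= 1.11, so it would run on the
# stable (1.11) CI job but be skipped on the LTS (1.8) job.
assert meets_min_version("1.11.0+cu113", "1.11.0")
assert not meets_min_version("1.8.2", "1.11.0")
```

In practice this kind of check would be wired into a `pytest` skip marker so the same suite passes on both CI jobs.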
Pitch
Use the following image: `pytorchlightning/pytorch_lightning:base-cuda-py3.7-torch1.11`
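As a rough sketch, the image could be referenced from a CI job like this (the job name and surrounding structure are illustrative, not Lightning's actual pipeline config):

```yaml
# Illustrative Azure Pipelines fragment, assuming a GPU agent pool;
# only the image tag is taken from this issue.
jobs:
  - job: gpu_tests_torch_1_11
    container:
      image: pytorchlightning/pytorch_lightning:base-cuda-py3.7-torch1.11
```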
Alternatives
n/a
Additional context
n/a
If you enjoy Lightning, check out our other projects! ⚡
- Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
- Lite: Enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
- Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
- Bolts: Pretrained SOTA deep learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
- Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers, leveraging PyTorch Lightning, Transformers, and Hydra.