
Support FSDP in Lite #14898

@awaelchli

Description


🚀 Feature

Motivation

Native PyTorch FSDP (FullyShardedDataParallel) is a powerful alternative to regular DDP: instead of replicating the full model on every rank, it shards parameters, gradients, and optimizer state across ranks, which makes it possible to train models that do not fit into a single device's memory.
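For context, here is a minimal sketch of how native FSDP is applied to a plain PyTorch model. It assumes a distributed process group has already been initialized (e.g. by launching with torchrun) and that a CUDA device is available:

```python
import torch
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

# Wrap the module; FSDP shards its parameters, gradients, and optimizer
# state across all ranks in the process group.
model = torch.nn.Linear(32, 2).cuda()
fsdp_model = FSDP(model)

# The optimizer is created from the wrapped module's parameters.
optimizer = torch.optim.Adam(fsdp_model.parameters(), lr=1e-3)
```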

Pitch

Add the strategy to Lite, building on the existing FSDP implementation in the Trainer. A rough sketch of the user-facing API is shown below.
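As an illustration only (the strategy string "fsdp_native" is an assumption, and the example simply reuses Lite's existing setup()/backward() hooks), usage from Lite could look like this:

```python
import torch
from pytorch_lightning.lite import LightningLite


class MyLite(LightningLite):
    def run(self):
        model = torch.nn.Linear(32, 2)
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        # setup() would wrap the model in FullyShardedDataParallel under
        # the hood, mirroring what the Trainer strategy already does.
        model, optimizer = self.setup(model, optimizer)

        batch = torch.rand(4, 32, device=self.device)
        loss = model(batch).sum()
        self.backward(loss)
        optimizer.step()
        optimizer.zero_grad()


# Hypothetical strategy name; the exact flag is part of the design work.
MyLite(strategy="fsdp_native", accelerator="gpu", devices=2).run()
```

One design detail worth noting: FSDP generally expects the optimizer to be built from the wrapped module's parameters, so setup() (or a split model/optimizer setup) would need to account for that.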


If you enjoy Lightning, check out our other projects! ⚡

  • Metrics: Machine learning metrics for distributed, scalable PyTorch applications.

  • Lite: enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.

  • Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.

  • Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.

  • Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.


Labels

fabric (lightning.fabric.Fabric), feature (Is an improvement or enhancement)
