🚀 Feature
Integrate torch.use_deterministic_algorithms(True) into Trainer(deterministic=True).
Motivation
The parameter Trainer(deterministic=True) currently enables determinism only for cuDNN, via torch.backends.cudnn.deterministic = deterministic. However, PyTorch provides the more comprehensive torch.use_deterministic_algorithms(True), which enforces deterministic behavior across all operations and raises an error for operations that have no deterministic implementation.
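For illustration, the two switches differ as follows (a minimal sketch; note that on CUDA 10.2+ the global switch also requires the CUBLAS_WORKSPACE_CONFIG environment variable to be set for cuBLAS operations):

```python
import os

import torch

# What Trainer(deterministic=True) does today: deterministic kernel selection for cuDNN only.
torch.backends.cudnn.deterministic = True

# What this request proposes: enforce determinism across all PyTorch ops.
# Operations without a deterministic implementation raise a RuntimeError
# instead of silently producing non-reproducible results.
os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"  # needed for deterministic cuBLAS on CUDA 10.2+
torch.use_deterministic_algorithms(True)
```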
Pitch
Replace torch.backends.cudnn.deterministic = deterministic in pytorch_lightning.trainer.connectors.accelerator_connector.py with torch.use_deterministic_algorithms(True).
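A minimal sketch of what the change could look like; the helper name and structure below are assumptions for illustration, not the actual pytorch_lightning source:

```python
import torch


def init_deterministic(deterministic: bool) -> None:
    # Hypothetical helper mirroring what the accelerator connector could call
    # when Trainer(deterministic=True) is passed.
    torch.use_deterministic_algorithms(deterministic)
    if deterministic:
        # Benchmark mode lets cuDNN select kernels non-deterministically, so keep it off.
        torch.backends.cudnn.benchmark = False
```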
If you enjoy Lightning, check out our other projects! ⚡
- Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
- Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, finetuning and solving problems with deep learning.
- Bolts: Pretrained SOTA Deep Learning models, callbacks and more for research and production with PyTorch Lightning and PyTorch.
- Lightning Transformers: Flexible interface for high performance research using SOTA Transformers, leveraging PyTorch Lightning, Transformers, and Hydra.