
extend the parameter Trainer(deterministic=False)  #9164

@ghost

Description

🚀 Feature

Integrate torch.use_deterministic_algorithms(True) into Trainer(deterministic=True).

Motivation

The parameter Trainer(deterministic=True) currently enables determinism only for cuDNN, via torch.backends.cudnn.deterministic = deterministic. But PyTorch provides the more comprehensive function torch.use_deterministic_algorithms(True), which enforces determinism across all operations that support it.

Pitch

Change torch.backends.cudnn.deterministic = deterministic in pytorch_lightning/trainer/connectors/accelerator_connector.py to torch.use_deterministic_algorithms(True).
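A minimal sketch of what the replacement could look like. The helper name enable_full_determinism is hypothetical, and the sketch assumes PyTorch ≥ 1.8, where torch.use_deterministic_algorithms is available:

```python
import os

import torch


def enable_full_determinism(deterministic: bool) -> None:
    """Hypothetical connector helper: enforce determinism globally
    instead of only toggling the cuDNN flag."""
    if deterministic:
        # Some CUDA ops (e.g. certain cuBLAS kernels) require this env var
        # when deterministic algorithms are enforced on the GPU.
        os.environ.setdefault("CUBLAS_WORKSPACE_CONFIG", ":4096:8")
    # Covers cuDNN plus all other ops that have a deterministic variant;
    # ops without one raise a RuntimeError when called.
    torch.use_deterministic_algorithms(deterministic)
    # The cuDNN flags can still be set explicitly for older code paths.
    torch.backends.cudnn.deterministic = deterministic
    torch.backends.cudnn.benchmark = not deterministic


enable_full_determinism(True)
print(torch.are_deterministic_algorithms_enabled())  # → True
```

Setting CUBLAS_WORKSPACE_CONFIG alongside the flag matters because without it, some CUDA matmul kernels raise at runtime once deterministic algorithms are enforced.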


If you enjoy Lightning, check out our other projects! ⚡

  • Metrics: Machine learning metrics for distributed, scalable PyTorch applications.

  • Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.

  • Bolts: Pretrained SOTA deep learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.

  • Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers, leveraging PyTorch Lightning, Transformers, and Hydra.


Labels: feature (Is an improvement or enhancement), help wanted (Open to be worked on)
