Description
Proposed refactor
Deprecate the trainer property here: https://github.com/PyTorchLightning/pytorch-lightning/blob/6b728713bb3b35ad58cd0085acaa443b33ab03ac/pytorch_lightning/trainer/trainer.py#L1731-L1744
Motivation
This property doesn't depend on the Trainer at all, so it can be converted into a utility function or staticmethod instead.
Proposal: Migrate this logic into the `SLURMEnvironment` class as a staticmethod. Doing this will consolidate SLURM environment variable code in Lightning and simplify the Trainer. https://github.com/PyTorchLightning/pytorch-lightning/search?q=slurm_job_id
Pitch
- Move this logic to the `SLURMEnvironment` class as staticmethods:

  ```python
  @staticmethod
  def job_id() -> Optional[int]:

  @staticmethod
  def job_name() -> Optional[str]:
  ```

- Then replace usages of `trainer.slurm_job_id` with `SLURMEnvironment.slurm_job_id()`/`slurm_job_name`
- Then deprecate `trainer.slurm_job_id`
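To illustrate, here is a minimal, hypothetical sketch of what the proposed staticmethods could look like, assuming they mirror the current Trainer property's behavior of reading SLURM's `SLURM_JOB_ID`/`SLURM_JOB_NAME` environment variables and ignoring interactive (`bash`) sessions; the exact body is up to the implementer:

```python
import os
from typing import Optional


class SLURMEnvironment:
    """Sketch only: hypothetical bodies for the proposed staticmethods."""

    @staticmethod
    def job_name() -> Optional[str]:
        # SLURM exports the job's name in SLURM_JOB_NAME
        return os.environ.get("SLURM_JOB_NAME")

    @staticmethod
    def job_id() -> Optional[int]:
        # In an interactive session the job name is "bash"; report no job id
        # there so e.g. loggers don't reuse the same id across sessions.
        if SLURMEnvironment.job_name() == "bash":
            return None
        job_id = os.environ.get("SLURM_JOB_ID")
        if job_id is None:
            return None
        try:
            return int(job_id)
        except ValueError:
            return None
```

Callers would then use `SLURMEnvironment.job_id()` directly, with no Trainer instance required.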
Additional context
Part of #7740
If you enjoy Lightning, check out our other projects! ⚡
- Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
- Lite: enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
- Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
- Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
- Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.
cc @justusschock @awaelchli @akihironitta @tchaton @kaushikb11 @Borda