
Commit c5938f8

Fix torchelastic detection with non-distributed installations (#13142)

* Fix torchelastic detection under Mac
* CHANGELOG

1 parent: 29fe1da

File tree

2 files changed: +5 −1 lines changed

CHANGELOG.md

Lines changed: 3 additions & 0 deletions
@@ -223,6 +223,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Avoid redundant callback restore warning while tuning ([#13026](https://github.com/PyTorchLightning/pytorch-lightning/pull/13026))
 
 
+- Fixed torchelastic detection with non-distributed installations ([#13142](https://github.com/PyTorchLightning/pytorch-lightning/pull/13142))
+
+
 - Fixed an issue wrt unnecessary usage of habana mixed precision package for fp32 types ([#13028](https://github.com/PyTorchLightning/pytorch-lightning/pull/13028))
 
 
pytorch_lightning/plugins/environments/torchelastic_environment.py

Lines changed: 2 additions & 1 deletion
@@ -62,7 +62,8 @@ def main_port(self) -> int:
     def detect() -> bool:
         """Returns ``True`` if the current process was launched using the torchelastic command."""
         if _TORCH_GREATER_EQUAL_1_9_1:
-            return torch.distributed.is_torchelastic_launched()
+            # if not available (for example on MacOS), `is_torchelastic_launched` is not defined
+            return torch.distributed.is_available() and torch.distributed.is_torchelastic_launched()
         required_env_vars = {"RANK", "GROUP_RANK", "LOCAL_RANK", "LOCAL_WORLD_SIZE"}
         return required_env_vars.issubset(os.environ.keys())
 
