
Commit 86a0cb7

ananthsub and tchaton authored
Check max_time when setting defaults for min/max epochs (#9072)
Co-authored-by: tchaton <[email protected]>
1 parent 811d37b commit 86a0cb7

File tree

3 files changed: +11 −2 lines changed


CHANGELOG.md

Lines changed: 4 additions & 0 deletions

@@ -237,9 +237,13 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 - Fixed bug where data-loading functions were not getting the correct running stage passed ([#8858](https://github.com/PyTorchLightning/pytorch-lightning/pull/8858))
 
+
 - Fixed a bug in the binary search mode of auto batch size scaling where an exception was thrown if the first trainer run resulted in OOM ([#8954](https://github.com/PyTorchLightning/pytorch-lightning/pull/8954))
 
 
+- Fixed not setting a default value for `max_epochs` if `max_time` was specified on the `Trainer` constructor ([#9072](https://github.com/PyTorchLightning/pytorch-lightning/pull/9072))
+
+
 ## [1.4.3] - 2021-08-17
 
 - Fixed plateau scheduler stepping on incomplete epoch ([#8861](https://github.com/PyTorchLightning/pytorch-lightning/pull/8861))

pytorch_lightning/trainer/trainer.py

Lines changed: 2 additions & 2 deletions

@@ -375,8 +375,8 @@ def __init__(
         self.tuner = Tuner(self)
 
         fit_loop = FitLoop(
-            min_epochs=(1 if (min_epochs is None and min_steps is None) else min_epochs),
-            max_epochs=(1000 if (max_epochs is None and max_steps is None) else max_epochs),
+            min_epochs=(1 if (min_epochs is None and min_steps is None and max_time is None) else min_epochs),
+            max_epochs=(1000 if (max_epochs is None and max_steps is None and max_time is None) else max_epochs),
         )
         training_epoch_loop = TrainingEpochLoop(min_steps, max_steps)
         training_batch_loop = TrainingBatchLoop()
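The change above only touches the two default expressions: `max_time` is now consulted alongside `min_steps`/`max_steps` before the implicit `min_epochs=1` and `max_epochs=1000` defaults are applied. A standalone sketch of that default-resolution logic (`resolve_epoch_defaults` is a hypothetical helper written for illustration, not part of PyTorch Lightning):

```python
def resolve_epoch_defaults(min_epochs=None, max_epochs=None,
                           min_steps=None, max_steps=None, max_time=None):
    """Mirror the FitLoop default selection from the diff above.

    Before this commit, max_time was not consulted, so a run bounded only by
    max_time still inherited max_epochs=1000 and could stop earlier than the
    requested wall-clock budget.
    """
    resolved_min = 1 if (min_epochs is None and min_steps is None and max_time is None) else min_epochs
    resolved_max = 1000 if (max_epochs is None and max_steps is None and max_time is None) else max_epochs
    return resolved_min, resolved_max

print(resolve_epoch_defaults())                          # (1, 1000): nothing set, defaults apply
print(resolve_epoch_defaults(max_time=dict(seconds=1)))  # (None, None): max_time alone suppresses both defaults
print(resolve_epoch_defaults(max_steps=100))             # (1, None): max_steps already suppressed the 1000 default
```

With `max_time` set and the epoch bounds left as `None`, the run is limited only by the `Timer` callback rather than by an arbitrary 1000-epoch cap.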

tests/callbacks/test_timer.py

Lines changed: 5 additions & 0 deletions

@@ -42,6 +42,11 @@ def on_fit_start(self):
     trainer.fit(TestModel())
     assert "callbacks list already contains a Timer" in caplog.text
 
+    seconds = 1
+    trainer = Trainer(max_time=dict(seconds=seconds))
+    assert trainer.max_epochs is None
+    assert trainer.max_steps is None
+
 
 @pytest.mark.parametrize(
     "duration,expected",
