Commit c963bf6

[loops] Reset reference to dataloader iterator on run end (#9386)

1 parent: 58de08d

File tree

3 files changed: +6 -0 lines changed

CHANGELOG.md

Lines changed: 3 additions & 0 deletions
@@ -334,6 +334,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Fixed `replace_sampler` missing the batch size under specific conditions ([#9367](https://github.com/PyTorchLightning/pytorch-lightning/pull/9367))
 
 
+- Fixed freeing data iterators in loop `on_run_end` ([#9386](https://github.com/PyTorchLightning/pytorch-lightning/pull/9386))
+
+
 ## [1.4.5] - 2021-08-31
 
 - Fixed reduction using `self.log(sync_dict=True, reduce_fx={mean,max})` ([#9142](https://github.com/PyTorchLightning/pytorch-lightning/pull/9142))

pytorch_lightning/loops/epoch/evaluation_epoch_loop.py

Lines changed: 1 addition & 0 deletions
@@ -135,6 +135,7 @@ def on_run_end(self) -> EPOCH_OUTPUT:
         outputs = self.outputs
         # free memory
         self.outputs = []
+        self.dataloader_iter = None
         return outputs
 
     def evaluation_step(self, batch: Any, batch_idx: int, dataloader_idx: int) -> Optional[STEP_OUTPUT]:

pytorch_lightning/loops/epoch/training_epoch_loop.py

Lines changed: 2 additions & 0 deletions
@@ -232,6 +232,8 @@ def on_run_end(self) -> None:
         if self._num_training_batches_reached(self.is_last_batch):
             self.update_lr_schedulers("epoch", update_plateau_schedulers=True)
 
+        self.dataloader_iter = None
+
     def teardown(self) -> None:
         self._results.cpu()
         self.batch_loop.teardown()
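The motivation behind both patches can be illustrated with a toy sketch (these are hypothetical stand-in classes, not Lightning's actual loop or DataLoader code): as long as a loop object keeps `self.dataloader_iter` after the run ends, the iterator, and any resources it owns such as worker processes or open file handles, cannot be garbage-collected. Dropping the reference in `on_run_end` makes the iterator collectible immediately.

```python
# Toy sketch of the pattern applied in this commit (assumed names, not
# Lightning's real classes): a loop drops its iterator reference in
# on_run_end so the iterator can be garbage-collected.
import gc
import weakref


class FakeIterator:
    """Stands in for a dataloader iterator that owns resources."""

    def __init__(self, data):
        self._it = iter(data)

    def __next__(self):
        return next(self._it)


class EpochLoop:
    """Hypothetical loop mirroring the patched on_run_end logic."""

    def __init__(self, data):
        self.dataloader_iter = FakeIterator(data)
        self.outputs = []

    def run(self):
        while True:
            try:
                batch = next(self.dataloader_iter)
            except StopIteration:
                break
            self.outputs.append(batch)
        return self.on_run_end()

    def on_run_end(self):
        outputs = self.outputs
        # free memory, as the patched loops now do
        self.outputs = []
        self.dataloader_iter = None
        return outputs


loop = EpochLoop([1, 2, 3])
iter_ref = weakref.ref(loop.dataloader_iter)
outputs = loop.run()
gc.collect()
assert outputs == [1, 2, 3]
assert iter_ref() is None  # iterator was freed once the loop dropped it
```

With a real PyTorch `DataLoader` using `num_workers > 0`, the held iterator would also keep its worker processes alive between runs, which is why clearing the reference matters beyond plain memory usage.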
