
Commit 88ea69a

[bugfix] Resolve memory leak for evaluation (#6326)
* resolve bug
* resolve flake8
* revert name
1 parent 9ca9a7c · commit 88ea69a

File tree

2 files changed, +5 −0 lines changed

pytorch_lightning/trainer/evaluation_loop.py

Lines changed: 4 additions & 0 deletions
@@ -203,6 +203,10 @@ def __run_eval_epoch_end(self, num_dataloaders):
 
         # with a single dataloader don't pass an array
        outputs = self.outputs
+
+        # free memory
+        self.outputs = []
+
         eval_results = outputs
         if num_dataloaders == 1:
             eval_results = outputs[0]
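
For context, a minimal, self-contained sketch of the pattern this fix relies on. The names here (EvalLoopSketch, store, run_eval_epoch_end) are illustrative stand-ins, not Lightning's actual API: the point is that as long as the loop object keeps a reference in self.outputs, every per-batch result (and any GPU tensors it holds) stays reachable and cannot be reclaimed, so the loop drops its own reference before handing the results on.

class EvalLoopSketch:
    """Illustrative stand-in for an evaluation loop that accumulates outputs."""

    def __init__(self):
        # one inner list of per-batch outputs per dataloader
        self.outputs = []

    def store(self, dataloader_outputs):
        self.outputs.append(dataloader_outputs)

    def run_eval_epoch_end(self, num_dataloaders):
        # with a single dataloader don't pass an array
        outputs = self.outputs

        # free memory: drop the loop's reference so the per-batch results
        # become garbage-collectible once the caller is done with them
        self.outputs = []

        eval_results = outputs
        if num_dataloaders == 1:
            eval_results = outputs[0]
        return eval_results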

tests/trainer/logging_/test_eval_loop_logging_1_0.py

Lines changed: 1 addition & 0 deletions
@@ -126,6 +126,7 @@ def validation_step_end(self, acc):
         def validation_epoch_end(self, outputs):
             self.log('g', torch.tensor(2, device=self.device), on_epoch=True)
             self.validation_epoch_end_called = True
+            assert len(self.trainer.evaluation_loop.outputs) == 0
 
         def backward(self, loss, optimizer, optimizer_idx):
             return LightningModule.backward(self, loss, optimizer, optimizer_idx)
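
A hypothetical usage of the sketch above, mirroring what the added test assertion checks: after the epoch-end step runs, the loop no longer holds any reference to the accumulated outputs.

loop = EvalLoopSketch()
loop.store([{"loss": 0.1}, {"loss": 0.2}])  # per-batch outputs for the only dataloader

results = loop.run_eval_epoch_end(num_dataloaders=1)
assert results == [{"loss": 0.1}, {"loss": 0.2}]  # single-dataloader case unwraps the outer list
assert len(loop.outputs) == 0                     # mirrors the assertion added in the test above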
