
Commit 7f91c5e

Fix unfreeze_and_add_param_group expects modules rather than module (#6822)
1 parent: c3da7f5

File tree: 1 file changed (+1, −1)


pytorch_lightning/callbacks/finetuning.py

Lines changed: 1 addition & 1 deletion
@@ -77,7 +77,7 @@ def finetune_function(self, pl_module, current_epoch, optimizer, optimizer_idx):
             # When `current_epoch` is 10, feature_extractor will start training.
             if current_epoch == self._unfreeze_at_epoch:
                 self.unfreeze_and_add_param_group(
-                    module=pl_module.feature_extractor,
+                    modules=pl_module.feature_extractor,
                     optimizer=optimizer,
                     train_bn=True,
                 )
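For context, `unfreeze_and_add_param_group` un-freezes a module's parameters and registers them with the optimizer as a new parameter group, so a frozen feature extractor can start training at a chosen epoch. Below is a minimal sketch of that mechanism in plain PyTorch; the function name mirrors Lightning's `BaseFinetuning` helper, but the body here is illustrative, not Lightning's actual implementation.

```python
import torch
from torch import nn

def unfreeze_and_add_param_group(modules, optimizer, lr=None):
    # Sketch of the helper's behavior (illustrative, not Lightning's code):
    # re-enable gradients on the module's parameters...
    for param in modules.parameters():
        param.requires_grad = True
    # ...then hand them to the optimizer as a fresh param group.
    group = {"params": list(modules.parameters())}
    if lr is not None:
        group["lr"] = lr
    optimizer.add_param_group(group)

# Usage: a frozen feature extractor plus a trainable head.
feature_extractor = nn.Linear(4, 4)
for p in feature_extractor.parameters():
    p.requires_grad = False

head = nn.Linear(4, 2)
optimizer = torch.optim.SGD(head.parameters(), lr=0.1)

# At the chosen epoch, unfreeze the extractor with its own learning rate.
unfreeze_and_add_param_group(feature_extractor, optimizer, lr=0.01)
```

The commit's fix is purely at the call site: the keyword argument is `modules` (the parameter accepts a single module or a list of them), not `module`.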

0 commit comments
