Commit c92f84a

sadiqjkaushikb11 authored and committed

Fix unfreeze_and_add_param_group expects modules rather than module (Lightning-AI#6822)

1 parent: cd997d6

File tree

1 file changed: +1 −1 lines changed

pytorch_lightning/callbacks/finetuning.py

Lines changed: 1 addition & 1 deletion
@@ -77,7 +77,7 @@ def finetune_function(self, pl_module, current_epoch, optimizer, optimizer_idx):
         # When `current_epoch` is 10, feature_extractor will start training.
         if current_epoch == self._unfreeze_at_epoch:
             self.unfreeze_and_add_param_group(
-                module=pl_module.feature_extractor,
+                modules=pl_module.feature_extractor,
                 optimizer=optimizer,
                 train_bn=True,
             )
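The corrected keyword can be illustrated with a minimal, self-contained sketch of the pattern the docstring describes: at a chosen epoch, unfreeze a submodule's parameters and register them with the optimizer as a new param group. Note that `Param`, `FeatureExtractor`, and `Optimizer` below are hypothetical stand-ins, not the real pytorch_lightning or torch APIs.

```python
# Sketch of the unfreeze-and-add-param-group pattern. All classes here are
# hypothetical stand-ins, NOT the real pytorch_lightning / torch APIs.

class Param:
    """Stand-in for a tensor parameter with a requires_grad flag."""
    def __init__(self):
        self.requires_grad = False  # frozen by default

class FeatureExtractor:
    """Stand-in for a frozen backbone module."""
    def __init__(self):
        self._params = [Param(), Param()]

    def parameters(self):
        return list(self._params)

class Optimizer:
    """Stand-in optimizer that tracks param groups."""
    def __init__(self, param_groups):
        self.param_groups = list(param_groups)

    def add_param_group(self, group):
        self.param_groups.append(group)

def unfreeze_and_add_param_group(modules, optimizer, train_bn=True):
    """Unfreeze every parameter of `modules` (the plural keyword, per the
    commit) and append them to the optimizer as a new param group."""
    params = []
    for p in modules.parameters():
        p.requires_grad = True  # unfreeze
        params.append(p)
    optimizer.add_param_group({"params": params})

# Usage mirroring the docstring example: unfreeze at epoch 10.
feature_extractor = FeatureExtractor()
optimizer = Optimizer([{"params": [Param()]}])  # head trained from the start

unfreeze_at_epoch = 10
for current_epoch in range(12):
    if current_epoch == unfreeze_at_epoch:
        unfreeze_and_add_param_group(
            modules=feature_extractor,  # `modules=`, not `module=`
            optimizer=optimizer,
            train_bn=True,
        )

assert len(optimizer.param_groups) == 2
assert all(p.requires_grad for p in feature_extractor.parameters())
```

The keyword is plural because the real callback accepts either a single module or a collection of modules to unfreeze at once; passing `module=` would raise a `TypeError` for an unexpected keyword argument.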
