For multiple optimizers, model.toggle_optimizer() sets requires_grad=True for all params. #5292

@jain-anshul

Description

https://github.com/PyTorchLightning/pytorch-lightning/blob/dabfeca92e0702e55f09ac53e9412672cd258cd3/pytorch_lightning/core/lightning.py#L1152-L1170

Setup:
pytorch-lightning 1.1.2
pytorch 1.7.1

When using multiple optimizers, the toggle_optimizer() function sets requires_grad = True for every parameter belonging to the param_groups of the given optimizer. This is incorrect: if the user has explicitly and permanently disabled requires_grad for some of those parameters, the function re-enables gradients for them as well.
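
A minimal, self-contained illustration of the problem (plain PyTorch, no Lightning; the module and layer names are made up for the example, and the loop mimics the behaviour described above rather than being Lightning's exact code):

```python
import torch
from torch import nn


# Hypothetical toy module: the user has permanently frozen one layer.
class Demo(nn.Module):
    def __init__(self):
        super().__init__()
        self.frozen = nn.Linear(4, 4)
        self.trainable = nn.Linear(4, 4)
        for p in self.frozen.parameters():
            p.requires_grad = False  # intended to stay frozen for the whole run


model = Demo()
opt = torch.optim.Adam(model.parameters())

# Simplified version of the behaviour described above: every parameter in the
# optimizer's param_groups is forced back to requires_grad=True.
for group in opt.param_groups:
    for param in group["params"]:
        param.requires_grad = True

# The permanently frozen layer is now trainable again, which is the reported bug.
assert all(p.requires_grad for p in model.frozen.parameters())
```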

Proposed fix:
The requires_grad value of every parameter should be recorded beforehand in an attribute of the LightningModule, e.g. self.params_dict. Then, in toggle_optimizer(), instead of setting param.requires_grad = True, set param.requires_grad = self.params_dict[param]. This restores the correct value for the parameters of the optimizer being toggled.
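
A rough sketch of that idea. Hedged: the choice of the setup() hook as the place to record the values, and the "disable all grads, then restore this optimizer's params" structure inside toggle_optimizer(), are assumptions for illustration, not Lightning's actual implementation.

```python
from typing import Dict

import torch
from torch.optim import Optimizer


class FixedToggleMixin:
    """Sketch of the proposed fix, meant to be mixed into a LightningModule."""

    def setup(self, stage=None) -> None:
        # Record the user's requires_grad choice for every parameter once,
        # before any toggling happens (self.params_dict from the proposal above).
        self.params_dict: Dict[torch.nn.Parameter, bool] = {
            p: p.requires_grad for p in self.parameters()
        }

    def toggle_optimizer(self, optimizer: Optimizer, optimizer_idx: int) -> None:
        # Assumed structure of the existing function: first disable grads for
        # all parameters, then re-enable them only for this optimizer ...
        for p in self.parameters():
            p.requires_grad = False
        # ... but restore the recorded value instead of forcing True, so
        # permanently frozen parameters stay frozen.
        for group in optimizer.param_groups:
            for p in group["params"]:
                p.requires_grad = self.params_dict[p]
```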
