🐛 Bug
When using manual optimization by setting the property automatic_optimization=False, it is not necessary (and sometimes undesirable) for training_step() to return a loss output. For example, for GANs or other models with complex multi-optimizer setups, manual optimization can be preferable or even necessary for correct behavior. The user warning should therefore be omitted in this case.
Since the user performs the update step themselves (self.manual_backward(); optimizer.step()), training_step does not need to return a loss: PL does not use the returned loss to update the weights. Furthermore, logging the returned loss is unhelpful for certain complex multi-optimizer setups, where aggregating the losses from different optimizers is not desired.
In short, because the updates and logging under automatic_optimization=False are non-standard and do not always involve returning a loss output, I believe the UserWarning("Your training_step returned None. Did you forget to return an output?") should be omitted in the case of manual optimization.
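Concretely, the kind of guard being requested looks roughly like the snippet below. This is only an illustrative sketch, not PL's actual training-loop code; `maybe_warn_none_output` and its arguments are hypothetical names.

```python
import warnings


def maybe_warn_none_output(step_output, automatic_optimization: bool) -> None:
    """Hypothetical helper: only warn about a None training_step output when
    Lightning itself drives the optimization and therefore needs the loss."""
    if step_output is None and automatic_optimization:
        warnings.warn(
            "Your training_step returned None. Did you forget to return an output?",
            UserWarning,
        )
```

With automatic_optimization=False the condition is simply skipped, which is the behavior requested above.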
Basic example for GAN:
```python
@property
def automatic_optimization(self) -> bool:
    return False

def training_step(...):
    loss_d = self.step_and_optimize_d(...)
    self.log("loss_d", loss_d)
    loss_g = self.step_and_optimize_g(...)
    self.log("loss_g", loss_g)
    # Note that the training step does not return a value.
```
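For completeness, here is a rough end-to-end sketch of such a training_step using the manual-optimization API (self.optimizers(), self.manual_backward()). The tiny linear generator/discriminator, the loss terms, and the optimizer settings are placeholder assumptions, and exact signatures (e.g. whether manual_backward also takes the optimizer) vary across PL versions:

```python
import torch
from torch import nn
import torch.nn.functional as F
import pytorch_lightning as pl


class TinyGAN(pl.LightningModule):
    """Illustrative sketch only: the tiny linear networks and loss terms are
    placeholders, not code taken from this issue."""

    def __init__(self):
        super().__init__()
        self.generator = nn.Linear(8, 16)
        self.discriminator = nn.Linear(16, 1)

    @property
    def automatic_optimization(self) -> bool:
        return False  # manual optimization: backward/step are called by the user

    def configure_optimizers(self):
        opt_d = torch.optim.Adam(self.discriminator.parameters(), lr=1e-3)
        opt_g = torch.optim.Adam(self.generator.parameters(), lr=1e-3)
        return opt_d, opt_g

    def training_step(self, batch, batch_idx):
        real, noise = batch
        opt_d, opt_g = self.optimizers()

        # Discriminator update: zero_grad / manual_backward / step by hand.
        d_real = self.discriminator(real)
        d_fake = self.discriminator(self.generator(noise).detach())
        loss_d = (
            F.binary_cross_entropy_with_logits(d_real, torch.ones_like(d_real))
            + F.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake))
        )
        opt_d.zero_grad()
        self.manual_backward(loss_d)
        opt_d.step()
        self.log("loss_d", loss_d)

        # Generator update with its own optimizer and its own logged loss.
        d_fake = self.discriminator(self.generator(noise))
        loss_g = F.binary_cross_entropy_with_logits(d_fake, torch.ones_like(d_fake))
        opt_g.zero_grad()
        self.manual_backward(loss_g)
        opt_g.step()
        self.log("loss_g", loss_g)

        # Nothing is returned: the weights were already updated above, yet the
        # trainer still warns that training_step returned None.
```

Both losses are logged separately and nothing is returned, which is exactly the case where the warning fires unnecessarily.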