In #1279, support for returning None from configure_optimizers was added to Lightning. The use case was training without an optimizer. This preceded support for manual optimization, in which the user controls the backward pass and the optimizer step directly inside their training step.
The _MockOptimizer leaks out like this, which can be very confusing for developers:
class MyLightningModule(LightningModule):
    def configure_optimizers(self):
        return None

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()
        # opt is not None! what?!

- Is training with no optimizer a valid use case Lightning supports? Are there examples/references one could share to learn more about these use cases?
- If the Trainer creates a mock optimizer for users, should the mock optimizer ever be exposed back to the user?
- If training with no optimizer is a valid use case, should we require users to use manual optimization for this, so we don't configure a mock optimizer instance for them? (A minimal sketch of that alternative follows this list.)
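
For context on the third question, here is a minimal sketch of what the manual-optimization route could look like for the no-optimizer use case. This is not an official Lightning recipe: the NoOptimizerModule name, the toy model, and the hand-rolled SGD-style update are illustrative assumptions; only self.automatic_optimization, self.manual_backward, and configure_optimizers returning None are part of Lightning's actual API.

import torch
from pytorch_lightning import LightningModule


class NoOptimizerModule(LightningModule):  # hypothetical example module
    def __init__(self):
        super().__init__()
        # Opt out of automatic optimization so the Trainer does not need to
        # drive an optimizer (real or mock) on our behalf.
        self.automatic_optimization = False
        self.layer = torch.nn.Linear(32, 2)

    def configure_optimizers(self):
        # No optimizer at all; this is the use case from #1279.
        return None

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.cross_entropy(self.layer(x), y)
        # Backward through Lightning so precision/strategy hooks still apply.
        self.manual_backward(loss)
        # Hand-rolled parameter update (illustrative only); no call to
        # self.optimizers() is needed, so no mock optimizer can leak out.
        with torch.no_grad():
            for p in self.layer.parameters():
                if p.grad is not None:
                    p.sub_(0.01 * p.grad)
                    p.grad.zero_()
        return loss

With something like this, the decision in the issue reduces to whether Lightning should require automatic_optimization = False whenever configure_optimizers returns None, instead of silently substituting a _MockOptimizer.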
Originally posted by @ananthsub in #11155 (comment)