
Conversation

@akihironitta (Contributor) commented Mar 6, 2021

What does this PR do?

In most cases, zero_grad can be called outside the closure, but it must be called inside the closure when using an optimizer such as LBFGS, which re-evaluates the loss in optimizer.step(closure).

This PR fixes the closure in the docs so that it works in both cases.
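For context, a minimal plain-PyTorch sketch of why the placement matters (illustrative model and data, not the docs snippet itself): LBFGS may call the closure several times within a single step, so stale gradients would accumulate if zero_grad stayed outside.

import torch
import torch.nn.functional as F

model = torch.nn.Linear(2, 1)
optimizer = torch.optim.LBFGS(model.parameters(), lr=0.1)
x, y = torch.randn(8, 2), torch.randn(8, 1)

def closure():
    # zero_grad must run inside the closure: LBFGS re-invokes closure()
    # during a single step, and gradients would otherwise accumulate.
    optimizer.zero_grad()
    loss = F.mse_loss(model(x), y)
    loss.backward()
    return loss

optimizer.step(closure)  # LBFGS re-evaluates the loss via the closure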

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the Review guidelines. In short, see the following bullet list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃

@akihironitta akihironitta added bug Something isn't working docs Documentation related labels Mar 6, 2021
codecov bot commented Mar 6, 2021

Codecov Report

Merging #6374 (b1721dc) into master (4f391bc) will decrease coverage by 7%.
The diff coverage is n/a.

@@           Coverage Diff            @@
##           master   #6374     +/-   ##
========================================
- Coverage      93%     86%     -7%     
========================================
  Files         161     161             
  Lines       11464   12596   +1132     
========================================
+ Hits        10706   10840    +134     
- Misses        758    1756    +998     

@Borda Borda added the ready PRs ready to be merged label Mar 7, 2021
@Borda Borda enabled auto-merge (squash) March 7, 2021 10:49
@mergify mergify bot requested a review from a team March 7, 2021 10:49
@rohitgr7 (Contributor) left a comment

quick question:
are zero_grad and its corresponding hooks called before backward and optimizer.step in the case of automatic_optimization?

@mergify mergify bot requested a review from a team March 7, 2021 12:02
@Borda Borda merged commit c7f30a2 into Lightning-AI:master Mar 7, 2021
@akihironitta akihironitta deleted the docs/fix-manopt branch March 7, 2021 12:36
@carmocca (Contributor) commented Mar 7, 2021

quick question:
are zero_grad and its corresponding hooks called before backward and optimizer.step in the case of automatic_optimization?

Yes: #6147

@akihironitta (Contributor, Author) commented

@rohitgr7 Yes, but internally, we call those methods inside optimizer.step:

optimizer.step(closure=closure)
  closure()        # the optimizer may call this more than once (e.g. LBFGS)
    forward()
    zero_grad()
    backward()
  update_params()

but this is equivalent to:

forward()
zero_grad()
backward()
optimizer.step(closure=None)
  update_params()

The only difference is whether forward(), zero_grad(), and backward() are called inside or outside the closure. We use the closure form because some optimizers, such as LBFGS, need to call them inside the closure. https://pytorch.org/docs/master/optim.html#torch.optim.LBFGS
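To illustrate the equivalence for a first-order optimizer, here is a minimal sketch with illustrative names (for SGD the closure is evaluated exactly once inside step, so both forms perform the same update):

import torch
import torch.nn.functional as F

model = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(8, 2), torch.randn(8, 1)

# Closure form: forward, zero_grad, and backward run inside optimizer.step.
def closure():
    optimizer.zero_grad()
    loss = F.mse_loss(model(x), y)
    loss.backward()
    return loss

optimizer.step(closure)

# Plain form: the same calls outside step; for SGD the update is identical.
optimizer.zero_grad()
loss = F.mse_loss(model(x), y)
loss.backward()
optimizer.step()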
