[doc] Fix closure in manual optimization #6374
Conversation
Codecov Report
@@            Coverage Diff            @@
##           master   #6374      +/-   ##
=========================================
- Coverage      93%      86%        -7%
=========================================
  Files         161      161
  Lines       11464    12596      +1132
=========================================
+ Hits        10706    10840       +134
- Misses        758     1756       +998
rohitgr7
left a comment
Quick question:
Are `zero_grad` and its corresponding hooks called before `backward` and `optimizer.step` in the case of automatic optimization?
Yes: #6147
@rohitgr7 Yes. Internally, we call those methods inside a closure, but this is equivalent to calling them outside. The only difference is whether `zero_grad` is called inside or outside the closure passed to `optimizer.step`.
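For illustration, here is a rough sketch of that equivalence (the names `training_step`, `batch`, `model`, and `optimizer` are stand-ins, not Lightning's actual internals): the forward/backward work is wrapped in a closure handed to `optimizer.step()`, which for ordinary optimizers behaves the same as running the steps directly.

```python
import torch

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
batch = (torch.randn(8, 4), torch.randn(8, 1))


def training_step(batch):
    # Hypothetical stand-in for a LightningModule's training_step.
    x, y = batch
    return torch.nn.functional.mse_loss(model(x), y)


# Roughly what automatic optimization does: wrap the work in a closure
# and hand it to optimizer.step(); zero_grad runs before backward.
def closure():
    optimizer.zero_grad()
    loss = training_step(batch)
    loss.backward()
    return loss


optimizer.step(closure)

# For optimizers such as SGD or Adam this is equivalent to the unrolled form:
#
#     optimizer.zero_grad()
#     loss = training_step(batch)
#     loss.backward()
#     optimizer.step()
#
# The two only differ for optimizers like LBFGS, which call the closure
# several times inside step() and therefore need zero_grad inside it.
```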
What does this PR do?
In most cases, `zero_grad` can be called outside the closure, but `zero_grad` needs to be called inside the closure when using an optimizer like `LBFGS`, which requires reevaluation of the loss in `optimizer.step(closure)`. This PR fixes the closure in the docs so that it works in all the cases above.
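As a concrete illustration, here is a minimal sketch of the fixed pattern (assuming a recent Lightning manual-optimization API; `compute_loss` is a hypothetical helper and the exact snippet in the docs may differ). Calling `zero_grad` inside the closure makes it safe for `LBFGS` as well as for plain optimizers:

```python
import torch
from pytorch_lightning import LightningModule


class ManualOptimModel(LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)
        # Disable automatic optimization; the exact flag mechanics may vary by version.
        self.automatic_optimization = False

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()

        def closure():
            # zero_grad inside the closure, so optimizers like LBFGS that
            # re-evaluate the loss inside optimizer.step(closure) always
            # start from clean gradients.
            opt.zero_grad()
            loss = self.compute_loss(batch)  # hypothetical loss helper
            self.manual_backward(loss)
            return loss

        opt.step(closure=closure)

    def configure_optimizers(self):
        return torch.optim.LBFGS(self.parameters(), lr=0.1)
```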
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the Review guidelines.
Did you have fun?
Make sure you had fun coding 🙃