Remove default optimizer, add None optimizer option #1279
Conversation
```diff
     return model, optimizers

-    def configure_optimizers(self) -> Union[
+    def configure_optimizers(self) -> Optional[Union[
```
what if we just make the whole function optional, instead of just the args? or is that Optional syntax already doing that?
or i guess if we make the method still required but with an optional return, then it maintains readability? on second thought i kind of like this better - that way it's explicit in code that you used no optimizer
Yeah, this does that - or at least, PyCharm isn't prompting me to implement the method :) - the Optional annotation just means you won't get a type warning if you return None
^^ we can do that - currently the method isn't required, but happy to change that :) agree that this behaviour should be explicit
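To illustrate the "explicit no optimizer" idea discussed above, here is a minimal sketch of a user module that opts out of optimization by returning None. The class name and layer are hypothetical, not from this PR; it only assumes pytorch_lightning is installed and that configure_optimizers may return None as this PR allows.

```python
# Minimal sketch (hypothetical module, not from this PR) of the "explicit
# no-optimizer" pattern: the method is still implemented, but returning
# None states that no optimizer should be used.
from typing import Optional

import torch
import pytorch_lightning as pl


class NoOptimizerModel(pl.LightningModule):  # illustrative name
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def configure_optimizers(self) -> Optional[torch.optim.Optimizer]:
        # With the Optional annotation this raises no type warning, and
        # there is no silent fallback to a default optimizer.
        return None
```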
Codecov Report

```
@@            Coverage Diff            @@
##           master    #1279    +/-   ##
========================================
+ Coverage      92%      92%     +<1%
========================================
  Files          62       62
  Lines        3245     3173      -72
========================================
- Hits         2970     2905      -65
+ Misses        275      268       -7
```
Borda left a comment:
can you pls sync this work with #1269
This pull request is now in conflict... :(
Hello @ethanwharris! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻
Comment last updated at 2020-04-02 12:38:16 UTC
* Add warning when using default optimizer
* Refactor optimizer tests to test_optimizers
* Remove default optimizer, add option to use no optimizer
* Update CHANGELOG.md
* Update pytorch_lightning/trainer/optimizers.py
  Co-Authored-By: Jirka Borovec <[email protected]>
* Fix style

Co-authored-by: Jirka Borovec <[email protected]>
Before submitting
What does this PR do?
- Removes the default optimizer; no optimizer is used when `configure_optimizers` is not overridden or returns `None`
- Moves `init_optimizers` and `configure_schedulers` to their own mixin in a separate file (`optimizers.py`)
- Changes `init_optimizers` to take just the model, so that `configure_optimizers` is only called in one place (see the sketch after this list)
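A rough sketch of the behaviour described by the bullets above, assuming only what this PR description states. The helper name `init_optimizers_sketch` and the exact return shape are illustrative assumptions, not the actual Lightning implementation.

```python
# Rough sketch, NOT the actual Lightning code: an init_optimizers-style
# helper that receives only the model and handles configure_optimizers()
# returning None instead of falling back to a default optimizer.
from typing import List, Tuple

import torch


def init_optimizers_sketch(model) -> Tuple[List[torch.optim.Optimizer], List]:
    # configure_optimizers is called in exactly one place.
    optim_conf = model.configure_optimizers()

    if optim_conf is None:
        # Explicit "no optimizer" case: return empty lists rather than
        # silently creating a default Adam optimizer.
        return [], []

    # A single optimizer returned directly.
    if isinstance(optim_conf, torch.optim.Optimizer):
        return [optim_conf], []

    # A list/tuple of optimizers (scheduler handling omitted in this sketch).
    return list(optim_conf), []
```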
PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃