Trainer only references accelerator #6039
Conversation
Codecov Report
@@           Coverage Diff            @@
##           master   #6039    +/-   ##
=======================================
- Coverage      93%     93%     -0%
=======================================
  Files         160     160
  Lines       11358   11356     -2
=======================================
- Hits        10569   10538    -31
- Misses        789     818    +29
justusschock
left a comment
Just one minor question. I like it.
    """Hook to do something before the training/evaluation/prediction starts."""
    self.training_type_plugin.post_dispatch()
    self.precision_plugin.post_dispatch()
    self.teardown()
Does this have to be part of post-dispatch?
Doesn't the Trainer have a teardown that should/could call this teardown?
Great point! The trainer should be responsible for calling teardown, as it's responsible for control over the accelerator. I'll move it.
@SeanNaren I can also understand having it here, since this is where the accelerator's control ends. More a point of discussion, I think.
But to me, teardown is more for when you finally leave it (we should think about also calling it in the `__del__` method).
tchaton
left a comment
LGTM! Nice cleanup!
    return self.training_type_plugin.process_dataloader(dataloader)

@property
def results(self) -> Any:
Like it!
def barrier(self, name: Optional[str] = None) -> None:
    self.training_type_plugin.barrier(name=name)

def broadcast(self, obj: object, src: int = 0) -> object:
What is `src`? Could it be better named, or could a docstring be added?
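One way to address this review comment is a docstring clarifying that `src` is the rank of the source process. The sketch below is illustrative, not the fix actually merged; the single-process plugin stand-in is hypothetical:

```python
class SingleProcessPlugin:
    """Stand-in plugin: with a single process, broadcast is the identity."""

    def broadcast(self, obj: object, src: int = 0) -> object:
        return obj


class Accelerator:
    def __init__(self, training_type_plugin):
        self.training_type_plugin = training_type_plugin

    def broadcast(self, obj: object, src: int = 0) -> object:
        """Broadcast an object from one process to all other processes.

        Args:
            obj: the object to broadcast
            src: rank (process index) of the process that holds ``obj``

        Returns:
            the broadcast object, on every process
        """
        return self.training_type_plugin.broadcast(obj, src=src)
```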
What does this PR do?
There were a few places where the trainer referenced the plugins directly. Now the trainer only references the accelerator (except for a few edge cases), and the accelerator calls the training type/precision plugins. This is crucial for accelerators that will require custom behaviour not supported by the plugins.
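The delegation pattern described above can be sketched roughly as follows. The names are simplified stand-ins for illustration (the real classes live in the Lightning codebase and carry many more methods):

```python
class TrainingTypePlugin:
    """Stand-in plugin; records barrier calls instead of synchronizing."""

    def __init__(self):
        self.barrier_calls = []

    def barrier(self, name=None):
        self.barrier_calls.append(name)


class Accelerator:
    """Facade: the trainer talks only to this object."""

    def __init__(self, training_type_plugin):
        self.training_type_plugin = training_type_plugin

    def barrier(self, name=None):
        # forward to the plugin so the trainer never touches it directly
        self.training_type_plugin.barrier(name=name)


class Trainer:
    def __init__(self, accelerator):
        self.accelerator = accelerator

    def run_stage(self):
        # before this PR: self.training_type_plugin.barrier(...)  (direct)
        # after this PR:  only the accelerator is referenced
        self.accelerator.barrier(name="run_stage")
```

Because the trainer sees only the accelerator facade, an accelerator can intercept or replace any plugin call without the trainer knowing.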
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing make sure you have read Review guidelines. In short, see the following bullet-list:
Did you have fun?
Make sure you had fun coding 🙃