
Conversation

SeanNaren (Contributor) commented Feb 17, 2021

What does this PR do?

There were a few places where the trainer referenced the plugins directly. Now the trainer only references the accelerator (except for a few edge cases), and the accelerator calls the training type and precision plugins. This is crucial for accelerators that will require custom behaviour not supported by the plugins.
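
A rough sketch of the resulting layering (hypothetical, simplified classes; only the method names quoted in the review snippets below come from the PR, the rest is illustrative):

from typing import Any, Optional

class Accelerator:
    def __init__(self, training_type_plugin, precision_plugin):
        self.training_type_plugin = training_type_plugin
        self.precision_plugin = precision_plugin

    def barrier(self, name: Optional[str] = None) -> None:
        # the accelerator forwards collective calls to the training type plugin
        self.training_type_plugin.barrier(name=name)

    @property
    def results(self) -> Any:
        return self.training_type_plugin.results

class Trainer:
    def __init__(self, accelerator: Accelerator):
        # the trainer holds only the accelerator, never the plugins directly
        self.accelerator = accelerator

    def sync(self) -> None:
        # trainer -> accelerator -> plugin, instead of trainer -> plugin
        self.accelerator.barrier()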

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the Review guidelines. In short, see the following bullet list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃

SeanNaren added the feature and distributed labels Feb 17, 2021
SeanNaren added this to the 1.2 milestone Feb 17, 2021
SeanNaren self-assigned this Feb 17, 2021
codecov bot commented Feb 17, 2021

Codecov Report

Merging #6039 (4cfa025) into master (6a409c7) will decrease coverage by 0%.
The diff coverage is 100%.

@@          Coverage Diff           @@
##           master   #6039   +/-   ##
======================================
- Coverage      93%     93%   -0%     
======================================
  Files         160     160           
  Lines       11358   11356    -2     
======================================
- Hits        10569   10538   -31     
- Misses        789     818   +29     

justusschock (Member) left a comment

Just one minor question. I like it.

"""Hook to do something before the training/evaluation/prediction starts."""
self.training_type_plugin.post_dispatch()
self.precision_plugin.post_dispatch()
self.teardown()
Member

Does this have to be part of post-dispatch?

Doesn't the Trainer have a teardown that should/could call this teardown?

Contributor Author (SeanNaren)

great point! the trainer should be responsible for calling teardown as it's responsible for control over the accelerator, i'll move it

Member

@SeanNaren I can also understand having it here, since this is where the accelerator's control ends. More a point of discussion, I think.
But to me teardown is more for when you finally leave it (we should also think about calling that in the __del__ method).
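
For illustration, the option SeanNaren describes would look roughly like this (a hypothetical, simplified Trainer; only post_dispatch and teardown mirror the snippet above):

class Trainer:
    def fit(self, model) -> None:
        try:
            ...  # dispatch training/evaluation through the accelerator
        finally:
            # accelerator.post_dispatch() only cleans up the plugins;
            # the trainer, which controls the accelerator, owns the final teardown
            self.accelerator.post_dispatch()
            self.accelerator.teardown()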

tchaton (Contributor) left a comment

LGTM! Nice cleaning!

return self.training_type_plugin.process_dataloader(dataloader)

@property
def results(self) -> Any:
Contributor

Like it!

SeanNaren added and then removed the _Will label Feb 17, 2021
SeanNaren enabled auto-merge (squash) February 17, 2021 20:26
def barrier(self, name: Optional[str] = None) -> None:
self.training_type_plugin.barrier(name=name)

def broadcast(self, obj: object, src: int = 0) -> object:
Collaborator

what is src? could it be better named, or a docstring added?
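
For context, a docstring along these lines would answer the question (an illustrative sketch only, not the wording that landed in the codebase; the delegation to the training type plugin follows the pattern shown above):

def broadcast(self, obj: object, src: int = 0) -> object:
    """Broadcast an object from one process to all others.

    Args:
        obj: the object to broadcast
        src: the rank of the process that holds the object to be broadcast
    """
    return self.training_type_plugin.broadcast(obj, src=src)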

SeanNaren merged commit b7c2e0a into master Feb 17, 2021
SeanNaren deleted the feat/training_type_refs branch February 17, 2021 20:41

Labels

  • distributed: Generic distributed-related topic
  • feature: Is an improvement or enhancement

Projects

None yet

5 participants