Add before_batch_transfer and after_batch_transfer hooks #3671
Conversation
This pull request is now in conflict... :(
Hi @rohitgr7! Mind rebasing your PR?
Will complete this PR this week, probably.
This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. If you need further help see our docs: https://pytorch-lightning.readthedocs.io/en/latest/CONTRIBUTING.html#pull-request or ask the assistance of a core contributor here or on Slack. Thank you for your contributions.
Borda left a comment:
LGTM. Is it ready for review (aside from the very last reverting actions needed)?
Not yet @Borda, a little busy this week. Will try to complete this ASAP.
Force-pushed from 329f9d0 to 0f28e24.
Codecov Report

@@            Coverage Diff            @@
##           master    #3671     +/-   ##
=========================================
- Coverage      93%      91%      -2%
=========================================
  Files         160      160
  Lines       11340    11341       +1
=========================================
- Hits        10550    10270     -280
- Misses        790     1071     +281
Title changed from “before_batch_transfer and after_batch_transfer hooks in LightningDataModule” to “before_batch_transfer and after_batch_transfer hooks”.
Force-pushed from bc0121c to 8ddc2d7.
Thanks for adding these hooks! I would like to use these to apply transforms across my entire batch when training. How would I find out if a batch is part of the training, validation, or test set? I want to make sure I'm only applying augmentations during training. I had a look to see if I could get that from a state variable within the trainer, but I don't think I can access that within methods called by these hooks. Thank you again, it's really a joy porting my code to pytorch-lightning.
Hey @leifdenby! Do you mind opening a new question issue or asking in the PyTorch Lightning Slack channel? Primarily, I don't want this getting lost since the issue is closed already :)
hey @leifdenby, you can use:

```python
def transfer_batch_to_device(self, batch, ...):
    if self.trainer.testing:
        return test_transforms(batch)
    if self.trainer.training:
        return train_transforms(batch)
    ...
```

Will add this to the docs for convenience.
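To make this concrete, here is a minimal, hypothetical sketch of train-only augmentation with the new after_batch_transfer hook. The hook signature, the MyDataModule class, and the noise-based train_transforms are all illustrative assumptions rather than the PR's exact API; the self.trainer state check follows the snippet above.

```python
import torch
import pytorch_lightning as pl


def train_transforms(batch):
    # Illustrative batch-level augmentation: add a little Gaussian noise.
    x, y = batch
    return x + 0.01 * torch.randn_like(x), y


class MyDataModule(pl.LightningDataModule):
    # Hypothetical override of the hook this PR adds; the exact signature
    # is an assumption based on the discussion above.
    def after_batch_transfer(self, batch):
        # Apply augmentations only while training.
        if self.trainer.training:
            return train_transforms(batch)
        return batch
```

Since, by its name, after_batch_transfer runs once the batch has been moved, the augmentation would happen on the target device rather than in the dataloader workers.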
What does this PR do?
Fixes #3399
Fixes #3087
Fixes #3461
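For context, a rough sketch of where the new hooks sit relative to the existing transfer_batch_to_device hook. The call order is inferred from the hook names, and apply_batch_hooks as well as the exact signatures are assumptions for illustration:

```python
def apply_batch_hooks(module, batch, device):
    # Inferred call order (signatures are assumptions, not the exact API):
    batch = module.before_batch_transfer(batch)             # sees the CPU-side batch
    batch = module.transfer_batch_to_device(batch, device)  # existing hook
    batch = module.after_batch_transfer(batch)              # sees the device-side batch
    return batch
```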
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃