Add support for torch.use_deterministic_algorithms
#9121
Conversation
The parity test failures are related to the new flag. Will take a deeper look.
Codecov Report
@@            Coverage Diff             @@
##           master    #9121      +/-   ##
==========================================
- Coverage      93%      89%       -4%
==========================================
  Files         177      177
  Lines       15456    15460        +4
==========================================
- Hits        14317    13698      -619
- Misses       1139     1762      +623
tchaton
left a comment
LGTM !
(force-pushed from 454f826 to b294c57)
Actually, I am not sure we can switch it that easily. Before, deterministic was more of a "make it as deterministic as possible, but still run" setting. Changing that to raising errors is quite a breaking change, even though it is closer to what one would expect. Personally, I have a lot of code using that flag and relying on it not to fail.
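For illustration, the pre-existing best-effort behaviour described here roughly corresponds to the cuDNN flags, which steer kernel selection but never raise:

```python
import torch

# Best-effort determinism: ask cuDNN to pick deterministic kernels where it
# can and disable autotuning, but never fail if an op has no deterministic
# implementation.
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False
```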
@justusschock I agree, this would be a breaking change. Even some of our tests had to be updated to avoid the runtime error being raised. I am not sure how to square this with the new expectations PyTorch sets around these deterministic checks; the new API is more comprehensive and offers stricter guarantees around reproducibility. Options:
Are there other paths you see?
IMO torch should've provided a function or flag that doesn't raise a RuntimeError. For context: pytorch/pytorch#15359
@kurtamohler (author of torch.use_deterministic_algorithms) do you know whether a warn-only variant was ever discussed upstream? Thanks :)
I'm not sure at the moment where the discussion was; I'll do some searching. To me, it seems like we would have to add a warn-only option to use_deterministic_algorithms. Would you mind opening an issue in pytorch and tagging me in it? Otherwise, I can open an issue next time I'm at my computer.
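For reference, newer PyTorch releases (1.11+) did add exactly this kind of escape hatch via a warn_only keyword; a minimal usage sketch:

```python
import torch

# Requires PyTorch >= 1.11: warn_only=True emits a UserWarning instead of
# raising a RuntimeError when an op has no deterministic implementation.
torch.use_deterministic_algorithms(True, warn_only=True)
```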
+1 for the current approach. IMO, if our Trainer has a "deterministic" argument, it should do what it says, and the stricter version introduced here makes sense to me.
tchaton
left a comment
LGTM !
(force-pushed from 346264a to e04c956)
(force-pushed from 704d2da to 3b2ad55)
What does this PR do?
https://pytorch.org/docs/stable/generated/torch.use_deterministic_algorithms.html
Sets the flag for using deterministic algorithms based on the Trainer flag `deterministic`.
Fixes #9107
Fixes #9544
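As a rough sketch of what this wiring could look like (hypothetical helper name; not the actual Lightning implementation):

```python
import os
import torch

def _init_deterministic(deterministic: bool) -> None:
    """Hypothetical helper: map the Trainer's `deterministic` flag onto PyTorch's global switch."""
    # Global switch introduced in PyTorch 1.8: errors out on ops that have
    # no deterministic implementation (see the linked docs above).
    torch.use_deterministic_algorithms(deterministic)
    if deterministic:
        # Needed for deterministic cuBLAS matmuls on CUDA >= 10.2; without it,
        # those calls raise once the switch above is enabled.
        os.environ.setdefault("CUBLAS_WORKSPACE_CONFIG", ":4096:8")
```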
Does your PR introduce any breaking changes? If yes, please list them.
Yes. With deterministic=True, a RuntimeError is now raised whenever an operation without a deterministic implementation is used. Previously, determinism was best-effort and no error would be raised.
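A hedged illustration of the behavioural change; the bilinear-interpolation backward is one of the CUDA ops documented as having no deterministic implementation:

```python
import torch
import torch.nn.functional as F

# New strict mode (PyTorch >= 1.8): ops without a deterministic
# implementation raise a RuntimeError instead of silently running.
torch.use_deterministic_algorithms(True)

if torch.cuda.is_available():
    x = torch.randn(8, 3, 16, 16, device="cuda", requires_grad=True)
    y = F.interpolate(x, scale_factor=2, mode="bilinear", align_corners=False)
    try:
        # The bilinear backward pass has no deterministic CUDA kernel,
        # so this is expected to raise under the new flag.
        y.sum().backward()
    except RuntimeError as err:
        print(f"deterministic check failed: {err}")
```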
Before submitting
PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the Review guidelines. In short, see the following bullet list:
Did you have fun?
Make sure you had fun coding 🙃