
Conversation

@SeanNaren (Contributor) commented Mar 9, 2021

What does this PR do?

Fixes #6219
Fixes #5799
Fixes #6409
Fixes #5604

There has been a lot of back and forth here, but we're making a trade-off, favoring compatibility over speed. This PR sets the default for `find_unused_parameters` to `True` and adds a note to the documentation suggesting that users set this parameter to `False` for better speed, as sketched below.
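
A minimal sketch of the opt-out, assuming the Lightning 1.2-era plugin API (the exact Trainer arguments here are illustrative, not taken from this PR's diff):

```python
from pytorch_lightning import Trainer
from pytorch_lightning.plugins import DDPPlugin

# find_unused_parameters is forwarded to torch.nn.parallel.DistributedDataParallel.
# With this PR it defaults to True (safe but slower); users who know their model
# uses every parameter in each forward pass can switch it back off for speed.
trainer = Trainer(
    gpus=2,
    accelerator="ddp",
    plugins=DDPPlugin(find_unused_parameters=False),
)
```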

As always, I suggest checking out Sharded Training, which does not have the same issue! A sketch of enabling it is below.
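
For reference, a minimal sketch, assuming the `ddp_sharded` plugin alias that Lightning 1.2 exposes (fairscale must be installed for this to work):

```python
from pytorch_lightning import Trainer

# Sharded Training shards optimizer state and gradients across GPUs and is
# not affected by the find_unused_parameters trade-off discussed above.
trainer = Trainer(gpus=2, accelerator="ddp", plugins="ddp_sharded")
```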

cc @ananthsub @ericharper @tchaton @carmocca @awaelchli @justusschock

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the Review guidelines. In short, see the following bullet list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃

@SeanNaren SeanNaren added the bug (Something isn't working) and distributed (Generic distributed-related topic) labels Mar 9, 2021
@SeanNaren SeanNaren added this to the 1.2.x milestone Mar 9, 2021
@SeanNaren SeanNaren self-assigned this Mar 9, 2021
@SeanNaren SeanNaren requested a review from ananthsub March 9, 2021 13:10
codecov bot commented Mar 9, 2021

Codecov Report

Merging #6438 (5070a49) into master (55dd3a4) will not change coverage.
The diff coverage is n/a.

@@          Coverage Diff           @@
##           master   #6438   +/-   ##
======================================
  Coverage      94%     94%           
======================================
  Files         161     161           
  Lines       11495   11495           
======================================
  Hits        10751   10751           
  Misses        744     744           

@carmocca carmocca mentioned this pull request Mar 9, 2021
@SeanNaren SeanNaren enabled auto-merge (squash) March 9, 2021 18:50
@SeanNaren SeanNaren added the ready (PRs ready to be merged) label Mar 9, 2021
@awaelchli (Contributor) left a comment:

I linked two more issues.

@SeanNaren SeanNaren merged commit c81b2a8 into master Mar 10, 2021
@SeanNaren SeanNaren deleted the fix/find_unused_params branch March 10, 2021 09:40
SeanNaren added a commit that referenced this pull request Mar 16, 2021
…bility (#6438)

* Set find unused parameters to True by default to fix breaking models, add suggestion to re-enable

* Add changelog

(cherry picked from commit c81b2a8)
lexierule pushed a commit that referenced this pull request Mar 16, 2021
…bility (#6438)

* Set find unused parameters to True by default to fix breaking models, add suggestion to re-enable

* Add changelog

(cherry picked from commit c81b2a8)

Labels

  • bug (Something isn't working)
  • distributed (Generic distributed-related topic)
  • ready (PRs ready to be merged)

Development

Successfully merging this pull request may close these issues.

  • Find_unused_parameter=false is causing multi GPU to hang
  • Training is interrupted without error with MultiGPU

5 participants