Conversation

@speediedan (Contributor) commented Jun 10, 2021

What does this PR do?

Fixes #7930

This PR fixes issue #7930 and adds test coverage in test_finetuning_callback.py::test_parent_module_w_param_model.

Models whose parent modules directly contain parameters (as opposed to holding parameters exclusively in their constituent submodules) had those parameters ignored by the BaseFinetuning callback.

The primary issue is addressed with an additional filtering condition in BaseFinetuning.flatten_modules(). To avoid yielding duplicate parameters in methods that depend on BaseFinetuning.flatten_modules, I also added recurse=False to the parameter loops in BaseFinetuning.freeze, BaseFinetuning.make_trainable, and BaseFinetuning.filter_params.

The new test could potentially be consolidated with tests/callbacks/test_finetuning_callback.py::test_deep_nested_model, but I have initially kept it separate for clarity. As a practical example, I first encountered this bug when using BaseFinetuning with deberta, specifically its DisentangledSelfAttention parent module.

I've updated the docstring of flatten_modules() to reflect the changes. All test_finetuning_callback.py tests are passing on my end.
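To illustrate the failure mode and the shape of the fix, here is a minimal sketch. The ParentWithParams module and both flatten helpers are hypothetical stand-ins written for this example, not the actual Lightning implementation; they only mirror the idea of "keep leaf modules, plus any non-leaf module that owns parameters directly" and the recurse=False deduplication described above.

```python
import torch
import torch.nn as nn


class ParentWithParams(nn.Module):
    """Hypothetical parent module that owns a parameter directly AND via a
    submodule, loosely analogous to deberta's DisentangledSelfAttention."""

    def __init__(self):
        super().__init__()
        # Parameter registered directly on the parent, not inside any submodule
        self.pos_bias = nn.Parameter(torch.zeros(4))
        self.linear = nn.Linear(4, 4)


def flatten_leaves_only(module: nn.Module) -> list:
    # Old behavior (sketch): keep only leaf modules, i.e. modules with no
    # children. A parent holding its own parameters is silently dropped.
    return [m for m in module.modules() if not list(m.children())]


def flatten_with_param_owners(module: nn.Module) -> list:
    # Fixed behavior (sketch): also keep non-leaf modules that own
    # parameters directly (recurse=False sees only the module's own params).
    return [
        m for m in module.modules()
        if not list(m.children()) or list(m.parameters(recurse=False))
    ]


model = ParentWithParams()
old = flatten_leaves_only(model)
fixed = flatten_with_param_owners(model)

# The parent is missing from the leaf-only flattening, so pos_bias would be
# ignored when freezing/unfreezing; the fixed version keeps it.
assert model not in old
assert model in fixed

# Downstream parameter loops must use recurse=False so each parameter is
# visited exactly once (the parent no longer re-yields its children's params).
params = [p for m in fixed for p in m.parameters(recurse=False)]
assert len(params) == 3  # pos_bias + linear.weight + linear.bias
```

With recurse=True in the final loop, linear's weight and bias would be yielded twice (once via the parent, once via the leaf), which is exactly the duplication the recurse=False change guards against.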

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing make sure you have read Review guidelines. In short, see the following bullet-list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃
YES! This is my first contribution to PL but I'm hoping to be more involved in the future. Thanks for the awesome product/work!


codecov bot commented Jun 10, 2021

Codecov Report

Merging #7931 (2350723) into master (22d8266) will decrease coverage by 5%.
The diff coverage is 100%.

@@           Coverage Diff           @@
##           master   #7931    +/-   ##
=======================================
- Coverage      92%     87%    -5%     
=======================================
  Files         203     203            
  Lines       13127   13127            
=======================================
- Hits        12013   11395   -618     
- Misses       1114    1732   +618     

@carmocca (Contributor) left a comment

Nice catch!

Remember to update the CHANGELOG.md file

@carmocca carmocca added the bug (Something isn't working) and callback labels Jun 11, 2021
@carmocca carmocca added this to the v1.3.x milestone Jun 11, 2021
@carmocca carmocca changed the title Fix for #7930: Parent Modules w/ Parameters Improperly Handled by BaseFinetuning Callback Properly handle parent modules w/ parameters in BaseFinetuning callback Jun 11, 2021
@speediedan speediedan force-pushed the bugfix/7930_parent_module_w_param branch from e477e51 to d36525d Compare June 11, 2021 19:09
@speediedan speediedan force-pushed the bugfix/7930_parent_module_w_param branch from d36525d to 8b8a8cf Compare June 11, 2021 21:50
@mergify mergify bot removed the has conflicts label Jun 12, 2021
@awaelchli awaelchli added the ready (PRs ready to be merged) label Jun 13, 2021
@mergify mergify bot removed the has conflicts label Jun 13, 2021
@kaushikb11 kaushikb11 enabled auto-merge (squash) June 13, 2021 04:33
@carmocca carmocca disabled auto-merge June 13, 2021 15:41
@tchaton (Contributor) left a comment

Good catch! Thanks for using the BaseFinetuning callback and catching this bug.

@awaelchli awaelchli enabled auto-merge (squash) June 14, 2021 13:59
@mergify mergify bot removed the has conflicts label Jun 14, 2021
@awaelchli awaelchli merged commit 3a0ed02 into Lightning-AI:master Jun 14, 2021
@carmocca carmocca mentioned this pull request Jun 15, 2021
carmocca added a commit that referenced this pull request Jun 15, 2021
…back (#7931)

Co-authored-by: Daniel Dale <[email protected]>
Co-authored-by: Carlos Mocholí <[email protected]>
Co-authored-by: Kaushik B <[email protected]>
Co-authored-by: Adrian Wälchli <[email protected]>
carmocca added a commit that referenced this pull request Jun 15, 2021
…back (#7931)

Co-authored-by: Daniel Dale <[email protected]>
Co-authored-by: Carlos Mocholí <[email protected]>
Co-authored-by: Kaushik B <[email protected]>
Co-authored-by: Adrian Wälchli <[email protected]>
lexierule pushed a commit that referenced this pull request Jun 17, 2021
…back (#7931)

Co-authored-by: Daniel Dale <[email protected]>
Co-authored-by: Carlos Mocholí <[email protected]>
Co-authored-by: Kaushik B <[email protected]>
Co-authored-by: Adrian Wälchli <[email protected]>
@speediedan speediedan deleted the bugfix/7930_parent_module_w_param branch May 12, 2022 19:59

Labels

bug (Something isn't working), callback, ready (PRs ready to be merged)

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Parent (i.e. non-leaf) Modules w/ Parameters Not Properly Handled By Finetuning Callback

6 participants