@lijm1358 lijm1358 commented Jul 27, 2022

What does this PR do?

Fixes mypy typing errors in `pytorch_lightning/strategies/ddp.py`, as tracked in #13445.

```
src\pytorch_lightning\strategies\ddp.py:86: error: Function "builtins.callable" is not valid as a type  [valid-type]
src\pytorch_lightning\strategies\ddp.py:86: note: Perhaps you meant "typing.Callable" instead of "callable"?
src\pytorch_lightning\strategies\ddp.py:87: error: Function "builtins.callable" is not valid as a type  [valid-type]
src\pytorch_lightning\strategies\ddp.py:87: note: Perhaps you meant "typing.Callable" instead of "callable"?
src\pytorch_lightning\strategies\ddp.py:120: error: Value of type "Optional[List[device]]" is not indexable  [index]
src\pytorch_lightning\strategies\ddp.py:132: error: Function is missing a return type annotation  [no-untyped-def]
src\pytorch_lightning\strategies\ddp.py:136: error: Function is missing a return type annotation  [no-untyped-def]
src\pytorch_lightning\strategies\ddp.py:149: error: Item "None" of "Optional[ClusterEnvironment]" has no attribute "creates_processes_externally"  [union-attr]
src\pytorch_lightning\strategies\ddp.py:150: error: Argument 1 to "_SubprocessScriptLauncher" has incompatible type "Optional[ClusterEnvironment]"; expected "ClusterEnvironment"  [arg-type]
src\pytorch_lightning\strategies\ddp.py:159: error: Incompatible types in assignment (expression has type "object", variable has type "bool")  [assignment]
src\pytorch_lightning\strategies\ddp.py:163: error: Item "None" of "Optional[Accelerator]" has no attribute "setup"  [union-attr]
src\pytorch_lightning\strategies\ddp.py:173: error: Argument 1 to "apply" of "LayerSync" has incompatible type "Optional[Module]"; expected Module  [arg-type]
src\pytorch_lightning\strategies\ddp.py:194: error: Argument 3 to "DistributedDataParallel" has incompatible type "**Dict[str, Union[Any, Dict[str, Any]]]"; expected "Union[int, device, None]"  [arg-type]
src\pytorch_lightning\strategies\ddp.py:194: error: Argument 3 to "DistributedDataParallel" has incompatible type "**Dict[str, Union[Any, Dict[str, Any]]]"; expected "int"  [arg-type]
src\pytorch_lightning\strategies\ddp.py:194: error: Argument 3 to "DistributedDataParallel" has incompatible type "**Dict[str, Union[Any, Dict[str, Any]]]"; expected "bool"  [arg-type]
src\pytorch_lightning\strategies\ddp.py:194: error: Argument 3 to "DistributedDataParallel" has incompatible type "**Dict[str, Union[Any, Dict[str, Any]]]"; expected "float"  [arg-type]
src\pytorch_lightning\strategies\ddp.py:196: error: Function is missing a return type annotation  [no-untyped-def]
src\pytorch_lightning\strategies\ddp.py:196: note: Use "-> None" if function does not return a value
src\pytorch_lightning\strategies\ddp.py:234: error: Argument "model" to "register_ddp_comm_hook" has incompatible type "Optional[Module]"; expected "DistributedDataParallel"  [arg-type]
src\pytorch_lightning\strategies\ddp.py:266: error: Item "None" of "Optional[object]" has no attribute "start_localSGD_iter"  [union-attr]
src\pytorch_lightning\strategies\ddp.py:299: error: Argument 1 to "LightningDistributedModule" has incompatible type "Optional[Module]"; expected "Union[LightningModule, _LightningPrecisionModuleWrapperBase]"  [arg-type]
src\pytorch_lightning\strategies\ddp.py:302: error: Function is missing a return type annotation  [no-untyped-def]
src\pytorch_lightning\strategies\ddp.py:307: error: Function is missing a type annotation for one or more arguments  [no-untyped-def]
src\pytorch_lightning\strategies\ddp.py:315: error: Return type "object" of "broadcast" incompatible with return type "TBroadcast" in supertype "Strategy"  [override]
src\pytorch_lightning\strategies\ddp.py:324: error: Item "None" of "Optional[LightningModule]" has no attribute "automatic_optimization"  [union-attr]
src\pytorch_lightning\strategies\ddp.py:325: error: Argument 1 to "prepare_for_backward" has incompatible type "Optional[Module]"; expected "DistributedDataParallel"  [arg-type]
src\pytorch_lightning\strategies\ddp.py:327: error: Function is missing a return type annotation  [no-untyped-def]
src\pytorch_lightning\strategies\ddp.py:327: note: Use "-> None" if function does not return a value
src\pytorch_lightning\strategies\ddp.py:331: error: Function is missing a type annotation for one or more arguments  [no-untyped-def]
src\pytorch_lightning\strategies\ddp.py:331: error: Argument 3 of "reduce" is incompatible with supertype "Strategy"; supertype defines the argument type as "Union[ReduceOp, str, None]"  [override]
src\pytorch_lightning\strategies\ddp.py:331: note: This violates the Liskov substitution principle
src\pytorch_lightning\strategies\ddp.py:331: note: See https://mypy.readthedocs.io/en/stable/common_issues.html#incompatible-overrides
src\pytorch_lightning\strategies\ddp.py:347: error: Function is missing a type annotation for one or more arguments  [no-untyped-def]
src\pytorch_lightning\strategies\ddp.py:349: error: "None" not callable  [misc]
src\pytorch_lightning\strategies\ddp.py:351: error: Function is missing a type annotation for one or more arguments  [no-untyped-def]
src\pytorch_lightning\strategies\ddp.py:353: error: Item "None" of "Optional[LightningModule]" has no attribute "trainer"  [union-attr]
src\pytorch_lightning\strategies\ddp.py:355: error: "None" not callable  [misc]
src\pytorch_lightning\strategies\ddp.py:358: error: Item "None" of "Optional[Module]" has no attribute "validation_step"  [union-attr]
src\pytorch_lightning\strategies\ddp.py:358: error: "Tensor" not callable  [operator]
src\pytorch_lightning\strategies\ddp.py:360: error: Function is missing a type annotation for one or more arguments  [no-untyped-def]
src\pytorch_lightning\strategies\ddp.py:362: error: Item "None" of "Optional[Module]" has no attribute "test_step"  [union-attr]
src\pytorch_lightning\strategies\ddp.py:362: error: "Tensor" not callable  [operator]
src\pytorch_lightning\strategies\ddp.py:364: error: Function is missing a type annotation for one or more arguments  [no-untyped-def]
src\pytorch_lightning\strategies\ddp.py:366: error: Item "None" of "Optional[Module]" has no attribute "predict_step"  [union-attr]
src\pytorch_lightning\strategies\ddp.py:366: error: "Tensor" not callable  [operator]
src\pytorch_lightning\strategies\ddp.py:368: error: Function is missing a return type annotation  [no-untyped-def]
src\pytorch_lightning\strategies\ddp.py:368: note: Use "-> None" if function does not return a value
src\pytorch_lightning\strategies\ddp.py:409: error: Incompatible types in assignment (expression has type "object", variable has type "Optional[str]")  [assignment]
src\pytorch_lightning\strategies\ddp.py:447: error: Item "None" of "Optional[List[int]]" has no attribute "__iter__" (not iterable)  [union-attr]
src\pytorch_lightning\strategies\ddp.py:461: error: "Tensor" not callable  [operator]
src\pytorch_lightning\strategies\ddp.py:478: error: Argument 1 to "revert" of "LayerSync" has incompatible type "Optional[Module]"; expected Module  [arg-type]
Found 44 errors in 1 file (checked 240 source files)
```
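Most of the errors above fall into three recurring patterns: `builtins.callable` used as a type annotation (`[valid-type]`), missing return or argument annotations (`[no-untyped-def]`), and attribute access or indexing on `Optional` values without narrowing (`[union-attr]`, `[index]`). The sketch below is a hypothetical, minimal illustration of the usual remedies; the class and attribute names are invented for the example and are not the actual `ddp.py` code:

```python
from typing import Callable, List, Optional


class DDPSketch:
    """Hypothetical stand-in for a strategy class; names are illustrative only."""

    def __init__(self, parallel_devices: Optional[List[int]] = None) -> None:
        # [valid-type]: "builtins.callable" is not valid as a type; annotate
        # with typing.Callable (with parameter and return types) instead.
        self._comm_hook: Optional[Callable[[int], int]] = None
        self.parallel_devices = parallel_devices

    def local_rank(self) -> int:
        return 0

    # [no-untyped-def]: add an explicit return annotation; use "-> None"
    # when the function returns nothing.
    def root_device(self) -> int:
        # [index]/[union-attr]: narrow the Optional with an assert (or an
        # explicit "if ... is None: raise ..." guard) before indexing or
        # attribute access, so mypy can prove the value is not None.
        assert self.parallel_devices is not None
        return self.parallel_devices[self.local_rank()]
```

The same narrowing pattern covers the `[arg-type]` errors where an `Optional[Module]` is passed to a function expecting a plain `Module`: assert (or guard) first, then pass the narrowed value.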

Does your PR introduce any breaking changes? If yes, please list them.

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or minor internal changes/refactors)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃


lijm1358 commented Aug 1, 2022

Also, thank you for the review and feedback, @rohitgr7, @awaelchli 😀

@awaelchli awaelchli added strategy: ddp DistributedDataParallel code quality community This PR is from the community labels Aug 4, 2022
@awaelchli awaelchli added this to the pl:1.8 milestone Aug 4, 2022
codecov bot commented Aug 4, 2022

Codecov Report

Merging #13885 (22cf941) into master (d072e44) will increase coverage by 15%.
The diff coverage is 73%.

❗ Current head 22cf941 differs from pull request most recent head 5090579. Consider uploading reports for the commit 5090579 to get more accurate results

```diff
@@            Coverage Diff            @@
##           master   #13885     +/-   ##
=========================================
+ Coverage      61%      76%    +15%     
=========================================
  Files         324      341     +17     
  Lines       26342    26652    +310     
=========================================
+ Hits        16141    20268   +4127     
+ Misses      10201     6384   -3817     
```

@awaelchli awaelchli enabled auto-merge (squash) August 8, 2022 12:38
@awaelchli awaelchli merged commit 890156a into Lightning-AI:master Aug 8, 2022

Labels

  • code quality
  • community (This PR is from the community)
  • pl (Generic label for PyTorch Lightning package)
  • ready (PRs ready to be merged)
  • strategy: ddp (DistributedDataParallel)
