
Conversation

@awaelchli
Contributor

What does this PR do?

Fixes #6382

Every time something changes in Google Colab, someone has to come here and manually update these version numbers, duh :(

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing make sure you have read Review guidelines. In short, see the following bullet-list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃

@awaelchli awaelchli added the bug (Something isn't working), accelerator: tpu (Tensor Processing Unit), and 3rd party (Related to a 3rd-party) labels on Mar 7, 2021
@review-notebook-app

Check out this pull request on  ReviewNB

See visual diffs & provide feedback on Jupyter Notebooks.



@awaelchli awaelchli added this to the 1.2.x milestone on Mar 7, 2021
@codecov

codecov bot commented Mar 7, 2021

Codecov Report

Merging #6399 (9ac5bc8) into master (c7f30a2) will decrease coverage by 2%.
The diff coverage is n/a.

@@           Coverage Diff           @@
##           master   #6399    +/-   ##
=======================================
- Coverage      94%     92%    -2%     
=======================================
  Files         161     161            
  Lines       11476   11476            
=======================================
- Hits        10735   10511   -224     
- Misses        741     965   +224     

The change in the notebook's install cell bumps the torch_xla wheel from the cp36 build to the cp37 build:

- "! pip install cloud-tpu-client==0.10 https://storage.googleapis.com/tpu-pytorch/wheels/torch_xla-1.7-cp36-cp36m-linux_x86_64.whl"
+ "! pip install cloud-tpu-client==0.10 https://storage.googleapis.com/tpu-pytorch/wheels/torch_xla-1.7-cp37-cp37m-linux_x86_64.whl"
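One way to make the install cell less sensitive to Colab's Python upgrades (a sketch only, not something this PR does; the `wheel_url` construction and version logic below are my own illustration) would be to derive the CPython wheel tag from the running interpreter instead of hard-coding it:

```python
import sys

# Build the CPython wheel tag (e.g. "cp37" / "cp37m") for the running
# interpreter, so the torch_xla wheel URL tracks Colab's Python version.
# The "m" (pymalloc) ABI suffix only exists for Python < 3.8.
major, minor = sys.version_info[:2]
tag = f"cp{major}{minor}"
abi = tag + ("m" if (major, minor) < (3, 8) else "")
wheel_url = (
    "https://storage.googleapis.com/tpu-pytorch/wheels/"
    f"torch_xla-1.7-{tag}-{abi}-linux_x86_64.whl"
)
print(wheel_url)
```

The notebook could then run `!pip install cloud-tpu-client==0.10 {wheel_url}`. This still assumes a cp36/cp37-style wheel exists for the interpreter in use, so it is a mitigation, not a guarantee.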
Contributor


Any documentation we should add to handle future version changes?

Collaborator


Just follow Google's announcements on Colab upgrades...
Unfortunately, there is no simple way to test Colab in CI.

Contributor Author


In the docs we show these commands here:
https://pytorch-lightning.readthedocs.io/en/latest/advanced/tpu.html#colab-tpus

Maybe that approach is more future-proof, and we should use it in the notebooks as well?
I honestly have no clue here; I found this fix by chance and only made it because I'm on call, tbh.


@Borda Borda added the ready PRs ready to be merged label Mar 7, 2021
@Borda
Collaborator

Borda commented Mar 7, 2021

@awaelchli Eventually you could also drop py3.6 from the testing matrix in our CI.


Labels

  • 3rd party (Related to a 3rd-party)
  • accelerator: tpu (Tensor Processing Unit)
  • bug (Something isn't working)
  • ready (PRs ready to be merged)

Projects

None yet

Development

Successfully merging this pull request may close these issues.

TPU setup in docs results in import failure for flash.Task

4 participants