Commit a1de5cf

Update TPU docs for installation
1 parent 13f67ad commit a1de5cf

2 files changed (+2, -5 lines)

docs/source/advanced/tpu.rst

Lines changed: 1 addition & 2 deletions
@@ -64,8 +64,7 @@ To get a TPU on colab, follow these steps:
 
 .. code-block::
 
-    !curl https://raw.githubusercontent.com/pytorch/xla/master/contrib/scripts/env-setup.py -o pytorch-xla-env-setup.py
-    !python pytorch-xla-env-setup.py --version 1.7 --apt-packages libomp5 libopenblas-dev
+    !pip install cloud-tpu-client==0.10 https://storage.googleapis.com/tpu-pytorch/wheels/torch_xla-1.8-cp37-cp37m-linux_x86_64.whl
 
 5. Once the above is done, install PyTorch Lightning (v 0.7.0+).
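
For reference, a quick way to confirm the new wheel installed correctly on Colab is to ask torch_xla for the TPU device. This is a minimal sketch and is not part of the commit:

    # Sanity check (not part of this commit): verify torch_xla can reach the TPU.
    import torch_xla.core.xla_model as xm

    device = xm.xla_device()  # e.g. an xla:1 device when a TPU runtime is attached
    print(device)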

docs/source/starter/introduction_guide.rst

Lines changed: 1 addition & 3 deletions
@@ -572,9 +572,7 @@ Next, install the required xla library (adds support for PyTorch on TPUs)
 
 .. code-block:: shell
 
-    !curl https://raw.githubusercontent.com/pytorch/xla/master/contrib/scripts/env-setup.py -o pytorch-xla-env-setup.py
-
-    !python pytorch-xla-env-setup.py --version nightly --apt-packages libomp5 libopenblas-dev
+    !pip install cloud-tpu-client==0.10 https://storage.googleapis.com/tpu-pytorch/wheels/torch_xla-1.8-cp37-cp37m-linux_x86_64.whl
 
 In distributed training (multiple GPUs and multiple TPU cores) each GPU or TPU core will run a copy
 of this program. This means that without taking any care you will download the dataset N times which
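
The context above leads into Lightning's usual pattern for avoiding repeated downloads. As a rough sketch (assuming the standard LightningDataModule hooks, which this diff does not touch), the download goes in prepare_data, which runs only once, while setup runs in every process:

    # Hypothetical illustration, not part of this commit: download once, set up everywhere.
    import pytorch_lightning as pl
    from torch.utils.data import DataLoader
    from torchvision import transforms
    from torchvision.datasets import MNIST

    class MNISTDataModule(pl.LightningDataModule):
        def prepare_data(self):
            # Called a single time, so the dataset is downloaded only once.
            MNIST("./data", train=True, download=True)

        def setup(self, stage=None):
            # Called in every process; only assigns state, never downloads.
            self.train_set = MNIST("./data", train=True, transform=transforms.ToTensor())

        def train_dataloader(self):
            return DataLoader(self.train_set, batch_size=64)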
