
Commit 6c7d597 (1 parent: 6a62f96)
Commit message: User relative URLs

File tree: 2 files changed (+2, −2 lines)


intermediate_source/model_parallel_tutorial.py (1 addition, 1 deletion)

@@ -31,7 +31,7 @@
 
 For distributed model parallel training where a model spans multiple
 servers, please refer to
-`Getting Started With Distributed RPC Framework <rpc_tutorial.html>__
+`Getting Started With Distributed RPC Framework <rpc_tutorial.html>`__
 for examples and details.
 
 Basic Usage
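For context, the one-character fix above closes an unterminated reStructuredText anonymous hyperlink: the `` `text <target>`__ `` form needs a closing backtick before the double underscore, otherwise the link does not render. A minimal sketch of the two forms (link text and target taken from the diff itself):

```rst
.. Correct anonymous hyperlink: the backtick closes before the double underscore.

`Getting Started With Distributed RPC Framework <rpc_tutorial.html>`__

.. Broken form from the old line, shown here as a comment; the missing
.. closing backtick leaves the markup unterminated:
.. `Getting Started With Distributed RPC Framework <rpc_tutorial.html>__
```

The double underscore makes the reference anonymous, so the same link text can appear more than once in a document without a duplicate-target warning.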

intermediate_source/rpc_tutorial.rst (1 addition, 1 deletion)

@@ -16,7 +16,7 @@ Source code of the two examples can be found in
 
 Previous tutorials,
 `Getting Started With Distributed Data Parallel <ddp_tutorial.html>`__
-and `Writing Distributed Applications With PyTorch <https://pytorch.org/tutorials/intermediate/dist_tuto.html>`__,
+and `Writing Distributed Applications With PyTorch <dist_tuto.html>`__,
 described `DistributedDataParallel <https://pytorch.org/docs/stable/_modules/torch/nn/parallel/distributed.html>`__
 which supports a specific training paradigm where the model is replicated across
 multiple processes and each process handles a split of the input data.
