6 changes: 6 additions & 0 deletions index.rst
@@ -203,6 +203,11 @@ Parallel and Distributed Training
   :description: :doc:`/intermediate/dist_tuto`
   :figure: _static/img/distributed/DistPyTorch.jpg

.. customgalleryitem::
   :tooltip: Getting Started with Distributed RPC Framework
   :description: :doc:`/intermediate/rpc_tutorial`
   :figure: _static/img/distributed/DistPyTorch.jpg

.. customgalleryitem::
   :tooltip: PyTorch distributed trainer with Amazon AWS
   :description: :doc:`/beginner/aws_distributed_training_tutorial`
@@ -377,6 +382,7 @@ PyTorch Fundamentals In-Depth
   intermediate/model_parallel_tutorial
   intermediate/ddp_tutorial
   intermediate/dist_tuto
   intermediate/rpc_tutorial
   beginner/aws_distributed_training_tutorial

.. toctree::
9 changes: 8 additions & 1 deletion intermediate_source/model_parallel_tutorial.py
@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
"""
Model Parallel Best Practices
Single-Machine Model Parallel Best Practices
============================================
**Author**: `Shen Li <https://mrshenli.github.io/>`_

@@ -27,6 +27,13 @@
of model parallel. It is left to readers to apply these ideas to their own
real-world applications.

.. note::

   For distributed model parallel training where a model spans multiple
   servers, please refer to
   `Getting Started With Distributed RPC Framework <rpc_tutorial.html>`__
   for examples and details.

Basic Usage
-----------
"""