3 files changed: +654 −1

@@ -203,6 +203,11 @@ Parallel and Distributed Training
    :description: :doc:`/intermediate/dist_tuto`
    :figure: _static/img/distributed/DistPyTorch.jpg

+.. customgalleryitem::
+   :tooltip: Getting Started with Distributed RPC Framework
+   :description: :doc:`/intermediate/rpc_tutorial`
+   :figure: _static/img/distributed/DistPyTorch.jpg
+
 .. customgalleryitem::
    :tooltip: PyTorch distributed trainer with Amazon AWS
    :description: :doc:`/beginner/aws_distributed_training_tutorial`
@@ -377,6 +382,7 @@ PyTorch Fundamentals In-Depth
    intermediate/model_parallel_tutorial
    intermediate/ddp_tutorial
    intermediate/dist_tuto
+   intermediate/rpc_tutorial
    beginner/aws_distributed_training_tutorial

 .. toctree::
@@ -1,6 +1,6 @@
 # -*- coding: utf-8 -*-
 """
-Model Parallel Best Practices
+Single-Machine Model Parallel Best Practices
 ================================
 **Author**: `Shen Li <https://mrshenli.github.io/>`_

@@ -27,6 +27,13 @@
 of model parallel. It is up to the readers to apply the ideas to real-world
 applications.

+.. note::
+
+   For distributed model parallel training where a model spans multiple
+   servers, please refer to
+   `Getting Started With Distributed RPC Framework <rpc_tutorial.html>`__
+   for examples and details.
+
 Basic Usage
 -----------
 """