@@ -44,7 +44,7 @@ two RPC workers on the same host, named "worker0" and "worker1" respectively. Th
 be spawned as subprocesses, and we set some environment variables required for proper
 initialization (see torch.distributed documentation for more details).

-.. code :: python
+.. code :: python3
     import torch
     import torch.distributed.rpc as rpc
     import torch.autograd.profiler as profiler
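The hunk cuts off after the imports, so the actual two-worker setup is not visible here. A minimal sketch of what that setup typically looks like, assuming the usual localhost rendezvous (the port value and the exact structure below are illustrative, not taken from the diff):

.. code:: python3

    import os

    import torch
    import torch.distributed.rpc as rpc
    import torch.multiprocessing as mp


    def worker(rank, world_size):
        # Rendezvous settings torch.distributed needs for initialization.
        os.environ["MASTER_ADDR"] = "localhost"
        os.environ["MASTER_PORT"] = "29500"
        # Each subprocess joins the RPC group under its own name.
        rpc.init_rpc(f"worker{rank}", rank=rank, world_size=world_size)
        # ... RPCs and profiling go here (see below) ...
        # Blocks until all workers are done, then tears down the RPC framework.
        rpc.shutdown()


    if __name__ == "__main__":
        world_size = 2
        # Spawn "worker0" and "worker1" as subprocesses on this host.
        mp.spawn(worker, args=(world_size,), nprocs=world_size)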
@@ -95,7 +95,7 @@ Now that we have a skeleton setup of our RPC framework, we can move on to
 sending RPCs back and forth and using the profiler to obtain a view of what's
 happening under the hood. Let's add to the above "worker" function:

-..code:: python
+..code:: python3
     def worker(rank, world_size):
         # Above code omitted...
         if rank == 0:
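The profiling code added inside the rank-0 branch is likewise outside the visible diff context. A hedged sketch of that block, building on the skeleton above (the tensor shapes and variable names are assumptions; the destination "worker1" and the rpc_async prefix follow the surrounding text):

.. code:: python3

    def worker(rank, world_size):
        # Setup from the sketch above omitted...
        if rank == 0:
            dst = "worker1"
            t1 = torch.rand((3, 3), requires_grad=True)
            t2 = torch.rand((3, 3), requires_grad=True)
            # Profile two asynchronous RPCs to the remote worker.
            with profiler.profile() as prof:
                fut1 = rpc.rpc_async(dst, torch.add, args=(t1, t2))
                fut2 = rpc.rpc_async(dst, torch.mul, args=(t1, t2))
                fut1.wait()
                fut2.wait()
            # Remote ops appear prefixed with rpc_async#... (worker0 -> worker1).
            print(prof.key_averages().table())
        rpc.shutdown()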
@@ -152,7 +152,7 @@ call are prefixed with ::rpc_async#aten::mul(worker0 -> worker1).
 We can also use the profiler to gain insight into user-defined functions that are executed over RPC.
 For example, let's add the following to the above "worker" function:

-..code:: python
+..code:: python3
     # Define somewhere outside of worker() func.
     def udf_with_ops():
         import time
@@ -161,7 +161,7 @@ For example, let's add the following to the above "worker" function:
         torch.add(t1, t2)
         torch.mul(t1, t2)

-..code::python
+..code::python3
     def worker(rank, world_size):
         # Above code omitted
         with profiler.profile() as p:
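The RPC that actually invokes udf_with_ops is again elided by the hunk. A sketch of how rank 0 might profile it (the print of the aggregated table is an assumption; the remote operators it would show are what the next hunk's context refers to):

.. code:: python3

    def worker(rank, world_size):
        # Setup omitted...
        if rank == 0:
            # Execute the user-defined function on the remote worker and profile it.
            with profiler.profile() as p:
                fut = rpc.rpc_async("worker1", udf_with_ops)
                fut.wait()
            # The table includes the remote aten::add / aten::mul run on worker 1.
            print(p.key_averages().table())
        rpc.shutdown()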
@@ -197,7 +197,7 @@ remote operators that have been executed on worker 1 as part of executing this R
 Lastly, we can visualize remote execution using the tracing functionality provided by the profiler.
 Let's add the following code to the above "worker" function:

-..code:: python
+..code:: python3
     def worker(rank, world_size):
         # Above code omitted
         # Will generate trace for above profiling output
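The export call itself falls just outside the hunk; with the profile object ``p`` from the block above, it is plausibly a single export_chrome_trace call (the output path here is only an example):

.. code:: python3

    def worker(rank, world_size):
        # Above code omitted; p is the profiler.profile() object from earlier.
        if rank == 0:
            # Write the profiled events out as a chrome trace (example path).
            # Remote events show up under "node_id: 1" in the trace.
            p.export_chrome_trace("/tmp/rpc_trace.json")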
@@ -215,7 +215,7 @@ in this case, given in the trace column for "node_id: 1".

 Putting it all together, we have the following code for this recipe:

-..code:: python
+..code:: python3
     import torch
     import torch.distributed.rpc as rpc
     import torch.autograd.profiler as profiler
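The complete recipe code continues past the visible imports. A condensed, hedged sketch of the whole script, combining the pieces above (the tensor shapes, the sleep in udf_with_ops, and the trace path are assumptions):

.. code:: python3

    import os

    import torch
    import torch.distributed.rpc as rpc
    import torch.autograd.profiler as profiler
    import torch.multiprocessing as mp


    def random_tensor():
        return torch.rand((3, 3), requires_grad=True)


    # Define somewhere outside of worker() func.
    def udf_with_ops():
        import time
        time.sleep(1)
        t1, t2 = random_tensor(), random_tensor()
        torch.add(t1, t2)
        torch.mul(t1, t2)


    def worker(rank, world_size):
        os.environ["MASTER_ADDR"] = "localhost"
        os.environ["MASTER_PORT"] = "29500"
        rpc.init_rpc(f"worker{rank}", rank=rank, world_size=world_size)
        if rank == 0:
            dst = "worker1"
            t1, t2 = random_tensor(), random_tensor()
            with profiler.profile() as prof:
                # Built-in operators executed over RPC.
                rpc.rpc_async(dst, torch.add, args=(t1, t2)).wait()
                rpc.rpc_async(dst, torch.mul, args=(t1, t2)).wait()
                # User-defined function executed over RPC.
                rpc.rpc_async(dst, udf_with_ops).wait()
            print(prof.key_averages().table())
            # Will generate trace for above profiling output (example path).
            prof.export_chrome_trace("/tmp/rpc_trace.json")
        rpc.shutdown()


    if __name__ == "__main__":
        world_size = 2
        mp.spawn(worker, args=(world_size,), nprocs=world_size)

Running a script along these lines prints the profiler table on worker0 and writes the chrome trace, which can be opened in chrome://tracing; the remote events are listed there under "node_id: 1" as described above.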