
Commit d61392e
update
1 parent c2014b1 commit d61392e

1 file changed: +7 -1 lines changed


recipes_source/distributed_rpc_profiling.rst

Lines changed: 7 additions & 1 deletion
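The added blank lines below all serve one reStructuredText rule: a directive's indented content must be separated from the directive line by a blank line, or docutils will not parse it as the directive's body. (Relatedly, `..code::` with no space after the two dots is not recognized as explicit markup at all; only `.. code::` is.) A minimal illustration of the rule, not taken verbatim from the file:

```rst
Missing the blank line, the indented content is misparsed:

.. code:: python3
   import torch

With the blank line, it renders as a highlighted code block:

.. code:: python3

   import torch
```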
@@ -45,6 +45,7 @@ be spawned as subprocesses, and we set some environment variables required for p
 initialization (see torch.distributed documentation for more details).
 
 .. code:: python3
+
   import torch
   import torch.distributed.rpc as rpc
   import torch.autograd.profiler as profiler
@@ -87,7 +88,8 @@ initialization (see torch.distributed documentation for more details).
 
 Running the above program should present you with the following output:
 
-..
+..
+
   DEBUG:root:worker0 successfully initialized RPC.
   DEBUG:root:worker1 successfully initialized RPC.
 
@@ -96,6 +98,7 @@ sending RPCs back and forth and using the profiler to obtain a view of what's
 happening under the hood. Let's add to the above "worker" function:
 
 ..code:: python3
+
   def worker(rank, world_size):
       # Above code omitted...
       if rank == 0:
@@ -153,6 +156,7 @@ We can also use the profiler gain insight into user-defined functions that are e
 For example, let's add the following to the above "worker" function:
 
 ..code:: python3
+
   # Define somewhere outside of worker() func.
   def udf_with_ops():
       import time
@@ -198,6 +202,7 @@ Lastly, we can visualize remote execution using the tracing functionality provid
 Let's add the following code to the above "worker" function:
 
 ..code:: python3
+
   def worker(rank, world_size):
       # Above code omitted
       # Will generated trace for above profiling output
@@ -217,6 +222,7 @@ in this case, given in the trace column for "node_id: 1".
 Putting it all together, we have the following code for this recipe:
 
 ..code:: python3
+
   import torch
   import torch.distributed.rpc as rpc
   import torch.autograd.profiler as profiler
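Beneath the markup fix, the recipe these hunks touch is about profiling RPC work with torch.autograd.profiler. A runnable sketch of the local half of that pattern, with the body of udf_with_ops assumed from the surrounding context (no RPC workers are spawned here, so this only shows how the profiler records and exports events):

```python
import time

import torch
import torch.autograd.profiler as profiler


def udf_with_ops():
    # Assumed body, mirroring the recipe's user-defined function:
    # sleep briefly, then run a couple of tensor ops so they
    # show up in the profiler output.
    time.sleep(0.1)
    t1, t2 = torch.ones(1), torch.ones(1)
    torch.add(t1, t2)
    torch.mul(t1, t2)


# Record all ops executed while the context manager is active.
with profiler.profile() as prof:
    udf_with_ops()

# Aggregate recorded events; the add/mul ops should appear here.
print(prof.key_averages().table(sort_by="cpu_time_total", row_limit=5))

# Export a Chrome trace for visualization, as the recipe does
# (load the file via chrome://tracing).
prof.export_chrome_trace("trace.json")
```

In the full recipe, the same context manager wraps `rpc.rpc_sync`/`rpc.rpc_async` calls, so the profile additionally attributes remote execution to the callee's node_id.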
