Commit 39d641b

Fix comment from "point-to-point" -> "collective" for describing all-reduce (#1544)
1 parent c38ba86 commit 39d641b

File tree

1 file changed (+1 addition, -1 deletion)

intermediate_source/dist_tuto.rst

Lines changed: 1 addition & 1 deletion
@@ -207,7 +207,7 @@ to obtain the sum of all tensors at all processes, we can use the

     """ All-Reduce example."""
     def run(rank, size):
-        """ Simple point-to-point communication. """
+        """ Simple collective communication. """
         group = dist.new_group([0, 1])
         tensor = torch.ones(1)
         dist.all_reduce(tensor, op=dist.ReduceOp.SUM, group=group)
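The corrected docstring matters because `dist.all_reduce` is a collective operation: every rank in the group contributes its tensor and every rank receives the reduced result, unlike point-to-point `send`/`recv`, which moves data between exactly one sender and one receiver. A minimal pure-Python sketch of the SUM all-reduce semantics (no torch; the `all_reduce_sum` helper and the list-of-ranks model are illustrative, not part of the tutorial):

```python
def all_reduce_sum(per_rank_values):
    """Simulate a SUM all-reduce: every rank contributes its value,
    and every rank ends up holding the same reduced total."""
    total = sum(per_rank_values)            # reduction step across all ranks
    return [total] * len(per_rank_values)   # every rank receives the result

# Two ranks, each starting with the value 1 (mirroring torch.ones(1)):
print(all_reduce_sum([1.0, 1.0]))  # -> [2.0, 2.0]
```

In the real tutorial code, the same effect is achieved in place: after `dist.all_reduce(tensor, op=dist.ReduceOp.SUM, group=group)` returns, `tensor` on both rank 0 and rank 1 holds the sum of the two ranks' tensors.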
