Commit d91adc9

Fix parameter server command line options in README (#776)
1 parent: 4e9172b

1 file changed

distributed/rpc/parameter_server/README.md

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@
 This is a basic example of RPC-based training that uses several trainers to remotely train a model hosted on a server.
 
 To run the example locally, run the following command for the server and each worker you wish to spawn, in separate terminal windows:
-`python rpc_parameter_server.py --world_size=WORLD_SIZE --rank=RANK`. For example, for a master node with world size of 2, the command would be `python rpc_parameter_server.py ---world_size=2 --rank=0`. The trainer can then be launched with the command `python rpc_parameter_server.py --world_size=2 --rank=1` in a separate window, and this will begin training with one server and a single trainer.
+`python rpc_parameter_server.py --world_size=WORLD_SIZE --rank=RANK`. For example, for a master node with world size of 2, the command would be `python rpc_parameter_server.py --world_size=2 --rank=0`. The trainer can then be launched with the command `python rpc_parameter_server.py --world_size=2 --rank=1` in a separate window, and this will begin training with one server and a single trainer.
 
 Note that for demonstration purposes, this example supports only between 0-2 GPUs, although the pattern can be extended to make use of additional GPUs. To configure the number of GPUs, pass in `--num_gpus=N` to your training command.
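
For quick reference, here is a minimal launch sequence assembled from the corrected commands in the diff above (world size of 2, one server and one trainer; `--num_gpus` is optional and, per the README's note, this example supports at most 2 GPUs). This is only a sketch based on the flags shown in this excerpt; any additional setup the script expects, such as master address/port configuration, is not covered here:

```bash
# Terminal 1: parameter server (rank 0 is the master node)
python rpc_parameter_server.py --world_size=2 --rank=0

# Terminal 2: trainer (rank 1), optionally using up to 2 GPUs
python rpc_parameter_server.py --world_size=2 --rank=1 --num_gpus=2
```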
