2 files changed, +3 −3

@@ -593,9 +593,9 @@ Below are the possible configurations we support.

 Implement Your Own Distributed (DDP) training
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
- If you need your own way to init PyTorch DDP you can override :meth:`pytorch_lightning.core.LightningModule.`.
+ If you need your own way to init PyTorch DDP you can override :meth:`pytorch_lightning.plugins.ddp_plugin.DDPPlugin.init_ddp_connection`.

- If you also need to use your own DDP implementation, override :meth:`pytorch_lightning.core.LightningModule.configure_ddp`.
+ If you also need to use your own DDP implementation, override :meth:`pytorch_lightning.plugins.ddp_plugin.DDPPlugin.configure_ddp`.


----------
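For context, overriding these hooks might look roughly like the sketch below. This is an illustrative example, not code from this commit; the exact `DDPPlugin` method signatures vary across Lightning releases, so the arguments shown here are assumptions.

.. code-block:: python

    import torch.distributed as torch_distrib
    from torch.nn.parallel import DistributedDataParallel

    from pytorch_lightning import Trainer
    from pytorch_lightning.plugins.ddp_plugin import DDPPlugin


    class MyDDPPlugin(DDPPlugin):
        def init_ddp_connection(self, global_rank, world_size):
            # Assumed signature: initialize the process group yourself
            # instead of relying on the default initialization.
            torch_distrib.init_process_group(
                "nccl", rank=global_rank, world_size=world_size
            )

        def configure_ddp(self, model, device_ids):
            # Assumed signature: wrap the model in your own DDP
            # implementation (plain torch DDP shown for illustration).
            return DistributedDataParallel(model, device_ids=device_ids)


    # Custom plugins are passed to the Trainer via the `plugins` flag.
    trainer = Trainer(gpus=2, accelerator="ddp", plugins=[MyDDPPlugin()])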

@@ -46,7 +46,7 @@ You can customize the checkpointing behavior to monitor any quantity of your training or validation steps.
 1. Calculate any metric or other quantity you wish to monitor, such as validation loss.
 2. Log the quantity using the :func:`~pytorch_lightning.core.lightning.LightningModule.log` method, with a key such as `val_loss`.
 3. Initialize the :class:`~pytorch_lightning.callbacks.ModelCheckpoint` callback, and set `monitor` to be the key of your quantity.
- 4. Pass the callback to the `checkpoint_callback` :class:`~pytorch_lightning.trainer.Trainer` flag.
+ 4. Pass the callback to the `callbacks` :class:`~pytorch_lightning.trainer.Trainer` flag.

.. code-block:: python

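The body of the code block above sits outside this hunk. For reference, steps 1-4 might look roughly like this (an illustrative sketch assuming the Lightning 1.x API; `compute_loss` is a hypothetical helper):

.. code-block:: python

    from pytorch_lightning import LightningModule, Trainer
    from pytorch_lightning.callbacks import ModelCheckpoint


    class LitModel(LightningModule):
        def validation_step(self, batch, batch_idx):
            # 1-2. Calculate the quantity to monitor and log it under a key.
            loss = self.compute_loss(batch)  # hypothetical helper
            self.log("val_loss", loss)


    # 3. Initialize ModelCheckpoint, setting `monitor` to the logged key.
    checkpoint_callback = ModelCheckpoint(monitor="val_loss")

    # 4. Pass the callback to the Trainer via the `callbacks` flag.
    trainer = Trainer(callbacks=[checkpoint_callback])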