1 parent 91f1387 commit 01b7293
pytorch_lightning/accelerators/accelerator.py
@@ -174,8 +174,6 @@ def batch_to_device(
             dataloader_idx: The index of the dataloader to which the batch belongs.
         """
         model = self.lightning_module
-
-        # TODO: Add support to allow batch transfer to device in Lightning for DP mode.
         if model is not None and not isinstance(self.training_type_plugin, DataParallelPlugin):
             # no need to transfer batch to device in DP mode
             return model._apply_batch_transfer_handler(batch, device, dataloader_idx)
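
The diff keeps the model's batch-transfer handler for every plugin except DataParallel, where the batch is left alone because torch.nn.DataParallel scatters inputs across devices itself. Below is a minimal, self-contained sketch of that control flow; it is not Lightning's actual implementation, and the Accelerator, DataParallelPlugin, and Model classes here are simplified stand-ins for illustration only.

import torch


class DataParallelPlugin:  # hypothetical stand-in for Lightning's DP plugin
    pass


class Model:  # stand-in for a LightningModule
    def _apply_batch_transfer_handler(self, batch, device, dataloader_idx):
        # Move every tensor in the batch to the target device.
        return {k: v.to(device) for k, v in batch.items()}


class Accelerator:
    def __init__(self, model, plugin):
        self.lightning_module = model
        self.training_type_plugin = plugin

    def batch_to_device(self, batch, device, dataloader_idx=0):
        model = self.lightning_module
        if model is not None and not isinstance(self.training_type_plugin, DataParallelPlugin):
            # no need to transfer batch to device in DP mode
            return model._apply_batch_transfer_handler(batch, device, dataloader_idx)
        # DP mode: return the batch untouched; DataParallel handles scattering.
        return batch


batch = {"x": torch.zeros(2, 3)}
acc = Accelerator(Model(), plugin=None)
print(acc.batch_to_device(batch, torch.device("cpu"))["x"].device)  # cpu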