Closed
Labels
docs: Documentation related
Description
📚 Improve documentation
It would be great if you could extend this page by also explaining how to define the training_step and validation_step when using multiple dataloaders.
Currently it's not clear to me how to do this. For example, when train_dataloader is defined as follows:
class LitModel(LightningModule):
    def train_dataloader(self):
        loader_a = torch.utils.data.DataLoader(range(8), batch_size=4)
        loader_b = torch.utils.data.DataLoader(range(16), batch_size=4)
        loader_c = torch.utils.data.DataLoader(range(32), batch_size=4)
        loader_d = torch.utils.data.DataLoader(range(64), batch_size=4)
        # Pass loaders as a nested dict. This will create batches like this:
        # {'loaders_a_b': {'a': batch from loader a, 'b': batch from loader b},
        #  'loaders_c_d': {'c': batch from loader c, 'd': batch from loader d}}
        loaders = {'loaders_a_b': {'a': loader_a, 'b': loader_b},
                   'loaders_c_d': {'c': loader_c, 'd': loader_d}}
        return loaders
Given this, does the signature of training_step still look like training_step(self, batch, batch_idx)? And how can I access the individual batches?
Thanks!
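For illustration, here is a minimal pure-PyTorch sketch of the batch structure implied by a dict of dataloaders: each batch handed to training_step would mirror the dict's structure, with one sub-batch per loader under the corresponding key. This sketch uses zip, which truncates at the shortest loader; Lightning's actual combining behavior (e.g. cycling shorter loaders) may differ, so treat this only as an illustration of the batch layout, not of Lightning's iteration semantics.

```python
import torch
from torch.utils.data import DataLoader

loader_a = DataLoader(range(8), batch_size=4)
loader_b = DataLoader(range(16), batch_size=4)

# Emulate a dict of loaders: each combined batch is a dict whose keys
# match the loader dict, and whose values are the per-loader batches.
for batch_idx, (batch_a, batch_b) in enumerate(zip(loader_a, loader_b)):
    batch = {'a': batch_a, 'b': batch_b}
    # Inside training_step(self, batch, batch_idx) you would unpack:
    x_a = batch['a']  # tensor batch from loader_a
    x_b = batch['b']  # tensor batch from loader_b
```

So with a dict of loaders, batch itself becomes a dict, and the sub-batches are accessed by key inside training_step.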