
auto_scale_batch_size not working with datamodule #3233

@carmocca


🐛 Bug

The Trainer expects the LightningModule to define self.batch_size (see scale_batch_size() in training_tricks.py). However, when using the new LightningDataModule, the datamodule is the class that should define self.batch_size, and the attribute lookup never considers it.
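For context, a minimal sketch of the setup in question (the class name and tensor shapes are illustrative, not from the original report). The batch_size attribute lives on the datamodule, so the lookup in scale_batch_size() never sees it:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class MyDataModule(pl.LightningDataModule):
    def __init__(self, batch_size: int = 32):
        super().__init__()
        # batch_size is defined here, on the datamodule --
        # scale_batch_size() only looks at the model and model.hparams
        self.batch_size = batch_size

    def train_dataloader(self):
        dataset = TensorDataset(torch.randn(1000, 32), torch.randint(0, 2, (1000,)))
        return DataLoader(dataset, batch_size=self.batch_size)
```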

To Reproduce

```python
assert hasattr(lightning_data_module, "batch_size")
trainer = Trainer(auto_scale_batch_size=True)
trainer.fit(lightning_module, datamodule=lightning_data_module)
```

This raises:

```
pytorch_lightning.utilities.exceptions.MisconfigurationException: Field batch_size not found in both `model` and `model.hparams`
```
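As a stopgap, mirroring the attribute onto the LightningModule avoids the exception. This is my own workaround sketch, not an officially supported pattern:

```python
# Workaround sketch (an assumption, not an official API): give the model
# a batch_size attribute so the tuner's attribute lookup succeeds.
lightning_module.batch_size = lightning_data_module.batch_size

trainer = Trainer(auto_scale_batch_size=True)
trainer.fit(lightning_module, datamodule=lightning_data_module)

# Caveat: the tuner scales model.batch_size, so the datamodule must read
# the scaled value back for the new batch size to actually take effect.
```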

Expected behavior

auto_scale_batch_size should also work when batch_size is defined on a LightningDataModule.
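One possible direction for a fix, sketched as a hypothetical helper (_find_batch_size_owner is not an existing PyTorch Lightning function): extend the attribute lookup so it falls back to the attached datamodule before raising.

```python
from pytorch_lightning.utilities.exceptions import MisconfigurationException

def _find_batch_size_owner(trainer, model):
    # Hypothetical lookup order: model, model.hparams, then the datamodule.
    if hasattr(model, "batch_size"):
        return model
    if hasattr(model, "hparams") and hasattr(model.hparams, "batch_size"):
        return model.hparams
    datamodule = getattr(trainer, "datamodule", None)
    if datamodule is not None and hasattr(datamodule, "batch_size"):
        return datamodule
    raise MisconfigurationException(
        "Field batch_size not found in `model`, `model.hparams`, or the datamodule"
    )
```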

Environment

* Packages:
	- numpy:             1.18.5
	- pyTorch_debug:     False
	- pyTorch_version:   1.6.0
	- pytorch-lightning: 0.9.1rc1
	- tensorboard:       2.2.0
	- tqdm:              4.48.2


Labels

bug (Something isn't working) · help wanted (Open to be worked on)
