Since version 1.3.1, pytorch_lightning stops calling datamodule.train_dataloader() when tuning batch_size with trainer.tune()
I don't know whether this has been reported already, but since version 1.3.1 pytorch_lightning no longer calls datamodule.train_dataloader() when tuning the batch_size with trainer.tune(), so the batch size effectively stays fixed at its initial value of 2. See the attached files, in particular the print statements in
```python
def train_dataloader(self):
    print("THIS SHOULD BE CALLED!!!!!", self.batch_size)
```
and
```python
def training_step(self, train_batch, batch_idx):
    inputv, target = train_batch
    output = self.forward(inputv)
    print("actual batch_size", len(inputv))
```
when tune is called.
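For reference, here is a minimal self-contained sketch of the setup (class and variable names are hypothetical, not copied from the attached files) that shows the behavior I observe on 1.3.1 when the batch-size finder is enabled via `auto_scale_batch_size`:

```python
# Minimal reproduction sketch, assuming pytorch_lightning ~1.3 and torch are
# installed. RandomDataModule / LinearModel are illustrative names only.
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class RandomDataModule(pl.LightningDataModule):
    def __init__(self, batch_size=2):
        super().__init__()
        self.batch_size = batch_size  # attribute the batch-size finder should scale
        self.dataset = TensorDataset(torch.randn(512, 10), torch.randn(512, 1))

    def train_dataloader(self):
        # Expected to be re-invoked after each batch_size update during tuning
        print("THIS SHOULD BE CALLED!!!!!", self.batch_size)
        return DataLoader(self.dataset, batch_size=self.batch_size)


class LinearModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(10, 1)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, train_batch, batch_idx):
        inputv, target = train_batch
        output = self.forward(inputv)
        print("actual batch_size", len(inputv))  # stays at 2 on >= 1.3.1
        return torch.nn.functional.mse_loss(output, target)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())


dm = RandomDataModule()
model = LinearModel()
trainer = pl.Trainer(auto_scale_batch_size=True, max_epochs=1)
trainer.tune(model, datamodule=dm)
```

On 1.3.0 the first print fires repeatedly with growing values of `self.batch_size`; on 1.3.1 it fires only once and `training_step` keeps reporting batches of length 2.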
code_examples.zip