
stopped calling datamodule.train_dataloader() when tuning the batch_size using pl.trainer.tune() #7562

@randommm

Description



I don't know if this has been reported already, but since version 1.3.1, pytorch_lightning has stopped calling datamodule.train_dataloader() when tuning the batch_size with pl.trainer.tune(), so the batch_size effectively stays fixed at 2. See the attached files, in particular the print statements in

    def train_dataloader(self):
        print("THIS SHOULD BE CALLED!!!!!", self.batch_size)

and

    def training_step(self, train_batch, batch_idx):
        inputv, target = train_batch
        output = self.forward(inputv)
        print("actual batch_size", len(inputv))

when tune is called.

code_examples.zip
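For anyone triaging without downloading the zip, here is a minimal self-contained sketch of the setup; the DemoModel/DemoDataModule names and the random TensorDataset are illustrative stand-ins for the attached files, which follow the usual pattern of a tunable batch_size attribute on the datamodule:

    import torch
    import torch.nn.functional as F
    from torch.utils.data import DataLoader, TensorDataset
    import pytorch_lightning as pl

    class DemoModel(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(10, 1)

        def forward(self, x):
            return self.layer(x)

        def training_step(self, train_batch, batch_idx):
            inputv, target = train_batch
            output = self.forward(inputv)
            print("actual batch_size", len(inputv))
            return F.mse_loss(output, target)

        def configure_optimizers(self):
            return torch.optim.Adam(self.parameters())

    class DemoDataModule(pl.LightningDataModule):
        def __init__(self, batch_size=2):
            super().__init__()
            self.batch_size = batch_size  # the attribute the tuner is expected to scale
            self.dataset = TensorDataset(torch.randn(512, 10), torch.randn(512, 1))

        def train_dataloader(self):
            # expected to be re-invoked after every batch_size update during tuning
            print("THIS SHOULD BE CALLED!!!!!", self.batch_size)
            return DataLoader(self.dataset, batch_size=self.batch_size)

    trainer = pl.Trainer(auto_scale_batch_size=True, max_epochs=1)
    trainer.tune(DemoModel(), datamodule=DemoDataModule())

On earlier versions, the first print fires again after each batch_size update; on 1.3.1 it fires only once, and the training_step print keeps reporting a batch size of 2.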


    Labels

    bug (Something isn't working) · data handling (Generic data-related topic) · help wanted (Open to be worked on) · priority: 0 (High priority task)
