
Trainer.scale_batch_size requires model.batch_size instead of model.hparams.batch_size #2484

Description

@wietsedv

🐛 Bug

Trainer.scale_batch_size only works if the model has a batch_size attribute; it does not work with model.hparams.batch_size, even though all of the documentation points to the latter.

To Reproduce

All of my hyperparameters are available under model.hparams, as suggested in the documentation (hyperparameters, option 3).
This means that my batch_size is available as model.hparams.batch_size.

This should be fully compatible with the documented example code of Trainer.scale_batch_size() since that code also uses model.hparams.batch_size instead of model.batch_size.
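For reference, the pattern that documentation describes looks roughly like the following (an illustrative sketch, not the exact docs snippet; the class name and self.train_dataset are placeholders):

import pytorch_lightning as pl
from torch.utils.data import DataLoader

class DocsStyleModel(pl.LightningModule):
    def __init__(self, hparams):
        super().__init__()
        self.hparams = hparams

    def train_dataloader(self):
        # the batch size is read from self.hparams, not from a model attribute
        return DataLoader(self.train_dataset, batch_size=self.hparams.batch_size)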

However, when I pass my model to Trainer.scale_batch_size, I get the following error:

pytorch_lightning.utilities.exceptions.MisconfigurationException: Field batch_size not found in `model.hparams`

Example code

from argparse import Namespace

import pytorch_lightning as pl
from pytorch_lightning import Trainer

class LitModel(pl.LightningModule):
    def __init__(self, hparams):
        super().__init__()
        # batch_size lives only under hparams, not as a model attribute
        self.hparams = hparams

args = Namespace(batch_size=32)  # defined here only so the snippet runs
model = LitModel(args)
trainer = Trainer()
trainer.scale_batch_size(model)  # raises the MisconfigurationException above
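A workaround that makes the scaler run, assuming the current lookup only checks a top-level batch_size attribute, is to mirror the value onto the model:

import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    def __init__(self, hparams):
        super().__init__()
        self.hparams = hparams
        # mirror the value so Trainer.scale_batch_size can find it
        self.batch_size = hparams.batch_size

If the scaler then writes the found value back to model.batch_size, any dataloader would also have to read the batch size from that attribute rather than from hparams.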

Expected behavior

Either Trainer.scale_batch_size should work with model.hparams or the error message, linked documentation examples and docstrings should all change (i.e. here, here and here).

(I would prefer the first option: I think that it should work with both model.batch_size and model.hparams.batch_size.)
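A minimal sketch of the lookup I have in mind (a hypothetical helper, not existing Lightning code) would check the model first and then fall back to hparams:

from pytorch_lightning.utilities.exceptions import MisconfigurationException

def find_batch_size(model, field="batch_size"):
    # hypothetical lookup order: model attribute first, then model.hparams
    if hasattr(model, field):
        return getattr(model, field)
    if hasattr(model, "hparams") and hasattr(model.hparams, field):
        return getattr(model.hparams, field)
    raise MisconfigurationException(
        f"Field `{field}` not found in `model` or `model.hparams`"
    )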

Environment

  • pytorch-lightning 0.8.4

Labels: bug (Something isn't working), good first issue (Good for newcomers), help wanted (Open to be worked on)
