Batch size finder is not working if batch_size is specified in LightningDataModule #4226

@maxjeblick

Description

🐛 Bug

The batch size finder does not work if the batch size is specified only in the LightningDataModule (which is the natural place to define it).

An instance of a LightningModule always has an hparams attribute; the batch size finder raises a MisconfigurationException if batch_size is not found there, and it never consults the datamodule.
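A minimal sketch of why the lookup fails (the helper and class names below are hypothetical stand-ins, not Lightning's actual internals): the finder searches only the model and its hparams for batch_size, so an attribute living on the datamodule is never seen.

```python
class HParams(dict):
    """Stand-in for the hparams container every LightningModule carries."""

class Model:
    def __init__(self):
        self.hparams = HParams()  # batch_size is NOT defined here

class DataModule:
    def __init__(self):
        self.batch_size = 32  # the natural place to define it

def find_batch_size_attr(obj, attr="batch_size"):
    """Sketch of the finder's lookup: it only inspects the model and
    its hparams, so batch_size defined on the datamodule is missed."""
    if hasattr(obj, attr):
        return getattr(obj, attr)
    if attr in getattr(obj, "hparams", {}):
        return obj.hparams[attr]
    # Lightning raises MisconfigurationException at this point
    raise RuntimeError(f"field `{attr}` not found in model or model.hparams")

model, dm = Model(), DataModule()
# dm.batch_size exists, but the lookup is only ever given the model:
try:
    find_batch_size_attr(model)
except RuntimeError as err:
    print(err)
```

Passing the datamodule itself to the same lookup would succeed, which is why the expected fix is for the finder to also search the attached datamodule.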

Please reproduce using the BoringModel and post here

https://colab.research.google.com/drive/1gruW3UwitVijzkhcUYIzHlIpoIB1shzt?usp=sharing

Expected behavior

Batch size finder works.

Environment

  • CUDA:
    • GPU:
      • Tesla T4
    • available: True
    • version: 10.1
  • Packages:
    • numpy: 1.18.5
    • pyTorch_debug: False
    • pyTorch_version: 1.6.0+cu101
    • pytorch-lightning: 0.10.0
    • tqdm: 4.41.1
  • System:
    • OS: Linux
    • architecture:
      • 64bit
    • processor: x86_64
    • python: 3.6.9
    • version: #1 SMP Thu Jul 23 08:00:38 PDT 2020

Metadata

Labels

bug (Something isn't working), help wanted (Open to be worked on), tuner
