Closed
Labels
bug (Something isn't working), help wanted (Open to be worked on), tuner
Description
🐛 Bug
The batch size finder won't work if the batch size is specified only in the LightningDataModule (where it is natural to define it).
An instance of a LightningModule always has the attribute hparams, and the batch size finder raises a MisconfigurationException if batch_size isn't found there.
Please reproduce using the BoringModel and post here
https://colab.research.google.com/drive/1gruW3UwitVijzkhcUYIzHlIpoIB1shzt?usp=sharing
Expected behavior
The batch size finder works even when batch_size is defined only on the LightningDataModule.
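A hypothetical fix would extend the lookup to the datamodule. The sketch below is an assumption about the desired order (model, then model.hparams, then datamodule), not Lightning's actual code:

```python
class Model:
    """Stand-in LightningModule with an empty hparams container."""
    def __init__(self):
        self.hparams = {}

class DataModule:
    """Stand-in LightningDataModule that owns batch_size."""
    def __init__(self):
        self.batch_size = 32

def find_batch_size(model, datamodule=None):
    # Hypothetical lookup order: model attribute, model.hparams, datamodule.
    if hasattr(model, "batch_size"):
        return model.batch_size
    if "batch_size" in model.hparams:
        return model.hparams["batch_size"]
    if datamodule is not None and hasattr(datamodule, "batch_size"):
        return datamodule.batch_size
    raise RuntimeError("Field batch_size not found")

print(find_batch_size(Model(), DataModule()))  # → 32
```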
Environment
- CUDA:
- GPU:
- Tesla T4
- available: True
- version: 10.1
- Packages:
- numpy: 1.18.5
- pyTorch_debug: False
- pyTorch_version: 1.6.0+cu101
- pytorch-lightning: 0.10.0
- tqdm: 4.41.1
- System:
- OS: Linux
- architecture:
- 64bit
- processor: x86_64
- python: 3.6.9
- version: #1 SMP Thu Jul 23 08:00:38 PDT 2020