🐛 Bug
from torch.utils.data import DataLoader

test_iter = RandomIterableDataset(32, 10)  # an IterableDataset (defined in the linked notebook)
test_iter = DataLoader(test_iter, batch_size=32)
# Using an iterable dataset, the trainer uses the incorrect batch size
trainer.predict(dataloaders=test_iter)
will result in the trainer using a batch size of 1 instead of 32. This issue only appears for IterableDataset (not Dataset).
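For reference, `RandomIterableDataset` lives in the linked notebook; a minimal sketch of it (constructor signature assumed to be `(size, count)`) shows that the DataLoader itself honors `batch_size` for an IterableDataset — the shrink to 1 happens only once the Trainer gets involved:

```python
import torch
from torch.utils.data import IterableDataset, DataLoader


class RandomIterableDataset(IterableDataset):
    """Assumed shape of the notebook helper: yields `count`
    random vectors of dimension `size`."""

    def __init__(self, size: int, count: int):
        self.size = size
        self.count = count

    def __iter__(self):
        for _ in range(self.count):
            yield torch.randn(self.size)


test_iter = RandomIterableDataset(32, 10)
test_iter = DataLoader(test_iter, batch_size=32)

# With 10 samples and batch_size=32, the default collate stacks
# everything into a single batch of shape (10, 32).
batch = next(iter(test_iter))
print(tuple(batch.shape))
```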
To Reproduce
Notebook here
Expected behavior
The batch size specified in the DataLoader should be used.
Environment
* CUDA:
- GPU:
- Tesla T4
- available: True
- version: 10.1
* Packages:
- numpy: 1.18.5
- pyTorch_debug: False
- pyTorch_version: 1.6.0+cu101
- pytorch-lightning: 0.10.0
- tqdm: 4.41.1
* System:
- OS: Linux
- architecture:
- 64bit
- processor: x86_64
- python: 3.6.9
- version: #1 SMP Thu Jul 23 08:00:38 PDT 2020
Additional context
This used to work before 1.5.1 (not sure exactly which version broke it though).
cc @rohitgr7