Commit ccda7d4

Add __len__ method to IndexBatchSamplerWrapper (#7681)

kaushikb11 authored and pre-commit-ci[bot] committed
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
1 parent e0850b3 commit ccda7d4

File tree

3 files changed: +86 -1 lines changed

CHANGELOG.md

Lines changed: 70 additions & 1 deletion

@@ -4,8 +4,50 @@ All notable changes to this project will be documented in this file.
 
 The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
+## [1.4.0] - 2021-MM-DD
+
+### Added
+
+- Added support to `LightningModule.to_torchscript` for saving to custom filesystems with fsspec ([#7617](https://github.com/PyTorchLightning/pytorch-lightning/pull/7617))
+
+- Added `KubeflowEnvironment` for use with the `PyTorchJob` operator in Kubeflow
+
+- Added LightningCLI support for config files on object stores ([#7521](https://github.com/PyTorchLightning/pytorch-lightning/pull/7521))
+
+- Added `ModelPruning(prune_on_train_epoch_end=True|False)` to choose when to apply pruning ([#7704](https://github.com/PyTorchLightning/pytorch-lightning/pull/7704))
+
+- Added support for checkpointing based on a provided time interval during training ([#7515](https://github.com/PyTorchLightning/pytorch-lightning/pull/7515))
+
+- Added dataclasses for progress tracking ([#6603](https://github.com/PyTorchLightning/pytorch-lightning/pull/6603), [#7574](https://github.com/PyTorchLightning/pytorch-lightning/pull/7574))
+
+- Added argument `trainer.predict(ckpt_path)` ([#7430](https://github.com/PyTorchLightning/pytorch-lightning/pull/7430))
+
+- Added `clip_grad_by_value` support for TPUs ([#7025](https://github.com/PyTorchLightning/pytorch-lightning/pull/7025))
+
+- Added `sub_dir` parameter to `TensorBoardLogger` ([#6195](https://github.com/PyTorchLightning/pytorch-lightning/pull/6195))
+
+- Added correct `dataloader_idx` to batch transfer hooks ([#6241](https://github.com/PyTorchLightning/pytorch-lightning/pull/6241))
+
+- Added `ddp_fully_sharded` support ([#7487](https://github.com/PyTorchLightning/pytorch-lightning/pull/7487))
+
+- Added `__len__` to `IndexBatchSamplerWrapper` ([#7681](https://github.com/PyTorchLightning/pytorch-lightning/pull/7681))
+
+- Added `should_rank_save_checkpoint` property to Training Plugins ([#7684](https://github.com/PyTorchLightning/pytorch-lightning/pull/7684))
 
-## [1.3.3] - 2021-05-27
 
 ### Changed
 
@@ -21,6 +63,33 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Fixed training loop total batch counter when accumulate grad batches was enabled ([#7692](https://github.com/PyTorchLightning/pytorch-lightning/pull/7692))
 
 
+## [1.3.4] - 2021-06-01
+
+### Changed
+
+- Update pre-commit and add new hooks ([#7781](https://github.com/PyTorchLightning/pytorch-lightning/pull/7781))
+
+### Fixed
+
+## [1.3.3] - 2021-05-27
+
+### Changed
+
+- Move parameter validation specific to TPU Training plugins ([#7415](https://github.com/PyTorchLightning/pytorch-lightning/pull/7415))
+- Override broadcast_object_list for torch<1.8 ([#7592](https://github.com/PyTorchLightning/pytorch-lightning/pull/7592))
+- Clear predict_progress_bar in `__getstate__` from ProgressBar ([#7608](https://github.com/PyTorchLightning/pytorch-lightning/pull/7608))
+
+### Fixed
+
+- Increment the total batch idx before the accumulation early exit ([#7692](https://github.com/PyTorchLightning/pytorch-lightning/pull/7692))
+- Fix global step update when the epoch is skipped ([#7677](https://github.com/PyTorchLightning/pytorch-lightning/pull/7677))
+- Fix progress bar print error when called before training ([#7674](https://github.com/PyTorchLightning/pytorch-lightning/pull/7674))
+- Fix dataloaders are not reset when tuning the model ([#7566](https://github.com/PyTorchLightning/pytorch-lightning/pull/7566))
+- Fix/mismatched toggle optimizer ([#7563](https://github.com/PyTorchLightning/pytorch-lightning/pull/7563))
 
 ## [1.3.2] - 2021-05-18
 
 ### Changed

pytorch_lightning/overrides/distributed.py

Lines changed: 3 additions & 0 deletions

@@ -132,6 +132,9 @@ def __iter__(self) -> Iterator[List[int]]:
             self.batch_indices = batch
             yield batch
 
+    def __len__(self) -> int:
+        return len(self._sampler)
+
     @property
     def drop_last(self) -> bool:
         return self._sampler.drop_last
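The fix in `pytorch_lightning/overrides/distributed.py` is three lines: `IndexBatchSamplerWrapper` already forwards iteration to the wrapped `BatchSampler`, and `__len__` now forwards too, so `len()` works on the wrapper. A minimal, self-contained sketch of this delegation pattern — `SimpleBatchSampler` is a hypothetical plain-Python stand-in for `torch.utils.data.BatchSampler`, used here only to keep the example runnable without torch:

```python
from typing import Iterator, List, Optional


class SimpleBatchSampler:
    """Hypothetical stand-in for torch.utils.data.BatchSampler."""

    def __init__(self, indices: List[int], batch_size: int, drop_last: bool = False) -> None:
        self.indices = indices
        self.batch_size = batch_size
        self.drop_last = drop_last

    def __iter__(self) -> Iterator[List[int]]:
        batch: List[int] = []
        for idx in self.indices:
            batch.append(idx)
            if len(batch) == self.batch_size:
                yield batch
                batch = []
        if batch and not self.drop_last:
            yield batch  # final, possibly short batch

    def __len__(self) -> int:
        if self.drop_last:
            return len(self.indices) // self.batch_size
        return (len(self.indices) + self.batch_size - 1) // self.batch_size


class IndexBatchSamplerWrapper:
    """Sketch of the wrapper: remembers the last batch of indices it yielded."""

    def __init__(self, sampler) -> None:
        self._sampler = sampler
        self.batch_indices: Optional[List[int]] = None

    def __iter__(self) -> Iterator[List[int]]:
        for batch in self._sampler:
            self.batch_indices = batch
            yield batch

    def __len__(self) -> int:
        # The line this commit adds: delegate len() to the wrapped sampler.
        return len(self._sampler)


wrapper = IndexBatchSamplerWrapper(SimpleBatchSampler(list(range(15)), 3))
print(len(wrapper))           # 5 batches of 3
print(list(wrapper)[-1])      # [12, 13, 14]
print(wrapper.batch_indices)  # [12, 13, 14]
```

Before this commit the wrapper defined `__iter__` but not `__len__`, so `len(wrapper)` raised `TypeError` even though the wrapped `BatchSampler` knew its own length.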

tests/overrides/test_distributed.py

Lines changed: 13 additions & 0 deletions

@@ -11,11 +11,14 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+from collections.abc import Iterable
+
 import pytest
 from torch.utils.data import BatchSampler, SequentialSampler
 
 from pytorch_lightning import seed_everything
 from pytorch_lightning.overrides.distributed import IndexBatchSamplerWrapper, UnrepeatedDistributedSampler
+from pytorch_lightning.utilities.data import has_len
 
 
 @pytest.mark.parametrize("shuffle", [False, True])
@@ -54,3 +57,13 @@ def test_index_batch_sampler(tmpdir):
 
     for batch in index_batch_sampler:
         assert index_batch_sampler.batch_indices == batch
+
+
+def test_index_batch_sampler_methods():
+    dataset = range(15)
+    sampler = SequentialSampler(dataset)
+    batch_sampler = BatchSampler(sampler, 3, False)
+    index_batch_sampler = IndexBatchSamplerWrapper(batch_sampler)
+
+    assert isinstance(index_batch_sampler, Iterable)
+    assert has_len(index_batch_sampler)
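The new test exercises the wrapper through two protocols: `isinstance(x, Iterable)` (satisfied by defining `__iter__`) and `has_len` (which, in spirit, checks that `len(x)` succeeds). A hedged sketch of that second check — `has_len_sketch` is an illustrative stand-in, and the real `pytorch_lightning.utilities.data.has_len` may do more (e.g. special-case zero-length loaders):

```python
from collections.abc import Iterable


def has_len_sketch(obj) -> bool:
    """Illustrative stand-in for the idea behind has_len:
    True when len(obj) is defined and returns a value."""
    try:
        len(obj)
        return True
    except TypeError:
        return False


# Any object defining __iter__ is an Iterable; only objects
# that also define __len__ pass the length check.
assert isinstance([1, 2, 3], Iterable) and has_len_sketch([1, 2, 3])
assert isinstance(iter(()), Iterable) and not has_len_sketch(iter(()))
```

This is why the commit matters: without `__len__`, the wrapper passed the `Iterable` check but failed the length check, even though its inner `BatchSampler` was sized.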
