
lr_tuner fails if called after tuning batch_size #4616

@Palzer

Description


https://github.com/PyTorchLightning/pytorch-lightning/blob/514cb22bd719e6ca056cacce730c8de875c9dbf6/pytorch_lightning/tuner/lr_finder.py#L411

The line above sets `current_step` to 2 on the very first lr_finder batch if the batch_size tuner has already run, since `trainer.global_step` is no longer zero at that point.

https://github.com/PyTorchLightning/pytorch-lightning/blob/514cb22bd719e6ca056cacce730c8de875c9dbf6/pytorch_lightning/tuner/lr_finder.py#L419-L420

As a result, `best_loss` is still zero when it is compared against the smoothed loss in the divergence check above, which makes the tuner terminate immediately and fail.

Also, it looks like this line was already supposed to be changed. A standalone sketch of the failure mode is included below.
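
To illustrate, here is a minimal standalone sketch of the bookkeeping as I understand it from the linked lines; the helper `lr_finder_steps`, the loss values, and the `early_stop_threshold`/`beta` defaults are illustrative and not taken verbatim from the library:

```python
# Sketch of the LR finder's per-batch bookkeeping (assumed, not a verbatim copy
# of lr_finder.py). Shows why a non-zero starting global_step makes the
# divergence check fire on the first batch.

def lr_finder_steps(losses, global_step_offset=0, early_stop_threshold=4.0, beta=0.98):
    best_loss = 0.0  # initialized to zero before the first batch
    avg_loss = 0.0
    for i, loss in enumerate(losses):
        current_step = global_step_offset + i + 1  # ~ trainer.global_step + 1
        avg_loss = beta * avg_loss + (1 - beta) * loss
        smoothed_loss = avg_loss / (1 - beta ** current_step)

        # Divergence check: with best_loss still 0.0 this is True on the very
        # first batch whenever current_step starts above 1.
        if current_step > 1 and smoothed_loss > early_stop_threshold * best_loss:
            return f"stopped at step {current_step} (best_loss={best_loss})"

        # best_loss is only seeded when current_step == 1, which never happens
        # if the batch_size tuner has already advanced global_step.
        if smoothed_loss < best_loss or current_step == 1:
            best_loss = smoothed_loss
    return "completed all steps"


print(lr_finder_steps([1.0, 0.9, 0.8]))                        # fresh trainer: completes
print(lr_finder_steps([1.0, 0.9, 0.8], global_step_offset=1))  # after batch_size tuning: stops at step 2
```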

Labels

bug (Something isn't working), priority: 1 (Medium priority task), tuner
