
When I use manual optimization, Lightning still checks the optimizer_idx argument #6803

@Luciennnnnnn

Description


🐛 Bug

I set self.automatic_optimization = False in __init__ as shown in the official docs, but I still get an error:

ValueError: Your LightningModule defines 2 optimizers but training_step is missing the "optimizer_idx" argument.

This is confusing, since the example has no optimizer_idx in training_step, and I don't think it is needed there; a minimal reproduction sketch is included under "To Reproduce" below.

Please reproduce using the BoringModel

To Reproduce

Use the following BoringModel and post here:
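A minimal BoringModel-style script for the setup described above (class and layer names here are illustrative, not taken from an official example): two optimizers, self.automatic_optimization = False set in __init__, and a training_step without an optimizer_idx argument. On the affected version, running it should raise the ValueError quoted above.

import torch
from torch.utils.data import DataLoader, Dataset
import pytorch_lightning as pl


class RandomDataset(Dataset):
    """Random features, just enough to drive the training loop."""

    def __init__(self, size, length):
        self.data = torch.randn(length, size)

    def __getitem__(self, index):
        return self.data[index]

    def __len__(self):
        return len(self.data)


class BoringManualOptModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        # Manual optimization, set exactly as in the issue description.
        self.automatic_optimization = False
        self.layer_a = torch.nn.Linear(32, 2)
        self.layer_b = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        # No optimizer_idx argument: with manual optimization this should be allowed.
        opt_a, opt_b = self.optimizers()

        loss_a = self.layer_a(batch).sum()
        opt_a.zero_grad()
        self.manual_backward(loss_a)
        opt_a.step()

        loss_b = self.layer_b(batch).sum()
        opt_b.zero_grad()
        self.manual_backward(loss_b)
        opt_b.step()

    def train_dataloader(self):
        return DataLoader(RandomDataset(32, 64), batch_size=2)

    def configure_optimizers(self):
        # Two optimizers -> this is what triggers the optimizer_idx check.
        opt_a = torch.optim.SGD(self.layer_a.parameters(), lr=0.1)
        opt_b = torch.optim.SGD(self.layer_b.parameters(), lr=0.1)
        return opt_a, opt_b


if __name__ == "__main__":
    trainer = pl.Trainer(max_epochs=1, limit_train_batches=2)
    trainer.fit(BoringManualOptModel())

With manual optimization, training_step drives both optimizers itself via self.optimizers(), self.manual_backward(), and opt.step(), so an optimizer_idx argument serves no purpose here.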

Expected behavior

When self.automatic_optimization is False, fit should run without requiring an optimizer_idx argument in training_step, even when multiple optimizers are configured.

Environment

Note: Bugs with code are solved faster! Colab Notebook should be made public!

You can get the script and run it with:

wget https://raw.githubusercontent.com/PyTorchLightning/pytorch-lightning/master/tests/collect_env_details.py
# For security purposes, please check the contents of collect_env_details.py before running it.
python collect_env_details.py
  • PyTorch Version (e.g., 1.0):
  • OS (e.g., Linux):
  • How you installed PyTorch (conda, pip, source):
  • Build command you used (if compiling from source):
  • Python version:
  • CUDA/cuDNN version:
  • GPU models and configuration:
  • Any other relevant information:

Additional context

Metadata


    Labels

    bug (Something isn't working), help wanted (Open to be worked on)
