🐛 Bug
I got the error `RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn` when training with 2 optimizers (for a GAN model) while also setting the `accumulate_grad_batches` option to a value greater than 1 in the trainer. The loss returned from `LightningModule.training_step` for the second optimizer has `requires_grad` set to `False`, which results in the error.
A similar bug was fixed in #5574, but that fix does not handle the case where `accumulate_grad_batches` is greater than 1.
To Reproduce
Use the following BoringModel:
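(The BoringModel itself is not shown here. As a minimal illustration of the failure mechanism only, not of the Lightning internals involved, the sketch below shows why a loss with `requires_grad=False` produces exactly this `RuntimeError` when `backward()` is called on it:)

```python
import torch

# A detached scalar, as the second optimizer's loss ends up being
# when accumulate_grad_batches > 1: requires_grad is False.
loss = torch.tensor(1.0)

message = ""
try:
    # Calling backward() on a tensor without a grad_fn raises the
    # RuntimeError reported above.
    loss.backward()
except RuntimeError as err:
    message = str(err)

print(message)
```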
Environment
PyTorch Lightning Version: 1.3.7