Please give the `trainer` argument of the DDP accelerator a default of `None` (`trainer=None`) as the first parameter; otherwise the accelerator cannot be constructed up front and passed to the `Trainer` instantiation.

https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/accelerators/ddp_accelerator.py#L49

```python
class DDPAccelerator(Accelerator):

    def __init__(self, trainer, cluster_environment=None, ddp_plugin=None):
```
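A minimal sketch of what the requested default would enable, assuming `Trainer(accelerator=...)` accepts a pre-built `Accelerator` instance and attaches itself to it (the subclass below is only an illustrative workaround that adds the `trainer=None` default on top of the current signature; `gpus=2` is an arbitrary example setting):

```python
import pytorch_lightning as pl
from pytorch_lightning.accelerators.ddp_accelerator import DDPAccelerator


class DefaultableDDPAccelerator(DDPAccelerator):
    """Illustrative workaround: give `trainer` a None default so the
    accelerator can be built before the Trainer exists."""

    def __init__(self, trainer=None, cluster_environment=None, ddp_plugin=None):
        super().__init__(trainer, cluster_environment=cluster_environment, ddp_plugin=ddp_plugin)


# Intended usage once `trainer` defaults to None: construct the accelerator
# first, then hand it to the Trainer, which sets `accelerator.trainer` itself.
accelerator = DefaultableDDPAccelerator()
trainer = pl.Trainer(gpus=2, accelerator=accelerator)
```

With the default added upstream, the subclass would be unnecessary and `DDPAccelerator()` could be passed to `Trainer` directly.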