Missing details in the document for the random seed  #3460

@DKandrew

Description

📚 Documentation

In the documentation, Lightning thoughtfully reminds readers to set the random seed when using distributed data parallel. However, it does not mention where we should set it. In my experience with PyTorch, the seed can be set after calling torch.distributed.init_process_group. In Lightning, this call is handled by the trainer, so where should we set the random seed? One thought is to set the seed in the __init__() of our LightningModule; are there any other suggestions? What would be the best practice for Lightning?
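For what it's worth, the common pattern is to call seed_everything() at the top of the launch script, before the Trainer is instantiated, so every process spawned for DDP inherits the same seed. Below is a minimal, stdlib-only sketch of the idea; seed_everything_sketch is a hypothetical stand-in for Lightning's helper (a real script would additionally seed numpy and torch):

```python
import os
import random


def seed_everything_sketch(seed: int) -> int:
    """Hypothetical sketch of what a seed_everything-style helper does:
    seed every RNG the run might touch, as early as possible."""
    os.environ["PYTHONHASHSEED"] = str(seed)
    random.seed(seed)
    # In a real training script you would also seed the other RNGs here,
    # e.g. numpy.random.seed(seed) and torch.manual_seed(seed).
    return seed


# Call it at module scope, before constructing the Trainer (and hence
# before any distributed initialization happens).
seed_everything_sketch(42)
first_run = [random.random() for _ in range(3)]

# Re-seeding with the same value reproduces the same draws.
seed_everything_sketch(42)
second_run = [random.random() for _ in range(3)]
assert first_run == second_run
```

The point of seeding before the Trainer rather than inside the LightningModule is that anything executed during process setup (data splitting, weight initialization, worker spawning) already sees a deterministic RNG state.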

Also, I believe the seed_everything() function is called in the wrong place in Lightning's official example.

Labels: docs (Documentation related)