
Deprecate and redefine add_to_queue/get_from_queue in spawn plugins #10865

@awaelchli

Description


🚀 Feature

Motivation

In #10059 we will completely rework and simplify the mechanism for transferring results back to the main process. A core change there is removing the special self.mp_queue attribute from the plugin. The only blocker is that two hooks take such a queue as input. Handing the queue to the user and letting them add elements to it is a confusing concept, when all we really care about is returning and retrieving the data.

Pitch

Change

def add_to_queue(self, trainer: "pl.Trainer", queue: torch.multiprocessing.SimpleQueue) -> None:
    # called on worker process 0
    ...

def get_from_queue(self, trainer: "pl.Trainer", queue: torch.multiprocessing.SimpleQueue) -> None:
    # called on main process
    ...

to

def get_extra_results(self, trainer: "pl.Trainer") -> Any:
    # called on worker process 0
    ...

def update_main_process_results(self, trainer: "pl.Trainer", results: Any) -> None:
   # called on main process
   ...

I'm open to better names, but the focus here is to return the objects directly instead of manipulating the queue.
Note: These hooks are already under a deprecation cycle.
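To make the proposed flow concrete, here is a minimal sketch of how the two new hooks would fit together. The class and the `transfer_results` helper are illustrative stand-ins; in a real spawn plugin the returned object would cross the process boundary internally (e.g. via a queue the plugin owns), which is exactly the detail the user no longer has to see.

```python
from typing import Any


class MyLightningModule:
    """Illustrative stand-in for a LightningModule with the proposed hooks."""

    def get_extra_results(self, trainer: Any) -> Any:
        # called on worker process 0: simply return the data
        return {"my_metric": 0.95}

    def update_main_process_results(self, trainer: Any, results: Any) -> None:
        # called on main process: consume the returned data
        self.my_metric = results["my_metric"]


def transfer_results(model: MyLightningModule, trainer: Any) -> None:
    # Hypothetical plugin-internal plumbing: in a real spawn plugin the
    # return value would travel through a queue between processes; here
    # both calls run in-process for illustration.
    results = model.get_extra_results(trainer)
    model.update_main_process_results(trainer, results)


model = MyLightningModule()
transfer_results(model, trainer=None)
print(model.my_metric)  # 0.95
```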

  • Additionally, add better documentation, ideally an example using Optuna with best metrics.
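As a sketch of that use case: a module could track its best metric during training on the workers and hand it back through the proposed hooks, so an Optuna objective on the main process can read it after fitting. The class, the `best_accuracy` attribute, and `objective_value` are all hypothetical names for illustration.

```python
from typing import Any


class BoringModel:
    """Illustrative module that reports its best metric back to the main process."""

    def __init__(self) -> None:
        self.best_accuracy = 0.0

    def get_extra_results(self, trainer: Any) -> Any:
        # worker process 0 returns the value it tracked during training
        return self.best_accuracy

    def update_main_process_results(self, trainer: Any, results: Any) -> None:
        # main process restores the value onto its copy of the module
        self.best_accuracy = results


def objective_value(model: BoringModel) -> float:
    # what an Optuna objective would return after trainer.fit(model)
    return model.best_accuracy
```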

Alternatives

In #10059, the temporary proposal is to mimic the queue with a list until this issue is approved.


If you enjoy Lightning, check out our other projects! ⚡

  • Metrics: Machine learning metrics for distributed, scalable PyTorch applications.

  • Lite: Enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.

  • Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.

  • Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.

  • Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.

cc @Borda @tchaton @justusschock @kaushikb11 @awaelchli

Metadata


Labels

deprecation (Includes a deprecation) · feature (Is an improvement or enhancement) · strategy: ddp (DistributedDataParallel)
