
Length of loggers #17280

@bipin-lekhak


Description & Motivation

The feature to use multiple loggers is really helpful, but I run into issues when I try to write code so that any logger may be turned off at any time.

For example, I am writing a model that can log to both MLflow and TensorBoard. Between Trainer(logger=[ml_flow, tensorboard_logger]) and Trainer(logger=[tensorboard_logger]), I have to write a lot of custom code to handle the difference. Upstream bookkeeping of how many loggers were passed, of which types, and in which order is messy. For example, logging an image when only the TensorBoard logger is passed is simply self.logger.experiment.add_image(), but when multiple loggers are passed in an unknown order, the code becomes unnecessarily convoluted.
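To illustrate the boilerplate described above, here is a minimal sketch of the index-scanning helper users currently have to write themselves. The classes are hypothetical stand-ins (so the snippet runs without a pytorch_lightning install), and get_logger_of_type is a name invented for this example, not an existing API:

```python
# Hypothetical stand-ins for pytorch_lightning.loggers classes,
# used here only so the sketch is self-contained.
class TensorBoardLogger: ...
class MLFlowLogger: ...


def get_logger_of_type(loggers, logger_cls):
    """Return the first logger of the given type, or None.

    Today, boilerplate like this has to live in every LightningModule
    that wants to target one specific logger out of an unknown mix.
    """
    for logger in loggers:
        if isinstance(logger, logger_cls):
            return logger
    return None


loggers = [MLFlowLogger(), TensorBoardLogger()]
tb = get_logger_of_type(loggers, TensorBoardLogger)
if tb is not None:
    # In a real LightningModule this would be something like:
    # tb.experiment.add_image("sample", image_tensor)
    pass
```

The point is that this scan has to be repeated for every logger type and every call site.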

Pitch

Handle multiple loggers in the model easily. We should be able to check whether a logger of a given type is present and, if so, retrieve it, with minimal friction (no scanning all indexes). One solution may be to make self.loggers a dictionary rather than a list?
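A rough sketch of the dictionary idea, assuming loggers were keyed by class name (the classes below are again stand-ins, and loggers_by_type is a hypothetical helper, not an existing Lightning API):

```python
# Hypothetical stand-ins for the real logger classes.
class TensorBoardLogger: ...
class MLFlowLogger: ...


def loggers_by_type(loggers):
    """Key each logger by its class name.

    Note: two loggers of the same type would collide under this scheme,
    which is one design question the dictionary proposal would need to settle.
    """
    return {type(logger).__name__: logger for logger in loggers}


loggers = loggers_by_type([MLFlowLogger(), TensorBoardLogger()])
if "TensorBoardLogger" in loggers:
    tb = loggers["TensorBoardLogger"]  # direct lookup, no index scanning
```

With this shape, presence checks and retrieval are one-liners regardless of how many loggers were passed or in what order.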

Alternatives

Providing __len__ for pytorch_lightning.loggers.base.LoggerCollection would be low-hanging fruit that already simplifies a lot (though of course it does not go all the way).
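A minimal sketch of that alternative, using a stand-in class (the real LoggerCollection wraps much more, and the internal attribute name here is an assumption):

```python
class LoggerCollection:
    """Stand-in for pytorch_lightning.loggers.base.LoggerCollection,
    sketching only the proposed __len__ addition."""

    def __init__(self, logger_iterable):
        # Attribute name is assumed for this sketch.
        self._logger_iterable = list(logger_iterable)

    def __len__(self):
        # The proposed addition: expose how many loggers were passed,
        # so user code can branch on len(self.logger).
        return len(self._logger_iterable)


collection = LoggerCollection([object(), object()])
assert len(collection) == 2
```

This alone would let model code distinguish the single-logger and multi-logger cases without touching private attributes.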

Additional context

No response

cc @Borda @awaelchli @Blaizzy


Labels: feature (Is an improvement or enhancement), logger (Related to the Loggers)
