## Proposed refactoring or deprecation
Move and/or deprecate aggregation-related code to individual loggers
## Motivation
We are auditing the Lightning components and APIs to assess opportunities for improvement. This issue revisits some of the API decisions regarding aggregation in the Lightning Logger, with the goal of simplifying the interface and moving logic specific to individual loggers out of the base class:
- Aggregation is currently performed along the step dimension. Users may want more control over complex aggregations, such as aggregating metrics over a time window.
- It's not clear to the end user whether they should use `agg_and_log_metrics` or `log_metrics`. In `LoggerConnector`, when `log_metrics` is called, we call `agg_and_log_metrics`.
- Which method(s) should developers override when creating their custom logger?
- `LightningLoggerBase.__init__` accepts two arguments, `agg_key_funcs` and `agg_default_func`. They aren't being called in any sub-classed loggers within Lightning; they can be implementation details of the individual loggers instead.
- Do we know if users are adding aggregation functionality using `update_agg_funcs` directly?
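To make the step-keyed behavior concrete, here is a minimal sketch of how metrics logged for the same step get buffered and reduced with a default function before being flushed. All names here (`StepAggregator`, `_flush`) are illustrative assumptions, not Lightning's actual implementation:

```python
from collections import defaultdict
from statistics import mean

class StepAggregator:
    """Hypothetical sketch: buffer metrics per step, reduce on step change."""

    def __init__(self, agg_default_func=mean):
        self._agg_default_func = agg_default_func
        self._step = None
        self._buffer = defaultdict(list)

    def agg_and_log_metrics(self, metrics, step):
        """Buffer metrics for `step`; flush the previous step when it advances."""
        flushed = None
        if self._step is not None and step != self._step:
            flushed = self._flush()
        self._step = step
        for name, value in metrics.items():
            self._buffer[name].append(value)
        return flushed

    def _flush(self):
        # Reduce each metric's buffered values with the default function.
        aggregated = {k: self._agg_default_func(v) for k, v in self._buffer.items()}
        self._buffer.clear()
        return aggregated

agg = StepAggregator()
agg.agg_and_log_metrics({"loss": 1.0}, step=0)
agg.agg_and_log_metrics({"loss": 3.0}, step=0)
print(agg.agg_and_log_metrics({"loss": 0.5}, step=1))  # {'loss': 2.0}
```

Note that this design cannot express a time-window aggregation at all: the flush point is hard-wired to the step changing, which is exactly the flexibility concern raised above.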
## Pitch
- Simplify `LightningLoggerBase.__init__` by removing `agg_key_funcs` and `agg_default_func`
- Move aggregation-related logger code out of the base class to simplify the interface
- Provide clarity on which of `log_metrics` and `agg_and_log_metrics` the Lightning trainer calls. Proposal: the trainer should only call `log_metrics`
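The proposed shape could look like the following sketch: the base class takes no aggregation arguments, the trainer calls only `log_metrics`, and any aggregation (here, a moving average) is an implementation detail of the individual logger that wants it. The class and attribute names (`LoggerBase`, `MovingAverageLogger`, `last_logged`) are hypothetical:

```python
from abc import ABC, abstractmethod

class LoggerBase(ABC):
    """Hypothetical simplified base: no agg_key_funcs / agg_default_func."""

    @abstractmethod
    def log_metrics(self, metrics, step):
        """The only method the trainer calls; subclasses must implement it."""

class MovingAverageLogger(LoggerBase):
    """A logger that chooses to aggregate internally with a moving average."""

    def __init__(self, window=2):
        self._window = window
        self._history = {}
        self.last_logged = {}

    def log_metrics(self, metrics, step):
        for name, value in metrics.items():
            hist = self._history.setdefault(name, [])
            hist.append(value)
            del hist[:-self._window]  # keep only the last `window` values
            self.last_logged[name] = sum(hist) / len(hist)

logger = MovingAverageLogger(window=2)
logger.log_metrics({"loss": 4.0}, step=0)
logger.log_metrics({"loss": 2.0}, step=1)
print(logger.last_logged["loss"])  # 3.0
```

With this split, a logger that wants no aggregation simply writes metrics out directly, and the base-class interface stays minimal.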
## Additional context
Related issues:
If you enjoy Lightning, check out our other projects! ⚡
- Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
- Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, finetuning and solving problems with deep learning.
- Bolts: Pretrained SOTA Deep Learning models, callbacks and more for research and production with PyTorch Lightning and PyTorch.
- Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.