Bug description
There are issues when using WandbLogger from a LightningCLI configuration. First, the full configuration is not automatically forwarded to the kwargs of the WandbLogger, so the configuration is not automatically saved to the wandb run.
This can be fixed with a custom SaveConfigCallback passed to the CLI:
```python
from lightning.pytorch import LightningModule, Trainer
from lightning.pytorch.cli import SaveConfigCallback
from lightning.pytorch.loggers import WandbLogger


class CustomSaveConfigCallback(SaveConfigCallback):
    # Saves the full training configuration to the wandb run config
    def save_config(self, trainer: Trainer, pl_module: LightningModule, stage: str) -> None:
        for logger in trainer.loggers:
            if issubclass(type(logger), WandbLogger):
                logger.experiment.config.update(self.config.as_dict())
        return super().save_config(trainer, pl_module, stage)
```

However, there will still be duplicate hyperparameters on wandb: the parameters saved with save_hyperparameters are not nested under the corresponding model or data class in the config, but are placed at the root (because save_hyperparameters feeds the logger a flattened list of parameters).
save_hyperparameters should place the updated parameters in the correct config path on wandb, instead of duplicating them at the root.
Logging can be disabled by subclassing WandbLogger:

```python
from lightning.pytorch.loggers import WandbLogger


class CustomWandbLogger(WandbLogger):
    # Disable unintended hyperparameter logging (the config is already saved on init)
    def log_hyperparams(self, *args, **kwargs): ...
```

but then updated (or additional) hyperparameters added when initializing the model won't be stored.
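A rough alternative (just a sketch, and it assumes that nesting everything save_hyperparameters logs under a hard-coded "model" key is acceptable) would be to nest the parameters instead of dropping them:

```python
from argparse import Namespace
from typing import Any, Dict, Union

from lightning.pytorch.loggers import WandbLogger
from lightning.pytorch.utilities import rank_zero_only


class NestingWandbLogger(WandbLogger):
    # Sketch: nest the hyperparameters under a "model" key instead of writing
    # them to the config root, so they line up with the saved CLI configuration.
    @rank_zero_only
    def log_hyperparams(self, params: Union[Dict[str, Any], Namespace], *args: Any, **kwargs: Any) -> None:
        if isinstance(params, Namespace):
            params = vars(params)
        # allow_val_change avoids wandb warnings when the keys were already set on init
        self.experiment.config.update({"model": dict(params)}, allow_val_change=True)
```

but that hard-codes the nesting and still ignores the actual config path of each module.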
Maybe there's already a better way to fix this behavior (or is it intended)?
What version are you seeing the problem on?
v2.2
How to reproduce the bug
Config file:
```yaml
trainer:
  logger:
    class_path: WandbLogger
```

Inside a model, e.g.:
```python
import lightning as L


class ConvolutionalNetwork(L.LightningModule):
    def __init__(
        self,
        dim_in: int,
        num_internal_channels: int,
        num_layers: int,
        kernel_size: int,
        num_neurons_dense: int,
        seq_len: int,
    ):
        super().__init__()
        self.save_hyperparameters()
```
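For reference, a minimal entry point along these lines reproduces the setup (the module paths and MyDataModule below are placeholders, not the actual project layout):

```python
# train.py -- run with: python train.py fit --config config.yaml
from lightning.pytorch.cli import LightningCLI

from my_project.callbacks import CustomSaveConfigCallback  # hypothetical module path
from my_project.data import MyDataModule                   # hypothetical data module
from my_project.models import ConvolutionalNetwork         # hypothetical module path

if __name__ == "__main__":
    LightningCLI(
        ConvolutionalNetwork,
        MyDataModule,
        save_config_callback=CustomSaveConfigCallback,  # callback from the snippet above
    )
```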
cc @lantiga @morganmcg1 @borisdayma @scottire @parambharat @mauvilsa