
Loss format from .3f to .3g in the training loop #4971

@fjhheras

Description


🚀 Feature

I propose changing the default format of the loss shown during training (in the tqdm bar) from .3f to .3g.

Motivation

When using pytorch-lightning with losses that are close to zero, the tqdm information during training becomes uninformative, because the loss is always shown as 0.000. For example:

Epoch 41:  76%|██████                                   | 37/49 [00:00<00:00, 65.74it/s, loss=0.000, v_num=92]

The proposed change takes it to

Epoch 41:  76%|██████                                   | 37/49 [00:00<00:00, 65.74it/s, loss=2.2e-08, v_num=92]
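The difference is easy to reproduce with plain Python format specifiers; this is a minimal check on its own (the loss value is taken from the bar above, not from Lightning):

```python
loss = 2.2e-08

# .3f rounds to three decimal places, so a tiny loss collapses to zero
print('loss=%.3f' % loss)  # → loss=0.000

# .3g keeps three significant figures regardless of magnitude
print('loss=%.3g' % loss)  # → loss=2.2e-08
```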

This change from .3f to .3g also has an advantage when the loss is a large number:

>>> print('loss: %.3f' % 80808423243)
loss: 80808423243.000
>>> print('loss: %.3g' % 80808423243)
loss: 8.08e+10

In other situations, the output of .3f and .3g is the same:

>>> print('loss: %.3f' % .884)
loss: 0.884
>>> print('loss: %.3g' % .884)
loss: 0.884
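Put together, every numeric metric in the postfix could go through .3g. This is only a sketch: format_postfix is a hypothetical helper for illustration, not part of pytorch-lightning or tqdm:

```python
def format_postfix(metrics, spec='.3g'):
    """Render a metrics dict the way a tqdm postfix string looks."""
    return ', '.join('%s=%s' % (k, format(v, spec)) for k, v in metrics.items())

# Tiny, huge, and ordinary values all stay readable with .3g
print(format_postfix({'loss': 2.2e-08, 'v_num': 92}))  # → loss=2.2e-08, v_num=92
```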


Labels: design (includes a design discussion), feature (is an improvement or enhancement), help wanted (open to be worked on)
