Labels: design (Includes a design discussion), feature (Is an improvement or enhancement), help wanted (Open to be worked on)
Description
🚀 Feature
I propose changing the default format of the loss shown during training (in the tqdm bar) from .3f to .3g
Motivation
When using pytorch-lightning with losses that are very close to zero, the tqdm information during training becomes uninformative, because the loss is always displayed as 0.000. For example:
Epoch 41: 76%|██████ | 37/49 [00:00<00:00, 65.74it/s, loss=0.000, v_num=92]
With the proposed change, this becomes
Epoch 41: 76%|██████ | 37/49 [00:00<00:00, 65.74it/s, loss=2.2e-08, v_num=92]
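For reference, here is a minimal sketch with plain tqdm (not Lightning's actual progress-bar code) showing how a .3g-formatted postfix produces the output above; the loss value and v_num are taken from the example:

from tqdm import tqdm

loss = 2.2e-08  # a loss very close to zero, as in the example above
bar = tqdm(range(49))
for step in bar:
    # format the loss with 3 significant digits (.3g) instead of 3 decimal places (.3f)
    bar.set_postfix(loss='%.3g' % loss, v_num=92)
# postfix reads: loss=2.2e-08, v_num=92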
The change from .3f to .3g also has an advantage when the loss is a large number:
>>> print('loss: %.3f' % 80808423243)
loss: 80808423243.000
>>> print('loss: %.3g' % 80808423243)
loss: 8.08e+10
In other common situations, the output of .3f and .3g is the same:
>>> print('loss: %.3f' % .884)
loss: 0.884
>>> print('loss: %.3g' % .884)
loss: 0.884
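The three cases can be checked side by side with a short snippet (values taken from the examples above):

for value in (2.2e-08, 80808423243, 0.884):
    # compare 3 decimal places (.3f) against 3 significant digits (.3g)
    print('value=%r  .3f=%s  .3g=%s' % (value, '%.3f' % value, '%.3g' % value))
# value=2.2e-08      .3f=0.000            .3g=2.2e-08
# value=80808423243  .3f=80808423243.000  .3g=8.08e+10
# value=0.884        .3f=0.884            .3g=0.884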