clip_gradient with clip_grad_value #5460

Description

@dhkim0225

🚀 Feature

Same issue as #4927 and #5456.

The current clip_gradient uses clip_grad_norm; can we add clip_grad_value?

https://github.com/PyTorchLightning/pytorch-lightning/blob/f2e99d617f05ec65fded81ccc6d0d59807c47573/pytorch_lightning/plugins/native_amp.py#L63-L65
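
For illustration, a minimal sketch of what such an option could look like, assuming a hypothetical `algorithm` argument that dispatches between the two PyTorch utilities (the function name and structure here are assumptions, not the actual Lightning API):

```python
import torch

def clip_gradients(parameters, clip_val, algorithm="norm"):
    # Hypothetical dispatch: "norm" keeps the current behavior,
    # "value" clamps each gradient element to [-clip_val, clip_val].
    if algorithm == "norm":
        torch.nn.utils.clip_grad_norm_(parameters, clip_val)
    elif algorithm == "value":
        torch.nn.utils.clip_grad_value_(parameters, clip_val)
    else:
        raise ValueError(f"Unknown gradient clipping algorithm: {algorithm}")
```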

============================================================
@tchaton

As far as I know, there is a difference between clip_grad_value_ and clip_grad_norm_.
All of the implementations in PL currently use only clip_grad_norm_.
clip_grad_value_ does not clip using the gradient's norm; it simply clamps each gradient element to a fixed range, which is useful when training a model on noisy data.
Please let me know if you think I'm wrong.

PyTorch clip by norm: torch.nn.utils.clip_grad_norm_
PyTorch clip by value: torch.nn.utils.clip_grad_value_
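
To make the difference concrete, here is a small self-contained example (not from this thread) applying the two PyTorch utilities to identical gradients:

```python
import torch

# Two parameters with the same deliberately large gradient.
p1 = torch.nn.Parameter(torch.zeros(3))
p1.grad = torch.tensor([3.0, -4.0, 0.0])  # L2 norm = 5.0

p2 = torch.nn.Parameter(torch.zeros(3))
p2.grad = torch.tensor([3.0, -4.0, 0.0])

# Clip by norm: rescales the whole gradient so its norm is at most 1.0,
# preserving the direction: [0.6, -0.8, 0.0].
torch.nn.utils.clip_grad_norm_([p1], max_norm=1.0)
print(p1.grad)

# Clip by value: clamps each element to [-1.0, 1.0] independently,
# which changes the direction: [1.0, -1.0, 0.0].
torch.nn.utils.clip_grad_value_([p2], clip_value=1.0)
print(p2.grad)
```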

Sincerely,
Anthony Kim.


Labels: feature (Is an improvement or enhancement) · help wanted (Open to be worked on) · priority: 1 (Medium priority task)
