
Conversation

@carolineechen (Contributor) commented Jun 28, 2021

Note: merge after #1610

Remove the fused_log_softmax option and keep the behavior of the default (True) setting. The fused_log_softmax=False option was only relevant when reuse_logits_for_grads=True, and that behavior is no longer supported after #1610.
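To illustrate what "fused" log-softmax means here: with the option removed, the loss always normalizes raw joiner logits into log-probabilities internally, so callers never need to pre-apply log_softmax themselves. The sketch below is illustrative only (it is not the torchaudio implementation, and the tensor shape is a hypothetical example); it also shows why always normalizing is safe, since log_softmax is idempotent:

```python
import torch
import torch.nn.functional as F

def normalize(logits: torch.Tensor) -> torch.Tensor:
    """What the fused log-softmax step does internally: convert raw
    joiner logits to log-probabilities over the vocabulary axis.
    (Illustrative sketch, not the torchaudio implementation.)"""
    return F.log_softmax(logits, dim=-1)

# Hypothetical RNN-T joiner output: (batch, time, target, vocab)
logits = torch.randn(2, 10, 5, 7)

# Passing already-normalized log-probs through the fused path is
# harmless, because log_softmax(x) = x - logsumexp(x) and the
# logsumexp of log-probabilities is 0 -- i.e. it is idempotent.
once = normalize(logits)
twice = normalize(once)
assert torch.allclose(once, twice, atol=1e-6)
```

This idempotence is one reason the default True behavior is a safe single code path to keep.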

@carolineechen marked this pull request as draft on June 28, 2021 19:51
@carolineechen force-pushed the rnntl-remove-fused-smax branch from 3fbc013 to 77b3082 on June 28, 2021 20:50
@carolineechen force-pushed the rnntl-remove-fused-smax branch 2 times, most recently from 67d0a19 to 8944d59 on July 1, 2021 15:13
@vincentqb (Contributor) left a comment


LGTM

@carolineechen force-pushed the rnntl-remove-fused-smax branch 2 times, most recently from ec3d019 to f7ccf83 on July 9, 2021 19:32
@carolineechen marked this pull request as ready for review on August 3, 2021 18:06
@carolineechen force-pushed the rnntl-remove-fused-smax branch from f7ccf83 to b83afe6 on August 3, 2021 18:26
@carolineechen force-pushed the rnntl-remove-fused-smax branch from b83afe6 to afa3ec8 on August 3, 2021 18:41
@carolineechen merged commit d74d060 into pytorch:main on Aug 3, 2021
mthrok pushed a commit to mthrok/audio that referenced this pull request Dec 13, 2022
…f the missing end `}` (pytorch#1615)

* Tiny fix of torch_script_custom_classes custom_op namespace

* fix doc
