
@vincentqb (Contributor) commented May 31, 2021

Follow-up to #1532 (and needs to land after #1532).

The default of reuse_logits_for_grads=True can also confuse users and lead to the inputs being modified in place.

  • This is not the usual behavior of a loss function and needs to be changed.
  • When a function changes its inputs in place, it also needs to report this to autograd by marking them as dirty.

I’ll simply remove this option for now.
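The dirty-marking requirement mentioned above can be sketched with a minimal custom torch.autograd.Function. This is a hypothetical toy (InplaceDouble is not torchaudio's actual loss); it only illustrates that a forward pass which mutates its input must call ctx.mark_dirty so autograd can track the change:

```python
import torch

class InplaceDouble(torch.autograd.Function):
    """Toy function that doubles its input in place."""

    @staticmethod
    def forward(ctx, x):
        x.mul_(2)          # modifies the input tensor in place
        ctx.mark_dirty(x)  # report the in-place change to autograd
        return x

    @staticmethod
    def backward(ctx, grad_output):
        # d(2x)/dx = 2
        return grad_output * 2

# In-place ops are not allowed on leaf tensors that require grad,
# so apply the function to a non-leaf expression of the input.
a = torch.tensor([1.0, 2.0], requires_grad=True)
out = InplaceDouble.apply(a * 1.0)
out.sum().backward()
print(a.grad)  # tensor([2., 2.])
```

If mark_dirty were omitted here, autograd would have a stale view of the mutated tensor, which is exactly the silent-corruption risk the comment above is warning about.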

@carolineechen (Contributor) commented:
We should also make sure to remove this parameter from the tests and the C++ source code.

@vincentqb (Contributor, Author) commented Jun 18, 2021

Closed per the comment above. The follow-up will need to remove both reuse_logits_for_grads and fused_log_softmax.

@vincentqb vincentqb closed this Jun 18, 2021
mthrok pushed a commit to mthrok/audio that referenced this pull request Dec 13, 2022
…torch#1536)

* Update optimization_tutorial.py

* Update quickstart_tutorial.py

* Fix typo in optimization tutorial

Co-authored-by: suraj813 <[email protected]>