This repository was archived by the owner on Nov 1, 2024. It is now read-only.

Conversation

@rohan-varma (Contributor)

What does this PR do? Please describe:
Adds a basic unit test for AnyPrecisionOptimizer to ensure it is equivalent to Adam under an fp32 configuration. We should have unit tests since the feature is already in use (e.g. by @stas00), and it would be good to provide correctness guarantees (see the sketch below).

Fixes #57
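
For reference, a minimal sketch of what such an fp32-parity test might look like. This is an assumption, not the PR's actual test code: the `torchdistx.optimizers.AnyPrecisionAdamW` import path and the `momentum_dtype` / `variance_dtype` / `use_kahan_summation` keyword arguments are assumed names, and weight decay is zeroed so that Adam and AdamW coincide for the comparison.

```python
# Hypothetical sketch, not the PR's actual test: check that the AnyPrecision
# optimizer matches a reference torch.optim optimizer when all of its state
# is kept in fp32. Weight decay is zeroed so Adam and AdamW coincide.
import copy

import torch
import torch.nn as nn
from torchdistx.optimizers import AnyPrecisionAdamW  # assumed import path


def test_fp32_parity_with_adam():
    torch.manual_seed(0)
    model_ref = nn.Linear(8, 4)
    model_ap = copy.deepcopy(model_ref)  # identical initial weights

    opt_ref = torch.optim.Adam(model_ref.parameters(), lr=1e-3)
    opt_ap = AnyPrecisionAdamW(
        model_ap.parameters(),
        lr=1e-3,
        weight_decay=0.0,
        momentum_dtype=torch.float32,   # assumed kwarg names for state dtypes
        variance_dtype=torch.float32,
        use_kahan_summation=False,
    )

    # Run a few identical optimization steps on both model copies.
    for _ in range(5):
        inp = torch.randn(16, 8)
        for model, opt in ((model_ref, opt_ref), (model_ap, opt_ap)):
            opt.zero_grad()
            model(inp).sum().backward()
            opt.step()

    # In pure fp32 mode the two should agree up to floating-point tolerance.
    for p_ref, p_ap in zip(model_ref.parameters(), model_ap.parameters()):
        torch.testing.assert_close(p_ref, p_ap)
```

The key idea is to deep-copy the model so both optimizers start from identical weights, feed both copies the same inputs, and then compare parameters element-wise after several steps.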

Does your PR introduce any breaking changes? If yes, please list them:
None; this PR only adds a test.

Check list:

  • Was this discussed and approved via a GitHub issue? (not for typos or docs)
  • Did you read the contributor guideline?
  • Did you make sure that your PR does only one thing instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests?
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, or minor internal changes)

@facebook-github-bot added the CLA Signed label on Sep 6, 2022
@H-Huang (Member) left a comment:
lgtm

@rohan-varma merged commit 21a5c8e into main on Sep 8, 2022

Labels

CLA Signed


Development

Successfully merging this pull request may close these issues.

[AnyPrecision optimizer] - needs unit tests

4 participants