
Conversation

@anton-l (Member) commented Oct 13, 2022

This will make the float16 loss terms compatible with accelerate.
Fixes #817
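
For context, here is a minimal sketch of the kind of change the description refers to: computing the MSE loss in float32 even when the model runs in float16, so the scalar loss term works with accelerate's mixed-precision handling. The actual diff is not shown on this page; the tensor names `noise_pred` and `noise` are assumptions mirroring the convention in the dreambooth training script.

```python
import torch
import torch.nn.functional as F

# Stand-ins for the model's noise prediction and the target noise from the
# training loop; under fp16 training these arrive as half-precision tensors.
noise_pred = torch.randn(4, 4, 64, 64, dtype=torch.float16)
noise = torch.randn(4, 4, 64, 64, dtype=torch.float16)

# Upcast both terms to float32 before the MSE so the reduction and the
# resulting scalar loss stay in full precision for accelerate to scale
# and backpropagate, avoiding fp16 overflow/underflow.
loss = F.mse_loss(noise_pred.float(), noise.float(), reduction="mean")
print(loss.dtype)  # torch.float32
```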

@HuggingFaceDocBuilderDev commented Oct 13, 2022

The documentation is not available anymore as the PR was closed or merged.

@patil-suraj (Contributor) left a comment

Looks good, thanks a lot!

@patil-suraj merged commit e001fed into main Oct 13, 2022
@patil-suraj deleted the dreambooth-loss branch October 13, 2022 13:41
prathikr pushed a commit to prathikr/diffusers that referenced this pull request Oct 26, 2022

Development

Successfully merging this pull request may close these issues:

Mixed precision is not working on dreambooth example (#817)
