Conversation

@pcuenca (Member) commented Sep 19, 2022

This makes it possible to save models and configuration files.

@HuggingFaceDocBuilderDev commented Sep 19, 2022

The documentation is not available anymore as the PR was closed or merged.

```diff
  config_name = CONFIG_NAME
  _automatically_saved_args = ["_diffusers_version", "_class_name", "_name_or_path"]
- _flax_internal_args = ["name", "parent"]
+ _flax_internal_args = ["name", "parent", "dtype"]
```
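For context, a minimal sketch of what a list like `_flax_internal_args` is for: Flax module constructor arguments such as `name`, `parent`, and (with this change) `dtype` are filtered out before the remaining init arguments are written to the config. This is only an illustration of the idea, not the actual diffusers implementation; `build_config_dict` and the example argument names are made up.

```python
# Illustrative only: a stand-in for how Flax-internal constructor arguments
# can be filtered out before the init kwargs are written to config.json.
def build_config_dict(init_kwargs, flax_internal_args=("name", "parent", "dtype")):
    """Drop Flax-internal constructor args so they never land in the saved config."""
    return {k: v for k, v in init_kwargs.items() if k not in flax_internal_args}

# Example: dtype (and parent) stay runtime-only arguments and are not serialized.
config = build_config_dict(
    {"in_channels": 3, "out_channels": 3, "dtype": "bfloat16", "parent": None}
)
print(config)  # {'in_channels': 3, 'out_channels': 3}
```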
Contributor

did you receive a warning while initializing?

Member Author

> did you receive a warning while initializing?

Not about dtype. In the case of the VAE, for example, I did the following tests (a rough code sketch follows the list):

  • Loaded from the current configuration in fusing. Then I got this warning:
The config attributes {'architectures': ['encoderKL'], 'double_z': True, 'transformers_version': '4.21.0.dev0'} were passed to FlaxAutoencoderKL, but are not expected and will be ignored. Please verify your config.json configuration file.
  • Saved the configuration.
  • Saved the pretrained weights.
  • Loaded from the saved pretrained weights. No warning was emitted here.
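A rough sketch of those steps, assuming the diffusers Flax API (`FlaxAutoencoderKL.from_pretrained` returning `(model, params)`, and `save_pretrained`); the repo id below is a placeholder, not the actual fusing checkpoint used:

```python
from diffusers import FlaxAutoencoderKL

# 1. Load from the original Hub config: this is where the
#    "config attributes ... are not expected and will be ignored" warning shows up.
#    The repo id below is a placeholder, not the actual checkpoint used.
vae, params = FlaxAutoencoderKL.from_pretrained("fusing/example-vae")

# 2./3. Save the configuration and the pretrained weights locally.
vae.save_pretrained("./vae-flax", params=params)

# 4. Reload from the freshly saved files: no warning this time, and `dtype`
#    stays out of the saved config.json.
vae, params = FlaxAutoencoderKL.from_pretrained("./vae-flax")
```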

@patil-suraj (Contributor) left a comment

LGTM!

@pcuenca pcuenca mentioned this pull request Sep 19, 2022
@patrickvonplaten (Contributor)

OK, removing for now. Overall I was actually in favor of keeping it in, because if someone trains a model with dtype=bfloat16 it'd be good to know that. So I think we should eventually store this parameter in the config. However, as pointed out by @patil-suraj:

  • It should probably be saved as a string.
  • We need good naming so the weight dtype isn't confused with the computation dtype.

@pcuenca could you maybe open a "Feature request" issue after merging this that reminds us to eventually save this param?
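To illustrate the serialization concern raised above: a dtype such as `jnp.bfloat16` is a type object and not JSON-serializable, so it would have to be stored as a string under a well-chosen key. The key name `weights_dtype` below is made up purely to illustrate the naming issue (weight dtype vs. computation dtype), not an actual config key:

```python
import jax.numpy as jnp

# Hypothetical: store the dtype as a string, under a key whose name makes clear
# it refers to the stored weights rather than the computation dtype.
config_entry = {"weights_dtype": jnp.bfloat16.__name__}   # {"weights_dtype": "bfloat16"}

# Hypothetical: map the string back to a jnp dtype when loading.
restored_dtype = getattr(jnp, config_entry["weights_dtype"])  # jnp.bfloat16
```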

@pcuenca pcuenca merged commit 5b3f249 into main Sep 19, 2022
@pcuenca pcuenca deleted the flax-ignore-dtype branch September 19, 2022 13:37
PhaneeshB pushed a commit to nod-ai/diffusers that referenced this pull request Mar 1, 2023
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023
Flax: ignore dtype for configuration.
