Flax documentation #589
Conversation
The documentation is not available anymore as the PR was closed or merged.
To preview locally (quicker):

git clone https://github.com/huggingface/doc-builder.git
cd doc-builder
pip install -e .

then run:

doc-builder preview diffusers ~/Desktop/diffusers/docs/source/

(obviously, change the path based on your system)
src/diffusers/models/vae_flax.py
Outdated
Tuple containing the number of output channels for each block
layers_per_block (:obj:`int`, *optional*, defaults to `2`):
    Number of ResNet layers for each block
norm_num_groups (:obj:`int`, *optional*, defaults to `2`):
This argument seems to never be used 🤔 It seems to correspond to `resnet_groups` in `vae.py`, but I'm not sure here.
I opened #621 to fix this separately.
Thanks a lot @pcuenca 💪
src/diffusers/models/vae_flax.py
Outdated
Activation function
latent_channels (:obj:`int`, *optional*, defaults to `4`):
    Latent space channels
norm_num_groups (:obj:`int`, *optional*, defaults to `2`):
Same comment as above
35716dd to 309f445
cc @patil-suraj with #595 being merged I think this is ready for review 🔥 (somehow I cannot request review here)
Wow super cool!
Wow this is super cool!
patrickvonplaten
left a comment
Great effort @younesbelkada! Looks good to me to merge
pcuenca
left a comment
Amazing work! Very comprehensive and detailed, thanks a lot!
- [Just-In-Time (JIT) compilation](https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit)
- [Automatic Differentiation](https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation)
- [Vectorization](https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap)
- [Parallelization](https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap)
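As a quick illustration of the first three transformations listed above (a minimal sketch, not code from this PR; `jax.pmap` follows the same pattern but requires multiple devices, so it is omitted):

```python
import jax
import jax.numpy as jnp

def loss(x):
    return jnp.sum(x ** 2)

# Just-In-Time compilation: traces `loss` once and compiles it with XLA.
fast_loss = jax.jit(loss)

# Automatic differentiation: gradient of `loss` w.r.t. its input.
grad_loss = jax.grad(loss)

# Vectorization: maps `loss` over a leading batch axis without a Python loop.
batched_loss = jax.vmap(loss)

x = jnp.arange(3.0)                       # [0., 1., 2.]
print(fast_loss(x))                       # 5.0
print(grad_loss(x))                       # [0., 2., 4.]
print(batched_loss(jnp.stack([x, x])))    # [5., 5.]
```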
👍 ❤️
Co-authored-by: Pedro Cuenca <[email protected]>
Thank you very much @patrickvonplaten @pcuenca! 💯
I think it already looks great :) But if you think it's useful to add more references, just go for it and merge when you are done :)
Perfect, looks good to me too! Somehow I can't merge or approve; I think I don't have the rights on this repo, so feel free to merge whenever you want 💪 🔥
* documenting `attention_flax.py` file
* documenting `embeddings_flax.py`
* documenting `unet_blocks_flax.py`
* Add new objs to doc page
* document `vae_flax.py`
* Apply suggestions from code review
* modify `unet_2d_condition_flax.py`
* make style
* Apply suggestions from code review
* make style
* Apply suggestions from code review
* fix indent
* fix typo
* fix indent unet
* Update src/diffusers/models/vae_flax.py
* Apply suggestions from code review

Co-authored-by: Pedro Cuenca <[email protected]>
Co-authored-by: Mishig Davaadorj <[email protected]>
Co-authored-by: Pedro Cuenca <[email protected]>
What does this PR do?
This PR adds more documentation on Flax modules
I am planning to add documentation only for modules that are documented in the PyTorch modeling files. This includes:
Inside the `attention_flax.py` file:
- `FlaxAttentionBlock`
- `FlaxBasicTransformerBlock`
- `FlaxSpatialTransformer`
- `FlaxGluFeedForward`
- `FlaxGEGLU`

Inside the `embeddings_flax.py` file:
- `FlaxTimestepEmbedding`
- `FlaxTimesteps`

Inside the `unet_2d_condition_flax.py` file:
- `FlaxCrossAttnDownBlock2D`
- `FlaxDownBlock2D`
- `FlaxCrossAttnUpBlock2D`
- `FlaxUpBlock2D`
- `FlaxUNetMidBlock2DCrossAttn`

Inside the `vae_flax.py` file:
- `FlaxUpsample2D`
- `FlaxDownsample2D`
- `FlaxResnetBlock2D`
- `FlaxAttentionBlock`
- `FlaxUpEncoderBlock2D`
- `FlaxUNetMidBlock2D`
- `FlaxEncoder`
- `FlaxDecoder`
- `FlaxDiagonalGaussianDistribution`
- `FlaxAutoencoderKL`

Questions
Do we put the paragraph on each `nn.Module` as we do in `transformers`, or is it too redundant?

cc @patrickvonplaten @mishig25