1 parent 2300056 commit 60f4c7b
src/diffusers/models/attention_flax.py
@@ -247,7 +247,7 @@ class FlaxGEGLU(nn.Module):
     Flax implementation of a Linear layer followed by the variant of the gated linear unit activation function from
     https://arxiv.org/abs/2002.05202.
 
-    arameters:
+    Parameters:
         dim (:obj:`int`):
             Input hidden states dimension
         dropout (:obj:`float`, *optional*, defaults to 0.0):
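
For context, the docstring being fixed describes GEGLU, the gated linear unit variant from https://arxiv.org/abs/2002.05202: one Dense projection produces both a value and a gate, and the gate is passed through GELU before the two are multiplied. Below is a minimal runnable sketch of that pattern, assuming Flax's `nn.Dense`, `nn.gelu`, and `nn.Dropout`; the module name `GEGLUSketch` and the 4x inner width are illustrative assumptions, not the exact `FlaxGEGLU` source from `attention_flax.py`.

```python
# Illustrative GEGLU sketch (https://arxiv.org/abs/2002.05202),
# not the exact diffusers FlaxGEGLU implementation.
import jax
import jax.numpy as jnp
import flax.linen as nn


class GEGLUSketch(nn.Module):
    dim: int              # input hidden states dimension
    dropout: float = 0.0  # dropout rate, mirroring the docstring's parameter

    @nn.compact
    def __call__(self, hidden_states, deterministic=True):
        # Project to twice the inner width, then split into value and gate.
        inner_dim = self.dim * 4  # assumed expansion factor
        projected = nn.Dense(inner_dim * 2)(hidden_states)
        value, gate = jnp.split(projected, 2, axis=-1)
        # Gate the linear branch with GELU, then apply dropout.
        out = value * nn.gelu(gate)
        return nn.Dropout(rate=self.dropout)(out, deterministic=deterministic)


# Usage: initialize and apply on a dummy batch of hidden states.
module = GEGLUSketch(dim=8)
x = jnp.ones((1, 16, 8))
params = module.init(jax.random.PRNGKey(0), x)
y = module.apply(params, x)
print(y.shape)  # (1, 16, 32)
```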