1 parent d6f3155 commit 9873b2f
src/transformers/models/glm4v/modeling_glm4v.py
@@ -325,7 +325,7 @@ def forward(
             value_states,
             attention_mask,
             dropout=0.0 if not self.training else self.attention_dropout,
-            scaling=self.scale,
+            scaling=self.scaling,
             is_causal=self.is_causal,
             **kwargs,
         )
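The fix corrects the keyword passed to the attention backend from the nonexistent attribute `self.scale` to `self.scaling`. That argument is the factor the attention scores are multiplied by before the softmax (conventionally `head_dim ** -0.5`). A minimal standalone sketch of the role this parameter plays — names and the numpy implementation here are illustrative, not the transformers API:

```python
import numpy as np

def attention_sketch(q, k, v, scaling=None):
    """Toy scaled dot-product attention showing where `scaling` is applied.

    q, k, v: arrays of shape (batch, seq_len, head_dim).
    scaling: multiplier for the raw scores; defaults to 1/sqrt(head_dim),
    the conventional choice the patched code passes via `self.scaling`.
    """
    if scaling is None:
        scaling = q.shape[-1] ** -0.5  # 1 / sqrt(head_dim)
    # Raw attention scores, scaled before the softmax.
    scores = (q @ k.transpose(0, 2, 1)) * scaling
    # Numerically stable softmax over the last axis.
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
q = rng.standard_normal((1, 4, 8))
k = rng.standard_normal((1, 4, 8))
v = rng.standard_normal((1, 4, 8))
out = attention_sketch(q, k, v)
print(out.shape)  # (1, 4, 8)
```

Passing the wrong attribute name here would raise an `AttributeError` at runtime rather than silently mis-scaling, which is consistent with this being a one-line bug fix.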
src/transformers/models/glm4v/modular_glm4v.py
@@ -549,7 +549,7 @@ def forward(