
Commit 72c0662

Update to accepted
1 parent 9778f63 commit 72c0662

File tree

1 file changed: +3 -3 lines changed

rfcs/20200616-keras-multihead-attention.md

Lines changed: 3 additions & 3 deletions
@@ -1,6 +1,6 @@
 # RFC: Multihead Attention and EinsumDense on Keras

-| Status | Proposed |
+| Status | Accepted |
 | :------------ | :------------------------------------------------------ |
 | **RFC #** | [260](https://github.com/tensorflow/community/pull/260) |
 | **Author(s)** | Hongkun Yu ([email protected]), Mark Omernick ([email protected]) |
@@ -342,8 +342,8 @@ method. For example, we implemented
 [TalkingHeadAttention](https://github.com/tensorflow/models/blob/master/official/nlp/modeling/layers/talking_heads_attention.py)
 introduced by ["Talking-Heads Attention"](https://arxiv.org/abs/2003.02436)
 paper. Using the keras Attention layer as another example, since it supports the
-basic single-head case 1-D attention, we can use it inside `_build_attention`
-and `_compute_attention`.
+basic single-head case 1-D attention, we can use it inside `build_attention`
+and `compute_attention`.

 ## Questions and Discussion Topics

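The paragraph touched by this commit describes the RFC's customization pattern: subclass the multi-head attention layer and override its attention-building and attention-computation hooks, as TalkingHeadAttention does. Below is a minimal sketch of that pattern, assuming the released `tf.keras.layers.MultiHeadAttention`, where the hooks carry a leading underscore (`_build_attention` / `_compute_attention`) and whose exact signatures may vary across TF/Keras versions. `TemperatureScaledAttention` and its `temperature` argument are hypothetical names used only for illustration, not part of the RFC or this commit.

```python
import tensorflow as tf


class TemperatureScaledAttention(tf.keras.layers.MultiHeadAttention):
    """Hypothetical subclass showing where a custom score computation plugs in."""

    def __init__(self, num_heads, key_dim, temperature=1.0, **kwargs):
        super().__init__(num_heads=num_heads, key_dim=key_dim, **kwargs)
        self._temperature = temperature  # hypothetical extra knob
        self._scale_dim = key_dim

    def _compute_attention(self, query, key, value,
                           attention_mask=None, training=None):
        # query: [batch, q_len, heads, dim]; key/value: [batch, kv_len, heads, dim]
        # (shapes as produced by the base layer's projection step).
        scores = tf.einsum("bqhd,bkhd->bhqk", query, key)
        scores /= self._temperature * tf.math.sqrt(
            tf.cast(self._scale_dim, scores.dtype))
        if attention_mask is not None:
            # Simplified masking: assumes a [batch, q_len, kv_len] mask where
            # 1 means "attend" and 0 means "ignore".
            mask = tf.cast(attention_mask, scores.dtype)[:, tf.newaxis, :, :]
            scores += (1.0 - mask) * -1e9
        probs = tf.nn.softmax(scores, axis=-1)
        # Attention-probability dropout is omitted here for brevity.
        output = tf.einsum("bhqk,bkhd->bqhd", probs, value)
        return output, probs


# Usage: self-attention over a [batch=2, seq=8, features=32] tensor.
layer = TemperatureScaledAttention(num_heads=4, key_dim=16, temperature=2.0)
x = tf.random.normal([2, 8, 32])
y = layer(query=x, value=x)  # -> shape [2, 8, 32]
```

The point of the hook is that it receives the already-projected per-head query/key/value tensors and returns the combined per-head outputs plus the attention probabilities, so variants such as talking-heads attention can be expressed without reimplementing the input and output projections.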