**System information**
- Have I written custom code: Yes
- OS Platform and Distribution: Ubuntu 16.04
- TensorFlow installed from: binary
- TensorFlow version: 2.0.0rc0
- TensorFlow Addons installed from: PyPI
- TensorFlow Addons version: 0.5.0.dev20190829
- Python version: 3.6
**Describe the bug**
When an `AttentionMechanism` is created without a memory and then passed to an `AttentionWrapper` together with a custom `attention_layer`, the wrapper constructor raises an error.
**Describe the expected behavior**
No error should be raised, since the memory can still be attached later with `attention_mechanism.setup_memory(...)`.
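For context, the deferred-memory workflow this targets looks roughly like the sketch below (`encoder_outputs` is a placeholder tensor standing in for real encoder output); it runs fine as long as no custom `attention_layer` is passed:

```python
import tensorflow as tf
import tensorflow_addons as tfa

units = 32
# Create the mechanism without a memory; the memory is attached later.
attention_mechanism = tfa.seq2seq.LuongAttention(units)
attention_wrapper = tfa.seq2seq.AttentionWrapper(
    tf.keras.layers.LSTMCell(units), attention_mechanism)

# Once the encoder has run, attach its output as the attention memory.
encoder_outputs = tf.random.normal([4, 10, units])  # placeholder memory
attention_mechanism.setup_memory(encoder_outputs)
```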
**Code to reproduce the issue**

```python
import tensorflow as tf
import tensorflow_addons as tfa

units = 32
attention_mechanism = tfa.seq2seq.LuongAttention(units)
cell = tf.keras.layers.LSTMCell(units)
attention_layer = tf.keras.layers.Dense(
    units, use_bias=False, activation=tf.math.tanh)
attention_wrapper = tfa.seq2seq.AttentionWrapper(
    cell, attention_mechanism, attention_layer=attention_layer)
```

**Other info / logs**
File "/lib/python3.6/site-packages/tensorflow_addons/seq2seq/attention_wrapper.py", line 1698, in <genexpr>
])[-1]) for layer, mechanism in zip(
AttributeError: 'NoneType' object has no attribute 'shape'
To compute the attention layer's output shape, the constructor reads `attention_mechanism.values`, which is still `None` at the time the `AttentionWrapper` constructor is called because no memory has been set yet.
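If that reading is right, attaching a memory before constructing the wrapper sidesteps the crash, at the cost of losing the deferred setup. A minimal workaround sketch, continuing the reproduction snippet above (the random `encoder_outputs` tensor is a stand-in for real encoder output):

```python
# Workaround sketch: give the mechanism a memory first so that
# attention_mechanism.values has a known shape inside __init__.
encoder_outputs = tf.random.normal([4, 10, units])  # placeholder memory
attention_mechanism.setup_memory(encoder_outputs)
attention_wrapper = tfa.seq2seq.AttentionWrapper(
    cell, attention_mechanism, attention_layer=attention_layer)
```

A proper fix would presumably defer the attention layer output shape computation until the memory is known, e.g. at `setup_memory()` or build time.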