Is there a way to use the native tf Attention layer with keras Sequential API?
I'm looking to use this particular class. I have found custom implementations such as this one, but what I'm really after is using this particular class itself with the Sequential API.
Here's a code example of what I'm looking for:
import tensorflow as tf

# vocab_length, EMBEDDING_DIM, MAX_SEQUENCE_LENGTH and embedding_matrix
# are defined earlier in my code
model = tf.keras.models.Sequential()
model.add(tf.keras.layers.Embedding(vocab_length,
                                    EMBEDDING_DIM,
                                    input_length=MAX_SEQUENCE_LENGTH,
                                    weights=[embedding_matrix],
                                    trainable=False))
model.add(tf.keras.layers.Dropout(0.3))
model.add(tf.keras.layers.Conv1D(64, 5, activation='relu'))
model.add(tf.keras.layers.MaxPooling1D(pool_size=4))
model.add(tf.keras.layers.CuDNNLSTM(100))
model.add(tf.keras.layers.Dropout(0.4))
model.add(tf.keras.layers.Attention())  # Doesn't work: Attention expects a list of [query, value] tensors, but Sequential passes a single tensor
model.add(tf.keras.layers.Dense(1, activation='sigmoid'))
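
For context, here is a minimal sketch of how I understand the equivalent would look with the functional API. This is my assumption of the intended usage, not code from the docs: I pass the same tensor as both query and value for self-attention, set return_sequences=True on the LSTM so Attention receives 3-D tensors, and add a pooling layer I chose myself to flatten the attention output before the final Dense layer. I would still prefer to express this with Sequential.

inputs = tf.keras.Input(shape=(MAX_SEQUENCE_LENGTH,))
x = tf.keras.layers.Embedding(vocab_length,
                              EMBEDDING_DIM,
                              weights=[embedding_matrix],
                              trainable=False)(inputs)
x = tf.keras.layers.Dropout(0.3)(x)
x = tf.keras.layers.Conv1D(64, 5, activation='relu')(x)
x = tf.keras.layers.MaxPooling1D(pool_size=4)(x)
# return_sequences=True keeps the time dimension so Attention gets 3-D inputs
x = tf.keras.layers.CuDNNLSTM(100, return_sequences=True)(x)
# Attention takes a list [query, value]; using x for both is self-attention
attn = tf.keras.layers.Attention()([x, x])
# Pooling step I added (an assumption) to collapse the sequence dimension
x = tf.keras.layers.GlobalAveragePooling1D()(attn)
outputs = tf.keras.layers.Dense(1, activation='sigmoid')(x)
model = tf.keras.Model(inputs, outputs)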