I am going to implement an attention mechanism for an LSTM neural network. I tried this layer (https://pypi.org/project/keras-self-attention/), but it increases the error! Maybe this is due to my data set, but similar studies have achieved higher accuracies with an attention layer. Could you please suggest another easy-to-use method for implementing attention in Keras?
Here is a simple way to add attention: https://stackoverflow.com/questions/62948332/how-to-add-attention-layer-to-a-bi-lstm/62949137#62949137 – Marco Cerliani Jul 17 '20 at 14:59
2 Answers
You may use the TensorFlow Keras layer tf.keras.layers.Attention. This assumes you are working with TensorFlow 2.0.
You can read more here: https://www.tensorflow.org/api_docs/python/tf/keras/layers/Attention?version=stable
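For instance, here is a minimal sketch of using tf.keras.layers.Attention as self-attention on top of an LSTM. The input shape, unit counts, and loss below are illustrative assumptions, not details from the question:

    import tensorflow as tf

    TIMESTEPS, FEATURES = 30, 8  # assumed input shape for illustration

    inputs = tf.keras.Input(shape=(TIMESTEPS, FEATURES))
    # return_sequences=True so the attention layer sees every timestep
    lstm_out = tf.keras.layers.LSTM(64, return_sequences=True)(inputs)
    # Self-attention: the LSTM outputs serve as both query and value
    attn_out = tf.keras.layers.Attention()([lstm_out, lstm_out])
    # Pool the attended sequence into a single vector for the output head
    context = tf.keras.layers.GlobalAveragePooling1D()(attn_out)
    outputs = tf.keras.layers.Dense(1)(context)

    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")

Passing the same tensor as query and value turns the layer into self-attention over the LSTM's hidden states; passing two different tensors would give cross-attention instead.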

Ekaba Bisong
Thanks Ekaba. The example provided in this link is not clear to me. Do you have any example of an implementation in an LSTM neural network? – user186204 Dec 29 '19 at 09:45
Attention does not always improve the scores. It comes in really handy, though, when dealing with longer and longer inputs.
If you are using LSTMs, I would not recommend tf.keras.layers.Attention, as that class is more suitable for CNNs and DNNs.
In very few lines of code you can add your own custom attention layer, as in: Custom Attention Layer in Keras. A minimal sketch follows.
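Here is one possible hand-rolled additive attention layer on top of an LSTM; the class name SimpleAttention and all shapes and hyperparameters are illustrative assumptions, not the linked article's exact code:

    import tensorflow as tf

    class SimpleAttention(tf.keras.layers.Layer):
        """Scores each timestep, softmaxes, and returns a weighted sum."""

        def build(self, input_shape):
            # One weight per feature dimension plus a bias per timestep
            self.w = self.add_weight(name="att_weight",
                                     shape=(input_shape[-1], 1),
                                     initializer="glorot_uniform")
            self.b = self.add_weight(name="att_bias",
                                     shape=(input_shape[1], 1),
                                     initializer="zeros")
            super().build(input_shape)

        def call(self, x):                              # x: (batch, time, features)
            e = tf.tanh(tf.matmul(x, self.w) + self.b)  # scores: (batch, time, 1)
            a = tf.nn.softmax(e, axis=1)                # attention weights over time
            return tf.reduce_sum(x * a, axis=1)         # context: (batch, features)

    inputs = tf.keras.Input(shape=(30, 8))              # assumed (timesteps, features)
    h = tf.keras.layers.LSTM(64, return_sequences=True)(inputs)
    context = SimpleAttention()(h)
    outputs = tf.keras.layers.Dense(1)(context)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")

The layer replaces the usual "take the last hidden state" step with a learned weighted average over all timesteps, which is the main point of adding attention to an LSTM.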

Allohvk