How do I implement attention for a sequence-to-sequence model in Keras? I understand the basic seq2seq model, but I want to add attention as in Fig. B (shown in the attached seq2seq link). Please explain step by step.
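For reference, here is a minimal sketch of the basic encoder-decoder model being discussed, following the approach in the Keras blog tutorial linked in the comments below. The sizes (`latent_dim`, the token counts) are placeholder values, not from the question:

```python
from keras.models import Model
from keras.layers import Input, LSTM, Dense

latent_dim = 256          # placeholder hidden size
num_encoder_tokens = 100  # placeholder vocabulary sizes
num_decoder_tokens = 100

# Encoder: read the source sequence, keep only the final states.
encoder_inputs = Input(shape=(None, num_encoder_tokens))
_, state_h, state_c = LSTM(latent_dim, return_state=True)(encoder_inputs)

# Decoder: generate the target sequence, initialised with the encoder states.
decoder_inputs = Input(shape=(None, num_decoder_tokens))
decoder_outputs = LSTM(latent_dim, return_sequences=True)(
    decoder_inputs, initial_state=[state_h, state_c])
outputs = Dense(num_decoder_tokens, activation='softmax')(decoder_outputs)

model = Model([encoder_inputs, decoder_inputs], outputs)
model.compile(optimizer='rmsprop', loss='categorical_crossentropy')
```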
- Stack Overflow is not a tutorial site. Please do some research before asking questions here. A simple search in a search engine like Google with a query like "keras sequence to sequence attention" would lead to a couple of tutorials/articles. If you stumble upon any **programming** question/issue while reading those tutorials, then you can ask a question with specific details here and others will kindly help you. – today Sep 01 '18 at 12:54
- https://blog.keras.io/a-ten-minute-introduction-to-sequence-to-sequence-learning-in-keras.html This should help you. – Vikas NS Sep 01 '18 at 16:25
- https://machinelearningmastery.com/define-encoder-decoder-sequence-sequence-model-neural-machine-translation-keras/ This one is also good. – Vikas NS Sep 01 '18 at 16:26
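Since the question asks specifically about adding attention, here is a minimal sketch of one common approach: Luong-style (dot-product) attention layered onto the encoder-decoder above, built from standard Keras layers (`Dot`, `Activation`, `Concatenate`). This is an illustration under assumed shapes and sizes, not the only way to do it; note the encoder must now return its full output sequence so the decoder can attend over it:

```python
from keras.models import Model
from keras.layers import (Input, LSTM, Dense, Activation,
                          Concatenate, Dot, TimeDistributed)

latent_dim = 256          # placeholder hidden size
num_encoder_tokens = 100  # placeholder vocabulary sizes
num_decoder_tokens = 100

# Encoder: keep the full sequence of hidden states for attention.
encoder_inputs = Input(shape=(None, num_encoder_tokens))
encoder_outputs, state_h, state_c = LSTM(
    latent_dim, return_sequences=True, return_state=True)(encoder_inputs)

# Decoder: initialised with the encoder's final states, as before.
decoder_inputs = Input(shape=(None, num_decoder_tokens))
decoder_outputs = LSTM(latent_dim, return_sequences=True)(
    decoder_inputs, initial_state=[state_h, state_c])

# Dot-product attention:
# scores[b, t, s] = <decoder state at step t, encoder state at step s>
scores = Dot(axes=(2, 2))([decoder_outputs, encoder_outputs])
weights = Activation('softmax')(scores)                 # normalise over encoder steps
context = Dot(axes=(2, 1))([weights, encoder_outputs])  # weighted sum of encoder states

# Combine the context vector with the decoder state before the output projection.
combined = Concatenate(axis=-1)([context, decoder_outputs])
outputs = TimeDistributed(
    Dense(num_decoder_tokens, activation='softmax'))(combined)

model = Model([encoder_inputs, decoder_inputs], outputs)
model.compile(optimizer='rmsprop', loss='categorical_crossentropy')
```

The key change versus the plain seq2seq model is that, at every decoding step, the output layer sees a context vector (a softmax-weighted average of all encoder states) alongside the decoder's own state, instead of relying only on the encoder's final states.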