
How do I implement attention for a sequence-to-sequence model in Keras? I understand the basic seq2seq model, but I want to add attention as in Fig. B (shown in the attached seq2seq link). Please explain step by step.
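For reference, below is a minimal sketch of one common approach: Luong-style (dot-product) attention wired into a Keras encoder-decoder using the functional API. The vocabulary sizes, layer widths, and layer names are illustrative assumptions, not taken from the linked figure.

```python
# Minimal sketch of Luong-style (dot-product) attention in a Keras
# seq2seq model. All sizes and names below are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, Model

latent_dim = 64
src_vocab, tgt_vocab = 1000, 1000

# --- Encoder: return the full sequence of hidden states for attention,
#     plus the final states to initialize the decoder.
enc_inputs = layers.Input(shape=(None,), name="encoder_tokens")
enc_emb = layers.Embedding(src_vocab, latent_dim)(enc_inputs)
enc_seq, enc_h, enc_c = layers.LSTM(
    latent_dim, return_sequences=True, return_state=True)(enc_emb)

# --- Decoder: an LSTM seeded with the encoder's final states.
dec_inputs = layers.Input(shape=(None,), name="decoder_tokens")
dec_emb = layers.Embedding(tgt_vocab, latent_dim)(dec_inputs)
dec_seq = layers.LSTM(latent_dim, return_sequences=True)(
    dec_emb, initial_state=[enc_h, enc_c])

# --- Attention: dot-product scores between each decoder state and every
#     encoder state, softmaxed over source timesteps, then a weighted
#     sum of encoder states (the context vector).
scores = layers.Dot(axes=[2, 2])([dec_seq, enc_seq])   # (batch, t_dec, t_enc)
weights = layers.Activation("softmax")(scores)         # normalize over t_enc
context = layers.Dot(axes=[2, 1])([weights, enc_seq])  # (batch, t_dec, latent)

# --- Combine context with decoder states and predict the next token.
concat = layers.Concatenate()([context, dec_seq])
outputs = layers.Dense(tgt_vocab, activation="softmax")(concat)

model = Model([enc_inputs, dec_inputs], outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```

Note that recent versions of tf.keras also ship a built-in `layers.Attention()` that computes the same dot-product scores and context in one call, so the two `Dot` layers and the softmax above can be replaced with `context = layers.Attention()([dec_seq, enc_seq])`.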

Mr.Beans
  • Stack Overflow is not a tutorial site. Please do some research before asking questions here. A simple search in a search engine like Google with a query like "keras sequence to sequence attention" would lead to a couple of tutorials/articles. If you stumble upon any **programming** question/issue while reading those tutorials, then you can ask a question with specific details here and others will kindly help you. – today Sep 01 '18 at 12:54
  • https://blog.keras.io/a-ten-minute-introduction-to-sequence-to-sequence-learning-in-keras.html This should help you. – Vikas NS Sep 01 '18 at 16:25
  • https://machinelearningmastery.com/define-encoder-decoder-sequence-sequence-model-neural-machine-translation-keras/ This one is also good. – Vikas NS Sep 01 '18 at 16:26

0 Answers