Questions tagged [sequence-to-sequence]

This tag is used for Google's deprecated seq2seq framework, an encoder-decoder framework for TensorFlow (its revamped successor is called Neural Machine Translation)

94 questions
0
votes
2 answers

What are the details of Sequence-to-sequence model for text summarization?

It is clear how to train an encoder-decoder model for translation: each source sequence has a corresponding target sequence (its translation). But in the case of text summarization, the abstract is much shorter than its article. According to Urvashi Khandelwal,…
ichernob
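One short answer to the question above: training pairs for summarization are built exactly as for translation, with the (much shorter) abstract as the target. A minimal sketch, assuming hypothetical `GO`/`EOS`/`PAD` token names (illustrative, not from any specific library):

```python
# Hypothetical sketch: (article, abstract) pairs are prepared for teacher
# forcing the same way (source, translation) pairs are -- the target length
# is simply independent of (and much smaller than) the source length.
GO, EOS, PAD = "<go>", "<eos>", "<pad>"

def make_training_pair(article_tokens, abstract_tokens, max_target_len):
    """Return (source, decoder_inputs, decoder_targets) for teacher forcing.

    Decoder input is the abstract shifted right by one (prefixed with GO);
    decoder target is the abstract followed by EOS; both are padded or
    truncated to a fixed length so batches stack cleanly.
    """
    dec_in = [GO] + abstract_tokens
    dec_out = abstract_tokens + [EOS]
    dec_in = (dec_in + [PAD] * max_target_len)[:max_target_len]
    dec_out = (dec_out + [PAD] * max_target_len)[:max_target_len]
    return article_tokens, dec_in, dec_out

src, dec_in, dec_out = make_training_pair(
    ["long", "article", "text", "with", "many", "tokens"],
    ["short", "abstract"],
    max_target_len=4,
)
```

The loss is then computed per decoder time step against `dec_out`, with padded positions usually masked out.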
0
votes
1 answer

The difference between RNN Decoder and RNN

We are only using the RNN decoder (without an encoder) for text generation; how is an RNN decoder different from a plain RNN operation? RNN Decoder in TensorFlow: https://www.tensorflow.org/api_docs/python/tf/contrib/seq2seq/dynamic_rnn_decoder Pure RNN in…
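The structural difference can be sketched in a few lines of NumPy (an illustration of the idea, not the `tf.contrib` API): a plain RNN consumes an externally provided input sequence, while an RNN decoder feeds its own previous output back in as the next input.

```python
import numpy as np

def rnn_step(x, h, W_xh, W_hh):
    """One vanilla RNN cell update."""
    return np.tanh(x @ W_xh + h @ W_hh)

def plain_rnn(inputs, h, W_xh, W_hh):
    for x in inputs:                   # inputs are given externally
        h = rnn_step(x, h, W_xh, W_hh)
    return h

def rnn_decoder(h, W_xh, W_hh, W_hy, start, steps):
    outputs, x = [], start
    for _ in range(steps):             # input at t is the model's own output at t-1
        h = rnn_step(x, h, W_xh, W_hh)
        x = h @ W_hy                   # projected output becomes the next input
        outputs.append(x)
    return outputs

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 3)) * 0.1
outs = rnn_decoder(np.zeros(3), W, W, W, start=np.zeros(3), steps=4)
```

At training time the decoder is usually fed the ground-truth previous token instead (teacher forcing), which makes it look even more like a plain RNN.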
0
votes
2 answers

Creating ensemble for sequence to sequence (seq2seq) tensorflow models?

I have trained a TensorFlow seq2seq model for 30 epochs and saved a checkpoint after each epoch. What I want to do now is combine the best X of those checkpoints (based on results on a development set). Specifically, I'm looking for a way that…
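One common approach to this is checkpoint averaging: element-wise averaging the variable values of the best X checkpoints into a single model, so inference still runs one model. A minimal sketch on plain `{name: array}` dicts; loading and saving real TensorFlow checkpoints (e.g. with `tf.train.Saver`) is left out:

```python
import numpy as np

def average_checkpoints(checkpoints):
    """Average a list of {variable_name: np.ndarray} dicts element-wise.

    Assumes every checkpoint contains the same variable names and shapes,
    which holds when they come from the same training run.
    """
    names = checkpoints[0].keys()
    return {
        name: np.mean([ckpt[name] for ckpt in checkpoints], axis=0)
        for name in names
    }

ckpts = [{"w": np.array([1.0, 2.0])}, {"w": np.array([3.0, 4.0])}]
avg = average_checkpoints(ckpts)   # {"w": array([2., 3.])}
```

The alternative, a true ensemble, averages the models' output probabilities at each decoding step, which costs X forward passes per step.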
0
votes
1 answer

Extracting attention matrix with TensorFlow's seq2seq example code during decoding

It seems like the attention() method that computes the attention mask in seq2seq_model.py, in TensorFlow's sequence-to-sequence example code, is not called during decoding. Does anyone know how to resolve this? A similar…
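A generic sketch of why the attention matrix is easy to lose: attention helpers often return only the context vector, so the usual fix is to return the softmax weights alongside it for logging. This mirrors the idea only, not the actual `tf.contrib` seq2seq code:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, encoder_states):
    """Dot-product attention that exposes its weights.

    query: (d,) decoder state; encoder_states: (T, d).
    Returns (context, weights) so the caller can record the attention
    matrix instead of discarding it inside the decoding loop.
    """
    scores = encoder_states @ query          # (T,) alignment scores
    weights = softmax(scores)                # the "attention mask" to record
    context = weights @ encoder_states       # (d,) weighted sum of states
    return context, weights

enc = np.eye(3)                              # toy encoder states
context, weights = attend(np.array([5.0, 0.0, 0.0]), enc)
```

Stacking the per-step `weights` vectors over the output sequence gives the full (output length x input length) attention matrix.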