
In TensorFlow, all of the encoder-decoder functions (https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/ops/seq2seq.py) use a unidirectional RNN in the encoder.

How can we implement a bidirectional encoder (as in http://arxiv.org/abs/1409.0473 or similar systems) in TensorFlow, so that the forward and backward sequences are learned simultaneously in an encoder-decoder setting?

user3480922

1 Answer


It is actually quite simple. Encode the sequence normally from the first time step to the last with one RNN, collecting the per-step hidden states and the final output; then reverse the sequence and run a second RNN over it, which gives you the same number of states plus a second final output. Concatenating the forward and backward states that belong to the same time step gives you the combined (bidirectional) encoder states, along with the two final outputs. A sketch of this manual approach follows below.
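A minimal sketch of the manual forward/backward approach, assuming the TensorFlow 1.x graph-mode API; the placeholder shapes, GRU cells, and size constants are illustrative, not from the original answer. For padded variable-length batches you would want tf.reverse_sequence with the true lengths instead of tf.reverse.

    import tensorflow as tf  # assumes TensorFlow 1.x

    batch_size, max_time, input_dim, num_units = 32, 20, 64, 128  # hypothetical sizes

    inputs = tf.placeholder(tf.float32, [batch_size, max_time, input_dim])

    # Forward pass: encode the sequence from the first to the last time step.
    with tf.variable_scope("fw"):
        fw_cell = tf.nn.rnn_cell.GRUCell(num_units)
        fw_outputs, fw_state = tf.nn.dynamic_rnn(fw_cell, inputs, dtype=tf.float32)

    # Backward pass: reverse the time axis, encode, then reverse the outputs back
    # so each time step lines up with its forward counterpart.
    with tf.variable_scope("bw"):
        bw_cell = tf.nn.rnn_cell.GRUCell(num_units)
        reversed_inputs = tf.reverse(inputs, axis=[1])
        bw_outputs_rev, bw_state = tf.nn.dynamic_rnn(bw_cell, reversed_inputs, dtype=tf.float32)
        bw_outputs = tf.reverse(bw_outputs_rev, axis=[1])

    # Concatenate the per-time-step states from both directions.
    encoder_states = tf.concat([fw_outputs, bw_outputs], axis=-1)  # [batch, time, 2 * num_units]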

You can also use the TensorFlow API directly: tf.nn.bidirectional_dynamic_rnn does the forward and backward passes for you. The same idea can be implemented in Theano as well.
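A hedged usage sketch of tf.nn.bidirectional_dynamic_rnn (TensorFlow 1.x); the LSTM cells, placeholder shapes, and the way the final states are merged into a decoder initial state are illustrative assumptions, not part of the original answer. Note that a decoder consuming this initial state would need a cell of size 2 * num_units.

    import tensorflow as tf  # assumes TensorFlow 1.x

    batch_size, max_time, input_dim, num_units = 32, 20, 64, 128  # hypothetical sizes

    inputs = tf.placeholder(tf.float32, [batch_size, max_time, input_dim])
    seq_len = tf.placeholder(tf.int32, [batch_size])  # true lengths, so padding is ignored

    fw_cell = tf.nn.rnn_cell.LSTMCell(num_units)
    bw_cell = tf.nn.rnn_cell.LSTMCell(num_units)

    # Returns per-direction outputs and per-direction final states.
    (fw_out, bw_out), (fw_state, bw_state) = tf.nn.bidirectional_dynamic_rnn(
        fw_cell, bw_cell, inputs, sequence_length=seq_len, dtype=tf.float32)

    # Encoder memory for attention: concatenate both directions at each time step.
    encoder_outputs = tf.concat([fw_out, bw_out], axis=-1)  # [batch, time, 2 * num_units]

    # One possible decoder initial state: concatenate the final LSTM states.
    decoder_initial_state = tf.nn.rnn_cell.LSTMStateTuple(
        c=tf.concat([fw_state.c, bw_state.c], axis=-1),
        h=tf.concat([fw_state.h, bw_state.h], axis=-1))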

Lerner Zhang