
How exactly is tf.contrib.rnn.AttentionCellWrapper used? Can someone give a short example? Specifically, I have only managed to construct the following:

    # attention window of length 10, attention vector of size 50
    fwd_cell = tf.contrib.rnn.AttentionCellWrapper(tf.contrib.rnn.BasicLSTMCell(hidden_size), attn_length=10, attn_size=50, state_is_tuple=True)
    bwd_cell = tf.contrib.rnn.AttentionCellWrapper(tf.contrib.rnn.BasicLSTMCell(hidden_size), attn_length=10, attn_size=50, state_is_tuple=True)

but in Bahdanau et al. (2015) the attention operates over the outputs of the entire bidirectional RNN, and I don't see how to wire that up in TensorFlow.
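Below is a minimal sketch of how the wrapped cells could be fed into tf.nn.bidirectional_dynamic_rnn (hidden_size, input_dim, and the placeholder shapes are made up for illustration). As far as I can tell, each AttentionCellWrapper here only attends over a window of its own direction's states, which is not the Bahdanau-style attention over the full encoder output:

    import tensorflow as tf

    hidden_size = 128   # made-up size, just for illustration
    input_dim = 300     # made-up feature dimension

    def wrapped_cell():
        # LSTM cell that attends over a sliding window (length 10) of its own states
        return tf.contrib.rnn.AttentionCellWrapper(
            tf.contrib.rnn.BasicLSTMCell(hidden_size),
            attn_length=10, attn_size=50, state_is_tuple=True)

    inputs = tf.placeholder(tf.float32, [None, None, input_dim])  # [batch, time, features]
    seq_len = tf.placeholder(tf.int32, [None])

    (out_fw, out_bw), _ = tf.nn.bidirectional_dynamic_rnn(
        wrapped_cell(), wrapped_cell(), inputs,
        sequence_length=seq_len, dtype=tf.float32)

    # per-timestep outputs of both directions, concatenated
    encoder_outputs = tf.concat([out_fw, out_bw], axis=-1)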

user3373273

0 Answers