
How can I decode a one-to-many LSTM architecture (the reverse of the many-to-one example at https://discuss.pytorch.org/t/example-of-many-to-one-lstm/1728) in TensorFlow? Can I use tf.contrib.seq2seq.dynamic_decode for this?

For training I used tf.nn.dynamic_rnn:

cells = []
for i in range(4):
    # one LSTM cell per layer (a GRUCell would work the same way);
    # num_units is the hidden size
    cell = tf.nn.rnn_cell.LSTMCell(num_units)
    cells.append(cell)
cell = tf.nn.rnn_cell.MultiRNNCell(cells, state_is_tuple=True)
states_series, current_state = tf.nn.dynamic_rnn(cell, inputs, dtype=inputs.dtype)

How do I decode with this cell at test time for a one-to-many sequence problem?

My dataset is not about words. For example, given the single input 8, I want to predict the sequence [4, 1, 2, 3].
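To make the shape of the problem concrete, here is a plain-NumPy sketch (not TensorFlow; `step` and `toy_step` are hypothetical stand-ins for the trained cell) of the one-to-many unrolling I have in mind: a single input seeds the loop, and each step's output is fed back as the next input.

```python
import numpy as np

def decode_one_to_many(x0, step, num_steps):
    """Unroll a recurrent cell for num_steps outputs from one input.

    step(inp, state) -> (out, new_state) stands in for the trained
    LSTM/GRU cell; each output is fed back as the next input.
    """
    state = np.zeros_like(x0)
    inp = x0
    outputs = []
    for _ in range(num_steps):
        out, state = step(inp, state)
        outputs.append(out)
        inp = out  # feed the prediction back in as the next input
    return outputs

# Toy cell: output = input + state, state accumulates the inputs.
def toy_step(inp, state):
    return inp + state, state + inp

print(decode_one_to_many(np.array([1.0]), toy_step, 4))
# → [array([1.]), array([2.]), array([4.]), array([8.])]
```

Is this feedback loop what tf.contrib.seq2seq.dynamic_decode implements, or do I need tf.nn.raw_rnn / a manual loop over the cell?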

RamRasia
