
Consider the following BiLSTM diagram for timeseries prediction:

[Figure: BiLSTM architecture diagram for time-series prediction]

I believe this can easily be applied to the training dataset, but I do not think it is possible for the test dataset: the backward LSTM layer reads future values and passes its hidden state back to earlier time steps, so at prediction time the model would need values that are not yet available. So why do so many research studies use BiLSTM for time-series prediction?
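For context, here is a minimal sketch (not taken from the question or from any particular paper) of the way a BiLSTM is usually applied to forecasting: the model only ever sees a fixed window of past observations, and the backward LSTM runs over that same window, so the "future" it reads still lies before the prediction target. The window length, layer sizes, and the Keras API are illustrative assumptions.

    import numpy as np
    from tensorflow.keras import layers, models

    WINDOW = 30  # number of past time steps fed to the model (illustrative choice)

    def make_windows(series, window=WINDOW):
        # Turn a 1-D series into (samples, window, 1) inputs and next-step targets.
        X, y = [], []
        for t in range(len(series) - window):
            X.append(series[t:t + window])   # past values only
            y.append(series[t + window])     # the single value to predict
        return np.array(X)[..., None], np.array(y)[:, None]

    model = models.Sequential([
        layers.Input(shape=(WINDOW, 1)),
        # The backward direction reads the window newest-to-oldest, but every
        # element of the window still lies before the prediction target.
        layers.Bidirectional(layers.LSTM(32)),
        layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

    series = np.sin(np.linspace(0.0, 20.0, 500))  # toy data
    X, y = make_windows(series)
    model.fit(X, y, epochs=2, batch_size=32, verbose=0)

At test time the same windowing applies: to predict the value at time t, only the window ending at t-1 is fed in, so no future values are required.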

learning-man
  • If you set your model up as an encoder-decoder architecture for time-series prediction, the encoder part can be a BiLSTM. However, the decoder has to be unidirectional (see the sketch after these comments). – Christian Jun 11 '23 at 07:07
  • I am not talking about a seq2seq or encoder-decoder network. I am trying to replicate the results of this paper: https://www.sciencedirect.com/science/article/pii/S0893608022003938 and indeed I can. But on the out-of-sample dataset I only get good results if I use the entire dataset, which means I need future values to predict the present. Do you get my point? – learning-man Jun 11 '23 at 07:22
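Below is a rough sketch of the kind of architecture the first comment describes: a BiLSTM encoder over the observed input window followed by a unidirectional decoder that generates the forecast step by step. The layer sizes, the RepeatVector bridge, and the Keras API are assumptions for illustration, not taken from the cited paper.

    from tensorflow.keras import layers, models

    WINDOW, HORIZON = 30, 5  # past steps in, future steps out (illustrative)

    model = models.Sequential([
        layers.Input(shape=(WINDOW, 1)),
        # Encoder: bidirectional is fine here, the whole input window is observed.
        layers.Bidirectional(layers.LSTM(32)),
        # Repeat the encoded summary once per forecast step.
        layers.RepeatVector(HORIZON),
        # Decoder: unidirectional, since the future is generated left to right.
        layers.LSTM(32, return_sequences=True),
        layers.TimeDistributed(layers.Dense(1)),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.summary()

The bidirectionality is confined to the encoder, which only sees already-observed values, while the forecast itself is produced causally by the unidirectional decoder.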

0 Answers