I am trying to predict time series data using an encoder/decoder model with LSTM layers. So far, I am using 20 points of past data to predict 20 future points. For each sample, the 1st value in the predicted sequence is very close to the true 1st value of the target sequence (plot: predicting 1 step into the future).
However, the 2nd value in each predicted sequence (2 timesteps into the future) looks "shifted" relative to the true values (plot: predicting 2 steps into the future).
This "shifted" nature is true for all values of the predicted sequences, with the shifts increasing as I go farther into the predicted sequence. Here is the code for my model:
from keras.models import Sequential
from keras.layers import LSTM, RepeatVector, TimeDistributed, Dense

model = Sequential()
# Encoder: compress the 20-step input window into one state vector
model.add(LSTM(128, input_shape=(20, 1)))
# Repeat the encoded vector once per output timestep
model.add(RepeatVector(20))
# Decoder: unroll into a 20-step output sequence
model.add(LSTM(128, return_sequences=True))
model.add(TimeDistributed(Dense(1)))
model.compile(loss='mse', optimizer='adam')
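In case the data preparation matters, this is roughly how I build the training windows (a simplified sketch; make_windows and the names in it are illustrative, not my exact code):

import numpy as np

def make_windows(series, n_in=20, n_out=20):
    # Slide over a 1-D series, emitting (past, future) window pairs
    X, y = [], []
    for i in range(len(series) - n_in - n_out + 1):
        X.append(series[i : i + n_in])
        y.append(series[i + n_in : i + n_in + n_out])
    # Reshape to (samples, timesteps, features) as the LSTM expects
    return np.array(X)[..., None], np.array(y)[..., None]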
Is it something with RepeatVector? Any help would be appreciated.