I am using a recurrent neural network with LSTM for time series prediction. The inputs are sequences, and the target is the next value after each input sequence. I have hundreds of inputs, one hidden layer of the same size, and a single unit in the output layer. No matter how long I train, the prediction is always much higher than the actual value (this also happens with other activation functions), shown below by the green and blue curves, respectively. What is the solution?
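For reference, a minimal sketch of the setup described above, assuming a Keras LSTM regressor; the window length, layer width, loss, and optimizer are assumptions rather than details given in the question:

```python
from tensorflow import keras

# Assumed shapes: windows of 200 past values, each predicting the next value.
# The question only says "hundreds of inputs" and a hidden layer of equal size.
window = 200

model = keras.Sequential([
    keras.Input(shape=(window, 1)),
    # One recurrent hidden layer, same width as the input window.
    keras.layers.LSTM(window),
    # Single output unit: the next value of the series (regression).
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# x has shape (num_samples, window, 1) and y has shape (num_samples, 1),
# e.g. sliding windows over the series:
# model.fit(x, y, epochs=50, batch_size=32)
```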
1 Answer
It seems that LSTM is not well suited to this kind of pattern; a softmax activation works well instead.
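The answer does not say exactly what was changed; one possible reading, purely as an illustration, is replacing the LSTM layer with a plain recurrent layer whose hidden activation is softmax (the layer sizes below are assumptions carried over from the sketch in the question, not the answerer's code):

```python
from tensorflow import keras

window = 200  # assumed, matching the sketch in the question

model = keras.Sequential([
    keras.Input(shape=(window, 1)),
    # Plain recurrent layer with softmax as the hidden activation,
    # instead of an LSTM layer with its default tanh activation.
    keras.layers.SimpleRNN(window, activation="softmax"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
```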

mikael