14

I'm training a model to predict stock prices; the input data is the close price. I use 45 days of data to predict the 46th day's close price, with an economic indicator as a second feature. Here is the model:

model = Sequential()
model.add(LSTM(512, input_shape=(45, 2), return_sequences=True))
model.add(LSTM(512, return_sequences=True))
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')
history = model.fit(X_train, y_train, batch_size=batchSize, epochs=epochs, shuffle=False)

When I run this I get the following error:

ValueError: Error when checking target: expected dense_1 to have 3 dimensions, but got array with shape (118, 1)

However, when I print the shapes of the data, they are:

X_train:(118, 45, 2)
y_train:(118, 1)

I have no idea why the model is expecting a 3-dimensional output when y_train is (118, 1). Where am I wrong and what should I do?
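Here is a minimal, self-contained version with random stand-in data (only the shapes match my real data; the values are placeholders) that reproduces the error for me:

import numpy as np
from keras.models import Sequential  # or: from tensorflow.keras.models import Sequential
from keras.layers import LSTM, Dense

# random stand-ins with the same shapes as my real data
X_train = np.random.rand(118, 45, 2)
y_train = np.random.rand(118, 1)

model = Sequential()
model.add(LSTM(512, input_shape=(45, 2), return_sequences=True))
model.add(LSTM(512, return_sequences=True))
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')
model.fit(X_train, y_train, batch_size=16, epochs=1, shuffle=False)
# -> ValueError: Error when checking target: expected dense_1 to have 3 dimensions,
#    but got array with shape (118, 1)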

Chris Wong

3 Answers

16

Your second LSTM layer also returns sequences, and a Dense layer applied to a sequence is applied to every timestep, so it also produces a sequence:

# (bs, 45, 2)
model.add(LSTM(512, input_shape=(45, 2), return_sequences=True))
# (bs, 45, 512)
model.add(LSTM(512, return_sequences=True))
# (bs, 45, 512)
model.add(Dense(1))
# (bs, 45, 1)

So your output has shape (bs, 45, 1), not (bs, 1). To solve the problem, set return_sequences=False in your second LSTM layer, so it returns only the output of the last timestep:

# (bs, 45, 2)
model.add(LSTM(512, input_shape=(45, 2), return_sequences=True))
# (bs, 45, 512)
model.add(LSTM(512, return_sequences=False))  # SET HERE
# (bs, 512)
model.add(Dense(1))
# (bs, 1)

And you'll get the desired output shape. Note that bs is the batch size.
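Putting it together, here is a minimal end-to-end sketch (random data with your shapes; the batch size and epoch count are arbitrary) that trains without the shape error:

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# random data with the same shapes as in the question
X_train = np.random.rand(118, 45, 2)   # (samples, timesteps, features)
y_train = np.random.rand(118, 1)       # one target value per sample

model = Sequential()
model.add(LSTM(512, input_shape=(45, 2), return_sequences=True))  # (bs, 45, 512)
model.add(LSTM(512, return_sequences=False))                      # (bs, 512)
model.add(Dense(1))                                               # (bs, 1)
model.compile(loss='mse', optimizer='adam')
model.summary()  # last layer output: (None, 1)

model.fit(X_train, y_train, batch_size=16, epochs=2, shuffle=False)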

nuric
  • After reshaping 131 samples to [131,32,512] for training and the corresponding labels to [131,32,8], I receive the following error: ValueError: Error when checking target: expected dense_8 to have 2 dimensions, but got array with shape (131, 32, 8) – Nhqazi Jun 09 '19 at 23:46
6

I had a similar problem and found the answer here:

I added model.add(Flatten()) before the last Dense layer.
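In terms of the model from the question, a sketch of this alternative (keeping return_sequences=True on both LSTM layers) could look like the following; Flatten collapses the (45, 512) sequence output into one long vector, so the final Dense layer again produces one value per sample:

from keras.models import Sequential
from keras.layers import LSTM, Dense, Flatten

model = Sequential()
model.add(LSTM(512, input_shape=(45, 2), return_sequences=True))  # (bs, 45, 512)
model.add(LSTM(512, return_sequences=True))                       # (bs, 45, 512)
model.add(Flatten())                                              # (bs, 45 * 512)
model.add(Dense(1))                                               # (bs, 1)
model.compile(loss='mse', optimizer='adam')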

Helen
1

The shape of the training data should follow the format (num_samples, num_features, num_signals/num_vectors). Following this convention, pass the training data as an array reshaped to that layout, and make sure to add the validation_data argument to the model.fit call. An example of this is:

model.fit(x=X_input, y=y_input, batch_size=2, validation_data=(X_output, y_output), epochs=10)

where X_input and y_input are training data arrays with shapes (126, 711, 1) and (126, 1) respectively, and X_output and y_output are validation/test data arrays with shapes (53, 711, 1) and (53, 1) respectively.

If you run into a shuffling error, try setting the shuffle argument to True after following the methodology above.
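As a concrete sketch (the array names, the small LSTM layer, and all sizes below are only illustrative), reshaping the data into that 3-D layout and wiring up validation_data could look like this:

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# illustrative sizes: 126 training and 53 validation samples,
# each a single signal of length 711
X_input = np.random.rand(126, 711).reshape(126, 711, 1)
y_input = np.random.rand(126, 1)
X_output = np.random.rand(53, 711).reshape(53, 711, 1)
y_output = np.random.rand(53, 1)

model = Sequential()
model.add(LSTM(32, input_shape=(711, 1)))  # small layer just for the demo
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')

model.fit(x=X_input, y=y_input, batch_size=2, validation_data=(X_output, y_output), epochs=10)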

SHAGUN SHARMA
  • This solution doesn't seem to apply to the original problem (it already has the input in the correct convention). – Helen Nov 29 '19 at 18:14
  • The testing/validation data must also be in the same format described above to achieve a correct mapping. Adding a flattening layer before the dense layer might also help attain the required input dimensions, but the test set should also be moulded into the required format specified in the answer. – SHAGUN SHARMA Nov 30 '19 at 21:11