I'm trying to create a Keras LSTM to predict time series. My x_train is shaped (3000, 15, 10) (examples, timesteps, features), my y_train (3000, 15, 1), and I'm trying to build a many-to-many model (10 input features per timestep produce 1 output per timestep).

The code I'm using is this:

from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense

model = Sequential()

model.add(LSTM(
    10,
    input_shape=(15, 10),
    return_sequences=True))
model.add(Dropout(0.2))

model.add(LSTM(
    100,
    return_sequences=True))
model.add(Dropout(0.2))
model.add(Dense(1, activation='linear'))
model.compile(loss="mse", optimizer="rmsprop")
model.fit(
        X_train, y_train,
        batch_size=512, nb_epoch=1, validation_split=0.05)

However, I can't fit the model when using:

model.add(Dense(1, activation='linear'))
>> Error when checking model target: expected dense_1 to have 2 dimensions, but got array with shape (3000, 15, 1)

or when formatting it this way:

model.add(Dense(1))
model.add(Activation("linear"))
>> Error when checking model target: expected activation_1 to have 2 dimensions, but got array with shape (3000, 15, 1)

I already tried flattening the model before adding the dense layer (model.add(Flatten())), but that just gives me ValueError: Input 0 is incompatible with layer flatten_1: expected ndim >= 3, found ndim=2. This confuses me, because I think my data actually is 3-dimensional, isn't it?

The code originated from https://github.com/Vict0rSch/deep_learning/tree/master/keras/recurrent

Milo Lu
sbz
  • As it's *many-to-many*, why have you set `return_sequences=False`? Try to set it to `True` in second `LSTM`. – Marcin Możejko Sep 13 '17 at 21:37
  • Hi Marcin, I changed it to `True` but still getting the same error. – sbz Sep 13 '17 at 21:45
  • Could you update your code snippet? And which version of `keras` do you use? Are you 100% sure that it's the same message? – Marcin Możejko Sep 13 '17 at 21:46
  • Updated the snippet and I'm using 1.2.2 (with Python 2.7.5). The error is `Error when checking model target: expected activation_1 to have 2 dimensions, but got array with shape (3000, 15, 1)` and I'm using the `model.add(Dense(1)) model.add(Activation("linear"))` formatting. – sbz Sep 13 '17 at 21:50
  • 1
    Now I see. You are using relatively old version of `keras`. Try: `model.add(TimeDistributed(Dense(1)))`. – Marcin Możejko Sep 13 '17 at 22:12
  • Oh I see. I'll update it right now but using `TimeDistributed` it gives me the dimensionality error again... `ValueError: Input 0 is incompatible with layer timedistributeddense_1: expected ndim=3, found ndim=2` – sbz Sep 13 '17 at 22:32
  • It's extremely weird. Could you update a code snippet? – Marcin Możejko Sep 13 '17 at 22:34
  • Well this is embarrassing... After changing to python 3.4, current keras and updating my code for py3 it works! Anyway, thanks for all the help! – sbz Sep 14 '17 at 00:11
  • So may I formulate an answer? – Marcin Możejko Sep 14 '17 at 05:11

2 Answers

In case of Keras < 2.0: you need to wrap the Dense layer in a TimeDistributed wrapper in order to apply it element-wise to every timestep of the sequence.

In case of Keras >= 2.0: a Dense layer is applied element-wise (to the last axis) by default, so no wrapper is needed.
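One way to see what "element-wise" means here is to reproduce a shared Dense(1) with plain NumPy: a single weight matrix is applied at every timestep, so the time axis is preserved. (The arrays and weights below are random stand-ins, not trained values.)

```python
import numpy as np

# Stand-in for the second LSTM's output: (batch, timesteps, units)
h = np.random.rand(3000, 15, 100)

# A single Dense(1) has one weight matrix (100, 1) and one bias,
# shared across all 15 timesteps
W = np.random.rand(100, 1)
b = np.zeros(1)

y_pred = h @ W + b  # broadcasting applies the same W at every timestep
print(y_pred.shape)  # (3000, 15, 1) -- matches the y_train target
```

Without this per-timestep application (or with `return_sequences=False` on the last LSTM), the output would be 2-D and a (3000, 15, 1) target could not fit, which is exactly what the error message was complaining about.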

Marcin Możejko

Since you updated your Keras version and your error messages changed, here is what works on my machine (Keras 2.0.x).

This works:

from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense

model = Sequential()

model.add(LSTM(10, input_shape=(15, 10), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(100, return_sequences=True))
model.add(Dropout(0.2))
model.add(Dense(1, activation='linear'))

This also works:

model = Sequential()

model.add(LSTM(10, input_shape=(15, 10), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(100, return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(1, return_sequences=True, activation='linear'))

Testing with:

import numpy as np

x = np.ones((3000, 15, 10))
y = np.ones((3000, 15, 1))

Compiling and training with:

model.compile(optimizer='adam', loss='mse')
model.fit(x, y, epochs=4, verbose=2)
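Both variants keep the time axis all the way through, which is why the fit succeeds. As a Keras-free sanity check, the shape at each stage can be traced with NumPy alone (the arrays below are placeholders standing in for the layer outputs, not real activations):

```python
import numpy as np

x = np.ones((3000, 15, 10))
y = np.ones((3000, 15, 1))

# With return_sequences=True everywhere, only the feature axis changes:
after_lstm_10 = np.empty(x.shape[:2] + (10,))    # LSTM(10)  -> (3000, 15, 10)
after_lstm_100 = np.empty(x.shape[:2] + (100,))  # LSTM(100) -> (3000, 15, 100)
output = np.empty(x.shape[:2] + (1,))            # Dense(1) or LSTM(1) -> (3000, 15, 1)

print(output.shape == y.shape)  # True -- prediction and target shapes agree
```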
Daniel Möller
  • Hi Daniel, unfortunately this gives me following error: `an input and an output are associated with the same recurrent state and should have the same type but have type 'TensorType(float32, col)' and 'TensorType(float32, matrix)' respectively.` However, even though `model.add(Dense(1, activation="linear"))` compiles and outputs the expected shape, the results seem wrong - I am only getting values around -1 to 2. It should output values between ~ -5 and ~ 20 - 25 – sbz Sep 15 '17 at 17:08
  • You're using an old version of keras, right? This answer will probably fit only keras 2. – Daniel Möller Sep 15 '17 at 17:10
  • 1
    Are you using any `TimeDistributed` before this LSTM? (LSTM, at least in keras 2, is not supposed to use `TimeDistributed`, it's already a temporal layer). – Daniel Möller Sep 15 '17 at 17:12
  • I am now using the current version and `Dense`. The code is just like in the original question. – sbz Sep 15 '17 at 17:14
  • Ok.... I'm not sure what is the cause of that error. I've never seen it. I've posted on my answer the two options that I tested and work ok on my machine. – Daniel Möller Sep 15 '17 at 17:30
  • Your first code does work for me as well but I am still getting seemingly wrong results like in my first comment to this answer. Your method 2 outputs following error though: `TypeError: The broadcast pattern of the output of scan (TensorType(float32, 3D)) is inconsistent with the one provided in output_info (TensorType(float32, col)). The output on axis 1 is False, but it is True on axis 2 in output_info. This can happen if one of the dimension is fixed to 1 in the input, while it is still variable in the output, or vice-verca.[...]` – sbz Sep 15 '17 at 17:45
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/154555/discussion-between-sbz-and-daniel-moller). – sbz Sep 15 '17 at 18:11