
I'm building an RNN using an LSTM layer. My X matrix has shape (1824, 7), while Y has shape (1824, 1). This is my model:

  from tensorflow.keras.models import Sequential
  from tensorflow.keras.layers import LSTM, LeakyReLU, Dropout, Dense
  from tensorflow.keras.optimizers import Adam

  num_units = 64
  learning_rate = 0.0001
  activation_function = 'sigmoid'
  adam = Adam(lr=learning_rate)
  loss_function = 'mse'
  batch_size = 5
  num_epochs = 50

  # Initialize the RNN
  model = Sequential()
  model.add(LSTM(units = num_units, activation=activation_function, input_shape=(1824, 7, )))
  model.add(LeakyReLU(alpha=0.5))
  model.add(Dropout(0.1))
  model.add(Dense(units = 1))

  # Compiling the RNN
  model.compile(optimizer=adam, loss=loss_function, metrics=['accuracy'])

  history = model.fit(
        X,
        y,
        validation_split=0.1,
        batch_size=batch_size,
        epochs=num_epochs,
        shuffle=False
  )

I know the error is in the input_shape parameter. When I try to fit the model I get this error:

ValueError: Input 0 of layer sequential is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: [None, 7]

I have seen similar questions and tried to apply some of those changes, such as:

  input_dim = X.shape
  input_dim=(7,)
  input_dim=(1824, 7, 1)

But in every case I got the same kind of error. How can I fix it?

  • `input_dim=(7,)` should be a right answer. Please show the error message for this case – Andrey Feb 18 '21 at 11:26
  • @Andrey LSTM layers need 3D input. That won't work – Nicolas Gervais Feb 18 '21 at 12:21
  • Does this answer your question? [ValueError : Input 0 of layer lstm is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: \[None, 18\]](https://stackoverflow.com/questions/58119320/valueerror-input-0-of-layer-lstm-is-incompatible-with-the-layer-expected-ndim) – Nicolas Gervais Feb 18 '21 at 12:21

1 Answer


As commented by @Nicolas Gervais,

the TensorFlow Keras LSTM layer expects `inputs` to be a 3D tensor with shape `[batch, timesteps, feature]`.

Working sample code

import tensorflow as tf

# Random batch of 32 sequences, each with 10 timesteps and 8 features
inputs = tf.random.normal([32, 10, 8])
print(inputs.shape)  # (32, 10, 8)

# An LSTM with 4 units returns one 4-dimensional vector per sequence
lstm = tf.keras.layers.LSTM(4)
output = lstm(inputs)
print(output.shape)  # (32, 4)

Output

(32, 10, 8)
(32, 4)
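
Applied to the shapes in the question, a minimal sketch could look like the following. It assumes each of the 7 columns of X is treated as one timestep with a single feature (the (1824, 7, 1) reshape mentioned in the question) and uses dummy arrays in place of the real X and y:

import numpy as np
import tensorflow as tf

# Dummy data standing in for the question's X (1824, 7) and y (1824, 1)
X = np.random.rand(1824, 7)
y = np.random.rand(1824, 1)

# LSTM needs [batch, timesteps, feature]: here each of the 7 columns
# becomes one timestep with a single feature -> (1824, 7, 1)
X = X.reshape((X.shape[0], X.shape[1], 1))

# input_shape excludes the batch dimension, so it is (7, 1) here
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, activation='sigmoid', input_shape=(7, 1)),
    tf.keras.layers.LeakyReLU(alpha=0.5),
    tf.keras.layers.Dropout(0.1),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.0001), loss='mse')
model.fit(X, y, validation_split=0.1, batch_size=5, epochs=1, shuffle=False)

Whether the 7 columns should be the timestep axis or the feature axis depends on what they represent; either way, the array passed to fit has to be 3D.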