
I'm having some difficulty understanding the flow of data between cells in a stacked LSTM network. I have this network:

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout, Bidirectional

def make_model(x_train):
    # Build a stacked (deep) LSTM model.
    model = Sequential()
    model.add(Bidirectional(LSTM(units=30, return_sequences=True),
                            input_shape=(x_train.shape[1], 1)))
    model.add(Dropout(0.2))
    model.add(LSTM(units=30, return_sequences=True))
    model.add(Dropout(0.2))
    model.add(LSTM(units=30, return_sequences=True))
    model.add(Dropout(0.2))
    model.add(LSTM(units=30))
    model.add(Dropout(0.2))
    # n_future is the forecast horizon, defined elsewhere in my script.
    model.add(Dense(units=n_future, activation='linear'))
    model.compile(optimizer='adam', loss='mean_squared_error', metrics=['acc'])
    return model

1) Does the output of the 1st LSTM layer go to the 2nd LSTM layer?

2) I have read that in LSTMs, we have the previous hidden state and the current input as inputs. If the input to the 1st LSTM layer (input_shape) doesn't go to the 2nd LSTM layer, what is the input of the 2nd LSTM layer? Only the hidden state? Which hidden state?


1 Answer


What you are creating is a network of stacked layers, so yes, the output of the first LSTM layer goes to the second, and so on. You can control what an LSTM layer passes on: return_sequences makes it emit the hidden state at every timestep (a sequence the next LSTM layer can consume), and return_state additionally exposes the final hidden and cell states.

Here is an article that explains this:

https://www.dlology.com/blog/how-to-use-return_state-or-return_sequences-in-keras/
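To make the shapes concrete, here is a minimal sketch (assuming tf.keras, with made-up sizes) of what return_sequences and return_state control. With return_sequences=True a layer outputs the hidden state at every timestep, which is exactly the input the next LSTM layer consumes; without it, only the last hidden state comes out.

```python
import numpy as np
from tensorflow.keras.layers import LSTM, Input
from tensorflow.keras.models import Model

x = np.random.rand(4, 10, 1).astype("float32")  # (batch, timesteps, features)

inp = Input(shape=(10, 1))
# Full sequence of hidden states: (None, 10, 30) -> feeds the next LSTM.
seq = LSTM(30, return_sequences=True)(inp)
# Only the last hidden state: (None, 30).
last = LSTM(30)(seq)
# return_state additionally exposes the final hidden and cell states.
_, h, c = LSTM(30, return_state=True)(seq)  # h, c: (None, 30)

model = Model(inp, last)
print(model.predict(x, verbose=0).shape)  # (4, 30)
```

So the 2nd LSTM layer's input at each timestep is the 1st layer's hidden state for that timestep, combined with the 2nd layer's own previous hidden state.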

  • how can i get the weights from all LSTM layers? I used this one: https://stackoverflow.com/questions/42861460/how-to-interpret-weights-in-a-lstm-layer-in-keras but it only works for index 0. I want the weights from all 4 LSTMs and from the Dense layer. – lasdaou Sep 23 '20 at 09:43
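Regarding the comment above: a hedged sketch (assuming tf.keras, with a small stand-in model and a hypothetical n_future value) of reading the weights of every layer, not just index 0, by iterating over model.layers and calling get_weights() on each.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout, Bidirectional

n_future = 1  # hypothetical value, stands in for the question's n_future

model = Sequential([
    Bidirectional(LSTM(30, return_sequences=True), input_shape=(10, 1)),
    Dropout(0.2),
    LSTM(30, return_sequences=True),
    Dropout(0.2),
    LSTM(30),
    Dense(n_future),
])

# Each LSTM layer returns [kernel, recurrent_kernel, bias];
# Bidirectional returns those for both directions; Dense returns
# [kernel, bias]; Dropout has no weights.
for i, layer in enumerate(model.layers):
    shapes = [w.shape for w in layer.get_weights()]
    print(i, layer.name, shapes)
```

The same loop works on a trained model, since get_weights() returns the current values of each layer's weight arrays.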