Questions tagged [lstm-stateful]

This tag refers to stateful long short-term memory (LSTM) cells in a neural network, i.e. cells that carry their state over to the next training batch.

A stateful LSTM uses the final state of the sample at index i in one training batch as the initial state for the sample at index i in the following batch.
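This behaviour can be sketched in Keras (layer sizes and data here are illustrative, not from any particular question):

```python
import numpy as np
import tensorflow as tf

# Minimal sketch of a stateful LSTM: the batch size must be fixed up front.
batch_size, timesteps, features = 4, 10, 3
model = tf.keras.Sequential([
    tf.keras.Input(batch_shape=(batch_size, timesteps, features)),
    tf.keras.layers.LSTM(8, stateful=True),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(2 * batch_size, timesteps, features).astype("float32")
y = np.random.rand(2 * batch_size, 1).astype("float32")

# shuffle=False keeps sample i of the second batch as the continuation of
# sample i of the first batch, which is what statefulness assumes.
model.fit(x, y, batch_size=batch_size, epochs=1, shuffle=False, verbose=0)

# Clear the carried state between epochs or between independent sequences.
# Keras 2 exposes reset_states(); Keras 3 renamed it to reset_state().
lstm_layer = model.layers[0]
reset = getattr(lstm_layer, "reset_states", None) or getattr(lstm_layer, "reset_state")
reset()
```

Because the state is indexed by batch position, the ordering of samples across batches matters; this is why `shuffle=False` is essential when fitting a stateful model.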

63 questions
1
vote
1 answer

Training many-to-many stateful LSTM with and without final dense layer

I am trying to train a recurrent model in Keras containing an LSTM for regression purposes. I would like to use the model online and, as far as I understand, I need to train a stateful LSTM. Since the model has to output a sequence of values, I…
Developer
  • 47
  • 5
1
vote
0 answers

LSTM Time Series Anomaly detection

I am trying to find anomalies in a time series with an LSTM, and I am still wondering what the right architecture, timesteps, batch size, and sliding or non-sliding windows should be for finding anomalies based on the time series' past behaviour from numpy…
Manu Sharma
  • 1,593
  • 4
  • 25
  • 48
1
vote
0 answers

How to rewrite Keras Stateful LSTM in pure Tensorflow?

Could anyone share an idea/blog/code snippet on how to convert a Keras stateful LSTM into a pure TensorFlow model, and then train it in batches? TensorFlow doesn't support Keras stateful LSTMs on TPUs, and the devs declined to fix it. I have tons of TPU…
Boppity Bop
  • 9,613
  • 13
  • 72
  • 151
0
votes
0 answers

RNNs/LSTMs for one-step prediction in deterministic sequences of the type 1,1,1,-1,1,1,1,-1,1,1,1,-1,…

I am trying to understand recurrent neural nets better based on a simple example where the training set is of the form of n ones followed by a minus one (e.g., train_set = [*([1]*n), -1]*10_000). I would like to find an architecture that is able to converge…
0
votes
0 answers

LSTM Autoencoder - real-time prediction

I have trained the following LSTM autoencoder. Note: for training, I set the first dimension of the input to None to allow for training and testing on data with different timesteps. I also use the Lambda layer instead of repeat =…
0
votes
0 answers

Why does the final memory state equal the last hidden state of the entire hidden state sequence?

When return_sequences=True and return_state=True, the LSTM layer outputs the hidden states of all the LSTM cells along with the memory state and hidden state of the final cell, as described in the TensorFlow docs: lstm = tf.keras.layers.LSTM(4,…
noone
  • 6,168
  • 2
  • 42
  • 51
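The behaviour asked about above can be verified directly (shapes here are arbitrary; only the layer flags matter):

```python
import numpy as np
import tensorflow as tf

# With return_sequences=True and return_state=True, the layer returns the
# full hidden-state sequence plus the final hidden state and final cell
# (memory) state of the last timestep.
lstm = tf.keras.layers.LSTM(4, return_sequences=True, return_state=True)
x = tf.random.normal((2, 10, 8))          # (batch, timesteps, features)
seq, final_h, final_c = lstm(x)

# The final hidden state is, by definition, the last entry of the sequence.
assert np.allclose(seq[:, -1, :], final_h)
```

This equality is why the returned final hidden state always matches the last slice of the full sequence: they are the same tensor reported twice for convenience.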
0
votes
0 answers

Layer model_1 expects 2 input but it received 1 tensors received:[]

I encountered this error while trying to fit an encoder-decoder model using ConvLSTM2D. The x_train is of shape (31567, 7, 210, 203, 1), i.e. (batch_size, frame_length, H, W, C). The encoder part works when executed in isolation, but the error occurs when I add the…
0
votes
1 answer

ValueError in model.fit in lstm

I am trying to fit an LSTM model to my data, read from a CSV file. The shape of x_train is (320, 6), and the model is given as def build_modelLSTMlite(input_shape): model = keras.Sequential() model.add(keras.layers.LSTM(64,…
poorna
  • 61
  • 1
  • 9
0
votes
0 answers

AlphaStar architecture: Core - deep LSTM

While researching the architecture of the AlphaStar neural network, I came across a description that I am not fully clear on. Despite understanding the majority of the blocks, this specific part has left me with some questions. Knowledge about the…
0
votes
1 answer

Stateful RNN (LSTM) in keras

Imagine the following data: X = [x1, x2, x3, x4, x5, x6, ...] and Y = [y1, y2, y3, y4, ...], where the labels relate to the inputs in the following manner: [x1,x2] -> y1, [x2,x3] -> y2, ... I am trying to make a model using Keras so that when the…
NeuroEng
  • 191
  • 8
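The overlapping-window setup described in this question ([x1,x2] -> y1, [x2,x3] -> y2, …) can be sketched in plain NumPy (the helper name make_windows is hypothetical, not from the question):

```python
import numpy as np

def make_windows(x, y, window=2):
    """Pair each length-`window` slice of x with the corresponding label:
    [x1, x2] -> y1, [x2, x3] -> y2, and so on."""
    X = np.stack([x[i:i + window] for i in range(len(x) - window + 1)])
    return X[:len(y)], np.asarray(y)

x = np.arange(1, 7)            # x1..x6
y = np.array([10, 20, 30, 40]) # y1..y4
X_win, Y = make_windows(x, y)
# X_win[0] == [1, 2] pairs with y1, X_win[1] == [2, 3] with y2, ...
```

For an LSTM, `X_win` would then be reshaped to (samples, window, 1) before fitting; with consecutive overlapping windows like these, a stateful LSTM is one way to let the model carry context across them.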
0
votes
1 answer

PyTorch: Having trouble understanding the in-place replacement happening

This seems to be a common error, but I can't really understand its cause. I am having trouble figuring out where the in-place replacement is happening. My forward function: def forward(self, input, hidden=None): if hidden is None…
Eliethesaiyan
  • 2,327
  • 1
  • 22
  • 35
0
votes
1 answer

ValueError: Input 0 of layer lstm_14 is incompatible with the layer: expected ndim=3, found ndim=4. Full shape received: [None, 12, 12, 64]

I am using a CNN-LSTM network for image classification. My image size is (224, 224, 3) and the batch size is 90. I'm getting this error when passing input to the LSTM layer. Here is my code snippet: input1 = Input(shape=(224, 224,3)) x = Conv2D(8,…
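One common way to resolve this ndim mismatch (a sketch under assumed layer sizes, not necessarily the asker's intended architecture) is to reshape the CNN's 4-D feature map (batch, H, W, C) into the 3-D (batch, timesteps, features) tensor the LSTM expects:

```python
import tensorflow as tf
from tensorflow.keras import layers

inp = layers.Input(shape=(224, 224, 3))
x = layers.Conv2D(8, 3, activation="relu")(inp)
x = layers.MaxPooling2D()(x)
# LSTM needs 3-D input; the Conv2D/Pooling output is 4-D (batch, H, W, C).
# Flatten the spatial grid into a "timestep" axis, keeping the 8 channels
# (from the Conv2D above) as the feature dimension.
x = layers.Reshape((-1, 8))(x)
x = layers.LSTM(32)(x)
out = layers.Dense(10, activation="softmax")(x)  # 10 classes, assumed
model = tf.keras.Model(inp, out)
```

Treating each spatial position as a timestep is only one interpretation; for sequences of images, wrapping the CNN in `TimeDistributed` over a genuine time axis is the usual alternative.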
0
votes
2 answers

InvalidArgumentError when making a stateful LSTM

I'm working on a stateful LSTM to predict stock prices. These are the shapes of my input data: (updated) x_train = (10269, 300, 89) y_train = (10269, 1) x_test = (4401, 300, 89) y_test = (4401, 1) This is my model initialisation: batch_size =…
MGeureka
  • 63
  • 1
  • 9
0
votes
1 answer

Use Adam optimizer for LSTM network vs LBFGS

I have modified the PyTorch tutorial on LSTMs (sine-wave prediction: given sine values [0:N], predict values [N:2N]) to use the Adam optimizer instead of the LBFGS optimizer. However, the model does not train well and cannot predict the sine wave correctly. Since in most…
Roy
  • 65
  • 2
  • 15
  • 40
0
votes
1 answer

How can I get a stateful LSTM to reset its states between epochs during a Keras Tuner search?

I am trying to tune a stateful LSTM using Keras Tuner. I have the code working and it is able to train models, but I still can't figure out how to get the model to reset its states between epochs. Normally I would train for 1 epoch at a time in a loop…
WVJoe
  • 515
  • 7
  • 21
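One approach that fits a tuner-driven search, where you can't easily loop over epochs manually, is a callback that resets the carried state at the end of each epoch (ResetStatesCallback is a hypothetical helper; Keras 2 names the method reset_states, Keras 3 reset_state):

```python
import numpy as np
import tensorflow as tf

class ResetStatesCallback(tf.keras.callbacks.Callback):
    """Reset every stateful layer's carried state at the end of each epoch."""
    def on_epoch_end(self, epoch, logs=None):
        for layer in self.model.layers:
            # Handle both the Keras 2 and Keras 3 method names.
            reset = (getattr(layer, "reset_states", None)
                     or getattr(layer, "reset_state", None))
            if callable(reset):
                reset()

batch_size = 2
model = tf.keras.Sequential([
    tf.keras.Input(batch_shape=(batch_size, 5, 1)),
    tf.keras.layers.LSTM(4, stateful=True),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(4, 5, 1).astype("float32")
y = np.random.rand(4, 1).astype("float32")
history = model.fit(x, y, batch_size=batch_size, epochs=3, shuffle=False,
                    verbose=0, callbacks=[ResetStatesCallback()])
```

Because the callback is passed through `fit`, it should also run inside a Keras Tuner `search`, which forwards `callbacks` to each trial's `fit` call.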