Questions tagged [lstm]

Long short-term memory (LSTM). A neural network (NN) architecture containing recurrent NN blocks that can remember a value for an arbitrary length of time. A very popular building block for deep NNs.

Long short-term memory neural networks (LSTMs) are a subset of recurrent neural networks. They can take time-series data and make predictions using knowledge of how the system is evolving.

A major benefit of LSTMs is their ability to store and use long-term information, not just what they are given at a particular instant. For more information on LSTMs, see colah's blog post and MachineLearningMastery.
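The gating mechanism that lets an LSTM retain information can be sketched in plain NumPy. This is a minimal single-step LSTM cell for illustration only; the weight layout (input/forget/output/candidate gates stacked in one matrix) and all names are assumptions, not any particular library's API:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W: (4*H, D), U: (4*H, H), b: (4*H,)."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])        # input gate: how much new info to write
    f = sigmoid(z[H:2*H])      # forget gate: how much old memory to keep
    o = sigmoid(z[2*H:3*H])    # output gate: how much state to expose
    g = np.tanh(z[3*H:4*H])    # candidate values for the cell state
    c = f * c_prev + i * g     # cell state carries long-term memory
    h = o * np.tanh(c)         # hidden state is the per-step output
    return h, c

rng = np.random.default_rng(0)
D, H = 3, 5                    # input dim, hidden dim
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(10):            # unroll over a 10-step sequence
    h, c = lstm_step(rng.normal(size=D), h, c, W, U, b)
print(h.shape)  # (5,)
```

Because the forget gate multiplies the previous cell state rather than repeatedly squashing it through a nonlinearity, gradients can flow across many time steps, which is what gives LSTMs their long-range memory.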

6289 questions
2 votes, 0 answers

Multi layer dynamic LSTM in tflearn

I want to feed the IMDB dataset into a multi-layer dynamic LSTM network, but it seems the next LSTM layer can't parse the previous layer's output. Code: net = tflearn.input_data([None, 100]) net = tflearn.embedding(net, input_dim=6819, output_dim=256) net…
2 votes, 1 answer

Many to one LSTM, multiclass classification

I am trying to train an LSTM-RNN with 64 hidden units. My data is the following: input: numpy array with dimensions (170000, 50, 500) -> (examples, time steps, number of features) output: numpy array with dimensions (170000, 10) The output is a…
asked by Alejandro
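The many-to-one pattern asked about above (input shaped (examples, time steps, features), output shaped (examples, classes)) comes down to keeping only the last time step of the recurrent output and feeding it to a softmax. A NumPy sketch of that final stage, with all sizes and names illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
batch, steps, hidden, classes = 4, 50, 64, 10
seq_out = rng.normal(size=(batch, steps, hidden))  # per-step LSTM outputs
last = seq_out[:, -1, :]          # many-to-one: keep only the final step
W = rng.normal(size=(hidden, classes))
b = np.zeros(classes)
logits = last @ W + b
# softmax over the 10 classes (numerically stabilized)
probs = np.exp(logits - logits.max(axis=1, keepdims=True))
probs /= probs.sum(axis=1, keepdims=True)
print(probs.shape)  # (4, 10)
```

In Keras terms this corresponds to an LSTM layer that returns only its last output (the default), followed by a Dense layer with softmax activation.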
2 votes, 1 answer

Tensorflow: Create a custom sub-class of LSTM cell with a different "call" function

Using dynamic_rnn on a sequence of input currently returns you a sequence of outputs and the last cell-state. For the task at hand (a truncated back-prop that can start/end at any index in the sequence) I need to access not just the last cell-state…
asked by Evan Pu
2 votes, 1 answer

Keras: Embedding layer + LSTM: Time Dimension

This might be too stupid to ask ... but ... when using an LSTM after the initial Embedding layer in Keras (for example in the Keras LSTM-IMDB tutorial code), how does the Embedding layer know that there is a time dimension? In other words, how does the…
asked by Zane
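The time dimension in the question above is preserved automatically: an embedding is just a table lookup, and indexing a (vocab, dim) table with a (batch, timesteps) array of token ids yields a (batch, timesteps, dim) tensor, which is exactly the 3D input an LSTM expects. A NumPy sketch (sizes borrowed from the IMDB-style setup; all values are illustrative):

```python
import numpy as np

vocab, embed_dim = 6819, 256
rng = np.random.default_rng(0)
table = rng.normal(size=(vocab, embed_dim))        # the embedding matrix
tokens = rng.integers(0, vocab, size=(32, 100))    # (batch, timesteps) of ids
embedded = table[tokens]    # fancy indexing keeps the time axis intact
print(embedded.shape)  # (32, 100, 256)
```

So the Embedding layer never needs to "know" about time: whatever leading axes the integer input has are carried through to the output, and the LSTM then iterates over the second axis.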
2 votes, 1 answer

mismatch in input_shape and model structure

This is the code: model = Sequential() model.add(LSTM(24, input_shape = (trainX.shape[0], 1, 4))) model.add(Dense(12, activation = 'softmax')) model.compile(loss='mean_squared_error', optimizer='adam') model.fit(trainX, trainY, epochs=100,…
asked by PQMeng
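A likely culprit in the snippet above is that `input_shape` includes the sample count (`trainX.shape[0]`). In Keras, `input_shape` describes a single example, i.e. (timesteps, features); the batch axis is implied. A tiny sketch of the shape arithmetic (the array here is a stand-in, not the asker's data):

```python
import numpy as np

# Stand-in for trainX: (samples, timesteps, features)
trainX = np.zeros((1000, 1, 4))

# Keras input_shape must exclude the sample axis:
input_shape = trainX.shape[1:]   # (1, 4), not (1000, 1, 4)
print(input_shape)  # (1, 4)
```

With that fix the layer would be declared roughly as `LSTM(24, input_shape=trainX.shape[1:])`, and (separately) a softmax output paired with mean squared error is an unusual combination worth revisiting.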
2 votes, 1 answer

How on earth can I pass sequences with different lengths to an LSTM in Keras?

I have an X_train set of 744983 samples divided into 24443 sequences, and the number of samples in each sequence differs. Each sample is a 30-dimensional vector. How can I feed this data into a Keras LSTM? Here is some description of the…
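The standard answer to variable-length sequences is to pad (or truncate) them to a common length, optionally masking the padding so the LSTM ignores it. The padding step can be sketched in NumPy; this mimics what a utility like Keras's `pad_sequences` does, but the function and sizes here are illustrative:

```python
import numpy as np

def pad_sequences(seqs, maxlen, dim):
    """Zero-pad each (len_i, dim) sequence at the front to (maxlen, dim)."""
    out = np.zeros((len(seqs), maxlen, dim))
    for i, s in enumerate(seqs):
        s = np.asarray(s)[-maxlen:]        # truncate sequences that are too long
        out[i, maxlen - len(s):] = s       # pre-pad short ones with zeros
    return out

rng = np.random.default_rng(0)
# three sequences of unequal length, each sample a 30-dim vector
seqs = [rng.normal(size=(n, 30)) for n in (5, 12, 7)]
batch = pad_sequences(seqs, maxlen=12, dim=30)
print(batch.shape)  # (3, 12, 30)
```

In Keras one would then add a Masking layer (or `mask_zero=True` on an Embedding) so the padded zeros do not contribute to the recurrent computation; bucketing sequences of similar length into separate batches is a common alternative.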
2 votes, 1 answer

Issue in LSTM Input Dimensions in Keras

I am trying to implement a multi-input LSTM model using keras. The code is as follows: data_1 -> shape (1150,50) data_2 -> shape (1150,50) y_train -> shape (1150,50) input_1 = Input(shape=data_1.shape) LSTM_1 = LSTM(100)(input_1) input_2 =…
asked by Arpitha
2 votes, 3 answers

How to import LSTM in Keras, Tensorflow

When I try to import the LSTM layer I encounter the following error: from keras.layers.recurrent import LSTM No module named 'LSTM' So I tried to download this module from a website, but another problem is that the file type is .tar and I don't know how to…
asked by Tanakorn Taweepoka
2 votes, 0 answers

How to learn variable time interval events with lstm

I'm a little bit confused about how to format my model for variable-time-interval event classification with an LSTM. I'm trying to classify events along the time dimension. The events occur at different intervals (e.g., Event1 and Event2 have an interval of…
2 votes, 0 answers

Perplexity calculation for Language Model on 1 Billion Word Language Model Benchmark

Recently, I have been trying to implement an RNNLM based on this article. There is an implementation with some LSTM factorization tricks, but it is similar to the original implementation by the author. Preamble: 1) The dataset is split into files and then…
2 votes, 0 answers

How to model correlated features using LSTM for multivariate regression

I have an MxN matrix of time-series data that includes X, Y, and Z features such that X, Y, and Z are correlated. Instead of simply fitting a line to each feature independently, I would like the LSTM to model the correlations between these…
asked by 5ive
2 votes, 1 answer

Keras LSTM Shape for Pandas DataFrame

I'm playing around with machine learning and trying to follow along with some examples but am stuck trying to get my data into a Keras LSTM layer. I have some stock ticker data in a Pandas DataFrame which is resampled at 15 minute intervals with…
asked by Ludo
2 votes, 1 answer

Keras - Making two predictions from one neural network

I'm trying to combine two outputs produced by the same network, which makes predictions on a 4-class task and a 10-class task. I then want to combine these outputs into a length-14 array that I use as my end target. While this seems to…
asked by tryingtolearn
2 votes, 0 answers

Sequence Autoencoder for compression

I want to build a sequence-to-sequence autoencoder for signal compression. I wanted to start with a standard LSTM-based autoencoder. However, Keras complains about my model. Any hint what I'm doing wrong? from keras.layers import Input, LSTM,…
asked by CAFEBABE
2 votes, 0 answers

PyTorch, simple char level RNN, can't overfit one example

I'm new to the PyTorch framework (coming mainly from Theano and TensorFlow): I've followed the introduction tutorial and read the Classifying Names with a Character-Level RNN one. I'm now trying to adapt it to a char-level LSTM model in order to gain…
asked by JimZer