
I am trying to use a 1D CNN for time series prediction. I have a time series dataset with 30 features, 3 targets, and more than 3000 rows.
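
For reference, this is roughly how I assume the arrays are laid out (the names and shapes below are illustrative, not pasted from my data):

import numpy as np

# Assumed layout: one row per time step, 30 feature columns, 3 target columns
x = np.random.rand(3000, 30)  # features, shape (samples, 30)
y = np.random.rand(3000, 3)   # targets, shape (samples, 3)

print(x.shape, y.shape)  # (3000, 30) (3000, 3)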

This is my Keras model:

from keras.models import Sequential
from keras.layers import Embedding, Convolution1D, MaxPooling1D, Flatten, Dropout, Dense, Activation
from keras.optimizers import RMSprop

model = Sequential()
model.add(Embedding(64, 10, batch_input_shape=(100, 30)))  # 100 time steps and 30 features
model.add(Convolution1D(nb_filter=256,
                        filter_length=3,
                        border_mode='valid',
                        activation='relu',
                        subsample_length=1))
model.add(MaxPooling1D())
model.add(Convolution1D(nb_filter=150,
                        filter_length=3,
                        border_mode='valid',
                        activation='relu',
                        subsample_length=1))
model.add(MaxPooling1D())
model.add(Flatten())
model.add(Dropout(0.2))
model.add(Dense(3))
model.add(Activation('tanh'))

optimizer = RMSprop(lr=0.01)
model.compile(loss='mse', optimizer=optimizer)

model.fit(x, y)

The model compiles without any error, but when I try to fit it I get this error:

IndexError: index 124 is out of bounds for size 64
Apply node that caused the error: AdvancedSubtensor1(embedding_17_W, Reshape{1}.0)

I saw this answer, but my x (features) and y (targets) are already NumPy arrays. How can I solve this?

EDITED

After some tinkering I found that the problem is caused by the CNN model itself. I trained a simple neural network on the same dataset and it ran without any issue.

model = Sequential()
model.add(Dense(30, input_dim=30))
model.add(Activation('tanh'))
model.add(Dense(15))
model.add(Activation('tanh'))
model.add(Dropout(0.2))
model.add(Dense(3))
model.add(Activation('tanh'))
optimizer = RMSprop(lr=0.01)
model.compile(loss='mse', optimizer=optimizer)

model.fit(x,y)

Does anyone know what is wrong with my CNN model?

  • The problem is in your Embedding layer: instead of 64 it should be `1 + max_index` – Alexey Golyshev Jan 18 '17 at 11:11
  • It's showing the same error: `IndexError: index 3 is out of bounds for size 1` – Eka Jan 18 '17 at 12:28
  • You don't understand me. See the Keras [documentation](https://keras.io/layers/embeddings/): `input_dim: int > 0. Size of the vocabulary, ie. 1 + maximum integer index occurring in the input data`. Your Embedding should be 125 or more. Evaluate `numpy.max(x)` and add 1 (index 0 is reserved for unknown values). – Alexey Golyshev Jan 18 '17 at 14:03
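
A minimal sketch of what the last comment suggests, assuming `x` really contains integer indices (which is what an Embedding layer expects); the variable names mirror the code above:

import numpy as np
from keras.models import Sequential
from keras.layers import Embedding

# input_dim must be 1 + the largest integer index appearing in x
vocab_size = int(np.max(x)) + 1

model = Sequential()
model.add(Embedding(vocab_size, 10, batch_input_shape=(100, 30)))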
