I have records of 20-channel data, each channel with 5000 values (150,000+ records in total, stored as .npy files on disk).
I am following the Keras fit_generator tutorial at https://stanford.edu/~shervine/blog/keras-how-to-generate-data-on-the-fly.html to read the data; each record is loaded as a (5000, 20) NumPy array of type float32.
The networks I have designed have a parallel convolutional branch per channel, concatenated at the end, and therefore need to be fed the channels in parallel. Reading a single channel from each record and feeding it to a single-branch network works:
    def __data_generation(self, list_IDs_temp):
        'Generates data containing batch_size samples'  # X : (n_samples, *dim, n_channels)
        # Initialization
        if self.n_channels == 1:
            X = np.empty((self.batch_size, *self.dim))
        else:
            X = np.empty((self.batch_size, *self.dim, self.n_channels))
        y = np.empty((self.batch_size), dtype=int)

        # Generate data
        for i, ID in enumerate(list_IDs_temp):
            # Store sample
            d = np.load(self.data_path + ID + '.npy')
            d = d[:, self.required_channel]
            d = np.expand_dims(d, 2)
            X[i,] = d

            # Store class
            y[i] = self.labels[ID]

        return X, keras.utils.to_categorical(y, num_classes=self.n_classes)
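For reference, the parallel architecture I have in mind looks roughly like this (filter counts, kernel size, and the class count below are placeholders, not my real values):

```python
from keras.layers import Input, Lambda, Conv1D, GlobalMaxPooling1D, Concatenate, Dense
from keras.models import Model

n_channels = 20
inp = Input(shape=(5000, n_channels))

branches = []
for c in range(n_channels):
    # slice out channel c while keeping the last axis: (batch, 5000, 1)
    x = Lambda(lambda t, i=c: t[:, :, i:i + 1])(inp)
    x = Conv1D(8, 16, activation='relu')(x)   # placeholder per-channel branch
    x = GlobalMaxPooling1D()(x)
    branches.append(x)

merged = Concatenate()(branches)
out = Dense(3, activation='softmax')(merged)  # placeholder class count
model = Model(inp, out)
```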
However, when I read the whole record and try to feed it to the network, slicing the channels out with Lambda layers, I get an error.

Reading the whole record:
    X[i,] = np.load(self.data_path + ID + '.npy')
Using the Lambda slicing layer implementation available at https://github.com/keras-team/keras/issues/890 and calling:
    input = Input(shape=(5000, 20))
    slicedInput = crop(2, 0, 1)(input)
I am able to compile the model, and it shows the expected layer sizes.
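(For context, the crop helper from that issue is, paraphrased rather than verbatim, a Lambda factory along these lines:)

```python
from keras.layers import Lambda

def crop(dimension, start, end):
    # Returns a Lambda layer that slices a tensor along `dimension`;
    # dimension 0 is the batch axis, so crop(2, 0, 1) takes a
    # (batch, 5000, 20) input down to (batch, 5000, 1).
    def func(x):
        if dimension == 0:
            return x[start:end]
        if dimension == 1:
            return x[:, start:end]
        if dimension == 2:
            return x[:, :, start:end]
        if dimension == 3:
            return x[:, :, :, start:end]
        return x[:, :, :, :, start:end]
    return Lambda(func)
```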
When the data is fed to this network, I get:

    ValueError: could not broadcast input array from shape (5000,20) into shape (5000,1)
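The same error is reproducible with plain NumPy, which suggests it happens in the generator's assignment into X (allocated with a trailing axis of 1) rather than inside the model:

```python
import numpy as np

# X as allocated by the single-channel branch of the generator: dim = (5000, 1)
X = np.empty((32, 5000, 1))
record = np.zeros((5000, 20), dtype=np.float32)  # a full 20-channel record

try:
    X[0,] = record
except ValueError as e:
    print(e)  # same broadcast error as reported above
```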
Any help would be much appreciated....