The datasets I am working with correspond to individual time series signals. Each signal is unique, with a different total number of data points, though every signal represents the same semantic data (speed in mph).
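For concreteness, here is roughly what two of these files look like once loaded (the file names and lengths below are made up):

import pandas as pd

# Hypothetical files: each holds one speed signal with its own length
signal_a = pd.read_csv('run_01.csv')  # e.g. 1,200 rows of speed in mph
signal_b = pd.read_csv('run_02.csv')  # e.g. 3,500 rows of speed in mph
print(len(signal_a), len(signal_b))   # lengths differ per signal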
I am working with Keras and am trying to fit a basic neural network to the data, just to evaluate it. Below is the Python code for that:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

model = Sequential()
model.add(Dense(64, activation='relu'))
model.add(Dropout(0.2))                    # dropout to reduce overfitting
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.2))
model.add(Dense(64, activation='relu'))
model.add(Dense(1, activation='sigmoid'))  # single binary output
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
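For reference, the architecture can be inspected by building it with a placeholder feature count (the 8 below is made up and just stands in for however many columns X_train has):

model.build(input_shape=(None, 8))  # None = batch size, 8 = placeholder feature count
model.summary()                     # prints layer shapes and parameter counts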
Essentially, I am fitting the model to each dataset as follows:
import os
import pandas as pd

for file in os.listdir(directory):
    data = pd.read_csv(os.path.join(directory, file))
    # get X_train and y_train from data ...
    model.fit(X_train, y_train, epochs=10)
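My understanding is that each successive call to model.fit continues training from the weights left by the previous call, rather than re-initializing the model. A minimal check of that, using made-up data with the same placeholder feature count of 8:

import numpy as np

w_before = model.get_weights()
X_dummy = np.random.rand(32, 8)                  # made-up batch: 32 samples, 8 features
y_dummy = np.random.randint(0, 2, size=(32, 1))  # made-up binary labels
model.fit(X_dummy, y_dummy, epochs=1, verbose=0)
w_after = model.get_weights()                    # should differ from w_before; the next fit call starts here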
Is this a valid way to train a model on multiple datasets of the same semantic data?