I'm training a neural network with TensorFlow/Keras. The shapes are X_train=(627, 24) and Y_train=(627, 102); for validation, X_dev=(157, 24) and Y_dev=(157, 102). I used the Sequential model:
import numpy as np
import tensorflow as tf

model = tf.keras.models.Sequential()
model.add(tf.keras.Input(shape=(24,)))
model.add(tf.keras.layers.Dense(64, activation='relu'))
model.add(tf.keras.layers.Dropout(0.2))
model.add(tf.keras.layers.Dense(102, activation='sigmoid'))
print(model.output_shape)  # (None, 102)
model.compile(loss='mean_absolute_percentage_error', optimizer='SGD', metrics=['accuracy'])
model.fit(X_train, Y_train, epochs=64, batch_size=32, verbose=0)
loss, accuracy = model.evaluate(X_dev, Y_dev, verbose=0)
print('Model Loss: %.2f, Accuracy: %.2f' % (loss * 100, accuracy * 100))
import csv
predictions = model.predict(X_dev)
with open('restoredPavia.csv', 'a') as file:
    writer = csv.writer(file)
    for row in predictions:
        writer.writerow(np.round(row, 0))
The problem is that I can't get accuracy above 3%, and every predicted value comes out as either 0 or 1, while Y should span a wide positive range (e.g. 1000, 500, 2000), just as the X data does. I used simple linear regression on this problem before and got good results, but I'm required to develop a neural network to predict the output.

What is wrong with this approach? Can it be fixed, or is a neural network simply not suitable for this problem? Maybe I should change the sigmoid output so it can produce values over a wide range instead of just 0 to 1? Or maybe the model needs more data? Please let me know if you need more information.
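To make the sigmoid question concrete, here is the variant I'm considering; this is my own guess at a fix, not something I've confirmed works. The idea is that sigmoid can only emit values in (0, 1), so for regression targets the last layer would use the default linear activation, with a regression loss like MSE (the optimizer switch to 'adam' and the dummy data shapes below are just my assumptions for the sketch):

```python
import numpy as np
import tensorflow as tf

# Hypothetical variant: linear output + MSE, so predictions are unbounded
model = tf.keras.models.Sequential([
    tf.keras.Input(shape=(24,)),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(102),  # no activation: outputs are not squashed into (0, 1)
])
model.compile(loss='mse', optimizer='adam', metrics=['mae'])

# Dummy data with the same shapes as mine, only to check that the model runs
X = np.random.rand(8, 24).astype('float32')
Y = np.random.uniform(500.0, 2000.0, size=(8, 102)).astype('float32')
model.fit(X, Y, epochs=1, verbose=0)
print(model.predict(X, verbose=0).shape)  # (8, 102)
```

Related to this, since my Y values span roughly 500 to 2000, I'm also wondering whether I should scale the targets before training (and invert the scaling on the predictions afterwards) so SGD doesn't have to fit such large raw values directly. Is either of these the right direction?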