I am using a Keras Sequential model whose prediction output has shape (1, 5) (5 features).
I have an accuracy metric defined as follows:
For N predictions, the accuracy of the model is the percentage of predicted samples for which every feature differs from its respective true label by no more than 10.
For example, if y_i = [1, 2, 3, 4, 5]
and ypred_i = [1, 2, 3, 4, 16],
it is not a match, since the last feature differs by 11. If y_i = [1, 2, 3, 4, 5]
and ypred_i = [10, 8, 0, 5, 7],
it is a match, because every feature differs from its respective true value by no more than 10.
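To make the definition concrete, this is how I would express that accuracy with TensorFlow ops (the name within_10_accuracy and the hard-coded threshold 10 are just my own sketch):

import tensorflow as tf

def within_10_accuracy(y_true, y_pred):
    # absolute difference per feature, shape (batch, 5)
    diff = tf.abs(y_true - y_pred)
    # a sample counts as correct only if every feature is within 10
    all_close = tf.reduce_all(diff <= 10.0, axis=-1)
    # fraction of correct samples in the batch
    return tf.reduce_mean(tf.cast(all_close, tf.float32))

I can track it via the metrics argument of compile, but it is piecewise constant, so I doubt it can be used directly as a loss.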
I am wondering which loss function to use in my Keras Sequential model so as to maximize the accuracy described above. Should I define a custom loss function, and if so, what should it look like? Or how should I proceed?
My code is:
from keras.models import Sequential
from keras.layers import Dense
# from keras import callbacks  # only needed if early stopping is re-enabled


class NeuralNetMulti(Regressor):
    def __init__(self):
        self.name = 'keras-sequential'
        self.model = Sequential()
        # self.earlystopping = callbacks.EarlyStopping(monitor="mae",
        #                                              mode="min", patience=5,
        #                                              restore_best_weights=True)

    def fit(self, X, y):
        print('Fitting into the neural net...')
        n_inputs = X.shape[1]
        n_outputs = y.shape[1]
        self.model.add(Dense(400, input_dim=n_inputs, kernel_initializer='he_uniform', activation='relu'))
        # self.model.add(Dense(20, activation='relu'))
        self.model.add(Dense(200, activation='relu'))
        # self.model.add(Dense(10, activation='relu'))
        self.model.add(Dense(n_outputs))  # linear output, one unit per target feature
        self.model.summary()
        self.model.compile(loss='mae', optimizer='adam', metrics=['mse', 'mae', 'accuracy'])
        history = self.model.fit(X, y, verbose=1, epochs=200, validation_split=0.1)
        # self.model.fit(X, y, verbose=1, epochs=1000, callbacks=[self.earlystopping])
        print('Fitting completed!')

    def predict(self, X):
        print('Predicting...')
        predictions = self.model.predict(X, verbose=1)
        print('Predicted!')
        return predictions
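For context, this is roughly how I call the class (the data below is only dummy data to show the call pattern; the shapes are assumptions):

import numpy as np

X_train = np.random.rand(100, 8)
y_train = np.random.rand(100, 5) * 100   # 5 target features
X_test = np.random.rand(10, 8)

reg = NeuralNetMulti()
reg.fit(X_train, y_train)
preds = reg.predict(X_test)   # shape (10, 5)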
My suggestion for a loss function:
def N_distance(y_true, y_pred):
    vals = abs(y_true - y_pred)
    # 0 if every feature is within 10 of its true value, 1 otherwise
    if all(a <= 10 for a in vals):
        return 0
    return 1
It returns 0 if the condition holds and 1 otherwise.
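If it matters, here is my attempt at writing the same check with TensorFlow ops so that Keras would at least accept it as a callable loss (N_distance_tf is my own name; I suspect the 0/1 output gives no useful gradient, which is part of why I am asking):

import tensorflow as tf

def N_distance_tf(y_true, y_pred):
    diff = tf.abs(y_true - y_pred)
    # per sample: 0.0 if every feature is within 10, 1.0 otherwise
    miss = 1.0 - tf.cast(tf.reduce_all(diff <= 10.0, axis=-1), tf.float32)
    # average over the batch
    return tf.reduce_mean(miss)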