I have the RMSE (root mean squared error) loss, defined as:
RMSE = np.sqrt(np.mean((np.array(pred_df.real_values) - np.array(pred_df.estimate_values))**2))
where both the real values and the predictions lie between 0.0 and 5.0.
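For a minimal reproducible setup, pred_df looks roughly like this (the actual numbers here are made up for illustration; both columns stay within [0.0, 5.0]):

import numpy as np
import pandas as pd

pred_df = pd.DataFrame({
    "real_values":     [4.5, 3.0, 0.5, 5.0],   # ground truth, in [0.0, 5.0]
    "estimate_values": [4.0, 3.5, 1.0, 4.0],   # model predictions, in [0.0, 5.0]
})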
I want to use this as an accuracy metric rather than as a loss, but I don't know the interval of values this function can take. The only thing I can think of is:
Worst case - every prediction is off by the maximum amount (5.0 away from the real value): RMSE = 5.0
Best case - all predictions are exactly correct: RMSE = 0.0
Can I just use 5.0 - RMSE as my accuracy metric? Is there a smarter way of doing this?
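Concretely, this is the kind of helper I have in mind (the function name and the max_error parameter are just mine for illustration; 5.0 is the worst possible RMSE given that all values lie in [0.0, 5.0]):

def rmse_accuracy(pred_df, max_error=5.0):
    # RMSE as defined above
    rmse = np.sqrt(np.mean((np.array(pred_df.real_values) - np.array(pred_df.estimate_values))**2))
    # Subtract from the worst case and rescale so 1.0 = perfect, 0.0 = worst possible
    return (max_error - rmse) / max_error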