I am trying to create a confusion matrix for a Sequential model. This is what I have right now.
import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
from sklearn.metrics import confusion_matrix

cleaned_pokedex = pd.read_csv('cleaned_pokedex.csv')
labels = sorted(cleaned_pokedex['Type1'].unique())

pred_y = model.predict(test_x)
confusion = confusion_matrix(true_y, np.argmax(pred_y, axis=1))

_, axis = plt.subplots(figsize=(12, 10))
sns.heatmap(confusion,
            annot=True,
            fmt='d',
            xticklabels=labels,
            yticklabels=labels,
            ax=axis)
plt.xlabel('Prediction')
plt.ylabel('True')
plt.show()
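For context, here is a minimal sketch of the kind of sanity check I'd run on the inputs to confusion_matrix (the arrays below are made-up stand-ins for true_y and the model's softmax output; they are not from my actual data). If true_y were accidentally one-hot encoded, it would also need an argmax before being passed in:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Hypothetical stand-ins for true_y and the network's softmax output.
rng = np.random.default_rng(0)
true_y = rng.integers(0, 3, size=10)          # integer class labels
pred_probs = rng.random((10, 3))              # rows of class "probabilities"
pred_classes = np.argmax(pred_probs, axis=1)  # collapse rows to class indices

# If true_y were one-hot encoded, it would need the same argmax treatment:
true_onehot = np.eye(3)[true_y]
assert np.array_equal(np.argmax(true_onehot, axis=1), true_y)

cm = confusion_matrix(true_y, pred_classes)
print(np.bincount(pred_classes, minlength=3))  # predicted-class distribution
print(cm.sum())                                # should equal the sample count
```

If np.bincount shows all the mass on class 0, the problem is upstream of the confusion matrix: predict itself is emitting class 0 for everything.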
It's resulting in a heatmap where all of the counts land in the first column: every sample is recorded as a prediction of Bug (even though the model isn't actually always predicting Bug). Any ideas? I suspect the mistake is something small, but I've been changing things and can't figure it out.
EDIT: Clarification - there is nothing wrong with the model/classifier; it achieves 42% accuracy with model.evaluate(). It is not a problem with the heatmap either, since printing 'confusion' shows the same thing. The problem is in the line where I create the confusion matrix.
EDIT2: I was wrong - I'll ask a separate question, because I've never seen this problem before. Somehow evaluate and predict produce two different sets of predictions: evaluate scores the model fine, but predict returns all 0's. Really weird.
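To make the discrepancy concrete: the accuracy that evaluate() reports should be recoverable by hand from predict()'s output. A pure-numpy sketch of that cross-check (the arrays here are hypothetical, not my real model output):

```python
import numpy as np

# Hypothetical softmax output and integer labels. If evaluate() reports 42%,
# computing accuracy this way from predict() should give the same number.
pred_y = np.array([[0.1, 0.7, 0.2],
                   [0.8, 0.1, 0.1],
                   [0.2, 0.2, 0.6]])
true_y = np.array([1, 0, 2])

manual_acc = np.mean(np.argmax(pred_y, axis=1) == true_y)
print(manual_acc)  # 1.0 for this toy data
```

In my case the manual accuracy disagrees with evaluate(), which is what points the finger at predict.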