I'm doing binary classification. Whenever my predictions all equal the ground truth (so only one class appears in the data), sklearn.metrics.confusion_matrix
returns a single value instead of a 2x2 matrix. Isn't that a problem?
from sklearn.metrics import confusion_matrix
print(confusion_matrix([True, True], [True, True]))
# [[2]]
I would expect something like:
[[2 0]
[0 0]]
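As far as I can tell, `confusion_matrix` infers the label set from the data by default, so when only one class occurs it produces a 1x1 matrix. A sketch of a workaround using the `labels` parameter to pin the matrix to a fixed 2x2 shape (listing `True` first here only to match the ordering expected above):

```python
from sklearn.metrics import confusion_matrix

y_true = [True, True]
y_pred = [True, True]

# labels= fixes both the size and the row/column order of the matrix,
# even for classes that never appear in y_true or y_pred.
cm = confusion_matrix(y_true, y_pred, labels=[True, False])
print(cm)
# [[2 0]
#  [0 0]]
```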