
Accuracy
Of all predictions made, how many are correct? (Closer to 1 is better.)

TP plus TN, divided by the total number of predictions: (TP + TN) / (TP + TN + FP + FN)
Recall
Of the samples that are actually positive, the proportion that are predicted positive

Of all the positives I'm trying to find, how many did I actually catch? (Closer to 1 is better.)
Precision
Of the samples predicted to be positive, the proportion that are actually positive; in other words, how trustworthy a positive prediction is

Of the questions you answered, how many did you get right? (Closer to 1 is better.)
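
To make the three definitions concrete, here is a minimal sketch of the binary case; the TP/TN/FP/FN counts are made-up values, purely for illustration:

    # Hypothetical binary confusion-matrix counts (made up for illustration).
    TP, TN, FP, FN = 40, 45, 5, 10

    accuracy = (TP + TN) / (TP + TN + FP + FN)   # correct / all predictions -> 0.85
    recall = TP / (TP + FN)                      # found positives / actual positives -> 0.80
    precision = TP / (TP + FP)                   # true positives / predicted positives -> ~0.89

    print(accuracy, recall, precision)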
Okay, let's work through a 3 x 3 confusion matrix example. The matrix itself isn't shown here, but its marginals are: the diagonal (correct predictions) is A = 15, B = 15, C = 45; the row sums (actual counts) are 20, 30, 50; and the column sums (predicted counts) are 24, 20, 56.

class A precision = 15 / 24 = 0.625
class B precision = 15 / 20 = 0.75
class C precision = 45 / 56 = 0.80
class A recall = 15 / 20 = 0.75
class B recall = 15 / 30 = 0.5
class C recall = 45 / 50 = 0.9
Accuracy of classifier = (15 + 15 + 45) / 100 = 0.75
Weighted Average Precision = (fraction of actual class A instances) * (precision of class A) + (fraction of actual class B instances) * (precision of class B) + (fraction of actual class C instances) * (precision of class C)
= 20 / 100 * 0.625 + 30 / 100 * 0.75 + 50 / 100 * 0.8
= 0.75
Weighted Average Recall = (fraction of actual class A instances) * (recall of class A) + (fraction of actual class B instances) * (recall of class B) + (fraction of actual class C instances) * (recall of class C)
= 20 / 100 * 0.75 + 30 / 100 * 0.5 + 50 / 100 * 0.9
= 0.75
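
You can check these numbers with a short NumPy sketch; since the full matrix isn't reproduced above, this works from the diagonal and the row/column totals:

    import numpy as np

    diag = np.array([15, 15, 45])        # correct predictions for A, B, C
    predicted = np.array([24, 20, 56])   # column sums: how often each class was predicted
    actual = np.array([20, 30, 50])      # row sums: actual instances of each class

    precision = diag / predicted         # [0.625, 0.75, 0.8036]
    recall = diag / actual               # [0.75, 0.5, 0.9]
    accuracy = diag.sum() / actual.sum()               # 0.75

    weights = actual / actual.sum()                    # class proportions, used as weights
    weighted_precision = (weights * precision).sum()   # 0.7518 (0.75 above uses the rounded 0.80)
    weighted_recall = (weights * recall).sum()         # 0.75

Note that the weighted average recall always equals the accuracy: each class's weight (actual count / total) cancels the recall denominator, leaving the sum of the diagonal divided by the total.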
In your case (a normalized confusion matrix: diagonal 0.90, 0.91, 0.93; each row sums to 0.99; column sums 0.90, 1.02, 1.05):

class A precision = 0.9 / 0.9 = 1
class B precision = 0.91 / 1.02 = 0.89
class C precision = 0.93 / 1.05 = 0.89
class A recall = 0.9 / 0.99 = 0.91
class B recall = 0.91 / 0.99 = 0.92
class C recall = 0.93 / 0.99 = 0.94
Accuracy of classifier = (0.9 + 0.91 + 0.93) / 2.97 = 0.92
Weighted Average Precision = (fraction of actual class A instances) * (precision of class A) + (fraction of actual class B instances) * (precision of class B) + (fraction of actual class C instances) * (precision of class C)
= 0.99 / 2.97 * 1 + 0.99 / 2.97 * 0.89 + 0.99 / 2.97 * 0.89 = 0.93
Weighted Average Recall = (fraction of actual class A instances) * (recall of class A) + (fraction of actual class B instances) * (recall of class B) + (fraction of actual class C instances) * (recall of class C)
= 0.99 / 2.97 * 0.91 + 0.99 / 2.97 * 0.92 + 0.99 / 2.97 * 0.94 = 0.92
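
Plugging your matrix's marginals into the same sketch reproduces these figures:

    import numpy as np

    diag = np.array([0.90, 0.91, 0.93])
    predicted = np.array([0.90, 1.02, 1.05])   # column sums
    actual = np.array([0.99, 0.99, 0.99])      # row sums

    precision = diag / predicted               # [1.0, 0.892, 0.886]
    recall = diag / actual                     # [0.909, 0.919, 0.939]
    accuracy = diag.sum() / actual.sum()                 # 0.9226 -> 0.92
    weights = actual / actual.sum()
    weighted_precision = (weights * precision).sum()     # 0.926 -> 0.93
    weighted_recall = (weights * recall).sum()           # 0.9226, same as accuracy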