
I am performing logistic regression for probabilistic modeling. When I go through the definitions of the **precision, precision@K, ROC curve, and precision-recall AUC** performance metrics, I am not able to tell them apart. Please correct me if my understanding is wrong; any suggestion would be much appreciated.

**What is the difference between precision and precision@K? Is the interpretation of a precision score of 0.8 the same as a precision@K score of 0.8, or does K carry some extra information?**

Precision: the ratio of correctly classified positive outcomes out of all predicted positive outcomes, i.e. TP / (TP + FP).
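For instance, this is how I would check it with scikit-learn (the labels here are made up just to illustrate):

```python
from sklearn.metrics import precision_score

# Hypothetical labels: 1 = positive class, 0 = negative class
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]  # model's hard predictions

# Precision = TP / (TP + FP): of the 4 predicted positives, 3 are correct
print(precision_score(y_true, y_pred))  # 0.75
```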

Precision@K: the ratio of correctly classified positive outcomes among the top K predictions, i.e. out of the K highest-ranked examples, how many are relevant to us. I understand this for recommendation problems, but how can we use it as a performance metric in a prediction/classification setting, for example, predicting employees leaving a company?
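To make my question concrete, this is the sketch I have in mind for the churn case: rank employees by predicted probability and score only the top K. All names, scores, and the choice of K below are invented for illustration:

```python
import numpy as np

def precision_at_k(y_true, y_score, k):
    """Fraction of true positives among the k highest-scored examples."""
    top_k = np.argsort(y_score)[::-1][:k]  # indices of the k largest scores
    return np.mean(np.asarray(y_true)[top_k])

# Hypothetical churn probabilities from a logistic regression
y_true  = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_score = np.array([0.9, 0.8, 0.75, 0.3, 0.6, 0.2, 0.85, 0.1])

# "If HR can only follow up with 3 employees, how many of the 3 we flag actually leave?"
print(precision_at_k(y_true, y_score, k=3))  # 2 of 3 flagged actually left: ~0.667
```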

**What is the difference between the ROC curve AUC and the precision-recall curve AUC? How are a 0.8 ROC AUC and a 0.8 precision-recall AUC interpreted?**

The ROC curve plots the **true positive rate** against the **false positive rate** as the decision threshold varies.

The precision-recall curve plots **precision** against **recall** as the threshold varies, and its AUC summarizes that curve.
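This is how I currently compute both areas; as I understand it, `average_precision_score` gives the area under the precision-recall curve. The scores below are made up, not my real data:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, average_precision_score

# Hypothetical scores; on an imbalanced problem the two AUCs can diverge a lot
y_true  = np.array([1, 0, 1, 1, 0, 0, 1, 0, 0, 0])
y_score = np.array([0.9, 0.8, 0.75, 0.3, 0.6, 0.2, 0.85, 0.1, 0.4, 0.05])

print(roc_auc_score(y_true, y_score))            # area under TPR-vs-FPR curve
print(average_precision_score(y_true, y_score))  # area under precision-recall curve
```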

I am having trouble understanding these concepts. Can somebody please help me understand them?

Thank You

Bad Coder
  • Here is some good discussion about the ROC curve and the precision-recall curve: https://www.kaggle.com/general/7517 and https://www.biostat.wisc.edu/~page/rocpr.pdf – Bad Coder Jun 04 '22 at 20:29
  • https://stats.stackexchange.com/questions/7207/roc-vs-precision-and-recall-curves – Bad Coder Jun 04 '22 at 20:36
