Questions tagged [roc]

An ROC (Receiver Operating Characteristic) curve is a graphical plot of a classifier's true positive rate against its false positive rate as the discrimination threshold is varied.

Receiver Operating Characteristic curve, or ROC curve, is a graphical depiction of classifier performance that shows the trade-off between increasing true positive rates (on the vertical axis) and increasing false positive rates (on the horizontal axis) as the discrimination threshold of the classifier is varied.

The true positive rate, defined as the fraction of true positives out of all actual positives, is also called the sensitivity or recall. The false positive rate, defined as the fraction of false positives out of all actual negatives, is equivalent to 1 - specificity.
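
As a quick illustration of these definitions, here is a minimal plain-Python sketch using made-up labels and thresholded predictions (both arrays are hypothetical, chosen only for illustration):

```python
# Hypothetical ground-truth labels and predictions at one fixed threshold
y_true = [1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 0, 1, 0, 0, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # true negatives

tpr = tp / (tp + fn)  # sensitivity / recall
fpr = fp / (fp + tn)  # false positive rate, i.e. 1 - specificity
```

Sweeping the threshold over a classifier's scores and recomputing (fpr, tpr) at each step traces out the ROC curve.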

In its original form, the ROC curve was used to summarize performance of a binary classification task, although it can be extended for use in multi-class problems.

A classifier performing at chance is expected to have equal true positive and false positive rates, producing a diagonal line. Classifiers that perform better than chance produce a curve above this diagonal. The area under the curve (AUC) is commonly used as a summary of the ROC curve and as a measure of classifier performance. The AUC is equal to the probability that the classifier will rank a randomly chosen positive case higher than a randomly chosen negative one; this is equivalent to the Wilcoxon rank-sum (Mann-Whitney U) statistic.
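
The ranking interpretation of the AUC can be checked directly. The sketch below (plain Python, with made-up labels and scores) computes the fraction of positive/negative pairs that are ranked correctly, counting ties as half; this quantity is the Mann-Whitney U statistic divided by the number of pairs, and it equals the area under the ROC curve:

```python
# Hypothetical labels and classifier scores; positives should tend to score higher
labels = [0, 0, 1, 1, 1, 0]
scores = [0.2, 0.6, 0.7, 0.9, 0.4, 0.1]

pos = [s for s, y in zip(scores, labels) if y == 1]
neg = [s for s, y in zip(scores, labels) if y == 0]

# Probability that a randomly chosen positive outranks a randomly chosen
# negative, with ties counted as 1/2 -- this equals the ROC AUC.
auc = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg) / (len(pos) * len(neg))
```

For real data the same number is returned by library routines such as scikit-learn's `roc_auc_score`, which compute it from sorted scores rather than by enumerating pairs.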

ROC curves enable visualizing and organizing classifier performance without regard to class distributions or error costs. This can be helpful when investigating learning with skewed distributions or cost-sensitive learning.

Helpful reading includes:

Fawcett, Tom. "ROC graphs: Notes and practical considerations for researchers." Machine Learning 31 (2004): 1-38.

Swets, John A., Robyn M. Dawes, and John Monahan. "Better decisions through science." Scientific American (2000): 83.

1076 questions
-2
votes
1 answer

Multiclass AUC with 95% confidence interval

I am currently trying to figure out if there is a way to get the 95% CI of the AUC in Python. Currently, I have a ypred list that contains the highest-probability class predictions between the 4 classes I have (so either a 0/1/2/3 at each position) and a…
Superwiz1
  • 81
  • 2
  • 10
-2
votes
1 answer

R randomForest classification - test data does not have the value to predict

I am trying a classification with Random Forest in R. I have a training data set that has a complexityFlag that is 1 or 0, and I am training my model on the data set using Random Forest: model1 <- randomForest(as.factor(ComplexityFlag) ~…
Jawahar
  • 183
  • 4
  • 16
-2
votes
1 answer

What is the prediction array in ROC curves in scikit?

import numpy as np from sklearn import metrics y = np.array([1, 1, 2, 2]) scores = np.array([0.1, 0.4, 0.35, 0.8]) fpr, tpr, thresholds = metrics.roc_curve(y, scores, pos_label=2) I am doing link prediction using an algorithm and I have a test and…
Code_ninja
  • 117
  • 1
  • 10
-2
votes
1 answer

ROC curve based on means and variances of controls and cases

Does anyone know of an R package (or any other statistical freeware or just a piece of code) that lets you plot a smooth ROC curve knowing only the means and variances of the control and case groups? That is, one that doesn't require a dataset with…
prishly
  • 11
  • 3
-2
votes
2 answers

Roc curve in linear discriminant analysis with R

I want to compute the Roc curve and then the AUC from the linear discriminant model. Do you know how can I do this? here there is the code: ##LDA require(MASS) library(MASS) lda.fit = lda(Negative ~., trainSparse) lda.fit plot(lda.fit) ###prediction…
mac gionny
  • 333
  • 1
  • 3
  • 8
-2
votes
1 answer

How to plot ROC curve for the following set?

I have two sets: set1=[0.333; 0.509; 0.607; 1.172; 0.275; 0.762; 0.850; 0.920; 0.556; -0.046]; set2=[ 0.295; -0.203; -0.097; 0.633; 0.147; 0.356; 0.235; -0.054; -0.024; 0.377; -0.180; 0.512; 0.428; -0.129; 0.094]; How to plot an ROC curve for it…
user3830162
-3
votes
1 answer

Why does the roc_curve() function reverse the values of FPR and TPR?

I want to plot the ROC curve using Python. I am using the roc_curve() function. I passed binary labels and the scores as input, along with pos_label = 1. When I plot fpr and tpr, the plot seems to be reversed. I provided the same inputs to MedCalc, and it…
-3
votes
1 answer

The Gnuplot output is set to display 'onscreen'; how to change the program so that plot output is directed to a folder? I am providing the program below

From the code below, the accuracy values of the ROC can be predicted, and Gnuplot was used to display the plot output. But the output is set to 'onscreen', so the plot just appears for a few seconds and closes. I want to direct the plot output to a…
-4
votes
0 answers

I want to see the AUC value to a precision of 4 decimal places for each fold in the plot of the ROC curve

I want to set the floating-point precision of the AUC (shown in the legend) to 4 digits. I used the default function available in sklearn.metrics.RocCurveDisplay. Also, I would like to export the yes/no results of each fold into a separate Excel file.…
Samiul
  • 51
  • 5