
I am trying to compute prediction accuracy, precision, and recall for a Lasso logistic model, but no matter what lambda I set, the recall is always 1 (even when I set lambda = 0).


Below is my code. I'm using the Smarket dataset from the ISLR package.

library(ISLR)
library(glmnet)

k <- 10  # 10-fold CV

set.seed(42)
folds <- sample(1:k, nrow(Smarket), replace = TRUE)

x <- model.matrix(Direction ~ Lag1 + Lag2 + Lag3 + Lag4 + Lag5 + Volume,
                  Smarket)[, -1]
y <- Smarket$Direction

results <- data.frame(matrix(NA, nrow = k, ncol = 3))
colnames(results) <- c("Accuracy", "Precision", "Recall")

for (j in 1:k) {
  lasso.mod <- glmnet(x[folds != j, ], y[folds != j], alpha = 1,
                      lambda = 0.01, family = "binomial")

  preds <- predict(lasso.mod, newx = x[folds == j, ], type = "response")
  class <- ifelse(preds >= 0.5, "Up", "Down")

  accuracy  <- sum(class == y[folds == j]) / length(class)
  precision <- sum(class == "Up" & y[folds == j] == "Up") / sum(class == "Up")
  recall    <- sum(class == "Up" & y[folds == j] == "Up") /
               sum(y[folds == j] == "Up")

  results[j, ] <- c(accuracy, precision, recall)
}

Is there anything wrong with my model? I suspect the problem is in my glmnet call rather than the loop, because everything works fine when I fit the logistic model with glm instead of glmnet.

Zheyuan Li
Sandy Vo

1 Answer


This seems to be a rounding issue with your results table. When I run your code I get the following:

    Accuracy Precision    Recall
1  0.5034965 0.4963504 0.9714286
2  0.5000000 0.4915254 0.9830508
3  0.4961240 0.4961240 1.0000000
4  0.5315315 0.5510204 0.8709677
5  0.5000000 0.4959350 1.0000000
6  0.5641026 0.5888889 0.7910448
7  0.4869565 0.4821429 0.9818182
8  0.5378151 0.5304348 0.9838710
9  0.5486726 0.5648148 0.9384615
10 0.5222930 0.5285714 0.8915663

I'm not sure why only your Recall column appears rounded. Try the following to see the unrounded recall values:

print(results, digits = 10)
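To illustrate the display behavior in isolation, here is a standalone sketch with made-up values (not your actual results): R's default of 7 significant digits can print a number that is very close to 1 as exactly 1, while the stored value is unchanged.

```r
# Standalone sketch with made-up recall values (hypothetical, for illustration)
res <- data.frame(Recall = c(0.99999999, 0.97000000))

# With the default 7 significant digits, 0.99999999 can be displayed as 1
print(res)

# Raising the digits reveals the stored, unrounded values
print(res, digits = 10)

# The stored value is still strictly less than 1
res$Recall[1] < 1
```

So a printed 1.0000000 in a results table does not necessarily mean the underlying value is exactly 1; always check with a higher `digits` setting before concluding anything.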
rw2