
I'm building a penalized multinomial logistic regression, but I'm having trouble coming up with an easy way to get the prediction accuracy. Here's my code:

library(glmnet)

# cross-validated ridge (alpha = 0) multinomial fit to choose lambda
fit.ridge.cv <- cv.glmnet(train[,-1], train[,1], type.measure="mse", alpha=0,
                      family="multinomial")

# refit at the lambda selected by cross-validation
fit.ridge.best <- glmnet(train[,-1], train[,1], family = "multinomial", alpha = 0,
                     lambda = fit.ridge.cv$lambda.min)

# predicted class probabilities on the test set
fit.ridge.pred <- predict(fit.ridge.best, test[,-1], type = "response")

The first column of my test data is the response variable, and it has 4 categories. If I look at the result (fit.ridge.pred), it looks like this:

           1            2           3            4
0.8743061353 0.0122328811 0.004798154 0.1086628297

From what I understand these are the class probabilities. I want to know if there's an easy way to compute the model's accuracy on the test data. Right now I'm taking the maximum probability in each row and comparing it with the original label. Thanks
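
Roughly what I'm doing right now, as a sketch (this assumes the labels in test[,1] are coded 1-4, matching the order of the probability columns):

pred.idx <- apply(fit.ridge.pred, 1, which.max)  # index of the most probable class per row
mean(pred.idx == test[, 1])                      # fraction of rows where that matches the true label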

ajax2000
  • @李哲源 thanks, I'm really just trying to compute the accuracy and confusion matrix. With nnet I can just do table(pred, test[,1]) and that gives me the confusion matrix. I need to figure out how to do it with glmnet – ajax2000 Aug 14 '18 at 15:06

2 Answers


Something like:

predicted <- colnames(fit.ridge.pred)[apply(fit.ridge.pred, 1, which.max)]  # most probable class per row
table(predicted, test[, 1])                                                 # confusion matrix: predicted vs. actual

The first line picks, for each row, the class to which the model assigns the highest probability; the second line builds the confusion matrix against the true labels.

The accuracy is then simply the proportion of observations classified correctly (sum of the diagonal divided by the total).
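
For example, a minimal sketch of that calculation using the objects above:

cm <- table(predicted, test[, 1])   # confusion matrix from the previous step
sum(diag(cm)) / sum(cm)             # accuracy: correct classifications / total observations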

Robert
  • thanks, I just used apply(fit.ridge.pred,1,which.max) and was able to compute mean error and get the confusion matrix – ajax2000 Aug 14 '18 at 16:40

For more details, see the Glmnet Vignette.

fit.ridge.pred <- predict(fit.ridge.best, test[,-1], type = "class")  # predict classes, not probabilities
table(fit.ridge.pred, test[,1])                                       # confusion matrix
mean(fit.ridge.pred == test[,1])                                      # accuracy
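
As a side note, you can also get class predictions directly from the cv.glmnet object without refitting, by asking for the lambda you want (a sketch, assuming the same train/test layout as in the question):

# predict classes at the CV-selected lambda straight from the cv.glmnet fit
pred.class <- predict(fit.ridge.cv, newx = test[,-1], s = "lambda.min", type = "class")
table(pred.class, test[,1])      # confusion matrix
mean(pred.class == test[,1])     # accuracy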