
I am searching for an efficient logistic regression implementation in MATLAB. I have used lassoglm, but with 10,000 examples, 1,000 features, regularization parameters ranging from 0.005 to 1, and two-fold cross-validation, it is really slow. Even starting with lambda 0.05, it takes a long time.
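
For reference, the setup described above corresponds roughly to a call like the following (a sketch only; the data and the lambda grid are placeholders, not my actual values):

    % Rough sketch of the setup described above; data and lambda grid are
    % placeholders, not the actual values used.
    X = randn(10000, 1000);                  % 10,000 examples, 1,000 features
    y = double(rand(10000, 1) > 0.5);        % binary response
    lambdas = logspace(log10(0.005), 0, 20); % regularization path, 0.005 .. 1
    [B, FitInfo] = lassoglm(X, y, 'binomial', ...
        'Lambda', lambdas, 'CV', 2);         % two-fold cross-validation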

Is there any better method?

user34790

1 Answer


You might want to check out LIBLINEAR. It is a free, state-of-the-art library for large-scale linear learning, and it provides a MATLAB interface.

LIBLINEAR offers several linear methods, including the following solvers (a minimal MATLAB usage sketch follows the list):

 for multi-class classification
     0 -- L2-regularized logistic regression (primal)
     1 -- L2-regularized L2-loss support vector classification (dual)
     2 -- L2-regularized L2-loss support vector classification (primal)
     3 -- L2-regularized L1-loss support vector classification (dual)
     4 -- support vector classification by Crammer and Singer
     5 -- L1-regularized L2-loss support vector classification
     6 -- L1-regularized logistic regression
     7 -- L2-regularized logistic regression (dual)
   for regression
    11 -- L2-regularized L2-loss support vector regression (primal)
    12 -- L2-regularized L2-loss support vector regression (dual)
    13 -- L2-regularized L1-loss support vector regression (dual)
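
In the MATLAB interface the solver is selected with the -s flag. A minimal sketch, assuming the LIBLINEAR MEX files (`train`/`predict`) are compiled and on the path, with placeholder data and a placeholder cost value:

    % Minimal sketch: -s 6 selects L1-regularized logistic regression,
    % -c sets the cost parameter (larger C = weaker regularization), and
    % -v 2 requests two-fold cross-validation. Data and cost are placeholders.
    X = sparse(randn(10000, 1000));            % instances as rows, sparse double
    y = double(rand(10000, 1) > 0.5) * 2 - 1;  % labels in {-1, +1}
    cv_acc = train(y, X, '-s 6 -c 1 -v 2');    % cross-validation accuracy
    model  = train(y, X, '-s 6 -c 1');         % fit on the full data
    [pred, acc, dec] = predict(y, X, model);   % predictions (here on training data)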
Marc Claesen
  • Thanks for the info. I just wanted to confirm that LIBLINEAR's L1-regularized logistic regression doesn't normalize the data itself; we have to provide normalized training instances. Also, I saw this `col` option. My training dataset has each instance as a row, so I don't need to do anything with the `col` parameter, right? I just didn't get what `col` means here. – user34790 Mar 02 '14 at 15:14
  • @user34790 you are right, liblinear does no implicit normalization. I don't know about the `col` option, as it seems to be absent from the command-line interface (I've never used liblinear in MATLAB myself). – Marc Claesen Mar 02 '14 at 15:18
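
To make the points from the comments concrete, here is a hedged sketch (placeholder variables) of standardizing features yourself before calling `train`, and of when the optional `col` argument of the MATLAB interface would be used (only if instances are stored as columns):

    % Sketch only: standardize features manually before training, since
    % LIBLINEAR performs no normalization (zscore needs the Statistics Toolbox).
    Xn = sparse(zscore(full(X)));
    model = train(y, Xn, '-s 6 -c 1');            % rows are instances: no 'col' needed
    % If instances were stored as columns instead, the extra flag would be:
    % model_t = train(y, Xn', '-s 6 -c 1', 'col');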