
I'm trying to find a way to fit a linear regression model with positive coefficients.

The only way I have found is sklearn's Lasso model, which has a positive=True argument, but the docs recommend against using it with alpha=0 (i.e., with no regularization on the weights).

Do you know of another model/method/way to do it?

desertnaut
Oren

3 Answers


IIUC, this is a problem that can be solved with scipy.optimize.nnls, which performs non-negative least squares:

Solve argmin_x || Ax - b ||_2 for x>=0.

In your case, b is your y, A is your X, and x is the β (the coefficients); otherwise it's the same problem, no?
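A minimal sketch of that mapping in code (the data here is made up for illustration; nnls returns the coefficient vector and the residual norm):

```python
import numpy as np
from scipy.optimize import nnls

# Toy data: y depends on X through strictly positive true coefficients
rng = np.random.default_rng(0)
X = rng.random((100, 3))                       # this is A
true_coef = np.array([1.5, 0.5, 2.0])          # this is x (the β)
y = X @ true_coef + 0.01 * rng.standard_normal(100)  # this is b

# nnls solves argmin_x ||Ax - b||_2 subject to x >= 0
coef, residual_norm = nnls(X, y)
print(coef)  # close to [1.5, 0.5, 2.0], every entry >= 0
```

Note that, unlike LinearRegression, nnls does not fit an intercept; if you need one, append a column of ones to X.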

Ami Tavory

Several functions can fit a linear regression model with positive coefficients:

  1. scipy.optimize.nnls solves exactly the problem above.
  2. scikit-learn's LinearRegression accepts the parameter positive=True to do this. Internally, sklearn also uses scipy.optimize.nnls; interestingly, its source code shows how the multiple-target case is handled.
  3. Additionally, if you want to solve linear least squares with general bounds on the variables, see lsq_linear.
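To illustrate the third option, here is a small sketch using scipy.optimize.lsq_linear with box constraints (the data and the [0, 1] bounds are made up for illustration; non-negativity alone would be bounds=(0, np.inf)):

```python
import numpy as np
from scipy.optimize import lsq_linear

# Toy data with true coefficients inside the box [0, 1]
rng = np.random.default_rng(1)
X = rng.random((50, 2))
y = X @ np.array([0.3, 0.7]) + 0.01 * rng.standard_normal(50)

# Solve min ||Xc - y||_2 subject to 0 <= c <= 1 elementwise
res = lsq_linear(X, y, bounds=(0, 1))
print(res.x)  # coefficients, each within [0, 1]
```

The bounds can also be per-coefficient arrays, so you can constrain only some of the weights.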
Ali Ma

As of version 0.24, scikit-learn LinearRegression includes a similar argument positive, which does exactly that; from the docs:

positive : bool, default=False

When set to True, forces the coefficients to be positive. This option is only supported for dense arrays.

New in version 0.24.
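Usage is a drop-in change to a plain fit; a minimal sketch with made-up data (requires scikit-learn >= 0.24):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data with positive true coefficients
rng = np.random.default_rng(2)
X = rng.random((100, 3))
y = X @ np.array([2.0, 0.3, 1.0]) + 0.01 * rng.standard_normal(100)

# positive=True constrains all learned coefficients to be >= 0
reg = LinearRegression(positive=True).fit(X, y)
print(reg.coef_)  # every entry >= 0
```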

desertnaut