3

I am using the R interface to the Lawson-Hanson NNLS implementation, an algorithm for non-negative linear least squares that minimizes ||A x - b||^2 subject to the constraint that all elements of the vector x are ≥ 0. This works fine, but I would like to add further constraints. Of interest to me are:

  1. Also minimize "energy" of x: ||A x - b||^2 + m*||x||^2

  2. Minimize the "energy in the derivative of x": ||A x - b||^2 + m ||H x||^2, where H is the sum of the identity and a matrix with -1 on the first off-diagonal (a small construction sketch follows this list)

  3. Most generally, minimize ||A x - b||^2 + m ||H x - f||^2.
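For concreteness, here is one way the H described in item 2 could be built in R (a sketch only; the size of x and which off-diagonal carries the -1 are assumptions to adapt to your setup):

```r
p <- 6                              # length of x (illustrative)
H <- diag(p)                        # identity
H[cbind(1:(p - 1), 2:p)] <- -1      # -1 on the first superdiagonal
# each of the first p-1 entries of H %*% x is then x[i] - x[i+1],
# i.e. the negative of a first difference of x
```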

Is there a way to coax nnls into doing this by some clever restatement of problems 1–3 above? The reason I have hope for such a thing is a little throw-away comment in a paper by Whitall et al. (sorry for the paywall) that claims that "fortunately, NNLS can be adapted from the original form above to accommodate something like problem 3".

double-beep
DrSAR

1 Answer

3

I take it m is a scalar, right? Consider the simple case m = 1; you can generalize to other values of m by letting H* = sqrt(m) H and f* = sqrt(m) f and then using the solution method given below.

So now you're trying to minimise ||A x - b||^2 + ||H x - f||^2.

Let A* = [A' | H']' and b* = [b' | f']' (i.e. stack A on top of H and b on top of f), and solve the original problem of non-negative linear least squares on ||A* x - b*||^2 with the constraint that all elements of the vector x ≥ 0.
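A minimal R sketch of this stacking trick using the nnls package (A, b, H, f, and m below are illustrative placeholders, not data from the question):

```r
library(nnls)

set.seed(1)
n <- 50; p <- 10
A <- matrix(rnorm(n * p), n, p)       # illustrative design matrix
b <- rnorm(n)                         # illustrative response
m <- 0.5                              # scalar regularization weight

# H and f for the general problem 3; here H is the identity-plus-off-diagonal
# operator from the question and f is a zero target
H <- diag(p)
H[cbind(1:(p - 1), 2:p)] <- -1
f <- rep(0, p)

# Absorb m via H* = sqrt(m) H and f* = sqrt(m) f, then stack:
# minimising ||A x - b||^2 + m ||H x - f||^2 is the same as
# minimising ||A* x - b*||^2 subject to x >= 0
A_star <- rbind(A, sqrt(m) * H)
b_star <- c(b, sqrt(m) * f)

fit <- nnls(A_star, b_star)
x_hat <- fit$x                        # non-negative solution
```

Setting H to the identity and f to zero recovers problem 1, and using the difference-like H with f = 0 gives problem 2.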

Glen_b
  • Yes, m is scalar. Oh dear - slightly embarrassed - this was quite obvious and I had somehow confused myself with the dimensions of the matrices and vectors involved. I've put that into my call to nnls and it works quite nicely... – DrSAR Nov 13 '11 at 18:32
  • Does this work though, because I thought that nnls requires that all elements of the design matrix are positive, which for this A* matrix wouldn't be the case... It would work for solving case 1 above though. Maybe using quadprog instead of nnls would work though, as in https://stats.stackexchange.com/questions/136563/linear-regression-with-individual-constraints-in-r/415171#415171 for case 2? Case 1 is known as ridge regression, see https://stats.stackexchange.com/questions/69205/how-to-derive-the-ridge-regression-solution, and case 2 as "fused ridge". – Tom Wenseleers Jun 28 '19 at 10:14
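As this comment suggests, the same constrained problems can also be posed directly as quadratic programs with the quadprog package, which makes the non-negativity constraint explicit. A hedged sketch for case 2 (the data below are illustrative, and Dmat must be positive definite for solve.QP):

```r
library(quadprog)

set.seed(1)
n <- 50; p <- 10
A <- matrix(rnorm(n * p), n, p)            # illustrative design matrix
b <- rnorm(n)                              # illustrative response
m <- 0.5                                   # regularization weight
H <- diag(p); H[cbind(1:(p - 1), 2:p)] <- -1

# ||A x - b||^2 + m ||H x||^2 equals, up to a constant and a factor of 2,
# (1/2) x' Dmat x - dvec' x, which is what solve.QP minimises
Dmat <- crossprod(A) + m * crossprod(H)    # A'A + m H'H
dvec <- drop(crossprod(A, b))              # A'b
Amat <- diag(p)                            # constraints: t(Amat) %*% x >= bvec
bvec <- rep(0, p)                          # i.e. x >= 0

fit <- solve.QP(Dmat, dvec, Amat, bvec)
x_hat <- fit$solution                      # non-negative regularized solution
```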