I have the following simple program:

import numpy as np
import cvxpy as cp

np.random.seed(0)
n = 100  # dimension of the target vector y
i = 20   # number of columns of A (one weight per column)
y = np.random.rand(n)
A = np.random.rand(i, n).T  # n-by-i matrix whose columns define the feasible set
x = cp.Variable(n)
lmbd = cp.Variable(i)
# Project y onto the convex hull of the columns of A
objective = cp.Minimize(cp.sum_squares(x - y))
constraints = [x == A @ lmbd,        # x is a combination of the columns of A
               lmbd >= np.zeros(i),  # with nonnegative weights
               cp.sum(lmbd) == 1]    # that sum to one (a convex combination)
prob = cp.Problem(objective, constraints)
result = prob.solve(verbose=True)

I would like to know what happens under the hood. I know, for example, that the OSQP solver is being used, thanks to the variable `prob.solver_stats.solver_name`, and I can also choose another solver (e.g. `result = prob.solve(solver="CVXOPT", verbose=True)`).
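For reference, this is how I check which solvers are available and which one actually ran (only documented CVXPY calls):

print(cp.installed_solvers())         # solvers available in this installation
result = prob.solve(solver=cp.OSQP, verbose=True)
print(prob.solver_stats.solver_name)  # confirms which solver was dispatched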

I would like to know how the problem is treated. My impression is that it must be pre-processed, since it looks like a double problem: a quadratic minimization over x (towards y), plus a constraint-satisfaction part over lmbd. However, from the CVXOPT documentation it seems that a problem can only be passed in as a quadratic or linear program. I know how to use CVXOPT directly, but I would not know how to translate this problem into that form, whereas CVXPY does it with no trouble.

Thanks for the insight.

silgon
  • What's exactly the question? 1) Minimization vs. "constraint satisfaction"? Well... drop QPs and look at (the easier) linear programming theory. It is defined on a linear objective and linear equality constraints (we use slacks for inequalities). Nothing special. 2) A (convex) QP is a linearly constrained problem with a (convex) quadratic form as objective (+ a linear term). Same as LP, only the objective changes. The "constraint-satisfaction" part is still natural. Even the more complex [QCQP](https://en.wikipedia.org/wiki/Quadratically_constrained_quadratic_program) has these linear equalities – sascha Sep 06 '21 at 19:33
  • cvxopt's [QP](https://cvxopt.org/userguide/coneprog.html#cvxopt.solvers.qp) also has these linear equalities (`Ax=b`), which are your "constraint satisfaction" parts (see the hand-translation sketch after these comments). – sascha Sep 06 '21 at 19:36
  • Ohhhh... that's true, it's the slack variables; I forgot the basics. However, I would like to know how that transformation is done. How do I find out the call (with input parameters) that `CVXPY` makes to `CVXOPT`, `OSQP` or some other solver? That would be great for understanding what happens and for debugging. – silgon Sep 07 '21 at 18:52
  • Then read the code. Each solver has a wrapper, and recognizing the [cvxopt](https://github.com/cvxpy/cvxpy/blob/master/cvxpy/reductions/solvers/conic_solvers/cvxopt_conif.py) or [osqp](https://github.com/cvxpy/cvxpy/blob/master/cvxpy/reductions/solvers/qp_solvers/osqp_qpif.py) matrices there will be easy. The [conic transformations happening before](https://github.com/cvxpy/cvxpy/tree/master/cvxpy/reductions), which produce these matrices and also depend on the target solver (e.g. SOCP vs. QP for a norm_2 objective), will probably be less fun to read. – sascha Sep 07 '21 at 20:51
  • Before going all in on those reductions, I would recommend reading the academic papers first, especially the first two links in the [docs](https://www.cvxpy.org/citing/index.html). – sascha Sep 07 '21 at 20:56
  • thanks for the insight! =) – silgon Sep 08 '21 at 19:20
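To make the comments concrete, below is one possible hand-translation of the program to `cvxopt.solvers.qp`, which solves min (1/2) z'Pz + q'z subject to Gz <= h and A_eq z = b_eq. This is only a sketch: it eliminates x by substituting x = A @ lmbd, so it is not necessarily the exact reduction CVXPY performs.

import numpy as np
from cvxopt import matrix, solvers

np.random.seed(0)
n, i = 100, 20
y = np.random.rand(n)
A = np.random.rand(i, n).T  # n-by-i, as in the question

# Substituting x = A @ lmbd into ||x - y||^2 gives
#   lmbd' (A'A) lmbd - 2 (A'y)' lmbd + const,
# so in cvxopt's (1/2) z'Pz + q'z form:
P = matrix(2 * A.T @ A)
q = matrix(-2 * A.T @ y)
# lmbd >= 0 becomes G lmbd <= h with G = -I, h = 0:
G = matrix(-np.eye(i))
h = matrix(np.zeros(i))
# sum(lmbd) == 1 becomes the equality A_eq lmbd = b_eq:
A_eq = matrix(np.ones((1, i)))
b_eq = matrix(1.0)

sol = solvers.qp(P, q, G, h, A_eq, b_eq)
lmbd = np.array(sol["x"]).ravel()
x = A @ lmbd  # recover the projected point

The optimal x should match CVXPY's answer up to solver tolerance; the matrices CVXPY actually builds will generally differ, since it keeps both x and lmbd in its stacked variable instead of eliminating x.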

1 Answer


Since your problem is a least-squares problem, its matrices are probably only cast from least squares to quadratic-programming form and then passed to the QP solver as is. (This operation is simpler than the SOCP → QP conversion mentioned in the comments.)
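If you want to inspect exactly what CVXPY hands to a given solver without reading the wrapper sources, `Problem.get_problem_data` applies the reduction chain and returns the resulting data without solving. A minimal sketch (the exact dictionary keys vary between CVXPY versions, so print them rather than relying on specific names):

# Reuse `prob` from the question; nothing is solved here.
data, chain, inverse_data = prob.get_problem_data(cp.OSQP)
print([type(r).__name__ for r in chain.reductions])  # reductions CVXPY applied
print(data.keys())  # the matrices/vectors that would be passed to OSQP

Calling it with `cp.CVXOPT` instead shows the conic formulation that the CVXOPT wrapper receives, which lets you compare the two paths.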

Tastalian