
I need to solve the following least-squares problem, where A, B, and X are all matrices:

```matlab
cvx_begin quiet;
    variable X(len_x) nonnegative;
    minimize( norm(X * A - B, 2) );
    subject to
        X >= 0;
        for i = 1:size(X, 2)
            for j = i + 1:size(X, 2)
                transpose(X(:, i)) * X(:, j) <= epsilon;
            end
        end
cvx_end
```
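For reference, once the cross-column constraint is dropped, the remaining problem (with the Frobenius norm) is plain nonnegative least squares and decomposes row by row, since each row of X only multiplies A. A minimal sketch of that convex part, here in Python with SciPy rather than CVX (illustrative only; `nnls_rows` and the dimensions are my own names, not from the post):

```python
import numpy as np
from scipy.optimize import nnls

def nnls_rows(A, B):
    """Solve min_X ||X @ A - B||_F with X >= 0.

    Row i of X satisfies A.T @ x_i ~= B[i], an independent
    nonnegative least-squares problem per row."""
    m = B.shape[0]
    k = A.shape[0]
    X = np.zeros((m, k))
    for i in range(m):
        X[i], _ = nnls(A.T, B[i])
    return X

# Tiny consistent example with made-up dimensions
rng = np.random.default_rng(0)
A = rng.random((3, 5))
X_true = rng.random((4, 3))   # nonnegative by construction
B = X_true @ A
X = nnls_rows(A, B)
print(np.linalg.norm(X @ A - B))
```

The nonconvex `transpose(X(:,i)) * X(:,j) <= epsilon` constraint is exactly what this sketch leaves out; it would have to be handled separately (e.g., by a nonconvex solver or a post-processing step).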

I chose CVX because it doesn't require me to transform the problem into standard form. But with CVX, I get the following error:

```
Error using cvx/quad_form (line 230)
The second argument must be positive or negative semidefinite.
Error in  *  (line 261)
            [ z2, success ] = quad_form( xx, P, Q, R );
Error in sanaz_opt (line 28)
        transpose(X(:,i)) * X(:,j) <= 0.1
```

How can I solve this problem? I have also tried Gurobi and MATLAB's least-squares functions, but it seems they can't handle the `transpose(X(:,i)) * X(:,j)` constraint.

Erin
  • It's not about transpose in general. CVX does proof-of-convexity by construction (CVX can only be used to formulate convex problems). That's why there are limitations like ```The second argument must be positive or negative semidefinite.``` in these ```quad_form```s; where CVX can't reason about convexity, it will stop. As I'm not really a MATLAB user and the code seems to be incomplete, it's hard to reason about what you actually want to do and what you have to do to achieve it. – sascha Feb 10 '17 at 16:48
  • So if that's the case, I can solve this problem without the orthogonality constraint. But then how can I transform the result into an orthogonal solution while keeping the error minimized? – Erin Feb 10 '17 at 18:08
  • I actually tried this: solving without the constraint, then re-solving for every column so that each new column stays close to its previous value while being orthogonal to the other columns. But the error of the original problem becomes very large. – Erin Feb 10 '17 at 18:09
  • This is not least-squares. You used the spectral norm instead of the Frobenius norm. – Rodrigo de Azevedo Feb 25 '17 at 14:02
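As sascha's comment says, CVX rejects the constraint because `transpose(X(:,i)) * X(:,j)`, viewed as a quadratic form in the stacked vector `[x; y]`, has an indefinite matrix (eigenvalues of both signs), so it is neither convex nor concave. A quick numerical check (illustrative; `k` and `P` are my own names, not from the post):

```python
import numpy as np

k = 3                       # hypothetical column length
Z = np.zeros((k, k))
I = np.eye(k)
# x.T @ y == [x; y].T @ P @ [x; y] for this symmetric P
P = np.block([[Z, 0.5 * I],
              [0.5 * I, Z]])
eigs = np.linalg.eigvalsh(P)
print(eigs)                 # k eigenvalues at -0.5 and k at +0.5: indefinite
```

Because the form is indefinite, no DCP-compliant rewrite of `x.T * y <= epsilon` exists; that's why CVX's `quad_form` check fails rather than this being a transpose issue.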

0 Answers