
Short version: Why does unconstrained optimization with the constraint "hacked" into the objective function work, while constrained optimization doesn't?

Long version:

In MATLAB, I'm trying to find a p x n matrix U whose columns span a subspace that explains a target fraction of the variance in my m x p dataset X. The objective function is:

projection = U'*X';                           % n x m scores: data projected onto the columns of U
fracVar = sum(var(projection'))/totalVar;     % fraction of the total variance captured
fVal = (targetVar - fracVar)^2;               % squared distance from the target fraction

I tried using fmincon() with a nonlinear equality constraint that U must have orthogonal columns of unit norm:

temp = norm(U'*U - eye(size(U,2)));           % ceq: zero exactly when the columns of U are orthonormal

I cannot get this to work. It fails in different ways, including not getting anywhere near the desired fraction of variance and failing to satisfy the constraint.
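For concreteness, here is a minimal sketch of the fmincon setup I have in mind (the flattening of U into a vector for the solver, the local function names, and the interior-point choice are my own; X, totalVar, targetVar, p, and n are as above):

U0 = orth(randn(p, n));                       % feasible orthonormal starting point
obj = @(u) objFun(u, X, totalVar, targetVar, p, n);
con = @(u) conFun(u, p, n);
opts = optimoptions('fmincon', 'Algorithm', 'interior-point');
uOpt = fmincon(obj, U0(:), [], [], [], [], [], [], con, opts);
U = reshape(uOpt, p, n);

function fVal = objFun(u, X, totalVar, targetVar, p, n)
    U = reshape(u, p, n);                     % unflatten the parameter vector
    projection = U'*X';
    fracVar = sum(var(projection'))/totalVar;
    fVal = (targetVar - fracVar)^2;
end

function [c, ceq] = conFun(u, p, n)
    U = reshape(u, p, n);
    c = [];                                   % no inequality constraints
    ceq = norm(U'*U - eye(n));                % orthonormality as an equality constraint
end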

However, if I use fminsearch() instead and change the objective to:

projection = orth(U)'*X';                     % orthonormalize U before projecting
fracVar = sum(var(projection'))/totalVar;     % fraction of the total variance captured
fVal = abs(targetVar - fracVar);              % absolute distance from the target fraction

and then use orth() again on the result, it seems to work perfectly. I thought this would be the "wrong" way as I'm basically hacking the constraint into the objective.
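For comparison, a minimal sketch of that fminsearch version (same assumptions and naming conventions as the sketch above):

obj = @(u) hackObj(u, X, totalVar, targetVar, p, n);
uOpt = fminsearch(obj, randn(p*n, 1));        % unconstrained search over a flat vector
U = orth(reshape(uOpt, p, n));                % project the result back onto the constraint set

function fVal = hackObj(u, X, totalVar, targetVar, p, n)
    U = reshape(u, p, n);
    projection = orth(U)'*X';                 % constraint "hacked" into the objective
    fracVar = sum(var(projection'))/totalVar;
    fVal = abs(targetVar - fracVar);
end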

What is an explanation for this? Is there a "correct" approach to this problem?

Evan
  • Are you aware of the behavior of norm on a matrix? It is not a column-wise norm. – Mendi Barel Apr 16 '17 at 21:36
  • @MendiBarel Yes, it returns the maximum singular value, which should be 0 in this case, no? (See the snippet after these comments.) – Evan Apr 16 '17 at 21:45
  • It's natural that when you replace the constraint with a cost term, the search region becomes wider and smoother on the path to the global minimum. When you use a hard constraint you usually also need a good initial guess to avoid local minima. Use a hard constraint only when the constraint would be violated at the unconstrained global minimum. – Mendi Barel Apr 16 '17 at 22:09
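To illustrate the point raised in the comments: for a matrix argument, norm returns the largest singular value, so it is zero exactly when U'*U - eye(n) is the zero matrix. A quick check (the matrices here are made up for illustration):

A = [3 0; 0 1];
norm(A)                                       % 3: the largest singular value, max(svd(A))
U = orth(randn(4, 2));
norm(U'*U - eye(2))                           % ~0, since the columns of U are orthonormal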
