
I have an SVM problem of the form:

minimize ||Q||_F subject to l_i (x_i^T Q x_i) >= 1

where Q is a square matrix, the x_i's are the training examples and the l_i's are the labels for the training examples.

Is there a way to solve this using existing optimization tools for MATLAB, such as a built-in optimization routine, CVX, libsvm, or another optimization package?

Rodrigo de Azevedo
BMillikan
    I'm afraid this question is not a good fit for Stack Overflow. First, it's probably unclear for most programmers around. Then, it's either too broad (if you'd want us to write the code) or a library/tool recommendation question (which is also off-topic, as it tends to attract spammy and low-quality answers). – Andras Deak -- Слава Україні Jun 13 '16 at 22:17

1 Answer


Exploiting the cyclic property of the trace operator, $\mathrm x_i^T \mathrm Q \, \mathrm x_i = \mbox{tr} (\mathrm x_i \mathrm x_i^T \mathrm Q)$, we obtain the following

$$\begin{array}{ll} \text{minimize} & \| \mathrm Q \|_F\\ \text{subject to} & \mbox{tr} (l_1  \mathrm x_1  \mathrm x_1^T  \mathrm Q) \geq 1\\ &\qquad\vdots\\ & \mbox{tr} (l_m  \mathrm x_m  \mathrm x_m^T  \mathrm Q) \geq 1\end{array}$$

which should be easy to translate to CVX. Squaring the norm and vectorizing the matrix, we obtain an inequality-constrained quadratic program: $\| \mathrm Q \|_F^2 = \mbox{vec} (\mathrm Q)^T \mbox{vec} (\mathrm Q)$, and since $\mbox{tr} (\mathrm A \mathrm Q) = \mbox{vec} (\mathrm A^T)^T \mbox{vec} (\mathrm Q)$, each constraint is linear in $\mbox{vec} (\mathrm Q)$. Instead of using CVX, one can also use quadprog.
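A minimal CVX sketch of this formulation (untested; the names `X`, `l`, `n`, `m` are my own assumptions, not from the question — the examples are taken to be the columns of an n-by-m matrix `X` and the labels an m-by-1 vector `l`):

```matlab
% Sketch: minimize ||Q||_F  subject to  l_i * tr(x_i x_i' Q) >= 1.
% Assumes CVX is installed; X is n-by-m with x_i = X(:,i), l is m-by-1.
cvx_begin
    variable Q(n, n)
    minimize( norm(Q, 'fro') )
    subject to
        for i = 1 : m
            l(i) * trace( X(:,i) * X(:,i)' * Q ) >= 1;
        end
cvx_end
```

Since each constraint is affine in Q, CVX accepts it directly; writing the constraint as `l(i) * (X(:,i)' * Q * X(:,i)) >= 1` would work equally well.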


LaTeX code:

$$\begin{array}{ll} \text{minimize} & \| \mathrm Q \|_F\\ \text{subject to} & \mbox{tr} (l_1  \mathrm x_1  \mathrm x_1^T  \mathrm Q) \geq 1\\ &\qquad\vdots\\ & \mbox{tr} (l_m  \mathrm x_m  \mathrm x_m^T  \mathrm Q) \geq 1\end{array}$$
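The quadprog route can be sketched as follows (untested; again the names `X`, `l`, `n`, `m` are assumptions for illustration):

```matlab
% Sketch: solve the vectorized QP  min 0.5*q'*H*q  s.t.  A*q <= b,  q = vec(Q).
% ||Q||_F^2 = q'*q, and l_i * tr(x_i x_i' Q) = l_i * vec(x_i x_i')' * q,
% so the constraints are linear in q. quadprog expects <=, hence the sign flip.
[n, m] = size(X);                 % x_i = X(:,i), labels in l (m-by-1)
A = zeros(m, n^2);
for i = 1 : m
    M = X(:,i) * X(:,i)';         % x_i x_i^T (symmetric)
    A(i, :) = l(i) * M(:)';       % l_i vec(x_i x_i^T)^T
end
H = 2 * eye(n^2);                 % objective: q'*q = ||Q||_F^2
f = zeros(n^2, 1);
q = quadprog(H, f, -A, -ones(m, 1));
Q = reshape(q, n, n);
```

Minimizing the squared norm gives the same minimizer as minimizing the norm itself, which is why the QP objective uses $\| \mathrm Q \|_F^2$.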
Rodrigo de Azevedo