I have an SVM problem of the form:

$$\min_Q \|Q\|_F \quad \text{subject to} \quad l_i \,(x_i^T Q x_i) \ge 1 \;\; \forall i,$$

where $Q$ is a square matrix, the $x_i$ are the training examples, and the $l_i$ are the labels for the training examples.
Is there a way to solve this with existing optimization tools for MATLAB, e.g. a built-in optimization routine, CVX, libsvm, or another optimization package?
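
For context, note that each constraint is affine in the entries of $Q$, since $x_i^T Q x_i = \operatorname{tr}(Q\, x_i x_i^T)$, and $\|Q\|_F$ is convex, so the whole problem is convex. Below is a minimal sketch of how I imagine it might be expressed in CVX; everything here is placeholder data I made up (the dimensions `n`, `d`, the matrix `X`, and the labels `l`, which I generate from a random $Q_0$ just so the constraints are guaranteed feasible):

```matlab
% Minimal CVX sketch (assumes CVX for MATLAB is installed; all data
% below is synthetic and only illustrates the problem shape).
n = 50;  d = 5;
X  = randn(n, d);                  % rows are the training examples x_i
Q0 = randn(d);                     % hidden matrix used to generate labels,
l  = sign(sum((X * Q0) .* X, 2));  % so that a feasible Q is known to exist

cvx_begin
    variable Q(d, d)
    minimize( norm(Q, 'fro') )     % Frobenius-norm objective
    subject to
        % sum((X*Q).*X, 2) stacks x_i' * Q * x_i for all i;
        % these constraints are affine in Q, so CVX accepts them.
        l .* sum((X * Q) .* X, 2) >= 1;
cvx_end
```

If something like this is valid, `cvx_optval` would hold the optimal Frobenius norm and `Q` the minimizer, and CVX would presumably report infeasibility when no such $Q$ exists for the given data. Is this the right approach, or is there a better-suited tool?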