
I am using cvxpy to solve a quadratic optimization problem. The quadratic problem is part of another optimization routine that iteratively feeds it the P matrix and q vector. The issue is that if my number of variables is less than 20, the code runs fine; anything above 20 leads to a DCP error, which is surprising as the model is very simple and convex. I checked the matrix (denoted var50 in the error message below) for which the issue occurs, and it has positive eigenvalues and is symmetric.

Code

import numpy as np
import cvxpy as cp

def getConvexProblem(numSamples,K_XX,K_XX_Diag,c_parameter):
    beta = cp.Variable(numSamples) 
    #print(f'Ref count of beta: {sys.getrefcount(beta)}')
    
    objective = cp.Minimize(cp.quad_form(beta,K_XX) - K_XX_Diag@beta)
    constraints = [cp.sum(beta)==1.0,0 <= beta, beta<= c_parameter]
    prob = cp.Problem(objective, constraints)
    try:
        result = prob.solve()
    except:
        print(f'All eigen values greater than 1e-6: {np.all(np.linalg.eigvals(K_XX))>1e-6}')
        print(f'K matrix symmetric: {np.all(K_XX == K_XX.T)}')
        print(beta.value)
        print(bazinga)  #to end the code as I need to get the beta values in each iteration
 
    #Get the support vectors
    betaValues = beta.value.copy() 
    return betaValues

K_XX = np.load('KMatrix.npy')
c_parameter = 1.0 #This parameter decides if a point is in hyperspehere or not
shape_parameter = 1.0 
numSamples = K_XX.shape[0]
K_XX_Diag = np.diag(K_XX)

for i in range(200):
    print(i)
    betaValues = getConvexProblem(numSamples,K_XX,K_XX_Diag,c_parameter)
    print("Successful")

The above code runs successfully for the first iteration, but at i=1 it raises the following DCP error:

cvxpy.error.DCPError: Problem does not follow DCP rules. Specifically:
The objective is not DCP. Its following subexpressions are not:
QuadForm(var50, [[1.         1.         0.71649747 0.71649747 0.89519758 0.89519758
  0.63046307 0.91988885 0.79853532 0.79853532 0.7002468  0.7002468
  0.72852208 0.72852208 0.77480099 0.77480099 0.83594325 0.83594325
  0.98437813 0.91988885 0.8042854  0.8042854  0.80019135 0.80019135]
 [1.         1.         0.71649747 0.71649747 0.89519758 0.89519758
  0.63046307 0.91988885 0.79853532 0.79853532 0.7002468  0.7002468
  0.72852208 0.72852208 0.77480099 0.77480099 0.83594325 0.83594325
  0.98437813 0.91988885 0.8042854  0.8042854  0.80019135 0.80019135]
 [0.71649747 0.71649747 1.         1.         0.69744935 0.69744935
  0.62395897 0.74792259 0.62951852 0.62951852 0.57325859 0.57325859
  0.597729   0.597729   0.64419811 0.64419811 0.70605087 0.70605087
  0.72322202 0.74792259 0.58615326 0.58615326 0.82901318 0.82901318]
 [0.71649747 0.71649747 1.         1.         0.69744935 0.69744935
  0.62395897 0.74792259 0.62951852 0.62951852 0.57325859 0.57325859
  0.597729   0.597729   0.64419811 0.64419811 0.70605087 0.70605087
  0.72322202 0.74792259 0.58615326 0.58615326 0.82901318 0.82901318]
 [0.89519758 0.89519758 0.69744935 0.69744935 1.         1.
  0.56701987 0.87509903 0.80303374 0.80303374 0.63364027 0.63364027
  0.66131871 0.66131871 0.69600847 0.69600847 0.81028339 0.81028339
  0.88705628 0.87509903 0.75688561 0.75688561 0.75758664 0.75758664]
 [0.89519758 0.89519758 0.69744935 0.69744935 1.         1.
  0.56701987 0.87509903 0.80303374 0.80303374 0.63364027 0.63364027
  0.66131871 0.66131871 0.69600847 0.69600847 0.81028339 0.81028339
  0.88705628 0.87509903 0.75688561 0.75688561 0.75758664 0.75758664]
 [0.63046307 0.63046307 0.62395897 0.62395897 0.56701987 0.56701987
  1.         0.62186271 0.54185608 0.54185608 0.70918568 0.70918568
  0.69541603 0.69541603 0.74549181 0.74549181 0.59475479 0.59475479
  0.63808876 0.62186271 0.62083801 0.62083801 0.6539573  0.6539573 ]
 [0.91988885 0.91988885 0.74792259 0.74792259 0.87509903 0.87509903
  0.62186271 1.         0.82338712 0.82338712 0.69007471 0.69007471
  0.72570357 0.72570357 0.76416144 0.76416144 0.89388049 0.89388049
  0.9159043  1.         0.75197911 0.75197911 0.85457952 0.85457952]
 [0.79853532 0.79853532 0.62951852 0.62951852 0.80303374 0.80303374
  0.54185608 0.82338712 1.         1.         0.67410508 0.67410508
  0.71366685 0.71366685 0.70978169 0.70978169 0.87989017 0.87989017
  0.78713511 0.82338712 0.73269949 0.73269949 0.73722156 0.73722156]
 [0.79853532 0.79853532 0.62951852 0.62951852 0.80303374 0.80303374
  0.54185608 0.82338712 1.         1.         0.67410508 0.67410508
  0.71366685 0.71366685 0.70978169 0.70978169 0.87989017 0.87989017
  0.78713511 0.82338712 0.73269949 0.73269949 0.73722156 0.73722156]
 [0.7002468  0.7002468  0.57325859 0.57325859 0.63364027 0.63364027
  0.70918568 0.69007471 0.67410508 0.67410508 1.         1.
  0.93619563 0.93619563 0.88565239 0.88565239 0.69711228 0.69711228
  0.70067003 0.69007471 0.75401201 0.75401201 0.67006468 0.67006468]
 [0.7002468  0.7002468  0.57325859 0.57325859 0.63364027 0.63364027
  0.70918568 0.69007471 0.67410508 0.67410508 1.         1.
  0.93619563 0.93619563 0.88565239 0.88565239 0.69711228 0.69711228
  0.70067003 0.69007471 0.75401201 0.75401201 0.67006468 0.67006468]
 [0.72852208 0.72852208 0.597729   0.597729   0.66131871 0.66131871
  0.69541603 0.72570357 0.71366685 0.71366685 0.93619563 0.93619563
  1.         1.         0.90306711 0.90306711 0.74109946 0.74109946
  0.72798759 0.72570357 0.75908715 0.75908715 0.70561867 0.70561867]
 [0.72852208 0.72852208 0.597729   0.597729   0.66131871 0.66131871
  0.69541603 0.72570357 0.71366685 0.71366685 0.93619563 0.93619563
  1.         1.         0.90306711 0.90306711 0.74109946 0.74109946
  0.72798759 0.72570357 0.75908715 0.75908715 0.70561867 0.70561867]
 [0.77480099 0.77480099 0.64419811 0.64419811 0.69600847 0.69600847
  0.74549181 0.76416144 0.70978169 0.70978169 0.88565239 0.88565239
  0.90306711 0.90306711 1.         1.         0.75382039 0.75382039
  0.77761597 0.76416144 0.78370395 0.78370395 0.74564304 0.74564304]
 [0.77480099 0.77480099 0.64419811 0.64419811 0.69600847 0.69600847
  0.74549181 0.76416144 0.70978169 0.70978169 0.88565239 0.88565239
  0.90306711 0.90306711 1.         1.         0.75382039 0.75382039
  0.77761597 0.76416144 0.78370395 0.78370395 0.74564304 0.74564304]
 [0.83594325 0.83594325 0.70605087 0.70605087 0.81028339 0.81028339
  0.59475479 0.89388049 0.87989017 0.87989017 0.69711228 0.69711228
  0.74109946 0.74109946 0.75382039 0.75382039 1.         1.
  0.82877717 0.89388049 0.72045674 0.72045674 0.83626163 0.83626163]
 [0.83594325 0.83594325 0.70605087 0.70605087 0.81028339 0.81028339
  0.59475479 0.89388049 0.87989017 0.87989017 0.69711228 0.69711228
  0.74109946 0.74109946 0.75382039 0.75382039 1.         1.
  0.82877717 0.89388049 0.72045674 0.72045674 0.83626163 0.83626163]
 [0.98437813 0.98437813 0.72322202 0.72322202 0.88705628 0.88705628
  0.63808876 0.9159043  0.78713511 0.78713511 0.70067003 0.70067003
  0.72798759 0.72798759 0.77761597 0.77761597 0.82877717 0.82877717
  1.         0.9159043  0.80139615 0.80139615 0.80388186 0.80388186]
 [0.91988885 0.91988885 0.74792259 0.74792259 0.87509903 0.87509903
  0.62186271 1.         0.82338712 0.82338712 0.69007471 0.69007471
  0.72570357 0.72570357 0.76416144 0.76416144 0.89388049 0.89388049
  0.9159043  1.         0.75197911 0.75197911 0.85457952 0.85457952]
 [0.8042854  0.8042854  0.58615326 0.58615326 0.75688561 0.75688561
  0.62083801 0.75197911 0.73269949 0.73269949 0.75401201 0.75401201
  0.75908715 0.75908715 0.78370395 0.78370395 0.72045674 0.72045674
  0.80139615 0.75197911 1.         1.         0.66411597 0.66411597]
 [0.8042854  0.8042854  0.58615326 0.58615326 0.75688561 0.75688561
  0.62083801 0.75197911 0.73269949 0.73269949 0.75401201 0.75401201
  0.75908715 0.75908715 0.78370395 0.78370395 0.72045674 0.72045674
  0.80139615 0.75197911 1.         1.         0.66411597 0.66411597]
 [0.80019135 0.80019135 0.82901318 0.82901318 0.75758664 0.75758664
  0.6539573  0.85457952 0.73722156 0.73722156 0.67006468 0.67006468
  0.70561867 0.70561867 0.74564304 0.74564304 0.83626163 0.83626163
  0.80388186 0.85457952 0.66411597 0.66411597 1.         1.        ]
 [0.80019135 0.80019135 0.82901318 0.82901318 0.75758664 0.75758664
  0.6539573  0.85457952 0.73722156 0.73722156 0.67006468 0.67006468
  0.70561867 0.70561867 0.74564304 0.74564304 0.83626163 0.83626163
  0.80388186 0.85457952 0.66411597 0.66411597 1.         1.        ]])

user1443613

1 Answer


Maybe it has a slightly negative eigenvalue after all (at least up to numerical precision)? Your code has a typo in the check. Replace

np.all(np.linalg.eigvals(K_XX))>1e-6

with

np.all(np.linalg.eigvals(K_XX)>1e-6)

and check again.
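For context (not part of the original answer), here is a minimal sketch of why the misplaced parenthesis makes the original check always pass; the eigenvalue array is hypothetical:

```python
import numpy as np

# Hypothetical eigenvalues: two clearly positive values plus one tiny
# negative value of the kind floating-point round-off produces.
eigvals = np.array([2.0, 1.0, -1e-18])

# Buggy form: np.all(eigvals) first reduces to a single boolean
# (True, since every entry is nonzero), and True > 1e-6 is True.
buggy = np.all(eigvals) > 1e-6

# Intended form: compare each eigenvalue to the tolerance, then reduce.
correct = np.all(eigvals > 1e-6)

print(buggy, correct)  # the two forms disagree
```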

  • Thanks for pointing it out. I see there are a couple of eigenvalues of the order 10^{-18} that are negative. Not sure if they can be treated as zero. But my confusion is that cvxpy solved this problem the first time and threw the DCP error on the second iteration. Is there any way around this? I was thinking of adding a small term (1e-6) to the diagonal entries of the K_XX matrix so that all eigenvalues would be positive. Mathematically, is that allowed? K_XX is a squared-exponential kernel – user1443613 Nov 04 '22 at 23:35
  • Tested the above approach: whenever I got a DCP error, I added 1e-6 to the diagonal entries of K_XX and solved the problem again. This time it ran for all the iterations. I also tested it with a higher number of variables and didn't get any DCP error. – user1443613 Nov 05 '22 at 00:21
  • I cannot reproduce the issue of the 0-th iteration solving and the subsequent one not solving. For me (with my generic data) it fails right away. – Michal Adamaszek Nov 05 '22 at 15:28
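The diagonal "jitter" workaround described in the comments can be sketched as follows; the helper name `add_jitter` and the 2×2 example matrix are illustrative, not from the original code:

```python
import numpy as np

def add_jitter(K, jitter=1e-6):
    """Return K with `jitter` added to its diagonal.

    For symmetric K this shifts every eigenvalue up by exactly `jitter`,
    lifting tiny negative round-off eigenvalues back above zero.
    """
    return K + jitter * np.eye(K.shape[0])

# Nearly rank-deficient symmetric matrix whose smallest eigenvalue
# comes out slightly negative (its determinant is about -1e-12).
K = np.array([[1.0, 1.0],
              [1.0, 1.0 - 1e-12]])

print(np.linalg.eigvalsh(K).min())              # slightly negative
print(np.linalg.eigvalsh(add_jitter(K)).min())  # positive after jitter
```

Adding a small multiple of the identity to a kernel matrix is standard Tikhonov-style regularization: it perturbs the objective slightly but guarantees the quadratic form is positive definite, so cvxpy's PSD check for `quad_form` passes.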