
Beginner-style, general question regarding convex optimization. While learning cvxpy specifically, and convex optimization in general, I tried a basic (imaginary) geometry optimization problem (hopefully convex): given two points in the 2D unit square (a line segment), find optimal values for [x1,y1] and [x2,y2] that satisfy bounding-box and Euclidean-distance constraints. It seems the constraint that places a lower bound on the distance is not convex. See the example Python/cvxpy code below.

My question: given how common and fundamental Euclidean distance is, is it possible to express it as a convex constraint within a convex optimization problem? Before I dig deeper into SOC, PSD, and other cone formulations: can they even solve this problem, or would non-convex optimization packages have to be used? (There are probably other kinds of algorithms for this imaginary problem, but I am interested here only in the optimization approach.)
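
A quick numeric sketch of why the lower-bound constraint cannot be convex: the set of point pairs with `||p - q|| >= 0.1` is the complement of a ball in the joint variable space, and convexity fails on a simple counterexample (two feasible pairs whose average is infeasible). The specific coordinates below are just illustrative values I picked:

```python
import numpy as np

# Two feasible pairs, each with distance 0.2 >= 0.1:
p1, q1 = np.array([0.0, 0.0]), np.array([0.2, 0.0])
p2, q2 = np.array([0.2, 0.0]), np.array([0.0, 0.0])

# Average the two pairs (a convex combination with weight 1/2):
pm, qm = (p1 + p2) / 2, (q1 + q2) / 2

# The averaged pair collapses to distance 0 < 0.1, so the
# feasible set {(p, q) : ||p - q|| >= 0.1} is not convex.
print(np.linalg.norm(pm - qm))  # 0.0
```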

I just need some guiding tips to steer me in the right direction before I start on papers like 'Convex Optimization - Euclidean Distance Geometry'. I have checked the cvxpy examples and source code and haven't found anything similar.

import cvxpy as cvx
from cvxpy import Minimize, Problem
from cvxpy import Variable as VarCvx  # alias used in the test case below

def setUp(self):
    self.xy1 = VarCvx(shape=(2,), name='xy1', nonneg=True)
    self.xy2 = VarCvx(shape=(2,), name='xy2', nonneg=True)

def testMinimizeNorm(self):
    """
    cvxpy.error.DCPError: Problem does not follow DCP rules. Specifically:
        The following constraints are not DCP:
        0.1 <= Pnorm(xy1 + -xy2, 2) , because the following subexpressions are not:
        |--  0.1 <= Pnorm(xy1 + -xy2, 2)
    """
    constraints = [
        self.xy1 >= 0,  # redundant given nonneg=True, kept for clarity
        self.xy2 >= 0,
        self.xy1 <= 10,
        self.xy2 <= 10,
        cvx.norm(self.xy1 - self.xy2, 2) <= 0.9,  # convex <= constant: works fine
        cvx.norm(self.xy1 - self.xy2, 2) >= 0.1,  # convex >= constant: doesn't work
    ]
    objective = Minimize(cvx.norm(self.xy1 - self.xy2, 2))
    problem = Problem(objective, constraints)
    self.solveAndPrint(problem)
