
I'm trying to use the `minimize` function from SciPy to do curve fitting.

I have a 4-parameter equation with bound constraints on each parameter, and I'd like to add the constraint that param1 is different from param2 (and, if possible, that param2 > param1). From what I understood, I have to use `scipy.optimize.minimize` with bounds and constraints. But the constraints can only be equalities (to 0) or inequalities (non-negative, >= 0).

So I set the constraint function to this (the params are stored in an array):

```python
cons = {"type": "ineq", "fun": lambda x: x[1] - x[0]}
```

But when `x[1]` and `x[0]` are equal, the constraint is still satisfied, so it doesn't exclude the case I want to rule out. Actually, I don't really understand why there is no "strictly positive" constraint...

A workaround could be to add some very small residue to this `x[1] - x[0]`, but isn't there a cleaner solution?
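To make the residue workaround concrete, here is a minimal sketch; the model, data, starting point, and epsilon value are all made up for illustration, not taken from the question:

```python
import numpy as np
from scipy.optimize import minimize

EPS = 1e-6  # small residue: x[1] - x[0] >= EPS effectively enforces x[1] > x[0]

def objective(params, t, y):
    # Hypothetical 4-parameter model (sum of two exponentials), as a stand-in
    # for the real equation; returns the sum of squared residuals.
    a, b, c, d = params
    return np.sum((a * np.exp(-b * t) + c * np.exp(-d * t) - y) ** 2)

# Synthetic data generated from parameters that satisfy param2 > param1
t = np.linspace(0.0, 5.0, 50)
y = 0.5 * np.exp(-2.0 * t) + 1.0 * np.exp(-0.3 * t)

bounds = [(0.0, 10.0)] * 4
# Shift the inequality by EPS so equality of the two params is excluded
cons = {"type": "ineq", "fun": lambda x: x[1] - x[0] - EPS}

res = minimize(objective, x0=[0.4, 1.5, 0.8, 0.5], args=(t, y),
               method="SLSQP", bounds=bounds, constraints=[cons])
```

At termination `res.x[1] - res.x[0]` is at least `EPS` (up to the solver's constraint tolerance), so the two parameters cannot collapse onto each other.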

Cheers

mbahin
    In numerical computing, there is no difference between `>` and `>=`. For example, `x[0] = x[1]+10**-100` implies `x[0]>x[1]`. However, under machine precision (which can only handle differences around the order of `10**-15`), `x[0] = x[1]+10**-100` is interpreted as `x[0]==x[1]`. If you really want a strict inequality, you must introduce a (small) residue. – Stelios Aug 09 '17 at 08:46
    *"Actually I don't really understand why there is no "strictly positive" constraint..."* Think about this example: for a scalar x, minimize x**2, subject to the constraint x > 0. What is the solution? – Warren Weckesser Aug 09 '17 at 18:08
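Both comments can be sketched numerically; the choice of SLSQP and the epsilon value below are arbitrary, for illustration only:

```python
from scipy.optimize import minimize

# Stelios's point: below machine precision, a "strict" margin simply vanishes,
# because the difference is lost to double-precision rounding.
assert 1.0 + 1e-100 == 1.0

# Warren's point: "minimize x**2 subject to x > 0" has no solution, because the
# infimum 0 is attained only at x = 0, which a strict inequality would exclude.
# Adding a small residue eps makes the problem well-posed: the minimizer is x = eps.
eps = 1e-3
res = minimize(lambda x: x[0] ** 2, x0=[1.0], method="SLSQP",
               constraints=[{"type": "ineq", "fun": lambda x: x[0] - eps}])
# res.x[0] ends up close to eps rather than at 0
```

This is why solvers only offer `>=`: a strict `>` constraint would define a problem whose optimum may not exist.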

0 Answers