
I'm using SciPy for optimization and the method SLSQP seems to ignore my constraints.

Specifically, I want x[3] and x[4] to be in the range [0, 1].

I'm getting the message: 'Inequality constraints incompatible'

Here are the results of the execution, followed by example code (which uses a dummy function):

  status: 4
 success: False
    njev: 2
    nfev: 24
     fun: 0.11923608071680103
       x: array([-10993.4278558 , -19570.77080806, -23495.15914299, -26531.4862831 ,
                   4679.97660534])
 message: 'Inequality constraints incompatible'
     jac: array([ 12548372.4766904 ,  12967696.88362279,  39928956.72239509,
                  -9224613.99092537,   3954696.30747453,         0.        ])
     nit: 2

Here is my code:

from random import random
from scipy.optimize import minimize

def func(x):
    """Dummy function to optimize."""
    print('x' + str(x))
    return random()

my_constraints = ({'type': 'ineq', 'fun': lambda x: 1 - x[3] - x[4]},
                  {'type': 'ineq', 'fun': lambda x: x[3]},
                  {'type': 'ineq', 'fun': lambda x: x[4]},
                  {'type': 'ineq', 'fun': lambda x: 1 - x[4]},
                  {'type': 'ineq', 'fun': lambda x: 1 - x[3]})

minimize(func, [57.9499, -18.2736, 1.1664, 0.0000, 0.0765],
         method='SLSQP', constraints=my_constraints)
EDIT - The problem persists even when removing the first constraint.

The problem also persists when I try to use the bounds argument instead, i.e.,

bounds_pairs = [(None, None), (None, None), (None, None), (0, 1), (0, 1)]
minimize(f, initial_guess, method=method_name,
         bounds=bounds_pairs, constraints=non_negative_prob)
Zahy
  • Why are you using a nonsensical function to optimize? If the function just returns `random()` (and in particular, doesn't even return consistent results for the same input), of course SciPy is going to get confused. – user2357112 Nov 03 '15 at 23:56
  • For the sake of the example. This problem occurs regardless of the function I use. I don't think that's the problem @user2357112 – Zahy Nov 04 '15 at 00:23
  • At least in the scipy docs, when using a lambda they take pains to return an np.array(), like: 'fun': lambda x: np.array([x[0]**3 - x[1]]). – Jon Custer Nov 04 '15 at 15:18
  • Thanks @JonCuster but this is not the case too. The problem persists whether I use np.array or just a sequence. I tried to have the minimal example that reproduces this issue. – Zahy Nov 04 '15 at 17:17
  • It was worth a shot. OK, let's look at the constraints. If you really just want what you stated, i.e. x[3] and x[4] in [0, 1], why do you have the first constraint? You could just use the bounds option instead for that. – Jon Custer Nov 04 '15 at 17:33
  • @JonCuster I actually need that constraint too. Problem persists when I remove the first constraint. I also tried the bounds variable before with no luck! – Zahy Nov 04 '15 at 18:26
  • So, thinking about it a bit more, one issue could be that your first constraint reduces the dimensionality of the problem - you could equally well replace x[4] with 1-x[3], and limit x[3] to [0,1]. This reduces the dimensionality of the problem, reduces the constraints to 3, and probably makes for a much happier algorithm. – Jon Custer Nov 05 '15 at 14:22
  • I checked it. It's not that either. @JonCuster – Zahy Nov 10 '15 at 18:04
  • Well, I'm pretty much out of ideas on my end. Sorry. – Jon Custer Nov 10 '15 at 18:06
  • @Zahy I don't know if you are still active on SO, but I've done some digging on this question as it comes up on the "featured problems" on the SciPy tag. Have a look. – chthonicdaemon May 27 '18 at 14:07

1 Answer


I know this is a very old question, but I was intrigued.

When does it happen?

This problem occurs when the optimisation function is not reliably differentiable. If you use a nice smooth function like this:

import numpy

opt = numpy.array([2, 2, 2, 2, 2])

def func(x):
    return sum((x - opt)**2)

The problem goes away.
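As a runnable check (my own reduction of the question's setup, keeping only the three essential inequality constraints), SLSQP handles the same kind of constraint set without complaint once the objective is smooth and deterministic:

```python
import numpy as np
from scipy.optimize import minimize

opt = np.array([2, 2, 2, 2, 2])

def smooth_func(x):
    # Smooth, deterministic objective: squared distance from `opt`.
    return np.sum((x - opt) ** 2)

constraints = ({'type': 'ineq', 'fun': lambda x: 1 - x[3] - x[4]},
               {'type': 'ineq', 'fun': lambda x: x[3]},
               {'type': 'ineq', 'fun': lambda x: x[4]})

res = minimize(smooth_func, [57.9499, -18.2736, 1.1664, 0.0, 0.0765],
               method='SLSQP', constraints=constraints)
print(res.status, res.message)
```

By symmetry the constrained optimum puts x[3] and x[4] at 0.5 each (the x[3] + x[4] <= 1 constraint is active), with the unconstrained coordinates at 2.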

How do I impose hard constraints?

Note that none of the constrained algorithms in scipy.optimize.minimize guarantee that the function will never be evaluated outside the feasible region. If that is a requirement for you, you should rather use transformations. For instance, to ensure that no negative value of x[3] is ever used, optimise over an unconstrained variable and apply the transformation x3_real = 10^x[3]. That way x[3] can take any value, but the quantity you actually use is never negative.
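As an illustrative sketch of the transformation idea (the objective and variable names here are my own, not from the question), a logistic sigmoid maps an unconstrained variable into (0, 1), so a two-sided bound can never be violated no matter what the optimiser tries:

```python
import numpy as np
from scipy.optimize import minimize

def expit(z):
    # Logistic sigmoid: maps any real number into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def objective(x):
    # The optimiser works with unconstrained internals x[3], x[4];
    # they are squashed into (0, 1) before the "real" objective sees them,
    # so the hard bounds hold at every evaluation.
    x3, x4 = expit(x[3]), expit(x[4])
    return (x3 - 0.2) ** 2 + (x4 - 0.7) ** 2 + np.sum(x[:3] ** 2)

res = minimize(objective, np.zeros(5), method='SLSQP')
x3, x4 = expit(res.x[3]), expit(res.x[4])
```

The recovered values x3 and x4 land inside (0, 1) by construction, and the optimisation itself is completely unconstrained.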

Deeper analysis

Investigating the Fortran code for slsqp yields the following insights into when this error occurs. The routine returns a MODE variable, which can take on these values:

C*        MODE = -1: GRADIENT EVALUATION, (G&A)                        *
C*                0: ON ENTRY: INITIALIZATION, (F,G,C&A)               *
C*                   ON EXIT : REQUIRED ACCURACY FOR SOLUTION OBTAINED *
C*                1: FUNCTION EVALUATION, (F&C)                        *
C*                                                                     *
C*                   FAILURE MODES:                                    *
C*                2: NUMBER OF EQUALITY CONTRAINTS LARGER THAN N       *
C*                3: MORE THAN 3*N ITERATIONS IN LSQ SUBPROBLEM        *
C*                4: INEQUALITY CONSTRAINTS INCOMPATIBLE               *
C*                5: SINGULAR MATRIX E IN LSQ SUBPROBLEM               *
C*                6: SINGULAR MATRIX C IN LSQ SUBPROBLEM               *

The part which assigns mode 4 (which is the error you are getting) is as follows:

C   SEARCH DIRECTION AS SOLUTION OF QP - SUBPROBLEM

      CALL dcopy_(n, xl, 1, u, 1)
      CALL dcopy_(n, xu, 1, v, 1)
      CALL daxpy_sl(n, -one, x, 1, u, 1)
      CALL daxpy_sl(n, -one, x, 1, v, 1)
      h4 = one
      CALL lsq (m, meq, n , n3, la, l, g, a, c, u, v, s, r, w, iw, mode)

C   AUGMENTED PROBLEM FOR INCONSISTENT LINEARIZATION

      IF (mode.EQ.6) THEN
          IF (n.EQ.meq) THEN
              mode = 4
          ENDIF
      ENDIF

So basically: the routine attempts to find a descent direction; when the constraints are active, it evaluates derivatives along the constraints and fails with a singular matrix in the LSQ subproblem (mode = 6); it then reasons that if all the constraint equations were evaluated and none yielded a successful descent direction, the constraint set must be contradictory (mode = 4).

chthonicdaemon
  • This is really old so can't even check it. Thanks for digging it up anyway. – Zahy May 29 '18 at 14:24
  • @chthonicdaemon can you please explain the hard constraints, again? I need to set constraints of a variable 0 – Andreas Schuldei Jan 18 '23 at 10:31
  • @AndreasSchuldei You can use a [sigmoid](https://en.wikipedia.org/wiki/Sigmoid_function) to enforce two-sided hard constraints. – chthonicdaemon Jan 19 '23 at 14:22
  • @chthonicdaemon can you suggest an effective way to hard constrain sums of search variables, where sum(x1 + x1 + x3...) must be between 0 and 1, individually and in combinations? (I have a few more than 3 that must be hard limited like that. ) – Andreas Schuldei Jan 24 '23 at 17:01
  • https://stackoverflow.com/questions/75225039/constraining-sums-of-search-variables-to-search-algorithms I opened another question for this topic – Andreas Schuldei Jan 24 '23 at 17:21