
I am learning to optimize a multivariate constrained nonlinear problem with scipy.optimize.minimize, but I am getting strange results.

My problem:

minimize objfun = x*y

subject to: 0 <= x <= 5, 0 <= y <= 5, x + y == 5

My code:

from scipy import optimize

def func(x):
    # objective: x * y
    return x[0] * x[1]

# bounds matching the stated problem, 0 <= x <= 5 and 0 <= y <= 5
bnds = ((0, 5), (0, 5))
cons = ({'type': 'eq', 'fun': lambda x: x[0] + x[1] - 5},)
x0 = [0, 0]
res = optimize.minimize(func, x0, method='SLSQP', bounds=bnds, constraints=cons)

Received results:

 status: 0
success: True
   njev: 2
   nfev: 8
    fun: 6.2499999999999991
      x: array([ 2.5,  2.5])
message: 'Optimization terminated successfully.'
    jac: array([ 2.5,  2.5,  0.])
    nit: 2

I was expecting fun to be 0 (or very close to 0), with either x or y equal to 0.

E.S

1 Answer


I think you are hitting an edge case: if you start from a guess that is not symmetric, you converge to the right solution.

Just change x0=[0,0] to something else, like x0=[.2,.9].
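For illustration, here is a minimal rerun with a non-symmetric starting point (a sketch, assuming the bounds ((0, 5), (0, 5)) from the stated problem):

from scipy import optimize

def func(x):
    return x[0] * x[1]

bnds = ((0, 5), (0, 5))
cons = ({'type': 'eq', 'fun': lambda x: x[0] + x[1] - 5},)
# start off the symmetry line x = y
res = optimize.minimize(func, [0.2, 0.9], method='SLSQP', bounds=bnds, constraints=cons)
print(res.x, res.fun)  # should converge to a corner of the constraint, with fun near 0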

EDIT: expanding after @pv's comment below.

[x, y] = [2.5, 2.5] is a local maximum of the objective restricted to the constraint. After jumping to this local maximum, the algorithm again computes the direction it should take to minimize the target.

It does so by evaluating the function at [2.50000001, 2.5] and at [2.5, 2.50000001] (a finite-difference estimate of the gradient), and finds that the descent direction is (-1, -1). This direction, however, is orthogonal to the constraint line x + y = 5, so there is no feasible descent step and the algorithm stops.

The problem arises because both the target and the constraint are symmetric with respect to the line x = y, and the initial guess lies exactly on that line.
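To see why the algorithm stalls there, here is a small numerical check (an illustration, not from the original answer): the gradient of x*y at [2.5, 2.5] is parallel to the constraint normal (1, 1), so its component along the feasible direction (1, -1) is zero.

import numpy as np

x = np.array([2.5, 2.5])
grad_f = np.array([x[1], x[0]])  # gradient of f(x, y) = x*y is (y, x)
tangent = np.array([1.0, -1.0])  # direction along the constraint x + y = 5
print(np.dot(grad_f, tangent))   # 0.0: no feasible descent direction at x = y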

gg349
  • Adding `print(x)` to `func` indeed shows that the solver jumps directly to `[2.5, 2.5]` on the first iteration; this is a point where the gradient of the constrained objective function is zero. SLSQP probably then notices it has arrived at a saddle point, and terminates. – pv. Nov 12 '14 at 14:13
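The instrumentation pv describes is just a one-line trace added to the objective:

def func(x):
    print(x)  # trace every point SLSQP evaluates
    return x[0] * x[1]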