I am trying a simple experiment to learn SciPy's SLSQP optimizer.
I took the objective function:
def obj(x):
    return -1 * (x[0]*x[0] + x[1]*x[1])
its Jacobian:
def jacj(x):
    return [-2*x[0], -2*x[1]]
its bounds:
bounds=[(0,1),(0,1)]
and a simple constraint, x[0] + 2*x[1] <= 1:
cons2 = {'type': 'ineq',
         'fun': lambda x: np.array([-x[0] - 2*x[1] + 1]),
         'jac': lambda x: np.array([-1.0, -2.0])}
Now I run it with the initial guess x0 = [0.1, 0.01]:
res = minimize(obj, x0, method='SLSQP', jac=jacj, bounds=bounds,
               constraints=cons2,
               options={'maxiter': 100, 'ftol': 1e-6, 'eps': 1e-8})
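For reference, here is the complete, self-contained version of the code I am running (only numpy and scipy are needed):

import numpy as np
from scipy.optimize import minimize

def obj(x):
    # negative squared norm: minimizing this maximizes x0^2 + x1^2
    return -1 * (x[0]*x[0] + x[1]*x[1])

def jacj(x):
    # gradient of obj
    return [-2*x[0], -2*x[1]]

bounds = [(0, 1), (0, 1)]

# x0 + 2*x1 <= 1, written in scipy's "fun(x) >= 0" form
cons2 = {'type': 'ineq',
         'fun': lambda x: np.array([-x[0] - 2*x[1] + 1]),
         'jac': lambda x: np.array([-1.0, -2.0])}

x0 = [0.1, 0.01]
res = minimize(obj, x0, method='SLSQP', jac=jacj, bounds=bounds,
               constraints=cons2,
               options={'maxiter': 100, 'ftol': 1e-6, 'eps': 1e-8})
print(res.x, res.fun)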
When I run this, I get the solution x[0] = 1, x[1] = 0 with obj = -1.
But when I start from the initial guess x0 = [0.001, 0.01], I get x[0] = 0, x[1] = 0.5 with obj = -0.25.
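If I check both reported points by hand with the obj and cons2 defined above, both are feasible, but only the first one reaches the lower objective value:

# both reported solutions satisfy the constraint (slack >= 0),
# but they are not equally good
for pt in ([1.0, 0.0], [0.0, 0.5]):
    print(pt, 'slack:', cons2['fun'](pt)[0], 'obj:', obj(pt))
# [1.0, 0.0]  slack: 0.0  obj: -1.0
# [0.0, 0.5]  slack: 0.0  obj: -0.25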
Why does the second run not reach the optimal solution? How does SLSQP work here?