
For my constrained problem I want to use SciPy's trust-constr algorithm, as I have a multivariable, constrained problem. I don't want to / can't calculate the Jacobian/Hessian analytically, so I let SciPy approximate them numerically. However, when I set bounds, the computation of the Jacobian crashes:

  File "C:\Python27\lib\site-packages\scipy\optimize\_trustregion_constr\tr_interior_point.py", line 56, in __init__
    self.jac0 = self._compute_jacobian(jac_eq0, jac_ineq0, s0)
  File "C:\Python27\lib\site-packages\scipy\optimize\_trustregion_constr\tr_interior_point.py", line 164, in _compute_jacobian
    [J_ineq, S]]))
  File "C:\Python27\lib\site-packages\numpy\matrixlib\defmatrix.py", line 1237, in bmat
    arr_rows.append(concatenate(row, axis=-1))
ValueError: all the input array dimensions except for the concatenation axis must match exactly

The error occurs both with old-style bounds and with the newer `Bounds` object. I can reproduce it with this code:

import numpy as np
import scipy.optimize as scopt

def RosenbrockN(x):
    # N-dimensional Rosenbrock function
    result = 0
    for i in range(len(x)-1):
        result += 100*(x[i+1]-x[i]**2)**2+(1-x[i])**2
    return result
x0 = [0.0, 0.0, 0.0]
#bounds = scopt.Bounds([-2.0,-0.5,-2.0],[2.0,0.8,0.7])
bounds = [(-2.0,2.0),(-0.5,0.8),(-2.0,0.7)] 
Res = scopt.minimize(RosenbrockN, x0,
                     method='trust-constr', bounds=bounds,
                     jac='2-point', hess=scopt.SR1())

I take it that I just misunderstand how bounds are set, but I can't find my mistake. Advice is appreciated.

EDIT: I also tried the code example from the documentation, which gave the same result. Other methods such as SLSQP work fine with bounds.

SciPy version 1.1.0, Python version 2.7.4, OS: Windows 7 Enterprise.

Pirate
  • See [probably related issue](https://github.com/scipy/scipy/issues/9146). – sascha Sep 02 '18 at 12:11
  • Thanks for the link. That trust-constr doesn't work with constraints in general comes as a surprise, though, as it is listed as the algorithm of choice for constrained problems in the documentation. – Pirate Sep 03 '18 at 12:56
  • So is the conclusion that trust-constr just doesn't work at all with bounds? – Kvothe Jun 23 '21 at 18:03
  • @Kvothe, the Rosenbrock example works fine with `scopt.Bounds`. By the way, you can `from scipy.optimize import rosen, rosen_der, rosen_hess`. – denis Jun 25 '21 at 12:27
  • OK, then I don't know what happened in this question (it was upvoted quite a bit, so it was probably really not working at some point). For me the issue was that you have to explicitly demand that the search stays within the constraint region, for example with `Bounds(np.array([0.5, 0.5]), np.array([0.55, 2.0]), keep_feasible=True)`. Otherwise it will happily step outside your bounds for the intermediate steps; see the sketch after this thread. – Kvothe Jun 25 '21 at 13:22
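
To make the `keep_feasible` suggestion above concrete, here is a minimal sketch on the question's problem. It assumes a SciPy release in which trust-constr accepts a `Bounds` object without the crash shown above (per the linked issue, 1.1.0 was affected):

from scipy.optimize import minimize, Bounds, SR1

def RosenbrockN(x):
    # N-dimensional Rosenbrock function
    result = 0
    for i in range(len(x)-1):
        result += 100*(x[i+1]-x[i]**2)**2+(1-x[i])**2
    return result

x0 = [0.0, 0.0, 0.0]
# keep_feasible=True asks trust-constr to keep every iterate inside
# the bounds, not only the final solution
bounds = Bounds([-2.0, -0.5, -2.0], [2.0, 0.8, 0.7], keep_feasible=True)
res = minimize(RosenbrockN, x0, method='trust-constr',
               bounds=bounds, jac='2-point', hess=SR1())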

2 Answers


I tried the method "trust-constr" several times, and the bound constraints failed to be incorporated. I worked around this by expressing the bounds as linear constraints, as in the following example:

from scipy.optimize import minimize, LinearConstraint, Bounds

def RosenbrockN(x):
    # N-dimensional Rosenbrock function
    result = 0
    for i in range(len(x)-1):
        result += 100*(x[i+1]-x[i]**2)**2+(1-x[i])**2
    return result

x0 = [0.0, 0.0, 0.0]

# This will not work:
#bounds = Bounds([-2.0,-0.5,-2.0],[2.0,0.8,0.7])

# This works: encode the bounds lb <= x <= ub as the linear
# constraint lb <= A @ x <= ub, with A the identity matrix
lb = [-2.0,-0.5,-2.0]
ub = [2.0,0.8,0.7]
A = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
lcons = LinearConstraint(A, lb=lb, ub=ub, keep_feasible=True)
Res = minimize(RosenbrockN, x0, method='trust-constr', constraints=lcons)
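
For completeness, the question's quasi-Newton setup (`jac='2-point'` with an SR1 Hessian approximation) should combine with this workaround in the same way. A sketch, reusing `RosenbrockN` and `x0` from the snippet above and assuming the linear-constraint path indeed avoids the crash:

import numpy as np
from scipy.optimize import minimize, LinearConstraint, SR1

# Same bounds as above, written as lb <= I @ x <= ub with I the identity
lcons = LinearConstraint(np.eye(3), lb=[-2.0, -0.5, -2.0],
                         ub=[2.0, 0.8, 0.7], keep_feasible=True)
Res = minimize(RosenbrockN, x0, method='trust-constr', constraints=lcons,
               jac='2-point', hess=SR1())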
MasterJedi

I removed your jac and hess arguments and got it to work; perhaps the problem lies there?

import numpy as np
import scipy.optimize as scopt

def RosenbrockN(x):
    # N-dimensional Rosenbrock function
    result = 0
    for i in range(len(x)-1):
        result += 100*(x[i+1]-x[i]**2)**2+(1-x[i])**2
    return result

x0 = [0.0, 0.0, 0.0]
#bounds = scopt.Bounds([-2.0,-0.5,-2.0],[2.0,0.8,0.7])
bounds = [(-2.0,2.0),(-0.5,0.8),(-2.0,0.7)] 
Res = scopt.minimize(RosenbrockN, x0,
                     method='SLSQP', bounds=bounds)
print(Res)

The result is:

     fun: 0.051111012543332675
     jac: array([-0.00297706, -0.50601892, -0.00621008])
 message: 'Optimization terminated successfully.'
    nfev: 95
     nit: 18
    njev: 18
  status: 0
 success: True
       x: array([0.89475126, 0.8       , 0.63996894])
ansri
  • You switched to a completely different optimization algorithm; that does not solve the issue! The Jacobian and Hessian approximations were of course requested for a reason: they make the optimization much more efficient! – Kvothe Jun 23 '21 at 18:23
  • Whoops, sorry for the confusion. My bad. – ansri Aug 09 '21 at 05:05