    def get_cons(self, ub, lb):
        cons = []
        for i in range(len(ub)):
            cons.append({'type':'ineq','fun':lambda x0:ub[i]-x0[i]})
            cons.append({'type':'ineq','fun':lambda x0:x0[i]-lb[i]})
        return cons
    ub = [1,3,1,1]
    lb = [0,0,0,0]
    cons = self.get_cons(self.ub, self.lb)
    res = minimize(fun, x0[:,i], method='SLSQP', constraints=cons)

Here fun is a custom loss function and the initial parameters are [0.08024884 0.14003958 0.0786131 0.00157402]. I expect all parameters to be > 0, but after optimization the parameters are [-0.45684621 0.02531972 -0.10755587 0.2108312].

Why do these constraints fail?

hjli
  • Take a look at https://stackoverflow.com/questions/37791680/scipy-optimize-minimize-slsqp-with-linear-constraints-fails/37792650#37792650 – Warren Weckesser Jan 08 '23 at 15:46

1 Answer


There's no need to use generic constraints for adding simple bounds on the variables. Instead, pass the variable bounds via minimize's bounds argument:

bounds = [(l, u) for (l, u) in zip(lb, ub)]

res = minimize(fun, x0[:, i], bounds=bounds, method="SLSQP") 
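For reference, here is a minimal, self-contained version of this approach. The quadratic loss and the starting point are placeholders standing in for the question's custom fun and data; the bounds handling is the relevant part:

import numpy as np
from scipy.optimize import minimize

lb = [0, 0, 0, 0]
ub = [1, 3, 1, 1]
bounds = list(zip(lb, ub))  # [(0, 1), (0, 3), (0, 1), (0, 1)]

def fun(x):
    # placeholder loss whose unconstrained minimum lies outside the box
    return np.sum((x - np.array([-0.5, 0.1, -0.1, 0.2]))**2)

x0 = np.array([0.08024884, 0.14003958, 0.0786131, 0.00157402])
res = minimize(fun, x0, bounds=bounds, method="SLSQP")
print(res.x)  # every component stays within [lb[i], ub[i]]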

However, if you really want to pass the bounds as generic constraints, you need to capture the value of the loop variable i:

for i in range(len(ub)):
    cons.append({'type':'ineq','fun': lambda x0, i=i: ub[i]-x0[i]})
    cons.append({'type':'ineq','fun': lambda x0, i=i: x0[i]-lb[i]})
return cons

Otherwise, each constraint shares the value of i from the last loop iteration; see here for further details.
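The pitfall is easy to reproduce outside of SciPy. This small illustration (not from the question) shows the difference between late binding and capturing i as a default argument:

# Late binding: each lambda looks up i when it is *called*, not when it is defined.
funcs_late = [lambda x: x + i for i in range(3)]
print([f(0) for f in funcs_late])   # [2, 2, 2] -- all see the final value of i

# Capturing i as a default argument freezes its value per iteration.
funcs_bound = [lambda x, i=i: x + i for i in range(3)]
print([f(0) for f in funcs_bound])  # [0, 1, 2]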

joni