
I am trying to solve an optimization problem in Python using SciPy's minimize. My problem is how to express constraints that contain if-else conditions. My inputs are arrays, and my objective function returns a scalar based on the three scalar values I pick from those arrays and feed to the problem.

Here is my main problem, the constraints that I somehow have to express:

# constraint 1
x_3 < x_1 + x_2

# constraint 2
if x_1 > x_2:
    x_3 + x_2 > x_1

# constraint 3
if x_2 > x_1:
    x_2 - x_3 < x_1
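
To make this concrete, the only way I can picture writing constraints 2 and 3 is with a branch inside the constraint function, roughly like the sketch below (ConstTwo and ConstThree are made-up names, and the 1.0 is just an arbitrary non-negative value returned when the condition is not active). I have no idea whether a gradient-based solver can cope with such a kink, which is essentially what I am asking:

def ConstTwo(x):
    # constraint 2: only meant to be active when x_1 > x_2;
    # an 'ineq' constraint must return a value >= 0 when feasible
    x_1, x_2, x_3 = x
    return x_3 + x_2 - x_1 if x_1 > x_2 else 1.0

def ConstThree(x):
    # constraint 3: only meant to be active when x_2 > x_1
    x_1, x_2, x_3 = x
    return x_1 + x_3 - x_2 if x_2 > x_1 else 1.0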

And here is the code that I have come up with so far to solve my problem:

import numpy as np
from scipy import interpolate
from scipy.optimize import minimize

# My inputs are arrays as such
x_1 = np.linspace(0.9, 1.1, 21)
x_2 = np.linspace(0.7, 1, 31)
x_3 = np.linspace(0.8, 0.9, 11)

# Just to have a simplified problem here, I made up this Obj_func.
# In reality I have an FEM simulation running that returns a scalar
# for the x_1, x_2, x_3 values I pick from the arrays above.
# Feel free to change this Obj_func if it does not make sense,
# as long as it returns a single scalar "y".

def Obj_func(x, x_1, x_2, x_3):
    # x holds the current guess; the x_1, x_2, x_3 arrays passed in as
    # extra arguments are overwritten here and not actually used
    x_1, x_2, x_3 = x
    y = x_1 + 5*x_2 - 2*x_3
    return y

# I assume I can write my first constraint as such:

def ConstOne(x):
    # SLSQP 'ineq' constraints must return a value that is >= 0 when feasible
    x_1, x_2, x_3 = x
    return x_1 + x_2 - x_3

constraints = ({'type': 'ineq', 'fun': ConstOne},)

# But I don't know how I can write the second and third constraints
# with if conditions
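
# If piecewise functions like the ConstTwo / ConstThree sketch above were
# acceptable, I understand from the docs that several 'ineq' constraints can
# simply be passed as a sequence of dicts (plumbing sketch only, kept
# commented out so it does not change the script):
#
# constraints = ({'type': 'ineq', 'fun': ConstOne},
#                {'type': 'ineq', 'fun': ConstTwo},
#                {'type': 'ineq', 'fun': ConstThree})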

# initial guess
x0 = (1, 0.75, 0.85)

# input arrays passed to the optimization as extra arguments
args = (x_1, x_2, x_3)

# solve the optimization problem
print(minimize(Obj_func, x0, args, method='SLSQP', constraints=constraints))

I am open to any suggestions and solutions. Thank you!

meteoguc
  • Please clarify your specific problem or provide additional details to highlight exactly what you need. As it's currently written, it's hard to tell exactly what you're asking. – Community Mar 02 '22 at 23:37
  • Why does the objective function receive `x` when it is not used at all in the optimization? – Muhammad Mohsin Khan Mar 03 '22 at 08:31
  • In general: impossible with scipy.minimize. Those solvers assume twice differentiability of the constraints and the objective; an if/switch is not differentiable. What's shown would probably be possible with mixed-integer linear programming (if we assume non-strict inequalities, or replace the strict ones with some epsilon), but that's not available in scipy yet. There you would use something like indicator variables (which make use of the binary variables provided by the integer-programming part). – sascha Mar 03 '22 at 12:06
  • Thank you all for the comments. Muhammad, I think that is just how it is done, as far as I can tell from the examples I have found so far, but I am not quite sure actually. sascha, I think you are showing me a very good path to follow. But please clear this up for me: can I adapt my scipy code to work as a mixed-integer linear-programming problem, or should I find some other library? I think that means my problem requires a Big M Method approach, which I actually don't know how to code right now. – meteoguc Mar 03 '22 at 15:46
  • Long story short, use a modelling framework like [PuLP](https://coin-or.github.io/pulp/) or [python-mip](https://python-mip.readthedocs.io/en/latest/). Both make it much easier to formulate your problem. However, there's already a wrapper around the [HiGHS MILP solver that got merged into the master branch a few weeks ago](https://github.com/scipy/scipy/pull/15460). So it's possible to solve MIPs with scipy (master). The disadvantage compared to a modelling framework is that you need the MIP in matrix form. – joni Mar 04 '22 at 06:21
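
Following up on sascha's and joni's comments, a minimal big-M sketch of just the conditional constraints in PuLP might look like the snippet below. This is only a sketch under the assumptions from the comments: the strict inequalities are replaced with a small epsilon, the toy linear objective from the question is used (a black-box FEM objective would not fit into a MILP like this), and the big-M value of 10 is simply chosen to be larger than any difference the variable bounds allow.

import pulp

M = 10       # "big M": safely larger than any |x_1 - x_2| the bounds permit
eps = 1e-4   # small tolerance standing in for the strict inequalities

prob = pulp.LpProblem("toy_problem", pulp.LpMinimize)

# continuous variables bounded by the ranges from the question
x1 = pulp.LpVariable("x_1", lowBound=0.9, upBound=1.1)
x2 = pulp.LpVariable("x_2", lowBound=0.7, upBound=1.0)
x3 = pulp.LpVariable("x_3", lowBound=0.8, upBound=0.9)

# indicator variables: b1 = 1 whenever x_1 > x_2, b2 = 1 whenever x_2 > x_1
b1 = pulp.LpVariable("b1", cat="Binary")
b2 = pulp.LpVariable("b2", cat="Binary")

# toy linear objective from the question
prob += x1 + 5 * x2 - 2 * x3

# constraint 1: x_3 < x_1 + x_2
prob += x3 <= x1 + x2 - eps

# constraint 2: force b1 = 1 when x_1 > x_2, and only then enforce x_3 + x_2 > x_1
prob += x1 - x2 <= M * b1
prob += x3 + x2 >= x1 + eps - M * (1 - b1)

# constraint 3: force b2 = 1 when x_2 > x_1, and only then enforce x_2 - x_3 < x_1
prob += x2 - x1 <= M * b2
prob += x1 + x3 >= x2 + eps - M * (1 - b2)

prob.solve()
print(pulp.LpStatus[prob.status], x1.value(), x2.value(), x3.value())

The same model could equally be written with python-mip or the scipy HiGHS wrapper joni mentions; the big-M/indicator construction is the part that stands in for the if conditions.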
