
I'm trying to learn how to use scipy.optimize.minimize. I'm going to need it for functions of two variables with around a thousand terms each; so I came up with an easy example first:

from scipy.optimize import minimize

def test_function(x,y):
    return x*(x-1)+y*(y-1)

mins=minimize(test_function,x0=(0,0),bounds=[(0,1),(0,1)])

So I expect an answer of x=0.5, y=0.5.

Unfortunately, I instead get the following error:

TypeError: test_function() missing 1 required positional argument: 'y'

What does it mean that my test function is missing a positional argument?

J.D.

1 Answer


scipy.optimize.minimize passes the design vector to your objective as a single array-like argument, so the function must accept one argument and unpack it into your x and y variables itself. The example below assumes x and y are scalars, as in your snippet; if they are vectors, you need to take the slices of design_variables corresponding to the lengths of your variables.

from scipy.optimize import minimize

def test_function(design_variables):
    # minimize passes the design vector as a single array; unpack it
    x, y = design_variables
    return x*(x-1) + y*(y-1)

mins = minimize(test_function, x0=(0, 0), bounds=[(0, 1), (0, 1)])
print(mins)

Optimization result (abridged; the exact solver statistics vary between SciPy versions):

     fun: -0.5
 success: True
       x: array([0.5, 0.5])

which is the minimum you expected: x = y = 0.5.
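For the vector case mentioned above, here is a minimal sketch of the slicing. The objective and the choice of two components per variable are hypothetical, just to show how to recover x and y from the flat design vector:

```python
import numpy as np
from scipy.optimize import minimize

def vector_objective(design_variables):
    # hypothetical layout: x and y are each length-2, packed into one flat array
    x = design_variables[:2]   # first slice -> x
    y = design_variables[2:]   # second slice -> y
    # same quadratic as before, summed over the components
    return np.sum(x * (x - 1)) + np.sum(y * (y - 1))

x0 = np.zeros(4)               # x and y flattened together as the start point
bounds = [(0, 1)] * 4          # one (min, max) pair per component
mins = minimize(vector_objective, x0=x0, bounds=bounds)
print(mins.x)
```

Each component minimizes independently at 0.5, so mins.x is approximately [0.5, 0.5, 0.5, 0.5].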
onodip