
I am currently using scipy optimize.minimize to get the minimal value of a function with 5 parameters. I would like for four of the inputs to be put in as fixed parameters of the function and I would like the optimize.minimize to give me the value of the fifth input in order to get the lowest possible output from the function.

Here is the code I have currently:

import numpy as np
import scipy.optimize as optimize

def objective(speed, params):
    a, b, c, d = params
    return abs(rg.predict([[speed, a, b, c, d]]))

p0=np.array([[98.3,46.9,119.9,59.1]])

x0=np.array([[4]])
result = optimize.minimize(objective, x0, args=(p0,),method='nelder-mead')
print(result.x)

I am looking for a way to be able to pass a list or array of the fixed parameters inside of the optimize.minimize function. However the above code gives me this error:

ValueError: not enough values to unpack (expected 4, got 1)

The only way I can seem to make it work is to hard code in the inputs like this:

def objective(params):
    a = 100
    b = 20
    c = 119.9
    d = params
    e = 59.1
    return abs(rg.predict([[a, b, c, d, e]]))

x0=np.array([[4.5]])
result = optimize.minimize(objective, x0, method='nelder-mead')
print(result.x)

Am I approaching this in the correct way? How can I pass in a list or array as fixed inputs?

Dana McDowelle

2 Answers


The tuple passed as args is forwarded as *args to the objective function. Since your objective takes a single extra argument (besides speed, the variable being minimized), passing the single-element tuple (p0,) as the args keyword to minimize is exactly right. The error is raised inside the objective function itself, after minimize calls it:

ValueError: not enough values to unpack (expected 4, got 1)

This actually comes from the first line of your objective function:

a,b,c,d=params # params = np.array([[98.3,46.9,119.9,59.1]])

The array you passed as p0 has two sets of square brackets, so it has shape (1,4). Arrays unpack along their first dimension, so during unpacking this behaves like a 1-tuple (that contains a 4-element array). This is why you can't unpack shape (1,4) to four variables, hence the error.
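To see the shape issue in isolation, here is a minimal standalone demonstration of how the two array shapes unpack differently:

```python
import numpy as np

p0 = np.array([[98.3, 46.9, 119.9, 59.1]])   # two sets of brackets: shape (1, 4)
print(p0.shape)                              # (1, 4)

# Unpacking iterates over the first axis, which has length 1,
# so only ONE variable can be unpacked from p0:
row, = p0                                    # works: one variable, one row
# a, b, c, d = p0   # ValueError: not enough values to unpack (expected 4, got 1)

q0 = np.array([98.3, 46.9, 119.9, 59.1])     # one set of brackets: shape (4,)
a, b, c, d = q0                              # unpacks fine
print(a, d)                                  # 98.3 59.1
```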

This is basically a typo (one too many pairs of square brackets), which alone wouldn't merit a full answer. The reason I'm writing one after all is that, depending on your use case, it might be easier to define those arguments directly in your function's signature and pass them along accordingly during minimization:

def objective(speed, a, b, c, d):
    ... # return stuff using a,b,c,d

# define a0, b0, c0, d0 as convenient
result = optimize.minimize(objective, x0, args=(a0,b0,c0,d0), method='nelder-mead')

Whether it's more elegant to define your function like this depends on how easily your fixed parameters can be defined and on what happens to them inside objective. If you're just going to pass the arguments along as a list, as in your MCVE, there's no need to separate the variables in the first place; but if the four inputs play very different roles in the calculation, it may make sense to handle each one individually, starting with the signature of your objective function.
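Since rg.predict isn't shown in the question, here is a self-contained sketch of this pattern with a toy stand-in objective (a quadratic minimized at speed = 5) in place of the regression model:

```python
import numpy as np
import scipy.optimize as optimize

# Stand-in for the regression model: a toy function with a known minimum
# at speed = 5. Replace the body with your actual call, e.g.
# abs(rg.predict([[speed, a, b, c, d]])).
def objective(speed, a, b, c, d):
    s = np.ravel(speed)[0]        # minimize passes speed as a 1-D array
    return abs((s - 5.0) ** 2 + 0.001 * (a + b + c + d))

a0, b0, c0, d0 = 98.3, 46.9, 119.9, 59.1
x0 = np.array([4.0])              # 1-D initial guess, no extra brackets

result = optimize.minimize(objective, x0, args=(a0, b0, c0, d0),
                           method='nelder-mead')
print(result.x)                   # close to [5.]
```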

  • Thanks!! When I do put the arguments directly into args it does work. I do have another question though. When I do keep the rest of the code the same and try and remove the extra brackets I get this error: ValueError: setting an array element with a sequence. – Dana McDowelle Aug 21 '18 at 15:09
  • @DanaMcDowelle you probably have to be more specific than that: where are you trying to remove the brackets? That error usually happens when you try to assign to a non-scalar to an element of an array, for instance `arr = np.arange(3); arr[0] = [1,2]`. I don't see anything like this in your code, apart from the undefined `rg.predict` call which I have no idea what might be :) – Andras Deak -- Слава Україні Aug 21 '18 at 15:42

These are linear constraints, of the form Ax = b. For example, say we want to hold the first two variables x0, x1 (your a, b) fixed:

A = [[ 1 0 0 ... ]
     [ 0 1 0 ... ]]
b = [b0 b1]

There's a general way to turn a linearly constrained problem Ax = b into an UNconstrained problem in fewer variables, in this example n - 2, using SVD.
minlin.py under my gists is a one-page wrapper of NumPy's SVD for this process. Its doc:

""" Minlin: convert linear-constrained min f(x): Ax = b
to unconstrained min in fewer variables.
For example, 10 variables with 4 linear constraints, A 4 x 10,
-> 6 unconstrained variables:

minlin = Minlin( A, b, bigfunc, verbose=1 )   # bigfunc( 10 vars )
then
minimize( minlin.littlefunc, minlin.y0 ... )  # littlefunc( 6 vars )
    with your favorite unconstrained minimizer. For example,

from scipy.optimize import minimize
res = minimize( minlin.littlefunc, minlin.y0 ... )
fbest = res.fun
ybest = res.x  # 6 vars
xbest = minlin.tobig(ybest)  # 10 vars
    = minlin.x0  + y . nullspace  from svd(A)
    x0 = Ainv b = lstsq( A, b )

Methods: .func(x) .tonullspace(X) .torowspace(X) .tobig(y)
Attributes: .sing .Vt .Vtop .Vnull .x0

How it works, in a simple case:
consider holding x0 = b0, x1 = b1 fixed, i.e.
A = [[ 1 0 0 ... ]
     [ 0 1 0 ... ]]
We can minimize unconstrained over the n - 2 variables [x2 x3 ...]
if we could find them / find a basis for them, given just A.
This is what SVD / Minlin does.
"""

This may be overkill for your problem. But, good question -- looking around, trying to understand rough country in 5d by varying just a few variables that you understand, is a good thing to do.

denis