
I'm trying to use the scipy.optimize package to find the maximum of my cost function.

In this particular case I have a list of prices which vary over the day. To make it easier, let's assume the day has 8 hours and the price in each hour is as follows:

price_list = np.array([1,2,6,8,8,5,2,1])

In this simplified case I want to select the 4 highest prices from that price_list. For various reasons I don't want to simply sort and select the best four prices, but use an optimization algorithm instead. I have several constraints, so I decided to use the SLSQP algorithm from scipy, scipy.optimize.fmin_slsqp.

I first create a schedule for the hours I select:

schedule_list = np.zeros( len(price_list), dtype=float)

Next I need to define my inverted profit function: for all selected hours I want to sum up my profit. Since I want to optimize my schedule while the price_list stays fixed, I need to pass it via *args:

def price_func( schedule_list, *price_list ):
    return -1.*np.sum( np.dot( schedule_list, price_list ) )
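As a quick sanity check of this objective, evaluating it on a hand-picked schedule (the schedule below is chosen by hand for illustration, not produced by the optimizer):

```python
import numpy as np

price_list = np.array([1, 2, 6, 8, 8, 5, 2, 1])

def price_func(schedule_list, *price_list):
    # negated profit: dot product of the 0/1 schedule with the prices
    return -1. * np.sum(np.dot(schedule_list, price_list))

# hand-picked schedule selecting hours 2-5, i.e. the four highest prices
trial = np.array([0, 0, 1, 1, 1, 1, 0, 0], dtype=float)
print(price_func(trial, *price_list))  # -> -27.0 (= -(6 + 8 + 8 + 5))
```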

Once I understand how things work in principle I will move some stuff around. For now, I simply avoid using more *args and define my constraint with a hard-coded number of hours to run. I want the number of selected hours to be exactly 4, therefore I use an equality constraint:

def eqcon(x, *price_list):
    return sum( schedule_list ) - 4

Furthermore I want to restrict my schedule values to be either 0 or 1. I'm not sure how to implement this right now, so for the moment I simply use the bounds keyword.

The unrestricted optimization with bounds works fine. I simply pass my schedule_list as the initial guess.

scipy.optimize.fmin_slsqp( price_func, schedule_list, args=price_list, bounds=[[0,1]]*len(schedule_list) )

and the output is as good as it can be:

Optimization terminated successfully.    (Exit mode 0)
        Current function value: -33.0
        Iterations: 2
        Function evaluations: 20
        Gradient evaluations: 2
Out[8]: array([ 1.,  1.,  1.,  1.,  1.,  1.,  1.,  1.])

Without any further constraints, this is the optimal solution!

Using the constrained optimization with the following command:

scipy.optimize.fmin_slsqp( price_func, schedule_list, args=price_list, bounds=[[0,1]]*len(schedule_list), eqcons=[eqcon, ] )

gives me the error:

Singular matrix C in LSQ subproblem    (Exit mode 6)
        Current function value: -0.0
        Iterations: 1
        Function evaluations: 10
        Gradient evaluations: 1
Out[9]: array([ 0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.])

From gnuplot I know that this error is often related to ill-posed problems or bad initial values. I tried almost optimal initial values, but that does not help. Does anybody have an idea or even a solution for me?

As a next step, I have already formulated inequality constraints. Do I understand correctly that in the scipy minimize wrapper the inequalities are assumed to be non-negative, while in fmin_slsqp it is the other way around, i.e. solutions are constrained to negative constraint functions?
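For concreteness, here is a minimal sketch of one such inequality constraint (a hypothetical "at most 4 hours" rule, written in the non-negative convention that the fmin_slsqp docstring describes):

```python
import numpy as np
from scipy.optimize import fmin_slsqp

price_list = np.array([1, 2, 6, 8, 8, 5, 2, 1])

def price_func(schedule_list, price_list):
    return -1. * np.dot(schedule_list, price_list)

# the fmin_slsqp docstring requires ieqcons[j](x, *args) >= 0.0
# at the solution; "at most 4 hours" becomes 4 - sum(x) >= 0
def ieqcon(schedule_list, price_list):
    return 4. - np.sum(schedule_list)

x = fmin_slsqp(price_func, np.zeros(len(price_list)), args=(price_list,),
               bounds=[(0, 1)] * len(price_list), ieqcons=[ieqcon])
```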

    Ok, first major problem: The constraints should have the same arguments in the definition as used in the function. Don't mix x and schedule_list – Jochen Feb 17 '14 at 19:50

3 Answers


You have a plain linear program, is that right?

min: - prices . x
constrain: x >= 0, sum x = 4

so the second derivative matrix, a.k.a. the Hessian, is exactly 0.
slsqp is trying to invert this, which is not possible. Agreed, the error message could be better.
(The same will happen with other quadratic methods, in any package: they'll converge much faster on smooth functions, but crash on rough cliffs.)

See also why-cant-i-rig-scipys-constrained-optimization-for-integer-programming --
but LP should do the job (max 4), Integer programming is harder.
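A sketch of the LP route, using scipy.optimize.linprog (the LP relaxation happens to have an all-integer optimum here, because the 4th and 5th highest prices differ):

```python
import numpy as np
from scipy.optimize import linprog

prices = np.array([1, 2, 6, 8, 8, 5, 2, 1])

# maximize prices . x  <=>  minimize -prices . x
# subject to sum(x) == 4 and 0 <= x_i <= 1
res = linprog(c=-prices,
              A_eq=np.ones((1, len(prices))), b_eq=[4],
              bounds=[(0, 1)] * len(prices))

print(np.round(res.x))  # -> [0. 0. 1. 1. 1. 1. 0. 0.]
print(-res.fun)         # -> 27.0, the four highest prices: 8 + 8 + 6 + 5
```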

– denis
  • Are you sure linear constraints cause this problem? In official SciPy tutorial for SLSQP algorithm there is a linear constraint (y - 1 <= 0) ref: https://docs.scipy.org/doc/scipy/reference/tutorial/optimize.html#tutorial-sqlsp – Marek Jan 08 '17 at 10:18
  • @Marek. the problem is I think *only* linear constraints, quadratic term 0. – denis Jan 08 '17 at 10:56

The SLSQP algorithm is a gradient-based optimizer, meaning it expects the derivatives of the objective and constraints to be continuous. From my understanding, it seems like you're trying to solve an integer programming problem (continuous values in the schedule list are not acceptable). You need an algorithm that selects appropriate values (0 or 1) for the independent variables, rather than trying to find the minimum of a continuous space of values. Unfortunately, I'm not sure that there are any in scipy that do that.
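(Note added in editing: since this answer was written, SciPy 1.9 gained scipy.optimize.milp, which does handle the 0/1 restriction directly via mixed-integer linear programming. A minimal sketch:)

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

prices = np.array([1, 2, 6, 8, 8, 5, 2, 1])

# minimize -prices . x  subject to  sum(x) == 4  and  x_i in {0, 1}
res = milp(c=-prices,
           constraints=LinearConstraint(np.ones(len(prices)), lb=4, ub=4),
           integrality=np.ones(len(prices)),  # every x_i must be integer
           bounds=Bounds(0, 1))               # with integrality: x_i in {0, 1}

print(np.round(res.x))  # -> [0. 0. 1. 1. 1. 1. 0. 0.]
print(-res.fun)         # -> 27.0
```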

– Rob Falck
  • I should have thought about continuity. But in its current form, i.e. a continuous setup without binary restrictions, it is a continuous function and the derivative should exist. Basically it is a simple multidimensional plane with slope price... ok, the later restriction is not continuous. But is this necessary? – Jochen Feb 16 '14 at 21:32

There are some errors in your code. The corrected versions:

1. def price_func( schedule_list, price_list ):
2. def eqcon( schedule_list, price_list ):
       return sum( schedule_list ) - 4
3. scipy.optimize.fmin_slsqp( price_func, schedule_list, args=(price_list,), bounds=[[0,1]]*len(schedule_list), eqcons=[eqcon, ] )

then it works.
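Putting the three fixes together into one runnable script (exact iteration counts may vary by SciPy version, but the singular-matrix error goes away once eqcon actually depends on its first argument):

```python
import numpy as np
from scipy.optimize import fmin_slsqp

price_list = np.array([1, 2, 6, 8, 8, 5, 2, 1])
schedule_list = np.zeros(len(price_list), dtype=float)

def price_func(schedule_list, price_list):
    return -1. * np.dot(schedule_list, price_list)

def eqcon(schedule_list, price_list):
    # now a function of the optimization variable, so its gradient is nonzero
    return np.sum(schedule_list) - 4

x = fmin_slsqp(price_func, schedule_list, args=(price_list,),
               bounds=[(0, 1)] * len(schedule_list), eqcons=[eqcon])
print(np.round(x))            # the four highest-price hours
print(np.dot(x, price_list))  # total profit, ~27.0
```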

– uuuuuu