
I am trying to use scipy.optimize.minimize to find the moving-average window N that minimizes the mean absolute deviation (MAD) of a revenue series. So far I have:

- Created an objective function
- Added constraints

The problem is that no matter what initial guess I use, the minimize function just keeps returning that number. For example: if I use 15 for the initial guess, the solver will not try any other number and says the answer is 15. I'm sure there is an issue with the code, but I am not sure where.

CODE BELOW:

from scipy.optimize import minimize
import numpy as np
import pandas as pd

#----------------------------------------------------
#-------- Create Function ------------
#----------------------------------------------------
def MovingAverage(Input, N, test=0):

    # Create data frame
    df = pd.DataFrame(Input, columns=['Revenue'])

    # Add columns (the solver passes N as a float, so cast it to int)
    df['CummSum'] = df['Revenue'].cumsum()
    df['Mavg'] = df['Revenue'].rolling(int(N)).mean()
    df['Error'] = df['Revenue'] - df['Mavg']
    df['MFE'] = df['Error'].mean()                      # mean forecast error
    df['MAD'] = np.fabs(df['Error']).mean()             # mean absolute deviation
    df['MSE'] = np.sqrt(np.square(df['Error']).mean())  # root mean squared error
    df['TS'] = np.sum(df['Error']) / df['MAD']          # tracking signal

    print(N, df['MAD'].iloc[0])

    if test == 0:
        return df['MAD'].iloc[0]
    else:
        return df

#----------------------------------------------------
#-------- Input ------------
#----------------------------------------------------
data = [1,2,3,4,5,5,5,5,5,5,5,5,5,5,5]


#----------------------------------------------------
#-------- SOLVER ------------
#----------------------------------------------------

## Objective Function
fun = lambda x: MovingAverage(data, x[0])

## Constraints
cons = ({'type': 'ineq', 'fun': lambda x:  x[0] - 2}, # N>=2
        {'type': 'ineq', 'fun': lambda x:  len(data) - x[0]}) # N<=len(data)


## Bounds: one (low, high) pair per variable; (None, None) means unbounded
bnds = [(None, None)]

## Solver
res = minimize(fun, [15], method='SLSQP', bounds=bnds, constraints=cons)

##print(res)
##print(res.status)
##print(res.success)
##print(res.njev)
##print(res.nfev)
##print(res.fun)
##for i in res.x:
##    print(i)
##print(res.message)
##for i in res.jac:
##    print(i)
##print(res.nit)

# print final results
result = MovingAverage(data, res.x[0], 1)
print(result)

List of possible values (window N = resulting MAD):
2 = 0.142857142857,
3 = 0.25641025641,
4 = 0.333333333333,
5 = 0.363636363636,
6 = 0.333333333333,
7 = 0.31746031746,
8 = 0.3125,
9 = 0.31746031746,
10 = 0.333333333333,
11 = 0.363636363636,
12 = 0.416666666667,
13 = 0.487179487179,
14 = 0.571428571429,
15 = 0.666666666667

DataByDavid
  • Start by investigating your lambda function: What does it return for different values close to 15? Also, remember that minimize() won't find a global minimum, only a local one, unless the optimized function has some useful property like convexity. – Bitwise Nov 13 '12 at 20:27
  • Added List of possible values – DataByDavid Nov 13 '12 at 20:35
  • Ok so you can already see that your function is not convex, since you have multiple local minima. – Bitwise Nov 13 '12 at 20:44
  • From the values you added it seems that fun(15)=0.666, so how does minimize() return 15? – Bitwise Nov 13 '12 at 20:46
  • I see your point, I just thought the solver would find the global minimum for me. What I meant by 15 is fun(15). – DataByDavid Nov 13 '12 at 20:52
  • Optimizing a general function without any additional information is NP-hard. – Bitwise Nov 13 '12 at 20:54
  • You seem to be using a gradient method on a discrete optimization space, so it won't even manage to estimate the Jacobian. You should probably just brute force it (or maybe do a first rough scan, then an exact one); see the sketch after these comments. Just to note: while I don't think it helps here, openopt has more optimizers that might be at least closer to what you do. – seberg Nov 13 '12 at 21:27
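
Following the brute-force suggestion above, here is a minimal sketch that simply evaluates the objective at every admissible integer window (it assumes the MovingAverage function and data from the question are already defined):

# Exhaustively evaluate the objective at every valid integer window N.
candidates = range(2, len(data) + 1)
best_N = min(candidates, key=lambda n: MovingAverage(data, n))
print(best_N)  # prints 2 for the data above, the smallest MAD in the list of values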

1 Answer


Your function is piecewise constant between integer input values, as seen in the plot below (plotted in steps of 0.1 on the x axis):

[plot: objective value vs. N, with flat plateaus between consecutive integer values]

So the derivative is zero at almost all points, and that's why a gradient-based minimization method will return any given initial point as a local minimum.
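
You can check this numerically: the finite-difference step SLSQP uses is far smaller than the width of the plateaus, so the perturbed evaluation lands on the same plateau and the estimated gradient comes out exactly zero. A quick sketch, reusing fun from the question:

import numpy as np

eps = np.sqrt(np.finfo(float).eps)     # ~1.49e-8, scipy's default step size
print(fun([15.0]), fun([15.0 + eps]))  # identical values -> zero gradient estimate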

To rescue the situation, you could use interpolation in the objective function to get intermediate values for non-integer inputs. Combined with a gradient-based minimization, it might then find the local minimum around N = 8 when starting near 15.
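
For instance, a minimal sketch of that idea (it reuses data and MovingAverage from the question; the helper names Ns, mads and smooth_fun are just illustrative):

import numpy as np
from scipy.optimize import minimize

# Tabulate the objective at the integer windows, then interpolate linearly
# between them so the objective varies continuously and has a usable gradient.
Ns = np.arange(2, len(data) + 1)
mads = np.array([MovingAverage(data, n) for n in Ns])

smooth_fun = lambda x: np.interp(x[0], Ns, mads)

# Start just inside the upper bound so the finite-difference step stays in range.
res = minimize(smooth_fun, [14.9], method='SLSQP', bounds=[(2, len(data))])
print(res.x)  # should move downhill from ~15 toward the local minimum near N = 8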

silvado