I want to minimize a convex function over a region with bounds and constraints, so I am trying to use scipy.optimize.minimize
with the SLSQP
method. However, my function is only defined at discrete points. Linear interpolation does not seem like an option, as precomputing my function at all the required points would take far too much time. As a minimal working example I have:
from scipy.optimize import minimize
import numpy as np
f=lambda x : x**2
N=1000000
x_vals=np.sort(np.random.random(N))*2-1
y_vals=f(x_vals)
def f_disc(x, x_vals, y_vals):
    return y_vals[np.where(x_vals < x)[-1][-1]]
print(minimize(f_disc, 0.5, method='SLSQP', bounds = [(-1,1)], args = (x_vals, y_vals)))
which yields the following output:
fun: 0.24999963136767756
jac: array([ 0.])
message: 'Optimization terminated successfully.'
nfev: 3
nit: 1
njev: 1
status: 0
success: True
x: array([ 0.5])
which we of course know to be false: the true minimum is at x = 0, but the definition of f_disc
tricks the optimizer into believing that the function is constant around the given point. For my actual problem I only have f_disc
and do not have access to f
. Moreover, one call to f_disc
can take as much as a minute.
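A quick sanity check of why the reported jac is zero (a sketch of the diagnosis, assuming SLSQP's default finite-difference step eps = sqrt(machine epsilon), which is what SciPy documents for the `eps` option):

```python
import numpy as np

# SLSQP approximates the gradient by finite differences with a default
# step of sqrt(machine epsilon) ~ 1.49e-8. With N = 1e6 samples drawn
# uniformly on [-1, 1], the average spacing between consecutive x_vals
# is about 2/N = 2e-6, i.e. over 100x larger than eps, so x and x + eps
# almost always fall into the same lookup bin and the difference
# f_disc(x + eps) - f_disc(x) is exactly zero.
N = 1000000
eps = np.sqrt(np.finfo(float).eps)  # ~1.49e-8, SLSQP's default step
avg_spacing = 2.0 / N               # ~2e-6, average gap between samples
print(eps, avg_spacing, eps < avg_spacing)
```

Because the numerical gradient is identically zero at the starting point, SLSQP declares convergence immediately, which matches the `nit: 1` in the output above.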