I have two functions f(x, y, z) and g(x, y, z), and I want to minimise their sum h(x, y, z) = f(x, y, z) + g(x, y, z), while allowing x to take a different value in f than in g.

I can minimise these functions, separately or together, using scipy.optimize.minimize, which essentially evaluates f + g (i.e. h) at a set of (x, y, z) values and returns the (x, y, z) at which f + g is minimal. The problem is that f and g are then always evaluated at the same (x, y, z), whereas I want one of the arguments (say x) to be allowed to differ between f and g.
This is a rough outline of what I am trying:
import scipy.optimize as op

def f(x, y, z):
    return scalar          # placeholder: my actual function returns a scalar

def g(x, y, z):
    return another_scalar  # placeholder: my actual function returns a scalar

def h(theta):
    x, y, z = theta
    return f(x, y, z) + g(x, y, z)

def bestfit(guess, method='Nelder-Mead'):
    result = op.minimize(h, guess,
                         method=method,
                         options={'maxfev': 5000,
                                  'disp': False})
    if not result.success:
        print('Optimisation did not converge.')
    return result

guess = [x0, y0, z0]  # initial guess (renamed so it does not shadow the function g)
bf = bestfit(guess, method='Nelder-Mead')
print(bf)
I am not sure whether this is possible with scipy.optimize. Can it be done, or is there some other Python module I can use?
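To make the goal concrete, here is a toy sketch of the behaviour I am after. The quadratic bodies of f and g are made up purely for illustration (my real functions are different); the idea is to optimise over a 4-parameter vector so that f and g each receive their own copy of x while y and z stay shared:

```python
import scipy.optimize as op

def f(x, y, z):
    # toy placeholder, minimised at x = 1, y = 0, z = 0
    return (x - 1.0)**2 + y**2 + z**2

def g(x, y, z):
    # toy placeholder, minimised at x = -1, y = 2, z = 0
    return (x + 1.0)**2 + (y - 2.0)**2 + z**2

def h(theta):
    # x is split into x_f (used by f) and x_g (used by g);
    # y and z are common to both functions
    x_f, x_g, y, z = theta
    return f(x_f, y, z) + g(x_g, y, z)

result = op.minimize(h, [0.0, 0.0, 0.0, 0.0],
                     method='Nelder-Mead',
                     options={'maxfev': 5000})
print(result.x)  # roughly [1, -1, 1, 0] for these toy functions
```

With these placeholders, x_f and x_g converge to their separate optima (1 and -1) while the shared y settles at the compromise value 1. I don't know if this reparameterisation is the intended way to do it, which is what I am asking.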