
I have two functions f(x,y,z) and g(x,y,z). I want to minimise the sum h(x,y,z) = f(x,y,z) + g(x,y,z), while allowing x to take a different value in f than in g (y and z are shared).

I can minimise these functions separately or together using scipy.optimize.minimize, which evaluates f + g (i.e. h) at a sequence of (x, y, z) values and returns the (x, y, z) at which f + g is smallest. The problem is that f and g are always evaluated at the same (x, y, z), whereas I want one of the arguments (say x) to vary independently between f and g.

This is a rough outline of what I am trying:

    import scipy.optimize as op

    def f(x, y, z):
        return scalar          # placeholder: returns a scalar objective value

    def g(x, y, z):
        return another_scalar  # placeholder: returns another scalar

    def h(theta):
        x, y, z = theta
        return f(x, y, z) + g(x, y, z)

    def bestfit(guess, method='Nelder-Mead'):
        result = op.minimize(h,
                             guess,
                             method=method,
                             options={'maxfev': 5000,
                                      'disp': False})

        if not result.success:
            print('Optimisation did not converge.')

        return result

    # Use a name other than g here, otherwise it shadows the function g above.
    guess = [x0, y0, z0]
    bf = bestfit(guess, method='Nelder-Mead')
    print(bf)

I am not sure whether this can be done with scipy.optimize. Can it? Or is there another Python module I could use?

psi

1 Answer


My first thought would be to define new functions, say a and b, with y and z held fixed, such that a(x) = f(x, y0, z0) and b(x) = g(x, y0, z0), and then minimise each of these functions over x.
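
As a minimal sketch of that idea, with placeholder objectives and arbitrarily chosen fixed values y0, z0 (both are assumptions, not taken from the question):

    import scipy.optimize as op

    # Placeholder objectives -- substitute your real f and g.
    def f(x, y, z):
        return (x - 1.0)**2 + y**2 + z**2

    def g(x, y, z):
        return (x + 2.0)**2 + (y - 1.0)**2 + z**2

    y0, z0 = 0.5, 0.5  # fixed values, chosen only for illustration

    def a(x):
        return f(x[0], y0, z0)

    def b(x):
        return g(x[0], y0, z0)

    res_a = op.minimize(a, x0=[0.0], method='Nelder-Mead')
    res_b = op.minimize(b, x0=[0.0], method='Nelder-Mead')
    print(res_a.x, res_b.x)  # the two minimising x values can differ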

cmay
    I did try something like this: I created h(x1, x2, y, z) = f(x1, y, z) + g(x2, y, z) and tried to minimise h, treating x1 and x2 as independent variables. But I am not sure whether this is the right way. – psi Nov 09 '21 at 23:14
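
A minimal sketch of the four-parameter formulation described in that comment, again with placeholder objectives (the shared y and z are the last two entries of theta):

    import scipy.optimize as op

    # Placeholder objectives -- substitute your real f and g.
    def f(x, y, z):
        return (x - 1.0)**2 + y**2 + z**2

    def g(x, y, z):
        return (x + 2.0)**2 + (y - 1.0)**2 + z**2

    def h(theta):
        x1, x2, y, z = theta   # x1 feeds f, x2 feeds g; y and z are shared
        return f(x1, y, z) + g(x2, y, z)

    result = op.minimize(h, [0.0, 0.0, 0.0, 0.0], method='Nelder-Mead')
    print(result.x)            # [x1, x2, y, z] at the joint minimum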