
I have been trying to find a method similar to Excel's Solver, where I can make a function converge on a specific target value. I do not want a minimum or maximum optimization.

For example, if my function is:

f(A, B, C) = A^2 + cos(B) - sqrt(C)

I want f(A, B, C) = 1.86. Is there a Python method that can iterate a solution for A, B, and C to get as close to 1.86 as possible, within an acceptable error of the target value?

https://docs.scipy.org/doc/scipy/reference/optimize.html and https://www.scipy-lectures.org/advanced/mathematical_optimization/ – Joe Jul 11 '18 at 06:13

1 Answer


You need a root-finding algorithm for your problem; only a small transformation is required. Find the roots of g(A, B, C):

g(A, B, C) = A^2 + cos(B) - sqrt(C) - 1.86

Using scipy.optimize.root (refer to the documentation):

import numpy as np
from scipy import optimize

# Two extra 0's act as dummy equations, because optimize.root solves a
# system of equations rather than a single multivariate equation.
def func(x):                      # x is the ndarray [A, B, C]
    return [np.square(x[0]) + np.cos(x[1]) - np.sqrt(x[2]) - 1.86, 0, 0]

result = optimize.root(func, x0=[0.1, 0.1, 0.1])
x = result.x
A, B, C = x
x
# array([ 1.09328544, -0.37977694,  0.06970678])

You can now check your solution:

np.square(x[0]) + np.cos(x[1]) - np.sqrt(x[2])

# 1.8600000000000005
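
The question also mentions an acceptable error from the target value. scipy.optimize.root accepts a tol argument, and the returned OptimizeResult reports whether the solver considers itself converged, so you can check both. A minimal sketch (the tolerance value below is just an assumed example, not part of the original answer):

import numpy as np
from scipy import optimize

TARGET = 1.86
TOL = 1e-8                        # assumed acceptable error on the target value

def func(x):                      # x is the ndarray [A, B, C]
    return [np.square(x[0]) + np.cos(x[1]) - np.sqrt(x[2]) - TARGET, 0, 0]

result = optimize.root(func, x0=[0.1, 0.1, 0.1], tol=TOL)
A, B, C = result.x
error = np.square(A) + np.cos(B) - np.sqrt(C) - TARGET
print(result.success, abs(error))  # solver-reported convergence flag and the actual error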