I have an equation f(x)=y with the following properties:
- we have an explicit expression for the derivative f'
- the root is known to be between 0 and 1 (in fact f is defined only on this interval)
- f is (weakly) increasing
I am currently solving this equation with scipy.optimize.brentq, which works well enough:
optimize.brentq(f_to_zero, 0, 1)
brentq uses Brent's method, which combines bisection with the secant method and inverse quadratic interpolation; it does not make use of the explicit f'.
I was hoping I could make the root finding faster by using this derivative. In other words, I want Newton's method augmented with bounds; this article suggests the following idea: if at any point the Newton guess falls outside the bounds, perform a bisection step (between the current guess and the violated bound) instead. Is there any well-tested package for this? I would really rather not write my own implementation; it's probably not worth it for the performance gain.
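For reference, the scheme the article describes can be sketched as follows. This is only a minimal illustration, not production code; `newton_safeguarded` is my own name, not a function from any library:

```python
def newton_safeguarded(f, fprime, lo, hi, tol=1e-12, maxiter=100):
    """Newton's method that falls back to bisection whenever a step
    would leave the current bracket [lo, hi]."""
    flo, fhi = f(lo), f(hi)
    if flo == 0.0:
        return lo
    if fhi == 0.0:
        return hi
    if flo * fhi > 0:
        raise ValueError("root must be bracketed by [lo, hi]")
    x = 0.5 * (lo + hi)
    for _ in range(maxiter):
        fx = f(x)
        if fx == 0.0:
            return x
        # shrink the bracket while keeping the sign change inside it
        if flo * fx < 0:
            hi = x
        else:
            lo, flo = x, fx
        d = fprime(x)
        # Newton step; if the derivative vanishes, just bisect
        x_new = x - fx / d if d != 0.0 else 0.5 * (lo + hi)
        if not (lo < x_new < hi):      # guess left the bracket:
            x_new = 0.5 * (lo + hi)    # bisect instead
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```

Because every iterate stays inside the bracket, f is never evaluated outside its domain, and the method degrades gracefully to bisection when Newton misbehaves.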
Another point is that the function isn't even defined outside the bounds, so using
optimize.newton(f_to_zero, fprime=f_prime, x0=0.5)
is not just inefficient; it will throw an error.
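To see how an unguarded Newton step leaves the interval, here is a hypothetical stand-in for f (the real f is in the gist below); a single step from a flat region of the function already lands far outside [0, 1], at which point evaluating f raises:

```python
# Hypothetical stand-in for f, defined only on [0, 1]; its root is at x = 0.1.
def f(x):
    if not 0.0 <= x <= 1.0:
        raise ValueError("f is undefined outside [0, 1]")
    return x ** 3 - 1e-3

def f_prime(x):
    return 3 * x ** 2

# One unguarded Newton step from a region where f is nearly flat:
x0 = 0.01
x1 = x0 - f(x0) / f_prime(x0)  # about 3.34, far outside [0, 1]
# f(x1) would now raise ValueError
```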
Somebody asked what the expression for f'
is. f
and f'
are very long, so I have made a gist here: https://gist.github.com/d3c48dde09389e8c48da0e990b57bf99