
I have a set of polynomial equations and linear inequalities over real variables. How can I determine whether a solution exists? I do not need a symbolic solution or a guarantee that the answer is correct, but the algorithm should give the correct answer most of the time.

Example: Consider the system consisting of the single polynomial x^2+y^2-1 in the real variables x and y. The linear inequalities prescribe that I am looking for a solution in a small square, i.e. 0.500<x<0.501, 0.865<y<0.866. The system is then:

x^2+y^2-1==0,
0.500<x<0.501,
0.865<y<0.866

There are infinitely many solutions to this system, for example x=0.5005, y=Sqrt(1-x^2)=0.865737....

Language would ideally be Python, but other languages are welcome.

user505117
  • Note that "non-linear" is pretty vast. There are some equations that are very hard to solve. If you can supply the derivative of the function, that's already easier. If the function is a polynomial, it's even better. And if the function is a second-degree polynomial, in other words if the equations are "quadratic equations", that's much easier. – Stef May 26 '22 at 16:03
  • You could check out `sympy` (a python library) but a long term solution would be something like wolfram https://reference.wolfram.com/language/guide/EquationSolving.html – Deera Wijesundara May 26 '22 at 16:06
  • @Stef The equations are polynomials of arbitrary degree, in almost all cases of degree at most 5 – user505117 May 26 '22 at 16:23
  • @user505117 I would suggest trying with [scipy.optimize.root](https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.root.html#scipy.optimize.root), supplying the polynomial function as first argument, and its derivative as the optional `jac` argument. And supplying `x0` as the center of your small square. The solver doesn't guarantee that it will find a solution inside your square, but it should find a solution close to `x0`. If necessary, repeat with `x0` at the four corners of the square. – Stef May 26 '22 at 16:47 (see the sketch after these comments)
  • Although if there are only two variables `x` and `y` and at least one of them is used only with degree at most 4, then I think you can solve the equation algebraically (as opposed to `scipy.optimize.root`, which finds a numerical approximation). – Stef May 26 '22 at 16:49
  • @Stef When using scipy.optimize.root and starting in the center of the square, do you suggest to interpret a solution outside of my square as "there is no solution inside the square"? That is not always true, but it may be accurate enough for my purposes. And in my applications I always have more than two variables. – user505117 May 26 '22 at 16:55
  • @user505117 No, I was suggesting "if starting from the center of the square, the found solution is outside the square", then try again starting from another point, such as the corners of the square. But yes, maybe that might be accurate enough for your purposes, I don't know. – Stef May 26 '22 at 17:20
  • Very interesting, but off topic here, since it's not a specific programming question; try math.stackexchange.com instead. Looks to me like this might be a pretty subtle question; good luck and have fun. – Robert Dodier May 26 '22 at 17:34
  • @Stef FWIW explicit solutions of polynomials, even when they exist, are pretty unwieldy and difficult to work with except in special cases. – Robert Dodier May 26 '22 at 17:35
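
A minimal sketch of Stef's suggestion from the comments, adapted under one assumption: because the example has one equation in two unknowns, `scipy.optimize.root` (which needs a square system) is swapped for `scipy.optimize.least_squares`, which accepts rectangular systems and box bounds directly. All helper names below are illustrative only.

```python
# Hedged sketch: minimise the sum of squared residuals inside the box and
# treat a near-zero minimum as "a solution probably exists".
from scipy.optimize import least_squares

def residuals(v):
    x, y = v
    return [x**2 + y**2 - 1]  # one residual per polynomial equation

lower = [0.500, 0.865]   # note: these bounds are inclusive, while the
upper = [0.501, 0.866]   # question's inequalities are strict
x0 = [(a + b) / 2 for a, b in zip(lower, upper)]  # start at the box centre

res = least_squares(residuals, x0, bounds=(lower, upper))
print(res.x, res.cost)   # cost ~ 0 suggests a solution exists in the box
```

If the minimised cost is essentially zero, a solution very likely exists in the box; a clearly positive minimum suggests there is none, although, as the comments note, this heuristic offers no guarantee.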

1 Answer


If there is a single polynomial equation that needs to be solved inside a rectangular domain, then there is a nice algorithm based on Bernstein polynomials.

Assume your domain is [0,1]x[0,1], which can always be arranged by rescaling. You can then rewrite your polynomial in the Bernstein basis. The 1D case is easiest to describe: a degree-2 polynomial can be written as

B(x) = b_0 x^2 + 2 b_1 x (1-x) + b_2 (1-x)^2

What these polynomials give you is a simple sign test for zeros, based on the convex-hull property of the Bernstein coefficients. If all the coefficients b_0, b_1, b_2 are strictly positive, then B(x) is guaranteed to be strictly positive on [0,1]; likewise, if they are all negative, B(x) is strictly negative. So for B to have a zero in [0,1], the coefficients must differ in sign or include a zero.
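
As a concrete illustration, here is a 1D sketch of that sign test in Python. It uses the standard indexing where b_k multiplies C(n,k) x^k (1-x)^(n-k) (the reverse of the ordering above), together with the conversion b_k = sum_{i<=k} C(k,i)/C(n,i) c_i for p(x) = sum_i c_i x^i; the helper names are mine:

```python
# Hedged sketch of the Bernstein sign test on [0,1] for a 1D polynomial
# given by its monomial coefficients c[0] + c[1]*x + ... + c[n]*x**n.
from math import comb

def bernstein_coeffs(c):
    """Monomial coefficients -> Bernstein coefficients of the same degree."""
    n = len(c) - 1
    return [sum(comb(k, i) / comb(n, i) * c[i] for i in range(k + 1))
            for k in range(n + 1)]

def may_have_zero(c):
    """False means p is provably nonzero on [0,1]; True means 'maybe'."""
    b = bernstein_coeffs(c)
    return min(b) <= 0 <= max(b)

print(may_have_zero([0.0, -1.0, 2.0]))  # True:  2x^2 - x vanishes at 0, 0.5
print(may_have_zero([1.0, 0.0, 1.0]))   # False: x^2 + 1 has no real zeros
```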

You can then proceed in a recursive fashion: split the domain in half, rescale to fit [0,1], calculate the new Bernstein coefficients, and apply the zero test. In this way you can quickly narrow down where the zeros are, ignoring large parts of the domain.
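
A minimal sketch of that recursion in 1D, reusing `bernstein_coeffs` and `may_have_zero`'s sign test from the previous snippet. Rather than rescaling explicitly, it uses de Casteljau's algorithm, which produces the Bernstein coefficients of the two half-intervals directly and so plays the role of the "rescale to [0,1]" step:

```python
def de_casteljau_split(b):
    """Bernstein coefficients on an interval -> coefficients on its halves."""
    left, right, work = [], [], list(b)
    while work:
        left.append(work[0])
        right.append(work[-1])
        work = [(u + v) / 2 for u, v in zip(work, work[1:])]
    return left, right[::-1]

def find_zeros(b, lo=0.0, hi=1.0, tol=1e-9, out=None):
    if out is None:
        out = []
    if min(b) > 0 or max(b) < 0:   # sign test: no zero possible here
        return out
    if hi - lo < tol:              # interval small enough: report its centre
        out.append((lo + hi) / 2)
        return out
    bl, br = de_casteljau_split(b)
    mid = (lo + hi) / 2
    find_zeros(bl, lo, mid, tol, out)
    find_zeros(br, mid, hi, tol, out)
    return out

# Zeros of 2x^2 - x on [0,1]: expect ~0 and ~0.5 (a zero sitting exactly on
# a subdivision boundary may be reported by both neighbouring intervals).
print(find_zeros(bernstein_coeffs([0.0, -1.0, 2.0])))
```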

This algorithm was described for the 2D case in P. Milne, 1991, "Zero Set of Multivariate Polynomial Equations", Mathematics of Surfaces IV. I've adapted it to 3D in my paper "A new method for drawing Algebraic Surfaces" (https://singsurf.org/papers/algsurf/index.html), which describes all the relevant algorithms, and you can see a live demo at my implicit curve plotter, with code on GitHub.

There is a vast literature on this problem and plenty of other algorithms for it; marching squares is a popular one.

Salix alba
  • Thanks for the answer. I think I can reduce the case of multiple polynomials p1, p2, ..., pk to the case of a single polynomial by defining q=p1^2+...+pk^2. That way, the zero set of q is precisely the set where p1, ..., pk simultaneously vanish. As an example, take x^2+y^2 in R^3 with coordinates x, y, z. There, x^2+y^2 describes a line. Will your suggestion work in this case? – user505117 May 29 '22 at 12:21
  • You can do it that way, but that increases the degree of the polynomial, slowing down the algorithm. Alternatively, you could run the algorithm on each polynomial at the same time. If you are looking for the intersections, both must have a zero in a given region. – Salix alba May 29 '22 at 20:47
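
A short sketch of that simultaneous variant, assuming the `may_have_zero` helper from the snippets above: a region survives only while every polynomial in the system may still vanish on it, so any single polynomial of uniform sign prunes the region for the whole system.

```python
# Hedged sketch: the common-zero test for several 1D polynomials at once.
def all_may_vanish(coeff_lists):
    return all(may_have_zero(c) for c in coeff_lists)

# p1 = x - 0.5 and p2 = 2x^2 - x can only vanish together near x = 0.5.
print(all_may_vanish([[-0.5, 1.0], [0.0, -1.0, 2.0]]))  # True on [0,1]
```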