3

I'm looking for a fast algorithm to solve a system of N polynomial equations in 3 unknown variables. That is, given the functions F0(x,y,z), F1(x,y,z), ..., FN(x,y,z), I want to find x, y, z such that F0(x,y,z) = F1(x,y,z) = ... = FN(x,y,z) = 0.

I've looked for a solution in several different places, but I could only find very advanced papers on topics such as algebraic geometry or cryptography. What I need, though, is a simple algorithm that quickly returns a numerical solution. Is there such an algorithm?

MaiaVictor
  • Multivariate Newton Raphson would be my first try. – John Alexiou Jan 11 '15 at 20:11
  • Solving several polynomial equations in several variables is a hard problem. You're unlikely to find a simple and quick algorithm for it that always works. – tmyklebu Jan 11 '15 at 20:40
  • Ah, thank you, @tmyklebu. That is a perfectly good answer if you can provide a source, btw; it would be helpful because then I could just stop searching. By the way, are there not even approximate solutions? – MaiaVictor Jan 11 '15 at 20:45
  • By the way, if you just add up your functions, $\sum_i F_i(x,y,z) = 0$ is just a single polynomial you need to find a solution for. Now, your system might be under- or overdetermined, so this might influence the solvability of your problem further. – Marcus Müller Jan 11 '15 at 21:19
  • By the way, are there boundaries on the highest exponent? Do the Fs use the same exponents, or are they partly or fully disjoint? – Marcus Müller Jan 11 '15 at 21:21
  • Yes, the highest exponent can be no higher than `^2`, I believe. The reason is that the equations come from the intersection of a 3D line `F(t) = a + b*t` and a parametric surface `G(u,v)` that is handcrafted by an artist and will basically use sines, cosines, sqrt, etc. [I've read you can eliminate the sines/cosines](http://en.wikipedia.org/wiki/System_of_polynomial_equations), though (section "Trigonometric equations"). – MaiaVictor Jan 11 '15 at 21:47

2 Answers

3

Solving several polynomial equations in several variables is a hard problem. Doing so in polynomial time in the average case is Smale's 17th problem. It is unlikely that you will find a fast and simple algorithm for doing so that actually works.

You might look at "Ideals, varieties, and algorithms" by Cox, Little, and O'Shea for an intro to Groebner bases. Buchberger's algorithm finds a Groebner basis for a given polynomial ideal. You can find all solutions of a given polynomial system using a Groebner basis for the ideal generated by the polynomials, though the solution comes in a slightly awkward form.
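
For a concrete feel, here is a minimal sketch using SymPy's `groebner` (assuming SymPy is available; the system below is a made-up placeholder, not the asker's equations). With lexicographic order the basis is "triangular": the last element involves only z, the one before it only y and z, and so on, so you can solve the last polynomial for one variable and back-substitute.

```python
from sympy import groebner, symbols

x, y, z = symbols('x y z')

# Placeholder system (illustrative only).
polys = [x**2 + y + z - 1,
         x + y**2 + z - 1,
         x + y + z**2 - 1]

# Compute a lexicographic Groebner basis of the ideal generated by polys.
G = groebner(polys, x, y, z, order='lex')
for g in G:
    print(g)
```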

Newton's method is a basic method for solving a system of several nonlinear equations in several variables. Applied naively, Newton's method is heuristic; it won't always find a solution to a system even if a solution exists. However, if Newton's method converges, then it converges really fast. Thus the challenge of the theory problem posed by Smale lies in finding a provably good initial guess to start Newton's method from.
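
To make that concrete, here is a bare-bones Newton iteration in Python/NumPy for a square system (3 equations, 3 unknowns); the test system and starting point are placeholders. With N > 3 equations one would take a least-squares (Gauss-Newton) step instead, e.g. via `np.linalg.lstsq`.

```python
import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50):
    """Solve F(x) = 0 by Newton's method; F maps R^n -> R^n, J is its Jacobian."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = F(x)
        if np.linalg.norm(fx) < tol:
            return x
        # Solve J(x) * dx = -F(x) for the Newton step.
        dx = np.linalg.solve(J(x), -fx)
        x = x + dx
    raise RuntimeError("Newton's method did not converge")

# Placeholder system: sphere, plane, and a saddle surface.
F = lambda v: np.array([v[0]**2 + v[1]**2 + v[2]**2 - 1,
                        v[0] - v[1],
                        v[2] - v[0]*v[1]])
J = lambda v: np.array([[2*v[0], 2*v[1], 2*v[2]],
                        [1.0,   -1.0,    0.0],
                        [-v[1], -v[0],   1.0]])
print(newton(F, J, [0.5, 0.5, 0.5]))
```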

Beltran and Pardo made considerable progress on Smale's 17th problem, giving an algorithm that works on average for systems of bounded degree using real-number arithmetic. This has since been turned into a finite-precision algorithm by Briquel, Cucker, Pena, and Roshchina. Fascinating as they are, I'm not aware of any implementations, or any attempts at implementations, of these ideas; we're still very, very far away from having usable code for solving systems of polynomial equations.

tmyklebu
  • Oh, that is an awesome answer, thank you very much. Just one thing, sorry, I'm really curious. What is stopping us from coding those algorithms? – MaiaVictor Jan 11 '15 at 22:43
  • @Viclib: For Beltran and Pardo: we don't have infinite-precision real arithmetic on computers. Part of the point of the other paper is that it works around this difficulty in theory. For the Briquel et al. result: nothing, technically. But people don't really implement these sorts of theoretical algorithms because they tend to be much less effective in practice than their asymptotic running time would suggest. – tmyklebu Jan 11 '15 at 22:50
1

In your last comment, you reduced the initial problem to a much simpler one: the intersection of a line x(t) = a + b*t with a surface G(x) = 0. Simply inserting the line into the surface equation gives the univariate problem F(t) = G(a + b*t) = 0. There you can use the univariate Newton method, or derivative-free methods such as the Illinois method (regula falsi with a twist) or Brent's method. There still remains the global problem of identifying intervals with a sign change: either the shape of the surface gives you some idea, or you have to tabulate the function. Homotopy continuation can also play a role, since nearby lines will very often have nearby roots.
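
A sketch of this reduction, assuming SciPy is available (the sphere and line below are made-up placeholders for the artist's surface): tabulate F on a coarse grid to bracket sign changes, then let `scipy.optimize.brentq` refine each bracketed root.

```python
import numpy as np
from scipy.optimize import brentq

# Placeholder surface G(x, y, z) = 0: a unit sphere.
def G(p):
    x, y, z = p
    return x**2 + y**2 + z**2 - 1.0

# Placeholder line x(t) = a + b*t.
a = np.array([0.1, 0.0, -2.0])
b = np.array([0.0, 0.0, 1.0])

# Substituting the line gives the univariate problem F(t) = G(a + b*t) = 0.
F = lambda t: G(a + b * t)

# Tabulate F on a coarse grid to find intervals with a sign change...
ts = np.linspace(0.0, 4.0, 41)
vals = [F(t) for t in ts]
for t0, t1, v0, v1 in zip(ts, ts[1:], vals, vals[1:]):
    if v0 * v1 < 0:
        # ...then let Brent's method refine each bracketed root.
        t = brentq(F, t0, t1)
        print("intersection at t =", t, "point =", a + b * t)
```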

Lutz Lehmann