R question: Looking for the fastest way to NUMERICALLY solve a bunch of arbitrary cubics known to have real coefficients and three real roots. The polyroot function in R is reported to use Jenkins-Traub's algorithm 419 for complex polynomials, but for real polynomials the authors refer to their earlier work. What are the faster options for a real cubic, or more generally for a real polynomial?
By "solve a bunch", are you talking about several separate polynomials, or are you trying to solve a system of polynomials with n variables? – John Feminella Jan 05 '10 at 01:05
Hundreds of independent cubics, one by one, as one does with polyroot or any of Householder's methods. – andrekos Jan 06 '10 at 13:41
7 Answers
The numerical solution for doing this many times in a reliable, stable manner involves two steps: (1) form the companion matrix, (2) find the eigenvalues of the companion matrix.
You may think this is a harder problem to solve than the original one, but this is how the solution is implemented in most production code (say, Matlab).
For the polynomial:
p(t) = c0 + c1 * t + c2 * t^2 + t^3
the companion matrix is:
[ 0  0  -c0 ]
[ 1  0  -c1 ]
[ 0  1  -c2 ]
Find the eigenvalues of this matrix; they are exactly the roots of the original polynomial.
For doing this very fast, download the eigenvalue routines (e.g., DGEEV) from LAPACK, compile them, and link them to your code. Do this in parallel if you have very many (say, about a million) sets of coefficients.
Notice that the coefficient of t^3 is one; if this is not the case in your polynomials, you will have to divide the whole thing by that coefficient and then proceed.
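For instance, a two-line sketch of that normalization in R (the coefficient values here are made up):

# Make a general cubic k3*t^3 + k2*t^2 + k1*t + k0 monic before
# forming the companion matrix; the k values are made-up examples.
k <- c(2, -8, 6, 4)     # k0, k1, k2, k3
co <- k[1:3] / k[4]     # c0, c1, c2 of the equivalent monic cubic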
Good luck.
Edit: NumPy and Octave also rely on this methodology for computing the roots of polynomials. See, for instance, this link.

I think this is still a relevant answer; it's easy enough to find the eigenvalues of the companion matrix in R using the `eigen()` function. – Ken Williams Jan 12 '10 at 16:24
+1. GSL does this too. The Jenkins-Traub algorithm is the state of the art for solving polynomials directly, but it can encounter instabilities sometimes (and it is complex as hell), and when you look closer, it seems to be doing a power iteration on the companion matrix anyway... The method you indicate is bulletproof. – Alexandre C. Apr 06 '11 at 07:17
The fastest known way (that I'm aware of) to find the real solutions of a system of arbitrary polynomials in n variables is polyhedral homotopy. A detailed explanation is probably beyond a StackOverflow answer, but essentially it's a path-following algorithm that exploits the structure of each equation using toric geometry. Google will give you a number of papers.
Perhaps this question is better suited for MathOverflow?

Fleshing out Arietta's answer above:
> a <- c(1, 3, -4)   # c0, c1, c2 of the monic cubic p(t) = c0 + c1*t + c2*t^2 + t^3
> m <- matrix(c(0, 0, -a[1], 1, 0, -a[2], 0, 1, -a[3]), byrow = TRUE, nrow = 3)   # companion matrix
> roots <- eigen(m, symmetric = FALSE, only.values = TRUE)$values
Whether this is faster or slower than using the cubic solver in the GSL package (as suggested by knguyen above) is a matter of benchmarking it on your system.
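As a rough starting point for such a benchmark, here is a sketch in base R; the test cubic and the 10000 repetitions are arbitrary choices:

# Rough benchmark sketch: polyroot() vs. companion-matrix eigenvalues.
a <- c(1, 3, -4)   # c0, c1, c2, as above
m <- matrix(c(0, 0, -a[1], 1, 0, -a[2], 0, 1, -a[3]), byrow = TRUE, nrow = 3)
system.time(for (i in 1:10000) polyroot(c(a, 1)))
system.time(for (i in 1:10000) eigen(m, symmetric = FALSE, only.values = TRUE)$values)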

The companion matrix method is probably the slowest of all the polynomial root-finding algorithms (though for cubics it should still be quite fast). Jenkins-Traub's real-coefficient variant is certainly faster. There is an open-source C++ implementation (BSD license) of the Jenkins-Traub algorithm called [RPOLY++](https://github.com/sweeneychris/RpolyPlusPlus). – kip622 Sep 20 '15 at 15:04
Do you need all 3 roots or just one? If just one, I would think Newton's method would work fine. If you need all 3, it might be problematic in circumstances where two are close together.

I vote for Newton's method, then. Start with 0.5 as a guess, limit the domain to between 0 and 1, and it should work fine. It should have quadratic convergence; the only reason you would have a problem is if the function has a minimum or maximum within that range (derivative = 0). – Jason S Jan 06 '10 at 14:08
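For reference, a minimal Newton iteration in R along these lines; the monic-cubic parameterization, starting guess, tolerance, and iteration cap are all arbitrary choices:

# Minimal Newton's method sketch for one root of p(t) = c0 + c1*t + c2*t^2 + t^3.
# Which of the three roots it converges to depends on the starting guess.
newton_cubic <- function(co, x = 0.5, tol = 1e-12, maxit = 100) {
  for (i in seq_len(maxit)) {
    p  <- co[1] + co[2] * x + co[3] * x^2 + x^3   # p(x)
    dp <- co[2] + 2 * co[3] * x + 3 * x^2         # p'(x)
    x <- x - p / dp                               # Newton step
    if (abs(p / dp) < tol) break                  # step just taken is tiny
  }
  x
}
newton_cubic(c(1, 3, -4))   # one real root of 1 + 3*t - 4*t^2 + t^3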
1) Solve the derivative polynomial P' to locate your three roots. For a cubic, P' is a quadratic, so it can be solved in closed form. Call its roots a and b (with a < b).
2) For the middle root, use a few steps of bisection between a and b, and when you're close enough, finish with Newton's method.
3) For the min and max root, "hunt" the solution. For the max root (an R sketch of the full recipe follows these steps):
- Start with x0 = b, x1 = b + (b - a) * lambda, where lambda is a moderate number (say 1.6)
- do x_n = b + (x_{n - 1} - a) * lambda until P(x_n) and P(b) have different signs
- Perform bisection + Newton between x_{n - 1} and x_n
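A sketch of this recipe in R for a monic cubic, assuming three distinct real roots; lambda is the same arbitrary expansion factor, and base R's uniroot() (Brent's method) stands in for the hand-rolled bisection + Newton refinement:

# Bracket-and-hunt sketch for p(t) = c0 + c1*t + c2*t^2 + t^3,
# assuming three distinct real roots; uniroot() replaces bisection + Newton.
solve_cubic <- function(co, lambda = 1.6) {
  p <- function(t) co[1] + co[2] * t + co[3] * t^2 + t^3
  crit <- sort(Re(polyroot(c(co[2], 2 * co[3], 3))))   # roots of P' (a quadratic)
  a <- crit[1]; b <- crit[2]
  mid <- uniroot(p, c(a, b))$root                      # middle root lies in (a, b)
  hunt <- function(from, dir) {                        # expand until the sign flips
    step <- (b - a) * lambda
    x <- from + dir * step
    while (sign(p(x)) == sign(p(from))) {
      step <- step * lambda
      x <- from + dir * step
    }
    uniroot(p, sort(c(from, x)))$root
  }
  c(hunt(a, -1), mid, hunt(b, +1))                     # min, middle, max root
}
solve_cubic(c(1, 3, -4))   # the three real roots of 1 + 3*t - 4*t^2 + t^3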

The common methods are available: Newton's method, bisection, the secant method, fixed-point iteration, etc. Google any one of them.
If you have a non-linear system on the other hand (e.g. a system of N polynomial equations in N unknowns), a method such as high-order Newton may be used.
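As a sketch of the system case, here is bare-bones multivariate Newton in R; the 2x2 polynomial system, Jacobian, starting point, and tolerance are all made-up illustrations:

# Newton's method for a system f(x) = 0: repeatedly solve jac(x) %*% dx = f(x).
f <- function(x) c(x[1]^2 + x[2]^2 - 4,   # circle of radius 2
                   x[1] * x[2] - 1)       # hyperbola x*y = 1
jac <- function(x) matrix(c(2 * x[1], 2 * x[2],
                            x[2],     x[1]), byrow = TRUE, nrow = 2)
x <- c(2, 1)                              # made-up starting guess
for (i in 1:50) {
  dx <- solve(jac(x), f(x))               # Newton step
  x <- x - dx
  if (sqrt(sum(dx^2)) < 1e-12) break
}
x                                         # one intersection point of the system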

Have you tried looking into the GSL package http://cran.r-project.org/web/packages/gsl/index.html?
