
Imagine you have any two functions. You need to find the intersections of those functions. You definitely don't want to try all x values to check for f(x) == g(x). Normally in math, you create simultaneous equations derived from f(x) == g(x). But I see no way to implement equations in any programming language.
So, once more, what I am looking for:

  1. Conceptual algorithm to solve equations.
  2. The same for simultaneous and quadratic equations.

I believe there should be some workaround using function derivatives, but I've only recently learned the concept of differentiation at school and I have no idea how to use it in this case.

Tomáš Zato
  • Solving simultaneous non-linear equations is an advanced topic that I wouldn't expect to be built into any language. However, lots of them have linear algebra libraries that implement such solutions if you're knowledgeable about the subject. This is not the place to learn about them. Voting to close. – duffymo Feb 06 '13 at 18:08
  • I must let you know that researching an algorithm should not be reported as not-constructive. Actually, this is much more constructive than asking a very specific question, because many users may find the answers useful; mind that programming is not only source code. – Tomáš Zato Feb 06 '13 at 18:36
  • I disagree. That's your opinion. I agree that lots of people might find it useful, but this isn't a discussion forum. – duffymo Feb 06 '13 at 19:18

2 Answers


That is a much harder problem than you would imagine. A good place to start for learning about these things is the Newton-Raphson method, which gives numerical approximations to solutions of equations of the form h(x) = 0. (When you set h(x) = g(x) - f(x), this provides solutions for the problem you are asking about.)
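As a minimal sketch of the Newton-Raphson idea applied to h(x) = g(x) - f(x) — the function name, starting point, and tolerances below are illustrative choices, not from any particular library:

```python
def newton_intersection(f, g, x0, tol=1e-12, max_iter=100):
    """Approximate an x where f(x) == g(x) via Newton-Raphson on h = g - f."""
    h = lambda x: g(x) - f(x)
    x = x0
    for _ in range(max_iter):
        # Central-difference approximation of h'(x)
        eps = 1e-7
        dh = (h(x + eps) - h(x - eps)) / (2 * eps)
        if dh == 0:
            break  # flat spot: the Newton step is undefined
        x_next = x - h(x) / dh
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Example: intersection of f(x) = x**2 and g(x) = x + 1
root = newton_intersection(lambda x: x**2, lambda x: x + 1, x0=2.0)
# root converges to the golden ratio, (1 + sqrt(5)) / 2
```

Note that Newton's method only converges from a good enough starting point; picking x0 is the hard part in general.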

Exact, algebraic solving of equations (as implemented in Mathematica, for example) is even more difficult: you basically have to recreate everything you would do in your head when solving an equation on a piece of paper.

NonNumeric
us2012
    You could differentiate `h(x)` by `x` analytically (if `g(x)` and `f(x)` are defined analytically), find all the points `x` where `dh(x) / dx` = 0. Between those points `h(x)` is going to be monotonic (either non-increasing or non-decreasing but never increasing and decreasing between the adjacent points). And between those points you can then look for `x` that are solutions of `h(x)` = 0 (e.g. you could use a variant of bisection) and those solutions are where `f(x)` = `g(x)`. So, in order to solve `h(x)` = 0, you solve `dh(x) / dx` = 0. This can be done recursively. – Alexey Frunze Feb 06 '13 at 16:00
  • Thank you for your proposition Alexey, but do you think there is a way to explain this even more simply? I'm not sure what analytical differentiation means. – Tomáš Zato Feb 06 '13 at 18:23
  • You can find descriptions of Newton's and other root finding algorithms here: http://en.wikipedia.org/wiki/Root-finding_algorithm – Joni Feb 07 '13 at 06:53
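Alexey's suggestion in the comments above (find intervals where h is monotonic, then bisect within each) can be sketched as follows; the example interval and tolerance are assumptions for illustration:

```python
def bisect_root(h, lo, hi, tol=1e-12):
    """Find a root of h on [lo, hi], assuming h is monotonic there
    and h(lo), h(hi) have opposite signs."""
    f_lo = h(lo)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        f_mid = h(mid)
        if f_mid == 0:
            return mid
        # Keep the half-interval where the sign change occurs
        if (f_lo < 0) != (f_mid < 0):
            hi = mid
        else:
            lo, f_lo = mid, f_mid
    return (lo + hi) / 2

# Example: f(x) = x**2 and g(x) = x + 1 intersect where h(x) = g(x) - f(x) = 0
h = lambda x: (x + 1) - x**2
root = bisect_root(h, 1.0, 2.0)   # h(1) = 1 > 0, h(2) = -1 < 0
```

Bisection is slower than Newton's method but cannot diverge once a sign change is bracketed, which is why it pairs well with the monotonic-interval idea.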

Obviously this problem is not solvable in the general case because you can construct a "function" which is arbitrarily complex. For example, given a "function" with 5 trillion terms, including various transcendental and complex transformations, the computer could take years just to compute a single value, much less intersect it with another similar function.

So, first of all you need to define what you mean by a "function". If you mean a polynomial of degree four or less, then the problem becomes much more straightforward: you combine the terms of the two polynomials into one equation, and its roots are the x-coordinates of the intersections.
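For instance, intersecting two quadratics reduces to the quadratic formula applied to their difference. A stdlib-only sketch (the function name is mine, not a standard API):

```python
import cmath

def intersect_quadratics(a1, b1, c1, a2, b2, c2):
    """Intersections of f(x) = a1*x^2 + b1*x + c1 and g(x) = a2*x^2 + b2*x + c2.
    Combine the terms: (a1-a2)x^2 + (b1-b2)x + (c1-c2) = 0,
    then apply the quadratic formula."""
    a, b, c = a1 - a2, b1 - b2, c1 - c2
    if a == 0:
        return [] if b == 0 else [-c / b]   # degenerate case: linear equation
    d = cmath.sqrt(b * b - 4 * a * c)       # complex sqrt handles no-real-root case
    return [(-b + d) / (2 * a), (-b - d) / (2 * a)]

# Example: f(x) = x^2 and g(x) = x + 1  ->  roots of x^2 - x - 1 = 0
roots = intersect_quadratics(1, 0, 0, 0, 1, 1)
```

Cubics and quartics have closed-form formulas too (Cardano's and Ferrari's), but they are messy enough that numerical root finding is usually preferred in practice.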

If the polynomial has degree five or greater (a quintic or higher), then by the Abel-Ruffini theorem there is no general symbolic solution. In this case the terms are combined and you find the roots by iterative approximation. See Root Finding Algorithms.

If the function involves transcendentals such as sin/cos/log/e^x, etc., you can potentially find the intersection by representing the functions as a series or a continued fraction. You then subtract one series from the other, set the difference to zero, and solving the resulting equation yields an approximation of the root(s).
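A sketch of the series idea, under the assumption that we truncate the Taylor series of e^x to a polynomial and then bracket the root by bisection (the truncation length and interval are illustrative choices):

```python
from math import factorial, exp

def truncated_exp(x, n=15):
    """Taylor series of e^x truncated to n terms: sum of x^k / k!."""
    return sum(x**k / factorial(k) for k in range(n))

# Intersect f(x) = e^x with g(x) = 2 - x by solving h(x) = g(x) - f_series(x) = 0
h = lambda x: (2 - x) - truncated_exp(x)

lo, hi = 0.0, 1.0            # h(0) = 1 > 0, h(1) = 1 - e < 0, so a root is bracketed
while hi - lo > 1e-12:
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if h(mid) > 0 else (lo, mid)
root = (lo + hi) / 2         # at the root, e^root is approximately 2 - root
```

On a bounded interval the truncation error of the series is tiny (here on [0, 1] it is on the order of 1/15!), so the answer is accurate to nearly machine precision; on larger intervals you would need more terms or a different expansion point.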

Tyler Durden