
What is the computational complexity of the Gram-Schmidt orthogonalization algorithm?

Suppose a matrix with m rows and k columns; how many operations are required to compute the orthogonalization?

If possible I would like to have the exact number of multiplications and additions.

EDIT: It seems to me that the total number of operations (multiplications + additions) is (3/2)k^2 m + (3/2)mk + k^2/2 + k/2.
I would like to know if this is correct and if there is a quicker version.

MonadBoy
Donbeo

2 Answers


A dot product of two m-vectors takes m-1 additions and m multiplications.

Vector normalization takes 1 dot product (the vector with itself), 1 square root, and m divisions, i.e.

m-1 +, m *, m /, 1 √

Subtraction of a vector projection takes 1 dot product, then m multiplications and m additions, i.e.

2m-1 +, 2m *
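
For concreteness, here is a tiny plain-Python sketch of these three building blocks (the names are purely illustrative); the counts above map directly onto the loops below.

```python
# Illustrative sketch of the primitives counted above (plain Python, no libraries).

def dot(u, v):                  # m multiplications, m-1 additions (mathematically)
    return sum(ui * vi for ui, vi in zip(u, v))

def normalize(v):               # 1 dot product, 1 square root, m divisions
    n = dot(v, v) ** 0.5
    return [vi / n for vi in v]

def subtract_projection(v, q):  # 1 dot product, then m multiplications and m additions
    c = dot(q, v)
    return [vi - c * qi for vi, qi in zip(v, q)]
```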

Computation of the j-th vector takes (j-1) subtractions of projections, followed by a normalization, i.e.

(2m-1)(j-1)+m-1 +, 2m(j-1)+m *, m /, 1 √

You compute vectors for j=1 to k, so the factors (j-1) sum to the triangular number (k-1)k/2 and the terms independent of j are multiplied by k:

(2m-1)(k-1)k/2+(m-1)k +, 2m(k-1)k/2+mk *, mk /, k √

In the normalization, the m divisions can be traded for a single division (computing the inverse of the norm) followed by m multiplications by that inverse, yielding

(2m-1)(k-1)k/2+(m-1)k +, 2m(k-1)k/2+2mk *, k /, k √

So essentially 2mk² operations.
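
To double-check the closed forms, here is a small self-contained Python sketch (pure Python, illustrative names) that runs classical Gram-Schmidt while tallying every operation and then compares the tallies with the formulas above:

```python
# Classical Gram-Schmidt with an operation counter (illustrative sketch).

def counted_gram_schmidt(A, ops):
    """A is a list of k linearly independent vectors of length m (the columns)."""
    def dot(u, v):
        ops['add'] += len(u) - 1            # m-1 additions
        ops['mul'] += len(u)                # m multiplications
        return sum(ui * vi for ui, vi in zip(u, v))

    Q = []
    for a in A:
        v = list(a)
        for q in Q:                         # j-1 projection subtractions
            c = dot(q, v)                   # m-1 +, m *
            ops['add'] += len(v)            # m additions
            ops['mul'] += len(v)            # m multiplications
            v = [vi - c * qi for vi, qi in zip(v, q)]
        inv = 1.0 / dot(v, v) ** 0.5        # m-1 +, m *, 1 sqrt, 1 division
        ops['sqrt'] += 1
        ops['div'] += 1
        ops['mul'] += len(v)                # m multiplications by the inverse
        Q.append([vi * inv for vi in v])
    return Q

m, k = 7, 4
A = [[float(i + 1) ** (j + 1) for i in range(m)] for j in range(k)]  # independent columns
ops = {'add': 0, 'mul': 0, 'div': 0, 'sqrt': 0}
counted_gram_schmidt(A, ops)

expected = {'add': (2 * m - 1) * (k - 1) * k // 2 + (m - 1) * k,
            'mul': 2 * m * (k - 1) * k // 2 + 2 * m * k,
            'div': k,
            'sqrt': k}
print(ops)        # measured counts
print(expected)   # closed-form counts from above; the two should agree
```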


The overall complexity of the Gram-Schmidt algorithm is O(mk^2):

The process must be applied k times, and each orthogonalization takes O(mk) operations (multiplications and additions), so altogether the complexity is O(mk^2).
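
The nesting is easy to see in a rough NumPy sketch (names are illustrative): the outer loop runs k times, each pass performs up to k-1 projections, and every projection costs O(m), hence O(mk^2) overall.

```python
# Rough NumPy sketch showing where the O(m*k^2) comes from.
import numpy as np

def gram_schmidt(A):
    m, k = A.shape
    Q = np.zeros((m, k))
    for j in range(k):                    # k passes ...
        v = A[:, j].astype(float)
        for i in range(j):                # ... up to k-1 projections per pass ...
            v -= (Q[:, i] @ v) * Q[:, i]  # ... each costing O(m)
        Q[:, j] = v / np.linalg.norm(v)   # normalization: O(m)
    return Q
```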

Gerard Rozsavolgyi
  • Do you know the total number of floating-point operations? – Donbeo Jan 16 '15 at 14:51
  • @Donbeo According to wikipedia, "The cost of this algorithm is asymptotically 2nk^2 floating point operations, where n is the dimensionality of the vectors." They cite Golub, Gene H.; Van Loan, Charles F. (1996), Matrix Computations (3rd ed.), Johns Hopkins, ISBN 978-0-8018-5414-9. – Katie Jan 16 '15 at 14:52
  • @Donbeo: I'm not sure about the total number of floating-point operations, but you should count the multiplications, as they are the more relevant ones, and in your formula you can ignore all terms other than the mk^2 one. I don't think you can do better with the standard Gram-Schmidt algorithm. – Gerard Rozsavolgyi Jan 16 '15 at 14:59
  • In FPUs, it is not so obvious that multiplications dominate additions. –  Jan 16 '15 at 15:31
  • Yes, that can happen, but why do you think it does here? – Gerard Rozsavolgyi Jan 16 '15 at 15:39
  • 1
    In Gram-Schmidt there are about as many additions as multiplies, there is no reason to ignore additions. –  Jan 16 '15 at 15:43
  • It's known that multiplications cost more in terms of computation time, and additions are often ignored in computational-complexity evaluations compared to multiplications. See http://en.wikipedia.org/wiki/Computational_complexity_of_mathematical_operations – Gerard Rozsavolgyi Jan 16 '15 at 16:08
  • 1
    This link is not relevant, it relates to operations on `n` digits numbers. Here were are using floating-point representations and both additions and multiplication are O(1), multiplication being slightly slower. –  Jan 16 '15 at 17:56
  • Yes, you are right; after checking, there is no big difference for floating-point ops. – Gerard Rozsavolgyi Jan 16 '15 at 19:11