
I have written a linear solver in ANSI C that uses Householder reflections/transformations to solve Ax = b given A and b. I want to use it to find the eigenvector associated with a known eigenvalue, like this:

(A-lambda*I)x = 0

The problem is that I always get the zero vector as the solution (before someone says it: yes, I have the correct eigenvalue with 100% certainty).

Here's an example which pretty accurately illustrates the issue:

Given A-lambda*I (example just happens to be Hermitian):

1 2 0 | 0
2 1 4 | 0
0 4 1 | 0

Householder reflections/transformations will yield something like this:

# # # | 0
0 # # | 0
0 0 # | 0

Back substitution will then find that the solution is {0, 0, 0}, obviously.

Thomas Dickey
nick_name
  • It is highly unlikely that a bug exists in my code, since I can perform matrix inversion operations (this entails multiple linear solves back to back) and the inverses are always correct. I believe it's more likely that my tactic isn't going to be fruitful and that there is a better way to go about the back-substitution portion that doesn't always yield the 0 vector. – nick_name Oct 25 '11 at 06:41
  • 1
    It is generally a bad idea to use a linear solver to find eigenvectors (one of the reasons being ill-posedness: (A - lambda I)x = 0 has by definition several linearly independent solutions). If you are looking for an eigenvector associated to a particular eigenvalue, there are routines in LAPACK for this task. – Alexandre C. Oct 25 '11 at 09:48

2 Answers


It's been a while since I've written an eigensolver, but I seem to recall that the trick was to refactor it from (A - lambda*I) * x = 0 to A*x = lambda*x. Then your Householder or Givens steps will give you something like:

# # # | #
0 # # | #
0 0 1 | 1

...from which you can back substitute without reaching the degenerate 0 vector. Usually you'll want to deliver x in normalized form as well.

My memory is quite rusty here, so I'd recommend checking Golub & Van Loan for the definitive answer. There are quite a few tricks involved in getting this to work robustly, particularly for the non-symmetric case.

Drew Hall
  • @nick_name: Not that I don't appreciate it, but I wouldn't accept this response just yet. You might get something better if you wait a day or so. – Drew Hall Oct 25 '11 at 06:52

This is basically the same answer as @Drew, but explained a bit differently.

If A is the matrix

1  2  0
2  1  4
0  4  1

then the eigenvalues are lambda = 1, 1+sqrt(20), 1-sqrt(20). Let us take for simplicity lambda = 1. Then the augmented matrix for the system (A - lambda*I) * x = 0 is

0  2  0 | 0
2  0  4 | 0
0  4  0 | 0
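For the record, the eigenvalues quoted above follow from expanding the characteristic polynomial along the first row:

```latex
\det(A-\lambda I)
  = (1-\lambda)\left[(1-\lambda)^2-16\right] - 2\cdot 2\,(1-\lambda)
  = (1-\lambda)\left[(1-\lambda)^2-20\right],
```

which vanishes for lambda = 1 and for (1-lambda)^2 = 20, i.e. lambda = 1 ± sqrt(20).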

Now you apply the Householder / Givens transformations to reduce it to upper triangular form. As you say, you get something of the form

#  #  # | 0
0  #  # | 0
0  0  # | 0

However, the last # should be zero (or almost zero). Exactly what you get depends on the details of the transformations you do, but if I do it by hand I get

2  0  4 | 0
0  2  0 | 0
0  0  0 | 0

Now you do backsubstitution. In the first step, you solve the equation in the last row. However, this equation does not yield any information, so you can set x[2] (the last element of the vector x) to any value you want. If you set it to zero and continue the back-substitution with that value, you get the zero vector. If you set it to one (or any nonzero value), you get a nonzero vector. The idea behind Drew's answer is to replace the last row with 0 0 1 | 1 which sets x[2] to 1.

Round-off error means that the last #, which should be zero, is probably not quite zero but some small value like 1e-16. This can be ignored: just take it as zero and set x[2] to one.
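As a quick sanity check (a sketch with a hypothetical helper name, not production code): setting x[2] = 1 in the reduced system above gives x = (-2, 0, 1), and you can confirm A*x = lambda*x directly against the original A.

```c
#include <math.h>

/* Sketch: verify an eigenpair.  Returns the largest entry of
 * |A*x - lambda*x|, which is zero (up to round-off) exactly when
 * (lambda, x) is a genuine eigenpair of the 3x3 matrix A. */
static double eigen_residual(double A[3][3], double lambda, double x[3])
{
    int i, j;
    double worst = 0.0;

    for (i = 0; i < 3; ++i) {
        double ax = 0.0;
        for (j = 0; j < 3; ++j)
            ax += A[i][j] * x[j];      /* i-th entry of A*x */
        if (fabs(ax - lambda * x[i]) > worst)
            worst = fabs(ax - lambda * x[i]);
    }
    return worst;
}
```

For the matrix in this answer, lambda = 1 and x = (-2, 0, 1), the residual is exactly zero: row by row, A*x = (-2, 0, 1) = 1 * x.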

Obligatory warning: I assume you are implementing this for fun or educational purposes. If you need to find eigenvectors in serious code, you are better off using code written by others as this stuff is tricky to get right.

Jitse Niesen
  • Thanks for the detailed response! The matrix was simply an example. You could think of any Hermitian matrix which has linearly independent rows as an example. Perhaps the one I listed was not a good choice. – nick_name Oct 25 '11 at 13:25