
I am implementing the Jacobi algorithm to get the eigenvectors of a symmetric matrix. I don't understand why I get different eigenvectors from my application (the same result as this one: http://fptchlx02.tu-graz.ac.at/cgi-bin/access.com?c1=0000&c2=0000&c3=0000&file=0638) than from Wolfram Alpha: http://www.wolframalpha.com/input/?i=eigenvector%7B%7B1%2C2%2C3%7D%2C%7B2%2C2%2C1%7D%2C%7B3%2C1%2C1%7D%7D

Example matrix:

1 2 3
2 2 1 
3 1 1

My Result:

0.7400944496522529, 0.6305371413491765, 0.23384421945632447
-0.20230251371232585, 0.5403584533063043, -0.8167535949636785
-0.6413531776951003, 0.5571668060588798, 0.5274763043839444

Result from WA:

1.13168, 0.969831, 1 
-1.15396, 0.315431, 1 
0.443327, -1.54842, 1 

I expect the solution is trivial, but I can't find it. I asked this question on MathOverflow and they pointed me to this site.
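
For context, here is a minimal sketch (Python/NumPy, not my actual code) of the kind of cyclic Jacobi sweep I mean, with the rotations accumulated into the eigenvector matrix:

import numpy as np

def jacobi_eigen(a, tol=1e-12, max_sweeps=50):
    # Cyclic Jacobi: returns (eigenvalues, eigenvectors as columns).
    a = np.array(a, dtype=float)
    n = a.shape[0]
    v = np.eye(n)                              # accumulates the rotations
    for _ in range(max_sweeps):
        if np.sum(np.tril(a, -1) ** 2) < tol:  # off-diagonal part small enough?
            break
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(a[p, q]) < tol:
                    continue
                # rotation angle that zeroes a[p, q]
                theta = 0.5 * np.arctan2(2 * a[p, q], a[q, q] - a[p, p])
                c, s = np.cos(theta), np.sin(theta)
                j = np.eye(n)
                j[p, p] = j[q, q] = c
                j[p, q], j[q, p] = s, -s
                a = j.T @ a @ j                # similarity transform
                v = v @ j                      # accumulate eigenvectors
    return np.diag(a), v

w, vecs = jacobi_eigen([[1, 2, 3], [2, 2, 1], [3, 1, 1]])
print(vecs)                                    # eigenvectors are the COLUMNS of vecs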


1 Answer


Eigenvectors of a matrix are not unique, and there are multiple possible decompositions; in fact, only eigenspaces can be defined uniquely. Both results that you are receiving are valid. You can easily see that by asking Wolfram Alpha to orthogonalize the second matrix. Run the following query:

Orthogonalize[{{1.13168, 0.969831, 1.}, {-1.15396, 0.315431, 1.}, {0.443327, -1.54842, 1.}}]

to obtain

 0.630537    0.540358    0.557168
-0.740094    0.202306    0.641353
 0.233844   -0.816754    0.527475

Now you can see that your algorithm returns a correct result. First, the matrix is transposed: WA gave you row vectors, while your algorithm returns the eigenvectors as columns. Second, the eigenvectors appear in a different order, and one of them is multiplied by -1; both differences are harmless, since eigenvectors can be listed in any order and multiplied by any non-zero constant and still be valid eigenvectors. Otherwise, the results match perfectly.
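
If you want to double-check numerically, here is a small NumPy sketch (the matrix and vectors are just the ones quoted above, rounded) verifying that every column of your result and every row of WA's result satisfies A v = lambda v:

import numpy as np

A = np.array([[1., 2., 3.],
              [2., 2., 1.],
              [3., 1., 1.]])

# Jacobi result: eigenvectors are the columns of this matrix.
mine = np.array([[ 0.74009445,  0.63053714,  0.23384422],
                 [-0.20230251,  0.54035845, -0.81675359],
                 [-0.64135318,  0.55716681,  0.52747630]])

# Wolfram Alpha result: eigenvectors are the rows of this matrix.
wa = np.array([[ 1.13168,   0.969831,  1.],
               [-1.15396,   0.315431,  1.],
               [ 0.443327, -1.54842,   1.]])

def residual(v):
    # ||A v - lambda v||, with lambda estimated by the Rayleigh quotient.
    lam = (v @ A @ v) / (v @ v)
    return np.linalg.norm(A @ v - lam * v)

print([residual(c) for c in mine.T])  # columns of your matrix: all ~0
print([residual(r) for r in wa])      # rows of WA's matrix: ~0 up to the printed precision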

You may also find the following Mathematics StackExchange answer helpful: Are the eigenvectors of a real symmetric matrix always an orthonormal basis without change?
