
I have a problem with the eigenvectors that Julia gives me for a matrix of this type:

[-3.454373366796186+1.0*im -0.25350955594231006 0.08482455312233446 0.5677952929872186 0.8512642461184345 -3.3973836853171955
 -0.25350955594231006 -4.188304472566067 -0.7536261600953561 -0.2208291476393107 -0.9576102121737481 0.7295909738153196
 0.08482455312233446 -0.7536261600953561 -4.145281297093087 0.40094370842599164 -0.3177721876030173 -1.1267847565490017
 0.5677952929872186 -0.2208291476393107 0.40094370842599164 -2.561932209885087 0.40874651002530255 -0.5972057181377701
 0.8512642461184345 -0.9576102121737481 -0.3177721876030173 0.40874651002530255 -4.22394564475772 -0.6957268391716376
 -3.3973836853171955 0.7295909738153196 -1.1267847565490017 -0.5972057181377701 -0.6957268391716376 -3.4158987954939084+1.0*im]

(The matrix would be Hermitian except for the elements (1,1) and (6,6), which have an added imaginary part.)

Its eigenvectors are as follows. Real part:

[[-0.60946085  0.66877065 -0.10826958 -0.253947    0.30520429  0.02194697]
 [ 0.20102357 -0.07276538  0.60248336 -0.07765244  0.71609468 -0.24683536]
 [-0.18741272  0.21271718  0.48641162  0.11191183 -0.52801356 -0.62029698]
 [-0.26210071 -0.0094668  -0.07383844  0.91999668  0.22550855 -0.0102918 ]
 [-0.23182113 -0.02787858  0.61634939  0.03726956 -0.20443225  0.72225431]
 [ 0.64708605  0.70447722  0.04021026  0.22014373 -0.06068686  0.16822489]]

Imaginary part:

[[ 0.00680416  0.01172969  0.0036139  -0.00816376  0.02468384 -0.05604585]
 [ 0.04974942  0.00719276 -0.01608118  0.09895638  0.         -0.01326765]
 [-0.04007749 -0.06932898  0.01283773 -0.06201991 -0.01329243  0.00324368]
 [-0.07372251  0.00715689  0.0038056   0.         -0.09608138  0.01970827]
 [-0.04798741 -0.00062382  0.         -0.07323346  0.03896021  0.        ]
 [ 0.          0.          0.03589898  0.04052119 -0.08599638 -0.00702559]]

Obviously there's a dependence on the imaginary part, otherwise the zeros in every imaginary part of the eigenvectors would not appear. I know this in part because I did the same calculation in Mathematica and it doesn't give me zeros.

How do I get rid of this behaviour?

user2820579
  • Possible duplicate of [Could we get different solutions for eigenVectors from a matrix?](http://stackoverflow.com/questions/13041178/could-we-get-different-solutions-for-eigenvectors-from-a-matrix) – Colin T Bowers Jan 29 '16 at 01:46

2 Answers


By way of extending Colin's exploration (and my comments on it), here is a function which might help transform the results from Julia/Matlab into the Mathematica results:

using LinearAlgebra  # needed on current Julia; Diagonal lived in Base in the 0.4-era versions this answer dates from
matlab2mathematica(m) = m / Diagonal(vec(m[end, :]))

It simply uses the freedom to rescale an eigenvector by any nonzero (complex) factor and still span the same space: each column is divided by its own last entry, so the last row becomes all ones.
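
As a quick check, here is a minimal, self-contained sketch (not part of the original answer; the small 2x2 matrix is made up and only mimics the structure in the question, Hermitian except for a complex diagonal entry). Rescaling the columns this way does not stop them from being eigenvectors:

using LinearAlgebra

# small stand-in matrix: Hermitian except for the complex (1,1) entry
A = [2.0+1.0im  1.0im
     -1.0im     3.0]
F = eigen(A)                       # F.values, F.vectors (columns are eigenvectors)

V = matlab2mathematica(F.vectors)  # using the function defined above
V[end, :]                          # last row is now all ones
A * V ≈ V * Diagonal(F.values)     # true: the rescaled columns are still eigenvectors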

On the matrix in the OP this gives:

# m2 is the (complex) eigenvector matrix from the OP, i.e. real part + im * imaginary part above

real(matlab2mathematica(m2)) = 

6x6 Array{Float64,2}:
 -0.941854   0.949315   -1.45368   -1.12235   -1.86352    0.144124 
  0.31066   -0.10329     8.13901   -0.261148  -3.92277   -1.46145  
 -0.289626   0.30195     6.89       0.441542   2.99565   -3.68169  
 -0.405048  -0.0134381  -0.974822   4.04212   -0.489495  -0.0659565
 -0.358254  -0.0395734   8.52958    0.104523   0.817448   4.28591  
  1.0        1.0         1.0        1.0        1.0        1.0      

imag(matlab2mathematica(m2)) = 

6x6 Array{Float64,2}:
 -0.0105151  -0.0166502    -1.38769      -0.169504  -2.23397       0.327141
 -0.0768822  -0.0102101     7.66628      -0.497577  -5.55877       0.139903
  0.0619353   0.098412      5.832         0.362998   4.02595       0.134477
  0.11393    -0.0101592    -0.964946      0.744021  -2.27687      -0.1144  
  0.0741592   0.000885503   7.61505       0.351901   1.80035      -0.178993
 -0.0        -0.0          -5.55112e-17  -0.0        5.55112e-17  -0.0     

This is probably what Mathematica gives. Is it?

Dan Getz

UPDATE: Given the lack of clarification from the OP, I'm going to mark this question as an exact duplicate and vote to close.

You state: "obviously there's a dependence on the imaginary part, otherwise the zeros in every imaginary part of the eigenvectors would not appear."

I'm not sure what that means.

However, all the numbers you provide in the question look normal and correct to me, i.e. typical behaviour.

Remember that eigenvectors are unique only up to an orthogonal transformation, so any piece of software needs to choose a rule for how to scale the output of an eigenvector function. Mathematica uses a different rule to most other pieces of software, and this has confused many users in the past. For example, if you have Matlab, you'll notice that it provides exactly the output you describe in the question. So Julia behaves like Matlab in this instance, and not Mathematica.
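
To make that scaling freedom concrete, here is a minimal sketch (the 2x2 matrix and the factor 0.3 - 1.2im are made up for illustration, not taken from the question): multiplying an eigenvector by any nonzero complex number gives an equally valid eigenvector, which is all that separates one program's output from another's.

using LinearAlgebra

A = [2.0+1.0im  0.5
     0.5        3.0]
λ, V = eigen(A)            # eigenvalues and eigenvectors (columns)
v = V[:, 1]                # the scaling Julia (via LAPACK) happens to return

w = (0.3 - 1.2im) * v      # an arbitrary nonzero complex multiple
A * v ≈ λ[1] * v           # true
A * w ≈ λ[1] * w           # also true: w is just as much an eigenvector as v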

Come to think of it, I've answered this question before in relation to Matlab/Mathematica. See here. I think this question is a duplicate, but I might wait for a response from you before marking it as such. It is possible I have misunderstood what you want.

Colin T Bowers
  • ...unique up to a dilation transformation (multiplication by a diagonal matrix). – Dan Getz Oct 29 '15 at 06:20
  • the dilation can be by a complex factor, always allowing the imaginary part of at least one entry of each vector (column) to be zero – Dan Getz Oct 29 '15 at 06:21
  • and when several eigenvalues are the same (a.k.a have higher multiplicity) then there is more freedom in choosing eigenvectors (orthogonal transformations on the subspace for each eigenvalue and such) – Dan Getz Oct 29 '15 at 06:23
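
To illustrate that last point, a minimal sketch (the 3x3 matrix is made up): when an eigenvalue is repeated, any linear combination of eigenvectors within its eigenspace is again an eigenvector, so different programs can legitimately return different bases for that subspace.

using LinearAlgebra

A = [2.0 0.0 0.0
     0.0 2.0 0.0
     0.0 0.0 5.0]                    # eigenvalue 2 has multiplicity 2
λ, V = eigen(A)                      # λ == [2.0, 2.0, 5.0]
u = (V[:, 1] + V[:, 2]) / sqrt(2)    # a different unit vector in the λ = 2 eigenspace
A * u ≈ 2.0 * u                      # true: u is also a perfectly good eigenvector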