Questions tagged [eigenvector]

The eigenvectors of a square matrix are the non-zero vectors that, after being multiplied by the matrix, remain parallel to the original vector. For each eigenvector, the corresponding eigenvalue is the factor by which the eigenvector is scaled when multiplied by the matrix. The prefix eigen- is adopted from the German word "eigen" for "own"[1] in the sense of a characteristic description. The eigenvectors are sometimes also called characteristic vectors. Similarly, the eigenvalues are also known as characteristic values.

Find more on this topic on Wikipedia.
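
As a quick, hedged illustration of the definition above, here is a minimal sketch (assuming NumPy; the 2x2 matrix is made up for demonstration) showing that multiplying an eigenvector by the matrix only rescales it by its eigenvalue:

```python
import numpy as np

# Small made-up symmetric matrix, just for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eig returns the eigenvalues w and a matrix V whose columns are eigenvectors.
w, V = np.linalg.eig(A)

# Multiplying each eigenvector by A keeps it parallel to itself:
# A @ v equals the corresponding eigenvalue times v.
for lam, v in zip(w, V.T):
    assert np.allclose(A @ v, lam * v)

print(w)   # the eigenvalues of this particular matrix are 3 and 1
```
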

633 questions
24
votes
1 answer

floating point error in Ruby matrix calculation

I'm writing some code that involves finding the eigenvectors of a given matrix, and was surprised that Ruby produces some unreasonable results in simple cases. For example, the following matrix has an eigenvector associated with eigenvalue 1: > m =…
Théophile
  • 1,032
  • 10
  • 16
20
votes
1 answer

Calculating eigenvalues of very large sparse matrices in Python

I have a very large sparse matrix which represents a transition matrix in a Markov chain, i.e. the sum of each row of the matrix equals one, and I'm interested in finding the first eigenvalue that is smaller than one, and its corresponding vector. I…
Zachi Shtain
  • 826
  • 1
  • 13
  • 31
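
For the question above, a minimal sketch of one way to get the leading eigenvalues of a large sparse row-stochastic matrix with SciPy's ARPACK wrapper (the transition matrix below is randomly generated, not the asker's data):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigs

n = 1000

# Made-up sparse non-negative matrix, normalised so every row sums to one
# (i.e. a row-stochastic Markov transition matrix, as in the question).
A = sp.random(n, n, density=0.01, format='csr', random_state=0) + sp.eye(n, format='csr')
P = sp.diags(1.0 / np.asarray(A.sum(axis=1)).ravel()) @ A

# The two eigenvalues of largest magnitude: for a stochastic matrix the top one
# is 1, and the next one is the leading eigenvalue that is smaller than one.
vals, vecs = eigs(P, k=2, which='LM')
idx = np.argsort(-np.abs(vals))
print(vals[idx])
```
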
20
votes
2 answers

C++ eigenvalue/vector decomposition, only need first n vectors fast

I have a ~3000x3000 covariance-like matrix on which I compute the eigenvalue-eigenvector decomposition (it's an OpenCV matrix, and I use cv::eigen() to get the job done). However, I actually only need, say, the first 30 eigenvalues/vectors; I don't…
MShekow
  • 1,526
  • 2
  • 14
  • 28
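
The question is about OpenCV in C++; as a hedged Python analogue of computing only the leading eigenpairs of a symmetric matrix, SciPy's eigh can restrict the work to an index range (subset_by_index needs SciPy >= 1.5; the covariance-like matrix below is synthetic):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 3000

# Synthetic symmetric positive semi-definite "covariance-like" matrix.
X = rng.standard_normal((n, 200))
C = (X @ X.T) / 200

# Ask only for the 30 largest eigenvalues and their eigenvectors.
# eigh sorts ascending, so request the top index range instead of all n pairs.
w, V = eigh(C, subset_by_index=[n - 30, n - 1])
print(w.shape, V.shape)   # (30,) and (3000, 30)
```
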
18
votes
8 answers

What is the fastest way to calculate first two principal components in R?

I am using princomp in R to perform PCA. My data matrix is huge (10K x 10K, with each value up to 4 decimal places). It takes ~3.5 hours and ~6.5 GB of physical memory on a Xeon 2.27 GHz processor. Since I only want the first two components, is…
384X21
  • 6,553
  • 3
  • 17
  • 17
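
The question asks about R's princomp; as a hedged Python analogue of the underlying idea, a rank-2 truncated SVD computes just the first two principal components without the full decomposition (the data matrix here is random and smaller than the 10K x 10K one in the question):

```python
import numpy as np
from scipy.sparse.linalg import svds

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 2000))   # stand-in for the huge data matrix

# Centre the columns, then compute only the two largest singular triplets.
Xc = X - X.mean(axis=0)
U, s, Vt = svds(Xc, k=2)                # far cheaper than a full PCA/SVD

scores = U * s                          # scores on the first two components
loadings = Vt.T                         # the corresponding loadings
print(scores.shape, loadings.shape)     # (2000, 2) and (2000, 2)
```
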
16
votes
1 answer

Eigenvectors computed with numpy's eigh and svd do not match

Consider the singular value decomposition M = USV*. Then the eigenvalue decomposition of M*M gives M*M = VS*U*USV* = V(S*S)V*. I wish to verify this equality with numpy by showing that the eigenvectors returned by the eigh function are the same as those…
matus
  • 275
  • 2
  • 9
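
For the question above, a minimal sketch of the check (assuming NumPy and a random matrix with distinct singular values; eigenvectors match only up to ordering and sign):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 4))

# SVD of M versus eigen-decomposition of M* M.
U, s, Vt = np.linalg.svd(M)
w, W = np.linalg.eigh(M.conj().T @ M)   # eigh returns ascending eigenvalues

# Reverse eigh's order to match the descending singular values, then compare:
# the eigenvalues of M*M are the squared singular values, and each eigenvector
# agrees with the corresponding right singular vector up to sign.
w, W = w[::-1], W[:, ::-1]
assert np.allclose(w, s**2)
for k in range(4):
    v_svd, v_eig = Vt[k], W[:, k]
    assert np.allclose(v_svd, v_eig) or np.allclose(v_svd, -v_eig)
```
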
15
votes
1 answer

What is the difference between 'eig' and 'eigs'?

I've searched a lot for this but I can't find any answer about how the two methods 'eig' and 'eigs' differ. What is the difference between the eigenvalues and eigenvectors received from them?
Gustav
  • 1,361
  • 1
  • 12
  • 24
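
The question is about Matlab; the closest Python pair makes the distinction concrete: a dense direct solver that returns every eigenpair versus an iterative solver that returns only a selected few (a small made-up matrix is used below):

```python
import numpy as np
from scipy.sparse.linalg import eigs

A = np.diag([5.0, 4.0, 3.0, 2.0, 1.0])

# Direct dense solver: all eigenvalues at once (Matlab's eig works this way).
all_vals = np.linalg.eig(A)[0]

# Iterative Krylov-subspace solver: only k eigenvalues, chosen by a criterion
# such as largest magnitude (Matlab's eigs and SciPy's eigs are of this kind).
few_vals = eigs(A, k=2, which='LM', return_eigenvectors=False)

print(np.sort(all_vals))   # [1. 2. 3. 4. 5.]
print(few_vals)            # approximately 4 and 5 (returned as complex numbers)
```
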
14
votes
3 answers

Finding smallest eigenvectors of large sparse matrix, over 100x slower in SciPy than in Octave

I am trying to compute a few (5-500) eigenvectors corresponding to the smallest eigenvalues of large symmetric square sparse matrices (up to 30000x30000) with less than 0.1% of the values being non-zero. I am currently using scipy.sparse.linalg.eigsh…
Spacekiller23
  • 141
  • 1
  • 5
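
For the question above, a hedged sketch of the usual remedy when eigsh is slow for the smallest eigenvalues: use shift-invert mode around sigma=0 instead of which='SM' (the sparse matrix below is a synthetic tridiagonal stand-in, not the asker's 30000x30000 matrix):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Synthetic sparse symmetric positive definite matrix (Laplacian-like stencil).
n = 10000
main = 4.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
A = sp.diags([off, main, off], [-1, 0, 1], format='csc')

# which='SM' tends to converge slowly; shift-invert around sigma=0 targets the
# eigenvalues nearest zero (here, the smallest ones) and is usually much faster.
vals, vecs = eigsh(A, k=5, sigma=0, which='LM')
print(np.sort(vals))
```
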
14
votes
2 answers

Does Matlab eig always return sorted values?

I use a function in Matlab: [V,D] = eig(C); I see that V and D are always sorted in ascending order. Is it always like that, or should I sort them after I get the V and D values?
kamaci
  • 72,915
  • 69
  • 228
  • 366
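
As a hedged note on the question above: for general matrices the ordering of eig output is not something to rely on, so it is safer to sort explicitly. A minimal NumPy analogue (the 3x3 matrix is made up):

```python
import numpy as np

C = np.array([[2.0, 0.0, 1.0],
              [0.0, 5.0, 0.0],
              [1.0, 0.0, 2.0]])

# Sort the eigenvalues explicitly and reorder the eigenvector columns to match,
# rather than assuming the solver returns them in any particular order.
w, V = np.linalg.eig(C)
idx = np.argsort(w)            # ascending; use np.argsort(-w) for descending
w, V = w[idx], V[:, idx]
print(w)                       # sorted: 1., 3., 5.
```
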
13
votes
3 answers

Can I use Lapack for calculating the eigenvalues and eigenvectors of large sparse matrices?

If I had a square matrix that is 1,000 by 1,000 could Lapack calculate the eigenvectors and eigenvalues for this matrix? And if it can how long would it take? Also what about for a 10,000 by 10,000 matrix or even a 1,000,000 by 1,000,000 matrix?…
Spencer
  • 245
  • 2
  • 5
  • 9
13
votes
3 answers

Eigenvector computation using OpenCV

I have this matrix A, representing similarities of pixel intensities of an image. For example: Consider a 10 x 10 image. Matrix A in this case would be of dimension 100 x 100, and element A(i,j) would have a value in the range 0 to 1, representing…
Arnkrishn
  • 29,828
  • 40
  • 114
  • 128
12
votes
2 answers

OpenCV/JavaCV face recognition - Very similar confidence values

I will explain what I am trying to do, as it seems to be relevant in order to understand my question. I am currently trying to do face recognition of people that step in front of a camera, based on known pictures in the database. These known…
Fábio Constantino
  • 197
  • 1
  • 1
  • 10
11
votes
1 answer

Quickly and efficiently calculating an eigenvector for known eigenvalue

Short version of my question: What would be the optimal way of calculating an eigenvector for a matrix A, if we already know the eigenvalue belonging to the eigenvector? Longer explanation: I have a large stochastic matrix A which, because it is…
5xum
  • 5,250
  • 8
  • 36
  • 56
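
For the question above, a hedged sketch of one standard technique, inverse iteration: with the eigenvalue lam already known, repeatedly solving (A - lam*I) x = b converges to the associated eigenvector. The small dense matrix and tolerances below are made up; for the asker's large sparse matrix one would swap in a sparse factorisation/solver:

```python
import numpy as np

def eigenvector_for_known_eigenvalue(A, lam, tol=1e-12, max_iter=50):
    """Inverse iteration: unit eigenvector of A for an already-known eigenvalue."""
    n = A.shape[0]
    # A tiny shift keeps (A - lam*I) from being exactly singular.
    B = A - (lam + 1e-10) * np.eye(n)
    x = np.random.default_rng(0).standard_normal(n)
    for _ in range(max_iter):
        x = np.linalg.solve(B, x)
        x /= np.linalg.norm(x)
        if np.linalg.norm(A @ x - lam * x) < tol:
            break
    return x

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = eigenvector_for_known_eigenvalue(A, 3.0)   # eigenvalue 3 known in advance
print(v, np.linalg.norm(A @ v - 3.0 * v))      # residual close to zero
```
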
11
votes
1 answer

Finding generalized eigenvectors numerically in Matlab

I have a matrix such as this example (my actual matrices can be much larger) A = [-1 -2 -0.5; 0 0.5 0; 0 0 -1]; that has only two linearly independent eigenvectors (the eigenvalue -1 is repeated). I would like to obtain a complete…
horchler
  • 18,384
  • 4
  • 37
  • 73
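
For the question above (Matlab offers jordan in the Symbolic Math Toolbox), a hedged Python analogue using SymPy's jordan_form on the very matrix from the excerpt: the columns of P include the ordinary eigenvectors plus the generalized eigenvector needed to complete the basis:

```python
import sympy as sp

# The 3x3 matrix from the question: eigenvalue -1 has algebraic multiplicity 2
# but only one ordinary eigenvector, so a generalized eigenvector is required.
A = sp.Matrix([[-1, -2, sp.Rational(-1, 2)],
               [ 0, sp.Rational(1, 2), 0],
               [ 0,  0, -1]])

# jordan_form returns P (columns = generalized eigenvectors) and the Jordan
# form J, with A = P * J * P**-1.
P, J = A.jordan_form()
sp.pprint(J)
print(P)
assert sp.simplify(P * J * P.inv() - A) == sp.zeros(3, 3)
```
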
10
votes
1 answer

How to use the princomp() function in R when the covariance matrix has zeros?

While using the princomp() function in R, the following error is encountered: "covariance matrix is not non-negative definite". I think this is due to some values being zero (actually close to zero, but they become zero during rounding) in the covariance…
384X21
  • 6,553
  • 3
  • 17
  • 17
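
The question is about R's princomp error; as a hedged illustration of what "not non-negative definite due to rounding" means, and of one common workaround (shown here as a Python sketch, not as the R answer), one can symmetrise the covariance matrix and clip its slightly negative eigenvalues to zero:

```python
import numpy as np

def clip_to_psd(C, eps=0.0):
    """Symmetrise C and clip tiny negative eigenvalues so it becomes
    positive semi-definite again (a common fix for rounding artefacts)."""
    C = (C + C.T) / 2.0                  # enforce exact symmetry
    w, V = np.linalg.eigh(C)
    w = np.clip(w, eps, None)            # zero out slightly negative eigenvalues
    return (V * w) @ V.T

# Made-up covariance whose rounding gives it a tiny negative eigenvalue.
C = np.array([[1.0, 1.0],
              [1.0, 1.0 - 1e-12]])
print(np.linalg.eigvalsh(C))               # one eigenvalue is slightly below zero
print(np.linalg.eigvalsh(clip_to_psd(C)))  # clipped away, up to machine precision
```
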
10
votes
1 answer

Eigenvalues in Python: A Bug?

Here are two assumptions about eigenvectors and eigenvalues of square matrices. I believe that both are true: If a matrix is symmetric and contains only real values, then it is a Hermitian matrix, so all of its eigenvalues should be real numbers and…
Hubert Schölnast
  • 8,341
  • 9
  • 39
  • 76
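
For the question above, a hedged note: the usual explanation for such surprises is that the general-purpose solver does not exploit symmetry, while the Hermitian solver guarantees real eigenvalues and orthonormal eigenvectors. A minimal NumPy sketch with a random symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.standard_normal((6, 6))
S = (S + S.T) / 2                  # real symmetric, hence Hermitian

# The general solver ignores the symmetry, so round-off can occasionally leak
# tiny imaginary parts or slightly non-orthogonal eigenvectors into the result.
w_general, V_general = np.linalg.eig(S)

# The symmetric/Hermitian solver always returns real eigenvalues (ascending)
# and an orthonormal set of eigenvectors.
w_sym, V_sym = np.linalg.eigh(S)

print(w_sym)                                      # real, sorted ascending
print(np.allclose(V_sym @ V_sym.T, np.eye(6)))    # True: orthonormal columns
```
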