
I am trying to grasp the basic concept of invertible and non-invertible matrices.

I created a random non-singular square matrix

S <- matrix(rnorm(100, 0, 1), ncol = 10, nrow = 10)

I believe this matrix is invertible (non-singular) because when I decompose the matrix S into its eigenvalues, their product, i.e. the determinant, is non-zero.

eig_S <- eigen(S)
eig_S$values
 [1]  3.0883683+0.000000i -2.0577317+1.558181i -2.0577317-1.558181i  1.6884120+1.353997i  1.6884120-1.353997i
 [6] -2.1295086+0.000000i  0.1805059+1.942696i  0.1805059-1.942696i -0.8874465+0.000000i  0.8528495+0.000000i
solve(S)
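
As a quick check of my own (not from the paper), the product of the eigenvalues can be compared with the determinant directly:

# The product of the eigenvalues equals the determinant (up to floating-point
# error); a non-zero determinant is what guarantees that solve(S) succeeds.
prod(eig_S$values)   # a complex number with a ~0 imaginary part
det(S)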

According to this paper, we can also compute the inverse of a non-singular matrix from its SVD, S = U D V^T (where, as I understand it, U and V are the eigenvectors and D the eigenvalues; please do correct me if I am wrong).

The inverse is then S^-1 = V D^-1 U^T.

Indeed, I can run the formula in R:

s <- svd(S)
s$v %*% solve(diag(s$d)) %*% t(s$u)

Which produces exactly the same result as solve(S).
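
To make the comparison explicit, here is a small check I added (not from the paper); note that solve(diag(s$d)) is simply diag(1/s$d), since the matrix is diagonal:

# Compare the SVD-based inverse with the one returned by solve()
inv_svd <- s$v %*% diag(1 / s$d) %*% t(s$u)
all.equal(inv_svd, solve(S))   # TRUE up to floating-point tolerance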

My first question is:

1) Do s$d indeed represent the eigenvalues of S? I ask because s$d and eig_S$values are quite different.
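
For comparison, a quick numerical check I ran myself (not taken from the paper):

# The singular values of S are the square roots of the eigenvalues of
# t(S) %*% S, which is not the same as the eigenvalues of S itself.
sqrt(eigen(t(S) %*% S)$values)
s$d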

Now the second part,

If I create a singular matrix

I <- matrix(rnorm(100, 0, 1), ncol = 5, nrow = 20)
I <- I %*% t(I)
eig_I <- eigen(I)
eig_I$values
 [1]  3.750029e+01  2.489995e+01  1.554184e+01  1.120580e+01  8.674039e+00  3.082593e-15  5.529794e-16  3.227684e-16
 [9]  2.834454e-16  5.876634e-17 -1.139421e-18 -2.304783e-17 -6.636508e-17 -7.309336e-17 -1.744084e-16 -2.561197e-16
[17] -3.075499e-16 -4.150320e-16 -7.164553e-16 -3.727682e-15
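
Only five of these eigenvalues are meaningfully different from zero; the rest are floating-point noise. A small check of my own (the 1e-10 cut-off is an arbitrary choice):

# Count the eigenvalues that are clearly non-zero
sum(abs(eig_I$values) > 1e-10)   # 5, the rank of the original 20 x 5 matrix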

The solve function will produce an error

solve(I)

system is computationally singular: reciprocal condition number = 1.61045e-19

So, again according to the same paper, we can use the SVD:

i <- svd(I)    
solve(i$u %*% diag(i$d) %*% t(i$v))

which produces the same error. Then I tried to use the Cholesky decomposition approach to matrix inversion, B^-1 = B^* (B B^*)^-1, where B^* is the conjugate transpose:

Conj(t(I)) %*% solve(I %*% Conj(t(I)))

and again I get the same error.

Could someone please explain where I am using the equations wrong?

I know that for the matrix I %*% Conj(t(I)) the determinant of the eigenvalue matrix is positive, but the matrix is not full rank because of the initial multiplication I did.

j <- eigen(I %*% Conj(t(I)))
det(diag(j$values))
[1] 3.17708e-196
qr(I %*% Conj(t(I)))$rank
[1] 5
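
For completeness, two more checks I added myself: the singular values tell the same story, and rcond() gives a reciprocal condition number of the same order as the one in the error message above (the 1e-10 cut-off is again an arbitrary choice):

# Only 5 of the 20 singular values are non-negligible, and the reciprocal
# condition number of I is effectively zero.
sum(i$d > 1e-10)   # 5
rcond(I)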

UPDATE 1: Following the comments below, and after going through the paper/Wikipedia page again, I used these two pieces of code. They produce some results, but I am not sure about their validity; the first one seems more believable. The SVD solution:

i$v %*% diag(1 / i$d) %*% t(i$u)

and the Cholesky one:

Conj(t(I)) %*% (I %*% Conj(t(I)))^(-1)

I am not sure if I interpreted the two sources correctly though.
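
For what it is worth, here is a sketch I put together (my own code, assuming that what both sources describe is effectively the Moore-Penrose pseudo-inverse) of how such an approximate inverse is usually built from the SVD, with the near-zero singular values dropped; MASS::ginv() does essentially the same thing. Note also that in R, `^(-1)` applied to a matrix takes element-wise reciprocals rather than a matrix inverse, so the Cholesky expression above does not compute the inverse appearing in the wiki identity.

# Sketch: Moore-Penrose pseudo-inverse of the singular matrix I via its SVD.
# Singular values below the tolerance are treated as exactly zero; the
# tolerance 1e-10 is an arbitrary choice for this illustration.
tol    <- 1e-10
d_inv  <- ifelse(i$d > tol, 1 / i$d, 0)
I_pinv <- i$v %*% diag(d_inv) %*% t(i$u)

# MASS::ginv() computes the same pseudo-inverse (with its own default tolerance)
library(MASS)
all.equal(I_pinv, ginv(I))

# The pseudo-inverse is not an exact inverse, but it satisfies
# I %*% I_pinv %*% I == I even though I itself is singular.
all.equal(I %*% I_pinv %*% I, I)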

Jespar
  • Eigenvalues are **NOT** singular values. You cannot invert a singular or badly conditioned matrix. Some of the eigenvalues of `I` you show are tiny, almost zero, and the value of `det(...)` you show is much too small. That indicates a singular or very badly conditioned matrix. You simply cannot use `solve` or a variant with your `I` matrix. – Bhas Nov 16 '17 at 18:52
  • I'm confused by your question. You say "if I create a singular matrix", which is by definition not invertible, and indeed none of your inversion methods work. Everything is working as expected, yes? – Gregor Thomas Nov 16 '17 at 18:55
  • You might be interested [in this stats.stackexchange question](https://stats.stackexchange.com/q/314046/7515). Among other things, it clarifies that the SVD singular values and the eigenvalues will be the same only if the matrix is *symmetric* as well as positive semi-definite. – Gregor Thomas Nov 16 '17 at 19:16
  • Bhas, thanks for the clarification about the eigenvalues and the singular values. @Gregor, I cannot say I agree with your statement. The first document I attached states: 'If A is singular or ill-conditioned, then we can use SVD to approximate its inverse.' Also, the wiki page states: 'A non-Hermitian matrix B can also be inverted using the following identity.' In both cases they say that there are ways to invert singular matrices, and I tried to apply their formulas. Now, if I follow the first paper to the letter, I can use i$v%*%diag(1/i$d)%*%t(i$u) instead of solve and it is done. – Jespar Nov 21 '17 at 13:48
  • You can *approximate* inverses to singular matrices, but exact inverses do not exist. I will be fascinated if you can show me a definition of singular matrix that doesn't include "not invertible". – Gregor Thomas Nov 21 '17 at 14:52
  • I never used the words exact inverse. I am not trying to be antagonistic here mate, I am trying to learn. And I was honest with you saying that I do not agree with your statement. The methods I described, by definition, are meant for approximation. Never claimed otherwise. – Jespar Nov 21 '17 at 14:57

0 Answers