
I'm trying to compute the covariance matrix of a very large image data matrix. I have tried both

cov(data)

and

data %*% t(data) / (nrow(t(data)) - 1)

and ended up with a matrix of NaN values, which makes no sense to me. The resulting matrix has the correct dimensions for a covariance matrix, but every value is NaN. If I try

cov(data)

and

t(data) %*% data / (nrow(data) - 1)

I get an error message saying:

Error: cannot allocate vector of size ...
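For reference, here is a toy check (hypothetical small matrix, not my image data) confirming that the manual crossproduct formula only reproduces `cov()` when the columns are centered first:

```r
# Toy example: the crossproduct formula matches cov() only after
# subtracting the column means from the data.
set.seed(1)
data <- matrix(rnorm(20), nrow = 5)                    # 5 observations, 4 variables
centered <- scale(data, center = TRUE, scale = FALSE)  # subtract column means
manual <- t(centered) %*% centered / (nrow(data) - 1)
all.equal(manual, cov(data), check.attributes = FALSE) # TRUE
```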

I have also tried using bigcor(), but I get this error every time:

Error in if (length < 0 || length > .Machine$integer.max) stop("length must be between 0 and .Machine$integer.max") :
  missing value where TRUE/FALSE needed
In addition: Warning message:
In ff(vmode = "double", dim = c(NCOL, NCOL)) :
  NAs introduced by coercion to integer range

Any idea what could be causing this and how to fix it?

I'm following this tutorial:

https://rpubs.com/dherrero12/543854

  • The first error "cannot allocate vector of size" means that insufficient memory is available. The second error means that a calculation has exceeded the maximum integer size on your machine. So basically, the matrix is too large to be processed on your computer. – neilfws Nov 28 '22 at 22:53
  • Is there a way to compute such matrices in R? I have done this already in Matlab with even larger image files. – darzan Nov 28 '22 at 22:55
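One workaround I am considering (a sketch only, assuming the data matrix itself fits in memory and only the full crossproduct temporary is too large): compute the centered crossproduct in column blocks with `crossprod()`, filling a preallocated result matrix so that only one pair of blocks is held at a time. The function name `block_cov` and the `block_size` parameter are made up for illustration:

```r
# Sketch: blockwise covariance of the columns of `data`.
# Assumes `data` fits in RAM; avoids allocating one huge intermediate.
block_cov <- function(data, block_size = 1000L) {
  n <- nrow(data)
  p <- ncol(data)
  mu <- colMeans(data)
  out <- matrix(0, p, p)
  starts <- seq(1L, p, by = block_size)
  for (i in starts) {
    ii <- i:min(i + block_size - 1L, p)
    xi <- sweep(data[, ii, drop = FALSE], 2L, mu[ii])  # center block i
    for (j in starts) {
      jj <- j:min(j + block_size - 1L, p)
      xj <- sweep(data[, jj, drop = FALSE], 2L, mu[jj])  # center block j
      out[ii, jj] <- crossprod(xi, xj) / (n - 1)         # t(xi) %*% xj
    }
  }
  out
}
```

On a small matrix this agrees with `cov()`, but I have not tested it at image scale.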
