
I'm new to spatial statistics, and I'm trying to create a spatial weight matrix for all Census tracts in the US in R. There are around 74000 tracts.

Starting from US Census TIGER files, I created a shapefile of all tracts and then ran (using the spdep package):

# Create the neighbours (adjacency) list
am <- poly2nb(us)
is.symmetric.nb(am)

This works fine, though am is pretty large.

Next:

am <- nb2mat(am, style = "B", zero.policy = TRUE)

Which gives me this error:

Error: cannot allocate vector of size 40.9 Gb

Obviously my laptop cannot allocate 40.9 GB of memory. I tried doing this on AWS EC2, but to get that much memory I'd need a very large instance type, which I'd like to avoid: I'm totally new to cloud computing and would rather play in the free t2.micro sandbox (up to 1 GiB of memory) until I'm ready to spend some cash on a bigger machine. If I could build the weight matrix as a sparse matrix I think I could handle it, but I don't know how to do that. I tried something like this:

Wmat <- Matrix(nb2mat(am, style = "B", zero.policy = TRUE), sparse = TRUE)

But nb2mat still has to build the full dense matrix before Matrix() can convert it, so it needs all 40.9 GB anyway.
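To illustrate what I'm after: on a toy neighbour list, the binary matrix can be assembled straight from the (i, j) neighbour index pairs with Matrix::sparseMatrix, never touching a dense n-by-n matrix. I just don't know how to wire this up with spdep's nb/listw classes. A sketch, where `nb` is a hypothetical stand-in for the real poly2nb output:

```r
library(Matrix)

# Toy stand-in for the real poly2nb() output:
# region 1 borders 2 and 3, region 2 borders 1 and 3, etc.
nb <- list(c(2L, 3L), c(1L, 3L), c(1L, 2L))

# One (row, col) pair per neighbour relation
i <- rep(seq_along(nb), lengths(nb))
j <- unlist(nb)

# Binary ("B"-style) weights stored sparsely: storage scales with the
# number of neighbour pairs, not with n^2
W <- sparseMatrix(i = i, j = j, x = 1, dims = c(length(nb), length(nb)))
```

For ~74,000 tracts with a handful of neighbours each, that is a few hundred thousand nonzero entries instead of 74,000^2 cells.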

Any solutions?

rafa.pereira
  • See the "Large memory and out-of-memory data" section at http://cran.r-project.org/web/views/HighPerformanceComputing.html. Maybe [bigmemory](http://cran.r-project.org/web/packages/bigmemory/index.html) is what you're looking for. – alexforrence Mar 11 '15 at 14:46
  • According to bigmemory's documentation, it's limited to the available RAM on the computer, which in my case is 8 GB. I know there are other packages out there - ff, BufferedMatrix- that may work. I'll give those a shot. But I was hoping someone with expertise with spatial matrices, spdep, geostats, etc. would have another solution. – robin.datadrivers Mar 11 '15 at 16:46

1 Answer


Sure, it's a bit late, but I think I just figured out a solution. I had a similar situation with a 71k x 71k matrix.

I reworked spdep's nb2mat function to use big.matrix from the bigmemory package. Two functions need to be redefined:

    # Reworked nb2mat: same as spdep's, except the result is built by
    # my_listw2mat below (a big.matrix instead of a dense matrix)
    my_nb2mat <- function(neighbours, glist = NULL, style = "W",
                          zero.policy = NULL) {
      if (is.null(zero.policy))
        # exported spdep accessor; the internal .spdepOptions
        # environment is not visible outside the package
        zero.policy <- get.ZeroPolicyOption()
      stopifnot(is.logical(zero.policy))
      if (!inherits(neighbours, "nb"))
        stop("Not a neighbours list")
      listw <- nb2listw(neighbours, glist = glist, style = style,
                        zero.policy = zero.policy)
      res <- my_listw2mat(listw)
      attr(res, "call") <- match.call()
      res
    }

    # Reworked listw2mat: fills a bigmemory::big.matrix instead of
    # allocating an n x n dense matrix
    my_listw2mat <- function(listw) {
      require(bigmemory)
      n <- length(listw$neighbours)
      if (n < 1)
        stop("non-positive number of entities")
      cardnb <- card(listw$neighbours)
      if (any(is.na(unlist(listw$weights))))
        stop("NAs in general weights list")
      # res <- matrix(0, nrow = n, ncol = n)
      # init = 0 mirrors matrix(0, ...); init = NULL would leave the
      # cells uninitialised
      res <- big.matrix(n, n, type = "double", init = 0)
      options(bigmemory.allow.dimnames = TRUE)
      for (i in 1:n) if (cardnb[i] > 0)
        res[i, listw$neighbours[[i]]] <- listw$weights[[i]]
      if (!is.null(attr(listw, "region.id")))
        rownames(res) <- attr(listw, "region.id")
      res
    }

Then call the new my_nb2mat function on the neighbours list (out is the nb object returned by poly2nb; am in the question):

    a <- my_nb2mat(neighbours = out, style = "W", zero.policy = FALSE)

Note: the bigmemory package only seems to work under R 2.15.3 for me.
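As a follow-up: a plain big.matrix still lives in (shared) memory, so if even that is too large, bigmemory can back the matrix with a file on disk via filebacked.big.matrix. A minimal sketch; the 1000 x 1000 size and the file names are illustrative only, not from the original code:

```r
library(bigmemory)

# Hypothetical small stand-in for the 71k x 71k weights matrix; the
# cells live in W.bin on disk, so size is bounded by disk, not RAM
W <- filebacked.big.matrix(1000, 1000, type = "double", init = 0,
                           backingfile = "W.bin",
                           descriptorfile = "W.desc")
W[1, 2] <- 1  # assignments write through to the backing file
```

The same matrix can be re-attached later (even in a fresh session) with attach.big.matrix("W.desc").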

mmann1123