I'm new to spatial statistics, and I'm trying to build a spatial weight matrix in R for all Census tracts in the US (around 74,000 of them).
Based on the US Census TIGER files, I created a shapefile of all tracts and then ran the following with the spdep package:
# Build the neighbours list (contiguity structure) from the tract polygons
am <- poly2nb(us)
is.symmetric.nb(am)
This works fine, though am is pretty large.
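(To put a number on "pretty large", something like this should report the footprint of the nb object itself:)

# Report the memory used by the neighbours list in megabytes
print(object.size(am), units = "Mb")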
Next:
am <- nb2mat(am, style = "B", zero.policy = TRUE)
This gives me the following error:
Error: cannot allocate vector of size 40.9 Gb
Obviously my laptop cannot handle 40.9 GB of memory, and the figure checks out: a dense 74,000 × 74,000 matrix of doubles takes 74,000² × 8 bytes ≈ 41 GB. I tried running this on AWS EC2, but getting that much memory would mean renting a very large instance, which I'd like to avoid: I'm totally new to cloud computing and would rather play in the free t2.micro sandbox (which tops out at 1 GiB of memory) until I'm ready to spend some cash on a bigger machine.

Since a binary contiguity matrix is almost entirely zeros, I think a sparse matrix would fit in memory easily, but I don't know how to build one. I tried something like this:
Wmat <- Matrix(nb2mat(am, style = "B", zero.policy = TRUE), sparse = TRUE)
But that still has to materialize the full dense matrix inside nb2mat before the sparse conversion happens, so it fails at the same point.
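What I think I need is a way to build the sparse matrix directly from the nb object and skip nb2mat entirely. This untested sketch is the kind of thing I have in mind (I'm assuming spdep::card() for the neighbour counts, and the nb convention that a tract with no neighbours is stored as the single integer 0L):

library(spdep)
library(Matrix)

n <- length(am)
k <- card(am)                   # neighbours per tract; 0 for isolates

# Repeat each tract id once per neighbour to get the row indices;
# isolates contribute nothing, consistent with zero.policy = TRUE
i <- rep(seq_len(n), times = k)

# Flatten the neighbour ids for the column indices; an isolate appears
# as a lone 0L in the nb list, so drop the zeros to match i in length
nbrs <- unlist(am)
j <- nbrs[nbrs != 0L]

# Binary ("B"-style) weight matrix in sparse form: only the few hundred
# thousand nonzero entries get stored, not all 74,000^2 cells
Wmat <- sparseMatrix(i = i, j = j, x = 1, dims = c(n, n))

If spdep or Matrix already has a built-in coercion from an nb or listw object straight to a sparse class, that would obviously be cleaner, but I couldn't find one.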
Any solutions?