
I am using the errorsarlm() function from the spdep package on a PC with a 64-bit OS and 7.8 GB of RAM; my administrator has allocated me a quota of 12 GB.

Input to my errorsarlm() is:

1) a non-symmetric spatial weights matrix (1.7 MB), read from a .gwt file and converted with the nb2listw function. It is a k = 4 nearest-neighbour matrix for 85684 regions (the number of observations):

Characteristics of weights list object:
Neighbour list object:
Number of regions: 85684
Number of nonzero links: 342736
Percentage nonzero weights: 0.004668316
Average number of links: 4
Non-symmetric neighbours list
Link number distribution:
    4
85684
Weights style: W
Weights constants summary:
      n         nn    S0       S1       S2
W 85684 7341747856 85684 34664.44 377277.6

2) a 246.3 MB CSV data set, from which I use 80 variables in the errorsarlm() model. Each variable has 85684 observations.

When I run a simple linear regression (lm) on the 80 variables, I have no problems. But when I run the errorsarlm() model with default arguments, I immediately get the following message:

'Error in matrix(0, nrow = n, ncol = n) : too many elements specified'

A traceback() tells me:

6: matrix(0, nrow = n, ncol = n)
5: listw2mat(listw)
4: eigenw(get("listw", envir = env))
3: eigen_setup(env, which = which)
2: jacobianSetup(method, env, con, pre_eig = con$pre_eig, trs = trs, 
       interval = interval)
1: errorsarlm(Lnp_N ~ Lnlivings_ + Yearofcon_ + Garden + Lnkirche_N + 
       Station_N + Bus_N + LnschuleA_ + Lnind200_N + Dem_N + Slope_N + 
       Aspect_N + Income_N + InhaHa_N + Lnlake_N + Lnriver_N + Riversize5 + 
       Riversize6 + Riversize7 + Riversize8 + Riversize9 + Riversize1 + 
       Greensp_N + Lnpark_N + Lnhighw_N + Lnbadi_N + LakeNat + LakeAlt_N + 
       LakeArea_N + Zh + Be + Lu + Ur + Sz + Ow + Nw + Gl + Zg + 
       Fr + So + Bs + Bl + Sh + Ar + Ai + Sg + Gr + Ag + Ti + Vd + 
       Vs + Ne + Ju + Tg + Q052 + Q053 + Q054 + Q061 + Q062 + Q063 + 
       Q064 + Q071 + Q072 + Q073 + Q074 + Q081 + Q082 + Q083 + Q084 + 
       Q091 + Q092 + Q093 + Q094 + Q101 + Q102 + Q103 + Q104, heddata, 
       w4n)

And a density check with sum(card()) on the nb object (which was converted from the .gwt file with read.gwt2nb) gives:

[1] 342736

Because the error appears immediately, memory usage doesn't even begin to increase. Is there a maximum matrix size that spdep allows, which I have exceeded? Or is there a different explanation?
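For context, here is a back-of-the-envelope check (my own sketch, not part of the original output) of what the traceback suggests: the default method = "eigen" calls listw2mat(), which tries to materialise the dense n × n weights matrix, and for n = 85684 the element count alone exceeds R's 2^31 - 1 limit on a single matrix (the limit that, as I understand it, triggers "too many elements specified"), quite apart from the memory such a matrix would need:

```r
n <- 85684

# Elements a dense n x n matrix would need
elements <- n^2

# R caps the elements of a single matrix at 2^31 - 1,
# which is presumably what matrix(0, nrow = n, ncol = n) runs into:
elements > .Machine$integer.max   # TRUE

# Even without that cap, the memory cost is prohibitive
# (8 bytes per double):
gb <- elements * 8 / 1024^3       # roughly 55 GB, far above 7.8 GB RAM or a 12 GB quota
```

This would explain why the error is immediate: allocation fails before any real computation starts.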

When I instead pass method = "LU" or method = "MC" to errorsarlm(), the model runs fine and returns results.
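To illustrate the workaround on a small reproducible example (my own sketch with made-up data, not the question's model; note that in recent package versions errorsarlm() lives in spatialreg rather than spdep), the sparse methods accept the same non-symmetric k-nearest-neighbour listw without ever forming the dense matrix:

```r
library(spdep)
library(spatialreg)  # errorsarlm() moved here in newer versions

set.seed(1)
n  <- 100
xy <- cbind(runif(n), runif(n))

# Non-symmetric k = 4 nearest-neighbour weights, as in the question
nb <- knn2nb(knearneigh(xy, k = 4))
lw <- nb2listw(nb, style = "W")

df <- data.frame(y = rnorm(n), x = rnorm(n))

# Sparse LU decomposition of the Jacobian instead of dense eigenvalues
fit <- errorsarlm(y ~ x, data = df, listw = lw, method = "LU")
summary(fit)
```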

Hopefully, someone can help me make some sense out of this.

thanks,

Diana

