I am executing the following code from https://gis.stackexchange.com/questions/357997/creating-multi-ring-buffers-around-points-in-a-single-layer-and-for-each-point-s.
library(sf)
library(dplyr)
library(ggplot2)
library(stringr)
library(rgdal)
library(lwgeom)
library(sp)
#library (bigmemory)
#library (biganalytics)
The Pointsdata has the following format:
Pointsdata <- data.frame(ID = paste0("ID_", 1:20), lon = runif(20, -20, 20),
                         lat = runif(20, -20, 20), pop_size = runif(20, 0, 2000))
> head(Pointsdata)
ID lon lat pop_size
1 ID_1 -10.051100 18.441917 675.3872
2 ID_2 5.394999 4.148627 1472.2674
3 ID_3 1.306370 15.794475 137.6444
4 ID_4 5.910318 -6.438676 1380.2332
5 ID_5 9.355191 -10.118736 513.3624
6 ID_6 -4.804669 -7.762536 212.9331
#load raw data
Mydata <- read.csv("Pointsdata.csv") # My actual data
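# (For a self-contained run, the simulated Pointsdata above could stand in for
#  the CSV, e.g. Mydata <- transform(Pointsdata, afpump = runif(20)); "afpump"
#  is a column of my real data that gets matched further below.)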
# Transform to sf and project to the required UTM zone (EPSG:26913, metres)
projdata <- st_as_sf(Mydata, coords = c("lon", "lat"), crs = 4326) %>% st_transform(26913)
bufferR <- c(402.336, 1609.34, 3218.69, 4828.03, 6437.38) # ring radii in metres (0.25, 1, 2, 3 and 4 miles)
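# (equivalently, the same radii could be derived from miles, since
#  EPSG:26913 works in metres and 1 mile = 1609.344 m:)
# bufferR <- c(0.25, 1, 2, 3, 4) * 1609.344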
#Create data of neighboring wells per buffer
dataout <- do.call("rbind", lapply(seq_along(bufferR), function(y) {
  bfr <- projdata %>% st_buffer(bufferR[y]) ## create buffer
  ## minus the next smaller buffer, leaving a ring
  if (y > 1) {
    inters <- suppressWarnings(st_difference(bfr, projdata %>% st_buffer(bufferR[y - 1])))
    bfr <- inters[which(inters$ID == inters$ID.1), ]
  }
  # get IDs of points that intersect with the ring
  inters <- bfr %>% st_intersects(projdata)
  do.call("rbind", lapply(which(sapply(inters, length) > 0),
    function(z) data.frame(orig = projdata[z, ]$ID, radius = bufferR[y],
                           incl = projdata[unlist(inters[z]), ]$ID,
                           afpump_mtchd = projdata[unlist(inters[z]), ]$afpump)))
}))
write.csv(dataout, "data.csv")
Everything runs fine when I run it on around 1,000 location points, but beyond 1,000 points R throws the following error:
Error in CPL_geos_op2(op, st_geometry(x), st_geometry(y)) : Evaluation error: std::bad_alloc.
By "experimenting" with different sizes of my data, it appears this as an issue of inadequate memory which is quite surprising because I think my 16GB RAM should be able to take care of this analysis.
After several days of googling this problem and reading, without success, about how similar problems were resolved on other platforms, I decided to raise it here, hoping I could get help with it.
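For what it's worth, one workaround I have been considering (an untested sketch of my own, not something I have verified at scale) is to run the same loop over chunks of the source points, so that only one chunk's ring polygons is held in memory at a time; the neighbour search still scans the full projdata, so the output should be identical up to row order:

n <- nrow(projdata)
chunk_size <- 500                                    # tune to available RAM
chunks <- split(seq_len(n), ceiling(seq_len(n) / chunk_size))

dataout <- do.call("rbind", lapply(chunks, function(idx) {
  sub <- projdata[idx, ]                             # this chunk's source points
  res <- do.call("rbind", lapply(seq_along(bufferR), function(y) {
    bfr <- sub %>% st_buffer(bufferR[y])             # outer buffer for the chunk
    if (y > 1) {                                     # subtract the next smaller buffer
      inters <- suppressWarnings(st_difference(bfr, sub %>% st_buffer(bufferR[y - 1])))
      bfr <- inters[which(inters$ID == inters$ID.1), ]
    }
    inters <- bfr %>% st_intersects(projdata)        # neighbours come from the full layer
    do.call("rbind", lapply(which(sapply(inters, length) > 0),
      function(z) data.frame(orig = bfr[z, ]$ID,     # the ring's own ID (bfr rows, not projdata rows)
                             radius = bufferR[y],
                             incl = projdata[unlist(inters[z]), ]$ID,
                             afpump_mtchd = projdata[unlist(inters[z]), ]$afpump)))
  }))
  gc()                                               # release this chunk's polygons
  res
}))

I am not sure this actually avoids the GEOS-level allocation failure, so other ideas are very welcome.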
My session info is:
R version 3.6.3 (2020-02-29)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 10 x64 (build 18362)
Matrix products: default
locale:
[1] LC_COLLATE=English_United States.1252
[2] LC_CTYPE=English_United States.1252
[3] LC_MONETARY=English_United States.1252
[4] LC_NUMERIC=C
[5] LC_TIME=English_United States.1252
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] lwgeom_0.2-1 rgdal_1.4-8 stringr_1.4.0 ggplot2_3.3.0 dplyr_0.8.5
[6] raster_3.0-12 sp_1.4-0 sf_0.9-0
loaded via a namespace (and not attached):
[1] Rcpp_1.0.4 pillar_1.4.3 compiler_3.6.3 class_7.3-15
[5] tools_3.6.3 lifecycle_0.2.0 tibble_3.0.0 gtable_0.3.0
[9] lattice_0.20-38 pkgconfig_2.0.3 rlang_0.4.5 DBI_1.0.0
[13] cli_2.0.2 e1071_1.7-3 withr_2.1.2 vctrs_0.2.4
[17] classInt_0.4-2 grid_3.6.3 tidyselect_0.2.5 glue_1.3.2
[21] R6_2.4.1 fansi_0.4.1 purrr_0.3.3 magrittr_1.5
[25] scales_1.0.0 codetools_0.2-16 ellipsis_0.3.0 units_0.6-6
[29] assertthat_0.2.1 colorspace_1.4-1 KernSmooth_2.23-16 stringi_1.4.6
[33] munsell_0.5.0 crayon_1.3.4
My memory limit (reported in MB) is:
> memory.limit()
[1] 16257
Can anyone help me with ideas on how to resolve this problem?