This is basically the same issue raised in this question https://gis.stackexchange.com/questions/95481/in-r-set-na-cells-in-one-raster-where-another-raster-has-values and in "R - Changing specific cell values in a large raster layer". The answers to the first question did not solve my problem, because I do not have another object that I can use with `overlay` or `calc`; `reclassify` is the only thing that works.
In my case, I have a raster file that is about 300 MB on disk. I am applying a simple operation: replacing all cells in the raster that equal a certain value with NA. I have about 4 GB of RAM available, but I cannot complete this operation because I get the error "cannot allocate vector of size 4.6 Gb". Even after raising my memory limit to 16 GB, I get the same error, now saying it cannot allocate a vector of size 9.2 Gb. I have tried the following two options:
    r[r == 5] <- NA
    values(r)[values(r) == 5] <- NA
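For reference, the approach that did work for me is `reclassify()`, which processes the raster block by block and streams the result to a file instead of holding everything in RAM. A minimal sketch (the file names here are hypothetical):

```r
library(raster)

r <- raster("big_raster.tif")            # hypothetical 300 MB input

# reclassify() works in chunks and writes each chunk to disk, so
# peak memory use stays bounded; the two-column matrix means
# "cells equal to 5 become NA"
r2 <- reclassify(r, cbind(5, NA),
                 filename = "big_raster_na.tif", overwrite = TRUE)
```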
Strangely, even a simple call such as `table(values(r))` gives the same error, while ArcMap, for example, can build that table in a few seconds. I have already worked around my problem, but I am wondering why memory is used so inefficiently here, and how that can be prevented or avoided. Why does `raster` need up to 9 GB to process a file that is 300 MB on disk? Is this a limitation of the package, or of R itself?
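A back-of-the-envelope check may explain the numbers (this is my own estimate, assuming the file stores integers compactly on disk): R holds numeric vectors as 8-byte doubles, so the in-memory size of the cell values is unrelated to the on-disk size, and indexed replacement materialises extra temporary copies:

```r
library(raster)

r <- raster("big_raster.tif")            # hypothetical file

# values(r) is a double vector, so holding it in memory needs
# roughly ncell(r) * 8 bytes, regardless of the (often compressed,
# often integer) size on disk
ncell(r) * 8 / 2^30                      # approximate size in GB

# r[r == 5] <- NA additionally materialises the logical vector
# r == 5 and a modified copy of the values, which roughly doubles
# the peak memory requirement
```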