
I am currently using R code to calculate insolation (incoming solar radiation), but I have not been able to process rasters larger than about 100 MB. The code works on a small raster elevation model, but I have not been able to use a larger raster (Colombia DEM, size: 8.5 GB). I am working on a 64-bit Windows 10 platform. I read that the ff package increases the memory a process can handle, but not how to implement it in my code. This is the script, which runs on small rasters that do not exceed the capacity of my RAM (8 GB):


require(insol)
require(rgdal)
require(raster)
require(ff)  ## unsure how to implement this

setwd('C:/proyecto/RG/GDEM') 
img = 'MDT_COLGDEM_COL.tif'
dem = raster(img)

cgr = cgrad(dem) ## Here the problem occurs
demm = raster::as.matrix(dem)
dl = res(dem)[1]
height = cellStats(dem, 'mean')
visibility=30
RH=80
tempK=286.15
tmz=0
year = 2013
month=3
day=21
timeh=12
jd=JDymd(year,month,day,hour=timeh)
Iglobal=array(0,dim=dim(demm))
deltat=0.5

coor <- coordinates(dem)
coorlong <- coor [,1]
coorlat <- coor [,2]

long = mean(coorlong)
lat = mean(coorlat)   

dayl=daylength(lat,long,jd,0)

for (srs in seq(dayl[1],dayl[2],deltat)){
  jd=JDymd(year,month,day,hour=srs)
  sv=sunvector(jd,lat,long,tmz)
  hsh=hillshading(cgr,sv)
  sh=doshade(demm,sv,dl)
  zenith=sunpos(sv)[2]
  Idirdif = insolation(zenith,jd,height,visibility,RH,tempK,0.002,0.15)
  ## direct radiation modified by terrain + diffuse irradiation (sky view factor ignored)
  ## values in J/m^2
  Iglobal = Iglobal + (Idirdif[,1] * hsh + Idirdif[,2] )*3600*deltat}

## rasterize to plot nicely
Iglobal=raster(Iglobal,crs=projection(dem))
extent(Iglobal) = extent(dem)

writeRaster(Iglobal, 'Colombia.tif', format = 'GTiff')

The code works for rasters that do not exceed the available memory.

This is the error that appears:

cgr = ff(cgrad(dem))
Error: cannot allocate vector of size 8.3 Gb
In addition: Warning messages:
1: In array(dim = as.integer(c(rev(output.dim), length(band)))) :
  Reached total allocation of 7364Mb: see help(memory.size)
2: In array(dim = as.integer(c(rev(output.dim), length(band)))) :
  Reached total allocation of 7364Mb: see help(memory.size)
3: In array(dim = as.integer(c(rev(output.dim), length(band)))) :
  Reached total allocation of 7364Mb: see help(memory.size)
4: In array(dim = as.integer(c(rev(output.dim), length(band)))) :
  Reached total allocation of 7364Mb: see help(memory.size)

Sorry for my English, my level is very basic.
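For reference, a minimal sketch of how a file-backed ff array of the size that cgrad() produces could be created (the file name is hypothetical; note that wrapping cgrad(dem) in ff() does not avoid the allocation error, because cgrad() still builds its full result in RAM before ff() sees it, so the array would have to be filled tile by tile instead):

library(ff)
library(raster)

dem <- raster('MDT_COLGDEM_COL.tif')

## file-backed 3D array: rows x cols x 3 normal-vector components,
## matching the shape of cgrad()'s output, stored on disk rather than in RAM
cgr_ff <- ff(vmode = "double",
             dim = c(nrow(dem), ncol(dem), 3),
             filename = "cgrad_normals.ffdata")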

  • When **raster** package solutions don't scale well, I tend to turn to **gdalUtils**. It's a thin wrapper around **GDAL**, and often provides huge speedups in raster processing. For your application, perhaps first have a look at `gdalUtils::gdaldem()`. (Oscar Perpinan's [**solaR**](https://oscarperpinan.github.io/solar/) package might also be of interest, although I can't attest to how well it scales to large rasters.) – Josh O'Brien Sep 14 '16 at 15:00
  • Thank you very much @JoshO'Brien for your contribution. The plot runs and works well; the problems appear when I exceed the memory my computer can provide. I read under other tags that the ff and bigmemory packages can handle these problems, and my confusion lies in how to implement them on the line where the error about the vector exceeding the RAM appears. – Guillermo Padilla Sep 14 '16 at 15:29
  • Hi Guillermo. [`gdaldem`](http://www.gdal.org/gdaldem.html) can be used to compute both slope and aspect, from which you should be able to compute a unit normal vector. I won't work out the code for you, but that is what I'd try. (Perhaps as a first step, confirm that computing slope and aspect using `gdalUtils::gdaldem()` is in fact fast enough.) Best of luck! – Josh O'Brien Sep 14 '16 at 16:37
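Following up on the gdaldem suggestion above, a minimal sketch of computing slope and aspect on disk with gdalUtils (output file names are hypothetical; the normal-vector convention should be checked against insol::cgrad before use):

library(gdalUtils)

## GDAL computes slope and aspect directly on disk, so the 8.5 GB DEM
## never has to be loaded into R's memory
gdaldem(mode = "slope",  input_dem = "MDT_COLGDEM_COL.tif",
        output = "colombia_slope.tif",  compute_edges = TRUE)
gdaldem(mode = "aspect", input_dem = "MDT_COLGDEM_COL.tif",
        output = "colombia_aspect.tif", compute_edges = TRUE)

## gdaldem reports slope and aspect in degrees; after converting to radians,
## a per-cell unit normal could be derived as
## (sin(slope)*sin(aspect), sin(slope)*cos(aspect), cos(slope)),
## checking the sign/axis convention against insol::cgrad before
## passing it to hillshading()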

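Independently of ff or bigmemory, the raster package can also read and write large files in row blocks; a generic sketch of that pattern follows (the output file name and the pass-through computation are placeholders, and note that cgrad() and doshade() need neighbouring cells or the whole grid, so a real solution would need overlapping blocks or tiles):

library(raster)

r_in  <- raster('MDT_COLGDEM_COL.tif')
r_out <- raster(r_in)                    ## empty raster with the same geometry

bs    <- blockSize(r_in)                 ## row blocks sized to fit in memory
r_out <- writeStart(r_out, 'colombia_blockwise.tif', overwrite = TRUE)

for (i in seq_len(bs$n)) {
  v <- getValues(r_in, row = bs$row[i], nrows = bs$nrows[i])
  ## per-cell computation on v would go here (placeholder: pass values through)
  r_out <- writeValues(r_out, v, bs$row[i])
}
r_out <- writeStop(r_out)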
0 Answers