I have this stars object (it could also be converted to a raster):
stars object with 2 dimensions and 2 attributes
attribute(s):
LST_mean elevation
Min. :14.98 Min. :296.0
1st Qu.:16.89 1st Qu.:346.9
Median :17.64 Median :389.3
Mean :17.52 Mean :389.2
3rd Qu.:18.18 3rd Qu.:428.3
Max. :20.11 Max. :521.6
dimension(s):
from to offset delta refsys point values
x 71 83 4387654 860.241 DHDN / 3-degree Gauss-Kru... FALSE NULL [x]
y 33 41 5598885 -860.241 DHDN / 3-degree Gauss-Kru... FALSE NULL [y]
It has 2 attributes (layers, in the raster case): temperature (LST_mean) and elevation. For each pixel, I would like to select the pixels that fall within a buffer around it and return their mean temperature, but only for those pixels whose elevation differs from that of the focal pixel by less than 90 meters.
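For reproducibility, here is a small mock object with the same layout (terra is used only as a stand-in for my real data; the values are random numbers drawn from the ranges in the summary above, and EPSG:31468 is just my guess at the CRS from the x offset):

library(terra)

set.seed(1)
# 13 x 9 two-layer raster mimicking the object printed above
r <- rast(nrows = 9, ncols = 13, nlyrs = 2,
          xmin = 4387654, xmax = 4387654 + 13 * 860.241,
          ymin = 5598885 - 9 * 860.241, ymax = 5598885,
          crs = "EPSG:31468")
names(r)  <- c("LST_mean", "elevation")
# random temperatures and elevations within the observed ranges
values(r) <- cbind(runif(ncell(r), 15, 20), runif(ncell(r), 296, 522))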
Any ideas on how to do this? Calculating the mean of the pixels that fall within the buffer is very easy (roughly as in the sketch below), but I couldn't find a way to impose any condition on which pixels enter the mean.
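For context, the unconditional version can be done along these lines (a sketch using the mock r from above; the 2000 m buffer radius is only a placeholder):

# cell centres as points, carrying both LST_mean and elevation
pts <- as.points(r)
# one buffer polygon around every cell centre
buf <- buffer(pts, width = 2000)
# values of all cells covered by each buffer: columns ID, LST_mean, elevation
vals <- extract(r, buf)
plain_mean <- tapply(vals$LST_mean, vals$ID, mean, na.rm = TRUE)
# what I am missing: dropping, within each buffer, the cells where
# abs(elevation - elevation of the focal cell) >= 90 before taking the mean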
I will be immensely grateful for your help and suggestions. Approaches using packages other than stars are also very welcome :)