I have a 4D xarray Dataset. I want to carry out a linear regression between two variables along a specific dimension (here, time) and keep the regression parameters in a 3D array (the remaining dimensions). I managed to get the results I want with the following serial code, but it is rather slow:
# add empty arrays to store the results of the regression
res_shape = tuple(v for k, v in ds[x].sizes.items() if k != 'year')
res_dims = tuple(k for k, v in ds[x].sizes.items() if k != 'year')
ds[sl] = (res_dims, np.empty(res_shape, dtype='float32'))
ds[inter] = (res_dims, np.empty(res_shape, dtype='float32'))
# iterate over the kept dimensions
for lat in ds.coords['latitude']:
    for lon in ds.coords['longitude']:
        for duration in ds.coords['duration']:
            locator = {'longitude': lon, 'latitude': lat, 'duration': duration}
            sel = ds.loc[locator]
            res = scipy.stats.linregress(sel[x], sel[y])
            ds[sl].loc[locator] = res.slope
            ds[inter].loc[locator] = res.intercept
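For what it's worth, the per-point loop can be removed entirely with the closed-form least-squares formulas (slope = cov(x, y) / var(x)), broadcast over all kept dimensions at once. A minimal sketch on synthetic numpy arrays (the shapes and the 2x + 1 relationship are made up for illustration):

```python
import numpy as np

# synthetic data standing in for the two variables: axis 0 plays the
# role of the 'year' dimension, the other three axes are kept
rng = np.random.default_rng(0)
x = rng.random((10, 3, 4, 2))
y = 2.0 * x + 1.0 + 0.01 * rng.random((10, 3, 4, 2))

# closed-form simple linear regression along axis 0, vectorized over
# the remaining dimensions: slope = cov(x, y) / var(x)
x_mean = x.mean(axis=0)
y_mean = y.mean(axis=0)
slope = ((x - x_mean) * (y - y_mean)).sum(axis=0) / ((x - x_mean) ** 2).sum(axis=0)
intercept = y_mean - slope * x_mean
# slope and intercept now have the shape of the kept dimensions: (3, 4, 2)
```

The same arithmetic works directly on DataArrays with `.mean(dim='year')` and `.sum(dim='year')`, which also keeps the coordinates attached.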
How could I speed-up and parallelize this operation?
I understand that apply_ufunc
might be an option (and could be parallelized with dask), but I did not manage to get the parameters right.
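For concreteness, here is the kind of apply_ufunc call I have been attempting, on a small synthetic Dataset (the variable and dimension names mirror my real data but the values are made up); the wrapper returns slope and intercept stacked along a new dimension:

```python
import numpy as np
import scipy.stats
import xarray as xr

def _linregress_1d(x, y):
    # regress one pair of 1-D series; return slope and intercept together
    res = scipy.stats.linregress(x, y)
    return np.array([res.slope, res.intercept])

# synthetic 4-D dataset standing in for the real one
rng = np.random.default_rng(0)
shape = (10, 3, 4, 2)  # year, latitude, longitude, duration
dims = ("year", "latitude", "longitude", "duration")
ds = xr.Dataset({
    "x": (dims, rng.random(shape)),
    "y": (dims, rng.random(shape)),
})

params = xr.apply_ufunc(
    _linregress_1d,
    ds["x"], ds["y"],
    input_core_dims=[["year"], ["year"]],  # regress along 'year'
    output_core_dims=[["parameter"]],      # new dim holding slope, intercept
    vectorize=True,                        # loop over the kept dims via np.vectorize
    dask="parallelized",                   # only takes effect on chunked (dask) inputs
    output_dtypes=["float64"],
    dask_gufunc_kwargs={"output_sizes": {"parameter": 2}},
)
slope = params.isel(parameter=0)
intercept = params.isel(parameter=1)
```

With `vectorize=True` this still loops in Python under the hood, but chunking the inputs with `.chunk()` lets `dask="parallelized"` distribute those loops across workers.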
The following questions are related but without an answer:
- Applying numpy.polyfit to xarray Dataset
- Python: How to find regression equation of multiple 3D (lat-lon-time-value) dataArrays?
- calculating cross-correlation function in xarray
Edit 2: move previous edit to an answer