
I am trying to open a grib file and extract data at given lat/lons using the cfgrib package and ecCodes on Linux.

NOTE: this code works when run outside Jupyter, but crashes inside it.

I've installed ecCodes and checked it using `python -m cfgrib selfcheck`, which gave the output:

Found: ecCodes v2.18.0
Your system is ready

I then installed cfgrib in a virtual env using anaconda as per the instructions here.

When I run the following code on a relatively small sample grib file in a Jupyter notebook (with the virtual env as the kernel), it kills the kernel every time.

import cfgrib
import xarray
ds = xarray.open_dataset('/path/to/my/file/era5-levels-members.grib', engine='cfgrib')

I have updated to the most recent versions of ecCodes and cfgrib and I can't see what is going wrong.

This is the error I get: (screenshot of the error message; image not reproduced here)

Pad
  • This is probably an out-of-memory error. How large is the array when successfully loaded into memory vs. your machine's memory? – Maximilian Oct 15 '20 at 23:08
  • I don't think it is, the array is only 2.4 MB. I am not sure how to check how large it is when loaded into memory, but the machine memory is 15 GB – Pad Oct 16 '20 at 09:26
  • @Pad you can check that with `ds.nbytes / 10**9`. Also, the `cfgrib` engine will only work if you have write permission in the directory and enough space to write its index files; by default it writes them in the directory you're opening the dataset from. Moreover, if you launch jupyter from a terminal, it will print log messages when your kernel dies. Those messages can give more insight into the source of the problem. – Light_B Oct 16 '20 at 12:15
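The size check and the write-permission workaround suggested in the comments above can be sketched as follows. The file path is the one from the question; the `indexpath` backend kwarg and the exception handling are my assumptions, not something the asker tried:

```python
# Sketch of the checks suggested in the comments. The 'indexpath' kwarg and
# the broad exception handling are illustrative assumptions.

def nbytes_to_gb(nbytes):
    """Convert a byte count (e.g. ds.nbytes) to gigabytes."""
    return nbytes / 10**9

try:
    import xarray

    # An empty 'indexpath' stops cfgrib from writing .idx files next to the
    # grib file, which sidesteps write-permission problems in that directory.
    ds = xarray.open_dataset(
        '/path/to/my/file/era5-levels-members.grib',
        engine='cfgrib',
        backend_kwargs={'indexpath': ''},
    )
    print('in-memory size (GB):', nbytes_to_gb(ds.nbytes))
except Exception:
    # xarray/cfgrib may be missing, or the sample file may not exist here.
    pass
```

If the dataset size reported this way is far below the machine's 15 GB, an out-of-memory crash is unlikely and the index-file or native-library side is the more plausible culprit.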

0 Answers