
So I am trying to unpack a netCDF file, and I have gotten to the point where I have put the variables I am interested in into a NumPy array. However, my problem is the size of the array that I am trying to analyse. The following are the variables and their shapes:

_parameters ()
globalx (1, 231)
globaly (1, 231)
globaltime (6,)
vz (6, 100, 1, 231)
uz (6, 100, 1, 231)
u (6, 1, 231)
v (6, 1, 231)
ustz (6, 100, 1, 231)

I am interested in vz, uz and ustz. Their dimensions are (the number of iterations, nz, ny, nx); I am not interested in ny, and it holds no value (its length is 1).

So the question is: can I reduce the shape to (the number of iterations, nz, nx) and unpack nz and nx per iteration? (See the sketch after my code below.)

Thanks

Here is my code:

from netCDF4 import Dataset
import numpy as np
import pandas as pd

ncfile = 'xboutput.nc'
jan = Dataset(ncfile, mode='r')

#print(jan.variables)

_parameters = jan.variables['_parameters']
globalx = jan.variables['globalx'].units    # units attribute (a string), not the data
globaly = jan.variables['globaly'].units
globaltime = jan.variables['globaltime'].units
vz = jan.variables['vz'][:]
uz = jan.variables['uz'][:]
ustz = jan.variables['ustz'][:]
v = jan.variables['v'][:]
u = jan.variables['u'][:]

for i in jan.variables:
    print (i,jan.variables[i].shape)

vz_ar = np.array(jan.variables['vz'])
u_ar = np.array(jan.variables['u'])
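
What I am after, sketched with np.squeeze (assuming axis 2 is the length-1 ny dimension):

vz_sq = np.squeeze(vz_ar, axis=2)    # (6, 100, 1, 231) -> (6, 100, 231)
for it in range(vz_sq.shape[0]):     # one slice per iteration
    nz_nx = vz_sq[it]                # shape (nz, nx) = (100, 231)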

1 Answer


This problem is what the xarray package was designed to solve. Especially because of its integration with dask, xarray is the ideal tool for working with large netCDF files with many dimensions.

Try opening your data with xarray instead:

import xarray as xr
ds = xr.open_dataset('xboutput.nc')

Then work with the data directly via the xarray API. Xarray's broadcasting rules make it very easy to operate on data with different combinations of dimensions.
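
For example, a minimal sketch (continuing from the ds opened above, and assuming the length-1 dimension is the ny you want to drop):

vz = ds['vz'].squeeze(drop=True)    # drop all length-1 dims -> (iterations, nz, nx)
for i in range(vz.shape[0]):        # one (nz, nx) slice per iteration
    slab = vz[i].values             # plain NumPy array, shape (100, 231)

squeeze removes the size-1 dimension without copying the data, and .values gives you back an ordinary NumPy array whenever you need one.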

Ryan