So I am trying to unpack a netCDF file, and I have gotten to the point where I have put the variables I am interested in into a NumPy array. However, my problem is the shape of the arrays I am trying to analyse. The following are the variables and their shapes:
_parameters ()
globalx (1, 231)
globaly (1, 231)
globaltime (6,)
vz (6, 100, 1, 231)
uz (6, 100, 1, 231)
u (6, 1, 231)
v (6, 1, 231)
ustz (6, 100, 1, 231)
I am interested in vz, uz and ustz. Their dimensions are (number of iterations, nz, ny, nx); ny has length 1 and holds no value for me.
So the question is: can I reduce the shape to (number of iterations, nz, nx) and then unpack nz and nx per iteration?
Thanks
Here is my code:
from netCDF4 import Dataset
import numpy as np
import pandas as pd
ncfile = 'xboutput.nc'
jan = Dataset(ncfile, mode='r')
#print (file.variables)
_parameters = jan.variables['_parameters']
globalx = jan.variables['globalx'].units
globaly = jan.variables['globaly'].units
globaltime = jan.variables['globaltime'].units
vz = jan.variables['vz'][:]
uz = jan.variables['uz'][:]
ustz = jan.variables['ustz'][:]
v = jan.variables['v'][:]
u = jan.variables['u'][:]
for i in jan.variables:
    print(i, jan.variables[i].shape)
vz_ar = np.array(jan.variables['vz'])
u_ar = np.array(jan.variables['u'])
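To illustrate the reshape I'm after, here is a minimal sketch using dummy zeros in place of the real netCDF data. I believe np.squeeze can drop the length-1 ny axis, after which each iteration gives a plain (nz, nx) slice:

```python
import numpy as np

# Dummy stand-in for vz/uz/ustz with shape (iterations, nz, ny, nx)
vz_ar = np.zeros((6, 100, 1, 231))

# Drop the length-1 ny axis (axis 2) -> (iterations, nz, nx)
vz_3d = np.squeeze(vz_ar, axis=2)
print(vz_3d.shape)  # (6, 100, 231)

# Unpack one (nz, nx) slice per iteration
for it in range(vz_3d.shape[0]):
    slice_2d = vz_3d[it]   # shape (100, 231)
```

Is this the right approach, or is there a better way to do it directly on the netCDF variable?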