
I have tried using pool.map() and pool.apply() to parallelize the CAPE estimation, but I am getting errors. Below is the code I have been running:

import multiprocessing
from multiprocessing import Pool
from pytictoc import TicToc # conda install pytictoc -c ecf

tims = 4 #FIXME: u.shape[0]; use 4 for 6-hourly data

if __name__ == '__main__': # guard required by multiprocessing
    pool = Pool(processes=8) # set number of processes

    for tim in range(tims): # loop across the timestamps of a day
        print('processing time-step: ', tim)
        t = TicToc()
        t.tic()

        for lat in SP_test.latitude.values:
            for lon in SP_test.longitude.values:
                for tim in SP_test.time.values:
                    Temp = SP_test.sel(time=tim, latitude=lat, longitude=lon).t
                    RH = SP_test.sel(time=tim, latitude=lat, longitude=lon).r
                    TD = dewpoint_from_relative_humidity(Temp, RH)

                    # run parallel (across (lat, lon), for the fixed current timestamp)
                    # processes; save all 91*151 results in one item of the list
                    cape[tim] = [pool.apply(most_unstable_cape_cin, args=(p, Temp, TD))]

        t.toc()
        

'cape' is an empty xarray DataArray that is to be filled with the output CAPE values. The function that needs to be parallelized is 'most_unstable_cape_cin', but I am getting this error:

ValueError: `most_unstable_cape_cin` given arguments with incorrect units: `dewpoint` requires "[temperature]" but given "dimensionless"

Is there any other method to do this, or any changes I should make to the present code?

piyush

1 Answer


That error indicates that units are missing from the input data. MetPy's calculation functions, like most_unstable_cape_cin, rely on unit information being attached to the data. With xarray, you can do this by setting the units attribute:

SP_test.t.attrs['units'] = 'Kelvin'
SP_test.r.attrs['units'] = 'percent'
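
For a single grid point, the end-to-end flow then looks roughly like the sketch below. The variable names (SP_test, its t and r variables, and the pressure array p) follow the question; the 'level' coordinate used to build p is an assumption about the dataset, so adjust it to whatever your pressure dimension is called.

# Minimal sketch of the fix at one grid point; SP_test and the 'level'
# pressure coordinate are assumed from the question's dataset.
from metpy.calc import dewpoint_from_relative_humidity, most_unstable_cape_cin
from metpy.units import units

SP_test.t.attrs['units'] = 'kelvin'
SP_test.r.attrs['units'] = 'percent'

# .metpy.unit_array returns the data as a pint Quantity carrying those units
Temp = SP_test.t.isel(time=0, latitude=0, longitude=0).metpy.unit_array
RH = SP_test.r.isel(time=0, latitude=0, longitude=0).metpy.unit_array

TD = dewpoint_from_relative_humidity(Temp, RH) # now "[temperature]", not dimensionless

p = SP_test.level.values * units.hPa # pressure must carry units as well
cape, cin = most_unstable_cape_cin(p, Temp, TD)

As a side note, pool.apply blocks until each call returns, so the loop in the question actually runs the grid points one at a time; something like pool.starmap, which takes a whole list of argument tuples at once, would be needed to run them in parallel.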
DopplerShift