I would like to perform a 1D interpolation on a 3D data cube in Python. I have 18 images taken at fiducial wavelengths, and I want to interpolate each pixel along the wavelength axis to produce new images at 185 specified wavelengths. My problem is one of optimization. I have already seen tricks like scipy.ndimage.map_coordinates, but I don't think it applies to my problem or would give an optimized solution. The real issue is the size of my data: I'm working with an (18, 1024, 1024) data cube and want an output cube of shape (185, 1024, 1024). Is there a way to optimize the following code?
import numpy as np
from scipy.interpolate import interp1d
cube_software = np.zeros((18,1024,1024))
lambda_instrument = np.linspace(2,13,185)
lambda_software = np.linspace(2,13,18)
cube_new = np.zeros((185,1024,1024))
# Loop over every pixel and interpolate its spectrum from the
# 18 software wavelengths onto the 185 instrument wavelengths.
for r in range(np.shape(cube_software)[1]):
    # print('Number %i over %i'%(r,np.shape(cube_software)[1]))
    for u in range(np.shape(cube_software)[2]):
        pixel_inter = interp1d(lambda_software,cube_software[:,r,u])
        cube_new[:,r,u] = pixel_inter(lambda_instrument)
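
For what it's worth, one variant I am considering (assuming linear interpolation is enough; I am not sure it is the fastest option) uses the axis argument of interp1d, so the whole cube is interpolated in a single call instead of pixel by pixel:

import numpy as np
from scipy.interpolate import interp1d

cube_software = np.zeros((18,1024,1024))
lambda_software = np.linspace(2,13,18)
lambda_instrument = np.linspace(2,13,185)

# Build one interpolator over the wavelength axis (axis 0) for the whole cube
pixel_inter = interp1d(lambda_software, cube_software, axis=0, kind='linear')

# Evaluate at the 185 instrument wavelengths -> shape (185,1024,1024)
cube_new = pixel_inter(lambda_instrument)

Would this be the right direction, or is there something faster still?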