Rather than doing that in a layer, you can use a nolearn.lasagne.BatchIterator; in the following snippet I resample my original 1D signal to a 1000-point signal:
    from nolearn.lasagne import BatchIterator
    from scipy.signal import resample
    import numpy as np

    class ResampleIterator(BatchIterator):
        def __init__(self, batch_size, newSize):
            super(ResampleIterator, self).__init__(batch_size)
            self.newSize = newSize

        def transform(self, Xb, yb):
            # resample along the last axis (the signal length),
            # leaving the batch and channel dimensions untouched
            X_new = resample(Xb, self.newSize, axis=2).astype(np.float32)
            return X_new, yb
    myNet = NeuralNet(
        # define your usual other parameters (layers, etc.) here
        # and these are the lines you are interested in:
        batch_iterator_train=ResampleIterator(batch_size=128, newSize=1000),
        batch_iterator_test=ResampleIterator(batch_size=128, newSize=1000),
    )
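As a quick sanity check, independent of nolearn, you can verify that scipy.signal.resample with axis=2 changes only the last dimension of a (batch, channels, length) array, which is exactly what the iterator's transform relies on. The shapes below are just illustrative assumptions:

    import numpy as np
    from scipy.signal import resample

    # dummy batch of 128 single-channel 1D signals, 4096 samples each
    Xb = np.random.randn(128, 1, 4096).astype(np.float32)

    # resample each signal down to 1000 points
    X_new = resample(Xb, 1000, axis=2).astype(np.float32)
    print(X_new.shape)  # (128, 1, 1000)

Note that resample uses an FFT-based method, so it assumes the signal is roughly periodic; for strongly non-periodic signals you may prefer scipy.signal.resample_poly.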
I don't know if you've used nolearn yet; if not, you can read more about it (installation, examples) here