What about a 1-dimensional convolution? You can use strides, as in the following:
n, w = x.shape
c = 1
x = x.reshape(n, w, c)                # (n, w, 1): 1D signal with one channel
x = conv1d(x, 1, 3, stride=2, pad=1)  # 1 filter, stride 2, so output is (n, w//2, 1)
x = x.reshape(n, w // 2)
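The shape arithmetic above can be checked with a minimal NumPy stand-in for the pseudocode's channels-last conv1d (a hypothetical helper with random, untrained weights — a sketch of the shapes, not a real layer):

```python
import numpy as np

def conv1d(x, filters, kernel, stride=1, pad=0):
    """Minimal channels-last 1D convolution; x has shape (n, w, c).
    Weights are random stand-ins for what a real layer would learn."""
    n, w, c = x.shape
    rng = np.random.default_rng(0)
    W = rng.standard_normal((kernel, c, filters))
    xp = np.pad(x, ((0, 0), (pad, pad), (0, 0)))       # pad the spatial axis only
    out_w = (w + 2 * pad - kernel) // stride + 1
    out = np.zeros((n, out_w, filters))
    for i in range(out_w):
        window = xp[:, i * stride : i * stride + kernel, :]        # (n, kernel, c)
        out[:, i, :] = np.tensordot(window, W, axes=([1, 2], [0, 1]))
    return out

n, w = 4, 10
x = np.ones((n, w)).reshape(n, w, 1)      # (n, w, 1)
y = conv1d(x, 1, 3, stride=2, pad=1)      # stride 2 halves the width
print(y.shape)                            # (4, 5, 1)
```

With kernel 3, pad 1, and stride 2, the output width is (w + 2 - 3) // 2 + 1 = w // 2 for even w, matching the comment in the snippet above.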
That gives you integer fractions of your current width (here w//2; stack more strided convs to shrink it further). Or you could use one output channel per target dimension and then pool over the entire 1D extent:
x = x.reshape(n, w, c)       # (n, w, 1)
x = conv1d(x, d, 3, pad=1)   # d filters, so output is (n, w, d)
x = x.mean(1)                # mean over the 1D spatial axis → (n, d)
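The conv-then-global-average-pool route can be sketched the same way, again with a hypothetical NumPy conv1d using random weights in place of learned ones:

```python
import numpy as np

def conv1d_same(x, filters, kernel, pad):
    """Minimal channels-last, stride-1 1D convolution; x has shape (n, w, c).
    Random weights stand in for a trained layer."""
    n, w, c = x.shape
    rng = np.random.default_rng(0)
    W = rng.standard_normal((kernel, c, filters))
    xp = np.pad(x, ((0, 0), (pad, pad), (0, 0)))
    out_w = w + 2 * pad - kernel + 1
    out = np.zeros((n, out_w, filters))
    for i in range(out_w):
        out[:, i, :] = np.tensordot(xp[:, i:i + kernel, :], W, axes=([1, 2], [0, 1]))
    return out

n, w, d = 4, 10, 8
x = np.ones((n, w, 1))               # (n, w, 1)
y = conv1d_same(x, d, 3, pad=1)      # kernel 3, pad 1 keeps the width: (n, w, d)
z = y.mean(1)                        # global average pool over width → (n, d)
print(z.shape)                       # (4, 8)
```

Mean-pooling over the whole spatial axis is what makes the output width-independent: any w collapses to a single d-dimensional vector per example.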
No guarantees that any of these will actually work well, but since this is a neural network they probably won't break anything too badly.
Finally, the cheat answer: treat the whole vector as channels at a single spatial position, so a kernel-size-1 convolution is just a dense layer:
x = x.reshape(n, c, w)   # (n, 1, w): width 1, w channels
x = conv1d(x, d, 1)      # d kernel-1 filters → (n, 1, d)
x = x.reshape(n, d)
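Why this is a cheat: a kernel-1 convolution over a length-1 spatial axis with w channels computes exactly a dense (fully connected) layer from w to d, bias aside. A small NumPy check (the weight matrix W is an illustrative stand-in for the d filters):

```python
import numpy as np

n, w, d = 4, 10, 8
rng = np.random.default_rng(0)
x = rng.standard_normal((n, w))
W = rng.standard_normal((w, d))   # d kernel-1 filters: one weight per input channel each

# Kernel-1 conv over a width-1 "spatial" axis: each filter takes a
# weighted sum of the w channels at the single position.
xc = x.reshape(n, 1, w)                     # (n, 1, w): width 1, w channels
conv_out = np.einsum('nlc,cd->nld', xc, W)  # (n, 1, d)
conv_out = conv_out.reshape(n, d)

# ...which is exactly a dense layer applied to x.
dense_out = x @ W
print(np.allclose(conv_out, dense_out))     # True
```

So if you go this route you may as well just use a dense layer directly; the conv formulation buys you nothing here.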