I have a bottleneck in a 2D median filter (3x3 window) that I run on a very large set of images, and I'd like to try to optimize it. I've tested scipy.ndimage.median_filter, as well as PIL, scipy.signal and scikits-image. However, browsing SO I've learned that there's a fast O(n) median filter out there in C (Median Filtering in Constant Time; see Rolling median algorithm in C), and I wondered whether I could implement it in Python using scipy.weave.inline?
Any suggestions on an alternative route?
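For reference, the baseline I'm trying to beat looks roughly like this (the random array is only a made-up stand-in for one of my images):

    import numpy as np
    from scipy import ndimage

    img = np.random.randint(0, 2**16, size=(2048, 2048)).astype(np.uint16)  # stand-in for one image
    out = ndimage.median_filter(img, size=3)  # 3x3 window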
-
I have a cython wrapper for Perreault + Hebert, Median Filtering in Constant Time, someplace -- but that'd be overkill for 3x3, better do a sorting network, 9 inputs in 25 compares. Are the inputs 0..255 or what? – denis Mar 26 '12 at 16:01
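A rough sketch of the sorting-network idea, vectorized with numpy so that all interior pixels are filtered at once. It uses a plain bubble network (many more compare-exchanges than the 25 denis mentions) purely to illustrate the approach; the function name is made up:

    import numpy as np

    def median3x3_network(img):
        H, W = img.shape
        # The 9 shifted views: the 3x3 neighbourhood of every interior pixel,
        # each of shape (H-2, W-2).
        p = [img[i:H - 2 + i, j:W - 2 + j] for i in range(3) for j in range(3)]
        # Data-independent compare-exchange passes (a simple bubble network);
        # once they finish, p[4] holds the median of every window.
        for end in range(8, 0, -1):
            for k in range(end):
                lo, hi = np.minimum(p[k], p[k + 1]), np.maximum(p[k], p[k + 1])
                p[k], p[k + 1] = lo, hi
        return p[4]  # medians for the interior pixels, shape (H-2, W-2)

For a quick sanity check the result can be compared against scipy.ndimage.median_filter(img, size=3)[1:-1, 1:-1].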
-
Do you just want to process your image set? Is [OpenCV](http://opencv.org/) an alternative route? [medianBlur](http://docs.opencv.org/modules/imgproc/doc/filtering.html#medianblur) perhaps? You also have CUDA and OpenCL implementations. – mfrellum May 27 '14 at 07:51
-
Most of the code framework is already in Python and small parts are in MATLAB; it would be best to implement this part of the code in Python somehow. – bla May 28 '14 at 03:33
-
@natan OpenCV has a nice Python interface for most of its functionality. E.g. from the documentation: Python: cv2.medianBlur(src, ksize[, dst]) → dst – mfrellum May 30 '14 at 09:39
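If OpenCV is an option, a usage sketch (the file name is a placeholder; per the OpenCV docs, ksize 3 and 5 also accept 16-bit and 32-bit float images, while larger kernels are 8-bit only):

    import cv2

    img = cv2.imread("frame.tif", cv2.IMREAD_UNCHANGED)  # IMREAD_UNCHANGED keeps 16-bit data as uint16
    out = cv2.medianBlur(img, 3)                         # 3x3 median; output has the same dtype as the input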
-
What "1" are you referring to in `O(1)`? I don't think you can expect to be less than linear in the number of pixels in your image. – eickenberg May 30 '14 at 15:01
-
@eickenberg, I guess you are right and it means linear, so O(n) is the right way to write it? – bla May 30 '14 at 22:17
3 Answers
Try this: Rolling median in C - Turlach implementation
Usage:
    Mediator* m = MediatorNew(9);    /* rolling window of the 9 most recent values */
    for (...)
    {
        MediatorInsert(m, value);    /* push the next value into the window */
        median = MediatorMedian(m);  /* median of the current window */
    }
I believe this is the same as the R algo, but cleaner (amazingly so, in fact).
You can either wrap this, or port it and use Numba (or Cython). I think I'd recommend Numba over Cython, if nothing else because it's plain old Python code.
I suggest adding this to scikits, if it runs faster than the one in scikits already :)
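A minimal sketch of the Numba route mentioned above, assuming a 2-D single-channel array; note this is a brute-force per-pixel median, not a port of the linked Turlach Mediator code:

    import numpy as np
    from numba import njit

    @njit(cache=True)
    def median3x3(img):
        h, w = img.shape
        out = img.copy()                    # border pixels are left unfiltered in this sketch
        win = np.empty(9, dtype=img.dtype)  # scratch buffer for one 3x3 window
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                k = 0
                for dy in range(-1, 2):
                    for dx in range(-1, 2):
                        win[k] = img[y + dy, x + dx]
                        k += 1
                out[y, x] = np.sort(win)[4]  # the median is the middle of the 9 sorted values
        return out

The first call pays the JIT compilation cost; after that the loop runs as compiled code.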
If you're still interested, I'd try numpy's reshape and median:

    import numpy

    a = ...                        # some big array
    blocks = a.reshape(N, 3, 3)    # N being specific to your array (a.size must equal N*9)
    medians = [numpy.median(m) for m in blocks]

I don't know how this scales compared to the methods you tested, but if you want to optimize with C you could speed up the for loop in the list comprehension...

-
This is elegant, but unfortunately it calculates the median of every non-overlapping 3x3, not the median of every possible 3x3. – Alex I Jun 01 '14 at 10:58
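If overlapping windows are wanted while staying in pure numpy, one option (not part of the original answer, and using the same `a` as above) is sliding_window_view, available in numpy >= 1.20:

    import numpy as np
    from numpy.lib.stride_tricks import sliding_window_view  # numpy >= 1.20

    windows = sliding_window_view(a, (3, 3))      # shape (H-2, W-2, 3, 3); a strided view, no copy
    medians = np.median(windows, axis=(-2, -1))   # median of every overlapping 3x3 window

This is convenient rather than fast, though; it probably won't beat scipy.ndimage.median_filter.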
I don't know the underlying algorithm, but scikits-image has a rolling median filter.
Otherwise, I'd recommend writing it in Cython (C/Python pidgin language). Be sure to check out the convolution example/tutorial for working with numpy arrays.
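For completeness, a usage sketch against a current scikit-image (the module layout has changed since this answer was written). skimage.filters.rank.median is the histogram-based filter and works on uint8/uint16 images; image_u16 below is a placeholder name:

    import numpy as np
    from skimage.filters import rank

    footprint = np.ones((3, 3), dtype=np.uint8)   # the 3x3 window (this argument was called "selem" in older releases)
    out = rank.median(image_u16, footprint)       # image_u16: a 2-D uint16 image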

-
Thanks, I'll look into it. Also, by writing 3x3 I meant that the median window size is 3x3; the actual image it is used on is megapixels big. – bla Mar 26 '12 at 17:39
-
Well, it seems that scikits is even slower than PIL and scipy.ndimage. Unfortunately, the images I work on are 16-bit, so the 0..255 option is not relevant. Next: Cython... – bla Mar 27 '12 at 03:23