
I'm exploring 3D image analysis in Python with scipy.ndimage. When I apply a median filter (scipy.ndimage.filters.median_filter) to my 3D image of shape (874, 1150, 1150), it runs very slowly, and the runtime clearly depends on the footprint size. Here are some timings, where a is the 3D image of shape (874, 1150, 1150) and mf is the filter function:

scipy.ndimage.filters.median_filter:

%time a_mf = mf(a, size = 2)

CPU times: user 1min 47s, sys: 684 ms, total: 1min 48s
Wall time: 1min 48s

%time a_mf = mf(a, size = 3)

CPU times: user 6min 25s, sys: 1.79 s, total: 6min 27s
Wall time: 6min 28s
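As a sketch of how the cost grows with the footprint, the same experiment can be run on a small synthetic uint8 volume (the array shape and sizes below are illustrative, not the original data):

```python
import time

import numpy as np
from scipy import ndimage

# Small synthetic stand-in for the (874, 1150, 1150) uint8 volume.
rng = np.random.default_rng(0)
a = rng.integers(0, 256, size=(20, 64, 64), dtype=np.uint8)

results = {}
for size in (2, 3, 5):
    t0 = time.perf_counter()
    # Full 3D median filter with a cubic footprint of edge `size`.
    results[size] = ndimage.median_filter(a, size=size)
    print(f"size={size}: {time.perf_counter() - t0:.3f} s")
```

The footprint holds size**3 voxels, so the per-voxel work grows roughly cubically with the edge length, which matches the steep slowdown reported above.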

I never got a result with size set to 5; I gave up waiting because the runtime was unacceptable.

Do you know why this happens and how I can speed it up?

Taufiq Rahman
Zhou Zhou
  • It is easy to say why that is: your array is simply very large (>8 GB if float). Maybe you would be better off processing it in chunks. I also wonder whether you really want to compute the median in all 3 dimensions or maybe only in 2. – dnalow Nov 20 '16 at 12:56
  • The image is uint8 and its size is 1.16 GB. But yes, you are right that it is time-consuming because of the large array... I just checked with Matlab, which runs even a little slower than ndimage: 117.722498 seconds for a median filter with size = 2... Now it seems I asked a stupid question. Thanks a lot for your reply! I should also think about whether I really need a 3D median filter. But I do think I need to apply filters to enhance the contrast before segmentation. – Zhou Zhou Nov 20 '16 at 13:28
  • OK, that may be. It just seemed to me that you wanted to apply the filter to each of the 874 2D pictures. Right now you apply it to the whole data cube, so the slices get mixed. – dnalow Nov 21 '16 at 08:21
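dnalow's suggestion to filter only in 2D does not need a Python loop: passing a per-axis size with 1 along the z-axis makes median_filter treat each slice independently. A sketch on a small synthetic volume (shapes are illustrative):

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
a = rng.integers(0, 256, size=(8, 32, 32), dtype=np.uint8)

# Footprint of 1 along axis 0: each z-slice is median-filtered on its own,
# so slices are not mixed and the footprint holds 9 voxels instead of 27.
a_2d = ndimage.median_filter(a, size=(1, 3, 3))

# Explicit (slower) equivalent: filter every slice separately.
a_loop = np.stack([ndimage.median_filter(s, size=3) for s in a])
```

Both forms give identical results with the default boundary mode; the tuple form avoids Python-level looping. Chunked processing (the other suggestion in the comments) would additionally need overlapping borders between chunks so the footprint sees the same neighbours at chunk edges.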

0 Answers