
I have a brain MRI. It is grayscale with 20 slices, and I put it into a numpy array with shape (20, 256, 256). I use scipy.ndimage.affine_transform to rotate and resample the array, as shown below.
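Roughly this kind of call (just a sketch; the 30-degree angle and the choice of rotation axis are example values, not my real parameters):

```python
import numpy as np
from scipy.ndimage import affine_transform

# Stand-in volume: 20 slices of 256x256, slice axis first.
vol = np.random.rand(20, 256, 256).astype(np.float32)

angle = np.deg2rad(30)                 # example angle
c, s = np.cos(angle), np.sin(angle)

# Rotation about axis 2, which mixes the slice axis (0) with in-plane axis (1).
rot = np.array([[ c, -s, 0],
                [ s,  c, 0],
                [ 0,  0, 1]])

center = (np.array(vol.shape) - 1) / 2.0
offset = center - rot @ center         # keep the rotation centered on the volume

rotated = affine_transform(vol, rot, offset=offset, order=1)
print(rotated.shape)                   # same shape as the input
```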

[Image: rotated and resampled volume, showing dark stripe artifacts]

The dark stripes in the image are the artifact that I want to reduce. The artifact is caused by the relatively large gap between the slices: in this example the pixel spacing is 0.85 mm, but the distance between slices is 7 mm.

I tried changing the interpolation order of the affine transform, but even order=5 shows the same artifacts. Below is order=0 (nearest neighbor)...

[Image: the same rotation with order=0, showing the same stripe artifacts]

and you can see how the curvature of the skull compounds the problem. Are there any tricks to fix this? Maybe I should add dummy data between the slices to equalize the spacing, as in the sketch below? Maybe I should use polar coordinates to eliminate the curvature? Any other ideas?
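For the equal-spacing idea, this is a sketch of what I mean, using scipy.ndimage.zoom to upsample only the slice axis before rotating (linear interpolation between slices is an assumption; the spacing values are the ones from my data):

```python
import numpy as np
from scipy.ndimage import zoom

pixel_spacing = 0.85   # mm, in-plane
slice_spacing = 7.0    # mm, between slices

vol = np.random.rand(20, 256, 256).astype(np.float32)   # stand-in volume

# Upsample only the slice axis so the voxels become roughly isotropic.
factor = slice_spacing / pixel_spacing        # about 8.2
iso = zoom(vol, (factor, 1, 1), order=1)      # linear interpolation between slices

print(iso.shape)   # about (165, 256, 256); rotate this volume instead
```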


1 Answer


With a slice gap this large, affine_transform will look terrible at any interpolation order. You need to do feature detection and Delaunay triangulation on both adjacent slices first, and then use the interpolation variable as a morph parameter to move the pixels between the corresponding features in the two slices. See https://devendrapratapyadav.github.io/FaceMorphing/

See also this video: https://www.youtube.com/watch?v=5FEr5SiXB1g
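Here is a minimal sketch of that per-slice morphing, assuming matched control points from feature detection are already available (below they are faked with a jittered grid). It uses scikit-image's PiecewiseAffineTransform, which triangulates the control points with Delaunay internally:

```python
import numpy as np
from skimage.transform import PiecewiseAffineTransform, warp

def morph_slices(slice_a, slice_b, pts_a, pts_b, t):
    """Blend slice_a -> slice_b at fraction t using matched (x, y) control points."""
    pts_t = (1.0 - t) * pts_a + t * pts_b       # intermediate feature positions

    # Warp A so its features move to the intermediate positions.
    warp_a = PiecewiseAffineTransform()
    warp_a.estimate(pts_t, pts_a)               # maps output coords -> input coords
    a_t = warp(slice_a, warp_a)

    # Warp B the same way, then cross-fade the two warped slices.
    warp_b = PiecewiseAffineTransform()
    warp_b.estimate(pts_t, pts_b)
    b_t = warp(slice_b, warp_b)

    return (1.0 - t) * a_t + t * b_t

# Demo with two fake slices and a fake grid of "matched" feature points.
rng = np.random.default_rng(0)
slice_a = rng.random((256, 256))
slice_b = rng.random((256, 256))

xs, ys = np.meshgrid(np.linspace(0, 255, 8), np.linspace(0, 255, 8))
pts_a = np.column_stack([xs.ravel(), ys.ravel()])
pts_b = pts_a + rng.normal(scale=2.0, size=pts_a.shape)   # pretend the features moved slightly

mid = morph_slices(slice_a, slice_b, pts_a, pts_b, t=0.5)  # synthetic halfway slice
print(mid.shape)
```

In practice the control points would come from feature matching between adjacent slices, and you would generate several intermediate slices (t = 1/8, 2/8, ...) to fill the 7 mm gap before rotating the stack.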
