
I am trying to downsample an image (for speed), run prediction, then upsample it back. Due to rounding, I get mismatches with the original image size for some pixel dimensions/voxel sizes. What is the best way to handle this?

Forward pass

import numpy as np

original_size = (192, 192, 299)
original_spacing = (3.6458332538605, 3.6458332538605, 3.27)
out_spacing = (5.0, 5.0, 5.0)
out_size = [
    int(np.round(original_size[0] * (original_spacing[0] / out_spacing[0]))),
    int(np.round(original_size[1] * (original_spacing[1] / out_spacing[1]))),
    int(np.round(original_size[2] * (original_spacing[2] / out_spacing[2])))]

# out_size = [140, 140, 196]

Reverse pass

original_size = (140, 140, 196)
original_spacing = (5.0, 5.0, 5.0)
out_spacing = (3.6458332538605, 3.6458332538605, 3.27)
out_size = [
    int(np.round(original_size[0] * (original_spacing[0] / out_spacing[0]))),
    int(np.round(original_size[1] * (original_spacing[1] / out_spacing[1]))),
    int(np.round(original_size[2] * (original_spacing[2] / out_spacing[2])))]

# out_size = [192, 192, 300]

The forward-reverse output size has 300 slices, while the input has 299, due to rounding.

illan

1 Answer


The error happens because the output size is rounded, and you then attempt to compute the original size from these rounded values.

You can overcome this issue by realizing that rounding the output size forces a slightly different spacing. Re-compute the spacing from the rounded size and store those values together with the resampled image. When you use these corrected spacing values instead of the ones you originally requested, the original input size can be reconstructed exactly.

import numpy as np

original_size = (192, 192, 299)
original_spacing = (3.6458332538605, 3.6458332538605, 3.27)
out_spacing = (5.0, 5.0, 5.0)
out_size = [
    int(np.round(original_size[0] * (original_spacing[0] / out_spacing[0]))),
    int(np.round(original_size[1] * (original_spacing[1] / out_spacing[1]))),
    int(np.round(original_size[2] * (original_spacing[2] / out_spacing[2])))]

# the spacing actually realized by the rounded out_size
out_spacing = [
    original_size[0] * original_spacing[0] / out_size[0],
    original_size[1] * original_spacing[1] / out_size[1],
    original_size[2] * original_spacing[2] / out_size[2]]

# out_spacing = [4.999999891008685, 4.999999891008685, 4.988418367346939]
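To see why this works end to end, here is a minimal sketch (the helper `resample_sizes` is hypothetical, not part of any library): the forward pass rounds the size and recomputes the spacing, and the reverse pass uses that corrected spacing, so the original size is recovered exactly.

```python
import numpy as np

def resample_sizes(size, spacing, out_spacing):
    """Round the output size for the requested spacing, then
    recompute the spacing that exactly matches the rounded size."""
    out_size = [int(np.round(s * sp / osp))
                for s, sp, osp in zip(size, spacing, out_spacing)]
    corrected = [s * sp / os for s, sp, os in zip(size, spacing, out_size)]
    return out_size, corrected

original_spacing = (3.6458332538605, 3.6458332538605, 3.27)

# forward pass: downsample with a nominal 5 mm spacing
down_size, down_spacing = resample_sizes(
    (192, 192, 299), original_spacing, (5.0, 5.0, 5.0))

# reverse pass: use the *corrected* spacing, not (5.0, 5.0, 5.0)
up_size, _ = resample_sizes(down_size, down_spacing, original_spacing)

print(down_size)  # [140, 140, 196]
print(up_size)    # [192, 192, 299] -- matches the original
```

The reverse pass is exact because `down_size * down_spacing` equals `original_size * original_spacing` by construction, so the rounding in the reverse direction has nothing left to distort.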

Cris Luengo
  • Thanks. I was thinking I would have to store information and pass it in/out of the function. I guess you do have to preserve the original size, rounding direction or something like that across the forward and backward passes. – illan Jul 06 '23 at 19:30
  • @illan you should probably keep the original image, not scale up the downscaled one. If you’re upscaling the annotations produced for the downscaled image to match the original one, then just scale them by the ratio of the sizes of the two images. – Cris Luengo Jul 06 '23 at 21:46
  • Yes, that is what I am doing. – illan Aug 17 '23 at 15:17