
Full disclaimer: this is my first attempt at edge detection, so when responding please assume I know about software but not necessarily about computer vision or related fields.

I am reading about various edge detection methods. A natural first step in many of them is to smooth the image before proceeding. This seems reasonable because you don't want random noise to interfere with any higher-level logic.

My question is, what useful information can be lost due to a Gaussian blur? (If any?)

Carlos Bribiescas

1 Answer


Useful contrast information can be lost with any blurring technique (not only Gaussian blur). The reason is that blurring averages neighboring image intensities, which naturally kills contrast. In signal processing terms, this operation acts as a low-pass filter. An illustration of what happens with overly aggressive blurring: http://www.borisfx.com/images/bcc3/gaussian_blur.jpg
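As a minimal sketch of the low-pass effect (assuming NumPy and SciPy are available; the use of `scipy.ndimage.gaussian_filter1d` is my choice for illustration, not something from the original answer), you can blur a synthetic step edge with increasing sigma and watch the peak gradient shrink:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Synthetic 1-D signal: a sharp step edge from 0 to 1.
signal = np.zeros(100)
signal[50:] = 1.0

for sigma in [0.5, 1.0, 2.0, 4.0, 8.0]:
    blurred = gaussian_filter1d(signal, sigma)
    # The gradient magnitude at the edge shrinks as sigma grows,
    # because the low-pass filter spreads the transition over more pixels.
    max_gradient = np.abs(np.diff(blurred)).max()
    print(f"sigma={sigma:>4}: peak gradient = {max_gradient:.3f}")
```

The printed peak gradient drops steadily with larger sigma, which is exactly the contrast/edge-strength information an edge detector relies on.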

ezfn