
I was going through this example of denoising with CSR (centralized sparse representation), trying to rewrite the code in a different way. There are a few lines in it that I don't understand.

The lines are:

1)

face = face[::2, ::2] + face[1::2, ::2] + face[::2, 1::2] + face[1::2, 1::2]
face /= 4.0

2)

data -= np.mean(data, axis=0)
data /= np.std(data, axis=0)
asked by Sudip Das

1 Answer

The first one is extended slicing and the second is standardization.

Explanation:

1.

face = face[::2, ::2] + face[1::2, ::2] + face[::2, 1::2] + face[1::2, 1::2]
face /= 4.0

is just downsampling the image by averaging each 2×2 block of pixels (a form of bilinear interpolation), i.e. reducing its resolution.

In the given example, the original dimensions of face are [768, 1024]. The two lines above average each 2×2 block of pixels into a single pixel. See the image below.

[Image: bilinear interpolation downsampling]

The blue points are the original pixels of the image and the red ones are the result of averaging the surrounding pixels. Extend this idea to the whole image and you get the new, reduced resolution of face = [384, 512].

This is done purely to speed up the computation, since a larger image takes longer to process.
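A minimal sketch of this averaging step, using a random array in place of the actual 768×1024 face image (the array name and values here are placeholders, not the original data):

```python
import numpy as np

# Stand-in for the 768x1024 grayscale image used in the example.
face = np.random.rand(768, 1024)

# The four slices pick the top-left, bottom-left, top-right and
# bottom-right pixel of every 2x2 block; summing and dividing by 4
# replaces each block with its average.
face = face[::2, ::2] + face[1::2, ::2] + face[::2, 1::2] + face[1::2, 1::2]
face /= 4.0

print(face.shape)  # (384, 512)
```

Each output pixel is exactly the mean of one 2×2 block, so the result has half the resolution along each axis.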

2.

data -= np.mean(data, axis=0)
data /= np.std(data, axis=0)

This is a common preprocessing step in machine learning: it centres and scales each feature so that it has zero mean and unit variance. There are many resources you can consult to learn more about standardization.
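A small sketch of the standardization step on a toy data matrix (the values below are made up; `axis=0` means the mean and standard deviation are computed per column, i.e. per feature):

```python
import numpy as np

# Toy data: 5 samples x 3 features with very different scales.
data = np.array([[1.0, 200.0, 0.1],
                 [2.0, 180.0, 0.2],
                 [3.0, 220.0, 0.3],
                 [4.0, 210.0, 0.4],
                 [5.0, 190.0, 0.5]])

# Subtract each column's mean, then divide by each column's std,
# so every feature ends up with zero mean and unit variance.
data -= np.mean(data, axis=0)
data /= np.std(data, axis=0)

print(np.allclose(data.mean(axis=0), 0.0))  # True
print(np.allclose(data.std(axis=0), 1.0))   # True
```

Note that this divides by the population standard deviation (`np.std` with the default `ddof=0`), which is what the quoted code does; it will fail for a constant column, whose standard deviation is zero.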

answered by Vivek Kumar