I read a blog post, Histogram Equalization for Image Enhancement, which gives 7 steps to convert normal images to an HDR image (below). It says that a C/C++ program for histogram equalization can easily be written using the Open Computer Vision Library, or OpenCV. The major steps of such a program include:
1. Read the input image. This can be in most any image format thanks to OpenCV. The input image contains n pixels: n = height × width.
2. Convert from RGB (curiously stored in the order blue, green, red by OpenCV) to HSV: Hue, Saturation, and Value.
3. Calculate the histogram of the input image. This is a 256-value array, where H[x] contains the number of pixels with value x.
4. Calculate the cumulative density function of the histogram. This is a 256-value array, where cdf[x] contains the number of pixels with value x or less: cdf[x] = H[0] + H[1] + H[2] + ... + H[x].
5. Loop through the n pixels in the entire image and replace the value at each i'th point: V[i] <-- floor(255*(cdf[V[i]] - cdf[0])/(n - cdf[0])).
6. Convert the image back from HSV to RGB.
7. Save the image in the desired format and file name.
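
To make the question concrete, here is a rough C++/OpenCV sketch of how I currently read these steps. The choice to equalize only the V channel, the output file name, and the variable names are my own assumptions, not something the blog states.

    #include <opencv2/opencv.hpp>
    #include <cmath>
    #include <vector>

    int main(int argc, char** argv)
    {
        if (argc < 2) return 1;

        // Step 1: read the input image. OpenCV stores the channels as B, G, R.
        cv::Mat bgr = cv::imread(argv[1]);
        if (bgr.empty()) return 1;
        int n = bgr.rows * bgr.cols;              // n = height x width

        // Step 2: convert BGR to HSV.
        cv::Mat hsv;
        cv::cvtColor(bgr, hsv, cv::COLOR_BGR2HSV);

        // Steps 3-5 are applied to the V (brightness) channel only (my assumption).
        std::vector<cv::Mat> channels;
        cv::split(hsv, channels);
        cv::Mat& V = channels[2];

        // Step 3: histogram of V. H[x] counts the pixels whose V value equals x.
        int H[256] = {0};
        for (int r = 0; r < V.rows; ++r)
            for (int c = 0; c < V.cols; ++c)
                H[V.at<uchar>(r, c)]++;

        // Step 4: cumulative sum of the histogram.
        int cdf[256];
        cdf[0] = H[0];
        for (int x = 1; x < 256; ++x)
            cdf[x] = cdf[x - 1] + H[x];

        // Step 5: remap each pixel's V value with the formula from the blog.
        // (Assumes the image is not a single flat value, so n > cdf[0].)
        for (int r = 0; r < V.rows; ++r)
            for (int c = 0; c < V.cols; ++c) {
                int v = V.at<uchar>(r, c);
                V.at<uchar>(r, c) = (uchar)std::floor(
                    255.0 * (cdf[v] - cdf[0]) / (n - cdf[0]));
            }

        // Step 6: merge the channels and convert HSV back to BGR.
        cv::merge(channels, hsv);
        cv::cvtColor(hsv, bgr, cv::COLOR_HSV2BGR);

        // Step 7: save the result (placeholder file name).
        cv::imwrite("equalized.png", bgr);
        return 0;
    }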
At step 3, I do not understand what H[x] is. Does x refer to the R, G, B values or to the H, S, V values? Also, at step 5, what is the meaning of the value i?