I need to know the level of gamma correction that ImageMagick automatically computes when we use the following command:
convert in.jpg -auto-gamma out.jpg
Is that possible?
Thank you
If I remember correctly, -auto-gamma would call the AutoGammaImage method and apply the following equation.
gamma = log(mean*QuantumScale)/log(0.5)

where mean*QuantumScale is the pixel mean normalized to the 0-1 range.
So to calculate what value of gamma correction will be applied, you can do something like...
# '%[fx:mean]' reports the mean already normalized to 0-1
pixel_mean=$(identify -format '%[fx:mean]' rose:)
echo "l($pixel_mean)/l(0.5)" | bc -l
#=> 1.2781
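If you'd rather apply that value yourself than just look at it, a small sketch along these lines should work (the output file name here is only for illustration):

# apply the computed gamma explicitly instead of relying on -auto-gamma
convert rose: -gamma "$(echo "l($pixel_mean)/l(0.5)" | bc -l)" rose_gamma.png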
As Mark pointed out in the comments, you can also do the whole calculation with ImageMagick's FX expression language (fx's mean is already normalized).

identify -format '%[fx:ln(mean)/ln(0.5)]' rose:
#=> 1.2781
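To sanity-check that this is really the value -auto-gamma uses, you can apply it by hand and diff the two results; a rough sketch (the intermediate file names are made up):

# result of -auto-gamma
convert rose: -auto-gamma auto.png
# same image with the computed gamma applied explicitly via -gamma
convert rose: -gamma "$(identify -format '%[fx:ln(mean)/ln(0.5)]' rose:)" manual.png
# RMSE should be zero (or very close) if the formula matches your build
compare -metric RMSE auto.png manual.png null: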
-auto-gamma is an experimental option. For each color channel, it computes the mean of all pixels (normalized to the 0-1 range), then applies

gamma = log(mean)/log(0.5)

to each sample. I'd never tried it until just now, but it works pretty nicely on the severely underexposed photo in this question.
As for finding out, after the conversion, what gamma was applied: it isn't reported directly, but you can run "identify" on the original image and use the "mean" reported for each channel to work out the gammas that would be applied, as in the answer by emcconville.
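If you want the per-channel values in one go, something along these lines should do it; this is just a sketch, and I'm assuming the fx channel qualifiers mean.r, mean.g and mean.b here:

# per-channel normalized means and the gamma each one implies
identify -format 'R: %[fx:ln(mean.r)/ln(0.5)]\nG: %[fx:ln(mean.g)/ln(0.5)]\nB: %[fx:ln(mean.b)/ln(0.5)]\n' in.jpg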