
I recently came across a question that I haven't seen addressed anywhere while searching about lossy compression: can you determine the quality lost through a certain algorithm? I have been asking around and it seems that there isn't a sure way to determine the quality lost compared to an original image, and that the difference can only be judged by the naked eye. Is there an algorithm that reports a percentage of loss or change?

I would really appreciate it if someone could give me some insight into this matter.

Ian Zhang
  • Are you asking about a way to measure the difference between two images as a percentage? Because of course the JPG compression value gives you a rough idea, although neither does 50% lose half of the image nor is 100% lossless. FWIW, what I do to compare is put both images in different layers in Paint.net and use the XOR blend mode for the top layer. That way equal pixels are black and slightly changed ones appear highlighted. – Andrew Aug 17 '21 at 13:43
  • Yeah that is exactly what I am looking for. I didn't think about using 2 layers to see the difference. I will try that out. I will keep you updated if it works. – Ian Zhang Aug 17 '21 at 13:51
  • But remember this is a programming forum, so aren't you looking for something in that area? – Andrew Aug 17 '21 at 14:29
  • I am; I was looking for an algorithm that could do that in code. But if your method works, I could just write a simple algorithm to calculate the percentage given the pixels. – Ian Zhang Aug 17 '21 at 15:22
  • I guess it should be fairly simple, just loop through the pixels, calculate a percentage of change of each one based on the delta of R, G and B (and alpha if it supports transparency, like a PNG), sum all those values and then divide by the number of pixels. :D – Andrew Aug 17 '21 at 19:10
  • Yeah, I think that is the way to go. Do you know a specific Python method that deals with pixels? I am researching, but it seems like you might know a specific library that does this. – Ian Zhang Aug 18 '21 at 13:43
  • Never mind, I found a Python library called Pillow that does it (see the sketch below). Thanks for all the help! – Ian Zhang Aug 18 '21 at 14:57
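A minimal sketch of the per-pixel comparison described in the comments above, using Pillow; the file names are placeholders and both images are assumed to have the same dimensions:

```python
from PIL import Image

def percent_difference(path_a, path_b):
    # Open both images and normalize them to RGB so the pixel tuples match.
    a = Image.open(path_a).convert("RGB")
    b = Image.open(path_b).convert("RGB")
    if a.size != b.size:
        raise ValueError("Images must have the same dimensions")

    total = 0.0
    for (r1, g1, b1), (r2, g2, b2) in zip(a.getdata(), b.getdata()):
        # Change of this pixel as a fraction of the maximum possible delta (3 * 255).
        total += (abs(r1 - r2) + abs(g1 - g2) + abs(b1 - b2)) / (3 * 255)

    # Average over all pixels and express as a percentage.
    return 100.0 * total / (a.size[0] * a.size[1])

# Example usage (paths are illustrative):
# print(percent_difference("original.png", "compressed.jpg"))
```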

1 Answer


You can use lots of metrics to measure quality loss. But, of course, each metric will interpret quality loss differently.

One direction, following the suggestion already made in the comments, would be to use something like the Euclidean distance or the mean squared error between the original and the compressed image (considered as vectors). There are many more metrics of this "absolute" kind.
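As a rough sketch of that idea, assuming both images have the same dimensions and are loaded as NumPy arrays (e.g. via Pillow; the file names are placeholders):

```python
import numpy as np
from PIL import Image

# Load both images as float arrays so the subtraction doesn't overflow uint8.
original = np.asarray(Image.open("original.png").convert("RGB"), dtype=np.float64)
compressed = np.asarray(Image.open("compressed.jpg").convert("RGB"), dtype=np.float64)

# Mean squared error over all pixels and channels (0 means identical images).
mse = np.mean((original - compressed) ** 2)

# Euclidean distance between the two images treated as flat vectors.
euclidean = np.linalg.norm((original - compressed).ravel())
```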

The above will indicate a certain quality loss but the result may not correlate with human perception of quality. To give more weight to perception you can inspect the structural similarity of the images and use the structural similarity index measure (SSIM) or one of its variants. Another algorithm in this area is butteraugli.

In Python, for instance, there is an implementation of SSIM in the scikit-image package, see this example.
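A minimal usage sketch, assuming scikit-image 0.19 or newer (which accepts the channel_axis argument) and 8-bit RGB images of the same size; the file names are placeholders:

```python
from skimage import io
from skimage.metrics import structural_similarity as ssim

original = io.imread("original.png")
compressed = io.imread("compressed.jpg")

# SSIM returns a score in [-1, 1], where 1 means the images are identical.
# channel_axis=-1 tells scikit-image that the last axis holds the RGB channels.
score = ssim(original, compressed, channel_axis=-1, data_range=255)
print(f"SSIM: {score:.4f}")
```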

The mentioned metrics have in common that they do not return a percentage. If this is crucial to you, another conversion step will be necessary.

Benjamin
  • Not mentioned is the best metric, which unfortunately is also the most expensive: a panel of humans in a carefully controlled experiment. – Mark Adler Apr 11 '22 at 01:36