
While comparing various images with SSIM in Python, I found that some very similar images had a very low SSIM score. This is a consequence of the interaction between the images' backgrounds and the resize that has to be done before the calculation. This issue matters because I am comparing thousands of images, and many dissimilar pairs get higher similarity scores than pairs that are almost morphologically identical.

pod SSIM ratio = 0.2724580796

shirt SSIM ratio = 0.3111611817

My code:

import cv2
import numpy as np
import urllib.request
from skimage.metrics import structural_similarity

class CompareImages_n():

    def __init__(self, url_1, url_2):
        self.img_url_1 = url_1
        self.img_url_2 = url_2

    def load_images(self, img_url, flag_color=False):
        req = urllib.request.urlopen(img_url)
        arr = np.asarray(bytearray(req.read()), dtype=np.uint8)
        image = cv2.imdecode(arr, -1) # Load image as it is
        if not flag_color:
            return cv2.cvtColor(image, cv2.COLOR_BGR2GRAY) # Change color to greyscale
        else:
            return image

    def main_process_ssim(self):
        ima1 = self.load_images(self.img_url_1)
        ima2 = self.load_images(self.img_url_2)
        (H, W) = ima1.shape
        ima2 = cv2.resize(ima2, (W, H))  # Force both images to the same size
        (score, diff) = structural_similarity(ima1, ima2, full=True)
        return score

How can I remove the background from the images before the resize step? What other solutions could be implemented (automated cropping, for example)? Is there a library available for this?

The Dan
