
I have a clean image and a noisy image. I built a denoiser and applied it to the noisy image; that denoised result is my final output. To measure how close this output is to the clean image I need to compare them using PSNR and SSIM, but because the coin sits at a different position in each image I cannot compare them directly.

Right now I am getting an SSIM of 0.5, which is very low because the two images are not aligned. If the images were registered properly, I would expect an SSIM of around 0.80+, but I have not been able to accomplish this.

How can I align these two images to obtain a good SSIM value?
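The metric computation itself is straightforward once alignment is solved; here is a minimal sketch using scikit-image's implementations of the same two metrics (the arrays are synthetic stand-ins, not the actual coin images):

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Synthetic stand-ins for the clean and denoised coin images.
rng = np.random.default_rng(0)
clean = rng.random((64, 64))
denoised = np.clip(clean + 0.05 * rng.standard_normal(clean.shape), 0.0, 1.0)

# Both metrics need the data range of the (float) images.
val_psnr = peak_signal_noise_ratio(clean, denoised, data_range=1.0)
val_ssim = structural_similarity(clean, denoised, data_range=1.0)
print(val_psnr, val_ssim)
```

Both functions assume the two arrays have identical shape, which is exactly why the misalignment has to be fixed first.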

I have two coin images for comparison: the 1st image (CLEAN) and the 2nd image (the denoised NOISY image).

Clean Img:

[clean coin image]

Noisy Img:

[noisy coin image]

Because the coins sit at different positions in the two images, ssim(img1,img2) gives a misleading result. I tried cropping, but that did not work. Here is what I have tried so far:

Attempt 1:

function [valPSNR,valSSIM,badpict]=getSSIM(clean_img,img2)
% pad reference image since object is so close to edges
refpict = padarray(mat2gray(clean_img),[20 20],'replicate','both');
% crop test image down to extract the object alone
badpict = imcrop(mat2gray(img2),[2.5 61.5 357 363]);
% maximize normalized cross-correlation to find offset
szb = size(badpict);
c = normxcorr2(badpict,refpict);
[idxy, idxx] = find(c == max(c(:)), 1); % take the first peak if there are ties
osy = idxy-szb(1);
osx = idxx-szb(2);
% crop the reference pict to the ROI
refpict = refpict(osy:idxy-1,osx:idxx-1);
%imshow(imfuse(badpict,refpict,'checkerboard'));
%imagesc(badpict);
valSSIM=ssim(badpict,refpict);
valPSNR=getPSNR(badpict,refpict);
img2=badpict;
clean_img=refpict;
figure; imshowpair(clean_img,img2);
figure; montage({mat2gray(clean_img),mat2gray(img2)}, 'Size', [1 2], 'BackgroundColor', 'w', 'BorderSize', [2 2]);
end

Attempt 2:

function [valPSNR,valSSIM,badpict]=getSSIM2(clean_img,img2)
% binarize both images and isolate the coin in each
bw1 = im2bw(mat2gray(clean_img));
bw2 = imclose(im2bw(mat2gray(img2),0.3),strel('disk',9));
bw2 = bwareafilt(bw2,1);
% make same size
[r,c] = find(bw1);
clean_img = clean_img(min(r):max(r),min(c):max(c));
[r,c] = find(bw2);
img2 = img2(min(r):max(r),min(c):max(c));
img2= imresize(img2, size(clean_img),'bilinear');
valPSNR=getPSNR(mat2gray(clean_img),mat2gray(img2));
valSSIM=ssim(mat2gray(clean_img),mat2gray(img2));
badpict=img2;
figure; imshowpair(clean_img,img2);
figure; montage({mat2gray(clean_img),mat2gray(img2)}, 'Size', [1 2], 'BackgroundColor', 'w', 'BorderSize', [2 2]);
end
  • If you have an algorithm, made by you, that cleans the image and then puts it in a different location, then you know the mechanism that misaligns the two images. Just make sure it doesn't, or correct it back using exactly the same method. Don't try to register the images, as that will always have some error. – Ander Biguri Feb 24 '22 at 11:33
  • My algorithm doesn't put it in a different location; it is already in a different location. – Rohit gupta Feb 24 '22 at 14:36
  • So you are comparing your denoised image to a different image that is clean? If that is the case you need to do some image registration – Ander Biguri Feb 24 '22 at 16:56
  • Exactly, both are different images of the same coin. I have been trying for 3 days; my research on generating the algorithm is complete, and to finally evaluate it I need to validate it using SSIM. – Rohit gupta Feb 25 '22 at 07:08
  • https://uk.mathworks.com/discovery/image-registration.html – Ander Biguri Feb 25 '22 at 10:17
  • But getting issue in this, ERRORIMG: https://www.yogile.com/ecurfj6mxby/024455462l/share/?vsc=233c7df09 – Rohit gupta Feb 26 '22 at 12:40
  • NOISY IMG: https://www.yogile.com/ecurfj6mxby/024455460l/share/?vsc=233c7df09 – Rohit gupta Feb 26 '22 at 12:40
  • CLEAN IMG: https://www.yogile.com/ecurfj6mxby/024455458l/share/?vsc=233c7df09 – Rohit gupta Feb 26 '22 at 12:40
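Following up on the registration suggestion in the comments: the translation-only case can be handled by phase correlation, which is the same idea behind MATLAB's imregcorr. A minimal sketch with scikit-image's phase_cross_correlation (the shift values below are synthetic, not taken from the coin images):

```python
import numpy as np
from skimage.registration import phase_cross_correlation

# Synthetic reference; the "moving" image is the reference rolled by a known offset.
rng = np.random.default_rng(0)
reference = rng.random((64, 64))
moving = np.roll(reference, shift=(5, 3), axis=(0, 1))

# Shift (in pixels) required to register `moving` onto `reference`.
estimated_shift, error, _ = phase_cross_correlation(reference, moving)

# Apply the estimated shift (integer case shown; np.roll keeps it exact here).
registered = np.roll(moving, tuple(int(round(s)) for s in estimated_shift), axis=(0, 1))
print(estimated_shift)
```

After this, the two arrays are the same size and aligned, so SSIM/PSNR can be computed directly. Note that for real images the resampling step introduces some error, as mentioned in the comments.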

2 Answers


As others have pointed out, the resampling required by registration will introduce some non-zero error. But here is some sample code that takes you through the registration step, which is the crux of your question.

% SSIM isn't defined on RGB images, convert to grayscale.
ref = rgb2gray(imread('https://i.stack.imgur.com/tPKEJ.png'));
X = rgb2gray(imread('https://i.stack.imgur.com/KmU4y.png'));

% The input image data has bright borders at the edges that create
% artifacts in resampling; best to just crop those, or maybe there are
% acquisitions that don't have these borders?
X = X(3:end-2,3:end-2);
ref = ref(4:end-3,4:end-3);

figure
montage({X,ref});

tform = imregcorr(X,ref,"translation");

Xreg = imwarp(X,tform,OutputView=imref2d(size(ref)),SmoothEdges=true);

figure
imshowpair(Xreg,ref)

ssim(Xreg,ref)
  • My ultimate task is to get a higher SSIM value after superimposing; using the above code I am still only getting SSIM values of around 50%. – Rohit gupta Mar 03 '22 at 16:48

Maybe you can refer to my GitHub.

I implemented a template-matching algorithm with OpenCV; you can use its NCC-based pattern matching to find targets and then get a score (similarity).

You can then use this score to decide whether the image is clean.

Besides, translating the C++ code may be an issue for you, but every function used has a corresponding one in MATLAB.
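Before porting the full rotation-search implementation, the core NCC matching step can be prototyped on its own. A minimal sketch using scikit-image's match_template (OpenCV's cv2.matchTemplate with TM_CCORR_NORMED is the direct analogue of the MatchTemplate call in the code below; the scene and template here are synthetic):

```python
import numpy as np
from skimage.feature import match_template

# Synthetic scene with a known target patch (stand-in for the golden sample).
rng = np.random.default_rng(0)
scene = rng.random((120, 120))
template = scene[40:70, 55:85].copy()  # 30x30 patch at row 40, col 55

# Normalized cross-correlation response map; its peak is the best match.
response = match_template(scene, template)
peak_row, peak_col = np.unravel_index(np.argmax(response), response.shape)
score = response[peak_row, peak_col]
print(peak_row, peak_col, score)
```

The peak score plays the role of the similarity threshold (0.85 in the figure below): locations whose response exceeds it are accepted as matches.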

Here are the effects (red blocks are areas with similarity higher than the 0.85 threshold in comparison with the golden sample): [result image]

The whole function is too long to be posted here. Part of the function:

for (int i = 0; i < iSize; i++)
{
    Mat matRotatedSrc, matR = getRotationMatrix2D (ptCenter, vecAngles[i], 1);
    Mat matResult;
    Point ptMaxLoc;
    double dValue, dMaxVal;
    double dRotate = clock ();
    Size sizeBest = GetBestRotationSize (vecMatSrcPyr[iTopLayer].size (), pTemplData->vecPyramid[iTopLayer].size (), vecAngles[i]);
    float fTranslationX = (sizeBest.width - 1) / 2.0f - ptCenter.x;
    float fTranslationY = (sizeBest.height - 1) / 2.0f - ptCenter.y;
    matR.at<double> (0, 2) += fTranslationX;
    matR.at<double> (1, 2) += fTranslationY;
    warpAffine (vecMatSrcPyr[iTopLayer], matRotatedSrc, matR, sizeBest);


    MatchTemplate (matRotatedSrc, pTemplData, matResult, iTopLayer);

    minMaxLoc (matResult, 0, &dMaxVal, 0, &ptMaxLoc);

    vecMatchParameter[i * (m_iMaxPos + MATCH_CANDIDATE_NUM)] = s_MatchParameter (Point2f (ptMaxLoc.x - fTranslationX, ptMaxLoc.y - fTranslationY), dMaxVal, vecAngles[i]);

    for (int j = 0; j < m_iMaxPos + MATCH_CANDIDATE_NUM - 1; j++)
    {
        ptMaxLoc = GetNextMaxLoc (matResult, ptMaxLoc, -1, pTemplData->vecPyramid[iTopLayer].cols, pTemplData->vecPyramid[iTopLayer].rows, dValue, m_dMaxOverlap);
        vecMatchParameter[i * (m_iMaxPos + MATCH_CANDIDATE_NUM) + j + 1] = s_MatchParameter (Point2f (ptMaxLoc.x - fTranslationX, ptMaxLoc.y - fTranslationY), dValue, vecAngles[i]);
    }
}
FilterWithScore (&vecMatchParameter, m_dScore-0.05*iTopLayer);