
I am working on a project on image classification using mutual information. It requires the probability distribution of a color image: I want to calculate either the mutual information or the Kullback-Leibler divergence in MATLAB. Can anyone help me with this? I have calculated the entropy of a color image as follows:

I = imread('s1.png');

% Collapse the three RGB channels into a single matrix of color indices
% (this removes the third dimension from the pixel intensity array).
% Note: each channel must be cast to double *before* multiplying,
% otherwise the uint8 values saturate at 255.
Color_ind = double(I(:,:,1))*256^2 + double(I(:,:,2))*256 + double(I(:,:,3));
disp(size(Color_ind));

% Find the unique colors in the matrix and count them
unique_ind=unique(Color_ind);
unique_len=length(unique_ind);

%Pre-allocate space for the vector that will hold the number of entries
%for each unique color
color_count_prob=zeros(unique_len,1);

%Count the occurrences of each unique color index in the original
%matrix and divide by the total pixel count to get a probability.
for i = 1:unique_len
  color_count_prob(i) = length(find(unique_ind(i)==Color_ind)) / numel(Color_ind);
end
en_sum=0;
for i = 1:unique_len
  en_sum = en_sum + log2(color_count_prob(i));
end
en = -en_sum;
  • Post some code and show us what you've tried. Welcome to Stack Overflow. – Jon Mar 30 '19 at 17:08
  • I have calculated the entropy of an image, but I am confused about how to calculate mutual information and KL divergence, as they require the PDF of the image. How do I define the probability distribution of a color image? – Karn Srivastava Mar 30 '19 at 17:24
  • Entropy is -Sum(p_i * log2(p_i)), but you seem to be calculating it as Sum(log2(p_i)). – koshy george Apr 02 '19 at 01:01
  • How can I use SciPy to get the generator of a probability distribution with minimum KL divergence in Python? – yishairasowsky May 18 '21 at 16:24

1 Answer


For the PDF calculation of a color image:

First, you need to convert the image to grayscale. If you insist on staying in RGB (or any other color mode), you will have to generate three PDFs, one per channel; I would not suggest doing that for the purposes of Kullback-Leibler divergence or mutual information, since the grayscale image will do.
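For example, a minimal sketch of this step in MATLAB (rgb2gray requires the Image Processing Toolbox, and 's2.png' is a placeholder name for the second image you want to compare):

I1 = rgb2gray(imread('s1.png'));   % first image, reduced to grayscale
I2 = rgb2gray(imread('s2.png'));   % second image (placeholder file name)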

Second, you need to calculate the distribution of each image. For this purpose, flatten each image (convert the 2D array to a 1D vector), then build a histogram of its values and normalize it so the bins sum to 1. The normalization is what turns raw counts into a PDF, so I would not skip it.
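A sketch of this step, assuming 8-bit images binned into the 256 gray levels (histcounts ships with MATLAB R2014b and later):

p1 = histcounts(I1(:), 0:256);   % flatten and count gray levels 0..255
p1 = p1 / sum(p1);               % normalize so the bins sum to 1
p2 = histcounts(I2(:), 0:256);
p2 = p2 / sum(p2);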

And to measure the Kullback-Leibler divergence, you need to:

  1. Measure the entropy of the first image's histogram, H(P) = -sum(p .* log2(p)). This will be a single number.
  2. Measure the cross-entropy of the first histogram against the second, H(P,Q) = -sum(p .* log2(q)).
  3. Subtract: H(P,Q) - H(P) is the Kullback-Leibler divergence D(P||Q) for those two images. Note that it is not symmetric, so D(P||Q) and D(Q||P) generally differ. A sketch follows this list.
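A minimal sketch under the assumptions above; the small epsilon guards against taking the log of an empty bin in the second histogram:

epsv = 1e-12;
nz = p1 > 0;                                          % bins with p1 == 0 contribute nothing
kl = sum(p1(nz) .* log2(p1(nz) ./ (p2(nz) + epsv)));  % D(p1 || p2) in bits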
Amir Charkhi