
I am trying to use the package mpmi to calculate the mutual information between two sets of continuous variables, and I am confused by the source code on GitHub: https://github.com/cran/mpmi/blob/master/src/cminjk.f95

do i = 1, lv
    ans = ans + log(s12(i) / (s1(i) * s2(i)))
end do

ans = ans / lv + log(dble(lv))

s12 looks like p(x,y), and s1 and s2 look like p(x) and p(y). Why is log(s12(i) / (s1(i) * s2(i))) not multiplied by p(x,y), given that the formula for calculating MI is p(x,y) * log(p(x,y) / (p(x) * p(y)))?

And why is there ans = ans / lv + log(dble(lv)) after the summation finishes?
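
For reference, here is a small R sketch of the plug-in estimator I had in mind, where the expectation under p(x,y) is replaced by an average over the observed pairs; the function name and the use of unnormalised Gaussian product-kernel sums are my own guesses for illustration, not taken from the mpmi source:

# Plug-in kernel estimate of MI(X, Y) from n observed pairs (sketch).
# MI = E_{p(x,y)}[ log( p(x,y) / (p(x) p(y)) ) ]; the expectation is
# estimated by averaging over the sample, so no explicit p(x,y) factor
# appears inside the sum.
mi_kde_sketch <- function(x, y, hx = bw.nrd(x), hy = bw.nrd(y)) {
    n <- length(x)
    s1  <- numeric(n)   # kernel sum for the X marginal at x[i]
    s2  <- numeric(n)   # kernel sum for the Y marginal at y[i]
    s12 <- numeric(n)   # product-kernel sum for the joint at (x[i], y[i])
    for (i in seq_len(n)) {
        kx <- dnorm((x[i] - x) / hx) / hx
        ky <- dnorm((y[i] - y) / hy) / hy
        s1[i]  <- sum(kx)        # = n * phat(x[i])
        s2[i]  <- sum(ky)        # = n * phat(y[i])
        s12[i] <- sum(kx * ky)   # = n * phat(x[i], y[i])
    }
    # phat12 / (phat1 * phat2) = (s12 / n) / ((s1 / n) * (s2 / n))
    #                          = n * s12 / (s1 * s2),
    # so mean(log(s12 / (s1 * s2))) + log(n) is the average of the
    # normalised log ratios -- the same shape as ans / lv + log(dble(lv)).
    mean(log(s12 / (s1 * s2))) + log(n)
}

set.seed(1)
x <- rnorm(200)
y <- x + rnorm(200)
mi_kde_sketch(x, y)

If mpmi's s1, s2 and s12 are likewise kernel sums that are not divided by the sample size, that would account for both the missing p(x,y) factor (it is supplied by averaging over the sample) and the trailing log(dble(lv)) term, but I am not sure that is what the Fortran code actually does.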


2 Answers


See the reference "Fast calculation of pairwise mutual information for gene regulatory network reconstruction".

S.C. Jiang

Be careful with this reference: "Fast calculation of pairwise mutual information for gene regulatory network reconstruction"

This strategy assumes that each of the two random variables follows a normal distribution. Unless you know with certainty that both variables are normally distributed, I suggest using a package that is free of any distributional assumption. I recommend the R package "minerva"; you just need to specify MIC(X,Y)$MIC.
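
A minimal usage sketch (assuming minerva's mine() interface, which returns the MIC among its statistics; please check the package documentation):

library(minerva)

set.seed(1)
x <- rnorm(200)
y <- x^2 + rnorm(200, sd = 0.5)   # nonlinear, clearly non-normal relationship

res <- mine(x, y)   # compute the MINE family of statistics for the pair
res$MIC             # maximal information coefficient, in [0, 1]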

Nissa