
After obtaining probability distributions for several documents in MALLET, I used the following code to compute the KL divergence between the first and the second document:

        // cc.mallet.util.Maths
        double kl = Maths.klDivergence(d1, d2);

How should I interpret the resulting value? For example, I get 12.3640... What does this mean? Are the two distributions near each other or far apart?

user3318618
1 Answer


As the name suggests, KL divergence measures how much one probability distribution diverges from another. Intuitively, D(P || Q) is the information lost when Q is used to approximate P. It is always non-negative, equals 0 only when the two distributions are identical, and grows as they become more different. So the smaller the value, the more similar the distributions; a value as large as 12.36 means your two documents' distributions are quite far apart. Also keep in mind that KL divergence is not symmetric: klDivergence(d1, d2) is generally not equal to klDivergence(d2, d1).
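For intuition, here is a minimal, self-contained sketch of the underlying formula, D(P || Q) = Σ p[i] · log2(p[i] / q[i]), written in plain Java so it does not depend on MALLET (MALLET's own Maths.klDivergence may differ in details such as the logarithm base). The distributions p, similar, and dissimilar are made-up examples to contrast a "near" pair with a "far" pair:

    public class KlDemo {
        // KL divergence D(P || Q) = sum_i p[i] * log2(p[i] / q[i]).
        // Assumes p and q are proper distributions (non-negative, summing to 1)
        // and that q[i] > 0 wherever p[i] > 0.
        static double klDivergence(double[] p, double[] q) {
            double kl = 0.0;
            for (int i = 0; i < p.length; i++) {
                if (p[i] == 0.0) continue; // 0 * log(0/q) is treated as 0
                kl += p[i] * Math.log(p[i] / q[i]) / Math.log(2);
            }
            return kl;
        }

        public static void main(String[] args) {
            double[] p          = {0.50, 0.30, 0.20};
            double[] similar    = {0.45, 0.35, 0.20};
            double[] dissimilar = {0.05, 0.05, 0.90};

            System.out.println(klDivergence(p, p));          // 0.0: identical
            System.out.println(klDivergence(p, similar));    // small (~0.01): near
            System.out.println(klDivergence(p, dissimilar)); // large (~2.0): far
        }
    }

Running this prints 0.0 for identical distributions, a small value for the near pair, and a much larger value for the far pair, which is exactly the scale on which to read your 12.36.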

Praveen