
For distributions over N values, how can I efficiently upper-bound the largest divergence between any two strictly positive distributions on the same support? For example, take all distributions of a random variable a that takes values in {1, 2, 3, 4}, i.e., N = 4, where the probability of each of a = 1, a = 2, a = 3 and a = 4 is always nonzero (but can be very small, e.g., 1e-1000).

Is there a known bound (other than infinity)? For instance, given N, is the divergence between the uniform distribution [1/4, 1/4, 1/4, 1/4] and the near-delta distribution [1e-10, 1e-10, 1e-10, 1/(1+3e-10)] the largest possible?
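
For concreteness, here is a small sketch (Python with NumPy, purely as an illustration) of how I compute the divergence for the two example distributions above; the last entry of the near-delta vector is written as 1 - 3e-10 so that the vector sums exactly to 1:

    import numpy as np

    def kl_divergence(p, q):
        # Kullback-Leibler divergence D(p || q) in nats, for strictly positive p and q
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        return float(np.sum(p * np.log(p / q)))

    uniform = np.array([0.25, 0.25, 0.25, 0.25])
    near_delta = np.array([1e-10, 1e-10, 1e-10, 1.0 - 3e-10])

    print(kl_divergence(uniform, near_delta))   # about 15.9 nats: large but finite
    print(kl_divergence(near_delta, uniform))   # about 1.39 nats (roughly log 4)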

Thanks all in advance, A.

user3861925
  • There are a couple of different definitions of "divergence" (e.g. Kullback-Leibler Divergence, Jensen-Shannon Divergence, both of which are special cases of f-divergence (I think)). Could you be more specific? – Beta Nov 17 '14 at 16:32
  • Sorry - I was referring to the Kullback-Leibler divergence, but the tag was omitted when I submitted the question. – user3861925 Nov 17 '14 at 16:52
  • I think you're right, the divergence between uniform and delta will be the largest. Do you need a rigorous proof, a hand-waving argument, or what? – Beta Nov 17 '14 at 16:58
  • It seems I was wrong. I tested it empirically and it didn't work out, unfortunately. It looked like a clean win :-( – user3861925 Nov 17 '14 at 17:15
  • What pair of distributions gives a larger divergence? Wait, is it two different deltas? – Beta Nov 17 '14 at 17:27
  • Unfortunately not that either. It was a pair of distributions with higher probabilities at different values - neither deltas nor semi-uniform... – user3861925 Nov 17 '14 at 17:43
  • Might want to try stats.stackexchange.com or math.stackexchange.com, since this isn't a programming problem. – Robert Dodier Nov 17 '14 at 20:02
  • Thank you. I will let you know if I come up with something. – user3861925 Nov 18 '14 at 08:59

0 Answers