
Let's say you have measured a bit b ∈ {0, 1} and you know that with probability p ∈ [0, 1] your measurement is wrong, i.e. that the measurement is correct with probability 1 − p. How much information, on average, is contained in one such bit?

Is it simply (1 bit) · (1 − p) = (1 − p) bits? (This guess turned out to be wrong.)

I think you can find the result by modeling this as a binary symmetric channel: the correct bit value is sent, and a possibly flipped bit value is received, with crossover probability p. How do you calculate the amount of information transferred on average?

Daniel S.
  • Assume the prior distribution of 0's and 1's to be uniform, i.e. there are as many 0's as there are 1's. – Daniel S. Aug 11 '23 at 15:07
  • Beware of ChatGPT (3.5) for this. For this question, ChatGPT hallucinates the binary entropy function H(p) as the answer, while the correct answer is actually 1 − H(p). – Daniel S. Aug 14 '23 at 14:19

1 Answer


You can indeed directly use the channel capacity of the BSC (binary symmetric channel) to calculate the average amount of information in one bit that is incorrect with probability p.

When you transfer one bit over a BSC, the receiver sees a bit that is flipped with probability p and intact with probability 1 − p, which is exactly the situation of your measured bit. The channel capacity therefore gives you what you are looking for, the average amount of information transferred per bit:

C = 1 - H(p),

where H is the binary entropy function

H(p) = -p·log₂(p) - (1 - p)·log₂(1 - p)

so

C(p) = 1 + p·log₂(p) + (1 - p)·log₂(1 - p).
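
If you want to check concrete values, here is a minimal Python sketch of both formulas (the function names binary_entropy and capacity are my own, not part of any standard library):

    import math

    def binary_entropy(p: float) -> float:
        """Binary entropy H(p) in bits; H(0) = H(1) = 0 by convention."""
        if p == 0.0 or p == 1.0:
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def capacity(p: float) -> float:
        """BSC channel capacity C(p) = 1 - H(p), in bits per transmitted bit."""
        return 1.0 - binary_entropy(p)

    for p in (0.0, 0.1, 0.25, 0.5, 1.0):
        print(f"p = {p:4.2f}  ->  C(p) = {capacity(p):.4f} bits")

Note that C(0) = C(1) = 1 bit: a measurement that is always wrong is just as informative as a perfect one, because you can simply flip it. The minimum is C(0.5) = 0, where the measurement is independent of the true bit.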

The following diagram shows C(p) plotted over p:

[Plot: the BSC capacity C(p) = 1 − H(p) as a function of p, equal to 1 at p = 0 and p = 1 and dropping to 0 at p = 0.5.]
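
The curve is easy to reproduce yourself; a small sketch using numpy and matplotlib (my own code, not part of the original answer):

    import numpy as np
    import matplotlib.pyplot as plt

    # Avoid the exact endpoints so log2(0) never occurs.
    p = np.linspace(1e-6, 1 - 1e-6, 500)
    C = 1 + p * np.log2(p) + (1 - p) * np.log2(1 - p)

    plt.plot(p, C)
    plt.xlabel("error probability p")
    plt.ylabel("C(p) in bits")
    plt.title("BSC capacity C(p) = 1 - H(p)")
    plt.show()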

Daniel S.