
It is not clear to me how to calculate the Shannon limit given the following information:

Bits per second of a transmission channel.

Bit error rate.

When I try it, the result doesn't seem right. Example:

Information given:

Transmission channel: 8 bits per second.

Error probability: 10 percent.

Trying to plug this into Shannon's formula gives:

I < B * log2 (1 + (S / N) )

B = 8 ?
S = 8 ?
N = 0.8 ?

Result:

I < 8 * log2( 1 + (8 / 0.8) )
I < 8 * 3.4594316186372972561993630467258
I < 27.675452949098378049594904373806
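For reference, a quick Python check of the arithmetic, with the variable assignments reflecting my guesses above:

    import math

    B = 8    # my guess: "bandwidth" = the 8 bits per second?
    S = 8    # my guess: "signal" = also the 8 bits per second?
    N = 0.8  # my guess: "noise" = 10% of 8?

    I = B * math.log2(1 + S / N)
    print(I)  # 27.675452949098378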

What am I doing wrong?


2 Answers


Here is a belated answer. The BER you quote has to do with coding and the throughput of the information bits. Shannon's equation does not take any coding into account; it gives you not the net throughput but the gross throughput, which includes all the code bits.

So forget about the BER when computing the Shannon limit; it is not relevant. You also state the bit rate, but the Shannon limit gives you the bit rate (the maximum possible), so this parameter is not relevant either. Finally, "noise" is not the same thing as BER: noise is the cause of BER, and the same noise can produce different BERs depending on the code used.

It seems that what your formulation is missing is the key piece of information: the true SNR.

If it just happened to be 10 (the 8/0.8 you used), then your answer would be correct. But from the items you give, it seems you are missing what is actually needed.
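A minimal sketch of the computation, assuming the true linear SNR really were 10 and taking the bandwidth to be 8 Hz (both values hypothetical, chosen only to match the numbers above):

    import math

    def shannon_capacity(bandwidth_hz, snr_linear):
        # Shannon-Hartley: maximum gross bit rate in bits per second.
        return bandwidth_hz * math.log2(1 + snr_linear)

    # Hypothetical inputs: B = 8 Hz, true SNR = 10 (linear).
    print(shannon_capacity(8, 10))  # ~27.7 bits/s

Note that the inputs are a bandwidth in hertz and a dimensionless power ratio, not a bit rate and a BER.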


I see the original link is broken. (I can't remember clearly why I wanted this answered, but I think the intent is pretty obvious: given a digital link and a certain bit error probability or error rate, what is the maximum amount of bits that can be transmitted?)

New link:

https://en.wikipedia.org/w/index.php?title=Eb/N0&oldid=848677212

Perhaps this is a weird question, because these formulas seem to be literally about "electric power", or at least about some kind of signal power quantities.

Perhaps with a probability of 10% as described, out of 100 bits, 90 will arrive and 10 will fail.

I think I now know at least how to compute B; it's a ratio, given by this link:

https://en.wikipedia.org/wiki/Bit_error_rate

Bad bits / Total bits = 10 / 100 = 0.1 = B

Just guessing, though. Shannon mentions "signal power" and such; later on he does talk about "symbols per second" as well...

It seems to be a combination of the "digital" side ("symbols per second") and the analog/electrical/theoretical signal strength side ("power per symbol", S/N, and so forth).

I am not sure whether these formulas can be used when the "powers" are not known and only the statistics described in my original question above are available.
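To make that concrete: turning a bit error rate into a power ratio requires assuming a modulation and channel model, which the original question never states. A sketch assuming uncoded BPSK over an AWGN channel (purely an illustrative assumption), where BER = 0.5 * erfc(sqrt(Eb/N0)):

    import math

    def bpsk_ber(ebn0_linear):
        # Uncoded BPSK over AWGN: BER = 0.5 * erfc(sqrt(Eb/N0)).
        return 0.5 * math.erfc(math.sqrt(ebn0_linear))

    # Invert numerically by bisection: which Eb/N0 gives BER = 10%?
    lo, hi = 1e-9, 100.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if bpsk_ber(mid) > 0.1:
            lo = mid  # error rate still too high: need more Eb/N0
        else:
            hi = mid
    print(10 * math.log10(lo))  # about -0.86 dB

So a 10% BER pins down a power ratio only after a modulation is fixed; a different scheme or a code would map the same BER to a different Eb/N0.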

(In response to the answer above by charan langton.)

From the link, in case it disappears again:

The Shannon–Hartley theorem says that the limit of reliable information rate (data rate exclusive of error-correcting codes) of a channel depends on bandwidth and signal-to-noise ratio according to:

$$I < B \log_2\left(1 + \frac{S}{N}\right)$$

where

I is the information rate in bits per second excluding error-correcting codes;
B is the bandwidth of the channel in hertz;
S is the total signal power (equivalent to the carrier power C); and
N is the total noise power in the bandwidth.

This equation can be used to establish a bound on Eb/N0 for any system that achieves reliable communication, by considering a gross bit rate R equal to the net bit rate I and therefore an average energy per bit of Eb = S/R, with noise spectral density of N0 = N/B. For this calculation, it is conventional to define a normalized rate Rl = R/2B, a bandwidth utilization parameter of bits per second per half hertz, or bits per dimension (a signal of bandwidth B can be encoded with 2B dimensions, according to the Nyquist–Shannon sampling theorem). Making appropriate substitutions, the Shannon limit is:

$$\frac{R}{B} = 2R_l < \log_2\left(1 + 2R_l \frac{E_\text{b}}{N_0}\right)$$
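(To spell out the substitution in the quoted step, as an added note: with $E_\text{b} = S/R$, $N_0 = N/B$, and $R_l = R/2B$, the signal-to-noise ratio becomes $S/N = (E_\text{b} R)/(N_0 B) = 2R_l \, E_\text{b}/N_0$; substituting this into $I < B \log_2(1 + S/N)$ with $R = I$ and dividing by $B$ gives the inequality above.)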

This can be solved to get the Shannon-limit bound on Eb/N0:

$$\frac{E_\text{b}}{N_0} > \frac{2^{2R_l} - 1}{2R_l}$$

When the data rate is small compared to the bandwidth, so that Rl is near zero, the bound, sometimes called the ultimate Shannon limit,[3] is:

$$\frac{E_\text{b}}{N_0} > \ln(2)$$

which corresponds to −1.59 dB.

Note that this often-quoted limit of −1.59 dB applies only to the theoretical case of infinite bandwidth. The Shannon limit for finite-bandwidth signals is always higher.
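As a numeric illustration of the quoted bound (a sketch using only the formulas above), the bound rises with the normalized rate Rl and approaches ln(2), about -1.59 dB, as Rl goes to zero:

    import math

    def ebn0_bound_db(rl):
        # Shannon-limit bound on Eb/N0 (in dB) for normalized rate Rl = R/(2B).
        bound = (2 ** (2 * rl) - 1) / (2 * rl)
        return 10 * math.log10(bound)

    for rl in (1e-6, 0.5, 1.0, 2.0):
        print(rl, round(ebn0_bound_db(rl), 2))  # -1.59, 0.0, 1.76, 5.74

    # The Rl -> 0 limit is ln(2):
    print(10 * math.log10(math.log(2)))  # -1.59...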
