I see the original link is broken. (I can't remember exactly why I wanted this answered, but I think the question is fairly obvious: given a digital link with a certain bit error probability or error rate, what is the maximum number of bits that can be transmitted?)
New link:
https://en.wikipedia.org/w/index.php?title=Eb/N0&oldid=848677212
Perhaps this is a weird question, because these formulas seem to be literally about "electric power", or at least about some kind of signal power quantities.
Perhaps, with an error probability of 10% as described, 90 out of 100 bits will arrive intact and 10 bits will fail.
I think I now know at least how to compute the error statistic: it's the bit error rate (BER), a ratio given by this link (note that this is not the same quantity as the bandwidth B in the formulas quoted below):
https://en.wikipedia.org/wiki/Bit_error_rate
BER = bad bits / total bits = 10 / 100 = 0.1
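As a trivial Python sketch of that ratio (the counts are just the made-up numbers from my example above):

    bits_total = 100  # bits transmitted (made-up example count)
    bits_bad = 10     # bits received in error (made up)

    ber = bits_bad / bits_total  # bit error rate = 0.1, i.e. 10%
    print(ber)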
Just guessing, though. Shannon mentions "signal power" and such, and later it goes on about "symbols per second" and so on...
It seems to be a combination of the digital side ("symbols per second") and the analog/electrical/theoretical signal strength ("power per symbol", S/N, and so forth).
I'm not sure whether these formulas can be used when the powers are not known and only the statistics described in my original question above are available.
(In response to "answer" below by charan langton)
From the link, in case it disappears again:
The Shannon–Hartley theorem says that the limit of reliable information rate (data rate exclusive of error-correcting codes) of a channel depends on bandwidth and signal-to-noise ratio according to:
$$I < B \log_2\left(1 + \frac{S}{N}\right)$$
where
I is the information rate in bits per second excluding error-correcting codes;
B is the bandwidth of the channel in hertz;
S is the total signal power (equivalent to the carrier power C); and
N is the total noise power in the bandwidth.
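To make this concrete for myself, here is a minimal Python sketch evaluating the bound; the bandwidth and power values are made up for illustration:

    import math

    B = 3000.0  # channel bandwidth in hertz (made-up example value)
    S = 1e-6    # total signal power in watts (made up)
    N = 1e-8    # total noise power in the bandwidth, in watts (made up)

    # Shannon–Hartley: I < B * log2(1 + S/N)
    capacity = B * math.log2(1 + S / N)
    print(f"reliable information rate must stay below {capacity:.0f} bit/s")

With S/N = 100 this gives roughly 20 kbit/s for a 3 kHz channel.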
This equation can be used to establish a bound on Eb/N0 for any system that achieves reliable communication, by considering a gross bit rate R equal to the net bit rate I and therefore an average energy per bit of Eb = S/R, with noise spectral density of N0 = N/B. For this calculation, it is conventional to define a normalized rate Rl = R/2B, a bandwidth utilization parameter of bits per second per half hertz, or bits per dimension (a signal of bandwidth B can be encoded with 2B dimensions, according to the Nyquist–Shannon sampling theorem). Making appropriate substitutions, the Shannon limit is:
$$\frac{R}{B} = 2R_l < \log_2\left(1 + 2R_l\frac{E_\text{b}}{N_0}\right)$$
which can be solved to get the Shannon-limit bound on Eb/N0:
$$\frac{E_\text{b}}{N_0} > \frac{2^{2R_l} - 1}{2R_l}$$
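To make those substitutions concrete, a small Python sketch (the S, N, B, and R values are made up) that derives Eb/N0 and Rl and checks them against this bound:

    import math

    B = 3000.0   # bandwidth in hertz (made-up example value)
    S = 1e-6     # total signal power in watts (made up)
    N = 1e-8     # total noise power in the bandwidth (made up)
    R = 12000.0  # gross bit rate in bit/s, taken equal to the net rate I

    Eb = S / R        # average energy per bit, in joules
    N0 = N / B        # noise spectral density, in watts per hertz
    Rl = R / (2 * B)  # normalized rate, bits per second per half hertz

    # Shannon-limit bound: Eb/N0 > (2^(2*Rl) - 1) / (2*Rl)
    bound = (2 ** (2 * Rl) - 1) / (2 * Rl)
    achievable = Eb / N0 > bound
    print(f"Eb/N0 = {Eb / N0:.1f}, bound = {bound:.2f}, achievable: {achievable}")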
When the data rate is small compared to the bandwidth, so that Rl is near zero, the bound, sometimes called the ultimate Shannon limit,[3] is:
$$\frac{E_\text{b}}{N_0} > \ln(2)$$
which corresponds to −1.59 dB.
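A quick numerical check of that figure, and of how the finite-bandwidth bound above approaches it as Rl goes to zero:

    import math

    def ebn0_limit(Rl):
        # Shannon-limit lower bound on Eb/N0 for normalized rate Rl
        return (2 ** (2 * Rl) - 1) / (2 * Rl)

    for Rl in (2.0, 1.0, 0.1, 0.01):
        print(f"Rl = {Rl}: Eb/N0 > {ebn0_limit(Rl):.4f}")

    ultimate = math.log(2)  # the Rl -> 0 limit, ln(2)
    print(f"ln(2) = {ultimate:.4f}, i.e. {10 * math.log10(ultimate):.2f} dB")

Every nonzero Rl gives a bound above ln(2), consistent with the note that follows.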
Note that this often-quoted limit of −1.59 dB applies only to the theoretical case of infinite bandwidth. The Shannon limit for finite-bandwidth signals is always higher.