I have a password with a length of 10, drawn from an alphabet of 78 unique characters. I know that the first two characters of the password must be digits (0-9). My calculation is:
E = log2(10^2) + log2(78^8) ≈ 56.93 bits
Is that right?
Yes, this is the correct calculation for the information-theoretic entropy: each of the two digit positions contributes log2(10) bits and each of the remaining eight positions contributes log2(78) bits, assuming every character is chosen uniformly and independently.
Remember though, entropy is a measure of the uncertainty generated by the source, not a property of the generated bits themselves.
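For reference, here is a minimal sketch of the calculation in Python (assuming, as above, that each position is chosen uniformly and independently from its respective alphabet):

```python
import math

# First two characters: digits 0-9 -> 2 * log2(10) bits
digit_bits = 2 * math.log2(10)

# Remaining eight characters: 78-character alphabet -> 8 * log2(78) bits
other_bits = 8 * math.log2(78)

entropy_bits = digit_bits + other_bits
print(round(entropy_bits, 2))  # ≈ 56.93
```

Note that this is equivalent to log2(10^2 * 78^8), the base-2 logarithm of the total number of possible passwords.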