Is it possible to calculate the minimum number of bits needed to represent a message in arithmetic coding? Say I have a 100-character string. How many bits do I need to encode and decode that sequence?
- It depends on the frequency table for those 100 characters – Nayuki Dec 07 '16 at 15:17
- Thanks for the comment. Is there a formula or some other way to compute the number of bits? – peter Schiza Dec 08 '16 at 12:10
- Yes. Use the standard formula for calculating information entropy: https://en.wikipedia.org/wiki/Entropy_(information_theory)#Definition – Nayuki Dec 08 '16 at 15:20
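For concreteness, here is a minimal Python sketch of the calculation Nayuki points to: compute the Shannon entropy H (in bits per symbol) from the character frequencies; then n·H is the information-theoretic lower bound for an n-symbol message, and an arithmetic coder can get within roughly 2 bits of that total (not counting the bits needed to transmit the frequency table itself). The example string below is made up purely for illustration.

```python
from collections import Counter
from math import log2, ceil

def entropy_bound_bits(s: str) -> float:
    """Lower bound in bits for encoding s with a static symbol model."""
    counts = Counter(s)
    n = len(s)
    # Shannon entropy per symbol: H = -sum(p * log2(p)) over symbol probabilities p
    h = -sum((c / n) * log2(c / n) for c in counts.values())
    return n * h  # total bits for the whole message

# Hypothetical 100-character string: 50 'a', 25 'b', 25 'c'
text = "a" * 50 + "b" * 25 + "c" * 25
bound = entropy_bound_bits(text)
print(f"entropy bound: {bound:.2f} bits")
print(f"arithmetic coding needs at most about {ceil(bound) + 2} bits")
```

For this string H = 0.5·log2(2) + 0.25·log2(4) + 0.25·log2(4) = 1.5 bits/symbol, so the 100 symbols need at least 150 bits; a fixed 8-bit encoding would have used 800.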