I’m very interested in the CRC topic. I understand that the probability of an undetected error with a CRC-8 is 1/256, i.e. about 0.39%. But I want to know how many bits could change undetected in a block of, say, 320 bytes. Does that probability apply per byte or per bit? In other words, is it 320 bytes / 256 = 1.25 bits that could change and not be detected by the CRC-8 (because 2^8 would cover every single-bit change within a byte), or is it (320 * 8) / 256 = 10 bits that could change and not be detected (because 2^8 would cover every single bit in the total number of bits)? It is a little confusing, because the error probability for a block of bytes should be the sum of the individual probabilities. With a CRC-8 I get the results shown above, but with a CRC-16 I get 320 / 65536 = 0.0048 bits, or (320 * 8) / 65536 = 0.039 bits, that could change and not be detected. Does anyone know of a website, book, or paper where I could find this kind of information? Thanks in advance.
I really don’t know which option is the right one… :(
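In case it helps to pin down which interpretation is right, here is a rough simulation I could run: it corrupts random bits of a 320-byte message and counts how often a CRC-8 still matches afterwards. I'm assuming a plain CRC-8 with polynomial 0x07, zero init and no reflection; that variant, and the crc8/undetected_fraction names, are just my own choices for illustration, not from any particular standard.

```python
import random

def crc8(data, poly=0x07, init=0x00):
    """Bitwise CRC-8 (poly 0x07, zero init, no reflection) -- an assumed variant."""
    crc = init
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 0x80:
                crc = ((crc << 1) ^ poly) & 0xFF
            else:
                crc = (crc << 1) & 0xFF
    return crc

def undetected_fraction(msg_len=320, trials=5_000):
    """Flip a random number of random bits in a msg_len-byte message and
    count how often the corrupted message still has the same CRC-8.
    Pure-Python loops, so keep trials modest."""
    undetected = 0
    for _ in range(trials):
        msg = bytearray(random.getrandbits(8) for _ in range(msg_len))
        original_crc = crc8(msg)
        # Corrupt the message: flip between 1 and msg_len*8 distinct bit positions.
        nbits = random.randint(1, msg_len * 8)
        for pos in random.sample(range(msg_len * 8), nbits):
            msg[pos // 8] ^= 1 << (pos % 8)
        if crc8(msg) == original_crc:
            undetected += 1
    return undetected / trials

if __name__ == "__main__":
    frac = undetected_fraction()
    print(f"undetected fraction ~= {frac:.5f}  (1/256 = {1/256:.5f})")
```

If the 1/256 figure applies per corrupted message rather than per bit or per byte, the printed fraction should come out near 0.0039 regardless of the message length, which is what I would like to confirm or rule out.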