
I’m very interested in the CRC topic. I understand that the probability of an undetected error with a CRC-8 is 1/256, i.e. 0.39%. But I want to know the number of bits that could change, without the error being detected, in a message of some size, for example 320 bytes. Is this probability applied per byte or per bit?

If I count bytes, 320 bytes / 256 = 1.25 bits could change and not be detected by the CRC-8, on the reasoning that 2^8 covers every possible change inside a byte. If I count bits, (320 × 8) / 256 = 10 bits could change and not be detected, on the reasoning that 2^8 covers every bit in the total message. It is a little confusing, because the error probability over a number of bytes should be the sum of every single probability.

With a CRC-8 I get the results shown above, but with a CRC-16: 320 / 65536 = 0.0048 bits, or (320 × 8) / 65536 = 0.039 bits, could change and not be detected.

Does anyone know a web page, book or paper where I could find this kind of information? Thanks in advance
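To make the arithmetic concrete, here is a small Python sketch of the two interpretations above (the variable names are mine, just for illustration; I don’t know which interpretation, if either, is correct):

```python
# The two interpretations of "bits that could change undetected" described above.
MESSAGE_BYTES = 320

for name, crc_bits in [("CRC-8", 8), ("CRC-16", 16)]:
    codes = 2 ** crc_bits                 # 256 for CRC-8, 65536 for CRC-16
    per_byte = MESSAGE_BYTES / codes      # interpretation 1: count bytes
    per_bit = MESSAGE_BYTES * 8 / codes   # interpretation 2: count bits
    print(f"{name}: {per_byte:.4f} (bytes/codes), {per_bit:.4f} (bits/codes)")

# Output:
# CRC-8: 1.2500 (bytes/codes), 10.0000 (bits/codes)
# CRC-16: 0.0049 (bytes/codes), 0.0391 (bits/codes)
```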

I really don’t know which is the right option… :(

Kikecea

1 Answer


If your message is longer than eight bits, then a 16-bit CRC is "better", with respect to detecting the greatest number of possible patterns of alterations of the message.
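For intuition, here is a rough sketch (my own, not a proof): flip a random handful of bits in a 320-byte message many times and count how often each CRC fails to notice. It uses a plain non-reflected, zero-init table-driven CRC with the polynomials 0x07 and 0x1021, chosen only for illustration; roughly a 2^-n fraction of random corruptions slips past an n-bit CRC, regardless of the message length.

```python
import random

def make_crc(poly: int, width: int):
    """Table-driven CRC, non-reflected, zero init and xorout -- a plain
    textbook variant used here only to compare detection rates."""
    top, mask = 1 << (width - 1), (1 << width) - 1
    table = []
    for byte in range(256):
        reg = (byte << (width - 8)) & mask
        for _ in range(8):
            reg = ((reg << 1) ^ poly) & mask if reg & top else (reg << 1) & mask
        table.append(reg)

    def crc(data) -> int:
        reg = 0
        for b in data:
            reg = ((reg << 8) & mask) ^ table[(reg >> (width - 8)) ^ b]
        return reg

    return crc

random.seed(1)                                   # reproducible demo
msg = bytes(random.randrange(256) for _ in range(320))
TRIALS = 20_000

for name, poly, width in [("CRC-8 (0x07)", 0x07, 8),
                          ("CRC-16 (0x1021)", 0x1021, 16)]:
    crc = make_crc(poly, width)
    good = crc(msg)
    undetected = 0
    for _ in range(TRIALS):
        bad = bytearray(msg)
        # flip between 3 and 40 randomly chosen bits of the 2560 in the message
        for bit in random.sample(range(len(msg) * 8), random.randint(3, 40)):
            bad[bit // 8] ^= 1 << (bit % 8)
        if crc(bad) == good:
            undetected += 1
    print(f"{name}: {undetected} of {TRIALS} corrupted messages passed"
          f" (roughly {TRIALS / 2 ** width:.1f} expected)")
```

So for random corruption, a CRC-8 lets through about 1 in 256 corrupted messages and a CRC-16 about 1 in 65536, whether the message is 320 bytes or longer; a longer message does not change that fraction, only the number of opportunities for corruption.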

Mark Adler
  • Hi Mark! Thanks for your help, but what I really want to know is how to calculate the number of bits that could change from 1 to 0 or vice versa, undetected, within a given number of bytes, for example 320 bytes. I know the probability is 0.39% with a CRC-8, but the number of possible errors grows with more bytes, because I believe it is the sum of the errors in every byte… thanks again for answering – Kikecea Jul 12 '23 at 22:40
  • Then edit your question. It is apparently not "What is better crc8 or crc16?" – Mark Adler Jul 12 '23 at 23:50
  • Also make the text of your question more clear. You have a run-on sentence of about a hundred words that I can't make any sense of. – Mark Adler Jul 12 '23 at 23:54