I have around 1500 bytes of data that I want to construct a checksum for, so that if the data gets corrupted the chance of the checksum still matching the corrupted data is less than, say, 1 in 10^15, i.e. a probability low enough that I can treat it as never going to happen.
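(My rough assumption, which may itself be worth checking, is that for a well-mixed n-bit checksum a corrupted message still passes the check with probability about 2^-n, so I'm really asking which n pushes 2^-n comfortably below 10^-15.)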
The question is how many bits I should compute. I have a SHA-1 computation that gives me a 160-bit hash of my data, but I expect this is far larger than necessary. So I'm thinking I could truncate the resulting hash down to, say, the low 40 bits, and use that as a bit pattern large enough that if the data gets corrupted, I will most likely detect it.
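To make the truncation part concrete, this is roughly what I have in mind (a minimal Python sketch using hashlib's SHA-1 as a stand-in for my hash computation; my actual code isn't Python and the function name is just for illustration):

```python
import hashlib

def truncated_sha1(data: bytes, bits: int = 40) -> int:
    """Return the low `bits` bits of the SHA-1 digest of `data`."""
    digest = hashlib.sha1(data).digest()      # 20-byte (160-bit) digest
    full = int.from_bytes(digest, "big")      # digest viewed as a 160-bit integer
    return full & ((1 << bits) - 1)           # keep only the low `bits` bits

# Example: checksum ~1500 bytes of payload
payload = b"\x00" * 1500
checksum = truncated_sha1(payload, 40)        # fits in 5 bytes
```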
So the question is twofold: how many bits are enough, and is taking the low bits of a SHA-1 hash a good approach?