I have a customer's broken application that I am trying to fix, but the original code has been lost, so I am trying to re-engineer it. The application downloads 140 blocks of data, 768 bytes per block, to a device. The data contains a 16-byte "checksum" that presumably covers all 140 blocks, or possibly some small subset (there is no way to know).
If I change as little as 1 bit in the data, all 16 bytes of the "checksum" change. What I'm looking for are ideas about how this "checksum" might be calculated.
Example:
In block 24, at offset 0x116, I change two bytes from 0xe001 to 0xe101, and the "checksum" changes from:
53 20 5a 20 3e f5 38 72 eb d7 f4 3c d9 0a 3f c5
to this:
7f fe ad 1f cc c3 1e 3c 22 0a bf 6a 6d 03 ad 97
I could experiment if I had some clue as to how this "checksum" might be calculated. I'm looking for any ideas to get me started.
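For what it's worth, the 16-byte length and the fact that a 1-bit change scrambles the whole value look more like a 128-bit digest (MD5, or a truncation of something longer) than a CRC, so my first experiment will probably be brute-forcing a few standard hashes over the concatenated blocks. Below is a rough Python sketch of that test; the file name "blocks.bin", the block ordering, and the candidate list are all my own assumptions, not anything known about the original application:

# Rough test harness, assuming the 140 downloaded blocks can be dumped
# into one binary file ("blocks.bin", 140 * 768 bytes, in download order).
import hashlib

BLOCK_SIZE = 768
NUM_BLOCKS = 140

# The 16-byte value observed before the 1-bit change (from the example above).
EXPECTED = bytes.fromhex("53 20 5a 20 3e f5 38 72 eb d7 f4 3c d9 0a 3f c5")

with open("blocks.bin", "rb") as f:
    data = f.read(BLOCK_SIZE * NUM_BLOCKS)

# 16-byte (128-bit) candidates: MD5 directly, plus truncations of longer digests.
candidates = {
    "md5":         lambda d: hashlib.md5(d).digest(),
    "sha1[:16]":   lambda d: hashlib.sha1(d).digest()[:16],
    "sha256[:16]": lambda d: hashlib.sha256(d).digest()[:16],
}

for name, fn in candidates.items():
    digest = fn(data)
    print(name, digest.hex(), "<-- MATCH" if digest == EXPECTED else "")

If none of those match, the next things I'd try are hashing each block individually, varying the block order or byte order, or assuming a keyed construction (HMAC, CBC-MAC), since a secret key would explain why nothing obvious lines up.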