I was browsing the Serpent implementation in TrueCrypt (7.1a) and something does not look right! Here is the interface of this algorithm:
void serpent_set_key(const unsigned __int8 userKey[], int keylen, unsigned __int8 *ks);
void serpent_encrypt(const unsigned __int8 *inBlock, unsigned __int8 *outBlock, unsigned __int8 *ks);
void serpent_decrypt(const unsigned __int8 *inBlock, unsigned __int8 *outBlock, unsigned __int8 *ks);
The function of interest here is serpent_set_key. The user key is 32 bytes in length, keylen should be its size, and ks is the output key schedule to be used for encryption/decryption. The issue is in the implementation. Here is the relevant fragment at the beginning:
unsigned __int32 a,b,c,d,e;
unsigned __int32 *k = (unsigned __int32 *)ks;
unsigned __int32 t;
int i;
for (i = 0; i < keylen / (int)sizeof(__int32); i++)
k[i] = LE32(((unsigned __int32*)userKey)[i]);
The for loop copies the data from the user key into the implementation key, treating the data as 4-byte integers. All is fine if the key length is passed in as bytes (32 would be the correct value), but...
In the whole TrueCrypt implementation this function is called in two places. The first one is in CipherInit, where it is called like this:
case SERPENT:
serpent_set_key (key, CipherGetKeySize(SERPENT) * 8, ks);
break;
CipherGetKeySize(SERPENT) returns 32 (bytes), so the passed-in parameter will have a value of 256! That is correct as far as the key length in bits is concerned, but NOT for this implementation! This will cause a buffer overrun in serpent_set_key, because the for loop will run 64 times (256 / 4) instead of just 8, reading 256 bytes from the 32-byte userKey buffer.
The other place where this is called is in EAInit like this:
serpent_set_key (key, 32 * 8, ks);
Here it is obvious that the passed-in parameter will be 256.
I am curious what other people think about this. Can anyone else confirm this bug?