I'm trying to encrypt a text block with AES-128 in ECB mode. To test the resulting encryption/decryption functionality, I'm using the ECB-AES128 test vectors from "Recommendation for Block Cipher Modes of Operation: Methods and Techniques", NIST Special Publication 800-38A, 2001 Edition.
For example:
Key: 2b7e151628aed2a6abf7158809cf4f3c (16 bytes)
Input Plaintext: 6bc1bee22e409f96e93d7e117393172a (16 bytes)
Expected Ciphertext: 3ad77bb40d7a3660a89ecaf32466ef97 (16 bytes)
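The key and plaintext vectors are just the raw bytes of those hex strings. For reference, a minimal decoding sketch (hex_to_bytes is my own hypothetical helper, not a library function):

#include <string>
#include <vector>

// Hypothetical helper: decode a hex string like "2b7e1516..." into raw bytes
std::vector<unsigned char> hex_to_bytes( const std::string& hex )
{
    std::vector<unsigned char> bytes;
    for ( size_t i = 0; i + 1 < hex.size( ); i += 2 )
    {
        bytes.push_back( (unsigned char)std::stoul( hex.substr( i, 2 ), nullptr, 16 ) );
    }
    return bytes;
}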
With OpenSSL, the following code works perfectly:
#include <openssl/aes.h>

AES_KEY aes_key;
// key: vector of bytes 2b7e151628aed2a6abf7158809cf4f3c (16 bytes)
// input_vector: vector of bytes 6bc1bee22e409f96e93d7e117393172a (16 bytes)
// encrypted_vector: pre-sized to 16 bytes
if ( AES_set_encrypt_key( &key[ 0 ], 128, &aes_key ) != 0 )
{
    return false;
}
// ECB processes exactly one 16-byte block per call
AES_ecb_encrypt( &input_vector[ 0 ], &encrypted_vector[ 0 ], &aes_key, AES_ENCRYPT );
// encrypted_vector now holds the expected value 3ad77bb40d7a3660a89ecaf32466ef97 (16 bytes)
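To verify the result I compare it against the vector from the spec; a minimal check (reusing the hypothetical hex_to_bytes helper from above) could look like:

#include <algorithm>

std::vector<unsigned char> expected = hex_to_bytes( "3ad77bb40d7a3660a89ecaf32466ef97" );
bool matches = std::equal( expected.begin( ), expected.end( ), encrypted_vector.begin( ) );
// matches == true for the OpenSSL code above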
At the same time, I need to implement the same functionality with Microsoft's Crypto API. The following code should do it:
#include <windows.h>
#include <wincrypt.h>

HCRYPTPROV crypto_provider_handle;
HCRYPTHASH crypto_hash_handle;
HCRYPTKEY crypto_key_handle;
CryptAcquireContext( &crypto_provider_handle, NULL, MS_ENH_RSA_AES_PROV, PROV_RSA_AES,
                     CRYPT_VERIFYCONTEXT | CRYPT_NEWKEYSET );
// Derive the AES key from a SHA-1 hash of the key bytes
CryptCreateHash( crypto_provider_handle, CALG_SHA1, 0, 0, &crypto_hash_handle );
CryptHashData( crypto_hash_handle, &key[ 0 ], (DWORD)key.size( ), 0 );
// 0x00800000 is the key length (128) shifted into the upper 16 bits of dwFlags
CryptDeriveKey( crypto_provider_handle, CALG_AES_128, crypto_hash_handle,
                0x00800000 | CRYPT_NO_SALT, &crypto_key_handle );
// Set ECB mode
DWORD mode = CRYPT_MODE_ECB;
CryptSetKeyParam( crypto_key_handle, KP_MODE, (BYTE*)&mode, 0 );
// input_data is a buffer with the 16 plaintext bytes plus extra space, total size 16*2
// input_size equals 16 on input; CryptEncrypt updates it to the output size
CryptEncrypt( crypto_key_handle, NULL, TRUE, 0, input_data, &input_size, 16*2 );
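(Error checking is omitted above for brevity; every one of these calls returns a BOOL, so a minimal check around, say, the final call would be:)

if ( !CryptEncrypt( crypto_key_handle, NULL, TRUE, 0, input_data, &input_size, 16*2 ) )
{
    DWORD error = GetLastError( ); // reason for the failure
    return false;
}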
But this code produces a different result:
- If "Final" will be TRUE (as was shown above) input_data will have size 32 bytes, which doesn't match expected value at all. According to MSDN it's OK, since in this case one additional block of padding is appended to the data.
- If "Final" will be FALSE input_data will have expected size (16 bytes) but also wrong result.
I'm stuck on this and don't understand what I've missed. Is it possible to produce the same result as OpenSSL (which is the expected result according to the spec)? And why does AES-128 in ECB mode change the size of the result? In ECB mode it shouldn't.