I have a string that was encrypted in Java with:
Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
cipher.init(Cipher.ENCRYPT_MODE, key);
return cipher.doFinal(text.getBytes());
(Note that in Java, PKCS5Padding actually performs PKCS#7 padding when used with AES; the two names are used interchangeably here.)
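For context, here is a self-contained version of the encryption snippet above. The key bytes and explicit IV handling are my own assumptions (the original doesn't show where either comes from); it just demonstrates that PKCS5Padding always adds a full padding block, even when the plaintext is already a multiple of 16 bytes:

```java
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;

public class EncryptDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical 16-byte key and zero IV, purely for illustration.
        byte[] keyBytes = "0123456789abcdef".getBytes(StandardCharsets.UTF_8);
        SecretKeySpec key = new SecretKeySpec(keyBytes, "AES");
        IvParameterSpec iv = new IvParameterSpec(new byte[16]);

        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key, iv);

        // Plaintext is exactly one 16-byte block, so PKCS#7 appends
        // a whole extra block of padding (sixteen 0x10 bytes).
        byte[] encrypted = cipher.doFinal("0123456789abcdef".getBytes(StandardCharsets.UTF_8));
        System.out.println(encrypted.length); // two blocks
    }
}
```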
And my internal decryption code is:
CCCryptorStatus cryptStatus = CCCrypt(kCCDecrypt,
                                      kCCAlgorithmAES128,
                                      0,                    // options: no padding flag set
                                      key.bytes,
                                      kCCBlockSizeAES128,   // key length (16, same value as kCCKeySizeAES128)
                                      iv.bytes,
                                      [encrypted bytes],    // dataIn
                                      dataLength,           // dataInLength
                                      buffer,               // dataOut
                                      bufferSize,           // dataOutAvailable
                                      &numBytesEncrypted);  // dataOutMoved
called like:
let decryptedData = decryptor.AES128DecryptWithKeyString(key, encryptedString: encodedString) // this does the CCCrypt stuff
let string: NSString = NSString(data: decryptedData, encoding: NSUTF8StringEncoding) ?? ""
let data = NSData(base64EncodedString: string as String, options: NSDataBase64DecodingOptions.IgnoreUnknownCharacters)
So even though the data was encrypted using PKCS5Padding, my decryption works fine despite my not passing any padding option (0). Additionally, if I change the 0 option to kCCOptionPKCS7Padding, it also decrypts correctly.
Is this the expected behavior? Is the padding option only relevant for kCCEncrypt and not kCCDecrypt?
Additionally, if we change the Java encryption to
Cipher cipher = Cipher.getInstance("AES/CBC/NoPadding");
and pad the payload manually with zero bytes, it still decrypts properly regardless of whether I use 0 or kCCOptionPKCS7Padding as the option.
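For the zero-padding variant, a Java-only sketch of both sides (key and IV invented, and CCCrypt's lenient behavior again modeled with AES/CBC/NoPadding) shows why this case also "works": the trailing NUL bytes survive decryption but are typically invisible once the bytes are turned into a string:

```java
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;

public class ZeroPadDemo {
    public static void main(String[] args) throws Exception {
        byte[] keyBytes = "0123456789abcdef".getBytes(StandardCharsets.UTF_8);
        SecretKeySpec key = new SecretKeySpec(keyBytes, "AES");
        IvParameterSpec iv = new IvParameterSpec(new byte[16]); // illustrative zero IV

        // Manually zero-pad the payload up to the 16-byte block size:
        byte[] msg = "hello".getBytes(StandardCharsets.UTF_8);
        byte[] padded = new byte[16]; // Java zero-fills new arrays
        System.arraycopy(msg, 0, padded, 0, msg.length);

        Cipher enc = Cipher.getInstance("AES/CBC/NoPadding");
        enc.init(Cipher.ENCRYPT_MODE, key, iv);
        byte[] ct = enc.doFinal(padded);

        Cipher dec = Cipher.getInstance("AES/CBC/NoPadding");
        dec.init(Cipher.DECRYPT_MODE, key, iv);
        byte[] pt = dec.doFinal(ct);

        // Trailing NUL bytes are still present, but trim() discards them:
        System.out.println(pt.length);
        System.out.println(new String(pt, StandardCharsets.UTF_8).trim());
    }
}
```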