I have two existing implementations of an HMAC hash for a third-party API. Java and Ruby match each other perfectly for both SHA1 and SHA256, but Node doesn't match either one. The Node code seems simple and straightforward, so I'm not sure where the difference comes from.
In Ruby:
def calculateRFC2104HMAC(canonicalizedData, accessKey, algorithm)
  digest = OpenSSL::Digest.new(algorithm)
  hmac = OpenSSL::HMAC.digest(digest, accessKey, canonicalizedData)
  return Base64.encode64(hmac)
end
# SHA1: SCN+L/M/nwwbk90VXBmEPe+18RU=
# SHA256: hgY38OlBKRsFlcBYAiX94blJPisXTIr08rvZDc7Ljhk=
In Java:
private static String calculateRFC2104HMAC(String data, String accessKey,
        String algorithm) throws Exception {
    // get an HMAC key from the raw key bytes
    SecretKeySpec signingKey = new SecretKeySpec(accessKey.getBytes(), algorithm);

    // get a Mac instance for the algorithm and initialize it with the signing key
    Mac mac = Mac.getInstance(algorithm);
    mac.init(signingKey);

    // compute the HMAC on the input data bytes
    byte[] rawHmac = mac.doFinal(data.getBytes());

    // base64-encode the HMAC
    return new String(Base64.encodeBase64(rawHmac));
}
// SHA1: SCN+L/M/nwwbk90VXBmEPe+18RU=
// SHA256: hgY38OlBKRsFlcBYAiX94blJPisXTIr08rvZDc7Ljhk=
In Node:
const crypto = require('crypto');

_calculateRFC2104HMAC = ({canonicalizedData, accessKey, algorithm}) => {
  const hmac = crypto.createHmac(algorithm, accessKey);
  hmac.update(canonicalizedData);
  return hmac.digest('base64');
};
// SHA1: GspTWly+Ezh2aW/QkKZA1o+qHRg=
// SHA256: FjVQ1Uj7866QZUv+jgLz/OahPbtPGEXpGwBbioqtBec=
I've verified that the key and data are identical.
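One way to verify that byte-for-byte is to hex-dump the inputs in each language and diff the output (the `toHex` helper here is just for illustration); invisible differences like `\r\n` vs `\n` or a trailing newline show up immediately:

```javascript
// Dump a string's raw UTF-8 bytes as hex so hidden characters become visible
const toHex = (s) => Buffer.from(s, 'utf8').toString('hex');

// Two strings that print identically on most terminals but differ in bytes
console.log(toHex('line1\nline2'));   // 6c696e65310a6c696e6532
console.log(toHex('line1\r\nline2')); // 6c696e65310d0a6c696e6532 (0d0a instead of 0a)
```

The Ruby equivalent would be something like `data.unpack1('H*')`, and in Java you could iterate over `data.getBytes()`.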
Edit: It looks like the issue is line endings. The data we build has to be separated by \n line breaks. Java and Ruby handle this just fine, but the same \n in Node produces a different hash and the request fails.