I create a detached signature with OpenSSL like this:
openssl smime -sign -md sha256 -in {f_in} -signer {cert} -inkey {key} -out {f_out} -outform DER
Next I try to reproduce it with CryptoAPI, using the same input file and loading the same certificate (with its private key) from the Windows certificate store. How should I call the API to get the same ASN.1 structure in the output?
EDIT (ADDITION): I suspect something is wrong with my CMSG_SIGNER_ENCODE_INFO. How do I fill rgAuthAttr to add contentType, signingTime and messageDigest to it before passing it to CryptMsgOpenToEncode? See the ASN.1 below to understand what the openssl command produced in this part. Note that it also adds an sMIMECapabilities (1.2.840.113549.1.9.15) section, but I'm not sure that is the root of the problem.
I am trying it with the provider ("Microsoft Enhanced RSA and AES Cryptographic Provider", "", 24), i.e. PROV_RSA_AES. As far as I can see, the main difference is in how messageDigest is written. Here is how OpenSSL writes it (this is the reference output for me):
[0] {
  SEQUENCE {
    OBJECTIDENTIFIER 1.2.840.113549.1.9.3 (contentType)
    SET {
      OBJECTIDENTIFIER 1.2.840.113549.1.7.1 (data)
    }
  }
  SEQUENCE {
    OBJECTIDENTIFIER 1.2.840.113549.1.9.5 (signingTime)
    SET {
      UTCTime '180406133432Z'
    }
  }
  SEQUENCE {
    OBJECTIDENTIFIER 1.2.840.113549.1.9.4 (messageDigest)
    SET {
      OCTETSTRING c75f664aef53e428e65c58cb926e3c175b81070417628105941d387c1d4fa8b0
    }
  }
  SEQUENCE {
    OBJECTIDENTIFIER 1.2.840.113549.1.9.15 (sMIMECapabilities)
    SET {
      SEQUENCE {
        SEQUENCE {
          OBJECTIDENTIFIER 2.16.840.1.101.3.4.1.42 (aes256-CBC)
        }
        SEQUENCE {
          OBJECTIDENTIFIER 2.16.840.1.101.3.4.1.22 (aes192-CBC)
        }
        SEQUENCE {
          OBJECTIDENTIFIER 2.16.840.1.101.3.4.1.2 (aes128-CBC)
        }
        SEQUENCE {
          OBJECTIDENTIFIER 1.2.840.113549.3.7 (id_des_EDE3_CBC)
        }
        SEQUENCE {
          OBJECTIDENTIFIER 1.2.840.113549.3.2 (rc2-CBC)
          INTEGER 0x0080 (128 decimal)
        }
        SEQUENCE {
          OBJECTIDENTIFIER 1.2.840.113549.3.2 (rc2-CBC)
          INTEGER 0x40 (64 decimal)
        }
        SEQUENCE {
          OBJECTIDENTIFIER 1.3.14.3.2.7 (desCBC)
        }
        SEQUENCE {
          OBJECTIDENTIFIER 1.2.840.113549.3.2 (rc2-CBC)
          INTEGER 0x28 (40 decimal)
        }
      }
    }
  }
}
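To make the target bytes concrete, here is a small stdlib-only Python sketch (my own illustration, not CryptoAPI code; the helper names are mine) that hand-encodes the attributes from the dump above. As far as I understand the docs, CryptMsg adds contentType and messageDigest itself once rgAuthAttr is non-empty, so each CRYPT_ATTR_BLOB only has to carry the DER of an attribute *value* (e.g. just the OCTET STRING for messageDigest), not the whole SEQUENCE:

```python
def der(tag: int, payload: bytes) -> bytes:
    """Encode one DER TLV; assumes payload shorter than 65536 bytes."""
    n = len(payload)
    if n < 0x80:
        length = bytes([n])
    elif n < 0x100:
        length = bytes([0x81, n])
    else:
        length = bytes([0x82, n >> 8, n & 0xFF])
    return bytes([tag]) + length + payload

def oid(dotted: str) -> bytes:
    """DER-encode an OBJECT IDENTIFIER from dotted notation."""
    parts = [int(p) for p in dotted.split(".")]
    body = bytearray([40 * parts[0] + parts[1]])
    for p in parts[2:]:
        stack = [p & 0x7F]          # base-128, high bit set on all
        p >>= 7                     # bytes except the last
        while p:
            stack.append(0x80 | (p & 0x7F))
            p >>= 7
        body.extend(reversed(stack))
    return der(0x06, bytes(body))

def attribute(attr_oid: str, value: bytes) -> bytes:
    """Attribute ::= SEQUENCE { attrType OID, attrValues SET OF ANY }"""
    return der(0x30, oid(attr_oid) + der(0x31, value))

digest = bytes.fromhex(
    "c75f664aef53e428e65c58cb926e3c175b81070417628105941d387c1d4fa8b0")

content_type = attribute("1.2.840.113549.1.9.3", oid("1.2.840.113549.1.7.1"))
signing_time = attribute("1.2.840.113549.1.9.5",
                         der(0x17, b"180406133432Z"))   # UTCTime
message_digest = attribute("1.2.840.113549.1.9.4",
                           der(0x04, digest))           # OCTET STRING

# In SignerInfo the attributes travel inside the implicit [0] tag:
auth_attrs = der(0xA0, content_type + signing_time + message_digest)

# The value blob a CRYPT_ATTR_BLOB would carry for messageDigest:
md_blob = der(0x04, digest)
print(md_blob.hex())
```

Comparing `auth_attrs.hex()` against a dump of the OpenSSL output (minus the sMIMECapabilities attribute) is how I am checking byte-for-byte equality.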