I'm using WinHttp, and the WinHttpQueryOption API in particular, to ensure that my connection employs strong HTTPS encryption. For that I'm doing the following:
DWORD dwHttpSecurityFlags = 0;
DWORD dwcbSzSec = sizeof(dwHttpSecurityFlags);

if (WinHttpQueryOption(hRequest, WINHTTP_OPTION_SECURITY_FLAGS,
                       &dwHttpSecurityFlags, &dwcbSzSec))
{
    //Check security -- the connection must employ 128-bit encryption
    if ((dwHttpSecurityFlags & SECURITY_FLAG_SECURE) &&
        (dwHttpSecurityFlags & (SECURITY_FLAG_STRENGTH_WEAK |
                                SECURITY_FLAG_STRENGTH_MEDIUM |
                                SECURITY_FLAG_STRENGTH_STRONG)) == SECURITY_FLAG_STRENGTH_STRONG)
    {
        //Passed security check
    }
    else
    {
        //Security check failed
    }
}
else
{
    //API failed
}
But I'm not quite clear on how the SECURITY_FLAG_STRENGTH_* flags are meant to be used: as combinable bitwise flags, or as mutually exclusive values where only one can be set at a time?
If I look in the header file, they are defined as follows:
#define SECURITY_FLAG_SECURE           0x00000001 // can query only
#define SECURITY_FLAG_STRENGTH_WEAK    0x10000000
#define SECURITY_FLAG_STRENGTH_MEDIUM  0x40000000
#define SECURITY_FLAG_STRENGTH_STRONG  0x20000000
which hints at bitwise use, but it doesn't make sense for a connection to employ both 40-bit and 128-bit encryption at the same time.
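To make the mutually-exclusive reading concrete, here is a hypothetical helper I sketched under the assumption that at most one strength bit is ever set, mapping each bit to the key size its name suggests (40/56/128 bits; I haven't verified those numbers against the documentation):

//Hypothetical helper -- assumes at most one strength bit is set,
//and maps each bit to the key size its name suggests.
DWORD GetNegotiatedKeyBits(DWORD dwSecurityFlags)
{
    DWORD dwStrengthBits = dwSecurityFlags & (SECURITY_FLAG_STRENGTH_WEAK |
                                              SECURITY_FLAG_STRENGTH_MEDIUM |
                                              SECURITY_FLAG_STRENGTH_STRONG);
    switch (dwStrengthBits)
    {
        case SECURITY_FLAG_STRENGTH_WEAK:   return 40;  //"weak" (export-grade)
        case SECURITY_FLAG_STRENGTH_MEDIUM: return 56;  //"medium"
        case SECURITY_FLAG_STRENGTH_STRONG: return 128; //"strong" (128-bit or better)
        default:                            return 0;   //none, or multiple bits set
    }
}

If that assumption holds, the masked comparison in my code above should be equivalent to simply testing (dwHttpSecurityFlags & SECURITY_FLAG_STRENGTH_STRONG) != 0.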
Can someone clarify this? Is my code above correct?