I'm using VBScript to parse a return code as part of some Windows installer work. I wanted to confirm that I was doing the bitwise masking correctly, so I jotted down a few Echo
statements and found that one of them didn't produce the result I expected:
WScript.Echo (&H01010101) ' prints 16843009 (0x01010101). Correct!
WScript.Echo (&H01010101 And &Hff000000) ' prints 16777216 (0x01000000). Correct!
WScript.Echo (&H01010101 And &H00ff0000) ' prints 65536 (0x00010000). Correct!
WScript.Echo (&H01010101 And &H0000ff00) ' prints 16843008 (0x01010100). What's happening here?
WScript.Echo (&H01010101 And &H000000ff) ' prints 1 (0x00000001). Correct!
That fourth one appears to have applied the mask only to the bottom two bytes. Okay, sure, I can see how maybe the mask gets converted to the smallest integer type that can hold it
before the And, so the mask ends up shorter than the value being masked, or something like that, but then why does the final case work?
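Just to sanity-check what VBScript thinks those literals are, I tried echoing their subtypes (assuming TypeName reports the subtype of a bare literal the same way it does for a variable):
WScript.Echo TypeName(&H01010101) ' I'd expect "Long" here, since the value needs 32 bits
WScript.Echo TypeName(&H0000ff00) ' if the leading zeros are ignored and this gets squeezed into a 16-bit Integer, this should say "Integer"
WScript.Echo TypeName(&H000000ff) ' presumably "Integer" as well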
If I throw something into the highest byte of the mask, it works:
WScript.Echo (&H01010101 And &Hf000ff00) ' prints 256 (0x00000100).
and here are some other cases:
WScript.Echo (&H01010101 And &H0000f0ff) ' prints 16842753 (0x01010001).
WScript.Echo (&H01010101 And &Hf00000ff) ' prints 1 (0x00000001).
which sort of support the idea that the mask is getting stored in a WORD, not a DWORD, but I still don't really understand what's happening.
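If the mask really is coming out as a 16-bit Integer, then I'd guess that forcing it to a Long with the trailing & type suffix (if I've understood that suffix correctly) should make the fourth case behave:
WScript.Echo (&H01010101 And &H0000ff00&) ' hoping for 256 (0x00000100)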