I have the following code:
#include <stdio.h>
#include <stdlib.h>
int main(){
    int first = 0xcdab3412;
    int second = 0xabcd1234;
    int result1 = (second >> 16) | (first & 0xFFFF0000);
    int result2 = (first << 16) | (second & 0x0000FFFF);
    printf("Outputs: %x and %x.\n", result1, result2);
    return 0;
}
result2 turns out as expected and outputs 34121234.
However, result1 outputs ffffabcd. If I just leave it as (first & 0xFFFF0000), it correctly outputs cdab0000, so the extra ffff must be coming from (second >> 16).
Why is result1 ffffabcd and not cdababcd?