
I have the following code:

#include <stdio.h>
#include <stdlib.h>

int main(){

    int first = 0xcdab3412;

    int second = 0xabcd1234;

    int result1 = (second >> 16) | (first & 0xFFFF0000);
    int result2 = (first << 16) | (second & 0x0000FFFF);

    printf("Outputs: %x and %x.\n", result1, result2);

    return 0;
}

result2 turns out as expected and outputs: 34121234

However, result1 outputs ffffabcd. If I just leave it as (first & 0xFFFF0000) it correctly outputs cdab0000.

Why is result1 ffffabcd and not cdababcd?

2 Answers


It's called sign extension: right-shifting a signed int whose high bit is set fills the vacated bits with 1s. Set the types to unsigned int and it should work.
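
For example, a minimal sketch of the corrected program (same values and expressions as in the question, just with unsigned types):

#include <stdio.h>

int main(void)
{
    /* unsigned, so >> is a logical shift: the vacated high bits become 0 */
    unsigned int first = 0xcdab3412u;
    unsigned int second = 0xabcd1234u;

    unsigned int result1 = (second >> 16) | (first & 0xFFFF0000u);
    unsigned int result2 = (first << 16) | (second & 0x0000FFFFu);

    /* prints: Outputs: cdababcd and 34121234. */
    printf("Outputs: %x and %x.\n", result1, result2);

    return 0;
}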

Freddie

second is a signed int and its most significant (sign) bit is already 1, so 0xabcd1234 is stored as a negative value. On most compilers, right-shifting a negative signed int is an arithmetic shift: the vacated high bits are filled with copies of the sign bit, which is where the ffff in ffffabcd comes from. (Strictly speaking, the result of >> on a negative value is implementation-defined.)

If you use unsigned ints you'll get the result you expect, because shifting an unsigned value always fills with zeros.

Signed vs. unsigned has always been a source of confusion, so you need to tread carefully.
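
To see the sign extension on its own, here is a small sketch (assuming a typical compiler where >> on a negative signed int is an arithmetic shift; strictly the behaviour is implementation-defined) that prints the intermediate shift for both a signed and an unsigned variable:

#include <stdio.h>

int main(void)
{
    /* same initialiser as the question; on a typical two's complement system
       0xabcd1234 stored in an int becomes a negative value (sign bit = 1) */
    int second = 0xabcd1234;
    unsigned int usecond = 0xabcd1234u;

    /* arithmetic shift: the vacated high bits are copies of the sign bit */
    printf("signed   second >> 16 = %x\n", (unsigned int)(second >> 16)); /* typically ffffabcd */

    /* logical shift: the vacated high bits are zero */
    printf("unsigned second >> 16 = %x\n", usecond >> 16);                /* abcd */

    return 0;
}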

bara
  • If you want to see this drive you crazy, change the leading 'a' in second to a '7' and watch it work even with the signed int code (and then see it fail spectacularly in production once a value with the high bit set comes along). I've spent more than a trivial amount of time debugging issues like this one. :) – bara Sep 17 '13 at 22:19