
I have this code in a C# console application:

int x = 10;
int* ptr1 = &x;
Console.WriteLine((int)ptr1);
Console.WriteLine((long)ptr1);
Console.WriteLine((ulong)ptr1);

Why do I get a different value for the int data type? Is it because of the 2,147,483,647 limit for int? This is the output:

-1529354580
453737178796
453737178796
gtpt

1 Answer

I assume you are building for the x64 architecture, where addresses are 64 bits (8 bytes) wide. A pointer contains an address, in your case the address of an int variable. That address cannot fit into an int, whose size is only 32 bits (4 bytes).
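
You can see these sizes directly with sizeof. A minimal sketch (it assumes <AllowUnsafeBlocks>true</AllowUnsafeBlocks> in the project file, which your pointer code already requires):

unsafe
{
    // On an x64 build a pointer occupies 8 bytes, while an int occupies 4.
    Console.WriteLine(sizeof(int*)); // 8
    Console.WriteLine(sizeof(int));  // 4
}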

The value is truncated, and this is what you see when you use:

Console.WriteLine((int)ptr1);

By truncated I mean that the cast keeps only the low 4 bytes of the address and stores them in 32-bit storage (an int). The upper 4 bytes are lost, and because the remaining bits are reinterpreted as a signed 32-bit value, the result can even come out negative, as it does in your output.
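
To see the truncation with the numbers from your own output (a minimal sketch; the literal below is just the address you printed, and every run will produce different values):

// 453737178796 is the 64-bit address printed by (long)ptr1.
long address = 453737178796;

// Casting to int keeps only the low 32 bits and reinterprets them as a signed value.
Console.WriteLine(unchecked((int)address)); // -1529354580, same as (int)ptr1

// The same low 32 bits read as an unsigned quantity.
Console.WriteLine(address & 0xFFFFFFFF);    // 2765612716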

The other 2 print lines:

Console.WriteLine((long)ptr1);
Console.WriteLine((ulong)ptr1);

cast the pointer to a 64-bit type (long or ulong), so you get the whole address.

In principle, long and ulong could also produce different output: if the address had its most significant bit set, the long cast would print a negative number. But since addresses in practice fall well within the positive range of long, you didn't encounter this issue.
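
For completeness, a hypothetical example of a value where long and ulong would print differently (the address below is made up so that its most significant bit is set):

// A made-up 64-bit value with the top bit set.
ulong highAddress = 0xFFFF800000001000;

// Cast to long, the top bit becomes the sign bit, so this prints a negative number.
Console.WriteLine((long)highAddress);

// As ulong, the same bits print as a large positive number.
Console.WriteLine(highAddress);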

BTW - if you build for x86 (32-bit architecture), pointers are only 4 bytes, so all three of your print lines will print the same value.
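
If you're not sure which architecture your process actually runs as, you can check at run time (a small sketch, not part of the original code):

// 8 in a 64-bit process, 4 in a 32-bit one.
Console.WriteLine(IntPtr.Size);

// True when running as a 64-bit process.
Console.WriteLine(Environment.Is64BitProcess);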

wohlstad