printf is a varargs function; its prototype is:

int printf(const char *format, ...);

That means that the first argument has type const char *, and the remaining arguments do not have a specified type.
Arguments without a specified type undergo the default argument promotions:

- float arguments are converted to double.
- integer types (both signed and unsigned) which are strictly narrower than an int are converted to a signed int.
- all other arguments are unchanged.
This happens before printf is called, and is not in any way specific to printf. The arguments in a call to any varargs function (that is, any function whose prototype includes an ...) undergo the same promotions.
Varargs functions have no way of knowing the types of their arguments, so they need some convention which lets the caller tell the function what types to expect. In the case of printf, the types are specified in the format string, and printf uses the type specified in the format string in the expectation that it is correct. If you lie to printf by telling it that an argument is of a certain type when it is actually of a different type, the resulting behaviour is undefined, and is occasionally catastrophic (although usually it just means that the wrong thing is printed).
printf is aware of the default argument promotions. So if you tell it to expect an unsigned short, for example, then it will actually expect either an int (if int is wider than unsigned short) or an unsigned int (if int and unsigned short are the same size).
The type of a format item is specified using the format code (such as d and x, which are signed and unsigned int respectively) and possibly modifiers. In particular, the modifier h changes the expectation from int to short, while hh changes it to char. (It doesn't affect the signedness.)
So, if you provide a signed short, then you should use the format code %hd. If you provide an unsigned short, you should use %hu or %hx, depending on whether you want the output in decimal or hexadecimal. There is no way to specify a hexadecimal conversion of a signed argument, so you need to cast the argument to an unsigned type in order to use the hexadecimal format code:

printf("This signed short is printed as unsigned: %hx\n",
       (unsigned short)x);
That will first convert x from short to unsigned short; then (because of the default argument promotions, and assuming short is actually shorter than int) to int. That int is what is actually sent to printf. When printf sees the %hx format code, it knows that it should expect the default promotion of an unsigned short (that is, an int); it takes that int and converts it back to an unsigned short, which it then prints.
Many programmers would just write %x without a cast, just as you did. Technically, that is undefined behaviour, and the rather wordier expression above is correct but pedantic. However, it is worth noting that the pedantic version produces the expected value, whereas the incorrect %x format code without the cast does not.