
int x; — so there will be 2 bytes of memory for the variable (int is 2 bytes on my system). Now, if I enter 66, then because scanf() reads it with %d, 66 is stored in that 2-byte memory, since the variable is declared int.

Now printf() with %c should collect data from only one byte of memory to display.

But %c correctly displayed B, getting the correct data 66 from memory.

Why did %c not get its data from just one byte?

3 Answers


%c expects an int argument, due to the default argument promotions for variadic functions. In other words, all of the following are exactly equivalent:

int x = 66;
char y = 66;
printf("%c", x);         // A
printf("%c", (char)x);   // B
printf("%c", y);         // C
printf("%c", (int)y);    // D

So all that's happening is printf is interpreting the int value of 66 as an ASCII code¹ and printing the corresponding character.


1. Note that ASCII is technically an implementation-defined design decision. Just an overwhelmingly common one.

Oliver Charlesworth

The %c conversion specifier in a printf() statement expects an int argument. Further, since printf() is a variadic function, a char is converted to an int by virtue of the default argument promotions.

The int argument that is passed to printf() corresponding to a %c specifier is then converted to an unsigned char by printf() before printing. Note that the conversion of a signed integer type to an unsigned integer type is well-defined in C, and does not involve collecting "data from only one byte." Rather, if the new type can hold the original value, the value remains unchanged; otherwise one larger than the maximum value for the new type is added to (or subtracted from) the old (signed) value. For example, an int value of -1 would be converted to an unsigned char value (assuming UCHAR_MAX is 255) of -1 + 256, or 255, which is within the range of an unsigned char.

Note that an int with a value of 66 would be unchanged in the conversion to unsigned char, since 66 is well within the range of an unsigned char. UCHAR_MAX must be a minimum of 255.

ad absurdum
  • Where does the OP mention he is passing a `char` to `printf`? – chqrlie Jul 16 '17 at 00:08
  • @chqrlie-- Nowhere, as far as I can see; just adding a detail about why `printf()` expects an `int`, yet it is typical to pass a `char` with the `%c` conversion specifier. – ad absurdum Jul 16 '17 at 00:09
  • Good point. To add to the confusion, `'a'` is actually an `int` in C (unlike C++). – chqrlie Jul 16 '17 at 00:11
  • @chqrlie-- that is also a good point, that character constants are type `int` to begin with. Is it any wonder that this stuff is a little confusing for learners? – ad absurdum Jul 16 '17 at 00:17
  • 'A' is actually an int in C — I know that. I think it is equal to 65 and is saved in memory as the corresponding binary value, a combination of eight 0s and 1s. – Javaid Akhtar Jul 16 '17 at 01:47

Regardless of how the argument is passed, the %c format specifier always converts its argument to unsigned char before printing it. So, %c always prints one byte.

Your assertion that %c gets its data from more than one byte is unfounded. The example presented does not show any evidence to the contrary - 66 is a number that fits into one byte.

The intricacies of variadic argument passing (yes, it is passed as an int) have no bearing on the observed behavior in this case.

AnT stands with Russia
  • I could be wrong, but it seems that OP believes that `%c` expects a `char`, and is confused about what may happen when the actual argument is wider than a `char`; hence my attempt to describe the goings-on. Maybe also confused about representation of integer types.... – ad absurdum Jul 16 '17 at 00:28
  • Yes, sir. I am confused about what may happen when the actual argument is wider than a char, i.e. one byte. Then where in memory should the data beyond one byte go? Because there is only one byte that relates to a char variable. – Javaid Akhtar Jul 16 '17 at 01:39
  • I know that a char is actually nothing but an integer – Javaid Akhtar Jul 16 '17 at 01:40
  • But the memory size of a char is just one byte – Javaid Akhtar Jul 16 '17 at 01:42