I can't quite figure out how this algorithm works...
Code:
#include <stdio.h>

int main(void){
    int number = 0;
    while(number < 16){
        if(number < 10){
            /* 0-9: offset from the character '0' gives '0'-'9' */
            printf("decimal: \t%d\n", number);
            printf("hexadec.: \t%c\n", number + '0');
        }else{
            /* 10-15: offset from the character 'A' gives 'A'-'F' */
            printf("decimal: \t%d\n", number);
            printf("hexadec.: \t%c\n", number - 10 + 'A');
        }
        printf("\n");
        number++;
    }
    return 0;
}
Output:
decimal: 0
hexadec.: 0

decimal: 1
hexadec.: 1

decimal: 2
hexadec.: 2

decimal: 3
hexadec.: 3

decimal: 4
hexadec.: 4

decimal: 5
hexadec.: 5

decimal: 6
hexadec.: 6

decimal: 7
hexadec.: 7

decimal: 8
hexadec.: 8

decimal: 9
hexadec.: 9

decimal: 10
hexadec.: A

decimal: 11
hexadec.: B

decimal: 12
hexadec.: C

decimal: 13
hexadec.: D

decimal: 14
hexadec.: E

decimal: 15
hexadec.: F
It prints exactly what I want, but I don't understand the process behind it. If I'm not mistaken, adding an integer to a 'character' converts the number to a character via the ASCII table.
Let's say the number is 11. Then 11 - 10 = 1, which is 49 in ASCII, and 'A' is 65 in ASCII. So how does 49 + 65 come out to 66, i.e. 'B'? I'm sure this thinking is completely wrong; I just wanted to show you what I think it does in the background.
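To show exactly where I'm confused, here is a minimal sketch (my own addition; the variable names are just for illustration, and I'm assuming an ASCII system where 'A' is 65) that prints the raw integer value of each step for number = 11:

#include <stdio.h>

int main(void){
    int number = 11;
    int offset = number - 10;  /* the plain integer 1 */
    int code = offset + 'A';   /* 1 + 65 = 66 */
    printf("offset as int: %d\n", offset); /* prints 1 */
    printf("code as int:   %d\n", code);   /* prints 66 */
    printf("code as char:  %c\n", code);   /* prints B */
    return 0;
}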