#include <stdio.h>
int main() {
    long long a, b;
    scanf("%d %d", &a, &b);
    printf("%lld", a + b);
    return 0;
}
The code above reads two numbers and prints their sum.
I know the correct format specifier for a long long is %lld
rather than %d, and I believe that mismatch is what causes the compile error (or warning) I get on stricter compilers.
However, some compilers, such as the one at https://www.programiz.com/c-programming/online-compiler/, accept the code without any syntax error but print a strange value like the one below, which I don't understand at all.
Input: 123 -123
Output: 235046380240896
(This value changes from run to run.)
What is happening at the fundamental level when scanf stores an int-sized value into a long long variable?