#include <stdio.h>
int main(){
printf("%d\t",sizeof(6.5));
printf("%d\t",sizeof(90000));
printf("%d\t",sizeof('a'));
return 0;
}
When I compile and run my code, the output is "8 4 2". Can somebody explain why I get this output?
First of all, there is a syntax error in your code:
printf("%d\t";sizeof('a'));
Change this to
printf("%zu\t",sizeof('a')); // note the change in format specifier also
              ^
              |
              see here
(sizeof yields a size_t, so the correct format specifier is %zu rather than %d.)
Then, assuming your platform is 32-bit:
sizeof(6.5)   == sizeof(double) == 8
sizeof(90000) == sizeof(int)    == 4
sizeof('a')   == sizeof(int)    == 4
To clarify, 'a' represents the value 97, which defaults to int. So, sizeof('a') should give a value of 4, not 2 or 1.
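For reference, here is a minimal corrected sketch of the program, assuming a typical 32-bit or 64-bit platform (the sizes noted in the comments are typical values, not guarantees):

#include <stdio.h>

int main(void){
    /* sizeof yields a size_t, so %zu is the portable format specifier */
    printf("%zu\t", sizeof(6.5));    /* 6.5 is a double literal -> typically 8 */
    printf("%zu\t", sizeof(90000));  /* 90000 fits in a 32-bit int -> typically 4 */
    printf("%zu\n", sizeof('a'));    /* 'a' is a character constant of type int -> typically 4 */
    return 0;
}

On such a platform this prints 8 4 4 (tab-separated).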
Edit:
To add, you will get an output of 8 4 2 if you are on a 16-bit architecture:
sizeof(6.5)   == sizeof(double) == 8
sizeof(90000) == sizeof(long)   == 4   (90000 does not fit in a 16-bit int, whose maximum is 32767, so the constant gets type long)
sizeof('a')   == sizeof(int)    == 2
If you're on a 32-bit compiler:
printf("%d\t",sizeof(6.5));
6.5 is a double, so sizeof(double) gives 8.
printf("%d\t",sizeof(90000));
90000 is an int (or a long), so sizeof(int) gives 4.
printf("%d\t";sizeof('a'));
^
you left a semicolon here, change it to a comma
'a'
is converted to int, so sizeof(int)
gives 4
.
So the actual output is
8 4 4
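If you want to verify which type each constant actually has, a small C11 sketch using _Generic can print the type names. (The TYPE_NAME macro below is a hypothetical helper introduced only for illustration; it is not part of your original code.)

#include <stdio.h>

/* C11 _Generic selects a string based on the static type of the expression */
#define TYPE_NAME(x) _Generic((x), \
    char:    "char",               \
    int:     "int",                \
    long:    "long",               \
    double:  "double",             \
    default: "other")

int main(void){
    printf("6.5   has type %s\n", TYPE_NAME(6.5));   /* double */
    printf("90000 has type %s\n", TYPE_NAME(90000)); /* int on a 32-bit compiler, long on 16-bit */
    printf("'a'   has type %s\n", TYPE_NAME('a'));   /* int: character constants have type int in C */
    return 0;
}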
But, if you're on a 16-bit compiler, you will get
sizeof(6.5)   = sizeof(double) = 8
sizeof(90000) = sizeof(long)   = 4
sizeof('a')   = sizeof(int)    = 2
So that would explain your output.
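Either way, you can check which sizes your own compiler actually uses with a short probe like this (a sketch, assuming a hosted C implementation):

#include <stdio.h>

int main(void){
    /* Print the basic type sizes to see your compiler's data model,
       e.g. int is 2 bytes on a 16-bit compiler and 4 bytes on 32/64-bit. */
    printf("sizeof(char)   = %zu\n", sizeof(char));
    printf("sizeof(int)    = %zu\n", sizeof(int));
    printf("sizeof(long)   = %zu\n", sizeof(long));
    printf("sizeof(double) = %zu\n", sizeof(double));
    return 0;
}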