I have been trying to figure out the whole shifting process and it just doesn't make sense to me. How does this count the number of bits that were set when the right shift just throws away whatever was computed earlier?
A function fun_a has the following overall structure:
long fun_a(unsigned long x)
{
    long val = 0;
    while ( ... ) {
        ...
        ...
    }
    return ...;
}
GCC generates the following assembly code:
long fun_a(unsigned long x)
x in %rdi
 1 fun_a:
 2   movl   $0, %eax
 3   jmp    .L5
 4 .L6:
 5   xorq   %rdi, %rax
 6   shrq   %rdi           Shift right by 1
 7 .L5:
 8   testq  %rdi, %rdi
 9   jne    .L6
10   andl   $1, %eax
11   ret
We can see that the compiler used a jump-to-middle translation, using the jmp instruction on line 3 to jump to the test starting at label .L5.
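If it helps to see that structure explicitly, here is my goto-style reading of the listing (just a sketch of the control flow as I understand it; the fun_a_goto name is mine, and I am assuming the single-operand shrq shifts %rdi right by one):

long fun_a_goto(unsigned long x)
{
    long val = 0;
    goto test;              /* jmp .L5 on line 3 */
loop:                       /* .L6 */
    val ^= x;               /* xorq %rdi, %rax */
    x >>= 1;                /* shrq %rdi */
test:                       /* .L5 */
    if (x != 0)             /* testq %rdi, %rdi; jne .L6 */
        goto loop;
    return val & 1;         /* andl $1, %eax */
}

That matches the jump-to-middle pattern: jump straight to the test, then loop back from the bottom as long as x is nonzero.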
Describe what this function computes.
Select one:
a. This code computes the sign of argument x. That is, it returns 1 if the most significant bit in x is 0 and 0 if it is 1.
b. This code computes the parity of argument x. That is, it returns 1 if there is an even number of ones in x and 0 if there is an odd number.
c. This code computes the parity of argument x. That is, it returns 1 if there is an odd number of ones in x and 0 if there is an even number.
d. This code computes the sign of argument x. That is, it returns 1 if the most significant bit in x is 1 and 0 if it is 0.
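Here is my attempt at filling in the skeleton from the assembly (just my reading, not a confirmed answer; I am assuming the loop body is exactly the xorq/shrq pair):

long fun_a(unsigned long x)
{
    long val = 0;
    while (x != 0) {
        val ^= x;       /* XOR the current (shifted) value into val */
        x >>= 1;        /* shift right by 1 */
    }
    return val & 1;     /* keep only bit 0 of the accumulated XOR */
}

Tracing it by hand with x = 6 (binary 110): val becomes 110, then 110 ^ 011 = 101, then 101 ^ 001 = 100, and the function returns 100 & 1 = 0. What I still don't see is why keeping only bit 0 of that running XOR says anything about how many ones were in x, which is what the answer choices ask about.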